Unnamed: 0
int64
0
832k
id
float64
2.49B
32.1B
type
stringclasses
1 value
created_at
stringlengths
19
19
repo
stringlengths
5
112
repo_url
stringlengths
34
141
action
stringclasses
3 values
title
stringlengths
1
844
labels
stringlengths
4
721
body
stringlengths
1
261k
index
stringclasses
12 values
text_combine
stringlengths
96
261k
label
stringclasses
2 values
text
stringlengths
96
248k
binary_label
int64
0
1
310,192
23,325,204,052
IssuesEvent
2022-08-08 20:26:09
wazuh/wazuh-documentation
https://api.github.com/repos/wazuh/wazuh-documentation
closed
Script for jira integration blog post doesn't work anymore
operations integrator blog documentation
Hello team, I had to configure this integration many times lately and found that the script we provide in this blog post: https://wazuh.com/blog/how-to-integrate-external-software-using-integrator/ ... no longer works. I kept getting the following error:

```
{
    "errorMessages": [],
    "errors": {
        "project": "Especifica un ID o una clave de proyecto válida"
    }
}
```

This is because Jira now requires an `issuetype` ID for issues created via the API (the error message translates to "Specify a valid project ID or key"). But even after inserting the issuetype ID into the script, it still didn't work. This time it was because they also changed the format of the `description` field: it now requires additional parameters. I managed to adjust the script to this:

```python
#!/usr/bin/env python

import sys
import json
import requests
from requests.auth import HTTPBasicAuth

# Read configuration parameters
alert_file = open(sys.argv[1])
user = sys.argv[2].split(':')[0]
api_key = sys.argv[2].split(':')[1]
hook_url = sys.argv[3]

# Read the alert file
alert_json = json.loads(alert_file.read())
alert_file.close()

# Extract issue fields
alert_level = alert_json['rule']['level']
ruleid = alert_json['rule']['id']
description = alert_json['rule']['description']
agentid = alert_json['agent']['id']
agentname = alert_json['agent']['name']
path = alert_json['syscheck']['path']

# Set the project attributes ===> This section needs to be manually configured before running!
project_key = 'WT'     # You can get this from the beginning of an issue key. E.g.: for WS-5018 it's "WS"
issuetypeid = '10002'  # See https://confluence.atlassian.com/jirakb/finding-the-id-for-issue-types-646186508.html. There's also an API endpoint to get it.

# Generate request
headers = {'content-type': 'application/json'}
issue_data = {
    "update": {},
    "fields": {
        "summary": 'FIM alert on [' + path + ']',
        "issuetype": {"id": issuetypeid},
        "project": {"key": project_key},
        "description": {
            'version': 1,
            'type': 'doc',
            'content': [
                {
                    "type": "paragraph",
                    "content": [
                        {
                            "text": '- State: ' + description
                                    + '\n- Rule ID: ' + str(ruleid)
                                    + '\n- Alert level: ' + str(alert_level)
                                    + '\n- Agent: ' + str(agentid) + ' ' + agentname,
                            "type": "text"
                        }
                    ]
                }
            ],
        },
    }
}

# Send the request
response = requests.post(hook_url, data=json.dumps(issue_data), headers=headers, auth=(user, api_key))
# print(json.dumps(json.loads(response.text), sort_keys=True, indent=4, separators=(",", ": ")))
sys.exit(0)
```

Here is also a script for general use (not tied to FIM):

```python
#!/usr/bin/env python

import sys
import json
import requests
from requests.auth import HTTPBasicAuth

# Read configuration parameters
alert_file = open(sys.argv[1])
user = sys.argv[2].split(':')[0]
api_key = sys.argv[2].split(':')[1]
hook_url = sys.argv[3]

# Read the alert file
alert_json = json.loads(alert_file.read())
alert_file.close()

# Extract issue fields
alert_level = alert_json['rule']['level']
ruleid = alert_json['rule']['id']
description = alert_json['rule']['description']
agentid = alert_json['agent']['id']
agentname = alert_json['agent']['name']
#path = alert_json['syscheck']['path']

# Set the project attributes ===> This section needs to be manually configured before running!
project_key = 'WT'     # You can get this from the beginning of an issue key. E.g.: for WS-5018 it's "WS"
issuetypeid = '10002'  # See https://confluence.atlassian.com/jirakb/finding-the-id-for-issue-types-646186508.html. There's also an API endpoint to get it.

# Generate request
headers = {'content-type': 'application/json'}
issue_data = {
    "update": {},
    "fields": {
        "summary": 'Wazuh Alert: ' + description,
        "issuetype": {"id": issuetypeid},
        "project": {"key": project_key},
        "description": {
            'version': 1,
            'type': 'doc',
            'content': [
                {
                    "type": "paragraph",
                    "content": [
                        {
                            "text": '- Rule ID: ' + str(ruleid)
                                    + '\n- Alert level: ' + str(alert_level)
                                    + '\n- Agent: ' + str(agentid) + ' ' + agentname,
                            "type": "text"
                        }
                    ]
                }
            ],
        },
    }
}

# Send the request
response = requests.post(hook_url, data=json.dumps(issue_data), headers=headers, auth=(user, api_key))
# print(json.dumps(json.loads(response.text), sort_keys=True, indent=4, separators=(",", ": ")))
sys.exit(0)
```

It would be great if we could have both scripts in the blog. Users (and the team) will appreciate it :) It is worth mentioning that users will need to adjust both the `project_key` and `issuetypeid` variables with their own values. I also commented out a "print" command at the end; when troubleshooting, it is very useful to uncomment it. The issuetype ID can also be queried from the Jira API by running the following command:

```bash
curl --request GET \
  --url 'https://your-domain.atlassian.net/rest/api/3/issuetype' \
  --user 'email@example.com:<api_token>' \
  --header 'Accept: application/json'
```

With these scripts I got the following results in Jira:

![image](https://user-images.githubusercontent.com/29696136/159353665-e377a3a1-21e5-4c9d-b0ea-f5fcdeaabfb4.png)

![image](https://user-images.githubusercontent.com/29696136/159353637-520e4e78-89f6-4ecc-a25d-c01fae4ea651.png)
1.0
non_priority
0
194,010
14,667,214,784
IssuesEvent
2020-12-29 18:04:28
github-vet/rangeloop-pointer-findings
https://api.github.com/repos/github-vet/rangeloop-pointer-findings
closed
itsivareddy/terrafrom-Oci: oci/autoscaling_auto_scaling_configuration_test.go; 14 LoC
fresh small test
Found a possible issue in [itsivareddy/terrafrom-Oci](https://www.github.com/itsivareddy/terrafrom-Oci) at [oci/autoscaling_auto_scaling_configuration_test.go](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/autoscaling_auto_scaling_configuration_test.go#L501-L514)

Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message.

> reference to autoScalingConfigurationId is reassigned at line 505

[Click here to see the code in its original context.](https://github.com/itsivareddy/terrafrom-Oci/blob/075608a9e201ee0e32484da68d5ba5370dfde1be/oci/autoscaling_auto_scaling_configuration_test.go#L501-L514)

<details>
<summary>Click here to show the 14 line(s) of Go which triggered the analyzer.</summary>

```go
for _, autoScalingConfigurationId := range autoScalingConfigurationIds {
	if ok := SweeperDefaultResourceId[autoScalingConfigurationId]; !ok {
		deleteAutoScalingConfigurationRequest := oci_auto_scaling.DeleteAutoScalingConfigurationRequest{}

		deleteAutoScalingConfigurationRequest.AutoScalingConfigurationId = &autoScalingConfigurationId
		deleteAutoScalingConfigurationRequest.RequestMetadata.RetryPolicy = getRetryPolicy(true, "auto_scaling")
		_, error := autoScalingClient.DeleteAutoScalingConfiguration(context.Background(), deleteAutoScalingConfigurationRequest)
		if error != nil {
			fmt.Printf("Error deleting AutoScalingConfiguration %s %s, It is possible that the resource is already deleted. Please verify manually \n", autoScalingConfigurationId, error)
			continue
		}
	}
}
```

</details>

Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 075608a9e201ee0e32484da68d5ba5370dfde1be
1.0
non_priority
0
1,280
2,603,746,679
IssuesEvent
2015-02-24 17:42:47
chrsmith/bwapi
https://api.github.com/repos/chrsmith/bwapi
closed
Training queue in nexus is never more than 1 unit
auto-migrated Type-Defect
```
What steps will reproduce the problem?
1. Create a nexus
2. Train probes non-stop and make them gather minerals
3. The training queue is never more than 1 unit - they dont queue up?

What version of the product are you using? On what operating system?
Rev 1610 - Windows XP

Please provide any additional information below.
I will love the nexus to use the queue to make sure that the nexus is never idle no matter the lag.
```

-----

Original issue reported on code.google.com by `wizuffeg...@gmail.com` on 29 Nov 2009 at 4:36
1.0
non_priority
0
227,688
17,396,303,994
IssuesEvent
2021-08-02 13:52:26
TheSingleOneYT/FNLevel-DiscordBot
https://api.github.com/repos/TheSingleOneYT/FNLevel-DiscordBot
opened
Documentation
documentation
I have made a wiki for this project explaining how some of it works. https://github.com/TheSingleOneYT/FNLevel-DiscordBot/wiki Enjoy reading!
1.0
non_priority
0
85,842
15,755,290,436
IssuesEvent
2021-03-31 01:31:01
ChenLuigi/GitHubScannerBower4
https://api.github.com/repos/ChenLuigi/GitHubScannerBower4
opened
CVE-2020-24025 (Medium) detected in node-sass-1.2.3.tgz
security vulnerability
## CVE-2020-24025 - Medium Severity Vulnerability

<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sass-1.2.3.tgz</b></summary>

<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-1.2.3.tgz">https://registry.npmjs.org/node-sass/-/node-sass-1.2.3.tgz</a></p>
<p>Path to dependency file: GitHubScannerBower4/GoldenPanel_Lighter/GoldenPanel/c3-0.4.10/package/package.json</p>
<p>Path to vulnerable library: GitHubScannerBower4/GoldenPanel_Lighter/GoldenPanel/c3-0.4.10/package/node_modules/node-sass/package.json</p>
<p>Dependency Hierarchy:</p>

- grunt-sass-0.17.0.tgz (Root Library)
  - :x: **node-sass-1.2.3.tgz** (Vulnerable Library)

</details>

<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>

<p>Certificate validation in node-sass 2.0.0 to 4.14.1 is disabled when requesting binaries even if the user is not specifying an alternative download path.</p>
<p>Publish Date: 2021-01-11</p>
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24025>CVE-2020-24025</a></p>

</details>

<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>

<p>Base Score Metrics:</p>

- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: None
  - Integrity Impact: Low
  - Availability Impact: None

<p>For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.</p>

</details>

***

Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
non_priority
0
27,317
12,540,308,580
IssuesEvent
2020-06-05 10:07:54
terraform-providers/terraform-provider-azurerm
https://api.github.com/repos/terraform-providers/terraform-provider-azurerm
closed
x509_certificate_properties should be required to create certificate
question service/keyvault
<!--- Please note the following potential times when an issue might be in Terraform core:

* [Configuration Language](https://www.terraform.io/docs/configuration/index.html) or resource ordering issues
* [State](https://www.terraform.io/docs/state/index.html) and [State Backend](https://www.terraform.io/docs/backends/index.html) issues
* [Provisioner](https://www.terraform.io/docs/provisioners/index.html) issues
* [Registry](https://registry.terraform.io/) issues
* Spans resources across multiple providers

If you are running into one of these scenarios, we recommend opening an issue in the [Terraform core repository](https://github.com/hashicorp/terraform/) instead. --->

<!--- Please keep this note for the community --->

### Community Note

* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment

<!--- Thank you for keeping this note for the community --->

### Terraform (and AzureRM Provider) Version

<!--- Please run `terraform -v` to show the Terraform core version and provider version(s). If you are not running the latest version of Terraform or the provider, please upgrade because your issue may have already been fixed. [Terraform documentation on provider versioning](https://www.terraform.io/docs/configuration/providers.html#provider-versions). --->

### Affected Resource(s)

<!--- Please list the affected resources and data sources. --->

* `azurerm v2.4.0`

### Terraform Configuration Files

<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->

```hcl
resource "azurerm_key_vault_certificate" "sshkey" {
  name         = "nancyc-kv-cert-01"
  key_vault_id = azurerm_key_vault.kv.id

  certificate_policy {
    issuer_parameters {
      name = "Self"
    }
    key_properties {
      exportable = true
      key_size   = 2048
      key_type   = "RSA"
      reuse_key  = true
    }
    secret_properties {
      content_type = "application/x-pem-file"
    }
  }
}
```

### Debug Output

<!--- Please provide a link to a GitHub Gist containing the complete debug output. Please do NOT paste the debug output in the issue; just paste a link to the Gist. To obtain the debug output, see the [Terraform documentation on debugging](https://www.terraform.io/docs/internals/debugging.html). --->

### Panic Output

<!--- If Terraform produced a panic, please provide a link to a GitHub Gist containing the output of the `crash.log`. --->

### Expected Behavior

https://www.terraform.io/docs/providers/azurerm/r/key_vault_certificate.html

> certificate_policy supports the following:
>
> issuer_parameters - (Required) A issuer_parameters block as defined below.
> key_properties - (Required) A key_properties block as defined below.
> lifetime_action - (Optional) A lifetime_action block as defined below.
> secret_properties - (Required) A secret_properties block as defined below.
> **x509_certificate_properties - (Optional) A x509_certificate_properties block as defined below.**

However this should be **Required**.

### Actual Behavior

```
Error: keyvault.BaseClient#CreateCertificate: Failure responding to request: StatusCode=400 -- Original Error: autorest/azure: Service returned an error. Status=400 Code="BadParameter" Message="Property policy has invalid value\r\n"

  on main.tf line 89, in resource "azurerm_key_vault_certificate" "sshkey":
  89: resource "azurerm_key_vault_certificate" "sshkey" {
```

### Steps to Reproduce

<!--- Please list the steps required to reproduce the issue. --->

1. `terraform apply`

### Important Factoids

<!--- Are there anything atypical about your accounts that we should know? For example: Running in a Azure China/Germany/Government? --->

### References

<!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Such as vendor documentation? --->

* #0000
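For readers hitting the same 400 error: the configuration in the report omits `x509_certificate_properties`, which is what the reporter argues should be required. A hedged sketch of the missing block inside `certificate_policy`, with placeholder values (subject, usages, and validity below are illustrative assumptions, not from the report):

```hcl
x509_certificate_properties {
  # Placeholder values - adjust to your certificate's needs
  key_usage          = ["digitalSignature", "keyEncipherment"]
  subject            = "CN=example-cert"
  validity_in_months = 12
}
```

Adding a block of this shape is the usual workaround while the provider still marks the argument as optional.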
1.0
non_priority
0
210,306
23,750,846,983
IssuesEvent
2022-08-31 20:27:17
ManageIQ/manageiq
https://api.github.com/repos/ManageIQ/manageiq
opened
Move all server certificates to /etc/pki/tls
enhancement core/security
Followup from: https://github.com/ManageIQ/manageiq/issues/21722 - [ ] https://github.com/ManageIQ/manageiq-pods/pull/770 update [orchestrator](https://github.com/ManageIQ/manageiq-pods/blob/bc27d1d70f451d7f52abe62db94bba540df34960/manageiq-operator/pkg/helpers/miq-components/orchestrator.go#L176-L179) to not copy to `/root/.postgres`
True
non_priority
0
217,759
16,887,561,310
IssuesEvent
2021-06-23 03:48:35
microsoft/vscode
https://api.github.com/repos/microsoft/vscode
closed
[Test API] A newly added test will be triggered again in auto-run mode
testing under-discussion
Version: 1.58.0-insider (user setup) Commit: a81fff00c9dab105800118fcf8b044cd84620419 Date: 2021-06-17T05:17:34.858Z Electron: 12.0.11 Chrome: 89.0.4389.128 Node.js: 14.16.0 V8: 8.9.255.25-electron.0 OS: Windows_NT x64 10.0.19043 Steps to Reproduce: 1. In `test-provider-sample` project, in `run()` method of the `TestCase` class, change the else block to: ```ts if (actual === this.expected) { ... } else { const child1 = vscode.test.createTestItem<TestCase>({ id: `fakeTest/${this.item.uri!.toString()}/${this.item.label}#1`, label: this.item.label, uri: this.item.uri!, }); child1.range = this.item.range; child1.runnable = false; child1.debuggable = false; child1.status = vscode.TestItemStatus.Resolved; this.item.addChild(child1); const message = vscode.TestMessage.diff(`Expected ${this.item.label}`, String(this.expected), String(actual)); message.location = new vscode.Location(this.item.uri!, this.item.range!); options.appendMessage(child1, message); options.setState(child1, vscode.TestResultState.Failed, duration); } ``` _This is to imitate the case that when running parameterized tests, a test method will be triggered multiple times with different parameters during execution. So each invocation will be a added as a child of the test method during the execution._ 2. Turn on the autorun mode 3. `runTests()` will be triggered twice, for the second time, only the newly added test item `child1` is contained in the tests array. There are two reasons why I think the second `runTests()` should not be triggered: 1. The test state of `child1` has just been set for the last run 2. The test item `child1` is not runnable neither debuggable // cc @connor4312
1.0
non_priority
0
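The autorun expectation described in the record above (a pass should not re-trigger items that are marked non-runnable and whose result state was just set by the previous run) can be sketched as a simple filter. This is an illustrative sketch, not the actual VS Code internals; the class and function names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TestItem:
    # Hypothetical stand-in for a test-tree item.
    id: str
    runnable: bool = True
    state_set_in_last_run: bool = False

def select_for_autorun(items):
    """Return only items an autorun pass should trigger: skip anything
    not runnable, and anything whose state the last run already set."""
    return [i for i in items
            if i.runnable and not i.state_set_in_last_run]

parent = TestItem(id="test/42", runnable=True)
child1 = TestItem(id="fakeTest/42#1", runnable=False, state_set_in_last_run=True)
print([i.id for i in select_for_autorun([parent, child1])])  # ['test/42']
```

Under this filter the dynamically added `child1` from the report would be excluded from the second run.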
3,279
3,873,268,040
IssuesEvent
2016-04-11 16:22:20
Fermat-ORG/beta-testing-program
https://api.github.com/repos/Fermat-ORG/beta-testing-program
closed
Bit Coin Wallet _Cripto Wallet User
Performance
### Template issue error report #### Screen name: Cripto Wallet users ### Bitcoin Wallet ----- ##### Home - [x] Received transactions screen (the one on the right) - [x] Sent transactions screen (the one on the left) - [x] Balance (the circle showing the bitcoin amount) - [x] Welcome pop-up - [x] Help pop-up ##### Error report: *When entering the screen, sometimes users appear and other times nothing shows up, as if it were empty. This has happened to me several times today; I reinstalled the app but the problem persists. #### Device used for the test * Brand: ZTE * Model: Apex2 * Android version: 4.4.2
True
non_priority
0
1,368
3,925,265,252
IssuesEvent
2016-04-22 18:17:48
e-government-ua/iBP
https://api.github.com/repos/e-government-ua/iBP
closed
Dnipropetrovsk Oblast - Issuance of a certificate of non-receipt of alimony - alimony amount
In process of testing
[Послуга 3 Отримання довідки про аліменти.docx](https://github.com/e-government-ua/iBP/files/198059/3.docx) [ЗАЯВКА ПРО ВИДАЧУ ДОВІДКИ ПРО РОЗМІР АЛІМЕНТІВ.docx](https://github.com/e-government-ua/iBP/files/198062/default.docx) [ЗАЯВКА ПО ДОВІДКЕ ПО АЛІМЕНТАМ.docx](https://github.com/e-government-ua/iBP/files/198063/default.docx)
1.0
non_priority
0
334,049
24,401,793,234
IssuesEvent
2022-10-05 02:38:50
dankrzeminski32/BirthdayDiscordBot
https://api.github.com/repos/dankrzeminski32/BirthdayDiscordBot
closed
Adding @dataclasses
documentation enhancement
Current: We have a bunch of classes that use `__init__` and `__repr__` to create instance variables. Expected: Import and use dataclasses, which removes the need to write `__init__` and `__repr__` by hand; the decorator generates them for you. You just name the instance variables below the class. EX: `@dataclass class Test: name: str id: int` Reason: Cuts the number of lines in each file down a tad.
1.0
non_priority
0
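A minimal sketch of the change the record above requests; the class name `Test` and the fields `name` and `id` come from the issue's own example.

```python
from dataclasses import dataclass

# Before: __init__ and __repr__ written by hand for every class.
class TestManual:
    def __init__(self, name: str, id: int):
        self.name = name
        self.id = id

    def __repr__(self):
        return f"TestManual(name={self.name!r}, id={self.id!r})"

# After: the decorator generates __init__, __repr__ (and __eq__)
# from the annotated class attributes, cutting each class to three lines.
@dataclass
class Test:
    name: str
    id: int

t = Test(name="birthday", id=1)
print(t)  # Test(name='birthday', id=1)
```

The generated `__eq__` compares field-by-field, which the hand-written version above did not even provide.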
118,050
25,240,154,036
IssuesEvent
2022-11-15 06:32:31
hemx0147/TDVFuzz
https://api.github.com/repos/hemx0147/TDVFuzz
closed
read into CodeQL
code locations
read into the [CodeQL analysis engine](https://github.com/github/codeql) and figure out whether it could be used for **RQ2**: _find code locations that consume untrusted VMM input_ this is important to estimate the effort and time required for **RQ2**
1.0
non_priority
0
83,980
24,187,924,007
IssuesEvent
2022-09-23 14:48:06
xamarin/xamarin-macios
https://api.github.com/repos/xamarin/xamarin-macios
closed
FileCopier.cs does not correctly Log\Error when used in msbuild
enhancement macOS iOS msbuild
https://github.com/xamarin/xamarin-macios/pull/5167#discussion_r238684107 This PR uses CWL\throwing an exception to report when run as part of an msbuild task. Fixing this is tricky, as the LogError\LogMessage APIs are instance variables only accessible from the task. We could stash the task somewhere in a static for reporting or bubble up some event to the task or something else, but in any case we need to fix this.
1.0
non_priority
0
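The "stash the task somewhere in a static" option mentioned in the record above can be sketched language-agnostically. The real code is C#/MSBuild; this Python sketch with hypothetical names only illustrates the pattern of a module-level log sink that free functions fall back on when no task instance is in reach.

```python
import sys

# Module-level ("static") stash for the currently running task's logger.
_current_log_error = None

def set_task_logger(log_error):
    """Called by the task before invoking helpers like a file copier."""
    global _current_log_error
    _current_log_error = log_error

def report_error(message: str):
    """Helpers call this instead of printing or raising directly.
    Falls back to stderr when no task logger has been stashed."""
    if _current_log_error is not None:
        _current_log_error(message)
    else:
        print(message, file=sys.stderr)

# Usage: the task stashes its LogError-equivalent; helpers report through it.
captured = []
set_task_logger(captured.append)
report_error("copy failed: destination unwritable")
```

The trade-off the issue hints at is real: a static stash is simple but not safe if two tasks run concurrently in one process, which is why bubbling an event up to the task is the cleaner alternative.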
19,499
10,361,331,065
IssuesEvent
2019-09-06 09:46:03
hisptz/nacp-dashboard-v2
https://api.github.com/repos/hisptz/nacp-dashboard-v2
opened
CVE-2019-6283 (Medium) detected in opennms-opennms-source-23.0.0-1
security vulnerability
## CVE-2019-6283 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-23.0.0-1</b></p></summary> <p> <p>A Java based fault and performance management system</p> <p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p> <p>Found in HEAD commit: <a href="https://github.com/hisptz/nacp-dashboard-v2/commit/769865a5eb38141c1edeb82f84f2ccacac36048a">769865a5eb38141c1edeb82f84f2ccacac36048a</a></p> </p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (62)</summary> <p></p> <p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p> <p> - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/expand.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/expand.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/factory.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/boolean.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/util.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/value.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/emitter.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/callback_bridge.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/file.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/sass.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/operation.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/operators.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/constants.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/error_handling.hpp - 
/nacp-dashboard-v2/node_modules/node-sass/src/custom_importer_bridge.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/parser.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/constants.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/list.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/cssize.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/functions.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/util.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/custom_function_bridge.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/custom_importer_bridge.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/bind.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/eval.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/backtrace.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/extend.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_context_wrapper.h - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/sass_value_wrapper.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/error_handling.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/debugger.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/emitter.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/number.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/color.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/sass_values.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/ast.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/output.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/check_nesting.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/null.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/functions.cpp - 
/nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/cssize.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/prelexer.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/ast.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/to_c.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/to_value.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/inspect.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/color.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/values.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_context_wrapper.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/list.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/check_nesting.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/map.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/to_value.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/context.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/string.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/sass_context.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/prelexer.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/context.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/boolean.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/eval.cpp </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::parenthese_scope in prelexer.hpp. 
<p>Publish Date: 2019-01-14 <p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6283>CVE-2019-6283</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284</a></p> <p>Release Date: 2019-08-06</p> <p>Fix Resolution: 3.6.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
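The CVSS 3 base score of 6.5 quoted above can be reproduced from the listed metrics. A sketch using the standard CVSS 3.0 base-score equations for unchanged scope (the weights are the spec's values for AV:N / AC:L / PR:N / UI:R / C:N / I:N / A:H):

```python
import math

# CVSS 3.0 metric weights for the vector listed above.
AV, AC, PR, UI = 0.85, 0.77, 0.85, 0.62   # Network / Low / None / Required
C, I, A = 0.0, 0.0, 0.56                  # None / None / High

def roundup(x):
    """CVSS 'round up to one decimal place'."""
    return math.ceil(x * 10) / 10

iss = 1 - (1 - C) * (1 - I) * (1 - A)     # impact sub-score
impact = 6.42 * iss                        # scope unchanged
exploitability = 8.22 * AV * AC * PR * UI
base = roundup(min(impact + exploitability, 10)) if impact > 0 else 0.0
print(base)  # 6.5
```

High availability impact alone, reachable over the network with no privileges, is what carries the score to Medium here.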
True
CVE-2019-6283 (Medium) detected in opennms-opennms-source-23.0.0-1 - ## CVE-2019-6283 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-23.0.0-1</b></p></summary> <p> <p>A Java based fault and performance management system</p> <p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p> <p>Found in HEAD commit: <a href="https://github.com/hisptz/nacp-dashboard-v2/commit/769865a5eb38141c1edeb82f84f2ccacac36048a">769865a5eb38141c1edeb82f84f2ccacac36048a</a></p> </p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (62)</summary> <p></p> <p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p> <p> - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/expand.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/expand.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/factory.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/boolean.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/util.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/value.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/emitter.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/callback_bridge.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/file.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/sass.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/operation.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/operators.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/constants.hpp - 
/nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/error_handling.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/custom_importer_bridge.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/parser.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/constants.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/list.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/cssize.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/functions.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/util.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/custom_function_bridge.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/custom_importer_bridge.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/bind.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/eval.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/backtrace.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/extend.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_context_wrapper.h - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/sass_value_wrapper.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/error_handling.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/debugger.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/emitter.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/number.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/color.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/sass_values.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/ast.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/output.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/check_nesting.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/null.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp - 
/nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/functions.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/cssize.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/prelexer.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/ast.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/to_c.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/to_value.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/inspect.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/color.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/values.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_context_wrapper.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/list.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/check_nesting.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/map.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/to_value.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/context.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/string.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/sass_context.cpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/prelexer.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/context.hpp - /nacp-dashboard-v2/node_modules/node-sass/src/sass_types/boolean.h - /nacp-dashboard-v2/node_modules/node-sass/src/libsass/src/eval.cpp </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::parenthese_scope in prelexer.hpp. 
<p>Publish Date: 2019-01-14 <p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6283>CVE-2019-6283</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284</a></p> <p>Release Date: 2019-08-06</p> <p>Fix Resolution: 3.6.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
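The suggested fix above is a version upgrade: CVE-2019-6283 is resolved in LibSass 3.6.0. A minimal Python sketch of the version gate that decides whether a given build is still affected (a hypothetical helper for illustration, not part of any WhiteSource tooling):

```python
# Hypothetical helper: decide whether a LibSass build is affected by
# CVE-2019-6283, which the advisory lists as fixed in 3.6.0.
def is_vulnerable(version: str, fixed: tuple = (3, 6, 0)) -> bool:
    # Compare dotted version strings numerically, e.g. "3.5.5" -> (3, 5, 5),
    # so that "3.10.0" correctly sorts above "3.6.0".
    parts = tuple(int(p) for p in version.split("."))
    return parts < fixed
```

Under this scheme `is_vulnerable("3.5.5")` is `True` and `is_vulnerable("3.6.0")` is `False`, matching the advisory's affected/fixed boundary.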
non_priority
cve medium detected in opennms opennms source cve medium severity vulnerability vulnerable library opennmsopennms source a java based fault and performance management system library home page a href found in head commit a href library source files the source files were matched to this source library based on a best effort match source libraries are selected from a list of probable public libraries nacp dashboard node modules node sass src libsass src expand hpp nacp dashboard node modules node sass src libsass src expand cpp nacp dashboard node modules node sass src sass types factory cpp nacp dashboard node modules node sass src sass types boolean cpp nacp dashboard node modules node sass src libsass src util hpp nacp dashboard node modules node sass src sass types value h nacp dashboard node modules node sass src libsass src emitter hpp nacp dashboard node modules node sass src callback bridge h nacp dashboard node modules node sass src libsass src file cpp nacp dashboard node modules node sass src libsass src sass cpp nacp dashboard node modules node sass src libsass src operation hpp nacp dashboard node modules node sass src libsass src operators hpp nacp dashboard node modules node sass src libsass src constants hpp nacp dashboard node modules node sass src libsass src error handling hpp nacp dashboard node modules node sass src custom importer bridge cpp nacp dashboard node modules node sass src libsass src parser hpp nacp dashboard node modules node sass src libsass src constants cpp nacp dashboard node modules node sass src sass types list cpp nacp dashboard node modules node sass src libsass src cssize cpp nacp dashboard node modules node sass src libsass src functions hpp nacp dashboard node modules node sass src libsass src util cpp nacp dashboard node modules node sass src custom function bridge cpp nacp dashboard node modules node sass src custom importer bridge h nacp dashboard node modules node sass src libsass src bind cpp nacp dashboard node 
modules node sass src libsass src eval hpp nacp dashboard node modules node sass src libsass src backtrace cpp nacp dashboard node modules node sass src libsass src extend cpp nacp dashboard node modules node sass src sass context wrapper h nacp dashboard node modules node sass src sass types sass value wrapper h nacp dashboard node modules node sass src libsass src error handling cpp nacp dashboard node modules node sass src libsass src debugger hpp nacp dashboard node modules node sass src libsass src emitter cpp nacp dashboard node modules node sass src sass types number cpp nacp dashboard node modules node sass src sass types color h nacp dashboard node modules node sass src libsass src sass values cpp nacp dashboard node modules node sass src libsass src ast hpp nacp dashboard node modules node sass src libsass src output cpp nacp dashboard node modules node sass src libsass src check nesting cpp nacp dashboard node modules node sass src sass types null cpp nacp dashboard node modules node sass src libsass src ast def macros hpp nacp dashboard node modules node sass src libsass src functions cpp nacp dashboard node modules node sass src libsass src cssize hpp nacp dashboard node modules node sass src libsass src prelexer cpp nacp dashboard node modules node sass src libsass src ast cpp nacp dashboard node modules node sass src libsass src to c cpp nacp dashboard node modules node sass src libsass src to value hpp nacp dashboard node modules node sass src libsass src ast fwd decl hpp nacp dashboard node modules node sass src libsass src inspect hpp nacp dashboard node modules node sass src sass types color cpp nacp dashboard node modules node sass src libsass src values cpp nacp dashboard node modules node sass src sass context wrapper cpp nacp dashboard node modules node sass src sass types list h nacp dashboard node modules node sass src libsass src check nesting hpp nacp dashboard node modules node sass src sass types map cpp nacp dashboard node modules node 
sass src libsass src to value cpp nacp dashboard node modules node sass src libsass src context cpp nacp dashboard node modules node sass src sass types string cpp nacp dashboard node modules node sass src libsass src sass context cpp nacp dashboard node modules node sass src libsass src prelexer hpp nacp dashboard node modules node sass src libsass src context hpp nacp dashboard node modules node sass src sass types boolean h nacp dashboard node modules node sass src libsass src eval cpp vulnerability details in libsass a heap based buffer over read exists in sass prelexer parenthese scope in prelexer hpp publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
76,664
9,478,855,345
IssuesEvent
2019-04-20 02:11:24
AgileVentures/sfn-client
https://api.github.com/repos/AgileVentures/sfn-client
closed
Make Trending Artists more responsive
css design help wanted review styling
Make Trending Artists more responsive ## Expected Behavior Trending Artists should be more responsive ## Current Behavior <img width="351" alt="Screenshot 2019-04-16 13 36 21" src="https://user-images.githubusercontent.com/11988089/56235338-c3dda100-6076-11e9-82e4-a4d6644db1ad.png"> ## Your Environment * Version used: * Operating System and version (desktop or mobile): iPhoneX
1.0
Make Trending Artists more responsive - Make Trending Artists more responsive ## Expected Behavior Trending Artists should be more responsive ## Current Behavior <img width="351" alt="Screenshot 2019-04-16 13 36 21" src="https://user-images.githubusercontent.com/11988089/56235338-c3dda100-6076-11e9-82e4-a4d6644db1ad.png"> ## Your Environment * Version used: * Operating System and version (desktop or mobile): iPhoneX
non_priority
make trending artists more responsive make trending artists more responsive expected behavior trending artists should be more responsive current behavior img width alt screenshot src your environment version used operating system and version desktop or mobile iphonex
0
239,304
26,223,042,695
IssuesEvent
2023-01-04 16:15:53
RG4421/Prebid.js
https://api.github.com/repos/RG4421/Prebid.js
reopened
CVE-2011-4969 (Low) detected in jquery-1.4.2.min.js
security vulnerability
## CVE-2011-4969 - Low Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.4.2.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.2/jquery.min.js</a></p> <p>Path to dependency file: /node_modules/faker/examples/browser/index.html</p> <p>Path to vulnerable library: /node_modules/faker/examples/browser/js/jquery.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.4.2.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/RG4421/Prebid.js/commit/27177360829b424f1689a61dcd534ddfd40f7842">27177360829b424f1689a61dcd534ddfd40f7842</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Cross-site scripting (XSS) vulnerability in jQuery before 1.6.3, when using location.hash to select elements, allows remote attackers to inject arbitrary web script or HTML via a crafted tag. <p>Publish Date: 2013-03-08 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2011-4969>CVE-2011-4969</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.7</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2011-4969">https://nvd.nist.gov/vuln/detail/CVE-2011-4969</a></p> <p>Release Date: 2013-03-08</p> <p>Fix Resolution: 1.6.3</p> </p> </details> <p></p>
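The mechanism behind CVE-2011-4969 is jQuery's string dispatch: before the fix, any string containing something tag-like was parsed as HTML, so passing `location.hash` into `$()` let a crafted hash inject markup. A rough Python model of that dispatch decision (an assumption for illustration, not jQuery's actual source):

```python
import re

# Simplified model of how older jQuery classified a string argument:
# anything that looks like it contains an HTML tag was parsed as HTML
# and inserted into the DOM; everything else went to the selector engine.
HTML_LIKE = re.compile(r"<(\w+)")

def jquery_dispatch(arg: str) -> str:
    if HTML_LIKE.search(arg):
        return "html"      # parsed as markup -> script in the string can run
    return "selector"      # looked up as a CSS selector
```

A benign hash like `"#nav"` is treated as a selector, while a crafted one such as `"#<img src=x onerror=alert(1)>"` falls into the HTML branch, which is exactly the injection path the fixed version (1.6.3) closes off.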
True
CVE-2011-4969 (Low) detected in jquery-1.4.2.min.js - ## CVE-2011-4969 - Low Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.4.2.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.2/jquery.min.js</a></p> <p>Path to dependency file: /node_modules/faker/examples/browser/index.html</p> <p>Path to vulnerable library: /node_modules/faker/examples/browser/js/jquery.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.4.2.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/RG4421/Prebid.js/commit/27177360829b424f1689a61dcd534ddfd40f7842">27177360829b424f1689a61dcd534ddfd40f7842</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Cross-site scripting (XSS) vulnerability in jQuery before 1.6.3, when using location.hash to select elements, allows remote attackers to inject arbitrary web script or HTML via a crafted tag. 
<p>Publish Date: 2013-03-08 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2011-4969>CVE-2011-4969</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.7</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2011-4969">https://nvd.nist.gov/vuln/detail/CVE-2011-4969</a></p> <p>Release Date: 2013-03-08</p> <p>Fix Resolution: 1.6.3</p> </p> </details> <p></p>
non_priority
cve low detected in jquery min js cve low severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file node modules faker examples browser index html path to vulnerable library node modules faker examples browser js jquery js dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch master vulnerability details cross site scripting xss vulnerability in jquery before when using location hash to select elements allows remote attackers to inject arbitrary web script or html via a crafted tag publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
0
76,595
7,541,158,764
IssuesEvent
2018-04-17 08:57:48
web-platform-tests/results-collection
https://api.github.com/repos/web-platform-tests/results-collection
closed
Support external runs (submitting results) for more browsers
enhancement project:wpt.fyi test-running
In https://github.com/w3c/wptdashboard/issues/219 @breezet and I discussed getting Baidu Browser into wpt.fyi, and I've also discussed Opera briefly with @jensl. Other such useful cases aren't hard to come up with: * Servo * content_shell results from Chromium debug builds * UC Browser * Samsung S Browser * etc... For any case where the wpt.fyi project isn't yet prioritizing running the browsers, we should still make it possible for others to perform the runs and submit the results with minimal effort. This has implications for the wpt.fyi UI of course, as well as how to communicate with these external runners.
1.0
Support external runs (submitting results) for more browsers - In https://github.com/w3c/wptdashboard/issues/219 @breezet and I were discussed getting Baidu Browser into wpt.fyi, and I've also discussed Opera briefly with @jensl. Other such useful cases aren't hard to come up with: * Servo * content_shell results from Chromium debug builds * UC Browser * Samsung S Browser * etc... For any case where the wpt.fyi project isn't yet prioritizing running the browsers, we should still make it possible for others to perform the runs and submit the results with minimal effort. This has implications for the wpt.fyi UI of course, as well as how to communicate with these external runners.
non_priority
support external runs submitting results for more browsers in breezet and i were discussed getting baidu browser into wpt fyi and i ve also discussed opera briefly with jensl other such useful cases aren t hard to come up with servo content shell results from chromium debug builds uc browser samsung s browser etc for any case where the wpt fyi project isn t yet prioritizing running the browsers we should still make it possible for others to perform the runs and submit the results with minimal effort this has implications for the wpt fyi ui of course as well as how to communicate with these external runners
0
63,672
7,737,486,557
IssuesEvent
2018-05-28 08:26:45
CLARIAH/wp5_mediasuite
https://api.github.com/repos/CLARIAH/wp5_mediasuite
closed
Reorder steps in the Collection inspector recipe
MS-Global MS-Recipe Collection inspector MSv3 (expected) Theme: Interaction design (worflow)
"The dropdown for “Selected collections” is confusing given that there is nothing else on the page when it’s closed." (comes from Gitter) Suggestion: Perhaps since this recipe is divided into steps, make that more clear in the interface separating the blocks differently? 1) Selecting collections (instead of "selected") 2) Pre-Analyzing metadata (metadata inspector) 3) Collection analytics via timeplot 4) Sending collection to recipe The blocks are then 1 + 2; 3; 4 (in this order), not having 4 (recipe selector) under 1+2 but under 3 (field selector for collection analytics). ![screen shot 2017-05-23 at 09 19 29](https://cloud.githubusercontent.com/assets/8133228/26342911/3857c89e-3f9a-11e7-8dbb-8dc6b635b2c3.png) Issue related to #62, #78.
1.0
Reorder steps in the Collection inspector recipe - "The dropdown for “Selected collections” is confusing given that there is nothing else on the page when it’s closed." (comes from Gitter) Suggestion: Perhaps since this recipe is divided into steps, make that more clear in the interface separating the blocks differently? 1) Selecting collections (instead of "selected") 2) Pre-Analyzing metadata (metadata inspector) 3) Collection analytics via timeplot 4) Sending collection to recipe The blocks are then 1 + 2; 3; 4 (in this order), not having 4 (recipe selector) under 1+2 but under 3 (field selector for collection analytics). ![screen shot 2017-05-23 at 09 19 29](https://cloud.githubusercontent.com/assets/8133228/26342911/3857c89e-3f9a-11e7-8dbb-8dc6b635b2c3.png) Issue related to #62, #78.
non_priority
reorder steps in the collection inspector recipe the dropdown for “selected collections” is confusing given that there is nothing else on the page when it’s closed comes from gitter suggestion perhaps since this recipe is divided into steps make that more clear in the interface separating the blocks differently selecting collections instead of selected pre analyzing metadata metadata inspector collection analytics via timeplot sending collection to recipe the blocks are then in this order not having recipe selector under but under field selector for collection analytics issue related to
0
309,733
23,304,194,952
IssuesEvent
2022-08-07 19:28:52
bitwes/Gut
https://api.github.com/repos/bitwes/Gut
closed
Unclear Parameterized Tests example
documentation
While going through the <a href="https://github.com/bitwes/Gut/wiki/Parameterized-Tests">Parameterized Tests</a> docs it's unclear how the following <code>test_foo</code> function works: extends 'res://addons/gut/test.gd' var foo_params = [[1, 2, 3], ['a', 'b', 'c']] func test_foo(params=use_parameters(foo_params)): var foo = Foo.new() var result = foo.do_something(params[0], params[1]) assert_eq(result, params[2]) As well as the <code>do_something</code> method within the <code>Foo</code> class. Attempted to recreate this on my end to see if I could get a better understanding of things. My <b>first attempt</b> was: # class located at: 'res//scripts/foo.gd' func do_something(param): return param Which returned the following error: <code>Invalid call to function 'do_something' in base 'Node (foo.gd).' Expected 1 arguments.</code> This makes sense, since I'm passing the arguments <code>params[0]</code> & <code>params[1]</code> to a function that accepts a single argument. My <b>second attempt</b> was: # class located at: 'res//scripts/foo.gd' func do_something(param0, param1): return param0 This time it runs but my tests are failing: <code> (call #1) with paramters: [1, 2, 3] [Failed]: [1] expected to equal [3]</code> I understand that <code>result</code> is basically <code>foo_params[i][0]</code> in this case, but how can I make the <code>assert_eq</code> pass with <code>params[2]</code>? My <b>third attempt</b> was: # class located at: 'res//scripts/foo.gd' func do_something(param0, param1): return param0[2] Which returned the following error: <code>Invalid get index '2' (on base: 'int').</code> which makes total sense since GDScript does not know that param0 should be an array. Tried googling how to do it and could not find a way to do it. 
My <b>fourth & final attempt</b> was: # class located at: 'res//scripts/foo.gd' func do_something(param0, param1, param2): return param2 Which returned the following error: <code>Invalid call to function 'do_something' in base 'Node (foo.gd).' Expected 1 arguments.</code> Which is basically the first error. Now I went full circle and is why I'm asking for a clarification here. Could you please elaborate on the example of <a href="https://github.com/bitwes/Gut/wiki/Parameterized-Tests">Parameterized Tests</a> docs? I'm really not sure how to implement one, given my current knowledge of Gut & Godot.
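A plain-Python analogue may make the wiki's intent clearer: each parameter row bundles two inputs plus the expected result, so <code>do_something</code> takes two arguments and combines them, and the test asserts against the row's third element. The wiki never shows <code>Foo</code>, so the sum/concatenation behavior below is an assumption (and the expected value for the string row is adjusted accordingly, since the wiki's <code>['a', 'b', 'c']</code> row would not pass under simple concatenation):

```python
# Each parameter row is (input_a, input_b, expected): the test feeds the
# first two elements to do_something and asserts against the third.
foo_params = [(1, 2, 3), ("a", "b", "ab")]

def do_something(a, b):
    # Assumed implementation: combine the two inputs (sum for numbers,
    # concatenation for strings). The real Foo class is not shown.
    return a + b

def test_foo():
    for a, b, expected in foo_params:
        assert do_something(a, b) == expected
```

In other words, the failing second attempt returned `param0` (the row's first element) where the assertion expected something derived from *both* inputs, which is why `[1]` was compared against `[3]`.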
1.0
Unclear Parameterized Tests example - While going throughout the <a href="https://github.com/bitwes/Gut/wiki/Parameterized-Tests">Parameterized Tests</a> docs it's uncertain how the following <code>test_foo</code> function works: extends 'res://addons/gut/test.gd' var foo_params = [[1, 2, 3], ['a', 'b', 'c']] func test_foo(params=use_parameters(foo_params)): var foo = Foo.new() var result = foo.do_something(params[0], params[1]) assert_eq(result, params[2]) As well as the <code>do_something</code> method within the <code>Foo</code> class. Attempted to recreate this on my end to see if I could get a better understanding of things. My <b>first attempt</b> was: # class located at: 'res//scripts/foo.gd' func do_something(param): return param Which returned the following error: <code>Invalid call to function 'do_something' in base 'Node (foo.gd).' Expected 1 arguments.</code> This makes, sense since I'm passing the arguments <code>params[0]</code> & <code>params[1]</code> to the function that accepts a single argument. My <b>second attempt</b> was: # class located at: 'res//scripts/foo.gd' func do_something(param0, param1): return param0 This time it runs but, my tests are failing: <code> (call #1) with paramters: [1, 2, 3] [Failed]: [1] expected to equal [3]</code> I understand that <code>result</code> is basically <code>foo_params[i][0]</code>, on this case but, how can I make the <code>assert_eq</code> with <code>params[2]</code>. My <b>third attempt</b> was: # class located at: 'res//scripts/foo.gd' func do_something(param0, param1): return param0[2] Which returned the following error: <code>Invalid get index '2' (on base: 'int').</code> which makes total sense since GDScript does not know that param0 should be an array. Tried googling how to do it and could not find a way to do it. 
My <b>fourth & final attempt</b> was: # class located at: 'res//scripts/foo.gd' func do_something(param0, param1, param2): return param2 Which returned the following error: <code>Invalid call to function 'do_something' in base 'Node (foo.gd).' Expected 1 arguments.</code> Which is basically the first error. Now I went full circle and is why I'm asking for a clarification here. Could you please elaborate on the example of <a href="https://github.com/bitwes/Gut/wiki/Parameterized-Tests">Parameterized Tests</a> docs? I'm really not sure how to implement one, given my current knowledge of Gut & Godot.
non_priority
unclear parameterized tests example while going throughout the docs it s uncertain how the following test foo function works extends res addons gut test gd var foo params func test foo params use parameters foo params var foo foo new var result foo do something params params assert eq result params as well as the do something method within the foo class attempted to recreate this on my end to see if i could get a better understanding of things my first attempt was class located at res scripts foo gd func do something param return param which returned the following error invalid call to function do something in base node foo gd expected arguments this makes sense since i m passing the arguments params params to the function that accepts a single argument my second attempt was class located at res scripts foo gd func do something return this time it runs but my tests are failing call with paramters expected to equal i understand that result is basically foo params on this case but how can i make the assert eq with params my third attempt was class located at res scripts foo gd func do something return which returned the following error invalid get index on base int which makes total sense since gdscript does not know that should be an array tried googling how to do it and could not find a way to do it my fourth final attempt was class located at res scripts foo gd func do something return which returned the following error invalid call to function do something in base node foo gd expected arguments which is basically the first error now i went full circle and is why i m asking for a clarification here could you please elaborate on the example of docs i m really not sure how to implement one given my current knowledge of gut godot
0
60,533
12,127,167,784
IssuesEvent
2020-04-22 18:15:34
terraform-providers/terraform-provider-aws
https://api.github.com/repos/terraform-providers/terraform-provider-aws
closed
Referencing namespace variable in CodePipeline action fails
needs-triage service/codepipeline
<!--- Please note the following potential times when an issue might be in Terraform core: * [Configuration Language](https://www.terraform.io/docs/configuration/index.html) or resource ordering issues * [State](https://www.terraform.io/docs/state/index.html) and [State Backend](https://www.terraform.io/docs/backends/index.html) issues * [Provisioner](https://www.terraform.io/docs/provisioners/index.html) issues * [Registry](https://registry.terraform.io/) issues * Spans resources across multiple providers If you are running into one of these scenarios, we recommend opening an issue in the [Terraform core repository](https://github.com/hashicorp/terraform/) instead. ---> <!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Terraform Version Terraform v0.11.15-oci ### Affected Resource(s) <!--- Please list the affected resources and data sources. 
---> * aws_codepipeline ### Terraform Configuration Files <!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code ---> ```hcl resource "aws_codepipeline" "build_push_deploy" { # omitted code stage { name = "BuildAndPush" action { name = "BuildDockerImages" category = "Build" owner = "AWS" provider = "CodeBuild" input_artifacts = ["source"] output_artifacts = ["build"] version = "1" namespace = "BuildVariables" # <----- this is actually not in the code and it's done manually in console as this PR is not yet merged in the AWS provider https://github.com/terraform-providers/terraform-provider-aws/issues/11418 configuration = { ProjectName = "build" EnvironmentVariables = <<EOF [{"name":"SERVICES","value":"test","type":"PLAINTEXT"}] EOF } run_order = 1 } action { name = "PushToECR" category = "Build" owner = "AWS" provider = "CodeBuild" input_artifacts = ["source", "build"] version = "1" configuration = { ProjectName = "push" PrimarySource = "source" EnvironmentVariables = <<EOF [{"name":"SERVICES","type":"PLAINTEXT","value":"#{BuildVariables.SERVICES}"}] EOF } run_order = 2 } } # omitted code } ``` ### Expected Behavior Referencing a namespace variable should proceed even if it's not declared in previous steps. For example it can be defined only in the `buildspec.yml` file and then there is no record of existence of the namespace variable before running the pipeline. ``` env: variables: SERVICES: "service1" exported-variables: - SERVICES ``` Configuration in the AWS Console allows referencing variables that are known only after running previous CodePipeline Actions. ### Actual Behavior Before running the Plan, the variable namespace is already defined in the AWS Console. [screenshot](https://ibb.co/C5RdXR6) As you can see in the Plan - Terraform is not removing that settings. 
The only change that I am adding is referencing that variable from previous step as described in the [AWS docs](https://docs.aws.amazon.com/codepipeline/latest/userguide/reference-variables.html). Terraform plan ``` Terraform will perform the following actions: ~ aws_codepipeline.build_push_deploy stage.1.action.1.configuration.%: "2" => "3" stage.1.action.1.configuration.EnvironmentVariables: "" => "[{\"name\":\"SERVICES\",\"type\":\"PLAINTEXT\",\"value\":\"#{BuildVariables.SERVICES}\"}]\n" ``` Error output ```aws_codepipeline.build_push_deploy: Modifying... (ID: mallpay-be-accept-build-push-deploy-v2) stage.1.action.1.configuration.%: "2" => "3" stage.1.action.1.configuration.EnvironmentVariables: "" => "[{\"name\":\"SERVICES\",\"type\":\"PLAINTEXT\",\"value\":\"#{BuildVariables.SERVICES}\"}]\n" Error: Error applying plan: 1 error occurred: * aws_codepipeline.build_push_deploy: 1 error occurred: * aws_codepipeline.build_push_deploy: [ERROR] Error updating CodePipeline (mallpay-be-accept-build-push-deploy-v2): InvalidActionDeclarationException: Valid format for a pipeline execution variable reference is a namespace and a key separated by a period (.). The following pipeline execution variables are referencing a namespace that does not exist. StageName=[BuildAndPush], ActionName=[PushToECR], ActionConfigurationKey=[EnvironmentVariables], VariableReferenceText=[BuildVariables.SERVICES] ``` ### Steps to Reproduce <!--- Please list the steps required to reproduce the issue. ---> 1. In CodePipeline action in AWS Console - manually create a variable namespace (until this is merged https://github.com/terraform-providers/terraform-provider-aws/issues/11418) 2. Terraform won't try to remove the namespace - as it doesn't know about that field. [screenshot](https://ibb.co/C5RdXR6) 3. Define the variable in the Action Environment / `buildspec.yml` 4. 
run `terraform apply` ### References <!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor documentation? For example: ---> * #11418
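The InvalidActionDeclarationException boils down to an ordering rule: a `#{Namespace.Key}` reference is only valid if an earlier action in the pipeline declared that namespace. A rough Python sketch of that validation (the data structure and field names here are assumptions for illustration, not the CodePipeline API):

```python
import re

# Matches pipeline variable references of the form #{Namespace.Key}.
VAR_REF = re.compile(r"#\{(\w+)\.(\w+)\}")

def unresolved_refs(stages):
    # stages: list of {"actions": [{"namespace": str or None,
    #                               "configuration": {str: str}}]},
    # with actions listed in execution order.
    declared = set()
    missing = []
    for stage in stages:
        for action in stage["actions"]:
            # Check references before recording this action's own namespace:
            # an action cannot consume variables it exports itself.
            for value in action.get("configuration", {}).values():
                for ns, key in VAR_REF.findall(value):
                    if ns not in declared:
                        missing.append(f"{ns}.{key}")
            if action.get("namespace"):
                declared.add(action["namespace"])
    return missing
```

With `BuildVariables` declared on the first action, the `#{BuildVariables.SERVICES}` reference in the second resolves; if the namespace is never declared (the situation the provider forces until #11418 lands), the check reports `BuildVariables.SERVICES` as unresolved, mirroring the error above.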
1.0
Referencing namespace variable in CodePipeline action fails - <!--- Please note the following potential times when an issue might be in Terraform core: * [Configuration Language](https://www.terraform.io/docs/configuration/index.html) or resource ordering issues * [State](https://www.terraform.io/docs/state/index.html) and [State Backend](https://www.terraform.io/docs/backends/index.html) issues * [Provisioner](https://www.terraform.io/docs/provisioners/index.html) issues * [Registry](https://registry.terraform.io/) issues * Spans resources across multiple providers If you are running into one of these scenarios, we recommend opening an issue in the [Terraform core repository](https://github.com/hashicorp/terraform/) instead. ---> <!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Terraform Version Terraform v0.11.15-oci ### Affected Resource(s) <!--- Please list the affected resources and data sources. 
---> * aws_codepipeline ### Terraform Configuration Files <!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code ---> ```hcl resource "aws_codepipeline" "build_push_deploy" { # omitted code stage { name = "BuildAndPush" action { name = "BuildDockerImages" category = "Build" owner = "AWS" provider = "CodeBuild" input_artifacts = ["source"] output_artifacts = ["build"] version = "1" namespace = "BuildVariables" # <----- this is actually not in the code and it's done manually in console as this PR is not yet merged in the AWS provider https://github.com/terraform-providers/terraform-provider-aws/issues/11418 configuration = { ProjectName = "build" EnvironmentVariables = <<EOF [{"name":"SERVICES","value":"test","type":"PLAINTEXT"}] EOF } run_order = 1 } action { name = "PushToECR" category = "Build" owner = "AWS" provider = "CodeBuild" input_artifacts = ["source", "build"] version = "1" configuration = { ProjectName = "push" PrimarySource = "source" EnvironmentVariables = <<EOF [{"name":"SERVICES","type":"PLAINTEXT","value":"#{BuildVariables.SERVICES}"}] EOF } run_order = 2 } } # omitted code } ``` ### Expected Behavior Referencing a namespace variable should proceed even if it's not declared in previous steps. For example it can be defined only in the `buildspec.yml` file and then there is no record of existence of the namespace variable before running the pipeline. ``` env: variables: SERVICES: "service1" exported-variables: - SERVICES ``` Configuration in the AWS Console allows referencing variables that are known only after running previous CodePipeline Actions. ### Actual Behavior Before running the Plan, the variable namespace is already defined in the AWS Console. [screenshot](https://ibb.co/C5RdXR6) As you can see in the Plan - Terraform is not removing that settings. 
The only change that I am adding is referencing that variable from previous step as described in the [AWS docs](https://docs.aws.amazon.com/codepipeline/latest/userguide/reference-variables.html). Terraform plan ``` Terraform will perform the following actions: ~ aws_codepipeline.build_push_deploy stage.1.action.1.configuration.%: "2" => "3" stage.1.action.1.configuration.EnvironmentVariables: "" => "[{\"name\":\"SERVICES\",\"type\":\"PLAINTEXT\",\"value\":\"#{BuildVariables.SERVICES}\"}]\n" ``` Error output ```aws_codepipeline.build_push_deploy: Modifying... (ID: mallpay-be-accept-build-push-deploy-v2) stage.1.action.1.configuration.%: "2" => "3" stage.1.action.1.configuration.EnvironmentVariables: "" => "[{\"name\":\"SERVICES\",\"type\":\"PLAINTEXT\",\"value\":\"#{BuildVariables.SERVICES}\"}]\n" Error: Error applying plan: 1 error occurred: * aws_codepipeline.build_push_deploy: 1 error occurred: * aws_codepipeline.build_push_deploy: [ERROR] Error updating CodePipeline (mallpay-be-accept-build-push-deploy-v2): InvalidActionDeclarationException: Valid format for a pipeline execution variable reference is a namespace and a key separated by a period (.). The following pipeline execution variables are referencing a namespace that does not exist. StageName=[BuildAndPush], ActionName=[PushToECR], ActionConfigurationKey=[EnvironmentVariables], VariableReferenceText=[BuildVariables.SERVICES] ``` ### Steps to Reproduce <!--- Please list the steps required to reproduce the issue. ---> 1. In CodePipeline action in AWS Console - manually create a variable namespace (until this is merged https://github.com/terraform-providers/terraform-provider-aws/issues/11418) 2. Terraform won't try to remove the namespace - as it doesn't know about that field. [screenshot](https://ibb.co/C5RdXR6) 3. Define the variable in the Action Environment / `buildspec.yml` 4. 
run `terraform apply` ### References <!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor documentation? For example: ---> * #11418
non_priority
referencing namespace variable in codepipeline action fails please note the following potential times when an issue might be in terraform core or resource ordering issues and issues issues issues spans resources across multiple providers if you are running into one of these scenarios we recommend opening an issue in the instead community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or other comments that do not add relevant new information or questions they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment terraform version terraform oci affected resource s aws codepipeline terraform configuration files hcl resource aws codepipeline build push deploy omitted code stage name buildandpush action name builddockerimages category build owner aws provider codebuild input artifacts output artifacts version namespace buildvariables this is actually not in the code and it s done manually in console as this pr is not yet merged in the aws provider configuration projectname build environmentvariables eof eof run order action name pushtoecr category build owner aws provider codebuild input artifacts version configuration projectname push primarysource source environmentvariables eof eof run order omitted code expected behavior referencing a namespace variable should proceed even if it s not declared in previous steps for example it can be defined only in the buildspec yml file and then there is no record of existence of the namespace variable before running the pipeline env variables services exported variables services configuration in the aws console allows referencing variables that are known only after running previous codepipeline actions actual behavior before running the plan the variable namespace is already defined in the aws console 
as you can see in the plan terraform is not removing that settings the only change that i am adding is referencing that variable from previous step as described in the terraform plan terraform will perform the following actions aws codepipeline build push deploy stage action configuration stage action configuration environmentvariables n error output aws codepipeline build push deploy modifying id mallpay be accept build push deploy stage action configuration stage action configuration environmentvariables n error error applying plan error occurred aws codepipeline build push deploy error occurred aws codepipeline build push deploy error updating codepipeline mallpay be accept build push deploy invalidactiondeclarationexception valid format for a pipeline execution variable reference is a namespace and a key separated by a period the following pipeline execution variables are referencing a namespace that does not exist stagename actionname actionconfigurationkey variablereferencetext steps to reproduce in codepipeline action in aws console manually create a variable namespace until this is merged terraform won t try to remove the namespace as it doesn t know about that field define the variable in the action environment buildspec yml run terraform apply references information about referencing github issues are there any other github issues open or closed or pull requests that should be linked here vendor documentation for example
0
13,612
3,163,883,898
IssuesEvent
2015-09-20 18:12:16
Quaggles/Icarus
https://api.github.com/repos/Quaggles/Icarus
closed
7/8: Tutorial Update
bug design enhancement programming
1. Couldn't pick up weapon in tutorial 2. Should ask player to repeat action 3x because if its only 1x, spamming may trigger the event and now the players don't know what they've actually done. 3. Tutorial didn't continue at dash stage. Dash did not trigger next part 4. Tutorial doesn't apply to current mechanics and buttons and doesn't include everything - we know this one already 5. Add Matt's Voiceover and subtitles to the Tutorial. 6. Add the pause menu so players can leave the tutorial at any time. 7. Make tutorial Multiplayer.
1.0
7/8: Tutorial Update - 1. Couldn't pick up weapon in tutorial 2. Should ask player to repeat action 3x because if its only 1x, spamming may trigger the event and now the players don't know what they've actually done. 3. Tutorial didn't continue at dash stage. Dash did not trigger next part 4. Tutorial doesn't apply to current mechanics and buttons and doesn't include everything - we know this one already 5. Add Matt's Voiceover and subtitles to the Tutorial. 6. Add the pause menu so players can leave the tutorial at any time. 7. Make tutorial Multiplayer.
non_priority
tutorial update couldn t pick up weapon in tutorial should ask player to repeat action because if its only spamming may trigger the event and now the players don t know what they ve actually done tutorial didn t continue at dash stage dash did not trigger next part tutorial doesn t apply to current mechanics and buttons and doesn t include everything we know this one already add matt s voiceover and subtitles to the tutorial add the pause menu so players can leave the tutorial at any time make tutorial multiplayer
0
79,201
15,164,269,666
IssuesEvent
2021-02-12 13:30:10
MurmurationsNetwork/MurmurationsServices
https://api.github.com/repos/MurmurationsNetwork/MurmurationsServices
closed
Implement Index API prototype
design & code
This issue is to track the overall progress of implementing the Index API in prototype form for the complete node profile creation, change and deletion process. It will be completed when we have a working version of the Index API on a test server (not production) for public access.
1.0
Implement Index API prototype - This issue is to track the overall progress of implementing the Index API in prototype form for the complete node profile creation, change and deletion process. It will be completed when we have a working version of the Index API on a test server (not production) for public access.
non_priority
implement index api prototype this issue is to track the overall progress of implementing the index api in prototype form for the complete node profile creation change and deletion process it will be completed when we have a working version of the index api on a test server not production for public access
0
7,610
7,018,205,337
IssuesEvent
2017-12-21 12:48:36
SatelliteQE/robottelo
https://api.github.com/repos/SatelliteQE/robottelo
closed
Some tests deselected even if they have no bz skip decorator
6.3 Bug High Infrastructure
Note: tests.foreman.api.test_docker.test_positive_update_url log deselection reported 2 times this test has no bz skip seems the get_func_name in helpers do not take in consideration the class name ``` 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_docker.test_positive_update_url 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_docker.test_positive_publish_with_docker_repo_composite 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_docker.test_positive_create_using_cv 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_docker.test_positive_read_container_log 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_docker.test_positive_update_url 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_hostgroup.test_positive_update_arch 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_hostgroup.test_positive_update_content_source ```
1.0
Some tests deselected even if they have no bz skip decorator - Note: tests.foreman.api.test_docker.test_positive_update_url log deselection reported 2 times this test has no bz skip seems the get_func_name in helpers do not take in consideration the class name ``` 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_docker.test_positive_update_url 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_docker.test_positive_publish_with_docker_repo_composite 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_docker.test_positive_create_using_cv 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_docker.test_positive_read_container_log 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_docker.test_positive_update_url 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_hostgroup.test_positive_update_arch 2017-12-15 20:59:59 - conftest - DEBUG - Deselected test tests.foreman.api.test_hostgroup.test_positive_update_content_source ```
non_priority
some tests deselected even if they have no bz skip decorator note tests foreman api test docker test positive update url log deselection reported times this test has no bz skip seems the get func name in helpers do not take in consideration the class name conftest debug deselected test tests foreman api test docker test positive update url conftest debug deselected test tests foreman api test docker test positive publish with docker repo composite conftest debug deselected test tests foreman api test docker test positive create using cv conftest debug deselected test tests foreman api test docker test positive read container log conftest debug deselected test tests foreman api test docker test positive update url conftest debug deselected test tests foreman api test hostgroup test positive update arch conftest debug deselected test tests foreman api test hostgroup test positive update content source
0
23,073
11,839,740,193
IssuesEvent
2020-03-23 17:38:33
microsoft/botframework-solutions
https://api.github.com/repos/microsoft/botframework-solutions
closed
deploy_cognitive_models.ps1 not deploying connected skills intents to Dispatch
Bot Services Type: Bug customer-replied-to customer-reported
#### What project is affected? Virtual Assistant template script deploy_cognitive_models #### What language is this in? powershell #### What happens? launching ./Deployment/Scripts/deploy_cognitive_models.ps1 -language "it-it" -qnaSubscriptionKey "xxxxxxxxxxxxxxxxxxxxxxx" is not adding the Virtual Assistant connected skills to the Dispatch model #### What are the steps to reproduce this issue? - Deploy a Virtual Assistant - Connect a skill with BOTSKILL CONNECT - delete apps General and Dispatch for the Virtual Agent from LUIS portal - launch ``` ./Deployment/Scripts/deploy_cognitive_models.ps1 -language "it-it" -qnaSubscriptionKey "xxxxxxxxxxxxxxxxxxxxxxxxxx" ``` - check the deployed Dispatch app in the LUIS portal: the skills intents are not in there #### What were you expecting to happen? The redeployed Dispatch app to contain the connected skill's intents. #### Can you share any logs, error output, etc.? ``` > Deploying cognitive models ... > Initializing dispatch model ... > Parsing General LU file ... > Deploying General LUIS app ... > Setting LUIS subscription key ... > Adding General app to dispatch model ... > Parsing Chitchat LU file ... > Deploying Chitchat QnA kb ... > Adding Chitchat kb to dispatch model ... > Parsing Faq LU file ... > Deploying Faq QnA kb ... > Adding Faq kb to dispatch model ... > Creating dispatch model... > Setting LUIS subscription key ... ``` Note: the skills intents are not being added to the Dispatch model. The bot and the skills are in italian culture. #### Any screenshots or additional context?
1.0
deploy_cognitive_models.ps1 not deploying connected skills intents to Dispatch - #### What project is affected? Virtual Assistant template script deploy_cognitive_models #### What language is this in? powershell #### What happens? launching ./Deployment/Scripts/deploy_cognitive_models.ps1 -language "it-it" -qnaSubscriptionKey "xxxxxxxxxxxxxxxxxxxxxxx" is not adding the Virtual Assistant connected skills to the Dispatch model #### What are the steps to reproduce this issue? - Deploy a Virtual Assistant - Connect a skill with BOTSKILL CONNECT - delete apps General and Dispatch for the Virtual Agent from LUIS portal - launch ``` ./Deployment/Scripts/deploy_cognitive_models.ps1 -language "it-it" -qnaSubscriptionKey "xxxxxxxxxxxxxxxxxxxxxxxxxx" ``` - check the deployed Dispatch app in the LUIS portal: the skills intents are not in there #### What were you expecting to happen? The redeployed Dispatch app to contain the connected skill's intents. #### Can you share any logs, error output, etc.? ``` > Deploying cognitive models ... > Initializing dispatch model ... > Parsing General LU file ... > Deploying General LUIS app ... > Setting LUIS subscription key ... > Adding General app to dispatch model ... > Parsing Chitchat LU file ... > Deploying Chitchat QnA kb ... > Adding Chitchat kb to dispatch model ... > Parsing Faq LU file ... > Deploying Faq QnA kb ... > Adding Faq kb to dispatch model ... > Creating dispatch model... > Setting LUIS subscription key ... ``` Note: the skills intents are not being added to the Dispatch model. The bot and the skills are in italian culture. #### Any screenshots or additional context?
non_priority
deploy cognitive models not deploying connected skills intents to dispatch what project is affected virtual assistant template script deploy cognitive models what language is this in powershell what happens launching deployment scripts deploy cognitive models language it it qnasubscriptionkey xxxxxxxxxxxxxxxxxxxxxxx is not adding the virtual assistant connected skills to the dispatch model what are the steps to reproduce this issue deploy a virtual assistant connect a skill with botskill connect delete apps general and dispatch for the virtual agent from luis portal launch deployment scripts deploy cognitive models language it it qnasubscriptionkey xxxxxxxxxxxxxxxxxxxxxxxxxx check the deployed dispatch app in the luis portal the skills intents are not in there what were you expecting to happen the redeployed dispatch app to contain the connected skill s intents can you share any logs error output etc deploying cognitive models initializing dispatch model parsing general lu file deploying general luis app setting luis subscription key adding general app to dispatch model parsing chitchat lu file deploying chitchat qna kb adding chitchat kb to dispatch model parsing faq lu file deploying faq qna kb adding faq kb to dispatch model creating dispatch model setting luis subscription key note the skills intents are not being added to the dispatch model the bot and the skills are in italian culture any screenshots or additional context
0
251,363
18,947,960,492
IssuesEvent
2021-11-18 12:18:21
HU-ICT-LAB/WebVR-Demo
https://api.github.com/repos/HU-ICT-LAB/WebVR-Demo
closed
Write physics information on Wiki
documentation User Story Sprint #4
**Priority:** [2] **Time estimation:** [6] **Description:** As a developer I want to know how physics work to make objects fall and collide in a VR-game Tasks: - [ ] Write about physics - [ ] Write about collision detection **Definition of Done:** Written information about the 2 tasks in this user story
1.0
Write physics information on Wiki - **Priority:** [2] **Time estimation:** [6] **Description:** As a developer I want to know how physics work to make objects fall and collide in a VR-game Tasks: - [ ] Write about physics - [ ] Write about collision detection **Definition of Done:** Written information about the 2 tasks in this user story
non_priority
write physics information on wiki priority time estimation description as a developer i want to know how physics work to make objects fall and collide in a vr game tasks write about physics write about collision detection definition of done written information about the tasks in this user story
0
220,592
17,209,479,663
IssuesEvent
2021-07-19 00:17:50
WordPress/gutenberg
https://api.github.com/repos/WordPress/gutenberg
opened
Ideas for improving E2E test developer experience
Automated Testing [Type] Overview
@kevin940726 and I were brainstorming a few things we could try to improve the experience of writing and running E2E tests. Noting them here so that we don't forget them. - [ ] Build a web dashboard that shows which tests are the most flakey and/or slow. This might help show us which tests need the most attention. It might also let us measure our work's impact. - [ ] Look at splitting `npm run build` into its own GitHub action which runs prior to the four `npm run test-e2e` actions. This might make it quicker to restart failing E2E tests on a PR. - [ ] Fix `.wp-env.json` version to a known git commit which is updated automatically via a PR every week. This might make it less disruptive (i.e. doesn't block every single developer) when a Core change breaks Gutenberg CI. - [ ] Look at automatically retrying E2E tests. This might help with stability. - [ ] Look at splitting the 4 `npm run test-e2e` actions into 6 actions. This might speed up E2E test runs on a PR. - [ ] Capture screencasts of failed E2E tests. This might make it easier to debug failing tests. - [ ] Look at rewriting flakey tests using `puppeteer-testing-library`. This might help with stability. - [ ] Investigate how else to improve E2E execution time.
1.0
Ideas for improving E2E test developer experience - @kevin940726 and I were brainstorming a few things we could try to improve the experience of writing and running E2E tests. Noting them here so that we don't forget them. - [ ] Build a web dashboard that shows which tests are the most flakey and/or slow. This might help show us which tests need the most attention. It might also let us measure our work's impact. - [ ] Look at splitting `npm run build` into its own GitHub action which runs prior to the four `npm run test-e2e` actions. This might make it quicker to restart failing E2E tests on a PR. - [ ] Fix `.wp-env.json` version to a known git commit which is updated automatically via a PR every week. This might make it less disruptive (i.e. doesn't block every single developer) when a Core change breaks Gutenberg CI. - [ ] Look at automatically retrying E2E tests. This might help with stability. - [ ] Look at splitting the 4 `npm run test-e2e` actions into 6 actions. This might speed up E2E test runs on a PR. - [ ] Capture screencasts of failed E2E tests. This might make it easier to debug failing tests. - [ ] Look at rewriting flakey tests using `puppeteer-testing-library`. This might help with stability. - [ ] Investigate how else to improve E2E execution time.
non_priority
ideas for improving test developer experience and i were brainstorming a few things we could try to improve the experience of writing and running tests noting them here so that we don t forget them build a web dashboard that shows which tests are the most flakey and or slow this might help show us which tests need the most attention it might also let us measure our work s impact look at splitting npm run build into its own github action which runs prior to the four npm run test actions this might make it quicker to restart failing tests on a pr fix wp env json version to a known git commit which is updated automatically via a pr every week this might make it less disruptive i e doesn t block every single developer when a core change breaks gutenberg ci look at automatically retrying tests this might help with stability look at splitting the npm run test actions into actions this might speed up test runs on a pr capture screencasts of failed tests this might make it easier to debug failing tests look at rewriting flakey tests using puppeteer testing library this might help with stability investigate how else to improve execution time
0
53,422
28,125,226,655
IssuesEvent
2023-03-31 17:07:17
sul-dlss/purl-fetcher
https://api.github.com/repos/sul-dlss/purl-fetcher
closed
Publish PURL updates to a queue
enhancement performance
We believe that the root cause of #530 is related to time spent interacting with the purl filesystem (NFS mount) or database, or both. To address it, we want to change the data structure underlying purl-fetcher to a queue, to make it asynchronous (and possibly parallelized). This depends on infrastructure sending us the entire public cocina on requests to the `/purls/:druid` endpoint, so that we can extract the release tags and collection membership information from it (see https://github.com/sul-dlss/dor-services-app/pull/4451) We can use our existing kafka instances at `sul-kafka-stage-a`/`sul-kafka-prod-a` as the queue implementation. For an example of existing code that publishes to kafka, see [searchworks-traject-indexer's `PurlFetcherKafkaExtractor`](https://github.com/sul-dlss/searchworks_traject_indexer/blob/64359399e8f670ed414b1c56c648dc9b95ad6bad/lib/traject/extractors/purl_fetcher_kafka_extractor.rb#L18-L22). We should take note of the fact that our current ruby kafka client is EOL (https://github.com/sul-dlss/searchworks_traject_indexer/issues/737) and probably start by looking at [librdkafka](https://github.com/edenhill/librdkafka).
True
Publish PURL updates to a queue - We believe that the root cause of #530 is related to time spent interacting with the purl filesystem (NFS mount) or database, or both. To address it, we want to change the data structure underlying purl-fetcher to a queue, to make it asynchronous (and possibly parallelized). This depends on infrastructure sending us the entire public cocina on requests to the `/purls/:druid` endpoint, so that we can extract the release tags and collection membership information from it (see https://github.com/sul-dlss/dor-services-app/pull/4451) We can use our existing kafka instances at `sul-kafka-stage-a`/`sul-kafka-prod-a` as the queue implementation. For an example of existing code that publishes to kafka, see [searchworks-traject-indexer's `PurlFetcherKafkaExtractor`](https://github.com/sul-dlss/searchworks_traject_indexer/blob/64359399e8f670ed414b1c56c648dc9b95ad6bad/lib/traject/extractors/purl_fetcher_kafka_extractor.rb#L18-L22). We should take note of the fact that our current ruby kafka client is EOL (https://github.com/sul-dlss/searchworks_traject_indexer/issues/737) and probably start by looking at [librdkafka](https://github.com/edenhill/librdkafka).
non_priority
publish purl updates to a queue we believe that the root cause of is related to time spent interacting with the purl filesystem nfs mount or database or both to address it we want to change the data structure underlying purl fetcher to a queue to make it asynchronous and possibly parallelized this depends on infrastructure sending us the entire public cocina on requests to the purls druid endpoint so that we can extract the release tags and collection membership information from it see we can use our existing kafka instances at sul kafka stage a sul kafka prod a as the queue implementation for an example of existing code that publishes to kafka see we should take note of the fact that our current ruby kafka client is eol and probably start by looking at
0
91,534
26,416,722,845
IssuesEvent
2023-01-13 16:31:20
scikit-learn/scikit-learn
https://api.github.com/repos/scikit-learn/scikit-learn
closed
CI "no OpenMP" build environment actually has OpenMP
Build / CI
This avoided catching a regression where an unprotected `cimport openmp` was introduced. As a side-comment: Pyodide build needs to be built without OpenMP. From https://github.com/scikit-learn/scikit-learn/pull/24682#issuecomment-1281939439, there is OpenMP in the build environment: ``` ❯ ag openmp build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock 9:https://repo.anaconda.com/pkgs/main/osx-64/intel-openmp-2021.4.0-hecd8cb5_3538.conda#65e79d0ffef79cbb8ebd3c71e74eb50a 15:https://repo.anaconda.com/pkgs/main/osx-64/llvm-openmp-14.0.6-h0dcd299_0.conda#b5804d32b87dc61ca94561ade33d5f2d ``` From https://github.com/scikit-learn/scikit-learn/pull/24682#issuecomment-1282012785 Looking at why we get OpenMP in the "no OpenMP" build: - libopenblas can be compiled without openmp i.e. with pthreads for Linux and Windows, e.g. see [this](https://conda-forge.org/docs/user/announcements.html#conda-forge-is-building-openblas-with-both-pthreads-and-openmp-on-linux) - there is no libopenblas with pthreads on OSX [anaconda.org/conda-forge/libopenblas/files?sort=basename&sort_order=desc](https://anaconda.org/conda-forge/libopenblas/files?sort=basename&sort_order=desc) So it seems like if we want an "no OpenMP" build we need it to be Linux or Windows. Not sure whether there was a good reason to have it on OSX originally.
1.0
CI "no OpenMP" build environment actually has OpenMP - This avoided catching a regression where an unprotected `cimport openmp` was introduced. As a side-comment: Pyodide build needs to be built without OpenMP. From https://github.com/scikit-learn/scikit-learn/pull/24682#issuecomment-1281939439, there is OpenMP in the build environment: ``` ❯ ag openmp build_tools/azure/pylatest_conda_mkl_no_openmp_osx-64_conda.lock 9:https://repo.anaconda.com/pkgs/main/osx-64/intel-openmp-2021.4.0-hecd8cb5_3538.conda#65e79d0ffef79cbb8ebd3c71e74eb50a 15:https://repo.anaconda.com/pkgs/main/osx-64/llvm-openmp-14.0.6-h0dcd299_0.conda#b5804d32b87dc61ca94561ade33d5f2d ``` From https://github.com/scikit-learn/scikit-learn/pull/24682#issuecomment-1282012785 Looking at why we get OpenMP in the "no OpenMP" build: - libopenblas can be compiled without openmp i.e. with pthreads for Linux and Windows, e.g. see [this](https://conda-forge.org/docs/user/announcements.html#conda-forge-is-building-openblas-with-both-pthreads-and-openmp-on-linux) - there is no libopenblas with pthreads on OSX [anaconda.org/conda-forge/libopenblas/files?sort=basename&sort_order=desc](https://anaconda.org/conda-forge/libopenblas/files?sort=basename&sort_order=desc) So it seems like if we want an "no OpenMP" build we need it to be Linux or Windows. Not sure whether there was a good reason to have it on OSX originally.
non_priority
ci no openmp build environment actually has openmp this avoided catching a regression where an unprotected cimport openmp was introduced as a side comment pyodide build needs to be built without openmp from there is openmp in the build environment ❯ ag openmp build tools azure pylatest conda mkl no openmp osx conda lock from looking at why we get openmp in the no openmp build libopenblas can be compiled without openmp i e with pthreads for linux and windows e g see there is no libopenblas with pthreads on osx so it seems like if we want an no openmp build we need it to be linux or windows not sure whether there was a good reason to have it on osx originally
0
45,647
5,723,926,710
IssuesEvent
2017-04-20 13:29:13
openbmc/openbmc-test-automation
https://api.github.com/repos/openbmc/openbmc-test-automation
closed
[Errorlog] Delete/Clear errorlog utility
Test
https://github.com/openbmc/openbmc/issues/1327 Need to add utility in code for Errorlog and association use cases.
1.0
[Errorlog] Delete/Clear errorlog utility - https://github.com/openbmc/openbmc/issues/1327 Need to add utility in code for Errorlog and association use cases.
non_priority
delete clear errorlog utility need to add utility in code for errorlog and association use cases
0
54,056
13,893,004,691
IssuesEvent
2020-10-19 12:59:22
nakasho-dev/conference-app-2020
https://api.github.com/repos/nakasho-dev/conference-app-2020
opened
CVE-2020-13956 (Medium) detected in httpclient-4.5.5.jar
security vulnerability
## CVE-2020-13956 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>httpclient-4.5.5.jar</b></p></summary> <p>Apache HttpComponents Client</p> <p>Path to dependency file: conference-app-2020/buildSrc/build.gradle.kts</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.httpcomponents/httpclient/4.5.5/1603dfd56ebcd583ccdf337b6c3984ac55d89e58/httpclient-4.5.5.jar</p> <p> Dependency Hierarchy: - kotlin-reflect-1.3.50.jar (Root Library) - google-api-client-1.28.0.jar - google-http-client-apache-2.0.0.jar - :x: **httpclient-4.5.5.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/nakasho-dev/conference-app-2020/commit/ec2284d85604ba33cf06de1b5080110845ffd054">ec2284d85604ba33cf06de1b5080110845ffd054</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache HttpClient versions prior to version 4.5.13 and 5.0.3 can misinterpret malformed authority component in request URIs passed to the library as java.net.URI object and pick the wrong target host for request execution. 
<p>Publish Date: 2020-07-21 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-13956>CVE-2020-13956</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2020-13956">https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2020-13956</a></p> <p>Release Date: 2020-07-21</p> <p>Fix Resolution: org.apache.httpcomponents:httpclient:4.5.13;org.apache.httpcomponents:httpclient-osgi:4.5.13;org.apache.httpcomponents.client5:httpclient5:5.0.3;org.apache.httpcomponents.client5:httpclient5-osgi:5.0.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-13956 (Medium) detected in httpclient-4.5.5.jar - ## CVE-2020-13956 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>httpclient-4.5.5.jar</b></p></summary> <p>Apache HttpComponents Client</p> <p>Path to dependency file: conference-app-2020/buildSrc/build.gradle.kts</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.httpcomponents/httpclient/4.5.5/1603dfd56ebcd583ccdf337b6c3984ac55d89e58/httpclient-4.5.5.jar</p> <p> Dependency Hierarchy: - kotlin-reflect-1.3.50.jar (Root Library) - google-api-client-1.28.0.jar - google-http-client-apache-2.0.0.jar - :x: **httpclient-4.5.5.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/nakasho-dev/conference-app-2020/commit/ec2284d85604ba33cf06de1b5080110845ffd054">ec2284d85604ba33cf06de1b5080110845ffd054</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache HttpClient versions prior to version 4.5.13 and 5.0.3 can misinterpret malformed authority component in request URIs passed to the library as java.net.URI object and pick the wrong target host for request execution. 
<p>Publish Date: 2020-07-21 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-13956>CVE-2020-13956</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2020-13956">https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2020-13956</a></p> <p>Release Date: 2020-07-21</p> <p>Fix Resolution: org.apache.httpcomponents:httpclient:4.5.13;org.apache.httpcomponents:httpclient-osgi:4.5.13;org.apache.httpcomponents.client5:httpclient5:5.0.3;org.apache.httpcomponents.client5:httpclient5-osgi:5.0.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve medium detected in httpclient jar cve medium severity vulnerability vulnerable library httpclient jar apache httpcomponents client path to dependency file conference app buildsrc build gradle kts path to vulnerable library home wss scanner gradle caches modules files org apache httpcomponents httpclient httpclient jar dependency hierarchy kotlin reflect jar root library google api client jar google http client apache jar x httpclient jar vulnerable library found in head commit a href found in base branch master vulnerability details apache httpclient versions prior to version and can misinterpret malformed authority component in request uris passed to the library as java net uri object and pick the wrong target host for request execution publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache httpcomponents httpclient org apache httpcomponents httpclient osgi org apache httpcomponents org apache httpcomponents osgi step up your open source security game with whitesource
0
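The CVE-2020-13956 record above names a concrete fix line: httpclient 4.x is fixed in 4.5.13 and 5.x in 5.0.3. As a minimal illustration (not part of the record, and deliberately simplistic: it assumes purely numeric dotted version strings), one could flag vulnerable versions like this:

```python
# Sketch: compare an Apache HttpClient version string against the fix line
# stated in the record above (4.5.13 for the 4.x series, 5.0.3 for 5.x).
# Version parsing here is an assumption: numeric dotted versions only.

def parse_version(v: str) -> tuple:
    """'4.5.5' -> (4, 5, 5); assumes purely numeric dotted versions."""
    return tuple(int(part) for part in v.split("."))

def httpclient_vulnerable(version: str) -> bool:
    """True if the version falls below the CVE-2020-13956 fix threshold."""
    t = parse_version(version)
    if t[0] == 4:
        return t < (4, 5, 13)
    if t[0] == 5:
        return t < (5, 0, 3)
    return False  # other major lines: out of scope for this CVE
```

Tuple comparison gives lexicographic version ordering for free, which is why the versions are parsed into integer tuples rather than compared as strings.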
87,268
10,888,051,609
IssuesEvent
2019-11-18 15:36:26
patternfly/patternfly-next
https://api.github.com/repos/patternfly/patternfly-next
closed
Add Context Selector to page header
P2 blocked by design enhancement interaction design
Create a new example that shows context selector placed in a page. In the page header we have containers for tools/nav etc. I placed the context selector in the nav as a demo and it looks like this: <img width="1157" alt="Screen Shot 2019-06-05 at 4 06 21 PM" src="https://user-images.githubusercontent.com/20118816/58986652-e3a66100-87ab-11e9-9a72-0b80ff630c46.png"> Visuals look like this: https://drive.google.com/file/d/1b405W_CGasRR-C7eZZl7KgigYODvtsMi/view cc: @mcoker Then, when I also put the nav in, it sat in front of the context selector because the `.pf-c-page__header-nav` uses grid columns/rows.
2.0
Add Context Selector to page header - Create a new example that shows context selector placed in a page. In the page header we have containers for tools/nav etc. I placed the context selector in the nav as a demo and it looks like this: <img width="1157" alt="Screen Shot 2019-06-05 at 4 06 21 PM" src="https://user-images.githubusercontent.com/20118816/58986652-e3a66100-87ab-11e9-9a72-0b80ff630c46.png"> Visuals look like this: https://drive.google.com/file/d/1b405W_CGasRR-C7eZZl7KgigYODvtsMi/view cc: @mcoker Then, when I also put the nav in, it sat in front of the context selector because the `.pf-c-page__header-nav` uses grid columns/rows.
non_priority
add context selector to page header create a new example that shows context selector placed in a page in the page header we have containers for tools nav etc i placed the context selector in the nav as a demo and it looks like this img width alt screen shot at pm src visuals look like this cc mcoker then when i put in the nav also it sat in front of the context selector because the pf c page header nav uses grid columns rows
0
184,283
14,974,106,863
IssuesEvent
2021-01-28 02:43:33
Katsute/Mal4J
https://api.github.com/repos/Katsute/Mal4J
closed
Redoc defect issues.
bug documentation website
This issue only affects branch [`redoc@0450bbc5`](https://github.com/Katsute/Mal4J/tree/redoc%40450bbc5). There are several defects in redoc that is causing valid schema to produce invalid documentation. - [ ] `/anime`, `/anime/seasonal/{year}/{season}`, `/anime/suggestions` - [x] List status not being returned - [ ] Studios not being returned - [ ] `/anime/{anime_id}` - [ ] Studios not being returned - [ ] RelatedAnime not being returned - [ ] Recommendations not being returned - [ ] `/anime/ranking` - [ ] Not returning Anime - [x] `/users/{user_name}/animelist` - [x] Comments type is `any` - [ ] `/manga` - [x] List status not being returned - [ ] Authors not being returned - [ ] `/manga/{anime_id}` - [ ] Authors not being returned - [ ] RelatedManga not being returned - [ ] Recommendations not being returned - [ ] Serialization not being returned - [ ] `/manga/ranking` - [ ] Not returning Manga - [x] `/users/{user_name}/mangalist` - [x] Edit not being returned - [ ] `/forum/boards` - [ ] Subboards not being returned - [ ] Post author not being returned - [ ] `/forum/topic/{topic_id}` - [ ] Internal posts not being returned - [ ] Poll options not being returned
1.0
Redoc defect issues. - This issue only affects branch [`redoc@0450bbc5`](https://github.com/Katsute/Mal4J/tree/redoc%40450bbc5). There are several defects in redoc that is causing valid schema to produce invalid documentation. - [ ] `/anime`, `/anime/seasonal/{year}/{season}`, `/anime/suggestions` - [x] List status not being returned - [ ] Studios not being returned - [ ] `/anime/{anime_id}` - [ ] Studios not being returned - [ ] RelatedAnime not being returned - [ ] Recommendations not being returned - [ ] `/anime/ranking` - [ ] Not returning Anime - [x] `/users/{user_name}/animelist` - [x] Comments type is `any` - [ ] `/manga` - [x] List status not being returned - [ ] Authors not being returned - [ ] `/manga/{anime_id}` - [ ] Authors not being returned - [ ] RelatedManga not being returned - [ ] Recommendations not being returned - [ ] Serialization not being returned - [ ] `/manga/ranking` - [ ] Not returning Manga - [x] `/users/{user_name}/mangalist` - [x] Edit not being returned - [ ] `/forum/boards` - [ ] Subboards not being returned - [ ] Post author not being returned - [ ] `/forum/topic/{topic_id}` - [ ] Internal posts not being returned - [ ] Poll options not being returned
non_priority
redoc defect issues this issue only affects branch there are several defects in redoc that is causing valid schema to produce invalid documentation anime anime seasonal year season anime suggestions list status not being returned studios not being returned anime anime id studios not being returned relatedanime not being returned recommendations not being returned anime ranking not returning anime users user name animelist comments type is any manga list status not being returned authors not being returned manga anime id authors not being returned relatedmanga not being returned recommendations not being returned serialization not being returned manga ranking not returning manga users user name mangalist edit not being returned forum boards subboards not being returned post author not being returned forum topic topic id internal posts not being returned poll options not being returned
0
281,599
21,315,419,360
IssuesEvent
2022-04-16 07:23:29
riakhaitan/pe
https://api.github.com/repos/riakhaitan/pe
opened
NFR requirement not met
severity.Medium type.DocumentationBug
![Screen Shot 2022-04-16 at 3.22.27 PM.png](https://raw.githubusercontent.com/riakhaitan/pe/main/files/7678607d-a672-4565-94b4-267ac2b97ea4.png) ![Screen Shot 2022-04-16 at 3.22.46 PM.png](https://raw.githubusercontent.com/riakhaitan/pe/main/files/bad969f5-6229-47a6-82aa-0370ddb0c687.png) The NFRs mention that the part of the command that is wrong must be highlighted when in fact the whole command is highlighted.
1.0
NFR requirement not met - ![Screen Shot 2022-04-16 at 3.22.27 PM.png](https://raw.githubusercontent.com/riakhaitan/pe/main/files/7678607d-a672-4565-94b4-267ac2b97ea4.png) ![Screen Shot 2022-04-16 at 3.22.46 PM.png](https://raw.githubusercontent.com/riakhaitan/pe/main/files/bad969f5-6229-47a6-82aa-0370ddb0c687.png) The NFRs mention that the part of the command that is wrong must be highlighted when in fact the whole command is highlighted.
non_priority
nfr requirement not met the nfrs mention that the part of the command that is wrong must be highlighted when in fact the whole command is highlighted
0
249,354
26,912,726,708
IssuesEvent
2023-02-07 02:05:28
BRAEVincent52bae/nomulus
https://api.github.com/repos/BRAEVincent52bae/nomulus
opened
json-20160212.jar: 1 vulnerabilities (highest severity is: 7.5)
security vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>json-20160212.jar</b></p></summary> <p>JSON is a light-weight, language independent, data interchange format. See http://www.JSON.org/ The files in this package implement JSON encoders/decoders in Java. It also includes the capability to convert between JSON and XML, HTTP headers, Cookies, and CDL. This is a reference implementation. There is a large number of JSON packages in Java. Perhaps someday the Java community will standardize on one. Until then, choose carefully. The license includes this restriction: "The software shall be used for good, not evil." If your conscience cannot live with that, then choose a different package.</p> <p>Library home page: <a href="https://github.com/douglascrockford/JSON-java">https://github.com/douglascrockford/JSON-java</a></p> <p>Path to dependency file: /core/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.json/json/20160212/a742e3f85161835b95877478c5dd5b405cefaab9/json-20160212.jar</p> <p> <p>Found in HEAD commit: <a href="https://github.com/BRAEVincent52bae/nomulus/commit/b67a1450e2ecb21d7cd0812c41c98fff37ac287c">b67a1450e2ecb21d7cd0812c41c98fff37ac287c</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (json version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [WS-2017-3805](https://github.com/stleary/JSON-java/commit/ed8745cd634f3276b7f7bef4bf0f49987c83256d) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | json-20160212.jar | Direct | 20180130 | &#10060; | ## Details <details> <summary><img 
src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> WS-2017-3805</summary> ### Vulnerable Library - <b>json-20160212.jar</b></p> <p>JSON is a light-weight, language independent, data interchange format. See http://www.JSON.org/ The files in this package implement JSON encoders/decoders in Java. It also includes the capability to convert between JSON and XML, HTTP headers, Cookies, and CDL. This is a reference implementation. There is a large number of JSON packages in Java. Perhaps someday the Java community will standardize on one. Until then, choose carefully. The license includes this restriction: "The software shall be used for good, not evil." If your conscience cannot live with that, then choose a different package.</p> <p>Library home page: <a href="https://github.com/douglascrockford/JSON-java">https://github.com/douglascrockford/JSON-java</a></p> <p>Path to dependency file: /core/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.json/json/20160212/a742e3f85161835b95877478c5dd5b405cefaab9/json-20160212.jar</p> <p> Dependency Hierarchy: - :x: **json-20160212.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/BRAEVincent52bae/nomulus/commit/b67a1450e2ecb21d7cd0812c41c98fff37ac287c">b67a1450e2ecb21d7cd0812c41c98fff37ac287c</a></p> <p>Found in base branch: <b>master</b></p> </p> <p></p> ### Vulnerability Details <p> Affected versions of JSON In Java are vulnerable to Denial of Service (DoS) when trying to initialize a JSONArray object and the input is [. This will cause the jvm to crash with StackOverflowError due to non-cyclical stack overflow. 
<p>Publish Date: 2017-10-30 <p>URL: <a href=https://github.com/stleary/JSON-java/commit/ed8745cd634f3276b7f7bef4bf0f49987c83256d>WS-2017-3805</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.5</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Release Date: 2017-10-30</p> <p>Fix Resolution: 20180130</p> </p> <p></p> Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) </details>
True
json-20160212.jar: 1 vulnerabilities (highest severity is: 7.5) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>json-20160212.jar</b></p></summary> <p>JSON is a light-weight, language independent, data interchange format. See http://www.JSON.org/ The files in this package implement JSON encoders/decoders in Java. It also includes the capability to convert between JSON and XML, HTTP headers, Cookies, and CDL. This is a reference implementation. There is a large number of JSON packages in Java. Perhaps someday the Java community will standardize on one. Until then, choose carefully. The license includes this restriction: "The software shall be used for good, not evil." If your conscience cannot live with that, then choose a different package.</p> <p>Library home page: <a href="https://github.com/douglascrockford/JSON-java">https://github.com/douglascrockford/JSON-java</a></p> <p>Path to dependency file: /core/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.json/json/20160212/a742e3f85161835b95877478c5dd5b405cefaab9/json-20160212.jar</p> <p> <p>Found in HEAD commit: <a href="https://github.com/BRAEVincent52bae/nomulus/commit/b67a1450e2ecb21d7cd0812c41c98fff37ac287c">b67a1450e2ecb21d7cd0812c41c98fff37ac287c</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (json version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [WS-2017-3805](https://github.com/stleary/JSON-java/commit/ed8745cd634f3276b7f7bef4bf0f49987c83256d) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | json-20160212.jar | Direct | 20180130 | &#10060; | ## Details <details> <summary><img 
src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> WS-2017-3805</summary> ### Vulnerable Library - <b>json-20160212.jar</b></p> <p>JSON is a light-weight, language independent, data interchange format. See http://www.JSON.org/ The files in this package implement JSON encoders/decoders in Java. It also includes the capability to convert between JSON and XML, HTTP headers, Cookies, and CDL. This is a reference implementation. There is a large number of JSON packages in Java. Perhaps someday the Java community will standardize on one. Until then, choose carefully. The license includes this restriction: "The software shall be used for good, not evil." If your conscience cannot live with that, then choose a different package.</p> <p>Library home page: <a href="https://github.com/douglascrockford/JSON-java">https://github.com/douglascrockford/JSON-java</a></p> <p>Path to dependency file: /core/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.json/json/20160212/a742e3f85161835b95877478c5dd5b405cefaab9/json-20160212.jar</p> <p> Dependency Hierarchy: - :x: **json-20160212.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/BRAEVincent52bae/nomulus/commit/b67a1450e2ecb21d7cd0812c41c98fff37ac287c">b67a1450e2ecb21d7cd0812c41c98fff37ac287c</a></p> <p>Found in base branch: <b>master</b></p> </p> <p></p> ### Vulnerability Details <p> Affected versions of JSON In Java are vulnerable to Denial of Service (DoS) when trying to initialize a JSONArray object and the input is [. This will cause the jvm to crash with StackOverflowError due to non-cyclical stack overflow. 
<p>Publish Date: 2017-10-30 <p>URL: <a href=https://github.com/stleary/JSON-java/commit/ed8745cd634f3276b7f7bef4bf0f49987c83256d>WS-2017-3805</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.5</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Release Date: 2017-10-30</p> <p>Fix Resolution: 20180130</p> </p> <p></p> Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) </details>
non_priority
json jar vulnerabilities highest severity is vulnerable library json jar json is a light weight language independent data interchange format see the files in this package implement json encoders decoders in java it also includes the capability to convert between json and xml http headers cookies and cdl this is a reference implementation there is a large number of json packages in java perhaps someday the java community will standardize on one until then choose carefully the license includes this restriction the software shall be used for good not evil if your conscience cannot live with that then choose a different package library home page a href path to dependency file core build gradle path to vulnerable library home wss scanner gradle caches modules files org json json json jar found in head commit a href vulnerabilities cve severity cvss dependency type fixed in json version remediation available high json jar direct details ws vulnerable library json jar json is a light weight language independent data interchange format see the files in this package implement json encoders decoders in java it also includes the capability to convert between json and xml http headers cookies and cdl this is a reference implementation there is a large number of json packages in java perhaps someday the java community will standardize on one until then choose carefully the license includes this restriction the software shall be used for good not evil if your conscience cannot live with that then choose a different package library home page a href path to dependency file core build gradle path to vulnerable library home wss scanner gradle caches modules files org json json json jar dependency hierarchy x json jar vulnerable library found in head commit a href found in base branch master vulnerability details affected versions of json in java are vulnerable to denial of service dos when trying to initialize a jsonarray object and the input is this will cause the jvm to crash with 
stackoverflowerror due to non cyclical stack overflow publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution step up your open source security game with mend
0
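The WS-2017-3805 record above gives the fixed org.json release as 20180130 and the vulnerable path as a Gradle dependency file. A minimal sketch (not part of the report; the regex-based parsing is an assumption and deliberately simplistic) of flagging the vulnerable coordinate from a Gradle dependency line:

```python
# Sketch: flag an org.json:json Gradle coordinate pinned below the fixed
# release (20180130, per the suggested fix in the record above). The date-as-
# version scheme org.json uses makes a plain integer comparison sufficient.
import re

FIXED_VERSION = 20180130  # first org.json release with the WS-2017-3805 fix

def is_vulnerable_json_dep(gradle_line: str) -> bool:
    """Return True if the line pins org.json:json below the fixed release."""
    m = re.search(r"org\.json:json:(\d{8})", gradle_line)
    if not m:
        return False  # not an org.json:json coordinate
    return int(m.group(1)) < FIXED_VERSION
```

For example, `is_vulnerable_json_dep("implementation 'org.json:json:20160212'")` flags the exact version named in the record's title.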
258,830
22,351,750,219
IssuesEvent
2022-06-15 12:40:34
lowRISC/opentitan
https://api.github.com/repos/lowRISC/opentitan
opened
entropy_src - fuse enable fw read
Type:Task IP:entropy_src Component:ChipLevelTest
# Verify the fuse input entropy_src. - Initialize the OTP with this fuse bit set to 1. - Perform an entropy request operation. - Read the entropy_data_fifo via SW; verify that it reads valid values. - Reset the chip, but this time, initialize the OTP with this fuse bit set to 0. - Perform an entropy request operation. - Read the internal state via SW; verify that it reads all zeros this time.
1.0
entropy_src - fuse enable fw read - # Verify the fuse input entropy_src. - Initialize the OTP with this fuse bit set to 1. - Perform an entropy request operation. - Read the entropy_data_fifo via SW; verify that it reads valid values. - Reset the chip, but this time, initialize the OTP with this fuse bit set to 0. - Perform an entropy request operation. - Read the internal state via SW; verify that it reads all zeros this time.
non_priority
entropy src fuse enable fw read verify the fuse input entropy src initialize the otp with this fuse bit set to perform an entropy request operation read the entropy data fifo via sw verify that it reads valid values reset the chip but this time initialize the otp with this fuse bit set to perform an entropy request operation read the internal state via sw verify that it reads all zeros this time
0
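The entropy_src test steps in the record above can be sketched as a software flow. This is a minimal illustration only, not OpenTitan code: the stub device model, its method names, and the placeholder FIFO values are all assumptions standing in for the real OTP fuse interface and entropy_data_fifo register.

```python
# Sketch of the fuse-enable test flow: fuse set -> FIFO reads valid (non-zero)
# values; fuse cleared -> reads come back all zeros. A software stub stands in
# for the chip so the flow is runnable; nothing here reflects real hardware.

class EntropySrcStub:
    """Minimal model: entropy is readable by firmware only when the fuse is set."""

    def __init__(self, fw_read_fuse: bool):
        self.fw_read_fuse = fw_read_fuse  # models the OTP fuse bit

    def request_entropy(self) -> None:
        pass  # in hardware this would trigger conditioning / FIFO fill

    def read_entropy_data_fifo(self, words: int = 4) -> list:
        if self.fw_read_fuse:
            return [0xDEADBEEF] * words  # placeholder "valid" values
        return [0] * words               # fuse cleared: reads back zeros


def run_fuse_test(fuse_value: bool) -> list:
    chip = EntropySrcStub(fw_read_fuse=fuse_value)  # "initialize the OTP"
    chip.request_entropy()                          # entropy request operation
    return chip.read_entropy_data_fifo()            # SW read
```

The two legs of the test then reduce to running `run_fuse_test` once per fuse value and asserting non-zero versus all-zero reads.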
54,288
13,902,504,800
IssuesEvent
2020-10-20 05:30:09
emilwareus/angular
https://api.github.com/repos/emilwareus/angular
opened
CVE-2019-20920 (High) detected in multiple libraries
security vulnerability
## CVE-2019-20920 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>handlebars-4.1.1.tgz</b>, <b>handlebars-4.1.0.tgz</b>, <b>handlebars-4.0.12.tgz</b>, <b>handlebars-4.0.11.tgz</b></p></summary> <p> <details><summary><b>handlebars-4.1.1.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.1.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.1.tgz</a></p> <p>Path to dependency file: angular/yarn.lock</p> <p>Path to vulnerable library: angular/yarn.lock</p> <p> Dependency Hierarchy: - jasmine-0.29.0.tgz (Root Library) - v8-coverage-1.0.9.tgz - istanbul-reports-1.5.1.tgz - :x: **handlebars-4.1.1.tgz** (Vulnerable Library) </details> <details><summary><b>handlebars-4.1.0.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.0.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.0.tgz</a></p> <p>Path to dependency file: angular/aio/yarn.lock</p> <p>Path to vulnerable library: angular/aio/yarn.lock</p> <p> Dependency Hierarchy: - karma-coverage-istanbul-reporter-1.4.3.tgz (Root Library) - istanbul-api-1.3.7.tgz - istanbul-reports-1.5.1.tgz - :x: **handlebars-4.1.0.tgz** (Vulnerable Library) </details> <details><summary><b>handlebars-4.0.12.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.0.12.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.0.12.tgz</a></p> <p>Path to dependency file: 
angular/integration/cli-hello-world-ivy-compat/yarn.lock</p> <p>Path to vulnerable library: angular/integration/cli-hello-world-ivy-compat/yarn.lock,angular/integration/cli-hello-world-ivy-minimal/yarn.lock</p> <p> Dependency Hierarchy: - karma-coverage-istanbul-reporter-2.0.4.tgz (Root Library) - istanbul-api-2.0.6.tgz - istanbul-reports-2.0.1.tgz - :x: **handlebars-4.0.12.tgz** (Vulnerable Library) </details> <details><summary><b>handlebars-4.0.11.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.0.11.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.0.11.tgz</a></p> <p>Path to dependency file: angular/yarn.lock</p> <p>Path to vulnerable library: angular/yarn.lock,angular/integration/cli-hello-world/yarn.lock</p> <p> Dependency Hierarchy: - karma-coverage-istanbul-reporter-1.4.1.tgz (Root Library) - istanbul-api-1.2.1.tgz - istanbul-reports-1.1.3.tgz - :x: **handlebars-4.0.11.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/emilwareus/angular/commit/0a802f3678958587eafa0136d927232b89cc1427">0a802f3678958587eafa0136d927232b89cc1427</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Handlebars before 3.0.8 and 4.x before 4.5.3 is vulnerable to Arbitrary Code Execution. The lookup helper fails to properly validate templates, allowing attackers to submit templates that execute arbitrary JavaScript. This can be used to run arbitrary code on a server processing Handlebars templates or in a victim's browser (effectively serving as XSS). 
<p>Publish Date: 2020-09-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20920>CVE-2019-20920</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20920">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20920</a></p> <p>Release Date: 2020-09-30</p> <p>Fix Resolution: v3.0.8, v4.5.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-20920 (High) detected in multiple libraries - ## CVE-2019-20920 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>handlebars-4.1.1.tgz</b>, <b>handlebars-4.1.0.tgz</b>, <b>handlebars-4.0.12.tgz</b>, <b>handlebars-4.0.11.tgz</b></p></summary> <p> <details><summary><b>handlebars-4.1.1.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.1.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.1.tgz</a></p> <p>Path to dependency file: angular/yarn.lock</p> <p>Path to vulnerable library: angular/yarn.lock</p> <p> Dependency Hierarchy: - jasmine-0.29.0.tgz (Root Library) - v8-coverage-1.0.9.tgz - istanbul-reports-1.5.1.tgz - :x: **handlebars-4.1.1.tgz** (Vulnerable Library) </details> <details><summary><b>handlebars-4.1.0.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.0.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.0.tgz</a></p> <p>Path to dependency file: angular/aio/yarn.lock</p> <p>Path to vulnerable library: angular/aio/yarn.lock</p> <p> Dependency Hierarchy: - karma-coverage-istanbul-reporter-1.4.3.tgz (Root Library) - istanbul-api-1.3.7.tgz - istanbul-reports-1.5.1.tgz - :x: **handlebars-4.1.0.tgz** (Vulnerable Library) </details> <details><summary><b>handlebars-4.0.12.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.0.12.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.0.12.tgz</a></p> 
<p>Path to dependency file: angular/integration/cli-hello-world-ivy-compat/yarn.lock</p> <p>Path to vulnerable library: angular/integration/cli-hello-world-ivy-compat/yarn.lock,angular/integration/cli-hello-world-ivy-minimal/yarn.lock</p> <p> Dependency Hierarchy: - karma-coverage-istanbul-reporter-2.0.4.tgz (Root Library) - istanbul-api-2.0.6.tgz - istanbul-reports-2.0.1.tgz - :x: **handlebars-4.0.12.tgz** (Vulnerable Library) </details> <details><summary><b>handlebars-4.0.11.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.0.11.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.0.11.tgz</a></p> <p>Path to dependency file: angular/yarn.lock</p> <p>Path to vulnerable library: angular/yarn.lock,angular/integration/cli-hello-world/yarn.lock</p> <p> Dependency Hierarchy: - karma-coverage-istanbul-reporter-1.4.1.tgz (Root Library) - istanbul-api-1.2.1.tgz - istanbul-reports-1.1.3.tgz - :x: **handlebars-4.0.11.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/emilwareus/angular/commit/0a802f3678958587eafa0136d927232b89cc1427">0a802f3678958587eafa0136d927232b89cc1427</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Handlebars before 3.0.8 and 4.x before 4.5.3 is vulnerable to Arbitrary Code Execution. The lookup helper fails to properly validate templates, allowing attackers to submit templates that execute arbitrary JavaScript. This can be used to run arbitrary code on a server processing Handlebars templates or in a victim's browser (effectively serving as XSS). 
<p>Publish Date: 2020-09-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20920>CVE-2019-20920</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20920">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20920</a></p> <p>Release Date: 2020-09-30</p> <p>Fix Resolution: v3.0.8, v4.5.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries handlebars tgz handlebars tgz handlebars tgz handlebars tgz handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file angular yarn lock path to vulnerable library angular yarn lock dependency hierarchy jasmine tgz root library coverage tgz istanbul reports tgz x handlebars tgz vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file angular aio yarn lock path to vulnerable library angular aio yarn lock dependency hierarchy karma coverage istanbul reporter tgz root library istanbul api tgz istanbul reports tgz x handlebars tgz vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file angular integration cli hello world ivy compat yarn lock path to vulnerable library angular integration cli hello world ivy compat yarn lock angular integration cli hello world ivy minimal yarn lock dependency hierarchy karma coverage istanbul reporter tgz root library istanbul api tgz istanbul reports tgz x handlebars tgz vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file angular yarn lock path to vulnerable library angular yarn lock angular integration cli hello world yarn lock dependency hierarchy karma coverage istanbul reporter tgz root library istanbul api tgz istanbul reports tgz x handlebars tgz vulnerable library found in head commit a href vulnerability details handlebars before and x before is vulnerable to arbitrary code execution the lookup helper fails to properly 
validate templates allowing attackers to submit templates that execute arbitrary javascript this can be used to run arbitrary code on a server processing handlebars templates or in a victim s browser effectively serving as xss publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope changed impact metrics confidentiality impact high integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
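Several of the CVE records in this dump quote a CVSS 3 base score (8.1 here) alongside its metric vector. As a sanity check, the base-score formula can be sketched in Python; the weights and the round-up rule below are taken from the FIRST.org CVSS v3.0 specification, and this is an illustrative sketch rather than an official implementation:

```python
# CVSS v3.0 base-score metric weights (FIRST.org specification)
WEIGHTS = {
    "AV": {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2},   # Attack Vector
    "AC": {"L": 0.77, "H": 0.44},                        # Attack Complexity
    "UI": {"N": 0.85, "R": 0.62},                        # User Interaction
    "CIA": {"H": 0.56, "L": 0.22, "N": 0.0},             # C/I/A impact
}

def pr_weight(pr: str, scope_changed: bool) -> float:
    # Privileges Required weighs differently when scope changes
    if scope_changed:
        return {"N": 0.85, "L": 0.68, "H": 0.5}[pr]
    return {"N": 0.85, "L": 0.62, "H": 0.27}[pr]

def roundup(x: float) -> float:
    # Spec-defined "round up to one decimal place" helper
    n = int(round(x * 100000))
    if n % 10000 == 0:
        return n / 100000.0
    return (n // 10000 + 1) / 10.0

def cvss3_base(av, ac, pr, ui, scope, c, i, a):
    changed = scope == "C"
    iss = 1 - (1 - WEIGHTS["CIA"][c]) * (1 - WEIGHTS["CIA"][i]) * (1 - WEIGHTS["CIA"][a])
    if changed:
        impact = 7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15
    else:
        impact = 6.42 * iss
    expl = 8.22 * WEIGHTS["AV"][av] * WEIGHTS["AC"][ac] * pr_weight(pr, changed) * WEIGHTS["UI"][ui]
    if impact <= 0:
        return 0.0
    if changed:
        return roundup(min(1.08 * (impact + expl), 10))
    return roundup(min(impact + expl, 10))

# The Handlebars record's vector AV:N/AC:H/PR:N/UI:N/S:C/C:H/I:L/A:L
score = cvss3_base("N", "H", "N", "N", "C", "H", "L", "L")  # -> 8.1
```

Evaluated on the metric vectors quoted in this dump, the sketch reproduces the stated scores (8.1 for the two changed/unchanged-scope high-complexity vectors, 9.8 for the node-sass vector later in the file).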
283,669
24,557,899,354
IssuesEvent
2022-10-12 17:27:05
ckan/ckan.org
https://api.github.com/repos/ckan/ckan.org
closed
Link to Data Explorer extension is wrong
in testing
On the page https://ckan.org/features/datastore the link to the Data explorer extension is wrong, missing part of the domain (ckan.org): http://docs./en/latest/maintaining/data-viewer.html#data-explorer
1.0
Link to Data Explorer extension is wrong - On the page https://ckan.org/features/datastore the link to the Data explorer extension is wrong, missing part of the domain (ckan.org): http://docs./en/latest/maintaining/data-viewer.html#data-explorer
non_priority
link to data explorer extension is wrong on the page the link to the data explorer extension is wrong missing part of the domain ckan org
0
39,921
6,782,240,883
IssuesEvent
2017-10-30 06:59:58
cyberFund/cyber-search
https://api.github.com/repos/cyberFund/cyber-search
closed
Bitcoin multisig transaction
discussion documentation improvement
A Bitcoin transaction output can contain multiple addresses. Define a model for this. See https://blockchain.info/tx/56214420a7c4dcc4832944298d169a75e93acf9721f00656b2ee0e4d194f9970
1.0
Bitcoin multisig transaction - Bitcoin transaction output could contains multiple addresses. Define model for this See https://blockchain.info/tx/56214420a7c4dcc4832944298d169a75e93acf9721f00656b2ee0e4d194f9970
non_priority
bitcoin multisig transaction bitcoin transaction output could contains multiple addresses define model for this see
0
135,563
12,686,429,134
IssuesEvent
2020-06-20 10:52:04
Perl/perl5
https://api.github.com/repos/Perl/perl5
closed
[doc] perlcommunity has lots of out of date information
Needs Triage documentation
The perlcommunity.pod file has many references to sites and events that no longer exist. For example: - References to Perl 6 (rather than Raku) - http://use.perl.org/ is a dead link - references to YAPC, which has changed to TPC - OSCON has been discontinued - https://www.theperlreview.com/community_calendar is a dead link - The last Open Source Developers' Conference was in 2015
1.0
[doc] perlcommunity has lots of out of date information - The perlcommunity.pod file has many references to sites and events that no longer exist. For example: - References to Perl 6 (rather than Raku) - http://use.perl.org/ is a dead link - references to YAPC, which has changed to TPC - OSCON has been discontinued - https://www.theperlreview.com/community_calendar is a dead link - The last Open Source Developers' Conference was in 2015
non_priority
perlcommunity has lots of out of date information the perlcommunity pod file has many references to sites and events that no longer exist for example references to perl rather than raku is a dead link references to yapc which has changed to tpc oscon has been discontinued is a dead link the last open source developers conference was in
0
49,283
6,189,567,578
IssuesEvent
2017-07-04 13:16:37
JacquesCarette/literate-scientific-software
https://api.github.com/repos/JacquesCarette/literate-scientific-software
closed
Design data structure for Solution Characteristics Specification Section
Design
As discussed, a design is required to refactor the overarching functions (`introductionF` and `solChSpecF`) so they are smaller and more easily readable (similar to `RefSec` and `RefProg`).
1.0
Design data structure for Solution Characteristics Specification Section - As discussed, design is required to replace the overarching functions (`introductionF` and `solChSpecF`) from being large to more easily readable (similar to `RefSec` and `RefProg`).
non_priority
design data structure for solution characteristics specification section as discussed design is required to replace the overarching functions introductionf and solchspecf from being large to more easily readable similar to refsec and refprog
0
26,474
12,406,350,213
IssuesEvent
2020-05-21 18:58:33
microsoft/vscode-cpptools
https://api.github.com/repos/microsoft/vscode-cpptools
closed
Undefined identifiers in libraries
Feature: Configuration Language Service more info needed
- OS: Linux 5.4.13-3 Manjaro 18.1.5 - VS Code (OSS): 1.41.1 - C/C++ Extension: 0.26.3 Hi, I have an issue using VS Code with the C/C++ extension. Some identifiers are not recognized by Intellisense, although my includePath in ```c_cpp_properties.json``` seems fine. For instance here I am trying to display the current time with ```time()``` from the ```ctime``` library. VS Code tells me that the time identifier is undefined: ``` #include <iostream> #include <ctime> using namespace std; int main() { cout << time(nullptr) << endl; //--> "time" identifier is undefined } ``` Here is my c_cpp_properties.json file: ``` { "configurations": [ { "name": "Linux", "includePath": [ "${workspaceFolder}/**", "/usr/include/linux", "/usr/include/c++/9.2.0/tr1" ], "defines": [], "compilerPath": "/usr/bin/gcc", "cStandard": "c11", "cppStandard": "c++17", "intelliSenseMode": "gcc-x64" } ], "version": 4 } ``` There is a ```ctime``` file at``` /usr/include/c++/9.2.0/tr1/``` as well as at ```/usr/include/c++/9.2.0/``` but changing the includePath to the latter doesn't change anything. Thanks for your help, FB.
1.0
Undefined identifiers in librairies - - OS: Linux 5.4.13-3 Manjaro 18.1.5 - VS Code (OSS): 1.41.1 - C/C++ Extension: 0.26.3 Hi, I have an issue using VS Code with the C/C++ extension. Some identifiers are not recognized by Intellisense, although my includePath in ```c_cpp_properties.json``` seems fine. For instance here I am trying to display the current time with ```time()``` from the ```ctime``` library. VS Code tells me that the time identifier is undefined: ``` #include <iostream> #include <ctime> using namespace std; int main() { cout << time(nullptr) << endl; //--> "time" identifier is undefined } ``` Here is my c_cpp_properties.json file: ``` { "configurations": [ { "name": "Linux", "includePath": [ "${workspaceFolder}/**", "/usr/include/linux", "/usr/include/c++/9.2.0/tr1" ], "defines": [], "compilerPath": "/usr/bin/gcc", "cStandard": "c11", "cppStandard": "c++17", "intelliSenseMode": "gcc-x64" } ], "version": 4 } ``` There is a ```ctime``` file at``` /usr/include/c++/9.2.0/tr1/``` as well as at ```/usr/include/c++/9.2.0/``` but changing the includePath to the latter doesn't change anything. Thanks for your help, FB.
non_priority
undefined identifiers in librairies os linux manjaro vs code oss c c extension hi i have an issue using vs code with the c c extension some identifiers are not recognized by intellisense although my includepath in c cpp properties json seems fine for instance here i am trying to display the current time with time from the ctime library vs code tells me that the time identifier is undefined include include using namespace std int main cout time identifier is undefined here is my c cpp properties json file configurations name linux includepath workspacefolder usr include linux usr include c defines compilerpath usr bin gcc cstandard cppstandard c intellisensemode gcc version there is a ctime file at usr include c as well as at usr include c but changing the includepath to the latter doesn t change anything thanks for your help fb
0
99,230
16,437,602,153
IssuesEvent
2021-05-20 11:01:59
iVipz/WebGoat
https://api.github.com/repos/iVipz/WebGoat
opened
CVE-2020-28502 (High) detected in xmlhttprequest-ssl-1.5.5.tgz
security vulnerability
## CVE-2020-28502 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xmlhttprequest-ssl-1.5.5.tgz</b></p></summary> <p>XMLHttpRequest for Node</p> <p>Library home page: <a href="https://registry.npmjs.org/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz">https://registry.npmjs.org/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz</a></p> <p>Path to dependency file: WebGoat/docs/package.json</p> <p>Path to vulnerable library: WebGoat/docs/node_modules/xmlhttprequest-ssl/package.json</p> <p> Dependency Hierarchy: - browser-sync-2.26.3.tgz (Root Library) - browser-sync-ui-2.26.2.tgz - socket.io-client-2.2.0.tgz - engine.io-client-3.3.1.tgz - :x: **xmlhttprequest-ssl-1.5.5.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/iVipz/WebGoat/commits/b22423d123f2a162972d83138f5c1b596cac9420">b22423d123f2a162972d83138f5c1b596cac9420</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package xmlhttprequest before 1.7.0; all versions of package xmlhttprequest-ssl. Provided requests are sent synchronously (async=False on xhr.open), malicious user input flowing into xhr.send could result in arbitrary code being injected and run. 
<p>Publish Date: 2021-03-05 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28502>CVE-2020-28502</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-h4j5-c7cj-74xg">https://github.com/advisories/GHSA-h4j5-c7cj-74xg</a></p> <p>Release Date: 2021-03-05</p> <p>Fix Resolution: xmlhttprequest - 1.7.0,xmlhttprequest-ssl - 1.6.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-28502 (High) detected in xmlhttprequest-ssl-1.5.5.tgz - ## CVE-2020-28502 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xmlhttprequest-ssl-1.5.5.tgz</b></p></summary> <p>XMLHttpRequest for Node</p> <p>Library home page: <a href="https://registry.npmjs.org/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz">https://registry.npmjs.org/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz</a></p> <p>Path to dependency file: WebGoat/docs/package.json</p> <p>Path to vulnerable library: WebGoat/docs/node_modules/xmlhttprequest-ssl/package.json</p> <p> Dependency Hierarchy: - browser-sync-2.26.3.tgz (Root Library) - browser-sync-ui-2.26.2.tgz - socket.io-client-2.2.0.tgz - engine.io-client-3.3.1.tgz - :x: **xmlhttprequest-ssl-1.5.5.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/iVipz/WebGoat/commits/b22423d123f2a162972d83138f5c1b596cac9420">b22423d123f2a162972d83138f5c1b596cac9420</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package xmlhttprequest before 1.7.0; all versions of package xmlhttprequest-ssl. Provided requests are sent synchronously (async=False on xhr.open), malicious user input flowing into xhr.send could result in arbitrary code being injected and run. 
<p>Publish Date: 2021-03-05 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28502>CVE-2020-28502</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-h4j5-c7cj-74xg">https://github.com/advisories/GHSA-h4j5-c7cj-74xg</a></p> <p>Release Date: 2021-03-05</p> <p>Fix Resolution: xmlhttprequest - 1.7.0,xmlhttprequest-ssl - 1.6.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve high detected in xmlhttprequest ssl tgz cve high severity vulnerability vulnerable library xmlhttprequest ssl tgz xmlhttprequest for node library home page a href path to dependency file webgoat docs package json path to vulnerable library webgoat docs node modules xmlhttprequest ssl package json dependency hierarchy browser sync tgz root library browser sync ui tgz socket io client tgz engine io client tgz x xmlhttprequest ssl tgz vulnerable library found in head commit a href found in base branch develop vulnerability details this affects the package xmlhttprequest before all versions of package xmlhttprequest ssl provided requests are sent synchronously async false on xhr open malicious user input flowing into xhr send could result in arbitrary code being injected and run publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution xmlhttprequest xmlhttprequest ssl step up your open source security game with whitesource
0
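The vulnerability class in this record — attacker-controlled input concatenated into a string that is later executed as code — is easiest to see in a deliberately simplified sketch. The snippet below is a hypothetical Python illustration of the pattern and its usual mitigation (serialization instead of string interpolation); it is not the actual xmlhttprequest-ssl code:

```python
import json

def build_script_unsafe(body: str) -> str:
    # Vulnerable pattern: raw interpolation lets a crafted body break out
    # of the string literal and append arbitrary statements.
    return 'send("%s")' % body

def build_script_safe(body: str) -> str:
    # Mitigation: serialize the value so quotes and backslashes are escaped,
    # keeping the payload data rather than code.
    return 'send(%s)' % json.dumps(body)

# Hypothetical payload: closes the string, injects a statement, reopens it
malicious = '"); __import__("os").system("id"); ("'
unsafe = build_script_unsafe(malicious)  # payload escapes the quotes
safe = build_script_safe(malicious)      # payload stays a quoted string
```

The fix noted in the record (upgrading to a patched release) closes this gap upstream; the sketch only shows why "sent synchronously with user input flowing into send" can become code execution.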
439,508
30,700,735,756
IssuesEvent
2023-07-26 23:01:33
andersonjoseph/fastify-hashids
https://api.github.com/repos/andersonjoseph/fastify-hashids
opened
Document hashids missing options
documentation good first issue
Documentation should be as self-contained as possible. Instead of linking to the available options for hashids, we should include all of the supported options in our README. The missing options that we need to document are: - salt - minLength - alphabet - seps reference: https://github.com/niieani/hashids.js
1.0
Document hashids missing options - Documentation should be as self-contained as possible. Instead of linking to the available options for hashids, we should include all of the supported options in our README. The missing options that we need to document are: - salt - minLength - alphabet - seps reference: https://github.com/niieani/hashids.js
non_priority
document hashids missing options documentation should be as self contained as possible instead of linking to the available options for hashids we should include all of the supported options in our readme the missing options that we need to document are salt minlength alphabet seps reference
0
165,861
20,623,620,720
IssuesEvent
2022-03-07 20:00:21
clowdr-app/clowdr
https://api.github.com/repos/clowdr-app/clowdr
closed
Allow organisers to segregate conference content and schedule for different groups of people
C-enhancement D-hard T-security S-frontend S-hasura
This functionality is important as part of supporting co-events and workshops. Different groups of attendees need to be able to see different subsets of the overall conference content. More general discussion about permission here: #212
True
Allow organisers to segregate conference content and schedule for different groups of people - This functionality is important as part of supporting co-events and workshops. Different groups of attendees need to be able to see different subsets of the overall conference content. More general discussion about permission here: #212
non_priority
allow organisers to segregate conference content and schedule for different groups of people this functionality is important as part of supporting co events and workshops different groups of attendees need to be able to see different subsets of the overall conference content more general discussion about permission here
0
173,771
13,443,204,511
IssuesEvent
2020-09-08 08:00:34
Cookie-AutoDelete/Cookie-AutoDelete
https://api.github.com/repos/Cookie-AutoDelete/Cookie-AutoDelete
closed
[BUG] List of Expression shows lots (30+) of "INVALID CONTAINER" messages under each of the containers
incomplete untested bug/issue
Under List of Expressions, "INVALID CONTAINER" appears 38 times (in red) ** Steps to reproduce the behavior ** : 1. Open CAD Settings 2. Go to "List of Expressions" 3. See error **Expected Behavior** Have not seen this behaviour before. **Screenshots** See screenshot: ![image](https://user-images.githubusercontent.com/53639985/92449213-10da8e00-f1fd-11ea-8696-60918ec79304.png) **Your System Info** - MacOS 10.15.6 - Firefox 80.0.1 (64-bit) - CookieAutoDelete Version: 3.5.1
1.0
[BUG] List of Expression shows lots (30+) of "INVALID CONTAINER" messages under each of the containers - Under List of Expressions, "INVALID CONTAINER" appears 38 times (in red) ** Steps to reproduce the behavior ** : 1. Open CAD Settings 2. Go to "List of Expressions" 3. See error **Expected Behavior** Have not seen this behaviour befor A clear and concise description of what you expected to happen. **Screenshots** See screenshot: ![image](https://user-images.githubusercontent.com/53639985/92449213-10da8e00-f1fd-11ea-8696-60918ec79304.png) **Your System Info** - MacOS 10.15.6 - Firefox 80.0.1 (64-bit) - CookieAutoDelete Version: 3.5.1
non_priority
list of expression shows lots of invalid container messages under each of the containers under list of expressions invalid container appears times in red steps to reproduce the behavior open cad settings go to list of expressions see error expected behavior have not seen this behaviour befor a clear and concise description of what you expected to happen screenshots see screenshot your system info macos firefox bit cookieautodelete version
0
221,245
16,998,610,508
IssuesEvent
2021-07-01 09:38:43
chrklemm/SESMG
https://api.github.com/repos/chrklemm/SESMG
closed
update of the "application" docu
documentation
With the v0.1 update, the order of the columns in the scenario sheet was changed. The order of the columns in the documentation should be unified accordingly.
1.0
update of the "application" docu - with the v0.1 update the order of the columns in the scenario-sheet was changed. The order of the columns in the documentation should be unified.
non_priority
update of the application docu with the update the order of the columns in the scenario sheet was changed the order of the columns in the documentation should be unified
0
51,672
10,710,036,423
IssuesEvent
2019-10-25 00:26:32
microsoft/vscode
https://api.github.com/repos/microsoft/vscode
reopened
Code Action provider called n times on a single key stroke
*out-of-scope api editor-code-actions under-discussion
I noticed today that my ESLint code action provider is called n times on every keystroke. This only seems to happen in a build version of VS Code. I couldn't reproduce this out of source. So I suspect a timing / debouncing issue. I debugged it in the built version and here is what is happening: - in the test workspace where this is happening, the following LS provide diagnostics: tslint, TS and ESLint - on a keystroke all three LS update the diagnostics - this results in 3 calls to `_onMarkerChanges` here: https://github.com/Microsoft/vscode/blob/master/src/vs/editor/contrib/codeAction/codeActionModel.ts#L45 - depending on the timing these seem to get folded or not The more general question for me is why the ESLint code action provider is called when TS or TSLint diagnostics change. I do already check if the diagnostics contain ESLint errors. However in the above example if the ESLint errors come in first then there are errors when the code action provider is triggered again on a TSLint or TS diagnostic change.
1.0
Code Action provider called n times on a single key stroke - I noticed today that my ESLint code action provider is called n times on every keystroke. This only seems to happen in a build version of VS Code. I couldn't reproduce this out of source. So I suspect a timing / debouncing issues. I debugged it in the built version and here is what is happening: - in the test workspace this is happening the following LS provide diagnostics: tslint, TS and ESLint - on a keystroke all three LS update the diagnostics - this results to 3 calls to `_onMarkerChanges` here: https://github.com/Microsoft/vscode/blob/master/src/vs/editor/contrib/codeAction/codeActionModel.ts#L45 - depending on the timeing these seem to get folded or not The more general question for me is why the ESLint code action provider is called when TS or TSLint diagnostics change. I do already check if the diagnostics contain ESLint errors. However in the above example if the ESLint errors come in first then there are errors when the code action provider is triggered again on a TSLint or TS diagnostic change.
non_priority
code action provider called n times on a single key stroke i noticed today that my eslint code action provider is called n times on every keystroke this only seems to happen in a build version of vs code i couldn t reproduce this out of source so i suspect a timing debouncing issues i debugged it in the built version and here is what is happening in the test workspace this is happening the following ls provide diagnostics tslint ts and eslint on a keystroke all three ls update the diagnostics this results to calls to onmarkerchanges here depending on the timeing these seem to get folded or not the more general question for me is why the eslint code action provider is called when ts or tslint diagnostics change i do already check if the diagnostics contain eslint errors however in the above example if the eslint errors come in first then there are errors when the code action provider is triggered again on a tslint or ts diagnostic change
0
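The timing-dependent "folding" of marker-change events described in this record is essentially a debounce problem: bursts of notifications arriving within a short window should collapse into a single handler invocation. A minimal, generic sketch of that pattern in Python (illustrative only — VS Code's actual logic lives in codeActionModel.ts):

```python
import threading

class Debouncer:
    """Collapse bursts of calls: only the last call within `delay` seconds
    actually invokes `fn`; earlier pending calls are cancelled."""

    def __init__(self, delay: float, fn):
        self.delay = delay
        self.fn = fn
        self._timer = None
        self._lock = threading.Lock()

    def call(self, *args):
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()  # an earlier member of the burst: drop it
            self._timer = threading.Timer(self.delay, self.fn, args)
            self._timer.start()
```

With this in place, three diagnostic providers firing on the same keystroke produce one recomputation instead of three — but only if all three notifications land inside the debounce window, which matches the "depending on the timing these seem to get folded or not" observation above.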
256,839
27,561,729,677
IssuesEvent
2023-03-07 22:42:42
samqws-marketing/box_box-ui-elements
https://api.github.com/repos/samqws-marketing/box_box-ui-elements
closed
CVE-2018-11499 (High) detected in node-sass-4.13.1.tgz - autoclosed
security vulnerability
## CVE-2018-11499 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sass-4.13.1.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.13.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.13.1.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - :x: **node-sass-4.13.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/box_box-ui-elements/commit/4fc776e2b95c8b497f6994cb2165365562ae1f82">4fc776e2b95c8b497f6994cb2165365562ae1f82</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A use-after-free vulnerability exists in handle_error() in sass_context.cpp in LibSass 3.4.x and 3.5.x through 3.5.4 that could be leveraged to cause a denial of service (application crash) or possibly unspecified other impact. <p>Publish Date: 2018-05-26 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-11499>CVE-2018-11499</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2018-05-26</p> <p>Fix Resolution: 4.14.0</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
True
CVE-2018-11499 (High) detected in node-sass-4.13.1.tgz - autoclosed - ## CVE-2018-11499 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sass-4.13.1.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.13.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.13.1.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - :x: **node-sass-4.13.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/box_box-ui-elements/commit/4fc776e2b95c8b497f6994cb2165365562ae1f82">4fc776e2b95c8b497f6994cb2165365562ae1f82</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A use-after-free vulnerability exists in handle_error() in sass_context.cpp in LibSass 3.4.x and 3.5.x through 3.5.4 that could be leveraged to cause a denial of service (application crash) or possibly unspecified other impact. 
<p>Publish Date: 2018-05-26 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-11499>CVE-2018-11499</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2018-05-26</p> <p>Fix Resolution: 4.14.0</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
non_priority
cve high detected in node sass tgz autoclosed cve high severity vulnerability vulnerable library node sass tgz wrapper around libsass library home page a href path to dependency file package json path to vulnerable library node modules node sass package json dependency hierarchy x node sass tgz vulnerable library found in head commit a href found in base branch master vulnerability details a use after free vulnerability exists in handle error in sass context cpp in libsass x and x through that could be leveraged to cause a denial of service application crash or possibly unspecified other impact publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution check this box to open an automated fix pr
0
320,831
23,828,076,723
IssuesEvent
2022-09-05 16:43:21
ange-yaghi/engine-sim
https://api.github.com/repos/ange-yaghi/engine-sim
closed
how to create a custom engine? Example: audi V10 5.2FSI
documentation support
hello, I would like to create another engine like the Audi V10 5.2FSI or the BMW M70 or the VR6 AAA, is this possible?
1.0
how to create a custom engine? Example: audi V10 5.2FSI - hello, I would like to create another engine like the Audi V10 5.2FSI or the BMW M70 or the VR6 AAA, is this possible?
non_priority
how to create a custom engine example audi hello i would like to create another engine like the audi or the bmw or the aaa is this possible
0
68,182
9,126,708,201
IssuesEvent
2019-02-24 23:40:41
SAP/fundamental-ngx
https://api.github.com/repos/SAP/fundamental-ngx
opened
update README to use the IE11 css instead of non-IE11
documentation
#### Is this a bug, enhancement, or feature request? documentation #### Briefly describe your proposal. fundamental-ngx needs to support IE11 and the import of the file `./node_modules/fiori-fundamentals/dist/fiori-fundamentals.min.css` doesn't support IE11.
1.0
update README to use the IE11 css instead of non-IE11 - #### Is this a bug, enhancement, or feature request? documentation #### Briefly describe your proposal. fundamental-ngx needs to support IE11 and the import of the file `./node_modules/fiori-fundamentals/dist/fiori-fundamentals.min.css` doesn't support IE11.
non_priority
update readme to use the css instead of non is this a bug enhancement or feature request documentation briefly describe your proposal fundamental ngx needs to support and the import of the file node modules fiori fundamentals dist fiori fundamentals min css doesn t support
0
57,482
6,548,436,421
IssuesEvent
2017-09-04 21:54:05
nodejs/node
https://api.github.com/repos/nodejs/node
closed
test: post-snapshot `test-cli-node-options` timeouts on RaspberryPi
arm CI / flaky test cli test
<!-- Thank you for reporting an issue. This issue tracker is for bugs and issues found within Node.js core. If you require more general support please file an issue on our help repo. https://github.com/nodejs/help Please fill in as much of the template below as you're able. Version: output of `node -v` Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows) Subsystem: if known, please specify affected core module name If possible, please provide code that demonstrates the problem, keeping it as simple and free of external dependencies as you are able. --> * **Version**: 8.1.4 * **Platform**: `pi1-raspbian-wheezy` * **Subsystem**: test,cli <!-- Enter your issue details below this comment. --> https://ci.nodejs.org/job/node-test-binary-arm/9178/RUN_SUBSET=2,label=pi1-raspbian-wheezy/ ``` not ok 30 parallel/test-cli-node-options --- duration_ms: 180.249 severity: fail stack: |- timeout ... ```
2.0
test: post-snapshot `test-cli-node-options` timeouts on RasberryPi - <!-- Thank you for reporting an issue. This issue tracker is for bugs and issues found within Node.js core. If you require more general support please file an issue on our help repo. https://github.com/nodejs/help Please fill in as much of the template below as you're able. Version: output of `node -v` Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows) Subsystem: if known, please specify affected core module name If possible, please provide code that demonstrates the problem, keeping it as simple and free of external dependencies as you are able. --> * **Version**: 8.1.4 * **Platform**: `pi1-raspbian-wheezy` * **Subsystem**: test,cli <!-- Enter your issue details below this comment. --> https://ci.nodejs.org/job/node-test-binary-arm/9178/RUN_SUBSET=2,label=pi1-raspbian-wheezy/ ``` not ok 30 parallel/test-cli-node-options --- duration_ms: 180.249 severity: fail stack: |- timeout ... ```
non_priority
test post snapshot test cli node options timeouts on rasberrypi thank you for reporting an issue this issue tracker is for bugs and issues found within node js core if you require more general support please file an issue on our help repo please fill in as much of the template below as you re able version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able version platform raspbian wheezy subsystem test cli not ok parallel test cli node options duration ms severity fail stack timeout
0
28,482
4,413,984,577
IssuesEvent
2016-08-13 05:05:54
log2timeline/plaso
https://api.github.com/repos/log2timeline/plaso
opened
Improve test coverage
enhancement testing
Improve unit test coverage to 100% * [ ] add missing JSON serializer tests
1.0
Improve test coverage - Improve unit test coverage to 100% * [ ] add missing JSON serializer tests
non_priority
improve test coverage improve unit test coverage to add missing json serializer tests
0
113,524
14,443,423,671
IssuesEvent
2020-12-07 19:37:48
department-of-veterans-affairs/va.gov-team
https://api.github.com/repos/department-of-veterans-affairs/va.gov-team
closed
Ch36 - Copy mismatches
CH36 bug design vsa vsa-ebenefits
## Background We need to confirm that the design files and the staging environments are synced. ## What happened? On Staging **Apply for Personalized Career Planning and Guidance** form, copy didn't match provided design prototypes. [See **Steps to Reproduce** for details.] ## Specs: - Device: [device-agnostic] - Browser: [browser-agnostic] - Feature-flag authentication: [contact @jason-gcio for HTTP Basic Authentication credentials] - Test User _(if applicable)_: For authenticated flows, `vets.gov.user+4@gmail.com` (Alfredo) ## Steps to Reproduce 1. Go to https://staging.va.gov/careers-employment/education-and-career-counseling/apply-career-guidance-form-28-8832/introduction [if prompted for HTTP Basic Authentication login, request credentials from VSA eBenefits team PM.] 1. Under **I know this is the form...**, click **Apply online...**. 1. Scroll down to **subway-map** ("Follow these steps...") section, and observe the following mismatches: - **Step 1**: Under **To fill out...** that 2nd & 3rd bullet-items (**Date of birth** & **If your'e a dependent...** are missing. - **Step 2**: **...service-connected disability**-related content is missing. - **Step 3**: Missing phone number. 1. Scroll back up, click **Start... without signing in**, and on 1st **Claimant Information** [personal info] form-page, observe the following mismatches [these mismatches recur on all subsequent form-pages]: - **Form title**. - Missing **form number**. 
### Screenshots **Subway-map Step 1** ![ch36-copy-bug-subway-map-step1](https://user-images.githubusercontent.com/587583/96354722-99670c80-108e-11eb-9204-00f87c3fc47b.png) **Subway-map Step 2** ![ch36-copy-bug-subway-map-step2](https://user-images.githubusercontent.com/587583/96354906-be5c7f00-1090-11eb-9b4a-ff45227f3925.png) **Subway-map Step 3** ![ch36-copy-bug-subway-map-step3](https://user-images.githubusercontent.com/587583/96354986-bcdf8680-1091-11eb-9f08-542614e0b73f.png) **Post-Intro-pages form-titles, missing form-numbers** ![ch36-copy-bug-post-intro-form-title](https://user-images.githubusercontent.com/587583/96355295-e948d200-1094-11eb-82da-9eccb8631e4a.png) ## Desired copy Copy/styling should match design prototypes. [See screenshots below.] **[Subway-map Step 1](https://preview.uxpin.com/2dbde8d15bf667f5c584fe4a8a6d011cf9f0a14d#/pages/132088566/simulate/sitemap)** ![ch36-copy-spec-subway-map-step1](https://user-images.githubusercontent.com/587583/96354758-011d5780-108f-11eb-8198-2ffa6dd4dd30.png) **[Subway-map Step 2](https://preview.uxpin.com/2dbde8d15bf667f5c584fe4a8a6d011cf9f0a14d#/pages/132088566/simulate/sitemap)** ![ch36-copy-spec-subway-map-step2](https://user-images.githubusercontent.com/587583/96354914-d7fdc680-1090-11eb-9cf5-f45d50b11141.png) **Subway-map Step 3** [Missing phone# not provided in prototype.] **Post-Intro-pages form-title & -number** ![ch36-spec-bug-post-intro-form-title](https://user-images.githubusercontent.com/587583/96355324-36c53f00-1095-11eb-9171-ea0ea0a2e591.png) ## Acceptance Criteria [See **Desired copy/styling** above] ## How to configure this issue - [ ] **Attached to a Milestone** (when will this be completed?) - [x] **Attached to Epic** (what body of work is this a part of? 
possibly `Ongoing Maintenance`) - [x] **Labeled with Team** (`product support`, `analytics-insights`, `operations`, `triage`, `tools-improvements`) - [x] **Labeled with Practice Area** (`backend`, `frontend`, `devops`, `design`, `research`, `product`, `ia`, `qa`, `analytics`, `contact center`, `research`, `accessibility`, `content`) - [x] **Labeled with `Bug`**
1.0
Ch36 - Copy mismatches - ## Background We need to confirm that the design files and the staging environments are synced. ## What happened? On Staging **Apply for Personalized Career Planning and Guidance** form, copy didn't match provided design prototypes. [See **Steps to Reproduce** for details.] ## Specs: - Device: [device-agnostic] - Browser: [browser-agnostic] - Feature-flag authentication: [contact @jason-gcio for HTTP Basic Authentication credentials] - Test User _(if applicable)_: For authenticated flows, `vets.gov.user+4@gmail.com` (Alfredo) ## Steps to Reproduce 1. Go to https://staging.va.gov/careers-employment/education-and-career-counseling/apply-career-guidance-form-28-8832/introduction [if prompted for HTTP Basic Authentication login, request credentials from VSA eBenefits team PM.] 1. Under **I know this is the form...**, click **Apply online...**. 1. Scroll down to **subway-map** ("Follow these steps...") section, and observe the following mismatches: - **Step 1**: Under **To fill out...** that 2nd & 3rd bullet-items (**Date of birth** & **If your'e a dependent...** are missing. - **Step 2**: **...service-connected disability**-related content is missing. - **Step 3**: Missing phone number. 1. Scroll back up, click **Start... without signing in**, and on 1st **Claimant Information** [personal info] form-page, observe the following mismatches [these mismatches recur on all subsequent form-pages]: - **Form title**. - Missing **form number**. 
### Screenshots **Subway-map Step 1** ![ch36-copy-bug-subway-map-step1](https://user-images.githubusercontent.com/587583/96354722-99670c80-108e-11eb-9204-00f87c3fc47b.png) **Subway-map Step 2** ![ch36-copy-bug-subway-map-step2](https://user-images.githubusercontent.com/587583/96354906-be5c7f00-1090-11eb-9b4a-ff45227f3925.png) **Subway-map Step 3** ![ch36-copy-bug-subway-map-step3](https://user-images.githubusercontent.com/587583/96354986-bcdf8680-1091-11eb-9f08-542614e0b73f.png) **Post-Intro-pages form-titles, missing form-numbers** ![ch36-copy-bug-post-intro-form-title](https://user-images.githubusercontent.com/587583/96355295-e948d200-1094-11eb-82da-9eccb8631e4a.png) ## Desired copy Copy/styling should match design prototypes. [See screenshots below.] **[Subway-map Step 1](https://preview.uxpin.com/2dbde8d15bf667f5c584fe4a8a6d011cf9f0a14d#/pages/132088566/simulate/sitemap)** ![ch36-copy-spec-subway-map-step1](https://user-images.githubusercontent.com/587583/96354758-011d5780-108f-11eb-8198-2ffa6dd4dd30.png) **[Subway-map Step 2](https://preview.uxpin.com/2dbde8d15bf667f5c584fe4a8a6d011cf9f0a14d#/pages/132088566/simulate/sitemap)** ![ch36-copy-spec-subway-map-step2](https://user-images.githubusercontent.com/587583/96354914-d7fdc680-1090-11eb-9cf5-f45d50b11141.png) **Subway-map Step 3** [Missing phone# not provided in prototype.] **Post-Intro-pages form-title & -number** ![ch36-spec-bug-post-intro-form-title](https://user-images.githubusercontent.com/587583/96355324-36c53f00-1095-11eb-9171-ea0ea0a2e591.png) ## Acceptance Criteria [See **Desired copy/styling** above] ## How to configure this issue - [ ] **Attached to a Milestone** (when will this be completed?) - [x] **Attached to Epic** (what body of work is this a part of? 
possibly `Ongoing Maintenance`) - [x] **Labeled with Team** (`product support`, `analytics-insights`, `operations`, `triage`, `tools-improvements`) - [x] **Labeled with Practice Area** (`backend`, `frontend`, `devops`, `design`, `research`, `product`, `ia`, `qa`, `analytics`, `contact center`, `research`, `accessibility`, `content`) - [x] **Labeled with `Bug`**
non_priority
copy mismatches background we need to confirm that the design files and the staging environments are synced what happened on staging apply for personalized career planning and guidance form copy didn t match provided design prototypes specs device browser feature flag authentication test user if applicable for authenticated flows vets gov user gmail com alfredo steps to reproduce go to under i know this is the form click apply online scroll down to subway map follow these steps section and observe the following mismatches step under to fill out that bullet items date of birth if your e a dependent are missing step service connected disability related content is missing step missing phone number scroll back up click start without signing in and on claimant information form page observe the following mismatches form title missing form number screenshots subway map step subway map step subway map step post intro pages form titles missing form numbers desired copy copy styling should match design prototypes subway map step post intro pages form title number acceptance criteria how to configure this issue attached to a milestone when will this be completed attached to epic what body of work is this a part of possibly ongoing maintenance labeled with team product support analytics insights operations triage tools improvements labeled with practice area backend frontend devops design research product ia qa analytics contact center research accessibility content labeled with bug
0
336,741
30,219,618,780
IssuesEvent
2023-07-05 18:15:59
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
opened
Fix jax_numpy_statistical.test_jax_median
JAX Frontend Sub Task Failing Test
| | | |---|---| |jax|<a href="https://github.com/unifyai/ivy/actions/runs/5466767206/jobs/9952175381"><img src=https://img.shields.io/badge/-success-success></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5466767206/jobs/9952175381"><img src=https://img.shields.io/badge/-success-success></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5466767206/jobs/9952175381"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/5466767206/jobs/9952175381"><img src=https://img.shields.io/badge/-failure-red></a> |paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5466767206/jobs/9952175381"><img src=https://img.shields.io/badge/-failure-red></a>
1.0
Fix jax_numpy_statistical.test_jax_median - | | | |---|---| |jax|<a href="https://github.com/unifyai/ivy/actions/runs/5466767206/jobs/9952175381"><img src=https://img.shields.io/badge/-success-success></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5466767206/jobs/9952175381"><img src=https://img.shields.io/badge/-success-success></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5466767206/jobs/9952175381"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/5466767206/jobs/9952175381"><img src=https://img.shields.io/badge/-failure-red></a> |paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5466767206/jobs/9952175381"><img src=https://img.shields.io/badge/-failure-red></a>
non_priority
fix jax numpy statistical test jax median jax a href src numpy a href src tensorflow a href src torch a href src paddle a href src
0
72,847
8,785,765,820
IssuesEvent
2018-12-20 13:58:58
Altinn/altinn-studio
https://api.github.com/repos/Altinn/altinn-studio
closed
SBL/Preview layout looks weird on laptop and smaller screens
bug designer team-tamagotchi
**Describe the bug** The layout for SBL/Preview does not look good when loaded on a smaller screen than desktop (laptop, mobile, etc.) How it looks: ![image](https://user-images.githubusercontent.com/1636323/49809787-183ae200-fd60-11e8-80c3-08b065de20ea.png) Looks like an issue related to screen _height_. **To Reproduce** Steps to reproduce the behavior: 1. Go to preview for a service, make sure to load the page on a laptop, or reduce the height of the browser window to 793px or less. 2. See broken layout **Expected behavior** How it should look: ![image](https://user-images.githubusercontent.com/1636323/49810119-c777b900-fd60-11e8-8ec9-5d44e44edbfb.png) **Desktop (please complete the following information):** - OS: Windows 10 - Browser chrome - Version 70.0.3538.77 **Smartphone (please complete the following information):** - Device: [e.g. iPhone6] - OS: [e.g. iOS8.1] - Browser [e.g. stock browser, safari] - Version [e.g. 22] **Additional context** Nor sure if this is an issue with the FormFiller component or the top nav header.
1.0
SBL/Preview layout looks weird on laptop and smaller screens - **Describe the bug** The layout for SBL/Preview does not look good when loaded on a smaller screen than desktop (laptop, mobile, etc.) How it looks: ![image](https://user-images.githubusercontent.com/1636323/49809787-183ae200-fd60-11e8-80c3-08b065de20ea.png) Looks like an issue related to screen _height_. **To Reproduce** Steps to reproduce the behavior: 1. Go to preview for a service, make sure to load the page on a laptop, or reduce the height of the browser window to 793px or less. 2. See broken layout **Expected behavior** How it should look: ![image](https://user-images.githubusercontent.com/1636323/49810119-c777b900-fd60-11e8-8ec9-5d44e44edbfb.png) **Desktop (please complete the following information):** - OS: Windows 10 - Browser chrome - Version 70.0.3538.77 **Smartphone (please complete the following information):** - Device: [e.g. iPhone6] - OS: [e.g. iOS8.1] - Browser [e.g. stock browser, safari] - Version [e.g. 22] **Additional context** Nor sure if this is an issue with the FormFiller component or the top nav header.
non_priority
sbl preview layout looks weird on laptop and smaller screens describe the bug the layout for sbl preview does not look good when loaded on a smaller screen than desktop laptop mobile etc how it looks looks like an issue related to screen height to reproduce steps to reproduce the behavior go to preview for a service make sure to load the page on a laptop or reduce the height of the browser window to or less see broken layout expected behavior how it should look desktop please complete the following information os windows browser chrome version smartphone please complete the following information device os browser version additional context nor sure if this is an issue with the formfiller component or the top nav header
0
6,271
6,286,008,177
IssuesEvent
2017-07-19 11:53:55
SatelliteQE/robottelo
https://api.github.com/repos/SatelliteQE/robottelo
opened
Many UI tests fails to save screenshot when using saucelabs
6.3 Infrastructure test-failure
many UI tests failed in tearDown when saving screenshot with the same error: ```console robottelo/test.py:622: in take_screenshot self.browser.save_screenshot(path) ../../shiningpanda/jobs/375dbdea/virtualenvs/d41d8cd9/lib/python2.7/site-packages/selenium/webdriver/remote/webdriver.py:758: in get_screenshot_as_file png = self.get_screenshot_as_png() ../../shiningpanda/jobs/375dbdea/virtualenvs/d41d8cd9/lib/python2.7/site-packages/selenium/webdriver/remote/webdriver.py:777: in get_screenshot_as_png return base64.b64decode(self.get_screenshot_as_base64().encode('ascii')) ../../shiningpanda/jobs/375dbdea/virtualenvs/d41d8cd9/lib/python2.7/site-packages/selenium/webdriver/remote/webdriver.py:787: in get_screenshot_as_base64 return self.execute(Command.SCREENSHOT)['value'] robottelo/ui/browser.py:44: in execute params) ../../shiningpanda/jobs/375dbdea/virtualenvs/d41d8cd9/lib/python2.7/site-packages/selenium/webdriver/remote/webdriver.py:201: in execute self.error_handler.check_response(response) ../../shiningpanda/jobs/375dbdea/virtualenvs/d41d8cd9/lib/python2.7/site-packages/selenium/webdriver/remote/errorhandler.py:102: in check_response value = json.loads(value_json) /usr/lib64/python2.7/json/__init__.py:339: in loads return _default_decoder.decode(s) /usr/lib64/python2.7/json/decoder.py:364: in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <json.decoder.JSONDecoder object at 0x7f71580bb590> s = "The test with session id b08625539bb64e46af50a676d3ad4345 has already finished, and can't receive further commands. Y.../b08625539bb64e46af50a676d3ad4345 For help, please check https://wiki.saucelabs.com/display/DOCS/Common+Error+Messages" idx = 0 def raw_decode(self, s, idx=0): """Decode a JSON document from ``s`` (a ``str`` or ``unicode`` beginning with a JSON document) and return a 2-tuple of the Python representation and the index in ``s`` where the document ended. 
This can be used to decode a JSON document from a string that may have extraneous data at the end. """ try: obj, end = self.scan_once(s, idx) except StopIteration: > raise ValueError("No JSON object could be decoded") E ValueError: No JSON object could be decoded /usr/lib64/python2.7/json/decoder.py:382: ValueError ```
1.0
Many UI tests fails to save screenshot when using saucelabs - many UI tests failed in tearDown when saving screenshot with the same error: ```console robottelo/test.py:622: in take_screenshot self.browser.save_screenshot(path) ../../shiningpanda/jobs/375dbdea/virtualenvs/d41d8cd9/lib/python2.7/site-packages/selenium/webdriver/remote/webdriver.py:758: in get_screenshot_as_file png = self.get_screenshot_as_png() ../../shiningpanda/jobs/375dbdea/virtualenvs/d41d8cd9/lib/python2.7/site-packages/selenium/webdriver/remote/webdriver.py:777: in get_screenshot_as_png return base64.b64decode(self.get_screenshot_as_base64().encode('ascii')) ../../shiningpanda/jobs/375dbdea/virtualenvs/d41d8cd9/lib/python2.7/site-packages/selenium/webdriver/remote/webdriver.py:787: in get_screenshot_as_base64 return self.execute(Command.SCREENSHOT)['value'] robottelo/ui/browser.py:44: in execute params) ../../shiningpanda/jobs/375dbdea/virtualenvs/d41d8cd9/lib/python2.7/site-packages/selenium/webdriver/remote/webdriver.py:201: in execute self.error_handler.check_response(response) ../../shiningpanda/jobs/375dbdea/virtualenvs/d41d8cd9/lib/python2.7/site-packages/selenium/webdriver/remote/errorhandler.py:102: in check_response value = json.loads(value_json) /usr/lib64/python2.7/json/__init__.py:339: in loads return _default_decoder.decode(s) /usr/lib64/python2.7/json/decoder.py:364: in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <json.decoder.JSONDecoder object at 0x7f71580bb590> s = "The test with session id b08625539bb64e46af50a676d3ad4345 has already finished, and can't receive further commands. 
Y.../b08625539bb64e46af50a676d3ad4345 For help, please check https://wiki.saucelabs.com/display/DOCS/Common+Error+Messages" idx = 0 def raw_decode(self, s, idx=0): """Decode a JSON document from ``s`` (a ``str`` or ``unicode`` beginning with a JSON document) and return a 2-tuple of the Python representation and the index in ``s`` where the document ended. This can be used to decode a JSON document from a string that may have extraneous data at the end. """ try: obj, end = self.scan_once(s, idx) except StopIteration: > raise ValueError("No JSON object could be decoded") E ValueError: No JSON object could be decoded /usr/lib64/python2.7/json/decoder.py:382: ValueError ```
non_priority
many ui tests fails to save screenshot when using saucelabs many ui tests failed in teardown when saving screenshot with the same error console robottelo test py in take screenshot self browser save screenshot path shiningpanda jobs virtualenvs lib site packages selenium webdriver remote webdriver py in get screenshot as file png self get screenshot as png shiningpanda jobs virtualenvs lib site packages selenium webdriver remote webdriver py in get screenshot as png return self get screenshot as encode ascii shiningpanda jobs virtualenvs lib site packages selenium webdriver remote webdriver py in get screenshot as return self execute command screenshot robottelo ui browser py in execute params shiningpanda jobs virtualenvs lib site packages selenium webdriver remote webdriver py in execute self error handler check response response shiningpanda jobs virtualenvs lib site packages selenium webdriver remote errorhandler py in check response value json loads value json usr json init py in loads return default decoder decode s usr json decoder py in decode obj end self raw decode s idx w s end self s the test with session id has already finished and can t receive further commands y for help please check idx def raw decode self s idx decode a json document from s a str or unicode beginning with a json document and return a tuple of the python representation and the index in s where the document ended this can be used to decode a json document from a string that may have extraneous data at the end try obj end self scan once s idx except stopiteration raise valueerror no json object could be decoded e valueerror no json object could be decoded usr json decoder py valueerror
0
39,831
5,251,969,227
IssuesEvent
2017-02-02 01:52:59
rancher/rancher
https://api.github.com/repos/rancher/rancher
closed
Metadata for exposed ports have 0.0.0.0 instead of host Ipaddress.
kind/bug status/resolved status/to-test
Rancher Versions: v1.4.0-rc3 2 automation test cases relating to metadata for exposed ports is failing because of ip being set to 0.0.0.0 instead of host Ipaddress. ``` cattlevalidationtest.core.test_rancher_compose_metadata.test_metadata_self_2015_07_25 Error Details assert ['0.0.0.0:6000:22/tcp'] == ['35.160.82.141:6000:22/tcp'] At index 0 diff: u'0.0.0.0:6000:22/tcp' != u'35.160.82.141:6000:22/tcp' Use -v to get the full diff Stack Trace admin_client = <cattle.Client object at 0x7f14e6e4cdd0> client = <cattle.Client object at 0x7f14f459bb10> rancher_compose_container = None @if_compose_data_files def test_metadata_self_2015_07_25( admin_client, client, rancher_compose_container): env_name = random_str().replace("-", "") # Create an environment using up launch_rancher_compose_from_file( client, METADATA_SUBDIR, "dc_metadata_1.yml", env_name, "up -d", "Creating stack", "rc_metadata_1.yml") env, service = get_env_service_by_name(client, env_name, "test") assert service.state == "active" print service.metadata assert service.metadata["test1"]["name"] == "t1name" assert service.metadata["test1"]["value"] == "t1value" assert service.metadata["test2"]["name"] == [1, 2, 3, 4] wait_for_metadata_propagation(admin_client) service_containers = get_service_container_list(admin_client, service) port = 6000 con_names = [] for con in service_containers: con_names.append(con.name) for con in service_containers: # Service related metadata metadata_str = fetch_rancher_metadata(admin_client, con, port, "self/service", "2015-07-25") metadata = json.loads(metadata_str) assert set(metadata["containers"]) == set(con_names) print metadata["external_ips"] print metadata["hostname"] assert metadata["name"] == "test" assert metadata["ports"] == ["6000:22/tcp"] assert metadata["stack_name"] == env_name assert metadata["kind"] == "service" assert metadata["labels"] == service.launchConfig["labels"] assert metadata["metadata"] == service.metadata assert metadata["uuid"] == service.uuid host = 
admin_client.by_id('host', con.hosts[0].id) # Host related metadata metadata_str = fetch_rancher_metadata(admin_client, con, port, "self/host", "2015-07-25") metadata = json.loads(metadata_str) assert metadata["agent_ip"] == host.ipAddresses()[0].address assert metadata["labels"] == host.labels assert metadata["name"] == host.hostname assert metadata["uuid"] == host.uuid # Stack related metadata metadata_str = fetch_rancher_metadata(admin_client, con, port, "self/stack", "2015-07-25") metadata = json.loads(metadata_str) assert metadata["environment_name"] == "Default" assert metadata["services"] == ["test"] assert metadata["name"] == env.name assert metadata["uuid"] == env.uuid # Container related metadata metadata_str = fetch_rancher_metadata(admin_client, con, port, "self/container", "2015-07-25") metadata = json.loads(metadata_str) assert metadata["create_index"] == con.createIndex assert metadata["host_uuid"] == host.uuid assert metadata["ips"] == [con.primaryIpAddress] assert metadata["labels"] == con.labels assert metadata["name"] == con.name > assert metadata["ports"] == [host.ipAddresses()[0].address + ":6000:22/tcp"] E assert ['0.0.0.0:6000:22/tcp'] == ['35.160.82.141:6000:22/tcp'] E At index 0 diff: u'0.0.0.0:6000:22/tcp' != u'35.160.82.141:6000:22/tcp' E Use -v to get the full diff validation-tests/tests/v2_validation/cattlevalidationtest/core/test_rancher_compose_metadata.py:476: AssertionError 17 sec 1 cattlevalidationtest.core.test_rancher_compose_metadata.test_metadata_self_2015_12_19 Error Details assert ['0.0.0.0:6001:22/tcp'] == ['52.11.211.99:6001:22/tcp'] At index 0 diff: u'0.0.0.0:6001:22/tcp' != u'52.11.211.99:6001:22/tcp' Use -v to get the full diff Stack Trace admin_client = <cattle.Client object at 0x7f14e6e4cdd0> client = <cattle.Client object at 0x7f14f459bb10> rancher_compose_container = None @if_compose_data_files def test_metadata_self_2015_12_19( admin_client, client, rancher_compose_container): env_name = random_str().replace("-", 
"") # Create an environment using up launch_rancher_compose_from_file( client, METADATA_SUBDIR, "dc_metadata_1n.yml", env_name, "up -d", "Creating stack", "rc_metadata_1n.yml") env, service = get_env_service_by_name(client, env_name, "test1n") assert service.state == "active" print service.metadata assert service.metadata["test1"]["name"] == "t1name" assert service.metadata["test1"]["value"] == "t1value" assert service.metadata["test2"]["name"] == [1, 2, 3, 4] service_containers = get_service_container_list(admin_client, service) port = 6001 con_metadata = {} wait_for_metadata_propagation(admin_client) for con in service_containers: metadata_str = fetch_rancher_metadata(admin_client, con, metadata_client_port, "containers/" + con.name, "2015-12-19") con_metadata[con.name] = json.loads(metadata_str) for con in service_containers: # Service related metadata metadata_str = fetch_rancher_metadata(admin_client, con, port, "self/service", "2015-12-19") service_metadata = json.loads(metadata_str) con_list = service_metadata["containers"] # Check for container object list assert len(con_list) == len(con_metadata.keys()) for container in con_list: assert cmp(container, con_metadata[container["name"]]) == 0 assert service_metadata["name"] == "test1n" assert service_metadata["ports"] == ["6001:22/tcp"] assert service_metadata["stack_name"] == env_name assert service_metadata["kind"] == "service" assert service_metadata["labels"] == service.launchConfig["labels"] assert service_metadata["metadata"] == service.metadata assert service_metadata["uuid"] == service.uuid host = admin_client.by_id('host', con.hosts[0].id) # Host related metadata metadata_str = fetch_rancher_metadata(admin_client, con, port, "self/host", "2015-12-19") metadata = json.loads(metadata_str) assert metadata["agent_ip"] == host.ipAddresses()[0].address assert metadata["labels"] == host.labels assert metadata["name"] == host.hostname assert metadata["uuid"] == host.uuid # Stack related metadata metadata_str 
= fetch_rancher_metadata(admin_client, con, port, "self/stack", "2015-12-19") metadata = json.loads(metadata_str) assert metadata["environment_name"] == "Default" # Check for service object list # Set token value to None in service metadata object returned # from self before comparing service object retrieved by index service_metadata["token"] = None assert cmp(metadata["services"][0], service_metadata) == 0 assert metadata["name"] == env.name assert metadata["uuid"] == env.uuid # Container related metadata metadata_str = fetch_rancher_metadata(admin_client, con, port, "self/container", "2015-12-19") metadata = json.loads(metadata_str) assert metadata["create_index"] == con.createIndex assert metadata["host_uuid"] == host.uuid assert metadata["ips"] == [con.primaryIpAddress] assert metadata["labels"] == con.labels assert metadata["name"] == con.name > assert metadata["ports"] == [host.ipAddresses()[0].address + ":6001:22/tcp"] E assert ['0.0.0.0:6001:22/tcp'] == ['52.11.211.99:6001:22/tcp'] E At index 0 diff: u'0.0.0.0:6001:22/tcp' != u'52.11.211.99:6001:22/tcp' E Use -v to get the full diff validation-tests/tests/v2_validation/cattlevalidationtest/core/test_rancher_compose_metadata.py:311: AssertionError ```
1.0
Metadata for exposed ports has 0.0.0.0 instead of host IP address. - Rancher Versions: v1.4.0-rc3

Two automation test cases relating to metadata for exposed ports are failing because the IP is set to 0.0.0.0 instead of the host IP address.

```
cattlevalidationtest.core.test_rancher_compose_metadata.test_metadata_self_2015_07_25

Error Details

assert ['0.0.0.0:6000:22/tcp'] == ['35.160.82.141:6000:22/tcp']
  At index 0 diff: u'0.0.0.0:6000:22/tcp' != u'35.160.82.141:6000:22/tcp'
  Use -v to get the full diff

Stack Trace

admin_client = <cattle.Client object at 0x7f14e6e4cdd0>
client = <cattle.Client object at 0x7f14f459bb10>
rancher_compose_container = None

    @if_compose_data_files
    def test_metadata_self_2015_07_25(
            admin_client, client, rancher_compose_container):
        env_name = random_str().replace("-", "")
        # Create an environment using up
        launch_rancher_compose_from_file(
            client, METADATA_SUBDIR, "dc_metadata_1.yml", env_name,
            "up -d", "Creating stack", "rc_metadata_1.yml")
        env, service = get_env_service_by_name(client, env_name, "test")
        assert service.state == "active"
        print service.metadata
        assert service.metadata["test1"]["name"] == "t1name"
        assert service.metadata["test1"]["value"] == "t1value"
        assert service.metadata["test2"]["name"] == [1, 2, 3, 4]
        wait_for_metadata_propagation(admin_client)
        service_containers = get_service_container_list(admin_client, service)
        port = 6000
        con_names = []
        for con in service_containers:
            con_names.append(con.name)
        for con in service_containers:
            # Service related metadata
            metadata_str = fetch_rancher_metadata(
                admin_client, con, port, "self/service", "2015-07-25")
            metadata = json.loads(metadata_str)
            assert set(metadata["containers"]) == set(con_names)
            print metadata["external_ips"]
            print metadata["hostname"]
            assert metadata["name"] == "test"
            assert metadata["ports"] == ["6000:22/tcp"]
            assert metadata["stack_name"] == env_name
            assert metadata["kind"] == "service"
            assert metadata["labels"] == service.launchConfig["labels"]
            assert metadata["metadata"] == service.metadata
            assert metadata["uuid"] == service.uuid
            host = admin_client.by_id('host', con.hosts[0].id)
            # Host related metadata
            metadata_str = fetch_rancher_metadata(
                admin_client, con, port, "self/host", "2015-07-25")
            metadata = json.loads(metadata_str)
            assert metadata["agent_ip"] == host.ipAddresses()[0].address
            assert metadata["labels"] == host.labels
            assert metadata["name"] == host.hostname
            assert metadata["uuid"] == host.uuid
            # Stack related metadata
            metadata_str = fetch_rancher_metadata(
                admin_client, con, port, "self/stack", "2015-07-25")
            metadata = json.loads(metadata_str)
            assert metadata["environment_name"] == "Default"
            assert metadata["services"] == ["test"]
            assert metadata["name"] == env.name
            assert metadata["uuid"] == env.uuid
            # Container related metadata
            metadata_str = fetch_rancher_metadata(
                admin_client, con, port, "self/container", "2015-07-25")
            metadata = json.loads(metadata_str)
            assert metadata["create_index"] == con.createIndex
            assert metadata["host_uuid"] == host.uuid
            assert metadata["ips"] == [con.primaryIpAddress]
            assert metadata["labels"] == con.labels
            assert metadata["name"] == con.name
>           assert metadata["ports"] == [host.ipAddresses()[0].address + ":6000:22/tcp"]
E           assert ['0.0.0.0:6000:22/tcp'] == ['35.160.82.141:6000:22/tcp']
E             At index 0 diff: u'0.0.0.0:6000:22/tcp' != u'35.160.82.141:6000:22/tcp'
E             Use -v to get the full diff

validation-tests/tests/v2_validation/cattlevalidationtest/core/test_rancher_compose_metadata.py:476: AssertionError

17 sec 1

cattlevalidationtest.core.test_rancher_compose_metadata.test_metadata_self_2015_12_19

Error Details

assert ['0.0.0.0:6001:22/tcp'] == ['52.11.211.99:6001:22/tcp']
  At index 0 diff: u'0.0.0.0:6001:22/tcp' != u'52.11.211.99:6001:22/tcp'
  Use -v to get the full diff

Stack Trace

admin_client = <cattle.Client object at 0x7f14e6e4cdd0>
client = <cattle.Client object at 0x7f14f459bb10>
rancher_compose_container = None

    @if_compose_data_files
    def test_metadata_self_2015_12_19(
            admin_client, client, rancher_compose_container):
        env_name = random_str().replace("-", "")
        # Create an environment using up
        launch_rancher_compose_from_file(
            client, METADATA_SUBDIR, "dc_metadata_1n.yml", env_name,
            "up -d", "Creating stack", "rc_metadata_1n.yml")
        env, service = get_env_service_by_name(client, env_name, "test1n")
        assert service.state == "active"
        print service.metadata
        assert service.metadata["test1"]["name"] == "t1name"
        assert service.metadata["test1"]["value"] == "t1value"
        assert service.metadata["test2"]["name"] == [1, 2, 3, 4]
        service_containers = get_service_container_list(admin_client, service)
        port = 6001
        con_metadata = {}
        wait_for_metadata_propagation(admin_client)
        for con in service_containers:
            metadata_str = fetch_rancher_metadata(
                admin_client, con, metadata_client_port,
                "containers/" + con.name, "2015-12-19")
            con_metadata[con.name] = json.loads(metadata_str)
        for con in service_containers:
            # Service related metadata
            metadata_str = fetch_rancher_metadata(
                admin_client, con, port, "self/service", "2015-12-19")
            service_metadata = json.loads(metadata_str)
            con_list = service_metadata["containers"]
            # Check for container object list
            assert len(con_list) == len(con_metadata.keys())
            for container in con_list:
                assert cmp(container, con_metadata[container["name"]]) == 0
            assert service_metadata["name"] == "test1n"
            assert service_metadata["ports"] == ["6001:22/tcp"]
            assert service_metadata["stack_name"] == env_name
            assert service_metadata["kind"] == "service"
            assert service_metadata["labels"] == service.launchConfig["labels"]
            assert service_metadata["metadata"] == service.metadata
            assert service_metadata["uuid"] == service.uuid
            host = admin_client.by_id('host', con.hosts[0].id)
            # Host related metadata
            metadata_str = fetch_rancher_metadata(
                admin_client, con, port, "self/host", "2015-12-19")
            metadata = json.loads(metadata_str)
            assert metadata["agent_ip"] == host.ipAddresses()[0].address
            assert metadata["labels"] == host.labels
            assert metadata["name"] == host.hostname
            assert metadata["uuid"] == host.uuid
            # Stack related metadata
            metadata_str = fetch_rancher_metadata(
                admin_client, con, port, "self/stack", "2015-12-19")
            metadata = json.loads(metadata_str)
            assert metadata["environment_name"] == "Default"
            # Check for service object list
            # Set token value to None in service metadata object returned
            # from self before comparing service object retrieved by index
            service_metadata["token"] = None
            assert cmp(metadata["services"][0], service_metadata) == 0
            assert metadata["name"] == env.name
            assert metadata["uuid"] == env.uuid
            # Container related metadata
            metadata_str = fetch_rancher_metadata(
                admin_client, con, port, "self/container", "2015-12-19")
            metadata = json.loads(metadata_str)
            assert metadata["create_index"] == con.createIndex
            assert metadata["host_uuid"] == host.uuid
            assert metadata["ips"] == [con.primaryIpAddress]
            assert metadata["labels"] == con.labels
            assert metadata["name"] == con.name
>           assert metadata["ports"] == [host.ipAddresses()[0].address + ":6001:22/tcp"]
E           assert ['0.0.0.0:6001:22/tcp'] == ['52.11.211.99:6001:22/tcp']
E             At index 0 diff: u'0.0.0.0:6001:22/tcp' != u'52.11.211.99:6001:22/tcp'
E             Use -v to get the full diff

validation-tests/tests/v2_validation/cattlevalidationtest/core/test_rancher_compose_metadata.py:311: AssertionError
```
non_priority
metdata for exposed ports have instead of host ipaddress rancher versions automation test cases relating to metadata for exposed ports is failing because of ip being set to instead of host ipaddress cattlevalidationtest core test rancher compose metadata test metadata self error details assert at index diff u tcp u tcp use v to get the full diff stack trace admin client client rancher compose container none if compose data files def test metadata self admin client client rancher compose container env name random str replace create an environment using up launch rancher compose from file client metadata subdir dc metadata yml env name up d creating stack rc metadata yml env service get env service by name client env name test assert service state active print service metadata assert service metadata assert service metadata assert service metadata wait for metadata propagation admin client service containers get service container list admin client service port con names for con in service containers con names append con name for con in service containers service related metadata metadata str fetch rancher metadata admin client con port self service metadata json loads metadata str assert set metadata set con names print metadata print metadata assert metadata test assert metadata assert metadata env name assert metadata service assert metadata service launchconfig assert metadata service metadata assert metadata service uuid host admin client by id host con hosts id host related metadata metadata str fetch rancher metadata admin client con port self host metadata json loads metadata str assert metadata host ipaddresses address assert metadata host labels assert metadata host hostname assert metadata host uuid stack related metadata metadata str fetch rancher metadata admin client con port self stack metadata json loads metadata str assert metadata default assert metadata assert metadata env name assert metadata env uuid container related metadata metadata str fetch 
rancher metadata admin client con port self container metadata json loads metadata str assert metadata con createindex assert metadata host uuid assert metadata assert metadata con labels assert metadata con name assert metadata address tcp e assert e at index diff u tcp u tcp e use v to get the full diff validation tests tests validation cattlevalidationtest core test rancher compose metadata py assertionerror sec cattlevalidationtest core test rancher compose metadata test metadata self error details assert at index diff u tcp u tcp use v to get the full diff stack trace admin client client rancher compose container none if compose data files def test metadata self admin client client rancher compose container env name random str replace create an environment using up launch rancher compose from file client metadata subdir dc metadata yml env name up d creating stack rc metadata yml env service get env service by name client env name assert service state active print service metadata assert service metadata assert service metadata assert service metadata service containers get service container list admin client service port con metadata wait for metadata propagation admin client for con in service containers metadata str fetch rancher metadata admin client con metadata client port containers con name con metadata json loads metadata str for con in service containers service related metadata metadata str fetch rancher metadata admin client con port self service service metadata json loads metadata str con list service metadata check for container object list assert len con list len con metadata keys for container in con list assert cmp container con metadata assert service metadata assert service metadata assert service metadata env name assert service metadata service assert service metadata service launchconfig assert service metadata service metadata assert service metadata service uuid host admin client by id host con hosts id host related metadata metadata 
str fetch rancher metadata admin client con port self host metadata json loads metadata str assert metadata host ipaddresses address assert metadata host labels assert metadata host hostname assert metadata host uuid stack related metadata metadata str fetch rancher metadata admin client con port self stack metadata json loads metadata str assert metadata default check for service object list set token value to none in service metadata object returned from self before comparing service object retrieved by index service metadata none assert cmp metadata service metadata assert metadata env name assert metadata env uuid container related metadata metadata str fetch rancher metadata admin client con port self container metadata json loads metadata str assert metadata con createindex assert metadata host uuid assert metadata assert metadata con labels assert metadata con name assert metadata address tcp e assert e at index diff u tcp u tcp e use v to get the full diff validation tests tests validation cattlevalidationtest core test rancher compose metadata py assertionerror
0
184,865
14,289,966,340
IssuesEvent
2020-11-23 20:06:51
github-vet/rangeclosure-findings
https://api.github.com/repos/github-vet/rangeclosure-findings
closed
cameljjh/education: vendor/github.com/google/certificate-transparency-go/fixchain/fix_and_log_test.go; 33 LoC
fresh small test
Found a possible issue in [cameljjh/education](https://www.github.com/cameljjh/education) at [vendor/github.com/google/certificate-transparency-go/fixchain/fix_and_log_test.go](https://github.com/cameljjh/education/blob/897296a73d5e895afca835766e3a8456a14c314f/vendor/github.com/google/certificate-transparency-go/fixchain/fix_and_log_test.go#L199-L231) The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements which capture loop variables. [Click here to see the code in its original context.](https://github.com/cameljjh/education/blob/897296a73d5e895afca835766e3a8456a14c314f/vendor/github.com/google/certificate-transparency-go/fixchain/fix_and_log_test.go#L199-L231) <details> <summary>Click here to show the 33 line(s) of Go which triggered the analyzer.</summary> ```go for i, test := range newFixAndLogTests { seen := make([]bool, len(test.expLoggedChains)) errors := make(chan *FixError) c := &http.Client{Transport: &testRoundTripper{t: t, test: &test, testIndex: i, seen: seen}} logClient, err := client.New(test.url, c, jsonclient.Options{}) if err != nil { t.Fatalf("failed to create LogClient: %v", err) } fl := NewFixAndLog(ctx, 1, 1, errors, c, logClient, newNilLimiter(), false) var wg sync.WaitGroup wg.Add(1) go func() { defer wg.Done() testErrors(t, i, test.expectedErrs, errors) }() switch test.function { case "QueueChain": fl.QueueChain(extractTestChain(t, i, test.chain)) case "QueueAllCertsInChain": fl.QueueAllCertsInChain(extractTestChain(t, i, test.chain)) } fl.Wait() close(errors) wg.Wait() // Check that no chains that were expected to be logged were not. for j, val := range seen { if !val { t.Errorf("#%d: Expected chain was not logged: %s", i, strings.Join(test.expLoggedChains[j], " -> ")) } } } ``` </details> commit ID: 897296a73d5e895afca835766e3a8456a14c314f
1.0
cameljjh/education: vendor/github.com/google/certificate-transparency-go/fixchain/fix_and_log_test.go; 33 LoC - Found a possible issue in [cameljjh/education](https://www.github.com/cameljjh/education) at [vendor/github.com/google/certificate-transparency-go/fixchain/fix_and_log_test.go](https://github.com/cameljjh/education/blob/897296a73d5e895afca835766e3a8456a14c314f/vendor/github.com/google/certificate-transparency-go/fixchain/fix_and_log_test.go#L199-L231) The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements which capture loop variables. [Click here to see the code in its original context.](https://github.com/cameljjh/education/blob/897296a73d5e895afca835766e3a8456a14c314f/vendor/github.com/google/certificate-transparency-go/fixchain/fix_and_log_test.go#L199-L231) <details> <summary>Click here to show the 33 line(s) of Go which triggered the analyzer.</summary> ```go for i, test := range newFixAndLogTests { seen := make([]bool, len(test.expLoggedChains)) errors := make(chan *FixError) c := &http.Client{Transport: &testRoundTripper{t: t, test: &test, testIndex: i, seen: seen}} logClient, err := client.New(test.url, c, jsonclient.Options{}) if err != nil { t.Fatalf("failed to create LogClient: %v", err) } fl := NewFixAndLog(ctx, 1, 1, errors, c, logClient, newNilLimiter(), false) var wg sync.WaitGroup wg.Add(1) go func() { defer wg.Done() testErrors(t, i, test.expectedErrs, errors) }() switch test.function { case "QueueChain": fl.QueueChain(extractTestChain(t, i, test.chain)) case "QueueAllCertsInChain": fl.QueueAllCertsInChain(extractTestChain(t, i, test.chain)) } fl.Wait() close(errors) wg.Wait() // Check that no chains that were expected to be logged were not. for j, val := range seen { if !val { t.Errorf("#%d: Expected chain was not logged: %s", i, strings.Join(test.expLoggedChains[j], " -> ")) } } } ``` </details> commit ID: 897296a73d5e895afca835766e3a8456a14c314f
non_priority
cameljjh education vendor github com google certificate transparency go fixchain fix and log test go loc found a possible issue in at the below snippet of go code triggered static analysis which searches for goroutines and or defer statements which capture loop variables click here to show the line s of go which triggered the analyzer go for i test range newfixandlogtests seen make bool len test exploggedchains errors make chan fixerror c http client transport testroundtripper t t test test testindex i seen seen logclient err client new test url c jsonclient options if err nil t fatalf failed to create logclient v err fl newfixandlog ctx errors c logclient newnillimiter false var wg sync waitgroup wg add go func defer wg done testerrors t i test expectederrs errors switch test function case queuechain fl queuechain extracttestchain t i test chain case queueallcertsinchain fl queueallcertsinchain extracttestchain t i test chain fl wait close errors wg wait check that no chains that were expected to be logged were not for j val range seen if val t errorf d expected chain was not logged s i strings join test exploggedchains commit id
0
65,170
27,001,781,032
IssuesEvent
2023-02-10 08:27:59
opensdmx/rsdmx
https://api.github.com/repos/opensdmx/rsdmx
closed
change in Eurostat API: adjustment of rsdmx request builder and man page?
web-service connector user support
I've recently decided to try SDMX for my data queries. Thank you for this package. Anyway, seems that Eurostat (recently?) implemented changes in their API. Would it be possible to adjust the request builder and the help pages accordingly? I first tried an example from ?readSDMX: ``` url <- paste("http://ec.europa.eu/eurostat/SDMX/diss-web/rest/data/", "cdh_e_fos/all/?startperiod=2000&endPeriod=2010", sep = "") sdmx <- readSDMX(url) ## Not Found (HTTP 404).Error in readSDMX(url) : HTTP request failed with status: 404 ``` I also tried to use the builtin request builder: ``` test.sdmx <- rsdmx::readSDMX(providerId = "ESTAT", resource = "data", flowRef = "nama_10_gdp", key = list("A", "CP_MEUR", "B1GQ", "BE"), start = 2012, end = 2014 ) ## [rsdmx][INFO] Fetching 'http://ec.europa.eu/eurostat/SDMX/diss-web/rest/data/nama_10_gdp/A.CP_MEUR.B1GQ.BE/all/?startPeriod=2012&endPeriod=2014' ## Not Found (HTTP 404).Error in rsdmx::readSDMX(providerId = "ESTAT", resource = "data", flowRef = "nama_10_gdp", : ## HTTP request failed with status: 404 ``` I then followed the EC/Eurostat wiki (https://wikis.ec.europa.eu/display/EUROSTATHELP/API+SDMX+2.1+-+data+filtering#APISDMX2.1datafiltering-SDMX21FDIM), querying for 'generic data': ``` test.url <- "https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/data/NAMA_10_GDP/A.CP_MEUR.B1GQ.BE+LU" test.sdmx <- rsdmx::readSDMX(test.url) ## Error in structuresXML[[1]] : subscript out of bounds ``` Finally, what works is the following ('structured format'): ``` test.url <- "https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/data/NAMA_10_GDP/A.CP_MEUR.B1GQ.BE+LU?format=SDMX_2.1_STRUCTURED" test.sdmx <- rsdmx::readSDMX(test.url) test.df <- as.data.frame(test.sdmx) head(test.df) ## yeeeeah ``` To sum up, I had to change the URL structure (following EC/Eurostat wiki) and to specify a format key to make it works.
1.0
change in Eurostat API: adjustment of rsdmx request builder and man page? - I've recently decided to try SDMX for my data queries. Thank you for this package. Anyway, seems that Eurostat (recently?) implemented changes in their API. Would it be possible to adjust the request builder and the help pages accordingly? I first tried an example from ?readSDMX: ``` url <- paste("http://ec.europa.eu/eurostat/SDMX/diss-web/rest/data/", "cdh_e_fos/all/?startperiod=2000&endPeriod=2010", sep = "") sdmx <- readSDMX(url) ## Not Found (HTTP 404).Error in readSDMX(url) : HTTP request failed with status: 404 ``` I also tried to use the builtin request builder: ``` test.sdmx <- rsdmx::readSDMX(providerId = "ESTAT", resource = "data", flowRef = "nama_10_gdp", key = list("A", "CP_MEUR", "B1GQ", "BE"), start = 2012, end = 2014 ) ## [rsdmx][INFO] Fetching 'http://ec.europa.eu/eurostat/SDMX/diss-web/rest/data/nama_10_gdp/A.CP_MEUR.B1GQ.BE/all/?startPeriod=2012&endPeriod=2014' ## Not Found (HTTP 404).Error in rsdmx::readSDMX(providerId = "ESTAT", resource = "data", flowRef = "nama_10_gdp", : ## HTTP request failed with status: 404 ``` I then followed the EC/Eurostat wiki (https://wikis.ec.europa.eu/display/EUROSTATHELP/API+SDMX+2.1+-+data+filtering#APISDMX2.1datafiltering-SDMX21FDIM), querying for 'generic data': ``` test.url <- "https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/data/NAMA_10_GDP/A.CP_MEUR.B1GQ.BE+LU" test.sdmx <- rsdmx::readSDMX(test.url) ## Error in structuresXML[[1]] : subscript out of bounds ``` Finally, what works is the following ('structured format'): ``` test.url <- "https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/data/NAMA_10_GDP/A.CP_MEUR.B1GQ.BE+LU?format=SDMX_2.1_STRUCTURED" test.sdmx <- rsdmx::readSDMX(test.url) test.df <- as.data.frame(test.sdmx) head(test.df) ## yeeeeah ``` To sum up, I had to change the URL structure (following EC/Eurostat wiki) and to specify a format key to make it works.
non_priority
change in eurostat api adjustment of rsdmx request builder and man page i ve recently decided to try sdmx for my data queries thank you for this package anyway seems that eurostat recently implemented changes in their api would it be possible to adjust the request builder and the help pages accordingly i first tried an example from readsdmx url paste cdh e fos all startperiod endperiod sep sdmx readsdmx url not found http error in readsdmx url http request failed with status i also tried to use the builtin request builder test sdmx rsdmx readsdmx providerid estat resource data flowref nama gdp key list a cp meur be start end fetching not found http error in rsdmx readsdmx providerid estat resource data flowref nama gdp http request failed with status i then followed the ec eurostat wiki querying for generic data test url test sdmx rsdmx readsdmx test url error in structuresxml subscript out of bounds finally what works is the following structured format test url test sdmx rsdmx readsdmx test url test df as data frame test sdmx head test df yeeeeah to sum up i had to change the url structure following ec eurostat wiki and to specify a format key to make it works
0
5,357
12,397,237,818
IssuesEvent
2020-05-20 22:13:21
Azure/azure-sdk
https://api.github.com/repos/Azure/azure-sdk
opened
Board Review: Event Hubs - Direct Partition Connections
architecture board-review
Thank you for starting the process for approval of the client library for your Azure service. Thorough review of your client library ensures that your APIs are consistent with the guidelines and the consumers of your client library have a consistently good experience when using Azure. ** Before submitting, ensure you adjust the title of the issue appropriately ** To ensure consistency, all Tier-1 languages (C#, TypeScript, Java, Python) will generally be reviewed together. In expansive libraries, we will pair dynamic languages (Python, TypeScript) together, and strongly typed languages (C#, Java) together in separate meetings. ## The Basics * Service team responsible for the client library: Event Hubs * Link to documentation describing the service: https://azure.microsoft.com/en-us/services/event-hubs/ * Contact email (if service team, provide PM and Dev Lead): SeongJoon Kwak, Javier Fernandez & Shubha Vijayasarathy ## About this client library * Name of the client library: azure-messaging-eventhub * Languages for this review: Python, Java, .NET, JS * Link to the service REST APIs: N/A ## Background and alternatives considered ## Artifacts required (per language) We use an API review tool ([apiview](https://apiview.azurewebsites.net)) to support .NET and Java API reviews. For Python and TypeScript, use the API extractor tool, then submit the output as a Draft PR to the relevant repository (azure-sdk-for-python or azure-sdk-for-js). ### .NET * Upload DLL to [apiview](https://apiview.azurewebsites.net). Link: * Link to samples for champion scenarios: ### Java * Upload JAR to [apiview](https://apiview.azurewebsites.net). Link: * Link to samples for champion scenarios: ### Python * Upload the api as a Draft PR. Link to PR: * Link to samples for champion scenarios: ### TypeScript * Upload output of api-extractor as a Draft PR. 
Link to PR: * Link to samples for champion scenarios: ## Champion Scenarios A champion scenario is a use case that the consumer of the client library is commonly expected to perform. Champion scenarios are used to ensure the developer experience is exemplary for the common cases. You need to show the entire code sample (including error handling, as an example) for the champion scenarios. * Champion Scenario 1: * Describe the champion scenario * Estimate the percentage of developers using the service who would use the champion scenario * Link to the code sample _ Repeat for each champion scenario _ Examples of good scenarios are technology agnostic (i.e. the customer can do the same thing in multiple ways), and are expected to be used by > 20% of users: * Upload a file * Update firmware on the device * Recognize faces in an uploaded image Examples of bad scenarios: * Create a client (it's part of a scenario, and we'll see it often enough in true champion scenarios) * Send a batch of events (again, part of the scenario) * Create a page blob (it's not used by enough of the user base)
1.0
Board Review: Event Hubs - Direct Partition Connections - Thank you for starting the process for approval of the client library for your Azure service. Thorough review of your client library ensures that your APIs are consistent with the guidelines and the consumers of your client library have a consistently good experience when using Azure. ** Before submitting, ensure you adjust the title of the issue appropriately ** To ensure consistency, all Tier-1 languages (C#, TypeScript, Java, Python) will generally be reviewed together. In expansive libraries, we will pair dynamic languages (Python, TypeScript) together, and strongly typed languages (C#, Java) together in separate meetings. ## The Basics * Service team responsible for the client library: Event Hubs * Link to documentation describing the service: https://azure.microsoft.com/en-us/services/event-hubs/ * Contact email (if service team, provide PM and Dev Lead): SeongJoon Kwak, Javier Fernandez & Shubha Vijayasarathy ## About this client library * Name of the client library: azure-messaging-eventhub * Languages for this review: Python, Java, .NET, JS * Link to the service REST APIs: N/A ## Background and alternatives considered ## Artifacts required (per language) We use an API review tool ([apiview](https://apiview.azurewebsites.net)) to support .NET and Java API reviews. For Python and TypeScript, use the API extractor tool, then submit the output as a Draft PR to the relevant repository (azure-sdk-for-python or azure-sdk-for-js). ### .NET * Upload DLL to [apiview](https://apiview.azurewebsites.net). Link: * Link to samples for champion scenarios: ### Java * Upload JAR to [apiview](https://apiview.azurewebsites.net). Link: * Link to samples for champion scenarios: ### Python * Upload the api as a Draft PR. Link to PR: * Link to samples for champion scenarios: ### TypeScript * Upload output of api-extractor as a Draft PR. 
Link to PR: * Link to samples for champion scenarios: ## Champion Scenarios A champion scenario is a use case that the consumer of the client library is commonly expected to perform. Champion scenarios are used to ensure the developer experience is exemplary for the common cases. You need to show the entire code sample (including error handling, as an example) for the champion scenarios. * Champion Scenario 1: * Describe the champion scenario * Estimate the percentage of developers using the service who would use the champion scenario * Link to the code sample _ Repeat for each champion scenario _ Examples of good scenarios are technology agnostic (i.e. the customer can do the same thing in multiple ways), and are expected to be used by > 20% of users: * Upload a file * Update firmware on the device * Recognize faces in an uploaded image Examples of bad scenarios: * Create a client (it's part of a scenario, and we'll see it often enough in true champion scenarios) * Send a batch of events (again, part of the scenario) * Create a page blob (it's not used by enough of the user base)
non_priority
board review event hubs direct partition connections thank you for starting the process for approval of the client library for your azure service thorough review of your client library ensures that your apis are consistent with the guidelines and the consumers of your client library have a consistently good experience when using azure before submitting ensure you adjust the title of the issue appropriately to ensure consistency all tier languages c typescript java python will generally be reviewed together in expansive libraries we will pair dynamic languages python typescript together and strongly typed languages c java together in separate meetings the basics service team responsible for the client library event hubs link to documentation describing the service contact email if service team provide pm and dev lead seongjoon kwak javier fernandez shubha vijayasarathy about this client library name of the client library azure messaging eventhub languages for this review python java net js link to the service rest apis n a background and alternatives considered artifacts required per language we use an api review tool to support net and java api reviews for python and typescript use the api extractor tool then submit the output as a draft pr to the relevant repository azure sdk for python or azure sdk for js net upload dll to link link to samples for champion scenarios java upload jar to link link to samples for champion scenarios python upload the api as a draft pr link to pr link to samples for champion scenarios typescript upload output of api extractor as a draft pr link to pr link to samples for champion scenarios champion scenarios a champion scenario is a use case that the consumer of the client library is commonly expected to perform champion scenarios are used to ensure the developer experience is exemplary for the common cases you need to show the entire code sample including error handling as an example for the champion scenarios champion scenario 
describe the champion scenario estimate the percentage of developers using the service who would use the champion scenario link to the code sample repeat for each champion scenario examples of good scenarios are technology agnostic i e the customer can do the same thing in multiple ways and are expected to be used by of users upload a file update firmware on the device recognize faces in an uploaded image examples of bad scenarios create a client it s part of a scenario and we ll see it often enough in true champion scenarios send a batch of events again part of the scenario create a page blob it s not used by enough of the user base
0
47,664
25,128,442,151
IssuesEvent
2022-11-09 13:32:02
elastic/kibana
https://api.github.com/repos/elastic/kibana
closed
[Bug] Missing journey results on performance benchmarking dashboard
bug wg:performance Team:Performance
Our performance benchmarking dashboard shows gaps in data reported from the client. We need to make sure telemetry is reported and collected reliably ![image](https://user-images.githubusercontent.com/3016806/191741555-1d96e634-0e84-4e41-89e5-6759985ddf20.png) The weblogs journey shows only once within the timeframe ![image](https://user-images.githubusercontent.com/3016806/191741792-9c85d4fa-9ec5-41c3-983c-1c9035df68c4.png) We're missing results from a lens journey ![image](https://user-images.githubusercontent.com/3016806/191741903-136fe8de-f4f3-4c88-b362-7bb44ad0e41e.png) We're missing a bunch of server startup events
True
[Bug] Missing journey results on performance benchmarking dashboard - Our performance benchmarking dashboard shows gaps in data reported from the client. We need to make sure telemetry is reported and collected reliably ![image](https://user-images.githubusercontent.com/3016806/191741555-1d96e634-0e84-4e41-89e5-6759985ddf20.png) The weblogs journey shows only once within the timeframe ![image](https://user-images.githubusercontent.com/3016806/191741792-9c85d4fa-9ec5-41c3-983c-1c9035df68c4.png) We're missing results from a lens journey ![image](https://user-images.githubusercontent.com/3016806/191741903-136fe8de-f4f3-4c88-b362-7bb44ad0e41e.png) We're missing a bunch of server startup events
non_priority
missing journey results on performance benchmarking dashboard our performance benchmarking dashboard shows gaps in data reported from the client we need to make sure telemetry is reported and collected reliably the weblogs journey shows only once within the timeframe we re missing results from a lens journey we re missing a bunch of server startup events
0
311,083
23,370,160,269
IssuesEvent
2022-08-10 19:02:07
uf-mil/mil
https://api.github.com/repos/uf-mil/mil
closed
Update README.md
documentation enhancement good first issue
Update the Github README to be prettier and contain more updated, recent information about our project. Some nice additions could include pictures of the bot and a lengthier explanation of how our team works and how to get started.
1.0
Update README.md - Update the Github README to be prettier and contain more updated, recent information about our project. Some nice additions could include pictures of the bot and a lengthier explanation of how our team works and how to get started.
non_priority
update readme md update the github readme to be prettier and contain more updated recent information about our project some nice additions could include pictures of the bot and a lengthier explanation of how our team works and how to get started
0
78,485
27,552,687,428
IssuesEvent
2023-03-07 15:53:57
vector-im/element-web
https://api.github.com/repos/vector-im/element-web
closed
riot-web has a changelog.gz but has only single comment always -
T-Defect T-Task P3 S-Minor
In Debian and debian-derived derivatives there is a custom to provide at the very least two changelogs, a changelog.Debian.gz and changelog.gz . The changelog.gz provides info. about all upstream changes. This is what changelog.gz of riot tells all the time in every release. ``` ─[shirish@debian] - [/usr/share/doc/riot-web] - [10110] └─[$] zless changelog.gz riot-web (0.12.2) whatever; urgency=medium * Package created with FPM. -- support@riot.im Thu, 24 Aug 2017 15:42:03 +0100 changelog.gz (END) ``` wherein it should provide upstream changes from https://github.com/vector-im/riot-web/blob/master/CHANGELOG.md The changelog.Debian.gz is there for changes done in improvement of packaging. An example of the same - ``` ┌─[shirish@debian] - [/usr/share/doc/weechat] - [10124] └─[$] ls changelog.Debian.gz changelog.gz copyright NEWS.Debian.gz ┌─[shirish@debian] - [/usr/share/doc/weechat] - [10125] └─[$] zless changelog.Debian.gz weechat (1.9-1) unstable; urgency=medium * New upstream release * Remove useless weechat-dbg binary package * Bump Standards-Version to 4.0.0 -- Emmanuel Bouthenot <kolter@debian.org> Tue, 04 Jul 2017 13:29:55 +0200 weechat (1.8-1) unstable; urgency=medium * New upstream release * Remove usless patches (backports from upstream) -- Emmanuel Bouthenot <kolter@debian.org> Tue, 16 May 2017 14:53:07 +0200 weechat (1.7-3) unstable; urgency=medium * Add a patch to fix CVE-2017-8073 which allows a remote crash by sending a filename via DCC to the IRC plugin (Closes: #861121) -- Emmanuel Bouthenot <kolter@debian.org> Tue, 25 Apr 2017 10:46:10 +0200 = WeeChat ChangeLog :author: Sébastien Helleu :email: flashcode@flashtux.org :lang: en :toc: left :docinfo1: This document lists all changes for each version (the latest formatted version of this document can be found https://weechat.org/files/changelog/ChangeLog-devel.html[here]). 
For a list of important changes that require manual action, please look at https://weechat.org/files/releasenotes/ReleaseNotes-devel.html[release notes] (file _ReleaseNotes.adoc_ in sources). [[v1.9]] == Version 1.9 (2017-06-25) New features:: * api: allow update of variables "scroll_x" and "scroll_y" in bar_window with function hdata_update * api: add functions config_option_get_string() and hdata_compare() * buflist: add option buflist.look.auto_scroll (issue #332) * buflist: add keys kbd:[F1]/kbd:[F2], kbd:[Alt+F1]/kbd:[Alt+F2] to scroll the buflist bar Improvements:: * core: improve speed of nicklist bar item callback * core: allow index for hdata arrays in evaluation of expressions * buflist: display a warning when the script "buffers.pl" is loaded * buflist: add support of char "~" in option buflist.look.sort for case insensitive comparison * buflist: add variable `${format_name}` in bar item evaluation and option buflist.format.name (issue #1020) * buflist: add variables `${current_buffer}` and `${merged}` (booleans "0" / "1") in bar item evaluation * relay: add option "start" in command /relay * trigger: add "irc_server" and "irc_channel" pointers in data for IRC signal/modifier hooks Bug fixes:: * core: fix bind of keys with space key, like kbd:[Alt+Space] (bug #32133) * core: fix infinite loop when the terminal is closed on the secure password prompt (issue #1010) * buflist: fix long mouse gestures * buflist: fix slow switch of buffer when there are a lot of buffers opened (issue #998) * buflist: fix slow switch of buffer when there are a lot of buffers opened (issue #998) * buflist: add option "bar" in command /buflist, do not automatically add the "buflist" bar when the option buflist.look.enabled is off (issue #994) * buflist: fix crash on drag & drop of buffers * irc: don't reset nick properties (prefixes/away/account/realname) on /names when the nick already exists (issue #1019) * irc: fix memory leak in case of error in "ecdsa-nist256p-challenge" 
SASL mechanism * relay: rebind on address after option relay.network.bind_address is changed * relay: fix parsing of CAP command arguments in irc protocol (issue #995) ``` As can be seen changelog.gz is and can be generated by simply using either `git log` or `git shortlog` and sharing the contents thereafter. In the above irc client, there is also NEWS.Debian.gz which is usually told/shared when something important happens for e.g. internal changes to how the client saves data is changed which makes it incompatible with early versions of the same client. Usually there is a version which provides the transition path though. Hoping to see this happen soonish so there is more sensible changelogs.
1.0
riot-web has a changelog.gz but has only single comment always - - In Debian and debian-derived derivatives there is a custom to provide at the very least two changelogs, a changelog.Debian.gz and changelog.gz . The changelog.gz provides info. about all upstream changes. This is what changelog.gz of riot tells all the time in every release. ``` ─[shirish@debian] - [/usr/share/doc/riot-web] - [10110] └─[$] zless changelog.gz riot-web (0.12.2) whatever; urgency=medium * Package created with FPM. -- support@riot.im Thu, 24 Aug 2017 15:42:03 +0100 changelog.gz (END) ``` wherein it should provide upstream changes from https://github.com/vector-im/riot-web/blob/master/CHANGELOG.md The changelog.Debian.gz is there for changes done in improvement of packaging. An example of the same - ``` ┌─[shirish@debian] - [/usr/share/doc/weechat] - [10124] └─[$] ls changelog.Debian.gz changelog.gz copyright NEWS.Debian.gz ┌─[shirish@debian] - [/usr/share/doc/weechat] - [10125] └─[$] zless changelog.Debian.gz weechat (1.9-1) unstable; urgency=medium * New upstream release * Remove useless weechat-dbg binary package * Bump Standards-Version to 4.0.0 -- Emmanuel Bouthenot <kolter@debian.org> Tue, 04 Jul 2017 13:29:55 +0200 weechat (1.8-1) unstable; urgency=medium * New upstream release * Remove usless patches (backports from upstream) -- Emmanuel Bouthenot <kolter@debian.org> Tue, 16 May 2017 14:53:07 +0200 weechat (1.7-3) unstable; urgency=medium * Add a patch to fix CVE-2017-8073 which allows a remote crash by sending a filename via DCC to the IRC plugin (Closes: #861121) -- Emmanuel Bouthenot <kolter@debian.org> Tue, 25 Apr 2017 10:46:10 +0200 = WeeChat ChangeLog :author: Sébastien Helleu :email: flashcode@flashtux.org :lang: en :toc: left :docinfo1: This document lists all changes for each version (the latest formatted version of this document can be found https://weechat.org/files/changelog/ChangeLog-devel.html[here]). 
For a list of important changes that require manual action, please look at https://weechat.org/files/releasenotes/ReleaseNotes-devel.html[release notes] (file _ReleaseNotes.adoc_ in sources). [[v1.9]] == Version 1.9 (2017-06-25) New features:: * api: allow update of variables "scroll_x" and "scroll_y" in bar_window with function hdata_update * api: add functions config_option_get_string() and hdata_compare() * buflist: add option buflist.look.auto_scroll (issue #332) * buflist: add keys kbd:[F1]/kbd:[F2], kbd:[Alt+F1]/kbd:[Alt+F2] to scroll the buflist bar Improvements:: * core: improve speed of nicklist bar item callback * core: allow index for hdata arrays in evaluation of expressions * buflist: display a warning when the script "buffers.pl" is loaded * buflist: add support of char "~" in option buflist.look.sort for case insensitive comparison * buflist: add variable `${format_name}` in bar item evaluation and option buflist.format.name (issue #1020) * buflist: add variables `${current_buffer}` and `${merged}` (booleans "0" / "1") in bar item evaluation * relay: add option "start" in command /relay * trigger: add "irc_server" and "irc_channel" pointers in data for IRC signal/modifier hooks Bug fixes:: * core: fix bind of keys with space key, like kbd:[Alt+Space] (bug #32133) * core: fix infinite loop when the terminal is closed on the secure password prompt (issue #1010) * buflist: fix long mouse gestures * buflist: fix slow switch of buffer when there are a lot of buffers opened (issue #998) * buflist: fix slow switch of buffer when there are a lot of buffers opened (issue #998) * buflist: add option "bar" in command /buflist, do not automatically add the "buflist" bar when the option buflist.look.enabled is off (issue #994) * buflist: fix crash on drag & drop of buffers * irc: don't reset nick properties (prefixes/away/account/realname) on /names when the nick already exists (issue #1019) * irc: fix memory leak in case of error in "ecdsa-nist256p-challenge" 
SASL mechanism * relay: rebind on address after option relay.network.bind_address is changed * relay: fix parsing of CAP command arguments in irc protocol (issue #995) ``` As can be seen changelog.gz is and can be generated by simply using either `git log` or `git shortlog` and sharing the contents thereafter. In the above irc client, there is also NEWS.Debian.gz which is usually told/shared when something important happens for e.g. internal changes to how the client saves data is changed which makes it incompatible with early versions of the same client. Usually there is a version which provides the transition path though. Hoping to see this happen soonish so there is more sensible changelogs.
non_priority
riot web has a changelog gz but has only single comment always in debian and debian derived derivatives there is a custom to provide at the very least two changelogs a changelog debian gz and changelog gz the changelog gz provides info about all upstream changes this is what changelog gz of riot tells all the time in every release ─ └─ zless changelog gz riot web whatever urgency medium package created with fpm support riot im thu aug changelog gz end wherein it should provide upstream changes from the changelog debian gz is there for changes done in improvement of packaging an example of the same ┌─ └─ ls changelog debian gz changelog gz copyright news debian gz ┌─ └─ zless changelog debian gz weechat unstable urgency medium new upstream release remove useless weechat dbg binary package bump standards version to emmanuel bouthenot tue jul weechat unstable urgency medium new upstream release remove usless patches backports from upstream emmanuel bouthenot tue may weechat unstable urgency medium add a patch to fix cve which allows a remote crash by sending a filename via dcc to the irc plugin closes emmanuel bouthenot tue apr weechat changelog author sébastien helleu email flashcode flashtux org lang en toc left this document lists all changes for each version the latest formatted version of this document can be found for a list of important changes that require manual action please look at file releasenotes adoc in sources version new features api allow update of variables scroll x and scroll y in bar window with function hdata update api add functions config option get string and hdata compare buflist add option buflist look auto scroll issue buflist add keys kbd kbd kbd kbd to scroll the buflist bar improvements core improve speed of nicklist bar item callback core allow index for hdata arrays in evaluation of expressions buflist display a warning when the script buffers pl is loaded buflist add support of char in option buflist look sort for case insensitive 
comparison buflist add variable format name in bar item evaluation and option buflist format name issue buflist add variables current buffer and merged booleans in bar item evaluation relay add option start in command relay trigger add irc server and irc channel pointers in data for irc signal modifier hooks bug fixes core fix bind of keys with space key like kbd bug core fix infinite loop when the terminal is closed on the secure password prompt issue buflist fix long mouse gestures buflist fix slow switch of buffer when there are a lot of buffers opened issue buflist fix slow switch of buffer when there are a lot of buffers opened issue buflist add option bar in command buflist do not automatically add the buflist bar when the option buflist look enabled is off issue buflist fix crash on drag drop of buffers irc don t reset nick properties prefixes away account realname on names when the nick already exists issue irc fix memory leak in case of error in ecdsa challenge sasl mechanism relay rebind on address after option relay network bind address is changed relay fix parsing of cap command arguments in irc protocol issue as can be seen changelog gz is and can be generated by simply using either git log or git shortlog and sharing the contents thereafter in the above irc client there is also news debian gz which is usually told shared when something important happens for e g internal changes to how the client saves data is changed which makes it incompatible with early versions of the same client usually there is a version which provides the transition path though hoping to see this happen soonish so there is more sensible changelogs
0
6,737
4,519,786,590
IssuesEvent
2016-09-06 07:59:09
KazDragon/telnetpp
https://api.github.com/repos/KazDragon/telnetpp
closed
Add install(EXPORT) to CMakeLists.txt
Improvement in progress Usability
According to the documentation, this generates a CMakeLists.txt file in the specified location, allowing it to be included as a complete project from other solutions, resulting in a more composeable project.
True
Add install(EXPORT) to CMakeLists.txt - According to the documentation, this generates a CMakeLists.txt file in the specified location, allowing it to be included as a complete project from other solutions, resulting in a more composeable project.
non_priority
add install export to cmakelists txt according to the documentation this generates a cmakelists txt file in the specified location allowing it to be included as a complete project from other solutions resulting in a more composeable project
0
209,274
23,705,989,132
IssuesEvent
2022-08-30 01:11:35
nidhi7598/linux-4.19.72
https://api.github.com/repos/nidhi7598/linux-4.19.72
reopened
CVE-2022-0480 (Medium) detected in linuxlinux-4.19.254
security vulnerability
## CVE-2022-0480 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.254</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in HEAD commit: <a href="https://github.com/nidhi7598/linux-4.19.72/commit/10a8c99e4f60044163c159867bc6f5452c1c36e5">10a8c99e4f60044163c159867bc6f5452c1c36e5</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A flaw was found in the filelock_init in fs/locks.c function in the Linux kernel. This issue can lead to host memory exhaustion due to memcg not limiting the number of Portable Operating System Interface (POSIX) file locks. <p>Publish Date: 2022-08-29 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0480>CVE-2022-0480</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.2</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2022-0480">https://www.linuxkernelcves.com/cves/CVE-2022-0480</a></p> <p>Release Date: 2022-02-03</p> <p>Fix Resolution: v5.15-rc1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-0480 (Medium) detected in linuxlinux-4.19.254 - ## CVE-2022-0480 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.254</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in HEAD commit: <a href="https://github.com/nidhi7598/linux-4.19.72/commit/10a8c99e4f60044163c159867bc6f5452c1c36e5">10a8c99e4f60044163c159867bc6f5452c1c36e5</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A flaw was found in the filelock_init in fs/locks.c function in the Linux kernel. This issue can lead to host memory exhaustion due to memcg not limiting the number of Portable Operating System Interface (POSIX) file locks. 
<p>Publish Date: 2022-08-29 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0480>CVE-2022-0480</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.2</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2022-0480">https://www.linuxkernelcves.com/cves/CVE-2022-0480</a></p> <p>Release Date: 2022-02-03</p> <p>Fix Resolution: v5.15-rc1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve medium detected in linuxlinux cve medium severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch master vulnerable source files vulnerability details a flaw was found in the filelock init in fs locks c function in the linux kernel this issue can lead to host memory exhaustion due to memcg not limiting the number of portable operating system interface posix file locks publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
0
24,992
6,618,264,227
IssuesEvent
2017-09-21 07:23:23
RemcoTjuna/CodeValidator
https://api.github.com/repos/RemcoTjuna/CodeValidator
opened
Ik wil als administrator codes kunnen verwijderen
code feature javascript laravel
Als administrator moet je ook codes kunnen verwijderen. Dit moet simpel gebeuren door een lijstje waar alle codes instaan, hiernaast zal dan een delete button staan waarmee je de code kunt verwijderen.
1.0
Ik wil als administrator codes kunnen verwijderen - Als administrator moet je ook codes kunnen verwijderen. Dit moet simpel gebeuren door een lijstje waar alle codes instaan, hiernaast zal dan een delete button staan waarmee je de code kunt verwijderen.
non_priority
ik wil als administrator codes kunnen verwijderen als administrator moet je ook codes kunnen verwijderen dit moet simpel gebeuren door een lijstje waar alle codes instaan hiernaast zal dan een delete button staan waarmee je de code kunt verwijderen
0
39,869
9,702,622,948
IssuesEvent
2019-05-27 09:13:05
hazelcast/hazelcast-jet
https://api.github.com/repos/hazelcast/hazelcast-jet
closed
CLI throws an error if download path includes spaces
cli defect
``` emindemirci  ~/Downloads/hazelcast-jet-3.0 2   ./bin/jet-start.sh ######################################## # JAVA=/Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/bin/java # JAVA_OPTS= -Dhazelcast.config=/Users/emindemirci/Downloads/hazelcast-jet-3.0 2/config/hazelcast.xml -Djet.home=/Users/emindemirci/Downloads/hazelcast-jet-3.0 2 -Dhazelcast.client.config=/Users/emindemirci/Downloads/hazelcast-jet-3.0 2/config/hazelcast-client.xml -Dhazelcast.jet.config=/Users/emindemirci/Downloads/hazelcast-jet-3.0 2/config/hazelcast-jet.xml # CLASSPATH=/Users/emindemirci/Downloads/hazelcast-jet-3.0 2/lib/hazelcast-jet-3.0.jar: # starting now.... ######################################## Error: Could not find or load main class 2.config.hazelcast.xml Caused by: java.lang.ClassNotFoundException: 2.config.hazelcast.xml 1  emindemirci  ~/Downloads/hazelcast-jet-3.0 2  ```
1.0
CLI throws an error if download path includes spaces - ``` emindemirci  ~/Downloads/hazelcast-jet-3.0 2   ./bin/jet-start.sh ######################################## # JAVA=/Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/bin/java # JAVA_OPTS= -Dhazelcast.config=/Users/emindemirci/Downloads/hazelcast-jet-3.0 2/config/hazelcast.xml -Djet.home=/Users/emindemirci/Downloads/hazelcast-jet-3.0 2 -Dhazelcast.client.config=/Users/emindemirci/Downloads/hazelcast-jet-3.0 2/config/hazelcast-client.xml -Dhazelcast.jet.config=/Users/emindemirci/Downloads/hazelcast-jet-3.0 2/config/hazelcast-jet.xml # CLASSPATH=/Users/emindemirci/Downloads/hazelcast-jet-3.0 2/lib/hazelcast-jet-3.0.jar: # starting now.... ######################################## Error: Could not find or load main class 2.config.hazelcast.xml Caused by: java.lang.ClassNotFoundException: 2.config.hazelcast.xml 1  emindemirci  ~/Downloads/hazelcast-jet-3.0 2  ```
non_priority
cli throws an error if download path includes spaces emindemirci  downloads hazelcast jet   bin jet start sh java library java javavirtualmachines jdk jdk contents home bin java java opts dhazelcast config users emindemirci downloads hazelcast jet config hazelcast xml djet home users emindemirci downloads hazelcast jet dhazelcast client config users emindemirci downloads hazelcast jet config hazelcast client xml dhazelcast jet config users emindemirci downloads hazelcast jet config hazelcast jet xml classpath users emindemirci downloads hazelcast jet lib hazelcast jet jar starting now error could not find or load main class config hazelcast xml caused by java lang classnotfoundexception config hazelcast xml  emindemirci  downloads hazelcast jet 
0
266,000
23,215,460,056
IssuesEvent
2022-08-02 13:45:52
ComplianceAsCode/content
https://api.github.com/repos/ComplianceAsCode/content
closed
SSGTS variables metadata cannot have comma as a separator
Test Suite
#### Description of problem: In cases that XCCDF variable contains commas it is impossible to define `variables` test scenario metadata. Example can be the `sshd_use_approved_ciphers` rule with variable `sshd_approved_ciphers` where the default is `default: aes128-ctr,aes192-ctr,aes256-ctr,aes128-cbc,3des-cbc,aes192-cbc,aes256-cbc,rijndael-cbc@lysator.liu.se`. We need to update a separator to be something different, preferably a character which is not used in our XCCDF variables.
1.0
SSGTS variables metadata cannot have comma as a separator - #### Description of problem: In cases that XCCDF variable contains commas it is impossible to define `variables` test scenario metadata. Example can be the `sshd_use_approved_ciphers` rule with variable `sshd_approved_ciphers` where the default is `default: aes128-ctr,aes192-ctr,aes256-ctr,aes128-cbc,3des-cbc,aes192-cbc,aes256-cbc,rijndael-cbc@lysator.liu.se`. We need to update a separator to be something different, preferably a character which is not used in our XCCDF variables.
non_priority
ssgts variables metadata cannot have comma as a separator description of problem in cases that xccdf variable contains commas it is impossible to define variables test scenario metadata example can be the sshd use approved ciphers rule with variable sshd approved ciphers where the default is default ctr ctr ctr cbc cbc cbc cbc rijndael cbc lysator liu se we need to update a separator to be something different preferably a character which is not used in our xccdf variables
0
414,204
27,981,036,024
IssuesEvent
2023-03-26 06:13:47
egoughnour/ham-whisperer
https://api.github.com/repos/egoughnour/ham-whisperer
opened
Set up documentation generation
documentation good first issue
- [ ] validate the documentation of at least one function can be generated via normal mechanisms.
1.0
Set up documentation generation - - [ ] validate the documentation of at least one function can be generated via normal mechanisms.
non_priority
set up documentation generation validate the documentation of at least one function can be generated via normal mechanisms
0
157,400
24,666,783,187
IssuesEvent
2022-10-18 10:54:17
baloise-incubator/design-system
https://api.github.com/repos/baloise-incubator/design-system
opened
[bal-navigation | all components] transfer data-attributes to hydrated markup
🆕 Enhancement 🧑‍💻 Technical Design System 🌐 IBM IX
### Summary As we talked about in a meeting, we need the possibility to get data-attributes(for tracking parmeters) from source to hydrated markup in the navigation or a generic approach for all (relevant) components to do that. ### Justification Needed for tracking parameters in the navigation, later on for other components, too. ### Desired UX and success metrics Successful tracking of the navigation ### Specific timeline issues / requests Tracking team needs it ASAP, but needs to be decided on your side in the end
1.0
[bal-navigation | all components] transfer data-attributes to hydrated markup - ### Summary As we talked about in a meeting, we need the possibility to get data-attributes(for tracking parmeters) from source to hydrated markup in the navigation or a generic approach for all (relevant) components to do that. ### Justification Needed for tracking parameters in the navigation, later on for other components, too. ### Desired UX and success metrics Successful tracking of the navigation ### Specific timeline issues / requests Tracking team needs it ASAP, but needs to be decided on your side in the end
non_priority
transfer data attributes to hydrated markup summary as we talked about in a meeting we need the possibility to get data attributes for tracking parmeters from source to hydrated markup in the navigation or a generic approach for all relevant components to do that justification needed for tracking parameters in the navigation later on for other components too desired ux and success metrics successful tracking of the navigation specific timeline issues requests tracking team needs it asap but needs to be decided on your side in the end
0
2,991
3,054,669,379
IssuesEvent
2015-08-13 05:31:22
servo/servo
https://api.github.com/repos/servo/servo
closed
Items in scrollable layers can be clipped incorrectly.
A-layout/tree-builder
The code that clips items is here: https://github.com/servo/servo/blob/master/components/layout/display_list_builder.rs#L925 An example of what happens: * A scroll layer has height 2000, with visible height of 768. * The first tile paints at 512x512 correctly. * The second tile paints at 512x512 and correctly clips items in the code above that are outside the 768 pixel enclosing rect. * When you scroll, the compositor doesn't ask for the 2nd tile to be repainted, as it hasn't changed. This means that items clipped during the first render never get added to the display list as you scroll. We could potentially force a repaint, but isn't this what the display port code should be handling? Perhaps we should just remove the clipping above and rely on the display port clipping? cc @pcwalton
1.0
Items in scrollable layers can be clipped incorrectly. - The code that clips items is here: https://github.com/servo/servo/blob/master/components/layout/display_list_builder.rs#L925 An example of what happens: * A scroll layer has height 2000, with visible height of 768. * The first tile paints at 512x512 correctly. * The second tile paints at 512x512 and correctly clips items in the code above that are outside the 768 pixel enclosing rect. * When you scroll, the compositor doesn't ask for the 2nd tile to be repainted, as it hasn't changed. This means that items clipped during the first render never get added to the display list as you scroll. We could potentially force a repaint, but isn't this what the display port code should be handling? Perhaps we should just remove the clipping above and rely on the display port clipping? cc @pcwalton
non_priority
items in scrollable layers can be clipped incorrectly the code that clips items is here an example of what happens a scroll layer has height with visible height of the first tile paints at correctly the second tile paints at and correctly clips items in the code above that are outside the pixel enclosing rect when you scroll the compositor doesn t ask for the tile to be repainted as it hasn t changed this means that items clipped during the first render never get added to the display list as you scroll we could potentially force a repaint but isn t this what the display port code should be handling perhaps we should just remove the clipping above and rely on the display port clipping cc pcwalton
0
130,994
12,467,433,930
IssuesEvent
2020-05-28 17:03:54
koreader/koreader
https://api.github.com/repos/koreader/koreader
closed
Writing a Plugin for KOReader
Plugin documentation question
Happy to know that someone has written another reader for kindles. I was thinking about writing a DSL (Domain Specific Language) on top of highlight-note feature of kindle readers to convert several highlights into Q/A of a card (For a Spaced-repetition system). Sadly couldn't find any guide on writing plugins for KOReader. So i have to ask it here. ## Scenario A user press and hold on an image within a book. (Let it be in EPUB format which i think is the easiest one to start with). Some menu appears with a button named **Add metadata**. When the button is clicked another window appears with the following contents and a button named **Append to clippings**: ````yaml img: path: /path/to/image/inside/epub # user can add more meta here ```` After user hits **Append to clippings** button, i want the window contents to be appended to the `My clippings.txt` or its alternative in KOReader environment. Please guide me in the right directon.
1.0
Writing a Plugin for KOReader - Happy to know that someone has written another reader for kindles. I was thinking about writing a DSL (Domain Specific Language) on top of highlight-note feature of kindle readers to convert several highlights into Q/A of a card (For a Spaced-repetition system). Sadly couldn't find any guide on writing plugins for KOReader. So i have to ask it here. ## Scenario A user press and hold on an image within a book. (Let it be in EPUB format which i think is the easiest one to start with). Some menu appears with a button named **Add metadata**. When the button is clicked another window appears with the following contents and a button named **Append to clippings**: ````yaml img: path: /path/to/image/inside/epub # user can add more meta here ```` After user hits **Append to clippings** button, i want the window contents to be appended to the `My clippings.txt` or its alternative in KOReader environment. Please guide me in the right directon.
non_priority
writing a plugin for koreader happy to know that someone has written another reader for kindles i was thinking about writing a dsl domain specific language on top of highlight note feature of kindle readers to convert several highlights into q a of a card for a spaced repetition system sadly couldn t find any guide on writing plugins for koreader so i have to ask it here scenario a user press and hold on an image within a book let it be in epub format which i think is the easiest one to start with some menu appears with a button named add metadata when the button is clicked another window appears with the following contents and a button named append to clippings yaml img path path to image inside epub user can add more meta here after user hits append to clippings button i want the window contents to be appended to the my clippings txt or its alternative in koreader environment please guide me in the right directon
0
52,963
10,964,921,895
IssuesEvent
2019-11-28 00:35:32
jdalzatec/llg
https://api.github.com/repos/jdalzatec/llg
closed
Give the option to compress the HDF files.
python-code
Implement the package of python Click, with the option [is_flag](https://click.palletsprojects.com/en/7.x/options/).
1.0
Give the option to compress the HDF files. - Implement the package of python Click, with the option [is_flag](https://click.palletsprojects.com/en/7.x/options/).
non_priority
give the option to compress the hdf files implement the package of python click with the option
0
223,022
24,711,614,221
IssuesEvent
2022-10-20 01:33:45
Kijacode/Node_Report
https://api.github.com/repos/Kijacode/Node_Report
closed
WS-2020-0070 (High) detected in lodash-4.17.15.tgz - autoclosed
security vulnerability
## WS-2020-0070 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.15.tgz</b></p></summary> <p>Lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/Node_Report/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/Node_Report/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - cloudinary-1.22.0.tgz (Root Library) - :x: **lodash-4.17.15.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Kijacode/Node_Report/commit/4b68dc6a8fc807f80ba91fd55e54438e1ed22f69">4b68dc6a8fc807f80ba91fd55e54438e1ed22f69</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> All versions of lodash are vulnerable to Prototype Pollution. The function zipObjectDeep allows a malicious user to modify the prototype of Object if the property identifiers are user-supplied. Being affected by this issue requires zipping objects based on user-provided property arrays. This vulnerability may lead to Denial of Service or Code Execution. 
<p>Publish Date: 2020-04-28 <p>URL: <a href=https://hackerone.com/reports/712065>WS-2020-0070</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2020-0070 (High) detected in lodash-4.17.15.tgz - autoclosed - ## WS-2020-0070 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.15.tgz</b></p></summary> <p>Lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/Node_Report/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/Node_Report/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - cloudinary-1.22.0.tgz (Root Library) - :x: **lodash-4.17.15.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Kijacode/Node_Report/commit/4b68dc6a8fc807f80ba91fd55e54438e1ed22f69">4b68dc6a8fc807f80ba91fd55e54438e1ed22f69</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> All versions of lodash are vulnerable to Prototype Pollution. The function zipObjectDeep allows a malicious user to modify the prototype of Object if the property identifiers are user-supplied. Being affected by this issue requires zipping objects based on user-provided property arrays. This vulnerability may lead to Denial of Service or Code Execution. 
<p>Publish Date: 2020-04-28 <p>URL: <a href=https://hackerone.com/reports/712065>WS-2020-0070</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
ws high detected in lodash tgz autoclosed ws high severity vulnerability vulnerable library lodash tgz lodash modular utilities library home page a href path to dependency file tmp ws scm node report package json path to vulnerable library tmp ws scm node report node modules lodash package json dependency hierarchy cloudinary tgz root library x lodash tgz vulnerable library found in head commit a href vulnerability details all versions of lodash are vulnerable to prototype pollution the function zipobjectdeep allows a malicious user to modify the prototype of object if the property identifiers are user supplied being affected by this issue requires zipping objects based on user provided property arrays this vulnerability may lead to denial of service or code execution publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href step up your open source security game with whitesource
0
12,621
3,283,235,323
IssuesEvent
2015-10-28 11:32:42
metafizzy/isotope
https://api.github.com/repos/metafizzy/isotope
closed
want to set the grid item in %age but I,m unable to do so
test case required
Hy Sir, I am using isotope layout complete event and I am unable to use the height and width of the grid item in %age. When I set the height and width of the grid-item in %age it does not work for me. Images not showing and also event layout complete not worked for me. Kindly guide me ASAP. Regards Abdullah Presstiger
1.0
want to set the grid item in %age but I,m unable to do so - Hy Sir, I am using isotope layout complete event and I am unable to use the height and width of the grid item in %age. When I set the height and width of the grid-item in %age it does not work for me. Images not showing and also event layout complete not worked for me. Kindly guide me ASAP. Regards Abdullah Presstiger
non_priority
want to set the grid item in age but i m unable to do so hy sir i am using isotope layout complete event and i am unable to use the height and width of the grid item in age when i set the height and width of the grid item in age it does not work for me images not showing and also event layout complete not worked for me kindly guide me asap regards abdullah presstiger
0
295,579
22,242,371,215
IssuesEvent
2022-06-09 06:59:59
znuny/Znuny
https://api.github.com/repos/znuny/Znuny
closed
Suggestion: Consider not referring to $HttpType in OAuth2Token.pm's GetAuthorizationCodeRequestRedirectURL subroutine
2 - Documentation
The subroutine [GetAuthorizationCodeRequestRedirectURL in OAuth2Token.pm](https://github.com/znuny/Znuny/blob/bbbf5a0df0d1927c744d661639e767c6868f784e/Kernel/System/OAuth2Token.pm#L238) relies on the Frontend::Base::HttpType setting, but the Redirect URIs will always begin with "https" regardless of end-user configuration; [Microsoft](https://docs.microsoft.com/en-us/azure/active-directory/develop/reply-url) and [Google](https://developers.google.com/identity/protocols/oauth2/web-server#uri-validation) require that (non-local) URIs begin with "https" when configuring the URIs on their end. I believe "http" is the default HttpType setting, so as its written now this sub will generate the redirect URIs incorrectly and then OAuth2 configuration fails with an error. In Microsoft's OAuth2, this is error code 50011: _The {redirectTerm} '{replyAddress}' specified in the request does not match the {redirectTerm}s configured for the application '{identifier}'._ redirectTerm will have a leading http:// , but https:// is needed. It might make sense to disregard the HttpType setting in this submodule so that the URI generated is always correct. We were able to find our mistake eventually, and update our HttpType to be "https" so the OAuth2 integration works, but it took us a while, and it seems like they should be decoupled in this particular case. I'm not sure if changing that setting leads to any unintended consequences.
1.0
Suggestion: Consider not referring to $HttpType in OAuth2Token.pm's GetAuthorizationCodeRequestRedirectURL subroutine - The subroutine [GetAuthorizationCodeRequestRedirectURL in OAuth2Token.pm](https://github.com/znuny/Znuny/blob/bbbf5a0df0d1927c744d661639e767c6868f784e/Kernel/System/OAuth2Token.pm#L238) relies on the Frontend::Base::HttpType setting, but the Redirect URIs will always begin with "https" regardless of end-user configuration; [Microsoft](https://docs.microsoft.com/en-us/azure/active-directory/develop/reply-url) and [Google](https://developers.google.com/identity/protocols/oauth2/web-server#uri-validation) require that (non-local) URIs begin with "https" when configuring the URIs on their end. I believe "http" is the default HttpType setting, so as its written now this sub will generate the redirect URIs incorrectly and then OAuth2 configuration fails with an error. In Microsoft's OAuth2, this is error code 50011: _The {redirectTerm} '{replyAddress}' specified in the request does not match the {redirectTerm}s configured for the application '{identifier}'._ redirectTerm will have a leading http:// , but https:// is needed. It might make sense to disregard the HttpType setting in this submodule so that the URI generated is always correct. We were able to find our mistake eventually, and update our HttpType to be "https" so the OAuth2 integration works, but it took us a while, and it seems like they should be decoupled in this particular case. I'm not sure if changing that setting leads to any unintended consequences.
non_priority
suggestion consider not referring to httptype in pm s getauthorizationcoderequestredirecturl subroutine the subroutine relies on the frontend base httptype setting but the redirect uris will always begin with https regardless of end user configuration and require that non local uris begin with https when configuring the uris on their end i believe http is the default httptype setting so as its written now this sub will generate the redirect uris incorrectly and then configuration fails with an error in microsoft s this is error code the redirectterm replyaddress specified in the request does not match the redirectterm s configured for the application identifier redirectterm will have a leading http but https is needed it might make sense to disregard the httptype setting in this submodule so that the uri generated is always correct we were able to find our mistake eventually and update our httptype to be https so the integration works but it took us a while and it seems like they should be decoupled in this particular case i m not sure if changing that setting leads to any unintended consequences
0
99,335
16,445,974,930
IssuesEvent
2021-05-20 19:38:08
tuanducteam/service.tuanducdesign.com
https://api.github.com/repos/tuanducteam/service.tuanducdesign.com
closed
CVE-2019-6286 (Medium) detected in node-sass-4.14.1.tgz, opennmsopennms-source-26.0.0-1
security vulnerability wontfix
## CVE-2019-6286 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.14.1.tgz</b>, <b>opennmsopennms-source-26.0.0-1</b></p></summary> <p> <details><summary><b>node-sass-4.14.1.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz</a></p> <p>Path to dependency file: service.tuanducdesign.com/package.json</p> <p>Path to vulnerable library: service.tuanducdesign.com/node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - :x: **node-sass-4.14.1.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://api.github.com/repos/tuanducteam/service.tuanducdesign.com/commits/bcda4c49653f0f100ba550797d8a0f0bf9c62ba3">bcda4c49653f0f100ba550797d8a0f0bf9c62ba3</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::skip_over_scopes in prelexer.hpp when called from Sass::Parser::parse_import(), a similar issue to CVE-2018-11693. 
<p>Publish Date: 2019-01-14 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-6286>CVE-2019-6286</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6286">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6286</a></p> <p>Release Date: 2019-08-06</p> <p>Fix Resolution: LibSass - 3.6.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-6286 (Medium) detected in node-sass-4.14.1.tgz, opennmsopennms-source-26.0.0-1 - ## CVE-2019-6286 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.14.1.tgz</b>, <b>opennmsopennms-source-26.0.0-1</b></p></summary> <p> <details><summary><b>node-sass-4.14.1.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz</a></p> <p>Path to dependency file: service.tuanducdesign.com/package.json</p> <p>Path to vulnerable library: service.tuanducdesign.com/node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - :x: **node-sass-4.14.1.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://api.github.com/repos/tuanducteam/service.tuanducdesign.com/commits/bcda4c49653f0f100ba550797d8a0f0bf9c62ba3">bcda4c49653f0f100ba550797d8a0f0bf9c62ba3</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::skip_over_scopes in prelexer.hpp when called from Sass::Parser::parse_import(), a similar issue to CVE-2018-11693. 
<p>Publish Date: 2019-01-14 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-6286>CVE-2019-6286</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6286">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6286</a></p> <p>Release Date: 2019-08-06</p> <p>Fix Resolution: LibSass - 3.6.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve medium detected in node sass tgz opennmsopennms source cve medium severity vulnerability vulnerable libraries node sass tgz opennmsopennms source node sass tgz wrapper around libsass library home page a href path to dependency file service tuanducdesign com package json path to vulnerable library service tuanducdesign com node modules node sass package json dependency hierarchy x node sass tgz vulnerable library found in head commit a href found in base branch master vulnerability details in libsass a heap based buffer over read exists in sass prelexer skip over scopes in prelexer hpp when called from sass parser parse import a similar issue to cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution libsass step up your open source security game with whitesource
0
81,556
31,018,920,406
IssuesEvent
2023-08-10 02:29:59
openzfs/zfs
https://api.github.com/repos/openzfs/zfs
closed
Data corruption since generic_file_splice_read -> filemap_splice_read change (6.5 compat, but occurs on 6.4 too)
Type: Defect
### System information Type | Version/Name --- | --- Distribution Name | Arch Distribution Version | Rolling release Kernel Version | 6.4.8, 6.5rc1/2/3/4 Architecture | x86-64 OpenZFS Version | [commit 36261c8](https://github.com/openzfs/zfs/commit/36261c8238df462b214854ccea1df4f060cf0995) ### Describe the problem you're observing After the recent changes to get OpenZFS compiling/running on 6.5, there appears to be a possible lingering data corruption bug. In the repeatable example below, it reliably inserts a long run of NULL bytes into a file, causing a build to fail (conveniently, the build of ZFS). My expectation is that the bug probably exists for any kernel where `filemap_splice_read` exists, which recently has replaced `generic_file_splice_read` in other Linux filesystem code. ### Describe how to reproduce the problem **Again - despite demonstrating the problem with the OpenZFS build, the problem only manifests itself when running on the ZFS branch at the commit listed above. It just so happens that I'm able to use our build to reproduce the bug.** 1. You need to be running ZFS patched up to the commit listed above. I have reproduced this on Kernel 6.4.8 and all 6.5 RC's up to rc4 2. `git clone https://github.com/openzfs/zfs.git` 3. `cd ./zfs` 4. `./autogen.sh` 5. `mkdir -p ../zfs-test` 6. `cd ../zfs-test` 7. `../zfs/configure --with-linux=/usr/src/linux` (or wherever your headers/source tree is) Eventually, the `configure` will fail with the following message: ``` configure: error: *** This kernel does not include the required loadable module *** support! *** *** To build OpenZFS as a loadable Linux kernel module *** enable loadable module support by setting *** `CONFIG_MODULES=y` in the kernel configuration and run *** `make modules_prepare` in the Linux source tree. *** *** If you don't intend to enable loadable kernel module *** support, please compile OpenZFS as a Linux kernel built-in. 
*** *** Prepare the Linux source tree by running `make prepare`, *** use the OpenZFS `--enable-linux-builtin` configure option, *** copy the OpenZFS sources into the Linux source tree using *** `./copy-builtin <linux source directory>`, *** set `CONFIG_ZFS=y` in the kernel configuration and compile *** kernel as usual. ``` I enter the directory of the failing test: ``` cd build/config_modules ``` Looking at the `config_modules.c` file, which is resulting in the failure: ```c /* confdefs.h */ #define PACKAGE_NAME "zfs" #define PACKAGE_TARNAME "zfs" #define PACKAGE_VERSION "2.2.99" #define PACKAGE_STRING "zfs 2.2.99" #define PACKAGE_BUGREPORT "" #define PACKAGE_URL "" #define ZFS_META_NAME "zfs" #define ZFS_META_VERSION "2.2.99" #define SPL_META_VERSION ZFS_META_VERSION #define ZFS_META_RELEASE "1" #define SPL_META_RELEASE ZFS_META_RELEASE #define ZFS_META_LICENSE "CDDL" #define ZFS_META_ALIAS "zfs-2.2.99-1" #define SPL_META_ALIAS ZFS_META_ALIAS #define ZFS_META_AUTHOR "OpenZFS" #define ZFS_META_KVER_MIN "3.10" #define ZFS_META_KVER_MAX "6.4" #define PACKAGE "zfs" #define VERSION "2.2.99" ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ 
^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ ^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ 
^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@ #include <linux/module.h> #if !defined(CONFIG_MODULES) #error CONFIG_MODULES not defined #endif int main (void) { ; return 0; } ``` Curiously, this bug does not manifest frequently and the system largely appears to run stable in many other use cases. The section of `configure` that writes that file looks like below. Including this snippet here as it might help illuminate what conditions need to be true to trigger the bug: ```bash cat confdefs.h - <<_ACEOF >build/config_modules/config_modules.c #include <linux/module.h> #if !defined(CONFIG_MODULES) #error CONFIG_MODULES not defined #endif int main (void) { ; return 0; } MODULE_DESCRIPTION("conftest"); MODULE_AUTHOR(ZFS_META_AUTHOR); MODULE_VERSION(ZFS_META_VERSION "-" ZFS_META_RELEASE); MODULE_LICENSE("Dual BSD/GPL"); _ACEOF ``` ### Include any warning/errors/backtraces from the system logs There are no errors reported to the console or in kernel messages
1.0
Data corruption since generic_file_splice_read -> filemap_splice_read change (6.5 compat, but occurs on 6.4 too) - ### System information Type | Version/Name --- | --- Distribution Name | Arch Distribution Version | Rolling release Kernel Version | 6.4.8, 6.5rc1/2/3/4 Architecture | x86-64 OpenZFS Version | [commit 36261c8](https://github.com/openzfs/zfs/commit/36261c8238df462b214854ccea1df4f060cf0995) ### Describe the problem you're observing After the recent changes to get OpenZFS compiling/running on 6.5, there appears to be a possible lingering data corruption bug. In the repeatable example below, it reliably inserts a long run of NULL bytes into a file, causing a build to fail (conveniently, the build of ZFS). My expectation is that the bug probably exists for any kernel where `filemap_splice_read` exists, which recently has replaced `generic_file_splice_read` in other Linux filesystem code. ### Describe how to reproduce the problem **Again - despite demonstrating the problem with the OpenZFS build, the problem only manifests itself when running on the ZFS branch at the commit listed above. It just so happens that I'm able to use our build to reproduce the bug.** 1. You need to be running ZFS patched up to the commit listed above. I have reproduced this on Kernel 6.4.8 and all 6.5 RC's up to rc4 2. `git clone https://github.com/openzfs/zfs.git` 3. `cd ./zfs` 4. `./autogen.sh` 5. `mkdir -p ../zfs-test` 6. `cd ../zfs-test` 7. `../zfs/configure --with-linux=/usr/src/linux` (or wherever your headers/source tree is) Eventually, the `configure` will fail with the following message: ``` configure: error: *** This kernel does not include the required loadable module *** support! *** *** To build OpenZFS as a loadable Linux kernel module *** enable loadable module support by setting *** `CONFIG_MODULES=y` in the kernel configuration and run *** `make modules_prepare` in the Linux source tree. 
*** *** If you don't intend to enable loadable kernel module *** support, please compile OpenZFS as a Linux kernel built-in. *** *** Prepare the Linux source tree by running `make prepare`, *** use the OpenZFS `--enable-linux-builtin` configure option, *** copy the OpenZFS sources into the Linux source tree using *** `./copy-builtin <linux source directory>`, *** set `CONFIG_ZFS=y` in the kernel configuration and compile *** kernel as usual. ``` I enter the directory of the failing test: ``` cd build/config_modules ``` Looking at the `config_modules.c` file, which is resulting in the failure: ```c /* confdefs.h */ #define PACKAGE_NAME "zfs" #define PACKAGE_TARNAME "zfs" #define PACKAGE_VERSION "2.2.99" #define PACKAGE_STRING "zfs 2.2.99" #define PACKAGE_BUGREPORT "" #define PACKAGE_URL "" #define ZFS_META_NAME "zfs" #define ZFS_META_VERSION "2.2.99" #define SPL_META_VERSION ZFS_META_VERSION #define ZFS_META_RELEASE "1" #define SPL_META_RELEASE ZFS_META_RELEASE #define ZFS_META_LICENSE "CDDL" #define ZFS_META_ALIAS "zfs-2.2.99-1" #define SPL_META_ALIAS ZFS_META_ALIAS #define ZFS_META_AUTHOR "OpenZFS" #define ZFS_META_KVER_MIN "3.10" #define ZFS_META_KVER_MAX "6.4" #define PACKAGE "zfs" #define VERSION "2.2.99" [long run of NUL bytes (`^@`) omitted] #include <linux/module.h> #if !defined(CONFIG_MODULES) #error CONFIG_MODULES not defined #endif int main (void) { ; return 0; } ``` Curiously, this bug does not manifest frequently and the system largely appears to run stable in many other use cases. The section of `configure` that writes that file looks like below. Including this snippet here as it might help illuminate what conditions need to be true to trigger the bug: ```bash cat confdefs.h - <<_ACEOF >build/config_modules/config_modules.c #include <linux/module.h> #if !defined(CONFIG_MODULES) #error CONFIG_MODULES not defined #endif int main (void) { ; return 0; } MODULE_DESCRIPTION("conftest"); MODULE_AUTHOR(ZFS_META_AUTHOR); MODULE_VERSION(ZFS_META_VERSION "-" ZFS_META_RELEASE); MODULE_LICENSE("Dual BSD/GPL"); _ACEOF ``` ### Include any warning/errors/backtraces from the system logs There are no errors reported to the console or in kernel messages
non_priority
data corruption since generic file splice read filemap splice read change compat but occurs on too system information type version name distribution name arch distribution version rolling release kernel version architecture openzfs version describe the problem you re observing after the recent changes to get openzfs compiling running on there appears to be a possible lingering data corruption bug in the repeatable example below it reliably inserts a long run of null bytes into a file causing a build to fail conveniently the build of zfs my expectation is that the bug probably exists for any kernel where filemap splice read exists which recently has replaced generic file splice read in other linux filesystem code describe how to reproduce the problem again despite demonstrating the problem with the openzfs build the problem only manifests itself when running on the zfs branch at the commit listed above it just so happens that i m able to use our build to reproduce the bug you need to be running zfs patched up to the commit listed above i have reproduced this on kernel and all rc s up to git clone cd zfs autogen sh mkdir p zfs test cd zfs test zfs configure with linux usr src linux or wherever your headers source tree is eventually the configure will fail with the following message configure error this kernel does not include the required loadable module support to build openzfs as a loadable linux kernel module enable loadable module support by setting config modules y in the kernel configuration and run make modules prepare in the linux source tree if you don t intend to enable loadable kernel module support please compile openzfs as a linux kernel built in prepare the linux source tree by running make prepare use the openzfs enable linux builtin configure option copy the openzfs sources into the linux source tree using copy builtin set config zfs y in the kernel configuration and compile kernel as usual i enter the directory of the failing test cd build config 
modules looking at the config modules c file which is resulting in the failure c confdefs h define package name zfs define package tarname zfs define package version define package string zfs define package bugreport define package url define zfs meta name zfs define zfs meta version define spl meta version zfs meta version define zfs meta release define spl meta release zfs meta release define zfs meta license cddl define zfs meta alias zfs define spl meta alias zfs meta alias define zfs meta author openzfs define zfs meta kver min define zfs meta kver max define package zfs define version include if defined config modules error config modules not defined endif int main void return curiously this bug does not manifest frequently and the system largely appears to run stable in many other use cases the section of configure that writes that file looks like below including this snippet here as it might help illuminate what conditions need to be true to trigger the bug bash cat confdefs h build config modules config modules c include if defined config modules error config modules not defined endif int main void return module description conftest module author zfs meta author module version zfs meta version zfs meta release module license dual bsd gpl aceof include any warning errors backtraces from the system logs there are no errors reported to the console or in kernel messages
0
211,430
16,444,705,458
IssuesEvent
2021-05-20 18:06:43
pablovarela/terraform-provider-slack
https://api.github.com/repos/pablovarela/terraform-provider-slack
closed
Documentation missing specifics on Slack Token OAuth Scopes
documentation
The documentation does not specify which OAuth Bot Scopes are required for this provider to perform all its tasks. Could that be added to the documentation?
1.0
Documentation missing specifics on Slack Token OAuth Scopes - The documentation does not specify which OAuth Bot Scopes are required for this provider to perform all its tasks. Could that be added to the documentation?
non_priority
documentation missing specifics on slack token oauth scopes the documentation does not specify which oauth bot scopes are required for this provider to perform all its tasks could that be added to the documentation
0
106,253
9,125,651,045
IssuesEvent
2019-02-24 15:30:43
dateutil/dateutil
https://api.github.com/repos/dateutil/dateutil
closed
bytes/str conversion lines not hit in test_isoparser
good first issue help wanted isoparser low-difficulty tests
I've noticed that in the [coverage for `test_isoparser`](https://codecov.io/gh/dateutil/dateutil/src/572e358279fb0d90862e2f5fd8d55300a9670fcf/dateutil/test/test_isoparser.py), lines [375](https://github.com/dateutil/dateutil/blob/a41c4c6b498189b1c456e2d8e7a21a308974bc89/dateutil/test/test_isoparser.py#L375), [447](https://github.com/dateutil/dateutil/blob/a41c4c6b498189b1c456e2d8e7a21a308974bc89/dateutil/test/test_isoparser.py#L447) and [449](https://github.com/dateutil/dateutil/blob/a41c4c6b498189b1c456e2d8e7a21a308974bc89/dateutil/test/test_isoparser.py#L449) are not being hit. I can't tell if this means that the tests aren't being run against both `bytes` and `str` or if it means that I have unnecessary `if` conditions in the tests. If the former, we need to make sure tests are being run against `bytes` and `str`. If the latter, we can remove the unnecessary branches. The parts that are not covered because they are under an `xfail` can have `#pragma: no cover` added to them, as can the error handling lines in the test generation part.
1.0
bytes/str conversion lines not hit in test_isoparser - I've noticed that in the [coverage for `test_isoparser`](https://codecov.io/gh/dateutil/dateutil/src/572e358279fb0d90862e2f5fd8d55300a9670fcf/dateutil/test/test_isoparser.py), lines [375](https://github.com/dateutil/dateutil/blob/a41c4c6b498189b1c456e2d8e7a21a308974bc89/dateutil/test/test_isoparser.py#L375), [447](https://github.com/dateutil/dateutil/blob/a41c4c6b498189b1c456e2d8e7a21a308974bc89/dateutil/test/test_isoparser.py#L447) and [449](https://github.com/dateutil/dateutil/blob/a41c4c6b498189b1c456e2d8e7a21a308974bc89/dateutil/test/test_isoparser.py#L449) are not being hit. I can't tell if this means that the tests aren't being run against both `bytes` and `str` or if it means that I have unnecessary `if` conditions in the tests. If the former, we need to make sure tests are being run against `bytes` and `str`. If the latter, we can remove the unnecessary branches. The parts that are not covered because they are under an `xfail` can have `#pragma: no cover` added to them, as can the error handling lines in the test generation part.
non_priority
bytes str conversion lines not hit in test isoparser i ve noticed that in the lines and are not being hit i can t tell if this means that the tests aren t being run against both bytes and str or if it means that i have unnecessary if conditions in the tests if the former we need to make sure tests are being run against bytes and str if the latter we can remove the unnecessary branches the parts that are not covered because they are under an xfail can have pragma no cover added to them as can the error handling lines in the test generation part
0
122,466
17,703,927,673
IssuesEvent
2021-08-25 04:07:38
Chiencc/Sample_Webgoat
https://api.github.com/repos/Chiencc/Sample_Webgoat
closed
CVE-2014-3625 (Medium) detected in spring-webmvc-3.2.4.RELEASE.jar - autoclosed
security vulnerability
## CVE-2014-3625 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-webmvc-3.2.4.RELEASE.jar</b></p></summary> <p>Spring Web MVC</p> <p>Library home page: <a href="https://github.com/SpringSource/spring-framework">https://github.com/SpringSource/spring-framework</a></p> <p>Path to dependency file: Sample_Webgoat_depth_0/SourceCode/webgoat-standalone/webgoat-standalone/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-webmvc/3.2.4.RELEASE/spring-webmvc-3.2.4.RELEASE.jar</p> <p> Dependency Hierarchy: - webgoat-container-7.1.jar (Root Library) - :x: **spring-webmvc-3.2.4.RELEASE.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Chiencc/Sample_Webgoat/commit/8c8daafbebc152c1aabb39157cf71791044ee1af">8c8daafbebc152c1aabb39157cf71791044ee1af</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Directory traversal vulnerability in Pivotal Spring Framework 3.0.4 through 3.2.x before 3.2.12, 4.0.x before 4.0.8, and 4.1.x before 4.1.2 allows remote attackers to read arbitrary files via unspecified vectors, related to static resource handling. 
<p>Publish Date: 2014-11-20 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2014-3625>CVE-2014-3625</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://tanzu.vmware.com/security/CVE-2014-3625">https://tanzu.vmware.com/security/CVE-2014-3625</a></p> <p>Release Date: 2014-11-20</p> <p>Fix Resolution: org.springframework:spring-webmvc:3.2.12.RELEASE,4.0.8.RELEASE,4.1.2.RELEASE</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2014-3625 (Medium) detected in spring-webmvc-3.2.4.RELEASE.jar - autoclosed - ## CVE-2014-3625 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-webmvc-3.2.4.RELEASE.jar</b></p></summary> <p>Spring Web MVC</p> <p>Library home page: <a href="https://github.com/SpringSource/spring-framework">https://github.com/SpringSource/spring-framework</a></p> <p>Path to dependency file: Sample_Webgoat_depth_0/SourceCode/webgoat-standalone/webgoat-standalone/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-webmvc/3.2.4.RELEASE/spring-webmvc-3.2.4.RELEASE.jar</p> <p> Dependency Hierarchy: - webgoat-container-7.1.jar (Root Library) - :x: **spring-webmvc-3.2.4.RELEASE.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Chiencc/Sample_Webgoat/commit/8c8daafbebc152c1aabb39157cf71791044ee1af">8c8daafbebc152c1aabb39157cf71791044ee1af</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Directory traversal vulnerability in Pivotal Spring Framework 3.0.4 through 3.2.x before 3.2.12, 4.0.x before 4.0.8, and 4.1.x before 4.1.2 allows remote attackers to read arbitrary files via unspecified vectors, related to static resource handling. 
<p>Publish Date: 2014-11-20 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2014-3625>CVE-2014-3625</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://tanzu.vmware.com/security/CVE-2014-3625">https://tanzu.vmware.com/security/CVE-2014-3625</a></p> <p>Release Date: 2014-11-20</p> <p>Fix Resolution: org.springframework:spring-webmvc:3.2.12.RELEASE,4.0.8.RELEASE,4.1.2.RELEASE</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve medium detected in spring webmvc release jar autoclosed cve medium severity vulnerability vulnerable library spring webmvc release jar spring web mvc library home page a href path to dependency file sample webgoat depth sourcecode webgoat standalone webgoat standalone pom xml path to vulnerable library home wss scanner repository org springframework spring webmvc release spring webmvc release jar dependency hierarchy webgoat container jar root library x spring webmvc release jar vulnerable library found in head commit a href found in base branch master vulnerability details directory traversal vulnerability in pivotal spring framework through x before x before and x before allows remote attackers to read arbitrary files via unspecified vectors related to static resource handling publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution org springframework spring webmvc release release release step up your open source security game with whitesource
0
263,409
28,030,145,856
IssuesEvent
2023-03-28 11:51:34
RG4421/ampere-centos-kernel
https://api.github.com/repos/RG4421/ampere-centos-kernel
reopened
CVE-2021-3506 (High) detected in linuxv5.2
Mend: dependency security vulnerability
## CVE-2021-3506 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.2</b></p></summary> <p> <p>Linux kernel source tree</p> <p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p> <p>Found in base branch: <b>amp-centos-8.0-kernel</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/f2fs/node.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/f2fs/node.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An out-of-bounds (OOB) memory access flaw was found in fs/f2fs/node.c in the f2fs module in the Linux kernel in versions before 5.12.0-rc4. A bounds check failure allows a local attacker to gain access to out-of-bounds memory leading to a system crash or a leak of internal kernel information. The highest threat from this vulnerability is to system availability. 
<p>Publish Date: 2021-04-19 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-3506>CVE-2021-3506</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2021-3506">https://www.linuxkernelcves.com/cves/CVE-2021-3506</a></p> <p>Release Date: 2021-04-19</p> <p>Fix Resolution: v4.19.191,v5.10.36,v5.11.20,v5.12.3,v5.4.118,v5.13-rc1</p> </p> </details> <p></p>
True
CVE-2021-3506 (High) detected in linuxv5.2 - ## CVE-2021-3506 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.2</b></p></summary> <p> <p>Linux kernel source tree</p> <p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p> <p>Found in base branch: <b>amp-centos-8.0-kernel</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/f2fs/node.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/f2fs/node.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An out-of-bounds (OOB) memory access flaw was found in fs/f2fs/node.c in the f2fs module in the Linux kernel in versions before 5.12.0-rc4. A bounds check failure allows a local attacker to gain access to out-of-bounds memory leading to a system crash or a leak of internal kernel information. The highest threat from this vulnerability is to system availability. 
<p>Publish Date: 2021-04-19 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-3506>CVE-2021-3506</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2021-3506">https://www.linuxkernelcves.com/cves/CVE-2021-3506</a></p> <p>Release Date: 2021-04-19</p> <p>Fix Resolution: v4.19.191,v5.10.36,v5.11.20,v5.12.3,v5.4.118,v5.13-rc1</p> </p> </details> <p></p>
non_priority
cve high detected in cve high severity vulnerability vulnerable library linux kernel source tree library home page a href found in base branch amp centos kernel vulnerable source files fs node c fs node c vulnerability details an out of bounds oob memory access flaw was found in fs node c in the module in the linux kernel in versions before a bounds check failure allows a local attacker to gain access to out of bounds memory leading to a system crash or a leak of internal kernel information the highest threat from this vulnerability is to system availability publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
0
146,360
13,179,639,324
IssuesEvent
2020-08-12 11:19:50
ch99q/nuxt-pdf
https://api.github.com/repos/ch99q/nuxt-pdf
closed
pdf created but completely empty
documentation
Hi, I just stumbled upon your plugin and it was exactly what I was looking for. Unfortunately, the generated .pdf is completely empty. I am using @nuxt/content for product pages and then want to use nuxt-pdf to generate a companion pdf of the same page. What could be a good reason why the generated pdf is completely empty? Thanks!
1.0
pdf created but completely empty - Hi, I just stumbled upon your plugin and it was exactly what I was looking for. Unfortunately, the generated .pdf is completely empty. I am using @nuxt/content for product pages and then want to use nuxt-pdf to generate a companion pdf of the same page. What could be a good reason why the generated pdf is completely empty? Thanks!
non_priority
pdf created but completely empty hi i just stumbled upon your plugin and it was exactly what i was looking for unfortunately the generated pdf is completely empty i am using nuxt content for product pages and then want to use nuxt pdf to generate a companion pdf of the same page what could be a good reason why the generated pdf is completely empty thanks
0
50,294
13,508,558,673
IssuesEvent
2020-09-14 07:54:37
zulcomp/zulcomp.github.io
https://api.github.com/repos/zulcomp/zulcomp.github.io
opened
CVE-2015-3227 (Medium) detected in activesupport-3.2.22.5.gem
security vulnerability
## CVE-2015-3227 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>activesupport-3.2.22.5.gem</b></p></summary> <p>A toolkit of support libraries and Ruby core extensions extracted from the Rails framework. Rich support for multibyte strings, internationalization, time zones, and testing.</p> <p>Library home page: <a href="https://rubygems.org/gems/activesupport-3.2.22.5.gem">https://rubygems.org/gems/activesupport-3.2.22.5.gem</a></p> <p> Dependency Hierarchy: - github-pages-207.gem (Root Library) - jekyll-mentions-1.5.1.gem - html-pipeline-2.14.0.gem - :x: **activesupport-3.2.22.5.gem** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/zulcomp/zulcomp.github.io/commit/8f289349e52dce35c389277d9fcb8c54ffae2969">8f289349e52dce35c389277d9fcb8c54ffae2969</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The (1) jdom.rb and (2) rexml.rb components in Active Support in Ruby on Rails before 4.1.11 and 4.2.x before 4.2.2, when JDOM or REXML is enabled, allow remote attackers to cause a denial of service (SystemStackError) via a large XML document depth. 
<p>Publish Date: 2015-07-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-3227>CVE-2015-3227</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-3227">https://nvd.nist.gov/vuln/detail/CVE-2015-3227</a></p> <p>Release Date: 2015-07-26</p> <p>Fix Resolution: 4.1.11,4.2.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2015-3227 (Medium) detected in activesupport-3.2.22.5.gem - ## CVE-2015-3227 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>activesupport-3.2.22.5.gem</b></p></summary> <p>A toolkit of support libraries and Ruby core extensions extracted from the Rails framework. Rich support for multibyte strings, internationalization, time zones, and testing.</p> <p>Library home page: <a href="https://rubygems.org/gems/activesupport-3.2.22.5.gem">https://rubygems.org/gems/activesupport-3.2.22.5.gem</a></p> <p> Dependency Hierarchy: - github-pages-207.gem (Root Library) - jekyll-mentions-1.5.1.gem - html-pipeline-2.14.0.gem - :x: **activesupport-3.2.22.5.gem** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/zulcomp/zulcomp.github.io/commit/8f289349e52dce35c389277d9fcb8c54ffae2969">8f289349e52dce35c389277d9fcb8c54ffae2969</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The (1) jdom.rb and (2) rexml.rb components in Active Support in Ruby on Rails before 4.1.11 and 4.2.x before 4.2.2, when JDOM or REXML is enabled, allow remote attackers to cause a denial of service (SystemStackError) via a large XML document depth. 
<p>Publish Date: 2015-07-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-3227>CVE-2015-3227</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-3227">https://nvd.nist.gov/vuln/detail/CVE-2015-3227</a></p> <p>Release Date: 2015-07-26</p> <p>Fix Resolution: 4.1.11,4.2.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve medium detected in activesupport gem cve medium severity vulnerability vulnerable library activesupport gem a toolkit of support libraries and ruby core extensions extracted from the rails framework rich support for multibyte strings internationalization time zones and testing library home page a href dependency hierarchy github pages gem root library jekyll mentions gem html pipeline gem x activesupport gem vulnerable library found in head commit a href found in base branch master vulnerability details the jdom rb and rexml rb components in active support in ruby on rails before and x before when jdom or rexml is enabled allow remote attackers to cause a denial of service systemstackerror via a large xml document depth publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
21,473
14,589,107,822
IssuesEvent
2020-12-19 00:32:15
KaTeX/KaTeX
https://api.github.com/repos/KaTeX/KaTeX
closed
Safari screenshotter
bug infrastructure
The Safari screenshotter is broken. I have three PRs that currently are shown as having errors purely due to Safari screenshotter dysfunction. Even when the Safari screenshotter works, the process is clumsy and tedious. I think we should get rid of it.
1.0
Safari screenshotter - The Safari screenshotter is broken. I have three PRs that currently are shown as having errors purely due to Safari screenshotter dysfunction. Even when the Safari screenshotter works, the process is clumsy and tedious. I think we should get rid of it.
non_priority
safari screenshotter the safari screenshotter is broken i have three prs that currently are shown as having errors purely due to safari screenshotter dysfunction even when the safari screenshotter works the process is clumsy and tedious i think we should get rid of it
0
21,938
7,098,106,501
IssuesEvent
2018-01-15 02:25:32
hadithhouse/hadithhouse
https://api.github.com/repos/hadithhouse/hadithhouse
opened
Use tslint in build
build & deploy
To make sure that code not styled correctly doesn't get into the repository, I would like to add `tslint` to the build process so build fails if there are style warnings.
1.0
Use tslint in build - To make sure that code not styled correctly doesn't get into the repository, I would like to add `tslint` to the build process so build fails if there are style warnings.
non_priority
use tslint in build to make sure that code not styled correctly doesn t get into the repository i would like to add tslint to the build process so build fails if there are style warnings
0
348,687
24,919,350,547
IssuesEvent
2022-10-30 19:23:50
sfeir-open-source/sfeir-school-github-action-dev
https://api.github.com/repos/sfeir-open-source/sfeir-school-github-action-dev
closed
Add Actionlint tool to testing section
documentation
Actionlint is a linter for your action to help you with a set of recommendations on GitHub Actions ecosystem https://rhysd.github.io/actionlint/
1.0
Add Actionlint tool to testing section - Actionlint is a linter for your action to help you with a set of recommendations on GitHub Actions ecosystem https://rhysd.github.io/actionlint/
non_priority
add actionlint tool to testing section actionlint is a linter for your action to help you with a set of recommendations on github actions ecosystem
0
17,818
6,517,712,333
IssuesEvent
2017-08-28 02:37:38
Tirocupidus/TheExiledRPOverhaul
https://api.github.com/repos/Tirocupidus/TheExiledRPOverhaul
closed
Ice arrow recipe still 10 shards
easy fix ready for build
This wasn't fixed at least in the tooltip, haven't tried to make.
1.0
Ice arrow recipe still 10 shards - This wasn't fixed at least in the tooltip, haven't tried to make.
non_priority
ice arrow recipe still shards this wasn t fixed at least in the tooltip haven t tried to make
0
71,606
18,793,710,296
IssuesEvent
2021-11-08 19:37:18
o3de/o3de
https://api.github.com/repos/o3de/o3de
opened
Potential missing dependency in autogen under Ninja
kind/bug needs-triage sig/build sig/network
**Describe the bug** During a build of Android it was observed that there was a missing dependency to a codegen functions: ``` 05:45:03 lib/profile/libAzNetworking.a(unity_3_cxx.cxx.o): In function `AzNetworking::UdpNetworkInterface::HandleConnectionTimeout(AzNetworking::TimeoutQueue::TimeoutItem&)': 05:45:03 D:/workspace/o3de/Code/Framework/AzNetworking/AzNetworking/UdpTransport/UdpNetworkInterface.cpp:736: undefined reference to `CorePackets::HeartbeatPacket::HeartbeatPacket(bool)' 05:45:03 lib/profile/libAzNetworking.a(unity_2_cxx.cxx.o): In function `AzNetworking::UdpConnection::UpdateHeartbeat(AZ::TimeMs)': 05:45:03 D:/workspace/o3de/Code/Framework/AzNetworking/AzNetworking/UdpTransport/UdpConnection.cpp:83: undefined reference to `CorePackets::HeartbeatPacket::HeartbeatPacket(bool)' 05:45:03 lib/profile/libAzNetworking.a(unity_2_cxx.cxx.o): In function `AzNetworking::UdpConnection::HandleCorePacket(AzNetworking::IConnectionListener&, AzNetworking::UdpPacketHeader&, AzNetworking::ISerializer&)': 05:45:03 D:/workspace/o3de/Code/Framework/AzNetworking/AzNetworking/UdpTransport/UdpConnection.cpp:296: undefined reference to `CorePackets::HeartbeatPacket::HeartbeatPacket(bool)' ``` The functions are generated by codegen and are within the same static library, so it looks like codegen did not generate those functions. In windows the rule is created properly with the dependency to the xml that should produce the generation of those functions: ![image](https://user-images.githubusercontent.com/81431996/140806211-35f526ce-28e3-4a2d-8c57-a299200847ca.png) So this could be an issue with dependencies in the Ninja generator. Do some digging to find if Ninja is declaring and tracking that dependency properly.
1.0
Potential missing dependency in autogen under Ninja - **Describe the bug** During a build of Android it was observed that there was a missing dependency to a codegen functions: ``` 05:45:03 lib/profile/libAzNetworking.a(unity_3_cxx.cxx.o): In function `AzNetworking::UdpNetworkInterface::HandleConnectionTimeout(AzNetworking::TimeoutQueue::TimeoutItem&)': 05:45:03 D:/workspace/o3de/Code/Framework/AzNetworking/AzNetworking/UdpTransport/UdpNetworkInterface.cpp:736: undefined reference to `CorePackets::HeartbeatPacket::HeartbeatPacket(bool)' 05:45:03 lib/profile/libAzNetworking.a(unity_2_cxx.cxx.o): In function `AzNetworking::UdpConnection::UpdateHeartbeat(AZ::TimeMs)': 05:45:03 D:/workspace/o3de/Code/Framework/AzNetworking/AzNetworking/UdpTransport/UdpConnection.cpp:83: undefined reference to `CorePackets::HeartbeatPacket::HeartbeatPacket(bool)' 05:45:03 lib/profile/libAzNetworking.a(unity_2_cxx.cxx.o): In function `AzNetworking::UdpConnection::HandleCorePacket(AzNetworking::IConnectionListener&, AzNetworking::UdpPacketHeader&, AzNetworking::ISerializer&)': 05:45:03 D:/workspace/o3de/Code/Framework/AzNetworking/AzNetworking/UdpTransport/UdpConnection.cpp:296: undefined reference to `CorePackets::HeartbeatPacket::HeartbeatPacket(bool)' ``` The functions are generated by codegen and are within the same static library, so it looks like codegen did not generate those functions. In windows the rule is created properly with the dependency to the xml that should produce the generation of those functions: ![image](https://user-images.githubusercontent.com/81431996/140806211-35f526ce-28e3-4a2d-8c57-a299200847ca.png) So this could be an issue with dependencies in the Ninja generator. Do some digging to find if Ninja is declaring and tracking that dependency properly.
non_priority
potential missing dependency in autogen under ninja describe the bug during a build of android it was observed that there was a missing dependency to a codegen functions lib profile libaznetworking a unity cxx cxx o in function aznetworking udpnetworkinterface handleconnectiontimeout aznetworking timeoutqueue timeoutitem d workspace code framework aznetworking aznetworking udptransport udpnetworkinterface cpp undefined reference to corepackets heartbeatpacket heartbeatpacket bool lib profile libaznetworking a unity cxx cxx o in function aznetworking udpconnection updateheartbeat az timems d workspace code framework aznetworking aznetworking udptransport udpconnection cpp undefined reference to corepackets heartbeatpacket heartbeatpacket bool lib profile libaznetworking a unity cxx cxx o in function aznetworking udpconnection handlecorepacket aznetworking iconnectionlistener aznetworking udppacketheader aznetworking iserializer d workspace code framework aznetworking aznetworking udptransport udpconnection cpp undefined reference to corepackets heartbeatpacket heartbeatpacket bool the functions are generated by codegen and are within the same static library so it looks like codegen did not generate those functions in windows the rule is created properly with the dependency to the xml that should produce the generation of those functions so this could be an issue with dependencies in the ninja generator do some digging to find if ninja is declaring and tracking that dependency properly
0
262,287
19,770,673,955
IssuesEvent
2022-01-17 09:43:04
alphagov/govuk-design-system
https://api.github.com/repos/alphagov/govuk-design-system
closed
Accordion - it's not clear how section expanded is stored in session, and id has to be unique across the site
documentation 🕔 hours accordion guidance
We store whether a user expanded a section, and use 'id' for that. It's not clear from the guidance that we do that, so it's unexpected, and that means its not clear that 'id' has to be unique across the whole site. There is some guidance in the nunjucks options but people might not see that. In addition this means 'expanded' option may not work as expected - as the stored value will override it.
1.0
Accordion - it's not clear how section expanded is stored in session, and id has to be unique across the site - We store whether a user expanded a section, and use 'id' for that. It's not clear from the guidance that we do that, so it's unexpected, and that means its not clear that 'id' has to be unique across the whole site. There is some guidance in the nunjucks options but people might not see that. In addition this means 'expanded' option may not work as expected - as the stored value will override it.
non_priority
accordion it s not clear how section expanded is stored in session and id has to be unique across the site we store whether a user expanded a section and use id for that it s not clear from the guidance that we do that so it s unexpected and that means its not clear that id has to be unique across the whole site there is some guidance in the nunjucks options but people might not see that in addition this means expanded option may not work as expected as the stored value will override it
0
341,822
30,602,336,490
IssuesEvent
2023-07-22 14:51:21
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
reopened
Fix manipulation.test_flipud
Sub Task Ivy API Experimental Failing Test
| | | |---|---| |jax|<a href="https://github.com/unifyai/ivy/actions/runs/5630947656/job/15257413832"><img src=https://img.shields.io/badge/-success-success></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5630947656/job/15257413832"><img src=https://img.shields.io/badge/-success-success></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5630947656/job/15257413832"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/5630947656/job/15257413832"><img src=https://img.shields.io/badge/-success-success></a> |paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5630947656/job/15257413832"><img src=https://img.shields.io/badge/-failure-red></a>
1.0
Fix manipulation.test_flipud - | | | |---|---| |jax|<a href="https://github.com/unifyai/ivy/actions/runs/5630947656/job/15257413832"><img src=https://img.shields.io/badge/-success-success></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5630947656/job/15257413832"><img src=https://img.shields.io/badge/-success-success></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5630947656/job/15257413832"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/5630947656/job/15257413832"><img src=https://img.shields.io/badge/-success-success></a> |paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5630947656/job/15257413832"><img src=https://img.shields.io/badge/-failure-red></a>
non_priority
fix manipulation test flipud jax a href src numpy a href src tensorflow a href src torch a href src paddle a href src
0
134,832
30,195,046,101
IssuesEvent
2023-07-04 19:48:22
h4sh5/pypi-auto-scanner
https://api.github.com/repos/h4sh5/pypi-auto-scanner
opened
raspisump 1.8.2 has 4 GuardDog issues
guarddog code-execution
https://pypi.org/project/raspisump https://inspector.pypi.io/project/raspisump ```{ "dependency": "raspisump", "version": "1.8.2", "result": { "issues": 4, "errors": {}, "results": { "code-execution": [ { "location": "raspisump-1.8.2/setup.py:11", "code": " os.system(cmd)", "message": "This package is executing OS commands in the setup.py file" }, { "location": "raspisump-1.8.2/setup.py:69", "code": " os.system(cmd)", "message": "This package is executing OS commands in the setup.py file" }, { "location": "raspisump-1.8.2/setup.py:71", "code": " os.system(cmd)", "message": "This package is executing OS commands in the setup.py file" }, { "location": "raspisump-1.8.2/setup.py:73", "code": " os.system(cmd)", "message": "This package is executing OS commands in the setup.py file" } ] }, "path": "/tmp/tmp1h_owrtk/raspisump" } }```
1.0
raspisump 1.8.2 has 4 GuardDog issues - https://pypi.org/project/raspisump https://inspector.pypi.io/project/raspisump ```{ "dependency": "raspisump", "version": "1.8.2", "result": { "issues": 4, "errors": {}, "results": { "code-execution": [ { "location": "raspisump-1.8.2/setup.py:11", "code": " os.system(cmd)", "message": "This package is executing OS commands in the setup.py file" }, { "location": "raspisump-1.8.2/setup.py:69", "code": " os.system(cmd)", "message": "This package is executing OS commands in the setup.py file" }, { "location": "raspisump-1.8.2/setup.py:71", "code": " os.system(cmd)", "message": "This package is executing OS commands in the setup.py file" }, { "location": "raspisump-1.8.2/setup.py:73", "code": " os.system(cmd)", "message": "This package is executing OS commands in the setup.py file" } ] }, "path": "/tmp/tmp1h_owrtk/raspisump" } }```
non_priority
raspisump has guarddog issues dependency raspisump version result issues errors results code execution location raspisump setup py code os system cmd message this package is executing os commands in the setup py file location raspisump setup py code os system cmd message this package is executing os commands in the setup py file location raspisump setup py code os system cmd message this package is executing os commands in the setup py file location raspisump setup py code os system cmd message this package is executing os commands in the setup py file path tmp owrtk raspisump
0
54,199
7,877,245,934
IssuesEvent
2018-06-26 06:09:51
xcat2/xcat-core
https://api.github.com/repos/xcat2/xcat-core
closed
Suggestion re man page for rcons and exit code
type:documentation
The man page for rcons states: "To exit the console session, enter: <ctrl>e c ." While this is correct, it could be confusing. I.e., it is easy to incorrectly assume that the exit code is only two characters (ctrl-e and c), interpreting the "." as punctuation to the sentence. Could I suggest something like "To exit the console session, enter: '<ctrl>e c .' (three characters, ctrl-e, 'c', and '.')." type:documentation prioirity:low type:enhancement
1.0
Suggestion re man page for rcons and exit code - The man page for rcons states: "To exit the console session, enter: <ctrl>e c ." While this is correct, it could be confusing. I.e., it is easy to incorrectly assume that the exit code is only two characters (ctrl-e and c), interpreting the "." as punctuation to the sentence. Could I suggest something like "To exit the console session, enter: '<ctrl>e c .' (three characters, ctrl-e, 'c', and '.')." type:documentation prioirity:low type:enhancement
non_priority
suggestion re man page for rcons and exit code the man page for rcons states to exit the console session enter e c while this is correct it could be confusing i e it is easy to incorrectly assume that the exit code is only two characters ctrl e and c interpreting the as punctuation to the sentence could i suggest something like to exit the console session enter e c three characters ctrl e c and type documentation prioirity low type enhancement
0
253,557
19,123,061,816
IssuesEvent
2021-12-01 02:07:47
dbuscombe-usgs/CoastSeg
https://api.github.com/repos/dbuscombe-usgs/CoastSeg
opened
This toolbox is a CoastSat plugin
documentation enhancement
- [ ] import [coastsat](https://github.com/kvos/CoastSat) - [ ] install the coastsat [conda environment](https://github.com/kvos/CoastSat/blob/master/environment.yml) - [ ] explore how other plugins (extensions) work in terms of wrapping/modifying the coastsat functionality - [ ] https://github.com/ydoherty/CoastSat.PlanetScope - [ ] https://github.com/VHeimhuber/InletTracker - [ ] https://github.com/mcuttler/CoastSat.islands - [ ] https://github.com/kvos/CoastSat.slope
1.0
This toolbox is a CoastSat plugin - - [ ] import [coastsat](https://github.com/kvos/CoastSat) - [ ] install the coastsat [conda environment](https://github.com/kvos/CoastSat/blob/master/environment.yml) - [ ] explore how other plugins (extensions) work in terms of wrapping/modifying the coastsat functionality - [ ] https://github.com/ydoherty/CoastSat.PlanetScope - [ ] https://github.com/VHeimhuber/InletTracker - [ ] https://github.com/mcuttler/CoastSat.islands - [ ] https://github.com/kvos/CoastSat.slope
non_priority
this toolbox is a coastsat plugin import install the coastsat explore how other plugins extensions work in terms of wrapping modifying the coastsat functionality
0
165,744
14,009,512,279
IssuesEvent
2020-10-29 02:33:55
MegEngine/MegEngine
https://api.github.com/repos/MegEngine/MegEngine
closed
Cross-entropy API error
status: in progress type: documentation
## Documentation link <!-- Please paste the link to the problematic documentation --> https://megengine.org.cn/doc/basic/train_and_evaluation.html ## Problem description <!-- Please describe your problem briefly and clearly --> loss = F.cross_entropy_with_softmax(logits, batch_label) is actually loss = F.loss.cross_entropy(logits, batch_label) in version 1.0.0, but the documentation has not been updated
1.0
Cross-entropy API error - ## Documentation link <!-- Please paste the link to the problematic documentation --> https://megengine.org.cn/doc/basic/train_and_evaluation.html ## Problem description <!-- Please describe your problem briefly and clearly --> loss = F.cross_entropy_with_softmax(logits, batch_label) is actually loss = F.loss.cross_entropy(logits, batch_label) in version 1.0.0, but the documentation has not been updated
non_priority
cross entropy api error documentation link problem description loss f cross entropy with softmax logits batch label is actually f loss cross entropy logits batch label in version but the documentation has not been updated
0
50,420
10,508,225,367
IssuesEvent
2019-09-27 08:08:45
ReikaKalseki/Reika_Mods_Issues
https://api.github.com/repos/ReikaKalseki/Reika_Mods_Issues
closed
RoC Gravel Gun can be enchanted, but loses enchants after firing
Bug RotaryCraft Stupid Code
When you fire the item goes from 5013:31608 to 5013:31594, and the enchantments disappear off of the item. This means any enchants you could put on it (soulbound, etc) are wiped after firing/
1.0
RoC Gravel Gun can be enchanted, but loses enchants after firing - When you fire the item goes from 5013:31608 to 5013:31594, and the enchantments disappear off of the item. This means any enchants you could put on it (soulbound, etc) are wiped after firing/
non_priority
roc gravel gun can be enchanted but loses enchants after firing when you fire the item goes from to and the enchantments disappear off of the item this means any enchants you could put on it soulbound etc are wiped after firing
0
73,388
24,604,012,991
IssuesEvent
2022-10-14 14:42:13
junichi11/netbeans-color-codes-preview
https://api.github.com/repos/junichi11/netbeans-color-codes-preview
opened
Error: color codes preview in an editor's sidebar not visible
defect
Error: color codes preview in an editor's sidebar not visible It was working in netbeans IDE 14 **Environments (please complete the following information):** - OS: [win 10 x64 21h2] - NetBeans Version: [apache NetBeans IDE 15] - Plugin Version: [Version: 0.13.4]
1.0
Error: color codes preview in an editor's sidebar not visible - Error: color codes preview in an editor's sidebar not visible It was working in netbeans IDE 14 **Environments (please complete the following information):** - OS: [win 10 x64 21h2] - NetBeans Version: [apache NetBeans IDE 15] - Plugin Version: [Version: 0.13.4]
non_priority
error color codes preview in an editor s sidebar not visible error color codes preview in an editor s sidebar not visible it was working in netbeans ide environments please complete the following information os netbeans version plugin version
0
364,892
25,510,403,800
IssuesEvent
2022-11-28 12:42:00
airalab/robonomics-wiki
https://api.github.com/repos/airalab/robonomics-wiki
closed
[Checking up-to-date]: Robonomics-js
documentation deprecation
### Issue description Author: @Vourhey Since we are updating the wiki and you are the author of the article, you need to check if this article is up to date. You options: - If the article is up to date, then make a comment about it in the issue. - If the article needs to be update, and you will definitely do it, then write about it in the issue. Let us know when you will finish it. - If you are unlikely to update the article, write about it and we will delete it. In any case, the article will be removed from the sidebar, and will be available only through a direct link. If there are no updates after 3 months, the article will be automatically removed from the wiki. Please, add the **tools parameter** to the article, if applicable, to indicate which versions of the software are needed for this article. Also, consider adding dependencies to the article for automatic deprecation reminders. This is explained here: https://github.com/airalab/robonomics-wiki-deprecation-notifier ### Doc Page https://wiki.robonomics.network/docs/en/robonomics-js/
1.0
[Checking up-to-date]: Robonomics-js - ### Issue description Author: @Vourhey Since we are updating the wiki and you are the author of the article, you need to check if this article is up to date. You options: - If the article is up to date, then make a comment about it in the issue. - If the article needs to be update, and you will definitely do it, then write about it in the issue. Let us know when you will finish it. - If you are unlikely to update the article, write about it and we will delete it. In any case, the article will be removed from the sidebar, and will be available only through a direct link. If there are no updates after 3 months, the article will be automatically removed from the wiki. Please, add the **tools parameter** to the article, if applicable, to indicate which versions of the software are needed for this article. Also, consider adding dependencies to the article for automatic deprecation reminders. This is explained here: https://github.com/airalab/robonomics-wiki-deprecation-notifier ### Doc Page https://wiki.robonomics.network/docs/en/robonomics-js/
non_priority
robonomics js issue description author vourhey since we are updating the wiki and you are the author of the article you need to check if this article is up to date you options if the article is up to date then make a comment about it in the issue if the article needs to be update and you will definitely do it then write about it in the issue let us know when you will finish it if you are unlikely to update the article write about it and we will delete it in any case the article will be removed from the sidebar and will be available only through a direct link if there are no updates after months the article will be automatically removed from the wiki please add the tools parameter to the article if applicable to indicate which versions of the software are needed for this article also consider adding dependencies to the article for automatic deprecation reminders this is explained here doc page
0
64,624
26,815,836,488
IssuesEvent
2023-02-02 04:30:26
ballerina-platform/openapi-tools
https://api.github.com/repos/ballerina-platform/openapi-tools
closed
Handle `multipart/form-data` in service generation in request payload
Type/Improvement Service OpenAPIToBallerina
**Description:** ```openapi openapi: 3.0.1 info: title: testInlineRequestBody version: 1.0.0 paths: /user: post: summary: Post operation for the path /user operationId: addUser requestBody: content: multipart/form-data: schema: type: object properties: userName: description: User Name type: string userPhone: description: User Phone Number type: string required: - userName - userPhone responses: 200: description: Successful content: application/json: example: Ok components: {} ``` generated code : ```ballerina import ballerina/http; listener http:Listener ep0 = new (9090, config = {host: "localhost"}); service / on ep0 { resource function post user(@http:Payload json payload) returns json { } } ``` - [ ] Enable this [test](https://github.com/ballerina-platform/openapi-tools/blob/d770c608b3a028915a52e089f0bcd657d2071722/openapi-cli/src/test/java/io/ballerina/openapi/cmd/OpenApiGenServiceCmdTest.java#L38) **Steps to reproduce:** **Affected Versions:** **OS, DB, other environment details and versions:** **Related Issues (optional):** <!-- Any related issues such as sub tasks, issues reported in other repositories (e.g component repositories), similar problems, etc. --> **Suggested Labels (optional):** <!-- Optional comma separated list of suggested labels. Non committers can’t assign labels to issues, so this will help issue creators who are not a committer to suggest possible labels--> **Suggested Assignees (optional):** <!--Optional comma separated list of suggested team members who should attend the issue. Non committers can’t assign issues to assignees, so this will help issue creators who are not a committer to suggest possible assignees-->
1.0
Handle `multipart/form-data` in service generation in request payload - **Description:** ```openapi openapi: 3.0.1 info: title: testInlineRequestBody version: 1.0.0 paths: /user: post: summary: Post operation for the path /user operationId: addUser requestBody: content: multipart/form-data: schema: type: object properties: userName: description: User Name type: string userPhone: description: User Phone Number type: string required: - userName - userPhone responses: 200: description: Successful content: application/json: example: Ok components: {} ``` generated code : ```ballerina import ballerina/http; listener http:Listener ep0 = new (9090, config = {host: "localhost"}); service / on ep0 { resource function post user(@http:Payload json payload) returns json { } } ``` - [ ] Enable this [test](https://github.com/ballerina-platform/openapi-tools/blob/d770c608b3a028915a52e089f0bcd657d2071722/openapi-cli/src/test/java/io/ballerina/openapi/cmd/OpenApiGenServiceCmdTest.java#L38) **Steps to reproduce:** **Affected Versions:** **OS, DB, other environment details and versions:** **Related Issues (optional):** <!-- Any related issues such as sub tasks, issues reported in other repositories (e.g component repositories), similar problems, etc. --> **Suggested Labels (optional):** <!-- Optional comma separated list of suggested labels. Non committers can’t assign labels to issues, so this will help issue creators who are not a committer to suggest possible labels--> **Suggested Assignees (optional):** <!--Optional comma separated list of suggested team members who should attend the issue. Non committers can’t assign issues to assignees, so this will help issue creators who are not a committer to suggest possible assignees-->
non_priority
handle multipart form data in service generation in request payload description openapi openapi info title testinlinerequestbody version paths user post summary post operation for the path user operationid adduser requestbody content multipart form data schema type object properties username description user name type string userphone description user phone number type string required username userphone responses description successful content application json example ok components generated code ballerina import ballerina http listener http listener new config host localhost service on resource function post user http payload json payload returns json enable this steps to reproduce affected versions os db other environment details and versions related issues optional suggested labels optional suggested assignees optional
0
418,233
28,114,079,996
IssuesEvent
2023-03-31 09:24:33
sansders/ped
https://api.github.com/repos/sansders/ped
opened
Hyperlinks in table of contents in UG do not work
type.DocumentationBug severity.Medium
With regards to the table of contents in the UG, clicking on any of the links under `Features` does not redirect the reader to the specified sections. ![image.png](https://raw.githubusercontent.com/sansders/ped/main/files/6a11a30d-7ef3-4130-a8e9-e49ae957eab1.png) <!--session: 1680252442399-47eb6088-0019-4920-8c5f-b40e61a04a6d--> <!--Version: Web v3.4.7-->
1.0
Hyperlinks in table of contents in UG do not work - With regards to the table of contents in the UG, clicking on any of the links under `Features` does not redirect the reader to the specified sections. ![image.png](https://raw.githubusercontent.com/sansders/ped/main/files/6a11a30d-7ef3-4130-a8e9-e49ae957eab1.png) <!--session: 1680252442399-47eb6088-0019-4920-8c5f-b40e61a04a6d--> <!--Version: Web v3.4.7-->
non_priority
hyperlinks in table of contents in ug do not work with regards to the table of contents in the ug clicking on any of the links under features does not redirect the reader to the specified sections
0