Unnamed: 0 int64 1 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 7 112 | repo_url stringlengths 36 141 | action stringclasses 3 values | title stringlengths 3 438 | labels stringlengths 4 308 | body stringlengths 7 254k | index stringclasses 7 values | text_combine stringlengths 96 254k | label stringclasses 2 values | text stringlengths 96 246k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
5,057 | 25,893,863,286 | IssuesEvent | 2022-12-14 20:26:47 | aws/serverless-application-model | https://api.github.com/repos/aws/serverless-application-model | closed | S3 Event triggers not working | type/bug area/event-source breaking-change stage/pm-review maintainer/need-followup | Hi, I am facing an issue where Event is not being created and associated with Lambda, although it is specified in SAM:
```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Example
Resources:
  LogToWatch:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: python3.6
      Timeout: 300
      Policies: AmazonS3ReadOnlyAccess
      Events:
        S3CreateObject:
          Type: S3
          Properties:
            Bucket:
              Ref: TargetBucket
            Events: s3:ObjectCreated:Put
  TargetBucket:
    Type: AWS::S3::Bucket
``` | True | S3 Event triggers not working - Hi, I am facing an issue where Event is not being created and associated with Lambda, although it is specified in SAM:
```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Example
Resources:
  LogToWatch:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: python3.6
      Timeout: 300
      Policies: AmazonS3ReadOnlyAccess
      Events:
        S3CreateObject:
          Type: S3
          Properties:
            Bucket:
              Ref: TargetBucket
            Events: s3:ObjectCreated:Put
  TargetBucket:
    Type: AWS::S3::Bucket
``` | main | event triggers not working hi i am facing an issue where event is not being created and associated with lambda although it is specified in sam yaml awstemplateformatversion transform aws serverless description example resources logtowatch type aws serverless function properties handler index handler runtime timeout policies events type properties bucket ref targetbucket events objectcreated put targetbucket type aws bucket | 1 |
3,071 | 11,591,765,534 | IssuesEvent | 2020-02-24 10:06:29 | backdrop-ops/contrib | https://api.github.com/repos/backdrop-ops/contrib | closed | Port of Image Field Caption | Maintainer application Port in progress | **Name of the module, theme, or layout you are working on**
Image Field Caption
**Link to the drupal.org module, theme, or layout**
https://www.drupal.org/project/image_field_caption
**(Optional) Link to the drupal.org issue where you notified the maintainer(s) of this port**
https://www.drupal.org/project/image_field_caption/issues/3097151
**Do you need assistance with this port?**
No, not at the moment - but anyone interested in (co-)maintaining, feel free to contact me! | True | Port of Image Field Caption - **Name of the module, theme, or layout you are working on**
Image Field Caption
**Link to the drupal.org module, theme, or layout**
https://www.drupal.org/project/image_field_caption
**(Optional) Link to the drupal.org issue where you notified the maintainer(s) of this port**
https://www.drupal.org/project/image_field_caption/issues/3097151
**Do you need assistance with this port?**
No, not at the moment - but anyone interested in (co-)maintaining, feel free to contact me! | main | port of image field caption name of the module theme or layout you are working on image field caption link to the drupal org module theme or layout optional link to the drupal org issue where you notified the maintainer s of this port do you need assistance with this port no not at the moment but anyone interested in co maintaining feel free to contact me | 1 |
5,378 | 27,032,054,217 | IssuesEvent | 2023-02-12 10:21:42 | Windham-High-School/CubeServer | https://api.github.com/repos/Windham-High-School/CubeServer | closed | Prebuild Docker images | enhancement docker maintainability | Prebuild images off of the production server to minimize downtime. This could be a complete workaround for #59 | True | Prebuild Docker images - Prebuild images off of the production server to minimize downtime. This could be a complete workaround for #59 | main | prebuild docker images prebuild images off of the production server to minimize downtime this could be a complete workaround for | 1 |
746,019 | 26,010,242,388 | IssuesEvent | 2022-12-21 00:30:34 | pystardust/ani-cli | https://api.github.com/repos/pystardust/ani-cli | closed | "Episodes not released yet!" error | type: support priority 3: low | **Metadata (please complete the following information)**
Version: 3.4.0
OS: EndeavourOS x86_64
Shell: `bash`
Anime: Blue Gender
**Describe the bug**
Unable to connect to scrape site when trying to watch/download an anime.
**Steps To Reproduce**
1. Run `ani-cli -d blue gender` (or run without `-d`)
2. Choose 1 (blue gender)
**Expected behavior**
Downloading/watching should work.
**Additional context**
Attempting to connect to https://animixplay.to/ gives the following error page on Firefox (tested with an arkenfox profile as well as a fresh profile):
```
An error occurred during a connection to animixplay.to. SSL received a record that exceeded the maximum permissible length.
Error code: SSL_ERROR_RX_RECORD_TOO_LONG
```
I have also tried this with other shows (e.g. `ani-cli -d mobile suit gundam`) and receive the same error. | 1.0 | "Episodes not released yet!" error - **Metadata (please complete the following information)**
Version: 3.4.0
OS: EndeavourOS x86_64
Shell: `bash`
Anime: Blue Gender
**Describe the bug**
Unable to connect to scrape site when trying to watch/download an anime.
**Steps To Reproduce**
1. Run `ani-cli -d blue gender` (or run without `-d`)
2. Choose 1 (blue gender)
**Expected behavior**
Downloading/watching should work.
**Additional context**
Attempting to connect to https://animixplay.to/ gives the following error page on Firefox (tested with an arkenfox profile as well as a fresh profile):
```
An error occurred during a connection to animixplay.to. SSL received a record that exceeded the maximum permissible length.
Error code: SSL_ERROR_RX_RECORD_TOO_LONG
```
I have also tried this with other shows (e.g. `ani-cli -d mobile suit gundam`) and receive the same error. | non_main | episodes not released yet error metadata please complete the following information version os endeavouros shell bash anime blue gender describe the bug unable to connect to scrape site when trying to watch download an anime steps to reproduce run ani cli d blue gender or run without d choose blue gender expected behavior downloading watching should work additional context attempting to connect to gives the following error page on firefox tested with an arkenfox profile as well as a fresh profile an error occurred during a connection to animixplay to ssl received a record that exceeded the maximum permissible length error code ssl error rx record too long i have also tried this with other shows e g ani cli d mobile suit gundam and receive the same error | 0 |
13,085 | 3,684,381,401 | IssuesEvent | 2016-02-24 17:09:04 | snowplow/snowplow | https://api.github.com/repos/snowplow/snowplow | closed | Improve Snowplow version matrix | documentation | A basic page has been connected here: https://github.com/snowplow/snowplow/wiki/Snowplow-version-matrix
@ihortom you should have write access to the underlying Google Spreadsheet. Tasks:
* [ ] Populate the yellow highlighted cells
* [ ] Flesh out the wiki page
* [ ] Wire the wiki page into other pages (e.g. upgrade guide)
* [ ] Email user group to let people know this exists now | 1.0 | Improve Snowplow version matrix - A basic page has been connected here: https://github.com/snowplow/snowplow/wiki/Snowplow-version-matrix
@ihortom you should have write access to the underlying Google Spreadsheet. Tasks:
* [ ] Populate the yellow highlighted cells
* [ ] Flesh out the wiki page
* [ ] Wire the wiki page into other pages (e.g. upgrade guide)
* [ ] Email user group to let people know this exists now | non_main | improve snowplow version matrix a basic page has been connected here ihortom you should have write access to the underlying google spreadsheet tasks populate the yellow highlighted cells flesh out the wiki page wire the wiki page into other pages e g upgrade guide email user group to let people know this exists now | 0 |
10,970 | 6,997,276,812 | IssuesEvent | 2017-12-16 12:45:24 | kimai/kimai | https://api.github.com/repos/kimai/kimai | closed | Loading icon when kimai is still working | Feature Request Usability | Especially when working with slow internet connection or with heaps of entries at once, kimai might take a while to produce the required output.
I suggest a loading icon somewhere on the main screen to inform the user that kimai is still working.
| True | Loading icon when kimai is still working - Especially when working with slow internet connection or with heaps of entries at once, kimai might take a while to produce the required output.
I suggest a loading icon somewhere on the main screen to inform the user that kimai is still working.
| non_main | loading icon when kimai is still working especially when working with slow internet connection or with heaps of entries at once kimai might take a while to produce the required output i suggest a loading icon somewhere on the main screen to inform the user that kimai is still working | 0 |
143,994 | 11,590,668,313 | IssuesEvent | 2020-02-24 07:30:11 | ubtue/DatenProbleme | https://api.github.com/repos/ubtue/DatenProbleme | closed | ISSN 1741-3095 Punishment & Society Sprachcode tag 041 | Zotero_AUTO_RSS high priority ready for testing | In the test data the language code is not converted correctly: 'en' is output instead of 'eng'.
This error is already present in the data we delivered, at least when running a test on nu.
PPN on the test server is 3537338139 | 1.0 | ISSN 1741-3095 Punishment & Society Sprachcode tag 041 - In the test data the language code is not converted correctly: 'en' is output instead of 'eng'.
This error is already present in the data we delivered, at least when running a test on nu.
PPN on the test server is 3537338139 | non_main | issn punishment society sprachcode tag in the test data the language code is not converted correctly en is output instead of eng this error is already present in the data we delivered at least when running a test on nu ppn on the test server is | 0 |
5,181 | 26,375,830,062 | IssuesEvent | 2023-01-12 02:17:57 | aws/aws-sam-cli | https://api.github.com/repos/aws/aws-sam-cli | closed | Include cfn-python-lint checks into sam validate | type/feature area/validate maintainer/need-response | ### Describe your idea/feature/enhancement
Currently `sam validate` only catches a very small subset of problems with SAM templates. For example it doesn't even catch when the `AWS::Serverless-2016-10-31` transformation isn't specified in the template. It'd be great if users could be more confident that a template works if it passed `sam validate` successfully.
### Proposal
For better validation [`cfn-python-lint`](https://github.com/awslabs/cfn-python-lint) comes to mind, which already does a great job detecting invalid CloudFormation and SAM templates. It'd be great if the checks done by `cfn-python-lint` would be included into `sam validate` as well. As both are implemented in Python it's probably straight forward to add such support.
The idea is not new and has already been suggested by @jfuss in https://github.com/awslabs/aws-sam-cli/issues/686#issuecomment-450193749 (and probably others somewhere else). @heitorlessa just suggested in Slack to track it properly with such a feature request. So here it is. :slightly_smiling_face: | True | Include cfn-python-lint checks into sam validate - ### Describe your idea/feature/enhancement
Currently `sam validate` only catches a very small subset of problems with SAM templates. For example it doesn't even catch when the `AWS::Serverless-2016-10-31` transformation isn't specified in the template. It'd be great if users could be more confident that a template works if it passed `sam validate` successfully.
### Proposal
For better validation [`cfn-python-lint`](https://github.com/awslabs/cfn-python-lint) comes to mind, which already does a great job detecting invalid CloudFormation and SAM templates. It'd be great if the checks done by `cfn-python-lint` would be included into `sam validate` as well. As both are implemented in Python it's probably straight forward to add such support.
The idea is not new and has already been suggested by @jfuss in https://github.com/awslabs/aws-sam-cli/issues/686#issuecomment-450193749 (and probably others somewhere else). @heitorlessa just suggested in Slack to track it properly with such a feature request. So here it is. :slightly_smiling_face: | main | include cfn python lint checks into sam validate describe your idea feature enhancement currently sam validate only catches a very small subset of problems with sam templates for example it doesn t even catch when the aws serverless transformation isn t specified in the template it d be great if users could be more confident that a template works if it passed sam validate successfully proposal for better validation comes to mind which already does a great job detecting invalid cloudformation and sam templates it d be great if the checks done by cfn python lint would be included into sam validate as well as both are implemented in python it s probably straight forward to add such support the idea is not new and has already been suggested by jfuss in and probably others somewhere else heitorlessa just suggested in slack to track it properly with such a feature request so here it is slightly smiling face | 1 |
1,288 | 5,448,199,801 | IssuesEvent | 2017-03-07 15:22:33 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | S3 - Delete removes entire bucket instead of provided object | affects_2.0 aws bug_report cloud waiting_on_maintainer | <!--- Verify first that your issue/request is not already reported in GitHub -->
##### ISSUE TYPE
<!--- Pick one below and delete the rest: -->
- Bug Report
##### COMPONENT NAME
<!--- Name of the plugin/module/task -->
ansible/ansible-modules-core/amazon/s3.py
##### ANSIBLE VERSION
<!--- Paste verbatim output from “ansible --version” between quotes below -->
```
ansible 2.0.1.0
```
##### CONFIGURATION
<!---
Mention any settings you have changed/added/removed in ansible.cfg
(or using the ANSIBLE_* environment variables).
-->
No changes in ansible.cfg
##### OS / ENVIRONMENT
ami-c481fad3
Amazon Linux AMI 2016.09.0 was released on 2016-09-27
##### SUMMARY
<!--- Explain the problem briefly -->
The s3 module when used with the mode: delete, it is expected to throw an error when an object is defined. Instead it deletes the entire bucket instead of the object.
##### STEPS TO REPRODUCE
<!---
For bugs, show exactly how to reproduce the problem.
For new features, show how the feature would be used.
-->
<!--- Paste example playbooks or commands between quotes below -->
```
- name: "Delete file from S3"
local_action:
module: s3
mode: delete
bucket: "files-us-east-1"
object: "/env/stage/10/backup-10"
```
<!--- You can also paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- What did you expect to happen when running the steps above? -->
Throw an error.
##### ACTUAL RESULTS
<!--- What actually happened? If possible run with extra verbosity (-vvvv) -->
It deletes the entire bucket.
<!--- Paste verbatim command output between quotes below -->
```
TASK [Delete file from S3] *****************************************
task path: /vagrant/aws/local-delete-s3-bucket.yml:16
ESTABLISH LOCAL CONNECTION FOR USER: vagrant
localhost EXEC /bin/sh -c '( umask 22 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1477559860.67-160911494850295 `" && echo "` echo $HOME/.ansible/tmp/ansible-tmp-1477559860.67-160911494850295 `" )'
localhost PUT /tmp/tmpHbvO9F TO /home/vagrant/.ansible/tmp/ansible-tmp-1477559860.67-160911494850295/s3
localhost EXEC /bin/sh -c 'LANG=en_GB.UTF-8 LC_ALL=en_GB.UTF-8 LC_MESSAGES=en_GB.UTF-8 /usr/local/prog/apps/ansible2/bin/python2.6 /home/vagrant/.ansible/tmp/ansible-tmp-1477559860.67-160911494850295/s3; rm -rf "/home/vagrant/.ansible/tmp/ansible-tmp-1477559860.67-160911494850295/" > /dev/null 2>&1'
changed: [localhost -> localhost] => {"changed": true, "invocation": {"module_args": {"aws_access_key": null, "aws_secret_key": null, "bucket": "files-us-east-1", "dest": null, "ec2_url": null, "encrypt": true, "expiry": 600, "headers": null, "marker": null, "max_keys": 1000, "metadata": null, "mode": "delete", "object": "/env/stage/10/backup-10", "overwrite": "always", "permission": ["private"], "prefix": null, "profile": null, "region": null, "retries": 0, "s3_url": null, "security_token": null, "src": null, "validate_certs": true, "version": null}, "module_name": "s3"}, "msg": "Bucket files-us-east-1 and all keys have been deleted."}
```
| True | S3 - Delete removes entire bucket instead of provided object - <!--- Verify first that your issue/request is not already reported in GitHub -->
##### ISSUE TYPE
<!--- Pick one below and delete the rest: -->
- Bug Report
##### COMPONENT NAME
<!--- Name of the plugin/module/task -->
ansible/ansible-modules-core/amazon/s3.py
##### ANSIBLE VERSION
<!--- Paste verbatim output from “ansible --version” between quotes below -->
```
ansible 2.0.1.0
```
##### CONFIGURATION
<!---
Mention any settings you have changed/added/removed in ansible.cfg
(or using the ANSIBLE_* environment variables).
-->
No changes in ansible.cfg
##### OS / ENVIRONMENT
ami-c481fad3
Amazon Linux AMI 2016.09.0 was released on 2016-09-27
##### SUMMARY
<!--- Explain the problem briefly -->
The s3 module when used with the mode: delete, it is expected to throw an error when an object is defined. Instead it deletes the entire bucket instead of the object.
##### STEPS TO REPRODUCE
<!---
For bugs, show exactly how to reproduce the problem.
For new features, show how the feature would be used.
-->
<!--- Paste example playbooks or commands between quotes below -->
```
- name: "Delete file from S3"
local_action:
module: s3
mode: delete
bucket: "files-us-east-1"
object: "/env/stage/10/backup-10"
```
<!--- You can also paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- What did you expect to happen when running the steps above? -->
Throw an error.
##### ACTUAL RESULTS
<!--- What actually happened? If possible run with extra verbosity (-vvvv) -->
It deletes the entire bucket.
<!--- Paste verbatim command output between quotes below -->
```
TASK [Delete file from S3] *****************************************
task path: /vagrant/aws/local-delete-s3-bucket.yml:16
ESTABLISH LOCAL CONNECTION FOR USER: vagrant
localhost EXEC /bin/sh -c '( umask 22 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1477559860.67-160911494850295 `" && echo "` echo $HOME/.ansible/tmp/ansible-tmp-1477559860.67-160911494850295 `" )'
localhost PUT /tmp/tmpHbvO9F TO /home/vagrant/.ansible/tmp/ansible-tmp-1477559860.67-160911494850295/s3
localhost EXEC /bin/sh -c 'LANG=en_GB.UTF-8 LC_ALL=en_GB.UTF-8 LC_MESSAGES=en_GB.UTF-8 /usr/local/prog/apps/ansible2/bin/python2.6 /home/vagrant/.ansible/tmp/ansible-tmp-1477559860.67-160911494850295/s3; rm -rf "/home/vagrant/.ansible/tmp/ansible-tmp-1477559860.67-160911494850295/" > /dev/null 2>&1'
changed: [localhost -> localhost] => {"changed": true, "invocation": {"module_args": {"aws_access_key": null, "aws_secret_key": null, "bucket": "files-us-east-1", "dest": null, "ec2_url": null, "encrypt": true, "expiry": 600, "headers": null, "marker": null, "max_keys": 1000, "metadata": null, "mode": "delete", "object": "/env/stage/10/backup-10", "overwrite": "always", "permission": ["private"], "prefix": null, "profile": null, "region": null, "retries": 0, "s3_url": null, "security_token": null, "src": null, "validate_certs": true, "version": null}, "module_name": "s3"}, "msg": "Bucket files-us-east-1 and all keys have been deleted."}
```
| main | delete removes entire bucket instead of provided object issue type bug report component name ansible ansible modules core amazon py ansible version ansible configuration mention any settings you have changed added removed in ansible cfg or using the ansible environment variables no changes in ansible cfg os environment ami amazon linux ami was released on summary the module when used with the mode delete it is expected to throw an error when an object is defined instead it deletes the entire bucket instead of the object steps to reproduce for bugs show exactly how to reproduce the problem for new features show how the feature would be used name delete file from local action module mode delete bucket files us east object env stage backup expected results throw an error actual results it deletes the entire bucket task task path vagrant aws local delete bucket yml establish local connection for user vagrant localhost exec bin sh c umask mkdir p echo home ansible tmp ansible tmp echo echo home ansible tmp ansible tmp localhost put tmp to home vagrant ansible tmp ansible tmp localhost exec bin sh c lang en gb utf lc all en gb utf lc messages en gb utf usr local prog apps bin home vagrant ansible tmp ansible tmp rm rf home vagrant ansible tmp ansible tmp dev null changed changed true invocation module args aws access key null aws secret key null bucket files us east dest null url null encrypt true expiry headers null marker null max keys metadata null mode delete object env stage backup overwrite always permission prefix null profile null region null retries url null security token null src null validate certs true version null module name msg bucket files us east and all keys have been deleted | 1 |
763,569 | 26,763,257,498 | IssuesEvent | 2023-01-31 08:49:28 | War-Brokers/War-Brokers | https://api.github.com/repos/War-Brokers/War-Brokers | opened | Do not show "localhost" button when pressing the ctrl key | priority:3 - low type:suggestion area:UI/UX | Make the keybind for it a bit more difficult | 1.0 | Do not show "localhost" button when pressing the ctrl key - Make the keybind for it a bit more difficult | non_main | do not show localhost button when pressing the ctrl key make the keybind for it a bit more difficult | 0 |
218,045 | 7,330,349,343 | IssuesEvent | 2018-03-05 09:38:11 | NCEAS/metacat | https://api.github.com/repos/NCEAS/metacat | closed | Registry: Back button erases all the data | Category: registry Component: Bugzilla-Id Priority: Immediate Status: Resolved Tracker: Bug | ---
Author Name: **Saurabh Garg** (Saurabh Garg)
Original Redmine Issue: 1326, https://projects.ecoinformatics.org/ecoinfo/issues/1326
Original Date: 2004-02-05
Original Assignee: Saurabh Garg
---
Instead of using the back button, there could be a button on the error page
which does the same work as 'No, I want to edit the document' on the confirm
page. I am assigning P1 to this as it would prevent a lot of effort for people
using browsers which do not store values. Also it is useful in cases like dynamic
taxon fields.
Andrea's comments:
This was already mentioned to you at one of the meetings this week, but in case
you didn’t record it: After entering information into the form at the NCEAS
Data Repository web page, and clicking “Submit Entry”, I was informed that I
missed some required information and should go back. I went back (using my
browser’s Back button), and all the information I had already entered had
disappeared
| 1.0 | Registry: Back button erases all the data - ---
Author Name: **Saurabh Garg** (Saurabh Garg)
Original Redmine Issue: 1326, https://projects.ecoinformatics.org/ecoinfo/issues/1326
Original Date: 2004-02-05
Original Assignee: Saurabh Garg
---
Instead of using the back button, there could be a button on the error page
which does the same work as 'No, I want to edit the document' on the confirm
page. I am assigning P1 to this as it would prevent a lot of effort for people
using browsers which do not store values. Also it is useful in cases like dynamic
taxon fields.
Andrea's comments:
This was already mentioned to you at one of the meetings this week, but in case
you didn’t record it: After entering information into the form at the NCEAS
Data Repository web page, and clicking “Submit Entry”, I was informed that I
missed some required information and should go back. I went back (using my
browser’s Back button), and all the information I had already entered had
disappeared
| non_main | registry back button erases all the data author name saurabh garg saurabh garg original redmine issue original date original assignee saurabh garg instead of using the back button there could be a button on the error page which does the same work as no i want to edit the document on the confirm page i am assigning to this as it would prevent a lot of effort for people using browsers which donot store value also it is useful in cases like dynamic taxon fields andrea s comments this was already mentioned to you at one of the meetings this week but in case you didn’t record it after entering information into the form at the nceas data repository web page and clicking “submit entry” i was informed that i missed some required information and should go back i went back using my browser’s back button and all the information i had already entered had disappeared | 0 |
294,321 | 9,022,215,084 | IssuesEvent | 2019-02-07 00:31:47 | metabase/metabase | https://api.github.com/repos/metabase/metabase | closed | Missing date format parameter when adding date field filter to custom sql | Bug Dashboards Database/Crate Parameters/Variables Priority/P3 | Database: Crate
Metabase: 0.25.2
I created the following custom sql question:
```sql
select * from test where {{Date}}
```
Date is set as a **field filter**. Widget set to **Date Filter** but I tested with the other widgets and the same error happens.
When I try to get the answer I get the following error:
```
No value specified for parameter 3.
```
The metabase log shows the following request that has 3 declared parameters (```date_format(?, ```, ```BETWEEN ?```, ```AND ?```) in the query but only passes 2 parameters, the date format parameter is missing.
```
Sep 21 13:37:14 DEBUG metabase.query-processor.middleware.mbql-to-native :: NATIVE FORM: 😳
{:query "select * from test where date_format(?, date_trunc('day', \"doc\".\"test\".\"date\")) BETWEEN ? AND ?",
:template_tags {:Date {:id "67ee7881-53a8-a931-627c-2d756b6c88ff", :name "Date", :display_name "Date", :type "dimension", :dimension ["field-id" 787], :widget_type "date/all-options"}},
:params (#inst "2017-08-22T00:00:00.000000000-00:00" #inst "2017-09-20T00:00:00.000000000-00:00")}
``` | 1.0 | Missing date format parameter when adding date field filter to custom sql - Database: Crate
Metabase: 0.25.2
I created the following custom sql question:
```sql
select * from test where {{Date}}
```
Date is set as a **field filter**. Widget set to **Date Filter** but I tested with the other widgets and the same error happens.
When I try to get the answer I get the following error:
```
No value specified for parameter 3.
```
The metabase log shows the following request that has 3 declared parameters (```date_format(?, ```, ```BETWEEN ?```, ```AND ?```) in the query but only passes 2 parameters, the date format parameter is missing.
```
Sep 21 13:37:14 DEBUG metabase.query-processor.middleware.mbql-to-native :: NATIVE FORM: 😳
{:query "select * from test where date_format(?, date_trunc('day', \"doc\".\"test\".\"date\")) BETWEEN ? AND ?",
:template_tags {:Date {:id "67ee7881-53a8-a931-627c-2d756b6c88ff", :name "Date", :display_name "Date", :type "dimension", :dimension ["field-id" 787], :widget_type "date/all-options"}},
:params (#inst "2017-08-22T00:00:00.000000000-00:00" #inst "2017-09-20T00:00:00.000000000-00:00")}
``` | non_main | missing date format parameter when adding date field filter to custom sql database crate metabase i created the following custom sql question sql select from test where date date is set as a field filter widget set to date filter but i tested with the other widgets and the same error happens when i try to get the answer i get the following error no value specified for parameter the metabase log shows the following request that has declared parameters date format between and in the query but only passes parameters the date format parameter is missing sep debug metabase query processor middleware mbql to native native form 😳 query select from test where date format date trunc day doc test date between and template tags date id name date display name date type dimension dimension widget type date all options params inst inst | 0 |
466,892 | 13,436,448,354 | IssuesEvent | 2020-09-07 14:23:53 | magento/adobe-stock-integration | https://api.github.com/repos/magento/adobe-stock-integration | closed | Remove generated renditions when the asset is deleted | Priority: P2 Progress: dev in progress Severity: S2 | ### Steps to reproduce (*)
1. Upload an asset of high resolution to Media Gallery
2. Delete an asset from Media Gallery
3. Check pub/media/.renditions directory
### Expected result (*)
The rendition for the deleted asset is still preset
### Actual result (*)
The rendition for the deleted asset should be deleted too
### Additionally
The renditions should be also removed when the asset is removed by running `bin/magento media-gallery:sync` command
| 1.0 | Remove generated renditions when the asset is deleted - ### Steps to reproduce (*)
1. Upload an asset of high resolution to Media Gallery
2. Delete an asset from Media Gallery
3. Check pub/media/.renditions directory
### Expected result (*)
The rendition for the deleted asset is still preset
### Actual result (*)
The rendition for the deleted asset should be deleted too
### Additionally
The renditions should be also removed when the asset is removed by running `bin/magento media-gallery:sync` command
| non_main | remove generated renditions when the asset is deleted steps to reproduce upload an asset of high resolution to media gallery delete an asset from media gallery check pub media renditions directory expected result the rendition for the deleted asset is still preset actual result the rendition for the deleted asset should be deleted too additionally the renditions should be also removed when the asset is removed by running bin magento media gallery sync command | 0 |
84,986 | 16,585,442,187 | IssuesEvent | 2021-05-31 18:27:41 | fossasia/open-event-frontend | https://api.github.com/repos/fossasia/open-event-frontend | closed | Organizer Session Overview: Re-add rating columns to all session tables | bug codeheat enhancement | The organizer session overview page does not have the column for ratings anymore. Please re-add rating columns to all session tables and ensure the feature works.
The feature was originally implemented here https://github.com/fossasia/open-event-frontend/pull/3198

| 1.0 | Organizer Session Overview: Re-add rating columns to all session tables - The organizer session overview page does not have the column for ratings anymore. Please re-add rating columns to all session tables and ensure the feature works.
The feature was originally implemented here https://github.com/fossasia/open-event-frontend/pull/3198

| non_main | organizer session overview re add rating columns to all session tables the organizer session overview page does not have the column for ratings anymore please re add rating columns to all session tables and ensure the feature works the feature was originally implemented here | 0 |
11,298 | 3,482,162,495 | IssuesEvent | 2015-12-29 21:14:08 | APY/APYDataGridBundle | https://api.github.com/repos/APY/APYDataGridBundle | closed | @TODO List | Documentation Enhancement Feature Helper | * <del>Date filter #41</del>
* <del>Export data #31</del> (v2)
* <del>New documentation structure + update</del> (v2)
* <del>Add dql function support on non-mapped fields</del>
* <del>Add number of results</del> and show the current range of result (v2)
* <del>Full php configuration #44</del> (v2)
* Yaml configuration #44 + #4
* <del>Xml configuration #44</del>
* <del>Implement all options on Document source #32 + others</del> (v2)
* Implement Propel source #86
* <del>Reset filters #36</del> (v2)
* <del>Custom/combo filter #28</del>
* <del>Prevent SQL injection ? #73</del>
* <del>New branch for Symfony 2.1</del> (v2) v2.0 works with Symfony 2.1.0 too
* <del>Lazy loading to add options in any order</del> (v2)
* <del>Manipulate the default action column</del> (v2)
* <del>Add role control on massAction, rowAction, Export</del> (v2.1)
* <del>Add sourceselect support for the vector source [#113]</del> (v2)
* <del>Add XMLHttpRequest full support [#130]</del>
* <del>searchOnClick feature</del> (v2.1)
* <del>Fix #201</del>
* Add footer aggregate information (avg, count, ...) [#207]
* Add the number of results corresponding of a value in a select filter. (See the genre filter on http://dj.beatport.com/events)
* <del>[Filter] Add radio and checkbox filters.</del>
* <del>Add an attribute to turn off the escape of the value of a cell. [#257]</del>
* Merge rowAction and massAction [#22]
* Add cache support for result/page [#417]
* Add ajax refresh after row action (add a param)
* Add request parameters mapping (i.e. grid_id[_page] = $page) - Add a converter callback($name, $value) for encode parameter | 1.0 | @TODO List - * <del>Date filter #41</del>
* <del>Export data #31</del> (v2)
* <del>New documentation structure + update</del> (v2)
* <del>Add dql function support on non-mapped fields</del>
* <del>Add number of results</del> and show the current range of result (v2)
* <del>Full php configuration #44</del> (v2)
* Yaml configuration #44 + #4
* <del>Xml configuration #44</del>
* <del>Implement all options on Document source #32 + others</del> (v2)
* Implement Propel source #86
* <del>Reset filters #36</del> (v2)
* <del>Custom/combo filter #28</del>
* <del>Prevent SQL injection ? #73</del>
* <del>New branch for Symfony 2.1</del> (v2) v2.0 works with Symfony 2.1.0 too
* <del>Lazy loading to add options in any order</del> (v2)
* <del>Manipulate the default action column</del> (v2)
* <del>Add role control on massAction, rowAction, Export</del> (v2.1)
* <del>Add sourceselect support for the vector source [#113]</del> (v2)
* <del>Add XMLHttpRequest full support [#130]</del>
* <del>searchOnClick feature</del> (v2.1)
* <del>Fix #201</del>
* Add footer aggregate information (avg, count, ...) [#207]
* Add the number of results corresponding of a value in a select filter. (See the genre filter on http://dj.beatport.com/events)
* <del>[Filter] Add radio and checkbox filters.</del>
* <del>Add an attribute to turn off the escape of the value of a cell. [#257]</del>
* Merge rowAction and massAction [#22]
* Add cache support for result/page [#417]
* Add ajax refresh after row action (add a param)
* Add request parameters mapping (i.e. grid_id[_page] = $page) - Add a converter callback($name, $value) for encode parameter | non_main | todo list date filter export data new documentation structure update add dql function support on non mapped fields add number of results and show the current range of result full php configuration yaml configuration xml configuration implement all options on document source others implement propel source reset filters custom combo filter prevent sql injection new branch for symfony works with symfony too lazy loading to add options in any order manipulate the default action column add role control on massaction rowaction export add sourceselect support for the vector source add xmlhttprequest full support searchonclick feature fix add footer aggregate informations avg count add the number of results corresponding of a value in a select filter see the genre filter on add radio and checkbox filters add an attribute to turn off the escape of the value of a cell merge rowaction and massaction add cache support for result page add ajax refresh after row action add a param add request parameters mapping i e grid id page add a converter callback name value for encode parameter | 0 |
2,369 | 8,475,190,868 | IssuesEvent | 2018-10-24 18:13:58 | dgets/d4m0Turtle | https://api.github.com/repos/dgets/d4m0Turtle | opened | Throw logging into new code | enhancement good first issue help wanted maintainability | A bit of new code has been introduced into `d4m0Turtle.py` that needs to have logging added in order to obtain proper gameplay analysis. Can't do much without being able to figure out easily what logical branch the execution went down at any one point. | True | Throw logging into new code - A bit of new code has been introduced into `d4m0Turtle.py` that needs to have logging added in order to obtain proper gameplay analysis. Can't do much without being able to figure out easily what logical branch the execution went down at any one point. | main | throw logging into new code a bit of new code has been introduced into py that needs to have logging added in order to obtain proper gameplay analysis can t do much without being able to figure out easily what logical branch the execution went down at any one point | 1 |
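The issue above asks for logging so that gameplay analysis can tell which logical branch execution went down. A minimal sketch of that idea using Python's standard `logging` module — the function name `choose_move` and its branch conditions are hypothetical illustrations, not taken from the actual `d4m0Turtle.py`:

```python
import logging

logging.basicConfig(level=logging.DEBUG, format="%(levelname)s %(message)s")
log = logging.getLogger("d4m0Turtle")

def choose_move(score: int) -> str:
    """Pick a move and log which logical branch was taken."""
    if score > 10:
        log.debug("branch: aggressive (score=%d)", score)
        return "aggressive"
    if score > 0:
        log.debug("branch: cautious (score=%d)", score)
        return "cautious"
    log.debug("branch: defensive (score=%d)", score)
    return "defensive"
```

Each decision point emits a DEBUG line, so the log of a run shows exactly which branch fired and with what inputs.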
438,597 | 12,641,687,661 | IssuesEvent | 2020-06-16 06:45:09 | wso2/carbon-apimgt | https://api.github.com/repos/wso2/carbon-apimgt | opened | Null pointer Exception throws when importing an OAS definition without scopes | Priority/Normal Type/Bug | ### Description:
When importing an OAS definition using the publisher portal or apictl, the product throws a null pointer exception when the given OAS definition doesn't contain any scopes.
```
[2020-06-16 11:51:10,835] ERROR - GlobalThrowableMapper An unknown exception has been captured by the global exception mapper.
java.lang.NullPointerException: null
at org.wso2.carbon.apimgt.impl.definitions.OAS3Parser.injectOtherScopesToDefaultScheme_aroundBody88(OAS3Parser.java:1399) ~[org.wso2.carbon.apimgt.impl_6.7.10.SNAPSHOT.jar:?]
at org.wso2.carbon.apimgt.impl.definitions.OAS3Parser.injectOtherScopesToDefaultScheme(OAS3Parser.java:1345) ~[org.wso2.carbon.apimgt.impl_6.7.10.SNAPSHOT.jar:?]
```
This occurs because there are no checks on whether _Components_ and _securityDefinitions_ are null in the given definition.
### Steps to reproduce:
1. Run publisher portal.
2. Import an API using "Create new api > Existing rest API option"
3. Provide openAPI or swagger definition
4. click create
### Affected Product Version:
3.2.0, 3.1.0
### Environment details (with versions):
- OS:
- Client:
- Env (Docker/K8s):
--
#### Suggested Labels:
<!--Only to be used by non-members-->
#### Suggested Assignees:
Chamindu36 | 1.0 | Null pointer Exception throws when importing an OAS definition without scopes - ### Description:
When importing an OAS definition using the publisher portal or apictl, the product throws a null pointer exception when the given OAS definition doesn't contain any scopes.
```
[2020-06-16 11:51:10,835] ERROR - GlobalThrowableMapper An unknown exception has been captured by the global exception mapper.
java.lang.NullPointerException: null
at org.wso2.carbon.apimgt.impl.definitions.OAS3Parser.injectOtherScopesToDefaultScheme_aroundBody88(OAS3Parser.java:1399) ~[org.wso2.carbon.apimgt.impl_6.7.10.SNAPSHOT.jar:?]
at org.wso2.carbon.apimgt.impl.definitions.OAS3Parser.injectOtherScopesToDefaultScheme(OAS3Parser.java:1345) ~[org.wso2.carbon.apimgt.impl_6.7.10.SNAPSHOT.jar:?]
```
This occurs because there are no checks on whether _Components_ and _securityDefinitions_ are null in the given definition.
### Steps to reproduce:
1. Run publisher portal.
2. Import an API using "Create new api > Existing rest API option"
3. Provide openAPI or swagger definition
4. click create
### Affected Product Version:
3.2.0, 3.1.0
### Environment details (with versions):
- OS:
- Client:
- Env (Docker/K8s):
--
#### Suggested Labels:
<!--Only to be used by non-members-->
#### Suggested Assignees:
Chamindu36 | non_main | null pointer exception throws when importing an oas definition without scopes description when importing an oas definition using publisher portal or apictl product throws a null pointer exception when the given oas definition doesn t contain any scopes error globalthrowablemapper an unknown exception has been captured by the global exception mapper java lang nullpointerexception null at org carbon apimgt impl definitions injectotherscopestodefaultscheme java at org carbon apimgt impl definitions injectotherscopestodefaultscheme java this occurs due to lack of check conditions to check whether components and securitydefinitions are null or not in the given definition steps to reproduce run publisher portal import an api using create new api existing rest api option provide openapi or swagger definition click create affected product version environment details with versions os client env docker suggested labels suggested assignees | 0 |
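The stack trace above points at dereferencing _Components_ and _securityDefinitions_ without null checks. The actual fix belongs in the project's Java `OAS3Parser`; the Python sketch below only illustrates the defensive pattern of treating each missing level as empty instead of dereferencing it:

```python
def collect_security_scopes(openapi: dict) -> dict:
    """Return scheme -> scopes, tolerating definitions with no components/schemes."""
    # Each `or {}` guards against a missing or null level, which is exactly
    # the kind of gap that produced the NullPointerException above.
    components = openapi.get("components") or {}
    schemes = components.get("securitySchemes") or {}
    scopes = {}
    for name, scheme in schemes.items():
        flows = scheme.get("flows") or {}
        implicit = flows.get("implicit") or {}
        scopes[name] = implicit.get("scopes") or {}
    return scopes
```

With the guards in place, a definition with no `components` block simply yields an empty result instead of crashing.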
306,501 | 26,473,897,162 | IssuesEvent | 2023-01-17 09:40:42 | mantidproject/mantid | https://api.github.com/repos/mantidproject/mantid | closed | Manual Testing Documentation | Manual Tests | You have been assigned manual testing. The hope is to catch as many problems with the code before release, so it would be great if you can take some time to give a serious test to your assigned area. Thank you!!
The general guide to manual testing:
* The tests must be performed on the installer versions of the final release candidate. Not on local compiled code.
* Serious errors involving loss of functionality, crashes etc. should be raised
as issues with the current release as a milestone and an email sent to the project manager immediately.
* Minor and cosmetic issues should be raised as issues against the forthcoming
releases.
* First try things that should work, then try to break Mantid, e.g. entering invalid values, unexpected characters etc.
* Don't spend more than a few hours on the testing as fatigue will kick in.
* If you find errors in the documentation, please correct them.
* Comment against this ticket the OS environment you are testing against.
* Close this issue once you are done.
### Specific Notes:
Check Online Docs and the Qt-help docs built into MantidWorkbench (from the help dropdown menu)
* Algorithm, fit, concept and api pages should be generated
* Algorithm dialog snapshots should appear on algorithm pages in offline help
* Math formulae should appear on algorithm pages in offline help
* workflow diagrams should appear on algorithm pages in offline help
You may wish to use this script ( [OpenMostDocumentationForTesting.py](https://github.com/mantidproject/mantid/blob/master/tools/scripts/OpenMostDocumentationForTesting.py) ) to open all the online docs pages! | 1.0 | Manual Testing Documentation - You have been assigned manual testing. The hope is to catch as many problems with the code before release, so it would be great if you can take some time to give a serious test to your assigned area. Thank you!!
The general guide to manual testing:
* The tests must be performed on the installer versions of the final release candidate. Not on local compiled code.
* Serious errors involving loss of functionality, crashes etc. should be raised
as issues with the current release as a milestone and an email sent to the project manager immediately.
* Minor and cosmetic issues should be raised as issues against the forthcoming
releases.
* First try things that should work, then try to break Mantid, e.g. entering invalid values, unexpected characters etc.
* Don't spend more than a few hours on the testing as fatigue will kick in.
* If you find errors in the documentation, please correct them.
* Comment against this ticket the OS environment you are testing against.
* Close the this issue once you are done.
### Specific Notes:
Check Online Docs and the Qt-help docs built into MantidWorkbench (from the help droppdown menu)
* Algorithm, fit, concept and api pages should be generated
* Algorithm dialog snapshots should appear on algorithm pages in offline help
* Math formulae should appear on algorithm pages in offline help
* workflow diagrams should appear on algorithm pages in offline help
You may wish to use this script ( [OpenMostDocumentationForTesting.py](https://github.com/mantidproject/mantid/blob/master/tools/scripts/OpenMostDocumentationForTesting.py) ) to open all the online docs pages! | non_main | manual testing documentation you have been assigned manual testing the hope is to catch as many problems with the code before release so it would be great if you can take some time to give a serious test to your assigned area thank you the general guide to manual testing the tests must be performed on the installer versions of the final release candidate not on local compiled code serious errors involving loss of functionality crashes etc should be raised as issues with the current release as a milestone and an email sent to the project manager immediately minor and cosmetic issues should be raised as issues against the forthcoming releases first try things that should work then try to break mantid e g entering invalid values unexpected characters etc don t spend more than a few hours on the testing as fatigue will kick in if you find errors in the documentation please correct them comment against this ticket the os environment you are testing against close the this issue once you are done specific notes check online docs and the qt help docs built into mantidworkbench from the help droppdown menu algorithm fit concept and api pages should be generated algorithm dialog snapshots should appear on algorithm pages in offline help math formulae should appear on algorithm pages in offline help workflow diagrams should appear on algorithm pages in offline help you may wish to use this script to open all the online docs pages | 0 |
2,433 | 8,621,354,938 | IssuesEvent | 2018-11-20 17:07:36 | RalfKoban/MiKo-Analyzers | https://api.github.com/repos/RalfKoban/MiKo-Analyzers | closed | Namespace hierarchy should not be too deep | Area: analyzer Area: maintainability feasability unclear feature in progress | At most a namespace may have a depth of `10` (ten).
Example:
Namespace `A.B.C.D.E` has a depth of `5`.
The reasoning behind: If the namespace depth is too deep, then it's probably too concrete and can be simplified. | True | Namespace hierarchy should not be too deep - At most a namespace may have a depth of `10` (ten).
Example:
Namespace `A.B.C.D.E` has a depth of `5`.
The reasoning behind: If the namespace depth is too deep, then it's probably too concrete and can be simplified. | main | namespace hierarchy should not be too deep at most a namespace may have a depth of ten example namespace a b c d e has a depth of the reasoning behind if the namespace depth is too deep then it s probably too concrete and can be simplified | 1 |
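The rule above ("namespace `A.B.C.D.E` has a depth of `5`", limit `10`) is a simple segment count. The analyzer itself is a Roslyn (C#) analyzer; the Python sketch below just shows the counting rule:

```python
MAX_NAMESPACE_DEPTH = 10  # the limit proposed in the issue

def namespace_depth(namespace: str) -> int:
    """Depth = number of dot-separated segments, e.g. 'A.B.C.D.E' -> 5."""
    return len(namespace.split(".")) if namespace else 0

def too_deep(namespace: str) -> bool:
    return namespace_depth(namespace) > MAX_NAMESPACE_DEPTH
```

For example, `namespace_depth("A.B.C.D.E")` is `5`, which is within the proposed limit of `10`.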
186,942 | 14,426,868,406 | IssuesEvent | 2020-12-06 00:28:36 | kalexmills/github-vet-tests-dec2020 | https://api.github.com/repos/kalexmills/github-vet-tests-dec2020 | closed | fabric8io/configmapcontroller: vendor/github.com/openshift/origin/test/integration/template_test.go; 64 LoC | fresh medium test vendored |
Found a possible issue in [fabric8io/configmapcontroller](https://www.github.com/fabric8io/configmapcontroller) at [vendor/github.com/openshift/origin/test/integration/template_test.go](https://github.com/fabric8io/configmapcontroller/blob/3fc50a70ba68517ddf7c8fdad713018b01cdfcbe/vendor/github.com/openshift/origin/test/integration/template_test.go#L27-L90)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> reference to version is reassigned at line 32
[Click here to see the code in its original context.](https://github.com/fabric8io/configmapcontroller/blob/3fc50a70ba68517ddf7c8fdad713018b01cdfcbe/vendor/github.com/openshift/origin/test/integration/template_test.go#L27-L90)
<details>
<summary>Click here to show the 64 line(s) of Go which triggered the analyzer.</summary>
```go
for _, version := range []unversioned.GroupVersion{v1.SchemeGroupVersion} {
config, err := testutil.GetClusterAdminClientConfig(path)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
config.GroupVersion = &version
c, err := client.New(config)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
template := &templateapi.Template{
Parameters: []templateapi.Parameter{
{
Name: "NAME",
Value: "test",
},
},
}
templateObjects := []runtime.Object{
&v1.Service{
ObjectMeta: v1.ObjectMeta{
Name: "${NAME}-tester",
Namespace: "somevalue",
},
Spec: v1.ServiceSpec{
ClusterIP: "1.2.3.4",
SessionAffinity: "some-bad-${VALUE}",
},
},
}
templateapi.AddObjectsToTemplate(template, templateObjects, v1.SchemeGroupVersion)
obj, err := c.TemplateConfigs("default").Create(template)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if len(obj.Objects) != 1 {
t.Fatalf("unexpected object: %#v", obj)
}
if err := runtime.DecodeList(obj.Objects, runtime.UnstructuredJSONScheme); err != nil {
t.Fatalf("unexpected error: %v", err)
}
svc := obj.Objects[0].(*runtime.Unstructured).Object
spec := svc["spec"].(map[string]interface{})
meta := svc["metadata"].(map[string]interface{})
// keep existing values
if spec["clusterIP"] != "1.2.3.4" {
t.Fatalf("unexpected object: %#v", svc)
}
// replace a value
if meta["name"] != "test-tester" {
t.Fatalf("unexpected object: %#v", svc)
}
// clear namespace
if meta["namespace"] != "" {
t.Fatalf("unexpected object: %#v", svc)
}
// preserve values exactly
if spec["sessionAffinity"] != "some-bad-${VALUE}" {
t.Fatalf("unexpected object: %#v", svc)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 3fc50a70ba68517ddf7c8fdad713018b01cdfcbe
| 1.0 | fabric8io/configmapcontroller: vendor/github.com/openshift/origin/test/integration/template_test.go; 64 LoC -
Found a possible issue in [fabric8io/configmapcontroller](https://www.github.com/fabric8io/configmapcontroller) at [vendor/github.com/openshift/origin/test/integration/template_test.go](https://github.com/fabric8io/configmapcontroller/blob/3fc50a70ba68517ddf7c8fdad713018b01cdfcbe/vendor/github.com/openshift/origin/test/integration/template_test.go#L27-L90)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> reference to version is reassigned at line 32
[Click here to see the code in its original context.](https://github.com/fabric8io/configmapcontroller/blob/3fc50a70ba68517ddf7c8fdad713018b01cdfcbe/vendor/github.com/openshift/origin/test/integration/template_test.go#L27-L90)
<details>
<summary>Click here to show the 64 line(s) of Go which triggered the analyzer.</summary>
```go
for _, version := range []unversioned.GroupVersion{v1.SchemeGroupVersion} {
config, err := testutil.GetClusterAdminClientConfig(path)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
config.GroupVersion = &version
c, err := client.New(config)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
template := &templateapi.Template{
Parameters: []templateapi.Parameter{
{
Name: "NAME",
Value: "test",
},
},
}
templateObjects := []runtime.Object{
&v1.Service{
ObjectMeta: v1.ObjectMeta{
Name: "${NAME}-tester",
Namespace: "somevalue",
},
Spec: v1.ServiceSpec{
ClusterIP: "1.2.3.4",
SessionAffinity: "some-bad-${VALUE}",
},
},
}
templateapi.AddObjectsToTemplate(template, templateObjects, v1.SchemeGroupVersion)
obj, err := c.TemplateConfigs("default").Create(template)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if len(obj.Objects) != 1 {
t.Fatalf("unexpected object: %#v", obj)
}
if err := runtime.DecodeList(obj.Objects, runtime.UnstructuredJSONScheme); err != nil {
t.Fatalf("unexpected error: %v", err)
}
svc := obj.Objects[0].(*runtime.Unstructured).Object
spec := svc["spec"].(map[string]interface{})
meta := svc["metadata"].(map[string]interface{})
// keep existing values
if spec["clusterIP"] != "1.2.3.4" {
t.Fatalf("unexpected object: %#v", svc)
}
// replace a value
if meta["name"] != "test-tester" {
t.Fatalf("unexpected object: %#v", svc)
}
// clear namespace
if meta["namespace"] != "" {
t.Fatalf("unexpected object: %#v", svc)
}
// preserve values exactly
if spec["sessionAffinity"] != "some-bad-${VALUE}" {
t.Fatalf("unexpected object: %#v", svc)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 3fc50a70ba68517ddf7c8fdad713018b01cdfcbe
| non_main | configmapcontroller vendor github com openshift origin test integration template test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message reference to version is reassigned at line click here to show the line s of go which triggered the analyzer go for version range unversioned groupversion schemegroupversion config err testutil getclusteradminclientconfig path if err nil t fatalf unexpected error v err config groupversion version c err client new config if err nil t fatalf unexpected error v err template templateapi template parameters templateapi parameter name name value test templateobjects runtime object service objectmeta objectmeta name name tester namespace somevalue spec servicespec clusterip sessionaffinity some bad value templateapi addobjectstotemplate template templateobjects schemegroupversion obj err c templateconfigs default create template if err nil t fatalf unexpected error v err if len obj objects t fatalf unexpected object v obj if err runtime decodelist obj objects runtime unstructuredjsonscheme err nil t fatalf unexpected error v err svc obj objects runtime unstructured object spec svc map interface meta svc map interface keep existing values if spec t fatalf unexpected object v svc replace a value if meta test tester t fatalf unexpected object v svc clear namespace if meta t fatalf unexpected object v svc preserve values exactly if spec some bad value t fatalf unexpected object v svc leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id | 0 |
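The analyzer message above ("reference to version is reassigned") flags the classic pitfall in `config.GroupVersion = &version`: in Go before 1.22, every iteration of `for _, version := range …` reuses a single variable, so each stored `&version` points at the same storage. Python closures exhibit the same class of bug — they capture the variable, not its value — which this sketch demonstrates along with the standard fix (the Go analogue is shadowing with `version := version` inside the loop body):

```python
# Buggy: every lambda closes over the one loop variable `v`, just as
# every `&version` in the Go snippet points at one shared variable.
leaky = [lambda: v for v in ("a", "b", "c")]
print([f() for f in leaky])  # ['c', 'c', 'c'] - all see the final value

# Fixed: bind the current value at each iteration via a default argument.
getters = [lambda v=v: v for v in ("a", "b", "c")]
print([f() for f in getters])  # ['a', 'b', 'c']
```

In this particular test the loop happens to run only once, so the aliasing is harmless — which is why readers are asked to classify it as a bug, mitigated, or desirable behavior.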
3,659 | 14,941,826,256 | IssuesEvent | 2021-01-25 20:20:47 | hydroshare/hydroshare | https://api.github.com/repos/hydroshare/hydroshare | closed | Unable to Maintain Frontend Libraries | Maintainability | One of the core tasks of managing sustainability for a project like this is updating libraries. On the frontend, problems are compounded by a colossal global variable scope, and entirely chained together html/template files and scripts and CSS files into a huge shared scope.
The task is to separate out the navbar, create-resource, profile-manager, and notifications-bell components into a proper, discrete scope and include them by some better means. This will allow Discover to remove temporary overrides that mitigate this issue, a version conflict with Bootstrap, and the CSS changes across Bootstrap 3 to Bootstrap 4.
Most importantly, frontend libraries and corresponding CSS files can start to be upgraded appropriately and will be easier to do so, going forward. | True | Unable to Maintain Frontend Libraries - One of the core tasks of managing sustainability for a project like this is updating libraries. On the frontend, problems are compounded by a colossal global variable scope, and entirely chained together html/template files and scripts and CSS files into a huge shared scope.
The task is to separate out the navbar, create-resource, profile-manager, and notifications-bell components into a proper, discrete scope and include them by some better means. This will allow Discover to remove temporary overrides that mitigate this issue, a version conflict with Bootstrap, and the CSS changes across Bootstrap 3 to Bootstrap 4.
Most importantly, frontend libraries and corresponding CSS files can start to be upgraded appropriately and will be easier to do so, going forward. | main | unable to maintain frontend libraries one of the core tasks of managing sustainability for a project like this is updating libraries on the frontend problems are compounded by a colossal global variable scope and entirely chained together html template files and scripts and css files into a huge shared scope task is to separate out the navbar create resource profile manager and notifications bell into a proper discrete scope and include it by some better means this will allow for discover to remove temporary overrides that mitigate this issue and a version conflict with bootstrap and the css changes across to most importantly frontend libraries and corresponding css files can start to be upgraded appropriately and will be easier to do so going forward | 1 |
454 | 3,631,355,241 | IssuesEvent | 2016-02-11 00:57:25 | Homebrew/homebrew | https://api.github.com/repos/Homebrew/homebrew | opened | Telepathy devel versions. | maintainer feedback | A couple of the Telepathy formulae:
* telepathy-gabble
* telepathy-glib
Were moved onto what seem to be development releases in November with:
* b66d6921df7ca6ad5f4e5008382d17adcbbb7427
* 8ec3cd94bb46ad16fb7613f43f3017577ecb33ff
This is currently breaking:
* telepathy-mission-control
* telepathy-idle
This wasn't detected at the time, possibly because there are no tests on those formulae, but trying to compile them throws this sort of error:
```
configure: error: Package requirements (telepathy-glib >= 0.20.0) were not met:
No package 'telepathy-glib' found
Consider adjusting the PKG_CONFIG_PATH environment variable if you
installed software in a non-standard prefix.
```
Our options are kind of limited:
* We do a revert and leave people who upgraded to the development versions stuck there indefinitely because a stable release above `0.99.x` is unlikely in the medium-term; so far upstream has largely followed the pattern of one stable release per year going from i.e. `0.22.0` to `0.24.0`.
* We scrap all these Telepathy formulae because nobody has reported them broken in nearly 3 months. According to Bintray since `telepathy-glib` was upgraded to a development version about 50 people have downloaded a bottle and `telepathy-gabble` has been downloaded 19 times. | True | Telepathy devel versions. - A couple of the Telepathy formulae:
* telepathy-gabble
* telepathy-glib
Were moved onto what seem to be development releases in November with:
* b66d6921df7ca6ad5f4e5008382d17adcbbb7427
* 8ec3cd94bb46ad16fb7613f43f3017577ecb33ff
This is currently breaking:
* telepathy-mission-control
* telepathy-idle
This wasn't detected at the time, possibly because there are no tests on those formulae, but trying to compile them throws this sort of error:
```
configure: error: Package requirements (telepathy-glib >= 0.20.0) were not met:
No package 'telepathy-glib' found
Consider adjusting the PKG_CONFIG_PATH environment variable if you
installed software in a non-standard prefix.
```
Our options are kind of limited:
* We do a revert and leave people who upgraded to the development versions stuck there indefinitely because a stable release above `0.99.x` is unlikely in the medium-term; so far upstream has largely followed the pattern of one stable release per year going from i.e. `0.22.0` to `0.24.0`.
* We scrap all these Telepathy formulae because nobody has reported them broken in nearly 3 months. According to Bintray since `telepathy-glib` was upgraded to a development version about 50 people have downloaded a bottle and `telepathy-gabble` has been downloaded 19 times. | main | telepathy devel versions a couple of the telepathy formulae telepathy gabble telepathy glib were moved onto what seem to be development releases in november with this is currently breaking telepathy mission control telepathy idle this wasn t detected at the time possibly because there are no tests on those formulae but trying to compile them throws this sort of error configure error package requirements telepathy glib were not met no package telepathy glib found consider adjusting the pkg config path environment variable if you installed software in a non standard prefix our options are kind of limited we do a revert and leave people who upgraded to the development versions stuck there indefinitely because a stable release above x is unlikely in the medium term so far upstream has largely followed the pattern of one stable release per year going from i e to we scrap all these telepathy formulae because nobody has reported them broken in nearly months according to bintray since telepathy glib was upgraded to a development version about people have downloaded a bottle and telepathy gabble has been downloaded times | 1 |
2,841 | 10,215,688,323 | IssuesEvent | 2019-08-15 08:22:51 | arcticicestudio/styleguide-javascript | https://api.github.com/repos/arcticicestudio/styleguide-javascript | opened | Prettier | context-workflow scope-maintainability type-feature | <p align="center"><img src="https://user-images.githubusercontent.com/7836623/48644231-4556d780-e9e2-11e8-862e-e8ce630fd0ba.png" width="30%" /></p>
> Epic: #8
Integrate [Prettier][], the opinionated code formatter with support for many languages and integrations with most editors. It ensures that all outputted code conforms to a consistent style.
### Configuration
This is one of the main features of Prettier: It already provides the best and recommended style configurations of-out-the-box™.
The only option we will change is the [print width][prettier-docs-pwidth]. It is set to 80 by default, which is not up-to-date for modern screens (and might only be relevant when working in terminals, e.g. with Vim). It'll be changed to 120, as used by all of Arctic Ice Studio's style guides.
The `prettier.config.js` configuration file will be placed in the project root as well as the `.prettierignore` file to also define ignore pattern.
### NPM script/task
To allow formatting of all sources, a `format:pretty` npm script/task will be added and included in the main `format` script flow.
## Tasks
- [ ] Install [prettier][npm-prettier] packages.
- [ ] Implement `prettier.config.js` configuration file.
- [ ] Implement `.prettierignore` ignore pattern file.
- [ ] Implement NPM `format:pretty` script/task.
- [ ] Format current code base for the first time and fix possible style guide violations using the configured linters of the project.
[npm-prettier]: https://www.npmjs.com/package/prettier
[prettier-docs-pwidth]: https://prettier.io/docs/en/options.html#print-width
[prettier]: https://prettier.io | True | Prettier - <p align="center"><img src="https://user-images.githubusercontent.com/7836623/48644231-4556d780-e9e2-11e8-862e-e8ce630fd0ba.png" width="30%" /></p>
> Epic: #8
Integrate [Prettier][], the opinionated code formatter with support for many languages and integrations with most editors. It ensures that all outputted code conforms to a consistent style.
### Configuration
This is one of the main features of Prettier: It already provides the best and recommended style configurations out-of-the-box™.
The only option we will change is the [print width][prettier-docs-pwidth]. It is set to 80 by default, which is not up-to-date for modern screens (might only be relevant when working in terminals, e.g. with Vim). It'll be changed to 120, as used by all of Arctic Ice Studio's style guides.
The `prettier.config.js` configuration file will be placed in the project root as well as the `.prettierignore` file to also define ignore pattern.
### NPM script/task
To allow to format all sources a `format:pretty` npm script/task will be added to be included in the main `format` script flow.
## Tasks
- [ ] Install [prettier][npm-prettier] packages.
- [ ] Implement `prettier.config.js` configuration file.
- [ ] Implement `.prettierignore` ignore pattern file.
- [ ] Implement NPM `format:pretty` script/task.
- [ ] Format current code base for the first time and fix possible style guide violations using the configured linters of the project.
[npm-prettier]: https://www.npmjs.com/package/prettier
[prettier-docs-pwidth]: https://prettier.io/docs/en/options.html#print-width
[prettier]: https://prettier.io | main | prettier epic integrate the opinionated code formatter with support for many languages and integrations with most editors it ensures that all outputted code conforms to a consistent style configuration this is one of the main features of prettier it already provides the best and recommended style configurations of out the box™ the only option we will change is the it is set to by default which not up to date for modern screens might only be relevant when working in terminals only like e g with vim it ll be changed to used by all of arctic ice studio s style guides the prettier config js configuration file will be placed in the project root as well as the prettierignore file to also define ignore pattern npm script task to allow to format all sources a format pretty npm script task will be added to be included in the main format script flow tasks install packages implement prettier config js configuration file implement prettierignore ignore pattern file implement npm format pretty script task format current code base for the first time and fix possible style guide violations using the configured linters of the project | 1 |
98,687 | 4,030,025,323 | IssuesEvent | 2016-05-18 12:59:53 | gama-platform/gama | https://api.github.com/repos/gama-platform/gama | opened | MacOS X release not functional and not correctly packaged | > Bug Concerns Development OS OSX Priority Critical Version 1.7 beta | The automatic release for MacOS X is not functional and is probably missing several components to become testable.
- [ ] It is missing the `Info.plist` file (the one present in `msi.gama.application`, https://github.com/gama-platform/gama/blob/master/msi.gama.application/macosx/Info.plist), which should be present in `Gama.app/Contents/`.
- [ ] It is missing the `Model.icns` and `icon.ins` files (https://github.com/gama-platform/gama/blob/master/msi.gama.application/macosx/Model.icns & https://github.com/gama-platform/gama/blob/master/msi.gama.application/icons/launcher_icons/icon.icns) in `/Contents/Resources`
- [ ] It is named `Eclipse.app` and not `Gama.app`
| 1.0 | MacOS X release not functional and not correctly packaged - The automatic release for MacOS X is not functional and is probably missing several components to become testable.
- [ ] It is missing the `Info.plist` file (the one present in `msi.gama.application`, https://github.com/gama-platform/gama/blob/master/msi.gama.application/macosx/Info.plist), which should be present in `Gama.app/Contents/`.
- [ ] It is missing the `Model.icns` and `icon.ins` files (https://github.com/gama-platform/gama/blob/master/msi.gama.application/macosx/Model.icns & https://github.com/gama-platform/gama/blob/master/msi.gama.application/icons/launcher_icons/icon.icns) in `/Contents/Resources`
- [ ] It is named `Eclipse.app` and not `Gama.app`
| non_main | macos x release not functional and not correctly packaged the automatic release for macos x is not functional and is probably missing several components to become testable it is missing the info plist file the one present in msi gama application which should be present in gama app contents it is missing the model icns and icon ins files in contents resources it is named eclipse app and not gama app | 0 |
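The task list above amounts to a few file operations on the distribution; a minimal sketch (this helper and the distribution layout are assumptions, not GAMA's actual build script):

```python
import shutil
from pathlib import Path

def fix_macos_bundle(dist_dir, info_plist, model_icns, icon_icns):
    """Sketch of the fixes in the checklist above: rename the mis-named
    Eclipse.app to Gama.app, place Info.plist in Contents/, and place the
    two .icns files in Contents/Resources/."""
    dist = Path(dist_dir)
    app = dist / "Eclipse.app"
    bundle = dist / "Gama.app"
    if app.exists():
        app.rename(bundle)  # fixes the bundle name
    contents = bundle / "Contents"
    contents.mkdir(parents=True, exist_ok=True)
    shutil.copy(info_plist, contents / "Info.plist")
    resources = contents / "Resources"
    resources.mkdir(parents=True, exist_ok=True)
    shutil.copy(model_icns, resources / "Model.icns")
    shutil.copy(icon_icns, resources / "icon.icns")
    return bundle
```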
4,733 | 24,442,969,430 | IssuesEvent | 2022-10-06 15:44:20 | OpenRefine/OpenRefine | https://api.github.com/repos/OpenRefine/OpenRefine | closed | Updated ODFDOM API | maintainability import ODS | The ODFDOM API that we are using was deprecated in 0.8:
> * OdfTable represents the table feature in ODF spreadsheet and text documents.
> * OdfTable provides methods to get/add/delete/modify table column/row/cell.
> *
> * @deprecated As of release 0.8.8, replaced by {@link org.odftoolkit.simple.table.Table} in Simple API.
> */
> public class OdfTable {
**but** its replacement (SimpleAPI) is being removed in 1.0, so we should migrate directly to the [1.0 Collaboration API](https://tdf.github.io/odftoolkit/docs/odfdom/ReleaseNotes.html).
| True | Updated ODFDOM API - The ODFDOM API that we are using was deprecated in 0.8:
> * OdfTable represents the table feature in ODF spreadsheet and text documents.
> * OdfTable provides methods to get/add/delete/modify table column/row/cell.
> *
> * @deprecated As of release 0.8.8, replaced by {@link org.odftoolkit.simple.table.Table} in Simple API.
> */
> public class OdfTable {
**but** its replacement (SimpleAPI) is being removed in 1.0, so we should migrate directly to the [1.0 Collaboration API](https://tdf.github.io/odftoolkit/docs/odfdom/ReleaseNotes.html).
| main | updated odfdom api the odfdm api that we are using was deprecated in odftable represents the table feature in odf spreadsheet and text documents odftable provides methods to get add delete modify table column row cell deprecated as of release replaced by link org odftoolkit simple table table in simple api public class odftable but it s replacement simpleapi is being removed in so we should migrate directly to the | 1 |
286,943 | 8,796,363,444 | IssuesEvent | 2018-12-23 05:36:47 | open-learning-exchange/planet | https://api.github.com/repos/open-learning-exchange/planet | closed | The program hangs when submitting an answer on an arithmetic question | priority | ## Description
When submitting arithmetic related answer typed in an input field, the program is hanging.
## Screenshot

| 1.0 | The program hangs when submitting an answer on an arithmetic question - ## Description
When submitting arithmetic related answer typed in an input field, the program is hanging.
## Screenshot

| non_main | the program hangs when submitting an answer on an arithmetic question description when submitting arithmetic related answer typed in an input field the program is hanging screenshot | 0 |
257 | 3,008,040,935 | IssuesEvent | 2015-07-27 19:08:57 | borisblizzard/arcreator | https://api.github.com/repos/borisblizzard/arcreator | opened | Remove of global variables usage | bug Editor Related Maintainability | While this has been fixed in many places there are still some places that need to be fixed
the implementation of the Database panels leftover from F0's efforts made use of g global variables to make project data and configuration available to the class, initialising them in the __init__ method of the class
this is bad practice and counter productive to the modular design of the system.
these usages should be replaced with a proper access of the Project object stored in Kernel.GlobalObjects for each use of project data and
Kernel.Config for configuration | True | Remove of global variables usage - While this has been fixed in many places there are still some places that need to be fixed
the implementation of the Database panels leftover from F0's efforts made use of g global variables to make project data and configuration available to the class, initialising them in the __init__ method of the class
this is bad practice and counter productive to the modular design of the system.
these usages should be replaced with a proper access of the Project object stored in Kernel.GlobalObjects for each use of project data and
Kernel.Config for configuration | main | remove of global variables usage while this has been fixed in many place there are still some places that need to be fixed the implementation of the database panels leftover from s efforts made use of g global variables to make project data and configuration available to the class initialising them in the init method of the class this is bad practice and counter productive to the modular design of the system these usages should be replaced with a proper access of the project object stored in kernel globalobjects for each use of project data and kernel config for configuration | 1 |
1,235 | 5,265,268,403 | IssuesEvent | 2017-02-04 00:25:34 | duckduckgo/zeroclickinfo-goodies | https://api.github.com/repos/duckduckgo/zeroclickinfo-goodies | closed | Conversions: show full answer when converting megabytes | Maintainer Approved | A DuckDuckGo user sent in feedback regarding the conversions IA. Searching for an example -- "100 megabytes in bytes" only returns "1 \* 108 bytes" and not the desired answer: "104857600."
---
IA Page: http://duck.co/ia/view/conversions
[Maintainer](http://docs.duckduckhack.com/maintaining/guidelines.html): @mintsoft
| True | Conversions: show full answer when converting megabytes - A DuckDuckGo user sent in feedback regarding the conversions IA. Searching for an example -- "100 megabytes in bytes" only returns "1 \* 108 bytes" and not the desired answer: "104857600."
---
IA Page: http://duck.co/ia/view/conversions
[Maintainer](http://docs.duckduckhack.com/maintaining/guidelines.html): @mintsoft
| main | conversions show full answer when converting megabytes a duckduckgo user sent in feedback regarding the conversions ia searching for an example megabytes in bytes only returns bytes and not the desired answer ia page mintsoft | 1 |
279,438 | 30,702,550,537 | IssuesEvent | 2023-07-27 01:39:49 | nidhi7598/linux-3.0.35_CVE-2018-13405 | https://api.github.com/repos/nidhi7598/linux-3.0.35_CVE-2018-13405 | closed | CVE-2020-14351 (High) detected in linux-stable-rtv3.8.6 - autoclosed | Mend: dependency security vulnerability | ## CVE-2020-14351 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv3.8.6</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
A flaw was found in the Linux kernel. A use-after-free memory flaw was found in the perf subsystem allowing a local attacker with permission to monitor perf events to corrupt memory and possibly escalate privileges. The highest threat from this vulnerability is to data confidentiality and integrity as well as system availability.
<p>Publish Date: 2020-12-03
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-14351>CVE-2020-14351</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=1862849">https://bugzilla.redhat.com/show_bug.cgi?id=1862849</a></p>
<p>Release Date: 2020-12-03</p>
<p>Fix Resolution: 4.14.207,4.19.158,4.4.244,4.9.244,5.4.78,5.8.17,5.9.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-14351 (High) detected in linux-stable-rtv3.8.6 - autoclosed - ## CVE-2020-14351 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv3.8.6</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
A flaw was found in the Linux kernel. A use-after-free memory flaw was found in the perf subsystem allowing a local attacker with permission to monitor perf events to corrupt memory and possibly escalate privileges. The highest threat from this vulnerability is to data confidentiality and integrity as well as system availability.
<p>Publish Date: 2020-12-03
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-14351>CVE-2020-14351</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=1862849">https://bugzilla.redhat.com/show_bug.cgi?id=1862849</a></p>
<p>Release Date: 2020-12-03</p>
<p>Fix Resolution: 4.14.207,4.19.158,4.4.244,4.9.244,5.4.78,5.8.17,5.9.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_main | cve high detected in linux stable autoclosed cve high severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in base branch master vulnerable source files vulnerability details a flaw was found in the linux kernel a use after free memory flaw was found in the perf subsystem allowing a local attacker with permission to monitor perf events to corrupt memory and possibly escalate privileges the highest threat from this vulnerability is to data confidentiality and integrity as well as system availability publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
13,485 | 23,201,026,013 | IssuesEvent | 2022-08-01 21:29:54 | vectordotdev/vector | https://api.github.com/repos/vectordotdev/vector | closed | Load balancing for splunk_hec sink | sink: splunk_hec type: enhancement needs: requirements domain: networking meta: feedback provider: splunk have: should domain: reliability | Users should be able to provide a list of endpoints to the `splunk_hec` sink. The implementation should then round-robin requests between these endpoints, temporarily removing any from the pool that return errors.
There are a few things to consider here:
* The sink healthcheck should check all provided endpoints
* When an error occurs, the request should be retried on a different endpoint
* If a request fails on one endpoint and succeeds on another, that should count against the health of that endpoint
* When an endpoint is sufficiently unhealthy (e.g. 2 errors in a row where the same request succeeded on another endpoint), that endpoint should be removed from the active set
* Endpoints outside of the active set should be periodically healthchecked to see if they should be brought back in to the active set
* We need to handle cases where the healthcheck of an endpoint works, but the actual requests do not. This should not result in the endpoint flapping in and out of the active set.
* We need to handle cases where all endpoints are failing, and continue to retry against all endpoints regardless of their health | 1.0 | Load balancing for splunk_hec sink - Users should be able to provide a list of endpoints to the `splunk_hec` sink. The implementation should then round-robin requests between these endpoints, temporarily removing any from the pool that return errors.
There are a few things to consider here:
* The sink healthcheck should check all provided endpoints
* When an error occurs, the request should be retried on a different endpoint
* If a request fails on one endpoint and succeeds on another, that should count against the health of that endpoint
* When an endpoint is sufficiently unhealthy (e.g. 2 errors in a row where the same request succeeded on another endpoint), that endpoint should be removed from the active set
* Endpoints outside of the active set should be periodically healthchecked to see if they should be brought back in to the active set
* We need to handle cases where the healthcheck of an endpoint works, but the actual requests do not. This should not result in the endpoint flapping in and out of the active set.
* We need to handle cases where all endpoints are failing, and continue to retry against all endpoints regardless of their health | non_main | load balancing for splunk hec sink users should be able to provide a list of endpoints to the splunk hec sink the implementation should then round robin requests between these endpoints temporarily removing any from the pool that return errors there are a few things to consider here the sink healthcheck should check all provided endpoints when an error occurs the request should be retried on a different endpoint if a request fails on one endpoint and succeeds on another that should count against the health of that endpoint when an endpoint is sufficiently unhealthy e g errors in a row where the same request succeeded on another endpoint that endpoint should be removed from the active set endpoints outside of the active set should be periodically healthchecked to see if they should be brought back in to the active set we need to handle cases where the healthcheck of an endpoint works but the actual requests do not this should not result in the endpoint flapping in and out of the active set we need to handle cases where all endpoints are failing and continue to retry against all endpoints regardless of their health | 0 |
83,586 | 16,238,901,302 | IssuesEvent | 2021-05-07 06:48:27 | joomla/joomla-cms | https://api.github.com/repos/joomla/joomla-cms | closed | [4.0] [Autum] [A11y] [Contrast] menu | No Code Attached Yet | ### Steps to reproduce the issue

Not enough contrast
### Expected result
WCAG 2.0 level AA requires a contrast ratio of at least 4.5:1 for normal text and 3:1 for large text. WCAG 2.1 requires a contrast ratio of at least 3:1 for graphics and user interface components (such as form input borders). WCAG Level AAA requires a contrast ratio of at least 7:1 for normal text and 4.5:1 for large text.
@ciar4n
### Actual result
### System information (as much as possible)
### Additional comments
| 1.0 | [4.0] [Autum] [A11y] [Contrast] menu - ### Steps to reproduce the issue

Not enough contrast
### Expected result
WCAG 2.0 level AA requires a contrast ratio of at least 4.5:1 for normal text and 3:1 for large text. WCAG 2.1 requires a contrast ratio of at least 3:1 for graphics and user interface components (such as form input borders). WCAG Level AAA requires a contrast ratio of at least 7:1 for normal text and 4.5:1 for large text.
@ciar4n
### Actual result
### System information (as much as possible)
### Additional comments
| non_main | menu steps to reproduce the issue not enough contrast expected result wcag level aa requires a contrast ratio of at least for normal text and for large text wcag requires a contrast ratio of at least for graphics and user interface components such as form input borders wcag level aaa requires a contrast ratio of at least for normal text and for large text actual result system information as much as possible additional comments | 0 |
635 | 4,151,692,823 | IssuesEvent | 2016-06-15 21:27:19 | dotnet/roslyn-analyzers | https://api.github.com/repos/dotnet/roslyn-analyzers | closed | Port FxCop rule CA1812: AvoidUninstantiatedInternalClasses | Area-Microsoft.Maintainability.Analyzers FxCop-Port Urgency-Soon | **Title:** Avoid uninstantiated internal classes
**Description:**
An instance of an assembly-level type is not created by code in the assembly.
**Dependency:** FxCopSDKUtilities
**Notes:**
Do we need additional APIs to make this efficient, for example, a RegisterSymbolReferenceAction API? | True | Port FxCop rule CA1812: AvoidUninstantiatedInternalClasses - **Title:** Avoid uninstantiated internal classes
**Description:**
An instance of an assembly-level type is not created by code in the assembly.
**Dependency:** FxCopSDKUtilities
**Notes:**
Do we need additional APIs to make this efficient, for example, a RegisterSymbolReferenceAction API? | main | port fxcop rule avoiduninstantiatedinternalclasses title avoid uninstantiated internal classes description an instance of an assembly level type is not created by code in the assembly dependency fxcopsdkutilities notes do we need additional apis to make this efficient for example a registersymbolreferenceaction api | 1 |
52,722 | 3,028,117,866 | IssuesEvent | 2015-08-04 01:31:06 | Reviewable/Reviewable | https://api.github.com/repos/Reviewable/Reviewable | closed | Allow users to select from a handful (3?) stock themes | enhancement low priority | From discussion in gitter:

Inevitably some users won't like the default Reviewable theme. It seems that what people dislike can vary. But what if there were about 3 camps of opinions? Then Reviewable could provide 3 stock themes. This could include the monospace font, how distinct the file views are, whether "tabs" are rounded off or are square.
Personally, these are some features I don't like:
1. Conversation boxes kinda meld into the code; it's hard to see them distinctly. Maybe a really light grey background for the chat boxes?
2. If there's a comment on one side of the diff, the opposite side of the file is blank which is confusing. Maybe use a background color to show that there's space just b/c there's a comment on the other side?

3. The ripped-page aesthetic over gaps of unaltered code is cluttery to me. There's so much going on already.
4. A narrower monospace font. Or use whichever one I set with my browser settings to be my monospace font.
5. Rounded file tabs.
6. There seems to be a trend on the web of replacing text buttons with icons (github did this a few years ago). This is fine when someone is familiar with a site, but really hinders the ability to learn a site. *Even* if there is a nice legend, like you have.
7. The slanted "r5" text next to each comment box. I think having the revision number is good,but slanted text is harder to read and distracting. You can still use the diagonal cut-off, just put the text horizontal. | 1.0 | Allow users to select from a handful (3?) stock themes - From discussion in gitter:

Inevitably some users won't like the default Reviewable theme. It seems that what people dislike can vary. But what if there were about 3 camps of opinions? Then Reviewable could provide 3 stock themes. This could include the monospace font, how distinct the file views are, whether "tabs" are rounded off or are square.
Personally, these are some features I don't like:
1. Conversation boxes kinda meld into the code; it's hard to see them distinctly. Maybe a really light grey background for the chat boxes?
2. If there's a comment on one side of the diff, the opposite side of the file is blank which is confusing. Maybe use a background color to show that there's space just b/c there's a comment on the other side?

3. The ripped-page aesthetic over gaps of unaltered code is cluttery to me. There's so much going on already.
4. A narrower monospace font. Or use whichever one I set with my browser settings to be my monospace font.
5. Rounded file tabs.
6. There seems to be a trend on the web of replacing text buttons with icons (github did this a few years ago). This is fine when someone is familiar with a site, but really hinders the ability to learn a site. *Even* if there is a nice legend, like you have.
7. The slanted "r5" text next to each comment box. I think having the revision number is good,but slanted text is harder to read and distracting. You can still use the diagonal cut-off, just put the text horizontal. | non_main | allow users to select from a handful stock themes from discussion in gitter inevitably some users won t like the default reviewable theme it seems that what people dislike can vary but what if there were about camps of opinions then reviewable could provide stock themes this could include the monospace font how distinct the file views are whether tabs are rounded off or are square personally these are some features i don t like conversation boxes kinda meld into the code it s hard to see them distinctly maybe a really light grey background for the chat boxes if there s a comment on one side of the diff the opposite side of the file is blank which is confusing maybe use a background color to show that there s space just b c there s a comment on the other side the ripped page aesthetic over gaps of unaltered code is cluttery to me there s so much going on already a narrower monospace font or use whichever one i set with my browser settings to be my monospace font rounded file tabs there seems to be a trend on the web of replacing text buttons with icons github did this a few years ago this is fine when someone is familiar with a site but really hinders the ability to learn a site even if there is a nice legend like you have the slanted text next to each comment box i think having the revision number is good but slanted text is harder to read and distracting you can still use the diagonal cut off just put the text horizontal | 0 |
231,279 | 25,499,098,347 | IssuesEvent | 2022-11-28 01:06:22 | joshbnewton31080/NodeGoat | https://api.github.com/repos/joshbnewton31080/NodeGoat | opened | swig-1.4.2.tgz: 2 vulnerabilities (highest severity is: 7.8) | security vulnerability | <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>swig-1.4.2.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/uglify-js/node_modules/async/package.json,/node_modules/winston/node_modules/async/package.json,/node_modules/prompt/node_modules/async/package.json,/node_modules/broadway/node_modules/async/package.json</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (swig version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2021-43138](https://www.mend.io/vulnerability-database/CVE-2021-43138) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.8 | async-0.2.10.tgz | Transitive | N/A* | ❌ |
| [CVE-2015-8858](https://www.mend.io/vulnerability-database/CVE-2015-8858) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | uglify-js-2.4.24.tgz | Transitive | N/A* | ❌ |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the section "Details" below to see if there is a version of transitive dependency where vulnerability is fixed.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-43138</summary>
### Vulnerable Library - <b>async-0.2.10.tgz</b></p>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-0.2.10.tgz">https://registry.npmjs.org/async/-/async-0.2.10.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/uglify-js/node_modules/async/package.json,/node_modules/winston/node_modules/async/package.json,/node_modules/prompt/node_modules/async/package.json,/node_modules/broadway/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- swig-1.4.2.tgz (Root Library)
- uglify-js-2.4.24.tgz
- :x: **async-0.2.10.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Async before 2.6.4 and 3.x before 3.2.2, a malicious user can obtain privileges via the mapValues() method, aka lib/internal/iterator.js createObjectIterator prototype pollution.
<p>Publish Date: 2022-04-06
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-43138>CVE-2021-43138</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-43138">https://nvd.nist.gov/vuln/detail/CVE-2021-43138</a></p>
<p>Release Date: 2022-04-06</p>
<p>Fix Resolution: async - 2.6.4,3.2.2</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2015-8858</summary>
### Vulnerable Library - <b>uglify-js-2.4.24.tgz</b></p>
<p>JavaScript parser, mangler/compressor and beautifier toolkit</p>
<p>Library home page: <a href="https://registry.npmjs.org/uglify-js/-/uglify-js-2.4.24.tgz">https://registry.npmjs.org/uglify-js/-/uglify-js-2.4.24.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/uglify-js/package.json</p>
<p>
Dependency Hierarchy:
- swig-1.4.2.tgz (Root Library)
- :x: **uglify-js-2.4.24.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The uglify-js package before 2.6.0 for Node.js allows attackers to cause a denial of service (CPU consumption) via crafted input in a parse call, aka a "regular expression denial of service (ReDoS)."
<p>Publish Date: 2017-01-23
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-8858>CVE-2015-8858</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-8858">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-8858</a></p>
<p>Release Date: 2017-01-23</p>
<p>Fix Resolution: v2.6.0</p>
</p>
<p></p>
</details> | True | swig-1.4.2.tgz: 2 vulnerabilities (highest severity is: 7.8) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>swig-1.4.2.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/uglify-js/node_modules/async/package.json,/node_modules/winston/node_modules/async/package.json,/node_modules/prompt/node_modules/async/package.json,/node_modules/broadway/node_modules/async/package.json</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (swig version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2021-43138](https://www.mend.io/vulnerability-database/CVE-2021-43138) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.8 | async-0.2.10.tgz | Transitive | N/A* | ❌ |
| [CVE-2015-8858](https://www.mend.io/vulnerability-database/CVE-2015-8858) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | uglify-js-2.4.24.tgz | Transitive | N/A* | ❌ |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the section "Details" below to see if there is a version of transitive dependency where vulnerability is fixed.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-43138</summary>
### Vulnerable Library - <b>async-0.2.10.tgz</b></p>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-0.2.10.tgz">https://registry.npmjs.org/async/-/async-0.2.10.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/uglify-js/node_modules/async/package.json,/node_modules/winston/node_modules/async/package.json,/node_modules/prompt/node_modules/async/package.json,/node_modules/broadway/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- swig-1.4.2.tgz (Root Library)
- uglify-js-2.4.24.tgz
- :x: **async-0.2.10.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Async before 2.6.4 and 3.x before 3.2.2, a malicious user can obtain privileges via the mapValues() method, aka lib/internal/iterator.js createObjectIterator prototype pollution.
<p>Publish Date: 2022-04-06
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-43138>CVE-2021-43138</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-43138">https://nvd.nist.gov/vuln/detail/CVE-2021-43138</a></p>
<p>Release Date: 2022-04-06</p>
<p>Fix Resolution: async - 2.6.4,3.2.2</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2015-8858</summary>
### Vulnerable Library - <b>uglify-js-2.4.24.tgz</b></p>
<p>JavaScript parser, mangler/compressor and beautifier toolkit</p>
<p>Library home page: <a href="https://registry.npmjs.org/uglify-js/-/uglify-js-2.4.24.tgz">https://registry.npmjs.org/uglify-js/-/uglify-js-2.4.24.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/uglify-js/package.json</p>
<p>
Dependency Hierarchy:
- swig-1.4.2.tgz (Root Library)
- :x: **uglify-js-2.4.24.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The uglify-js package before 2.6.0 for Node.js allows attackers to cause a denial of service (CPU consumption) via crafted input in a parse call, aka a "regular expression denial of service (ReDoS)."
<p>Publish Date: 2017-01-23
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-8858>CVE-2015-8858</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-8858">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-8858</a></p>
<p>Release Date: 2017-01-23</p>
<p>Fix Resolution: v2.6.0</p>
</p>
<p></p>
</details> | non_main | swig tgz vulnerabilities highest severity is vulnerable library swig tgz path to dependency file package json path to vulnerable library node modules uglify js node modules async package json node modules winston node modules async package json node modules prompt node modules async package json node modules broadway node modules async package json vulnerabilities cve severity cvss dependency type fixed in swig version remediation available high async tgz transitive n a high uglify js tgz transitive n a for some transitive vulnerabilities there is no version of direct dependency with a fix check the section details below to see if there is a version of transitive dependency where vulnerability is fixed details cve vulnerable library async tgz higher order functions and common patterns for asynchronous code library home page a href path to dependency file package json path to vulnerable library node modules uglify js node modules async package json node modules winston node modules async package json node modules prompt node modules async package json node modules broadway node modules async package json dependency hierarchy swig tgz root library uglify js tgz x async tgz vulnerable library found in base branch master vulnerability details in async before and x before a malicious user can obtain privileges via the mapvalues method aka lib internal iterator js createobjectiterator prototype pollution publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution async cve vulnerable library uglify js tgz javascript parser mangler compressor and beautifier toolkit library home page a href path to dependency file package json path to vulnerable library node modules uglify js package json dependency hierarchy swig tgz root library x uglify js tgz vulnerable library found in base branch master vulnerability details the uglify js package before for node js allows attackers to cause a denial of service cpu consumption via crafted input in a parse call aka a regular expression denial of service redos publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution | 0
820,714 | 30,784,613,798 | IssuesEvent | 2023-07-31 12:27:47 | owid/etl | https://api.github.com/repos/owid/etl | closed | Remove generated grapher steps with no modifications | wontfix needs triage priority 3 - nice to have | We have tons of grapher steps that follow exactly the same template. They just copy garden dataset to grapher. Instead of having hundreds of almost identical grapher steps there, we could generate the default template on the fly if we see `data://grapher/...` step in DAG.
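The on-the-fly generation described above could be sketched roughly like this (the DAG shape, the `data://garden/` naming convention, and the step-dict fields here are illustrative assumptions, not the actual etl API):

```python
# Hypothetical sketch: synthesize a default "copy garden dataset to
# grapher" step for any data://grapher/... DAG entry that has no
# explicit step module of its own.
GRAPHER_PREFIX = "data://grapher/"

def default_grapher_step(step_uri: str) -> dict:
    """Build the boilerplate step that copies a garden dataset to grapher."""
    short_name = step_uri[len(GRAPHER_PREFIX):]
    return {
        "uri": step_uri,
        "source": f"data://garden/{short_name}",  # copy from garden...
        "action": "copy_to_grapher",              # ...to grapher unchanged
    }

def expand_dag(dag: dict, explicit_steps: set) -> dict:
    """Return {uri: step}, filling in defaults for grapher URIs that
    do not already have a hand-written step."""
    steps = {}
    for uri in dag:
        if uri.startswith(GRAPHER_PREFIX) and uri not in explicit_steps:
            steps[uri] = default_grapher_step(uri)
    return steps
```

This would remove the hundreds of identical grapher modules while still letting an explicit step override the default where needed.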
@pabloarosado do you think it's a useful cleanup or is it better to have them explicitly there? | 1.0 | Remove generated grapher steps with no modifications - We have tons of grapher steps that follow exactly the same template. They just copy garden dataset to grapher. Instead of having hundreds of almost identical grapher steps there, we could generate the default template on the fly if we see `data://grapher/...` step in DAG.
@pabloarosado do you think it's a useful cleanup or is it better to have them explicitly there? | non_main | remove generated grapher steps with no modifications we have tons of grapher steps that follow exactly the same template they just copy garden dataset to grapher instead of having hundreds of almost identical grapher steps there we could generate the default template on the fly if we see data grapher step in dag pabloarosado do you think it s a useful cleanup or is it better to have them explicitly there | 0 |
3,968 | 18,161,104,905 | IssuesEvent | 2021-09-27 09:42:26 | pypa/get-pip | https://api.github.com/repos/pypa/get-pip | closed | Version 21.2.4 | maintainance | Just wondering, when will there be a new get-pip that installs version 21.2.4. I'm asking because me and someone else are making a Docker image with the latest versions of python and pip compiled from source, which is actually slimmer than the official image. | True | Version 21.2.4 - Just wondering, when will there be a new get-pip that installs version 21.2.4. I'm asking because me and someone else are making a Docker image with the latest versions of python and pip compiled from source, which is actually slimmer than the official image. | main | version just wondering when will there be a new get pip that installs version i m asking because me and someone else are making a docker image with the latest versions of python and pip compiled from source which is actually slimmer than the official image | 1 |
2,425 | 8,615,644,794 | IssuesEvent | 2018-11-19 21:13:09 | arcticicestudio/nord-docs | https://api.github.com/repos/arcticicestudio/nord-docs | closed | Webpack configuration | context-workflow scope-configurability scope-maintainability scope-plugin-support type-feature | <p align="center"><img src="https://user-images.githubusercontent.com/7836623/48711809-761a5500-ec0c-11e8-8ba7-dc4acbd5d64c.png" width="70%"/> <img src="https://user-images.githubusercontent.com/74381/34074079-93ac0420-e25c-11e7-9c0a-642986b2aa58.png" width="20%"/></p> <!-- Sources: https://webpack.js.org/e0b5805d423a4ec9473ee315250968b2.svg -->
> Related epics: #25
This story is almost the same as the modified Babel configuration implemented in #29, but for the [Webpack][] configuration instead. It will be used to
- configure the plugin [webpack-bundle-analyzer][gh-webpack-bundle-analyzer]
- configure the plugin [git-revision-webpack-plugin][gh-git-revision-webpack-plugin]
- configure the plugin [webpack.DefinePlugin][webpack-docs-define-plug]
- configure [resolve aliases][webpack-docs-resolve-alias] (see #26 for details about the structure concept)
Gatsby also supports this by providing the [`onCreateWebpackConfig`][gatsby-docs-api-node-webpack] function through the [Node API][gatsby-docs-api-node].
## Tasks
- [x] Install required Webpack plugin packages:
- [webpack-bundle-analyzer][npm-webpack-bundle-analyzer]
- [git-revision-webpack-plugin][npm-git-revision-webpack-plugin]
- [x] Configure resolve aliases for
- `src/assets`
- `src/atoms`
- `src/config`
- `src/containers`
- `src/data`
- `src/layouts`
- `src/molecules`
- `src/organisms`
- `src/pages`
- `src/stores`
- `src/styles`
- `src/templates`
- `src/utils`
- [x] Configure `git-revision-webpack-plugin` to provide the version, commit hash and branch as environment variables through the `webpack.DefinePlugin`.
- [x] Configure `webpack-bundle-analyzer` to generate a "static" report with a JSON stats file stored in a newly created `build` directory within the project root.
[gatsby-docs-api-node]: https://www.gatsbyjs.org/docs/node-apis
[gatsby-docs-api-node-webpack]: https://www.gatsbyjs.org/docs/node-apis/#onCreateWebpackConfig
[gh-git-revision-webpack-plugin]: https://github.com/pirelenito/git-revision-webpack-plugin
[gh-webpack-bundle-analyzer]: https://github.com/webpack-contrib/webpack-bundle-analyzer
[npm-git-revision-webpack-plugin]: https://www.npmjs.com/package/git-revision-webpack-plugin
[npm-webpack-bundle-analyzer]: https://www.npmjs.com/package/webpack-bundle-analyzer
[webpack]: https://webpack.js.org
[webpack-docs-define-plug]: https://webpack.js.org/plugins/define-plugin
[webpack-docs-resolve-alias]: https://webpack.js.org/configuration/resolve/#resolve-alias
| True | Webpack configuration - <p align="center"><img src="https://user-images.githubusercontent.com/7836623/48711809-761a5500-ec0c-11e8-8ba7-dc4acbd5d64c.png" width="70%"/> <img src="https://user-images.githubusercontent.com/74381/34074079-93ac0420-e25c-11e7-9c0a-642986b2aa58.png" width="20%"/></p> <!-- Sources: https://webpack.js.org/e0b5805d423a4ec9473ee315250968b2.svg -->
> Related epics: #25
This story is almost the same as the modified Babel configuration implemented in #29, but for the [Webpack][] configuration instead. It will be used to
- configure the plugin [webpack-bundle-analyzer][gh-webpack-bundle-analyzer]
- configure the plugin [git-revision-webpack-plugin][gh-git-revision-webpack-plugin]
- configure the plugin [webpack.DefinePlugin][webpack-docs-define-plug]
- configure [resolve aliases][webpack-docs-resolve-alias] (see #26 for details about the structure concept)
Gatsby also supports this by providing the [`onCreateWebpackConfig`][gatsby-docs-api-node-webpack] function through the [Node API][gatsby-docs-api-node].
## Tasks
- [x] Install required Webpack plugin packages:
- [webpack-bundle-analyzer][npm-webpack-bundle-analyzer]
- [git-revision-webpack-plugin][npm-git-revision-webpack-plugin]
- [x] Configure resolve aliases for
- `src/assets`
- `src/atoms`
- `src/config`
- `src/containers`
- `src/data`
- `src/layouts`
- `src/molecules`
- `src/organisms`
- `src/pages`
- `src/stores`
- `src/styles`
- `src/templates`
- `src/utils`
- [x] Configure `git-revision-webpack-plugin` to provide the version, commit hash and branch as environment variables through the `webpack.DefinePlugin`.
- [x] Configure `webpack-bundle-analyzer` to generate a "static" report with a JSON stats file stored in a newly created `build` directory within the project root.
[gatsby-docs-api-node]: https://www.gatsbyjs.org/docs/node-apis
[gatsby-docs-api-node-webpack]: https://www.gatsbyjs.org/docs/node-apis/#onCreateWebpackConfig
[gh-git-revision-webpack-plugin]: https://github.com/pirelenito/git-revision-webpack-plugin
[gh-webpack-bundle-analyzer]: https://github.com/webpack-contrib/webpack-bundle-analyzer
[npm-git-revision-webpack-plugin]: https://www.npmjs.com/package/git-revision-webpack-plugin
[npm-webpack-bundle-analyzer]: https://www.npmjs.com/package/webpack-bundle-analyzer
[webpack]: https://webpack.js.org
[webpack-docs-define-plug]: https://webpack.js.org/plugins/define-plugin
[webpack-docs-resolve-alias]: https://webpack.js.org/configuration/resolve/#resolve-alias
| main | webpack configuration related epics this story is almost the same like the modified babel configuration implemented in but for the configuration instead it will be used to configure the plugin configure the plugin configure the plugin configure see for details about the structure concept gatsby also supports this by providing the function through the tasks install required webpack plugin packages configure resolve aliases for src assets src atoms src config src containers src data src layouts src molecules src organisms src pages src stores src styles src templates src utils configure git revision webpack plugin to provide the version commit hash and branch as environment variables through through the webpack defineplugin configure webpack bundle analyzer to generate a static report with a json stats file stored in a newly created build directory within the project root | 1 |
1,738 | 6,574,876,376 | IssuesEvent | 2017-09-11 14:21:55 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | template action could use ignore_regexp to skip a template change if the change matches the given regexp | affects_2.2 feature_idea waiting_on_maintainer | ##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
template
##### ANSIBLE VERSION
<!--- Paste verbatim output from “ansible --version” between quotes below -->
```
ansible 2.2.0
```
##### SUMMARY
For certain configurations, like the bind database, we use ansible_date_time.epoch as the serial number, like this
```
{{ ansible_date_time.epoch }} ; Serial
```
And a change to the db file will trigger the bind service to be restarted. It works fine if there is a real change. But sometimes the only change is the epoch, and thus the serial number, in which case we don't really want to proceed.
So we would like to propose an `ignore_regexp` setting for the template (or copy) action: once this attribute is set, when the template action is performed, the regexp will be used to filter out matching lines in both the template output and the existing file, so the action will only be considered **changed** if the filtered output and the existing file are still different. I think such a change will not only benefit our use case, but will also help address issues like https://github.com/ansible/ansible/issues/5317.
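A rough sketch of the comparison this proposal implies, in plain Python for illustration (these helper names are made up, not actual Ansible module code):

```python
import re

def filtered_lines(text: str, ignore_regexp: str) -> list:
    """Drop lines matching ignore_regexp before comparing contents."""
    pattern = re.compile(ignore_regexp)
    return [line for line in text.splitlines() if not pattern.search(line)]

def is_changed(rendered: str, existing: str, ignore_regexp: str = None) -> bool:
    """Report 'changed' only if the two contents differ outside ignored lines."""
    if ignore_regexp is None:
        return rendered != existing
    return filtered_lines(rendered, ignore_regexp) != filtered_lines(existing, ignore_regexp)
```

With something like `ignore_regexp='; Serial$'`, two zone files that differ only in the serial line would compare equal, so the handler that restarts bind would not fire.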
| True | template action could use ignore_regexp to skip a template change if the change matches the given regexp - ##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
template
##### ANSIBLE VERSION
<!--- Paste verbatim output from “ansible --version” between quotes below -->
```
ansible 2.2.0
```
##### SUMMARY
For certain configurations, like the bind database, we use ansible_date_time.epoch as the serial number, like this
```
{{ ansible_date_time.epoch }} ; Serial
```
And a change to the db file will trigger the bind service to be restarted. It works fine if there is a real change. But sometimes the only change is the epoch, and thus the serial number, in which case we don't really want to proceed.
So we would like to propose an `ignore_regexp` setting for the template (or copy) action: once this attribute is set, when the template action is performed, the regexp will be used to filter out matching lines in both the template output and the existing file, so the action will only be considered **changed** if the filtered output and the existing file are still different. I think such a change will not only benefit our use case, but will also help address issues like https://github.com/ansible/ansible/issues/5317.
| main | template action could use ignore regexp to skip a template change if the change matches the given regexp issue type feature idea component name template ansible version ansible summary for certain configuration like bind database we use ansible date time epoch as the serial number like this ansible date time epoch serial and a change to the db file will trigger the bind service to be restarted it works fine if there is real change but sometimes the only change is the epoch thus the serial number change in which case we don t really want to proceed so we would like to propose a ignore regexp setting to template or copy action once this attribute is set when template action is performed such regexp will be used to filter out the lines in the template output and the existing file so the action will be only considered changed if the filtered output and existing file are still different i think such a change will not only benefit our use case but will also help address issues like | 1 |
314,399 | 23,519,752,788 | IssuesEvent | 2022-08-19 03:42:49 | Azure/Azure-Functions | https://api.github.com/repos/Azure/Azure-Functions | opened | v4 logging change detail | documentation | HI team, is there any change about logging output from v3 to v4?
Sharing the results of my quick test.
I start Function App then Execute 1 HTTP trigger, then Stop Function App.
We can see 4 more records in v4 compared to v3.
If there are more changes about logging output from FunctionsHost, please share some documents.
Because log output volume changes affect Application Insights usage cost.
v4

v3
<img width="903" alt="image" src="https://user-images.githubusercontent.com/1381907/185537804-6562f4e4-b03f-4c6d-9ec6-535fcfa09aed.png">
| 1.0 | v4 logging change detail - HI team, is there any change about logging output from v3 to v4?
Sharing the results of my quick test.
I start Function App then Execute 1 HTTP trigger, then Stop Function App.
We can see 4 more records in v4 compared to v3.
If there are more changes about logging output from FunctionsHost, please share some documents.
Because log output volume changes affect Application Insights usage cost.
v4

v3
<img width="903" alt="image" src="https://user-images.githubusercontent.com/1381907/185537804-6562f4e4-b03f-4c6d-9ec6-535fcfa09aed.png">
| non_main | logging change detail hi team is there any change about logging output from to sharing the results of my quick test i start function app then execute http trigger then stop function app we can see more records in compared to if there are more change about logging output from functionshost please share some documents because log output volume change effects application insights usage cost img width alt image src | 0 |
443,441 | 12,794,428,775 | IssuesEvent | 2020-07-02 06:52:57 | kubeflow/pipelines | https://api.github.com/repos/kubeflow/pipelines | closed | Option for disabling using .tar.gz when uploading artifact | area/sdk kind/feature lifecycle/stale priority/p2 | Wondering if KFP is planning to support uploading artifacts without archiving to tar.gz since Argo supports this feature as well:
https://github.com/argoproj/argo/commit/11e57f4dea93fde60b204a5e7675fec999c66f56#diff-2ecc5d66cfc406141926d272bab09f2f
This is useful because we might want to upload some result images as artifacts and later visualize them in HTML with the output viewer. Since the current output viewer does not support referencing other files in the filesystem, we can only upload the images first.
| 1.0 | Option for disabling using .tar.gz when uploading artifact - Wondering if KFP is planning to support uploading artifacts without archiving to tar.gz since Argo supports this feature as well:
https://github.com/argoproj/argo/commit/11e57f4dea93fde60b204a5e7675fec999c66f56#diff-2ecc5d66cfc406141926d272bab09f2f
This is useful because we might want to upload some result images as artifacts and later visualize them in HTML with the output viewer. Since the current output viewer does not support referencing other files in the filesystem, we can only upload the images first.
| non_main | option for disabling using tar gz when uploading artifact wondering if kfp is planning to support uploading artifacts without archiving to tar gz since argo supports this feature as well this is useful because we might want to upload some result images as artifacts and later visualize them in html with output viewer since current output viewer does not support referencing to other files in the filesystem we can only upload the images first | 0 |
430,282 | 30,173,608,223 | IssuesEvent | 2023-07-04 01:04:49 | danthegoodman1/icedb | https://api.github.com/repos/danthegoodman1/icedb | opened | Faceting/Dynamic indexing | documentation enhancement | Facets are kind of taken from datadog's terminology, but this would allow for dynamic "indexing" of columns not in the partition strategy.
We can add additional columns to the meta store called `facet_keys` and `facet_values` to keep track of the known keys and values of additional columns inside the parquet file.
Schema like:
```sql
facets JSONB
```
Facet values should be stored in arrays like:
```
{
"some.known.path": [1, 2, 'a']
}
```
A secondary [GIN index](https://www.cockroachlabs.com/docs/stable/inverted-indexes.html) will then allow us to track the values so that they can be considered in queries for increased filtering. This would allow a sort of "indexing" on additional columns without double-writing (i.e. a second table).
Facets will have to be defined, and will not backfill on previous data. The known facets will need to be stored in the DB as well in a new table.
Facets can have any data type, since they are JSONB columns. We will have to match query predicates to these facets similar to #45
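To make the predicate-matching idea concrete, here is a small pure-Python sketch (the function names and the file-dict shape are made up for illustration; icedb's real meta store is SQL): merge each file's facet values into its metadata row, then use those arrays to prune files whose facets cannot satisfy a predicate.

```python
# Hypothetical facet bookkeeping: each data file records the distinct
# values it contains for a few extra (non-partition) columns, and a
# query can skip any file whose facet list cannot satisfy a predicate.

def add_facet_values(facets: dict, key: str, values) -> dict:
    """Union new values into the facet array for `key`, like the JSON above."""
    merged = set(facets.get(key, []))
    merged.update(values)
    facets[key] = sorted(merged, key=repr)
    return facets

def files_possibly_matching(files: list, key: str, wanted) -> list:
    """Keep only files whose facets show `wanted` for `key`: a poor man's
    inverted-index lookup over the JSON facet column."""
    return [f for f in files if wanted in f.get("facets", {}).get(key, [])]
```

A file that never recorded the wanted value is skipped entirely, which is the "increased filtering" without double-writing a second table.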
This should be exposed entirely as python functions so that any query engine can be used, and as long as interception of the predicate can occur then faceting can be supported (otherwise I guess really ugly functions could be used) | 1.0 | Faceting/Dynamic indexing - Facets are kind of taken from datadog's terminology, but this would allow for dynamic "indexing" of columns not in the partition strategy.
We can add additional columns to the meta store called `facet_keys` and `facet_values` to keep track of the known keys and values of additional columns inside the parquet file.
Schema like:
```sql
facets JSONB
```
Facet values should be stored in arrays like:
```
{
"some.known.path": [1, 2, 'a']
}
```
A secondary [GIN index](https://www.cockroachlabs.com/docs/stable/inverted-indexes.html) will then allow us to track the values so that they can be considered in queries for increased filtering. This would allow a sort of "indexing" on additional columns without double-writing (i.e. a second table).
Facets will have to be defined, and will not backfill on previous data. The known facets will need to be stored in the DB as well in a new table.
Facets can have any data type, since they are JSONB columns. We will have to match query predicates to these facets similar to #45
This should be exposed entirely as python functions so that any query engine can be used, and as long as interception of the predicate can occur then faceting can be supported (otherwise I guess really ugly functions could be used) | non_main | faceting dynamic indexing facets kind of taken from datadog s terminology but this would allow for dynamic indexing of columsn not in the partition stretegy we can add additional columns to the meta store called facet keys and facet values to serve as a that keeps track of the known keys and values of additional columns inside the parquet file schema like sql facets jsonb facet values should be stored in arrays like some known path a secondary will then allow us to track the values so that they can be considered in queries for increased filtering this would allow a sort of indexing on additional columns without double writing i e a second table facets will have to be defined and will not backfill on previous data the known facets will need to be stored in the db as well in a new table facets can have any data type since they are jsonb columns we will have to match query predicates to these facets similar to this should be exposed entirely as python functions so that any query engine can be used and as long as interception of the predicate can occur then faceting can be supported otherwise i guess really ugly functions could be used | 0 |
662,496 | 22,141,627,546 | IssuesEvent | 2022-06-03 07:33:32 | admiral-team/admiralui-ios | https://api.github.com/repos/admiral-team/admiralui-ios | closed | [FEATURE] - SwiftUI is missing Liner | enhancement medium priority | **Task description:**
Page Controls with SwiftUI - no Liner
**Framework**
Liner
**Resources:**
Link to mockups or resources...
<img width="360" alt="Screenshot 2022-05-23 at 16 22 35" src="https://user-images.githubusercontent.com/101252323/169829020-144ce8ff-65a9-4b74-9c71-de80ad9d32f7.png">

| 1.0 | [FEATURE] - SwiftUI is missing Liner - **Task description:**
Page Controls with SwiftUI - no Liner
**Framework**
Liner
**Resources:**
Link to mockups or resources...
<img width="360" alt="Screenshot 2022-05-23 at 16 22 35" src="https://user-images.githubusercontent.com/101252323/169829020-144ce8ff-65a9-4b74-9c71-de80ad9d32f7.png">

| non_main | swiftui is missing liner task description page controls with swiftui no liner framework liner resources link to mockups or resources img width alt screenshot at src | 0
3,579 | 14,353,967,104 | IssuesEvent | 2020-11-30 07:54:02 | cloverhearts/quilljs-markdown | https://api.github.com/repos/cloverhearts/quilljs-markdown | closed | TypeError: Cannot read property 'on' of undefined | RESEARCH Saw with Maintainer WILL MAKE IT | when I try to use this
import Quill from 'quill'
import QuillMarkdown from 'quilljs-markdown'
const editor = new Quill('#editor', {})
new QuillMarkdown(editor)
I get this error

| True | TypeError: Cannot read property 'on' of undefined - when I try to use this
import Quill from 'quill'
import QuillMarkdown from 'quilljs-markdown'
const editor = new Quill('#editor', {})
new QuillMarkdown(editor)
I get this error

| main | typeerror cannot read property on of undefined when i try use this import quill from quill import quillmarkdown from quilljs markdown const editor new quill editor new quillmarkdown editor i get this error | 1 |
47,034 | 10,021,688,035 | IssuesEvent | 2019-07-16 15:06:36 | jOOQ/jOOQ | https://api.github.com/repos/jOOQ/jOOQ | closed | NPE in code generator when generating Oracle UDTs | C: Code Generation C: DB: Oracle E: Enterprise Edition E: Professional Edition P: Medium R: Fixed T: Defect | The `OracleDatabase.getUDTs0()` implementation can produce Null Pointer Exceptions in some cases when a UDT references a type that is not in the list of input schemas:
```
[ERROR] Error while fetching udts
java.lang.NullPointerException
at org.jooq.meta.AbstractElementContainerDefinition.<init> (AbstractElementContainerDefinition.java:75)
at org.jooq.meta.AbstractUDTDefinition.<init> (AbstractUDTDefinition.java:66)
at org.jooq.meta.oracle.OracleUDTDefinition.<init> (OracleUDTDefinition.java:76)
at org.jooq.meta.oracle.OracleDatabase.getUDTs0 (OracleDatabase.java:628)
at org.jooq.meta.AbstractDatabase.getAllUDTs (AbstractDatabase.java:1749)
at org.jooq.meta.AbstractDatabase.getUDTs (AbstractDatabase.java:1780)
at org.jooq.codegen.JavaGenerator.generateSchema (JavaGenerator.java:4798)
at org.jooq.codegen.JavaGenerator.generateSchema (JavaGenerator.java:4717)
at org.jooq.codegen.JavaGenerator.generate (JavaGenerator.java:507)
at org.jooq.codegen.JavaGenerator.generate (JavaGenerator.java:468)
at org.jooq.codegen.JavaGenerator.generate (JavaGenerator.java:389)
at org.jooq.codegen.GenerationTool.run0 (GenerationTool.java:800)
at org.jooq.codegen.GenerationTool.run (GenerationTool.java:221)
at org.jooq.codegen.GenerationTool.generate (GenerationTool.java:216)
at org.jooq.codegen.maven.Plugin.execute (Plugin.java:198)
``` | 1.0 | NPE in code generator when generating Oracle UDTs - The `OracleDatabase.getUDT0()` implementation can produce Null Pointer Exceptions in some cases when a UDT references a type that is not in the list of input schemas:
```
[ERROR] Error while fetching udts
java.lang.NullPointerException
at org.jooq.meta.AbstractElementContainerDefinition.<init> (AbstractElementContainerDefinition.java:75)
at org.jooq.meta.AbstractUDTDefinition.<init> (AbstractUDTDefinition.java:66)
at org.jooq.meta.oracle.OracleUDTDefinition.<init> (OracleUDTDefinition.java:76)
at org.jooq.meta.oracle.OracleDatabase.getUDTs0 (OracleDatabase.java:628)
at org.jooq.meta.AbstractDatabase.getAllUDTs (AbstractDatabase.java:1749)
at org.jooq.meta.AbstractDatabase.getUDTs (AbstractDatabase.java:1780)
at org.jooq.codegen.JavaGenerator.generateSchema (JavaGenerator.java:4798)
at org.jooq.codegen.JavaGenerator.generateSchema (JavaGenerator.java:4717)
at org.jooq.codegen.JavaGenerator.generate (JavaGenerator.java:507)
at org.jooq.codegen.JavaGenerator.generate (JavaGenerator.java:468)
at org.jooq.codegen.JavaGenerator.generate (JavaGenerator.java:389)
at org.jooq.codegen.GenerationTool.run0 (GenerationTool.java:800)
at org.jooq.codegen.GenerationTool.run (GenerationTool.java:221)
at org.jooq.codegen.GenerationTool.generate (GenerationTool.java:216)
at org.jooq.codegen.maven.Plugin.execute (Plugin.java:198)
``` | non_main | npe in code generator when generating oracle udts the oracledatabase implementation can produce null pointer exceptions in some cases when a udt references a type that is not in the list of input schemas error while fetching udts java lang nullpointerexception at org jooq meta abstractelementcontainerdefinition abstractelementcontainerdefinition java at org jooq meta abstractudtdefinition abstractudtdefinition java at org jooq meta oracle oracleudtdefinition oracleudtdefinition java at org jooq meta oracle oracledatabase oracledatabase java at org jooq meta abstractdatabase getalludts abstractdatabase java at org jooq meta abstractdatabase getudts abstractdatabase java at org jooq codegen javagenerator generateschema javagenerator java at org jooq codegen javagenerator generateschema javagenerator java at org jooq codegen javagenerator generate javagenerator java at org jooq codegen javagenerator generate javagenerator java at org jooq codegen javagenerator generate javagenerator java at org jooq codegen generationtool generationtool java at org jooq codegen generationtool run generationtool java at org jooq codegen generationtool generate generationtool java at org jooq codegen maven plugin execute plugin java | 0 |
3,125 | 11,968,153,372 | IssuesEvent | 2020-04-06 08:08:50 | short-d/short | https://api.github.com/repos/short-d/short | opened | [Maintainability] Add PR author's guide to CONTRIBUTING.md | maintainability | **What is the problem?**
There are lots of large PRs without clear PR descriptions. People never had the opportunity to learn how to write a good PR description or why to create small PRs.
**Your solution**
Revise https://wiki.gnome.org/Git/CommitMessages and https://github.com/google/eng-practices/blob/master/review/developer/index.md. Add PR author section to CONTRIBUTING.md
| True | [Maintainability] Add PR author's guide to CONTRIBUTING.md - **What is the problem?**
There are lots of large PRs without clear PR descriptions. People never had the opportunity to learn how to write a good PR description or why to create small PRs.
**Your solution**
Revise https://wiki.gnome.org/Git/CommitMessages and https://github.com/google/eng-practices/blob/master/review/developer/index.md. Add PR author section to CONTRIBUTING.md
| main | add pr author s guide to contributing md what is the problem there are lots of large prs without clear pr descriptions people never had the opportunity to learn how to write good pr description and why to create small prs your solution revise and add pr author section to contributing md | 1 |
433,598 | 30,339,031,245 | IssuesEvent | 2023-07-11 11:31:33 | pmizio/typescript-tools.nvim | https://api.github.com/repos/pmizio/typescript-tools.nvim | closed | styled components features don't work | bug documentation | Here is my config
```
{
"pmizio/typescript-tools.nvim",
ft = { "typescript", "typescriptreact" },
dependencies = { "nvim-lua/plenary.nvim", "neovim/nvim-lspconfig" },
opts = {},
config = function()
require("typescript-tools").setup({
settings = {
tsserver_plugins = { "typescript-styled-plugin" },
},
})
end,
},
```
I have no diagnostics or completions inside of SC. | 1.0 | styled components features don't work - Here is my config
```
{
"pmizio/typescript-tools.nvim",
ft = { "typescript", "typescriptreact" },
dependencies = { "nvim-lua/plenary.nvim", "neovim/nvim-lspconfig" },
opts = {},
config = function()
require("typescript-tools").setup({
settings = {
tsserver_plugins = { "typescript-styled-plugin" },
},
})
end,
},
```
I have no diagnostics or completions inside of SC. | non_main | styled components features don t work here is my config pmizio typescript tools nvim ft typescript typescriptreact dependencies nvim lua plenary nvim neovim nvim lspconfig opts config function require typescript tools setup settings tsserver plugins typescript styled plugin end i have no diagnostic or completion inside of sc | 0 |
5,622 | 28,127,754,902 | IssuesEvent | 2023-03-31 19:21:04 | scott-ainsworth/dotnet-eithers | https://api.github.com/repos/scott-ainsworth/dotnet-eithers | closed | Eliminate remnants of test values | bug maintainability | *Issue*: Originally, the unit tests relied on constant and readonly values defined in the `TestData` class. This turned out to make the test code harder to grok.
*Requirements*
- Remove the constants and readonly values from `TestData`.
- Fix any test code that breaks. | True | Eliminate remnants of test values - *Issue*: Originally, the unit tests relied on constant and readonly values defined in the `TestData` class. This turned out to make the test code harder to grok.
*Requirements*
- Remove the constants and readonly values from `TestData`.
- Fix any test code that breaks. | main | eliminate remnants of test values issue originally the unit tests relied on constant and readonly values defined in the testdata class this turned out to make the test code harder to grok requirements remove the constants and readonly values from testdata fix any test code that breaks | 1 |
5,587 | 28,007,728,563 | IssuesEvent | 2023-03-27 16:17:45 | oele-isis-vanderbilt/ChimeraPy | https://api.github.com/repos/oele-isis-vanderbilt/ChimeraPy | closed | Create v0.0.10 to upgrade ChimeraPy-based pipelines to new ID system | maintainence | The Reading project, MMLAPIPE, and Nursing Project need the latest version of ChimeraPy, which would be helpful to publish version v0.0.10 to PYPI. | True | Create v0.0.10 to upgrade ChimeraPy-based pipelines to new ID system - The Reading project, MMLAPIPE, and Nursing Project need the latest version of ChimeraPy, which would be helpful to publish version v0.0.10 to PYPI. | main | create to upgrade chimerapy based pipelines to new id system the reading project mmlapipe and nursing project need the latest version of chimerapy which would be helpful to publish version to pypi | 1 |
5,173 | 26,344,165,846 | IssuesEvent | 2023-01-10 20:22:49 | aws/aws-sam-cli | https://api.github.com/repos/aws/aws-sam-cli | closed | Debugging on Windows WSL2 Fails with Timeout | area/docker type/bug platform/windows maintainer/need-followup |
### Description
I am unable to debug a simple Lambda function generated with an AWS Toolkit Template whilst running Windows 10 with Docker Desktop and WSL 2 mode enabled. The debugger times out.
09:59 Connection to Python debugger failed: Connection to the debugger script at localhost:61481 timed out
10:40 Connection to Python debugger failed: Connection to the debugger script at localhost:60043 timed out
### Steps to reproduce
1. Create a project using AWS Toolkit for PyCharm or code
2. Set a breakpoint in the lambda
3. Start debugger
### Observed result
```bash
C:\Users\Me\scoop\apps\PyCharm-Professional\2020.1.2-201.7846.77\IDE\bin\runnerw64.exe "C:\Program Files\Amazon\AWSSAMCLI\bin\sam.cmd" local invoke HelloWorldFunction --template C:\Temp\PyCharmEventBridge\.aws-sam\build\template.yaml --event "C:\Users\Me\AppData\Local\Temp\[Local] HelloWorldFunction-event.json" --debug-port 54610 --debugger-path C:\Users\Me\scoop\apps\PyCharm-Professional\2020.1.2-201.7846.77\IDE\plugins\python\helpers\pydev --debug-args "-u /tmp/lambci_debug_files/pydevd.py --multiprocess --port 54610 --file"
Invoking hello_world/app.lambda_handler (python3.8)
Image was not found.
Building image...Traceback (most recent call last):
File "D:\obj\windows-release\37amd64_Release\msi_python\zip_amd64\runpy.py", line 193, in _run_module_as_main
File "D:\obj\windows-release\37amd64_Release\msi_python\zip_amd64\runpy.py", line 85, in _run_code
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\__main__.py", line 12, in <module>
cli(prog_name="sam")
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 782, in main
rv = self.invoke(ctx)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 610, in invoke
return callback(*args, **kwargs)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\decorators.py", line 73, in new_func
return ctx.invoke(f, obj, *args, **kwargs)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 610, in invoke
return callback(*args, **kwargs)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\lib\telemetry\metrics.py", line 96, in wrapped
raise exception # pylint: disable=raising-bad-type
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\lib\telemetry\metrics.py", line 62, in wrapped
return_value = func(*args, **kwargs)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\commands\local\invoke\cli.py", line 86, in cli
parameter_overrides,
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\commands\local\invoke\cli.py", line 151, in do_cli
context.function_name, event=event_data, stdout=context.stdout, stderr=context.stderr
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\commands\local\lib\local_lambda.py", line 100, in invoke
self.local_runtime.invoke(config, event, debug_context=self.debug_context, stdout=stdout, stderr=stderr)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\local\lambdafn\runtime.py", line 77, in invoke
debug_options=debug_context,
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\local\docker\lambda_container.py", line 72, in __init__
image = LambdaContainer._get_image(image_builder, runtime, layers, debug_options)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\local\docker\lambda_container.py", line 176, in _get_image
return image_builder.build(runtime, layers, is_debug)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\local\docker\lambda_image.py", line 123, in build
self._build_image(base_image, image_tag, downloaded_layers, is_debug_go, stream=stream_writer)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\local\docker\lambda_image.py", line 201, in _build_image
fileobj=tarballfile, custom_context=True, rm=True, tag=docker_tag, pull=not self.skip_pull_image
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\api\build.py", line 261, in build
self._set_auth_headers(headers)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\api\build.py", line 308, in _set_auth_headers
auth_data = self._auth_configs.get_all_credentials()
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\auth.py", line 311, in get_all_credentials
reg, store_name
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\auth.py", line 262, in _resolve_authconfig_credstore
store = self._get_store_instance(credstore_name)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\auth.py", line 287, in _get_store_instance
name, environment=self._credstore_env
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\credentials\store.py", line 25, in __init__
self.program
docker.credentials.errors.InitializationError: docker-credential-gcloud not installed or not available in PATH
```
### Expected result
SAM CLI should download and install the latest emulator image, and the debugger should attach, allowing local debugging.
### Workarounds (Things that may help)
* Use Legacy HyperV Docker Engine rather than WSL2 Engine in Docker Desktop -> Settings -> General options page and add shared folders to allow Docker access to local sam build output
* Set --skip-pull-image flag for build
* Make sure any source code is located in user home folder to avoid possible Docker/WSL2/Windows permissions access issues (I did at points have source located in C:\Dev\)
* Set docker network to bridge
* Switch docker to enable Kubernetes
* Manually download AWS SAM Emulator image for environment by running docker pull amazon/aws-sam-cli-emulation-image-python3.8
* Ensure Firewall exceptions exist for Docker and Pycharm or VSCode
### Additional environment details (Ex: Windows, Mac, Amazon Linux etc)
1. OS: Windows 10, WSL2
2. `sam --version`: SAM CLI, version 1.0.0
--debug Output after using workarounds and the debugger finally attached
```bash
"C:\Program Files\Amazon\AWSSAMCLI\bin\sam.cmd" build AccountOuMovedHandlerFunction --template C:\Users\Me\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\template.yaml --build-dir C:\Users\Me\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\.aws-sam\build --use-container --skip-pull-image --docker-network bridge --debug
Telemetry endpoint configured to be https://aws-serverless-tools-telemetry.us-west-2.amazonaws.com/metrics
'build' command is called
Starting Build inside a container
No Parameters detected in the template
1 resources found in the template
Found Serverless function with name='AccountOuMovedHandlerFunction' and CodeUri='lambda/'
No Parameters detected in the template
Building function 'AccountOuMovedHandlerFunction'
Requested to skip pulling images ...
Mounting C:\Users\Me\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\lambda as /tmp/samcli/source:ro,delegated inside runtime container
Using the request object from command line argument
Loading workflow module 'aws_lambda_builders.workflows'
Registering workflow 'PythonPipBuilder' with capability 'Capability(language='python', dependency_manager='pip', application_framework=None)'
Registering workflow 'NodejsNpmBuilder' with capability 'Capability(language='nodejs', dependency_manager='npm', application_framework=None)'
Registering workflow 'RubyBundlerBuilder' with capability 'Capability(language='ruby', dependency_manager='bundler', application_framework=None)'
Registering workflow 'GoDepBuilder' with capability 'Capability(language='go', dependency_manager='dep', application_framework=None)'
Registering workflow 'GoModulesBuilder' with capability 'Capability(language='go', dependency_manager='modules', application_framework=None)'
Registering workflow 'JavaGradleWorkflow' with capability 'Capability(language='java', dependency_manager='gradle', application_framework=None)'
Registering workflow 'JavaMavenWorkflow' with capability 'Capability(language='java', dependency_manager='maven', application_framework=None)'
Registering workflow 'DotnetCliPackageBuilder' with capability 'Capability(language='dotnet', dependency_manager='cli-package', application_framework=None)'
Registering workflow 'CustomMakeBuilder' with capability 'Capability(language='provided', dependency_manager=None, application_framework=None)'
Found workflow 'PythonPipBuilder' to support capabilities 'Capability(language='python', dependency_manager='pip', application_framework=None)'
Running workflow 'PythonPipBuilder'
Running PythonPipBuilder:ResolveDependencies
calling pip download -r /tmp/samcli/source/requirements.txt --dest /tmp/samcli/scratch
Full dependency closure: {aiobotocore==1.0.4(wheel), async-timeout==3.0.1(wheel), fastjsonschema==2.14.4(wheel), botocore==1.17.32(wheel), s3transfer==0.3.3(wheel), zipp==3.1.0(wheel), typing-extensions==3.7.4.2(wheel), jmespath==0.10.0(wheel), jsonpickle==1.4.1(wheel), multidict==4.7.6(wheel), docutils==0.15.2(wheel), aioboto3==8.0.5(wheel), chardet==3.0.4(wheel), idna==2.10(wheel), urllib3==1.25.10(wheel), aws-lambda-powertools==1.0.2(wheel), attrs==19.3.0(wheel), aiohttp==3.6.2(wheel), yarl==1.5.0(wheel), six==1.15.0(wheel), python-dateutil==2.8.1(wheel), aws-xray-sdk==2.6.0(wheel), requests==2.24.0(wheel), boto3==1.12.32(wheel), aioitertools==0.7.0(wheel), importlib-metadata==1.7.0(wheel), future==0.18.2(sdist), certifi==2020.6.20(wheel), wrapt==1.12.1(sdist)}
initial compatible: {aiobotocore==1.0.4(wheel), async-timeout==3.0.1(wheel), fastjsonschema==2.14.4(wheel), botocore==1.17.32(wheel), s3transfer==0.3.3(wheel), zipp==3.1.0(wheel), typing-extensions==3.7.4.2(wheel), jmespath==0.10.0(wheel), jsonpickle==1.4.1(wheel), multidict==4.7.6(wheel), docutils==0.15.2(wheel), aioboto3==8.0.5(wheel), chardet==3.0.4(wheel), idna==2.10(wheel), urllib3==1.25.10(wheel), aws-lambda-powertools==1.0.2(wheel), attrs==19.3.0(wheel), aiohttp==3.6.2(wheel), yarl==1.5.0(wheel), six==1.15.0(wheel), python-dateutil==2.8.1(wheel), aws-xray-sdk==2.6.0(wheel), requests==2.24.0(wheel), boto3==1.12.32(wheel), aioitertools==0.7.0(wheel), importlib-metadata==1.7.0(wheel), certifi==2020.6.20(wheel)}
initial incompatible: {wrapt==1.12.1(sdist), future==0.18.2(sdist)}
Downloading missing wheels: {wrapt==1.12.1(sdist), future==0.18.2(sdist)}
calling pip download --only-binary=:all: --no-deps --platform manylinux1_x86_64 --implementation cp --abi cp38 --dest /tmp/samcli/scratch wrapt==1.12.1
calling pip download --only-binary=:all: --no-deps --platform manylinux1_x86_64 --implementation cp --abi cp38 --dest /tmp/samcli/scratch future==0.18.2
compatible wheels after second download pass: {aiobotocore==1.0.4(wheel), async-timeout==3.0.1(wheel), fastjsonschema==2.14.4(wheel), botocore==1.17.32(wheel), s3transfer==0.3.3(wheel), zipp==3.1.0(wheel), typing-extensions==3.7.4.2(wheel), jmespath==0.10.0(wheel), jsonpickle==1.4.1(wheel), multidict==4.7.6(wheel), docutils==0.15.2(wheel), aioboto3==8.0.5(wheel), chardet==3.0.4(wheel), idna==2.10(wheel), urllib3==1.25.10(wheel), aws-lambda-powertools==1.0.2(wheel), attrs==19.3.0(wheel), aiohttp==3.6.2(wheel), yarl==1.5.0(wheel), six==1.15.0(wheel), python-dateutil==2.8.1(wheel), aws-xray-sdk==2.6.0(wheel), requests==2.24.0(wheel), boto3==1.12.32(wheel), aioitertools==0.7.0(wheel), importlib-metadata==1.7.0(wheel), certifi==2020.6.20(wheel)}
Build missing wheels from sdists (C compiling True): {wrapt==1.12.1(sdist), future==0.18.2(sdist)}
calling pip wheel --no-deps --wheel-dir /tmp/samcli/scratch /tmp/samcli/scratch/wrapt-1.12.1.tar.gz
calling pip wheel --no-deps --wheel-dir /tmp/samcli/scratch /tmp/samcli/scratch/future-0.18.2.tar.gz
compatible after building wheels (no C compiling): {aiobotocore==1.0.4(wheel), async-timeout==3.0.1(wheel), fastjsonschema==2.14.4(wheel), botocore==1.17.32(wheel), s3transfer==0.3.3(wheel), zipp==3.1.0(wheel), typing-extensions==3.7.4.2(wheel), jmespath==0.10.0(wheel), jsonpickle==1.4.1(wheel), multidict==4.7.6(wheel), docutils==0.15.2(wheel), aioboto3==8.0.5(wheel), chardet==3.0.4(wheel), idna==2.10(wheel), urllib3==1.25.10(wheel), aws-lambda-powertools==1.0.2(wheel), attrs==19.3.0(wheel), aiohttp==3.6.2(wheel), yarl==1.5.0(wheel), six==1.15.0(wheel), python-dateutil==2.8.1(wheel), aws-xray-sdk==2.6.0(wheel), requests==2.24.0(wheel), boto3==1.12.32(wheel), aioitertools==0.7.0(wheel), importlib-metadata==1.7.0(wheel), future==0.18.2(wheel), certifi==2020.6.20(wheel), wrapt==1.12.1(wheel)}
Build missing wheels from sdists (C compiling False): set()
compatible after building wheels (C compiling): {aiobotocore==1.0.4(wheel), async-timeout==3.0.1(wheel), fastjsonschema==2.14.4(wheel), botocore==1.17.32(wheel), s3transfer==0.3.3(wheel), zipp==3.1.0(wheel), typing-extensions==3.7.4.2(wheel), jmespath==0.10.0(wheel), jsonpickle==1.4.1(wheel), multidict==4.7.6(wheel), docutils==0.15.2(wheel), aioboto3==8.0.5(wheel), chardet==3.0.4(wheel), idna==2.10(wheel), urllib3==1.25.10(wheel), aws-lambda-powertools==1.0.2(wheel), attrs==19.3.0(wheel), aiohttp==3.6.2(wheel), yarl==1.5.0(wheel), six==1.15.0(wheel), python-dateutil==2.8.1(wheel), aws-xray-sdk==2.6.0(wheel), requests==2.24.0(wheel), boto3==1.12.32(wheel), aioitertools==0.7.0(wheel), importlib-metadata==1.7.0(wheel), future==0.18.2(wheel), certifi==2020.6.20(wheel), wrapt==1.12.1(wheel)}
Final compatible: {aiobotocore==1.0.4(wheel), urllib3==1.25.10(wheel), aws-lambda-powertools==1.0.2(wheel), async-timeout==3.0.1(wheel), attrs==19.3.0(wheel), fastjsonschema==2.14.4(wheel), aiohttp==3.6.2(wheel), yarl==1.5.0(wheel), botocore==1.17.32(wheel), six==1.15.0(wheel), python-dateutil==2.8.1(wheel), s3transfer==0.3.3(wheel), zipp==3.1.0(wheel), typing-extensions==3.7.4.2(wheel), jmespath==0.10.0(wheel), jsonpickle==1.4.1(wheel), multidict==4.7.6(wheel), docutils==0.15.2(wheel), aioboto3==8.0.5(wheel), aws-xray-sdk==2.6.0(wheel), requests==2.24.0(wheel), chardet==3.0.4(wheel), boto3==1.12.32(wheel), idna==2.10(wheel), aioitertools==0.7.0(wheel), importlib-metadata==1.7.0(wheel), future==0.18.2(wheel), certifi==2020.6.20(wheel), wrapt==1.12.1(wheel)}
Final incompatible: set()
Final missing wheels: set()
PythonPipBuilder:ResolveDependencies succeeded
Running PythonPipBuilder:CopySource
PythonPipBuilder:CopySource succeeded
Build inside container returned response {"jsonrpc": "2.0", "id": 1, "result": {"artifacts_dir": "/tmp/samcli/artifacts"}}
Build inside container was successful. Copying artifacts from container to host
Copying from container: /tmp/samcli/artifacts/. -> C:\Users\Me\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\.aws-sam\build\AccountOuMovedHandlerFunction
Build inside container succeeded
Build Succeeded
Built Artifacts : ..\..\..\..\..\..\..\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\.aws-sam\build
Built Template : ..\..\..\..\..\..\..\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\.aws-sam\build\template.yaml
Commands you can use next
=========================
[*] Invoke Function: sam local invoke -t ..\..\..\..\..\..\..\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\.aws-sam\build\template.yaml
[*] Deploy: sam deploy --guided --template-file ..\..\..\..\..\..\..\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\.aws-sam\build\template.yaml
Sending Telemetry: {'metrics': [{'commandRun': {'awsProfileProvided': False, 'debugFlagProvided': True, 'region': '', 'commandName': 'sam build', 'duration': 47956, 'exitReason': 'success', 'exitCode': 0, 'requestId': '38225bda-2891-481b-85af-e00d41f2e206', 'installationId': 'c73c971d-b022-447b-b96c-6d36c642513e', 'sessionId': 'f66e0125-08db-4c1a-9f8b-2a8881b7181e', 'executionEnvironment': 'CLI', 'pyversion': '3.7.6', 'samcliVersion': '1.0.0'}}]}
HTTPSConnectionPool(host='aws-serverless-tools-telemetry.us-west-2.amazonaws.com', port=443): Read timed out. (read timeout=0.1)
``` | True | Debugging on Windows WSL2 Fails with Timeout -
### Description
I am unable to debug a simple Lambda function generated with an AWS Toolkit Template whilst running Windows 10 with Docker Desktop and WSL 2 mode enabled. The debugger times out.
09:59 Connection to Python debugger failed: Connection to the debugger script at localhost:61481 timed out
10:40 Connection to Python debugger failed: Connection to the debugger script at localhost:60043 timed out
### Steps to reproduce
1. Create a project using AWS Toolkit for PyCharm or code
2. Set a breakpoint in the lambda
3. Start debugger
### Observed result
```bash
C:\Users\Me\scoop\apps\PyCharm-Professional\2020.1.2-201.7846.77\IDE\bin\runnerw64.exe "C:\Program Files\Amazon\AWSSAMCLI\bin\sam.cmd" local invoke HelloWorldFunction --template C:\Temp\PyCharmEventBridge\.aws-sam\build\template.yaml --event "C:\Users\Me\AppData\Local\Temp\[Local] HelloWorldFunction-event.json" --debug-port 54610 --debugger-path C:\Users\Me\scoop\apps\PyCharm-Professional\2020.1.2-201.7846.77\IDE\plugins\python\helpers\pydev --debug-args "-u /tmp/lambci_debug_files/pydevd.py --multiprocess --port 54610 --file"
Invoking hello_world/app.lambda_handler (python3.8)
Image was not found.
Building image...Traceback (most recent call last):
File "D:\obj\windows-release\37amd64_Release\msi_python\zip_amd64\runpy.py", line 193, in _run_module_as_main
File "D:\obj\windows-release\37amd64_Release\msi_python\zip_amd64\runpy.py", line 85, in _run_code
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\__main__.py", line 12, in <module>
cli(prog_name="sam")
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 782, in main
rv = self.invoke(ctx)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 610, in invoke
return callback(*args, **kwargs)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\decorators.py", line 73, in new_func
return ctx.invoke(f, obj, *args, **kwargs)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\click\core.py", line 610, in invoke
return callback(*args, **kwargs)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\lib\telemetry\metrics.py", line 96, in wrapped
raise exception # pylint: disable=raising-bad-type
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\lib\telemetry\metrics.py", line 62, in wrapped
return_value = func(*args, **kwargs)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\commands\local\invoke\cli.py", line 86, in cli
parameter_overrides,
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\commands\local\invoke\cli.py", line 151, in do_cli
context.function_name, event=event_data, stdout=context.stdout, stderr=context.stderr
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\commands\local\lib\local_lambda.py", line 100, in invoke
self.local_runtime.invoke(config, event, debug_context=self.debug_context, stdout=stdout, stderr=stderr)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\local\lambdafn\runtime.py", line 77, in invoke
debug_options=debug_context,
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\local\docker\lambda_container.py", line 72, in __init__
image = LambdaContainer._get_image(image_builder, runtime, layers, debug_options)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\local\docker\lambda_container.py", line 176, in _get_image
return image_builder.build(runtime, layers, is_debug)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\local\docker\lambda_image.py", line 123, in build
self._build_image(base_image, image_tag, downloaded_layers, is_debug_go, stream=stream_writer)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\samcli\local\docker\lambda_image.py", line 201, in _build_image
fileobj=tarballfile, custom_context=True, rm=True, tag=docker_tag, pull=not self.skip_pull_image
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\api\build.py", line 261, in build
self._set_auth_headers(headers)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\api\build.py", line 308, in _set_auth_headers
auth_data = self._auth_configs.get_all_credentials()
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\auth.py", line 311, in get_all_credentials
reg, store_name
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\auth.py", line 262, in _resolve_authconfig_credstore
store = self._get_store_instance(credstore_name)
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\auth.py", line 287, in _get_store_instance
name, environment=self._credstore_env
File "C:\Program Files\Amazon\AWSSAMCLI\runtime\lib\site-packages\docker\credentials\store.py", line 25, in __init__
self.program
docker.credentials.errors.InitializationError: docker-credential-gcloud not installed or not available in PATH
```
### Expected result
SAM CLI should download and install the latest emulator image, and the debugger should attach, allowing local debugging.
### Workarounds (Things that may help)
* Use Legacy HyperV Docker Engine rather than WSL2 Engine in Docker Desktop -> Settings -> General options page and add shared folders to allow Docker access to local sam build output
* Set --skip-pull-image flag for build
* Make sure any source code is located in user home folder to avoid possible Docker/WSL2/Windows permissions access issues (I did at points have source located in C:\Dev\)
* Set docker network to bridge
* Switch docker to enable Kubernetes
* Manually download AWS SAM Emulator image for environment by running docker pull amazon/aws-sam-cli-emulation-image-python3.8
* Ensure Firewall exceptions exist for Docker and Pycharm or VSCode
### Additional environment details (Ex: Windows, Mac, Amazon Linux etc)
1. OS: Windows 10, WSL2
2. `sam --version`: SAM CLI, version 1.0.0
--debug Output after using workarounds and the debugger finally attached
```bash
"C:\Program Files\Amazon\AWSSAMCLI\bin\sam.cmd" build AccountOuMovedHandlerFunction --template C:\Users\Me\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\template.yaml --build-dir C:\Users\Me\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\.aws-sam\build --use-container --skip-pull-image --docker-network bridge --debug
Telemetry endpoint configured to be https://aws-serverless-tools-telemetry.us-west-2.amazonaws.com/metrics
'build' command is called
Starting Build inside a container
No Parameters detected in the template
1 resources found in the template
Found Serverless function with name='AccountOuMovedHandlerFunction' and CodeUri='lambda/'
No Parameters detected in the template
Building function 'AccountOuMovedHandlerFunction'
Requested to skip pulling images ...
Mounting C:\Users\Me\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\lambda as /tmp/samcli/source:ro,delegated inside runtime container
Using the request object from command line argument
Loading workflow module 'aws_lambda_builders.workflows'
Registering workflow 'PythonPipBuilder' with capability 'Capability(language='python', dependency_manager='pip', application_framework=None)'
Registering workflow 'NodejsNpmBuilder' with capability 'Capability(language='nodejs', dependency_manager='npm', application_framework=None)'
Registering workflow 'RubyBundlerBuilder' with capability 'Capability(language='ruby', dependency_manager='bundler', application_framework=None)'
Registering workflow 'GoDepBuilder' with capability 'Capability(language='go', dependency_manager='dep', application_framework=None)'
Registering workflow 'GoModulesBuilder' with capability 'Capability(language='go', dependency_manager='modules', application_framework=None)'
Registering workflow 'JavaGradleWorkflow' with capability 'Capability(language='java', dependency_manager='gradle', application_framework=None)'
Registering workflow 'JavaMavenWorkflow' with capability 'Capability(language='java', dependency_manager='maven', application_framework=None)'
Registering workflow 'DotnetCliPackageBuilder' with capability 'Capability(language='dotnet', dependency_manager='cli-package', application_framework=None)'
Registering workflow 'CustomMakeBuilder' with capability 'Capability(language='provided', dependency_manager=None, application_framework=None)'
Found workflow 'PythonPipBuilder' to support capabilities 'Capability(language='python', dependency_manager='pip', application_framework=None)'
Running workflow 'PythonPipBuilder'
Running PythonPipBuilder:ResolveDependencies
calling pip download -r /tmp/samcli/source/requirements.txt --dest /tmp/samcli/scratch
Full dependency closure: {aiobotocore==1.0.4(wheel), async-timeout==3.0.1(wheel), fastjsonschema==2.14.4(wheel), botocore==1.17.32(wheel), s3transfer==0.3.3(wheel), zipp==3.1.0(wheel), typing-extensions==3.7.4.2(wheel), jmespath==0.10.0(wheel), jsonpickle==1.4.1(wheel), multidict==4.7.6(wheel), docutils==0.15.2(wheel), aioboto3==8.0.5(wheel), chardet==3.0.4(wheel), idna==2.10(wheel), urllib3==1.25.10(wheel), aws-lambda-powertools==1.0.2(wheel), attrs==19.3.0(wheel), aiohttp==3.6.2(wheel), yarl==1.5.0(wheel), six==1.15.0(wheel), python-dateutil==2.8.1(wheel), aws-xray-sdk==2.6.0(wheel), requests==2.24.0(wheel), boto3==1.12.32(wheel), aioitertools==0.7.0(wheel), importlib-metadata==1.7.0(wheel), future==0.18.2(sdist), certifi==2020.6.20(wheel), wrapt==1.12.1(sdist)}
initial compatible: {aiobotocore==1.0.4(wheel), async-timeout==3.0.1(wheel), fastjsonschema==2.14.4(wheel), botocore==1.17.32(wheel), s3transfer==0.3.3(wheel), zipp==3.1.0(wheel), typing-extensions==3.7.4.2(wheel), jmespath==0.10.0(wheel), jsonpickle==1.4.1(wheel), multidict==4.7.6(wheel), docutils==0.15.2(wheel), aioboto3==8.0.5(wheel), chardet==3.0.4(wheel), idna==2.10(wheel), urllib3==1.25.10(wheel), aws-lambda-powertools==1.0.2(wheel), attrs==19.3.0(wheel), aiohttp==3.6.2(wheel), yarl==1.5.0(wheel), six==1.15.0(wheel), python-dateutil==2.8.1(wheel), aws-xray-sdk==2.6.0(wheel), requests==2.24.0(wheel), boto3==1.12.32(wheel), aioitertools==0.7.0(wheel), importlib-metadata==1.7.0(wheel), certifi==2020.6.20(wheel)}
initial incompatible: {wrapt==1.12.1(sdist), future==0.18.2(sdist)}
Downloading missing wheels: {wrapt==1.12.1(sdist), future==0.18.2(sdist)}
calling pip download --only-binary=:all: --no-deps --platform manylinux1_x86_64 --implementation cp --abi cp38 --dest /tmp/samcli/scratch wrapt==1.12.1
calling pip download --only-binary=:all: --no-deps --platform manylinux1_x86_64 --implementation cp --abi cp38 --dest /tmp/samcli/scratch future==0.18.2
compatible wheels after second download pass: {aiobotocore==1.0.4(wheel), async-timeout==3.0.1(wheel), fastjsonschema==2.14.4(wheel), botocore==1.17.32(wheel), s3transfer==0.3.3(wheel), zipp==3.1.0(wheel), typing-extensions==3.7.4.2(wheel), jmespath==0.10.0(wheel), jsonpickle==1.4.1(wheel), multidict==4.7.6(wheel), docutils==0.15.2(wheel), aioboto3==8.0.5(wheel), chardet==3.0.4(wheel), idna==2.10(wheel), urllib3==1.25.10(wheel), aws-lambda-powertools==1.0.2(wheel), attrs==19.3.0(wheel), aiohttp==3.6.2(wheel), yarl==1.5.0(wheel), six==1.15.0(wheel), python-dateutil==2.8.1(wheel), aws-xray-sdk==2.6.0(wheel), requests==2.24.0(wheel), boto3==1.12.32(wheel), aioitertools==0.7.0(wheel), importlib-metadata==1.7.0(wheel), certifi==2020.6.20(wheel)}
Build missing wheels from sdists (C compiling True): {wrapt==1.12.1(sdist), future==0.18.2(sdist)}
calling pip wheel --no-deps --wheel-dir /tmp/samcli/scratch /tmp/samcli/scratch/wrapt-1.12.1.tar.gz
calling pip wheel --no-deps --wheel-dir /tmp/samcli/scratch /tmp/samcli/scratch/future-0.18.2.tar.gz
compatible after building wheels (no C compiling): {aiobotocore==1.0.4(wheel), async-timeout==3.0.1(wheel), fastjsonschema==2.14.4(wheel), botocore==1.17.32(wheel), s3transfer==0.3.3(wheel), zipp==3.1.0(wheel), typing-extensions==3.7.4.2(wheel), jmespath==0.10.0(wheel), jsonpickle==1.4.1(wheel), multidict==4.7.6(wheel), docutils==0.15.2(wheel), aioboto3==8.0.5(wheel), chardet==3.0.4(wheel), idna==2.10(wheel), urllib3==1.25.10(wheel), aws-lambda-powertools==1.0.2(wheel), attrs==19.3.0(wheel), aiohttp==3.6.2(wheel), yarl==1.5.0(wheel), six==1.15.0(wheel), python-dateutil==2.8.1(wheel), aws-xray-sdk==2.6.0(wheel), requests==2.24.0(wheel), boto3==1.12.32(wheel), aioitertools==0.7.0(wheel), importlib-metadata==1.7.0(wheel), future==0.18.2(wheel), certifi==2020.6.20(wheel), wrapt==1.12.1(wheel)}
Build missing wheels from sdists (C compiling False): set()
compatible after building wheels (C compiling): {aiobotocore==1.0.4(wheel), async-timeout==3.0.1(wheel), fastjsonschema==2.14.4(wheel), botocore==1.17.32(wheel), s3transfer==0.3.3(wheel), zipp==3.1.0(wheel), typing-extensions==3.7.4.2(wheel), jmespath==0.10.0(wheel), jsonpickle==1.4.1(wheel), multidict==4.7.6(wheel), docutils==0.15.2(wheel), aioboto3==8.0.5(wheel), chardet==3.0.4(wheel), idna==2.10(wheel), urllib3==1.25.10(wheel), aws-lambda-powertools==1.0.2(wheel), attrs==19.3.0(wheel), aiohttp==3.6.2(wheel), yarl==1.5.0(wheel), six==1.15.0(wheel), python-dateutil==2.8.1(wheel), aws-xray-sdk==2.6.0(wheel), requests==2.24.0(wheel), boto3==1.12.32(wheel), aioitertools==0.7.0(wheel), importlib-metadata==1.7.0(wheel), future==0.18.2(wheel), certifi==2020.6.20(wheel), wrapt==1.12.1(wheel)}
Final compatible: {aiobotocore==1.0.4(wheel), urllib3==1.25.10(wheel), aws-lambda-powertools==1.0.2(wheel), async-timeout==3.0.1(wheel), attrs==19.3.0(wheel), fastjsonschema==2.14.4(wheel), aiohttp==3.6.2(wheel), yarl==1.5.0(wheel), botocore==1.17.32(wheel), six==1.15.0(wheel), python-dateutil==2.8.1(wheel), s3transfer==0.3.3(wheel), zipp==3.1.0(wheel), typing-extensions==3.7.4.2(wheel), jmespath==0.10.0(wheel), jsonpickle==1.4.1(wheel), multidict==4.7.6(wheel), docutils==0.15.2(wheel), aioboto3==8.0.5(wheel), aws-xray-sdk==2.6.0(wheel), requests==2.24.0(wheel), chardet==3.0.4(wheel), boto3==1.12.32(wheel), idna==2.10(wheel), aioitertools==0.7.0(wheel), importlib-metadata==1.7.0(wheel), future==0.18.2(wheel), certifi==2020.6.20(wheel), wrapt==1.12.1(wheel)}
Final incompatible: set()
Final missing wheels: set()
PythonPipBuilder:ResolveDependencies succeeded
Running PythonPipBuilder:CopySource
PythonPipBuilder:CopySource succeeded
Build inside container returned response {"jsonrpc": "2.0", "id": 1, "result": {"artifacts_dir": "/tmp/samcli/artifacts"}}
Build inside container was successful. Copying artifacts from container to host
Copying from container: /tmp/samcli/artifacts/. -> C:\Users\Me\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\.aws-sam\build\AccountOuMovedHandlerFunction
Build inside container succeeded
Build Succeeded
Built Artifacts : ..\..\..\..\..\..\..\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\.aws-sam\build
Built Template : ..\..\..\..\..\..\..\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\.aws-sam\build\template.yaml
Commands you can use next
=========================
[*] Invoke Function: sam local invoke -t ..\..\..\..\..\..\..\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\.aws-sam\build\template.yaml
[*] Deploy: sam deploy --guided --template-file ..\..\..\..\..\..\..\Dev\aws-control-tower-customizations-private\templates\Organizations\account-ou-moved-event-handler\.aws-sam\build\template.yaml
Sending Telemetry: {'metrics': [{'commandRun': {'awsProfileProvided': False, 'debugFlagProvided': True, 'region': '', 'commandName': 'sam build', 'duration': 47956, 'exitReason': 'success', 'exitCode': 0, 'requestId': '38225bda-2891-481b-85af-e00d41f2e206', 'installationId': 'c73c971d-b022-447b-b96c-6d36c642513e', 'sessionId': 'f66e0125-08db-4c1a-9f8b-2a8881b7181e', 'executionEnvironment': 'CLI', 'pyversion': '3.7.6', 'samcliVersion': '1.0.0'}}]}
HTTPSConnectionPool(host='aws-serverless-tools-telemetry.us-west-2.amazonaws.com', port=443): Read timed out. (read timeout=0.1)
``` | main | 1
310,686 | 26,733,774,091 | IssuesEvent | 2023-01-30 07:47:41 | EddieHubCommunity/LinkFree | https://api.github.com/repos/EddieHubCommunity/LinkFree | closed | New Testimonial for Mohd Imran | testimonial | ### Name
imran1509
### Title
FIRST_INTERACTION
### Description
He was so humble and steady throughout the interaction.
He guided me to start learning online and make a Twitter account for more exposure,
& get to know about an overview of devOps. | 1.0 | non_main | 0
99,049 | 12,395,348,419 | IssuesEvent | 2020-05-20 18:30:06 | pnp/sp-dev-list-formatting | https://api.github.com/repos/pnp/sp-dev-list-formatting | closed | Using () in JSON STYLE attribute causes column formatter to fail | ✔ By Design 🌭 List Formatting General | #### Category
- [ ] Question
- [X ] Bug
- [ ] Enhancement
#### Expected or Desired Behavior
When a CSS style attribute in a column formatter contains brackets "()", the condition / attribute should still render.
#### Observed Behavior
As soon as I start using brackets "()" within CSS style attributes the whole column formatter stops working and no output is displayed.
#### Steps to Reproduce
"style": {
"background-color": {
"rgba(0,0,0,0)"
}
}
"style": {
"background-image": {
"url(/path/to/my/image.png)"
}
}
"style": {
"color": {
"rgb(0,0,0)"
}
}
"style": {
"color": {
"rgba(0,0,0,0)"
}
}
"style": {
"width": {
"calc(100px - 1px)"
}
}
"style": {
"height": {
"calc(100px - 1px)"
}
}
"style": {
"max-width": {
"calc(100px - 1px)"
}
}
"style": {
"max-height": {
"calc(100px - 1px)"
}
}
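A note on why none of these render (my inference from the ✔ By Design label, not stated in the report): each snippet wraps a bare string in an extra pair of braces, so the JSON is unparseable and the formatter is discarded before anything is rendered. A minimal check with Python's `json` module — the `broken` and `valid` strings below are illustrative:

```python
import json

# One of the failing snippets, wrapped in a root object: the inner
# braces contain a bare string with no key, which is not valid JSON.
broken = '{ "style": { "background-color": { "rgba(0,0,0,0)" } } }'
try:
    json.loads(broken)
except json.JSONDecodeError as err:
    print("rejected:", err.msg)

# Mapping the CSS property directly to a string value parses fine.
valid = '{ "style": { "background-color": "rgba(0,0,0,0)" } }'
print(json.loads(valid)["style"]["background-color"])  # -> rgba(0,0,0,0)
```

If each `style` property is meant to map straight to a string value (as the valid form suggests), the brackets themselves are harmless — it is the extra braces that break parsing.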
| 1.0 | non_main | 0
110,509 | 9,458,699,672 | IssuesEvent | 2019-04-17 06:23:39 | zeroc-ice/ice | https://api.github.com/repos/zeroc-ice/ice | opened | Xamarin C# controller no longer builds | csharp testsuite | ```
Target _CheckSupportedAbis:
/Library/Frameworks/Mono.framework/External/xbuild/Xamarin/Android/Xamarin.Android.Common.targets(1065,2): error XA0115: Invalid value 'armeabi' in $(AndroidSupportedAbis). This ABI is no longer supported. Please update your project properties. [/Users/vagrant/workspace/ice-dist/3.7/dist-utils/build/ice/builds/ice-clang-default/csharp/test/xamarin/controller.Android/controller.Android.csproj]
Done building target "_CheckSupportedAbis" in project "controller.Android.csproj" -- FAILED.
Done building project "controller.Android.csproj" -- FAILED.
Build FAILED.
/Library/Frameworks/Mono.framework/External/xbuild/Xamarin/Android/Xamarin.Android.Common.targets(1065,2): error XA0115: Invalid value 'armeabi' in $(AndroidSupportedAbis). This ABI is no longer supported. Please update your project properties. [/Users/vagrant/workspace/ice-dist/3.7/dist-utils/build/ice/builds/ice-clang-default/csharp/test/xamarin/controller.Android/controller.Android.csproj]
0 Warning(s)
1 Error(s)
```
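The error points at the `$(AndroidSupportedAbis)` MSBuild property in `controller.Android.csproj`; recent Xamarin.Android releases retired the 32-bit `armeabi` target, so the usual fix is to drop it from the list. A hedged sketch — the exact ABI set below is an example, not taken from the project:

```xml
<PropertyGroup>
  <!-- 'armeabi' is retired (XA0115); keep only ABIs Xamarin.Android still supports -->
  <AndroidSupportedAbis>armeabi-v7a;arm64-v8a;x86;x86_64</AndroidSupportedAbis>
</PropertyGroup>
```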
| 1.0 | non_main | 0
3,283 | 12,536,563,504 | IssuesEvent | 2020-06-05 00:29:07 | Kashdeya/Tiny-Progressions | https://api.github.com/repos/Kashdeya/Tiny-Progressions | closed | Tiny Progressions 1.12 Big Pouch dupe glitch | Version not Maintainted | There is a dupe exploit with the big pouch.
If you open it and throw it, the inventory does not close and you can still take items from it.
And when you pick it back up you get double the items. | True | main | 1
4,809 | 24,768,884,171 | IssuesEvent | 2022-10-22 22:18:01 | backdrop-ops/contrib | https://api.github.com/repos/backdrop-ops/contrib | closed | Contrib Group Application: Rob Squires (port of uc_gc_client) | Port in progress Maintainer application | **Please indicate how you intend to help the Backdrop community by joining this group**
* Option 1: I would like to contribute a project
* Option 2: I would like to maintain a project, but have nothing to contribute at this time
* Option 3: I would like to update documentation and/or triage issue queues
<!-- example: Option 1 -->
Option 1
## Based on your selection above, please provide the following information:
**(option 1) The name of your module, theme, or layout**
Ubercart GoCardless Client
## (option 1) Please note these 3 requirements for new contrib projects:
- [x] Include a README.md file containing license and maintainer information.
You can use this example: https://raw.githubusercontent.com/backdrop-ops/contrib/master/examples/README.md
- [x] Include a LICENSE.txt file.
You can use this example: https://raw.githubusercontent.com/backdrop-ops/contrib/master/examples/LICENSE.txt.
- [x] If porting a Drupal 7 project, Maintain the Git history from Drupal.
I have not finished porting the module yet, but will add the README and LICENSE files as required.
I followed the instructions at https://docs.backdropcms.org/converting-modules-from-drupal for "Creating the Repository" in my personal github space. There are 3 branches in the Drupal repository (7.x-1.x, 7.x-2.x, and 8.x-1.x). However when I applied the "git remote remove origin" command as shown in the instructions, the 7.x-1.x and 8.x-1.x branches were removed from the clone. Did I do something wrong here?
**(option 1 -- optional) Post a link here to an issue in the drupal.org queue notifying the Drupal 7 maintainers that you are working on a Backdrop port of their project**
<!-- example: https://www.drupal.org/project/forum_access/issues/3070491 -->
I have omitted this step as I am the sole maintainer of the project on Drupal, and am not looking for any assistance in maintaining it at this point. The project can be seen on Drupal at https://www.drupal.org/project/uc_gc_client.
**Post a link to your new Backdrop project under your own GitHub account (option 1)**
<!-- example: https://github.com/jenlampton/forum_access -->
https://github.com/roblog/uc_gc_client/tree/1.x-2.x
**If you have chosen option 2 or 1 above, do you agree to the [Backdrop Contributed Project Agreement](https://github.com/backdrop-ops/contrib#backdrop-contributed-project-agreement)**
YES
<!-- (option 1) Once we have a chance to review your project, we will check for the 3 requirements at the top of this issue. If those requirements are met, you will be invited to the @backdrop-contrib group. At that point you will be able to transfer the project. -->
<!-- (option 1) Please note that we may also include additional feedback in the code review, but anything else is only intended to be helpful, and is NOT a requirement for joining the contrib group. -->
| True | Contrib Group Application: Rob Squires (port of uc_gc_client) - **Please indicate how you intend to help the Backdrop community by joining this group**
* Option 1: I would like to contribute a project
* Option 2: I would like to maintain a project, but have nothing to contribute at this time
* Option 3: I would like to update documentation and/or triage issue queues
<!-- example: Option 1 -->
Option 1
## Based on your selection above, please provide the following information:
**(option 1) The name of your module, theme, or layout**
Ubercart GoCardless Client
## (option 1) Please note these 3 requirements for new contrib projects:
- [x] Include a README.md file containing license and maintainer information.
You can use this example: https://raw.githubusercontent.com/backdrop-ops/contrib/master/examples/README.md
- [x] Include a LICENSE.txt file.
You can use this example: https://raw.githubusercontent.com/backdrop-ops/contrib/master/examples/LICENSE.txt.
- [x] If porting a Drupal 7 project, Maintain the Git history from Drupal.
I have not finished porting the module yet, but will add the README and LICENSE files as required.
I followed the instructions at https://docs.backdropcms.org/converting-modules-from-drupal for "Creating the Repository" in my personal github space. There are 3 branches in the Drupal repository (7.x-1.x, 7.x-2.x, and 8.x-1.x). However when I applied the "git remote remove origin" command as shown in the instructions, the 7.x-1.x and 8.x-1.x branches were removed from the clone. Did I do something wrong here?
**(option 1 -- optional) Post a link here to an issue in the drupal.org queue notifying the Drupal 7 maintainers that you are working on a Backdrop port of their project**
<!-- example: https://www.drupal.org/project/forum_access/issues/3070491 -->
I have omitted this step as I am the sole maintainer of the project on Drupal, and am not looking for any assistance in maintaining it at this point. The project can be seen on Drupal at https://www.drupal.org/project/uc_gc_client.
**Post a link to your new Backdrop project under your own GitHub account (option 1)**
<!-- example: https://github.com/jenlampton/forum_access -->
https://github.com/roblog/uc_gc_client/tree/1.x-2.x
**If you have chosen option 2 or 1 above, do you agree to the [Backdrop Contributed Project Agreement](https://github.com/backdrop-ops/contrib#backdrop-contributed-project-agreement)**
YES
<!-- (option 1) Once we have a chance to review your project, we will check for the 3 requirements at the top of this issue. If those requirements are met, you will be invited to the @backdrop-contrib group. At that point you will be able to transfer the project. -->
<!-- (option 1) Please note that we may also include additional feedback in the code review, but anything else is only intended to be helpful, and is NOT a requirement for joining the contrib group. -->
| main | contrib group application rob squires port of uc gc client please indicate how you intend to help the backdrop community by joining this group option i would like to contribute a project option i would like to maintain a project but have nothing to contribute at this time option i would like to update documentation and or triage issue queues option based on your selection above please provide the following information option the name of your module theme or layout ubercart gocardless client option please note these requirements for new contrib projects include a readme md file containing license and maintainer information you can use this example include a license txt file you can use this example if porting a drupal project maintain the git history from drupal i have not finished porting the module yet but will add the readme and license files as required i followed the instructions at for creating the repository in my personal github space there are branches in the drupal repository x x x x and x x however when i applied the git remote remove origin command as shown in the instructions the x x and x x branches were removed from the clone did i do something wrong here option optional post a link here to an issue in the drupal org queue notifying the drupal maintainers that you are working on a backdrop port of their project i have omitted this step as i am the sole maintainer of the project on drupal and am not looking for any assistance in maintaining it at this point the project can be seen on drupal at post a link to your new backdrop project under your own github account option if you have chosen option or above do you agree to the yes | 1 |
2,986 | 10,780,094,549 | IssuesEvent | 2019-11-04 12:10:28 | wtfd-tech/wtfd | https://api.github.com/repos/wtfd-tech/wtfd | closed | split config loading into it's own package (internal/config) | Backend Maintainability | maybe with a config interface, with implementations for json/yaml, env, ... | True | split config loading into it's own package (internal/config) - maybe with a config interface, with implementations for json/yaml, env, ... | main | split config loading into it s own package internal config maybe with a config interface with implementations for json yaml env | 1 |
3,515 | 13,756,887,382 | IssuesEvent | 2020-10-06 20:40:38 | carbon-design-system/carbon | https://api.github.com/repos/carbon-design-system/carbon | closed | File Uploader Component is removing both first and second file on adding second file in Internet Explorer | status: needs triage 🕵️♀️ status: waiting for maintainer response 💬 type: question ❓ | <!--
Hi there! 👋 Hope everything is going okay using projects from the Carbon Design
System. It looks like you might have a question about our work, so we wanted to
share a couple resources that you could use if you haven't tried them yet 🙂.
If you're an IBMer, we have a couple of Slack channels available across all IBM
Workspaces:
- #carbon-design-system for questions about the Design System
- #carbon-components for questions about component styles
- #carbon-react for questions about our React components
If these resources don't work out, help us out by filling out a couple of
details below!
-->
## What package(s) are you using?
<!--
Add an x in one of the options below, for example:
- [x] package name
-->
- [x] `carbon-components-angular`
## Summary
I am trying to use File Uploader component in Angular. Requirement is to upload a Single File in Internet Explorer v10 or v11.
Adding a first file is working fine, but adding a second file is removing both first and second file.
Expectation is that on adding the second file, first file should be removed and second file should be added.
This is working as expected in Chrome, but is not working in Internet Explorer.
I am not sure if this is a problem or I am doing something wrong. that's why I have raised this as a question.
It would be great help, if someone please help with this issue.
## Relevant information
Steps :
User clicks on Add file. First file is uploaded successfully.
Now the user has changed his mind, and wants to add some other file.
If the User clicks on Add file again.
It removes the First file, which is fine, but it does not add second file.
In order to verify, that I am not doing something wrong, I have tried this with minimum properties of File Uploader component.
I am using below code:
<ibm-file-uploader
[multiple]=false>
</ibm-file-uploader>
<!-- Provide as much useful information as you can -->
| True | File Uploader Component is removing both first and second file on adding second file in Internet Explorer - <!--
Hi there! 👋 Hope everything is going okay using projects from the Carbon Design
System. It looks like you might have a question about our work, so we wanted to
share a couple resources that you could use if you haven't tried them yet 🙂.
If you're an IBMer, we have a couple of Slack channels available across all IBM
Workspaces:
- #carbon-design-system for questions about the Design System
- #carbon-components for questions about component styles
- #carbon-react for questions about our React components
If these resources don't work out, help us out by filling out a couple of
details below!
-->
## What package(s) are you using?
<!--
Add an x in one of the options below, for example:
- [x] package name
-->
- [x] `carbon-components-angular`
## Summary
I am trying to use File Uploader component in Angular. Requirement is to upload a Single File in Internet Explorer v10 or v11.
Adding a first file is working fine, but adding a second file is removing both first and second file.
Expectation is that on adding the second file, first file should be removed and second file should be added.
This is working as expected in Chrome, but is not working in Internet Explorer.
I am not sure if this is a problem or I am doing something wrong. that's why I have raised this as a question.
It would be great help, if someone please help with this issue.
## Relevant information
Steps :
User clicks on Add file. First file is uploaded successfully.
Now the user has changed his mind, and wants to add some other file.
If the User clicks on Add file again.
It removes the First file, which is fine, but it does not add second file.
In order to verify, that I am not doing something wrong, I have tried this with minimum properties of File Uploader component.
I am using below code:
<ibm-file-uploader
[multiple]=false>
</ibm-file-uploader>
<!-- Provide as much useful information as you can -->
| main | file uploader component is removing both first and second file on adding second file in internet explorer hi there 👋 hope everything is going okay using projects from the carbon design system it looks like you might have a question about our work so we wanted to share a couple resources that you could use if you haven t tried them yet 🙂 if you re an ibmer we have a couple of slack channels available across all ibm workspaces carbon design system for questions about the design system carbon components for questions about component styles carbon react for questions about our react components if these resources don t work out help us out by filling out a couple of details below what package s are you using add an x in one of the options below for example package name carbon components angular summary i am trying to use file uploader component in angular requirement is to upload a single file in internet explorer or adding a first file is working fine but adding a second file is removing both first and second file expectation is that on adding the second file first file should be removed and second file should be added this is working as expected in chrome but is not working in internet explorer i am not sure if this is a problem or i am doing something wrong that s why i have raised this as a question it would be great help if someone please help with this issue relevant information steps user clicks on add file first file is uploaded successfully now the user has changed his mind and wants to add some other file if the user clicks on add file again it removes the first file which is fine but it does not add second file in order to verify that i am not doing something wrong i have tried this with minimum properties of file uploader component i am using below code ibm file uploader false | 1 |
298,009 | 9,188,383,712 | IssuesEvent | 2019-03-06 07:13:25 | gbv/cocoda | https://api.github.com/repos/gbv/cocoda | opened | Idea: Allow displaying non-existent URIs/notations | feature low priority question | I've had this idea when we recently talked about synthetic notations in DDC. We could allow displaying non-existent concepts by deconstructing the URI to get the notation. Of course, those concepts would have no other information, i.e. no label or additional information. If we then allow people to "create" new concepts by providing a notation, those new concepts could be used for mappings.
In case of DDC, we could at some point show the decomposition of the notation so that there would be some additional information about those concepts. For other concept schemes, we could try to determine possibly related concepts by slicing the notation. | 1.0 | Idea: Allow displaying non-existent URIs/notations - I've had this idea when we recently talked about synthetic notations in DDC. We could allow displaying non-existent concepts by deconstructing the URI to get the notation. Of course, those concepts would have no other information, i.e. no label or additional information. If we then allow people to "create" new concepts by providing a notation, those new concepts could be used for mappings.
In case of DDC, we could at some point show the decomposition of the notation so that there would be some additional information about those concepts. For other concept schemes, we could try to determine possibly related concepts by slicing the notation. | non_main | idea allow displaying non existent uris notations i ve had this idea when we recently talked about synthetic notations in ddc we could allow displaying non existent concepts by deconstructing the uri to get the notation of course those concepts would have no other information i e no label or additional information if we then allow people to create new concepts by providing a notation those new concepts could be used for mappings in case of ddc we could at some point show the decomposition of the notation so that there would be some additional information about those concepts for other concept schemes we could try to determine possibly related concepts by slicing the notation | 0 |
5,771 | 30,587,553,344 | IssuesEvent | 2023-07-21 14:27:12 | carbon-design-system/carbon | https://api.github.com/repos/carbon-design-system/carbon | closed | [Bug]: HeaderMenuButton is not show when the width is more than 1055px | type: bug 🐛 status: needs triage 🕵️♀️ status: waiting for maintainer response 💬 | ### Package
@carbon/react
### Browser
Chrome, Firefox
### Package version
1.33.2
### React version
18.2.0
### Description
i am new guy to the carbon design system . when i add the HeaderMenuButton to the Header element, the HeaderMenuButton is not shown when the width is more than 1055px in my pc. i try the same the code in stackblitz, if i open the result page in a new tab(the page become larger) ,the HeaderMenuButton is hidden.i don`t know it is a bug or is my mistake, help me please ,thanks very much.
### Reproduction/example
https://stackblitz.com/edit/github-vvnkhg?file=src%2FApp.jsx,src%2Fmain.jsx
### Steps to reproduce
import { Header, HeaderMenuButton } from '@carbon/react';
import React from 'react';
export default function App() {
return (
<div>
<Header>
<HeaderMenuButton></HeaderMenuButton>
</Header>
</div>
);
}
### Suggested Severity
Severity 2 = User cannot complete task, and/or no workaround within the user experience of a given component.
### Application/PAL
browser
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/carbon-design-system/carbon/blob/f555616971a03fd454c0f4daea184adf41fff05b/.github/CODE_OF_CONDUCT.md)
- [X] I checked the [current issues](https://github.com/carbon-design-system/carbon/issues) for duplicate problems | True | [Bug]: HeaderMenuButton is not show when the width is more than 1055px - ### Package
@carbon/react
### Browser
Chrome, Firefox
### Package version
1.33.2
### React version
18.2.0
### Description
i am new guy to the carbon design system . when i add the HeaderMenuButton to the Header element, the HeaderMenuButton is not shown when the width is more than 1055px in my pc. i try the same the code in stackblitz, if i open the result page in a new tab(the page become larger) ,the HeaderMenuButton is hidden.i don`t know it is a bug or is my mistake, help me please ,thanks very much.
### Reproduction/example
https://stackblitz.com/edit/github-vvnkhg?file=src%2FApp.jsx,src%2Fmain.jsx
### Steps to reproduce
import { Header, HeaderMenuButton } from '@carbon/react';
import React from 'react';
export default function App() {
return (
<div>
<Header>
<HeaderMenuButton></HeaderMenuButton>
</Header>
</div>
);
}
### Suggested Severity
Severity 2 = User cannot complete task, and/or no workaround within the user experience of a given component.
### Application/PAL
browser
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/carbon-design-system/carbon/blob/f555616971a03fd454c0f4daea184adf41fff05b/.github/CODE_OF_CONDUCT.md)
- [X] I checked the [current issues](https://github.com/carbon-design-system/carbon/issues) for duplicate problems | main | headermenubutton is not show when the width is more than package carbon react browser chrome firefox package version react version description i am new guy to the carbon design system when i add the headermenubutton to the header element the headermenubutton is not shown when the width is more than in my pc i try the same the code in stackblitz if i open the result page in a new tab the page become larger the headermenubutton is hidden i don t know it is a bug or is my mistake help me please thanks very much reproduction example steps to reproduce import header headermenubutton from carbon react import react from react export default function app return suggested severity severity user cannot complete task and or no workaround within the user experience of a given component application pal browser code of conduct i agree to follow this project s i checked the for duplicate problems | 1 |
4,580 | 23,778,790,463 | IssuesEvent | 2022-09-02 00:48:31 | chocolatey-community/chocolatey-package-requests | https://api.github.com/repos/chocolatey-community/chocolatey-package-requests | closed | RFP - logseq | Status: Available For Maintainer(s) | ## Checklist
- [x] The package I am requesting does not already exist on https://chocolatey.org/packages;
- [x] There is no open issue for this package;
- [x] The issue title starts with 'RFP - ';
- [x] The download URL is public and not locked behind a paywall / login;
## Package Details
Software project URL : https://github.com/logseq/logseq
Direct download URL for the software / installer : https://github.com/logseq/logseq/releases/download/0.4.4/logseq-win-x64-0.4.4.exe
Software summary / short description: An Open Source login-free version of Obsidian, an aesthetically better alternative to Trilium, Foam and Dendron without VSCode
| True | RFP - logseq - ## Checklist
- [x] The package I am requesting does not already exist on https://chocolatey.org/packages;
- [x] There is no open issue for this package;
- [x] The issue title starts with 'RFP - ';
- [x] The download URL is public and not locked behind a paywall / login;
## Package Details
Software project URL : https://github.com/logseq/logseq
Direct download URL for the software / installer : https://github.com/logseq/logseq/releases/download/0.4.4/logseq-win-x64-0.4.4.exe
Software summary / short description: An Open Source login-free version of Obsidian, an aesthetically better alternative to Trilium, Foam and Dendron without VSCode
| main | rfp logseq checklist the package i am requesting does not already exist on there is no open issue for this package the issue title starts with rfp the download url is public and not locked behind a paywall login package details software project url direct download url for the software installer software summary short description an open source login free version of obsidian an aesthetically better alternative to trilium foam and dendron without vscode | 1 |
173,289 | 6,523,185,277 | IssuesEvent | 2017-08-29 07:37:32 | FreeAndFair/ColoradoRLA | https://api.github.com/repos/FreeAndFair/ColoradoRLA | closed | Client behavior for County after file uploads before SoS starts audit | CDOS Priority client feature | From CDOS 8/28 feedback:
After successfully uploading hte CVR and the ballot manifest files a message should appear to notify the county to "stand-by" until the state performs their part of the audit. The "Audit Board Sign-in" button should be grayed out until the state starts the audit. | 1.0 | Client behavior for County after file uploads before SoS starts audit - From CDOS 8/28 feedback:
After successfully uploading hte CVR and the ballot manifest files a message should appear to notify the county to "stand-by" until the state performs their part of the audit. The "Audit Board Sign-in" button should be grayed out until the state starts the audit. | non_main | client behavior for county after file uploads before sos starts audit from cdos feedback after successfully uploading hte cvr and the ballot manifest files a message should appear to notify the county to stand by until the state performs their part of the audit the audit board sign in button should be grayed out until the state starts the audit | 0 |
231,761 | 25,543,260,535 | IssuesEvent | 2022-11-29 16:45:04 | dotJEM/json-index | https://api.github.com/repos/dotJEM/json-index | closed | WS-2022-0161 (High) detected in newtonsoft.json.8.0.2.nupkg - autoclosed | security vulnerability | ## WS-2022-0161 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>newtonsoft.json.8.0.2.nupkg</b></p></summary>
<p>Json.NET is a popular high-performance JSON framework for .NET</p>
<p>Library home page: <a href="https://api.nuget.org/packages/newtonsoft.json.8.0.2.nupkg">https://api.nuget.org/packages/newtonsoft.json.8.0.2.nupkg</a></p>
<p>Path to dependency file: /DotJEM.Json.Index.Benchmarks/DotJEM.Json.Index.Benchmarks.csproj</p>
<p>Path to vulnerable library: /et/packages/newtonsoft.json/8.0.2/newtonsoft.json.8.0.2.nupkg,/et/packages/newtonsoft.json/8.0.2/newtonsoft.json.8.0.2.nupkg,/et/packages/newtonsoft.json/8.0.2/newtonsoft.json.8.0.2.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **newtonsoft.json.8.0.2.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/dotJEM/json-index/commit/f2be35ee966dc2bd4ca639e9696883ac57ede085">f2be35ee966dc2bd4ca639e9696883ac57ede085</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Improper Handling of Exceptional Conditions in Newtonsoft.Json.
Newtonsoft.Json prior to version 13.0.1 is vulnerable to Insecure Defaults due to improper handling of StackOverFlow exception (SOE) whenever nested expressions are being processed. Exploiting this vulnerability results in Denial Of Service (DoS), and it is exploitable when an attacker sends 5 requests that cause SOE in time frame of 5 minutes. This vulnerability affects Internet Information Services (IIS) Applications.
<p>Publish Date: 2022-06-22
<p>URL: <a href=https://github.com/JamesNK/Newtonsoft.Json/commit/7e77bbe1beccceac4fc7b174b53abfefac278b66>WS-2022-0161</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-06-22</p>
<p>Fix Resolution: Newtonsoft.Json - 13.0.1;Microsoft.Extensions.ApiDescription.Server - 6.0.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | WS-2022-0161 (High) detected in newtonsoft.json.8.0.2.nupkg - autoclosed - ## WS-2022-0161 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>newtonsoft.json.8.0.2.nupkg</b></p></summary>
<p>Json.NET is a popular high-performance JSON framework for .NET</p>
<p>Library home page: <a href="https://api.nuget.org/packages/newtonsoft.json.8.0.2.nupkg">https://api.nuget.org/packages/newtonsoft.json.8.0.2.nupkg</a></p>
<p>Path to dependency file: /DotJEM.Json.Index.Benchmarks/DotJEM.Json.Index.Benchmarks.csproj</p>
<p>Path to vulnerable library: /et/packages/newtonsoft.json/8.0.2/newtonsoft.json.8.0.2.nupkg,/et/packages/newtonsoft.json/8.0.2/newtonsoft.json.8.0.2.nupkg,/et/packages/newtonsoft.json/8.0.2/newtonsoft.json.8.0.2.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **newtonsoft.json.8.0.2.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/dotJEM/json-index/commit/f2be35ee966dc2bd4ca639e9696883ac57ede085">f2be35ee966dc2bd4ca639e9696883ac57ede085</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Improper Handling of Exceptional Conditions in Newtonsoft.Json.
Newtonsoft.Json prior to version 13.0.1 is vulnerable to Insecure Defaults due to improper handling of StackOverFlow exception (SOE) whenever nested expressions are being processed. Exploiting this vulnerability results in Denial Of Service (DoS), and it is exploitable when an attacker sends 5 requests that cause SOE in time frame of 5 minutes. This vulnerability affects Internet Information Services (IIS) Applications.
<p>Publish Date: 2022-06-22
<p>URL: <a href=https://github.com/JamesNK/Newtonsoft.Json/commit/7e77bbe1beccceac4fc7b174b53abfefac278b66>WS-2022-0161</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-06-22</p>
<p>Fix Resolution: Newtonsoft.Json - 13.0.1;Microsoft.Extensions.ApiDescription.Server - 6.0.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_main | ws high detected in newtonsoft json nupkg autoclosed ws high severity vulnerability vulnerable library newtonsoft json nupkg json net is a popular high performance json framework for net library home page a href path to dependency file dotjem json index benchmarks dotjem json index benchmarks csproj path to vulnerable library et packages newtonsoft json newtonsoft json nupkg et packages newtonsoft json newtonsoft json nupkg et packages newtonsoft json newtonsoft json nupkg dependency hierarchy x newtonsoft json nupkg vulnerable library found in head commit a href found in base branch master vulnerability details improper handling of exceptional conditions in newtonsoft json newtonsoft json prior to version is vulnerable to insecure defaults due to improper handling of stackoverflow exception soe whenever nested expressions are being processed exploiting this vulnerability results in denial of service dos and it is exploitable when an attacker sends requests that cause soe in time frame of minutes this vulnerability affects internet information services iis applications publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution newtonsoft json microsoft extensions apidescription server step up your open source security game with mend | 0 |
3,558 | 14,201,548,538 | IssuesEvent | 2020-11-16 07:53:25 | bcurran3/ChocolateyPackages | https://api.github.com/repos/bcurran3/ChocolateyPackages | closed | resilio-sync-business: checksum fail for v2.6.3.1340 | !LOOKING_FOR_NEW_MAINTAINER! Done Update_Package | Just an FYI, the resilio-sync-business package is failing checksums. | True | resilio-sync-business: checksum fail for v2.6.3.1340 - Just an FYI, the resilio-sync-business package is failing checksums. | main | resilio sync business checksum fail for just an fyi the resilio sync business package is failing checksums | 1 |
634,450 | 20,361,595,965 | IssuesEvent | 2022-02-20 19:13:14 | Biologer/Biologer | https://api.github.com/repos/Biologer/Biologer | opened | Observations of cultivated, allochthonous and invasive species | enhancement priority:low | Until now it has been possible to mark a taxon as allochthonous or invasive from the taxonomic tree. Jožef Dožai proposed that we enable marking a taxon as allochthonous in cases where the species has been introduced into a habitat where it does not belong. Good examples are plants whose natural range lies within Serbia but which have been transplanted to locations where they do not grow naturally.
If we decide to implement this, we need to
1. Create a checkbox "Outside natural habitat" when entering observations from the field.
2. Mark such observations with a different color on the map. | 1.0 | Observations of cultivated, allochthonous and invasive species - Until now it has been possible to mark a taxon as allochthonous or invasive from the taxonomic tree. Jožef Dožai proposed that we enable marking a taxon as allochthonous in cases where the species has been introduced into a habitat where it does not belong. Good examples are plants whose natural range lies within Serbia but which have been transplanted to locations where they do not grow naturally.
If we decide to implement this, we need to
1. Create a checkbox "Outside natural habitat" when entering observations from the field.
2. Mark such observations with a different color on the map. | non_main | observations of cultivated allochthonous and invasive species until now it has been possible to mark a taxon as allochthonous or invasive from the taxonomic tree jožef dožai proposed that we enable marking a taxon as allochthonous in cases where the species has been introduced into a habitat where it does not belong good examples are plants whose natural range lies within serbia but which have been transplanted to locations where they do not grow naturally if we decide to implement this we need to create a checkbox outside natural habitat when entering observations from the field mark such observations with a different color on the map | 0
3,305 | 12,798,424,052 | IssuesEvent | 2020-07-02 13:54:38 | precice/precice | https://api.github.com/repos/precice/precice | opened | Improve test coverage for WatchPoint | good first issue maintainability | We currently don't test for cases **without** mesh connectivity. And we don't test for parallel (distributed) meshes.
Current tests: https://github.com/precice/precice/blob/develop/src/precice/tests/WatchPointTest.cpp | True | Improve test coverage for WatchPoint - We currently don't test for cases **without** mesh connectivity. And we don't test for parallel (distributed) meshes.
Current tests: https://github.com/precice/precice/blob/develop/src/precice/tests/WatchPointTest.cpp | main | improve test coverage for watchpoint we currently don t test for cases without mesh connectivity and we don t test for parallel distributed meshes current tests | 1 |
56,292 | 6,973,003,210 | IssuesEvent | 2017-12-11 18:58:11 | pennsignals/gentry | https://api.github.com/repos/pennsignals/gentry | closed | Refactor encode.go | design refactoring | **Suggestion:** Remove [`Product`](https://github.com/pennsignals/gentry/blob/design/rename-converter-to-encoder/encode.go#L12) method from Converter interface
**Purpose:** The products produced by the concrete builders differ greatly in their representation. For example, the product produced for a Slack message is different than the product produced for an SMS text.
**Suggestion:** Update [`BufferedReader.Parse`](https://github.com/pennsignals/gentry/blob/design/rename-converter-to-encoder/encode.go#L31) method
**Purpose:** The primary responsibility of the `Parse` method is to direct the encoding of the Consul Health Checks to the encoder object. Instead of invoking the `Encode` method on each check object, do it once for all checks (e.g., `r.encoder.Encode(checks)`).
**Suggestion:** Update [`PostMessageEncoder.Encode`](https://github.com/pennsignals/gentry/blob/design/rename-converter-to-encoder/encode.go#L52) method
**Purpose:** This suggestion correlates with the previous suggestion. The `PostMessageEncoder` maintains an instance of a [`PostMessage `](https://github.com/pennsignals/gentry/blob/design/rename-converter-to-encoder/slack.go#L48) object (product). The `Encode` method needs to traverse the `checks` object and convert each `check` object to an [`Attachment`](https://github.com/pennsignals/gentry/blob/design/rename-converter-to-encoder/slack.go#L3) object. | 1.0 | Refactor encode.go - **Suggestion:** Remove [`Product`](https://github.com/pennsignals/gentry/blob/design/rename-converter-to-encoder/encode.go#L12) method from Converter interface
**Purpose:** The products produced by the concrete builders differ greatly in their representation. For example, the product produced for a Slack message is different than the product produced for an SMS text.
**Suggestion:** Update [`BufferedReader.Parse`](https://github.com/pennsignals/gentry/blob/design/rename-converter-to-encoder/encode.go#L31) method
**Purpose:** The primary responsibility of the `Parse` method is to direct the encoding of the Consul Health Checks to the encoder object. Instead of invoking the `Encode` method on each check object, do it once for all checks (e.g., `r.encoder.Encode(checks)`).
**Suggestion:** Update [`PostMessageEncoder.Encode`](https://github.com/pennsignals/gentry/blob/design/rename-converter-to-encoder/encode.go#L52) method
**Purpose:** This suggestion correlates with the previous suggestion. The `PostMessageEncoder` maintains an instance of a [`PostMessage `](https://github.com/pennsignals/gentry/blob/design/rename-converter-to-encoder/slack.go#L48) object (product). The `Encode` method needs to traverse the `checks` object and convert each `check` object to an [`Attachment`](https://github.com/pennsignals/gentry/blob/design/rename-converter-to-encoder/slack.go#L3) object. | non_main | refactor encode go suggestion remove method from converter interface purpose the products produced by the concrete builders differ greatly in their representation for example the product produced for a slack message is different than the product produced for an sms text suggestion update method purpose the primary responsibility of the parse method is to direct the encoding of the consul health checks to the encoder object instead of invoking the encode method on each check object do it once for all checks e g r encoder encode checks suggestion update method purpose this suggestion correlates with the previous suggestion the postmessageencoder maintains an instance of a object product the encode method needs to traverse the checks object and convert each check object to an object | 0 |
193,110 | 14,635,272,598 | IssuesEvent | 2020-12-24 07:47:25 | pingcap/tidb | https://api.github.com/repos/pingcap/tidb | closed | DATA_RACE:runtime.mapassign_faststr() failed | component/test | DATA_RACE:runtime.mapassign_faststr()
```
[2020-11-09T10:50:22.717Z] WARNING: DATA RACE
[2020-11-09T10:50:22.717Z] runtime.mapassign_faststr()
[2020-11-09T10:50:22.717Z] /usr/local/go/src/runtime/map_faststr.go:202 +0x0
[2020-11-09T10:50:22.717Z] github.com/pingcap/parser/terror.ErrClass.initError()
[2020-11-09T10:50:22.717Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/parser@v0.0.0-20201109022253-d384bee1451e/terror/terror.go:140 +0x244
[2020-11-09T10:50:22.717Z] github.com/pingcap/parser/terror.ErrClass.NewStdErr()
[2020-11-09T10:50:22.717Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/parser@v0.0.0-20201109022253-d384bee1451e/terror/terror.go:161 +0x4f
[2020-11-09T10:50:22.717Z] github.com/pingcap/tidb/executor.(*InsertValues).handleErr()
[2020-11-09T10:50:22.717Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/insert_common.go:300 +0xa54
[2020-11-09T10:50:22.717Z] github.com/pingcap/tidb/executor.(*InsertValues).fastEvalRow()
[2020-11-09T10:50:22.717Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/insert_common.go:373 +0x569
[2020-11-09T10:50:22.717Z] github.com/pingcap/tidb/executor.(*InsertValues).fastEvalRow-fm()
[2020-11-09T10:50:22.717Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/insert_common.go:359 +0xaa
[2020-11-09T10:50:22.717Z] github.com/pingcap/tidb/executor.insertRows()
[2020-11-09T10:50:22.717Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/insert_common.go:233 +0x3b6
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*InsertExec).Next()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/insert.go:288 +0x117
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.Next()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/executor.go:268 +0x27d
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*ExecStmt).handleNoDelayExecutor()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/adapter.go:522 +0x38e
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*ExecStmt).handleNoDelay()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/adapter.go:404 +0x254
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*ExecStmt).Exec()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/adapter.go:354 +0x3f6
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/session.runStmt()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:1285 +0x2c1
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/session.(*session).ExecuteStmt()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:1229 +0xa57
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/util/testkit.(*TestKit).Exec()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/util/testkit/testkit.go:170 +0x2f1
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor_test.(*testSuite).TestInsert()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/write_test.go:274 +0x3a00
[2020-11-09T10:50:22.718Z] runtime.call32()
[2020-11-09T10:50:22.718Z] /usr/local/go/src/runtime/asm_amd64.s:539 +0x3a
[2020-11-09T10:50:22.718Z] reflect.Value.Call()
[2020-11-09T10:50:22.718Z] /usr/local/go/src/reflect/value.go:321 +0xd3
[2020-11-09T10:50:22.718Z] github.com/pingcap/check.(*suiteRunner).forkTest.func1()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:850 +0x9aa
[2020-11-09T10:50:22.718Z] github.com/pingcap/check.(*suiteRunner).forkCall.func1()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:739 +0x113
[2020-11-09T10:50:22.718Z]
[2020-11-09T10:50:22.718Z] Previous read at 0x00c0001de450 by goroutine 418:
[2020-11-09T10:50:22.718Z] runtime.mapaccess2_faststr()
[2020-11-09T10:50:22.718Z] /usr/local/go/src/runtime/map_faststr.go:107 +0x0
[2020-11-09T10:50:22.718Z] github.com/pingcap/parser/terror.getMySQLErrorCode()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/parser@v0.0.0-20201109022253-d384bee1451e/terror/terror.go:195 +0xeb
[2020-11-09T10:50:22.718Z] github.com/pingcap/parser/terror.ToSQLError()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/parser@v0.0.0-20201109022253-d384bee1451e/terror/terror.go:185 +0x50
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*ShowExec).fetchShowWarnings()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/show.go:1400 +0x27e
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*ShowExec).fetchAll()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/show.go:173 +0x218
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*ShowExec).Next()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/show.go:100 +0x522
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.Next()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/executor.go:268 +0x27d
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*recordSet).Next()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/adapter.go:128 +0x110
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/session.(*execStmtResult).Next()
[2020-11-09T10:50:22.718Z] <autogenerated>:1 +0x84
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/session.GetRows4Test()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/tidb.go:287 +0x35a
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/session.ResultSetToStringSlice()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/tidb.go:305 +0xb8
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/util/testkit.(*TestKit).ResultSetToResultWithCtx()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/util/testkit/testkit.go:321 +0xa8
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/util/testkit.(*TestKit).MustQuery()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/util/testkit/testkit.go:316 +0x4e4
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor_test.(*testSuite8).TestUpdate()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/write_test.go:1413 +0x1362
[2020-11-09T10:50:22.718Z] runtime.call32()
[2020-11-09T10:50:22.718Z] /usr/local/go/src/runtime/asm_amd64.s:539 +0x3a
[2020-11-09T10:50:22.719Z] reflect.Value.Call()
[2020-11-09T10:50:22.719Z] /usr/local/go/src/reflect/value.go:321 +0xd3
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).forkTest.func1()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:850 +0x9aa
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).forkCall.func1()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:739 +0x113
[2020-11-09T10:50:22.719Z]
[2020-11-09T10:50:22.719Z] Goroutine 106 (running) created at:
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).forkCall()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:734 +0x4a3
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).forkTest()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:832 +0x1b9
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).doRun()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:666 +0x13a
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).asyncRun.func1()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:650 +0xf7
[2020-11-09T10:50:22.719Z]
[2020-11-09T10:50:22.719Z] Goroutine 418 (running) created at:
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).forkCall()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:734 +0x4a3
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).forkTest()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:832 +0x1b9
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).doRun()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:666 +0x13a
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).asyncRun.func1()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:650 +0xf7
[2020-11-09T10:50:22.719Z] ==================
```
Latest failed builds:
https://internal.pingcap.net/idc-jenkins/job/tidb_ghpr_unit_test/57353/display/redirect
| 1.0 | DATA_RACE:runtime.mapassign_faststr() failed - DATA_RACE:runtime.mapassign_faststr()
```
[2020-11-09T10:50:22.717Z] WARNING: DATA RACE
[2020-11-09T10:50:22.717Z] runtime.mapassign_faststr()
[2020-11-09T10:50:22.717Z] /usr/local/go/src/runtime/map_faststr.go:202 +0x0
[2020-11-09T10:50:22.717Z] github.com/pingcap/parser/terror.ErrClass.initError()
[2020-11-09T10:50:22.717Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/parser@v0.0.0-20201109022253-d384bee1451e/terror/terror.go:140 +0x244
[2020-11-09T10:50:22.717Z] github.com/pingcap/parser/terror.ErrClass.NewStdErr()
[2020-11-09T10:50:22.717Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/parser@v0.0.0-20201109022253-d384bee1451e/terror/terror.go:161 +0x4f
[2020-11-09T10:50:22.717Z] github.com/pingcap/tidb/executor.(*InsertValues).handleErr()
[2020-11-09T10:50:22.717Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/insert_common.go:300 +0xa54
[2020-11-09T10:50:22.717Z] github.com/pingcap/tidb/executor.(*InsertValues).fastEvalRow()
[2020-11-09T10:50:22.717Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/insert_common.go:373 +0x569
[2020-11-09T10:50:22.717Z] github.com/pingcap/tidb/executor.(*InsertValues).fastEvalRow-fm()
[2020-11-09T10:50:22.717Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/insert_common.go:359 +0xaa
[2020-11-09T10:50:22.717Z] github.com/pingcap/tidb/executor.insertRows()
[2020-11-09T10:50:22.717Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/insert_common.go:233 +0x3b6
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*InsertExec).Next()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/insert.go:288 +0x117
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.Next()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/executor.go:268 +0x27d
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*ExecStmt).handleNoDelayExecutor()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/adapter.go:522 +0x38e
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*ExecStmt).handleNoDelay()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/adapter.go:404 +0x254
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*ExecStmt).Exec()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/adapter.go:354 +0x3f6
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/session.runStmt()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:1285 +0x2c1
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/session.(*session).ExecuteStmt()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:1229 +0xa57
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/util/testkit.(*TestKit).Exec()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/util/testkit/testkit.go:170 +0x2f1
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor_test.(*testSuite).TestInsert()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/write_test.go:274 +0x3a00
[2020-11-09T10:50:22.718Z] runtime.call32()
[2020-11-09T10:50:22.718Z] /usr/local/go/src/runtime/asm_amd64.s:539 +0x3a
[2020-11-09T10:50:22.718Z] reflect.Value.Call()
[2020-11-09T10:50:22.718Z] /usr/local/go/src/reflect/value.go:321 +0xd3
[2020-11-09T10:50:22.718Z] github.com/pingcap/check.(*suiteRunner).forkTest.func1()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:850 +0x9aa
[2020-11-09T10:50:22.718Z] github.com/pingcap/check.(*suiteRunner).forkCall.func1()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:739 +0x113
[2020-11-09T10:50:22.718Z]
[2020-11-09T10:50:22.718Z] Previous read at 0x00c0001de450 by goroutine 418:
[2020-11-09T10:50:22.718Z] runtime.mapaccess2_faststr()
[2020-11-09T10:50:22.718Z] /usr/local/go/src/runtime/map_faststr.go:107 +0x0
[2020-11-09T10:50:22.718Z] github.com/pingcap/parser/terror.getMySQLErrorCode()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/parser@v0.0.0-20201109022253-d384bee1451e/terror/terror.go:195 +0xeb
[2020-11-09T10:50:22.718Z] github.com/pingcap/parser/terror.ToSQLError()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/parser@v0.0.0-20201109022253-d384bee1451e/terror/terror.go:185 +0x50
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*ShowExec).fetchShowWarnings()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/show.go:1400 +0x27e
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*ShowExec).fetchAll()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/show.go:173 +0x218
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*ShowExec).Next()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/show.go:100 +0x522
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.Next()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/executor.go:268 +0x27d
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor.(*recordSet).Next()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/adapter.go:128 +0x110
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/session.(*execStmtResult).Next()
[2020-11-09T10:50:22.718Z] <autogenerated>:1 +0x84
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/session.GetRows4Test()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/tidb.go:287 +0x35a
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/session.ResultSetToStringSlice()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/tidb.go:305 +0xb8
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/util/testkit.(*TestKit).ResultSetToResultWithCtx()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/util/testkit/testkit.go:321 +0xa8
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/util/testkit.(*TestKit).MustQuery()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/util/testkit/testkit.go:316 +0x4e4
[2020-11-09T10:50:22.718Z] github.com/pingcap/tidb/executor_test.(*testSuite8).TestUpdate()
[2020-11-09T10:50:22.718Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/executor/write_test.go:1413 +0x1362
[2020-11-09T10:50:22.718Z] runtime.call32()
[2020-11-09T10:50:22.718Z] /usr/local/go/src/runtime/asm_amd64.s:539 +0x3a
[2020-11-09T10:50:22.719Z] reflect.Value.Call()
[2020-11-09T10:50:22.719Z] /usr/local/go/src/reflect/value.go:321 +0xd3
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).forkTest.func1()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:850 +0x9aa
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).forkCall.func1()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:739 +0x113
[2020-11-09T10:50:22.719Z]
[2020-11-09T10:50:22.719Z] Goroutine 106 (running) created at:
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).forkCall()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:734 +0x4a3
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).forkTest()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:832 +0x1b9
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).doRun()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:666 +0x13a
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).asyncRun.func1()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:650 +0xf7
[2020-11-09T10:50:22.719Z]
[2020-11-09T10:50:22.719Z] Goroutine 418 (running) created at:
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).forkCall()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:734 +0x4a3
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).forkTest()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:832 +0x1b9
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).doRun()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:666 +0x13a
[2020-11-09T10:50:22.719Z] github.com/pingcap/check.(*suiteRunner).asyncRun.func1()
[2020-11-09T10:50:22.719Z] /home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/pkg/mod/github.com/pingcap/check@v0.0.0-20200212061837-5e12011dc712/check.go:650 +0xf7
[2020-11-09T10:50:22.719Z] ==================
```
Latest failed builds:
https://internal.pingcap.net/idc-jenkins/job/tidb_ghpr_unit_test/57353/display/redirect
| non_main | data race runtime mapassign faststr failed data race runtime mapassign faststr warning data race runtime mapassign faststr usr local go src runtime map faststr go github com pingcap parser terror errclass initerror home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap parser terror terror go github com pingcap parser terror errclass newstderr home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap parser terror terror go github com pingcap tidb executor insertvalues handleerr home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor insert common go github com pingcap tidb executor insertvalues fastevalrow home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor insert common go github com pingcap tidb executor insertvalues fastevalrow fm home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor insert common go github com pingcap tidb executor insertrows home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor insert common go github com pingcap tidb executor insertexec next home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor insert go github com pingcap tidb executor next home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor executor go github com pingcap tidb executor execstmt handlenodelayexecutor home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor adapter go github com pingcap tidb executor execstmt handlenodelay home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor adapter go github com pingcap tidb executor execstmt exec home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor adapter go github com pingcap tidb session runstmt home jenkins agent workspace tidb ghpr unit test go src github com pingcap 
tidb session session go github com pingcap tidb session session executestmt home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb session session go github com pingcap tidb util testkit testkit exec home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb util testkit testkit go github com pingcap tidb executor test testsuite testinsert home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor write test go runtime usr local go src runtime asm s reflect value call usr local go src reflect value go github com pingcap check suiterunner forktest home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap check check go github com pingcap check suiterunner forkcall home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap check check go previous read at by goroutine runtime faststr usr local go src runtime map faststr go github com pingcap parser terror getmysqlerrorcode home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap parser terror terror go github com pingcap parser terror tosqlerror home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap parser terror terror go github com pingcap tidb executor showexec fetchshowwarnings home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor show go github com pingcap tidb executor showexec fetchall home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor show go github com pingcap tidb executor showexec next home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor show go github com pingcap tidb executor next home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor executor go github com pingcap tidb executor recordset next home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor adapter go github com 
pingcap tidb session execstmtresult next github com pingcap tidb session home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb session tidb go github com pingcap tidb session resultsettostringslice home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb session tidb go github com pingcap tidb util testkit testkit resultsettoresultwithctx home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb util testkit testkit go github com pingcap tidb util testkit testkit mustquery home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb util testkit testkit go github com pingcap tidb executor test testupdate home jenkins agent workspace tidb ghpr unit test go src github com pingcap tidb executor write test go runtime usr local go src runtime asm s reflect value call usr local go src reflect value go github com pingcap check suiterunner forktest home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap check check go github com pingcap check suiterunner forkcall home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap check check go goroutine running created at github com pingcap check suiterunner forkcall home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap check check go github com pingcap check suiterunner forktest home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap check check go github com pingcap check suiterunner dorun home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap check check go github com pingcap check suiterunner asyncrun home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap check check go goroutine running created at github com pingcap check suiterunner forkcall home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap check check go github com pingcap check suiterunner forktest home jenkins agent 
workspace tidb ghpr unit test go pkg mod github com pingcap check check go github com pingcap check suiterunner dorun home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap check check go github com pingcap check suiterunner asyncrun home jenkins agent workspace tidb ghpr unit test go pkg mod github com pingcap check check go latest failed builds | 0 |
709,259 | 24,371,957,650 | IssuesEvent | 2022-10-03 20:08:36 | fuseumass/dashboard | https://api.github.com/repos/fuseumass/dashboard | closed | Accept non-UMass emails | High Priority | - Configure dashboard so that it accepts any valid emails (not just UMass) | 1.0 | Accept non-UMass emails - - Configure dashboard so that it accepts any valid emails (not just UMass) | non_main | accept non umass emails configure dashboard so that it accepts any valid emails not just umass | 0 |
4,797 | 24,722,806,543 | IssuesEvent | 2022-10-20 12:02:51 | centerofci/mathesar | https://api.github.com/repos/centerofci/mathesar | opened | Formulate Postgres-coupling-related assumptions | work: backend work: database status: ready restricted: maintainers | We don't have an explicit agreement on just how coupled to Postgres we want to be (or are prepared to be). We've also been experiencing disagreements on the subject. For the sake of productivity, we should formulate these assumptions. | True | Formulate Postgres-coupling-related assumptions - We don't have an explicit agreement on just how coupled to Postgres we want to be (or are prepared to be). We've also been experiencing disagreements on the subject. For the sake of productivity, we should formulate these assumptions. | main | formulate postgres coupling related assumptions we don t have an explicit agreement on just how coupled to postgres we want to be or are prepared to be we ve also been experiencing disagreements on the subject for the sake of productivity we should formulate these assumptions | 1 |
45,937 | 24,280,432,830 | IssuesEvent | 2022-09-28 16:54:16 | getsentry/develop | https://api.github.com/repos/getsentry/develop | closed | Span Operations Spec | performance | https://www.notion.so/sentry/Set-up-an-audit-for-SDK-consistency-for-span-operations-to-enable-performance-issues-addf02a8fa234dda8acf48d4ff9b8efb#1b58a8e7b0554e3b8a641102dfa1a939
- [x] Present at TSC
- [x] Email relevant teams
- [x] Get alignment on unknowns
- [x] Chatting with Neel about Ruby
- [x] Update develop docs https://github.com/getsentry/develop/pull/694
- [x] Make GH issues across various SDK repos
- JavaScript https://github.com/getsentry/sentry-javascript/issues/5837
- React Native - DONE
- Python https://github.com/getsentry/sentry-python/issues/1643
- Ruby https://github.com/getsentry/sentry-ruby/issues/1903
- Laravel https://github.com/getsentry/sentry-laravel/issues/571
- Dart https://github.com/getsentry/sentry-dart/issues/1022
- Cocoa - DONE
- Java https://github.com/getsentry/sentry-java/issues/2261 | True | Span Operations Spec - https://www.notion.so/sentry/Set-up-an-audit-for-SDK-consistency-for-span-operations-to-enable-performance-issues-addf02a8fa234dda8acf48d4ff9b8efb#1b58a8e7b0554e3b8a641102dfa1a939
- [x] Present at TSC
- [x] Email relevant teams
- [x] Get alignment on unknowns
- [x] Chatting with Neel about Ruby
- [x] Update develop docs https://github.com/getsentry/develop/pull/694
- [x] Make GH issues across various SDK repos
- JavaScript https://github.com/getsentry/sentry-javascript/issues/5837
- React Native - DONE
- Python https://github.com/getsentry/sentry-python/issues/1643
- Ruby https://github.com/getsentry/sentry-ruby/issues/1903
- Laravel https://github.com/getsentry/sentry-laravel/issues/571
- Dart https://github.com/getsentry/sentry-dart/issues/1022
- Cocoa - DONE
- Java https://github.com/getsentry/sentry-java/issues/2261 | non_main | span operations spec present at tsc email relevant teams get alignment on unknowns chatting with neel about ruby update develop docs make gh issues across various sdk repos javascript react native done python ruby laravel dart cocoa done java | 0 |
40,859 | 16,557,842,700 | IssuesEvent | 2021-05-28 15:52:01 | microsoft/BotFramework-Composer | https://api.github.com/repos/microsoft/BotFramework-Composer | closed | when running update schema there are 2 packages with the same name | Bot Services Support Type: Bug customer-replied-to customer-reported |
## Describe the bug
when running
sh ./update-schema.sh
i am getting
`
Running schema merge on nodejs runtime.
Finding component files
Error conflicting definitions of Microsoft.IDialog.schema
botbuilder-dialogs-declarative: /Users/vanilla/git_repo/AOSREbot/runtime/node_modules/botbuilder-ai/node_modules/botbuilder-dialogs-declarative/schemas/Microsoft.IDialog.schema
botbuilder-dialogs-declarative: /Users/vanilla/git_repo/AOSREbot/runtime/node_modules/botbuilder-dialogs-declarative/schemas/Microsoft.IDialog.schema`
## Version
1.4.1
## Browser
Emulator
Teams
- [ ] Electron distribution
- [x ] Chrome
- [ ] Safari
- [ ] Firefox
- [ ] Edge
## OS
- [x ] macOS
- [ ] Windows
- [ ] Ubuntu
## To Reproduce
Steps to reproduce the behavior:
1. Go to runtime foloder
2. run npm i -g @microsoft/botframework-cli ; npm run build
3. go to schemas folder
4. run sh ./update-schema.sh
## Expected behavior
the schema to be updated successfully
## Screenshots





## Additional context
`{
"$schema": "https://raw.githubusercontent.com/microsoft/botframework-sdk/master/schemas/component/component.schema",
"type": "object",
"title": "Component kinds",
"description": "These are all of the kinds that can be created by the loader.",
"oneOf": [
{
"$ref": "#/definitions/AkaDiagErrStrDialog"
},
{`
[.....]
` "definitions": {
"AkaDiagErrStrDialog": {
"$role": "implements(Microsoft.IDialog)",
"title": "AkaDiagErrStr",
"description": "This will return the translated CDN Error String",
"type": "object",
"$package": {
"name": "node-runtime",
"version": "1.0.0"
},`
| 1.0 | when running update schema there are 2 packages with the same name -
## Describe the bug
when running
sh ./update-schema.sh
i am getting
`
Running schema merge on nodejs runtime.
Finding component files
Error conflicting definitions of Microsoft.IDialog.schema
botbuilder-dialogs-declarative: /Users/vanilla/git_repo/AOSREbot/runtime/node_modules/botbuilder-ai/node_modules/botbuilder-dialogs-declarative/schemas/Microsoft.IDialog.schema
botbuilder-dialogs-declarative: /Users/vanilla/git_repo/AOSREbot/runtime/node_modules/botbuilder-dialogs-declarative/schemas/Microsoft.IDialog.schema`
## Version
1.4.1
## Browser
Emulator
Teams
- [ ] Electron distribution
- [x ] Chrome
- [ ] Safari
- [ ] Firefox
- [ ] Edge
## OS
- [x ] macOS
- [ ] Windows
- [ ] Ubuntu
## To Reproduce
Steps to reproduce the behavior:
1. Go to runtime foloder
2. run npm i -g @microsoft/botframework-cli ; npm run build
3. go to schemas folder
4. run sh ./update-schema.sh
## Expected behavior
the schema to be updated successfully
## Screenshots





## Additional context
`{
"$schema": "https://raw.githubusercontent.com/microsoft/botframework-sdk/master/schemas/component/component.schema",
"type": "object",
"title": "Component kinds",
"description": "These are all of the kinds that can be created by the loader.",
"oneOf": [
{
"$ref": "#/definitions/AkaDiagErrStrDialog"
},
{`
[.....]
` "definitions": {
"AkaDiagErrStrDialog": {
"$role": "implements(Microsoft.IDialog)",
"title": "AkaDiagErrStr",
"description": "This will return the translated CDN Error String",
"type": "object",
"$package": {
"name": "node-runtime",
"version": "1.0.0"
},`
| non_main | when running update schema there are packages with the same name describe the bug when running sh update schema sh i am getting running schema merge on nodejs runtime finding component files error conflicting definitions of microsoft idialog schema botbuilder dialogs declarative users vanilla git repo aosrebot runtime node modules botbuilder ai node modules botbuilder dialogs declarative schemas microsoft idialog schema botbuilder dialogs declarative users vanilla git repo aosrebot runtime node modules botbuilder dialogs declarative schemas microsoft idialog schema version browser emulator teams electron distribution chrome safari firefox edge os macos windows ubuntu to reproduce steps to reproduce the behavior go to runtime foloder run npm i g microsoft botframework cli npm run build go to schemas folder run sh update schema sh expected behavior the schema to be updated successfully screenshots additional context schema type object title component kinds description these are all of the kinds that can be created by the loader oneof ref definitions akadiagerrstrdialog definitions akadiagerrstrdialog role implements microsoft idialog title akadiagerrstr description this will return the translated cdn error string type object package name node runtime version | 0 |
902 | 4,561,538,410 | IssuesEvent | 2016-09-14 12:07:58 | simplesamlphp/simplesamlphp | https://api.github.com/repos/simplesamlphp/simplesamlphp | opened | Makee a script to convert dictionaries to locales | maintainability | [ ] Combine domains automatically
[ ] Possibly different handling for `modules/core`
Root and module-domains should be handled differently. | True | Makee a script to convert dictionaries to locales - [ ] Combine domains automatically
[ ] Possibly different handling for `modules/core`
Root and module-domains should be handled differently. | main | makee a script to convert dictionaries to locales combine domains automatically possibly different handling for modules core root and module domains should be handled differently | 1 |
546,577 | 16,015,107,088 | IssuesEvent | 2021-04-20 15:08:16 | publiclab/plots2 | https://api.github.com/repos/publiclab/plots2 | closed | Image upload failing | bug help wanted high-priority | Image upload fails with this error in the body section "Section 4" of `/post` route
 this is on the https://publiclab.org/, was able to replicate it on https://unstable.publiclab.org and locally
Note: Image upload is working fine on "Section 2" of `/post`
Template: https://github.com/publiclab/plots2/blob/main/app/views/editor/rich.html.erb
| 1.0 | Image upload failing - Image upload fails with this error in the body section "Section 4" of `/post` route
 this is on the https://publiclab.org/, was able to replicate it on https://unstable.publiclab.org and locally
Note: Image upload is working fine on "Section 2" of `/post`
Template: https://github.com/publiclab/plots2/blob/main/app/views/editor/rich.html.erb
| non_main | image upload failing image upload fails with this error in the body section section of post route this is on the was able to replicate it on and locally note image upload is working fine on section of post template | 0 |
281,553 | 21,315,414,507 | IssuesEvent | 2022-04-16 07:22:32 | brian16600/pe | https://api.github.com/repos/brian16600/pe | opened | More questions needed in FAQ, currently only one (from AB3) | severity.Low type.DocumentationBug | 
Perhaps more common questions could be added here as well.
<!--session: 1650087604666-a1d11fe1-d251-4419-8612-b6b9b1cbcb7c-->
<!--Version: Web v3.4.2--> | 1.0 | More questions needed in FAQ, currently only one (from AB3) - 
Perhaps more common questions could be added here as well.
<!--session: 1650087604666-a1d11fe1-d251-4419-8612-b6b9b1cbcb7c-->
<!--Version: Web v3.4.2--> | non_main | more questions needed in faq currently only one from perhaps more common questions could be added here as well | 0 |
1,882 | 6,577,511,061 | IssuesEvent | 2017-09-12 01:25:24 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | Make add_host less verbose | affects_2.0 feature_idea waiting_on_maintainer | ##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
add_host
##### ANSIBLE VERSION
```
ansible 2.0.1.0
config file = /etc/ansible/ansible.cfg
configured module search path = Default w/o overrides
```
##### CONFIGURATION
```
[defaults]
remote_tmp = $HOME/.ansible/tmp
roles_path = /etc/ansible/roles
inventory = inventory
host_key_checking = False
ansible_managed = Ansible managed: {file} modified on %Y-%m-%d %H:%M:%S
jinja2_extensions = jinja2.ext.do
[privilege_escalation]
become = True
become_method = sudo
become_user = root
become_ask_pass = False
[paramiko_connection]
[ssh_connection]
pipelining = True
scp_if_ssh = True
ssh_args = -F ssh_config
[accelerate]
[selinux]
```
##### OS / ENVIRONMENT
N/A
##### SUMMARY
When running `add_host` I get a ton of output on my shell. I don't see any reasons for this verbose output.
##### STEPS TO REPRODUCE
```
add_host:
name: foobar
```
##### EXPECTED RESULTS
Not a ton of output
##### ACTUAL RESULTS
This is the output... Without -vvv for a single server on OpenStack
> ok: [localhost] => (item={'_ansible_no_log': False, u'changed': False, u'server': {u'OS-EXT-STS:task_state': None, u'addresses': {u'internal': [{u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:d3:6e:0e', u'version': 4, u'addr': u'192.168.0.9', u'OS-EXT-IPS:type': u'fixed'}, {u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:d3:6e:0e', u'version': 4, u'addr': u'192.168.0.22', u'OS-EXT-IPS:type': u'floating'}]}, u'image': {u'id': u'd4711bae-b30e-4e32-a4dd-64010a01e104'}, u'OS-EXT-STS:vm_state': u'active', u'OS-SRV-USG:launched_at': u'2016-03-24T14:55:58.000000', u'NAME_ATTR': u'name', u'flavor': {u'id': u'ba1dc475-4f14-4e46-b601-ab43b775e4b5', u'name': u'm1.micro'}, u'az': u'nova', u'id': u'637f46be-0b6c-494e-b75f-b4172c60db35', u'security_groups': [{u'description': u'Default policy which allows all outgoing and incomming only SSH from foo jumphosts', u'id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'security_group_rules': [{u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.0.0.5/25', u'id': u'0a7cd664-0896-40bd-b98e-20a6d25dc4e6'}, {u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.0.0.10/24', u'id': u'18326637-7af7-4db1-a575-3c474a8506b8'}, {u'direction': u'ingress', u'protocol': None, u'ethertype': u'IPv4', u'port_range_max': None, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': None, u'remote_ip_prefix': None, u'id': u'1b8c5e01-c739-46b1-bdeb-e4e46460ee54'}, {u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.100.0.10/32', u'id': u'1c33a398-12ee-4a85-b70c-176ee3cd627a'}, {u'direction': 
u'ingress', u'protocol': u'icmp', u'ethertype': u'IPv4', u'port_range_max': None, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': None, u'remote_ip_prefix': u'0.0.0.0/0', u'id': u'cd43952e-cbeb-4b07-86c5-a357cbf0fab4'}, {u'direction': u'ingress', u'protocol': None, u'ethertype': u'IPv4', u'port_range_max': None, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': None, u'remote_ip_prefix': None, u'id': u'd50c2cd0-9ae9-4a1b-b8d9-e8880ad4bc52'}, {u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.100.0.15/32', u'id': u'e8099b94-603f-4602-bb57-2f678e1a8a22'}], u'name': u'default'}], u'user_id': u'eaa1c24248ef4c9ab7dd87b7f2a96572', u'OS-DCF:diskConfig': u'MANUAL', u'networks': {u'internal': [u'192.168.0.9', u'192.168.0.22']}, u'accessIPv4': u'192.168.0.22', u'accessIPv6': u'', u'cloud': u'envvars', u'key_name': u'username', u'progress': 0, u'OS-EXT-STS:power_state': 1, u'interface_ip': u'192.168.0.22', u'config_drive': u'', u'status': u'ACTIVE', u'updated': u'2016-03-24T14:55:58Z', u'hostId': u'd3d17c9a8b6b19ccda574e8418ca98da23682e0f1f4398a122a96088', u'HUMAN_ID': True, u'OS-SRV-USG:terminated_at': None, u'public_v4': u'192.168.0.22', u'public_v6': u'', u'private_v4': u'192.168.0.9', u'OS-EXT-AZ:availability_zone': u'nova', u'name': u'singlebox', u'created': u'2016-03-24T14:55:53Z', u'tenant_id': u'35f7725e44794773ae17d9ad18a4dd23', u'region': u'RegionOne', u'os-extended-volumes:volumes_attached': [], u'volumes': [], u'metadata': {}, u'human_id': u'singlebox'}, 'item': u'singlebox', 'invocation': {'module_name': u'os_server', u'module_args': {u'auth_type': None, u'availability_zone': None, u'image': u'Ubuntu 14.04 foo-cloudimg amd64', u'image_exclude': u'(deprecated)', u'flavor_include': None, u'meta': None, u'flavor': u'm1.micro', u'security_groups': 
[u'default', u'default'], u'boot_from_volume': False, u'userdata': u'#cloud-config\nsystem_info:\n default_user:\n name: foostaff\n home: /home/foostaff\n shell: /bin/bash\n lock_passwd: True\n gecos: foostaff\n sudo: ["ALL=(ALL) NOPASSWD:ALL"]\nruncmd:\n - [ mkdir, -p, "/home/foostaff/.ssh" ]\n - "wget \'https://gitlab.foo.de/security/foostaff-keys/raw/master/authorized_keys\' -O - > /home/foostaff/.ssh/authorized_keys -q -t 5 -T 300"\n - [ chmod, 700, "/home/foostaff/.ssh" ]\n - [ chmod, 600, "/home/foostaff/.ssh/authorized_keys" ]\n - [ chown, -R, foostaff, "/home/foostaff/.ssh/" ]\n', u'network': None, u'nics': [{u'net-name': u'internal'}], u'floating_ips': None, u'flavor_ram': None, u'volume_size': False, u'state': u'present', u'auto_ip': True, u'cloud': None, u'floating_ip_pools': [u'float1'], u'region_name': None, u'key_name': u'username', u'api_timeout': None, u'auth': None, u'endpoint_type': u'public', u'boot_volume': None, u'key': None, u'cacert': None, u'terminate_volume': False, u'wait': True, u'name': u'singlebox', u'timeout': 180, u'cert': None, u'volumes': [], u'verify': True, u'config_drive': False}}, u'openstack': {u'OS-EXT-STS:task_state': None, u'addresses': {u'internal': [{u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:d3:6e:0e', u'version': 4, u'addr': u'192.168.0.9', u'OS-EXT-IPS:type': u'fixed'}, {u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:d3:6e:0e', u'version': 4, u'addr': u'192.168.0.22', u'OS-EXT-IPS:type': u'floating'}]}, u'image': {u'id': u'd4711bae-b30e-4e32-a4dd-64010a01e104'}, u'OS-EXT-STS:vm_state': u'active', u'OS-SRV-USG:launched_at': u'2016-03-24T14:55:58.000000', u'NAME_ATTR': u'name', u'flavor': {u'id': u'ba1dc475-4f14-4e46-b601-ab43b775e4b5', u'name': u'm1.micro'}, u'az': u'nova', u'id': u'637f46be-0b6c-494e-b75f-b4172c60db35', u'security_groups': [{u'description': u'Default policy which allows all outgoing and incomming only SSH from foo jumphosts', u'id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'security_group_rules': [{u'direction': 
u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.0.0.5/25', u'id': u'0a7cd664-0896-40bd-b98e-20a6d25dc4e6'}, {u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.0.0.10/24', u'id': u'18326637-7af7-4db1-a575-3c474a8506b8'}, {u'direction': u'ingress', u'protocol': None, u'ethertype': u'IPv4', u'port_range_max': None, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': None, u'remote_ip_prefix': None, u'id': u'1b8c5e01-c739-46b1-bdeb-e4e46460ee54'}, {u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.100.0.10/32', u'id': u'1c33a398-12ee-4a85-b70c-176ee3cd627a'}, {u'direction': u'ingress', u'protocol': u'icmp', u'ethertype': u'IPv4', u'port_range_max': None, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': None, u'remote_ip_prefix': u'0.0.0.0/0', u'id': u'cd43952e-cbeb-4b07-86c5-a357cbf0fab4'}, {u'direction': u'ingress', u'protocol': None, u'ethertype': u'IPv4', u'port_range_max': None, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': None, u'remote_ip_prefix': None, u'id': u'd50c2cd0-9ae9-4a1b-b8d9-e8880ad4bc52'}, {u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.100.0.15/32', u'id': u'e8099b94-603f-4602-bb57-2f678e1a8a22'}], u'name': u'default'}], u'user_id': u'eaa1c24248ef4c9ab7dd87b7f2a96572', u'OS-DCF:diskConfig': u'MANUAL', u'networks': {u'internal': [u'192.168.0.9', 
u'192.168.0.22']}, u'accessIPv4': u'192.168.0.22', u'accessIPv6': u'', u'cloud': u'envvars', u'key_name': u'username', u'progress': 0, u'OS-EXT-STS:power_state': 1, u'interface_ip': u'192.168.0.22', u'config_drive': u'', u'status': u'ACTIVE', u'updated': u'2016-03-24T14:55:58Z', u'hostId': u'd3d17c9a8b6b19ccda574e8418ca98da23682e0f1f4398a122a96088', u'HUMAN_ID': True, u'OS-SRV-USG:terminated_at': None, u'public_v4': u'192.168.0.22', u'public_v6': u'', u'private_v4': u'192.168.0.9', u'OS-EXT-AZ:availability_zone': u'nova', u'name': u'singlebox', u'created': u'2016-03-24T14:55:53Z', u'tenant_id': u'35f7725e44794773ae17d9ad18a4dd23', u'region': u'RegionOne', u'os-extended-volumes:volumes_attached': [], u'volumes': [], u'metadata': {}, u'human_id': u'singlebox'}, u'id': u'637f46be-0b6c-494e-b75f-b4172c60db35'})
| True | Make add_host less verbose - ##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
add_host
##### ANSIBLE VERSION
```
ansible 2.0.1.0
config file = /etc/ansible/ansible.cfg
configured module search path = Default w/o overrides
```
##### CONFIGURATION
```
[defaults]
remote_tmp = $HOME/.ansible/tmp
roles_path = /etc/ansible/roles
inventory = inventory
host_key_checking = False
ansible_managed = Ansible managed: {file} modified on %Y-%m-%d %H:%M:%S
jinja2_extensions = jinja2.ext.do
[privilege_escalation]
become = True
become_method = sudo
become_user = root
become_ask_pass = False
[paramiko_connection]
[ssh_connection]
pipelining = True
scp_if_ssh = True
ssh_args = -F ssh_config
[accelerate]
[selinux]
```
##### OS / ENVIRONMENT
N/A
##### SUMMARY
When running `add_host` I get a ton of output on my shell. I don't see any reasons for this verbose output.
##### STEPS TO REPRODUCE
```
add_host:
name: foobar
```
##### EXPECTED RESULTS
Not a ton of output
##### ACTUAL RESULTS
This is the output... Without -vvv for a single server on OpenStack
> ok: [localhost] => (item={'_ansible_no_log': False, u'changed': False, u'server': {u'OS-EXT-STS:task_state': None, u'addresses': {u'internal': [{u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:d3:6e:0e', u'version': 4, u'addr': u'192.168.0.9', u'OS-EXT-IPS:type': u'fixed'}, {u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:d3:6e:0e', u'version': 4, u'addr': u'192.168.0.22', u'OS-EXT-IPS:type': u'floating'}]}, u'image': {u'id': u'd4711bae-b30e-4e32-a4dd-64010a01e104'}, u'OS-EXT-STS:vm_state': u'active', u'OS-SRV-USG:launched_at': u'2016-03-24T14:55:58.000000', u'NAME_ATTR': u'name', u'flavor': {u'id': u'ba1dc475-4f14-4e46-b601-ab43b775e4b5', u'name': u'm1.micro'}, u'az': u'nova', u'id': u'637f46be-0b6c-494e-b75f-b4172c60db35', u'security_groups': [{u'description': u'Default policy which allows all outgoing and incomming only SSH from foo jumphosts', u'id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'security_group_rules': [{u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.0.0.5/25', u'id': u'0a7cd664-0896-40bd-b98e-20a6d25dc4e6'}, {u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.0.0.10/24', u'id': u'18326637-7af7-4db1-a575-3c474a8506b8'}, {u'direction': u'ingress', u'protocol': None, u'ethertype': u'IPv4', u'port_range_max': None, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': None, u'remote_ip_prefix': None, u'id': u'1b8c5e01-c739-46b1-bdeb-e4e46460ee54'}, {u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.100.0.10/32', u'id': u'1c33a398-12ee-4a85-b70c-176ee3cd627a'}, {u'direction': 
u'ingress', u'protocol': u'icmp', u'ethertype': u'IPv4', u'port_range_max': None, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': None, u'remote_ip_prefix': u'0.0.0.0/0', u'id': u'cd43952e-cbeb-4b07-86c5-a357cbf0fab4'}, {u'direction': u'ingress', u'protocol': None, u'ethertype': u'IPv4', u'port_range_max': None, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': None, u'remote_ip_prefix': None, u'id': u'd50c2cd0-9ae9-4a1b-b8d9-e8880ad4bc52'}, {u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.100.0.15/32', u'id': u'e8099b94-603f-4602-bb57-2f678e1a8a22'}], u'name': u'default'}], u'user_id': u'eaa1c24248ef4c9ab7dd87b7f2a96572', u'OS-DCF:diskConfig': u'MANUAL', u'networks': {u'internal': [u'192.168.0.9', u'192.168.0.22']}, u'accessIPv4': u'192.168.0.22', u'accessIPv6': u'', u'cloud': u'envvars', u'key_name': u'username', u'progress': 0, u'OS-EXT-STS:power_state': 1, u'interface_ip': u'192.168.0.22', u'config_drive': u'', u'status': u'ACTIVE', u'updated': u'2016-03-24T14:55:58Z', u'hostId': u'd3d17c9a8b6b19ccda574e8418ca98da23682e0f1f4398a122a96088', u'HUMAN_ID': True, u'OS-SRV-USG:terminated_at': None, u'public_v4': u'192.168.0.22', u'public_v6': u'', u'private_v4': u'192.168.0.9', u'OS-EXT-AZ:availability_zone': u'nova', u'name': u'singlebox', u'created': u'2016-03-24T14:55:53Z', u'tenant_id': u'35f7725e44794773ae17d9ad18a4dd23', u'region': u'RegionOne', u'os-extended-volumes:volumes_attached': [], u'volumes': [], u'metadata': {}, u'human_id': u'singlebox'}, 'item': u'singlebox', 'invocation': {'module_name': u'os_server', u'module_args': {u'auth_type': None, u'availability_zone': None, u'image': u'Ubuntu 14.04 foo-cloudimg amd64', u'image_exclude': u'(deprecated)', u'flavor_include': None, u'meta': None, u'flavor': u'm1.micro', u'security_groups': 
[u'default', u'default'], u'boot_from_volume': False, u'userdata': u'#cloud-config\nsystem_info:\n default_user:\n name: foostaff\n home: /home/foostaff\n shell: /bin/bash\n lock_passwd: True\n gecos: foostaff\n sudo: ["ALL=(ALL) NOPASSWD:ALL"]\nruncmd:\n - [ mkdir, -p, "/home/foostaff/.ssh" ]\n - "wget \'https://gitlab.foo.de/security/foostaff-keys/raw/master/authorized_keys\' -O - > /home/foostaff/.ssh/authorized_keys -q -t 5 -T 300"\n - [ chmod, 700, "/home/foostaff/.ssh" ]\n - [ chmod, 600, "/home/foostaff/.ssh/authorized_keys" ]\n - [ chown, -R, foostaff, "/home/foostaff/.ssh/" ]\n', u'network': None, u'nics': [{u'net-name': u'internal'}], u'floating_ips': None, u'flavor_ram': None, u'volume_size': False, u'state': u'present', u'auto_ip': True, u'cloud': None, u'floating_ip_pools': [u'float1'], u'region_name': None, u'key_name': u'username', u'api_timeout': None, u'auth': None, u'endpoint_type': u'public', u'boot_volume': None, u'key': None, u'cacert': None, u'terminate_volume': False, u'wait': True, u'name': u'singlebox', u'timeout': 180, u'cert': None, u'volumes': [], u'verify': True, u'config_drive': False}}, u'openstack': {u'OS-EXT-STS:task_state': None, u'addresses': {u'internal': [{u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:d3:6e:0e', u'version': 4, u'addr': u'192.168.0.9', u'OS-EXT-IPS:type': u'fixed'}, {u'OS-EXT-IPS-MAC:mac_addr': u'fa:16:3e:d3:6e:0e', u'version': 4, u'addr': u'192.168.0.22', u'OS-EXT-IPS:type': u'floating'}]}, u'image': {u'id': u'd4711bae-b30e-4e32-a4dd-64010a01e104'}, u'OS-EXT-STS:vm_state': u'active', u'OS-SRV-USG:launched_at': u'2016-03-24T14:55:58.000000', u'NAME_ATTR': u'name', u'flavor': {u'id': u'ba1dc475-4f14-4e46-b601-ab43b775e4b5', u'name': u'm1.micro'}, u'az': u'nova', u'id': u'637f46be-0b6c-494e-b75f-b4172c60db35', u'security_groups': [{u'description': u'Default policy which allows all outgoing and incomming only SSH from foo jumphosts', u'id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'security_group_rules': [{u'direction': 
u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.0.0.5/25', u'id': u'0a7cd664-0896-40bd-b98e-20a6d25dc4e6'}, {u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.0.0.10/24', u'id': u'18326637-7af7-4db1-a575-3c474a8506b8'}, {u'direction': u'ingress', u'protocol': None, u'ethertype': u'IPv4', u'port_range_max': None, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': None, u'remote_ip_prefix': None, u'id': u'1b8c5e01-c739-46b1-bdeb-e4e46460ee54'}, {u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.100.0.10/32', u'id': u'1c33a398-12ee-4a85-b70c-176ee3cd627a'}, {u'direction': u'ingress', u'protocol': u'icmp', u'ethertype': u'IPv4', u'port_range_max': None, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': None, u'remote_ip_prefix': u'0.0.0.0/0', u'id': u'cd43952e-cbeb-4b07-86c5-a357cbf0fab4'}, {u'direction': u'ingress', u'protocol': None, u'ethertype': u'IPv4', u'port_range_max': None, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': None, u'remote_ip_prefix': None, u'id': u'd50c2cd0-9ae9-4a1b-b8d9-e8880ad4bc52'}, {u'direction': u'ingress', u'protocol': u'tcp', u'ethertype': u'IPv4', u'port_range_max': 22, u'security_group_id': u'f5f8560d-b674-41ed-84b9-8d04dae79000', u'port_range_min': 22, u'remote_ip_prefix': u'10.100.0.15/32', u'id': u'e8099b94-603f-4602-bb57-2f678e1a8a22'}], u'name': u'default'}], u'user_id': u'eaa1c24248ef4c9ab7dd87b7f2a96572', u'OS-DCF:diskConfig': u'MANUAL', u'networks': {u'internal': [u'192.168.0.9', 
u'192.168.0.22']}, u'accessIPv4': u'192.168.0.22', u'accessIPv6': u'', u'cloud': u'envvars', u'key_name': u'username', u'progress': 0, u'OS-EXT-STS:power_state': 1, u'interface_ip': u'192.168.0.22', u'config_drive': u'', u'status': u'ACTIVE', u'updated': u'2016-03-24T14:55:58Z', u'hostId': u'd3d17c9a8b6b19ccda574e8418ca98da23682e0f1f4398a122a96088', u'HUMAN_ID': True, u'OS-SRV-USG:terminated_at': None, u'public_v4': u'192.168.0.22', u'public_v6': u'', u'private_v4': u'192.168.0.9', u'OS-EXT-AZ:availability_zone': u'nova', u'name': u'singlebox', u'created': u'2016-03-24T14:55:53Z', u'tenant_id': u'35f7725e44794773ae17d9ad18a4dd23', u'region': u'RegionOne', u'os-extended-volumes:volumes_attached': [], u'volumes': [], u'metadata': {}, u'human_id': u'singlebox'}, u'id': u'637f46be-0b6c-494e-b75f-b4172c60db35'})
| main | make add host less verbose issue type feature idea component name add host ansible version ansible config file etc ansible ansible cfg configured module search path default w o overrides configuration remote tmp home ansible tmp roles path etc ansible roles inventory inventory host key checking false ansible managed ansible managed file modified on y m d h m s extensions ext do become true become method sudo become user root become ask pass false pipelining true scp if ssh true ssh args f ssh config os environment n a summary when running add host i get a ton of output on my shell i don t see any reasons for this verbose output steps to reproduce add host name foobar expected results not a ton of output actual results this is the output without vvv for a single server on openstack ok item ansible no log false u changed false u server u os ext sts task state none u addresses u internal u image u id u u os ext sts vm state u active u os srv usg launched at u u name attr u name u flavor u id u u name u micro u az u nova u id u u security groups u name u default u user id u u os dcf diskconfig u manual u networks u internal u u u u u cloud u envvars u key name u username u progress u os ext sts power state u interface ip u u config drive u u status u active u updated u u hostid u u human id true u os srv usg terminated at none u public u u public u u private u u os ext az availability zone u nova u name u singlebox u created u u tenant id u u region u regionone u os extended volumes volumes attached u volumes u metadata u human id u singlebox item u singlebox invocation module name u os server u module args u auth type none u availability zone none u image u ubuntu foo cloudimg u image exclude u deprecated u flavor include none u meta none u flavor u micro u security groups u boot from volume false u userdata u cloud config nsystem info n default user n name foostaff n home home foostaff n shell bin bash n lock passwd true n gecos foostaff n sudo nruncmd n n 
wget o home foostaff ssh authorized keys q t t n n n n u network none u nics u floating ips none u flavor ram none u volume size false u state u present u auto ip true u cloud none u floating ip pools u region name none u key name u username u api timeout none u auth none u endpoint type u public u boot volume none u key none u cacert none u terminate volume false u wait true u name u singlebox u timeout u cert none u volumes u verify true u config drive false u openstack u os ext sts task state none u addresses u internal u image u id u u os ext sts vm state u active u os srv usg launched at u u name attr u name u flavor u id u u name u micro u az u nova u id u u security groups u name u default u user id u u os dcf diskconfig u manual u networks u internal u u u u u cloud u envvars u key name u username u progress u os ext sts power state u interface ip u u config drive u u status u active u updated u u hostid u u human id true u os srv usg terminated at none u public u u public u u private u u os ext az availability zone u nova u name u singlebox u created u u tenant id u u region u regionone u os extended volumes volumes attached u volumes u metadata u human id u singlebox u id u | 1 |
26,244 | 11,277,180,447 | IssuesEvent | 2020-01-15 01:50:55 | yoshi1125hisa/node-bbs | https://api.github.com/repos/yoshi1125hisa/node-bbs | closed | CVE-2019-18797 (Medium) detected in opennms-opennms-source-24.1.3-1 | security vulnerability | ## CVE-2019-18797 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-24.1.3-1</b></p></summary>
<p>
<p>A Java based fault and performance management system</p>
<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>
<p>Found in HEAD commit: <a href="https://github.com/yoshi1125hisa/node-bbs/commit/0920d7d072d8db784d75c752475e33662ce7eec4">0920d7d072d8db784d75c752475e33662ce7eec4</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (77)</summary>
<p></p>
<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>
<p>
- /node-bbs/node_modules/nan/nan_callbacks_pre_12_inl.h
- /node-bbs/node_modules/node-sass/src/libsass/src/expand.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/expand.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/factory.cpp
- /node-bbs/node_modules/js-base64/test/./yoshinoya.js
- /node-bbs/node_modules/node-sass/src/sass_types/boolean.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/util.hpp
- /node-bbs/node_modules/node-sass/src/sass_types/value.h
- /node-bbs/node_modules/node-sass/src/libsass/src/emitter.hpp
- /node-bbs/node_modules/nan/nan_converters_pre_43_inl.h
- /node-bbs/node_modules/node-sass/src/callback_bridge.h
- /node-bbs/node_modules/node-sass/src/libsass/src/file.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/sass.cpp
- /node-bbs/node_modules/nan/nan_persistent_12_inl.h
- /node-bbs/node_modules/node-sass/src/libsass/src/operation.hpp
- /node-bbs/node_modules/nan/nan_persistent_pre_12_inl.h
- /node-bbs/node_modules/node-sass/src/libsass/src/operators.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/constants.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/error_handling.hpp
- /node-bbs/node_modules/nan/nan_implementation_pre_12_inl.h
- /node-bbs/node_modules/js-base64/test/./dankogai.js
- /node-bbs/node_modules/node-sass/src/custom_importer_bridge.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/parser.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/constants.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/list.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/cssize.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/functions.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/util.cpp
- /node-bbs/node_modules/node-sass/src/custom_function_bridge.cpp
- /node-bbs/node_modules/nan/nan_typedarray_contents.h
- /node-bbs/node_modules/node-sass/src/custom_importer_bridge.h
- /node-bbs/node_modules/node-sass/src/libsass/src/bind.cpp
- /node-bbs/node_modules/nan/nan_json.h
- /node-bbs/node_modules/node-sass/src/libsass/src/eval.hpp
- /node-bbs/node_modules/nan/nan_converters.h
- /node-bbs/node_modules/node-sass/src/libsass/src/backtrace.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/extend.cpp
- /node-bbs/node_modules/node-sass/src/sass_context_wrapper.h
- /node-bbs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h
- /node-bbs/node_modules/node-sass/src/libsass/src/error_handling.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/debugger.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/emitter.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/number.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/color.h
- /node-bbs/node_modules/nan/nan_new.h
- /node-bbs/node_modules/node-sass/src/libsass/src/sass_values.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/ast.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/output.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/check_nesting.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/null.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/functions.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/cssize.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/prelexer.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/ast.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/to_c.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/to_value.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp
- /node-bbs/node_modules/nan/nan_callbacks.h
- /node-bbs/node_modules/node-sass/src/libsass/src/inspect.hpp
- /node-bbs/node_modules/node-sass/src/sass_types/color.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/values.cpp
- /node-bbs/node_modules/node-sass/src/sass_context_wrapper.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/list.h
- /node-bbs/node_modules/node-sass/src/libsass/src/check_nesting.hpp
- /node-bbs/node_modules/nan/nan_define_own_property_helper.h
- /node-bbs/node_modules/js-base64/.attic/test-moment/./es5.js
- /node-bbs/node_modules/node-sass/src/sass_types/map.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/to_value.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/context.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/string.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/sass_context.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/prelexer.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/context.hpp
- /node-bbs/node_modules/node-sass/src/sass_types/boolean.h
- /node-bbs/node_modules/nan/nan_private.h
- /node-bbs/node_modules/node-sass/src/libsass/src/eval.cpp
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
LibSass 3.6.1 has uncontrolled recursion in Sass::Eval::operator()(Sass::Binary_Expression*) in eval.cpp.
<p>Publish Date: 2019-11-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-18797>CVE-2019-18797</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-18797">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-18797</a></p>
<p>Release Date: 2019-11-06</p>
<p>Fix Resolution: 3.6.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-18797 (Medium) detected in opennms-opennms-source-24.1.3-1 - ## CVE-2019-18797 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-24.1.3-1</b></p></summary>
<p>
<p>A Java based fault and performance management system</p>
<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>
<p>Found in HEAD commit: <a href="https://github.com/yoshi1125hisa/node-bbs/commit/0920d7d072d8db784d75c752475e33662ce7eec4">0920d7d072d8db784d75c752475e33662ce7eec4</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (77)</summary>
<p></p>
<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>
<p>
- /node-bbs/node_modules/nan/nan_callbacks_pre_12_inl.h
- /node-bbs/node_modules/node-sass/src/libsass/src/expand.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/expand.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/factory.cpp
- /node-bbs/node_modules/js-base64/test/./yoshinoya.js
- /node-bbs/node_modules/node-sass/src/sass_types/boolean.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/util.hpp
- /node-bbs/node_modules/node-sass/src/sass_types/value.h
- /node-bbs/node_modules/node-sass/src/libsass/src/emitter.hpp
- /node-bbs/node_modules/nan/nan_converters_pre_43_inl.h
- /node-bbs/node_modules/node-sass/src/callback_bridge.h
- /node-bbs/node_modules/node-sass/src/libsass/src/file.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/sass.cpp
- /node-bbs/node_modules/nan/nan_persistent_12_inl.h
- /node-bbs/node_modules/node-sass/src/libsass/src/operation.hpp
- /node-bbs/node_modules/nan/nan_persistent_pre_12_inl.h
- /node-bbs/node_modules/node-sass/src/libsass/src/operators.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/constants.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/error_handling.hpp
- /node-bbs/node_modules/nan/nan_implementation_pre_12_inl.h
- /node-bbs/node_modules/js-base64/test/./dankogai.js
- /node-bbs/node_modules/node-sass/src/custom_importer_bridge.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/parser.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/constants.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/list.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/cssize.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/functions.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/util.cpp
- /node-bbs/node_modules/node-sass/src/custom_function_bridge.cpp
- /node-bbs/node_modules/nan/nan_typedarray_contents.h
- /node-bbs/node_modules/node-sass/src/custom_importer_bridge.h
- /node-bbs/node_modules/node-sass/src/libsass/src/bind.cpp
- /node-bbs/node_modules/nan/nan_json.h
- /node-bbs/node_modules/node-sass/src/libsass/src/eval.hpp
- /node-bbs/node_modules/nan/nan_converters.h
- /node-bbs/node_modules/node-sass/src/libsass/src/backtrace.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/extend.cpp
- /node-bbs/node_modules/node-sass/src/sass_context_wrapper.h
- /node-bbs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h
- /node-bbs/node_modules/node-sass/src/libsass/src/error_handling.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/debugger.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/emitter.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/number.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/color.h
- /node-bbs/node_modules/nan/nan_new.h
- /node-bbs/node_modules/node-sass/src/libsass/src/sass_values.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/ast.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/output.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/check_nesting.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/null.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/functions.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/cssize.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/prelexer.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/ast.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/to_c.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/to_value.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp
- /node-bbs/node_modules/nan/nan_callbacks.h
- /node-bbs/node_modules/node-sass/src/libsass/src/inspect.hpp
- /node-bbs/node_modules/node-sass/src/sass_types/color.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/values.cpp
- /node-bbs/node_modules/node-sass/src/sass_context_wrapper.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/list.h
- /node-bbs/node_modules/node-sass/src/libsass/src/check_nesting.hpp
- /node-bbs/node_modules/nan/nan_define_own_property_helper.h
- /node-bbs/node_modules/js-base64/.attic/test-moment/./es5.js
- /node-bbs/node_modules/node-sass/src/sass_types/map.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/to_value.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/context.cpp
- /node-bbs/node_modules/node-sass/src/sass_types/string.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/sass_context.cpp
- /node-bbs/node_modules/node-sass/src/libsass/src/prelexer.hpp
- /node-bbs/node_modules/node-sass/src/libsass/src/context.hpp
- /node-bbs/node_modules/node-sass/src/sass_types/boolean.h
- /node-bbs/node_modules/nan/nan_private.h
- /node-bbs/node_modules/node-sass/src/libsass/src/eval.cpp
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
LibSass 3.6.1 has uncontrolled recursion in Sass::Eval::operator()(Sass::Binary_Expression*) in eval.cpp.
<p>Publish Date: 2019-11-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-18797>CVE-2019-18797</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-18797">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-18797</a></p>
<p>Release Date: 2019-11-06</p>
<p>Fix Resolution: 3.6.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_main | cve medium detected in opennms opennms source cve medium severity vulnerability vulnerable library opennmsopennms source a java based fault and performance management system library home page a href found in head commit a href library source files the source files were matched to this source library based on a best effort match source libraries are selected from a list of probable public libraries node bbs node modules nan nan callbacks pre inl h node bbs node modules node sass src libsass src expand hpp node bbs node modules node sass src libsass src expand cpp node bbs node modules node sass src sass types factory cpp node bbs node modules js test yoshinoya js node bbs node modules node sass src sass types boolean cpp node bbs node modules node sass src libsass src util hpp node bbs node modules node sass src sass types value h node bbs node modules node sass src libsass src emitter hpp node bbs node modules nan nan converters pre inl h node bbs node modules node sass src callback bridge h node bbs node modules node sass src libsass src file cpp node bbs node modules node sass src libsass src sass cpp node bbs node modules nan nan persistent inl h node bbs node modules node sass src libsass src operation hpp node bbs node modules nan nan persistent pre inl h node bbs node modules node sass src libsass src operators hpp node bbs node modules node sass src libsass src constants hpp node bbs node modules node sass src libsass src error handling hpp node bbs node modules nan nan implementation pre inl h node bbs node modules js test dankogai js node bbs node modules node sass src custom importer bridge cpp node bbs node modules node sass src libsass src parser hpp node bbs node modules node sass src libsass src constants cpp node bbs node modules node sass src sass types list cpp node bbs node modules node sass src libsass src 
cssize cpp node bbs node modules node sass src libsass src functions hpp node bbs node modules node sass src libsass src util cpp node bbs node modules node sass src custom function bridge cpp node bbs node modules nan nan typedarray contents h node bbs node modules node sass src custom importer bridge h node bbs node modules node sass src libsass src bind cpp node bbs node modules nan nan json h node bbs node modules node sass src libsass src eval hpp node bbs node modules nan nan converters h node bbs node modules node sass src libsass src backtrace cpp node bbs node modules node sass src libsass src extend cpp node bbs node modules node sass src sass context wrapper h node bbs node modules node sass src sass types sass value wrapper h node bbs node modules node sass src libsass src error handling cpp node bbs node modules node sass src libsass src debugger hpp node bbs node modules node sass src libsass src emitter cpp node bbs node modules node sass src sass types number cpp node bbs node modules node sass src sass types color h node bbs node modules nan nan new h node bbs node modules node sass src libsass src sass values cpp node bbs node modules node sass src libsass src ast hpp node bbs node modules node sass src libsass src output cpp node bbs node modules node sass src libsass src check nesting cpp node bbs node modules node sass src sass types null cpp node bbs node modules node sass src libsass src ast def macros hpp node bbs node modules node sass src libsass src functions cpp node bbs node modules node sass src libsass src cssize hpp node bbs node modules node sass src libsass src prelexer cpp node bbs node modules node sass src libsass src ast cpp node bbs node modules node sass src libsass src to c cpp node bbs node modules node sass src libsass src to value hpp node bbs node modules node sass src libsass src ast fwd decl hpp node bbs node modules nan nan callbacks h node bbs node modules node sass src libsass src inspect hpp node bbs node modules 
node sass src sass types color cpp node bbs node modules node sass src libsass src values cpp node bbs node modules node sass src sass context wrapper cpp node bbs node modules node sass src sass types list h node bbs node modules node sass src libsass src check nesting hpp node bbs node modules nan nan define own property helper h node bbs node modules js attic test moment js node bbs node modules node sass src sass types map cpp node bbs node modules node sass src libsass src to value cpp node bbs node modules node sass src libsass src context cpp node bbs node modules node sass src sass types string cpp node bbs node modules node sass src libsass src sass context cpp node bbs node modules node sass src libsass src prelexer hpp node bbs node modules node sass src libsass src context hpp node bbs node modules node sass src sass types boolean h node bbs node modules nan nan private h node bbs node modules node sass src libsass src eval cpp vulnerability details libsass has uncontrolled recursion in sass eval operator sass binary expression in eval cpp publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
516,475 | 14,982,935,870 | IssuesEvent | 2021-01-28 16:34:59 | ithriv/ithriv_web | https://api.github.com/repos/ithriv/ithriv_web | closed | bypass InCommon institutional selection page | enhancement high priority | HIGH PRIORITY: Users are confused by the InCommon login page (needing to select institution even though they selected from login page, finding the "next" button). Is it possible to pass InCommon the institution information so it automatically sends them forward to their own federated login? See this website where they have already achieved this type of pass through: https://clic-ctsa.org/ | 1.0 | bypass InCommon institutional selection page - HIGH PRIORITY: Users are confused by the InCommon login page (needing to select institution even though they selected from login page, finding the "next" button). Is it possible to pass InCommon the institution information so it automatically sends them forward to their own federated login? See this website where they have already achieved this type of pass through: https://clic-ctsa.org/ | non_main | bypass incommon institutional selection page high priority users are confused by the incommon login page needing to select institution even though they selected from login page finding the next button is it possible to pass incommon the institution information so it automatically sends them forward to their own federated login see this website where they have already achieved this type of pass through | 0 |
370 | 3,366,204,201 | IssuesEvent | 2015-11-21 05:06:59 | tgstation/-tg-station | https://api.github.com/repos/tgstation/-tg-station | closed | Wild west has cult walls instead of indestructable walls | Bug Easy Fix Maintainability - Hinders improvements Map Issue | Used to be iconstate edited admin walls so you couldnt cut through the maze | True | Wild west has cult walls instead of indestructable walls - Used to be iconstate edited admin walls so you couldnt cut through the maze | main | wild west has cult walls instead of indestructable walls used to be iconstate edited admin walls so you couldnt cut through the maze | 1 |
4,710 | 24,270,833,777 | IssuesEvent | 2022-09-28 10:07:39 | mozilla/foundation.mozilla.org | https://api.github.com/repos/mozilla/foundation.mozilla.org | closed | SEO | Correct hreflang links | engineering Maintain | Off the back of the Grassriots site audit, it has been identified that there are a number of issues with incorrect hreflang links.
Detail from Grassriots
The x-default for most pages has a redirect to the /en version of the page. It is very important to make sure your hreflang links always refer to absolute URLs with HTTP 200 status codes, otherwise search engines will not be able to interpret them correctly and, as a result, will not show the correct language version of your pages to the relevant audience.
Link to the [audit](https://docs.google.com/spreadsheets/d/15HwgpxSYc4Zl809kcebAhLfLYXFuIk8ZP-Qvk3yVV8Q/edit?usp=sharing) | True | SEO | Correct hreflang links - Off the back of the Grassriots site audit, it has been identified that there are a number of issues with incorrect hreflang links.
Detail from Grassriots
The x-default for most pages has a redirect to the /en version of the page. It is very important to make sure your hreflang links always refer to absolute URLs with HTTP 200 status codes, otherwise search engines will not be able to interpret them correctly and, as a result, will not show the correct language version of your pages to the relevant audience.
Link to the [audit](https://docs.google.com/spreadsheets/d/15HwgpxSYc4Zl809kcebAhLfLYXFuIk8ZP-Qvk3yVV8Q/edit?usp=sharing) | main | seo correct hreflang links off the back of the grassriots site audit it has been identified that there are a number of issues with incorrect hreflang links detail from grassriots the x default for most pages has a redirect to the en version of the page it is very important to make sure your hreflang links always refer to absolute urls with http status codes otherwise search engines will not be able to interpret them correctly and as a result will not show the correct language version of your pages to the relevant audience link to the | 1 |
25,832 | 11,221,496,853 | IssuesEvent | 2020-01-07 17:58:29 | alpersonalwebsite/react-mobx-redux | https://api.github.com/repos/alpersonalwebsite/react-mobx-redux | opened | CVE-2019-20149 (Medium) detected in multiple libraries | security vulnerability | ## CVE-2019-20149 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>kind-of-3.2.2.tgz</b>, <b>kind-of-4.0.0.tgz</b>, <b>kind-of-6.0.2.tgz</b>, <b>kind-of-2.0.1.tgz</b>, <b>kind-of-5.1.0.tgz</b></p></summary>
<p>
<details><summary><b>kind-of-3.2.2.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-3.2.2.tgz">https://registry.npmjs.org/kind-of/-/kind-of-3.2.2.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/react-mobx-redux/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/react-mobx-redux/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- webpack-4.1.0.tgz
- plugin-svgo-4.3.1.tgz
- merge-deep-3.0.2.tgz
- :x: **kind-of-3.2.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>kind-of-4.0.0.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-4.0.0.tgz">https://registry.npmjs.org/kind-of/-/kind-of-4.0.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/react-mobx-redux/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/react-mobx-redux/node_modules/has-values/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- webpack-4.29.6.tgz
- micromatch-3.1.10.tgz
- snapdragon-0.8.2.tgz
- base-0.11.2.tgz
- cache-base-1.0.1.tgz
- has-value-1.0.0.tgz
- has-values-1.0.0.tgz
- :x: **kind-of-4.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>kind-of-6.0.2.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz">https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/react-mobx-redux/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/react-mobx-redux/node_modules/define-property/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- sass-loader-7.1.0.tgz
- clone-deep-2.0.2.tgz
- :x: **kind-of-6.0.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>kind-of-2.0.1.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-2.0.1.tgz">https://registry.npmjs.org/kind-of/-/kind-of-2.0.1.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/react-mobx-redux/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/react-mobx-redux/node_modules/shallow-clone/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- webpack-4.1.0.tgz
- plugin-svgo-4.3.1.tgz
- merge-deep-3.0.2.tgz
- clone-deep-0.2.4.tgz
- shallow-clone-0.1.2.tgz
- :x: **kind-of-2.0.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>kind-of-5.1.0.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-5.1.0.tgz">https://registry.npmjs.org/kind-of/-/kind-of-5.1.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/react-mobx-redux/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/react-mobx-redux/node_modules/is-descriptor/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- sass-loader-7.1.0.tgz
- clone-deep-2.0.2.tgz
- shallow-clone-1.0.0.tgz
- :x: **kind-of-5.1.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/alpersonalwebsite/react-mobx-redux/commit/c4ddf6b4c10bdef2c3e81097f1ef5f6a009feee4">c4ddf6b4c10bdef2c3e81097f1ef5f6a009feee4</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by 'constructor': {'name':'Symbol'}. Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result.
<p>Publish Date: 2019-12-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149>CVE-2019-20149</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-20149 (Medium) detected in multiple libraries - ## CVE-2019-20149 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>kind-of-3.2.2.tgz</b>, <b>kind-of-4.0.0.tgz</b>, <b>kind-of-6.0.2.tgz</b>, <b>kind-of-2.0.1.tgz</b>, <b>kind-of-5.1.0.tgz</b></p></summary>
<p>
<details><summary><b>kind-of-3.2.2.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-3.2.2.tgz">https://registry.npmjs.org/kind-of/-/kind-of-3.2.2.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/react-mobx-redux/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/react-mobx-redux/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- webpack-4.1.0.tgz
- plugin-svgo-4.3.1.tgz
- merge-deep-3.0.2.tgz
- :x: **kind-of-3.2.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>kind-of-4.0.0.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-4.0.0.tgz">https://registry.npmjs.org/kind-of/-/kind-of-4.0.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/react-mobx-redux/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/react-mobx-redux/node_modules/has-values/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- webpack-4.29.6.tgz
- micromatch-3.1.10.tgz
- snapdragon-0.8.2.tgz
- base-0.11.2.tgz
- cache-base-1.0.1.tgz
- has-value-1.0.0.tgz
- has-values-1.0.0.tgz
- :x: **kind-of-4.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>kind-of-6.0.2.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz">https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/react-mobx-redux/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/react-mobx-redux/node_modules/define-property/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- sass-loader-7.1.0.tgz
- clone-deep-2.0.2.tgz
- :x: **kind-of-6.0.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>kind-of-2.0.1.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-2.0.1.tgz">https://registry.npmjs.org/kind-of/-/kind-of-2.0.1.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/react-mobx-redux/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/react-mobx-redux/node_modules/shallow-clone/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- webpack-4.1.0.tgz
- plugin-svgo-4.3.1.tgz
- merge-deep-3.0.2.tgz
- clone-deep-0.2.4.tgz
- shallow-clone-0.1.2.tgz
- :x: **kind-of-2.0.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>kind-of-5.1.0.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-5.1.0.tgz">https://registry.npmjs.org/kind-of/-/kind-of-5.1.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/react-mobx-redux/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/react-mobx-redux/node_modules/is-descriptor/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- sass-loader-7.1.0.tgz
- clone-deep-2.0.2.tgz
- shallow-clone-1.0.0.tgz
- :x: **kind-of-5.1.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/alpersonalwebsite/react-mobx-redux/commit/c4ddf6b4c10bdef2c3e81097f1ef5f6a009feee4">c4ddf6b4c10bdef2c3e81097f1ef5f6a009feee4</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by 'constructor': {'name':'Symbol'}. Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result.
<p>Publish Date: 2019-12-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149>CVE-2019-20149</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_main | cve medium detected in multiple libraries cve medium severity vulnerability vulnerable libraries kind of tgz kind of tgz kind of tgz kind of tgz kind of tgz kind of tgz get the native type of a value library home page a href path to dependency file tmp ws scm react mobx redux package json path to vulnerable library tmp ws scm react mobx redux node modules kind of package json dependency hierarchy react scripts tgz root library webpack tgz plugin svgo tgz merge deep tgz x kind of tgz vulnerable library kind of tgz get the native type of a value library home page a href path to dependency file tmp ws scm react mobx redux package json path to vulnerable library tmp ws scm react mobx redux node modules has values node modules kind of package json dependency hierarchy react scripts tgz root library webpack tgz micromatch tgz snapdragon tgz base tgz cache base tgz has value tgz has values tgz x kind of tgz vulnerable library kind of tgz get the native type of a value library home page a href path to dependency file tmp ws scm react mobx redux package json path to vulnerable library tmp ws scm react mobx redux node modules define property node modules kind of package json dependency hierarchy react scripts tgz root library sass loader tgz clone deep tgz x kind of tgz vulnerable library kind of tgz get the native type of a value library home page a href path to dependency file tmp ws scm react mobx redux package json path to vulnerable library tmp ws scm react mobx redux node modules shallow clone node modules kind of package json dependency hierarchy react scripts tgz root library webpack tgz plugin svgo tgz merge deep tgz clone deep tgz shallow clone tgz x kind of tgz vulnerable library kind of tgz get the native type of a value library home page a href path to dependency file tmp ws scm react mobx redux package json path to vulnerable library tmp ws scm react mobx redux node modules is descriptor node modules kind of package json dependency hierarchy react scripts tgz root library sass loader tgz clone deep tgz shallow clone tgz x kind of tgz vulnerable library found in head commit a href vulnerability details ctorname in index js in kind of allows external user input to overwrite certain internal attributes via a conflicting name as demonstrated by constructor name symbol hence a crafted payload can overwrite this builtin attribute to manipulate the type detection result publish date url a href cvss score details base score metrics not available step up your open source security game with whitesource | 0
5,762 | 30,535,822,798 | IssuesEvent | 2023-07-19 17:16:40 | bazelbuild/intellij | https://api.github.com/repos/bazelbuild/intellij | closed | Incorrect package structure in IntelliJ for main and test folders | product: IntelliJ lang: java awaiting-maintainer | ### Description of the bug:
I am using bazel plugin in IntelliJ. Under main and test folders I can see the package structure is not nested (package under package).
Is there any configuration which needs to be added in .bazelproject file or is there any setting inside IntelliJ which needs to be done and I am missing here?
<img width="531" alt="Screenshot 2023-07-19 at 11 01 29 AM" src="https://github.com/bazelbuild/intellij/assets/92775528/8b834ef9-4b8a-4bcb-83d7-76d40b96294f">
### What's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
1. Sync or build the project.
2. Check under your main and test folders the structure of the package
### Which Intellij IDE are you using? Please provide the specific version.
JetBrains Client 2023.1.2
### What programming languages and tools are you using? Please provide specific versions.
Java
### What Bazel plugin version are you using?
bundled 2023.06.13.0.1-api-version-231
### Have you found anything relevant by searching the web?
_No response_
### Any other information, logs, or outputs that you want to share?
_No response_ | True | Incorrect package structure in IntelliJ for main and test folders - ### Description of the bug:
I am using bazel plugin in IntelliJ. Under main and test folders I can see the package structure is not nested (package under package).
Is there any configuration which needs to be added in .bazelproject file or is there any setting inside IntelliJ which needs to be done and I am missing here?
<img width="531" alt="Screenshot 2023-07-19 at 11 01 29 AM" src="https://github.com/bazelbuild/intellij/assets/92775528/8b834ef9-4b8a-4bcb-83d7-76d40b96294f">
### What's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
1. Sync or build the project.
2. Check under your main and test folders the structure of the package
### Which Intellij IDE are you using? Please provide the specific version.
JetBrains Client 2023.1.2
### What programming languages and tools are you using? Please provide specific versions.
Java
### What Bazel plugin version are you using?
bundled 2023.06.13.0.1-api-version-231
### Have you found anything relevant by searching the web?
_No response_
### Any other information, logs, or outputs that you want to share?
_No response_ | main | incorrect package structure in intellij for main and test folders description of the bug i am using bazel plugin in intellij under main and test folders i can see the package structure is not nested package under package is there any configuration which needs to be added in bazelproject file or is there any setting inside intellij which needs to be done and i am missing here img width alt screenshot at am src what s the simplest easiest way to reproduce this bug please provide a minimal example if possible sync or build the project check under your main and test folders the structure of the package which intellij ide are you using please provide the specific version jetbrains client what programming languages and tools are you using please provide specific versions java what bazel plugin version are you using bundled api version have you found anything relevant by searching the web no response any other information logs or outputs that you want to share no response | 1 |
3,309 | 12,813,921,989 | IssuesEvent | 2020-07-04 15:43:57 | aar3/tik | https://api.github.com/repos/aar3/tik | closed | integrate redux and redux-thunk | feature maintainability technical | #### Issue
We need some sort of central state management in the client, so integrate redux and redux-thunk so that we can not have to pass the same `props` around to every component
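For illustration, the pattern being asked for can be sketched with a hand-rolled store and thunk middleware (a sketch only — the actual integration would use the `redux` and `redux-thunk` packages, and the reducer and action names here are made up): components read shared state from one store instead of receiving the same `props` at every level, and thunks let an action creator run multi-step logic with `dispatch`/`getState`.

```javascript
// Hand-rolled illustration of the store + thunk pattern; the real app
// would use the redux / redux-thunk packages instead of this sketch.
function createStore(reducer, initialState) {
  let state = initialState;
  const listeners = [];
  const store = {
    getState: () => state,
    subscribe: fn => listeners.push(fn),
    dispatch(action) {
      if (typeof action === 'function') {
        // "thunk": the action receives dispatch/getState, so any component
        // can trigger multi-step updates without prop-drilling callbacks.
        return action(store.dispatch.bind(store), store.getState);
      }
      state = reducer(state, action);       // plain action: run the reducer
      listeners.forEach(fn => fn());        // notify subscribed components
      return action;
    },
  };
  return store;
}

// Made-up reducer/action for demonstration purposes only.
const reducer = (state, action) =>
  action.type === 'tick' ? { ...state, count: state.count + 1 } : state;

const store = createStore(reducer, { count: 0 });

// A thunk that dispatches twice; no component needed `count` as a prop.
store.dispatch((dispatch, getState) => {
  dispatch({ type: 'tick' });
  if (getState().count < 2) dispatch({ type: 'tick' });
});
```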
#### Acceptance Criteria
When redux is up and working | True | integrate redux and redux-thunk - #### Issue
We need some sort of central state management in the client, so integrate redux and redux-thunk so that we can not have to pass the same `props` around to every component
#### Acceptance Criteria
When redux is up and working | main | integrate redux and redux thunk issue we need some sort of central state management in the client so integrate redux and redux thunk so that we can not have to pass the same props around to every component acceptance criteria when redux is up and working | 1 |
5,071 | 25,960,180,746 | IssuesEvent | 2022-12-18 19:59:12 | cran-task-views/Hydrology | https://api.github.com/repos/cran-task-views/Hydrology | closed | Package 'CityWaterBalance' has been archived on CRAN for more than 60 days | maintainer-contacted | Package [CityWaterBalance](https://CRAN.R-project.org/package=CityWaterBalance) is currently listed in CRAN Task View [Hydrology](https://CRAN.R-project.org/view=Hydrology) but the package has actually been archived for more than 60 days on CRAN. Often this indicates that the package is currently not sufficiently actively maintained and should be excluded from the task view.
Alternatively, you might also consider reaching out to the authors of the package and encourage (or even help) them to bring the package back to CRAN.
In any case, the situation should be resolved in the next four weeks. If the package does not seem to be brought back to CRAN, please exclude it from the task view. | True | Package 'CityWaterBalance' has been archived on CRAN for more than 60 days - Package [CityWaterBalance](https://CRAN.R-project.org/package=CityWaterBalance) is currently listed in CRAN Task View [Hydrology](https://CRAN.R-project.org/view=Hydrology) but the package has actually been archived for more than 60 days on CRAN. Often this indicates that the package is currently not sufficiently actively maintained and should be excluded from the task view.
Alternatively, you might also consider reaching out to the authors of the package and encourage (or even help) them to bring the package back to CRAN.
In any case, the situation should be resolved in the next four weeks. If the package does not seem to be brought back to CRAN, please exclude it from the task view. | main | package citywaterbalance has been archived on cran for more than days package is currently listed in cran task view but the package has actually been archived for more than days on cran often this indicates that the package is currently not sufficiently actively maintained and should be excluded from the task view alternatively you might also consider reaching out to the authors of the package and encourage or even help them to bring the package back to cran in any case the situation should be resolved in the next four weeks if the package does not seem to be brought back to cran please exclude it from the task view | 1 |
251,028 | 18,923,131,936 | IssuesEvent | 2021-11-17 05:53:28 | extratone/bilge | https://api.github.com/repos/extratone/bilge | opened | Future of GitHub Wikis | documentation meta | [**Future of GitHub Wikis · Discussion #7820 · github/feedback**](https://github.com/github/feedback/discussions/7820)
I've been sitting on this question long enough, I hope (and doing a not-insignificant amount of searching the web,) that I'm fairly sure it isn't redundant. If I'm wrong about that, sincere apologies.
GitHub's feature rollouts in the past year have been absolutely incredible, but it's felt like Wikis - as in [Repository Wikis](https://docs.github.com/en/communities/documenting-your-project-with-wikis/about-wikis) - have been more or less abadoned? What I'm asking, really, is whether or not we should anticipate a significant **update**, a **sunsetting**, or neither, going forward, for GitHub Wikis. | 1.0 | Future of GitHub Wikis - [**Future of GitHub Wikis · Discussion #7820 · github/feedback**](https://github.com/github/feedback/discussions/7820)
I've been sitting on this question long enough, I hope (and doing a not-insignificant amount of searching the web,) that I'm fairly sure it isn't redundant. If I'm wrong about that, sincere apologies.
GitHub's feature rollouts in the past year have been absolutely incredible, but it's felt like Wikis - as in [Repository Wikis](https://docs.github.com/en/communities/documenting-your-project-with-wikis/about-wikis) - have been more or less abadoned? What I'm asking, really, is whether or not we should anticipate a significant **update**, a **sunsetting**, or neither, going forward, for GitHub Wikis. | non_main | future of github wikis i ve been sitting on this question long enough i hope and doing a not insignificant amount of searching the web that i m fairly sure it isn t redundant if i m wrong about that sincere apologies github s feature rollouts in the past year have been absolutely incredible but it s felt like wikis as in have been more or less abadoned what i m asking really is whether or not we should anticipate a significant update a sunsetting or neither going forward for github wikis | 0 |
38,073 | 8,407,809,423 | IssuesEvent | 2018-10-11 22:16:17 | google/go-cloud | https://api.github.com/repos/google/go-cloud | closed | blob/all: blobError type is not necessary | code health in progress | There's a `blobError` type that checks to see if the underlying error is a `driver.Error` and, if so, pulls the `driver.ErrorKind` out. It's unnecessary; the only place it is used is in functions like `blob.IsErrNotExists`, and the check can just happen there.
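As a sketch of that proposal (written in JavaScript for brevity — the real package is Go, and the names below are illustrative, not the actual `go-cloud` API), the call-site check needs no intermediate wrapper type:

```javascript
// Analogue of the proposed refactor: instead of wrapping driver errors in
// an intermediate blobError type, helpers inspect the driver error's kind
// directly at the call site. Names here are hypothetical.
class DriverError extends Error {
  constructor(kind, message) {
    super(message);
    this.kind = kind; // e.g. 'NotFound'
  }
}

// The check happens right here — no wrapper type to forget to apply.
function isErrNotExists(err) {
  return err instanceof DriverError && err.kind === 'NotFound';
}
```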
The `blobError` wrapper is fragile because it counts on the concrete type to always remember to wrap errors in `blobError` before returning them. | 1.0 | blob/all: blobError type is not necessary - There's a `blobError` type that checks to see if the underlying error is a `driver.Error` and, if so, pulls the `driver.ErrorKind` out. It's unnecessary; the only place it is used is in functions like `blob.IsErrNotExists`, and the check can just happen there.
The `blobError` wrapper is fragile because it counts on the concrete type to always remember to wrap errors in `blobError` before returning them. | non_main | blob all bloberror type is not necessary there s a bloberror type that checks to see if the underlying error is a driver error and if so pulls the driver errorkind out it s unnecessary the only place it is used is in functions like blob iserrnotexists and the check can just happen there the bloberror wrapper is fragile because it counts on the concrete type to always remember to wrap errors in bloberror before returning them | 0 |
32,881 | 12,151,838,004 | IssuesEvent | 2020-04-24 20:48:55 | trackercli/covid19-tracker-cli | https://api.github.com/repos/trackercli/covid19-tracker-cli | closed | security issues | security | I really appreciate the work you did, many thx.
Ofc as a terminal dev I cannot avoid to do some blaming as well. Here is the problem:
**Never ever encourage ppl to input downloaded stuff unchecked in the terminal!**
I am sure you did this with good intentions and your data is free of malicious actions, but plz lets not spread a bad habit. We all better ban any command combination in the form `curl -L ...` or `curl ... | bash` from our memory. It opens the door for trojans and viruses (thus it is in a weird way ontopic).
*How should this be done instead?*
Well, write a tiny data parser (maybe JSON), that can actually validate the content before putting it onto the terminal. This still can be driven by a simple curl command, but is actually safe if the parser does the proper checks: `curl .... | tiny_parser`
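To make the idea concrete, here is a minimal sketch of such a parser (the JSON shape `[{"country": ..., "cases": ...}]` below is hypothetical — adjust the validation to whatever the real tracker API returns). Nothing reaches the terminal unless it survives parsing and shape checks, so a pipeline like `curl -s <url> | node tiny_parser.js` never echoes raw, possibly malicious bytes:

```javascript
// tiny_parser sketch — the JSON shape assumed here is hypothetical.
// Intended use: curl -s <url> | node tiny_parser.js
// Only validated records are ever written to the terminal.
function renderStats(rawText) {
  let data;
  try {
    data = JSON.parse(rawText);          // not JSON at all -> refuse
  } catch (err) {
    return 'refusing to render: input is not valid JSON';
  }
  if (!Array.isArray(data)) {            // wrong top-level shape -> refuse
    return 'refusing to render: unexpected shape';
  }
  return data
    .filter(r => r && typeof r.country === 'string' && Number.isFinite(r.cases))
    .map(r => `${r.country}: ${r.cases} cases`)
    .join('\n');
}
```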
In the meantime - stay safe, and to anyone who already got infected - get well soon! | True | security issues - I really appreciate the work you did, many thx.
Ofc as a terminal dev I cannot avoid to do some blaming as well. Here is the problem:
**Never ever encourage ppl to input downloaded stuff unchecked in the terminal!**
I am sure you did this with good intentions and your data is free of malicious actions, but plz lets not spread a bad habit. We all better ban any command combination in the form `curl -L ...` or `curl ... | bash` from our memory. It opens the door for trojans and viruses (thus it is in a weird way ontopic).
*How should this be done instead?*
Well, write a tiny data parser (maybe JSON), that can actually validate the content before putting it onto the terminal. This still can be driven by a simple curl command, but is actually safe if the parser does the proper checks: `curl .... | tiny_parser`
In the meantime - stay safe, and to anyone who already got infected - get well soon! | non_main | security issues i really appreciate the work you did many thx ofc as a terminal dev i cannot avoid to do some blaming as well here is the problem never ever encourage ppl to input downloaded stuff unchecked in the terminal i am sure you did this with good intentions and your data is free of malicious actions but plz lets not spread a bad habit we all better ban any command combination in the form curl l or curl bash from our memory it opens the door for trojans and viruses thus it is in a weird way ontopic how should this be done instead well write a tiny data parser maybe json that can actually validate the content before putting it onto the terminal this still can be driven by a simple curl command but is actually safe if the parser does the proper checks curl tiny parser in the meantime stay safe and to anyone who already got infected get well soon | 0 |
1,947 | 6,627,312,057 | IssuesEvent | 2017-09-23 00:54:45 | duckduckgo/zeroclickinfo-goodies | https://api.github.com/repos/duckduckgo/zeroclickinfo-goodies | closed | Conversions: Result spelling should match query spelling | Improvement Maintainer Approved Status: Tolerated | For example, searching for "6 feet in metres" (British spelling) should show the result also with British spelling ("metres"): https://duckduckgo.com/?q=6+feet+in+metres&ia=answer
[Reported on Twitter](https://twitter.com/MartijnSaly/status/744824285722718208)
---
IA Page: http://duck.co/ia/view/conversions
[Maintainer](http://docs.duckduckhack.com/maintaining/guidelines.html): @mintsoft
| True | Conversions: Result spelling should match query spelling - For example, searching for "6 feet in metres" (British spelling) should show the result also with British spelling ("metres"): https://duckduckgo.com/?q=6+feet+in+metres&ia=answer
[Reported on Twitter](https://twitter.com/MartijnSaly/status/744824285722718208)
---
IA Page: http://duck.co/ia/view/conversions
[Maintainer](http://docs.duckduckhack.com/maintaining/guidelines.html): @mintsoft
| main | conversions result spelling should match query spelling for example searching for feet in metres british spelling should show the result also with british spelling metres ia page mintsoft | 1 |
385,350 | 26,633,011,846 | IssuesEvent | 2023-01-24 19:23:07 | department-of-veterans-affairs/va.gov-team | https://api.github.com/repos/department-of-veterans-affairs/va.gov-team | closed | [Monitoring] Update "Get access to Datadog" documentation | documentation monitoring platform-tech-team-2 | ## Description
Update the document based on feedback from the recent mini-workshop with Platform Tech Team 2.
## Resources
- [Get access to Datadog](https://vfs.atlassian.net/wiki/spaces/OT/pages/2233598117/Get+access+to+Datadog)
## Feedback
Needs work:
- Clarify the default Datadog role that a user is given when first getting access (which is RO)
- Reorder the steps to have Happy Path first, followed by how to remedy any errors that one may encounter during the process
## Acceptance Criteria
- [x] The document is updated and shared with the team for review
## Refinement Guidance - Check the following before working on this issue:
- [x] _Team label assigned ("platform-tech-team-2")_
- [x] _Epic assigned (if needed)_
- [x] _Estimated (points assigned)_
- [x] _Sprint assigned (once planned)_
- [x] _Team member(s) assigned_
| 1.0 | [Monitoring] Update "Get access to Datadog" documentation - ## Description
Update the document based on feedback from the recent mini-workshop with Platform Tech Team 2.
## Resources
- [Get access to Datadog](https://vfs.atlassian.net/wiki/spaces/OT/pages/2233598117/Get+access+to+Datadog)
## Feedback
Needs work:
- Clarify the default Datadog role that a user is given when first getting access (which is RO)
- Reorder the steps to have Happy Path first, followed by how to remedy any errors that one may encounter during the process
## Acceptance Criteria
- [x] The document is updated and shared with the team for review
## Refinement Guidance - Check the following before working on this issue:
- [x] _Team label assigned ("platform-tech-team-2")_
- [x] _Epic assigned (if needed)_
- [x] _Estimated (points assigned)_
- [x] _Sprint assigned (once planned)_
- [x] _Team member(s) assigned_
| non_main | update get access to datadog documentation description update the document based on feedback from the recent mini workshop with platform tech team resources feedback needs work clarify the default datadog role that a user is given when first getting access which is ro reorder the steps to have happy path first followed by how to remedy any errors that one may encounter during the process acceptance criteria the document is updated and shared with the team for review refinement guidance check the following before working on this issue team label assigned platform tech team epic assigned if needed estimated points assigned sprint assigned once planned team member s assigned | 0 |
4,819 | 24,841,071,698 | IssuesEvent | 2022-10-26 12:46:47 | centerofci/mathesar | https://api.github.com/repos/centerofci/mathesar | closed | Layout components to accommodate new designs | type: enhancement work: frontend status: ready restricted: maintainers | ## Problem
In order to implement the new designs for the database page and schema page the first thing that we need to do is create components for common layouts.
## Proposed solution
After looking at the designs here are the common layouts IMHO.
### LayoutWithHeader
This component will have two responsibilities:
1. Adding a static header on the top.
2. Rendering the children inside the `main` with some `max-width` so that the layout does not spread across the width of the larger screens. This `max-width` behavior can be toggled using a prop since we might not need this restriction for certain pages for example the Table page.
<img width="1001" alt="Screenshot 2022-09-28 at 2 55 44 PM" src="https://user-images.githubusercontent.com/11032856/192742836-cf4a8329-399f-4318-8f0a-24d67d341376.png">
_NOTE: This will be purely a presentational component._
#### Can we reuse the existing `LayoutWithHeader` component?
We can use this existing component but this might need us to check the layout at other places too where this component is used and make sure no bugs are introduced there which might delay the delivery of the DB and schema page. I suggest we create a new component with the current needs and gradually start using at other places once we do their designs.
#### Header
The header in itself will be a separate component that gets rendered inside the `LayoutWithHeader` component. I don't see the header requiring any props as of now but if we need this in the future, the `LayoutWithHeader` should support all of the header props too and pass them down to this component.
### PageTitleAndMeta
<img width="321" alt="Screenshot 2022-09-28 at 3 02 21 PM" src="https://user-images.githubusercontent.com/11032856/192744316-aac327ab-e91f-4abb-b4f9-938e371a2f5a.png">
This component will be responsible for the following things:
1. Rendering the `type` of the entity. It will be of type `string` but for now `schema` / `database` will be the possible values.
2. Rendering the `name` of the entity. So if the entity type is a `database` then it will be the name of the DB.
3. Rendering the `icon` using the `icon` prop of type `IconProps`.
4. Rendering the `description`, if any.
5. Rendering an action button using the `buttonProps` prop.
<img width="688" alt="Screenshot 2022-09-28 at 2 57 46 PM" src="https://user-images.githubusercontent.com/11032856/192744204-dd8617b6-99d5-458e-98a0-2abc5ae7443f.png">
_NOTE: This will be purely a presentational component._
| True | Layout components to accommodate new designs - ## Problem
In order to implement the new designs for the database page and schema page the first thing that we need to do is create components for common layouts.
## Proposed solution
After looking at the designs here are the common layouts IMHO.
### LayoutWithHeader
This component will have two responsibilities:
1. Adding a static header on the top.
2. Rendering the children inside the `main` with some `max-width` so that the layout does not spread across the width of the larger screens. This `max-width` behavior can be toggled using a prop since we might not need this restriction for certain pages for example the Table page.
<img width="1001" alt="Screenshot 2022-09-28 at 2 55 44 PM" src="https://user-images.githubusercontent.com/11032856/192742836-cf4a8329-399f-4318-8f0a-24d67d341376.png">
_NOTE: This will be purely a presentational component._
#### Can we reuse the existing `LayoutWithHeader` component?
We can use this existing component but this might need us to check the layout at other places too where this component is used and make sure no bugs are introduced there which might delay the delivery of the DB and schema page. I suggest we create a new component with the current needs and gradually start using at other places once we do their designs.
#### Header
The header in itself will be a separate component that gets rendered inside the `LayoutWithHeader` component. I don't see the header requiring any props as of now but if we need this in the future, the `LayoutWithHeader` should support all of the header props too and pass them down to this component.
### PageTitleAndMeta
<img width="321" alt="Screenshot 2022-09-28 at 3 02 21 PM" src="https://user-images.githubusercontent.com/11032856/192744316-aac327ab-e91f-4abb-b4f9-938e371a2f5a.png">
This component will be responsible for the following things:
1. Rendering the `type` of the entity. It will be of type `string` but for now `schema` / `database` will be the possible values.
2. Rendering the `name` of the entity. So if the entity type is a `database` then it will be the name of the DB.
3. Rendering the `icon` using the `icon` prop of type `IconProps`.
4. Rendering the `description`, if any.
5. Rendering an action button using the `buttonProps` prop.
<img width="688" alt="Screenshot 2022-09-28 at 2 57 46 PM" src="https://user-images.githubusercontent.com/11032856/192744204-dd8617b6-99d5-458e-98a0-2abc5ae7443f.png">
_NOTE: This will be purely a presentational component._
 | main | layout components to accommodate new designs problem in order to implement the new designs for the database page and schema page the first thing that we need to do is create components for common layouts proposed solution after looking at the designs here are the common layouts imho layoutwithheader this component will have two responsibilities adding a static header on the top rendering the children inside the main with some max width so that the layout does not spread across the width of the larger screens this max width behavior can be toggled using a prop since we might not need this restriction for certain pages for example the table page img width alt screenshot at pm src note this will be purely a presentational component can we reuse the existing layoutwithheader component we can use this existing component but this might need us to check the layout at other places too where this component is used and make sure no bugs are introduced there which might delay the delivery of the db and schema page i suggest we create a new component with the current needs and gradually start using at other places once we do their designs header the header in itself will be a separate component that gets rendered inside the layoutwithheader component i don t see the header requiring any props as of now but if we need this in the future the layoutwithheader should support all of the header props too and pass them down to this component pagetitleandmeta img width alt screenshot at pm src this component will be responsible for the following things rendering the type of the entity it will be of type string but for now schema database will be the possible values rendering the name of the entity so if the entity type is a database then it will be the name of the db rendering the icon using the icon prop of type iconprops rendering the description if any rendering an action button using the buttonprops prop img width alt screenshot at pm src note this will be purely a presentational component | 1 |
5,301 | 26,778,922,006 | IssuesEvent | 2023-01-31 19:25:12 | aws/serverless-application-model | https://api.github.com/repos/aws/serverless-application-model | closed | Cannot use multiple Authorizers in AWS::Serverless::Function | stage/bug-repro maintainer/need-response | <!-- Make sure we don't have an existing Issue that reports the bug you are seeing (both open and closed).
If you do find an existing Issue, re-open or add a comment to that Issue instead of creating a new one. -->
### Description:
I'm trying to use 2 different authorizers with API gateway, and so far it doesn't seem to work.
### Steps to reproduce:
This is part of my API gateway:
```yaml
MyApi:
Type: AWS::Serverless::Api
Properties:
Name: MyApi
StageName: prod
Auth:
DefaultAuthorizer: MyAuthorizerV1
Authorizers:
MyAuthorizerV1:
UserPoolArn: !Ref CognitoUserPoolArn
MyAuthorizerV2:
UserPoolArn: !Ref CognitoUserPoolArn
AuthorizationScopes:
- 'aws.cognito.signin.user.admin'
```
This is my function with 2 distinct events, each one using a different path and a different authorizer:
```yaml
MyFunction:
Type: 'AWS::Serverless::Function'
Properties:
FunctionName: my.function
Handler: my.handler
Events:
MyEventV1:
Type: Api
Properties:
RestApiId: !Ref MyApi
Path: /test
Method: get
Auth:
Authorizer: MyAuthorizerV1
MyEventV2:
Type: Api
Properties:
RestApiId: !Ref MyApi
Path: /v2/test
Method: get
Auth:
Authorizer: MyAuthorizerV2
```
### Observed result:
Both API gateway endpoints are using `MyAuthorizerV1`.
If I remove the `DefaultAuthorizer`, then both endpoints are using `NONE` as an Authorizer.
### Expected result:
The endpoint `/test` should be using `MyAuthorizerV1` and the endpoint `/v2/test` should be using `MyAuthorizerV2`.
### Additional environment details (Ex: Windows, Mac, Amazon Linux etc)
1. OS: macOS or Linux
2. If using SAM CLI, `sam --version`: SAM CLI, version 1.23.0
3. AWS region: eu-west-1
Just to be more specific let me explain exactly what I'm trying to achieve here:
I already have live endpoints using my Cognito authorizer with `id_token`. I want to migrate and start using `access_token` instead. If I'm right, this is possible but I need to add `AuthorizationScopes` to my Authorizer.
Since I have clients already using the `id_token`, I can't just migrate without some kind of transition, which is why I wanted to create new endpoints prefixed by `/v2` and migrate the clients smoothly.
So far the only way I'm thinking of is to duplicate the whole API part, which will generate another API gateway URL, and use CloudFront to cloak both API gateway domains under a single domain. Maybe there's a different technique to achieve this; in that case I'm open to suggestions. | True | Cannot use multiple Authorizers in AWS::Serverless::Function - <!-- Make sure we don't have an existing Issue that reports the bug you are seeing (both open and closed).
If you do find an existing Issue, re-open or add a comment to that Issue instead of creating a new one. -->
### Description:
I'm trying to use 2 different authorizers with API gateway, and so far it doesn't seem to work.
### Steps to reproduce:
This is part of my API gateway:
```yaml
MyApi:
  Type: AWS::Serverless::Api
  Properties:
    Name: MyApi
    StageName: prod
    Auth:
      DefaultAuthorizer: MyAuthorizerV1
      Authorizers:
        MyAuthorizerV1:
          UserPoolArn: !Ref CognitoUserPoolArn
        MyAuthorizerV2:
          UserPoolArn: !Ref CognitoUserPoolArn
          AuthorizationScopes:
            - 'aws.cognito.signin.user.admin'
```
This is my function with 2 distinct events, each one using a different path and a different authorizer:
```yaml
MyFunction:
  Type: 'AWS::Serverless::Function'
  Properties:
    FunctionName: my.function
    Handler: my.handler
    Events:
      MyEventV1:
        Type: Api
        Properties:
          RestApiId: !Ref MyApi
          Path: /test
          Method: get
          Auth:
            Authorizer: MyAuthorizerV1
      MyEventV2:
        Type: Api
        Properties:
          RestApiId: !Ref MyApi
          Path: /v2/test
          Method: get
          Auth:
            Authorizer: MyAuthorizerV2
```
### Observed result:
Both API gateway endpoints are using `MyAuthorizerV1`.
If I remove the `DefaultAuthorizer`, then both endpoints are using `NONE` as an Authorizer.
### Expected result:
The endpoint `/test` should be using `MyAuthorizerV1` and the endpoint `/v2/test` should be using `MyAuthorizerV2`.
### Additional environment details (Ex: Windows, Mac, Amazon Linux etc)
1. OS: macOS or Linux
2. If using SAM CLI, `sam --version`: SAM CLI, version 1.23.0
3. AWS region: eu-west-1
Just to be more specific let me explain exactly what I'm trying to achieve here:
I already have live endpoints using my Cognito authorizer with `id_token`. I want to migrate and start using `access_token` instead. If I'm right, this is possible but I need to add `AuthorizationScopes` to my Authorizer.
Since I have clients already using the `id_token`, I can't just migrate without some kind of transition, which is why I wanted to create new endpoints prefixed by `/v2` and migrate the clients smoothly.
So far the only way I'm thinking of is duplicate the whole API part, which will generated another API gateway URL, and use CloudFront to cloak both API gateway domains under a single domain. Maybe there's a different technique to achieve this, in this case I'm open to suggestions. | main | cannot use multiple authorizers in aws serverless function make sure we don t have an existing issue that reports the bug you are seeing both open and closed if you do find an existing issue re open or add a comment to that issue instead of creating a new one description i m trying to use different authorizers with api gateway and so far it doesn t seem to work steps to reproduce this is part of my api gateway yaml myapi type aws serverless api properties name myapi stagename prod auth defaultauthorizer authorizers userpoolarn ref cognitouserpoolarn userpoolarn ref cognitouserpoolarn authorizationscopes aws cognito signin user admin this is my function with distinct events each one using a different path and a different authorizer yaml myfunction type aws serverless function properties functionname my function handler my handler events type api properties restapiid ref myapi path test method get auth authorizer type api properties restapiid ref myapi path test method get auth authorizer observed result both api gateway endpoints are using if i remove the defaultauthorizer then both endpoints are using none as an authorizer expected result the endpoint test should be using and the endpoint test should be using additional environment details ex windows mac amazon linux etc os macos or linux if using sam cli sam version sam cli version aws region eu west just to be more specific let me explain exactly what i m trying to achieve here i already have live endpoints using my cognito authorizer with id token i want to migrate and start using access token instead if i m right this is possible but i need to add authorizationscopes to my authorizer since i have clients already using the id 
token i can t just migrate without some kind of transition which is why i wanted to create new endpoints prefixed by and migrate the clients smoothly so far the only way i m thinking of is duplicate the whole api part which will generated another api gateway url and use cloudfront to cloak both api gateway domains under a single domain maybe there s a different technique to achieve this in this case i m open to suggestions | 1 |
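For reference, later SAM releases also accept `AuthorizationScopes` directly in an event's `Auth` block, which can avoid the second authorizer entirely. A minimal sketch, reusing the issue's `MyApi`/`MyAuthorizerV1` names and assuming a SAM version that supports event-level scopes:

```yaml
# Hypothetical variant of the issue's MyEventV2: same authorizer, but with
# scopes attached at the event level instead of via a second authorizer.
MyEventV2:
  Type: Api
  Properties:
    RestApiId: !Ref MyApi
    Path: /v2/test
    Method: get
    Auth:
      Authorizer: MyAuthorizerV1
      AuthorizationScopes:   # availability depends on the SAM version in use
        - 'aws.cognito.signin.user.admin'
```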
181 | 2,788,271,607 | IssuesEvent | 2015-05-08 12:38:30 | TheRosettaFoundation/SOLAS-Match | https://api.github.com/repos/TheRosettaFoundation/SOLAS-Match | opened | Tracking resource deletions | feature-idea maintainability | It is important to record resource deletions (e.g. deletion of orgs - see #1152). Metadata related to deleted resources can be very useful in different situations (e.g. when trying to restore some data from backups). Information such as who deleted, what resource, and when needs to be stored in the database. | True | Tracking resource deletions - It is important to record resource deletions (e.g. deletion of orgs - see #1152). Metadata related to deleted resources can be very useful in different situations (e.g. when trying to restore some data from backups). Information such as who deleted, what resource, and when needs to be stored in the database. | main | tracking resource deletions it is important to record resource deletions e g deletion of orgs see metadata related to deleted resources can be very useful in different situations e g when trying to restore some data from backups information such as who deleted what resource and when needs to be stored in the database | 1 |
39,587 | 5,105,419,535 | IssuesEvent | 2017-01-05 07:18:34 | fossasia/loklak_search | https://api.github.com/repos/fossasia/loklak_search | closed | Simplify "Search More" style | design | The "search more" on the bottom introduces a new blue color and has a border element, that we do not find anywhere else on the site.
Please simplify the style to be in line with the page design
* [ ] Take out the background and border
* [ ] Only show the "Show More" in blue (as it is now) and bold
* [ ] Also ensure the cursor changes when scrolling over

| 1.0 | Simplify "Search More" style - The "search more" on the bottom introduces a new blue color and has a border element, that we do not find anywhere else on the site.
Please simplify the style to be in line with the page design
* [ ] Take out the background and border
* [ ] Only show the "Show More" in blue (as it is now) and bold
* [ ] Also ensure the cursor changes when scrolling over

| non_main | simplify search more style the search more on the bottom introduces a new blue color and has a border element that we do not find anywhere else on the site please simplify the style to go inline with the page design take out the background and border only show the show more in blue as it is now and bold also ensure the cursor changes when scrolling over | 0 |
1,318 | 5,654,398,187 | IssuesEvent | 2017-04-09 08:16:43 | MDAnalysis/mdanalysis | https://api.github.com/repos/MDAnalysis/mdanalysis | closed | Oldest compatible numpy version is 1.9.3 | maintainability | ### Expected behaviour
We claim to support numpy 1.5 and upwards.
### Actual behaviour
Everything below 1.9.3 has a lot of errors in the testsuite or just crashes. For 1.9.3 we just seem to have some accuracy errors for regression tests.
Here are links to travis runs
[Numpy 1.10.4](https://travis-ci.org/MDAnalysis/mdanalysis/builds/219240316)
[Numpy 1.9.3](https://travis-ci.org/MDAnalysis/mdanalysis/builds/219234044)
[Numpy 1.8.3](https://travis-ci.org/MDAnalysis/mdanalysis/builds/219230330)
[Numpy 1.7.1](https://travis-ci.org/MDAnalysis/mdanalysis/builds/219227618)
[Numpy 1.6.2](https://travis-ci.org/MDAnalysis/mdanalysis/builds/219223754)
[Numpy 1.5.1](https://travis-ci.org/MDAnalysis/mdanalysis/builds/219212223)
I suggest we fix the tests and bump the minimum version to 1.9.3 | True | Oldest compatible numpy version is 1.9.3 - ### Expected behaviour
We claim to support numpy 1.5 and upwards.
### Actual behaviour
Everything below 1.9.3 has a lot of errors in the testsuite or just crashes. For 1.9.3 we just seem to have some accuracy errors for regression tests.
Here are links to travis runs
[Numpy 1.10.4](https://travis-ci.org/MDAnalysis/mdanalysis/builds/219240316)
[Numpy 1.9.3](https://travis-ci.org/MDAnalysis/mdanalysis/builds/219234044)
[Numpy 1.8.3](https://travis-ci.org/MDAnalysis/mdanalysis/builds/219230330)
[Numpy 1.7.1](https://travis-ci.org/MDAnalysis/mdanalysis/builds/219227618)
[Numpy 1.6.2](https://travis-ci.org/MDAnalysis/mdanalysis/builds/219223754)
[Numpy 1.5.1](https://travis-ci.org/MDAnalysis/mdanalysis/builds/219212223)
I suggest we fix the tests and bump the minimum version to 1.9.3 | main | oldest compatible numpy version is expected behaviour we claim to support numpy and upwards actual behaviour everything below has a lot of errors in the testsuite or just crashes for we just seems to have some accuracy errors for regression tests here are links to travis runs i suggest we fix the tests and bump the minimum version to | 1 |
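One way to keep a proposed minimum version honest is to exercise it in CI alongside a newer release. A hypothetical Travis matrix fragment — the `NUMPY_VERSION` variable and the test command are illustrative, not the repo's actual configuration:

```yaml
# Hypothetical .travis.yml fragment: run the suite against the proposed
# floor (1.9.3) and a known-good newer numpy so regressions at the minimum
# show up immediately.
env:
  - NUMPY_VERSION=1.9.3    # proposed minimum
  - NUMPY_VERSION=1.10.4   # known-good newer release
install:
  - pip install numpy==$NUMPY_VERSION
  - pip install ./package ./testsuite
script:
  - python -m pytest testsuite   # illustrative; substitute the project's own test runner
```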
3,534 | 13,912,080,027 | IssuesEvent | 2020-10-20 18:19:50 | grey-software/Twitter-Focus | https://api.github.com/repos/grey-software/Twitter-Focus | opened | 🚀 Feature Request: Add Chrome and Firefox web store links | Domain: User Experience Role: Maintainer Type: Enhancement hacktoberfest-accepted | ### Problem Overview 👁️🗨️
Users should be able to see the links to the Twitter-Focus extension (on the Chrome and Firefox Web Stores) in README.md.
### What would you like? 🧰
Add the links to the Twitter-Focus web extension that has been published on the Chrome and Firefox web stores to the README.md file.
### What alternatives have you considered? 🔍
N/A
### Additional details ℹ️
Links to Twitter-Focus on the web stores:
Chrome Web Store: https://chrome.google.com/webstore/detail/twitter-focus/kmdpomipbibobgdgfeidajmnlecloeml?hl=en&authuser=1
Mozilla Firefox Web Store: https://addons.mozilla.org/en-US/firefox/addon/twitter-focus/ | True | 🚀 Feature Request: Add Chrome and Firefox web store links - ### Problem Overview 👁️🗨️
Users should be able to see the links to the Twitter-Focus extension (on the Chrome and Firefox Web Stores) in README.md.
### What would you like? 🧰
Add the links to the Twitter-Focus web extension that has been published on the Chrome and Firefox web stores to the README.md file.
### What alternatives have you considered? 🔍
N/A
### Additional details ℹ️
Links to Twitter-Focus on the web stores:
Chrome Web Store: https://chrome.google.com/webstore/detail/twitter-focus/kmdpomipbibobgdgfeidajmnlecloeml?hl=en&authuser=1
Mozilla Firefox Web Store: https://addons.mozilla.org/en-US/firefox/addon/twitter-focus/ | main | 🚀 feature request add chrome and firefox web store links problem overview 👁️🗨️ users should be able to see the links to the twitter focus extension on the chrome and firefox web store in readme md what would you like 🧰 add the links to the twitter focus web extension that has been published on the chrome and firefox web stores to the readme md file what alternatives have you considered 🔍 n a additional details ℹ️ links to twitter focus on the web stores chrome web store mozilla firefox web store | 1 |
901 | 4,561,017,521 | IssuesEvent | 2016-09-14 10:05:51 | simplesamlphp/simplesamlphp | https://api.github.com/repos/simplesamlphp/simplesamlphp | closed | Deprecate certificate validation by fingerprint | enhancement maintainability | Because we want to remove certificate validation by fingerprint (per #431), we should deprecate the functionality now so people can transition away from it. | True | Deprecate certificate validation by fingerprint - Because we want to remove certificate validation by fingerprint (per #431), we should deprecate the functionality now so people can transition away from it. | main | deprecate certificate validation by fingerprint because we want to remove certificate validation by fingerprint per we should deprecate the functionality now so people can transition away from it | 1 |
4,319 | 21,721,412,997 | IssuesEvent | 2022-05-11 00:48:52 | cncf/glossary | https://api.github.com/repos/cncf/glossary | closed | Update Chinese lang ID: `zh` → `zh-cn` | maintainers lang/zh | Recently, Docsy has renamed language 'zh' to 'zh-cn'.
(Ref: https://github.com/google/docsy/pull/832)
And after [updating Docsy](https://github.com/kubernetes/website/pull/32906) used by [kubernetes/website repo](https://github.com/kubernetes/website),
this causes some issues (like [this](https://github.com/kubernetes/website/issues/33002), and so on..).
To resolve this, [Qiming Teng](https://github.com/tengqm), an active English & Chinese approver in kubernetes/website repo, suggested a solution:
**update the Chinese language identifier from `zh` to `zh-cn`**,
causing everything under `https//kubernetes.io/zh/*` to be moved to `https//kubernetes.io/zh-cn/*` .
- https://github.com/kubernetes/website/pull/33050
- https://github.com/kubernetes/website/pull/33051
It is likely that we update our Docsy version some time or other, and that we will encounter this issue too.
So I suggest updating the Chinese language identifier from `zh` to `zh-cn` in the CNCF Glossary project before `dev-zh` goes live,
because updating **before** `dev-zh` going live will be better than updating **after** `dev-zh` going live.
(Which means that Chinese contents will be live in `https://glossary.cncf.io/zh-cn/`, not `https://glossary.cncf.io/zh/`.)
How about your thoughts,
Chinese approvers (@hanyuancheung @Jacob953 @Rocksnake @Submarinee) and
maintainers and English approvers (@cjyabraham @CathPag @jasonmorgan @seokho-son @iamNoah1) ? 😊
[Refs]
- https://github.com/google/docsy/issues/732
- https://github.com/google/docsy/pull/712
- https://github.com/google/docsy/pull/826
- https://github.com/google/docsy/pull/832
- https://github.com/kubernetes/website/issues/33002 | True | Update Chinese lang ID: `zh` → `zh-cn` - Recently, Docsy has renamed language 'zh' to 'zh-cn'.
(Ref: https://github.com/google/docsy/pull/832)
And after [updating Docsy](https://github.com/kubernetes/website/pull/32906) used by [kubernetes/website repo](https://github.com/kubernetes/website),
this causes some issues (like [this](https://github.com/kubernetes/website/issues/33002), and so on..).
To resolve this, [Qiming Teng](https://github.com/tengqm), an active English & Chinese approver in kubernetes/website repo, suggested a solution:
**update the Chinese language identifier from `zh` to `zh-cn`**,
causing everything under `https//kubernetes.io/zh/*` to be moved to `https//kubernetes.io/zh-cn/*` .
- https://github.com/kubernetes/website/pull/33050
- https://github.com/kubernetes/website/pull/33051
It is likely that we update our Docsy version some time or other, and that we will encounter this issue too.
So I suggest updating the Chinese language identifier from `zh` to `zh-cn` in the CNCF Glossary project before `dev-zh` goes live,
because updating **before** `dev-zh` going live will be better than updating **after** `dev-zh` going live.
(Which means that Chinese contents will be live in `https://glossary.cncf.io/zh-cn/`, not `https://glossary.cncf.io/zh/`.)
How about your thoughts,
Chinese approvers (@hanyuancheung @Jacob953 @Rocksnake @Submarinee) and
maintainers and English approvers (@cjyabraham @CathPag @jasonmorgan @seokho-son @iamNoah1) ? 😊
[Refs]
- https://github.com/google/docsy/issues/732
- https://github.com/google/docsy/pull/712
- https://github.com/google/docsy/pull/826
- https://github.com/google/docsy/pull/832
- https://github.com/kubernetes/website/issues/33002 | main | update chinese lang id zh → zh cn recently docsy has renamed language zh to zh cn ref and after used by this causes some issues like and so on to resolve this an active english chinese approver in kubernetes website repo suggested a solution update the chinese language identifier from zh to zh cn causing everything under https kubernetes io zh to be moved to https kubernetes io zh cn it is likely that we update our docsy version some time or other and that we will encounter this issue too so i suggest to update the chinese language identifier from zh to zh cn in cncf glossary project before dev zh goes live because updating before dev zh going live will be better than updating after dev zh going live which means that chinese contents will be live in not how about your thoughts chinese approvers hanyuancheung rocksnake submarinee and maintainers and english approvers cjyabraham cathpag jasonmorgan seokho son 😊 | 1 |
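In Hugo (which Docsy sites build on), the URL prefix comes from the language key in the site configuration, so the proposed rename largely amounts to changing that key. A hypothetical `config.yaml` fragment — the specific weights and directory names are illustrative:

```yaml
# Hypothetical Hugo config fragment showing the rename: the language key
# (and therefore the URL prefix) changes from zh to zh-cn.
languages:
  en:
    languageName: English
    weight: 1
  zh-cn:                       # previously `zh`; pages move from /zh/* to /zh-cn/*
    languageName: 中文
    contentDir: content/zh-cn  # content directory renamed to match the new key
    weight: 2
```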
4,018 | 18,754,142,942 | IssuesEvent | 2021-11-05 08:29:54 | aws/aws-sam-cli | https://api.github.com/repos/aws/aws-sam-cli | closed | Unable to map httpapi to existing custom domain | type/bug stage/needs-investigation maintainer/need-response | ### Description:
I am trying to create an httpapi and map it to an existing custom domain
### Steps to reproduce:
Sample template.yml file
```
Resources:
  HttpApi:
    Type: AWS::Serverless::HttpApi
    Properties:
      StageName: !Ref StageName
      Tags:
        Tag: Value
      AccessLogSettings:
        DestinationArn: !GetAtt AccessLogs.Arn
        Format: $context.requestId
      StageVariables:
        StageVar: Value
      FailOnWarnings: True
      CorsConfiguration:
        AllowOrigins:
          - "*"
        AllowHeaders:
          - origin
          - content-type
          - x-amz-date
          - authorization
          - x-api-key
          - x-amz-security-token
        AllowMethods:
          - GET
          - POST
          - OPTIONS
          - PATCH
          - DELETE
        MaxAge: 0
      Domain:
        DomainName: x.x.x
        CertificateArn: arn
        EndpointConfiguration: REGIONAL
        Route53:
          HostedZoneId: XXXXXXXXXXX
        BasePath:
          - /banks
```
### Observed result:
```
The domain name you provided already exists.
(Service: AmazonApiGatewayV2; Status Code: 400;
Error Code: BadRequestException; Request ID:
; Proxy:
null)
```
### Expected result:
Map created httpapi to the custom domain and under paths given as per the template section BasePath
### Additional environment details (Ex: Windows, Mac, Amazon Linux etc)
1. OS: Ubuntu
2. `sam --version`: 1.16.0
3. AWS region: eu-west-1
`Add --debug flag to command you are running`
| True | Unable to map httpapi to exisiting customdomain - ### Description:
I am trying to create an httpapi and map it to an existing custom domain
### Steps to reproduce:
Sample template.yml file
```
Resources:
  HttpApi:
    Type: AWS::Serverless::HttpApi
    Properties:
      StageName: !Ref StageName
      Tags:
        Tag: Value
      AccessLogSettings:
        DestinationArn: !GetAtt AccessLogs.Arn
        Format: $context.requestId
      StageVariables:
        StageVar: Value
      FailOnWarnings: True
      CorsConfiguration:
        AllowOrigins:
          - "*"
        AllowHeaders:
          - origin
          - content-type
          - x-amz-date
          - authorization
          - x-api-key
          - x-amz-security-token
        AllowMethods:
          - GET
          - POST
          - OPTIONS
          - PATCH
          - DELETE
        MaxAge: 0
      Domain:
        DomainName: x.x.x
        CertificateArn: arn
        EndpointConfiguration: REGIONAL
        Route53:
          HostedZoneId: XXXXXXXXXXX
        BasePath:
          - /banks
```
### Observed result:
```
The domain name you provided already exists.
(Service: AmazonApiGatewayV2; Status Code: 400;
Error Code: BadRequestException; Request ID:
; Proxy:
null)
```
### Expected result:
Map created httpapi to the custom domain and under paths given as per the template section BasePath
### Additional environment details (Ex: Windows, Mac, Amazon Linux etc)
1. OS: Ubuntu
2. `sam --version`: 1.16.0
3. AWS region: eu-west-1
`Add --debug flag to command you are running`
| main | unable to map httpapi to exisiting customdomain description i am trying to create a httpapi and map to an exisitng custom domain steps to reproduce sample template yml file resources httpapi type aws serverless httpapi properties stagename ref stagename tags tag value accesslogsettings destinationarn getatt accesslogs arn format context requestid stagevariables stagevar value failonwarnings true corsconfiguration alloworigins allowheaders origin content type x amz date authorization x api key x amz security token allowmethods get post options patch delete maxage domain domainname x x x certificatearn arn endpointconfiguration regional hostedzoneid xxxxxxxxxxx basepath banks observed result the domain name you provided already exists service status code error code badrequestexception request id proxy null expected result map created httpapi to the custom domain and under paths given as per the template section basepath additional environment details ex windows mac amazon linux etc os ubuntu sam version aws region eu west add debug flag to command you are running | 1 |
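When the custom domain already exists, one workaround is to omit SAM's `Domain` block (which attempts to create the domain and triggers the 400 above) and attach the stage with a plain mapping resource instead. A hypothetical sketch — `BanksMapping` is an invented logical ID, and the other names follow the issue's template:

```yaml
# Hypothetical workaround: map the stage onto the pre-existing custom
# domain directly, without asking SAM to create the domain.
BanksMapping:
  Type: AWS::ApiGatewayV2::ApiMapping
  Properties:
    ApiId: !Ref HttpApi
    DomainName: x.x.x      # the pre-existing custom domain
    Stage: !Ref StageName
    ApiMappingKey: banks   # serves the API under /banks
```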
1,891 | 6,577,533,420 | IssuesEvent | 2017-09-12 01:34:41 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | Docker pull: always and state: reloaded no longer ignores unnamed containers | affects_1.9 bug_report cloud docker waiting_on_maintainer | ##### Issue Type:
- Bug Report
##### Plugin Name:
Docker
##### Ansible Version:
**Worked:**
```
ansible 1.9.4
configured module search path = None
```
**Broken:**
```
ansible 2.0.0.1
config file = /Users/mmorris/Work/ansible/ansible.cfg
configured module search path = Default w/o overrides
```
##### Ansible Configuration:
```
[defaults]
host_key_checking=False
display_skipped_hosts=False
force_handlers = True
hostfile = inventory/ec2.py
retry_files_enabled = False
[ssh_connection]
pipelining=True
```
##### Environment:
Ubuntu 14.04 from OSX 10.10
##### Summary:
Something has changed in the docker module so that if you have `pull: always` and `state: reloaded` with an unnamed container, it will keep on creating new containers instead of realizing it's the same container, just differently named.
##### Steps To Reproduce:
``` yaml
- name: create redis container
  docker:
    image: "redis:3.0.3"
    pull: always
    state: reloaded
```
##### Expected Results:
When a container already exists and it has all the same settings except that the dynamically assigned name is different, nothing should happen:
**First run:**
```
GATHERING FACTS ***************************************************************
ok: [x.x.x.x]
TASK: [docker | create redis container] *********************
changed: [x.x.x.x]
```
**Second run:**
```
GATHERING FACTS ***************************************************************
ok: [x.x.x.x]
TASK: [docker | create redis container] *********************
ok: [x.x.x.x]
```
```
$ docker ps | grep redis | awk '{print $2 "," $12}'
redis:3.0.3,sad_tesla
```
##### Actual Results:
It will create a new, separate container every time:
**First run:**
```
GATHERING FACTS ***************************************************************
ok: [x.x.x.x]
TASK: [docker | create redis container] *********************
changed: [x.x.x.x]
```
**Second run:**
```
GATHERING FACTS ***************************************************************
ok: [x.x.x.x]
TASK: [docker | create redis container] *********************
changed: [x.x.x.x]
```
```
$ docker ps | grep redis | awk '{print $2 "," $12}'
redis:3.0.3,sad_tesla
redis:3.0.3,mad_kirch
```
Not sure why this change was made, but it is very much NOT what should be happening and will result in problems for what I know are many people who use the module like this.
| True | Docker pull: always and state: reloaded no longer ignores unnamed containers - ##### Issue Type:
- Bug Report
##### Plugin Name:
Docker
##### Ansible Version:
**Worked:**
```
ansible 1.9.4
configured module search path = None
```
**Broken:**
```
ansible 2.0.0.1
config file = /Users/mmorris/Work/ansible/ansible.cfg
configured module search path = Default w/o overrides
```
##### Ansible Configuration:
```
[defaults]
host_key_checking=False
display_skipped_hosts=False
force_handlers = True
hostfile = inventory/ec2.py
retry_files_enabled = False
[ssh_connection]
pipelining=True
```
##### Environment:
Ubuntu 14.04 from OSX 10.10
##### Summary:
Something has changed in the docker module so that if you have `pull: always` and `state: reloaded` with an unnamed container, it will keep on creating new containers instead of realizing it's the same container, just differently named.
##### Steps To Reproduce:
``` yaml
- name: create redis container
  docker:
    image: "redis:3.0.3"
    pull: always
    state: reloaded
```
##### Expected Results:
When a container already exists and it has all the same settings except that the dynamically assigned name is different, nothing should happen:
**First run:**
```
GATHERING FACTS ***************************************************************
ok: [x.x.x.x]
TASK: [docker | create redis container] *********************
changed: [x.x.x.x]
```
**Second run:**
```
GATHERING FACTS ***************************************************************
ok: [x.x.x.x]
TASK: [docker | create redis container] *********************
ok: [x.x.x.x]
```
```
$ docker ps | grep redis | awk '{print $2 "," $12}'
redis:3.0.3,sad_tesla
```
##### Actual Results:
It will create a new, separate container every time:
**First run:**
```
GATHERING FACTS ***************************************************************
ok: [x.x.x.x]
TASK: [docker | create redis container] *********************
changed: [x.x.x.x]
```
**Second run:**
```
GATHERING FACTS ***************************************************************
ok: [x.x.x.x]
TASK: [docker | create redis container] *********************
changed: [x.x.x.x]
```
```
$ docker ps | grep redis | awk '{print $2 "," $12}'
redis:3.0.3,sad_tesla
redis:3.0.3,mad_kirch
```
Not sure why this change was made, but it is very much NOT what should be happening and will result in problems for what I know are many people who use the module like this.
| main | docker pull always and state reloaded no longer ignores unnamed containers issue type bug report plugin name docker ansible version worked ansible configured module search path none broken ansible config file users mmorris work ansible ansible cfg configured module search path default w o overrides ansible configuration host key checking false display skipped hosts false force handlers true hostfile inventory py retry files enabled false pipelining true environment ubuntu from osx summary something has changed in the docker module so that if you have pull always and state reloaded with a unnamed container it will keep on creating new containers instead of realizing it s the same container just differently named steps to reproduce yaml name create redis container docker image redis pull always state reloaded expected results when a container already exists and it has all the same settings except the dynamically assigned name is different nothing should happen first run gathering facts ok task changed second run gathering facts ok task ok docker ps grep redis awk print redis sad tesla actual results it will create a new separate container every time first run gathering facts ok task changed second run gathering facts ok task changed docker ps grep redis awk print redis sad tesla redis mad kirch not sure why this change was made but it is very much not what should be happening and will result in problems for what i know are many people who use the module like this | 1 |
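A common mitigation for this class of problem is to give the container an explicit `name`, so the module has a stable identity to match against instead of a randomly generated one. A hypothetical variant of the task from the report — the `name` value is invented:

```yaml
# Hypothetical sketch: an explicit name gives `state: reloaded` a stable
# container identity to compare against, instead of spawning an anonymous
# duplicate on each run.
- name: create redis container
  docker:
    name: redis            # stable identity the module can match on re-runs
    image: "redis:3.0.3"
    pull: always
    state: reloaded
```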
388 | 3,422,951,107 | IssuesEvent | 2015-12-09 02:14:23 | dotnet/roslyn-analyzers | https://api.github.com/repos/dotnet/roslyn-analyzers | closed | Port FxCop rule CA1806: DoNotIgnoreMethodResults | Area-Microsoft.Maintainability.Analyzers FxCop-Port Urgency-Soon | **Title:** Do not ignore method results
**Description:**
A new object is created but never used; or a method that creates and returns a new string is called and the new string is never used; or a COM or P/Invoke method returns an HRESULT or error code that is never used.
**Dependency:** None
**Notes:**
Consider having a whitelist, like immutable types (including string), and also look for the Pure attribute | True | Port FxCop rule CA1806: DoNotIgnoreMethodResults - **Title:** Do not ignore method results
**Description:**
A new object is created but never used; or a method that creates and returns a new string is called and the new string is never used; or a COM or P/Invoke method returns an HRESULT or error code that is never used.
**Dependency:** None
**Notes:**
Consider having a whitelist, like immutable types (including string), and also look for the Pure attribute | main | port fxcop rule donotignoremethodresults title do not ignore method results description a new object is created but never used or a method that creates and returns a new string is called and the new string is never used or a com or p invoke method returns an hresult or error code that is never used dependency none notes consider having a whitelist like immutable types including string and also look for the pure attribute | 1 |
2,562 | 8,711,615,396 | IssuesEvent | 2018-12-06 19:43:57 | arcticicestudio/nord-docs | https://api.github.com/repos/arcticicestudio/nord-docs | opened | SVGR | context-api scope-dx scope-maintainability type-feature | <p align="center"><img src="https://user-images.githubusercontent.com/7836623/49604267-bceaa780-f98d-11e8-932a-05ee4b952a83.png" width="20%" /></p>
This issue documents the integration of [SVGR][], which transforms SVG into ready-to-use React components. It is part of [create-react-app][cra] and makes SVG integration into React projects easy.
## Configuration
All configurations will be placed in a `svgr.config.js` file in the project instead of passing them to the corresponding Gatsby plugin for the Webpack loader that is documented in the section below.
- `expandProps` = `end` — ensure props are always spread (`{...props}`) last on the root `<svg>` element.
- `ext` = `.jsx` — use file JSX extension for generated components.
- `icon` = `false` — ensure the `width` and `height` props (hardcoded `1em` value) are not added.
- `ref` = `true` — export components using React's `forwardRef` API to reference the underlying component and allow access to the root SVG element which is necessary for animations.
- `svgo` = `false` — all SVG files in this project are already optimized with SVGO using the included `.svgo.yml` configuration.
## Webpack Loader
To directly import SVG's as React component the [@svgr/webpack][gh-svgr-wpl] package will be used to simply integrate SVGR as Webpack plugin. It'll be added through the [gatsby-plugin-svgr][npm-gp-svgr] which adjusts the required Webpack configurations by adding the loader and removing the default `.svg` file loader.
## Tasks
- [ ] Install the required (dev) dependencies:
- [gatsby-plugin-svgr][npm-gp-svgr]
- [@svgr/webpack][npm-svgr-wp]
- [ ] Implement the SVGR and SVGO configuration files.
- [ ] Add `gatsby-plugin-svgr` to the Gatsby configuration.
[cra]: https://facebook.github.io/create-react-app
[gh-svgr-wpl]: https://github.com/smooth-code/svgr/tree/master/packages/webpack
[npm-gp-svgr]: https://www.npmjs.com/package/gatsby-plugin-svgr
[npm-svgr-wp]: https://www.npmjs.com/package/@svgr/webpack
[svgr]: https://www.smooth-code.com/open-source/svgr
| True | SVGR - <p align="center"><img src="https://user-images.githubusercontent.com/7836623/49604267-bceaa780-f98d-11e8-932a-05ee4b952a83.png" width="20%" /></p>
This issue documents the integration of [SVGR][], which transforms SVG into ready-to-use React components. It is part of [create-react-app][cra] and makes SVG integration into React projects easy.
## Configuration
All configurations will be placed in a `svgr.config.js` file in the project instead of passing them to the corresponding Gatsby plugin for the Webpack loader that is documented in the section below.
- `expandProps` = `end` — ensure props are always spread (`{...props}`) last on the root `<svg>` element.
- `ext` = `.jsx` — use file JSX extension for generated components.
- `icon` = `false` — ensure the `width` and `height` props (hardcoded `1em` value) are not added.
- `ref` = `true` — export components using React's `forwardRef` API to reference the underlying component and allow access to the root SVG element which is necessary for animations.
- `svgo` = `false` — all SVG files in this project are already optimized with SVGO using the included `.svgo.yml` configuration.
## Webpack Loader
To directly import SVG's as React component the [@svgr/webpack][gh-svgr-wpl] package will be used to simply integrate SVGR as Webpack plugin. It'll be added through the [gatsby-plugin-svgr][npm-gp-svgr] which adjusts the required Webpack configurations by adding the loader and removing the default `.svg` file loader.
## Tasks
- [ ] Install the required (dev) dependencies:
- [gatsby-plugin-svgr][npm-gp-svgr]
- [@svgr/webpack][npm-svgr-wp]
- [ ] Implement the SVGR and SVGO configuration files.
- [ ] Add `gatsby-plugin-svgr` to the Gatsby configuration.
[cra]: https://facebook.github.io/create-react-app
[gh-svgr-wpl]: https://github.com/smooth-code/svgr/tree/master/packages/webpack
[npm-gp-svgr]: https://www.npmjs.com/package/gatsby-plugin-svgr
[npm-svgr-wp]: https://www.npmjs.com/package/@svgr/webpack
[svgr]: https://www.smooth-code.com/open-source/svgr
| main | svgr this issue documents the integration of which transforms svg into ready to use react components it is part of and makes svg integration into react projects easy configuration all configurations will be placed in a svgr config js file in the project instead of passing them to the corresponding gatsby plugin for the webpack loader that is documented in the section below expandprops end — ensure props are always spread props last on the root element ext jsx — use file jsx extension for generated components icon false — ensure the width and height props hardcoded value are not added ref true — export components using react s forwardref api to reference the underlying component and allow access to the root svg element which is necessary for animations svgo false — all svg files in this project are already optimized with svgo using the included svgo yml configuration webpack loader to directly import svg s as react component the package will be used to simply integrate svgr as webpack plugin it ll be added through the which adjusts the required webpack configurations by adding the loader and removing the default svg file loader tasks install the required dev dependencies implement the svgr and svgo configuration files add gatsby plugin svgr to the gatsby configuration | 1 |
3,240 | 12,368,706,773 | IssuesEvent | 2020-05-18 14:13:31 | Kashdeya/Tiny-Progressions | https://api.github.com/repos/Kashdeya/Tiny-Progressions | closed | Request for compatibility with 'Iblis' | Version not Maintainted | The author of 'Iblis' said that your recipe registration of armor and weapons/tools happens too late during load for his mod to intercept them. His mod is a quality mod. Quality of weapons/tools and armor is affected by 'weapon smithing' and 'armor smithing' skills. | 1