Unnamed: 0 int64 0 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 7 112 | repo_url stringlengths 36 141 | action stringclasses 3 values | title stringlengths 1 744 | labels stringlengths 4 574 | body stringlengths 9 211k | index stringclasses 10 values | text_combine stringlengths 96 211k | label stringclasses 2 values | text stringlengths 96 188k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
21,619 | 30,022,526,080 | IssuesEvent | 2023-06-27 01:34:58 | bazelbuild/bazel | https://api.github.com/repos/bazelbuild/bazel | closed | bazel third-party CPP library relative path | P4 type: support / not a bug (process) team-Rules-CPP stale | Depending on the third-party CPP library, some directory header files contain the relative path `../../`; how to introduce this in bazel
| 1.0 | bazel third-party CPP library relative path - Depending on the third-party CPP library, some directory header files contain the relative path `../../`; how to introduce this in bazel
| process | bazel third party cpp library relative path depending on the third party cpp library some directory header files contain the relative path of how to introduce bazel | 1 |
15,561 | 19,703,503,996 | IssuesEvent | 2022-01-12 19:08:02 | googleapis/java-pubsublite-spark | https://api.github.com/repos/googleapis/java-pubsublite-spark | opened | Your .repo-metadata.json file has a problem 🤒 | type: process repo-metadata: lint | You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'pubsublite-spark' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions. | 1.0 | Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'pubsublite-spark' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions. | process | your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 release level must be equal to one of the allowed values in repo metadata json api shortname pubsublite spark invalid in repo metadata json ☝️ once you correct these problems you can close this issue reach out to go github automation if you have any questions | 1 |
19,112 | 25,165,916,437 | IssuesEvent | 2022-11-10 20:50:09 | ORNL-AMO/AMO-Tools-Desktop | https://api.github.com/repos/ORNL-AMO/AMO-Tools-Desktop | closed | Copy Edit for Design Energy | Process Heating | For electric energy input, we should say ‘total rated capacity of electric furnace (kW)’ or something like that …
 | 1.0 | Copy Edit for Design Energy - For electric energy input, we should say ‘total rated capacity of electric furnace (kW)’ or something like that …
 | process | copy edit for design energy for electric energy input we should say ‘total rated capacity of electric furnace kw ’ or something like that … | 1 |
135,968 | 12,696,010,323 | IssuesEvent | 2020-06-22 09:21:21 | ga4gh/workflow-execution-service-schemas | https://api.github.com/repos/ga4gh/workflow-execution-service-schemas | closed | Data Retention Policy | Project: WES Type: Documentation | A section Guidelines should be provided to advise implementers on Data Retention Policy. This issue provides a space for this discussion to take place. A PR can then be raised with the changes in place. | 1.0 | Data Retention Policy - A section Guidelines should be provided to advise implementers on Data Retention Policy. This issue provides a space for this discussion to take place. A PR can then be raised with the changes in place. | non_process | data retention policy a section guidelines should be provided to advise implementers on data retention policy this issue provides a space for this discussion to take place a pr can then be raised with the changes in place | 0 |
236,040 | 7,745,384,221 | IssuesEvent | 2018-05-29 18:10:09 | minio/minio-go | https://api.github.com/repos/minio/minio-go | closed | release tag format, compatibility with the Go versioning proposal | community priority: medium | Hey,
when testing if restic compiles with `vgo`, the prototype implementation for adding [versioning to the Go toolchain](https://github.com/golang/go/issues/24301), I noticed that some time ago you switched from release tags in the form `v1.2.3` (as required by `vgo`) to `1.2.3`, so the `v` prefix was dropped.
This leads to `vgo` selecting the `v1.0.0` release tag, which is ancient:
```
vgo: import "github.com/restic/restic/cmd/restic" ->
import "github.com/restic/restic/internal/backend/location" ->
import "github.com/restic/restic/internal/backend/s3" ->
import "github.com/minio/minio-go/pkg/credentials" [/home/fd0/gopath/src/v/github.com/minio/minio-go@v1.0.0/pkg/credentials]: open /home/fd0/gopath/src/v/github.com/minio/minio-go@v1.0.0/pkg/credentials: no such file or directory
```
Would you mind adding a second tag `v6.0.1` which points to the same commit the tag `6.0.1` does, and tagging future releases with the `v` prefix? That'd be awesome :)
Thanks!
| 1.0 | release tag format, compatibility with the Go versioning proposal - Hey,
when testing if restic compiles with `vgo`, the prototype implementation for adding [versioning to the Go toolchain](https://github.com/golang/go/issues/24301), I noticed that some time ago you switched from release tags in the form `v1.2.3` (as required by `vgo`) to `1.2.3`, so the `v` prefix was dropped.
This leads to `vgo` selecting the `v1.0.0` release tag, which is ancient:
```
vgo: import "github.com/restic/restic/cmd/restic" ->
import "github.com/restic/restic/internal/backend/location" ->
import "github.com/restic/restic/internal/backend/s3" ->
import "github.com/minio/minio-go/pkg/credentials" [/home/fd0/gopath/src/v/github.com/minio/minio-go@v1.0.0/pkg/credentials]: open /home/fd0/gopath/src/v/github.com/minio/minio-go@v1.0.0/pkg/credentials: no such file or directory
```
Would you mind adding a second tag `v6.0.1` which points to the same commit the tag `6.0.1` does, and tagging future releases with the `v` prefix? That'd be awesome :)
Thanks!
| non_process | release tag format compatibility with the go versioning proposal hey when testing if restic compiles with vgo the prototype implementation for adding i noticed that some time ago you switched from release tags in the form as required by vgo to so the v prefix was dropped this leads to vgo selecting the release tag which is ancient vgo import github com restic restic cmd restic import github com restic restic internal backend location import github com restic restic internal backend import github com minio minio go pkg credentials open home gopath src v github com minio minio go pkg credentials no such file or directory would you mind adding a second tag which points to the same commit the tag does and tagging future releases with the v prefix that d be awesome thanks | 0 |
21,881 | 30,326,829,323 | IssuesEvent | 2023-07-11 01:11:31 | open-telemetry/opentelemetry-collector-contrib | https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib | closed | [processor/spanmetrics] Inaccurate timestamps on metrics. | bug processor/spanmetrics connector/spanmetrics | **Describe the bug**
The processor uses the time since it started up for the start time of all metrics, and this is not very accurate. For delta metrics it is particularly inaccurate. For example here: https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/v0.42.0/processor/spanmetricsprocessor/processor.go#L125
**What did you expect to see?**
Each metric should probably keep track of the earliest and latest corresponding timestamp, for more accurate results. Or, any feasible way of giving more accurate time results.
**What version did you use?**
Version: `v0.42.0`
| 1.0 | [processor/spanmetrics] Inaccurate timestamps on metrics. - **Describe the bug**
The processor uses the time since it started up for the start time of all metrics, and this is not very accurate. For delta metrics it is particularly inaccurate. For example here: https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/v0.42.0/processor/spanmetricsprocessor/processor.go#L125
**What did you expect to see?**
Each metric should probably keep track of the earliest and latest corresponding timestamp, for more accurate results. Or, any feasible way of giving more accurate time results.
**What version did you use?**
Version: `v0.42.0`
| process | inaccurate timestamps on metrics describe the bug the processor uses the time since it started up for the start time of all metrics and this is not very accurate for delta metrics it is particularly inaccurate for example here what did you expect to see each metric should probably keep track of the earliest and latest corresponding timestamp for more accurate results or any feasible way of giving more accurate time results what version did you use version | 1 |
9,266 | 6,192,244,745 | IssuesEvent | 2017-07-05 00:23:47 | ValueChart/WebValueCharts | https://api.github.com/repos/ValueChart/WebValueCharts | closed | Enable "enter" button functionality across the application. | DIFFICULTY: Easy IMPORTANCE: Low TYPE: Usability/Style | Hitting the "enter" button should trigger form submissions across the application. | True | Enable "enter" button functionality across the application. - Hitting the "enter" button should trigger form submissions across the application. | non_process | enable enter button functionality across the application hitting the enter button should trigger form submissions across the application | 0 |
595,794 | 18,074,942,012 | IssuesEvent | 2021-09-21 08:50:14 | googleapis/nodejs-speech | https://api.github.com/repos/googleapis/nodejs-speech | opened | Recognize: "after all" hook for "should run speech diarization on a local file" failed | priority: p1 type: bug flakybot: issue | This test failed!
To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/main/packages/flakybot).
If I'm commenting on this issue too often, add the `flakybot: quiet` label and
I will stop commenting.
---
commit: 88df0763de1021c915c2b843766397253243058e
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/e34dabb9-12fa-4676-b167-589738c02915), [Sponge](http://sponge2/e34dabb9-12fa-4676-b167-589738c02915)
status: failed
<details><summary>Test output</summary><br><pre>The specified bucket does not exist.
Error: The specified bucket does not exist.
at new ApiError (node_modules/@google-cloud/common/build/src/util.js:73:15)
at Util.parseHttpRespBody (node_modules/@google-cloud/common/build/src/util.js:208:38)
at Util.handleResp (node_modules/@google-cloud/common/build/src/util.js:149:117)
at /workspace/samples/node_modules/@google-cloud/common/build/src/util.js:477:22
at onResponse (node_modules/retry-request/index.js:228:7)
at /workspace/samples/node_modules/teeny-request/build/src/index.js:226:13
-> /workspace/samples/node_modules/teeny-request/src/index.ts:333:11
at processTicksAndRejections (internal/process/task_queues.js:97:5)</pre></details> | 1.0 | Recognize: "after all" hook for "should run speech diarization on a local file" failed - This test failed!
To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/main/packages/flakybot).
If I'm commenting on this issue too often, add the `flakybot: quiet` label and
I will stop commenting.
---
commit: 88df0763de1021c915c2b843766397253243058e
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/e34dabb9-12fa-4676-b167-589738c02915), [Sponge](http://sponge2/e34dabb9-12fa-4676-b167-589738c02915)
status: failed
<details><summary>Test output</summary><br><pre>The specified bucket does not exist.
Error: The specified bucket does not exist.
at new ApiError (node_modules/@google-cloud/common/build/src/util.js:73:15)
at Util.parseHttpRespBody (node_modules/@google-cloud/common/build/src/util.js:208:38)
at Util.handleResp (node_modules/@google-cloud/common/build/src/util.js:149:117)
at /workspace/samples/node_modules/@google-cloud/common/build/src/util.js:477:22
at onResponse (node_modules/retry-request/index.js:228:7)
at /workspace/samples/node_modules/teeny-request/build/src/index.js:226:13
-> /workspace/samples/node_modules/teeny-request/src/index.ts:333:11
at processTicksAndRejections (internal/process/task_queues.js:97:5)</pre></details> | non_process | recognize after all hook for should run speech diarization on a local file failed this test failed to configure my behavior see if i m commenting on this issue too often add the flakybot quiet label and i will stop commenting commit buildurl status failed test output the specified bucket does not exist error the specified bucket does not exist at new apierror node modules google cloud common build src util js at util parsehttprespbody node modules google cloud common build src util js at util handleresp node modules google cloud common build src util js at workspace samples node modules google cloud common build src util js at onresponse node modules retry request index js at workspace samples node modules teeny request build src index js workspace samples node modules teeny request src index ts at processticksandrejections internal process task queues js | 0 |
15,314 | 19,406,096,081 | IssuesEvent | 2021-12-20 01:02:19 | w3c/rdf-star | https://api.github.com/repos/w3c/rdf-star | closed | Test report generator failure | help wanted process | In an up-to-date repo, `bundle install` and `bundle exec rake` run fine.
But when I tried to add a new EARL report `apache-jena.ttl`, I got errors relating to `"http://champin.net/#pa"`, which isn't in the file.
(draft [apache-jena.ttl.txt](https://github.com/w3c/rdf-star/files/7739236/apache-jena.ttl.txt) attached - with `.txt` added for GH).
How can I debug this?
_Update_: PR #235 works - the difference is "homepage" URIs changed from "http:" to "https:" and if the timestamp of the file is later than all files in the directory. Restoring earl.* from git does not leave timestamps in the right state.
```
Traceback (most recent call last):
9: from /home/afs/gems/bin/earl-report:23:in `<main>'
8: from /home/afs/gems/bin/earl-report:23:in `load'
7: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/bin/earl-report:100:in `<top (required)>'
6: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:479:in `generate'
5: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:559:in `earl_turtle'
4: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:559:in `each'
3: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:565:in `block in earl_turtle'
2: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:565:in `each'
1: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:566:in `block (2 levels) in earl_turtle'
/var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:574:in `ttl_entity': undefined method `map' for "http://champin.net/#pa":String (NoMethodError)
```
If I rename the file to `jena.ttl`, I get
```
Traceback (most recent call last):
11: from /home/afs/gems/bin/earl-report:23:in `<main>'
10: from /home/afs/gems/bin/earl-report:23:in `load'
9: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/bin/earl-report:100:in `<top (required)>'
8: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:495:in `generate'
7: from /var/lib/gems/2.7.0/gems/haml-5.2.1/lib/haml/engine.rb:130:in `render'
6: from /var/lib/gems/2.7.0/gems/haml-5.2.1/lib/haml/engine.rb:130:in `eval'
5: from (haml):343:in `block in render'
4: from (haml):343:in `each_with_index'
3: from (haml):343:in `each'
2: from (haml):374:in `block (2 levels) in render'
1: from (haml):374:in `each'
(haml):377:in `block (3 levels) in render': undefined method `has_key?' for "http://champin.net/#pa":String (NoMethodError)
``` | 1.0 | Test report generator failure - In an up-to-date repo, `bundle install` and `bundle exec rake` run fine.
But when I tried to add a new EARL report `apache-jena.ttl`, I got errors relating to `"http://champin.net/#pa"`, which isn't in the file.
(draft [apache-jena.ttl.txt](https://github.com/w3c/rdf-star/files/7739236/apache-jena.ttl.txt) attached - with `.txt` added for GH).
How can I debug this?
_Update_: PR #235 works - the difference is "homepage" URIs changed from "http:" to "https:" and if the timestamp of the file is later than all files in the directory. Restoring earl.* from git does not leave timestamps in the right state.
```
Traceback (most recent call last):
9: from /home/afs/gems/bin/earl-report:23:in `<main>'
8: from /home/afs/gems/bin/earl-report:23:in `load'
7: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/bin/earl-report:100:in `<top (required)>'
6: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:479:in `generate'
5: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:559:in `earl_turtle'
4: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:559:in `each'
3: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:565:in `block in earl_turtle'
2: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:565:in `each'
1: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:566:in `block (2 levels) in earl_turtle'
/var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:574:in `ttl_entity': undefined method `map' for "http://champin.net/#pa":String (NoMethodError)
```
If I rename the file to `jena.ttl`, I get
```
Traceback (most recent call last):
11: from /home/afs/gems/bin/earl-report:23:in `<main>'
10: from /home/afs/gems/bin/earl-report:23:in `load'
9: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/bin/earl-report:100:in `<top (required)>'
8: from /var/lib/gems/2.7.0/gems/earl-report-0.7.0/lib/earl_report.rb:495:in `generate'
7: from /var/lib/gems/2.7.0/gems/haml-5.2.1/lib/haml/engine.rb:130:in `render'
6: from /var/lib/gems/2.7.0/gems/haml-5.2.1/lib/haml/engine.rb:130:in `eval'
5: from (haml):343:in `block in render'
4: from (haml):343:in `each_with_index'
3: from (haml):343:in `each'
2: from (haml):374:in `block (2 levels) in render'
1: from (haml):374:in `each'
(haml):377:in `block (3 levels) in render': undefined method `has_key?' for "http://champin.net/#pa":String (NoMethodError)
``` | process | test report generator failure in an up to date repo bundle install and bundle exec rake run fine but i tried to add a new earl report apache jena ttl i get errors relating to which isn t in the file draft attached with txt added for gh how can i debug this update pr works the difference is homepage uris changed from http to https and if the timestamp of the file is later than all files in the directory restoring earl from git does not leave timestamps in the right state traceback most recent call last from home afs gems bin earl report in from home afs gems bin earl report in load from var lib gems gems earl report bin earl report in from var lib gems gems earl report lib earl report rb in generate from var lib gems gems earl report lib earl report rb in earl turtle from var lib gems gems earl report lib earl report rb in each from var lib gems gems earl report lib earl report rb in block in earl turtle from var lib gems gems earl report lib earl report rb in each from var lib gems gems earl report lib earl report rb in block levels in earl turtle var lib gems gems earl report lib earl report rb in ttl entity undefined method map for nomethoderror if i rename the file to jena ttl i get traceback most recent call last from home afs gems bin earl report in from home afs gems bin earl report in load from var lib gems gems earl report bin earl report in from var lib gems gems earl report lib earl report rb in generate from var lib gems gems haml lib haml engine rb in render from var lib gems gems haml lib haml engine rb in eval from haml in block in render from haml in each with index from haml in each from haml in block levels in render from haml in each haml in block levels in render undefined method has key for nomethoderror | 1 |
5,065 | 7,868,330,822 | IssuesEvent | 2018-06-23 20:16:51 | hashicorp/packer | https://api.github.com/repos/hashicorp/packer | closed | Post-processor failed: bad HTTP status: 503 | post-processor/vagrant-cloud question | Hi all,
I've got an older vagrant box that I am trying to update and upload via `vagrant-cloud`. Before the Atlas cloud migration, this worked. I don't believe I have been able to upload a new box since (although I just recently started trying to do it again). My packer template works up until the Vagrant cloud upload. Once my upload starts I get the following messages:
```
==> virtualbox-iso (vagrant-cloud): Uploading box: virtualbox.box
virtualbox-iso (vagrant-cloud): Depending on your internet connection and the size of the box,
virtualbox-iso (vagrant-cloud): this may take some time
virtualbox-iso (vagrant-cloud): Uploading box, attempt 2
virtualbox-iso (vagrant-cloud): Error uploading box! Will retry in 10 seconds. Status: 503
virtualbox-iso (vagrant-cloud): Uploading box, attempt 3
virtualbox-iso (vagrant-cloud): Error uploading box! Will retry in 10 seconds. Status: 503
==> virtualbox-iso (vagrant-cloud): Cleaning up provider
virtualbox-iso (vagrant-cloud): Deleting provider: virtualbox
Build 'virtualbox-iso' errored: 1 error(s) occurred:
* Post-processor failed: bad HTTP status: 503
==> Some builds didn't complete successfully and had errors:
--> virtualbox-iso: 1 error(s) occurred:
* Post-processor failed: bad HTTP status: 503
==> Builds finished but no artifacts were created.
```
- Packer version from `packer version`: **1.2.4**
- Host platform: **Windows 10 x64**
- The _simplest example template and scripts_ needed to reproduce the bug. Include these in your gist https://gist.github.com/fujiiface/072ce44425a8897152d2cf9bf13b308c | 1.0 | Post-processor failed: bad HTTP status: 503 - Hi all,
I've got an older vagrant box that I am trying to update and upload via `vagrant-cloud`. Before the Atlas cloud migration, this worked. I don't believe I have been able to upload a new box since (although I just recently started trying to do it again). My packer template works up until the Vagrant cloud upload. Once my upload starts I get the following messages:
```
==> virtualbox-iso (vagrant-cloud): Uploading box: virtualbox.box
virtualbox-iso (vagrant-cloud): Depending on your internet connection and the size of the box,
virtualbox-iso (vagrant-cloud): this may take some time
virtualbox-iso (vagrant-cloud): Uploading box, attempt 2
virtualbox-iso (vagrant-cloud): Error uploading box! Will retry in 10 seconds. Status: 503
virtualbox-iso (vagrant-cloud): Uploading box, attempt 3
virtualbox-iso (vagrant-cloud): Error uploading box! Will retry in 10 seconds. Status: 503
==> virtualbox-iso (vagrant-cloud): Cleaning up provider
virtualbox-iso (vagrant-cloud): Deleting provider: virtualbox
Build 'virtualbox-iso' errored: 1 error(s) occurred:
* Post-processor failed: bad HTTP status: 503
==> Some builds didn't complete successfully and had errors:
--> virtualbox-iso: 1 error(s) occurred:
* Post-processor failed: bad HTTP status: 503
==> Builds finished but no artifacts were created.
```
- Packer version from `packer version`: **1.2.4**
- Host platform: **Windows 10 x64**
- The _simplest example template and scripts_ needed to reproduce the bug. Include these in your gist https://gist.github.com/fujiiface/072ce44425a8897152d2cf9bf13b308c | process | post processor failed bad http status hi all i ve got an older vagrant box that i am trying to update and upload via vagrant cloud before the atlas cloud migration this worked i don t believe i have been able to upload a new box since although i just recently started trying to do it again my packer template works up until the vagrant cloud upload once my upload starts i get the following messages virtualbox iso vagrant cloud uploading box virtualbox box virtualbox iso vagrant cloud depending on your internet connection and the size of the box virtualbox iso vagrant cloud this may take some time virtualbox iso vagrant cloud uploading box attempt virtualbox iso vagrant cloud error uploading box will retry in seconds status virtualbox iso vagrant cloud uploading box attempt virtualbox iso vagrant cloud error uploading box will retry in seconds status virtualbox iso vagrant cloud cleaning up provider virtualbox iso vagrant cloud deleting provider virtualbox build virtualbox iso errored error s occurred post processor failed bad http status some builds didn t complete successfully and had errors virtualbox iso error s occurred post processor failed bad http status builds finished but no artifacts were created packer version from packer version host platform windows the simplest example template and scripts needed to reproduce the bug include these in your gist | 1 |
16,703 | 21,820,436,149 | IssuesEvent | 2022-05-17 00:03:00 | apache/arrow-datafusion | https://api.github.com/repos/apache/arrow-datafusion | closed | Fix issues that came up during publishing 8.0.0 | bug development-process | **Describe the bug**
I ran into some issues publishing the crates and had to make some minimal changes to Cargo.toml files:
- datafusion/core/Cargo.toml has wrong path to README. I changed from `../README.md` to `../../README.md`
- datafusion/core/Cargo.toml - I had to comment out the row.rs test
- datafusion-cli and ballista-cli both use wildcard for mimalloc dependency version. Changed to "0.1.29"
**To Reproduce**
Steps to reproduce the behavior:
**Expected behavior**
A clear and concise description of what you expected to happen.
**Additional context**
Add any other context about the problem here.
| 1.0 | Fix issues that came up during publishing 8.0.0 - **Describe the bug**
I ran into some issues publishing the crates and had to make some minimal changes to Cargo.toml files:
- datafusion/core/Cargo.toml has wrong path to README. I changed from `../README.md` to `../../README.md`
- datafusion/core/Cargo.toml - I had to comment out the row.rs test
- datafusion-cli and ballista-cli both use wildcard for mimalloc dependency version. Changed to "0.1.29"
**To Reproduce**
Steps to reproduce the behavior:
**Expected behavior**
A clear and concise description of what you expected to happen.
**Additional context**
Add any other context about the problem here.
| process | fix issues that came up during publishing describe the bug i ran into some issues publishing the crates and had to make some minimal changes to cargo toml files datafusion core cargo toml has wrong path to readme i changed from readme md to readme md datafusion core cargo toml i had to comment out the row rs test datafusion cli and ballista cli both use wildcard for mimalloc dependency version changed to to reproduce steps to reproduce the behavior expected behavior a clear and concise description of what you expected to happen additional context add any other context about the problem here | 1 |
31,774 | 12,013,239,446 | IssuesEvent | 2020-04-10 08:22:47 | nextcloud/server | https://api.github.com/repos/nextcloud/server | closed | Nextcloud exposes internal configuration/setup information | 0. Needs triage needs info security | ### Steps to reproduce
1. Load the NextCloud main page of your installation, e.g. nextcloud.example.com
2. View the HTML source in your browser
3. Look at the header part at the 'oc_*' variables
### Expected behaviour
Don't expose internal configuration to the web - also no version numbers, etc.
### Actual behaviour
Some oc_* variables contain internal configuration setup while not enabling any kind of federation and not being logged in.
### Server configuration
**Operating system**: Ubuntu 16.04
**Web server:** Apache
**Database:** MySQL
**PHP version:**?
**Nextcloud version:** 13.0.7.2
**Updated from an older Nextcloud/ownCloud or fresh install:** Updated
| True | Nextcloud exposes internal configuration/setup information - ### Steps to reproduce
1. Load the NextCloud main page of your installation, e.g. nextcloud.example.com
2. View the HTML source in your browser
3. Look at the header part at the 'oc_*' variables
### Expected behaviour
Don't expose internal configuration to the web - also no version numbers, etc.
### Actual behaviour
Some oc_* variables contain internal configuration setup while not enabling any kind of federation and not being logged in.
### Server configuration
**Operating system**: Ubuntu 16.04
**Web server:** Apache
**Database:** MySQL
**PHP version:**?
**Nextcloud version:** 13.0.7.2
**Updated from an older Nextcloud/ownCloud or fresh install:** Updated
| non_process | nextcloud exposes internal configuration setup information steps to reproduce load the nextcloud main page of your installation e g nextcloud example com view the html source in your browser look at the header part at the oc variables expected behaviour don t expose internal configuration to the web also no version numbers etc actual behaviour some oc variables contain internal configuration setup while not enabling any kind of federation and not being logged in server configuration operating system ubuntu web server apache database mysql php version nextcloud version updated from an older nextcloud owncloud or fresh install updated | 0 |
391,499 | 26,895,481,719 | IssuesEvent | 2023-02-06 12:06:32 | GDSC-MYONGJI/22-23-TDD-Study | https://api.github.com/repos/GDSC-MYONGJI/22-23-TDD-Study | closed | Week 6 - Structuring Test Code - 정창우 | documentation | #56
# ⭐ Chapter 6. Structuring Test Code
Learn how test code is organized
- [x] Study how test code is organized and write test code for various scenarios
- [x] Summarize the contents of Chap 6 on a personal tech blog, then leave the link as a comment on your own issue
- [x] Create questions and submit them to a Core Member (by the day before the study session, please!)
<br>
- [x] Open a pull request after completing all steps | 1.0 | Week 6 - Structuring Test Code - 정창우 - #56
# ⭐ Chapter 6. Structuring Test Code
Learn how test code is organized
- [x] Study how test code is organized and write test code for various scenarios
- [x] Summarize the contents of Chap 6 on a personal tech blog, then leave the link as a comment on your own issue
- [x] Create questions and submit them to a Core Member (by the day before the study session, please!)
<br>
- [x] Open a pull request after completing all steps | non_process | week 6 structuring test code 정창우 ⭐ chapter 6 structuring test code learn how test code is organized study how test code is organized and write test code for various scenarios summarize the contents of chap 6 on a personal tech blog then leave the link as a comment on your own issue create questions and submit them to a core member by the day before the study session please open a pull request after completing all steps | 0 |
538,779 | 15,778,192,961 | IssuesEvent | 2021-04-01 07:22:31 | LimeChain/hedera-eth-bridge-validator | https://api.github.com/repos/LimeChain/hedera-eth-bridge-validator | closed | Generalised Bridge - Smart Contracts | enhancement high-priority | - Introduce `Router` contract. The contract must contain only the members and have a `mint` function for signature verification. Each time a `mint` function is called, the contract verifies the signature and if correct, the contract calls the corresponding `application-contract` (bridge contract).
- Have `n` bridge contracts (application contracts) that are performing the fee distribution and minting of token transfers. Each bridge contract is linked to 1 ERC20 contract | 1.0 | Generalised Bridge - Smart Contracts - - Introduce `Router` contract. The contract must contain only the members and have a `mint` function for signature verification. Each time a `mint` function is called, the contract verifies the signature and if correct, the contract calls the corresponding `application-contract` (bridge contract).
- Have `n` bridge contracts (application contracts) that are performing the fee distribution and minting of token transfers. Each bridge contract is linked to 1 ERC20 contract | non_process | generalised bridge smart contracts introduce router contract the contract must contain only the members and have a mint function for signature verification each time a mint function is called the contract verifies the signature and if correct the contract calls the corresponding application contract bridge contract have n bridge contracts application contracts that are performing the fee distribution and minting of token transfers each bridge contract is linked to contract | 0 |
87,361 | 25,096,237,852 | IssuesEvent | 2022-11-08 10:21:06 | elastic/elasticsearch | https://api.github.com/repos/elastic/elasticsearch | opened | [CI] Example plugin tests fail with annotation @NamedComponent | :Delivery/Build >test-failure needs:triage Team:Delivery | ### CI Link
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+main+periodic+example-plugins/465/console
### Repro line
Seems to be task :stable-analysis:compileJava
### Does it reproduce?
Didn't try
### Applicable branches
I think main
### Failure history
_No response_
### Failure excerpt
```
01:44:04 > Task :stable-analysis:compileJava FAILED
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleTokenizerFactory.java:16: error: cannot find symbol
01:44:04 @NamedComponent( "example_tokenizer_factory")
01:44:04 ^
01:44:04 symbol: method value()
01:44:04 location: @interface NamedComponent
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleTokenFilterFactory.java:17: error: cannot find symbol
01:44:04 @NamedComponent( "example_token_filter_factory")
01:44:04 ^
01:44:04 symbol: method value()
01:44:04 location: @interface NamedComponent
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleCharFilterFactory.java:18: error: cannot find symbol
01:44:04 @NamedComponent( "example_char_filter")
01:44:04 ^
01:44:04 symbol: method value()
01:44:04 location: @interface NamedComponent
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleAnalyzerFactory.java:17: error: cannot find symbol
01:44:04 @NamedComponent( "example_analyzer_factory")
01:44:04 ^
01:44:04 symbol: method value()
01:44:04 location: @interface NamedComponent
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleTokenizerFactory.java:16: error: annotation @NamedComponent is missing a default value for the element 'name'
01:44:04 @NamedComponent( "example_tokenizer_factory")
01:44:04 ^
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleTokenFilterFactory.java:17: error: annotation @NamedComponent is missing a default value for the element 'name'
01:44:04 @NamedComponent( "example_token_filter_factory")
01:44:04 ^
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleCharFilterFactory.java:18: error: annotation @NamedComponent is missing a default value for the element 'name'
01:44:04 @NamedComponent( "example_char_filter")
01:44:04 ^
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleAnalyzerFactory.java:17: error: annotation @NamedComponent is missing a default value for the element 'name'
01:44:04 @NamedComponent( "example_analyzer_factory")
01:44:04 ^
01:44:04 8 errors
``` | 1.0 | [CI] Example plugin tests fail with annotation @NamedComponent - ### CI Link
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+main+periodic+example-plugins/465/console
### Repro line
Seems to be task :stable-analysis:compileJava
### Does it reproduce?
Didn't try
### Applicable branches
I think main
### Failure history
_No response_
### Failure excerpt
```
01:44:04 > Task :stable-analysis:compileJava FAILED
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleTokenizerFactory.java:16: error: cannot find symbol
01:44:04 @NamedComponent( "example_tokenizer_factory")
01:44:04 ^
01:44:04 symbol: method value()
01:44:04 location: @interface NamedComponent
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleTokenFilterFactory.java:17: error: cannot find symbol
01:44:04 @NamedComponent( "example_token_filter_factory")
01:44:04 ^
01:44:04 symbol: method value()
01:44:04 location: @interface NamedComponent
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleCharFilterFactory.java:18: error: cannot find symbol
01:44:04 @NamedComponent( "example_char_filter")
01:44:04 ^
01:44:04 symbol: method value()
01:44:04 location: @interface NamedComponent
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleAnalyzerFactory.java:17: error: cannot find symbol
01:44:04 @NamedComponent( "example_analyzer_factory")
01:44:04 ^
01:44:04 symbol: method value()
01:44:04 location: @interface NamedComponent
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleTokenizerFactory.java:16: error: annotation @NamedComponent is missing a default value for the element 'name'
01:44:04 @NamedComponent( "example_tokenizer_factory")
01:44:04 ^
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleTokenFilterFactory.java:17: error: annotation @NamedComponent is missing a default value for the element 'name'
01:44:04 @NamedComponent( "example_token_filter_factory")
01:44:04 ^
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleCharFilterFactory.java:18: error: annotation @NamedComponent is missing a default value for the element 'name'
01:44:04 @NamedComponent( "example_char_filter")
01:44:04 ^
01:44:04 /dev/shm/elastic+elasticsearch+main+periodic+example-plugins/plugins/examples/stable-analysis/src/main/java/org/elasticsearch/example/analysis/ExampleAnalyzerFactory.java:17: error: annotation @NamedComponent is missing a default value for the element 'name'
01:44:04 @NamedComponent( "example_analyzer_factory")
01:44:04 ^
01:44:04 8 errors
``` | non_process | example plugin tests fail with annotation namedcomponent ci link repro line seems to be task stable analysis compilejava does it reproduce didn t try applicable branches i think main failure history no response failure excerpt task stable analysis compilejava failed dev shm elastic elasticsearch main periodic example plugins plugins examples stable analysis src main java org elasticsearch example analysis exampletokenizerfactory java error cannot find symbol namedcomponent example tokenizer factory symbol method value location interface namedcomponent dev shm elastic elasticsearch main periodic example plugins plugins examples stable analysis src main java org elasticsearch example analysis exampletokenfilterfactory java error cannot find symbol namedcomponent example token filter factory symbol method value location interface namedcomponent dev shm elastic elasticsearch main periodic example plugins plugins examples stable analysis src main java org elasticsearch example analysis examplecharfilterfactory java error cannot find symbol namedcomponent example char filter symbol method value location interface namedcomponent dev shm elastic elasticsearch main periodic example plugins plugins examples stable analysis src main java org elasticsearch example analysis exampleanalyzerfactory java error cannot find symbol namedcomponent example analyzer factory symbol method value location interface namedcomponent dev shm elastic elasticsearch main periodic example plugins plugins examples stable analysis src main java org elasticsearch example analysis exampletokenizerfactory java error annotation namedcomponent is missing a default value for the element name namedcomponent example tokenizer factory dev shm elastic elasticsearch main periodic example plugins plugins examples stable analysis src main java org elasticsearch example analysis exampletokenfilterfactory java error annotation namedcomponent is missing a default value for the element name 
namedcomponent example token filter factory dev shm elastic elasticsearch main periodic example plugins plugins examples stable analysis src main java org elasticsearch example analysis examplecharfilterfactory java error annotation namedcomponent is missing a default value for the element name namedcomponent example char filter dev shm elastic elasticsearch main periodic example plugins plugins examples stable analysis src main java org elasticsearch example analysis exampleanalyzerfactory java error annotation namedcomponent is missing a default value for the element name namedcomponent example analyzer factory errors | 0 |
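The compile errors in the record above come from using the annotation positionally: in Java, `@NamedComponent("x")` is shorthand for `@NamedComponent(value = "x")`, so once the annotation's only element is renamed to `name`, the compiler reports both a missing `value()` method and a missing default for `name`. A rough Python analogue of the same failure mode (an analogy only, not the Elasticsearch code) is a decorator factory with a keyword-only `name` parameter:

```python
def named_component(*, name):
    """Analogue of an annotation whose single element is 'name':
    it can only be supplied by keyword, never positionally."""
    def wrap(cls):
        cls.component_name = name
        return cls
    return wrap

@named_component(name="example_tokenizer_factory")  # OK: element given by name
class ExampleTokenizerFactory:
    pass
```

Calling `named_component("example_tokenizer_factory")` raises `TypeError`, just as `@NamedComponent("...")` fails to compile against the `name`-only annotation.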
18,767 | 24,674,242,693 | IssuesEvent | 2022-10-18 15:44:57 | keras-team/keras-cv | https://api.github.com/repos/keras-team/keras-cv | closed | Add CLAHE preprocessing layer | contribution-welcome preprocessing | CLAHE is a variant of Adaptive histogram equalization (AHE) that limits over-amplification of contrast. It is widely used in medical imaging practice.

(original image (left) - HE (middle) - CLAHE (right))
[source.](https://chowdera.com/2021/09/20210930171712779O.html)
 | 1.0 | Add CLAHE preprocessing layer - CLAHE is a variant of Adaptive histogram equalization (AHE) that limits over-amplification of contrast. It is widely used in medical imaging practice.

(original image (left) - HE (middle) - CLAHE (right))
[source.](https://chowdera.com/2021/09/20210930171712779O.html)
| process | add clahe preprocessing layer clahe is a variant of adaptive histogram equalization ahe which takes care of over amplification of the contrast it s well used in medical imaging practice original image left he middle clahe right | 1 |
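The contrast limiting that distinguishes CLAHE from plain AHE is the step that prevents over-amplification: clip each histogram bin at a limit and redistribute the excess before building the equalization mapping. A simplified, single-tile pure-Python sketch (real CLAHE works per tile, with bilinear interpolation between tiles):

```python
def clip_histogram(hist, clip_limit):
    """Clip bins at clip_limit and spread the excess uniformly.
    (The integer remainder of the excess is simply dropped here.)"""
    excess = sum(max(0, h - clip_limit) for h in hist)
    clipped = [min(h, clip_limit) for h in hist]
    bonus = excess // len(clipped)
    return [h + bonus for h in clipped]

def equalize_map(hist, levels=256):
    """Bin index -> output level, via the cumulative histogram (floor scaling)."""
    total = sum(hist)
    mapping, run = [], 0
    for h in hist:
        run += h
        mapping.append((levels - 1) * run // total)
    return mapping
```

Clipping a spiked histogram `[10, 0, 0, 0]` at 4 yields `[5, 1, 1, 1]`: the dominant bin no longer dominates the cumulative distribution, so contrast in near-uniform regions is not blown up.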
16,958 | 2,615,127,892 | IssuesEvent | 2015-03-01 05:57:04 | chrsmith/google-api-java-client | https://api.github.com/repos/chrsmith/google-api-java-client | closed | APIs Explorer & Console links | auto-migrated Priority-High Type-Wiki | ```
Referenced documentation:
Please describe in detail the wiki documentation request:
```
Original issue reported on code.google.com by `yan...@google.com` on 27 Jun 2012 at 6:07 | 1.0 | APIs Explorer & Console links - ```
Referenced documentation:
Please describe in detail the wiki documentation request:
```
Original issue reported on code.google.com by `yan...@google.com` on 27 Jun 2012 at 6:07 | non_process | apis explorer console links referenced documentation please describe in detail the wiki documentation request original issue reported on code google com by yan google com on jun at | 0 |
121,820 | 4,821,701,312 | IssuesEvent | 2016-11-05 13:39:27 | glue-viz/glue-vispy-viewers | https://api.github.com/repos/glue-viz/glue-vispy-viewers | closed | Fix selections when multiple datasets are present | enhancement high priority | In cases where there are e.g. multiple datasets in a 3D scatter plot or e.g. markers in a volume rendering, we should make sure that selections are applied to all visible datasets.
 | 1.0 | Fix selections when multiple datasets are present - In cases where there are e.g. multiple datasets in a 3D scatter plot or e.g. markers in a volume rendering, we should make sure that selections are applied to all visible datasets.
| non_process | fix selections when multiple datasets are present in cases where there are e g multiple datasets in a scatter plot or e g markers in a volume rendering we should make sure that selections are applied to all visible datasets | 0 |
1,719 | 2,603,970,091 | IssuesEvent | 2015-02-24 19:00:02 | chrsmith/nishazi6 | https://api.github.com/repos/chrsmith/nishazi6 | opened | 沈阳治疗人乳头瘤 | auto-migrated Priority-Medium Type-Defect | ```
Shenyang human papilloma treatment 〓 Shenyang Military Region Political Department
Hospital (STD clinic) 〓 TEL: 024-31023308 〓 Founded in 1946, it has specialized in
the research and treatment of sexually transmitted diseases for 68 years. Located at
No. 32 Erwei Road, Shenhe District, Shenyang, it is a hospital with a long and
glorious history, established alongside New China, with excellent equipment,
authoritative techniques and a gathering of experts; a comprehensive hospital
integrating prevention, health care, medical treatment, scientific research and
rehabilitation. It was among the first state-run grade-A military hospitals and the
first nationally designated units for standardized medical care, and it serves as a
teaching hospital for well-known institutions such as the Fourth Military Medical
University and Southeast University. It has been rated an advanced unit for health
work by the health department of the PLA Air Force logistics department, and has
twice been awarded a collective second-class merit.
```
-----
Original issue reported on code.google.com by `q964105...@gmail.com` on 4 Jun 2014 at 7:26 | 1.0 | 沈阳治疗人乳头瘤 - ```
Shenyang human papilloma treatment 〓 Shenyang Military Region Political Department
Hospital (STD clinic) 〓 TEL: 024-31023308 〓 Founded in 1946, it has specialized in
the research and treatment of sexually transmitted diseases for 68 years. Located at
No. 32 Erwei Road, Shenhe District, Shenyang, it is a hospital with a long and
glorious history, established alongside New China, with excellent equipment,
authoritative techniques and a gathering of experts; a comprehensive hospital
integrating prevention, health care, medical treatment, scientific research and
rehabilitation. It was among the first state-run grade-A military hospitals and the
first nationally designated units for standardized medical care, and it serves as a
teaching hospital for well-known institutions such as the Fourth Military Medical
University and Southeast University. It has been rated an advanced unit for health
work by the health department of the PLA Air Force logistics department, and has
twice been awarded a collective second-class merit.
```
-----
Original issue reported on code.google.com by `q964105...@gmail.com` on 4 Jun 2014 at 7:26 | non_process | 沈阳治疗人乳头瘤 沈阳治疗人乳头瘤〓沈陽軍區政治部醫院性病〓tel: 〓 , 。位于� �� 。是一所與新中國同建立共輝煌的歷� ��悠久、設備精良、技術權威、專家云集,是預防、保健、醫 療、科研康復為一體的綜合性醫院。是國家首批公立甲等部�� �醫院、全國首批醫療規范定點單位,是第四軍醫大學、東南� ��學等知名高等院校的教學醫院。曾被中國人民解放軍空軍后 勤部衛生部評為衛生工作先進單位,先后兩次榮立集體二等�� �。 original issue reported on code google com by gmail com on jun at | 0 |
4,310 | 7,203,155,638 | IssuesEvent | 2018-02-06 08:06:08 | qgis/QGIS-Documentation | https://api.github.com/repos/qgis/QGIS-Documentation | closed | [FEATURE] New processing algorithm "extract/clip by extent" | Automatic new feature Processing | Original commit: https://github.com/qgis/QGIS/commit/d8db3ecc4d07c9fa7c55beba663cda50a3d42a66 by nyalldawson
Allows extracting a subset of another layer using an extent, with an
optional setting to clip geometries to the extent | 1.0 | [FEATURE] New processing algorithm "extract/clip by extent" - Original commit: https://github.com/qgis/QGIS/commit/d8db3ecc4d07c9fa7c55beba663cda50a3d42a66 by nyalldawson
Allows extracting a subset of another layer using an extent, with an
optional setting to clip geometries to the extent | process | new processing algorithm extract clip by extent original commit by nyalldawson allows extracting a subset of another layer using an extent with an optional setting to clip geometries to the extent | 1
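The algorithm the commit above describes — keep the features of a layer that intersect a rectangular extent, with an optional setting to clip geometries to that extent — reduces, for axis-aligned bounding boxes, to an intersection test plus a coordinate clamp. A simplified sketch (the real QGIS algorithm handles arbitrary vector geometries, not just boxes):

```python
def intersects(box, extent):
    """Axis-aligned (x1, y1, x2, y2) rectangle intersection test."""
    ax1, ay1, ax2, ay2 = box
    bx1, by1, bx2, by2 = extent
    return ax1 <= bx2 and bx1 <= ax2 and ay1 <= by2 and by1 <= ay2

def extract_by_extent(boxes, extent, clip=False):
    """Keep boxes intersecting the extent; optionally clip them to it."""
    kept = [b for b in boxes if intersects(b, extent)]
    if not clip:
        return kept
    ex1, ey1, ex2, ey2 = extent
    return [(max(x1, ex1), max(y1, ey1), min(x2, ex2), min(y2, ey2))
            for (x1, y1, x2, y2) in kept]
```

With `clip=False` a feature straddling the boundary is kept whole; with `clip=True` it is trimmed to the extent, which is exactly the optional behaviour the feature adds.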
15,509 | 19,703,265,847 | IssuesEvent | 2022-01-12 18:52:15 | googleapis/java-billing | https://api.github.com/repos/googleapis/java-billing | opened | Your .repo-metadata.json file has a problem 🤒 | type: process repo-metadata: lint | You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'billing' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions. | 1.0 | Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'billing' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions. | process | your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 release level must be equal to one of the allowed values in repo metadata json api shortname billing invalid in repo metadata json ☝️ once you correct these problems you can close this issue reach out to go github automation if you have any questions | 1 |
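A lint like the one reported in this record is just a couple of checks over the parsed `.repo-metadata.json`. In the sketch below the allowed `release_level` values and the `api_shortname` pattern are assumptions for illustration — the authoritative rules live in the google-cloud repo automation, not here:

```python
import re

ALLOWED_RELEASE_LEVELS = {"stable", "preview"}        # assumed allowed set
API_SHORTNAME_RE = re.compile(r"^[a-z][a-z0-9]*$")    # assumed: lowercase, no hyphens

def lint_repo_metadata(meta):
    """Return the list of problems found in a .repo-metadata.json dict."""
    problems = []
    if meta.get("release_level") not in ALLOWED_RELEASE_LEVELS:
        problems.append("release_level must be equal to one of the allowed values")
    shortname = meta.get("api_shortname", "")
    if not API_SHORTNAME_RE.match(shortname):
        problems.append("api_shortname %r invalid" % shortname)
    return problems
```

Under these assumed rules a shortname like `billing` passes while a hyphenated one like `pubsublite-spark` fails, matching the shape of the scan output quoted in the record.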
811,417 | 30,287,037,626 | IssuesEvent | 2023-07-08 20:25:10 | flatironinstitute/neurosift | https://api.github.com/repos/flatironinstitute/neurosift | closed | better NWB file exploration | pending review high priority | We just recently added to HDMF a feature to generate HTML for NWB objects that allows the user to navigate into NWB files and explore each group, attribute, and dataset within the file in a hierarchical accordion [here](https://github.com/hdmf-dev/hdmf/blob/0c01dd7e83f11a07bc26bbae9975046a963f7e91/src/hdmf/container.py#L463). Could Neurosift do something similar? Specifically, if you click Browse NWB File, you get this view:

but that does not have any information about e.g. the attributes of ElectricalSeries. See this PR: https://github.com/hdmf-dev/hdmf/pull/883
| 1.0 | better NWB file exploration - We just recently added to HDMF a feature to generate HTML for NWB objects that allows the user to navigate into NWB files and explore each group, attribute, and dataset within the file in a hierarchical accordion [here](https://github.com/hdmf-dev/hdmf/blob/0c01dd7e83f11a07bc26bbae9975046a963f7e91/src/hdmf/container.py#L463). Could Neurosift do something similar? Specifically, if you click Browse NWB File, you get this view:

but that does not have any information about e.g. the attributes of ElectricalSeries. See this PR: https://github.com/hdmf-dev/hdmf/pull/883
| non_process | better nwb file exploration we just recently added to hdmf a feature to generate html for nwb objects that allows the user to navigate into nwb files and explore each group attribute and dataset within the file in a hierarchical accordion could neurosift do something similar specifically if you click browse nwb file you get this view but that does not have any information about e g the attributes of electricalseries see this pr | 0 |
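The hierarchical accordion this record asks for boils down to recursively emitting nested `<details>`/`<summary>` elements, one level per NWB group, with leaves for datasets and attributes. A minimal sketch of that recursion (the data structure is hypothetical — this is not the actual HDMF implementation linked above):

```python
def to_accordion(name, node):
    """Render a nested dict (groups) with scalar leaves (datasets/attributes)
    as nested HTML <details> accordions."""
    if not isinstance(node, dict):                       # leaf node
        return "<div>%s: %s</div>" % (name, node)
    inner = "".join(to_accordion(k, v) for k, v in node.items())
    return "<details><summary>%s</summary>%s</details>" % (name, inner)
```

Each group becomes a collapsible section, so a viewer can drill from the file root down to e.g. an ElectricalSeries attribute without loading everything at once.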
109,282 | 16,843,673,986 | IssuesEvent | 2021-06-19 02:47:07 | bharathirajatut/fitbit-api-example-java2 | https://api.github.com/repos/bharathirajatut/fitbit-api-example-java2 | opened | CVE-2018-12023 (High) detected in jackson-databind-2.8.1.jar | security vulnerability | ## CVE-2018-12023 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.1.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: fitbit-api-example-java2/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.1/jackson-databind-2.8.1.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-1.4.0.RELEASE.jar (Root Library)
- :x: **jackson-databind-2.8.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/bharathirajatut/fitbit-api-example-java2/commits/8c153ad064e8f07a4ddade35ac13a9b485ca3dac">8c153ad064e8f07a4ddade35ac13a9b485ca3dac</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in FasterXML jackson-databind prior to 2.7.9.4, 2.8.11.2, and 2.9.6. When Default Typing is enabled (either globally or for a specific property), the service has the Oracle JDBC jar in the classpath, and an attacker can provide an LDAP service to access, it is possible to make the service execute a malicious payload.
<p>Publish Date: 2019-03-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-12023>CVE-2018-12023</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-12022">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-12022</a></p>
<p>Release Date: 2019-03-21</p>
<p>Fix Resolution: 2.7.9.4, 2.8.11.2, 2.9.6</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2018-12023 (High) detected in jackson-databind-2.8.1.jar - ## CVE-2018-12023 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.1.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: fitbit-api-example-java2/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.1/jackson-databind-2.8.1.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-1.4.0.RELEASE.jar (Root Library)
- :x: **jackson-databind-2.8.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/bharathirajatut/fitbit-api-example-java2/commits/8c153ad064e8f07a4ddade35ac13a9b485ca3dac">8c153ad064e8f07a4ddade35ac13a9b485ca3dac</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in FasterXML jackson-databind prior to 2.7.9.4, 2.8.11.2, and 2.9.6. When Default Typing is enabled (either globally or for a specific property), the service has the Oracle JDBC jar in the classpath, and an attacker can provide an LDAP service to access, it is possible to make the service execute a malicious payload.
<p>Publish Date: 2019-03-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-12023>CVE-2018-12023</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-12022">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-12022</a></p>
<p>Release Date: 2019-03-21</p>
<p>Fix Resolution: 2.7.9.4, 2.8.11.2, 2.9.6</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_process | cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file fitbit api example pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy spring boot starter web release jar root library x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details an issue was discovered in fasterxml jackson databind prior to and when default typing is enabled either globally or for a specific property the service has the oracle jdbc jar in the classpath and an attacker can provide an ldap service to access it is possible to make the service execute a malicious payload publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
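The 7.5 base score in the record above follows mechanically from the CVSS v3.0 formula applied to the listed metrics (AV:N/AC:H/PR:N/UI:R/S:U/C:H/I:H/A:H). A minimal sketch for the scope-unchanged case, using the published metric weights:

```python
import math

def cvss3_base(av, ac, pr, ui, c, i, a):
    """CVSS v3.0 base score for a scope-unchanged vulnerability."""
    iss = 1 - (1 - c) * (1 - i) * (1 - a)
    impact = 6.42 * iss
    exploitability = 8.22 * av * ac * pr * ui
    if impact <= 0:
        return 0.0
    # CVSS "round up": smallest value with one decimal >= the raw score
    return math.ceil(min(impact + exploitability, 10) * 10) / 10

# Weights for AV:N, AC:H, PR:N, UI:R, and C/I/A all High
score = cvss3_base(av=0.85, ac=0.44, pr=0.85, ui=0.62, c=0.56, i=0.56, a=0.56)
```

With these weights the exploitability term is about 1.62 and the impact term about 5.87, summing to about 7.49, which rounds up to the reported 7.5.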
333,041 | 10,114,514,977 | IssuesEvent | 2019-07-30 19:22:47 | TheAssassin/AppImageLauncher | https://api.github.com/repos/TheAssassin/AppImageLauncher | closed | Document AppImageLauncher configuration, e.g., in the Wiki | enhancement high priority | We should add a link in the generated config file to a wiki page explaining the available options. That's been requested earlier. | 1.0 | Document AppImageLauncher configuration, e.g., in the Wiki - We should add a link in the generated config file to a wiki page explaining the available options. That's been requested earlier. | non_process | document appimagelauncher configuration e g in the wiki we should add a link in the generated config file to a wiki page explaining the available options that s been requested earlier | 0 |
745,474 | 25,985,362,941 | IssuesEvent | 2022-12-19 23:16:20 | etcd-io/etcd | https://api.github.com/repos/etcd-io/etcd | closed | [Security] Add Trivy scan workflows for release-3.4 and release-3.5 branches respectively | area/security type/feature priority/important | ### What would you like to be added?
Since the addition of [trivy-nightly-scan.yaml](https://github.com/etcd-io/etcd/blob/main/.github/workflows/trivy-nightly-scan.yaml), some security alerts have been reported. But it's hard to tell which image (`v3.4.22` or `v3.5.6`) each alert was raised against.
It would be better if we could differentiate the target image for each alert. The most stupid solution I can think of for now is to use different registries. For example, use `gcr.io/etcd-development/etcd` for the 3.5 image and `quay.io/coreos/etcd` for the 3.4 image, if there is no better way.
<img width="1039" alt="Screen Shot 2022-12-16 at 10 20 47" src="https://user-images.githubusercontent.com/30739825/208007102-5a5ea29a-2089-447a-9adf-e320b47a5781.png">
### Why is this needed?
See above
cc @ArkaSaha30 | 1.0 | [Security] Add Trivy scan workflows for release-3.4 and release-3.5 branches respectively - ### What would you like to be added?
Since the addition of [trivy-nightly-scan.yaml](https://github.com/etcd-io/etcd/blob/main/.github/workflows/trivy-nightly-scan.yaml), some security alerts have been reported. But it's hard to tell which image (`v3.4.22` or `v3.5.6`) each alert was raised against.
It would be better if we could differentiate the target image for each alert. The most stupid solution I can think of for now is to use different registries. For example, use `gcr.io/etcd-development/etcd` for the 3.5 image and `quay.io/coreos/etcd` for the 3.4 image, if there is no better way.
<img width="1039" alt="Screen Shot 2022-12-16 at 10 20 47" src="https://user-images.githubusercontent.com/30739825/208007102-5a5ea29a-2089-447a-9adf-e320b47a5781.png">
### Why is this needed?
See above
cc @ArkaSaha30 | non_process | add trivy scan workflows for release and release branches respectively what would you like to be added since the adding of some security alerts are reported but it s hard to tell which image or each alert was raised against it would be better if we can differentiate the target image from each alert the most stupid solution i can think of for now is to use different registries for example use gcr io etcd development etcd for image and quay io coreos etcd for image if there is no other better way img width alt screen shot at src why is this needed see above cc | 0
55,018 | 13,502,702,569 | IssuesEvent | 2020-09-13 09:52:30 | koto-bank/lbge | https://api.github.com/repos/koto-bank/lbge | opened | Return osx checks to travis | build pipeline | Due to Roswell being broken, we temporarily switched to cl-travis, which doesn't support OSX. | 1.0 | Return osx checks to travis - Due to Roswell being broken, we temporarily switched to cl-travis, which doesn't support OSX. | non_process | return osx checks to travis due to roswell being broken we temporarily switched to cl travis which doesn t support osx | 0
31,856 | 6,650,285,142 | IssuesEvent | 2017-09-28 15:48:53 | fieldenms/tg | https://api.github.com/repos/fieldenms/tg | closed | Entity Master: saving defects during fast entry | Defect Entity master In progress P1 Property editor UI / UX | ### Description
There are a couple of significant deficiencies when an entity master is saved quickly via the `CTRL+S` shortcut immediately after editing.
The nature of these deficiencies is more or less intermittent; however, some examples are quite easy to reproduce.
----------------------------------------
a) `tg-air`: in WA's compound master for a new entity, choose `CAR` in the `Type` autocompleter and press `CTRL+S`; after that `Priority` becomes erroneous and focused; type `2` into `Priority` and press `CTRL+S` immediately. For a very brief period `Scheduled Start` becomes erroneous and focused, and then the focus moves to the `Type` property and the `Scheduled Start` error disappears.
b) `tg-air`: in Equipment's compound master, type several characters into `KEY` and press `CTRL+S` immediately; replay it many times (usually over ~20) and the following validation error appears:
`This property has recently been changed by another user. Please either edit the value back to [HGFHGFHGFHGFHGFHGFFGHFGHHGFFGHHGGFGHFGHFHGFHGFSFGH] to resolve the conflict or cancel all of your changes.`
c) `tg-air`: in Equipment's compound master, press and hold the `S` character in `KEY` and after some time press `CTRL`; a couple of client-side errors appear, making the entity master fully unusable:
`SimultaneousSaveException {message: "Simultaneous save exception: the save process has been already started before and not ended. Please, block UI until the save action completes."}`
----------------------------------------
After initial investigation and discussion, it appears that the saving process starts first and validation starts shortly after. Such validation, upon completion, replaces the results of saving, which causes situations a) and b).
Situation c) is caused by the over-restrictive client-side `Simultaneous save exception`: perhaps debouncing is a good idea here, very much like the existing validation debouncing.
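The debouncing suggested for situation c) can be made concrete: a save request only fires after a quiet period with no further keystrokes or repeated `CTRL+S`, which removes both the stale-validation overwrite and the need for a hard "simultaneous save" error. A language-neutral sketch of the mechanism in Python with an explicit clock (the actual client is Polymer/JavaScript; this only illustrates the idea):

```python
class Debouncer:
    """Coalesce rapid requests: the action fires only once no new request
    has arrived for `delay` time units."""

    def __init__(self, action, delay):
        self.action = action
        self.delay = delay
        self._due = None          # time at which the pending action may fire

    def request(self, now):
        self._due = now + self.delay   # each new request restarts the timer

    def tick(self, now):
        if self._due is not None and now >= self._due:
            self._due = None
            self.action()
```

Five `CTRL+S` presses in a row schedule a single save that runs only after the last press, so validation triggered by typing can no longer race a save already in flight.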
### Expected outcome
Reliable fast entry and saving in entity masters. | 1.0 | Entity Master: saving defects during fast entry - ### Description
There are a couple of significant deficiencies when an entity master is saved quickly via the `CTRL+S` shortcut immediately after editing.
The nature of these deficiencies is more or less intermittent; however, some examples are quite easy to reproduce.
----------------------------------------
a) `tg-air`: in WA's compound master for a new entity, choose `CAR` in the `Type` autocompleter and press `CTRL+S`; after that `Priority` becomes erroneous and focused; type `2` into `Priority` and press `CTRL+S` immediately. For a very brief period `Scheduled Start` becomes erroneous and focused, and then the focus moves to the `Type` property and the `Scheduled Start` error disappears.
b) `tg-air`: in Equipment's compound master, type several characters into `KEY` and press `CTRL+S` immediately; replay it many times (usually over ~20) and the following validation error appears:
`This property has recently been changed by another user. Please either edit the value back to [HGFHGFHGFHGFHGFHGFFGHFGHHGFFGHHGGFGHFGHFHGFHGFSFGH] to resolve the conflict or cancel all of your changes.`
c) `tg-air`: in Equipment's compound master, press and hold the `S` character in `KEY` and after some time press `CTRL`; a couple of client-side errors appear, making the entity master fully unusable:
`SimultaneousSaveException {message: "Simultaneous save exception: the save process has been already started before and not ended. Please, block UI until the save action completes."}`
----------------------------------------
After initial investigation and discussion, it appears that the saving process starts first and validation starts after it. When such a validation completes, it replaces the results of saving, which causes situations a) and b).
Situation c) is caused by the over-restrictive client-side `Simultaneous save exception`: perhaps debouncing is a good idea here, very similar to validation debouncing.
### Expected outcome
Reliable fast entry and saving in entity masters. | non_process | entity master saving defects during fast entry description there are couple of significant deficiencies while entity master is quickly saved through the use of ctrl s shortcut immediately after editing the nature of these deficiencies is more or less intermittent however some examples are quite easy to reproduce a tg air in wa s compound master for new entity choose car in type autocompleter press ctrl s after that priority becomes erroneous and focused type into priority and press ctrl s immediately for the very brief period of time scheduled start becomes erroneous and focused and then the focus is moved to type property and scheduled start error disappears b tg air in equipment s compound master type several characters into key and press ctrl s immediately replay it many times usually over and following validation error appears this property has recently been changed by another user please either edit the value back to to resolve the conflict or cancel all of your changes c tg air in equipment s compound master press and hold s character into key and after some time press ctrl a couple of client side errors appears making entity master fully unusable simultaneoussaveexception message simultaneous save exception the save process has been already started before and not ended please block ui until the save action completes after initial investigation and discussion it appears that saving process is started earlier and after that validation starts too such validation after completion replaces the results of saving which causes situations a and b situation c is caused by over restrictive client side simultaneous save exception perhaps debouncing is a good idea here very similarly to validation debouncing expected outcome reliable fast entry and saving in entity masters | 0 |
219,087 | 24,440,648,682 | IssuesEvent | 2022-10-06 14:23:25 | varkalaramalingam/test-drone-build | https://api.github.com/repos/varkalaramalingam/test-drone-build | closed | CVE-2021-3777 (High) detected in tmpl-1.0.4.tgz - autoclosed | security vulnerability | ## CVE-2021-3777 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tmpl-1.0.4.tgz</b></p></summary>
<p>JavaScript micro templates.</p>
<p>Library home page: <a href="https://registry.npmjs.org/tmpl/-/tmpl-1.0.4.tgz">https://registry.npmjs.org/tmpl/-/tmpl-1.0.4.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tmpl/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-4.0.1.tgz (Root Library)
- babel-jest-26.6.3.tgz
- transform-26.6.2.tgz
- jest-haste-map-26.6.2.tgz
- walker-1.0.7.tgz
- makeerror-1.0.11.tgz
- :x: **tmpl-1.0.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/varkalaramalingam/test-drone-build/commit/a9f66a3dd11ad495e7a01529fd8bf73659be1c3a">a9f66a3dd11ad495e7a01529fd8bf73659be1c3a</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
nodejs-tmpl is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3777>CVE-2021-3777</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2021-09-15</p>
<p>Fix Resolution (tmpl): 1.0.5</p>
<p>Direct dependency fix Resolution (react-scripts): 4.0.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-3777 (High) detected in tmpl-1.0.4.tgz - autoclosed - ## CVE-2021-3777 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tmpl-1.0.4.tgz</b></p></summary>
<p>JavaScript micro templates.</p>
<p>Library home page: <a href="https://registry.npmjs.org/tmpl/-/tmpl-1.0.4.tgz">https://registry.npmjs.org/tmpl/-/tmpl-1.0.4.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tmpl/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-4.0.1.tgz (Root Library)
- babel-jest-26.6.3.tgz
- transform-26.6.2.tgz
- jest-haste-map-26.6.2.tgz
- walker-1.0.7.tgz
- makeerror-1.0.11.tgz
- :x: **tmpl-1.0.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/varkalaramalingam/test-drone-build/commit/a9f66a3dd11ad495e7a01529fd8bf73659be1c3a">a9f66a3dd11ad495e7a01529fd8bf73659be1c3a</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
nodejs-tmpl is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3777>CVE-2021-3777</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2021-09-15</p>
<p>Fix Resolution (tmpl): 1.0.5</p>
<p>Direct dependency fix Resolution (react-scripts): 4.0.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_process | cve high detected in tmpl tgz autoclosed cve high severity vulnerability vulnerable library tmpl tgz javascript micro templates library home page a href path to dependency file package json path to vulnerable library node modules tmpl package json dependency hierarchy react scripts tgz root library babel jest tgz transform tgz jest haste map tgz walker tgz makeerror tgz x tmpl tgz vulnerable library found in head commit a href found in base branch master vulnerability details nodejs tmpl is vulnerable to inefficient regular expression complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution tmpl direct dependency fix resolution react scripts step up your open source security game with mend | 0 |
5,094 | 7,878,639,984 | IssuesEvent | 2018-06-26 10:54:08 | LOVDnl/LOVD3 | https://api.github.com/repos/LOVDnl/LOVD3 | opened | Improve working with HPO terms | cat: interface cat: submission process feature request student project | Some submitters fill in HPO IDs, others the IDs with the descriptions, and even then in a non-standard way. We need a better submission tool that allows users to simply enter the IDs as well as the descriptions, perhaps by using the `select2` library that we also use in LOVD+ for some inputs.
Storing both the name and the ID will allow searching on both.
We'll worry about the tree structure later.
Obsoletes: #271. | 1.0 | Improve working with HPO terms - Some submitters fill in HPO IDs, others the IDs with the descriptions, and even then in a non-standard way. We need a better submission tool that allows users to simply enter the IDs as well as the descriptions, perhaps by using the `select2` library that we also use in LOVD+ for some inputs.
Storing both the name and the ID will allow searching on both.
We'll worry about the tree structure later.
Obsoletes: #271. | process | improve working with hpo terms some submitters fill in hpo ids other the ids with the descriptions then still in a non standard way we need a better submission tool to allow users to enter the ids as well as the descriptions simply perhaps by using the library that we also use in lovd for some inputs storing both the name and the id will allow searching on both we ll worry about the tree structure later obsoletes | 1 |
7,747 | 10,864,283,572 | IssuesEvent | 2019-11-14 16:35:11 | ESMValGroup/ESMValCore | https://api.github.com/repos/ESMValGroup/ESMValCore | closed | Derived variable needs different input dataset for CMIP5/CMIP6 | preprocessor variable derivation | The AMOC derived variable needs to use different data depending on whether we're looking at CMIP5 or CMIP6 data. This is because the input variable that was used in CMIP5 (**msftmyz**) has been superseded by a slightly different variable in CMIP6 (**msftyz**).
Is there a way to derive a variable such that it checks for the presence/absence of the two variables **msftmyz** or **msftyz** then uses the best one? It shouldn't fail if it doesn't find one dataset, provided that it finds the other.
(The difference between the two is that msftmyz is defined in the meridional direction and msftyz is defined in the Y direction.)
I like to keep you on your toes! Enjoy Bavaria this week! | 1.0 | Derived variable needs different input dataset for CMIP5/CMIP6 - The AMOC derived variable needs to use different data depending on whether we're looking at CMIP5 or CMIP6 data. This is because the input variable that was used in CMIP5 (**msftmyz**) has been superseded by a slightly different variable in CMIP6 (**msftyz**).
Is there a way to derive a variable such that it checks for the presence/absence of the two variables **msftmyz** or **msftyz** then uses the best one? It shouldn't fail if it doesn't find one dataset, provided that it finds the other.
(The difference between the two is that msftmyz is defined in the meridional direction and msftyz is defined in the Y direction.)
I like to keep you on your toes! Enjoy Bavaria this week! | process | derived variable needs different input dataset for the amoc derived variable needs to use different data depending on whether we re looking at or data this is because the input variable that was used in msftmyz has been superseded by a slightly different variable in msftyz is there a way to derive a variable such that it checks for the presence absence of the two variables msftmyz or msftyz then uses the best one it shouldn t fail if it doesn t find one dataset provided that it finds the other the difference between the two is that msftmyz is defined in the meridional direction and msftyz is defined in the y direction i like to keep you on your toes enjoy bavaria this week | 1 |
322,455 | 9,817,896,146 | IssuesEvent | 2019-06-13 17:52:21 | robotology/human-dynamics-estimation | https://api.github.com/repos/robotology/human-dynamics-estimation | closed | Update HumanRobotPosePublisher with rpc port control | complexity:medium priority:high severity:major type:enhancement type:task | During https://github.com/robotology/human-dynamics-estimation/issues/99#issuecomment-483313796 we notice there is a slight lag in the `tfs` and hence the rviz visualization is a bit laggy. Also, we tested `HumanRobotPosePublisher` and the visualization is bad, probably because the robot base tf is continuously streamed with the current `HumanRobotPosePublisher`. As @diegoferigo suggested [before](https://github.com/dic-iit/component_andy/issues/102#issuecomment-446108987) it is better to set the robot pose through an rpc port. Also, this can be stabilized only if the tfs are streamed correctly without any interruption.

Hopefully this is the last piece of puzzle to fix towards fully functional HDEv2
CC @DanielePucci @diegoferigo @lrapetti @traversaro | 1.0 | Update HumanRobotPosePublisher with rpc port control - During https://github.com/robotology/human-dynamics-estimation/issues/99#issuecomment-483313796 we notice there is a slight lag in the `tfs` and hence the rviz visualization is a bit laggy. Also, we tested `HumanRobotPosePublisher` and the visualization is bad probably because the robot base tf is continuously streamed with the current `HumanRobotPosePublisher`. As @diegoferigo suggested [before](https://github.com/dic-iit/component_andy/issues/102#issuecomment-446108987) it is better to set the robot pose through a rpc port. Also, this can be stabilitzed only if the tfs are streamed correctly without any interruption.

Hopefully this is the last piece of puzzle to fix towards fully functional HDEv2
CC @DanielePucci @diegoferigo @lrapetti @traversaro | non_process | update humanrobotposepublisher with rpc port control during we notice there is a slight lag in the tfs and hence the rviz visualization is a bit laggy also we tested humanrobotposepublisher and the visualization is bad probably because the robot base tf is continuously streamed with the current humanrobotposepublisher as diegoferigo suggested it is better to set the robot pose through a rpc port also this can be stabilitzed only if the tfs are streamed correctly without any interruption hopefully this is the last piece of puzzle to fix towards fully functional cc danielepucci diegoferigo lrapetti traversaro | 0 |
22,522 | 31,593,909,169 | IssuesEvent | 2023-09-05 02:47:01 | unicode-org/icu4x | https://api.github.com/repos/unicode-org/icu4x | closed | Better release checklist | T-docs-tests C-process S-tiny | The release checklist should:
- Have three phases, for "week before release", "day of release", and the final release steps
- Have more details on how to do changelogs, etc (e.g. tips and tricks around `git log icu@1.1.0.. -- folder/` )
- ... ? | 1.0 | Better release checklist - The release checklist should:
- Have three phases, for "week before release", "day of release", and the final release steps
- Have more details on how to do changelogs, etc (e.g. tips and tricks around `git log icu@1.1.0.. -- folder/` )
- ... ? | process | better release checklist the release checklist should have three phases for week before release day of release and the final release steps have more details on how to do changelogs etc e g tips and tricks around git log icu folder | 1 |
18,469 | 24,550,303,125 | IssuesEvent | 2022-10-12 12:06:03 | usmannasir/cyberpanel | https://api.github.com/repos/usmannasir/cyberpanel | closed | Official Support for Ubuntu 22.04 LTS | in-process | Hi,
as far as I noticed, Cyberpanel could not support Ubuntu 22.04 LTS yet, because Litespeed did not yet offer support for it. Good news: this is now no longer the case :smile:
https://docs.litespeedtech.com/lsws/extapp/php/getting_started/
Please add official support, thank you! | 1.0 | Official Support for Ubuntu 22.04 LTS - Hi,
as far as I noticed, Cyberpanel could not support Ubuntu 22.04 LTS yet, because Litespeed did not yet offer support for it. Good news: this is now no longer the case :smile:
https://docs.litespeedtech.com/lsws/extapp/php/getting_started/
Please add an official support, thank you! | process | official support for ubuntu lts hi as far as i noticed cyberpanel could not support ubuntu lts yet because litespeed did not offer support yet good news this is now no longer the case smile please add an official support thank you | 1 |
13,757 | 16,510,557,744 | IssuesEvent | 2021-05-26 03:10:30 | qgis/QGIS | https://api.github.com/repos/qgis/QGIS | closed | TIN Interpolation from vector to raster fails with unknown error | Bug Feedback Processing stale | I am trying to create a DEM from contours using the TIN interpolation tool.
QGIS version: 3.19.0-Master
QGIS code revision: 6df122bd19
Qt version: 5.14.2
GDAL version: 3.1.3
GEOS version: 3.8.1-CAPI-1.13.3
PROJ version: Rel. 7.1.0, August 1st, 2020
```
Processing algorithm…
Algorithm 'TIN interpolation' starting…
Input parameters:
{ 'EXTENT' : '499482.380000000,521757.950000000,1515163.700000000,1559694.900000000 [EPSG:2006]', 'INTERPOLATION_DATA' : 'dbname=\'gis\' host=localhost port=35432 user=\'docker\' password=\'docker\' sslmode=disable key=\'tid\' srid=2006 type=MultiLineStringZ checkPrimaryKeyUnicity=\'1\' table=\"public\".\"contours\" (geom)::~::1::~::-1::~::1', 'METHOD' : 1, 'OUTPUT' : '/home/timlinux/Downloads/stlucia-dem-from-contours.tif', 'PIXEL_SIZE' : 0.5 }
Traceback (most recent call last):
File "/home/timlinux/dev/cpp/QGIS-Debug-Build/output/python/plugins/processing/algs/qgis/TinInterpolation.py", line 188, in processAlgorithm
writer.writeFile(feedback)
Exception: unknown
Execution failed after 1707.56 seconds (28 minutes 28 seconds)
Loading resulting layers
Algorithm 'TIN interpolation' finished
```
I can see it started writing the output file and then failed:
```
-rw-rw-r-- 1 timlinux timlinux 111 Apr 12 10:43 stlucia-dem-from-contours.tif
```
At worst we should get a meaningful error message, otherwise it would be nice to make the output write properly. | 1.0 | TIN Interpolation from vector to raster fails with unknown error - I am trying to create a DEM from contours using the TIN interpolation tool.
QGIS version: 3.19.0-Master
QGIS code revision: 6df122bd19
Qt version: 5.14.2
GDAL version: 3.1.3
GEOS version: 3.8.1-CAPI-1.13.3
PROJ version: Rel. 7.1.0, August 1st, 2020
```
Processing algorithm…
Algorithm 'TIN interpolation' starting…
Input parameters:
{ 'EXTENT' : '499482.380000000,521757.950000000,1515163.700000000,1559694.900000000 [EPSG:2006]', 'INTERPOLATION_DATA' : 'dbname=\'gis\' host=localhost port=35432 user=\'docker\' password=\'docker\' sslmode=disable key=\'tid\' srid=2006 type=MultiLineStringZ checkPrimaryKeyUnicity=\'1\' table=\"public\".\"contours\" (geom)::~::1::~::-1::~::1', 'METHOD' : 1, 'OUTPUT' : '/home/timlinux/Downloads/stlucia-dem-from-contours.tif', 'PIXEL_SIZE' : 0.5 }
Traceback (most recent call last):
File "/home/timlinux/dev/cpp/QGIS-Debug-Build/output/python/plugins/processing/algs/qgis/TinInterpolation.py", line 188, in processAlgorithm
writer.writeFile(feedback)
Exception: unknown
Execution failed after 1707.56 seconds (28 minutes 28 seconds)
Loading resulting layers
Algorithm 'TIN interpolation' finished
```
I can see it started writing the output file and then failed:
```
-rw-rw-r-- 1 timlinux timlinux 111 Apr 12 10:43 stlucia-dem-from-contours.tif
```
At worst we should get a meaningful error message, otherwise it would be nice to make the output write properly. | process | tin interpolation from vector to raster fails with unknown error i am trying to create a dem from contours using the tin interpolation tool qgis version master qgis code revision qt version gdal version geos version capi proj version rel august processing algorithm… algorithm tin interpolation starting… input parameters extent interpolation data dbname gis host localhost port user docker password docker sslmode disable key tid srid type multilinestringz checkprimarykeyunicity table public contours geom method output home timlinux downloads stlucia dem from contours tif pixel size traceback most recent call last file home timlinux dev cpp qgis debug build output python plugins processing algs qgis tininterpolation py line in processalgorithm writer writefile feedback exception unknown execution failed after seconds minutes seconds loading resulting layers algorithm tin interpolation finished i can see it started writing the output file and then failed rw rw r timlinux timlinux apr stlucia dem from contours tif at worst we should get a meaningful error message otherwise it would be nice to make the output write properly | 1 |
54,324 | 13,902,515,279 | IssuesEvent | 2020-10-20 05:31:27 | emilwareus/angular | https://api.github.com/repos/emilwareus/angular | opened | CVE-2020-7656 (Medium) detected in jquery-1.4.4.min.js | security vulnerability | ## CVE-2020-7656 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.4.4.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js</a></p>
<p>Path to dependency file: angular/packages/benchpress/node_modules/selenium-webdriver/lib/test/data/draggableLists.html</p>
<p>Path to vulnerable library: angular/packages/benchpress/node_modules/selenium-webdriver/lib/test/data/js/jquery-1.4.4.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.4.4.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/emilwareus/angular/commit/0a802f3678958587eafa0136d927232b89cc1427">0a802f3678958587eafa0136d927232b89cc1427</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jquery prior to 1.9.0 allows Cross-site Scripting attacks via the load method. The load method fails to recognize and remove "<script>" HTML tags that contain a whitespace character, i.e: "</script >", which results in the enclosed script logic to be executed.
<p>Publish Date: 2020-05-19
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7656>CVE-2020-7656</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/rails/jquery-rails/commit/8f601cbfa08749ee5bbd2bffb6e509db9d753568">https://github.com/rails/jquery-rails/commit/8f601cbfa08749ee5bbd2bffb6e509db9d753568</a></p>
<p>Release Date: 2020-05-19</p>
<p>Fix Resolution: jquery-rails - 2.2.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-7656 (Medium) detected in jquery-1.4.4.min.js - ## CVE-2020-7656 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.4.4.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js</a></p>
<p>Path to dependency file: angular/packages/benchpress/node_modules/selenium-webdriver/lib/test/data/draggableLists.html</p>
<p>Path to vulnerable library: angular/packages/benchpress/node_modules/selenium-webdriver/lib/test/data/js/jquery-1.4.4.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.4.4.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/emilwareus/angular/commit/0a802f3678958587eafa0136d927232b89cc1427">0a802f3678958587eafa0136d927232b89cc1427</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jquery prior to 1.9.0 allows Cross-site Scripting attacks via the load method. The load method fails to recognize and remove "<script>" HTML tags that contain a whitespace character, i.e: "</script >", which results in the enclosed script logic to be executed.
<p>Publish Date: 2020-05-19
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7656>CVE-2020-7656</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/rails/jquery-rails/commit/8f601cbfa08749ee5bbd2bffb6e509db9d753568">https://github.com/rails/jquery-rails/commit/8f601cbfa08749ee5bbd2bffb6e509db9d753568</a></p>
<p>Release Date: 2020-05-19</p>
<p>Fix Resolution: jquery-rails - 2.2.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_process | cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file angular packages benchpress node modules selenium webdriver lib test data draggablelists html path to vulnerable library angular packages benchpress node modules selenium webdriver lib test data js jquery min js dependency hierarchy x jquery min js vulnerable library found in head commit a href vulnerability details jquery prior to allows cross site scripting attacks via the load method the load method fails to recognize and remove html tags that contain a whitespace character i e which results in the enclosed script logic to be executed publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery rails step up your open source security game with whitesource | 0 |
166,070 | 14,018,965,011 | IssuesEvent | 2020-10-29 17:31:26 | Cycling74/min-devkit | https://api.github.com/repos/Cycling74/min-devkit | closed | package-info.json.in file missing from distribution | bug needs-documentation | Hi,
I installed min-devkit via Package manager. I was wondering if there is a procedure for updating the min-devkit inside of a package that I have already been working on? Do I just copy over the min-api and min-lib from the updated min-devkit package?
Thank you.
| 1.0 | package-info.json.in file missing from distribution - Hi,
I installed min-devkit via Package manager. I was wondering if there is a procedure for updating the min-devkit inside of a package that I have already been working on? Do I just copy over the min-api and min-lib from the updated min-devkit package?
Thank you.
| non_process | package info json in file missing from distribution hi i installed min devkit via package manager i was wondering if there is a procedure for updating the min devkit inside of a package that i have already been working on do i just copy over the min api and min lib from the updated min devkit package thank you | 0 |
337,615 | 24,548,194,164 | IssuesEvent | 2022-10-12 10:25:54 | se310-t6/politicry | https://api.github.com/repos/se310-t6/politicry | closed | Update installation instructions to use release | documentation Approved | ### Reference Issues
Related to #55
### Summary
Add instructions for installing the latest release into a Chrome browser.
Currently, the instructions require you to build the extension. Now that we have put out our first [release](https://github.com/se310-t6/politicry/releases), the installation instructions should also be updated.
It will be simpler for new users to install it this way because they will not need to build the extension. | 1.0 | Update installation instructions to use release - ### Reference Issues
Related to #55
### Summary
Add instructions for installing the latest release into a Chrome browser.
Currently, the instructions require you to build the extension. Now that we have put out our first [release](https://github.com/se310-t6/politicry/releases), the installation instructions should also be updated.
It will be simpler for new users to install it this way because they will not need to build the extension. | non_process | update installation instructions to use release reference issues related to summary add instructions for installing the latest release into a chrome browser currently the instructions require you to build the extension now that we have put out our first the installation instructions should also be updated it will be simpler for new users to install it this way because they will not need to build the extension | 0 |
18,491 | 24,550,921,632 | IssuesEvent | 2022-10-12 12:33:14 | GoogleCloudPlatform/fda-mystudies | https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies | closed | [iOS] [Offline indicator] Error message is not getting displayed for the following | Bug P1 iOS Process: Fixed Process: Tested dev | Steps:
1. Sign in to the app
2. Click on the enrolled study
3. Turn off the data
4. Click on the Activity
5. Observe
AR: Offline error message is not getting displayed
ER: 'Seems to be offline...' error message should get displayed to the user
Note:
The issue should also be fixed when the user clicks on the 'View website', 'Terms', 'Privacy policy', etc. | 2.0 | [iOS] [Offline indicator] Error message is not getting displayed for the following - Steps:
1. Sign in to the app
2. Click on the enrolled study
3. Turn off the data
4. Click on the Activity
5. Observe
AR: Offline error message is not getting displayed
ER: 'Seems to be offline...' error message should get displayed to the user
Note:
The issue should also be fixed when the user clicks on the 'View website', 'Terms' 'Privacy policy' etc. | process | error message is not getting displayed for the following steps sign in to the app click on the enrolled study turn off the data click on the activity observe ar offline error message is not getting displayed er seems to be offline error message should get displayed to the user note the issue should also be fixed when the user clicks on the view website terms privacy policy etc | 1 |
152,879 | 12,128,666,351 | IssuesEvent | 2020-04-22 20:55:47 | kwk/test-llvm-bz-import-4 | https://api.github.com/repos/kwk/test-llvm-bz-import-4 | opened | test-suite should support building only C/ObjC programs | BZ-BUG-STATUS: RESOLVED BZ-RESOLUTION: FIXED Test Suite/Programs Tests dummy import from bugzilla | This issue was imported from Bugzilla https://bugs.llvm.org/show_bug.cgi?id=3484. | 2.0 | test-suite should support building only C/ObjC programs - This issue was imported from Bugzilla https://bugs.llvm.org/show_bug.cgi?id=3484. | non_process | test suite should support building only c objc programs this issue was imported from bugzilla | 0 |
599,944 | 18,286,707,585 | IssuesEvent | 2021-10-05 11:07:26 | Tsuey/L4D2-Community-Update | https://api.github.com/repos/Tsuey/L4D2-Community-Update | closed | "client disconnect" replace all kick messages | duplicate medium priority game bug live | You broke something in recent updates, it always write "client disconnect" in the place of the real reason that should show to players when they get kicked. Votekick for example and same for custom reasons with sourcemod, not sure if it does it too for bans. | 1.0 | "client disconnect" replace all kick messages - You broke something in recent updates, it always write "client disconnect" in the place of the real reason that should show to players when they get kicked. Votekick for example and same for custom reasons with sourcemod, not sure if it does it too for bans. | non_process | client disconnect replace all kick messages you broke something in recent updates it always write client disconnect in the place of the real reason that should show to players when they get kicked votekick for example and same for custom reasons with sourcemod not sure if it does it too for bans | 0 |
9,577 | 12,530,669,748 | IssuesEvent | 2020-06-04 13:26:39 | fluent/fluent-bit | https://api.github.com/repos/fluent/fluent-bit | closed | Unable to perform "GROUP BY" on "multiple fields" correctly in stream processing | work-in-process | ## Bug Report
**Describe the bug**
Unable to perform **"GROUP BY"** on **"multiple fields"** correctly in stream processing at Fluent Bit version 1.3.3 (the binary was got from the yum installation), please see the detailed information of wrong aggregation result and how to reproduce it below.
**To Reproduce**
- Steps to reproduce the problem:
Step 1. Create a sample log like **dns_bind.log** below
```
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (google.com): query: google.com IN A +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (google.com): query: google.com IN A +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (facebook.com): query: facebook.com IN A +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (facebook.com): query: facebook.com IN A +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (google.com): query: google.com IN AAAA +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (google.com): query: google.com IN AAAA +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (facebook.com): query: facebook.com IN CNAME +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (facebook.com): query: facebook.com IN CNAME +E(0)D (8.8.8.8)
```
Step 2. Set up a regex parser for the log above and put it into **parser-bind.conf**
```
[PARSER]
Name bind
Format regex
Regex ^(?<time>[^ ]*\ [^ ]*) (?<client>[^ ]*) (?<client_ip>[^ ]*)#(?<client_port>[^ ]*) \((?<target_queryname>[^ ]*)\): (?<query>[^ ]*): (?<query_domain_name>[^ ]*) (?<class>[^ ]*) (?<query_type>[^ ]*) (?<recursion_desired_flag>[^ ]*) \((?<dns_server>[^ ]*)\)$
```
When the log is parsed by the parser above, it is organized into a record like:
```
{"time"=>"2019-12-05 03:12:31.820221", "client"=>"client", "client_ip"=>"192.168.11.12", "client_port"=>"55206", "target_queryname"=>"facebook.com", "query"=>"query", "query_domain_name"=>"facebook.com", "class"=>"IN", "query_type"=>"A", "recursion_desired_flag"=>"+E(0)D", "dns_server"=>"8.8.8.8"}
```
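Before wiring the parser into Fluent Bit, the pattern can be sanity-checked outside of it. A minimal Python sketch (Fluent Bit uses Onigmo-style `(?<name>...)` named groups; Python spells them `(?P<name>...)`, and this check covers only field extraction, not Fluent Bit's time handling):

```python
import re

# Python translation of the [PARSER] regex above: (?<name>...) -> (?P<name>...)
BIND_RE = re.compile(
    r"^(?P<time>[^ ]* [^ ]*) (?P<client>[^ ]*) (?P<client_ip>[^ ]*)#(?P<client_port>[^ ]*) "
    r"\((?P<target_queryname>[^ ]*)\): (?P<query>[^ ]*): (?P<query_domain_name>[^ ]*) "
    r"(?P<class>[^ ]*) (?P<query_type>[^ ]*) (?P<recursion_desired_flag>[^ ]*) "
    r"\((?P<dns_server>[^ ]*)\)$"
)

def parse_bind_line(line: str) -> dict:
    """Return the named fields of one BIND query-log line."""
    m = BIND_RE.match(line)
    if m is None:
        raise ValueError(f"unparseable line: {line!r}")
    return m.groupdict()

record = parse_bind_line(
    "2019-12-05 03:12:31.820221 client 192.168.11.12#55206 "
    "(google.com): query: google.com IN A +E(0)D (8.8.8.8)"
)
print(record["query_domain_name"], record["query_type"])  # google.com A
```

If a line fails to match here, it would also be dropped or mis-parsed by the Fluent Bit parser, which is worth ruling out before debugging the stream processor itself.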
Step 3. Set up a stream processor configuration **stream-process-bind.conf** as below
```
[STREAM_TASK]
Name bind_sp_1
Exec CREATE STREAM bind_sp_1 AS SELECT query_domain_name, query_type, COUNT(*) AS hits FROM STREAM:bind_raw_log WINDOW TUMBLING (60 SECOND) GROUP BY query_domain_name, query_type;
```
Step 4. Set up the main configuration file **flb_main.conf** as below
```
[SERVICE]
Parsers_File parser-bind.conf
Streams_File stream-process-bind.conf
Log_Level info
[INPUT]
Name tail
alias bind_raw_log
Path dns_bind.log
Parser bind
[OUTPUT]
Name stdout
Match bind_sp_1
```
Step 5. Run Fluent Bit with the configuration files above; it then produces the aggregation result below:
```
{"query_domain_name"=>"google.com", "query_type"=>"A", "hits"=>4}
{"query_domain_name"=>"facebook.com", "query_type"=>"A", "hits"=>4}
```
However, this is not the correct result; the expected output is shown below.
**Expected behavior**
The expected result from stream processor above should be like
```
{"query_domain_name"=>"google.com", "query_type"=>"A", "hits"=>2}
{"query_domain_name"=>"google.com", "query_type"=>"AAAA", "hits"=>2}
{"query_domain_name"=>"facebook.com", "query_type"=>"A", "hits"=>2}
{"query_domain_name"=>"facebook.com", "query_type"=>"CNAME", "hits"=>2}
```
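As a cross-check of the expectation (not of Fluent Bit's internals), grouping the sample log's records on the pair of keys reproduces the expected counts, while grouping on the first key alone reproduces the buggy output. A quick Python sketch:

```python
from collections import Counter

# (query_domain_name, query_type) pairs taken from dns_bind.log above
pairs = [
    ("google.com", "A"), ("google.com", "A"),
    ("facebook.com", "A"), ("facebook.com", "A"),
    ("google.com", "AAAA"), ("google.com", "AAAA"),
    ("facebook.com", "CNAME"), ("facebook.com", "CNAME"),
]

hits = Counter(pairs)  # group by BOTH fields, as GROUP BY a, b should
for (domain, qtype), count in hits.items():
    print({"query_domain_name": domain, "query_type": qtype, "hits": count})

# The buggy output corresponds to grouping on the first field only:
wrong = Counter(domain for domain, _ in pairs)
assert wrong["google.com"] == 4 and wrong["facebook.com"] == 4
```

If Fluent Bit reports 4 hits per domain as in the buggy output, it is effectively computing `wrong` above, i.e. keying the window aggregation on only the first GROUP BY column.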
**Your Environment**
* Version used: Fluent Bit v1.3.3
* Configuration: Please see the configuration files above
* Environment name and version (e.g. Kubernetes? What version?): NA
* Server type and version: NA
* Operating System and version: CentOS Linux release 7.2.1511 (Core)
* Filters and plugins: NA
**Additional Context**
Hi Team FluentBit,
Just want to say thank you to you folks.
Fluent Bit is a really nice tool for log/metric collection and forwarding.
By the way, please feel free to let me know if there is something wrong with my configuration or scenario above.
| 1.0 | Unable to perform "GROUP BY" on "multiple fields" correctly in stream processing - ## Bug Report
**Describe the bug**
Unable to perform **"GROUP BY"** on **"multiple fields"** correctly in stream processing at Fluent Bit version 1.3.3 (the binary was got from the yum installation), please see the detailed information of wrong aggregation result and how to reproduce it below.
**To Reproduce**
- Steps to reproduce the problem:
Step 1. Create a sample log like **dns_bind.log** below
```
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (google.com): query: google.com IN A +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (google.com): query: google.com IN A +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (facebook.com): query: facebook.com IN A +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (facebook.com): query: facebook.com IN A +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (google.com): query: google.com IN AAAA +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (google.com): query: google.com IN AAAA +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (facebook.com): query: facebook.com IN CNAME +E(0)D (8.8.8.8)
2019-12-05 03:12:31.820221 client 192.168.11.12#55206 (facebook.com): query: facebook.com IN CNAME +E(0)D (8.8.8.8)
```
Step 2. Set up a regex parser for the log above and put it into **parser-bind.conf**
```
[PARSER]
Name bind
Format regex
Regex ^(?<time>[^ ]*\ [^ ]*) (?<client>[^ ]*) (?<client_ip>[^ ]*)#(?<client_port>[^ ]*) \((?<target_queryname>[^ ]*)\): (?<query>[^ ]*): (?<query_domain_name>[^ ]*) (?<class>[^ ]*) (?<query_type>[^ ]*) (?<recursion_desired_flag>[^ ]*) \((?<dns_server>[^ ]*)\)$
```
When the log is parsed by the parser above, it is organized into a record like:
```
{"time"=>"2019-12-05 03:12:31.820221", "client"=>"client", "client_ip"=>"192.168.11.12", "client_port"=>"55206", "target_queryname"=>"facebook.com", "query"=>"query", "query_domain_name"=>"facebook.com", "class"=>"IN", "query_type"=>"A", "recursion_desired_flag"=>"+E(0)D", "dns_server"=>"8.8.8.8"}
```
Step 3. Set up a stream processor configuration **stream-process-bind.conf** as below
```
[STREAM_TASK]
Name bind_sp_1
Exec CREATE STREAM bind_sp_1 AS SELECT query_domain_name, query_type, COUNT(*) AS hits FROM STREAM:bind_raw_log WINDOW TUMBLING (60 SECOND) GROUP BY query_domain_name, query_type;
```
Step 4. Set up the main configuration file **flb_main.conf** as below
```
[SERVICE]
Parsers_File parser-bind.conf
Streams_File stream-process-bind.conf
Log_Level info
[INPUT]
Name tail
alias bind_raw_log
Path dns_bind.log
Parser bind
[OUTPUT]
Name stdout
Match bind_sp_1
```
Step 5. Run Fluent Bit with the configuration files above; it then produces the aggregation result below:
```
{"query_domain_name"=>"google.com", "query_type"=>"A", "hits"=>4}
{"query_domain_name"=>"facebook.com", "query_type"=>"A", "hits"=>4}
```
However, this is not the correct result; the expected output is shown below.
**Expected behavior**
The expected result from stream processor above should be like
```
{"query_domain_name"=>"google.com", "query_type"=>"A", "hits"=>2}
{"query_domain_name"=>"google.com", "query_type"=>"AAAA", "hits"=>2}
{"query_domain_name"=>"facebook.com", "query_type"=>"A", "hits"=>2}
{"query_domain_name"=>"facebook.com", "query_type"=>"CNAME", "hits"=>2}
```
**Your Environment**
* Version used: Fluent Bit v1.3.3
* Configuration: Please see the configuration files above
* Environment name and version (e.g. Kubernetes? What version?): NA
* Server type and version: NA
* Operating System and version: CentOS Linux release 7.2.1511 (Core)
* Filters and plugins: NA
**Additional Context**
Hi Team FluentBit,
Just want to say thank you to you folks.
Fluent Bit is a really nice tool for log/metric collection and forwarding.
By the way, please feel free to let me know if there is something wrong with my configuration or scenario above.
| process | unable to perform group by on multiple fields correctly in stream processing bug report describe the bug unable to perform group by on multiple fields correctly in stream processing at fluent bit version the binary was got from the yum installation please see the detailed information of wrong aggregation result and how to reproduce it below to reproduce steps to reproduce the problem step create a sample log like dns bind log below client google com query google com in a e d client google com query google com in a e d client facebook com query facebook com in a e d client facebook com query facebook com in a e d client google com query google com in aaaa e d client google com query google com in aaaa e d client facebook com query facebook com in cname e d client facebook com query facebook com in cname e d step setup an regex parser for the log above and put them into parser bind conf name bind format regex regex when log is parsed by the parser above it will be organized to time client client client ip client port target queryname facebook com query query query domain name facebook com class in query type a recursion desired flag e d dns server step setup a stream processor configuration stream process bind conf below name bind sp exec create stream bind sp as select query domain name query type count as hits from stream bind raw log window tumbling second group by query domain name query type step setup the main configuration file flb main conf below parsers file parser bind conf streams file stream process bind conf log level info name tail alias bind raw log path dns bind log parser bind name stdout match bind sp step execute fluentbit with the configuration files above then we will get the aggregation result like the output below query domain name google com query type a hits query domain name facebook com query type a hits but the result above isn t the correct result the expected result should be like below expected behavior the expected result 
from stream processor above should be like query domain name google com query type a hits query domain name google com query type aaaa hits query domain name facebook com query type a hits query domain name facebook com query type cname hits your environment version used fluent bit configuration please see the configuration files above environment name and version e g kubernetes what version na server type and version na operating system and version centos linux release core filters and plugins na additional context hi team fluentbit just want to say thank you for you folks fluentbit is a really nice thing for using on log metric collection or transferring btw please feel free to let me know if there s something wrong with my configurations and occasion above | 1 |
203,701 | 15,887,317,091 | IssuesEvent | 2021-04-10 01:45:06 | Neos-Metaverse/NeosPublic | https://api.github.com/repos/Neos-Metaverse/NeosPublic | opened | Documentation: Error handling for ImageImporter.ImportImage (and potentially other internal coroutines) | Documentation | ## Where have you searched for this information before making this issue?
This information does not appear to be available within the community, or in the discord history.
## Which part of the system do you need information on?
`FrooxEngine.ImageImporter.ImportImage` and `FrooxEngine.UniversalImporter.Import` - I can't find a way to capture errors with the import process (Unlike `FrooxEngine.ModelImporter.ImportModel`, which can accept a `BaseX.CallbackProgressIndicator`)
## What information do you need to know?
- What is the recommended way to call these methods from code which uses async/await, instead of coroutines?
- How do I properly handle errors with the import process? Is there some helper method for wrapping these calls?
At present, I'm using `World.Coroutines.StartCoroutine` to run the coroutine, and wrapping this in a `TaskCompletionSource` that depends on the `onDone` callback to complete the awaitable.
This sometimes causes issues, because certain images will cause the import to fail, which never triggers the `onDone` callback, and does not appear to produce any consumable exception (except as a log entry).
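The hang described here, a `TaskCompletionSource` that never completes because a failing import never fires `onDone`, can be mitigated generically by guarding the await with a timeout. Below is a minimal Python asyncio sketch of the same pattern; `start_import`/`on_done` are hypothetical stand-ins for the engine's coroutine API, not actual FrooxEngine calls:

```python
import asyncio

def start_import(url: str, on_done) -> None:
    """Hypothetical callback-based importer: fires on_done on success,
    but (like the reported bug) never fires it for a bad input."""
    if url.endswith(".png"):
        on_done("imported:" + url)
    # otherwise silently drops the callback -- the failure mode described above

async def import_image(url: str, timeout: float = 0.05) -> str:
    loop = asyncio.get_running_loop()
    done: asyncio.Future = loop.create_future()
    start_import(url, lambda result: done.set_result(result))
    # The timeout turns a "callback never fired" hang into a visible error.
    return await asyncio.wait_for(done, timeout=timeout)

async def main() -> None:
    print(await import_image("icon.png"))  # completes normally
    try:
        await import_image("broken.bmp")
    except asyncio.TimeoutError:
        print("import failed (no onDone callback)")

asyncio.run(main())
```

A real fix would surface the underlying import exception instead of a timeout, but until the engine exposes errors to the `onDone` path, a timeout guard at least keeps the wrapping task from hanging forever.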
## Do you plan to format and contribute the provided information to Wiki?
Potentially, although we have not decided on a format for more advanced internal/technical documentation at present.
| 1.0 | Documentation: Error handling for ImageImporter.ImportImage (and potentially other internal coroutines) - ## Where have you searched for this information before making this issue?
This information does not appear to be available within the community, or in the discord history.
## Which part of the system do you need information on?
`FrooxEngine.ImageImporter.ImportImage` and `FrooxEngine.UniversalImporter.Import` - I can't find a way to capture errors with the import process (Unlike `FrooxEngine.ModelImporter.ImportModel`, which can accept a `BaseX.CallbackProgressIndicator`)
## What information do you need to know?
- What is the recommended way to call these methods from code which uses async/await, instead of coroutines?
- How do I properly handle errors with the import process? Is there some helper method for wrapping these calls?
At present, I'm using `World.Coroutines.StartCoroutine` to run the coroutine, and wrapping this in a `TaskCompletionSource` that depends on the `onDone` callback to complete the awaitable.
This sometimes causes issues, because certain images will cause the import to fail, which never triggers the `onDone` callback, and does not appear to produce any consumable exception (except as a log entry).
## Do you plan to format and contribute the provided information to Wiki?
Potentially, although we have not decided on a format for more advanced internal/technical documentation at present.
| non_process | documentation error handling for imageimporter importimage and potentially other internal coroutines where have you searched for this information before making this issue this information does not appear to be available within the community or in the discord history which part of the system do you need information on frooxengine imageimporter importimage and frooxengine universalimporter import i can t find a way to capture errors with the import process unlike frooxengine modelimporter importmodel which can accept a basex callbackprogressindicator what information do you need to know what is the recommended way to call these methods from code which uses async await instead of coroutines how do i properly handle errors with the import process is there some helper method for wrapping these calls at present i m using world coroutines startcoroutine to run the coroutine and wrapping this in a taskcompletionsource that depends on the ondone callback to complete the awaitable this sometimes causes issues because certain images will cause the import to fail which never triggers the ondone callback and does not appear to produce any consumable exception except as a log entry do you plan to format and contribute the provided information to wiki potentially although we have not decided on a format for more advanced internal technical documentation at present | 0 |
22,714 | 32,038,498,240 | IssuesEvent | 2023-09-22 17:14:44 | bazelbuild/bazel | https://api.github.com/repos/bazelbuild/bazel | closed | Release X.Y.Z - $MONTH $YEAR | P1 type: process release team-OSS | # Status of Bazel X.Y.Z
- Expected first release candidate date: [date]
- Expected release date: [date]
- [List of release blockers](link-to-milestone)
To report a release-blocking bug, please add a comment with the text `@bazel-io flag` to the issue. A release manager will triage it and add it to the milestone.
To cherry-pick a mainline commit into X.Y.Z, simply send a PR against the `release-X.Y.Z` branch.
**Task list:**
<!-- The first item is only needed for major releases (X.0.0) -->
- [ ] Pick release baseline: [link to base commit]
- [ ] Create release candidate: X.Y.Zrc1
- [ ] Check downstream projects
- [ ] Create [draft release announcement](https://docs.google.com/document/d/1pu2ARPweOCTxPsRR8snoDtkC9R51XWRyBXeiC6Ql5so/edit) <!-- Note that there should be a new Bazel Release Announcement document for every major release. For minor and patch releases, use the latest open doc. -->
- [ ] Send the release announcement PR for review: [link to bazel-blog PR] <!-- Only for major releases. -->
- [ ] Push the release and notify package maintainers: [link to comment notifying package maintainers]
- [ ] Update the documentation
- [ ] Push the blog post: [link to blog post] <!-- Only for major releases. -->
- [ ] Update the [release page](https://github.com/bazelbuild/bazel/releases/)
| 1.0 | Release X.Y.Z - $MONTH $YEAR - # Status of Bazel X.Y.Z
- Expected first release candidate date: [date]
- Expected release date: [date]
- [List of release blockers](link-to-milestone)
To report a release-blocking bug, please add a comment with the text `@bazel-io flag` to the issue. A release manager will triage it and add it to the milestone.
To cherry-pick a mainline commit into X.Y.Z, simply send a PR against the `release-X.Y.Z` branch.
**Task list:**
<!-- The first item is only needed for major releases (X.0.0) -->
- [ ] Pick release baseline: [link to base commit]
- [ ] Create release candidate: X.Y.Zrc1
- [ ] Check downstream projects
- [ ] Create [draft release announcement](https://docs.google.com/document/d/1pu2ARPweOCTxPsRR8snoDtkC9R51XWRyBXeiC6Ql5so/edit) <!-- Note that there should be a new Bazel Release Announcement document for every major release. For minor and patch releases, use the latest open doc. -->
- [ ] Send the release announcement PR for review: [link to bazel-blog PR] <!-- Only for major releases. -->
- [ ] Push the release and notify package maintainers: [link to comment notifying package maintainers]
- [ ] Update the documentation
- [ ] Push the blog post: [link to blog post] <!-- Only for major releases. -->
- [ ] Update the [release page](https://github.com/bazelbuild/bazel/releases/)
| process | x y z month year status of bazel x y z expected first release candidate date expected release date link to milestone to report a release blocking bug please add a comment with the text bazel io flag to the issue a release manager will triage it and add it to the milestone to cherry pick a mainline commit into x y z simply send a pr against the release x y z branch task list pick release baseline create release candidate x y check downstream projects create send the release announcement pr for review push the release and notify package maintainers update the documentation push the blog post update the | 1 |
22,576 | 31,804,887,868 | IssuesEvent | 2023-09-13 13:24:29 | workfloworchestrator/orchestrator-ui | https://api.github.com/repos/workfloworchestrator/orchestrator-ui | closed | Process detail page: implement time line component | Process detail page | Implement the time line component

Add a process time line component. This component has the attributes:
* total number of steps
* current step index
* step description
In the first implementation add the following features:
- [x] visualize the current step as part of time line
- [x] use full width of component, and equally spaced `dots` to represent a process step
- [x] steps < step_index should have the completed style
- [x] steps = step_index should have the active style
- [x] steps > step_index should have the disabled style
- [x] add mouse over effect on each `dot`, displaying the step description
- [ ] this mouse over can be enhanced with more information at a later stage
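The styling rules in the checklist above (steps before the current index completed, the current step active, later steps disabled) reduce to a small pure function. A sketch in Python for illustration only; the actual component is presumably implemented in the repo's own TypeScript:

```python
def step_style(step_index: int, current: int) -> str:
    """Map a step's position relative to the current step to its style."""
    if step_index < current:
        return "completed"
    if step_index == current:
        return "active"
    return "disabled"

def timeline(total_steps: int, current: int) -> list[str]:
    """Styles for every equally spaced dot in a process time line."""
    return [step_style(i, current) for i in range(total_steps)]

print(timeline(5, 2))
# a 5-step process currently on step index 2:
# ['completed', 'completed', 'active', 'disabled', 'disabled']
```

Keeping this mapping pure makes the completed/active/disabled rules trivially unit-testable, independent of the rendering framework.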
| 1.0 | Process detail page: implement time line component - Implement the time line component

Add a process time line component. This component has the attributes:
* total number of steps
* current step index
* step description
In the first implementation add the following features:
- [x] visualize the current step as part of time line
- [x] use full width of component, and equally spaced `dots` to represent a process step
- [x] steps < step_index should have the completed style
- [x] steps = step_index should have the active style
- [x] steps > step_index should have the disabled style
- [x] add mouse over effect on each `dot`, displaying the step description
- [ ] this mouse over can be enhanced with more information at a later stage
| process | process detail page implement time line component implement the time line component add process time line component this component has the attributes total number of steps current step index step description in the first implementation add the following features visualize the current step as part of time line use full width of component and equally spaced dots to represent a process step steps step index should have the completed style steps step index should have the active style steps step index should have the disabled style add mouse over effect on each dot displaying the step description this mouse over can be enhanced with more information at a later stage | 1 |
11,426 | 14,248,153,582 | IssuesEvent | 2020-11-19 12:31:45 | tikv/tikv | https://api.github.com/repos/tikv/tikv | closed | Use protobuf enums to replace coprocessor constants like REQ_TYPE_DAG | sig/coprocessor status/discussion type/enhancement | Due to historical reasons, we are using standalone constants like REQ_TYPE_DAG, REQ_TYPE_ANALYZE and REQ_TYPE_CHECKSUM. This can be replaced by embedding enums in the protobuf file. This change can help simplify our code and make TiKV & TiDB source more consistent.
Note:
1. To complete this task, you need to update TiDB as well.
2. Because this is a breaking change, we may not merge the PR immediately. Instead it may be deferred to our 3.0 release. | 1.0 | Use protobuf enums to replace coprocessor constants like REQ_TYPE_DAG - Due to historical reasons, we are using standalone constants like REQ_TYPE_DAG, REQ_TYPE_ANALYZE and REQ_TYPE_CHECKSUM. This can be replaced by embedding enums in the protobuf file. This change can help simplify our code and make TiKV & TiDB source more consistent.
Note:
1. To complete this task, you need to update TiDB as well.
2. Because this is a breaking change, we may not merge the PR immediately. Instead it may be deferred to our 3.0 release. | process | use protobuf enums to replace coprocessor constants like req type dag due to historical reasons we are using standalone constants like req type dag req type analyze and req type checksum this can be replaced by embedding enums in the protobuf file this change can help simplify our code and make tikv tidb source more consistent note to complete this task you need to update tidb as well because this is a breaking change we may not merge the pr immediately instead it may be deferred to our release | 1 |
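For illustration, the constants-to-enum move described in the issue above can be sketched as follows. This is a Python sketch only; in the real change the enum would be declared in the shared protobuf definitions and code-generated for both TiKV and TiDB, and the numeric values 103-105 shown here are assumptions chosen purely to illustrate preserving wire compatibility:

```python
from enum import IntEnum

class ReqType(IntEnum):
    """Coprocessor request types as one enum instead of free-standing
    constants like REQ_TYPE_DAG (numeric values are illustrative)."""
    DAG = 103
    ANALYZE = 104
    CHECKSUM = 105

def decode_request(req_tp: int) -> ReqType:
    """Validate an incoming request-type tag against the enum."""
    try:
        return ReqType(req_tp)
    except ValueError:
        raise ValueError(f"unsupported coprocessor request type: {req_tp}")

# IntEnum members compare equal to the old integer constants,
# so old clients sending raw integers keep working during migration:
assert decode_request(103) is ReqType.DAG
assert ReqType.ANALYZE == 104
```

Because the enum values stay numerically identical to the old constants, the change is source-level only; the wire format is unchanged, which is why the breaking part is the API surface, not the protocol.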
233,552 | 7,698,916,674 | IssuesEvent | 2018-05-19 05:16:12 | sethballantyne/Game-Demos | https://api.github.com/repos/sethballantyne/Game-Demos | closed | Input::ReadMouse() throws a DirectInputInvalidParameterException | PBB-Game bug priority-critical | [01/05/18 09:03:56] Fatal Error: DirectInputInvalidParameterException thrown. IDirectInputDevice8::GetDeviceState: an invalid parameter was passed to the function, or the object is an invalid state.
Stack trace:
at Input.ReadMouse() in c:\projects\project ball buster\src\pbb\game\input.cpp:line 347
at Game.Update() in c:\projects\project ball buster\src\pbb\game\game.cpp:line 169
at WinMain(HINSTANCE__* hInstance, HINSTANCE__* hPrevInstance, SByte* lpCmdLine, Int32 nCmdShow) in c:\projects\project ball buster\src\pbb\game\main.cpp:line 195
| 1.0 | Input::ReadMouse() throws a DirectInputInvalidParameterException - [01/05/18 09:03:56] Fatal Error: DirectInputInvalidParameterException thrown. IDirectInputDevice8::GetDeviceState: an invalid parameter was passed to the function, or the object is an invalid state.
Stack trace:
at Input.ReadMouse() in c:\projects\project ball buster\src\pbb\game\input.cpp:line 347
at Game.Update() in c:\projects\project ball buster\src\pbb\game\game.cpp:line 169
at WinMain(HINSTANCE__* hInstance, HINSTANCE__* hPrevInstance, SByte* lpCmdLine, Int32 nCmdShow) in c:\projects\project ball buster\src\pbb\game\main.cpp:line 195
| non_process | input readmouse throws a directinputinvalidparameterexception fatal error directinputinvalidparameterexception thrown getdevicestate an invalid parameter was passed to the function or the object is an invalid state stack trace at input readmouse in c projects project ball buster src pbb game input cpp line at game update in c projects project ball buster src pbb game game cpp line at winmain hinstance hinstance hinstance hprevinstance sbyte lpcmdline ncmdshow in c projects project ball buster src pbb game main cpp line | 0 |
23,019 | 3,750,550,660 | IssuesEvent | 2016-03-11 07:41:58 | AlexObukhoff/cryptophane | https://api.github.com/repos/AlexObukhoff/cryptophane | closed | invalid password if longer than 64 characters | auto-migrated Priority-Medium Type-Defect | ```
What steps will reproduce the problem?
1.
2.
3.
What is the expected output? What do you see instead?
Password not valid! Password valid.
What version of the product are you using? On what operating system?
0.7.0.42
Please provide any additional information below.
I checked better. Cryptophane can only enter passwords smaller than about
64 characters. It would be useful to include longer passwords. I use the
password very long to write. Why can not I paste a copy and paste when you
import the certificates? Thank you.
```
Original issue reported on code.google.com by `diamant...@gmail.com` on 19 Apr 2010 at 7:49 | 1.0 | invalid password if longer than 64 characters - ```
What steps will reproduce the problem?
1.
2.
3.
What is the expected output? What do you see instead?
Password not valid! Password valid.
What version of the product are you using? On what operating system?
0.7.0.42
Please provide any additional information below.
I checked better. Cryptophane can only enter passwords smaller than about
64 characters. It would be useful to include longer passwords. I use the
password very long to write. Why can not I paste a copy and paste when you
import the certificates? Thank you.
```
Original issue reported on code.google.com by `diamant...@gmail.com` on 19 Apr 2010 at 7:49 | non_process | invalid password if longer than characters what steps will reproduce the problem what is the expected output what do you see instead passaword not valid password valid what version of the product are you using on what operating system please provide any additional information below i checked better cryptophane can only enter passwords smaller than about characters it would be useful to include longer passwords i use the password very long to write why can not i paste a copy and paste when you import the certificates thank you original issue reported on code google com by diamant gmail com on apr at | 0 |
24,329 | 12,065,920,632 | IssuesEvent | 2020-04-16 10:49:08 | MicrosoftDocs/azure-docs | https://api.github.com/repos/MicrosoftDocs/azure-docs | closed | kubernetes ttl override | Pri2 container-service/svc cxp product-question triaged | Hi,
I want to change the default TTL of the kubernetes plugin from its default of 5 seconds to, for example, 60. The only way I have it working is by overwriting the entire "server" block, for example:
```yaml
apiVersion: v1
kind: ConfigMap
metadata:
name: coredns-custom
namespace: kube-system
data:
k8s.server: |
cluster.local:53 {
log
errors
health
kubernetes cluster.local in-addr.arpa ip6.arpa {
pods insecure
upstream
ttl 900
fallthrough in-addr.arpa ip6.arpa
}
prometheus :9153
forward . /etc/resolv.conf
cache 900
loop
reload
loadbalance
}
```
My question is whether this is the supported way, and if not, what is?
Thanks.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 2f95537e-770a-739c-0175-4447a995f8b2
* Version Independent ID: 6c9a1132-05af-70f1-5a2b-8dad56de9cc3
* Content: [Customize CoreDNS for Azure Kubernetes Service (AKS) - Azure Kubernetes Service](https://docs.microsoft.com/en-us/azure/aks/coredns-custom#feedback)
* Content Source: [articles/aks/coredns-custom.md](https://github.com/Microsoft/azure-docs/blob/master/articles/aks/coredns-custom.md)
* Service: **container-service**
* GitHub Login: @jnoller
* Microsoft Alias: **jenoller** | 1.0 | kubernetes ttl override - Hi,
I want to change the default TTL of the kubernetes plugin from its default of 5 seconds to, for example, 60. The only way I have it working is by overwriting the entire "server" block, for example:
```yaml
apiVersion: v1
kind: ConfigMap
metadata:
name: coredns-custom
namespace: kube-system
data:
k8s.server: |
cluster.local:53 {
log
errors
health
kubernetes cluster.local in-addr.arpa ip6.arpa {
pods insecure
upstream
ttl 900
fallthrough in-addr.arpa ip6.arpa
}
prometheus :9153
forward . /etc/resolv.conf
cache 900
loop
reload
loadbalance
}
```
My question is whether this is the supported way, and if not, what is?
Thanks.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 2f95537e-770a-739c-0175-4447a995f8b2
* Version Independent ID: 6c9a1132-05af-70f1-5a2b-8dad56de9cc3
* Content: [Customize CoreDNS for Azure Kubernetes Service (AKS) - Azure Kubernetes Service](https://docs.microsoft.com/en-us/azure/aks/coredns-custom#feedback)
* Content Source: [articles/aks/coredns-custom.md](https://github.com/Microsoft/azure-docs/blob/master/articles/aks/coredns-custom.md)
* Service: **container-service**
* GitHub Login: @jnoller
* Microsoft Alias: **jenoller** | non_process | kubernetes ttl override hi i want to change the default ttl for the kubernetes plugin which is seconds to for example the only way that i have it working is if i am overwriting the entire server block for example yaml apiversion kind configmap metadata name coredns custom namespace kube system data server cluster local log errors health kubernetes cluster local in addr arpa arpa pods insecure upstream ttl fallthrough in addr arpa arpa prometheus forward etc resolv conf cache loop reload loadbalance my question is if this is the supported way and if not what it is thanks document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service container service github login jnoller microsoft alias jenoller | 0 |
233,568 | 19,010,740,602 | IssuesEvent | 2021-11-23 09:02:00 | input-output-hk/ouroboros-network | https://api.github.com/repos/input-output-hk/ouroboros-network | closed | prop_mutlinode_pruning_Sim failure | peer2peer testing inbound-governor | See the `coot/prop_pruning_failure` branch for the test case. It triggers:
```
Exception thrown while showing test case:
Assertion failed
CallStack (from HasCallStack):
assert, called at src\Ouroboros\Network\ConnectionManager\Core.hs:702:17 in ouroboros-network-framework-0.1.0.0-inplace:Ouroboros.Network.ConnectionManager.Core
```
2,040 | 4,847,649,850 | IssuesEvent | 2016-11-10 15:30:07 | Alfresco/alfresco-ng2-components | https://api.github.com/repos/Alfresco/alfresco-ng2-components | opened | date picker displays behind start proces dialog | browser: chrome bug comp: activiti-processList | Include date widget within form attached to start event and click date picker.
Fine in Firefox and Safari, issue only in Chrome.
<img width="729" alt="screen shot 2016-11-10 at 15 27 29" src="https://cloud.githubusercontent.com/assets/13200338/20182669/72689f82-a75a-11e6-908e-1b90799240bf.png">
8,073 | 11,251,374,439 | IssuesEvent | 2020-01-11 00:08:48 | googleapis/java-recaptchaenterprise | https://api.github.com/repos/googleapis/java-recaptchaenterprise | opened | Promote to GA | type: process | Package name: **google-cloud-recaptchaenterprise**
Current release: **beta**
Proposed release: **GA**
## Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
## Required
- [ ] 28 days elapsed since last beta release with new API surface
- [ ] Server API is GA
- [ ] Package API is stable, and we can commit to backward compatibility
- [ ] All dependencies are GA
## Optional
- [ ] Most common / important scenarios have descriptive samples
- [ ] Public manual methods have at least one usage sample each (excluding overloads)
- [ ] Per-API README includes a full description of the API
- [ ] Per-API README contains at least one “getting started” sample using the most common API scenario
- [ ] Manual code has been reviewed by API producer
- [ ] Manual code has been reviewed by a DPE responsible for samples
- [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
17,008 | 22,386,211,235 | IssuesEvent | 2022-06-17 00:51:00 | figlesias221/ProyectoDevOps_Grupo3_IglesiasPerezMolinoloJuan | https://api.github.com/repos/figlesias221/ProyectoDevOps_Grupo3_IglesiasPerezMolinoloJuan | closed | Review Alta punto de carga | task process | Cada product owner debe seguir como guía los escenarios descritos en Gherkin
Esfuerzo en HS-P (por persona):
Estimado: 1
Real: 1 (@figlesias221 )
17,261 | 23,043,322,203 | IssuesEvent | 2022-07-23 13:52:47 | andrewzah/openbook | https://api.github.com/repos/andrewzah/openbook | opened | create indices for appendix | enhancement rust-preprocessor | At the end of the openbook, sort songs by:
- [ ] composer
- [ ] lyricist
- [ ] year published
- [ ] meter
- [ ] tempo
4,779 | 7,653,826,184 | IssuesEvent | 2018-05-10 06:35:20 | hmacphail/pokemon-evolution | https://api.github.com/repos/hmacphail/pokemon-evolution | reopened | Create seeders for populating pokemon data | database dev process next generation | Use seeders instead of admin populating form views. This will mean the pokemon are programmatically added (probably with separate json static files, especially larger datasets).
Should also use [pokeapi](https://pokeapi.co/) to easily grab json files for learnsets, moves, and other large datasets
40,849 | 5,276,871,562 | IssuesEvent | 2017-02-07 00:51:00 | elegantthemes/Divi-Beta | https://api.github.com/repos/elegantthemes/Divi-Beta | closed | Detect Failed Save & Launch Failure Modal | !IMPORTANT DESIGN SIGNOFF FEATURE QUALITY ASSURED READY FOR REVIEW | Would it be possible to detect JS and HTTP errors when performing a save in the Visual Builder? We still have a lot of people that can't save, and what's worse is that it's not always apparent that saving is failing. Usually there is an error involved with the admin ajax request, and if we could detect that when saving and launch a custom error message that would be great. We could launch the standard failure modal, or launch something a bit more specific.
## Attached PR
- https://github.com/elegantthemes/submodule-builder/pull/1622
4,774 | 7,642,079,821 | IssuesEvent | 2018-05-08 08:02:28 | Bw2801/environment | https://api.github.com/repos/Bw2801/environment | opened | Allow specific addresses only for internal connections | enhancement processor | Allow the configuration of specific ip addresses or subnetworks to limit the addresses the processor listens to.
22,352 | 31,028,515,795 | IssuesEvent | 2023-08-10 10:48:04 | raycast/extensions | https://api.github.com/repos/raycast/extensions | closed | [Kill Process] Adding more details about homonym processes like Activity Monitor | feature request extension status: stalled extension: kill-process | ### Extension
https://www.raycast.com/rolandleth/kill-process
### Description
When I search "QuickLook" with your extension, I have 3 processes with the same name: QuickLookUIService.
Compared to Activity Monitor, I don't have any more precision about them. In Activity Monitor I have some information in parenthesis:
- 'Finder';
- 'Open and Save Panel Service (Safari)';
- 'Open and Save Panel Service (com.apple.Safari.SandboxBroker (Safari))'.
Could you add these precisions?
### Who will benefit from this feature?
Everyone.
### Anything else?
Thank you for your work and your nice extension.
66,529 | 3,255,138,224 | IssuesEvent | 2015-10-20 06:41:05 | Apollo-Community/ApolloStation | https://api.github.com/repos/Apollo-Community/ApolloStation | opened | Admin Tools | priority: low suggestion | New tools to help admins better administrate without all of the current hassle.
Administration
* Grouping Panel
* Player grouping: Players are assigned into groups when they first join the server. Staff only see logs for groups assigned to them.
* Moderators automatically assigned to watch X number of groups, splitting them up evenly among all active mods. AFK mods won't be counted
* Player marking: Mark a someone as a player of interest, which will highlight any logs caused by them
Fun
* RTS-style admin control for clientless mobs. Left-click to select a mob, then using hotkeys, they can make the mob do various things.
| 1.0 | Admin Tools - New tools to help admins better administrate without all of the current hassle.
Administration
* Grouping Panel
* Player grouping: Players are assigned into groups when they first join the server. Staff only see logs for groups assigned to them.
* Moderators automatically assigned to watch X number of groups, splitting them up evenly among all active mods. AFK mods won't be counted
* Player marking: Mark a someone as a player of interest, which will highlight any logs caused by them
Fun
* RTS-style admin control for clientless mobs. Left-click to select a mob, then using hotkeys, they can make the mob do various things.
| non_process | admin tools new tools to help admins better administrate without all of the current hassle administration grouping panel player grouping players are assigned into groups when they first join the server staff only see logs for groups assigned to them moderators automatically assigned to watch x number of groups splitting them up evenly among all active mods afk mods won t be counted player marking mark a someone as a player of interest which will highlight any logs caused by them fun rts style admin control for clientless mobs left click to select a mob then using hotkeys they can make the mob do various things | 0 |
49,479 | 6,027,767,791 | IssuesEvent | 2017-06-08 14:26:42 | golang/go | https://api.github.com/repos/golang/go | closed | misc/cgo: segmentation fault flake on darwin-amd64 | NeedsInvestigation Testing | https://storage.googleapis.com/go-build-log/7da47d96/darwin-amd64-10_11_d625077a.log
```
##### Testing race detector
ok runtime/race 3.387s
ok flag 1.022s
ok os 1.087s
ok os/exec 3.103s
PASS
scatter = 0x41e9160
hello from C
sqrt is: 0
ok _/private/var/folders/dx/k53rs1s93538b4x20g46cj_w0000gn/T/workdir/go/misc/cgo/test 2.608s
ok flag 1.081s
ok os/exec 3.203s
##### ../misc/cgo/testso
##### ../misc/cgo/testsovar
##### ../misc/cgo/testcarchive
--- FAIL: TestInstall (3.22s)
carchive_test.go:152: [clang -fPIC -m64 -pthread -fno-caret-diagnostics -Qunused-arguments -fmessage-length=0 -fdebug-prefix-map=/var/folders/dx/k53rs1s93538b4x20g46cj_w0000gn/T/go-build737724035=/tmp/go-build -gno-record-gcc-switches -fno-common -framework CoreFoundation -framework Foundation -I pkg/darwin_amd64 -o ./testp1 main.c main_unix.c pkg/darwin_amd64/libgo.a]
carchive_test.go:161:
carchive_test.go:162: signal: segmentation fault
FAIL
2017/05/10 05:01:02 Failed: exit status 1
```
Seen in https://go-review.googlesource.com/c/43131/, which is a tiny and seemingly safe patch.
/cc @josharian - potentially related to your recent concurrent compiler work, since we haven't seen this flake before.
19,802 | 13,462,719,612 | IssuesEvent | 2020-09-09 16:28:16 | ArctosDB/arctos | https://api.github.com/repos/ArctosDB/arctos | opened | Print List Reports not on Report Manager list | Bug Error Messages Priority-High Type-Infrastructure function-Reports | Why is the report manager list (CFR list to edit handlers) different than the print list? There are labels in the print list which are not on the report manager list so I can edit them? I am checking TEST and they are not in that CFR management list there either. (I see now that they share the reports server so it's kinda a non-issue in test v prod). I cannot update the reports without being able to access them. Uploading a new copy works however.
Print List:

Also when I click on ```Manage Reports``` above, I get an error:
```
The following information is meant for the website developer for debugging purposes.
--
Error Occurred While Processing Request
Variable AUTH_KEY is undefined. The error occurred in /usr/local/httpd/htdocs/internal/reporter/reporter.cfm: line 273 Called from /usr/local/httpd/htdocs/internal/reporter/reporter.cfm: line 268 Called from /usr/local/httpd/htdocs/internal/reporter/reporter.cfm: line 52 Called from /usr/local/httpd/htdocs/internal/reporter/reporter.cfm: line 1 271 : <form name="n" method="post" enctype="multipart/form-data" action="reporter.cfm"> 272 : <input type="hidden" name="action" value="loadTemplate"> 273 : <input type="hidden" name="auth_key" value="#auth_key#"> 274 : <input type="file" name="FiletoUpload" id="FiletoUpload" size="45"> 275 : <input type="submit" class="savBtn" value="Upload File"> Resources: Check the ColdFusion documentation to verify that you are using the correct syntax. Search the Knowledge Base to find a solution to your problem. Browser Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13; rv:80.0) Gecko/20100101 Firefox/80.0 Remote Address 129.114.52.18 Referrer http://reports.arctos.database.museum/reporter/report_printer.cfm?auth_key=0ED38597-46FF-444A-AF1DF463C517B7B9&table_name=temp%5Fcache%2Ess%5Fmkoo%5F20200909110918375%5F606&sort=guid Date/Time 09-Sep-20 11:19 AM Stack Trace at cfreporter2ecfm76796494._factor5(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:273) at cfreporter2ecfm76796494._factor9(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:268) at cfreporter2ecfm76796494._factor10(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:52) at cfreporter2ecfm76796494.runPage(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:1) coldfusion.runtime.UndefinedVariableException: Variable AUTH_KEY is undefined. 
at coldfusion.runtime.CfJspPage._get(CfJspPage.java:390) at coldfusion.runtime.CfJspPage._get(CfJspPage.java:352) at coldfusion.runtime.CfJspPage._autoscalarize(CfJspPage.java:1462) at cfreporter2ecfm76796494._factor5(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:273) at cfreporter2ecfm76796494._factor9(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:268) at cfreporter2ecfm76796494._factor10(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:52) at cfreporter2ecfm76796494.runPage(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:1) at coldfusion.runtime.CfJspPage.invoke(CfJspPage.java:244) at coldfusion.tagext.lang.IncludeTag.doStartTag(IncludeTag.java:444) at coldfusion.filter.CfincludeFilter.invoke(CfincludeFilter.java:65) at coldfusion.filter.IpFilter.invoke(IpFilter.java:64) at coldfusion.filter.ApplicationFilter.invoke(ApplicationFilter.java:422) at coldfusion.filter.RequestMonitorFilter.invoke(RequestMonitorFilter.java:48) at coldfusion.filter.MonitoringFilter.invoke(MonitoringFilter.java:40) at coldfusion.filter.PathFilter.invoke(PathFilter.java:112) at coldfusion.filter.ExceptionFilter.invoke(ExceptionFilter.java:94) at coldfusion.filter.ClientScopePersistenceFilter.invoke(ClientScopePersistenceFilter.java:28) at coldfusion.filter.BrowserFilter.invoke(BrowserFilter.java:38) at coldfusion.filter.NoCacheFilter.invoke(NoCacheFilter.java:46) at coldfusion.filter.GlobalsFilter.invoke(GlobalsFilter.java:38) at coldfusion.filter.DatasourceFilter.invoke(DatasourceFilter.java:22) at coldfusion.filter.CachingFilter.invoke(CachingFilter.java:62) at coldfusion.CfmServlet.service(CfmServlet.java:219) at coldfusion.bootstrap.BootstrapServlet.service(BootstrapServlet.java:89) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210) at 
coldfusion.monitor.event.MonitoringServletFilter.doFilter(MonitoringServletFilter.java:42) at coldfusion.bootstrap.BootstrapFilter.doFilter(BootstrapFilter.java:46) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:224) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:169) at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98) at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:928) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:414) at org.apache.coyote.ajp.AjpProcessor.process(AjpProcessor.java:204) at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:539) at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:298) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603) at java.lang.Thread.run(Thread.java:722) | Variable AUTH_KEY is undefined. 
| | | The error occurred in /usr/local/httpd/htdocs/internal/reporter/reporter.cfm: line 273 Called from /usr/local/httpd/htdocs/internal/reporter/reporter.cfm: line 268 Called from /usr/local/httpd/htdocs/internal/reporter/reporter.cfm: line 52 Called from /usr/local/httpd/htdocs/internal/reporter/reporter.cfm: line 1 | 271 : <form name="n" method="post" enctype="multipart/form-data" action="reporter.cfm"> 272 : <input type="hidden" name="action" value="loadTemplate"> 273 : <input type="hidden" name="auth_key" value="#auth_key#"> 274 : <input type="file" name="FiletoUpload" id="FiletoUpload" size="45"> 275 : <input type="submit" class="savBtn" value="Upload File"> | | Resources: Check the ColdFusion documentation to verify that you are using the correct syntax. Search the Knowledge Base to find a solution to your problem. | Browser Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13; rv:80.0) Gecko/20100101 Firefox/80.0 Remote Address 129.114.52.18 Referrer http://reports.arctos.database.museum/reporter/report_printer.cfm?auth_key=0ED38597-46FF-444A-AF1DF463C517B7B9&table_name=temp%5Fcache%2Ess%5Fmkoo%5F20200909110918375%5F606&sort=guid Date/Time 09-Sep-20 11:19 AM | Browser | Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13; rv:80.0) Gecko/20100101 Firefox/80.0 | Remote Address | 129.114.52.18 | Referrer | http://reports.arctos.database.museum/reporter/report_printer.cfm?auth_key=0ED38597-46FF-444A-AF1DF463C517B7B9&table_name=temp%5Fcache%2Ess%5Fmkoo%5F20200909110918375%5F606&sort=guid | Date/Time | 09-Sep-20 11:19 AM | Stack Trace | at cfreporter2ecfm76796494._factor5(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:273) at cfreporter2ecfm76796494._factor9(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:268) at cfreporter2ecfm76796494._factor10(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:52) at cfreporter2ecfm76796494.runPage(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:1) coldfusion.runtime.UndefinedVariableException: Variable AUTH_KEY is 
undefined. at coldfusion.runtime.CfJspPage._get(CfJspPage.java:390) at coldfusion.runtime.CfJspPage._get(CfJspPage.java:352) at coldfusion.runtime.CfJspPage._autoscalarize(CfJspPage.java:1462) at cfreporter2ecfm76796494._factor5(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:273) at cfreporter2ecfm76796494._factor9(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:268) at cfreporter2ecfm76796494._factor10(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:52) at cfreporter2ecfm76796494.runPage(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:1) at coldfusion.runtime.CfJspPage.invoke(CfJspPage.java:244) at coldfusion.tagext.lang.IncludeTag.doStartTag(IncludeTag.java:444) at coldfusion.filter.CfincludeFilter.invoke(CfincludeFilter.java:65) at coldfusion.filter.IpFilter.invoke(IpFilter.java:64) at coldfusion.filter.ApplicationFilter.invoke(ApplicationFilter.java:422) at coldfusion.filter.RequestMonitorFilter.invoke(RequestMonitorFilter.java:48) at coldfusion.filter.MonitoringFilter.invoke(MonitoringFilter.java:40) at coldfusion.filter.PathFilter.invoke(PathFilter.java:112) at coldfusion.filter.ExceptionFilter.invoke(ExceptionFilter.java:94) at coldfusion.filter.ClientScopePersistenceFilter.invoke(ClientScopePersistenceFilter.java:28) at coldfusion.filter.BrowserFilter.invoke(BrowserFilter.java:38) at coldfusion.filter.NoCacheFilter.invoke(NoCacheFilter.java:46) at coldfusion.filter.GlobalsFilter.invoke(GlobalsFilter.java:38) at coldfusion.filter.DatasourceFilter.invoke(DatasourceFilter.java:22) at coldfusion.filter.CachingFilter.invoke(CachingFilter.java:62) at coldfusion.CfmServlet.service(CfmServlet.java:219) at coldfusion.bootstrap.BootstrapServlet.service(BootstrapServlet.java:89) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210) at 
coldfusion.monitor.event.MonitoringServletFilter.doFilter(MonitoringServletFilter.java:42) at coldfusion.bootstrap.BootstrapFilter.doFilter(BootstrapFilter.java:46) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:224) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:169) at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98) at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:928) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:414) at org.apache.coyote.ajp.AjpProcessor.process(AjpProcessor.java:204) at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:539) at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:298) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603) at java.lang.Thread.run(Thread.java:722)
Variable AUTH_KEY is undefined.
The error occurred in /usr/local/httpd/htdocs/internal/reporter/reporter.cfm: line 273 Called from /usr/local/httpd/htdocs/internal/reporter/reporter.cfm: line 268 Called from /usr/local/httpd/htdocs/internal/reporter/reporter.cfm: line 52 Called from /usr/local/httpd/htdocs/internal/reporter/reporter.cfm: line 1
271 : <form name="n" method="post" enctype="multipart/form-data" action="reporter.cfm"> 272 : <input type="hidden" name="action" value="loadTemplate"> 273 : <input type="hidden" name="auth_key" value="#auth_key#"> 274 : <input type="file" name="FiletoUpload" id="FiletoUpload" size="45"> 275 : <input type="submit" class="savBtn" value="Upload File">
Resources: Check the ColdFusion documentation to verify that you are using the correct syntax. Search the Knowledge Base to find a solution to your problem.
Browser | Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13; rv:80.0) Gecko/20100101 Firefox/80.0
Remote Address | 129.114.52.18
Referrer | http://reports.arctos.database.museum/reporter/report_printer.cfm?auth_key=0ED38597-46FF-444A-AF1DF463C517B7B9&table_name=temp%5Fcache%2Ess%5Fmkoo%5F20200909110918375%5F606&sort=guid
Date/Time | 09-Sep-20 11:19 AM
Stack Trace
at cfreporter2ecfm76796494._factor5(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:273)
at cfreporter2ecfm76796494._factor9(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:268)
at cfreporter2ecfm76796494._factor10(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:52)
at cfreporter2ecfm76796494.runPage(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:1)
coldfusion.runtime.UndefinedVariableException: Variable AUTH_KEY is undefined.
at coldfusion.runtime.CfJspPage._get(CfJspPage.java:390)
at coldfusion.runtime.CfJspPage._get(CfJspPage.java:352)
at coldfusion.runtime.CfJspPage._autoscalarize(CfJspPage.java:1462)
at cfreporter2ecfm76796494._factor5(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:273)
at cfreporter2ecfm76796494._factor9(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:268)
at cfreporter2ecfm76796494._factor10(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:52)
at cfreporter2ecfm76796494.runPage(/usr/local/httpd/htdocs/internal/reporter/reporter.cfm:1)
at coldfusion.runtime.CfJspPage.invoke(CfJspPage.java:244)
at coldfusion.tagext.lang.IncludeTag.doStartTag(IncludeTag.java:444)
at coldfusion.filter.CfincludeFilter.invoke(CfincludeFilter.java:65)
at coldfusion.filter.IpFilter.invoke(IpFilter.java:64)
at coldfusion.filter.ApplicationFilter.invoke(ApplicationFilter.java:422)
at coldfusion.filter.RequestMonitorFilter.invoke(RequestMonitorFilter.java:48)
at coldfusion.filter.MonitoringFilter.invoke(MonitoringFilter.java:40)
at coldfusion.filter.PathFilter.invoke(PathFilter.java:112)
at coldfusion.filter.ExceptionFilter.invoke(ExceptionFilter.java:94)
at coldfusion.filter.ClientScopePersistenceFilter.invoke(ClientScopePersistenceFilter.java:28)
at coldfusion.filter.BrowserFilter.invoke(BrowserFilter.java:38)
at coldfusion.filter.NoCacheFilter.invoke(NoCacheFilter.java:46)
at coldfusion.filter.GlobalsFilter.invoke(GlobalsFilter.java:38)
at coldfusion.filter.DatasourceFilter.invoke(DatasourceFilter.java:22)
at coldfusion.filter.CachingFilter.invoke(CachingFilter.java:62)
at coldfusion.CfmServlet.service(CfmServlet.java:219)
at coldfusion.bootstrap.BootstrapServlet.service(BootstrapServlet.java:89)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at coldfusion.monitor.event.MonitoringServletFilter.doFilter(MonitoringServletFilter.java:42)
at coldfusion.bootstrap.BootstrapFilter.doFilter(BootstrapFilter.java:46)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:224)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:169)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:928)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:414)
at org.apache.coyote.ajp.AjpProcessor.process(AjpProcessor.java:204)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:539)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:298)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)
```
237,919 | 18,172,447,593 | IssuesEvent | 2021-09-27 21:41:44 | girlscript/winter-of-contributing | https://api.github.com/repos/girlscript/winter-of-contributing | reopened | How NPM and Yarn works. | documentation GWOC21 Assigned Frontend Dev React/Angular/Vue | ### Description
If it's not in progress, I would like to write documentation about how NPM works and the difference between NPM and Yarn.
It's a task located in week 1 in the React folder.
Feel free to assign me to this issue if it's free.
### Domain
Frontend Dev React/Angular/Vue
### Type of Contribution
Documentation
### Code of Conduct
- [X] I follow [Contributing Guidelines](https://github.com/girlscript/winter-of-contributing/blob/main/.github/CONTRIBUTING.md) & [Code of conduct](https://github.com/girlscript/winter-of-contributing/blob/main/.github/CODE_OF_CONDUCT.md) of this project.
13,389 | 15,865,832,789 | IssuesEvent | 2021-04-08 15:07:17 | COPIM/open-book-collective | https://api.github.com/repos/COPIM/open-book-collective | opened | Monitor membership details | membership management (pillar 4) organisational process userstory | As the marketing person for a publisher or collective (e.g. OBP or ScholarLed) using the platform
... I want to be able to monitor membership details, invoicing dates, etc.
... so that I can administer the collective agreements.
21,909 | 30,439,562,457 | IssuesEvent | 2023-07-15 00:14:34 | winter-telescope/mirar | https://api.github.com/repos/winter-telescope/mirar | closed | [BUG] MultiExtParser does not actually pass on the images it loads | bug nearfuture processors | **Describe the bug**
The MultiExtParser loads up MEF FITS files but does not produce an image batch containing these frames.
11,800 | 14,625,400,051 | IssuesEvent | 2020-12-23 08:27:32 | DevExpress/testcafe-hammerhead | https://api.github.com/repos/DevExpress/testcafe-hammerhead | closed | Wrong handler is called for unauthorized cross-domain request | AREA: client FREQUENCY: level 1 SYSTEM: client side processing TYPE: bug | For an unauthorized cross-domain request (one that doesn't pass the CORS rules), we call the load handler instead of the error handler.
Script to reproduce:
With proxying, the browser console will contain the 'loadHandler is called' message.
Without proxying, it will contain 'errorHandler is called'.
```js
var xhr = new XMLHttpRequest();
// http://cross-domain.com/ should not respond with CORS headers.
xhr.open('GET', 'http://cross-domain.com/', true);
function errorHandler () {
console.log('errorHandler is called');
}
function loadHandler () {
console.log('loadHandler is called');
}
xhr.addEventListener('load', loadHandler);
xhr.addEventListener('abort', errorHandler);
xhr.addEventListener('error', errorHandler);
xhr.addEventListener('timeout', errorHandler);
xhr.send();
```
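The behavior above can also be checked from the response status rather than from which event fires. The sketch below is illustrative only — the `classifyXhrResult` helper is hypothetical and not part of testcafe-hammerhead's API; it relies on the fact that browsers report `xhr.status === 0` for requests that fail at the network/CORS layer, while a genuine cross-origin success carries a real HTTP status:

```js
// Hypothetical helper (illustrative only, not testcafe-hammerhead API):
// classify an XHR outcome from its HTTP status. A request blocked by CORS
// is reported with status 0, while a genuine success carries a real code.
function classifyXhrResult (xhr) {
    return xhr.status === 0 ? 'error' : 'load';
}

// Plain objects stand in for XMLHttpRequest instances here:
console.log(classifyXhrResult({ status: 0 }));   // 'error' (CORS/network failure)
console.log(classifyXhrResult({ status: 200 })); // 'load'  (real response)
```

A check like this can tell the two cases apart even when the proxied request wrongly fires the `load` event.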
22,332 | 30,921,815,794 | IssuesEvent | 2023-08-06 01:47:43 | mmattDonk/AI-TTS-Donations | https://api.github.com/repos/mmattDonk/AI-TTS-Donations | closed | [SOLP-24] api request to trigger a tts for the public (?) | @solrock/processor processor Improvement | surely i could explain this better but i mean i think it makes sense
<sub>From [SyncLinear.com](https://synclinear.com) | [SOLP-24](https://linear.app/donk/issue/SOLP-24/api-request-to-trigger-a-tts-for-the-public)</sub>
7,385 | 10,515,315,934 | IssuesEvent | 2019-09-28 08:46:43 | sysown/proxysql | https://api.github.com/repos/sysown/proxysql | closed | Add a global variable to define how often the maintenance loop needs to be executed | ADMIN CONNECTION POOL QUERY PROCESSOR | This is a note to myself.
Will add more details later.
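One possible shape for this — a sketch only, where the variable name `admin-maintenance_interval_ms` and both identifiers below are hypothetical and do not reflect ProxySQL's actual Admin variables or internals — is a loop that re-reads a runtime variable before every sleep, so changing the variable takes effect without a restart:

```js
// Hypothetical sketch — names do not mirror ProxySQL's real Admin variables.
// The maintenance loop recomputes its sleep interval from a runtime
// variable on every pass.
var runtimeVariables = { 'admin-maintenance_interval_ms': 1000 };

function scheduleMaintenanceRuns (startMs, iterations) {
    // Returns the timestamps at which maintenance would run, re-reading
    // the interval variable before each iteration.
    var runs = [];
    var now = startMs;
    for (var i = 0; i < iterations; i++) {
        now += runtimeVariables['admin-maintenance_interval_ms'];
        runs.push(now);
    }
    return runs;
}

console.log(scheduleMaintenanceRuns(0, 3)); // [ 1000, 2000, 3000 ]
runtimeVariables['admin-maintenance_interval_ms'] = 250; // tuned at runtime
console.log(scheduleMaintenanceRuns(0, 2)); // [ 250, 500 ]
```

Re-reading the variable each iteration (rather than caching it at startup) is what makes the interval tunable while the process is running.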
113,733 | 4,567,898,543 | IssuesEvent | 2016-09-15 12:54:09 | PowerlineApp/powerline-mobile | https://api.github.com/repos/PowerlineApp/powerline-mobile | closed | Push notification on UserPetition Created by Followed User | P2 - Medium Priority question | > #226 Petition Created by Followed User: Main Avatar (Post Author / Followed User avatar), Small Avatar (Powerline logo), Title ("FirstName LastName Petition"), Message ("PetitionText") where PetitionText is up to 300 characters long. Buttons: Sign, Ignore
@jterps08, what should happen when the Ignore button is clicked?
Currently a UserPetition has the action Sign (and Unsign). Ignore is not used in the mobile app for UserPetitions.
455,806 | 13,132,677,766 | IssuesEvent | 2020-08-06 19:23:04 | googleapis/nodejs-grafeas | https://api.github.com/repos/googleapis/nodejs-grafeas | closed | Synthesis failed for nodejs-grafeas | autosynth failure priority: p1 type: bug | Hello! Autosynth couldn't regenerate nodejs-grafeas. :broken_heart:
Here's the output from running `synth.py`:
```
bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:42:1
DEBUG: Rule 'gapic_generator_python' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "8d11f06b408ac5f1c01da3ca17f3a75dc008831509c5c1a4f24f9bde37792a57"
DEBUG: Call stack for the definition of repository 'gapic_generator_python' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:224:1
DEBUG: Rule 'com_googleapis_gapic_generator_go' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "12bfed7873f085093cd60615bd113178ecf36396af0c2ca25e6cd4d4bebdd198"
DEBUG: Call stack for the definition of repository 'com_googleapis_gapic_generator_go' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:261:1
DEBUG: Rule 'gapic_generator_typescript' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "ca322b5e7b0d03b3cc44a90444e3a7f944c9ba3345f0505ee48c8e715d19dd95"
DEBUG: Call stack for the definition of repository 'gapic_generator_typescript' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:280:1
DEBUG: Rule 'gapic_generator_csharp' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "40ddae63d2729ef5ccbd8b60123327ea200ce9400d0629238193ff530dcaea18"
DEBUG: Call stack for the definition of repository 'gapic_generator_csharp' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:318:1
DEBUG: Rule 'bazel_skylib' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "1dde365491125a3db70731e25658dfdd3bc5dbdfd11b840b3e987ecf043c7ca0"
DEBUG: Call stack for the definition of repository 'bazel_skylib' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:35:1
Analyzing: target //grafeas/v1:grafeas-v1-nodejs (1 packages loaded, 0 targets configured)
INFO: Call stack for the definition of repository 'npm' which is a yarn_install (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/build_bazel_rules_nodejs/internal/npm_install/npm_install.bzl:411:16):
- <builtin>
- /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/build_bazel_rules_nodejs/index.bzl:87:5
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:293:1
ERROR: An error occurred during the fetch of repository 'npm':
yarn_install failed: yarn install v1.19.1
[1/5] Validating package.json...
[2/5] Resolving packages...
[3/5] Fetching packages...
info If you think this is a bug, please open a bug report with the information provided in "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_typescript/yarn-error.log".
info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.
(error An unexpected error occurred: "https://registry.yarnpkg.com/@grpc/proto-loader/-/proto-loader-0.5.5.tgz: Request failed \"404 Not Found\"".
)
ERROR: /home/kbuilder/.cache/synthtool/googleapis/grafeas/v1/BUILD.bazel:235:1: //grafeas/v1:grafeas_nodejs_gapic depends on @gapic_generator_typescript//:protoc_plugin in repository @gapic_generator_typescript which failed to fetch. no such package '@npm//@bazel/typescript': yarn_install failed: yarn install v1.19.1
[1/5] Validating package.json...
[2/5] Resolving packages...
[3/5] Fetching packages...
info If you think this is a bug, please open a bug report with the information provided in "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_typescript/yarn-error.log".
info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.
(error An unexpected error occurred: "https://registry.yarnpkg.com/@grpc/proto-loader/-/proto-loader-0.5.5.tgz: Request failed \"404 Not Found\"".
)
ERROR: Analysis of target '//grafeas/v1:grafeas-v1-nodejs' failed; build aborted: no such package '@npm//@bazel/typescript': yarn_install failed: yarn install v1.19.1
[1/5] Validating package.json...
[2/5] Resolving packages...
[3/5] Fetching packages...
info If you think this is a bug, please open a bug report with the information provided in "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_typescript/yarn-error.log".
info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.
(error An unexpected error occurred: "https://registry.yarnpkg.com/@grpc/proto-loader/-/proto-loader-0.5.5.tgz: Request failed \"404 Not Found\"".
)
INFO: Elapsed time: 1.115s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (1 packages loaded, 13 targets configured)
FAILED: Build did NOT complete successfully (1 packages loaded, 13 targets configured)
2020-08-06 04:16:10,777 synthtool [DEBUG] > Wrote metadata to synth.metadata.
DEBUG:synthtool:Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/kbuilder/.cache/synthtool/nodejs-grafeas/synth.py", line 30, in <module>
library = gapic.node_library('grafeas', version, bazel_target=f'//grafeas/{version}:grafeas-{version}-nodejs')
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in node_library
return self._generate_code(service, version, "nodejs", **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 183, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//grafeas/v1:grafeas-v1-nodejs']' returned non-zero exit status 1.
2020-08-06 04:16:10,826 autosynth [ERROR] > Synthesis failed
2020-08-06 04:16:10,826 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 6eb454e chore: delete Node 8 presubmit tests (#178)
2020-08-06 04:16:10,832 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2020-08-06 04:16:10,837 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 690, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 539, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 670, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 375, in synthesize_loop
has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 273, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
```
Google internal developers can see the full log [here](http://sponge2/results/invocations/76bb7f6f-4d47-4888-b0d2-a8761a276cc8/targets/github%2Fsynthtool;config=default/tests;query=nodejs-grafeas;failed=false).
subprocess calledprocesserror command returned non zero exit status autosynth synthesis failed autosynth running git reset hard head head is now at chore delete node presubmit tests autosynth running git checkout autosynth switched to branch autosynth autosynth running git clean fdx removing pycache traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src github synthtool autosynth synth py line in main file tmpfs src github synthtool autosynth synth py line in main return inner main temp dir file tmpfs src github synthtool autosynth synth py line in inner main commit count synthesize loop x multiple prs change pusher synthesizer file tmpfs src github synthtool autosynth synth py line in synthesize loop has changes toolbox synthesize version in new branch synthesizer youngest file tmpfs src github synthtool autosynth synth py line in synthesize version in new branch synthesizer synthesize synth log path self environ file tmpfs src github synthtool autosynth synthesizer py line in synthesize synth proc check returncode raise an exception file home kbuilder pyenv versions lib subprocess py line in check returncode self stderr subprocess calledprocesserror command returned non zero exit status google internal developers can see the full log | 0 |
37,660 | 8,345,767,375 | IssuesEvent | 2018-10-01 05:16:50 | Microsoft/vscode | https://api.github.com/repos/Microsoft/vscode | closed | Could the QuickFix Widget look as cool as the IntelliSense autocompletion dropdown? | editor editor-code-actions needs more info | It may be due to some technical limitations, but I wonder if the Quick Fix widget - as bland as it is - could get a face-lift. It's not the best for dark skins, and it might as well use the same keyboard shortcuts as the IntelliSense autocompletion dropdown. I tried to find the HTML view of `editor.action.quickfix` but I couldn't find it.
Any help and/or info appreciated :)
| 1.0 | Could the QuickFix Widget look as cool as the IntelliSense autocompletion dropdown? - It may be due to some technical limitations, but I wonder if the Quick Fix widget - as bland as it is - could get a face-lift. It's not the best for dark skins, and it might as well use the same keyboard shortcuts as the IntelliSense autocompletion dropdown. I tried to find the HTML view of `editor.action.quickfix` but I couldn't find it.
Any help and/or info appreciated :)
| non_process | could the quickfix widget look as cool as the intellisense autocompletion dropdown it maybe due to some technical limitations but i wonder if the quick fix widget as bland as it is could get a face lift it s not the best for dark skins and it might as well use the same keyboard shortcuts as the intellisense autocompletion dropdown i tried to find the html view of editor action quickfix but i couldn t find it any help and or info appreciated | 0 |
20,202 | 26,778,359,690 | IssuesEvent | 2023-01-31 18:58:25 | hashicorp/terraform-cdk | https://api.github.com/repos/hashicorp/terraform-cdk | closed | Add more non-technical information to CONTRIBUTING.md | enhancement priority/important-longterm dev-process | <!--- Please keep this note for the community --->
### Community Note
- Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
- Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
- If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Description
I was chatting with one of HashiCorp's community managers last night and he pointed out that the contents of our `CONTRIBUTING.md` file are too technical, or rather, the info in there is fine, but we should also include a higher-level layer that talks more about our process of best practices for submitting issues, how we intend to work with community feature requests, where to ask questions, etc.
### References
Here are examples of a few other open source projects we can draw inspiration from:
- https://github.com/hashicorp/terraform/blob/main/.github/CONTRIBUTING.md
- https://github.com/hashicorp/vault/blob/main/CONTRIBUTING.md
- https://github.com/angular/angular/blob/main/CONTRIBUTING.md
- https://reactjs.org/docs/how-to-contribute.html
- https://github.com/php/php-src/blob/master/CONTRIBUTING.md | 1.0 | Add more non-technical information to CONTRIBUTING.md - <!--- Please keep this note for the community --->
### Community Note
- Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
- Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
- If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Description
I was chatting with one of HashiCorp's community managers last night and he pointed out that the contents of our `CONTRIBUTING.md` file are too technical, or rather, the info in there is fine, but we should also include a higher-level layer that talks more about our process of best practices for submitting issues, how we intend to work with community feature requests, where to ask questions, etc.
### References
Here are examples of a few other open source projects we can draw inspiration from:
- https://github.com/hashicorp/terraform/blob/main/.github/CONTRIBUTING.md
- https://github.com/hashicorp/vault/blob/main/CONTRIBUTING.md
- https://github.com/angular/angular/blob/main/CONTRIBUTING.md
- https://reactjs.org/docs/how-to-contribute.html
- https://github.com/php/php-src/blob/master/CONTRIBUTING.md | process | add more non technical information to contributing md community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or other comments that do not add relevant new information or questions they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment description i was chatting with one of hashicorp s community managers last night and he pointed out that the contents of our contributing md file are too technical or rather the info in there is fine but we should also include a higher level layer that talks more about our process of best practices for submitting issues how we intend to work with community feature requests where to ask questions etc references here are examples of a few other open source projects we can draw inspiration from | 1 |
21,366 | 29,194,080,448 | IssuesEvent | 2023-05-20 00:31:49 | devssa/onde-codar-em-salvador | https://api.github.com/repos/devssa/onde-codar-em-salvador | closed | [Remoto] Data Engineer na Coodesh | SALVADOR PJ DATA SCIENCE JAVA PYTHON REQUISITOS REMOTO PROCESSOS GITHUB INGLÊS SCALA UMA QUALIDADE NEGÓCIOS SPARK Stale | ## Descrição da vaga:
Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios.
Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/vagas/data-engineer-181939610?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋
<p> A<strong> Beltis</strong> busca pessoa <strong><ins>Data Engineer</ins></strong> para compor seu time!</p>
<p>Venha fazer parte de um projeto internacional e desafiador, atuando de maneira constante e interagindo diariamente com equipes globais.</p>
<p><strong>Responsabilidades:</strong></p>
<ul>
<li>Desenvolver, analisar e organizar dados;</li>
<li>Construir sistemas de dados e pipelines;</li>
<li>Analisar as necessidades e objetivos de negócios;</li>
<li>Explorar maneiras de melhorar a qualidade e confiabilidade dos dados;</li>
<li>Preparar dados para modelagem e apresentação.</li>
</ul>
## BELTIS TECNOLOGIA:
<p>Com mais de 20 anos de atuação no segmento de TI, adotamos uma política direcionada às pessoas. Com conhecimento de mercado, processos e tecnologia, oferecemos alta capacidade em Outsourcing de Profissionais.</p>
<p>Atuamos com diversos clientes a nível nacional, grandes players em seus segmentos que vão desde o financeiro, varejo, ensino e órgãos públicos.</p><a href='https://coodesh.com/empresas/beltis-tecnologia'>Veja mais no site</a>
## Habilidades:
- Spark
- Scala
- Java
- Python
## Local:
100% Remoto
## Requisitos:
- Experiência em Java e Scala;
- Experiência com Spark;
- Inglês Fluente;
- Disponibilidade para trabalhar nos horários das 13:00 ás 22:00 (Devido ao fuso horário).
## Diferenciais:
- Experiência em Python.
## Benefícios:
- Férias remuneradas de 15 dias após 12 meses de prestação;
- Acesso ilimitado a Udemy.
## Como se candidatar:
Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Data Engineer na BELTIS TECNOLOGIA](https://coodesh.com/vagas/data-engineer-181939610?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open)
Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação.
## Labels
#### Alocação
Remoto
#### Regime
PJ
#### Categoria
Data Science | 1.0 | [Remoto] Data Engineer na Coodesh - ## Descrição da vaga:
Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios.
Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/vagas/data-engineer-181939610?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋
<p> A<strong> Beltis</strong> busca pessoa <strong><ins>Data Engineer</ins></strong> para compor seu time!</p>
<p>Venha fazer parte de um projeto internacional e desafiador, atuando de maneira constante e interagindo diariamente com equipes globais.</p>
<p><strong>Responsabilidades:</strong></p>
<ul>
<li>Desenvolver, analisar e organizar dados;</li>
<li>Construir sistemas de dados e pipelines;</li>
<li>Analisar as necessidades e objetivos de negócios;</li>
<li>Explorar maneiras de melhorar a qualidade e confiabilidade dos dados;</li>
<li>Preparar dados para modelagem e apresentação.</li>
</ul>
## BELTIS TECNOLOGIA:
<p>Com mais de 20 anos de atuação no segmento de TI, adotamos uma política direcionada às pessoas. Com conhecimento de mercado, processos e tecnologia, oferecemos alta capacidade em Outsourcing de Profissionais.</p>
<p>Atuamos com diversos clientes a nível nacional, grandes players em seus segmentos que vão desde o financeiro, varejo, ensino e órgãos públicos.</p><a href='https://coodesh.com/empresas/beltis-tecnologia'>Veja mais no site</a>
## Habilidades:
- Spark
- Scala
- Java
- Python
## Local:
100% Remoto
## Requisitos:
- Experiência em Java e Scala;
- Experiência com Spark;
- Inglês Fluente;
- Disponibilidade para trabalhar nos horários das 13:00 ás 22:00 (Devido ao fuso horário).
## Diferenciais:
- Experiência em Python.
## Benefícios:
- Férias remuneradas de 15 dias após 12 meses de prestação;
- Acesso ilimitado a Udemy.
## Como se candidatar:
Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Data Engineer na BELTIS TECNOLOGIA](https://coodesh.com/vagas/data-engineer-181939610?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open)
Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação.
## Labels
#### Alocação
Remoto
#### Regime
PJ
#### Categoria
Data Science | process | data engineer na coodesh descrição da vaga esta é uma vaga de um parceiro da plataforma coodesh ao candidatar se você terá acesso as informações completas sobre a empresa e benefícios fique atento ao redirecionamento que vai te levar para uma url com o pop up personalizado de candidatura 👋 nbsp a beltis busca pessoa data engineer para compor seu time venha fazer parte de um projeto internacional e desafiador atuando de maneira constante e interagindo diariamente com equipes globais responsabilidades desenvolver analisar e organizar dados construir sistemas de dados e pipelines analisar as necessidades e objetivos de negócios explorar maneiras de melhorar a qualidade e confiabilidade dos dados preparar dados para modelagem e apresentação beltis tecnologia com mais de anos de atuação no segmento de ti adotamos uma política direcionada às pessoas com conhecimento de mercado processos e tecnologia oferecemos alta capacidade em outsourcing de profissionais atuamos com diversos clientes a nível nacional grandes players em seus segmentos que vão desde o financeiro varejo ensino e órgãos públicos habilidades spark scala java python local remoto requisitos experiência em java e scala experiência com spark inglês fluente disponibilidade para trabalhar nos horários das ás devido ao fuso horário diferenciais experiência em python benefícios férias remuneradas de dias após meses de prestação acesso ilimitado a udemy como se candidatar candidatar se exclusivamente através da plataforma coodesh no link a seguir após candidatar se via plataforma coodesh e validar o seu login você poderá acompanhar e receber todas as interações do processo por lá utilize a opção pedir feedback entre uma etapa e outra na vaga que se candidatou isso fará com que a pessoa recruiter responsável pelo processo na empresa receba a notificação labels alocação remoto regime pj categoria data science | 1 |
202,364 | 23,077,100,690 | IssuesEvent | 2022-07-26 01:24:52 | Hieunc-NT/eShopOnContainers | https://api.github.com/repos/Hieunc-NT/eShopOnContainers | closed | microsoft.aspnetcore.healthchecks.1.0.0.nupkg: 1 vulnerabilities (highest severity is: 7.5) - autoclosed | security vulnerability | <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>microsoft.aspnetcore.healthchecks.1.0.0.nupkg</b></p></summary>
<p></p>
<p>Path to dependency file: /src/Services/Webhooks/Webhooks.API/Webhooks.API.csproj</p>
<p>Path to vulnerable library: /home/wss-scanner/.nuget/packages/microsoft.aspnetcore.http/2.1.1/microsoft.aspnetcore.http.2.1.1.nupkg</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/Hieunc-NT/eShopOnContainers/commit/58162be7965e66c71394dab67f66ed3d7cfaaef5">58162be7965e66c71394dab67f66ed3d7cfaaef5</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2020-1045](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1045) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | microsoft.aspnetcore.http.2.1.1.nupkg | Transitive | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-1045</summary>
### Vulnerable Library - <b>microsoft.aspnetcore.http.2.1.1.nupkg</b></p>
<p>ASP.NET Core default HTTP feature implementations.</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.aspnetcore.http.2.1.1.nupkg">https://api.nuget.org/packages/microsoft.aspnetcore.http.2.1.1.nupkg</a></p>
<p>Path to dependency file: /src/Services/Webhooks/Webhooks.API/Webhooks.API.csproj</p>
<p>Path to vulnerable library: /home/wss-scanner/.nuget/packages/microsoft.aspnetcore.http/2.1.1/microsoft.aspnetcore.http.2.1.1.nupkg</p>
<p>
Dependency Hierarchy:
- microsoft.aspnetcore.healthchecks.1.0.0.nupkg (Root Library)
- microsoft.aspnetcore.hosting.2.1.1.nupkg
- :x: **microsoft.aspnetcore.http.2.1.1.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Hieunc-NT/eShopOnContainers/commit/58162be7965e66c71394dab67f66ed3d7cfaaef5">58162be7965e66c71394dab67f66ed3d7cfaaef5</a></p>
<p>Found in base branch: <b>dev</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A security feature bypass vulnerability exists in the way Microsoft ASP.NET Core parses encoded cookie names.The ASP.NET Core cookie parser decodes entire cookie strings which could allow a malicious attacker to set a second cookie with the name being percent encoded.The security update addresses the vulnerability by fixing the way the ASP.NET Core cookie parser handles encoded names., aka 'Microsoft ASP.NET Core Security Feature Bypass Vulnerability'.
<p>Publish Date: 2020-09-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1045>CVE-2020-1045</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/dotnet/announcements/issues/165">https://github.com/dotnet/announcements/issues/165</a></p>
<p>Release Date: 2020-10-02</p>
<p>Fix Resolution: Microsoft.AspNetCore.App - 2.1.22, Microsoft.AspNetCore.All - 2.1.22,Microsoft.NETCore.App - 2.1.22, Microsoft.AspNetCore.Http - 2.1.22 </p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
<!-- <REMEDIATE>[{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Nuget","packageName":"Microsoft.AspNetCore.Http","packageVersion":"2.1.1","packageFilePaths":["/src/Services/Webhooks/Webhooks.API/Webhooks.API.csproj"],"isTransitiveDependency":true,"dependencyTree":"Microsoft.AspNetCore.HealthChecks:1.0.0;Microsoft.AspNetCore.Hosting:2.1.1;Microsoft.AspNetCore.Http:2.1.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"Microsoft.AspNetCore.App - 2.1.22, Microsoft.AspNetCore.All - 2.1.22,Microsoft.NETCore.App - 2.1.22, Microsoft.AspNetCore.Http - 2.1.22 ","isBinary":false}],"baseBranches":["dev"],"vulnerabilityIdentifier":"CVE-2020-1045","vulnerabilityDetails":"A security feature bypass vulnerability exists in the way Microsoft ASP.NET Core parses encoded cookie names.The ASP.NET Core cookie parser decodes entire cookie strings which could allow a malicious attacker to set a second cookie with the name being percent encoded.The security update addresses the vulnerability by fixing the way the ASP.NET Core cookie parser handles encoded names., aka \u0027Microsoft ASP.NET Core Security Feature Bypass Vulnerability\u0027.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1045","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}]</REMEDIATE> --> | True | microsoft.aspnetcore.healthchecks.1.0.0.nupkg: 1 vulnerabilities (highest severity is: 7.5) - autoclosed - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>microsoft.aspnetcore.healthchecks.1.0.0.nupkg</b></p></summary>
<p></p>
<p>Path to dependency file: /src/Services/Webhooks/Webhooks.API/Webhooks.API.csproj</p>
<p>Path to vulnerable library: /home/wss-scanner/.nuget/packages/microsoft.aspnetcore.http/2.1.1/microsoft.aspnetcore.http.2.1.1.nupkg</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/Hieunc-NT/eShopOnContainers/commit/58162be7965e66c71394dab67f66ed3d7cfaaef5">58162be7965e66c71394dab67f66ed3d7cfaaef5</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2020-1045](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1045) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | microsoft.aspnetcore.http.2.1.1.nupkg | Transitive | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-1045</summary>
### Vulnerable Library - <b>microsoft.aspnetcore.http.2.1.1.nupkg</b></p>
<p>ASP.NET Core default HTTP feature implementations.</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.aspnetcore.http.2.1.1.nupkg">https://api.nuget.org/packages/microsoft.aspnetcore.http.2.1.1.nupkg</a></p>
<p>Path to dependency file: /src/Services/Webhooks/Webhooks.API/Webhooks.API.csproj</p>
<p>Path to vulnerable library: /home/wss-scanner/.nuget/packages/microsoft.aspnetcore.http/2.1.1/microsoft.aspnetcore.http.2.1.1.nupkg</p>
<p>
Dependency Hierarchy:
- microsoft.aspnetcore.healthchecks.1.0.0.nupkg (Root Library)
- microsoft.aspnetcore.hosting.2.1.1.nupkg
- :x: **microsoft.aspnetcore.http.2.1.1.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Hieunc-NT/eShopOnContainers/commit/58162be7965e66c71394dab67f66ed3d7cfaaef5">58162be7965e66c71394dab67f66ed3d7cfaaef5</a></p>
<p>Found in base branch: <b>dev</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A security feature bypass vulnerability exists in the way Microsoft ASP.NET Core parses encoded cookie names.The ASP.NET Core cookie parser decodes entire cookie strings which could allow a malicious attacker to set a second cookie with the name being percent encoded.The security update addresses the vulnerability by fixing the way the ASP.NET Core cookie parser handles encoded names., aka 'Microsoft ASP.NET Core Security Feature Bypass Vulnerability'.
<p>Publish Date: 2020-09-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1045>CVE-2020-1045</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/dotnet/announcements/issues/165">https://github.com/dotnet/announcements/issues/165</a></p>
<p>Release Date: 2020-10-02</p>
<p>Fix Resolution: Microsoft.AspNetCore.App - 2.1.22, Microsoft.AspNetCore.All - 2.1.22,Microsoft.NETCore.App - 2.1.22, Microsoft.AspNetCore.Http - 2.1.22 </p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
<!-- <REMEDIATE>[{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Nuget","packageName":"Microsoft.AspNetCore.Http","packageVersion":"2.1.1","packageFilePaths":["/src/Services/Webhooks/Webhooks.API/Webhooks.API.csproj"],"isTransitiveDependency":true,"dependencyTree":"Microsoft.AspNetCore.HealthChecks:1.0.0;Microsoft.AspNetCore.Hosting:2.1.1;Microsoft.AspNetCore.Http:2.1.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"Microsoft.AspNetCore.App - 2.1.22, Microsoft.AspNetCore.All - 2.1.22,Microsoft.NETCore.App - 2.1.22, Microsoft.AspNetCore.Http - 2.1.22 ","isBinary":false}],"baseBranches":["dev"],"vulnerabilityIdentifier":"CVE-2020-1045","vulnerabilityDetails":"A security feature bypass vulnerability exists in the way Microsoft ASP.NET Core parses encoded cookie names.The ASP.NET Core cookie parser decodes entire cookie strings which could allow a malicious attacker to set a second cookie with the name being percent encoded.The security update addresses the vulnerability by fixing the way the ASP.NET Core cookie parser handles encoded names., aka \u0027Microsoft ASP.NET Core Security Feature Bypass Vulnerability\u0027.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1045","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}]</REMEDIATE> --> | non_process | microsoft aspnetcore healthchecks nupkg vulnerabilities highest severity is autoclosed vulnerable library microsoft aspnetcore healthchecks nupkg path to dependency file src services webhooks webhooks api webhooks api csproj path to vulnerable library home wss scanner nuget packages microsoft aspnetcore http microsoft aspnetcore http nupkg found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available high microsoft aspnetcore http nupkg transitive n a 
details cve vulnerable library microsoft aspnetcore http nupkg asp net core default http feature implementations library home page a href path to dependency file src services webhooks webhooks api webhooks api csproj path to vulnerable library home wss scanner nuget packages microsoft aspnetcore http microsoft aspnetcore http nupkg dependency hierarchy microsoft aspnetcore healthchecks nupkg root library microsoft aspnetcore hosting nupkg x microsoft aspnetcore http nupkg vulnerable library found in head commit a href found in base branch dev vulnerability details a security feature bypass vulnerability exists in the way microsoft asp net core parses encoded cookie names the asp net core cookie parser decodes entire cookie strings which could allow a malicious attacker to set a second cookie with the name being percent encoded the security update addresses the vulnerability by fixing the way the asp net core cookie parser handles encoded names aka microsoft asp net core security feature bypass vulnerability publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution microsoft aspnetcore app microsoft aspnetcore all microsoft netcore app microsoft aspnetcore http step up your open source security game with mend istransitivedependency true dependencytree microsoft aspnetcore healthchecks microsoft aspnetcore hosting microsoft aspnetcore http isminimumfixversionavailable true minimumfixversion microsoft aspnetcore app microsoft aspnetcore all microsoft netcore app microsoft aspnetcore http isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails a security feature bypass vulnerability exists in the way microsoft 
asp net core parses encoded cookie names the asp net core cookie parser decodes entire cookie strings which could allow a malicious attacker to set a second cookie with the name being percent encoded the security update addresses the vulnerability by fixing the way the asp net core cookie parser handles encoded names aka asp net core security feature bypass vulnerability vulnerabilityurl | 0 |
249,912 | 18,858,249,911 | IssuesEvent | 2021-11-12 09:33:10 | avellinwong01/pe | https://api.github.com/repos/avellinwong01/pe | opened | DG: UML Diagram Mistakes -- class diagram | type.DocumentationBug severity.Medium | 
The protected ArrayList<String> meals should be written as "# meals: ArrayList<String>", similar to the other protected attributes in the class diagram.
<!--session: 1636704688726-e04984b2-5805-4e7d-8093-250fd60d8402-->
<!--Version: Web v3.4.1--> | 1.0 | DG: UML Diagram Mistakes -- class diagram - 
The protected ArrayList<String> meals should be written as "# meals: ArrayList<String>", similar to the other protected attributes in the class diagram.
<!--session: 1636704688726-e04984b2-5805-4e7d-8093-250fd60d8402-->
<!--Version: Web v3.4.1--> | non_process | dg uml diagram mistakes class diagram the protected arraylist meals should be written as meals arraylist similar to the other protected attributes in the class diagram | 0 |
138,648 | 11,210,491,948 | IssuesEvent | 2020-01-06 13:22:20 | pandas-dev/pandas | https://api.github.com/repos/pandas-dev/pandas | closed | BUG: concat(Series[sparse]) raises ValueError | Needs Tests Sparse good first issue | ```python
In [1]: import pandas as pd
In [2]: pd.__version__
Out[2]: '0.23.4'
In [3]: a = pd.Series(pd.SparseArray([0, 1, 2]))
In [4]: pd.concat([a, a], axis=1)
```
```pytb
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-4-a3e65bc6fb67> in <module>()
----> 1 pd.concat([a, a], axis=1)
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/reshape/concat.py in concat(objs, axis, join, join_axes, ignore_index, keys, levels, names, verify_integrity, sort, copy)
224 verify_integrity=verify_integrity,
225 copy=copy, sort=sort)
--> 226 return op.get_result()
227
228
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/reshape/concat.py in get_result(self)
398
399 index, columns = self.new_axes
--> 400 df = cons(data, index=index)
401 df.columns = columns
402 return df.__finalize__(self, method='concat')
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/frame.py in __init__(self, data, index, columns, dtype, copy)
346 dtype=dtype, copy=copy)
347 elif isinstance(data, dict):
--> 348 mgr = self._init_dict(data, index, columns, dtype=dtype)
349 elif isinstance(data, ma.MaskedArray):
350 import numpy.ma.mrecords as mrecords
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/frame.py in _init_dict(self, data, index, columns, dtype)
457 arrays = [data[k] for k in keys]
458
--> 459 return _arrays_to_mgr(arrays, data_names, index, columns, dtype=dtype)
460
461 def _init_ndarray(self, values, index, columns, dtype=None, copy=False):
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/frame.py in _arrays_to_mgr(arrays, arr_names, index, columns, dtype)
7362 axes = [_ensure_index(columns), _ensure_index(index)]
7363
-> 7364 return create_block_manager_from_arrays(arrays, arr_names, axes)
7365
7366
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/internals.py in create_block_manager_from_arrays(arrays, names, axes)
4875 return mgr
4876 except ValueError as e:
-> 4877 construction_error(len(arrays), arrays[0].shape, axes, e)
4878
4879
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/internals.py in construction_error(tot_items, block_shape, axes, e)
4837 implied = tuple(map(int, [len(ax) for ax in axes]))
4838 if passed == implied and e is not None:
-> 4839 raise e
4840 if block_shape[0] == 0:
4841 raise ValueError("Empty data passed with indices specified.")
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/internals.py in create_block_manager_from_arrays(arrays, names, axes)
4870
4871 try:
-> 4872 blocks = form_blocks(arrays, names, axes)
4873 mgr = BlockManager(blocks, axes)
4874 mgr._consolidate_inplace()
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/internals.py in form_blocks(arrays, names, axes)
4916
4917 if len(items_dict['IntBlock']):
-> 4918 int_blocks = _multi_blockify(items_dict['IntBlock'])
4919 blocks.extend(int_blocks)
4920
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/internals.py in _multi_blockify(tuples, dtype)
4993 for dtype, tup_block in grouper:
4994
-> 4995 values, placement = _stack_arrays(list(tup_block), dtype)
4996
4997 block = make_block(values, placement=placement)
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/internals.py in _stack_arrays(tuples, dtype)
5037 stacked = np.empty(shape, dtype=dtype)
5038 for i, arr in enumerate(arrays):
-> 5039 stacked[i] = _asarray_compat(arr)
5040
5041 return stacked, placement
ValueError: could not broadcast input array from shape (2) into shape (3)
```
Right now on master, we return a SparseDataFrame. I would like to instead return a DataFrame with sparse values. On master, the rule for the result type is currently "sparse if any of the inputs are sparse". I would amend that to specifically be "sparse if any of the inputs are a SparseDataFrame or SparseSeries". This will require breaking a couple places like `SparseSeries.unstack()`, unless we hack in some special cases for sparse, which I'd like to avoid. | 1.0 | BUG: concat(Series[sparse]) raises ValueError - ```python
In [1]: import pandas as pd
In [2]: pd.__version__
Out[2]: '0.23.4'
In [3]: a = pd.Series(pd.SparseArray([0, 1, 2]))
In [4]: pd.concat([a, a], axis=1)
```
```pytb
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-4-a3e65bc6fb67> in <module>()
----> 1 pd.concat([a, a], axis=1)
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/reshape/concat.py in concat(objs, axis, join, join_axes, ignore_index, keys, levels, names, verify_integrity, sort, copy)
224 verify_integrity=verify_integrity,
225 copy=copy, sort=sort)
--> 226 return op.get_result()
227
228
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/reshape/concat.py in get_result(self)
398
399 index, columns = self.new_axes
--> 400 df = cons(data, index=index)
401 df.columns = columns
402 return df.__finalize__(self, method='concat')
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/frame.py in __init__(self, data, index, columns, dtype, copy)
346 dtype=dtype, copy=copy)
347 elif isinstance(data, dict):
--> 348 mgr = self._init_dict(data, index, columns, dtype=dtype)
349 elif isinstance(data, ma.MaskedArray):
350 import numpy.ma.mrecords as mrecords
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/frame.py in _init_dict(self, data, index, columns, dtype)
457 arrays = [data[k] for k in keys]
458
--> 459 return _arrays_to_mgr(arrays, data_names, index, columns, dtype=dtype)
460
461 def _init_ndarray(self, values, index, columns, dtype=None, copy=False):
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/frame.py in _arrays_to_mgr(arrays, arr_names, index, columns, dtype)
7362 axes = [_ensure_index(columns), _ensure_index(index)]
7363
-> 7364 return create_block_manager_from_arrays(arrays, arr_names, axes)
7365
7366
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/internals.py in create_block_manager_from_arrays(arrays, names, axes)
4875 return mgr
4876 except ValueError as e:
-> 4877 construction_error(len(arrays), arrays[0].shape, axes, e)
4878
4879
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/internals.py in construction_error(tot_items, block_shape, axes, e)
4837 implied = tuple(map(int, [len(ax) for ax in axes]))
4838 if passed == implied and e is not None:
-> 4839 raise e
4840 if block_shape[0] == 0:
4841 raise ValueError("Empty data passed with indices specified.")
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/internals.py in create_block_manager_from_arrays(arrays, names, axes)
4870
4871 try:
-> 4872 blocks = form_blocks(arrays, names, axes)
4873 mgr = BlockManager(blocks, axes)
4874 mgr._consolidate_inplace()
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/internals.py in form_blocks(arrays, names, axes)
4916
4917 if len(items_dict['IntBlock']):
-> 4918 int_blocks = _multi_blockify(items_dict['IntBlock'])
4919 blocks.extend(int_blocks)
4920
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/internals.py in _multi_blockify(tuples, dtype)
4993 for dtype, tup_block in grouper:
4994
-> 4995 values, placement = _stack_arrays(list(tup_block), dtype)
4996
4997 block = make_block(values, placement=placement)
~/Envs/dask-dev/lib/python3.7/site-packages/pandas/core/internals.py in _stack_arrays(tuples, dtype)
5037 stacked = np.empty(shape, dtype=dtype)
5038 for i, arr in enumerate(arrays):
-> 5039 stacked[i] = _asarray_compat(arr)
5040
5041 return stacked, placement
ValueError: could not broadcast input array from shape (2) into shape (3)
```
Right now on master, we return a SparseDataFrame. I would like to instead return a DataFrame with sparse values. On master, the rule for the result type is currently "sparse if any of the inputs are sparse". I would amend that to specifically be "sparse if any of the inputs are a SparseDataFrame or SparseSeries". This will require breaking a couple places like `SparseSeries.unstack()`, unless we hack in some special cases for sparse, which I'd like to avoid. | non_process | bug concat series raises valueerror python in import pandas as pd in pd version out in a pd series pd sparsearray in pd concat axis pytb valueerror traceback most recent call last in pd concat axis envs dask dev lib site packages pandas core reshape concat py in concat objs axis join join axes ignore index keys levels names verify integrity sort copy verify integrity verify integrity copy copy sort sort return op get result envs dask dev lib site packages pandas core reshape concat py in get result self index columns self new axes df cons data index index df columns columns return df finalize self method concat envs dask dev lib site packages pandas core frame py in init self data index columns dtype copy dtype dtype copy copy elif isinstance data dict mgr self init dict data index columns dtype dtype elif isinstance data ma maskedarray import numpy ma mrecords as mrecords envs dask dev lib site packages pandas core frame py in init dict self data index columns dtype arrays for k in keys return arrays to mgr arrays data names index columns dtype dtype def init ndarray self values index columns dtype none copy false envs dask dev lib site packages pandas core frame py in arrays to mgr arrays arr names index columns dtype axes return create block manager from arrays arrays arr names axes envs dask dev lib site packages pandas core internals py in create block manager from arrays arrays names axes return mgr except valueerror as e construction error len arrays arrays shape axes e envs dask dev lib 
site packages pandas core internals py in construction error tot items block shape axes e implied tuple map int if passed implied and e is not none raise e if block shape raise valueerror empty data passed with indices specified envs dask dev lib site packages pandas core internals py in create block manager from arrays arrays names axes try blocks form blocks arrays names axes mgr blockmanager blocks axes mgr consolidate inplace envs dask dev lib site packages pandas core internals py in form blocks arrays names axes if len items dict int blocks multi blockify items dict blocks extend int blocks envs dask dev lib site packages pandas core internals py in multi blockify tuples dtype for dtype tup block in grouper values placement stack arrays list tup block dtype block make block values placement placement envs dask dev lib site packages pandas core internals py in stack arrays tuples dtype stacked np empty shape dtype dtype for i arr in enumerate arrays stacked asarray compat arr return stacked placement valueerror could not broadcast input array from shape into shape right now on master we return a sparsedataframe i would like to instead return a dataframe with sparse values on master the rule for the result type is currently sparse if any of the inputs are sparse i would amend that to specifically be sparse if any of the inputs are a sparsedataframe or sparseseries this will require breaking a couple places like sparseseries unstack unless we hack in some special cases for sparse which i d like to avoid | 0 |
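An aside on the pandas record above: the resolution the reporter proposes (return a plain DataFrame holding sparse values instead of a SparseDataFrame) matches the behaviour of modern pandas. A minimal sketch of the same repro, assuming pandas >= 1.0, where `SparseArray` lives under `pd.arrays` and SparseDataFrame/SparseSeries no longer exist:

```python
import pandas as pd

# Same repro as the report, on pandas >= 1.0.
a = pd.Series(pd.arrays.SparseArray([0, 1, 2]))
df = pd.concat([a, a], axis=1)

# Instead of raising, concat returns an ordinary DataFrame whose
# columns keep their sparse extension dtype.
print(df.shape)           # (3, 2)
print(df.dtypes.tolist()) # both columns report a Sparse dtype
```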
783,206 | 27,522,741,253 | IssuesEvent | 2023-03-06 16:01:13 | EnMAP-Box/enmap-box | https://api.github.com/repos/EnMAP-Box/enmap-box | closed | implement EMIT L2A sensor product import algorithm | feature request priority: high | Use testdata: \sensors\EMIT\EMIT_L2A\EMIT_L2A_RFL_001_20220815T042838_2222703_003.nc
EMIT L2 data is stored in a NetCDF container in sensor geometry, but the interleave is messed up. The column and band dimensions are wrongly ordered, so that when opening the product in QGIS, you are looking at the side of the image cube (similar to PRISMA data):

The product import algorithm should:
1. fix the interleave
2. use the geolocation arrays for proper spatial visualization
3. add center wavelength and FWHM information


| 1.0 | implement EMIT L2A sensor product import algorithm - Use testdata: \sensors\EMIT\EMIT_L2A\EMIT_L2A_RFL_001_20220815T042838_2222703_003.nc
EMIT L2 data is stored in a NetCDF container in sensor geometry, but the interleave is messed up. The column and band dimensions are wrongly ordered, so that when opening the product in QGIS, you are looking at the side of the image cube (similar to PRISMA data):

The product import algorithm should:
1. fix the interleave
2. use the geolocation arrays for proper spatial visualization
3. add center wavelength and FWHM information


| non_process | implement emit sensor product import algorithm use testdata sensors emit emit emit rfl nc emit data is stored in a netcdf container in sensor geometry but the interleave is messed up the column and band dimensions are wrongly ordered so that when opening the product in qgis you are looking at the side of the image cube similar to prisma data the product import algorithm should fix the interleave use the geolocation arrays for proper spatial visualization add center wavelength and fwhm information | 0 |
256,701 | 27,561,709,513 | IssuesEvent | 2023-03-07 22:41:28 | samqws-marketing/electronicarts_ava-capture | https://api.github.com/repos/samqws-marketing/electronicarts_ava-capture | closed | CVE-2022-37620 (High) detected in html-minifier-3.5.21.tgz - autoclosed | Mend: dependency security vulnerability | ## CVE-2022-37620 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>html-minifier-3.5.21.tgz</b></p></summary>
<p>Highly configurable, well-tested, JavaScript-based HTML minifier.</p>
<p>Library home page: <a href="https://registry.npmjs.org/html-minifier/-/html-minifier-3.5.21.tgz">https://registry.npmjs.org/html-minifier/-/html-minifier-3.5.21.tgz</a></p>
<p>Path to dependency file: /website-frontend/package.json</p>
<p>Path to vulnerable library: /website-frontend/node_modules/html-minifier/package.json</p>
<p>
Dependency Hierarchy:
- html-loader-0.5.5.tgz (Root Library)
- :x: **html-minifier-3.5.21.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/electronicarts_ava-capture/commit/a04e5f9a7ee817317d0d58ce800eefc6bf4bd150">a04e5f9a7ee817317d0d58ce800eefc6bf4bd150</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A Regular Expression Denial of Service (ReDoS) flaw was found in kangax html-minifier 4.0.0 via the candidate variable in htmlminifier.js.
<p>Publish Date: 2022-10-31
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-37620>CVE-2022-37620</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
| True | CVE-2022-37620 (High) detected in html-minifier-3.5.21.tgz - autoclosed - ## CVE-2022-37620 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>html-minifier-3.5.21.tgz</b></p></summary>
<p>Highly configurable, well-tested, JavaScript-based HTML minifier.</p>
<p>Library home page: <a href="https://registry.npmjs.org/html-minifier/-/html-minifier-3.5.21.tgz">https://registry.npmjs.org/html-minifier/-/html-minifier-3.5.21.tgz</a></p>
<p>Path to dependency file: /website-frontend/package.json</p>
<p>Path to vulnerable library: /website-frontend/node_modules/html-minifier/package.json</p>
<p>
Dependency Hierarchy:
- html-loader-0.5.5.tgz (Root Library)
- :x: **html-minifier-3.5.21.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/electronicarts_ava-capture/commit/a04e5f9a7ee817317d0d58ce800eefc6bf4bd150">a04e5f9a7ee817317d0d58ce800eefc6bf4bd150</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A Regular Expression Denial of Service (ReDoS) flaw was found in kangax html-minifier 4.0.0 via the candidate variable in htmlminifier.js.
<p>Publish Date: 2022-10-31
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-37620>CVE-2022-37620</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
| non_process | cve high detected in html minifier tgz autoclosed cve high severity vulnerability vulnerable library html minifier tgz highly configurable well tested javascript based html minifier library home page a href path to dependency file website frontend package json path to vulnerable library website frontend node modules html minifier package json dependency hierarchy html loader tgz root library x html minifier tgz vulnerable library found in head commit a href found in base branch master vulnerability details a regular expression denial of service redos flaw was found in kangax html minifier via the candidate variable in htmlminifier js publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href | 0 |
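The html-minifier advisory above describes a ReDoS: a regular expression with nested quantifiers that backtracks exponentially on a near-matching input. The pattern below is a textbook illustration of that bug class, not html-minifier's actual regex:

```python
import re

# Nested quantifiers: '(a+)+' can partition a run of 'a's in
# exponentially many ways, and a trailing non-matching character
# forces the engine to try every partition before failing.
evil = re.compile(r"^(a+)+$")

print(bool(evil.match("a" * 8)))        # True, and fast
print(bool(evil.match("a" * 8 + "b")))  # False; with ~30 'a's this
                                        # same call takes seconds
```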
766,278 | 26,876,471,803 | IssuesEvent | 2023-02-05 04:14:32 | tusen-ai/naive-ui | https://api.github.com/repos/tusen-ai/naive-ui | closed | Could a chat message component be provided? | priority: low | <!-- generated by issue-helper DO NOT REMOVE __FEATURE_REQUEST__ -->
### This function solves the problem
Could a chat component be provided? Something like the ChatMessage component that Quasar offers.
[ChatMessage](https://quasar.dev/vue-components/chat)
### Expected API
eg . https://quasar.dev/vue-components/chat
<!-- generated by issue-helper DO NOT REMOVE __FEATURE_REQUEST__ --> | 1.0 | Could a chat message component be provided? - <!-- generated by issue-helper DO NOT REMOVE __FEATURE_REQUEST__ -->
### This function solves the problem
Could a chat component be provided? Something like the ChatMessage component that Quasar offers.
[ChatMessage](https://quasar.dev/vue-components/chat)
### Expected API
eg . https://quasar.dev/vue-components/chat
<!-- generated by issue-helper DO NOT REMOVE __FEATURE_REQUEST__ --> | non_process | could a chat message component be provided this function solves the problem could a chat component be provided something like the chatmessage component that quasar offers expected api eg | 0
356,674 | 25,176,238,668 | IssuesEvent | 2022-11-11 09:30:34 | ningtan11/pe | https://api.github.com/repos/ningtan11/pe | opened | Inconsistent line breaks in help message | severity.VeryLow type.DocumentationBug | Help message that is shown when user inputs the wrong date format in add-leave has incorrect/missing line breaks, and an additional space before `3.`

<!--session: 1668155378118-6c8ca797-3769-4fe3-8a56-a7e1fa6c915e-->
<!--Version: Web v3.4.4--> | 1.0 | Inconsistent line breaks in help message - Help message that is shown when user inputs the wrong date format in add-leave has incorrect/missing line breaks, and an additional space before `3.`

<!--session: 1668155378118-6c8ca797-3769-4fe3-8a56-a7e1fa6c915e-->
<!--Version: Web v3.4.4--> | non_process | inconsistent line breaks in help message help message that is shown when user inputs the wrong date format in add leave has incorrect missing line breaks and an additional space before | 0 |
47,492 | 7,329,191,781 | IssuesEvent | 2018-03-05 03:13:11 | museumsvictoria/nodel | https://api.github.com/repos/museumsvictoria/nodel | opened | Front-End Building Blocks' shortcomings | documentation enhancement inconvenience polish | I was hoping to use the Yamaha Amp front-end as a good real-world example of what the front-end framework can do.
Some issues I've come across (see screenshot below):
* **(1)** `<link href='...'>` does not support any internal text so instead a massive title has to be used
* **(2a)** A light-weight "field name/ field value" UI element bound to an event/signal is not supported. Instead a very heavy `<Panel>` has to be used.
  * in this example, it'd be nice to have those basic text fields presented elegantly (**Firmware**, **Run Mode**, **Device Error Status**)
* **(3)** How do you layout a bank of range controls elegantly (spacing an issue; and an index label would be nice)?
* **(4)** `<Range>` is missing the current value, a bit like how the meters have (shown)
* **(5)** How do you layout a bank of On/Off switches elegantly (an index label would be nice)?
* **(6)** `<Meter>` only support 0-100. I'd say standardise on typical dB audio range of -60dB to +6dB.
* **(7)** Natively supporting a bump on `<Range>` would be very useful. So **+** and **-** or **⇧** and **⇩**
I consider **(2)** a major shortcoming.

| 1.0 | Front-End Building Blocks' shortcomings - I was hoping to use the Yamaha Amp front-end as a good real-world example of what the front-end framework can do.
Some issues I've come across (see screenshot below):
* **(1)** `<link href='...'>` does not support any internal text so instead a massive title has to be used
* **(2a)** A light-weight "field name/ field value" UI element bound to an event/signal is not supported. Instead a very heavy `<Panel>` has to be used.
  * in this example, it'd be nice to have those basic text fields presented elegantly (**Firmware**, **Run Mode**, **Device Error Status**)
* **(3)** How do you layout a bank of range controls elegantly (spacing an issue; and an index label would be nice)?
* **(4)** `<Range>` is missing the current value, a bit like how the meters have (shown)
* **(5)** How do you layout a bank of On/Off switches elegantly (an index label would be nice)?
* **(6)** `<Meter>` only support 0-100. I'd say standardise on typical dB audio range of -60dB to +6dB.
* **(7)** Natively supporting a bump on `<Range>` would be very useful. So **+** and **-** or **⇧** and **⇩**
I consider **(2)** a major shortcoming.

| non_process | front end building blocks shortcomings i was hoping to use the yamaha amp front end as a good real world example of what the front end framework can do some issues i ve come across see screenshot below does not support any internal text so instead a massive title has to be used a light weight field name field value ui element bound to an event signal is not supported instead a very heavy has to be used in this example it d be nice to have those basic text fields presented elegantly firmware run mode device error status how do you layout a bank of range controls elegantly spacing an issue and an index label would be nice is missing the current value a bit like how the meters have shown how to you layout a bank of on off switches elegantly an index label would be nice only support i d say standardise on typical db audio range of to natively supporting a bump on would be very useful so and or ⇧ and ⇩ i consider a major shortcoming | 0 |
235,561 | 7,740,280,245 | IssuesEvent | 2018-05-28 20:38:32 | nimona/go-nimona | https://api.github.com/repos/nimona/go-nimona | closed | Block exchange should announce its blocks on the DHT | Priority: High Status: Available Type: Enhancement | * [ ] When BLX starts up it should announce all blocks on the DHT
* [ ] When adding a new block on the BLX it should also announce it on the DHT | 1.0 | Block exchange should announce its blocks on the DHT - * [ ] When BLX starts up it should announce all blocks on the DHT
* [ ] When adding a new block on the BLX it should also announce it on the DHT | non_process | block exchange should announce its blocks on the dht when blx starts up it should announce all blocks on the dht when adding a new block on the blx it should also announce it on the dht | 0 |
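The two checkboxes in the nimona record above can be sketched as a toy model (in Python for brevity; nimona itself is Go, and every name here, `dht.announce`, `store`, is a hypothetical stand-in, not the project's real API):

```python
class BlockExchange:
    """Toy model of the two requested behaviours; all names invented."""

    def __init__(self, dht, store):
        self.dht = dht
        self.store = store  # mapping: block hash -> bytes

    def start(self):
        # 1. On startup, announce every block already in the store.
        for block_hash in self.store:
            self.dht.announce(block_hash)

    def add_block(self, block_hash, data):
        # 2. Store new blocks, then announce them on the DHT too.
        self.store[block_hash] = data
        self.dht.announce(block_hash)


class FakeDHT:
    """Stand-in for the DHT service; just records announcements."""

    def __init__(self):
        self.announced = []

    def announce(self, block_hash):
        self.announced.append(block_hash)


dht = FakeDHT()
blx = BlockExchange(dht, {"h1": b"...", "h2": b"..."})
blx.start()                  # announces h1, h2
blx.add_block("h3", b"new")  # announces h3
print(dht.announced)         # ['h1', 'h2', 'h3']
```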
157,073 | 19,913,648,361 | IssuesEvent | 2022-01-25 19:54:41 | Recidiviz/pulse-data | https://api.github.com/repos/Recidiviz/pulse-data | closed | Security Alert - Package: node-forge; Severity: MODERATE | Subject: Security Severity: MODERATE Subject: Vulnerability |
---
due: 2022-03-26
---
Affected package: node-forge
Ecosystem: NPM
Affected version range: < 1.0.0
Summary: URL parsing in node-forge could lead to undesired behavior.
Description: ### Impact
The regex used for the `forge.util.parseUrl` API would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior.
### Patches
`forge.util.parseUrl` and other very old related URL APIs were removed in 1.0.0 in favor of letting applications use the more modern WHATWG URL Standard API.
### Workarounds
Ensure code does not directly or indirectly call `forge.util.parseUrl` with untrusted input.
### References
- https://www.huntr.dev/bounties/41852c50-3c6d-4703-8c55-4db27164a4ae/
### For more information
If you have any questions or comments about this advisory:
* Open an issue in [forge](https://github.com/digitalbazaar/forge)
* Email us at support@digitalbazaar.com
identifiers: [{'type': 'GHSA', 'value': 'GHSA-gf8q-jrpm-jvxq'}]
Fixed Version: 1.0.0
Created Date = January 08, 2022
---
Affected package: node-forge
Ecosystem: NPM
Affected version range: < 1.0.0
Summary: Prototype Pollution in node-forge debug API.
Description: ### Impact
The `forge.debug` API had a potential prototype pollution issue if called with untrusted input. The API was only used for internal debug purposes in a safe way and never documented or advertised. It is suspected that uses of this API, if any exist, would likely not have used untrusted inputs in a vulnerable way.
### Patches
The `forge.debug` API and related functions were removed in 1.0.0.
### Workarounds
Don't use the `forge.debug` API directly or indirectly with untrusted input.
### References
- https://www.huntr.dev/bounties/1-npm-node-forge/
### For more information
If you have any questions or comments about this advisory:
* Open an issue in [forge](https://github.com/digitalbazaar/forge).
* Email us at support@digitalbazaar.com.
identifiers: [{'type': 'GHSA', 'value': 'GHSA-5rrq-pxf6-6jx5'}]
Fixed Version: 1.0.0
Created Date = January 08, 2022
---
Affected package: node-forge
Ecosystem: NPM
Affected version range: < 1.0.0
Summary: URL parsing in node-forge could lead to undesired behavior.
Description: ### Impact
The regex used for the `forge.util.parseUrl` API would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior.
### Patches
`forge.util.parseUrl` and other very old related URL APIs were removed in 1.0.0 in favor of letting applications use the more modern WHATWG URL Standard API.
### Workarounds
Ensure code does not directly or indirectly call `forge.util.parseUrl` with untrusted input.
### References
- https://www.huntr.dev/bounties/41852c50-3c6d-4703-8c55-4db27164a4ae/
### For more information
If you have any questions or comments about this advisory:
* Open an issue in [forge](https://github.com/digitalbazaar/forge)
* Email us at support@digitalbazaar.com
identifiers: [{'type': 'GHSA', 'value': 'GHSA-gf8q-jrpm-jvxq'}]
Fixed Version: 1.0.0
Created Date = January 08, 2022
---
Affected package: node-forge
Ecosystem: NPM
Affected version range: < 1.0.0
Summary: Prototype Pollution in node-forge debug API.
Description: ### Impact
The `forge.debug` API had a potential prototype pollution issue if called with untrusted input. The API was only used for internal debug purposes in a safe way and never documented or advertised. It is suspected that uses of this API, if any exist, would likely not have used untrusted inputs in a vulnerable way.
### Patches
The `forge.debug` API and related functions were removed in 1.0.0.
### Workarounds
Don't use the `forge.debug` API directly or indirectly with untrusted input.
### References
- https://www.huntr.dev/bounties/1-npm-node-forge/
### For more information
If you have any questions or comments about this advisory:
* Open an issue in [forge](https://github.com/digitalbazaar/forge).
* Email us at support@digitalbazaar.com.
identifiers: [{'type': 'GHSA', 'value': 'GHSA-5rrq-pxf6-6jx5'}]
Fixed Version: 1.0.0
Created Date = January 08, 2022
---
Affected package: node-forge
Ecosystem: NPM
Affected version range: < 1.0.0
Summary: Open Redirect in node-forge
Description: parseUrl functionality in node-forge mishandles certain uses of backslash such as https:/\/\/\ and interprets the URI as a relative path.
identifiers: [{'type': 'GHSA', 'value': 'GHSA-8fr3-hfg3-gpgp'}, {'type': 'CVE', 'value': 'CVE-2022-0122'}]
Fixed Version: 1.0.0
Created Date = January 23, 2022
---
Affected package: node-forge
Ecosystem: NPM
Affected version range: < 1.0.0
Summary: Open Redirect in node-forge
Description: parseUrl functionality in node-forge mishandles certain uses of backslash such as https:/\/\/\ and interprets the URI as a relative path.
identifiers: [{'type': 'GHSA', 'value': 'GHSA-8fr3-hfg3-gpgp'}, {'type': 'CVE', 'value': 'CVE-2022-0122'}]
Fixed Version: 1.0.0
Created Date = January 23, 2022
---
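The CVE-2022-0122 entry above is a parser-disagreement bug: browsers normalise backslashes after the scheme to forward slashes (per the WHATWG URL standard), while stricter RFC 3986 parsers see no authority component at all, so the two disagree about the host. The sketch below illustrates the bug class with Python's standard `urllib.parse`; it is an analogy, not node-forge code:

```python
from urllib.parse import urlparse

# A browser treats "https:/\attacker.example" much like
# "https://attacker.example" (backslash normalised to slash),
# but a strict parser sees no "//" and therefore no host.
u = urlparse("https:/\\attacker.example/path")
print(u.netloc)  # '' -- no host, so a naive origin check passes
print(u.path)    # '/\attacker.example/path' -- a "relative" path
```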
| True | Security Alert - Package: node-forge; Severity: MODERATE -
---
due: 2022-03-26
---
Affected package: node-forge
Ecosystem: NPM
Affected version range: < 1.0.0
Summary: URL parsing in node-forge could lead to undesired behavior.
Description: ### Impact
The regex used for the `forge.util.parseUrl` API would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior.
### Patches
`forge.util.parseUrl` and other very old related URL APIs were removed in 1.0.0 in favor of letting applications use the more modern WHATWG URL Standard API.
### Workarounds
Ensure code does not directly or indirectly call `forge.util.parseUrl` with untrusted input.
### References
- https://www.huntr.dev/bounties/41852c50-3c6d-4703-8c55-4db27164a4ae/
### For more information
If you have any questions or comments about this advisory:
* Open an issue in [forge](https://github.com/digitalbazaar/forge)
* Email us at support@digitalbazaar.com
identifiers: [{'type': 'GHSA', 'value': 'GHSA-gf8q-jrpm-jvxq'}]
Fixed Version: 1.0.0
Created Date = January 08, 2022
---
Affected package: node-forge
Ecosystem: NPM
Affected version range: < 1.0.0
Summary: Prototype Pollution in node-forge debug API.
Description: ### Impact
The `forge.debug` API had a potential prototype pollution issue if called with untrusted input. The API was only used for internal debug purposes in a safe way and never documented or advertised. It is suspected that uses of this API, if any exist, would likely not have used untrusted inputs in a vulnerable way.
### Patches
The `forge.debug` API and related functions were removed in 1.0.0.
### Workarounds
Don't use the `forge.debug` API directly or indirectly with untrusted input.
### References
- https://www.huntr.dev/bounties/1-npm-node-forge/
### For more information
If you have any questions or comments about this advisory:
* Open an issue in [forge](https://github.com/digitalbazaar/forge).
* Email us at support@digitalbazaar.com.
identifiers: [{'type': 'GHSA', 'value': 'GHSA-5rrq-pxf6-6jx5'}]
Fixed Version: 1.0.0
Created Date = January 08, 2022
---
Affected package: node-forge
Ecosystem: NPM
Affected version range: < 1.0.0
Summary: URL parsing in node-forge could lead to undesired behavior.
Description: ### Impact
The regex used for the `forge.util.parseUrl` API would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior.
### Patches
`forge.util.parseUrl` and other very old related URL APIs were removed in 1.0.0 in favor of letting applications use the more modern WHATWG URL Standard API.
### Workarounds
Ensure code does not directly or indirectly call `forge.util.parseUrl` with untrusted input.
### References
- https://www.huntr.dev/bounties/41852c50-3c6d-4703-8c55-4db27164a4ae/
### For more information
If you have any questions or comments about this advisory:
* Open an issue in [forge](https://github.com/digitalbazaar/forge)
* Email us at support@digitalbazaar.com
identifiers: [{'type': 'GHSA', 'value': 'GHSA-gf8q-jrpm-jvxq'}]
Fixed Version: 1.0.0
Created Date = January 08, 2022
---
Affected package: node-forge
Ecosystem: NPM
Affected version range: < 1.0.0
Summary: Prototype Pollution in node-forge debug API.
Description: ### Impact
The `forge.debug` API had a potential prototype pollution issue if called with untrusted input. The API was only used for internal debug purposes in a safe way and never documented or advertised. It is suspected that uses of this API, if any exist, would likely not have used untrusted inputs in a vulnerable way.
### Patches
The `forge.debug` API and related functions were removed in 1.0.0.
### Workarounds
Don't use the `forge.debug` API directly or indirectly with untrusted input.
### References
- https://www.huntr.dev/bounties/1-npm-node-forge/
### For more information
If you have any questions or comments about this advisory:
* Open an issue in [forge](https://github.com/digitalbazaar/forge).
* Email us at support@digitalbazaar.com.
identifiers: [{'type': 'GHSA', 'value': 'GHSA-5rrq-pxf6-6jx5'}]
Fixed Version: 1.0.0
Created Date = January 08, 2022
---
Affected package: node-forge
Ecosystem: NPM
Affected version range: < 1.0.0
Summary: Open Redirect in node-forge
Description: parseUrl functionality in node-forge mishandles certain uses of backslash such as https:/\/\/\ and interprets the URI as a relative path.
identifiers: [{'type': 'GHSA', 'value': 'GHSA-8fr3-hfg3-gpgp'}, {'type': 'CVE', 'value': 'CVE-2022-0122'}]
Fixed Version: 1.0.0
Created Date = January 23, 2022
---
Affected package: node-forge
Ecosystem: NPM
Affected version range: < 1.0.0
Summary: Open Redirect in node-forge
Description: parseUrl functionality in node-forge mishandles certain uses of backslash such as https:/\/\/\ and interprets the URI as a relative path.
identifiers: [{'type': 'GHSA', 'value': 'GHSA-8fr3-hfg3-gpgp'}, {'type': 'CVE', 'value': 'CVE-2022-0122'}]
Fixed Version: 1.0.0
Created Date = January 23, 2022
---
| non_process | security alert package node forge severity moderate due affected package node forge ecosystem npm affected version range summary url parsing in node forge could lead to undesired behavior description impact the regex used for the forge util parseurl api would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior patches forge util parseurl and other very old related url apis were removed in in favor of letting applications use the more modern whatwg url standard api workarounds ensure code does not directly or indirectly call forge util parseurl with untrusted input references for more information if you have any questions or comments about this advisory open an issue in email us at support digitalbazaar com identifiers fixed version created date january affected package node forge ecosystem npm affected version range summary prototype pollution in node forge debug api description impact the forge debug api had a potential prototype pollution issue if called with untrusted input the api was only used for internal debug purposes in a safe way and never documented or advertised it is suspected that uses of this api if any exist would likely not have used untrusted inputs in a vulnerable way patches the forge debug api and related functions were removed in workarounds don t use the forge debug api directly or indirectly with untrusted input references for more information if you have any questions or comments about this advisory open an issue in email us at support digitalbazaar com identifiers fixed version created date january affected package node forge ecosystem npm affected version range summary url parsing in node forge could lead to undesired behavior description impact the regex used for the forge util parseurl api would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior patches forge util parseurl and other very old related url apis 
were removed in in favor of letting applications use the more modern whatwg url standard api workarounds ensure code does not directly or indirectly call forge util parseurl with untrusted input references for more information if you have any questions or comments about this advisory open an issue in email us at support digitalbazaar com identifiers fixed version created date january affected package node forge ecosystem npm affected version range summary prototype pollution in node forge debug api description impact the forge debug api had a potential prototype pollution issue if called with untrusted input the api was only used for internal debug purposes in a safe way and never documented or advertised it is suspected that uses of this api if any exist would likely not have used untrusted inputs in a vulnerable way patches the forge debug api and related functions were removed in workarounds don t use the forge debug api directly or indirectly with untrusted input references for more information if you have any questions or comments about this advisory open an issue in email us at support digitalbazaar com identifiers fixed version created date january affected package node forge ecosystem npm affected version range summary open redirect in node forge description parseurl functionality in node forge mishandles certain uses of backslash such as https and interprets the uri as a relative path identifiers fixed version created date january affected package node forge ecosystem npm affected version range summary open redirect in node forge description parseurl functionality in node forge mishandles certain uses of backslash such as https and interprets the uri as a relative path identifiers fixed version created date january | 0 |
20,622 | 27,293,514,954 | IssuesEvent | 2023-02-23 18:20:29 | geneontology/go-ontology | https://api.github.com/repos/geneontology/go-ontology | closed | Missing parent: [Polyadenylation should be part-of RNA 3' end processing?] | RNA processes missing parentage |
Polyadenylation should be part-of RNA 3' end processing?
This relationship was deleted in 2015, but poly adenylation IS part of 3' end processing
2015-05-11 | Deleted | RELATION | is a GO:0031123 (RNA 3'-end processing)
Quote:
"
Image result for polyadenylation and end processing
Introduction: Polyadenylation is the process in which the pre-mRNA is cleaved at the poly(A) site and a poly(A) tail is added – a process necessary for normal mRNA formation."
https://www.frontiersin.org/articles/10.3389/fendo.2013.00053/full
first line of abstract.
(I spotted this because FYPO phenotype[es are arranged according to GO and our abnormal polyadenylation terms are not under abnormal 3' end-processing)
Does anyone recollect why this was removed?
| 1.0 | Missing parent: [Polyadenylation should be part-of RNA 3' end processing?] -
Polyadenylation should be part-of RNA 3' end processing?
This relationship was deleted in 2015, but poly adenylation IS part of 3' end processing
2015-05-11 | Deleted | RELATION | is a GO:0031123 (RNA 3'-end processing)
Quote:
"
Image result for polyadenylation and end processing
Introduction: Polyadenylation is the process in which the pre-mRNA is cleaved at the poly(A) site and a poly(A) tail is added – a process necessary for normal mRNA formation."
https://www.frontiersin.org/articles/10.3389/fendo.2013.00053/full
first line of abstract.
(I spotted this because FYPO phenotype[es are arranged according to GO and our abnormal polyadenylation terms are not under abnormal 3' end-processing)
Does anyone recollect why this was removed?
| process | missing parent polyadenylation should be part of rna end processing this relationship was deleted in but poly adenylation is part of end processing deleted relation is a go rna end processing quote image result for polyadenylation and end processing introduction polyadenylation is the process in which the pre mrna is cleaved at the poly a site and a poly a tail is added – a process necessary for normal mrna formation first line of abstract i spotted this because fypo phenotype es are arranged according to go and our abnormal polyadenylation terms are not under abnormal end processing does anyone recollect why this was removed | 1 |
7,634 | 10,732,048,517 | IssuesEvent | 2019-10-28 20:55:34 | prisma/prisma-examples | https://api.github.com/repos/prisma/prisma-examples | closed | The graphql examples are tooo repetitive | process/candidate | I am not sure if there is something I am missing, but I think there is just too much repetition with this code.
```js
const User = objectType({
name: 'User',
definition(t) {
t.model.id()
t.model.name()
t.model.email()
t.model.posts({
pagination: false,
})
},
})
```
where as the same is defined elsewhere (which I actually prefer) in the `schema.prisma` file:
```graphql
model User {
id String @default(cuid()) @id
email String @unique
name String?
posts Post[]
}
```
Can't this example or the framework generate this Nexus code itself and provide a way to give it resolvers instead of repeating things like this. I can imagine when a change is done in the schema.prisma file and the followups that need to be done in a large application. I'd prefer it using the schema file as the source of truth, or at least give us a way to retrieve the definitions as regular graphql definitions that can be used with other Graphql libs including Nexus without repeating things like that. | 1.0 | The graphql examples are tooo repetitive - I am not sure if there is something I am missing, but I think there is just too much repetition with this code.
```js
const User = objectType({
name: 'User',
definition(t) {
t.model.id()
t.model.name()
t.model.email()
t.model.posts({
pagination: false,
})
},
})
```
where as the same is defined elsewhere (which I actually prefer) in the `schema.prisma` file:
```graphql
model User {
id String @default(cuid()) @id
email String @unique
name String?
posts Post[]
}
```
Can't this example or the framework generate this Nexus code itself and provide a way to give it resolvers instead of repeating things like this. I can imagine when a change is done in the schema.prisma file and the followups that need to be done in a large application. I'd prefer it using the schema file as the source of truth, or at least give us a way to retrieve the definitions as regular graphql definitions that can be used with other Graphql libs including Nexus without repeating things like that. | process | the graphql examples are tooo repetitive i am not sure if there is something i am missing but i think there is just too much repetition with this code js const user objecttype name user definition t t model id t model name t model email t model posts pagination false where as the same is defined elsewhere which i actually prefer in the schema prisma file graphql model user id string default cuid id email string unique name string posts post can t this example or the framework generate this nexus code itself and provide a way to give it resolvers instead of repeating things like this i can imagine when a change is done in the schema prisma file and the followups that need to be done in a large application i d prefer it using the schema file as the source of truth or at least give us a way to retrieve the definitions as regular graphql definitions that can be used with other graphql libs including nexus without repeating things like that | 1 |
55,961 | 13,726,247,235 | IssuesEvent | 2020-10-03 22:34:36 | dusk-network/rusk | https://api.github.com/repos/dusk-network/rusk | closed | PublicParams serialisation takes too much time | area:conf&build type:bug type:performance type:refactor | In difference to the `ProverKey` serialization, `PublicParameters` are serialized using `serde` & `bincode` and it slows down everything.
We should change the PublicParameters and serialize them using `to_bytes` and `from_bytes` without going through serde. | 1.0 | PublicParams serialisation takes too much time - In difference to the `ProverKey` serialization, `PublicParameters` are serialized using `serde` & `bincode` and it slows down everything.
We should change the PublicParameters and serialize them using `to_bytes` and `from_bytes` without going through serde. | non_process | publicparams serialisation takes too much time in difference to the proverkey serialization publicparameters are serialized using serde bincode and it slows down everything we should change the publicparameters and serialize them using to bytes and from bytes without going through serde | 0 |
60,923 | 3,135,720,301 | IssuesEvent | 2015-09-10 16:27:22 | MinetestForFun/server-minetestforfun-skyblock | https://api.github.com/repos/MinetestForFun/server-minetestforfun-skyblock | closed | "Place 20 Wood Fences" quest can't be completed | Modding ➤ BugFix Priority: High | Somehow the quest system doesn't count `default:fence_wood` placement, making players unable to complete the "Place 20 Wood Fences" quest and to get to level 3. | 1.0 | "Place 20 Wood Fences" quest can't be completed - Somehow the quest system doesn't count `default:fence_wood` placement, making players unable to complete the "Place 20 Wood Fences" quest and to get to level 3. | non_process | place wood fences quest can t be completed somehow the quest system doesn t count default fence wood placement making players unable to complete the place wood fences quest and to get to level | 0 |
303 | 2,736,161,935 | IssuesEvent | 2015-04-19 05:17:13 | sysown/proxysql-0.2 | https://api.github.com/repos/sysown/proxysql-0.2 | closed | Query rewrite not working | ADMIN bug PROTOCOL QUERY PROCESSOR | How to reproduce it:
On admin module:
```
mysql> insert into mysql_query_rules(rule_id, active, username, schemaname, flagIN, match_pattern, negate_match_pattern,flagOUT ,replace_pattern,destination_hostgroup,cache_ttl,apply) values(1,1,NULL,NULL,0,"SELECT * FROM proxytest",1,0,"SELECT id FROM test",0,1,1);
Query OK, 1 row affected (0.01 sec)
mysql> LOAD MYSQL QUERY RULES TO RUNTIME; Query OK, 0 rows affected (0.01 sec)
```
On mysql module:
```
mysql> SELECT * FROM proxytest;
ERROR 1109 (42S02): Unknown table 'proxytest' in information_schema
``` | 1.0 | Query rewrite not working - How to reproduce it:
On admin module:
```
mysql> insert into mysql_query_rules(rule_id, active, username, schemaname, flagIN, match_pattern, negate_match_pattern,flagOUT ,replace_pattern,destination_hostgroup,cache_ttl,apply) values(1,1,NULL,NULL,0,"SELECT * FROM proxytest",1,0,"SELECT id FROM test",0,1,1);
Query OK, 1 row affected (0.01 sec)
mysql> LOAD MYSQL QUERY RULES TO RUNTIME; Query OK, 0 rows affected (0.01 sec)
```
On mysql module:
```
mysql> SELECT * FROM proxytest;
ERROR 1109 (42S02): Unknown table 'proxytest' in information_schema
``` | process | query rewrite not working how to reproduce it on admin module mysql insert into mysql query rules rule id active username schemaname flagin match pattern negate match pattern flagout replace pattern destination hostgroup cache ttl apply values null null select from proxytest select id from test query ok row affected sec mysql load mysql query rules to runtime query ok rows affected sec on mysql module mysql select from proxytest error unknown table proxytest in information schema | 1 |
7,115 | 10,266,127,588 | IssuesEvent | 2019-08-22 20:37:03 | automotive-edge-computing-consortium/AECC | https://api.github.com/repos/automotive-edge-computing-consortium/AECC | opened | Need a better way to document, track, and follow-up work items across meetings and across work groups, SIGs etc. | priority:High status:Open type:Enhancement type:Process | Concern that issues can be get dropped across meetings when there is no clear ownership and followup. This may need tools within Higher Logic or some other tracking system (e.g.Trello)
Koya will contact Laurie to get the latest information incl. pricing. | 1.0 | Need a better way to document, track, and follow-up work items across meetings and across work groups, SIGs etc. - Concern that issues can be get dropped across meetings when there is no clear ownership and followup. This may need tools within Higher Logic or some other tracking system (e.g.Trello)
Koya will contact Laurie to get the latest information incl. pricing. | process | need a better way to document track and follow up work items across meetings and across work groups sigs etc concern that issues can be get dropped across meetings when there is no clear ownership and followup this may need tools within higher logic or some other tracking system e g trello koya will contact laurie to get the latest information incl pricing | 1 |
6,933 | 10,097,552,104 | IssuesEvent | 2019-07-28 07:07:56 | aiidateam/aiida-core | https://api.github.com/repos/aiidateam/aiida-core | closed | Formalization of meaning of exit code ranges | aiida-core 1.x priority/important requires discussion topic/processes type/enhancement | This issue will be a design document to agree on a guideline for the meaning of certain exit code ranges to be used by plugins and potentially even reserve some for internal use in `aiida-core`.
The proposal in this message will be updated as the discussion progresses:
* `0 - 99` Reserved for internal use by the engine of `aiida-core`
* `100 - 199` Reserved for exit codes by scheduler parsers (This only applies to `CalcJobs` and the scheduler parsing is not yet implemented)
* `200 - 299` Suggested for compound (not automated by engine based on process spec) input validation errors
* `300 - 399` Suggested for unrecoverable errors
* `400 - ...` Suggested for recoverable errors | 1.0 | Formalization of meaning of exit code ranges - This issue will be a design document to agree on a guideline for the meaning of certain exit code ranges to be used by plugins and potentially even reserve some for internal use in `aiida-core`.
The proposal in this message will be updated as the discussion progresses:
* `0 - 99` Reserved for internal use by the engine of `aiida-core`
* `100 - 199` Reserved for exit codes by scheduler parsers (This only applies to `CalcJobs` and the scheduler parsing is not yet implemented)
* `200 - 299` Suggested for compound (not automated by engine based on process spec) input validation errors
* `300 - 399` Suggested for unrecoverable errors
* `400 - ...` Suggested for recoverable errors | process | formalization of meaning of exit code ranges this issue will be a design document to agree on a guideline for the meaning of certain exit code ranges to be used by plugins and potentially even reserve some for internal use in aiida core the proposal in this message will be updated as the discussion progresses reserved for internal use by the engine of aiida core reserved for exit codes by scheduler parsers this only applies to calcjobs and the scheduler parsing is not yet implemented suggested for compound not automated by engine based on process spec input validation errors suggested for unrecoverable errors suggested for recoverable errors | 1 |
187,968 | 14,435,421,028 | IssuesEvent | 2020-12-07 08:42:48 | kalexmills/github-vet-tests-dec2020 | https://api.github.com/repos/kalexmills/github-vet-tests-dec2020 | closed | golang/go: src/encoding/base32/base32_test.go; 18 LoC | fresh small test |
Found a possible issue in [golang/go](https://www.github.com/golang/go) at [src/encoding/base32/base32_test.go](https://github.com/golang/go/blob/c15593197453b8bf90fc3a9080ba2afeaf7934ea/src/encoding/base32/base32_test.go#L609-L626)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> range-loop variable chunks used in defer or goroutine at line 614
[Click here to see the code in its original context.](https://github.com/golang/go/blob/c15593197453b8bf90fc3a9080ba2afeaf7934ea/src/encoding/base32/base32_test.go#L609-L626)
<details>
<summary>Click here to show the 18 line(s) of Go which triggered the analyzer.</summary>
```go
for _, chunks := range testcase.chunkCombinations {
pr, pw := io.Pipe()
// Write the encoded chunks into the pipe
go func() {
for _, chunk := range chunks {
pw.Write([]byte(chunk))
}
pw.Close()
}()
decoder := NewDecoder(StdEncoding, pr)
_, err := io.ReadAll(decoder)
if err != testcase.expected {
t.Errorf("Expected %v, got %v; case %s %+v", testcase.expected, err, testcase.prefix, chunks)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: c15593197453b8bf90fc3a9080ba2afeaf7934ea
| 1.0 | golang/go: src/encoding/base32/base32_test.go; 18 LoC -
Found a possible issue in [golang/go](https://www.github.com/golang/go) at [src/encoding/base32/base32_test.go](https://github.com/golang/go/blob/c15593197453b8bf90fc3a9080ba2afeaf7934ea/src/encoding/base32/base32_test.go#L609-L626)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> range-loop variable chunks used in defer or goroutine at line 614
[Click here to see the code in its original context.](https://github.com/golang/go/blob/c15593197453b8bf90fc3a9080ba2afeaf7934ea/src/encoding/base32/base32_test.go#L609-L626)
<details>
<summary>Click here to show the 18 line(s) of Go which triggered the analyzer.</summary>
```go
for _, chunks := range testcase.chunkCombinations {
pr, pw := io.Pipe()
// Write the encoded chunks into the pipe
go func() {
for _, chunk := range chunks {
pw.Write([]byte(chunk))
}
pw.Close()
}()
decoder := NewDecoder(StdEncoding, pr)
_, err := io.ReadAll(decoder)
if err != testcase.expected {
t.Errorf("Expected %v, got %v; case %s %+v", testcase.expected, err, testcase.prefix, chunks)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: c15593197453b8bf90fc3a9080ba2afeaf7934ea
| non_process | golang go src encoding test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message range loop variable chunks used in defer or goroutine at line click here to show the line s of go which triggered the analyzer go for chunks range testcase chunkcombinations pr pw io pipe write the encoded chunks into the pipe go func for chunk range chunks pw write byte chunk pw close decoder newdecoder stdencoding pr err io readall decoder if err testcase expected t errorf expected v got v case s v testcase expected err testcase prefix chunks leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id | 0 |
50,052 | 21,003,746,681 | IssuesEvent | 2022-03-29 20:09:20 | emergenzeHack/ukrainehelp.emergenzehack.info_segnalazioni | https://api.github.com/repos/emergenzeHack/ukrainehelp.emergenzehack.info_segnalazioni | opened | A Cento ‘Facciamo rete’ - accoglienza solidale offre servizi ai rifugiati dall’Ucraina | Services Legal materialGoods hospitality translation psychologicalSupport Children | <pre><yamldata>
servicetypes:
materialGoods: true
hospitality: true
transport: false
healthcare: false
Legal: true
translation: true
job: false
psychologicalSupport: true
Children: true
disability: false
women: false
education: false
offerFromWho: Facciamo rete accoglienza solidale a Cento
title: A Cento ‘Facciamo rete’ - accoglienza solidale offre servizi ai rifugiati
dall’Ucraina
recipients: ''
description: ''
url: https://www.facebook.com/groups/ukrainehelpit/permalink/468223791713241/
address:
mode: autocomplete
address:
place_id: 283662905
licence: Data © OpenStreetMap contributors, ODbL 1.0. https://osm.org/copyright
osm_type: relation
osm_id: 43506
boundingbox:
- '44.705628'
- '44.8381841'
- '11.2461129'
- '11.3880712'
lat: '44.7274395'
lon: '11.2903029'
display_name: Cento, Unione Alto Ferrarese, Ferrara, Emilia-Romagna, Italia
class: boundary
type: administrative
importance: 0.5919924514729962
icon: https://nominatim.openstreetmap.org/ui/mapicons//poi_boundary_administrative.p.20.png
address:
town: Cento
municipality: Unione Alto Ferrarese
county: Ferrara
state: Emilia-Romagna
country: Italia
country_code: it
iConfirmToHaveReadAndAcceptedInformativeToThreatPersonalData: true
label: services
submit: true
</yamldata></pre> | 1.0 | A Cento ‘Facciamo rete’ - accoglienza solidale offre servizi ai rifugiati dall’Ucraina - <pre><yamldata>
servicetypes:
materialGoods: true
hospitality: true
transport: false
healthcare: false
Legal: true
translation: true
job: false
psychologicalSupport: true
Children: true
disability: false
women: false
education: false
offerFromWho: Facciamo rete accoglienza solidale a Cento
title: A Cento ‘Facciamo rete’ - accoglienza solidale offre servizi ai rifugiati
dall’Ucraina
recipients: ''
description: ''
url: https://www.facebook.com/groups/ukrainehelpit/permalink/468223791713241/
address:
mode: autocomplete
address:
place_id: 283662905
licence: Data © OpenStreetMap contributors, ODbL 1.0. https://osm.org/copyright
osm_type: relation
osm_id: 43506
boundingbox:
- '44.705628'
- '44.8381841'
- '11.2461129'
- '11.3880712'
lat: '44.7274395'
lon: '11.2903029'
display_name: Cento, Unione Alto Ferrarese, Ferrara, Emilia-Romagna, Italia
class: boundary
type: administrative
importance: 0.5919924514729962
icon: https://nominatim.openstreetmap.org/ui/mapicons//poi_boundary_administrative.p.20.png
address:
town: Cento
municipality: Unione Alto Ferrarese
county: Ferrara
state: Emilia-Romagna
country: Italia
country_code: it
iConfirmToHaveReadAndAcceptedInformativeToThreatPersonalData: true
label: services
submit: true
</yamldata></pre> | non_process | a cento ‘facciamo rete’ accoglienza solidale offre servizi ai rifugiati dall’ucraina servicetypes materialgoods true hospitality true transport false healthcare false legal true translation true job false psychologicalsupport true children true disability false women false education false offerfromwho facciamo rete accoglienza solidale a cento title a cento ‘facciamo rete’ accoglienza solidale offre servizi ai rifugiati dall’ucraina recipients description url address mode autocomplete address place id licence data © openstreetmap contributors odbl osm type relation osm id boundingbox lat lon display name cento unione alto ferrarese ferrara emilia romagna italia class boundary type administrative importance icon address town cento municipality unione alto ferrarese county ferrara state emilia romagna country italia country code it iconfirmtohavereadandacceptedinformativetothreatpersonaldata true label services submit true | 0 |
416,107 | 12,139,558,606 | IssuesEvent | 2020-04-23 19:03:12 | ChrisNZL/Tallowmere2 | https://api.github.com/repos/ChrisNZL/Tallowmere2 | opened | Null ref: T2.CanvasMenu.UpdateTabNavigationGlyphs | ⚠ priority 🌐 online 🔴 error | Auto bug report from 0.1.3.
Feedback ID: 20200423-5TZBB
```
5:48:55, Frame 834, LOG » RELAY CLIENT: Connecting to server...
5:48:56, Frame 880, LOG » NETWORK CLIENT: Connecting to X.X.X.X ...
5:48:56, Frame 885, LOG » NETWORK CLIENT: RelayClient connected to server X.X.X.X
5:49:38, Frame 3362, EXCEPTION » NullReferenceException: Object reference not set to an instance of an object
>>>>> CRITICAL ERROR >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
T2.CanvasMenu.UpdateTabNavigationGlyphs ()
T2.CanvasMenu.SetCategoryAndUpdateAppearance (T2.Category newCategory, System.Boolean forceRebuildMenuItems)
T2.Action_ShowMerchantMenu.ShowMenu (T2.Creature creatureUsingMenu, T2.Creature merchant)
T2.Action_ShowMerchantMenu.ExecuteAction ()
T2.SpeechBubble.FadeOut ()
T2.SpeechBubble._LateUpdate (System.Boolean hurryUp)
T2.SpeechBubble.LateUpdate ()
GameStates: RompingThroughDungeon
GameSetupMode: NetworkCoop
Players: 2
Room: 0 / 12
RoomModifiers: None
``` | 1.0 | Null ref: T2.CanvasMenu.UpdateTabNavigationGlyphs - Auto bug report from 0.1.3.
Feedback ID: 20200423-5TZBB
```
5:48:55, Frame 834, LOG » RELAY CLIENT: Connecting to server...
5:48:56, Frame 880, LOG » NETWORK CLIENT: Connecting to X.X.X.X ...
5:48:56, Frame 885, LOG » NETWORK CLIENT: RelayClient connected to server X.X.X.X
5:49:38, Frame 3362, EXCEPTION » NullReferenceException: Object reference not set to an instance of an object
>>>>> CRITICAL ERROR >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
T2.CanvasMenu.UpdateTabNavigationGlyphs ()
T2.CanvasMenu.SetCategoryAndUpdateAppearance (T2.Category newCategory, System.Boolean forceRebuildMenuItems)
T2.Action_ShowMerchantMenu.ShowMenu (T2.Creature creatureUsingMenu, T2.Creature merchant)
T2.Action_ShowMerchantMenu.ExecuteAction ()
T2.SpeechBubble.FadeOut ()
T2.SpeechBubble._LateUpdate (System.Boolean hurryUp)
T2.SpeechBubble.LateUpdate ()
GameStates: RompingThroughDungeon
GameSetupMode: NetworkCoop
Players: 2
Room: 0 / 12
RoomModifiers: None
``` | non_process | null ref canvasmenu updatetabnavigationglyphs auto bug report from feedback id frame log » relay client connecting to server frame log » network client connecting to x x x x frame log » network client relayclient connected to server x x x x frame exception » nullreferenceexception object reference not set to an instance of an object critical error canvasmenu updatetabnavigationglyphs canvasmenu setcategoryandupdateappearance category newcategory system boolean forcerebuildmenuitems action showmerchantmenu showmenu creature creatureusingmenu creature merchant action showmerchantmenu executeaction speechbubble fadeout speechbubble lateupdate system boolean hurryup speechbubble lateupdate gamestates rompingthroughdungeon gamesetupmode networkcoop players room roommodifiers none | 0 |
195,868 | 6,919,364,298 | IssuesEvent | 2017-11-29 15:16:02 | uracreative/task-management | https://api.github.com/repos/uracreative/task-management | opened | Image template for social media and blog posts | Internal: Outreach Internal: Social Media Internal: Website Priority: Medium | We need to have a concept for the template that will be used for posts on our social media channels, newsletter and blog posts. This needs to be different but at the same time to keep the same look and feel in terms of design.
Deadline: 6.12.2017 | 1.0 | Image template for social media and blog posts - We need to have a concept for the template that will be used for posts on our social media channels, newsletter and blog posts. This needs to be different but at the same time to keep the same look and feel in terms of design.
Deadline: 6.12.2017 | non_process | image template for social media and blog posts we need to have a concept for the template that will be used for posts on our social media channels newsletter and blog posts this needs to be different but at the same time to keep the same look and feel in terms of design deadline | 0 |
366,800 | 25,602,441,479 | IssuesEvent | 2022-12-01 21:32:48 | pharmaverse/admiral | https://api.github.com/repos/pharmaverse/admiral | opened | Documentation: include convert_na_to_blanks in the news.md section | documentation | ### Please select a category the issue is focused on?
Other
### Let us know where something needs a refresh or put your idea here!
include convert_na_to_blanks in the news.md section | 1.0 | Documentation: include convert_na_to_blanks in the news.md section - ### Please select a category the issue is focused on?
Other
### Let us know where something needs a refresh or put your idea here!
include convert_na_to_blanks in the news.md section | non_process | documentation include convert na to blanks in the news md section please select a category the issue is focused on other let us know where something needs a refresh or put your idea here include convert na to blanks in the news md section | 0 |
10,970 | 13,775,422,343 | IssuesEvent | 2020-10-08 07:59:10 | cypress-io/cypress | https://api.github.com/repos/cypress-io/cypress | closed | #8764 should be re-written. | process: tests stage: ready for work | ### Current behavior
#8764 broke CI and some test suites like `packages/server/test/e2e/6_video_compression_spec.js` are not executed.
### Desired behavior
* CI works correctly
* Test suites like `packages/server/test/e2e/6_video_compression_spec.js` are executed.
* Flaky tests are fixed.
### Test code to reproduce
According to [the CI results](https://app.circleci.com/pipelines/github/cypress-io/cypress?branch=retries-for-flaky-failing-specs) of #8764, it's merged hastily. There are non-flaky failures in server-6 tests, but they're ignored.
### Versions
Current develop branch. | 1.0 | #8764 should be re-written. - ### Current behavior
#8764 broke CI and some test suites like `packages/server/test/e2e/6_video_compression_spec.js` are not executed.
### Desired behavior
* CI works correctly
* Test suites like `packages/server/test/e2e/6_video_compression_spec.js` are executed.
* Flaky tests are fixed.
### Test code to reproduce
According to [the CI results](https://app.circleci.com/pipelines/github/cypress-io/cypress?branch=retries-for-flaky-failing-specs) of #8764, it's merged hastily. There are non-flaky failures in server-6 tests, but they're ignored.
### Versions
Current develop branch. | process | should be re written current behavior broke ci and some test suites like packages server test video compression spec js are not executed desired behavior ci works correctly test suites like packages server test video compression spec js are executed flaky tests are fixed test code to reproduce according to of it s merged hastily there are non flaky failures in server tests but they re ignored versions current develop branch | 1 |
14,573 | 17,702,935,581 | IssuesEvent | 2021-08-25 01:55:25 | tdwg/dwc | https://api.github.com/repos/tdwg/dwc | closed | New term - cultivarEpithet | Term - add Class - Taxon normative Process - complete | ## New Term
Submitter: Markus Döring
Justification: cultivar names can only be shared now via the full scientificName and need their own atomized term.
Proponents: Workshop on Biodiversity Data Quality (2016-03), TCS
Definition: Part of the name of a cultivar, cultivar group or grex that follows the scientific name.
Comment: According to the Rules of the Cultivated Plant Code, a cultivar name consists of a botanical name followed by a cultivar epithet. The value given as the cultivarEpithet should exclude any quotes. The term taxonRank should be used to indicate which type of cultivated plant name (e.g. cultivar, cultivar group, grex) is concerned. This epithet, including any enclosing apostrophes or suffix, should be provided in scientificName as well.
Examples: `King Edward` (for scientificName "Solanum tuberosum 'King Edward'" and taxonRank "cultivar"); `Mishmiense` (for scientificName "Rhododendron boothii Mishmiense Group" and taxonRank "cultivar group"); `Atlantis` (for scientificName "Paphiopedilum Atlantis grex" and taxonRank "grex").
Refines: None
Replaces: None
ABCD 2.06: http://rs.tdwg.org/abcd/terms/cultivarName or http://rs.tdwg.org/abcd/terms/cultivarGroupName or http://rs.tdwg.org/abcd/terms/breed (ABCD 3.0)
Original comment: https://code.google.com/p/darwincore/issues/detail?id=141
Mar 1, 2012 comment #1 dag.endresen
One potential challenge of adding cultivar name as part of the nomenclature code is perhaps that cultivar names can have language. The very same cultivar can be released under different marketing names in different countries.
Mar 2, 2012 comment #2 wixner
isn't that a kind of synonym then? Several names referring to the same taxon. Or does the cultivar code explicitly accept several localisations for one name?
Mar 2, 2012 comment #3 wixner
There is also the need to share information about other informal name parts not governed by the nomenclatoral codes. For example bacterial strain names or temporary names in use before they have been published. Hyloxalus sp. JCS-2010a for example was updated to Hyloxalus yasuni when Paez-Vacas et al. (2010) was published. Other examples are Clostridium botulinum A112 or Clostridium botulinum A strain ATCC 19397.
Is it maybe worthwhile having an informalEpithet instead that caters for both cultivars and temporary names?
Mar 2, 2012 comment #4 dag.endresen
I believe that each 'localization' of a cultivar name could perhaps be seen as the 'valid' name in the country where the name is used/marketed - and as a 'synonym' in other countries...? I am not sure if it is always easy to pick one globally 'valid' cultivar name...? And each cultivar can have different release year in different countries, and I believe sometimes (but not always?) released and marketed under a different cultivar name...?
Mar 2, 2012 comment #5 dag.endresen
The International Code of Nomenclature for Cultivated Plants (ICNCP, Cultivated Plant Code) seems to regulate the 'cultivar epithet' to be a distinct name within a genus - and that the cultivar name (cultivar epithet) is a different thing from the 'trade name', and that it is the 'trade name' that can be a different name for the same cultivar marketed in each country... sorry for the confusion.
http://en.wikipedia.org/wiki/International_Code_of_Nomenclature_for_Cultivated_Plants
Mar 2, 2012 comment #6 dag.endresen
Which could mean that there is a role in Darwin Core for both a new cultivarEpithet and an informalEpithet term...?
Mar 2, 2012 comment #7 wixner
Wikipedia suggests there is only a single cultivar epithet:
http://en.wikipedia.org/wiki/Cultivar#Cultivar_names
"A cultivar name consists of a botanical name (of a genus, species, infraspecific taxon, interspecific hybrid or intergeneric hybrid) followed by a cultivar epithet. The cultivar epithet is capitalised and enclosed by single quotes; it should not be italicized. It is permissible to place a cultivar epithet after a common name provided the common name is botanically unambiguous. Cultivar epithets published before 1 January 1959 were often given a Latin form and can be readily confused with the specific epithets in botanical names; after that date, newly coined cultivar epithets must be in a modern vernacular language to distinguish them from botanical epithets."
A cultivar name also has to be published just like a scientific name before it can be registered. So the publishedIn(Year) terms would also apply nicely.
An example registration page for aroids can be found here:
http://www.aroid.org/cultivars/reg_form_short.php
Mar 2, 2012 comment #8 dag.endresen
There seems to be two systems for cultivar names - one system of "cultivar epithet" under the ICNCP (distinct names published in a journal) - and another system of "trade designations"/"trade names" generally under the UPOV (but with varying national legislation) where different "trade name" can be protected in distinct countries (different years) for the same cultivar. I believe that both these types of "cultivar names" might perhaps be useful to track...?
"Many plants have "selling names" or "marketing names" as well as a cultivar name; the ICNCP refers to these as "trade designations". Only the cultivar name is governed by the ICNCP. It is required to be unique", http://en.wikipedia.org/wiki/International_Code_of_Nomenclature_for_Cultivated_Plants.
Mar 6, 2012 comment #9 wixner
Added a new issue144 for strains
Sep 23, 2013 comment #12 gtuco.btuco
I would like to promote the adoption of the concepts mentioned in this issue. To do so, I will need a stronger proposal demonstrating the need to share this information - that is, that independent groups, organizations, projects have the same need and can reach a consensus proposal about how the term or terms should be used. It might be a good idea to circulate the proposal on tdwg-content and see if a community can be built around and support the additions.
| 1.0 | New term - cultivarEpithet - ## New Term
Submitter: Markus Döring
Justification: cultivar names can only be shared now via the full scientificName and need their own atomized term.
Proponents: Workshop on Biodiversity Data Quality (2016-03), TCS
Definition: Part of the name of a cultivar, cultivar group or grex that follows the scientific name.
Comment: According to the Rules of the Cultivated Plant Code, a cultivar name consists of a botanical name followed by a cultivar epithet. The value given as the cultivarEpithet should exclude any quotes. The term taxonRank should be used to indicate which type of cultivated plant name (e.g. cultivar, cultivar group, grex) is concerned. This epithet, including any enclosing apostrophes or suffix, should be provided in scientificName as well.
Examples: `King Edward` (for scientificName "Solanum tuberosum 'King Edward'" and taxonRank "cultivar"); `Mishmiense` (for scientificName "Rhododendron boothii Mishmiense Group" and taxonRank "cultivar group"); `Atlantis` (for scientificName "Paphiopedilum Atlantis grex" and taxonRank "grex").
Refines: None
Replaces: None
ABCD 2.06: http://rs.tdwg.org/abcd/terms/cultivarName or http://rs.tdwg.org/abcd/terms/cultivarGroupName or http://rs.tdwg.org/abcd/terms/breed (ABCD 3.0)
Original comment: https://code.google.com/p/darwincore/issues/detail?id=141
Mar 1, 2012 comment #1 dag.endresen
One potential challenge of adding cultivar name as part of the nomenclature code is perhaps that cultivar names can have language. The very same cultivar can be released under different marketing names in different countries.
Mar 2, 2012 comment #2 wixner
isn't that a kind of synonym then? Several names referring to the same taxon. Or does the cultivar code explicitly accept several localisations for one name?
Mar 2, 2012 comment #3 wixner
There is also the need to share information about other informal name parts not governed by the nomenclatoral codes. For example bacterial strain names or temporary names in use before they have been published. Hyloxalus sp. JCS-2010a for example was updated to Hyloxalus yasuni when Paez-Vacas et al. (2010) was published. Other examples are Clostridium botulinum A112 or Clostridium botulinum A strain ATCC 19397.
Is it maybe worthwhile having an informalEpithet instead that caters for both cultivars and temporary names?
Mar 2, 2012 comment #4 dag.endresen
I believe that each 'localization' of a cultivar name could perhaps be seen as the 'valid' name in the country where the name is used/marketed - and as a 'synonym' in other countries...? I am not sure if it is always easy to pick one globally 'valid' cultivar name...? And each cultivar can have different release year in different countries, and I believe sometimes (but not always?) released and marketed under a different cultivar name...?
Mar 2, 2012 comment #5 dag.endresen
The International Code of Nomenclature for Cultivated Plants (ICNCP, Cultivated Plant Code) seems to regulate the 'cultivar epithet' to be a distinct name within a genus - and that the cultivar name (cultivar epithet) is a different thing from the 'trade name', and that it is the 'trade name' that can be a different name for the same cultivar marketed in each country... sorry for the confusion.
http://en.wikipedia.org/wiki/International_Code_of_Nomenclature_for_Cultivated_Plants
Mar 2, 2012 comment #6 dag.endresen
Which could mean that there is a role in Darwin Core for both a new cultivarEpithet and an informalEpithet term...?
Mar 2, 2012 comment #7 wixner
Wikipedia suggests there is only a single cultivar epithet:
http://en.wikipedia.org/wiki/Cultivar#Cultivar_names
"A cultivar name consists of a botanical name (of a genus, species, infraspecific taxon, interspecific hybrid or intergeneric hybrid) followed by a cultivar epithet. The cultivar epithet is capitalised and enclosed by single quotes; it should not be italicized. It is permissible to place a cultivar epithet after a common name provided the common name is botanically unambiguous. Cultivar epithets published before 1 January 1959 were often given a Latin form and can be readily confused with the specific epithets in botanical names; after that date, newly coined cultivar epithets must be in a modern vernacular language to distinguish them from botanical epithets."
A cultivar name also has to be published just like a scientific name before it can be registered. So the publishedIn(Year) terms would also apply nicely.
An example registration page for aroids can be found here:
http://www.aroid.org/cultivars/reg_form_short.php
Mar 2, 2012 comment #8 dag.endresen
There seems to be two systems for cultivar names - one system of "cultivar epithet" under the ICNCP (distinct names published in a journal) - and another system of "trade designations"/"trade names" generally under the UPOV (but with varying national legislation) where different "trade name" can be protected in distinct countries (different years) for the same cultivar. I believe that both these types of "cultivar names" might perhaps be useful to track...?
"Many plants have "selling names" or "marketing names" as well as a cultivar name; the ICNCP refers to these as "trade designations". Only the cultivar name is governed by the ICNCP. It is required to be unique", http://en.wikipedia.org/wiki/International_Code_of_Nomenclature_for_Cultivated_Plants.
Mar 6, 2012 comment #9 wixner
Added a new issue144 for strains
Sep 23, 2013 comment #12 gtuco.btuco
I would like to promote the adoption of the concepts mentioned in this issue. To do so, I will need a stronger proposal demonstrating the need to share this information - that is, that independent groups, organizations, projects have the same need and can reach a consensus proposal about how the term or terms should be used. It might be a good idea to circulate the proposal on tdwg-content and see if a community can be built around and support the additions.
| process | new term cultivarepithet new term submitter markus döring justification cultivar names can only be shared now via the full scientificname and need their own atomized term proponents workshop on biodiversity data quality tcs definition part of the name of a cultivar cultivar group or grex that follows the scientific name comment according to the rules of the cultivated plant code a cultivar name consists of a botanical name followed by a cultivar epithet the value given as the cultivarepithet should exclude any quotes the term taxonrank should be used to indicate which type of cultivated plant name e g cultivar cultivar group grex is concerned this epithet including any enclosing apostrophes or suffix should be provided in scientificname as well examples king edward for scientificname solanum tuberosum king edward and taxonrank cultivar mishmiense for scientificname rhododendron boothii mishmiense group and taxonrank cultivar group atlantis for scientificname paphiopedilum atlantis grex and taxonrank grex refines none replaces none abcd or or abcd original comment mar comment dag endresen one potential challenge of adding cultivar name as part of the nomenclature code is perhaps that cultivar names can have language the very same cultivar can be released under different marketing names in different countries mar comment wixner isn t that a kind of synonym then several names referring to the same taxon or does the cultivar code explicitly accept several localisations for one name mar comment wixner there is also the need to share information about other informal name parts not governed by the nomenclatoral codes for example bacterial strain names or temporary names in use before they have been published hyloxalus sp jcs for example was updated to hyloxalus yasuni when paez vacas et al was published other examples are clostridium botulinum or clostridium botulinum a strain atcc is it maybe worthwhile having an informalepithet instead that caters for both 
cultivars and temporary names mar comment dag endresen i believe that each localization of a cultivar name could perhaps be seen as the valid name in the country where the name is used marketed and as a synonym in other countries i am not sure if it is always easy to pick one globally valid cultivar name and each cultivar can have different release year in different countries and i believe sometimes but not always released and marketed under a different cultivar name mar comment dag endresen the international code of nomenclature for cultivated plants icncp cultivated plant code seems to regulate the cultivar epithet to be a distinct name within a genus and that the cultivar name cultivar epithet is a different thing from the trade name and that it is the trade name that can be a different name for the same cultivar marketed in each country sorry for the confusion mar comment dag endresen which could mean that there is a role in darwin core for both a new cultivarepithet and an informalepithet term mar comment wixner wikipedia suggests there is only a single cultivar epithet a cultivar name consists of a botanical name of a genus species infraspecific taxon interspecific hybrid or intergeneric hybrid followed by a cultivar epithet the cultivar epithet is capitalised and enclosed by single quotes it should not be italicized it is permissible to place a cultivar epithet after a common name provided the common name is botanically unambiguous cultivar epithets published before january were often given a latin form and can be readily confused with the specific epithets in botanical names after that date newly coined cultivar epithets must be in a modern vernacular language to distinguish them from botanical epithets a cultivar name also has to be published just like a scientific name before it can be registered so the publishedin year terms would also apply nicely an example registration page for aroids can be found here mar comment dag endresen there seems to be two 
systems for cultivar names one system of cultivar epithet under the icncp distinct names published in a journal and another system of trade designations trade names generally under the upov but with varying national legislation where different trade name can be protected in distinct countries different years for the same cultivar i believe that both these types of cultivar names might perhaps be useful to track many plants have selling names or marketing names as well as a cultivar name the icncp refers to these as trade designations only the cultivar name is governed by the icncp it is required to be unique mar comment wixner added a new for strains sep comment gtuco btuco i would like to promote the adoption of the concepts mentioned in this issue to do so i will need a stronger proposal demonstrating the need to share this information that is that independent groups organizations projects have the same need and can reach a consensus proposal about how the term or terms should be used it might be a good idea to circulate the proposal on tdwg content and see if a community can be built around and support the additions | 1 |
7,825 | 3,106,082,589 | IssuesEvent | 2015-09-01 01:23:20 | california-civic-data-coalition/django-calaccess-raw-data | https://api.github.com/repos/california-civic-data-coalition/django-calaccess-raw-data | closed | Add documentation for the ``cand_st`` field on the ``CvrCampaignDisclosureCd`` database model | documentation enhancement small |
## Your mission
Add documentation for the ``cand_st`` field on the ``CvrCampaignDisclosureCd`` database model.
## Here's how
**Step 1**: Claim this ticket by leaving a comment below. Tell everyone you're ON IT!
**Step 2**: Open up the file that contains this model. It should be in <a href="https://github.com/california-civic-data-coalition/django-calaccess-raw-data/blob/master/calaccess_raw/models/campaign.py">calaccess_raw.models.campaign.py</a>.
**Step 3**: Hit the little pencil button in the upper-right corner of the code box to begin editing the file.

**Step 4**: Find this model and field in the file. (Clicking into the box and searching with CTRL-F can help you here.) Once you find it, we expect the field to lack the ``help_text`` field typically used in Django to explain what a field contains.
```python
effect_dt = fields.DateField(
    null=True,
    db_column="EFFECT_DT"
)
```
**Step 5**: In a separate tab, open up the <a href="Quilmes">official state documentation</a> and find the page that defines all the fields in this model.

**Step 6**: Find the row in that table's definition table that spells out what this field contains. If it lacks documentation, note that in the ticket and close it now.

**Step 7**: Return to the GitHub tab.
**Step 8**: Add the state's label explaining what's in the field, to our field definition by inserting it a ``help_text`` argument. That should look something like this:
```python
effect_dt = fields.DateField(
    null=True,
    db_column="EFFECT_DT",
    # Add a help_text argument like the one here, but put your string in instead.
    help_text="The other values in record were effective as of this date"
)
```
**Step 9**: Scroll down below the code box and describe the change you've made in the commit message. Press the button below.

**Step 10**: Review your changes and create a pull request submitting them to the core team for inclusion.

That's it! Mission accomplished!
| 1.0 | Add documentation for the ``cand_st`` field on the ``CvrCampaignDisclosureCd`` database model -
## Your mission
Add documentation for the ``cand_st`` field on the ``CvrCampaignDisclosureCd`` database model.
## Here's how
**Step 1**: Claim this ticket by leaving a comment below. Tell everyone you're ON IT!
**Step 2**: Open up the file that contains this model. It should be in <a href="https://github.com/california-civic-data-coalition/django-calaccess-raw-data/blob/master/calaccess_raw/models/campaign.py">calaccess_raw.models.campaign.py</a>.
**Step 3**: Hit the little pencil button in the upper-right corner of the code box to begin editing the file.

**Step 4**: Find this model and field in the file. (Clicking into the box and searching with CTRL-F can help you here.) Once you find it, we expect the field to lack the ``help_text`` field typically used in Django to explain what a field contains.
```python
effect_dt = fields.DateField(
    null=True,
    db_column="EFFECT_DT"
)
```
**Step 5**: In a separate tab, open up the <a href="Quilmes">official state documentation</a> and find the page that defines all the fields in this model.

**Step 6**: Find the row in that table's definition table that spells out what this field contains. If it lacks documentation, note that in the ticket and close it now.

**Step 7**: Return to the GitHub tab.
**Step 8**: Add the state's label explaining what's in the field, to our field definition by inserting it a ``help_text`` argument. That should look something like this:
```python
effect_dt = fields.DateField(
    null=True,
    db_column="EFFECT_DT",
    # Add a help_text argument like the one here, but put your string in instead.
    help_text="The other values in record were effective as of this date"
)
```
**Step 9**: Scroll down below the code box and describe the change you've made in the commit message. Press the button below.

**Step 10**: Review your changes and create a pull request submitting them to the core team for inclusion.

That's it! Mission accomplished!
| non_process | add documentation for the cand st field on the cvrcampaigndisclosurecd database model your mission add documentation for the cand st field on the cvrcampaigndisclosurecd database model here s how step claim this ticket by leaving a comment below tell everyone you re on it step open up the file that contains this model it should be in a href step hit the little pencil button in the upper right corner of the code box to begin editing the file step find this model and field in the file clicking into the box and searching with ctrl f can help you here once you find it we expect the field to lack the help text field typically used in django to explain what a field contains python effect dt fields datefield null true db column effect dt step in a separate tab open up the official state documentation and find the page that defines all the fields in this model step find the row in that table s definition table that spells out what this field contains if it lacks documentation note that in the ticket and close it now step return to the github tab step add the state s label explaining what s in the field to our field definition by inserting it a help text argument that should look something like this python effect dt fields datefield null true db column effect dt add a help text argument like the one here but put your string in instead help text the other values in record were effective as of this date step scroll down below the code box and describe the change you ve made in the commit message press the button below step review your changes and create a pull request submitting them to the core team for inclusion that s it mission accomplished | 0 |
15,962 | 5,195,711,359 | IssuesEvent | 2017-01-23 10:19:13 | SemsTestOrg/combinearchive-web | https://api.github.com/repos/SemsTestOrg/combinearchive-web | closed | Can not read archive - error message is corrupt | code defect fixed major migrated | ## Trac Ticket #48
**component:** code
**owner:** somebody
**reporter:** martinP
**created:** 2014-08-17 17:57:41
**milestone:**
**type:** defect
**version:**
**keywords:**
Can not read archive {0} entries in WorkingDir {1}
## comment 1
**time:** 2014-08-17 17:57:50
**author:** martinP
## comment 2
**time:** 2014-08-17 17:57:50
**author:** martinP
Updated **type** to **defect**
## comment 3
**time:** 2014-08-17 18:39:12
**author:** mp487 <martin.peters3@uni-rostock.de>
In [changeset:"09bb2c63164601688302b612170ac8fec8552942"]:
```CommitTicketReference repository="" revision="09bb2c63164601688302b612170ac8fec8552942"
fixed error messages [fixes #48]
```
## comment 4
**time:** 2014-08-17 18:39:12
**author:** mp487 <martin.peters3@uni-rostock.de>
Updated **resolution** to **fixed**
## comment 5
**time:** 2014-08-17 18:39:12
**author:** mp487 <martin.peters3@uni-rostock.de>
Updated **status** to **closed**
| 1.0 | Can not read archive - error message is corrupt - ## Trac Ticket #48
**component:** code
**owner:** somebody
**reporter:** martinP
**created:** 2014-08-17 17:57:41
**milestone:**
**type:** defect
**version:**
**keywords:**
Can not read archive {0} entries in WorkingDir {1}
## comment 1
**time:** 2014-08-17 17:57:50
**author:** martinP
## comment 2
**time:** 2014-08-17 17:57:50
**author:** martinP
Updated **type** to **defect**
## comment 3
**time:** 2014-08-17 18:39:12
**author:** mp487 <martin.peters3@uni-rostock.de>
In [changeset:"09bb2c63164601688302b612170ac8fec8552942"]:
```CommitTicketReference repository="" revision="09bb2c63164601688302b612170ac8fec8552942"
fixed error messages [fixes #48]
```
## comment 4
**time:** 2014-08-17 18:39:12
**author:** mp487 <martin.peters3@uni-rostock.de>
Updated **resolution** to **fixed**
## comment 5
**time:** 2014-08-17 18:39:12
**author:** mp487 <martin.peters3@uni-rostock.de>
Updated **status** to **closed**
| non_process | can not read archive error message is corrupt trac ticket component code owner somebody reporter martinp created milestone type defect version keywords can not read archive entries in workingdir comment time author martinp comment time author martinp updated type to defect comment time author in changeset committicketreference repository revision fixed error messages comment time author updated resolution to fixed comment time author updated status to closed | 0 |
17,018 | 22,389,860,724 | IssuesEvent | 2022-06-17 06:23:24 | PyCQA/pylint | https://api.github.com/repos/PyCQA/pylint | closed | Spawning child-process when using multiprocessing is very slow | High Effort 🏋 topic-multiprocessing | ### Bug description
See: https://github.com/PyCQA/pylint/issues/6965#issuecomment-1158128082
> I've noticed that spawning the child processes is extremely slow - about 1-2 processes per second. So, 60-way parallelism generally means that there is about a 30 s delay while things spin up, then almost no time spent doing work. This slow startup time was happening despite my CPUs being about 70% idle. This slow startup time presumably explains the bug I saw about how ineffectual multiprocessing was for pylint. You would need a huge batch of files to justify doing significant multi-processing.
> I don't know whether the slow startup is a bug in multiprocessing or in pylint. I just measured some of our presubmits and 60-way parallelism more than doubles the time that they take, from ~30 to ~70 s.
Possible solution in https://github.com/PyCQA/pylint/issues/6965#issuecomment-1158158384
> We would need a complete rewrite of the parallel code. We currently spin up a new PyLinter class for every job. That is taking way too long (probably). But I'm not sure what the best approach is to create a PyLinterLite...
I personally think that a refactor of PyLinter will be required, and we'd have to classify checkers to know if they can benefit from multiprocessing or not. ``duplicate-code`` or ``cyclic import`` won't for example as they need information on the imports of a file. Some check are are file based like ``unused-private-member`` (the scope is a single class) or ``while-used`` (it just has to check if a while node exists) and can benefit from multiprocessing if done at the right time.
### Configuration
We should use a full configuration for this with a lot to parse, as we're probably parsing the configuration in each forks and this would make it apparent.
### Command used
```python
lint.Run(['--jobs', '42'] + argv)
```
### Expected behavior
Run time decrease with more core (when there is more files to lint than cores available).
### Pylint version
```shell
2.14.2
```
350,237 | 24,974,417,655 | IssuesEvent | 2022-11-02 06:09:02 | kubevela/kubevela.io | https://api.github.com/repos/kubevela/kubevela.io | closed | beautify overall website for command line and output | documentation enhancement good first issue | The overall website has many commands and output, for example:

In the picture above, users can't copy the command conveniently. Instead, we could change it into:

Anyone who's interested in this issue can refer to #947 as an example.
6,875 | 10,013,858,694 | IssuesEvent | 2019-07-15 16:04:55 | allinurl/goaccess | https://api.github.com/repos/allinurl/goaccess | closed | GoAccess not showing 'Tx.' Amount | log-processing log/date/time format question | Hello everyone,
I've configured goaccess for one of our servers to show data for our individual websites (Apache vhosts).
Everything works great at first, but the one thing missing is the bandwidth amount (the reason I installed it).
Is there anything specific I should check as to why it won't show the bandwidth?
https://imgur.com/a/TUz7Sv3
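One common cause worth checking (a general note, not a diagnosis of this exact setup): GoAccess only fills the Tx. Amount column when the configured log-format contains the bandwidth token %b. For standard Apache/NGINX combined logs, the predefined format already includes it:

```
# goaccess.conf — assumes the vhosts write standard combined-format logs
log-format COMBINED
```

With a custom log-format string, an explicit %b field has to be present for the Tx. amounts to show up.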