Dataset columns (15):

- `Unnamed: 0` — int64, 0 to 832k
- `id` — float64, 2.49B to 32.1B
- `type` — string, 1 distinct value (`IssuesEvent`)
- `created_at` — string, length 19
- `repo` — string, length 4 to 112
- `repo_url` — string, length 33 to 141
- `action` — string, 3 distinct values
- `title` — string, length 1 to 1.02k
- `labels` — string, length 4 to 1.54k
- `body` — string, length 1 to 262k
- `index` — string, 17 distinct values
- `text_combine` — string, length 95 to 262k
- `label` — string, 2 distinct values (`test` / `non_test`)
- `text` — string, length 96 to 252k
- `binary_label` — int64, 0 or 1

Sample rows follow, with cells separated by `|` lines:
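Given the schema above, a table like this would typically be inspected with pandas. A minimal sketch, using toy rows that mimic the column layout (the values are illustrative, not taken from the dump, and the file name of the real dataset is not known here):

```python
import pandas as pd

# Toy rows mimicking the schema above; a real workflow would read the
# actual dump, e.g. pd.read_csv(...) or datasets.load_dataset(...).
df = pd.DataFrame({
    "type": ["IssuesEvent", "IssuesEvent"],
    "repo": ["elastic/kibana", "NuGet/Home"],
    "action": ["opened", "closed"],
    "label": ["test", "non_test"],
    "binary_label": [1, 0],
})

# Sanity checks implied by the schema: a binary target and a single event type.
assert df["binary_label"].isin([0, 1]).all()
assert (df["type"] == "IssuesEvent").all()

# Class balance of the test / non_test target.
print(df["label"].value_counts())
```

The same checks apply unchanged to the full dataset, since they only rely on the column names and ranges listed in the schema.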
311,951
| 26,826,168,464
|
IssuesEvent
|
2023-02-02 13:03:05
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
opened
|
[APM] 8.7.0 Test Plan
|
Team:APM apm:test-plan-guide v8.7.0
|
# Test Plan - 8.7.0
## Phase 1: Manual Testing - Monday 20th February
- Pick an issue from the [curated list](https://github.com/elastic/kibana/issues?q=label%3Aapm%3Atest-plan-8.7.0+is%3Aclosed+-label%3Aapm%3Atest-plan-done)
- After testing, apply the `apm:test-plan-done` label. If a problem is found, a new issue should be opened, labelled `apm:test-plan-regression`, and moved to the "Ready" column.
### Run the latest build candidate (BC)
Using [apm-integration-testing](https://github.com/elastic/apm-integration-testing) the stack for the latest BC can be started:
```
./scripts/compose.py start 8.7.0 --bc --all-opbeans
```
#### Check to see if issues are fixed before opening a bug (section to be updated with 8.7.0 info)
Find the commit kibana was built with:
```
./scripts/compose.py versions
...
Kibana (image built: 2022-11-22 07:33:44 UTC):
Version: 8.6.0-SNAPSHOT
Branch: 8.6
Build SHA: 0d8de4df69f8084a94cdd9638d7de510813cb5ce
Build number: 6678
```
Using that Build SHA, visit https://github.com/elastic/kibana/compare/0d8de4df69f8084a94cdd9638d7de510813cb5ce...8.6 to see commits since the BC was built.
### Creating users locally
```
node ./x-pack/plugins/apm/scripts/create_apm_users.js --username elastic --password changeme --kibana-url http://localhost:5601 --es-url http://localhost:9200
```
Creates the following users (username / password):
- `viewer` / `changeme`
- `editor` / `changeme`
## Phase 2: Improving Quality: Tuesday 21st February - Friday 24th February
Primarily, we need more e2e (Cypress) tests and API tests.
### Important areas & flows
- [List](https://github.com/elastic/kibana/issues?q=is%3Aopen+is%3Aissue+label%3Afailed-test+label%3ATeam%3AAPM) of failed tests (These might have already been fixed and can be closed)
- [List](https://github.com/elastic/kibana/issues?q=is%3Aopen+is%3Aissue+label%3ATeam%3AAPM+label%3Aapm%3Aneeds-test++-label%3Afailed-test) of missing tests and other improvements
- Trace waterfall perf improvements
|
1.0
|
[APM] 8.7.0 Test Plan - # Test Plan - 8.7.0
## Phase 1: Manual Testing - Monday 20th February
- Pick an issue from the [curated list](https://github.com/elastic/kibana/issues?q=label%3Aapm%3Atest-plan-8.7.0+is%3Aclosed+-label%3Aapm%3Atest-plan-done)
- After testing, apply the `apm:test-plan-done` label. If a problem is found, a new issue should be opened, labelled `apm:test-plan-regression`, and moved to the "Ready" column.
### Run the latest build candidate (BC)
Using [apm-integration-testing](https://github.com/elastic/apm-integration-testing) the stack for the latest BC can be started:
```
./scripts/compose.py start 8.7.0 --bc --all-opbeans
```
#### Check to see if issues are fixed before opening a bug (section to be updated with 8.7.0 info)
Find the commit kibana was built with:
```
./scripts/compose.py versions
...
Kibana (image built: 2022-11-22 07:33:44 UTC):
Version: 8.6.0-SNAPSHOT
Branch: 8.6
Build SHA: 0d8de4df69f8084a94cdd9638d7de510813cb5ce
Build number: 6678
```
Using that Build SHA, visit https://github.com/elastic/kibana/compare/0d8de4df69f8084a94cdd9638d7de510813cb5ce...8.6 to see commits since the BC was built.
### Creating users locally
```
node ./x-pack/plugins/apm/scripts/create_apm_users.js --username elastic --password changeme --kibana-url http://localhost:5601 --es-url http://localhost:9200
```
Creates the following users (username / password):
- `viewer` / `changeme`
- `editor` / `changeme`
## Phase 2: Improving Quality: Tuesday 21st February - Friday 24th February
Primarily, we need more e2e (Cypress) tests and API tests.
### Important areas & flows
- [List](https://github.com/elastic/kibana/issues?q=is%3Aopen+is%3Aissue+label%3Afailed-test+label%3ATeam%3AAPM) of failed tests (These might have already been fixed and can be closed)
- [List](https://github.com/elastic/kibana/issues?q=is%3Aopen+is%3Aissue+label%3ATeam%3AAPM+label%3Aapm%3Aneeds-test++-label%3Afailed-test) of missing tests and other improvements
- Trace waterfall perf improvements
|
test
|
test plan test plan phase manual testing monday february pick an issue from the after testing apply the apm test plan done label if a problem is found a new issue should be opened labelled apm test plan regression and moved to ready column run the latest build candidate bc using the stack for the latest bc can be started scripts compose py start bc all opbeans check to see if issues are fixed before opening a bug section to be updated with info find the commit kibana was built with scripts compose py versions kibana image built utc version snapshot branch build sha build number using that build sha visit to see commits since the bc was built creating users locally node x pack plugins apm scripts create apm users js username elastic password changeme kibana url es url creates the following users username password viewer changeme editor changeme phase improving quality tuesday february friday february primarily we need more cypress tests and api tests important areas flows of failed tests these might have already been fixed and can be closed of missing tests and other improvements trace waterfall perf improvements
| 1
|
40,107
| 2,865,714,549
|
IssuesEvent
|
2015-06-05 00:12:23
|
NuGet/Home
|
https://api.github.com/repos/NuGet/Home
|
closed
|
nuget.exe Update doesn't work with wix projects
|
Area:CommandLine Priority:1 Type:Bug
|
Updating a wix project is only possible through the IDE. The command line tool won't work.
It always outputs this error message:
Unable to locate project file for 'path to packages.config'
|
1.0
|
nuget.exe Update doesn't work with wix projects - Updating a wix project is only possible through the IDE. The command line tool won't work.
It always outputs this error message:
Unable to locate project file for 'path to packages.config'
|
non_test
|
nuget exe update doesn t work with wix projects updating a wix project is only possible through the ide the command line tool won t work it always outputs this error message unable to locate project file for path to packages config
| 0
|
20,379
| 10,513,777,104
|
IssuesEvent
|
2019-09-27 21:35:34
|
elastic/elasticsearch
|
https://api.github.com/repos/elastic/elasticsearch
|
opened
|
Cluster level log settings can inadvertently enable the deprecated *_access log
|
:Security/Audit >bug
|
<!--
** Please read the guidelines below. **
Issues that do not follow these guidelines are likely to be closed.
1. GitHub is reserved for bug reports and feature requests. The best place to
ask a general question is at the Elastic [forums](https://discuss.elastic.co).
GitHub is not the place for general questions.
2. Is this bug report or feature request for a supported OS? If not, it
is likely to be closed. See https://www.elastic.co/support/matrix#show_os
3. Please fill out EITHER the feature request block or the bug report block
below, and delete the other block.
-->
<!-- Feature request -->
<!-- Bug report -->
**Elasticsearch version:** `Version: 6.8.3, Build: default/tar/0c48c0e/2019-08-29T19:05:24.312154Z, JVM: 1.8.0_181`
**JVM version:** `Java HotSpot(TM) 64-Bit Server VM (build 25.181-b13, mixed mode)`
**OS version:** `18.7.0 Darwin Kernel Version 18.7.0: Tue Aug 20 16:57:14 PDT 2019; root:xnu-4903.271.2~2/RELEASE_X86_64 x86_64`
**Description of the problem including expected versus actual behavior**:
Adjusting the `_root` logger level via a cluster setting API call can enable the deprecated `*_access.log`, even if `logger.xpack_security_audit_deprecated_logfile.level = off` is defined in the `log4j2.properties` file. It remains enabled even if you `null` out the cluster setting.
**Steps to reproduce**:
1. Enable auditing in `elasticsearch.yml` and disable the deprecated `*_access.log` in the `log4j2.properties` file.
2. Switch to `debug` logging via a cluster setting
```
PUT /_cluster/settings
{"transient":{"logger._root":"debug"}}
```
3. Observe the `path.logs` directory. The deprecated `*_access.log` is now being written to.
4. Clear your debug logging via a `null`
```
PUT /_cluster/settings
{"transient":{"logger._root":null}}
```
5. `*_access.log` will continue being written to until the node is restarted.
|
True
|
Cluster level log settings can inadvertently enable the deprecated *_access log - <!--
** Please read the guidelines below. **
Issues that do not follow these guidelines are likely to be closed.
1. GitHub is reserved for bug reports and feature requests. The best place to
ask a general question is at the Elastic [forums](https://discuss.elastic.co).
GitHub is not the place for general questions.
2. Is this bug report or feature request for a supported OS? If not, it
is likely to be closed. See https://www.elastic.co/support/matrix#show_os
3. Please fill out EITHER the feature request block or the bug report block
below, and delete the other block.
-->
<!-- Feature request -->
<!-- Bug report -->
**Elasticsearch version:** `Version: 6.8.3, Build: default/tar/0c48c0e/2019-08-29T19:05:24.312154Z, JVM: 1.8.0_181`
**JVM version:** `Java HotSpot(TM) 64-Bit Server VM (build 25.181-b13, mixed mode)`
**OS version:** `18.7.0 Darwin Kernel Version 18.7.0: Tue Aug 20 16:57:14 PDT 2019; root:xnu-4903.271.2~2/RELEASE_X86_64 x86_64`
**Description of the problem including expected versus actual behavior**:
Adjusting the `_root` logger level via a cluster setting API call can enable the deprecated `*_access.log`, even if `logger.xpack_security_audit_deprecated_logfile.level = off` is defined in the `log4j2.properties` file. It remains enabled even if you `null` out the cluster setting.
**Steps to reproduce**:
1. Enable auditing in `elasticsearch.yml` and disable the deprecated `*_access.log` in the `log4j2.properties` file.
2. Switch to `debug` logging via a cluster setting
```
PUT /_cluster/settings
{"transient":{"logger._root":"debug"}}
```
3. Observe the `path.logs` directory. The deprecated `*_access.log` is now being written to.
4. Clear your debug logging via a `null`
```
PUT /_cluster/settings
{"transient":{"logger._root":null}}
```
5. `*_access.log` will continue being written to until the node is restarted.
|
non_test
|
cluster level log settings can inadvertently enable the deprecated access log please read the guidelines below issues that do not follow these guidelines are likely to be closed github is reserved for bug reports and feature requests the best place to ask a general question is at the elastic github is not the place for general questions is this bug report or feature request for a supported os if not it is likely to be closed see please fill out either the feature request block or the bug report block below and delete the other block elasticsearch version version build default tar jvm jvm version java hotspot tm bit server vm build mixed mode os version darwin kernel version tue aug pdt root xnu release description of the problem including expected versus actual behavior adjusting the root logger level via a cluster setting api call can enable the deprecated access log even if logger xpack security audit deprecated logfile level off if defined in the properties file it remains enabled even if you null out the cluster setting steps to reproduce enabled auditing in the elasticsearch yml disable the deprecated access log in the properties file switch to debug logging via a cluster setting put cluster settings transient logger root debug observe the path logs directory the deprecated access log is now being written to clear your debug logging via a null put cluster settings transient logger root null access log will continue being written to until the node is restarted
| 0
|
202,537
| 15,286,882,253
|
IssuesEvent
|
2021-02-23 15:10:43
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
roachtest: tpcc-nowait/nodes=3/w=1 failed
|
C-test-failure O-roachtest O-robot branch-release-20.1 release-blocker
|
[(roachtest).tpcc-nowait/nodes=3/w=1 failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2661563&tab=buildLog) on [release-20.1@90f78268f3b5b08ba838ac3ad164821d2f5a5362](https://github.com/cockroachdb/cockroach/commits/90f78268f3b5b08ba838ac3ad164821d2f5a5362):
```
| github.com/cockroachdb/cockroach/vendor/golang.org/x/sync/errgroup.(*Group).Go.func1
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/golang.org/x/sync/errgroup/errgroup.go:57
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1374
Wraps: (2) 2 safe details enclosed
Wraps: (3) output in run_093819.057_n4_workload_run_tpcc
Wraps: (4) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-2661563-1612941348-124-n4cpu16:4 -- ./workload run tpcc --warehouses=1 --histograms=perf/stats.json --wait=false --ramp=5m0s --duration=10m0s {pgurl:1-3} returned
| stderr:
| ./workload: error while loading shared libraries: libncurses.so.6: cannot open shared object file: No such file or directory
| Error: COMMAND_PROBLEM: exit status 127
| (1) COMMAND_PROBLEM
| Wraps: (2) Node 4. Command with error:
| | ```
| | ./workload run tpcc --warehouses=1 --histograms=perf/stats.json --wait=false --ramp=5m0s --duration=10m0s {pgurl:1-3}
| | ```
| Wraps: (3) exit status 127
| Error types: (1) errors.Cmd (2) *hintdetail.withDetail (3) *exec.ExitError
|
| stdout:
Wraps: (5) exit status 20
Error types: (1) *withstack.withStack (2) *safedetails.withSafeDetails (3) *errutil.withMessage (4) *main.withCommandDetails (5) *exec.ExitError
cluster.go:2628,tpcc.go:174,tpcc.go:286,test_runner.go:749: monitor failure: monitor task failed: t.Fatal() was called
(1) attached stack trace
| main.(*monitor).WaitE
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2616
| main.(*monitor).Wait
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2624
| main.runTPCC
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tpcc.go:174
| main.registerTPCC.func3
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tpcc.go:286
| main.(*testRunner).runTest.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/test_runner.go:749
Wraps: (2) monitor failure
Wraps: (3) attached stack trace
| main.(*monitor).wait.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2672
Wraps: (4) monitor task failed
Wraps: (5) attached stack trace
| main.init
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2586
| runtime.doInit
| /usr/local/go/src/runtime/proc.go:5652
| runtime.main
| /usr/local/go/src/runtime/proc.go:191
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1374
Wraps: (6) t.Fatal() was called
Error types: (1) *withstack.withStack (2) *errutil.withMessage (3) *withstack.withStack (4) *errutil.withMessage (5) *withstack.withStack (6) *errors.errorString
```
<details><summary>More</summary><p>
Artifacts: [/tpcc-nowait/nodes=3/w=1](https://teamcity.cockroachdb.com/viewLog.html?buildId=2661563&tab=artifacts#/tpcc-nowait/nodes=3/w=1)
Related:
- #60123 roachtest: tpcc-nowait/nodes=3/w=1 failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-20.2](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-20.2) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Atpcc-nowait%2Fnodes%3D3%2Fw%3D1.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
|
2.0
|
roachtest: tpcc-nowait/nodes=3/w=1 failed - [(roachtest).tpcc-nowait/nodes=3/w=1 failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2661563&tab=buildLog) on [release-20.1@90f78268f3b5b08ba838ac3ad164821d2f5a5362](https://github.com/cockroachdb/cockroach/commits/90f78268f3b5b08ba838ac3ad164821d2f5a5362):
```
| github.com/cockroachdb/cockroach/vendor/golang.org/x/sync/errgroup.(*Group).Go.func1
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/golang.org/x/sync/errgroup/errgroup.go:57
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1374
Wraps: (2) 2 safe details enclosed
Wraps: (3) output in run_093819.057_n4_workload_run_tpcc
Wraps: (4) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-2661563-1612941348-124-n4cpu16:4 -- ./workload run tpcc --warehouses=1 --histograms=perf/stats.json --wait=false --ramp=5m0s --duration=10m0s {pgurl:1-3} returned
| stderr:
| ./workload: error while loading shared libraries: libncurses.so.6: cannot open shared object file: No such file or directory
| Error: COMMAND_PROBLEM: exit status 127
| (1) COMMAND_PROBLEM
| Wraps: (2) Node 4. Command with error:
| | ```
| | ./workload run tpcc --warehouses=1 --histograms=perf/stats.json --wait=false --ramp=5m0s --duration=10m0s {pgurl:1-3}
| | ```
| Wraps: (3) exit status 127
| Error types: (1) errors.Cmd (2) *hintdetail.withDetail (3) *exec.ExitError
|
| stdout:
Wraps: (5) exit status 20
Error types: (1) *withstack.withStack (2) *safedetails.withSafeDetails (3) *errutil.withMessage (4) *main.withCommandDetails (5) *exec.ExitError
cluster.go:2628,tpcc.go:174,tpcc.go:286,test_runner.go:749: monitor failure: monitor task failed: t.Fatal() was called
(1) attached stack trace
| main.(*monitor).WaitE
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2616
| main.(*monitor).Wait
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2624
| main.runTPCC
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tpcc.go:174
| main.registerTPCC.func3
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tpcc.go:286
| main.(*testRunner).runTest.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/test_runner.go:749
Wraps: (2) monitor failure
Wraps: (3) attached stack trace
| main.(*monitor).wait.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2672
Wraps: (4) monitor task failed
Wraps: (5) attached stack trace
| main.init
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2586
| runtime.doInit
| /usr/local/go/src/runtime/proc.go:5652
| runtime.main
| /usr/local/go/src/runtime/proc.go:191
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1374
Wraps: (6) t.Fatal() was called
Error types: (1) *withstack.withStack (2) *errutil.withMessage (3) *withstack.withStack (4) *errutil.withMessage (5) *withstack.withStack (6) *errors.errorString
```
<details><summary>More</summary><p>
Artifacts: [/tpcc-nowait/nodes=3/w=1](https://teamcity.cockroachdb.com/viewLog.html?buildId=2661563&tab=artifacts#/tpcc-nowait/nodes=3/w=1)
Related:
- #60123 roachtest: tpcc-nowait/nodes=3/w=1 failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-20.2](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-20.2) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Atpcc-nowait%2Fnodes%3D3%2Fw%3D1.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
|
test
|
roachtest tpcc nowait nodes w failed on github com cockroachdb cockroach vendor golang org x sync errgroup group go home agent work go src github com cockroachdb cockroach vendor golang org x sync errgroup errgroup go runtime goexit usr local go src runtime asm s wraps safe details enclosed wraps output in run workload run tpcc wraps home agent work go src github com cockroachdb cockroach bin roachprod run teamcity workload run tpcc warehouses histograms perf stats json wait false ramp duration pgurl returned stderr workload error while loading shared libraries libncurses so cannot open shared object file no such file or directory error command problem exit status command problem wraps node command with error workload run tpcc warehouses histograms perf stats json wait false ramp duration pgurl wraps exit status error types errors cmd hintdetail withdetail exec exiterror stdout wraps exit status error types withstack withstack safedetails withsafedetails errutil withmessage main withcommanddetails exec exiterror cluster go tpcc go tpcc go test runner go monitor failure monitor task failed t fatal was called attached stack trace main monitor waite home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go main monitor wait home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go main runtpcc home agent work go src github com cockroachdb cockroach pkg cmd roachtest tpcc go main registertpcc home agent work go src github com cockroachdb cockroach pkg cmd roachtest tpcc go main testrunner runtest home agent work go src github com cockroachdb cockroach pkg cmd roachtest test runner go wraps monitor failure wraps attached stack trace main monitor wait home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go wraps monitor task failed wraps attached stack trace main init home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go runtime doinit usr local go src 
runtime proc go runtime main usr local go src runtime proc go runtime goexit usr local go src runtime asm s wraps t fatal was called error types withstack withstack errutil withmessage withstack withstack errutil withmessage withstack withstack errors errorstring more artifacts related roachtest tpcc nowait nodes w failed powered by
| 1
|
263,597
| 23,069,830,666
|
IssuesEvent
|
2022-07-25 16:56:57
|
flutter/flutter
|
https://api.github.com/repos/flutter/flutter
|
closed
|
Flutter engine cannot locate ICU data file correctly in IosUnitTests
|
a: tests platform-ios engine
|
<!-- Thank you for contributing to Flutter!
If you are filing a bug, please add the steps to reproduce, expected and actual results.
If you are filing a feature request, please describe the use case and a proposal.
If you are requesting a small infra task with P0 or P1 priority, please add it to the
"Infra Ticket Queue" project with "New" column, explain why the task is needed and what
actions need to perform (if you happen to know). No need to set an assignee; the infra oncall
will triage and process the infra ticket queue.
-->
In https://github.com/flutter/engine/pull/34508 I added a new test case for iOS which depends on the ICU library, but the test kept failing because it could not locate the `icudtl.dat` file correctly until I hardcoded its location in a command-line argument. We should get rid of this hardcoding.
cc @jmagman
|
1.0
|
Flutter engine cannot locate ICU data file correctly in IosUnitTests - <!-- Thank you for contributing to Flutter!
If you are filing a bug, please add the steps to reproduce, expected and actual results.
If you are filing a feature request, please describe the use case and a proposal.
If you are requesting a small infra task with P0 or P1 priority, please add it to the
"Infra Ticket Queue" project with "New" column, explain why the task is needed and what
actions need to perform (if you happen to know). No need to set an assignee; the infra oncall
will triage and process the infra ticket queue.
-->
In https://github.com/flutter/engine/pull/34508 I added a new test case for iOS which depends on the ICU library, but the test kept failing because it could not locate the `icudtl.dat` file correctly until I hardcoded its location in a command-line argument. We should get rid of this hardcoding.
cc @jmagman
|
test
|
flutter engine cannot locate icu data file correctly in iosunittests thank you for contributing to flutter if you are filing a bug please add the steps to reproduce expected and actual results if you are filing a feature request please describe the use case and a proposal if you are requesting a small infra task with or priority please add it to the infra ticket queue project with new column explain why the task is needed and what actions need to perform if you happen to know no need to set an assignee the infra oncall will triage and process the infra ticket queue in i added a new test case for ios which depends on the icu library but the test kept failing because it could not locate icudtl dat file correctly until i specified its location by hardcoding it in the command line argument we should get rid of this hardcoding cc jmagman
| 1
|
749,795
| 26,179,877,785
|
IssuesEvent
|
2023-01-02 14:17:39
|
rancher/fleet
|
https://api.github.com/repos/rancher/fleet
|
closed
|
Investigate and minimize transient Fleet CRD-related error logging during startup
|
[zube]: Done kind/bug priority/low
|
## Issue
During the init sequence of the `fleet-controller`, there are frequent error logs for unrecognized CRDs.
Perhaps this is a Wrangler issue as well, since similar errors appear in Rancher's init sequence. While cosmetic, these errors can be verbose and unclear to the user.
```
E0804 23:54:12.789558 1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.20.2/tools/cache/reflector.go:167: Failed to watch *v1.GitJob: failed to list *v1.GitJob: the server could not find the requested resource (get gitjobs.meta.k8s.io)
E0804 23:54:14.345164 1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.20.2/tools/cache/reflector.go:167: Failed to watch *v1.GitJob: failed to list *v1.GitJob: the server could not find the requested resource (get gitjobs.meta.k8s.io)
E0804 23:54:17.057299 1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.20.2/tools/cache/reflector.go:167: Failed to watch *v1.GitJob: failed to list *v1.GitJob: the server could not find the requested resource (get gitjobs.meta.k8s.io)
```
## Source
Found for Fleet: https://github.com/rancher/rancher/issues/33112#issuecomment-893106078
## Solution Brainstorming
We should evaluate adding a timeout (customizable via arg/env/helm/etc.) for these checks, perhaps logging the errors only after a specified amount of time has elapsed.
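The brainstormed approach amounts to retrying the check quietly and surfacing errors only once a grace period has expired. A minimal sketch in Python (the real controller is Go/Wrangler; `log_after_grace` and its parameters are invented for illustration):

```python
import time

def log_after_grace(check, log, grace_period=60.0, interval=1.0,
                    clock=time.monotonic, sleep=time.sleep):
    """Retry a startup check; stay quiet about errors until the grace
    period elapses (hypothetical helper, not Fleet's actual code)."""
    start = clock()
    while True:
        err = check()
        if err is None:
            return True                    # check passed, nothing to log
        if clock() - start >= grace_period:
            log(err)                       # persistent error: now worth logging
            return False
        sleep(interval)                    # transient error inside grace period
```

With a 60-second grace period, the "could not find the requested resource" errors shown above would never reach the log if the CRD becomes available during init, but a genuinely missing CRD would still be reported.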
|
1.0
|
Investigate and minimize transient Fleet CRD-related error logging during startup - ## Issue
During the init sequence of the `fleet-controller`, there are frequent error logs for unrecognized CRDs.
Perhaps, this is a Wrangler issue as well since similar things are seen in Rancher's init sequence. While cosmetic, these errors can be verbose and unclear to the user.
```
E0804 23:54:12.789558 1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.20.2/tools/cache/reflector.go:167: Failed to watch *v1.GitJob: failed to list *v1.GitJob: the server could not find the requested resource (get gitjobs.meta.k8s.io)
E0804 23:54:14.345164 1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.20.2/tools/cache/reflector.go:167: Failed to watch *v1.GitJob: failed to list *v1.GitJob: the server could not find the requested resource (get gitjobs.meta.k8s.io)
E0804 23:54:17.057299 1 reflector.go:138] pkg/mod/k8s.io/client-go@v0.20.2/tools/cache/reflector.go:167: Failed to watch *v1.GitJob: failed to list *v1.GitJob: the server could not find the requested resource (get gitjobs.meta.k8s.io)
```
## Source
Found for Fleet: https://github.com/rancher/rancher/issues/33112#issuecomment-893106078
## Solution Brainstroming
We should evaluate adding a timeout (customizable via arg/env/helm/etc.) for these checks, perhaps logging the errors only after a specified amount of time has elapsed.
|
non_test
|
investigate and minimize transient fleet crd related error logging during startup issue during the init sequence of the fleet controller there are frequent error logs for unrecognized crds perhaps this is a wrangler issue as well since similar things are seen in rancher s init sequence while cosmetic these errors can be verbose and unclear to the user reflector go pkg mod io client go tools cache reflector go failed to watch gitjob failed to list gitjob the server could not find the requested resource get gitjobs meta io reflector go pkg mod io client go tools cache reflector go failed to watch gitjob failed to list gitjob the server could not find the requested resource get gitjobs meta io reflector go pkg mod io client go tools cache reflector go failed to watch gitjob failed to list gitjob the server could not find the requested resource get gitjobs meta io source found for fleet solution brainstroming we should evaluate adding a timeout customizable via arg env helm etc for these checks perhaps logging them after a specified amount of time
| 0
|
136,304
| 5,279,800,920
|
IssuesEvent
|
2017-02-07 12:26:11
|
esikachev/my-dev-server
|
https://api.github.com/repos/esikachev/my-dev-server
|
closed
|
Incorrect checking of ssh in DB
|
bug priority/P0
|
Now, we check ssh in the DB via 2 fields (host and ssh_username) https://github.com/esikachev/my-dev-server/blob/master/my_dev_server/server/main.py#L105, but we also need to check `user_id`, because several users can have identical creds for ssh
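The fix described here is just widening the lookup key from two fields to three. A minimal sketch with a hypothetical in-memory record shape (the real project queries a database):

```python
# Stand-in for the ssh table; two users own identical creds on purpose.
ssh_records = [
    {"user_id": 1, "host": "10.0.0.5", "ssh_username": "deploy"},
    {"user_id": 2, "host": "10.0.0.5", "ssh_username": "deploy"},
]

def find_ssh(user_id, host, ssh_username):
    """Match on all three fields so identical creds owned by
    different users are kept apart."""
    for rec in ssh_records:
        if (rec["user_id"], rec["host"], rec["ssh_username"]) == (user_id, host, ssh_username):
            return rec
    return None

# Matching on host + ssh_username alone would wrongly hit both records.
assert find_ssh(1, "10.0.0.5", "deploy")["user_id"] == 1
assert find_ssh(3, "10.0.0.5", "deploy") is None
```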
|
1.0
|
Incorrect checking of ssh in DB - Now, we check ssh in the DB via 2 fields (host and ssh_username) https://github.com/esikachev/my-dev-server/blob/master/my_dev_server/server/main.py#L105, but we also need to check `user_id`, because several users can have identical creds for ssh
|
non_test
|
incorrect checking of ssh in db now we check ssh in db via fields host and ssh username but we need also check user id because several users can have similar creds for ssh
| 0
|
37,626
| 5,134,360,188
|
IssuesEvent
|
2017-01-11 08:43:06
|
puikinsh/shapely
|
https://api.github.com/repos/puikinsh/shapely
|
closed
|
Function prefix issue
|
tested
|
* In `extras.php` line 131, function name should be `shapely_get_theme_options` instead of `get_shapely_theme_options`
|
1.0
|
Function prefix issue - * In `extras.php` line 131, function name should be `shapely_get_theme_options` instead of `get_shapely_theme_options`
|
test
|
function prefix issue in extras php line function name should be shapely get theme options instead of get shapely theme options
| 1
|
272,382
| 23,669,677,139
|
IssuesEvent
|
2022-08-27 06:15:56
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test: TestTenantLogic_alter_table failed
|
C-test-failure O-robot branch-master
|
pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test.TestTenantLogic_alter_table [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/6260581?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/6260581?buildTab=artifacts#/) on master @ [770ff3c545a51752490403da64d56fb397f49c5e](https://github.com/cockroachdb/cockroach/commits/770ff3c545a51752490403da64d56fb397f49c5e):
```
=== RUN TestTenantLogic_alter_table
test_log_scope.go:161: test logs captured to: /artifacts/tmp/_tmp/9eebf1fbe77b8b038ea96f01f08fb077/logTestTenantLogic_alter_table3959955553
test_log_scope.go:79: use -show-logs to present logs inline
[05:24:36] rng seed: 4642510388756153330
[05:24:38] --- progress: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table: 22 statements
[05:24:40] --- progress: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table: 48 statements
[05:24:43] --- progress: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table: 60 statements
[05:24:45] --- progress: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table: 79 statements
[05:24:47] --- progress: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table: 88 statements
[05:24:50] --- progress: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table: 94 statements
[05:24:52] --- progress: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table: 102 statements
[05:24:54] --- progress: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table: 110 statements
[05:24:57] --- progress: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table: 133 statements
[05:24:59] --- progress: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table: 141 statements
=== CONT TestTenantLogic_alter_table
logic.go:3897: -- test log scope end --
--- FAIL: TestTenantLogic_alter_table (65.51s)
=== RUN TestTenantLogic_alter_table/generated_as_identity_with_seq_option
[05:25:28] --- progress: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table: 421 statements
logic.go:2720:
/home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table:2036: SELECT * FROM t ORDER BY a
expected:
7 1
8 4
9 7
but found (query options: "") :
7 10
8 13
9 16
logic.go:1972:
/home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table:2043: too many errors encountered, skipping the rest of the input
[05:25:29] --- done: /home/roach/.cache/bazel/_bazel_roach/c5a4e7d36696d9cd970af2045211a7df/sandbox/processwrapper-sandbox/3172/execroot/com_github_cockroachdb_cockroach/bazel-out/k8-dbg/bin/pkg/ccl/logictestccl/tests/3node-tenant/3node-tenant_test_/3node-tenant_test.runfiles/com_github_cockroachdb_cockroach/pkg/sql/logictest/testdata/logic_test/alter_table with config 3node-tenant: 425 tests, 2 failures
[05:25:35] --- total progress: 425 statements
--- total: 425 tests, 2 failures
--- FAIL: TestTenantLogic_alter_table/generated_as_identity_with_seq_option (2.88s)
```
<p>Parameters: <code>TAGS=bazel,gss</code>
</p>
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestTenantLogic_alter_table.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
index: 1.0
label: test
binary_label: 1

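The expected/found diff in the failure above (expected `1, 4, 7`; found `10, 13, 16`) is consistent with an identity column backed by a sequence that had advanced three extra steps before the rows were read. A small sketch of that arithmetic; the start and increment values are inferred from the diff, not taken from the test file:

```python
def sequence_values(start, increment, count):
    """Values produced by a sequence-backed identity column."""
    return [start + i * increment for i in range(count)]

# What the logic test expected for the three inserted rows:
print(sequence_values(1, 3, 3))   # [1, 4, 7]
# What it observed: the same sequence, already advanced by three steps.
print(sequence_values(10, 3, 3))  # [10, 13, 16]
```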
Unnamed: 0: 220,854
id: 17,264,893,469
type: IssuesEvent
created_at: 2021-07-22 12:41:34
repo: appium/appium
repo_url: https://api.github.com/repos/appium/appium
action: closed
title: Encountered internal error running command: NoSuchDriverError: A session is either terminated or not started is displayed and AUT is closed
labels: ThirdParty XCUITest
body:
## The problem
"Encountered internal error running command: NoSuchDriverError: A session is either terminated or not started" is displayed and the AUT is closed while Appium Desktop is trying to take a screenshot (or load elements).
## Environment
* Appium version (or git revision) that exhibits the issue: 1.21.0
* Last Appium version that did not exhibit the issue (if applicable):
* Desktop OS/version used to run Appium: macOS Big Sur
* Node.js version (unless using Appium.app|exe):
* Npm or Yarn package manager:
* Mobile platform/version under test: iPhone, iOS 14.6
* Real device or emulator/simulator: Real device
* Appium CLI or Appium.app|exe: Appium.app
## Details
While the Appium.app desktop app is taking a screenshot, the AUT is closed and the error above is displayed.
## Link to Appium logs
```
[debug] [MJSONWP (656069b6)] Calling AppiumDriver.getSession() with args: ["656069b6-928e-4052-b654-542468136075"]
[debug] [XCUITest] Executing command 'getSession'
[info] [XCUITest] Merging WDA caps over Appium caps for session detail response
[debug] [MJSONWP (656069b6)] Responding to client with driver.getSession() result: {"udid":"00008030-001C4D243AA1802E","automationName":"XCUITest","bundleId":"com.carriertransicold.containerLINK","clearSystemFiles":true,"deviceName":"Srinivas’s iPhone","newCommandTimeout":0,"platformName":"iOS","platformVersion":"14.6","showXcodeLog":true,"wdaConnectionTimeout":2400000,"connectHardwareKeyboard":true,"device":"iphone","browserName":"ContainerLINK™","sdkVersion":"14.6","CFBundleIdentifier":"com.carriertransicold.containerLINK","pixelRatio":2,"statBarHeight":48,"viewportRect":{"left":0,"top":96,"width":828,"height":1696}}
[info] [HTTP] <-- GET /wd/hub/session/656069b6-928e-4052-b654-542468136075 200 2 ms - 619
[info] [HTTP]
[info] [HTTP] --> GET /wd/hub/session/656069b6-928e-4052-b654-542468136075/window/current/size
[info] [HTTP] {}
[debug] [MJSONWP (656069b6)] Calling AppiumDriver.getWindowSize() with args: ["current","656069b6-928e-4052-b654-542468136075"]
[debug] [XCUITest] Executing command 'getWindowSize'
[debug] [WD Proxy] Proxying [GET /window/size] to [GET http://127.0.0.1:8100/session/F84C2BB1-AD84-4CE3-BD44-AECFC5DFDF71/window/size] with no body
[error] [Xcode] 2021-07-19 14:02:06.528156+0530 WebDriverAgentRunner-Runner[2712:1623950] Getting the most recent active application (out of 1 total items)
[error] [Xcode]
[error] [Xcode] t = 5426.78s Requesting snapshot of accessibility hierarchy for app with pid 2750
[error] [Xcode] [debug] [WD Proxy] Got response with status 200: {"value":{"width":414,"height":896},"sessionId":"F84C2BB1-AD84-4CE3-BD44-AECFC5DFDF71"}
[debug] [MJSONWP (656069b6)] Responding to client with driver.getWindowSize() result: {"width":414,"height":896}
[info] [HTTP] <-- GET /wd/hub/session/656069b6-928e-4052-b654-542468136075/window/current/size 200 1052 ms - 98
[info] [HTTP]
[error] [Xcode] 2021-07-19 14:02:07.527763+0530 WebDriverAgentRunner-Runner[2712:1623950] Getting the most recent active application (out of 1 total items)
[error] [Xcode]
[info] [HTTP] --> POST /wd/hub/session/656069b6-928e-4052-b654-542468136075/execute
[info] [HTTP] {"script":"mobile:getContexts","args":[]}
[debug] [MJSONWP (656069b6)] Calling AppiumDriver.execute() with args: ["mobile:getContexts",[],"656069b6-928e-4052-b654-542468136075"]
[debug] [XCUITest] Executing command 'execute'
[debug] [XCUITest] Getting list of available contexts
[debug] [iOS] Retrieving contexts and views
[debug] [XCUITest] Selecting by url: false
[debug] [RemoteDebugger] Sending connection key request
[debug] [RemoteDebugger] Sending '_rpc_reportIdentifier:' message (id: 8): 'setConnectionKey'
[debug] [RemoteDebugger] Sending to Web Inspector took 2ms
[debug] [RemoteDebugger] Selecting application
[debug] [RemoteDebugger] No applications currently connected.
[debug] [XCUITest] No web frames found.
[debug] [XCUITest] No webviews found in 2ms
[debug] [MJSONWP (656069b6)] Responding to client with driver.execute() result: [{"id":"NATIVE_APP"}]
[info] [HTTP] <-- POST /wd/hub/session/656069b6-928e-4052-b654-542468136075/execute 200 8 ms - 93
[info] [HTTP]
[info] [HTTP] --> GET /wd/hub/session/656069b6-928e-4052-b654-542468136075/source
[info] [HTTP] {}
[debug] [MJSONWP (656069b6)] Calling AppiumDriver.getPageSource() with args: ["656069b6-928e-4052-b654-542468136075"]
[debug] [XCUITest] Executing command 'getPageSource'
[debug] [WD Proxy] Matched '/source' to command name 'getPageSource'
[debug] [WD Proxy] Proxying [GET /source] to [GET http://127.0.0.1:8100/session/F84C2BB1-AD84-4CE3-BD44-AECFC5DFDF71/source] with no body
[error] [Xcode] 2021-07-19 14:02:07.576409+0530 WebDriverAgentRunner-Runner[2712:1623950] Getting the most recent active application (out of 1 total items)
[error] [Xcode]
[error] [Xcode] 2021-07-19 14:02:07.576790+0530 WebDriverAgentRunner-Runner[2712:1623950] The following attributes were requested to be included into the XML: {(
[error] [Xcode] FBHeightAttribute,
[error] [Xcode] FBAccessibleAttribute,
[error] [Xcode] FBValueAttribute,
[error] [Xcode] FBVisibleAttribute,
[error] [Xcode] FBWidthAttribute,
[error] [Xcode] FBEnabledAttribute,
[error] [Xcode] FBTypeAttribute,
[error] [Xcode] FBYAttribute,
[error] [Xcode] FBLabelAttribute,
[error] [Xcode] FBIndexAttribute,
[error] [Xcode] FBXAttribute,
[error] [Xcode] FBNameAttribute
[error] [Xcode] )}
[error] [Xcode] 2021-07-19 14:02:07.576898+0530 WebDriverAgentRunner-Runner[2712:1623950] Waiting up to 2s until com.carriertransicold.containerLINK is in idle state (including animations)
[error] [Xcode] t = 5427.83s Wait for com.carriertransicold.containerLINK to idle
[error] [Xcode]
[error] [Xcode] t = 5427.83s Requesting snapshot of accessibility hierarchy for app with pid 2750
[error] [Xcode] [error] [Xcode] 2021-07-19 14:04:38.548952+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot take a snapshot with attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeElementBaseType",
[error] [Xcode] "XC_kAXXCAttributeHasNativeFocus",
[error] [Xcode] "XC_kAXXCAttributeTruncatedValue",
[error] [Xcode] "XC_kAXXCAttributeViewControllerTitle",
[error] [Xcode] "XC_kAXXCAttributeUserTestingElements",
[error] [Xcode] "XC_kAXXCAttributeAutomationType",
[error] [Xcode] "XC_kAXXCAttributeBannerIsStickyAttribute",
[error] [Xcode] "XC_kAXXCAttributeIdentifier",
[error] [Xcode] "XC_kAXXCAttributeTraits",
[error] [Xcode] "XC_kAXXCAttributeFrame",
[error] [Xcode] "XC_kAXXCAttributeVerticalSizeClass",
[error] [Xcode] "XC_kAXXCAttributeViewControllerClassName",
[error] [Xcode] "XC_kAXXCAttributeIsUserInteractionEnabled",
[error] [Xcode] "XC_kAXXCAttributeLabel",
[error] [Xcode] "XC_kAXXCAttributeURL",
[error] [Xcode] "XC_kAXXCAttributeParent",
[error] [Xcode] "XC_kAXXCAttributeHorizontalSizeClass",
[error] [Xcode] "XC_kAXXCAttributePlaceholderValue",
[error] [Xcode] "XC_kAXXCAttributeElementType",
[error] [Xcode] "XC_kAXXCAttributeIsVisible"
[error] [Xcode] ) of 'XCUIElementTypeAny' after 600.00 seconds
[error] [Xcode] 2021-07-19 14:04:38.549114+0530 WebDriverAgentRunner-Runner[2712:1623950] This timeout could be customized via 'customSnapshotTimeout' setting
[error] [Xcode] 2021-07-19 14:04:38.549189+0530 WebDriverAgentRunner-Runner[2712:1623950] Internal error: XCTPerformOnMainRunLoop work timed out after 60.0s
[error] [Xcode] 2021-07-19 14:04:38.549268+0530 WebDriverAgentRunner-Runner[2712:1623950] Falling back to the default snapshotting mechanism for the element 'XCUIElementTypeAny' (some attribute values, like visibility or accessibility might not be precise though)
[error] [Xcode] [error] [Xcode] 2021-07-19 14:07:08.576537+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeIsElement"
[error] [Xcode] ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorIPCTimeout from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25216, NSLocalizedDescription=Accessibility error kAXErrorIPCTimeout from AXUIElementCopyMultipleAttributeValues for 2016}
[error] [Xcode] [info] [HTTP] <-- POST /wd/hub/session/f80b4c3a-8fbc-470a-b614-7d1b95f4073d/elements - - ms - -
[info] [HTTP]
[info] [HTTP] --> POST /wd/hub/session/f80b4c3a-8fbc-470a-b614-7d1b95f4073d/elements
[info] [HTTP] {"using":"xpath","value":"(//XCUIElementTypeStaticText[@name=\"Select All\"])[1]"}
[debug] [MJSONWP (f80b4c3a)] **Encountered internal error running command: NoSuchDriverError: A session is either terminated or not started**
[debug] [MJSONWP (f80b4c3a)] at asyncHandler (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/appium-base-driver/lib/protocol/protocol.js:243:15)
[debug] [MJSONWP (f80b4c3a)] at /Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/appium-base-driver/lib/protocol/protocol.js:423:15
[debug] [MJSONWP (f80b4c3a)] at Layer.handle [as handle_request] (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/layer.js:95:5)
[debug] [MJSONWP (f80b4c3a)] at next (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/route.js:137:13)
[debug] [MJSONWP (f80b4c3a)] at Route.dispatch (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/route.js:112:3)
[debug] [MJSONWP (f80b4c3a)] at Layer.handle [as handle_request] (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/layer.js:95:5)
[debug] [MJSONWP (f80b4c3a)] at /Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:281:22
[debug] [MJSONWP (f80b4c3a)] at param (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:354:14)
[debug] [MJSONWP (f80b4c3a)] at param (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:365:14)
[debug] [MJSONWP (f80b4c3a)] at Function.process_params (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:410:3)
[debug] [MJSONWP (f80b4c3a)] at next (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:275:10)
[debug] [MJSONWP (f80b4c3a)] at logger (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/morgan/index.js:144:5)
[debug] [MJSONWP (f80b4c3a)] at Layer.handle [as handle_request] (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/layer.js:95:5)
[debug] [MJSONWP (f80b4c3a)] at trim_prefix (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:317:13)
[debug] [MJSONWP (f80b4c3a)] at /Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:284:7
[debug] [MJSONWP (f80b4c3a)] at Function.process_params (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:335:12)
[debug] [MJSONWP (f80b4c3a)] at next (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:275:10)
[debug] [MJSONWP (f80b4c3a)] at /Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/body-parser/lib/read.js:130:5
[debug] [MJSONWP (f80b4c3a)] at invokeCallback (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/raw-body/index.js:224:16)
[debug] [MJSONWP (f80b4c3a)] at done (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/raw-body/index.js:213:7)
[debug] [MJSONWP (f80b4c3a)] at IncomingMessage.onEnd (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/raw-body/index.js:273:7)
[debug] [MJSONWP (f80b4c3a)] at IncomingMessage.emit (events.js:208:15)
[debug] [MJSONWP (f80b4c3a)] at endReadableNT (_stream_readable.js:1168:12)
[debug] [MJSONWP (f80b4c3a)] at processTicksAndRejections (internal/process/task_queues.js:77:11)
[info] [HTTP] <-- POST /wd/hub/session/f80b4c3a-8fbc-470a-b614-7d1b95f4073d/elements 404 25 ms - 131
[info] [HTTP]
[error] [Xcode] 2021-07-19 14:09:36.998475+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeIsElement"
[error] [Xcode] ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25204, NSLocalizedDescription=Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016}
[error] [Xcode]
[error] [Xcode] 2021-07-19 14:09:37.004420+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeIsElement"
[error] [Xcode] ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25204, NSLocalizedDescription=Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016}
[error] [Xcode] 2021-07-19 14:09:37.010448+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeIsElement"
[error] [Xcode] ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25204, NSLocalizedDescription=Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016}
```
[error] [35m[Xcode][39m
[error] [35m[Xcode][39m 2021-07-19 14:09:37.017429+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [35m[Xcode][39m "XC_kAXXCAttributeIsElement"
[error] [35m[Xcode][39m ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25204, NSLocalizedDescription=Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016}
[error] [35m[Xcode][39m
[error] [35m[Xcode][39m 2021-07-19 14:09:37.027483+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [35m[Xcode][39m "XC_kAXXCAttributeIsElement"
[error] [35m[Xcode][39m ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25204, NSLocalizedDescription=Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016}
[error] [35m[Xcode][39m
[error] [35m[Xcode][39m 2021-07-19 14:09:37.033365+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [35m[Xcode][39m "XC_kAXXCAttributeIsElement"
[error] [35m[Xcode][39m ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25204, NSLocalizedDescription=Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016}
## Code To Reproduce Issue [ Good To Have ]
Please remember that with sample code it's easier to reproduce the bug and it's much faster to fix it.
Please git clone https://github.com/appium/appium and from the `sample-code` directory, use one of your favourite languages and sample apps to reproduce the issue.
In case a similar scenario is missing in sample-code, please submit a PR with one of the sample apps provided.
|
1.0
|
Encountered internal error running command: NoSuchDriverError: A session is either terminated or not started is displayed and AUT is closed - ## The problem
Encountered internal error running command: NoSuchDriverError: A session is either terminated or not started is displayed and the AUT is closed while Appium Desktop is trying to take a screenshot (or load elements)
## Environment
* Appium version (or git revision) that exhibits the issue: 1.21.0
* Last Appium version that did not exhibit the issue (if applicable):
* Desktop OS/version used to run Appium: macOS Big Sur
* Node.js version (unless using Appium.app|exe):
* Npm or Yarn package manager:
* Mobile platform/version under test: iPhone 14.6
* Real device or emulator/simulator: Real device
* Appium CLI or Appium.app|exe: Appium.app
## Details
While the Appium.app desktop app is taking a screenshot, the AUT is closed and the error is displayed.
## Link to Appium logs
[debug] [MJSONWP (656069b6)] Calling AppiumDriver.getSession() with args: ["656069b6-928e-4052-b654-542468136075"]
[debug] [XCUITest] Executing command 'getSession'
[info] [XCUITest] Merging WDA caps over Appium caps for session detail response
[debug] [MJSONWP (656069b6)] Responding to client with driver.getSession() result: {"udid":"00008030-001C4D243AA1802E","automationName":"XCUITest","bundleId":"com.carriertransicold.containerLINK","clearSystemFiles":true,"deviceName":"Srinivas’s iPhone","newCommandTimeout":0,"platformName":"iOS","platformVersion":"14.6","showXcodeLog":true,"wdaConnectionTimeout":2400000,"connectHardwareKeyboard":true,"device":"iphone","browserName":"ContainerLINK™","sdkVersion":"14.6","CFBundleIdentifier":"com.carriertransicold.containerLINK","pixelRatio":2,"statBarHeight":48,"viewportRect":{"left":0,"top":96,"width":828,"height":1696}}
[info] [HTTP] <-- GET /wd/hub/session/656069b6-928e-4052-b654-542468136075 200 2 ms - 619
[info] [HTTP]
[info] [HTTP] --> GET /wd/hub/session/656069b6-928e-4052-b654-542468136075/window/current/size
[info] [HTTP] {}
[debug] [MJSONWP (656069b6)] Calling AppiumDriver.getWindowSize() with args: ["current","656069b6-928e-4052-b654-542468136075"]
[debug] [XCUITest] Executing command 'getWindowSize'
[debug] [WD Proxy] Proxying [GET /window/size] to [GET http://127.0.0.1:8100/session/F84C2BB1-AD84-4CE3-BD44-AECFC5DFDF71/window/size] with no body
[error] [Xcode] 2021-07-19 14:02:06.528156+0530 WebDriverAgentRunner-Runner[2712:1623950] Getting the most recent active application (out of 1 total items)
[error] [Xcode]
[error] [Xcode] t = 5426.78s Requesting snapshot of accessibility hierarchy for app with pid 2750
[debug] [WD Proxy] Got response with status 200: {"value":{"width":414,"height":896},"sessionId":"F84C2BB1-AD84-4CE3-BD44-AECFC5DFDF71"}
[debug] [MJSONWP (656069b6)] Responding to client with driver.getWindowSize() result: {"width":414,"height":896}
[info] [HTTP] <-- GET /wd/hub/session/656069b6-928e-4052-b654-542468136075/window/current/size 200 1052 ms - 98
[info] [HTTP]
[error] [Xcode] 2021-07-19 14:02:07.527763+0530 WebDriverAgentRunner-Runner[2712:1623950] Getting the most recent active application (out of 1 total items)
[error] [Xcode]
[info] [HTTP] --> POST /wd/hub/session/656069b6-928e-4052-b654-542468136075/execute
[info] [HTTP] {"script":"mobile:getContexts","args":[]}
[debug] [MJSONWP (656069b6)] Calling AppiumDriver.execute() with args: ["mobile:getContexts",[],"656069b6-928e-4052-b654-542468136075"]
[debug] [XCUITest] Executing command 'execute'
[debug] [XCUITest] Getting list of available contexts
[debug] [iOS] Retrieving contexts and views
[debug] [XCUITest] Selecting by url: false
[debug] [RemoteDebugger] Sending connection key request
[debug] [RemoteDebugger] Sending '_rpc_reportIdentifier:' message (id: 8): 'setConnectionKey'
[debug] [RemoteDebugger] Sending to Web Inspector took 2ms
[debug] [RemoteDebugger] Selecting application
[debug] [RemoteDebugger] No applications currently connected.
[debug] [XCUITest] No web frames found.
[debug] [XCUITest] No webviews found in 2ms
[debug] [MJSONWP (656069b6)] Responding to client with driver.execute() result: [{"id":"NATIVE_APP"}]
[info] [HTTP] <-- POST /wd/hub/session/656069b6-928e-4052-b654-542468136075/execute 200 8 ms - 93
[info] [HTTP]
[info] [HTTP] --> GET /wd/hub/session/656069b6-928e-4052-b654-542468136075/source
[info] [HTTP] {}
[debug] [MJSONWP (656069b6)] Calling AppiumDriver.getPageSource() with args: ["656069b6-928e-4052-b654-542468136075"]
[debug] [XCUITest] Executing command 'getPageSource'
[debug] [WD Proxy] Matched '/source' to command name 'getPageSource'
[debug] [WD Proxy] Proxying [GET /source] to [GET http://127.0.0.1:8100/session/F84C2BB1-AD84-4CE3-BD44-AECFC5DFDF71/source] with no body
[error] [Xcode] 2021-07-19 14:02:07.576409+0530 WebDriverAgentRunner-Runner[2712:1623950] Getting the most recent active application (out of 1 total items)
[error] [Xcode]
[error] [Xcode] 2021-07-19 14:02:07.576790+0530 WebDriverAgentRunner-Runner[2712:1623950] The following attributes were requested to be included into the XML: {(
[error] [Xcode] FBHeightAttribute,
[error] [Xcode] FBAccessibleAttribute,
[error] [Xcode] FBValueAttribute,
[error] [Xcode] FBVisibleAttribute,
[error] [Xcode] FBWidthAttribute,
[error] [Xcode] FBEnabledAttribute,
[error] [Xcode] FBTypeAttribute,
[error] [Xcode] FBYAttribute,
[error] [Xcode] FBLabelAttribute,
[error] [Xcode] FBIndexAttribute,
[error] [Xcode] FBXAttribute,
[error] [Xcode] FBNameAttribute
[error] [Xcode] )}
[error] [Xcode] 2021-07-19 14:02:07.576898+0530 WebDriverAgentRunner-Runner[2712:1623950] Waiting up to 2s until com.carriertransicold.containerLINK is in idle state (including animations)
[error] [Xcode] t = 5427.83s Wait for com.carriertransicold.containerLINK to idle
[error] [Xcode]
[error] [Xcode] t = 5427.83s Requesting snapshot of accessibility hierarchy for app with pid 2750
[error] [Xcode] 2021-07-19 14:04:38.548952+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot take a snapshot with attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeElementBaseType",
[error] [Xcode] "XC_kAXXCAttributeHasNativeFocus",
[error] [Xcode] "XC_kAXXCAttributeTruncatedValue",
[error] [Xcode] "XC_kAXXCAttributeViewControllerTitle",
[error] [Xcode] "XC_kAXXCAttributeUserTestingElements",
[error] [Xcode] "XC_kAXXCAttributeAutomationType",
[error] [Xcode] "XC_kAXXCAttributeBannerIsStickyAttribute",
[error] [Xcode] "XC_kAXXCAttributeIdentifier",
[error] [Xcode] "XC_kAXXCAttributeTraits",
[error] [Xcode] "XC_kAXXCAttributeFrame",
[error] [Xcode] "XC_kAXXCAttributeVerticalSizeClass",
[error] [Xcode] "XC_kAXXCAttributeViewControllerClassName",
[error] [Xcode] "XC_kAXXCAttributeIsUserInteractionEnabled",
[error] [Xcode] "XC_kAXXCAttributeLabel",
[error] [Xcode] "XC_kAXXCAttributeURL",
[error] [Xcode] "XC_kAXXCAttributeParent",
[error] [Xcode] "XC_kAXXCAttributeHorizontalSizeClass",
[error] [Xcode] "XC_kAXXCAttributePlaceholderValue",
[error] [Xcode] "XC_kAXXCAttributeElementType",
[error] [Xcode] "XC_kAXXCAttributeIsVisible"
[error] [Xcode] ) of 'XCUIElementTypeAny' after 600.00 seconds
[error] [Xcode] 2021-07-19 14:04:38.549114+0530 WebDriverAgentRunner-Runner[2712:1623950] This timeout could be customized via 'customSnapshotTimeout' setting
[error] [Xcode] 2021-07-19 14:04:38.549189+0530 WebDriverAgentRunner-Runner[2712:1623950] Internal error: XCTPerformOnMainRunLoop work timed out after 60.0s
[error] [Xcode] 2021-07-19 14:04:38.549268+0530 WebDriverAgentRunner-Runner[2712:1623950] Falling back to the default snapshotting mechanism for the element 'XCUIElementTypeAny' (some attribute values, like visibility or accessibility might not be precise though)
[error] [Xcode] 2021-07-19 14:07:08.576537+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeIsElement"
[error] [Xcode] ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorIPCTimeout from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25216, NSLocalizedDescription=Accessibility error kAXErrorIPCTimeout from AXUIElementCopyMultipleAttributeValues for 2016}
[info] [HTTP] <-- POST /wd/hub/session/f80b4c3a-8fbc-470a-b614-7d1b95f4073d/elements - - ms - -
[info] [HTTP]
[info] [HTTP] --> POST /wd/hub/session/f80b4c3a-8fbc-470a-b614-7d1b95f4073d/elements
[info] [HTTP] {"using":"xpath","value":"(//XCUIElementTypeStaticText[@name=\"Select All\"])[1]"}
[debug] [MJSONWP (f80b4c3a)] **Encountered internal error running command: NoSuchDriverError: A session is either terminated or not started**
[debug] [MJSONWP (f80b4c3a)] at asyncHandler (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/appium-base-driver/lib/protocol/protocol.js:243:15)
[debug] [MJSONWP (f80b4c3a)] at /Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/appium-base-driver/lib/protocol/protocol.js:423:15
[debug] [MJSONWP (f80b4c3a)] at Layer.handle [as handle_request] (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/layer.js:95:5)
[debug] [MJSONWP (f80b4c3a)] at next (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/route.js:137:13)
[debug] [MJSONWP (f80b4c3a)] at Route.dispatch (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/route.js:112:3)
[debug] [MJSONWP (f80b4c3a)] at Layer.handle [as handle_request] (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/layer.js:95:5)
[debug] [MJSONWP (f80b4c3a)] at /Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:281:22
[debug] [MJSONWP (f80b4c3a)] at param (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:354:14)
[debug] [MJSONWP (f80b4c3a)] at param (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:365:14)
[debug] [MJSONWP (f80b4c3a)] at Function.process_params (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:410:3)
[debug] [MJSONWP (f80b4c3a)] at next (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:275:10)
[debug] [MJSONWP (f80b4c3a)] at logger (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/morgan/index.js:144:5)
[debug] [MJSONWP (f80b4c3a)] at Layer.handle [as handle_request] (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/layer.js:95:5)
[debug] [MJSONWP (f80b4c3a)] at trim_prefix (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:317:13)
[debug] [MJSONWP (f80b4c3a)] at /Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:284:7
[debug] [MJSONWP (f80b4c3a)] at Function.process_params (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:335:12)
[debug] [MJSONWP (f80b4c3a)] at next (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/express/lib/router/index.js:275:10)
[debug] [MJSONWP (f80b4c3a)] at /Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/body-parser/lib/read.js:130:5
[debug] [MJSONWP (f80b4c3a)] at invokeCallback (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/raw-body/index.js:224:16)
[debug] [MJSONWP (f80b4c3a)] at done (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/raw-body/index.js:213:7)
[debug] [MJSONWP (f80b4c3a)] at IncomingMessage.onEnd (/Applications/Appium.app/Contents/Resources/app/node_modules/appium/node_modules/raw-body/index.js:273:7)
[debug] [MJSONWP (f80b4c3a)] at IncomingMessage.emit (events.js:208:15)
[debug] [MJSONWP (f80b4c3a)] at endReadableNT (_stream_readable.js:1168:12)
[debug] [MJSONWP (f80b4c3a)] at processTicksAndRejections (internal/process/task_queues.js:77:11)
[info] [HTTP] <-- POST /wd/hub/session/f80b4c3a-8fbc-470a-b614-7d1b95f4073d/elements 404 25 ms - 131
[info] [HTTP]
[error] [Xcode] 2021-07-19 14:09:36.998475+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeIsElement"
[error] [Xcode] ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25204, NSLocalizedDescription=Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016}
[error] [Xcode]
[error] [Xcode] 2021-07-19 14:09:37.004420+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeIsElement"
[error] [Xcode] ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25204, NSLocalizedDescription=Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016}
[error] [Xcode] 2021-07-19 14:09:37.010448+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeIsElement"
[error] [Xcode] ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25204, NSLocalizedDescription=Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016}
[error] [Xcode]
[error] [Xcode] 2021-07-19 14:09:37.017429+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeIsElement"
[error] [Xcode] ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25204, NSLocalizedDescription=Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016}
[error] [Xcode]
[error] [Xcode] 2021-07-19 14:09:37.027483+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeIsElement"
[error] [Xcode] ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25204, NSLocalizedDescription=Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016}
[error] [Xcode]
[error] [Xcode] 2021-07-19 14:09:37.033365+0530 WebDriverAgentRunner-Runner[2712:1623950] Cannot retrieve element attribute(s) (
[error] [Xcode] "XC_kAXXCAttributeIsElement"
[error] [Xcode] ). Original error: Error Domain=com.apple.dt.xctest.automation-support.error Code=11 "Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016" UserInfo={accessibility-error=-25204, NSLocalizedDescription=Accessibility error kAXErrorCannotComplete from AXUIElementCopyMultipleAttributeValues for 2016}
## Code To Reproduce Issue [ Good To Have ]
Please remember that with sample code it's easier to reproduce the bug and it's much faster to fix it.
Please git clone https://github.com/appium/appium and from the `sample-code` directory, use one of your favourite languages and sample apps to reproduce the issue.
In case a similar scenario is missing in sample-code, please submit a PR with one of the sample apps provided.
|
test
|
encountered internal error running command nosuchdrivererror a session is either terminated or not started is displayed and aut is closed the problem encountered internal error running command nosuchdrivererror a session is either terminated or not started is displayed and aut is closed while appium desktop is trying to take screenshot or load elements environment appium version or git revision that exhibits the issue last appium version that did not exhibit the issue if applicable desktop os version used to run appium macos bigsur node js version unless using appium app exe npm or yarn package manager mobile platform version under test iphone real device or emulator simulator real device appium cli or appium app exe appium app details appium app desktop app while taking screenshot aut is closed and error is displayed link to appium logs executing command getsession merging wda caps over appium caps for session detail response responding to client with driver getsession result udid automationname xcuitest bundleid com carriertransicold containerlink clearsystemfiles true devicename srinivas’s iphone newcommandtimeout platformname ios platformversion showxcodelog true wdaconnectiontimeout connecthardwarekeyboard true device iphone browsername containerlink™ sdkversion cfbundleidentifier com carriertransicold containerlink pixelratio statbarheight viewportrect left top width height get wd hub session ms wd hub session window current size executing command getwindowsize to with no body getting the most recent active application out of total items t requesting snapshot of accessibility hierarchy for app with pid got response with status value width height sessionid responding to client with driver getwindowsize result width height get wd hub session window current size ms getting the most recent active application out of total items wd hub session execute executing command execute getting list of available contexts retrieving contexts and views selecting by url false 
sending connection key request sending rpc reportidentifier message id setconnectionkey sending to web inspector took selecting application no applications currently connected no web frames found no webviews found in post wd hub session execute ms wd hub session source executing command getpagesource matched source to command name getpagesource to with no body getting the most recent active application out of total items the following attributes were requested to be included into the xml fbheightattribute fbaccessibleattribute fbvalueattribute fbvisibleattribute fbwidthattribute fbenabledattribute fbtypeattribute fbyattribute fblabelattribute fbindexattribute fbxattribute fbnameattribute waiting up to until com carriertransicold containerlink is in idle state including animations t wait for com carriertransicold containerlink to idle t requesting snapshot of accessibility hierarchy for app with pid cannot take a snapshot with attribute s xc kaxxcattributeelementbasetype xc kaxxcattributehasnativefocus xc kaxxcattributetruncatedvalue xc kaxxcattributeviewcontrollertitle xc kaxxcattributeusertestingelements xc kaxxcattributeautomationtype xc kaxxcattributebannerisstickyattribute xc kaxxcattributeidentifier xc kaxxcattributetraits xc kaxxcattributeframe xc kaxxcattributeverticalsizeclass xc kaxxcattributeviewcontrollerclassname xc kaxxcattributeisuserinteractionenabled xc kaxxcattributelabel xc kaxxcattributeurl xc kaxxcattributeparent xc kaxxcattributehorizontalsizeclass xc kaxxcattributeplaceholdervalue xc kaxxcattributeelementtype xc kaxxcattributeisvisible of xcuielementtypeany after seconds this timeout could be customized via customsnapshottimeout setting internal error xctperformonmainrunloop work timed out after falling back to the default snapshotting mechanism for the element xcuielementtypeany some attribute values like visibility or accessibility might not be precise though cannot retrieve element attribute s xc kaxxcattributeiselement original error error 
domain com apple dt xctest automation support error code accessibility error kaxerroripctimeout from axuielementcopymultipleattributevalues for userinfo accessibility error nslocalizeddescription accessibility error kaxerroripctimeout from axuielementcopymultipleattributevalues for post wd hub session elements ms wd hub session elements encountered internal error running command nosuchdrivererror a session is either terminated or not started at asynchandler applications appium app contents resources app node modules appium node modules appium base driver lib protocol protocol js at applications appium app contents resources app node modules appium node modules appium base driver lib protocol protocol js applications appium app contents resources app node modules appium node modules express lib router layer js at next applications appium app contents resources app node modules appium node modules express lib router route js at route dispatch applications appium app contents resources app node modules appium node modules express lib router route js applications appium app contents resources app node modules appium node modules express lib router layer js at applications appium app contents resources app node modules appium node modules express lib router index js at param applications appium app contents resources app node modules appium node modules express lib router index js at param applications appium app contents resources app node modules appium node modules express lib router index js at function process params applications appium app contents resources app node modules appium node modules express lib router index js at next applications appium app contents resources app node modules appium node modules express lib router index js at logger applications appium app contents resources app node modules appium node modules morgan index js applications appium app contents resources app node modules appium node modules express lib router layer js at trim prefix 
applications appium app contents resources app node modules appium node modules express lib router index js at applications appium app contents resources app node modules appium node modules express lib router index js at function process params applications appium app contents resources app node modules appium node modules express lib router index js at next applications appium app contents resources app node modules appium node modules express lib router index js at applications appium app contents resources app node modules appium node modules body parser lib read js at invokecallback applications appium app contents resources app node modules appium node modules raw body index js at done applications appium app contents resources app node modules appium node modules raw body index js at incomingmessage onend applications appium app contents resources app node modules appium node modules raw body index js at incomingmessage emit events js at endreadablent stream readable js at processticksandrejections internal process task queues js post wd hub session elements ms cannot retrieve element attribute s xc kaxxcattributeiselement original error error domain com apple dt xctest automation support error code accessibility error kaxerrorcannotcomplete from axuielementcopymultipleattributevalues for userinfo accessibility error nslocalizeddescription accessibility error kaxerrorcannotcomplete from axuielementcopymultipleattributevalues for cannot retrieve element attribute s xc kaxxcattributeiselement original error error domain com apple dt xctest automation support error code accessibility error kaxerrorcannotcomplete from axuielementcopymultipleattributevalues for userinfo accessibility error nslocalizeddescription accessibility error kaxerrorcannotcomplete from axuielementcopymultipleattributevalues for cannot retrieve element attribute s xc kaxxcattributeiselement original error error domain com apple dt xctest automation support error code accessibility error 
kaxerrorcannotcomplete from axuielementcopymultipleattributevalues for userinfo accessibility error nslocalizeddescription accessibility error kaxerrorcannotcomplete from axuielementcopymultipleattributevalues for cannot retrieve element attribute s xc kaxxcattributeiselement original error error domain com apple dt xctest automation support error code accessibility error kaxerrorcannotcomplete from axuielementcopymultipleattributevalues for userinfo accessibility error nslocalizeddescription accessibility error kaxerrorcannotcomplete from axuielementcopymultipleattributevalues for cannot retrieve element attribute s xc kaxxcattributeiselement original error error domain com apple dt xctest automation support error code accessibility error kaxerrorcannotcomplete from axuielementcopymultipleattributevalues for userinfo accessibility error nslocalizeddescription accessibility error kaxerrorcannotcomplete from axuielementcopymultipleattributevalues for cannot retrieve element attribute s xc kaxxcattributeiselement original error error domain com apple dt xctest automation support error code accessibility error kaxerrorcannotcomplete from axuielementcopymultipleattributevalues for userinfo accessibility error nslocalizeddescription accessibility error kaxerrorcannotcomplete from axuielementcopymultipleattributevalues for code to reproduce issue please remember that with sample code it s easier to reproduce the bug and it s much faster to fix it please git clone and from the sample code directory use one of your favourite languages and sample apps to reproduce the issue in case a similar scenario is missing in sample code please submit a pr with one of the sample apps provided
| 1
|
286,686
| 21,604,865,186
|
IssuesEvent
|
2022-05-04 00:35:20
|
ICEI-PUC-Minas-PMV-ADS/pmv-ads-2022-1-e2-proj-int-t3-adotar-pets
|
https://api.github.com/repos/ICEI-PUC-Minas-PMV-ADS/pmv-ads-2022-1-e2-proj-int-t3-adotar-pets
|
closed
|
GitHub Project with Backlog and Sprints
|
documentation
|
Review the project backlog and launch the board with the sprint plan
|
1.0
|
GitHub Project with Backlog and Sprints - Review the project backlog and launch the board with the sprint plan
|
non_test
|
github project with backlog and sprints review the project backlog and launch the board with the sprint plan
| 0
|
787,012
| 27,701,724,198
|
IssuesEvent
|
2023-03-14 08:32:04
|
AY2223S2-CS2103-W17-1/tp
|
https://api.github.com/repos/AY2223S2-CS2103-W17-1/tp
|
closed
|
View a student’s tasks
|
type.Story priority.high
|
As a user I can see the tasks for a student so that I can check what tasks the student has.
|
1.0
|
View a student’s tasks - As a user I can see the tasks for a student so that I can check what tasks the student has.
|
non_test
|
view a student’s tasks as a user i can see the tasks for a student so that i can check what tasks the student has
| 0
|
121,447
| 25,971,537,253
|
IssuesEvent
|
2022-12-19 11:38:56
|
Clueless-Community/seamless-ui
|
https://api.github.com/repos/Clueless-Community/seamless-ui
|
opened
|
Improve search-with-btn
|
codepeak 22
|
Need to improve this component: search-with-btn
## Improvements required :
1. Design is not as per the Figma file
2. Make three different size
## Any reference image?

|
1.0
|
Improve search-with-btn - Need to improve this component: search-with-btn
## Improvements required :
1. Design is not as per the Figma file
2. Make three different size
## Any reference image?

|
non_test
|
improve search with btn need to improve this component search with btn improvements required design is not as per the figma file make three different size any reference image
| 0
|
102,247
| 8,822,828,295
|
IssuesEvent
|
2019-01-02 11:00:58
|
status-im/status-react
|
https://api.github.com/repos/status-im/status-react
|
closed
|
No 'Intrinsic gas too low' popup when sending TX with Gas Limit < 21000
|
bug e2e test blocker medium-severity stale wallet
|
***Type***: Bug
***Summary***: We don't show 'Intrinsic gas too low' pop-up warning when a contact sends a transaction (from 1-1 chat / Dapp / directly from Wallet) with Gas Limit < 21000. A transaction is not sent as expected, but we need to show up the pop-up (that we have in 0.9.27 release) notifying user for what's happened.
#### Expected behavior
'Intrinsic gas too low' popup appears when sending TX with Gas Limit < 21000
#### Actual behavior
No 'Intrinsic gas too low' popup appears when sending TX with Gas Limit < 21000
### Reproduction
1) Open Status and create an account
2) Switch to Ropsten network and request test ETH (from Simple Test Dapp)
3) Add a contact to initiate 1-1 chat with it
4) Navigate to 1-1 chat and send a `/send ETH 0` message
5) Once Wallet screen is opened, change the Gas Limit value to `20999` in transaction fee screen
6) Sign this transaction with valid password
**Actual result:** No pop-up with the 'Intrinsic gas too low' message.
7) Navigate to `Wallet` -> `Send tranaction`:
8) Specify recipient, set 0 value in `Amount` field
9) Change the Gas Limit value to `20999` in transaction fee screen
10) Sign this transaction with valid password
**Actual result:** No pop-up with the 'Intrinsic gas too low' message.
11) Open `Simple Test Dapp` -> `Assets` -> tap `Request STT`
12) Once Wallet screen is opened, change the Gas Limit value to `20999` in transaction fee screen
13) Sign this transaction with valid password
**Actual result:** No pop-up with the 'Intrinsic gas too low' message.
### Additional Information
* Status version: Develop 0.9.28 (8561); node 0.14.1
* Operating System: Android and iOS
#### Logs
TF session with logs: https://app.testfairy.com/projects/4803622-status/builds/8597460/sessions/4401830477/?accessToken=FrUZwjmQiDPtcfCCrE5P99qGPP8
<blockquote><div><strong><a href="https://app.testfairy.com/projects/4803622-status/builds/8597460/sessions/4401830477/?accessToken=FrUZwjmQiDPtcfCCrE5P99qGPP8">TestFairy: Status 0.9.28 (8561) @2018</a></strong></div></blockquote>
|
1.0
|
No 'Intrinsic gas too low' popup when sending TX with Gas Limit < 21000 -
***Type***: Bug
***Summary***: We don't show 'Intrinsic gas too low' pop-up warning when a contact sends a transaction (from 1-1 chat / Dapp / directly from Wallet) with Gas Limit < 21000. A transaction is not sent as expected, but we need to show up the pop-up (that we have in 0.9.27 release) notifying user for what's happened.
#### Expected behavior
'Intrinsic gas too low' popup appears when sending TX with Gas Limit < 21000
#### Actual behavior
No 'Intrinsic gas too low' popup appears when sending TX with Gas Limit < 21000
### Reproduction
1) Open Status and create an account
2) Switch to Ropsten network and request test ETH (from Simple Test Dapp)
3) Add a contact to initiate 1-1 chat with it
4) Navigate to 1-1 chat and send a `/send ETH 0` message
5) Once Wallet screen is opened, change the Gas Limit value to `20999` in transaction fee screen
6) Sign this transaction with valid password
**Actual result:** No pop-up with the 'Intrinsic gas too low' message.
7) Navigate to `Wallet` -> `Send tranaction`:
8) Specify recipient, set 0 value in `Amount` field
9) Change the Gas Limit value to `20999` in transaction fee screen
10) Sign this transaction with valid password
**Actual result:** No pop-up with the 'Intrinsic gas too low' message.
11) Open `Simple Test Dapp` -> `Assets` -> tap `Request STT`
12) Once Wallet screen is opened, change the Gas Limit value to `20999` in transaction fee screen
13) Sign this transaction with valid password
**Actual result:** No pop-up with the 'Intrinsic gas too low' message.
### Additional Information
* Status version: Develop 0.9.28 (8561); node 0.14.1
* Operating System: Android and iOS
#### Logs
TF session with logs: https://app.testfairy.com/projects/4803622-status/builds/8597460/sessions/4401830477/?accessToken=FrUZwjmQiDPtcfCCrE5P99qGPP8
<blockquote><div><strong><a href="https://app.testfairy.com/projects/4803622-status/builds/8597460/sessions/4401830477/?accessToken=FrUZwjmQiDPtcfCCrE5P99qGPP8">TestFairy: Status 0.9.28 (8561) @2018</a></strong></div></blockquote>
|
test
|
no intrinsic gas too low popup when sending tx with gas limit type bug summary we don t show intrinsic gas too low pop up warning when a contact sends a transaction from chat dapp directly from wallet with gas limit a transaction is not sent as expected but we need to show up the pop up that we have in release notifying user for what s happened expected behavior intrinsic gas too low popup appears when sending tx with gas limit actual behavior no intrinsic gas too low popup appears when sending tx with gas limit reproduction open status and create an account switch to ropsten network and request test eth from simple test dapp add a contact to initiate chat with it navigate to chat and send a send eth message once wallet screen is opened change the gas limit value to in transaction fee screen sign this transaction with valid password actual result no pop up with the intrinsic gas too low message navigate to wallet send tranaction specify recipient set value in amount field change the gas limit value to in transaction fee screen sign this transaction with valid password actual result no pop up with the intrinsic gas too low message open simple test dapp assets tap request stt once wallet screen is opened change the gas limit value to in transaction fee screen sign this transaction with valid password actual result no pop up with the intrinsic gas too low message additional information status version develop node operating system android and ios logs tf session with logs
| 1
|
196,172
| 14,816,546,826
|
IssuesEvent
|
2021-01-14 09:11:37
|
status-im/status-react
|
https://api.github.com/repos/status-im/status-react
|
closed
|
Install sticker button is overlapped with keycard menu
|
bug e2e test blocker
|
# Bug Report
## Problem
After moving keycard menu it is overlapped with install sticker button, I assume that it needs to be a bit more left (and we can do it really tiny one)

It is reason for failed `test_install_pack_and_send_sticker`
|
1.0
|
Install sticker button is overlapped with keycard menu -
# Bug Report
## Problem
After moving keycard menu it is overlapped with install sticker button, I assume that it needs to be a bit more left (and we can do it really tiny one)

It is reason for failed `test_install_pack_and_send_sticker`
|
test
|
install sticker button is overlapped with keycard menu bug report problem after moving keycard menu it is overlapped with install sticker button i assume that it needs to be a bit more left and we can do it really tiny one it is reason for failed test install pack and send sticker
| 1
|
226,935
| 18,045,966,518
|
IssuesEvent
|
2021-09-18 22:39:57
|
logicmoo/logicmoo_workspace
|
https://api.github.com/repos/logicmoo/logicmoo_workspace
|
opened
|
logicmoo.pfc.test.sanity_base.MT_01D JUnit
|
Test_9999 logicmoo.pfc.test.sanity_base unit_test MT_01D
|
(cd /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base ; timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif mt_01d.pl)
GH_MASTER_ISSUE_FINFO=
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3AMT_01D
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/mt_01d.pl
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/MT_01D/logicmoo_pfc_test_sanity_base_MT_01D_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/68/testReport/logicmoo.pfc.test.sanity_base/MT_01D/logicmoo_pfc_test_sanity_base_MT_01D_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://github.com/logicmoo/logicmoo_workspace/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/mt_01d.pl
```
%
running('/var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/mt_01d.pl'),
%~ /var/lib/jenkins/.local/share/swi-prolog/pack/logicmoo_utils/prolog/logicmoo_test_header.pl:92
%~ this_test_might_need( :-( use_module( library(logicmoo_plarkc))))
%~ this_test_might_need( :-( expects_dialect(pfc)))
:- expects_dialect(pfc).
:- set_fileAssertMt(cycKB1).
%~ pfc_iri : include_module_file(cycKB1:library('pfclib/system_each_module.pfc'),cycKB1).
/*~
%~ pfc_iri:include_module_file(cycKB1:library('pfclib/system_each_module.pfc'),cycKB1)
~*/
loves(sally,joe).
No source location!?
:- mpred_test(clause_u(cycKB1:loves(_,_))).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/mt_01d.pl:20
%~ mpred_test("Test_0001_Line_0000__loves_2_in_cycKB1",baseKB:clause_u(cycKB1:loves(_462,_466)))
/*~
%~ mpred_test("Test_0001_Line_0000__loves_2_in_cycKB1",baseKB:clause_u(cycKB1:loves(_462,_466)))
Call: (68) [pfc_lib] clause_u(cycKB1:loves(_462, _466))
Unify: (68) [pfc_lib] clause_u(cycKB1:loves(_462, _466))
^ Call: (72) [pfc_lib] hook_database:clause_i(cycKB1:loves(_462, _466), true, _17498)
^ Unify: (72) [pfc_lib] hook_database:clause_i(cycKB1:loves(_462, _466), true, _17498)
^ Call: (73) [system] clause(cycKB1:loves(_462, _466), true, _17498)
^ Fail: (73) [system] clause(cycKB1:loves(_462, _466), true, _17498)
^ Fail: (72) [pfc_lib] hook_database:clause_i(cycKB1:loves(_462, _466), true, _17498)
Fail: (68) [pfc_lib] clause_u(cycKB1:loves(_462, _466))
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+clause_u(cycKB1:loves(_462,_466)))),rtrace(baseKB:clause_u(cycKB1:loves(_462,_466)))))
no_proof_for(\+clause_u(cycKB1:loves(Loves,Loves1))).
no_proof_for(\+clause_u(cycKB1:loves(Loves,Loves1))).
no_proof_for(\+clause_u(cycKB1:loves(Loves,Loves1))).
name ='logicmoo.pfc.test.sanity_base.MT_01D-Test_0001_Line_0000__loves_2_in_cycKB1'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.MT_01D'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif mt_01d.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.MT_01D-Test_0001_Line_0000__loves_2_in_cycKB1-junit.xml
~*/
:- mpred_test(\+clause_u(baseKB:loves(_,_))).
%~ mpred_test("Test_0002_Line_0000__naf_loves_2",baseKB:(\+clause_u(baseKB:loves(_35510,_35532))))
%~ FIlE: * https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/mt_01d.pl#L22
/*~
%~ mpred_test("Test_0002_Line_0000__naf_loves_2",baseKB:(\+clause_u(baseKB:loves(_35510,_35532))))
Call: (68) [pfc_lib] clause_u(baseKB:loves(_35510, _35532))
Unify: (68) [pfc_lib] clause_u(baseKB:loves(_35510, _35532))
^ Call: (72) [pfc_lib] hook_database:clause_i(baseKB:loves(_35510, _35532), true, _49456)
^ Unify: (72) [pfc_lib] hook_database:clause_i(baseKB:loves(_35510, _35532), true, _49456)
^ Call: (73) [system] clause(baseKB:loves(_35510, _35532), true, _49456)
^ Exit: (73) [system] clause(loves(sally, joe), true, <clause>(0x5567983de070))
^ Exit: (72) [pfc_lib] hook_database:clause_i(baseKB:loves(sally, joe), true, <clause>(0x5567983de070))
Call: (74) [t_l] t_l:exact_kb(_52618)
Fail: (74) [t_l] t_l:exact_kb(_52618)
Exit: (68) [pfc_lib] clause_u(baseKB:loves(sally, joe))
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:clause_u(baseKB:loves(_35510,_35532))),rtrace(baseKB:(\+clause_u(baseKB:loves(_35510,_35532))))))
Justifications for clause_u(baseKB:loves(Loves,Loves1)):
name ='logicmoo.pfc.test.sanity_base.MT_01D-Test_0002_Line_0000__naf_loves_2'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.MT_01D'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif mt_01d.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.MT_01D-Test_0002_Line_0000__naf_loves_2-junit.xml
~*/
:- pfc_test_feature(mt,\+clause_u(header_sane:loves(_,_))).
%~ unused(save_junit_results)
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/mt_01d.pl:29
%~ test_completed_exit(8)
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
```
totalTime=3
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3AMT_01D
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/mt_01d.pl
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/MT_01D/logicmoo_pfc_test_sanity_base_MT_01D_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/68/testReport/logicmoo.pfc.test.sanity_base/MT_01D/logicmoo_pfc_test_sanity_base_MT_01D_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://github.com/logicmoo/logicmoo_workspace/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/mt_01d.pl
FAILED: /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-junit-minor -k mt_01d.pl (returned 8)
|
3.0
|
logicmoo.pfc.test.sanity_base.MT_01D JUnit - (cd /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base ; timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif mt_01d.pl)
GH_MASTER_ISSUE_FINFO=
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3AMT_01D
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/mt_01d.pl
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/MT_01D/logicmoo_pfc_test_sanity_base_MT_01D_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/68/testReport/logicmoo.pfc.test.sanity_base/MT_01D/logicmoo_pfc_test_sanity_base_MT_01D_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://github.com/logicmoo/logicmoo_workspace/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/mt_01d.pl
```
%
running('/var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/mt_01d.pl'),
%~ /var/lib/jenkins/.local/share/swi-prolog/pack/logicmoo_utils/prolog/logicmoo_test_header.pl:92
%~ this_test_might_need( :-( use_module( library(logicmoo_plarkc))))
%~ this_test_might_need( :-( expects_dialect(pfc)))
:- expects_dialect(pfc).
:- set_fileAssertMt(cycKB1).
%~ pfc_iri : include_module_file(cycKB1:library('pfclib/system_each_module.pfc'),cycKB1).
/*~
%~ pfc_iri:include_module_file(cycKB1:library('pfclib/system_each_module.pfc'),cycKB1)
~*/
loves(sally,joe).
No source location!?
:- mpred_test(clause_u(cycKB1:loves(_,_))).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/mt_01d.pl:20
%~ mpred_test("Test_0001_Line_0000__loves_2_in_cycKB1",baseKB:clause_u(cycKB1:loves(_462,_466)))
/*~
%~ mpred_test("Test_0001_Line_0000__loves_2_in_cycKB1",baseKB:clause_u(cycKB1:loves(_462,_466)))
Call: (68) [pfc_lib] clause_u(cycKB1:loves(_462, _466))
Unify: (68) [pfc_lib] clause_u(cycKB1:loves(_462, _466))
^ Call: (72) [pfc_lib] hook_database:clause_i(cycKB1:loves(_462, _466), true, _17498)
^ Unify: (72) [pfc_lib] hook_database:clause_i(cycKB1:loves(_462, _466), true, _17498)
^ Call: (73) [system] clause(cycKB1:loves(_462, _466), true, _17498)
^ Fail: (73) [system] clause(cycKB1:loves(_462, _466), true, _17498)
^ Fail: (72) [pfc_lib] hook_database:clause_i(cycKB1:loves(_462, _466), true, _17498)
Fail: (68) [pfc_lib] clause_u(cycKB1:loves(_462, _466))
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+clause_u(cycKB1:loves(_462,_466)))),rtrace(baseKB:clause_u(cycKB1:loves(_462,_466)))))
no_proof_for(\+clause_u(cycKB1:loves(Loves,Loves1))).
no_proof_for(\+clause_u(cycKB1:loves(Loves,Loves1))).
no_proof_for(\+clause_u(cycKB1:loves(Loves,Loves1))).
name ='logicmoo.pfc.test.sanity_base.MT_01D-Test_0001_Line_0000__loves_2_in_cycKB1'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.MT_01D'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif mt_01d.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.MT_01D-Test_0001_Line_0000__loves_2_in_cycKB1-junit.xml
~*/
:- mpred_test(\+clause_u(baseKB:loves(_,_))).
%~ mpred_test("Test_0002_Line_0000__naf_loves_2",baseKB:(\+clause_u(baseKB:loves(_35510,_35532))))
%~ FIlE: * https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/mt_01d.pl#L22
/*~
%~ mpred_test("Test_0002_Line_0000__naf_loves_2",baseKB:(\+clause_u(baseKB:loves(_35510,_35532))))
Call: (68) [pfc_lib] clause_u(baseKB:loves(_35510, _35532))
Unify: (68) [pfc_lib] clause_u(baseKB:loves(_35510, _35532))
^ Call: (72) [pfc_lib] hook_database:clause_i(baseKB:loves(_35510, _35532), true, _49456)
^ Unify: (72) [pfc_lib] hook_database:clause_i(baseKB:loves(_35510, _35532), true, _49456)
^ Call: (73) [system] clause(baseKB:loves(_35510, _35532), true, _49456)
^ Exit: (73) [system] clause(loves(sally, joe), true, <clause>(0x5567983de070))
^ Exit: (72) [pfc_lib] hook_database:clause_i(baseKB:loves(sally, joe), true, <clause>(0x5567983de070))
Call: (74) [t_l] t_l:exact_kb(_52618)
Fail: (74) [t_l] t_l:exact_kb(_52618)
Exit: (68) [pfc_lib] clause_u(baseKB:loves(sally, joe))
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:clause_u(baseKB:loves(_35510,_35532))),rtrace(baseKB:(\+clause_u(baseKB:loves(_35510,_35532))))))
Justifications for clause_u(baseKB:loves(Loves,Loves1)):
name ='logicmoo.pfc.test.sanity_base.MT_01D-Test_0002_Line_0000__naf_loves_2'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.MT_01D'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif mt_01d.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.MT_01D-Test_0002_Line_0000__naf_loves_2-junit.xml
~*/
:- pfc_test_feature(mt,\+clause_u(header_sane:loves(_,_))).
%~ unused(save_junit_results)
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/mt_01d.pl:29
%~ test_completed_exit(8)
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
```
totalTime=3
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3AMT_01D
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/mt_01d.pl
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/MT_01D/logicmoo_pfc_test_sanity_base_MT_01D_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/68/testReport/logicmoo.pfc.test.sanity_base/MT_01D/logicmoo_pfc_test_sanity_base_MT_01D_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://github.com/logicmoo/logicmoo_workspace/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/mt_01d.pl
FAILED: /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-junit-minor -k mt_01d.pl (returned 8)
|
test
|
logicmoo pfc test sanity base mt junit cd var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base timeout foreground preserve status s sigkill k lmoo clif mt pl gh master issue finfo issue search gitlab latest this build github running var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base mt pl var lib jenkins local share swi prolog pack logicmoo utils prolog logicmoo test header pl this test might need use module library logicmoo plarkc this test might need expects dialect pfc expects dialect pfc set fileassertmt pfc iri include module file library pfclib system each module pfc pfc iri include module file library pfclib system each module pfc loves sally joe no source location mpred test clause u loves var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base mt pl mpred test test line loves in basekb clause u loves mpred test test line loves in basekb clause u loves call clause u loves unify clause u loves call hook database clause i loves true unify hook database clause i loves true call clause loves true fail clause loves true fail hook database clause i loves true fail clause u loves call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb clause u loves rtrace basekb clause u loves no proof for clause u loves loves no proof for clause u loves loves no proof for clause u loves loves name logicmoo pfc test sanity base mt test line loves in junit classname logicmoo pfc test sanity base mt junit cmd timeout foreground preserve status s sigkill k lmoo clif mt pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit test sanity base units logicmoo pfc test sanity base mt test line loves in junit xml mpred test clause u basekb loves mpred test test line naf loves basekb clause u basekb loves file mpred test test line naf loves basekb clause u basekb loves call clause u basekb loves unify clause u basekb loves call hook database clause i basekb loves true unify hook database clause i basekb loves true call clause basekb loves true exit clause loves sally joe true exit hook database clause i basekb loves sally joe true call t l exact kb fail t l exact kb exit clause u basekb loves sally joe call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb clause u basekb loves rtrace basekb clause u basekb loves justifications for clause u basekb loves loves name logicmoo pfc test sanity base mt test line naf loves junit classname logicmoo pfc test sanity base mt junit cmd timeout foreground preserve status s sigkill k lmoo clif mt pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit test sanity base units logicmoo pfc test sanity base mt test line naf loves junit xml pfc test feature mt clause u header sane loves unused save junit results var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base mt pl test completed exit dynamic junit prop dynamic junit prop dynamic junit prop totaltime issue search gitlab latest this build github failed var lib jenkins workspace logicmoo workspace bin lmoo junit minor k mt pl returned
| 1
|
255,789
| 21,954,612,820
|
IssuesEvent
|
2022-05-24 10:55:55
|
dask/distributed
|
https://api.github.com/repos/dask/distributed
|
opened
|
`distributed/cli/tests/test_dask_worker.py::test_worker_cli_nprocs_renamed_to_nworkers` failure
|
flaky test
|
`distributed/cli/tests/test_dask_worker.py::test_worker_cli_nprocs_renamed_to_nworkers` failed on https://github.com/dask/distributed/pull/6423
https://github.com/dask/distributed/pull/6423
https://github.com/dask/distributed/runs/6570436094?check_suite_focus=true
It is stuck in `c.wait_for_workers(2)` and the logs indicate that indeed the second worker never is started. There are two nannies but not a second worker
```
2022-05-24 09:26:53,219 - distributed.nanny - INFO - Start Nanny at: 'tcp://127.0.0.1:49995'
2022-05-24 09:26:53,240 - distributed.nanny - INFO - Start Nanny at: 'tcp://127.0.0.1:49994'
2022-05-24 09:26:55,712 - distributed.worker - INFO - Start worker at: tcp://127.0.0.1:50002
2022-05-24 09:26:55,713 - distributed.worker - INFO - Listening to: tcp://127.0.0.1:50002
2022-05-24 09:26:55,713 - distributed.worker - INFO - dashboard at: 127.0.0.1:50004
2022-05-24 09:26:55,713 - distributed.worker - INFO - Waiting to connect to: tcp://127.0.0.1:49990
2022-05-24 09:26:55,713 - distributed.worker - INFO - -------------------------------------------------
2022-05-24 09:26:55,713 - distributed.worker - INFO - Threads: 1
2022-05-24 09:26:55,714 - distributed.worker - INFO - Memory: 4.67 GiB
2022-05-24 09:26:55,714 - distributed.worker - INFO - Local Directory: /Users/runner/work/distributed/distributed/dask-worker-space/worker-jqvysbo3
2022-05-24 09:26:55,714 - distributed.worker - INFO - -------------------------------------------------
2022-05-24 09:26:55,743 - distributed.worker - INFO - Registered to: tcp://127.0.0.1:49990
2022-05-24 09:26:55,743 - distributed.worker - INFO - -------------------------------------------------
2022-05-24 09:26:55,745 - distributed.core - INFO - Starting established connection
2022-05-24 09:27:22,263 - distributed._signals - INFO - Received signal SIGINT (2)
2022-05-24 09:27:22,264 - distributed.nanny - INFO - Closing Nanny at 'tcp://127.0.0.1:49994'.
2022-05-24 09:27:22,264 - distributed.nanny - INFO - Nanny asking worker to close
2022-05-24 09:27:22,265 - distributed.nanny - INFO - Closing Nanny at 'tcp://127.0.0.1:49995'.
2022-05-24 09:27:22,265 - distributed.nanny - INFO - Nanny asking worker to close
2022-05-24 09:27:22,266 - distributed.worker - INFO - Stopping worker at tcp://127.0.0.1:50002
```
|
1.0
|
`distributed/cli/tests/test_dask_worker.py::test_worker_cli_nprocs_renamed_to_nworkers` failure - `distributed/cli/tests/test_dask_worker.py::test_worker_cli_nprocs_renamed_to_nworkers` failed on https://github.com/dask/distributed/pull/6423
https://github.com/dask/distributed/pull/6423
https://github.com/dask/distributed/runs/6570436094?check_suite_focus=true
It is stuck in `c.wait_for_workers(2)` and the logs indicate that indeed the second worker never is started. There are two nannies but not a second worker
```
2022-05-24 09:26:53,219 - distributed.nanny - INFO - Start Nanny at: 'tcp://127.0.0.1:49995'
2022-05-24 09:26:53,240 - distributed.nanny - INFO - Start Nanny at: 'tcp://127.0.0.1:49994'
2022-05-24 09:26:55,712 - distributed.worker - INFO - Start worker at: tcp://127.0.0.1:50002
2022-05-24 09:26:55,713 - distributed.worker - INFO - Listening to: tcp://127.0.0.1:50002
2022-05-24 09:26:55,713 - distributed.worker - INFO - dashboard at: 127.0.0.1:50004
2022-05-24 09:26:55,713 - distributed.worker - INFO - Waiting to connect to: tcp://127.0.0.1:49990
2022-05-24 09:26:55,713 - distributed.worker - INFO - -------------------------------------------------
2022-05-24 09:26:55,713 - distributed.worker - INFO - Threads: 1
2022-05-24 09:26:55,714 - distributed.worker - INFO - Memory: 4.67 GiB
2022-05-24 09:26:55,714 - distributed.worker - INFO - Local Directory: /Users/runner/work/distributed/distributed/dask-worker-space/worker-jqvysbo3
2022-05-24 09:26:55,714 - distributed.worker - INFO - -------------------------------------------------
2022-05-24 09:26:55,743 - distributed.worker - INFO - Registered to: tcp://127.0.0.1:49990
2022-05-24 09:26:55,743 - distributed.worker - INFO - -------------------------------------------------
2022-05-24 09:26:55,745 - distributed.core - INFO - Starting established connection
2022-05-24 09:27:22,263 - distributed._signals - INFO - Received signal SIGINT (2)
2022-05-24 09:27:22,264 - distributed.nanny - INFO - Closing Nanny at 'tcp://127.0.0.1:49994'.
2022-05-24 09:27:22,264 - distributed.nanny - INFO - Nanny asking worker to close
2022-05-24 09:27:22,265 - distributed.nanny - INFO - Closing Nanny at 'tcp://127.0.0.1:49995'.
2022-05-24 09:27:22,265 - distributed.nanny - INFO - Nanny asking worker to close
2022-05-24 09:27:22,266 - distributed.worker - INFO - Stopping worker at tcp://127.0.0.1:50002
```
|
test
|
distributed cli tests test dask worker py test worker cli nprocs renamed to nworkers failure distributed cli tests test dask worker py test worker cli nprocs renamed to nworkers failed on it is stuck in c wait for workers and the logs indicate that indeed the second worker never is started there are two nannies but not a second worker distributed nanny info start nanny at tcp distributed nanny info start nanny at tcp distributed worker info start worker at tcp distributed worker info listening to tcp distributed worker info dashboard at distributed worker info waiting to connect to tcp distributed worker info distributed worker info threads distributed worker info memory gib distributed worker info local directory users runner work distributed distributed dask worker space worker distributed worker info distributed worker info registered to tcp distributed worker info distributed core info starting established connection distributed signals info received signal sigint distributed nanny info closing nanny at tcp distributed nanny info nanny asking worker to close distributed nanny info closing nanny at tcp distributed nanny info nanny asking worker to close distributed worker info stopping worker at tcp
| 1
|
225,493
| 17,862,473,452
|
IssuesEvent
|
2021-09-06 04:10:46
|
ceph/ceph-csi
|
https://api.github.com/repos/ceph/ceph-csi
|
reopened
|
Improve CI readability by labelling script steps
|
good first issue wontfix component/testing
|
# Describe the feature you'd like to have #
Currently the CI jobs that run in Jenkins for each PR list shell commands for steps that are executed. These commands can be labelled with a short and understandable comment.
See [containerized-tests](https://jenkins-ceph-csi.apps.ocp.ci.centos.org/blue/organizations/jenkins/containerized-tests/) as an example, and click on a job to check the different stages there.
# What is the value to the end user? (why is it a priority?) #
Developers will understand the CI jobs better.
# How will we know we have a good solution? (acceptance criteria) #
Looking at CI jobs in Jenkins will show useful comments in the top of the steps (new), with the commands when expanding the view (as currently done).
This improves the understandability of the `*.groovy` files in the [ci/centos branch](https://github.com/ceph/ceph-csi/tree/ci/centos).
# Additional context #
Idea comes from noobaa/noobaa-core#6259
|
1.0
|
Improve CI readability by labelling script steps - # Describe the feature you'd like to have #
Currently the CI jobs that run in Jenkins for each PR list shell commands for steps that are executed. These commands can be labelled with a short and understandable comment.
See [containerized-tests](https://jenkins-ceph-csi.apps.ocp.ci.centos.org/blue/organizations/jenkins/containerized-tests/) as an example, and click on a job to check the different stages there.
# What is the value to the end user? (why is it a priority?) #
Developers will understand the CI jobs better.
# How will we know we have a good solution? (acceptance criteria) #
Looking at CI jobs in Jenkins will show useful comments in the top of the steps (new), with the commands when expanding the view (as currently done).
This improves the understandability of the `*.groovy` files in the [ci/centos branch](https://github.com/ceph/ceph-csi/tree/ci/centos).
# Additional context #
Idea comes from noobaa/noobaa-core#6259
|
test
|
improve ci readability by labelling script steps describe the feature you d like to have currently the ci jobs that run in jenkins for each pr list shell commands for steps that are executed these commands can be labelled with a short and understandable comment see as an example and click on a job to check the different stages there what is the value to the end user why is it a priority developers will understand the ci jobs better how will we know we have a good solution acceptance criteria looking at ci jobs in jenkins will show useful comments in the top of the steps new with the commands when expanding the view as currently done this improves the understandability of the groovy files in the additional context idea comes from noobaa noobaa core
| 1
|
19,749
| 26,112,520,752
|
IssuesEvent
|
2022-12-27 22:37:32
|
firebase/firebase-cpp-sdk
|
https://api.github.com/repos/firebase/firebase-cpp-sdk
|
closed
|
[C++] Nightly Integration Testing Report
|
type: process nightly-testing
|
Note: This report excludes firestore. Please also check **[the report for firestore](https://github.com/firebase/firebase-cpp-sdk/issues/1166)**
***
<hidden value="integration-test-status-comment"></hidden>
### ❌ [build against repo] Integration test FAILED
Requested by @DellaBitta on commit 3095517de64f316fee6ad3e978163f584f91bb67
Last updated: Tue Dec 27 03:05 PST 2022
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3786329233)**
| Failures | Configs |
|----------|---------|
| gma | [TEST] [FAILURE] [iOS] [macos] [2/6 ios_device: ios_min ios_latest]<details><summary>(2 failed tests)</summary> FirebaseGmaTest.TestRewardedAdLoad<br/> FirebaseGmaTest.TestRewardedAdLoadEmptyRequest</details>[TEST] [FLAKINESS] [Android] [1/3 os: macos] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> FirebaseGmaTest.TestRewardedAdStress</details>[TEST] [FLAKINESS] [iOS] [macos] [1/6 ios_device: ios_target]<details><summary>(2 failed tests)</summary> FirebaseGmaTest.TestRewardedAdLoad<br/> FirebaseGmaTest.TestRewardedAdLoadEmptyRequest</details> |
| messaging | [TEST] [FLAKINESS] [Android] [1/3 os: macos] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
| storage | [TEST] [FLAKINESS] [Android] [2/3 os: windows ubuntu] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
Add flaky tests to **[go/fpl-cpp-flake-tracker](http://go/fpl-cpp-flake-tracker)**
<hidden value="integration-test-status-comment"></hidden>
***
### [build against SDK] Integration test with FLAKINESS (succeeded after retry)
Requested by @firebase-workflow-trigger[bot] on commit 3095517de64f316fee6ad3e978163f584f91bb67
Last updated: Tue Dec 27 05:48 PST 2022
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3787453255)**
| Failures | Configs |
|----------|---------|
| analytics | [TEST] [FLAKINESS] [iOS] [macos] [1/6 ios_device: ios_latest]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
| storage | [TEST] [FLAKINESS] [Android] [1/3 os: macos] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
Add flaky tests to **[go/fpl-cpp-flake-tracker](http://go/fpl-cpp-flake-tracker)**
|
1.0
|
[C++] Nightly Integration Testing Report - Note: This report excludes firestore. Please also check **[the report for firestore](https://github.com/firebase/firebase-cpp-sdk/issues/1166)**
***
<hidden value="integration-test-status-comment"></hidden>
### ❌ [build against repo] Integration test FAILED
Requested by @DellaBitta on commit 3095517de64f316fee6ad3e978163f584f91bb67
Last updated: Tue Dec 27 03:05 PST 2022
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3786329233)**
| Failures | Configs |
|----------|---------|
| gma | [TEST] [FAILURE] [iOS] [macos] [2/6 ios_device: ios_min ios_latest]<details><summary>(2 failed tests)</summary> FirebaseGmaTest.TestRewardedAdLoad<br/> FirebaseGmaTest.TestRewardedAdLoadEmptyRequest</details>[TEST] [FLAKINESS] [Android] [1/3 os: macos] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> FirebaseGmaTest.TestRewardedAdStress</details>[TEST] [FLAKINESS] [iOS] [macos] [1/6 ios_device: ios_target]<details><summary>(2 failed tests)</summary> FirebaseGmaTest.TestRewardedAdLoad<br/> FirebaseGmaTest.TestRewardedAdLoadEmptyRequest</details> |
| messaging | [TEST] [FLAKINESS] [Android] [1/3 os: macos] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
| storage | [TEST] [FLAKINESS] [Android] [2/3 os: windows ubuntu] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
Add flaky tests to **[go/fpl-cpp-flake-tracker](http://go/fpl-cpp-flake-tracker)**
<hidden value="integration-test-status-comment"></hidden>
***
### [build against SDK] Integration test with FLAKINESS (succeeded after retry)
Requested by @firebase-workflow-trigger[bot] on commit 3095517de64f316fee6ad3e978163f584f91bb67
Last updated: Tue Dec 27 05:48 PST 2022
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3787453255)**
| Failures | Configs |
|----------|---------|
| analytics | [TEST] [FLAKINESS] [iOS] [macos] [1/6 ios_device: ios_latest]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
| storage | [TEST] [FLAKINESS] [Android] [1/3 os: macos] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
Add flaky tests to **[go/fpl-cpp-flake-tracker](http://go/fpl-cpp-flake-tracker)**
|
non_test
|
nightly integration testing report note this report excludes firestore please also check ❌ nbsp integration test failed requested by dellabitta on commit last updated tue dec pst failures configs gma failed tests nbsp nbsp firebasegmatest testrewardedadload nbsp nbsp firebasegmatest testrewardedadloademptyrequest failed tests nbsp nbsp firebasegmatest testrewardedadstress failed tests nbsp nbsp firebasegmatest testrewardedadload nbsp nbsp firebasegmatest testrewardedadloademptyrequest messaging failed tests nbsp nbsp crash timeout storage failed tests nbsp nbsp crash timeout add flaky tests to integration test with flakiness succeeded after retry requested by firebase workflow trigger on commit last updated tue dec pst failures configs analytics failed tests nbsp nbsp crash timeout storage failed tests nbsp nbsp crash timeout add flaky tests to
| 0
|
165,719
| 12,879,869,830
|
IssuesEvent
|
2020-07-12 01:26:19
|
osquery/osquery
|
https://api.github.com/repos/osquery/osquery
|
closed
|
Create tests for the table `extended_attributes`
|
good-first-issue macOS test
|
## Create tests for the table `extended_attributes`
- Create a header file for the table implementation, if one does not exist.
- In the test, query the table and check that the retrieved columns (names and types) match the columns from the table spec.
- If there is any guarantee about the number of rows (e.g. only 1 record in every query result, more than 3 records, or something else), check it.
- Test the implementation details of the table, if possible.
Table spec: `specs/darwin/extended_attributes.table`
Source files:
- `osquery/tables/system/darwin/extended_attributes.cpp`
Table generating function: `genXattr()`
|
1.0
|
Create tests for the table `extended_attributes` - ## Create tests for the table `extended_attributes`
- Create a header file for the table implementation, if one does not exist.
- In the test, query the table and check that the retrieved columns (names and types) match the columns from the table spec.
- If there is any guarantee about the number of rows (e.g. only 1 record in every query result, more than 3 records, or something else), check it.
- Test the implementation details of the table, if possible.
Table spec: `specs/darwin/extended_attributes.table`
Source files:
- `osquery/tables/system/darwin/extended_attributes.cpp`
Table generating function: `genXattr()`
|
test
|
create tests for the table extended attributes create tests for the table extended attributes create header file for the table implementation if one is not exists in test query the table and check if retrieved columns name and types match the columns from table spec if there is any guarantee to number of rows e g only record in every query result more than records or something else check it test the implementation details of the table if it possible table spec specs darwin extended attributes table source files osquery tables system darwin extended attributes cpp table generating function genxattr
| 1
|
709,990
| 24,399,751,286
|
IssuesEvent
|
2022-10-04 23:29:32
|
azerothcore/azerothcore-wotlk
|
https://api.github.com/repos/azerothcore/azerothcore-wotlk
|
opened
|
[QAston] Nether Protection doesn't proc
|
Confirmed Class - Warlock Priority-Medium
|
### Current Behaviour
Regardless of how many spells you are hit with, Nether Protection never procs.
### Expected Blizzlike Behaviour
There should be a 30% chance, when hit with a spell, to proc Nether Protection
### Source
_No response_
### Steps to reproduce the problem
Use 2 characters
Spec into Nether Protection (Destruction tree)
Cast a spell such as Frostbolt (116) on the warlock repeatedly
See if Nether Protection procs
### Extra Notes
_No response_
### AC rev. hash/commit
https://github.com/azerothcore/azerothcore-wotlk/commit/e05f61d1b3873a217798078169455591422a1766
### Operating system
Windows 10
### Custom changes or Modules
_No response_
|
1.0
|
[QAston] Nether Protection doesn't proc - ### Current Behaviour
Regardless of how many spells you are hit with, Nether Protection never procs.
### Expected Blizzlike Behaviour
There should be a 30% chance, when hit with a spell, to proc Nether Protection
### Source
_No response_
### Steps to reproduce the problem
Use 2 characters
Spec into Nether Protection (Destruction tree)
Cast a spell such as Frostbolt (116) on the warlock repeatedly
See if Nether Protection procs
### Extra Notes
_No response_
### AC rev. hash/commit
https://github.com/azerothcore/azerothcore-wotlk/commit/e05f61d1b3873a217798078169455591422a1766
### Operating system
Windows 10
### Custom changes or Modules
_No response_
|
non_test
|
nether protection doesn t proc current behaviour regardless of how many spells you are hit with nether protection never procs expected blizzlike behaviour there should be a chance when hit with a spell to proc nether protection source no response steps to reproduce the problem use characters spec into nether protection destruction tree cast a spell such as frostbolt on the warlock repeatedly see if nether protection procs extra notes no response ac rev hash commit operating system windows custom changes or modules no response
| 0
|
40,340
| 8,775,351,416
|
IssuesEvent
|
2018-12-18 22:47:15
|
open-contracting/standard
|
https://api.github.com/repos/open-contracting/standard
|
closed
|
relatedProcess: Add 'planned' code
|
Codelist - Open
|
The [relatedProcess](http://standard.open-contracting.org/latest/en/schema/codelists/#related-process) codelist has an entry for 'planning' to point *upwards* from a tender to the contracting process in which it was planned. But it does not have an entry for pointing from a planning process to the tenders that resulted from it.
This would be important for cases where a publisher wants to show the procurement resulting from a particular planning process.
|
1.0
|
relatedProcess: Add 'planned' code - The [relatedProcess](http://standard.open-contracting.org/latest/en/schema/codelists/#related-process) codelist has an entry for 'planning' to point *upwards* from a tender to the contracting process in which it was planned. But it does not have an entry for pointing from a planning process to the tenders that resulted from it.
This would be important for cases where a publisher wants to show the procurement resulting from a particular planning process.
|
non_test
|
relatedprocess add planned code the codelist has an entry for planning to point upwards from a tender to the contracting process in which it was planned but it does not have an entry for pointing from a planning process to the tenders that resulted from it this would be important for cases where a publisher wants to show the procurement resulting from a particular planning process
| 0
|
5,204
| 7,759,498,808
|
IssuesEvent
|
2018-05-31 23:53:45
|
asamuzaK/sidebarTabs
|
https://api.github.com/repos/asamuzaK/sidebarTabs
|
opened
|
Implement ability to close a selection of tabs
|
compatibility enhancement
|
See [1458022 - Implement ability to close a selection of tabs](https://bugzilla.mozilla.org/show_bug.cgi?id=1458022 "1458022 - Implement ability to close a selection of tabs")
Related #33
|
True
|
Implement ability to close a selection of tabs - See [1458022 - Implement ability to close a selection of tabs](https://bugzilla.mozilla.org/show_bug.cgi?id=1458022 "1458022 - Implement ability to close a selection of tabs")
Related #33
|
non_test
|
implement ability to close a selection of tabs see implement ability to close a selection of tabs related
| 0
|
22,901
| 3,727,389,436
|
IssuesEvent
|
2016-03-06 08:05:05
|
godfather1103/mentohust
|
https://api.github.com/repos/godfather1103/mentohust
|
closed
|
wrt54g 1.1 how do I set up ddwrt? Asking for guidance
|
auto-migrated Priority-Medium Type-Defect
|
```
I have already flashed ddwrt, but now I don't know what to do with it, uh
It's not the integrated build; I want to set one up myself
```
Original issue reported on code.google.com by `J1140752...@gmail.com` on 19 Apr 2013 at 7:03
|
1.0
|
wrt54g 1.1 how do I set up ddwrt? Asking for guidance - ```
I have already flashed ddwrt, but now I don't know what to do with it, uh
It's not the integrated build; I want to set one up myself
```
Original issue reported on code.google.com by `J1140752...@gmail.com` on 19 Apr 2013 at 7:03
|
non_test
|
ddwrt how do i set it up asking for guidance i have already flashed ddwrt but now i don t know what to do with it uh it s not the integrated build i want to set one up myself original issue reported on code google com by gmail com on apr at
| 0
|
280,638
| 24,320,445,381
|
IssuesEvent
|
2022-09-30 10:13:52
|
brave/brave-browser
|
https://api.github.com/repos/brave/brave-browser
|
opened
|
Upstream unit tests crash when BraveExtensionSystem initialized via TestExtensionSystem
|
dev-concern OS/Desktop Chromium/upstream-test
|
These upstream unit tests crash when `BraveExtensionManagement` is initialized via a
`TestExtensionSystem`.
These tests are affected, but there may be others in the filter lists that are also affected:
- `AppHomePageHandlerTest`
|
1.0
|
Upstream unit tests crash when BraveExtensionSystem initialized via TestExtensionSystem - These upstream unit tests crash when `BraveExtensionManagement` is initialized via a
`TestExtensionSystem`.
These tests are affected, but there may be others in the filter lists that are also affected:
- `AppHomePageHandlerTest`
|
test
|
upstream unit tests crash when braveextensionsystem initialized via testextensionsystem these upstream unit tests crash when braveextensionmanagement is initialized via a testextensionsystem these tests are affected but there may be others in the filter lists that are also affected apphomepagehandlertest
| 1
|
1,150
| 2,532,423,572
|
IssuesEvent
|
2015-01-23 16:01:07
|
nnnick/Chart.js
|
https://api.github.com/repos/nnnick/Chart.js
|
closed
|
Using too many point in dataset collapse all on x axis
|
Needs test case
|
Hi,
If you use a dataset of 86340 points (23h59, 1 point per minute) on a canvas 950px wide, all x-axis labels collapse to the left on Bar and Line charts.
Using a 1600px-wide canvas solves the problem.
Kind regards
Bertrand
|
1.0
|
Using too many point in dataset collapse all on x axis - Hi,
If you use a dataset of 86340 points (23h59, 1 point per minute) on a canvas 950px wide, all x-axis labels collapse to the left on Bar and Line charts.
Using a 1600px-wide canvas solves the problem.
Kind regards
Bertrand
|
test
|
using too many point in dataset collapse all on x axis hi if you use a dataset of points point per minute for a canvas of wide all x axis label are collapse to the left bar and line chart using a canvas wide solve the problem kind regards bertrand
| 1
|
255,002
| 21,893,216,435
|
IssuesEvent
|
2022-05-20 05:34:26
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
roachtest: sqlsmith/setup=tpcc/setting=ddl-nodrop failed
|
C-test-failure O-robot O-roachtest branch-master release-blocker
|
roachtest.sqlsmith/setup=tpcc/setting=ddl-nodrop [failed](https://teamcity.cockroachdb.com/buildConfiguration/BUILDTYPE_ID-not-found-in-env/5213353?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/BUILDTYPE_ID-not-found-in-env/5213353?buildTab=artifacts#/sqlsmith/setup=tpcc/setting=ddl-nodrop) on master @ [92947c29c55ff909f50a5e625811d34a1bbe71f7](https://github.com/cockroachdb/cockroach/commits/92947c29c55ff909f50a5e625811d34a1bbe71f7):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /artifacts/sqlsmith/setup=tpcc/setting=ddl-nodrop/run_1
sqlsmith.go:265,sqlsmith.go:305,test_runner.go:876: error: pq: internal error: failed to construct index entries during backfill: Non-nullable column "new_order:col_36" with no value! Index scanned was "idx_8" with the index key columns (no_w_id,no_d_id,no_o_id) and the values (0,1,2101)
stmt:
ALTER TABLE defaultdb.public.new_order ADD COLUMN col_41 TIME DEFAULT ('00:06:00':::TIME::TIME) NOT NULL;
```
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/sql-queries
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*sqlsmith/setup=tpcc/setting=ddl-nodrop.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
2.0
|
roachtest: sqlsmith/setup=tpcc/setting=ddl-nodrop failed - roachtest.sqlsmith/setup=tpcc/setting=ddl-nodrop [failed](https://teamcity.cockroachdb.com/buildConfiguration/BUILDTYPE_ID-not-found-in-env/5213353?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/BUILDTYPE_ID-not-found-in-env/5213353?buildTab=artifacts#/sqlsmith/setup=tpcc/setting=ddl-nodrop) on master @ [92947c29c55ff909f50a5e625811d34a1bbe71f7](https://github.com/cockroachdb/cockroach/commits/92947c29c55ff909f50a5e625811d34a1bbe71f7):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /artifacts/sqlsmith/setup=tpcc/setting=ddl-nodrop/run_1
sqlsmith.go:265,sqlsmith.go:305,test_runner.go:876: error: pq: internal error: failed to construct index entries during backfill: Non-nullable column "new_order:col_36" with no value! Index scanned was "idx_8" with the index key columns (no_w_id,no_d_id,no_o_id) and the values (0,1,2101)
stmt:
ALTER TABLE defaultdb.public.new_order ADD COLUMN col_41 TIME DEFAULT ('00:06:00':::TIME::TIME) NOT NULL;
```
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/sql-queries
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*sqlsmith/setup=tpcc/setting=ddl-nodrop.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
test
|
roachtest sqlsmith setup tpcc setting ddl nodrop failed roachtest sqlsmith setup tpcc setting ddl nodrop with on master the test failed on branch master cloud gce test artifacts and logs in artifacts sqlsmith setup tpcc setting ddl nodrop run sqlsmith go sqlsmith go test runner go error pq internal error failed to construct index entries during backfill non nullable column new order col with no value index scanned was idx with the index key columns no w id no d id no o id and the values stmt alter table defaultdb public new order add column col time default time time not null help see see cc cockroachdb sql queries
| 1
|
281,895
| 24,432,152,223
|
IssuesEvent
|
2022-10-06 08:54:20
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
pkg/sql/sqlitelogictest/tests/local/local_test: TestSqlLiteLogic_testindexorderby_nosort1000slt_good_1_test failed
|
C-test-failure O-robot branch-release-22.2.0
|
pkg/sql/sqlitelogictest/tests/local/local_test.TestSqlLiteLogic_testindexorderby_nosort1000slt_good_1_test [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_SQLiteLogicTestsBazel/6803740?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_SQLiteLogicTestsBazel/6803740?buildTab=artifacts#/) on release-22.2.0 @ [4f2f7c685cd5adb079ad652c732023dc2759c87d](https://github.com/cockroachdb/cockroach/commits/4f2f7c685cd5adb079ad652c732023dc2759c87d):
```
Slow failing tests:
TestSqlLiteLogic_testindexorderby_nosort1000slt_good_1_test - 2699.11s
Slow passing tests:
TestSqlLiteLogic_testindexdelete100slt_good_3_test - 251.29s
TestSqlLiteLogic_testindexin1000slt_good_1_test - 190.65s
TestSqlLiteLogic_testindexbetween100slt_good_4_test - 140.89s
TestSqlLiteLogic_testindexorderby_nosort10slt_good_38_test - 51.36s
TestSqlLiteLogic_testindexcommute10slt_good_21_test - 47.87s
TestSqlLiteLogic_testindexcommute100slt_good_8_test - 47.06s
TestSqlLiteLogic_testindexorderby10slt_good_22_test - 46.57s
TestSqlLiteLogic_testindexorderby_nosort10slt_good_0_test - 43.07s
TestSqlLiteLogic_testindexcommute10slt_good_5_test - 39.07s
TestSqlLiteLogic_testindexorderby_nosort10slt_good_23_test - 38.06s
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
/cc @cockroachdb/sql-queries
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestSqlLiteLogic_testindexorderby_nosort1000slt_good_1_test.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
1.0
|
pkg/sql/sqlitelogictest/tests/local/local_test: TestSqlLiteLogic_testindexorderby_nosort1000slt_good_1_test failed - pkg/sql/sqlitelogictest/tests/local/local_test.TestSqlLiteLogic_testindexorderby_nosort1000slt_good_1_test [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_SQLiteLogicTestsBazel/6803740?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_SQLiteLogicTestsBazel/6803740?buildTab=artifacts#/) on release-22.2.0 @ [4f2f7c685cd5adb079ad652c732023dc2759c87d](https://github.com/cockroachdb/cockroach/commits/4f2f7c685cd5adb079ad652c732023dc2759c87d):
```
Slow failing tests:
TestSqlLiteLogic_testindexorderby_nosort1000slt_good_1_test - 2699.11s
Slow passing tests:
TestSqlLiteLogic_testindexdelete100slt_good_3_test - 251.29s
TestSqlLiteLogic_testindexin1000slt_good_1_test - 190.65s
TestSqlLiteLogic_testindexbetween100slt_good_4_test - 140.89s
TestSqlLiteLogic_testindexorderby_nosort10slt_good_38_test - 51.36s
TestSqlLiteLogic_testindexcommute10slt_good_21_test - 47.87s
TestSqlLiteLogic_testindexcommute100slt_good_8_test - 47.06s
TestSqlLiteLogic_testindexorderby10slt_good_22_test - 46.57s
TestSqlLiteLogic_testindexorderby_nosort10slt_good_0_test - 43.07s
TestSqlLiteLogic_testindexcommute10slt_good_5_test - 39.07s
TestSqlLiteLogic_testindexorderby_nosort10slt_good_23_test - 38.06s
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
/cc @cockroachdb/sql-queries
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestSqlLiteLogic_testindexorderby_nosort1000slt_good_1_test.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
test
|
pkg sql sqlitelogictest tests local local test testsqllitelogic testindexorderby good test failed pkg sql sqlitelogictest tests local local test testsqllitelogic testindexorderby good test with on release slow failing tests testsqllitelogic testindexorderby good test slow passing tests testsqllitelogic good test testsqllitelogic good test testsqllitelogic good test testsqllitelogic testindexorderby good test testsqllitelogic good test testsqllitelogic good test testsqllitelogic good test testsqllitelogic testindexorderby good test testsqllitelogic good test testsqllitelogic testindexorderby good test help see also cc cockroachdb sql queries
| 1
|
98,726
| 20,790,044,207
|
IssuesEvent
|
2022-03-17 00:19:32
|
Ale-Torres/BrowserQuest
|
https://api.github.com/repos/Ale-Torres/BrowserQuest
|
closed
|
Astar.js "find" set of functions has too many parameters
|
code smell
|
Should reduce the number of parameters to these functions, grouping similar parameters together and removing unnecessary parameters
|
1.0
|
Astar.js "find" set of functions has too many parameters - Should reduce the number of parameters to these functions, grouping similar parameters together and removing unnecessary parameters
|
non_test
|
astar js find set of functions has too many parameters should reduce the number of parameters to these functions grouping similar parameters together and removing unnecessary parameters
| 0
|
297,364
| 25,724,918,862
|
IssuesEvent
|
2022-12-07 16:01:47
|
hashicorp/nomad
|
https://api.github.com/repos/hashicorp/nomad
|
closed
|
Job Endpoint concurrent map writes
|
type/bug theme/testing
|
Had a CI fail due to (almost certainly) unrelated changes; concurrent map writes on the Job endpoint
@pkazmierczak
@jrasell
```
FAIL nomad
=== RUN TestACLEndpoint_GetAuthMethods
=== PAUSE TestACLEndpoint_GetAuthMethods
FAIL nomad.TestACLEndpoint_GetAuthMethods (-1.00s)
=== RUN TestClientEndpoint_UpdateAlloc_Evals_ByTrigger
=== PAUSE TestClientEndpoint_UpdateAlloc_Evals_ByTrigger
FAIL nomad.TestClientEndpoint_UpdateAlloc_Evals_ByTrigger (-1.00s)
=== RUN TestPlanEndpoint_ApplyConcurrent
=== PAUSE TestPlanEndpoint_ApplyConcurrent
FAIL nomad.TestPlanEndpoint_ApplyConcurrent (-1.00s)
=== RUN TestRPC_streamingRpcConn_goodMethod_TLS
fatal error: concurrent map writes
goroutine 91524 [running]:
github.com/hashicorp/nomad/nomad/structs.(*StreamingRpcRegistry).Register(...)
/home/runner/work/nomad/nomad/nomad/structs/streaming_rpc.go:42
github.com/hashicorp/nomad/nomad.(*ClientAllocations).register(...)
/home/runner/work/nomad/nomad/nomad/client_alloc_endpoint.go:32
github.com/hashicorp/nomad/nomad.(*Server).setupRpcServer(0xc0046aedc0, 0xc005f65880?, 0xc0050bb220)
/home/runner/work/nomad/nomad/nomad/server.go:1206 +0x16b
github.com/hashicorp/nomad/nomad.(*rpcHandler).handleMultiplexV2(0xc005bfd280, {0x29e92a0?, 0xc0066e5440}, {0x29f51f0?, 0xc005f65880}, 0xc0050bb220)
/home/runner/work/nomad/nomad/nomad/rpc.go:494 +0x225
github.com/hashicorp/nomad/nomad.(*rpcHandler).handleConn(0xc005bfd280, {0x29e92a0, 0xc0066e5440}, {0x29f51f0, 0xc005f65880?}, 0xc0050bb220)
/home/runner/work/nomad/nomad/nomad/rpc.go:352 +0xe45
github.com/hashicorp/nomad/nomad.(*rpcHandler).handleConn(0xc005bfd280, {0x29e92a0, 0xc0066e5440}, {0x29f74a8, 0xc005d9b148?}, 0xc0050bb220)
/home/runner/work/nomad/nomad/nomad/rpc.go:332 +0x9bf
created by github.com/hashicorp/nomad/nomad.(*rpcHandler).listen
/home/runner/work/nomad/nomad/nomad/rpc.go:183 +0x452
```
|
1.0
|
Job Endpoint concurrent map writes - Had a CI fail due to (almost certainly) unrelated changes; concurrent map writes on the Job endpoint
@pkazmierczak
@jrasell
```
FAIL nomad
=== RUN TestACLEndpoint_GetAuthMethods
=== PAUSE TestACLEndpoint_GetAuthMethods
FAIL nomad.TestACLEndpoint_GetAuthMethods (-1.00s)
=== RUN TestClientEndpoint_UpdateAlloc_Evals_ByTrigger
=== PAUSE TestClientEndpoint_UpdateAlloc_Evals_ByTrigger
FAIL nomad.TestClientEndpoint_UpdateAlloc_Evals_ByTrigger (-1.00s)
=== RUN TestPlanEndpoint_ApplyConcurrent
=== PAUSE TestPlanEndpoint_ApplyConcurrent
FAIL nomad.TestPlanEndpoint_ApplyConcurrent (-1.00s)
=== RUN TestRPC_streamingRpcConn_goodMethod_TLS
fatal error: concurrent map writes
goroutine 91524 [running]:
github.com/hashicorp/nomad/nomad/structs.(*StreamingRpcRegistry).Register(...)
/home/runner/work/nomad/nomad/nomad/structs/streaming_rpc.go:42
github.com/hashicorp/nomad/nomad.(*ClientAllocations).register(...)
/home/runner/work/nomad/nomad/nomad/client_alloc_endpoint.go:32
github.com/hashicorp/nomad/nomad.(*Server).setupRpcServer(0xc0046aedc0, 0xc005f65880?, 0xc0050bb220)
/home/runner/work/nomad/nomad/nomad/server.go:1206 +0x16b
github.com/hashicorp/nomad/nomad.(*rpcHandler).handleMultiplexV2(0xc005bfd280, {0x29e92a0?, 0xc0066e5440}, {0x29f51f0?, 0xc005f65880}, 0xc0050bb220)
/home/runner/work/nomad/nomad/nomad/rpc.go:494 +0x225
github.com/hashicorp/nomad/nomad.(*rpcHandler).handleConn(0xc005bfd280, {0x29e92a0, 0xc0066e5440}, {0x29f51f0, 0xc005f65880?}, 0xc0050bb220)
/home/runner/work/nomad/nomad/nomad/rpc.go:352 +0xe45
github.com/hashicorp/nomad/nomad.(*rpcHandler).handleConn(0xc005bfd280, {0x29e92a0, 0xc0066e5440}, {0x29f74a8, 0xc005d9b148?}, 0xc0050bb220)
/home/runner/work/nomad/nomad/nomad/rpc.go:332 +0x9bf
created by github.com/hashicorp/nomad/nomad.(*rpcHandler).listen
/home/runner/work/nomad/nomad/nomad/rpc.go:183 +0x452
```
|
test
|
job endpoint concurrent map writes had a ci fail due to almost certainly unrelated changes concurrent map writes on the job endpoint pkazmierczak jrasell fail nomad run testaclendpoint getauthmethods pause testaclendpoint getauthmethods fail nomad testaclendpoint getauthmethods run testclientendpoint updatealloc evals bytrigger pause testclientendpoint updatealloc evals bytrigger fail nomad testclientendpoint updatealloc evals bytrigger run testplanendpoint applyconcurrent pause testplanendpoint applyconcurrent fail nomad testplanendpoint applyconcurrent run testrpc streamingrpcconn goodmethod tls fatal error concurrent map writes goroutine github com hashicorp nomad nomad structs streamingrpcregistry register home runner work nomad nomad nomad structs streaming rpc go github com hashicorp nomad nomad clientallocations register home runner work nomad nomad nomad client alloc endpoint go github com hashicorp nomad nomad server setuprpcserver home runner work nomad nomad nomad server go github com hashicorp nomad nomad rpchandler home runner work nomad nomad nomad rpc go github com hashicorp nomad nomad rpchandler handleconn home runner work nomad nomad nomad rpc go github com hashicorp nomad nomad rpchandler handleconn home runner work nomad nomad nomad rpc go created by github com hashicorp nomad nomad rpchandler listen home runner work nomad nomad nomad rpc go
| 1
|
382,644
| 11,309,706,079
|
IssuesEvent
|
2020-01-19 14:57:06
|
r-lib/styler
|
https://api.github.com/repos/r-lib/styler
|
closed
|
Remove inner spaces around square braces
|
Complexity: Low Priority: High Status: Unassigned Type: Bug
|
``` r
styler::style_text("a[[ 2 ]]")
#> a[[ 2 ]]
styler::style_text("a[ 2 ]")
#> a[ 2 ]
# this case can be ignored
styler::style_text("a[ [2 ] ]")
#> Error: <text>:1:4: unexpected '['
#> 1: a[ [
#> ^
```
<sup>Created on 2020-01-19 by the [reprex package](https://reprex.tidyverse.org) (v0.3.0)</sup>
We can probably adapt the rule for formatting round braces and extend it to square braces.
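The intended before/after behaviour — dropping horizontal whitespace just inside `[` and `]]` pairs — can be sketched with two regex substitutions. A minimal Python illustration (the function name and regexes are assumptions for illustration; styler itself works on the parsed token tree, not raw text):

```python
import re

def strip_inner_bracket_spaces(code: str) -> str:
    """Remove horizontal whitespace just inside square-bracket pairs."""
    # "a[[ 2" -> "a[[2": drop spaces right after one or more opening brackets
    code = re.sub(r"(\[+)[ \t]+", r"\1", code)
    # "2 ]]" -> "2]]": drop spaces right before one or more closing brackets
    code = re.sub(r"[ \t]+(\]+)", r"\1", code)
    return code

print(strip_inner_bracket_spaces("a[[ 2 ]]"))  # a[[2]]
print(strip_inner_bracket_spaces("a[ 2 ]"))    # a[2]
```

Note the raw-text sketch would also touch spaces inside string literals such as `x["a [ b"]`; operating on the parse tree, as the round-brace rule does, avoids that.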
|
1.0
|
Remove inner spaces around square braces - ``` r
styler::style_text("a[[ 2 ]]")
#> a[[ 2 ]]
styler::style_text("a[ 2 ]")
#> a[ 2 ]
# this case can be ignored
styler::style_text("a[ [2 ] ]")
#> Error: <text>:1:4: unexpected '['
#> 1: a[ [
#> ^
```
<sup>Created on 2020-01-19 by the [reprex package](https://reprex.tidyverse.org) (v0.3.0)</sup>
We can probably adapt the rule for formatting round braces and extend it to square braces.
|
non_test
|
remove inner spaces around square braces r styler style text a a styler style text a a this case can be ignored styler style text a error unexpected a created on by the we can probably adapt rule from formatting round braces and extend it to square braces
| 0
|
53,663
| 11,101,914,048
|
IssuesEvent
|
2019-12-16 22:29:48
|
microsoft/PowerToys
|
https://api.github.com/repos/microsoft/PowerToys
|
opened
|
Use GetModuleFileNameW wrappers
|
Code improvement
|
This commit https://github.com/microsoft/PowerToys/commit/fd8fc679be2497d69a1b295e6b7dd6c66584fe34 introduced the `get_module_filename` and `get_module_folderpath` wrappers.
Use them instead of calling GetModuleFileNameW directly.
|
1.0
|
Use GetModuleFileNameW wrappers - This commit https://github.com/microsoft/PowerToys/commit/fd8fc679be2497d69a1b295e6b7dd6c66584fe34 introduced the `get_module_filename` and `get_module_folderpath` wrappers.
Use them instead of calling GetModuleFileNameW directly.
|
non_test
|
use getmodulefilenamew wrappers this commit introduced the get module filename and get module folderpath wrappers use them instead of calling directly getmodulefilenamew
| 0
|
57,280
| 3,081,253,841
|
IssuesEvent
|
2015-08-22 14:45:34
|
pavel-pimenov/flylinkdc-r5xx
|
https://api.github.com/repos/pavel-pimenov/flylinkdc-r5xx
|
closed
|
[Optimization] Add a bit clear/set function to Identity
|
imported Priority-Low Type-Review
|
_From [Pavel.Pimenov@gmail.com](https://code.google.com/u/Pavel.Pimenov@gmail.com/) on January 21, 2013 12:40:58_
The code contains a number of paired commands,
get and set, and at the moment they take the critical-section lock twice
C:\vc10\r5xx-trunk\client\CheatManager.h(142): ou->getIdentity().set("FC", Util::toString(Util::toInt(ou->getIdentity().get("FC")) | Identity::BAD_LIST));
C:\vc10\r5xx-trunk\client\CheatManager.h(160): ou->getIdentity().set("FC", Util::toString(Util::toInt(ou->getIdentity().get("FC")) | Identity::CHECKED));
C:\vc10\r5xx-trunk\client\User.cpp(238): set("FC", Util::toString(Util::toInt(get("FC")) | BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(240): set("FC", Util::toString(Util::toInt(get("FC")) & ~BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(413): set("FC", Util::toString(Util::toInt(get("FC")) | BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(414): set("FC", Util::toString(Util::toInt(get("FC")) | BAD_LIST));
C:\vc10\r5xx-trunk\client\User.cpp(424): set("FC", Util::toString(Util::toInt(get("FC")) | BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(490): set("FC", Util::toString(Util::toInt(get("FC")) & ~BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(492): set("FC", Util::toString(Util::toInt(get("FC")) | BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(514): set("FC", Util::toString(Util::toInt(get("FC")) & ~BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(580): set("FC", Util::toString(Util::toInt(get("FC")) & ~BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(582): set("FC", Util::toString(Util::toInt(get("FC")) | BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(606): set("FC", Util::toString(Util::toInt(get("FC")) & ~BAD_CLIENT));
Add a method that replaces the bit under a single lock
_Original issue: http://code.google.com/p/flylinkdc/issues/detail?id=897_
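The requested fix — replacing a locked get() plus a locked set() with one method that flips the bit inside a single critical section — can be sketched as follows (a Python stand-in with illustrative names; the actual code is C++ with its own locking primitives):

```python
import threading

class Identity:
    """Bit-flag holder: change_bit takes the lock once instead of a get/set pair."""

    def __init__(self) -> None:
        self._lock = threading.Lock()
        self._flags = 0

    def change_bit(self, mask: int, enable: bool) -> None:
        # One critical section covers the whole read-modify-write,
        # instead of locking once in get() and again in set().
        with self._lock:
            if enable:
                self._flags |= mask
            else:
                self._flags &= ~mask

    def has_bit(self, mask: int) -> bool:
        with self._lock:
            return bool(self._flags & mask)

BAD_CLIENT = 0x1
ident = Identity()
ident.change_bit(BAD_CLIENT, True)   # set the bit
ident.change_bit(BAD_CLIENT, False)  # clear it again
```

Besides halving the lock traffic, the single critical section also removes the race where another thread modifies the flags between the get and the set.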
|
1.0
|
[Optimization] Add a bit clear/set function to Identity - _From [Pavel.Pimenov@gmail.com](https://code.google.com/u/Pavel.Pimenov@gmail.com/) on January 21, 2013 12:40:58_
The code contains a number of paired commands,
get and set, and at the moment they take the critical-section lock twice
C:\vc10\r5xx-trunk\client\CheatManager.h(142): ou->getIdentity().set("FC", Util::toString(Util::toInt(ou->getIdentity().get("FC")) | Identity::BAD_LIST));
C:\vc10\r5xx-trunk\client\CheatManager.h(160): ou->getIdentity().set("FC", Util::toString(Util::toInt(ou->getIdentity().get("FC")) | Identity::CHECKED));
C:\vc10\r5xx-trunk\client\User.cpp(238): set("FC", Util::toString(Util::toInt(get("FC")) | BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(240): set("FC", Util::toString(Util::toInt(get("FC")) & ~BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(413): set("FC", Util::toString(Util::toInt(get("FC")) | BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(414): set("FC", Util::toString(Util::toInt(get("FC")) | BAD_LIST));
C:\vc10\r5xx-trunk\client\User.cpp(424): set("FC", Util::toString(Util::toInt(get("FC")) | BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(490): set("FC", Util::toString(Util::toInt(get("FC")) & ~BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(492): set("FC", Util::toString(Util::toInt(get("FC")) | BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(514): set("FC", Util::toString(Util::toInt(get("FC")) & ~BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(580): set("FC", Util::toString(Util::toInt(get("FC")) & ~BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(582): set("FC", Util::toString(Util::toInt(get("FC")) | BAD_CLIENT));
C:\vc10\r5xx-trunk\client\User.cpp(606): set("FC", Util::toString(Util::toInt(get("FC")) & ~BAD_CLIENT));
Add a method that replaces the bit under a single lock
_Original issue: http://code.google.com/p/flylinkdc/issues/detail?id=897_
|
non_test
|
add a bit clear set function to identity from on january the code contains a number of paired commands get and set and at the moment they take the critical section lock twice c trunk client cheatmanager h ou getidentity set fc util tostring util toint ou getidentity get fc identity bad list c trunk client cheatmanager h ou getidentity set fc util tostring util toint ou getidentity get fc identity checked c trunk client user cpp set fc util tostring util toint get fc bad client c trunk client user cpp set fc util tostring util toint get fc bad client c trunk client user cpp set fc util tostring util toint get fc bad client c trunk client user cpp set fc util tostring util toint get fc bad list c trunk client user cpp set fc util tostring util toint get fc bad client c trunk client user cpp set fc util tostring util toint get fc bad client c trunk client user cpp set fc util tostring util toint get fc bad client c trunk client user cpp set fc util tostring util toint get fc bad client c trunk client user cpp set fc util tostring util toint get fc bad client c trunk client user cpp set fc util tostring util toint get fc bad client c trunk client user cpp set fc util tostring util toint get fc bad client add a method that replaces the bit under a single lock original issue
| 0
|
50,464
| 6,096,772,626
|
IssuesEvent
|
2017-06-20 00:16:34
|
rancher/rancher
|
https://api.github.com/repos/rancher/rancher
|
closed
|
Upgrading from kubernetes 1.5.4 to kubernetes 1.6 causes pods to enter crash loop
|
area/kubernetes kind/bug status/resolved status/to-test
|
**Rancher versions:**
rancher/server: v1.6.2
kubernetes (if applicable): 1.5.4 --> 1.6.0
**Steps to Reproduce:**
- Start kubernetes 1.5.4
- start a few pods
- Set catalog branch to `k8s-1.6-dev`
- upgrade to k8s 1.6
**Results:**
Kubernetes pods keep restarting for more than 20 minutes before they enter the Running state.
```
kubectl get pods --all-namespaces
NAMESPACE NAME READY STATUS RESTARTS AGE
kube-system heapster-818085469-rs6w6 0/1 CrashLoopBackOff 16 40m
kube-system kube-dns-2945059722-ngxt6 0/3 CrashLoopBackOff 25 12m
kube-system kubernetes-dashboard-2463885659-bsd84 0/1 CrashLoopBackOff 11 12m
kube-system monitoring-grafana-832403127-w0ccs 0/1 CrashLoopBackOff 11 40m
kube-system monitoring-influxdb-2441835288-8h2g4 0/1 CrashLoopBackOff 12 40m
kube-system tiller-deploy-1933461550-4mrhx 0/1 CrashLoopBackOff 16 12m
pre1 k8testrc1-g3w7c 0/1 CrashLoopBackOff 11 37m
pre1 nginx-g5dfv 1/1 Running 15 37m
pre1 nginx-ingress2 0/1 CrashLoopBackOff 9 37m
pre1 nginx-pod 1/1 Running 12 37m
pre1 nginx-x66vj 0/1 CrashLoopBackOff 10 37m
```
```
kubectl describe pod nginx-ingress2 -n pre1
Name: nginx-ingress2
Namespace: pre1
Node: hgalal-k8s-10acre-3.c.rancher-qa.internal/x.x.x.x
Start Time: Tue, 13 Jun 2017 01:09:33 +0200
Labels: k8s-app=k8test2-service
Status: Running
IP: 10.42.100.139
Controllers: <none>
Containers:
testcontainer:
Container ID: docker://9de376f37999037796d133bb092615488e3f4531532aca1189638f8c4b66f256
Image: sangeetha/testnewhostrouting
Image ID: docker-pullable://sangeetha/testnewhostrouting@sha256:07f6b52d5c67c71dea08ccec41d7f09083da1c3e9c2e3932d2f1ef00afbf77af
Port: 80/TCP
State: Running
Started: Tue, 13 Jun 2017 01:47:41 +0200
Last State: Terminated
Reason: Completed
Exit Code: 0
Started: Mon, 01 Jan 0001 00:00:00 +0000
Finished: Tue, 13 Jun 2017 01:42:39 +0200
Ready: True
Restart Count: 10
Volume Mounts:
/var/run/secrets/kubernetes.io/serviceaccount from default-token-j9wbx (ro)
Environment Variables: <none>
Conditions:
Type Status
Initialized True
Ready True
PodScheduled True
Volumes:
default-token-j9wbx:
Type: Secret (a volume populated by a Secret)
SecretName: default-token-j9wbx
QoS Class: BestEffort
Tolerations: <none>
Events:
FirstSeen LastSeen Count From SubObjectPath Type Reason Message
--------- -------- ----- ---- ------------- -------- ------ -------
38m 38m 1 {default-scheduler } Normal Scheduled Successfully assigned nginx-ingress2 to hgalal-k8s-10acre-3.c.rancher-qa.internal
38m 38m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Pulling pulling image "sangeetha/testnewhostrouting"
38m 38m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Pulled Successfully pulled image "sangeetha/testnewhostrouting"
38m 38m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with docker id 926069bca8f4; Security:[seccomp=unconfined]
38m 38m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with docker id 926069bca8f4
16m 15m 5 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedMount MountVolume.SetUp failed for volume "kubernetes.io/secret/32570ed9-4fc4-11e7-b7f4-02ff588301b0-default-token-j9wbx" (spec.Name: "default-token-j9wbx") pod "32570ed9-4fc4-11e7-b7f4-02ff588301b0" (UID: "32570ed9-4fc4-11e7-b7f4-02ff588301b0") with: Get https://kubernetes.kubernetes.rancher.internal:6443/api/v1/namespaces/pre1/secrets/default-token-j9wbx: dial tcp 10.42.121.36:6443: getsockopt: connection refused
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id 1b48b1a8ca06c04311af69374982971ad5ce9e9d9abc131182aaadda93cb5806
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id 1b48b1a8ca06c04311af69374982971ad5ce9e9d9abc131182aaadda93cb5806
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Killing Killing container with id docker://1b48b1a8ca06c04311af69374982971ad5ce9e9d9abc131182aaadda93cb5806:Need to kill Pod
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id 41b14be0b2b7602d9affd923d662cc350961484a863f361623b2a9f3705f9621
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Warning Failed Failed to start container with id 41b14be0b2b7602d9affd923d662cc350961484a863f361623b2a9f3705f9621 with error: rpc error: code = 2 desc = failed to start container "41b14be0b2b7602d9affd923d662cc350961484a863f361623b2a9f3705f9621": Error response from daemon: {"message":"cannot join network of a non running container: dea3879c2f09140c430d60d4cc11563f336ca850b1b8f24441529c98bd52c579"}
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "testcontainer" with rpc error: code = 2 desc = failed to start container "41b14be0b2b7602d9affd923d662cc350961484a863f361623b2a9f3705f9621": Error response from daemon: {"message":"cannot join network of a non running container: dea3879c2f09140c430d60d4cc11563f336ca850b1b8f24441529c98bd52c579"}: "Start Container Failed"
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id 194cb93a0b7e7b4031d579b4e959e45d5b3e1fd71604240f6ae46cb0d3b7234a
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "testcontainer" with rpc error: code = 2 desc = failed to start container "194cb93a0b7e7b4031d579b4e959e45d5b3e1fd71604240f6ae46cb0d3b7234a": Error response from daemon: {"message":"cannot join network of a non running container: 674e0ea08da57709cdc340fe4bf08326a4f6cf06c883ca2bf74cde640db96d73"}: "Start Container Failed"
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Warning Failed Failed to start container with id 194cb93a0b7e7b4031d579b4e959e45d5b3e1fd71604240f6ae46cb0d3b7234a with error: rpc error: code = 2 desc = failed to start container "194cb93a0b7e7b4031d579b4e959e45d5b3e1fd71604240f6ae46cb0d3b7234a": Error response from daemon: {"message":"cannot join network of a non running container: 674e0ea08da57709cdc340fe4bf08326a4f6cf06c883ca2bf74cde640db96d73"}
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id 228f71e7c7c3b349bc107825bff2947b17b321c94932e7059095305ad9632f38
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id 228f71e7c7c3b349bc107825bff2947b17b321c94932e7059095305ad9632f38
12m 12m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Killing Killing container with id docker://228f71e7c7c3b349bc107825bff2947b17b321c94932e7059095305ad9632f38:Need to kill Pod
12m 12m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "KillPodSandbox" for "32570ed9-4fc4-11e7-b7f4-02ff588301b0" with KillPodSandboxError: "rpc error: code = 2 desc = NetworkPlugin cni failed to teardown pod \"_\" network: CNI failed to retrieve network namespace path: Error: No such container: dea3879c2f09140c430d60d4cc11563f336ca850b1b8f24441529c98bd52c579"
12m 12m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "testcontainer" with CrashLoopBackOff: "Back-off 40s restarting failed container=testcontainer pod=nginx-ingress2_pre1(32570ed9-4fc4-11e7-b7f4-02ff588301b0)"
12m 12m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "CreatePodSandbox" for "nginx-ingress2_pre1(32570ed9-4fc4-11e7-b7f4-02ff588301b0)" with CreatePodSandboxError: "CreatePodSandbox for pod \"nginx-ingress2_pre1(32570ed9-4fc4-11e7-b7f4-02ff588301b0)\" failed: rpc error: code = 2 desc = NetworkPlugin cni failed to set up pod \"nginx-ingress2_pre1\" network: failed to open netns \"/proc/27100/ns/net\": failed to Statfs \"/proc/27100/ns/net\": no such file or directory"
11m 11m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id 0de38db72d5df33e2aa997cff89fb762191566a2dc66577b866a81e28a72d15b
11m 11m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id 0de38db72d5df33e2aa997cff89fb762191566a2dc66577b866a81e28a72d15b
11m 11m 2 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: rpc error: code = 2 desc = Error: No such container: 674e0ea08da57709cdc340fe4bf08326a4f6cf06c883ca2bf74cde640db96d73
10m 10m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Killing Killing container with id docker://0de38db72d5df33e2aa997cff89fb762191566a2dc66577b866a81e28a72d15b:Need to kill Pod
10m 10m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Warning Failed Failed to start container with id bbd25978344a1802b32eb1554a500c5021ad87fca3ce037ac526a75206d34a58 with error: rpc error: code = 2 desc = failed to start container "bbd25978344a1802b32eb1554a500c5021ad87fca3ce037ac526a75206d34a58": Error response from daemon: {"message":"cannot join network of a non running container: 586a294007bfd431f73f3dfbc52f1f28cfe8adcff56f174c8a28e5c56337186b"}
10m 10m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "testcontainer" with rpc error: code = 2 desc = failed to start container "bbd25978344a1802b32eb1554a500c5021ad87fca3ce037ac526a75206d34a58": Error response from daemon: {"message":"cannot join network of a non running container: 586a294007bfd431f73f3dfbc52f1f28cfe8adcff56f174c8a28e5c56337186b"}: "Start Container Failed"
10m 10m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id bbd25978344a1802b32eb1554a500c5021ad87fca3ce037ac526a75206d34a58
10m 10m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id 6e7b4a8365ff72118733fa049edebf3097855a8b35a95c42af57a04e16725437
10m 10m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id 6e7b4a8365ff72118733fa049edebf3097855a8b35a95c42af57a04e16725437
9m 9m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Killing Killing container with id docker://6e7b4a8365ff72118733fa049edebf3097855a8b35a95c42af57a04e16725437:Need to kill Pod
9m 9m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id c02566c00e2a4f24aff48abdaa720a5bfe964ad9fa522e7704548a13566bda05
9m 9m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id c02566c00e2a4f24aff48abdaa720a5bfe964ad9fa522e7704548a13566bda05
9m 9m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Killing Killing container with id docker://c02566c00e2a4f24aff48abdaa720a5bfe964ad9fa522e7704548a13566bda05:Need to kill Pod
10m 6m 10 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "testcontainer" with CrashLoopBackOff: "Back-off 2m40s restarting failed container=testcontainer pod=nginx-ingress2_pre1(32570ed9-4fc4-11e7-b7f4-02ff588301b0)"
6m 6m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id f2a04b0ec6c4fe18ff46fb288f69d6736137acbce9c6d435c5324a2f7c5ad22b
6m 6m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id f2a04b0ec6c4fe18ff46fb288f69d6736137acbce9c6d435c5324a2f7c5ad22b
5m 5m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Killing Killing container with id docker://f2a04b0ec6c4fe18ff46fb288f69d6736137acbce9c6d435c5324a2f7c5ad22b:Need to kill Pod
13m 1m 22 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Normal SandboxChanged Pod sandbox changed, it will be killed and re-created.
12m 1m 37 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Warning BackOff Back-off restarting failed container
5m 1m 26 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "testcontainer" with CrashLoopBackOff: "Back-off 5m0s restarting failed container=testcontainer pod=nginx-ingress2_pre1(32570ed9-4fc4-11e7-b7f4-02ff588301b0)"
13m 50s 10 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Pulling pulling image "sangeetha/testnewhostrouting"
13m 50s 10 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Pulled Successfully pulled image "sangeetha/testnewhostrouting"
50s 50s 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created (events with common reason combined)
50s 50s 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id 9de376f37999037796d133bb092615488e3f4531532aca1189638f8c4b66f256
14m 49s 88 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning DNSSearchForming Search Line limits were exceeded, some dns names have been omitted, the applied search line is: pre1.svc.cluster.local svc.cluster.local cluster.local kubelet.kubernetes.rancher.internal kubernetes.rancher.internal rancher.internal
```
|
1.0
|
Upgrading from kubernetes 1.5.4 to kubernetes 1.6 causes pods to enter crash loop - **Rancher versions:**
rancher/server: v1.6.2
kubernetes (if applicable): 1.5.4 --> 1.6.0
**Steps to Reproduce:**
- Start kubernetes 1.5.4
- start a few pods
- Set catalog branch to `k8s-1.6-dev`
- upgrade to k8s 1.6
**Results:**
Kubernetes pods keep restarting for more than 20 minutes before they enter the Running state.
```
kubectl get pods --all-namespaces
NAMESPACE NAME READY STATUS RESTARTS AGE
kube-system heapster-818085469-rs6w6 0/1 CrashLoopBackOff 16 40m
kube-system kube-dns-2945059722-ngxt6 0/3 CrashLoopBackOff 25 12m
kube-system kubernetes-dashboard-2463885659-bsd84 0/1 CrashLoopBackOff 11 12m
kube-system monitoring-grafana-832403127-w0ccs 0/1 CrashLoopBackOff 11 40m
kube-system monitoring-influxdb-2441835288-8h2g4 0/1 CrashLoopBackOff 12 40m
kube-system tiller-deploy-1933461550-4mrhx 0/1 CrashLoopBackOff 16 12m
pre1 k8testrc1-g3w7c 0/1 CrashLoopBackOff 11 37m
pre1 nginx-g5dfv 1/1 Running 15 37m
pre1 nginx-ingress2 0/1 CrashLoopBackOff 9 37m
pre1 nginx-pod 1/1 Running 12 37m
pre1 nginx-x66vj 0/1 CrashLoopBackOff 10 37m
```
```
kubectl describe pod nginx-ingress2 -n pre1
Name: nginx-ingress2
Namespace: pre1
Node: hgalal-k8s-10acre-3.c.rancher-qa.internal/x.x.x.x
Start Time: Tue, 13 Jun 2017 01:09:33 +0200
Labels: k8s-app=k8test2-service
Status: Running
IP: 10.42.100.139
Controllers: <none>
Containers:
testcontainer:
Container ID: docker://9de376f37999037796d133bb092615488e3f4531532aca1189638f8c4b66f256
Image: sangeetha/testnewhostrouting
Image ID: docker-pullable://sangeetha/testnewhostrouting@sha256:07f6b52d5c67c71dea08ccec41d7f09083da1c3e9c2e3932d2f1ef00afbf77af
Port: 80/TCP
State: Running
Started: Tue, 13 Jun 2017 01:47:41 +0200
Last State: Terminated
Reason: Completed
Exit Code: 0
Started: Mon, 01 Jan 0001 00:00:00 +0000
Finished: Tue, 13 Jun 2017 01:42:39 +0200
Ready: True
Restart Count: 10
Volume Mounts:
/var/run/secrets/kubernetes.io/serviceaccount from default-token-j9wbx (ro)
Environment Variables: <none>
Conditions:
Type Status
Initialized True
Ready True
PodScheduled True
Volumes:
default-token-j9wbx:
Type: Secret (a volume populated by a Secret)
SecretName: default-token-j9wbx
QoS Class: BestEffort
Tolerations: <none>
Events:
FirstSeen LastSeen Count From SubObjectPath Type Reason Message
--------- -------- ----- ---- ------------- -------- ------ -------
38m 38m 1 {default-scheduler } Normal Scheduled Successfully assigned nginx-ingress2 to hgalal-k8s-10acre-3.c.rancher-qa.internal
38m 38m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Pulling pulling image "sangeetha/testnewhostrouting"
38m 38m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Pulled Successfully pulled image "sangeetha/testnewhostrouting"
38m 38m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with docker id 926069bca8f4; Security:[seccomp=unconfined]
38m 38m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with docker id 926069bca8f4
16m 15m 5 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedMount MountVolume.SetUp failed for volume "kubernetes.io/secret/32570ed9-4fc4-11e7-b7f4-02ff588301b0-default-token-j9wbx" (spec.Name: "default-token-j9wbx") pod "32570ed9-4fc4-11e7-b7f4-02ff588301b0" (UID: "32570ed9-4fc4-11e7-b7f4-02ff588301b0") with: Get https://kubernetes.kubernetes.rancher.internal:6443/api/v1/namespaces/pre1/secrets/default-token-j9wbx: dial tcp 10.42.121.36:6443: getsockopt: connection refused
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id 1b48b1a8ca06c04311af69374982971ad5ce9e9d9abc131182aaadda93cb5806
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id 1b48b1a8ca06c04311af69374982971ad5ce9e9d9abc131182aaadda93cb5806
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Killing Killing container with id docker://1b48b1a8ca06c04311af69374982971ad5ce9e9d9abc131182aaadda93cb5806:Need to kill Pod
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id 41b14be0b2b7602d9affd923d662cc350961484a863f361623b2a9f3705f9621
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Warning Failed Failed to start container with id 41b14be0b2b7602d9affd923d662cc350961484a863f361623b2a9f3705f9621 with error: rpc error: code = 2 desc = failed to start container "41b14be0b2b7602d9affd923d662cc350961484a863f361623b2a9f3705f9621": Error response from daemon: {"message":"cannot join network of a non running container: dea3879c2f09140c430d60d4cc11563f336ca850b1b8f24441529c98bd52c579"}
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "testcontainer" with rpc error: code = 2 desc = failed to start container "41b14be0b2b7602d9affd923d662cc350961484a863f361623b2a9f3705f9621": Error response from daemon: {"message":"cannot join network of a non running container: dea3879c2f09140c430d60d4cc11563f336ca850b1b8f24441529c98bd52c579"}: "Start Container Failed"
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id 194cb93a0b7e7b4031d579b4e959e45d5b3e1fd71604240f6ae46cb0d3b7234a
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "testcontainer" with rpc error: code = 2 desc = failed to start container "194cb93a0b7e7b4031d579b4e959e45d5b3e1fd71604240f6ae46cb0d3b7234a": Error response from daemon: {"message":"cannot join network of a non running container: 674e0ea08da57709cdc340fe4bf08326a4f6cf06c883ca2bf74cde640db96d73"}: "Start Container Failed"
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Warning Failed Failed to start container with id 194cb93a0b7e7b4031d579b4e959e45d5b3e1fd71604240f6ae46cb0d3b7234a with error: rpc error: code = 2 desc = failed to start container "194cb93a0b7e7b4031d579b4e959e45d5b3e1fd71604240f6ae46cb0d3b7234a": Error response from daemon: {"message":"cannot join network of a non running container: 674e0ea08da57709cdc340fe4bf08326a4f6cf06c883ca2bf74cde640db96d73"}
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id 228f71e7c7c3b349bc107825bff2947b17b321c94932e7059095305ad9632f38
13m 13m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id 228f71e7c7c3b349bc107825bff2947b17b321c94932e7059095305ad9632f38
12m 12m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Killing Killing container with id docker://228f71e7c7c3b349bc107825bff2947b17b321c94932e7059095305ad9632f38:Need to kill Pod
12m 12m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "KillPodSandbox" for "32570ed9-4fc4-11e7-b7f4-02ff588301b0" with KillPodSandboxError: "rpc error: code = 2 desc = NetworkPlugin cni failed to teardown pod \"_\" network: CNI failed to retrieve network namespace path: Error: No such container: dea3879c2f09140c430d60d4cc11563f336ca850b1b8f24441529c98bd52c579"
12m 12m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "testcontainer" with CrashLoopBackOff: "Back-off 40s restarting failed container=testcontainer pod=nginx-ingress2_pre1(32570ed9-4fc4-11e7-b7f4-02ff588301b0)"
12m 12m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "CreatePodSandbox" for "nginx-ingress2_pre1(32570ed9-4fc4-11e7-b7f4-02ff588301b0)" with CreatePodSandboxError: "CreatePodSandbox for pod \"nginx-ingress2_pre1(32570ed9-4fc4-11e7-b7f4-02ff588301b0)\" failed: rpc error: code = 2 desc = NetworkPlugin cni failed to set up pod \"nginx-ingress2_pre1\" network: failed to open netns \"/proc/27100/ns/net\": failed to Statfs \"/proc/27100/ns/net\": no such file or directory"
11m 11m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id 0de38db72d5df33e2aa997cff89fb762191566a2dc66577b866a81e28a72d15b
11m 11m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id 0de38db72d5df33e2aa997cff89fb762191566a2dc66577b866a81e28a72d15b
11m 11m 2 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: rpc error: code = 2 desc = Error: No such container: 674e0ea08da57709cdc340fe4bf08326a4f6cf06c883ca2bf74cde640db96d73
10m 10m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Killing Killing container with id docker://0de38db72d5df33e2aa997cff89fb762191566a2dc66577b866a81e28a72d15b:Need to kill Pod
10m 10m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Warning Failed Failed to start container with id bbd25978344a1802b32eb1554a500c5021ad87fca3ce037ac526a75206d34a58 with error: rpc error: code = 2 desc = failed to start container "bbd25978344a1802b32eb1554a500c5021ad87fca3ce037ac526a75206d34a58": Error response from daemon: {"message":"cannot join network of a non running container: 586a294007bfd431f73f3dfbc52f1f28cfe8adcff56f174c8a28e5c56337186b"}
10m 10m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "testcontainer" with rpc error: code = 2 desc = failed to start container "bbd25978344a1802b32eb1554a500c5021ad87fca3ce037ac526a75206d34a58": Error response from daemon: {"message":"cannot join network of a non running container: 586a294007bfd431f73f3dfbc52f1f28cfe8adcff56f174c8a28e5c56337186b"}: "Start Container Failed"
10m 10m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id bbd25978344a1802b32eb1554a500c5021ad87fca3ce037ac526a75206d34a58
10m 10m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id 6e7b4a8365ff72118733fa049edebf3097855a8b35a95c42af57a04e16725437
10m 10m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id 6e7b4a8365ff72118733fa049edebf3097855a8b35a95c42af57a04e16725437
9m 9m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Killing Killing container with id docker://6e7b4a8365ff72118733fa049edebf3097855a8b35a95c42af57a04e16725437:Need to kill Pod
9m 9m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id c02566c00e2a4f24aff48abdaa720a5bfe964ad9fa522e7704548a13566bda05
9m 9m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id c02566c00e2a4f24aff48abdaa720a5bfe964ad9fa522e7704548a13566bda05
9m 9m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Killing Killing container with id docker://c02566c00e2a4f24aff48abdaa720a5bfe964ad9fa522e7704548a13566bda05:Need to kill Pod
10m 6m 10 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "testcontainer" with CrashLoopBackOff: "Back-off 2m40s restarting failed container=testcontainer pod=nginx-ingress2_pre1(32570ed9-4fc4-11e7-b7f4-02ff588301b0)"
6m 6m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id f2a04b0ec6c4fe18ff46fb288f69d6736137acbce9c6d435c5324a2f7c5ad22b
6m 6m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created Created container with id f2a04b0ec6c4fe18ff46fb288f69d6736137acbce9c6d435c5324a2f7c5ad22b
5m 5m 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Killing Killing container with id docker://f2a04b0ec6c4fe18ff46fb288f69d6736137acbce9c6d435c5324a2f7c5ad22b:Need to kill Pod
13m 1m 22 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Normal SandboxChanged Pod sandbox changed, it will be killed and re-created.
12m 1m 37 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Warning BackOff Back-off restarting failed container
5m 1m 26 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning FailedSync Error syncing pod, skipping: failed to "StartContainer" for "testcontainer" with CrashLoopBackOff: "Back-off 5m0s restarting failed container=testcontainer pod=nginx-ingress2_pre1(32570ed9-4fc4-11e7-b7f4-02ff588301b0)"
13m 50s 10 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Pulling pulling image "sangeetha/testnewhostrouting"
13m 50s 10 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Pulled Successfully pulled image "sangeetha/testnewhostrouting"
50s 50s 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Created (events with common reason combined)
50s 50s 1 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} spec.containers{testcontainer} Normal Started Started container with id 9de376f37999037796d133bb092615488e3f4531532aca1189638f8c4b66f256
14m 49s 88 {kubelet hgalal-k8s-10acre-3.c.rancher-qa.internal} Warning DNSSearchForming Search Line limits were exceeded, some dns names have been omitted, the applied search line is: pre1.svc.cluster.local svc.cluster.local cluster.local kubelet.kubernetes.rancher.internal kubernetes.rancher.internal rancher.internal
```
|
test
|
upgrading from kubernetes to kubernetes causes pods to enter crash loop rancher versions rancher server kubernetes if applicable steps to reproduce start kubernetes start few pods set catalog branch to dev upgrade to results kubernetes pods keeps restarting for over than minutes until they enter running state kubectl get pods all namespaces namespace name ready status restarts age kube system heapster crashloopbackoff kube system kube dns crashloopbackoff kube system kubernetes dashboard crashloopbackoff kube system monitoring grafana crashloopbackoff kube system monitoring influxdb crashloopbackoff kube system tiller deploy crashloopbackoff crashloopbackoff nginx running nginx crashloopbackoff nginx pod running nginx crashloopbackoff kubectl describe pod nginx n name nginx namespace node hgalal c rancher qa internal x x x x start time tue jun labels app service status running ip controllers containers testcontainer container id docker image sangeetha testnewhostrouting image id docker pullable sangeetha testnewhostrouting port tcp state running started tue jun last state terminated reason completed exit code started mon jan finished tue jun ready true restart count volume mounts var run secrets kubernetes io serviceaccount from default token ro environment variables conditions type status initialized true ready true podscheduled true volumes default token type secret a volume populated by a secret secretname default token qos class besteffort tolerations events firstseen lastseen count from subobjectpath type reason message default scheduler normal scheduled successfully assigned nginx to hgalal c rancher qa internal kubelet hgalal c rancher qa internal spec containers testcontainer normal pulling pulling image sangeetha testnewhostrouting kubelet hgalal c rancher qa internal spec containers testcontainer normal pulled successfully pulled image sangeetha testnewhostrouting kubelet hgalal c rancher qa internal spec containers testcontainer normal created created 
container with docker id security kubelet hgalal c rancher qa internal spec containers testcontainer normal started started container with docker id kubelet hgalal c rancher qa internal warning failedmount mountvolume setup failed for volume kubernetes io secret default token spec name default token pod uid with get dial tcp getsockopt connection refused kubelet hgalal c rancher qa internal spec containers testcontainer normal created created container with id kubelet hgalal c rancher qa internal spec containers testcontainer normal started started container with id kubelet hgalal c rancher qa internal spec containers testcontainer normal killing killing container with id docker need to kill pod kubelet hgalal c rancher qa internal spec containers testcontainer normal created created container with id kubelet hgalal c rancher qa internal spec containers testcontainer warning failed failed to start container with id with error rpc error code desc failed to start container error response from daemon message cannot join network of a non running container kubelet hgalal c rancher qa internal warning failedsync error syncing pod skipping failed to startcontainer for testcontainer with rpc error code desc failed to start container error response from daemon message cannot join network of a non running container start container failed kubelet hgalal c rancher qa internal spec containers testcontainer normal created created container with id kubelet hgalal c rancher qa internal warning failedsync error syncing pod skipping failed to startcontainer for testcontainer with rpc error code desc failed to start container error response from daemon message cannot join network of a non running container start container failed kubelet hgalal c rancher qa internal spec containers testcontainer warning failed failed to start container with id with error rpc error code desc failed to start container error response from daemon message cannot join network of a non running container 
kubelet hgalal c rancher qa internal spec containers testcontainer normal created created container with id kubelet hgalal c rancher qa internal spec containers testcontainer normal started started container with id kubelet hgalal c rancher qa internal spec containers testcontainer normal killing killing container with id docker need to kill pod kubelet hgalal c rancher qa internal warning failedsync error syncing pod skipping failed to killpodsandbox for with killpodsandboxerror rpc error code desc networkplugin cni failed to teardown pod network cni failed to retrieve network namespace path error no such container kubelet hgalal c rancher qa internal warning failedsync error syncing pod skipping failed to startcontainer for testcontainer with crashloopbackoff back off restarting failed container testcontainer pod nginx kubelet hgalal c rancher qa internal warning failedsync error syncing pod skipping failed to createpodsandbox for nginx with createpodsandboxerror createpodsandbox for pod nginx failed rpc error code desc networkplugin cni failed to set up pod nginx network failed to open netns proc ns net failed to statfs proc ns net no such file or directory kubelet hgalal c rancher qa internal spec containers testcontainer normal created created container with id kubelet hgalal c rancher qa internal spec containers testcontainer normal started started container with id kubelet hgalal c rancher qa internal warning failedsync error syncing pod skipping rpc error code desc error no such container kubelet hgalal c rancher qa internal spec containers testcontainer normal killing killing container with id docker need to kill pod kubelet hgalal c rancher qa internal spec containers testcontainer warning failed failed to start container with id with error rpc error code desc failed to start container error response from daemon message cannot join network of a non running container kubelet hgalal c rancher qa internal warning failedsync error syncing pod skipping failed 
to startcontainer for testcontainer with rpc error code desc failed to start container error response from daemon message cannot join network of a non running container start container failed kubelet hgalal c rancher qa internal spec containers testcontainer normal created created container with id kubelet hgalal c rancher qa internal spec containers testcontainer normal created created container with id kubelet hgalal c rancher qa internal spec containers testcontainer normal started started container with id kubelet hgalal c rancher qa internal spec containers testcontainer normal killing killing container with id docker need to kill pod kubelet hgalal c rancher qa internal spec containers testcontainer normal started started container with id kubelet hgalal c rancher qa internal spec containers testcontainer normal created created container with id kubelet hgalal c rancher qa internal spec containers testcontainer normal killing killing container with id docker need to kill pod kubelet hgalal c rancher qa internal warning failedsync error syncing pod skipping failed to startcontainer for testcontainer with crashloopbackoff back off restarting failed container testcontainer pod nginx kubelet hgalal c rancher qa internal spec containers testcontainer normal started started container with id kubelet hgalal c rancher qa internal spec containers testcontainer normal created created container with id kubelet hgalal c rancher qa internal spec containers testcontainer normal killing killing container with id docker need to kill pod kubelet hgalal c rancher qa internal normal sandboxchanged pod sandbox changed it will be killed and re created kubelet hgalal c rancher qa internal spec containers testcontainer warning backoff back off restarting failed container kubelet hgalal c rancher qa internal warning failedsync error syncing pod skipping failed to startcontainer for testcontainer with crashloopbackoff back off restarting failed container testcontainer pod nginx 
kubelet hgalal c rancher qa internal spec containers testcontainer normal pulling pulling image sangeetha testnewhostrouting kubelet hgalal c rancher qa internal spec containers testcontainer normal pulled successfully pulled image sangeetha testnewhostrouting kubelet hgalal c rancher qa internal spec containers testcontainer normal created events with common reason combined kubelet hgalal c rancher qa internal spec containers testcontainer normal started started container with id kubelet hgalal c rancher qa internal warning dnssearchforming search line limits were exceeded some dns names have been omitted the applied search line is svc cluster local svc cluster local cluster local kubelet kubernetes rancher internal kubernetes rancher internal rancher internal
| 1
|
170,761
| 13,201,963,470
|
IssuesEvent
|
2020-08-14 11:15:52
|
wazuh/wazuh
|
https://api.github.com/repos/wazuh/wazuh
|
closed
|
Wazuh-logtest I/O processing
|
core/analysisd core/logtest
|
|Wazuh version|Component|Install type|Install method|Platform|
|---|---|---|---|---|
| 3.13 | Analysisd | Manager | Packages/Sources | Linux |
Hello team!
For the new development of `Wazuh-logtest`, it is necessary to process input and output logs.
Enhance `w_logtest_main`. When logtest receives a message:
- Save the message in a cJSON structure.
- If there isn't a session for this client, create it.
- Call `w_logtest_process_log`, passing the message as a parameter.
- Receive the result from `w_logtest_process_log` and send it back.
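The receive-and-dispatch flow above can be sketched as follows (illustrative Python, not the actual C implementation; `process_log` and the `sessions` store are hypothetical stand-ins for `w_logtest_process_log` and the real session handling):

```python
import json

# Hypothetical stand-in for w_logtest_process_log: process one raw log line.
def process_log(session, raw_log):
    return {"session": session["id"], "output": f"processed: {raw_log}"}

sessions = {}  # client id -> session state

def logtest_main(client_id, message):
    """Handle one logtest request: parse, ensure a session, process, reply."""
    request = json.loads(message)      # save the message in a JSON structure
    if client_id not in sessions:      # no session for this client yet
        sessions[client_id] = {"id": client_id}
    response = process_log(sessions[client_id], request["event"])
    return json.dumps(response)        # send the result back to the client

print(logtest_main("client-1", '{"event": "sshd: Failed password"}'))
```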
Best regards,
Eva
|
1.0
|
Wazuh-logtest I/O processing - |Wazuh version|Component|Install type|Install method|Platform|
|---|---|---|---|---|
| 3.13 | Analysisd | Manager | Packages/Sources | Linux |
Hello team!
For the new development of `Wazuh-logtest`, it is necessary to process input and output logs.
Enhance `w_logtest_main`. When logtest receives a message:
- Save the message in a cJSON structure.
- If there isn't a session for this client, create it.
- Call `w_logtest_process_log`, passing the message as a parameter.
- Receive the result from `w_logtest_process_log` and send it back.
Best regards,
Eva
|
test
|
wazuh logtest i o processing wazuh version component install type install method platform analysisd manager packages sources linux hello team for the new development of wazuh logtest it is necessary to process input and output logs enhance w logtest main when logtest receive messages save message in the cjson structure if there isn t a session for this client create it call w logtest process log passing message as parameter receive the message from w logtest process log and send it best regards eva
| 1
|
628,154
| 19,976,917,532
|
IssuesEvent
|
2022-01-29 08:18:27
|
takoagemat/plus_Sugar
|
https://api.github.com/repos/takoagemat/plus_Sugar
|
closed
|
Support video pop-up on the homepage
|
Priority:high
|
If it is possible to address the following request, I would like to ask for it. (I don't even know whether it is feasible.)
- Request
> 23:13 ♡ I'd like the PV to start playing when the homepage opens, like this 🙇♀️
> 23:18 ♡ I want to send the URL, but I can't open the page because of my data cap 😭 Please make the MV for the song "Love your life" play
- Reference
What `like this` refers to:
https://www.ppppphm.com/
The MV for Love your life
https://www.youtube.com/watch?v=P3v61AuxKeo
|
1.0
|
Support video pop-up on the homepage - If it is possible to address the following request, I would like to ask for it. (I don't even know whether it is feasible.)
- Request
> 23:13 ♡ I'd like the PV to start playing when the homepage opens, like this 🙇♀️
> 23:18 ♡ I want to send the URL, but I can't open the page because of my data cap 😭 Please make the MV for the song "Love your life" play
- Reference
What `like this` refers to:
https://www.ppppphm.com/
The MV for Love your life
https://www.youtube.com/watch?v=P3v61AuxKeo
|
non_test
|
support video pop up on the homepage if it is possible to address the following request i would like to ask for it i don t even know whether it is feasible request ♡ i d like the pv to start playing when the homepage opens like this 🙇♀️ ♡ i want to send the url but i can t open the page because of my data cap 😭 please make the mv for the song love your life play reference what like this refers to the mv for love your life
| 0
|
170,378
| 14,257,682,405
|
IssuesEvent
|
2020-11-20 04:20:49
|
microsoft/react-native-windows-samples
|
https://api.github.com/repos/microsoft/react-native-windows-samples
|
closed
|
Update WinUI3 Docs for 0.63
|
documentation
|
See https://microsoft.github.io/react-native-windows/docs/next/winui3
Needs updating:
- Canary builds no longer needed
- NPM Package section (meant to be NuGet package?)

|
1.0
|
Update WinUI3 Docs for 0.63 - See https://microsoft.github.io/react-native-windows/docs/next/winui3
Needs updating:
- Canary builds no longer needed
- NPM Package section (meant to be NuGet package?)

|
non_test
|
update docs for see needs updating canary builds no longer needed npm package section meant to be nuget package
| 0
|
1,215
| 13,927,586,353
|
IssuesEvent
|
2020-10-21 20:04:16
|
department-of-veterans-affairs/va.gov-team
|
https://api.github.com/repos/department-of-veterans-affairs/va.gov-team
|
opened
|
Create a single re-attempt for the attachment of a 10-10CG submission
|
backend reliability vsa-caregiver
|
Out of ~8000 10-10cg submissions, 3 submissions did not have an attachment.
Based on the logs, it looks like the API never attempted to send an attachment because the generation of the PDF failed (see https://github.com/department-of-veterans-affairs/va.gov-team/issues/14429).
Since the PDF generation failure seems to happen randomly, consider implementing a single re-attempt to generate and send again. This can happen within the same req/res cycle and only when an error occurs.
Prerequisite: https://github.com/department-of-veterans-affairs/va.gov-team/issues/15146
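A minimal sketch of the proposed behaviour (illustrative Python; the function names are hypothetical, not vets-api code): the operation is re-attempted exactly once, only when the first attempt raises, and a second failure still propagates within the same request cycle.

```python
def with_single_retry(operation, *args, **kwargs):
    """Run operation; on any error, re-attempt exactly once."""
    try:
        return operation(*args, **kwargs)
    except Exception:
        return operation(*args, **kwargs)  # a second failure propagates

# Demo: a hypothetical PDF generator that fails on its first call only.
calls = {"n": 0}
def flaky_generate_pdf(submission_id):
    calls["n"] += 1
    if calls["n"] == 1:
        raise RuntimeError("PDF generation failed")
    return f"pdf-for-{submission_id}"

print(with_single_retry(flaky_generate_pdf, "10-10CG-123"))
```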
|
True
|
Create a single re-attempt for the attachment of a 10-10CG submission - Out of ~8000 10-10cg submissions, 3 submissions did not have an attachment.
Based on the logs, it looks like the API never attempted to send an attachment because the generation of the PDF failed (see https://github.com/department-of-veterans-affairs/va.gov-team/issues/14429).
Since the PDF generation failure seems to happen randomly, consider implementing a single re-attempt to generate and send again. This can happen within the same req/res cycle and only when an error occurs.
Prerequisite: https://github.com/department-of-veterans-affairs/va.gov-team/issues/15146
|
non_test
|
create a single re attempt for the attachment of a submission out of submissions submissions did not have an attachment based on the logs it looks like the api never attempted to send an attachment because the generation of the pdf failed see since the pdf generation failure seems to happen randomly consider implementing a single re attempt to generate and send again this can happen within the same req res cycle and only when an error occurs prerequisite
| 0
|
438,537
| 30,647,328,550
|
IssuesEvent
|
2023-07-25 06:16:12
|
sklyar/docker-log-driver-telegram
|
https://api.github.com/repos/sklyar/docker-log-driver-telegram
|
closed
|
Mention container name in logs.
|
documentation
|
Is it possible to add container name, from which the message is coming, to log message?
|
1.0
|
Mention container name in logs. - Is it possible to add container name, from which the message is coming, to log message?
|
non_test
|
mention container name in logs is it possible to add container name from which the message is coming to log message
| 0
|
132,632
| 10,760,630,672
|
IssuesEvent
|
2019-10-31 19:00:39
|
PulpQE/pulp-smash
|
https://api.github.com/repos/PulpQE/pulp-smash
|
closed
|
Test that if a single package fails to sync, the error is displayed
|
Issue Type: Test Case
|
https://pulp.plan.io/issues/2593
1. Create a scenario which will cause an error during sync on a single package
2. Attempt a sync
3. Observe that the "Error" field is empty
4. View the task details via the API to see that the "error_details" field does contain the error message
|
1.0
|
Test that if a single package fails to sync, the error is displayed - https://pulp.plan.io/issues/2593
1. Create a scenario which will cause an error during sync on a single package
2. Attempt a sync
3. Observe that the "Error" field is empty
4. View the task details via the API to see that the "error_details" field does contain the error message
|
test
|
test if a single package fails to sync the error is displayed create a scenario which will cause an error during sync on a single package attempt a sync observe that the error field is empty view the task details via the api to see that the error details field does contain the error message
| 1
|
155,576
| 24,485,463,053
|
IssuesEvent
|
2022-10-09 11:22:59
|
gotogether-s/gotogether-s
|
https://api.github.com/repos/gotogether-s/gotogether-s
|
closed
|
Layout NavBar Spacing
|
design
|
## About
Fix Layout and NavBar spacing issue
## To do list
- [x] Fix Layout and NavBar spacing issue
|
1.0
|
Layout NavBar Spacing - ## About
Fix Layout and NavBar spacing issue
## To do list
- [x] Fix Layout and NavBar spacing issue
|
non_test
|
layout navbar spacing about fix layout and navbar spacing issue to do list fix layout and navbar spacing issue
| 0
|
48,176
| 5,948,119,417
|
IssuesEvent
|
2017-05-26 10:21:44
|
openbmc/openbmc-test-automation
|
https://api.github.com/repos/openbmc/openbmc-test-automation
|
opened
|
[logging-test] enable logging test for elog testing
|
Test
|
Need to create a symbolic link for logging-test in the tarball lib/utils.robot
```
# Create symlink to logging-test binary.
Execute Command On BMC
... ln -s ${targ_tarball_dir_path}/bin/logging-test /usr/bin/logging-test
```
|
1.0
|
[logging-test] enable logging test for elog testing - Need to create a symbolic link for logging-test in the tarball lib/utils.robot
```
# Create symlink to logging-test binary.
Execute Command On BMC
... ln -s ${targ_tarball_dir_path}/bin/logging-test /usr/bin/logging-test
```
|
test
|
enable logging test for elog testing need to create symbolic fro logging test in the tarball lib utils robot create symlink to logging test binary execute command on bmc ln s targ tarball dir path bin logging test usr bin logging test
| 1
|
539,414
| 15,788,257,736
|
IssuesEvent
|
2021-04-01 20:30:21
|
inverse-inc/packetfence
|
https://api.github.com/repos/inverse-inc/packetfence
|
closed
|
packetfence-pki: configure Organisation, Country, State, etc. fields on Templates
|
Priority: Medium Type: Feature / Enhancement
|
**Is your feature request related to a problem? Please describe.**
When you create a certificate template, you don't have the possibility to set the Organisation, Country, State, etc. fields.
Consequently, you have to set these fields each time you generate a certificate based on the template.
**Describe the solution you'd like**
Inherit Organisation, Country, State, etc. fields from template when creating certificate with possibility to override with certificate values.
**Describe alternatives you've considered**
Configure Organisation, Country, State, etc. fields on Templates in place of Certificates.
|
1.0
|
packetfence-pki: configure Organisation, Country, State, etc. fields on Templates - **Is your feature request related to a problem? Please describe.**
When you create a certificate template, you don't have the possibility to set the Organisation, Country, State, etc. fields.
Consequently, you have to set these fields each time you generate a certificate based on the template.
**Describe the solution you'd like**
Inherit Organisation, Country, State, etc. fields from template when creating certificate with possibility to override with certificate values.
**Describe alternatives you've considered**
Configure Organisation, Country, State, etc. fields on Templates in place of Certificates.
|
non_test
|
packetfence pki configure organisation country state etc fields on templates is your feature request related to a problem please describe when you create a certificate template you don t have possibility to set organisation country state etc fields consequently you have to set this field each time you generate a certificate based on template describe the solution you d like inherit organisation country state etc fields from template when creating certificate with possibility to override with certificate values describe alternatives you ve considered configure organisation country state etc fields on templates in place of certificates
| 0
|
32,194
| 6,038,764,999
|
IssuesEvent
|
2017-06-09 22:31:22
|
capstone-coal/pycoal
|
https://api.github.com/repos/capstone-coal/pycoal
|
closed
|
Consider Using Interview
|
documentation enhancement
|
The [COAL review interview](https://groups.google.com/forum/#!topic/coal-capstone/MpNIhaXzl1s) I posted to the mailing list is a potential resource for adding frequently asked questions or less technical language to our project. This issue is simply to consider whether to do anything with these or similar materials.
|
1.0
|
Consider Using Interview - The [COAL review interview](https://groups.google.com/forum/#!topic/coal-capstone/MpNIhaXzl1s) I posted to the mailing list is a potential resource for adding frequently asked questions or less technical language to our project. This issue is simply to consider whether to do anything with these or similar materials.
|
non_test
|
consider using interview the i posted to the mailing list is a potential resource for adding frequently asked questions or less technical language to our project this issue is simply to consider whether to do anything with these or similar materials
| 0
|
132,573
| 10,759,807,539
|
IssuesEvent
|
2019-10-31 17:19:59
|
pytorch/pytorch
|
https://api.github.com/repos/pytorch/pytorch
|
opened
|
Conda downloads from anaconda.org are flaky sometimes
|
topic: flaky-tests
|
```
Oct 31 01:51:13 ++ /Users/distiller/workspace/miniconda3/bin/conda install -y mkl mkl-include numpy pyyaml setuptools cmake cffi ninja
Oct 31 01:52:18 Collecting package metadata (current_repodata.json): failed
Oct 31 01:52:18
Oct 31 01:52:18 CondaHTTPError: HTTP 502 BAD GATEWAY for url <https://repo.anaconda.com/pkgs/main/osx-64/current_repodata.json>
Oct 31 01:52:18 Elapsed: 00:12.951057
Oct 31 01:52:18 CF-RAY: 52e46323efb1c907-CMH
Oct 31 01:52:18
Oct 31 01:52:18 A remote server error occurred when trying to retrieve this URL.
Oct 31 01:52:18
Oct 31 01:52:18 A 500-type error (e.g. 500, 501, 502, 503, etc.) indicates the server failed to
Oct 31 01:52:18 fulfill a valid request. The problem may be spurious, and will resolve itself if you
Oct 31 01:52:18 try your request again. If the problem persists, consider notifying the maintainer
Oct 31 01:52:18 of the remote server.
Oct 31 01:52:18
Oct 31 01:52:18
```
https://app.circleci.com/jobs/github/pytorch/pytorch/3420118
|
1.0
|
Conda downloads from anaconda.org are flaky sometimes - ```
Oct 31 01:51:13 ++ /Users/distiller/workspace/miniconda3/bin/conda install -y mkl mkl-include numpy pyyaml setuptools cmake cffi ninja
Oct 31 01:52:18 Collecting package metadata (current_repodata.json): failed
Oct 31 01:52:18
Oct 31 01:52:18 CondaHTTPError: HTTP 502 BAD GATEWAY for url <https://repo.anaconda.com/pkgs/main/osx-64/current_repodata.json>
Oct 31 01:52:18 Elapsed: 00:12.951057
Oct 31 01:52:18 CF-RAY: 52e46323efb1c907-CMH
Oct 31 01:52:18
Oct 31 01:52:18 A remote server error occurred when trying to retrieve this URL.
Oct 31 01:52:18
Oct 31 01:52:18 A 500-type error (e.g. 500, 501, 502, 503, etc.) indicates the server failed to
Oct 31 01:52:18 fulfill a valid request. The problem may be spurious, and will resolve itself if you
Oct 31 01:52:18 try your request again. If the problem persists, consider notifying the maintainer
Oct 31 01:52:18 of the remote server.
Oct 31 01:52:18
Oct 31 01:52:18
```
https://app.circleci.com/jobs/github/pytorch/pytorch/3420118
|
test
|
conda downloads from anaconda org are flaky sometimes oct users distiller workspace bin conda install y mkl mkl include numpy pyyaml setuptools cmake cffi ninja oct collecting package metadata current repodata json failed oct oct condahttperror http bad gateway for url oct elapsed oct cf ray cmh oct oct a remote server error occurred when trying to retrieve this url oct oct a type error e g etc indicates the server failed to oct fulfill a valid request the problem may be spurious and will resolve itself if you oct try your request again if the problem persists consider notifying the maintainer oct of the remote server oct oct
| 1
|
4,901
| 7,782,161,716
|
IssuesEvent
|
2018-06-06 04:55:27
|
neuropoly/spinalcordtoolbox
|
https://api.github.com/repos/neuropoly/spinalcordtoolbox
|
closed
|
UnboundLocalError: local variable 'z_centerline_voxel' referenced before assignment
|
bug priority:HIGH sct_process_segmentation
|
I think this issue is related to the centerline not being defined if the flag `-no-angle` is set to 1. Needs further investigation.
~~~
Spinal Cord Toolbox (master/fc72f34d4251de09d3e92071c66b40ac7fe4c92a)
Running /Users/julien/code/sct/scripts/sct_process_segmentation.py -i t2s_gmseg_manual.nii.gz -p csa -no-angle 1 -vert 4 -vertfile t1_seg_labeled_reg.nii.gz -ofolder csa_gm
Folder csa_gm has been created.
Check parameters:
.. segmentation file: t2s_gmseg_manual.nii.gz
Create temporary folder (/var/folders/6f/wy6ljmx9453cgth2qwv5l1s80000gn/T/sct-180530091520-LylRGO)...
Copying input data to tmp folder and convert to nii...
sct_convert -i /Users/julien/data/spine_generic/20180509_julien-skyra/t2s/t2s_gmseg_manual.nii.gz -o /var/folders/6f/wy6ljmx9453cgth2qwv5l1s80000gn/T/sct-180530091520-LylRGO/segmentation.nii.gz # in /Users/julien/data/spine_generic/20180509_julien-skyra/t2s
Change orientation to RPI...
sct_image -i segmentation.nii.gz -setorient RPI -o segmentation_RPI.nii.gz # in /private/var/folders/6f/wy6ljmx9453cgth2qwv5l1s80000gn/T/sct-180530091520-LylRGO
Open segmentation volume...
Get data dimensions...
448 x 448 x 15
Compute CSA...
Smooth CSA across slices...
.. No smoothing!
Create volume of CSA values...
Create volume of angle values...
sct_image -i csa_volume_RPI.nii.gz -setorient RPI -o csa_volume_in_initial_orientation.nii.gz # in /private/var/folders/6f/wy6ljmx9453cgth2qwv5l1s80000gn/T/sct-180530091520-LylRGO
sct_image -i angle_volume_RPI.nii.gz -setorient RPI -o angle_volume_in_initial_orientation.nii.gz # in /private/var/folders/6f/wy6ljmx9453cgth2qwv5l1s80000gn/T/sct-180530091520-LylRGO
Generate output files...
WARNING: File csa_gm/csa_image.nii.gz already exists. Deleting it...
File created: csa_gm/csa_image.nii.gz
WARNING: File csa_gm/angle_image.nii.gz already exists. Deleting it...
File created: csa_gm/angle_image.nii.gz
Display CSA per slice:
z = 0, CSA = 16.000000 mm^2, Angle = 0.000000 deg
z = 1, CSA = 17.000000 mm^2, Angle = 0.000000 deg
z = 2, CSA = 18.000000 mm^2, Angle = 0.000000 deg
z = 3, CSA = 16.750000 mm^2, Angle = 0.000000 deg
z = 4, CSA = 15.250000 mm^2, Angle = 0.000000 deg
z = 5, CSA = 15.750000 mm^2, Angle = 0.000000 deg
z = 6, CSA = 15.500000 mm^2, Angle = 0.000000 deg
z = 7, CSA = 16.250000 mm^2, Angle = 0.000000 deg
z = 8, CSA = 16.000000 mm^2, Angle = 0.000000 deg
z = 9, CSA = 13.750000 mm^2, Angle = 0.000000 deg
z = 10, CSA = 14.500000 mm^2, Angle = 0.000000 deg
z = 11, CSA = 13.500000 mm^2, Angle = 0.000000 deg
z = 12, CSA = 13.000000 mm^2, Angle = 0.000000 deg
z = 13, CSA = 14.000000 mm^2, Angle = 0.000000 deg
z = 14, CSA = 13.500000 mm^2, Angle = 0.000000 deg
Save results in: csa_gm/csa_per_slice.txt
Save results in: csa_gm/csa_per_slice.pickle
Selected vertebral levels... 4
OK: t1_seg_labeled_reg.nii.gz
Traceback (most recent call last):
File "/Users/julien/code/sct/scripts/sct_process_segmentation.py", line 1230, in <module>
main(sys.argv[1:])
File "/Users/julien/code/sct/scripts/sct_process_segmentation.py", line 242, in main
compute_csa(fname_segmentation, output_folder, overwrite, verbose, remove_temp_files, step, smoothing_param, slices, vert_lev, fname_vertebral_labeling, algo_fitting=param.algo_fitting, type_window=param.type_window, window_length=param.window_length, angle_correction=angle_correction, use_phys_coord=use_phys_coord)
File "/Users/julien/code/sct/scripts/sct_process_segmentation.py", line 827, in compute_csa
slices, vert_levels_list, warning = get_slices_matching_with_vertebral_levels_based_centerline(vert_levels, im_vertebral_labeling.data, z_centerline_voxel)
UnboundLocalError: local variable 'z_centerline_voxel' referenced before assignment
~~~
|
1.0
|
UnboundLocalError: local variable 'z_centerline_voxel' referenced before assignment - I think this issue is related to the centerline not being defined if the flag `-no-angle` is set to 1. Needs further investigations.
~~~
Spinal Cord Toolbox (master/fc72f34d4251de09d3e92071c66b40ac7fe4c92a)
Running /Users/julien/code/sct/scripts/sct_process_segmentation.py -i t2s_gmseg_manual.nii.gz -p csa -no-angle 1 -vert 4 -vertfile t1_seg_labeled_reg.nii.gz -ofolder csa_gm
Folder csa_gm has been created.
Check parameters:
.. segmentation file: t2s_gmseg_manual.nii.gz
Create temporary folder (/var/folders/6f/wy6ljmx9453cgth2qwv5l1s80000gn/T/sct-180530091520-LylRGO)...
Copying input data to tmp folder and convert to nii...
sct_convert -i /Users/julien/data/spine_generic/20180509_julien-skyra/t2s/t2s_gmseg_manual.nii.gz -o /var/folders/6f/wy6ljmx9453cgth2qwv5l1s80000gn/T/sct-180530091520-LylRGO/segmentation.nii.gz # in /Users/julien/data/spine_generic/20180509_julien-skyra/t2s
Change orientation to RPI...
sct_image -i segmentation.nii.gz -setorient RPI -o segmentation_RPI.nii.gz # in /private/var/folders/6f/wy6ljmx9453cgth2qwv5l1s80000gn/T/sct-180530091520-LylRGO
Open segmentation volume...
Get data dimensions...
448 x 448 x 15
Compute CSA...
Smooth CSA across slices...
.. No smoothing!
Create volume of CSA values...
Create volume of angle values...
sct_image -i csa_volume_RPI.nii.gz -setorient RPI -o csa_volume_in_initial_orientation.nii.gz # in /private/var/folders/6f/wy6ljmx9453cgth2qwv5l1s80000gn/T/sct-180530091520-LylRGO
sct_image -i angle_volume_RPI.nii.gz -setorient RPI -o angle_volume_in_initial_orientation.nii.gz # in /private/var/folders/6f/wy6ljmx9453cgth2qwv5l1s80000gn/T/sct-180530091520-LylRGO
Generate output files...
WARNING: File csa_gm/csa_image.nii.gz already exists. Deleting it...
File created: csa_gm/csa_image.nii.gz
WARNING: File csa_gm/angle_image.nii.gz already exists. Deleting it...
File created: csa_gm/angle_image.nii.gz
Display CSA per slice:
z = 0, CSA = 16.000000 mm^2, Angle = 0.000000 deg
z = 1, CSA = 17.000000 mm^2, Angle = 0.000000 deg
z = 2, CSA = 18.000000 mm^2, Angle = 0.000000 deg
z = 3, CSA = 16.750000 mm^2, Angle = 0.000000 deg
z = 4, CSA = 15.250000 mm^2, Angle = 0.000000 deg
z = 5, CSA = 15.750000 mm^2, Angle = 0.000000 deg
z = 6, CSA = 15.500000 mm^2, Angle = 0.000000 deg
z = 7, CSA = 16.250000 mm^2, Angle = 0.000000 deg
z = 8, CSA = 16.000000 mm^2, Angle = 0.000000 deg
z = 9, CSA = 13.750000 mm^2, Angle = 0.000000 deg
z = 10, CSA = 14.500000 mm^2, Angle = 0.000000 deg
z = 11, CSA = 13.500000 mm^2, Angle = 0.000000 deg
z = 12, CSA = 13.000000 mm^2, Angle = 0.000000 deg
z = 13, CSA = 14.000000 mm^2, Angle = 0.000000 deg
z = 14, CSA = 13.500000 mm^2, Angle = 0.000000 deg
Save results in: csa_gm/csa_per_slice.txt
Save results in: csa_gm/csa_per_slice.pickle
Selected vertebral levels... 4
OK: t1_seg_labeled_reg.nii.gz
Traceback (most recent call last):
File "/Users/julien/code/sct/scripts/sct_process_segmentation.py", line 1230, in <module>
main(sys.argv[1:])
File "/Users/julien/code/sct/scripts/sct_process_segmentation.py", line 242, in main
compute_csa(fname_segmentation, output_folder, overwrite, verbose, remove_temp_files, step, smoothing_param, slices, vert_lev, fname_vertebral_labeling, algo_fitting=param.algo_fitting, type_window=param.type_window, window_length=param.window_length, angle_correction=angle_correction, use_phys_coord=use_phys_coord)
File "/Users/julien/code/sct/scripts/sct_process_segmentation.py", line 827, in compute_csa
slices, vert_levels_list, warning = get_slices_matching_with_vertebral_levels_based_centerline(vert_levels, im_vertebral_labeling.data, z_centerline_voxel)
UnboundLocalError: local variable 'z_centerline_voxel' referenced before assignment
~~~
|
non_test
|
unboundlocalerror local variable z centerline voxel referenced before assignment i think this issue is related to the centerline not being defined if the flag no angle is set to needs further investigations spinal cord toolbox master running users julien code sct scripts sct process segmentation py i gmseg manual nii gz p csa no angle vert vertfile seg labeled reg nii gz ofolder csa gm folder csa gm has been created check parameters segmentation file gmseg manual nii gz create temporary folder var folders t sct lylrgo copying input data to tmp folder and convert to nii sct convert i users julien data spine generic julien skyra gmseg manual nii gz o var folders t sct lylrgo segmentation nii gz in users julien data spine generic julien skyra change orientation to rpi sct image i segmentation nii gz setorient rpi o segmentation rpi nii gz in private var folders t sct lylrgo open segmentation volume get data dimensions x x compute csa smooth csa across slices no smoothing create volume of csa values create volume of angle values sct image i csa volume rpi nii gz setorient rpi o csa volume in initial orientation nii gz in private var folders t sct lylrgo sct image i angle volume rpi nii gz setorient rpi o angle volume in initial orientation nii gz in private var folders t sct lylrgo generate output files warning file csa gm csa image nii gz already exists deleting it file created csa gm csa image nii gz warning file csa gm angle image nii gz already exists deleting it file created csa gm angle image nii gz display csa per slice z csa mm angle deg z csa mm angle deg z csa mm angle deg z csa mm angle deg z csa mm angle deg z csa mm angle deg z csa mm angle deg z csa mm angle deg z csa mm angle deg z csa mm angle deg z csa mm angle deg z csa mm angle deg z csa mm angle deg z csa mm angle deg z csa mm angle deg save results in csa gm csa per slice txt save results in csa gm csa per slice pickle selected vertebral levels ok seg labeled reg nii gz traceback most recent call 
last file users julien code sct scripts sct process segmentation py line in main sys argv file users julien code sct scripts sct process segmentation py line in main compute csa fname segmentation output folder overwrite verbose remove temp files step smoothing param slices vert lev fname vertebral labeling algo fitting param algo fitting type window param type window window length param window length angle correction angle correction use phys coord use phys coord file users julien code sct scripts sct process segmentation py line in compute csa slices vert levels list warning get slices matching with vertebral levels based centerline vert levels im vertebral labeling data z centerline voxel unboundlocalerror local variable z centerline voxel referenced before assignment
| 0
|
94,871
| 16,021,924,903
|
IssuesEvent
|
2021-04-21 01:38:12
|
joshnewton31080/dvna
|
https://api.github.com/repos/joshnewton31080/dvna
|
opened
|
CVE-2017-5941 (High) detected in node-serialize-0.0.4.tgz
|
security vulnerability
|
## CVE-2017-5941 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-serialize-0.0.4.tgz</b></p></summary>
<p>Serialize a object including it's function into a JSON.</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-serialize/-/node-serialize-0.0.4.tgz">https://registry.npmjs.org/node-serialize/-/node-serialize-0.0.4.tgz</a></p>
<p>Path to dependency file: dvna/package.json</p>
<p>Path to vulnerable library: dvna/node_modules/node-serialize/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-serialize-0.0.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/joshnewton31080/dvna/commit/ebbe518de6103063656cb8a1c3d1040aacb09826">ebbe518de6103063656cb8a1c3d1040aacb09826</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in the node-serialize package 0.0.4 for Node.js. Untrusted data passed into the unserialize() function can be exploited to achieve arbitrary code execution by passing a JavaScript Object with an Immediately Invoked Function Expression (IIFE).
<p>Publish Date: 2017-02-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-5941>CVE-2017-5941</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-serialize","packageVersion":"0.0.4","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"node-serialize:0.0.4","isMinimumFixVersionAvailable":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2017-5941","vulnerabilityDetails":"An issue was discovered in the node-serialize package 0.0.4 for Node.js. Untrusted data passed into the unserialize() function can be exploited to achieve arbitrary code execution by passing a JavaScript Object with an Immediately Invoked Function Expression (IIFE).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-5941","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2017-5941 (High) detected in node-serialize-0.0.4.tgz - ## CVE-2017-5941 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-serialize-0.0.4.tgz</b></p></summary>
<p>Serialize a object including it's function into a JSON.</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-serialize/-/node-serialize-0.0.4.tgz">https://registry.npmjs.org/node-serialize/-/node-serialize-0.0.4.tgz</a></p>
<p>Path to dependency file: dvna/package.json</p>
<p>Path to vulnerable library: dvna/node_modules/node-serialize/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-serialize-0.0.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/joshnewton31080/dvna/commit/ebbe518de6103063656cb8a1c3d1040aacb09826">ebbe518de6103063656cb8a1c3d1040aacb09826</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in the node-serialize package 0.0.4 for Node.js. Untrusted data passed into the unserialize() function can be exploited to achieve arbitrary code execution by passing a JavaScript Object with an Immediately Invoked Function Expression (IIFE).
<p>Publish Date: 2017-02-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-5941>CVE-2017-5941</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-serialize","packageVersion":"0.0.4","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"node-serialize:0.0.4","isMinimumFixVersionAvailable":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2017-5941","vulnerabilityDetails":"An issue was discovered in the node-serialize package 0.0.4 for Node.js. Untrusted data passed into the unserialize() function can be exploited to achieve arbitrary code execution by passing a JavaScript Object with an Immediately Invoked Function Expression (IIFE).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-5941","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_test
|
cve high detected in node serialize tgz cve high severity vulnerability vulnerable library node serialize tgz serialize a object including it s function into a json library home page a href path to dependency file dvna package json path to vulnerable library dvna node modules node serialize package json dependency hierarchy x node serialize tgz vulnerable library found in head commit a href found in base branch main vulnerability details an issue was discovered in the node serialize package for node js untrusted data passed into the unserialize function can be exploited to achieve arbitrary code execution by passing a javascript object with an immediately invoked function expression iife publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree node serialize isminimumfixversionavailable false basebranches vulnerabilityidentifier cve vulnerabilitydetails an issue was discovered in the node serialize package for node js untrusted data passed into the unserialize function can be exploited to achieve arbitrary code execution by passing a javascript object with an immediately invoked function expression iife vulnerabilityurl
| 0
|
156,939
| 12,341,575,591
|
IssuesEvent
|
2020-05-14 22:15:45
|
sharifsmashmedia/clickster-specs
|
https://api.github.com/repos/sharifsmashmedia/clickster-specs
|
closed
|
Get all affiliate groups
|
Affiliates Groups test
|
## Group
<!-- Which group this test belongs to? -->
Groups
## Endpoint
| method | path |
| ---------------------- | ------------------------ |
| `GET` | `/api/groups/affiliate` |
## Schema
<!-- Which Joi should this endpoint return -->
`AffiliateGroupSchema`
## Specs
<!-- Repeat below items as many times as needed -->
<details>
<summary>should return a list of all affiliate groups</summary>
### URL params
<!-- Which URL params should get replaced -->
```json
```
### Query params
<!-- Query params to be appended to the test URL -->
```json
```
### Input
<!-- The input passed to the request -->
```json
```
### Output
<!-- The expected output -->
```json
[
{
"_id": "5751a659cf9160c0134087ba",
"object_name": "affiliate",
"name": "Semi-Monthly",
"updated_on": "2016-06-03T15:46:33.288Z",
"created_on": "2016-06-03T15:46:33.288Z",
"__v": 0
},
{
"_id": "575f0d9877c7a04d47b669f3",
"object_name": "affiliate",
"name": "Monthly",
"updated_on": "2016-06-13T19:46:32.033Z",
"created_on": "2016-06-13T19:46:32.033Z",
"__v": 0
},
{
"_id": "577eaa1a4087d5f413f033da",
"object_name": "affiliate",
"name": "Weekly",
"updated_on": "2016-07-07T19:14:34.821Z",
"created_on": "2016-07-07T19:14:34.821Z",
"__v": 0
},
{
"_id": "5c2fd162fdbc1d513fbce4b5",
"object_name": "affiliate",
"name": "test",
"created_on": "2019-01-04T21:34:26.471Z",
"updated_on": "2019-01-04T21:34:26.471Z",
"__v": 0
},
{
"_id": "5c4f884515f0000e1a4459c1",
"object_name": "affiliate",
"name": "tester",
"created_on": "2019-01-28T22:55:01.776Z",
"updated_on": "2019-01-28T22:55:01.776Z",
"__v": 0
},
{
"_id": "5eb476b026d5090d2e41b707",
"object_name": "affiliate",
"name": "test 123",
"created_on": "2020-05-07T20:59:28.134Z",
"updated_on": "2020-05-07T20:59:28.134Z",
"__v": 0
}
]
```
### Status code
<!-- Expected status code from this test run -->
`200`
</details>
|
1.0
|
Get all affiliate groups - ## Group
<!-- Which group this test belongs to? -->
Groups
## Endpoint
| method | path |
| ---------------------- | ------------------------ |
| `GET` | `/api/groups/affiliate` |
## Schema
<!-- Which Joi should this endpoint return -->
`AffiliateGroupSchema`
## Specs
<!-- Repeat below items as many times as needed -->
<details>
<summary>should return a list of all affiliate groups</summary>
### URL params
<!-- Which URL params should get replaced -->
```json
```
### Query params
<!-- Query params to be appended to the test URL -->
```json
```
### Input
<!-- The input passed to the request -->
```json
```
### Output
<!-- The expected output -->
```json
[
{
"_id": "5751a659cf9160c0134087ba",
"object_name": "affiliate",
"name": "Semi-Monthly",
"updated_on": "2016-06-03T15:46:33.288Z",
"created_on": "2016-06-03T15:46:33.288Z",
"__v": 0
},
{
"_id": "575f0d9877c7a04d47b669f3",
"object_name": "affiliate",
"name": "Monthly",
"updated_on": "2016-06-13T19:46:32.033Z",
"created_on": "2016-06-13T19:46:32.033Z",
"__v": 0
},
{
"_id": "577eaa1a4087d5f413f033da",
"object_name": "affiliate",
"name": "Weekly",
"updated_on": "2016-07-07T19:14:34.821Z",
"created_on": "2016-07-07T19:14:34.821Z",
"__v": 0
},
{
"_id": "5c2fd162fdbc1d513fbce4b5",
"object_name": "affiliate",
"name": "test",
"created_on": "2019-01-04T21:34:26.471Z",
"updated_on": "2019-01-04T21:34:26.471Z",
"__v": 0
},
{
"_id": "5c4f884515f0000e1a4459c1",
"object_name": "affiliate",
"name": "tester",
"created_on": "2019-01-28T22:55:01.776Z",
"updated_on": "2019-01-28T22:55:01.776Z",
"__v": 0
},
{
"_id": "5eb476b026d5090d2e41b707",
"object_name": "affiliate",
"name": "test 123",
"created_on": "2020-05-07T20:59:28.134Z",
"updated_on": "2020-05-07T20:59:28.134Z",
"__v": 0
}
]
```
### Status code
<!-- Expected status code from this test run -->
`200`
</details>
|
test
|
get all affiliate groups group groups endpoint method path get api groups affiliate schema affiliategroupschema specs should return a list of all affiliate groups url params json query params json input json output json id object name affiliate name semi monthly updated on created on v id object name affiliate name monthly updated on created on v id object name affiliate name weekly updated on created on v id object name affiliate name test created on updated on v id object name affiliate name tester created on updated on v id object name affiliate name test created on updated on v status code
| 1
|
78,501
| 7,645,044,292
|
IssuesEvent
|
2018-05-08 17:19:42
|
Azure/azure-functions-host
|
https://api.github.com/repos/Azure/azure-functions-host
|
closed
|
EndToEndTimeoutTests Hanging
|
v2-testgaps
|
These tests currently hang indefinitely when run. I can see this when trying to run them locally via VS, and it also looks like they may be hanging on AppVeyor, even though the tests runs are coming back green.
E.g. here's a [v1.x test run](https://ci.appveyor.com/project/appsvc/azure-webjobs-sdk-script-y8o14/build/1.0.11562), and if do a search in the tests for "TimeoutTest_UsingToken_CSharp" you'll see it ran to completion. If you do the same search on [a v2 test run](https://ci.appveyor.com/project/appsvc/azure-webjobs-sdk-script-y8o14/build/2.0.11554) you won't find the test.
I think timeouts may be broken in v2. I've disabled the tests, referencing this issue.
|
1.0
|
EndToEndTimeoutTests Hanging - These tests currently hang indefinitely when run. I can see this when trying to run them locally via VS, and it also looks like they may be hanging on AppVeyor, even though the tests runs are coming back green.
E.g. here's a [v1.x test run](https://ci.appveyor.com/project/appsvc/azure-webjobs-sdk-script-y8o14/build/1.0.11562), and if do a search in the tests for "TimeoutTest_UsingToken_CSharp" you'll see it ran to completion. If you do the same search on [a v2 test run](https://ci.appveyor.com/project/appsvc/azure-webjobs-sdk-script-y8o14/build/2.0.11554) you won't find the test.
I think timeouts may be broken in v2. I've disabled the tests, referencing this issue.
|
test
|
endtoendtimeouttests hanging these tests currently hang indefinitely when run i can see this when trying to run them locally via vs and it also looks like they may be hanging on appveyor even though the tests runs are coming back green e g here s a and if do a search in the tests for timeouttest usingtoken csharp you ll see it ran to completion if you do the same search on you won t find the test i think timeouts may be broken in i ve disabled the tests referencing this issue
| 1
|
315,106
| 27,046,429,447
|
IssuesEvent
|
2023-02-13 10:06:15
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
kv/kvserver: TestReplicateQueueDecommissioningNonVoters/replace failed
|
C-test-failure O-robot X-stale T-kv no-test-failure-activity branch-release-22.2
|
kv/kvserver.TestReplicateQueueDecommissioningNonVoters/replace [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/7767054?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/7767054?buildTab=artifacts#/) on release-22.2 @ [d978988afcfc78c02c23a3836ba90e4a4faeeb25](https://github.com/cockroachdb/cockroach/commits/d978988afcfc78c02c23a3836ba90e4a4faeeb25):
```
Slow failing tests:
TestReplicateQueueDecommissioningNonVoters/replace - 33.03s
TestLeaseTransferRejectedIfTargetNeedsSnapshot - 8.41s
Slow passing tests:
TestReplicateQueueRebalance - 1260.70s
TestReplicateQueueRebalance - 155.12s
TestReplicateQueueDecommissioningNonVoters - 142.44s
TestReplicateQueueSwapVotersWithNonVoters - 122.48s
TestDecommission - 106.21s
TestMergeQueueSeesNonVoters - 99.90s
TestMergeQueueSeesNonVoters - 98.53s
TestDecommission - 98.49s
TestLearnerSnapshotFailsRollback - 95.01s
TestLearnerSnapshotFailsRollback - 93.74s
TestSnapshotsToDrainingNodes - 63.53s
TestSnapshotsToDrainingNodes - 62.02s
TestReplicateQueueDownReplicate - 51.07s
TestClosedTimestampFrozenAfterSubsumption - 49.37s
TestStoreRangeMergeTimestampCache - 48.65s
TestClosedTimestampCantServeForNonTransactionalBatch - 47.05s
TestClosedTimestampFrozenAfterSubsumption - 46.32s
TestStoreRangeMergeTimestampCache - 44.85s
TestClosedTimestampCantServeForNonTransactionalBatch - 44.73s
TestReplicateQueueMetrics - 38.18s
```
<p>Parameters: <code>TAGS=bazel,gss,deadlock</code>
</p>
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
/cc @cockroachdb/kv
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestReplicateQueueDecommissioningNonVoters/replace.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
Jira issue: CRDB-21933
|
2.0
|
kv/kvserver: TestReplicateQueueDecommissioningNonVoters/replace failed - kv/kvserver.TestReplicateQueueDecommissioningNonVoters/replace [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/7767054?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/7767054?buildTab=artifacts#/) on release-22.2 @ [d978988afcfc78c02c23a3836ba90e4a4faeeb25](https://github.com/cockroachdb/cockroach/commits/d978988afcfc78c02c23a3836ba90e4a4faeeb25):
```
Slow failing tests:
TestReplicateQueueDecommissioningNonVoters/replace - 33.03s
TestLeaseTransferRejectedIfTargetNeedsSnapshot - 8.41s
Slow passing tests:
TestReplicateQueueRebalance - 1260.70s
TestReplicateQueueRebalance - 155.12s
TestReplicateQueueDecommissioningNonVoters - 142.44s
TestReplicateQueueSwapVotersWithNonVoters - 122.48s
TestDecommission - 106.21s
TestMergeQueueSeesNonVoters - 99.90s
TestMergeQueueSeesNonVoters - 98.53s
TestDecommission - 98.49s
TestLearnerSnapshotFailsRollback - 95.01s
TestLearnerSnapshotFailsRollback - 93.74s
TestSnapshotsToDrainingNodes - 63.53s
TestSnapshotsToDrainingNodes - 62.02s
TestReplicateQueueDownReplicate - 51.07s
TestClosedTimestampFrozenAfterSubsumption - 49.37s
TestStoreRangeMergeTimestampCache - 48.65s
TestClosedTimestampCantServeForNonTransactionalBatch - 47.05s
TestClosedTimestampFrozenAfterSubsumption - 46.32s
TestStoreRangeMergeTimestampCache - 44.85s
TestClosedTimestampCantServeForNonTransactionalBatch - 44.73s
TestReplicateQueueMetrics - 38.18s
```
<p>Parameters: <code>TAGS=bazel,gss,deadlock</code>
</p>
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
/cc @cockroachdb/kv
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestReplicateQueueDecommissioningNonVoters/replace.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
Jira issue: CRDB-21933
|
test
|
kv kvserver testreplicatequeuedecommissioningnonvoters replace failed kv kvserver testreplicatequeuedecommissioningnonvoters replace with on release slow failing tests testreplicatequeuedecommissioningnonvoters replace testleasetransferrejectediftargetneedssnapshot slow passing tests testreplicatequeuerebalance testreplicatequeuerebalance testreplicatequeuedecommissioningnonvoters testreplicatequeueswapvoterswithnonvoters testdecommission testmergequeueseesnonvoters testmergequeueseesnonvoters testdecommission testlearnersnapshotfailsrollback testlearnersnapshotfailsrollback testsnapshotstodrainingnodes testsnapshotstodrainingnodes testreplicatequeuedownreplicate testclosedtimestampfrozenaftersubsumption teststorerangemergetimestampcache testclosedtimestampcantservefornontransactionalbatch testclosedtimestampfrozenaftersubsumption teststorerangemergetimestampcache testclosedtimestampcantservefornontransactionalbatch testreplicatequeuemetrics parameters tags bazel gss deadlock help see also cc cockroachdb kv jira issue crdb
| 1
|
295,139
| 25,457,660,647
|
IssuesEvent
|
2022-11-24 15:28:40
|
Mbed-TLS/mbedtls
|
https://api.github.com/repos/Mbed-TLS/mbedtls
|
opened
|
compat.sh misses CAMELLIA with newer GnuTLS
|
enhancement size-s component-test
|
For some reason, newer GnuTLS doesn't include CAMELLIA cipher suites in the NORMAL priority. So if you run `compat.sh` out of the box on, say, Ubuntu 20.04, all CAMELLIA interop with GnuTLS fails. It seems you have to enable CAMELLIA ciphers explicitly (I haven't found a way to enable them all).
I have a patch to fix this and any similar problems that might come up: https://github.com/Mbed-TLS/mbedtls/pull/6614/commits/ff88ece6feb4507a322704cfa63a163b3f57e928 (not sure yet whether it'll end up in that pull request)
|
1.0
|
compat.sh misses CAMELLIA with newer GnuTLS - For some reason, newer GnuTLS doesn't include CAMELLIA cipher suites in the NORMAL priority. So if you run `compat.sh` out of the box on, say, Ubuntu 20.04, all CAMELLIA interop with GnuTLS fails. It seems you have to enable CAMELLIA ciphers explicitly (I haven't found a way to enable them all).
I have a patch to fix this and any similar problems that might come up: https://github.com/Mbed-TLS/mbedtls/pull/6614/commits/ff88ece6feb4507a322704cfa63a163b3f57e928 (not sure yet whether it'll end up in that pull request)
|
test
|
compat sh misses camellia with newer gnutls for some reason newer gnutls doesn t include camellia cipher suites in the normal priority so if you run compat sh out of the box on say ubuntu all camellia interop with gnutls fails it seems you have to enable camellia ciphers explicitly i haven t found a way to enable them all i have a patch to fix this and any similar problems that might come up not sure yet whether it ll end up in that pull request
| 1
|
197,541
| 14,932,526,955
|
IssuesEvent
|
2021-01-25 07:53:47
|
tendermint/tendermint
|
https://api.github.com/repos/tendermint/tendermint
|
opened
|
test/fuzz: add missing tests
|
T:test
|
As a follow-up to https://github.com/tendermint/tendermint/pull/5918, we should analyze what tests are missing (i.e. what inputs do we want to cover with fuzzing). A few that come to my mind immediately:
- light client (https://github.com/tendermint/tendermint/issues/4453)
- X Reactor # Receive (it does not make sense to test protobuf encoding / decoding, but it does when it comes to handling individual messages)
- RPC `/tx_search`, `/blockchain`
|
1.0
|
test/fuzz: add missing tests - As a follow-up to https://github.com/tendermint/tendermint/pull/5918, we should analyze what tests are missing (i.e. what inputs do we want to cover with fuzzing). A few that come to my mind immediately:
- light client (https://github.com/tendermint/tendermint/issues/4453)
- X Reactor # Receive (it does not make sense to test protobuf encoding / decoding, but it does when it comes to handling individual messages)
- RPC `/tx_search`, `/blockchain`
|
test
|
test fuzz add missing tests as a follow up to we should analyze what tests are missing i e what inputs do we want to cover with fuzzing a few that come to my mind immediately light client x reactor receive it does not make sense to test protobuf encoding decoding but it does when it comes to handling individual messages rpc tx search blockchain
| 1
|
47,694
| 5,908,277,544
|
IssuesEvent
|
2017-05-19 19:57:10
|
vmware/docker-volume-vsphere
|
https://api.github.com/repos/vmware/docker-volume-vsphere
|
opened
|
Prepare CI testbed to run all P0 tests
|
component/ci-infrastructure component/test-infrastructure
|
Currently CI is having 1 ESX and 2 VMS for vSphere 6.5 and the same for 6.0. For running P0 tests, there is a need of having 2 ESX + 6 vms (1 vm each on local/shared/vsan).
2 ESX 6.5 as nested VMs:
- having access to 3 datastore (shared/local/vsan)
- 1 vm each on above datastores ( 3vms)
The same setup for ESX 6.0 U2 as well.
|
1.0
|
Prepare CI testbed to run all P0 tests - Currently CI is having 1 ESX and 2 VMS for vSphere 6.5 and the same for 6.0. For running P0 tests, there is a need of having 2 ESX + 6 vms (1 vm each on local/shared/vsan).
2 ESX 6.5 as nested VMs:
- having access to 3 datastore (shared/local/vsan)
- 1 vm each on above datastores ( 3vms)
The same setup for ESX 6.0 U2 as well.
|
test
|
prepare ci testbed to run all tests currently ci is having esx and vms for vsphere and the same for for running tests there is a need of having esx vms vm each on local shared vsan esx as nested vms having access to datastore shared local vsan vm each on above datastores the same setup for esx as well
| 1
|
469,198
| 13,503,382,460
|
IssuesEvent
|
2020-09-13 13:18:40
|
IFB-ElixirFr/ifbcat
|
https://api.github.com/repos/IFB-ElixirFr/ifbcat
|
closed
|
Make non-functional (but retain) APIView and ViewSet code added for testing purposes only
|
high priority
|
Edit:
* TestApiView and TestViewSet in views.py
* TestApiViewSerializer from serializers.py (used for both APIView & ViewSet code)
* urls.py (endpoints)
|
1.0
|
Make non-functional (but retain) APIView and ViewSet code added for testing purposes only - Edit:
* TestApiView and TestViewSet in views.py
* TestApiViewSerializer from serializers.py (used for both APIView & ViewSet code)
* urls.py (endpoints)
|
non_test
|
make non functional but retain apiview and viewset code added for testing purposes only edit testapiview and testviewset in views py testapiviewserializer from serializers py used for both apiview viewset code urls py endpoints
| 0
|
121,163
| 10,152,102,153
|
IssuesEvent
|
2019-08-05 22:19:30
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
closed
|
Pkcs12Info.Decode bytesConsumed could use better test coverage.
|
area-System.Security test enhancement
|
It looks, from code inspection, that the bytesConsumed value is always set to encodedBytes.Length, even if there were extraneous bytes at the end of the value (the correct answer appears to be maybeCopy.Length).
A test should be added which copies (e.g.) EmptyPfx into an oversized byte array, and the emitted length should only be EmptyPfx.Length, not bigger.Length.
(Found by accident during code inspection during docs writing)
|
1.0
|
Pkcs12Info.Decode bytesConsumed could use better test coverage. - It looks, from code inspection, that the bytesConsumed value is always set to encodedBytes.Length, even if there were extraneous bytes at the end of the value (the correct answer appears to be maybeCopy.Length).
A test should be added which copies (e.g.) EmptyPfx into an oversized byte array, and the emitted length should only be EmptyPfx.Length, not bigger.Length.
(Found by accident during code inspection during docs writing)
|
test
|
decode bytesconsumed could use better test coverage it looks from code inspection that the bytesconsumed value is always set to encodedbytes length even if there were extraneous bytes at the end of the value the correct answer appears to be maybecopy length a test should be added which copies e g emptypfx into an oversized byte array and the emitted length should only be emptypfx length not bigger length found by accident during code inspection during docs writing
| 1
|
271,406
| 29,490,988,146
|
IssuesEvent
|
2023-06-02 13:27:12
|
shortlink-org/shortlink
|
https://api.github.com/repos/shortlink-org/shortlink
|
closed
|
landing-0.1.1.tgz: 1 vulnerabilities (highest severity is: 9.8)
|
Mend: dependency security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>landing-0.1.1.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /ui/nx-monorepo/package.json</p>
<p>Path to vulnerable library: /ui/nx-monorepo/node_modules/ejs/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/shortlink-org/shortlink/commit/3fc31ec07a90bdd345e4efe27fd8249068eeef68">3fc31ec07a90bdd345e4efe27fd8249068eeef68</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (landing version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2023-29827](https://www.mend.io/vulnerability-database/CVE-2023-29827) | <img src='https://whitesource-resources.whitesourcesoftware.com/critical_vul.png?' width=19 height=20> Critical | 9.8 | ejs-3.1.9.tgz | Transitive | N/A* | ❌ |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the "Details" section below to see if there is a version of transitive dependency where vulnerability is fixed.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/critical_vul.png?' width=19 height=20> CVE-2023-29827</summary>
### Vulnerable Library - <b>ejs-3.1.9.tgz</b></p>
<p></p>
<p>Library home page: <a href="https://registry.npmjs.org/ejs/-/ejs-3.1.9.tgz">https://registry.npmjs.org/ejs/-/ejs-3.1.9.tgz</a></p>
<p>Path to dependency file: /ui/nx-monorepo/package.json</p>
<p>Path to vulnerable library: /ui/nx-monorepo/node_modules/ejs/package.json</p>
<p>
Dependency Hierarchy:
- landing-0.1.1.tgz (Root Library)
- next-pwa-5.6.0.tgz
- workbox-webpack-plugin-6.6.0.tgz
- workbox-build-6.6.0.tgz
- rollup-plugin-off-main-thread-2.2.3.tgz
- :x: **ejs-3.1.9.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/shortlink-org/shortlink/commit/3fc31ec07a90bdd345e4efe27fd8249068eeef68">3fc31ec07a90bdd345e4efe27fd8249068eeef68</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
** DISPUTED ** ejs v3.1.9 is vulnerable to server-side template injection. If the ejs file is controllable, template injection can be implemented through the configuration settings of the closeDelimiter parameter. NOTE: this is disputed by the vendor because the render function is not intended to be used with untrusted input.
<p>Publish Date: 2023-05-04
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-29827>CVE-2023-29827</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
|
True
|
landing-0.1.1.tgz: 1 vulnerabilities (highest severity is: 9.8) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>landing-0.1.1.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /ui/nx-monorepo/package.json</p>
<p>Path to vulnerable library: /ui/nx-monorepo/node_modules/ejs/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/shortlink-org/shortlink/commit/3fc31ec07a90bdd345e4efe27fd8249068eeef68">3fc31ec07a90bdd345e4efe27fd8249068eeef68</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (landing version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2023-29827](https://www.mend.io/vulnerability-database/CVE-2023-29827) | <img src='https://whitesource-resources.whitesourcesoftware.com/critical_vul.png?' width=19 height=20> Critical | 9.8 | ejs-3.1.9.tgz | Transitive | N/A* | ❌ |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the "Details" section below to see if there is a version of transitive dependency where vulnerability is fixed.</p>
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/critical_vul.png?' width=19 height=20> CVE-2023-29827</summary>
### Vulnerable Library - <b>ejs-3.1.9.tgz</b></p>
<p></p>
<p>Library home page: <a href="https://registry.npmjs.org/ejs/-/ejs-3.1.9.tgz">https://registry.npmjs.org/ejs/-/ejs-3.1.9.tgz</a></p>
<p>Path to dependency file: /ui/nx-monorepo/package.json</p>
<p>Path to vulnerable library: /ui/nx-monorepo/node_modules/ejs/package.json</p>
<p>
Dependency Hierarchy:
- landing-0.1.1.tgz (Root Library)
- next-pwa-5.6.0.tgz
- workbox-webpack-plugin-6.6.0.tgz
- workbox-build-6.6.0.tgz
- rollup-plugin-off-main-thread-2.2.3.tgz
- :x: **ejs-3.1.9.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/shortlink-org/shortlink/commit/3fc31ec07a90bdd345e4efe27fd8249068eeef68">3fc31ec07a90bdd345e4efe27fd8249068eeef68</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
** DISPUTED ** ejs v3.1.9 is vulnerable to server-side template injection. If the ejs file is controllable, template injection can be implemented through the configuration settings of the closeDelimiter parameter. NOTE: this is disputed by the vendor because the render function is not intended to be used with untrusted input.
<p>Publish Date: 2023-05-04
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-29827>CVE-2023-29827</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
|
non_test
|
landing tgz vulnerabilities highest severity is vulnerable library landing tgz path to dependency file ui nx monorepo package json path to vulnerable library ui nx monorepo node modules ejs package json found in head commit a href vulnerabilities cve severity cvss dependency type fixed in landing version remediation available critical ejs tgz transitive n a for some transitive vulnerabilities there is no version of direct dependency with a fix check the details section below to see if there is a version of transitive dependency where vulnerability is fixed details cve vulnerable library ejs tgz library home page a href path to dependency file ui nx monorepo package json path to vulnerable library ui nx monorepo node modules ejs package json dependency hierarchy landing tgz root library next pwa tgz workbox webpack plugin tgz workbox build tgz rollup plugin off main thread tgz x ejs tgz vulnerable library found in head commit a href found in base branch main vulnerability details disputed ejs is vulnerable to server side template injection if the ejs file is controllable template injection can be implemented through the configuration settings of the closedelimiter parameter note this is disputed by the vendor because the render function is not intended to be used with untrusted input publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href step up your open source security game with mend
| 0
|
76,168
| 7,520,412,779
|
IssuesEvent
|
2018-04-12 14:25:25
|
ARUP-CAS/arup-da-amcr
|
https://api.github.com/repos/ARUP-CAS/arup-da-amcr
|
closed
|
Přidat odkaz na zaslání zapomenutého hesla (1 hodina)
|
enhancement ready for test
|
Do přihlašovacího menu přidat tlačítko s odkazem na adresu: http://api.archeologickamapa.cz/password_request/0/
|
1.0
|
Přidat odkaz na zaslání zapomenutého hesla (1 hodina) - Do přihlašovacího menu přidat tlačítko s odkazem na adresu: http://api.archeologickamapa.cz/password_request/0/
|
test
|
přidat odkaz na zaslání zapomenutého hesla hodina do přihlašovacího menu přidat tlačítko s odkazem na adresu
| 1
|
24,404
| 4,080,654,155
|
IssuesEvent
|
2016-05-31 04:11:42
|
karudedios/todo-angular-js
|
https://api.github.com/repos/karudedios/todo-angular-js
|
opened
|
TDD Todo Services
|
backend software testing
|
- As a developer I'd like to have Todo services and interactors to guide me in the model design labor.
**AC**
- TDD Services for
- Creation of Todos
- Updating of Todos
- Listing of Todos
- Fetching single Todo
|
1.0
|
TDD Todo Services - - As a developer I'd like to have Todo services and interactors to guide me in the model design labor.
**AC**
- TDD Services for
- Creation of Todos
- Updating of Todos
- Listing of Todos
- Fetching single Todo
|
test
|
tdd todo services as a developer i d like to have todo services and interactors to guide me in the model design labor ac tdd services for creation of todos updating of todos listing of todos fetching single todo
| 1
|
75,793
| 7,488,543,466
|
IssuesEvent
|
2018-04-06 02:02:29
|
unixpunk/PlutoWeb
|
https://api.github.com/repos/unixpunk/PlutoWeb
|
closed
|
Update settings.sh to only restart app when necessary
|
Needs testing enhancement
|
Right now any change to any setting causes settings.sh to restart the running app even when its an unrelated setting.
|
1.0
|
Update settings.sh to only restart app when necessary - Right now any change to any setting causes settings.sh to restart the running app even when its an unrelated setting.
|
test
|
update settings sh to only restart app when necessary right now any change to any setting causes settings sh to restart the running app even when its an unrelated setting
| 1
|
324,157
| 9,884,935,166
|
IssuesEvent
|
2019-06-25 00:10:30
|
kubernetes/sig-release
|
https://api.github.com/repos/kubernetes/sig-release
|
closed
|
Update RT docs/handbooks to remove being an org member as a req for RT shadows
|
help wanted kind/cleanup priority/critical-urgent
|
[Feedback from Twitter](https://twitter.com/SethMcCombs/status/1139279455216525312?s=20) (@sethmccombs) that some handbooks list being an org member as a requirement to be a shadow for the Release Team.
While that's definitely a *_goal_* for shadows (and a requirement for Leads), we should edit the documentation to make sure to not alienate any potential contributors.
/milestone v1.15
/priority critical-urgent
/kind cleanup
/help
@kubernetes/release-team
|
1.0
|
Update RT docs/handbooks to remove being an org member as a req for RT shadows - [Feedback from Twitter](https://twitter.com/SethMcCombs/status/1139279455216525312?s=20) (@sethmccombs) that some handbooks list being an org member as a requirement to be a shadow for the Release Team.
While that's definitely a *_goal_* for shadows (and a requirement for Leads), we should edit the documentation to make sure to not alienate any potential contributors.
/milestone v1.15
/priority critical-urgent
/kind cleanup
/help
@kubernetes/release-team
|
non_test
|
update rt docs handbooks to remove being an org member as a req for rt shadows sethmccombs that some handbooks list being an org member as a requirement to be a shadow for the release team while that s definitely a goal for shadows and a requirement for leads we should edit the documentation to make sure to not alienate any potential contributors milestone priority critical urgent kind cleanup help kubernetes release team
| 0
|
210,364
| 16,097,783,214
|
IssuesEvent
|
2021-04-27 04:15:42
|
rancher/rancher
|
https://api.github.com/repos/rancher/rancher
|
opened
|
RKE2 Provisioning: vSphere (v2 Cloud Credential and Node Pool)
|
[zube]: To Test area/rke2
|
vSphere as a v2 Cloud Credential
```
apiVersion: v1
kind: Secret
metadata:
name: #string
annotations:
provisioning.cattle.io/driver: vsphere
# key: string
labels:
{}
# key: string
namespace: fleet-default
type: provisioning.cattle.io/cloud-credential
#data:
# key: string
#immutable: boolean
```
vSphere as a Node Driver
```
nodePools:
# - cloudCredentialSecretName: string
# controlPlaneRole: boolean
# displayName: string
# etcdRole: boolean
# hostnamePrefix: string
# labels:
# key: string
# name: string
# nodeConfig:
# apiVersion: string
# fieldPath: string
# kind: string
# name: string
# namespace: string
# resourceVersion: string
# uid: string
# paused: boolean
# quantity: int
# rollingUpdate:
# maxSurge: string
# maxUnavailable: string
# taints:
# - effect: string
# key: string
# timeAdded: string
# value: string
# workerRole: boolean
```
|
1.0
|
RKE2 Provisioning: vSphere (v2 Cloud Credential and Node Pool) - vSphere as a v2 Cloud Credential
```
apiVersion: v1
kind: Secret
metadata:
name: #string
annotations:
provisioning.cattle.io/driver: vsphere
# key: string
labels:
{}
# key: string
namespace: fleet-default
type: provisioning.cattle.io/cloud-credential
#data:
# key: string
#immutable: boolean
```
vSphere as a Node Driver
```
nodePools:
# - cloudCredentialSecretName: string
# controlPlaneRole: boolean
# displayName: string
# etcdRole: boolean
# hostnamePrefix: string
# labels:
# key: string
# name: string
# nodeConfig:
# apiVersion: string
# fieldPath: string
# kind: string
# name: string
# namespace: string
# resourceVersion: string
# uid: string
# paused: boolean
# quantity: int
# rollingUpdate:
# maxSurge: string
# maxUnavailable: string
# taints:
# - effect: string
# key: string
# timeAdded: string
# value: string
# workerRole: boolean
```
|
test
|
provisioning vsphere cloud credential and node pool vsphere as a cloud credential apiversion kind secret metadata name string annotations provisioning cattle io driver vsphere key string labels key string namespace fleet default type provisioning cattle io cloud credential data key string immutable boolean vsphere as a node driver nodepools cloudcredentialsecretname string controlplanerole boolean displayname string etcdrole boolean hostnameprefix string labels key string name string nodeconfig apiversion string fieldpath string kind string name string namespace string resourceversion string uid string paused boolean quantity int rollingupdate maxsurge string maxunavailable string taints effect string key string timeadded string value string workerrole boolean
| 1
|
11,676
| 5,072,494,201
|
IssuesEvent
|
2016-12-27 00:01:26
|
opencv/opencv
|
https://api.github.com/repos/opencv/opencv
|
closed
|
Opencv 3.2: compile error on Raspian, PI3
|
category: build/install invalid
|
I tried these build options :
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D INSTALL_PYTHON_EXAMPLES=OFF \
-D OPENCV_EXTRA_MODULES_PATH=/home/pi/opencv_contrib/modules \
-D BUILD_EXAMPLES=ON ..
and I got this compile error on Raspian, PI3:
[ 86%] Building CXX object samples/cpp/CMakeFiles/tutorial_mat_the_basic_image_container.dir/tutorial_code/core/mat_the_basic_image_container/mat_the_basic_image_container.cpp.o
c++: internal compiler error: Getötet (program cc1plus)
Please submit a full bug report,
with preprocessed source if appropriate.
See <file:///usr/share/doc/gcc-4.9/README.Bugs> for instructions.
modules/python2/CMakeFiles/opencv_python2.dir/build.make:322: recipe for target 'modules/python2/CMakeFiles/opencv_python2.dir/__/src2/cv2.cpp.o' failed
make[2]: *** [modules/python2/CMakeFiles/opencv_python2.dir/__/src2/cv2.cpp.o] Error 4
CMakeFiles/Makefile2:20773: recipe for target 'modules/python2/CMakeFiles/opencv_python2.dir/all' failed
make[1]: *** [modules/python2/CMakeFiles/opencv_python2.dir/all] Error 2
make[1]: *** Warte auf noch nicht beendete Prozesse...
[ 86%] Linking CXX executable ../../bin/cpp-tutorial-AKAZE_match
[ 86%] Built target tutorial_AKAZE_match
[ 86%] Linking CXX executable ../../bin/cpp-tutorial-mat_the_basic_image_container
[ 86%] Built target tutorial_mat_the_basic_image_container
|
1.0
|
Opencv 3.2: compile error on Raspian, PI3 - I tried these build options :
cmake -D CMAKE_BUILD_TYPE=RELEASE \
-D CMAKE_INSTALL_PREFIX=/usr/local \
-D INSTALL_PYTHON_EXAMPLES=OFF \
-D OPENCV_EXTRA_MODULES_PATH=/home/pi/opencv_contrib/modules \
-D BUILD_EXAMPLES=ON ..
and I got this compile error on Raspian, PI3:
[ 86%] Building CXX object samples/cpp/CMakeFiles/tutorial_mat_the_basic_image_container.dir/tutorial_code/core/mat_the_basic_image_container/mat_the_basic_image_container.cpp.o
c++: internal compiler error: Getötet (program cc1plus)
Please submit a full bug report,
with preprocessed source if appropriate.
See <file:///usr/share/doc/gcc-4.9/README.Bugs> for instructions.
modules/python2/CMakeFiles/opencv_python2.dir/build.make:322: recipe for target 'modules/python2/CMakeFiles/opencv_python2.dir/__/src2/cv2.cpp.o' failed
make[2]: *** [modules/python2/CMakeFiles/opencv_python2.dir/__/src2/cv2.cpp.o] Error 4
CMakeFiles/Makefile2:20773: recipe for target 'modules/python2/CMakeFiles/opencv_python2.dir/all' failed
make[1]: *** [modules/python2/CMakeFiles/opencv_python2.dir/all] Error 2
make[1]: *** Warte auf noch nicht beendete Prozesse...
[ 86%] Linking CXX executable ../../bin/cpp-tutorial-AKAZE_match
[ 86%] Built target tutorial_AKAZE_match
[ 86%] Linking CXX executable ../../bin/cpp-tutorial-mat_the_basic_image_container
[ 86%] Built target tutorial_mat_the_basic_image_container
|
non_test
|
opencv compile error on raspian i tried these build options cmake d cmake build type release d cmake install prefix usr local d install python examples off d opencv extra modules path home pi opencv contrib modules d build examples on and i got this compile error on raspian building cxx object samples cpp cmakefiles tutorial mat the basic image container dir tutorial code core mat the basic image container mat the basic image container cpp o c internal compiler error getötet program please submit a full bug report with preprocessed source if appropriate see for instructions modules cmakefiles opencv dir build make recipe for target modules cmakefiles opencv dir cpp o failed make error cmakefiles recipe for target modules cmakefiles opencv dir all failed make error make warte auf noch nicht beendete prozesse linking cxx executable bin cpp tutorial akaze match built target tutorial akaze match linking cxx executable bin cpp tutorial mat the basic image container built target tutorial mat the basic image container
| 0
|
115,154
| 11,866,531,082
|
IssuesEvent
|
2020-03-26 04:01:44
|
WilliamChan4/SENG2021_TheFastGroup
|
https://api.github.com/repos/WilliamChan4/SENG2021_TheFastGroup
|
opened
|
Rename pipelines Staging and Production and Add Badges to GitHub
|
documentation enhancement
|
The pipelines should be renamed to staging and production and the badges for both the pipelines and deployments put on the README.md page.
|
1.0
|
Rename pipelines Staging and Production and Add Badges to GitHub - The pipelines should be renamed to staging and production and the badges for both the pipelines and deployments put on the README.md page.
|
non_test
|
rename pipelines staging and production and add badges to github the pipelines should be renamed to staging and production and the badges for both the pipelines and deployments put on the readme md page
| 0
|
147,768
| 11,807,322,215
|
IssuesEvent
|
2020-03-19 11:12:28
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
roachtest: sqlalchemy failed
|
C-test-failure O-roachtest O-robot branch-release-19.1 release-blocker
|
[(roachtest).sqlalchemy failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=1816983&tab=buildLog) on [release-19.1@9d9b9a1a3417ae3460b8bdc7c77bf8c9df54d899](https://github.com/cockroachdb/cockroach/commits/9d9b9a1a3417ae3460b8bdc7c77bf8c9df54d899):
```
The test failed on branch=release-19.1, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/20200319-1816983/sqlalchemy/run_1
orm_helpers.go:214,orm_helpers.go:144,sqlalchemy.go:220,sqlalchemy.go:231,test_runner.go:753:
Tests run on Cockroach v19.1.8-20-g9d9b9a1
Tests run against sqlalchemy rel_1_3_15
0 Total Tests Run
0 tests passed
0 tests failed
0 tests skipped
0 tests ignored
0 tests passed unexpectedly
0 tests failed unexpectedly
0 tests expected failed but skipped
67 tests expected failed but not run
---
For a full summary look at the sqlalchemy artifacts
An updated blacklist (sqlAlchemyBlacklist) is available in the artifacts' sqlalchemy log
```
<details><summary>More</summary><p>
Artifacts: [/sqlalchemy](https://teamcity.cockroachdb.com/viewLog.html?buildId=1816983&tab=artifacts#/sqlalchemy)
Related:
- #46301 roachtest: sqlalchemy failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202003181957_v20.1.0-beta.3](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202003181957_v20.1.0-beta.3) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #46176 roachtest: sqlalchemy failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202003161814_v19.2.5](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202003161814_v19.2.5) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #46025 roachtest: sqlalchemy failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-19.2](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-19.2) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #45989 roachtest: sqlalchemy failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-master](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-master) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Asqlalchemy.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
|
2.0
|
roachtest: sqlalchemy failed - [(roachtest).sqlalchemy failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=1816983&tab=buildLog) on [release-19.1@9d9b9a1a3417ae3460b8bdc7c77bf8c9df54d899](https://github.com/cockroachdb/cockroach/commits/9d9b9a1a3417ae3460b8bdc7c77bf8c9df54d899):
```
The test failed on branch=release-19.1, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/20200319-1816983/sqlalchemy/run_1
orm_helpers.go:214,orm_helpers.go:144,sqlalchemy.go:220,sqlalchemy.go:231,test_runner.go:753:
Tests run on Cockroach v19.1.8-20-g9d9b9a1
Tests run against sqlalchemy rel_1_3_15
0 Total Tests Run
0 tests passed
0 tests failed
0 tests skipped
0 tests ignored
0 tests passed unexpectedly
0 tests failed unexpectedly
0 tests expected failed but skipped
67 tests expected failed but not run
---
For a full summary look at the sqlalchemy artifacts
An updated blacklist (sqlAlchemyBlacklist) is available in the artifacts' sqlalchemy log
```
<details><summary>More</summary><p>
Artifacts: [/sqlalchemy](https://teamcity.cockroachdb.com/viewLog.html?buildId=1816983&tab=artifacts#/sqlalchemy)
Related:
- #46301 roachtest: sqlalchemy failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202003181957_v20.1.0-beta.3](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202003181957_v20.1.0-beta.3) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #46176 roachtest: sqlalchemy failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202003161814_v19.2.5](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202003161814_v19.2.5) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #46025 roachtest: sqlalchemy failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-19.2](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-19.2) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #45989 roachtest: sqlalchemy failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-master](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-master) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Asqlalchemy.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
|
test
|
roachtest sqlalchemy failed on the test failed on branch release cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts sqlalchemy run orm helpers go orm helpers go sqlalchemy go sqlalchemy go test runner go tests run on cockroach tests run against sqlalchemy rel total tests run tests passed tests failed tests skipped tests ignored tests passed unexpectedly tests failed unexpectedly tests expected failed but skipped tests expected failed but not run for a full summary look at the sqlalchemy artifacts an updated blacklist sqlalchemyblacklist is available in the artifacts sqlalchemy log more artifacts related roachtest sqlalchemy failed roachtest sqlalchemy failed roachtest sqlalchemy failed roachtest sqlalchemy failed powered by
| 1
|
397,516
| 27,168,265,951
|
IssuesEvent
|
2023-02-17 17:01:48
|
xgeekshq/split
|
https://api.github.com/repos/xgeekshq/split
|
closed
|
[FEATURE]: dialog primitive story
|
documentation frontend
|
<!--
Please fill out each section below, otherwise your issue will be closed.
Before opening a new issue, please search existing issues: https://github.com/xgeekshq/divide-and-conquer/issues
## A note on adding features to this repo
Every feature needs to strike a balance - complex features are less likely to be worked on.
This means that not every feature request will be added, but hearing about what you want is important. Don't be afraid to add a feature request!
-->
## Summary
- Add Dialog primitive to Storybook
|
1.0
|
[FEATURE]: dialog primitive story - <!--
Please fill out each section below, otherwise your issue will be closed.
Before opening a new issue, please search existing issues: https://github.com/xgeekshq/divide-and-conquer/issues
## A note on adding features to this repo
Every feature needs to strike a balance - complex features are less likely to be worked on.
This means that not every feature request will be added, but hearing about what you want is important. Don't be afraid to add a feature request!
-->
## Summary
- Add Dialog primitive to Storybook
|
non_test
|
dialog primitive story please fill out each section below otherwise your issue will be closed before opening a new issue please search existing issues a note on adding features to this repo every feature needs to strike a balance complex features are less likely to be worked on this means that not every feature request will be added but hearing about what you want is important don t be afraid to add a feature request summary add dialog primitive to storybook
| 0
|
57,425
| 7,056,555,289
|
IssuesEvent
|
2018-01-04 13:16:31
|
status-im/status-react
|
https://api.github.com/repos/status-im/status-react
|
opened
|
Implement PN infromative message for Android users (Onboarding)
|
bug design proposal
|
### Description
*Type*: Feature
*Summary*: this is a feature request. I want Android users to be informed about Push Notifications enabled feature. Just from user side - i even dont know about PNs while on Android and iOS informs me about it right after i start the app.
#### Expected behavior
We have this security alert on iOS saying about PNs. Its just a system notification but we can try to implement something in terms of onboarding for Android platform

#### Actual behavior
Nothing informs about PN feature while being on Android platform
|
1.0
|
Implement PN infromative message for Android users (Onboarding) - ### Description
*Type*: Feature
*Summary*: this is a feature request. I want Android users to be informed about Push Notifications enabled feature. Just from user side - i even dont know about PNs while on Android and iOS informs me about it right after i start the app.
#### Expected behavior
We have this security alert on iOS saying about PNs. Its just a system notification but we can try to implement something in terms of onboarding for Android platform

#### Actual behavior
Nothing informs about PN feature while being on Android platform
|
non_test
|
implement pn infromative message for android users onboarding description type feature summary this is a feature request i want android users to be informed about push notifications enabled feature just from user side i even dont know about pns while on android and ios informs me about it right after i start the app expected behavior we have this security alert on ios saying about pns its just a system notification but we can try to implement something in terms of onboarding for android platform actual behavior nothing informs about pn feature while being on android platform
| 0
|
96,605
| 12,144,906,570
|
IssuesEvent
|
2020-04-24 08:24:34
|
se701-group6/A2
|
https://api.github.com/repos/se701-group6/A2
|
closed
|
Webapp can be used without logging in.
|
approved bug design front-end
|
**Describe the bug**
I can access any page without having to login first.
**To Reproduce**
Steps to reproduce the behavior:
1. Start the webapp
2. Go to login page and login
3. Copy the URL
4. Go to a different browser or private browsing (of the same browser)
5. Paste the URL
6. Go to the URL
OR
1. Start the webapp
2. Go to localhost:3000/#/home/split
3. Create a bill
4. Press "Split Bill"
**Severity**
Mark with an x.
[x]: Minor effect eg. graphical
[]: Functional error eg. App does not function correctly
[]: Severe eg. Crashing
**Reproducibility**
Mark with an x.
[x]: Consistent
[]: Occasional
[]: Cannot reproduce at the moment eg. unsure
**Expected behavior**
I should be redirected to the login screen if I haven't already logged in on the browser.
**Screenshots, Images and Traces**


**Conditions**
I used Ubuntu OS with FireFox, FireFox Private Browsing and Chromium. I have not tested this on Windows OS.
**Additional context**
* The backend will return a message saying "User not logged in", meaning it is responding correctly to not logging in.
* Using the second set of instructions the new bill isn't saved due to the above stated error message.
|
1.0
|
Webapp can be used without logging in. - **Describe the bug**
I can access any page without having to login first.
**To Reproduce**
Steps to reproduce the behavior:
1. Start the webapp
2. Go to login page and login
3. Copy the URL
4. Go to a different browser or private browsing (of the same browser)
5. Paste the URL
6. Go to the URL
OR
1. Start the webapp
2. Go to localhost:3000/#/home/split
3. Create a bill
4. Press "Split Bill"
**Severity**
Mark with an x.
[x]: Minor effect eg. graphical
[]: Functional error eg. App does not function correctly
[]: Severe eg. Crashing
**Reproducibility**
Mark with an x.
[x]: Consistent
[]: Occasional
[]: Cannot reproduce at the moment eg. unsure
**Expected behavior**
I should be redirected to the login screen if I haven't already logged in on the browser.
**Screenshots, Images and Traces**


**Conditions**
I used Ubuntu OS with FireFox, FireFox Private Browsing and Chromium. I have not tested this on Windows OS.
**Additional context**
* The backend will return a message saying "User not logged in", meaning it is responding correctly to not logging in.
* Using the second set of instructions the new bill isn't saved due to the above stated error message.
|
non_test
|
webapp can be used without logging in describe the bug i can access any page without having to login first to reproduce steps to reproduce the behavior start the webapp go to login page and login copy the url go to a different browser or private browsing of the same browser paste the url go to the url or start the webapp go to localhost home split create a bill press split bill severity mark with an x minor effect eg graphical functional error eg app does not function correctly severe eg crashing reproducibility mark with an x consistent occasional cannot reproduce at the moment eg unsure expected behavior i should be redirected to the login screen if i haven t already logged in on the browser screenshots images and traces conditions i used ubuntu os with firefox firefox private browsing and chromium i have not tested this on windows os additional context the backend will return a message saying user not logged in meaning it is responding correctly to not logging in using the second set of instructions the new bill isn t saved due to the above stated error message
| 0
|
544,060
| 15,889,190,964
|
IssuesEvent
|
2021-04-10 10:27:24
|
AY2021S2-CS2103T-W10-2/tp
|
https://api.github.com/repos/AY2021S2-CS2103T-W10-2/tp
|
closed
|
[PE-D] Sort command fails after adding (valid) large quantity and wipes all user data
|
priority.High severity.High type.Bug
|
After adding an item with a valid but large quantity (i.e. 999999999999999999) the sort command fails.
Furthermore, all items currently in the list seem to get wiped from it.
I've marked this as a high severity bug because the app wipes all user info after this one bad command, and without any backup that information becomes irretrievable.
The following are steps from default storemando (i.e. just downloaded and opened, with default data present)
1. `sort quantity asc`
2. `add n/Chicken t/test l/Kitchen q/999999999999999999`
3. `sort quantity asc` -- CHICKEN NOT DISPLAYED --
4. `list` -- CHICKEN NOT DISPLAYED
5. `sort quantity asc` -- ALL ITEMS NOT DISPLAYED -- **It seems that at this point all items are deleted**
6. Manually close StoreMando (click X to close app)
7. Open Storemando -- ALL ITEMS NOT DISPLAYED --
8. `list` -- ALL ITEMS NOT DISPLAYED
9. `delete 1` -- Cannot delete because index invalid
10. Check data/storemando.json

<!--session: 1617429866672-356a0beb-701c-474c-b2f0-7f2e6d0f4961-->
-------------
Labels: `severity.High` `type.FunctionalityBug`
original: douglaswja/ped#2
|
1.0
|
[PE-D] Sort command fails after adding (valid) large quantity and wipes all user data - After adding an item with a valid but large quantity (i.e. 999999999999999999) the sort command fails.
Furthermore, all items currently in the list seem to get wiped from it.
I've marked this as a high severity bug because the app wipes all user info after this one bad command, and without any backup that information becomes irretrievable.
The following are steps from default storemando (i.e. just downloaded and opened, with default data present)
1. `sort quantity asc`
2. `add n/Chicken t/test l/Kitchen q/999999999999999999`
3. `sort quantity asc` -- CHICKEN NOT DISPLAYED --
4. `list` -- CHICKEN NOT DISPLAYED
5. `sort quantity asc` -- ALL ITEMS NOT DISPLAYED -- **It seems that at this point all items are deleted**
6. Manually close StoreMando (click X to close app)
7. Open Storemando -- ALL ITEMS NOT DISPLAYED --
8. `list` -- ALL ITEMS NOT DISPLAYED
9. `delete 1` -- Cannot delete because index invalid
10. Check data/storemando.json

<!--session: 1617429866672-356a0beb-701c-474c-b2f0-7f2e6d0f4961-->
-------------
Labels: `severity.High` `type.FunctionalityBug`
original: douglaswja/ped#2
|
non_test
|
sort command fails after adding valid large quantity and wipes all user data after adding an item with a valid but large quantity i e the sort command fails furthermore all items currently in the list seem to get wiped from it i ve marked this as a high severity bug because the app wipes all user info after this one bad command and without any backup that information becomes irretrievable the following are steps from default storemando i e just downloaded and opened with default data present sort quantity asc add n chicken t test l kitchen q sort quantity asc chicken not displayed list chicken not displayed sort quantity asc all items not displayed it seems that at this point all items are deleted manually close storemando click x to close app open storemando all items not displayed list all items not displayed delete cannot delete because index invalid check data storemando json labels severity high type functionalitybug original douglaswja ped
| 0
|
194,511
| 14,680,147,808
|
IssuesEvent
|
2020-12-31 09:11:54
|
microsoft/vscode
|
https://api.github.com/repos/microsoft/vscode
|
opened
|
Flaky test: EditorsObserver - copy group
|
unit-test-failure
|
https://dev.azure.com/monacotools/Monaco/_build/results?buildId=98794&view=logs&j=672276a2-8d3a-5fab-615d-090c51352f92&t=6ad30dce-518e-54c1-94da-00d19cce9354&l=7719
Failed on Windows:
```
6405 passing (3m)
42 pending
1 failing
1) EditorsObserver
copy group:
AssertionError [ERR_ASSERTION]: 136 == 137
+ expected - actual
-136
+137
at Context.<anonymous> (file:///D:/a/1/s/out-build/vs/workbench/services/editor/test/browser/editorsObserver.test.js:177:20)
```
|
1.0
|
Flaky test: EditorsObserver - copy group - https://dev.azure.com/monacotools/Monaco/_build/results?buildId=98794&view=logs&j=672276a2-8d3a-5fab-615d-090c51352f92&t=6ad30dce-518e-54c1-94da-00d19cce9354&l=7719
Failed on Windows:
```
6405 passing (3m)
42 pending
1 failing
1) EditorsObserver
copy group:
AssertionError [ERR_ASSERTION]: 136 == 137
+ expected - actual
-136
+137
at Context.<anonymous> (file:///D:/a/1/s/out-build/vs/workbench/services/editor/test/browser/editorsObserver.test.js:177:20)
```
|
test
|
flaky test editorsobserver copy group failed on windows passing pending failing editorsobserver copy group assertionerror expected actual at context file d a s out build vs workbench services editor test browser editorsobserver test js
| 1
|
233,131
| 18,950,316,613
|
IssuesEvent
|
2021-11-18 14:35:19
|
go-gitea/gitea
|
https://api.github.com/repos/go-gitea/gitea
|
opened
|
CI fails with unknown permission problems
|
kind/testing
|
These errors occurs many times:
```
+ make backend
mkdir: cannot create directory '.make_evidence': Permission denied
make: *** [Makefile:251: .make_evidence/tags] Error 1
```
```
+ make unit-test-coverage test-check
Running unit-test-coverage -race -tags 'sqlite sqlite_unlock_notify'...
open /drone/src/coverage.out: permission denied
make: *** [Makefile:395: unit-test-coverage] Error 1
```
|
1.0
|
CI fails with unknown permission problems - These errors occurs many times:
```
+ make backend
mkdir: cannot create directory '.make_evidence': Permission denied
make: *** [Makefile:251: .make_evidence/tags] Error 1
```
```
+ make unit-test-coverage test-check
Running unit-test-coverage -race -tags 'sqlite sqlite_unlock_notify'...
open /drone/src/coverage.out: permission denied
make: *** [Makefile:395: unit-test-coverage] Error 1
```
|
test
|
ci fails with unknown permission problems these errors occurs many times make backend mkdir cannot create directory make evidence permission denied make error make unit test coverage test check running unit test coverage race tags sqlite sqlite unlock notify open drone src coverage out permission denied make error
| 1
|
100,821
| 8,755,887,825
|
IssuesEvent
|
2018-12-14 16:04:50
|
LiskHQ/lisk
|
https://api.github.com/repos/LiskHQ/lisk
|
closed
|
lisk-core-network tests create log files under wrong path
|
*easy test
|
### Expected behavior
Log file is written to a predefined location so that is can be archived (and shown in the Jenkins UI for troubleshooting)
### Actual behavior
Log file is created under `$PWD/$PWD/test/network/utils/networkTestsLogger.log` e.g.:
`/home/lisk/workspace/lisk-core-network_PR-2597/home/lisk/workspace/lisk-core-network_PR-2597/test/network/utils/networkTestsLogger.logs` (notice the duplicate `home/lisk/workspace/lisk-core-network_PR-2597`)
### Steps to reproduce
Run `npm test -- mocha:extensive` like `Jenkinsfile.network` does.
### Which version(s) does this affect? (Environment, OS, etc...)
`lisk-core-networks` test in Jenkins
|
1.0
|
lisk-core-network tests create log files under wrong path - ### Expected behavior
Log file is written to a predefined location so that is can be archived (and shown in the Jenkins UI for troubleshooting)
### Actual behavior
Log file is created under `$PWD/$PWD/test/network/utils/networkTestsLogger.log` e.g.:
`/home/lisk/workspace/lisk-core-network_PR-2597/home/lisk/workspace/lisk-core-network_PR-2597/test/network/utils/networkTestsLogger.logs` (notice the duplicate `home/lisk/workspace/lisk-core-network_PR-2597`)
### Steps to reproduce
Run `npm test -- mocha:extensive` like `Jenkinsfile.network` does.
### Which version(s) does this affect? (Environment, OS, etc...)
`lisk-core-networks` test in Jenkins
|
test
|
lisk core network tests create log files under wrong path expected behavior log file is written to a predefined location so that is can be archived and shown in the jenkins ui for troubleshooting actual behavior log file is created under pwd pwd test network utils networktestslogger log e g home lisk workspace lisk core network pr home lisk workspace lisk core network pr test network utils networktestslogger logs notice the duplicate home lisk workspace lisk core network pr steps to reproduce run npm test mocha extensive like jenkinsfile network does which version s does this affect environment os etc lisk core networks test in jenkins
| 1
|
153,919
| 12,168,551,414
|
IssuesEvent
|
2020-04-27 12:52:17
|
ICIJ/datashare
|
https://api.github.com/repos/ICIJ/datashare
|
closed
|
See how Regex (doesn't) work and why
|
front need testing
|
We noticed some problems with spaces in Luxembourg IBAN in Lux Leaks. And also some issues with @.
|
1.0
|
See how Regex (doesn't) work and why - We noticed some problems with spaces in Luxembourg IBAN in Lux Leaks. And also some issues with @.
|
test
|
see how regex doesn t work and why we noticed some problems with spaces in luxembourg iban in lux leaks and also some issues with
| 1
|
131,339
| 18,273,706,145
|
IssuesEvent
|
2021-10-04 16:15:50
|
bcgov/entity
|
https://api.github.com/repos/bcgov/entity
|
closed
|
UI Design - Verification Statement PDF for my base registration
|
UX Design Assets
|
- [ ] Design for the 'Verification Statement' PDF that results from completion of a registration statement
- [ ] Include all types
- [ ] For registrations that don't have General Collateral (ie you can't even add GC), we should not show the General collateral component at all in the outputs (from 8199)
- [ ] For registrations that can have GC, but do not, continue to show the GC collateral with "None" (no period) (from 8199)
Based on the comparison results in the same Epic, update the Search PDF output
- **Not Required** Include Control Number field (maybe in the footer)
- [x] Just for the SG registration verification, use "Secured Party (Buyer)" for Secured Party and "Debtor (Seller)" Debtor.
Examples of Registration Verification Statements from current PPR application:
https://docs.google.com/spreadsheets/d/1C3KYYmxj-22759YVu4o4DXNTsNYNapW1vvFgYZkzwjY/edit?usp=sharing
### Invision Link
https://invis.io/8Y11RKAPCKAE
*Just for Sale of Goods (SG) reg type, we say **Secured Party (Buyer) Information** and **Debtor (Seller) Information**.
|
1.0
|
UI Design - Verification Statement PDF for my base registration - - [ ] Design for the 'Verification Statement' PDF that results from completion of a registration statement
- [ ] Include all types
- [ ] For registrations that don't have General Collateral (ie you can't even add GC), we should not show the General collateral component at all in the outputs (from 8199)
- [ ] For registrations that can have GC, but do not, continue to show the GC collateral with "None" (no period) (from 8199)
Based on the comparison results in the same Epic, update the Search PDF output
- **Not Required** Include Control Number field (maybe in the footer)
- [x] Just for the SG registration verification, use "Secured Party (Buyer)" for Secured Party and "Debtor (Seller)" Debtor.
Examples of Registration Verification Statements from current PPR application:
https://docs.google.com/spreadsheets/d/1C3KYYmxj-22759YVu4o4DXNTsNYNapW1vvFgYZkzwjY/edit?usp=sharing
### Invision Link
https://invis.io/8Y11RKAPCKAE
*Just for Sale of Goods (SG) reg type, we say **Secured Party (Buyer) Information** and **Debtor (Seller) Information**.
|
non_test
|
ui design verification statement pdf for my base registration design for the verification statement pdf that results from completion of a registration statement include all types for registrations that don t have general collateral ie you can t even add gc we should not show the general collateral component at all in the outputs from for registrations that can have gc but do not continue to show the gc collateral with none no period from based on the comparison results in the same epic update the search pdf output not required include control number field maybe in the footer just for the sg registration verification use secured party buyer for secured party and debtor seller debtor examples of registration verification statements from current ppr application invision link just for sale of goods sg reg type we say secured party buyer information and debtor seller information
| 0
|
24,336
| 3,966,557,310
|
IssuesEvent
|
2016-05-03 13:27:46
|
buildo/nemobot
|
https://api.github.com/repos/buildo/nemobot
|
closed
|
[reminder] Only show repo-specific topic labels in reminder
|
defect in review
|
## description
When nemobot comments about a missing topic label, it should only propose the repo-specific topic labels and not all the possible ones cross-repo.
|
1.0
|
[reminder] Only show repo-specific topic labels in reminder - ## description
When nemobot comments about a missing topic label, it should only propose the repo-specific topic labels and not all the possible ones cross-repo.
|
non_test
|
only show repo specific topic labels in reminder description when nemobot comments about a missing topic label it should only propose the repo specific topic labels and not all the possible ones cross repo
| 0
|
55,241
| 13,550,067,618
|
IssuesEvent
|
2020-09-17 09:05:31
|
googleapis/nodejs-gce-images
|
https://api.github.com/repos/googleapis/nodejs-gce-images
|
opened
|
system tests: "before all" hook in "system tests" failed
|
buildcop: issue priority: p1 type: bug
|
This test failed!
To configure my behavior, see [the Build Cop Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/master/packages/buildcop).
If I'm commenting on this issue too often, add the `buildcop: quiet` label and
I will stop commenting.
---
commit: 78438cebbeb48e443f2f31accf3f6e3174fa1fd5
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/b83be76e-56c7-4f14-88ae-845752589e80), [Sponge](http://sponge2/b83be76e-56c7-4f14-88ae-845752589e80)
status: failed
<details><summary>Test output</summary><br><pre>Could not find a suitable image.
Error: Could not find a suitable image.
at GCEImages._getAllByOS (build/src/index.js:127:19)
-> /tmpfs/src/github/nodejs-gce-images/src/index.ts:263:13
at processTicksAndRejections (internal/process/task_queues.js:97:5)
at async /tmpfs/src/github/nodejs-gce-images/build/src/index.js:76:40
-> /tmpfs/src/github/nodejs-gce-images/src/index.ts:184:12
at async Promise.all (index 2)
at async GCEImages.getAllAsync (build/src/index.js:74:9)
-> /tmpfs/src/github/nodejs-gce-images/src/index.ts:179:5
at async Promise.all (index 1)
at async Context.<anonymous> (build/system-test/test.js:29:13)
-> /tmpfs/src/github/nodejs-gce-images/system-test/test.ts:35:9</pre></details>
|
1.0
|
system tests: "before all" hook in "system tests" failed - This test failed!
To configure my behavior, see [the Build Cop Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/master/packages/buildcop).
If I'm commenting on this issue too often, add the `buildcop: quiet` label and
I will stop commenting.
---
commit: 78438cebbeb48e443f2f31accf3f6e3174fa1fd5
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/b83be76e-56c7-4f14-88ae-845752589e80), [Sponge](http://sponge2/b83be76e-56c7-4f14-88ae-845752589e80)
status: failed
<details><summary>Test output</summary><br><pre>Could not find a suitable image.
Error: Could not find a suitable image.
at GCEImages._getAllByOS (build/src/index.js:127:19)
-> /tmpfs/src/github/nodejs-gce-images/src/index.ts:263:13
at processTicksAndRejections (internal/process/task_queues.js:97:5)
at async /tmpfs/src/github/nodejs-gce-images/build/src/index.js:76:40
-> /tmpfs/src/github/nodejs-gce-images/src/index.ts:184:12
at async Promise.all (index 2)
at async GCEImages.getAllAsync (build/src/index.js:74:9)
-> /tmpfs/src/github/nodejs-gce-images/src/index.ts:179:5
at async Promise.all (index 1)
at async Context.<anonymous> (build/system-test/test.js:29:13)
-> /tmpfs/src/github/nodejs-gce-images/system-test/test.ts:35:9</pre></details>
|
non_test
|
system tests before all hook in system tests failed this test failed to configure my behavior see if i m commenting on this issue too often add the buildcop quiet label and i will stop commenting commit buildurl status failed test output could not find a suitable image error could not find a suitable image at gceimages getallbyos build src index js tmpfs src github nodejs gce images src index ts at processticksandrejections internal process task queues js at async tmpfs src github nodejs gce images build src index js tmpfs src github nodejs gce images src index ts at async promise all index at async gceimages getallasync build src index js tmpfs src github nodejs gce images src index ts at async promise all index at async context build system test test js tmpfs src github nodejs gce images system test test ts
| 0
|
250,296
| 27,066,437,686
|
IssuesEvent
|
2023-02-14 01:03:37
|
DevOps-PM-PGDip-2022-2023/easybuggy4django.old
|
https://api.github.com/repos/DevOps-PM-PGDip-2022-2023/easybuggy4django.old
|
opened
|
CVE-2018-14042 (Medium) detected in bootstrap-3.3.7.min.js
|
security vulnerability
|
## CVE-2018-14042 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.7.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js</a></p>
<p>Path to dependency file: /templates/base.html</p>
<p>Path to vulnerable library: /templates/base.html</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.7.min.js** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 4.1.2, XSS is possible in the data-container property of tooltip.
<p>Publish Date: 2018-07-13
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-14042>CVE-2018-14042</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2018-07-13</p>
<p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2.org.webjars:bootstrap:3.4.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2018-14042 (Medium) detected in bootstrap-3.3.7.min.js - ## CVE-2018-14042 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.7.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js</a></p>
<p>Path to dependency file: /templates/base.html</p>
<p>Path to vulnerable library: /templates/base.html</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.7.min.js** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 4.1.2, XSS is possible in the data-container property of tooltip.
<p>Publish Date: 2018-07-13
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-14042>CVE-2018-14042</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2018-07-13</p>
<p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2.org.webjars:bootstrap:3.4.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve medium detected in bootstrap min js cve medium severity vulnerability vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to dependency file templates base html path to vulnerable library templates base html dependency hierarchy x bootstrap min js vulnerable library found in base branch master vulnerability details in bootstrap before xss is possible in the data container property of tooltip publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version release date fix resolution org webjars npm bootstrap org webjars bootstrap step up your open source security game with mend
| 0
|
49,896
| 13,187,286,722
|
IssuesEvent
|
2020-08-13 02:56:05
|
icecube-trac/tix3
|
https://api.github.com/repos/icecube-trac/tix3
|
opened
|
Subtracting bias in photospline/I3SplineTable even if derivatives are available (Trac #2236)
|
Incomplete Migration Migrated from Trac combo reconstruction defect
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/2236">https://code.icecube.wisc.edu/ticket/2236</a>, reported by mhieronymus and owned by mhieronymus</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-04T17:52:51",
"description": "After evaluating photosplines a bias must only be subtracted if no derivatives are available but it is always done.\nI am going to commit a fix for that soon.",
"reporter": "mhieronymus",
"cc": "",
"resolution": "fixed",
"_ts": "1549302771596052",
"component": "combo reconstruction",
"summary": "Subtracting bias in photospline/I3SplineTable even if derivatives are available",
"priority": "normal",
"keywords": "Bug I3SplineTable",
"time": "2019-01-31T08:17:04",
"milestone": "",
"owner": "mhieronymus",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
Subtracting bias in photospline/I3SplineTable even if derivatives are available (Trac #2236) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/2236">https://code.icecube.wisc.edu/ticket/2236</a>, reported by mhieronymus and owned by mhieronymus</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-04T17:52:51",
"description": "After evaluating photosplines a bias must only be subtracted if no derivatives are available but it is always done.\nI am going to commit a fix for that soon.",
"reporter": "mhieronymus",
"cc": "",
"resolution": "fixed",
"_ts": "1549302771596052",
"component": "combo reconstruction",
"summary": "Subtracting bias in photospline/I3SplineTable even if derivatives are available",
"priority": "normal",
"keywords": "Bug I3SplineTable",
"time": "2019-01-31T08:17:04",
"milestone": "",
"owner": "mhieronymus",
"type": "defect"
}
```
</p>
</details>
|
non_test
|
subtracting bias in photospline even if derivatives are available trac migrated from json status closed changetime description after evaluating photosplines a bias must only be subtracted if no derivatives are available but it is always done ni am going to commit a fix for that soon reporter mhieronymus cc resolution fixed ts component combo reconstruction summary subtracting bias in photospline even if derivatives are available priority normal keywords bug time milestone owner mhieronymus type defect
| 0
|
126,747
| 10,434,406,460
|
IssuesEvent
|
2019-09-17 15:10:48
|
openshift/odo
|
https://api.github.com/repos/openshift/odo
|
closed
|
Desired json output reported invalid format on win10 platform
|
area/testing kind/bug
|
[kind/bug]
<!--
Welcome! - We kindly ask you to:
1. Fill out the issue template below
2. Use the Google group if you have a question rather than a bug or feature request.
The group is at: https://groups.google.com/forum/#!forum/odo-users
Thanks for understanding, and for contributing to the project!
-->
## What versions of software are you using?
- Operating System: Win10
- Output of `odo version`: master
## How did you run odo exactly?
```
odo component command tests Creating component
create component twice fails from same directory
C:/Users/User/go/src/github.com/openshift/odo/tests/integration/component.go:94
Created dir: C:\Users\User\AppData\Local\Temp\283954347
Creating a new project: wqftposkme
Running odo.exe with args [odo project create wqftposkme -w -v4]
[odo] - Waiting for project to come up ...
V Waiting for project to come up [645ms]
[odo] V Project 'wqftposkme' is ready for use
[odo] V New project created and now using project : wqftposkme
[odo] I0915 23:28:41.623211 2896 odo.go:70] Could not get the latest release information in time. Never mind, exiting gracefully :slightly_smiling_face:
Current working dir: C:\Users\User\go\src\github.com\openshift\odo\tests\integration
Setting current dir to: C:\Users\User\AppData\Local\Temp\283954347
Running odo.exe with args [odo create nodejs nodejs --project wqftposkme]
V Validating component [671ms]
[odo] Please use odo push command to create the component with source deployed
Running odo.exe with args [odo list -o json --path C:\Users\User\AppData\Local\Temp\283954347]
[odo] {"kind":"List","apiVersion":"odo.openshift.io/v1alpha1","metadata":{},"items":
[{"kind":"Component","apiVersion":"odo.openshift.io/v1alpha1","metadata":
{"name":"nodejs","namespace":"wqftposkme","creationTimestamp":null},"spec":
{"app":"app","type":"nodejs","source":"./","ports":["8080/TCP"]},"status":{"context":"C:\\Users\\User
\\AppData\\Local\\Temp\\283954347","state":"Not Pushed"}}]}
Deleting project: wqftposkme
Running odo.exe with args [odo project delete wqftposkme -f]
V Deleting project wqftposkme [6s]
[odo] V Deleted project : wqftposkme
Setting current dir to: C:\Users\User\go\src\github.com\openshift\odo\tests\integration
Deleting dir: C:\Users\User\AppData\Local\Temp\283954347+ Failure [14.737 seconds]
odo component command tests
C:/Users/User/go/src/github.com/openshift/odo/tests/integration/cmd_cmp_test.go:13
Creating component
C:/Users/User/go/src/github.com/openshift/odo/tests/integration/component.go:40
create component twice fails from same directory [It]
C:/Users/User/go/src/github.com/openshift/odo/tests/integration/component.go:94
Actual '{"kind":"List","apiVersion":"odo.openshift.io/v1alpha1","metadata":{},"items":
[{"kind":"Component","apiVersion":"odo.openshift.io/v1alpha1","metadata":
{"name":"nodejs","namespace":"wqftposkme","creationTimestamp":null},"spec":
{"app":"app","type":"nodejs","source":"./","ports":["8080/TCP"]},"status":{"context":"C:\Users
\User\AppData\Local\Temp\283954347","state":"Not Pushed"}}]}' should be valid JSON, but it is not.
Underlying error:invalid character 'U' in string escape code C:/Users/User/go/src/github.com/openshift/odo/tests/integration/component.go:98
```
## Actual behavior
Invalid json format
## Expected behavior
Should be a valid json
## Any logs, error output, etc?
|
1.0
|
Desired json output reported invalid format on win10 platform - [kind/bug]
<!--
Welcome! - We kindly ask you to:
1. Fill out the issue template below
2. Use the Google group if you have a question rather than a bug or feature request.
The group is at: https://groups.google.com/forum/#!forum/odo-users
Thanks for understanding, and for contributing to the project!
-->
## What versions of software are you using?
- Operating System: Win10
- Output of `odo version`: master
## How did you run odo exactly?
```
odo component command tests Creating component
create component twice fails from same directory
C:/Users/User/go/src/github.com/openshift/odo/tests/integration/component.go:94
Created dir: C:\Users\User\AppData\Local\Temp\283954347
Creating a new project: wqftposkme
Running odo.exe with args [odo project create wqftposkme -w -v4]
[odo] - Waiting for project to come up ...
V Waiting for project to come up [645ms]
[odo] V Project 'wqftposkme' is ready for use
[odo] V New project created and now using project : wqftposkme
[odo] I0915 23:28:41.623211 2896 odo.go:70] Could not get the latest release information in time. Never mind, exiting gracefully :slightly_smiling_face:
Current working dir: C:\Users\User\go\src\github.com\openshift\odo\tests\integration
Setting current dir to: C:\Users\User\AppData\Local\Temp\283954347
Running odo.exe with args [odo create nodejs nodejs --project wqftposkme]
V Validating component [671ms]
[odo] Please use odo push command to create the component with source deployed
Running odo.exe with args [odo list -o json --path C:\Users\User\AppData\Local\Temp\283954347]
[odo] {"kind":"List","apiVersion":"odo.openshift.io/v1alpha1","metadata":{},"items":
[{"kind":"Component","apiVersion":"odo.openshift.io/v1alpha1","metadata":
{"name":"nodejs","namespace":"wqftposkme","creationTimestamp":null},"spec":
{"app":"app","type":"nodejs","source":"./","ports":["8080/TCP"]},"status":{"context":"C:\\Users\\User
\\AppData\\Local\\Temp\\283954347","state":"Not Pushed"}}]}
Deleting project: wqftposkme
Running odo.exe with args [odo project delete wqftposkme -f]
V Deleting project wqftposkme [6s]
[odo] V Deleted project : wqftposkme
Setting current dir to: C:\Users\User\go\src\github.com\openshift\odo\tests\integration
Deleting dir: C:\Users\User\AppData\Local\Temp\283954347+ Failure [14.737 seconds]
odo component command tests
C:/Users/User/go/src/github.com/openshift/odo/tests/integration/cmd_cmp_test.go:13
Creating component
C:/Users/User/go/src/github.com/openshift/odo/tests/integration/component.go:40
create component twice fails from same directory [It]
C:/Users/User/go/src/github.com/openshift/odo/tests/integration/component.go:94
Actual '{"kind":"List","apiVersion":"odo.openshift.io/v1alpha1","metadata":{},"items":
[{"kind":"Component","apiVersion":"odo.openshift.io/v1alpha1","metadata":
{"name":"nodejs","namespace":"wqftposkme","creationTimestamp":null},"spec":
{"app":"app","type":"nodejs","source":"./","ports":["8080/TCP"]},"status":{"context":"C:\Users
\User\AppData\Local\Temp\283954347","state":"Not Pushed"}}]}' should be valid JSON, but it is not.
Underlying error:invalid character 'U' in string escape code C:/Users/User/go/src/github.com/openshift/odo/tests/integration/component.go:98
```
## Actual behavior
Invalid json format
## Expected behavior
Should be a valid json
## Any logs, error output, etc?
|
test
|
desired json output reported invalid format on platform welcome we kindly ask you to fill out the issue template below use the google group if you have a question rather than a bug or feature request the group is at thanks for understanding and for contributing to the project what versions of software are you using operating system output of odo version master how did you run odo exactly odo component command tests creating component create component twice fails from same directory c users user go src github com openshift odo tests integration component go created dir c users user appdata local temp creating a new project wqftposkme running odo exe with args waiting for project to come up v waiting for project to come up v project wqftposkme is ready for use v new project created and now using project wqftposkme odo go could not get the latest release information in time never mind exiting gracefully slightly smiling face current working dir c users user go src github com openshift odo tests integration setting current dir to c users user appdata local temp running odo exe with args v validating component please use odo push command to create the component with source deployed running odo exe with args kind list apiversion odo openshift io metadata items kind component apiversion odo openshift io metadata name nodejs namespace wqftposkme creationtimestamp null spec app app type nodejs source ports status context c users user appdata local temp state not pushed deleting project wqftposkme running odo exe with args v deleting project wqftposkme v deleted project wqftposkme setting current dir to c users user go src github com openshift odo tests integration deleting dir c users user appdata local temp failure odo component command tests c users user go src github com openshift odo tests integration cmd cmp test go creating component c users user go src github com openshift odo tests integration component go create component twice fails from same directory c users 
user go src github com openshift odo tests integration component go actual kind list apiversion odo openshift io metadata items kind component apiversion odo openshift io metadata name nodejs namespace wqftposkme creationtimestamp null spec app app type nodejs source ports status context c users user appdata local temp state not pushed should be valid json but it is not underlying error invalid character u in string escape code c users user go src github com openshift odo tests integration component go actual behavior invalid json format expected behavior should be a valid json any logs error output etc
| 1
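The odo failure in the row above is a JSON escaping problem: a raw backslash in a Windows path starts an escape sequence inside a JSON string, and `\U` is not a valid JSON escape. A minimal Python sketch of the same failure and its fix (the path literal is illustrative, not taken from the test run):

```python
import json

# A string containing a raw Windows path is not valid JSON: the "\U" in
# "C:\Users" is an invalid escape sequence -- the same class of error Go's
# decoder reports as "invalid character 'U' in string escape code".
raw = r'{"context": "C:\Users\User\AppData\Local\Temp"}'
try:
    json.loads(raw)
except json.JSONDecodeError as exc:
    print("rejected:", exc)

# The fix is to let the JSON encoder escape the backslashes itself,
# producing "C:\\Users\\..." on the wire, which round-trips cleanly.
path = r"C:\Users\User\AppData\Local\Temp"
encoded = json.dumps({"context": path})
assert json.loads(encoded)["context"] == path
```

This suggests the output being validated was built with incomplete escaping rather than by a standard JSON encoder, which would have doubled the backslashes automatically.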
|
20,057
| 27,985,854,053
|
IssuesEvent
|
2023-03-26 17:33:27
|
UnblockNeteaseMusic/server
|
https://api.github.com/repos/UnblockNeteaseMusic/server
|
closed
|
hook error: ERR_OSSL_EVP_WRONG_FINAL_BLOCK_LENGTH
|
bug help wanted compatibility
|
### Bug description
hook error has reappeared; the cause is not yet clear, possibly an encoding issue as before.
### Expected behavior
_No response_
### Actual behavior
_No response_
### Steps to reproduce
_No response_
### Log output
```
ERROR: (hook) A error occurred in hook.request.after when hooking https://music.163.com/weapi/v3/song/detail?csrf_token=.
Error: error:0606506D:digital envelope routines:EVP_DecryptFinal_ex:wrong final block length
at Decipheriv.final (node:internal/crypto/cipher:193:29)
at decrypt (/usr/share/unblockneteasemusic/core/app.js:51:56)
at Object.decrypt (/usr/share/unblockneteasemusic/core/app.js:53:90)
at /usr/share/unblockneteasemusic/core/app.js:100:164
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
library: digital envelope routines
function: EVP_DecryptFinal_ex
reason: wrong final block length
code: ERR_OSSL_EVP_WRONG_FINAL_BLOCK_LENGTH
```
### NetEase Cloud Music version
2.10.5+
### Operating system
Windows / iOS / macOS
### Other information
_No response_
### Terms
- [X] I confirm that the core I am using is the official UnblockNeteaseMusic release, not any other fork.
- [X] I confirm that I have upgraded to the latest core version (the latest build is recommended over a release).
- [X] I confirm that I have enabled the HTTPS port.
- [X] I confirm that I have configured the EndPoint correctly (macOS or iOS only).
- [X] I confirm that I have correctly installed the CA certificate on the corresponding client.
- [X] I agree that the developers are under no obligation to answer my question, especially one lacking a proper description and logs.
|
True
|
hook error: ERR_OSSL_EVP_WRONG_FINAL_BLOCK_LENGTH - ### Bug description
hook error has reappeared; the cause is not yet clear, possibly an encoding issue as before.
### Expected behavior
_No response_
### Actual behavior
_No response_
### Steps to reproduce
_No response_
### Log output
```
ERROR: (hook) A error occurred in hook.request.after when hooking https://music.163.com/weapi/v3/song/detail?csrf_token=.
Error: error:0606506D:digital envelope routines:EVP_DecryptFinal_ex:wrong final block length
at Decipheriv.final (node:internal/crypto/cipher:193:29)
at decrypt (/usr/share/unblockneteasemusic/core/app.js:51:56)
at Object.decrypt (/usr/share/unblockneteasemusic/core/app.js:53:90)
at /usr/share/unblockneteasemusic/core/app.js:100:164
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
library: digital envelope routines
function: EVP_DecryptFinal_ex
reason: wrong final block length
code: ERR_OSSL_EVP_WRONG_FINAL_BLOCK_LENGTH
```
### NetEase Cloud Music version
2.10.5+
### Operating system
Windows / iOS / macOS
### Other information
_No response_
### Terms
- [X] I confirm that the core I am using is the official UnblockNeteaseMusic release, not any other fork.
- [X] I confirm that I have upgraded to the latest core version (the latest build is recommended over a release).
- [X] I confirm that I have enabled the HTTPS port.
- [X] I confirm that I have configured the EndPoint correctly (macOS or iOS only).
- [X] I confirm that I have correctly installed the CA certificate on the corresponding client.
- [X] I agree that the developers are under no obligation to answer my question, especially one lacking a proper description and logs.
|
non_test
|
hook error err ossl evp wrong final block length bug description hook error has reappeared the cause is not yet clear possibly an encoding issue as before expected behavior no response actual behavior no response steps to reproduce no response log output error hook a error occurred in hook request after when hooking error error digital envelope routines evp decryptfinal ex wrong final block length at decipheriv final node internal crypto cipher at decrypt usr share unblockneteasemusic core app js at object decrypt usr share unblockneteasemusic core app js at usr share unblockneteasemusic core app js at process processticksandrejections node internal process task queues library digital envelope routines function evp decryptfinal ex reason wrong final block length code err ossl evp wrong final block length netease cloud music version operating system windows ios macos other information no response terms i confirm that the core i am using is the official unblockneteasemusic release not any other fork i confirm that i have upgraded to the latest core version the latest build is recommended over a release i confirm that i have enabled the https port i confirm that i have configured the endpoint correctly macos or ios only i confirm that i have correctly installed the ca certificate on the corresponding client i agree that the developers are under no obligation to answer my question especially one lacking a proper description and logs
| 0
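The `wrong final block length` error in the row above is raised when a block-cipher decryptor is handed input that is not a whole number of blocks, typically because the body being decrypted was never cipher output in the first place (an error page, or a differently encoded response). A hedged Python sketch of a pre-check; the helper name is made up here, and 16 bytes is AES's block size:

```python
# EVP_DecryptFinal_ex fails with "wrong final block length" when its input
# is not a multiple of the cipher's block size. Checking alignment first
# separates "not ciphertext at all" from genuine decryption failures.
BLOCK_SIZE = 16  # AES block size in bytes

def looks_like_block_ciphertext(data: bytes) -> bool:
    """Return True only if data could plausibly be block-cipher output."""
    return len(data) > 0 and len(data) % BLOCK_SIZE == 0

assert looks_like_block_ciphertext(b"\x00" * 32)               # aligned
assert not looks_like_block_ciphertext(b"<html>error</html>")  # 18 bytes
```

Such a check cannot prove the payload is valid ciphertext, only rule out inputs that are guaranteed to fail, which is often enough to turn a cryptic decrypt error into a useful log line.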
|
225,895
| 17,929,085,051
|
IssuesEvent
|
2021-09-10 06:36:43
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
roachtest: scbench/randomload/nodes=3/ops=2000/conc=1 failed
|
C-test-failure O-robot O-roachtest branch-release-21.2
|
roachtest.scbench/randomload/nodes=3/ops=2000/conc=1 [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3421823&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3421823&tab=artifacts#/scbench/randomload/nodes=3/ops=2000/conc=1) on release-21.2 @ [99a4816fc272228a63df20dae3cc41d235e705f3](https://github.com/cockroachdb/cockroach/commits/99a4816fc272228a63df20dae3cc41d235e705f3):
```
| "workerId": 0,
| "clientTimestamp": "06:36:15.414061",
| "ops": [
| "BEGIN",
| "CREATE TABLE IF NOT EXISTS public.table21 (col21_22 \"char\" NOT NULL, col21_23 OID NOT NULL, col21_24 JSONB NULL, col21_25 STRING[] NOT NULL, col21_26 NAME NOT NULL, col21_27 INT4 NULL, col21_28 TIMESTAMPTZ NOT NULL, col21_29 BIT(13) NOT NULL, col21_30 INT2[], INVERTED INDEX ((CASE WHEN col21_30 IS NULL THEN e'85\\bp\\x19r\\x10\\x14':::STRING ELSE e'*\\x0e)``\\x13\\x1d':::STRING END) ASC, col21_29 ASC, col21_24), UNIQUE (col21_22), FAMILY (col21_22, col21_28, col21_26, col21_30, col21_25, col21_24), FAMILY (col21_23, col21_27), FAMILY (col21_29))",
| "COMMIT"
| ],
| "expectedExecErrors": "",
| "expectedCommitErrors": "",
| "message": ""
| }
| {
| "workerId": 0,
| "clientTimestamp": "06:36:15.440435",
| "ops": [
| "BEGIN",
| "DROP SCHEMA \"public\" CASCADE"
| ],
| "expectedExecErrors": "3F000",
| "expectedCommitErrors": "",
| "message": "ROLLBACK; Successfully got expected execution error: ERROR: cannot drop schema \"public\" (SQLSTATE 3F000)"
| }
| {
| "workerId": 0,
| "clientTimestamp": "06:36:15.450918",
| "ops": [
| "BEGIN",
| "ALTER VIEW public.view31 RENAME TO public.view32"
| ],
| "expectedExecErrors": "42P01",
| "expectedCommitErrors": "",
| "message": "ROLLBACK; Successfully got expected execution error: ERROR: relation \"public.view31\" does not exist (SQLSTATE 42P01)"
| }
| {
| "workerId": 0,
| "clientTimestamp": "06:36:15.460751",
| "ops": [
| "BEGIN",
| "ALTER TABLE public.table21 ALTER COLUMN crdb_internal_idx_expr SET DEFAULT '6K':::STRING:::NAME",
| "ALTER TABLE public.table21 RENAME COLUMN \"col21_30\" TO \"col21_33\"",
| "ALTER DATABASE schemachange SURVIVE ZONE FAILURE",
| "CREATE SCHEMA IF NOT EXISTS schema34 AUTHORIZATION root",
| "CREATE UNIQUE INDEX index21_35 ON public.table21 (col21_33 DESC, col21_27) STORING (col21_22, crdb_internal_idx_expr, col21_24, col21_26, col21_25)"
| ],
| "expectedExecErrors": "0A000,XXUUU",
| "expectedCommitErrors": "",
| "message": "ROLLBACK; Successfully got expected execution error: ERROR: unimplemented: column col21_33 is of type int2[] and thus is not indexable (SQLSTATE 0A000)"
| }
Wraps: (4) exit status 20
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *cluster.WithCommandDetails (4) *exec.ExitError
```
<details><summary>Reproduce</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #63514 roachtest: scbench/randomload/nodes=3/ops=2000/conc=1 failed [C-test-failure O-roachtest O-robot T-sql-schema branch-63484]
- #61976 roachtest: scbench/randomload/nodes=3/ops=2000/conc=1 failed [C-test-failure O-roachtest O-robot T-sql-schema branch-release-20.2]
- #61685 roachtest: scbench/randomload/nodes=3/ops=2000/conc=1 failed [C-test-failure O-roachtest O-robot T-sql-schema branch-release-21.1]
- #56081 roachtest: scbench/randomload/nodes=3/ops=2000/conc=1 failed [C-test-failure O-roachtest O-robot T-sql-schema branch-master]
</p>
</details>
/cc @cockroachdb/sql-schema
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*scbench/randomload/nodes=3/ops=2000/conc=1.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
2.0
|
roachtest: scbench/randomload/nodes=3/ops=2000/conc=1 failed - roachtest.scbench/randomload/nodes=3/ops=2000/conc=1 [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3421823&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3421823&tab=artifacts#/scbench/randomload/nodes=3/ops=2000/conc=1) on release-21.2 @ [99a4816fc272228a63df20dae3cc41d235e705f3](https://github.com/cockroachdb/cockroach/commits/99a4816fc272228a63df20dae3cc41d235e705f3):
```
| "workerId": 0,
| "clientTimestamp": "06:36:15.414061",
| "ops": [
| "BEGIN",
| "CREATE TABLE IF NOT EXISTS public.table21 (col21_22 \"char\" NOT NULL, col21_23 OID NOT NULL, col21_24 JSONB NULL, col21_25 STRING[] NOT NULL, col21_26 NAME NOT NULL, col21_27 INT4 NULL, col21_28 TIMESTAMPTZ NOT NULL, col21_29 BIT(13) NOT NULL, col21_30 INT2[], INVERTED INDEX ((CASE WHEN col21_30 IS NULL THEN e'85\\bp\\x19r\\x10\\x14':::STRING ELSE e'*\\x0e)``\\x13\\x1d':::STRING END) ASC, col21_29 ASC, col21_24), UNIQUE (col21_22), FAMILY (col21_22, col21_28, col21_26, col21_30, col21_25, col21_24), FAMILY (col21_23, col21_27), FAMILY (col21_29))",
| "COMMIT"
| ],
| "expectedExecErrors": "",
| "expectedCommitErrors": "",
| "message": ""
| }
| {
| "workerId": 0,
| "clientTimestamp": "06:36:15.440435",
| "ops": [
| "BEGIN",
| "DROP SCHEMA \"public\" CASCADE"
| ],
| "expectedExecErrors": "3F000",
| "expectedCommitErrors": "",
| "message": "ROLLBACK; Successfully got expected execution error: ERROR: cannot drop schema \"public\" (SQLSTATE 3F000)"
| }
| {
| "workerId": 0,
| "clientTimestamp": "06:36:15.450918",
| "ops": [
| "BEGIN",
| "ALTER VIEW public.view31 RENAME TO public.view32"
| ],
| "expectedExecErrors": "42P01",
| "expectedCommitErrors": "",
| "message": "ROLLBACK; Successfully got expected execution error: ERROR: relation \"public.view31\" does not exist (SQLSTATE 42P01)"
| }
| {
| "workerId": 0,
| "clientTimestamp": "06:36:15.460751",
| "ops": [
| "BEGIN",
| "ALTER TABLE public.table21 ALTER COLUMN crdb_internal_idx_expr SET DEFAULT '6K':::STRING:::NAME",
| "ALTER TABLE public.table21 RENAME COLUMN \"col21_30\" TO \"col21_33\"",
| "ALTER DATABASE schemachange SURVIVE ZONE FAILURE",
| "CREATE SCHEMA IF NOT EXISTS schema34 AUTHORIZATION root",
| "CREATE UNIQUE INDEX index21_35 ON public.table21 (col21_33 DESC, col21_27) STORING (col21_22, crdb_internal_idx_expr, col21_24, col21_26, col21_25)"
| ],
| "expectedExecErrors": "0A000,XXUUU",
| "expectedCommitErrors": "",
| "message": "ROLLBACK; Successfully got expected execution error: ERROR: unimplemented: column col21_33 is of type int2[] and thus is not indexable (SQLSTATE 0A000)"
| }
Wraps: (4) exit status 20
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *cluster.WithCommandDetails (4) *exec.ExitError
```
<details><summary>Reproduce</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #63514 roachtest: scbench/randomload/nodes=3/ops=2000/conc=1 failed [C-test-failure O-roachtest O-robot T-sql-schema branch-63484]
- #61976 roachtest: scbench/randomload/nodes=3/ops=2000/conc=1 failed [C-test-failure O-roachtest O-robot T-sql-schema branch-release-20.2]
- #61685 roachtest: scbench/randomload/nodes=3/ops=2000/conc=1 failed [C-test-failure O-roachtest O-robot T-sql-schema branch-release-21.1]
- #56081 roachtest: scbench/randomload/nodes=3/ops=2000/conc=1 failed [C-test-failure O-roachtest O-robot T-sql-schema branch-master]
</p>
</details>
/cc @cockroachdb/sql-schema
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*scbench/randomload/nodes=3/ops=2000/conc=1.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
test
|
roachtest scbench randomload nodes ops conc failed roachtest scbench randomload nodes ops conc with on release workerid clienttimestamp ops begin create table if not exists public char not null oid not null jsonb null string not null name not null null timestamptz not null bit not null inverted index case when is null then e bp string else e string end asc asc unique family family family commit expectedexecerrors expectedcommiterrors message workerid clienttimestamp ops begin drop schema public cascade expectedexecerrors expectedcommiterrors message rollback successfully got expected execution error error cannot drop schema public sqlstate workerid clienttimestamp ops begin alter view public rename to public expectedexecerrors expectedcommiterrors message rollback successfully got expected execution error error relation public does not exist sqlstate workerid clienttimestamp ops begin alter table public alter column crdb internal idx expr set default string name alter table public rename column to alter database schemachange survive zone failure create schema if not exists authorization root create unique index on public desc storing crdb internal idx expr expectedexecerrors xxuuu expectedcommiterrors message rollback successfully got expected execution error error unimplemented column is of type and thus is not indexable sqlstate wraps exit status error types withstack withstack errutil withprefix cluster withcommanddetails exec exiterror reproduce see same failure on other branches roachtest scbench randomload nodes ops conc failed roachtest scbench randomload nodes ops conc failed roachtest scbench randomload nodes ops conc failed roachtest scbench randomload nodes ops conc failed cc cockroachdb sql schema
| 1
|
8,431
| 7,290,423,315
|
IssuesEvent
|
2018-02-24 01:52:13
|
allianceauth/allianceauth
|
https://api.github.com/repos/allianceauth/allianceauth
|
closed
|
Excessive Privileges in sudoers file
|
Security docs install v1
|
The current installation of allianceauth asks that the following line be added to the /etc/sudoers file
allianceserver ALL=(ALL:ALL) ALL
While I understand that this is very convenient for the developers, this is an extraordinarily permissive setup for an automated service. Should anyone manage to take over any process running under this and manage privilege escalation to the allianceauth user, this gives the intruder carte blanche to run rampant through the server.
I would like to ask for a sudoers entry that follows the "Principle of least privilege". i.e. only the commands that allianceauth actually needs escalated privileges to run be listed in the sudoers file.
I believe that this will be highly beneficial to all users and groups running this project.
|
True
|
Excessive Privileges in sudoers file - The current installation of allianceauth asks that the following line be added to the /etc/sudoers file
allianceserver ALL=(ALL:ALL) ALL
While I understand that this is very convenient for the developers, this is an extraordinarily permissive setup for an automated service. Should anyone manage to take over any process running under this and manage privilege escalation to the allianceauth user, this gives the intruder carte blanche to run rampant through the server.
I would like to ask for a sudoers entry that follows the "Principle of least privilege". i.e. only the commands that allianceauth actually needs escalated privileges to run be listed in the sudoers file.
I believe that this will be highly beneficial to all users and groups running this project.
|
non_test
|
excessive privileges in sudoers file the current installation of allianceauth asks that the following line be added to the etc sudoers file allianceserver all all all all while i understand that this is very convenient for the developers this is an extraordinarily permissive setup for an automated service should anyone manage to take over any process running under this and manage privilege escalation to the allianceauth user this gives the intruder carte blanche to run rampant through the server i would like to ask for a sudoers entry that follows the principle of least privilege i e only the commands that allianceauth actually needs escalated privileges to run be listed in the sudoers file i believe that this will be highly beneficial to all users and groups running this project
| 0
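A least-privilege sudoers entry along the lines the reporter asks for might look like the following sketch; the listed commands are hypothetical stand-ins, since the exact set allianceauth needs would have to be enumerated from its service scripts:

```
# Hypothetical example only -- replace with the commands allianceauth
# actually invokes with elevated privileges.
Cmnd_Alias AUTH_CMDS = /usr/sbin/service apache2 restart, \
                       /usr/sbin/service supervisor restart
allianceserver ALL=(root) NOPASSWD: AUTH_CMDS
```

Anything not named in the `Cmnd_Alias` is then denied, so a compromised allianceserver account can at most restart the listed services rather than run arbitrary commands as root.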
|
136,917
| 18,751,508,217
|
IssuesEvent
|
2021-11-05 03:00:20
|
Dima2022/Resiliency-Studio
|
https://api.github.com/repos/Dima2022/Resiliency-Studio
|
closed
|
CVE-2020-11112 (High) detected in jackson-databind-2.8.6.jar - autoclosed
|
security vulnerability
|
## CVE-2020-11112 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.6.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: Resiliency-Studio/resiliency-studio-agent/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.6/jackson-databind-2.8.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.6/jackson-databind-2.8.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.6/jackson-databind-2.8.6.jar</p>
<p>
Dependency Hierarchy:
- sdk-java-rest-6.2.0.4-oss.jar (Root Library)
- :x: **jackson-databind-2.8.6.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Dima2022/Resiliency-Studio/commit/9809d9b7bfdc114eafb0a14d86667f3a76a014e8">9809d9b7bfdc114eafb0a14d86667f3a76a014e8</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.proxy.provider.remoting.RmiProvider (aka apache/commons-proxy).
<p>Publish Date: 2020-03-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11112>CVE-2020-11112</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11112">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11112</a></p>
<p>Release Date: 2020-03-31</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.4,2.10.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.6","packageFilePaths":["/resiliency-studio-agent/pom.xml","/resiliency-studio-security/pom.xml","/resiliency-studio-service/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"com.att.ajsc:sdk-java-rest:6.2.0.4-oss;com.fasterxml.jackson.core:jackson-databind:2.8.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.4,2.10.0"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-11112","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.proxy.provider.remoting.RmiProvider (aka apache/commons-proxy).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11112","cvss3Severity":"high","cvss3Score":"8.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-11112 (High) detected in jackson-databind-2.8.6.jar - autoclosed - ## CVE-2020-11112 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.6.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: Resiliency-Studio/resiliency-studio-agent/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.6/jackson-databind-2.8.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.6/jackson-databind-2.8.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.6/jackson-databind-2.8.6.jar</p>
<p>
Dependency Hierarchy:
- sdk-java-rest-6.2.0.4-oss.jar (Root Library)
- :x: **jackson-databind-2.8.6.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Dima2022/Resiliency-Studio/commit/9809d9b7bfdc114eafb0a14d86667f3a76a014e8">9809d9b7bfdc114eafb0a14d86667f3a76a014e8</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.proxy.provider.remoting.RmiProvider (aka apache/commons-proxy).
<p>Publish Date: 2020-03-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11112>CVE-2020-11112</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11112">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11112</a></p>
<p>Release Date: 2020-03-31</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.4,2.10.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.6","packageFilePaths":["/resiliency-studio-agent/pom.xml","/resiliency-studio-security/pom.xml","/resiliency-studio-service/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"com.att.ajsc:sdk-java-rest:6.2.0.4-oss;com.fasterxml.jackson.core:jackson-databind:2.8.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.4,2.10.0"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-11112","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.proxy.provider.remoting.RmiProvider (aka apache/commons-proxy).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11112","cvss3Severity":"high","cvss3Score":"8.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_test
|
cve high detected in jackson databind jar autoclosed cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file resiliency studio resiliency studio agent pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy sdk java rest oss jar root library x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache commons proxy provider remoting rmiprovider aka apache commons proxy publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree com att ajsc sdk java rest oss com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind basebranches vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache commons proxy provider remoting rmiprovider aka apache commons proxy vulnerabilityurl
| 0
|
56,364
| 8,068,606,752
|
IssuesEvent
|
2018-08-05 22:37:32
|
APSIMInitiative/ApsimX
|
https://api.github.com/repos/APSIMInitiative/ApsimX
|
opened
|
New references for bib file
|
documentation science
|
I'm creating this as a generic issue which may be reused whenever we need to add more references to the bib file.
|
1.0
|
New references for bib file - I'm creating this as a generic issue which may be reused whenever we need to add more references to the bib file.
|
non_test
|
new references for bib file i m creating this as a generic issue which may be reused whenever we need to add more references to the bib file
| 0
|
331,678
| 29,046,335,925
|
IssuesEvent
|
2023-05-13 16:01:26
|
envoyproxy/envoy
|
https://api.github.com/repos/envoyproxy/envoy
|
closed
|
icu build is using foreign_cc despite it being a bazel package
|
bug area/test flakes stale area/ci
|
Reading through the original PR that added this #20143 its not clear why icu was added as a foreign_cc package when it has a bazel build
Either way this dependency causes constant network transient flakes as it seems to try and run bazel as a foreign build
|
1.0
|
icu build is using foreign_cc despite it being a bazel package - Reading through the original PR that added this #20143 its not clear why icu was added as a foreign_cc package when it has a bazel build
Either way this dependency causes constant network transient flakes as it seems to try and run bazel as a foreign build
|
test
|
icu build is using foreign cc despite it being a bazel package reading through the original pr that added this its not clear why icu was added as a foreign cc package when it has a bazel build either way this dependency causes constant network transient flakes as it seems to try and run bazel as a foreign build
| 1
|
152,000
| 23,901,698,997
|
IssuesEvent
|
2022-09-08 19:25:47
|
BitcoinDesign/Guide
|
https://api.github.com/repos/BitcoinDesign/Guide
|
opened
|
New header image for First Use
|
Good first issue Design Daily spending wallet
|
We are in need of a new header image for this page: [First Use](We are in need of a new header image for this page: https://bitcoin.design/guide/daily-spending-wallet/first-use/
## What are we looking for in a header image?
The image should be relevant to the topic of the page, however, don't be afraid to get creative with it. Examples of other pages that are both creative and relevant to the topic: [Settings](https://bitcoin.design/guide/daily-spending-wallet/settings/), [Coin selection](https://bitcoin.design/guide/how-it-works/coin-selection/), [Sending bitcoin](https://bitcoin.design/guide/daily-spending-wallet/sending/).
## Why is a new image needed?
The reason is because the current image was designed when this page was part of a section that no longer exists called "onboarding". It was meant to show a structured order of steps. Now, it is part of the [daily spending wallet section](https://bitcoin.design/guide/daily-spending-wallet/). Half of the other pages in the daily spending wallet do not follow this pattern. The structured order no longer makes sense.
## Illustration guidelines
If you're a contributor looking to tackle the issue, you can [preview our illustration guidelines](https://bitcoin.design/guide/contribute/illustration-guidelines/).
## Tips
If you are interested in working on this issue, leave a comment so we can assign it to you.
Please also share your initial ideas and ideally **work-in-progress** updates _before_ suggesting a final visual. This allows for better collaboration and ensuring the visual goes well with the page content.
)
## What are we looking for in a header image?
The image should be relevant to the topic of the page, however, don't be afraid to get creative with it. Examples of other pages that are both creative and relevant to the topic: [Settings](https://bitcoin.design/guide/daily-spending-wallet/settings/), [Coin selection](https://bitcoin.design/guide/how-it-works/coin-selection/), [Sending bitcoin](https://bitcoin.design/guide/daily-spending-wallet/sending/).
## Why is a new image needed?
The reason is because the current image was designed when this page was part of a section that no longer exists called "onboarding". It was meant to show a structured order of steps. Now, it is part of the [daily spending wallet section](https://bitcoin.design/guide/daily-spending-wallet/). Half of the other pages in the daily spending wallet do not follow this pattern. The structured order no longer makes sense.
## Illustration guidelines
If you're a contributor looking to tackle the issue, you can [preview our illustration guidelines](https://bitcoin.design/guide/contribute/illustration-guidelines/).
## Tips
If you are interested in working on this issue, leave a comment so we can assign it to you.
Please also share your initial ideas and ideally **work-in-progress** updates _before_ suggesting a final visual. This allows for better collaboration and ensuring the visual goes well with the page content.
|
1.0
|
New header image for First Use - We are in need of a new header image for this page: [First Use](We are in need of a new header image for this page: https://bitcoin.design/guide/daily-spending-wallet/first-use/
## What are we looking for in a header image?
The image should be relevant to the topic of the page, however, don't be afraid to get creative with it. Examples of other pages that are both creative and relevant to the topic: [Settings](https://bitcoin.design/guide/daily-spending-wallet/settings/), [Coin selection](https://bitcoin.design/guide/how-it-works/coin-selection/), [Sending bitcoin](https://bitcoin.design/guide/daily-spending-wallet/sending/).
## Why is a new image needed?
The reason is because the current image was designed when this page was part of a section that no longer exists called "onboarding". It was meant to show a structured order of steps. Now, it is part of the [daily spending wallet section](https://bitcoin.design/guide/daily-spending-wallet/). Half of the other pages in the daily spending wallet do not follow this pattern. The structured order no longer makes sense.
## Illustration guidelines
If you're a contributor looking to tackle the issue, you can [preview our illustration guidelines](https://bitcoin.design/guide/contribute/illustration-guidelines/).
## Tips
If you are interested in working on this issue, leave a comment so we can assign it to you.
Please also share your initial ideas and ideally **work-in-progress** updates _before_ suggesting a final visual. This allows for better collaboration and ensuring the visual goes well with the page content.
)
## What are we looking for in a header image?
The image should be relevant to the topic of the page, however, don't be afraid to get creative with it. Examples of other pages that are both creative and relevant to the topic: [Settings](https://bitcoin.design/guide/daily-spending-wallet/settings/), [Coin selection](https://bitcoin.design/guide/how-it-works/coin-selection/), [Sending bitcoin](https://bitcoin.design/guide/daily-spending-wallet/sending/).
## Why is a new image needed?
The reason is because the current image was designed when this page was part of a section that no longer exists called "onboarding". It was meant to show a structured order of steps. Now, it is part of the [daily spending wallet section](https://bitcoin.design/guide/daily-spending-wallet/). Half of the other pages in the daily spending wallet do not follow this pattern. The structured order no longer makes sense.
## Illustration guidelines
If you're a contributor looking to tackle the issue, you can [preview our illustration guidelines](https://bitcoin.design/guide/contribute/illustration-guidelines/).
## Tips
If you are interested in working on this issue, leave a comment so we can assign it to you.
Please also share your initial ideas and ideally **work-in-progress** updates _before_ suggesting a final visual. This allows for better collaboration and ensuring the visual goes well with the page content.
|
non_test
|
new header image for first use we are in need of a new header image for this page we are in need of a new header image for this page what are we looking for in a header image the image should be relevant to the topic of the page however don t be afraid to get creative with it examples of other pages that are both creative and relevant to the topic why is a new image needed the reason is because the current image was designed when this page was part of a section that no longer exists called onboarding it was meant to show a structured order of steps now it is part of the half of the other pages in the daily spending wallet do not follow this pattern the structured order no longer makes sense illustration guidelines if you re a contributor looking to tackle the issue you can tips if you are interested in working on this issue leave a comment so we can assign it to you please also share your initial ideas and ideally work in progress updates before suggesting a final visual this allows for better collaboration and ensuring the visual goes well with the page content what are we looking for in a header image the image should be relevant to the topic of the page however don t be afraid to get creative with it examples of other pages that are both creative and relevant to the topic why is a new image needed the reason is because the current image was designed when this page was part of a section that no longer exists called onboarding it was meant to show a structured order of steps now it is part of the half of the other pages in the daily spending wallet do not follow this pattern the structured order no longer makes sense illustration guidelines if you re a contributor looking to tackle the issue you can tips if you are interested in working on this issue leave a comment so we can assign it to you please also share your initial ideas and ideally work in progress updates before suggesting a final visual this allows for better collaboration and ensuring the visual goes well with the page content
| 0
|
182,490
| 14,137,369,180
|
IssuesEvent
|
2020-11-10 06:31:05
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
closed
|
Failing test: Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/ml/data_frame_analytics/cloning·ts - machine learning data frame analytics jobs cloning supported by UI form outlier detection job supported by the form should have correct init form values for config step
|
:ml failed-test v7.10.0
|
A test failed on a tracked branch
```
Error: expected testSubject(mlAnalyticsCreateJobWizardIncludesSelect) to exist
at TestSubjects.existOrFail (/dev/shm/workspace/kibana/test/functional/services/common/test_subjects.ts:62:15)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+master/6447/)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/ml/data_frame_analytics/cloning·ts","test.name":"machine learning data frame analytics jobs cloning supported by UI form outlier detection job supported by the form should have correct init form values for config step","test.failCount":3}} -->
|
1.0
|
Failing test: Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/ml/data_frame_analytics/cloning·ts - machine learning data frame analytics jobs cloning supported by UI form outlier detection job supported by the form should have correct init form values for config step - A test failed on a tracked branch
```
Error: expected testSubject(mlAnalyticsCreateJobWizardIncludesSelect) to exist
at TestSubjects.existOrFail (/dev/shm/workspace/kibana/test/functional/services/common/test_subjects.ts:62:15)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+master/6447/)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/ml/data_frame_analytics/cloning·ts","test.name":"machine learning data frame analytics jobs cloning supported by UI form outlier detection job supported by the form should have correct init form values for config step","test.failCount":3}} -->
|
test
|
failing test chrome x pack ui functional tests x pack test functional apps ml data frame analytics cloning·ts machine learning data frame analytics jobs cloning supported by ui form outlier detection job supported by the form should have correct init form values for config step a test failed on a tracked branch error expected testsubject mlanalyticscreatejobwizardincludesselect to exist at testsubjects existorfail dev shm workspace kibana test functional services common test subjects ts first failure
| 1
|
9,186
| 3,026,346,590
|
IssuesEvent
|
2015-08-03 14:38:16
|
GirlsCodersWarsaw/CodeQuestProject_Basia
|
https://api.github.com/repos/GirlsCodersWarsaw/CodeQuestProject_Basia
|
closed
|
rspec tests - learning
|
learning testing
|
It's time to learn testing! If you have some good links about rspec and what you need to test, please share it with me :)
/cc @pjar @lsolniczek
|
1.0
|
rspec tests - learning - It's time to learn testing! If you have some good links about rspec and what you need to test, please share it with me :)
/cc @pjar @lsolniczek
|
test
|
rspec tests learning it s time to learn testing if you have some good links about rspec and what you need to test please share it with me cc pjar lsolniczek
| 1
|
341,580
| 30,593,753,950
|
IssuesEvent
|
2023-07-21 19:36:16
|
unifyai/ivy
|
https://api.github.com/repos/unifyai/ivy
|
reopened
|
Fix tensor.test_torch_instance_getitem
|
PyTorch Frontend Sub Task Failing Test
|
| | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5625899872"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/5599074213"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5599074213"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/5625899872"><img src=https://img.shields.io/badge/-success-success></a>
|paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5625899872"><img src=https://img.shields.io/badge/-failure-red></a>
|
1.0
|
Fix tensor.test_torch_instance_getitem - | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5625899872"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/5599074213"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5599074213"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/5625899872"><img src=https://img.shields.io/badge/-success-success></a>
|paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5625899872"><img src=https://img.shields.io/badge/-failure-red></a>
|
test
|
fix tensor test torch instance getitem tensorflow a href src jax a href src numpy a href src torch a href src paddle a href src
| 1
|
10,712
| 2,622,181,638
|
IssuesEvent
|
2015-03-04 00:19:05
|
byzhang/leveldb
|
https://api.github.com/repos/byzhang/leveldb
|
opened
|
Possible bug: fsync() required after calling rename()
|
auto-migrated Priority-Medium Type-Defect
|
```
Similar to issue 187, this bug is about what happens during a power-failure.
Also, this bug is not actually triggered atop file-systems that I usually use
(ext3/ext4), so I haven't tried reproducing it on a real system.
The bug: When LevelDB opens a database, it compacts the current log files to a
".sst" file, creates a new MANIFEST, and updates the CURRENT file. Updating the
CURRENT file is done with the usual atomic technique using rename(). It then
deletes the old log files and old MANIFEST file.
In this entire sequence of operations, LevelDB does not explicitly ensure that
the rename() (corresponding to the CURRENT file) actually happens before the
unlink()s (of the old MANIFEST and log files). If the unlink()s happen before
the rename, the database can be left corrupted (due to a missing MANIFEST) or
wrong (due to a deleted, non-compacted log). I verified the corruption and
wrong-behavior by simulating the unlink()s happening before the rename().
Saying that, considering Linux, neither ext3 nor ext4 never "re-order" calls
like unlink() or rename(). Btrfs comes close to affecting LevelDB by doing
re-ordering, but it seems to only re-order such that rename()s get pushed
before unlink()s, and so seems to do no real harm.
Nevertheless, LevelDB should not depend on the unlink()s happening after the
rename(). There are lots of operating systems and file-systems out there.
```
Original issue reported on code.google.com by `madthanu@gmail.com` on 17 Jul 2013 at 3:34
|
1.0
|
Possible bug: fsync() required after calling rename() - ```
Similar to issue 187, this bug is about what happens during a power-failure.
Also, this bug is not actually triggered atop file-systems that I usually use
(ext3/ext4), so I haven't tried reproducing it on a real system.
The bug: When LevelDB opens a database, it compacts the current log files to a
".sst" file, creates a new MANIFEST, and updates the CURRENT file. Updating the
CURRENT file is done with the usual atomic technique using rename(). It then
deletes the old log files and old MANIFEST file.
In this entire sequence of operations, LevelDB does not explicitly ensure that
the rename() (corresponding to the CURRENT file) actually happens before the
unlink()s (of the old MANIFEST and log files). If the unlink()s happen before
the rename, the database can be left corrupted (due to a missing MANIFEST) or
wrong (due to a deleted, non-compacted log). I verified the corruption and
wrong-behavior by simulating the unlink()s happening before the rename().
Saying that, considering Linux, neither ext3 nor ext4 never "re-order" calls
like unlink() or rename(). Btrfs comes close to affecting LevelDB by doing
re-ordering, but it seems to only re-order such that rename()s get pushed
before unlink()s, and so seems to do no real harm.
Nevertheless, LevelDB should not depend on the unlink()s happening after the
rename(). There are lots of operating systems and file-systems out there.
```
Original issue reported on code.google.com by `madthanu@gmail.com` on 17 Jul 2013 at 3:34
|
non_test
|
possible bug fsync required after calling rename similar to issue this bug is about what happens during a power failure also this bug is not actually triggered atop file systems that i usually use so i haven t tried reproducing it on a real system the bug when leveldb opens a database it compacts the current log files to a sst file creates a new manifest and updates the current file updating the current file is done with the usual atomic technique using rename it then deletes the old log files and old manifest file in this entire sequence of operations leveldb does not explicitly ensure that the rename corresponding to the current file actually happens before the unlink s of the old manifest and log files if the unlink s happen before the rename the database can be left corrupted due to a missing manifest or wrong due to a deleted non compacted log i verified the corruption and wrong behavior by simulating the unlink s happening before the rename saying that considering linux neither nor never re order calls like unlink or rename btrfs comes close to affecting leveldb by doing re ordering but it seems to only re order such that rename s get pushed before unlink s and so seems to do no real harm nevertheless leveldb should not depend on the unlink s happening after the rename there are lots of operating systems and file systems out there original issue reported on code google com by madthanu gmail com on jul at
| 0
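The leveldb record above describes the classic durable-update pattern its bug violates: write the new file, fsync it, rename it over the old name, and fsync the containing directory before deleting anything that the renamed file supersedes. A minimal sketch of that pattern (my own illustration, not leveldb's actual C++ code; `atomic_write` is a hypothetical helper name, and the directory-fsync step assumes a POSIX filesystem):

```python
import os
import tempfile


def atomic_write(path, data):
    """Durably replace `path` with `data` via write + fsync + rename + dir fsync."""
    directory = os.path.dirname(os.path.abspath(path)) or "."
    fd, tmp = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # make the new contents durable first
        os.replace(tmp, path)     # atomic rename over the old file
        # fsync the directory so the rename itself survives a power failure;
        # only after this point is it safe to unlink superseded files
        dir_fd = os.open(directory, os.O_RDONLY)
        try:
            os.fsync(dir_fd)
        finally:
            os.close(dir_fd)
    except BaseException:
        try:
            os.unlink(tmp)        # clean up the temp file on failure
        except OSError:
            pass
        raise
```

The bug in the record is exactly the missing directory fsync between the rename of CURRENT and the unlink of the old MANIFEST and log files: without it, the filesystem may persist the deletions before the rename.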
|
422,149
| 28,374,483,259
|
IssuesEvent
|
2023-04-12 19:39:40
|
167987982FA/reimagined-octo-spoon
|
https://api.github.com/repos/167987982FA/reimagined-octo-spoon
|
opened
|
Helenaluengo\software
|
bug documentation duplicate enhancement help wanted good first issue invalid wontfix
|
)helena luengo software (((extremely detailed))),(((best quality))),(((masterpiece))),illustration,(((colorful))),clear-cut margin,1girl, Isometric half sphere island on neon background, isometric environment, isometric art, amazing detail, artstation, ray A warrior robot astronaut, floral, horizon zero dawn machine, posing for a fight intricate Steampunk city, sunrise, landscape, intricate, detailed, volumetric lighting, scenery, highly detailed, artstation, sharp uniform float palettes_speed;
uniform float palettes_shadow;
Isometric half sphere island on neon background, isometric environment, isometric art, amazing detail, artstation, ray A warrior robot astronaut, floral, horizon zero dawn machine, posing for a fight intricate Steampunk city, sunrise, landscape, intricate, detailed, volumetric lighting, scenery, highly detailed, artstation, sharp uniform float palettes_speed;
uniform float palettes_shadow;
uniform float palettes_color;
vec3 palettes_pal( float t, vec3 a, vec3 b, vec3 c, vec3 d )
{
return a + b*cos( 6.28318*(c*t+d) );
uniform float rotRects_grid;
uniform float rotRects_period;
uniform float rotRects_w;
uniform float rotRects_h;
mat2 rotRects_rot(float a) {
return mat2(cos(a), -sin(a), sin(a), cos(a));
}
float rotRects_rect(vec2 p, vec2 c) {
vec2 d = abs(p) - c;
return smoothstep(1., -1., max(d.x, d.y) * uResolution.y);
}
float rotRects_triWave(float n, float grid_divn1) {
return abs(mod(n + grid_divn1, 2. * grid_divn1) - grid_divn1) / max(grid_divn1 - 1., 1.);
}
float rotRects_(vec2 p, float mode) {
p += mode / rotRects_grid / 4.;
p *= rotRects_grid;
float grid_divn1 = rotRects_grid - 1.;
vec2 pi_ = p + .5 * grid_divn1;
vec2 pi = floor(pi_) + step(vec2(0.5), fract(pi_));
float n = mode < 0. ? pi.x + pi.y : pi.x - pi.y;
float angle = PI * (2. / rotRects_period * uTime + rotRects_triWave(n, grid_divn1));
if (mode > 0.) angle -= PI / 2.;
p = rotRects_rot(angle) * (fract(p + . <Phytonutrients><helenaluengo+intelligent DNA_>with_Intelligence>@andreatobarfigueroa_v4agEj@kindle.com/><helena luengo by andrea Python * Audioop- tobar_figueroa /><Python SDK VERSION superior (phytonutrients -start/install the stability-sdk package from puppies release new
// Isometric half sphere island on neon background, isometric environment, isometric art, amazing detail, artstation, ray A warrior robot astronaut, floral, horizon zero dawn machine, posing for a fight intricate Steampunk city, sunrise, landscape, intricate, detailed, volumetric lighting, scenery, highly detailed, artstation, sharp uniform float palettes_speed;
uniform float palettes_shadow;
Isometric half sphere island on neon background, isometric environment, isometric art, amazing detail, artstation, ray A warrior robot astronaut, floral, horizon zero dawn machine, posing for a fight intricate Steampunk city, sunrise, landscape, intricate, detailed, volumetric lighting, scenery, highly detailed, artstation, sharp uniform float palettes_speed;
uniform float palettes_shadow;
uniform float palettes_color;
vec3 palettes_pal( float t, vec3 a, vec3 b, vec3 c, vec3 d )
{
return a + b*cos( 6.28318*(c*t+d) );
uniform float rotRects_grid;
uniform float rotRects_period;
uniform float rotRects_w;
uniform float rotRects_h;
mat2 rotRects_rot(float a) {
return mat2(cos(a), -sin(a), sin(a), cos(a));
}
float rotRects_rect(vec2 p, vec2 c) {
vec2 d = abs(p) - c;
return smoothstep(1., -1., max(d.x, d.y) * uResolution.y);
}
float rotRects_triWave(float n, float grid_divn1) {
return abs(mod(n + grid_divn1, 2. * grid_divn1) - grid_divn1) / max(grid_divn1 - 1., 1.);
}
float rotRects_(vec2 p, float mode) {
p += mode / rotRects_grid / 4.;
p *= rotRects_grid;
float grid_divn1 = rotRects_grid - 1.;
vec2 pi_ = p + .5 * grid_divn1;
vec2 pi = floor(pi_) + step(vec2(0.5), fract(pi_));
float n = mode < 0. ? pi.x + pi.y : pi.x - pi.y;
float angle = PI * (2. / rotRects_period * uTime + rotRects_triWave(n, grid_divn1));
if (mode > 0.) angle -= PI / 2.;
p = rotRects_rot(angle) * (fract(p + .5 * mod(rotRects_grid, 2.)) - .5);
return rotRects_rect(p / rotRects_grid,
.5 * vec2(rotRects_w, rotRe---------------------------------------------------------------------
uniform float rotRects_grid;
uniform float rotRects_period;
uniform float rotRects_w;
uniform float rotRects_h;
mat2 rotRects_rot(float a) {
return mat2(cos(a), -sin(a), sin(a), cos(a));
}
float rotRects_rect(vec2 p, vec2 c) {
vec2 d = abs(p) - c;
return smoothstep(1., -1., max(d.x, d.y) * uResolution.y);
}
float rotRects_triWave(float n, float grid_divn1) {
return abs(mod(n + grid_divn1, 2. * grid_divn1) - grid_divn1) / max(grid_divn1 - 1., 1.);
}
float rotRects_(vec2 p, float mode) {
p += mode / rotRects_grid / 4.;
p *= rotRects_grid;
float grid_divn1 = rotRects_grid - 1.;
vec2 pi_ = p + .5 * grid_divn1;
vec2 pi = floor(pi_) + step(vec2(0.5), fract(pi_));
float n = mode < 0. ? pi.x + pi.y : pi.x - pi.y;
float angle = PI * (2. / rotRects_period * uTime + rotRects_triWave(n, grid_divn1));
if (mode > 0.) angle -= PI / 2.;
p = rotRects_rot(angle) * (fract(p + .5 * mod(rotRects_grid, 2.)) - .5);
return rotRects_rect(p / rotRects_grid,
.5 * vec2(rotRects_w, rotRects_h) / rotRects_grid);
}
vec4 rotRects(vec2 p, vec2 uv)
{
vec2 pp = vec2(-1., 1.) * p;
float val = .008 + rotRects_(pp, 1.) + rotRects_(pp, -1.);
val = pow(val, 1. / 2.2);
val *= 0.5 + 0.5 * pow(16.0 * uv.x * uv.y * (1.0 - uv.x) * (1.0 - uv.y), 0.2);
return vec4(val, val, val, 1.0);
}
// Author: Rigel
// Shader: Mystic Flower
// licence: https://creativecommons.org/licenses/by/4.0/
uniform float mysticFlower_disto;
uniform float mysticFlower_disti;
// noise in 2d
float mysticFlower_noise(vec2 p) {
vec2 i = floor(p);
vec2 f = fract(p);
vec2 u = f*f*(3.0-2.0*f);
return mix(mix(hash_2(i + vec2(0.0, 0.0)), hash_2(i + vec2(1.0, 0.0)), u.x),
mix(hash_2(i + vec2(0.0, 1.0)), hash_2(i + vec2(1.0, 1.0)), u.x), u.y);
}
// fractal noise in 2d
float mysticFlower_fbm (vec2 p) {
const mat2 m = mat2(0.8, 0.6, -0.6, 0.8);
float f = 0.0;
f += 0.5000*mysticFlower_noise(p); p*=m*2.02;
f += 0.2500*mysticFlower_noise(p); p*=m*2.04;
f += 0.1250*mysticFlower_noise(p); p*=m*2.03;
f += 0.0650*mysticFlower_noise(p); p*=m*2.01;
// normalize f;
f /= 0.9375;
return f*2.0-1.0;
}
vec2 mysticFlower(vec2 st, float distort, float distinct) {
vec2 p = st * vec2(1.5);
// angle and radius to center 0,0
float a = atan(p.y, abs(p.x));
float r = length(p);
// space distortion
float f = mysticFlower_fbm(vec2(a*2.+uTime*.1, r*.4-uTime*.3));
f = pow(abs(f), distinct) * sign(f);
p += vec2(f)*distort;
return p;
}
uniform float sakura_blur;
uniform float sakura_color;
// Borrowed from BigWIngs
vec4 sakura_N14(float t) {
return fract(sin(t*vec4(123., 104., 145., 24.))*vec4(657., 345., 879., 154.));
}
// Computes the RGB and alpha of a single flower in its own UV space
vec4 sakura_(vec2 uv, vec2 id, float blur)
{
float time = uTime + 45.0; //time is offset to avoid the flowers to be aligned at start
vec4 rnd = sakura_N14(mod(id.x, 500.0) * 5.4 + mod(id.y, 500.0) * 13.67); //get 4 random numbersper flower
// Offset the flower form the center in a random Lissajous pattern
uv *= mix(0.75, 1.3, rnd.y);
uv.x += sin(time * rnd.z * 0.3) * 0.6;
uv.y += sin(time * rnd.w * 0.45) * 0.4;
// Computes the angle of the flower with a random rotation speed
float angle = atan(uv.y, uv.x) + rnd.x * 421.47 + uTime * mix(-0.6, 0.6, rnd.x);
// Euclidean distance to the center of the flower
float dist = length(uv);
// Flower-shaped distance function from the center
float petal = 1.0 - abs(sin(angle * 2.5));
float sqPetal = petal * petal;
petal = mix(petal, sqPetal, 0.7);
float petal2 = 1.0 - abs(sin(angle * 2.5 + 1.5));
petal += petal2 * 0.2;
float sakuraDist = dist + petal * 0.25;
// Compute a blurry shadow mask.
float shadowblur = 0.3;
float shadow = smoothstep(0.5 + shadowblur, 0.5 - shadowblur, sakuraDist) * 0.4;
//Computes the sharper mask of the flower
float sakuraMask = smoothstep(0.5 + blur, 0.5 - blur, sakuraDist);
// The flower has a pink hue and is lighter in the center
vec3 hsv = rgb2hsv(vec3(1.0, 0.6, 0.7));
hsv.x = fract(hsv.x + sakura_color);
vec3 sakuraCol = hsv2rgb(hsv);
sakuraCol += (0.5 - dist) * 0.2;
// Computes the border mask of the flower
vec3 outlineCol = vec3(1.0, 0.3, 0.3);
float outlineMask = smoothstep(0.5 - blur, 0.5, sakuraDist + 0.045);
// Defines a tiling polarspace for the pistil pattern
float polarSpace = angle * 1.9098 + 0.5; // 12 / (2 * pi)
float polarPistil = fract(polarSpace) - 0.5;
// Round dot in the center
outlineMask += smoothstep(0.035 + blur, 0.035 - blur, dist);
float petalBlur = blur * 2.0;
float pistilMask = smoothstep(0.12 + blur, 0.12, dist) * smoothstep(0.05, 0.05 + blur , dist);
// Compute the pistil 'bars' in polar space
float barW = 0.2 - dist * 0.7;
float pistilBar = smoothstep(-barW, -barW + petalBlur, polarPistil) * smoothstep(barW + petalBlur, barW, polarPistil);
// Compute the little dots in polar space
float pistilDotLen = length(vec2(polarPistil * 0.10, dist) - vec2(0, 0.16)) * 9.0;
float pistilDot = smoothstep(0.1 + petalBlur, 0.1 - petalBlur, pistilDotLen);
// combines the middle and border colors
outlineMask += pistilMask * pistilBar + pistilDot;
sakuraCol = mix(sakuraCol, outlineCol, clamp(outlineMask,0.0,1.0) * 0.5);
//sets the background to the shadow color
sakuraCol = mix(vec3(0.2, 0.2, 0.8) * shadow, sakuraCol, sakuraMask);
//incorporates the shadow mask into alpha channel
sakuraMask = clamp(sakuraMask + shadow,0.0,1.0);
//returns the flower in pre-multiplied rgba
return vec4(sakuraCol, sakuraMask);
}
// blends a pre-multiplied src onto a dst color (without alpha)
vec3 sakura_premulMix(vec4 src, vec3 dst)
{
return dst.rgb * (1.0 - src.a) + src.rgb;
}
// blends a pre-multiplied src onto a dst color (with alpha)
vec4 sakura_premulMix(vec4 src, vec4 dst)
{
vec4 res;
res.rgb = sakura_premulMix(src, dst.rgb);
res.a = 1.0 - (1.0 - src.a) * (1.0 - dst.a);
return res;
}
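// The two sakura_premulMix overloads implement the premultiplied-alpha
// "over" operator: src.rgb is already scaled by src.a, so the blend is
// dst*(1-src.a) + src.rgb, and coverages combine as
// 1 - (1-src.a)*(1-dst.a). Worked example (illustrative values, not taken
// from the shader): compositing src = vec4(0.4, 0.0, 0.0, 0.5) over
// dst = vec3(0.0, 1.0, 0.0) gives red 0.0*0.5 + 0.4 = 0.4 and
// green 1.0*0.5 = 0.5, i.e. vec3(0.4, 0.5, 0.0).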
// Computes a Layer of flowers
vec4 sakura_layer(vec2 uv, float blur)
{
vec2 cellUV = fract(uv) - 0.5;
vec2 cellId = floor(uv);
vec4 accum = vec4(0.0);
// the flowers can overlap on the 9 neighboring cells so we blend them all together on each cell
for (float y = -1.0; y <= 1.0; y++)
{
for (float x = -1.0; x <= 1.0; x++)
{
vec2 offset = vec2(x, y);
vec4 sakura = sakura_(cellUV - offset, cellId + offset, blur);
accum = sakura_premulMix(sakura, accum);
}
}
return accum;
}
vec4 sakura(vec2 st, vec4 inc, float inb)
{
// Scroll the UV with a sinusoidal oscillation
vec2 p = vec2(st);
p.y += uTime * 0.1;
p.x -= uTime * 0.03 + sin(uTime) * 0.1;
p *= 4.3;
vec3 col = inc.rgb;
// Compute a tilt-shift-like blur factor
float blur = abs(st.y);
blur *= blur * 0.15;
// Computes several layers with various degrees of blur and scale
vec4 layer1 = sakura_layer(p, inb + blur);
// Blend it all together
col = sakura_premulMix(layer1, col);
return vec4(col,inc.a);
}
// The MIT License
//
uniform float palettes_speed;
uniform float palettes_shadow;
uniform float palettes_color;
vec3 palettes_pal( float t, vec3 a, vec3 b, vec3 c, vec3 d )
{
return a + b*cos( 6.28318*(c*t+d) );
}
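// palettes_pal is the classic cosine palette a + b*cos(2*pi*(c*t + d)):
// a is the base color, b the amplitude, c the per-channel frequency and d
// the per-channel phase. For example, with a = b = vec3(0.5),
// c = vec3(1.0) and d = vec3(0.0, 0.33, 0.67), the palette sweeps a full
// rainbow as t goes from 0 to 1.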
vec4 palettes(vec2 st, float speed, float shadow)
{
// animate
vec2 p = abs(st);
p.x += speed*uTime;
// compute colors
vec3 col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,1.0),vec3(0.0,0.33,0.67) );
if( p.y>(1.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,1.0),vec3(0.0,0.10,0.20) );
if( p.y>(2.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,1.0),vec3(0.3,0.20,0.20) );
if( p.y>(3.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,0.5),vec3(0.8,0.90,0.30) );
if( p.y>(4.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,0.7,0.4),vec3(0.0,0.15,0.20) );
if( p.y>(5.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(2.0,1.0,0.0),vec3(0.5,0.20,0.25) );
if( p.y>(6.0/7.0) ) col = palettes_pal( p.x, vec3(0.8,0.5,0.4),vec3(0.2,0.4,0.2),vec3(2.0,1.0,1.0),vec3(0.0,0.25,0.25) );
// band
float f = fract(p.y*7.0);
// borders
col *= smoothstep( 0.49, 0.47, abs(f-0.5) );
// shadowing
col *= mix(1.0, sqrt(4.0*f*(1.0-f)), shadow);
return vec4( col, 1.0 );
}
void main() {
vec2 uv = vec2(textureCoordinate);
vec2 st = vec2(uv.x, 1.0 - uv.y);
st = vec2(cropArea[2] * uv.x + cropArea[0], 1.0 - cropArea[3] * uv.y - cropArea[1]);
vec4 color = vec4(step(st.x, -1.0));
color = rotRects(st, uv);
st = mysticFlower(st, mysticFlower_disto, mysticFlower_disti);
color = sakura(st, color, sakura_blur);
color = palettes(st, palettes_speed, palettes_shadow);
color = vec4(0.0, 0.0, 0.0, 1.0) + color - vec4(0.0, 0.0, 0.0, 1.0) * color.a;
gl_FragColor = color;
}
uniform float tonemap_exposure;
vec4 tonemap(vec4 inc, float exposure) {
vec3 col = smoothstep(0.0, 1.0, 1.0 - exp(-inc.rgb * exposure));
// sRGB Color Component Transfer: https://www.color.org/chardata/rgb/sRGB.pdf
col = vec3(
col.r > 0.0031308 ? (pow(col.r, 1.0 / 2.4) * 1.055) - 0.055 : col.r * 12.92,
col.g > 0.0031308 ? (pow(col.g, 1.0 / 2.4) * 1.055) - 0.055 : col.g * 12.92,
col.b > 0.0031308 ? (pow(col.b, 1.0 / 2.4) * 1.055) - 0.055 : col.b * 12.92);
return vec4(clamp(col, 0.0, 1.0), inc.a);
}
vec2 pos__trans(vec2 uv) {
vec2 p = -1. + 2. * uv;
p.x *= uResolution.x/uResolution.y;
return p;
}
// ---------------------------------------------------------------------
uniform float rotRects_grid;
uniform float rotRects_period;
uniform float rotRects_w;
uniform float rotRects_h;
mat2 rotRects_rot(float a) {
return mat2(cos(a), -sin(a), sin(a), cos(a));
}
float rotRects_rect(vec2 p, vec2 c) {
vec2 d = abs(p) - c;
return smoothstep(1., -1., max(d.x, d.y) * uResolution.y);
}
float rotRects_triWave(float n, float grid_divn1) {
return abs(mod(n + grid_divn1, 2. * grid_divn1) - grid_divn1) / max(grid_divn1 - 1., 1.);
}
float rotRects_(vec2 p, float mode) {
p += mode / rotRects_grid / 4.;
p *= rotRects_grid;
float grid_divn1 = rotRects_grid - 1.;
vec2 pi_ = p + .5 * grid_divn1;
vec2 pi = floor(pi_) + step(vec2(0.5), fract(pi_));
float n = mode < 0. ? pi.x + pi.y : pi.x - pi.y;
float angle = PI * (2. / rotRects_period * uTime + rotRects_triWave(n, grid_divn1));
if (mode > 0.) angle -= PI / 2.;
p = rotRects_rot(angle) * (fract(p + .5 * mod(rotRects_grid, 2.)) - .5);
return rotRects_rect(p / rotRects_grid,
.5 * vec2(rotRects_w, rotRects_h) / rotRects_grid);
}
vec4 rotRects(vec2 p, vec2 uv)
{
vec2 pp = vec2(-1., 1.) * p;
float val = .008 + rotRects_(pp, 1.) + rotRects_(pp, -1.);
val = pow(val, 1. / 2.2);
val *= 0.5 + 0.5 * pow(16.0 * uv.x * uv.y * (1.0 - uv.x) * (1.0 - uv.y), 0.2);
return vec4(val, val, val, 1.0);
}
// noise in 2d
float mysticFlower_noise(vec2 p) {
vec2 i = floor(p);
vec2 f = fract(p);
vec2 u = f*f*(3.0-2.0*f);
return mix(mix(hash_2(i + vec2(0.0, 0.0)), hash_2(i + vec2(1.0, 0.0)), u.x),
mix(hash_2(i + vec2(0.0, 1.0)), hash_2(i + vec2(1.0, 1.0)), u.x), u.y);
}
// fractal noise in 2d
float mysticFlower_fbm (vec2 p) {
const mat2 m = mat2(0.8, 0.6, -0.6, 0.8);
float f = 0.0;
f += 0.5000*mysticFlower_noise(p); p*=m*2.02;
f += 0.2500*mysticFlower_noise(p); p*=m*2.04;
f += 0.1250*mysticFlower_noise(p); p*=m*2.03;
f += 0.0650*mysticFlower_noise(p); p*=m*2.01;
// normalize f;
f /= 0.9375;
return f*2.0-1.0;
}
vec2 mysticFlower(vec2 st, float distort, float distinct) {
vec2 p = st * vec2(1.5);
// angle and radius to center 0,0
float a = atan(p.y, abs(p.x));
float r = length(p);
// space distortion
float f = mysticFlower_fbm(vec2(a*2.+uTime*.1, r*.4-uTime*.3));
f = pow(abs(f), distinct) * sign(f);
p += vec2(f)*distort;
return p;
}
// Sakura Bliss by Philippe Desgranges
// Email: Philippe.desgranges@gmail.com
// License Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.
uniform float sakura_blur;
uniform float sakura_color;
// Borrowed from BigWIngs
vec4 sakura_N14(float t) {
return fract(sin(t*vec4(123., 104., 145., 24.))*vec4(657., 345., 879., 154.));
}
// Computes the RGB and alpha of a single flower in its own UV space
vec4 sakura_(vec2 uv, vec2 id, float blur)
{
float time = uTime + 45.0; // time is offset to avoid the flowers being aligned at start
vec4 rnd = sakura_N14(mod(id.x, 500.0) * 5.4 + mod(id.y, 500.0) * 13.67); // get 4 random numbers per flower
// Offset the flower from the center in a random Lissajous pattern
uv *= mix(0.75, 1.3, rnd.y);
uv.x += sin(time * rnd.z * 0.3) * 0.6;
uv.y += sin(time * rnd.w * 0.45) * 0.4;
// Computes the angle of the flower with a random rotation speed
float angle = atan(uv.y, uv.x) + rnd.x * 421.47 + uTime * mix(-0.6, 0.6, rnd.x);
// Euclidean distance to the center of the flower
float dist = length(uv);
// Flower-shaped distance function from the center
float petal = 1.0 - abs(sin(angle * 2.5));
float sqPetal = petal * petal;
petal = mix(petal, sqPetal, 0.7);
float petal2 = 1.0 - abs(sin(angle * 2.5 + 1.5));
petal += petal2 * 0.2;
float sakuraDist = dist + petal * 0.25;
// Compute a blurry shadow mask.
float shadowblur = 0.3;
float shadow = smoothstep(0.5 + shadowblur, 0.5 - shadowblur, sakuraDist) * 0.4;
//Computes the sharper mask of the flower
float sakuraMask = smoothstep(0.5 + blur, 0.5 - blur, sakuraDist);
// The flower has a pink hue and is lighter in the center
vec3 hsv = rgb2hsv(vec3(1.0, 0.6, 0.7));
hsv.x = fract(hsv.x + sakura_color);
vec3 sakuraCol = hsv2rgb(hsv);
sakuraCol += (0.5 - dist) * 0.2;
// Computes the border mask of the flower
vec3 outlineCol = vec3(1.0, 0.3, 0.3);
float outlineMask = smoothstep(0.5 - blur, 0.5, sakuraDist + 0.045);
// Defines a tiling polarspace for the pistil pattern
float polarSpace = angle * 1.9098 + 0.5; // 12 / (2 * pi)
float polarPistil = fract(polarSpace) - 0.5;
// Round dot in the center
outlineMask += smoothstep(0.035 + blur, 0.035 - blur, dist);
float petalBlur = blur * 2.0;
float pistilMask = smoothstep(0.12 + blur, 0.12, dist) * smoothstep(0.05, 0.05 + blur , dist);
// Compute the pistil 'bars' in polar space
float barW = 0.2 - dist * 0.7;
float pistilBar = smoothstep(-barW, -barW + petalBlur, polarPistil) * smoothstep(barW + petalBlur, barW, polarPistil);
// Compute the little dots in polar space
float pistilDotLen = length(vec2(polarPistil * 0.10, dist) - vec2(0, 0.16)) * 9.0;
float pistilDot = smoothstep(0.1 + petalBlur, 0.1 - petalBlur, pistilDotLen);
// combines the middle and border colors
outlineMask += pistilMask * pistilBar + pistilDot;
sakuraCol = mix(sakuraCol, outlineCol, clamp(outlineMask,0.0,1.0) * 0.5);
//sets the background to the shadow color
sakuraCol = mix(vec3(0.2, 0.2, 0.8) * shadow, sakuraCol, sakuraMask);
//incorporates the shadow mask into alpha channel
sakuraMask = clamp(sakuraMask + shadow,0.0,1.0);
//returns the flower in pre-multiplied rgba
return vec4(sakuraCol, sakuraMask);
}
// blends a pre-multiplied src onto a dst color (without alpha)
vec3 sakura_premulMix(vec4 src, vec3 dst)
{
return dst.rgb * (1.0 - src.a) + src.rgb;
}
// blends a pre-multiplied src onto a dst color (with alpha)
vec4 sakura_premulMix(vec4 src, vec4 dst)
{
vec4 res;
res.rgb = sakura_premulMix(src, dst.rgb);
res.a = 1.0 - (1.0 - src.a) * (1.0 - dst.a);
return res;
}
// Computes a Layer of flowers
vec4 sakura_layer(vec2 uv, float blur)
{
vec2 cellUV = fract(uv) - 0.5;
vec2 cellId = floor(uv);
vec4 accum = vec4(0.0);
// the flowers can overlap on the 9 neighboring cells so we blend them all together on each cell
for (float y = -1.0; y <= 1.0; y++)
{
for (float x = -1.0; x <= 1.0; x++)
{
vec2 offset = vec2(x, y);
vec4 sakura = sakura_(cellUV - offset, cellId + offset, blur);
accum = sakura_premulMix(sakura, accum);
}
}
return accum;
}
vec4 sakura(vec2 st, vec4 inc, float inb)
{
// Scroll the UV with a sinusoidal oscillation
vec2 p = vec2(st);
p.y += uTime * 0.1;
p.x -= uTime * 0.03 + sin(uTime) * 0.1;
p *= 4.3;
vec3 col = inc.rgb;
// Compute a tilt-shift-like blur factor
float blur = abs(st.y);
blur *= blur * 0.15;
// Computes several layers with various degrees of blur and scale
vec4 layer1 = sakura_layer(p, inb + blur);
// Blend it all together
col = sakura_premulMix(layer1, col);
return vec4(col,inc.a);
}
void main() {
vec2 uv = vec2(textureCoordinate);
vec2 st = vec2(uv.x, 1.0 - uv.y);
st = vec2(cropArea[2] * uv.x + cropArea[0], 1.0 - cropArea[3] * uv.y - cropArea[1]);
vec4 color = vec4(step(st.x, -1.0));
color = rotRects(st, uv);
st = mysticFlower(st, mysticFlower_disto, mysticFlower_disti);
color = sakura(st, color, sakura_blur);
color = vec4(0.0, 0.0, 0.0, 1.0) + color - vec4(0.0, 0.0, 0.0, 1.0) * color.a;
gl_FragColor = color;
}
// ---------------------------------------------------------------------
uniform float rotRects_grid;
uniform float rotRects_period;
uniform float rotRects_w;
uniform float rotRects_h;
mat2 rotRects_rot(float a) {
return mat2(cos(a), -sin(a), sin(a), cos(a));
}
float rotRects_rect(vec2 p, vec2 c) {
vec2 d = abs(p) - c;
return smoothstep(1., -1., max(d.x, d.y) * uResolution.y);
}
float rotRects_triWave(float n, float grid_divn1) {
return abs(mod(n + grid_divn1, 2. * grid_divn1) - grid_divn1) / max(grid_divn1 - 1., 1.);
}
float rotRects_(vec2 p, float mode) {
p += mode / rotRects_grid / 4.;
p *= rotRects_grid;
float grid_divn1 = rotRects_grid - 1.;
vec2 pi_ = p + .5 * grid_divn1;
vec2 pi = floor(pi_) + step(vec2(0.5), fract(pi_));
float n = mode < 0. ? pi.x + pi.y : pi.x - pi.y;
float angle = PI * (2. / rotRects_period * uTime + rotRects_triWave(n, grid_divn1));
if (mode > 0.) angle -= PI / 2.;
p = rotRects_rot(angle) * (fract(p + .5 * mod(rotRects_grid, 2.)) - .5);
return rotRects_rect(p / rotRects_grid,
.5 * vec2(rotRects_w, rotRects_h) / rotRects_grid);
}
vec4 rotRects(vec2 p, vec2 uv)
{
vec2 pp = vec2(-1., 1.) * p;
float val = .008 + rotRects_(pp, 1.) + rotRects_(pp, -1.);
val = pow(val, 1. / 2.2);
val *= 0.5 + 0.5 * pow(16.0 * uv.x * uv.y * (1.0 - uv.x) * (1.0 - uv.y), 0.2);
return vec4(val, val, val, 1.0);
}
uniform float moon_dark;
uniform float moon_haze;
uniform float moon_x;
uniform float moon_y;
uniform float moon_radius;
uniform float moon_light;
uniform float moon_clear;
vec3 moon_noise(vec2 p)
{
vec4 w = vec4(
floor(p),
ceil (p) );
vec3
_00 = hash3_2(w.xy),
_01 = hash3_2(w.xw),
_10 = hash3_2(w.zy),
_11 = hash3_2(w.zw),
_0 = mix(_00,_01,fract(p.y)),
_1 = mix(_10,_11,fract(p.y));
return mix(_0,_1,fract(p.x));
}
vec3 moon_fbm(vec2 p)
{
vec3 w = vec3(0);
float N = 5.;
mat2 ei = mat2(cos(.5),-sin(.5),sin(.5),cos(.5))*1.7;
for (float i = 1.; i < N; i++)
{
p *= ei;
w += moon_noise(p)/N/i;
}
return w;
}
vec4 moon(vec2 U, float dark, float haze, float x, float y, float radius, float light, float clear) {
// sunset
vec4 Q = vec4(0.);
Q = 1.-dark+.4*sin(4.4-.8*U.y+vec4(1.,2.,3.,4.));
Q += haze*moon_fbm(2.*U).x;
// Moon
vec2 r = U-vec2(x,y);
float l = length(r);
float L = radius;
vec3 n = moon_fbm(8.*U);
Q += vec4(light+n.x)*exp(-clear*max(l-L,0.));
Q += .3*moon_fbm(60.*U).x*Q;
Q = clamp(Q,0.,1.);
return Q;
}
// The MIT License
//
uniform float palettes_speed;
uniform float palettes_shadow;
uniform float palettes_color;
vec3 palettes_pal( float t, vec3 a, vec3 b, vec3 c, vec3 d )
{
return a + b*cos( 6.28318*(c*t+d) );
}
vec4 palettes(vec2 st, float speed, float shadow)
{
// animate
vec2 p = abs(st);
p.x += speed*uTime;
// compute colors
vec3 col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,1.0),vec3(0.0,0.33,0.67) );
if( p.y>(1.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,1.0),vec3(0.0,0.10,0.20) );
if( p.y>(2.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,1.0),vec3(0.3,0.20,0.20) );
if( p.y>(3.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,0.5),vec3(0.8,0.90,0.30) );
if( p.y>(4.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,0.7,0.4),vec3(0.0,0.15,0.20) );
if( p.y>(5.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(2.0,1.0,0.0),vec3(0.5,0.20,0.25) );
if( p.y>(6.0/7.0) ) col = palettes_pal( p.x, vec3(0.8,0.5,0.4),vec3(0.2,0.4,0.2),vec3(2.0,1.0,1.0),vec3(0.0,0.25,0.25) );
// band
float f = fract(p.y*7.0);
// borders
col *= smoothstep( 0.49, 0.47, abs(f-0.5) );
// shadowing
col *= mix(1.0, sqrt(4.0*f*(1.0-f)), shadow);
return vec4( col, 1.0 );
}
//Tweet: https://twitter.com/XorDev/status/1519343739419959297
//Twigl: https://t.co/FELzNSfU40
//Based on "Molecules 2": https://www.shadertoy.com/view/7llBzS
uniform float shuffleMosaic_period;
uniform float shuffleMosaic_twist;
uniform float shuffleMosaic_freq;
vec2 shuffleMosaic(vec2 I, float period, float twist, float freq) {
vec3 c = vec3(0.,2.,1.);
vec3 T= mod(uTime+c,3.*period);
vec3 P = vec3(dot(vec2(-7.,4.), I), dot(vec2(0.,-8.), I), dot(vec2(7.,4.), I)) * 0.166667 + .2;
int t1 = int(mod(T.y, 3.));
int t2 = int(mod(T.y+1., 3.));
float ti = fract(mod(T.y, 3.));
float tc = mix(P[t1], P[t2], ti);
P+=(T*twist-sin(T*6.283)*0.166667).x*sin(T*3.14*freq)*cos(tc*3.14);
return P.xy;
}
uniform float gradientColor_c1;
uniform float gradientColor_c2;
uniform float gradientColor_c3;
uniform float gradientColor_dir;
vec4 gradientColor(vec2 st, float c1, float c2, float c3, float dir){
vec3 col;
float d = dir * PI / 180.;
d = (st.x - 0.5) * cos(d) + (st.y - 0.5) * sin(d) + 0.5;
col.r = d + c1;
col.g = d + c2;
col.b = d + c3;
return vec4(col,1.0);
}
// Copyright Inigo Quilez, 2020 - https://iquilezles.org/
// I am the sole copyright owner of this Work.
// You cannot host, display, distribute or share this Work in any form,
// including physical and digital. You cannot use this Work in any
// commercial or non-commercial product, website or project. You cannot
// sell this Work and you cannot mint an NFTs of it.
// I share this Work for educational purposes, and you can link to it,
// through an URL, proper attribution and unmodified screenshot, as part
// of your educational material. If these conditions are too restrictive
// please contact me and we'll definitely work it out.
uniform float stripes_2_count;
float stripes_2_noise(vec2 p)
{
vec2 i = floor(p);
vec2 f = fract(p);
f = f*f*(3.0-2.0*f);
float n = i.x + i.y*57.0;
return mix(mix(hash(n+ 0.0), hash(n+ 1.0), f.x),
mix(hash(n+57.0), hash(n+58.0), f.x), f.y);
}
vec2 stripes_2_map(vec2 p, float time)
{
for (int i=0; i<4; i++)
{
float a = stripes_2_noise(p*1.5)*6.2831 + time;
p += 0.1*vec2(cos(a), sin(a));
}
return p;
}
float stripes_2_height(vec2 p, vec2 q)
{
float h = dot(p-q, p-q);
h += 0.005*stripes_2_noise(0.75*(p+q));
return h;
}
vec4 stripes_2(vec2 p, vec4 inc, float count)
{
float time = 0.25*uTime;
vec2 q = p + 0.3;
// color
float w = count*q.x;
float u = floor(w);
float f = fract(w);
vec3 col = 1.1*inc.rgb + 0.2*sin(3.0*u+vec3(5.0, 1.5, 2.0));
// filtered drop-shadow
float sha = smoothstep(0.0, 0.8, f);
// normal
vec2 eps = vec2(2.0/uResolution.y, 0.0);
float l2c = stripes_2_height(q, p);
float l2x = stripes_2_height(stripes_2_map(p+eps.xy, time), p) - l2c;
float l2y = stripes_2_height(stripes_2_map(p+eps.yx, time), p) - l2c;
vec3 nor = normalize(vec3(l2x, eps.x, l2y));
// lighting
col *= 0.4+0.6*sha;
col *= 0.8+0.2*vec3(1.0, 0.9, 0.3)*dot(nor, vec3(0.7, 0.3, 0.7));
col += 0.2*pow(nor.y, 8.0)*sha;
col *= 7.5*l2c;
return vec4(col, inc.a);
}
uniform float fluffballs_c1;
uniform float fluffballs_c2;
uniform float fluffballs_c3;
uniform float fluffballs_nor;
uniform float fluffballs_deform;
mat2 fluffballs_rot(float c) {
float s=sin(c);
return mat2(c=cos(c),s,-s,c);
}
float fluffballs_map(vec3 p) {
float d = 1e9;
p -= (hash3_3(p)-.5)*.1;
d = min(d,length(fract(p)-.5)+1.);
p.xy = (p.xy+p.yx*vec2(-1,1))/sqrt(2.);
p.xz = (p.xz+p.zx*vec2(-1,1))/sqrt(2.);
p*=.4;
p-=uTime*.3;
d = min(d,(length(fract(p)-.5))/.4);
return d;
}
vec3 fluffballs_normal(vec3 P, float E) {
return vec3(
fluffballs_map(P+vec3(E,0,0))-fluffballs_map(P-vec3(E,0,0)),
fluffballs_map(P+vec3(0,E,0))-fluffballs_map(P-vec3(0,E,0)),
fluffballs_map(P+vec3(0,0,E))-fluffballs_map(P-vec3(0,0,E))
) / (E*2.);
}
float fluffballs_trace(vec3 ro,vec3 rd) {
vec3 p = ro;
float t = 0.;
float h = -.4;
for(int i=0;i<40;i++){
t += (fluffballs_map(p)+t*h)/(1.-h);
p = ro+rd*t;
}
return t;
}
vec4 fluffballs(vec2 p, vec4 inc, float c1, float c2, float c3, float deform, float nor){
vec3 ro = vec3(sin(uTime*.2)*4.,sin(.1*uTime*1.23)*4.,-0.)+uTime;
vec3 rd = normalize(vec3(p,deform));
rd.yz*=fluffballs_rot(uTime*.37);
rd.xy*=fluffballs_rot(uTime*.4);
vec4 O = vec4(inc);
float t = fluffballs_trace(ro,rd);
vec3 pp = ro+rd*t;
vec3 n = fluffballs_normal(pp,nor);
O.xyz += vec3(1.,2.,3.)*max(dot(n,normalize(vec3(0.,1.,0.)))*.5+.5,0.)*c1;
O.xyz += vec3(4.,2.,1.)*max(dot(n,normalize(vec3(3.,1.,0.))),0.);
vec3 hsv = rgb2hsv(vec3(.1,.2,.3));
hsv.x = fract(hsv.x+c3);
O.xyz += hsv2rgb(hsv)*exp(t*.4);
O.xyz *= c2;
O.xyz-=.4;
O.xyz = 1. - exp(-O.xyz);
O.xyz = pow(O.xyz,vec3(0.45454545));
return O;
}
uniform float dither_v;
vec4 dither(vec2 st, vec4 inc, float d) {
vec4 col = vec4(inc);
col.rgb += (d/255.0)*hash3_2(st*200.0);
return col;
}
// Sakura Bliss by Philippe Desgranges
// Email: Philippe.desgranges@gmail.com
// License Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.
uniform float sakura_blur;
uniform float sakura_color;
// Borrowed from BigWIngs
vec4 sakura_N14(float t) {
return fract(sin(t*vec4(123., 104., 145., 24.))*vec4(657., 345., 879., 154.));
}
// Computes the RGB and alpha of a single flower in its own UV space
vec4 sakura_(vec2 uv, vec2 id, float blur)
{
float time = uTime + 45.0; // time is offset to avoid the flowers being aligned at start
vec4 rnd = sakura_N14(mod(id.x, 500.0) * 5.4 + mod(id.y, 500.0) * 13.67); // get 4 random numbers per flower
// Offset the flower from the center in a random Lissajous pattern
uv *= mix(0.75, 1.3, rnd.y);
uv.x += sin(time * rnd.z * 0.3) * 0.6;
uv.y += sin(time * rnd.w * 0.45) * 0.4;
// Computes the angle of the flower with a random rotation speed
float angle = atan(uv.y, uv.x) + rnd.x * 421.47 + uTime * mix(-0.6, 0.6, rnd.x);
// Euclidean distance to the center of the flower
float dist = length(uv);
// Flower-shaped distance function from the center
float petal = 1.0 - abs(sin(angle * 2.5));
float sqPetal = petal * petal;
petal = mix(petal, sqPetal, 0.7);
float petal2 = 1.0 - abs(sin(angle * 2.5 + 1.5));
petal += petal2 * 0.2;
float sakuraDist = dist + petal * 0.25;
// Compute a blurry shadow mask.
float shadowblur = 0.3;
float shadow = smoothstep(0.5 + shadowblur, 0.5 - shadowblur, sakuraDist) * 0.4;
//Computes the sharper mask of the flower
float sakuraMask = smoothstep(0.5 + blur, 0.5 - blur, sakuraDist);
// The flower has a pink hue and is lighter in the center
vec3 hsv = rgb2hsv(vec3(1.0, 0.6, 0.7));
hsv.x = fract(hsv.x + sakura_color);
vec3 sakuraCol = hsv2rgb(hsv);
sakuraCol += (0.5 - dist) * 0.2;
// Computes the border mask of the flower
vec3 outlineCol = vec3(1.0, 0.3, 0.3);
float outlineMask = smoothstep(0.5 - blur, 0.5, sakuraDist + 0.045);
// Defines a tiling polarspace for the pistil pattern
float polarSpace = angle * 1.9098 + 0.5; // 12 / (2 * pi)
float polarPistil = fract(polarSpace) - 0.5;
// Round dot in the center
outlineMask += smoothstep(0.035 + blur, 0.035 - blur, dist);
float petalBlur = blur * 2.0;
float pistilMask = smoothstep(0.12 + blur, 0.12, dist) * smoothstep(0.05, 0.05 + blur , dist);
// Compute the pistil 'bars' in polar space
float barW = 0.2 - dist * 0.7;
float pistilBar = smoothstep(-barW, -barW + petalBlur, polarPistil) * smoothstep(barW + petalBlur, barW, polarPistil);
// Compute the little dots in polar space
float pistilDotLen = length(vec2(polarPistil * 0.10, dist) - vec2(0, 0.16)) * 9.0;
float pistilDot = smoothstep(0.1 + petalBlur, 0.1 - petalBlur, pistilDotLen);
// combines the middle and border colors
outlineMask += pistilMask * pistilBar + pistilDot;
sakuraCol = mix(sakuraCol, outlineCol, clamp(outlineMask,0.0,1.0) * 0.5);
//sets the background to the shadow color
sakuraCol = mix(vec3(0.2, 0.2, 0.8) * shadow, sakuraCol, sakuraMask);
//incorporates the shadow mask into alpha channel
sakuraMask = clamp(sakuraMask + shadow,0.0,1.0);
//returns the flower in pre-multiplied rgba
return vec4(sakuraCol, sakuraMask);
}
// blends a pre-multiplied src onto a dst color (without alpha)
vec3 sakura_premulMix(vec4 src, vec3 dst)
{
return dst.rgb * (1.0 - src.a) + src.rgb;
}
// blends a pre-multiplied src onto a dst color (with alpha)
vec4 sakura_premulMix(vec4 src, vec4 dst)
{
vec4 res;
res.rgb = sakura_premulMix(src, dst.rgb);
res.a = 1.0 - (1.0 - src.a) * (1.0 - dst.a);
return res;
}
// Computes a Layer of flowers
vec4 sakura_layer(vec2 uv, float blur)
{
vec2 cellUV = fract(uv) - 0.5;
vec2 cellId = floor(uv);
vec4 accum = vec4(0.0);
// the flowers can overlap on the 9 neighboring cells so we blend them all together on each cell
for (float y = -1.0; y <= 1.0; y++)
{
for (float x = -1.0; x <= 1.0; x++)
{
vec2 offset = vec2(x, y);
vec4 sakura = sakura_(cellUV - offset, cellId + offset, blur);
accum = sakura_premulMix(sakura, accum);
}
}
return accum;
}
vec4 sakura(vec2 st, vec4 inc, float inb)
{
// Scroll the UV with a sinusoidal oscillation
vec2 p = vec2(st);
p.y += uTime * 0.1;
p.x -= uTime * 0.03 + sin(uTime) * 0.1;
p *= 4.3;
vec3 col = inc.rgb;
// Compute a tilt-shift-like blur factor
float blur = abs(st.y);
blur *= blur * 0.15;
// Computes several layers with various degrees of blur and scale
vec4 layer1 = sakura_layer(p, inb + blur);
// Blend it all together
col = sakura_premulMix(layer1, col);
return vec4(col,inc.a);
}
uniform float stripes1_c1;
uniform float stripes1_c2;
uniform float stripes1_c3;
uniform float stripes1_count;
vec4 stripes1(vec2 p, float c1, float c2, float c3, float count) {
float a = floor((p.x - p.y * 0.5 - uTime * .08) * count) - uTime * 2.;
vec3 col = vec3(sin(a + c1*PI), sin(a + c2*PI), sin(a + c3*PI)) * 0.2 + 0.7;
return vec4(col, 1.0);
}
// Created by greenbird10
// License Creative Commons Attribution-NonCommercial-ShareAlike 3.0
uniform float water_sunx;
uniform float water_suny;
uniform float water_wave1;
uniform float water_wave2;
//From Dave (https://www.shadertoy.com/view/4djSRW)
vec2 water_hash(vec2 p)
{
return hash2_2(p)*2.0 - 1.0;
}
//From iq (https://www.shadertoy.com/view/XdXGW8)
float water_noise( vec2 p )
{
vec2 i = floor( p );
vec2 f = fract( p );
vec2 u = f*f*(3.0-2.0*f);
return mix( mix( dot( water_hash( i + vec2(0.0,0.0) ), f - vec2(0.0,0.0) ),
dot( water_hash( i + vec2(1.0,0.0) ), f - vec2(1.0,0.0) ), u.x),
mix( dot( water_hash( i + vec2(0.0,1.0) ), f - vec2(0.0,1.0) ),
dot( water_hash( i + vec2(1.0,1.0) ), f - vec2(1.0,1.0) ), u.x), u.y);
}
vec4 water(vec2 p)
{
// water
vec3 col = vec3(102./255., 120./255., 133./255.);
vec3 col1 = vec3(165./255., 157./255., 152./255.);
vec2 pp = p * vec2(uResolution.x/uResolution.y, 1.);
float sun = distance(pp, vec2(water_sunx, water_suny));
sun = pow(sun, 1.7);
col = mix(col, col*1.2, sun);
col1 = mix(col1, col1*1.5, sun);
col = mix(col1, col, smoothstep(
water_wave1, water_wave2, p.y + 0.5 * water_noise(vec2(
(p.x + 0.3 * water_noise(vec2(p.y * 30., 0.17 + uTime*0.5))) * 4., 0.33 + uTime*0.1))));
// Output to screen
return vec4(col,1.0);
}
// Copyright Inigo Quilez, 2020 - https://iquilezles.org/
// I am the sole copyright owner of this Work.
// You cannot host, display, distribute or share this Work in any form,
// including physical and digital. You cannot use this Work in any
// commercial or non-commercial product, website or project. You cannot
// sell this Work and you cannot mint an NFTs of it.
// I share this Work for educational purposes, and you can link to it,
// through an URL, proper attribution and unmodified screenshot, as part
// of your educational material. If these conditions are too restrictive
// please contact me and we'll definitely work it out.
uniform float stripes_1_times;
uniform float stripes_1_dist;
uniform float stripes_1_phase;
uniform float stripes_1_amp;
float stripes_1_noise( vec2 p )
{
vec2 i = floor(p);
vec2 f = fract(p);
f = f*f*(3.0-2.0*f);
float n = i.x + i.y*57.0;
return mix(mix( hash(n+ 0.0), hash(n+ 1.0),f.x),
mix( hash(n+57.0), hash(n+58.0),f.x),f.y);
}
vec2 stripes_1(vec2 p, float times, float phase, float dist, float amp)
{
for( float i=0.; i<times; i++ )
{
float a = stripes_1_noise(p*1.5)*PI*phase + i + uTime;
p += dist*vec2( cos(a), sin(a) );
dist *= amp;
}
return p;
}
uniform float subdivision_dir_x;
uniform float subdivision_dir_y;
uniform float subdivision_fine;
uniform float subdivision_chaos;
uniform float subdivision_deform;
float subdivision_bum(float sc, vec2 ipos) {
return 0.5 * mod(PI * (ipos.y * cos(sc * ipos.x) + ipos.x * cos(sc * ipos.y)), 2.);
}
vec2 subdivision(vec2 p, float dirX, float dirY, float fine, float chaos, float deform) {
vec2 direction = vec2(dirX, dirY);
vec2 pp = p + direction;
float sc = PI;
float ss = sign(dirX);
ss += step(ss, 0.0);
float si = fine * (pp.x + max(30. - abs(dirX), 0.) * ss);
float b = 1.;
pp *= 1. + 0.00045 * chaos * cos(0.1 * uTime + 4. * PI * b);
vec2 ipos;
float n = 5.;
for (float i = 0.; i < n; i++) {
ipos = floor(si * pp);
b = mix(b, subdivision_bum(sc, ipos), deform);
float io = 2. * PI * i / n;
pp *= 1. + 0.00045 * chaos * cos(io + 0.1 * uTime + 4. * PI * b);
}
return pp - direction;
}
// Copyright Inigo Quilez, 2013 - https://iquilezles.org/
// I am the sole copyright owner of this Work.
// You cannot host, display, distribute or share this Work in any form,
// including physical and digital. You cannot use this Work in any
// commercial or non-commercial product, website or project. You cannot
// sell this Work and you cannot mint an NFTs of it.
// I share this Work for educational purposes, and you can link to it,
// through an URL, proper attribution and unmodified screenshot, as part
// of your educational material. If these conditions are too restrictive
// please contact me and we'll definitely work it out.
uniform float warping_angle;
uniform float warping_color;
float warping_noise(vec2 p) {
return sin(p.x)*sin(p.y);
}
float warping_fbm4(vec2 p, mat2 m) {
float f = 0.0;
f += 0.5000*warping_noise(p); p = m*p*2.02;
f += 0.2500*warping_noise(p); p = m*p*2.03;
f += 0.1250*warping_noise(p); p = m*p*2.01;
f += 0.0625*warping_noise(p);
return f/0.9375;
}
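// The divisor 0.9375 in warping_fbm4 is simply the sum of its octave
// weights (0.5 + 0.25 + 0.125 + 0.0625), which renormalizes the
// accumulated noise. A CPU-side sketch of that normalization (Python,
// illustrative only — not part of the shader):

```python
# Octave weights used by warping_fbm4: each octave contributes half the
# amplitude of the previous one, so n octaves sum to 1 - 0.5**n.
def fbm_norm(octaves: int) -> float:
    return sum(0.5 ** (i + 1) for i in range(octaves))

print(fbm_norm(4))  # 0.9375, the divisor in warping_fbm4
```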
float warping_fbm6(vec2 p, mat2 m) {
float f = 0.0;
f += 0.500000*(0.5+0.5*warping_noise(p)); p = m*p*2.02;
f += 0.250000*(0.5+0.5*warping_noise(p)); p = m*p*2.03;
f += 0.125000*(0.5+0.5*warping_noise(p)); p = m*p*2.01;
f += 0.062500*(0.5+0.5*warping_noise(p)); p = m*p*2.04;
f += 0.031250*(0.5+0.5*warping_noise(p)); p = m*p*2.01;
f += 0.015625*(0.5+0.5*warping_noise(p));
return f/0.96875;
}
vec2 warping_fbm4_2(vec2 p, mat2 m) {
return vec2(warping_fbm4(p, m), warping_fbm4(p+vec2(7.8), m));
}
vec2 warping_fbm6_2(vec2 p, mat2 m) {
return vec2(warping_fbm6(p+vec2(16.8), m), warping_fbm6(p+vec2(11.5), m));
}
vec4 warping(vec2 st, float angle, float c) {
float sa = sin(angle);
float ca = cos(angle);
mat2 m = mat2(ca, sa, -sa, ca);
vec2 q = vec2(st);
q += 0.03*sin(vec2(0.27, 0.23)*uTime + length(q)*vec2(4.1, 4.3));
vec2 o = warping_fbm4_2(0.9*q, m);
o += 0.04*sin(vec2(0.12, 0.14)*uTime + length(o));
vec2 n = warping_fbm6_2(3.0*o, m);
vec4 on = vec4(o, n);
float f = 0.5 + 0.5*warping_fbm4(1.8*q + 6.0*n, m);
f = mix(f, f*f*f*3.5, f*abs(n.x));
vec3 col = vec3(0.0);
col = mix(vec3(0.2, 0.1, 0.4), vec3(0.3, 0.05, 0.05), f);
col = mix(col, vec3(0.9, 0.9, 0.9), dot(on.zw, on.zw));
col = mix(col, vec3(0.4, 0.3, 0.3), 0.2 + 0.5*on.y*on.y);
col = mix(col, vec3(0.0, 0.2, 0.4), 0.5*smoothstep(1.2, 1.3, abs(on.z)+abs(on.w)));
col = clamp(col*f*2.0, 0.0, 1.0);
vec3 hsv = rgb2hsv(col);
hsv.x = fract(hsv.x + c);
return vec4(hsv2rgb(hsv), 1.0);
}
uniform float vignette_v;
vec4 vignette(vec4 color, vec2 q, float v)
{
color.rgb *= 0.3 + 0.8 * pow(16.0 * q.x * q.y * (1.0 - q.x) * (1.0 - q.y), v);
return color;
}
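// The expression 16.0*q.x*q.y*(1.0-q.x)*(1.0-q.y) equals 1.0 at the
// image center and 0.0 on the borders, so vignette() brightens the
// middle slightly and darkens the corners; v sets the falloff. A quick
// CPU check of that weighting (Python, illustrative only):

```python
# Vignette weighting from vignette(): peaks at the image center
# (x = y = 0.5) and falls to the 0.3 floor on the borders.
def vignette_factor(x: float, y: float, v: float) -> float:
    return 0.3 + 0.8 * (16.0 * x * y * (1.0 - x) * (1.0 - y)) ** v

print(vignette_factor(0.5, 0.5, 0.5))  # 1.1 at the center
print(vignette_factor(0.0, 0.5, 0.5))  # 0.3 at an edge
```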
uniform float cmyk_smooth;
uniform float cmyk_thres;
vec3 cmyk_color(float x) {
float factor = fract(x) * 4.;
float f0 = smoothstep(0., cmyk_smooth, factor);
float f1 = smoothstep(0., cmyk_smooth, factor - 1.);
float f2 = smoothstep(0., cmyk_smooth, factor - 2.);
float f3 = smoothstep(0., cmyk_smooth, factor - 3.);
// NOTE: the original return statement did not survive in this dump; the
// band-to-CMYK mapping below is a reconstruction and may not match the
// original shader exactly (cmyk_thres is left unused, as recovered).
vec3 col = mix(vec3(0., 1., 1.), vec3(1., 0., 1.), f0); // cyan -> magenta
col = mix(col, vec3(1., 1., 0.), f1); // -> yellow
col = mix(col, vec3(0.), f2); // -> black (key)
col = mix(col, vec3(0., 1., 1.), f3); // wrap back toward cyan
return col;
}
uniform float palettes_speed;
uniform float palettes_shadow;
uniform float palettes_color;
vec3 palettes_pal( float t, vec3 a, vec3 b, vec3 c, vec3 d )
{
return a + b*cos( 6.28318*(c*t+d) );
}
uniform float rotRects_grid;
uniform float rotRects_period;
uniform float rotRects_w;
uniform float rotRects_h;
mat2 rotRects_rot(float a) {
return mat2(cos(a), -sin(a), sin(a), cos(a));
}
float rotRects_rect(vec2 p, vec2 c) {
vec2 d = abs(p) - c;
return smoothstep(1., -1., max(d.x, d.y) * uResolution.y);
}
float rotRects_triWave(float n, float grid_divn1) {
return abs(mod(n + grid_divn1, 2. * grid_divn1) - grid_divn1) / max(grid_divn1 - 1., 1.);
}
float rotRects_(vec2 p, float mode) {
p += mode / rotRects_grid / 4.;
p *= rotRects_grid;
float grid_divn1 = rotRects_grid - 1.;
vec2 pi_ = p + .5 * grid_divn1;
vec2 pi = floor(pi_) + step(vec2(0.5), fract(pi_));
float n = mode < 0. ? pi.x + pi.y : pi.x - pi.y;
float angle = PI * (2. / rotRects_period * uTime + rotRects_triWave(n, grid_divn1));
if (mode > 0.) angle -= PI / 2.;
p = rotRects_rot(angle) * (fract(p + .5 * mod(rotRects_grid, 2.)) - .5);
return rotRects_rect(p / rotRects_grid,
.5 * vec2(rotRects_w, rotRects_h) / rotRects_grid);
}
vec4 rotRects(vec2 p, vec2 uv)
{
vec2 pp = vec2(-1., 1.) * p;
float val = .008 + rotRects_(pp, 1.) + rotRects_(pp, -1.);
val = pow(val, 1. / 2.2);
val *= 0.5 + 0.5 * pow(16.0 * uv.x * uv.y * (1.0 - uv.x) * (1.0 - uv.y), 0.2);
return vec4(val, val, val, 1.0);
}
// ---------------------------------------------------------------------
uniform float rotRects_grid;
uniform float rotRects_period;
uniform float rotRects_w;
uniform float rotRects_h;
mat2 rotRects_rot(float a) {
return mat2(cos(a), -sin(a), sin(a), cos(a));
}
float rotRects_rect(vec2 p, vec2 c) {
vec2 d = abs(p) - c;
return smoothstep(1., -1., max(d.x, d.y) * uResolution.y);
}
float rotRects_triWave(float n, float grid_divn1) {
return abs(mod(n + grid_divn1, 2. * grid_divn1) - grid_divn1) / max(grid_divn1 - 1., 1.);
}
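// rotRects_triWave is a plain triangle wave with period 2*grid_divn1,
// peaking at grid_divn1, then rescaled by max(grid_divn1 - 1, 1); it
// staggers the rotation phase across grid diagonals. The same function
// on the CPU (Python, illustrative only):

```python
# Triangle wave matching rotRects_triWave: period 2*g, peak value g,
# rescaled by max(g - 1, 1). GLSL mod(x, y) agrees with Python % here.
def tri_wave(n: float, g: float) -> float:
    return abs((n + g) % (2.0 * g) - g) / max(g - 1.0, 1.0)

# With g = 3 the wave ramps 0 -> 1.5 -> 0 over n = 0..6:
print([tri_wave(float(n), 3.0) for n in range(7)])
# [0.0, 0.5, 1.0, 1.5, 1.0, 0.5, 0.0]
```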
float rotRects_(vec2 p, float mode) {
p += mode / rotRects_grid / 4.;
p *= rotRects_grid;
float grid_divn1 = rotRects_grid - 1.;
vec2 pi_ = p + .5 * grid_divn1;
vec2 pi = floor(pi_) + step(vec2(0.5), fract(pi_));
float n = mode < 0. ? pi.x + pi.y : pi.x - pi.y;
float angle = PI * (2. / rotRects_period * uTime + rotRects_triWave(n, grid_divn1));
if (mode > 0.) angle -= PI / 2.;
p = rotRects_rot(angle) * (fract(p + .5 * mod(rotRects_grid, 2.)) - .5);
return rotRects_rect(p / rotRects_grid,
.5 * vec2(rotRects_w, rotRects_h) / rotRects_grid);
}
vec4 rotRects(vec2 p, vec2 uv)
{
vec2 pp = vec2(-1., 1.) * p;
float val = .008 + rotRects_(pp, 1.) + rotRects_(pp, -1.);
val = pow(val, 1. / 2.2);
val *= 0.5 + 0.5 * pow(16.0 * uv.x * uv.y * (1.0 - uv.x) * (1.0 - uv.y), 0.2);
return vec4(val, val, val, 1.0);
}
// Author: Rigel
// Shader: Mystic Flower
// licence: https://creativecommons.org/licenses/by/4.0/
uniform float mysticFlower_disto;
uniform float mysticFlower_disti;
// noise in 2d
float mysticFlower_noise(vec2 p) {
vec2 i = floor(p);
vec2 f = fract(p);
vec2 u = f*f*(3.0-2.0*f);
return mix(mix(hash_2(i + vec2(0.0, 0.0)), hash_2(i + vec2(1.0, 0.0)), u.x),
mix(hash_2(i + vec2(0.0, 1.0)), hash_2(i + vec2(1.0, 1.0)), u.x), u.y);
}
// fractal noise in 2d
float mysticFlower_fbm (vec2 p) {
const mat2 m = mat2(0.8, 0.6, -0.6, 0.8);
float f = 0.0;
f += 0.5000*mysticFlower_noise(p); p*=m*2.02;
f += 0.2500*mysticFlower_noise(p); p*=m*2.04;
f += 0.1250*mysticFlower_noise(p); p*=m*2.03;
f += 0.0650*mysticFlower_noise(p); p*=m*2.01;
// normalize f;
f /= 0.9375;
return f*2.0-1.0;
}
vec2 mysticFlower(vec2 st, float distort, float distinct) {
vec2 p = st * vec2(1.5);
// angle and radius to center 0,0
float a = atan(p.y, abs(p.x));
float r = length(p);
// space distortion
float f = mysticFlower_fbm(vec2(a*2.+uTime*.1, r*.4-uTime*.3));
f = pow(abs(f), distinct) * sign(f);
p += vec2(f)*distort;
return p;
}
uniform float sakura_blur;
uniform float sakura_color;
// Borrowed from BigWings
vec4 sakura_N14(float t) {
return fract(sin(t*vec4(123., 104., 145., 24.))*vec4(657., 345., 879., 154.));
}
// Computes the RGB and alpha of a single flower in its own UV space
vec4 sakura_(vec2 uv, vec2 id, float blur)
{
float time = uTime + 45.0; // time is offset so the flowers are not all aligned at the start
vec4 rnd = sakura_N14(mod(id.x, 500.0) * 5.4 + mod(id.y, 500.0) * 13.67); // get 4 random numbers per flower
// Offset the flower from the center in a random Lissajous pattern
uv *= mix(0.75, 1.3, rnd.y);
uv.x += sin(time * rnd.z * 0.3) * 0.6;
uv.y += sin(time * rnd.w * 0.45) * 0.4;
// Computes the angle of the flower with a random rotation speed
float angle = atan(uv.y, uv.x) + rnd.x * 421.47 + uTime * mix(-0.6, 0.6, rnd.x);
// euclidean distance to the center of the flower
float dist = length(uv);
// Flower-shaped distance function from the center
float petal = 1.0 - abs(sin(angle * 2.5));
float sqPetal = petal * petal;
petal = mix(petal, sqPetal, 0.7);
float petal2 = 1.0 - abs(sin(angle * 2.5 + 1.5));
petal += petal2 * 0.2;
float sakuraDist = dist + petal * 0.25;
// Compute a blurry shadow mask.
float shadowblur = 0.3;
float shadow = smoothstep(0.5 + shadowblur, 0.5 - shadowblur, sakuraDist) * 0.4;
//Computes the sharper mask of the flower
float sakuraMask = smoothstep(0.5 + blur, 0.5 - blur, sakuraDist);
// The flower has a pink hue and is lighter in the center
vec3 hsv = rgb2hsv(vec3(1.0, 0.6, 0.7));
hsv.x = fract(hsv.x + sakura_color);
vec3 sakuraCol = hsv2rgb(hsv);
sakuraCol += (0.5 - dist) * 0.2;
// Computes the border mask of the flower
vec3 outlineCol = vec3(1.0, 0.3, 0.3);
float outlineMask = smoothstep(0.5 - blur, 0.5, sakuraDist + 0.045);
// Defines a tiling polar space for the pistil pattern
float polarSpace = angle * 1.9098 + 0.5;
float polarPistil = fract(polarSpace) - 0.5; // 12 / (2 * pi)
// Round dot in the center
outlineMask += smoothstep(0.035 + blur, 0.035 - blur, dist);
float petalBlur = blur * 2.0;
float pistilMask = smoothstep(0.12 + blur, 0.12, dist) * smoothstep(0.05, 0.05 + blur , dist);
// Compute the pistil 'bars' in polar space
float barW = 0.2 - dist * 0.7;
float pistilBar = smoothstep(-barW, -barW + petalBlur, polarPistil) * smoothstep(barW + petalBlur, barW, polarPistil);
// Compute the little dots in polar space
float pistilDotLen = length(vec2(polarPistil * 0.10, dist) - vec2(0, 0.16)) * 9.0;
float pistilDot = smoothstep(0.1 + petalBlur, 0.1 - petalBlur, pistilDotLen);
//combines the middle and border colors
outlineMask += pistilMask * pistilBar + pistilDot;
sakuraCol = mix(sakuraCol, outlineCol, clamp(outlineMask,0.0,1.0) * 0.5);
//sets the background to the shadow color
sakuraCol = mix(vec3(0.2, 0.2, 0.8) * shadow, sakuraCol, sakuraMask);
//incorporates the shadow mask into alpha channel
sakuraMask = clamp(sakuraMask + shadow,0.0,1.0);
//returns the flower in pre-multiplied rgba
return vec4(sakuraCol, sakuraMask);
}
// blends a pre-multiplied src onto a dst color (without alpha)
vec3 sakura_premulMix(vec4 src, vec3 dst)
{
return dst.rgb * (1.0 - src.a) + src.rgb;
}
// blends a pre-multiplied src onto a dst color (with alpha)
vec4 sakura_premulMix(vec4 src, vec4 dst)
{
vec4 res;
res.rgb = sakura_premulMix(src, dst.rgb);
res.a = 1.0 - (1.0 - src.a) * (1.0 - dst.a);
return res;
}
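// sakura_premulMix is the standard "over" operator for premultiplied
// alpha: the destination is attenuated by (1 - src.a) and the
// premultiplied source color is added, with coverage accumulating as
// 1 - (1 - sa)*(1 - da). The same compositing step on plain tuples
// (Python, illustrative only):

```python
# 'over' compositing with a premultiplied source, as in sakura_premulMix.
def premul_over(src, dst):
    sr, sg, sb, sa = src
    dr, dg, db, da = dst
    rgb = [d * (1.0 - sa) + s for s, d in zip((sr, sg, sb), (dr, dg, db))]
    return (*rgb, 1.0 - (1.0 - sa) * (1.0 - da))

# A fully opaque source simply replaces the destination:
print(premul_over((0.9, 0.5, 0.6, 1.0), (0.0, 0.0, 1.0, 0.5)))
# (0.9, 0.5, 0.6, 1.0)
```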
// Computes a Layer of flowers
vec4 sakura_layer(vec2 uv, float blur)
{
vec2 cellUV = fract(uv) - 0.5;
vec2 cellId = floor(uv);
vec4 accum = vec4(0.0);
// the flowers can overlap on the 9 neighboring cells so we blend them all together on each cell
for (float y = -1.0; y <= 1.0; y++)
{
for (float x = -1.0; x <= 1.0; x++)
{
vec2 offset = vec2(x, y);
vec4 sakura = sakura_(cellUV - offset, cellId + offset, blur);
accum = sakura_premulMix(sakura, accum);
}
}
return accum;
}
vec4 sakura(vec2 st, vec4 inc, float inb)
{
// Scroll the UV with a cosine oscillation
vec2 p =vec2(st);
p.y += uTime * 0.1;
p.x -= uTime * 0.03 + sin(uTime) * 0.1;
p *= 4.3;
vec3 col = inc.rgb;
// Compute a tilt-shift-like blur factor
float blur = abs(st.y);
blur *= blur * 0.15;
// Computes several layers with various degrees of blur and scale
vec4 layer1 = sakura_layer(p, inb + blur);
// Blend it all together
col = sakura_premulMix(layer1, col);
return vec4(col,inc.a);
}
// The MIT License
//
uniform float palettes_speed;
uniform float palettes_shadow;
uniform float palettes_color;
vec3 palettes_pal( float t, vec3 a, vec3 b, vec3 c, vec3 d )
{
return a + b*cos( 6.28318*(c*t+d) );
}
vec4 palettes(vec2 st, float speed, float shadow)
{
// animate
vec2 p = abs(st);
p.x += speed*uTime;
// compute colors
vec3 col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,1.0),vec3(0.0,0.33,0.67) );
if( p.y>(1.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,1.0),vec3(0.0,0.10,0.20) );
if( p.y>(2.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,1.0),vec3(0.3,0.20,0.20) );
if( p.y>(3.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,0.5),vec3(0.8,0.90,0.30) );
if( p.y>(4.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,0.7,0.4),vec3(0.0,0.15,0.20) );
if( p.y>(5.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(2.0,1.0,0.0),vec3(0.5,0.20,0.25) );
if( p.y>(6.0/7.0) ) col = palettes_pal( p.x, vec3(0.8,0.5,0.4),vec3(0.2,0.4,0.2),vec3(2.0,1.0,1.0),vec3(0.0,0.25,0.25) );
// band
float f = fract(p.y*7.0);
// borders
col *= smoothstep( 0.49, 0.47, abs(f-0.5) );
// shadowing
col *= mix(1.0, sqrt(4.0*f*(1.0-f)), shadow);
return vec4( col, 1.0 );
}
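// palettes_pal is the familiar cosine-gradient formula
// a + b*cos(2*pi*(c*t + d)); the GLSL uses 6.28318 as an approximation
// of 2*pi. Evaluated on the CPU (Python, illustrative only):

```python
import math

# Cosine gradient a + b*cos(2*pi*(c*t + d)), applied per channel.
def pal(t, a, b, c, d):
    return tuple(ai + bi * math.cos(2.0 * math.pi * (ci * t + di))
                 for ai, bi, ci, di in zip(a, b, c, d))

# At t = 0 with zero phase every channel sits at a + b:
print(pal(0.0, (0.5,) * 3, (0.5,) * 3, (1.0,) * 3, (0.0,) * 3))
# (1.0, 1.0, 1.0)
```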
void main() {
vec2 uv = vec2(textureCoordinate);
vec2 st = vec2(uv.x, 1.0 - uv.y);
st = vec2(cropArea[2] * uv.x + cropArea[0], 1.0 - cropArea[3] * uv.y - cropArea[1]);
vec4 color = vec4(step(st.x, -1.0));
color = rotRects(st, uv);
st = mysticFlower(st, mysticFlower_disto, mysticFlower_disti);
color = sakura(st, color, sakura_blur);
color = palettes(st, palettes_speed, palettes_shadow);
color = vec4(0.0, 0.0, 0.0, 1.0) + color - vec4(0.0, 0.0, 0.0, 1.0) * color.a;
gl_FragColor = color;
}
let seed = 3382;
let canvasWidth = 980;
let canvasHeight = 980;
let account = 100;
let size = 2.675789;
let repeatType = 0;
let speed = 1.720000;
let colorbg = "#000000";
let colors0 = ["#B68EF2","#EC71F2","#8664FA","#0452FA","#54CAC0"];
let colors1 = ["#94F753","#FFCF88","#7D2F3C","#EF9D8A","#58BC68"];
let colors2 = ["#64CAC6","#C9B2B8","#7D2F3C","#F28830","#F28830"];
let colors3 = ["#E1CAC0","#EF9D8A","#7D2F3C","#FAACAD","#F28830"];
let colors4 = ["#C9B2B8","#F28830","#7D2F3C","#FD7E53","#FAACAD"];
let colors5 = ["#FA0E6A","#FAACAD","#F7BC68","#E1CAC0","#000000"];
let colors6 = ["#FBCAC6","#C9B2B8","#F28830","#7D2F3C","#E1CAC0"];
let colors7 = ["#FF8188","#F28830","#FF8188","#EF9D8A","#000000"];
//let colorbg = '#F2F2F2';
//let speed = 1.0;
//let size = 1.0; // 0.2 -3.0
//let account = 100; // 1 - 100
//let repeatType = 0;
//let colors0 = ["#4596c7", "#6d8370", "#e45240", "#21d3a4", "#3303f9"];
//let colors1 = ["#cd2220", "#173df6", "#244ca8", "#a00360", "#b31016"];
//let colors2 = ["#7382ce", "#9fb7f4", "#12177d", "#9bb5e9", "#7486af"];
//let colors3 = ["#82d362", "#5c5190", "#6c6dd1", "#3d6966", "#5967ca"];
//let colors4 = ["#8c75ff", "#c553d2", "#2dfd60", "#2788f5", "#23054f"];
//let colors5 = ["#f21252", "#8834f1", "#c4dd92", "#184fd3", "#f9fee2"];
//let colors6 = ["#2E294E", "#541388", "#F1E9DA", "#FFD400", "#D90368"];
//let colors7 = ["#1b1b1b", "#292929", "#f3f3f3", "#222222", "#ff0000"];
uniform float tonemap_exposure;
vec4 tonemap(vec4 inc, float exposure) {
vec3 col = smoothstep(0.0, 1.0, 1.0 - exp(-inc.rgb * exposure));
// sRGB Color Component Transfer: https://www.color.org/chardata/rgb/sRGB.pdf
col = vec3(
col.r > 0.0031308 ? (pow(col.r, 1.0 / 2.4) * 1.055) - 0.055 : col.r * 12.92,
col.g > 0.0031308 ? (pow(col.g, 1.0 / 2.4) * 1.055) - 0.055 : col.g * 12.92,
col.b > 0.0031308 ? (pow(col.b, 1.0 / 2.4) * 1.055) - 0.055 : col.b * 12.92);
return vec4(clamp(col, 0.0, 1.0), inc.a);
}
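// tonemap() applies an exposure curve and then the piecewise sRGB
// component transfer cited in the comment above: a 12.92 linear segment
// below 0.0031308 and a 1/2.4-power segment above it. The transfer on
// its own (Python, illustrative only):

```python
# Piecewise sRGB encoding, matching the per-channel branches in tonemap().
def srgb_encode(c: float) -> float:
    if c > 0.0031308:
        return 1.055 * c ** (1.0 / 2.4) - 0.055
    return 12.92 * c

print(srgb_encode(0.0))  # 0.0
print(srgb_encode(1.0))  # ~1.0
```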
vec2 pos__trans(vec2 uv) {
vec2 p = -1. + 2. * uv;
p.x *= uResolution.x/uResolution.y;
return p;
}
// ---------------------------------------------------------------------
uniform float rotRects_grid;
uniform float rotRects_period;
uniform float rotRects_w;
uniform float rotRects_h;
mat2 rotRects_rot(float a) {
return mat2(cos(a), -sin(a), sin(a), cos(a));
}
float rotRects_rect(vec2 p, vec2 c) {
vec2 d = abs(p) - c;
return smoothstep(1., -1., max(d.x, d.y) * uResolution.y);
}
float rotRects_triWave(float n, float grid_divn1) {
return abs(mod(n + grid_divn1, 2. * grid_divn1) - grid_divn1) / max(grid_divn1 - 1., 1.);
}
float rotRects_(vec2 p, float mode) {
p += mode / rotRects_grid / 4.;
p *= rotRects_grid;
float grid_divn1 = rotRects_grid - 1.;
vec2 pi_ = p + .5 * grid_divn1;
vec2 pi = floor(pi_) + step(vec2(0.5), fract(pi_));
float n = mode < 0. ? pi.x + pi.y : pi.x - pi.y;
float angle = PI * (2. / rotRects_period * uTime + rotRects_triWave(n, grid_divn1));
if (mode > 0.) angle -= PI / 2.;
p = rotRects_rot(angle) * (fract(p + .5 * mod(rotRects_grid, 2.)) - .5);
return rotRects_rect(p / rotRects_grid,
.5 * vec2(rotRects_w, rotRects_h) / rotRects_grid);
}
vec4 rotRects(vec2 p, vec2 uv)
{
vec2 pp = vec2(-1., 1.) * p;
float val = .008 + rotRects_(pp, 1.) + rotRects_(pp, -1.);
val = pow(val, 1. / 2.2);
val *= 0.5 + 0.5 * pow(16.0 * uv.x * uv.y * (1.0 - uv.x) * (1.0 - uv.y), 0.2);
return vec4(val, val, val, 1.0);
}
// noise in 2d
float mysticFlower_noise(vec2 p) {
vec2 i = floor(p);
vec2 f = fract(p);
vec2 u = f*f*(3.0-2.0*f);
return mix(mix(hash_2(i + vec2(0.0, 0.0)), hash_2(i + vec2(1.0, 0.0)), u.x),
mix(hash_2(i + vec2(0.0, 1.0)), hash_2(i + vec2(1.0, 1.0)), u.x), u.y);
}
// fractal noise in 2d
float mysticFlower_fbm (vec2 p) {
const mat2 m = mat2(0.8, 0.6, -0.6, 0.8);
float f = 0.0;
f += 0.5000*mysticFlower_noise(p); p*=m*2.02;
f += 0.2500*mysticFlower_noise(p); p*=m*2.04;
f += 0.1250*mysticFlower_noise(p); p*=m*2.03;
f += 0.0650*mysticFlower_noise(p); p*=m*2.01;
// normalize f;
f /= 0.9375;
return f*2.0-1.0;
}
vec2 mysticFlower(vec2 st, float distort, float distinct) {
vec2 p = st * vec2(1.5);
// angle and radius to center 0,0
float a = atan(p.y, abs(p.x));
float r = length(p);
// space distortion
float f = mysticFlower_fbm(vec2(a*2.+uTime*.1, r*.4-uTime*.3));
f = pow(abs(f), distinct) * sign(f);
p += vec2(f)*distort;
return p;
}
// Sakura Bliss by Philippe Desgranges
// Email: Philippe.desgranges@gmail.com
// License Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.
uniform float sakura_blur;
uniform float sakura_color;
// Borrowed from BigWings
vec4 sakura_N14(float t) {
return fract(sin(t*vec4(123., 104., 145., 24.))*vec4(657., 345., 879., 154.));
}
// Computes the RGB and alpha of a single flower in its own UV space
vec4 sakura_(vec2 uv, vec2 id, float blur)
{
float time = uTime + 45.0; // time is offset so the flowers are not all aligned at the start
vec4 rnd = sakura_N14(mod(id.x, 500.0) * 5.4 + mod(id.y, 500.0) * 13.67); // get 4 random numbers per flower
// Offset the flower from the center in a random Lissajous pattern
uv *= mix(0.75, 1.3, rnd.y);
uv.x += sin(time * rnd.z * 0.3) * 0.6;
uv.y += sin(time * rnd.w * 0.45) * 0.4;
// Computes the angle of the flower with a random rotation speed
float angle = atan(uv.y, uv.x) + rnd.x * 421.47 + uTime * mix(-0.6, 0.6, rnd.x);
// euclidean distance to the center of the flower
float dist = length(uv);
// Flower-shaped distance function from the center
float petal = 1.0 - abs(sin(angle * 2.5));
float sqPetal = petal * petal;
petal = mix(petal, sqPetal, 0.7);
float petal2 = 1.0 - abs(sin(angle * 2.5 + 1.5));
petal += petal2 * 0.2;
float sakuraDist = dist + petal * 0.25;
// Compute a blurry shadow mask.
float shadowblur = 0.3;
float shadow = smoothstep(0.5 + shadowblur, 0.5 - shadowblur, sakuraDist) * 0.4;
//Computes the sharper mask of the flower
float sakuraMask = smoothstep(0.5 + blur, 0.5 - blur, sakuraDist);
// The flower has a pink hue and is lighter in the center
vec3 hsv = rgb2hsv(vec3(1.0, 0.6, 0.7));
hsv.x = fract(hsv.x + sakura_color);
vec3 sakuraCol = hsv2rgb(hsv);
sakuraCol += (0.5 - dist) * 0.2;
// Computes the border mask of the flower
vec3 outlineCol = vec3(1.0, 0.3, 0.3);
float outlineMask = smoothstep(0.5 - blur, 0.5, sakuraDist + 0.045);
// Defines a tiling polar space for the pistil pattern
float polarSpace = angle * 1.9098 + 0.5;
float polarPistil = fract(polarSpace) - 0.5; // 12 / (2 * pi)
// Round dot in the center
outlineMask += smoothstep(0.035 + blur, 0.035 - blur, dist);
float petalBlur = blur * 2.0;
float pistilMask = smoothstep(0.12 + blur, 0.12, dist) * smoothstep(0.05, 0.05 + blur , dist);
// Compute the pistil 'bars' in polar space
float barW = 0.2 - dist * 0.7;
float pistilBar = smoothstep(-barW, -barW + petalBlur, polarPistil) * smoothstep(barW + petalBlur, barW, polarPistil);
// Compute the little dots in polar space
float pistilDotLen = length(vec2(polarPistil * 0.10, dist) - vec2(0, 0.16)) * 9.0;
float pistilDot = smoothstep(0.1 + petalBlur, 0.1 - petalBlur, pistilDotLen);
//combines the middle and border colors
outlineMask += pistilMask * pistilBar + pistilDot;
sakuraCol = mix(sakuraCol, outlineCol, clamp(outlineMask,0.0,1.0) * 0.5);
//sets the background to the shadow color
sakuraCol = mix(vec3(0.2, 0.2, 0.8) * shadow, sakuraCol, sakuraMask);
//incorporates the shadow mask into alpha channel
sakuraMask = clamp(sakuraMask + shadow,0.0,1.0);
//returns the flower in pre-multiplied rgba
return vec4(sakuraCol, sakuraMask);
}
// blends a pre-multiplied src onto a dst color (without alpha)
vec3 sakura_premulMix(vec4 src, vec3 dst)
{
return dst.rgb * (1.0 - src.a) + src.rgb;
}
// blends a pre-multiplied src onto a dst color (with alpha)
vec4 sakura_premulMix(vec4 src, vec4 dst)
{
vec4 res;
res.rgb = sakura_premulMix(src, dst.rgb);
res.a = 1.0 - (1.0 - src.a) * (1.0 - dst.a);
return res;
}
// Computes a Layer of flowers
vec4 sakura_layer(vec2 uv, float blur)
{
vec2 cellUV = fract(uv) - 0.5;
vec2 cellId = floor(uv);
vec4 accum = vec4(0.0);
// the flowers can overlap on the 9 neighboring cells so we blend them all together on each cell
for (float y = -1.0; y <= 1.0; y++)
{
for (float x = -1.0; x <= 1.0; x++)
{
vec2 offset = vec2(x, y);
vec4 sakura = sakura_(cellUV - offset, cellId + offset, blur);
accum = sakura_premulMix(sakura, accum);
}
}
return accum;
}
vec4 sakura(vec2 st, vec4 inc, float inb)
{
// Scroll the UV with a cosine oscillation
vec2 p =vec2(st);
p.y += uTime * 0.1;
p.x -= uTime * 0.03 + sin(uTime) * 0.1;
p *= 4.3;
vec3 col = inc.rgb;
// Compute a tilt-shift-like blur factor
float blur = abs(st.y);
blur *= blur * 0.15;
// Computes several layers with various degrees of blur and scale
vec4 layer1 = sakura_layer(p, inb + blur);
// Blend it all together
col = sakura_premulMix(layer1, col);
return vec4(col,inc.a);
}
void main() {
vec2 uv = vec2(textureCoordinate);
vec2 st = vec2(uv.x, 1.0 - uv.y);
st = vec2(cropArea[2] * uv.x + cropArea[0], 1.0 - cropArea[3] * uv.y - cropArea[1]);
vec4 color = vec4(step(st.x, -1.0));
color = rotRects(st, uv);
st = mysticFlower(st, mysticFlower_disto, mysticFlower_disti);
color = sakura(st, color, sakura_blur);
color = vec4(0.0, 0.0, 0.0, 1.0) + color - vec4(0.0, 0.0, 0.0, 1.0) * color.a;
gl_FragColor = color;
}
// ---------------------------------------------------------------------
uniform float rotRects_grid;
uniform float rotRects_period;
uniform float rotRects_w;
uniform float rotRects_h;
mat2 rotRects_rot(float a) {
return mat2(cos(a), -sin(a), sin(a), cos(a));
}
float rotRects_rect(vec2 p, vec2 c) {
vec2 d = abs(p) - c;
return smoothstep(1., -1., max(d.x, d.y) * uResolution.y);
}
float rotRects_triWave(float n, float grid_divn1) {
return abs(mod(n + grid_divn1, 2. * grid_divn1) - grid_divn1) / max(grid_divn1 - 1., 1.);
}
float rotRects_(vec2 p, float mode) {
p += mode / rotRects_grid / 4.;
p *= rotRects_grid;
float grid_divn1 = rotRects_grid - 1.;
vec2 pi_ = p + .5 * grid_divn1;
vec2 pi = floor(pi_) + step(vec2(0.5), fract(pi_));
float n = mode < 0. ? pi.x + pi.y : pi.x - pi.y;
float angle = PI * (2. / rotRects_period * uTime + rotRects_triWave(n, grid_divn1));
if (mode > 0.) angle -= PI / 2.;
p = rotRects_rot(angle) * (fract(p + .5 * mod(rotRects_grid, 2.)) - .5);
return rotRects_rect(p / rotRects_grid,
.5 * vec2(rotRects_w, rotRects_h) / rotRects_grid);
}
vec4 rotRects(vec2 p, vec2 uv)
{
vec2 pp = vec2(-1., 1.) * p;
float val = .008 + rotRects_(pp, 1.) + rotRects_(pp, -1.);
val = pow(val, 1. / 2.2);
val *= 0.5 + 0.5 * pow(16.0 * uv.x * uv.y * (1.0 - uv.x) * (1.0 - uv.y), 0.2);
return vec4(val, val, val, 1.0);
}
uniform float moon_dark;
uniform float moon_haze;
uniform float moon_x;
uniform float moon_y;
uniform float moon_radius;
uniform float moon_light;
uniform float moon_clear;
vec3 moon_noise(vec2 p)
{
vec4 w = vec4(
floor(p),
ceil (p) );
vec3
_00 = hash3_2(w.xy),
_01 = hash3_2(w.xw),
_10 = hash3_2(w.zy),
_11 = hash3_2(w.zw),
_0 = mix(_00,_01,fract(p.y)),
_1 = mix(_10,_11,fract(p.y));
return mix(_0,_1,fract(p.x));
}
vec3 moon_fbm(vec2 p)
{
vec3 w = vec3(0);
float N = 5.;
mat2 ei = mat2(cos(.5),-sin(.5),sin(.5),cos(.5))*1.7;
for (float i = 1.; i < N; i++)
{
p *= ei;
w += moon_noise(p)/N/i;
}
return w;
}
vec4 moon(vec2 U, float dark, float haze, float x, float y, float radius, float light, float clear) {
// sunset
vec4 Q = vec4(0.);
Q = 1.-dark+.4*sin(4.4-.8*U.y+vec4(1.,2.,3.,4.));
Q += haze*moon_fbm(2.*U).x;
// Moon
vec2 r = U-vec2(x,y);
float l = length(r);
float L = radius;
vec3 n = moon_fbm(8.*U);
Q += vec4(light+n.x)*exp(-clear*max(l-L,0.));
Q += .3*moon_fbm(60.*U).x*Q;
Q = clamp(Q,0.,1.);
return Q;
}
//
uniform float palettes_speed;
uniform float palettes_shadow;
uniform float palettes_color;
vec3 palettes_pal( float t, vec3 a, vec3 b, vec3 c, vec3 d )
{
return a + b*cos( 6.28318*(c*t+d) );
}
vec4 palettes(vec2 st, float speed, float shadow)
{
// animate
vec2 p = abs(st);
p.x += speed*uTime;
// compute colors
vec3 col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,1.0),vec3(0.0,0.33,0.67) );
if( p.y>(1.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,1.0),vec3(0.0,0.10,0.20) );
if( p.y>(2.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,1.0),vec3(0.3,0.20,0.20) );
if( p.y>(3.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,1.0,0.5),vec3(0.8,0.90,0.30) );
if( p.y>(4.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(1.0,0.7,0.4),vec3(0.0,0.15,0.20) );
if( p.y>(5.0/7.0) ) col = palettes_pal( p.x, vec3(0.5,0.5,0.5),vec3(0.5,0.5,0.5),vec3(2.0,1.0,0.0),vec3(0.5,0.20,0.25) );
if( p.y>(6.0/7.0) ) col = palettes_pal( p.x, vec3(0.8,0.5,0.4),vec3(0.2,0.4,0.2),vec3(2.0,1.0,1.0),vec3(0.0,0.25,0.25) );
// band
float f = fract(p.y*7.0);
// borders
col *= smoothstep( 0.49, 0.47, abs(f-0.5) );
// shadowing
col *= mix(1.0, sqrt(4.0*f*(1.0-f)), shadow);
return vec4( col, 1.0 );
}
//Tweet: https://twitter.com/XorDev/status/1519343739419959297
//Twigl: https://t.co/FELzNSfU40
//Based on "Molecules 2": https://www.shadertoy.com/view/7llBzS
uniform float shuffleMosaic_period;
uniform float shuffleMosaic_twist;
uniform float shuffleMosaic_freq;
vec2 shuffleMosaic(vec2 I, float period, float twist, float freq) {
vec3 c = vec3(0.,2.,1.);
vec3 T= mod(uTime+c,3.*period);
vec3 P = vec3(dot(vec2(-7.,4.), I), dot(vec2(0.,-8.), I), dot(vec2(7.,4.), I)) * 0.166667 + .2;
int t1 = int(mod(T.y, 3.));
int t2 = int(mod(T.y+1., 3.));
float ti = fract(mod(T.y, 3.));
float tc = mix(P[t1], P[t2], ti);
P+=(T*twist-sin(T*6.283)*0.166667).x*sin(T*3.14*freq)*cos(tc*3.14);
return P.xy;
}
uniform float gradientColor_c1;
uniform float gradientColor_c2;
uniform float gradientColor_c3;
uniform float gradientColor_dir;
vec4 gradientColor(vec2 st, float c1, float c2, float c3, float dir){
vec3 col;
float d = dir * PI / 180.;
d = (st.x - 0.5) * cos(d) + (st.y - 0.5) * sin(d) + 0.5;
col.r = d + c1;
col.g = d + c2;
col.b = d + c3;
return vec4(col,1.0);
}
// Copyright Inigo Quilez, 2020 - https://iquilezles.org/
// I am the sole copyright owner of this Work.
// You cannot host, display, distribute or share this Work in any form,
// including physical and digital. You cannot use this Work in any
// commercial or non-commercial product, website or project. You cannot
// sell this Work and you cannot mint an NFTs of it.
// I share this Work for educational purposes, and you can link to it,
// through an URL, proper attribution and unmodified screenshot, as part
// of your educational material. If these conditions are too restrictive
// please contact me and we'll definitely work it out.
uniform float stripes_2_count;
float stripes_2_noise(vec2 p)
{
vec2 i = floor(p);
vec2 f = fract(p);
f = f*f*(3.0-2.0*f);
float n = i.x + i.y*57.0;
return mix(mix(hash(n+ 0.0), hash(n+ 1.0), f.x),
mix(hash(n+57.0), hash(n+58.0), f.x), f.y);
}
vec2 stripes_2_map(vec2 p, float time)
{
for (int i=0; i<4; i++)
{
float a = stripes_2_noise(p*1.5)*6.2831 + time;
p += 0.1*vec2(cos(a), sin(a));
}
return p;
}
float stripes_2_height(vec2 p, vec2 q)
{
float h = dot(p-q, p-q);
h += 0.005*stripes_2_noise(0.75*(p+q));
return h;
}
vec4 stripes_2(vec2 p, vec4 inc, float count)
{
float time = 0.25*uTime;
vec2 q = p + 0.3;
// color
float w = count*q.x;
float u = floor(w);
float f = fract(w);
vec3 col = 1.1*inc.rgb + 0.2*sin(3.0*u+vec3(5.0, 1.5, 2.0));
// filtered drop-shadow
float sha = smoothstep(0.0, 0.8, f);
// normal
vec2 eps = vec2(2.0/uResolution.y, 0.0);
float l2c = stripes_2_height(q, p);
float l2x = stripes_2_height(stripes_2_map(p+eps.xy, time), p) - l2c;
float l2y = stripes_2_height(stripes_2_map(p+eps.yx, time), p) - l2c;
vec3 nor = normalize(vec3(l2x, eps.x, l2y));
// lighting
col *= 0.4+0.6*sha;
col *= 0.8+0.2*vec3(1.0, 0.9, 0.3)*dot(nor, vec3(0.7, 0.3, 0.7));
col += 0.2*pow(nor.y, 8.0)*sha;
col *= 7.5*l2c;
return vec4(col, inc.a);
}
uniform float fluffballs_c1;
uniform float fluffballs_c2;
uniform float fluffballs_c3;
uniform float fluffballs_nor;
uniform float fluffballs_deform;
mat2 fluffballs_rot(float c) {
float s=sin(c);
return mat2(c=cos(c),s,-s,c);
}
float fluffballs_map(vec3 p) {
float d = 1e9;
p -= (hash3_3(p)-.5)*.1;
d = min(d,length(fract(p)-.5)+1.);
p.xy = (p.xy+p.yx*vec2(-1,1))/sqrt(2.);
p.xz = (p.xz+p.zx*vec2(-1,1))/sqrt(2.);
p*=.4;
p-=uTime*.3;
d = min(d,(length(fract(p)-.5))/.4);
return d;
}
vec3 fluffballs_normal(vec3 P, float E) {
return vec3(
fluffballs_map(P+vec3(E,0,0))-fluffballs_map(P-vec3(E,0,0)),
fluffballs_map(P+vec3(0,E,0))-fluffballs_map(P-vec3(0,E,0)),
fluffballs_map(P+vec3(0,0,E))-fluffballs_map(P-vec3(0,0,E))
) / (E*2.);
}
float fluffballs_trace(vec3 ro,vec3 rd) {
vec3 p = ro;
float t = 0.;
float h = -.4;
for(int i=0;i<40;i++){
t += (fluffballs_map(p)+t*h)/(1.-h);
p = ro+rd*t;
}
return t;
}
vec4 fluffballs(vec2 p, vec4 inc, float c1, float c2, float c3, float deform, float nor){
vec3 ro = vec3(sin(uTime*.2)*4.,sin(.1*uTime*1.23)*4.,-0.)+uTime;
vec3 rd = normalize(vec3(p,deform));
rd.yz*=fluffballs_rot(uTime*.37);
rd.xy*=fluffballs_rot(uTime*.4);
vec4 O = vec4(inc);
float t = fluffballs_trace(ro,rd);
vec3 pp = ro+rd*t;
vec3 n = fluffballs_normal(pp,nor);
O.xyz += vec3(1.,2.,3.)*max(dot(n,normalize(vec3(0.,1.,0.)))*.5+.5,0.)*c1;
O.xyz += vec3(4.,2.,1.)*max(dot(n,normalize(vec3(3.,1.,0.))),0.);
vec3 hsv = rgb2hsv(vec3(.1,.2,.3));
hsv.x = fract(hsv.x+c3);
O.xyz += hsv2rgb(hsv)*exp(t*.4);
O.xyz *= c2;
O.xyz-=.4;
O.xyz = 1. - exp(-O.xyz);
O.xyz = pow(O.xyz,vec3(0.45454545));
return O;
}
uniform float dither_v;
vec4 dither(vec2 st, vec4 inc, float d) {
vec4 col = vec4(inc);
col.rgb += (d/255.0)*hash3_2(st*200.0);
return col;
}
// Sakura Bliss by Philippe Desgranges
// Email: Philippe.desgranges@gmail.com
// License Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.
uniform float sakura_blur;
uniform float sakura_color;
// Borrowed from BigWIngs
vec4 sakura_N14(float t) {
return fract(sin(t*vec4(123., 104., 145., 24.))*vec4(657., 345., 879., 154.));
}
// Computes the RGB and alpha of a single flower in its own UV space
vec4 sakura_(vec2 uv, vec2 id, float blur)
{
float time = uTime + 45.0; // time is offset to avoid the flowers being aligned at start
vec4 rnd = sakura_N14(mod(id.x, 500.0) * 5.4 + mod(id.y, 500.0) * 13.67); // get 4 random numbers per flower
// Offset the flower from the center in a random Lissajous pattern
uv *= mix(0.75, 1.3, rnd.y);
uv.x += sin(time * rnd.z * 0.3) * 0.6;
uv.y += sin(time * rnd.w * 0.45) * 0.4;
// Computes the angle of the flower with a random rotation speed
float angle = atan(uv.y, uv.x) + rnd.x * 421.47 + uTime * mix(-0.6, 0.6, rnd.x);
// euclidean distance to the center of the flower
float dist = length(uv);
// Flower-shaped distance function from the center
float petal = 1.0 - abs(sin(angle * 2.5));
float sqPetal = petal * petal;
petal = mix(petal, sqPetal, 0.7);
float petal2 = 1.0 - abs(sin(angle * 2.5 + 1.5));
petal += petal2 * 0.2;
float sakuraDist = dist + petal * 0.25;
// Compute a blurry shadow mask.
float shadowblur = 0.3;
float shadow = smoothstep(0.5 + shadowblur, 0.5 - shadowblur, sakuraDist) * 0.4;
//Computes the sharper mask of the flower
float sakuraMask = smoothstep(0.5 + blur, 0.5 - blur, sakuraDist);
// The flower has a pink hue and is lighter in the center
vec3 hsv = rgb2hsv(vec3(1.0, 0.6, 0.7));
hsv.x = fract(hsv.x + sakura_color);
vec3 sakuraCol = hsv2rgb(hsv);
sakuraCol += (0.5 - dist) * 0.2;
// Computes the border mask of the flower
vec3 outlineCol = vec3(1.0, 0.3, 0.3);
float outlineMask = smoothstep(0.5 - blur, 0.5, sakuraDist + 0.045);
// Defines a tiling polarspace for the pistil pattern
float polarSpace = angle * 1.9098 + 0.5; // 12 / (2 * pi)
float polarPistil = fract(polarSpace) - 0.5;
// Round dot in the center
outlineMask += smoothstep(0.035 + blur, 0.035 - blur, dist);
float petalBlur = blur * 2.0;
float pistilMask = smoothstep(0.12 + blur, 0.12, dist) * smoothstep(0.05, 0.05 + blur , dist);
// Compute the pistil 'bars' in polar space
float barW = 0.2 - dist * 0.7;
float pistilBar = smoothstep(-barW, -barW + petalBlur, polarPistil) * smoothstep(barW + petalBlur, barW, polarPistil);
// Compute the little dots in polar space
float pistilDotLen = length(vec2(polarPistil * 0.10, dist) - vec2(0, 0.16)) * 9.0;
float pistilDot = smoothstep(0.1 + petalBlur, 0.1 - petalBlur, pistilDotLen);
// combines the middle and border colors
outlineMask += pistilMask * pistilBar + pistilDot;
sakuraCol = mix(sakuraCol, outlineCol, clamp(outlineMask,0.0,1.0) * 0.5);
//sets the background to the shadow color
sakuraCol = mix(vec3(0.2, 0.2, 0.8) * shadow, sakuraCol, sakuraMask);
//incorporates the shadow mask into alpha channel
sakuraMask = clamp(sakuraMask + shadow,0.0,1.0);
//returns the flower in pre-multiplied rgba
return vec4(sakuraCol, sakuraMask);
}
// blends a pre-multiplied src onto a dst color (without alpha)
vec3 sakura_premulMix(vec4 src, vec3 dst)
{
return dst.rgb * (1.0 - src.a) + src.rgb;
}
// blends a pre-multiplied src onto a dst color (with alpha)
vec4 sakura_premulMix(vec4 src, vec4 dst)
{
vec4 res;
res.rgb = sakura_premulMix(src, dst.rgb);
res.a = 1.0 - (1.0 - src.a) * (1.0 - dst.a);
return res;
}
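// Sketch (not from the original) of how this premultiplied "over" operator
// behaves at its two extremes, useful when layering flowers below:
//   sakura_premulMix(vec4(1.0, 0.0, 0.0, 1.0), dst) == opaque red, any dst
//   sakura_premulMix(vec4(0.0), dst)               == dst (identity layer)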
// Computes a Layer of flowers
vec4 sakura_layer(vec2 uv, float blur)
{
vec2 cellUV = fract(uv) - 0.5;
vec2 cellId = floor(uv);
vec4 accum = vec4(0.0);
// the flowers can overlap on the 9 neighboring cells so we blend them all together on each cell
for (float y = -1.0; y <= 1.0; y++)
{
for (float x = -1.0; x <= 1.0; x++)
{
vec2 offset = vec2(x, y);
vec4 sakura = sakura_(cellUV - offset, cellId + offset, blur);
accum = sakura_premulMix(sakura, accum);
}
}
return accum;
}
vec4 sakura(vec2 st, vec4 inc, float inb)
{
// Scroll the UV with a sine oscillation
vec2 p = vec2(st);
p.y += uTime * 0.1;
p.x -= uTime * 0.03 + sin(uTime) * 0.1;
p *= 4.3;
vec3 col = inc.rgb;
// Compute a tilt-shift-like blur factor
float blur = abs(st.y);
blur *= blur * 0.15;
// Computes several layers with various degrees of blur and scale
vec4 layer1 = sakura_layer(p, inb + blur);
// Blend it all together
col = sakura_premulMix(layer1, col);
return vec4(col,inc.a);
}
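// Usage sketch (an assumption, not part of the original source): composite
// the sakura layer over a plain background and dither before output.
// `vTexCoord` is a hypothetical varying name.
// void main() {
//     vec2 st = vTexCoord * 2.0 - 1.0;
//     vec4 col = sakura(st, vec4(0.9, 0.95, 1.0, 1.0), sakura_blur);
//     gl_FragColor = dither(vTexCoord, col, dither_v);
// }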
uniform float stripes1_c1;
uniform float stripes1_c2;
uniform float stripes1_c3;
uniform float stripes1_count;
vec4 stripes1(vec2 p, float c1, float c2, float c3, float count) {
float a = floor((p.x - p.y * 0.5 - uTime * .08) * count) - uTime * 2.;
vec3 col = vec3(sin(a + c1*PI), sin(a + c2*PI), sin(a + c3*PI)) * 0.2 + 0.7;
return vec4(col, 1.0);
}
// Created by greenbird10
// License Creative Commons Attribution-NonCommercial-ShareAlike 3.0
uniform float water_sunx;
uniform float water_suny;
uniform float water_wave1;
uniform float water_wave2;
//From Dave (https://www.shadertoy.com/view/4djSRW)
vec2 water_hash(vec2 p)
{
return hash2_2(p)*2.0 - 1.0;
}
//From iq (https://www.shadertoy.com/view/XdXGW8)
float water_noise( vec2 p )
{
vec2 i = floor( p );
vec2 f = fract( p );
vec2 u = f*f*(3.0-2.0*f);
return mix( mix( dot( water_hash( i + vec2(0.0,0.0) ), f - vec2(0.0,0.0) ),
dot( water_hash( i + vec2(1.0,0.0) ), f - vec2(1.0,0.0) ), u.x),
mix( dot( water_hash( i + vec2(0.0,1.0) ), f - vec2(0.0,1.0) ),
dot( water_hash( i + vec2(1.0,1.0) ), f - vec2(1.0,1.0) ), u.x), u.y);
}
vec4 water(vec2 p)
{
// water
vec3 col = vec3(102./255., 120./255., 133./255.);
vec3 col1 = vec3(165./255., 157./255., 152./255.);
vec2 pp = p * vec2(uResolution.x/uResolution.y, 1.);
float sun = distance(pp, vec2(water_sunx, water_suny));
sun = pow(sun, 1.7);
col = mix(col, col*1.2, sun);
col1 = mix(col1, col1*1.5, sun);
col = mix(col1, col, smoothstep(
water_wave1, water_wave2, p.y + 0.5 * water_noise(vec2(
(p.x + 0.3 * water_noise(vec2(p.y * 30., 0.17 + uTime*0.5))) * 4., 0.33 + uTime*0.1))));
// Output to screen
return vec4(col,1.0);
}
// Copyright Inigo Quilez, 2020 - https://iquilezles.org/
// I am the sole copyright owner of this Work.
// You cannot host, display, distribute or share this Work in any form,
// including physical and digital. You cannot use this Work in any
// commercial or non-commercial product, website or project. You cannot
// sell this Work and you cannot mint an NFTs of it.
// I share this Work for educational purposes, and you can link to it,
// through an URL, proper attribution and unmodified screenshot, as part
// of your educational material. If these conditions are too restrictive
// please contact me and we'll definitely work it out.
uniform float stripes_1_times;
uniform float stripes_1_dist;
uniform float stripes_1_phase;
uniform float stripes_1_amp;
float stripes_1_noise( vec2 p )
{
vec2 i = floor(p);
vec2 f = fract(p);
f = f*f*(3.0-2.0*f);
float n = i.x + i.y*57.0;
return mix(mix( hash(n+ 0.0), hash(n+ 1.0),f.x),
mix( hash(n+57.0), hash(n+58.0),f.x),f.y);
}
vec2 stripes_1(vec2 p, float times, float phase, float dist, float amp)
{
for( float i=0.; i<times; i++ )
{
float a = stripes_1_noise(p*1.5)*PI*phase + i + uTime;
p += dist*vec2( cos(a), sin(a) );
dist *= amp;
}
return p;
}
uniform float subdivision_dir_x;
uniform float subdivision_dir_y;
uniform float subdivision_fine;
uniform float subdivision_chaos;
uniform float subdivision_deform;
float subdivision_bum(float sc, vec2 ipos) {
return 0.5 * mod(PI * (ipos.y * cos(sc * ipos.x) + ipos.x * cos(sc * ipos.y)), 2.);
}
vec2 subdivision(vec2 p, float dirX, float dirY, float fine, float chaos, float deform) {
vec2 direction = vec2(dirX, dirY);
vec2 pp = p + direction;
float sc = PI;
float ss = sign(dirX);
ss += step(ss, 0.0);
float si = fine * (pp.x + max(30. - abs(dirX), 0.) * ss);
float b = 1.;
pp *= 1. + 0.00045 * chaos * cos(0.1 * uTime + 4. * PI * b);
vec2 ipos;
float n = 5.;
for (float i = 0.; i < n; i++) {
ipos = floor(si * pp);
b = mix(b, subdivision_bum(sc, ipos), deform);
float io = 2. * PI * i / n;
pp *= 1. + 0.00045 * chaos * cos(io + 0.1 * uTime + 4. * PI * b);
}
return pp - direction;
}
// Copyright Inigo Quilez, 2013 - https://iquilezles.org/
// I am the sole copyright owner of this Work.
// You cannot host, display, distribute or share this Work in any form,
// including physical and digital. You cannot use this Work in any
// commercial or non-commercial product, website or project. You cannot
// sell this Work and you cannot mint an NFTs of it.
// I share this Work for educational purposes, and you can link to it,
// through an URL, proper attribution and unmodified screenshot, as part
// of your educational material. If these conditions are too restrictive
// please contact me and we'll definitely work it out.
uniform float warping_angle;
uniform float warping_color;
float warping_noise(vec2 p) {
return sin(p.x)*sin(p.y);
}
float warping_fbm4(vec2 p, mat2 m) {
float f = 0.0;
f += 0.5000*warping_noise(p); p = m*p*2.02;
f += 0.2500*warping_noise(p); p = m*p*2.03;
f += 0.1250*warping_noise(p); p = m*p*2.01;
f += 0.0625*warping_noise(p);
return f/0.9375;
}
float warping_fbm6(vec2 p, mat2 m) {
float f = 0.0;
f += 0.500000*(0.5+0.5*warping_noise(p)); p = m*p*2.02;
f += 0.250000*(0.5+0.5*warping_noise(p)); p = m*p*2.03;
f += 0.125000*(0.5+0.5*warping_noise(p)); p = m*p*2.01;
f += 0.062500*(0.5+0.5*warping_noise(p)); p = m*p*2.04;
f += 0.031250*(0.5+0.5*warping_noise(p)); p = m*p*2.01;
f += 0.015625*(0.5+0.5*warping_noise(p));
return f/0.96875;
}
vec2 warping_fbm4_2(vec2 p, mat2 m) {
return vec2(warping_fbm4(p, m), warping_fbm4(p+vec2(7.8), m));
}
vec2 warping_fbm6_2(vec2 p, mat2 m) {
return vec2(warping_fbm6(p+vec2(16.8), m), warping_fbm6(p+vec2(11.5), m));
}
vec4 warping(vec2 st, float angle, float c) {
float sa = sin(angle);
float ca = cos(angle);
mat2 m = mat2(ca, sa, -sa, ca);
vec2 q = vec2(st);
q += 0.03*sin(vec2(0.27, 0.23)*uTime + length(q)*vec2(4.1, 4.3));
vec2 o = warping_fbm4_2(0.9*q, m);
o += 0.04*sin(vec2(0.12, 0.14)*uTime + length(o));
vec2 n = warping_fbm6_2(3.0*o, m);
vec4 on = vec4(o, n);
float f = 0.5 + 0.5*warping_fbm4(1.8*q + 6.0*n, m);
f = mix(f, f*f*f*3.5, f*abs(n.x));
vec3 col = vec3(0.0);
col = mix(vec3(0.2, 0.1, 0.4), vec3(0.3, 0.05, 0.05), f);
col = mix(col, vec3(0.9, 0.9, 0.9), dot(on.zw, on.zw));
col = mix(col, vec3(0.4, 0.3, 0.3), 0.2 + 0.5*on.y*on.y);
col = mix(col, vec3(0.0, 0.2, 0.4), 0.5*smoothstep(1.2, 1.3, abs(on.z)+abs(on.w)));
col = clamp(col*f*2.0, 0.0, 1.0);
vec3 hsv = rgb2hsv(col);
hsv.x = fract(hsv.x + c);
return vec4(hsv2rgb(hsv), 1.0);
}
uniform float vignette_v;
vec4 vignette(vec4 color, vec2 q, float v)
{
color.rgb *= 0.3 + 0.8 * pow(16.0 * q.x * q.y * (1.0 - q.x) * (1.0 - q.y), v);
return color;
}
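// Note (added, hedged): vignette expects q in [0,1] screen space. The term
// 16*q.x*q.y*(1-q.x)*(1-q.y) peaks at exactly 1.0 at q = (0.5, 0.5) and
// falls to 0 at the edges; vignette_v controls the falloff sharpness.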
uniform float cmyk_smooth;
uniform float cmyk_thres;
vec3 cmyk_color(float x) {
float factor = fract(x) * 4.;
float f0 = smoothstep(0., cmyk_smooth, factor);
float f1 = smoothstep(0., cmyk_smooth, factor - 1.);
float f2 = smoothstep(0., cmyk_smooth, factor - 2.);
float f3 = smoothstep(0., cmyk_smooth, factor - 3.);
// The remainder of this function was lost from the source; the band blend
// below is a hedged reconstruction, not the original code.
vec3 col = vec3(1.0);
col = mix(col, vec3(0.0, 1.0, 1.0), f0 - f1); // cyan band
col = mix(col, vec3(1.0, 0.0, 1.0), f1 - f2); // magenta band
col = mix(col, vec3(1.0, 1.0, 0.0), f2 - f3); // yellow band
col = mix(col, vec3(0.0, 0.0, 0.0), f3);      // key (black) band
return col;
}
smooth factor float smoothstep cmyk smooth factor float smoothstep cmyk smooth factor float smoothstep cmyk smooth factor
| 0
|
167,967
| 13,050,387,950
|
IssuesEvent
|
2020-07-29 15:25:13
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
roachtest: jobs/mixed-versions failed
|
C-test-failure O-roachtest O-robot branch-provisional_202007271721_v20.1.4 release-blocker
|
[(roachtest).jobs/mixed-versions failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2122385&tab=buildLog) on [provisional_202007271721_v20.1.4@9a822f51a64af9eebcf9ea5e5ecc0fdfcc1521fa](https://github.com/cockroachdb/cockroach/commits/9a822f51a64af9eebcf9ea5e5ecc0fdfcc1521fa):
```
The test failed on branch=provisional_202007271721_v20.1.4, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/jobs/mixed-versions/run_1
cluster.go:2055,versionupgrade.go:315,versionupgrade.go:167,mixed_version_jobs.go:296,mixed_version_jobs.go:320,test_runner.go:754: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod start --binary=./cockroach --encrypt=false teamcity-2122385-1595896218-03-n4cpu4:1 returned: exit status 1
(1) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod start --binary=./cockroach --encrypt=false teamcity-2122385-1595896218-03-n4cpu4:1 returned
| stderr:
|
| stdout:
| teamcity-2122385-1595896218-03-n4cpu4: starting nodes
| teamcity-2122385-1595896218-03-n4cpu4: initializing clusterI200728 00:36:19.152135 8 cockroach.go:163 unable to initialize cluster: ~
| if ! test -e /mnt/data1/cockroach/cluster-bootstrapped ; then
| COCKROACH_CONNECT_TIMEOUT=0 ././cockroach init --url 'postgres://root@localhost:26257?sslmode=disable' && touch /mnt/data1/cockroach/cluster-bootstrapped
| fi
| warning: --url specifies user/password, but command "init" does not accept user/password details - details ignored
| *
| * ERROR: ERROR: rpc error: code = Unknown desc = cluster has already been initialized
| * HINT: Please ensure all your start commands are using --join.
| *
| E200728 00:36:19.110645 1 cli/error.go:373 ERROR: rpc error: code = Unknown desc = cluster has already been initialized
| HINT: Please ensure all your start commands are using --join.
| ERROR: rpc error: code = Unknown desc = cluster has already been initialized
| HINT: Please ensure all your start commands are using --join.
| Failed running "init": exit status 1
Wraps: (2) exit status 1
Error types: (1) *main.withCommandDetails (2) *exec.ExitError
```
<details><summary>More</summary><p>
Artifacts: [/jobs/mixed-versions](https://teamcity.cockroachdb.com/viewLog.html?buildId=2122385&tab=artifacts#/jobs/mixed-versions)
Related:
- #51699 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202007220233_v20.2.0-alpha.2](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202007220233_v20.2.0-alpha.2) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #51186 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202007081918_v20.2.0-alpha.2](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202007081918_v20.2.0-alpha.2) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #51100 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202007071743_v20.2.0-alpha.2](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202007071743_v20.2.0-alpha.2) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #50026 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202006032224_v20.2.0-alpha.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202006032224_v20.2.0-alpha.1) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #49281 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202005191400_v20.1.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202005191400_v20.1.1) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #49233 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202005182011_v20.1.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202005182011_v20.1.1) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #48407 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202005041945_v19.1.9](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202005041945_v19.1.9) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #48315 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-master](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-master) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #48194 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-20.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-20.1) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #48193 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-19.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-19.1) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Ajobs%2Fmixed-versions.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
|
2.0
|
roachtest: jobs/mixed-versions failed - [(roachtest).jobs/mixed-versions failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2122385&tab=buildLog) on [provisional_202007271721_v20.1.4@9a822f51a64af9eebcf9ea5e5ecc0fdfcc1521fa](https://github.com/cockroachdb/cockroach/commits/9a822f51a64af9eebcf9ea5e5ecc0fdfcc1521fa):
```
The test failed on branch=provisional_202007271721_v20.1.4, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/jobs/mixed-versions/run_1
cluster.go:2055,versionupgrade.go:315,versionupgrade.go:167,mixed_version_jobs.go:296,mixed_version_jobs.go:320,test_runner.go:754: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod start --binary=./cockroach --encrypt=false teamcity-2122385-1595896218-03-n4cpu4:1 returned: exit status 1
(1) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod start --binary=./cockroach --encrypt=false teamcity-2122385-1595896218-03-n4cpu4:1 returned
| stderr:
|
| stdout:
| teamcity-2122385-1595896218-03-n4cpu4: starting nodes
| teamcity-2122385-1595896218-03-n4cpu4: initializing clusterI200728 00:36:19.152135 8 cockroach.go:163 unable to initialize cluster: ~
| if ! test -e /mnt/data1/cockroach/cluster-bootstrapped ; then
| COCKROACH_CONNECT_TIMEOUT=0 ././cockroach init --url 'postgres://root@localhost:26257?sslmode=disable' && touch /mnt/data1/cockroach/cluster-bootstrapped
| fi
| warning: --url specifies user/password, but command "init" does not accept user/password details - details ignored
| *
| * ERROR: ERROR: rpc error: code = Unknown desc = cluster has already been initialized
| * HINT: Please ensure all your start commands are using --join.
| *
| E200728 00:36:19.110645 1 cli/error.go:373 ERROR: rpc error: code = Unknown desc = cluster has already been initialized
| HINT: Please ensure all your start commands are using --join.
| ERROR: rpc error: code = Unknown desc = cluster has already been initialized
| HINT: Please ensure all your start commands are using --join.
| Failed running "init": exit status 1
Wraps: (2) exit status 1
Error types: (1) *main.withCommandDetails (2) *exec.ExitError
```
<details><summary>More</summary><p>
Artifacts: [/jobs/mixed-versions](https://teamcity.cockroachdb.com/viewLog.html?buildId=2122385&tab=artifacts#/jobs/mixed-versions)
Related:
- #51699 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202007220233_v20.2.0-alpha.2](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202007220233_v20.2.0-alpha.2) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #51186 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202007081918_v20.2.0-alpha.2](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202007081918_v20.2.0-alpha.2) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #51100 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202007071743_v20.2.0-alpha.2](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202007071743_v20.2.0-alpha.2) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #50026 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202006032224_v20.2.0-alpha.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202006032224_v20.2.0-alpha.1) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #49281 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202005191400_v20.1.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202005191400_v20.1.1) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #49233 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202005182011_v20.1.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202005182011_v20.1.1) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #48407 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202005041945_v19.1.9](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202005041945_v19.1.9) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #48315 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-master](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-master) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #48194 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-20.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-20.1) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #48193 roachtest: jobs/mixed-versions failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-19.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-19.1) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Ajobs%2Fmixed-versions.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
|
test
|
roachtest jobs mixed versions failed on the test failed on branch provisional cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts jobs mixed versions run cluster go versionupgrade go versionupgrade go mixed version jobs go mixed version jobs go test runner go home agent work go src github com cockroachdb cockroach bin roachprod start binary cockroach encrypt false teamcity returned exit status home agent work go src github com cockroachdb cockroach bin roachprod start binary cockroach encrypt false teamcity returned stderr stdout teamcity starting nodes teamcity initializing cockroach go unable to initialize cluster if test e mnt cockroach cluster bootstrapped then cockroach connect timeout cockroach init url postgres root localhost sslmode disable touch mnt cockroach cluster bootstrapped fi warning url specifies user password but command init does not accept user password details details ignored error error rpc error code unknown desc cluster has already been initialized hint please ensure all your start commands are using join cli error go error rpc error code unknown desc cluster has already been initialized hint please ensure all your start commands are using join error rpc error code unknown desc cluster has already been initialized hint please ensure all your start commands are using join failed running init exit status wraps exit status error types main withcommanddetails exec exiterror more artifacts related roachtest jobs mixed versions failed roachtest jobs mixed versions failed roachtest jobs mixed versions failed roachtest jobs mixed versions failed roachtest jobs mixed versions failed roachtest jobs mixed versions failed roachtest jobs mixed versions failed roachtest jobs mixed versions failed roachtest jobs mixed versions failed roachtest jobs mixed versions failed powered by
| 1
|
233,441
| 25,765,481,396
|
IssuesEvent
|
2022-12-09 01:14:13
|
dreamboy9/mongo
|
https://api.github.com/repos/dreamboy9/mongo
|
reopened
|
CVE-2021-37701 (High) detected in tar-6.1.0.tgz
|
security vulnerability
|
## CVE-2021-37701 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-6.1.0.tgz</b></p></summary>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-6.1.0.tgz">https://registry.npmjs.org/tar/-/tar-6.1.0.tgz</a></p>
<p>Path to dependency file: /buildscripts/libdeps/graph_visualizer_web_stack/package.json</p>
<p>Path to vulnerable library: /buildscripts/libdeps/graph_visualizer_web_stack/node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- canvas-2.8.0.tgz (Root Library)
- node-pre-gyp-1.0.5.tgz
- :x: **tar-6.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/dreamboy9/mongo/commit/60ef70ebd8d46f4c893b3fb90ccf2616f8e21d2b">60ef70ebd8d46f4c893b3fb90ccf2616f8e21d2b</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The npm package "tar" (aka node-tar) before versions 4.4.16, 5.0.8, and 6.1.7 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory, where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems. The cache checking logic used both `\` and `/` characters as path separators, however `\` is a valid filename character on posix systems. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. Additionally, a similar confusion could arise on case-insensitive filesystems. If a tar archive contained a directory at `FOO`, followed by a symbolic link named `foo`, then on case-insensitive file systems, the creation of the symbolic link would remove the directory from the filesystem, but _not_ from the internal directory cache, as it would not be treated as a cache hit. A subsequent file entry within the `FOO` directory would then be placed in the target of the symbolic link, thinking that the directory had already been created. These issues were addressed in releases 4.4.16, 5.0.8 and 6.1.7. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. 
If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-9r2w-394v-53qc.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-37701>CVE-2021-37701</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc">https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution (tar): 6.1.7</p>
<p>Direct dependency fix Resolution (canvas): 2.9.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-37701 (High) detected in tar-6.1.0.tgz - ## CVE-2021-37701 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-6.1.0.tgz</b></p></summary>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-6.1.0.tgz">https://registry.npmjs.org/tar/-/tar-6.1.0.tgz</a></p>
<p>Path to dependency file: /buildscripts/libdeps/graph_visualizer_web_stack/package.json</p>
<p>Path to vulnerable library: /buildscripts/libdeps/graph_visualizer_web_stack/node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- canvas-2.8.0.tgz (Root Library)
- node-pre-gyp-1.0.5.tgz
- :x: **tar-6.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/dreamboy9/mongo/commit/60ef70ebd8d46f4c893b3fb90ccf2616f8e21d2b">60ef70ebd8d46f4c893b3fb90ccf2616f8e21d2b</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The npm package "tar" (aka node-tar) before versions 4.4.16, 5.0.8, and 6.1.7 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory, where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems. The cache checking logic used both `\` and `/` characters as path separators, however `\` is a valid filename character on posix systems. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. Additionally, a similar confusion could arise on case-insensitive filesystems. If a tar archive contained a directory at `FOO`, followed by a symbolic link named `foo`, then on case-insensitive file systems, the creation of the symbolic link would remove the directory from the filesystem, but _not_ from the internal directory cache, as it would not be treated as a cache hit. A subsequent file entry within the `FOO` directory would then be placed in the target of the symbolic link, thinking that the directory had already been created. These issues were addressed in releases 4.4.16, 5.0.8 and 6.1.7. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. 
If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-9r2w-394v-53qc.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-37701>CVE-2021-37701</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc">https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution (tar): 6.1.7</p>
<p>Direct dependency fix Resolution (canvas): 2.9.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve high detected in tar tgz cve high severity vulnerability vulnerable library tar tgz tar for node library home page a href path to dependency file buildscripts libdeps graph visualizer web stack package json path to vulnerable library buildscripts libdeps graph visualizer web stack node modules tar package json dependency hierarchy canvas tgz root library node pre gyp tgz x tar tgz vulnerable library found in head commit a href found in base branch master vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite and arbitrary code execution vulnerability node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems the cache checking logic used both and characters as path separators however is a valid filename character on posix systems by first creating a directory and then replacing that directory with a symlink it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite additionally a similar confusion could arise on case insensitive filesystems if a tar archive contained a directory at foo followed by a symbolic link named foo then on case insensitive file systems the creation of the symbolic link would remove the directory from the filesystem but not from the internal directory 
cache as it would not be treated as a cache hit a subsequent file entry within the foo directory would then be placed in the target of the symbolic link thinking that the directory had already been created these issues were addressed in releases and the branch of node tar has been deprecated and did not receive patches for these issues if you are still using a release we recommend you update to a more recent version of node tar if this is not possible a workaround is available in the referenced ghsa publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar direct dependency fix resolution canvas step up your open source security game with mend
| 0
|
166,954
| 26,430,476,483
|
IssuesEvent
|
2023-01-14 18:57:49
|
hackforla/tdm-calculator
|
https://api.github.com/repos/hackforla/tdm-calculator
|
closed
|
FAQs Warning Box
|
level: medium p-Feature- FAQ screen priority: MUST HAVE role: ui/ux design Ready for development
|
### Overview
To prevent admins from mistakenly deleting FAQs categories and questions, we need a warning dialog box that asks them to be sure before deleting
### Action Items
- [x] create mockups for the warning box that quotes the text of the question/answer that the admin is deleting, and also includes a leaving the page warning box
- [x] Get stakeholders' approval
### Resources/Instructions
|
1.0
|
FAQs Warning Box - ### Overview
To prevent admins from mistakenly deleting FAQs categories and questions, we need a warning dialog box that asks them to be sure before deleting
### Action Items
- [x] create mockups for the warning box that quotes the text of the question/answer that the admin is deleting, and also includes a leaving the page warning box
- [x] Get stakeholders' approval
### Resources/Instructions
|
non_test
|
faqs warning box overview to prevent admins from mistakenly deleting faqs categories and questions we need a warning dialog box that asks them to be sure before deleting action items create mockups for the warning box that quotes the text of the question answer that the admin is deleting and also includes a leaving the page warning box get stakeholders approval resources instructions
| 0
|
46,130
| 5,786,421,248
|
IssuesEvent
|
2017-05-01 10:37:53
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
closed
|
WaitHandleWaitAll fails on desktop
|
area-System.Threading test-run-desktop
|
https://mc.dot.net/#/product/netcore/master/source/official~2Fcorefx~2Fmaster~2F/type/test~2Ffunctional~2Fdesktop~2Fcli~2F/build/20170429.02/workItem/System.Threading.Tests/analysis/xunit/System.Threading.Tests.ManualResetEventTests~2FWaitHandleWaitAll
```
Windows.10.Amd64-x64-Release
Unhandled Exception of Type System.NotSupportedException
Message :
System.NotSupportedException : WaitAll for multiple handles on a STA thread is not supported.
Stack Trace :
at System.Threading.WaitHandle.WaitMultiple(WaitHandle[] waitHandles, Int32 millisecondsTimeout, Boolean exitContext, Boolean WaitAll)
at System.Threading.WaitHandle.WaitAll(WaitHandle[] waitHandles, Int32 millisecondsTimeout, Boolean exitContext)
at System.Threading.Tests.ManualResetEventTests.WaitHandleWaitAll() in E:\A\_work\313\s\corefx\src\System.Threading\tests\ManualResetEventTests.cs:line 55
```
Needs this adding
`[SkipOnTargetFramework(TargetFrameworkMonikers.NetFramework, "Full framework throws NotSupportedException because test runs on STA and Core run on MTA")]`
|
1.0
|
WaitHandleWaitAll fails on desktop - https://mc.dot.net/#/product/netcore/master/source/official~2Fcorefx~2Fmaster~2F/type/test~2Ffunctional~2Fdesktop~2Fcli~2F/build/20170429.02/workItem/System.Threading.Tests/analysis/xunit/System.Threading.Tests.ManualResetEventTests~2FWaitHandleWaitAll
```
Windows.10.Amd64-x64-Release
Unhandled Exception of Type System.NotSupportedException
Message :
System.NotSupportedException : WaitAll for multiple handles on a STA thread is not supported.
Stack Trace :
at System.Threading.WaitHandle.WaitMultiple(WaitHandle[] waitHandles, Int32 millisecondsTimeout, Boolean exitContext, Boolean WaitAll)
at System.Threading.WaitHandle.WaitAll(WaitHandle[] waitHandles, Int32 millisecondsTimeout, Boolean exitContext)
at System.Threading.Tests.ManualResetEventTests.WaitHandleWaitAll() in E:\A\_work\313\s\corefx\src\System.Threading\tests\ManualResetEventTests.cs:line 55
```
Needs this adding
`[SkipOnTargetFramework(TargetFrameworkMonikers.NetFramework, "Full framework throws NotSupportedException because test runs on STA and Core run on MTA")]`
|
test
|
waithandlewaitall fails on desktop windows release unhandled exception of type system notsupportedexception message system notsupportedexception waitall for multiple handles on a sta thread is not supported stack trace at system threading waithandle waitmultiple waithandle waithandles millisecondstimeout boolean exitcontext boolean waitall at system threading waithandle waitall waithandle waithandles millisecondstimeout boolean exitcontext at system threading tests manualreseteventtests waithandlewaitall in e a work s corefx src system threading tests manualreseteventtests cs line needs this adding
| 1
|
332,648
| 29,490,276,844
|
IssuesEvent
|
2023-06-02 12:59:38
|
MPMG-DCC-UFMG/F01
|
https://api.github.com/repos/MPMG-DCC-UFMG/F01
|
closed
|
Teste de generalizacao para a tag Terceiro Setor - Dados de parcerias - União de Minas
|
generalization test development template - GRP (27) tag - Terceiro Setor subtag - Dados de Parcerias
|
DoD: Realizar o teste de Generalização do validador da tag Terceiro Setor - Dados de parcerias para o Município de União de Minas.
|
1.0
|
Teste de generalizacao para a tag Terceiro Setor - Dados de parcerias - União de Minas - DoD: Realizar o teste de Generalização do validador da tag Terceiro Setor - Dados de parcerias para o Município de União de Minas.
|
test
|
teste de generalizacao para a tag terceiro setor dados de parcerias união de minas dod realizar o teste de generalização do validador da tag terceiro setor dados de parcerias para o município de união de minas
| 1
|
150,923
| 11,993,541,130
|
IssuesEvent
|
2020-04-08 12:12:39
|
zilahir/teleprompter
|
https://api.github.com/repos/zilahir/teleprompter
|
closed
|
Up/down controls in remote don't work
|
:bug: bug :hourglass: needs testing
|
They don't seem to be doing anything. There's also a slight visual overlap with the play button.

|
1.0
|
Up/down controls in remote don't work - They don't seem to be doing anything. There's also a slight visual overlap with the play button.

|
test
|
up down controls in remote don t work they don t seem to be doing anything there s also a slight visual overlap with the play button
| 1
|
145,985
| 13,167,734,935
|
IssuesEvent
|
2020-08-11 10:48:32
|
reactor/reactor-netty
|
https://api.github.com/repos/reactor/reactor-netty
|
closed
|
Memory leak in HttpClient
|
for/stackoverflow good first issue type/documentation
|
Please check the reproducible example here: https://github.com/michaelr524/reactor-netty-ssl-leak
```
mvn clean package
java -jar target/reactor-netty-ssl-leak-1.0-SNAPSHOT.jar
```
* Reactor version(s) used: 0.9.4.RELEASE
* Other relevant libraries versions (eg. `netty`, ...):
* JVM version (`javar -version`):
openjdk version "12.0.2" 2019-07-16
OpenJDK Runtime Environment (build 12.0.2+10)
OpenJDK 64-Bit Server VM (build 12.0.2+10, mixed mode, sharing)
* OS and version (eg `uname -a`): Darwin Michaels-MacBook-Pro.local 19.3.0 Darwin Kernel Version 19.3.0: Thu Jan 9 20:58:23 PST 2020; root:xnu-6153.81.5~1/RELEASE_X86_64 x86_64
Looks like something related to the ssl context:
<img width="618" alt="VisualVM-2 0 1-2020-04-06-11-11-05" src="https://user-images.githubusercontent.com/1745556/78547458-cdbc5f00-7807-11ea-9e12-938b3709eced.png">
<img width="992" alt="VisualVM-2 0 1-2020-04-06-11-12-09" src="https://user-images.githubusercontent.com/1745556/78547465-ceed8c00-7807-11ea-9ad6-730230a1b47a.png">
|
1.0
|
Memory leak in HttpClient - Please check the reproducible example here: https://github.com/michaelr524/reactor-netty-ssl-leak
```
mvn clean package
java -jar target/reactor-netty-ssl-leak-1.0-SNAPSHOT.jar
```
* Reactor version(s) used: 0.9.4.RELEASE
* Other relevant libraries versions (eg. `netty`, ...):
* JVM version (`javar -version`):
openjdk version "12.0.2" 2019-07-16
OpenJDK Runtime Environment (build 12.0.2+10)
OpenJDK 64-Bit Server VM (build 12.0.2+10, mixed mode, sharing)
* OS and version (eg `uname -a`): Darwin Michaels-MacBook-Pro.local 19.3.0 Darwin Kernel Version 19.3.0: Thu Jan 9 20:58:23 PST 2020; root:xnu-6153.81.5~1/RELEASE_X86_64 x86_64
Looks like something related to the ssl context:
<img width="618" alt="VisualVM-2 0 1-2020-04-06-11-11-05" src="https://user-images.githubusercontent.com/1745556/78547458-cdbc5f00-7807-11ea-9e12-938b3709eced.png">
<img width="992" alt="VisualVM-2 0 1-2020-04-06-11-12-09" src="https://user-images.githubusercontent.com/1745556/78547465-ceed8c00-7807-11ea-9ad6-730230a1b47a.png">
|
non_test
|
memory leak in httpclient please check the reproducible example here mvn clean package java jar target reactor netty ssl leak snapshot jar reactor version s used release other relevant libraries versions eg netty jvm version javar version openjdk version openjdk runtime environment build openjdk bit server vm build mixed mode sharing os and version eg uname a darwin michaels macbook pro local darwin kernel version thu jan pst root xnu release looks like something related to the ssl context img width alt visualvm src img width alt visualvm src
| 0
|
63,056
| 8,655,714,410
|
IssuesEvent
|
2018-11-27 16:34:38
|
ReactiveX/rxjs
|
https://api.github.com/repos/ReactiveX/rxjs
|
closed
|
Outdated link in tutorial page
|
type: bug type: documentation
|
<!--
Thank you for raising your concerns, we appreciate your feedback and contributions to this repository.
Before you continue, consider the following:
If you have a "How do I do ...?" question, it is better for you and for us that this question is placed in [StackOverflow](http://stackoverflow.com/questions/tagged/rxjs5) or some chat channel. This way, you are making it easier for others to learn from your experiences too.
These "Issues" are meant only for technical problems, bugs, and proposals related to the library.
If your issue is a bug, please follow the format below:
-->
In [tutorial](http://reactivex.io/rxjs/manual/tutorial.html#tutorials) page [RxJS @ Egghead.io](https://egghead.io/technologies/rx) link is outdated
|
1.0
|
Outdated link in tutorial page - <!--
Thank you for raising your concerns, we appreciate your feedback and contributions to this repository.
Before you continue, consider the following:
If you have a "How do I do ...?" question, it is better for you and for us that this question is placed in [StackOverflow](http://stackoverflow.com/questions/tagged/rxjs5) or some chat channel. This way, you are making it easier for others to learn from your experiences too.
These "Issues" are meant only for technical problems, bugs, and proposals related to the library.
If your issue is a bug, please follow the format below:
-->
In [tutorial](http://reactivex.io/rxjs/manual/tutorial.html#tutorials) page [RxJS @ Egghead.io](https://egghead.io/technologies/rx) link is outdated
|
non_test
|
outdated link in tutorial page thank you for raising your concerns we appreciate your feedback and contributions to this repository before you continue consider the following if you have a how do i do question it is better for you and for us that this question is placed in or some chat channel this way you are making it easier for others to learn from your experiences too these issues are meant only for technical problems bugs and proposals related to the library if your issue is a bug please follow the format below in page link is outdated
| 0
|
278,504
| 24,159,678,330
|
IssuesEvent
|
2022-09-22 10:33:13
|
ethereum/solidity
|
https://api.github.com/repos/ethereum/solidity
|
opened
|
[isoltest] Executing semantic tests at arbitrary paths
|
testing :hammer: low effort low impact
|
Isoltest can execute contracts using evmone and we use that to implement semantic tests. Currently these tests have to be located inside `test/libsolidity/semanticTests/`. It would be convenient for debugging to be able to use isoltest to execute a code snippet from an arbitrary path by simply giving it that path on the command line.
|
1.0
|
[isoltest] Executing semantic tests at arbitrary paths - Isoltest can execute contracts using evmone and we use that to implement semantic tests. Currently these tests have to be located inside `test/libsolidity/semanticTests/`. It would be convenient for debugging to be able to use isoltest to execute a code snippet from an arbitrary path by simply giving it that path on the command line.
|
test
|
executing semantic tests at arbitrary paths isoltest can execute contracts using evmone and we use that to implement semantic tests currently these tests have to be located inside test libsolidity semantictests it would be convenient for debugging to be able to use isoltest to execute a code snippet from an arbitrary path by simply giving it that path on the command line
| 1
|
326,340
| 27,985,533,217
|
IssuesEvent
|
2023-03-26 16:52:41
|
godotengine/godot
|
https://api.github.com/repos/godotengine/godot
|
closed
|
SDFGI doesn't work on Geforce GTX 680
|
bug topic:rendering needs testing topic:3d
|
### Godot version
v4.0.stable.official [92bee43ad]
### System information
MacOS 10.15.2
Video:
NVIDIA GeForce GTX 680
VRAM (Total): 2 GB
Metal: Supported, feature set macOS GPUFamily1 v4
### Issue description
Vulkan API 1.2.231 - Forward+ - Using Vulkan Device #0: NVIDIA - NVIDIA GeForce GTX 680
does not render when SDFGI is enabled. The picture changes colour like a kaleidoscope, until all is white. See attached screenshot.

### Steps to reproduce
Enable SDFGI.
### Minimal reproduction project
N/A
|
1.0
|
SDFGI doesn't work on Geforce GTX 680 - ### Godot version
v4.0.stable.official [92bee43ad]
### System information
MacOS 10.15.2
Video:
NVIDIA GeForce GTX 680
VRAM (Total): 2 GB
Metal: Supported, feature set macOS GPUFamily1 v4
### Issue description
Vulkan API 1.2.231 - Forward+ - Using Vulkan Device #0: NVIDIA - NVIDIA GeForce GTX 680
does not render when SDFGI is enabled. The picture changes colour like a kaleidoscope, until all is white. See attached screenshot.

### Steps to reproduce
Enable SDFGI.
### Minimal reproduction project
N/A
|
test
|
sdfgi doesn t work on geforce gtx godot version stable official system information macos video nvidia geforce gtx vram total gb metal supported feature set macos issue description vulkan api forward using vulkan device nvidia nvidia geforce gtx does not render when sdfgi is enabled the picture changes colour like a kaleidoscope until all is white see attached screenshot steps to reproduce enable sdfgi minimal reproduction project n a
| 1