Dataset column summary (15 columns):

| column | dtype | range / distinct values |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 4 to 112 |
| repo_url | string | length 33 to 141 |
| action | string | 3 classes |
| title | string | length 1 to 1.02k |
| labels | string | length 4 to 1.54k |
| body | string | length 1 to 262k |
| index | string | 17 classes |
| text_combine | string | length 95 to 262k |
| label | string | 2 classes |
| text | string | length 96 to 252k |
| binary_label | int64 | 0 to 1 |
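In every record below, `label` and `binary_label` track each other: `test` rows carry `binary_label` 1 and `non_test` rows carry 0. A minimal sketch of that mapping (the function and constant names are illustrative, not from the dataset):

```python
# Map the string class in `label` to the integer `binary_label` column.
# "test" -> 1, "non_test" -> 0, as observed in every record in this preview.
LABEL_TO_BINARY = {"non_test": 0, "test": 1}

def to_binary_label(label: str) -> int:
    """Return the binary_label value implied by a label string."""
    return LABEL_TO_BINARY[label]
```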
---
**Row 691,894** · id 23,715,593,259 · IssuesEvent · 2022-08-30 11:30:28
- repo: magento/magento2 (https://api.github.com/repos/magento/magento2)
- action: closed
- title: Catalog Page Empty if "Allow All Products Per Page" is set to "Yes"
- labels: Issue: Confirmed Reproduced on 2.4.x Progress: PR in progress Priority: P0 Progress: done Reported on 2.4.x Area: Catalog
- body:
### Preconditions and environment
- Magento 2.4.5
### Steps to reproduce
Deploy 2.4.5 vanilla
Create Test Category
Create Test Product in test category
Browse to test category page on frontend and ensure product is visible.
Set Allow All Products Per Page setting (Config -> Catalog -> Catalog) to Yes

Clear Cache
Browse to test category page on frontend and ensure product is visible.
### Expected result
Product is visible

### Actual result
Product is not visible

### Additional information
_No response_
### Release note
The Category Page is no longer empty when "Allow All Products Per Page" is set to "Yes".
### Triage and priority
- [ ] Severity: **S0** _- Affects critical data or functionality and leaves users without workaround._
- [X] Severity: **S1** _- Affects critical data or functionality and forces users to employ a workaround._
- [ ] Severity: **S2** _- Affects non-critical data or functionality and forces users to employ a workaround._
- [ ] Severity: **S3** _- Affects non-critical data or functionality and does not force users to employ a workaround._
- [ ] Severity: **S4** _- Affects aesthetics, professional look and feel, “quality” or “usability”._
- index: 1.0
- text_combine:
Catalog Page Empty if "Allow All Products Per Page" is set to "Yes" - ### Preconditions and environment
- Magento 2.4.5
### Steps to reproduce
Deploy 2.4.5 vanilla
Create Test Category
Create Test Product in test category
Browse to test category page on frontend and ensure product is visible.
Set Allow All Products Per Page setting (Config -> Catalog -> Catalog) to Yes

Clear Cache
Browse to test category page on frontend and ensure product is visible.
### Expected result
Product is visible

### Actual result
Product is not visible

### Additional information
_No response_
### Release note
The Category Page is no longer empty when "Allow All Products Per Page" is set to "Yes".
### Triage and priority
- [ ] Severity: **S0** _- Affects critical data or functionality and leaves users without workaround._
- [X] Severity: **S1** _- Affects critical data or functionality and forces users to employ a workaround._
- [ ] Severity: **S2** _- Affects non-critical data or functionality and forces users to employ a workaround._
- [ ] Severity: **S3** _- Affects non-critical data or functionality and does not force users to employ a workaround._
- [ ] Severity: **S4** _- Affects aesthetics, professional look and feel, “quality” or “usability”._
- label: non_test
- text:
catalog page empty if allow all products per page is set to yes preconditions and environment magento steps to reproduce deploy vanilla create test category create test product in test category browse to test category page on frontend and ensure product is visible set allow all products per page setting config catalog catalog to yes clear cache browse to test category page on frontend and ensure product is visible expected result product is visisble actual result product is not visible additional information no response release note the category page is no longer empty when allow all products per page is set to yes triage and priority severity affects critical data or functionality and leaves users without workaround severity affects critical data or functionality and forces users to employ a workaround severity affects non critical data or functionality and forces users to employ a workaround severity affects non critical data or functionality and does not force users to employ a workaround severity affects aesthetics professional look and feel “quality” or “usability”
- binary_label: 0
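Comparing the record above with its `text` value, the `text` column looks like `text_combine` lowercased with URLs, markup, digits, and most punctuation stripped, then whitespace collapsed (curly quotes, emoji, and accented letters survive in places). A rough sketch of that normalization, as an assumption rather than the dataset's actual pipeline:

```python
import re

def normalize(text_combine: str) -> str:
    # Approximation only: lowercase, drop URLs, keep letters and the curly
    # quotes that survive in the dump, then collapse runs of whitespace.
    t = text_combine.lower()
    t = re.sub(r"https?://\S+", " ", t)
    t = re.sub(r"[^a-z\u201c\u201d\s]", " ", t)
    return re.sub(r"\s+", " ", t).strip()
```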
---
**Row 248,486** · id 21,035,501,844 · IssuesEvent · 2022-03-31 07:26:16
- repo: rancher/qa-tasks (https://api.github.com/repos/rancher/qa-tasks)
- action: closed
- title: [2.4.18] Release testing tasks
- labels: team/area1 [zube]: Release Task area/release-testing qa-release-task
- body:
Start date - 03/25/2022
Due by - 03/29/2022
---
### Team 1 Release Tasks
- [x] ✅ HA upgrade testing - Anu and Andrew
- [x] ✅ docker install rollbacks - Anu
- index: 1.0
- text_combine:
[2.4.18] Release testing tasks - Start date - 03/25/2022
Due by - 03/29/2022
---
### Team 1 Release Tasks
- [x] ✅ HA upgrade testing - Anu and Andrew
- [x] ✅ docker install rollbacks - Anu
- label: test
- text:
release testing tasks start date due by team release tasks ✅ ha upgrade testing anu and andrew ✅ docker install rollbacks anu
- binary_label: 1
---
**Row 96,499** · id 8,616,740,703 · IssuesEvent · 2018-11-20 01:41:54
- repo: Ravonus/TrakBak (https://api.github.com/repos/Ravonus/TrakBak)
- action: closed
- title: Fix Mongoose Models
- labels: MVC Testing backend bug
- body:
Need to fix mongoose model templates. New logic is needed after implementing redis.
- index: 1.0
- text_combine:
Fix Mongoose Models - Need to fix mongoose model templates. New logic is needed after implementing redis.
- label: test
- text:
fix mongoose models need to fix mongoose model templates new logic is needed after implementing redis
- binary_label: 1
---
**Row 92,740** · id 8,377,389,868 · IssuesEvent · 2018-10-06 00:40:01
- repo: dotnet/corefx (https://api.github.com/repos/dotnet/corefx)
- action: closed
- title: PostAsyncExpect100Continue_FailsAfterContentSendStarted_Throws fails on CurlHandler
- labels: area-System.Net.Http test bug
- body:
This test failed on CurlHandler in one of the daily test runs today ([link](https://mc.dot.net/#/product/netcore/30/source/official~2Fcorefx~2Fmaster~2F/type/test~2Ffunctional~2Fcli~2F/build/20181005.06/workItem/System.Net.Http.Functional.Tests)).
I'm pretty sure the test is just invalid on CurlHandler, as the expected behavior depends on an Expect: 100-Continue header being sent. LibCurl won't send one unless the [content length is > 1024](https://gms.tf/when-curl-sends-100-continue.html#curl-logic).
- index: 1.0
- text_combine:
PostAsyncExpect100Continue_FailsAfterContentSendStarted_Throws fails on CurlHandler - This test failed on CurlHandler in one of the daily test runs today ([link](https://mc.dot.net/#/product/netcore/30/source/official~2Fcorefx~2Fmaster~2F/type/test~2Ffunctional~2Fcli~2F/build/20181005.06/workItem/System.Net.Http.Functional.Tests)).
I'm pretty sure the test is just invalid on CurlHandler, as the expected behavior depends on an Expect: 100-Continue header being sent. LibCurl won't send one unless the [content length is > 1024](https://gms.tf/when-curl-sends-100-continue.html#curl-logic).
- label: test
- text:
failsaftercontentsendstarted throws fails on curlhandler this test failed on curlhandler in one of the daily test runs today i m pretty sure the test is just invalid on curlhandler as the expected behavior depends on an expect continue header being sent libcurl won t send one unless the
- binary_label: 1
---
**Row 41,004** · id 10,262,660,448 · IssuesEvent · 2019-08-22 12:50:06
- repo: scipy/scipy (https://api.github.com/repos/scipy/scipy)
- action: closed
- title: minor 'iprint' documentation error in scipy.optimize.minimize L-BFGS-B
- labels: Documentation defect scipy.optimize
- body:
The documentation of L-BFGS-B (http://docs.scipy.org/doc/scipy/reference/optimize.minimize-lbfgsb.html) lists `iprint` as an argument in the signature, but it is missing in the description of the arguments, probably replaced by the seemingly redundant `disp`, itself there twice under two different types (bool and int).
- index: 1.0
- text_combine:
minor 'iprint' documentation error in scipy.optimize.minimize L-BFGS-B - The documentation of L-BFGS-B (http://docs.scipy.org/doc/scipy/reference/optimize.minimize-lbfgsb.html) lists `iprint` as an argument in the signature, but it is missing in the description of the arguments, probably replaced by the seemingly redundant `disp`, itself there twice under two different types (bool and int).
- label: non_test
- text:
minor iprint documentation error in scipy optimize minimize l bfgs b the documentation of l bfgs b lists iprint as an argument in the signature but it is missing in the description of the arguments probably replaced by the seemingly redundant disp itself there twice under two different types bool and int
- binary_label: 0
---
**Row 173,565** · id 14,429,187,563 · IssuesEvent · 2020-12-06 13:15:10
- repo: IIS-ZPI/ZPI2020_zaoczni_Grupa_P (https://api.github.com/repos/IIS-ZPI/ZPI2020_zaoczni_Grupa_P)
- action: opened
- title: Pierwszy sprint (First sprint)
- labels: documentation
- body:
### First Sprint 06.12.2020
- review of project requirements
- division of tasks
- environment setup.
- index: 1.0
- text_combine:
Pierwszy sprint - ### First Sprint 06.12.2020
- review of project requirements
- division of tasks
- environment setup.
- label: non_test
- text:
pierwszy sprint first sprint review of project requirements division of tasks environment setup
- binary_label: 0
---
**Row 715,206** · id 24,590,081,818 · IssuesEvent · 2022-10-14 00:44:45
- repo: OregonDigital/OD2 (https://api.github.com/repos/OregonDigital/OD2)
- action: opened
- title: Cache cacheable content and investigate additional caching options
- labels: Enhancement Priority - High MVP Review ruby
- body:
### Descriptive summary
OD1 is still able to take advantage of Rails 4's Page and Action caching, as you can see with the following command on OD1:
`memcached-tool localhost display`
Rails 5 removed the integrated Page and Action caching, which means OD2 does not yet have the same level of caching that OD1 has.
Without widespread caching, OD2 is going to have a difficult time handling large amounts of traffic. We are able to take advantage of other caches like ActiveFedora's LDP cache for Fedora content. We have Blazegraph for caching triples. We have caching for SQL queries to our primary PostgreSQL database. But to handle larger amounts of traffic and decrease overall response times we need more caching for our Rails frontend application.
Without caching, Rails has to query backend services for every request, which manifests as longer and longer response times as backend services get more busy. Our average response time when the system is relatively idle ranges from 1.5 - 2.5 seconds. While large amounts of Sidekiq jobs are running this can increase to 5 seconds to 20+ seconds.
With more caching we can maintain a more stable User Experience regardless of how busy the overall system is. This will also help to lighten the load on backend services. In addition to the extra resources that background jobs will have available, caching can be added to background jobs where it makes sense to speed them up even further.
Since earlier this week we're now hooked up to `memcached` via `Rails.application.config.cache_store`. We can `write` and `fetch` keys from the `Rails.cache`. Without anything taking advantage of that our `cache_store` is empty, or mostly empty.
```
irb(main):034:0> Rails.cache.stats["memcache.od2-prod.svc.cluster.local:11211"]["total_items"]
=> "1"
```
This was after creating and clearing keys manually, otherwise it would still be returning 0.
Out of the box we can [cache fragments and partials](https://guides.rubyonrails.org/caching_with_rails.html#fragment-caching)
To make use of caching for OD2, we will need to add caching code for content that should be cached.
Additionally, Rack Cache can cache content without changes to code. See the References section for a link.
The Page and Action caching that are missing in Rails 5 are available as gems, which we should test on Staging. See the References section for links.
We can also do low level caching with `Rails.cache.fetch()` and `Rails.cache.write()` for data from backends that won't change often, or are acceptable to be eventually correct. Whether that's SOLR query results, or responses from other external APIs, if the response changes infrequently or eventually correct results are ok we should be caching them. See the low level caching section of the Ruby on Rails Caching Guide for details.
### Expected behavior
Infrequently changed content should be cached wherever possible.
We should be seeing increasing numbers of items in the memcache cache_store.
### References
[https://guides.rubyonrails.org/caching_with_rails.html](https://guides.rubyonrails.org/caching_with_rails.html)
[https://rtomayko.github.io/rack-cache/configuration](https://rtomayko.github.io/rack-cache/configuration)
[https://github.com/rails/actionpack-page_caching](https://github.com/rails/actionpack-page_caching)
- index: 1.0
- text_combine:
Cache cacheable content and investigate additional caching options - ### Descriptive summary
OD1 is still able to take advantage of Rails 4's Page and Action caching, as you can see with the following command on OD1:
`memcached-tool localhost display`
Rails 5 removed the integrated Page and Action caching, which means OD2 does not yet have the same level of caching that OD1 has.
Without widespread caching, OD2 is going to have a difficult time handling large amounts of traffic. We are able to take advantage of other caches like ActiveFedora's LDP cache for Fedora content. We have Blazegraph for caching triples. We have caching for SQL queries to our primary PostgreSQL database. But to handle larger amounts of traffic and decrease overall response times we need more caching for our Rails frontend application.
Without caching, Rails has to query backend services for every request, which manifests as longer and longer response times as backend services get more busy. Our average response time when the system is relatively idle ranges from 1.5 - 2.5 seconds. While large amounts of Sidekiq jobs are running this can increase to 5 seconds to 20+ seconds.
With more caching we can maintain a more stable User Experience regardless of how busy the overall system is. This will also help to lighten the load on backend services. In addition to the extra resources that background jobs will have available, caching can be added to background jobs where it makes sense to speed them up even further.
Since earlier this week we're now hooked up to `memcached` via `Rails.application.config.cache_store`. We can `write` and `fetch` keys from the `Rails.cache`. Without anything taking advantage of that our `cache_store` is empty, or mostly empty.
```
irb(main):034:0> Rails.cache.stats["memcache.od2-prod.svc.cluster.local:11211"]["total_items"]
=> "1"
```
This was after creating and clearing keys manually, otherwise it would still be returning 0.
Out of the box we can [cache fragments and partials](https://guides.rubyonrails.org/caching_with_rails.html#fragment-caching)
To make use of caching for OD2, we will need to add caching code for content that should be cached.
Additionally, Rack Cache can cache content without changes to code. See the References section for a link.
The Page and Action caching that are missing in Rails 5 are available as gems, which we should test on Staging. See the References section for links.
We can also do low level caching with `Rails.cache.fetch()` and `Rails.cache.write()` for data from backends that won't change often, or are acceptable to be eventually correct. Whether that's SOLR query results, or responses from other external APIs, if the response changes infrequently or eventually correct results are ok we should be caching them. See the low level caching section of the Ruby on Rails Caching Guide for details.
### Expected behavior
Infrequently changed content should be cached wherever possible.
We should be seeing increasing numbers of items in the memcache cache_store.
### References
[https://guides.rubyonrails.org/caching_with_rails.html](https://guides.rubyonrails.org/caching_with_rails.html)
[https://rtomayko.github.io/rack-cache/configuration](https://rtomayko.github.io/rack-cache/configuration)
[https://github.com/rails/actionpack-page_caching](https://github.com/rails/actionpack-page_caching)
- label: non_test
- text:
cache cacheable content and investigate additional caching options descriptive summary is still able to take advantage of rails s page and action caching as you can see with the following command on memcached tool localhost display rails removed the integrated page and action caching which means does not yet have the same level of caching that has without widespread caching is going to have a difficult time handling large amounts of traffic we are able to take advantage of other caches like activefedora s ldp cache for fedora content we have blazegraph for caching triples we have caching for sql queries to our primary postgresql database but to handle larger amounts of traffic and decrease overall response times we need more caching for our rails frontend application without caching rails has to query backend services for every request which manifests as longer and longer response times as backend services get more busy our average response time when the system is relatively idle ranges from seconds while large amounts of sidekiq jobs are running this can increase to seconds to seconds with more caching we can maintain a more stable user experience regardless of how busy the overall system is this will also help to lighten the load on backend services in addition to the extra resources that background jobs will have available caching can be added to background jobs where it makes sense to speed them up even further since earlier this week we re now hooked up to memcached via rails application config cache store we can write and fetch keys from the rails cache without anything taking advantage of that our cache store is empty or mostly empty irb main rails cache stats this was after creating and clearing keys manually otherwise it would still be returning out of the box we can to make use of caching for we will need to add caching code for content that should be cached additionally rack cache can cache content without changes to code see the references section for a 
link the page and action caching that are missing in rails are available as gems which we should test on staging see the references section for links we can also do low level caching with rails cache fetch and rails cache write for data from backends that won t change often or are acceptable to be eventually correct whether that s solr query results or responses from other external apis if the response changes infrequently or eventually correct results are ok we should be caching them see the low level caching section of the ruby on rails caching guide for details expected behavior infrequently changed content should be cached wherever possible we should be seeing increasing numbers of items in the memcache cache store references
- binary_label: 0
---
**Row 253,638** · id 21,692,807,961 · IssuesEvent · 2022-05-09 16:56:44
- repo: rancher/backup-restore-operator (https://api.github.com/repos/rancher/backup-restore-operator)
- action: closed
- title: Set logrus Formatter as early as possible
- labels: [zube]: To Test status/dev-validate team/area3
- body:
In v2.1.2-rc2, the log lines are:
```
$ kubectl -n cattle-resources-system logs -l app.kubernetes.io/name=rancher-backup -f
time="2022-04-25T14:11:28Z" level=debug msg="Loglevel set to [debug]"
time="2022-04-25T14:11:28Z" level=info msg="Starting backup-restore controller version v0.0.0-dev (HEAD)"
INFO[2022/04/25 14:11:28] No PVC or S3 details provided for storing backups by default. User must specify storageLocation on each Backup CR
INFO[2022/04/25 14:11:28] Secrets containing encryption config files must be stored in the namespace cattle-resources-system
INFO[2022/04/25 14:11:28] Starting resources.cattle.io/v1, Kind=Backup controller
INFO[2022/04/25 14:11:28] Starting resources.cattle.io/v1, Kind=Restore controller
```
This shows different formatting and the reason is because formatting only happens after the first two lines.
- index: 1.0
- text_combine:
Set logrus Formatter as early as possible - In v2.1.2-rc2, the log lines are:
```
$ kubectl -n cattle-resources-system logs -l app.kubernetes.io/name=rancher-backup -f
time="2022-04-25T14:11:28Z" level=debug msg="Loglevel set to [debug]"
time="2022-04-25T14:11:28Z" level=info msg="Starting backup-restore controller version v0.0.0-dev (HEAD)"
INFO[2022/04/25 14:11:28] No PVC or S3 details provided for storing backups by default. User must specify storageLocation on each Backup CR
INFO[2022/04/25 14:11:28] Secrets containing encryption config files must be stored in the namespace cattle-resources-system
INFO[2022/04/25 14:11:28] Starting resources.cattle.io/v1, Kind=Backup controller
INFO[2022/04/25 14:11:28] Starting resources.cattle.io/v1, Kind=Restore controller
```
This shows different formatting and the reason is because formatting only happens after the first two lines.
- label: test
- text:
set logrus formatter as early as possible in the log lines are kubectl n cattle resources system logs l app kubernetes io name rancher backup f time level debug msg loglevel set to time level info msg starting backup restore controller version dev head info no pvc or details provided for storing backups by default user must specify storagelocation on each backup cr info secrets containing encryption config files must be stored in the namespace cattle resources system info starting resources cattle io kind backup controller info starting resources cattle io kind restore controller this shows different formatting and the reason is because formatting only happens after the first two lines
- binary_label: 1
---
**Row 318,140** · id 27,289,830,184 · IssuesEvent · 2023-02-23 15:52:12
- repo: cockroachdb/cockroach (https://api.github.com/repos/cockroachdb/cockroach)
- action: closed
- title: roachtest: jasync failed
- labels: C-test-failure O-robot O-roachtest T-sql-sessions branch-release-22.1
- body:
roachtest.jasync [failed](https://teamcity.cockroachdb.com/buildConfiguration/BUILDTYPE_ID-not-found-in-env/8440187?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/BUILDTYPE_ID-not-found-in-env/8440187?buildTab=artifacts#/jasync) on release-22.1 @ [4f4342ecdf97f478ec8db536643cd48c2f051689](https://github.com/cockroachdb/cockroach/commits/4f4342ecdf97f478ec8db536643cd48c2f051689):
```
test artifacts and logs in: /artifacts/jasync/run_1
(orm_helpers.go:193).summarizeFailed:
Tests run on Cockroach v22.1.13-71-g4f4342ecdf
Tests run against jasyncsql 6301aa1b9ef8a0d4c5cf6f3c095b30a388c62dc0
158 Total Tests Run
102 tests passed
56 tests failed
5 tests skipped
0 tests ignored
0 tests passed unexpectedly
1 test failed unexpectedly
0 tests expected failed but skipped
0 tests expected failed but not run
---
--- FAIL: com.github.aysnc.sql.db.integration.PreparedStatementSpec.handler should handle interceptors - unknown (unexpected)
For a full summary look at the jasyncsql artifacts
An updated blocklist (jasyncsqlBlocklist22_1) is available in the artifacts' jasyncsql log
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=4</code>
, <code>ROACHTEST_encrypted=false</code>
, <code>ROACHTEST_fs=ext4</code>
, <code>ROACHTEST_localSSD=true</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/sql-sessions
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*jasync.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
Jira issue: CRDB-23761
- index: 2.0
- text_combine:
roachtest: jasync failed - roachtest.jasync [failed](https://teamcity.cockroachdb.com/buildConfiguration/BUILDTYPE_ID-not-found-in-env/8440187?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/BUILDTYPE_ID-not-found-in-env/8440187?buildTab=artifacts#/jasync) on release-22.1 @ [4f4342ecdf97f478ec8db536643cd48c2f051689](https://github.com/cockroachdb/cockroach/commits/4f4342ecdf97f478ec8db536643cd48c2f051689):
```
test artifacts and logs in: /artifacts/jasync/run_1
(orm_helpers.go:193).summarizeFailed:
Tests run on Cockroach v22.1.13-71-g4f4342ecdf
Tests run against jasyncsql 6301aa1b9ef8a0d4c5cf6f3c095b30a388c62dc0
158 Total Tests Run
102 tests passed
56 tests failed
5 tests skipped
0 tests ignored
0 tests passed unexpectedly
1 test failed unexpectedly
0 tests expected failed but skipped
0 tests expected failed but not run
---
--- FAIL: com.github.aysnc.sql.db.integration.PreparedStatementSpec.handler should handle interceptors - unknown (unexpected)
For a full summary look at the jasyncsql artifacts
An updated blocklist (jasyncsqlBlocklist22_1) is available in the artifacts' jasyncsql log
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=4</code>
, <code>ROACHTEST_encrypted=false</code>
, <code>ROACHTEST_fs=ext4</code>
, <code>ROACHTEST_localSSD=true</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/sql-sessions
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*jasync.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
Jira issue: CRDB-23761
- label: test
- text:
roachtest jasync failed roachtest jasync with on release test artifacts and logs in artifacts jasync run orm helpers go summarizefailed tests run on cockroach tests run against jasyncsql total tests run tests passed tests failed tests skipped tests ignored tests passed unexpectedly test failed unexpectedly tests expected failed but skipped tests expected failed but not run fail com github aysnc sql db integration preparedstatementspec handler should handle interceptors unknown unexpected for a full summary look at the jasyncsql artifacts an updated blocklist is available in the artifacts jasyncsql log parameters roachtest cloud gce roachtest cpu roachtest encrypted false roachtest fs roachtest localssd true roachtest ssd help see see cc cockroachdb sql sessions jira issue crdb
- binary_label: 1
---
**Row 142,643** · id 21,796,804,122 · IssuesEvent · 2022-05-15 19:01:23
- repo: zulip/zulip (https://api.github.com/repos/zulip/zulip)
- action: opened
- title: UI redesign: message highlights and UI
- labels: area: message feed display redesign
- body:
As part of the redesign we want to update styles messages. This issue is not about [typography ](https://github.com/zulip/zulip/issues/22022) but about all the different ways to highlight the whole message.
New design of the possible message highlights consist from such layers:
0. theme message background
1. kind of message background
2. hover background
3. message content, including author's name and user pic
4. hover buttons
5. zulip focus rectangle
6. unread indicator
Here is the [Figma area](https://www.figma.com/file/jbNOiBWvbtLuHaiTj4CW0G/Zulip-Web-App?node-id=2054%3A132598) with all the variations:
<img width="868" alt="image" src="https://user-images.githubusercontent.com/1903309/168487056-af6ddcaa-becb-4612-88b3-16e83c410ece.png">
## Specification each layer
In Figma layers don't exactly reflect the order of containers/backgrounds in html/css. My proposal of implementation might not be optimal, so suggestions are welcome.
### 0. Theme message background
White for light theme `hsla(0, 0%, 100%, 1)`
<img width="921" alt="image" src="https://user-images.githubusercontent.com/1903309/168487355-bd6c97bd-814a-4a8b-a7d4-f948d1858568.png">
and almost black in dark `hsla(0, 0%, 14%, 1)`
<img width="919" alt="image" src="https://user-images.githubusercontent.com/1903309/168487375-ebfef654-fcb1-419c-b575-f953e671de71.png">
### 1. Kind of message background
This replaces the theme background.
- viewer's user mention in light `hsla(240, 24%, 96%, 1)` and in dark `hsla(240, 13%, 20%, 1)`
- viewer's group mention in light `hsla(180, 13%, 95%, 1)` and in dark `hsla(180, 13%, 15%, 1)`
- personal message in light `hsla(45, 20%, 96%, 1)` and in dark `hsla(46, 25%, 14%, 1)`
<img width="921" alt="image" src="https://user-images.githubusercontent.com/1903309/168487534-25e274c8-e216-4f0f-8e99-0209e0bb99b2.png">
<img width="879" alt="image" src="https://user-images.githubusercontent.com/1903309/168487826-d4904901-0e67-43fb-bb2c-3e752269f308.png">
### 2. Hover background
This goes on top of previous layer. In light theme is 2% black `hsla(0, 0%, 0%, 0.02)` and in dark theme is 20% black `background: hsla(0, 0%, 0%, 0.2);`
<img width="843" alt="image" src="https://user-images.githubusercontent.com/1903309/168488341-a0ab8674-65c8-45ee-99d0-476774d28d15.png">
### 3. Message content
Partly specified in https://github.com/zulip/zulip/issues/22022
### 4. Hover buttons (appear on the whole message hover)
<img width="798" alt="image" src="https://user-images.githubusercontent.com/1903309/168488621-06891a7c-bbcc-4d12-87b8-2e49d073f98f.png">
Every button width is 26px, height in the design sketch is 25px, but just to be aligned good with datetime label, so I suggest to start with 26x26 buttons and see if it would be ok to tune top margin.
Button icons has 50% opacity until hovered, to have 100% opacity. In light theme icon color is`#1D2E48` and in dark `#DBE3F0`
**Icons:**
[more-vertical](https://user-images.githubusercontent.com/1903309/164133383-9515181b-830e-4f3d-b733-d34f2899ce0d.svg) - custom icon
[star](https://user-images.githubusercontent.com/1903309/168489100-25f21e44-02aa-4c40-87dd-3f04aa89a911.svg) - https://feathericons.com/?query=star 1.33 stroke
[star-filled](https://user-images.githubusercontent.com/1903309/168489102-6d2e7026-578b-4944-8861-54706b4ffb83.svg) - https://feathericons.com/?query=star 1.33 stroke + filled
[smile](https://user-images.githubusercontent.com/1903309/168489103-7de92998-46bd-44ca-9ca6-1230fae5b5bb.svg) - https://feathericons.com/?query=smile 1.33 stroke
### 5. Zulip focus rectangle
Color in light `hsla(217, 64%, 59%, 0.6)` and in dark is `hsla(217, 64%, 59%, 0.7)`. In the figma `border-radius: 4px;` and stroke is 1px inside, so `box-sizing: border-box;`
<img width="1082" alt="image" src="https://user-images.githubusercontent.com/1903309/168489263-7f33951d-1548-485f-bfc5-92ab65272afb.png">
### 6. Unread indicator
1px left border, inside the whole message rectangle. Could be done with a shadow `box-shadow: inset 1px 0px 0px #3380FF;` (same for dark theme)
<img width="371" alt="image" src="https://user-images.githubusercontent.com/1903309/168489421-80e06846-a133-428e-99d9-c24330f03e91.png">
- index: 1.0
- text_combine:
UI redesign: message highlights and UI - As part of the redesign we want to update styles messages. This issue is not about [typography ](https://github.com/zulip/zulip/issues/22022) but about all the different ways to highlight the whole message.
New design of the possible message highlights consist from such layers:
0. theme message background
1. kind of message background
2. hover background
3. message content, including author's name and user pic
4. hover buttons
5. zulip focus rectangle
6. unread indicator
Here is the [Figma area](https://www.figma.com/file/jbNOiBWvbtLuHaiTj4CW0G/Zulip-Web-App?node-id=2054%3A132598) with all the variations:
<img width="868" alt="image" src="https://user-images.githubusercontent.com/1903309/168487056-af6ddcaa-becb-4612-88b3-16e83c410ece.png">
## Specification each layer
In Figma layers don't exactly reflect the order of containers/backgrounds in html/css. My proposal of implementation might not be optimal, so suggestions are welcome.
### 0. Theme message background
White for light theme `hsla(0, 0%, 100%, 1)`
<img width="921" alt="image" src="https://user-images.githubusercontent.com/1903309/168487355-bd6c97bd-814a-4a8b-a7d4-f948d1858568.png">
and almost black in dark `hsla(0, 0%, 14%, 1)`
<img width="919" alt="image" src="https://user-images.githubusercontent.com/1903309/168487375-ebfef654-fcb1-419c-b575-f953e671de71.png">
### 1. Kind of message background
This replaces the theme background.
- viewer's user mention in light `hsla(240, 24%, 96%, 1)` and in dark `hsla(240, 13%, 20%, 1)`
- viewer's group mention in light `hsla(180, 13%, 95%, 1)` and in dark `hsla(180, 13%, 15%, 1)`
- personal message in light `hsla(45, 20%, 96%, 1)` and in dark `hsla(46, 25%, 14%, 1)`
<img width="921" alt="image" src="https://user-images.githubusercontent.com/1903309/168487534-25e274c8-e216-4f0f-8e99-0209e0bb99b2.png">
<img width="879" alt="image" src="https://user-images.githubusercontent.com/1903309/168487826-d4904901-0e67-43fb-bb2c-3e752269f308.png">
### 2. Hover background
This goes on top of the previous layer. In the light theme it is 2% black `hsla(0, 0%, 0%, 0.02)` and in the dark theme 20% black `background: hsla(0, 0%, 0%, 0.2);`
<img width="843" alt="image" src="https://user-images.githubusercontent.com/1903309/168488341-a0ab8674-65c8-45ee-99d0-476774d28d15.png">
### 3. Message content
Partly specified in https://github.com/zulip/zulip/issues/22022
### 4. Hover buttons (appear on hover over the whole message)
<img width="798" alt="image" src="https://user-images.githubusercontent.com/1903309/168488621-06891a7c-bbcc-4d12-87b8-2e49d073f98f.png">
Every button is 26px wide; the height in the design sketch is 25px, but only so it aligns with the datetime label, so I suggest starting with 26x26 buttons and tuning the top margin if needed.
Button icons have 50% opacity until hovered, when they get 100% opacity. In the light theme the icon color is `#1D2E48` and in the dark theme `#DBE3F0`
**Icons:**
[more-vertical](https://user-images.githubusercontent.com/1903309/164133383-9515181b-830e-4f3d-b733-d34f2899ce0d.svg) - custom icon
[star](https://user-images.githubusercontent.com/1903309/168489100-25f21e44-02aa-4c40-87dd-3f04aa89a911.svg) - https://feathericons.com/?query=star 1.33 stroke
[star-filled](https://user-images.githubusercontent.com/1903309/168489102-6d2e7026-578b-4944-8861-54706b4ffb83.svg) - https://feathericons.com/?query=star 1.33 stroke + filled
[smile](https://user-images.githubusercontent.com/1903309/168489103-7de92998-46bd-44ca-9ca6-1230fae5b5bb.svg) - https://feathericons.com/?query=smile 1.33 stroke
### 5. Zulip focus rectangle
The color in the light theme is `hsla(217, 64%, 59%, 0.6)` and in the dark theme `hsla(217, 64%, 59%, 0.7)`. In the Figma file, `border-radius: 4px;` and the stroke is 1px inside, so `box-sizing: border-box;`
<img width="1082" alt="image" src="https://user-images.githubusercontent.com/1903309/168489263-7f33951d-1548-485f-bfc5-92ab65272afb.png">
### 6. Unread indicator
1px left border, inside the whole message rectangle. It could be done with a shadow: `box-shadow: inset 1px 0px 0px #3380FF;` (same for the dark theme)
<img width="371" alt="image" src="https://user-images.githubusercontent.com/1903309/168489421-80e06846-a133-428e-99d9-c24330f03e91.png">
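Putting the layers above together, here is a minimal CSS sketch of the light-theme variants. The class names are hypothetical (not from the Zulip codebase), and the focus rectangle uses `outline-offset: -1px` as one way to keep the 1px stroke inside the box without shifting layout:

```css
/* Hypothetical class names -- a sketch of the light-theme layers, not Zulip code. */

/* Layer 1: kind-of-message background (viewer's user mention) */
.message--mention { background: hsla(240, 24%, 96%, 1); }

/* Layer 2: hover tint stacked on top of whatever background is below it */
.message { position: relative; }
.message:hover::before {
    content: "";
    position: absolute;
    inset: 0;
    background: hsla(0, 0%, 0%, 0.02); /* dark theme: hsla(0, 0%, 0%, 0.2) */
    pointer-events: none;
}

/* Layer 5: focus rectangle; negative outline-offset draws the 1px stroke inside */
.message:focus-visible {
    outline: 1px solid hsla(217, 64%, 59%, 0.6);
    outline-offset: -1px;
    border-radius: 4px;
}

/* Layer 6: unread indicator as an inset shadow, so it takes no layout space */
.message--unread { box-shadow: inset 1px 0 0 #3380FF; }
```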
|
non_test
|
ui redesign message highlights and ui as part of the redesign we want to update styles messages this issue is not about but about all the different ways to highlight the whole message new design of the possible message highlights consist from such layers theme message background kind of message background hover background message content including author s name and user pic hover buttons zulip focus rectangle unread indicator here is the with all the variations img width alt image src specification each layer in figma layers don t exactly reflect the order of containers backgrounds in html css my proposal of implementation might not be optimal so suggestions are welcome theme message background white for light theme hsla img width alt image src and almost black in dark hsla img width alt image src kind of message background this replaces the theme background viewer s user mention in light hsla and in dark hsla viewer s group mention in light hsla and in dark hsla personal message in light hsla and in dark hsla img width alt image src img width alt image src hover background this goes on top of previous layer in light theme is black hsla and in dark theme is black background hsla img width alt image src message content partly specified in hover buttons appear on the whole message hover img width alt image src every button width is height in the design sketch is but just to be aligned good with datetime label so i suggest to start with buttons and see if it would be ok to tune top margin button icons has opacity until hovered to have opacity in light theme icon color is and in dark icons custom icon stroke stroke filled stroke zulip focus rectangle color in light hsla and in dark is hsla in the figma border radius and stroke is inside so box sizing border box img width alt image src unread indicator left border inside the whole message rectangle could be done with a shadow box shadow inset same for dark theme img width alt image src
| 0
|
324,416
| 27,808,015,499
|
IssuesEvent
|
2023-03-17 22:17:25
|
gravitational/teleport
|
https://api.github.com/repos/gravitational/teleport
|
closed
|
`TestGetHeadlessAuthentication/OK_same_user` flakiness
|
flaky tests headless-sso
|
## Failure
#### Link(s) to logs
- https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255
#### Relevant snippet
```
===================================================
[288](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:289)
OUTPUT github.com/gravitational/teleport/lib/auth.TestGetHeadlessAuthentication/OK_same_user
[289](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:290)
===================================================
[290](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:291)
=== RUN TestGetHeadlessAuthentication/OK_same_user
[291](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:292)
=== PAUSE TestGetHeadlessAuthentication/OK_same_user
[292](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:293)
=== CONT TestGetHeadlessAuthentication/OK_same_user
[293](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:294)
auth_with_roles_test.go:4310:
[294](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:295)
Error Trace: /__w/teleport/teleport/lib/auth/auth_with_roles_test.go:4310
[295](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:296)
Error: Received unexpected error:
[296](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:297)
rpc error: code = DeadlineExceeded desc = context deadline exceeded
[297](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:298)
Test: TestGetHeadlessAuthentication/OK_same_user
```
|
1.0
|
`TestGetHeadlessAuthentication/OK_same_user` flakiness - ## Failure
#### Link(s) to logs
- https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255
#### Relevant snippet
```
===================================================
[288](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:289)
OUTPUT github.com/gravitational/teleport/lib/auth.TestGetHeadlessAuthentication/OK_same_user
[289](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:290)
===================================================
[290](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:291)
=== RUN TestGetHeadlessAuthentication/OK_same_user
[291](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:292)
=== PAUSE TestGetHeadlessAuthentication/OK_same_user
[292](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:293)
=== CONT TestGetHeadlessAuthentication/OK_same_user
[293](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:294)
auth_with_roles_test.go:4310:
[294](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:295)
Error Trace: /__w/teleport/teleport/lib/auth/auth_with_roles_test.go:4310
[295](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:296)
Error: Received unexpected error:
[296](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:297)
rpc error: code = DeadlineExceeded desc = context deadline exceeded
[297](https://github.com/gravitational/teleport/actions/runs/4450963659/jobs/7817052223?pr=23255#step:6:298)
Test: TestGetHeadlessAuthentication/OK_same_user
```
|
test
|
testgetheadlessauthentication ok same user flakiness failure link s to logs relevant snippet output github com gravitational teleport lib auth testgetheadlessauthentication ok same user run testgetheadlessauthentication ok same user pause testgetheadlessauthentication ok same user cont testgetheadlessauthentication ok same user auth with roles test go error trace w teleport teleport lib auth auth with roles test go error received unexpected error rpc error code deadlineexceeded desc context deadline exceeded test testgetheadlessauthentication ok same user
| 1
|
267,280
| 8,381,214,923
|
IssuesEvent
|
2018-10-07 22:35:46
|
opentargets/genetics
|
https://api.github.com/repos/opentargets/genetics
|
closed
|
Gecko plot - investigate dense regions
|
Kind: Bug Priority: Critical
|
The Gecko plot can ask for a lot of data, particularly in the MHC region (6:28,477,797–33,448,354), which is dense in genes. One call to a 2MB window in this region recently broke the API due to an out of memory exception. Clickhouse appeared to be unaffected.
Proposed short term solution is to filter these regions and respond with a **Region too dense** message.
|
1.0
|
Gecko plot - investigate dense regions - The Gecko plot can ask for a lot of data, particularly in the MHC region (6:28,477,797–33,448,354), which is dense in genes. One call to a 2MB window in this region recently broke the API due to an out of memory exception. Clickhouse appeared to be unaffected.
Proposed short term solution is to filter these regions and respond with a **Region too dense** message.
|
non_test
|
gecko plot investigate dense regions the gecko plot can ask for a lot of data particularly in the mhc region – which is dense in genes one call to a window in this region recently broke the api due to an out of memory exception clickhouse appeared to be unaffected proposed short term solution is to filter these regions and respond with a region too dense message
| 0
|
135,482
| 11,007,813,133
|
IssuesEvent
|
2019-12-04 09:18:30
|
microsoft/AzureStorageExplorer
|
https://api.github.com/repos/microsoft/AzureStorageExplorer
|
opened
|
Add an option to generate 'share-level shared access signature URI' for one file
|
:gear: files 🧪 testing
|
**Storage Explorer Version:** 1.11.1
**Build:** [20191204.2](https://devdiv.visualstudio.com/DevDiv/_build/results?buildId=3292206)
**Branch:** master
**Platform/OS:** Windows 10/ Linux Ubuntu 18.04/macOS High Sierra
**Architecture:** ia32/x64
**Regression From:** Not a regression
**Steps to reproduce:**
1. Expand one storage account -> File Shares.
2. Select one file share which contains one file at least.
3. Select one file and right click it -> Select 'Get Shared Access Signature...'.
4. Check whether an option 'Generate share-level shared access signature URI' shows on the dialog.
**Expected Experience:**
Show an option 'Generate share-level shared access signature URI'

**Actual Experience:**
No option 'Generate share-level shared access signature URI' shows.

**More Info:**
For blobs:

|
1.0
|
Add an option to generate 'share-level shared access signature URI' for one file - **Storage Explorer Version:** 1.11.1
**Build:** [20191204.2](https://devdiv.visualstudio.com/DevDiv/_build/results?buildId=3292206)
**Branch:** master
**Platform/OS:** Windows 10/ Linux Ubuntu 18.04/macOS High Sierra
**Architecture:** ia32/x64
**Regression From:** Not a regression
**Steps to reproduce:**
1. Expand one storage account -> File Shares.
2. Select one file share which contains one file at least.
3. Select one file and right click it -> Select 'Get Shared Access Signature...'.
4. Check whether an option 'Generate share-level shared access signature URI' shows on the dialog.
**Expected Experience:**
Show an option 'Generate share-level shared access signature URI'

**Actual Experience:**
No option 'Generate share-level shared access signature URI' shows.

**More Info:**
For blobs:

|
test
|
add an option to generate share level shared access signature uri for one file storage explorer version build branch master platform os windows linux ubuntu macos high sierra architecture regression from not a regression steps to reproduce expand one storage account file shares select one file share which contains one file at least select one file and right click it select get shared access signature check whether there an option generate share level shared access signature uri shows on the dialog or not expect experience show an option generate share level shared access signature uri actual experience no option generate share level shared access signature uri shows more info for blobs
| 1
|
252,289
| 19,011,251,821
|
IssuesEvent
|
2021-11-23 09:34:06
|
farcaster-project/farcaster-node
|
https://api.github.com/repos/farcaster-project/farcaster-node
|
closed
|
Add alias example for connecting cli to docker
|
documentation enhancement
|
Create alias and use it in example command:
```shell
alias swap='swap-cli -x "lnpz://127.0.0.1:9981/?api=esb"'
swap info
```
|
1.0
|
Add alias example for connecting cli to docker - Create alias and use it in example command:
```shell
alias swap='swap-cli -x "lnpz://127.0.0.1:9981/?api=esb"'
swap info
```
|
non_test
|
add alias example for connecting cli to docker create alias and use it in example command shell alias swap swap cli x lnpz api esb swap info
| 0
|
100,465
| 21,338,016,970
|
IssuesEvent
|
2022-04-18 16:50:06
|
Arquisoft/dede_es2c
|
https://api.github.com/repos/Arquisoft/dede_es2c
|
closed
|
Problem finding products by price
|
bug code back-end
|
Right now, if I try to search for a product by price, nothing appears.
|
1.0
|
Problem finding products by price - Right now, if I try to search for a product by price, nothing appears.
|
non_test
|
problema para hallar productos por precio ahora mismo si intento buscar un producto por precio no me aparece nada
| 0
|
351,304
| 25,021,963,498
|
IssuesEvent
|
2022-11-04 02:14:29
|
AY2223S1-CS2103T-F11-4/tp
|
https://api.github.com/repos/AY2223S1-CS2103T-F11-4/tp
|
closed
|
[PE-D][Tester D] Small formatting errors to be fixed
|
documentation
|
Some small formatting errors found.
Screenshot 2022-10-28 at 4.57.05 PM
<!--session: 1666943395167-926869b6-f069-44c8-a26b-49c4a9a80e64-->
<!--Version: Web v3.4.4-->
-------------
Labels: `severity.VeryLow` `type.DocumentationBug`
original: samuelcheongws/ped#3
|
1.0
|
[PE-D][Tester D] Small formatting errors to be fixed - Some small formatting errors found.
Screenshot 2022-10-28 at 4.57.05 PM
<!--session: 1666943395167-926869b6-f069-44c8-a26b-49c4a9a80e64-->
<!--Version: Web v3.4.4-->
-------------
Labels: `severity.VeryLow` `type.DocumentationBug`
original: samuelcheongws/ped#3
|
non_test
|
small formatting errors to be fixed some small formatting errors found at pm labels severity verylow type documentationbug original samuelcheongws ped
| 0
|
606,937
| 18,770,542,113
|
IssuesEvent
|
2021-11-06 19:03:32
|
apcountryman/picolibrary
|
https://api.github.com/repos/apcountryman/picolibrary
|
closed
|
Fix missing mock WIZnet W5500 IP network stack functions
|
priority-normal status-awaiting_review type-bug
|
Fix missing mock WIZnet W5500 IP network stack (`::picolibrary::WIZnet::W5500::IP::Network_Stack`) functions:
- [x] `auto available_sockets() const noexcept -> std::uint_fast8_t;`
- [x] `auto tcp_ephemeral_port_allocation_enabled() const noexcept -> bool;`
- [x] `auto tcp_ephemeral_port_min() const noexcept -> ::picolibrary::IP::TCP::Port;`
- [x] `auto tcp_ephemeral_port_max() const noexcept -> ::picolibrary::IP::TCP::Port;`
|
1.0
|
Fix missing mock WIZnet W5500 IP network stack functions - Fix missing mock WIZnet W5500 IP network stack (`::picolibrary::WIZnet::W5500::IP::Network_Stack`) functions:
- [x] `auto available_sockets() const noexcept -> std::uint_fast8_t;`
- [x] `auto tcp_ephemeral_port_allocation_enabled() const noexcept -> bool;`
- [x] `auto tcp_ephemeral_port_min() const noexcept -> ::picolibrary::IP::TCP::Port;`
- [x] `auto tcp_ephemeral_port_max() const noexcept -> ::picolibrary::IP::TCP::Port;`
|
non_test
|
fix missing mock wiznet ip network stack functions fix missing mock wiznet ip network stack picolibrary wiznet ip network stack functions auto available sockets const noexcept std uint t auto tcp ephemeral port allocation enabled const noexcept bool auto tcp ephemeral port min const noexcept picolibrary ip tcp port auto tcp ephemeral port max const noexcept picolibrary ip tcp port
| 0
|
384,969
| 26,610,130,862
|
IssuesEvent
|
2023-01-23 23:07:01
|
jupyter-widgets/ipywidgets
|
https://api.github.com/repos/jupyter-widgets/ipywidgets
|
closed
|
The documentation fails to build on ReadTheDocs
|
documentation
|
## Description
The documentation fails to build on ReadTheDocs.
## Reproduce
The docs seem to be failing to build recently. This was noticed in https://github.com/jupyter-widgets/ipywidgets/pull/3661#issuecomment-1371032922: https://readthedocs.org/projects/ipywidgets/builds/19070899/
The error message is:
```
Command killed due to timeout or excessive memory consumption
```
This also seems to be the case on `latest` and most of the PRs.
## Expected behavior
Docs should build on ReadTheDocs.
## Context
This was noticed in https://github.com/jupyter-widgets/ipywidgets/pull/3661#issuecomment-1371032922.
- ipywidgets version: latest
- Operating System and version: N/A
- Browser and version: Any
|
1.0
|
The documentation fails to build on ReadTheDocs - ## Description
The documentation fails to build on ReadTheDocs.
## Reproduce
The docs seem to be failing to build recently. This was noticed in https://github.com/jupyter-widgets/ipywidgets/pull/3661#issuecomment-1371032922: https://readthedocs.org/projects/ipywidgets/builds/19070899/
The error message is:
```
Command killed due to timeout or excessive memory consumption
```
This also seems to be the case on `latest` and most of the PRs.
## Expected behavior
Docs should build on ReadTheDocs.
## Context
This was noticed in https://github.com/jupyter-widgets/ipywidgets/pull/3661#issuecomment-1371032922.
- ipywidgets version: latest
- Operating System and version: N/A
- Browser and version: Any
|
non_test
|
the documentation fails to build on readthedocs description the documentation fails to build on readthedocs reproduce the docs seem to be failing to build recently this was noticed in the error message is command killed due to timeout or excessive memory consumption this also seems to be the case on latest and most of the prs expected behavior docs should build on readthedocs context this was noticed in ipywidgets version latest operating system and version n a browser and version any
| 0
|
83,148
| 15,696,038,929
|
IssuesEvent
|
2021-03-26 01:02:07
|
jgeraigery/cloud-native-starter
|
https://api.github.com/repos/jgeraigery/cloud-native-starter
|
opened
|
CVE-2020-1935 (Medium) detected in tomcat-embed-core-9.0.21.jar, tomcat-embed-core-8.5.39.jar
|
security vulnerability
|
## CVE-2020-1935 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>tomcat-embed-core-9.0.21.jar</b>, <b>tomcat-embed-core-8.5.39.jar</b></p></summary>
<p>
<details><summary><b>tomcat-embed-core-9.0.21.jar</b></p></summary>
<p>Core Tomcat implementation</p>
<p>Library home page: <a href="https://tomcat.apache.org/">https://tomcat.apache.org/</a></p>
<p>Path to dependency file: cloud-native-starter/articles-java-spring-boot/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/9.0.21/tomcat-embed-core-9.0.21.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.1.6.RELEASE.jar (Root Library)
- spring-boot-starter-tomcat-2.1.6.RELEASE.jar
- :x: **tomcat-embed-core-9.0.21.jar** (Vulnerable Library)
</details>
<details><summary><b>tomcat-embed-core-8.5.39.jar</b></p></summary>
<p>Core Tomcat implementation</p>
<p>Path to dependency file: cloud-native-starter/authors-java-spring-boot/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/8.5.39/tomcat-embed-core-8.5.39.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.0.9.RELEASE.jar (Root Library)
- spring-boot-starter-tomcat-2.0.9.RELEASE.jar
- :x: **tomcat-embed-core-8.5.39.jar** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Apache Tomcat 9.0.0.M1 to 9.0.30, 8.5.0 to 8.5.50 and 7.0.0 to 7.0.99 the HTTP header parsing code used an approach to end-of-line parsing that allowed some invalid HTTP headers to be parsed as valid. This led to a possibility of HTTP Request Smuggling if Tomcat was located behind a reverse proxy that incorrectly handled the invalid Transfer-Encoding header in a particular manner. Such a reverse proxy is considered unlikely.
<p>Publish Date: 2020-02-24
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1935>CVE-2020-1935</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-6v7p-v754-j89v">https://github.com/advisories/GHSA-6v7p-v754-j89v</a></p>
<p>Release Date: 2020-02-24</p>
<p>Fix Resolution: org.apache.tomcat.embed:tomcat-embed-core:7.0.100,8.5.51,9.0.31;org.apache.tomcat:tomcat-coyote:7.0.100,8.5.51,9.0.31</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.tomcat.embed","packageName":"tomcat-embed-core","packageVersion":"9.0.21","packageFilePaths":["/articles-java-spring-boot/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.1.6.RELEASE;org.springframework.boot:spring-boot-starter-tomcat:2.1.6.RELEASE;org.apache.tomcat.embed:tomcat-embed-core:9.0.21","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.tomcat.embed:tomcat-embed-core:7.0.100,8.5.51,9.0.31;org.apache.tomcat:tomcat-coyote:7.0.100,8.5.51,9.0.31"},{"packageType":"Java","groupId":"org.apache.tomcat.embed","packageName":"tomcat-embed-core","packageVersion":"8.5.39","packageFilePaths":["/authors-java-spring-boot/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.0.9.RELEASE;org.springframework.boot:spring-boot-starter-tomcat:2.0.9.RELEASE;org.apache.tomcat.embed:tomcat-embed-core:8.5.39","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.tomcat.embed:tomcat-embed-core:7.0.100,8.5.51,9.0.31;org.apache.tomcat:tomcat-coyote:7.0.100,8.5.51,9.0.31"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-1935","vulnerabilityDetails":"In Apache Tomcat 9.0.0.M1 to 9.0.30, 8.5.0 to 8.5.50 and 7.0.0 to 7.0.99 the HTTP header parsing code used an approach to end-of-line parsing that allowed some invalid HTTP headers to be parsed as valid. This led to a possibility of HTTP Request Smuggling if Tomcat was located behind a reverse proxy that incorrectly handled the invalid Transfer-Encoding header in a particular manner. 
Such a reverse proxy is considered unlikely.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1935","cvss3Severity":"medium","cvss3Score":"4.8","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-1935 (Medium) detected in tomcat-embed-core-9.0.21.jar, tomcat-embed-core-8.5.39.jar - ## CVE-2020-1935 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>tomcat-embed-core-9.0.21.jar</b>, <b>tomcat-embed-core-8.5.39.jar</b></p></summary>
<p>
<details><summary><b>tomcat-embed-core-9.0.21.jar</b></p></summary>
<p>Core Tomcat implementation</p>
<p>Library home page: <a href="https://tomcat.apache.org/">https://tomcat.apache.org/</a></p>
<p>Path to dependency file: cloud-native-starter/articles-java-spring-boot/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/9.0.21/tomcat-embed-core-9.0.21.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.1.6.RELEASE.jar (Root Library)
- spring-boot-starter-tomcat-2.1.6.RELEASE.jar
- :x: **tomcat-embed-core-9.0.21.jar** (Vulnerable Library)
</details>
<details><summary><b>tomcat-embed-core-8.5.39.jar</b></p></summary>
<p>Core Tomcat implementation</p>
<p>Path to dependency file: cloud-native-starter/authors-java-spring-boot/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/8.5.39/tomcat-embed-core-8.5.39.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.0.9.RELEASE.jar (Root Library)
- spring-boot-starter-tomcat-2.0.9.RELEASE.jar
- :x: **tomcat-embed-core-8.5.39.jar** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Apache Tomcat 9.0.0.M1 to 9.0.30, 8.5.0 to 8.5.50 and 7.0.0 to 7.0.99 the HTTP header parsing code used an approach to end-of-line parsing that allowed some invalid HTTP headers to be parsed as valid. This led to a possibility of HTTP Request Smuggling if Tomcat was located behind a reverse proxy that incorrectly handled the invalid Transfer-Encoding header in a particular manner. Such a reverse proxy is considered unlikely.
<p>Publish Date: 2020-02-24
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1935>CVE-2020-1935</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-6v7p-v754-j89v">https://github.com/advisories/GHSA-6v7p-v754-j89v</a></p>
<p>Release Date: 2020-02-24</p>
<p>Fix Resolution: org.apache.tomcat.embed:tomcat-embed-core:7.0.100,8.5.51,9.0.31;org.apache.tomcat:tomcat-coyote:7.0.100,8.5.51,9.0.31</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.tomcat.embed","packageName":"tomcat-embed-core","packageVersion":"9.0.21","packageFilePaths":["/articles-java-spring-boot/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.1.6.RELEASE;org.springframework.boot:spring-boot-starter-tomcat:2.1.6.RELEASE;org.apache.tomcat.embed:tomcat-embed-core:9.0.21","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.tomcat.embed:tomcat-embed-core:7.0.100,8.5.51,9.0.31;org.apache.tomcat:tomcat-coyote:7.0.100,8.5.51,9.0.31"},{"packageType":"Java","groupId":"org.apache.tomcat.embed","packageName":"tomcat-embed-core","packageVersion":"8.5.39","packageFilePaths":["/authors-java-spring-boot/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.0.9.RELEASE;org.springframework.boot:spring-boot-starter-tomcat:2.0.9.RELEASE;org.apache.tomcat.embed:tomcat-embed-core:8.5.39","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.tomcat.embed:tomcat-embed-core:7.0.100,8.5.51,9.0.31;org.apache.tomcat:tomcat-coyote:7.0.100,8.5.51,9.0.31"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-1935","vulnerabilityDetails":"In Apache Tomcat 9.0.0.M1 to 9.0.30, 8.5.0 to 8.5.50 and 7.0.0 to 7.0.99 the HTTP header parsing code used an approach to end-of-line parsing that allowed some invalid HTTP headers to be parsed as valid. This led to a possibility of HTTP Request Smuggling if Tomcat was located behind a reverse proxy that incorrectly handled the invalid Transfer-Encoding header in a particular manner. 
Such a reverse proxy is considered unlikely.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1935","cvss3Severity":"medium","cvss3Score":"4.8","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_test
|
cve medium detected in tomcat embed core jar tomcat embed core jar cve medium severity vulnerability vulnerable libraries tomcat embed core jar tomcat embed core jar tomcat embed core jar core tomcat implementation library home page a href path to dependency file cloud native starter articles java spring boot pom xml path to vulnerable library home wss scanner repository org apache tomcat embed tomcat embed core tomcat embed core jar dependency hierarchy spring boot starter web release jar root library spring boot starter tomcat release jar x tomcat embed core jar vulnerable library tomcat embed core jar core tomcat implementation path to dependency file cloud native starter authors java spring boot pom xml path to vulnerable library home wss scanner repository org apache tomcat embed tomcat embed core tomcat embed core jar dependency hierarchy spring boot starter web release jar root library spring boot starter tomcat release jar x tomcat embed core jar vulnerable library found in base branch master vulnerability details in apache tomcat to to and to the http header parsing code used an approach to end of line parsing that allowed some invalid http headers to be parsed as valid this led to a possibility of http request smuggling if tomcat was located behind a reverse proxy that incorrectly handled the invalid transfer encoding header in a particular manner such a reverse proxy is considered unlikely publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache tomcat embed tomcat embed core org apache tomcat tomcat coyote isopenpronvulnerability true ispackagebased true isdefaultbranch true packages 
istransitivedependency true dependencytree org springframework boot spring boot starter web release org springframework boot spring boot starter tomcat release org apache tomcat embed tomcat embed core isminimumfixversionavailable true minimumfixversion org apache tomcat embed tomcat embed core org apache tomcat tomcat coyote packagetype java groupid org apache tomcat embed packagename tomcat embed core packageversion packagefilepaths istransitivedependency true dependencytree org springframework boot spring boot starter web release org springframework boot spring boot starter tomcat release org apache tomcat embed tomcat embed core isminimumfixversionavailable true minimumfixversion org apache tomcat embed tomcat embed core org apache tomcat tomcat coyote basebranches vulnerabilityidentifier cve vulnerabilitydetails in apache tomcat to to and to the http header parsing code used an approach to end of line parsing that allowed some invalid http headers to be parsed as valid this led to a possibility of http request smuggling if tomcat was located behind a reverse proxy that incorrectly handled the invalid transfer encoding header in a particular manner such a reverse proxy is considered unlikely vulnerabilityurl
| 0
|
450,015
| 12,978,523,394
|
IssuesEvent
|
2020-07-21 23:09:19
|
ChainSafe/gossamer
|
https://api.github.com/repos/ChainSafe/gossamer
|
opened
|
increase network message size limit
|
Priority: 3 - Medium Type: Maintenance network
|
<!---
PLEASE READ CAREFULLY
-->
## Expected Behavior
<!---
If you're describing a bug, tell us what should happen.
If you're suggesting a change/improvement, tell us how it should work.
-->
- the network message size limit is much lower than it should be
- eg. BlockResponses are currently limited to 12, although it should be able to be increased to 128
- investigate why messages get truncated when they are too large and fix it
- updating libp2p is also a good idea since we are on a rather old version
- the message size limit should be around 4MB: https://godoc.org/github.com/libp2p/go-libp2p-core/network#pkg-constants
## Checklist
<!---
Each empty square brackets below is a checkbox. Replace [ ] with [x] to check
the box after completing the task.
--->
- [ ] I have read [CODE_OF_CONDUCT](https://github.com/ChainSafe/gossamer/blob/development/.github/CODE_OF_CONDUCT.md) and [CONTRIBUTING](https://github.com/ChainSafe/gossamer/blob/development/.github/CONTRIBUTING.md)
- [ ] I have provided as much information as possible and necessary
- [ ] I am planning to submit a pull request to fix this issue myself
|
1.0
|
increase network message size limit - <!---
PLEASE READ CAREFULLY
-->
## Expected Behavior
<!---
If you're describing a bug, tell us what should happen.
If you're suggesting a change/improvement, tell us how it should work.
-->
- the network message size limit is much lower than it should be
- eg. BlockResponses are currently limited to 12, although it should be able to be increased to 128
- investigate why messages get truncated when they are too large and fix it
- updating libp2p is also a good idea since we are on a rather old version
- the message size limit should be around 4MB: https://godoc.org/github.com/libp2p/go-libp2p-core/network#pkg-constants
## Checklist
<!---
Each empty square brackets below is a checkbox. Replace [ ] with [x] to check
the box after completing the task.
--->
- [ ] I have read [CODE_OF_CONDUCT](https://github.com/ChainSafe/gossamer/blob/development/.github/CODE_OF_CONDUCT.md) and [CONTRIBUTING](https://github.com/ChainSafe/gossamer/blob/development/.github/CONTRIBUTING.md)
- [ ] I have provided as much information as possible and necessary
- [ ] I am planning to submit a pull request to fix this issue myself
|
non_test
|
increase network message size limit please read carefully expected behavior if you re describing a bug tell us what should happen if you re suggesting a change improvement tell us how it should work the network message size limit is much lower than it should be eg blockresponses are currently limited to although it should be able to be increased to investigate why messages get truncated when they are too large and fix it updating is also a good idea since we are on a rather old version the message size limit should be around checklist each empty square brackets below is a checkbox replace with to check the box after completing the task i have read and i have provided as much information as possible and necessary i am planning to submit a pull request to fix this issue myself
| 0
|
172,693
| 14,379,552,872
|
IssuesEvent
|
2020-12-02 00:37:52
|
vmware-samples/vcenter-event-broker-appliance
|
https://api.github.com/repos/vmware-samples/vcenter-event-broker-appliance
|
closed
|
Update DESIGN.MD with latest feature enhancements
|
documentation stale
|
`DESIGN.MD` is outdated, especially the sections on recovery, retries and idempotency.
|
1.0
|
Update DESIGN.MD with latest feature enhancements - `DESIGN.MD` is outdated, especially the sections on recovery, retries and idempotency.
|
non_test
|
update design md with latest feature enhancements design md is outdated especially the sections on recovery retries and idempotency
| 0
|
319,647
| 23,783,483,413
|
IssuesEvent
|
2022-09-02 07:54:34
|
kubermatic/kubeone
|
https://api.github.com/repos/kubermatic/kubeone
|
closed
|
Deploy Operating System Profile with KubeOne
|
kind/documentation sig/cluster-management
|
### Description of the feature you would like to add / User story
<!-- We've provided an example format of the user story below. You're free to use any other format as well. -->
As a user of KubeOne I want to be able to deploy operating-system-profiles while I deploy my cluster, to use my custom profiles from the beginning.
Right now if you try to deploy OSP via KubeOne addon, you get templating errors like this:
```
WARN[09:54:50 CEST] Task failed, error was: runtime: executing addons manifest template "osp-ubuntu-custom.yaml"
template: addons-base:26:63: executing "addons-base" at <.Token>: can't evaluate field Token in type addons.templateData
WARN[09:55:00 CEST] Retrying task...
```
### Solution details
<!-- Please, provide a bullet-pointed list or a few sentences of requirements that have to be met to mark the requested feature (user story) complete. -->
- operating-system-profile should be deployable via KubeOne Addon
|
1.0
|
Deploy Operating System Profile with KubeOne - ### Description of the feature you would like to add / User story
<!-- We've provided an example format of the user story below. You're free to use any other format as well. -->
As a user of KubeOne I want to be able to deploy operating-system-profiles while I deploy my cluster, to use my custom profiles from the beginning.
Right now if you try to deploy OSP via KubeOne addon, you get templating errors like this:
```
WARN[09:54:50 CEST] Task failed, error was: runtime: executing addons manifest template "osp-ubuntu-custom.yaml"
template: addons-base:26:63: executing "addons-base" at <.Token>: can't evaluate field Token in type addons.templateData
WARN[09:55:00 CEST] Retrying task...
```
### Solution details
<!-- Please, provide a bullet-pointed list or a few sentences of requirements that have to be met to mark the requested feature (user story) complete. -->
- operating-system-profile should be deployable via KubeOne Addon
|
non_test
|
deploy operating system profile with kubeone description of the feature you would like to add user story as a user of kubeone i want to be able to deploy operating system profiles while i deploy my cluster to use my custom profiles from the beginning right now if you try to deploy osp via kubeone addon you get templating errors like this warn task failed error was runtime executing addons manifest template osp ubuntu custom yaml template addons base executing addons base at can t evaluate field token in type addons templatedata warn retrying task solution details operating system profile should be deployable via kubeone addon
| 0
|
295,388
| 22,210,090,480
|
IssuesEvent
|
2022-06-07 18:22:05
|
loudelmar/tareaScrum
|
https://api.github.com/repos/loudelmar/tareaScrum
|
closed
|
renombrar readme
|
documentation
|
Cambiarle el nombre al readme para cumplir con la actividad.
Hay que ponerle como nombre "ConceptosScrum"
|
1.0
|
renombrar readme - Cambiarle el nombre al readme para cumplir con la actividad.
Hay que ponerle como nombre "ConceptosScrum"
|
non_test
|
renombrar readme cambiarle el nombre al readme para cumplir con la actividad hay que ponerle como nombre conceptosscrum
| 0
|
236,945
| 19,587,492,036
|
IssuesEvent
|
2022-01-05 09:00:23
|
AlvarofbUS/decide-single-picacho
|
https://api.github.com/repos/AlvarofbUS/decide-single-picacho
|
closed
|
Test para la funcionalidad de cabina
|
enhancement Test
|
Realizar los tests necesarios para probar la nueva funcionalidad implementada en el módulo de cabina
- [x] Tests realizado
- [x] Usar selenium
|
1.0
|
Test para la funcionalidad de cabina - Realizar los tests necesarios para probar la nueva funcionalidad implementada en el módulo de cabina
- [x] Tests realizado
- [x] Usar selenium
|
test
|
test para la funcionalidad de cabina realizar los tests necesarios para probar la nueva funcionalidad implementada en el módulo de cabina tests realizado usar selenium
| 1
|
150,330
| 11,956,575,928
|
IssuesEvent
|
2020-04-04 11:08:43
|
qrdl/flightrec
|
https://api.github.com/repos/qrdl/flightrec
|
closed
|
Test script for running
|
testing
|
Script to test:
- running/continuing forward
- running/continuing backward
- step in forward
- step out forward
- step over forward
- step over backward (no other backward steps available by protocol)
- frames (stack trace)
|
1.0
|
Test script for running - Script to test:
- running/continuing forward
- running/continuing backward
- step in forward
- step out forward
- step over forward
- step over backward (no other backward steps available by protocol)
- frames (stack trace)
|
test
|
test script for running script to test running continuing forward running continuing backward step in forward step out forward step over forward step over backward no other backward steps available by protocol frames stack trace
| 1
|
62,649
| 6,802,323,950
|
IssuesEvent
|
2017-11-02 19:47:37
|
WikiWatershed/model-my-watershed
|
https://api.github.com/repos/WikiWatershed/model-my-watershed
|
closed
|
Implement revised CINERGI detail view
|
BiG CZ November demo BigCZ tested/verified
|
Currently looks like:

Waiting on mockup from Anthony + Emilio.
|
1.0
|
Implement revised CINERGI detail view - Currently looks like:

Waiting on mockup from Anthony + Emilio.
|
test
|
implement revised cinergi detail view currently looks like waiting on mockup from anthony emilio
| 1
|
170,689
| 6,469,404,063
|
IssuesEvent
|
2017-08-17 05:46:02
|
vmware/harbor
|
https://api.github.com/repos/vmware/harbor
|
closed
|
Angular is running in the development mode. Call enableProdMode() to enable the production mode.
|
area/ui kind/bug priority/medium target/1.2.0
|
If you are reporting a problem, please make sure the following information are provided:
1) Version of docker engine and docker-compose.
```shell
# docker version
Client:
Version: 17.03.0-ce
API version: 1.26
Go version: go1.7.5
Git commit: 60ccb22
Built: Thu Feb 23 10:54:03 2017
OS/Arch: linux/amd64
Server:
Version: 17.03.0-ce
API version: 1.26 (minimum version 1.12)
Go version: go1.7.5
Git commit: 60ccb22
Built: Thu Feb 23 10:54:03 2017
OS/Arch: linux/amd64
Experimental: true
```
```shell
# docker-compose version
docker-compose version 1.14.0, build c7bdf9e
docker-py version: 2.3.0
CPython version: 2.7.13
OpenSSL version: OpenSSL 1.0.1t 3 May 2016
```
2) Config files of harbor, you can get them by packaging "harbor.cfg" and files in the same directory, including subdirectory.
3) Log files, you can get them by package the /var/log/harbor/ .
------------
4) Issue detail:
It shows the message below on chrome developer tools console:
```
Angular is running in the development mode. Call enableProdMode() to enable the production mode.
```
5) Harbor version:

|
1.0
|
Angular is running in the development mode. Call enableProdMode() to enable the production mode. - If you are reporting a problem, please make sure the following information are provided:
1) Version of docker engine and docker-compose.
```shell
# docker version
Client:
Version: 17.03.0-ce
API version: 1.26
Go version: go1.7.5
Git commit: 60ccb22
Built: Thu Feb 23 10:54:03 2017
OS/Arch: linux/amd64
Server:
Version: 17.03.0-ce
API version: 1.26 (minimum version 1.12)
Go version: go1.7.5
Git commit: 60ccb22
Built: Thu Feb 23 10:54:03 2017
OS/Arch: linux/amd64
Experimental: true
```
```shell
# docker-compose version
docker-compose version 1.14.0, build c7bdf9e
docker-py version: 2.3.0
CPython version: 2.7.13
OpenSSL version: OpenSSL 1.0.1t 3 May 2016
```
2) Config files of harbor, you can get them by packaging "harbor.cfg" and files in the same directory, including subdirectory.
3) Log files, you can get them by package the /var/log/harbor/ .
------------
4) Issue detail:
It shows the message below on chrome developer tools console:
```
Angular is running in the development mode. Call enableProdMode() to enable the production mode.
```
5) Harbor version:

|
non_test
|
angular is running in the development mode call enableprodmode to enable the production mode if you are reporting a problem please make sure the following information are provided version of docker engine and docker compose shell docker version client version ce api version go version git commit built thu feb os arch linux server version ce api version minimum version go version git commit built thu feb os arch linux experimental true shell docker compose version docker compose version build docker py version cpython version openssl version openssl may config files of harbor you can get them by packaging harbor cfg and files in the same directory including subdirectory log files you can get them by package the var log harbor issue detail it shows the message below on chrome developer tools console angular is running in the development mode call enableprodmode to enable the production mode harbor version
| 0
|
797,350
| 28,144,382,779
|
IssuesEvent
|
2023-04-02 10:10:11
|
Atlas-OS/Atlas
|
https://api.github.com/repos/Atlas-OS/Atlas
|
closed
|
[BUG REPORT] - Cannot enable Hyper-V
|
bug high priority
|
### Description
Hyper-V is not enabled after running the included cmd file and restarting.
### Steps to reproduce (if applicable add screenshots)
1) Go to Atlas folder --> 3. Configuration\1. General Configuration\Hyper-V and VBS
2) Run Enable Hyper-V and VBS.cmd
3) Error messages show up, and after restarting, Hyper-V is not enabled.
### Expected behavior
Hyper-V would get enabled.
### Actual behavior (if applicable add screenshots)


### Atlas Version
Atlas 10 22H2
### Desktop information
- Ryzen 5 3500
- 16GB RAM
- GTX 1660
- Windows installed on SSD.
### Requisites
- [X] This is not a support issue or a question. For any support, questions or help, join our [Discord server](https://discord.com/invite/atlasos).
- [X] I performed a [cursory search of the issue tracker](https://github.com/Atlas-OS/Atlas/issues?q=is%3Aissue) to avoid opening a duplicate issue.
- [X] I checked the [documentation](https://docs.atlasos.net) to understand that the issue I am reporting is not normal behavior.
- [X] I understand that not filling out this template will lead to the issue being closed.
### Additional content
_No response_
|
1.0
|
[BUG REPORT] - Cannot enable Hyper-V - ### Description
Hyper-V is not enabled after running the included cmd file and restarting.
### Steps to reproduce (if applicable add screenshots)
1) Go to Atlas folder --> 3. Configuration\1. General Configuration\Hyper-V and VBS
2) Run Enable Hyper-V and VBS.cmd
3) Error messages show up, and after restarting, Hyper-V is not enabled.
### Expected behavior
Hyper-V would get enabled.
### Actual behavior (if applicable add screenshots)


### Atlas Version
Atlas 10 22H2
### Desktop information
- Ryzen 5 3500
- 16GB RAM
- GTX 1660
- Windows installed on SSD.
### Requisites
- [X] This is not a support issue or a question. For any support, questions or help, join our [Discord server](https://discord.com/invite/atlasos).
- [X] I performed a [cursory search of the issue tracker](https://github.com/Atlas-OS/Atlas/issues?q=is%3Aissue) to avoid opening a duplicate issue.
- [X] I checked the [documentation](https://docs.atlasos.net) to understand that the issue I am reporting is not normal behavior.
- [X] I understand that not filling out this template will lead to the issue being closed.
### Additional content
_No response_
|
non_test
|
cannot enable hyper v description hyper v is not enabled after running the included cmd file and restarting steps to reproduce if applicable add screenshots go to atlas folder configuration general configuration hyper v and vbs run enable hyper v and vbs cmd error messages show up and after restarting hyper v is not enabled expected behavior hyper v would get enabled actual behavior if applicable add screenshots atlas version atlas desktop information ryzen ram gtx windows installed on ssd requisites this is not a support issue or a question for any support questions or help join our i performed a to avoid opening a duplicate issue i checked the to understand that the issue i am reporting is not normal behavior i understand that not filling out this template will lead to the issue being closed additional content no response
| 0
|
302,534
| 22,829,146,730
|
IssuesEvent
|
2022-07-12 11:20:51
|
OPEN-NEXT/wp2.2_dev
|
https://api.github.com/repos/OPEN-NEXT/wp2.2_dev
|
closed
|
Generate DSMs and File co-edition graphs
|
bug documentation enhancement data
|
related to #18
- [x] test the script on more repositories
- [x] feed the script a list of repositories instead of one at a time (new config parameter; issue #19)
- [ ] Identity management (issue #15)
- [x] Pull github issues (issue #20)
- [ ] Generate interaction graphs based on file co-edition and issue participation (#23)
- [ ] Generate DSMs based on file co-edition and issue participation
- [ ] first: test with considering a link between files when they are showing up in the same commit.
- [ ] second: experiment with weighting between files depending on the frequency of their co-editions within commits.
|
1.0
|
Generate DSMs and File co-edition graphs - related to #18
- [x] test the script on more repositories
- [x] feed the script a list of repositories instead of one at a time (new config parameter; issue #19)
- [ ] Identity management (issue #15)
- [x] Pull github issues (issue #20)
- [ ] Generate interaction graphs based on file co-edition and issue participation (#23)
- [ ] Generate DSMs based on file co-edition and issue participation
- [ ] first: test with considering a link between files when they are showing up in the same commit.
- [ ] second: experiment with weighting between files depending on the frequency of their co-editions within commits.
|
non_test
|
generate dsms and file co edition graphs related to test the script on more repositories feed the script a list of repositories instead of one at a time new config parameter issue identity management issue pull github issues issue generate interaction graphs based on file co edition and issue participation generate dsms based on file co edition and issue participation first test with considering a link between files when they are showing up in the same commit second experiment with weighting between files depending on the frequency of their co editions within commits
| 0
|
25,377
| 25,083,958,848
|
IssuesEvent
|
2022-11-07 21:52:59
|
ZcashFoundation/zebra
|
https://api.github.com/repos/ZcashFoundation/zebra
|
opened
|
ci: notify team members in our chat system if the main branch fails a job
|
C-enhancement S-needs-triage I-usability P-Optional :sparkles: A-diagnostics C-feature
|
## Motivation
When CI is stable we'd like to notify the team if a CI job fails in the `main` branch, so we're notified in a timely manner to fix it as soon as possible
### Designs
- Create a separate workflow for notifications
- We might want to have a global solution which connects to multiple chat systems (Discord, Slack, etc.)
|
True
|
ci: notify team members in our chat system if the main branch fails a job - ## Motivation
When CI is stable we'd like to notify the team if a CI job fails in the `main` branch, so we're notified in a timely manner to fix it as soon as possible
### Designs
- Create a separate workflow for notifications
- We might want to have a global solution which connects to multiple chat systems (Discord, Slack, etc.)
|
non_test
|
ci notify team members in our chat system if the main branch fails a job motivation when ci is stable we d like to notify the team if a ci job fails in the main branch so we re notified in a timely manner to fix it as soon as possible designs create a separate workflow for notifications we might want to have a global solution which connects to multiple chat systems discord slack etc
| 0
|
207,362
| 15,812,075,225
|
IssuesEvent
|
2021-04-05 04:37:43
|
divanov11/Mumble
|
https://api.github.com/repos/divanov11/Mumble
|
closed
|
Setup Cypress
|
Type: CI/CD Type: Testing enhancement
|
- Write a single cypress test to verify the login page can be viewed
- create npm script for running cypress tests
|
1.0
|
Setup Cypress - - Write a single cypress test to verify the login page can be viewed
- create npm script for running cypress tests
|
test
|
setup cypress write a single cypress test to verify the login page can be viewed create npm script for running cypress tests
| 1
|
36,710
| 9,880,047,120
|
IssuesEvent
|
2019-06-24 11:37:59
|
neomutt/neomutt
|
https://api.github.com/repos/neomutt/neomutt
|
closed
|
Can't compile with --notmuch
|
has:bisect topic:build-process topic:notmuch type:bug
|
## Expected Behaviour
No compilation error
## Actual Behaviour
```
Undefined symbols for architecture x86_64:
"_notmuch_database_begin_atomic", referenced from:
_nm_db_trans_begin in libnotmuch.a(nm_db.o)
"_notmuch_database_destroy", referenced from:
_nm_db_release in libnotmuch.a(nm_db.o)
_nm_db_free in libnotmuch.a(nm_db.o)
_nm_db_longrun_done in libnotmuch.a(nm_db.o)
_nm_db_debug_check in libnotmuch.a(nm_db.o)
"_notmuch_database_end_atomic", referenced from:
_nm_db_trans_end in libnotmuch.a(nm_db.o)
"_notmuch_database_find_message", referenced from:
_nm_read_entire_thread in libnotmuch.a(mutt_notmuch.o)
_nm_tags_commit in libnotmuch.a(mutt_notmuch.o)
"_notmuch_database_find_message_by_filename", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
"_notmuch_database_get_all_tags", referenced from:
_nm_get_all_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_database_index_file", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_record_message in libnotmuch.a(mutt_notmuch.o)
"_notmuch_database_open_verbose", referenced from:
_nm_db_do_open in libnotmuch.a(nm_db.o)
"_notmuch_database_remove_message", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
"_notmuch_filenames_get", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
_append_message in libnotmuch.a(mutt_notmuch.o)
"_notmuch_filenames_move_to_next", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
_append_message in libnotmuch.a(mutt_notmuch.o)
"_notmuch_filenames_valid", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
_append_message in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_add_tag", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_destroy", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_record_message in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_open in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
_append_replies in libnotmuch.a(mutt_notmuch.o)
...
"_notmuch_message_freeze", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_get_filenames", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
_append_message in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_get_message_id", referenced from:
_append_message in libnotmuch.a(mutt_notmuch.o)
_get_mutt_email in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_get_replies", referenced from:
_append_replies in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_get_tags", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
_update_email_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_get_thread_id", referenced from:
_nm_read_entire_thread in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_maildir_flags_to_tags", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_record_message in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_remove_tag", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_thaw", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_messages_destroy", referenced from:
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
"_notmuch_messages_get", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_open in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_append_replies in libnotmuch.a(mutt_notmuch.o)
"_notmuch_messages_move_to_next", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_open in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_append_replies in libnotmuch.a(mutt_notmuch.o)
"_notmuch_messages_valid", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_open in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_append_replies in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_add_tag_exclude", referenced from:
_apply_exclude_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_count_messages", referenced from:
_count_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_create", referenced from:
_nm_read_entire_thread in libnotmuch.a(mutt_notmuch.o)
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
_get_query in libnotmuch.a(mutt_notmuch.o)
_count_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_destroy", referenced from:
_nm_read_entire_thread in libnotmuch.a(mutt_notmuch.o)
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_open in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_count_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_search_messages", referenced from:
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_open in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_search_threads", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_set_omit_excluded", referenced from:
_apply_exclude_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_set_sort", referenced from:
_nm_read_entire_thread in libnotmuch.a(mutt_notmuch.o)
_get_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_status_to_string", referenced from:
_nm_db_do_open in libnotmuch.a(nm_db.o)
"_notmuch_tags_destroy", referenced from:
_nm_get_all_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_tags_get", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
_nm_get_all_tags in libnotmuch.a(mutt_notmuch.o)
_update_email_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_tags_move_to_next", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
_nm_get_all_tags in libnotmuch.a(mutt_notmuch.o)
_update_email_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_tags_valid", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
_nm_get_all_tags in libnotmuch.a(mutt_notmuch.o)
_update_email_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_thread_destroy", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_thread_get_toplevel_messages", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_threads_destroy", referenced from:
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
"_notmuch_threads_get", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_threads_move_to_next", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_threads_valid", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
```
## Steps to Reproduce
Summary of build options:
Version: 20180716
Host OS: darwin18.6.0
Install prefix: /opt
Compiler: cc
CFlags: -g -O2 -std=c99 -D_ALL_SOURCE=1 -D_GNU_SOURCE=1 -D__EXTENSIONS__ -I/opt/local/include -I/opt/include -DNCURSES_WIDECHAR
LDFlags: -L/opt/sdk/lib -L/var/sdk/lib -L/opt/lib -L/usr/local/lib -L/usr/lib -L/var/lib -L/lib -L. -L/opt/lib -L/opt/local/lib -L/opt/local/lib -dynamic -lgssapi_krb5 -lkrb5 -lk5crypto -lcom_err
Libs: -lkyotocabinet -lidn -lgnutls -lncursesw -lnotmuch -llua -lsasl2 -lgpgme -lgpg-error -lintl -liconv
Header cache libs: -lkyotocabinet
GPGME: yes
PGP: yes
SMIME: yes
Notmuch: yes
Header Cache(s): kyotocabinet
Lua: yes
$ git bisect bad
```
609cad0a92ff993389ac9ec6eae9160db28da6d5 is the first bad commit
commit 609cad0a92ff993389ac9ec6eae9160db28da6d5
Author: Richard Russon <rich@flatcap.org>
Date: Sat Jul 7 12:02:05 2018 +0100
create libnotmuch
:100644 100644 a6dc534784fb5c6d5986cbfe07c163a85297c95d 8dce07a1f0ca8bbc2c0de67c0923c1689df66c5f M Makefile.autosetup
:100644 100644 49bf8cc4e336aa44407666f5d9d3f1f87447f3c6 e22e2743ed3aaa80db7866f66ec3f4f45beb60cf M browser.c
:100644 100644 550555a89d50152cc35bf71ce9244c41e66d4303 c04e0504d574232b7662ca15d29215a7aff57650 M buffy.c
:100644 100644 e19db60b590525f7dbd9057ab7d155fb6eb1d7a5 9a7377f0d2c21471e93d91e680dffae1bbb4a525 M commands.c
:100644 100644 b58d39cbe7f9f1818025f0ca7df8d6f37ef34958 c786a953bc21708b466934b5f24ca09f47786c27 M copy.c
:100644 100644 704afe6d05911c45bfde14044ecdf76f5a44895a 5fc5ea4dbdcea3606164416312b291123f140e70 M curs_lib.c
:100644 100644 4f75221402a7560f91d00fd2cb31e20c73df2419 f1f60fa47e468bb01ac65d1a1fb190ef70a9765d M curs_main.c
:100644 100644 cec49f7d54f98d9c002b67a4b5f9a30936f96b4a d2c3534456981a10e4163f878198f5408344049c M init.c
:100644 100644 b4400e2e31c86bc473aeccd6ea036db2a5a8149d 5c05eb7a50c13ab7e52b9ce15e4c59c2ff8fdc42 M mh.c
:100644 000000 2e31493fb316db1dc6b0d8b7956b28cc89499c20 0000000000000000000000000000000000000000 D mutt_notmuch.c
:100644 000000 424fc35e61ac19ac979eb103ecc4bebf75a7559b 0000000000000000000000000000000000000000 D mutt_notmuch.h
:100644 100644 8b00155897efabca68b2b14f9de8838e34054f0a f6900daca08c9623cb667fd710306b17aa208b61 M mx.c
:000000 040000 0000000000000000000000000000000000000000 f60a4e06a68fa57ce2a040f16dbf21ba30095730 A notmuch
:040000 040000 1c94924c37d0ca79c15ed2d8f2caba360b2ea379 2335436978eeef1c1902d416d287e7585280a1a7 M po
:100644 100644 4d434f279a7bf1bde93851228867d63b6c1b231b 76756006f363088d538b4885265ef9a1be14f47c M send.c
:100644 100644 9ff44f9516ac69aa94fc95670d516db257673943 829a2b5ae31452a343cea92cc05ac0e21b3794a8 M sidebar.c
:100644 100644 6fb2f8a1e9296493f4784f7840f99cef890b849a 1f48b71a0fd86a6e1897f519245d7c6a11968e3e M status.c
```
## How often does this happen?
- Always
## When did it start to happen?
- When I upgraded
Which version did you previously use?
NeoMutt 20180716
## NeoMutt Version
```
commit b37a784d4fc6efe74c121c01408e6fdb9cbb93fd (HEAD -> master, origin/master, origin/HEAD)
Merge: d973635cc 8b9575c33
Author: Richard Russon <rich@flatcap.org>
Date: Sun Jun 16 01:32:52 2019 +0100
merge: upstream fixes
* Improve imap_append_message() error message handling
* Mention sources for ~p and ~P patterns
* Allow imap_cmd_finish() to both expunge and fetch new mail
* Improve $reverse_realname documentation
* Add $fcc_before_send, defaulting unset
```
## Extra Info
Compiles without "--notmuch", but still suffers from https://github.com/neomutt/neomutt/issues/1629
* Operating System and its version
macOS Mojave (10.14.5)
MacPorts 2.5.4
notmuch @0.29_1
* Were you using multiple copies of NeoMutt at once?
no
* Were you using 'screen' or 'tmux'?
no
* Is your email local (maildir) or remote (IMAP)?
Neither, it's local (mbox)
|
1.0
|
Can't compile with --notmuch - ## Expected Behaviour
No compilation error
## Actual Behaviour
```
Undefined symbols for architecture x86_64:
"_notmuch_database_begin_atomic", referenced from:
_nm_db_trans_begin in libnotmuch.a(nm_db.o)
"_notmuch_database_destroy", referenced from:
_nm_db_release in libnotmuch.a(nm_db.o)
_nm_db_free in libnotmuch.a(nm_db.o)
_nm_db_longrun_done in libnotmuch.a(nm_db.o)
_nm_db_debug_check in libnotmuch.a(nm_db.o)
"_notmuch_database_end_atomic", referenced from:
_nm_db_trans_end in libnotmuch.a(nm_db.o)
"_notmuch_database_find_message", referenced from:
_nm_read_entire_thread in libnotmuch.a(mutt_notmuch.o)
_nm_tags_commit in libnotmuch.a(mutt_notmuch.o)
"_notmuch_database_find_message_by_filename", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
"_notmuch_database_get_all_tags", referenced from:
_nm_get_all_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_database_index_file", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_record_message in libnotmuch.a(mutt_notmuch.o)
"_notmuch_database_open_verbose", referenced from:
_nm_db_do_open in libnotmuch.a(nm_db.o)
"_notmuch_database_remove_message", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
"_notmuch_filenames_get", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
_append_message in libnotmuch.a(mutt_notmuch.o)
"_notmuch_filenames_move_to_next", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
_append_message in libnotmuch.a(mutt_notmuch.o)
"_notmuch_filenames_valid", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
_append_message in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_add_tag", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_destroy", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_record_message in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_open in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
_append_replies in libnotmuch.a(mutt_notmuch.o)
...
"_notmuch_message_freeze", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_get_filenames", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_sync in libnotmuch.a(mutt_notmuch.o)
_append_message in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_get_message_id", referenced from:
_append_message in libnotmuch.a(mutt_notmuch.o)
_get_mutt_email in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_get_replies", referenced from:
_append_replies in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_get_tags", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
_update_email_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_get_thread_id", referenced from:
_nm_read_entire_thread in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_maildir_flags_to_tags", referenced from:
_rename_filename in libnotmuch.a(mutt_notmuch.o)
_nm_record_message in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_remove_tag", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_message_thaw", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_messages_destroy", referenced from:
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
"_notmuch_messages_get", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_open in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_append_replies in libnotmuch.a(mutt_notmuch.o)
"_notmuch_messages_move_to_next", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_open in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_append_replies in libnotmuch.a(mutt_notmuch.o)
"_notmuch_messages_valid", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_open in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_append_replies in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_add_tag_exclude", referenced from:
_apply_exclude_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_count_messages", referenced from:
_count_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_create", referenced from:
_nm_read_entire_thread in libnotmuch.a(mutt_notmuch.o)
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
_get_query in libnotmuch.a(mutt_notmuch.o)
_count_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_destroy", referenced from:
_nm_read_entire_thread in libnotmuch.a(mutt_notmuch.o)
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_open in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
_count_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_search_messages", referenced from:
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_open in libnotmuch.a(mutt_notmuch.o)
_nm_mbox_check in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_search_threads", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_set_omit_excluded", referenced from:
_apply_exclude_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_query_set_sort", referenced from:
_nm_read_entire_thread in libnotmuch.a(mutt_notmuch.o)
_get_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_status_to_string", referenced from:
_nm_db_do_open in libnotmuch.a(nm_db.o)
"_notmuch_tags_destroy", referenced from:
_nm_get_all_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_tags_get", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
_nm_get_all_tags in libnotmuch.a(mutt_notmuch.o)
_update_email_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_tags_move_to_next", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
_nm_get_all_tags in libnotmuch.a(mutt_notmuch.o)
_update_email_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_tags_valid", referenced from:
_update_tags in libnotmuch.a(mutt_notmuch.o)
_nm_get_all_tags in libnotmuch.a(mutt_notmuch.o)
_update_email_tags in libnotmuch.a(mutt_notmuch.o)
"_notmuch_thread_destroy", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_thread_get_toplevel_messages", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_threads_destroy", referenced from:
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
"_notmuch_threads_get", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_threads_move_to_next", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
"_notmuch_threads_valid", referenced from:
_read_threads_query in libnotmuch.a(mutt_notmuch.o)
_nm_message_is_still_queried in libnotmuch.a(mutt_notmuch.o)
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
```
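Every one of the missing symbols above lives in libnotmuch, which points at the link step rather than the compile step. Before bisecting, it can help to pull the unique unresolved names out of the linker output; a minimal sketch of that (the Apple `ld` message format is taken from the log above):

```python
import re

def missing_symbols(linker_output: str) -> list[str]:
    """Collect the unique undefined symbol names from Apple ld output.

    Lines look like:  "_notmuch_database_begin_atomic", referenced from:
    """
    names = re.findall(r'"(_[A-Za-z0-9_]+)", referenced from:', linker_output)
    # Preserve first-seen order while dropping duplicates.
    seen = []
    for n in names:
        if n not in seen:
            seen.append(n)
    return seen

sample = '''
  "_notmuch_database_begin_atomic", referenced from:
      _nm_db_trans_begin in libnotmuch.a(nm_db.o)
  "_notmuch_database_destroy", referenced from:
      _nm_db_release in libnotmuch.a(nm_db.o)
'''
print(missing_symbols(sample))
# → ['_notmuch_database_begin_atomic', '_notmuch_database_destroy']
```

All of the collected names share the `_notmuch_` prefix, which is consistent with `-lnotmuch` not resolving against the MacPorts notmuch library at link time.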
## Steps to Reproduce
Summary of build options:
Version: 20180716
Host OS: darwin18.6.0
Install prefix: /opt
Compiler: cc
CFlags: -g -O2 -std=c99 -D_ALL_SOURCE=1 -D_GNU_SOURCE=1 -D__EXTENSIONS__ -I/opt/local/include -I/opt/include -DNCURSES_WIDECHAR
LDFlags: -L/opt/sdk/lib -L/var/sdk/lib -L/opt/lib -L/usr/local/lib -L/usr/lib -L/var/lib -L/lib -L. -L/opt/lib -L/opt/local/lib -L/opt/local/lib -dynamic -lgssapi_krb5 -lkrb5 -lk5crypto -lcom_err
Libs: -lkyotocabinet -lidn -lgnutls -lncursesw -lnotmuch -llua -lsasl2 -lgpgme -lgpg-error -lintl -liconv
Header cache libs: -lkyotocabinet
GPGME: yes
PGP: yes
SMIME: yes
Notmuch: yes
Header Cache(s): kyotocabinet
Lua: yes
$ git bisect bad
```
609cad0a92ff993389ac9ec6eae9160db28da6d5 is the first bad commit
commit 609cad0a92ff993389ac9ec6eae9160db28da6d5
Author: Richard Russon <rich@flatcap.org>
Date: Sat Jul 7 12:02:05 2018 +0100
create libnotmuch
:100644 100644 a6dc534784fb5c6d5986cbfe07c163a85297c95d 8dce07a1f0ca8bbc2c0de67c0923c1689df66c5f M Makefile.autosetup
:100644 100644 49bf8cc4e336aa44407666f5d9d3f1f87447f3c6 e22e2743ed3aaa80db7866f66ec3f4f45beb60cf M browser.c
:100644 100644 550555a89d50152cc35bf71ce9244c41e66d4303 c04e0504d574232b7662ca15d29215a7aff57650 M buffy.c
:100644 100644 e19db60b590525f7dbd9057ab7d155fb6eb1d7a5 9a7377f0d2c21471e93d91e680dffae1bbb4a525 M commands.c
:100644 100644 b58d39cbe7f9f1818025f0ca7df8d6f37ef34958 c786a953bc21708b466934b5f24ca09f47786c27 M copy.c
:100644 100644 704afe6d05911c45bfde14044ecdf76f5a44895a 5fc5ea4dbdcea3606164416312b291123f140e70 M curs_lib.c
:100644 100644 4f75221402a7560f91d00fd2cb31e20c73df2419 f1f60fa47e468bb01ac65d1a1fb190ef70a9765d M curs_main.c
:100644 100644 cec49f7d54f98d9c002b67a4b5f9a30936f96b4a d2c3534456981a10e4163f878198f5408344049c M init.c
:100644 100644 b4400e2e31c86bc473aeccd6ea036db2a5a8149d 5c05eb7a50c13ab7e52b9ce15e4c59c2ff8fdc42 M mh.c
:100644 000000 2e31493fb316db1dc6b0d8b7956b28cc89499c20 0000000000000000000000000000000000000000 D mutt_notmuch.c
:100644 000000 424fc35e61ac19ac979eb103ecc4bebf75a7559b 0000000000000000000000000000000000000000 D mutt_notmuch.h
:100644 100644 8b00155897efabca68b2b14f9de8838e34054f0a f6900daca08c9623cb667fd710306b17aa208b61 M mx.c
:000000 040000 0000000000000000000000000000000000000000 f60a4e06a68fa57ce2a040f16dbf21ba30095730 A notmuch
:040000 040000 1c94924c37d0ca79c15ed2d8f2caba360b2ea379 2335436978eeef1c1902d416d287e7585280a1a7 M po
:100644 100644 4d434f279a7bf1bde93851228867d63b6c1b231b 76756006f363088d538b4885265ef9a1be14f47c M send.c
:100644 100644 9ff44f9516ac69aa94fc95670d516db257673943 829a2b5ae31452a343cea92cc05ac0e21b3794a8 M sidebar.c
:100644 100644 6fb2f8a1e9296493f4784f7840f99cef890b849a 1f48b71a0fd86a6e1897f519245d7c6a11968e3e M status.c
```
## How often does this happen?
- Always
## When did it start to happen?
- When I upgraded
Which version did you previously use?
NeoMutt 20180716
## NeoMutt Version
```
commit b37a784d4fc6efe74c121c01408e6fdb9cbb93fd (HEAD -> master, origin/master, origin/HEAD)
Merge: d973635cc 8b9575c33
Author: Richard Russon <rich@flatcap.org>
Date: Sun Jun 16 01:32:52 2019 +0100
merge: upstream fixes
* Improve imap_append_message() error message handling
* Mention sources for ~p and ~P patterns
* Allow imap_cmd_finish() to both expunge and fetch new mail
* Improve $reverse_realname documentation
* Add $fcc_before_send, defaulting unset
```
## Extra Info
Compiles without "--notmuch", but still suffers from https://github.com/neomutt/neomutt/issues/1629
* Operating System and its version
macOS Mojave (10.14.5)
MacPorts 2.5.4
notmuch @0.29_1
* Were you using multiple copies of NeoMutt at once?
no
* Were you using 'screen' or 'tmux'?
no
* Is your email local (maildir) or remote (IMAP)?
Neither, it's local (mbox)
|
non_test
|
can t compile with notmuch expected behaviour no compilation error actual behaviour undefined symbols for architecture notmuch database begin atomic referenced from nm db trans begin in libnotmuch a nm db o notmuch database destroy referenced from nm db release in libnotmuch a nm db o nm db free in libnotmuch a nm db o nm db longrun done in libnotmuch a nm db o nm db debug check in libnotmuch a nm db o notmuch database end atomic referenced from nm db trans end in libnotmuch a nm db o notmuch database find message referenced from nm read entire thread in libnotmuch a mutt notmuch o nm tags commit in libnotmuch a mutt notmuch o notmuch database find message by filename referenced from rename filename in libnotmuch a mutt notmuch o nm mbox sync in libnotmuch a mutt notmuch o notmuch database get all tags referenced from nm get all tags in libnotmuch a mutt notmuch o notmuch database index file referenced from rename filename in libnotmuch a mutt notmuch o nm record message in libnotmuch a mutt notmuch o notmuch database open verbose referenced from nm db do open in libnotmuch a nm db o notmuch database remove message referenced from rename filename in libnotmuch a mutt notmuch o nm mbox sync in libnotmuch a mutt notmuch o notmuch filenames get referenced from rename filename in libnotmuch a mutt notmuch o nm mbox check in libnotmuch a mutt notmuch o nm mbox sync in libnotmuch a mutt notmuch o append message in libnotmuch a mutt notmuch o notmuch filenames move to next referenced from rename filename in libnotmuch a mutt notmuch o nm mbox check in libnotmuch a mutt notmuch o nm mbox sync in libnotmuch a mutt notmuch o append message in libnotmuch a mutt notmuch o notmuch filenames valid referenced from rename filename in libnotmuch a mutt notmuch o nm mbox check in libnotmuch a mutt notmuch o nm mbox sync in libnotmuch a mutt notmuch o append message in libnotmuch a mutt notmuch o notmuch message add tag referenced from update tags in libnotmuch a mutt notmuch o 
notmuch message destroy referenced from read threads query in libnotmuch a mutt notmuch o rename filename in libnotmuch a mutt notmuch o nm record message in libnotmuch a mutt notmuch o nm mbox open in libnotmuch a mutt notmuch o nm mbox check in libnotmuch a mutt notmuch o nm mbox sync in libnotmuch a mutt notmuch o append replies in libnotmuch a mutt notmuch o notmuch message freeze referenced from update tags in libnotmuch a mutt notmuch o notmuch message get filenames referenced from rename filename in libnotmuch a mutt notmuch o nm mbox check in libnotmuch a mutt notmuch o nm mbox sync in libnotmuch a mutt notmuch o append message in libnotmuch a mutt notmuch o notmuch message get message id referenced from append message in libnotmuch a mutt notmuch o get mutt email in libnotmuch a mutt notmuch o notmuch message get replies referenced from append replies in libnotmuch a mutt notmuch o notmuch message get tags referenced from update tags in libnotmuch a mutt notmuch o update email tags in libnotmuch a mutt notmuch o notmuch message get thread id referenced from nm read entire thread in libnotmuch a mutt notmuch o notmuch message maildir flags to tags referenced from rename filename in libnotmuch a mutt notmuch o nm record message in libnotmuch a mutt notmuch o notmuch message remove tag referenced from update tags in libnotmuch a mutt notmuch o notmuch message thaw referenced from update tags in libnotmuch a mutt notmuch o notmuch messages destroy referenced from nm message is still queried in libnotmuch a mutt notmuch o notmuch messages get referenced from read threads query in libnotmuch a mutt notmuch o nm mbox open in libnotmuch a mutt notmuch o nm mbox check in libnotmuch a mutt notmuch o append replies in libnotmuch a mutt notmuch o notmuch messages move to next referenced from read threads query in libnotmuch a mutt notmuch o nm mbox open in libnotmuch a mutt notmuch o nm mbox check in libnotmuch a mutt notmuch o append replies in libnotmuch a mutt 
notmuch o notmuch messages valid referenced from read threads query in libnotmuch a mutt notmuch o nm message is still queried in libnotmuch a mutt notmuch o nm mbox open in libnotmuch a mutt notmuch o nm mbox check in libnotmuch a mutt notmuch o append replies in libnotmuch a mutt notmuch o notmuch query add tag exclude referenced from apply exclude tags in libnotmuch a mutt notmuch o notmuch query count messages referenced from count query in libnotmuch a mutt notmuch o notmuch query create referenced from nm read entire thread in libnotmuch a mutt notmuch o nm message is still queried in libnotmuch a mutt notmuch o get query in libnotmuch a mutt notmuch o count query in libnotmuch a mutt notmuch o notmuch query destroy referenced from nm read entire thread in libnotmuch a mutt notmuch o nm message is still queried in libnotmuch a mutt notmuch o nm mbox open in libnotmuch a mutt notmuch o nm mbox check in libnotmuch a mutt notmuch o count query in libnotmuch a mutt notmuch o notmuch query search messages referenced from nm message is still queried in libnotmuch a mutt notmuch o nm mbox open in libnotmuch a mutt notmuch o nm mbox check in libnotmuch a mutt notmuch o notmuch query search threads referenced from read threads query in libnotmuch a mutt notmuch o nm message is still queried in libnotmuch a mutt notmuch o notmuch query set omit excluded referenced from apply exclude tags in libnotmuch a mutt notmuch o notmuch query set sort referenced from nm read entire thread in libnotmuch a mutt notmuch o get query in libnotmuch a mutt notmuch o notmuch status to string referenced from nm db do open in libnotmuch a nm db o notmuch tags destroy referenced from nm get all tags in libnotmuch a mutt notmuch o notmuch tags get referenced from update tags in libnotmuch a mutt notmuch o nm get all tags in libnotmuch a mutt notmuch o update email tags in libnotmuch a mutt notmuch o notmuch tags move to next referenced from update tags in libnotmuch a mutt notmuch o nm get 
all tags in libnotmuch a mutt notmuch o update email tags in libnotmuch a mutt notmuch o notmuch tags valid referenced from update tags in libnotmuch a mutt notmuch o nm get all tags in libnotmuch a mutt notmuch o update email tags in libnotmuch a mutt notmuch o notmuch thread destroy referenced from read threads query in libnotmuch a mutt notmuch o notmuch thread get toplevel messages referenced from read threads query in libnotmuch a mutt notmuch o notmuch threads destroy referenced from nm message is still queried in libnotmuch a mutt notmuch o notmuch threads get referenced from read threads query in libnotmuch a mutt notmuch o notmuch threads move to next referenced from read threads query in libnotmuch a mutt notmuch o notmuch threads valid referenced from read threads query in libnotmuch a mutt notmuch o nm message is still queried in libnotmuch a mutt notmuch o ld symbol s not found for architecture clang error linker command failed with exit code use v to see invocation steps to reproduce summary of build options version host os install prefix opt compiler cc cflags g std d all source d gnu source d extensions i opt local include i opt include dncurses widechar ldflags l opt sdk lib l var sdk lib l opt lib l usr local lib l usr lib l var lib l lib l l opt lib l opt local lib l opt local lib dynamic lgssapi lcom err libs lkyotocabinet lidn lgnutls lncursesw lnotmuch llua lgpgme lgpg error lintl liconv header cache libs lkyotocabinet gpgme yes pgp yes smime yes notmuch yes header cache s kyotocabinet lua yes git bisect bad is the first bad commit commit author richard russon date sat jul create libnotmuch m makefile autosetup m browser c m buffy c m commands c m copy c m curs lib c m curs main c m init c m mh c d mutt notmuch c d mutt notmuch h m mx c a notmuch m po m send c m sidebar c m status c how often does this happen always when did it start to happen when i upgraded which version did you use to use neomutt neomutt version commit head master origin 
master origin head merge author richard russon date sun jun merge upstream fixes improve imap append message error message handling mention sources for p and p patterns allow imap cmd finish to both expunge and fetch new mail improve reverse realname documentation add fcc before send defaulting unset extra info compiles without notmuch but still suffers from operating system and its version macos mojave macports notmuch were you using multiple copies of neomutt at once no were you using screen or tmux no is your email local maildir or remote imap neither it s local mbox
| 0
|
226,133
| 7,504,283,953
|
IssuesEvent
|
2018-04-10 02:43:00
|
ngxs/store
|
https://api.github.com/repos/ngxs/store
|
closed
|
Type Safety
|
domain:core priority:1 type:chore
|
There are several parts of the library that open the user up to potential mistakes that SHOULD be caught by the compiler. An example of this is the Select decorator and the fact that the Ngxs class isn't generic so the select method cannot be typed. Anything where you pass in a string (like the selectors) are an opening for potential mistakes
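The gap described here is language-agnostic: a store typed over its state model lets the compiler reject a bad selector, while a string-based select cannot. A generic sketch in Python, purely to illustrate the shape of the fix (the `Store`/`AppState` names are hypothetical, not the NGXS API):

```python
from typing import Callable, Generic, TypeVar

S = TypeVar("S")  # the state model
T = TypeVar("T")  # the selected slice

class Store(Generic[S]):
    def __init__(self, state: S) -> None:
        self._state = state

    def select(self, selector: Callable[[S], T]) -> T:
        # A function selector is checked against S by the type checker;
        # a raw string path (e.g. "animals.pandas") never would be.
        return selector(self._state)

class AppState:
    def __init__(self) -> None:
        self.animals = ["panda", "owl"]

store = Store(AppState())
print(store.select(lambda s: s.animals))  # → ['panda', 'owl']
```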
|
1.0
|
Type Safety - There are several parts of the library that open the user up to potential mistakes that SHOULD be caught by the compiler. An example of this is the Select decorator and the fact that the Ngxs class isn't generic so the select method cannot be typed. Anything where you pass in a string (like the selectors) are an opening for potential mistakes
|
non_test
|
type safety there are several parts of the library that open the user up to potential mistakes that should be caught by the compiler an example of this is the select decorator and the fact that the ngxs class isn t generic so the select method cannot be typed anything where you pass in a string like the selectors are an opening for potential mistakes
| 0
|
149,266
| 23,451,575,605
|
IssuesEvent
|
2022-08-16 03:45:11
|
USACE/cumulus
|
https://api.github.com/repos/USACE/cumulus
|
closed
|
[PRODUCT]: Add NWRFC Products to Cumulus
|
new data product design data load
|
## Product Source (Sample Pattern)
- [ ] #278
- [ ] #279
- [ ] #280
- [ ] #281
## NWD POCs
Eric C., Sonja M., Kevin M., and Ross W.
## Product Format
- [x] NetCDF
|
1.0
|
[PRODUCT]: Add NWRFC Products to Cumulus - ## Product Source (Sample Pattern)
- [ ] #278
- [ ] #279
- [ ] #280
- [ ] #281
## NWD POCs
Eric C., Sonja M., Kevin M., and Ross W.
## Product Format
- [x] NetCDF
|
non_test
|
add nwrfc products to cumulus product source sample pattern nwd pocs eric c sonja m kevin m and ross w product format netcdf
| 0
|
26,614
| 12,440,874,378
|
IssuesEvent
|
2020-05-26 12:46:57
|
AzureAD/microsoft-authentication-library-for-dotnet
|
https://api.github.com/repos/AzureAD/microsoft-authentication-library-for-dotnet
|
closed
|
AcquireTokenInteractive for AD B2C on .NET Core 3.0 WPF desktop client does not work
|
B2C enhancement external - fixed needs-sample netcore service-enhancement
|
**Which Version of MSAL are you using ?**
MSAL 4.0.0
**Platform**
.NET Core 3.0
**What authentication flow has the issue?**
* Desktop / Mobile
* [x] Interactive
* [ ] Integrated Windows Auth
* [ ] Username Password
* [ ] Device code flow (browserless)
* Web App
* [ ] Authorization code
* [ ] OBO
* Web API
* [ ] OBO
**Is this a new or existing app?**
This is a new app
**Repro**
```csharp
// Creation of PublicClientApp
// Call to WithRedirectUri is commented out because that redirect URI cannot
// be added during app registration.
public static IPublicClientApplication PublicClientApp { get; } = PublicClientApplicationBuilder.
Create(ClientId).
WithB2CAuthority(Authority).
// WithRedirectUri("http://localhost").
Build();
// ...
// Sign-in method logic
await App.PublicClientApp.AcquireTokenInteractive(App.ApiScopes).
WithB2CAuthority(App.Authority).
WithPrompt(Prompt.SelectAccount).
ExecuteAsync();
```
**Expected behavior**
Call to AcquireTokenInteractive should open default browser and prompt user for login credentials. After login should return focus to app.
**Actual behavior**
AcquireTokenInteractive is throwing an exception:
"Only loopback redirect uri is supported, but urn:ietf:wg:oauth:2.0:oob was found. Configure http://localhost or http://localhost:port both during app registration and when you create the PublicClientApplication object. See https://aka.ms/msal-net-os-browser for details".
**Additional context/ Logs / Screenshots**
Here is the Azure portal AD B2C tenant app registration screen for native client:
[link](https://i.stack.imgur.com/TOtTD.png)
On the AD B2C app registration I cannot add custom URI "http://localhost".
So how can a .NET Core Desktop WPF app use interactive login with B2C? Are there any other options and examples?
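The exception text is the key detail: the default system-browser flow accepts only loopback redirect URIs, and with no redirect configured the app falls back to the legacy `urn:ietf:wg:oauth:2.0:oob` value. The rule the error message describes can be sketched as follows (an illustration of the check, not MSAL's actual code):

```python
from urllib.parse import urlparse

def is_loopback_redirect(uri: str) -> bool:
    """True for redirect URIs like http://localhost or http://localhost:1234,
    which are the only ones the default system-browser flow accepts."""
    parsed = urlparse(uri)
    return parsed.scheme == "http" and parsed.hostname in ("localhost", "127.0.0.1")

print(is_loopback_redirect("urn:ietf:wg:oauth:2.0:oob"))  # → False
print(is_loopback_redirect("http://localhost:51004"))     # → True
```

So the failure is expected given the current registration: until the B2C app registration accepts `http://localhost` as a redirect URI, the interactive system-browser flow has nothing valid to use.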
|
1.0
|
AcquireTokenInteractive for AD B2C on .NET Core 3.0 WPF desktop client does not work - **Which Version of MSAL are you using ?**
MSAL 4.0.0
**Platform**
.NET Core 3.0
**What authentication flow has the issue?**
* Desktop / Mobile
* [x] Interactive
* [ ] Integrated Windows Auth
* [ ] Username Password
* [ ] Device code flow (browserless)
* Web App
* [ ] Authorization code
* [ ] OBO
* Web API
* [ ] OBO
**Is this a new or existing app?**
This is a new app
**Repro**
```csharp
// Creation of PublicClientApp
// Call to WithRedirectUri is commented out because that redirect URI cannot
// be added during app registration.
public static IPublicClientApplication PublicClientApp { get; } = PublicClientApplicationBuilder.
Create(ClientId).
WithB2CAuthority(Authority).
// WithRedirectUri("http://localhost").
Build();
// ...
// Sign-in method logic
await App.PublicClientApp.AcquireTokenInteractive(App.ApiScopes).
WithB2CAuthority(App.Authority).
WithPrompt(Prompt.SelectAccount).
ExecuteAsync();
```
**Expected behavior**
Call to AcquireTokenInteractive should open default browser and prompt user for login credentials. After login should return focus to app.
**Actual behavior**
AcquireTokenInteractive is throwing an exception:
"Only loopback redirect uri is supported, but urn:ietf:wg:oauth:2.0:oob was found. Configure http://localhost or http://localhost:port both during app registration and when you create the PublicClientApplication object. See https://aka.ms/msal-net-os-browser for details".
**Additional context/ Logs / Screenshots**
Here is the Azure portal AD B2C tenant app registration screen for native client:
[link](https://i.stack.imgur.com/TOtTD.png)
On the AD B2C app registration I cannot add custom URI "http://localhost".
So how can a .NET Core Desktop WPF app use interactive login with B2C? Are there any other options and examples?
|
non_test
|
acquiretokeninteractive for ad on net core wpf desktop client does not work which version of msal are you using msal platform net core what authentication flow has the issue desktop mobile interactive integrated windows auth username password device code flow browserless web app authorization code obo web api obo is this a new or existing app this is a new app repro csharp creation of publicclientapp call to withredirecturi is commented out because that redirect uri cannot be added during app registration public static ipublicclientapplication publicclientapp get publicclientapplicationbuilder create clientid authority withredirecturi build sign in method logic await app publicclientapp acquiretokeninteractive app apiscopes app authority withprompt prompt selectaccount executeasync expected behavior call to acquiretokeninteractive should open default browser and prompt user for login credentials after login should return focus to app actual behavior acquiretokeninteractive is throwing an exception only loopback redirect uri is supported but urn ietf wg oauth oob was found configure or both during app registration and when you create the publicclientapplication object see for details additional context logs screenshots here is the azure portal ad tenant app registration screen for native client on the ad app registration i cannot add custom uri so how can a net core desktop wpf app use interactive login with are there any other options and examples
| 0
|
181,415
| 14,019,376,383
|
IssuesEvent
|
2020-10-29 18:05:29
|
rancher/dashboard
|
https://api.github.com/repos/rancher/dashboard
|
closed
|
Add tooltip/help text on monitoring deployment page
|
[zube]: To Test area/monitoring-alerting
|
- [ ] "Create Default Monitoring Cluster Roles"
- [ ] "Aggregate to Default Kubernetes Roles"
Those options exist on the deployment page and need help text on what this means
|
1.0
|
Add tooltip/help text on monitoring deployment page - - [ ] "Create Default Monitoring Cluster Roles"
- [ ] "Aggregate to Default Kubernetes Roles"
Those options exist on the deployment page and need help text on what this means
|
test
|
add tooltip help text on monitoring deployment page create default monitoring cluster roles” “aggregate to default kubernetes roles” those options exist on the deployment page and need help text on what this means
| 1
|
172,726
| 6,515,790,843
|
IssuesEvent
|
2017-08-26 20:35:31
|
kubernetes/kubernetes
|
https://api.github.com/repos/kubernetes/kubernetes
|
closed
|
Protobuf serialization does not distinguish between `[]` and `null`
|
kind/bug priority/important-soon sig/api-machinery
|
<!-- Thanks for filing an issue! Before hitting the button, please answer these questions.-->
**Is this a request for help?** (If yes, you should use our troubleshooting guide and community support channels, see http://kubernetes.io/docs/troubleshooting/.): Not a request for help. Observed undocumented change in 1.6.1 api.
**What keywords did you search in Kubernetes issues before filing this one?** (If you have found any duplicates, you should instead reply there.): endpoints subsets api
---
**Is this a BUG REPORT or FEATURE REQUEST?** (choose one): BUG REPORT
<!--
If this is a BUG REPORT, please:
- Fill in as much of the template below as you can. If you leave out
information, we can't help you as well.
If this is a FEATURE REQUEST, please:
- Describe *in detail* the feature/behavior/change you'd like to see.
In both cases, be ready for followup questions, and please respond in a timely
manner. If we can't reproduce a bug or think a feature already exists, we
might close your issue. If we're wrong, PLEASE feel free to reopen it and
explain why.
-->
**Kubernetes version** (use `kubectl version`):
```
Client Version: version.Info{Major:"1", Minor:"6", GitVersion:"v1.6.0", GitCommit:"fff5156092b56e6bd60fff75aad4dc9de6b6ef37", GitTreeState:"clean", BuildDate:"2017-03-28T16:36:33Z", GoVersion:"go1.7.5", Compiler:"gc", Platform:"darwin/amd64"}
Server Version: version.Info{Major:"1", Minor:"6", GitVersion:"v1.6.1", GitCommit:"b0b7a323cc5a4a2019b2e9520c21c7830b7f708e", GitTreeState:"clean", BuildDate:"2017-04-03T20:33:27Z", GoVersion:"go1.7.5", Compiler:"gc", Platform:"linux/amd64"}
```
**Environment**:
- **Cloud provider or hardware configuration**: GKE
- **OS** (e.g. from /etc/os-release):
```
BUILD_ID=9000.104.0
NAME="Container-Optimized OS"
GOOGLE_CRASH_ID=Lakitu
VERSION_ID=56
BUG_REPORT_URL=https://crbug.com/new
PRETTY_NAME="Container-Optimized OS from Google"
VERSION=56
GOOGLE_METRICS_PRODUCT_ID=26
HOME_URL="https://cloud.google.com/compute/docs/containers/vm-image/"
ID=cos
```
- **Kernel** (e.g. `uname -a`):
```
Linux gke-alpha-default-pool-3f2b4945-188b 4.4.21+ #1 SMP Wed Apr 5 14:40:46 PDT 2017 x86_64 Intel(R) Xeon(R) CPU @ 2.30GHz GenuineIntel GNU/Linux
```
- **Install tools**:
- **Others**:
1.6.1 alpha cluster set up with:
```
gcloud alpha container clusters create alpha --cluster-version 1.6.1 --zone us-central1-b --enable-kubernetes-alpha
```
**What happened**:
Observed API change in 1.6.1. `api/v1/namespaces/{namespace}/endpoints` returns `null` for `.items[].subsets`. 1.6.0 returns `[]`
**What you expected to happen**:
Expect 1.6.1 to match 1.6.0, and return `[]`
**How to reproduce it** (as minimally and precisely as possible):
## 1.6.0
```bash
$ kubectl -n=kube-system get ep kube-controller-manager -o jsonpath='{.subsets}'
[]
$ kubectl -n=kube-system get ep -o jsonpath='{.items[?(@.metadata.name=="kube-controller-manager")].subsets}'
[]
```
## 1.6.1
```bash
$ kubectl -n=kube-system get ep kube-controller-manager -o jsonpath='{.subsets}'
[]
$ kubectl -n=kube-system get ep -o jsonpath='{.items[?(@.metadata.name=="kube-controller-manager")].subsets}'
<nil> # <---- BOOM
```
**Anything else we need to know**:
Original bug reported as an exception in linkerd:
https://github.com/linkerd/linkerd/issues/1219
|
1.0
|
Protobuf serialization does not distinguish between `[]` and `null` - <!-- Thanks for filing an issue! Before hitting the button, please answer these questions.-->
**Is this a request for help?** (If yes, you should use our troubleshooting guide and community support channels, see http://kubernetes.io/docs/troubleshooting/.): Not a request for help. Observed undocumented change in 1.6.1 api.
**What keywords did you search in Kubernetes issues before filing this one?** (If you have found any duplicates, you should instead reply there.): endpoints subsets api
|
non_test
|
| 0
|
19,084
| 5,793,794,000
|
IssuesEvent
|
2017-05-02 13:31:17
|
elastic/logstash
|
https://api.github.com/repos/elastic/logstash
|
closed
|
Refactor `ConfigPath` class with `SourceWithMetadata`
|
code cleanup v6.0.0
|
Instead of using a custom class for the source information, let's use the same class from the LIR, `SourceWithMetadata`.
|
1.0
|
non_test
| 0
|
455,619
| 13,130,009,189
|
IssuesEvent
|
2020-08-06 14:44:19
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
tivoidp.tivo.com - Unsupported browser pop-up displayed
|
browser-firefox-mobile engine-gecko priority-normal severity-critical sitepatch-applied type-unsupported
|
<!-- @browser: Firefox Mobile 68.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 10; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 -->
<!-- @reported_with: mobile-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/48137 -->
**URL**: https://tivoidp.tivo.com/tivoCommunitySupport/s/
**Browser / Version**: Firefox Mobile 68.0
**Operating System**: Android
**Tested Another Browser**: Yes
**Problem type**: Desktop site instead of mobile site
**Description**: displays a pop-up stating that the browser is not supported
**Steps to Reproduce**:
Displays a pop-up stating that the browser is not supported. It works if I check "Request desktop site". No pop-up appears when using Chrome Canary.
<details><summary>View the screenshot</summary><img alt='Screenshot' src='https://webcompat.com/uploads/2020/2/a8d2b2dc-1b62-41df-bfae-760e56b47dca.jpeg'></details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20200120180435</li><li>channel: nightly</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2020/2/43f3811b-1c6f-4d91-908a-f75b72dc51fa)
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
non_test
| 0
|
274,302
| 23,829,555,819
|
IssuesEvent
|
2022-09-05 18:39:07
|
scikit-hep/vector
|
https://api.github.com/repos/scikit-hep/vector
|
closed
|
Test with Python `3.10` and set `cancel-in-progress` to `True`
|
good first issue tests
|
`Vector` supports Python `3.10`, but the support is not tested in the `CI` -
https://github.com/scikit-hep/vector/blob/d4ad49ecf1761e3a6561498d2405b7789d53deff/.github/workflows/ci.yml#L31-L39
---
Additionally, the `cancel-in-progress` field is set to `${{ startsWith(github.ref, 'refs/pull/') }}` -
https://github.com/scikit-hep/vector/blob/d4ad49ecf1761e3a6561498d2405b7789d53deff/.github/workflows/ci.yml#L11-L15
which for some reason returns a `string` now (unverified bug/incorrect documentation, see - https://github.com/github/docs/issues/19416).
I think it should be okay to set it to `true` as set here - https://github.com/scikit-hep/cookie/blob/main/%7B%7Bcookiecutter.project_name%7D%7D/.github/workflows/ci.yml#L18-L20.
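Putting both fixes together, the relevant workflow fragments would look roughly like this (a sketch modeled on the linked cookie template, not the exact file contents; the version list is illustrative):

```yaml
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  checks:
    strategy:
      matrix:
        # Quote "3.10": unquoted, YAML parses it as the float 3.1.
        python-version: ["3.6", "3.7", "3.8", "3.9", "3.10"]
```

Setting `cancel-in-progress: true` unconditionally also cancels superseded runs on branches, not just on pull requests, which matches the cookie template's behavior.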
|
1.0
|
test
| 1
|
101,507
| 31,168,981,792
|
IssuesEvent
|
2023-08-16 22:31:40
|
ClickHouse/ClickHouse
|
https://api.github.com/repos/ClickHouse/ClickHouse
|
closed
|
`fread` declaration error
|
build st-need-info
|
> Make sure that `git diff` result is empty and you've just pulled fresh master. Try cleaning up cmake cache. Just in case, official build instructions are published here: https://clickhouse.com/docs/en/development/build/
**Operating system**
> OS kind or distribution, specific version/release, non-standard kernel if any. If you are trying to build inside virtual machine, please mention it too.
WSL2(Ubuntu 22.04)
**Cmake version**
3.26.3
**Ninja version**
1.10.1
**Compiler name and version**
clang-16, clang++-16
**Full cmake and/or ninja output**
I run `ninja -j16 clickhouse-client clickhouse-server` and get the output below.
~/Workspace/proj/finish_proj/ClickHouse/build (master) # ninja -j16 clickhouse-client clickhouse-server 1 ↵ root@DESKTOP-CO31SAO
[0/2] Re-checking globbed directories...
[96/11352] Building C object base/glibc-compatibility/CMakeFiles/glibc-compatibility.dir/glibc-compatibility.c.o
FAILED: base/glibc-compatibility/CMakeFiles/glibc-compatibility.dir/glibc-compatibility.c.o
/usr/local/bin/ccache /usr/bin/clang -DSTD_EXCEPTION_HAS_STACK_TRACE=1 -D_LIBCPP_ENABLE_THREAD_SAFETY_ANNOTATIONS -I/root/Workspace/proj/finish_proj/ClickHouse/base/glibc-compatibility/libcxxabi -I/root/Workspace/proj/finish_proj/ClickHouse/base/glibc-compatibility/musl/x86_64 -I/root/Workspace/proj/finish_proj/ClickHouse/base/glibc-compatibility/memcpy -isystem /root/Workspace/proj/finish_proj/ClickHouse/contrib/llvm-project/libcxxabi/include -isystem /root/Workspace/proj/finish_proj/ClickHouse/contrib/libunwind/include -fdiagnostics-color=always -Xclang -fuse-ctor-homing -Wno-enum-constexpr-conversion -gdwarf-aranges -pipe -fasynchronous-unwind-tables -ffile-prefix-map=/root/Workspace/proj/finish_proj/ClickHouse=. -falign-functions=32 -mbranches-within-32B-boundaries -fdiagnostics-absolute-paths -fomit-frame-pointer -O2 -g -DNDEBUG -O3 -g -gdwarf-4 -std=gnu11 -D OS_LINUX -Werror -MD -MT base/glibc-compatibility/CMakeFiles/glibc-compatibility.dir/glibc-compatibility.c.o -MF base/glibc-compatibility/CMakeFiles/glibc-compatibility.dir/glibc-compatibility.c.o.d -o base/glibc-compatibility/CMakeFiles/glibc-compatibility.dir/glibc-compatibility.c.o -c /root/Workspace/proj/finish_proj/ClickHouse/base/glibc-compatibility/glibc-compatibility.c
/root/Workspace/proj/finish_proj/ClickHouse/base/glibc-compatibility/glibc-compatibility.c:103:8: error: declaration of built-in function 'fread' requires inclusion of the header <stdio.h> [-Werror,-Wbuiltin-requires-header]
size_t fread(void *ptr, size_t size, size_t nmemb, void *stream);
^
1 error generated.
[111/11352] Building C object base/glibc-compatibility/CMakeFiles/glibc-compatibility.dir/musl/glob.c.o
ninja: build stopped: subcommand failed.
|
1.0
|
non_test
| 0
|
6,370
| 6,361,319,796
|
IssuesEvent
|
2017-07-31 12:37:23
|
warg-lang/warg
|
https://api.github.com/repos/warg-lang/warg
|
opened
|
Generate accessible static analysis diagnostics
|
ci enhancement infrastructure
|
[neovim](https://github.com/neovim/neovim) provides a nice [diagnostics overview](https://neovim.io/doc/reports/clang/) using Clang Static Analysis. While it is not completely transparent how they do that, making a similar page would be great.
|
1.0
|
non_test
| 0
|
429,193
| 30,029,463,845
|
IssuesEvent
|
2023-06-27 08:39:12
|
britishredcrosssociety/local-lockdown
|
https://api.github.com/repos/britishredcrosssociety/local-lockdown
|
closed
|
Use containers statelessly
|
bug documentation
|
WRT the recent PR #25, setting the `enableBookmarking()` function to `state` will not work in the long term.
Basically, all Docker apps should be considered stateless. That means that any transient data (in this case, the `*.rds` files for the bookmarks) may be lost at any point in time - it certainly won't be available for more than a couple of hours.
Anything you wish to save long-term must be in an external data source - that could be a blob storage account or a database. **This is particularly important to note when it comes to the CSV data in this repo if you apply any manipulations to it.**
**tl;dr** - any data written to disk **WILL** be lost.
I know that means that URLs will be ugly, but that's an unfortunate side effect. If you require a short URL, use a shortening service such as [bit.ly](https://bit.ly) before sending to anyone.
I have created a PR #34 to fix this and add some documentation on the concept of statelessness which you should all get familiar with. This issue is raised to highlight it further and act as an aide-memoire
|
1.0
|
non_test
| 0
|
45,405
| 7,181,228,322
|
IssuesEvent
|
2018-02-01 03:36:01
|
Parsl/parsl
|
https://api.github.com/repos/Parsl/parsl
|
closed
|
Notify users of anonymous usage tracking
|
documentation help wanted
|
Document what we track and have a banner on the docs saying what we track, why, and how to opt out.
Technical details are here : #34
|
1.0
|
non_test
| 0
|
226,773
| 18,044,163,863
|
IssuesEvent
|
2021-09-18 15:40:17
|
logicmoo/logicmoo_workspace
|
https://api.github.com/repos/logicmoo/logicmoo_workspace
|
opened
|
logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01 JUnit
|
logicmoo.base.fol.fiveof Test_9999 unit_test NONMONOTONIC_TYPE_01
|
(cd /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof ; timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl)
GH_MASTER_ISSUE_FINFO=
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ANONMONOTONIC_TYPE_01
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/66/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://github.com/logicmoo/logicmoo_workspace/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
```
%
running('/var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl'),
%~ /var/lib/jenkins/.local/share/swi-prolog/pack/logicmoo_utils/prolog/logicmoo_test_header.pl:92
%~ this_test_might_need( :-( use_module( library(logicmoo_plarkc))))
%~ this_test_might_need( :-( expects_dialect(pfc)))
% =============================================
% File 'mpred_builtin.pfc'
% Purpose: Agent Reactivity for SWI-Prolog
% Maintainer: Douglas Miles
% Contact: $Author: dmiles $@users.sourceforge.net %
% Version: 'interface' 1.0.0
% Revision: $Revision: 1.9 $
% Revised At: $Date: 2002/06/27 14:13:20 $
% =============================================
%
:- module(baseKB).
:- process_script_file.
%~ kifm = leftof(h1,h2).
%~ kifm = leftof(h1,h2).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:23
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h1,h2)))))
%~ kifm = leftof(h1,h2).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:24
%~ kifm = leftof(h2,h3).
%~ kifm = leftof(h2,h3).
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h2,h3)))))
%~ kifm = leftof(h2,h3).
%~ kifm = leftof(h3,h4).
%~ kifm = leftof(h3,h4).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:25
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h3,h4)))))
%~ kifm = leftof(h3,h4).
%~ kifm = leftof(h4,h5).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:26
%~ kifm = leftof(h4,h5).
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h4,h5)))))
%~ kifm = leftof(h4,h5).
%~ kifm = ( leftof(House_Leftof,House_Leftof3) =>
%~ house(House_Leftof)&house(House_Leftof3)).
%~ kifm = ( leftof(House_Leftof8,House_Leftof9) =>
%~ house(House_Leftof8)&house(House_Leftof9)).
%~ debugm( common_logic_loader,
%~ show_success( common_logic_loader,
%~ common_logic_loader : ain( clif( leftof(H1,H2)=>(house(H1)&house(H2))))))
%~ kifm = ( leftof(House_Leftof,House_Leftof3) =>
%~ house(House_Leftof)&house(House_Leftof3)).
%~ kifm = leftof(H1,H2)=>(house(H1)&house(H2)).
%~ mpred_test("Test_0001_Line_0038__H1",baseKB:house(h1))
%~ mpred_test("Test_0002_Line_0039__H2",baseKB:house(h2))
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:40
%~ mpred_test("Test_0003_Line_0040__H3",baseKB:house(h3))
%~ mpred_test("Test_0004_Line_0041__H4",baseKB:house(h4))
%~ mpred_test("Test_0005_Line_0042__H5",baseKB:house(h5))
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:45
%~ mpred_test("Test_0006_Line_0045__False_positive",baseKB:poss(house(false_positive)))
%~ mpred_test("Test_0007_Line_0047__naf_False_positive",baseKB:(\+nesc(house(false_positive))))
%~ skipped( listing( [ house/1,
%~ nesc/1]))
/*~
%~ kifm=leftof(h1,h2)
%~ kifm=leftof(h1,h2)
%~ kif_to_boxlog_attvars2 = leftof(h1,h2)
% /var/lib/jenkins/.local/share/swi-prolog/pack/pfc/prolog/pfc_test compiled into pfc_test 0.05 sec, -1 clauses
=======================================================
leftof(h1,h2)
============================================
?- kif_to_boxlog( leftof(h1,h2) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h1 leftof h2
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(h1,h2)
%~ kif_to_boxlog_attvars2 = leftof(h1,h2)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 1 entailment(s):
leftof(h1,h2).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h1 leftof h2
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
leftof(h1,h2).
============================================%~ kifm=leftof(h2,h3)
%~ kifm=leftof(h2,h3)
%~ kif_to_boxlog_attvars2 = leftof(h2,h3)
=======================================================
leftof(h2,h3)
============================================
?- kif_to_boxlog( leftof(h2,h3) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h2 leftof h3
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(h2,h3)
%~ kif_to_boxlog_attvars2 = leftof(h2,h3)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 1 entailment(s):
leftof(h2,h3).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h2 leftof h3
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
leftof(h2,h3).
============================================
%~ kifm=leftof(h3,h4)
%~ kifm=leftof(h3,h4)
%~ kif_to_boxlog_attvars2 = leftof(h3,h4)
=======================================================
leftof(h3,h4)
============================================
?- kif_to_boxlog( leftof(h3,h4) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h3 leftof h4
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(h3,h4)
%~ kif_to_boxlog_attvars2 = leftof(h3,h4)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 1 entailment(s):
leftof(h3,h4).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h3 leftof h4
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
leftof(h3,h4).
============================================
%~ kifm=leftof(h4,h5)
%~ kifm=leftof(h4,h5)
%~ kif_to_boxlog_attvars2 = leftof(h4,h5)
=======================================================
leftof(h4,h5)
============================================
?- kif_to_boxlog( leftof(h4,h5) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h4 leftof h5
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(h4,h5)
%~ kif_to_boxlog_attvars2 = leftof(h4,h5)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 1 entailment(s):
leftof(h4,h5).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h4 leftof h5
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
leftof(h4,h5).
============================================
%~ kifm=leftof(House_Leftof,House_Leftof3)=>(house(House_Leftof)&house(House_Leftof3))
%~ kifm=leftof(House_Leftof8,House_Leftof9)=>(house(House_Leftof8)&house(House_Leftof9))
%~ kif_to_boxlog_attvars2 = =>(leftof('$VAR'('House_Leftof8'),'$VAR'('House_Leftof9')),and(house('$VAR'('House_Leftof8')),house('$VAR'('House_Leftof9'))))
=======================================================
=>(leftof('$VAR'('House_Leftof'),'$VAR'('House_Leftof3')),&(house('$VAR'('House_Leftof')),house('$VAR'('House_Leftof3'))))
============================================
?- kif_to_boxlog( leftof(House_Leftof,House_Leftof3)=>(house(House_Leftof)&house(House_Leftof3)) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ If:
%~ ?House_Leftof leftof ?House_Leftof3 then it is
%~ Implied that:
%~ " ?House_Leftof isa house " and
%~ " ?House_Leftof3 isa house "
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(House_Leftof,House_Leftof3)=>(house(House_Leftof)&house(House_Leftof3))
%~ kif_to_boxlog_attvars2 = =>(leftof('$VAR'('House_Leftof'),'$VAR'('House_Leftof3')),and(house('$VAR'('House_Leftof')),house('$VAR'('House_Leftof3'))))
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 6 entailment(s):
nesc(leftof(House_Leftof,House_Leftof3))&nesc(~house(House_Leftof))==>nesc(~house(House_Leftof3)).
nesc(leftof(House_Leftof,House_Leftof3))&nesc(~house(House_Leftof3))==>nesc(~house(House_Leftof)).
nesc(leftof(House_Leftof,House_Leftof3))&poss(house(House_Leftof))==>nesc(house(House_Leftof3)).
nesc(leftof(House_Leftof,House_Leftof3))&poss(house(House_Leftof3))==>nesc(house(House_Leftof)).
poss(house(House_Leftof))&nesc(~house(House_Leftof3))==>nesc(~leftof(House_Leftof,House_Leftof3)).
poss(house(House_Leftof3))&nesc(~house(House_Leftof))==>nesc(~leftof(House_Leftof,House_Leftof3)).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily true and
%~ " ?House_Leftof isa house " is necessarily false
%~ It's Proof that:
%~ " ?House_Leftof3 isa house " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( nesc(leftof(House_Leftof,House_Leftof3))&nesc(~house(House_Leftof)) ==>
nesc( ~( house(House_Leftof3)))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily true and
%~ " ?House_Leftof3 isa house " is necessarily false
%~ It's Proof that:
%~ " ?House_Leftof isa house " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( nesc(leftof(House_Leftof,House_Leftof3))&nesc(~house(House_Leftof3)) ==>
nesc( ~( house(House_Leftof)))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily true and
%~ " ?House_Leftof isa house " is possible
%~ It's Proof that:
%~ " ?House_Leftof3 isa house " is necessarily true
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( nesc(leftof(House_Leftof,House_Leftof3))&poss(house(House_Leftof)) ==>
nesc( house(House_Leftof3))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily true and
%~ " ?House_Leftof3 isa house " is possible
%~ It's Proof that:
%~ " ?House_Leftof isa house " is necessarily true
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( nesc(leftof(House_Leftof,House_Leftof3))&poss(house(House_Leftof3)) ==>
nesc( house(House_Leftof))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof isa house " is possible and
%~ " ?House_Leftof3 isa house " is necessarily false
%~ It's Proof that:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( poss(house(House_Leftof))&nesc(~house(House_Leftof3)) ==>
nesc( ~( leftof(House_Leftof,House_Leftof3)))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof3 isa house " is possible and
%~ " ?House_Leftof isa house " is necessarily false
%~ It's Proof that:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( poss(house(House_Leftof3))&nesc(~house(House_Leftof)) ==>
nesc( ~( leftof(House_Leftof,House_Leftof3)))).
============================================
%~ kifm=leftof(H1,H2)=>(house(H1)&house(H2))
%~ kif_to_boxlog_attvars2 = =>(leftof('$VAR'('H1'),'$VAR'('H2')),and(house('$VAR'('H1')),house('$VAR'('H2'))))
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H2 isa house " is possible and
%~ " ?H1 isa house " is necessarily false
%~ It's Proof that:
%~ " ?H1 leftof ?H2 " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
poss(house(H2))&nesc(~house(H1))==>nesc(~leftof(H1,H2)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 leftof ?H2 " is necessarily true and
%~ " ?H1 isa house " is necessarily false
%~ It's Proof that:
%~ " ?H2 isa house " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
nesc(leftof(H1,H2))&nesc(~house(H1))==>nesc(~house(H2)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 leftof ?H2 " is necessarily true and
%~ " ?H2 isa house " is possible
%~ It's Proof that:
%~ " ?H1 isa house " is necessarily true
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
nesc(leftof(H1,H2))&poss(house(H2))==>nesc(house(H1)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 isa house " is possible and
%~ " ?H2 isa house " is necessarily false
%~ It's Proof that:
%~ " ?H1 leftof ?H2 " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
poss(house(H1))&nesc(~house(H2))==>nesc(~leftof(H1,H2)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 leftof ?H2 " is necessarily true and
%~ " ?H2 isa house " is necessarily false
%~ It's Proof that:
%~ " ?H1 isa house " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
nesc(leftof(H1,H2))&nesc(~house(H2))==>nesc(~house(H1)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 leftof ?H2 " is necessarily true and
%~ " ?H1 isa house " is possible
%~ It's Proof that:
%~ " ?H2 isa house " is necessarily true
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
nesc(leftof(H1,H2))&poss(house(H1))==>nesc(house(H2)).
?- listing(kif_show).
%~ mpred_test("Test_0001_Line_0038__H1",baseKB:house(h1))
Call: (90) [baseKB] baseKB:house(h1)
Fail: (90) [baseKB] baseKB:house(h1)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h1))),rtrace(baseKB:house(h1))))
no_proof_for(\+house(h1)).
no_proof_for(\+house(h1)).
no_proof_for(\+house(h1)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0001_Line_0038__H1'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0001_Line_0038__H1-junit.xml
%~ mpred_test("Test_0002_Line_0039__H2",baseKB:house(h2))
Call: (90) [baseKB] baseKB:house(h2)
Fail: (90) [baseKB] baseKB:house(h2)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h2))),rtrace(baseKB:house(h2))))
no_proof_for(\+house(h2)).
no_proof_for(\+house(h2)).
no_proof_for(\+house(h2)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0002_Line_0039__H2'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0002_Line_0039__H2-junit.xml
%~ mpred_test("Test_0003_Line_0040__H3",baseKB:house(h3))
Call: (90) [baseKB] baseKB:house(h3)
Fail: (90) [baseKB] baseKB:house(h3)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h3))),rtrace(baseKB:house(h3))))
no_proof_for(\+house(h3)).
no_proof_for(\+house(h3)).
no_proof_for(\+house(h3)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0003_Line_0040__H3'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0003_Line_0040__H3-junit.xml
%~ mpred_test("Test_0004_Line_0041__H4",baseKB:house(h4))
Call: (90) [baseKB] baseKB:house(h4)
Fail: (90) [baseKB] baseKB:house(h4)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h4))),rtrace(baseKB:house(h4))))
no_proof_for(\+house(h4)).
no_proof_for(\+house(h4)).
no_proof_for(\+house(h4)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0004_Line_0041__H4'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0004_Line_0041__H4-junit.xml
%~ mpred_test("Test_0005_Line_0042__H5",baseKB:house(h5))
Call: (90) [baseKB] baseKB:house(h5)
Fail: (90) [baseKB] baseKB:house(h5)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h5))),rtrace(baseKB:house(h5))))
no_proof_for(\+house(h5)).
no_proof_for(\+house(h5)).
no_proof_for(\+house(h5)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0005_Line_0042__H5'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0005_Line_0042__H5-junit.xml
%~ mpred_test("Test_0006_Line_0045__False_positive",baseKB:poss(house(false_positive)))
passed=info(why_was_true(baseKB:poss(house(false_positive))))
no_proof_for(poss(house(false_positive))).
no_proof_for(poss(house(false_positive))).
no_proof_for(poss(house(false_positive))).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0006_Line_0045__False_positive'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0006_Line_0045__False_positive-junit.xml
%~ mpred_test("Test_0007_Line_0047__naf_False_positive",baseKB:(\+nesc(house(false_positive))))
passed=info(why_was_true(baseKB:(\+nesc(house(false_positive)))))
no_proof_for(\+nesc(house(false_positive))).
no_proof_for(\+nesc(house(false_positive))).
no_proof_for(\+nesc(house(false_positive))).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0007_Line_0047__naf_False_positive'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0007_Line_0047__naf_False_positive-junit.xml
~*/
%~ unused(save_junit_results)
%~ test_completed_exit(6)
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
```
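The transcript above shows `kif_to_boxlog` expanding the implication `leftof(H1,H2) => (house(H1) & house(H2))` into six modal entailments, pairing `nesc`/`poss` premises with the negation of the other conjunct. As a rough illustration only (this is *not* the actual `kif_to_boxlog` implementation, just a sketch that mirrors the six rule strings printed in the log for a `P => (Q1 & Q2)` formula):

```python
# Illustrative sketch: enumerate the six nesc/poss entailments that the
# transcript prints for an implication P => (Q1 & Q2). This only mirrors
# the log output above; it is NOT the real kif_to_boxlog code.

def entailments(p, q1, q2):
    """Return the six rule strings shown in the log for p => (q1 & q2)."""
    return [
        # p holding while one conjunct is necessarily false refutes the other.
        f"nesc({p})&nesc(~{q1})==>nesc(~{q2})",
        f"nesc({p})&nesc(~{q2})==>nesc(~{q1})",
        # p holding while one conjunct is possible makes it necessary.
        f"nesc({p})&poss({q1})==>nesc({q2})",
        f"nesc({p})&poss({q2})==>nesc({q1})",
        # Contrapositives: a possible conjunct plus a necessarily false one
        # refutes p itself.
        f"poss({q1})&nesc(~{q2})==>nesc(~{p})",
        f"poss({q2})&nesc(~{q1})==>nesc(~{p})",
    ]

for rule in entailments("leftof(H1,H2)", "house(H1)", "house(H2)"):
    print(rule)
```

Substituting the test's formula reproduces the same six entailment lines listed under "Results in the following 6 entailment(s)" above.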
totalTime=1
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ANONMONOTONIC_TYPE_01
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/66/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://github.com/logicmoo/logicmoo_workspace/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
FAILED: /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-junit-minor -k nonmonotonic_type_01.pl (returned 6)
|
2.0
|
logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01 JUnit - (cd /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof ; timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl)
GH_MASTER_ISSUE_FINFO=
```
%
running('/var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl'),
%~ /var/lib/jenkins/.local/share/swi-prolog/pack/logicmoo_utils/prolog/logicmoo_test_header.pl:92
%~ this_test_might_need( :-( use_module( library(logicmoo_plarkc))))
%~ this_test_might_need( :-( expects_dialect(pfc)))
% =============================================
% File 'mpred_builtin.pfc'
% Purpose: Agent Reactivity for SWI-Prolog
% Maintainer: Douglas Miles
% Contact: $Author: dmiles $@users.sourceforge.net %
% Version: 'interface' 1.0.0
% Revision: $Revision: 1.9 $
% Revised At: $Date: 2002/06/27 14:13:20 $
% =============================================
%
:- module(baseKB).
:- process_script_file.
%~ kifm = leftof(h1,h2).
%~ kifm = leftof(h1,h2).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:23
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h1,h2)))))
%~ kifm = leftof(h1,h2).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:24
%~ kifm = leftof(h2,h3).
%~ kifm = leftof(h2,h3).
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h2,h3)))))
%~ kifm = leftof(h2,h3).
%~ kifm = leftof(h3,h4).
%~ kifm = leftof(h3,h4).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:25
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h3,h4)))))
%~ kifm = leftof(h3,h4).
%~ kifm = leftof(h4,h5).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:26
%~ kifm = leftof(h4,h5).
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h4,h5)))))
%~ kifm = leftof(h4,h5).
no_proof_for(\+house(h5)).
no_proof_for(\+house(h5)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0005_Line_0042__H5'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0005_Line_0042__H5-junit.xml
%~ mpred_test("Test_0006_Line_0045__False_positive",baseKB:poss(house(false_positive)))
passed=info(why_was_true(baseKB:poss(house(false_positive))))
no_proof_for(poss(house(false_positive))).
no_proof_for(poss(house(false_positive))).
no_proof_for(poss(house(false_positive))).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0006_Line_0045__False_positive'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0006_Line_0045__False_positive-junit.xml
%~ mpred_test("Test_0007_Line_0047__naf_False_positive",baseKB:(\+nesc(house(false_positive))))
passed=info(why_was_true(baseKB:(\+nesc(house(false_positive)))))
no_proof_for(\+nesc(house(false_positive))).
no_proof_for(\+nesc(house(false_positive))).
no_proof_for(\+nesc(house(false_positive))).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0007_Line_0047__naf_False_positive'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0007_Line_0047__naf_False_positive-junit.xml
~*/
%~ unused(save_junit_results)
%~ test_completed_exit(6)
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
```
totalTime=1
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ANONMONOTONIC_TYPE_01
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/66/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://github.com/logicmoo/logicmoo_workspace/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
FAILED: /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-junit-minor -k nonmonotonic_type_01.pl (returned 6)
|
test
|
logicmoo base fol fiveof nonmonotonic type junit cd var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl gh master issue finfo issue search gitlab latest this build github running var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof nonmonotonic type pl var lib jenkins local share swi prolog pack logicmoo utils prolog logicmoo test header pl this test might need use module library logicmoo plarkc this test might need expects dialect pfc file mpred builtin pfc purpose agent reactivity for swi prolog maintainer douglas miles contact author dmiles users sourceforge net version interface revision revision revised at date module basekb process script file kifm leftof kifm leftof var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof nonmonotonic type pl debugm common logic loader show success common logic loader common logic loader ain clif leftof kifm leftof var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof nonmonotonic type pl kifm leftof kifm leftof debugm common logic loader show success common logic loader common logic loader ain clif leftof kifm leftof kifm leftof kifm leftof var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof nonmonotonic type pl debugm common logic loader show success common logic loader common logic loader ain clif leftof kifm leftof kifm leftof var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof nonmonotonic type pl kifm leftof debugm common logic loader show success common logic loader common logic loader ain clif leftof kifm leftof kifm leftof house leftof house house house leftof house house kifm leftof house house house house house house debugm common logic loader show success common logic loader common logic loader ain clif leftof 
house house kifm leftof house leftof house house house leftof house house kifm leftof house house mpred test test line basekb house mpred test test line basekb house var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof nonmonotonic type pl mpred test test line basekb house mpred test test line basekb house mpred test test line basekb house var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof nonmonotonic type pl mpred test test line false positive basekb poss house false positive mpred test test line naf false positive basekb nesc house false positive skipped listing house nesc kifm leftof kifm leftof kif to boxlog leftof var lib jenkins local share swi prolog pack pfc prolog pfc test compiled into pfc test sec clauses leftof kif to boxlog leftof in english leftof kifm leftof kif to boxlog leftof results in the following entailment s leftof leftof leftof kifm leftof kifm leftof kif to boxlog leftof leftof kif to boxlog leftof in english leftof kifm leftof kif to boxlog leftof results in the following entailment s leftof leftof leftof kifm leftof kifm leftof kif to boxlog leftof leftof kif to boxlog leftof in english leftof kifm leftof kif to boxlog leftof results in the following entailment s leftof leftof leftof kifm leftof kifm leftof kif to boxlog leftof leftof kif to boxlog leftof in english leftof kifm leftof kif to boxlog leftof results in the following entailment s leftof leftof leftof kifm leftof house leftof house house house leftof house house kifm leftof house house house house house house kif to boxlog leftof var house var house and house var house house var house leftof var house leftof var house house var house leftof house var house kif to boxlog leftof house leftof house house house leftof house house in english if house leftof leftof house then it is implied that house leftof isa house and house isa house kifm leftof house leftof house house house leftof house house kif to 
boxlog leftof var house leftof var house and house var house leftof house var house results in the following entailment s nesc leftof house leftof house nesc house house leftof nesc house house nesc leftof house leftof house nesc house house nesc house house leftof nesc leftof house leftof house poss house house leftof nesc house house nesc leftof house leftof house poss house house nesc house house leftof poss house house leftof nesc house house nesc leftof house leftof house poss house house nesc house house leftof nesc leftof house leftof house whenever house leftof leftof house is necessarily true and house leftof isa house is necessarily false it s proof that house isa house is necessarily false nesc leftof house leftof house nesc house house leftof nesc house house and whenever house leftof leftof house is necessarily true and house isa house is necessarily false it s proof that house leftof isa house is necessarily false nesc leftof house leftof house nesc house house nesc house house leftof and whenever house leftof leftof house is necessarily true and house leftof isa house is possible it s proof that house isa house is necessarily true nesc leftof house leftof house poss house house leftof nesc house house and whenever house leftof leftof house is necessarily true and house isa house is possible it s proof that house leftof isa house is necessarily true nesc leftof house leftof house poss house house nesc house house leftof and whenever house leftof isa house is possible and house isa house is necessarily false it s proof that house leftof leftof house is necessarily false poss house house leftof nesc house house nesc leftof house leftof house and whenever house isa house is possible and house leftof isa house is necessarily false it s proof that house leftof leftof house is necessarily false poss house house nesc house house leftof nesc leftof house leftof house kifm leftof house house kif to boxlog leftof var var and house var house var whenever isa 
house is possible and isa house is necessarily false it s proof that leftof is necessarily false poss house nesc house nesc leftof and whenever leftof is necessarily true and isa house is necessarily false it s proof that isa house is necessarily false nesc leftof nesc house nesc house and whenever leftof is necessarily true and isa house is possible it s proof that isa house is necessarily true nesc leftof poss house nesc house and whenever isa house is possible and isa house is necessarily false it s proof that leftof is necessarily false poss house nesc house nesc leftof and whenever leftof is necessarily true and isa house is necessarily false it s proof that isa house is necessarily false nesc leftof nesc house nesc house and whenever leftof is necessarily true and isa house is possible it s proof that isa house is necessarily true nesc leftof poss house nesc house listing kif show mpred test test line basekb house call basekb house fail basekb house call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb house rtrace basekb house no proof for house no proof for house no proof for house name logicmoo base fol fiveof nonmonotonic type test line junit classname logicmoo base fol fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line junit xml mpred test test line basekb house call basekb house fail basekb house call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb house rtrace basekb house no proof for house no proof for house no proof for house name logicmoo base fol fiveof nonmonotonic type test line junit classname logicmoo base fol fiveof nonmonotonic type junit cmd 
timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line junit xml mpred test test line basekb house call basekb house fail basekb house call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb house rtrace basekb house no proof for house no proof for house no proof for house name logicmoo base fol fiveof nonmonotonic type test line junit classname logicmoo base fol fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line junit xml mpred test test line basekb house call basekb house fail basekb house call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb house rtrace basekb house no proof for house no proof for house no proof for house name logicmoo base fol fiveof nonmonotonic type test line junit classname logicmoo base fol fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line junit xml mpred test test line basekb house call basekb house fail basekb house call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb house rtrace basekb house no proof for house no proof for house no proof for house name logicmoo base fol fiveof nonmonotonic type test line junit classname logicmoo base fol 
fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line junit xml mpred test test line false positive basekb poss house false positive passed info why was true basekb poss house false positive no proof for poss house false positive no proof for poss house false positive no proof for poss house false positive name logicmoo base fol fiveof nonmonotonic type test line false positive junit classname logicmoo base fol fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line false positive junit xml mpred test test line naf false positive basekb nesc house false positive passed info why was true basekb nesc house false positive no proof for nesc house false positive no proof for nesc house false positive no proof for nesc house false positive name logicmoo base fol fiveof nonmonotonic type test line naf false positive junit classname logicmoo base fol fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line naf false positive junit xml unused save junit results test completed exit dynamic junit prop dynamic junit prop dynamic junit prop totaltime issue search gitlab latest this build github failed var lib jenkins workspace logicmoo workspace bin lmoo junit minor k nonmonotonic type pl returned
| 1
|
84,170
| 7,893,221,264
|
IssuesEvent
|
2018-06-28 17:19:45
|
Fourdee/DietPi
|
https://api.github.com/repos/Fourdee/DietPi
|
closed
|
Nukkit Site Offline/Installation Broken
|
External Bug :beetle: Testing/testers required :arrow_down_small: Workaround/Fix available :clinking_glasses:
|
Hello, upon installation of Nukkit - Server for Minecraft Pocket Edition on a Raspberry Pi 3 I got an error saying that the remote file does not exist (broken link), referring to the file: nukkit-1.0-SNAPSHOT.jar
By going manually from a browser to the site that DietPi tries to contact for the file, I got this message in Chinese, which I have translated into English:
Site: http://ci.mengcraft.com:8080/job/nukkit/
Message: Reminder: The website cannot be accessed temporarily because it has not been filed or has access to illegal content. Please contact the access provider.
It looks like the Nukkit webpage is down for good. I have found an alternative link/page that is working, if you want to put it into the Nukkit installer script:
https://ci.potestas.xyz/job/NukkitX/job/master/lastSuccessfulBuild/artifact/target/nukkit-1.0-SNAPSHOT.jar
How can I change the script for now to make Nukkit install successfully?
Thank you
|
2.0
|
Nukkit Site Offline/Installation Broken - Hello, upon installation of Nukkit - Server for Minecraft Pocket Edition on a Raspberry Pi 3 I got an error saying that the remote file does not exist (broken link), referring to the file: nukkit-1.0-SNAPSHOT.jar
By going manually from a browser to the site that DietPi tries to contact for the file, I got this message in Chinese, which I have translated into English:
Site: http://ci.mengcraft.com:8080/job/nukkit/
Message: Reminder: The website cannot be accessed temporarily because it has not been filed or has access to illegal content. Please contact the access provider.
It looks like the Nukkit webpage is down for good. I have found an alternative link/page that is working, if you want to put it into the Nukkit installer script:
https://ci.potestas.xyz/job/NukkitX/job/master/lastSuccessfulBuild/artifact/target/nukkit-1.0-SNAPSHOT.jar
How can I change the script for now to make Nukkit install successfully?
Thank you
|
test
|
nukkit site offline installation broken hello upon installation of nukkit server for minecraft pocket edition on a raspberry pi i got an error saying that remote file does not exist broken link referring to the file nukkit snapshot jar by going manual from a browser to this site that dietpi tries to connect to get the file i got this message in chinese i have translate it into english site message reminder the website cannot be accessed temporarily because it has not been filed or has access to illegal content please contact the access provider looks like the nukkit webpage is down for good i have found an alternative link page that is working if you want to put it to the nukkit installer script how can i change the script for now to make nukkit install successfully for now thank you
| 1
|
7,745
| 2,925,620,398
|
IssuesEvent
|
2015-06-26 07:35:54
|
e-government-ua/i
|
https://api.github.com/repos/e-government-ua/i
|
closed
|
On the central backend, add an asAuth field to the ServiceData entity and list the allowed authorization methods there, e.g. "BankID,EDS"
|
hi priority test
|
Be sure to add values for this entity to the default initialization lists, in the form:
BankID,EDS
|
1.0
|
On the central backend, add an asAuth field to the ServiceData entity and list the allowed authorization methods there, e.g. "BankID,EDS" - be sure to add values for this entity to the default initialization lists, in the form:
BankID,EDS
|
test
|
на бэке централа добавить добавить в сущность servicedata поле asauth и перечислить там допустимые методы авторизации например bankid eds обязательно добавить в умолчательные инициализационные списки значения по этой сущности в виде bankid eds
| 1
|
131,735
| 10,708,049,794
|
IssuesEvent
|
2019-10-24 18:48:49
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
roachtest: tpccbench/nodes=3/cpu=16 failed
|
C-test-failure O-roachtest O-robot
|
SHA: https://github.com/cockroachdb/cockroach/commits/f9a102814bdce90d687f6215acadf10a9d784c29
Parameters:
To repro, try:
```
# Don't forget to check out a clean suitable branch and experiment with the
# stress invocation until the desired results present themselves. For example,
# using stress instead of stressrace and passing the '-p' stressflag which
# controls concurrency.
./scripts/gceworker.sh start && ./scripts/gceworker.sh mosh
cd ~/go/src/github.com/cockroachdb/cockroach && \
stdbuf -oL -eL \
make stressrace TESTS=tpccbench/nodes=3/cpu=16 PKG=roachtest TESTTIMEOUT=5m STRESSFLAGS='-maxtime 20m -timeout 10m' 2>&1 | tee /tmp/stress.log
```
Failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=1555990&tab=artifacts#/tpccbench/nodes=3/cpu=16
```
The test failed on branch=master, cloud=aws:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/20191024-1555990/tpccbench/nodes=3/cpu=16/run_1
test_runner.go:704: test timed out (10h0m0s)
```
|
2.0
|
roachtest: tpccbench/nodes=3/cpu=16 failed - SHA: https://github.com/cockroachdb/cockroach/commits/f9a102814bdce90d687f6215acadf10a9d784c29
Parameters:
To repro, try:
```
# Don't forget to check out a clean suitable branch and experiment with the
# stress invocation until the desired results present themselves. For example,
# using stress instead of stressrace and passing the '-p' stressflag which
# controls concurrency.
./scripts/gceworker.sh start && ./scripts/gceworker.sh mosh
cd ~/go/src/github.com/cockroachdb/cockroach && \
stdbuf -oL -eL \
make stressrace TESTS=tpccbench/nodes=3/cpu=16 PKG=roachtest TESTTIMEOUT=5m STRESSFLAGS='-maxtime 20m -timeout 10m' 2>&1 | tee /tmp/stress.log
```
Failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=1555990&tab=artifacts#/tpccbench/nodes=3/cpu=16
```
The test failed on branch=master, cloud=aws:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/20191024-1555990/tpccbench/nodes=3/cpu=16/run_1
test_runner.go:704: test timed out (10h0m0s)
```
|
test
|
roachtest tpccbench nodes cpu failed sha parameters to repro try don t forget to check out a clean suitable branch and experiment with the stress invocation until the desired results present themselves for example using stress instead of stressrace and passing the p stressflag which controls concurrency scripts gceworker sh start scripts gceworker sh mosh cd go src github com cockroachdb cockroach stdbuf ol el make stressrace tests tpccbench nodes cpu pkg roachtest testtimeout stressflags maxtime timeout tee tmp stress log failed test the test failed on branch master cloud aws test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts tpccbench nodes cpu run test runner go test timed out
| 1
|
655,418
| 21,689,815,257
|
IssuesEvent
|
2022-05-09 14:27:59
|
zeyneplervesarp/swe574-javagang
|
https://api.github.com/repos/zeyneplervesarp/swe574-javagang
|
closed
|
Add FOLLOW to Activity Streams
|
enhancement backend medium priority difficulty-medium
|
Follow events had been skipped in the activity streams; they should be added.
In the end, the activity should look like:
` {
"summary": "<username> started following <username>",
"verb": "follow",
"actor": {
"objectType": "user",
"displayName": "<username>",
"id": "<id of the following user>",
"url": "/user/<id of the following user>"
},`
|
1.0
|
Add FOLLOW to Activity Streams - Follow events had been skipped in the activity streams; they should be added.
In the end, the activity should look like:
` {
"summary": "<username> started following <username>",
"verb": "follow",
"actor": {
"objectType": "user",
"displayName": "<username>",
"id": "<id of the following user>",
"url": "/user/<id of the following user>"
},`
|
non_test
|
add follow to activity streams follow events had been skipped in the activity streams it should be added in the end the activity should look like summary started following verb follow actor objecttype user displayname id url user
| 0
|
289,668
| 8,873,702,155
|
IssuesEvent
|
2019-01-11 18:58:51
|
kleros/archon
|
https://api.github.com/repos/kleros/archon
|
opened
|
Validators should be able to determine if URI is preValidated
|
Priority: Medium Status: Available Type: Enhancement :sparkles:
|
URIs from protocols such as IPFS guarantee file integrity (if fetched from a trusted gateway). Therefore Archon does not need to re-validate these files. At the moment the higher-level functions (e.g. `getEvidence`) tell the utility functions whether or not the URI needs to be validated. `validateFileFromURI` should be able to handle this on its own, so that IPFS links can be seamlessly fetched in the same workflow as classic links.
- IPFS gateways are passed to the arbitrator and arbitrable classes, so utils do not have access to them.
- This would not solve the case where a URI is passed that already includes the gateway. The URI would have to follow the format `/ipfs/...` in order to be marked as prevalidated, since not all gateways are trusted by the user.
- Possibly add the ability to revalidate IPFS links? It is a bit tricky since IPFS by default does file transformations before hashing, so that would have to be replicated. And it can be opted out of in some cases, so it would be difficult for Archon to know the format.
|
1.0
|
Validators should be able to determine if URI is preValidated - URIs from protocols such as IPFS guarantee file integrity (if fetched from a trusted gateway). Therefore Archon does not need to re-validate these files. At the moment the higher-level functions (e.g. `getEvidence`) tell the utility functions whether or not the URI needs to be validated. `validateFileFromURI` should be able to handle this on its own, so that IPFS links can be seamlessly fetched in the same workflow as classic links.
- IPFS gateways are passed to the arbitrator and arbitrable classes, so utils do not have access to them.
- This would not solve the case where a URI is passed that already includes the gateway. The URI would have to follow the format `/ipfs/...` in order to be marked as prevalidated, since not all gateways are trusted by the user.
- Possibly add the ability to revalidate IPFS links? It is a bit tricky since IPFS by default does file transformations before hashing, so that would have to be replicated. And it can be opted out of in some cases, so it would be difficult for Archon to know the format.
|
non_test
|
validators should be able to determine if uri is prevalidated uri s from protocols such as ipfs guarantee file integrity if fetched from a trusted gateway therefore archon does not need to re validate these file at the moment the higher level functions e g getevidence tell the utility functions whether or not the uri needs to be validated validatefilefromuri should be able to handle this on it s own so that ipfs links can be seamlessly fetched in the same workflow as classic links ipfs gateways are passed to the arbitrator and arbitrable classes so utils do not have access to them this would not solve the case where a uri is passed that already includes the gateway the uri would have to follow the format ipfs in order to be marked as prevalidated since not all gateways are trusted by the user possibly add the ability to revalidate ipfs links it is a bit tricky since ipfs by default does file transformations before hashing so that would have to be replicated and it can be opted out of in some cases so it would be difficult for archon to know the format
| 0
|
12,468
| 20,017,523,225
|
IssuesEvent
|
2022-02-01 13:34:07
|
jacopograndi/snub
|
https://api.github.com/repos/jacopograndi/snub
|
closed
|
Turret detail GUI
|
requirement
|
GUI that shows all turrets stats, all modules and their effect. The detail GUI is modular: can read a dict and display it.
|
1.0
|
Turret detail GUI - GUI that shows all turrets stats, all modules and their effect. The detail GUI is modular: can read a dict and display it.
|
non_test
|
turret detail gui gui that shows all turrets stats all modules and their effect the detail gui is modular can read a dict and display it
| 0
|
98,534
| 16,376,721,452
|
IssuesEvent
|
2021-05-16 08:54:32
|
Seagate/cortx-management-portal
|
https://api.github.com/repos/Seagate/cortx-management-portal
|
opened
|
CVE-2020-28500 (Medium) detected in lodash-4.17.20.tgz
|
security vulnerability
|
## CVE-2020-28500 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.20.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.20.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.20.tgz</a></p>
<p>Path to dependency file: cortx-management-portal/experiments/stats-gui-c3-poc/package.json</p>
<p>Path to vulnerable library: cortx-management-portal/experiments/stats-gui-c3-poc/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- eslint-5.16.0.tgz (Root Library)
- :x: **lodash-4.17.20.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/Seagate/cortx-management-portal/commits/938b791eb755fea8b89e5e43b34cc0b6511919ba">938b791eb755fea8b89e5e43b34cc0b6511919ba</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Lodash versions prior to 4.17.21 are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions.
<p>Publish Date: 2021-02-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500>CVE-2020-28500</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28500">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28500</a></p>
<p>Release Date: 2021-02-15</p>
<p>Fix Resolution: lodash-4.17.21</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.17.20","packageFilePaths":["/experiments/stats-gui-c3-poc/package.json"],"isTransitiveDependency":true,"dependencyTree":"eslint:5.16.0;lodash:4.17.20","isMinimumFixVersionAvailable":true,"minimumFixVersion":"lodash-4.17.21"}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2020-28500","vulnerabilityDetails":"Lodash versions prior to 4.17.21 are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-28500 (Medium) detected in lodash-4.17.20.tgz - ## CVE-2020-28500 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.20.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.20.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.20.tgz</a></p>
<p>Path to dependency file: cortx-management-portal/experiments/stats-gui-c3-poc/package.json</p>
<p>Path to vulnerable library: cortx-management-portal/experiments/stats-gui-c3-poc/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- eslint-5.16.0.tgz (Root Library)
- :x: **lodash-4.17.20.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/Seagate/cortx-management-portal/commits/938b791eb755fea8b89e5e43b34cc0b6511919ba">938b791eb755fea8b89e5e43b34cc0b6511919ba</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Lodash versions prior to 4.17.21 are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions.
<p>Publish Date: 2021-02-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500>CVE-2020-28500</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28500">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28500</a></p>
<p>Release Date: 2021-02-15</p>
<p>Fix Resolution: lodash-4.17.21</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.17.20","packageFilePaths":["/experiments/stats-gui-c3-poc/package.json"],"isTransitiveDependency":true,"dependencyTree":"eslint:5.16.0;lodash:4.17.20","isMinimumFixVersionAvailable":true,"minimumFixVersion":"lodash-4.17.21"}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2020-28500","vulnerabilityDetails":"Lodash versions prior to 4.17.21 are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_test
|
cve medium detected in lodash tgz cve medium severity vulnerability vulnerable library lodash tgz lodash modular utilities library home page a href path to dependency file cortx management portal experiments stats gui poc package json path to vulnerable library cortx management portal experiments stats gui poc node modules lodash package json dependency hierarchy eslint tgz root library x lodash tgz vulnerable library found in head commit a href found in base branch main vulnerability details lodash versions prior to are vulnerable to regular expression denial of service redos via the tonumber trim and trimend functions publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution lodash isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree eslint lodash isminimumfixversionavailable true minimumfixversion lodash basebranches vulnerabilityidentifier cve vulnerabilitydetails lodash versions prior to are vulnerable to regular expression denial of service redos via the tonumber trim and trimend functions vulnerabilityurl
| 0
|
24,803
| 24,304,696,461
|
IssuesEvent
|
2022-09-29 16:21:57
|
DCS-LCSR/SignStream3
|
https://api.github.com/repos/DCS-LCSR/SignStream3
|
closed
|
Converter
|
enhancement usability concern severity NONE
|
Converting signstream collection files into .csv files. An app existed already which does this for older collection files - investigate that app and see what would be required to do so for the current day collection files.
|
True
|
Converter - Converting signstream collection files into .csv files. An app existed already which does this for older collection files - investigate that app and see what would be required to do so for the current day collection files.
|
non_test
|
converter converting signstream collection files into csv files an app existed already which does this for older collection files investigate that app and see what would be required to do so for the current day collection files
| 0
|
194,680
| 14,684,920,432
|
IssuesEvent
|
2021-01-01 05:55:47
|
Grinnode-live/2020-grin-bug-bash-challenge
|
https://api.github.com/repos/Grinnode-live/2020-grin-bug-bash-challenge
|
closed
|
[GRIN-Node][Owner API 2.0] test the get_status API method
|
Grin-Node Test-Case
|
**Description**
Goal of this issue is to test the [get_status API method](https://docs.rs/grin_api/4.1.1/grin_api/trait.OwnerRpc.html#tymethod.get_status)
**Prerequisites**
1. GRIN-Node
**Test procedure**
1. Run GRIN-Node in Owner API listener mode
2. Run the [get_connected_peers API method](https://docs.rs/grin_api/4.1.1/grin_api/trait.OwnerRpc.html#tymethod.get_status)
**Expected result:**
Output should match [the example](https://docs.rs/grin_api/4.1.1/grin_api/trait.OwnerRpc.html#json-rpc-example). Include all cURL requests and responses.
Include the exact version of your grin-node and also your environment
```
uname -a
```
|
1.0
|
[GRIN-Node][Owner API 2.0] test the get_status API method - **Description**
Goal of this issue is to test the [get_status API method](https://docs.rs/grin_api/4.1.1/grin_api/trait.OwnerRpc.html#tymethod.get_status)
**Prerequisites**
1. GRIN-Node
**Test procedure**
1. Run GRIN-Node in Owner API listener mode
2. Run the [get_connected_peers API method](https://docs.rs/grin_api/4.1.1/grin_api/trait.OwnerRpc.html#tymethod.get_status)
**Expected result:**
Output should match [the example](https://docs.rs/grin_api/4.1.1/grin_api/trait.OwnerRpc.html#json-rpc-example). Include all cURL requests and responses.
Include the exact version of your grin-node and also your environment
```
uname -a
```
|
test
|
test the get status api method description goal of this issue is to test the prerequisites grin node test procedure run grin node in owner api listener mode run the expected result output should match include all curl requests and responses include the exact version of your grin node and also your environment uname a
| 1
|
159,652
| 12,485,478,647
|
IssuesEvent
|
2020-05-30 20:00:07
|
rotki/rotki
|
https://api.github.com/repos/rotki/rotki
|
closed
|
Migrate accounting settings tests to Cypress
|
frontend tests
|
## Abstract
As part of the effort to reintroduce e2e tests for the frontend #638 the accounting settings [tests](https://github.com/rotki/rotki/blob/08562c304c528f2d2bf2f8a7556a33d0dac0194d/electron-app/tests/spectron/specs/settings_accounting.spec.ts) need to be migrated from Spectron to Cypress.
The tests should
- [ ] Replicate existing testing accurately
- [ ] Pass properly on CI
|
1.0
|
Migrate accounting settings tests to Cypress - ## Abstract
As part of the effort to reintroduce e2e tests for the frontend #638 the accounting settings [tests](https://github.com/rotki/rotki/blob/08562c304c528f2d2bf2f8a7556a33d0dac0194d/electron-app/tests/spectron/specs/settings_accounting.spec.ts) need to be migrated from Spectron to Cypress.
The tests should
- [ ] Replicate existing testing accurately
- [ ] Pass properly on CI
|
test
|
migrate accounting settings tests to cypress abstract as part of the effort to reintroduce tests for the frontend the accounting settings need to be migrated from spectron to cypress the tests should replicate existing testing accurately pass properly on ci
| 1
|
627,124
| 19,896,581,198
|
IssuesEvent
|
2022-01-25 00:15:52
|
googleapis/repo-automation-bots
|
https://api.github.com/repos/googleapis/repo-automation-bots
|
closed
|
release-trigger: hitting memory limits in Cloud Run and crashing
|
type: bug priority: p2
|
Memory limit of 953M exceeded with 967M used. Consider increasing the memory limit, see https://cloud.google.com/run/docs/configuring/memory-limits
|
1.0
|
release-trigger: hitting memory limits in Cloud Run and crashing - Memory limit of 953M exceeded with 967M used. Consider increasing the memory limit, see https://cloud.google.com/run/docs/configuring/memory-limits
|
non_test
|
release trigger hitting memory limits in cloud run and crashing memory limit of exceeded with used consider increasing the memory limit see
| 0
|
119,119
| 17,604,031,691
|
IssuesEvent
|
2021-08-17 14:58:14
|
Pio1006/ui
|
https://api.github.com/repos/Pio1006/ui
|
opened
|
CVE-2020-7598 (Medium) detected in minimist-0.0.8.tgz, minimist-1.2.0.tgz
|
security vulnerability
|
## CVE-2020-7598 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimist-0.0.8.tgz</b>, <b>minimist-1.2.0.tgz</b></p></summary>
<p>
<details><summary><b>minimist-0.0.8.tgz</b></p></summary>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz">https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz</a></p>
<p>Path to dependency file: ui/codemodes/cmf/package.json</p>
<p>Path to vulnerable library: ui/codemodes/cmf/node_modules/minimist</p>
<p>
Dependency Hierarchy:
- jscodeshift-0.5.0.tgz (Root Library)
- babel-register-6.26.0.tgz
- mkdirp-0.5.1.tgz
- :x: **minimist-0.0.8.tgz** (Vulnerable Library)
</details>
<details><summary><b>minimist-1.2.0.tgz</b></p></summary>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz</a></p>
<p>Path to dependency file: ui/packages/containers/sandbox/package.json</p>
<p>Path to vulnerable library: ui/packages/containers/sandbox/node_modules/minimist</p>
<p>
Dependency Hierarchy:
- scripts-preset-react-6.10.0.tgz (Root Library)
- scripts-config-eslint-6.10.0.tgz
- eslint-plugin-import-2.22.1.tgz
- tsconfig-paths-3.9.0.tgz
- :x: **minimist-1.2.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/Pio1006/ui/commit/aa9d669312c8b8bc27f7998665ca077a82d6e113">aa9d669312c8b8bc27f7998665ca077a82d6e113</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a "constructor" or "__proto__" payload.
<p>Publish Date: 2020-03-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598>CVE-2020-7598</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94">https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94</a></p>
<p>Release Date: 2020-03-11</p>
<p>Fix Resolution: minimist - 0.2.1,1.2.3</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"0.0.8","packageFilePaths":["/codemodes/cmf/package.json"],"isTransitiveDependency":true,"dependencyTree":"jscodeshift:0.5.0;babel-register:6.26.0;mkdirp:0.5.1;minimist:0.0.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 0.2.1,1.2.3"},{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"1.2.0","packageFilePaths":["/packages/containers/sandbox/package.json"],"isTransitiveDependency":true,"dependencyTree":"@talend/scripts-preset-react:6.10.0;@talend/scripts-config-eslint:6.10.0;eslint-plugin-import:2.22.1;tsconfig-paths:3.9.0;minimist:1.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 0.2.1,1.2.3"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-7598","vulnerabilityDetails":"minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a \"constructor\" or \"__proto__\" payload.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598","cvss3Severity":"medium","cvss3Score":"5.6","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-7598 (Medium) detected in minimist-0.0.8.tgz, minimist-1.2.0.tgz - ## CVE-2020-7598 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimist-0.0.8.tgz</b>, <b>minimist-1.2.0.tgz</b></p></summary>
<p>
<details><summary><b>minimist-0.0.8.tgz</b></p></summary>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz">https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz</a></p>
<p>Path to dependency file: ui/codemodes/cmf/package.json</p>
<p>Path to vulnerable library: ui/codemodes/cmf/node_modules/minimist</p>
<p>
Dependency Hierarchy:
- jscodeshift-0.5.0.tgz (Root Library)
- babel-register-6.26.0.tgz
- mkdirp-0.5.1.tgz
- :x: **minimist-0.0.8.tgz** (Vulnerable Library)
</details>
<details><summary><b>minimist-1.2.0.tgz</b></p></summary>
<p>parse argument options</p>
<p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz</a></p>
<p>Path to dependency file: ui/packages/containers/sandbox/package.json</p>
<p>Path to vulnerable library: ui/packages/containers/sandbox/node_modules/minimist</p>
<p>
Dependency Hierarchy:
- scripts-preset-react-6.10.0.tgz (Root Library)
- scripts-config-eslint-6.10.0.tgz
- eslint-plugin-import-2.22.1.tgz
- tsconfig-paths-3.9.0.tgz
- :x: **minimist-1.2.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/Pio1006/ui/commit/aa9d669312c8b8bc27f7998665ca077a82d6e113">aa9d669312c8b8bc27f7998665ca077a82d6e113</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a "constructor" or "__proto__" payload.
<p>Publish Date: 2020-03-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598>CVE-2020-7598</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94">https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94</a></p>
<p>Release Date: 2020-03-11</p>
<p>Fix Resolution: minimist - 0.2.1,1.2.3</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"0.0.8","packageFilePaths":["/codemodes/cmf/package.json"],"isTransitiveDependency":true,"dependencyTree":"jscodeshift:0.5.0;babel-register:6.26.0;mkdirp:0.5.1;minimist:0.0.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 0.2.1,1.2.3"},{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"1.2.0","packageFilePaths":["/packages/containers/sandbox/package.json"],"isTransitiveDependency":true,"dependencyTree":"@talend/scripts-preset-react:6.10.0;@talend/scripts-config-eslint:6.10.0;eslint-plugin-import:2.22.1;tsconfig-paths:3.9.0;minimist:1.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 0.2.1,1.2.3"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-7598","vulnerabilityDetails":"minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a \"constructor\" or \"__proto__\" payload.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598","cvss3Severity":"medium","cvss3Score":"5.6","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_test
|
cve medium detected in minimist tgz minimist tgz cve medium severity vulnerability vulnerable libraries minimist tgz minimist tgz minimist tgz parse argument options library home page a href path to dependency file ui codemodes cmf package json path to vulnerable library ui codemodes cmf node modules minimist dependency hierarchy jscodeshift tgz root library babel register tgz mkdirp tgz x minimist tgz vulnerable library minimist tgz parse argument options library home page a href path to dependency file ui packages containers sandbox package json path to vulnerable library ui packages containers sandbox node modules minimist dependency hierarchy scripts preset react tgz root library scripts config eslint tgz eslint plugin import tgz tsconfig paths tgz x minimist tgz vulnerable library found in head commit a href found in base branch master vulnerability details minimist before could be tricked into adding or modifying properties of object prototype using a constructor or proto payload publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution minimist isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree jscodeshift babel register mkdirp minimist isminimumfixversionavailable true minimumfixversion minimist packagetype javascript node js packagename minimist packageversion packagefilepaths istransitivedependency true dependencytree talend scripts preset react talend scripts config eslint eslint plugin import tsconfig paths minimist isminimumfixversionavailable true minimumfixversion minimist basebranches vulnerabilityidentifier cve vulnerabilitydetails minimist before could be tricked into adding or modifying properties of object prototype using a constructor or proto payload vulnerabilityurl
| 0
|
322,015
| 27,573,487,003
|
IssuesEvent
|
2023-03-08 11:07:31
|
interlay/interbtc-ui
|
https://api.github.com/repos/interlay/interbtc-ui
|
opened
|
[Swap] Custom slippage
|
enhancement testnet-competition
|
**Is your feature request related to a problem? Please describe.**
At the moment slippage is limited to a set of value
**Describe the solution you'd like**
Allow more customisable slippage
|
1.0
|
[Swap] Custom slippage - **Is your feature request related to a problem? Please describe.**
At the moment slippage is limited to a set of value
**Describe the solution you'd like**
Allow more customisable slippage
|
test
|
custom slippage is your feature request related to a problem please describe at the moment slippage is limited to a set of value describe the solution you d like allow more customisable slippage
| 1
|
89,547
| 8,205,735,703
|
IssuesEvent
|
2018-09-03 11:08:51
|
humera987/HumTestData
|
https://api.github.com/repos/humera987/HumTestData
|
closed
|
humz_proj_test : ApiV1RunsIdTestSuiteResponseNameGetPathParamNullValueId
|
humz_proj_test
|
Project : humz_proj_test
Job : UAT
Env : UAT
Region : FXLabs/US_WEST_1
Result : fail
Status Code : 500
Headers : {}
Endpoint : http://13.56.210.25/api/v1/runs/null/test-suite-response/{name}
Request :
Response :
Not enough variable values available to expand 'name'
Logs :
Assertion [@StatusCode != 404] passed, not expecting [404] and found [500]Assertion [@StatusCode != 401] passed, not expecting [401] and found [500]Assertion [@StatusCode != 500] failed, not expecting [500] but found [500]Assertion [@StatusCode != 200] passed, not expecting [200] and found [500]
--- FX Bot ---
|
1.0
|
humz_proj_test : ApiV1RunsIdTestSuiteResponseNameGetPathParamNullValueId - Project : humz_proj_test
Job : UAT
Env : UAT
Region : FXLabs/US_WEST_1
Result : fail
Status Code : 500
Headers : {}
Endpoint : http://13.56.210.25/api/v1/runs/null/test-suite-response/{name}
Request :
Response :
Not enough variable values available to expand 'name'
Logs :
Assertion [@StatusCode != 404] passed, not expecting [404] and found [500]Assertion [@StatusCode != 401] passed, not expecting [401] and found [500]Assertion [@StatusCode != 500] failed, not expecting [500] but found [500]Assertion [@StatusCode != 200] passed, not expecting [200] and found [500]
--- FX Bot ---
|
test
|
humz proj test project humz proj test job uat env uat region fxlabs us west result fail status code headers endpoint request response not enough variable values available to expand name logs assertion passed not expecting and found assertion passed not expecting and found assertion failed not expecting but found assertion passed not expecting and found fx bot
| 1
|
134,777
| 10,931,596,192
|
IssuesEvent
|
2019-11-23 11:30:12
|
nginx/njs
|
https://api.github.com/repos/nginx/njs
|
closed
|
JSON.stringify() fixes
|
bug test262
|
Here are patches:
```diff
# HG changeset patch
# User Artem S. Povalyukhin <artem.povaluhin@gmail.com>
# Date 1574460124 -10800
# Sat Nov 23 01:02:04 2019 +0300
# Node ID 10b80bb6d96891871e0164fcabfae9ec0f1d6691
# Parent 41d7a74052d20d27408d769727a305b4c803fac2
Fixed handling of Symbol values in JSON.stringify().
diff -r 41d7a74052d2 -r 10b80bb6d968 src/njs_json.c
--- a/src/njs_json.c Sat Nov 23 00:09:26 2019 +0300
+++ b/src/njs_json.c Sat Nov 23 01:02:04 2019 +0300
@@ -1128,6 +1128,7 @@ njs_json_pop_stringify_state(njs_json_st
#define njs_json_is_object(value) \
(((value)->type == NJS_OBJECT) \
+ || ((value)->type == NJS_OBJECT_SYMBOL) \
|| ((value)->type == NJS_ARRAY) \
|| ((value)->type >= NJS_REGEXP))
@@ -1211,6 +1212,7 @@ njs_json_stringify_iterator(njs_vm_t *vm
}
if (njs_is_undefined(value)
+ || njs_is_symbol(value)
|| njs_is_function(value)
|| !njs_is_valid(value))
{
@@ -1559,6 +1561,7 @@ njs_json_append_value(njs_json_stringify
case NJS_UNDEFINED:
case NJS_NULL:
+ case NJS_SYMBOL:
case NJS_INVALID:
case NJS_FUNCTION:
default:
diff -r 41d7a74052d2 -r 10b80bb6d968 src/test/njs_unit_test.c
--- a/src/test/njs_unit_test.c Sat Nov 23 00:09:26 2019 +0300
+++ b/src/test/njs_unit_test.c Sat Nov 23 01:02:04 2019 +0300
@@ -14264,6 +14264,9 @@ static njs_unit_test_t njs_test[] =
{ njs_str("JSON.stringify(undefined)"),
njs_str("undefined") },
+ { njs_str("JSON.stringify(Symbol())"),
+ njs_str("undefined") },
+
{ njs_str("JSON.stringify({})"),
njs_str("{}") },
@@ -14279,6 +14282,9 @@ static njs_unit_test_t njs_test[] =
{ njs_str("JSON.stringify({a:1, b:undefined})"),
njs_str("{\"a\":1}") },
+ { njs_str("JSON.stringify({a:1, b:Symbol()})"),
+ njs_str("{\"a\":1}") },
+
{ njs_str("var o = {a:1, c:2};"
"Object.defineProperty(o, 'b', {enumerable:false, value:3});"
"JSON.stringify(o)"),
@@ -14290,6 +14296,12 @@ static njs_unit_test_t njs_test[] =
{ njs_str("JSON.stringify(RegExp())"),
njs_str("{}") },
+ { njs_str("JSON.stringify(Object(Symbol()))"),
+ njs_str("{}") },
+
+ { njs_str("var s = Object(Symbol()); s.test = 'test'; JSON.stringify(s)"),
+ njs_str("{\"test\":\"test\"}") },
+
{ njs_str("JSON.stringify(SyntaxError('e'))"),
njs_str("{}") },
# HG changeset patch
# User Artem S. Povalyukhin <artem.povaluhin@gmail.com>
# Date 1574502726 -10800
# Sat Nov 23 12:52:06 2019 +0300
# Node ID 819f204a50d021c76f1c222b356b93d139167001
# Parent 10b80bb6d96891871e0164fcabfae9ec0f1d6691
Pass unprintable values to JSON.stringify() replacer function.
diff -r 10b80bb6d968 -r 819f204a50d0 src/njs_json.c
--- a/src/njs_json.c Sat Nov 23 01:02:04 2019 +0300
+++ b/src/njs_json.c Sat Nov 23 12:52:06 2019 +0300
@@ -1211,14 +1211,6 @@ njs_json_stringify_iterator(njs_vm_t *vm
return ret;
}
- if (njs_is_undefined(value)
- || njs_is_symbol(value)
- || njs_is_function(value)
- || !njs_is_valid(value))
- {
- break;
- }
-
ret = njs_json_stringify_to_json(stringify, state, key, value);
if (njs_slow_path(ret != NJS_OK)) {
return ret;
@@ -1229,7 +1221,11 @@ njs_json_stringify_iterator(njs_vm_t *vm
return ret;
}
- if (njs_is_undefined(value)) {
+ if (njs_is_undefined(value)
+ || njs_is_symbol(value)
+ || njs_is_function(value)
+ || !njs_is_valid(value))
+ {
break;
}
diff -r 10b80bb6d968 -r 819f204a50d0 src/test/njs_unit_test.c
--- a/src/test/njs_unit_test.c Sat Nov 23 01:02:04 2019 +0300
+++ b/src/test/njs_unit_test.c Sat Nov 23 12:52:06 2019 +0300
@@ -14562,6 +14562,10 @@ static njs_unit_test_t njs_test[] =
"JSON.stringify(objs)"),
njs_str("[{\"\":{\"a\":1}},{\"a\":1}]") },
+ { njs_str("JSON.stringify({a: () => 1, b: Symbol(), c: undefined},"
+ "(k, v) => k.length ? String(v) : v)"),
+ njs_str("{\"a\":\"[object Function]\",\"b\":\"Symbol()\",\"c\":\"undefined\"}") },
+
{ njs_str("var a = []; a[0] = a; JSON.stringify(a)"),
njs_str("TypeError: Nested too deep or a cyclic structure") },
```
The first fixes `built-ins/JSON/stringify/value-symbol`.
About the second there is a clear consensus:
```shell
root@node:~# eshost -x 'JSON.stringify({ a: Symbol(), b: () => 1, c: undefined}, (k, v) => { print(String(k) + ":" + String(v)); return k == "" ? v : String(v); })'
#### ch
:[object Object]
a:Symbol()
b:() => 1
c:undefined
#### jsc
:[object Object]
a:Symbol()
b:() => 1
c:undefined
#### sm
:[object Object]
a:Symbol()
b:() => 1
c:undefined
#### v8
:[object Object]
a:Symbol()
b:() => 1
c:undefined
#### xs
:[object Object]
a:Symbol()
b:function b (){[native code]}
c:undefined
```
|
1.0
|
JSON.stringify() fixes - Here are patches:
```diff
# HG changeset patch
# User Artem S. Povalyukhin <artem.povaluhin@gmail.com>
# Date 1574460124 -10800
# Sat Nov 23 01:02:04 2019 +0300
# Node ID 10b80bb6d96891871e0164fcabfae9ec0f1d6691
# Parent 41d7a74052d20d27408d769727a305b4c803fac2
Fixed handling of Symbol values in JSON.stringify().
diff -r 41d7a74052d2 -r 10b80bb6d968 src/njs_json.c
--- a/src/njs_json.c Sat Nov 23 00:09:26 2019 +0300
+++ b/src/njs_json.c Sat Nov 23 01:02:04 2019 +0300
@@ -1128,6 +1128,7 @@ njs_json_pop_stringify_state(njs_json_st
#define njs_json_is_object(value) \
(((value)->type == NJS_OBJECT) \
+ || ((value)->type == NJS_OBJECT_SYMBOL) \
|| ((value)->type == NJS_ARRAY) \
|| ((value)->type >= NJS_REGEXP))
@@ -1211,6 +1212,7 @@ njs_json_stringify_iterator(njs_vm_t *vm
}
if (njs_is_undefined(value)
+ || njs_is_symbol(value)
|| njs_is_function(value)
|| !njs_is_valid(value))
{
@@ -1559,6 +1561,7 @@ njs_json_append_value(njs_json_stringify
case NJS_UNDEFINED:
case NJS_NULL:
+ case NJS_SYMBOL:
case NJS_INVALID:
case NJS_FUNCTION:
default:
diff -r 41d7a74052d2 -r 10b80bb6d968 src/test/njs_unit_test.c
--- a/src/test/njs_unit_test.c Sat Nov 23 00:09:26 2019 +0300
+++ b/src/test/njs_unit_test.c Sat Nov 23 01:02:04 2019 +0300
@@ -14264,6 +14264,9 @@ static njs_unit_test_t njs_test[] =
{ njs_str("JSON.stringify(undefined)"),
njs_str("undefined") },
+ { njs_str("JSON.stringify(Symbol())"),
+ njs_str("undefined") },
+
{ njs_str("JSON.stringify({})"),
njs_str("{}") },
@@ -14279,6 +14282,9 @@ static njs_unit_test_t njs_test[] =
{ njs_str("JSON.stringify({a:1, b:undefined})"),
njs_str("{\"a\":1}") },
+ { njs_str("JSON.stringify({a:1, b:Symbol()})"),
+ njs_str("{\"a\":1}") },
+
{ njs_str("var o = {a:1, c:2};"
"Object.defineProperty(o, 'b', {enumerable:false, value:3});"
"JSON.stringify(o)"),
@@ -14290,6 +14296,12 @@ static njs_unit_test_t njs_test[] =
{ njs_str("JSON.stringify(RegExp())"),
njs_str("{}") },
+ { njs_str("JSON.stringify(Object(Symbol()))"),
+ njs_str("{}") },
+
+ { njs_str("var s = Object(Symbol()); s.test = 'test'; JSON.stringify(s)"),
+ njs_str("{\"test\":\"test\"}") },
+
{ njs_str("JSON.stringify(SyntaxError('e'))"),
njs_str("{}") },
# HG changeset patch
# User Artem S. Povalyukhin <artem.povaluhin@gmail.com>
# Date 1574502726 -10800
# Sat Nov 23 12:52:06 2019 +0300
# Node ID 819f204a50d021c76f1c222b356b93d139167001
# Parent 10b80bb6d96891871e0164fcabfae9ec0f1d6691
Pass unprintable values to JSON.stringify() replacer function.
diff -r 10b80bb6d968 -r 819f204a50d0 src/njs_json.c
--- a/src/njs_json.c Sat Nov 23 01:02:04 2019 +0300
+++ b/src/njs_json.c Sat Nov 23 12:52:06 2019 +0300
@@ -1211,14 +1211,6 @@ njs_json_stringify_iterator(njs_vm_t *vm
return ret;
}
- if (njs_is_undefined(value)
- || njs_is_symbol(value)
- || njs_is_function(value)
- || !njs_is_valid(value))
- {
- break;
- }
-
ret = njs_json_stringify_to_json(stringify, state, key, value);
if (njs_slow_path(ret != NJS_OK)) {
return ret;
@@ -1229,7 +1221,11 @@ njs_json_stringify_iterator(njs_vm_t *vm
return ret;
}
- if (njs_is_undefined(value)) {
+ if (njs_is_undefined(value)
+ || njs_is_symbol(value)
+ || njs_is_function(value)
+ || !njs_is_valid(value))
+ {
break;
}
diff -r 10b80bb6d968 -r 819f204a50d0 src/test/njs_unit_test.c
--- a/src/test/njs_unit_test.c Sat Nov 23 01:02:04 2019 +0300
+++ b/src/test/njs_unit_test.c Sat Nov 23 12:52:06 2019 +0300
@@ -14562,6 +14562,10 @@ static njs_unit_test_t njs_test[] =
"JSON.stringify(objs)"),
njs_str("[{\"\":{\"a\":1}},{\"a\":1}]") },
+ { njs_str("JSON.stringify({a: () => 1, b: Symbol(), c: undefined},"
+ "(k, v) => k.length ? String(v) : v)"),
+ njs_str("{\"a\":\"[object Function]\",\"b\":\"Symbol()\",\"c\":\"undefined\"}") },
+
{ njs_str("var a = []; a[0] = a; JSON.stringify(a)"),
njs_str("TypeError: Nested too deep or a cyclic structure") },
```
The first fixes `built-ins/JSON/stringify/value-symbol`.
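In Python terms, the rule the first patch implements can be sketched roughly as follows. The `Unserializable` class and `stringify` helper are hypothetical stand-ins for illustration (njs implements this in C for JS symbols and functions):

```python
import json

class Unserializable:
    """Hypothetical stand-in for values JSON cannot represent
    (JS symbols, functions)."""

def stringify(value):
    # Mirrors the njs rule: an unsupported top-level value yields
    # `undefined` (None here); unsupported object members are dropped.
    if isinstance(value, Unserializable):
        return None
    if isinstance(value, dict):
        value = {k: v for k, v in value.items()
                 if not isinstance(v, Unserializable)}
    return json.dumps(value, separators=(",", ":"))

print(stringify(Unserializable()))                 # None
print(stringify({"a": 1, "b": Unserializable()}))  # {"a":1}
```

The two prints match the `JSON.stringify(Symbol())` and `JSON.stringify({a:1, b:Symbol()})` unit tests added by the patch.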
About the second there is a clear consensus:
```shell
root@node:~# eshost -x 'JSON.stringify({ a: Symbol(), b: () => 1, c: undefined}, (k, v) => { print(String(k) + ":" + String(v)); return k == "" ? v : String(v); })'
#### ch
:[object Object]
a:Symbol()
b:() => 1
c:undefined
#### jsc
:[object Object]
a:Symbol()
b:() => 1
c:undefined
#### sm
:[object Object]
a:Symbol()
b:() => 1
c:undefined
#### v8
:[object Object]
a:Symbol()
b:() => 1
c:undefined
#### xs
:[object Object]
a:Symbol()
b:function b (){[native code]}
c:undefined
```
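Python's `json.dumps` offers an analogous hook to the replacer behavior the second patch adopts: the `default=` callback is handed the raw unprintable value rather than never being called. The `Sym` class below is a hypothetical stand-in for a JS Symbol:

```python
import json

class Sym:
    """Hypothetical stand-in for a JS Symbol."""
    def __str__(self):
        return "Symbol()"

# default= sees the raw Sym instance, just as the JS replacer above
# sees the raw Symbol, and may substitute a printable value for it.
out = json.dumps({"a": Sym(), "c": None}, default=str)
print(out)  # {"a": "Symbol()", "c": null}
```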
|
test
|
json stringify fixes here are patches diff hg changeset patch user artem s povalyukhin date sat nov node id parent fixed handling of symbol values in json stringify diff r r src njs json c a src njs json c sat nov b src njs json c sat nov njs json pop stringify state njs json st define njs json is object value value type njs object value type njs object symbol value type njs array value type njs regexp njs json stringify iterator njs vm t vm if njs is undefined value njs is symbol value njs is function value njs is valid value njs json append value njs json stringify case njs undefined case njs null case njs symbol case njs invalid case njs function default diff r r src test njs unit test c a src test njs unit test c sat nov b src test njs unit test c sat nov static njs unit test t njs test njs str json stringify undefined njs str undefined njs str json stringify symbol njs str undefined njs str json stringify njs str static njs unit test t njs test njs str json stringify a b undefined njs str a njs str json stringify a b symbol njs str a njs str var o a c object defineproperty o b enumerable false value json stringify o static njs unit test t njs test njs str json stringify regexp njs str njs str json stringify object symbol njs str njs str var s object symbol s test test json stringify s njs str test test njs str json stringify syntaxerror e njs str hg changeset patch user artem s povalyukhin date sat nov node id parent pass unprintable values to json stringify replacer function diff r r src njs json c a src njs json c sat nov b src njs json c sat nov njs json stringify iterator njs vm t vm return ret if njs is undefined value njs is symbol value njs is function value njs is valid value break ret njs json stringify to json stringify state key value if njs slow path ret njs ok return ret njs json stringify iterator njs vm t vm return ret if njs is undefined value if njs is undefined value njs is symbol value njs is function value njs is valid value break diff r r 
src test njs unit test c a src test njs unit test c sat nov b src test njs unit test c sat nov static njs unit test t njs test json stringify objs njs str njs str json stringify a b symbol c undefined k v k length string v v njs str a b symbol c undefined njs str var a a a json stringify a njs str typeerror nested too deep or a cyclic structure the first fixes built ins json stringify value symbol about the second there is a clear consensus shell root node eshost x json stringify a symbol b c undefined k v print string k string v return k v string v ch a symbol b c undefined jsc a symbol b c undefined sm a symbol b c undefined a symbol b c undefined xs a symbol b function b c undefined
| 1
|
305,907
| 26,420,075,970
|
IssuesEvent
|
2023-01-13 19:30:58
|
dtcenter/METcalcpy
|
https://api.github.com/repos/dtcenter/METcalcpy
|
closed
|
Fix failed case of test_event_equalize.py
|
priority: high type: bug alert: NEED ACCOUNT KEY component: testing requestor: METplus Team METcalcpy: Event Equalization
|
## Describe the Problem ##
When running _python3 -m pytest test_event_equalize.py_ in _/apps/sw_review/emc/METcalcpy/METcalcpy-2.0.0/test_ on WCOSS2 Acorn, the following error occurred. Currently, this error is not reproducible on "seneca". The assignee will try to recreate the environment on seneca using the nco_metcalcpy_requirements.txt file and the pip_list_METcalcpy_only_acorn.txt. If this error is still not reproducible, there will be a working session on WCOSS2 Acorn with @jprestop.
```
========================================================================== test session starts ===========================================================================
platform linux -- Python 3.8.6, pytest-6.2.2, py-1.10.0, pluggy-0.13.1
rootdir: /apps/sw_review/emc/METcalcpy/METcalcpy-2.0.0/test, configfile: pytest.ini
collected 4 items
test_event_equalize.py ...F [100%]
================================================================================ FAILURES ================================================================================
__________________________________________________________________ test_equalize_axis_data_no_fcst_var ___________________________________________________________________
settings_no_fcst_var_vals = {'fix_val_keys': [], 'fix_vals_permuted': {}, 'indy_var': {}, 'line_type': None, ...}
def test_equalize_axis_data_no_fcst_var(settings_no_fcst_var_vals):
'''
conditions that lead to an empty data frame from event equalization should
return an empty data frame instead of an error due to trying to clean up the 'equalize' column before
returning the event equalization data frame.
'''
print("Testing equalize_axis_data with ROC CTC threshold data...")
input_file_list = ["data/ROC_CTC_thresh.data", "data/ROC_CTC.data"]
for input_file in input_file_list:
cur_df = pd.read_csv(input_file, sep='\t')
fix_vals_keys = []
fix_vals_permuted_list = []
> ee_df = equalize_axis_data(fix_vals_keys, fix_vals_permuted_list, settings_no_fcst_var_vals, cur_df, axis='1')
test_event_equalize.py:200:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../metcalcpy/util/utils.py:697: in equalize_axis_data
event_equalize(series_data_for_ee, params['indy_var'],
../metcalcpy/event_equalize.py:65: in event_equalize
series_data['equalize'] = series_data.loc[:, 'equalize'].astype(str) + " " \
/apps/prod/python-modules/3.8.6/intel/19.1.3.304/lib/python3.8/site-packages/pandas/core/frame.py:3645: in __setitem__
self._set_item_frame_value(key, value)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = fcst_thresh fy_oy fy_on fn_oy fn_on fcst_valid_beg fcst_lead equalize
0 >0.635 ... 12:00:00 120000
13 >0.635 122 872 146 25484 2012-04-09 12:00:00 120000 2012-04-09 12:00:00 120000
key = 'equalize'
value = 0 1 2 3 4 5 6 7 8 9 10 11 12 13
0 NaN NaN NaN NaN NaN NaN NaN NaN N...N NaN NaN NaN NaN NaN NaN NaN NaN NaN
13 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
def _set_item_frame_value(self, key, value: DataFrame) -> None:
self._ensure_valid_index(value)
# align columns
if key in self.columns:
loc = self.columns.get_loc(key)
cols = self.columns[loc]
len_cols = 1 if is_scalar(cols) else len(cols)
if len_cols != len(value.columns):
> raise ValueError("Columns must be same length as key")
E ValueError: Columns must be same length as key
/apps/prod/python-modules/3.8.6/intel/19.1.3.304/lib/python3.8/site-packages/pandas/core/frame.py:3775: ValueError
-------------------------------------------------------------------------- Captured stdout call --------------------------------------------------------------------------
Testing equalize_axis_data with ROC CTC threshold data...
======================================================================== short test summary info =========================================================================
FAILED test_event_equalize.py::test_equalize_axis_data_no_fcst_var - ValueError: Columns must be same length as key
====================================================================== 1 failed, 3 passed in 2.03s =======================================================================
```
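The `ValueError` at the bottom of the traceback fires whenever a multi-column `DataFrame` is assigned to a single existing column: here the right-hand side of the `equalize` assignment evaluated to an all-`NaN` `DataFrame` instead of a `Series`. A minimal sketch of the failure mode (column names are illustrative, not taken from the test data):

```python
import pandas as pd

df = pd.DataFrame({"equalize": ["x", "y"]})

# Assigning a DataFrame with more than one column to a single
# existing column reproduces the error seen in the traceback.
value = pd.DataFrame({0: [1.0, 2.0], 1: [3.0, 4.0]})
try:
    df["equalize"] = value
except ValueError as e:
    print(e)  # Columns must be same length as key
```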
### Expected Behavior ###
All tests should pass.
### Environment ###
Describe your runtime environment:
*1. Machine: WCOSS2 Acorn
*2. OS: SUSE Linux Enterprise Server 15 SP3
*3. Software version number(s): METcalcpy 2.0.0
### To Reproduce ###
Describe the steps to reproduce the behavior: **Already described above**
*1. Go to '...'*
*2. Click on '....'*
*3. Scroll down to '....'*
*4. See error*
*Post relevant sample data following these instructions:*
*https://dtcenter.org/community-code/model-evaluation-tools-met/met-help-desk#ftp*
### Relevant Deadlines ###
ASAP
### Funding Source ###
TBD
## Define the Metadata ##
### Assignee ###
- [x] Select **engineer(s)** or **no engineer** required
- [ ] Select **scientist(s)** or **no scientist** required
### Labels ###
- [x] Select **component(s)**
- [x] Select **priority**
- [x] Select **requestor(s)**
### Projects and Milestone ###
- [x] Select **Organization** level **Project** for support of the current coordinated release
- [x] Select **Repository** level **Project** for development toward the next official release or add **alert: NEED PROJECT ASSIGNMENT** label
- [x] Select **Milestone** as the next bugfix version
## Define Related Issue(s) ##
Consider the impact to the other METplus components.
- [x] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdataio](https://github.com/dtcenter/METdataio/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose)
## Bugfix Checklist ##
See the [METplus Workflow](https://metplus.readthedocs.io/en/latest/Contributors_Guide/github_workflow.html) for details.
- [ ] Complete the issue definition above, including the **Time Estimate** and **Funding Source**.
- [ ] Fork this repository or create a branch of **main_\<Version>**.
Branch name: `bugfix_<Issue Number>_main_<Version>_<Description>`
- [ ] Fix the bug and test your changes.
- [ ] Add/update log messages for easier debugging.
- [ ] Add/update unit tests.
- [ ] Add/update documentation.
- [ ] Push local changes to GitHub.
- [ ] Submit a pull request to merge into **main_\<Version>**.
Pull request: `bugfix <Issue Number> main_<Version> <Description>`
- [ ] Define the pull request metadata, as permissions allow.
Select: **Reviewer(s)** and **Development** issues
Select: **Organization** level software support **Project** for the current coordinated release
Select: **Milestone** as the next bugfix version
- [ ] Iterate until the reviewer(s) accept and merge your changes.
- [ ] Delete your fork or branch.
- [ ] Complete the steps above to fix the bug on the **develop** branch.
Branch name: `bugfix_<Issue Number>_develop_<Description>`
Pull request: `bugfix <Issue Number> develop <Description>`
Select: **Reviewer(s)** and **Development** issues
Select: **Repository** level development cycle **Project** for the next official release
Select: **Milestone** as the next official version
- [ ] Close this issue.
|
1.0
|
Fix failed case of test_event_equalize.py - ## Describe the Problem ##
When running _python3 -m pytest test_event_equalize.py_ in _/apps/sw_review/emc/METcalcpy/METcalcpy-2.0.0/test_ on WCOSS2 Acorn, the following error occurred. Currently, this error is not reproducible on "seneca". The assignee will try to recreate the environment on seneca using the nco_metcalcpy_requirements.txt file and the pip_list_METcalcpy_only_acorn.txt. If this error is still not reproducible, there will be a working session on WCOSS2 Acorn with @jprestop.
```
========================================================================== test session starts ===========================================================================
platform linux -- Python 3.8.6, pytest-6.2.2, py-1.10.0, pluggy-0.13.1
rootdir: /apps/sw_review/emc/METcalcpy/METcalcpy-2.0.0/test, configfile: pytest.ini
collected 4 items
test_event_equalize.py ...F [100%]
================================================================================ FAILURES ================================================================================
__________________________________________________________________ test_equalize_axis_data_no_fcst_var ___________________________________________________________________
settings_no_fcst_var_vals = {'fix_val_keys': [], 'fix_vals_permuted': {}, 'indy_var': {}, 'line_type': None, ...}
def test_equalize_axis_data_no_fcst_var(settings_no_fcst_var_vals):
'''
conditions that lead to an empty data frame from event equalization should
return an empty data frame instead of an error due to trying to clean up the 'equalize' column before
returning the event equalization data frame.
'''
print("Testing equalize_axis_data with ROC CTC threshold data...")
input_file_list = ["data/ROC_CTC_thresh.data", "data/ROC_CTC.data"]
for input_file in input_file_list:
cur_df = pd.read_csv(input_file, sep='\t')
fix_vals_keys = []
fix_vals_permuted_list = []
> ee_df = equalize_axis_data(fix_vals_keys, fix_vals_permuted_list, settings_no_fcst_var_vals, cur_df, axis='1')
test_event_equalize.py:200:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../metcalcpy/util/utils.py:697: in equalize_axis_data
event_equalize(series_data_for_ee, params['indy_var'],
../metcalcpy/event_equalize.py:65: in event_equalize
series_data['equalize'] = series_data.loc[:, 'equalize'].astype(str) + " " \
/apps/prod/python-modules/3.8.6/intel/19.1.3.304/lib/python3.8/site-packages/pandas/core/frame.py:3645: in __setitem__
self._set_item_frame_value(key, value)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = fcst_thresh fy_oy fy_on fn_oy fn_on fcst_valid_beg fcst_lead equalize
0 >0.635 ... 12:00:00 120000
13 >0.635 122 872 146 25484 2012-04-09 12:00:00 120000 2012-04-09 12:00:00 120000
key = 'equalize'
value = 0 1 2 3 4 5 6 7 8 9 10 11 12 13
0 NaN NaN NaN NaN NaN NaN NaN NaN N...N NaN NaN NaN NaN NaN NaN NaN NaN NaN
13 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
def _set_item_frame_value(self, key, value: DataFrame) -> None:
self._ensure_valid_index(value)
# align columns
if key in self.columns:
loc = self.columns.get_loc(key)
cols = self.columns[loc]
len_cols = 1 if is_scalar(cols) else len(cols)
if len_cols != len(value.columns):
> raise ValueError("Columns must be same length as key")
E ValueError: Columns must be same length as key
/apps/prod/python-modules/3.8.6/intel/19.1.3.304/lib/python3.8/site-packages/pandas/core/frame.py:3775: ValueError
-------------------------------------------------------------------------- Captured stdout call --------------------------------------------------------------------------
Testing equalize_axis_data with ROC CTC threshold data...
======================================================================== short test summary info =========================================================================
FAILED test_event_equalize.py::test_equalize_axis_data_no_fcst_var - ValueError: Columns must be same length as key
====================================================================== 1 failed, 3 passed in 2.03s =======================================================================
```
### Expected Behavior ###
All tests should pass.
### Environment ###
Describe your runtime environment:
*1. Machine: WCOSS2 Acorn
*2. OS: SUSE Linux Enterprise Server 15 SP3
*3. Software version number(s): METcalcpy 2.0.0
### To Reproduce ###
Describe the steps to reproduce the behavior: **Already described above**
*1. Go to '...'*
*2. Click on '....'*
*3. Scroll down to '....'*
*4. See error*
*Post relevant sample data following these instructions:*
*https://dtcenter.org/community-code/model-evaluation-tools-met/met-help-desk#ftp*
### Relevant Deadlines ###
ASAP
### Funding Source ###
TBD
## Define the Metadata ##
### Assignee ###
- [x] Select **engineer(s)** or **no engineer** required
- [ ] Select **scientist(s)** or **no scientist** required
### Labels ###
- [x] Select **component(s)**
- [x] Select **priority**
- [x] Select **requestor(s)**
### Projects and Milestone ###
- [x] Select **Organization** level **Project** for support of the current coordinated release
- [x] Select **Repository** level **Project** for development toward the next official release or add **alert: NEED PROJECT ASSIGNMENT** label
- [x] Select **Milestone** as the next bugfix version
## Define Related Issue(s) ##
Consider the impact to the other METplus components.
- [x] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdataio](https://github.com/dtcenter/METdataio/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose)
## Bugfix Checklist ##
See the [METplus Workflow](https://metplus.readthedocs.io/en/latest/Contributors_Guide/github_workflow.html) for details.
- [ ] Complete the issue definition above, including the **Time Estimate** and **Funding Source**.
- [ ] Fork this repository or create a branch of **main_\<Version>**.
Branch name: `bugfix_<Issue Number>_main_<Version>_<Description>`
- [ ] Fix the bug and test your changes.
- [ ] Add/update log messages for easier debugging.
- [ ] Add/update unit tests.
- [ ] Add/update documentation.
- [ ] Push local changes to GitHub.
- [ ] Submit a pull request to merge into **main_\<Version>**.
Pull request: `bugfix <Issue Number> main_<Version> <Description>`
- [ ] Define the pull request metadata, as permissions allow.
Select: **Reviewer(s)** and **Development** issues
Select: **Organization** level software support **Project** for the current coordinated release
Select: **Milestone** as the next bugfix version
- [ ] Iterate until the reviewer(s) accept and merge your changes.
- [ ] Delete your fork or branch.
- [ ] Complete the steps above to fix the bug on the **develop** branch.
Branch name: `bugfix_<Issue Number>_develop_<Description>`
Pull request: `bugfix <Issue Number> develop <Description>`
Select: **Reviewer(s)** and **Development** issues
Select: **Repository** level development cycle **Project** for the next official release
Select: **Milestone** as the next official version
- [ ] Close this issue.
|
test
|
fix failed case of test event equalize py describe the problem when running m pytest test event equalize py in apps sw review emc metcalcpy metcalcpy test on acorn the following error occurred currently this error is not reproducible on seneca the assignee will try to recreate the environment on seneca using the nco metcalcpy requirements txt file and the pip list metcalcpy only acorn txt if this error is still not reproducible there will be a working session on acorn with jprestop test session starts platform linux python pytest py pluggy rootdir apps sw review emc metcalcpy metcalcpy test configfile pytest ini collected items test event equalize py f failures test equalize axis data no fcst var settings no fcst var vals fix val keys fix vals permuted indy var line type none def test equalize axis data no fcst var settings no fcst var vals conditions that lead to an empty data frame from event equalization should return an empty data frame instead of an error due to trying to clean up the equalize column before returning the event equalization data frame print testing equalize axis data with roc ctc threshold data input file list for input file in input file list cur df pd read csv input file sep t fix vals keys fix vals permuted list ee df equalize axis data fix vals keys fix vals permuted list settings no fcst var vals cur df axis test event equalize py metcalcpy util utils py in equalize axis data event equalize series data for ee params metcalcpy event equalize py in event equalize series data series data loc astype str apps prod python modules intel lib site packages pandas core frame py in setitem self set item frame value key value self fcst thresh fy oy fy on fn oy fn on fcst valid beg fcst lead equalize key equalize value nan nan nan nan nan nan nan nan n n nan nan nan nan nan nan nan nan nan nan nan nan nan nan nan nan nan nan nan nan nan nan nan def set item frame value self key value dataframe none self ensure valid index value align columns if key in 
self columns loc self columns get loc key cols self columns len cols if is scalar cols else len cols if len cols len value columns raise valueerror columns must be same length as key e valueerror columns must be same length as key apps prod python modules intel lib site packages pandas core frame py valueerror captured stdout call testing equalize axis data with roc ctc threshold data short test summary info failed test event equalize py test equalize axis data no fcst var valueerror columns must be same length as key failed passed in expected behavior all tests should pass environment describe your runtime environment machine acorn os suse linux enterprise server software version number s metcalcpy to reproduce describe the steps to reproduce the behavior already described above go to click on scroll down to see error post relevant sample data following these instructions relevant deadlines asap funding source tbd define the metadata assignee select engineer s or no engineer required select scientist s or no scientist required labels select component s select priority select requestor s projects and milestone select organization level project for support of the current coordinated release select repository level project for development toward the next official release or add alert need project assignment label select milestone as the next bugfix version define related issue s consider the impact to the other metplus components bugfix checklist see the for details complete the issue definition above including the time estimate and funding source fork this repository or create a branch of main branch name bugfix main fix the bug and test your changes add update log messages for easier debugging add update unit tests add update documentation push local changes to github submit a pull request to merge into main pull request bugfix main define the pull request metadata as permissions allow select reviewer s and development issues select organization level software 
support project for the current coordinated release select milestone as the next bugfix version iterate until the reviewer s accept and merge your changes delete your fork or branch complete the steps above to fix the bug on the develop branch branch name bugfix develop pull request bugfix develop select reviewer s and development issues select repository level development cycle project for the next official release select milestone as the next official version close this issue
| 1
|
554,633
| 16,435,086,701
|
IssuesEvent
|
2021-05-20 08:21:17
|
buddyboss/buddyboss-platform
|
https://api.github.com/repos/buddyboss/buddyboss-platform
|
opened
|
Subgroups are displaying in All Groups and My Groups tab whereas Hide Subgroups option is checked from setting
|
bug priority: medium
|
**Describe the bug**
Subgroups are displayed in the All Groups and My Groups tabs even though the Hide Subgroups option is checked in settings.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to BuddyBoss -> Groups -> Create Group ( i.e - Parent Group)
2. Go to BuddyBoss -> Groups -> Create Group ( i.e - Sub Group) -> Select Group Parent ( i.e - Parent Group)
3. Go to BuddyBoss ->Settings -> Groups -> Hide Subgroups -> Checked
4. See error - https://screencast-o-matic.com/watch/crhYjFVfdVx
**Expected behavior**
Subgroups should not be displayed in the All Groups and My Groups tabs if the Hide Subgroups option is enabled in settings.
**Screenshots**
https://screencast-o-matic.com/watch/crhYjFVfdVx
**Support ticket links**
Based on App testing
|
1.0
|
Subgroups are displaying in All Groups and My Groups tab whereas Hide Subgroups option is checked from setting - **Describe the bug**
Subgroups are displayed in the All Groups and My Groups tabs even though the Hide Subgroups option is checked in settings.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to BuddyBoss -> Groups -> Create Group ( i.e - Parent Group)
2. Go to BuddyBoss -> Groups -> Create Group ( i.e - Sub Group) -> Select Group Parent ( i.e - Parent Group)
3. Go to BuddyBoss ->Settings -> Groups -> Hide Subgroups -> Checked
4. See error - https://screencast-o-matic.com/watch/crhYjFVfdVx
**Expected behavior**
Subgroups should not be displayed in the All Groups and My Groups tabs if the Hide Subgroups option is enabled in settings.
**Screenshots**
https://screencast-o-matic.com/watch/crhYjFVfdVx
**Support ticket links**
Based on App testing
|
non_test
|
subgroups are displaying in all groups and my groups tab whereas hide subgroups option is checked from setting describe the bug subgroups are displaying in all groups and my groups tab whereas hide subgroups option is checked from setting to reproduce steps to reproduce the behavior go to buddyboss groups create group i e parent group go to buddyboss groups create group i e sub group select group parent i e parent group go to buddyboss settings groups hide subgroups checked see error expected behavior subgroup should not be display in all groups and my groups tab if we enable hide subgroups option from setting screenshots support ticket links based on app testing
| 0
|
453,471
| 13,080,162,635
|
IssuesEvent
|
2020-08-01 06:18:26
|
EUCweb/BIS-F
|
https://api.github.com/repos/EUCweb/BIS-F
|
closed
|
UPL + PVS Target Device Driver doesn't convert VHDX
|
Priority: Medium Type: Bug
|
If UPL with the PVS Target Device Driver is installed and the system must be converted through BIS-F from C: to the VHDX, it ends up with the message **Citrix AppLayering installed, convert to Disk not necessary**
|
1.0
|
UPL + PVS Target Device Driver doesn't convert VHDX - If UPL with the PVS Target Device Driver is installed and the system must be converted through BIS-F from C: to the VHDX, it ends up with the message **Citrix AppLayering installed, convert to Disk not necessary**
|
non_test
|
upl pvs target device driver doesn t convert vhdx if upl with pvs target device driver is installed and the system must be converted through bis f from c to the vdhx it ending up with the message citrix applayering installed convert to disk not necessary
| 0
|
356,209
| 25,176,138,939
|
IssuesEvent
|
2022-11-11 09:25:38
|
snigloo/pe
|
https://api.github.com/repos/snigloo/pe
|
opened
|
Developer Guide has no introduction
|
type.DocumentationBug severity.Medium
|
There is no introduction to DG, such as introducing the product and what the developer guide is for and who the target audience for the DG is, especially when you are assuming most of the users reading the developer guide understand terms such as `alias`.

<!--session: 1668153633019-9e3564c2-b17a-4c86-ae39-e9ee2c05bce9-->
<!--Version: Web v3.4.4-->
|
1.0
|
Developer Guide has no introduction - There is no introduction to DG, such as introducing the product and what the developer guide is for and who the target audience for the DG is, especially when you are assuming most of the users reading the developer guide understand terms such as `alias`.

<!--session: 1668153633019-9e3564c2-b17a-4c86-ae39-e9ee2c05bce9-->
<!--Version: Web v3.4.4-->
|
non_test
|
developer guide has no introduction there is no introduction to dg such as introducing the product and what the developer guide is for and who the target audience for the dg is especially when you are assuming most of the users reading the developer guide understand terms such as alias
| 0
|
168,709
| 14,170,748,364
|
IssuesEvent
|
2020-11-12 14:54:38
|
dadhi/FastExpressionCompiler
|
https://api.github.com/repos/dadhi/FastExpressionCompiler
|
opened
|
Add a flags to the CompileFast to configure the Invocated Lambda inlining
|
documentation enhancement
|
Related #274
Currently (in v3 preview) the lambda is Not inlined which differs from the System Compile behavior.
In v2 it was inlined.
So the Inlining should be the default behavior and the Prevent Inlining should be the opt-in.
Inlining prevention is required when you have multiple the same nested lambdas invocations - so that the nested lambda will be compiled once and reused.
|
1.0
|
Add a flags to the CompileFast to configure the Invocated Lambda inlining - Related #274
Currently (in v3 preview) the lambda is Not inlined which differs from the System Compile behavior.
In v2 it was inlined.
So the Inlining should be the default behavior and the Prevent Inlining should be the opt-in.
Inlining prevention is required when you have multiple the same nested lambdas invocations - so that the nested lambda will be compiled once and reused.
|
non_test
|
add a flags to the compilefast to configure the invocated lambda inlining related currently in preview the lambda is not inlined which differs from the system compile behavior in it was inlined so the inlining should be the default behavior and the prevent inlining shoud be the opt in inlining prevention is required when you have multiple the same nested lambdas invocations so that the nested lambda will be compiled once and reused
| 0
|
290,593
| 25,078,406,669
|
IssuesEvent
|
2022-11-07 17:09:57
|
eclipse-openj9/openj9
|
https://api.github.com/repos/eclipse-openj9/openj9
|
closed
|
jdk_util - Assertion failed: guardOkForExpr: should not intern OSR guard
|
comp:jit test failure blocker segfault
|
In the internal nightly JITServer tests
- `/job/Test_openjdk8_j9_sanity.openjdk_x86-64_linux_jit_Personal/475`
- `/job/Test_openjdk11_j9_sanity.openjdk_x86-64_linux_jit_Personal/478`
the test `jdk_util_1` failed due to an assertion failure, but it seems to reproduce readily without JITServer enabled.
Console log from the first job:
```
03:48:29 openjdk version "1.8.0_352-internal"
03:48:29 OpenJDK Runtime Environment (build 1.8.0_352-internal-_2022_10_17_07_04-b00)
03:48:29 Eclipse OpenJ9 VM (build master-153e812, JRE 1.8.0 Linux amd64-64-Bit Compressed References 20221017_1201 (JIT enabled, AOT enabled)
03:48:29 OpenJ9 - 153e812
03:48:29 OMR - d5d4a53
03:48:29 JCL - d824d66 based on jdk8u352-b07)
04:23:29 ===============================================
04:23:29 Running test jdk_util_1 ...
04:23:29 ===============================================
04:23:29 jdk_util_1 Start Time: Mon Oct 17 01:23:25 2022 Epoch Time (ms): 1665995005293
04:23:29 "/home/jenkins/workspace/Test_openjdk8_j9_sanity.openjdk_x86-64_linux_jit_Personal/openjdkbinary/j2sdk-image/bin/java" -Xshareclasses:destroyAll; "/home/jenkins/workspace/Test_openjdk8_j9_sanity.openjdk_x86-64_linux_jit_Personal/openjdkbinary/j2sdk-image/bin/java" -Xshareclasses:groupAccess,destroyAll; echo "cache cleanup done";
04:23:29 JVMSHRC005I No shared class caches available
04:23:29 JVMSHRC005I No shared class caches available
04:23:29 cache cleanup done
04:23:29 variation: -Xdump:system:none -Xdump:heap:none -Xdump:system:events=gpf+abort+traceassert+corruptcache Mode650
04:23:29 JVM_OPTIONS: -XX:+UseJITServer -Xdump:system:none -Xdump:heap:none -Xdump:system:events=gpf+abort+traceassert+corruptcache -XX:-UseCompressedOops
04:32:46 Unhandled exception
04:32:46 Type=Unhandled trap vmState=0x00050fff
04:32:46 J9Generic_Signal_Number=00000108 Signal_Number=00000005 Error_Value=00000000 Signal_Code=fffffffa
04:32:46 Handler1=00007F953A295C40 Handler2=00007F9539FF1870
04:32:46 RDI=0000000000000002 RSI=00007F94C039B3B0 RAX=0000000000000000 RBX=00007F94A8749C20
04:32:46 RCX=00007F953B4682AB RDX=0000000000000000 R8=0000000000000000 R9=00007F94C039B3B0
04:32:46 R10=0000000000000008 R11=0000000000000246 R12=0000000000000000 R13=00007F94A8131400
04:32:46 R14=0000000000000000 R15=00007F94A8749C20
04:32:46 RIP=00007F953B4682AB GS=0000 FS=0000 RSP=00007F94C039B3B0
04:32:46 EFlags=0000000000000246 CS=0033 RBP=0000000000000011 ERR=0000000000000000
04:32:46 TRAPNO=0000000000000000 OLDMASK=0000000000000000 CR2=0000000000000000
04:32:46 xmm0 ffffffffffffffff (f: 4294967296.000000, d: -nan)
04:32:46 xmm1 ff00000000000000 (f: 0.000000, d: -5.486124e+303)
04:32:46 xmm2 ffffffffffffff00 (f: 4294967040.000000, d: -nan)
04:32:46 xmm3 00007f94c039ae80 (f: 3225005568.000000, d: 6.930598e-310)
04:32:46 xmm4 2b286f732e393272 (f: 775500416.000000, d: 8.727906e-101)
04:32:46 xmm5 6c2f65726a2f6567 (f: 1781491072.000000, d: 1.321189e+213)
04:32:46 xmm6 6c616e6f73726550 (f: 1936876928.000000, d: 1.173651e+214)
04:32:46 xmm7 6f2e7974696e6173 (f: 1768841600.000000, d: 3.609627e+227)
04:32:46 xmm8 000000000000000a (f: 10.000000, d: 4.940656e-323)
04:32:46 xmm9 0000000000000000 (f: 0.000000, d: 0.000000e+00)
04:32:46 xmm10 081f08032541595b (f: 625039680.000000, d: 1.468466e-269)
04:32:46 xmm11 0000000000000000 (f: 0.000000, d: 0.000000e+00)
04:32:46 xmm12 0000000000000000 (f: 0.000000, d: 0.000000e+00)
04:32:46 xmm13 0000000000000000 (f: 0.000000, d: 0.000000e+00)
04:32:46 xmm14 0000000000000000 (f: 0.000000, d: 0.000000e+00)
04:32:46 xmm15 0000000000000000 (f: 0.000000, d: 0.000000e+00)
04:32:46 Module=/lib/x86_64-linux-gnu/libpthread.so.0
04:32:46 Module_base_address=00007F953B454000 Symbol=raise
04:32:46 Symbol_address=00007F953B4681E0
04:32:46
04:32:46 Method_being_compiled=java/util/Random$RandomIntsSpliterator.forEachRemaining(Ljava/util/function/IntConsumer;)V
04:32:46 Target=2_90_20221017_1201 (Linux 5.4.0-126-generic)
04:32:46 CPU=amd64 (4 logical CPUs) (0xf5b39000 RAM)
04:32:46 ----------- Stack Backtrace -----------
04:32:46 raise+0xcb (0x00007F953B4682AB [libpthread.so.0+0x142ab])
04:32:46 _ZN2TR4trapEv+0x47 (0x00007F953373ACB7 [libj9jit29.so+0x581cb7])
04:32:46 (0x00007F953373AD42 [libj9jit29.so+0x581d42])
04:32:46 _ZN16TR_LoopVersioner14guardOkForExprEPN2TR4NodeEb+0x481 (0x00007F95338A28E1 [libj9jit29.so+0x6e98e1])
04:32:46 _ZN16TR_LoopVersioner16initExprFromNodeEPNS_4ExprEPN2TR4NodeEb+0x94 (0x00007F95338A29F4 [libj9jit29.so+0x6e99f4])
04:32:46 _ZN16TR_LoopVersioner17makeCanonicalExprEPN2TR4NodeE+0xb0 (0x00007F95338A3CD0 [libj9jit29.so+0x6eacd0])
04:32:46 _ZN16TR_LoopVersioner19createLoopEntryPrepENS_13LoopEntryPrep4KindEPN2TR4NodeEPNS2_13NodeChecklistEPS0_+0x75 (0x00007F95338A4595 [libj9jit29.so+0x6eb595])
04:32:46 _ZN16TR_LoopVersioner20buildConditionalTreeEP4ListIN2TR7TreeTopEERN3CS216ASparseBitVectorINS5_16shared_allocatorINS5_14heap_allocatorILm65536ELj12E17TRMemoryAllocatorIL17TR_AllocationKind1ELj12ELj28EEEEEEEE+0x750 (0x00007F95338B0D00 [libj9jit29.so+0x6f7d00])
04:32:46 _ZN16TR_LoopVersioner18versionNaturalLoopEP18TR_RegionStructureP4ListIN2TR4NodeEEPS2_INS3_7TreeTopEES9_S9_S9_S9_S9_S9_S9_S6_PS2_I19TR_NodeParentSymRefEPS2_I30TR_NodeParentSymRefWeightTupleEPS2_I12TR_StructureESI_bRN3CS216ASparseBitVectorINSJ_16shared_allo+0x1a6f (0x00007F95338B70DF [libj9jit29.so+0x6fe0df])
04:32:46 _ZN16TR_LoopVersioner24performWithoutDominatorsEv+0x13a3 (0x00007F95338BBD43 [libj9jit29.so+0x702d43])
04:32:46 _ZN3OMR9Optimizer19performOptimizationEPK20OptimizationStrategyiii+0x767 (0x00007F95338FF537 [libj9jit29.so+0x746537])
04:32:46 _ZN3OMR9Optimizer19performOptimizationEPK20OptimizationStrategyiii+0xdc9 (0x00007F95338FFB99 [libj9jit29.so+0x746b99])
04:32:46 _ZN3OMR9Optimizer19performOptimizationEPK20OptimizationStrategyiii+0xdc9 (0x00007F95338FFB99 [libj9jit29.so+0x746b99])
04:32:46 _ZN3OMR9Optimizer8optimizeEv+0x1db (0x00007F9533900E6B [libj9jit29.so+0x747e6b])
04:32:46 _ZN3OMR11Compilation7compileEv+0xaf5 (0x00007F95336F8405 [libj9jit29.so+0x53f405])
04:32:46 _ZN2TR28CompilationInfoPerThreadBase7compileEP10J9VMThreadPNS_11CompilationEP17TR_ResolvedMethodR11TR_J9VMBaseP19TR_OptimizationPlanRKNS_16SegmentAllocatorE+0x4bf (0x00007F953330828F [libj9jit29.so+0x14f28f])
04:32:46 _ZN2TR28CompilationInfoPerThreadBase14wrappedCompileEP13J9PortLibraryPv+0x314 (0x00007F95333092F4 [libj9jit29.so+0x1502f4])
04:32:46 omrsig_protect+0x1e3 (0x00007F9539FF25D3 [libj9prt29.so+0x2b5d3])
04:32:46 _ZN2TR28CompilationInfoPerThreadBase7compileEP10J9VMThreadP21TR_MethodToBeCompiledRN2J917J9SegmentProviderE+0x309 (0x00007F9533306A19 [libj9jit29.so+0x14da19])
04:32:46 _ZN2TR24CompilationInfoPerThread12processEntryER21TR_MethodToBeCompiledRN2J917J9SegmentProviderE+0x1c0 (0x00007F9533307060 [libj9jit29.so+0x14e060])
04:32:46 _ZN2TR24CompilationInfoPerThread14processEntriesEv+0x3b3 (0x00007F9533305B93 [libj9jit29.so+0x14cb93])
04:32:46 _ZN2TR24CompilationInfoPerThread3runEv+0x42 (0x00007F9533306072 [libj9jit29.so+0x14d072])
04:32:46 _Z30protectedCompilationThreadProcP13J9PortLibraryPN2TR24CompilationInfoPerThreadE+0x82 (0x00007F9533306122 [libj9jit29.so+0x14d122])
04:32:46 omrsig_protect+0x1e3 (0x00007F9539FF25D3 [libj9prt29.so+0x2b5d3])
04:32:46 _Z21compilationThreadProcPv+0x1d2 (0x00007F9533306552 [libj9jit29.so+0x14d552])
04:32:46 thread_wrapper+0x186 (0x00007F9539DBA4F6 [libj9thr29.so+0xe4f6])
04:32:46 start_thread+0xd9 (0x00007F953B45C609 [libpthread.so.0+0x8609])
04:32:46 clone+0x43 (0x00007F953B163133 [libc.so.6+0x11f133])
04:32:46 JVMDUMP053I JIT dump is recompiling java/util/Random$RandomIntsSpliterator.forEachRemaining(Ljava/util/function/IntConsumer;)V
04:32:46 Assertion failed at /home/jenkins/workspace/Build_JDK8_x86-64_linux_jit_Personal/omr/compiler/optimizer/LoopVersioner.cpp:9125: onlySearching
04:32:46 VMState: 0x00050fff
04:32:46 guardOkForExpr: should not intern OSR guard n3908n [0x7f94883da540]
04:32:46 compiling java/util/Random$RandomIntsSpliterator.forEachRemaining(Ljava/util/function/IntConsumer;)V at level: hot
```
|
1.0
|
jdk_util - Assertion failed: guardOkForExpr: should not intern OSR guard - In the internal nightly JITServer tests
- `/job/Test_openjdk8_j9_sanity.openjdk_x86-64_linux_jit_Personal/475`
- `/job/Test_openjdk11_j9_sanity.openjdk_x86-64_linux_jit_Personal/478`
the test `jdk_util_1` failed due to an assertion failure, but it seems to reproduce readily without JITServer enabled.
Console log from the first job:
```
03:48:29 openjdk version "1.8.0_352-internal"
03:48:29 OpenJDK Runtime Environment (build 1.8.0_352-internal-_2022_10_17_07_04-b00)
03:48:29 Eclipse OpenJ9 VM (build master-153e812, JRE 1.8.0 Linux amd64-64-Bit Compressed References 20221017_1201 (JIT enabled, AOT enabled)
03:48:29 OpenJ9 - 153e812
03:48:29 OMR - d5d4a53
03:48:29 JCL - d824d66 based on jdk8u352-b07)
04:23:29 ===============================================
04:23:29 Running test jdk_util_1 ...
04:23:29 ===============================================
04:23:29 jdk_util_1 Start Time: Mon Oct 17 01:23:25 2022 Epoch Time (ms): 1665995005293
04:23:29 "/home/jenkins/workspace/Test_openjdk8_j9_sanity.openjdk_x86-64_linux_jit_Personal/openjdkbinary/j2sdk-image/bin/java" -Xshareclasses:destroyAll; "/home/jenkins/workspace/Test_openjdk8_j9_sanity.openjdk_x86-64_linux_jit_Personal/openjdkbinary/j2sdk-image/bin/java" -Xshareclasses:groupAccess,destroyAll; echo "cache cleanup done";
04:23:29 JVMSHRC005I No shared class caches available
04:23:29 JVMSHRC005I No shared class caches available
04:23:29 cache cleanup done
04:23:29 variation: -Xdump:system:none -Xdump:heap:none -Xdump:system:events=gpf+abort+traceassert+corruptcache Mode650
04:23:29 JVM_OPTIONS: -XX:+UseJITServer -Xdump:system:none -Xdump:heap:none -Xdump:system:events=gpf+abort+traceassert+corruptcache -XX:-UseCompressedOops
04:32:46 Unhandled exception
04:32:46 Type=Unhandled trap vmState=0x00050fff
04:32:46 J9Generic_Signal_Number=00000108 Signal_Number=00000005 Error_Value=00000000 Signal_Code=fffffffa
04:32:46 Handler1=00007F953A295C40 Handler2=00007F9539FF1870
04:32:46 RDI=0000000000000002 RSI=00007F94C039B3B0 RAX=0000000000000000 RBX=00007F94A8749C20
04:32:46 RCX=00007F953B4682AB RDX=0000000000000000 R8=0000000000000000 R9=00007F94C039B3B0
04:32:46 R10=0000000000000008 R11=0000000000000246 R12=0000000000000000 R13=00007F94A8131400
04:32:46 R14=0000000000000000 R15=00007F94A8749C20
04:32:46 RIP=00007F953B4682AB GS=0000 FS=0000 RSP=00007F94C039B3B0
04:32:46 EFlags=0000000000000246 CS=0033 RBP=0000000000000011 ERR=0000000000000000
04:32:46 TRAPNO=0000000000000000 OLDMASK=0000000000000000 CR2=0000000000000000
04:32:46 xmm0 ffffffffffffffff (f: 4294967296.000000, d: -nan)
04:32:46 xmm1 ff00000000000000 (f: 0.000000, d: -5.486124e+303)
04:32:46 xmm2 ffffffffffffff00 (f: 4294967040.000000, d: -nan)
04:32:46 xmm3 00007f94c039ae80 (f: 3225005568.000000, d: 6.930598e-310)
04:32:46 xmm4 2b286f732e393272 (f: 775500416.000000, d: 8.727906e-101)
04:32:46 xmm5 6c2f65726a2f6567 (f: 1781491072.000000, d: 1.321189e+213)
04:32:46 xmm6 6c616e6f73726550 (f: 1936876928.000000, d: 1.173651e+214)
04:32:46 xmm7 6f2e7974696e6173 (f: 1768841600.000000, d: 3.609627e+227)
04:32:46 xmm8 000000000000000a (f: 10.000000, d: 4.940656e-323)
04:32:46 xmm9 0000000000000000 (f: 0.000000, d: 0.000000e+00)
04:32:46 xmm10 081f08032541595b (f: 625039680.000000, d: 1.468466e-269)
04:32:46 xmm11 0000000000000000 (f: 0.000000, d: 0.000000e+00)
04:32:46 xmm12 0000000000000000 (f: 0.000000, d: 0.000000e+00)
04:32:46 xmm13 0000000000000000 (f: 0.000000, d: 0.000000e+00)
04:32:46 xmm14 0000000000000000 (f: 0.000000, d: 0.000000e+00)
04:32:46 xmm15 0000000000000000 (f: 0.000000, d: 0.000000e+00)
04:32:46 Module=/lib/x86_64-linux-gnu/libpthread.so.0
04:32:46 Module_base_address=00007F953B454000 Symbol=raise
04:32:46 Symbol_address=00007F953B4681E0
04:32:46
04:32:46 Method_being_compiled=java/util/Random$RandomIntsSpliterator.forEachRemaining(Ljava/util/function/IntConsumer;)V
04:32:46 Target=2_90_20221017_1201 (Linux 5.4.0-126-generic)
04:32:46 CPU=amd64 (4 logical CPUs) (0xf5b39000 RAM)
04:32:46 ----------- Stack Backtrace -----------
04:32:46 raise+0xcb (0x00007F953B4682AB [libpthread.so.0+0x142ab])
04:32:46 _ZN2TR4trapEv+0x47 (0x00007F953373ACB7 [libj9jit29.so+0x581cb7])
04:32:46 (0x00007F953373AD42 [libj9jit29.so+0x581d42])
04:32:46 _ZN16TR_LoopVersioner14guardOkForExprEPN2TR4NodeEb+0x481 (0x00007F95338A28E1 [libj9jit29.so+0x6e98e1])
04:32:46 _ZN16TR_LoopVersioner16initExprFromNodeEPNS_4ExprEPN2TR4NodeEb+0x94 (0x00007F95338A29F4 [libj9jit29.so+0x6e99f4])
04:32:46 _ZN16TR_LoopVersioner17makeCanonicalExprEPN2TR4NodeE+0xb0 (0x00007F95338A3CD0 [libj9jit29.so+0x6eacd0])
04:32:46 _ZN16TR_LoopVersioner19createLoopEntryPrepENS_13LoopEntryPrep4KindEPN2TR4NodeEPNS2_13NodeChecklistEPS0_+0x75 (0x00007F95338A4595 [libj9jit29.so+0x6eb595])
04:32:46 _ZN16TR_LoopVersioner20buildConditionalTreeEP4ListIN2TR7TreeTopEERN3CS216ASparseBitVectorINS5_16shared_allocatorINS5_14heap_allocatorILm65536ELj12E17TRMemoryAllocatorIL17TR_AllocationKind1ELj12ELj28EEEEEEEE+0x750 (0x00007F95338B0D00 [libj9jit29.so+0x6f7d00])
04:32:46 _ZN16TR_LoopVersioner18versionNaturalLoopEP18TR_RegionStructureP4ListIN2TR4NodeEEPS2_INS3_7TreeTopEES9_S9_S9_S9_S9_S9_S9_S6_PS2_I19TR_NodeParentSymRefEPS2_I30TR_NodeParentSymRefWeightTupleEPS2_I12TR_StructureESI_bRN3CS216ASparseBitVectorINSJ_16shared_allo+0x1a6f (0x00007F95338B70DF [libj9jit29.so+0x6fe0df])
04:32:46 _ZN16TR_LoopVersioner24performWithoutDominatorsEv+0x13a3 (0x00007F95338BBD43 [libj9jit29.so+0x702d43])
04:32:46 _ZN3OMR9Optimizer19performOptimizationEPK20OptimizationStrategyiii+0x767 (0x00007F95338FF537 [libj9jit29.so+0x746537])
04:32:46 _ZN3OMR9Optimizer19performOptimizationEPK20OptimizationStrategyiii+0xdc9 (0x00007F95338FFB99 [libj9jit29.so+0x746b99])
04:32:46 _ZN3OMR9Optimizer19performOptimizationEPK20OptimizationStrategyiii+0xdc9 (0x00007F95338FFB99 [libj9jit29.so+0x746b99])
04:32:46 _ZN3OMR9Optimizer8optimizeEv+0x1db (0x00007F9533900E6B [libj9jit29.so+0x747e6b])
04:32:46 _ZN3OMR11Compilation7compileEv+0xaf5 (0x00007F95336F8405 [libj9jit29.so+0x53f405])
04:32:46 _ZN2TR28CompilationInfoPerThreadBase7compileEP10J9VMThreadPNS_11CompilationEP17TR_ResolvedMethodR11TR_J9VMBaseP19TR_OptimizationPlanRKNS_16SegmentAllocatorE+0x4bf (0x00007F953330828F [libj9jit29.so+0x14f28f])
04:32:46 _ZN2TR28CompilationInfoPerThreadBase14wrappedCompileEP13J9PortLibraryPv+0x314 (0x00007F95333092F4 [libj9jit29.so+0x1502f4])
04:32:46 omrsig_protect+0x1e3 (0x00007F9539FF25D3 [libj9prt29.so+0x2b5d3])
04:32:46 _ZN2TR28CompilationInfoPerThreadBase7compileEP10J9VMThreadP21TR_MethodToBeCompiledRN2J917J9SegmentProviderE+0x309 (0x00007F9533306A19 [libj9jit29.so+0x14da19])
04:32:46 _ZN2TR24CompilationInfoPerThread12processEntryER21TR_MethodToBeCompiledRN2J917J9SegmentProviderE+0x1c0 (0x00007F9533307060 [libj9jit29.so+0x14e060])
04:32:46 _ZN2TR24CompilationInfoPerThread14processEntriesEv+0x3b3 (0x00007F9533305B93 [libj9jit29.so+0x14cb93])
04:32:46 _ZN2TR24CompilationInfoPerThread3runEv+0x42 (0x00007F9533306072 [libj9jit29.so+0x14d072])
04:32:46 _Z30protectedCompilationThreadProcP13J9PortLibraryPN2TR24CompilationInfoPerThreadE+0x82 (0x00007F9533306122 [libj9jit29.so+0x14d122])
04:32:46 omrsig_protect+0x1e3 (0x00007F9539FF25D3 [libj9prt29.so+0x2b5d3])
04:32:46 _Z21compilationThreadProcPv+0x1d2 (0x00007F9533306552 [libj9jit29.so+0x14d552])
04:32:46 thread_wrapper+0x186 (0x00007F9539DBA4F6 [libj9thr29.so+0xe4f6])
04:32:46 start_thread+0xd9 (0x00007F953B45C609 [libpthread.so.0+0x8609])
04:32:46 clone+0x43 (0x00007F953B163133 [libc.so.6+0x11f133])
04:32:46 JVMDUMP053I JIT dump is recompiling java/util/Random$RandomIntsSpliterator.forEachRemaining(Ljava/util/function/IntConsumer;)V
04:32:46 Assertion failed at /home/jenkins/workspace/Build_JDK8_x86-64_linux_jit_Personal/omr/compiler/optimizer/LoopVersioner.cpp:9125: onlySearching
04:32:46 VMState: 0x00050fff
04:32:46 guardOkForExpr: should not intern OSR guard n3908n [0x7f94883da540]
04:32:46 compiling java/util/Random$RandomIntsSpliterator.forEachRemaining(Ljava/util/function/IntConsumer;)V at level: hot
```
|
test
|
jdk util assertion failed guardokforexpr should not intern osr guard in the internal nightly jitserver tests job test sanity openjdk linux jit personal job test sanity openjdk linux jit personal the test jdk util failed due to an assertion failure but it seems to reproduce readily without jitserver enabled console log from the first job openjdk version internal openjdk runtime environment build internal eclipse vm build master jre linux bit compressed references jit enabled aot enabled omr jcl based on running test jdk util jdk util start time mon oct epoch time ms home jenkins workspace test sanity openjdk linux jit personal openjdkbinary image bin java xshareclasses destroyall home jenkins workspace test sanity openjdk linux jit personal openjdkbinary image bin java xshareclasses groupaccess destroyall echo cache cleanup done no shared class caches available no shared class caches available cache cleanup done variation xdump system none xdump heap none xdump system events gpf abort traceassert corruptcache jvm options xx usejitserver xdump system none xdump heap none xdump system events gpf abort traceassert corruptcache xx usecompressedoops unhandled exception type unhandled trap vmstate signal number signal number error value signal code fffffffa rdi rsi rax rbx rcx rdx rip gs fs rsp eflags cs rbp err trapno oldmask ffffffffffffffff f d nan f d f d nan f d f d f d f d f d f d f d f d f d f d f d f d f d module lib linux gnu libpthread so module base address symbol raise symbol address method being compiled java util random randomintsspliterator foreachremaining ljava util function intconsumer v target linux generic cpu logical cpus ram stack backtrace raise structureesi allo optimizationplanrkns omrsig protect omrsig protect thread wrapper start thread clone jit dump is recompiling java util random randomintsspliterator foreachremaining ljava util function intconsumer v assertion failed at home jenkins workspace build linux jit personal omr compiler optimizer 
loopversioner cpp onlysearching vmstate guardokforexpr should not intern osr guard compiling java util random randomintsspliterator foreachremaining ljava util function intconsumer v at level hot
| 1
|
99,340
| 8,697,854,487
|
IssuesEvent
|
2018-12-04 21:30:49
|
brave/browser-android-tabs
|
https://api.github.com/repos/brave/browser-android-tabs
|
opened
|
Unable to sync through QR code, brave is hanging on Android.
|
QA/Test-plan-specified QA/Yes bug
|
**Did you search for similar issues before submitting this one?**
**Description:**
Unable to sync through QR code, brave is hanging on Android.
**Device Details:**
- Install Type(ARM, x86): ARM
- Device(Phone, Tablet, Phablet): Desktop (BraveCore - 0.57.12)
- Android Version: Gionee Android 5.1
**Brave Version:**
1.0.71(sync1)
**Steps to reproduce:**
1. Enable sync on Desktop (Sync creator)
2. Try to sync Android using QR code
3. Brave is hanging on Android (after hanging brave is not usable until we uninstall and re-install)
Note: Sync through code word works as expected.
**Actual Behavior**

**Expected Behavior**
Should not hang and sync should be established through QR code
|
1.0
|
Unable to sync through QR code, brave is hanging on Android. - **Did you search for similar issues before submitting this one?**
**Description:**
Unable to sync through QR code, brave is hanging on Android.
**Device Details:**
- Install Type(ARM, x86): ARM
- Device(Phone, Tablet, Phablet): Desktop (BraveCore - 0.57.12)
- Android Version: Gionee Android 5.1
**Brave Version:**
1.0.71(sync1)
**Steps to reproduce:**
1. Enable sync on Desktop (Sync creator)
2. Try to sync Android using QR code
3. Brave is hanging on Android (after hanging brave is not usable until we uninstall and re-install)
Note: Sync through code word works as expected.
**Actual Behavior**

**Expected Behavior**
Should not hang and sync should be established through QR code
|
test
|
unable to synch through qr code brave is hanging on android did you search for similar issues before submitting this one description unable to synch through qr code brave is hanging on android device details install type arm arm device phone tablet phablet desktop bravecore android version gionee android brave version steps to reproduce enable sync on desktop sync creator try to sync android using qr code brave is hanging on android after hanging brave is not usable untill we uninstall and re install note sync through code word works as expected actual behavior expected behavior should not hang and sync should be established through qr code
| 1
|
562,880
| 16,671,676,408
|
IssuesEvent
|
2021-06-07 11:42:50
|
openscd/open-scd
|
https://api.github.com/repos/openscd/open-scd
|
closed
|
Display DOType's in filtered list within template editor
|
Kind: Feature Priority: Important Priority: Urgent
|
As a user of OpenSCD I want to know how many and which `DOType` elements are in the project. I want to filter for these elements and want to see the most important information quickly, that is:
- id
- common data class (CDC)
- number of children
See `DAType` element as prototype
|
2.0
|
Display DOType's in filtered list within template editor - As a user of OpenSCD I want to know how many and which `DOType` elements are in the project. I want to filter for these elements and want to see the most important information quickly, that is:
- id
- common data class (CDC)
- number of children
See `DAType` element as prototype
|
non_test
|
display dotype s in filtered list within template editor as a user of openscd i want to know how many and witch dotype elements are in the project i want to filter for these elements and want to see the most important information quickly that is id common data class cdc number of children see datype element as prototype
| 0
|
191,861
| 14,596,730,680
|
IssuesEvent
|
2020-12-20 17:04:07
|
github-vet/rangeloop-pointer-findings
|
https://api.github.com/repos/github-vet/rangeloop-pointer-findings
|
closed
|
aoxn/ctlplane: Godeps/_workspace/src/golang.org/x/crypto/ssh/kex_test.go; 29 LoC
|
fresh small test
|
Found a possible issue in [aoxn/ctlplane](https://www.github.com/aoxn/ctlplane) at [Godeps/_workspace/src/golang.org/x/crypto/ssh/kex_test.go](https://github.com/aoxn/ctlplane/blob/f08e72317cfca30405b2978bcad53f88987321a5/Godeps/_workspace/src/golang.org/x/crypto/ssh/kex_test.go#L21-L49)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> range-loop variable kex used in defer or goroutine at line 28
[Click here to see the code in its original context.](https://github.com/aoxn/ctlplane/blob/f08e72317cfca30405b2978bcad53f88987321a5/Godeps/_workspace/src/golang.org/x/crypto/ssh/kex_test.go#L21-L49)
<details>
<summary>Click here to show the 29 line(s) of Go which triggered the analyzer.</summary>
```go
for name, kex := range kexAlgoMap {
a, b := memPipe()
s := make(chan kexResultErr, 1)
c := make(chan kexResultErr, 1)
var magics handshakeMagics
go func() {
r, e := kex.Client(a, rand.Reader, &magics)
a.Close()
c <- kexResultErr{r, e}
}()
go func() {
r, e := kex.Server(b, rand.Reader, &magics, testSigners["ecdsa"])
b.Close()
s <- kexResultErr{r, e}
}()
clientRes := <-c
serverRes := <-s
if clientRes.err != nil {
t.Errorf("client: %v", clientRes.err)
}
if serverRes.err != nil {
t.Errorf("server: %v", serverRes.err)
}
if !reflect.DeepEqual(clientRes.result, serverRes.result) {
t.Errorf("kex %q: mismatch %#v, %#v", name, clientRes.result, serverRes.result)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: f08e72317cfca30405b2978bcad53f88987321a5
|
1.0
|
aoxn/ctlplane: Godeps/_workspace/src/golang.org/x/crypto/ssh/kex_test.go; 29 LoC -
Found a possible issue in [aoxn/ctlplane](https://www.github.com/aoxn/ctlplane) at [Godeps/_workspace/src/golang.org/x/crypto/ssh/kex_test.go](https://github.com/aoxn/ctlplane/blob/f08e72317cfca30405b2978bcad53f88987321a5/Godeps/_workspace/src/golang.org/x/crypto/ssh/kex_test.go#L21-L49)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> range-loop variable kex used in defer or goroutine at line 28
[Click here to see the code in its original context.](https://github.com/aoxn/ctlplane/blob/f08e72317cfca30405b2978bcad53f88987321a5/Godeps/_workspace/src/golang.org/x/crypto/ssh/kex_test.go#L21-L49)
<details>
<summary>Click here to show the 29 line(s) of Go which triggered the analyzer.</summary>
```go
for name, kex := range kexAlgoMap {
a, b := memPipe()
s := make(chan kexResultErr, 1)
c := make(chan kexResultErr, 1)
var magics handshakeMagics
go func() {
r, e := kex.Client(a, rand.Reader, &magics)
a.Close()
c <- kexResultErr{r, e}
}()
go func() {
r, e := kex.Server(b, rand.Reader, &magics, testSigners["ecdsa"])
b.Close()
s <- kexResultErr{r, e}
}()
clientRes := <-c
serverRes := <-s
if clientRes.err != nil {
t.Errorf("client: %v", clientRes.err)
}
if serverRes.err != nil {
t.Errorf("server: %v", serverRes.err)
}
if !reflect.DeepEqual(clientRes.result, serverRes.result) {
t.Errorf("kex %q: mismatch %#v, %#v", name, clientRes.result, serverRes.result)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: f08e72317cfca30405b2978bcad53f88987321a5
|
test
|
aoxn ctlplane godeps workspace src golang org x crypto ssh kex test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message range loop variable kex used in defer or goroutine at line click here to show the line s of go which triggered the analyzer go for name kex range kexalgomap a b mempipe s make chan kexresulterr c make chan kexresulterr var magics handshakemagics go func r e kex client a rand reader magics a close c kexresulterr r e go func r e kex server b rand reader magics testsigners b close s kexresulterr r e clientres c serverres s if clientres err nil t errorf client v clientres err if serverres err nil t errorf server v serverres err if reflect deepequal clientres result serverres result t errorf kex q mismatch v v name clientres result serverres result leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
| 1
|
316,989
| 27,201,455,848
|
IssuesEvent
|
2023-02-20 10:01:22
|
meowool/sweekt-gradle
|
https://api.github.com/repos/meowool/sweekt-gradle
|
closed
|
Failed to test the distribution on the `changed/v7.5.1` branch
|
bot bug: test
|
## Affected Version
https://github.com/meowool/sweekt-gradle/releases/tag/v1
## Error
```console
$ ./gradlew clean -Dfile.encoding=UTF-8 -Duser.language=en
Downloading https://services.gradle.org/distributions/gradle-7.5.1-bin.zip
...........10%............20%...........30%............40%...........50%............60%...........70%............80%...........90%............100%
Welcome to Gradle 7.5.1!
Here are the highlights of this release:
- Support for Java 18
- Support for building with Groovy 4
- Much more responsive continuous builds
- Improved diagnostics for dependency resolution
For more details see https://docs.gradle.org/7.5.1/release-notes.html
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration cache is an incubating feature.
Calculating task graph as no configuration cache is available for tasks: clean
> Task :build-logic-settings:build-logic-settings-plugin:generateExternalPluginSpecBuilders
> Task :build-logic-settings:build-logic-settings-plugin:extractPrecompiledScriptPluginPlugins
> Task :build-logic-settings:build-logic-settings-plugin:compilePluginsBlocks
> Task :build-logic-settings:build-logic-settings-plugin:generatePrecompiledScriptPluginAccessors
> Task :build-logic-settings:build-logic-settings-plugin:generateScriptPluginAdapters
> Task :build-logic-settings:build-logic-settings-plugin:pluginDescriptors
> Task :build-logic-settings:build-logic-settings-plugin:processResources
> Task :build-logic-settings:build-logic-settings-plugin:compileKotlin
> Task :build-logic-settings:build-logic-settings-plugin:compileJava NO-SOURCE
> Task :build-logic-settings:build-logic-settings-plugin:classes
> Task :build-logic-settings:build-logic-settings-plugin:inspectClassesForKotlinIC
> Task :build-logic-settings:build-logic-settings-plugin:jar
Type-safe project accessors is an incubating feature.
> Task :build-logic-commons:gradle-plugin:generateExternalPluginSpecBuilders
> Task :build-logic-commons:gradle-plugin:extractPrecompiledScriptPluginPlugins
> Task :build-logic-commons:gradle-plugin:compilePluginsBlocks
> Task :build-logic-commons:gradle-plugin:generatePrecompiledScriptPluginAccessors
> Task :build-logic-commons:gradle-plugin:generateScriptPluginAdapters
> Task :build-logic-commons:gradle-plugin:pluginDescriptors
> Task :build-logic-commons:gradle-plugin:processResources
> Task :build-logic-commons:gradle-plugin:compileKotlin
w: /home/runner/work/sweekt-gradle/sweekt-gradle/build-logic-commons/gradle-plugin/src/main/kotlin/gradlebuild.cache-miss-monitor.gradle.kts: (25, 26): 'afterTask(Action<Task!>): Unit' is deprecated. Deprecated in Java
w: /home/runner/work/sweekt-gradle/sweekt-gradle/build-logic-commons/gradle-plugin/src/main/kotlin/gradlebuild.code-quality.gradle.kts: (133, 13): 'withConvention(KClass<ConventionType>, ConventionType.() -> ReturnType): ReturnType' is deprecated. The concept of conventions is deprecated. Use extensions instead.
w: /home/runner/work/sweekt-gradle/sweekt-gradle/build-logic-commons/gradle-plugin/src/main/kotlin/gradlebuild.code-quality.gradle.kts: (133, 28): 'GroovySourceSet' is deprecated. Deprecated in Java
> Task :build-logic-commons:gradle-plugin:compileJava NO-SOURCE
> Task :build-logic-commons:gradle-plugin:classes
> Task :build-logic-commons:gradle-plugin:inspectClassesForKotlinIC
> Task :build-logic-commons:gradle-plugin:jar
> Task :build-logic:module-identity:extractPrecompiledScriptPluginPlugins
> Task :build-logic:module-identity:generateScriptPluginAdapters
> Task :build-logic:module-identity:pluginDescriptors
> Task :build-logic:module-identity:processResources
> Task :build-logic:cleanup:extractPluginRequests NO-SOURCE
> Task :build-logic:cleanup:generatePluginAdapters
> Task :build-logic:cleanup:extractPrecompiledScriptPluginPlugins
> Task :build-logic:cleanup:generateScriptPluginAdapters
> Task :build-logic:idea:extractPrecompiledScriptPluginPlugins
> Task :build-logic:idea:generateScriptPluginAdapters
> Task :build-logic:build-update-utils:extractPluginRequests NO-SOURCE
> Task :build-logic:build-update-utils:generatePluginAdapters FROM-CACHE
> Task :build-logic:build-update-utils:extractPrecompiledScriptPluginPlugins
> Task :build-logic:build-update-utils:generateScriptPluginAdapters
> Task :build-logic:build-update-utils:pluginDescriptors
> Task :build-logic:build-update-utils:processResources
> Task :build-logic:documentation:extractPluginRequests NO-SOURCE
> Task :build-logic:documentation:generatePluginAdapters FROM-CACHE
> Task :build-logic:documentation:pluginDescriptors
> Task :build-logic:documentation:processResources
> Task :build-logic:profiling:extractPrecompiledScriptPluginPlugins
> Task :build-logic:profiling:generateScriptPluginAdapters
> Task :build-logic:cleanup:pluginDescriptors
> Task :build-logic:cleanup:processResources
> Task :build-logic:idea:pluginDescriptors
> Task :build-logic:idea:processResources
> Task :build-logic:profiling:pluginDescriptors
> Task :build-logic:profiling:processResources
> Task :build-logic:root-build:extractPrecompiledScriptPluginPlugins
> Task :build-logic:root-build:generateScriptPluginAdapters
> Task :build-logic:root-build:pluginDescriptors
> Task :build-logic:root-build:processResources
> Task :build-logic:lifecycle:extractPrecompiledScriptPluginPlugins
> Task :build-logic:lifecycle:generateScriptPluginAdapters
> Task :build-logic:lifecycle:pluginDescriptors
> Task :build-logic:lifecycle:processResources
> Task :build-logic:basics:generateExternalPluginSpecBuilders
> Task :build-logic:basics:extractPrecompiledScriptPluginPlugins
> Task :build-logic:binary-compatibility:extractPluginRequests
> Task :build-logic:basics:compilePluginsBlocks
> Task :build-logic:binary-compatibility:generatePluginAdapters
> Task :build-logic:dependency-modules:extractPrecompiledScriptPluginPlugins
> Task :build-logic:dependency-modules:generateScriptPluginAdapters
> Task :build-logic:dependency-modules:pluginDescriptors
> Task :build-logic:dependency-modules:processResources
> Task :build-logic:integration-testing:extractPrecompiledScriptPluginPlugins
> Task :build-logic:integration-testing:generateScriptPluginAdapters
> Task :build-logic:integration-testing:pluginDescriptors
> Task :build-logic:integration-testing:processResources
> Task :build-logic:basics:generatePrecompiledScriptPluginAccessors
> Task :build-logic:performance-testing:extractPluginRequests
> Task :build-logic:basics:generateScriptPluginAdapters
> Task :build-logic:performance-testing:generatePluginAdapters
> Task :build-logic:binary-compatibility:pluginDescriptors
> Task :build-logic:binary-compatibility:processResources
> Task :build-logic:buildquality:extractPrecompiledScriptPluginPlugins
> Task :build-logic:performance-testing:pluginDescriptors
> Task :build-logic:performance-testing:processResources
> Task :build-logic:buildquality:generateScriptPluginAdapters
> Task :build-logic:buildquality:pluginDescriptors
> Task :build-logic:buildquality:processResources
> Task :build-logic:basics:pluginDescriptors
> Task :build-logic:basics:processResources
> Task :build-logic:basics:compileKotlin
> Task :build-logic:basics:compileJava NO-SOURCE
> Task :build-logic:basics:classes
> Task :build-logic:basics:inspectClassesForKotlinIC
> Task :build-logic:basics:jar
> Task :build-logic:module-identity:generateExternalPluginSpecBuilders
> Task :build-logic:idea:generateExternalPluginSpecBuilders
> Task :build-logic:module-identity:compilePluginsBlocks
> Task :build-logic:module-identity:generatePrecompiledScriptPluginAccessors
> Task :build-logic:idea:compilePluginsBlocks
> Task :build-logic:idea:generatePrecompiledScriptPluginAccessors
> Task :build-logic:idea:compileKotlin
> Task :build-logic:idea:compileJava NO-SOURCE
> Task :build-logic:build-update-utils:generateExternalPluginSpecBuilders
> Task :build-logic:module-identity:compileKotlin
> Task :build-logic:module-identity:compileJava NO-SOURCE
> Task :build-logic:module-identity:classes
> Task :build-logic:module-identity:inspectClassesForKotlinIC
> Task :build-logic:module-identity:jar
> Task :build-logic:build-update-utils:compilePluginsBlocks
> Task :build-logic:cleanup:generateExternalPluginSpecBuilders
> Task :build-logic:build-update-utils:generatePrecompiledScriptPluginAccessors
> Task :build-logic:cleanup:compilePluginsBlocks
> Task :build-logic:cleanup:generatePrecompiledScriptPluginAccessors
> Task :build-logic:build-update-utils:compileKotlin
> Task :build-logic:build-update-utils:compileJava NO-SOURCE
> Task :build-logic:build-update-utils:compileGroovy NO-SOURCE
> Task :build-logic:build-update-utils:compileGroovyPlugins NO-SOURCE
> Task :build-logic:build-update-utils:classes
> Task :build-logic:build-update-utils:inspectClassesForKotlinIC
> Task :build-logic:cleanup:compileKotlin
> Task :build-logic:cleanup:compileJava NO-SOURCE
> Task :build-logic:cleanup:compileGroovy NO-SOURCE
> Task :build-logic:cleanup:compileGroovyPlugins NO-SOURCE
> Task :build-logic:cleanup:classes
> Task :build-logic:cleanup:inspectClassesForKotlinIC
> Task :build-logic:build-update-utils:jar
> Task :build-logic:documentation:compileKotlin NO-SOURCE
> Task :build-logic:documentation:compileJava NO-SOURCE
> Task :build-logic:cleanup:jar
> Task :build-logic:idea:classes
> Task :build-logic:idea:inspectClassesForKotlinIC
> Task :build-logic:idea:jar
> Task :build-logic:lifecycle:generateExternalPluginSpecBuilders FROM-CACHE
> Task :build-logic:lifecycle:compilePluginsBlocks
> Task :build-logic:lifecycle:generatePrecompiledScriptPluginAccessors
> Task :build-logic:lifecycle:compileKotlin
w: /home/runner/work/sweekt-gradle/sweekt-gradle/build-logic/lifecycle/src/main/kotlin/gradlebuild.lifecycle.gradle.kts: (69, 16): 'buildFinished(Action<in BuildResult!>): Unit' is deprecated. Deprecated in Java
> Task :build-logic:lifecycle:compileJava NO-SOURCE
> Task :build-logic:lifecycle:classes
> Task :build-logic:lifecycle:inspectClassesForKotlinIC
> Task :build-logic:lifecycle:jar
> Task :build-logic:binary-compatibility:compileKotlin
'compileJava' task (current target is 11) and 'compileKotlin' task (current target is 1.8) jvm target compatibility should be set to the same Java version.
w: /home/runner/work/sweekt-gradle/sweekt-gradle/build-logic/binary-compatibility/src/main/kotlin/gradlebuild/binarycompatibility/metadata/HasKotlinFlagsMetadataQuery.kt: (295, 22): This declaration overrides deprecated member but not marked as deprecated itself. This deprecation won't be inherited in future releases. Please add @Deprecated annotation or suppress
> Task :build-logic:documentation:compileGroovy
Groovy compilation avoidance is an incubating feature.
> Task :build-logic:profiling:generateExternalPluginSpecBuilders
> Task :build-logic:documentation:compileGroovyPlugins NO-SOURCE
> Task :build-logic:documentation:classes
> Task :build-logic:documentation:inspectClassesForKotlinIC
> Task :build-logic:documentation:jar
> Task :build-logic:profiling:compilePluginsBlocks
> Task :build-logic:binary-compatibility:compileJava
> Task :build-logic:binary-compatibility:compileGroovy
> Task :build-logic:dependency-modules:generateExternalPluginSpecBuilders FROM-CACHE
> Task :build-logic:dependency-modules:compilePluginsBlocks FROM-CACHE
> Task :build-logic:dependency-modules:generatePrecompiledScriptPluginAccessors
> Task :build-logic:profiling:generatePrecompiledScriptPluginAccessors
> Task :build-logic:dependency-modules:compileKotlin
> Task :build-logic:dependency-modules:compileJava NO-SOURCE
> Task :build-logic:integration-testing:generateExternalPluginSpecBuilders
> Task :build-logic:dependency-modules:classes
> Task :build-logic:dependency-modules:inspectClassesForKotlinIC
> Task :build-logic:dependency-modules:jar
> Task :build-logic:integration-testing:compilePluginsBlocks
> Task :build-logic:integration-testing:generatePrecompiledScriptPluginAccessors
> Task :build-logic:profiling:compileKotlin
w: /home/runner/work/sweekt-gradle/sweekt-gradle/build-logic/profiling/src/main/kotlin/gradlebuild.buildscan.gradle.kts: (192, 22): 'afterTask(Action<Task!>): Unit' is deprecated. Deprecated in Java
> Task :build-logic:profiling:compileJava NO-SOURCE
> Task :build-logic:root-build:generateExternalPluginSpecBuilders
> Task :build-logic:profiling:classes
> Task :build-logic:profiling:inspectClassesForKotlinIC
> Task :build-logic:profiling:jar
> Task :build-logic:root-build:compilePluginsBlocks
> Task :build-logic:root-build:generatePrecompiledScriptPluginAccessors
> Task :build-logic:root-build:compileKotlin
> Task :build-logic:root-build:compileJava NO-SOURCE
> Task :build-logic:root-build:classes
> Task :build-logic:root-build:inspectClassesForKotlinIC
> Task :build-logic:root-build:jar
> Task :build-logic:binary-compatibility:compileGroovyPlugins
> Task :build-logic:binary-compatibility:classes
> Task :build-logic:binary-compatibility:inspectClassesForKotlinIC
> Task :build-logic:binary-compatibility:jar
> Task :build-logic:integration-testing:compileKotlin
> Task :build-logic:integration-testing:compileJava NO-SOURCE
> Task :build-logic:integration-testing:classes
> Task :build-logic:integration-testing:inspectClassesForKotlinIC
> Task :build-logic:integration-testing:jar
> Task :build-logic:performance-testing:compileGroovy
> Task :build-logic:performance-testing:compileKotlin
'compileJava' task (current target is 11) and 'compileKotlin' task (current target is 1.8) jvm target compatibility should be set to the same Java version.
> Task :build-logic:performance-testing:compileJava
> Task :build-logic:performance-testing:compileGroovyPlugins
> Task :build-logic:performance-testing:classes
> Task :build-logic:performance-testing:inspectClassesForKotlinIC
> Task :build-logic:performance-testing:jar
> Task :build-logic:buildquality:generateExternalPluginSpecBuilders
> Task :build-logic:buildquality:compilePluginsBlocks
> Task :build-logic:buildquality:generatePrecompiledScriptPluginAccessors
> Task :build-logic:buildquality:compileKotlin
> Task :build-logic:buildquality:compileJava NO-SOURCE
> Task :build-logic:buildquality:classes
> Task :build-logic:buildquality:inspectClassesForKotlinIC
> Task :build-logic:buildquality:jar
> Task :build-logic:publishing:generateExternalPluginSpecBuilders
> Task :build-logic:publishing:extractPrecompiledScriptPluginPlugins
> Task :build-logic:jvm:generateExternalPluginSpecBuilders
> Task :build-logic:jvm:extractPrecompiledScriptPluginPlugins
> Task :build-logic:publishing:compilePluginsBlocks
> Task :build-logic:jvm:compilePluginsBlocks
> Task :build-logic:publishing:generatePrecompiledScriptPluginAccessors
> Task :build-logic:publishing:generateScriptPluginAdapters
> Task :build-logic:jvm:generatePrecompiledScriptPluginAccessors
> Task :build-logic:jvm:generateScriptPluginAdapters
> Task :build-logic:publishing:compileKotlin
> Task :build-logic:publishing:compileJava NO-SOURCE
> Task :build-logic:jvm:pluginDescriptors
> Task :build-logic:jvm:processResources
> Task :build-logic:publishing:pluginDescriptors
> Task :build-logic:publishing:processResources
> Task :build-logic:publishing:classes
> Task :build-logic:publishing:inspectClassesForKotlinIC
> Task :build-logic:publishing:jar
> Task :build-logic:uber-plugins:extractPrecompiledScriptPluginPlugins
> Task :build-logic:uber-plugins:generateScriptPluginAdapters
> Task :build-logic:uber-plugins:pluginDescriptors
> Task :build-logic:uber-plugins:processResources
> Task :build-logic:jvm:compileKotlin
> Task :build-logic:jvm:compileJava NO-SOURCE
> Task :build-logic:jvm:classes
> Task :build-logic:jvm:inspectClassesForKotlinIC
> Task :build-logic:jvm:jar
> Task :build-logic:uber-plugins:generateExternalPluginSpecBuilders
> Task :build-logic:uber-plugins:compilePluginsBlocks
> Task :build-logic:uber-plugins:generatePrecompiledScriptPluginAccessors
> Task :build-logic:uber-plugins:compileKotlin
> Task :build-logic:uber-plugins:compileJava NO-SOURCE
> Task :build-logic:uber-plugins:classes
> Task :build-logic:uber-plugins:inspectClassesForKotlinIC
> Task :build-logic:uber-plugins:jar
> Task :build-logic:packaging:generateExternalPluginSpecBuilders
> Task :build-logic:packaging:extractPrecompiledScriptPluginPlugins
> Task :build-logic:packaging:compilePluginsBlocks
> Task :build-logic:packaging:generatePrecompiledScriptPluginAccessors
> Task :build-logic:packaging:generateScriptPluginAdapters
> Task :build-logic:packaging:pluginDescriptors
> Task :build-logic:packaging:processResources
> Task :build-logic:packaging:compileKotlin
> Task :build-logic:packaging:compileJava NO-SOURCE
> Task :build-logic:packaging:classes
> Task :build-logic:packaging:inspectClassesForKotlinIC
> Task :build-logic:packaging:jar
> Task :build-logic:kotlin-dsl:generateExternalPluginSpecBuilders
> Task :build-logic:kotlin-dsl:extractPrecompiledScriptPluginPlugins
> Task :build-logic:kotlin-dsl:compilePluginsBlocks
> Task :build-logic:kotlin-dsl:generatePrecompiledScriptPluginAccessors
> Task :build-logic:kotlin-dsl:generateScriptPluginAdapters
> Task :build-logic:kotlin-dsl:pluginDescriptors
> Task :build-logic:kotlin-dsl:processResources
> Task :build-logic:kotlin-dsl:compileKotlin
> Task :build-logic:kotlin-dsl:compileJava NO-SOURCE
> Task :build-logic:kotlin-dsl:classes
> Task :build-logic:kotlin-dsl:inspectClassesForKotlinIC
> Task :build-logic:kotlin-dsl:jar
> Task :build-logic:build-init-samples:generateExternalPluginSpecBuilders
> Task :build-logic:build-init-samples:extractPrecompiledScriptPluginPlugins
> Task :build-logic:build-init-samples:compilePluginsBlocks
> Task :build-logic:build-init-samples:generatePrecompiledScriptPluginAccessors
> Task :build-logic:build-init-samples:generateScriptPluginAdapters
> Task :build-logic:build-init-samples:pluginDescriptors
> Task :build-logic:build-init-samples:processResources
> Task :build-logic:build-init-samples:compileKotlin
> Task :build-logic:build-init-samples:compileJava NO-SOURCE
> Task :build-logic:build-init-samples:classes
> Task :build-logic:build-init-samples:inspectClassesForKotlinIC
> Task :build-logic:build-init-samples:jar
> Task :clean UP-TO-DATE
> Task :antlr:clean UP-TO-DATE
> Task :api-metadata:clean UP-TO-DATE
> Task :architecture-test:clean UP-TO-DATE
> Task :base-annotations:clean UP-TO-DATE
> Task :base-services:clean UP-TO-DATE
> Task :bootstrap:clean UP-TO-DATE
> Task :build-cache:clean UP-TO-DATE
> Task :build-cache-base:clean UP-TO-DATE
> Task :base-services-groovy:clean UP-TO-DATE
> Task :build-cache-http:clean UP-TO-DATE
> Task :build-cache-packaging:clean UP-TO-DATE
> Task :build-events:clean UP-TO-DATE
> Task :build-init:clean UP-TO-DATE
> Task :build-operations:clean UP-TO-DATE
> Task :build-option:clean UP-TO-DATE
> Task :build-profile:clean UP-TO-DATE
> Task :build-scan-performance:clean UP-TO-DATE
> Task :cli:clean UP-TO-DATE
> Task :code-quality:clean UP-TO-DATE
> Task :composite-builds:clean UP-TO-DATE
> Task :configuration-cache:clean UP-TO-DATE
> Task :core:clean UP-TO-DATE
> Task :core-api:clean UP-TO-DATE
> Task :core-platform:clean UP-TO-DATE
> Task :diagnostics:clean UP-TO-DATE
> Task :distributions-basics:clean UP-TO-DATE
> Task :dependency-management:clean UP-TO-DATE
> Task :distributions-dependencies:clean UP-TO-DATE
> Task :distributions-core:clean UP-TO-DATE
> Task :distributions-full:clean UP-TO-DATE
> Task :distributions-integ-tests:clean UP-TO-DATE
> Task :distributions-jvm:clean UP-TO-DATE
> Task :distributions-publishing:clean UP-TO-DATE
> Task :distributions-native:clean UP-TO-DATE
> Task :docs:clean UP-TO-DATE
> Task :ear:clean UP-TO-DATE
> Task :enterprise:clean UP-TO-DATE
> Task :enterprise-logging:clean UP-TO-DATE
> Task :enterprise-operations:clean UP-TO-DATE
> Task :enterprise-workers:clean UP-TO-DATE
> Task :execution:clean UP-TO-DATE
> Task :file-temp:clean UP-TO-DATE
> Task :file-collections:clean UP-TO-DATE
> Task :files:clean UP-TO-DATE
> Task :file-watching:clean UP-TO-DATE
> Task :functional:clean UP-TO-DATE
> Task :ide:clean UP-TO-DATE
> Task :hashing:clean UP-TO-DATE
> Task :installation-beacon:clean UP-TO-DATE
> Task :ide-native:clean UP-TO-DATE
> Task :integ-test:clean UP-TO-DATE
> Task :internal-build-reports:clean UP-TO-DATE
> Task :internal-integ-testing:clean UP-TO-DATE
> Task :internal-performance-testing:clean UP-TO-DATE
> Task :internal-testing:clean UP-TO-DATE
> Task :ivy:clean UP-TO-DATE
> Task :jacoco:clean UP-TO-DATE
> Task :java-compiler-plugin:clean UP-TO-DATE
> Task :jvm-services:clean UP-TO-DATE
> Task :kotlin-compiler-embeddable:clean UP-TO-DATE
> Task :kotlin-dsl:clean UP-TO-DATE
> Task :kotlin-dsl-plugins:clean UP-TO-DATE
> Task :kotlin-dsl-integ-tests:clean UP-TO-DATE
> Task :kotlin-dsl-provider-plugins:clean UP-TO-DATE
> Task :kotlin-dsl-tooling-builders:clean UP-TO-DATE
> Task :kotlin-dsl-tooling-models:clean UP-TO-DATE
> Task :language-groovy:clean UP-TO-DATE
> Task :language-java:clean UP-TO-DATE
> Task :language-jvm:clean UP-TO-DATE
> Task :language-native:clean UP-TO-DATE
> Task :launcher:clean UP-TO-DATE
> Task :logging:clean UP-TO-DATE
> Task :logging-api:clean UP-TO-DATE
> Task :maven:clean UP-TO-DATE
> Task :messaging:clean UP-TO-DATE
> Task :model-groovy:clean UP-TO-DATE
> Task :model-core:clean UP-TO-DATE
> Task :native:clean UP-TO-DATE
> Task :normalization-java:clean UP-TO-DATE
> Task :performance:clean UP-TO-DATE
> Task :persistent-cache:clean UP-TO-DATE
> Task :platform-base:clean UP-TO-DATE
> Task :platform-jvm:clean UP-TO-DATE
> Task :platform-native:clean UP-TO-DATE
> Task :plugin-use:clean UP-TO-DATE
> Task :plugin-development:clean UP-TO-DATE
> Task :plugins:clean UP-TO-DATE
> Task :problems:clean UP-TO-DATE
> Task :process-services:clean UP-TO-DATE
> Task :publish:clean UP-TO-DATE
> Task :resources:clean UP-TO-DATE
> Task :reporting:clean UP-TO-DATE
> Task :resources-gcs:clean UP-TO-DATE
> Task :resources-http:clean UP-TO-DATE
> Task :resources-s3:clean UP-TO-DATE
> Task :resources-sftp:clean UP-TO-DATE
> Task :samples:clean UP-TO-DATE
> Task :scala:clean UP-TO-DATE
> Task :security:clean UP-TO-DATE
> Task :signing:clean UP-TO-DATE
> Task :smoke-test:clean UP-TO-DATE
> Task :snapshots:clean UP-TO-DATE
> Task :soak:clean UP-TO-DATE
> Task :test-kit:clean UP-TO-DATE
> Task :testing-junit-platform:clean UP-TO-DATE
> Task :testing-jvm:clean UP-TO-DATE
> Task :testing-native:clean UP-TO-DATE
> Task :tooling-api:clean UP-TO-DATE
> Task :tooling-api-builders:clean UP-TO-DATE
> Task :testing-base:clean UP-TO-DATE
> Task :tooling-native:clean UP-TO-DATE
> Task :version-control:clean UP-TO-DATE
> Task :worker-services:clean UP-TO-DATE
> Task :worker-processes:clean UP-TO-DATE
> Task :workers:clean UP-TO-DATE
> Task :wrapper:clean UP-TO-DATE
> Task :wrapper-shared:clean UP-TO-DATE
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
336 actionable tasks: 213 executed, 5 from cache, 118 up-to-date
A build scan was not published as you have not authenticated with server 'ge.gradle.org'.
Configuration cache entry discarded with 4 problems.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
FAILURE: Build failed with an exception.
* What went wrong:
Configuration cache problems found in this build.
4 problems were found storing the configuration cache.
- Gradle runtime: cannot serialize Gradle script object references as these are not supported with the configuration cache.
See https://docs.gradle.org/7.5.1/userguide/configuration_cache.html#config_cache:requirements:disallowed_types
- Gradle runtime: cannot serialize object of type 'org.gradle.api.internal.project.DefaultProject', a subtype of 'org.gradle.api.Project', as these are not supported with the configuration cache.
See https://docs.gradle.org/7.5.1/userguide/configuration_cache.html#config_cache:requirements:disallowed_types
- Plugin 'gradlebuild.lifecycle': registration of listener on 'Gradle.buildFinished' is unsupported
See https://docs.gradle.org/7.5.1/userguide/configuration_cache.html#config_cache:requirements:build_listeners
- Unknown location: external process started 'git rev-parse --verify HEAD'
See https://docs.gradle.org/7.5.1/userguide/configuration_cache.html#config_cache:requirements:external_processes
See the complete report at file:///home/runner/work/sweekt-gradle/sweekt-gradle/build/reports/configuration-cache/2xrb7a79uza42zkf2efd9x5aq/f4txwdrtex1agpn5y7616bzy0/configuration-cache-report.html
> Starting an external process 'git rev-parse --verify HEAD' during configuration time is unsupported.
> Listener registration 'Gradle.buildFinished' by build 'gradle' is unsupported.
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 8m 39s
```
> View the full workflow running log in https://github.com/meowool/sweekt-gradle/actions/runs/4212557831
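The configuration cache report above points at the deprecated `Gradle.buildFinished` listener registered by `gradlebuild.lifecycle.gradle.kts`. A minimal sketch of the configuration-cache-compatible pattern (the service name and body below are illustrative, not taken from this repository) replaces the listener with a shared build service whose `close()` runs when the build finishes:

```kotlin
import org.gradle.api.services.BuildService
import org.gradle.api.services.BuildServiceParameters

// A build service implementing AutoCloseable: Gradle calls close() at the
// end of the build, after all tasks have completed, which is the supported
// substitute for Gradle.buildFinished under the configuration cache.
abstract class BuildFinishedService : BuildService<BuildServiceParameters.None>, AutoCloseable {
    override fun close() {
        println("Build finished")
    }
}

// Register the service once per build. Note it is only instantiated (and
// thus closed) if something actually uses it, e.g. a task referencing it
// via usesService(...).
gradle.sharedServices.registerIfAbsent("buildFinishedHook", BuildFinishedService::class) {}
```

This is only a sketch of the migration direction; the actual fix would need to move the end-of-build logic out of the `gradlebuild.lifecycle` script plugin into such a service.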
<!-- related: 7.5.1.1 -->
Failed to test the distribution on the `changed/v7.5.1` branch

## Affected Version
https://github.com/meowool/sweekt-gradle/releases/tag/v1
## Error
```console
$ ./gradlew clean -Dfile.encoding=UTF-8 -Duser.language=en
Downloading https://services.gradle.org/distributions/gradle-7.5.1-bin.zip
...........10%............20%...........30%............40%...........50%............60%...........70%............80%...........90%............100%
Welcome to Gradle 7.5.1!
Here are the highlights of this release:
- Support for Java 18
- Support for building with Groovy 4
- Much more responsive continuous builds
- Improved diagnostics for dependency resolution
For more details see https://docs.gradle.org/7.5.1/release-notes.html
Starting a Gradle Daemon (subsequent builds will be faster)
Configuration cache is an incubating feature.
Calculating task graph as no configuration cache is available for tasks: clean
> Task :build-logic-settings:build-logic-settings-plugin:generateExternalPluginSpecBuilders
> Task :build-logic-settings:build-logic-settings-plugin:extractPrecompiledScriptPluginPlugins
> Task :build-logic-settings:build-logic-settings-plugin:compilePluginsBlocks
> Task :build-logic-settings:build-logic-settings-plugin:generatePrecompiledScriptPluginAccessors
> Task :build-logic-settings:build-logic-settings-plugin:generateScriptPluginAdapters
> Task :build-logic-settings:build-logic-settings-plugin:pluginDescriptors
> Task :build-logic-settings:build-logic-settings-plugin:processResources
> Task :build-logic-settings:build-logic-settings-plugin:compileKotlin
> Task :build-logic-settings:build-logic-settings-plugin:compileJava NO-SOURCE
> Task :build-logic-settings:build-logic-settings-plugin:classes
> Task :build-logic-settings:build-logic-settings-plugin:inspectClassesForKotlinIC
> Task :build-logic-settings:build-logic-settings-plugin:jar
Type-safe project accessors is an incubating feature.
> Task :build-logic-commons:gradle-plugin:generateExternalPluginSpecBuilders
> Task :build-logic-commons:gradle-plugin:extractPrecompiledScriptPluginPlugins
> Task :build-logic-commons:gradle-plugin:compilePluginsBlocks
> Task :build-logic-commons:gradle-plugin:generatePrecompiledScriptPluginAccessors
> Task :build-logic-commons:gradle-plugin:generateScriptPluginAdapters
> Task :build-logic-commons:gradle-plugin:pluginDescriptors
> Task :build-logic-commons:gradle-plugin:processResources
> Task :build-logic-commons:gradle-plugin:compileKotlin
w: /home/runner/work/sweekt-gradle/sweekt-gradle/build-logic-commons/gradle-plugin/src/main/kotlin/gradlebuild.cache-miss-monitor.gradle.kts: (25, 26): 'afterTask(Action<Task!>): Unit' is deprecated. Deprecated in Java
w: /home/runner/work/sweekt-gradle/sweekt-gradle/build-logic-commons/gradle-plugin/src/main/kotlin/gradlebuild.code-quality.gradle.kts: (133, 13): 'withConvention(KClass<ConventionType>, ConventionType.() -> ReturnType): ReturnType' is deprecated. The concept of conventions is deprecated. Use extensions instead.
w: /home/runner/work/sweekt-gradle/sweekt-gradle/build-logic-commons/gradle-plugin/src/main/kotlin/gradlebuild.code-quality.gradle.kts: (133, 28): 'GroovySourceSet' is deprecated. Deprecated in Java
> Task :build-logic-commons:gradle-plugin:compileJava NO-SOURCE
> Task :build-logic-commons:gradle-plugin:classes
> Task :build-logic-commons:gradle-plugin:inspectClassesForKotlinIC
> Task :build-logic-commons:gradle-plugin:jar
> Task :build-logic:module-identity:extractPrecompiledScriptPluginPlugins
> Task :build-logic:module-identity:generateScriptPluginAdapters
> Task :build-logic:module-identity:pluginDescriptors
> Task :build-logic:module-identity:processResources
> Task :build-logic:cleanup:extractPluginRequests NO-SOURCE
> Task :build-logic:cleanup:generatePluginAdapters
> Task :build-logic:cleanup:extractPrecompiledScriptPluginPlugins
> Task :build-logic:cleanup:generateScriptPluginAdapters
> Task :build-logic:idea:extractPrecompiledScriptPluginPlugins
> Task :build-logic:idea:generateScriptPluginAdapters
> Task :build-logic:build-update-utils:extractPluginRequests NO-SOURCE
> Task :build-logic:build-update-utils:generatePluginAdapters FROM-CACHE
> Task :build-logic:build-update-utils:extractPrecompiledScriptPluginPlugins
> Task :build-logic:build-update-utils:generateScriptPluginAdapters
> Task :build-logic:build-update-utils:pluginDescriptors
> Task :build-logic:build-update-utils:processResources
> Task :build-logic:documentation:extractPluginRequests NO-SOURCE
> Task :build-logic:documentation:generatePluginAdapters FROM-CACHE
> Task :build-logic:documentation:pluginDescriptors
> Task :build-logic:documentation:processResources
> Task :build-logic:profiling:extractPrecompiledScriptPluginPlugins
> Task :build-logic:profiling:generateScriptPluginAdapters
> Task :build-logic:cleanup:pluginDescriptors
> Task :build-logic:cleanup:processResources
> Task :build-logic:idea:pluginDescriptors
> Task :build-logic:idea:processResources
> Task :build-logic:profiling:pluginDescriptors
> Task :build-logic:profiling:processResources
> Task :build-logic:root-build:extractPrecompiledScriptPluginPlugins
> Task :build-logic:root-build:generateScriptPluginAdapters
> Task :build-logic:root-build:pluginDescriptors
> Task :build-logic:root-build:processResources
> Task :build-logic:lifecycle:extractPrecompiledScriptPluginPlugins
> Task :build-logic:lifecycle:generateScriptPluginAdapters
> Task :build-logic:lifecycle:pluginDescriptors
> Task :build-logic:lifecycle:processResources
> Task :build-logic:basics:generateExternalPluginSpecBuilders
> Task :build-logic:basics:extractPrecompiledScriptPluginPlugins
> Task :build-logic:binary-compatibility:extractPluginRequests
> Task :build-logic:basics:compilePluginsBlocks
> Task :build-logic:binary-compatibility:generatePluginAdapters
> Task :build-logic:dependency-modules:extractPrecompiledScriptPluginPlugins
> Task :build-logic:dependency-modules:generateScriptPluginAdapters
> Task :build-logic:dependency-modules:pluginDescriptors
> Task :build-logic:dependency-modules:processResources
> Task :build-logic:integration-testing:extractPrecompiledScriptPluginPlugins
> Task :build-logic:integration-testing:generateScriptPluginAdapters
> Task :build-logic:integration-testing:pluginDescriptors
> Task :build-logic:integration-testing:processResources
> Task :build-logic:basics:generatePrecompiledScriptPluginAccessors
> Task :build-logic:performance-testing:extractPluginRequests
> Task :build-logic:basics:generateScriptPluginAdapters
> Task :build-logic:performance-testing:generatePluginAdapters
> Task :build-logic:binary-compatibility:pluginDescriptors
> Task :build-logic:binary-compatibility:processResources
> Task :build-logic:buildquality:extractPrecompiledScriptPluginPlugins
> Task :build-logic:performance-testing:pluginDescriptors
> Task :build-logic:performance-testing:processResources
> Task :build-logic:buildquality:generateScriptPluginAdapters
> Task :build-logic:buildquality:pluginDescriptors
> Task :build-logic:buildquality:processResources
> Task :build-logic:basics:pluginDescriptors
> Task :build-logic:basics:processResources
> Task :build-logic:basics:compileKotlin
> Task :build-logic:basics:compileJava NO-SOURCE
> Task :build-logic:basics:classes
> Task :build-logic:basics:inspectClassesForKotlinIC
> Task :build-logic:basics:jar
> Task :build-logic:module-identity:generateExternalPluginSpecBuilders
> Task :build-logic:idea:generateExternalPluginSpecBuilders
> Task :build-logic:module-identity:compilePluginsBlocks
> Task :build-logic:module-identity:generatePrecompiledScriptPluginAccessors
> Task :build-logic:idea:compilePluginsBlocks
> Task :build-logic:idea:generatePrecompiledScriptPluginAccessors
> Task :build-logic:idea:compileKotlin
> Task :build-logic:idea:compileJava NO-SOURCE
> Task :build-logic:build-update-utils:generateExternalPluginSpecBuilders
> Task :build-logic:module-identity:compileKotlin
> Task :build-logic:module-identity:compileJava NO-SOURCE
> Task :build-logic:module-identity:classes
> Task :build-logic:module-identity:inspectClassesForKotlinIC
> Task :build-logic:module-identity:jar
> Task :build-logic:build-update-utils:compilePluginsBlocks
> Task :build-logic:cleanup:generateExternalPluginSpecBuilders
> Task :build-logic:build-update-utils:generatePrecompiledScriptPluginAccessors
> Task :build-logic:cleanup:compilePluginsBlocks
> Task :build-logic:cleanup:generatePrecompiledScriptPluginAccessors
> Task :build-logic:build-update-utils:compileKotlin
> Task :build-logic:build-update-utils:compileJava NO-SOURCE
> Task :build-logic:build-update-utils:compileGroovy NO-SOURCE
> Task :build-logic:build-update-utils:compileGroovyPlugins NO-SOURCE
> Task :build-logic:build-update-utils:classes
> Task :build-logic:build-update-utils:inspectClassesForKotlinIC
> Task :build-logic:cleanup:compileKotlin
> Task :build-logic:cleanup:compileJava NO-SOURCE
> Task :build-logic:cleanup:compileGroovy NO-SOURCE
> Task :build-logic:cleanup:compileGroovyPlugins NO-SOURCE
> Task :build-logic:cleanup:classes
> Task :build-logic:cleanup:inspectClassesForKotlinIC
> Task :build-logic:build-update-utils:jar
> Task :build-logic:documentation:compileKotlin NO-SOURCE
> Task :build-logic:documentation:compileJava NO-SOURCE
> Task :build-logic:cleanup:jar
> Task :build-logic:idea:classes
> Task :build-logic:idea:inspectClassesForKotlinIC
> Task :build-logic:idea:jar
> Task :build-logic:lifecycle:generateExternalPluginSpecBuilders FROM-CACHE
> Task :build-logic:lifecycle:compilePluginsBlocks
> Task :build-logic:lifecycle:generatePrecompiledScriptPluginAccessors
> Task :build-logic:lifecycle:compileKotlin
w: /home/runner/work/sweekt-gradle/sweekt-gradle/build-logic/lifecycle/src/main/kotlin/gradlebuild.lifecycle.gradle.kts: (69, 16): 'buildFinished(Action<in BuildResult!>): Unit' is deprecated. Deprecated in Java
> Task :build-logic:lifecycle:compileJava NO-SOURCE
> Task :build-logic:lifecycle:classes
> Task :build-logic:lifecycle:inspectClassesForKotlinIC
> Task :build-logic:lifecycle:jar
> Task :build-logic:binary-compatibility:compileKotlin
'compileJava' task (current target is 11) and 'compileKotlin' task (current target is 1.8) jvm target compatibility should be set to the same Java version.
w: /home/runner/work/sweekt-gradle/sweekt-gradle/build-logic/binary-compatibility/src/main/kotlin/gradlebuild/binarycompatibility/metadata/HasKotlinFlagsMetadataQuery.kt: (295, 22): This declaration overrides deprecated member but not marked as deprecated itself. This deprecation won't be inherited in future releases. Please add @Deprecated annotation or suppress
> Task :build-logic:documentation:compileGroovy
Groovy compilation avoidance is an incubating feature.
> Task :build-logic:profiling:generateExternalPluginSpecBuilders
> Task :build-logic:documentation:compileGroovyPlugins NO-SOURCE
> Task :build-logic:documentation:classes
> Task :build-logic:documentation:inspectClassesForKotlinIC
> Task :build-logic:documentation:jar
> Task :build-logic:profiling:compilePluginsBlocks
> Task :build-logic:binary-compatibility:compileJava
> Task :build-logic:binary-compatibility:compileGroovy
> Task :build-logic:dependency-modules:generateExternalPluginSpecBuilders FROM-CACHE
> Task :build-logic:dependency-modules:compilePluginsBlocks FROM-CACHE
> Task :build-logic:dependency-modules:generatePrecompiledScriptPluginAccessors
> Task :build-logic:profiling:generatePrecompiledScriptPluginAccessors
> Task :build-logic:dependency-modules:compileKotlin
> Task :build-logic:dependency-modules:compileJava NO-SOURCE
> Task :build-logic:integration-testing:generateExternalPluginSpecBuilders
> Task :build-logic:dependency-modules:classes
> Task :build-logic:dependency-modules:inspectClassesForKotlinIC
> Task :build-logic:dependency-modules:jar
> Task :build-logic:integration-testing:compilePluginsBlocks
> Task :build-logic:integration-testing:generatePrecompiledScriptPluginAccessors
> Task :build-logic:profiling:compileKotlin
w: /home/runner/work/sweekt-gradle/sweekt-gradle/build-logic/profiling/src/main/kotlin/gradlebuild.buildscan.gradle.kts: (192, 22): 'afterTask(Action<Task!>): Unit' is deprecated. Deprecated in Java
> Task :build-logic:profiling:compileJava NO-SOURCE
> Task :build-logic:root-build:generateExternalPluginSpecBuilders
> Task :build-logic:profiling:classes
> Task :build-logic:profiling:inspectClassesForKotlinIC
> Task :build-logic:profiling:jar
> Task :build-logic:root-build:compilePluginsBlocks
> Task :build-logic:root-build:generatePrecompiledScriptPluginAccessors
> Task :build-logic:root-build:compileKotlin
> Task :build-logic:root-build:compileJava NO-SOURCE
> Task :build-logic:root-build:classes
> Task :build-logic:root-build:inspectClassesForKotlinIC
> Task :build-logic:root-build:jar
> Task :build-logic:binary-compatibility:compileGroovyPlugins
> Task :build-logic:binary-compatibility:classes
> Task :build-logic:binary-compatibility:inspectClassesForKotlinIC
> Task :build-logic:binary-compatibility:jar
> Task :build-logic:integration-testing:compileKotlin
> Task :build-logic:integration-testing:compileJava NO-SOURCE
> Task :build-logic:integration-testing:classes
> Task :build-logic:integration-testing:inspectClassesForKotlinIC
> Task :build-logic:integration-testing:jar
> Task :build-logic:performance-testing:compileGroovy
> Task :build-logic:performance-testing:compileKotlin
'compileJava' task (current target is 11) and 'compileKotlin' task (current target is 1.8) jvm target compatibility should be set to the same Java version.
> Task :build-logic:performance-testing:compileJava
> Task :build-logic:performance-testing:compileGroovyPlugins
> Task :build-logic:performance-testing:classes
> Task :build-logic:performance-testing:inspectClassesForKotlinIC
> Task :build-logic:performance-testing:jar
> Task :build-logic:buildquality:generateExternalPluginSpecBuilders
> Task :build-logic:buildquality:compilePluginsBlocks
> Task :build-logic:buildquality:generatePrecompiledScriptPluginAccessors
> Task :build-logic:buildquality:compileKotlin
> Task :build-logic:buildquality:compileJava NO-SOURCE
> Task :build-logic:buildquality:classes
> Task :build-logic:buildquality:inspectClassesForKotlinIC
> Task :build-logic:buildquality:jar
> Task :build-logic:publishing:generateExternalPluginSpecBuilders
> Task :build-logic:publishing:extractPrecompiledScriptPluginPlugins
> Task :build-logic:jvm:generateExternalPluginSpecBuilders
> Task :build-logic:jvm:extractPrecompiledScriptPluginPlugins
> Task :build-logic:publishing:compilePluginsBlocks
> Task :build-logic:jvm:compilePluginsBlocks
> Task :build-logic:publishing:generatePrecompiledScriptPluginAccessors
> Task :build-logic:publishing:generateScriptPluginAdapters
> Task :build-logic:jvm:generatePrecompiledScriptPluginAccessors
> Task :build-logic:jvm:generateScriptPluginAdapters
> Task :build-logic:publishing:compileKotlin
> Task :build-logic:publishing:compileJava NO-SOURCE
> Task :build-logic:jvm:pluginDescriptors
> Task :build-logic:jvm:processResources
> Task :build-logic:publishing:pluginDescriptors
> Task :build-logic:publishing:processResources
> Task :build-logic:publishing:classes
> Task :build-logic:publishing:inspectClassesForKotlinIC
> Task :build-logic:publishing:jar
> Task :build-logic:uber-plugins:extractPrecompiledScriptPluginPlugins
> Task :build-logic:uber-plugins:generateScriptPluginAdapters
> Task :build-logic:uber-plugins:pluginDescriptors
> Task :build-logic:uber-plugins:processResources
> Task :build-logic:jvm:compileKotlin
> Task :build-logic:jvm:compileJava NO-SOURCE
> Task :build-logic:jvm:classes
> Task :build-logic:jvm:inspectClassesForKotlinIC
> Task :build-logic:jvm:jar
> Task :build-logic:uber-plugins:generateExternalPluginSpecBuilders
> Task :build-logic:uber-plugins:compilePluginsBlocks
> Task :build-logic:uber-plugins:generatePrecompiledScriptPluginAccessors
> Task :build-logic:uber-plugins:compileKotlin
> Task :build-logic:uber-plugins:compileJava NO-SOURCE
> Task :build-logic:uber-plugins:classes
> Task :build-logic:uber-plugins:inspectClassesForKotlinIC
> Task :build-logic:uber-plugins:jar
> Task :build-logic:packaging:generateExternalPluginSpecBuilders
> Task :build-logic:packaging:extractPrecompiledScriptPluginPlugins
> Task :build-logic:packaging:compilePluginsBlocks
> Task :build-logic:packaging:generatePrecompiledScriptPluginAccessors
> Task :build-logic:packaging:generateScriptPluginAdapters
> Task :build-logic:packaging:pluginDescriptors
> Task :build-logic:packaging:processResources
> Task :build-logic:packaging:compileKotlin
> Task :build-logic:packaging:compileJava NO-SOURCE
> Task :build-logic:packaging:classes
> Task :build-logic:packaging:inspectClassesForKotlinIC
> Task :build-logic:packaging:jar
> Task :build-logic:kotlin-dsl:generateExternalPluginSpecBuilders
> Task :build-logic:kotlin-dsl:extractPrecompiledScriptPluginPlugins
> Task :build-logic:kotlin-dsl:compilePluginsBlocks
> Task :build-logic:kotlin-dsl:generatePrecompiledScriptPluginAccessors
> Task :build-logic:kotlin-dsl:generateScriptPluginAdapters
> Task :build-logic:kotlin-dsl:pluginDescriptors
> Task :build-logic:kotlin-dsl:processResources
> Task :build-logic:kotlin-dsl:compileKotlin
> Task :build-logic:kotlin-dsl:compileJava NO-SOURCE
> Task :build-logic:kotlin-dsl:classes
> Task :build-logic:kotlin-dsl:inspectClassesForKotlinIC
> Task :build-logic:kotlin-dsl:jar
> Task :build-logic:build-init-samples:generateExternalPluginSpecBuilders
> Task :build-logic:build-init-samples:extractPrecompiledScriptPluginPlugins
> Task :build-logic:build-init-samples:compilePluginsBlocks
> Task :build-logic:build-init-samples:generatePrecompiledScriptPluginAccessors
> Task :build-logic:build-init-samples:generateScriptPluginAdapters
> Task :build-logic:build-init-samples:pluginDescriptors
> Task :build-logic:build-init-samples:processResources
> Task :build-logic:build-init-samples:compileKotlin
> Task :build-logic:build-init-samples:compileJava NO-SOURCE
> Task :build-logic:build-init-samples:classes
> Task :build-logic:build-init-samples:inspectClassesForKotlinIC
> Task :build-logic:build-init-samples:jar
> Task :clean UP-TO-DATE
> Task :antlr:clean UP-TO-DATE
> Task :api-metadata:clean UP-TO-DATE
> Task :architecture-test:clean UP-TO-DATE
> Task :base-annotations:clean UP-TO-DATE
> Task :base-services:clean UP-TO-DATE
> Task :bootstrap:clean UP-TO-DATE
> Task :build-cache:clean UP-TO-DATE
> Task :build-cache-base:clean UP-TO-DATE
> Task :base-services-groovy:clean UP-TO-DATE
> Task :build-cache-http:clean UP-TO-DATE
> Task :build-cache-packaging:clean UP-TO-DATE
> Task :build-events:clean UP-TO-DATE
> Task :build-init:clean UP-TO-DATE
> Task :build-operations:clean UP-TO-DATE
> Task :build-option:clean UP-TO-DATE
> Task :build-profile:clean UP-TO-DATE
> Task :build-scan-performance:clean UP-TO-DATE
> Task :cli:clean UP-TO-DATE
> Task :code-quality:clean UP-TO-DATE
> Task :composite-builds:clean UP-TO-DATE
> Task :configuration-cache:clean UP-TO-DATE
> Task :core:clean UP-TO-DATE
> Task :core-api:clean UP-TO-DATE
> Task :core-platform:clean UP-TO-DATE
> Task :diagnostics:clean UP-TO-DATE
> Task :distributions-basics:clean UP-TO-DATE
> Task :dependency-management:clean UP-TO-DATE
> Task :distributions-dependencies:clean UP-TO-DATE
> Task :distributions-core:clean UP-TO-DATE
> Task :distributions-full:clean UP-TO-DATE
> Task :distributions-integ-tests:clean UP-TO-DATE
> Task :distributions-jvm:clean UP-TO-DATE
> Task :distributions-publishing:clean UP-TO-DATE
> Task :distributions-native:clean UP-TO-DATE
> Task :docs:clean UP-TO-DATE
> Task :ear:clean UP-TO-DATE
> Task :enterprise:clean UP-TO-DATE
> Task :enterprise-logging:clean UP-TO-DATE
> Task :enterprise-operations:clean UP-TO-DATE
> Task :enterprise-workers:clean UP-TO-DATE
> Task :execution:clean UP-TO-DATE
> Task :file-temp:clean UP-TO-DATE
> Task :file-collections:clean UP-TO-DATE
> Task :files:clean UP-TO-DATE
> Task :file-watching:clean UP-TO-DATE
> Task :functional:clean UP-TO-DATE
> Task :ide:clean UP-TO-DATE
> Task :hashing:clean UP-TO-DATE
> Task :installation-beacon:clean UP-TO-DATE
> Task :ide-native:clean UP-TO-DATE
> Task :integ-test:clean UP-TO-DATE
> Task :internal-build-reports:clean UP-TO-DATE
> Task :internal-integ-testing:clean UP-TO-DATE
> Task :internal-performance-testing:clean UP-TO-DATE
> Task :internal-testing:clean UP-TO-DATE
> Task :ivy:clean UP-TO-DATE
> Task :jacoco:clean UP-TO-DATE
> Task :java-compiler-plugin:clean UP-TO-DATE
> Task :jvm-services:clean UP-TO-DATE
> Task :kotlin-compiler-embeddable:clean UP-TO-DATE
> Task :kotlin-dsl:clean UP-TO-DATE
> Task :kotlin-dsl-plugins:clean UP-TO-DATE
> Task :kotlin-dsl-integ-tests:clean UP-TO-DATE
> Task :kotlin-dsl-provider-plugins:clean UP-TO-DATE
> Task :kotlin-dsl-tooling-builders:clean UP-TO-DATE
> Task :kotlin-dsl-tooling-models:clean UP-TO-DATE
> Task :language-groovy:clean UP-TO-DATE
> Task :language-java:clean UP-TO-DATE
> Task :language-jvm:clean UP-TO-DATE
> Task :language-native:clean UP-TO-DATE
> Task :launcher:clean UP-TO-DATE
> Task :logging:clean UP-TO-DATE
> Task :logging-api:clean UP-TO-DATE
> Task :maven:clean UP-TO-DATE
> Task :messaging:clean UP-TO-DATE
> Task :model-groovy:clean UP-TO-DATE
> Task :model-core:clean UP-TO-DATE
> Task :native:clean UP-TO-DATE
> Task :normalization-java:clean UP-TO-DATE
> Task :performance:clean UP-TO-DATE
> Task :persistent-cache:clean UP-TO-DATE
> Task :platform-base:clean UP-TO-DATE
> Task :platform-jvm:clean UP-TO-DATE
> Task :platform-native:clean UP-TO-DATE
> Task :plugin-use:clean UP-TO-DATE
> Task :plugin-development:clean UP-TO-DATE
> Task :plugins:clean UP-TO-DATE
> Task :problems:clean UP-TO-DATE
> Task :process-services:clean UP-TO-DATE
> Task :publish:clean UP-TO-DATE
> Task :resources:clean UP-TO-DATE
> Task :reporting:clean UP-TO-DATE
> Task :resources-gcs:clean UP-TO-DATE
> Task :resources-http:clean UP-TO-DATE
> Task :resources-s3:clean UP-TO-DATE
> Task :resources-sftp:clean UP-TO-DATE
> Task :samples:clean UP-TO-DATE
> Task :scala:clean UP-TO-DATE
> Task :security:clean UP-TO-DATE
> Task :signing:clean UP-TO-DATE
> Task :smoke-test:clean UP-TO-DATE
> Task :snapshots:clean UP-TO-DATE
> Task :soak:clean UP-TO-DATE
> Task :test-kit:clean UP-TO-DATE
> Task :testing-junit-platform:clean UP-TO-DATE
> Task :testing-jvm:clean UP-TO-DATE
> Task :testing-native:clean UP-TO-DATE
> Task :tooling-api:clean UP-TO-DATE
> Task :tooling-api-builders:clean UP-TO-DATE
> Task :testing-base:clean UP-TO-DATE
> Task :tooling-native:clean UP-TO-DATE
> Task :version-control:clean UP-TO-DATE
> Task :worker-services:clean UP-TO-DATE
> Task :worker-processes:clean UP-TO-DATE
> Task :workers:clean UP-TO-DATE
> Task :wrapper:clean UP-TO-DATE
> Task :wrapper-shared:clean UP-TO-DATE
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
336 actionable tasks: 213 executed, 5 from cache, 118 up-to-date
A build scan was not published as you have not authenticated with server 'ge.gradle.org'.
Configuration cache entry discarded with 4 problems.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
FAILURE: Build failed with an exception.
* What went wrong:
Configuration cache problems found in this build.
4 problems were found storing the configuration cache.
- Gradle runtime: cannot serialize Gradle script object references as these are not supported with the configuration cache.
See https://docs.gradle.org/7.5.1/userguide/configuration_cache.html#config_cache:requirements:disallowed_types
- Gradle runtime: cannot serialize object of type 'org.gradle.api.internal.project.DefaultProject', a subtype of 'org.gradle.api.Project', as these are not supported with the configuration cache.
See https://docs.gradle.org/7.5.1/userguide/configuration_cache.html#config_cache:requirements:disallowed_types
- Plugin 'gradlebuild.lifecycle': registration of listener on 'Gradle.buildFinished' is unsupported
See https://docs.gradle.org/7.5.1/userguide/configuration_cache.html#config_cache:requirements:build_listeners
- Unknown location: external process started 'git rev-parse --verify HEAD'
See https://docs.gradle.org/7.5.1/userguide/configuration_cache.html#config_cache:requirements:external_processes
See the complete report at file:///home/runner/work/sweekt-gradle/sweekt-gradle/build/reports/configuration-cache/2xrb7a79uza42zkf2efd9x5aq/f4txwdrtex1agpn5y7616bzy0/configuration-cache-report.html
> Starting an external process 'git rev-parse --verify HEAD' during configuration time is unsupported.
> Listener registration 'Gradle.buildFinished' by build 'gradle' is unsupported.
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 8m 39s
```
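For reference, the two unsupported patterns flagged above (`git rev-parse` started at configuration time, and a `Gradle.buildFinished` listener) usually have configuration-cache-safe replacements in Gradle 7.5+. A minimal sketch in Gradle Kotlin DSL — not the actual sweekt-gradle fix, and names like `BuildFinishedService` are illustrative:

```kotlin
import org.gradle.api.provider.Provider
import org.gradle.api.services.BuildService
import org.gradle.api.services.BuildServiceParameters
import org.gradle.build.event.BuildEventsListenerRegistry
import org.gradle.kotlin.dsl.support.serviceOf
import org.gradle.tooling.events.FinishEvent
import org.gradle.tooling.events.OperationCompletionListener

// 1. Wrap the external process in a Provider instead of running it eagerly;
//    providers.exec is tracked by the configuration cache (Gradle 7.5+).
val gitHead: Provider<String> = providers.exec {
    commandLine("git", "rev-parse", "--verify", "HEAD")
}.standardOutput.asText.map { it.trim() }

// 2. Replace Gradle.buildFinished with a shared build service whose close()
//    runs when the build finishes; register it as a task-completion listener.
abstract class BuildFinishedService :
    BuildService<BuildServiceParameters.None>,
    OperationCompletionListener, AutoCloseable {

    override fun onFinish(event: FinishEvent) = Unit // per-task events, unused here

    override fun close() {
        // end-of-build logic previously placed in buildFinished { ... }
        println("Build finished")
    }
}

val listener = gradle.sharedServices.registerIfAbsent(
    "buildFinishedListener", BuildFinishedService::class
) {}
serviceOf<BuildEventsListenerRegistry>().onTaskCompletion(listener)
```

The remaining two problems (serializing script object references and a `Project` instance) generally mean a task or listener captures the script or `project` at execution time, and need the offending capture replaced with `Provider`-based inputs.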
> View the full workflow run log at https://github.com/meowool/sweekt-gradle/actions/runs/4212557831
<!-- related: 7.5.1.1 -->
| 1
|
82,072
| 7,812,009,624
|
IssuesEvent
|
2018-06-12 12:06:58
|
AllKinds/Katsefet
|
https://api.github.com/repos/AllKinds/Katsefet
|
opened
|
Test on user router
|
Study Testing
|
Study automated router testing (considering 'supertest').
Create a spec file on the user api:
- [ ] get user by id
- [ ] get all users
- [ ] post new user
- [ ] delete user
- [ ] update user
|
1.0
|
Test on user router - Study automated router testing (considering 'supertest').
Create a spec file on the user api:
- [ ] get user by id
- [ ] get all users
- [ ] post new user
- [ ] delete user
- [ ] update user
|
test
|
test on user router study on router automatic tester thinking about supertest create a spec file on the user api get user by id get all users post new user delete user update user
| 1
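The row above asks for an automated spec over the user router's CRUD endpoints, naming supertest (a Node.js library). As a rough analogue of the same idea in Python — spin up the app in-process, issue real HTTP requests, assert on status and body — here is a minimal sketch using only the standard library. The `/users` routes, the `UserHandler` stand-in, and the seeded data are all invented for illustration; a real spec would target the project's actual Express app.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# In-memory "database"; a real spec would reset this between tests.
USERS = {"1": {"id": "1", "name": "alice"}}

class UserHandler(BaseHTTPRequestHandler):
    """Minimal stand-in for the user router under test."""

    def _send(self, code, payload):
        body = json.dumps(payload).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_GET(self):
        if self.path == "/users":          # get all users
            self._send(200, list(USERS.values()))
        else:                              # get user by id
            uid = self.path.rsplit("/", 1)[-1]
            if uid in USERS:
                self._send(200, USERS[uid])
            else:
                self._send(404, {"error": "not found"})

    def log_message(self, *args):
        pass  # keep test output quiet

def get_json(url):
    with urllib.request.urlopen(url) as resp:
        return resp.status, json.loads(resp.read())

# Start the server on a free port in a background thread, exercise it, shut down.
server = HTTPServer(("127.0.0.1", 0), UserHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

status, user = get_json(f"http://127.0.0.1:{port}/users/1")
list_status, users = get_json(f"http://127.0.0.1:{port}/users")
server.shutdown()
```

supertest does essentially this binding-to-an-ephemeral-port dance for you, which is why it suits the "get user by id / get all users / post / delete / update" checklist above.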
|
183,328
| 31,301,825,104
|
IssuesEvent
|
2023-08-23 00:40:09
|
compilerla/compiler.la
|
https://api.github.com/repos/compilerla/compiler.la
|
closed
|
Consider max width for site
|
design
|
Just a thought: I have a super widescreen, and the site currently expands to fill the entire viewport. Not necessarily a problem, but it might be more readable if we imposed a max width with the content center aligned and increased margins/padding on the sides.
Here's a screenshot of the homepage at full width on my screen:
<img width="1874" alt="image" src="https://user-images.githubusercontent.com/1783439/208964325-493bd703-6b14-40b3-af1f-d567c402f626.png">
And another of the current job posting:
<img width="1861" alt="image" src="https://user-images.githubusercontent.com/1783439/208964428-fa17f1b3-aa4f-41a5-a00d-c52cd16c71b6.png">
|
1.0
|
Consider max width for site - Just a thought: I have a super widescreen, and the site currently expands to fill the entire viewport. Not necessarily a problem, but it might be more readable if we imposed a max width with the content center aligned and increased margins/padding on the sides.
Here's a screenshot of the homepage at full width on my screen:
<img width="1874" alt="image" src="https://user-images.githubusercontent.com/1783439/208964325-493bd703-6b14-40b3-af1f-d567c402f626.png">
And another of the current job posting:
<img width="1861" alt="image" src="https://user-images.githubusercontent.com/1783439/208964428-fa17f1b3-aa4f-41a5-a00d-c52cd16c71b6.png">
|
non_test
|
consider max width for site just a thought i have a super widescreen and the site currently expands to fill the entire viewport not necessarily a problem but it might be more readable if we imposed a max width with the content center aligned and increased margins padding on the sides here s a screenshot of the homepage at full width on my screen img width alt image src and another of the current job posting img width alt image src
| 0
|
296,970
| 22,333,854,447
|
IssuesEvent
|
2022-06-14 16:38:55
|
dotnet/interactive
|
https://api.github.com/repos/dotnet/interactive
|
closed
|
Add documentation for JSON APIs
|
Area-Documentation
|
Documentation is needed for the JSON-based APIs available in `stdio` and `http` modes.
|
1.0
|
Add documentation for JSON APIs - Documentation is needed for the JSON-based APIs available in `stdio` and `http` modes.
|
non_test
|
add documentation for json apis documentation is needed for the json based apis available in stdio and http modes
| 0
|
79,903
| 15,300,350,346
|
IssuesEvent
|
2021-02-24 12:11:46
|
Alice52/Algorithms
|
https://api.github.com/repos/Alice52/Algorithms
|
closed
|
[daily] 2021-02-16 [349. Intersection of Two Arrays]
|
binary-search easy hash-table leetcode raw-question sort two-pointers
|
## 1. [Question Description](https://leetcode.com/problems/intersection-of-two-arrays/)
1. Given two arrays, write a function to compute their intersection.
2. Each element in the result must be unique.
3. The result can be in any order.
## 2. Example
```txt
Input: nums1 = [1,2,2,1], nums2 = [2,2]
Output: [2]
Input: nums1 = [4,9,5], nums2 = [9,4,9,8,4]
Output: [9,4]
```
## 3. Explain
1. Find the intersection elements of the two arrays; if the same number appears multiple times in the intersection, output it only once.
## 4. Core Thinking
1. xxx
2. xxx
## 5. Implement Task
- [x] 1. java
- [x] 2. golang
## 6. Animation
- N/A
## 7. Conclusion
- N/A
## 8. Best Practice
1. Timing: O(n)
2. Spacing: O(n)
## 9. Similar Issue
- N/A
|
1.0
|
[daily] 2021-02-16 [349. Intersection of Two Arrays] - ## 1. [Question Description](https://leetcode.com/problems/intersection-of-two-arrays/)
1. Given two arrays, write a function to compute their intersection.
2. Each element in the result must be unique.
3. The result can be in any order.
## 2. Example
```txt
Input: nums1 = [1,2,2,1], nums2 = [2,2]
Output: [2]
Input: nums1 = [4,9,5], nums2 = [9,4,9,8,4]
Output: [9,4]
```
## 3. Explain
1. Find the intersection elements of the two arrays; if the same number appears multiple times in the intersection, output it only once.
## 4. Core Thinking
1. xxx
2. xxx
## 5. Implement Task
- [x] 1. java
- [x] 2. golang
## 6. Animation
- N/A
## 7. Conclusion
- N/A
## 8. Best Practice
1. Timing: O(n)
2. Spacing: O(n)
## 9. Similar Issue
- N/A
|
non_test
|
given two arrays write a function to compute their intersection each element in the result must be unique the result can be in any order example txt input output input output explain find the intersection elements of the two arrays if the same number appears multiple times in the intersection output it only once core thinking xxx xxx implement task java golang animation n a conclusion n a best practice timing o n spacing o n similar issue n a
| 0
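The issue above leaves its "Core Thinking" section as `xxx` but pins the target complexity at O(n) time and O(n) space, which points at the standard hash-set approach. A sketch of that approach (the issue's own implementations are in Java and Go; this Python version is ours, for illustration):

```python
# Hash-set intersection: O(n + m) time, O(min(n, m)) extra space,
# matching the issue's "Best Practice" complexity notes.
def intersection(nums1, nums2):
    # Build a set from the smaller array, then keep each match only once.
    small, large = sorted((nums1, nums2), key=len)
    seen = set(small)
    return list({x for x in large if x in seen})

print(sorted(intersection([1, 2, 2, 1], [2, 2])))        # [2]
print(sorted(intersection([4, 9, 5], [9, 4, 9, 8, 4])))  # [4, 9]
```

The set comprehension deduplicates matches, satisfying "each element in the result must be unique"; since the result can be in any order, sorting here is only for stable display.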
|
132,101
| 18,266,109,847
|
IssuesEvent
|
2021-10-04 08:38:57
|
artsking/linux-3.0.35_CVE-2020-15436_withPatch
|
https://api.github.com/repos/artsking/linux-3.0.35_CVE-2020-15436_withPatch
|
closed
|
CVE-2018-11506 (High) detected in linux-stable-rtv3.8.6 - autoclosed
|
security vulnerability
|
## CVE-2018-11506 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv3.8.6</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/artsking/linux-3.0.35_CVE-2020-15436_withPatch/commit/594a70cb9871ddd73cf61197bb1a2a1b1777a7ae">594a70cb9871ddd73cf61197bb1a2a1b1777a7ae</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/scsi/sr_ioctl.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/scsi/sr_ioctl.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The sr_do_ioctl function in drivers/scsi/sr_ioctl.c in the Linux kernel through 4.16.12 allows local users to cause a denial of service (stack-based buffer overflow) or possibly have unspecified other impact because sense buffers have different sizes at the CDROM layer and the SCSI layer, as demonstrated by a CDROMREADMODE2 ioctl call.
<p>Publish Date: 2018-05-28
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11506>CVE-2018-11506</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2018-11506">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2018-11506</a></p>
<p>Release Date: 2018-05-28</p>
<p>Fix Resolution: v4.17-rc7</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2018-11506 (High) detected in linux-stable-rtv3.8.6 - autoclosed - ## CVE-2018-11506 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv3.8.6</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/artsking/linux-3.0.35_CVE-2020-15436_withPatch/commit/594a70cb9871ddd73cf61197bb1a2a1b1777a7ae">594a70cb9871ddd73cf61197bb1a2a1b1777a7ae</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/scsi/sr_ioctl.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/scsi/sr_ioctl.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The sr_do_ioctl function in drivers/scsi/sr_ioctl.c in the Linux kernel through 4.16.12 allows local users to cause a denial of service (stack-based buffer overflow) or possibly have unspecified other impact because sense buffers have different sizes at the CDROM layer and the SCSI layer, as demonstrated by a CDROMREADMODE2 ioctl call.
<p>Publish Date: 2018-05-28
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11506>CVE-2018-11506</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2018-11506">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2018-11506</a></p>
<p>Release Date: 2018-05-28</p>
<p>Fix Resolution: v4.17-rc7</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve high detected in linux stable autoclosed cve high severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in head commit a href found in base branch master vulnerable source files drivers scsi sr ioctl c drivers scsi sr ioctl c vulnerability details the sr do ioctl function in drivers scsi sr ioctl c in the linux kernel through allows local users to cause a denial of service stack based buffer overflow or possibly have unspecified other impact because sense buffers have different sizes at the cdrom layer and the scsi layer as demonstrated by a ioctl call publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
21,415
| 11,215,870,287
|
IssuesEvent
|
2020-01-07 03:57:00
|
tensorflow/tensorflow
|
https://api.github.com/repos/tensorflow/tensorflow
|
closed
|
Unrolled LSTM terrible performance when tf.function is used
|
TF 2.0 comp:autograph comp:keras stat:awaiting tensorflower type:performance
|
**System information**
- Have I written custom code: Yes
- OS Platform and Distribution: Windows 10
- TensorFlow installed from : binary
- TensorFlow version : 2.0
- Python version: 3.6.9
- CUDA/cuDNN version: no CUDA
- GPU model and memory: no GPU
Well I thought I knew what tf.function does, but these two pieces of code confused me:
This does not use tf.function:
```
import time
import tensorflow as tf
rnn = tf.keras.layers.LSTM(256, return_state=True, unroll=True)
def go():
return rnn(tf.zeros((1, 300, 256)))
st = time.time()
with tf.GradientTape() as tape:
x = go()
print(f'graph: {time.time() - st}') # --> graph: 0.20923495292663574
st = time.time()
gradient = tape.gradient(x[-1], rnn.trainable_weights) # --> gradient: 0.4629485607147217
print(f'gradient: {time.time() - st}')
```
While this does:
```
import time
import tensorflow as tf
rnn = tf.keras.layers.LSTM(256, return_state=True, unroll=True)
@tf.function
def go():
return rnn(tf.zeros((1, 300, 256)))
st = time.time()
with tf.GradientTape() as tape:
x = go()
print(f'graph: {time.time() - st}') # --> graph: 12.843713283538818
st = time.time()
gradient = tape.gradient(x[-1], rnn.trainable_weights) # --> gradient: 2.7147161960601807
print(f'gradient: {time.time() - st}')
```
This difference in performance only happens when unroll=True. Is this expected?
|
True
|
Unrolled LSTM terrible performance when tf.function is used - **System information**
- Have I written custom code: Yes
- OS Platform and Distribution: Windows 10
- TensorFlow installed from : binary
- TensorFlow version : 2.0
- Python version: 3.6.9
- CUDA/cuDNN version: no CUDA
- GPU model and memory: no GPU
Well I thought I knew what tf.function does, but these two pieces of code confused me:
This does not use tf.function:
```
import time
import tensorflow as tf
rnn = tf.keras.layers.LSTM(256, return_state=True, unroll=True)
def go():
return rnn(tf.zeros((1, 300, 256)))
st = time.time()
with tf.GradientTape() as tape:
x = go()
print(f'graph: {time.time() - st}') # --> graph: 0.20923495292663574
st = time.time()
gradient = tape.gradient(x[-1], rnn.trainable_weights) # --> gradient: 0.4629485607147217
print(f'gradient: {time.time() - st}')
```
While this does:
```
import time
import tensorflow as tf
rnn = tf.keras.layers.LSTM(256, return_state=True, unroll=True)
@tf.function
def go():
return rnn(tf.zeros((1, 300, 256)))
st = time.time()
with tf.GradientTape() as tape:
x = go()
print(f'graph: {time.time() - st}') # --> graph: 12.843713283538818
st = time.time()
gradient = tape.gradient(x[-1], rnn.trainable_weights) # --> gradient: 2.7147161960601807
print(f'gradient: {time.time() - st}')
```
This difference in performance only happens when unroll=True. Is this expected?
|
non_test
|
unrolled lstm terrible performance when tf function is used system information have i written custom code yes os platform and distribution windows tensorflow installed from binary tensorflow version python version cuda cudnn version no cuda gpu model and memory no gpu well i thought i knew what tf function does but these two pieces of code confused me this does not use tf function import time import tensorflow as tf rnn tf keras layers lstm return state true unroll true def go return rnn tf zeros st time time with tf gradienttape as tape x go print f graph time time st graph st time time gradient tape gradient x rnn trainable weights gradient print f gradient time time st while this does import time import tensorflow as tf rnn tf keras layers lstm return state true unroll true tf function def go return rnn tf zeros st time time with tf gradienttape as tape x go print f graph time time st graph st time time gradient tape gradient x rnn trainable weights gradient print f gradient time time st this difference in performance only happens when unroll true is this expected
| 0
|
49,001
| 3,001,526,445
|
IssuesEvent
|
2015-07-24 12:01:27
|
enviroCar/enviroCar-app
|
https://api.github.com/repos/enviroCar/enviroCar-app
|
opened
|
Cars get registered multiple times
|
bug Priority - 1 - High
|
It happens that one car gets registered multiple times at the server. This could be the result of https://github.com/enviroCar/enviroCar-app/blob/e1a498ba6cd46fb4ef9c6b857a05f696ded4322b/org.envirocar.app/src/org/envirocar/app/application/UploadManager.java#L199. That list could be empty if the app was restarted. --> As soon as the car has been registered, update all local tracks that used this car.
|
1.0
|
Cars get registered multiple times - It happens that one car gets registered multiple times at the server. This could be the result of https://github.com/enviroCar/enviroCar-app/blob/e1a498ba6cd46fb4ef9c6b857a05f696ded4322b/org.envirocar.app/src/org/envirocar/app/application/UploadManager.java#L199. That list could be empty if the app was restarted. --> As soon as the car has been registered, update all local tracks that used this car.
|
non_test
|
cars get registered multiple times it happens that one car gets registered multiple times at the server this could be the result of that list could empty if the app was restarted as soon as the car has been registered update all local tracks that used this car
| 0
|
319,391
| 23,770,178,549
|
IssuesEvent
|
2022-09-01 15:40:04
|
cda-tum/MQTBench
|
https://api.github.com/repos/cda-tum/MQTBench
|
closed
|
Streamline and Parallelize Benchmark Generation
|
documentation enhancement
|
To leverage the computing resources available, mqt.bench should support parallel generation of benchmarks.
From our discussion, there are two ways to parallelize:
1. Inside python, e.g. with [joblib](https://joblib.readthedocs.io/en/latest/index.html).
2. Outside python, e.g. with [GNU parallel](https://www.gnu.org/software/parallel/).
The chosen approach should support
- a timeout (since the generation of some benchmarks takes very long)
- early cancellation of benchmarks (if generation runs into a timeout with $n$ qubits, it should not try $n+1$ qubits next)
- selective generation of benchmarks (e.g. just amplitude estimation with 20 to 40 qubits)
Solving this issue also requires at least brief documentation on the generation process and its useful features :wink:
|
1.0
|
Streamline and Parallelize Benchmark Generation - To leverage the computing resources available, mqt.bench should support parallel generation of benchmarks.
From our discussion, there are two ways to parallelize:
1. Inside python, e.g. with [joblib](https://joblib.readthedocs.io/en/latest/index.html).
2. Outside python, e.g. with [GNU parallel](https://www.gnu.org/software/parallel/).
The chosen approach should support
- a timeout (since the generation of some benchmarks takes very long)
- early cancellation of benchmarks (if generation runs into a timeout with $n$ qubits, it should not try $n+1$ qubits next)
- selective generation of benchmarks (e.g. just amplitude estimation with 20 to 40 qubits)
Solving this issue also requires at least brief documentation on the generation process and its useful features :wink:
|
non_test
|
streamline and parallelize benchmark generation to leverage the computing resources available mqt bench should support parallel generation of benchmarks from our discussion there are two ways to parallelize inside python e g with outside python e g with the chosen approach should support a timeout since the generation of some benchmarks takes very long early cancellation of benchmarks if generation runs into a timeout with n qubits it should not try n qubits next selective generation of benchmarks e g just amplitude estimation with to qubits solving this issue also requires at least a brief documentation on the generation and useful features wink
| 0
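The three requirements in the row above — per-task timeout, early cancellation (skip n+1 qubits once n times out), and selective generation — can be sketched inside Python with the standard library alone. The `generate`/`generate_all` names and the sleep-based cost model are invented stand-ins, not mqt.bench's API; also note that `Future.result(timeout=...)` only stops *waiting* — the worker thread keeps running — so a production version would use processes (e.g. joblib or GNU parallel, the two options the issue weighs) to actually kill overlong jobs.

```python
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FuturesTimeout

def generate(name, n):
    # Stand-in for real benchmark generation; cost grows with the qubit count.
    time.sleep(0.1 * n)
    return f"{name}_{n}"

def generate_all(benchmarks, qubit_range, timeout):
    """Selective generation with a per-task timeout and early cancellation."""
    results, skipped = [], []
    with ThreadPoolExecutor() as pool:
        for name in benchmarks:
            for n in qubit_range:
                future = pool.submit(generate, name, n)
                try:
                    results.append(future.result(timeout=timeout))
                except FuturesTimeout:
                    # Early cancellation: n+1 qubits would only be slower,
                    # so stop trying larger sizes for this benchmark.
                    skipped.append((name, n))
                    break
    return results, skipped

# Selective run: just one benchmark family over a chosen qubit range.
results, skipped = generate_all(["ae"], range(1, 6), timeout=0.25)
```

Swapping `ThreadPoolExecutor` for a process pool (or driving one Python invocation per benchmark under GNU parallel) keeps the same loop shape while making the timeout genuinely enforce the wall-clock budget.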
|
59,707
| 6,661,376,104
|
IssuesEvent
|
2017-10-02 08:20:07
|
minishift/minishift
|
https://api.github.com/repos/minishift/minishift
|
closed
|
Move integration.log from integration-test directory
|
component/integration-test kind/bug priority/critical
|
Having integration.log in the integration-test directory causes problems downstream. It would be better to move it out of the integration-test directory, as this could cause further problems in the future. The new location could probably just be `out/integration.log`.
|
1.0
|
Move integration.log from integration-test directory - Having integration.log in the integration-test directory causes problems downstream. It would be better to move it out of the integration-test directory, as this could cause further problems in the future. The new location could probably just be `out/integration.log`.
|
test
|
move integration log from integration test directory having integration log in integration test directory causes problems in downstream it will be better to move it from integration test dir as this could cause further problems in future the new location can be probably just out integration log
| 1
|
45,279
| 12,700,509,491
|
IssuesEvent
|
2020-06-22 16:27:03
|
department-of-veterans-affairs/va.gov-team
|
https://api.github.com/repos/department-of-veterans-affairs/va.gov-team
|
closed
|
508-defect-2 [FOCUS MANAGEMENT, SCREEN READER]: After editing/adding an address, focus MUST be returned to where user previously was
|
508-defect-2 508-issue-focus-mgmt 508/Accessibility vsa vsa-benefits-2
|
# [508-defect-2](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/platform/accessibility/guidance/defect-severity-rubric.md#508-defect-2)
**Feedback framework**
- **❗️ Must** for if the feedback must be applied
- **⚠️Should** if the feedback is best practice
- **✔️ Consider** for suggestions/enhancements
## Description
When a user edits or adds an address for their shipping address, when they return to the Shipping address screen, the focus **must** return to where they triggered the action to edit or add an address.
At present, when the user is returned to the Shipping address screen, focus is on the `.saved-success-container`, and then flow proceeds to the link to "Finish this order later", then "Need help?". A non-sighted user would be confused, and miss the selection of shipping address, email address entry, and options to go back or continue with the form.
## Point of Contact
**VFS Point of Contact:** Jennifer
## Acceptance Criteria
As a screen reader user, I want to select to edit/add an address, and when completed, return to where I initiated modifying that address so that I may continue to complete the form.
## Environment
* Operating System: all
* Browser: all
* Screenreading device: all
* Server destination: staging
## Steps to Recreate
1. Enter https://staging.va.gov/hearing-aid-batteries-and-accessories/veteran-information/addresses in browser
2. Start screenreading device listed in Environment
3. Proceed through the form
4. Select to Edit permanent address
5. Edit the permanent address, and Save permanent address
6. Verify that focus is moved to the "Application has been saved.…" notice beneath the form buttons
7. Verify that the rest of the form, to choose which address to ship to, add an email address, or progress back/continue with the form are not in the next steps available
## Possible Fixes (optional)
Return focus to the button that initiated editing/adding an address.
## WCAG or Vendor Guidance (optional)
* [Focus Order: Understanding SC 2.4.3](https://www.w3.org/TR/UNDERSTANDING-WCAG20/navigation-mechanisms-focus-order.html)
|
1.0
|
508-defect-2 [FOCUS MANAGEMENT, SCREEN READER]: After editing/adding an address, focus MUST be returned to where user previously was - # [508-defect-2](https://github.com/department-of-veterans-affairs/va.gov-team/blob/master/platform/accessibility/guidance/defect-severity-rubric.md#508-defect-2)
**Feedback framework**
- **❗️ Must** for if the feedback must be applied
- **⚠️Should** if the feedback is best practice
- **✔️ Consider** for suggestions/enhancements
## Description
When a user edits or adds an address for their shipping address, when they return to the Shipping address screen, the focus **must** return to where they triggered the action to edit or add an address.
At present, when the user is returned to the Shipping address screen, focus is on the `.saved-success-container`, and then flow proceeds to the link to "Finish this order later", then "Need help?". A non-sighted user would be confused, and miss the selection of shipping address, email address entry, and options to go back or continue with the form.
## Point of Contact
**VFS Point of Contact:** Jennifer
## Acceptance Criteria
As a screen reader user, I want to select to edit/add an address, and when completed, return to where I initiated modifying that address so that I may continue to complete the form.
## Environment
* Operating System: all
* Browser: all
* Screenreading device: all
* Server destination: staging
## Steps to Recreate
1. Enter https://staging.va.gov/hearing-aid-batteries-and-accessories/veteran-information/addresses in browser
2. Start screenreading device listed in Environment
3. Proceed through the form
4. Select to Edit permanent address
5. Edit the permanent address, and Save permanent address
6. Verify that focus is moved to the "Application has been saved.…" notice beneath the form buttons
7. Verify that the rest of the form, to choose which address to ship to, add an email address, or progress back/continue with the form are not in the next steps available
## Possible Fixes (optional)
Return focus to the button that initiated editing/adding an address.
## WCAG or Vendor Guidance (optional)
* [Focus Order: Understanding SC 2.4.3](https://www.w3.org/TR/UNDERSTANDING-WCAG20/navigation-mechanisms-focus-order.html)
|
non_test
|
defect after editing adding an address focus must be returned to where user previously was feedback framework ❗️ must for if the feedback must be applied ⚠️should if the feedback is best practice ✔️ consider for suggestions enhancements description when a user edits or adds an address for their shipping address when they return to the shipping address screen the focus must return to where they triggered the action to edit or add an address at present when the user is returned to the shipping address screen focus is on the saved success container and then flow proceeds to the link to finish this order later then need help a non sighted user would be confused and miss the selection of shipping address email address entry and options to go back or continue with the form point of contact vfs point of contact jennifer acceptance criteria as a screen reader user i want to select to edit add an address and when completed return to where i initiated modifying that address so that i may continue to complete the form environment operating system all browser all screenreading device all server destination staging steps to recreate enter in browser start screenreading device listed in environment proceed through the form select to edit permanent address edit the permanent address and save permanent address verify that focus is moved to the application has been saved … notice beneath the form buttons verify that the rest of the form to choose which address to ship to add an email address or progress back continue with the form are not in the next steps available possible fixes optional return focus to the button that initiated editing adding an address wcag or vendor guidance optional
| 0
|
13,921
| 3,787,155,533
|
IssuesEvent
|
2016-03-21 09:18:05
|
tagua-vm/tagua-vm
|
https://api.github.com/repos/tagua-vm/tagua-vm
|
closed
|
Write documentation
|
component-documentation enhancement in progress
|
It is important to have up-to-date documentation. Use https://doc.rust-lang.org/book/documentation.html.
### Progression
* [ ] Write documentation,
* [ ] Make it run with `cargo doc`,
* [ ] Publish documentation online (optional).
|
1.0
|
Write documentation - It is important to have up-to-date documentation. Use https://doc.rust-lang.org/book/documentation.html.
### Progression
* [ ] Write documentation,
* [ ] Make it run with `cargo doc`,
* [ ] Publish documentation online (optional).
|
non_test
|
write documentation this is important to have an up to date documentation use progression write documentation make it run with cargo doc publish documentation online optional
| 0
|
98,646
| 4,029,677,149
|
IssuesEvent
|
2016-05-18 11:39:24
|
Supadog/DB_iti
|
https://api.github.com/repos/Supadog/DB_iti
|
closed
|
Semesters & Students - display semester info
|
Medium priority
|
display this information either in view_one_student_semester.php or on the main page in the table.
|
1.0
|
Semesters & Students - display semester info - display this information either in view_one_student_semester.php or on the main page in the table.
|
non_test
|
semesters students display semester info display this information either in view one student semester php or on the main page in the table
| 0
|
3,505
| 4,467,516,327
|
IssuesEvent
|
2016-08-25 05:17:06
|
ghantoos/lshell
|
https://api.github.com/repos/ghantoos/lshell
|
reopened
|
SECURITY ISSUE: Escape possible using special keys
|
security
|
Just type `<CTRL+V><CTRL+J>` after any allowed command and then type desired restricted command:
```
vladislav@dt1:~$ getent passwd testuser
testuser:x:1001:1002:,,,:/home/testuser:/usr/bin/lshell
vladislav@dt1:~$ su - testuser
Password:
You are in a limited shell.
Type '?' or 'help' to get the list of allowed commands
testuser:~$ ?
cd clear echo exit help history ll lpath ls lsudo
testuser:~$ bash
*** forbidden command: bash
testuser:~$ echo<CTRL+V><CTRL+J>
bash
testuser@dt1:~$ which bash
/bin/bash
```
|
True
|
SECURITY ISSUE: Escape possible using special keys - Just type `<CTRL+V><CTRL+J>` after any allowed command and then type desired restricted command:
```
vladislav@dt1:~$ getent passwd testuser
testuser:x:1001:1002:,,,:/home/testuser:/usr/bin/lshell
vladislav@dt1:~$ su - testuser
Password:
You are in a limited shell.
Type '?' or 'help' to get the list of allowed commands
testuser:~$ ?
cd clear echo exit help history ll lpath ls lsudo
testuser:~$ bash
*** forbidden command: bash
testuser:~$ echo<CTRL+V><CTRL+J>
bash
testuser@dt1:~$ which bash
/bin/bash
```
|
non_test
|
security issue escape possible using special keys just type after any allowed command and then type desired restricted command vladislav getent passwd testuser testuser x home testuser usr bin lshell vladislav su testuser password you are in a limited shell type or help to get the list of allowed commands testuser cd clear echo exit help history ll lpath ls lsudo testuser bash forbidden command bash testuser echo bash testuser which bash bin bash
| 0
|
43,476
| 7,047,760,796
|
IssuesEvent
|
2018-01-02 14:58:51
|
camlistore/camlistore
|
https://api.github.com/repos/camlistore/camlistore
|
closed
|
remove /doc/search-ui.txt after v0.10.0 release
|
Documentation
|
up through v0.9.0, camlistore included a direct link to https://camlistore.googlesource.com/camlistore/+/master/doc/search-ui.txt in the help page. With the recent doc cleanup, this file got renamed to `search-ui.md` in 620d837a3d94cd939b16c56ece039639ef724df5, which broke that link.
As a temporary fix, the `search-ui.txt` file was recreated in 29dcc70f3c6a9ad471eb30c3f08e70da3d570bc9 with a simple pointer to https://camlistore.org/doc/search-ui while also updating the help page inside camlistore. Once v0.10.0 is released (or soon after), we should be able to remove the temporary `search-ui.txt` file.
|
1.0
|
remove /doc/search-ui.txt after v0.10.0 release - up through v0.9.0, camlistore included a direct link to https://camlistore.googlesource.com/camlistore/+/master/doc/search-ui.txt in the help page. With the recent doc cleanup, this file got renamed to `search-ui.md` in 620d837a3d94cd939b16c56ece039639ef724df5, which broke that link.
As a temporary fix, the `search-ui.txt` file was recreated in 29dcc70f3c6a9ad471eb30c3f08e70da3d570bc9 with a simple pointer to https://camlistore.org/doc/search-ui while also updating the help page inside camlistore. Once v0.10.0 is released (or soon after), we should be able to remove the temporary `search-ui.txt` file.
|
non_test
|
remove doc search ui txt after release up through camlistore included a direct link to in the help page with the recent doc cleanup this file got renamed to search ui md in which broke that link as a temporary fix the search ui txt file was recreated in with a simple pointer to while also updating the help page inside camlistore once is released or soon after we should be able to remove the temporary search ui txt file
| 0
|
104,463
| 22,676,667,283
|
IssuesEvent
|
2022-07-04 05:43:33
|
appsmithorg/appsmith
|
https://api.github.com/repos/appsmithorg/appsmith
|
closed
|
[Bug]: Reimplement #9824 for different code path in UQI
|
Bug Backend QA High Google Sheets UQI BE Coders Pod
|
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Description
#9824 needs to be factored into the new where clause code path
### Steps To Reproduce
Same as #9824
### Public Sample App
_No response_
### Version
Deploy preview
|
1.0
|
[Bug]: Reimplement #9824 for different code path in UQI - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Description
#9824 needs to be factored into the new where clause code path
### Steps To Reproduce
Same as #9824
### Public Sample App
_No response_
### Version
Deploy preview
|
non_test
|
reimplement for different code path in uqi is there an existing issue for this i have searched the existing issues description needs to be factored into the new where clause code path steps to reproduce same as public sample app no response version deploy preview
| 0
|
372,279
| 25,992,635,939
|
IssuesEvent
|
2022-12-20 09:00:28
|
Azure/az-hop
|
https://api.github.com/repos/Azure/az-hop
|
closed
|
Epic - How to use CVMFS
|
Epic kind/feature P0 area/configuration area/documentation
|
This Epic capture the work to explain how to use the CVMFS modules available
## Plan Items
Please list the items to be covered. Create an issue for each and link it with the # syntax
### Deploy
- [ ] item title 1 #???
- [ ] item title 2 #???
### Configure
- [ ] item title 3 #???
- [ ] item title 4 #???
|
1.0
|
Epic - How to use CVMFS - This Epic capture the work to explain how to use the CVMFS modules available
## Plan Items
Please list the items to be covered. Create an issue for each and link it with the # syntax
### Deploy
- [ ] item title 1 #???
- [ ] item title 2 #???
### Configure
- [ ] item title 3 #???
- [ ] item title 4 #???
|
non_test
|
epic how to use cvmfs this epic capture the work to explain how to use the cvmfs modules available plan items please list the items to be covered create an issue for each and link it with the syntax deploy item title item title configure item title item title
| 0
|
144,682
| 13,113,785,591
|
IssuesEvent
|
2020-08-05 06:22:31
|
enso-ui/modal
|
https://api.github.com/repos/enso-ui/modal
|
closed
|
Modal not working properly!
|
documentation
|
<!-- Choose one of the following: -->
This is a **bug**.
<!-- Make sure that everything is checked below: -->
### Prerequisites
* [x] Are you running the latest version?
* [x] Are you reporting to the correct repository?
(enso is made of many specialized packages: https://github.com/laravel-enso)
* [x] Did you check the documentation?
* [ ] Did you perform a cursory search?
### Description
I have installed enso-ui/modal using **yarn add @enso-ui/modal**
When I run it I can see the model but I am not able to close it. I am using one boolean variable to show/hide modal.
### Steps to Reproduce
1. First Step
Install package.
2. Second Step
`import { ModalCard } from '@enso-ui/modal/bulma';`
3. Third Step
put it inside component's template like this:
`<modalCard :show="isActive" @close="close"/>`
### Expected behavior
Modal should be visible when **isActive** value is true and hide when **isActive** value is false when clicking on the close button.
### Actual behavior
Modal is visible even **isActive** true/false. I am not able to close it even after setting **isActive** value to false in close method.
<!-- when the issue is resolved, don't forget to **CLOSE** it -->
|
1.0
|
Modal not working properly! - <!-- Choose one of the following: -->
This is a **bug**.
<!-- Make sure that everything is checked below: -->
### Prerequisites
* [x] Are you running the latest version?
* [x] Are you reporting to the correct repository?
(enso is made of many specialized packages: https://github.com/laravel-enso)
* [x] Did you check the documentation?
* [ ] Did you perform a cursory search?
### Description
I have installed enso-ui/modal using **yarn add @enso-ui/modal**
When I run it I can see the model but I am not able to close it. I am using one boolean variable to show/hide modal.
### Steps to Reproduce
1. First Step
Install package.
2. Second Step
`import { ModalCard } from '@enso-ui/modal/bulma';`
3. Third Step
put it inside component's template like this:
`<modalCard :show="isActive" @close="close"/>`
### Expected behavior
Modal should be visible when **isActive** value is true and hide when **isActive** value is false when clicking on the close button.
### Actual behavior
Modal is visible even **isActive** true/false. I am not able to close it even after setting **isActive** value to false in close method.
<!-- when the issue is resolved, don't forget to **CLOSE** it -->
|
non_test
|
modal not working properly this is a bug prerequisites are you running the latest version are you reporting to the correct repository enso is made of many specialized packages did you check the documentation did you perform a cursory search description i have installed enso ui modal using yarn add enso ui modal when i run it i can see the model but i am not able to close it i am using one boolean variable to show hide modal steps to reproduce first step install package second step import modalcard from enso ui modal bulma third step put it inside component s template like this expected behavior modal should be visible when isactive value is true and hide when isactive value is false when clicking on the close button actual behavior modal is visible even isactive true false i am not able to close it even after setting isactive value to false in close method
| 0
|
204,887
| 15,570,254,642
|
IssuesEvent
|
2021-03-17 02:07:25
|
microsoft/AzureStorageExplorer
|
https://api.github.com/repos/microsoft/AzureStorageExplorer
|
closed
|
The Azure AD attached items and QA items do not show broken after signing back into that same Microsoft account
|
:gear: sign-in 🧪 testing
|
**Storage Explorer Version**: 1.18.1
**Build Number**: 20210311.3
**Branch**: main
**Platform/OS:** Windows 10/ Linux ubuntu 18.04/ MacOS Catalina
**Architecture**: ia32/x64
**Regression From**: Not a regression
## Bug Description ##
When coming from having ADAL enabled in 1.18 and older, then sign back into that same account.
1. The attachment and the attachment in Quick Access were made with a Microsoft account, All attachments show well .
2. The attachment and the attachment in Quick Access were made with a non-Microsoft account. All attachments show broken.
## Steps to Reproduce ##
1. Install and launch the Storage Explorer release 1.18.1 -> Make sure the MSAL and code flow sign-in are disabled.
2. Sign in an Microsoft account -> Attach the blob container and queue using Azure AD -> Add the Azure AD attached items to Quick Access.
3. Upgrade the Storage Explorer to the current build(20210311.3) -> All attached items and QA items show broken and the azure account disappears.
5. Select the option 'Integrated Sign-in' in the 'Settings' -> Sign in the same Microsoft account.
6. Observe the Azure AD attached items and QA items -> Check whether all items show broken.
## Expected Experience ##
All items show broken.
## Actual Experience ##
All items show well.

## Additional Context ##
1. The Azure AD attached items and QA items show broken after signing back into that **same non-Microsoft account**.
2. This issue also reproduces when coming from having code flow enabled in 1.18.1 .
3. This issue doesn't reproduce when coming from having MSAL enabled in 1.18.1.
|
1.0
|
The Azure AD attached items and QA items do not show broken after signing back into that same Microsoft account - **Storage Explorer Version**: 1.18.1
**Build Number**: 20210311.3
**Branch**: main
**Platform/OS:** Windows 10/ Linux ubuntu 18.04/ MacOS Catalina
**Architecture**: ia32/x64
**Regression From**: Not a regression
## Bug Description ##
When coming from having ADAL enabled in 1.18 and older, then sign back into that same account.
1. The attachment and the attachment in Quick Access were made with a Microsoft account, All attachments show well .
2. The attachment and the attachment in Quick Access were made with a non-Microsoft account. All attachments show broken.
## Steps to Reproduce ##
1. Install and launch the Storage Explorer release 1.18.1 -> Make sure the MSAL and code flow sign-in are disabled.
2. Sign in an Microsoft account -> Attach the blob container and queue using Azure AD -> Add the Azure AD attached items to Quick Access.
3. Upgrade the Storage Explorer to the current build(20210311.3) -> All attached items and QA items show broken and the azure account disappears.
5. Select the option 'Integrated Sign-in' in the 'Settings' -> Sign in the same Microsoft account.
6. Observe the Azure AD attached items and QA items -> Check whether all items show broken.
## Expected Experience ##
All items show broken.
## Actual Experience ##
All items show well.

## Additional Context ##
1. The Azure AD attached items and QA items show broken after signing back into that **same non-Microsoft account**.
2. This issue also reproduces when coming from having code flow enabled in 1.18.1 .
3. This issue doesn't reproduce when coming from having MSAL enabled in 1.18.1.
|
test
|
the azure ad attached items and qa items do not show broken after signing back into that same microsoft account storage explorer version build number branch main platform os windows linux ubuntu macos catalina architecture regression from not a regression bug description when coming from having adal enabled in and older then sign back into that same account the attachment and the attachment in quick access were made with a microsoft account all attachments show well the attachment and the attachment in quick access were made with a non microsoft account all attachments show broken steps to reproduce install and launch the storage explorer release make sure the msal and code flow sign in are disabled sign in an microsoft account attach the blob container and queue using azure ad add the azure ad attached items to quick access upgrade the storage explorer to the current build all attached items and qa items show broken and the azure account disappears select the option integrated sign in in the settings sign in the same microsoft account observe the azure ad attached items and qa items check whether all items show broken expected experience all items show broken actual experience all items show well additional context the azure ad attached items and qa items show broken after signing back into that same non microsoft account this issue also reproduces when coming from having code flow enabled in this issue doesn t reproduce when coming from having msal enabled in
| 1
|
359,681
| 10,679,517,285
|
IssuesEvent
|
2019-10-21 19:26:29
|
clearlinux/distribution
|
https://api.github.com/repos/clearlinux/distribution
|
closed
|
Unable to use locale specific keyboard map
|
bug desktop high priority
|
**Describe the bug**
Selecting the Portuguese keyboard keymap results in still having all inputs interpreted as the default US layout
**To Reproduce**
Steps to reproduce the behavior:
1. Install from the Installer image, picking Portuguese language and pt keyboard layout
2. From installed system, ensure Portuguese is the layout in Region & Language GNOME settings.
3. cat /etc/vconsole.conf and /etc/locale.conf
4. Confirm all report correct locale settings
**Expected behavior**
Hitting AltGr+2 should produce @ symbol, or key to the right of L to produce ç
**Environment (please complete the following information):**
- Clear Linux OS version: 28940
- Bundles:
`NetworkManager
acpica-unix2
alsa-utils
baobab
bc
binutils
bison
bootloader
c-basic
cheese
cloc
clr-network-troubleshooter
cpio
curl
desktop
desktop-apps
desktop-assets
desktop-autostart
desktop-gnomelibs
desktop-locales
dev-utils
devpkg-base
devpkg-llvm
diffutils
docutils
dosfstools
dpdk
editors
emacs
eog
ethtool
evince
evolution
file
file-roller
findutils
firefox
flatpak
flex
fonts-basic
fuse
gdb
gedit
gimp
git
gjs
glibc-locale
gnome-base-libs
gnome-calculator
gnome-characters
gnome-color-manager
gnome-disk-utility
gnome-font-viewer
gnome-logs
gnome-music
gnome-photos
gnome-screenshot
gnome-system-monitor
gnome-todo
gnome-weather
graphviz
gstreamer
gvim
gzip
hardware-printing
hardware-uefi
htop
icdiff
inotify-tools
iproute2
iptables
joe
kbd
kernel-install
kernel-native
kvm-host
less
lib-imageformat
lib-opengl
lib-openssl
lib-samba
libX11client
libglib
libstdcpp
libva-utils
linux-firmware
linux-firmware-extras
linux-firmware-wifi
linux-tools
llvm
locales
mail-utils
make
man-pages
minicom
mutt
nasm
nautilus
net-tools
network-basic
nfs-utils
notmuch
openldap
openssh-server
openssl
openvswitch
os-core
os-core-update
os-core-webproxy
p11-kit
parallel
parted
patch
perl-basic
pmdk
polkit
powertop
procps-ng
pulseaudio
pygobject
python3-basic
qemu-guest-additions
samba
seahorse
shells
smartmontools
storage-utils
strace
sudo
sysadmin-basic
syslinux
thermal_daemon
tmux
totem
tzdata
unzip
user-basic
valgrind
vim
webkitgtk
wget
which
wpa_supplicant
x11-server
xemacs
xfsprogs
xz
znc
zsh
zstd`
|
1.0
|
Unable to use locale specific keyboard map - **Describe the bug**
Selecting the Portuguese keyboard keymap results in still having all inputs interpreted as the default US layout
**To Reproduce**
Steps to reproduce the behavior:
1. Install from the Installer image, picking Portuguese language and pt keyboard layout
2. From installed system, ensure Portuguese is the layout in Region & Language GNOME settings.
3. cat /etc/vconsole.conf and /etc/locale.conf
4. Confirm all report correct locale settings
**Expected behavior**
Hitting AltGr+2 should produce @ symbol, or key to the right of L to produce ç
**Environment (please complete the following information):**
- Clear Linux OS version: 28940
- Bundles:
`NetworkManager
acpica-unix2
alsa-utils
baobab
bc
binutils
bison
bootloader
c-basic
cheese
cloc
clr-network-troubleshooter
cpio
curl
desktop
desktop-apps
desktop-assets
desktop-autostart
desktop-gnomelibs
desktop-locales
dev-utils
devpkg-base
devpkg-llvm
diffutils
docutils
dosfstools
dpdk
editors
emacs
eog
ethtool
evince
evolution
file
file-roller
findutils
firefox
flatpak
flex
fonts-basic
fuse
gdb
gedit
gimp
git
gjs
glibc-locale
gnome-base-libs
gnome-calculator
gnome-characters
gnome-color-manager
gnome-disk-utility
gnome-font-viewer
gnome-logs
gnome-music
gnome-photos
gnome-screenshot
gnome-system-monitor
gnome-todo
gnome-weather
graphviz
gstreamer
gvim
gzip
hardware-printing
hardware-uefi
htop
icdiff
inotify-tools
iproute2
iptables
joe
kbd
kernel-install
kernel-native
kvm-host
less
lib-imageformat
lib-opengl
lib-openssl
lib-samba
libX11client
libglib
libstdcpp
libva-utils
linux-firmware
linux-firmware-extras
linux-firmware-wifi
linux-tools
llvm
locales
mail-utils
make
man-pages
minicom
mutt
nasm
nautilus
net-tools
network-basic
nfs-utils
notmuch
openldap
openssh-server
openssl
openvswitch
os-core
os-core-update
os-core-webproxy
p11-kit
parallel
parted
patch
perl-basic
pmdk
polkit
powertop
procps-ng
pulseaudio
pygobject
python3-basic
qemu-guest-additions
samba
seahorse
shells
smartmontools
storage-utils
strace
sudo
sysadmin-basic
syslinux
thermal_daemon
tmux
totem
tzdata
unzip
user-basic
valgrind
vim
webkitgtk
wget
which
wpa_supplicant
x11-server
xemacs
xfsprogs
xz
znc
zsh
zstd`
|
non_test
|
unable to use locale specific keyboard map describe the bug selecting the portuguese keyboard keymap results in still having all inputs interpreted as the default us layout to reproduce steps to reproduce the behavior install from the installer image picking portuguese language and pt keyboard layout from installed system ensure portuguese is the layout in region language gnome settings cat etc vconsole conf and etc locale conf confirm all report correct locale settings expected behavior hitting altgr should produce symbol or key to the right of l to produce ç environment please complete the following information clear linux os version bundles networkmanager acpica alsa utils baobab bc binutils bison bootloader c basic cheese cloc clr network troubleshooter cpio curl desktop desktop apps desktop assets desktop autostart desktop gnomelibs desktop locales dev utils devpkg base devpkg llvm diffutils docutils dosfstools dpdk editors emacs eog ethtool evince evolution file file roller findutils firefox flatpak flex fonts basic fuse gdb gedit gimp git gjs glibc locale gnome base libs gnome calculator gnome characters gnome color manager gnome disk utility gnome font viewer gnome logs gnome music gnome photos gnome screenshot gnome system monitor gnome todo gnome weather graphviz gstreamer gvim gzip hardware printing hardware uefi htop icdiff inotify tools iptables joe kbd kernel install kernel native kvm host less lib imageformat lib opengl lib openssl lib samba libglib libstdcpp libva utils linux firmware linux firmware extras linux firmware wifi linux tools llvm locales mail utils make man pages minicom mutt nasm nautilus net tools network basic nfs utils notmuch openldap openssh server openssl openvswitch os core os core update os core webproxy kit parallel parted patch perl basic pmdk polkit powertop procps ng pulseaudio pygobject basic qemu guest additions samba seahorse shells smartmontools storage utils strace sudo sysadmin basic syslinux thermal daemon tmux totem tzdata unzip user basic valgrind vim webkitgtk wget which wpa supplicant server xemacs xfsprogs xz znc zsh zstd
| 0
|
129,068
| 12,397,937,631
|
IssuesEvent
|
2020-05-21 00:13:37
|
terraform-providers/terraform-provider-vsphere
|
https://api.github.com/repos/terraform-providers/terraform-provider-vsphere
|
closed
|
docs: 404 on link to Linux time zones
|
documentation stale
|
### Expected Behavior
On the [vsphere_virtual_machine page](https://www.terraform.io/docs/providers/vsphere/r/virtual_machine.html) under the "Linux Customization Options" section, clicking the link to the time zones should show a page with all the possible values.
### Actual Behavior
A 404 was returned from the link.
### Steps to Reproduce
Click the link.

|
1.0
|
docs: 404 on link to Linux time zones - ### Expected Behavior
On the [vsphere_virtual_machine page](https://www.terraform.io/docs/providers/vsphere/r/virtual_machine.html) under the "Linux Customization Options" section, clicking the link to the time zones should show a page with all the possible values.
### Actual Behavior
A 404 was returned from the link.
### Steps to Reproduce
Click the link.

|
non_test
|
docs on link to linux time zones expected behavior on the under the linux customization options section clicking the link to the time zones should show a page with all the possible values actual behavior a was returned from the link steps to reproduce click the link
| 0
|
766,335
| 26,879,427,181
|
IssuesEvent
|
2023-02-05 13:00:36
|
restarone/violet_rails
|
https://api.github.com/repos/restarone/violet_rails
|
opened
|
fix N+1 query in Analytics V2
|
bug high priority
|
**Describe the bug**
Visiting Analytics V2 causes N+1 query. On markedrestaurant.com the N+1 causes a timeout when the range is 3 months or above:
<img width="1728" alt="Screen Shot 2023-02-03 at 7 36 50 PM" src="https://user-images.githubusercontent.com/35935196/216735330-29b53c45-0ab0-4343-943c-f7fa9829899c.png">
|
1.0
|
fix N+1 query in Analytics V2 - **Describe the bug**
Visiting Analytics V2 causes N+1 query. On markedrestaurant.com the N+1 causes a timeout when the range is 3 months or above:
<img width="1728" alt="Screen Shot 2023-02-03 at 7 36 50 PM" src="https://user-images.githubusercontent.com/35935196/216735330-29b53c45-0ab0-4343-943c-f7fa9829899c.png">
|
non_test
|
fix n query in analytics describe the bug visiting analytics causes n query on markedrestaurant com the n causes a timeout when the range is months or above img width alt screen shot at pm src
| 0
|
269,432
| 23,442,205,520
|
IssuesEvent
|
2022-08-15 15:55:35
|
rhinstaller/kickstart-tests
|
https://api.github.com/repos/rhinstaller/kickstart-tests
|
closed
|
15 storage tests timeouting with "Found sanity error: Your BIOS-based system needs a special partition to boot from a GPT disk label. To continue, please create a 1MiB 'biosboot' type partition."
|
failing test
|
Tests started to timeout on daily-iso of https://github.com/rhinstaller/kickstart-tests/actions/runs/2828945970
```
kstest-btrfs-1.2022_08_10-02_46_03.kx2jwfj_
kstest-btrfs-2.2022_08_10-00_30_47.gn9lrb2r
kstest-default-fstype.2022_08_10-01_43_22.dbnahynh
kstest-encrypt-device.2022_08_10-02_07_24._6b4bh3d
kstest-encrypt-swap.2022_08_10-01_15_49.o63xszto
kstest-escrow-cert.2022_08_10-02_12_18.n7a3j8o6
kstest-harddrive-install-tree.2022_08_09-23_51_51.8_s01af0
kstest-harddrive-install-tree-relative.2022_08_09-23_38_57.42oa5fpz
kstest-lvm-1.2022_08_10-00_09_11.z6uxbc_f
kstest-lvm-2.2022_08_10-03_40_09.kwce2e_8
kstest-lvm-raid-1.2022_08_10-03_13_27.wvdpp8nt
kstest-lvm-thinp-1.2022_08_10-03_33_42.ibfh_ubp
kstest-lvm-thinp-2.2022_08_10-02_56_04.k1c5_o4s
kstest-raid-1.2022_08_10-03_21_45.vqbics02
kstest-tmpfs-fixed_size.2022_08_10-00_08_34.p9r3mb74
```
Versions diff for last known pass:
[versions.diff.txt](https://github.com/rhinstaller/kickstart-tests/files/9297927/versions.diff.txt)
```
01:44:43,193 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_root.
01:44:43,193 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_s390_constraints.
01:44:43,193 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_partition_formatting.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_partition_sizes.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Found sanity warning: Your /boot partition is less than 512 MiB which is lower than recommended for a normal Fedora install.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_partition_format_sizes.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_bootloader.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_gpt_biosboot.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Found sanity error: Your BIOS-based system needs a special partition to boot from a GPT disk label. To continue, please create a 1MiB 'biosboot' type partition.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_opal_compatibility.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_swap.
01:44:43,195 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_swap_uuid.
01:44:43,195 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_mountpoints_on_linuxfs.
01:44:43,195 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_mountpoints_on_root.
01:44:43,195 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_mountpoints_not_on_root.
01:44:43,195 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_unlocked_devices_have_key.
01:44:43,196 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_luks_devices_have_key.
01:44:43,196 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_luks2_memory_requirements.
01:44:43,196 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_mounted_partitions.
01:44:43,196 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_lvm_destruction.
01:44:43,196 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_requests.
01:44:43,197 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Storage check finished with failure(s).
01:44:43,197 WARNING org.fedoraproject.Anaconda.Modules.Storage:INFO:anaconda.threading:Thread Done: AnaTaskThread-StorageValidateTask-1 (139818000676544)
01:44:44,150 DEBUG anaconda:anaconda: ui.lib.storage: Validation has been completed: ValidationReport(error_messages=["Your BIOS-based system needs a special partition to boot from a GPT disk label. To continue, please create a 1MiB 'biosboot' type partition."], warning_messages=['Your /boot partition is less than 512 MiB which is lower than recommended for a normal Fedora install.'])
01:44:44,151 INFO anaconda:anaconda: threading: Thread Done: AnaExecuteStorageThread (140047701669568)
```
Logs for `default-fstype`:
[kstest.log](https://github.com/rhinstaller/kickstart-tests/files/9297848/kstest.log)
[virt-install.log](https://github.com/rhinstaller/kickstart-tests/files/9297849/virt-install.log)
|
1.0
|
15 storage tests timeouting with "Found sanity error: Your BIOS-based system needs a special partition to boot from a GPT disk label. To continue, please create a 1MiB 'biosboot' type partition." - Tests started to timeout on daily-iso of https://github.com/rhinstaller/kickstart-tests/actions/runs/2828945970
```
kstest-btrfs-1.2022_08_10-02_46_03.kx2jwfj_
kstest-btrfs-2.2022_08_10-00_30_47.gn9lrb2r
kstest-default-fstype.2022_08_10-01_43_22.dbnahynh
kstest-encrypt-device.2022_08_10-02_07_24._6b4bh3d
kstest-encrypt-swap.2022_08_10-01_15_49.o63xszto
kstest-escrow-cert.2022_08_10-02_12_18.n7a3j8o6
kstest-harddrive-install-tree.2022_08_09-23_51_51.8_s01af0
kstest-harddrive-install-tree-relative.2022_08_09-23_38_57.42oa5fpz
kstest-lvm-1.2022_08_10-00_09_11.z6uxbc_f
kstest-lvm-2.2022_08_10-03_40_09.kwce2e_8
kstest-lvm-raid-1.2022_08_10-03_13_27.wvdpp8nt
kstest-lvm-thinp-1.2022_08_10-03_33_42.ibfh_ubp
kstest-lvm-thinp-2.2022_08_10-02_56_04.k1c5_o4s
kstest-raid-1.2022_08_10-03_21_45.vqbics02
kstest-tmpfs-fixed_size.2022_08_10-00_08_34.p9r3mb74
```
Versions diff for last known pass:
[versions.diff.txt](https://github.com/rhinstaller/kickstart-tests/files/9297927/versions.diff.txt)
```
01:44:43,193 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_root.
01:44:43,193 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_s390_constraints.
01:44:43,193 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_partition_formatting.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_partition_sizes.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Found sanity warning: Your /boot partition is less than 512 MiB which is lower than recommended for a normal Fedora install.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_partition_format_sizes.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_bootloader.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_gpt_biosboot.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Found sanity error: Your BIOS-based system needs a special partition to boot from a GPT disk label. To continue, please create a 1MiB 'biosboot' type partition.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_opal_compatibility.
01:44:43,194 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_swap.
01:44:43,195 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_swap_uuid.
01:44:43,195 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_mountpoints_on_linuxfs.
01:44:43,195 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_mountpoints_on_root.
01:44:43,195 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_mountpoints_not_on_root.
01:44:43,195 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_unlocked_devices_have_key.
01:44:43,196 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_luks_devices_have_key.
01:44:43,196 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_luks2_memory_requirements.
01:44:43,196 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_mounted_partitions.
01:44:43,196 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_lvm_destruction.
01:44:43,196 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Run sanity check verify_requests.
01:44:43,197 WARNING org.fedoraproject.Anaconda.Modules.Storage:DEBUG:anaconda.modules.storage.partitioning.validate:Storage check finished with failure(s).
01:44:43,197 WARNING org.fedoraproject.Anaconda.Modules.Storage:INFO:anaconda.threading:Thread Done: AnaTaskThread-StorageValidateTask-1 (139818000676544)
01:44:44,150 DEBUG anaconda:anaconda: ui.lib.storage: Validation has been completed: ValidationReport(error_messages=["Your BIOS-based system needs a special partition to boot from a GPT disk label. To continue, please create a 1MiB 'biosboot' type partition."], warning_messages=['Your /boot partition is less than 512 MiB which is lower than recommended for a normal Fedora install.'])
01:44:44,151 INFO anaconda:anaconda: threading: Thread Done: AnaExecuteStorageThread (140047701669568)
```
Logs for `default-fstype`:
[kstest.log](https://github.com/rhinstaller/kickstart-tests/files/9297848/kstest.log)
[virt-install.log](https://github.com/rhinstaller/kickstart-tests/files/9297849/virt-install.log)
|
test
|
storage tests timeouting with found sanity error your bios based system needs a special partition to boot from a gpt disk label to continue please create a biosboot type partition tests started to timeout on daily iso of kstest btrfs kstest btrfs kstest default fstype dbnahynh kstest encrypt device kstest encrypt swap kstest escrow cert kstest harddrive install tree kstest harddrive install tree relative kstest lvm f kstest lvm kstest lvm raid kstest lvm thinp ibfh ubp kstest lvm thinp kstest raid kstest tmpfs fixed size versions diff for last known pass warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify root warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify constraints warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify partition formatting warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify partition sizes warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate found sanity warning your boot partition is less than mib which is lower than recommended for a normal fedora install warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify partition format sizes warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify bootloader warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify gpt biosboot warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate found sanity error your bios based system needs a special partition to boot from a gpt disk label to continue 
please create a biosboot type partition warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify opal compatibility warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify swap warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify swap uuid warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify mountpoints on linuxfs warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify mountpoints on root warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify mountpoints not on root warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify unlocked devices have key warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify luks devices have key warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify memory requirements warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify mounted partitions warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify lvm destruction warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate run sanity check verify requests warning org fedoraproject anaconda modules storage debug anaconda modules storage partitioning validate storage check finished with failure s warning org fedoraproject anaconda modules storage info anaconda 
threading thread done anataskthread storagevalidatetask debug anaconda anaconda ui lib storage validation has been completed validationreport error messages warning messages info anaconda anaconda threading thread done anaexecutestoragethread logs for default fstype
| 1
|
6
| 2,490,038,787
|
IssuesEvent
|
2015-01-02 05:15:42
|
FreeUKGen/MyopicVicar
|
https://api.github.com/repos/FreeUKGen/MyopicVicar
|
closed
|
Reorder feedback page
|
enhancement testing
|
Feedback page currently orders by ?? -- this needs to put most recent at the top.
|
1.0
|
Reorder feedback page - Feedback page currently orders by ?? -- this needs to put most recent at the top.
|
test
|
reorder feedback page feedback page currently orders by this needs to put most recent at the top
| 1
|
72,401
| 7,296,638,840
|
IssuesEvent
|
2018-02-26 11:28:58
|
EyeSeeTea/QAApp
|
https://api.github.com/repos/EyeSeeTea/QAApp
|
closed
|
Improve module, Action Plan: incorrect calculation of dates of next assessment
|
complexity - low (1hr) priority - critical question testing type - maintenance
|
Janet is in Malawi for an End-User Training and she has found that the app incorrectly calculates the dates of the next visit. See screenshots attached. It should be: 6 months if above 80%, 4 months if between 50 and 79%, and 2 months if below 50%.
Linda reported this issue on #206: just to add that the Plan module is correctly calculating the dates of next assessment, but the Improve module (Action Plan) doesn't report these correctly.


|
1.0
|
Improve module, Action Plan: incorrect calculation of dates of next assessment - Janet is in Malawi for an End-User Training and she has found that the app incorrectly calculates the dates of the next visit. See screenshots attached. It should be: 6 months if above 80%, 4 months if between 50 and 79%, and 2 months if below 50%.
Linda reported this issue on #206: just to add that the Plan module is correctly calculating the dates of next assessment, but the Improve module (Action Plan) doesn't report these correctly.


|
test
|
improve module action plan incorrect calculation of dates of next assessment janet is in malawi for an end user training and she has found that the app incorrectly calculates the dates of the next visit see sshots attached it should be months if above months if between and and months if below linda reported this issue on just to add that the plan module is correctly calculating the dates of next assessment but the improve module action plan doesn t report these correctly
| 1
|
299,389
| 25,901,019,151
|
IssuesEvent
|
2022-12-15 05:40:56
|
openBackhaul/OperationKeyManagement
|
https://api.github.com/repos/openBackhaul/OperationKeyManagement
|
opened
|
Add testcase for "Attribute configured?" check of protocol attribute
|
testsuite_to_be_changed
|
protocol is an attribute added to multiple services. This attribute is an enumeration of two values, such as "HTTP" and "HTTPS". To check whether the server accepts and configures the input values given by the user, a test case should be added to the applicable services under the "Acceptance" :: "Attribute configured?" section.
Services required for changes are
- [ ] /v1/bequeath-your-data-and-die
- [ ] /v1/regard-application
|
1.0
|
Add testcase for "Attribute configured?" check of protocol attribute - protocol is an attribute added to multiple services. This attribute is an enumeration of two values, such as "HTTP" and "HTTPS". To check whether the server accepts and configures the input values given by the user, a test case should be added to the applicable services under the "Acceptance" :: "Attribute configured?" section.
Services required for changes are
- [ ] /v1/bequeath-your-data-and-die
- [ ] /v1/regard-application
|
test
|
add testcase for attribute configured check of protocol attribute protocol is an attribute added to multiple services this attribute is an enumeration of two values like http and https to check if server is accepting and configuring the input values given from user testcase to be added to applicable services under acceptance attribute configured section services required for changes are bequeath your data and die regard application
| 1
|
292,609
| 25,225,459,231
|
IssuesEvent
|
2022-11-14 15:40:17
|
trailofbits/pe-parse
|
https://api.github.com/repos/trailofbits/pe-parse
|
closed
|
Compile error
|
testing stale
|
Env: Windows 10, Visual Studio 2017, CMake version 3.19.4
Error info:
Determining if the include file filesystem exists failed with the following output:
Change Dir: D:/FoxitGit/pe-parse/build/CMakeFiles/CMakeTmp
Run Build Command(s):D:/vs2017/MSBuild/15.0/Bin/MSBuild.exe cmTC_7d71c.vcxproj /p:Configuration=Debug /p:Platform=Win32 /p:VisualStudioVersion=15.0 /v:m && Microsoft (R) Build Engine version 15.9.21+g9802d43bc3 for .NET Framework
Copyright (C) Microsoft Corporation. All rights reserved.
Microsoft (R) C/C++ Optimizing Compiler Version 19.16.27045 for x86
Copyright (C) Microsoft Corporation. All rights reserved.
cl /c /Zi /W3 /WX- /diagnostics:classic /Od /Ob0 /Oy- /D WIN32 /D _WINDOWS /D "CMAKE_INTDIR=\"Debug\"" /D _MBCS /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++17 /Fo"cmTC_7d71c.dir\Debug\\" /Fd"cmTC_7d71c.dir\Debug\vc141.pdb" /Gd /TP /analyze- /errorReport:queue "D:\FoxitGit\pe-parse\build\CMakeFiles\CMakeTmp\CheckIncludeFile.cxx"
CheckIncludeFile.cxx
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(137): error C2039: '_W_int_curr_symbol': is not a member of 'lconv' [D:\FoxitGit\pe-parse\build\CMakeFiles\CMakeTmp\cmTC_7d71c.vcxproj]
D:\Windows Kits\10\Include\10.0.17763.0\ucrt\locale.h(99): note: see declaration of 'lconv'
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(135): note: while compiling class template member function "void std::_Mpunct<_Elem>::_Getvals(wchar_t,const lconv *)"
with
[
_Elem=wchar_t
]
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(160): note: see reference to function template instantiation "void std::_Mpunct<_Elem>::_Getvals(wchar_t,const lconv *)" being compiled
with
[
_Elem=wchar_t
]
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(276): note: see reference to class template instantiation "std::_Mpunct<_Elem>" being compiled
with
[
_Elem=wchar_t
]
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(1030): note: see reference to class template instantiation "std::moneypunct<wchar_t,true>" being compiled
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(137): error C2039: '_W_currency_symbol': is not a member of 'lconv' [D:\FoxitGit\pe-parse\build\CMakeFiles\CMakeTmp\cmTC_7d71c.vcxproj]
D:\Windows Kits\10\Include\10.0.17763.0\ucrt\locale.h(99): note: see declaration of 'lconv'
Determining if the include file experimental/filesystem exists failed with the following output:
Change Dir: D:/FoxitGit/pe-parse/build/CMakeFiles/CMakeTmp
Run Build Command(s):D:/vs2017/MSBuild/15.0/Bin/MSBuild.exe cmTC_66dc4.vcxproj /p:Configuration=Debug /p:Platform=Win32 /p:VisualStudioVersion=15.0 /v:m && Microsoft (R) Build Engine version 15.9.21+g9802d43bc3 for .NET Framework
Copyright (C) Microsoft Corporation. All rights reserved.
Microsoft (R) C/C++ Optimizing Compiler Version 19.16.27045 for x86
Copyright (C) Microsoft Corporation. All rights reserved.
cl /c /Zi /W3 /WX- /diagnostics:classic /Od /Ob0 /Oy- /D WIN32 /D _WINDOWS /D "CMAKE_INTDIR=\"Debug\"" /D _MBCS /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++17 /Fo"cmTC_66dc4.dir\Debug\\" /Fd"cmTC_66dc4.dir\Debug\vc141.pdb" /Gd /TP /analyze- /errorReport:queue "D:\FoxitGit\pe-parse\build\CMakeFiles\CMakeTmp\CheckIncludeFile.cxx"
CheckIncludeFile.cxx
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(137): error C2039: '_W_int_curr_symbol': is not a member of 'lconv' [D:\FoxitGit\pe-parse\build\CMakeFiles\CMakeTmp\cmTC_66dc4.vcxproj]
D:\Windows Kits\10\Include\10.0.17763.0\ucrt\locale.h(99): note: see declaration of 'lconv'
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(135): note: while compiling class template member function "void std::_Mpunct<_Elem>::_Getvals(wchar_t,const lconv *)"
with
[
_Elem=wchar_t
]
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(160): note: see reference to function template instantiation "void std::_Mpunct<_Elem>::_Getvals(wchar_t,const lconv *)" being compiled
with
[
_Elem=wchar_t
]
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(276): note: see reference to class template instantiation "std::_Mpunct<_Elem>" being compiled
with
[
_Elem=wchar_t
]
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(1030): note: see reference to class template instantiation "std::moneypunct<wchar_t,true>" being compiled
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(137): error C2039: '_W_currency_symbol': is not a member of 'lconv' [D:\FoxitGit\pe-parse\build\CMakeFiles\CMakeTmp\cmTC_66dc4.vcxproj]
D:\Windows Kits\10\Include\10.0.17763.0\ucrt\locale.h(99): note: see declaration of 'lconv'
|
1.0
|
Compile error - Env: Windows 10, Visual Studio 2017, CMake version 3.19.4
Error info:
Determining if the include file filesystem exists failed with the following output:
Change Dir: D:/FoxitGit/pe-parse/build/CMakeFiles/CMakeTmp
Run Build Command(s):D:/vs2017/MSBuild/15.0/Bin/MSBuild.exe cmTC_7d71c.vcxproj /p:Configuration=Debug /p:Platform=Win32 /p:VisualStudioVersion=15.0 /v:m && Microsoft (R) Build Engine version 15.9.21+g9802d43bc3 for .NET Framework
Copyright (C) Microsoft Corporation. All rights reserved.
Microsoft (R) C/C++ Optimizing Compiler Version 19.16.27045 for x86
Copyright (C) Microsoft Corporation. All rights reserved.
cl /c /Zi /W3 /WX- /diagnostics:classic /Od /Ob0 /Oy- /D WIN32 /D _WINDOWS /D "CMAKE_INTDIR=\"Debug\"" /D _MBCS /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++17 /Fo"cmTC_7d71c.dir\Debug\\" /Fd"cmTC_7d71c.dir\Debug\vc141.pdb" /Gd /TP /analyze- /errorReport:queue "D:\FoxitGit\pe-parse\build\CMakeFiles\CMakeTmp\CheckIncludeFile.cxx"
CheckIncludeFile.cxx
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(137): error C2039: '_W_int_curr_symbol': is not a member of 'lconv' [D:\FoxitGit\pe-parse\build\CMakeFiles\CMakeTmp\cmTC_7d71c.vcxproj]
D:\Windows Kits\10\Include\10.0.17763.0\ucrt\locale.h(99): note: see declaration of 'lconv'
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(135): note: while compiling class template member function "void std::_Mpunct<_Elem>::_Getvals(wchar_t,const lconv *)"
with
[
_Elem=wchar_t
]
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(160): note: see reference to function template instantiation "void std::_Mpunct<_Elem>::_Getvals(wchar_t,const lconv *)" being compiled
with
[
_Elem=wchar_t
]
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(276): note: see reference to class template instantiation "std::_Mpunct<_Elem>" being compiled
with
[
_Elem=wchar_t
]
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(1030): note: see reference to class template instantiation "std::moneypunct<wchar_t,true>" being compiled
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(137): error C2039: '_W_currency_symbol': is not a member of 'lconv' [D:\FoxitGit\pe-parse\build\CMakeFiles\CMakeTmp\cmTC_7d71c.vcxproj]
D:\Windows Kits\10\Include\10.0.17763.0\ucrt\locale.h(99): note: see declaration of 'lconv'
Determining if the include file experimental/filesystem exists failed with the following output:
Change Dir: D:/FoxitGit/pe-parse/build/CMakeFiles/CMakeTmp
Run Build Command(s):D:/vs2017/MSBuild/15.0/Bin/MSBuild.exe cmTC_66dc4.vcxproj /p:Configuration=Debug /p:Platform=Win32 /p:VisualStudioVersion=15.0 /v:m && Microsoft (R) Build Engine version 15.9.21+g9802d43bc3 for .NET Framework
Copyright (C) Microsoft Corporation. All rights reserved.
Microsoft (R) C/C++ Optimizing Compiler Version 19.16.27045 for x86
Copyright (C) Microsoft Corporation. All rights reserved.
cl /c /Zi /W3 /WX- /diagnostics:classic /Od /Ob0 /Oy- /D WIN32 /D _WINDOWS /D "CMAKE_INTDIR=\"Debug\"" /D _MBCS /Gm- /EHsc /RTC1 /MDd /GS /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR /std:c++17 /Fo"cmTC_66dc4.dir\Debug\\" /Fd"cmTC_66dc4.dir\Debug\vc141.pdb" /Gd /TP /analyze- /errorReport:queue "D:\FoxitGit\pe-parse\build\CMakeFiles\CMakeTmp\CheckIncludeFile.cxx"
CheckIncludeFile.cxx
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(137): error C2039: '_W_int_curr_symbol': is not a member of 'lconv' [D:\FoxitGit\pe-parse\build\CMakeFiles\CMakeTmp\cmTC_66dc4.vcxproj]
D:\Windows Kits\10\Include\10.0.17763.0\ucrt\locale.h(99): note: see declaration of 'lconv'
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(135): note: while compiling class template member function "void std::_Mpunct<_Elem>::_Getvals(wchar_t,const lconv *)"
with
[
_Elem=wchar_t
]
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(160): note: see reference to function template instantiation "void std::_Mpunct<_Elem>::_Getvals(wchar_t,const lconv *)" being compiled
with
[
_Elem=wchar_t
]
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(276): note: see reference to class template instantiation "std::_Mpunct<_Elem>" being compiled
with
[
_Elem=wchar_t
]
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(1030): note: see reference to class template instantiation "std::moneypunct<wchar_t,true>" being compiled
D:\vs2017\VC\Tools\MSVC\14.16.27023\include\xlocmon(137): error C2039: '_W_currency_symbol': is not a member of 'lconv' [D:\FoxitGit\pe-parse\build\CMakeFiles\CMakeTmp\cmTC_66dc4.vcxproj]
D:\Windows Kits\10\Include\10.0.17763.0\ucrt\locale.h(99): note: see declaration of 'lconv'
|
test
|
compile error env: 、vistual studio 、cmake version error info: determining if the include file filesystem exists failed with the following output change dir d foxitgit pe parse build cmakefiles cmaketmp run build command s d msbuild bin msbuild exe cmtc vcxproj p configuration debug p platform p visualstudioversion v m 用于 net framework 的 microsoft r 生成引擎版本 版权所有 c microsoft corporation。保留所有权利。 用于 的 microsoft r c c 优化编译器 版 版权所有 c microsoft corporation。保留所有权利。 cl c zi wx diagnostics classic od oy d d windows d cmake intdir debug d mbcs gm ehsc mdd gs fp precise zc wchar t zc forscope zc inline gr std c fo cmtc dir debug fd cmtc dir debug pdb gd tp analyze errorreport queue d foxitgit pe parse build cmakefiles cmaketmp checkincludefile cxx checkincludefile cxx d vc tools msvc include xlocmon error “ w int curr symbol” 不是“lconv”的成员 d windows kits include ucrt locale h note 参见“lconv”的声明 d vc tools msvc include xlocmon note 编译 类 模板 成员函数 void std mpunct getvals wchar t const lconv 时 with elem wchar t d vc tools msvc include xlocmon note 参见对正在编译的函数 模板 实例化“void std mpunct getvals wchar t const lconv ”的引用 with elem wchar t d vc tools msvc include xlocmon note 参见对正在编译的 类 模板 实例化 std mpunct 的引用 with elem wchar t d vc tools msvc include xlocmon note 参见对正在编译的 类 模板 实例化 std moneypunct 的引用 d vc tools msvc include xlocmon error “ w currency symbol” 不是“lconv”的成员 d windows kits include ucrt locale h note 参见“lconv”的声明 determining if the include file experimental filesystem exists failed with the following output change dir d foxitgit pe parse build cmakefiles cmaketmp run build command s d msbuild bin msbuild exe cmtc vcxproj p configuration debug p platform p visualstudioversion v m 用于 net framework 的 microsoft r 生成引擎版本 版权所有 c microsoft corporation。保留所有权利。 用于 的 microsoft r c c 优化编译器 版 版权所有 c microsoft corporation。保留所有权利。 cl c zi wx diagnostics classic od oy d d windows d cmake intdir debug d mbcs gm ehsc mdd gs fp precise zc wchar t zc forscope zc inline gr std c fo cmtc dir debug fd 
cmtc dir debug pdb gd tp analyze errorreport queue d foxitgit pe parse build cmakefiles cmaketmp checkincludefile cxx checkincludefile cxx d vc tools msvc include xlocmon error “ w int curr symbol” 不是“lconv”的成员 d windows kits include ucrt locale h note 参见“lconv”的声明 d vc tools msvc include xlocmon note 编译 类 模板 成员函数 void std mpunct getvals wchar t const lconv 时 with elem wchar t d vc tools msvc include xlocmon note 参见对正在编译的函数 模板 实例化“void std mpunct getvals wchar t const lconv ”的引用 with elem wchar t d vc tools msvc include xlocmon note 参见对正在编译的 类 模板 实例化 std mpunct 的引用 with elem wchar t d vc tools msvc include xlocmon note 参见对正在编译的 类 模板 实例化 std moneypunct 的引用 d vc tools msvc include xlocmon error “ w currency symbol” 不是“lconv”的成员 d windows kits include ucrt locale h note 参见“lconv”的声明
| 1
|
212,058
| 23,857,004,585
|
IssuesEvent
|
2022-09-07 01:24:56
|
yael-lindman/forever
|
https://api.github.com/repos/yael-lindman/forever
|
closed
|
CVE-2020-7774 (High) detected in y18n-4.0.0.tgz - autoclosed
|
security vulnerability
|
## CVE-2020-7774 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>y18n-4.0.0.tgz</b></p></summary>
<p>the bare-bones internationalization library used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/y18n/-/y18n-4.0.0.tgz">https://registry.npmjs.org/y18n/-/y18n-4.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mocha/node_modules/y18n/package.json,/node_modules/yargs-unparser/node_modules/y18n/package.json</p>
<p>
Dependency Hierarchy:
- mocha-6.2.2.tgz (Root Library)
- yargs-13.3.0.tgz
- :x: **y18n-4.0.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/yael-lindman/forever/commit/78afd4ec3b0fcf33bf00a56ba44187272cd37cb6">78afd4ec3b0fcf33bf00a56ba44187272cd37cb6</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package y18n before 3.2.2, 4.0.1 and 5.0.5. PoC by po6ix: const y18n = require('y18n')(); y18n.setLocale('__proto__'); y18n.updateLocale({polluted: true}); console.log(polluted); // true
<p>Publish Date: 2020-11-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7774>CVE-2020-7774</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1654">https://www.npmjs.com/advisories/1654</a></p>
<p>Release Date: 2020-11-17</p>
<p>Fix Resolution (y18n): 4.0.1</p>
<p>Direct dependency fix Resolution (mocha): 6.2.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-7774 (High) detected in y18n-4.0.0.tgz - autoclosed - ## CVE-2020-7774 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>y18n-4.0.0.tgz</b></p></summary>
<p>the bare-bones internationalization library used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/y18n/-/y18n-4.0.0.tgz">https://registry.npmjs.org/y18n/-/y18n-4.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/mocha/node_modules/y18n/package.json,/node_modules/yargs-unparser/node_modules/y18n/package.json</p>
<p>
Dependency Hierarchy:
- mocha-6.2.2.tgz (Root Library)
- yargs-13.3.0.tgz
- :x: **y18n-4.0.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/yael-lindman/forever/commit/78afd4ec3b0fcf33bf00a56ba44187272cd37cb6">78afd4ec3b0fcf33bf00a56ba44187272cd37cb6</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package y18n before 3.2.2, 4.0.1 and 5.0.5. PoC by po6ix: const y18n = require('y18n')(); y18n.setLocale('__proto__'); y18n.updateLocale({polluted: true}); console.log(polluted); // true
<p>Publish Date: 2020-11-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7774>CVE-2020-7774</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1654">https://www.npmjs.com/advisories/1654</a></p>
<p>Release Date: 2020-11-17</p>
<p>Fix Resolution (y18n): 4.0.1</p>
<p>Direct dependency fix Resolution (mocha): 6.2.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve high detected in tgz autoclosed cve high severity vulnerability vulnerable library tgz the bare bones internationalization library used by yargs library home page a href path to dependency file package json path to vulnerable library node modules mocha node modules package json node modules yargs unparser node modules package json dependency hierarchy mocha tgz root library yargs tgz x tgz vulnerable library found in head commit a href vulnerability details this affects the package before and poc by const require setlocale proto updatelocale polluted true console log polluted true publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution direct dependency fix resolution mocha step up your open source security game with whitesource
| 0
|
53,786
| 6,758,829,481
|
IssuesEvent
|
2017-10-24 15:16:19
|
mozilla/network-pulse
|
https://api.github.com/repos/mozilla/network-pulse
|
opened
|
Optimize header & nav for small screens
|
design help wanted
|
Pulse is awesome on mobile. It could use a little TLC up top on small screens. When 320px wide (iphone 5), the search icon falls to the second line. Sign-up, sign-in gets cut off (could we just say "Login" up there?).

cc @kristinashu
|
1.0
|
Optimize header & nav for small screens - Pulse is awesome on mobile. It could use a little TLC up top on small screens. When 320px wide (iphone 5), the search icon falls to the second line. Sign-up, sign-in gets cut off (could we just say "Login" up there?).

cc @kristinashu
|
non_test
|
optimize header nav for small screens pulse is awesome on mobile it could use a little tlc up top on small screens when wide iphone the search icon falls to the second line sign up sign in gets cut off could we just say login up there cc kristinashu
| 0
|
155,149
| 12,240,306,646
|
IssuesEvent
|
2020-05-04 23:53:18
|
Azure/azure-iot-sdk-csharp
|
https://api.github.com/repos/Azure/azure-iot-sdk-csharp
|
closed
|
Twin fault injection test does not check desired property update callback operation properly
|
IoTSDK test bug
|
The twin DesiredPropertyUpdate callback is not tested properly for the fault injection tests (highlighted for MqttWs test):
Steps:
1. Perform initialization operation (set the device client desired property update callback handler).
1. Perform baseline test operation - service updates the twin, device client desired property update callback handler should get the updated properties.
1. Inject fault for duration <durationInSec>.
1. During faulted duration, perform the test operation in step 2 again (sdk should retry and finally succeed once fault duration has passed). As a result, service should update the twin, device client desired property update callback handler should get the updated properties.
1. Close the device client instance.
Checking the logs, we can see that device client desired property callback handler is not getting called in step 4.
If we add a delay of 1s before step 4, we can see the callback being called.
|
1.0
|
test
| 1
|
16,405
| 9,393,292,589
|
IssuesEvent
|
2019-04-07 10:19:23
|
api-platform/core
|
https://api.github.com/repos/api-platform/core
|
closed
|
DataProvider Overhead?
|
enhancement performance unconfirmed
|
I have numerous tables that have an ID but also a combination (Client ID, Client External ID) as a unique index.
I would naturally like to expose data based on client ID. If I create a data provider for each entity that has this combination, I could end up with almost as many data providers as I have entities.
If we assume, say, 50 data providers, all defined with priority 5: will the platform iterate through each provider until a matching one is found, i.e., O(n) in the worst case to determine the required provider? Or does the platform determine ahead of time which provider(s) will be required?
If the former, is there any way to reduce this overhead to O(1), considering these data providers would be scanned on each request? Maybe adding a "provider" to an operation..
/**
 * @ApiResource(
 *     attributes={
 *         "normalization_context"={"groups"={"API_READ"}},
 *         "denormalization_context"={"groups"={"API_WRITE"}}
 *     },
 *     collectionOperations={},
 *     itemOperations={
 *         "getBySpecialId"={
 *             "provider"=CustomerAccountItemDataProvider::class, <---- something like this, rather than defining a kernel event listener
 *             "method"="GET",
 *             "path"="/customer-account/{id}",
 *             "controller"=CustomerAccountAction::class,
 *             "force_eager"=true
 *         }
 *     }
 * )
 */
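To make the O(n) vs O(1) distinction above concrete, here is a minimal sketch of the two lookup strategies (in Python for brevity, not API Platform's actual code; all class, resource, and provider names are invented for illustration):

```python
class ChainLookup:
    """O(n): walk the providers in priority order until one supports the resource."""
    def __init__(self, providers):
        # providers: list of (priority, supported_resource, provider_name);
        # highest priority is consulted first
        self.providers = sorted(providers, key=lambda p: -p[0])

    def find(self, resource):
        for _priority, supported, name in self.providers:
            if supported == resource:
                return name
        return None


class MappedLookup:
    """O(1): resolve the provider from a precomputed resource -> provider map,
    e.g. one that a hypothetical per-operation "provider" attribute could let
    the framework build at container-compile time."""
    def __init__(self, mapping):
        self.mapping = mapping

    def find(self, resource):
        return self.mapping.get(resource)


providers = [(5, "CustomerAccount", "CustomerAccountItemDataProvider"),
             (5, "Order", "OrderItemDataProvider")]
chain = ChainLookup(providers)
mapped = MappedLookup({resource: name for _p, resource, name in providers})
print(chain.find("Order"))   # scans providers until one matches
print(mapped.find("Order"))  # direct hash lookup, no scan
```

The chain is how a "supports?" style iteration behaves in the worst case; the map is what a declared per-operation provider would allow.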
|
True
|
non_test
| 0