Dataset column summary:

| column | dtype | values / lengths |
|---|---|---|
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 4 – 112 |
| repo_url | string | length 33 – 141 |
| action | string | 3 classes |
| title | string | length 1 – 1.02k |
| labels | string | length 4 – 1.54k |
| body | string | length 1 – 262k |
| index | string | 17 classes |
| text_combine | string | length 95 – 262k |
| label | string | 2 classes |
| text | string | length 96 – 252k |
| binary_label | int64 | 0 – 1 |
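The schema above reads like a pandas DataFrame preview. A minimal sketch of checking the apparent relationship between the `label` and `binary_label` columns, using a few hypothetical rows mirroring the records below (the actual file and loading code are not part of this dump):

```python
import pandas as pd

# Hypothetical rows mirroring the dump's columns (values taken from the records below).
df = pd.DataFrame({
    "repo": ["dotnet/wcf", "rstudio/rstudio", "pwa-builder/PWABuilder"],
    "label": ["test", "test", "non_test"],
    "binary_label": [1, 1, 0],
})

# In every record shown, label == "test" coincides with binary_label == 1,
# so the two columns appear to encode the same target:
assert (df["label"].eq("test") == df["binary_label"].eq(1)).all()

print(df["binary_label"].value_counts())
```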
12,872
| 3,293,697,877
|
IssuesEvent
|
2015-10-30 20:15:12
|
dotnet/wcf
|
https://api.github.com/repos/dotnet/wcf
|
closed
|
X509Store throws NotImplemented for read/Write on Linux
|
Linux test bug
|
Https_ClientCredentialTypeTests.BasicAuthenticationInvalidPwd_throw_MessageSecurityException [FAIL]
Assert.Throws() Failure
Expected: typeof(System.ServiceModel.Security.MessageSecurityException)
Actual: typeof(System.NotImplementedException): The method or operation is not implemented.
Stack Trace:
at Internal.Cryptography.Pal.StorePal.FromSystemStore(String storeName, StoreLocation storeLocation, OpenFlags openFlags)
at System.Security.Cryptography.X509Certificates.X509Store.Open(OpenFlags flags)
at Infrastructure.Common.BridgeClientCertificateManager.AddToStoreIfNeeded(StoreName storeName, StoreLocation storeLocation, X509Certificate2 certificate)
at Infrastructure.Common.BridgeClientCertificateManager.InstallCertificateToRootStore(X509Certificate2 certificate)
at Infrastructure.Common.BridgeClientCertificateManager.InstallRootCertificateFromBridge()
at Endpoints.get_Https_BasicAuth_Address()
at Https_ClientCredentialTypeTests.<>c.<BasicAuthenticationInvalidPwd_throw_MessageSecurityException>b__4_0()
|
1.0
|
X509Store throws NotImplemented for read/Write on Linux -
Https_ClientCredentialTypeTests.BasicAuthenticationInvalidPwd_throw_MessageSecurityException [FAIL]
Assert.Throws() Failure
Expected: typeof(System.ServiceModel.Security.MessageSecurityException)
Actual: typeof(System.NotImplementedException): The method or operation is not implemented.
Stack Trace:
at Internal.Cryptography.Pal.StorePal.FromSystemStore(String storeName, StoreLocation storeLocation, OpenFlags openFlags)
at System.Security.Cryptography.X509Certificates.X509Store.Open(OpenFlags flags)
at Infrastructure.Common.BridgeClientCertificateManager.AddToStoreIfNeeded(StoreName storeName, StoreLocation storeLocation, X509Certificate2 certificate)
at Infrastructure.Common.BridgeClientCertificateManager.InstallCertificateToRootStore(X509Certificate2 certificate)
at Infrastructure.Common.BridgeClientCertificateManager.InstallRootCertificateFromBridge()
at Endpoints.get_Https_BasicAuth_Address()
at Https_ClientCredentialTypeTests.<>c.<BasicAuthenticationInvalidPwd_throw_MessageSecurityException>b__4_0()
|
test
|
throws notimplemented for read write on linux https clientcredentialtypetests basicauthenticationinvalidpwd throw messagesecurityexception assert throws failure expected typeof system servicemodel security messagesecurityexception actual typeof system notimplementedexception the method or operation is not implemented stack trace at internal cryptography pal storepal fromsystemstore string storename storelocation storelocation openflags openflags at system security cryptography open openflags flags at infrastructure common bridgeclientcertificatemanager addtostoreifneeded storename storename storelocation storelocation certificate at infrastructure common bridgeclientcertificatemanager installcertificatetorootstore certificate at infrastructure common bridgeclientcertificatemanager installrootcertificatefrombridge at endpoints get https basicauth address at https clientcredentialtypetests c b
| 1
|
259,982
| 22,582,115,995
|
IssuesEvent
|
2022-06-28 12:34:22
|
rstudio/rstudio
|
https://api.github.com/repos/rstudio/rstudio
|
opened
|
Dropdown menus rendering "under" the editor region in latest Safari on macOS (Backport: Ghost Orchid)
|
bug macos test backport backport-ghost-orchid
|
Dropdown menus and tooltips are hiding behind other elements on Safari.
This covers the Ghost Orchid backport of issue #10821.
|
1.0
|
Dropdown menus rendering "under" the editor region in latest Safari on macOS (Backport: Ghost Orchid) - Dropdown menus and tooltips are hiding behind other elements on Safari.
This covers the Ghost Orchid backport of issue #10821.
|
test
|
dropdown menus rendering under the editor region in latest safari on macos backport ghost orchid dropdown menus and tooltips are hiding behind other elements on safari this covers the ghost orchid backport of issue
| 1
|
129,638
| 27,528,170,055
|
IssuesEvent
|
2023-03-06 19:45:40
|
pwa-builder/PWABuilder
|
https://api.github.com/repos/pwa-builder/PWABuilder
|
closed
|
[VSCODE] Check icon sizes in snippets
|
bug :bug: vscode
|
### What happened?
I got these two errors/warnings:

### What do you expect to happen?
We can fix the first error by adding the icon path for the 96x96 icon (which I happened to find already exists on your website):
```
"shortcuts": [
  {
    "name": "The name you would like to be displayed for your shortcut",
    "url": "The url you would like to open when the user chooses this shortcut. This must be a URL local to your PWA. For example: If my start_url is /, this URL must be something like /shortcut",
    "description": "A description of the functionality of this shortcut",
    "icons": [
      {
        "src": "https://www.pwabuilder.com/assets/icons/icon_96.png",
        "sizes": "96x96",
        "type": "image/png",
        "purpose": "maskable"
      }
    ]
  }
]
```
As for the second error, we can match the size with the image you referenced.
### What version of VS Code are you using?
_No response_
### Relevant log output
_No response_
### Are you using the latest version of the VS Code extension?
- [X] I am using the latest version of the VS Code extension
|
1.0
|
[VSCODE] Check icon sizes in snippets - ### What happened?
I got these two errors/warnings:

### What do you expect to happen?
We can fix the first error by adding the icon path for the 96x96 icon (which I happened to find already exists on your website):
```
"shortcuts": [
  {
    "name": "The name you would like to be displayed for your shortcut",
    "url": "The url you would like to open when the user chooses this shortcut. This must be a URL local to your PWA. For example: If my start_url is /, this URL must be something like /shortcut",
    "description": "A description of the functionality of this shortcut",
    "icons": [
      {
        "src": "https://www.pwabuilder.com/assets/icons/icon_96.png",
        "sizes": "96x96",
        "type": "image/png",
        "purpose": "maskable"
      }
    ]
  }
]
```
As for the second error, we can match the size with the image you referenced.
### What version of VS Code are you using?
_No response_
### Relevant log output
_No response_
### Are you using the latest version of the VS Code extension?
- [X] I am using the latest version of the VS Code extension
|
non_test
|
check icon sizes in snippets what happened i got this two error or warning what do you expect to happen we can fix the first error by adding the icon path for icon which accidentally i found that is exist in your website shortcuts name the name you would like to be displayed for your shortcut url the url you would like to open when the user chooses this shortcut this must be a url local to your pwa for example if my start url is this url must be something like shortcut description a description of the functionality of this shortcut icons src sizes type image png purpose maskable and about the second error we can match the size with the image you addressed what version of vs code are you using no response relevant log output no response are you using the latest version of the vs code extension i am using the latest version of the vs code extension
| 0
|
726,209
| 24,991,897,712
|
IssuesEvent
|
2022-11-02 19:34:15
|
VEuPathDB/web-eda
|
https://api.github.com/repos/VEuPathDB/web-eda
|
closed
|
Variable with binary outcome cannot be chosen in 2x2 mosaic plot
|
bug high priority
|
In the MORDOR study, the variable "study arm" has two values, but it cannot be chosen in the 2x2 mosaic plot; however, if I plot it in the RxC mosaic plot while keeping everything else the same, it works.
In the 2x2 mosaic, I cannot choose the variable "study arm" for either the x or y axis.
<img width="922" alt="Screen Shot 2022-10-26 at 5 11 38 PM" src="https://user-images.githubusercontent.com/61077845/198142994-acf78908-4dac-4046-bc51-63f5315613cb.png">
In RXC mosaic, it works.
<img width="889" alt="Screen Shot 2022-10-26 at 5 30 32 PM" src="https://user-images.githubusercontent.com/61077845/198143181-c7b178aa-343b-48ea-b37c-fc52993e77cb.png">
|
1.0
|
Variable with binary outcome cannot be chosen in 2x2 mosaic plot - In the MORDOR study, the variable "study arm" has two values, but it cannot be chosen in the 2x2 mosaic plot; however, if I plot it in the RxC mosaic plot while keeping everything else the same, it works.
In the 2x2 mosaic, I cannot choose the variable "study arm" for either the x or y axis.
<img width="922" alt="Screen Shot 2022-10-26 at 5 11 38 PM" src="https://user-images.githubusercontent.com/61077845/198142994-acf78908-4dac-4046-bc51-63f5315613cb.png">
In RXC mosaic, it works.
<img width="889" alt="Screen Shot 2022-10-26 at 5 30 32 PM" src="https://user-images.githubusercontent.com/61077845/198143181-c7b178aa-343b-48ea-b37c-fc52993e77cb.png">
|
non_test
|
variable with binary outcome cannot be chosen in mosaic plot in mordor study the var study arm has two values but it cannot be chosen in mosaic however if i plot it in rxc mosaic plot meanwhile keep all other things the same it works in mosaic i cannot choose var study arm for either x or y axis img width alt screen shot at pm src in rxc mosaic it works img width alt screen shot at pm src
| 0
|
1,487
| 2,550,414,550
|
IssuesEvent
|
2015-02-01 14:44:35
|
Kademi/kademi-dev
|
https://api.github.com/repos/Kademi/kademi-dev
|
closed
|
Allows users to opt-in to groups from contactUsApp
|
bug enhancement question Ready to Test QA
|
This will allow users to opt in to groups from the contact-us page, the same way they can from the signup page.
|
1.0
|
Allows users to opt-in to groups from contactUsApp - This will allow users to opt in to groups from the contact-us page, the same way they can from the signup page.
|
test
|
allows users to opt in to groups from contactusapp this will allows users to opt in to groups from the contact us page the same way users can from the signup page
| 1
|
789,453
| 27,790,326,056
|
IssuesEvent
|
2023-03-17 08:24:01
|
HaDuve/TravelCostNative
|
https://api.github.com/repos/HaDuve/TravelCostNative
|
opened
|
add split summary information
|
Enhancement Frontend 3 - Low priority AAA - Complex
|
- [ ] Combine the splits from several trips for simplify into a per-person category containing all splits between you and one other person
- [ ] From the splits, calculate, store, and display by default the total amount you still get back (overall and from all others)
|
1.0
|
add split summary information - - [ ] Combine the splits from several trips for simplify into a per-person category containing all splits between you and one other person
- [ ] From the splits, calculate, store, and display by default the total amount you still get back (overall and from all others)
|
non_test
|
add split summary information aus mehreren reisen die splits zusammenfassen für simplify in einer personen kategorie wo alle splits zwischen dir und einer andern person aus den splits die gesamtsumme berechnen speichern und standardmässig anzeigen die man insgesamt und von allen anderen noch zurück kriegt
| 0
|
208,221
| 16,106,732,166
|
IssuesEvent
|
2021-04-27 15:44:39
|
apache/camel-quarkus
|
https://api.github.com/repos/apache/camel-quarkus
|
opened
|
Document locale limitations in native mode
|
documentation
|
We should add a paragraph about locale to https://camel.apache.org/camel-quarkus/latest/user-guide/native-mode.html
It should document the state before and after GraalVM 21.1 https://github.com/oracle/graal/issues/2908
See also https://github.com/quarkusio/quarkus/issues/5244
|
1.0
|
Document locale limitations in native mode - We should add a paragraph about locale to https://camel.apache.org/camel-quarkus/latest/user-guide/native-mode.html
It should document the state before and after GraalVM 21.1 https://github.com/oracle/graal/issues/2908
See also https://github.com/quarkusio/quarkus/issues/5244
|
non_test
|
document locale limitations in native mode we should add a paragraph about locale to it should document the state before and after graalvm see also
| 0
|
339,402
| 30,446,169,871
|
IssuesEvent
|
2023-07-15 17:43:33
|
natiatabatadzebtu/mid-term-versioning
|
https://api.github.com/repos/natiatabatadzebtu/mid-term-versioning
|
opened
|
640c851 failed unit and formatting tests.
|
ci-pytest ci-black
|
Automatically generated message
640c8517801ba0baab4fcca90ce913ede6618f37 failed unit and formatting tests.
Pytest report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442899/pytest.html
Black report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442899/black.html
Automatically generated message
640c8517801ba0baab4fcca90ce913ede6618f37 failed unit and formatting tests.
Pytest report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442927/pytest.html
Black report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442927/black.html
Automatically generated message
640c8517801ba0baab4fcca90ce913ede6618f37 failed unit and formatting tests.
Pytest report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442954/pytest.html
Black report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442954/black.html
Automatically generated message
640c8517801ba0baab4fcca90ce913ede6618f37 failed unit and formatting tests.
Pytest report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442981/pytest.html
Black report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442981/black.html
Automatically generated message
640c8517801ba0baab4fcca90ce913ede6618f37 failed unit and formatting tests.
Pytest report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689443009/pytest.html
Black report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689443009/black.html
|
1.0
|
640c851 failed unit and formatting tests. - Automatically generated message
640c8517801ba0baab4fcca90ce913ede6618f37 failed unit and formatting tests.
Pytest report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442899/pytest.html
Black report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442899/black.html
Automatically generated message
640c8517801ba0baab4fcca90ce913ede6618f37 failed unit and formatting tests.
Pytest report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442927/pytest.html
Black report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442927/black.html
Automatically generated message
640c8517801ba0baab4fcca90ce913ede6618f37 failed unit and formatting tests.
Pytest report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442954/pytest.html
Black report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442954/black.html
Automatically generated message
640c8517801ba0baab4fcca90ce913ede6618f37 failed unit and formatting tests.
Pytest report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442981/pytest.html
Black report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689442981/black.html
Automatically generated message
640c8517801ba0baab4fcca90ce913ede6618f37 failed unit and formatting tests.
Pytest report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689443009/pytest.html
Black report: https://natiatabatadzebtu.github.io/mid-term-versioning-ci/640c8517801ba0baab4fcca90ce913ede6618f37-1689443009/black.html
|
test
|
failed unit and formatting tests automatically generated message failed unit and formatting tests pytest report black report automatically generated message failed unit and formatting tests pytest report black report automatically generated message failed unit and formatting tests pytest report black report automatically generated message failed unit and formatting tests pytest report black report automatically generated message failed unit and formatting tests pytest report black report
| 1
|
52,680
| 6,650,276,470
|
IssuesEvent
|
2017-09-28 15:47:17
|
18F/openFEC-web-app
|
https://api.github.com/repos/18F/openFEC-web-app
|
closed
|
Request: Provide context for what is included in the data tables
|
Work: Design
|
@noahmanger commented on [Tue Jun 21 2016](https://github.com/18F/FEC/issues/370)
Another comment shared offline:

I think this raises an interesting question about if we want to bring in the "About this data" pattern from the overviews to the data table pages.
---
@jenniferthibault commented on [Thu Jun 30 2016](https://github.com/18F/FEC/issues/370#issuecomment-229743714)
Moving this to the workflow, with design dependencies in https://github.com/18F/openFEC-web-app/issues/1303
|
1.0
|
Request: Provide context for what is included in the data tables - @noahmanger commented on [Tue Jun 21 2016](https://github.com/18F/FEC/issues/370)
Another comment shared offline:

I think this raises an interesting question about if we want to bring in the "About this data" pattern from the overviews to the data table pages.
---
@jenniferthibault commented on [Thu Jun 30 2016](https://github.com/18F/FEC/issues/370#issuecomment-229743714)
Moving this to the workflow, with design dependencies in https://github.com/18F/openFEC-web-app/issues/1303
|
non_test
|
request provide context for what is included in the data tables noahmanger commented on another comment shared offline i think this raises an interesting question about if we want to bring in the about this data pattern from the overviews to the data table pages jenniferthibault commented on moving this to the workflow with design dependencies in
| 0
|
93,171
| 8,402,803,114
|
IssuesEvent
|
2018-10-11 07:58:13
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
teamcity: failed test: TestStoreRangeMergeWatcher
|
C-test-failure O-robot
|
The following tests appear to have failed on release-2.1 (test): TestStoreRangeMergeWatcher/inject-failures=false, TestStoreRangeMergeWatcher
You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestStoreRangeMergeWatcher).
[#958017](https://teamcity.cockroachdb.com/viewLog.html?buildId=958017):
```
TestStoreRangeMergeWatcher
--- FAIL: test/TestStoreRangeMergeWatcher (0.000s)
Test ended in panic.
TestStoreRangeMergeWatcher/inject-failures=false
...ach/pkg/storage/scheduler.go:196 +0x7c
github.com/cockroachdb/cockroach/pkg/storage.(*raftScheduler).Start.func2(0x2c06fc0, 0xc4215e8630)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/scheduler.go:165 +0x3e
github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker.func1(0xc42051e690, 0xc420dc2ab0, 0xc42051e680)
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:199 +0xe9
created by github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:192 +0xad
goroutine 47470 [semacquire]:
sync.runtime_notifyListWait(0xc420dd8510, 0x799c)
/usr/local/go/src/runtime/sema.go:510 +0x10b
sync.(*Cond).Wait(0xc420dd8500)
/usr/local/go/src/sync/cond.go:56 +0x80
github.com/cockroachdb/cockroach/pkg/storage.(*raftScheduler).worker(0xc420f6a300, 0x2c06fc0, 0xc420acc7b0)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/scheduler.go:196 +0x7c
github.com/cockroachdb/cockroach/pkg/storage.(*raftScheduler).Start.func2(0x2c06fc0, 0xc420acc7b0)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/scheduler.go:165 +0x3e
github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker.func1(0xc4215ecb00, 0xc420f5ac60, 0xc4215ecaf0)
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:199 +0xe9
created by github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:192 +0xad
goroutine 47523 [select]:
github.com/cockroachdb/cockroach/pkg/storage.(*Store).startGossip.func4(0x2c06fc0, 0xc420b74d50)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/store.go:1679 +0x39d
github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker.func1(0xc4215ed3d0, 0xc420f5ac60, 0xc4212b5130)
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:199 +0xe9
created by github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:192 +0xad
goroutine 47650 [select]:
github.com/cockroachdb/cockroach/pkg/gossip.(*client).gossip(0xc420afa9c0, 0x2c06fc0, 0xc421651770, 0xc420526820, 0x2c1d9c0, 0xc420386100, 0xc420f5a870, 0xc4204fea80, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/pkg/gossip/client.go:371 +0x414
github.com/cockroachdb/cockroach/pkg/gossip.(*client).startLocked.func1(0x2c06fc0, 0xc421651770)
/go/src/github.com/cockroachdb/cockroach/pkg/gossip/client.go:130 +0x399
github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker.func1(0xc4203860a0, 0xc420f5a870, 0xc42174c540)
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:199 +0xe9
created by github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:192 +0xad
goroutine 47761 [select, 7 minutes]:
github.com/cockroachdb/cockroach/pkg/storage/idalloc.(*Allocator).start.func1(0x2c06fc0, 0xc420daa8d0)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/idalloc/id_alloc.go:138 +0x638
github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker.func1(0xc4215ec800, 0xc420f5ac60, 0xc4215ec7f0)
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:199 +0xe9
created by github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:192 +0xad
goroutine 47438 [select]:
github.com/cockroachdb/cockroach/pkg/storage.(*baseQueue).processLoop.func1(0x2c06fc0, 0xc420b92ea0)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/queue.go:596 +0x194
github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker.func1(0xc4217f2800, 0xc420f5ac60, 0xc4204e74c0)
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:199 +0xe9
created by github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:192 +0xad
TestStoreRangeMergeWatcher
--- FAIL: testrace/TestStoreRangeMergeWatcher (2.910s)
Test ended in panic.
```
Please assign, take a look and update the issue accordingly.
|
1.0
|
teamcity: failed test: TestStoreRangeMergeWatcher - The following tests appear to have failed on release-2.1 (test): TestStoreRangeMergeWatcher/inject-failures=false, TestStoreRangeMergeWatcher
You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestStoreRangeMergeWatcher).
[#958017](https://teamcity.cockroachdb.com/viewLog.html?buildId=958017):
```
TestStoreRangeMergeWatcher
--- FAIL: test/TestStoreRangeMergeWatcher (0.000s)
Test ended in panic.
TestStoreRangeMergeWatcher/inject-failures=false
...ach/pkg/storage/scheduler.go:196 +0x7c
github.com/cockroachdb/cockroach/pkg/storage.(*raftScheduler).Start.func2(0x2c06fc0, 0xc4215e8630)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/scheduler.go:165 +0x3e
github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker.func1(0xc42051e690, 0xc420dc2ab0, 0xc42051e680)
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:199 +0xe9
created by github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:192 +0xad
goroutine 47470 [semacquire]:
sync.runtime_notifyListWait(0xc420dd8510, 0x799c)
/usr/local/go/src/runtime/sema.go:510 +0x10b
sync.(*Cond).Wait(0xc420dd8500)
/usr/local/go/src/sync/cond.go:56 +0x80
github.com/cockroachdb/cockroach/pkg/storage.(*raftScheduler).worker(0xc420f6a300, 0x2c06fc0, 0xc420acc7b0)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/scheduler.go:196 +0x7c
github.com/cockroachdb/cockroach/pkg/storage.(*raftScheduler).Start.func2(0x2c06fc0, 0xc420acc7b0)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/scheduler.go:165 +0x3e
github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker.func1(0xc4215ecb00, 0xc420f5ac60, 0xc4215ecaf0)
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:199 +0xe9
created by github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:192 +0xad
goroutine 47523 [select]:
github.com/cockroachdb/cockroach/pkg/storage.(*Store).startGossip.func4(0x2c06fc0, 0xc420b74d50)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/store.go:1679 +0x39d
github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker.func1(0xc4215ed3d0, 0xc420f5ac60, 0xc4212b5130)
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:199 +0xe9
created by github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:192 +0xad
goroutine 47650 [select]:
github.com/cockroachdb/cockroach/pkg/gossip.(*client).gossip(0xc420afa9c0, 0x2c06fc0, 0xc421651770, 0xc420526820, 0x2c1d9c0, 0xc420386100, 0xc420f5a870, 0xc4204fea80, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/pkg/gossip/client.go:371 +0x414
github.com/cockroachdb/cockroach/pkg/gossip.(*client).startLocked.func1(0x2c06fc0, 0xc421651770)
/go/src/github.com/cockroachdb/cockroach/pkg/gossip/client.go:130 +0x399
github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker.func1(0xc4203860a0, 0xc420f5a870, 0xc42174c540)
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:199 +0xe9
created by github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:192 +0xad
goroutine 47761 [select, 7 minutes]:
github.com/cockroachdb/cockroach/pkg/storage/idalloc.(*Allocator).start.func1(0x2c06fc0, 0xc420daa8d0)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/idalloc/id_alloc.go:138 +0x638
github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker.func1(0xc4215ec800, 0xc420f5ac60, 0xc4215ec7f0)
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:199 +0xe9
created by github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:192 +0xad
goroutine 47438 [select]:
github.com/cockroachdb/cockroach/pkg/storage.(*baseQueue).processLoop.func1(0x2c06fc0, 0xc420b92ea0)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/queue.go:596 +0x194
github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker.func1(0xc4217f2800, 0xc420f5ac60, 0xc4204e74c0)
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:199 +0xe9
created by github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:192 +0xad
TestStoreRangeMergeWatcher
--- FAIL: testrace/TestStoreRangeMergeWatcher (2.910s)
Test ended in panic.
```
Please assign, take a look and update the issue accordingly.
|
test
|
teamcity failed test teststorerangemergewatcher the following tests appear to have failed on release test teststorerangemergewatcher inject failures false teststorerangemergewatcher you may want to check teststorerangemergewatcher fail test teststorerangemergewatcher test ended in panic teststorerangemergewatcher inject failures false ach pkg storage scheduler go github com cockroachdb cockroach pkg storage raftscheduler start go src github com cockroachdb cockroach pkg storage scheduler go github com cockroachdb cockroach pkg util stop stopper runworker go src github com cockroachdb cockroach pkg util stop stopper go created by github com cockroachdb cockroach pkg util stop stopper runworker go src github com cockroachdb cockroach pkg util stop stopper go goroutine sync runtime notifylistwait usr local go src runtime sema go sync cond wait usr local go src sync cond go github com cockroachdb cockroach pkg storage raftscheduler worker go src github com cockroachdb cockroach pkg storage scheduler go github com cockroachdb cockroach pkg storage raftscheduler start go src github com cockroachdb cockroach pkg storage scheduler go github com cockroachdb cockroach pkg util stop stopper runworker go src github com cockroachdb cockroach pkg util stop stopper go created by github com cockroachdb cockroach pkg util stop stopper runworker go src github com cockroachdb cockroach pkg util stop stopper go goroutine github com cockroachdb cockroach pkg storage store startgossip go src github com cockroachdb cockroach pkg storage store go github com cockroachdb cockroach pkg util stop stopper runworker go src github com cockroachdb cockroach pkg util stop stopper go created by github com cockroachdb cockroach pkg util stop stopper runworker go src github com cockroachdb cockroach pkg util stop stopper go goroutine github com cockroachdb cockroach pkg gossip client gossip go src github com cockroachdb cockroach pkg gossip client go github com cockroachdb cockroach pkg gossip client 
startlocked go src github com cockroachdb cockroach pkg gossip client go github com cockroachdb cockroach pkg util stop stopper runworker go src github com cockroachdb cockroach pkg util stop stopper go created by github com cockroachdb cockroach pkg util stop stopper runworker go src github com cockroachdb cockroach pkg util stop stopper go goroutine github com cockroachdb cockroach pkg storage idalloc allocator start go src github com cockroachdb cockroach pkg storage idalloc id alloc go github com cockroachdb cockroach pkg util stop stopper runworker go src github com cockroachdb cockroach pkg util stop stopper go created by github com cockroachdb cockroach pkg util stop stopper runworker go src github com cockroachdb cockroach pkg util stop stopper go goroutine github com cockroachdb cockroach pkg storage basequeue processloop go src github com cockroachdb cockroach pkg storage queue go github com cockroachdb cockroach pkg util stop stopper runworker go src github com cockroachdb cockroach pkg util stop stopper go created by github com cockroachdb cockroach pkg util stop stopper runworker go src github com cockroachdb cockroach pkg util stop stopper go teststorerangemergewatcher fail testrace teststorerangemergewatcher test ended in panic please assign take a look and update the issue accordingly
| 1
|
322,161
| 9,813,712,419
|
IssuesEvent
|
2019-06-13 08:38:03
|
WoWManiaUK/Blackwing-Lair
|
https://api.github.com/repos/WoWManiaUK/Blackwing-Lair
|
opened
|
[Npc] Julak-Doom - Twilight highlands
|
Low Priority zone 80-85 Cata
|
**Links:**
Npc - https://www.wowhead.com/npc=50089/julak-doom
Spell - https://www.wowhead.com/spell=93621
from WoWHead or our Armory
**What is happening:**
Julak-Doom not killable due to party wide mind control
**What should happen:**
Should not control whole party just a single target
|
1.0
|
[Npc] Julak-Doom - Twilight highlands - **Links:**
Npc - https://www.wowhead.com/npc=50089/julak-doom
Spell - https://www.wowhead.com/spell=93621
from WoWHead or our Armory
**What is happening:**
Julak-Doom not killable due to party wide mind control
**What should happen:**
Should not control whole party just a single target
|
non_test
|
julak doom twilight highlands links npc spell from wowhead or our armory what is happening julak doom not killable due to party wide mind control what should happen should not control whole party just a single target
| 0
|
233
| 2,659,742,846
|
IssuesEvent
|
2015-03-18 22:59:51
|
hammerlab/pileup.js
|
https://api.github.com/repos/hammerlab/pileup.js
|
closed
|
Update to flow 0.6.0
|
process
|
It includes support for [bounded polymorphism][1], which would be useful for `ContigInterval` (which can have either `number` or `string` contigs).
[1]: http://flowtype.org/blog/2015/03/12/Bounded-Polymorphism.html
|
1.0
|
Update to flow 0.6.0 - It includes support for [bounded polymorphism][1], which would be useful for `ContigInterval` (which can have either `number` or `string` contigs).
[1]: http://flowtype.org/blog/2015/03/12/Bounded-Polymorphism.html
|
non_test
|
update to flow it includes support for which would be useful for contiginterval which can have either number or string contigs
| 0
|
16,788
| 3,561,367,613
|
IssuesEvent
|
2016-01-23 19:05:03
|
tgstation/-tg-station
|
https://api.github.com/repos/tgstation/-tg-station
|
closed
|
Gangs can apparently only have one Dominator
|
Bug Needs Reproducing/Testing
|
Said over OOC and on the forums, needs reproducing/testing.
|
1.0
|
Gangs can apparently only have one Dominator - Said over OOC and on the forums, needs reproducing/testing.
|
test
|
gangs can apparently only have one dominator said over ooc and on the forums needs reproducing testing
| 1
|
286,234
| 24,732,234,556
|
IssuesEvent
|
2022-10-20 18:37:48
|
dotnet/source-build
|
https://api.github.com/repos/dotnet/source-build
|
closed
|
xunit smoke-test flaky: "The library 'libhostpolicy.so' required to execute the application was not found"
|
area-ci-testing
|
https://dev.azure.com/dnceng/internal/_build/results?buildId=715168&view=logs&j=e242d376-9fed-565d-2a2a-c4e65b5ab60e&t=a2b6a900-d726-529e-22d8-1a91fa385f64&l=253
This happened during tarball smoke-test in a `fedora30 Offline Portable` job:
```
starting language C#, type xunit
running new
The template "xUnit Test Project" was created successfully.
running restore
/tb/tarball_715168/testing-smoke/builtCli/sdk/3.1.106/MSBuild.dll -nologo -maxcpucount -target:Restore -verbosity:m /bl:/tb/tarball_715168/testing-smoke/C#_xunit_local_nohttps_restore.binlog ./C#_xunit.csproj
Restore completed in 1.82 sec for /tb/tarball_715168/testing-smoke/C#_xunit/C#_xunit.csproj.
running test
/tb/tarball_715168/testing-smoke/builtCli/sdk/3.1.106/MSBuild.dll -nologo -maxcpucount -nodereuse:false -restore -target:VSTest -verbosity:m -verbosity:quiet /bl:/tb/tarball_715168/testing-smoke/C#_xunit_local_nohttps_test.binlog ./C#_xunit.csproj
Test run for /tb/tarball_715168/testing-smoke/C#_xunit/bin/Debug/netcoreapp3.1/C#_xunit.dll(.NETCoreApp,Version=v3.1)
A fatal error was encountered. The library 'libhostpolicy.so' required to execute the application was not found in '/tb/tarball_715168/testing-smoke/builtCli/sdk/3.1.106/'.
Failed to run as a self-contained app. If this should be a framework-dependent app, add the /tb/tarball_715168/testing-smoke/builtCli/sdk/3.1.106/vstest.console.runtimeconfig.json file specifying the appropriate framework.
```
This may be difficult to diagnose without more artifacts (does the file actually not exist?): https://github.com/dotnet/source-build/issues/1345 / https://github.com/dotnet/source-build/issues/1321
|
1.0
|
xunit smoke-test flaky: "The library 'libhostpolicy.so' required to execute the application was not found" - https://dev.azure.com/dnceng/internal/_build/results?buildId=715168&view=logs&j=e242d376-9fed-565d-2a2a-c4e65b5ab60e&t=a2b6a900-d726-529e-22d8-1a91fa385f64&l=253
This happened during tarball smoke-test in a `fedora30 Offline Portable` job:
```
starting language C#, type xunit
running new
The template "xUnit Test Project" was created successfully.
running restore
/tb/tarball_715168/testing-smoke/builtCli/sdk/3.1.106/MSBuild.dll -nologo -maxcpucount -target:Restore -verbosity:m /bl:/tb/tarball_715168/testing-smoke/C#_xunit_local_nohttps_restore.binlog ./C#_xunit.csproj
Restore completed in 1.82 sec for /tb/tarball_715168/testing-smoke/C#_xunit/C#_xunit.csproj.
running test
/tb/tarball_715168/testing-smoke/builtCli/sdk/3.1.106/MSBuild.dll -nologo -maxcpucount -nodereuse:false -restore -target:VSTest -verbosity:m -verbosity:quiet /bl:/tb/tarball_715168/testing-smoke/C#_xunit_local_nohttps_test.binlog ./C#_xunit.csproj
Test run for /tb/tarball_715168/testing-smoke/C#_xunit/bin/Debug/netcoreapp3.1/C#_xunit.dll(.NETCoreApp,Version=v3.1)
A fatal error was encountered. The library 'libhostpolicy.so' required to execute the application was not found in '/tb/tarball_715168/testing-smoke/builtCli/sdk/3.1.106/'.
Failed to run as a self-contained app. If this should be a framework-dependent app, add the /tb/tarball_715168/testing-smoke/builtCli/sdk/3.1.106/vstest.console.runtimeconfig.json file specifying the appropriate framework.
```
This may be difficult to diagnose without more artifacts (does the file actually not exist?): https://github.com/dotnet/source-build/issues/1345 / https://github.com/dotnet/source-build/issues/1321
|
test
|
xunit smoke test flaky the library libhostpolicy so required to execute the application was not found this happened during tarball smoke test in a offline portable job starting language c type xunit running new the template xunit test project was created successfully running restore tb tarball testing smoke builtcli sdk msbuild dll nologo maxcpucount target restore verbosity m bl tb tarball testing smoke c xunit local nohttps restore binlog c xunit csproj restore completed in sec for tb tarball testing smoke c xunit c xunit csproj running test tb tarball testing smoke builtcli sdk msbuild dll nologo maxcpucount nodereuse false restore target vstest verbosity m verbosity quiet bl tb tarball testing smoke c xunit local nohttps test binlog c xunit csproj test run for tb tarball testing smoke c xunit bin debug c xunit dll netcoreapp version a fatal error was encountered the library libhostpolicy so required to execute the application was not found in tb tarball testing smoke builtcli sdk failed to run as a self contained app if this should be a framework dependent app add the tb tarball testing smoke builtcli sdk vstest console runtimeconfig json file specifying the appropriate framework this may be difficult to diagnose without more artifacts does the file actually not exist
| 1
|
226,874
| 18,045,932,638
|
IssuesEvent
|
2021-09-18 22:29:53
|
logicmoo/logicmoo_workspace
|
https://api.github.com/repos/logicmoo/logicmoo_workspace
|
opened
|
logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B JUnit
|
Test_9999 logicmoo.pfc.test.sanity_base unit_test NEVER_RETRACT_01B
|
(cd /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base ; timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc)
GH_MASTER_ISSUE_FINFO=
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ANEVER_RETRACT_01B
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/NEVER_RETRACT_01B/logicmoo_pfc_test_sanity_base_NEVER_RETRACT_01B_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/68/testReport/logicmoo.pfc.test.sanity_base/NEVER_RETRACT_01B/logicmoo_pfc_test_sanity_base_NEVER_RETRACT_01B_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://github.com/logicmoo/logicmoo_workspace/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc
```
%
running('/var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc'),
%~ this_test_might_need( :-( use_module( library(logicmoo_plarkc))))
:- expects_dialect(pfc).
(tType(COL)==>{kb_local(COL/2)},% functorDeclares(COL),
(t(COL,ext,X)<==>instanceOf(X,COL))).
tType(tFly).
tType(tCanary).
tType(tPenguin).
tType(tBird).
:- mpred_test(predicate_property(tBird(ext,_),dynamic)).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:39
%~ mpred_test("Test_0001_Line_0000__Ext",baseKB:predicate_property(tBird(ext,_456),dynamic))
/*~
%~ mpred_test("Test_0001_Line_0000__Ext",baseKB:predicate_property(tBird(ext,_456),dynamic))
passed=info(why_was_true(baseKB:predicate_property(tBird(ext,_456),dynamic)))
no_proof_for(predicate_property(tBird(ext,Bird_Ext),dynamic)).
no_proof_for(predicate_property(tBird(ext,Bird_Ext),dynamic)).
no_proof_for(predicate_property(tBird(ext,Bird_Ext),dynamic)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0001_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0001_Line_0000__Ext-junit.xml
~*/
subClassOf(C1,C2)==> (instanceOf(X,C1)==>instanceOf(X,C2)).
subClassOf(tCanary,tBird).
subClassOf(tPenguin,tBird).
:- dmsg("chilly is a penguin.").
%~ chilly is a penguin.
tPenguin(ext,iChilly).
:- mpred_test((tBird(ext,iChilly))).
%~ mpred_test("Test_0002_Line_0000__Ext",baseKB:tBird(ext,iChilly))
/*~
%~ mpred_test("Test_0002_Line_0000__Ext",baseKB:tBird(ext,iChilly))
passed=info(why_was_true(baseKB:tBird(ext,iChilly)))
no_proof_for(tBird(ext,iChilly)).
no_proof_for(tBird(ext,iChilly)).
no_proof_for(tBird(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0002_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0002_Line_0000__Ext-junit.xml
~*/
:- dmsg("tweety is a canary.").
%~ tweety is a canary.
tCanary(ext,iTweety).
:- mpred_test((tBird(ext,iTweety))).
%~ mpred_test("Test_0003_Line_0000__Ext",baseKB:tBird(ext,iTweety))
/*~
%~ mpred_test("Test_0003_Line_0000__Ext",baseKB:tBird(ext,iTweety))
passed=info(why_was_true(baseKB:tBird(ext,iTweety)))
no_proof_for(tBird(ext,iTweety)).
no_proof_for(tBird(ext,iTweety)).
no_proof_for(tBird(ext,iTweety)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0003_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0003_Line_0000__Ext-junit.xml
~*/
:- dmsg("birds fly by default.").
%~ birds fly by default.
mdefault(( tBird(ext,X) ==> tFly(ext,X) )).
:- mpred_test((tBird(ext,iTweety))).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:62
%~ mpred_test("Test_0004_Line_0000__Ext",baseKB:tBird(ext,iTweety))
/*~
%~ mpred_test("Test_0004_Line_0000__Ext",baseKB:tBird(ext,iTweety))
passed=info(why_was_true(baseKB:tBird(ext,iTweety)))
no_proof_for(tBird(ext,iTweety)).
no_proof_for(tBird(ext,iTweety)).
no_proof_for(tBird(ext,iTweety)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0004_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0004_Line_0000__Ext-junit.xml
~*/
:- mpred_test((tFly(ext,iTweety))).
%~ mpred_test("Test_0005_Line_0000__Ext",baseKB:tFly(ext,iTweety))
/*~
%~ mpred_test("Test_0005_Line_0000__Ext",baseKB:tFly(ext,iTweety))
passed=info(why_was_true(baseKB:tFly(ext,iTweety)))
no_proof_for(tFly(ext,iTweety)).
no_proof_for(tFly(ext,iTweety)).
no_proof_for(tFly(ext,iTweety)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0005_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0005_Line_0000__Ext-junit.xml
~*/
:- dmsg("make sure chilly can fly").
%~ make sure chilly can fly
:- mpred_test((instanceOf(I,tFly),I=iChilly)).
%~ mpred_test("Test_0006_Line_0000__TFly",baseKB:(instanceOf(_183882,tFly),_183882=iChilly))
/*~
%~ mpred_test("Test_0006_Line_0000__TFly",baseKB:(instanceOf(_183882,tFly),_183882=iChilly))
^ Call: (68) [baseKB] baseKB:instanceOf(_183882, tFly)
^ Fail: (68) [baseKB] baseKB:instanceOf(_183882, tFly)
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+ (instanceOf(_183882,tFly),_183882=iChilly))),rtrace(baseKB:(instanceOf(_183882,tFly),_183882=iChilly))))
no_proof_for(\+ (instanceOf(I,tFly),I=iChilly)).
no_proof_for(\+ (instanceOf(I,tFly),I=iChilly)).
no_proof_for(\+ (instanceOf(I,tFly),I=iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0006_Line_0000__TFly'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0006_Line_0000__TFly-junit.xml
~*/
:- mpred_test((tBird(ext,iTweety))).
%~ mpred_test("Test_0007_Line_0000__Ext",baseKB:tBird(ext,iTweety))
/*~
%~ mpred_test("Test_0007_Line_0000__Ext",baseKB:tBird(ext,iTweety))
passed=info(why_was_true(baseKB:tBird(ext,iTweety)))
no_proof_for(tBird(ext,iTweety)).
no_proof_for(tBird(ext,iTweety)).
no_proof_for(tBird(ext,iTweety)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0007_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0007_Line_0000__Ext-junit.xml
~*/
:- listing([tFly/2,tBird/2,instanceOf/2]).
%~ skipped( listing( [ tFly/2, tBird/2,instanceOf/2]))
:- dmsg("make sure tweety can fly (and again chilly)").
%~ make sure tweety can fly (and again chilly)
:- mpred_test((tFly(ext,iTweety))).
%~ mpred_test("Test_0008_Line_0000__Ext",baseKB:tFly(ext,iTweety))
/*~
%~ mpred_test("Test_0008_Line_0000__Ext",baseKB:tFly(ext,iTweety))
passed=info(why_was_true(baseKB:tFly(ext,iTweety)))
no_proof_for(tFly(ext,iTweety)).
no_proof_for(tFly(ext,iTweety)).
no_proof_for(tFly(ext,iTweety)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0008_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0008_Line_0000__Ext-junit.xml
~*/
:- mpred_test((tFly(ext,iChilly))).
%~ mpred_test("Test_0009_Line_0000__Ext",baseKB:tFly(ext,iChilly))
/*~
%~ mpred_test("Test_0009_Line_0000__Ext",baseKB:tFly(ext,iChilly))
passed=info(why_was_true(baseKB:tFly(ext,iChilly)))
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0009_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0009_Line_0000__Ext-junit.xml
~*/
never_retract_u(tBird(ext,iChilly)).
:- dmsg("penguins do not tFly.").
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:81
%~ penguins do not tFly.
tPenguin(ext,X) ==> ~tFly(ext,X).
:- dmsg("confirm chilly now cant fly").
%~ confirm chilly now cant fly
:- mpred_test((\+ tFly(ext,iChilly))).
%~ mpred_test("Test_0010_Line_0000__naf_Ext",baseKB:(\+tFly(ext,iChilly)))
/*~
%~ mpred_test("Test_0010_Line_0000__naf_Ext",baseKB:(\+tFly(ext,iChilly)))
^ Call: (68) [baseKB] baseKB:tFly(ext, iChilly)
^ Unify: (68) [baseKB] baseKB:tFly(ext, iChilly)
^ Exit: (68) [baseKB] baseKB:tFly(ext, iChilly)
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:tFly(ext,iChilly)),rtrace(baseKB:(\+tFly(ext,iChilly)))))
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0010_Line_0000__naf_Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0010_Line_0000__naf_Ext-junit.xml
~*/
:- mpred_test(( ~ tFly(ext,iChilly))).
%= repropigate that chilly was a bird again (actualy this asserts)
%~ mpred_test("Test_0011_Line_0000__Ext",baseKB: ~tFly(ext,iChilly))
/*~
%~ mpred_test("Test_0011_Line_0000__Ext",baseKB: ~tFly(ext,iChilly))
^ Call: (68) [baseKB] ~tFly(ext, iChilly)
^ Unify: (68) [baseKB] ~ (baseKB:tFly(ext, iChilly))
^ Call: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
^ Unify: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
Call: (76) [system] set_prolog_flag(last_call_optimisation, false)
Exit: (76) [system] set_prolog_flag(last_call_optimisation, false)
^ Call: (76) [loop_check] prolog_frame_attribute(1189, parent_goal, loop_check_term_frame(_547520, info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _547526, _547528))
^ Fail: (76) [loop_check] prolog_frame_attribute(1189, parent_goal, loop_check_term_frame(_547520, info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _547526, _547528))
^ Redo: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
Call: (76) [pfc_lib] neg_in_code0(baseKB:tFly(ext, iChilly))
Unify: (76) [pfc_lib] neg_in_code0(baseKB:tFly(ext, iChilly))
^ Call: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
^ Unify: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
Call: (83) [system] set_prolog_flag(last_call_optimisation, false)
Exit: (83) [system] set_prolog_flag(last_call_optimisation, false)
^ Call: (83) [loop_check] prolog_frame_attribute(1328, parent_goal, loop_check_term_frame(_553244, info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _553250, _553252))
^ Fail: (83) [loop_check] prolog_frame_attribute(1328, parent_goal, loop_check_term_frame(_553244, info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _553250, _553252))
^ Redo: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
Call: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
Unify: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
^ Call: (87) [pfc_lib] hook_database:clause_i(pfc_lib:prologNegByFailure(tFly), true, _556576)
^ Unify: (87) [pfc_lib] hook_database:clause_i(pfc_lib:prologNegByFailure(tFly), true, _556576)
^ Call: (88) [system] clause(pfc_lib:prologNegByFailure(tFly), true, _556576)
^ Fail: (88) [system] clause(pfc_lib:prologNegByFailure(tFly), true, _556576)
^ Fail: (87) [pfc_lib] hook_database:clause_i(pfc_lib:prologNegByFailure(tFly), true, _556576)
Unify: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
^ Call: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Unify: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Call: (85) [pfc_lib] ucatch:is_ftVar(baseKB:tFly(ext, iChilly))
^ Unify: (85) [pfc_lib] ucatch:is_ftVar(baseKB:tFly(ext, iChilly))
^ Fail: (85) [pfc_lib] ucatch:is_ftVar(baseKB:tFly(ext, iChilly))
^ Redo: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Exit: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Call: (88) [hook_database] hook_database:pfc_with_quiet_vars_lock((clause(mpred_prop(baseKB, tFly, 2, prologHybrid), _564746), call(_564746)*->true;clause_b(baseKB:mpred_prop(baseKB, tFly, 2, prologHybrid))))
^ Unify: (88) [hook_database] hook_database:pfc_with_quiet_vars_lock((clause(mpred_prop(baseKB, tFly, 2, prologHybrid), _564746), call(_564746)*->true;clause_b(baseKB:mpred_prop(baseKB, tFly, 2, prologHybrid))))
^ Call: (90) [hook_database] clause(mpred_prop(baseKB, tFly, 2, prologHybrid), _564746)
^ Exit: (90) [hook_database] clause(mpred_prop(baseKB, tFly, 2, prologHybrid), true)
Call: (90) [system] true
Exit: (90) [system] true
Call: (90) [system] true
Exit: (90) [system] true
^ Exit: (88) [hook_database] hook_database:pfc_with_quiet_vars_lock((clause(mpred_prop(baseKB, tFly, 2, prologHybrid), true), call(true)*->true;clause_b(baseKB:mpred_prop(baseKB, tFly, 2, prologHybrid))))
Fail: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
^ Fail: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
Fail: (76) [pfc_lib] neg_in_code0(baseKB:tFly(ext, iChilly))
^ Fail: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
^ Fail: (68) [baseKB] ~ (baseKB:tFly(ext, iChilly))
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+ ~tFly(ext,iChilly))),rtrace(baseKB: ~tFly(ext,iChilly))))
no_proof_for(\+ ~tFly(ext,iChilly)).
no_proof_for(\+ ~tFly(ext,iChilly)).
no_proof_for(\+ ~tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0011_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0011_Line_0000__Ext-junit.xml
~*/
%= repropigate that chilly was a bird again (actualy this asserts)
tBird(ext,iChilly).
:- listing(tBird/2).
%= the dmsg explains the difference between \+ and ~
%~ skipped( listing( tBird/2))
%= the dmsg explains the difference between \+ and ~
:- dmsg("confirm chilly still does not fly").
%~ confirm chilly still does not fly
:- mpred_test(( \+ tFly(ext,iChilly))).
%~ mpred_test("Test_0012_Line_0000__naf_Ext",baseKB:(\+tFly(ext,iChilly)))
/*~
%~ mpred_test("Test_0012_Line_0000__naf_Ext",baseKB:(\+tFly(ext,iChilly)))
^ Call: (68) [baseKB] baseKB:tFly(ext, iChilly)
^ Unify: (68) [baseKB] baseKB:tFly(ext, iChilly)
^ Exit: (68) [baseKB] baseKB:tFly(ext, iChilly)
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:tFly(ext,iChilly)),rtrace(baseKB:(\+tFly(ext,iChilly)))))
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0012_Line_0000__naf_Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0012_Line_0000__naf_Ext-junit.xml
~*/
:- dmsg("confirm chilly still cant fly").
%~ confirm chilly still cant fly
:- mpred_test(( ~ tFly(ext,iChilly))).
%~ mpred_test("Test_0013_Line_0000__Ext",baseKB: ~tFly(ext,iChilly))
/*~
%~ mpred_test("Test_0013_Line_0000__Ext",baseKB: ~tFly(ext,iChilly))
^ Call: (68) [baseKB] ~tFly(ext, iChilly)
^ Unify: (68) [baseKB] ~ (baseKB:tFly(ext, iChilly))
^ Call: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
^ Unify: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
Call: (76) [system] set_prolog_flag(last_call_optimisation, false)
Exit: (76) [system] set_prolog_flag(last_call_optimisation, false)
^ Call: (76) [loop_check] prolog_frame_attribute(1189, parent_goal, loop_check_term_frame(_1180304, info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _1180310, _1180312))
^ Fail: (76) [loop_check] prolog_frame_attribute(1189, parent_goal, loop_check_term_frame(_1180304, info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _1180310, _1180312))
^ Redo: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
Call: (76) [pfc_lib] neg_in_code0(baseKB:tFly(ext, iChilly))
Unify: (76) [pfc_lib] neg_in_code0(baseKB:tFly(ext, iChilly))
^ Call: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
^ Unify: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
Call: (83) [system] set_prolog_flag(last_call_optimisation, false)
Exit: (83) [system] set_prolog_flag(last_call_optimisation, false)
^ Call: (83) [loop_check] prolog_frame_attribute(1328, parent_goal, loop_check_term_frame(_1186028, info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _1186034, _1186036))
^ Fail: (83) [loop_check] prolog_frame_attribute(1328, parent_goal, loop_check_term_frame(_1186028, info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _1186034, _1186036))
^ Redo: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
Call: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
Unify: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
^ Call: (87) [pfc_lib] hook_database:clause_i(pfc_lib:prologNegByFailure(tFly), true, _1189360)
^ Unify: (87) [pfc_lib] hook_database:clause_i(pfc_lib:prologNegByFailure(tFly), true, _1189360)
^ Call: (88) [system] clause(pfc_lib:prologNegByFailure(tFly), true, _1189360)
^ Fail: (88) [system] clause(pfc_lib:prologNegByFailure(tFly), true, _1189360)
^ Fail: (87) [pfc_lib] hook_database:clause_i(pfc_lib:prologNegByFailure(tFly), true, _1189360)
Unify: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
^ Call: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Unify: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Call: (85) [pfc_lib] ucatch:is_ftVar(baseKB:tFly(ext, iChilly))
^ Unify: (85) [pfc_lib] ucatch:is_ftVar(baseKB:tFly(ext, iChilly))
^ Fail: (85) [pfc_lib] ucatch:is_ftVar(baseKB:tFly(ext, iChilly))
^ Redo: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Exit: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Call: (88) [hook_database] hook_database:pfc_with_quiet_vars_lock((clause(mpred_prop(baseKB, tFly, 2, prologHybrid), _1197530), call(_1197530)*->true;clause_b(baseKB:mpred_prop(baseKB, tFly, 2, prologHybrid))))
^ Unify: (88) [hook_database] hook_database:pfc_with_quiet_vars_lock((clause(mpred_prop(baseKB, tFly, 2, prologHybrid), _1197530), call(_1197530)*->true;clause_b(baseKB:mpred_prop(baseKB, tFly, 2, prologHybrid))))
^ Call: (90) [hook_database] clause(mpred_prop(baseKB, tFly, 2, prologHybrid), _1197530)
^ Exit: (90) [hook_database] clause(mpred_prop(baseKB, tFly, 2, prologHybrid), true)
Call: (90) [system] true
Exit: (90) [system] true
Call: (90) [system] true
Exit: (90) [system] true
^ Exit: (88) [hook_database] hook_database:pfc_with_quiet_vars_lock((clause(mpred_prop(baseKB, tFly, 2, prologHybrid), true), call(true)*->true;clause_b(baseKB:mpred_prop(baseKB, tFly, 2, prologHybrid))))
Fail: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
^ Fail: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
Fail: (76) [pfc_lib] neg_in_code0(baseKB:tFly(ext, iChilly))
^ Fail: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
^ Fail: (68) [baseKB] ~ (baseKB:tFly(ext, iChilly))
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+ ~tFly(ext,iChilly))),rtrace(baseKB: ~tFly(ext,iChilly))))
no_proof_for(\+ ~tFly(ext,iChilly)).
no_proof_for(\+ ~tFly(ext,iChilly)).
no_proof_for(\+ ~tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0013_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0013_Line_0000__Ext-junit.xml
~*/
/*
% This would be a good TMS test: it should throw, but right now it passes wrongly.
tFly(ext,iChilly).
:- dmsg("confirm chilly is flying penguin").
:- mpred_test(( tFly(ext,iChilly))).
:- mpred_test(( tPenguin(ext,iChilly))).
:- mpred_test((\+ ~tFly(ext,iChilly))).
\+ tFly(ext,iChilly).
:- dmsg("confirm chilly is a normal penguin who cant fly").
:- mpred_test((\+ tFly(ext,iChilly))).
% fails rightly
:- mpred_test(( tPenguin(ext,iChilly))).
*/
:- dmsg("chilly is no longer a penguin (hopefly the assertion above about him being a bird wont be removed)").
%~ chilly is no longer a penguin (hopefly the assertion above about him being a bird wont be removed)
:- debug_logicmoo(_).
:- mpred_trace_exec.
:- debug_logicmoo(logicmoo(_)).
:- mpred_test(tBird(ext,iChilly)).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:127
%~ mpred_test("Test_0014_Line_0000__Ext",baseKB:tBird(ext,iChilly))
%~ FIlE: * https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc#L127
/*~
%~ mpred_test("Test_0014_Line_0000__Ext",baseKB:tBird(ext,iChilly))
passed=info(why_was_true(baseKB:tBird(ext,iChilly)))
Justifications for tBird(ext,iChilly):
[36m 1.1 mfl4(_,baseKB,'* https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc#L90 ',90) [0m
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0014_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0014_Line_0000__Ext-junit.xml
~*/
never_retract_u(tBird(ext,iChilly)).
\+ tPenguin(ext,iChilly).
%~ mpred_undo1( '$nt'(
%~ ~( tFly(ext,iChilly)),
%~ ( call_u_no_bc( ~( tFly(ext,iChilly))) ,
%~ ground( ~( tFly(ext,iChilly))) ,
%~ \+( tFly(ext,iChilly))),
%~ '$nt'(~tFly(ext,iChilly),call_u_no_bc(~tFly(ext,iChilly)),rhs([tFly(ext,iChilly)]))))
%~ mpred_undo1( '$nt'(~tFly(ext,iChilly),call_u_no_bc(~tFly(ext,iChilly)),rhs([tFly(ext,iChilly)])))
%~ mpred_undo1( '$nt'(
%~ ~( tFly(ext,iChilly)),
%~ ( call_u_no_bc( ~( tFly(ext,iChilly))) ,
%~ ground( ~( tFly(ext,iChilly))) ,
%~ \+( tFly(ext,iChilly))),
%~ '$nt'(~tFly(ext,iChilly),call_u_no_bc(~tFly(ext,iChilly)),rhs([tFly(ext,iChilly)]))))
%~ mpred_undo1( '$nt'(
%~ ~( tFly(ext,iChilly)),
%~ ( call_u_no_bc( ~( tFly(ext,iChilly))) ,
%~ ground( ~( tFly(ext,iChilly))) ,
%~ \+( tFly(ext,iChilly)) ,
%~ \+( ~( tFly(ext,iChilly)) =
%~
%~ tFly(ext,iChilly))),
%~ rhs([\+tFly(ext,iChilly)])))
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:132
%~ mpred_undo1( '$nt'(
%~ ~( tFly(ext,iChilly)),
%~ ( call_u_no_bc( ~( tFly(ext,iChilly))) ,
%~ ground( ~( tFly(ext,iChilly))) ,
%~ \+( tFly(ext,iChilly)) ,
%~ \+( ~( tFly(ext,iChilly)) =
%~
%~ tFly(ext,iChilly))),
%~ rhs([\+tFly(ext,iChilly)])))
%~ debugm( baseKB,
%~ show_success( baseKB,
%~ baseKB : mpred_withdraw( tPenguin(ext,iChilly),
%~ ( mfl4(BaseKB,baseKB,'* https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc ',132) ,
%~ ax))))
:- mpred_test((tBird(ext,iChilly))).
%~ mpred_test("Test_0015_Line_0000__Ext",baseKB:tBird(ext,iChilly))
%~ FIlE: * https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc#L135
/*~
%~ mpred_test("Test_0015_Line_0000__Ext",baseKB:tBird(ext,iChilly))
passed=info(why_was_true(baseKB:tBird(ext,iChilly)))
Justifications for tBird(ext,iChilly):
[36m 1.1 mfl4(_,baseKB,'* https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc#L90 ',90) [0m
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0015_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0015_Line_0000__Ext-junit.xml
~*/
:- mpred_test(( \+ tPenguin(ext,iChilly))).
%~ mpred_test("Test_0016_Line_0000__naf_Ext",baseKB:(\+tPenguin(ext,iChilly)))
/*~
%~ mpred_test("Test_0016_Line_0000__naf_Ext",baseKB:(\+tPenguin(ext,iChilly)))
^ Call: (68) [baseKB] baseKB:tPenguin(ext, iChilly)
^ Unify: (68) [baseKB] baseKB:tPenguin(ext, iChilly)
^ Exit: (68) [baseKB] baseKB:tPenguin(ext, iChilly)
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:tPenguin(ext,iChilly)),rtrace(baseKB:(\+tPenguin(ext,iChilly)))))
no_proof_for(tPenguin(ext,iChilly)).
no_proof_for(tPenguin(ext,iChilly)).
no_proof_for(tPenguin(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0016_Line_0000__naf_Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0016_Line_0000__naf_Ext-junit.xml
~*/
:- dmsg("chilly is still a bird").
%~ chilly is still a bird
:- mpred_test((tBird(ext,iChilly))).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:140
%~ mpred_test("Test_0017_Line_0000__Ext",baseKB:tBird(ext,iChilly))
%~ FIlE: * https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc#L140
/*~
%~ mpred_test("Test_0017_Line_0000__Ext",baseKB:tBird(ext,iChilly))
passed=info(why_was_true(baseKB:tBird(ext,iChilly)))
Justifications for tBird(ext,iChilly):
[36m 1.1 mfl4(_,baseKB,'* https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc#L90 ',90) [0m
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0017_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0017_Line_0000__Ext-junit.xml
~*/
:- dmsg("confirm chilly is flying bird").
%~ confirm chilly is flying bird
:- mpred_test(( tFly(ext,iChilly))).
%~ mpred_test("Test_0018_Line_0000__Ext",baseKB:tFly(ext,iChilly))
/*~
%~ mpred_test("Test_0018_Line_0000__Ext",baseKB:tFly(ext,iChilly))
passed=info(why_was_true(baseKB:tFly(ext,iChilly)))
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0018_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0018_Line_0000__Ext-junit.xml
~*/
:- repropagate(tBird(ext,iChilly)).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:145
%~ debugm(baseKB,show_success(baseKB,baseKB:mpred_fwc(tBird(ext,iChilly))))
:- dmsg("confirm chilly is flying bird").
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:147
%~ confirm chilly is flying bird
:- mpred_test(( tFly(ext,iChilly))).
%~ mpred_test("Test_0019_Line_0000__Ext",baseKB:tFly(ext,iChilly))
/*~
%~ mpred_test("Test_0019_Line_0000__Ext",baseKB:tFly(ext,iChilly))
passed=info(why_was_true(baseKB:tFly(ext,iChilly)))
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0019_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0019_Line_0000__Ext-junit.xml
~*/
:- mpred_test(( \+ tPenguin(ext,iChilly))).
%~ mpred_test("Test_0020_Line_0000__naf_Ext",baseKB:(\+tPenguin(ext,iChilly)))
/*~
%~ mpred_test("Test_0020_Line_0000__naf_Ext",baseKB:(\+tPenguin(ext,iChilly)))
^ Call: (68) [baseKB] baseKB:tPenguin(ext, iChilly)
^ Unify: (68) [baseKB] baseKB:tPenguin(ext, iChilly)
^ Exit: (68) [baseKB] baseKB:tPenguin(ext, iChilly)
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:tPenguin(ext,iChilly)),rtrace(baseKB:(\+tPenguin(ext,iChilly)))))
no_proof_for(tPenguin(ext,iChilly)).
no_proof_for(tPenguin(ext,iChilly)).
no_proof_for(tPenguin(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0020_Line_0000__naf_Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0020_Line_0000__naf_Ext-junit.xml
~*/
:- mpred_test(( \+ ~tFly(ext,iChilly))).
%~ mpred_test("Test_0021_Line_0000__naf_Ext",baseKB:(\+ ~tFly(ext,iChilly)))
/*~
%~ mpred_test("Test_0021_Line_0000__naf_Ext",baseKB:(\+ ~tFly(ext,iChilly)))
passed=info(why_was_true(baseKB:(\+ ~tFly(ext,iChilly))))
no_proof_for(\+ ~tFly(ext,iChilly)).
no_proof_for(\+ ~tFly(ext,iChilly)).
no_proof_for(\+ ~tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0021_Line_0000__naf_Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0021_Line_0000__naf_Ext-junit.xml
~*/
%~ unused(save_junit_results)
%~ test_completed_exit(8)
:- dynamic junit_prop/3.
```
totalTime=1
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ANEVER_RETRACT_01B
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/NEVER_RETRACT_01B/logicmoo_pfc_test_sanity_base_NEVER_RETRACT_01B_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/68/testReport/logicmoo.pfc.test.sanity_base/NEVER_RETRACT_01B/logicmoo_pfc_test_sanity_base_NEVER_RETRACT_01B_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://github.com/logicmoo/logicmoo_workspace/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc
FAILED: /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-junit-minor -k never_retract_01b.pfc (returned 8)
|
3.0
|
logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B JUnit - (cd /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base ; timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc)
GH_MASTER_ISSUE_FINFO=
```
%
running('/var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc'),
%~ this_test_might_need( :-( use_module( library(logicmoo_plarkc))))
:- expects_dialect(pfc).
(tType(COL)==>{kb_local(COL/2)},% functorDeclares(COL),
(t(COL,ext,X)<==>instanceOf(X,COL))).
tType(tFly).
tType(tCanary).
tType(tPenguin).
tType(tBird).
:- mpred_test(predicate_property(tBird(ext,_),dynamic)).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:39
%~ mpred_test("Test_0001_Line_0000__Ext",baseKB:predicate_property(tBird(ext,_456),dynamic))
/*~
%~ mpred_test("Test_0001_Line_0000__Ext",baseKB:predicate_property(tBird(ext,_456),dynamic))
passed=info(why_was_true(baseKB:predicate_property(tBird(ext,_456),dynamic)))
no_proof_for(predicate_property(tBird(ext,Bird_Ext),dynamic)).
no_proof_for(predicate_property(tBird(ext,Bird_Ext),dynamic)).
no_proof_for(predicate_property(tBird(ext,Bird_Ext),dynamic)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0001_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0001_Line_0000__Ext-junit.xml
~*/
subClassOf(C1,C2)==> (instanceOf(X,C1)==>instanceOf(X,C2)).
subClassOf(tCanary,tBird).
subClassOf(tPenguin,tBird).
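%= Editor's sketch (hedged; not part of the original test run). The two
%= subClassOf/2 facts above feed the propagation rule, so membership in a
%= subclass forward-chains into the superclass. A plain backward-chaining
%= Prolog reading of the same closure (illustrative only):
%=
%=   instanceOf(X, Super) :- subClassOf(Sub, Super), instanceOf(X, Sub).
%=
%= PFC runs this forward instead: asserting tPenguin(ext,iChilly)
%= immediately derives tBird(ext,iChilly), as the next test confirms.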
:- dmsg("chilly is a penguin.").
%~ chilly is a penguin.
tPenguin(ext,iChilly).
:- mpred_test((tBird(ext,iChilly))).
%~ mpred_test("Test_0002_Line_0000__Ext",baseKB:tBird(ext,iChilly))
/*~
%~ mpred_test("Test_0002_Line_0000__Ext",baseKB:tBird(ext,iChilly))
passed=info(why_was_true(baseKB:tBird(ext,iChilly)))
no_proof_for(tBird(ext,iChilly)).
no_proof_for(tBird(ext,iChilly)).
no_proof_for(tBird(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0002_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0002_Line_0000__Ext-junit.xml
~*/
:- dmsg("tweety is a canary.").
%~ tweety is a canary.
tCanary(ext,iTweety).
:- mpred_test((tBird(ext,iTweety))).
%~ mpred_test("Test_0003_Line_0000__Ext",baseKB:tBird(ext,iTweety))
/*~
%~ mpred_test("Test_0003_Line_0000__Ext",baseKB:tBird(ext,iTweety))
passed=info(why_was_true(baseKB:tBird(ext,iTweety)))
no_proof_for(tBird(ext,iTweety)).
no_proof_for(tBird(ext,iTweety)).
no_proof_for(tBird(ext,iTweety)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0003_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0003_Line_0000__Ext-junit.xml
~*/
:- dmsg("birds fly by default.").
%~ birds fly by default.
mdefault(( tBird(ext,X) ==> tFly(ext,X) )).
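%= Editor's sketch (hedged; sketch-only predicate names, not from the test
%= run). mdefault/1 marks the rule as defeasible: it fires unless a
%= stronger conclusion (here, a later ~tFly/2 fact) defeats it. A
%= plain-Prolog approximation using negation as failure:
%=
%=   fliesSketch(X)    :- birdSketch(X), \+ abnormalSketch(X).
%=   abnormalSketch(X) :- penguinSketch(X).
%=
%= Under that reading a penguin is a bird yet fails fliesSketch/1, which
%= is the behaviour the penguin rule below enforces in PFC proper.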
:- mpred_test((tBird(ext,iTweety))).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:62
%~ mpred_test("Test_0004_Line_0000__Ext",baseKB:tBird(ext,iTweety))
/*~
%~ mpred_test("Test_0004_Line_0000__Ext",baseKB:tBird(ext,iTweety))
passed=info(why_was_true(baseKB:tBird(ext,iTweety)))
no_proof_for(tBird(ext,iTweety)).
no_proof_for(tBird(ext,iTweety)).
no_proof_for(tBird(ext,iTweety)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0004_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0004_Line_0000__Ext-junit.xml
~*/
:- mpred_test((tFly(ext,iTweety))).
%~ mpred_test("Test_0005_Line_0000__Ext",baseKB:tFly(ext,iTweety))
/*~
%~ mpred_test("Test_0005_Line_0000__Ext",baseKB:tFly(ext,iTweety))
passed=info(why_was_true(baseKB:tFly(ext,iTweety)))
no_proof_for(tFly(ext,iTweety)).
no_proof_for(tFly(ext,iTweety)).
no_proof_for(tFly(ext,iTweety)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0005_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0005_Line_0000__Ext-junit.xml
~*/
:- dmsg("make sure chilly can fly").
%~ make sure chilly can fly
:- mpred_test((instanceOf(I,tFly),I=iChilly)).
%~ mpred_test("Test_0006_Line_0000__TFly",baseKB:(instanceOf(_183882,tFly),_183882=iChilly))
/*~
%~ mpred_test("Test_0006_Line_0000__TFly",baseKB:(instanceOf(_183882,tFly),_183882=iChilly))
^ Call: (68) [baseKB] baseKB:instanceOf(_183882, tFly)
^ Fail: (68) [baseKB] baseKB:instanceOf(_183882, tFly)
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+ (instanceOf(_183882,tFly),_183882=iChilly))),rtrace(baseKB:(instanceOf(_183882,tFly),_183882=iChilly))))
no_proof_for(\+ (instanceOf(I,tFly),I=iChilly)).
no_proof_for(\+ (instanceOf(I,tFly),I=iChilly)).
no_proof_for(\+ (instanceOf(I,tFly),I=iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0006_Line_0000__TFly'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0006_Line_0000__TFly-junit.xml
~*/
:- mpred_test((tBird(ext,iTweety))).
%~ mpred_test("Test_0007_Line_0000__Ext",baseKB:tBird(ext,iTweety))
/*~
%~ mpred_test("Test_0007_Line_0000__Ext",baseKB:tBird(ext,iTweety))
passed=info(why_was_true(baseKB:tBird(ext,iTweety)))
no_proof_for(tBird(ext,iTweety)).
no_proof_for(tBird(ext,iTweety)).
no_proof_for(tBird(ext,iTweety)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0007_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0007_Line_0000__Ext-junit.xml
~*/
:- listing([tFly/2,tBird/2,instanceOf/2]).
%~ skipped( listing( [ tFly/2, tBird/2,instanceOf/2]))
:- dmsg("make sure tweety can fly (and again chilly)").
%~ make sure tweety can fly (and again chilly)
:- mpred_test((tFly(ext,iTweety))).
%~ mpred_test("Test_0008_Line_0000__Ext",baseKB:tFly(ext,iTweety))
/*~
%~ mpred_test("Test_0008_Line_0000__Ext",baseKB:tFly(ext,iTweety))
passed=info(why_was_true(baseKB:tFly(ext,iTweety)))
no_proof_for(tFly(ext,iTweety)).
no_proof_for(tFly(ext,iTweety)).
no_proof_for(tFly(ext,iTweety)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0008_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0008_Line_0000__Ext-junit.xml
~*/
:- mpred_test((tFly(ext,iChilly))).
%~ mpred_test("Test_0009_Line_0000__Ext",baseKB:tFly(ext,iChilly))
/*~
%~ mpred_test("Test_0009_Line_0000__Ext",baseKB:tFly(ext,iChilly))
passed=info(why_was_true(baseKB:tFly(ext,iChilly)))
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0009_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0009_Line_0000__Ext-junit.xml
~*/
never_retract_u(tBird(ext,iChilly)).
:- dmsg("penguins do not tFly.").
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:81
%~ penguins do not tFly.
tPenguin(ext,X) ==> ~tFly(ext,X).
:- dmsg("confirm chilly now cant fly").
%~ confirm chilly now cant fly
:- mpred_test((\+ tFly(ext,iChilly))).
%~ mpred_test("Test_0010_Line_0000__naf_Ext",baseKB:(\+tFly(ext,iChilly)))
/*~
%~ mpred_test("Test_0010_Line_0000__naf_Ext",baseKB:(\+tFly(ext,iChilly)))
^ Call: (68) [baseKB] baseKB:tFly(ext, iChilly)
^ Unify: (68) [baseKB] baseKB:tFly(ext, iChilly)
^ Exit: (68) [baseKB] baseKB:tFly(ext, iChilly)
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:tFly(ext,iChilly)),rtrace(baseKB:(\+tFly(ext,iChilly)))))
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0010_Line_0000__naf_Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0010_Line_0000__naf_Ext-junit.xml
~*/
:- mpred_test(( ~ tFly(ext,iChilly))).
%= repropagate that chilly was a bird again (actually this asserts)
%~ mpred_test("Test_0011_Line_0000__Ext",baseKB: ~tFly(ext,iChilly))
/*~
%~ mpred_test("Test_0011_Line_0000__Ext",baseKB: ~tFly(ext,iChilly))
^ Call: (68) [baseKB] ~tFly(ext, iChilly)
^ Unify: (68) [baseKB] ~ (baseKB:tFly(ext, iChilly))
^ Call: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
^ Unify: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
Call: (76) [system] set_prolog_flag(last_call_optimisation, false)
Exit: (76) [system] set_prolog_flag(last_call_optimisation, false)
^ Call: (76) [loop_check] prolog_frame_attribute(1189, parent_goal, loop_check_term_frame(_547520, info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _547526, _547528))
^ Fail: (76) [loop_check] prolog_frame_attribute(1189, parent_goal, loop_check_term_frame(_547520, info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _547526, _547528))
^ Redo: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
Call: (76) [pfc_lib] neg_in_code0(baseKB:tFly(ext, iChilly))
Unify: (76) [pfc_lib] neg_in_code0(baseKB:tFly(ext, iChilly))
^ Call: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
^ Unify: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
Call: (83) [system] set_prolog_flag(last_call_optimisation, false)
Exit: (83) [system] set_prolog_flag(last_call_optimisation, false)
^ Call: (83) [loop_check] prolog_frame_attribute(1328, parent_goal, loop_check_term_frame(_553244, info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _553250, _553252))
^ Fail: (83) [loop_check] prolog_frame_attribute(1328, parent_goal, loop_check_term_frame(_553244, info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _553250, _553252))
^ Redo: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
Call: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
Unify: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
^ Call: (87) [pfc_lib] hook_database:clause_i(pfc_lib:prologNegByFailure(tFly), true, _556576)
^ Unify: (87) [pfc_lib] hook_database:clause_i(pfc_lib:prologNegByFailure(tFly), true, _556576)
^ Call: (88) [system] clause(pfc_lib:prologNegByFailure(tFly), true, _556576)
^ Fail: (88) [system] clause(pfc_lib:prologNegByFailure(tFly), true, _556576)
^ Fail: (87) [pfc_lib] hook_database:clause_i(pfc_lib:prologNegByFailure(tFly), true, _556576)
Unify: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
^ Call: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Unify: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Call: (85) [pfc_lib] ucatch:is_ftVar(baseKB:tFly(ext, iChilly))
^ Unify: (85) [pfc_lib] ucatch:is_ftVar(baseKB:tFly(ext, iChilly))
^ Fail: (85) [pfc_lib] ucatch:is_ftVar(baseKB:tFly(ext, iChilly))
^ Redo: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Exit: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Call: (88) [hook_database] hook_database:pfc_with_quiet_vars_lock((clause(mpred_prop(baseKB, tFly, 2, prologHybrid), _564746), call(_564746)*->true;clause_b(baseKB:mpred_prop(baseKB, tFly, 2, prologHybrid))))
^ Unify: (88) [hook_database] hook_database:pfc_with_quiet_vars_lock((clause(mpred_prop(baseKB, tFly, 2, prologHybrid), _564746), call(_564746)*->true;clause_b(baseKB:mpred_prop(baseKB, tFly, 2, prologHybrid))))
^ Call: (90) [hook_database] clause(mpred_prop(baseKB, tFly, 2, prologHybrid), _564746)
^ Exit: (90) [hook_database] clause(mpred_prop(baseKB, tFly, 2, prologHybrid), true)
Call: (90) [system] true
Exit: (90) [system] true
Call: (90) [system] true
Exit: (90) [system] true
^ Exit: (88) [hook_database] hook_database:pfc_with_quiet_vars_lock((clause(mpred_prop(baseKB, tFly, 2, prologHybrid), true), call(true)*->true;clause_b(baseKB:mpred_prop(baseKB, tFly, 2, prologHybrid))))
Fail: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
^ Fail: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
Fail: (76) [pfc_lib] neg_in_code0(baseKB:tFly(ext, iChilly))
^ Fail: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
^ Fail: (68) [baseKB] ~ (baseKB:tFly(ext, iChilly))
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+ ~tFly(ext,iChilly))),rtrace(baseKB: ~tFly(ext,iChilly))))
no_proof_for(\+ ~tFly(ext,iChilly)).
no_proof_for(\+ ~tFly(ext,iChilly)).
no_proof_for(\+ ~tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0011_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0011_Line_0000__Ext-junit.xml
~*/
%= repropagate that chilly was a bird again (actually this asserts)
tBird(ext,iChilly).
:- listing(tBird/2).
%= the dmsg explains the difference between \+ and ~
%~ skipped( listing( tBird/2))
%= the dmsg explains the difference between \+ and ~
:- dmsg("confirm chilly still does not fly").
%~ confirm chilly still does not fly
:- mpred_test(( \+ tFly(ext,iChilly))).
%~ mpred_test("Test_0012_Line_0000__naf_Ext",baseKB:(\+tFly(ext,iChilly)))
/*~
%~ mpred_test("Test_0012_Line_0000__naf_Ext",baseKB:(\+tFly(ext,iChilly)))
^ Call: (68) [baseKB] baseKB:tFly(ext, iChilly)
^ Unify: (68) [baseKB] baseKB:tFly(ext, iChilly)
^ Exit: (68) [baseKB] baseKB:tFly(ext, iChilly)
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:tFly(ext,iChilly)),rtrace(baseKB:(\+tFly(ext,iChilly)))))
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0012_Line_0000__naf_Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0012_Line_0000__naf_Ext-junit.xml
~*/
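%= A minimal sketch (hypothetical, not part of this run) of the \+ vs ~
%= distinction the dmsg comments above refer to:
%=   \+ G  succeeds when G cannot be *proven*   (negation as failure);
%=   ~ G   succeeds only when the negation of G is itself *derivable*.
%= E.g. with only  ~ tFly(ext,iChilly)  asserted:
%=   :- mpred_test(( \+ tFly(ext,iChilly) )).  % passes: tFly not provable
%=   :- mpred_test((  ~ tFly(ext,iChilly) )).  % passes: ~tFly derivable
%= whereas for an individual with no assertion either way, \+ would still
%= succeed but ~ would fail, as the traces in this log demonstrate.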
:- dmsg("confirm chilly still can't fly").
%~ confirm chilly still can't fly
:- mpred_test(( ~ tFly(ext,iChilly))).
%~ mpred_test("Test_0013_Line_0000__Ext",baseKB: ~tFly(ext,iChilly))
/*~
%~ mpred_test("Test_0013_Line_0000__Ext",baseKB: ~tFly(ext,iChilly))
^ Call: (68) [baseKB] ~tFly(ext, iChilly)
^ Unify: (68) [baseKB] ~ (baseKB:tFly(ext, iChilly))
^ Call: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
^ Unify: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
Call: (76) [system] set_prolog_flag(last_call_optimisation, false)
Exit: (76) [system] set_prolog_flag(last_call_optimisation, false)
^ Call: (76) [loop_check] prolog_frame_attribute(1189, parent_goal, loop_check_term_frame(_1180304, info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _1180310, _1180312))
^ Fail: (76) [loop_check] prolog_frame_attribute(1189, parent_goal, loop_check_term_frame(_1180304, info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _1180310, _1180312))
^ Redo: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
Call: (76) [pfc_lib] neg_in_code0(baseKB:tFly(ext, iChilly))
Unify: (76) [pfc_lib] neg_in_code0(baseKB:tFly(ext, iChilly))
^ Call: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
^ Unify: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
Call: (83) [system] set_prolog_flag(last_call_optimisation, false)
Exit: (83) [system] set_prolog_flag(last_call_optimisation, false)
^ Call: (83) [loop_check] prolog_frame_attribute(1328, parent_goal, loop_check_term_frame(_1186028, info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _1186034, _1186036))
^ Fail: (83) [loop_check] prolog_frame_attribute(1328, parent_goal, loop_check_term_frame(_1186028, info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, _1186034, _1186036))
^ Redo: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
Call: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
Unify: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
^ Call: (87) [pfc_lib] hook_database:clause_i(pfc_lib:prologNegByFailure(tFly), true, _1189360)
^ Unify: (87) [pfc_lib] hook_database:clause_i(pfc_lib:prologNegByFailure(tFly), true, _1189360)
^ Call: (88) [system] clause(pfc_lib:prologNegByFailure(tFly), true, _1189360)
^ Fail: (88) [system] clause(pfc_lib:prologNegByFailure(tFly), true, _1189360)
^ Fail: (87) [pfc_lib] hook_database:clause_i(pfc_lib:prologNegByFailure(tFly), true, _1189360)
Unify: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
^ Call: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Unify: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Call: (85) [pfc_lib] ucatch:is_ftVar(baseKB:tFly(ext, iChilly))
^ Unify: (85) [pfc_lib] ucatch:is_ftVar(baseKB:tFly(ext, iChilly))
^ Fail: (85) [pfc_lib] ucatch:is_ftVar(baseKB:tFly(ext, iChilly))
^ Redo: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Exit: (84) [pfc_lib] ucatch:is_ftCompound(baseKB:tFly(ext, iChilly))
^ Call: (88) [hook_database] hook_database:pfc_with_quiet_vars_lock((clause(mpred_prop(baseKB, tFly, 2, prologHybrid), _1197530), call(_1197530)*->true;clause_b(baseKB:mpred_prop(baseKB, tFly, 2, prologHybrid))))
^ Unify: (88) [hook_database] hook_database:pfc_with_quiet_vars_lock((clause(mpred_prop(baseKB, tFly, 2, prologHybrid), _1197530), call(_1197530)*->true;clause_b(baseKB:mpred_prop(baseKB, tFly, 2, prologHybrid))))
^ Call: (90) [hook_database] clause(mpred_prop(baseKB, tFly, 2, prologHybrid), _1197530)
^ Exit: (90) [hook_database] clause(mpred_prop(baseKB, tFly, 2, prologHybrid), true)
Call: (90) [system] true
Exit: (90) [system] true
Call: (90) [system] true
Exit: (90) [system] true
^ Exit: (88) [hook_database] hook_database:pfc_with_quiet_vars_lock((clause(mpred_prop(baseKB, tFly, 2, prologHybrid), true), call(true)*->true;clause_b(baseKB:mpred_prop(baseKB, tFly, 2, prologHybrid))))
Fail: (83) [pfc_lib] neg_may_naf(baseKB:tFly(ext, iChilly))
^ Fail: (82) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1328, pfc_lib:trace_or_throw(looped(pfc_lib:neg_may_naf(baseKB:tFly(ext, iChilly)))))
Fail: (76) [pfc_lib] neg_in_code0(baseKB:tFly(ext, iChilly))
^ Fail: (75) [loop_check] loop_check:loop_check_term_frame(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), info(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)), 'mpred_core.pl':273), 1, 1189, pfc_lib:trace_or_throw(looped(pfc_lib:neg_in_code0(baseKB:tFly(ext, iChilly)))))
^ Fail: (68) [baseKB] ~ (baseKB:tFly(ext, iChilly))
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+ ~tFly(ext,iChilly))),rtrace(baseKB: ~tFly(ext,iChilly))))
no_proof_for(\+ ~tFly(ext,iChilly)).
no_proof_for(\+ ~tFly(ext,iChilly)).
no_proof_for(\+ ~tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0013_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0013_Line_0000__Ext-junit.xml
~*/
/*
% This would be a good TMS test: it should throw, but right now it passes wrongly
tFly(ext,iChilly).
:- dmsg("confirm chilly is flying penguin").
:- mpred_test(( tFly(ext,iChilly))).
:- mpred_test(( tPenguin(ext,iChilly))).
:- mpred_test((\+ ~tFly(ext,iChilly))).
\+ tFly(ext,iChilly).
:- dmsg("confirm chilly is a normal penguin who can't fly").
:- mpred_test((\+ tFly(ext,iChilly))).
% fails rightly
:- mpred_test(( tPenguin(ext,iChilly))).
*/
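%= A hedged sketch of how the commented-out TMS case above could be made
%= to fail loudly instead of "passing wrongly" (hypothetical test, using
%= only predicates that appear elsewhere in this file):
%=   tFly(ext,iChilly).                       % conflicts with ~tFly
%=   :- mpred_test(( tFly(ext,iChilly), ~tFly(ext,iChilly) )).
%= If the TMS tracked the contradiction, asserting tFly while ~tFly is
%= still supported should raise or record a contradiction rather than
%= let both goals succeed silently.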
:- dmsg("chilly is no longer a penguin (hopefully the assertion above about him being a bird won't be removed)").
%~ chilly is no longer a penguin (hopefully the assertion above about him being a bird won't be removed)
:- debug_logicmoo(_).
:- mpred_trace_exec.
:- debug_logicmoo(logicmoo(_)).
:- mpred_test(tBird(ext,iChilly)).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:127
%~ mpred_test("Test_0014_Line_0000__Ext",baseKB:tBird(ext,iChilly))
%~ FILE: * https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc#L127
/*~
%~ mpred_test("Test_0014_Line_0000__Ext",baseKB:tBird(ext,iChilly))
passed=info(why_was_true(baseKB:tBird(ext,iChilly)))
Justifications for tBird(ext,iChilly):
   1.1 mfl4(_,baseKB,'* https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc#L90 ',90)
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0014_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0014_Line_0000__Ext-junit.xml
~*/
never_retract_u(tBird(ext,iChilly)).
\+ tPenguin(ext,iChilly).
%~ mpred_undo1( '$nt'(
%~ ~( tFly(ext,iChilly)),
%~ ( call_u_no_bc( ~( tFly(ext,iChilly))) ,
%~ ground( ~( tFly(ext,iChilly))) ,
%~ \+( tFly(ext,iChilly))),
%~ '$nt'(~tFly(ext,iChilly),call_u_no_bc(~tFly(ext,iChilly)),rhs([tFly(ext,iChilly)]))))
%~ mpred_undo1( '$nt'(~tFly(ext,iChilly),call_u_no_bc(~tFly(ext,iChilly)),rhs([tFly(ext,iChilly)])))
%~ mpred_undo1( '$nt'(
%~ ~( tFly(ext,iChilly)),
%~ ( call_u_no_bc( ~( tFly(ext,iChilly))) ,
%~ ground( ~( tFly(ext,iChilly))) ,
%~ \+( tFly(ext,iChilly))),
%~ '$nt'(~tFly(ext,iChilly),call_u_no_bc(~tFly(ext,iChilly)),rhs([tFly(ext,iChilly)]))))
%~ mpred_undo1( '$nt'(
%~ ~( tFly(ext,iChilly)),
%~ ( call_u_no_bc( ~( tFly(ext,iChilly))) ,
%~ ground( ~( tFly(ext,iChilly))) ,
%~ \+( tFly(ext,iChilly)) ,
%~ \+( ~( tFly(ext,iChilly)) =
%~
%~ tFly(ext,iChilly))),
%~ rhs([\+tFly(ext,iChilly)])))
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:132
%~ mpred_undo1( '$nt'(
%~ ~( tFly(ext,iChilly)),
%~ ( call_u_no_bc( ~( tFly(ext,iChilly))) ,
%~ ground( ~( tFly(ext,iChilly))) ,
%~ \+( tFly(ext,iChilly)) ,
%~ \+( ~( tFly(ext,iChilly)) =
%~
%~ tFly(ext,iChilly))),
%~ rhs([\+tFly(ext,iChilly)])))
%~ debugm( baseKB,
%~ show_success( baseKB,
%~ baseKB : mpred_withdraw( tPenguin(ext,iChilly),
%~ ( mfl4(BaseKB,baseKB,'* https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc ',132) ,
%~ ax))))
:- mpred_test((tBird(ext,iChilly))).
%~ mpred_test("Test_0015_Line_0000__Ext",baseKB:tBird(ext,iChilly))
%~ FILE: * https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc#L135
/*~
%~ mpred_test("Test_0015_Line_0000__Ext",baseKB:tBird(ext,iChilly))
passed=info(why_was_true(baseKB:tBird(ext,iChilly)))
Justifications for tBird(ext,iChilly):
   1.1 mfl4(_,baseKB,'* https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc#L90 ',90)
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0015_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0015_Line_0000__Ext-junit.xml
~*/
:- mpred_test(( \+ tPenguin(ext,iChilly))).
%~ mpred_test("Test_0016_Line_0000__naf_Ext",baseKB:(\+tPenguin(ext,iChilly)))
/*~
%~ mpred_test("Test_0016_Line_0000__naf_Ext",baseKB:(\+tPenguin(ext,iChilly)))
^ Call: (68) [baseKB] baseKB:tPenguin(ext, iChilly)
^ Unify: (68) [baseKB] baseKB:tPenguin(ext, iChilly)
^ Exit: (68) [baseKB] baseKB:tPenguin(ext, iChilly)
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:tPenguin(ext,iChilly)),rtrace(baseKB:(\+tPenguin(ext,iChilly)))))
no_proof_for(tPenguin(ext,iChilly)).
no_proof_for(tPenguin(ext,iChilly)).
no_proof_for(tPenguin(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0016_Line_0000__naf_Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0016_Line_0000__naf_Ext-junit.xml
~*/
:- dmsg("chilly is still a bird").
%~ chilly is still a bird
:- mpred_test((tBird(ext,iChilly))).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:140
%~ mpred_test("Test_0017_Line_0000__Ext",baseKB:tBird(ext,iChilly))
%~ FILE: * https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc#L140
/*~
%~ mpred_test("Test_0017_Line_0000__Ext",baseKB:tBird(ext,iChilly))
passed=info(why_was_true(baseKB:tBird(ext,iChilly)))
Justifications for tBird(ext,iChilly):
   1.1 mfl4(_,baseKB,'* https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc#L90 ',90)
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0017_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0017_Line_0000__Ext-junit.xml
~*/
:- dmsg("confirm chilly is flying bird").
%~ confirm chilly is flying bird
:- mpred_test(( tFly(ext,iChilly))).
%~ mpred_test("Test_0018_Line_0000__Ext",baseKB:tFly(ext,iChilly))
/*~
%~ mpred_test("Test_0018_Line_0000__Ext",baseKB:tFly(ext,iChilly))
passed=info(why_was_true(baseKB:tFly(ext,iChilly)))
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0018_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0018_Line_0000__Ext-junit.xml
~*/
:- repropagate(tBird(ext,iChilly)).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:145
%~ debugm(baseKB,show_success(baseKB,baseKB:mpred_fwc(tBird(ext,iChilly))))
:- dmsg("confirm chilly is flying bird").
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc:147
%~ confirm chilly is flying bird
:- mpred_test(( tFly(ext,iChilly))).
%~ mpred_test("Test_0019_Line_0000__Ext",baseKB:tFly(ext,iChilly))
/*~
%~ mpred_test("Test_0019_Line_0000__Ext",baseKB:tFly(ext,iChilly))
passed=info(why_was_true(baseKB:tFly(ext,iChilly)))
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
no_proof_for(tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0019_Line_0000__Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0019_Line_0000__Ext-junit.xml
~*/
:- mpred_test(( \+ tPenguin(ext,iChilly))).
%~ mpred_test("Test_0020_Line_0000__naf_Ext",baseKB:(\+tPenguin(ext,iChilly)))
/*~
%~ mpred_test("Test_0020_Line_0000__naf_Ext",baseKB:(\+tPenguin(ext,iChilly)))
^ Call: (68) [baseKB] baseKB:tPenguin(ext, iChilly)
^ Unify: (68) [baseKB] baseKB:tPenguin(ext, iChilly)
^ Exit: (68) [baseKB] baseKB:tPenguin(ext, iChilly)
^ Call: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (68) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:tPenguin(ext,iChilly)),rtrace(baseKB:(\+tPenguin(ext,iChilly)))))
no_proof_for(tPenguin(ext,iChilly)).
no_proof_for(tPenguin(ext,iChilly)).
no_proof_for(tPenguin(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0020_Line_0000__naf_Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0020_Line_0000__naf_Ext-junit.xml
~*/
:- mpred_test(( \+ ~tFly(ext,iChilly))).
%~ mpred_test("Test_0021_Line_0000__naf_Ext",baseKB:(\+ ~tFly(ext,iChilly)))
/*~
%~ mpred_test("Test_0021_Line_0000__naf_Ext",baseKB:(\+ ~tFly(ext,iChilly)))
passed=info(why_was_true(baseKB:(\+ ~tFly(ext,iChilly))))
no_proof_for(\+ ~tFly(ext,iChilly)).
no_proof_for(\+ ~tFly(ext,iChilly)).
no_proof_for(\+ ~tFly(ext,iChilly)).
name ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0021_Line_0000__naf_Ext'.
JUNIT_CLASSNAME ='logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif never_retract_01b.pfc'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.NEVER_RETRACT_01B-Test_0021_Line_0000__naf_Ext-junit.xml
~*/
%~ unused(save_junit_results)
%~ test_completed_exit(8)
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
```
totalTime=1
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ANEVER_RETRACT_01B
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/NEVER_RETRACT_01B/logicmoo_pfc_test_sanity_base_NEVER_RETRACT_01B_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/68/testReport/logicmoo.pfc.test.sanity_base/NEVER_RETRACT_01B/logicmoo_pfc_test_sanity_base_NEVER_RETRACT_01B_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/1629eba4a2a1da0e1b731d156198a7168dafae44
https://github.com/logicmoo/logicmoo_workspace/blob/1629eba4a2a1da0e1b731d156198a7168dafae44/packs_sys/pfc/t/sanity_base/never_retract_01b.pfc
FAILED: /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-junit-minor -k never_retract_01b.pfc (returned 8)
lib trace or throw looped pfc lib neg in basekb tfly ext ichilly call neg in basekb tfly ext ichilly unify neg in basekb tfly ext ichilly call loop check loop check term frame pfc lib neg may naf basekb tfly ext ichilly info pfc lib neg may naf basekb tfly ext ichilly mpred core pl pfc lib trace or throw looped pfc lib neg may naf basekb tfly ext ichilly unify loop check loop check term frame pfc lib neg may naf basekb tfly ext ichilly info pfc lib neg may naf basekb tfly ext ichilly mpred core pl pfc lib trace or throw looped pfc lib neg may naf basekb tfly ext ichilly call set prolog flag last call optimisation false exit set prolog flag last call optimisation false call prolog frame attribute parent goal loop check term frame info pfc lib neg may naf basekb tfly ext ichilly mpred core pl fail prolog frame attribute parent goal loop check term frame info pfc lib neg may naf basekb tfly ext ichilly mpred core pl redo loop check loop check term frame pfc lib neg may naf basekb tfly ext ichilly info pfc lib neg may naf basekb tfly ext ichilly mpred core pl pfc lib trace or throw looped pfc lib neg may naf basekb tfly ext ichilly call neg may naf basekb tfly ext ichilly unify neg may naf basekb tfly ext ichilly call hook database clause i pfc lib prolognegbyfailure tfly true unify hook database clause i pfc lib prolognegbyfailure tfly true call clause pfc lib prolognegbyfailure tfly true fail clause pfc lib prolognegbyfailure tfly true fail hook database clause i pfc lib prolognegbyfailure tfly true unify neg may naf basekb tfly ext ichilly call ucatch is ftcompound basekb tfly ext ichilly unify ucatch is ftcompound basekb tfly ext ichilly call ucatch is ftvar basekb tfly ext ichilly unify ucatch is ftvar basekb tfly ext ichilly fail ucatch is ftvar basekb tfly ext ichilly redo ucatch is ftcompound basekb tfly ext ichilly exit ucatch is ftcompound basekb tfly ext ichilly call hook database pfc with quiet vars lock clause mpred prop basekb tfly prologhybrid call true 
clause b basekb mpred prop basekb tfly prologhybrid unify hook database pfc with quiet vars lock clause mpred prop basekb tfly prologhybrid call true clause b basekb mpred prop basekb tfly prologhybrid call clause mpred prop basekb tfly prologhybrid exit clause mpred prop basekb tfly prologhybrid true call true exit true call true exit true exit hook database pfc with quiet vars lock clause mpred prop basekb tfly prologhybrid true call true true clause b basekb mpred prop basekb tfly prologhybrid fail neg may naf basekb tfly ext ichilly fail loop check loop check term frame pfc lib neg may naf basekb tfly ext ichilly info pfc lib neg may naf basekb tfly ext ichilly mpred core pl pfc lib trace or throw looped pfc lib neg may naf basekb tfly ext ichilly fail neg in basekb tfly ext ichilly fail loop check loop check term frame pfc lib neg in basekb tfly ext ichilly info pfc lib neg in basekb tfly ext ichilly mpred core pl pfc lib trace or throw looped pfc lib neg in basekb tfly ext ichilly fail basekb tfly ext ichilly call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb tfly ext ichilly rtrace basekb tfly ext ichilly no proof for tfly ext ichilly no proof for tfly ext ichilly no proof for tfly ext ichilly name logicmoo pfc test sanity base never retract test line ext junit classname logicmoo pfc test sanity base never retract junit cmd timeout foreground preserve status s sigkill k lmoo clif never retract pfc saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit test sanity base units logicmoo pfc test sanity base never retract test line ext junit xml this wounld be a good tms test it should throw but right now it passes wrongly tfly ext ichilly dmsg confirm chilly is flying penguin mpred test tfly ext ichilly mpred test tpenguin ext ichilly mpred test tfly ext ichilly tfly ext ichilly dmsg confirm chilly is a normal penguin who cant fly 
mpred test tfly ext ichilly fails rightly mpred test tpenguin ext ichilly dmsg chilly is no longer a penguin hopefly the assertion above about him being a bird wont be removed chilly is no longer a penguin hopefly the assertion above about him being a bird wont be removed debug logicmoo mpred trace exec debug logicmoo logicmoo mpred test tbird ext ichilly var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base never retract pfc mpred test test line ext basekb tbird ext ichilly file mpred test test line ext basekb tbird ext ichilly passed info why was true basekb tbird ext ichilly justifications for tbird ext ichilly basekb name logicmoo pfc test sanity base never retract test line ext junit classname logicmoo pfc test sanity base never retract junit cmd timeout foreground preserve status s sigkill k lmoo clif never retract pfc saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit test sanity base units logicmoo pfc test sanity base never retract test line ext junit xml never retract u tbird ext ichilly tpenguin ext ichilly mpred nt tfly ext ichilly call u no bc tfly ext ichilly ground tfly ext ichilly tfly ext ichilly nt tfly ext ichilly call u no bc tfly ext ichilly rhs mpred nt tfly ext ichilly call u no bc tfly ext ichilly rhs mpred nt tfly ext ichilly call u no bc tfly ext ichilly ground tfly ext ichilly tfly ext ichilly nt tfly ext ichilly call u no bc tfly ext ichilly rhs mpred nt tfly ext ichilly call u no bc tfly ext ichilly ground tfly ext ichilly tfly ext ichilly tfly ext ichilly tfly ext ichilly rhs var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base never retract pfc mpred nt tfly ext ichilly call u no bc tfly ext ichilly ground tfly ext ichilly tfly ext ichilly tfly ext ichilly tfly ext ichilly rhs debugm basekb show success basekb basekb mpred withdraw tpenguin ext ichilly basekb basekb ax mpred test tbird ext ichilly mpred test test line ext basekb tbird ext ichilly 
file mpred test test line ext basekb tbird ext ichilly passed info why was true basekb tbird ext ichilly justifications for tbird ext ichilly basekb name logicmoo pfc test sanity base never retract test line ext junit classname logicmoo pfc test sanity base never retract junit cmd timeout foreground preserve status s sigkill k lmoo clif never retract pfc saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit test sanity base units logicmoo pfc test sanity base never retract test line ext junit xml mpred test tpenguin ext ichilly mpred test test line naf ext basekb tpenguin ext ichilly mpred test test line naf ext basekb tpenguin ext ichilly call basekb tpenguin ext ichilly unify basekb tpenguin ext ichilly exit basekb tpenguin ext ichilly call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb tpenguin ext ichilly rtrace basekb tpenguin ext ichilly no proof for tpenguin ext ichilly no proof for tpenguin ext ichilly no proof for tpenguin ext ichilly name logicmoo pfc test sanity base never retract test line naf ext junit classname logicmoo pfc test sanity base never retract junit cmd timeout foreground preserve status s sigkill k lmoo clif never retract pfc saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit test sanity base units logicmoo pfc test sanity base never retract test line naf ext junit xml dmsg chilly is still a bird chilly is still a bird mpred test tbird ext ichilly var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base never retract pfc mpred test test line ext basekb tbird ext ichilly file mpred test test line ext basekb tbird ext ichilly passed info why was true basekb tbird ext ichilly justifications for tbird ext ichilly basekb name logicmoo pfc test sanity base never retract test line ext junit classname logicmoo pfc test sanity base never retract junit 
cmd timeout foreground preserve status s sigkill k lmoo clif never retract pfc saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit test sanity base units logicmoo pfc test sanity base never retract test line ext junit xml dmsg confirm chilly is flying bird confirm chilly is flying bird mpred test tfly ext ichilly mpred test test line ext basekb tfly ext ichilly mpred test test line ext basekb tfly ext ichilly passed info why was true basekb tfly ext ichilly no proof for tfly ext ichilly no proof for tfly ext ichilly no proof for tfly ext ichilly name logicmoo pfc test sanity base never retract test line ext junit classname logicmoo pfc test sanity base never retract junit cmd timeout foreground preserve status s sigkill k lmoo clif never retract pfc saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit test sanity base units logicmoo pfc test sanity base never retract test line ext junit xml repropagate tbird ext ichilly var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base never retract pfc debugm basekb show success basekb basekb mpred fwc tbird ext ichilly dmsg confirm chilly is flying bird var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base never retract pfc confirm chilly is flying bird mpred test tfly ext ichilly mpred test test line ext basekb tfly ext ichilly mpred test test line ext basekb tfly ext ichilly passed info why was true basekb tfly ext ichilly no proof for tfly ext ichilly no proof for tfly ext ichilly no proof for tfly ext ichilly name logicmoo pfc test sanity base never retract test line ext junit classname logicmoo pfc test sanity base never retract junit cmd timeout foreground preserve status s sigkill k lmoo clif never retract pfc saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit test sanity base units logicmoo pfc test sanity base never retract test line 
ext junit xml mpred test tpenguin ext ichilly mpred test test line naf ext basekb tpenguin ext ichilly mpred test test line naf ext basekb tpenguin ext ichilly call basekb tpenguin ext ichilly unify basekb tpenguin ext ichilly exit basekb tpenguin ext ichilly call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb tpenguin ext ichilly rtrace basekb tpenguin ext ichilly no proof for tpenguin ext ichilly no proof for tpenguin ext ichilly no proof for tpenguin ext ichilly name logicmoo pfc test sanity base never retract test line naf ext junit classname logicmoo pfc test sanity base never retract junit cmd timeout foreground preserve status s sigkill k lmoo clif never retract pfc saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit test sanity base units logicmoo pfc test sanity base never retract test line naf ext junit xml mpred test tfly ext ichilly mpred test test line naf ext basekb tfly ext ichilly mpred test test line naf ext basekb tfly ext ichilly passed info why was true basekb tfly ext ichilly no proof for tfly ext ichilly no proof for tfly ext ichilly no proof for tfly ext ichilly name logicmoo pfc test sanity base never retract test line naf ext junit classname logicmoo pfc test sanity base never retract junit cmd timeout foreground preserve status s sigkill k lmoo clif never retract pfc saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit test sanity base units logicmoo pfc test sanity base never retract test line naf ext junit xml unused save junit results test completed exit dynamic junit prop dynamic junit prop dynamic junit prop totaltime issue search gitlab latest this build github failed var lib jenkins workspace logicmoo workspace bin lmoo junit minor k never retract pfc returned
| 1
|
6,354
| 2,839,331,002
|
IssuesEvent
|
2015-05-27 13:15:02
|
dojo/loader
|
https://api.github.com/repos/dojo/loader
|
opened
|
Test loader configuration options
|
tests
|
## Task
Write tests exercising the loader's configuration options.
## Loader Functional Testing Pattern
Write loader tests as functional tests to provide a clean environment in which the loader can operate. Whenever possible use the pattern outlined in [this wiki](../wiki/Loader Functional Testing Pattern) for inspecting test results.
|
1.0
|
Test loader configuration options - ## Task
Write tests exercising the loader's configuration options.
## Loader Functional Testing Pattern
Write loader tests as functional tests to provide a clean environment in which the loader can operate. Whenever possible use the pattern outlined in [this wiki](../wiki/Loader Functional Testing Pattern) for inspecting test results.
|
test
|
test loader configuration options task write tests exercising the loader s configuration options loader functional testing pattern write loader tests as functional tests to provide a clean environment in which the loader can operate whenever possible use the pattern outlined in wiki loader functional testing pattern for inspecting test results
| 1
|
228,064
| 18,153,975,735
|
IssuesEvent
|
2021-09-26 19:01:50
|
python-discord/bot
|
https://api.github.com/repos/python-discord/bot
|
closed
|
Write unit tests for `bot/cogs/tags.py`
|
p: 3 - low a: tests
|
Write unit tests for [`bot/cogs/tags.py`](../blob/master/bot/cogs/tags.py).
## Implementation details
Please make sure to read the general information in the [meta issue](553) and the [testing README](../blob/master/tests/README.md). We are aiming for a 100% [branch coverage](https://coverage.readthedocs.io/en/stable/branch.html) for this file, but if you think that is not possible, please discuss that in this issue.
## Additional information
If you want to work on this issue, **please make sure that you get assigned to it** by one of the core devs before starting to work on it. We would like to prevent the situation that multiple people are working on the same issue. To get assigned, leave a comment showing your interest in tackling this issue.
|
1.0
|
Write unit tests for `bot/cogs/tags.py` - Write unit tests for [`bot/cogs/tags.py`](../blob/master/bot/cogs/tags.py).
## Implementation details
Please make sure to read the general information in the [meta issue](553) and the [testing README](../blob/master/tests/README.md). We are aiming for a 100% [branch coverage](https://coverage.readthedocs.io/en/stable/branch.html) for this file, but if you think that is not possible, please discuss that in this issue.
## Additional information
If you want to work on this issue, **please make sure that you get assigned to it** by one of the core devs before starting to work on it. We would like to prevent the situation that multiple people are working on the same issue. To get assigned, leave a comment showing your interest in tackling this issue.
|
test
|
write unit tests for bot cogs tags py write unit tests for blob master bot cogs tags py implementation details please make sure to read the general information in the and the blob master tests readme md we are aiming for a for this file but if you think that is not possible please discuss that in this issue additional information if you want to work on this issue please make sure that you get assigned to it by one of the core devs before starting to work on it we would like to prevent the situation that multiple people are working on the same issue to get assigned leave a comment showing your interesting in tackling this issue
| 1
|
54,345
| 6,379,966,539
|
IssuesEvent
|
2017-08-02 15:44:22
|
fossasia/phimpme-android
|
https://api.github.com/repos/fossasia/phimpme-android
|
opened
|
Add more test conditions in CameraActivity.
|
Testing
|
**Actual Behaviour**
Currently, the camera activity tests only exposure and taking a photo.
**Expected Behaviour**
Add more testing conditions in the camera activity.
**Would you like to work on the issue?**
Yes.
|
1.0
|
Add more test conditions in CameraActivity. - **Actual Behaviour**
Currently, the camera activity tests only exposure and taking a photo.
**Expected Behaviour**
Add more testing conditions in the camera activity.
**Would you like to work on the issue?**
Yes.
|
test
|
add more test condition in cameraactivity actual behaviour currently the camera activity tests only exposure and to take a photo expected behaviour add more testing condition in the camera activity would you like to work on the issue yes
| 1
|
180,211
| 13,926,254,732
|
IssuesEvent
|
2020-10-21 18:01:05
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
closed
|
Failing test: Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/monitoring/elasticsearch/nodes·js - Monitoring app Elasticsearch nodes listing with only online nodes "before all" hook for "should have an Elasticsearch Cluster Summary Status with correct info"
|
Team:Monitoring failed-test
|
A test failed on a tracked branch
```
Error: retry.try timeout: TimeoutError: Waiting for element to be located By(css selector, [data-test-subj="elasticsearchNodesListingPage"])
Wait timed out after 10009ms
at /dev/shm/workspace/kibana/node_modules/selenium-webdriver/lib/webdriver.js:842:17
at process._tickCallback (internal/process/next_tick.js:68:7)
at onFailure (/dev/shm/workspace/parallel/20/kibana/test/common/services/retry/retry_for_success.ts:28:9)
at retryForSuccess (/dev/shm/workspace/parallel/20/kibana/test/common/services/retry/retry_for_success.ts:68:13)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+7.x/8939/)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/monitoring/elasticsearch/nodes·js","test.name":"Monitoring app Elasticsearch nodes listing with only online nodes \"before all\" hook for \"should have an Elasticsearch Cluster Summary Status with correct info\"","test.failCount":2}} -->
|
1.0
|
Failing test: Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/monitoring/elasticsearch/nodes·js - Monitoring app Elasticsearch nodes listing with only online nodes "before all" hook for "should have an Elasticsearch Cluster Summary Status with correct info" - A test failed on a tracked branch
```
Error: retry.try timeout: TimeoutError: Waiting for element to be located By(css selector, [data-test-subj="elasticsearchNodesListingPage"])
Wait timed out after 10009ms
at /dev/shm/workspace/kibana/node_modules/selenium-webdriver/lib/webdriver.js:842:17
at process._tickCallback (internal/process/next_tick.js:68:7)
at onFailure (/dev/shm/workspace/parallel/20/kibana/test/common/services/retry/retry_for_success.ts:28:9)
at retryForSuccess (/dev/shm/workspace/parallel/20/kibana/test/common/services/retry/retry_for_success.ts:68:13)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+7.x/8939/)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/monitoring/elasticsearch/nodes·js","test.name":"Monitoring app Elasticsearch nodes listing with only online nodes \"before all\" hook for \"should have an Elasticsearch Cluster Summary Status with correct info\"","test.failCount":2}} -->
|
test
|
failing test chrome x pack ui functional tests x pack test functional apps monitoring elasticsearch nodes·js monitoring app elasticsearch nodes listing with only online nodes before all hook for should have an elasticsearch cluster summary status with correct info a test failed on a tracked branch error retry try timeout timeouterror waiting for element to be located by css selector wait timed out after at dev shm workspace kibana node modules selenium webdriver lib webdriver js at process tickcallback internal process next tick js at onfailure dev shm workspace parallel kibana test common services retry retry for success ts at retryforsuccess dev shm workspace parallel kibana test common services retry retry for success ts first failure
| 1
|
418,383
| 12,197,251,693
|
IssuesEvent
|
2020-04-29 20:26:38
|
eecs-autograder/autograder-server
|
https://api.github.com/repos/eecs-autograder/autograder-server
|
closed
|
Don't include MEDIA_ROOT in output file paths saved in database
|
priority-2-low refactoring
|
This causes issues with student test suite result setup output when moving data from production to dev environments.
|
1.0
|
Don't include MEDIA_ROOT in output file paths saved in database - This causes issues with student test suite result setup output when moving data from production to dev environments.
|
non_test
|
don t include media root in output file paths saved in database this causes issues with student test suite result setup output when moving data from production to dev environments
| 0
|
267,333
| 23,291,987,509
|
IssuesEvent
|
2022-08-06 01:46:59
|
void-linux/void-packages
|
https://api.github.com/repos/void-linux/void-packages
|
closed
|
efibootmgr weird output
|
bug needs-testing
|
### Is this a new report?
Yes
### System Info
Void 5.10.109_1 x86_64 AuthenticAMD notuptodate rrmFFFFF
### Package(s) Affected
efibootmgr-18_1
### Does a report exist for this bug with the project's home (upstream) and/or another distro?
_No response_
### Expected behaviour
When executing `efibootmgr -v` in a terminal, the output should be something like this:
`Boot0002* Void Linux 5.18 HD(XXXXXX)r.o.o.t.=XXXXXXX .q.u.i.e.t. .s.p.l.a.s.h. .r.w. .r.a.d.e.o.n...s.i._.s.u.p.p.o.r.t.=.0. .a.m.d.g.p.u...s.i._.s.u.p.p.o.r.t.=.1. .a.m.d.g.p.u...d.p.m.=.1. .a.m.d.g.p.u...g.p.u._.r.e.c.o.v.e.r.y.=.2. .a.p.p.a.r.m.o.r.=.1. .n.e.t...i.f.n.a.m.e.s.=.0. .i.o.m.m.u.=.p.t. .a.m.d._.i.o.m.m.u.=.1. .a.c.p.i._.e.n.f.o.r.c.e._.r.e.s.o.u.r.c.e.s.=.l.a.x. .i.n.i.t.r.d.=.\.i.n.i.t.r.a.m.f.s.-.5...1.8...1.1._.1...i.m.g.`
### Actual behaviour
But in the latest version, efibootmgr didn't report something like that, instead reporting this:
`Boot0002* Void Linux 5.18 VenHw(99e275e7-75a0-4b37-a2e6-c5385e6c00cb)72006f006f0074003d005a00460053003d005a004600530052002f0056006f00690064002000710075006900650074002000730070006c00610073006800200072007700200072006100640065006f006e002e00730069005f0073007500700070006f00720074003d003000200061006d0064006700700075002e00730069005f0073007500700070006f00720074003d003100200061006d0064006700700075002e00640070006d003d003100200061006d0064006700700075002e006700700075005f007200650063006f0076006500720079003d0032002000610070007000610072006d006f0072003d00310020006e00650074002e00690066006e0061006d00650073003d003000200069006f006d006d0075003d0070007400200061006d0064005f0069006f006d006d0075003d003100200061006300700069005f0065006e0066006f007200630065005f007200650073006f00750072006300650073003d006c0061007800200069006e0069007400720064003d005c0069006e0069007400720061006d00660073002d0035002e00310038002e00310031005f0031002e0069006d006700`
### Steps to reproduce
1. Open terminal
2. launch "efibootmgr -v"
3. Weird Output
4. Done
|
1.0
|
efibootmgr weird output - ### Is this a new report?
Yes
### System Info
Void 5.10.109_1 x86_64 AuthenticAMD notuptodate rrmFFFFF
### Package(s) Affected
efibootmgr-18_1
### Does a report exist for this bug with the project's home (upstream) and/or another distro?
_No response_
### Expected behaviour
When executing `efibootmgr -v` in a terminal, the output should be something like this:
`Boot0002* Void Linux 5.18 HD(XXXXXX)r.o.o.t.=XXXXXXX .q.u.i.e.t. .s.p.l.a.s.h. .r.w. .r.a.d.e.o.n...s.i._.s.u.p.p.o.r.t.=.0. .a.m.d.g.p.u...s.i._.s.u.p.p.o.r.t.=.1. .a.m.d.g.p.u...d.p.m.=.1. .a.m.d.g.p.u...g.p.u._.r.e.c.o.v.e.r.y.=.2. .a.p.p.a.r.m.o.r.=.1. .n.e.t...i.f.n.a.m.e.s.=.0. .i.o.m.m.u.=.p.t. .a.m.d._.i.o.m.m.u.=.1. .a.c.p.i._.e.n.f.o.r.c.e._.r.e.s.o.u.r.c.e.s.=.l.a.x. .i.n.i.t.r.d.=.\.i.n.i.t.r.a.m.f.s.-.5...1.8...1.1._.1...i.m.g.`
### Actual behaviour
But in the latest version, efibootmgr didn't report something like that, instead reporting this:
`Boot0002* Void Linux 5.18 VenHw(99e275e7-75a0-4b37-a2e6-c5385e6c00cb)72006f006f0074003d005a00460053003d005a004600530052002f0056006f00690064002000710075006900650074002000730070006c00610073006800200072007700200072006100640065006f006e002e00730069005f0073007500700070006f00720074003d003000200061006d0064006700700075002e00730069005f0073007500700070006f00720074003d003100200061006d0064006700700075002e00640070006d003d003100200061006d0064006700700075002e006700700075005f007200650063006f0076006500720079003d0032002000610070007000610072006d006f0072003d00310020006e00650074002e00690066006e0061006d00650073003d003000200069006f006d006d0075003d0070007400200061006d0064005f0069006f006d006d0075003d003100200061006300700069005f0065006e0066006f007200630065005f007200650073006f00750072006300650073003d006c0061007800200069006e0069007400720064003d005c0069006e0069007400720061006d00660073002d0035002e00310038002e00310031005f0031002e0069006d006700`
### Steps to reproduce
1. Open terminal
2. launch "efibootmgr -v"
3. Weird Output
4. Done
|
test
|
efibootmgr weird output is this a new report yes system info void authenticamd notuptodate rrmfffff package s affected efibootmgr does a report exist for this bug with the project s home upstream and or another distro no response expected behaviour when execute efibootmgr v on terminal the output should be something like this void linux hd xxxxxx r o o t xxxxxxx q u i e t s p l a s h r w r a d e o n s i s u p p o r t a m d g p u s i s u p p o r t a m d g p u d p m a m d g p u g p u r e c o v e r y a p p a r m o r n e t i f n a m e s i o m m u p t a m d i o m m u a c p i e n f o r c e r e s o u r c e s l a x i n i t r d i n i t r a m f s i m g actual behaviour but on the latest version efibootmgr didnt report something like that instead like this void linux venhw steps to reproduce open terminal launch efibootmgr v weird output done
| 1
|
211,833
| 16,371,774,289
|
IssuesEvent
|
2021-05-15 09:12:52
|
LockTech/cerberus
|
https://api.github.com/repos/LockTech/cerberus
|
opened
|
Improve testing for `api/src/lib`
|
api enhancement test
|
The API's `lib` directory contains a number of files which would benefit from tests to assert their functionality.
|
1.0
|
Improve testing for `api/src/lib` - The API's `lib` directory contains a number of files which would benefit from tests to assert their functionality.
|
test
|
improve testing for api src lib the api s lib directory contains a number of files which would benefit from tests to assert their functionality
| 1
|
172,404
| 13,305,076,866
|
IssuesEvent
|
2020-08-25 17:57:22
|
aeternity/aeternity
|
https://api.github.com/repos/aeternity/aeternity
|
opened
|
aehttp_sc_SUITE failure: timeout waiting for channel `open` messages
|
area/tests kind/bug
|
The `aehttp_sc_SUITE:sc_ws_min_depth_is_modifiable/1` test case fails with a timeout - at least in some runs.
```
=== Reason: {timeout,{messages,[{<0.9168.0>,websocket_event,channel,
update,
#{<<"jsonrpc">> => <<"2.0">>,
<<"method">> => <<"channels.update">>,
<<"params">> =>
#{<<"channel_id">> =>
<<"ch_21woyLUNVapgSZrHrbdTKohDG5Yad9xNw4wFsrLzf8sKFspZim">>,
<<"data">> =>
#{<<"state">> =>
<<"tx_+QENCwH4hLhAdU5TGcrejxQkGW36Bb7mfyY/N6FwCG5qJxSDCpUmcIv0ie2oy0TzPWq9TpTNeby7zYVcnO/hjIUY1S+KiDTrBLhAn0NBWw3RzA4FS9tujscTinOUTK4jm5RuG7eRMyrMycUVRl4olmgxkDHIPLi3YMMzJ+sdHgK3wHvkxUETOYCDA7iD+IEyAaEBnvLdFTEfyWE16WLId900E+O0wsvSmKqkynYPhUodScSGP6olImAAoQG5u4uTbiAM4+qz6hnyjAQZJnfxP/hQFvbjGc7kagT/PoYkYTnKgAACCgCGEAZ510gAwKCBwESIXtJYQs/2KrdjFqVbmEKLcMJyZ1sylrOYFH8zrQJifO77">>}},
<<"version">> => 1}},
{<0.9163.0>,websocket_event,channel,
update,
#{<<"jsonrpc">> => <<"2.0">>,
<<"method">> => <<"channels.update">>,
<<"params">> =>
#{<<"channel_id">> =>
<<"ch_21woyLUNVapgSZrHrbdTKohDG5Yad9xNw4wFsrLzf8sKFspZim">>,
<<"data">> =>
#{<<"state">> =>
<<"tx_+QENCwH4hLhAdU5TGcrejxQkGW36Bb7mfyY/N6FwCG5qJxSDCpUmcIv0ie2oy0TzPWq9TpTNeby7zYVcnO/hjIUY1S+KiDTrBLhAn0NBWw3RzA4FS9tujscTinOUTK4jm5RuG7eRMyrMycUVRl4olmgxkDHIPLi3YMMzJ+sdHgK3wHvkxUETOYCDA7iD+IEyAaEBnvLdFTEfyWE16WLId900E+O0wsvSmKqkynYPhUodScSGP6olImAAoQG5u4uTbiAM4+qz6hnyjAQZJnfxP/hQFvbjGc7kagT/PoYkYTnKgAACCgCGEAZ510gAwKCBwESIXtJYQs/2KrdjFqVbmEKLcMJyZ1sylrOYFH8zrQJifO77">>}},
<<"version">> => 1}}]}}
in function aehttp_ws_test_utils:wait_for_msg/5 (/home/builder/aeternity/apps/aehttp/test/aehttp_ws_test_utils.erl, line 324)
in call from aehttp_sc_SUITE:wait_for_channel_event_/3 (/home/builder/aeternity/apps/aehttp/test/aehttp_sc_SUITE.erl, line 4294)
in call from aehttp_sc_SUITE:wait_for_channel_event_match/4 (/home/builder/aeternity/apps/aehttp/test/aehttp_sc_SUITE.erl, line 4268)
in call from aehttp_sc_SUITE:channel_send_chan_open_infos/3 (/home/builder/aeternity/apps/aehttp/test/aehttp_sc_SUITE.erl, line 894)
in call from aehttp_sc_SUITE:finish_sc_ws_open/2 (/home/builder/aeternity/apps/aehttp/test/aehttp_sc_SUITE.erl, line 842)
in call from aehttp_sc_SUITE:sc_ws_open_/4 (/home/builder/aeternity/apps/aehttp/test/aehttp_sc_SUITE.erl, line 775)
in call from aehttp_sc_SUITE:sc_ws_min_depth_is_modifiable/1 (/home/builder/aeternity/apps/aehttp/test/aehttp_sc_SUITE.erl, line 3012)
in call from test_server:ts_tc/3 (test_server.erl, line 1755)
```
From some log analysis, it seems as if the problem is that the channel is opened with `minimum_depth => 0`. This confuses the generic channel setup code, which has a finishing phase where, optionally, blocks are mined to ensure that the `create_tx` is actually included in a block, and minimum depth is reached. This is triggered for the test case in question, but the tx has already been included, and since `minimum_depth == 0`, minimum depth has also been reached and the associated info reports already delivered.
In a failing run, the following could be seen from the test case output:
```
*** User 2020-08-25 07:23:33.929 ***
aec_conductor:start_mining(#{}) (aeternity_dev1@localhost) -> ok
*** User 2020-08-25 07:23:33.973 ***
aec_conductor:stop_mining() (aeternity_dev1@localhost) -> ok
*** CT Error Notification 2020-08-25 07:23:45.980 ***
aehttp_ws_test_utils:wait_for_msg failed on line 324
Reason: timeout
```
From the stacktrace above, we can see that the test core is waiting for an `open` info msg (line 842).
But scrolling up, we find those messages already delivered, although the test case code wasn't ready for them then.
```
*** User 2020-08-25 07:23:33.908 ***
No test registered for this event (Msg = #{<<"jsonrpc">> => <<"2.0">>,
<<"method">> => <<"channels.info">>,
<<"params">> =>
#{<<"channel_id">> =>
<<"ch_21woyLUNVapgSZrHrbdTKohDG5Yad9xNw4wFsrLzf8sKFspZim">>,
<<"data">> =>
#{<<"event">> =>
<<"open">>}},
<<"version">> => 1})
*** User 2020-08-25 07:23:33.909 ***
[initiator] Received msg #{<<"jsonrpc">> => <<"2.0">>,
<<"method">> => <<"channels.info">>,
<<"params">> =>
#{<<"channel_id">> =>
<<"ch_21woyLUNVapgSZrHrbdTKohDG5Yad9xNw4wFsrLzf8sKFspZim">>,
<<"data">> => #{<<"event">> => <<"open">>}},
<<"version">> => 1}
```
|
1.0
|
aehttp_sc_SUITE failure: timeout waiting for channel `open` messages - The `aehttp_sc_SUITE:sc_ws_min_depth_is_modifiable/1` test case fails with a timeout - at least in some runs.
```
=== Reason: {timeout,{messages,[{<0.9168.0>,websocket_event,channel,
update,
#{<<"jsonrpc">> => <<"2.0">>,
<<"method">> => <<"channels.update">>,
<<"params">> =>
#{<<"channel_id">> =>
<<"ch_21woyLUNVapgSZrHrbdTKohDG5Yad9xNw4wFsrLzf8sKFspZim">>,
<<"data">> =>
#{<<"state">> =>
<<"tx_+QENCwH4hLhAdU5TGcrejxQkGW36Bb7mfyY/N6FwCG5qJxSDCpUmcIv0ie2oy0TzPWq9TpTNeby7zYVcnO/hjIUY1S+KiDTrBLhAn0NBWw3RzA4FS9tujscTinOUTK4jm5RuG7eRMyrMycUVRl4olmgxkDHIPLi3YMMzJ+sdHgK3wHvkxUETOYCDA7iD+IEyAaEBnvLdFTEfyWE16WLId900E+O0wsvSmKqkynYPhUodScSGP6olImAAoQG5u4uTbiAM4+qz6hnyjAQZJnfxP/hQFvbjGc7kagT/PoYkYTnKgAACCgCGEAZ510gAwKCBwESIXtJYQs/2KrdjFqVbmEKLcMJyZ1sylrOYFH8zrQJifO77">>}},
<<"version">> => 1}},
{<0.9163.0>,websocket_event,channel,
update,
#{<<"jsonrpc">> => <<"2.0">>,
<<"method">> => <<"channels.update">>,
<<"params">> =>
#{<<"channel_id">> =>
<<"ch_21woyLUNVapgSZrHrbdTKohDG5Yad9xNw4wFsrLzf8sKFspZim">>,
<<"data">> =>
#{<<"state">> =>
<<"tx_+QENCwH4hLhAdU5TGcrejxQkGW36Bb7mfyY/N6FwCG5qJxSDCpUmcIv0ie2oy0TzPWq9TpTNeby7zYVcnO/hjIUY1S+KiDTrBLhAn0NBWw3RzA4FS9tujscTinOUTK4jm5RuG7eRMyrMycUVRl4olmgxkDHIPLi3YMMzJ+sdHgK3wHvkxUETOYCDA7iD+IEyAaEBnvLdFTEfyWE16WLId900E+O0wsvSmKqkynYPhUodScSGP6olImAAoQG5u4uTbiAM4+qz6hnyjAQZJnfxP/hQFvbjGc7kagT/PoYkYTnKgAACCgCGEAZ510gAwKCBwESIXtJYQs/2KrdjFqVbmEKLcMJyZ1sylrOYFH8zrQJifO77">>}},
<<"version">> => 1}}]}}
in function aehttp_ws_test_utils:wait_for_msg/5 (/home/builder/aeternity/apps/aehttp/test/aehttp_ws_test_utils.erl, line 324)
in call from aehttp_sc_SUITE:wait_for_channel_event_/3 (/home/builder/aeternity/apps/aehttp/test/aehttp_sc_SUITE.erl, line 4294)
in call from aehttp_sc_SUITE:wait_for_channel_event_match/4 (/home/builder/aeternity/apps/aehttp/test/aehttp_sc_SUITE.erl, line 4268)
in call from aehttp_sc_SUITE:channel_send_chan_open_infos/3 (/home/builder/aeternity/apps/aehttp/test/aehttp_sc_SUITE.erl, line 894)
in call from aehttp_sc_SUITE:finish_sc_ws_open/2 (/home/builder/aeternity/apps/aehttp/test/aehttp_sc_SUITE.erl, line 842)
in call from aehttp_sc_SUITE:sc_ws_open_/4 (/home/builder/aeternity/apps/aehttp/test/aehttp_sc_SUITE.erl, line 775)
in call from aehttp_sc_SUITE:sc_ws_min_depth_is_modifiable/1 (/home/builder/aeternity/apps/aehttp/test/aehttp_sc_SUITE.erl, line 3012)
in call from test_server:ts_tc/3 (test_server.erl, line 1755)
```
From some log analysis, it seems as if the problem is that the channel is opened with `minimum_depth => 0`. This confuses the generic channel setup code, which has a finishing phase where, optionally, blocks are mined to ensure that the `create_tx` is actually included in a block, and minimum depth is reached. This is triggered for the test case in question, but the tx has already been included, and since `minimum_depth == 0`, minimum depth has also been reached and the associated info reports already delivered.
In a failing run, the following could be seen from the test case output:
```
*** User 2020-08-25 07:23:33.929 ***
aec_conductor:start_mining(#{}) (aeternity_dev1@localhost) -> ok
*** User 2020-08-25 07:23:33.973 ***
aec_conductor:stop_mining() (aeternity_dev1@localhost) -> ok
*** CT Error Notification 2020-08-25 07:23:45.980 ***
aehttp_ws_test_utils:wait_for_msg failed on line 324
Reason: timeout
```
From the stacktrace above, we can see that the test core is waiting for an `open` info msg (line 842).
But scrolling up, we find those messages already delivered, although the test case code wasn't ready for them then.
```
*** User 2020-08-25 07:23:33.908 ***
No test registered for this event (Msg = #{<<"jsonrpc">> => <<"2.0">>,
<<"method">> => <<"channels.info">>,
<<"params">> =>
#{<<"channel_id">> =>
<<"ch_21woyLUNVapgSZrHrbdTKohDG5Yad9xNw4wFsrLzf8sKFspZim">>,
<<"data">> =>
#{<<"event">> =>
<<"open">>}},
<<"version">> => 1})
*** User 2020-08-25 07:23:33.909 ***
[initiator] Received msg #{<<"jsonrpc">> => <<"2.0">>,
<<"method">> => <<"channels.info">>,
<<"params">> =>
#{<<"channel_id">> =>
<<"ch_21woyLUNVapgSZrHrbdTKohDG5Yad9xNw4wFsrLzf8sKFspZim">>,
<<"data">> => #{<<"event">> => <<"open">>}},
<<"version">> => 1}
```
|
test
|
aehttp sc suite failure timeout waiting for channel open messages the aehttp sc suite sc ws min depth is modifiable test case fails with a timeout at least in some runs reason timeout messages websocket event channel update websocket event channel update in function aehttp ws test utils wait for msg home builder aeternity apps aehttp test aehttp ws test utils erl line in call from aehttp sc suite wait for channel event home builder aeternity apps aehttp test aehttp sc suite erl line in call from aehttp sc suite wait for channel event match home builder aeternity apps aehttp test aehttp sc suite erl line in call from aehttp sc suite channel send chan open infos home builder aeternity apps aehttp test aehttp sc suite erl line in call from aehttp sc suite finish sc ws open home builder aeternity apps aehttp test aehttp sc suite erl line in call from aehttp sc suite sc ws open home builder aeternity apps aehttp test aehttp sc suite erl line in call from aehttp sc suite sc ws min depth is modifiable home builder aeternity apps aehttp test aehttp sc suite erl line in call from test server ts tc test server erl line from some log analysis it seems as if the problem is that the channel is opened with minimum depth this confuses the generic channel setup code which has a finishing phase where optionally blocks are mined to ensure that the create tx is actually included in a block and minimum depth is reached this is triggered for the test case in question but the tx has already been included and since minimum depth minimum depth has also been reached and the associated info reports already delivered in a failing run the following could be seen from the test case output user aec conductor start mining aeternity localhost ok user aec conductor stop mining aeternity localhost ok ct error notification aehttp ws test utils wait for msg failed on line reason timeout from the stacktrace above we can see that the test core is waiting for an open info msg line but scrolling up we find those messages already delivered although the test case code wasn t ready for them then user no test registered for this event msg user received msg
| 1
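The race in the record above — channel `open` info messages delivered before the test has registered a handler for them — is a generic wait/buffer problem. Below is a minimal Python sketch of a waiter that first consults an early-delivery buffer before blocking on the live inbox. It is illustrative only: the real suite is Erlang, and `wait_for_msg`, its parameters, and the message shapes used here are hypothetical stand-ins for `aehttp_ws_test_utils:wait_for_msg/5`.

```python
import queue

def wait_for_msg(inbox, buffered, match, timeout=1.0):
    """Wait for a message satisfying `match`, checking messages that were
    already delivered (and buffered) before the caller was ready for them."""
    # Check the early-delivery buffer first: with minimum_depth == 0 the
    # 'open' info can arrive before the test registers for it.
    for i, msg in enumerate(buffered):
        if match(msg):
            return buffered.pop(i)
    # Otherwise block on the live inbox until the deadline.
    try:
        while True:
            msg = inbox.get(timeout=timeout)
            if match(msg):
                return msg
            # Keep non-matching messages around for later waiters.
            buffered.append(msg)
    except queue.Empty:
        raise TimeoutError("no matching message within timeout")
```

With this shape, a message that arrived "too early" (like the unconsumed `channels.info` events in the log excerpt) is still found on a later wait instead of causing a timeout.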
|
61,240
| 17,023,644,742
|
IssuesEvent
|
2021-07-03 03:04:51
|
tomhughes/trac-tickets
|
https://api.github.com/repos/tomhughes/trac-tickets
|
closed
|
Islands don't render in riverbank multipolygons
|
Component: opencyclemap Priority: major Resolution: fixed Type: defect
|
**[Submitted to the original trac issue database at 9.56pm, Friday, 15th October 2010]**
For instance:
http://www.openstreetmap.org/?lat=51.746065&lon=-1.256518&zoom=18&layers=C
There's an island on the other layers
|
1.0
|
Islands don't render in riverbank multipolygons - **[Submitted to the original trac issue database at 9.56pm, Friday, 15th October 2010]**
For instance:
http://www.openstreetmap.org/?lat=51.746065&lon=-1.256518&zoom=18&layers=C
There's an island on the other layers
|
non_test
|
islands don t render in riverbank multipolygons for instance there s an island on the other layers
| 0
|
768,677
| 26,975,802,465
|
IssuesEvent
|
2023-02-09 09:28:15
|
ita-social-projects/TeachUA
|
https://api.github.com/repos/ita-social-projects/TeachUA
|
closed
|
[Челендж] Using Invalid language in "Челендж" creating
|
bug Backend Priority: High Actual
|
Environment: Windows 10 Pro 21H1,OS build19043.1586, Google Chrome, version 99.0.4844.84.
Reproducible: always.
Build found:last commit
Preconditions
Log in as an administrator on https://speak-ukrainian.org.ua/dev/
Steps to reproduce"
1. Click on 'Aдміністрування'
2. Click on 'Додати Челендж'
3. Fill in all required fields
4. Fill in with languages which are not required according to the US
5. Created challegne with Japanesse and German languages
6. Click on "Зберегти" button
**Actual result**
Challenge will be saved
**Expected result**
Challegne would'nt save
https://user-images.githubusercontent.com/57858834/162674623-d0600448-97c6-44ae-b256-928432782563.mp4
https://user-images.githubusercontent.com/57858834/162674643-d07f6e1b-6c9f-45cb-936f-b952224c3580.mp4
**User story and test case links**
E.g.: "User story #100
[Test case](https://jira.softserve.academy/browse/100)"
**Labels to be added**
"Bug", Priority ("pri: "), Severity ("severity:"), Type ("UI, "Functional"), "API" (for back-end bugs).
|
1.0
|
[Челендж] Using Invalid language in "Челендж" creating - Environment: Windows 10 Pro 21H1,OS build19043.1586, Google Chrome, version 99.0.4844.84.
Reproducible: always.
Build found:last commit
Preconditions
Log in as an administrator on https://speak-ukrainian.org.ua/dev/
Steps to reproduce"
1. Click on 'Aдміністрування'
2. Click on 'Додати Челендж'
3. Fill in all required fields
4. Fill in with languages which are not required according to the US
5. Created challegne with Japanesse and German languages
6. Click on "Зберегти" button
**Actual result**
Challenge will be saved
**Expected result**
Challegne would'nt save
https://user-images.githubusercontent.com/57858834/162674623-d0600448-97c6-44ae-b256-928432782563.mp4
https://user-images.githubusercontent.com/57858834/162674643-d07f6e1b-6c9f-45cb-936f-b952224c3580.mp4
**User story and test case links**
E.g.: "User story #100
[Test case](https://jira.softserve.academy/browse/100)"
**Labels to be added**
"Bug", Priority ("pri: "), Severity ("severity:"), Type ("UI, "Functional"), "API" (for back-end bugs).
|
non_test
|
using invalid language in челендж creating environment windows pro os google chrome version reproducible always build found last commit preconditions log in as an administrator on steps to reproduce click on aдміністрування click on додати челендж fill in all required fields fill in with languages which are not required according to the us created challegne with japanesse and german languages click on зберегти button actual result challenge will be saved expected result challegne would nt save user story and test case links e g user story labels to be added bug priority pri severity severity type ui functional api for back end bugs
| 0
|
329,405
| 28,240,343,843
|
IssuesEvent
|
2023-04-06 06:33:14
|
Lurkars/gloomhavensecretariat
|
https://api.github.com/repos/Lurkars/gloomhavensecretariat
|
closed
|
Display issues on scenario summary
|
bug to test
|
There are some display issues when one or more characters are absent in scenario summary. In the example below, Blinkblade is absent.

|
1.0
|
Display issues on scenario summary - There are some display issues when one or more characters are absent in scenario summary. In the example below, Blinkblade is absent.

|
test
|
display issues on scenario summary there are some display issues when one or more characters are absent in scenario summary in the example below blinkblade is absent
| 1
|
32,003
| 4,732,641,742
|
IssuesEvent
|
2016-10-19 08:34:17
|
Kademi/kademi-dev
|
https://api.github.com/repos/Kademi/kademi-dev
|
closed
|
Leadman: Allow to save custom fields on a lead profile page
|
enhancement Ready to Test - Dev
|
This should populate customs fields on a group
|
1.0
|
Leadman: Allow to save custom fields on a lead profile page - This should populate customs fields on a group
|
test
|
leadman allow to save custom fields on a lead profile page this should populate customs fields on a group
| 1
|
444,266
| 31,030,154,069
|
IssuesEvent
|
2023-08-10 11:56:27
|
appsmithorg/appsmith-docs
|
https://api.github.com/repos/appsmithorg/appsmith-docs
|
opened
|
[Docs]: Setup Server-side Filtering on Table
|
Documentation User Education Pod
|
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Documentation Link
_No response_
### Discord/slack/intercom Link
_No response_
### Describe the problem and improvement.
Setup Server-side Filtering on Table
|
1.0
|
[Docs]: Setup Server-side Filtering on Table - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Documentation Link
_No response_
### Discord/slack/intercom Link
_No response_
### Describe the problem and improvement.
Setup Server-side Filtering on Table
|
non_test
|
setup server side filtering on table is there an existing issue for this i have searched the existing issues documentation link no response discord slack intercom link no response describe the problem and improvement setup server side filtering on table
| 0
|
273,354
| 23,748,761,197
|
IssuesEvent
|
2022-08-31 18:28:30
|
phetsims/molecule-shapes
|
https://api.github.com/repos/phetsims/molecule-shapes
|
closed
|
Uncaught Error: Assertion failed: tried to removeListener on something that wasn't a listener
|
status:ready-for-review type:automated-testing
|
From CT:
```
molecule-shapes-basics : fuzz : unbuilt
https://bayes.colorado.edu/continuous-testing/ct-snapshots/1661714801238/molecule-shapes-basics/molecule-shapes-basics_en.html?continuousTest=%7B%22test%22%3A%5B%22molecule-shapes-basics%22%2C%22fuzz%22%2C%22unbuilt%22%5D%2C%22snapshotName%22%3A%22snapshot-1661714801238%22%2C%22timestamp%22%3A1661720765662%7D&brand=phet&ea&fuzz&memoryLimit=1000
Query: brand=phet&ea&fuzz&memoryLimit=1000
Uncaught Error: Assertion failed: tried to removeListener on something that wasn't a listener
Error: Assertion failed: tried to removeListener on something that wasn't a listener
at window.assertions.assertFunction (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1661714801238/assert/js/assert.js:28:13)
at assert (TinyEmitter.ts:149:6)
at removeListener (TinyProperty.ts:154:9)
at unlink (ReadOnlyProperty.ts:418:22)
at unlink (LonePairView.js:123:28)
at dispose (MoleculeView.js:133:19)
at dispose (BondGroupNode.js:91:7)
at getBondDataURL (BondGroupNode.js:155:29)
at (ModelMoleculesScreenView.js:58:25)
at (ModelMoleculesScreen.js:39:15)
id: Bayes Puppeteer
Snapshot from 8/28/2022, 1:26:41 PM
```
|
1.0
|
Uncaught Error: Assertion failed: tried to removeListener on something that wasn't a listener - From CT:
```
molecule-shapes-basics : fuzz : unbuilt
https://bayes.colorado.edu/continuous-testing/ct-snapshots/1661714801238/molecule-shapes-basics/molecule-shapes-basics_en.html?continuousTest=%7B%22test%22%3A%5B%22molecule-shapes-basics%22%2C%22fuzz%22%2C%22unbuilt%22%5D%2C%22snapshotName%22%3A%22snapshot-1661714801238%22%2C%22timestamp%22%3A1661720765662%7D&brand=phet&ea&fuzz&memoryLimit=1000
Query: brand=phet&ea&fuzz&memoryLimit=1000
Uncaught Error: Assertion failed: tried to removeListener on something that wasn't a listener
Error: Assertion failed: tried to removeListener on something that wasn't a listener
at window.assertions.assertFunction (https://bayes.colorado.edu/continuous-testing/ct-snapshots/1661714801238/assert/js/assert.js:28:13)
at assert (TinyEmitter.ts:149:6)
at removeListener (TinyProperty.ts:154:9)
at unlink (ReadOnlyProperty.ts:418:22)
at unlink (LonePairView.js:123:28)
at dispose (MoleculeView.js:133:19)
at dispose (BondGroupNode.js:91:7)
at getBondDataURL (BondGroupNode.js:155:29)
at (ModelMoleculesScreenView.js:58:25)
at (ModelMoleculesScreen.js:39:15)
id: Bayes Puppeteer
Snapshot from 8/28/2022, 1:26:41 PM
```
|
test
|
uncaught error assertion failed tried to removelistener on something that wasn t a listener from ct molecule shapes basics fuzz unbuilt query brand phet ea fuzz memorylimit uncaught error assertion failed tried to removelistener on something that wasn t a listener error assertion failed tried to removelistener on something that wasn t a listener at window assertions assertfunction at assert tinyemitter ts at removelistener tinyproperty ts at unlink readonlyproperty ts at unlink lonepairview js at dispose moleculeview js at dispose bondgroupnode js at getbonddataurl bondgroupnode js at modelmoleculesscreenview js at modelmoleculesscreen js id bayes puppeteer snapshot from pm
| 1
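The failing assertion in the record above guards against unlinking a listener that was never added, or was already removed along an overlapping `dispose` path. A minimal Python sketch of the same guard follows; it is illustrative only — this `TinyEmitter` is a hypothetical stand-in for the TypeScript original, not its API.

```python
class TinyEmitter:
    """Minimal listener registry illustrating the double-remove guard."""

    def __init__(self):
        self._listeners = []

    def add_listener(self, fn):
        self._listeners.append(fn)

    def remove_listener(self, fn):
        # Mirrors the failing assertion: removing something that is not a
        # registered listener indicates a double-dispose or bookkeeping bug.
        assert fn in self._listeners, \
            "tried to removeListener on something that wasn't a listener"
        self._listeners.remove(fn)
```

A stack like the one above (dispose → dispose → unlink) trips this guard when two ownership paths both try to unlink the same property listener.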
|
115,350
| 9,792,190,065
|
IssuesEvent
|
2019-06-10 16:46:40
|
rancher/rancher
|
https://api.github.com/repos/rancher/rancher
|
closed
|
Global Registry Notary cpu and memory reservation doesn't work
|
[zube]: To Test alpha area/registry area/ui kind/bug-qa team/ui
|
rancher/rancher:master
Steps:
1. Change the cpu reservation of Notary to 450m
2. Enable Global Registry
3. Check the reservation on Notary workload
Results:
It's not 450m which I configured
|
1.0
|
Global Registry Notary cpu and memory reservation doesn't work - rancher/rancher:master
Steps:
1. Change the cpu reservation of Notary to 450m
2. Enable Global Registry
3. Check the reservation on Notary workload
Results:
It's not 450m which I configured
|
test
|
global registry notary cpu and memory reservation doesn t work rancher rancher master steps change the cpu reservation of notary to enable global registry check the reservation on notary workload results it s not which i configured
| 1
|
285,942
| 24,708,812,885
|
IssuesEvent
|
2022-10-19 21:42:54
|
pytorch/pytorch
|
https://api.github.com/repos/pytorch/pytorch
|
opened
|
DISABLED test_comprehensive_addmm_cuda_complex64 (__main__.TestDecompCUDA)
|
triaged module: flaky-tests skipped module: primTorch module: decompositions
|
Platforms: rocm
This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_comprehensive_addmm_cuda_complex64&suite=TestDecompCUDA) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/8990116489).
Over the past 3 hours, it has been determined flaky in 2 workflow(s) with 2 failures and 2 successes.
**Debugging instructions (after clicking on the recent samples link):**
DO NOT BE ALARMED IF THE CI IS GREEN. We now shield flaky tests from developers so CI will thus be green but it will be harder to parse the logs.
To find relevant log snippets:
1. Click on the workflow logs linked above
2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work.
3. Grep for `test_comprehensive_addmm_cuda_complex64`
4. There should be several instances run (as flaky tests are rerun in CI) from which you can study the logs.
|
1.0
|
DISABLED test_comprehensive_addmm_cuda_complex64 (__main__.TestDecompCUDA) - Platforms: rocm
This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_comprehensive_addmm_cuda_complex64&suite=TestDecompCUDA) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/8990116489).
Over the past 3 hours, it has been determined flaky in 2 workflow(s) with 2 failures and 2 successes.
**Debugging instructions (after clicking on the recent samples link):**
DO NOT BE ALARMED IF THE CI IS GREEN. We now shield flaky tests from developers so CI will thus be green but it will be harder to parse the logs.
To find relevant log snippets:
1. Click on the workflow logs linked above
2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work.
3. Grep for `test_comprehensive_addmm_cuda_complex64`
4. There should be several instances run (as flaky tests are rerun in CI) from which you can study the logs.
|
test
|
disabled test comprehensive addmm cuda main testdecompcuda platforms rocm this test was disabled because it is failing in ci see and the most recent trunk over the past hours it has been determined flaky in workflow s with failures and successes debugging instructions after clicking on the recent samples link do not be alarmed if the ci is green we now shield flaky tests from developers so ci will thus be green but it will be harder to parse the logs to find relevant log snippets click on the workflow logs linked above click on the test step of the job so that it is expanded otherwise the grepping will not work grep for test comprehensive addmm cuda there should be several instances run as flaky tests are rerun in ci from which you can study the logs
| 1
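Step 3 of the debugging instructions in the record above ("grep for the test name") can be sketched as a small helper that returns context lines around each hit. This is illustrative only — `grep_context` is a hypothetical helper, not part of any PyTorch tooling.

```python
def grep_context(log_text, needle, context=2):
    """Return one snippet of `context` surrounding lines for each log line
    containing `needle` (a plain-substring analogue of `grep -C`)."""
    lines = log_text.splitlines()
    snippets = []
    for i, line in enumerate(lines):
        if needle in line:
            lo = max(0, i - context)
            hi = min(len(lines), i + context + 1)
            snippets.append("\n".join(lines[lo:hi]))
    return snippets
```

Since flaky tests are rerun in CI, expect several snippets per log, one per rerun of the disabled test.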
|
120,882
| 10,138,262,019
|
IssuesEvent
|
2019-08-02 17:28:18
|
GeoDaCenter/geoda
|
https://api.github.com/repos/GeoDaCenter/geoda
|
closed
|
Linux Crash: Weights Manager w/ Variable Grouping Editor
|
OpSys-Linux to be tested
|
From:
Details:
os: 3-4-19
vs: 1-12-1-201
/usr/local/geoda/web_plugins/logger.txt
Click GdaFrame::OnNewProject
ConnectDatasourceDlg::InitSamplePanel()
Check auto update:
AutoUpdate::CheckUpdate()
AutoUpdate::GetCheckList()
AutoUpdate::ReadUrlContent()
Entering ConnectDatasourceDlg::OnOkClick
ConnectDatasourceDlg::CreateDataSource()
Open Datasource:/home/gnann/geoda/SP/35MUE250GC_SIR.shp
Open Layer:35MUE250GC_SIR
Entering ConnectDatasourceDlg::SaveRecentDataSource
Exiting ConnectDatasourceDlg::SaveRecentDataSource
Exiting ConnectDatasourceDlg::OnOkClick
Entering Project::Project (new project)
Project::CommonProjectInit()
Entering Project::InitFromOgrLayer
Datasource name:
/home/gnann/geoda/SP/35MUE250GC_SIR.shp
Entering OGRTable::OGRTable
Project::GetSpatialReference()
Exiting Project::Project
Click GdaFrame::InitWithProject()
Open TableFrame.
Open MapFrame.
MapCanvas::MapCanvas()
MapCanvas::ChangeMapType()
MapCanvas::VarInfoAttributeChange()
MapCanvas::CreateAndUpdateCategories()
Open VarGroupingEditorDlg.
In VarGroupingEditorDlg::OnNewGroupNameChange
In VarGroupingEditorDlg::OnClose
Click GdaFrame::OnToolsWeightsManager
Entering WeightsManFrame::WeightsManFrame
In WeightsManFrame::SetDetailsForItem
Exiting WeightsManFrame::WeightsManFrame
In WeightsManFrame::OnActivate
In WeightsManFrame::OnLoadBtn
In WeightsManFrame::OnActivate
In WeightsManFrame::update(WeightsManState* o)
In WeightsManFrame::update(WeightsManState* o)
In WeightsManFrame::OnWListItemSelect
In WeightsManFrame::SetDetailsForItem
In WeightsManFrame::SetDetailsForItem
In WeightsManFrame::OnActivate
|
1.0
|
Linux Crash: Weights Manager w/ Variable Grouping Editor - From:
Details:
os: 3-4-19
vs: 1-12-1-201
/usr/local/geoda/web_plugins/logger.txt
Click GdaFrame::OnNewProject
ConnectDatasourceDlg::InitSamplePanel()
Check auto update:
AutoUpdate::CheckUpdate()
AutoUpdate::GetCheckList()
AutoUpdate::ReadUrlContent()
Entering ConnectDatasourceDlg::OnOkClick
ConnectDatasourceDlg::CreateDataSource()
Open Datasource:/home/gnann/geoda/SP/35MUE250GC_SIR.shp
Open Layer:35MUE250GC_SIR
Entering ConnectDatasourceDlg::SaveRecentDataSource
Exiting ConnectDatasourceDlg::SaveRecentDataSource
Exiting ConnectDatasourceDlg::OnOkClick
Entering Project::Project (new project)
Project::CommonProjectInit()
Entering Project::InitFromOgrLayer
Datasource name:
/home/gnann/geoda/SP/35MUE250GC_SIR.shp
Entering OGRTable::OGRTable
Project::GetSpatialReference()
Exiting Project::Project
Click GdaFrame::InitWithProject()
Open TableFrame.
Open MapFrame.
MapCanvas::MapCanvas()
MapCanvas::ChangeMapType()
MapCanvas::VarInfoAttributeChange()
MapCanvas::CreateAndUpdateCategories()
Open VarGroupingEditorDlg.
In VarGroupingEditorDlg::OnNewGroupNameChange
In VarGroupingEditorDlg::OnClose
Click GdaFrame::OnToolsWeightsManager
Entering WeightsManFrame::WeightsManFrame
In WeightsManFrame::SetDetailsForItem
Exiting WeightsManFrame::WeightsManFrame
In WeightsManFrame::OnActivate
In WeightsManFrame::OnLoadBtn
In WeightsManFrame::OnActivate
In WeightsManFrame::update(WeightsManState* o)
In WeightsManFrame::update(WeightsManState* o)
In WeightsManFrame::OnWListItemSelect
In WeightsManFrame::SetDetailsForItem
In WeightsManFrame::SetDetailsForItem
In WeightsManFrame::OnActivate
|
test
|
linux crash weights manager w variable grouping editor from details os vs usr local geoda web plugins logger txt click gdaframe onnewproject connectdatasourcedlg initsamplepanel check auto update autoupdate checkupdate autoupdate getchecklist autoupdate readurlcontent entering connectdatasourcedlg onokclick connectdatasourcedlg createdatasource open datasource home gnann geoda sp sir shp open layer sir entering connectdatasourcedlg saverecentdatasource exiting connectdatasourcedlg saverecentdatasource exiting connectdatasourcedlg onokclick entering project project new project project commonprojectinit entering project initfromogrlayer datasource name home gnann geoda sp sir shp entering ogrtable ogrtable project getspatialreference exiting project project click gdaframe initwithproject open tableframe open mapframe mapcanvas mapcanvas mapcanvas changemaptype mapcanvas varinfoattributechange mapcanvas createandupdatecategories open vargroupingeditordlg in vargroupingeditordlg onnewgroupnamechange in vargroupingeditordlg onclose click gdaframe ontoolsweightsmanager entering weightsmanframe weightsmanframe in weightsmanframe setdetailsforitem exiting weightsmanframe weightsmanframe in weightsmanframe onactivate in weightsmanframe onloadbtn in weightsmanframe onactivate in weightsmanframe update weightsmanstate o in weightsmanframe update weightsmanstate o in weightsmanframe onwlistitemselect in weightsmanframe setdetailsforitem in weightsmanframe setdetailsforitem in weightsmanframe onactivate
| 1
|
24,589
| 4,099,348,334
|
IssuesEvent
|
2016-06-03 12:19:19
|
elastic/logstash
|
https://api.github.com/repos/elastic/logstash
|
closed
|
branch 5.0 core tests failing
|
test failure
|
https://travis-ci.org/elastic/logstash/builds/134732394
```
rake test:install-core 102.80s user 5.56s system 59% cpu 3:03.54 total
--- jar coordinate com.fasterxml.jackson.core:jackson-annotations already loaded with version 2.7.1 - omit version 2.7.0
--- jar coordinate com.fasterxml.jackson.core:jackson-databind already loaded with version 2.7.1 - omit version 2.7.1-1
Using Accessor#strict_set for specs
rake aborted!
NameError: uninitialized constant LogStash::Api::RackApp::ApiErrorHandler
/Users/joaoduarte/projects/logstash/logstash-core/spec/api/lib/rack_app_spec.rb:26:in `(root)'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/example_group.rb:325:in `subclass'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/example_group.rb:219:in `describe'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/dsl.rb:41:in `describe'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/dsl.rb:79:in `describe'
/Users/joaoduarte/projects/logstash/logstash-core/spec/api/lib/rack_app_spec.rb:4:in `(root)'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/configuration.rb:1:in `(root)'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/configuration.rb:1105:in `load_spec_files'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/configuration.rb:1105:in `load_spec_files'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/runner.rb:96:in `setup'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/runner.rb:84:in `run'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/runner.rb:69:in `run'
/Users/joaoduarte/projects/logstash/rakelib/test.rake:42:in `(root)'
Tasks: TOP => test:core
```
@andrewvc I believe ef18693 is missing in 5.0?
|
1.0
|
branch 5.0 core tests failing - https://travis-ci.org/elastic/logstash/builds/134732394
```
rake test:install-core 102.80s user 5.56s system 59% cpu 3:03.54 total
--- jar coordinate com.fasterxml.jackson.core:jackson-annotations already loaded with version 2.7.1 - omit version 2.7.0
--- jar coordinate com.fasterxml.jackson.core:jackson-databind already loaded with version 2.7.1 - omit version 2.7.1-1
Using Accessor#strict_set for specs
rake aborted!
NameError: uninitialized constant LogStash::Api::RackApp::ApiErrorHandler
/Users/joaoduarte/projects/logstash/logstash-core/spec/api/lib/rack_app_spec.rb:26:in `(root)'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/example_group.rb:325:in `subclass'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/example_group.rb:219:in `describe'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/dsl.rb:41:in `describe'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/dsl.rb:79:in `describe'
/Users/joaoduarte/projects/logstash/logstash-core/spec/api/lib/rack_app_spec.rb:4:in `(root)'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/configuration.rb:1:in `(root)'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/configuration.rb:1105:in `load_spec_files'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/configuration.rb:1105:in `load_spec_files'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/runner.rb:96:in `setup'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/runner.rb:84:in `run'
/Users/joaoduarte/projects/logstash/vendor/bundle/jruby/1.9/gems/rspec-core-3.1.7/lib/rspec/core/runner.rb:69:in `run'
/Users/joaoduarte/projects/logstash/rakelib/test.rake:42:in `(root)'
Tasks: TOP => test:core
```
@andrewvc I believe ef18693 is missing in 5.0?
|
test
|
branch core tests failing rake test install core user system cpu total jar coordinate com fasterxml jackson core jackson annotations already loaded with version omit version jar coordinate com fasterxml jackson core jackson databind already loaded with version omit version using accessor strict set for specs rake aborted nameerror uninitialized constant logstash api rackapp apierrorhandler users joaoduarte projects logstash logstash core spec api lib rack app spec rb in root users joaoduarte projects logstash vendor bundle jruby gems rspec core lib rspec core example group rb in subclass users joaoduarte projects logstash vendor bundle jruby gems rspec core lib rspec core example group rb in describe users joaoduarte projects logstash vendor bundle jruby gems rspec core lib rspec core dsl rb in describe users joaoduarte projects logstash vendor bundle jruby gems rspec core lib rspec core dsl rb in describe users joaoduarte projects logstash logstash core spec api lib rack app spec rb in root users joaoduarte projects logstash vendor bundle jruby gems rspec core lib rspec core configuration rb in root users joaoduarte projects logstash vendor bundle jruby gems rspec core lib rspec core configuration rb in load spec files users joaoduarte projects logstash vendor bundle jruby gems rspec core lib rspec core configuration rb in load spec files users joaoduarte projects logstash vendor bundle jruby gems rspec core lib rspec core runner rb in setup users joaoduarte projects logstash vendor bundle jruby gems rspec core lib rspec core runner rb in run users joaoduarte projects logstash vendor bundle jruby gems rspec core lib rspec core runner rb in run users joaoduarte projects logstash rakelib test rake in root tasks top test core andrewvc i believe is missing in
| 1
|
144,414
| 11,615,652,749
|
IssuesEvent
|
2020-02-26 14:32:35
|
JohanKJIP/algorithms
|
https://api.github.com/repos/JohanKJIP/algorithms
|
closed
|
Testcase for requirement 6 for dijkstra's
|
test
|
Requirement 6: The time complexity should be O(Elog(V))
|
1.0
|
Testcase for requirement 6 for dijkstra's - Requirement 6: The time complexity should be O(Elog(V))
|
test
|
testcase for requirement for dijkstra s requirement the time complexity should be o elog v
| 1
|
17,449
| 3,618,107,763
|
IssuesEvent
|
2016-02-08 09:52:58
|
backbee/backbee-standard
|
https://api.github.com/repos/backbee/backbee-standard
|
closed
|
[PAGE MANAGEMENET] Disable contextual menu in page management tree
|
enhancement To test
|
Disable contextual menu in page management tree
|
1.0
|
[PAGE MANAGEMENET] Disable contextual menu in page management tree - Disable contextual menu in page management tree
|
test
|
disable contextual menu in page management tree disable contextual menu in page management tree
| 1
|
4,150
| 2,711,585,894
|
IssuesEvent
|
2015-04-09 07:41:09
|
IDgis/geo-publisher-test
|
https://api.github.com/repos/IDgis/geo-publisher-test
|
closed
|
Lagen / lagen aanpassen : zit in de groep, wordt niet getoond
|
enhancement readyfortest
|
Als ik een laag in een groep heb geplaatst moet hier weergegeven worden dat hij in een groep zit.
zie Laag Aardgastransportleidingen aanpassen
|
1.0
|
Lagen / lagen aanpassen : zit in de groep, wordt niet getoond - Als ik een laag in een groep heb geplaatst moet hier weergegeven worden dat hij in een groep zit.
zie Laag Aardgastransportleidingen aanpassen
|
test
|
lagen lagen aanpassen zit in de groep wordt niet getoond als ik een laag in een groep heb geplaatst moet hier weergegeven worden dat hij in een groep zit zie laag aardgastransportleidingen aanpassen
| 1
|
784,708
| 27,582,909,844
|
IssuesEvent
|
2023-03-08 17:25:43
|
eLearningDAO/POCRE
|
https://api.github.com/repos/eLearningDAO/POCRE
|
closed
|
Claim material and share link only for published creation
|
enhancement low priority
|
The check (if claimable/shareable or not) should also be done at API level as well.
1) the claim button only for published ~~material~~ creation

2) the sharing function only for published ~~material~~ creation

|
1.0
|
Claim material and share link only for published creation - The check (if claimable/shareable or not) should also be done at API level as well.
1) the claim button only for published ~~material~~ creation

2) the sharing function only for published ~~material~~ creation

|
non_test
|
claim material and share link only for published creation the check if claimable shareable or not should also be done at api level as well the claim button only for published material creation the sharing function only for published material creation
| 0
|
57,690
| 6,554,042,850
|
IssuesEvent
|
2017-09-06 02:50:23
|
easydigitaldownloads/edd-free-downloads
|
https://api.github.com/repos/easydigitaldownloads/edd-free-downloads
|
closed
|
File downloads fail on mobile when product has variable prices and file is assigned to price
|
Bug Has PR Needs Testing
|
File downloads fail on mobile when the following conditions are true:
- variable prices enabled
- file assigned to specific price
- price option is free
Disabling variable prices fixes the issue.
See https://secure.helpscout.net/conversation/374970730/60426/?folderId=180505
|
1.0
|
File downloads fail on mobile when product has variable prices and file is assigned to price - File downloads fail on mobile when the following conditions are true:
- variable prices enabled
- file assigned to specific price
- price option is free
Disabling variable prices fixes the issue.
See https://secure.helpscout.net/conversation/374970730/60426/?folderId=180505
|
test
|
file downloads fail on mobile when product has variable prices and file is assigned to price file downloads fail on mobile when the following conditions are true variable prices enabled file assigned to specific price price option is free disabling variable prices fixes the issue see
| 1
|
296,006
| 25,521,385,741
|
IssuesEvent
|
2022-11-28 20:49:02
|
rancher/qa-tasks
|
https://api.github.com/repos/rancher/qa-tasks
|
closed
|
Add support for provisioning k3s hardened Rancher managed Custom clusters
|
team/area2 [zube]: QA Review area/automation-test
|
### Issue Description
When provisioning clusters, we should have automated tests to harden the clusters. For security purposes, this should be seen as the default, so tests should be added to our framework
---
- [x] Deploy a k3s hardened local cluster HA rancher deploy - incorporated existing hardened k3s cluster code in our Jenkins job to enable it.
- [x] Deploy a k3s hardened downstream custom cluster
|
1.0
|
Add support for provisioning k3s hardened Rancher managed Custom clusters - ### Issue Description
When provisioning clusters, we should have automated tests to harden the clusters. For security purposes, this should be seen as the default, so tests should be added to our framework
---
- [x] Deploy a k3s hardened local cluster HA rancher deploy - incorporated existing hardened k3s cluster code in our Jenkins job to enable it.
- [x] Deploy a k3s hardened downstream custom cluster
|
test
|
add support for provisioning hardened rancher managed custom clusters issue description when provisioning clusters we should have automated tests to harden the clusters for security purposes this should be seen as the default so tests should be added to our framework deploy a hardened local cluster ha rancher deploy incorporated existing hardened cluster code in our jenkins job to enable it deploy a hardened downstream custom cluster
| 1
|
294,204
| 25,351,901,079
|
IssuesEvent
|
2022-11-19 21:52:37
|
ValveSoftware/portal2
|
https://api.github.com/repos/ValveSoftware/portal2
|
closed
|
Portal 1 and 2 are failing on Steam Deck due to failed VR entry point
|
Need Retest
|
Portal 2
```
SDL video target is 'x11'
SDL video target is 'x11'
Using shader api: shaderapivk
Using shader api: shaderapivk
free(): invalid pointer
/home/deck/.local/share/Steam/steamapps/common/Portal 2/portal2.sh: line 51: 12457 Aborted (core dumped) ${GAME_DEBUGGER} "${GAMEROOT}"/${GAMEEXE} "$@"
```
Portal 1
```
chdir /home/deck/.local/share/Steam/steamapps/common/Portal
ERROR: ld.so: object '/home/deck/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
ERROR: ld.so: object '/home/deck/.local/share/Steam/ubuntu12_64/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS64): ignored.
ERROR: ld.so: object '/home/deck/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
ERROR: ld.so: object '/home/deck/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
ERROR: ld.so: object '/home/deck/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
ERROR: ld.so: object '/home/deck/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
pid 11856 != 11852, skipping destruction (fork without exec?)
GameAction [AppID 400, ActionID 1] : LaunchApp changed task to WaitingGameWindow with ""
GameAction [AppID 400, ActionID 1] : LaunchApp changed task to Completed with ""
Installing breakpad exception handler for appid(steam)/version(1666913041)
pid 12012 != 12008, skipping destruction (fork without exec?)
pid 12017 != 12008, skipping destruction (fork without exec?)
pid 12078 != 12008, skipping destruction (fork without exec?)
SDL video target is 'x11'
SDL video target is 'x11'
failed to dlopen /home/deck/.local/share/Steam/steamapps/common/Portal/bin/sourcevr.so error=/home/deck/.local/share/Steam/steamapps/common/Portal/bin/sourcevr.so: undefined symbol: VR_IsHmdPresent
failed to dlopen sourcevr.so error=/home/deck/.local/share/Steam/steamapps/common/Portal/bin/sourcevr.so: undefined symbol: VR_IsHmdPresent
AppFramework : Unable to load module sourcevr.so!
Using shader api: shaderapivk
free(): invalid pointer
```
|
1.0
|
Portal 1 and 2 are failing on Steam Deck due to failed VR entry point - Portal 2
```
SDL video target is 'x11'
SDL video target is 'x11'
Using shader api: shaderapivk
Using shader api: shaderapivk
free(): invalid pointer
/home/deck/.local/share/Steam/steamapps/common/Portal 2/portal2.sh: line 51: 12457 Aborted (core dumped) ${GAME_DEBUGGER} "${GAMEROOT}"/${GAMEEXE} "$@"
```
Portal 1
```
chdir /home/deck/.local/share/Steam/steamapps/common/Portal
ERROR: ld.so: object '/home/deck/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
ERROR: ld.so: object '/home/deck/.local/share/Steam/ubuntu12_64/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS64): ignored.
ERROR: ld.so: object '/home/deck/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
ERROR: ld.so: object '/home/deck/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
ERROR: ld.so: object '/home/deck/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
ERROR: ld.so: object '/home/deck/.local/share/Steam/ubuntu12_32/gameoverlayrenderer.so' from LD_PRELOAD cannot be preloaded (wrong ELF class: ELFCLASS32): ignored.
pid 11856 != 11852, skipping destruction (fork without exec?)
GameAction [AppID 400, ActionID 1] : LaunchApp changed task to WaitingGameWindow with ""
GameAction [AppID 400, ActionID 1] : LaunchApp changed task to Completed with ""
Installing breakpad exception handler for appid(steam)/version(1666913041)
pid 12012 != 12008, skipping destruction (fork without exec?)
pid 12017 != 12008, skipping destruction (fork without exec?)
pid 12078 != 12008, skipping destruction (fork without exec?)
SDL video target is 'x11'
SDL video target is 'x11'
failed to dlopen /home/deck/.local/share/Steam/steamapps/common/Portal/bin/sourcevr.so error=/home/deck/.local/share/Steam/steamapps/common/Portal/bin/sourcevr.so: undefined symbol: VR_IsHmdPresent
failed to dlopen sourcevr.so error=/home/deck/.local/share/Steam/steamapps/common/Portal/bin/sourcevr.so: undefined symbol: VR_IsHmdPresent
AppFramework : Unable to load module sourcevr.so!
Using shader api: shaderapivk
free(): invalid pointer
```
|
test
|
portal and are failing on steam deck due to failed vr entry point portal sdl video target is sdl video target is using shader api shaderapivk using shader api shaderapivk free invalid pointer home deck local share steam steamapps common portal sh line aborted core dumped game debugger gameroot gameexe portal chdir home deck local share steam steamapps common portal error ld so object home deck local share steam gameoverlayrenderer so from ld preload cannot be preloaded wrong elf class ignored error ld so object home deck local share steam gameoverlayrenderer so from ld preload cannot be preloaded wrong elf class ignored error ld so object home deck local share steam gameoverlayrenderer so from ld preload cannot be preloaded wrong elf class ignored error ld so object home deck local share steam gameoverlayrenderer so from ld preload cannot be preloaded wrong elf class ignored error ld so object home deck local share steam gameoverlayrenderer so from ld preload cannot be preloaded wrong elf class ignored error ld so object home deck local share steam gameoverlayrenderer so from ld preload cannot be preloaded wrong elf class ignored pid skipping destruction fork without exec gameaction launchapp changed task to waitinggamewindow with gameaction launchapp changed task to completed with installing breakpad exception handler for appid steam version pid skipping destruction fork without exec pid skipping destruction fork without exec pid skipping destruction fork without exec sdl video target is sdl video target is failed to dlopen home deck local share steam steamapps common portal bin sourcevr so error home deck local share steam steamapps common portal bin sourcevr so undefined symbol vr ishmdpresent failed to dlopen sourcevr so error home deck local share steam steamapps common portal bin sourcevr so undefined symbol vr ishmdpresent appframework unable to load module sourcevr so using shader api shaderapivk free invalid pointer
| 1
|
26,259
| 2,684,274,314
|
IssuesEvent
|
2015-03-28 20:35:34
|
ConEmu/old-issues
|
https://api.github.com/repos/ConEmu/old-issues
|
closed
|
Crash whilst resizing Conemu window
|
2–5 stars bug duplicate imported Priority-Medium
|
_From [col.brad...@gmail.com](https://code.google.com/u/103901031233724257395/) on January 24, 2013 00:00:36_
Required information! OS version: Win2k/WinXP/Vista/Win7/Win8 SP? x86/x64 ConEmu version: ? Far version (if you are using Far Manager): ? *Bug description* Conemu crashes whilst resizing window. *Steps to reproduction* 1. Open single instance of Conemu.
2. Set font to Consolas, 14pt, Clear-type.
3. Resize console window to maximum size on 1920x1080 monitor... CRASHY, CRASHY!
4. Workaround is to use Setting->Main->Size&Pos->Window size (cells).
_Original issue: http://code.google.com/p/conemu-maximus5/issues/detail?id=904_
|
1.0
|
Crash whilst resizing Conemu window - _From [col.brad...@gmail.com](https://code.google.com/u/103901031233724257395/) on January 24, 2013 00:00:36_
Required information! OS version: Win2k/WinXP/Vista/Win7/Win8 SP? x86/x64 ConEmu version: ? Far version (if you are using Far Manager): ? *Bug description* Conemu crashes whilst resizing window. *Steps to reproduction* 1. Open single instance of Conemu.
2. Set font to Consolas, 14pt, Clear-type.
3. Resize console window to maximum size on 1920x1080 monitor... CRASHY, CRASHY!
4. Workaround is to use Setting->Main->Size&Pos->Window size (cells).
_Original issue: http://code.google.com/p/conemu-maximus5/issues/detail?id=904_
|
non_test
|
crash whilst resizing conemu window from on january required information os version winxp vista sp conemu version far version if you are using far manager bug description conemu crashes whilst resizing window steps to reproduction open single instance of conemu set font to consolas clear type resize console window to maximum size on monitor crashy crashy workaround is to use setting main size pos window size cells original issue
| 0
|
40,099
| 5,272,561,498
|
IssuesEvent
|
2017-02-06 13:18:32
|
c2corg/v6_ui
|
https://api.github.com/repos/c2corg/v6_ui
|
closed
|
In image edition, do not allow to change the licence to copyright
|
bug fixed and ready for testing Images
|
The licence of an image should be changed to copyright only by a moderator.
But for now, everybody can change the licence of its personal images to copyright.
The copyright option must be limited to moderator only.
A simple member must be able to choose between personal and collaborative licence only.
There is no problem in the image upload tool, only in the image edit form when we modify an image.
|
1.0
|
In image edition, do not allow to change the licence to copyright - The licence of an image should be changed to copyright only by a moderator.
But for now, everybody can change the licence of its personal images to copyright.
The copyright option must be limited to moderator only.
A simple member must be able to choose between personal and collaborative licence only.
There is no problem in the image upload tool, only in the image edit form when we modify an image.
|
test
|
in image edition do not allow to change the licence to copyright the licence of an image should be changed to copyright only by a moderator but for now everybody can change the licence of its personal images to copyright the copyright option must be limited to moderator only a simple member must be able to choose between personal and collaborative licence only there is no problem in the image upload tool only in the image edit form when we modify an image
| 1
|
335,351
| 24,465,414,061
|
IssuesEvent
|
2022-10-07 14:37:36
|
Heigvd/colab
|
https://api.github.com/repos/Heigvd/colab
|
closed
|
Déplacement des ressources/documents
|
documentation
|
- [x] depuis l'affichage du document
- [x] depuis les settings du document
|
1.0
|
Déplacement des ressources/documents - - [x] depuis l'affichage du document
- [x] depuis les settings du document
|
non_test
|
déplacement des ressources documents depuis l affichage du document depuis les settings du document
| 0
|
21,511
| 10,653,449,168
|
IssuesEvent
|
2019-10-17 14:27:00
|
oasislabs/oasis-core
|
https://api.github.com/repos/oasislabs/oasis-core
|
closed
|
[EXT-SEC-AUDIT] Multiple Rust dependencies contain known vulnerabilities
|
c:security
|
*Issue transferred from an external security audit report.*
Multiple Rust dependencies were found to be affected by known vulnerabilities.
```
% cargo audit
Fetching advisory database from `https://github.com/RustSec/advisory-db.git`
Loaded 41 security advisories (from /Users/sae/.cargo/advisory-db)
Scanning Cargo.lock for vulnerabilities (215 crate dependencies)
error: Vulnerable crates found!
ID: RUSTSEC-2018-0009
Crate: crossbeam
Version: 0.2.12
Date: 2018-12-09
URL: https://github.com/crossbeam-rs/crossbeam-epoch/issues/82
Title: MsQueue and SegQueue suffer from double-free
Solution: upgrade to: >= 0.4.1
ID: RUSTSEC-2019-0011
Crate: memoffset
Version: 0.2.1
Date: 2019-07-16
URL: https://github.com/Gilnaa/memoffset/issues/9#issuecomment-505461490
Title: Flaw in offset_of and span_of causes SIGILL, drops uninitialized memory of arbitrary
type on panic in client code
Solution: upgrade to: >= 0.5.0
ID: RUSTSEC-2019-0003
Crate: protobuf
Version: 1.7.4
Date: 2019-06-08
URL: https://github.com/stepancheg/rust-protobuf/issues/411
Title: Out of Memory in stream::read_raw_bytes_into()
Solution: upgrade to: ^1.7.5 OR >= 2.6.0
ID: RUSTSEC-2019-0003
Crate: protobuf
Version: 2.4.2
Date: 2019-06-08
URL: https://github.com/stepancheg/rust-protobuf/issues/411
Title: Out of Memory in stream::read_raw_bytes_into()
Solution: upgrade to: ^1.7.5 OR >= 2.6.0
ID: RUSTSEC-2019-0009
Crate: smallvec
Version: 0.6.9
Date: 2019-06-06
URL: https://github.com/servo/rust-smallvec/issues/148
Title: Double-free and use-after-free in SmallVec::grow()
Solution: upgrade to: >= 0.6.10
ID: RUSTSEC-2019-0012
Crate: smallvec
Version: 0.6.9
Date: 2019-07-19
URL: https://github.com/servo/rust-smallvec/issues/149
Title: Memory corruption in SmallVec::grow()
Solution: upgrade to: >= 0.6.10
ID: RUSTSEC-2019-0013
Crate: spin
Version: 0.5.0
Date: 2019-08-27
URL: https://github.com/mvdnes/spin-rs/issues/65
Title: Wrong memory orderings in RwLock potentially violates mutual exclusion
Solution: upgrade to: >= 0.5.2
error: 7 vulnerabilities found!
```
Short term, update all affected crates to their latest versions.
Long term, integrate `cargo audit` into Oasis Lab’s continuous integration pipeline and do
not allow a build to succeed if it contains vulnerable dependencies.
|
True
|
[EXT-SEC-AUDIT] Multiple Rust dependencies contain known vulnerabilities - *Issue transferred from an external security audit report.*
Multiple Rust dependencies were found to be affected by known vulnerabilities.
```
% cargo audit
Fetching advisory database from `https://github.com/RustSec/advisory-db.git`
Loaded 41 security advisories (from /Users/sae/.cargo/advisory-db)
Scanning Cargo.lock for vulnerabilities (215 crate dependencies)
error: Vulnerable crates found!
ID: RUSTSEC-2018-0009
Crate: crossbeam
Version: 0.2.12
Date: 2018-12-09
URL: https://github.com/crossbeam-rs/crossbeam-epoch/issues/82
Title: MsQueue and SegQueue suffer from double-free
Solution: upgrade to: >= 0.4.1
ID: RUSTSEC-2019-0011
Crate: memoffset
Version: 0.2.1
Date: 2019-07-16
URL: https://github.com/Gilnaa/memoffset/issues/9#issuecomment-505461490
Title: Flaw in offset_of and span_of causes SIGILL, drops uninitialized memory of arbitrary
type on panic in client code
Solution: upgrade to: >= 0.5.0
ID: RUSTSEC-2019-0003
Crate: protobuf
Version: 1.7.4
Date: 2019-06-08
URL: https://github.com/stepancheg/rust-protobuf/issues/411
Title: Out of Memory in stream::read_raw_bytes_into()
Solution: upgrade to: ^1.7.5 OR >= 2.6.0
ID: RUSTSEC-2019-0003
Crate: protobuf
Version: 2.4.2
Date: 2019-06-08
URL: https://github.com/stepancheg/rust-protobuf/issues/411
Title: Out of Memory in stream::read_raw_bytes_into()
Solution: upgrade to: ^1.7.5 OR >= 2.6.0
ID: RUSTSEC-2019-0009
Crate: smallvec
Version: 0.6.9
Date: 2019-06-06
URL: https://github.com/servo/rust-smallvec/issues/148
Title: Double-free and use-after-free in SmallVec::grow()
Solution: upgrade to: >= 0.6.10
ID: RUSTSEC-2019-0012
Crate: smallvec
Version: 0.6.9
Date: 2019-07-19
URL: https://github.com/servo/rust-smallvec/issues/149
Title: Memory corruption in SmallVec::grow()
Solution: upgrade to: >= 0.6.10
ID: RUSTSEC-2019-0013
Crate: spin
Version: 0.5.0
Date: 2019-08-27
URL: https://github.com/mvdnes/spin-rs/issues/65
Title: Wrong memory orderings in RwLock potentially violates mutual exclusion
Solution: upgrade to: >= 0.5.2
error: 7 vulnerabilities found!
```
Short term, update all affected crates to their latest versions.
Long term, integrate `cargo audit` into Oasis Lab’s continuous integration pipeline and do
not allow a build to succeed if it contains vulnerable dependencies.
|
non_test
|
multiple rust dependencies contain known vulnerabilities issue transferred from an external security audit report multiple rust dependencies were found to be affected by known vulnerabilities cargo audit fetching advisory database from loaded security advisories from users sae cargo advisory db scanning cargo lock for vulnerabilities crate dependencies error vulnerable crates found id rustsec crate crossbeam version date url title msqueue and segqueue suffer from double free solution upgrade to id rustsec crate memoffset version date url title flaw in offset of and span of causes sigill drops uninitialized memory of arbitrary type on panic in client code solution upgrade to id rustsec crate protobuf version date url title out of memory in stream read raw bytes into solution upgrade to or id rustsec crate protobuf version date url title out of memory in stream read raw bytes into solution upgrade to or id rustsec crate smallvec version date url title double free and use after free in smallvec grow solution upgrade to id rustsec crate smallvec version date url title memory corruption in smallvec grow solution upgrade to id rustsec crate spin version date url title wrong memory orderings in rwlock potentially violates mutual exclusion solution upgrade to error vulnerabilities found short term update all affected crates to their latest versions long term integrate cargo audit into oasis lab’s continuous integration pipeline and do not allow a build to succeed if it contains vulnerable dependencies
| 0
|
204,069
| 15,398,714,348
|
IssuesEvent
|
2021-03-04 00:36:56
|
nucleus-security/Test-repo
|
https://api.github.com/repos/nucleus-security/Test-repo
|
opened
|
Nucleus - Project: Ticketing Rules now apply to all vulnerabilities - [High] - CentOS Security Update for kernel (CESA-2015:0864)
|
Testing
|
Source: QUALYS
Finding Description: CentOS has released security update for kernel to fix the vulnerabilities.<p>Affected Products:<br />centos 6
Impact: This vulnerability could be exploited to gain complete access to sensitive information. Malicious users could also use this vulnerability to change all the contents or configuration on the system.</p>
Target(s): Asset name: 45.55.254.143
IP: 45.55.254.143
Solution: To resolve this issue, upgrade to the latest packages which contain a patch. Refer to CentOS advisory <a href="http://lists.centos.org/pipermail/centos-announce/2015-april/021083.html">centos 6</a> for updates and patch information.
<p>Patch:<br />
Following are links for downloading patches to fix the vulnerabilities:
</p><p> <a href="http://lists.centos.org/pipermail/centos-announce/2015-april/021083.html">CESA-2015:0864: centos 6</a></p>
References:
ID:123562
CVE:CVE-2014-3215,CVE-2014-3690,CVE-2014-7825,CVE-2014-7826,CVE-2014-8171,CVE-2014-8884,CVE-2014-9529,CVE-2014-9584,CVE-2015-1421
Category:Local
PCI Flagged:1
Vendor References:CESA-2015:0864 centos 6
Bugtraq IDs:67341,70691,72356,71880,71883,70972,70971,74293
Severity: High
Date Discovered: 2020-01-07 14:35:48
Nucleus Notification Rules Triggered: GitHub Rule
Project Name: Ticketing Rules now apply to all vulnerabilities
|
1.0
|
Nucleus - Project: Ticketing Rules now apply to all vulnerabilities - [High] - CentOS Security Update for kernel (CESA-2015:0864) - Source: QUALYS
Finding Description: CentOS has released security update for kernel to fix the vulnerabilities.<p>Affected Products:<br />centos 6
Impact: This vulnerability could be exploited to gain complete access to sensitive information. Malicious users could also use this vulnerability to change all the contents or configuration on the system.</p>
Target(s): Asset name: 45.55.254.143
IP: 45.55.254.143
Solution: To resolve this issue, upgrade to the latest packages which contain a patch. Refer to CentOS advisory <a href="http://lists.centos.org/pipermail/centos-announce/2015-april/021083.html">centos 6</a> for updates and patch information.
<p>Patch:<br />
Following are links for downloading patches to fix the vulnerabilities:
</p><p> <a href="http://lists.centos.org/pipermail/centos-announce/2015-april/021083.html">CESA-2015:0864: centos 6</a></p>
References:
ID:123562
CVE:CVE-2014-3215,CVE-2014-3690,CVE-2014-7825,CVE-2014-7826,CVE-2014-8171,CVE-2014-8884,CVE-2014-9529,CVE-2014-9584,CVE-2015-1421
Category:Local
PCI Flagged:1
Vendor References:CESA-2015:0864 centos 6
Bugtraq IDs:67341,70691,72356,71880,71883,70972,70971,74293
Severity: High
Date Discovered: 2020-01-07 14:35:48
Nucleus Notification Rules Triggered: GitHub Rule
Project Name: Ticketing Rules now apply to all vulnerabilities
|
test
|
nucleus project ticketing rules now apply to all vulnerabilities centos security update for kernel cesa source qualys finding description centos has released security update for kernel to fix the vulnerabilities affected products centos impact this vulnerability could be exploited to gain complete access to sensitive information malicious users could also use this vulnerability to change all the contents or configuration on the system target s asset name ip solution to resolve this issue upgrade to the latest packages which contain a patch refer to centos advisory for updates and patch information patch following are links for downloading patches to fix the vulnerabilities references id cve cve cve cve cve cve cve cve cve cve category local pci flagged vendor references cesa centos bugtraq ids severity high date discovered nucleus notification rules triggered github rule project name ticketing rules now apply to all vulnerabilities
| 1
|
7,320
| 2,610,363,186
|
IssuesEvent
|
2015-02-26 19:57:25
|
chrsmith/scribefire-chrome
|
https://api.github.com/repos/chrsmith/scribefire-chrome
|
opened
|
Post disappears after saving and closing safari but before publishing the post
|
auto-migrated Priority-Medium Type-Defect
|
```
What's the problem?
I write my post and save it but I can't find it after closing scribefire or
safari and then re-opening them. Where does it get saved on my computer?
I think I'll go back to using text edit to write my blogs - it's safer.
What browser are you using?
Safari 6
What version of ScribeFire are you running?
The latest one from the Safari extension site
```
-----
Original issue reported on code.google.com by `learncal...@gmail.com` on 17 Aug 2012 at 7:27
|
1.0
|
Post disappears after saving and closing safari but before publishing the post - ```
What's the problem?
I write my post and save it but I can't find it after closing scribefire or
safari and then re-opening them. Where does it get saved on my computer?
I think I'll go back to using text edit to write my blogs - it's safer.
What browser are you using?
Safari 6
What version of ScribeFire are you running?
The latest one from the Safari extension site
```
-----
Original issue reported on code.google.com by `learncal...@gmail.com` on 17 Aug 2012 at 7:27
|
non_test
|
post disappears after saving and closing safari but before publishing the post what s the problem i write my post and save it but i can t find it after closing scribefire or safari and then re opening them where does it get saved on my computer i think i ll go back to using text edit to write my blogs it s safer what browser are you using safari what version of scribefire are you running the latest one from the safari extension site original issue reported on code google com by learncal gmail com on aug at
| 0
|
90,652
| 8,251,688,308
|
IssuesEvent
|
2018-09-12 08:37:01
|
mono/mono
|
https://api.github.com/repos/mono/mono
|
opened
|
[roslyn] PatternMatchingTests.Query_01 fails
|
epic: Roslyn Tests
|
roslyn$ PATH=~/.mono/bin:$PATH ./build/scripts/tests.sh Debug mono "" -method "*.Query_01"
```
Microsoft.CodeAnalysis.CSharp.UnitTests.PatternMatchingTests.Query_01 [FAIL]
Roslyn.Test.Utilities.ExecutionException :
Execution failed for assembly '/tmp/RoslynTests'.
Expected: 1
3
5
2
4
6
7
8
10
9
11
12
Actual: 1
3
2
5
4
6
7
8
10
9
11
12
Stack Trace:
at (wrapper managed-to-native) System.Reflection.MonoMethod.InternalInvoke(System.Reflection.MonoMethod,object,object[],System.Exception&)
at System.Reflection.MonoMethod.Invoke (System.Object obj, System.Reflection.BindingFlags invokeAttr, System.Reflection.Binder binder, System.Object[] parameters, System.Globalization.CultureInfo culture) [0x0003b] in <af30111a119b4c33adf68bea3aff0284>:0
Finished: Microsoft.CodeAnalysis.CSharp.Semantic.UnitTests
```
|
1.0
|
[roslyn] PatternMatchingTests.Query_01 fails - roslyn$ PATH=~/.mono/bin:$PATH ./build/scripts/tests.sh Debug mono "" -method "*.Query_01"
```
Microsoft.CodeAnalysis.CSharp.UnitTests.PatternMatchingTests.Query_01 [FAIL]
Roslyn.Test.Utilities.ExecutionException :
Execution failed for assembly '/tmp/RoslynTests'.
Expected: 1
3
5
2
4
6
7
8
10
9
11
12
Actual: 1
3
2
5
4
6
7
8
10
9
11
12
Stack Trace:
at (wrapper managed-to-native) System.Reflection.MonoMethod.InternalInvoke(System.Reflection.MonoMethod,object,object[],System.Exception&)
at System.Reflection.MonoMethod.Invoke (System.Object obj, System.Reflection.BindingFlags invokeAttr, System.Reflection.Binder binder, System.Object[] parameters, System.Globalization.CultureInfo culture) [0x0003b] in <af30111a119b4c33adf68bea3aff0284>:0
Finished: Microsoft.CodeAnalysis.CSharp.Semantic.UnitTests
```
|
test
|
patternmatchingtests query fails roslyn path mono bin path build scripts tests sh debug mono method query microsoft codeanalysis csharp unittests patternmatchingtests query roslyn test utilities executionexception execution failed for assembly tmp roslyntests expected actual stack trace at wrapper managed to native system reflection monomethod internalinvoke system reflection monomethod object object system exception at system reflection monomethod invoke system object obj system reflection bindingflags invokeattr system reflection binder binder system object parameters system globalization cultureinfo culture in finished microsoft codeanalysis csharp semantic unittests
| 1
|
203,007
| 15,327,332,399
|
IssuesEvent
|
2021-02-26 05:53:10
|
kubesphere/kubesphere
|
https://api.github.com/repos/kubesphere/kubesphere
|
closed
|
Integrate e2e testing in CI process and collect test results
|
area/test kind/feature
|
**What's it about?**
We need to integrate e2e testing into the CI process and collect test results. To launch an e2e test we also need a way to set up a testing kubesphere cluster.
Kind is one of the options to set up the test Kubernetes cluster. Then use ks-installer to install kubesphere.
**Area Suggestion**
/area test
/kind feature
|
1.0
|
Integrate e2e testing in CI process and collect test results - **What's it about?**
We need to integrate e2e testing into the CI process and collect test results. To launch an e2e test we also need a way to set up a testing kubesphere cluster.
Kind is one of the options to set up the test Kubernetes cluster. Then use ks-installer to install kubesphere.
**Area Suggestion**
/area test
/kind feature
|
test
|
integrate testing in ci process and collect test results what s it about we need to integrate testing into the ci process and collect test results to launch an testing we also need a way to set up a testing kubesphere cluster kind is one of the options to setup the test kubernetes cluster then use ks installer to install kubesphere area suggestion area test kind feature
| 1
|
170,751
| 13,201,702,635
|
IssuesEvent
|
2020-08-14 10:41:18
|
Niraj-Kamdar/question-paper-generator
|
https://api.github.com/repos/Niraj-Kamdar/question-paper-generator
|
closed
|
Set IMP flag to question selenium test
|
Test
|
### Describe the test
Add a test to set IMP flag to question using selenium webdriver.
### What it covers
It will cover front-end test for setting IMP flag to the question.
### Testing
Some tests for the above testcase.
| Description | Input | Output |
| :---------- | :---- | :----- |
| Set IMP flag to question of selected course | Click on flag mark and tickmark on IMP | IMP tag on question|
|
1.0
|
Set IMP flag to question selenium test - ### Describe the test
Add a test to set IMP flag to question using selenium webdriver.
### What it covers
It will cover front-end test for setting IMP flag to the question.
### Testing
Some tests for the above testcase.
| Description | Input | Output |
| :---------- | :---- | :----- |
| Set IMP flag to question of selected course | Click on flag mark and tickmark on IMP | IMP tag on question|
|
test
|
set imp flag to question selenium test describe the test add a test to set imp flag to question using selenium webdriver what it covers it will cover front end test for setting imp flag to the question testing some tests for the above testcase description input output set imp flag to question of selected course click on flag mark and tickmark on imp imp tag on question
| 1
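The front-end flow in the record above (click the flag mark, tick IMP, expect an IMP tag on the question) can be sketched without a browser. This is a hedged sketch: the element ids (`flag-mark-…`, `imp-tick-…`, `imp-tag-…`) and the `FakeDriver` stand-in are assumptions, not the project's actual markup or the real selenium webdriver API.

```python
class FakeElement:
    """Stand-in for a selenium WebElement; only records clicks."""
    def __init__(self):
        self.clicked = False

    def click(self):
        self.clicked = True


class FakeDriver:
    """Stand-in for a selenium webdriver; creates elements on demand."""
    def __init__(self):
        self.elements = {}

    def find_element(self, element_id):
        return self.elements.setdefault(element_id, FakeElement())


def set_imp_flag(driver, question_id):
    # Click the flag mark, then tick IMP (element ids are hypothetical).
    driver.find_element(f"flag-mark-{question_id}").click()
    driver.find_element(f"imp-tick-{question_id}").click()
    # The expected output is an IMP tag element on the question.
    return driver.find_element(f"imp-tag-{question_id}")
```

With a real driver, the same function body would use `driver.find_element(By.ID, ...)` and an assertion that the tag is displayed.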
|
814,950
| 30,531,120,180
|
IssuesEvent
|
2023-07-19 14:20:59
|
RobotLocomotion/drake
|
https://api.github.com/repos/RobotLocomotion/drake
|
opened
|
Mesh warnings from franka_description with recent Drake versions
|
type: bug priority: medium component: geometry perception
|
### What happened?
In `drake/manipulation/models/franka_description/urdf` we provide some sample URDFs for the Franka Panda robot.
If one of those models is loaded into a scene that contains render engines (e.g., for camera simulation) then there are a lot of warnings spammed to the console:
For example:
```
[console] [warning] warning: Drake currently only supports OBJs that use a single material across the whole mesh; for drake_models/franka_description/meshes/visual/hand.obj, 5 materials were used: 'Part__Feature001_008_005', 'Part__Feature002_005_005', 'Part__Feature005_001_005', 'Part__Feature005_001_005_001', 'Part__Feature_009_005'. The parsed materials will not be used.
```
It's not OK for Drake models to trigger Drake warnings. We either need to fix the model, or nerf the warning.
For now, our plan is to nerf the warning.
### Version
1.19.0
### What operating system are you using?
Ubuntu 22.04
### What installation option are you using?
compiled from source code using Bazel
### Relevant log output
_No response_
|
1.0
|
Mesh warnings from franka_description with recent Drake versions - ### What happened?
In `drake/manipulation/models/franka_description/urdf` we provide some sample URDFs for the Franka Panda robot.
If one of those models is loaded into a scene that contains render engines (e.g., for camera simulation) then there are a lot of warnings spammed to the console:
For example:
```
[console] [warning] warning: Drake currently only supports OBJs that use a single material across the whole mesh; for drake_models/franka_description/meshes/visual/hand.obj, 5 materials were used: 'Part__Feature001_008_005', 'Part__Feature002_005_005', 'Part__Feature005_001_005', 'Part__Feature005_001_005_001', 'Part__Feature_009_005'. The parsed materials will not be used.
```
It's not OK for Drake models to trigger Drake warnings. We either need to fix the model, or nerf the warning.
For now, our plan is to nerf the warning.
### Version
1.19.0
### What operating system are you using?
Ubuntu 22.04
### What installation option are you using?
compiled from source code using Bazel
### Relevant log output
_No response_
|
non_test
|
mesh warnings from franka description with recent drake versions what happened in drake manipulation models franka description urdf we provide some sample urdfs for the franka panda robot if one of those models is loaded into a scene that contains render engines e g for camera simulation then there are a log of warnings spammed to the console for example warning drake currently only supports objs that use a single material across the whole mesh for drake models franka description meshes visual hand obj materials were used part part part part part feature the parsed materials will not be used it s not ok for drake models to trigger drake warnings we either need to fix the model or nerf the warning for now our plan is to nerf the warning version what operating system are you using ubuntu what installation option are you using compiled from source code using bazel relevant log output no response
| 0
|
279,623
| 24,240,381,422
|
IssuesEvent
|
2022-09-27 05:59:16
|
pytest-dev/pytest
|
https://api.github.com/repos/pytest-dev/pytest
|
closed
|
`unittest` TestCase collection fails silently when there are unclear circular dependencies
|
type: bug topic: collection plugin: unittest status: needs information
|
Hey!
I have a problem with collecting unittest/asynctest based tests.
I'm trying to run TestCase tests from unittest/asynctest and have a problem: when I provide an exact dir, for example
`pytest directory_name`
then collecting works fine, it will collect the correct tests and run them, but when I run pytest without a directory
`pytest`
it collects the same tests but in this case it skips all of them because they have an `__init__` method defined from TestCase.
```
PytestCollectionWarning: cannot collect test class 'TestCase' because it has a __init__ constructor
```
But if I run it directly, by passing
`pytest test_name.py` or `pytest dir_name`
it will run a single test, or in the second case, it will correctly collect tests from the dir and run them.
In every case the same tests are collected, but in one case they are skipped and in the other they run normally.
Also,
https://docs.pytest.org/en/stable/unittest.html
says that
`pytest tests` should run unittest-style tests, but it will only work if they are in the `test` dir, right?
Is it a bug or a missing configuration?
- pytest 6.2.2
- asynctest 0.13.0
- python 3.9.1
- osx Big Sur on m1
|
1.0
|
`unittest` TestCase collection fails silently when there are unclear circular dependencies - Hey!
I have a problem with collecting unittest/asynctest based tests.
I'm trying to run TestCase tests from unittest/asynctest and have a problem: when I provide an exact dir, for example
`pytest directory_name`
then collecting works fine, it will collect the correct tests and run them, but when I run pytest without a directory
`pytest`
it collects the same tests but in this case it skips all of them because they have an `__init__` method defined from TestCase.
```
PytestCollectionWarning: cannot collect test class 'TestCase' because it has a __init__ constructor
```
But if I run it directly, by passing
`pytest test_name.py` or `pytest dir_name`
it will run a single test, or in the second case, it will correctly collect tests from the dir and run them.
In every case the same tests are collected, but in one case they are skipped and in the other they run normally.
Also,
https://docs.pytest.org/en/stable/unittest.html
says that
`pytest tests` should run unittest-style tests, but it will only work if they are in the `test` dir, right?
Is it a bug or a missing configuration?
- pytest 6.2.2
- asynctest 0.13.0
- python 3.9.1
- osx Big Sur on m1
|
test
|
unittest testcase collection fails silently when there are unclear circular dependencies hey i have problem with collecting unittest asynctest based test i m trying to run tests testcase from unittest asynctest and have problem when i provide for example exact dir pytest directory name then collecting works fine it will collect correct test and run it but when i will run pytest without directory pytest it s collecting same tests but in this case it skipps all of them becasue it has init method defined from testcase pytestcollectionwarning cannot collect test class testcase because it has a init constructor but if i run it directly by passing pytest test name py or pytest dir name it will run single test or in second case it will collect correctly test from dir and run it in every case the same test are collected but in one is skipped and in another it run normally also it says that pytest tests should run unittest style test but it will work only if there are in test dir right is it bug or missing configuration pytest asynctest python osx big sur on
| 1
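The collection behavior in the record above can be illustrated with a minimal, runnable approximation. This is a sketch of pytest's documented rule (a test class that defines its own `__init__` constructor is not collected), not pytest's actual internals; `defines_own_init` is a hypothetical helper written for this illustration.

```python
class CollectedTests:
    """No custom constructor: pytest would collect this class."""
    def test_ok(self):
        assert True


class SkippedTests:
    """Custom __init__: pytest would skip the whole class with a
    PytestCollectionWarning, as in the report above."""
    def __init__(self, db_url):
        self.db_url = db_url

    def test_never_runs(self):
        assert True


def defines_own_init(cls):
    # Approximation of the check: does the class itself (not a base
    # class) define __init__?
    return "__init__" in cls.__dict__


print(defines_own_init(CollectedTests))  # False -> collected
print(defines_own_init(SkippedTests))    # True  -> skipped with a warning
```

The practical fix in the report's situation is to move per-test setup from `__init__` into `setUp` (or a pytest fixture), so the class has no custom constructor.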
|
43,489
| 11,236,379,808
|
IssuesEvent
|
2020-01-09 10:20:01
|
pytorch/pytorch
|
https://api.github.com/repos/pytorch/pytorch
|
closed
|
ROCm support to pytorch
|
module: build triaged
|
## ❓ While compiling, the failure type is: The following variables are used in this project, but they are set to NOTFOUND
The compile env is Ubuntu 18.04, ROCm 2.10; the GPU is a Vega 56 8G.
Here is my error log:
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
**ROCM_ROCTX_LIB
linked by target "torch_hip" in directory** /home/ling/build**/pytorch/caffe2**
-- Configuring incomplete, errors occurred!
See also "/home/ling/build/pytorch/build/CMakeFiles/CMakeOutput.log".
See also "/home/ling/build/pytorch/build/CMakeFiles/CMakeError.log".
Traceback (most recent call last):
File "setup.py", line 755, in <module>
build_deps()
File "setup.py", line 316, in build_deps
cmake=cmake)
File "/home/ling/build/pytorch/tools/build_pytorch_libs.py", line 59, in build_caffe2
rerun_cmake)
File "/home/ling/build/pytorch/tools/setup_helpers/cmake.py", line 321, in generate
self.run(args, env=my_env)
File "/home/ling/build/pytorch/tools/setup_helpers/cmake.py", line 141, in run
check_call(command, cwd=self.build_dir, env=env)
File "/home/ling/anaconda3/lib/python3.7/subprocess.py", line 347, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['cmake', '-DBUILD_PYTHON=True', '-DBUILD_TEST=True', '-DCMAKE_BUILD_TYPE=Release', '-DCMAKE_INSTALL_PREFIX=/home/ling/build/pytorch/torch', '-DCMAKE_PREFIX_PATH=/home/ling/anaconda3/lib/python3.7/site-packages', '-DNUMPY_INCLUDE_DIR=/home/ling/anaconda3/lib/python3.7/site-packages/numpy/core/include', '-DPYTHON_EXECUTABLE=/home/ling/anaconda3/bin/python', '-DPYTHON_INCLUDE_DIR=/home/ling/anaconda3/include/python3.7m', '-DPYTHON_LIBRARY=/home/ling/anaconda3/lib/libpython3.7m.so.1.0', '-DTORCH_BUILD_VERSION=1.4.0a0+190dac1', '-DUSE_NUMPY=True', '-DUSE_ROCM=1', '/home/ling/build/pytorch']' returned non-zero exit status 1.
- [Discussion Forum](https://discuss.pytorch.org/)
|
1.0
|
ROCm support to pytorch - ## ❓ While compiling, the failure type is: The following variables are used in this project, but they are set to NOTFOUND
The compile env is Ubuntu 18.04, ROCm 2.10; the GPU is a Vega 56 8G.
Here is my error log:
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
**ROCM_ROCTX_LIB
linked by target "torch_hip" in directory** /home/ling/build**/pytorch/caffe2**
-- Configuring incomplete, errors occurred!
See also "/home/ling/build/pytorch/build/CMakeFiles/CMakeOutput.log".
See also "/home/ling/build/pytorch/build/CMakeFiles/CMakeError.log".
Traceback (most recent call last):
File "setup.py", line 755, in <module>
build_deps()
File "setup.py", line 316, in build_deps
cmake=cmake)
File "/home/ling/build/pytorch/tools/build_pytorch_libs.py", line 59, in build_caffe2
rerun_cmake)
File "/home/ling/build/pytorch/tools/setup_helpers/cmake.py", line 321, in generate
self.run(args, env=my_env)
File "/home/ling/build/pytorch/tools/setup_helpers/cmake.py", line 141, in run
check_call(command, cwd=self.build_dir, env=env)
File "/home/ling/anaconda3/lib/python3.7/subprocess.py", line 347, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['cmake', '-DBUILD_PYTHON=True', '-DBUILD_TEST=True', '-DCMAKE_BUILD_TYPE=Release', '-DCMAKE_INSTALL_PREFIX=/home/ling/build/pytorch/torch', '-DCMAKE_PREFIX_PATH=/home/ling/anaconda3/lib/python3.7/site-packages', '-DNUMPY_INCLUDE_DIR=/home/ling/anaconda3/lib/python3.7/site-packages/numpy/core/include', '-DPYTHON_EXECUTABLE=/home/ling/anaconda3/bin/python', '-DPYTHON_INCLUDE_DIR=/home/ling/anaconda3/include/python3.7m', '-DPYTHON_LIBRARY=/home/ling/anaconda3/lib/libpython3.7m.so.1.0', '-DTORCH_BUILD_VERSION=1.4.0a0+190dac1', '-DUSE_NUMPY=True', '-DUSE_ROCM=1', '/home/ling/build/pytorch']' returned non-zero exit status 1.
- [Discussion Forum](https://discuss.pytorch.org/)
|
non_test
|
rocm support to pytorch ❓ while compile failed tipes is the following variables are used in this project but they are set to notfound compile env is rocm gpu is here is mine error log cmake error the following variables are used in this project but they are set to notfound please set them or make sure they are set and tested correctly in the cmake files rocm roctx lib linked by target torch hip in directory home ling build pytorch configuring incomplete errors occurred see also home ling build pytorch build cmakefiles cmakeoutput log see also home ling build pytorch build cmakefiles cmakeerror log traceback most recent call last file setup py line in build deps file setup py line in build deps cmake cmake file home ling build pytorch tools build pytorch libs py line in build rerun cmake file home ling build pytorch tools setup helpers cmake py line in generate self run args env my env file home ling build pytorch tools setup helpers cmake py line in run check call command cwd self build dir env env file home ling lib subprocess py line in check call raise calledprocesserror retcode cmd subprocess calledprocesserror command returned non zero exit status
| 0
|
155,083
| 12,238,208,619
|
IssuesEvent
|
2020-05-04 19:23:46
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
closed
|
Failing test: X-Pack Jest Tests.x-pack/plugins/apm/public/components/app/Settings/CustomizeUI/CustomLink/CustomLinkFlyout - LinkPreview shows label and url values
|
Team:apm [zube]: In Progress failed-test
|
A test failed on a tracked branch
```
Error: callApmApi has to be initialized before used. Call createCallApmApi first.
at callApmApi (/var/lib/jenkins/workspace/elastic+kibana+7.x/kibana/x-pack/plugins/apm/public/services/rest/createCallApmApi.ts:23:9)
at Object.<anonymous>.fetchTransaction (/var/lib/jenkins/workspace/elastic+kibana+7.x/kibana/x-pack/plugins/apm/public/components/app/Settings/CustomizeUI/CustomLink/CustomLinkFlyout/LinkPreview.tsx:33:31)
at complete (/var/lib/jenkins/workspace/elastic+kibana+7.x/kibana/node_modules/lodash/index.js:7745:25)
at delayed (/var/lib/jenkins/workspace/elastic+kibana+7.x/kibana/node_modules/lodash/index.js:7755:11)
at Timeout.callback [as _onTimeout] (/var/lib/jenkins/workspace/elastic+kibana+7.x/kibana/node_modules/jest-environment-jsdom/node_modules/jsdom/lib/jsdom/browser/Window.js:592:19)
at ontimeout (timers.js:436:11)
at tryOnTimeout (timers.js:300:5)
at listOnTimeout (timers.js:263:5)
at Timer.processTimers (timers.js:223:10)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+7.x/4847/)
<!-- kibanaCiData = {"failed-test":{"test.class":"X-Pack Jest Tests.x-pack/plugins/apm/public/components/app/Settings/CustomizeUI/CustomLink/CustomLinkFlyout","test.name":"LinkPreview shows label and url values","test.failCount":1}} -->
|
1.0
|
Failing test: X-Pack Jest Tests.x-pack/plugins/apm/public/components/app/Settings/CustomizeUI/CustomLink/CustomLinkFlyout - LinkPreview shows label and url values - A test failed on a tracked branch
```
Error: callApmApi has to be initialized before used. Call createCallApmApi first.
at callApmApi (/var/lib/jenkins/workspace/elastic+kibana+7.x/kibana/x-pack/plugins/apm/public/services/rest/createCallApmApi.ts:23:9)
at Object.<anonymous>.fetchTransaction (/var/lib/jenkins/workspace/elastic+kibana+7.x/kibana/x-pack/plugins/apm/public/components/app/Settings/CustomizeUI/CustomLink/CustomLinkFlyout/LinkPreview.tsx:33:31)
at complete (/var/lib/jenkins/workspace/elastic+kibana+7.x/kibana/node_modules/lodash/index.js:7745:25)
at delayed (/var/lib/jenkins/workspace/elastic+kibana+7.x/kibana/node_modules/lodash/index.js:7755:11)
at Timeout.callback [as _onTimeout] (/var/lib/jenkins/workspace/elastic+kibana+7.x/kibana/node_modules/jest-environment-jsdom/node_modules/jsdom/lib/jsdom/browser/Window.js:592:19)
at ontimeout (timers.js:436:11)
at tryOnTimeout (timers.js:300:5)
at listOnTimeout (timers.js:263:5)
at Timer.processTimers (timers.js:223:10)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+7.x/4847/)
<!-- kibanaCiData = {"failed-test":{"test.class":"X-Pack Jest Tests.x-pack/plugins/apm/public/components/app/Settings/CustomizeUI/CustomLink/CustomLinkFlyout","test.name":"LinkPreview shows label and url values","test.failCount":1}} -->
|
test
|
failing test x pack jest tests x pack plugins apm public components app settings customizeui customlink customlinkflyout linkpreview shows label and url values a test failed on a tracked branch error callapmapi has to be initialized before used call createcallapmapi first at callapmapi var lib jenkins workspace elastic kibana x kibana x pack plugins apm public services rest createcallapmapi ts at object fetchtransaction var lib jenkins workspace elastic kibana x kibana x pack plugins apm public components app settings customizeui customlink customlinkflyout linkpreview tsx at complete var lib jenkins workspace elastic kibana x kibana node modules lodash index js at delayed var lib jenkins workspace elastic kibana x kibana node modules lodash index js at timeout callback var lib jenkins workspace elastic kibana x kibana node modules jest environment jsdom node modules jsdom lib jsdom browser window js at ontimeout timers js at tryontimeout timers js at listontimeout timers js at timer processtimers timers js first failure
| 1
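The `callApmApi has to be initialized before used. Call createCallApmApi first.` failure in the record above is an instance of an init-before-use guard. A minimal Python analog of that pattern (the original is TypeScript in `createCallApmApi.ts`; the function names here just mirror it, and the implementation is a sketch, not the plugin's actual code):

```python
# Module-level slot that starts uninitialized, mirroring the failing guard.
_call_api = None


def create_call_apm_api(impl):
    """Install the real implementation; must run before call_apm_api."""
    global _call_api
    _call_api = impl


def call_apm_api(*args, **kwargs):
    if _call_api is None:
        # Same failure mode as the Jest test: the guard fires because the
        # test rendered the component without running the create step first.
        raise RuntimeError(
            "callApmApi has to be initialized before used. "
            "Call createCallApmApi first."
        )
    return _call_api(*args, **kwargs)
```

In a unit test, the analog of the fix is to run the create step (or install a mock implementation) in the test's setup before rendering anything that calls the API.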
|
55,107
| 11,386,580,338
|
IssuesEvent
|
2020-01-29 13:33:46
|
googleapis/google-cloud-python
|
https://api.github.com/repos/googleapis/google-cloud-python
|
closed
|
Synthesis failed for language
|
api: language autosynth failure codegen type: process
|
Hello! Autosynth couldn't regenerate language. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-language'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/language/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:10a6d0342b8d62544810ac5ad86c3b21049ec0696608ac60175da8e513234344
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/language/artman_language_v1beta2.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1beta2.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1beta2/language_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1beta2/google/cloud/language_v1beta2/proto/language_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1beta2/google/cloud/language_v1beta2/proto.
synthtool > No files in sources /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1beta2/samples were copied. Does the source contain files?
synthtool > Running generator for google/cloud/language/artman_language_v1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1/language_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/google/cloud/language_v1/proto/language_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/google/cloud/language_v1/proto.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1/samples/test/analyzing_sentiment.test.yaml to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/samples/v1/test/analyzing_sentiment.test.yaml
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1/samples/test/analyzing_entity_sentiment.test.yaml to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/samples/v1/test/analyzing_entity_sentiment.test.yaml
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1/samples/test/analyzing_syntax.test.yaml to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/samples/v1/test/analyzing_syntax.test.yaml
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1/samples/test/analyzing_entities.test.yaml to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/samples/v1/test/analyzing_entities.test.yaml
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1/samples/test/classifying_content.test.yaml to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/samples/v1/test/classifying_content.test.yaml
synthtool > Writing samples manifest ['gen-manifest', '--env=python', '--bin=python3', '--output=v1/test/samples.manifest.yaml', '--chdir={@manifest_dir}/../..', 'v1/language_entity_sentiment_gcs.py', 'v1/language_classify_gcs.py', 'v1/language_syntax_gcs.py', 'v1/language_entities_text.py', 'v1/language_classify_text.py', 'v1/language_syntax_text.py', 'v1/language_entity_sentiment_text.py', 'v1/language_entities_gcs.py', 'v1/language_sentiment_gcs.py', 'v1/language_sentiment_text.py']
synthtool > No files in sources /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/tests/system/gapic/v1 were copied. Does the source contain files?
.coveragerc
.flake8
MANIFEST.in
docs/_static/custom.css
docs/_templates/layout.html
noxfile.py.j2
setup.cfg
synthtool > Replaced 'types\\.EncodingType' in google/cloud/language_v1beta2/gapic/language_service_client.py.
synthtool > Replaced 'types\\.EncodingType' in google/cloud/language_v1/gapic/language_service_client.py.
Running session blacken
Creating virtual environment (virtualenv) using python3.6 in .nox/blacken
pip install black==19.3b0
black docs google tests noxfile.py setup.py samples
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1/gapic/language_service_client_config.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1/gapic/transports/language_service_grpc_transport.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1/gapic/enums.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1/proto/language_service_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1/gapic/language_service_client.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1beta2/gapic/language_service_client_config.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1beta2/gapic/transports/language_service_grpc_transport.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1beta2/gapic/enums.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1beta2/gapic/language_service_client.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1beta2/proto/language_service_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_classify_gcs.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_classify_text.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_entities_gcs.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_entities_text.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_entity_sentiment_gcs.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_entity_sentiment_text.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_sentiment_gcs.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_sentiment_text.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_syntax_gcs.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_syntax_text.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/tests/system/gapic/v1beta2/test_system_language_service_v1beta2.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/tests/unit/gapic/v1/test_language_service_client_v1.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/tests/unit/gapic/v1beta2/test_language_service_client_v1beta2.py
error: cannot format /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1/proto/language_service_pb2.py: Cannot parse: 2249:11: '__doc__': """################################################################
error: cannot format /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1beta2/proto/language_service_pb2.py: Cannot parse: 2250:11: '__doc__': """################################################################
All done! 💥 💔 💥
23 files reformatted, 19 files left unchanged, 2 files failed to reformat.
Command black docs google tests noxfile.py setup.py samples failed with exit code 123
Session blacken failed.
synthtool > Failed executing nox -s blacken:
None
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 99, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 91, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/language/synth.py", line 56, in <module>
s.shell.run(["nox", "-s", "blacken"], hide_output=False)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['nox', '-s', 'blacken']' returned non-zero exit status 1.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/7466e354-d120-4bf0-a156-1b9913bcb1fc).
|
1.0
|
Synthesis failed for language - Hello! Autosynth couldn't regenerate language. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-language'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/language/synth.py.
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:10a6d0342b8d62544810ac5ad86c3b21049ec0696608ac60175da8e513234344
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/language/artman_language_v1beta2.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1beta2.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1beta2/language_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1beta2/google/cloud/language_v1beta2/proto/language_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1beta2/google/cloud/language_v1beta2/proto.
synthtool > No files in sources /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1beta2/samples were copied. Does the source contain files?
synthtool > Running generator for google/cloud/language/artman_language_v1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1/language_service.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/google/cloud/language_v1/proto/language_service.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/google/cloud/language_v1/proto.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1/samples/test/analyzing_sentiment.test.yaml to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/samples/v1/test/analyzing_sentiment.test.yaml
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1/samples/test/analyzing_entity_sentiment.test.yaml to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/samples/v1/test/analyzing_entity_sentiment.test.yaml
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1/samples/test/analyzing_syntax.test.yaml to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/samples/v1/test/analyzing_syntax.test.yaml
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1/samples/test/analyzing_entities.test.yaml to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/samples/v1/test/analyzing_entities.test.yaml
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/language/v1/samples/test/classifying_content.test.yaml to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/samples/v1/test/classifying_content.test.yaml
synthtool > Writing samples manifest ['gen-manifest', '--env=python', '--bin=python3', '--output=v1/test/samples.manifest.yaml', '--chdir={@manifest_dir}/../..', 'v1/language_entity_sentiment_gcs.py', 'v1/language_classify_gcs.py', 'v1/language_syntax_gcs.py', 'v1/language_entities_text.py', 'v1/language_classify_text.py', 'v1/language_syntax_text.py', 'v1/language_entity_sentiment_text.py', 'v1/language_entities_gcs.py', 'v1/language_sentiment_gcs.py', 'v1/language_sentiment_text.py']
synthtool > No files in sources /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/language-v1/tests/system/gapic/v1 were copied. Does the source contain files?
.coveragerc
.flake8
MANIFEST.in
docs/_static/custom.css
docs/_templates/layout.html
noxfile.py.j2
setup.cfg
synthtool > Replaced 'types\\.EncodingType' in google/cloud/language_v1beta2/gapic/language_service_client.py.
synthtool > Replaced 'types\\.EncodingType' in google/cloud/language_v1/gapic/language_service_client.py.
Running session blacken
Creating virtual environment (virtualenv) using python3.6 in .nox/blacken
pip install black==19.3b0
black docs google tests noxfile.py setup.py samples
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1/gapic/language_service_client_config.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1/gapic/transports/language_service_grpc_transport.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1/gapic/enums.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1/proto/language_service_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1/gapic/language_service_client.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1beta2/gapic/language_service_client_config.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1beta2/gapic/transports/language_service_grpc_transport.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1beta2/gapic/enums.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1beta2/gapic/language_service_client.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1beta2/proto/language_service_pb2_grpc.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_classify_gcs.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_classify_text.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_entities_gcs.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_entities_text.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_entity_sentiment_gcs.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_entity_sentiment_text.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_sentiment_gcs.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_sentiment_text.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_syntax_gcs.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/samples/v1/language_syntax_text.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/tests/system/gapic/v1beta2/test_system_language_service_v1beta2.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/tests/unit/gapic/v1/test_language_service_client_v1.py
reformatted /tmpfs/src/git/autosynth/working_repo/language/tests/unit/gapic/v1beta2/test_language_service_client_v1beta2.py
error: cannot format /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1/proto/language_service_pb2.py: Cannot parse: 2249:11: '__doc__': """################################################################
error: cannot format /tmpfs/src/git/autosynth/working_repo/language/google/cloud/language_v1beta2/proto/language_service_pb2.py: Cannot parse: 2250:11: '__doc__': """################################################################
All done! 💥 💔 💥
23 files reformatted, 19 files left unchanged, 2 files failed to reformat.
Command black docs google tests noxfile.py setup.py samples failed with exit code 123
Session blacken failed.
synthtool > Failed executing nox -s blacken:
None
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 99, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 91, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/language/synth.py", line 56, in <module>
s.shell.run(["nox", "-s", "blacken"], hide_output=False)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['nox', '-s', 'blacken']' returned non-zero exit status 1.
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/7466e354-d120-4bf0-a156-1b9913bcb1fc).
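The traceback in this record bottoms out in `subprocess.CalledProcessError`, which `subprocess.run` raises when called with `check=True` and the child exits non-zero — here, `nox -s blacken` exiting with status 1. A minimal sketch of that failure mode (the `false` command below is illustrative, standing in for the failing nox invocation; it is not what synthtool runs):

```python
import subprocess

def run_checked(cmd):
    """Run a command, raising CalledProcessError on a non-zero exit --
    the same failure mode synthtool's shell.run surfaces in the log above."""
    return subprocess.run(cmd, check=True, capture_output=True, encoding="utf-8")

try:
    # `false` always exits with status 1, standing in for `nox -s blacken`.
    run_checked(["false"])
except subprocess.CalledProcessError as exc:
    # exc.cmd and exc.returncode carry the command and exit status,
    # which is exactly what the "returned non-zero exit status 1" line reports.
    print(f"Command {exc.cmd} returned non-zero exit status {exc.returncode}.")
```

The underlying fix for this record is different — black cannot parse the generated `language_service_pb2.py` files, so they would need to be excluded from formatting — but the sketch shows why the error surfaces as `CalledProcessError` rather than a black-specific exception.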
|
non_test
|
synthesis failed for language hello autosynth couldn t regenerate language broken heart here s the output from running synth py cloning into working repo switched to branch autosynth language running synthtool synthtool executing tmpfs src git autosynth working repo language synth py synthtool ensuring dependencies synthtool pulling artman image latest pulling from googleapis artman digest status image is up to date for googleapis artman latest synthtool cloning googleapis synthtool running generator for google cloud language artman language yaml synthtool generated code into home kbuilder cache synthtool googleapis artman genfiles python language synthtool copy home kbuilder cache synthtool googleapis google cloud language language service proto to home kbuilder cache synthtool googleapis artman genfiles python language google cloud language proto language service proto synthtool placed proto files into home kbuilder cache synthtool googleapis artman genfiles python language google cloud language proto synthtool no files in sources home kbuilder cache synthtool googleapis artman genfiles python language samples were copied does the source contain files synthtool running generator for google cloud language artman language yaml synthtool generated code into home kbuilder cache synthtool googleapis artman genfiles python language synthtool copy home kbuilder cache synthtool googleapis google cloud language language service proto to home kbuilder cache synthtool googleapis artman genfiles python language google cloud language proto language service proto synthtool placed proto files into home kbuilder cache synthtool googleapis artman genfiles python language google cloud language proto synthtool copy home kbuilder cache synthtool googleapis google cloud language samples test analyzing sentiment test yaml to home kbuilder cache synthtool googleapis artman genfiles python language samples test analyzing sentiment test yaml synthtool copy home kbuilder cache synthtool 
googleapis google cloud language samples test analyzing entity sentiment test yaml to home kbuilder cache synthtool googleapis artman genfiles python language samples test analyzing entity sentiment test yaml synthtool copy home kbuilder cache synthtool googleapis google cloud language samples test analyzing syntax test yaml to home kbuilder cache synthtool googleapis artman genfiles python language samples test analyzing syntax test yaml synthtool copy home kbuilder cache synthtool googleapis google cloud language samples test analyzing entities test yaml to home kbuilder cache synthtool googleapis artman genfiles python language samples test analyzing entities test yaml synthtool copy home kbuilder cache synthtool googleapis google cloud language samples test classifying content test yaml to home kbuilder cache synthtool googleapis artman genfiles python language samples test classifying content test yaml synthtool writing samples manifest synthtool no files in sources home kbuilder cache synthtool googleapis artman genfiles python language tests system gapic were copied does the source contain files coveragerc manifest in docs static custom css docs templates layout html noxfile py setup cfg synthtool replaced types encodingtype in google cloud language gapic language service client py synthtool replaced types encodingtype in google cloud language gapic language service client py running session blacken creating virtual environment virtualenv using in nox blacken pip install black black docs google tests noxfile py setup py samples reformatted tmpfs src git autosynth working repo language google cloud language gapic language service client config py reformatted tmpfs src git autosynth working repo language google cloud language gapic transports language service grpc transport py reformatted tmpfs src git autosynth working repo language google cloud language gapic enums py reformatted tmpfs src git autosynth working repo language google cloud language proto 
language service grpc py reformatted tmpfs src git autosynth working repo language google cloud language gapic language service client py reformatted tmpfs src git autosynth working repo language google cloud language gapic language service client config py reformatted tmpfs src git autosynth working repo language google cloud language gapic transports language service grpc transport py reformatted tmpfs src git autosynth working repo language google cloud language gapic enums py reformatted tmpfs src git autosynth working repo language google cloud language gapic language service client py reformatted tmpfs src git autosynth working repo language google cloud language proto language service grpc py reformatted tmpfs src git autosynth working repo language samples language classify gcs py reformatted tmpfs src git autosynth working repo language samples language classify text py reformatted tmpfs src git autosynth working repo language samples language entities gcs py reformatted tmpfs src git autosynth working repo language samples language entities text py reformatted tmpfs src git autosynth working repo language samples language entity sentiment gcs py reformatted tmpfs src git autosynth working repo language samples language entity sentiment text py reformatted tmpfs src git autosynth working repo language samples language sentiment gcs py reformatted tmpfs src git autosynth working repo language samples language sentiment text py reformatted tmpfs src git autosynth working repo language samples language syntax gcs py reformatted tmpfs src git autosynth working repo language samples language syntax text py reformatted tmpfs src git autosynth working repo language tests system gapic test system language service py reformatted tmpfs src git autosynth working repo language tests unit gapic test language service client py reformatted tmpfs src git autosynth working repo language tests unit gapic test language service client py error cannot format tmpfs src git 
autosynth working repo language google cloud language proto language service py cannot parse doc error cannot format tmpfs src git autosynth working repo language google cloud language proto language service py cannot parse doc all done 💥 💔 💥 files reformatted files left unchanged files failed to reformat command black docs google tests noxfile py setup py samples failed with exit code session blacken failed synthtool failed executing nox s blacken none synthtool wrote metadata to synth metadata traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src git autosynth env lib site packages synthtool main py line in main file tmpfs src git autosynth env lib site packages click core py line in call return self main args kwargs file tmpfs src git autosynth env lib site packages click core py line in main rv self invoke ctx file tmpfs src git autosynth env lib site packages click core py line in invoke return ctx invoke self callback ctx params file tmpfs src git autosynth env lib site packages click core py line in invoke return callback args kwargs file tmpfs src git autosynth env lib site packages synthtool main py line in main spec loader exec module synth module type ignore file line in exec module file line in call with frames removed file tmpfs src git autosynth working repo language synth py line in s shell run hide output false file tmpfs src git autosynth env lib site packages synthtool shell py line in run raise exc file tmpfs src git autosynth env lib site packages synthtool shell py line in run encoding utf file home kbuilder pyenv versions lib subprocess py line in run output stdout stderr stderr subprocess calledprocesserror command returned non zero exit status synthesis failed google internal developers can see the full log
| 0
|
87,410
| 8,073,863,221
|
IssuesEvent
|
2018-08-06 20:46:28
|
elastic/elasticsearch
|
https://api.github.com/repos/elastic/elasticsearch
|
closed
|
[CI] o.e.i.c.IngestRestartIT.testScriptDisabled fails with EsRejectedExecutionException
|
:Core/Ingest >test-failure v6.5.0 v7.0.0
|
`o.e.i.c.IngestRestartIT.testScriptDisabled` failed as follows, apparently due to trying to use a threadpool after it had terminated. Full log at https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+master+intake/2443/console
```
1> [2018-08-01T05:50:12,899][INFO ][o.e.i.c.IngestRestartIT ] [testScriptDisabled] [IngestRestartIT#testScriptDisabled]: cleaned up after test
1> [2018-08-01T05:50:12,899][INFO ][o.e.i.c.IngestRestartIT ] [testScriptDisabled] after test
ERROR 4.05s | IngestRestartIT.testScriptDisabled <<< FAILURES!
> Throwable #1: com.carrotsearch.randomizedtesting.UncaughtExceptionError: Captured an uncaught exception in thread: Thread[id=66, name=elasticsearch[node_t0][generic][T#2], state=RUNNABLE, group=TGRP-IngestRestartIT]
> at __randomizedtesting.SeedInfo.seed([F226344894683122:B5BAEEDA56780614]:0)
> Caused by: org.elasticsearch.common.util.concurrent.EsRejectedExecutionException: rejected execution of java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@60e56eae on java.util.concurrent.ScheduledThreadPoolExecutor@2a8207d9[Terminated, pool size = 1, active threads = 0, queued tasks = 0, completed tasks = 2]
> at __randomizedtesting.SeedInfo.seed([F226344894683122]:0)
> at org.elasticsearch.common.util.concurrent.EsAbortPolicy.rejectedExecution(EsAbortPolicy.java:48)
> at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
> at java.util.concurrent.ScheduledThreadPoolExecutor.delayedExecute(ScheduledThreadPoolExecutor.java:326)
> at java.util.concurrent.ScheduledThreadPoolExecutor.schedule(ScheduledThreadPoolExecutor.java:533)
> at org.elasticsearch.threadpool.ThreadPool.schedule(ThreadPool.java:346)
> at org.elasticsearch.ingest.IngestService.lambda$new$0(IngestService.java:49)
> at org.elasticsearch.grok.ThreadWatchdog$Default.interruptLongRunningExecutions(ThreadWatchdog.java:143)
> at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:624)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
```
I see a similar failure on 13 June but nothing else:
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+6.x+periodic/2179/console
The REPRODUCE WITH line does not reproduce, but is here for posterity:
```./gradlew :modules:ingest-common:integTestRunner -Dtests.seed=F226344894683122 -Dtests.class=org.elasticsearch.ingest.common.IngestRestartIT -Dtests.method="testScriptDisabled" -Dtests.security.manager=true -Dtests.locale=ar-SY -Dtests.timezone=Africa/Banjul```
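The stack trace above shows a task being handed to a `ScheduledThreadPoolExecutor` that is already in state `Terminated`, so the abort policy rejects it. The same submit-after-shutdown pattern is easy to reproduce in Python (a conceptual sketch only — the actual fix belongs in Elasticsearch's `IngestService`/`ThreadWatchdog` shutdown ordering, not here):

```python
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=1)
executor.submit(print, "work while the pool is alive")  # accepted
executor.shutdown(wait=True)  # pool is now terminated, like the pool in the log

try:
    # Analogous to ThreadPool.schedule() hitting the Terminated pool above:
    executor.submit(print, "work after shutdown")
except RuntimeError as exc:
    # Python signals the rejection with RuntimeError instead of
    # EsRejectedExecutionException, but the race is the same shape.
    print(f"rejected: {exc}")
```

The test flakiness comes from exactly this race: a background watchdog fires after the node's pool has been torn down between tests.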
|
1.0
|
[CI] o.e.i.c.IngestRestartIT.testScriptDisabled fails with EsRejectedExecutionException - `o.e.i.c.IngestRestartIT.testScriptDisabled` failed as follows, apparently due to trying to use a threadpool after it had terminated. Full log at https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+master+intake/2443/console
```
1> [2018-08-01T05:50:12,899][INFO ][o.e.i.c.IngestRestartIT ] [testScriptDisabled] [IngestRestartIT#testScriptDisabled]: cleaned up after test
1> [2018-08-01T05:50:12,899][INFO ][o.e.i.c.IngestRestartIT ] [testScriptDisabled] after test
ERROR 4.05s | IngestRestartIT.testScriptDisabled <<< FAILURES!
> Throwable #1: com.carrotsearch.randomizedtesting.UncaughtExceptionError: Captured an uncaught exception in thread: Thread[id=66, name=elasticsearch[node_t0][generic][T#2], state=RUNNABLE, group=TGRP-IngestRestartIT]
> at __randomizedtesting.SeedInfo.seed([F226344894683122:B5BAEEDA56780614]:0)
> Caused by: org.elasticsearch.common.util.concurrent.EsRejectedExecutionException: rejected execution of java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@60e56eae on java.util.concurrent.ScheduledThreadPoolExecutor@2a8207d9[Terminated, pool size = 1, active threads = 0, queued tasks = 0, completed tasks = 2]
> at __randomizedtesting.SeedInfo.seed([F226344894683122]:0)
> at org.elasticsearch.common.util.concurrent.EsAbortPolicy.rejectedExecution(EsAbortPolicy.java:48)
> at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
> at java.util.concurrent.ScheduledThreadPoolExecutor.delayedExecute(ScheduledThreadPoolExecutor.java:326)
> at java.util.concurrent.ScheduledThreadPoolExecutor.schedule(ScheduledThreadPoolExecutor.java:533)
> at org.elasticsearch.threadpool.ThreadPool.schedule(ThreadPool.java:346)
> at org.elasticsearch.ingest.IngestService.lambda$new$0(IngestService.java:49)
> at org.elasticsearch.grok.ThreadWatchdog$Default.interruptLongRunningExecutions(ThreadWatchdog.java:143)
> at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:624)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
```
I see a similar failure on 13 June but nothing else:
https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+6.x+periodic/2179/console
The REPRODUCE WITH line does not reproduce, but is here for posterity:
```./gradlew :modules:ingest-common:integTestRunner -Dtests.seed=F226344894683122 -Dtests.class=org.elasticsearch.ingest.common.IngestRestartIT -Dtests.method="testScriptDisabled" -Dtests.security.manager=true -Dtests.locale=ar-SY -Dtests.timezone=Africa/Banjul```
|
test
|
o e i c ingestrestartit testscriptdisabled fails with esrejectedexecutionexception o e i c ingestrestartit testscriptdisabled failed as follows apparently due to trying to use a threadpool after it had terminated full log at cleaned up after test after test error ingestrestartit testscriptdisabled failures throwable com carrotsearch randomizedtesting uncaughtexceptionerror captured an uncaught exception in thread thread state runnable group tgrp ingestrestartit at randomizedtesting seedinfo seed caused by org elasticsearch common util concurrent esrejectedexecutionexception rejected execution of java util concurrent scheduledthreadpoolexecutor scheduledfuturetask on java util concurrent scheduledthreadpoolexecutor at randomizedtesting seedinfo seed at org elasticsearch common util concurrent esabortpolicy rejectedexecution esabortpolicy java at java util concurrent threadpoolexecutor reject threadpoolexecutor java at java util concurrent scheduledthreadpoolexecutor delayedexecute scheduledthreadpoolexecutor java at java util concurrent scheduledthreadpoolexecutor schedule scheduledthreadpoolexecutor java at org elasticsearch threadpool threadpool schedule threadpool java at org elasticsearch ingest ingestservice lambda new ingestservice java at org elasticsearch grok threadwatchdog default interruptlongrunningexecutions threadwatchdog java at org elasticsearch common util concurrent threadcontext contextpreservingrunnable run threadcontext java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java i see a similar failure on june but nothing else the reproduce with line does not reproduce but is here for posterity gradlew modules ingest common integtestrunner dtests seed dtests class org elasticsearch ingest common ingestrestartit dtests method testscriptdisabled dtests security manager true dtests locale ar sy dtests timezone 
africa banjul
| 1
|
41,647
| 6,924,892,236
|
IssuesEvent
|
2017-11-30 14:21:49
|
jan-molak/serenity-js
|
https://api.github.com/repos/jan-molak/serenity-js
|
closed
|
Unable to add ability to the actor.
|
documentation question
|
Hi, I'm trying to set the ability to interact with a database to an actor, but I'm getting the following error:
**"I don't have the ability to InteractWithDatabase, said Brayan sadly."**
This is my Ability:
```
import { Ability, UsesAbilities } from "@serenity-js/core/lib/screenplay";
const sql = require('mssql')
const config = {
///config info here
};
export class InteractWithDatabase implements Ability {
static using() {
return new InteractWithDatabase();
}
static as(actor: UsesAbilities): InteractWithDatabase {
debugger
return actor.abilityTo(InteractWithDatabase);
}
constructor() {
}
execute(query: string): PromiseLike<any> {
return sql.connect(config).then(() => {
return sql.query`${query}`
}).then(result => {
console.dir(result)
}).catch(err => {
// ... error checks
})
}
}
```
This is the interaction:
```
import { Interaction, UsesAbilities } from "@serenity-js/core/lib/screenplay";
import { InteractWithDatabase } from "./InteractWithDatabase";
export class ExecuteQuery implements Interaction {
static using(query: string) {
return new ExecuteQuery(query);
}
performAs(actor: UsesAbilities): PromiseLike<any> {
return InteractWithDatabase.as(actor).execute(this.query);
}
constructor(private query: string) {
}
}
```
This is my task:
```
export class GetUserData implements Task {
static fromDatabase() {
return new GetUserData();
}
performAs(actor: PerformsTasks): PromiseLike<void> {
return actor.attemptsTo(
ExecuteQuery.using('query here'),
);
}
}
```
I tried adding the ability in the actors class:
```
export class Actors implements Cast {
abilities = [BrowseTheWeb.using(protractor.browser), InteractWithDatabase.using()];
actor(name: string): Actor {
debugger
return Actor.named(name).whoCan(this.abilities)
}
}
```
but doesn't work. :(
I hope someone can help me with this.
Thanks
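A likely cause (an assumption — not confirmed in this record): `whoCan` in Serenity/JS is variadic, so calling `whoCan(this.abilities)` registers the array itself as a single "ability" instead of registering each element, and the later `abilityTo(InteractWithDatabase)` lookup fails with the "said Brayan sadly" error. Spreading the array — `whoCan(...this.abilities)` — would register each ability under its own type. The registry pattern behind `abilityTo` can be sketched in Python (class and method names here are illustrative, not the Serenity/JS API):

```python
class Actor:
    def __init__(self, name):
        self.name = name
        self._abilities = {}  # keyed by the ability's class

    def who_can(self, *abilities):  # variadic, like whoCan(...) in Serenity/JS
        for ability in abilities:
            self._abilities[type(ability)] = ability
        return self

    def ability_to(self, ability_type):
        if ability_type not in self._abilities:
            raise LookupError(
                f"I don't have the ability to {ability_type.__name__}, "
                f"said {self.name} sadly."
            )
        return self._abilities[ability_type]

class InteractWithDatabase:
    pass

# Passing the list as ONE argument keys the registry by `list`, so the
# lookup by InteractWithDatabase fails -- the error in the report above.
broken = Actor("Brayan").who_can([InteractWithDatabase()])
# Spreading the list registers each ability under its own class.
fixed = Actor("Brayan").who_can(*[InteractWithDatabase()])
assert isinstance(fixed.ability_to(InteractWithDatabase), InteractWithDatabase)
```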
|
1.0
|
Unable to add ability to the actor. - Hi, I'm trying to set the ability to interact with a database to an actor, but I'm getting the following error:
**"I don't have the ability to InteractWithDatabase, said Brayan sadly."**
This is my Ability:
```
import { Ability, UsesAbilities } from "@serenity-js/core/lib/screenplay";
const sql = require('mssql')
const config = {
///config info here
};
export class InteractWithDatabase implements Ability {
static using() {
return new InteractWithDatabase();
}
static as(actor: UsesAbilities): InteractWithDatabase {
debugger
return actor.abilityTo(InteractWithDatabase);
}
constructor() {
}
execute(query: string): PromiseLike<any> {
return sql.connect(config).then(() => {
return sql.query`${query}`
}).then(result => {
console.dir(result)
}).catch(err => {
// ... error checks
})
}
}
```
This is the interaction:
```
import { Interaction, UsesAbilities } from "@serenity-js/core/lib/screenplay";
import { InteractWithDatabase } from "./InteractWithDatabase";
export class ExecuteQuery implements Interaction {
static using(query: string) {
return new ExecuteQuery(query);
}
performAs(actor: UsesAbilities): PromiseLike<any> {
return InteractWithDatabase.as(actor).execute(this.query);
}
constructor(private query: string) {
}
}
```
This is my task:
```
export class GetUserData implements Task {
static fromDatabase() {
return new GetUserData();
}
performAs(actor: PerformsTasks): PromiseLike<void> {
return actor.attemptsTo(
ExecuteQuery.using('query here'),
);
}
}
```
I tried adding the ability in the actors class:
```
export class Actors implements Cast {
abilities = [BrowseTheWeb.using(protractor.browser), InteractWithDatabase.using()];
actor(name: string): Actor {
debugger
return Actor.named(name).whoCan(this.abilities)
}
}
```
but doesn't work. :(
I hope someone can help me with this.
Thanks
|
non_test
|
unable to add ability to the actor hi i m trying to set the ability to interact with a database to an actor but i m getting the following error i don t have the ability to interactwithdatabase said brayan sadly this is my ability import ability usesabilities from serenity js core lib screenplay const sql require mssql const config config info here export class interactwithdatabase implements ability static using return new interactwithdatabase static as actor usesabilities interactwithdatabase debugger return actor abilityto interactwithdatabase constructor execute query string promiselike return sql connect config then return sql query query then result console dir result catch err error checks this is the interaction import interaction usesabilities from serenity js core lib screenplay import interactwithdatabase from interactwithdatabase export class executequery implements interaction static using query string return new executequery query performas actor usesabilities promiselike return interactwithdatabase as actor execute this query constructor private query string this is my task export class getuserdata implements task static fromdatabase return new getuserdata performas actor performstasks promiselike return actor attemptsto executequery using query here i tried adding the ability in the actors class export class actors implements cast abilities actor name string actor debugger return actor named name whocan this abilities but doesn t work i hope someone can help me with this thanks
| 0
|
102,631
| 8,851,100,963
|
IssuesEvent
|
2019-01-08 14:58:33
|
italia/spid
|
https://api.github.com/repos/italia/spid
|
closed
|
Controllo metadata - Comune di Casale Monferrato
|
metadata nuovo md test
|
Good morning,
on behalf of the Comune di Casale Monferrato, we request verification of the metadata published at the URL:
https://sportellodigitale.comune.casale-monferrato.al.it/006039/spid/metadata
Thank you and best regards
Federico Albesano
|
1.0
|
Controllo metadata - Comune di Casale Monferrato - Good morning,
on behalf of the Comune di Casale Monferrato, we request verification of the metadata published at the URL:
https://sportellodigitale.comune.casale-monferrato.al.it/006039/spid/metadata
Thank you and best regards
Federico Albesano
|
test
|
controllo metadata comune di casale monferrato buongiorno per conto del comune di casale monferrato richiediamo la verifica dei metadata pubblicati all url grazie e cordiali saluti federico albesano
| 1
|
82,224
| 15,646,528,036
|
IssuesEvent
|
2021-03-23 01:08:09
|
jgeraigery/linux
|
https://api.github.com/repos/jgeraigery/linux
|
opened
|
CVE-2019-19074 (High) detected in linuxv5.2
|
security vulnerability
|
## CVE-2019-19074 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.2</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (0)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A memory leak in the ath9k_wmi_cmd() function in drivers/net/wireless/ath/ath9k/wmi.c in the Linux kernel through 5.3.11 allows attackers to cause a denial of service (memory consumption), aka CID-728c1e2a05e4.
<p>Publish Date: 2019-11-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-19074>CVE-2019-19074</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19074">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19074</a></p>
<p>Release Date: 2019-11-18</p>
<p>Fix Resolution: v5.4-rc1</p>
</p>
</details>
<p></p>
|
True
|
CVE-2019-19074 (High) detected in linuxv5.2 - ## CVE-2019-19074 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.2</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (0)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A memory leak in the ath9k_wmi_cmd() function in drivers/net/wireless/ath/ath9k/wmi.c in the Linux kernel through 5.3.11 allows attackers to cause a denial of service (memory consumption), aka CID-728c1e2a05e4.
<p>Publish Date: 2019-11-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-19074>CVE-2019-19074</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19074">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19074</a></p>
<p>Release Date: 2019-11-18</p>
<p>Fix Resolution: v5.4-rc1</p>
</p>
</details>
<p></p>
|
non_test
|
cve high detected in cve high severity vulnerability vulnerable library linux kernel source tree library home page a href vulnerable source files vulnerability details a memory leak in the wmi cmd function in drivers net wireless ath wmi c in the linux kernel through allows attackers to cause a denial of service memory consumption aka cid publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
| 0
|
489,354
| 14,105,033,520
|
IssuesEvent
|
2020-11-06 12:52:21
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.google.com - design is broken
|
browser-firefox engine-gecko priority-critical
|
<!-- @browser: Firefox 83.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:83.0) Gecko/20100101 Firefox/83.0 -->
<!-- @reported_with: unknown -->
**URL**: https://www.google.com/
**Browser / Version**: Firefox 83.0
**Operating System**: Windows 7
**Tested Another Browser**: Yes Safari
**Problem type**: Design is broken
**Description**: Items are misaligned
**Steps to Reproduce**:
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.google.com - design is broken - <!-- @browser: Firefox 83.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:83.0) Gecko/20100101 Firefox/83.0 -->
<!-- @reported_with: unknown -->
**URL**: https://www.google.com/
**Browser / Version**: Firefox 83.0
**Operating System**: Windows 7
**Tested Another Browser**: Yes Safari
**Problem type**: Design is broken
**Description**: Items are misaligned
**Steps to Reproduce**:
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_test
|
design is broken url browser version firefox operating system windows tested another browser yes safari problem type design is broken description items are misaligned steps to reproduce browser configuration none from with ❤️
| 0
|
278,611
| 24,163,183,531
|
IssuesEvent
|
2022-09-22 13:15:01
|
cloudscribe/cloudscribe.SimpleContent
|
https://api.github.com/repos/cloudscribe/cloudscribe.SimpleContent
|
closed
|
Blog Menu Position behaves strangely
|
bug needs-test REFERENCE
|
In a Simple Content site I'm seeing very odd behaviour and cannot position the Blog where I want it.
if I have these page orders set
Page1 10
Page2 15
Blog 20
I get this menu:
Other pages... Page1 Blog Page2
and if I change the ordering to this
Page1 30
Page2 15
Blog 20
I get this menu
Other pages... Page2 Blog Page1
What!??
Multi-tenant site with hostname mode. The above is observed in s2.
|
1.0
|
Blog Menu Position behaves strangely - In a Simple Content site I'm seeing very odd behaviour and cannot position the Blog where I want it.
if I have these page orders set
Page1 10
Page2 15
Blog 20
I get this menu:
Other pages... Page1 Blog Page2
and if I change the ordering to this
Page1 30
Page2 15
Blog 20
I get this menu
Other pages... Page2 Blog Page1
What!??
Multi-tenant site with hostname mode. The above is observed in s2.
|
test
|
blog menu position behaves strangely in a simple content site i m seeing very odd behaviour and cannot position the blog where i want it if i have these page orders set blog i get this menu other pages blog and if i change the ordering to this blog i get this menu other pages blog what multi tenant site with hostname mode the above is observed in
| 1
|
45,211
| 5,703,829,650
|
IssuesEvent
|
2017-04-18 01:33:36
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
closed
|
Test failure: System.Net.Http.Functional.Tests.CancellationTest/ReadAsStreamAsync_ReadAsync_Cancel_BodyNeverStarted_TaskCanceledQuickly
|
area-System.Net.Http test bug test-run-core test-run-portable
|
Opened on behalf of @Jiayili1
The test `System.Net.Http.Functional.Tests.CancellationTest/ReadAsStreamAsync_ReadAsync_Cancel_BodyNeverStarted_TaskCanceledQuickly` has failed.
Elapsed time should be short\r
Expected: True\r
Actual: False
Stack Trace:
at System.Net.Http.Functional.Tests.CancellationTest.<>c__DisplayClass4_2.<<ReadAsStreamAsync_ReadAsync_Cancel_TaskCanceledQuickly>b__0>d.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.GetResult()
at System.Net.Test.Common.LoopbackServer.<>c__DisplayClass3_0.<CreateServerAsync>b__0(Task t)
at System.Threading.Tasks.ContinuationTaskFromTask.InnerInvoke()
at System.Threading.Tasks.Task.<>c.<.cctor>b__276_1(Object obj)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.GetResult()
at System.Net.Http.Functional.Tests.CancellationTest.<ReadAsStreamAsync_ReadAsync_Cancel_TaskCanceledQuickly>d__4.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
Build : Master - 20170328.01 (Core Tests)
Failing configurations:
- Windows.10.Amd64
- x86-Debug
- Windows.7.Amd64
- x64-Release
Detail: https://mc.dot.net/#/product/netcore/master/source/official~2Fcorefx~2Fmaster~2F/type/test~2Ffunctional~2Fcli~2F/build/20170328.01/workItem/System.Net.Http.Functional.Tests/analysis/xunit/System.Net.Http.Functional.Tests.CancellationTest~2FReadAsStreamAsync_ReadAsync_Cancel_BodyNeverStarted_TaskCanceledQuickly
|
3.0
|
Test failure: System.Net.Http.Functional.Tests.CancellationTest/ReadAsStreamAsync_ReadAsync_Cancel_BodyNeverStarted_TaskCanceledQuickly - Opened on behalf of @Jiayili1
The test `System.Net.Http.Functional.Tests.CancellationTest/ReadAsStreamAsync_ReadAsync_Cancel_BodyNeverStarted_TaskCanceledQuickly` has failed.
Elapsed time should be short\r
Expected: True\r
Actual: False
Stack Trace:
at System.Net.Http.Functional.Tests.CancellationTest.<>c__DisplayClass4_2.<<ReadAsStreamAsync_ReadAsync_Cancel_TaskCanceledQuickly>b__0>d.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.GetResult()
at System.Net.Test.Common.LoopbackServer.<>c__DisplayClass3_0.<CreateServerAsync>b__0(Task t)
at System.Threading.Tasks.ContinuationTaskFromTask.InnerInvoke()
at System.Threading.Tasks.Task.<>c.<.cctor>b__276_1(Object obj)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.GetResult()
at System.Net.Http.Functional.Tests.CancellationTest.<ReadAsStreamAsync_ReadAsync_Cancel_TaskCanceledQuickly>d__4.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
Build : Master - 20170328.01 (Core Tests)
Failing configurations:
- Windows.10.Amd64
- x86-Debug
- Windows.7.Amd64
- x64-Release
Detail: https://mc.dot.net/#/product/netcore/master/source/official~2Fcorefx~2Fmaster~2F/type/test~2Ffunctional~2Fcli~2F/build/20170328.01/workItem/System.Net.Http.Functional.Tests/analysis/xunit/System.Net.Http.Functional.Tests.CancellationTest~2FReadAsStreamAsync_ReadAsync_Cancel_BodyNeverStarted_TaskCanceledQuickly
|
test
|
test failure system net http functional tests cancellationtest readasstreamasync readasync cancel bodyneverstarted taskcanceledquickly opened on behalf of the test system net http functional tests cancellationtest readasstreamasync readasync cancel bodyneverstarted taskcanceledquickly has failed elapsed time should be short r expected true r actual false stack trace at system net http functional tests cancellationtest c b d movenext end of stack trace from previous location where exception was thrown at system runtime exceptionservices exceptiondispatchinfo throw at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system runtime compilerservices taskawaiter getresult at system net test common loopbackserver c b task t at system threading tasks continuationtaskfromtask innerinvoke at system threading tasks task c b object obj at system threading executioncontext run executioncontext executioncontext contextcallback callback object state at system threading tasks task executewiththreadlocal task currenttaskslot end of stack trace from previous location where exception was thrown at system runtime exceptionservices exceptiondispatchinfo throw at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system runtime compilerservices taskawaiter getresult at system net http functional tests cancellationtest d movenext end of stack trace from previous location where exception was thrown at system runtime exceptionservices exceptiondispatchinfo throw at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task end of stack trace from previous location where exception was thrown at system runtime exceptionservices exceptiondispatchinfo throw at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task end of stack trace from previous location where exception was thrown at system runtime exceptionservices exceptiondispatchinfo throw at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task build master core tests failing configurations windows debug windows release detail
| 1
|
285,317
| 24,658,994,220
|
IssuesEvent
|
2022-10-18 04:07:36
|
HSLdevcom/jore4
|
https://api.github.com/repos/HSLdevcom/jore4
|
closed
|
Set video recording for tests so that it creates a separate video for each test.
|
testing
|
Currently video recording for robot tests is done so that it starts recording in the suite setup and then ends when suite is done. So we get one video file for the run.
It would be easier to debug specific tests if the video recording was started in test setup. So we would get separate videos for each test.
Implementation plan:
- figure out the best way to record separate videos for each test. Do we create a new context in each test setup or is there a better way to stop and start recording.
- implement the solution.
|
1.0
|
Set video recording for tests so that it creates a separate video for each test. - Currently video recording for robot tests is done so that it starts recording in the suite setup and then ends when suite is done. So we get one video file for the run.
It would be easier to debug specific tests if the video recording was started in test setup. So we would get separate videos for each test.
Implementation plan:
- figure out the best way to record separate videos for each test. Do we create a new context in each test setup or is there a better way to stop and start recording.
- implement the solution.
|
test
|
set video recording for tests so that it creates a separate video for each test currently video recording for robot tests is done so that it starts recording in the suite setup and then ends when suite is done so we get one video file for the run it would be easier to debug specific tests if the video recording was started in test setup so we would get separate videos for each test implementation plan figure out the best way to record separate videos for each test do we create a new context in each test setup or is there a better way to stop and start recording implement the solution
| 1
|
234,497
| 19,186,317,793
|
IssuesEvent
|
2021-12-05 08:57:09
|
imixs/imixs-workflow
|
https://api.github.com/repos/imixs/imixs-workflow
|
closed
|
LuceneIndex - Autorebuild Defaults
|
enhancement testing
|
Increase default for autorebuild
- block size 500 -> 10000
- interval 2 -> 1
|
1.0
|
LuceneIndex - Autorebuild Defaults - Increase default for autorebuild
- block size 500 -> 10000
- interval 2 -> 1
|
test
|
luceneindex autorebuild defaults increase default for autorebuild block size interval
| 1
|
228,372
| 18,172,622,737
|
IssuesEvent
|
2021-09-27 21:53:21
|
openshift/odo
|
https://api.github.com/repos/openshift/odo
|
closed
|
Automate integration and unit test for windows on PSI
|
priority/High area/testing points/2 area/Windows kind/user-story milestone/2.4
|
/kind user-story
## User Story
As a QE I want to automate both integration and unit test on Windows.
## Acceptance Criteria
- [ ] Automate the test for Windows.
- [ ] Resolve the blockers, if any.
## Links
- Feature Request: N/A
- Related issue: https://github.com/openshift/odo/issues/4379 , #2540
/kind user-story
|
1.0
|
Automate integration and unit test for windows on PSI - /kind user-story
## User Story
As a QE I want to automate both integration and unit test on Windows.
## Acceptance Criteria
- [ ] Automate the test for Windows.
- [ ] Resolve the blockers, if any.
## Links
- Feature Request: N/A
- Related issue: https://github.com/openshift/odo/issues/4379 , #2540
/kind user-story
|
test
|
automate integration and unit test for windows on psi kind user story user story as a qe i want to automate both integration and unit test on windows acceptance criteria automate the test for windows resolve the blockers if any links feature request n a related issue kind user story
| 1
|
268,386
| 23,365,293,104
|
IssuesEvent
|
2022-08-10 14:54:08
|
watchfulli/XCloner-Wordpress
|
https://api.github.com/repos/watchfulli/XCloner-Wordpress
|
closed
|
Test v4.4.2
|
testing
|
- [x] Basic Settings can be saved
- [x] Remote Storages settings can be saved and can be validated against valid credentials (spot-test with existing cloud credentials from LastPass)
- [x] Generate Backup - full backup with default settings
- [x] Generate Backup - partial backup with some tables excluded
- [x] Generate Backup - partial backup with some files excludes
- [x] Generate Backup - partial backup with additional databases included
- [x] Generate Backup - full backup with encryption enabled
- [x] Generate Backup - full backup transferred to Dropbox
- [x] Generate Backup - full backup transferred to Dropbox and deleted from local
- [x] Manage Local Backups - Make sure backups are listed.
- [x] Manage Local Backups - Make sure backups can be deleted
- [x] Manage Local Backups - Make sure backups can be uploaded to DropBox
- [x] Manage Local Backups - Make sure backups can be downloaded.
- [x] Manage Local Backups - Make sure backup contents can be listed.
- [x] Manage Local Backups - Make sure an encrypted backup can be unencrypted.
- [x] Manage Dropbox Backups - Make sure backups are listed.
- [x] Manage Dropbox Backups - Make sure backups can be deleted
- [x] Manage Dropbox Backups - Make sure backups can be uploaded to DropBox
- [x] Manage Dropbox Backups - Make sure backups can be downloaded.
- [x] Create a Scheduled Profile
- [x] Check the correct recurrence of a scheduled Profile based on the Wordpress cronjob execution
- [ ] Trigger any backup profile from the XCloner CLI
- [x] Restore all files from a local backup
- [x] Restore database from a local backup
- [x] Restore all files from a remote location
- [x] Restore database from a remote location
- [x] Enable/Disable logs - check if enabling/disabling logs in XCloner works
- [x] Site connects/validates in Watchful (test `Refresh data` success)
- [x] XCloner backup works as expected in Watchful
- [x] Check if XCloner version has been incremented from previous release
- [x] Check if release has changelog notes
NOTE: this release should fix #227 and #231
[XCloner-Wordpress-v4.4.2.zip](https://github.com/watchfulli/XCloner-Wordpress/files/9300366/XCloner-Wordpress-v4.4.2.zip)
|
1.0
|
Test v4.4.2 - - [x] Basic Settings can be saved
- [x] Remote Storages settings can be saved and can be validated against valid credentials (spot-test with existing cloud credentials from LastPass)
- [x] Generate Backup - full backup with default settings
- [x] Generate Backup - partial backup with some tables excluded
- [x] Generate Backup - partial backup with some files excludes
- [x] Generate Backup - partial backup with additional databases included
- [x] Generate Backup - full backup with encryption enabled
- [x] Generate Backup - full backup transferred to Dropbox
- [x] Generate Backup - full backup transferred to Dropbox and deleted from local
- [x] Manage Local Backups - Make sure backups are listed.
- [x] Manage Local Backups - Make sure backups can be deleted
- [x] Manage Local Backups - Make sure backups can be uploaded to DropBox
- [x] Manage Local Backups - Make sure backups can be downloaded.
- [x] Manage Local Backups - Make sure backup contents can be listed.
- [x] Manage Local Backups - Make sure an encrypted backup can be unencrypted.
- [x] Manage Dropbox Backups - Make sure backups are listed.
- [x] Manage Dropbox Backups - Make sure backups can be deleted
- [x] Manage Dropbox Backups - Make sure backups can be uploaded to DropBox
- [x] Manage Dropbox Backups - Make sure backups can be downloaded.
- [x] Create a Scheduled Profile
- [x] Check the correct recurrence of a scheduled Profile based on the Wordpress cronjob execution
- [ ] Trigger any backup profile from the XCloner CLI
- [x] Restore all files from a local backup
- [x] Restore database from a local backup
- [x] Restore all files from a remote location
- [x] Restore database from a remote location
- [x] Enable/Disable logs - check if enabling/disabling logs in XCloner works
- [x] Site connects/validates in Watchful (test `Refresh data` success)
- [x] XCloner backup works as expected in Watchful
- [x] Check if XCloner version has been incremented from previous release
- [x] Check if release has changelog notes
NOTE: this release should fix #227 and #231
[XCloner-Wordpress-v4.4.2.zip](https://github.com/watchfulli/XCloner-Wordpress/files/9300366/XCloner-Wordpress-v4.4.2.zip)
|
test
|
test basic settings can be saved remote storages settings can be saved and can be validated against valid credentials spot test with existing cloud credentials from lastpass generate backup full backup with default settings generate backup partial backup with some tables excluded generate backup partial backup with some files excludes generate backup partial backup with additional databases included generate backup full backup with encryption enabled generate backup full backup transferred to dropbox generate backup full backup transferred to dropbox and deleted from local manage local backups make sure backups are listed manage local backups make sure backups can be deleted manage local backups make sure backups can be uploaded to dropbox manage local backups make sure backups can be downloaded manage local backups make sure backup contents can be listed manage local backups make sure an encrypted backup can be unencrypted manage dropbox backups make sure backups are listed manage dropbox backups make sure backups can be deleted manage dropbox backups make sure backups can be uploaded to dropbox manage dropbox backups make sure backups can be downloaded create a scheduled profile check the correct recurrence of a scheduled profile based on the wordpress cronjob execution trigger any backup profile from the xcloner cli restore all files from a local backup restore database from a local backup restore all files from a remote location restore database from a remote location enable disable logs check if enabling disabling logs in xcloner works site connects validates in watchful test refresh data success xcloner backup works as expected in watchful check if xcloner version has been incremented from previous release check if release has changelog notes note this release should fix and
| 1
|
22,309
| 3,953,073,215
|
IssuesEvent
|
2016-04-29 11:55:06
|
servo/servo
|
https://api.github.com/repos/servo/servo
|
closed
|
Text after blockquote is displayed inside float
|
A-layout/floats C-has-test
|
```html
<div></div>
<blockquote>bar</blockquote>
foobar
<style>
div {
float: left;
width: 100px;
height: 100px;
background-color: green;
}
</style>
```
Firefox:

Servo:

|
1.0
|
Text after blockquote is displayed inside float - ```html
<div></div>
<blockquote>bar</blockquote>
foobar
<style>
div {
float: left;
width: 100px;
height: 100px;
background-color: green;
}
</style>
```
Firefox:

Servo:

|
test
|
text after blockquote is displayed inside float html bar foobar div float left width height background color green firefox servo
| 1
|
769,660
| 27,015,691,020
|
IssuesEvent
|
2023-02-10 19:08:35
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Data selector not working in FullApp embedding if going directly from location that hasn't loaded `/api/database` yet
|
Type:Bug Priority:P2 .Frontend Querying/GUI .Regression Embedding/Interactive
|
**Describe the bug**
If FullApp embedding directly to e.g `/collection/root`, then it is not possible to click the :heavy_plus_sign: (New...) menu to start a question, because there hasn't been any calls to `/api/database` yet.
**Possible workaround:**
Create a button outside the iframe, similar to pre-42, which changes the URL to `/question/new` or `/question/notebook#eyJjcmVhdGlvblR5cGUiOiJjdXN0b21fcXVlc3Rpb24iLCJkYXRhc2V0X3F1ZXJ5Ijp7ImRhdGFiYXNlIjpudWxsLCJxdWVyeSI6eyJzb3VyY2UtdGFibGUiOm51bGx9LCJ0eXBlIjoicXVlcnkifSwiZGlzcGxheSI6InRhYmxlIiwidmlzdWFsaXphdGlvbl9zZXR0aW5ncyI6e319`
Or use the homepage as starting point, which loads the databases.
**Reproduce 1**
1. Setup an instance. Might need to adjust allowed embed URLs or just `*`
2. Create a page that has an iframe like (some path that doesn't load the `/api/database` endpoint, which the homepage does by default)
`<iframe src="https://metabase.example.com/collection/root">`
3. Click :heavy_plus_sign: (New...) > Question

4. Data selector just shows: `To pick some data, you'll need to add some first`

**Reproduce 2**
1. Setup an instance. Might need to adjust allowed embed URLs or just `*`
2. Create a page that has an iframe like (some path that doesn't load the `/api/database` endpoint, which the homepage does by default)
`<iframe src="https://metabase.example.com/question#eyJkYXRhc2V0X3F1ZXJ5Ijp7InR5cGUiOiJuYXRpdmUiLCJuYXRpdmUiOnsicXVlcnkiOiJzZWxlY3QgMSIsInRlbXBsYXRlLXRhZ3MiOnt9fSwiZGF0YWJhc2UiOjF9LCJkaXNwbGF5Ijoic2NhbGFyIiwiZGlzcGxheUlzTG9ja2VkIjp0cnVlLCJwYXJhbWV0ZXJzIjpbXSwidmlzdWFsaXphdGlvbl9zZXR0aW5ncyI6e319">`
3. It will not show the database selector

4. Clicking the Reference (Learn about your data) opens the sidebar and makes a request to `/api/database`, which then updates everything correctly

**Reproduce X**
There's likely multiple other ways of reproducing this problem, not going through all of them.
**Information about your Metabase Installation:**
Tested 1.42.3
**Additional context**
Historically related to #18552 and #21055, and likely also related to #10927
|
1.0
|
Data selector not working in FullApp embedding if going directly from location that hasn't loaded `/api/database` yet - **Describe the bug**
If FullApp embedding directly to e.g `/collection/root`, then it is not possible to click the :heavy_plus_sign: (New...) menu to start a question, because there hasn't been any calls to `/api/database` yet.
**Possible workaround:**
Create a button outside the iframe, similar to pre-42, which changes the URL to `/question/new` or `/question/notebook#eyJjcmVhdGlvblR5cGUiOiJjdXN0b21fcXVlc3Rpb24iLCJkYXRhc2V0X3F1ZXJ5Ijp7ImRhdGFiYXNlIjpudWxsLCJxdWVyeSI6eyJzb3VyY2UtdGFibGUiOm51bGx9LCJ0eXBlIjoicXVlcnkifSwiZGlzcGxheSI6InRhYmxlIiwidmlzdWFsaXphdGlvbl9zZXR0aW5ncyI6e319`
Or use the homepage as starting point, which loads the databases.
**Reproduce 1**
1. Setup an instance. Might need to adjust allowed embed URLs or just `*`
2. Create a page that has an iframe like (some path that doesn't load the `/api/database` endpoint, which the homepage does by default)
`<iframe src="https://metabase.example.com/collection/root">`
3. Click :heavy_plus_sign: (New...) > Question

4. Data selector just shows: `To pick some data, you'll need to add some first`

**Reproduce 2**
1. Setup an instance. Might need to adjust allowed embed URLs or just `*`
2. Create a page that has an iframe like (some path that doesn't load the `/api/database` endpoint, which the homepage does by default)
`<iframe src="https://metabase.example.com/question#eyJkYXRhc2V0X3F1ZXJ5Ijp7InR5cGUiOiJuYXRpdmUiLCJuYXRpdmUiOnsicXVlcnkiOiJzZWxlY3QgMSIsInRlbXBsYXRlLXRhZ3MiOnt9fSwiZGF0YWJhc2UiOjF9LCJkaXNwbGF5Ijoic2NhbGFyIiwiZGlzcGxheUlzTG9ja2VkIjp0cnVlLCJwYXJhbWV0ZXJzIjpbXSwidmlzdWFsaXphdGlvbl9zZXR0aW5ncyI6e319">`
3. It will not show the database selector

4. Clicking the Reference (Learn about your data) opens the sidebar and makes a request to `/api/database`, which then updates everything correctly

**Reproduce X**
There's likely multiple other ways of reproducing this problem, not going through all of them.
**Information about your Metabase Installation:**
Tested 1.42.3
**Additional context**
Historically related to #18552 and #21055, and likely also related to #10927
|
non_test
|
data selector not working in fullapp embedding if going directly from location that hasn t loaded api database yet describe the bug if fullapp embedding directly to e g collection root then it is not possible to click the heavy plus sign new menu to start a question because there hasn t been any calls to api database yet possible workaround create a button outside the iframe similar to pre which changes the url to question new or question notebook or use the homepage as starting point which loads the databases reproduce setup an instance might need to adjust allowed embed urls or just create a page that has an iframe like some path that doesn t load the api database endpoint which the homepage does by default iframe src click heavy plus sign new question data selector just shows to pick some data you ll need to add some first reproduce setup an instance might need to adjust allowed embed urls or just create a page that has an iframe like some path that doesn t load the api database endpoint which the homepage does by default iframe src it will not show the database selector clicking the reference learn about your data opens the sidebar and makes a request to api database which then updates everything correctly reproduce x there s likely multiple other ways of reproducing this problem not going through all of them information about your metabase installation tested additional context historically related to and and likely also related to
| 0
|
278,480
| 24,157,941,630
|
IssuesEvent
|
2022-09-22 09:14:07
|
gitpod-io/gitpod
|
https://api.github.com/repos/gitpod-io/gitpod
|
closed
|
[integration tests] provide a means to run tests from PR via opt-in, non-blocking mechanism
|
aspect: testing team: platform
|
## Is your feature request related to a problem? Please describe
<!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] -->
Integration tests run nightly, or ad-hoc.
## Describe the behaviour you'd like
<!-- A clear and concise description of what you want to happen. -->
As an initial skateboard, give developers an opt-in, non-blocking way to easily run the tests. Perhaps as an additional check/item in PRs, such as:
- [ ] /werft with-integration-tests
For a follow-on skateboard, consider something like
- [ ] /werft with-integration-tests workspace
Where we could also do
- [ ] /werft with-integration-tests workspace ide (but not webapp, as an example)
## Additional context
The thought is having an easier way to run tests will help teams at Gitpod more easily adopt and rely on the tests.
The preview environment for these test environments should not be the same as the live one used for preview. In other words, it should be possible to leave `with-preview` unchecked but check `with-integration-tests`.
Failed tests should not block the PR from merging.
|
1.0
|
[integration tests] provide a means to run tests from PR via opt-in, non-blocking mechanism - ## Is your feature request related to a problem? Please describe
<!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] -->
Integration tests run nightly, or ad-hoc.
## Describe the behaviour you'd like
<!-- A clear and concise description of what you want to happen. -->
As an initial skateboard, give developers an opt-in, non-blocking way to easily run the tests. Perhaps as an additional check/item in PRs, such as:
- [ ] /werft with-integration-tests
For a follow-on skateboard, consider something like
- [ ] /werft with-integration-tests workspace
Where we could also do
- [ ] /werft with-integration-tests workspace ide (but not webapp, as an example)
## Additional context
The thought is having an easier way to run tests will help teams at Gitpod more easily adopt and rely on the tests.
The preview environment for these test environments should not be the same as the live one used for preview. In other words, it should be possible to leave `with-preview` unchecked but check `with-integration-tests`.
Failed tests should not block the PR from merging.
|
test
|
provide a means to run tests from pr via opt in non blocking mechanism is your feature request related to a problem please describe integration tests run nightly or ad hoc describe the behaviour you d like as an initial skateboard give developers an opt in non blocking way to easily run the tests perhaps as an additional check item in prs such as werft with integration tests for a follow on skateboard consider something like werft with integration tests workspace where we could also do werft with integration tests workspace ide but not webapp as an example additional context the thought is having an easier way to run tests will help teams at gitpod more easily adopt and rely on the tests the preview environment for these test environments should not be the same as the live one used for preview in other words it should be possible to leave with preview unchecked but check with integration tests failed tests should not block the pr from merging
| 1
|
208,202
| 15,879,904,283
|
IssuesEvent
|
2021-04-09 13:04:06
|
WoWManiaUK/Redemption
|
https://api.github.com/repos/WoWManiaUK/Redemption
|
closed
|
[NPC/Add Does Not Attack] Levixus
|
Fixed on PTR - Tester Confirmed
|
**Links:**
Link To video showing summon but no attack: https://youtu.be/rStIhDIVyT4
Link to wow-mania Armory: https://www.wow-mania.com/armory/?npc=19847
**What is Happening:**
Levixus will summon his Guardian (Infernal) and it will just stand there not attacking.
**What Should happen:**
The infernal Summon Guardian should aid Levixus in the battle.
|
1.0
|
[NPC/Add Does Not Attack] Levixus - **Links:**
Link To video showing summon but no attack: https://youtu.be/rStIhDIVyT4
Link to wow-mania Armory: https://www.wow-mania.com/armory/?npc=19847
**What is Happening:**
Levixus will summon his Guardian (Infernal) and it will just stand there not attacking.
**What Should happen:**
The infernal Summon Guardian should aid Levixus in the battle.
|
test
|
levixus links link to video showing summon but no attack link to wow mania armory what is happening levixus will summon his guardian infernal and it will just stand there not attacking what should happen the infernal summon guardian should aid levixus in the battle
| 1
|
581,785
| 17,331,658,097
|
IssuesEvent
|
2021-07-28 03:48:57
|
martinmags/july-challenge-tbd
|
https://api.github.com/repos/martinmags/july-challenge-tbd
|
opened
|
Login and Account Creation
|
Issue: Dependencies Issue: Feature Priority: High Research
|
- Create an account via Google OAuth and MongoDB
- Login with Google Oauth and MongoDB
- Manage Protected and Public pages depending on login status (React Context/Provider)
|
1.0
|
Login and Account Creation - - Create an account via Google OAuth and MongoDB
- Login with Google Oauth and MongoDB
- Manage Protected and Public pages depending on login status (React Context/Provider)
|
non_test
|
login and account creation create an account via google oauth and mongodb login with google oauth and mongodb manage protected and public pages depending on login status react context provider
| 0
|
38,384
| 2,846,598,848
|
IssuesEvent
|
2015-05-29 12:37:01
|
UnifiedViews/Core
|
https://api.github.com/repos/UnifiedViews/Core
|
opened
|
Update scripts to schema.sql and data.sql should be two separated files
|
priority: Normal severity: enhancement status: planed
|
The update script below updates schema.sql as well as data.sql. It should be separated into two update scripts - one for schema.sql, one for data.sql. Also, in the future, schema.sql updates should not be in the same file as data.sql updates.
https://github.com/UnifiedViews/Core/blob/feature/permissionsRefactoring/db/mysql/updates/2.1.0-update.sql
|
1.0
|
Update scripts to schema.sql and data.sql should be two separated files - The update script below updates schema.sql as well as data.sql. It should be separated into two update scripts - one for schema.sql, one for data.sql. Also, in the future, schema.sql updates should not be in the same file as data.sql updates.
https://github.com/UnifiedViews/Core/blob/feature/permissionsRefactoring/db/mysql/updates/2.1.0-update.sql
|
non_test
|
update scripts to schema sql and data sql should be two separated files the update script below updates schema sql as well as data sql it should be separated to two update scripts one for schema sql one for data sql also in the future schema sql updates should not be in the same file with data sql updates
| 0
|
236,687
| 19,567,214,081
|
IssuesEvent
|
2022-01-04 03:23:11
|
kubernetes/test-infra
|
https://api.github.com/repos/kubernetes/test-infra
|
closed
|
Make it clear which version of kubetest or kubetest2 is being used
|
help wanted area/kubetest sig/testing priority/important-soon lifecycle/rotten kind/cleanup
|
We may have `deprecated` kubetest, but it's still in use. Whatever we do here should support `kubetest2` or be included in `kubetest2`
https://github.com/kubernetes/kubernetes/pull/96987#issuecomment-780646063 is an example of confusion over the fact that changes to `kubetest` don't take effect until job configs are updated to a new image (e.g. `kubekins`)
We could stand to document how changes flow a bit better. But it might help for troubleshooting if we make it clearer what version of kubetest is being used.
Ideas include:
- log the version kubetest was built at
- have kubetest log its version
- store version info as an artifact
- include kubetest's version in `metadata.json` so it can be displayed as a custom testgrid column (maybe related to https://github.com/kubernetes/test-infra/issues/20650)
|
2.0
|
Make it clear which version of kubetest or kubetest2 is being used - We may have `deprecated` kubetest, but it's still in use. Whatever we do here should support `kubetest2` or be included in `kubetest2`
https://github.com/kubernetes/kubernetes/pull/96987#issuecomment-780646063 is an example of confusion over the fact that changes to `kubetest` don't take effect until job configs are updated to a new image (e.g. `kubekins`)
We could stand to document how changes flow a bit better. But it might help for troubleshooting if we make it clearer what version of kubetest is being used.
Ideas include:
- log the version kubetest was built at
- have kubetest log its version
- store version info as an artifact
- include kubetest's version in `metadata.json` so it can be displayed as a custom testgrid column (maybe related to https://github.com/kubernetes/test-infra/issues/20650)
|
test
|
make it clear which version of kubetest or is being used we may have deprecated kubetest but it s still in use whatever we do here should support or be included in is an example of confusion over the fact that changes to kubetest don t take effect until job configs are updated to a new image e g kubekins we could stand to document how changes flow a better but it might help for troubleshooting if we make it clearer what version of kubetest is being used ideas include log the version kubetest was built at have kubetest log its version store version info as an artifact include kubetest s version in metadata json so it can be displayed as a custom testgrid column maybe related to
| 1
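One of the ideas listed in the kubetest record above — "store version info as an artifact" / "include kubetest's version in `metadata.json`" — can be sketched as follows. This is an illustrative assumption, not kubetest's actual code: the helper name `write_version_metadata` and the `"kubetest-version"` key are hypothetical.

```python
# Hypothetical sketch: record a tool's build version in a
# metadata.json-style artifact so CI results show which binary ran.
import json

def write_version_metadata(path, version, extra=None):
    """Write a small JSON artifact recording the tool version."""
    data = {"kubetest-version": version}
    if extra:
        data.update(extra)  # e.g. git commit, build date
    with open(path, "w") as f:
        json.dump(data, f, indent=2)
    return data

if __name__ == "__main__":
    meta = write_version_metadata("metadata.json", "v1.2.3")
    print(meta["kubetest-version"])  # → v1.2.3
```

A dashboard such as testgrid could then read this key back out of the artifact to display it as a custom column.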
|
493,575
| 14,234,958,213
|
IssuesEvent
|
2020-11-18 14:15:33
|
GoogleCloudPlatform/bank-of-anthos
|
https://api.github.com/repos/GoogleCloudPlatform/bank-of-anthos
|
closed
|
Kubernetes “frontend” service is unable to connect to “ledgermonolith-service”
|
priority: p1 type: bug
|
The firewall rule created in the “deploy-monolith.sh” script uses a Source Tag of “monolith”. The Kubernetes cluster does not have a 'monolith' networking tag to match this.
|
1.0
|
Kubernetes “frontend” service is unable to connect to “ledgermonolith-service” - The firewall rule created in the “deploy-monolith.sh” script uses a Source Tag of “monolith”. The Kubernetes cluster does not have a 'monolith' networking tag to match this.
|
non_test
|
kubernetes “frontend” service is unable to connect to “ledgermonolith service” the firewall rule that was created in the “deploy monolith sh” script with a source tag “monolith” the kubernetes cluster does not have a monolith networking tag to match this
| 0
|
20,431
| 13,914,795,559
|
IssuesEvent
|
2020-10-20 22:56:59
|
dotnet/runtime
|
https://api.github.com/repos/dotnet/runtime
|
closed
|
Add System.Runtime.CompilerServices.ILproj to slns targeting net461 in their tests.
|
area-Infrastructure-libraries untriaged
|
we need to Add System.Runtime.CompilerServices.ILproj to slns targeting net461 in their tests because all the test projects have a transitive dependency on this assembly. VS doesn't build the project references if they are not in the sln.
dependency https://github.com/dotnet/runtime/blob/master/src/libraries/Common/tests/TestUtilities/TestUtilities.csproj#L79
cc @ViktorHofer @safern @ericstj
|
1.0
|
Add System.Runtime.CompilerServices.ILproj to slns targeting net461 in their tests. - we need to Add System.Runtime.CompilerServices.ILproj to slns targeting net461 in their tests because all the test projects have a transitive dependency on this assembly. VS doesn't build the project references if they are not in the sln.
dependency https://github.com/dotnet/runtime/blob/master/src/libraries/Common/tests/TestUtilities/TestUtilities.csproj#L79
cc @ViktorHofer @safern @ericstj
|
non_test
|
add system runtime compilerservices ilproj to slns targeting in their tests we need to add system runtime compilerservices ilproj to slns targeting in their tests because all the test projects have a transitive dependency on this assembly vs doesnt build the project references if they are not in sln dependency cc viktorhofer safern ericstj
| 0
|
68,200
| 7,089,140,319
|
IssuesEvent
|
2018-01-12 00:50:13
|
SpongePowered/SpongeCommon
|
https://api.github.com/repos/SpongePowered/SpongeCommon
|
closed
|
Data inconsistency: player data and world save
|
status: more-information-needed status: needs-testing version: 1.10 (u)
|
When using
```
auto-player-save-interval=0
auto-save-interval=0
```
and doing the save manually via the /save-all command, the world and player data get out of sync on shutdown (crash, out of mem..).
We are running the saves ourselves as we want the distance between saves to be in real time (to match the players' experience of time) and so we can delay saves while backups are in progress to avoid corruption and inconsistency of backups.
|
1.0
|
Data inconsistency: player data and world save - When using
```
auto-player-save-interval=0
auto-save-interval=0
```
and doing the save manually via the /save-all command, the world and player data get out of sync on shutdown (crash, out of mem..).
We are running the saves ourselves as we want the distance between saves to be in real time (to match the players' experience of time) and so we can delay saves while backups are in progress to avoid corruption and inconsistency of backups.
|
test
|
data inconsistency player data and world save when using auto player save interval auto save interval and doing the save manually via save all command the world and player data gets out of sync on shutdown crash out of mem we are running the saves our self as we want the distance between saves to be in real time to match the players experience of time and so we can delay saves while backups are in progress to avoid corruption and inconsistency of backups
| 1
|
34,023
| 12,235,831,140
|
IssuesEvent
|
2020-05-04 15:26:56
|
TIBCOSoftware/tibco-streaming-samples
|
https://api.github.com/repos/TIBCOSoftware/tibco-streaming-samples
|
opened
|
CVE-2019-14439 (High) detected in jackson-databind-2.8.9.jar
|
security vulnerability
|
## CVE-2019-14439 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.9.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /tmp/ws-scm/tibco-streaming-samples/web/openapi/openapi-client/pom.xml</p>
<p>Path to vulnerable library: 20200504152534_NTXSTJ/downloadResource_NRTEMG/20200504152540/jackson-databind-2.8.9.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.8.9.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/TIBCOSoftware/tibco-streaming-samples/commit/1ec7c157a9b3f4a53d7dbaea8b57c6776fbf9f8c">1ec7c157a9b3f4a53d7dbaea8b57c6776fbf9f8c</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x before 2.9.9.2. This occurs when Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has the logback jar in the classpath.
<p>Publish Date: 2019-07-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-14439>CVE-2019-14439</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14439">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14439</a></p>
<p>Release Date: 2019-07-30</p>
<p>Fix Resolution: 2.9.9.2</p>
</p>
</details>
<p></p>
|
True
|
CVE-2019-14439 (High) detected in jackson-databind-2.8.9.jar - ## CVE-2019-14439 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.9.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /tmp/ws-scm/tibco-streaming-samples/web/openapi/openapi-client/pom.xml</p>
<p>Path to vulnerable library: 20200504152534_NTXSTJ/downloadResource_NRTEMG/20200504152540/jackson-databind-2.8.9.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.8.9.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/TIBCOSoftware/tibco-streaming-samples/commit/1ec7c157a9b3f4a53d7dbaea8b57c6776fbf9f8c">1ec7c157a9b3f4a53d7dbaea8b57c6776fbf9f8c</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.x before 2.9.9.2. This occurs when Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has the logback jar in the classpath.
<p>Publish Date: 2019-07-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-14439>CVE-2019-14439</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14439">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14439</a></p>
<p>Release Date: 2019-07-30</p>
<p>Fix Resolution: 2.9.9.2</p>
</p>
</details>
<p></p>
|
non_test
|
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file tmp ws scm tibco streaming samples web openapi openapi client pom xml path to vulnerable library ntxstj downloadresource nrtemg jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in head commit a href vulnerability details a polymorphic typing issue was discovered in fasterxml jackson databind x before this occurs when default typing is enabled either globally or for a specific property for an externally exposed json endpoint and the service has the logback jar in the classpath publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
| 0
|
669,104
| 22,612,252,669
|
IssuesEvent
|
2022-06-29 18:15:36
|
operator-framework/rukpak
|
https://api.github.com/repos/operator-framework/rukpak
|
opened
|
Incorporate changes from new Bundle APIs
|
priority/important-soon
|
#396 refactored the underlying Bundle filesystem representation and added a new conversion method to transform registry+v1 bundles to plain+v0 bundles. These new APIs should be incorporated in the logic of the respective provisioners.
Part of #406
|
1.0
|
Incorporate changes from new Bundle APIs - #396 refactored the underlying Bundle filesystem representation and added a new conversion method to transform registry+v1 bundles to plain+v0 bundles. These new APIs should be incorporated in the logic of the respective provisioners.
Part of #406
|
non_test
|
incorporate changes from new bundle apis refactored the underlying bundle filesystem representation and added a new conversion method to transform registry bundles to plain bundles these new apis should be incorporated in the logic of the respective provisioners part of
| 0
|
210,987
| 16,162,692,352
|
IssuesEvent
|
2021-05-01 00:23:39
|
openenclave/openenclave
|
https://api.github.com/repos/openenclave/openenclave
|
reopened
|
Publish oecert tool
|
SGX attestation devtools triaged
|
The oecert tool is an internal command-line tool which uses the OE attestation code to output a report and certificate from SGX.
This tool may be valuable for people writing SGX verification code. We should consider publishing this tool with the SDK so developers can use it to generate test data. This would also allow users to generate files to pass to the new host_verify sample: https://github.com/openenclave/openenclave/pull/2865
sub task:
- [x] Option to skip evidence verification - #3912
- [x] Option to print verbose loggings (similar to oecertdump tool) - #3889
- [x] Hardcoded enclave filename - #3919
- [x] Option to specify in-proc or out-of-proc quote generation - #3949
- [x] Option to not generate report/evidence/certificate output file - #3939
- [x] Enclave must be signed. Check mrsigner when verification option is set - #3955
- [ ] Remove oecertdump tool
- [x] Add legal printout - #3955
- [ ] Publish the tool
|
1.0
|
Publish oecert tool - The oecert tool is an internal command-line tool which uses the OE attestation code to output a report and certificate from SGX.
This tool may be valuable for people writing SGX verification code. We should consider publishing this tool with the SDK so developers can use it to generate test data. This would also allow users to generate files to pass to the new host_verify sample: https://github.com/openenclave/openenclave/pull/2865
sub task:
- [x] Option to skip evidence verification - #3912
- [x] Option to print verbose loggings (similar to oecertdump tool) - #3889
- [x] Hardcoded enclave filename - #3919
- [x] Option to specify in-proc or out-of-proc quote generation - #3949
- [x] Option to not generate report/evidence/certificate output file - #3939
- [x] Enclave must be signed. Check mrsigner when verification option is set - #3955
- [ ] Remove oecertdump tool
- [x] Add legal printout - #3955
- [ ] Publish the tool
|
test
|
publish oecert tool the oecert tool is an internal command line tool which uses the oe attestation code to output a report and certificate from sgx this tool may be valuable for people writing sgx verification code we should consider publishing this tool with the sdk so developers can use it to generate test data this would also allow users to generate files to pass to the new host verify sample sub task option to skip evidence verification option to print verbose loggings similar to oecertdump tool hardcoded enclave filename option to specify in proc or out of proc quote generation option to not generate report evidence certificate output file enclave must be signed check mrsigner when verification option is set remove oecertdump tool add legal printout publish the tool
| 1
|
268,436
| 23,369,007,323
|
IssuesEvent
|
2022-08-10 17:59:29
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
closed
|
Failing test: X-Pack Alerting API Integration Tests.x-pack/test/alerting_api_integration/spaces_only/tests/alerting/get_action_error_log·ts - alerting api integration spaces only Alerting getActionErrorLog get and filter action error logs for rules with multiple action errors
|
blocker failed-test skipped-test Team:ResponseOps v8.4.0
|
A test failed on a tracked branch
```
Error: expected 0 to sort of equal 1
at Assertion.assert (node_modules/@kbn/expect/expect.js:100:11)
at Assertion.eql (node_modules/@kbn/expect/expect.js:244:8)
at Context.<anonymous> (x-pack/test/alerting_api_integration/spaces_only/tests/alerting/get_action_error_log.ts:188:52)
at runMicrotasks (<anonymous>)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at Object.apply (node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16) {
actual: '0',
expected: '1',
showDiff: true
}
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/18759#0182276d-cfc2-456a-b811-8178e5b661f7)
<!-- kibanaCiData = {"failed-test":{"test.class":"X-Pack Alerting API Integration Tests.x-pack/test/alerting_api_integration/spaces_only/tests/alerting/get_action_error_log·ts","test.name":"alerting api integration spaces only Alerting getActionErrorLog get and filter action error logs for rules with multiple action errors","test.failCount":6}} -->
|
2.0
|
Failing test: X-Pack Alerting API Integration Tests.x-pack/test/alerting_api_integration/spaces_only/tests/alerting/get_action_error_log·ts - alerting api integration spaces only Alerting getActionErrorLog get and filter action error logs for rules with multiple action errors - A test failed on a tracked branch
```
Error: expected 0 to sort of equal 1
at Assertion.assert (node_modules/@kbn/expect/expect.js:100:11)
at Assertion.eql (node_modules/@kbn/expect/expect.js:244:8)
at Context.<anonymous> (x-pack/test/alerting_api_integration/spaces_only/tests/alerting/get_action_error_log.ts:188:52)
at runMicrotasks (<anonymous>)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at Object.apply (node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16) {
actual: '0',
expected: '1',
showDiff: true
}
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/18759#0182276d-cfc2-456a-b811-8178e5b661f7)
<!-- kibanaCiData = {"failed-test":{"test.class":"X-Pack Alerting API Integration Tests.x-pack/test/alerting_api_integration/spaces_only/tests/alerting/get_action_error_log·ts","test.name":"alerting api integration spaces only Alerting getActionErrorLog get and filter action error logs for rules with multiple action errors","test.failCount":6}} -->
|
test
|
failing test x pack alerting api integration tests x pack test alerting api integration spaces only tests alerting get action error log·ts alerting api integration spaces only alerting getactionerrorlog get and filter action error logs for rules with multiple action errors a test failed on a tracked branch error expected to sort of equal at assertion assert node modules kbn expect expect js at assertion eql node modules kbn expect expect js at context x pack test alerting api integration spaces only tests alerting get action error log ts at runmicrotasks at processticksandrejections node internal process task queues at object apply node modules kbn test target node functional test runner lib mocha wrap function js actual expected showdiff true first failure
| 1
|
303,530
| 26,215,323,265
|
IssuesEvent
|
2023-01-04 10:27:48
|
saleor/saleor-dashboard
|
https://api.github.com/repos/saleor/saleor-dashboard
|
closed
|
Cypress test fail: should not be able to disable several gift cards on gift card list page and use it in checkout. TC: SALEOR_1013
|
tests
|
**Known bug for versions:**
v31: true
v33: true
v34: true
**Additional Info:**
Spec: As an admin I want to use an enabled gift card in checkout
|
1.0
|
Cypress test fail: should not be able to disable several gift cards on gift card list page and use it in checkout. TC: SALEOR_1013 - **Known bug for versions:**
v31: true
v33: true
v34: true
**Additional Info:**
Spec: As an admin I want to use an enabled gift card in checkout
|
test
|
cypress test fail should not be able to disable several gift cards on gift card list page and use it in checkout tc saleor known bug for versions true true true additional info spec as a admin i want to use enabled gift card in checkout
| 1
|
235,768
| 18,059,457,523
|
IssuesEvent
|
2021-09-20 12:27:57
|
protocolbuffers/protobuf
|
https://api.github.com/repos/protocolbuffers/protobuf
|
closed
|
Javadocs is not updated at original javadoc site
|
java documentation
|
It seems that the official Javadoc is not up to date. I found differences in the latest releases and Javadoc site -> https://developers.google.com/protocol-buffers/docs/reference/java/.
I checked for `UnsafeByteOperations.java` at [Javadoc site](https://developers.google.com/protocol-buffers/docs/reference/java/com/google/protobuf/UnsafeByteOperations) and at [github repo](https://github.com/protocolbuffers/protobuf/blob/master/java/core/src/main/java/com/google/protobuf/UnsafeByteOperations.java).
|
1.0
|
Javadocs is not updated at original javadoc site - It seems that the official Javadoc is not up to date. I found differences in the latest releases and Javadoc site -> https://developers.google.com/protocol-buffers/docs/reference/java/.
I checked for `UnsafeByteOperations.java` at [Javadoc site](https://developers.google.com/protocol-buffers/docs/reference/java/com/google/protobuf/UnsafeByteOperations) and at [github repo](https://github.com/protocolbuffers/protobuf/blob/master/java/core/src/main/java/com/google/protobuf/UnsafeByteOperations.java).
|
non_test
|
javadocs is not updated at original javadoc site it seems that the official javadoc is not up to date i found differences in the latest releases and javadoc site i checked for unsafebyteoperation java at and at
| 0
|
218,406
| 16,989,221,667
|
IssuesEvent
|
2021-06-30 18:05:38
|
brave/brave-browser
|
https://api.github.com/repos/brave/brave-browser
|
opened
|
txt and json files cannot be imported via Share or import IPFS options
|
OS/Desktop QA/Test-Plan-Specified QA/Yes bug feature/ipfs
|
<!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
<!--Provide a brief description of the issue-->
txt and json files cannot be imported via Share or import IPFS options
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. Clean profile 1.27.x
2. Enable IPFS
3. Click on the Hamburger menu and select `IPFS->Share local files using IPFS` or Import the local file via `Import` menu
4. Share/import the .txt or .json file from the local machine
## Actual result:
<!--Please add screenshots if needed-->
txt and json files cannot be imported via Share or import IPFS options


## Expected result:
txt and json files can be imported via Share or import IPFS options
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
Easy
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
Brave | 1.27.84 Chromium: 91.0.4472.124 (Official Build) beta (64-bit)
-- | --
Revision | 7345a6d1bfcaff81162a957e9b7d52649fe2ac38-refs/branch-heads/4472_114@{#6}
OS | Windows 10 OS Version 2004 (Build 19041.1052)
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current release? Yes
- Can you reproduce this issue with the beta channel? Yes
- Can you reproduce this issue with the nightly channel? Yes
## Other Additional Information:
- Does the issue resolve itself when disabling Brave Shields? NA
- Does the issue resolve itself when disabling Brave Rewards? NA
- Is the issue reproducible on the latest version of Chrome? NA
## Miscellaneous Information:
<!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue-->
cc: @brave/legacy_qa @spylogsster
|
1.0
|
txt and json files cannot be imported via Share or import IPFS options - <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
<!--Provide a brief description of the issue-->
txt and json files cannot be imported via Share or import IPFS options
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. Clean profile 1.27.x
2. Enable IPFS
3. Click on the Hamburger menu and select `IPFS->Share local files using IPFS` or Import the local file via `Import` menu
4. Share/import the .txt or .json file from local machine
## Actual result:
<!--Please add screenshots if needed-->
txt and json files cannot be imported via Share or import IPFS options


## Expected result:
txt and json files can be imported via Share or import IPFS options
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
Easy
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
Brave | 1.27.84 Chromium: 91.0.4472.124 (Official Build) beta (64-bit)
-- | --
Revision | 7345a6d1bfcaff81162a957e9b7d52649fe2ac38-refs/branch-heads/4472_114@{#6}
OS | Windows 10 OS Version 2004 (Build 19041.1052)
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current release? Yes
- Can you reproduce this issue with the beta channel? Yes
- Can you reproduce this issue with the nightly channel? Yes
## Other Additional Information:
- Does the issue resolve itself when disabling Brave Shields? NA
- Does the issue resolve itself when disabling Brave Rewards? NA
- Is the issue reproducible on the latest version of Chrome? NA
## Miscellaneous Information:
<!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue-->
cc: @brave/legacy_qa @spylogsster
|
test
|
txt and json files cannot be imported via share or import ipfs options have you searched for similar issues before submitting this issue please check the open issues and add a note before logging a new issue please use the template below to provide information about the issue insufficient info will get the issue closed it will only be reopened after sufficient info is provided description txt and json files cannot be imported via share or import ipfs options steps to reproduce clean profile x enable ipfs click on the hamburger menu and select ipfs share local files using ipfs or import the local file via import menu share import the txt or json file from local manchine actual result txt and json files cannot be imported via share or import ipfs options expected result txt and json files can be imported via share or import ipfs options reproduces how often easy brave version brave version info brave chromium official build beta bit revision refs branch heads os windows os version build version channel information can you reproduce this issue with the current release yes can you reproduce this issue with the beta channel yes can you reproduce this issue with the nightly channel yes other additional information does the issue resolve itself when disabling brave shields na does the issue resolve itself when disabling brave rewards na is the issue reproducible on the latest version of chrome na miscellaneous information cc brave legacy qa spylogsster
| 1
|
231,753
| 25,541,817,751
|
IssuesEvent
|
2022-11-29 15:49:05
|
elastic/cloudbeat
|
https://api.github.com/repos/elastic/cloudbeat
|
opened
|
Cloudbeat fail due to fatal
|
bug Team:Cloud Security Posture
|
**Describe the bug**
Cloudbeat enters a crash loop after continuously receiving new configurations from the fleet server. This leads to continuous restarting of the fetching cycle and shipping lots of findings.
**Preconditions**
8.6.0 BC3
**To Reproduce**
Write the exact actions one should perform in order to reproduce the bug.
Steps to reproduce the behavior:
1. Deploy 8.6.0 BC3 agent with KSPM configured
2. Go to the agent logs
- OS: linux
- Arch: amd64
- Agent Version: 8.6.0 BC3
- Cloudbeat Version: 8.6.0 BC3
- Kibana Version: 8.6.0 BC3
|
True
|
Cloudbeat fail due to fatal - **Describe the bug**
Cloudbeat enters a crash loop after continuously receiving new configurations from the fleet server. This leads to continuous restarting of the fetching cycle and shipping lots of findings.
**Preconditions**
8.6.0 BC3
**To Reproduce**
Write the exact actions one should perform in order to reproduce the bug.
Steps to reproduce the behavior:
1. Deploy 8.6.0 BC3 agent with KSPM configured
2. Go to the agent logs
- OS: linux
- Arch: amd64
- Agent Version: 8.6.0 BC3
- Cloudbeat Version: 8.6.0 BC3
- Kibana Version: 8.6.0 BC3
|
non_test
|
cloudbeat fail due to fatal describe the bug cloudbat enters a crash loop back after continuously receiving new configurations from fleet server this leads to continuous restarting of the fetching cycle and shipping lots of findings preconditions to reproduce write the exact actions one should perform in order to reproduce the bug steps to reproduce the behavior deploy agent with kspm configured go to the agent logs os linux arch agent version cloudbeat version kibana version
| 0
|
4,979
| 11,923,578,806
|
IssuesEvent
|
2020-04-01 08:05:42
|
timelyxyz/blog
|
https://api.github.com/repos/timelyxyz/blog
|
opened
|
新冠肺炎引起的架构调整 - COVID-19 Caused IT Architecture Change
|
Architecture Free Talk
|
I think the biggest change is **Work From Home**. I've WFH for 2+ months...
To support global systems easily and quickly, we need to consider about the following architectural design aspects.
1. Cross-geo data center
2. Cross-geo data synchronization
3. Globalization (i18n)
4. Horizontal scaling
5. Global load balancing
6. Global CDN
|
1.0
|
新冠肺炎引起的架构调整 - COVID-19 Caused IT Architecture Change - I think the biggest change is **Work From Home**. I've WFH for 2+ months...
To support global systems easily and quickly, we need to consider about the following architectural design aspects.
1. Cross-geo data center
2. Cross-geo data synchronization
3. Globalization (i18n)
4. Horizonal scalling
5. Global load balancing
6. Global CDN
|
non_test
|
新冠肺炎引起的架构调整 covid caused it architecture change i think the biggest change is work from home i ve wfh for months to support global systems easily and quickly we need to consider about the following architectural design aspects cross geo data center cross geo data synchronization globalization horizonal scalling global load balancing global cdn
| 0
|
439,992
| 30,725,207,440
|
IssuesEvent
|
2023-07-27 19:03:30
|
openrewrite/rewrite-maven-plugin
|
https://api.github.com/repos/openrewrite/rewrite-maven-plugin
|
closed
|
Investigate automating docs
|
documentation enhancement
|
Right now, we rely on manually updating the [Maven plugin docs](https://docs.openrewrite.org/reference/rewrite-maven-plugin). Theoretically, we could use something like [the maven docc plugin](https://maven.apache.org/plugin-developers/plugin-documenting.html) to automate the creation of docs for this plugin.
We could then invoke this new goal and place the HTML in some `target/` directory that we could publish.
[Related Gradle issue](https://github.com/openrewrite/rewrite-gradle-plugin/issues/215)
[Related internal Slack thread](https://moderneinc.slack.com/archives/C04HLU7FG31/p1689693646849119?thread_ts=1689619338.215239&cid=C04HLU7FG31)
Suggested by: @timtebeek
|
1.0
|
Investigate automating docs - Right now, we rely on manually updating the [Maven plugin docs](https://docs.openrewrite.org/reference/rewrite-maven-plugin). Theoretically, we could use something like [the maven docc plugin](https://maven.apache.org/plugin-developers/plugin-documenting.html) to automate the creation of docs for this plugin.
We could then invoke this new goal and place the HTML in some `target/` directory that we could publish.
[Related Gradle issue](https://github.com/openrewrite/rewrite-gradle-plugin/issues/215)
[Related internal Slack thread](https://moderneinc.slack.com/archives/C04HLU7FG31/p1689693646849119?thread_ts=1689619338.215239&cid=C04HLU7FG31)
Suggested by: @timtebeek
|
non_test
|
investigate automating docs right now we rely on manually updating the theoretically we could use something like to automate the creation of docs for this plugin we could then invoke this new goal and place the html in some target directory that we could publish suggested by timtebeek
| 0
|
114,459
| 14,580,796,960
|
IssuesEvent
|
2020-12-18 09:43:49
|
IBM/ibm-spectrum-scale-csi
|
https://api.github.com/repos/IBM/ibm-spectrum-scale-csi
|
closed
|
not able to update node mapping
|
Customer Impact: Localized low impact Customer Probability: Medium Phase: Test Severity: 3 Target: Operator Type: Bug Type: Working As Designed
|
**Describe the bug**
removed one node worker0.gsanjay-ocp from mapping. but it is not effective
`oc edit CSIScaleOperator ibm-spectrum-scale-csi -n ibm-spectrum-scale-csi-driver`
Followed KC link https://rtpdoc01.rtp.raleigh.ibm.com:9443/kc/STXKQY_CSI_review/com.ibm.spectrum.scale.csi.v2r00.doc/bl1_csi_scaleoperator_config.html
```
https://rtpdoc01.rtp.raleigh.ibm.com:9443/kc/STXKQY_CSI_review/com.ibm.spectrum.scale.csi.v2r00.doc/bl1_csi_scaleoperator_config.html
**To Reproduce**
Create CR with node mapping, try to remove one node mapping using edit
**Expected behavior**
mapping should be removed
**Environment**
Please run the following an paste your output here:
``` bash
# Development
operator-sdk version
go version
# Deployment
oc version
Client Version: 4.3.13
Server Version: 4.3.13
Kubernetes Version: v1.16.2
[root@gsanjay-ocp-inf ibm-spectrum-scale-csi]#
```
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Additional context**
Add any other context about the problem here.
|
1.0
|
not able to update node mapping - **Describe the bug**
removed one node worker0.gsanjay-ocp from mapping. but it is not effective
`oc edit CSIScaleOperator ibm-spectrum-scale-csi -n ibm-spectrum-scale-csi-driver`
Followed KC link https://rtpdoc01.rtp.raleigh.ibm.com:9443/kc/STXKQY_CSI_review/com.ibm.spectrum.scale.csi.v2r00.doc/bl1_csi_scaleoperator_config.html
```
https://rtpdoc01.rtp.raleigh.ibm.com:9443/kc/STXKQY_CSI_review/com.ibm.spectrum.scale.csi.v2r00.doc/bl1_csi_scaleoperator_config.html
**To Reproduce**
Create CR with node mapping, try to remove one node mapping using edit
**Expected behavior**
mapping should be removed
**Environment**
Please run the following an paste your output here:
``` bash
# Development
operator-sdk version
go version
# Deployment
oc version
Client Version: 4.3.13
Server Version: 4.3.13
Kubernetes Version: v1.16.2
[root@gsanjay-ocp-inf ibm-spectrum-scale-csi]#
```
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Additional context**
Add any other context about the problem here.
|
non_test
|
not able to update node mapping describe the bug removed one node gsanjay ocp from mapping but it is not effective oc edit csiscaleoperator ibm spectrum scale csi n ibm spectrum scale csi driver followed kc link to reproduce create cr with node mapping try to remove one node mapping using edit expected behavior mapping should be removed environment please run the following an paste your output here bash developement operator sdk version go version deployment oc version client version server version kubernetes version screenshots if applicable add screenshots to help explain your problem additional context add any other context about the problem here
| 0
|
207,686
| 15,831,591,859
|
IssuesEvent
|
2021-04-06 13:47:39
|
WarwickCIM/backfillz-py
|
https://api.github.com/repos/WarwickCIM/backfillz-py
|
closed
|
Sample Stan data
|
aspect:testing
|
Need some sample Stan data to test the plotting functions. In the R version, this is loaded from `sample_stanfit.rda` (an R binary format). This seems to have been generated from the Eight Schools example in R; we can probably do something similar using PyStan.
- [x] `poetry add pystan`
- [x] generate the Eight Schools fit using the [sample Python code](https://pystan.readthedocs.io/en/latest/getting_started.html)
- [x] skeleton of `Backfillz` class and `as_backfillz` method
- [x] `poetry run flake8 backfillz/ tests/` locally
- [x] `poetry run mypy --strict --config-file=mypy.ini backfillz/` locally
- [x] `BackfillzTheme` class and prebuilt instances
- [x] bump to Python 3.8, install Pandas
- [x] `test_asBackfillz`:
- [x] save/pickle baseline sample data as `expected_backfillz`
- [x] load/unpickle `expected_backfillz` and compare
|
1.0
|
Sample Stan data - Need some sample Stan data to test the plotting functions. In the R version, this is loaded from `sample_stanfit.rda` (an R binary format). This seems to have been generated from the Eight Schools example in R; we can probably do something similar using PyStan.
- [x] `poetry add pystan`
- [x] generate the Eight Schools fit using the [sample Python code](https://pystan.readthedocs.io/en/latest/getting_started.html)
- [x] skeleton of `Backfillz` class and `as_backfillz` method
- [x] `poetry run flake8 backfillz/ tests/` locally
- [x] `poetry run mypy --strict --config-file=mypy.ini backfillz/` locally
- [x] `BackfillzTheme` class and prebuilt instances
- [x] bump to Python 3.8, install Pandas
- [x] `test_asBackfillz`:
- [x] save/pickle baseline sample data as `expected_backfillz`
- [x] load/unpickle `expected_backfillz` and compare
|
test
|
sample stan data need some sample stan data to test the plotting functions in the r version this is loaded from sample stanfit rda an r binary format this seems to have been generated from the eight schools example in r we can probably do something similar using pystan poetry add pystan generate the eight schools fit using the skeleton of backfillz class and as backfillz method poetry run backfillz tests locally poetry run mypy strict config file mypy ini backfillz locally backfillztheme class and prebuilt instances bump to python install pandas test asbackfillz save pickle baseline sample data as expected backfillz load unpickle expected backfillz and compare
| 1
|
319,883
| 27,405,491,510
|
IssuesEvent
|
2023-03-01 06:17:26
|
harvester/harvester
|
https://api.github.com/repos/harvester/harvester
|
closed
|
[BUG] Upgrade stuck in waiting the degraded volume that not start rebuilding
|
kind/bug priority/1 severity/1 regression area/upgrade-related reproduce/often not-require/test-plan
|
### Describe the bug
The volume is not created by user.

### To Reproduce
Steps to reproduce the behavior:
1. Install Harvester with 4 nodes
2. Create Image for VM creation
3. Create VM Network
4. Setup backup-target
5. Import Harvester to Rancher
6. Create RKE2 with 3 machines
7. Take backup for each RKE2 VM
8. Create VM `vm1`, write some data
9. Back up `vm1`
10. Perform upgrade
### Expected behavior
Volume should rebuild automatically or should be skip waiting
## Environment:
- Harvester ISO version: **v1.1.1** upgrade to **v1.1.2-rc1**
- Underlying Infrastructure (e.g. Baremetal with Dell PowerEdge R630): **Baremetal DL360G9 4 nodes**
### Additional context
regression test of 4 nodes upgrade
|
1.0
|
[BUG] Upgrade stuck in waiting the degraded volume that not start rebuilding - ### Describe the bug
The volume is not created by user.

### To Reproduce
Steps to reproduce the behavior:
1. Install Harvester with 4 nodes
2. Create Image for VM creation
3. Create VM Network
4. Setup backup-target
5. Import Harvester to Rancher
6. Create RKE2 with 3 machines
7. Take backup for each RKE2 VM
8. Create VM `vm1`, write some data
9. Back up `vm1`
10. Perform upgrade
### Expected behavior
Volume should rebuild automatically or should be skip waiting
## Environment:
- Harvester ISO version: **v1.1.1** upgrade to **v1.1.2-rc1**
- Underlying Infrastructure (e.g. Baremetal with Dell PowerEdge R630): **Baremetal DL360G9 4 nodes**
### Additional context
regression test of 4 nodes upgrade
|
test
|
upgrade stuck in waiting the degraded volume that not start rebuilding describe the bug the volume is not created by user to reproduce steps to reproduce the behavior install harvester with nodes create image for vm creation create vm network setup backup target import harvester to rancher create with machines take backup for each vm create vm write some data back up perform upgrade expected behavior volume should rebuild automatically or should be skip waiting environment harvester iso version upgrade to underlying infrastructure e g baremetal with dell poweredge baremetal nodes additional context regression test of nodes upgrade
| 1
|
734,797
| 25,364,524,840
|
IssuesEvent
|
2022-11-21 04:25:16
|
ppy/osu
|
https://api.github.com/repos/ppy/osu
|
closed
|
Deleting/updating beatmapset causes weird visual bug in the song select screen
|
priority:0 area:song-select
|
### Type
Cosmetic
### Bug description
Deleted beatmapset remains in the background overlapping the other ones.
### Screenshots or videos
https://user-images.githubusercontent.com/86934170/202422540-f253a8cb-6c61-4c07-81ea-e8dfe74411f0.mp4
### Version
1117
### Logs
nothing
|
1.0
|
Deleting/updating beatmapset causes weird visual bug in the song select screen - ### Type
Cosmetic
### Bug description
Deleted beatmapset remains in the background overlapping the other ones.
### Screenshots or videos
https://user-images.githubusercontent.com/86934170/202422540-f253a8cb-6c61-4c07-81ea-e8dfe74411f0.mp4
### Version
1117
### Logs
nothing
|
non_test
|
deleting updating beatmapset causes weird visual bug in the song select screen type cosmetic bug description deleted beatmapset remains in the background overlapping the other ones screenshots or videos version logs nothing
| 0
|
268,858
| 8,415,266,657
|
IssuesEvent
|
2018-10-13 12:55:45
|
ngageoint/hootenanny
|
https://api.github.com/repos/ngageoint/hootenanny
|
closed
|
Symmetric translation fixes
|
Category: Translation Priority: High Status: In Progress Type: Bug Type: Support in progress
|
Initial list to look at:
* Motor vehicle station
* Shopping complex
|
1.0
|
Symmetric translation fixes - Initial list to look at:
* Motor vehicle station
* Shopping complex
|
non_test
|
symmetric translation fixes initial list to look at motor vehicle station shopping complex
| 0
|
246,345
| 20,834,499,786
|
IssuesEvent
|
2022-03-20 00:54:48
|
backend-br/vagas
|
https://api.github.com/repos/backend-br/vagas
|
closed
|
[Remoto] Ruby on Rails Developer na TeamHub
|
PJ Ruby Remoto DevOps Exterior AWS Testes automatizados Scrum Git Rest Stale
|
## Descrição da vaga:
Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios.
Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/vagas/ruby-on-rails-developer-141052179?utm_source=github&utm_medium=backend-br-vagas&modal=open) com o pop-up personalizado de candidatura. 👋
<p>A <strong>TeamHub</strong> está buscando <strong>Ruby on Rails Developer</strong> em nível <strong>Sênior</strong> para compor sua equipe!</p>
<p>Somos um Hub de tecnologia que DESCOMPLICA a gestão de cultura para CONECTAR pessoas e MULTIPLICAR resultados.</p>
<p>O nosso foco é criar um ambiente onde a gestão da cultura seja fácil, acessível, dinâmica e de todos.</p>
## TeamHub:
<p>Somos um Hub de tecnologia que <strong>DESCOMPLICA </strong>a gestão de cultura para <strong>CONECTAR </strong>pessoas e <strong>MULTIPLICAR </strong>resultados.</p>
<p>O nosso foco é criar um ambiente onde a gestão da cultura seja fácil, acessível, dinâmica e de todos.</p><a href='https://coodesh.com/empresas/teamhub'>Veja mais no site</a>
## Habilidades:
- Ruby on Rails
- SCRUM
- Kanban
- Rails
## Local:
100% Remoto
## Requisitos:
- Sólida experiência no desenvolvendo com Ruby on Rails diretamente;
- Experiência em participar de times que utilizam metodologias ágeis de gestão de software, como Scrum e Kanban;
- Domínio da construção de aplicações de arquitetura monolítica;
- Domínio da construção de aplicações de arquitetura de microsserviços utilizando webservices REST (Rails-API);
- Domínio dos princípios de SOLID e Design Patterns;
- Domínio de versionamento de código utilizando GIT;
- Trabalhar bem em conjunto com a equipe;
- Lidar com problemas de desempenho e escalabilidade das aplicações;
- Garantir que todo código criado foi devidamente testado com cobertura de testes automatizados;
- Propor boas práticas de desenvolvimento;
## Diferenciais:
- Experiência com regras de negócio de Comércio Exterior;
- Já ter aplicado treinamentos internos ou externos às empresas em que trabalhou;
- Já ter palestrado em eventos de desenvolvimento sobre Ruby on Rails e tecnologias correlatas;
- Conhecimento em Vue.js;
- Ter experiência em trabalhar com Sidekiq e serviços de mensageria;
- Ter conhecimento de boas práticas de DevOps;
- Ter conhecimento da infra-estrutura oferecida pela Amazon AWS e similares;
## Benefícios:
- Vale refeição: R$26,86;
- Auxílio Home Office;
- Programa de promoção de saúde no Home Office;
- Programas de Benefícios com parcerias com Allya e Creditas;
- Plano de Saúde Sulamérica.
## Como se candidatar:
Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Ruby on Rails Developer na TeamHub](https://coodesh.com/vagas/ruby-on-rails-developer-141052179?utm_source=github&utm_medium=backend-br-vagas&modal=open)
Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação.
## Labels
#### Alocação
Remoto
#### Regime
PJ
#### Categoria
Back-End
|
1.0
|
[Remoto] Ruby on Rails Developer na TeamHub - ## Descrição da vaga:
Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios.
Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/vagas/ruby-on-rails-developer-141052179?utm_source=github&utm_medium=backend-br-vagas&modal=open) com o pop-up personalizado de candidatura. 👋
<p>A <strong>TeamHub</strong> está buscando <strong>Ruby on Rails Developer</strong> em nível <strong>Sênior</strong> para compor sua equipe!</p>
<p>Somos um Hub de tecnologia que DESCOMPLICA a gestão de cultura para CONECTAR pessoas e MULTIPLICAR resultados.</p>
<p>O nosso foco é criar um ambiente onde a gestão da cultura seja fácil, acessível, dinâmica e de todos.</p>
## TeamHub:
<p>Somos um Hub de tecnologia que <strong>DESCOMPLICA </strong>a gestão de cultura para <strong>CONECTAR </strong>pessoas e <strong>MULTIPLICAR </strong>resultados.</p>
<p>O nosso foco é criar um ambiente onde a gestão da cultura seja fácil, acessível, dinâmica e de todos.</p><a href='https://coodesh.com/empresas/teamhub'>Veja mais no site</a>
## Habilidades:
- Ruby on Rails
- SCRUM
- Kanban
- Rails
## Local:
100% Remoto
## Requisitos:
- Sólida experiência no desenvolvendo com Ruby on Rails diretamente;
- Experiência em participar de times que utilizam metodologias ágeis de gestão de software, como Scrum e Kanban;
- Domínio da construção de aplicações de arquitetura monolítica;
- Domínio da construção de aplicações de arquitetura de microsserviços utilizando webservices REST (Rails-API);
- Domínio dos princípios de SOLID e Design Patterns;
- Domínio de versionamento de código utilizando GIT;
- Trabalhar bem em conjunto com a equipe;
- Lidar com problemas de desempenho e escalabilidade das aplicações;
- Garantir que todo código criado foi devidamente testado com cobertura de testes automatizados;
- Propor boas práticas de desenvolvimento;
## Diferenciais:
- Experiência com regras de negócio de Comércio Exterior;
- Já ter aplicado treinamentos internos ou externos às empresas em que trabalhou;
- Já ter palestrado em eventos de desenvolvimento sobre Ruby on Rails e tecnologias correlatas;
- Conhecimento em Vue.js;
- Ter experiência em trabalhar com Sidekiq e serviços de mensageria;
- Ter conhecimento de boas práticas de DevOps;
- Ter conhecimento da infra-estrutura oferecida pela Amazon AWS e similares;
## Benefícios:
- Vale refeição: R$26,86;
- Auxílio Home Office;
- Programa de promoção de saúde no Home Office;
- Programas de Benefícios com parcerias com Allya e Creditas;
- Plano de Saúde Sulamérica.
## Como se candidatar:
Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Ruby on Rails Developer na TeamHub](https://coodesh.com/vagas/ruby-on-rails-developer-141052179?utm_source=github&utm_medium=backend-br-vagas&modal=open)
Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação.
## Labels
#### Alocação
Remoto
#### Regime
PJ
#### Categoria
Back-End
|
test
|
ruby on rails developer na teamhub descrição da vaga esta é uma vaga de um parceiro da plataforma coodesh ao candidatar se você terá acesso as informações completas sobre a empresa e benefícios fique atento ao redirecionamento que vai te levar para uma url com o pop up personalizado de candidatura 👋 a teamhub está buscando ruby on rails developer em nível sênior para compor sua equipe somos um hub de tecnologia que descomplica a gestão de cultura para conectar pessoas e multiplicar resultados o nosso foco é criar um ambiente onde a gestão da cultura seja fácil acessível dinâmica e de todos teamhub somos um hub de tecnologia que descomplica a gestão de cultura para conectar pessoas e multiplicar resultados o nosso foco é criar um ambiente onde a gestão da cultura seja fácil acessível dinâmica e de todos habilidades ruby on rails scrum kanban rails local remoto requisitos sólida experiência no desenvolvendo com ruby on rails diretamente experiência em participar de times que utilizam metodologias ágeis de gestão de software como scrum e kanban domínio da construção de aplicações de arquitetura monolítica domínio da construção de aplicações de arquitetura de microsserviços utilizando webservices rest rails api domínio dos princípios de solid e design patterns domínio de versionamento de código utilizando git trabalhar bem em conjunto com a equipe lidar com problemas de desempenho e escalabilidade das aplicações garantir que todo código criado foi devidamente testado com cobertura de testes automatizados propor boas práticas de desenvolvimento diferenciais experiência com regras de negócio de comércio exterior já ter aplicado treinamentos internos ou externos às empresas em que trabalhou já ter palestrado em eventos de desenvolvimento sobre ruby on rails e tecnologias correlatas conhecimento em vue js ter experiência em trabalhar com sidekiq e serviços de mensageria ter conhecimento de boas práticas de devops ter conhecimento da infra estrutura oferecida pela amazon aws e similares benefícios vale refeição r auxílio home office programa de promoção de saúde no home office programas de benefícios com parcerias com allya e creditas plano de saúde sulamérica como se candidatar candidatar se exclusivamente através da plataforma coodesh no link a seguir após candidatar se via plataforma coodesh e validar o seu login você poderá acompanhar e receber todas as interações do processo por lá utilize a opção pedir feedback entre uma etapa e outra na vaga que se candidatou isso fará com que a pessoa recruiter responsável pelo processo na empresa receba a notificação labels alocação remoto regime pj categoria back end
| 1
|
94,016
| 19,430,477,685
|
IssuesEvent
|
2021-12-21 11:20:00
|
nopSolutions/nopCommerce
|
https://api.github.com/repos/nopSolutions/nopCommerce
|
opened
|
Remove multi-step checkout
|
refactoring / source code
|
We already have one-page checkout that is used worldwide. In order to keep our solution simpler, let's remove the multi-step checkout functionality
|
1.0
|
Remove multi-step checkout - We already have one-page checkout that is used worldwide. In order to keep our solution simpler, let's remove the multi-step checkout functionality
|
non_test
|
remove multi step checkout we already have one page checkout that is used worldwide in order to keep our solution simpler let s remove the multi step checkout functionality
| 0
|
97,698
| 8,666,070,341
|
IssuesEvent
|
2018-11-29 02:14:06
|
CGCookie/retopoflow
|
https://api.github.com/repos/CGCookie/retopoflow
|
closed
|
Patches: Unhandled Exception Caught!
|
Ready for Testing bug can't replicate
|
Second time, moving contour loop toward end of source mesh and the snapping messed up.
-------------------------------------
```
Environment:
- RetopoFlow: 2.0.0
- Blender: 2.79.0 master 2018-03-22
- Platform: Windows, 10, 10.0.17134, AMD64, Intel64 Family 6 Model 158 Stepping 9, GenuineIntel
- GPU: NVIDIA Corporation, GeForce GTX 1070/PCIe/SSE2, 4.6.0 NVIDIA 391.35, 4.60 NVIDIA
- Timestamp: 2018-11-19 19:00:53.829657
Error Hash: ccd86f3e6e5e4b76b4b9f93201fc039c
Trace:
- EXCEPTION (<class 'ReferenceError'>): BMesh data of type BMVert has been removed
- .../rfmode\rfmesh_wrapper.py
- 000 0145:co() return self.l2w_point(self.bmelem.co)
- .../rfmode\rftool_patches.py
- 001 0608:<listcomp>() self.vis_bmverts = [(bmv, Point_to_Point2D(bmv.co)) for bmv in self.vis_verts if bmv and bmv not in self.sel_verts]
- 002 0608:prep_move() self.vis_bmverts = [(bmv, Point_to_Point2D(bmv.co)) for bmv in self.vis_verts if bmv and bmv not in self.sel_verts]
- .../common\profiler.py
- 003 0174:wrapper() return fn(*args, **kwargs)
- .../rfmode\rftool_patches.py
- 004 0561:modal_main() self.prep_move()
- .../rfmode\rftool.py
- 005 0110:modal() nmode = self.FSM[self.mode]()
- 006 0114:modal() raise e # passing on the exception to RFContext
- .../rfmode\rfcontext.py
- 007 0556:modal_main() self.tool.modal()
- 008 0469:modal() nmode = self.FSM[self.mode]()
```
|
1.0
|
Patches: Unhandled Exception Caught! - Second time, moving contour loop toward end of source mesh and the snapping messed up.
-------------------------------------
```
Environment:
- RetopoFlow: 2.0.0
- Blender: 2.79.0 master 2018-03-22
- Platform: Windows, 10, 10.0.17134, AMD64, Intel64 Family 6 Model 158 Stepping 9, GenuineIntel
- GPU: NVIDIA Corporation, GeForce GTX 1070/PCIe/SSE2, 4.6.0 NVIDIA 391.35, 4.60 NVIDIA
- Timestamp: 2018-11-19 19:00:53.829657
Error Hash: ccd86f3e6e5e4b76b4b9f93201fc039c
Trace:
- EXCEPTION (<class 'ReferenceError'>): BMesh data of type BMVert has been removed
- .../rfmode\rfmesh_wrapper.py
- 000 0145:co() return self.l2w_point(self.bmelem.co)
- .../rfmode\rftool_patches.py
- 001 0608:<listcomp>() self.vis_bmverts = [(bmv, Point_to_Point2D(bmv.co)) for bmv in self.vis_verts if bmv and bmv not in self.sel_verts]
- 002 0608:prep_move() self.vis_bmverts = [(bmv, Point_to_Point2D(bmv.co)) for bmv in self.vis_verts if bmv and bmv not in self.sel_verts]
- .../common\profiler.py
- 003 0174:wrapper() return fn(*args, **kwargs)
- .../rfmode\rftool_patches.py
- 004 0561:modal_main() self.prep_move()
- .../rfmode\rftool.py
- 005 0110:modal() nmode = self.FSM[self.mode]()
- 006 0114:modal() raise e # passing on the exception to RFContext
- .../rfmode\rfcontext.py
- 007 0556:modal_main() self.tool.modal()
- 008 0469:modal() nmode = self.FSM[self.mode]()
```
|
test
|
patches unhandled exception caught second time moving contour loop toward end of source mesh and the snapping messed up environment retopoflow blender master platform windows family model stepping genuineintel gpu nvidia corporation geforce gtx pcie nvidia nvidia timestamp error hash trace exception bmesh data of type bmvert has been removed rfmode rfmesh wrapper py co return self point self bmelem co rfmode rftool patches py self vis bmverts prep move self vis bmverts common profiler py wrapper return fn args kwargs rfmode rftool patches py modal main self prep move rfmode rftool py modal nmode self fsm modal raise e passing on the exception to rfcontext rfmode rfcontext py modal main self tool modal modal nmode self fsm
| 1
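The `ReferenceError: BMesh data of type BMVert has been removed` in the trace above is the classic stale-wrapper failure: a list comprehension dereferences `bmv.co` on vertices whose underlying mesh data was freed. This is a minimal Python sketch of the defensive pattern, not RetopoFlow's actual code; `ElemWrapper`, its fields, and `visible_coords` are hypothetical stand-ins (Blender's real check is the `is_valid` attribute on BMesh elements).

```python
class ElemWrapper:
    """Hypothetical stand-in for a wrapper around a mesh element whose
    underlying data can be invalidated elsewhere (e.g. on mesh rebuild)."""

    def __init__(self, elem):
        self._elem = elem

    def is_valid(self):
        # Model invalidation by the element being cleared; Blender's BMesh
        # exposes a comparable `is_valid` flag on wrapped elements.
        return self._elem is not None

    def invalidate(self):
        self._elem = None

    @property
    def co(self):
        # Raise the same error class the trace shows instead of crashing
        # deeper inside native code.
        if not self.is_valid():
            raise ReferenceError("element data has been removed")
        return self._elem["co"]


def visible_coords(wrappers):
    # Filtering on validity first avoids the crash seen in the trace, where
    # the comprehension dereferenced freed verts unconditionally.
    return [w.co for w in wrappers if w.is_valid()]
```

The fix the trace suggests is exactly this: guard the `prep_move` comprehension with a validity check rather than assuming every cached vertex is still alive.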
|
32,401
| 4,769,834,656
|
IssuesEvent
|
2016-10-26 13:46:50
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
sql: TestAmbiguousCommit is susceptible to deadlock
|
test-failure
|
Spotted in https://teamcity.cockroachdb.com/viewLog.html?tab=buildLog&logTab=tree&filter=debug&expand=all&buildId=35691#_focus=8201
Interesting part of the log:
```
[19:47:18][TestAmbiguousCommit] goroutine 2742 [chan receive, 1 minutes]:
[19:47:18][TestAmbiguousCommit] github.com/cockroachdb/cockroach/pkg/sql_test.TestAmbiguousCommit(0xc4209b6240)
[19:47:18][TestAmbiguousCommit] /go/src/github.com/cockroachdb/cockroach/pkg/sql/ambiguous_commit_test.go:140 +0x7e7
[19:47:18][TestAmbiguousCommit] testing.tRunner(0xc4209b6240, 0x19e3ff8)
[19:47:18][TestAmbiguousCommit] /usr/local/go/src/testing/testing.go:610 +0x81
[19:47:18][TestAmbiguousCommit] created by testing.(*T).Run
[19:47:18][TestAmbiguousCommit] /usr/local/go/src/testing/testing.go:646 +0x2ec
```
Also reproduced locally:
```
make stress PKG=./pkg/sql TESTS=TestAmbiguousCommit TESTTIMEOUT=30s
6 runs so far, 0 failures, over 5s
6 runs so far, 0 failures, over 10s
10 runs so far, 0 failures, over 15s
14 runs so far, 0 failures, over 20s
16 runs so far, 0 failures, over 25s
17 runs so far, 0 failures, over 30s
23 runs so far, 0 failures, over 35s
31 runs so far, 0 failures, over 40s
36 runs so far, 0 failures, over 45s
...
goroutine 7 [chan receive]:
github.com/cockroachdb/cockroach/pkg/sql_test.TestAmbiguousCommit(0xc42008c540)
/Users/tamird/src/go/src/github.com/cockroachdb/cockroach/pkg/sql/ambiguous_commit_test.go:140 +0x7e7
testing.tRunner(0xc42008c540, 0x54bade0)
/Users/tamird/src/go1.7/src/testing/testing.go:610 +0x81
created by testing.(*T).Run
/Users/tamird/src/go1.7/src/testing/testing.go:646 +0x2ec
```
So looks like there's a real problem with the test.
|
1.0
|
sql: TestAmbiguousCommit is susceptible to deadlock - Spotted in https://teamcity.cockroachdb.com/viewLog.html?tab=buildLog&logTab=tree&filter=debug&expand=all&buildId=35691#_focus=8201
Interesting part of the log:
```
[19:47:18][TestAmbiguousCommit] goroutine 2742 [chan receive, 1 minutes]:
[19:47:18][TestAmbiguousCommit] github.com/cockroachdb/cockroach/pkg/sql_test.TestAmbiguousCommit(0xc4209b6240)
[19:47:18][TestAmbiguousCommit] /go/src/github.com/cockroachdb/cockroach/pkg/sql/ambiguous_commit_test.go:140 +0x7e7
[19:47:18][TestAmbiguousCommit] testing.tRunner(0xc4209b6240, 0x19e3ff8)
[19:47:18][TestAmbiguousCommit] /usr/local/go/src/testing/testing.go:610 +0x81
[19:47:18][TestAmbiguousCommit] created by testing.(*T).Run
[19:47:18][TestAmbiguousCommit] /usr/local/go/src/testing/testing.go:646 +0x2ec
```
Also reproduced locally:
```
make stress PKG=./pkg/sql TESTS=TestAmbiguousCommit TESTTIMEOUT=30s
6 runs so far, 0 failures, over 5s
6 runs so far, 0 failures, over 10s
10 runs so far, 0 failures, over 15s
14 runs so far, 0 failures, over 20s
16 runs so far, 0 failures, over 25s
17 runs so far, 0 failures, over 30s
23 runs so far, 0 failures, over 35s
31 runs so far, 0 failures, over 40s
36 runs so far, 0 failures, over 45s
...
goroutine 7 [chan receive]:
github.com/cockroachdb/cockroach/pkg/sql_test.TestAmbiguousCommit(0xc42008c540)
/Users/tamird/src/go/src/github.com/cockroachdb/cockroach/pkg/sql/ambiguous_commit_test.go:140 +0x7e7
testing.tRunner(0xc42008c540, 0x54bade0)
/Users/tamird/src/go1.7/src/testing/testing.go:610 +0x81
created by testing.(*T).Run
/Users/tamird/src/go1.7/src/testing/testing.go:646 +0x2ec
```
So looks like there's a real problem with the test.
|
test
|
sql testambiguouscommit is susceptible to deadlock spotted in interesting part of the log goroutine github com cockroachdb cockroach pkg sql test testambiguouscommit go src github com cockroachdb cockroach pkg sql ambiguous commit test go testing trunner usr local go src testing testing go created by testing t run usr local go src testing testing go also reproduced locally make stress pkg pkg sql tests testambiguouscommit testtimeout runs so far failures over runs so far failures over runs so far failures over runs so far failures over runs so far failures over runs so far failures over runs so far failures over runs so far failures over runs so far failures over goroutine github com cockroachdb cockroach pkg sql test testambiguouscommit users tamird src go src github com cockroachdb cockroach pkg sql ambiguous commit test go testing trunner users tamird src src testing testing go created by testing t run users tamird src src testing testing go so looks like there s a real problem with the test
| 1
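The goroutine dump above shows the test parked on a bare channel receive (`[chan receive, 1 minutes]`): the sender never sends, so the test hangs until the suite times out. The original test is Go; this Python sketch only illustrates the same failure mode and its usual mitigation (a bounded wait instead of an unbounded receive). The helper name and timeout value are illustrative, not from the CockroachDB code.

```python
import queue
import threading


def run_with_result(work, timeout=1.0):
    """Run `work` in a background thread and wait for its result with a
    timeout, instead of blocking forever the way the deadlocked test's
    bare receive did. Raises queue.Empty if nothing arrives in time."""
    results = queue.Queue()

    def target():
        results.put(work())

    threading.Thread(target=target, daemon=True).start()
    # A bounded get turns a silent deadlock into a visible test failure.
    return results.get(timeout=timeout)
```

With a bounded wait, a worker that never reports surfaces as an immediate, diagnosable `queue.Empty` rather than a suite-wide timeout.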
|
624,148
| 19,687,890,151
|
IssuesEvent
|
2022-01-12 01:18:59
|
SETI/pds-opus
|
https://api.github.com/repos/SETI/pds-opus
|
opened
|
Translator for PDS API
|
A-Enhancement Effort 1 Hard Priority TBD B-OPUS Django
|
Once the PDS API is deployed, we will need to write a translator to convert PDS API queries into OPUS queries.
|
1.0
|
Translator for PDS API - Once the PDS API is deployed, we will need to write a translator to convert PDS API queries into OPUS queries.
|
non_test
|
translator for pds api once the pds api is deployed we will need to write a translator to convert pds api queries into opus queries
| 0
|
242,209
| 20,205,678,534
|
IssuesEvent
|
2022-02-11 20:03:20
|
ValveSoftware/steam-for-linux
|
https://api.github.com/repos/ValveSoftware/steam-for-linux
|
closed
|
Login window flickering and no text is displayed
|
Intel drivers Steam client Need Retest Distro Family: openSUSE
|
#### Your system information
* Steam client version (build number or date): Updated on 14 September 2019
* Distribution (e.g. Ubuntu): openSUSE Tumbleweed
* Opted into Steam client beta?: [Yes/No] Apparently, yes
* Have you checked for system updates?: [Yes/No] Yes
#### Please describe your issue in as much detail as possible:
Steam login window should be displayed okay.

#### Steps for reproducing this issue:
1. Run steam from the command line
[SteamConsoleOutput.txt](https://github.com/ValveSoftware/steam-for-linux/files/3612407/SteamConsoleOutput.txt)
|
1.0
|
Login window flickering and no text is displayed - #### Your system information
* Steam client version (build number or date): Updated on 14 September 2019
* Distribution (e.g. Ubuntu): openSUSE Tumbleweed
* Opted into Steam client beta?: [Yes/No] Apparently, yes
* Have you checked for system updates?: [Yes/No] Yes
#### Please describe your issue in as much detail as possible:
Steam login window should be displayed okay.

#### Steps for reproducing this issue:
1. Run steam from the command line
[SteamConsoleOutput.txt](https://github.com/ValveSoftware/steam-for-linux/files/3612407/SteamConsoleOutput.txt)
|
test
|
login window flickering and no text is displayed your system information steam client version build number or date updated on september distribution e g ubuntu opensuse tumbleweed opted into steam client beta apparently yes have you checked for system updates yes please describe your issue in as much detail as possible steam login window should be displayed okay steps for reproducing this issue run steam from the command line
| 1
|
112,572
| 9,593,363,046
|
IssuesEvent
|
2019-05-09 11:19:41
|
abraunegg/onedrive
|
https://api.github.com/repos/abraunegg/onedrive
|
closed
|
Implement to sync "All files not in a folder" when using sync_list
|
Feature Request PR Provided - Needs Testing
|
When using `sync_list`, individual files are not uploaded/downloaded in the `sync_dir` itself.
As per #488, an option exists in the Windows client which would allow this to occur.
This feature is to enable, by configuration (default disabled) to upload/download files in the `sync_dir` root which normally would be excluded by `sync_list` operations.
|
1.0
|
Implement to sync "All files not in a folder" when using sync_list - When using `sync_list` individual files are not uploaded/downloaded in the `sync_dir` itself.
As per #488, an option exists in the Windows client which would allow this to occur.
This feature is to enable, by configuration (default disabled) to upload/download files in the `sync_dir` root which normally would be excluded by `sync_list` operations.
|
test
|
implement to sync all files not in a folder when using sync list when using sync list individual files are not uploaded downloaded in the sync dir itself as per an option exists in the windows client which would allow this to occur this feature is to enable by configuration default disabled to upload download files in the sync dir root which normally would be excluded by sync list operations
| 1
|
283,548
| 24,552,030,221
|
IssuesEvent
|
2022-10-12 13:19:04
|
harvester/harvester
|
https://api.github.com/repos/harvester/harvester
|
closed
|
[FEATURE] enhance double check of VM's resource modification
|
kind/enhancement area/ui severity/2 need-reprioritize reproduce/always area/vm-lifecycle not-require/test-plan
|
### The problem
The prompt's `Save` button is clicked silently when the user presses `ESC` or clicks outside the prompt message; the modification is applied, but the UI is not updated.
### The solution
Add a Cancel option; any action other than clicking `Save` or `Save & Restart` should execute the cancel option.
### The Alternatives
Suppress keyboard inputs and freeze the view.
### Additional Context
https://user-images.githubusercontent.com/5169694/193790263-19379641-e282-445f-831f-8da039c15e77.mp4
|
1.0
|
[FEATURE] enhance double check of VM's resource modification - ### The problem
The prompted's `Save` button will be clicked silently when Press `ESC` or areas out of Prompt message, modification will be updated, but the UI won't.
### The solution
Add a Cancel option, and other behaviors (excepts click `Save` or `Save & Restart`) should execute cancel option.
### The Alternatives
Suppress Keyboard inputs and freeze view.
### Additional Context
https://user-images.githubusercontent.com/5169694/193790263-19379641-e282-445f-831f-8da039c15e77.mp4
|
test
|
enhance double check of vm s resource modification the problem the prompted s save button will be clicked silently when press esc or areas out of prompt message modification will be updated but the ui won t the solution add a cancel option and other behaviors excepts click save or save restart should execute cancel option the alternatives suppress keyboard inputs and freeze view additional context
| 1
|
4,577
| 11,382,975,379
|
IssuesEvent
|
2020-01-29 04:07:02
|
TerriaJS/terriajs
|
https://api.github.com/repos/TerriaJS/terriajs
|
closed
|
Model Architecture: Remove real servers from testing
|
New Model Architecture
|
Update any tests that call `loadMapItems()` to remove external dependencies in our tests. So far I've seen this in:
ArcGisMapServerCatalogItem
These tests do not fail if a bad URL is given; however, we should still remove the dependency, as it makes a call to the server as part of the test.
https://github.com/TerriaJS/terriajs/blob/0ce9a51c5ccf89bcdcdb19a810cbddf0bf991124/test/Models/ArcGisMapServerCatalogItemSpec.ts#L81
WebMapServiceCatalogItem
The WMS tests will fail if that URL falls over
https://github.com/TerriaJS/terriajs/blob/0ce9a51c5ccf89bcdcdb19a810cbddf0bf991124/test/Models/WebMapServiceCatalogItemSpec.ts#L36
We should mock out the response as we've done in `master`, non-mobx tests.
|
1.0
|
Model Architecture: Remove real servers from testing - Update any tests that call `loadMapItems()` to remove external dependencies in our tests, so far I've seen in
ArcGisMapServerCatalogItem
These tests do not fail if a bad URL is given, however we should still remove as it makes a call to the server as part of the test.
https://github.com/TerriaJS/terriajs/blob/0ce9a51c5ccf89bcdcdb19a810cbddf0bf991124/test/Models/ArcGisMapServerCatalogItemSpec.ts#L81
WebMapServiceCatalogItem
The WMS tests will fail if that URL falls over
https://github.com/TerriaJS/terriajs/blob/0ce9a51c5ccf89bcdcdb19a810cbddf0bf991124/test/Models/WebMapServiceCatalogItemSpec.ts#L36
We should mock out the response as we've done in `master`, non-mobx tests.
|
non_test
|
model architecture remove real servers from testing update any tests that call loadmapitems to remove external dependencies in our tests so far i ve seen in arcgismapservercatalogitem these tests do not fail if a bad url is given however we should still remove as it makes a call to the server as part of the test webmapservicecatalogitem the wms tests will fail if that url falls over we should mock out the response as we ve done in master non mobx tests
| 0
|
209,081
| 23,681,706,013
|
IssuesEvent
|
2022-08-28 22:12:11
|
meramsey/user-alias
|
https://api.github.com/repos/meramsey/user-alias
|
closed
|
dom4j-1.6.1.jar: 1 vulnerabilities (highest severity is: 7.5)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>dom4j-1.6.1.jar</b></p></summary>
<p>dom4j: the flexible XML framework for Java</p>
<p>Library home page: <a href="http://dom4j.org">http://dom4j.org</a></p>
<p>Path to dependency file: /extension/pom.xml</p>
<p>Path to vulnerable library: /repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/meramsey/user-alias/commit/f2be907aeaa8c6567433543468dc2c648ee24183">f2be907aeaa8c6567433543468dc2c648ee24183</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2018-1000632](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000632) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | dom4j-1.6.1.jar | Direct | 20040902.021138 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2018-1000632</summary>
### Vulnerable Library - <b>dom4j-1.6.1.jar</b></p>
<p>dom4j: the flexible XML framework for Java</p>
<p>Library home page: <a href="http://dom4j.org">http://dom4j.org</a></p>
<p>Path to dependency file: /extension/pom.xml</p>
<p>Path to vulnerable library: /repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **dom4j-1.6.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/meramsey/user-alias/commit/f2be907aeaa8c6567433543468dc2c648ee24183">f2be907aeaa8c6567433543468dc2c648ee24183</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
dom4j version prior to version 2.1.1 contains a CWE-91: XML Injection vulnerability in Class: Element. Methods: addElement, addAttribute that can result in an attacker tampering with XML documents through XML injection. This attack appears to be exploitable via an attacker specifying attributes or elements in the XML document. This vulnerability appears to have been fixed in 2.1.1 or later.
<p>Publish Date: 2018-08-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000632>CVE-2018-1000632</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632</a></p>
<p>Release Date: 2018-08-20</p>
<p>Fix Resolution: 20040902.021138</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
|
True
|
dom4j-1.6.1.jar: 1 vulnerabilities (highest severity is: 7.5) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>dom4j-1.6.1.jar</b></p></summary>
<p>dom4j: the flexible XML framework for Java</p>
<p>Library home page: <a href="http://dom4j.org">http://dom4j.org</a></p>
<p>Path to dependency file: /extension/pom.xml</p>
<p>Path to vulnerable library: /repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/meramsey/user-alias/commit/f2be907aeaa8c6567433543468dc2c648ee24183">f2be907aeaa8c6567433543468dc2c648ee24183</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2018-1000632](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000632) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | dom4j-1.6.1.jar | Direct | 20040902.021138 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2018-1000632</summary>
### Vulnerable Library - <b>dom4j-1.6.1.jar</b></p>
<p>dom4j: the flexible XML framework for Java</p>
<p>Library home page: <a href="http://dom4j.org">http://dom4j.org</a></p>
<p>Path to dependency file: /extension/pom.xml</p>
<p>Path to vulnerable library: /repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **dom4j-1.6.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/meramsey/user-alias/commit/f2be907aeaa8c6567433543468dc2c648ee24183">f2be907aeaa8c6567433543468dc2c648ee24183</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
dom4j version prior to version 2.1.1 contains a CWE-91: XML Injection vulnerability in Class: Element. Methods: addElement, addAttribute that can result in an attacker tampering with XML documents through XML injection. This attack appear to be exploitable via an attacker specifying attributes or elements in the XML document. This vulnerability appears to have been fixed in 2.1.1 or later.
<p>Publish Date: 2018-08-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000632>CVE-2018-1000632</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632</a></p>
<p>Release Date: 2018-08-20</p>
<p>Fix Resolution: 20040902.021138</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
|
non_test
|
jar vulnerabilities highest severity is vulnerable library jar the flexible xml framework for java library home page a href path to dependency file extension pom xml path to vulnerable library repository jar found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available high jar direct details cve vulnerable library jar the flexible xml framework for java library home page a href path to dependency file extension pom xml path to vulnerable library repository jar dependency hierarchy x jar vulnerable library found in head commit a href found in base branch master vulnerability details version prior to version contains a cwe xml injection vulnerability in class element methods addelement addattribute that can result in an attacker tampering with xml documents through xml injection this attack appear to be exploitable via an attacker specifying attributes or elements in the xml document this vulnerability appears to have been fixed in or later publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|