| column | dtype | values / lengths |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | string | 1 value |
| created_at | string | 19 chars |
| repo | string | 4 to 112 chars |
| repo_url | string | 33 to 141 chars |
| action | string | 3 values |
| title | string | 1 to 1.02k chars |
| labels | string | 4 to 1.54k chars |
| body | string | 1 to 262k chars |
| index | string | 17 values |
| text_combine | string | 95 to 262k chars |
| label | string | 2 values |
| text | string | 96 to 252k chars |
| binary_label | int64 | 0 to 1 |
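The last two columns appear to encode the same information. A minimal sketch of the apparent mapping between `label` and `binary_label` — inferred from the sample rows below, not from any documented metadata — is:

```python
# Apparent mapping between the "label" and "binary_label" columns,
# inferred from the rows shown in this dump ("test" -> 1,
# "non_test" -> 0). The mapping is an assumption, not documentation.
rows = [
    {"label": "non_test", "binary_label": 0},  # row 63,890
    {"label": "test", "binary_label": 1},      # row 70,454
    {"label": "non_test", "binary_label": 0},  # row 819,323
]

def to_binary_label(label: str) -> int:
    """Encode the two-class string label as the integer column."""
    return 1 if label == "test" else 0

for row in rows:
    assert to_binary_label(row["label"]) == row["binary_label"]
```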
**Row 63,890**
- **id:** 7,751,126,227
- **type:** IssuesEvent
- **created_at:** 2018-05-30 16:05:56
- **repo:** GPIG-Group-C/web-server-ui
- **repo_url:** https://api.github.com/repos/GPIG-Group-C/web-server-ui
- **action:** closed
- **title:** Create additional hidden webpage for triggering demonstration events on the server at key points in the demonstration
- **labels:** Web Server & UI design presentation

**body:** There needs to be a way to trigger the earthquake or the start of the data streams. A way to reset this without force closing all applications might also be useful.

**index:** 1.0

**text_combine:** Create additional hidden webpage for triggering demonstration events on the server at key points in the demonstration - There needs to be a way to trigger the earthquake or the start of the data streams. A way to reset this without force closing all applications might also be useful.

**label:** non_test

**text:** create additional hidden webpage for triggering demonstration events on the server at key points in the demonstration there needs to be a way to trigger the earthquake or the start of the data streams a way to reset this without force closing all applications might also be useful

**binary_label:** 0
**Row 70,454**
- **id:** 7,188,858,931
- **type:** IssuesEvent
- **created_at:** 2018-02-02 11:44:46
- **repo:** eclipse/smarthome
- **repo_url:** https://api.github.com/repos/eclipse/smarthome
- **action:** opened
- **title:** [Test failures] HostFragmentSupportTest
- **labels:** Automation Test

**body:**
```
Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 12.494 sec <<< FAILURE! - in org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest
asserting that the update of the fragment-host provides the resources correctly(org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest) Time elapsed: 9.442 sec <<< FAILURE!
java.lang.AssertionError:
Expected: is "Тригер 1 Обновен Етикет"
but: was "Trigger 1 Label"
at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:20)
at org.junit.Assert.assertThat(Assert.java:956)
at org.junit.Assert.assertThat(Assert.java:923)
at org.junit.Assert$assertThat.callStatic(Unknown Source)
at org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest$_asserting_that_the_update_of_the_fragment-host_provides_the_resources_correctly_closure12.doCall(HostFragmentSupportTest.groovy:316)
at org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest$_asserting_that_the_update_of_the_fragment-host_provides_the_resources_correctly_closure12.doCall(HostFragmentSupportTest.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:278)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1016)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:39)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:112)
at org.eclipse.smarthome.test.OSGiTest.waitForAssert(OSGiTest.groovy:252)
at sun.reflect.GeneratedMethodAccessor17.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrap.invoke(PogoMetaMethodSite.java:187)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:56)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:49)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:61)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:153)
at org.eclipse.smarthome.test.OSGiTest.waitForAssert(OSGiTest.groovy:223)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:207)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:56)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:49)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:133)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:149)
at org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest.asserting that the update of the fragment-host provides the resources correctly(HostFragmentSupportTest.groovy:311)
asserting that the installation of the fragment-host provides the resources correctly(org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest) Time elapsed: 3.034 sec <<< FAILURE!
java.lang.AssertionError:
Expected: is <2>
but: was <4>
at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:20)
at org.junit.Assert.assertThat(Assert.java:956)
at org.junit.Assert.assertThat(Assert.java:923)
at org.junit.Assert$assertThat.callStatic(Unknown Source)
at org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest$_asserting_that_the_installation_of_the_fragment-host_provides_the_resources_correctly_closure2.doCall(HostFragmentSupportTest.groovy:127)
at org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest$_asserting_that_the_installation_of_the_fragment-host_provides_the_resources_correctly_closure2.doCall(HostFragmentSupportTest.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:278)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1016)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:39)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:54)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:112)
at org.eclipse.smarthome.test.OSGiTest.waitForAssert(OSGiTest.groovy:252)
at sun.reflect.GeneratedMethodAccessor17.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrap.invoke(PogoMetaMethodSite.java:187)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:56)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:49)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:61)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:153)
at org.eclipse.smarthome.test.OSGiTest.waitForAssert(OSGiTest.groovy:223)
at sun.reflect.GeneratedMethodAccessor29.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:207)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:56)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:49)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:133)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:149)
at org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest.asserting that the installation of the fragment-host provides the resources correctly(HostFragmentSupportTest.groovy:126)
```
Source: https://travis-ci.org/eclipse/smarthome/builds/336491495?utm_source=github_status&utm_medium=notification
Full log: [log.txt](https://github.com/eclipse/smarthome/files/1689195/log.txt)
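Most frames in both traces pass through `OSGiTest.waitForAssert`, a polling assertion helper: it re-runs a closure until it passes or a timeout expires, then rethrows the last failure, which is what ends up in the build log. A rough re-sketch of that pattern — with hypothetical names and default values, not the project's actual code — is:

```python
import time

def wait_for_assert(assertion, timeout=10.0, interval=0.05):
    """Re-run `assertion` until it stops raising AssertionError or the
    timeout expires; on timeout, rethrow the last failure.

    Sketch of the polling pattern behind the OSGiTest.waitForAssert
    frames in the trace above; names and defaults are illustrative only.
    """
    deadline = time.monotonic() + timeout
    while True:
        try:
            return assertion()
        except AssertionError:
            if time.monotonic() >= deadline:
                raise  # the failure that surfaces in the build log
            time.sleep(interval)

# Usage: the assertion fails twice, then passes once state catches up.
calls = {"n": 0}
def eventually_true():
    calls["n"] += 1
    assert calls["n"] >= 3
wait_for_assert(eventually_true, timeout=1.0, interval=0.01)
assert calls["n"] == 3
```

When such a wait still ends in `AssertionError`, as here, the condition never became true within the timeout; the Bulgarian-vs-English label mismatch above suggests the localized resources were never provided at all, rather than a timing race.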
**index:** 1.0

**text_combine:**
[Test failures] HostFragmentSupportTest - ```
Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 12.494 sec <<< FAILURE! - in org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest
asserting that the update of the fragment-host provides the resources correctly(org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest) Time elapsed: 9.442 sec <<< FAILURE!
java.lang.AssertionError:
Expected: is "Тригер 1 Обновен Етикет"
but: was "Trigger 1 Label"
at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:20)
at org.junit.Assert.assertThat(Assert.java:956)
at org.junit.Assert.assertThat(Assert.java:923)
at org.junit.Assert$assertThat.callStatic(Unknown Source)
at org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest$_asserting_that_the_update_of_the_fragment-host_provides_the_resources_correctly_closure12.doCall(HostFragmentSupportTest.groovy:316)
at org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest$_asserting_that_the_update_of_the_fragment-host_provides_the_resources_correctly_closure12.doCall(HostFragmentSupportTest.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:278)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1016)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:39)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:112)
at org.eclipse.smarthome.test.OSGiTest.waitForAssert(OSGiTest.groovy:252)
at sun.reflect.GeneratedMethodAccessor17.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrap.invoke(PogoMetaMethodSite.java:187)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:56)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:49)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:61)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:153)
at org.eclipse.smarthome.test.OSGiTest.waitForAssert(OSGiTest.groovy:223)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:207)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:56)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:49)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:133)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:149)
at org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest.asserting that the update of the fragment-host provides the resources correctly(HostFragmentSupportTest.groovy:311)
asserting that the installation of the fragment-host provides the resources correctly(org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest) Time elapsed: 3.034 sec <<< FAILURE!
java.lang.AssertionError:
Expected: is <2>
but: was <4>
at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:20)
at org.junit.Assert.assertThat(Assert.java:956)
at org.junit.Assert.assertThat(Assert.java:923)
at org.junit.Assert$assertThat.callStatic(Unknown Source)
at org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest$_asserting_that_the_installation_of_the_fragment-host_provides_the_resources_correctly_closure2.doCall(HostFragmentSupportTest.groovy:127)
at org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest$_asserting_that_the_installation_of_the_fragment-host_provides_the_resources_correctly_closure2.doCall(HostFragmentSupportTest.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:278)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1016)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:39)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:54)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:112)
at org.eclipse.smarthome.test.OSGiTest.waitForAssert(OSGiTest.groovy:252)
at sun.reflect.GeneratedMethodAccessor17.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrap.invoke(PogoMetaMethodSite.java:187)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:56)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:49)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:61)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:153)
at org.eclipse.smarthome.test.OSGiTest.waitForAssert(OSGiTest.groovy:223)
at sun.reflect.GeneratedMethodAccessor29.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:207)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:56)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:49)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:133)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:149)
at org.eclipse.smarthome.automation.integration.test.HostFragmentSupportTest.asserting that the installation of the fragment-host provides the resources correctly(HostFragmentSupportTest.groovy:126)
```
Source: https://travis-ci.org/eclipse/smarthome/builds/336491495?utm_source=github_status&utm_medium=notification
Full log: [log.txt](https://github.com/eclipse/smarthome/files/1689195/log.txt)
**label:** test

**text:**
hostfragmentsupporttest tests run failures errors skipped time elapsed sec failure in org eclipse smarthome automation integration test hostfragmentsupporttest asserting that the update of the fragment host provides the resources correctly org eclipse smarthome automation integration test hostfragmentsupporttest time elapsed sec failure java lang assertionerror expected is тригер обновен етикет but was trigger label at org hamcrest matcherassert assertthat matcherassert java at org junit assert assertthat assert java at org junit assert assertthat assert java at org junit assert assertthat callstatic unknown source at org eclipse smarthome automation integration test hostfragmentsupporttest asserting that the update of the fragment host provides the resources correctly docall hostfragmentsupporttest groovy at org eclipse smarthome automation integration test hostfragmentsupporttest asserting that the update of the fragment host provides the resources correctly docall hostfragmentsupporttest groovy at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org codehaus groovy reflection cachedmethod invoke cachedmethod java at groovy lang metamethod domethodinvoke metamethod java at org codehaus groovy runtime metaclass closuremetaclass invokemethod closuremetaclass java at groovy lang metaclassimpl invokemethod metaclassimpl java at org codehaus groovy runtime callsite pogometaclasssite call pogometaclasssite java at org codehaus groovy runtime callsite callsitearray defaultcall callsitearray java at org codehaus groovy runtime callsite abstractcallsite call abstractcallsite java at org codehaus groovy runtime callsite abstractcallsite call abstractcallsite java at org eclipse smarthome test osgitest waitforassert osgitest groovy at sun reflect invoke unknown 
source at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org codehaus groovy runtime callsite pogometamethodsite pogocachedmethodsitenounwrap invoke pogometamethodsite java at org codehaus groovy runtime callsite pogometamethodsite callcurrent pogometamethodsite java at org codehaus groovy runtime callsite callsitearray defaultcallcurrent callsitearray java at org codehaus groovy runtime callsite pogometamethodsite callcurrent pogometamethodsite java at org codehaus groovy runtime callsite abstractcallsite callcurrent abstractcallsite java at org eclipse smarthome test osgitest waitforassert osgitest groovy at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org codehaus groovy runtime callsite pogometamethodsite pogocachedmethodsitenounwrapnocoerce invoke pogometamethodsite java at org codehaus groovy runtime callsite pogometamethodsite callcurrent pogometamethodsite java at org codehaus groovy runtime callsite callsitearray defaultcallcurrent callsitearray java at org codehaus groovy runtime callsite abstractcallsite callcurrent abstractcallsite java at org codehaus groovy runtime callsite abstractcallsite callcurrent abstractcallsite java at org eclipse smarthome automation integration test hostfragmentsupporttest asserting that the update of the fragment host provides the resources correctly hostfragmentsupporttest groovy asserting that the installation of the fragment host provides the resources correctly org eclipse smarthome automation integration test hostfragmentsupporttest time elapsed sec failure java lang assertionerror expected is but was at org hamcrest matcherassert assertthat matcherassert java at org junit assert assertthat assert java at org junit assert 
assertthat assert java at org junit assert assertthat callstatic unknown source at org eclipse smarthome automation integration test hostfragmentsupporttest asserting that the installation of the fragment host provides the resources correctly docall hostfragmentsupporttest groovy at org eclipse smarthome automation integration test hostfragmentsupporttest asserting that the installation of the fragment host provides the resources correctly docall hostfragmentsupporttest groovy at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org codehaus groovy reflection cachedmethod invoke cachedmethod java at groovy lang metamethod domethodinvoke metamethod java at org codehaus groovy runtime metaclass closuremetaclass invokemethod closuremetaclass java at groovy lang metaclassimpl invokemethod metaclassimpl java at org codehaus groovy runtime callsite pogometaclasssite call pogometaclasssite java at org codehaus groovy runtime callsite callsitearray defaultcall callsitearray java at org codehaus groovy runtime callsite pogometaclasssite call pogometaclasssite java at org codehaus groovy runtime callsite abstractcallsite call abstractcallsite java at org eclipse smarthome test osgitest waitforassert osgitest groovy at sun reflect invoke unknown source at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org codehaus groovy runtime callsite pogometamethodsite pogocachedmethodsitenounwrap invoke pogometamethodsite java at org codehaus groovy runtime callsite pogometamethodsite callcurrent pogometamethodsite java at org codehaus groovy runtime callsite callsitearray defaultcallcurrent callsitearray java at org codehaus groovy runtime callsite pogometamethodsite callcurrent 
pogometamethodsite java at org codehaus groovy runtime callsite abstractcallsite callcurrent abstractcallsite java at org eclipse smarthome test osgitest waitforassert osgitest groovy at sun reflect invoke unknown source at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org codehaus groovy runtime callsite pogometamethodsite pogocachedmethodsitenounwrapnocoerce invoke pogometamethodsite java at org codehaus groovy runtime callsite pogometamethodsite callcurrent pogometamethodsite java at org codehaus groovy runtime callsite callsitearray defaultcallcurrent callsitearray java at org codehaus groovy runtime callsite abstractcallsite callcurrent abstractcallsite java at org codehaus groovy runtime callsite abstractcallsite callcurrent abstractcallsite java at org eclipse smarthome automation integration test hostfragmentsupporttest asserting that the installation of the fragment host provides the resources correctly hostfragmentsupporttest groovy source full log
**binary_label:** 1
**Row 819,323**
- **id:** 30,728,857,898
- **type:** IssuesEvent
- **created_at:** 2023-07-27 22:25:28
- **repo:** UNopenGIS/7
- **repo_url:** https://api.github.com/repos/UNopenGIS/7
- **action:** closed
- **title:** Smart Maps Dojo process 1
- **labels:** priority/MAY

**body:**
# Generation 1
The Smart Maps Dojo process is a process to run a sustaining community of practice about Smart Maps. This is a first prototype concept generation.
## 1. 示範 demonstration
the mentor demonstrates smart maps application.
## 2. 解説 explanation
the mentor provides explanation about the fundamental principles and techniques of smart maps development.
## 3. 実践 practice
the mentee practices smart maps development under the mentor's guidance, receiving support and advice.
## 4. 助言 feedback
the mentor evaluates the mentee's work, providing specific feedback. Feedback is crucial for progress.
## 5. 学習 learning
the mentee learns smart maps development skills through the mentor's guidance and feedback. Documentation is often developed in this learning process so that a mentee becomes a mentor.
## 6. 演習 exercise
the mentee applies smart maps development skills in practical projects or scenarios.
# Generation 2
The Smart Maps Dojo process is a comprehensive approach to building a sustainable community of practice focused on Smart Maps. The process is outlined below:
示範 (demonstration):
The mentor demonstrates how smart maps are used, showcasing their features and potential applications.
解説 (explanation):
The mentor gives a detailed explanation of the fundamental principles and techniques of smart maps development, ensuring a solid understanding among learners.
実践 (practice):
Under the mentor's guidance, learners actively work through practical exercises related to smart maps development, honing their skills while receiving hands-on support and advice.
助言 (feedback):
The mentor evaluates the learners' work and provides specific feedback. This feedback is essential for learners' progress and for improving their smart maps development abilities.
学習 (learning):
Through the mentor's guidance and feedback, learners continue to learn and improve their smart maps development skills. Documentation to support the learning process is often produced in this phase, and learners gradually become able to take on the mentor role themselves.
演習 (exercise):
Learners apply the smart maps development skills they have acquired to practical projects and scenarios, accumulating real-world experience and refining their abilities.
By following this process, the Smart Maps Dojo fosters a continuous cycle of learning, application, and growth, cultivating a community of practice in which participants can hone their smart maps expertise.
**index:** 1.0

**text_combine:**
Smart Maps Dojo process 1 - # Generation 1
The Smart Maps Dojo process is a process to run a sustaining community of practice about Smart Maps. This is a first prototype concept generation.
## 1. 示範 demonstration
the mentor demonstrates smart maps application.
## 2. 解説 explanation
the mentor provides explanation about the fundamental principles and techniques of smart maps development.
## 3. 実践 practice
the mentee practices smart maps development under the mentor's guidance, receiving support and advice.
## 4. 助言 feedback
the mentor evaluates the mentee's work, providing specific feedback. Feedback is crucial for progress.
## 5. 学習 learning
the mentee learns smart maps development skills through mentor's guidance and feedback. Documentations are often developed in this learning process so that a mentee become a mentor.
## 6. 演習 exercise
the mentee applies smart maps development skills in practical projects or scenarios.
# Generation 2
スマート地図道場のプロセスは、スマート地図に焦点を当てた持続可能な実践コミュニティを構築するための包括的なアプローチです。以下にプロセスの概要を示します:
示範(デモンストレーション):
メンターはスマート地図の活用方法を実演し、その機能と潜在的な活用方法を披露します。
解説:
メンターはスマート地図開発の基本原則と技術について詳細な解説を行い、学習者の間で確固たる理解を確保します。
実践:
学習者はメンターの指導の下、スマート地図開発に関連する実践的な演習に積極的に取り組みます。彼らは実践的なサポートと助言を受けながら、スキルの磨きを行います。
助言:
メンターは学習者の作業を評価し、具体的なフィードバックを提供します。このフィードバックは、学習者の進歩とスマート地図開発能力の向上において非常に重要です。
学習:
学習者はメンターの指導とフィードバックを通じて、スマート地図開発のスキルを学び、向上させ続けます。このフェーズでは、学習プロセスをサポートするためのドキュメンテーションの作成が行われることが多く、学習者は次第にメンターとしての役割を果たすことができます。
演習:
学習者は習得したスマート地図開発のスキルを実践的なプロジェクトやシナリオに適用します。これにより、実際の経験を積み重ね、能力を磨くことができます。
このプロセスに従うことで、スマート地図道場は学習、応用、成長の連続的なサイクルを促進し、参加者がスマート地図の専門知識を磨くことができる実践コミュニティを育成します。
**label:** non_test

**text:**
smart maps dojo process generation the smart maps dojo process is a process to run a sustaining community of practice about smart maps this is a first prototype concept generation 示範 demonstration the mentor demonstrates smart maps application 解説 explanation the mentor provides explanation about the fundamental principles and techniques of smart maps development 実践 practice the mentee practices smart maps development under the mentor s guidance receiving support and advice 助言 feedback the mentor evaluates the mentee s work providing specific feedback feedback is crucial for progress 学習 learning the mentee learns smart maps development skills through mentor s guidance and feedback documentations are often developed in this learning process so that a mentee become a mentor 演習 exercise the mentee applies smart maps development skills in practical projects or scenarios generation スマート地図道場のプロセスは、スマート地図に焦点を当てた持続可能な実践コミュニティを構築するための包括的なアプローチです。以下にプロセスの概要を示します: 示範(デモンストレーション): メンターはスマート地図の活用方法を実演し、その機能と潜在的な活用方法を披露します。 解説: メンターはスマート地図開発の基本原則と技術について詳細な解説を行い、学習者の間で確固たる理解を確保します。 実践: 学習者はメンターの指導の下、スマート地図開発に関連する実践的な演習に積極的に取り組みます。彼らは実践的なサポートと助言を受けながら、スキルの磨きを行います。 助言: メンターは学習者の作業を評価し、具体的なフィードバックを提供します。このフィードバックは、学習者の進歩とスマート地図開発能力の向上において非常に重要です。 学習: 学習者はメンターの指導とフィードバックを通じて、スマート地図開発のスキルを学び、向上させ続けます。このフェーズでは、学習プロセスをサポートするためのドキュメンテーションの作成が行われることが多く、学習者は次第にメンターとしての役割を果たすことができます。 演習: 学習者は習得したスマート地図開発のスキルを実践的なプロジェクトやシナリオに適用します。これにより、実際の経験を積み重ね、能力を磨くことができます。 このプロセスに従うことで、スマート地図道場は学習、応用、成長の連続的なサイクルを促進し、参加者がスマート地図の専門知識を磨くことができる実践コミュニティを育成します。
**binary_label:** 0
**Row 4,451**
- **id:** 2,610,094,301
- **type:** IssuesEvent
- **created_at:** 2015-02-26 18:28:29
- **repo:** chrsmith/dsdsdaadf
- **repo_url:** https://api.github.com/repos/chrsmith/dsdsdaadf
- **action:** opened
- **title:** 深圳红蓝光祛痘效果
- **labels:** auto-migrated Priority-Medium Type-Defect

**body:**
```
深圳红蓝光祛痘效果【深圳韩方科颜全国热线400-869-1818,24小
时QQ4008691818】深圳韩方科颜专业祛痘连锁机构,机构以韩国��
�方——韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩�
��科颜专业祛痘连锁机构,采用韩国秘方配合专业“不反弹”
健康祛痘技术并结合先进“先进豪华彩光”仪,开创国内专��
�治疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸上的�
��痘。
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 8:14
**index:** 1.0

**text_combine:**
深圳红蓝光祛痘效果 - ```
深圳红蓝光祛痘效果【深圳韩方科颜全国热线400-869-1818,24小
时QQ4008691818】深圳韩方科颜专业祛痘连锁机构,机构以韩国��
�方——韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩�
��科颜专业祛痘连锁机构,采用韩国秘方配合专业“不反弹”
健康祛痘技术并结合先进“先进豪华彩光”仪,开创国内专��
�治疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸上的�
��痘。
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 8:14
**label:** non_test

**text:**
深圳红蓝光祛痘效果 深圳红蓝光祛痘效果【 , 】深圳韩方科颜专业祛痘连锁机构,机构以韩国�� �方——韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩� ��科颜专业祛痘连锁机构,采用韩国秘方配合专业“不反弹” 健康祛痘技术并结合先进“先进豪华彩光”仪,开创国内专�� �治疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸上的� ��痘。 original issue reported on code google com by szft com on may at
**binary_label:** 0
**Row 269,528**
- **id:** 23,447,494,777
- **type:** IssuesEvent
- **created_at:** 2022-08-15 21:17:08
- **repo:** prysmaticlabs/prysm
- **repo_url:** https://api.github.com/repos/prysmaticlabs/prysm
- **action:** closed
- **title:** Abstracting Time and Tickers from Prysm's Core Implementation
- **labels:** Enhancement Discussion Priority: Low Sync E2E Tests

**body:**
# 💎 Issue
Thanks @kasey for bringing this up in our conversations.
### Background
Ethereum's consensus protocol, [Gasper](https://arxiv.org/abs/2003.03052), is a synchronous one. This means time is a critical part of its functionality and security. As Prysm implements the [specification](https://github.com/ethereum/consensus-specs), we use time quite a lot. Unfortunately, using the actual `time` package from the standard library, dealing with timestamps and tickers becomes tricky when writing tests and simulating a blockchain environment. Today, we use the standard library for a lot of our tests. **However**, the consensus protocol at a high-level only has the concept of time ticks, which are abstract intervals, and not specific periods of seconds/milliseconds/etc. Even in places where the specification mentions the concept of seconds, it is in the context of intervals, which can be abstracted.
The goal of this issue is to consider a revamp in Prysm where we abstract the concept of time as much as possible from the chain, such that we can have full, programmatic control of "ticks". This allows us to do really interesting things, such as:
1. Warp debugging: we could easily advance through a chain in the future using programmatic ticks in our unit tests
2. Lightning-fast, integration tests and chain simulations: today, our end-to-end test suite uses actual slot ticks from a ticker in the standard library to advance a chain state. We could instead advance through e2e tests as fast as our machines will allow if we have abstract time tickers
3. Unit tests become easier to write: today, many tests in Prysm do manual operations with time, such as adding, subtracting, dealing with milliseconds/seconds, etc. This detracts from the main logic we care about testing.
### Description
A deliverable that resolves this is a design document which analyzes all places where Prysm relies on the actual `time` package from the standard library, and understand how that code could be refactored to abstract the concept of time. We do this today in many places, as we define our own `SlotTicker` inside of the `slots` package in [github.com/prysmaticlabs/prysm/time/slots](https://github.com/prysmaticlabs/prysm/time/slots). However, its usage is not uniform. There are many place we perform concrete time operations, especially in unit tests, in the validator client, and parts of the runtime such as the `powchain`, `blockchain`, `operations` packages and even in places such as caches.
Messing with time in Prysm is [risky business](https://medium.com/prysmatic-labs/eth2-medalla-testnet-incident-f7fbc3cc934a). This issue is non-trivial, and requires significant risk analysis (it is possible this is not worth the time and effort). At the very least, making use of our slot ticker more uniform would be a good time investment, as it could allow us to write integration tests and simulations where we can advance a chain as fast as a machine could go.
This issue is open for discussion.
|
1.0
|
Abstracting Time and Tickers from Prysm's Core Implementation - # 💎 Issue
Thanks @kasey for bringing this up in our conversations.
### Background
Ethereum's consensus protocol, [Gasper](https://arxiv.org/abs/2003.03052), is a synchronous one. This means time is a critical part of its functionality and security. As Prysm implements the [specification](https://github.com/ethereum/consensus-specs), we use time quite a lot. Unfortunately, using the actual `time` package from the standard library, dealing with timestamps and tickers becomes tricky when writing tests and simulating a blockchain environment. Today, we use the standard library for a lot of our tests. **However**, the consensus protocol at a high-level only has the concept of time ticks, which are abstract intervals, and not specific periods of seconds/milliseconds/etc. Even in places where the specification mentions the concept of seconds, it is in the context of intervals, which can be abstracted.
The goal of this issue is to consider a revamp in Prysm where we abstract the concept of time as much as possible from the chain, such that we can have full, programmatic control of "ticks". This allows us to do really interesting things, such as:
1. Warp debugging: we could easily advance through a chain in the future using programmatic ticks in our unit tests
2. Lightning-fast, integration tests and chain simulations: today, our end-to-end test suite uses actual slot ticks from a ticker in the standard library to advance a chain state. We could instead advance through e2e tests as fast as our machines will allow if we have abstract time tickers
3. Unit tests become easier to write: today, many tests in Prysm do manual operations with time, such as adding, subtracting, dealing with milliseconds/seconds, etc. This detracts from the main logic we care about testing.
### Description
A deliverable that resolves this is a design document which analyzes all places where Prysm relies on the actual `time` package from the standard library, and understand how that code could be refactored to abstract the concept of time. We do this today in many places, as we define our own `SlotTicker` inside of the `slots` package in [github.com/prysmaticlabs/prysm/time/slots](https://github.com/prysmaticlabs/prysm/time/slots). However, its usage is not uniform. There are many place we perform concrete time operations, especially in unit tests, in the validator client, and parts of the runtime such as the `powchain`, `blockchain`, `operations` packages and even in places such as caches.
Messing with time in Prysm is [risky business](https://medium.com/prysmatic-labs/eth2-medalla-testnet-incident-f7fbc3cc934a). This issue is non-trivial, and requires significant risk analysis (it is possible this is not worth the time and effort). At the very least, making use of our slot ticker more uniform would be a good time investment, as it could allow us to write integration tests and simulations where we can advance a chain as fast as a machine could go.
This issue is open for discussion.
|
test
|
abstracting time and tickers from prysm s core implementation 💎 issue thanks kasey for bringing this up in our conversations background ethereum s consensus protocol is a synchronous one this means time is a critical part of its functionality and security as prysm implements the we use time quite a lot unfortunately using the actual time package from the standard library dealing with timestamps and tickers becomes tricky when writing tests and simulating a blockchain environment today we use the standard library for a lot of our tests however the consensus protocol at a high level only has the concept of time ticks which are abstract intervals and not specific periods of seconds milliseconds etc even in places where the specification mentions the concept of seconds it is in the context of intervals which can be abstracted the goal of this issue is to consider a revamp in prysm where we abstract the concept of time as much as possible from the chain such that we can have full programmatic control of ticks this allows us to do really interesting things such as warp debugging we could easily advance through a chain in the future using programmatic ticks in our unit tests lightning fast integration tests and chain simulations today our end to end test suite uses actual slot ticks from a ticker in the standard library to advance a chain state we could instead advance through tests as fast as our machines will allow if we have abstract time tickers unit tests become easier to write today many tests in prysm do manual operations with time such as adding subtracting dealing with milliseconds seconds etc this detracts from the main logic we care about testing description a deliverable that resolves this is a design document which analyzes all places where prysm relies on the actual time package from the standard library and understand how that code could be refactored to abstract the concept of time we do this today in many places as we define our own slotticker inside of 
the slots package in however its usage is not uniform there are many place we perform concrete time operations especially in unit tests in the validator client and parts of the runtime such as the powchain blockchain operations packages and even in places such as caches messing with time in prysm is this issue is non trivial and requires significant risk analysis it is possible this is not worth the time and effort at the very least making use of our slot ticker more uniform would be a good time investment as it could allow us to write integration tests and simulations where we can advance a chain as fast as a machine could go this issue is open for discussion
| 1
|
25,430
| 11,172,290,923
|
IssuesEvent
|
2019-12-29 04:31:19
|
christian-cleberg/lets-debug
|
https://api.github.com/repos/christian-cleberg/lets-debug
|
opened
|
CVE-2019-11358 (Medium) detected in jquery-3.3.1.min.js
|
security vulnerability
|
## CVE-2019-11358 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-3.3.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/lets-debug/articles/2019-01/full-stack-web.html</p>
<p>Path to vulnerable library: /lets-debug/articles/2019-01/full-stack-web.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-3.3.1.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/christian-cleberg/lets-debug/commit/f5fb9c110dc20e6471adf62da719b9d2c9d32612">f5fb9c110dc20e6471adf62da719b9d2c9d32612</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype.
<p>Publish Date: 2019-04-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-11358>CVE-2019-11358</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358</a></p>
<p>Release Date: 2019-04-20</p>
<p>Fix Resolution: 3.4.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-11358 (Medium) detected in jquery-3.3.1.min.js - ## CVE-2019-11358 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-3.3.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/lets-debug/articles/2019-01/full-stack-web.html</p>
<p>Path to vulnerable library: /lets-debug/articles/2019-01/full-stack-web.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-3.3.1.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/christian-cleberg/lets-debug/commit/f5fb9c110dc20e6471adf62da719b9d2c9d32612">f5fb9c110dc20e6471adf62da719b9d2c9d32612</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype.
<p>Publish Date: 2019-04-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-11358>CVE-2019-11358</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358</a></p>
<p>Release Date: 2019-04-20</p>
<p>Fix Resolution: 3.4.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file tmp ws scm lets debug articles full stack web html path to vulnerable library lets debug articles full stack web html dependency hierarchy x jquery min js vulnerable library found in head commit a href vulnerability details jquery before as used in drupal backdrop cms and other products mishandles jquery extend true because of object prototype pollution if an unsanitized source object contained an enumerable proto property it could extend the native object prototype publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
118,449
| 9,990,906,948
|
IssuesEvent
|
2019-07-11 09:50:35
|
chameleon-system/chameleon-system
|
https://api.github.com/repos/chameleon-system/chameleon-system
|
closed
|
Error-prone default portal selection
|
Status: Test Type: Bug
|
**Describe the bug**
If an action needs an active portal and no portal was set for example actions running in cms backend. The cms tries to get an default portal. Actually its the portal with the lowest id.
Our CMS was delivered with one Portal wit id "1". If you add a new portal with a lower id for example 01dsfs... the new portal is the default portal.
**Affected version(s)**
6.2
**To Reproduce**
Steps to reproduce the behavior:
1. Add new portal wit id beginning with 0
2. Run action which needed an active portal
3. wrong portal will be used
|
1.0
|
Error-prone default portal selection - **Describe the bug**
If an action needs an active portal and no portal was set for example actions running in cms backend. The cms tries to get an default portal. Actually its the portal with the lowest id.
Our CMS was delivered with one Portal wit id "1". If you add a new portal with a lower id for example 01dsfs... the new portal is the default portal.
**Affected version(s)**
6.2
**To Reproduce**
Steps to reproduce the behavior:
1. Add new portal wit id beginning with 0
2. Run action which needed an active portal
3. wrong portal will be used
|
test
|
error prone default portal selection describe the bug if an action needs an active portal and no portal was set for example actions running in cms backend the cms tries to get an default portal actually its the portal with the lowest id our cms was delivered with one portal wit id if you add a new portal with a lower id for example the new portal is the default portal affected version s to reproduce steps to reproduce the behavior add new portal wit id beginning with run action which needed an active portal wrong portal will be used
| 1
|
328,757
| 9,999,603,857
|
IssuesEvent
|
2019-07-12 11:08:21
|
turbolabz/transfer-bug-track
|
https://api.github.com/repos/turbolabz/transfer-bug-track
|
opened
|
Chat messages are not showing properly in some iOS devices
|
Priority: Medium Type: bug
|
Text is showing in two lines and it shows as half on the board.

|
1.0
|
Chat messages are not showing properly in some iOS devices - Text is showing in two lines and it shows as half on the board.

|
non_test
|
chat messages are not showing properly in some ios devices text is showing in two lines and it shows as half on the board
| 0
|
87,755
| 17,370,868,556
|
IssuesEvent
|
2021-07-30 13:51:00
|
parallaxsecond/parsec
|
https://api.github.com/repos/parallaxsecond/parsec
|
closed
|
Investigate the strange CodeCov reports
|
bug code health question testing
|
Code coverage reporting was out of order for a while, and now that it's working again, the numbers are somewhat strange. You can see the most recent report [here](https://app.codecov.io/gh/parallaxsecond/parsec). The coverage has dropped, despite us adding new tests for various bits of functionality, while some parts of the report are nonsensical. For example, in [this](https://codecov.io/gh/parallaxsecond/parsec/src/2af44cc6dc10b23511e9442f4281636728c59a14/src/providers/tpm/key_management.rs) report of the key management functionality of the TPM provider, there are a number of inconsistent or illogical results:
* The first line in a few functions is shown as uncovered, even though there is no branching involved and other lines in those functions are covered
* The last line in a few functions is shown as uncovered, even though we have hundreds of tests that use those functions and rely on them touching that line (e.g. line 237, which marks a successful key deletion)
* Other lines that are shown as uncovered even though they must have happened for others after them to also be covered (e.g. lines 169-170, which are covered in a [previous report](https://codecov.io/gh/parallaxsecond/parsec/src/b6b73160498e52cbfb3527f5120036c729f91920/src/providers/tpm/key_management.rs))
There is a need to investigate why those inconsistencies suddenly appeared - maybe some of the recent changes in the `ci.sh` file have lead to this, or changes in the Rust compiler...?
More tests might also be needed, but these problems seem to stem from external issues - we've not been actually losing coverage, we just hit some tooling errors.
|
1.0
|
Investigate the strange CodeCov reports - Code coverage reporting was out of order for a while, and now that it's working again, the numbers are somewhat strange. You can see the most recent report [here](https://app.codecov.io/gh/parallaxsecond/parsec). The coverage has dropped, despite us adding new tests for various bits of functionality, while some parts of the report are nonsensical. For example, in [this](https://codecov.io/gh/parallaxsecond/parsec/src/2af44cc6dc10b23511e9442f4281636728c59a14/src/providers/tpm/key_management.rs) report of the key management functionality of the TPM provider, there are a number of inconsistent or illogical results:
* The first line in a few functions is shown as uncovered, even though there is no branching involved and other lines in those functions are covered
* The last line in a few functions is shown as uncovered, even though we have hundreds of tests that use those functions and rely on them touching that line (e.g. line 237, which marks a successful key deletion)
* Other lines that are shown as uncovered even though they must have happened for others after them to also be covered (e.g. lines 169-170, which are covered in a [previous report](https://codecov.io/gh/parallaxsecond/parsec/src/b6b73160498e52cbfb3527f5120036c729f91920/src/providers/tpm/key_management.rs))
There is a need to investigate why those inconsistencies suddenly appeared - maybe some of the recent changes in the `ci.sh` file have lead to this, or changes in the Rust compiler...?
More tests might also be needed, but these problems seem to stem from external issues - we've not been actually losing coverage, we just hit some tooling errors.
|
non_test
|
investigate the strange codecov reports code coverage reporting was out of order for a while and now that it s working again the numbers are somewhat strange you can see the most recent report the coverage has dropped despite us adding new tests for various bits of functionality while some parts of the report are nonsensical for example in report of the key management functionality of the tpm provider there are a number of inconsistent or illogical results the first line in a few functions is shown as uncovered even though there is no branching involved and other lines in those functions are covered the last line in a few functions is shown as uncovered even though we have hundreds of tests that use those functions and rely on them touching that line e g line which marks a successful key deletion other lines that are shown as uncovered even though they must have happened for others after them to also be covered e g lines which are covered in a there is a need to investigate why those inconsistencies suddenly appeared maybe some of the recent changes in the ci sh file have lead to this or changes in the rust compiler more tests might also be needed but these problems seem to stem from external issues we ve not been actually losing coverage we just hit some tooling errors
| 0
|
256,185
| 19,402,667,780
|
IssuesEvent
|
2021-12-19 13:16:42
|
canwebe/CatBreeds
|
https://api.github.com/repos/canwebe/CatBreeds
|
closed
|
Add Readme and Information about the project in this repo
|
documentation
|
- [ ] Added Readme
- [ ] Added about section in repo
|
1.0
|
Add Readme and Information about the project in this repo - - [ ] Added Readme
- [ ] Added about section in repo
|
non_test
|
add readme and information about the project in this repo added readme added about section in repo
| 0
|
257,005
| 8,131,790,901
|
IssuesEvent
|
2018-08-18 02:17:45
|
alassanecoly/BookmarkMyChampions
|
https://api.github.com/repos/alassanecoly/BookmarkMyChampions
|
closed
|
✨ Implement PATCH /users/me route
|
priority: medium 🚧 scope: api scope: authentication scope: routing status: accepted 👍 type: feature ✨
|
**Is your feature request related to a problem ? Please describe.**
Current user can update his informations.
**Describe the solution you'd like**
Implement PATCH /users/me route
|
1.0
|
✨ Implement PATCH /users/me route - **Is your feature request related to a problem ? Please describe.**
Current user can update his informations.
**Describe the solution you'd like**
Implement PATCH /users/me route
|
non_test
|
✨ implement patch users me route is your feature request related to a problem please describe current user can update his informations describe the solution you d like implement patch users me route
| 0
|
313,980
| 26,967,520,925
|
IssuesEvent
|
2023-02-09 00:11:05
|
unifyai/ivy
|
https://api.github.com/repos/unifyai/ivy
|
closed
|
Fix tensor.test_torch_instance_type
|
PyTorch Frontend Sub Task Failing Test
|
| | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/3977080495/jobs/6817950648" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/3977080495/jobs/6817950648" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/3977080495/jobs/6817950648" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/3977080495/jobs/6817950648" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_torch/test_tensor.py::test_torch_instance_type[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-01-22T00:13:08.8951339Z E AssertionError: 0 != 255
2023-01-22T00:13:08.8951696Z E Falsifying example: test_torch_instance_type(
2023-01-22T00:13:08.8952144Z E dtype_and_x=(['float64'], [array(-1.)]),
2023-01-22T00:13:08.8952504Z E dtype=['uint8'],
2023-01-22T00:13:08.8952812Z E init_num_positional_args=0,
2023-01-22T00:13:08.8953136Z E method_num_positional_args=0,
2023-01-22T00:13:08.8954975Z E as_variable=[False],
2023-01-22T00:13:08.8955282Z E native_array=[False],
2023-01-22T00:13:08.8956205Z E frontend_method_data=FrontendMethodData(ivy_init_module=<module 'ivy.functional.frontends.torch' from '/ivy/ivy/functional/frontends/torch/__init__.py'>, framework_init_module=<module 'torch' from '/usr/local/lib/python3.8/dist-packages/torch/__init__.py'>, init_name='tensor', method_name='type'),
2023-01-22T00:13:08.8956895Z E frontend='torch',
2023-01-22T00:13:08.8957169Z E )
2023-01-22T00:13:08.8957430Z E
2023-01-22T00:13:08.8958144Z E You can reproduce this example by temporarily adding @reproduce_failure('6.55.0', b'AXicY2QAAkYGCIDRAABCAAQ=') as a decorator on your test case
</details>
|
1.0
|
Fix tensor.test_torch_instance_type - | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/3977080495/jobs/6817950648" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/3977080495/jobs/6817950648" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/3977080495/jobs/6817950648" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/3977080495/jobs/6817950648" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_torch/test_tensor.py::test_torch_instance_type[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-01-22T00:13:08.8951339Z E AssertionError: 0 != 255
2023-01-22T00:13:08.8951696Z E Falsifying example: test_torch_instance_type(
2023-01-22T00:13:08.8952144Z E dtype_and_x=(['float64'], [array(-1.)]),
2023-01-22T00:13:08.8952504Z E dtype=['uint8'],
2023-01-22T00:13:08.8952812Z E init_num_positional_args=0,
2023-01-22T00:13:08.8953136Z E method_num_positional_args=0,
2023-01-22T00:13:08.8954975Z E as_variable=[False],
2023-01-22T00:13:08.8955282Z E native_array=[False],
2023-01-22T00:13:08.8956205Z E frontend_method_data=FrontendMethodData(ivy_init_module=<module 'ivy.functional.frontends.torch' from '/ivy/ivy/functional/frontends/torch/__init__.py'>, framework_init_module=<module 'torch' from '/usr/local/lib/python3.8/dist-packages/torch/__init__.py'>, init_name='tensor', method_name='type'),
2023-01-22T00:13:08.8956895Z E frontend='torch',
2023-01-22T00:13:08.8957169Z E )
2023-01-22T00:13:08.8957430Z E
2023-01-22T00:13:08.8958144Z E You can reproduce this example by temporarily adding @reproduce_failure('6.55.0', b'AXicY2QAAkYGCIDRAABCAAQ=') as a decorator on your test case
</details>
|
test
|
fix tensor test torch instance type tensorflow img src torch img src numpy img src jax img src failed ivy tests test ivy test frontends test torch test tensor py test torch instance type e assertionerror e falsifying example test torch instance type e dtype and x e dtype e init num positional args e method num positional args e as variable e native array e frontend method data frontendmethoddata ivy init module framework init module init name tensor method name type e frontend torch e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case
| 1
|
7,659
| 8,026,932,271
|
IssuesEvent
|
2018-07-27 07:07:55
|
badges/shields
|
https://api.github.com/repos/badges/shields
|
closed
|
Jenkins permission requirements
|
question service-badge
|
Hello all,
I've been trying to get a shields badge of my Jenkins build status for [this job](https://ci.gamerking195.com/job/AutoUpdaterAPI). However, it always [displays inaccessible.](https://img.shields.io/jenkins/s/https/ci.gamerking195.com/job/AutoUpdaterAPI.svg) The job has project-based security, with anonymous users being able to discover and read, but nothing else.
Things I've tried:
- Adding/removing SSL
- Giving anonymous workspace permission
Any help is appreciated, thanks in advance!
|
1.0
|
Jenkins permission requirements - Hello all,
I've been trying to get a shields badge of my Jenkins build status for [this job](https://ci.gamerking195.com/job/AutoUpdaterAPI). However, it always [displays inaccessible.](https://img.shields.io/jenkins/s/https/ci.gamerking195.com/job/AutoUpdaterAPI.svg) The job has project-based security, with anonymous users being able to discover and read, but nothing else.
Things I've tried:
- Adding/removing SSL
- Giving anonymous workspace permission
Any help is appreciated, thanks in advance!
|
non_test
|
jenkins permission requirements hello all i ve been trying to get a shields badge of my jenkins build status for however it always the job has project based security with anonymous users being able to discover and read but nothing else things i ve tried adding removing ssl giving anonymous workspace permission any help is appreciated thanks in advance
| 0
|
354,918
| 25,175,215,091
|
IssuesEvent
|
2022-11-11 08:37:37
|
Isaaclhy00/pe
|
https://api.github.com/repos/Isaaclhy00/pe
|
opened
|
Inconsistent grammar
|
severity.VeryLow type.DocumentationBug
|


The listTasks command uses the plural form "Tasks" however the findTask command uses the singular form "Task" although both commands may potentially return one or more tasks.
<!--session: 1668153714273-56fa1640-cfd4-4b88-b1c2-babefad465eb-->
<!--Version: Web v3.4.4-->
|
1.0
|
Inconsistent grammar - 

The listTasks command uses the plural form "Tasks" however the findTask command uses the singular form "Task" although both commands may potentially return one or more tasks.
<!--session: 1668153714273-56fa1640-cfd4-4b88-b1c2-babefad465eb-->
<!--Version: Web v3.4.4-->
|
non_test
|
inconsistent grammar the listtasks command uses the plural form tasks however the findtask command uses the singular form task although both commands may potentially return one or more tasks
| 0
|
104,656
| 8,996,685,538
|
IssuesEvent
|
2019-02-02 03:34:33
|
elastic/elasticsearch
|
https://api.github.com/repos/elastic/elasticsearch
|
opened
|
SSLConfigurationReloaderTests hangs when run locally
|
:Security/Network >test-failure
|
This test suite hangs indefinitely when run locally, but I haven't seen CI be impacted yet.
Reproduce with:
```
./gradlew :x-pack:plugin:core:test \
-Dtests.seed=5DB61DD425F081B \
-Dtests.class=org.elasticsearch.xpack.core.ssl.SSLConfigurationReloaderTests \
-Dtests.security.manager=true \
-Dtests.locale=en-US \
-Dtests.timezone=Etc/UTC
```
Whichever test runs first in this suite hangs.
Possibly related to https://github.com/elastic/elasticsearch/issues/32124
Appears to have been introduced by https://github.com/elastic/elasticsearch/pull/38103
Confirmed on MacOS and Linux.
|
1.0
|
SSLConfigurationReloaderTests hangs when run locally - This test suite hangs indefinitely when run locally, but I haven't seen CI be impacted yet.
Reproduce with:
```
./gradlew :x-pack:plugin:core:test \
-Dtests.seed=5DB61DD425F081B \
-Dtests.class=org.elasticsearch.xpack.core.ssl.SSLConfigurationReloaderTests \
-Dtests.security.manager=true \
-Dtests.locale=en-US \
-Dtests.timezone=Etc/UTC
```
Whichever test runs first in this suite hangs.
Possibly related to https://github.com/elastic/elasticsearch/issues/32124
Appears to have been introduced by https://github.com/elastic/elasticsearch/pull/38103
Confirmed on MacOS and Linux.
|
test
|
sslconfigurationreloadertests hangs when run locally this test suite hangs indefinitely when run locally but i haven t seen ci be impacted yet reproduce with gradlew x pack plugin core test dtests seed dtests class org elasticsearch xpack core ssl sslconfigurationreloadertests dtests security manager true dtests locale en us dtests timezone etc utc whichever test runs first in this suite hangs possibly related to appears to have been introduced by confirmed on macos and linux
| 1
|
50,355
| 21,076,589,045
|
IssuesEvent
|
2022-04-02 08:21:59
|
emergenzeHack/ukrainehelp.emergenzehack.info_segnalazioni
|
https://api.github.com/repos/emergenzeHack/ukrainehelp.emergenzehack.info_segnalazioni
|
opened
|
https://www.raiplay.it/benvenuti-bambini Cartoni animati in lingua italiana e ucraina (contenuti gr
|
Services translation Children
|
<pre><yamldata>
servicetypes:
materialGoods: false
hospitality: false
transport: false
healthcare: false
Legal: false
translation: true
job: false
psychologicalSupport: false
Children: true
disability: false
women: false
education: false
offerFromWho: Raiplay
title: https://www.raiplay.it/benvenuti-bambini Cartoni animati in lingua italiana
e ucraina (contenuti gratuiti).
recipients: ''
description: ''
url: https://www.raiplay.it/benvenuti-bambini
address:
mode: autocomplete
address:
place_id: 283767136
licence: Data © OpenStreetMap contributors, ODbL 1.0. https://osm.org/copyright
osm_type: relation
osm_id: 41485
boundingbox:
- '41.6556417'
- '42.1410285'
- '12.2344669'
- '12.8557603'
lat: '41.8933203'
lon: '12.4829321'
display_name: Roma, Roma Capitale, Lazio, Italia
class: boundary
type: administrative
importance: 0.7896107180689524
icon: https://nominatim.openstreetmap.org/ui/mapicons//poi_boundary_administrative.p.20.png
address:
city: Roma
county: Roma Capitale
state: Lazio
country: Italia
country_code: it
iConfirmToHaveReadAndAcceptedInformativeToThreatPersonalData: true
label: services
submit: true
</yamldata></pre>
|
1.0
|
https://www.raiplay.it/benvenuti-bambini Cartoni animati in lingua italiana e ucraina (contenuti gr - <pre><yamldata>
servicetypes:
materialGoods: false
hospitality: false
transport: false
healthcare: false
Legal: false
translation: true
job: false
psychologicalSupport: false
Children: true
disability: false
women: false
education: false
offerFromWho: Raiplay
title: https://www.raiplay.it/benvenuti-bambini Cartoni animati in lingua italiana
e ucraina (contenuti gratuiti).
recipients: ''
description: ''
url: https://www.raiplay.it/benvenuti-bambini
address:
mode: autocomplete
address:
place_id: 283767136
licence: Data © OpenStreetMap contributors, ODbL 1.0. https://osm.org/copyright
osm_type: relation
osm_id: 41485
boundingbox:
- '41.6556417'
- '42.1410285'
- '12.2344669'
- '12.8557603'
lat: '41.8933203'
lon: '12.4829321'
display_name: Roma, Roma Capitale, Lazio, Italia
class: boundary
type: administrative
importance: 0.7896107180689524
icon: https://nominatim.openstreetmap.org/ui/mapicons//poi_boundary_administrative.p.20.png
address:
city: Roma
county: Roma Capitale
state: Lazio
country: Italia
country_code: it
iConfirmToHaveReadAndAcceptedInformativeToThreatPersonalData: true
label: services
submit: true
</yamldata></pre>
|
non_test
|
cartoni animati in lingua italiana e ucraina contenuti gr servicetypes materialgoods false hospitality false transport false healthcare false legal false translation true job false psychologicalsupport false children true disability false women false education false offerfromwho raiplay title cartoni animati in lingua italiana e ucraina contenuti gratuiti recipients description url address mode autocomplete address place id licence data © openstreetmap contributors odbl osm type relation osm id boundingbox lat lon display name roma roma capitale lazio italia class boundary type administrative importance icon address city roma county roma capitale state lazio country italia country code it iconfirmtohavereadandacceptedinformativetothreatpersonaldata true label services submit true
| 0
|
150,654
| 11,980,044,776
|
IssuesEvent
|
2020-04-07 08:40:39
|
WoWManiaUK/Redemption
|
https://api.github.com/repos/WoWManiaUK/Redemption
|
closed
|
Sindragosa gauntlet not active
|
Fix - Tester Confirmed
|
**What is Happening:** Currently the sindragosa gauntlet does not active, so the door is close and the teleport is not active, you can't get to her.
**What Should happen:** The gauntlet should active when you move to the gauntlet room before sindy.
Not sure if this is reported or not, but since i can't find it so i will report it just to make sure.
|
1.0
|
Sindragosa gauntlet not active - **What is Happening:** Currently the sindragosa gauntlet does not active, so the door is close and the teleport is not active, you can't get to her.
**What Should happen:** The gauntlet should active when you move to the gauntlet room before sindy.
Not sure if this is reported or not, but since i can't find it so i will report it just to make sure.
|
test
|
sindragosa gauntlet not active what is happening currently the sindragosa gauntlet does not active so the door is close and the teleport is not active you can t get to her what should happen the gauntlet should active when you move to the gauntlet room before sindy not sure if this is reported or not but since i can t find it so i will report it just to make sure
| 1
|
145,865
| 11,710,898,343
|
IssuesEvent
|
2020-03-09 02:53:05
|
microsoft/AzureStorageExplorer
|
https://api.github.com/repos/microsoft/AzureStorageExplorer
|
closed
|
Fail to acquire lease for blobs in one SAS attached blob container
|
:beetle: regression :gear: blobs 🧪 testing
|
**Storage Explorer Version:** 1.12.0
**Build**: [20200305.6](https://devdiv.visualstudio.com/DevDiv/_build/results?buildId=3529152)
**Branch**: dev/chuye/beta-blob-extension
**Platform/OS:** Windows 10/ Linux Ubuntu 16.04
**Architecture**: ia32/x64
**Regression From:** Previous release(1.12.0)
**Steps to reproduce:**
1. Expand one Non-ADLS Gen2 storage account -> Blob Containers.
2. Select one blob container -> Attach it using SAS (with full permissions).
3. Open the SAS attached blob container -> Upload one blob to it.
4. Try to acquire lease for the blob -> Check the result.
**Expect Experience:**
Succeed to acquire lease for the blob.
**Actual Experience:**
1. Fail to acquire lease for the blob.
2. Storage Explorer is not responding.

**More Info:**
1. This issue also reproduces for **break lease** action
2. This issue also reproduces for blobs in one **SAS attached account**.
|
1.0
|
Fail to acquire lease for blobs in one SAS attached blob container - **Storage Explorer Version:** 1.12.0
**Build**: [20200305.6](https://devdiv.visualstudio.com/DevDiv/_build/results?buildId=3529152)
**Branch**: dev/chuye/beta-blob-extension
**Platform/OS:** Windows 10/ Linux Ubuntu 16.04
**Architecture**: ia32/x64
**Regression From:** Previous release(1.12.0)
**Steps to reproduce:**
1. Expand one Non-ADLS Gen2 storage account -> Blob Containers.
2. Select one blob container -> Attach it using SAS (with full permissions).
3. Open the SAS attached blob container -> Upload one blob to it.
4. Try to acquire lease for the blob -> Check the result.
**Expect Experience:**
Succeed to acquire lease for the blob.
**Actual Experience:**
1. Fail to acquire lease for the blob.
2. Storage Explorer is not responding.

**More Info:**
1. This issue also reproduces for **break lease** action
2. This issue also reproduces for blobs in one **SAS attached account**.
|
test
|
fail to acquire lease for blobs in one sas attached blob container storage explorer version build branch dev chuye beta blob extension platform os windows linux ubuntu architecture regression from previous release steps to reproduce expand one non adls storage account blob containers select one blob container attach it using sas with full permissions open the sas attached blob container upload one blob to it try to acquire lease for the blob check the result expect experience succeed to acquire lease for the blob actual experience fail to acquire lease for the blob storage explorer is not responding more info this issue also reproduces for break lease action this issue also reproduces for blobs in one sas attached account
| 1
|
279,733
| 24,251,684,859
|
IssuesEvent
|
2022-09-27 14:37:48
|
department-of-veterans-affairs/va.gov-cms
|
https://api.github.com/repos/department-of-veterans-affairs/va.gov-cms
|
closed
|
CMS Test Coverage Analysis
|
Automated testing ⭐️ Sitewide CMS
|
## Acceptance Criteria
- [ ] team has a comprehensive list of manual test cases (in va.gov-team repo) to run through
## Implementation notes
- [ ] comprehensive list of custom features we've built that don't have test coverage
- [ ] stories created for any critical features that need automated test coverage
|
1.0
|
CMS Test Coverage Analysis - ## Acceptance Criteria
- [ ] team has a comprehensive list of manual test cases (in va.gov-team repo) to run through
## Implementation notes
- [ ] comprehensive list of custom features we've built that don't have test coverage
- [ ] stories created for any critical features that need automated test coverage
|
test
|
cms test coverage analysis acceptance criteria team has a comprehensive list of manual test cases in va gov team repo to run through implementation notes comprehensive list of custom features we ve built that don t have test coverage stories created for any critical features that need automated test coverage
| 1
|
107,054
| 9,201,064,658
|
IssuesEvent
|
2019-03-07 18:39:11
|
scylladb/scylla
|
https://api.github.com/repos/scylladb/scylla
|
opened
|
segfault in sstable::has_correct_non_compound_range_tombstones during repair_disjoint_row_2nodes_diff_shard_count_test
|
dtest
|
scylla version e9bc2a7912bcc3539f0a10c436a60069b88d1cab
scylla dtest version scylladb/scylla-dtest@f373388d91b494d398919ace1c54b73bd0a8b4a2
Seen in [dtest-release/50/artifact/logs-release.2/1551952592893_repair_additional_test.RepairAdditionalTest.repair_disjoint_row_2nodes_diff_shard_count_test/node1.log](http://jenkins.cloudius-systems.com:8080/view/master/job/scylla-master/job/dtest-release/50/artifact/logs-release.2/1551952592893_repair_additional_test.RepairAdditionalTest.repair_disjoint_row_2nodes_diff_shard_count_test/node1.log):
```
$ addr2line -Cfpi -e logs-release.2/scylla
0x00000000041d3732
0x00000000040d21e5
0x00000000040d24e5
0x00000000040d2533
0x00007fb13a30c02f
0x00000000016692e0
0x0000000000d59ecf
0x0000000000ebfbda
0x0000000000e49c90
0x00000000011fa2d1
0x00000000040cf671
0x00000000040cf86e
0x00000000041a3965
0x00000000041ca65b
0x000000000409f24d
void seastar::backtrace<seastar::backtrace_buffer::append_backtrace()::{lambda(seastar::frame)#1}>(seastar::backtrace_buffer::append_backtrace()::{lambda(seastar::frame)#1}&&) at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../include/seastar/util/backtrace.hh:55
seastar::print_with_backtrace(seastar::backtrace_buffer&) at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../src/core/reactor.cc:1075
(inlined by) print_with_backtrace at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../src/core/reactor.cc:1096
seastar::print_with_backtrace(char const*) at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../src/core/reactor.cc:1103
seastar::install_oneshot_signal_handler<11, &seastar::sigsegv_action>()::{lambda(int, siginfo_t*, void*)#1}::_FUN(int, siginfo_t*, void*) at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../src/core/reactor.cc:4906
(inlined by) operator() at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../src/core/reactor.cc:4892
(inlined by) _FUN at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../src/core/reactor.cc:4888
?? ??:0
sstables::sstable::read_range_rows_flat(seastar::lw_shared_ptr<schema const>, nonwrapping_range<dht::ring_position> const&, query::partition_slice const&, seastar::io_priority_class const&, reader_resource_tracker, seastar::bool_class<streamed_mutation::forwarding_tag>, seastar::bool_class<flat_mutation_reader::partition_range_forwarding_tag>, sstables::read_monitor&) at /jenkins/workspace/scylla-master/dtest-release/scylla/sstables/sstables.hh:683
(inlined by) ?? at /jenkins/workspace/scylla-master/dtest-release/scylla/sstables/mp_row_consumer.hh:366
(inlined by) ?? at /jenkins/workspace/scylla-master/dtest-release/scylla/sstables/partition.cc:191
(inlined by) ?? at /usr/include/c++/8/bits/unique_ptr.h:831
(inlined by) flat_mutation_reader make_flat_mutation_reader<sstables::sstable_mutation_reader<sstables::data_consume_rows_context, sstables::mp_row_consumer_k_l>, seastar::lw_shared_ptr<sstables::sstable>, seastar::lw_shared_ptr<schema const>, nonwrapping_range<dht::ring_position> const&, query::partition_slice const&, seastar::io_priority_class const&, reader_resource_tracker, seastar::bool_class<streamed_mutation::forwarding_tag>&, seastar::bool_class<flat_mutation_reader::partition_range_forwarding_tag>&, sstables::read_monitor&>(seastar::lw_shared_ptr<sstables::sstable>&&, seastar::lw_shared_ptr<schema const>&&, nonwrapping_range<dht::ring_position> const&, query::partition_slice const&, seastar::io_priority_class const&, reader_resource_tracker&&, seastar::bool_class<streamed_mutation::forwarding_tag>&, seastar::bool_class<flat_mutation_reader::partition_range_forwarding_tag>&, sstables::read_monitor&) at /jenkins/workspace/scylla-master/dtest-release/scylla/./flat_mutation_reader.hh:483
(inlined by) sstables::sstable::read_range_rows_flat(seastar::lw_shared_ptr<schema const>, nonwrapping_range<dht::ring_position> const&, query::partition_slice const&, seastar::io_priority_class const&, reader_resource_tracker, seastar::bool_class<streamed_mutation::forwarding_tag>, seastar::bool_class<flat_mutation_reader::partition_range_forwarding_tag>, sstables::read_monitor&) at /jenkins/workspace/scylla-master/dtest-release/scylla/sstables/partition.cc:553
operator() at /jenkins/workspace/scylla-master/dtest-release/scylla/table.cc:586
(inlined by) _M_invoke at /usr/include/c++/8/bits/std_function.h:283
incremental_reader_selector::create_new_readers(std::optional<dht::ring_position_view> const&) at /usr/include/c++/8/bits/std_function.h:687
(inlined by) ?? at /jenkins/workspace/scylla-master/dtest-release/scylla/table.cc:205
(inlined by) auto incremental_reader_selector::create_new_readers(std::optional<dht::ring_position_view> const&)::{lambda(auto:1&)#2}::operator()<seastar::lw_shared_ptr<sstables::sstable> const>(seastar::lw_shared_ptr<sstables::sstable> const&) const at /jenkins/workspace/scylla-master/dtest-release/scylla/table.cc:247
(inlined by) ?? at /usr/include/boost/range/detail/default_constructible_unary_fn.hpp:39
(inlined by) ?? at /usr/include/boost/iterator/transform_iterator.hpp:126
(inlined by) ?? at /usr/include/boost/iterator/iterator_facade.hpp:550
(inlined by) ?? at /usr/include/boost/iterator/iterator_facade.hpp:656
(inlined by) ?? at /usr/include/c++/8/bits/stl_vector.h:1449
(inlined by) ?? at /usr/include/c++/8/bits/stl_vector.h:1437
(inlined by) ?? at /usr/include/c++/8/bits/stl_vector.h:546
(inlined by) ?? at /usr/include/boost/range/iterator_range_core.hpp:873
(inlined by) incremental_reader_selector::create_new_readers(std::optional<dht::ring_position_view> const&) at /jenkins/workspace/scylla-master/dtest-release/scylla/table.cc:245
incremental_reader_selector::fast_forward_to(nonwrapping_range<dht::ring_position> const&, std::chrono::time_point<seastar::lowres_clock, std::chrono::duration<long, std::ratio<1l, 1000l> > >) at /jenkins/workspace/scylla-master/dtest-release/scylla/table.cc:266
operator() at /jenkins/workspace/scylla-master/dtest-release/scylla/mutation_reader.cc:463
```
|
1.0
|
segfault in sstable::has_correct_non_compound_range_tombstones during repair_disjoint_row_2nodes_diff_shard_count_test - scylla version e9bc2a7912bcc3539f0a10c436a60069b88d1cab
scylla dtest version scylladb/scylla-dtest@f373388d91b494d398919ace1c54b73bd0a8b4a2
Seen in [dtest-release/50/artifact/logs-release.2/1551952592893_repair_additional_test.RepairAdditionalTest.repair_disjoint_row_2nodes_diff_shard_count_test/node1.log](http://jenkins.cloudius-systems.com:8080/view/master/job/scylla-master/job/dtest-release/50/artifact/logs-release.2/1551952592893_repair_additional_test.RepairAdditionalTest.repair_disjoint_row_2nodes_diff_shard_count_test/node1.log):
```
$ addr2line -Cfpi -e logs-release.2/scylla
0x00000000041d3732
0x00000000040d21e5
0x00000000040d24e5
0x00000000040d2533
0x00007fb13a30c02f
0x00000000016692e0
0x0000000000d59ecf
0x0000000000ebfbda
0x0000000000e49c90
0x00000000011fa2d1
0x00000000040cf671
0x00000000040cf86e
0x00000000041a3965
0x00000000041ca65b
0x000000000409f24d
void seastar::backtrace<seastar::backtrace_buffer::append_backtrace()::{lambda(seastar::frame)#1}>(seastar::backtrace_buffer::append_backtrace()::{lambda(seastar::frame)#1}&&) at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../include/seastar/util/backtrace.hh:55
seastar::print_with_backtrace(seastar::backtrace_buffer&) at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../src/core/reactor.cc:1075
(inlined by) print_with_backtrace at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../src/core/reactor.cc:1096
seastar::print_with_backtrace(char const*) at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../src/core/reactor.cc:1103
seastar::install_oneshot_signal_handler<11, &seastar::sigsegv_action>()::{lambda(int, siginfo_t*, void*)#1}::_FUN(int, siginfo_t*, void*) at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../src/core/reactor.cc:4906
(inlined by) operator() at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../src/core/reactor.cc:4892
(inlined by) _FUN at /jenkins/workspace/scylla-master/dtest-release/scylla/seastar/build/release/../../src/core/reactor.cc:4888
?? ??:0
sstables::sstable::read_range_rows_flat(seastar::lw_shared_ptr<schema const>, nonwrapping_range<dht::ring_position> const&, query::partition_slice const&, seastar::io_priority_class const&, reader_resource_tracker, seastar::bool_class<streamed_mutation::forwarding_tag>, seastar::bool_class<flat_mutation_reader::partition_range_forwarding_tag>, sstables::read_monitor&) at /jenkins/workspace/scylla-master/dtest-release/scylla/sstables/sstables.hh:683
(inlined by) ?? at /jenkins/workspace/scylla-master/dtest-release/scylla/sstables/mp_row_consumer.hh:366
(inlined by) ?? at /jenkins/workspace/scylla-master/dtest-release/scylla/sstables/partition.cc:191
(inlined by) ?? at /usr/include/c++/8/bits/unique_ptr.h:831
(inlined by) flat_mutation_reader make_flat_mutation_reader<sstables::sstable_mutation_reader<sstables::data_consume_rows_context, sstables::mp_row_consumer_k_l>, seastar::lw_shared_ptr<sstables::sstable>, seastar::lw_shared_ptr<schema const>, nonwrapping_range<dht::ring_position> const&, query::partition_slice const&, seastar::io_priority_class const&, reader_resource_tracker, seastar::bool_class<streamed_mutation::forwarding_tag>&, seastar::bool_class<flat_mutation_reader::partition_range_forwarding_tag>&, sstables::read_monitor&>(seastar::lw_shared_ptr<sstables::sstable>&&, seastar::lw_shared_ptr<schema const>&&, nonwrapping_range<dht::ring_position> const&, query::partition_slice const&, seastar::io_priority_class const&, reader_resource_tracker&&, seastar::bool_class<streamed_mutation::forwarding_tag>&, seastar::bool_class<flat_mutation_reader::partition_range_forwarding_tag>&, sstables::read_monitor&) at /jenkins/workspace/scylla-master/dtest-release/scylla/./flat_mutation_reader.hh:483
(inlined by) sstables::sstable::read_range_rows_flat(seastar::lw_shared_ptr<schema const>, nonwrapping_range<dht::ring_position> const&, query::partition_slice const&, seastar::io_priority_class const&, reader_resource_tracker, seastar::bool_class<streamed_mutation::forwarding_tag>, seastar::bool_class<flat_mutation_reader::partition_range_forwarding_tag>, sstables::read_monitor&) at /jenkins/workspace/scylla-master/dtest-release/scylla/sstables/partition.cc:553
operator() at /jenkins/workspace/scylla-master/dtest-release/scylla/table.cc:586
(inlined by) _M_invoke at /usr/include/c++/8/bits/std_function.h:283
incremental_reader_selector::create_new_readers(std::optional<dht::ring_position_view> const&) at /usr/include/c++/8/bits/std_function.h:687
(inlined by) ?? at /jenkins/workspace/scylla-master/dtest-release/scylla/table.cc:205
(inlined by) auto incremental_reader_selector::create_new_readers(std::optional<dht::ring_position_view> const&)::{lambda(auto:1&)#2}::operator()<seastar::lw_shared_ptr<sstables::sstable> const>(seastar::lw_shared_ptr<sstables::sstable> const&) const at /jenkins/workspace/scylla-master/dtest-release/scylla/table.cc:247
(inlined by) ?? at /usr/include/boost/range/detail/default_constructible_unary_fn.hpp:39
(inlined by) ?? at /usr/include/boost/iterator/transform_iterator.hpp:126
(inlined by) ?? at /usr/include/boost/iterator/iterator_facade.hpp:550
(inlined by) ?? at /usr/include/boost/iterator/iterator_facade.hpp:656
(inlined by) ?? at /usr/include/c++/8/bits/stl_vector.h:1449
(inlined by) ?? at /usr/include/c++/8/bits/stl_vector.h:1437
(inlined by) ?? at /usr/include/c++/8/bits/stl_vector.h:546
(inlined by) ?? at /usr/include/boost/range/iterator_range_core.hpp:873
(inlined by) incremental_reader_selector::create_new_readers(std::optional<dht::ring_position_view> const&) at /jenkins/workspace/scylla-master/dtest-release/scylla/table.cc:245
incremental_reader_selector::fast_forward_to(nonwrapping_range<dht::ring_position> const&, std::chrono::time_point<seastar::lowres_clock, std::chrono::duration<long, std::ratio<1l, 1000l> > >) at /jenkins/workspace/scylla-master/dtest-release/scylla/table.cc:266
operator() at /jenkins/workspace/scylla-master/dtest-release/scylla/mutation_reader.cc:463
```
|
test
|
segfault in sstable has correct non compound range tombstones during repair disjoint row diff shard count test scylla version scylla dtest version scylladb scylla dtest seen in cfpi e logs release scylla void seastar backtrace seastar backtrace buffer append backtrace lambda seastar frame at jenkins workspace scylla master dtest release scylla seastar build release include seastar util backtrace hh seastar print with backtrace seastar backtrace buffer at jenkins workspace scylla master dtest release scylla seastar build release src core reactor cc inlined by print with backtrace at jenkins workspace scylla master dtest release scylla seastar build release src core reactor cc seastar print with backtrace char const at jenkins workspace scylla master dtest release scylla seastar build release src core reactor cc seastar install oneshot signal handler lambda int siginfo t void fun int siginfo t void at jenkins workspace scylla master dtest release scylla seastar build release src core reactor cc inlined by operator at jenkins workspace scylla master dtest release scylla seastar build release src core reactor cc inlined by fun at jenkins workspace scylla master dtest release scylla seastar build release src core reactor cc sstables sstable read range rows flat seastar lw shared ptr nonwrapping range const query partition slice const seastar io priority class const reader resource tracker seastar bool class seastar bool class sstables read monitor at jenkins workspace scylla master dtest release scylla sstables sstables hh inlined by at jenkins workspace scylla master dtest release scylla sstables mp row consumer hh inlined by at jenkins workspace scylla master dtest release scylla sstables partition cc inlined by at usr include c bits unique ptr h inlined by flat mutation reader make flat mutation reader seastar lw shared ptr seastar lw shared ptr nonwrapping range const query partition slice const seastar io priority class const reader resource tracker seastar bool class seastar bool class sstables read monitor seastar lw shared ptr seastar lw shared ptr nonwrapping range const query partition slice const seastar io priority class const reader resource tracker seastar bool class seastar bool class sstables read monitor at jenkins workspace scylla master dtest release scylla flat mutation reader hh inlined by sstables sstable read range rows flat seastar lw shared ptr nonwrapping range const query partition slice const seastar io priority class const reader resource tracker seastar bool class seastar bool class sstables read monitor at jenkins workspace scylla master dtest release scylla sstables partition cc operator at jenkins workspace scylla master dtest release scylla table cc inlined by m invoke at usr include c bits std function h incremental reader selector create new readers std optional const at usr include c bits std function h inlined by at jenkins workspace scylla master dtest release scylla table cc inlined by auto incremental reader selector create new readers std optional const lambda auto operator const seastar lw shared ptr const const at jenkins workspace scylla master dtest release scylla table cc inlined by at usr include boost range detail default constructible unary fn hpp inlined by at usr include boost iterator transform iterator hpp inlined by at usr include boost iterator iterator facade hpp inlined by at usr include boost iterator iterator facade hpp inlined by at usr include c bits stl vector h inlined by at usr include c bits stl vector h inlined by at usr include c bits stl vector h inlined by at usr include boost range iterator range core hpp inlined by incremental reader selector create new readers std optional const at jenkins workspace scylla master dtest release scylla table cc incremental reader selector fast forward to nonwrapping range const std chrono time point at jenkins workspace scylla master dtest release scylla table cc operator at jenkins workspace scylla master dtest release scylla mutation reader cc
| 1
|
12,421
| 3,269,147,677
|
IssuesEvent
|
2015-10-23 15:06:16
|
medic/medic-webapp
|
https://api.github.com/repos/medic/medic-webapp
|
closed
|
Enhanced markdown: Design of Reports page for review with LG
|
4 - Acceptance testing Feature Request Needs Design Work UI/UX
|
@Lesterng please provide details :-)
|
1.0
|
Enhanced markdown: Design of Reports page for review with LG - @Lesterng please provide details :-)
|
test
|
enhanced markdown design of reports page for review with lg lesterng please provide details
| 1
|
192,622
| 14,622,909,335
|
IssuesEvent
|
2020-12-23 01:45:50
|
microsoft/AzureStorageExplorer
|
https://api.github.com/repos/microsoft/AzureStorageExplorer
|
closed
|
The public connected blob container is neither auto selected nor opened
|
:beetle: regression :gear: blobs :heavy_check_mark: merged 🧪 testing
|
**Storage Explorer Version:** 1.17.0
**Build Number:** 20201222.1
**Branch:** main
**Platform/OS:** Windows 10/ Linux Ubuntu 16.04/ MacOS Catalina
**Architecture:** ia32/x64
**Regression From:** Previous build (20201218.3)
## Steps to Reproduce ##
1. Expand one storage account -> Blob Containers.
2. Create a blob container -> Set the container public access level to 'container'.
3. Copy the URL of the blob container.
4. Open the connect dialog -> Select 'Connect to public blob container' -> Click 'Next'.
5. Paste the container URL -> Click 'Next' -> Click 'Connect'.
6. Check whether the public attached blob container is selected and opened.
## Expected Experience ##
The public connected blob container is auto selected and opened.
## Actual Experience ##
The public connected blob container is neither auto selected nor opened.
|
1.0
|
The public connected blob container is neither auto selected nor opened - **Storage Explorer Version:** 1.17.0
**Build Number:** 20201222.1
**Branch:** main
**Platform/OS:** Windows 10/ Linux Ubuntu 16.04/ MacOS Catalina
**Architecture:** ia32/x64
**Regression From:** Previous build (20201218.3)
## Steps to Reproduce ##
1. Expand one storage account -> Blob Containers.
2. Create a blob container -> Set the container public access level to 'container'.
3. Copy the URL of the blob container.
4. Open the connect dialog -> Select 'Connect to public blob container' -> Click 'Next'.
5. Paste the container URL -> Click 'Next' -> Click 'Connect'.
6. Check whether the public attached blob container is selected and opened.
## Expected Experience ##
The public connected blob container is auto selected and opened.
## Actual Experience ##
The public connected blob container is neither auto selected nor opened.
|
test
|
the public connected blob container is neither auto selected nor opened storage explorer version build number branch main platform os windows linux ubuntu macos catalina architecture regression from previous build steps to reproduce expand one storage account blob containers create a blob container set the container public access level to container copy the url of the blob container open the connect dialog select connect to public blob container click next paste the container url click next click connect check whether the public attached blob container is selected and opened expected experience the public connected blob container is auto selected and opened actual experience the public connected blob container is neither auto selected nor opened
| 1
|
492,633
| 14,216,694,348
|
IssuesEvent
|
2020-11-17 09:20:48
|
usc-isi-i2/datamart-api
|
https://api.github.com/repos/usc-isi-i2/datamart-api
|
opened
|
Load anntated datasets from Pam
|
Priority 1 world-modeler
|
Our local copy of the dataset files is here:
https://drive.google.com/drive/folders/1etSpJAJth_0xRSil6jSTYbr7su5Pdrmf
Pam's Google shared drive is here:
https://drive.google.com/drive/u/2/folders/13DwbrNaHuDr7ZmFkbYcXIOd1mkcre5S0
|
1.0
|
Load anntated datasets from Pam - Our local copy of the dataset files is here:
https://drive.google.com/drive/folders/1etSpJAJth_0xRSil6jSTYbr7su5Pdrmf
Pam's Google shared drive is here:
https://drive.google.com/drive/u/2/folders/13DwbrNaHuDr7ZmFkbYcXIOd1mkcre5S0
|
non_test
|
load anntated datasets from pam our local copy of the dataset files is here pam s google shared drive is here
| 0
|
1,540
| 3,041,618,778
|
IssuesEvent
|
2015-08-07 22:42:23
|
npgsql/npgsql
|
https://api.github.com/repos/npgsql/npgsql
|
closed
|
Add SyncDNS connection string parameter
|
feature performance
|
Our current connection mechanism resolves DNS with an asynchronous call; this is because the .NET sync DNS API provides no timeout facility, and we're bound by ADO.NET to provide a connection timeout.
We've had several reports of people having trouble with this mechanism, in cases of bursts: the threadpool is exhausted, the DNS async callback is delayed, up to a timeout. See #376 and this [extensive discussion](https://groups.google.com/forum/#!topic/npgsql-dev/xb8KzglpApo).
Assuming no alternative DNS client implementation supporting a timeout can be found, we should provide an option to the user to select the .NET sync DNS, which would mean giving up the timeout for a completely synchronous connection process (which doesn't depend on the threadpool).
|
True
|
Add SyncDNS connection string parameter - Our current connection mechanism resolves DNS with an asynchronous call; this is because the .NET sync DNS API provides no timeout facility, and we're bound by ADO.NET to provide a connection timeout.
We've had several reports of people having trouble with this mechanism, in cases of bursts: the threadpool is exhausted, the DNS async callback is delayed, up to a timeout. See #376 and this [extensive discussion](https://groups.google.com/forum/#!topic/npgsql-dev/xb8KzglpApo).
Assuming no alternative DNS client implementation supporting a timeout can be found, we should provide an option to the user to select the .NET sync DNS, which would mean giving up the timeout for a completely synchronous connection process (which doesn't depend on the threadpool).
|
non_test
|
add syncdns connection string parameter our current connection mechanism resolves dns with an asynchronous call this is because the net sync dns api provides no timeout facility and we re bound by ado net to provide a connection timeout we ve had several reports of people having trouble with this mechanism in cases of bursts the threadpool is exhausted the dns async callback is delayed up to a timeout see and this assuming no alternative dns client implementation supporting a timeout can be found we should provide an option to the user to select the net sync dns which would mean giving up the timeout for a completely synchronous connection process which doesn t depend on the threadpool
| 0
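The record above reasons about bounding a blocking DNS lookup with a timeout when the platform's synchronous resolver offers none. A minimal Python sketch of that idea — an illustration only, not Npgsql's actual implementation; the host name and timeout value are made up:

```python
# Sketch: run the blocking resolver on a helper thread so the caller can
# still enforce a connection timeout. Illustrative only; not Npgsql code.
import concurrent.futures
import socket

def resolve_with_timeout(host, timeout_seconds):
    """Resolve `host` synchronously on a worker thread, giving up after
    `timeout_seconds` and returning None instead of blocking the caller."""
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    future = pool.submit(socket.gethostbyname, host)
    try:
        return future.result(timeout=timeout_seconds)
    except (concurrent.futures.TimeoutError, socket.gaierror):
        return None
    finally:
        # Don't wait for a stuck lookup; let the worker finish on its own.
        pool.shutdown(wait=False)

print(resolve_with_timeout("localhost", 5.0))
```

This captures the trade-off the issue describes: the caller regains a timeout, at the cost of a worker thread that may linger until the underlying lookup completes — which is exactly why a fully synchronous, timeout-free option was proposed for threadpool-exhaustion scenarios.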
|
300,838
| 25,998,255,450
|
IssuesEvent
|
2022-12-20 13:24:08
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
kvnemesis: use ReplicationManual
|
C-enhancement T-testeng
|
**Describe the problem**
In https://github.com/cockroachdb/cockroach/pull/89477, TestKVNemesisMultiNode was (accidentally) switched to use ReplicationAuto instead of ReplicationManual, meaning that the TestCluster will upreplicate and also have the replicate queue active throughout the run.
We should decide whether that's desirable and if not, undo this change.
kvnemesis does perform manual replication changes, so manual seems preferable to preserve as much determinism as possible.
kvnemesis also performs random zone config changes, which I thought was an argument for keeping the replicate queue active, but it turns out that at the time of writing, the only zone cfg change it can make is toggling global reads, which is independent of the replicate queue being active:
https://github.com/cockroachdb/cockroach/blob/97054a0e76049cd8f78d8b534ab1e2107be9f2ed/pkg/kv/kvnemesis/generator.go#L914-L916
|
1.0
|
kvnemesis: use ReplicationManual - **Describe the problem**
In https://github.com/cockroachdb/cockroach/pull/89477, TestKVNemesisMultiNode was (accidentally) switched to use ReplicationAuto instead of ReplicationManual, meaning that the TestCluster will upreplicate and also have the replicate queue active throughout the run.
We should decide whether that's desirable and if not, undo this change.
kvnemesis does perform manual replication changes, so manual seems preferable to preserve as much determinism as possible.
kvnemesis also performs random zone config changes, which I thought was an argument for keeping the replicate queue active, but it turns out that at the time of writing, the only zone cfg change it can make is toggling global reads, which is independent of the replicate queue being active:
https://github.com/cockroachdb/cockroach/blob/97054a0e76049cd8f78d8b534ab1e2107be9f2ed/pkg/kv/kvnemesis/generator.go#L914-L916
|
test
|
kvnemesis use replicationmanual describe the problem in testkvnemesismultinode was accidentally switched to use replicationauto instead of replicationmanual meaning that the testcluster will upreplicate and also have the replicate queue active throughout the run we should decide whether that s desirable and if not undo this change kvnemesis does perform manual replication changes so manual seems preferrable to preserve as much determinism as possible kvnemesis also performs random zone config changes which i thought was an argument for keeping the replicate queue active but it turns out that at the time of writing the only zone cfg change it can make is toggling global reads which is independent of the replicate queue being active
| 1
|
780,987
| 27,417,609,706
|
IssuesEvent
|
2023-03-01 14:45:59
|
PrefectHQ/prefect
|
https://api.github.com/repos/PrefectHQ/prefect
|
closed
|
Orion - add search functionality in block selection.
|
enhancement status:accepted ui priority:medium
|
### First check
- [X] I added a descriptive title to this issue.
- [X] I used the GitHub search to find a similar request and didn't find it.
- [X] I searched the Prefect documentation for this feature.
### Prefect Version
2.x
### Describe the current behavior
If I define a block as an input for a flow or as an attribute of another block, I get a drop-down in Orion. If the list is long I have to scroll through lots of options.
### Describe the proposed behavior
Add search functionality to the drop-down. If I click a field in Orion that is of any block type, I can search through that list by typing.
The behavior would be similar to how the search for issues here in GitHub works.
<img src="https://user-images.githubusercontent.com/24698503/197032103-1248981b-0436-4ebd-8783-24b1fb01b095.jpg" width="300">
### Example Use
This is especially helpful if one has lots of blocks of the same type. Say I have a custom Block called `ObjectDetectionModel`.
Each of these contains one trained and published model. If I have 100 of these the pure drop-down becomes a pain to use.
### Additional context
_No response_
|
1.0
|
Orion - add search functionality in block selection. - ### First check
- [X] I added a descriptive title to this issue.
- [X] I used the GitHub search to find a similar request and didn't find it.
- [X] I searched the Prefect documentation for this feature.
### Prefect Version
2.x
### Describe the current behavior
If I define a block as an input for a flow or as an attribute of another block, I get a drop-down in Orion. If the list is long I have to scroll through lots of options.
### Describe the proposed behavior
Add search functionality to the drop-down. If I click a field in Orion that is of any block type, I can search through that list by typing.
The behavior would be similar to how the search for issues here in GitHub works.
<img src="https://user-images.githubusercontent.com/24698503/197032103-1248981b-0436-4ebd-8783-24b1fb01b095.jpg" width="300">
### Example Use
This is especially helpful if one has lots of blocks of the same type. Say I have a custom Block called `ObjectDetectionModel`.
Each of these contains one trained and published model. If I have 100 of these the pure drop-down becomes a pain to use.
### Additional context
_No response_
|
non_test
|
orion add search functionality in block selection first check i added a descriptive title to this issue i used the github search to find a similar request and didn t find it i searched the prefect documentation for this feature prefect version x describe the current behavior if i define a block as an input for a flow or as a attribute of another block i get a drop down in orion if the list is long i have to scroll through lots of options describe the proposed behavior add search functionality to the drop down if i click a field in orion that is of any block type i can search through that list by typing the behavior would be similar how the search for issues here in github works example use this is especially helpful if one has lots of blocks of the same type say i have a custom block called objectdetectionmodel each of these contains one trained and published model if i have of these the pure drop down becomes a pain to use additional context no response
| 0
|
11,241
| 8,336,358,246
|
IssuesEvent
|
2018-09-28 07:30:41
|
CoditEU/practical-api-guidelines
|
https://api.github.com/repos/CoditEU/practical-api-guidelines
|
closed
|
Complete the security first paragraph
|
guidance must-have security
|
Complete the security first paragraph with more details.
Decide whether any form of authentication / authorization will be part of the first maturity level
|
True
|
Complete the security first paragraph - Complete the security first paragraph with more details.
Decide whether any form of authentication / authorization will be part of the first maturity level
|
non_test
|
complete the security first paragraph complete the security first paragraph with more details decide whether any form of authentication authorization will be part of the first maturity level
| 0
|
356,505
| 25,176,204,480
|
IssuesEvent
|
2022-11-11 09:28:53
|
t1mzzz/pe
|
https://api.github.com/repos/t1mzzz/pe
|
opened
|
DG - Links to `Main` and `MainApp` are not correct
|
type.DocumentationBug severity.VeryLow
|
As seen below, `Main` and `MainApp` contain hyperlinks that should link to `Main.java` and `MainApp.java` of CLInkedIn. However, they still link to AB3's GitHub repository.

<!--session: 1668153883434-430ea50c-05c6-419d-8236-21285a055c7f-->
<!--Version: Web v3.4.4-->
|
1.0
|
DG - Links to `Main` and `MainApp` are not correct - As seen below, `Main` and `MainApp` contain hyperlinks that should link to `Main.java` and `MainApp.java` of CLInkedIn. However, they still link to AB3's GitHub repository.

<!--session: 1668153883434-430ea50c-05c6-419d-8236-21285a055c7f-->
<!--Version: Web v3.4.4-->
|
non_test
|
dg links to main and mainapp are not correct as seen below main and mainapp contain hyperlinks that should link to main java and mainapp java of clinkedin however it still links to github repository
| 0
|
727,840
| 25,048,270,896
|
IssuesEvent
|
2022-11-05 14:51:56
|
TalaoDAO/AltMe
|
https://api.github.com/repos/TalaoDAO/AltMe
|
closed
|
AUDIT : Replace api token "mytoken" for API to get passbase status
|
a V2 Priority
|
same approach as yoti AI API, age estimate
|
1.0
|
AUDIT : Replace api token "mytoken" for API to get passbase status - same approach as yoti AI API, age estimate
|
non_test
|
audit replace api token mytoken for api to get passbase status same approach as yoti ai api age estimate
| 0
|
171,888
| 13,251,744,958
|
IssuesEvent
|
2020-08-20 03:06:48
|
robe070/cookbooks
|
https://api.github.com/repos/robe070/cookbooks
|
closed
|
P3 Add Tests Powershell script for Azure Pipeline (Pester Preferred)
|
test
|
Use Pester to write down the tests based on the PS scripts. The result of this task will be the Pester Test Script. This script when executed should generate a NUnit.xml file. Eventually, it will be used by the Azure Pipeline. It should include the TestImage tests and Test Image using Stack Test.
|
1.0
|
P3 Add Tests Powershell script for Azure Pipeline (Pester Preferred) - Use Pester to write down the tests based on the PS scripts. The result of this task will be the Pester Test Script. This script when executed should generate a NUnit.xml file. Eventually, it will be used by the Azure Pipeline. It should include the TestImage tests and Test Image using Stack Test.
|
test
|
add tests powershell script for azure pipeline pester preferred use pester to write down the tests based on the ps scripts the result of this task will be the pester test script this script when executed should generate a nunit xml file eventually it will be used by the azure pipeline it should include the testimage tests and test image using stack test
| 1
|
72,477
| 9,594,851,963
|
IssuesEvent
|
2019-05-09 14:49:38
|
regolith-linux/regolith-desktop
|
https://api.github.com/repos/regolith-linux/regolith-desktop
|
closed
|
Add READMEs to all debian package repos
|
documentation
|
The content of readme's should contain:
1. general description of the package.
2. list and describe any dependencies with other Regolith packages.
3. notable configuration if any exists.
4. how to build the package locally and publish changes to a PPA.
|
1.0
|
Add READMEs to all debian package repos - The content of readme's should contain:
1. general description of the package.
2. list and describe any dependencies with other Regolith packages.
3. notable configuration if any exists.
4. how to build the package locally and publish changes to a PPA.
|
non_test
|
add readmes to all debian package repos the content of readme s should contain general description of the package list and describe any dependencies with other regolith packages notable configuration if any exists how to build the package locally and publish changes to a ppa
| 0
|
185,070
| 14,292,764,484
|
IssuesEvent
|
2020-11-24 01:55:31
|
github-vet/rangeclosure-findings
|
https://api.github.com/repos/github-vet/rangeclosure-findings
|
closed
|
eclipse-iofog/iofog-go-sdk: vendor/k8s.io/gengo/examples/deepcopy-gen/generators/deepcopy_test.go; 10 LoC
|
fresh test tiny
|
Found a possible issue in [eclipse-iofog/iofog-go-sdk](https://www.github.com/eclipse-iofog/iofog-go-sdk) at [vendor/k8s.io/gengo/examples/deepcopy-gen/generators/deepcopy_test.go](https://github.com/eclipse-iofog/iofog-go-sdk/blob/b8ff7f50d1585fd5f2a41ea43fb89c9dd6805c7c/vendor/k8s.io/gengo/examples/deepcopy-gen/generators/deepcopy_test.go#L368-L377)
The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements
which capture loop variables.
[Click here to see the code in its original context.](https://github.com/eclipse-iofog/iofog-go-sdk/blob/b8ff7f50d1585fd5f2a41ea43fb89c9dd6805c7c/vendor/k8s.io/gengo/examples/deepcopy-gen/generators/deepcopy_test.go#L368-L377)
<details>
<summary>Click here to show the 10 line(s) of Go which triggered the analyzer.</summary>
```go
for i, tc := range testCases {
r, err := deepCopyMethod(&tc.typ)
if tc.error && err == nil {
t.Errorf("case[%d]: expected an error, got none", i)
} else if !tc.error && err != nil {
t.Errorf("case[%d]: expected no error, got: %v", i, err)
} else if !tc.error && (r != nil) != tc.expect {
t.Errorf("case[%d]: expected result %v, got: %v", i, tc.expect, r)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: b8ff7f50d1585fd5f2a41ea43fb89c9dd6805c7c
|
1.0
|
eclipse-iofog/iofog-go-sdk: vendor/k8s.io/gengo/examples/deepcopy-gen/generators/deepcopy_test.go; 10 LoC -
Found a possible issue in [eclipse-iofog/iofog-go-sdk](https://www.github.com/eclipse-iofog/iofog-go-sdk) at [vendor/k8s.io/gengo/examples/deepcopy-gen/generators/deepcopy_test.go](https://github.com/eclipse-iofog/iofog-go-sdk/blob/b8ff7f50d1585fd5f2a41ea43fb89c9dd6805c7c/vendor/k8s.io/gengo/examples/deepcopy-gen/generators/deepcopy_test.go#L368-L377)
The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements
which capture loop variables.
[Click here to see the code in its original context.](https://github.com/eclipse-iofog/iofog-go-sdk/blob/b8ff7f50d1585fd5f2a41ea43fb89c9dd6805c7c/vendor/k8s.io/gengo/examples/deepcopy-gen/generators/deepcopy_test.go#L368-L377)
<details>
<summary>Click here to show the 10 line(s) of Go which triggered the analyzer.</summary>
```go
for i, tc := range testCases {
r, err := deepCopyMethod(&tc.typ)
if tc.error && err == nil {
t.Errorf("case[%d]: expected an error, got none", i)
} else if !tc.error && err != nil {
t.Errorf("case[%d]: expected no error, got: %v", i, err)
} else if !tc.error && (r != nil) != tc.expect {
t.Errorf("case[%d]: expected result %v, got: %v", i, tc.expect, r)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: b8ff7f50d1585fd5f2a41ea43fb89c9dd6805c7c
|
test
|
eclipse iofog iofog go sdk vendor io gengo examples deepcopy gen generators deepcopy test go loc found a possible issue in at the below snippet of go code triggered static analysis which searches for goroutines and or defer statements which capture loop variables click here to show the line s of go which triggered the analyzer go for i tc range testcases r err deepcopymethod tc typ if tc error err nil t errorf case expected an error got none i else if tc error err nil t errorf case expected no error got v i err else if tc error r nil tc expect t errorf case expected result v got v i tc expect r leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
| 1
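The analyzer described in the record above searches for goroutines and defer statements that capture loop variables. The same hazard exists with Python closures, which can illustrate why such captures are worth flagging (hypothetical helper names; note the Go snippet quoted in the record creates no closures and is itself fine):

```python
# Illustration of the loop-variable capture hazard: a closure created in a
# loop binds the variable, not its current value, so every callback sees
# the loop variable's *final* value unless it is frozen explicitly.
def make_callbacks_buggy(values):
    callbacks = []
    for v in values:
        callbacks.append(lambda: v)  # captures the variable `v` itself
    return [cb() for cb in callbacks]

def make_callbacks_fixed(values):
    callbacks = []
    for v in values:
        callbacks.append(lambda v=v: v)  # default arg freezes current value
    return [cb() for cb in callbacks]

print(make_callbacks_buggy([1, 2, 3]))  # [3, 3, 3]
print(make_callbacks_fixed([1, 2, 3]))  # [1, 2, 3]
```

In Go the equivalent fix before 1.22 was shadowing the loop variable (`i := i`) before launching the goroutine or registering the defer — which is the mitigation the project's triage labels ("Mitigated") refer to.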
|
281,374
| 24,388,050,664
|
IssuesEvent
|
2022-10-04 13:20:10
|
celestiaorg/test-infra
|
https://api.github.com/repos/celestiaorg/test-infra
|
closed
|
testground/tests: implement TC-004
|
enhancement test testground
|
After finishing #61 and #62 , we need to
- [ ] Create more composition files reflecting data
- [ ] Measure the sync times from both light and full nodes
Ref:
1. https://github.com/celestiaorg/test-infra/blob/main/docs/test-plans/001-Big-Blocks/test-cases/tc-004-full-light-past.md
2. #1
3. #55
|
2.0
|
testground/tests: implement TC-004 - After finishing #61 and #62 , we need to
- [ ] Create more composition files reflecting data
- [ ] Measure the sync times from both light and full nodes
Ref:
1. https://github.com/celestiaorg/test-infra/blob/main/docs/test-plans/001-Big-Blocks/test-cases/tc-004-full-light-past.md
2. #1
3. #55
|
test
|
testground tests implement tc after finishing and we need to create more composition files reflecting data measure the sync times from both light and full nodes ref
| 1
|
122,237
| 10,217,751,368
|
IssuesEvent
|
2019-08-15 14:23:48
|
DBCG/cql_engine
|
https://api.github.com/repos/DBCG/cql_engine
|
closed
|
Unexpected result in Multi Source Query
|
bug test created
|
Getting an unexpected output from a multi source query:
define "a":
{
{ code: 1, periods: {Interval[1, 2], Interval[3, 4]} }
}
define "b":
{
{ code: 1, periods: {Interval[1, 2], Interval[3, 4]} }
}
define "Multisource":
from "a" A, "b" B
>> Multisource [10:1] Index: 0, Size: 0
|
1.0
|
Unexpected result in Multi Source Query - Getting an unexpected output from a multi source query:
define "a":
{
{ code: 1, periods: {Interval[1, 2], Interval[3, 4]} }
}
define "b":
{
{ code: 1, periods: {Interval[1, 2], Interval[3, 4]} }
}
define "Multisource":
from "a" A, "b" B
>> Multisource [10:1] Index: 0, Size: 0
|
test
|
unexpected result in multi source query getting an unexpected output from a multi source query define a code periods interval interval define b code periods interval interval define multisource from a a b b multisource index size
| 1
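The multi-source query in the record above should behave like a Cartesian product of its sources, so two one-element sources ought to yield one combined row rather than the empty result reported. A Python sketch of the expected semantics (illustrative data mirroring the issue, not the CQL engine itself):

```python
# Sketch of expected multi-source query semantics: `from "a" A, "b" B`
# pairs every element of a with every element of b (Cartesian product).
import itertools

a = [{"code": 1, "periods": [(1, 2), (3, 4)]}]
b = [{"code": 1, "periods": [(1, 2), (3, 4)]}]

multisource = [{"A": x, "B": y} for x, y in itertools.product(a, b)]
print(len(multisource))  # expected 1 row, not the Size: 0 the engine reported
```

With one element per source the product has exactly one tuple, which is why the engine's `Size: 0` output indicated a bug in its multi-source evaluation.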
|
141,646
| 11,429,762,763
|
IssuesEvent
|
2020-02-04 08:44:39
|
proarc/proarc
|
https://api.github.com/repos/proarc/proarc
|
closed
|
Deletion of the import directory after a successful import
|
6 k testování Release-3.5.15
|
After a successful import, delete the import folder (proarc_import)
|
1.0
|
Deletion of the import directory after a successful import - After a successful import, delete the import folder (proarc_import)
|
test
|
smazání importního adresáře po úspěšném importu po úspěšném importu smazat importní složku proarc import
| 1
|
178,077
| 13,761,077,754
|
IssuesEvent
|
2020-10-07 07:06:29
|
OpenPaaS-Suite/esn-frontend-calendar
|
https://api.github.com/repos/OpenPaaS-Suite/esn-frontend-calendar
|
closed
|
As a user, I want the more important fields to precede the less important fields in the event dialog
|
QA:Testing enhancement
|
#### User story summary
As a user, I want the more important fields to precede the less important fields in the event dialog as they are much more frequently used.
#### Where to find the feature
1. Go to Calendar
2. Choose to create a new event or edit an existing event
3. The event dialog should be opened
#### Criteria
- [x] The "Attendees" and the "Resources" fields should follow the date/time fields and precede all the other fields.
#### UI/UX Design
- On desktop/laptop:

- On mobile devices:

|
1.0
|
As a user, I want the more important fields to precede the less important fields in the event dialog - #### User story summary
As a user, I want the more important fields to precede the less important fields in the event dialog as they are much more frequently used.
#### Where to find the feature
1. Go to Calendar
2. Choose to create a new event or edit an existing event
3. The event dialog should be opened
#### Criteria
- [x] The "Attendees" and the "Resources" fields should follow the date/time fields and precede all the other fields.
#### UI/UX Design
- On desktop/laptop:

- On mobile devices:

|
test
|
as a user i want the more important fields to precede the less important fields in the event dialog user story summary as a user i want the more important fields to precede the less important fields in the event dialog as they are much more frequently used where to find the feature go to calendar choose to create a new event or edit an existing event the event dialog should be opened criteria the attendees and the resources fields should follow the date time fields and precede all the other fields ui ux design on desktop laptop on mobile devices
| 1
|
115,734
| 14,880,521,865
|
IssuesEvent
|
2021-01-20 09:17:58
|
WordPress/gutenberg
|
https://api.github.com/repos/WordPress/gutenberg
|
closed
|
Add editor setting to toggle Breadcrumb UI on/off.
|
General Interface Needs Design Feedback
|
At the bottom of the editor there is a white toolbar that lists a breadcrumb-like trail of the currently selected block:
<img width="173" alt="image" src="https://user-images.githubusercontent.com/191598/101187755-c3669500-3622-11eb-8cbe-7be911b77670.png">
The intention of this toolbar is to make it easier to traverse from a child block to its parent(s). However, I've rarely seen users interact with this UI; it's generally "invisible" unless pointed out, and even then it's not always obvious what this list of blocks shows.
I suggest we add an option to the editor preference to control the display of this toolbar. Here's how the appearance preferences look now:
<img width="402" alt="image" src="https://user-images.githubusercontent.com/191598/101188032-25bf9580-3623-11eb-9e38-bee8dfda2687.png">
We could add a new option for "Display block breadcrumbs" and/or integrate the display of the breadcrumb toolbar with the existing "Reduce the interface" setting.
I think we should also consider disabling the breadcrumb toolbar by default.
|
1.0
|
Add editor setting to toggle Breadcrumb UI on/off. - At the bottom of the editor there is a white toolbar that lists a breadcrumb-like trail of the currently selected block:
<img width="173" alt="image" src="https://user-images.githubusercontent.com/191598/101187755-c3669500-3622-11eb-8cbe-7be911b77670.png">
The intention of this toolbar is to make it easier to traverse from a child block to its parent(s). However, I've rarely seen users interact with this UI; it's generally "invisible" unless pointed out, and even then it's not always obvious what this list of blocks shows.
I suggest we add an option to the editor preference to control the display of this toolbar. Here's how the appearance preferences look now:
<img width="402" alt="image" src="https://user-images.githubusercontent.com/191598/101188032-25bf9580-3623-11eb-9e38-bee8dfda2687.png">
We could add a new option for "Display block breadcrumbs" and/or integrate the display of the breadcrumb toolbar with the existing "Reduce the interface" setting.
I think we should also consider disabling the breadcrumb toolbar by default.
|
non_test
|
add editor setting to toggle breadcrumb ui on off at the bottom of the editor there is a white toolbar that lists a breadcrumb like trail of the currently selected block img width alt image src the intention of this toolbar is to make it easier to traverse from a child block to its parent s however i ve rarely seen users interact with this ui its generally invisible unless pointed out and even then its not always obvious what this list of blocks shows i suggest we add an option to the editor preference to control the display of this toolbar here s how the appearance preferences look now img width alt image src we could add a new option for display block breadcrumbs and or integrate the display of the breadcrumb toolbar with the existing reduce the interface setting i think we should also consider disabling the breadcrumb toolbar by default
| 0
|
535,373
| 15,687,219,002
|
IssuesEvent
|
2021-03-25 13:26:37
|
gsbelarus/check-and-cash
|
https://api.github.com/repos/gsbelarus/check-and-cash
|
closed
|
gedemin control center
|
POSitive:Cash Priority-Normal Severity - Minor
|
Positive Cash. After payment, the gedemin control center window comes to the foreground. Two clients have run into this problem, presumably after a new user was added. The clients are running different versions of the program.


|
1.0
|
gedemin control center - Positive Cash. After payment, the gedemin control center window comes to the foreground. Two clients have run into this problem, presumably after a new user was added. The clients are running different versions of the program.


|
non_test
|
gedemin control center positive cash после оплаты окно gedemin control center выходит на передний план у двоих клиентов появилась данная проблема предположительно после добавления нового пользователя версии программы у клиентов разные
| 0
|
201,074
| 15,172,591,168
|
IssuesEvent
|
2021-02-13 09:58:38
|
Slimefun/Slimefun4
|
https://api.github.com/repos/Slimefun/Slimefun4
|
opened
|
Slimefun + EcoEnchant Incompatible
|
🎯 Needs testing 🐞 Bug Report
|
<!-- FILL IN THE FORM BELOW -->
## :round_pushpin: Description (REQUIRED)
<!-- A clear and detailed description of what went wrong. -->
<!-- The more information you can provide, the easier we can handle this problem. -->
<!-- Start writing below this line -->
Hello, I was told to make an report with slimefun by a plugin developer name Auxilor. I'm using this plugin call ecoenchants which I will provide the link and wiki below. This plugin allows MC servers to have custom enchantments. There is a configuration that allow us to disable players from obtaining custom enchantments which are villagers, loot chests, and enchanted table. But I have them all disabled and this took me a while figure out.
Even with the enchanted table, It looks like slimefun is interfering with it. Even if I have enchanted table disable players can use Talisman of the magician and ender talisman of the magician to get the lucky bonus enchant which allows them to get eco enchants custom enchant from the enchanted table. I was wondering is there a way for you guys to somehow fix it from happening? I would like to allow players to use the talisman but not to obtain custom enchants from eco enchants.
Eco Enchant: https://www.spigotmc.org/resources/%E2%9A%A1-1-15-1-16-5-ecoenchants-%E2%9C%A8-220-custom-enchantments-%E2%9C%85-essentials-cmi-support.79573/
Eco Enchant Wiki: https://ecoenchants.willfp.com/enchantments/obtaining
## :bookmark_tabs: Steps to reproduce the Issue (REQUIRED)
<!-- Tell us the exact steps to reproduce this issue, the more detailed the easier we can reproduce it. -->
<!-- Youtube Videos and Screenshots are recommended!!! -->
<!-- Start writing below this line -->
I explained above.
## :bulb: Expected behavior (REQUIRED)
<!-- What were you expecting to happen? -->
<!-- What do you think would have been the correct behaviour? -->
<!-- Start writing below this line -->
I would like to have Talisman of the magician and ender talisman of the magician to not support eco enchants by allowing players to use it to get custom enchantments on their tools, armors, or books in any way when we disable the enchanted table section on eco enchants configuration.
## :scroll: Server Log
<!-- Take a look at your Server Log and post any errors you can find via https://pastebin.com/ -->
<!-- If you are unsure about it, post your full log, you can find it under /logs/latest.log -->
<!-- Paste your link(s) below this line -->
I do not have any server log or console errors.
## :open_file_folder: /error-reports/ Folder
<!-- Check the folder /plugins/Slimefun/error-reports/ and upload any files inside that folder. -->
<!-- You can also post these files via https://pastebin.com/ -->
<!-- Paste your link(s) below this line -->
I do not have any server log or console errors.
## :compass: Environment (REQUIRED)
<!-- Any issue without the exact version numbers will be closed! -->
<!-- "latest" IS NOT A VERSION NUMBER. -->
<!-- We recommend running "/sf versions" and showing us a screenshot of that. -->
<!-- Make sure that the screenshot covers the entire output of that command. -->
<!-- If your issue is related to other plugins, make sure to include the versions of these plugins too! -->
https://gyazo.com/eab352829db58bdd1835df76d6b8c60d
- Server Software: Paper
- Minecraft Version: 1.16.5
- Slimefun Version: vRC - 20 (git da6866ee)
|
1.0
|
Slimefun + EcoEnchant Incompatible - <!-- FILL IN THE FORM BELOW -->
## :round_pushpin: Description (REQUIRED)
<!-- A clear and detailed description of what went wrong. -->
<!-- The more information you can provide, the easier we can handle this problem. -->
<!-- Start writing below this line -->
Hello, I was told to make an report with slimefun by a plugin developer name Auxilor. I'm using this plugin call ecoenchants which I will provide the link and wiki below. This plugin allows MC servers to have custom enchantments. There is a configuration that allow us to disable players from obtaining custom enchantments which are villagers, loot chests, and enchanted table. But I have them all disabled and this took me a while figure out.
Even with the enchanted table, It looks like slimefun is interfering with it. Even if I have enchanted table disable players can use Talisman of the magician and ender talisman of the magician to get the lucky bonus enchant which allows them to get eco enchants custom enchant from the enchanted table. I was wondering is there a way for you guys to somehow fix it from happening? I would like to allow players to use the talisman but not to obtain custom enchants from eco enchants.
Eco Enchant: https://www.spigotmc.org/resources/%E2%9A%A1-1-15-1-16-5-ecoenchants-%E2%9C%A8-220-custom-enchantments-%E2%9C%85-essentials-cmi-support.79573/
Eco Enchant Wiki: https://ecoenchants.willfp.com/enchantments/obtaining
## :bookmark_tabs: Steps to reproduce the Issue (REQUIRED)
<!-- Tell us the exact steps to reproduce this issue, the more detailed the easier we can reproduce it. -->
<!-- Youtube Videos and Screenshots are recommended!!! -->
<!-- Start writing below this line -->
I explained above.
## :bulb: Expected behavior (REQUIRED)
<!-- What were you expecting to happen? -->
<!-- What do you think would have been the correct behaviour? -->
<!-- Start writing below this line -->
I would like to have Talisman of the magician and ender talisman of the magician to not support eco enchants by allowing players to use it to get custom enchantments on their tools, armors, or books in any way when we disable the enchanted table section on eco enchants configuration.
## :scroll: Server Log
<!-- Take a look at your Server Log and post any errors you can find via https://pastebin.com/ -->
<!-- If you are unsure about it, post your full log, you can find it under /logs/latest.log -->
<!-- Paste your link(s) below this line -->
I do not have any server log or console errors.
## :open_file_folder: /error-reports/ Folder
<!-- Check the folder /plugins/Slimefun/error-reports/ and upload any files inside that folder. -->
<!-- You can also post these files via https://pastebin.com/ -->
<!-- Paste your link(s) below this line -->
I do not have any server log or console errors.
## :compass: Environment (REQUIRED)
<!-- Any issue without the exact version numbers will be closed! -->
<!-- "latest" IS NOT A VERSION NUMBER. -->
<!-- We recommend running "/sf versions" and showing us a screenshot of that. -->
<!-- Make sure that the screenshot covers the entire output of that command. -->
<!-- If your issue is related to other plugins, make sure to include the versions of these plugins too! -->
https://gyazo.com/eab352829db58bdd1835df76d6b8c60d
- Server Software: Paper
- Minecraft Version: 1.16.5
- Slimefun Version: vRC - 20 (git da6866ee)
|
test
|
slimefun ecoenchant incompatible round pushpin description required hello i was told to make an report with slimefun by a plugin developer name auxilor i m using this plugin call ecoenchants which i will provide the link and wiki below this plugin allows mc servers to have custom enchantments there is a configuration that allow us to disable players from obtaining custom enchantments which are villagers loot chests and enchanted table but i have them all disabled and this took me a while figure out even with the enchanted table it looks like slimefun is interfering with it even if i have enchanted table disable players can use talisman of the magician and ender talisman of the magician to get the lucky bonus enchant which allows them to get eco enchants custom enchant from the enchanted table i was wondering is there a way for you guys to somehow fix it from happening i would like to allow players to use the talisman but not to obtain custom enchants from eco enchants eco enchant eco enchant wiki bookmark tabs steps to reproduce the issue required i explained above bulb expected behavior required i would like to have talisman of the magician and ender talisman of the magician to not support eco enchants by allowing players to use it to get custom enchantments on their tools armors or books in any way when we disable the enchanted table section on eco enchants configuraton scroll server log i do not have any server log or console errors open file folder error reports folder i do not have any server log or console errors compass environment required server software paper minecraft version slimefun version vrc git
| 1
|
193,132
| 6,881,893,902
|
IssuesEvent
|
2017-11-21 00:37:51
|
zulip/zulip
|
https://api.github.com/repos/zulip/zulip
|
closed
|
Move `tools/generate-custom-webfont-icon` to run as part of `update-prod-static`
|
area: tooling priority: high
|
I've determined that tools/generate-custom-webfont-icon never produces the same output twice (probably there's a timestamp in there). Given that it only takes 300ms to run, I think we want to move this to the update-prod-static / static/generated system. I'll open a follow-up issue.
We should move it to work more like `tools/setup/build_pygments_data` where it gets run in `provision` and `update-prod-static`, outputs to `static/generated/icons`, etc.
Tagging this as medium since this sort of technical debt is good to clean up quickly.
|
1.0
|
Move `tools/generate-custom-webfont-icon` to run as part of `update-prod-static` - I've determined that tools/generate-custom-webfont-icon never produces the same output twice (probably there's a timestamp in there). Given that it only takes 300ms to run, I think we want to move this to the update-prod-static / static/generated system. I'll open a follow-up issue.
We should move it to work more like `tools/setup/build_pygments_data` where it gets run in `provision` and `update-prod-static`, outputs to `static/generated/icons`, etc.
Tagging this as medium since this sort of technical debt is good to clean up quickly.
|
non_test
|
move tools generate custom webfont icon to run as part of update prod static i ve determined that tools generate custom webfont icon never produces the same output twice probably there s a timestamp in there given that it only takes to run i think we want to move this to the update prod static static generated system i ll open a follow up issue we should move it to work more like tools setup build pygments data where it gets run in provision and update prod static outputs to static generated icons etc tagging this as medium since this sort of technical debt is good to clean up quickly
| 0
|
121,003
| 10,146,264,293
|
IssuesEvent
|
2019-08-05 07:40:49
|
linz/linz-bde-copy
|
https://api.github.com/repos/linz/linz-bde-copy
|
closed
|
Add test for calls with -o switch
|
Stale testsuite
|
I noticed current `runtests.sh` script is not ever testing calls with `-o` switch (for output fields).
As a bug was found in using that switch, I think it will be important to add some.
\cc @imincik
|
1.0
|
Add test for calls with -o switch - I noticed current `runtests.sh` script is not ever testing calls with `-o` switch (for output fields).
As a bug was found in using that switch, I think it will be important to add some.
\cc @imincik
|
test
|
add test for calls with o switch i noticed current runtests sh script is not ever testing calls with o switch for output fields as a bug was found in using that switch i think it will be important to add some cc imincik
| 1
|
283,885
| 24,569,437,270
|
IssuesEvent
|
2022-10-13 07:26:52
|
longhorn/longhorn
|
https://api.github.com/repos/longhorn/longhorn
|
closed
|
[BUG] Volume attach API not working for RWX volume
|
kind/bug kind/test
|
## Describe the bug
(1) Try to attach a RWX volume to a node through API, the API response status code 200, but the RWX volume is still detached:
HTTP Request:
```
HTTP/1.1 POST /v1/volumes/test-2?action=attach
Host: 54.243.179.156:30007
Accept: application/json
Content-Type: application/json
Content-Length: 41
{
"attachedBy": "",
"hostId": "ip-10-0-1-55",
}
```
HTTP Response:
```
HTTP/1.1 200
connection: keep-alive
content-type: application/json
date: Thu, 13 Oct 2022 00:22:32 GMT
server: nginx/1.19.8
transfer-encoding: chunked
x-api-schemas: http://54.243.179.156:30007/v1/schemas
{
"accessMode": "rwx",
"actions": {
"[activate](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=activate"](http://54.243.179.156:30007/v1/volumes/test-2?action=activate),
"[attach](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=attach"](http://54.243.179.156:30007/v1/volumes/test-2?action=attach),
"[cancelExpansion](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=cancelExpansion"](http://54.243.179.156:30007/v1/volumes/test-2?action=cancelExpansion),
"[detach](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=detach"](http://54.243.179.156:30007/v1/volumes/test-2?action=detach),
"[engineUpgrade](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=engineUpgrade"](http://54.243.179.156:30007/v1/volumes/test-2?action=engineUpgrade),
"[expand](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=expand"](http://54.243.179.156:30007/v1/volumes/test-2?action=expand),
"[pvCreate](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=pvCreate"](http://54.243.179.156:30007/v1/volumes/test-2?action=pvCreate),
"[pvcCreate](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=pvcCreate"](http://54.243.179.156:30007/v1/volumes/test-2?action=pvcCreate),
"[recurringJobAdd](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=recurringJobAdd"](http://54.243.179.156:30007/v1/volumes/test-2?action=recurringJobAdd),
"[recurringJobDelete](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=recurringJobDelete"](http://54.243.179.156:30007/v1/volumes/test-2?action=recurringJobDelete),
"[recurringJobList](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=recurringJobList"](http://54.243.179.156:30007/v1/volumes/test-2?action=recurringJobList),
"[replicaRemove](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=replicaRemove"](http://54.243.179.156:30007/v1/volumes/test-2?action=replicaRemove),
"[updateAccessMode](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=updateAccessMode"](http://54.243.179.156:30007/v1/volumes/test-2?action=updateAccessMode),
"[updateDataLocality](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=updateDataLocality"](http://54.243.179.156:30007/v1/volumes/test-2?action=updateDataLocality),
"[updateReplicaAutoBalance](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=updateReplicaAutoBalance"](http://54.243.179.156:30007/v1/volumes/test-2?action=updateReplicaAutoBalance),
},
"backingImage": "",
"backupStatus": [ ],
"cloneStatus": {
"snapshot": "",
"sourceVolume": "",
"state": "",
},
"conditions": {
"restore": {
"lastProbeTime": "",
"lastTransitionTime": "2022-10-13T00:09:12Z",
"message": "",
"reason": "",
"status": "False",
"type": "restore",
},
"scheduled": {
"lastProbeTime": "",
"lastTransitionTime": "2022-10-13T00:09:12Z",
"message": "",
"reason": "",
"status": "True",
"type": "scheduled",
},
"toomanysnapshots": {
"lastProbeTime": "",
"lastTransitionTime": "2022-10-13T00:09:12Z",
"message": "",
"reason": "",
"status": "False",
"type": "toomanysnapshots",
},
},
"controllers": [ ],
"created": "2022-10-13 00:09:11 +0000 UTC",
"currentImage": "longhornio/longhorn-engine:v1.3.2-rc2",
"dataLocality": "disabled",
"dataSource": "",
"disableFrontend": false,
"diskSelector": [ ],
"encrypted": false,
"engineImage": "longhornio/longhorn-engine:v1.3.2-rc2",
"fromBackup": "",
"frontend": "blockdev",
"id": ["test-2"](http://54.243.179.156:30007/v1/volumes/test-2),
"kubernetesStatus": {
"lastPVCRefAt": "",
"lastPodRefAt": "",
"namespace": "",
"pvName": "",
"pvStatus": "",
"pvcName": "",
"workloadsStatus": null,
},
"lastAttachedBy": "",
"lastBackup": "",
"lastBackupAt": "",
"links": {
"self": ["…/v1/volumes/test-2"](http://54.243.179.156:30007/v1/volumes/test-2),
},
"migratable": false,
"name": "test-2",
"nodeSelector": [ ],
"numberOfReplicas": 3,
"purgeStatus": null,
"ready": true,
"rebuildStatus": [ ],
"recurringJobSelector": null,
"replicaAutoBalance": "ignored",
"replicas": [ ],
"restoreRequired": false,
"restoreStatus": [ ],
"revisionCounterDisabled": false,
"robustness": "unknown",
"shareEndpoint": "",
"shareState": "stopped",
"size": "21474836480",
"staleReplicaTimeout": 20,
"standby": false,
"state": "detached",
"type": "volume",
}
```
(2) Using Longhorn python client API can get the same result:
==> try to attach a RWX volume
https://github.com/longhorn/longhorn-tests/blob/master/manager/integration/tests/test_basic.py#L4599
==> the RWX volume is still detached
https://ci.longhorn.io/job/public/job/master/job/sles/job/amd64/job/longhorn-tests-sles-amd64/lastCompletedBuild/testReport/tests/test_basic/test_backup_volume_restore_with_access_mode_s3_rwx_rwo_/
(3) If try to attach a RWX volume directly through Longhorn UI, the `Maintenance` checkbox is automatically checked, and the volume can be attached successfully, but from API, cannot find this `Maintenance` field.
## To Reproduce
Steps to reproduce the behavior:
1. Create a RWX volume
2. Try to attach this volume to a node through API
## Expected behavior
The volume can be attached
## Log or Support bundle
If applicable, add the Longhorn managers' log or support bundle when the issue happens.
You can generate a Support Bundle using the link at the footer of the Longhorn UI.
## Environment
- Longhorn version: v1.3.2-rc2, master
- Installation method (e.g. Rancher Catalog App/Helm/Kubectl): kubectl
- Kubernetes distro (e.g. RKE/K3s/EKS/OpenShift) and version:
- Number of management node in the cluster:
- Number of worker node in the cluster:
- Node config
- OS type and version:
- CPU per node:
- Memory per node:
- Disk type(e.g. SSD/NVMe):
- Network bandwidth between the nodes:
- Underlying Infrastructure (e.g. on AWS/GCE, EKS/GKE, VMWare/KVM, Baremetal):
- Number of Longhorn volumes in the cluster:
## Additional context
Add any other context about the problem here.
|
1.0
|
[BUG] Volume attach API not working for RWX volume - ## Describe the bug
(1) Try to attach a RWX volume to a node through API, the API response status code 200, but the RWX volume is still detached:
HTTP Request:
```
HTTP/1.1 POST /v1/volumes/test-2?action=attach
Host: 54.243.179.156:30007
Accept: application/json
Content-Type: application/json
Content-Length: 41
{
"attachedBy": "",
"hostId": "ip-10-0-1-55",
}
```
HTTP Response:
```
HTTP/1.1 200
connection: keep-alive
content-type: application/json
date: Thu, 13 Oct 2022 00:22:32 GMT
server: nginx/1.19.8
transfer-encoding: chunked
x-api-schemas: http://54.243.179.156:30007/v1/schemas
{
"accessMode": "rwx",
"actions": {
"[activate](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=activate"](http://54.243.179.156:30007/v1/volumes/test-2?action=activate),
"[attach](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=attach"](http://54.243.179.156:30007/v1/volumes/test-2?action=attach),
"[cancelExpansion](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=cancelExpansion"](http://54.243.179.156:30007/v1/volumes/test-2?action=cancelExpansion),
"[detach](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=detach"](http://54.243.179.156:30007/v1/volumes/test-2?action=detach),
"[engineUpgrade](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=engineUpgrade"](http://54.243.179.156:30007/v1/volumes/test-2?action=engineUpgrade),
"[expand](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=expand"](http://54.243.179.156:30007/v1/volumes/test-2?action=expand),
"[pvCreate](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=pvCreate"](http://54.243.179.156:30007/v1/volumes/test-2?action=pvCreate),
"[pvcCreate](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=pvcCreate"](http://54.243.179.156:30007/v1/volumes/test-2?action=pvcCreate),
"[recurringJobAdd](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=recurringJobAdd"](http://54.243.179.156:30007/v1/volumes/test-2?action=recurringJobAdd),
"[recurringJobDelete](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=recurringJobDelete"](http://54.243.179.156:30007/v1/volumes/test-2?action=recurringJobDelete),
"[recurringJobList](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=recurringJobList"](http://54.243.179.156:30007/v1/volumes/test-2?action=recurringJobList),
"[replicaRemove](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=replicaRemove"](http://54.243.179.156:30007/v1/volumes/test-2?action=replicaRemove),
"[updateAccessMode](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=updateAccessMode"](http://54.243.179.156:30007/v1/volumes/test-2?action=updateAccessMode),
"[updateDataLocality](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=updateDataLocality"](http://54.243.179.156:30007/v1/volumes/test-2?action=updateDataLocality),
"[updateReplicaAutoBalance](http://54.243.179.156:30007/v1/volumes/test-2#)": ["…/v1/volumes/test-2?action=updateReplicaAutoBalance"](http://54.243.179.156:30007/v1/volumes/test-2?action=updateReplicaAutoBalance),
},
"backingImage": "",
"backupStatus": [ ],
"cloneStatus": {
"snapshot": "",
"sourceVolume": "",
"state": "",
},
"conditions": {
"restore": {
"lastProbeTime": "",
"lastTransitionTime": "2022-10-13T00:09:12Z",
"message": "",
"reason": "",
"status": "False",
"type": "restore",
},
"scheduled": {
"lastProbeTime": "",
"lastTransitionTime": "2022-10-13T00:09:12Z",
"message": "",
"reason": "",
"status": "True",
"type": "scheduled",
},
"toomanysnapshots": {
"lastProbeTime": "",
"lastTransitionTime": "2022-10-13T00:09:12Z",
"message": "",
"reason": "",
"status": "False",
"type": "toomanysnapshots",
},
},
"controllers": [ ],
"created": "2022-10-13 00:09:11 +0000 UTC",
"currentImage": "longhornio/longhorn-engine:v1.3.2-rc2",
"dataLocality": "disabled",
"dataSource": "",
"disableFrontend": false,
"diskSelector": [ ],
"encrypted": false,
"engineImage": "longhornio/longhorn-engine:v1.3.2-rc2",
"fromBackup": "",
"frontend": "blockdev",
"id": ["test-2"](http://54.243.179.156:30007/v1/volumes/test-2),
"kubernetesStatus": {
"lastPVCRefAt": "",
"lastPodRefAt": "",
"namespace": "",
"pvName": "",
"pvStatus": "",
"pvcName": "",
"workloadsStatus": null,
},
"lastAttachedBy": "",
"lastBackup": "",
"lastBackupAt": "",
"links": {
"self": ["…/v1/volumes/test-2"](http://54.243.179.156:30007/v1/volumes/test-2),
},
"migratable": false,
"name": "test-2",
"nodeSelector": [ ],
"numberOfReplicas": 3,
"purgeStatus": null,
"ready": true,
"rebuildStatus": [ ],
"recurringJobSelector": null,
"replicaAutoBalance": "ignored",
"replicas": [ ],
"restoreRequired": false,
"restoreStatus": [ ],
"revisionCounterDisabled": false,
"robustness": "unknown",
"shareEndpoint": "",
"shareState": "stopped",
"size": "21474836480",
"staleReplicaTimeout": 20,
"standby": false,
"state": "detached",
"type": "volume",
}
```
(2) Using Longhorn python client API can get the same result:
==> try to attach a RWX volume
https://github.com/longhorn/longhorn-tests/blob/master/manager/integration/tests/test_basic.py#L4599
==> the RWX volume is still detached
https://ci.longhorn.io/job/public/job/master/job/sles/job/amd64/job/longhorn-tests-sles-amd64/lastCompletedBuild/testReport/tests/test_basic/test_backup_volume_restore_with_access_mode_s3_rwx_rwo_/
(3) If try to attach a RWX volume directly through Longhorn UI, the `Maintenance` checkbox is automatically checked, and the volume can be attached successfully, but from API, cannot find this `Maintenance` field.
## To Reproduce
Steps to reproduce the behavior:
1. Create a RWX volume
2. Try to attach this volume to a node through API
## Expected behavior
The volume can be attached
## Log or Support bundle
If applicable, add the Longhorn managers' log or support bundle when the issue happens.
You can generate a Support Bundle using the link at the footer of the Longhorn UI.
## Environment
- Longhorn version: v1.3.2-rc2, master
- Installation method (e.g. Rancher Catalog App/Helm/Kubectl): kubectl
- Kubernetes distro (e.g. RKE/K3s/EKS/OpenShift) and version:
- Number of management node in the cluster:
- Number of worker node in the cluster:
- Node config
- OS type and version:
- CPU per node:
- Memory per node:
- Disk type(e.g. SSD/NVMe):
- Network bandwidth between the nodes:
- Underlying Infrastructure (e.g. on AWS/GCE, EKS/GKE, VMWare/KVM, Baremetal):
- Number of Longhorn volumes in the cluster:
## Additional context
Add any other context about the problem here.
|
test
|
volume attach api not working for rwx volume describe the bug try to attach a rwx volume to a node through api the api response status code but the rwx volume is still detached http request http post volumes test action attach host accept application json content type application json content length attachedby hostid ip http response http connection keep alive content type application json date thu oct gmt server nginx transfer encoding chunked x api schemas accessmode rwx actions backingimage backupstatus clonestatus snapshot sourcevolume state conditions restore lastprobetime lasttransitiontime message reason status false type restore scheduled lastprobetime lasttransitiontime message reason status true type scheduled toomanysnapshots lastprobetime lasttransitiontime message reason status false type toomanysnapshots controllers created utc currentimage longhornio longhorn engine datalocality disabled datasource disablefrontend false diskselector encrypted false engineimage longhornio longhorn engine frombackup frontend blockdev id kubernetesstatus lastpvcrefat lastpodrefat namespace pvname pvstatus pvcname workloadsstatus null lastattachedby lastbackup lastbackupat links self migratable false name test nodeselector numberofreplicas purgestatus null ready true rebuildstatus recurringjobselector null replicaautobalance ignored replicas restorerequired false restorestatus revisioncounterdisabled false robustness unknown shareendpoint sharestate stopped size stalereplicatimeout standby false state detached type volume using longhorn python client api can get the same result try to attach a rwx volume the rwx volume is still detached if try to attach a rwx volume directly through longhorn ui the maintenance checkbox is automatically checked and the volume can be attached successfully but from api cannot find this maintenance field to reproduce steps to reproduce the behavior create a rwx volume try to attach this volume to a node through api expected behavior the 
volume can be attached log or support bundle if applicable add the longhorn managers log or support bundle when the issue happens you can generate a support bundle using the link at the footer of the longhorn ui environment longhorn version master installation method e g rancher catalog app helm kubectl kubectl kubernetes distro e g rke eks openshift and version number of management node in the cluster number of worker node in the cluster node config os type and version cpu per node memory per node disk type e g ssd nvme network bandwidth between the nodes underlying infrastructure e g on aws gce eks gke vmware kvm baremetal number of longhorn volumes in the cluster additional context add any other context about the problem here
| 1
|
299,629
| 25,915,255,859
|
IssuesEvent
|
2022-12-15 16:50:58
|
brave/brave-browser
|
https://api.github.com/repos/brave/brave-browser
|
closed
|
Credentials expire for a new VPN account when system date is advanced on the client side
|
bug needs-discussion QA/Yes QA/Test-Plan-Specified OS/Desktop feature/vpn
|
<!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
<!--Provide a brief description of the issue-->
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. New profile,
2. Open brave://flags
3. Set brave://flags/#brave-vpn to `Enabled`
4. Load account.bravesoftware.com.
5. Enter basic-authentication credentials and click Sign in on the modal dialog
6. entered test1214vpn2@mailinator.com, clicked Get login link
7. clicked the link in the email
8. clicked Browse plans
9. scrolled down
10. clicked on Buy now for Brave VPN Subscription
11. completed the Stripe purchase flow
12. confirmed I could connect to the VPN
13. closed Brave browser
14. advanced the system date to 12/20 (Windows settings>>Time & language>>Current date and time)
15. relaunched Brave
16. login to account.bravesoftare.com
17. successfully logged in
18. clicked VPN button
## Actual result:
<!--Please add screenshots if needed-->
`Brave Firewall + VPN` modal rendered

## Expected result:
Credentials should be renewed/new credentials created
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
Easily
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
```
Brave | 1.46.144 Chromium: 108.0.5359.128 (Official Build) (64-bit)
-- | --
Revision | 1cd27afdb8e5d057070c0961e04c490d2aca1aa0-refs/branch-heads/5359@{#1185}
OS | Windows 11 Version 21H2 (Build 22000.1335)
```
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current release? Yes
- Can you reproduce this issue with the beta channel?
- Can you reproduce this issue with the nightly channel?
## Other Additional Information:
- Does the issue resolve itself when disabling Brave Shields?
- Does the issue resolve itself when disabling Brave Rewards?
- Is the issue reproducible on the latest version of Chrome?
## Miscellaneous Information:
<!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue-->@clifton @mattmcalister cc: @stephendonner
|
1.0
|
Credentials expire for a new VPN account when system date is advanced on the client side - <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
<!--Provide a brief description of the issue-->
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. New profile,
2. Open brave://flags
3. Set brave://flags/#brave-vpn to `Enabled`
4. Load account.bravesoftware.com.
5. Enter basic-authentication credentials and click Sign in on the modal dialog
6. entered test1214vpn2@mailinator.com, clicked Get login link
7. clicked the link in the email
8. clicked Browse plans
9. scrolled down
10. clicked on Buy now for Brave VPN Subscription
11. completed the Stripe purchase flow
12. confirmed I could connect to the VPN
13. closed Brave browser
14. advanced the system date to 12/20 (Windows settings>>Time & language>>Current date and time)
15. relaunched Brave
16. login to account.bravesoftare.com
17. successfully logged in
18. clicked VPN button
## Actual result:
<!--Please add screenshots if needed-->
`Brave Firewall + VPN` modal rendered

## Expected result:
Credentials should be renewed/new credentials created
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
Easily
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
```
Brave | 1.46.144 Chromium: 108.0.5359.128 (Official Build) (64-bit)
-- | --
Revision | 1cd27afdb8e5d057070c0961e04c490d2aca1aa0-refs/branch-heads/5359@{#1185}
OS | Windows 11 Version 21H2 (Build 22000.1335)
```
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current release? Yes
- Can you reproduce this issue with the beta channel?
- Can you reproduce this issue with the nightly channel?
## Other Additional Information:
- Does the issue resolve itself when disabling Brave Shields?
- Does the issue resolve itself when disabling Brave Rewards?
- Is the issue reproducible on the latest version of Chrome?
## Miscellaneous Information:
<!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue-->@clifton @mattmcalister cc: @stephendonner
|
test
|
credentials expire for a new vpn account when system date is advanced on the client side have you searched for similar issues before submitting this issue please check the open issues and add a note before logging a new issue please use the template below to provide information about the issue insufficient info will get the issue closed it will only be reopened after sufficient info is provided description steps to reproduce new profile open brave flags set brave flags brave vpn to enabled load account bravesoftware com enter basic authentication credentials and click sign in on the modal dialog entered mailinator com clicked get login link clicked the link in the email clicked browse plans scrolled down clicked on buy now for brave vpn subscription completed the stripe purchase flow confirmed i could connect to the vpn closed brave browser advanced the system date to windows settings time language current date and time relaunched brave login to account bravesoftare com successfully logged in clicked vpn button actual result brave firewall vpn modal rendered expected result credentials should be renewed new credentials created reproduces how often easily brave version brave version info brave chromium official build bit revision refs branch heads os windows version build version channel information can you reproduce this issue with the current release yes can you reproduce this issue with the beta channel can you reproduce this issue with the nightly channel other additional information does the issue resolve itself when disabling brave shields does the issue resolve itself when disabling brave rewards is the issue reproducible on the latest version of chrome miscellaneous information clifton mattmcalister cc stephendonner
| 1
|
283,753
| 30,913,539,427
|
IssuesEvent
|
2023-08-05 02:10:42
|
hshivhare67/kernel_v4.19.72
|
https://api.github.com/repos/hshivhare67/kernel_v4.19.72
|
reopened
|
CVE-2019-11884 (Low) detected in linuxlinux-4.19.282
|
Mend: dependency security vulnerability
|
## CVE-2019-11884 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.282</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/hshivhare67/kernel_v4.19.72/commit/139c4e073703974ca0b05255c4cff6dcd52a8e31">139c4e073703974ca0b05255c4cff6dcd52a8e31</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
The do_hidp_sock_ioctl function in net/bluetooth/hidp/sock.c in the Linux kernel before 5.0.15 allows a local user to obtain potentially sensitive information from kernel stack memory via a HIDPCONNADD command, because a name field may not end with a '\0' character.
<p>Publish Date: 2019-05-10
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-11884>CVE-2019-11884</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11884">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11884</a></p>
<p>Release Date: 2020-08-24</p>
<p>Fix Resolution: 5.0.15</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-11884 (Low) detected in linuxlinux-4.19.282 - ## CVE-2019-11884 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.282</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/hshivhare67/kernel_v4.19.72/commit/139c4e073703974ca0b05255c4cff6dcd52a8e31">139c4e073703974ca0b05255c4cff6dcd52a8e31</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
The do_hidp_sock_ioctl function in net/bluetooth/hidp/sock.c in the Linux kernel before 5.0.15 allows a local user to obtain potentially sensitive information from kernel stack memory via a HIDPCONNADD command, because a name field may not end with a '\0' character.
<p>Publish Date: 2019-05-10
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-11884>CVE-2019-11884</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11884">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11884</a></p>
<p>Release Date: 2020-08-24</p>
<p>Fix Resolution: 5.0.15</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve low detected in linuxlinux cve low severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch master vulnerable source files vulnerability details the do hidp sock ioctl function in net bluetooth hidp sock c in the linux kernel before allows a local user to obtain potentially sensitive information from kernel stack memory via a hidpconnadd command because a name field may not end with a character publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
38,949
| 8,559,443,977
|
IssuesEvent
|
2018-11-08 21:17:33
|
kentcdodds/ama
|
https://api.github.com/repos/kentcdodds/ama
|
closed
|
Code sharing between projects, how to manage?
|
code-help
|
First of all I would like to thank you for all your insightful videos and your projects such as testing libraries. I am day-by-day extending the test coverage of my project. Thank you, Kent.
I am currently working on a charity project were I am having two React/CRA projects which I would like to share code with each other. For example, I would like to share components such as a component which renders details of a profile. The main reason why I would like to share the code is to ensure that both CRA applications will display the same details.
If I understand it correctly, I could make a npm package and use it. Only this sounds really inconvenient to continuously push a new version to the git repo, run yarn install while working on the project. How do you share code between projects? How do you handle this while developing approach? Do you have any tips regarding this?
Oh yeah, I prefer to not publish the npm package in the registry as I think it's not useful for other people :)
|
1.0
|
Code sharing between projects, how to manage? - First of all I would like to thank you for all your insightful videos and your projects such as testing libraries. I am day-by-day extending the test coverage of my project. Thank you, Kent.
I am currently working on a charity project were I am having two React/CRA projects which I would like to share code with each other. For example, I would like to share components such as a component which renders details of a profile. The main reason why I would like to share the code is to ensure that both CRA applications will display the same details.
If I understand it correctly, I could make a npm package and use it. Only this sounds really inconvenient to continuously push a new version to the git repo, run yarn install while working on the project. How do you share code between projects? How do you handle this while developing approach? Do you have any tips regarding this?
Oh yeah, I prefer to not publish the npm package in the registry as I think it's not useful for other people :)
|
non_test
|
code sharing between projects how to manage first of all i would like to thank you for all your insightful videos and your projects such as testing libraries i am day by day extending the test coverage of my project thank you kent i am currently working on a charity project were i am having two react cra projects which i would like to share code with each other for example i would like to share components such as a component which renders details of a profile the main reason why i would like to share the code is to ensure that both cra applications will display the same details if i understand it correctly i could make a npm package and use it only this sounds really inconvenient to continuously push a new version to the git repo run yarn install while working on the project how do you share code between projects how do you handle this while developing approach do you have any tips regarding this oh yeah i prefer to not publish the npm package in the registry as i think it s not useful for other people
| 0
|
374,687
| 26,127,914,368
|
IssuesEvent
|
2022-12-28 21:44:25
|
yannellym/JavaBank
|
https://api.github.com/repos/yannellym/JavaBank
|
closed
|
Merge the the prompt and verify funcs
|
documentation enhancement
|
Could potentially merge the promptuserforAccnumberandpin and verify functions to cut down wet code
|
1.0
|
Merge the the prompt and verify funcs - Could potentially merge the promptuserforAccnumberandpin and verify functions to cut down wet code
|
non_test
|
merge the the prompt and verify funcs could potentially merge the promptuserforaccnumberandpin and verify functions to cut down wet code
| 0
|
9,537
| 3,052,145,382
|
IssuesEvent
|
2015-08-12 13:18:10
|
rssidlowski/Pollution_Source_Tracking
|
https://api.github.com/repos/rssidlowski/Pollution_Source_Tracking
|
closed
|
Link Sample: unable to successfully link sample
|
bug COBDev Ready for Testing moderate priority
|
As reported from the PST team: We have found that the Link Sample feature in the PST application does not work correctly. Several staff members have tried over the past few weeks and all have had the same experience. After clicking on the “Link Sample” button the application guides us to choose an existing sample to link to the open investigation. At this time we can choose a sample from the map and it will ask to confirm the link. After confirming the link by clinking the yes, nothing happens. The selected sample remains only in its original investigation.
I recreated this behavior in development. I am unable to successfully link a sample even though the message displays that it was linked. I don't see the sample in the sample list for the current investigation.
|
1.0
|
Link Sample: unable to successfully link sample - As reported from the PST team: We have found that the Link Sample feature in the PST application does not work correctly. Several staff members have tried over the past few weeks and all have had the same experience. After clicking on the “Link Sample” button the application guides us to choose an existing sample to link to the open investigation. At this time we can choose a sample from the map and it will ask to confirm the link. After confirming the link by clinking the yes, nothing happens. The selected sample remains only in its original investigation.
I recreated this behavior in development. I am unable to successfully link a sample even though the message displays that it was linked. I don't see the sample in the sample list for the current investigation.
|
test
|
link sample unable to successfully link sample as reported from the pst team we have found that the link sample feature in the pst application does not work correctly several staff members have tried over the past few weeks and all have had the same experience after clicking on the “link sample” button the application guides us to choose an existing sample to link to the open investigation at this time we can choose a sample from the map and it will ask to confirm the link after confirming the link by clinking the yes nothing happens the selected sample remains only in its original investigation i recreated this behavior in development i am unable to successfully link a sample even though the message displays that it was linked i don t see the sample in the sample list for the current investigation
| 1
|
228,299
| 18,169,728,005
|
IssuesEvent
|
2021-09-27 18:27:54
|
microsoft/vscode-jupyter
|
https://api.github.com/repos/microsoft/vscode-jupyter
|
opened
|
Ensure white background is applied to just the plot and not the entire output area for matplot lib plots
|
testplan-item
|
Refs: https://github.com/microsoft/vscode-jupyter/issues/7470
- [ ] anyOS
Complexity: 3
[Create Issue](https://github.com/microsoft/vscode-jupyter/issues/new?body=Testing+%237470%0A%0A&assignees=DonJayamanne)
---
Retina display option for Matplotlib does not work as intended
**Testing**
* Install Python
* Install Python & Jupyter extension
* Change theme to dark (anything other than light/white)
* Install Matplotlib
* Open an ipynb file
* Create a Python cell with the following code & run it
```
%pip install -U matplotlib
```
* Or use the following documentation (https://matplotlib.org/stable/users/installing.html)
* Create a Python cell with the following code & run it
```python
import matplotlib.pyplot as plt
plt.figure()
plt.plot([1,2], [1,2])
plt.show()
```
* The background of the entire output area should not be white (only the plot should be white)
**Heres a sample of how it used to render (bug):**

**Heres a sample of how it should render (after the fix):**

|
1.0
|
Ensure white background is applied to just the plot and not the entire output area for matplot lib plots - Refs: https://github.com/microsoft/vscode-jupyter/issues/7470
- [ ] anyOS
Complexity: 3
[Create Issue](https://github.com/microsoft/vscode-jupyter/issues/new?body=Testing+%237470%0A%0A&assignees=DonJayamanne)
---
Retina display option for Matplotlib does not work as intended
**Testing**
* Install Python
* Install Python & Jupyter extension
* Change theme to dark (anything other than light/white)
* Install Matplotlib
* Open an ipynb file
* Create a Python cell with the following code & run it
```
%pip install -U matplotlib
```
* Or use the following documentation (https://matplotlib.org/stable/users/installing.html)
* Create a Python cell with the following code & run it
```python
import matplotlib.pyplot as plt
plt.figure()
plt.plot([1,2], [1,2])
plt.show()
```
* The background of the entire output area should not be white (only the plot should be white)
**Heres a sample of how it used to render (bug):**

**Heres a sample of how it should render (after the fix):**

|
test
|
ensure white background is applied to just the plot and not the entire output area for matplot lib plots refs anyos complexity retina display option for matplotlib does not work as intended testing install python install python jupyter extension change theme to dark anything other than light white install matplotlib open an ipynb file create a python cell with the following code run it pip install u matplotlib or use the following documentation create a python cell with the following code run it python import matplotlib pyplot as plt plt figure plt plot plt show the background of the entire output area should not be white only the plot should be white heres a sample of how it used to render bug heres a sample of how it should render after the fix
| 1
|
291,045
| 21,913,962,609
|
IssuesEvent
|
2022-05-21 14:08:49
|
SebastianZolkwer/obligatorio-agil2-SebastianZolkwer-MauroWynter-AlanGarfinkel
|
https://api.github.com/repos/SebastianZolkwer/obligatorio-agil2-SebastianZolkwer-MauroWynter-AlanGarfinkel
|
opened
|
Registro de esfuerzo de los integrantes por tarea
|
documentation
|
Parte del TODO de la Entrega 2: Se debe llevar detalle de registro de esfuerzo por tarea e integrantes.
|
1.0
|
Registro de esfuerzo de los integrantes por tarea - Parte del TODO de la Entrega 2: Se debe llevar detalle de registro de esfuerzo por tarea e integrantes.
|
non_test
|
registro de esfuerzo de los integrantes por tarea parte del todo de la entrega se debe llevar detalle de registro de esfuerzo por tarea e integrantes
| 0
|
25,638
| 3,953,191,309
|
IssuesEvent
|
2016-04-29 12:28:57
|
codeforboston/cornerwise
|
https://api.github.com/repos/codeforboston/cornerwise
|
closed
|
Project detail view
|
css design javascript
|
Containing additional budget details, justification, project description, and associated address(es).
|
1.0
|
Project detail view - Containing additional budget details, justification, project description, and associated address(es).
|
non_test
|
project detail view containing additional budget details justification project description and associated address es
| 0
|
349,043
| 31,769,448,792
|
IssuesEvent
|
2023-09-12 10:48:48
|
BookStackApp/BookStack
|
https://api.github.com/repos/BookStackApp/BookStack
|
closed
|
Sorting of books is lost when copying
|
:bug: Bug :mag: Testing required
|
### Describe the Bug
We have created a book as a template in our bookstack and now want to copy it. However, the manual sorting of the book is lost and is sorted automatically (by name?)
### Steps to Reproduce
1. Create a book with a fixed strcuture
2. hit the copy button
3. strucutre is lost
### Expected Behaviour
After clicking on copy the structure should be maintained.
### Screenshots or Additional Context
_No response_
### Browser Details
_No response_
### Exact BookStack Version
v23.06.2
### PHP Version
_No response_
### Hosting Environment
Ubuntu 20.04 VPS, installed using official installation script
|
1.0
|
Sorting of books is lost when copying - ### Describe the Bug
We have created a book as a template in our bookstack and now want to copy it. However, the manual sorting of the book is lost and is sorted automatically (by name?)
### Steps to Reproduce
1. Create a book with a fixed strcuture
2. hit the copy button
3. strucutre is lost
### Expected Behaviour
After clicking on copy the structure should be maintained.
### Screenshots or Additional Context
_No response_
### Browser Details
_No response_
### Exact BookStack Version
v23.06.2
### PHP Version
_No response_
### Hosting Environment
Ubuntu 20.04 VPS, installed using official installation script
|
test
|
sorting of books is lost when copying describe the bug we have created a book as a template in our bookstack and now want to copy it however the manual sorting of the book is lost and is sorted automatically by name steps to reproduce create a book with a fixed strcuture hit the copy button strucutre is lost expected behaviour after clicking on copy the structure should be maintained screenshots or additional context no response browser details no response exact bookstack version php version no response hosting environment ubuntu vps installed using official installation script
| 1
|
86,178
| 24,778,252,670
|
IssuesEvent
|
2022-10-24 00:34:11
|
haskell/cabal
|
https://api.github.com/repos/haskell/cabal
|
closed
|
cabal-install 3.8.1.0 regression: can't control the order of hs-source-dirs anymore
|
type: bug cabal-install: cmd/build attention: needs-backport 3.8 regression in 3.8
|
**Describe the bug**
`cabal-install` apparently seems to try to preprocess *all* modules in `hs-source-dirs`, regardless of whether they are listed in `exposed-modules`, `other-modules` or required for compilation. This can lead to the build of a component failing because preprocessing of a module fails that isn't actually required to compile the component.
**To Reproduce**
Checkout https://github.com/sternenseemann/spacecookie/tree/75275cf971197f1b1da4464b17e39129428e527b and run `cabal v2-build test`.
The test suite includes the `server` directory additionally to be able to import a single internal module of the executable component (all needed modules are listed explicitly in `other-modules`). This causes cabal-install since 3.8.1.0 to preprocess unrelated modules and fail (because they have a `MIN_VERSION_aeson` macro that won't work if the component doesn't depend on `aeson`). With `cabal-install` 3.6.2.0` it is possible to build the test suite without any problems.
**Expected behavior**
The test suite should compile successfully.
**System information**
- NixOS
- `cabal` 3.8.1.0, `ghc` 9.0.2
|
1.0
|
cabal-install 3.8.1.0 regression: can't control the order of hs-source-dirs anymore - **Describe the bug**
`cabal-install` apparently seems to try to preprocess *all* modules in `hs-source-dirs`, regardless of whether they are listed in `exposed-modules`, `other-modules` or required for compilation. This can lead to the build of a component failing because preprocessing of a module fails that isn't actually required to compile the component.
**To Reproduce**
Checkout https://github.com/sternenseemann/spacecookie/tree/75275cf971197f1b1da4464b17e39129428e527b and run `cabal v2-build test`.
The test suite includes the `server` directory additionally to be able to import a single internal module of the executable component (all needed modules are listed explicitly in `other-modules`). This causes cabal-install since 3.8.1.0 to preprocess unrelated modules and fail (because they have a `MIN_VERSION_aeson` macro that won't work if the component doesn't depend on `aeson`). With `cabal-install` 3.6.2.0` it is possible to build the test suite without any problems.
**Expected behavior**
The test suite should compile successfully.
**System information**
- NixOS
- `cabal` 3.8.1.0, `ghc` 9.0.2
|
non_test
|
cabal install regression can t control the order of hs source dirs anymore describe the bug cabal install apparently seems to try to preprocess all modules in hs source dirs regardless of whether they are listed in exposed modules other modules or required for compilation this can lead to the build of a component failing because preprocessing of a module fails that isn t actually required to compile the component to reproduce checkout and run cabal build test the test suite includes the server directory additionally to be able to import a single internal module of the executable component all needed modules are listed explicitly in other modules this causes cabal install since to preprocess unrelated modules and fail because they have a min version aeson macro that won t work if the component doesn t depend on aeson with cabal install it is possible to build the test suite without any problems expected behavior the test suite should compile successfully system information nixos cabal ghc
| 0
|
339,242
| 10,245,029,208
|
IssuesEvent
|
2019-08-20 11:55:21
|
CW-Khristos/scripts
|
https://api.github.com/repos/CW-Khristos/scripts
|
closed
|
Auto_Plan - Domain Environments
|
Agent / Probe Auto_Plan PRIORITY Protection Plans enhancement needs validation
|
Auto_Plan needs to be able to join device to domain, create RMMTech Domain Admin, add RMMTech Admin to Domain Admin group, add Domain Admin / RMMTech Admin to Local Admin group, and grant Service Logon rights to RMMTech Domain Admin
For a video review of complete progression through each "Stage" of script, watch this "proof of concept" [video](https://computerwarriors.sharepoint.com/:v:/s/BizClass/EayQhXv3zmhOgFjbcqgO45sBOK-4IwkZo6VLlMCbiBA0mQ?e=WQJcku)
- [ ] Join device to domain
- wmic.exe /interactive:off ComputerSystem Where name="%computername%" call JoinDomainOrWorkgroup FJoinOptions=3 Name="myDom.local" UserName="myDom\UsrName" Password="@passwrd!@" AccountOU="OU=MyClients;OU=MyOrg;DC=myDom;DC=local"
- [ ] RMMTech Admin - Domain Admin
- https://ss64.com/nt/dsadd.html
- Create RMMTech in AD Users
- Add RMMTech to AD Domain Admin / req groups
- net user username password /ADD /DOMAIN (device must already be member of domain)
- net group groupname username[ ...] {/add | /delete} [/domain]
- [ ] Domain Admin - Local Admin Group
- https://ss64.com/nt/net-useradmin.html
- Add RMMTech to local admin on domain devices
- NET LOCALGROUP "group name" username /ADD /DOMAIN
- [ ] Service Logon - Domain RMMTech
|
1.0
|
Auto_Plan - Domain Environments - Auto_Plan needs to be able to join device to domain, create RMMTech Domain Admin, add RMMTech Admin to Domain Admin group, add Domain Admin / RMMTech Admin to Local Admin group, and grant Service Logon rights to RMMTech Domain Admin
For a video review of complete progression through each "Stage" of script, watch this "proof of concept" [video](https://computerwarriors.sharepoint.com/:v:/s/BizClass/EayQhXv3zmhOgFjbcqgO45sBOK-4IwkZo6VLlMCbiBA0mQ?e=WQJcku)
- [ ] Join device to domain
- wmic.exe /interactive:off ComputerSystem Where name="%computername%" call JoinDomainOrWorkgroup FJoinOptions=3 Name="myDom.local" UserName="myDom\UsrName" Password="@passwrd!@" AccountOU="OU=MyClients;OU=MyOrg;DC=myDom;DC=local"
- [ ] RMMTech Admin - Domain Admin
- https://ss64.com/nt/dsadd.html
- Create RMMTech in AD Users
- Add RMMTech to AD Domain Admin / req groups
- net user username password /ADD /DOMAIN (device must already be member of domain)
- net group groupname username[ ...] {/add | /delete} [/domain]
- [ ] Domain Admin - Local Admin Group
- https://ss64.com/nt/net-useradmin.html
- Add RMMTech to local admin on domain devices
- NET LOCALGROUP "group name" username /ADD /DOMAIN
- [ ] Service Logon - Domain RMMTech
|
non_test
|
auto plan domain environments auto plan needs to be able to join device to domain create rmmtech domain admin add rmmtech admin to domain admin group add domain admin rmmtech admin to local admin group and grant service logon rights to rmmtech domain admin for a video review of complete progression through each stage of script watch this proof of concept join device to domain wmic exe interactive off computersystem where name computername call joindomainorworkgroup fjoinoptions name mydom local username mydom usrname password passwrd accountou ou myclients ou myorg dc mydom dc local rmmtech admin domain admin create rmmtech in ad users add rmmtech to ad domain admin req groups net user username password add domain device must already be member of domain net group groupname username add delete domain admin local admin group add rmmtech to local admin on domain devices net localgroup group name username add domain service logon domain rmmtech
| 0
|
328,518
| 28,123,042,671
|
IssuesEvent
|
2023-03-31 15:27:12
|
unifyai/ivy
|
https://api.github.com/repos/unifyai/ivy
|
reopened
|
Fix jax_lax_operators.test_jax_lax_shift_right_logical
|
JAX Frontend Sub Task Failing Test
|
| | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4570376767/jobs/8067642702" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4570376767/jobs/8067642702" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4570376767/jobs/8067642702" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/4570376767/jobs/8067642702" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_lax_operators.py::test_jax_lax_shift_right_logical[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-31T01:48:56.7849244Z E AssertionError: the ground truth framework jax returned a int8 datatype while the backend jax returned a int64 datatype
2023-03-31T01:48:56.7850033Z E Falsifying example: test_jax_lax_shift_right_logical(
2023-03-31T01:48:56.7851410Z E dtype_and_x=(['int8', 'int8'],
2023-03-31T01:48:56.7851739Z E [array(0, dtype=int8), array(0, dtype=int8)]),
2023-03-31T01:48:56.7852202Z E fn_tree='ivy.functional.frontends.jax.lax.shift_right_logical',
2023-03-31T01:48:56.7852781Z E test_flags=FrontendFunctionTestFlags(
2023-03-31T01:48:56.7853085Z E num_positional_args=0,
2023-03-31T01:48:56.7853338Z E with_out=False,
2023-03-31T01:48:56.7853900Z E inplace=False,
2023-03-31T01:48:56.7854308Z E as_variable=[False],
2023-03-31T01:48:56.7854535Z E native_arrays=[False],
2023-03-31T01:48:56.7854993Z E generate_frontend_arrays=True,
2023-03-31T01:48:56.7855217Z E ),
2023-03-31T01:48:56.7855467Z E on_device='cpu',
2023-03-31T01:48:56.7855714Z E frontend='jax',
2023-03-31T01:48:56.7855914Z E )
2023-03-31T01:48:56.7856091Z E
2023-03-31T01:48:56.7856566Z E You can reproduce this example by temporarily adding @reproduce_failure('6.70.1', b'AAIAAAAAAAAAAQ==') as a decorator on your test case
</details>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_lax_operators.py::test_jax_lax_shift_right_logical[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-31T01:48:56.7849244Z E AssertionError: the ground truth framework jax returned a int8 datatype while the backend jax returned a int64 datatype
2023-03-31T01:48:56.7850033Z E Falsifying example: test_jax_lax_shift_right_logical(
2023-03-31T01:48:56.7851410Z E dtype_and_x=(['int8', 'int8'],
2023-03-31T01:48:56.7851739Z E [array(0, dtype=int8), array(0, dtype=int8)]),
2023-03-31T01:48:56.7852202Z E fn_tree='ivy.functional.frontends.jax.lax.shift_right_logical',
2023-03-31T01:48:56.7852781Z E test_flags=FrontendFunctionTestFlags(
2023-03-31T01:48:56.7853085Z E num_positional_args=0,
2023-03-31T01:48:56.7853338Z E with_out=False,
2023-03-31T01:48:56.7853900Z E inplace=False,
2023-03-31T01:48:56.7854308Z E as_variable=[False],
2023-03-31T01:48:56.7854535Z E native_arrays=[False],
2023-03-31T01:48:56.7854993Z E generate_frontend_arrays=True,
2023-03-31T01:48:56.7855217Z E ),
2023-03-31T01:48:56.7855467Z E on_device='cpu',
2023-03-31T01:48:56.7855714Z E frontend='jax',
2023-03-31T01:48:56.7855914Z E )
2023-03-31T01:48:56.7856091Z E
2023-03-31T01:48:56.7856566Z E You can reproduce this example by temporarily adding @reproduce_failure('6.70.1', b'AAIAAAAAAAAAAQ==') as a decorator on your test case
</details>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_lax_operators.py::test_jax_lax_shift_right_logical[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-31T01:48:56.7849244Z E AssertionError: the ground truth framework jax returned a int8 datatype while the backend jax returned a int64 datatype
2023-03-31T01:48:56.7850033Z E Falsifying example: test_jax_lax_shift_right_logical(
2023-03-31T01:48:56.7851410Z E dtype_and_x=(['int8', 'int8'],
2023-03-31T01:48:56.7851739Z E [array(0, dtype=int8), array(0, dtype=int8)]),
2023-03-31T01:48:56.7852202Z E fn_tree='ivy.functional.frontends.jax.lax.shift_right_logical',
2023-03-31T01:48:56.7852781Z E test_flags=FrontendFunctionTestFlags(
2023-03-31T01:48:56.7853085Z E num_positional_args=0,
2023-03-31T01:48:56.7853338Z E with_out=False,
2023-03-31T01:48:56.7853900Z E inplace=False,
2023-03-31T01:48:56.7854308Z E as_variable=[False],
2023-03-31T01:48:56.7854535Z E native_arrays=[False],
2023-03-31T01:48:56.7854993Z E generate_frontend_arrays=True,
2023-03-31T01:48:56.7855217Z E ),
2023-03-31T01:48:56.7855467Z E on_device='cpu',
2023-03-31T01:48:56.7855714Z E frontend='jax',
2023-03-31T01:48:56.7855914Z E )
2023-03-31T01:48:56.7856091Z E
2023-03-31T01:48:56.7856566Z E You can reproduce this example by temporarily adding @reproduce_failure('6.70.1', b'AAIAAAAAAAAAAQ==') as a decorator on your test case
</details>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_lax_operators.py::test_jax_lax_shift_right_logical[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-31T01:48:56.7849244Z E AssertionError: the ground truth framework jax returned a int8 datatype while the backend jax returned a int64 datatype
2023-03-31T01:48:56.7850033Z E Falsifying example: test_jax_lax_shift_right_logical(
2023-03-31T01:48:56.7851410Z E dtype_and_x=(['int8', 'int8'],
2023-03-31T01:48:56.7851739Z E [array(0, dtype=int8), array(0, dtype=int8)]),
2023-03-31T01:48:56.7852202Z E fn_tree='ivy.functional.frontends.jax.lax.shift_right_logical',
2023-03-31T01:48:56.7852781Z E test_flags=FrontendFunctionTestFlags(
2023-03-31T01:48:56.7853085Z E num_positional_args=0,
2023-03-31T01:48:56.7853338Z E with_out=False,
2023-03-31T01:48:56.7853900Z E inplace=False,
2023-03-31T01:48:56.7854308Z E as_variable=[False],
2023-03-31T01:48:56.7854535Z E native_arrays=[False],
2023-03-31T01:48:56.7854993Z E generate_frontend_arrays=True,
2023-03-31T01:48:56.7855217Z E ),
2023-03-31T01:48:56.7855467Z E on_device='cpu',
2023-03-31T01:48:56.7855714Z E frontend='jax',
2023-03-31T01:48:56.7855914Z E )
2023-03-31T01:48:56.7856091Z E
2023-03-31T01:48:56.7856566Z E You can reproduce this example by temporarily adding @reproduce_failure('6.70.1', b'AAIAAAAAAAAAAQ==') as a decorator on your test case
</details>
|
1.0
|
Fix jax_lax_operators.test_jax_lax_shift_right_logical - | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4570376767/jobs/8067642702" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4570376767/jobs/8067642702" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4570376767/jobs/8067642702" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/4570376767/jobs/8067642702" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_lax_operators.py::test_jax_lax_shift_right_logical[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-31T01:48:56.7849244Z E AssertionError: the ground truth framework jax returned a int8 datatype while the backend jax returned a int64 datatype
2023-03-31T01:48:56.7850033Z E Falsifying example: test_jax_lax_shift_right_logical(
2023-03-31T01:48:56.7851410Z E dtype_and_x=(['int8', 'int8'],
2023-03-31T01:48:56.7851739Z E [array(0, dtype=int8), array(0, dtype=int8)]),
2023-03-31T01:48:56.7852202Z E fn_tree='ivy.functional.frontends.jax.lax.shift_right_logical',
2023-03-31T01:48:56.7852781Z E test_flags=FrontendFunctionTestFlags(
2023-03-31T01:48:56.7853085Z E num_positional_args=0,
2023-03-31T01:48:56.7853338Z E with_out=False,
2023-03-31T01:48:56.7853900Z E inplace=False,
2023-03-31T01:48:56.7854308Z E as_variable=[False],
2023-03-31T01:48:56.7854535Z E native_arrays=[False],
2023-03-31T01:48:56.7854993Z E generate_frontend_arrays=True,
2023-03-31T01:48:56.7855217Z E ),
2023-03-31T01:48:56.7855467Z E on_device='cpu',
2023-03-31T01:48:56.7855714Z E frontend='jax',
2023-03-31T01:48:56.7855914Z E )
2023-03-31T01:48:56.7856091Z E
2023-03-31T01:48:56.7856566Z E You can reproduce this example by temporarily adding @reproduce_failure('6.70.1', b'AAIAAAAAAAAAAQ==') as a decorator on your test case
</details>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_lax_operators.py::test_jax_lax_shift_right_logical[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-31T01:48:56.7849244Z E AssertionError: the ground truth framework jax returned a int8 datatype while the backend jax returned a int64 datatype
2023-03-31T01:48:56.7850033Z E Falsifying example: test_jax_lax_shift_right_logical(
2023-03-31T01:48:56.7851410Z E dtype_and_x=(['int8', 'int8'],
2023-03-31T01:48:56.7851739Z E [array(0, dtype=int8), array(0, dtype=int8)]),
2023-03-31T01:48:56.7852202Z E fn_tree='ivy.functional.frontends.jax.lax.shift_right_logical',
2023-03-31T01:48:56.7852781Z E test_flags=FrontendFunctionTestFlags(
2023-03-31T01:48:56.7853085Z E num_positional_args=0,
2023-03-31T01:48:56.7853338Z E with_out=False,
2023-03-31T01:48:56.7853900Z E inplace=False,
2023-03-31T01:48:56.7854308Z E as_variable=[False],
2023-03-31T01:48:56.7854535Z E native_arrays=[False],
2023-03-31T01:48:56.7854993Z E generate_frontend_arrays=True,
2023-03-31T01:48:56.7855217Z E ),
2023-03-31T01:48:56.7855467Z E on_device='cpu',
2023-03-31T01:48:56.7855714Z E frontend='jax',
2023-03-31T01:48:56.7855914Z E )
2023-03-31T01:48:56.7856091Z E
2023-03-31T01:48:56.7856566Z E You can reproduce this example by temporarily adding @reproduce_failure('6.70.1', b'AAIAAAAAAAAAAQ==') as a decorator on your test case
</details>
|
test
|
fix jax lax operators test jax lax shift right logical tensorflow img src torch img src numpy img src jax img src failed ivy tests test ivy test frontends test jax test jax lax operators py test jax lax shift right logical e assertionerror the ground truth framework jax returned a datatype while the backend jax returned a datatype e falsifying example test jax lax shift right logical e dtype and x e e fn tree ivy functional frontends jax lax shift right logical e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e generate frontend arrays true e e on device cpu e frontend jax e e e you can reproduce this example by temporarily adding reproduce failure b aaiaaaaaaaaaaq as a decorator on your test case failed ivy tests test ivy test frontends test jax test jax lax operators py test jax lax shift right logical e assertionerror the ground truth framework jax returned a datatype while the backend jax returned a datatype e falsifying example test jax lax shift right logical e dtype and x e e fn tree ivy functional frontends jax lax shift right logical e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e generate frontend arrays true e e on device cpu e frontend jax e e e you can reproduce this example by temporarily adding reproduce failure b aaiaaaaaaaaaaq as a decorator on your test case failed ivy tests test ivy test frontends test jax test jax lax operators py test jax lax shift right logical e assertionerror the ground truth framework jax returned a datatype while the backend jax returned a datatype e falsifying example test jax lax shift right logical e dtype and x e e fn tree ivy functional frontends jax lax shift right logical e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e generate frontend arrays true e e on device cpu e frontend jax e e e 
you can reproduce this example by temporarily adding reproduce failure b aaiaaaaaaaaaaq as a decorator on your test case failed ivy tests test ivy test frontends test jax test jax lax operators py test jax lax shift right logical e assertionerror the ground truth framework jax returned a datatype while the backend jax returned a datatype e falsifying example test jax lax shift right logical e dtype and x e e fn tree ivy functional frontends jax lax shift right logical e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e generate frontend arrays true e e on device cpu e frontend jax e e e you can reproduce this example by temporarily adding reproduce failure b aaiaaaaaaaaaaq as a decorator on your test case
| 1
|
10,827
| 3,143,537,186
|
IssuesEvent
|
2015-09-14 07:47:08
|
tripikad/trip2
|
https://api.github.com/repos/tripikad/trip2
|
closed
|
Write login and registration tests
|
simple testing
|
Basically copy and adjust these tests:
https://github.com/laracasts/Email-Verification-In-Laravel/blob/master/tests/AuthTest.php
See the related video https://laracasts.com/lessons/email-verification-in-laravel
For mail testing, see https://github.com/bertramtruong/mailtrap
|
1.0
|
Write login and registration tests - Basically copy and adjust these tests:
https://github.com/laracasts/Email-Verification-In-Laravel/blob/master/tests/AuthTest.php
See the related video https://laracasts.com/lessons/email-verification-in-laravel
For mail testing, see https://github.com/bertramtruong/mailtrap
|
test
|
write login and registration tests basically copy and adjust these tests see the related video for mail testing see
| 1
|
275,136
| 23,893,568,914
|
IssuesEvent
|
2022-09-08 13:18:52
|
ARUP-CAS/aiscr-webamcr
|
https://api.github.com/repos/ARUP-CAS/aiscr-webamcr
|
closed
|
Base map - OpenStreetMap
|
bug / maintanance map TESTED
|
Unify everywhere on the grey variant (it is currently only in PAS; it is better than the coloured one, but having both is unnecessary)
|
1.0
|
Base map - OpenStreetMap - Unify everywhere on the grey variant (it is currently only in PAS; it is better than the coloured one, but having both is unnecessary)
|
test
|
mapový podklad openstreetmap všude sjednotit na šedou variantu nyní je pouze v pas je lepší než barevná ale mít obě je zbytečné
| 1
|
321,235
| 27,517,001,212
|
IssuesEvent
|
2023-03-06 12:40:31
|
Plutonomicon/cardano-transaction-lib
|
https://api.github.com/repos/Plutonomicon/cardano-transaction-lib
|
opened
|
E2E test suite: Allow piping the logs to the terminal from browser console
|
enhancement e2e testing
|
The config option should be implemented via an env variable in `test/e2e.env`.
`Ctl.Internal.Test.E2E.Feedback.Node` contains code that suppresses the logs until an error is observed. `addLogLine` there could be conditionally replaced with `Effect.Console.log`
|
1.0
|
E2E test suite: Allow piping the logs to the terminal from browser console - The config option should be implemented via an env variable in `test/e2e.env`.
`Ctl.Internal.Test.E2E.Feedback.Node` contains code that suppresses the logs until an error is observed. `addLogLine` there could be conditionally replaced with `Effect.Console.log`
|
test
|
test suite allow piping the logs to the terminal from browser console the config option should be implemented via an env variable in test env ctl internal test feedback node contains code that suppresses the logs until an error is observed addlogline there could be conditionally replaced with effect console log
| 1
|
246,636
| 20,888,605,615
|
IssuesEvent
|
2022-03-23 08:41:35
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
kv/kvserver: TestStoreTxnWaitQueueEnabledOnSplit failed
|
C-test-failure O-robot S-3 branch-master T-kv
|
kv/kvserver.TestStoreTxnWaitQueueEnabledOnSplit [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=4316130&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=4316130&tab=artifacts#/) on master @ [c95e8161b7752b1c9ac6c922070b7b1f2653a40b](https://github.com/cockroachdb/cockroach/commits/c95e8161b7752b1c9ac6c922070b7b1f2653a40b):
```
=== RUN TestStoreTxnWaitQueueEnabledOnSplit
test_log_scope.go:79: test logs captured to: /artifacts/tmp/_tmp/751d67000aac5f3394c2369309253f02/logTestStoreTxnWaitQueueEnabledOnSplit434347622
test_log_scope.go:80: use -show-logs to present logs inline
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
Parameters in this failure:
- TAGS=bazel,gss
</p>
</details>
/cc @cockroachdb/kv
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestStoreTxnWaitQueueEnabledOnSplit.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
Jira issue: CRDB-12998
|
1.0
|
kv/kvserver: TestStoreTxnWaitQueueEnabledOnSplit failed - kv/kvserver.TestStoreTxnWaitQueueEnabledOnSplit [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=4316130&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=4316130&tab=artifacts#/) on master @ [c95e8161b7752b1c9ac6c922070b7b1f2653a40b](https://github.com/cockroachdb/cockroach/commits/c95e8161b7752b1c9ac6c922070b7b1f2653a40b):
```
=== RUN TestStoreTxnWaitQueueEnabledOnSplit
test_log_scope.go:79: test logs captured to: /artifacts/tmp/_tmp/751d67000aac5f3394c2369309253f02/logTestStoreTxnWaitQueueEnabledOnSplit434347622
test_log_scope.go:80: use -show-logs to present logs inline
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
Parameters in this failure:
- TAGS=bazel,gss
</p>
</details>
/cc @cockroachdb/kv
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestStoreTxnWaitQueueEnabledOnSplit.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
Jira issue: CRDB-12998
|
test
|
kv kvserver teststoretxnwaitqueueenabledonsplit failed kv kvserver teststoretxnwaitqueueenabledonsplit with on master run teststoretxnwaitqueueenabledonsplit test log scope go test logs captured to artifacts tmp tmp test log scope go use show logs to present logs inline help see also parameters in this failure tags bazel gss cc cockroachdb kv jira issue crdb
| 1
|
52,107
| 6,573,155,958
|
IssuesEvent
|
2017-09-11 07:34:24
|
RRZE-Webteam/FAU-Einrichtungen
|
https://api.github.com/repos/RRZE-Webteam/FAU-Einrichtungen
|
closed
|
Mobile navigation: move the search and language switcher into the flyout menu?
|
Design-Entscheidung
|
Has there already been a discussion about how the search and the language switcher are implemented in the mobile version?
How about including the two functions in the flyout menu?
At the moment the two functions look somewhat lost when there are no other buttons in the meta area. (see image)

|
1.0
|
Mobile navigation: move the search and language switcher into the flyout menu? - Has there already been a discussion about how the search and the language switcher are implemented in the mobile version?
How about including the two functions in the flyout menu?
At the moment the two functions look somewhat lost when there are no other buttons in the meta area. (see image)

|
non_test
|
mobile navigation suche und sprachwechsler in das flyout menu übernehmen wurde schon einmal darüber gesprochen wie die suche und der sprachwechsler in der mobilen version umgesetzt wird wie wäre es denn wenn man die beiden funktionen in das flyout menü mit aufnehmen würde im moment sehen die beiden funktionen etwas verloren aus wenn in dem meta bereich sonst keine weiteren buttons sind siehe bild
| 0
|
362,387
| 25,373,147,733
|
IssuesEvent
|
2022-11-21 12:08:40
|
STMicroelectronics/cmsis_core
|
https://api.github.com/repos/STMicroelectronics/cmsis_core
|
closed
|
ST_README.md outdated
|
documentation
|
The compatibility information in the "ST_README.md" file is outdated and should be corrected asap!
It is only listed up to v5.4.0 but v5.6.0 has been released for more than 2 years.
Thanks.
|
1.0
|
ST_README.md outdated - The compatibility information in the "ST_README.md" file is outdated and should be corrected asap!
It is only listed up to v5.4.0 but v5.6.0 has been released for more than 2 years.
Thanks.
|
non_test
|
st readme md outdated the compatibility information in the st readme md file is outdated and should be corrected asap it is only listed up to but has been released for more than years thanks
| 0
|
131,054
| 10,679,317,343
|
IssuesEvent
|
2019-10-21 19:01:16
|
smartsystemslab-uf/ZynqRobotController
|
https://api.github.com/repos/smartsystemslab-uf/ZynqRobotController
|
closed
|
UART Tests
|
testing
|
Need tests for all UART ports available in the system. Presumably, a simple program could be written to take as input a serial port and some data, send the data out over the serial port, and receive the same data on the receiver line of the UART port.
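A minimal sketch of the loopback check described above, assuming only that the port object exposes `write()`/`read()` (as pyserial's `serial.Serial` does); `FakeLoopbackPort` below is a hypothetical in-memory stand-in for a UART whose TX line is wired back to RX.

```python
def uart_loopback_check(port, payload: bytes) -> bool:
    """Send payload out over the port and verify the same bytes come back."""
    port.write(payload)
    received = port.read(len(payload))
    return received == payload

class FakeLoopbackPort:
    """In-memory stand-in for a UART with TX looped back to RX."""
    def __init__(self):
        self._buf = b""
    def write(self, data: bytes) -> None:
        self._buf += data
    def read(self, n: int) -> bytes:
        out, self._buf = self._buf[:n], self._buf[n:]
        return out

# On real hardware, FakeLoopbackPort would be replaced by the opened serial port.
assert uart_loopback_check(FakeLoopbackPort(), b"\x01\x02hello")
```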
|
1.0
|
UART Tests - Need tests for all UART ports available in the system. Presumably, a simple program could be written to take as input a serial port and some data, send the data out over the serial port, and receive the same data on the receiver line of the UART port.
|
test
|
uart tests need tests for all uart ports available in the system presumably a simply program could be written to take as input a serial port and some data send data out over serial port and receive the same data on the receiver line of the uart port
| 1
|
177,814
| 13,748,854,042
|
IssuesEvent
|
2020-10-06 09:38:22
|
inveniosoftware/react-invenio-app-ils
|
https://api.github.com/repos/inveniosoftware/react-invenio-app-ils
|
closed
|
Cannot search for multiple aggregations of the same category
|
bug test-blocker
|
### Reproduce
1. Go to frontsite
2. Perform an empty search
3. In "Literature types", tick "Book" and note the number of results
4. Now keep "Book" ticked and tick "Proceeding"; again note the number of results
5. Observe that the results always correspond to the last element you ticked (here, "Proceeding")
### What should (supposedly) happen
Return all the items that match any ticked kind (in this case, both books and proceedings).
The URL is correct but the API calls aren't.
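The mislabelling can be reproduced in miniature: day-of-week labels have to be rotated to the configured start of the week, otherwise Monday-first data is drawn under Sunday-first labels. An illustrative sketch (not Metabase code):

```python
import datetime

DAYS = ["Sunday", "Monday", "Tuesday", "Wednesday",
        "Thursday", "Friday", "Saturday"]

def day_labels(start_of_week: str):
    """Return the seven day names rotated so the week starts as configured."""
    i = DAYS.index(start_of_week)
    return DAYS[i:] + DAYS[:i]

# 2020-03-02 (the first filtered day above) is a Monday, so with the week
# starting on Monday its bar must be labelled "Monday", not "Sunday".
assert datetime.date(2020, 3, 2).strftime("%A") == "Monday"
assert day_labels("Monday")[0] == "Monday"
assert day_labels("Sunday")[1] == "Monday"  # Sunday-first labels shift Monday's bar by one
```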
|
1.0
|
Cannot search for multiple aggregations of the same category - ### Reproduce
1. Go to frontsite
2. Perform an empty search
3. In "Literature types", tick "Book" and note the number of results
4. Now keep "Book" ticked and tick "Proceeding"; again note the number of results
5. Observe that the results always correspond to the last element you ticked (here, "Proceeding")
### What should (supposedly) happen
Return all the items that match any ticked kind (in this case, both books and proceedings).
The URL is correct but the API calls aren't.
|
test
|
cannot search for multiple aggregations of the same category reproduce go to frontsite perform an empty search in literature types tick book and note the number of results now keep book ticked and tick proceeding again note the number of results observe that the results always correspond to the last element you ticked here proceeding what should supposedly happen return all the items that match any ticked kind in this case both books and proceedings the url is correct but the api calls aren t
| 1
|
197,980
| 14,952,220,001
|
IssuesEvent
|
2021-01-26 15:17:25
|
pytorch/pytorch
|
https://api.github.com/repos/pytorch/pytorch
|
opened
|
TestE2ETensorPipe.TestTrainingLoop is Flaky
|
module: flaky-tests module: rpc module: tensorpipe oncall: distributed triaged
|
https://app.circleci.com/pipelines/github/pytorch/pytorch/263779/workflows/c1535b0d-74cb-467f-9955-b05adf928ade/jobs/10367433/steps
```
Jan 25 23:17:56 [----------] Global test environment tear-down
Jan 25 23:17:56 [==========] 9 tests from 4 test cases ran. (1651 ms total)
Jan 25 23:17:56 [ PASSED ] 8 tests.
Jan 25 23:17:56 [ FAILED ] 1 test, listed below:
Jan 25 23:17:56 [ FAILED ] TestE2ETensorPipe.TestTrainingLoop
```
```
/var/lib/jenkins/workspace/test/cpp/rpc/test_e2e_tensorpipe.cpp:59
Expected equality of these values:
0
tensorpipeAgent->timeoutMapSize()
Which is: 2
```
|
1.0
|
TestE2ETensorPipe.TestTrainingLoop is Flaky - https://app.circleci.com/pipelines/github/pytorch/pytorch/263779/workflows/c1535b0d-74cb-467f-9955-b05adf928ade/jobs/10367433/steps
```
Jan 25 23:17:56 [----------] Global test environment tear-down
Jan 25 23:17:56 [==========] 9 tests from 4 test cases ran. (1651 ms total)
Jan 25 23:17:56 [ PASSED ] 8 tests.
Jan 25 23:17:56 [ FAILED ] 1 test, listed below:
Jan 25 23:17:56 [ FAILED ] TestE2ETensorPipe.TestTrainingLoop
```
```
/var/lib/jenkins/workspace/test/cpp/rpc/test_e2e_tensorpipe.cpp:59
Expected equality of these values:
0
tensorpipeAgent->timeoutMapSize()
Which is: 2
```
|
test
|
testtrainingloop is flaky jan global test environment tear down jan tests from test cases ran ms total jan tests jan test listed below jan testtrainingloop var lib jenkins workspace test cpp rpc test tensorpipe cpp expected equality of these values tensorpipeagent timeoutmapsize which is
| 1
|
34,086
| 6,289,089,991
|
IssuesEvent
|
2017-07-19 18:26:11
|
wp-cli/wp-cli
|
https://api.github.com/repos/wp-cli/wp-cli
|
opened
|
Update references to the Package Index
|
scope:documentation
|
Now that the [Package Index is being deprecated](https://make.wordpress.org/cli/2017/07/18/feature-development-discussion-recap/), we need to update the existing references:
* [ ] Update Package Index README: https://github.com/wp-cli/package-index#wp-cli-package-index
* [ ] Remove "Package Index" link from website navigation.
* [ ] Update Package Index reference in "Commands Cookbook": https://make.wordpress.org/cli/handbook/commands-cookbook/#add-to-the-package-index
|
1.0
|
Update references to the Package Index - Now that the [Package Index is being deprecated](https://make.wordpress.org/cli/2017/07/18/feature-development-discussion-recap/), we need to update the existing references:
* [ ] Update Package Index README: https://github.com/wp-cli/package-index#wp-cli-package-index
* [ ] Remove "Package Index" link from website navigation.
* [ ] Update Package Index reference in "Commands Cookbook": https://make.wordpress.org/cli/handbook/commands-cookbook/#add-to-the-package-index
|
non_test
|
update references to the package index now that the we need to update the existing references update package index readme remove package index link from website navigation update package index reference in commands cookbook
| 0
|
82,404
| 7,840,682,067
|
IssuesEvent
|
2018-06-18 17:07:46
|
udacity/lesson_feedback_nd113
|
https://api.github.com/repos/udacity/lesson_feedback_nd113
|
closed
|
[Neutral]2018-04-14
|
1.0.0 14. Robot Localization test
|
Lectures should have more detail, I had difficulty in understanding the robot motion and probability moving in the direction of the robot
|
1.0
|
[Neutral]2018-04-14 - Lectures should have more detail, I had difficulty in understanding the robot motion and probability moving in the direction of the robot
|
test
|
lectures should have more detail i had difficulty in understanding the robot motion and probability moving in the direction of the robot
| 1
|
205,371
| 15,610,816,729
|
IssuesEvent
|
2021-03-19 13:38:28
|
WoWManiaUK/Redemption
|
https://api.github.com/repos/WoWManiaUK/Redemption
|
closed
|
[Object] Herb in tree (Redridge Mountains)
|
Fixed on PTR - Tester Confirmed
|
What is Happening:

There is herb inside tree and it's inaccessible
What Should happen:
It should be accessible I guess
|
1.0
|
[Object] Herb in tree (Redridge Mountains) - What is Happening:

There is herb inside tree and it's inaccessible
What Should happen:
It should be accessible I guess
|
test
|
herb in tree redridge mountains what is happening there is herb inside tree and it s inaccessible what should happen it should be accesable i guess
| 1
|
40,009
| 2,862,123,565
|
IssuesEvent
|
2015-06-04 01:14:40
|
kbandla/testrepo
|
https://api.github.com/repos/kbandla/testrepo
|
reopened
|
Record Fragmentation not handled in TLSMultiFactory(buf)
|
bug imported Priority-Medium
|
_From [achin...@gmail.com](https://code.google.com/u/115462122534195676742/) on November 12, 2014 20:42:23_
What steps will reproduce the problem?
1. Send data more than 17000 Bytes.
2. The TLSMultiFactory will throw an error in finding the TLS version
3. The number of records returned is 0

What is the expected output? What do you see instead?
We should see two records, one with length 16383 and another record with length 17000-16383

What version of the product are you using? On what operating system?
Dpkt 1.8 and Windows Platform, Win 8.1 64 Bit

Please provide any additional information below.

    def TLSMultiFactory(buf):
        '''
        Attempt to parse one or more TLSRecord's out of buf
        Args:
            buf: string containing SSL/TLS messages. May have an incomplete record
                 on the end
        Returns:
            [TLSRecord]
            int, total bytes consumed, != len(buf) if an incomplete record was left at
            the end.
        Raises SSL3Exception.
        '''
        i, n = 0, len(buf)
        msgs = []
        while i < n:
            v = buf[i+1:i+3]
            if v in SSL3_VERSION_BYTES:
                try:
                    msg = TLSRecord(buf[i:])
                    msgs.append(msg)
                except dpkt.NeedData:
                    break
            else:
                raise SSL3Exception('Bad TLS version in buf: %r' % buf[i:i+5])
            i += len(msg)
        return msgs, i
I couldn't find the code that handles fragmentation in the Record layer.
_Original issue: http://code.google.com/p/dpkt/issues/detail?id=136_
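The expected split quoted above (one record of 16383 bytes, one of 617) follows from fragmenting the 17000-byte payload at the record boundary. A sketch of that arithmetic, assuming the 16383-byte maximum fragment size used in the report (the TLS spec itself allows up to 2^14 = 16384 plaintext bytes per record):

```python
MAX_FRAGMENT = 16383  # fragment size observed in the report

def fragment(data: bytes, max_len: int = MAX_FRAGMENT):
    """Split data into record-sized chunks, as a fragmentation-aware parser would."""
    return [data[i:i + max_len] for i in range(0, len(data), max_len)]

parts = fragment(b"\x00" * 17000)
assert [len(p) for p in parts] == [16383, 617]  # 17000 - 16383 == 617
```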
|
1.0
|
Record Fragmentation not handled in TLSMultiFactory(buf) - _From [achin...@gmail.com](https://code.google.com/u/115462122534195676742/) on November 12, 2014 20:42:23_
What steps will reproduce the problem?
1. Send data more than 17000 Bytes.
2. The TLSMultiFactory will throw an error in finding the TLS version
3. The number of records returned is 0

What is the expected output? What do you see instead?
We should see two records, one with length 16383 and another record with length 17000-16383

What version of the product are you using? On what operating system?
Dpkt 1.8 and Windows Platform, Win 8.1 64 Bit

Please provide any additional information below.

    def TLSMultiFactory(buf):
        '''
        Attempt to parse one or more TLSRecord's out of buf
        Args:
            buf: string containing SSL/TLS messages. May have an incomplete record
                 on the end
        Returns:
            [TLSRecord]
            int, total bytes consumed, != len(buf) if an incomplete record was left at
            the end.
        Raises SSL3Exception.
        '''
        i, n = 0, len(buf)
        msgs = []
        while i < n:
            v = buf[i+1:i+3]
            if v in SSL3_VERSION_BYTES:
                try:
                    msg = TLSRecord(buf[i:])
                    msgs.append(msg)
                except dpkt.NeedData:
                    break
            else:
                raise SSL3Exception('Bad TLS version in buf: %r' % buf[i:i+5])
            i += len(msg)
        return msgs, i
I couldn't find the code that handles fragmentation in the Record layer.
_Original issue: http://code.google.com/p/dpkt/issues/detail?id=136_
|
non_test
|
record fragmentation not handled in tlsmultifactory buf from on november what steps will reproduce the problem send data more than bytes the tlsmultifactory will throw an error in finding the tls version the number of records returned is what is the expected output what do you see instead we should see two records one with length and another record with length what version of the product are you using on what operating system dpkt and windows platform win bit please provide any additional information below def tlsmultifactory buf attempt to parse one or more tlsrecord s out of buf args buf string containing ssl tls messages may have an incomplete record on the end returns int total bytes consumed len buf if an incomplete record was left at the end raises i n len buf msgs while i n v buf if v in version bytes try msg tlsrecord buf msgs append msg except dpkt needdata break else raise bad tls version in buf r buf i len msg return msgs i i couldn t find the code that handles fragmentation in the record layer original issue
| 0
|
529,914
| 15,397,648,326
|
IssuesEvent
|
2021-03-03 22:30:19
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Wrong day names are displayed when using not-Sunday as start of the week and grouping by "Day of week"
|
.Correctness .Frontend .Regression .Reproduced Customization/i18n Priority:P2 Type:Bug Visualization/
|
When grouping by day of the week, the DATA use the correct date but the AXIS LABELS still start with Sunday.
**To Reproduce**
- Start of the week: Monday
- Sample dataset: table Orders
- Filter: "Created at" between 2020-03-02 and 2020-03-03 (monday and tuesday)
- Summarize: by Count and group by Created At (by day of the week)
- Visualization: Bar, X-axis: Ordinal
**Expected behavior**
- Two bars, one for Monday 2020-03-02 labelled "Monday", one for Tuesday 2020-03-03 labelled "Tuesday"
**Actual behavior**
- Two bars, one for Monday 2020-03-02 labelled "Sunday", one for Tuesday 2020-03-03 labelled "Monday"

**Information about your Metabase Installation:**
- Metabase version: 0.37.0.2
- Metabase hosting environment: Jar-file on Ubuntu 16.04
- Metabase internal database: H2
**Severity**
Rather annoying. We do a lot of week-over-week stats (data from our helpdesk). Until now we had to make do with graphs starting with Sunday, which look lopsided (not a lot of issues opened on a Sunday, so the graphs look like calm Mondays and busy Tuesdays). This bug makes the graphs look correct (busy on week start, calm on weekends) but can't be used with wrong labels.
|
1.0
|
Wrong day names are displayed when using not-Sunday as start of the week and grouping by "Day of week" - When grouping by day of the week, the DATA use the correct date but the AXIS LABELS still start with Sunday.
**To Reproduce**
- Start of the week: Monday
- Sample dataset: table Orders
- Filter: "Created at" between 2020-03-02 and 2020-03-03 (monday and tuesday)
- Summarize: by Count and group by Created At (by day of the week)
- Visualization: Bar, X-axis: Ordinal
**Expected behavior**
- Two bars, one for Monday 2020-03-02 labelled "Monday", one for Tuesday 2020-03-03 labelled "Tuesday"
**Actual behavior**
- Two bars, one for Monday 2020-03-02 labelled "Sunday", one for Tuesday 2020-03-03 labelled "Monday"

**Information about your Metabase Installation:**
- Metabase version: 0.37.0.2
- Metabase hosting environment: Jar-file on Ubuntu 16.04
- Metabase internal database: H2
**Severity**
Rather annoying. We do a lot of week-over-week stats (data from our helpdesk). Until now we had to make do with graphs starting with Sunday, which look lopsided (not a lot of issues opened on a Sunday, so the graphs look like calm Mondays and busy Tuesdays). This bug makes the graphs look correct (busy on week start, calm on weekends) but can't be used with wrong labels.
|
non_test
|
wrong day names are displayed when using not sunday as start of the week and grouping by day of week when grouping by day of the week the data use the correct date but the axis labels still start with sunday to reproduce start of the week monday sample dataset table orders filter created at between and monday and tuesday summarize by count and group by created at by day of the week visualization bar x axis ordinal expected behavior two bars one for monday labelled monday one for tuesday labelled tuesday actual behavior two bars one for monday labelled sunday one for tuesday labelled monday information about your metabase installation metabase version metabase hosting environment jar file on ubuntu metabase internal database severity rather annoying we do a lot of week over week stats data from our helpdesk until now we had to make do with graphs starting with sunday which look lopsided not a lot of issues opened on a sunday so the graphs look like calm mondays and busy tuesdays this bug makes the graphs look correct busy on week start calm on weekends but can t be used with wrong labels
| 0
|
246,710
| 20,909,992,176
|
IssuesEvent
|
2022-03-24 08:23:00
|
pygame/pygame
|
https://api.github.com/repos/pygame/pygame
|
closed
|
Update freetype version to 2.9.1+
|
font freetype Gnu/Linux Windows needs-testing Wheels Difficulty: moderate
|
I noticed that pygame is about 6 versions (and at least 5 years) behind the latest version of freetype. It looks like there have been some potentially nice changes over the last five years that may improve font rendering for pygame - mainly the addition of ClearType hinting.
I suppose there is a chance it might just be an easy copy and paste job :)
**To Do**
- [ ] Test what freetype version reported on linux with `freetype.get_version()`.
- [x] Test what freetype version reported on mac with `freetype.get_version()`.
- [x] Test what freetype version reported on windows dev10 wheel with `freetype.get_version()`.
- [ ] Resolve issue with out of date freetype library used in windows wheels.
**Related Docs:** https://www.pygame.org/docs/ref/freetype.html#pygame.freetype.get_version
|
1.0
|
Update freetype version to 2.9.1+ - I noticed that pygame is about 6 versions (and at least 5 years) behind the latest version of freetype. It looks like there have been some potentially nice changes over the last five years that may improve font rendering for pygame - mainly the addition of ClearType hinting.
I suppose there is a chance it might just be an easy copy and paste job :)
**To Do**
- [ ] Test what freetype version reported on linux with `freetype.get_version()`.
- [x] Test what freetype version reported on mac with `freetype.get_version()`.
- [x] Test what freetype version reported on windows dev10 wheel with `freetype.get_version()`.
- [ ] Resolve issue with out of date freetype library used in windows wheels.
**Related Docs:** https://www.pygame.org/docs/ref/freetype.html#pygame.freetype.get_version
|
test
|
update freetype version to i noticed that pygame is about versions and at least years behind the latest version of freetype it looks like there have been some potentially nice changes over the last five years that may improve font rendering for pygame mainly the addition of cleartype hinting i suppose there is a chance it might just be an easy copy and paste job to do test what freetype version reported on linux with freetype get version test what freetype version reported on mac with freetype get version test what freetype version reported on windows wheel with freetype get version resolve issue with out of date freetype library used in windows wheels related docs
| 1
|
105,621
| 23,083,640,223
|
IssuesEvent
|
2022-07-26 09:26:07
|
backstage/backstage
|
https://api.github.com/repos/backstage/backstage
|
closed
|
Tech Docs broken header for small to medium devices
|
bug docs-like-code
|
## Expected Behavior
The header should fill all available width.
## Actual Behavior
There is a strange block on top of the header. It is easier to see in the image than to explain:

## Steps to Reproduce
1. Access any tech docs e.g. [https://demo.backstage.io/docs/default/component/backstage](https://demo.backstage.io/docs/default/component/backstage).
2. Resize the window width to a size between 601px to 1218px.
3. That's it! you will see the strange bar like in the image above.
## Context
I noticed it when I was bumping my backstage from 1.0.1 to 1.3.1 and was looking for bugs to see if the upgrade went all right.
Visualizing my tech docs found it.
## Your Environment
Backstage 1.0.1 and 1.0.3 have the bug, and the official Backstage demo at
https://demo.backstage.io has it as well.
|
1.0
|
Tech Docs broken header for small to medium devices - ## Expected Behavior
The header should fill all available width.
## Actual Behavior
There is a strange block on top of the header. It's easier to show in an image than to explain:

## Steps to Reproduce
1. Access any tech docs e.g. [https://demo.backstage.io/docs/default/component/backstage](https://demo.backstage.io/docs/default/component/backstage).
2. Resize the window width to a size between 601px and 1218px.
3. That's it! you will see the strange bar like in the image above.
## Context
I noticed it when I was bumping my backstage from 1.0.1 to 1.3.1 and was looking for bugs to see if the upgrade went all right.
Visualizing my tech docs found it.
## Your Environment
Backstage 1.0.1 and 1.0.3 have the bug, and the official Backstage demo at
https://demo.backstage.io has it as well.
|
non_test
|
tech docs broken header for small to medium devices expected behavior the header should fill all available width actual behavior there is a strange block on top of the header more easily to see the image than me try to explain steps to reproduce access any tech docs e g resize the window width to a size between to that s it you will see the strange bar like in the image above context i noticed it when i was bumping my backstage from to and was looking for bugs to see if the upgrade went all right visualizing my tech docs found it your environment backstage and has the bug and also the official backstage demo at has it
| 0
|
91,823
| 8,319,870,349
|
IssuesEvent
|
2018-09-25 18:28:10
|
nasa-gibs/worldview
|
https://api.github.com/repos/nasa-gibs/worldview
|
closed
|
Update image of the layer picker in the tour
|
testing
|
Update image of the layer picker in the tour to show the "Start Comparison" feature.
|
1.0
|
Update image of the layer picker in the tour - Update image of the layer picker in the tour to show the "Start Comparison" feature.
|
test
|
update image of the layer picker in the tour update image of the layer picker in the tour to show the start comparison feature
| 1
|
39,170
| 15,883,373,963
|
IssuesEvent
|
2021-04-09 17:17:53
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
How to specify the NBest length?
|
Pri1 cognitive-services/svc cxp product-question speech-service/subsvc triaged
|
Hello,
I'm using the speech-to-text SDK to recognize audio to text.
The current JSON result seems to return 5 NBest candidates, but how can I specify how many NBest candidates to receive? Or get the full NBest list?
Thank you.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: e4b55e56-55a1-d635-6bea-271dc361d676
* Version Independent ID: 8e9cb28f-c133-f310-3c7d-3a3b7e85f69d
* Content: [Speech-to-text quickstart - Speech service - Azure Cognitive Services](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/get-started-speech-to-text?tabs=windowsinstall)
* Content Source: [articles/cognitive-services/Speech-Service/get-started-speech-to-text.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/cognitive-services/Speech-Service/get-started-speech-to-text.md)
* Service: **cognitive-services**
* Sub-service: **speech-service**
* GitHub Login: @trevorbye
* Microsoft Alias: **trbye**
|
2.0
|
How to specify the NBest length? - Hello,
I'm using the speech-to-text SDK to recognize audio to text.
The current JSON result seems to return 5 NBest candidates, but how can I specify how many NBest candidates to receive? Or get the full NBest list?
Thank you.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: e4b55e56-55a1-d635-6bea-271dc361d676
* Version Independent ID: 8e9cb28f-c133-f310-3c7d-3a3b7e85f69d
* Content: [Speech-to-text quickstart - Speech service - Azure Cognitive Services](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/get-started-speech-to-text?tabs=windowsinstall)
* Content Source: [articles/cognitive-services/Speech-Service/get-started-speech-to-text.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/cognitive-services/Speech-Service/get-started-speech-to-text.md)
* Service: **cognitive-services**
* Sub-service: **speech-service**
* GitHub Login: @trevorbye
* Microsoft Alias: **trbye**
|
non_test
|
how to specify the nbest length hello i m using speech to test sdk to recognize audio to text current json result seems returning nbest candidates but how can i specify how many nbest candidates to receive or full nbest list thank you document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service cognitive services sub service speech service github login trevorbye microsoft alias trbye
| 0
|
136,131
| 18,722,341,100
|
IssuesEvent
|
2021-11-03 13:11:37
|
KDWSS/dd-trace-java
|
https://api.github.com/repos/KDWSS/dd-trace-java
|
opened
|
CVE-2019-16869 (High) detected in multiple libraries
|
security vulnerability
|
## CVE-2019-16869 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>netty-codec-http-4.0.0.Final.jar</b>, <b>netty-codec-http-4.1.29.Final.jar</b>, <b>netty-codec-http-4.1.22.Final.jar</b>, <b>netty-codec-http-4.1.16.Final.jar</b>, <b>netty-codec-http-4.1.13.Final.jar</b>, <b>netty-codec-http-4.1.34.Final.jar</b>, <b>netty-codec-http-4.1.12.Final.jar</b>, <b>netty-codec-http-4.1.8.Final.jar</b>, <b>netty-codec-http-4.1.5.Final.jar</b>, <b>netty-codec-http-4.1.32.Final.jar</b>, <b>netty-codec-http-4.1.0.Final.jar</b>, <b>netty-all-4.1.9.Final.jar</b>, <b>netty-codec-http-4.0.56.Final.jar</b>, <b>netty-codec-http-4.1.11.Final.jar</b>, <b>netty-codec-http-4.0.51.Final.jar</b>, <b>netty-codec-http-4.1.9.Final.jar</b>, <b>netty-codec-http-4.1.7.Final.jar</b>, <b>netty-all-4.0.13.Final.jar</b>, <b>netty-codec-http-4.1.36.Final.jar</b>, <b>netty-codec-http-4.0.34.Final.jar</b>, <b>netty-codec-http-4.1.19.Final.jar</b>, <b>netty-codec-http-4.1.15.Final.jar</b>, <b>netty-codec-http-4.0.36.Final.jar</b></p></summary>
<p>
<details><summary><b>netty-codec-http-4.0.0.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/netty-4.0/netty-4.0.gradle</p>
<p>Path to vulnerable library: /caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.0.Final/27e4bef1dfa3703add0dd6c36f32b01c6549c9cc/netty-codec-http-4.0.0.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-http-4.0.0.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.29.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/netty-4.1/netty-4.1.gradle</p>
<p>Path to vulnerable library: /caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.29.Final/454688b88cea27a4d407202d1fc79a6522345b5e/netty-codec-http-4.1.29.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.29.Final/454688b88cea27a4d407202d1fc79a6522345b5e/netty-codec-http-4.1.29.Final.jar</p>
<p>
Dependency Hierarchy:
- async-http-client-2.1.0.jar (Root Library)
- :x: **netty-codec-http-4.1.29.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.22.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/spring-webflux-5/spring-webflux-5.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.22.Final/3805f3ca0d57630200defc7f9bb6ed3382dcb10b/netty-codec-http-4.1.22.Final.jar</p>
<p>
Dependency Hierarchy:
- reactor-netty-0.7.5.RELEASE.jar (Root Library)
- netty-handler-proxy-4.1.22.Final.jar
- :x: **netty-codec-http-4.1.22.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.16.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/rest-6.4/rest-6.4.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.16.Final/d64312378b438dfdad84267c599a053327c6f02a/netty-codec-http-4.1.16.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.16.Final/d64312378b438dfdad84267c599a053327c6f02a/netty-codec-http-4.1.16.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.16.Final/d64312378b438dfdad84267c599a053327c6f02a/netty-codec-http-4.1.16.Final.jar</p>
<p>
Dependency Hierarchy:
- transport-netty4-client-6.3.2.jar (Root Library)
- :x: **netty-codec-http-4.1.16.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.13.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.13.Final/ee87368766e6b900cf6be8ac9cdce27156e9411/netty-codec-http-4.1.13.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.13.Final/ee87368766e6b900cf6be8ac9cdce27156e9411/netty-codec-http-4.1.13.Final.jar</p>
<p>
Dependency Hierarchy:
- transport-netty4-client-6.0.0.jar (Root Library)
- :x: **netty-codec-http-4.1.13.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.34.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/spring-webflux-5/spring-webflux-5.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.34.Final/2887d87fbc1b057657348f61dc538f7296daf79/netty-codec-http-4.1.34.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.34.Final/2887d87fbc1b057657348f61dc538f7296daf79/netty-codec-http-4.1.34.Final.jar,/home/wss-scanner/.ivy2/cache/io.netty/netty-codec-http/jars/netty-codec-http-4.1.34.Final.jar</p>
<p>
Dependency Hierarchy:
- reactor-netty-0.7.15.RELEASE.jar (Root Library)
- :x: **netty-codec-http-4.1.34.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.12.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/grpc-1.5/grpc-1.5.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.12.Final/df1561ac7c455faf57c83a45af78771c3d3d0621/netty-codec-http-4.1.12.Final.jar</p>
<p>
Dependency Hierarchy:
- grpc-netty-1.5.0.jar (Root Library)
- netty-codec-http2-4.1.12.Final.jar
- :x: **netty-codec-http-4.1.12.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.8.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/vertx-web-3.4/vertx-web-3.4.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.8.Final/1e88617c4a6c88da7e86fdbbd9494d22a250c879/netty-codec-http-4.1.8.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.8.Final/1e88617c4a6c88da7e86fdbbd9494d22a250c879/netty-codec-http-4.1.8.Final.jar</p>
<p>
Dependency Hierarchy:
- vertx-web-3.4.0.jar (Root Library)
- vertx-core-3.4.0.jar
- :x: **netty-codec-http-4.1.8.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.5.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/transport-5/transport-5.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.5.Final/87bda1b9ec7e3f75ca721fc87735cbedad2aa1a/netty-codec-http-4.1.5.Final.jar</p>
<p>
Dependency Hierarchy:
- transport-5.0.0.jar (Root Library)
- transport-netty4-client-5.0.0.jar
- :x: **netty-codec-http-4.1.5.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.32.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/aws-java-sdk-2.2/aws-java-sdk-2.2.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.32.Final/b9218adba7353ad5a75fcb639e4755d64bd6ddf/netty-codec-http-4.1.32.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.32.Final/b9218adba7353ad5a75fcb639e4755d64bd6ddf/netty-codec-http-4.1.32.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.32.Final/b9218adba7353ad5a75fcb639e4755d64bd6ddf/netty-codec-http-4.1.32.Final.jar</p>
<p>
Dependency Hierarchy:
- kinesis-2.2.0.jar (Root Library)
- netty-nio-client-2.2.0.jar
- netty-reactive-streams-http-2.0.0.jar
- :x: **netty-codec-http-4.1.32.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.0.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/netty-4.1/netty-4.1.gradle</p>
<p>Path to vulnerable library: /caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.0.Final/fcbd87accec40f44f5019211f4714e5b7d76ba47/netty-codec-http-4.1.0.Final.jar,/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.0.Final/fcbd87accec40f44f5019211f4714e5b7d76ba47/netty-codec-http-4.1.0.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-http-4.1.0.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-all-4.1.9.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/java-concurrent/java-concurrent.gradle</p>
<p>Path to vulnerable library: /caches/modules-2/files-2.1/io.netty/netty-all/4.1.9.Final/97860965d6a0a6b98e7f569f3f966727b8db75/netty-all-4.1.9.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.1.9.Final/97860965d6a0a6b98e7f569f3f966727b8db75/netty-all-4.1.9.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-all-4.1.9.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.0.56.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/netty-4.0/netty-4.0.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.56.Final/c88ed62c18a174e83ec3a560630297c9f7c0c1f2/netty-codec-http-4.0.56.Final.jar,/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.56.Final/c88ed62c18a174e83ec3a560630297c9f7c0c1f2/netty-codec-http-4.0.56.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-http-4.0.56.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.11.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.11.Final/3edeb0f08e455e570a55eb56bf64595fcb1a6b15/netty-codec-http-4.1.11.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-data-elasticsearch-3.0.0.RELEASE.jar (Root Library)
- transport-netty4-client-5.5.0.jar
- :x: **netty-codec-http-4.1.11.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.0.51.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.51.Final/1c8074c311dd2f1273c722477b232cdb74dcd844/netty-codec-http-4.0.51.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.51.Final/1c8074c311dd2f1273c722477b232cdb74dcd844/netty-codec-http-4.0.51.Final.jar</p>
<p>
Dependency Hierarchy:
- play-java-ws_2.11-2.5.19.jar (Root Library)
- async-http-client-2.0.36.jar
- :x: **netty-codec-http-4.0.51.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.9.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/ratpack-1.5/ratpack-1.5.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.9.Final/efb68f8ce201d180fdbbec8ade5e25684cae12bc/netty-codec-http-4.1.9.Final.jar</p>
<p>
Dependency Hierarchy:
- ratpack-core-1.5.0.jar (Root Library)
- :x: **netty-codec-http-4.1.9.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.7.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.7.Final/9f957998c651e7b73d6dc878f704d81b4c085387/netty-codec-http-4.1.7.Final.jar</p>
<p>
Dependency Hierarchy:
- transport-5.3.0.jar (Root Library)
- transport-netty4-client-5.3.0.jar
- :x: **netty-codec-http-4.1.7.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-all-4.0.13.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/jms/jms.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.0.13.Final/75de08aeaef1712d88b011d81900937481fc3e7/netty-all-4.0.13.Final.jar</p>
<p>
Dependency Hierarchy:
- hornetq-jms-client-2.4.7.Final.jar (Root Library)
- hornetq-core-client-2.4.7.Final.jar
- :x: **netty-all-4.0.13.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.36.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/transport-7.3/transport-7.3.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.36.Final/62b73d439dbddf3c0dde092b048580139695ab46/netty-codec-http-4.1.36.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.36.Final/62b73d439dbddf3c0dde092b048580139695ab46/netty-codec-http-4.1.36.Final.jar</p>
<p>
Dependency Hierarchy:
- transport-7.3.0.jar (Root Library)
- transport-netty4-client-7.3.0.jar
- :x: **netty-codec-http-4.1.36.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.0.34.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.34.Final/974e1c686ee143ae4d1266f64e03de657e778542/netty-codec-http-4.0.34.Final.jar</p>
<p>
Dependency Hierarchy:
- play-test_2.11-2.5.0.jar (Root Library)
- play-netty-server_2.11-2.5.0.jar
- netty-reactive-streams-http-1.0.2.jar
- :x: **netty-codec-http-4.0.34.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.19.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/akka-http-10.0/akka-http-10.0.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.19.Final/f7edff289d10cc03cdb97ad99e2722f9d61ffdc3/netty-codec-http-4.1.19.Final.jar</p>
<p>
Dependency Hierarchy:
- lagom-javadsl-testkit_2.11-1.4.0.jar (Root Library)
- lagom-javadsl-server_2.11-1.4.0.jar
- lagom-server_2.11-1.4.0.jar
- lagom-client_2.11-1.4.0.jar
- :x: **netty-codec-http-4.1.19.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.15.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/vertx-rx-3.5/vertx-rx-3.5.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.15.Final/c06dbf0f4119fdbb3db6ff880b38e835766455b2/netty-codec-http-4.1.15.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.15.Final/c06dbf0f4119fdbb3db6ff880b38e835766455b2/netty-codec-http-4.1.15.Final.jar</p>
<p>
Dependency Hierarchy:
- reactor-netty-0.7.0.RELEASE.jar (Root Library)
- netty-handler-proxy-4.1.15.Final.jar
- :x: **netty-codec-http-4.1.15.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.0.36.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/netty-4.0/netty-4.0.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.36.Final/5e83ee4191937ccdaac25fb48ec699169512891c/netty-codec-http-4.0.36.Final.jar,/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.36.Final/5e83ee4191937ccdaac25fb48ec699169512891c/netty-codec-http-4.0.36.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-http-4.0.36.Final.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/KDWSS/dd-trace-java/commit/2819174635979a19573ec0ce8e3e2b63a3848079">2819174635979a19573ec0ce8e3e2b63a3848079</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Netty before 4.1.42.Final mishandles whitespace before the colon in HTTP headers (such as a "Transfer-Encoding : chunked" line), which leads to HTTP request smuggling.
<p>Publish Date: 2019-09-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-16869>CVE-2019-16869</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16869">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16869</a></p>
<p>Release Date: 2019-09-26</p>
<p>Fix Resolution: io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final</p>
</p>
</details>
<p></p>
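As a quick triage aid (a hedged sketch, not part of the WhiteSource report), the artifact versions listed above can be checked against the fixed 4.1.42.Final release with a simple comparison. The handling of the `.Final` qualifier is an assumption based on the version strings that appear in this issue:

```python
# Fixed release for CVE-2019-16869, per the suggested fix above.
FIXED = (4, 1, 42)

def parse_netty_version(v):
    """Parse a version string like '4.1.29.Final' into (4, 1, 29)."""
    return tuple(int(p) for p in v.split(".")[:3])

def is_vulnerable(v):
    """True if the artifact version predates the 4.1.42.Final fix."""
    return parse_netty_version(v) < FIXED

versions = ["4.0.0.Final", "4.1.29.Final", "4.1.42.Final"]
print([v for v in versions if is_vulnerable(v)])  # first two only
```

This is only a version screen; it does not confirm that a given dependency path is actually exploitable.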
["/dd-java-agent/instrumentation/java-concurrent/java-concurrent.gradle"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-all:4.1.9.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.0.56.Final","packageFilePaths":["/dd-java-agent/instrumentation/netty-4.0/netty-4.0.gradle"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-codec-http:4.0.56.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.11.Final","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.data:spring-data-elasticsearch:3.0.0.RELEASE;org.elasticsearch.plugin:transport-netty4-client:5.5.0;io.netty:netty-codec-http:4.1.11.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.0.51.Final","packageFilePaths":["/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle","/dd-smoke-tests/play-2.5/play-2.5.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-java-ws_2.11:2.5.19;org.asynchttpclient:async-http-client:2.0.36;io.netty:netty-codec-http:4.0.51.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.9.Final","packageFilePaths":["/dd-java-agent/instrumentation/ratpack-1.5/ratpack-1.5.gradle"],"isTransitiveDependency":true,
"dependencyTree":"io.ratpack:ratpack-core:1.5.0;io.netty:netty-codec-http:4.1.9.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.7.Final","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.elasticsearch.client:transport:5.3.0;org.elasticsearch.plugin:transport-netty4-client:5.3.0;io.netty:netty-codec-http:4.1.7.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-all","packageVersion":"4.0.13.Final","packageFilePaths":["/dd-java-agent/instrumentation/jms/jms.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.hornetq:hornetq-jms-client:2.4.7.Final;org.hornetq:hornetq-core-client:2.4.7.Final;io.netty:netty-all:4.0.13.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.36.Final","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/transport-7.3/transport-7.3.gradle","/dd-java-agent/instrumentation/elasticsearch/transport/transport.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.elasticsearch.client:transport:7.3.0;org.elasticsearch.plugin:transport-netty4-client:7.3.0;io.netty:netty-codec-http:4.1.36.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.0.34.Final","packageFilePaths":["/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle"],"isTra
nsitiveDependency":true,"dependencyTree":"com.typesafe.play:play-test_2.11:2.5.0;com.typesafe.play:play-netty-server_2.11:2.5.0;com.typesafe.netty:netty-reactive-streams-http:1.0.2;io.netty:netty-codec-http:4.0.34.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.19.Final","packageFilePaths":["/dd-java-agent/instrumentation/akka-http-10.0/akka-http-10.0.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.lightbend.lagom:lagom-javadsl-testkit_2.11:1.4.0;com.lightbend.lagom:lagom-javadsl-server_2.11:1.4.0;com.lightbend.lagom:lagom-server_2.11:1.4.0;com.lightbend.lagom:lagom-client_2.11:1.4.0;io.netty:netty-codec-http:4.1.19.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.15.Final","packageFilePaths":["/dd-java-agent/instrumentation/vertx-rx-3.5/vertx-rx-3.5.gradle","/dd-java-agent/instrumentation/spring-webflux-5/spring-webflux-5.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.projectreactor.ipc:reactor-netty:0.7.0.RELEASE;io.netty:netty-handler-proxy:4.1.15.Final;io.netty:netty-codec-http:4.1.15.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.0.36.Final","packageFilePaths":["/dd-java-agent/instrumentation/netty-4.0/netty-4.0.gradle"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-codec-http:4.0.36.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"}],"baseBranches":["master"],"vulnerabilityIdentifier"
:"CVE-2019-16869","vulnerabilityDetails":"Netty before 4.1.42.Final mishandles whitespace before the colon in HTTP headers (such as a \"Transfer-Encoding : chunked\" line), which leads to HTTP request smuggling.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-16869","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2019-16869 (High) detected in multiple libraries - ## CVE-2019-16869 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>netty-codec-http-4.0.0.Final.jar</b>, <b>netty-codec-http-4.1.29.Final.jar</b>, <b>netty-codec-http-4.1.22.Final.jar</b>, <b>netty-codec-http-4.1.16.Final.jar</b>, <b>netty-codec-http-4.1.13.Final.jar</b>, <b>netty-codec-http-4.1.34.Final.jar</b>, <b>netty-codec-http-4.1.12.Final.jar</b>, <b>netty-codec-http-4.1.8.Final.jar</b>, <b>netty-codec-http-4.1.5.Final.jar</b>, <b>netty-codec-http-4.1.32.Final.jar</b>, <b>netty-codec-http-4.1.0.Final.jar</b>, <b>netty-all-4.1.9.Final.jar</b>, <b>netty-codec-http-4.0.56.Final.jar</b>, <b>netty-codec-http-4.1.11.Final.jar</b>, <b>netty-codec-http-4.0.51.Final.jar</b>, <b>netty-codec-http-4.1.9.Final.jar</b>, <b>netty-codec-http-4.1.7.Final.jar</b>, <b>netty-all-4.0.13.Final.jar</b>, <b>netty-codec-http-4.1.36.Final.jar</b>, <b>netty-codec-http-4.0.34.Final.jar</b>, <b>netty-codec-http-4.1.19.Final.jar</b>, <b>netty-codec-http-4.1.15.Final.jar</b>, <b>netty-codec-http-4.0.36.Final.jar</b></summary>
<p>
<details><summary><b>netty-codec-http-4.0.0.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/netty-4.0/netty-4.0.gradle</p>
<p>Path to vulnerable library: /caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.0.Final/27e4bef1dfa3703add0dd6c36f32b01c6549c9cc/netty-codec-http-4.0.0.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-http-4.0.0.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.29.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/netty-4.1/netty-4.1.gradle</p>
<p>Path to vulnerable library: /caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.29.Final/454688b88cea27a4d407202d1fc79a6522345b5e/netty-codec-http-4.1.29.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.29.Final/454688b88cea27a4d407202d1fc79a6522345b5e/netty-codec-http-4.1.29.Final.jar</p>
<p>
Dependency Hierarchy:
- async-http-client-2.1.0.jar (Root Library)
- :x: **netty-codec-http-4.1.29.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.22.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/spring-webflux-5/spring-webflux-5.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.22.Final/3805f3ca0d57630200defc7f9bb6ed3382dcb10b/netty-codec-http-4.1.22.Final.jar</p>
<p>
Dependency Hierarchy:
- reactor-netty-0.7.5.RELEASE.jar (Root Library)
- netty-handler-proxy-4.1.22.Final.jar
- :x: **netty-codec-http-4.1.22.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.16.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/rest-6.4/rest-6.4.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.16.Final/d64312378b438dfdad84267c599a053327c6f02a/netty-codec-http-4.1.16.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.16.Final/d64312378b438dfdad84267c599a053327c6f02a/netty-codec-http-4.1.16.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.16.Final/d64312378b438dfdad84267c599a053327c6f02a/netty-codec-http-4.1.16.Final.jar</p>
<p>
Dependency Hierarchy:
- transport-netty4-client-6.3.2.jar (Root Library)
- :x: **netty-codec-http-4.1.16.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.13.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.13.Final/ee87368766e6b900cf6be8ac9cdce27156e9411/netty-codec-http-4.1.13.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.13.Final/ee87368766e6b900cf6be8ac9cdce27156e9411/netty-codec-http-4.1.13.Final.jar</p>
<p>
Dependency Hierarchy:
- transport-netty4-client-6.0.0.jar (Root Library)
- :x: **netty-codec-http-4.1.13.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.34.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/spring-webflux-5/spring-webflux-5.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.34.Final/2887d87fbc1b057657348f61dc538f7296daf79/netty-codec-http-4.1.34.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.34.Final/2887d87fbc1b057657348f61dc538f7296daf79/netty-codec-http-4.1.34.Final.jar,/home/wss-scanner/.ivy2/cache/io.netty/netty-codec-http/jars/netty-codec-http-4.1.34.Final.jar</p>
<p>
Dependency Hierarchy:
- reactor-netty-0.7.15.RELEASE.jar (Root Library)
- :x: **netty-codec-http-4.1.34.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.12.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/grpc-1.5/grpc-1.5.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.12.Final/df1561ac7c455faf57c83a45af78771c3d3d0621/netty-codec-http-4.1.12.Final.jar</p>
<p>
Dependency Hierarchy:
- grpc-netty-1.5.0.jar (Root Library)
- netty-codec-http2-4.1.12.Final.jar
- :x: **netty-codec-http-4.1.12.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.8.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/vertx-web-3.4/vertx-web-3.4.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.8.Final/1e88617c4a6c88da7e86fdbbd9494d22a250c879/netty-codec-http-4.1.8.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.8.Final/1e88617c4a6c88da7e86fdbbd9494d22a250c879/netty-codec-http-4.1.8.Final.jar</p>
<p>
Dependency Hierarchy:
- vertx-web-3.4.0.jar (Root Library)
- vertx-core-3.4.0.jar
- :x: **netty-codec-http-4.1.8.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.5.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/transport-5/transport-5.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.5.Final/87bda1b9ec7e3f75ca721fc87735cbedad2aa1a/netty-codec-http-4.1.5.Final.jar</p>
<p>
Dependency Hierarchy:
- transport-5.0.0.jar (Root Library)
- transport-netty4-client-5.0.0.jar
- :x: **netty-codec-http-4.1.5.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.32.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/aws-java-sdk-2.2/aws-java-sdk-2.2.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.32.Final/b9218adba7353ad5a75fcb639e4755d64bd6ddf/netty-codec-http-4.1.32.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.32.Final/b9218adba7353ad5a75fcb639e4755d64bd6ddf/netty-codec-http-4.1.32.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.32.Final/b9218adba7353ad5a75fcb639e4755d64bd6ddf/netty-codec-http-4.1.32.Final.jar</p>
<p>
Dependency Hierarchy:
- kinesis-2.2.0.jar (Root Library)
- netty-nio-client-2.2.0.jar
- netty-reactive-streams-http-2.0.0.jar
- :x: **netty-codec-http-4.1.32.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.0.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/netty-4.1/netty-4.1.gradle</p>
<p>Path to vulnerable library: /caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.0.Final/fcbd87accec40f44f5019211f4714e5b7d76ba47/netty-codec-http-4.1.0.Final.jar,/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.0.Final/fcbd87accec40f44f5019211f4714e5b7d76ba47/netty-codec-http-4.1.0.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-http-4.1.0.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-all-4.1.9.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/java-concurrent/java-concurrent.gradle</p>
<p>Path to vulnerable library: /caches/modules-2/files-2.1/io.netty/netty-all/4.1.9.Final/97860965d6a0a6b98e7f569f3f966727b8db75/netty-all-4.1.9.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.1.9.Final/97860965d6a0a6b98e7f569f3f966727b8db75/netty-all-4.1.9.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-all-4.1.9.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.0.56.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/netty-4.0/netty-4.0.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.56.Final/c88ed62c18a174e83ec3a560630297c9f7c0c1f2/netty-codec-http-4.0.56.Final.jar,/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.56.Final/c88ed62c18a174e83ec3a560630297c9f7c0c1f2/netty-codec-http-4.0.56.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-http-4.0.56.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.11.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.11.Final/3edeb0f08e455e570a55eb56bf64595fcb1a6b15/netty-codec-http-4.1.11.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-data-elasticsearch-3.0.0.RELEASE.jar (Root Library)
- transport-netty4-client-5.5.0.jar
- :x: **netty-codec-http-4.1.11.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.0.51.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.51.Final/1c8074c311dd2f1273c722477b232cdb74dcd844/netty-codec-http-4.0.51.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.51.Final/1c8074c311dd2f1273c722477b232cdb74dcd844/netty-codec-http-4.0.51.Final.jar</p>
<p>
Dependency Hierarchy:
- play-java-ws_2.11-2.5.19.jar (Root Library)
- async-http-client-2.0.36.jar
- :x: **netty-codec-http-4.0.51.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.9.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/ratpack-1.5/ratpack-1.5.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.9.Final/efb68f8ce201d180fdbbec8ade5e25684cae12bc/netty-codec-http-4.1.9.Final.jar</p>
<p>
Dependency Hierarchy:
- ratpack-core-1.5.0.jar (Root Library)
- :x: **netty-codec-http-4.1.9.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.7.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.7.Final/9f957998c651e7b73d6dc878f704d81b4c085387/netty-codec-http-4.1.7.Final.jar</p>
<p>
Dependency Hierarchy:
- transport-5.3.0.jar (Root Library)
- transport-netty4-client-5.3.0.jar
- :x: **netty-codec-http-4.1.7.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-all-4.0.13.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/jms/jms.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.0.13.Final/75de08aeaef1712d88b011d81900937481fc3e7/netty-all-4.0.13.Final.jar</p>
<p>
Dependency Hierarchy:
- hornetq-jms-client-2.4.7.Final.jar (Root Library)
- hornetq-core-client-2.4.7.Final.jar
- :x: **netty-all-4.0.13.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.36.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/transport-7.3/transport-7.3.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.36.Final/62b73d439dbddf3c0dde092b048580139695ab46/netty-codec-http-4.1.36.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.36.Final/62b73d439dbddf3c0dde092b048580139695ab46/netty-codec-http-4.1.36.Final.jar</p>
<p>
Dependency Hierarchy:
- transport-7.3.0.jar (Root Library)
- transport-netty4-client-7.3.0.jar
- :x: **netty-codec-http-4.1.36.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.0.34.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.34.Final/974e1c686ee143ae4d1266f64e03de657e778542/netty-codec-http-4.0.34.Final.jar</p>
<p>
Dependency Hierarchy:
- play-test_2.11-2.5.0.jar (Root Library)
- play-netty-server_2.11-2.5.0.jar
- netty-reactive-streams-http-1.0.2.jar
- :x: **netty-codec-http-4.0.34.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.19.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/akka-http-10.0/akka-http-10.0.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.19.Final/f7edff289d10cc03cdb97ad99e2722f9d61ffdc3/netty-codec-http-4.1.19.Final.jar</p>
<p>
Dependency Hierarchy:
- lagom-javadsl-testkit_2.11-1.4.0.jar (Root Library)
- lagom-javadsl-server_2.11-1.4.0.jar
- lagom-server_2.11-1.4.0.jar
- lagom-client_2.11-1.4.0.jar
- :x: **netty-codec-http-4.1.19.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.1.15.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/vertx-rx-3.5/vertx-rx-3.5.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.15.Final/c06dbf0f4119fdbb3db6ff880b38e835766455b2/netty-codec-http-4.1.15.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.15.Final/c06dbf0f4119fdbb3db6ff880b38e835766455b2/netty-codec-http-4.1.15.Final.jar</p>
<p>
Dependency Hierarchy:
- reactor-netty-0.7.0.RELEASE.jar (Root Library)
- netty-handler-proxy-4.1.15.Final.jar
- :x: **netty-codec-http-4.1.15.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-http-4.0.36.Final.jar</b></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/netty-4.0/netty-4.0.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.36.Final/5e83ee4191937ccdaac25fb48ec699169512891c/netty-codec-http-4.0.36.Final.jar,/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.0.36.Final/5e83ee4191937ccdaac25fb48ec699169512891c/netty-codec-http-4.0.36.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-http-4.0.36.Final.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/KDWSS/dd-trace-java/commit/2819174635979a19573ec0ce8e3e2b63a3848079">2819174635979a19573ec0ce8e3e2b63a3848079</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Netty before 4.1.42.Final mishandles whitespace before the colon in HTTP headers (such as a "Transfer-Encoding : chunked" line), which leads to HTTP request smuggling.
<p>Publish Date: 2019-09-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-16869>CVE-2019-16869</a></p>
</p>
</details>
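The mishandled whitespace can be illustrated with a minimal sketch (this is not Netty's actual parser; the function names and regex below are illustrative). A lenient parser that strips whitespace around the field name accepts `Transfer-Encoding : chunked` as a Transfer-Encoding header, while a strict RFC 7230 parser rejects it; it is the disagreement between two such parsers on the same request path that enables request smuggling:

```python
import re

def parse_header_lenient(line: str) -> tuple[str, str]:
    # Lenient parse: split on the first colon and strip whitespace around
    # the field name -- the pre-4.1.42 behaviour this CVE describes.
    name, _, value = line.partition(":")
    return name.strip().lower(), value.strip()

def parse_header_strict(line: str) -> tuple[str, str]:
    # Strict RFC 7230 parse: the field name must be a token immediately
    # followed by the colon; whitespace before the colon is rejected.
    match = re.fullmatch(r"([!#$%&'*+.^_`|~0-9A-Za-z-]+):[ \t]*(.*?)[ \t]*", line)
    if match is None:
        raise ValueError(f"malformed header line: {line!r}")
    return match.group(1).lower(), match.group(2)

smuggled = "Transfer-Encoding : chunked"
print(parse_header_lenient(smuggled))  # ('transfer-encoding', 'chunked')
# parse_header_strict(smuggled) raises ValueError
```

A front-end proxy using the strict rule may forward the line as an ordinary (ignored) header, while a lenient back end honours it as chunked encoding, so the two ends disagree about where the request body ends.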
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
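The 7.5 score above can be reproduced from the listed metrics with the CVSS v3.0 base-score equations (a sketch: the weights are the published v3.0 constants for this vector, and the zero-impact special case is omitted):

```python
import math

# CVSS v3.0 weights for AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:H/A:N
AV, AC, PR, UI = 0.85, 0.77, 0.85, 0.85  # Network / Low / None / None
C, I, A = 0.0, 0.56, 0.0                 # None / High / None

iss = 1 - (1 - C) * (1 - I) * (1 - A)    # impact sub-score = 0.56
impact = 6.42 * iss                      # scope unchanged
exploitability = 8.22 * AV * AC * PR * UI

# The spec rounds up to one decimal place.
base_score = math.ceil(min(impact + exploitability, 10) * 10) / 10
print(base_score)  # 7.5
```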
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16869">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16869</a></p>
<p>Release Date: 2019-09-26</p>
<p>Fix Resolution: io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final</p>
</p>
</details>
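When checking which of the resolved versions listed above actually need the upgrade, a numeric comparison against the fix version suffices for Netty's `major.minor.patch.Final` scheme (an illustrative sketch; `numeric_version` and `is_vulnerable` are hypothetical helpers, not part of any tool mentioned here):

```python
def numeric_version(version: str) -> tuple[int, ...]:
    # Keep only the numeric components: '4.1.29.Final' -> (4, 1, 29)
    return tuple(int(part) for part in version.split(".") if part.isdigit())

FIX = numeric_version("4.1.42.Final")

def is_vulnerable(version: str) -> bool:
    # Anything strictly below the fix version is affected by this CVE.
    return numeric_version(version) < FIX

for v in ("4.0.0.Final", "4.1.29.Final", "4.1.42.Final", "4.1.43.Final"):
    print(v, is_vulnerable(v))
```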
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.0.0.Final","packageFilePaths":["/dd-java-agent/instrumentation/netty-4.0/netty-4.0.gradle"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-codec-http:4.0.0.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.29.Final","packageFilePaths":["/dd-java-agent/instrumentation/netty-4.1/netty-4.1.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.asynchttpclient:async-http-client:2.1.0;io.netty:netty-codec-http:4.1.29.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.22.Final","packageFilePaths":["/dd-java-agent/instrumentation/spring-webflux-5/spring-webflux-5.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.projectreactor.ipc:reactor-netty:0.7.5.RELEASE;io.netty:netty-handler-proxy:4.1.22.Final;io.netty:netty-codec-http:4.1.22.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.16.Final","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/rest-6.4/rest-6.4.gradle","/dd-java-agent/instrumentation/elasticsearch/transport-6/transport-6.gradle","/dd-java-agent/instrumentation/elasticsearch/rest-5/rest-5.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.elasticsearch.plugin:transport-netty4-client:6.3.2;io.netty:netty-codec-http:4.1.16.Final","isMinimumFixVersionAvailable":true
,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.13.Final","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle","/dd-java-agent/instrumentation/elasticsearch/transport-6/transport-6.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.elasticsearch.plugin:transport-netty4-client:6.0.0;io.netty:netty-codec-http:4.1.13.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.34.Final","packageFilePaths":["/dd-java-agent/instrumentation/spring-webflux-5/spring-webflux-5.gradle","/dd-java-agent/instrumentation/play-2.6/play-2.6.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.projectreactor.ipc:reactor-netty:0.7.15.RELEASE;io.netty:netty-codec-http:4.1.34.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.12.Final","packageFilePaths":["/dd-java-agent/instrumentation/grpc-1.5/grpc-1.5.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.grpc:grpc-netty:1.5.0;io.netty:netty-codec-http2:4.1.12.Final;io.netty:netty-codec-http:4.1.12.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.8.Final","packageFilePaths":["/dd-java-agent/instrumentation/vertx-web-3.4/vertx-web-3.4.gradle","/dd-java-agent/instrumentation/finatra-2.9/finatra-2.9.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.vertx:vertx-web:3.4.0;io.vertx:v
ertx-core:3.4.0;io.netty:netty-codec-http:4.1.8.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.5.Final","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/transport-5/transport-5.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.elasticsearch.client:transport:5.0.0;org.elasticsearch.plugin:transport-netty4-client:5.0.0;io.netty:netty-codec-http:4.1.5.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.32.Final","packageFilePaths":["/dd-java-agent/instrumentation/aws-java-sdk-2.2/aws-java-sdk-2.2.gradle","/dd-java-agent/instrumentation/elasticsearch/rest-6.4/rest-6.4.gradle","/dd-java-agent/instrumentation/elasticsearch/rest-7/rest-7.gradle"],"isTransitiveDependency":true,"dependencyTree":"software.amazon.awssdk:kinesis:2.2.0;software.amazon.awssdk:netty-nio-client:2.2.0;com.typesafe.netty:netty-reactive-streams-http:2.0.0;io.netty:netty-codec-http:4.1.32.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.0.Final","packageFilePaths":["/dd-java-agent/instrumentation/netty-4.1/netty-4.1.gradle","/dd-java-agent/instrumentation/netty-4.1-shared/netty-4.1-shared.gradle"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-codec-http:4.1.0.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-all","packageVersion":"4.1.9.Final","packageFilePaths":
["/dd-java-agent/instrumentation/java-concurrent/java-concurrent.gradle"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-all:4.1.9.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.0.56.Final","packageFilePaths":["/dd-java-agent/instrumentation/netty-4.0/netty-4.0.gradle"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-codec-http:4.0.56.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.11.Final","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.data:spring-data-elasticsearch:3.0.0.RELEASE;org.elasticsearch.plugin:transport-netty4-client:5.5.0;io.netty:netty-codec-http:4.1.11.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.0.51.Final","packageFilePaths":["/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle","/dd-smoke-tests/play-2.5/play-2.5.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-java-ws_2.11:2.5.19;org.asynchttpclient:async-http-client:2.0.36;io.netty:netty-codec-http:4.0.51.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.9.Final","packageFilePaths":["/dd-java-agent/instrumentation/ratpack-1.5/ratpack-1.5.gradle"],"isTransitiveDependency":true,
"dependencyTree":"io.ratpack:ratpack-core:1.5.0;io.netty:netty-codec-http:4.1.9.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.7.Final","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.elasticsearch.client:transport:5.3.0;org.elasticsearch.plugin:transport-netty4-client:5.3.0;io.netty:netty-codec-http:4.1.7.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-all","packageVersion":"4.0.13.Final","packageFilePaths":["/dd-java-agent/instrumentation/jms/jms.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.hornetq:hornetq-jms-client:2.4.7.Final;org.hornetq:hornetq-core-client:2.4.7.Final;io.netty:netty-all:4.0.13.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.36.Final","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/transport-7.3/transport-7.3.gradle","/dd-java-agent/instrumentation/elasticsearch/transport/transport.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.elasticsearch.client:transport:7.3.0;org.elasticsearch.plugin:transport-netty4-client:7.3.0;io.netty:netty-codec-http:4.1.36.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.0.34.Final","packageFilePaths":["/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle"],"isTra
nsitiveDependency":true,"dependencyTree":"com.typesafe.play:play-test_2.11:2.5.0;com.typesafe.play:play-netty-server_2.11:2.5.0;com.typesafe.netty:netty-reactive-streams-http:1.0.2;io.netty:netty-codec-http:4.0.34.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.19.Final","packageFilePaths":["/dd-java-agent/instrumentation/akka-http-10.0/akka-http-10.0.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.lightbend.lagom:lagom-javadsl-testkit_2.11:1.4.0;com.lightbend.lagom:lagom-javadsl-server_2.11:1.4.0;com.lightbend.lagom:lagom-server_2.11:1.4.0;com.lightbend.lagom:lagom-client_2.11:1.4.0;io.netty:netty-codec-http:4.1.19.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.15.Final","packageFilePaths":["/dd-java-agent/instrumentation/vertx-rx-3.5/vertx-rx-3.5.gradle","/dd-java-agent/instrumentation/spring-webflux-5/spring-webflux-5.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.projectreactor.ipc:reactor-netty:0.7.0.RELEASE;io.netty:netty-handler-proxy:4.1.15.Final;io.netty:netty-codec-http:4.1.15.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.0.36.Final","packageFilePaths":["/dd-java-agent/instrumentation/netty-4.0/netty-4.0.gradle"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-codec-http:4.0.36.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-all:4.1.42.Final,io.netty:netty-codec-http:4.1.42.Final"}],"baseBranches":["master"],"vulnerabilityIdentifier"
:"CVE-2019-16869","vulnerabilityDetails":"Netty before 4.1.42.Final mishandles whitespace before the colon in HTTP headers (such as a \"Transfer-Encoding : chunked\" line), which leads to HTTP request smuggling.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-16869","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
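The vulnerability described above (CVE-2019-16869) hinges on parsers disagreeing about a malformed header such as `Transfer-Encoding : chunked`, where RFC 7230 forbids whitespace between the field name and the colon. The sketch below is purely illustrative and not Netty's actual code: it contrasts a hypothetical lenient parser (which trims the field name, as affected Netty versions effectively did) with a strict one (which discards the malformed header), showing the disagreement that makes request smuggling possible when two such parsers sit in the same request path.

```python
# Illustrative only -- models the parser disagreement behind CVE-2019-16869,
# not Netty's implementation. RFC 7230 says no whitespace is allowed between
# a header field name and the colon.

RAW_HEADER = "Transfer-Encoding : chunked"

def lenient_parse(line):
    """Lenient style: trim whitespace around the field name."""
    name, _, value = line.partition(":")
    return name.strip().lower(), value.strip()

def strict_parse(line):
    """Strict RFC 7230 style: a field name containing whitespace is malformed."""
    name, _, value = line.partition(":")
    if name != name.strip():
        return None  # malformed header: ignore it entirely
    return name.lower(), value.strip()

print(lenient_parse(RAW_HEADER))  # ('transfer-encoding', 'chunked')
print(strict_parse(RAW_HEADER))   # None -- the two parsers disagree
```

When a front-end proxy and a back-end server apply these two behaviors to the same request, one treats the body as chunked and the other does not, which is exactly the ambiguity HTTP request smuggling exploits; the report's suggested fix is upgrading to `io.netty:netty-codec-http:4.1.42.Final`, which rejects such headers.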
|
non_test
|
cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty all final jar netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty all final jar netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation netty netty gradle path to vulnerable library caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation netty netty gradle path to vulnerable library caches modules files io netty netty codec http final netty codec http final jar home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy async http client jar root library x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency 
file dd trace java dd java agent instrumentation spring webflux spring webflux gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy reactor netty release jar root library netty handler proxy final jar x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation elasticsearch rest rest gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy transport client jar root library x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation elasticsearch transport transport gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy transport client jar root library x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd 
java agent instrumentation spring webflux spring webflux gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar home wss scanner cache io netty netty codec http jars netty codec http final jar dependency hierarchy reactor netty release jar root library x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation grpc grpc gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy grpc netty jar root library netty codec final jar x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation vertx web vertx web gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy vertx web jar root library vertx core jar x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation elasticsearch transport transport gradle path to 
vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy transport jar root library transport client jar x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation aws java sdk aws java sdk gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy kinesis jar root library netty nio client jar netty reactive streams http jar x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation netty netty gradle path to vulnerable library caches modules files io netty netty codec http final netty codec http final jar caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy x netty codec http final jar vulnerable library netty all final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation java concurrent java concurrent gradle path to vulnerable library caches modules files io netty netty all final netty all final jar home wss scanner gradle 
caches modules files io netty netty all final netty all final jar dependency hierarchy x netty all final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation netty netty gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation elasticsearch transport transport gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy spring data elasticsearch release jar root library transport client jar x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation play play gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy play java ws jar root library async http client jar x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven 
network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation ratpack ratpack gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy ratpack core jar root library x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation elasticsearch transport transport gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy transport jar root library transport client jar x netty codec http final jar vulnerable library netty all final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation jms jms gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty all final netty all final jar dependency hierarchy hornetq jms client final jar root library hornetq core client final jar x netty all final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation elasticsearch transport transport gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar 
home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy transport jar root library transport client jar x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation play play gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy play test jar root library play netty server jar netty reactive streams http jar x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation akka http akka http gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy lagom javadsl testkit jar root library lagom javadsl server jar lagom server jar lagom client jar x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation vertx rx vertx rx gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy reactor netty release jar root library netty handler proxy 
final jar x netty codec http final jar vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file dd trace java dd java agent instrumentation netty netty gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec http final netty codec http final jar caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy x netty codec http final jar vulnerable library found in head commit a href found in base branch master vulnerability details netty before final mishandles whitespace before the colon in http headers such as a transfer encoding chunked line which leads to http request smuggling publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty all final io netty netty codec http final isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree org asynchttpclient async http client io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final 
packagefilepaths istransitivedependency true dependencytree io projectreactor ipc reactor netty release io netty netty handler proxy final io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree org elasticsearch plugin transport client io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree org elasticsearch plugin transport client io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree io projectreactor ipc reactor netty release io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree io grpc grpc netty io netty netty codec final io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree io vertx vertx web io vertx vertx core io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion 
final packagefilepaths istransitivedependency true dependencytree org elasticsearch client transport org elasticsearch plugin transport client io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree software amazon awssdk kinesis software amazon awssdk netty nio client com typesafe netty netty reactive streams http io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency false dependencytree io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty all packageversion final packagefilepaths istransitivedependency false dependencytree io netty netty all final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency false dependencytree io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree org springframework data spring data elasticsearch release org elasticsearch plugin transport client io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion 
final packagefilepaths istransitivedependency true dependencytree com typesafe play play java ws org asynchttpclient async http client io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree io ratpack ratpack core io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree org elasticsearch client transport org elasticsearch plugin transport client io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty all packageversion final packagefilepaths istransitivedependency true dependencytree org hornetq hornetq jms client final org hornetq hornetq core client final io netty netty all final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree org elasticsearch client transport org elasticsearch plugin transport client io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree com typesafe play play test com typesafe play play netty server com typesafe netty netty reactive streams http io netty netty codec http final isminimumfixversionavailable true minimumfixversion io 
netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree com lightbend lagom lagom javadsl testkit com lightbend lagom lagom javadsl server com lightbend lagom lagom server com lightbend lagom lagom client io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency true dependencytree io projectreactor ipc reactor netty release io netty netty handler proxy final io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final packagetype java groupid io netty packagename netty codec http packageversion final packagefilepaths istransitivedependency false dependencytree io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty all final io netty netty codec http final basebranches vulnerabilityidentifier cve vulnerabilitydetails netty before final mishandles whitespace before the colon in http headers such as a transfer encoding chunked line which leads to http request smuggling vulnerabilityurl
| 0
|
275,447
| 23,917,096,751
|
IssuesEvent
|
2022-09-09 13:38:29
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
roachtest: cluster_creation failed
|
C-test-failure O-robot O-roachtest release-blocker branch-release-22.2
|
roachtest.cluster_creation [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6400400?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6400400?buildTab=artifacts#/cluster_creation) on release-22.2 @ [1adc3f0b396dc045c52729d249943889a495c652](https://github.com/cockroachdb/cockroach/commits/1adc3f0b396dc045c52729d249943889a495c652):
```
test kv50/rangelookups/split/nodes=8 was skipped due to test_runner.go:613,test_runner.go:265,stopper.go:489: in provider: gce: Command: gcloud [compute instances create --subnet default --scopes cloud-platform --image ubuntu-2004-focal-v20210603 --image-project ubuntu-os-cloud --boot-disk-type pd-ssd --service-account 21965078311-compute@developer.gserviceaccount.com --maintenance-policy MIGRATE --local-ssd interface=NVME --machine-type n1-standard-8 --labels roachprod=true,cluster=teamcity-6400400-1662700828-71-n9cpu8,lifetime=12h0m0s,created=2022-09-09t13_37_58z --metadata-from-file startup-script=/tmp/gce-startup-script474512318 --project cockroach-ephemeral --boot-disk-size=32GB]: exit status 1
(1) attached stack trace
-- stack trace:
| github.com/cockroachdb/cockroach/pkg/roachprod/vm.ForProvider
| github.com/cockroachdb/cockroach/pkg/roachprod/vm/vm.go:374
| [...repeated from below...]
Wraps: (2) in provider: gce
Wraps: (3) attached stack trace
-- stack trace:
| github.com/cockroachdb/cockroach/pkg/roachprod/vm/gce.(*Provider).Create.func2
| github.com/cockroachdb/cockroach/pkg/roachprod/vm/gce/gcloud.go:538
| golang.org/x/sync/errgroup.(*Group).Go.func1
| golang.org/x/sync/errgroup/external/org_golang_x_sync/errgroup/errgroup.go:74
| runtime.goexit
| GOROOT/src/runtime/asm_amd64.s:1594
Wraps: (4) Command: gcloud [compute instances create --subnet default --scopes cloud-platform --image ubuntu-2004-focal-v20210603 --image-project ubuntu-os-cloud --boot-disk-type pd-ssd --service-account 21965078311-compute@developer.gserviceaccount.com --maintenance-policy MIGRATE --local-ssd interface=NVME --machine-type n1-standard-8 --labels roachprod=true,cluster=teamcity-6400400-1662700828-71-n9cpu8,lifetime=12h0m0s,created=2022-09-09t13_37_58z --metadata-from-file startup-script=/tmp/gce-startup-script474512318 --project cockroach-ephemeral --boot-disk-size=32GB]
| Output: WARNING: Some requests generated warnings:
| - Disk size: '32 GB' is larger than image size: '10 GB'. You might need to resize the root repartition manually if the operating system does not support automatic resizing. See https://cloud.google.com/compute/docs/disks/add-persistent-disk#resize_pd for details.
| - The resource 'projects/ubuntu-os-cloud/global/images/ubuntu-2004-focal-v20210603' is deprecated. A suggested replacement is 'projects/ubuntu-os-cloud/global/images/ubuntu-2004-focal-v20220905'.
|
| ERROR: (gcloud.compute.instances.create) Could not fetch resource:
| - The zone 'projects/cockroach-ephemeral/zones/us-east1-b' does not have enough resources available to fulfill the request. Try a different zone, or try again later.
Wraps: (5) exit status 1
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *withstack.withStack (4) *errutil.withPrefix (5) *exec.ExitError
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=8</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #78601 roachtest: cluster_creation failed [C-test-failure O-roachtest O-robot T-testeng branch-master sync-me-8]
- #78035 roachtest: cluster_creation failed [C-test-failure O-roachtest O-robot T-testeng branch-release-22.1]
</p>
</details>
/cc @cockroachdb/dev-inf
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*cluster_creation.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
2.0
|
roachtest: cluster_creation failed - roachtest.cluster_creation [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6400400?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6400400?buildTab=artifacts#/cluster_creation) on release-22.2 @ [1adc3f0b396dc045c52729d249943889a495c652](https://github.com/cockroachdb/cockroach/commits/1adc3f0b396dc045c52729d249943889a495c652):
```
test kv50/rangelookups/split/nodes=8 was skipped due to test_runner.go:613,test_runner.go:265,stopper.go:489: in provider: gce: Command: gcloud [compute instances create --subnet default --scopes cloud-platform --image ubuntu-2004-focal-v20210603 --image-project ubuntu-os-cloud --boot-disk-type pd-ssd --service-account 21965078311-compute@developer.gserviceaccount.com --maintenance-policy MIGRATE --local-ssd interface=NVME --machine-type n1-standard-8 --labels roachprod=true,cluster=teamcity-6400400-1662700828-71-n9cpu8,lifetime=12h0m0s,created=2022-09-09t13_37_58z --metadata-from-file startup-script=/tmp/gce-startup-script474512318 --project cockroach-ephemeral --boot-disk-size=32GB]: exit status 1
(1) attached stack trace
-- stack trace:
| github.com/cockroachdb/cockroach/pkg/roachprod/vm.ForProvider
| github.com/cockroachdb/cockroach/pkg/roachprod/vm/vm.go:374
| [...repeated from below...]
Wraps: (2) in provider: gce
Wraps: (3) attached stack trace
-- stack trace:
| github.com/cockroachdb/cockroach/pkg/roachprod/vm/gce.(*Provider).Create.func2
| github.com/cockroachdb/cockroach/pkg/roachprod/vm/gce/gcloud.go:538
| golang.org/x/sync/errgroup.(*Group).Go.func1
| golang.org/x/sync/errgroup/external/org_golang_x_sync/errgroup/errgroup.go:74
| runtime.goexit
| GOROOT/src/runtime/asm_amd64.s:1594
Wraps: (4) Command: gcloud [compute instances create --subnet default --scopes cloud-platform --image ubuntu-2004-focal-v20210603 --image-project ubuntu-os-cloud --boot-disk-type pd-ssd --service-account 21965078311-compute@developer.gserviceaccount.com --maintenance-policy MIGRATE --local-ssd interface=NVME --machine-type n1-standard-8 --labels roachprod=true,cluster=teamcity-6400400-1662700828-71-n9cpu8,lifetime=12h0m0s,created=2022-09-09t13_37_58z --metadata-from-file startup-script=/tmp/gce-startup-script474512318 --project cockroach-ephemeral --boot-disk-size=32GB]
| Output: WARNING: Some requests generated warnings:
| - Disk size: '32 GB' is larger than image size: '10 GB'. You might need to resize the root repartition manually if the operating system does not support automatic resizing. See https://cloud.google.com/compute/docs/disks/add-persistent-disk#resize_pd for details.
| - The resource 'projects/ubuntu-os-cloud/global/images/ubuntu-2004-focal-v20210603' is deprecated. A suggested replacement is 'projects/ubuntu-os-cloud/global/images/ubuntu-2004-focal-v20220905'.
|
| ERROR: (gcloud.compute.instances.create) Could not fetch resource:
| - The zone 'projects/cockroach-ephemeral/zones/us-east1-b' does not have enough resources available to fulfill the request. Try a different zone, or try again later.
Wraps: (5) exit status 1
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *withstack.withStack (4) *errutil.withPrefix (5) *exec.ExitError
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=8</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #78601 roachtest: cluster_creation failed [C-test-failure O-roachtest O-robot T-testeng branch-master sync-me-8]
- #78035 roachtest: cluster_creation failed [C-test-failure O-roachtest O-robot T-testeng branch-release-22.1]
</p>
</details>
/cc @cockroachdb/dev-inf
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*cluster_creation.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
test
|
roachtest cluster creation failed roachtest cluster creation with on release test rangelookups split nodes was skipped due to test runner go test runner go stopper go in provider gce command gcloud exit status attached stack trace stack trace github com cockroachdb cockroach pkg roachprod vm forprovider github com cockroachdb cockroach pkg roachprod vm vm go wraps in provider gce wraps attached stack trace stack trace github com cockroachdb cockroach pkg roachprod vm gce provider create github com cockroachdb cockroach pkg roachprod vm gce gcloud go golang org x sync errgroup group go golang org x sync errgroup external org golang x sync errgroup errgroup go runtime goexit goroot src runtime asm s wraps command gcloud output warning some requests generated warnings disk size gb is larger than image size gb you might need to resize the root repartition manually if the operating system does not support automatic resizing see for details the resource projects ubuntu os cloud global images ubuntu focal is deprecated a suggested replacement is projects ubuntu os cloud global images ubuntu focal error gcloud compute instances create could not fetch resource the zone projects cockroach ephemeral zones us b does not have enough resources available to fulfill the request try a different zone or try again later wraps exit status error types withstack withstack errutil withprefix withstack withstack errutil withprefix exec exiterror parameters roachtest cloud gce roachtest cpu roachtest ssd help see see same failure on other branches roachtest cluster creation failed roachtest cluster creation failed cc cockroachdb dev inf
| 1
|
181,907
| 21,664,466,918
|
IssuesEvent
|
2022-05-07 01:26:46
|
Baneeishaque/spring_store_thymeleaf
|
https://api.github.com/repos/Baneeishaque/spring_store_thymeleaf
|
closed
|
WS-2018-0084 (High) detected in sshpk-1.13.1.tgz - autoclosed
|
security vulnerability
|
## WS-2018-0084 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>sshpk-1.13.1.tgz</b></p></summary>
<p>A library for finding and using SSH public keys</p>
<p>Library home page: <a href="https://registry.npmjs.org/sshpk/-/sshpk-1.13.1.tgz">https://registry.npmjs.org/sshpk/-/sshpk-1.13.1.tgz</a></p>
<p>Path to dependency file: spring_store_thymeleaf/html_site_template_customer/fashi/Source/OwlCarousel2-2.3.4/OwlCarousel2-2.3.4/package.json</p>
<p>Path to vulnerable library: spring_store_thymeleaf/html_site_template_customer/fashi/Source/OwlCarousel2-2.3.4/OwlCarousel2-2.3.4/node_modules/sshpk/package.json</p>
<p>
Dependency Hierarchy:
- grunt-sass-1.2.1.tgz (Root Library)
- node-sass-3.13.1.tgz
- request-2.67.0.tgz
- http-signature-1.1.1.tgz
- :x: **sshpk-1.13.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Baneeishaque/spring_store_thymeleaf/commit/08ba0922f3668b139df2a365e01b4d3e57faef86">08ba0922f3668b139df2a365e01b4d3e57faef86</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Versions of sshpk before 1.14.1 are vulnerable to regular expression denial of service when parsing crafted invalid public keys.
<p>Publish Date: 2018-04-25
<p>URL: <a href=https://github.com/joyent/node-sshpk/blob/v1.13.1/lib/formats/ssh.js#L17>WS-2018-0084</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>8.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nodesecurity.io/advisories/606">https://nodesecurity.io/advisories/606</a></p>
<p>Release Date: 2018-01-27</p>
<p>Fix Resolution: 1.14.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2018-0084 (High) detected in sshpk-1.13.1.tgz - autoclosed - ## WS-2018-0084 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>sshpk-1.13.1.tgz</b></p></summary>
<p>A library for finding and using SSH public keys</p>
<p>Library home page: <a href="https://registry.npmjs.org/sshpk/-/sshpk-1.13.1.tgz">https://registry.npmjs.org/sshpk/-/sshpk-1.13.1.tgz</a></p>
<p>Path to dependency file: spring_store_thymeleaf/html_site_template_customer/fashi/Source/OwlCarousel2-2.3.4/OwlCarousel2-2.3.4/package.json</p>
<p>Path to vulnerable library: spring_store_thymeleaf/html_site_template_customer/fashi/Source/OwlCarousel2-2.3.4/OwlCarousel2-2.3.4/node_modules/sshpk/package.json</p>
<p>
Dependency Hierarchy:
- grunt-sass-1.2.1.tgz (Root Library)
- node-sass-3.13.1.tgz
- request-2.67.0.tgz
- http-signature-1.1.1.tgz
- :x: **sshpk-1.13.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Baneeishaque/spring_store_thymeleaf/commit/08ba0922f3668b139df2a365e01b4d3e57faef86">08ba0922f3668b139df2a365e01b4d3e57faef86</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Versions of sshpk before 1.14.1 are vulnerable to regular expression denial of service when parsing crafted invalid public keys.
<p>Publish Date: 2018-04-25
<p>URL: <a href=https://github.com/joyent/node-sshpk/blob/v1.13.1/lib/formats/ssh.js#L17>WS-2018-0084</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>8.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nodesecurity.io/advisories/606">https://nodesecurity.io/advisories/606</a></p>
<p>Release Date: 2018-01-27</p>
<p>Fix Resolution: 1.14.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
ws high detected in sshpk tgz autoclosed ws high severity vulnerability vulnerable library sshpk tgz a library for finding and using ssh public keys library home page a href path to dependency file spring store thymeleaf html site template customer fashi source package json path to vulnerable library spring store thymeleaf html site template customer fashi source node modules sshpk package json dependency hierarchy grunt sass tgz root library node sass tgz request tgz http signature tgz x sshpk tgz vulnerable library found in head commit a href vulnerability details versions of sshpk before are vulnerable to regular expression denial of service when parsing crafted invalid public keys publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
77,639
| 7,582,944,259
|
IssuesEvent
|
2018-04-25 07:10:33
|
mozilla-iam/auth0-custom-lock
|
https://api.github.com/repos/mozilla-iam/auth0-custom-lock
|
closed
|
Cannot login passwordless
|
bug ready for test
|
STR:
1. Navigate to mozillians staging
2. Click Log in button
3. Add a non-LDAP email address and click Enter
Expected:
"Send email" screen is shown
Actual:
Spinning spinner is shown continuously

|
1.0
|
Cannot login passwordless - STR:
1. Navigate to mozillians staging
2. Click Log in button
3. Add a non-LDAP email address and click Enter
Expected:
"Send email" screen is shown
Actual:
Spinning spinner is shown continuously

|
test
|
cannot login passwordless str navigate to mozillians staging click log in button add a non ldap email address and click enter expected send email screen is shown actual spinning spinner is shown continuously
| 1
|
221,556
| 17,358,315,005
|
IssuesEvent
|
2021-07-29 16:56:21
|
eneiluj/cospend-nc
|
https://api.github.com/repos/eneiluj/cospend-nc
|
closed
|
Category doesn't show the full list
|
bug testing
|
Since the last update the category doesn't show the full list when clicking on it :

Nextcloud 22.0.0
Cospend 1.3.10
Breeze Dark 22.0.0
|
1.0
|
Category doesn't show the full list - Since the last update the category doesn't show the full list when clicking on it :

Nextcloud 22.0.0
Cospend 1.3.10
Breeze Dark 22.0.0
|
test
|
category doesn t show the full list since the last update the category doesn t show the full list when clicking on it nextcloud cospend breeze dark
| 1
|
34,078
| 6,288,741,837
|
IssuesEvent
|
2017-07-19 17:40:10
|
LLNL/maestrowf
|
https://api.github.com/repos/LLNL/maestrowf
|
closed
|
Upload sdist to pypi
|
Documentation High Priority
|
For the version 1.0.0 release, you should have both the sdist and bdist versions available.
|
1.0
|
Upload sdist to pypi - For the version 1.0.0 release, you should have both the sdist and bdist versions available.
|
non_test
|
upload sdist to pypi for the version release you should have both the sdist and bdist versions available
| 0
|
13,591
| 3,349,129,516
|
IssuesEvent
|
2015-11-17 07:55:16
|
start-jsk/jsk_apc
|
https://api.github.com/repos/start-jsk/jsk_apc
|
opened
|
Test recognition nodes
|
test
|
- [ ] localization of objects in bin
- [ ] localization of object in hand
- [ ] recognition of object in hand
|
1.0
|
Test recognition nodes - - [ ] localization of objects in bin
- [ ] localization of object in hand
- [ ] recognition of object in hand
|
test
|
test recognition nodes localization of objects in bin localization of object in hand recognition of object in hand
| 1
|
298,626
| 25,840,922,914
|
IssuesEvent
|
2022-12-13 00:15:25
|
mehah/otclient
|
https://api.github.com/repos/mehah/otclient
|
closed
|
Bug on target monsters
|
Priority: Critical Status: Pending Test Type: Bug
|
### Priority
Critical
### Area
- [ ] Data
- [X] Source
- [ ] Docker
- [ ] Other
### What happened?
The client is bugging the target when selecting mobs.

### What OS are you seeing the problem on?
Windows
### Code of Conduct
- [X] I agree to follow this project's Code of Conduct
|
1.0
|
Bug on target monsters - ### Priority
Critical
### Area
- [ ] Data
- [X] Source
- [ ] Docker
- [ ] Other
### What happened?
The client is bugging the target when selecting mobs.

### What OS are you seeing the problem on?
Windows
### Code of Conduct
- [X] I agree to follow this project's Code of Conduct
|
test
|
bug on target monsters priority critical area data source docker other what happened the client is bugging the target when selecting mobs what os are you seeing the problem on windows code of conduct i agree to follow this project s code of conduct
| 1
|
320,083
| 9,769,353,901
|
IssuesEvent
|
2019-06-06 08:21:30
|
mschubert/clustermq
|
https://api.github.com/repos/mschubert/clustermq
|
opened
|
Switch to `pbdZMQ` package for ZeroMQ backend
|
next-version priority
|
`rzmq` does not support #150, and the package is not really in active development. Switch to `pbdZMQ`, and interface with the library directly
Enables solving of #150 and #125
In addition, inner loops could be written in compiled code this way
|
1.0
|
Switch to `pbdZMQ` package for ZeroMQ backend - `rzmq` does not support #150, and the package is not really in active development. Switch to `pbdZMQ`, and interface with the library directly
Enables solving of #150 and #125
In addition, inner loops could be written in compiled code this way
|
non_test
|
switch to pbdzmq package for zeromq backend rzmq does not support and the package is not really in active development switch to pbdzmq and interface with the library directly enables solving of and in addition inner loops could be written in compiled code this way
| 0
|
167,796
| 20,726,397,579
|
IssuesEvent
|
2022-03-14 02:47:58
|
mihorsky/intentionally-buggy-code
|
https://api.github.com/repos/mihorsky/intentionally-buggy-code
|
opened
|
CVE-2021-37701 (High) detected in tar-2.2.1.tgz
|
security vulnerability
|
## CVE-2021-37701 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-2.2.1.tgz</b></p></summary>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-2.2.1.tgz">https://registry.npmjs.org/tar/-/tar-2.2.1.tgz</a></p>
<p>
Dependency Hierarchy:
- react-scripts-1.0.17.tgz (Root Library)
- fsevents-1.1.2.tgz
- node-pre-gyp-0.6.36.tgz
- :x: **tar-2.2.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The npm package "tar" (aka node-tar) before versions 4.4.16, 5.0.8, and 6.1.7 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory, where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems. The cache checking logic used both `\` and `/` characters as path separators, however `\` is a valid filename character on posix systems. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. Additionally, a similar confusion could arise on case-insensitive filesystems. If a tar archive contained a directory at `FOO`, followed by a symbolic link named `foo`, then on case-insensitive file systems, the creation of the symbolic link would remove the directory from the filesystem, but _not_ from the internal directory cache, as it would not be treated as a cache hit. A subsequent file entry within the `FOO` directory would then be placed in the target of the symbolic link, thinking that the directory had already been created. These issues were addressed in releases 4.4.16, 5.0.8 and 6.1.7. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. 
If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-9r2w-394v-53qc.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37701>CVE-2021-37701</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc">https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution (tar): 4.4.16</p>
<p>Direct dependency fix Resolution (react-scripts): 1.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-37701 (High) detected in tar-2.2.1.tgz - ## CVE-2021-37701 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-2.2.1.tgz</b></p></summary>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-2.2.1.tgz">https://registry.npmjs.org/tar/-/tar-2.2.1.tgz</a></p>
<p>
Dependency Hierarchy:
- react-scripts-1.0.17.tgz (Root Library)
- fsevents-1.1.2.tgz
- node-pre-gyp-0.6.36.tgz
- :x: **tar-2.2.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The npm package "tar" (aka node-tar) before versions 4.4.16, 5.0.8, and 6.1.7 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory, where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems. The cache checking logic used both `\` and `/` characters as path separators, however `\` is a valid filename character on posix systems. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. Additionally, a similar confusion could arise on case-insensitive filesystems. If a tar archive contained a directory at `FOO`, followed by a symbolic link named `foo`, then on case-insensitive file systems, the creation of the symbolic link would remove the directory from the filesystem, but _not_ from the internal directory cache, as it would not be treated as a cache hit. A subsequent file entry within the `FOO` directory would then be placed in the target of the symbolic link, thinking that the directory had already been created. These issues were addressed in releases 4.4.16, 5.0.8 and 6.1.7. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. 
If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-9r2w-394v-53qc.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37701>CVE-2021-37701</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc">https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution (tar): 4.4.16</p>
<p>Direct dependency fix Resolution (react-scripts): 1.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve high detected in tar tgz cve high severity vulnerability vulnerable library tar tgz tar for node library home page a href dependency hierarchy react scripts tgz root library fsevents tgz node pre gyp tgz x tar tgz vulnerable library found in base branch master vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite and arbitrary code execution vulnerability node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems the cache checking logic used both and characters as path separators however is a valid filename character on posix systems by first creating a directory and then replacing that directory with a symlink it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite additionally a similar confusion could arise on case insensitive filesystems if a tar archive contained a directory at foo followed by a symbolic link named foo then on case insensitive file systems the creation of the symbolic link would remove the directory from the filesystem but not from the internal directory cache as it would not be treated as a cache hit a subsequent file entry within the foo directory would then be placed in the target of the symbolic link thinking that the directory had already been 
created these issues were addressed in releases and the branch of node tar has been deprecated and did not receive patches for these issues if you are still using a release we recommend you update to a more recent version of node tar if this is not possible a workaround is available in the referenced ghsa publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar direct dependency fix resolution react scripts step up your open source security game with whitesource
| 0
|
44,481
| 5,630,283,810
|
IssuesEvent
|
2017-04-05 11:50:42
|
EFForg/https-everywhere
|
https://api.github.com/repos/EFForg/https-everywhere
|
closed
|
Customize the path of HTTPSEverywhereUserRules?
|
Ruleset Testing
|
I want to move the path to my cloud drive, not in firefox's profile.
|
1.0
|
Customize the path of HTTPSEverywhereUserRules? - I want to move the path to my cloud drive, not in firefox's profile.
|
test
|
customize the path of httpseverywhereuserrules i want to move the path to my cloud drive not in firefox s profile
| 1
|
624,394
| 19,696,034,361
|
IssuesEvent
|
2022-01-12 12:14:45
|
TeamBookTez/booktez-server
|
https://api.github.com/repos/TeamBookTez/booktez-server
|
closed
|
[bug] fix login
|
🌈 refactor 2️⃣ priority: middle
|
## 🐞 버그 설명
- 탈퇴된 회원 쿼리문 수정
## 📝 todo
- [x] #18
- [x] Controller 파라미터 req.body 한번에 보내도록
- [x] Controller reponseData 네이밍 data -> resData
- [x] 불필요한 변수선언 제거
- [x] 프리티어 적용 확인
- [x] Service 오류코드 constant library 적용
- [ ] isDeleted 체크
|
1.0
|
[bug] fix login - ## 🐞 버그 설명
- 탈퇴된 회원 쿼리문 수정
## 📝 todo
- [x] #18
- [x] Controller 파라미터 req.body 한번에 보내도록
- [x] Controller reponseData 네이밍 data -> resData
- [x] 불필요한 변수선언 제거
- [x] 프리티어 적용 확인
- [x] Service 오류코드 constant library 적용
- [ ] isDeleted 체크
|
non_test
|
fix login 🐞 버그 설명 탈퇴된 회원 쿼리문 수정 📝 todo controller 파라미터 req body 한번에 보내도록 controller reponsedata 네이밍 data resdata 불필요한 변수선언 제거 프리티어 적용 확인 service 오류코드 constant library 적용 isdeleted 체크
| 0
|
189,643
| 22,047,082,760
|
IssuesEvent
|
2022-05-30 03:51:29
|
Trinadh465/device_renesas_kernel_AOSP10_r33_CVE-2022-0492
|
https://api.github.com/repos/Trinadh465/device_renesas_kernel_AOSP10_r33_CVE-2022-0492
|
closed
|
CVE-2021-3411 (Medium) detected in linuxlinux-4.19.88 - autoclosed
|
security vulnerability
|
## CVE-2021-3411 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.88</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/Trinadh465/device_renesas_kernel_AOSP10_r33_CVE-2022-0492/commit/8d2169763c8858bce8d07fbb569f01ef9b30383b">8d2169763c8858bce8d07fbb569f01ef9b30383b</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/linux-4.19.72/arch/x86/kernel/kprobes/opt.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/linux-4.19.72/arch/x86/kernel/kprobes/opt.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A flaw was found in the Linux kernel in versions prior to 5.10. A violation of memory access was found while detecting a padding of int3 in the linking state. The highest threat from this vulnerability is to data confidentiality and integrity as well as system availability.
<p>Publish Date: 2021-03-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3411>CVE-2021-3411</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2021-3411">https://www.linuxkernelcves.com/cves/CVE-2021-3411</a></p>
<p>Release Date: 2021-03-09</p>
<p>Fix Resolution: v5.9.15, v5.10</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-3411 (Medium) detected in linuxlinux-4.19.88 - autoclosed - ## CVE-2021-3411 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.88</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/Trinadh465/device_renesas_kernel_AOSP10_r33_CVE-2022-0492/commit/8d2169763c8858bce8d07fbb569f01ef9b30383b">8d2169763c8858bce8d07fbb569f01ef9b30383b</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/linux-4.19.72/arch/x86/kernel/kprobes/opt.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/linux-4.19.72/arch/x86/kernel/kprobes/opt.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A flaw was found in the Linux kernel in versions prior to 5.10. A violation of memory access was found while detecting a padding of int3 in the linking state. The highest threat from this vulnerability is to data confidentiality and integrity as well as system availability.
<p>Publish Date: 2021-03-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3411>CVE-2021-3411</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2021-3411">https://www.linuxkernelcves.com/cves/CVE-2021-3411</a></p>
<p>Release Date: 2021-03-09</p>
<p>Fix Resolution: v5.9.15, v5.10</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve medium detected in linuxlinux autoclosed cve medium severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch master vulnerable source files linux arch kernel kprobes opt c linux arch kernel kprobes opt c vulnerability details a flaw was found in the linux kernel in versions prior to a violation of memory access was found while detecting a padding of in the linking state the highest threat from this vulnerability is to data confidentiality and integrity as well as system availability publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
12,315
| 3,265,787,536
|
IssuesEvent
|
2015-10-22 17:47:15
|
mautic/mautic
|
https://api.github.com/repos/mautic/mautic
|
closed
|
[BUG] Help message below field forms is not shown
|
Bug Ready To Test
|
In version 1.2 the help text below the label of field forms is not shown.
|
1.0
|
[BUG] Help message below field forms is not shown - In version 1.2 the help text below the label of field forms is not shown.
|
test
|
help message below field forms is not shown in version the help text below the label of field forms is not shown
| 1
|
172,844
| 13,349,259,523
|
IssuesEvent
|
2020-08-29 23:31:31
|
nicholas-maltbie/PropHunt
|
https://api.github.com/repos/nicholas-maltbie/PropHunt
|
opened
|
Create Unit Tests for Existing Systems
|
test
|
Once the Basic Testing Framework has been added, make sure that the code coverage for Unit Tests includes all existing code. This can be broken into smaller PRs for each system until the test coverage reaches 100% of all classes.
|
1.0
|
Create Unit Tests for Existing Systems - Once the Basic Testing Framework has been added, make sure that the code coverage for Unit Tests includes all existing code. This can be broken into smaller PRs for each system until the test coverage reaches 100% of all classes.
|
test
|
create unit tests for existing systems once the basic testing framework has been added make sure that the code coverage for unit tests includes all existing code this can be broken into smaller prs for each system until the test coverage reaches of all classes
| 1
|
124,748
| 10,322,630,000
|
IssuesEvent
|
2019-08-31 14:08:34
|
repobee/repobee
|
https://api.github.com/repos/repobee/repobee
|
opened
|
Update GitLab integration test instance
|
testing
|
- [ ] Update to the latest version of GitLab CE
- [ ] Increase the amount of students to at least 5 (currently only 2)
|
1.0
|
Update GitLab integration test instance - - [ ] Update to the latest version of GitLab CE
- [ ] Increase the amount of students to at least 5 (currently only 2)
|
test
|
update gitlab integration test instance update to the latest version of gitlab ce increase the amount of students to at least currently only
| 1
|
259,508
| 19,601,976,352
|
IssuesEvent
|
2022-01-06 03:05:24
|
HookCycle/BITcc
|
https://api.github.com/repos/HookCycle/BITcc
|
closed
|
See number of advisees (teacher)
|
documentation
|
The user can check the list of registered students he is mentoring
|
1.0
|
See number of advisees (teacher) - The user can check the list of registered students he is mentoring
|
non_test
|
see number of advisees teacher the user can check the list of registered students he is mentoring
| 0
|
207,275
| 15,801,761,303
|
IssuesEvent
|
2021-04-03 06:27:40
|
tgstation/tgstation
|
https://api.github.com/repos/tgstation/tgstation
|
closed
|
Green slimes don't produce enough mutation toxin
|
Bug Tested/Reproduced
|
<!-- Write **BELOW** The Headers and **ABOVE** The comments else it may not be viewable -->
## Round ID: [159371](https://scrubby.melonmesa.com/round/159371)
<!--- **INCLUDE THE ROUND ID**
If you discovered this issue from playing tgstation hosted servers:
[Round ID]: # (It can be found in the Status panel or retrieved from https://atlantaned.space/statbus/round.php ! The round id let's us look up valuable information and logs for the round the bug happened.)-->159271
## Testmerges:
<!-- If you're certain the issue is to be caused by a test merge [OOC tab -> Show Server Revision], report it in the pull request's comment section rather than on the tracker(If you're unsure you can refer to the issue number by prefixing said number with #. The issue number can be found beside the title after submitting it to the tracker).If no testmerges are active, feel free to remove this section. -->
The following pull requests are currently test merged:
#58051: 'reverts map_format changes' by Fikou at commit 39c28248f2
#58096: '[15.ai] [April Fools] The Nanotrasen Voice Module subscription has expired.' by Iamgoofball at commit 7a192b7550
## Reproduction:
<!-- Explain your issue in detail, including the steps to reproduce it. Issues without proper reproduction steps or explanation are open to being ignored/closed by maintainers.-->Inject 5u of plasma into a green slime. Draw out 3.75u mutation toxin. Likely related to the toxins chem overhaul.
<!-- **For Admins:** Oddities induced by var-edits and other admin tools are not necessarily bugs. Verify that your issues occur under regular circumstances before reporting them. -->
|
1.0
|
Green slimes don't produce enough mutation toxin - <!-- Write **BELOW** The Headers and **ABOVE** The comments else it may not be viewable -->
## Round ID: [159371](https://scrubby.melonmesa.com/round/159371)
<!--- **INCLUDE THE ROUND ID**
If you discovered this issue from playing tgstation hosted servers:
[Round ID]: # (It can be found in the Status panel or retrieved from https://atlantaned.space/statbus/round.php ! The round id let's us look up valuable information and logs for the round the bug happened.)-->159271
## Testmerges:
<!-- If you're certain the issue is to be caused by a test merge [OOC tab -> Show Server Revision], report it in the pull request's comment section rather than on the tracker(If you're unsure you can refer to the issue number by prefixing said number with #. The issue number can be found beside the title after submitting it to the tracker).If no testmerges are active, feel free to remove this section. -->
The following pull requests are currently test merged:
#58051: 'reverts map_format changes' by Fikou at commit 39c28248f2
#58096: '[15.ai] [April Fools] The Nanotrasen Voice Module subscription has expired.' by Iamgoofball at commit 7a192b7550
## Reproduction:
<!-- Explain your issue in detail, including the steps to reproduce it. Issues without proper reproduction steps or explanation are open to being ignored/closed by maintainers.-->Inject 5u of plasma into a green slime. Draw out 3.75u mutation toxin. Likely related to the toxins chem overhaul.
<!-- **For Admins:** Oddities induced by var-edits and other admin tools are not necessarily bugs. Verify that your issues occur under regular circumstances before reporting them. -->
|
test
|
green slimes don t produce enough mutation toxin round id include the round id if you discovered this issue from playing tgstation hosted servers it can be found in the status panel or retrieved from the round id let s us look up valuable information and logs for the round the bug happened testmerges the following pull requests are currently test merged reverts map format changes by fikou at commit the nanotrasen voice module subscription has expired by iamgoofball at commit reproduction inject of plasma into a green slime draw out mutation toxin likely related to the toxins chem overhaul
| 1
|
295,697
| 25,496,007,631
|
IssuesEvent
|
2022-11-27 17:31:40
|
YassiCorp/YassiCrop-Bot-Issues
|
https://api.github.com/repos/YassiCorp/YassiCrop-Bot-Issues
|
closed
|
[27/11/22] Erreur | » YassiCorp Bot Emojis créé par ㄚ卂丂丂丨
|
test
|
### **🎡 Informations:**
» Python vers: V3.10.6
» NextCord vers: V2.2.0
» Date et Heure: Le 27/11/22 à 18:28:45
» Guild: YassiCorp Bot Emojis | 1029080143672647680
» User: ㄚ卂丂丂丨#3452 | ㄚ卂丂丂丨 | 626833155281911849
» Lien du message: https://discord.com/channels/1029080143672647680/1029080144553447498/1046477688128405504### **🎯 L'erreur:**
```PY
Command raised an exception: attributeerror: 'interaction' object has no attribute 'dg'
```
|
1.0
|
[27/11/22] Erreur | » YassiCorp Bot Emojis créé par ㄚ卂丂丂丨 - ### **🎡 Informations:**
» Python vers: V3.10.6
» NextCord vers: V2.2.0
» Date et Heure: Le 27/11/22 à 18:28:45
» Guild: YassiCorp Bot Emojis | 1029080143672647680
» User: ㄚ卂丂丂丨#3452 | ㄚ卂丂丂丨 | 626833155281911849
» Lien du message: https://discord.com/channels/1029080143672647680/1029080144553447498/1046477688128405504### **🎯 L'erreur:**
```PY
Command raised an exception: attributeerror: 'interaction' object has no attribute 'dg'
```
|
test
|
erreur » yassicorp bot emojis créé par ㄚ卂丂丂丨 🎡 informations » python vers » nextcord vers » date et heure le à » guild yassicorp bot emojis » user ㄚ卂丂丂丨 ㄚ卂丂丂丨 » lien du message 🎯 l erreur py command raised an exception attributeerror interaction object has no attribute dg
| 1
|
169,212
| 13,129,900,630
|
IssuesEvent
|
2020-08-06 14:34:46
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
ccl/backupccl: TestBackupRestoreChecksum failed
|
C-test-failure O-robot branch-master
|
[(ccl/backupccl).TestBackupRestoreChecksum failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2150004&tab=buildLog) on [master@d50061568013bce3646069c79c117abc65a49f56](https://github.com/cockroachdb/cockroach/commits/d50061568013bce3646069c79c117abc65a49f56):
```
=== RUN TestBackupRestoreChecksum
TestBackupRestoreChecksum: test_log_scope.go:85: test logs captured to: /go/src/github.com/cockroachdb/cockroach/artifacts/logTestBackupRestoreChecksum270929600
TestBackupRestoreChecksum: test_log_scope.go:58: use -show-logs to present logs inline
test logs left over in: /go/src/github.com/cockroachdb/cockroach/artifacts/logTestBackupRestoreChecksum270929600
TestBackupRestoreChecksum: testing.go:906: race detected during execution of test
--- FAIL: TestBackupRestoreChecksum (5.32s)
```
<details><summary>More</summary><p>
Parameters:
- TAGS=
- GOFLAGS=-race -parallel=2
```
make stressrace TESTS=TestBackupRestoreChecksum PKG=./pkg/ccl/backupccl TESTTIMEOUT=5m STRESSFLAGS='-timeout 5m' 2>&1
```
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2ATestBackupRestoreChecksum.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
|
1.0
|
ccl/backupccl: TestBackupRestoreChecksum failed - [(ccl/backupccl).TestBackupRestoreChecksum failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2150004&tab=buildLog) on [master@d50061568013bce3646069c79c117abc65a49f56](https://github.com/cockroachdb/cockroach/commits/d50061568013bce3646069c79c117abc65a49f56):
```
=== RUN TestBackupRestoreChecksum
TestBackupRestoreChecksum: test_log_scope.go:85: test logs captured to: /go/src/github.com/cockroachdb/cockroach/artifacts/logTestBackupRestoreChecksum270929600
TestBackupRestoreChecksum: test_log_scope.go:58: use -show-logs to present logs inline
test logs left over in: /go/src/github.com/cockroachdb/cockroach/artifacts/logTestBackupRestoreChecksum270929600
TestBackupRestoreChecksum: testing.go:906: race detected during execution of test
--- FAIL: TestBackupRestoreChecksum (5.32s)
```
<details><summary>More</summary><p>
Parameters:
- TAGS=
- GOFLAGS=-race -parallel=2
```
make stressrace TESTS=TestBackupRestoreChecksum PKG=./pkg/ccl/backupccl TESTTIMEOUT=5m STRESSFLAGS='-timeout 5m' 2>&1
```
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2ATestBackupRestoreChecksum.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
|
test
|
ccl backupccl testbackuprestorechecksum failed on run testbackuprestorechecksum testbackuprestorechecksum test log scope go test logs captured to go src github com cockroachdb cockroach artifacts testbackuprestorechecksum test log scope go use show logs to present logs inline test logs left over in go src github com cockroachdb cockroach artifacts testbackuprestorechecksum testing go race detected during execution of test fail testbackuprestorechecksum more parameters tags goflags race parallel make stressrace tests testbackuprestorechecksum pkg pkg ccl backupccl testtimeout stressflags timeout powered by
| 1
|
456,581
| 13,150,852,517
|
IssuesEvent
|
2020-08-09 13:51:34
|
chrisjsewell/docutils
|
https://api.github.com/repos/chrisjsewell/docutils
|
closed
|
Test suite failure on Windows (CR/LF incompatibility) [SF:bugs:115]
|
bugs closed-works-for-me priority-5
|
author: crouz
created: 2009-06-21 19:34:26
assigned: None
SF_url: https://sourceforge.net/p/docutils/bugs/115
20 tests fail.
The functional tests seem to fail due to different line endings or unmatched number of trailing spaces, see attached file.
Used software:
Docutils 0.6 \[snapshot 2009-06-21, r5994\]
Python 2.6.1
Windows 7 RC 64-bit \(6.1.7100\)
---
commenter: crouz
posted: 2009-06-21 19:34:26
title: #115 Test suite failure on Windows (CR/LF incompatibility)
attachments:
- https://sourceforge.net/p/docutils/bugs/_discuss/thread/4b99ec41/47d6/attachment/alltests.zip
Test output
---
commenter: milde
posted: 2009-09-03 08:02:10
title: #115 Test suite failure on Windows (CR/LF incompatibility)
- **summary**: Test suite failure --> Test suite failure on Windows
---
commenter: milde
posted: 2009-09-03 08:02:10
title: #115 Test suite failure on Windows (CR/LF incompatibility)
Could you please specify which tests fail
\(or even better, if the problem is still manifest in the latest SVN\)?
---
commenter: crouz
posted: 2009-09-04 12:35:50
title: #115 Test suite failure on Windows (CR/LF incompatibility)
I re-ran the tests on the latest snapshot \(2009-09-04, r6116\) with Python 2.6.1 \(r261:67517\) and Python 2.6.2 \(r262:71605, Apr 14 2009, 22:46:50\) \[MSC v.1500 64 bit \(AMD64\)\] on win32 and there were now 35 tests which fail.
All the functions tests fail due to different file endings. The expected output contains windows style line endings \(CRLF\) while the output contains unix style line endings \(LF\).
---
commenter: milde
posted: 2009-09-10 11:34:17
title: #115 Test suite failure on Windows (CR/LF incompatibility)
It seems your Python \(or Docutils\) installation is configured to replace a '\n' with LF while it should be
CRLF on Windows.
Do you get Unix line endings \(LF\) also with "real" rst2... runs?
Do you get Unix line endings with other Python programs that write to a file?
---
commenter: milde
posted: 2009-09-10 11:34:17
title: #115 Test suite failure on Windows (CR/LF incompatibility)
- **summary**: Test suite failure on Windows --> Test suite failure on Windows (CR/LF incompatibility)
---
commenter: crouz
posted: 2009-09-10 21:13:38
title: #115 Test suite failure on Windows (CR/LF incompatibility)
A real run of rst2html also outputted only LF.
Other python programs write CRLF. \(Tried with a simple fh.write\("\n"\) program\)
I couldn't find any configuration files for docutils, where should I look?
---
commenter: milde
posted: 2009-09-11 14:25:42
title: #115 Test suite failure on Windows (CR/LF incompatibility)
OK. I see that in io.py FileOutput opens files in binary mode \('wb' instead of 'w'\).
As there is a BinaryFileOutput class too, I suppose this is a side-effect of the latest Python-3 compatibility change.
Reverting should fix the problem \(done in SVN\).
---
commenter: milde
posted: 2009-09-11 14:25:46
title: #115 Test suite failure on Windows (CR/LF incompatibility)
- **status**: open --> open-fixed
---
commenter: crouz
posted: 2009-09-11 22:38:50
title: #115 Test suite failure on Windows (CR/LF incompatibility)
I ran the tests again with the latest SVN version \(6126\) and there are still newline discrepancies, but now it's instead some places with CRCRLF in the output.
---
commenter: crouz
posted: 2009-09-11 22:44:36
title: #115 Test suite failure on Windows (CR/LF incompatibility)
attachments:
- https://sourceforge.net/p/docutils/bugs/_discuss/thread/4b99ec41/1eb0/attachment/compact_lists_diff.zip
Output sample from test run.
---
commenter: milde
posted: 2009-09-17 14:08:41
title: #115 Test suite failure on Windows (CR/LF incompatibility)
- **status**: open-fixed --> open-works-for-me
---
commenter: milde
posted: 2009-09-17 14:08:41
title: #115 Test suite failure on Windows (CR/LF incompatibility)
Could you please be more specific or provide sample output? I have no chance to test this due to lack of a Windows system.
---
commenter: crouz
posted: 2009-09-17 21:28:37
title: #115 Test suite failure on Windows (CR/LF incompatibility)
I did that already, see attached file "compact\_lists\_diff.zip". :o\)
All the tests in functional/ have similar discrepancies.
---
commenter: milde
posted: 2009-09-18 08:01:56
title: #115 Test suite failure on Windows (CR/LF incompatibility)
- **status**: open-works-for-me --> pending-works-for-me
---
commenter: sf-robot
posted: 2009-10-03 02:20:56
title: #115 Test suite failure on Windows (CR/LF incompatibility)
This Tracker item was closed automatically by the system. It was
previously set to a Pending status, and the original submitter
did not respond within 14 days \(the time period specified by
the administrator of this Tracker\).
---
commenter: sf-robot
posted: 2009-10-03 02:21:01
title: #115 Test suite failure on Windows (CR/LF incompatibility)
- **status**: pending-works-for-me --> closed-works-for-me
|
1.0
|
Test suite failure on Windows (CR/LF incompatibility) [SF:bugs:115] -
author: crouz
created: 2009-06-21 19:34:26
assigned: None
SF_url: https://sourceforge.net/p/docutils/bugs/115
20 tests fail.
The functional tests seem to fail due to different line endings or unmatched number of trailing spaces, see attached file.
Used software:
Docutils 0.6 \[snapshot 2009-06-21, r5994\]
Python 2.6.1
Windows 7 RC 64-bit \(6.1.7100\)
---
commenter: crouz
posted: 2009-06-21 19:34:26
title: #115 Test suite failure on Windows (CR/LF incompatibility)
attachments:
- https://sourceforge.net/p/docutils/bugs/_discuss/thread/4b99ec41/47d6/attachment/alltests.zip
Test output
---
commenter: milde
posted: 2009-09-03 08:02:10
title: #115 Test suite failure on Windows (CR/LF incompatibility)
- **summary**: Test suite failure --> Test suite failure on Windows
---
commenter: milde
posted: 2009-09-03 08:02:10
title: #115 Test suite failure on Windows (CR/LF incompatibility)
Could you please specify which tests fail
\(or even better, if the problem is still manifest in the latest SVN\)?
---
commenter: crouz
posted: 2009-09-04 12:35:50
title: #115 Test suite failure on Windows (CR/LF incompatibility)
I re-ran the tests on the latest snapshot \(2009-09-04, r6116\) with Python 2.6.1 \(r261:67517\) and Python 2.6.2 \(r262:71605, Apr 14 2009, 22:46:50\) \[MSC v.1500 64 bit \(AMD64\)\] on win32 and there were now 35 tests which fail.
All the functions tests fail due to different file endings. The expected output contains windows style line endings \(CRLF\) while the output contains unix style line endings \(LF\).
---
commenter: milde
posted: 2009-09-10 11:34:17
title: #115 Test suite failure on Windows (CR/LF incompatibility)
It seems your Python \(or Docutils\) installation is configured to replace a '\n' with LF while it should be
CRLF on Windows.
Do you get Unix line endings \(LF\) also with "real" rst2... runs?
Do you get Unix line endings with other Python programs that write to a file?
---
commenter: milde
posted: 2009-09-10 11:34:17
title: #115 Test suite failure on Windows (CR/LF incompatibility)
- **summary**: Test suite failure on Windows --> Test suite failure on Windows (CR/LF incompatibility)
---
commenter: crouz
posted: 2009-09-10 21:13:38
title: #115 Test suite failure on Windows (CR/LF incompatibility)
A real run of rst2html also outputted only LF.
Other python programs write CRLF. \(Tried with a simple fh.write\("\n"\) program\)
I couldn't find any configuration files for docutils, where should I look?
---
commenter: milde
posted: 2009-09-11 14:25:42
title: #115 Test suite failure on Windows (CR/LF incompatibility)
OK. I see that in io.py FileOutput opens files in binary mode \('wb' instead of 'w'\).
As there is a BinaryFileOutput class too, I suppose this is a side-effect of the latest Python-3 compatibility change.
Reverting should fix the problem \(done in SVN\).
---
commenter: milde
posted: 2009-09-11 14:25:46
title: #115 Test suite failure on Windows (CR/LF incompatibility)
- **status**: open --> open-fixed
---
commenter: crouz
posted: 2009-09-11 22:38:50
title: #115 Test suite failure on Windows (CR/LF incompatibility)
I ran the tests again with the latest SVN version \(6126\) and there are still newline discrepancies, but now it's instead some places with CRCRLF in the output.
---
commenter: crouz
posted: 2009-09-11 22:44:36
title: #115 Test suite failure on Windows (CR/LF incompatibility)
attachments:
- https://sourceforge.net/p/docutils/bugs/_discuss/thread/4b99ec41/1eb0/attachment/compact_lists_diff.zip
Output sample from test run.
---
commenter: milde
posted: 2009-09-17 14:08:41
title: #115 Test suite failure on Windows (CR/LF incompatibility)
- **status**: open-fixed --> open-works-for-me
---
commenter: milde
posted: 2009-09-17 14:08:41
title: #115 Test suite failure on Windows (CR/LF incompatibility)
Could you please be more specific or provide sample output? I have no chance to test this due to lack of a Windows system.
---
commenter: crouz
posted: 2009-09-17 21:28:37
title: #115 Test suite failure on Windows (CR/LF incompatibility)
I did that already, see attached file "compact\_lists\_diff.zip". :o\)
All the tests in functional/ have similar discrepancies.
---
commenter: milde
posted: 2009-09-18 08:01:56
title: #115 Test suite failure on Windows (CR/LF incompatibility)
- **status**: open-works-for-me --> pending-works-for-me
---
commenter: sf-robot
posted: 2009-10-03 02:20:56
title: #115 Test suite failure on Windows (CR/LF incompatibility)
This Tracker item was closed automatically by the system. It was
previously set to a Pending status, and the original submitter
did not respond within 14 days \(the time period specified by
the administrator of this Tracker\).
---
commenter: sf-robot
posted: 2009-10-03 02:21:01
title: #115 Test suite failure on Windows (CR/LF incompatibility)
- **status**: pending-works-for-me --> closed-works-for-me
|
non_test
|
test suite failure on windows cr lf incompatibility author crouz created assigned none sf url tests fail the functional tests seem to fail due to different line endings or unmatched number of trailing spaces see attached file used software docutils python windows rc bit commenter crouz posted title test suite failure on windows cr lf incompatibility attachments test output commenter milde posted title test suite failure on windows cr lf incompatibility summary test suite failure test suite failure on windows commenter milde posted title test suite failure on windows cr lf incompatibility could you please specify which tests fail or even better if the problem is still manifest in the latest svn commenter crouz posted title test suite failure on windows cr lf incompatibility i re ran the tests on the latest snapshot with python and python apr on and there were now tests which fail all the functions tests fail due to different file endings the expected output contains windows style line endings crlf while the output contains unix style line endings lf commenter milde posted title test suite failure on windows cr lf incompatibility it seems your python or docutils installation is configured to replace a n with lf while it should be crlf on windows do you get unix line endings lf also with real runs do you get unix line endings with other python programs that write to a file commenter milde posted title test suite failure on windows cr lf incompatibility summary test suite failure on windows test suite failure on windows cr lf incompatibility commenter crouz posted title test suite failure on windows cr lf incompatibility a real run of also outputted only lf other python programs write crlf tried with a simple fh write n program i couldn t find any configuration files for docutils where should i look commenter milde posted title test suite failure on windows cr lf incompatibility ok i see that in io py fileoutput opens files in binary mode wb instead of w as there is a 
binaryfileoutput class too i suppose this is a side effect of the latest python compatibility change reverting should fix the problem done in svn commenter milde posted title test suite failure on windows cr lf incompatibility status open open fixed commenter crouz posted title test suite failure on windows cr lf incompatibility i ran the tests again with the latest svn version and there are still newline discrepancies but now it s instead some places with crcrlf in the output commenter crouz posted title test suite failure on windows cr lf incompatibility attachments output sample from test run commenter milde posted title test suite failure on windows cr lf incompatibility status open fixed open works for me commenter milde posted title test suite failure on windows cr lf incompatibility could you please be more specific or provide sample output i have no chance to test this due to lack of a windows system commenter crouz posted title test suite failure on windows cr lf incompatibility i did that already see attached file compact lists diff zip o all the tests in functional have similar discrepancies commenter milde posted title test suite failure on windows cr lf incompatibility status open works for me pending works for me commenter sf robot posted title test suite failure on windows cr lf incompatibility this tracker item was closed automatically by the system it was previously set to a pending status and the original submitter did not respond within days the time period specified by the administrator of this tracker commenter sf robot posted title test suite failure on windows cr lf incompatibility status pending works for me closed works for me
binary_label: 0
---
row 55,843 | id 6,493,746,599 | IssuesEvent | 2017-08-21 18:34:47 | dotnet/roslyn | https://api.github.com/repos/dotnet/roslyn | closed
title: Convert if to switch garbles preprocessor directive
labels: Area-IDE Bug Depth Testing
body:
Convert
```C#
static int Main(string[] args)
{
if (
#if true
args == null
#else
args == new object()
#endif
)
{
}
else
{
}
}
```
and most of the processor gets lost. Note that the change is not offered if the first case of the preprocessor is false
``` C#
switch (
#if true
args)
{
case null:
break;
default:
break;
}
```
index: 1.0
text_combine:
Convert if to switch garbles preprocessor directive - Convert
```C#
static int Main(string[] args)
{
if (
#if true
args == null
#else
args == new object()
#endif
)
{
}
else
{
}
}
```
and most of the processor gets lost. Note that the change is not offered if the first case of the preprocessor is false
``` C#
switch (
#if true
args)
{
case null:
break;
default:
break;
}
```
label: test
text:
convert if to switch garbles preprocessor directive convert c static int main string args if if true args null else args new object endif else and most of the processor gets lost note that the change is not offered if the first case of the preprocessor is false c switch if true args case null break default break
binary_label: 1
---
row 274,878 | id 20,871,112,700 | IssuesEvent | 2022-03-22 12:03:31 | bounswe/bounswe2022group5 | https://api.github.com/repos/bounswe/bounswe2022group5 | closed
title: Editing Functional/Articles Requirements
labels: Type: Documentation Medium Priority Status: Need Review
body:
#### Description:
Requirements related to the Articles part of the project will be determined, a new section in requirements page for our project's Wiki part will be created and requrements will be added to the this section. Final decisions about requirements will be determined in the upcoming weeks.
Reviewer: @oguzhandemirelx
#### Task Deadline: 21.03.2022 - 18:00 GMT+3
#### Review Deadline: 21.03.2022 - 20:00 GMT+3
index: 1.0
text_combine:
Editing Functional/Articles Requirements - #### Description:
Requirements related to the Articles part of the project will be determined, a new section in requirements page for our project's Wiki part will be created and requrements will be added to the this section. Final decisions about requirements will be determined in the upcoming weeks.
Reviewer: @oguzhandemirelx
#### Task Deadline: 21.03.2022 - 18:00 GMT+3
#### Review Deadline: 21.03.2022 - 20:00 GMT+3
label: non_test
text:
editing functional articles requirements description requirements related to the articles part of the project will be determined a new section in requirements page for our project s wiki part will be created and requrements will be added to the this section final decisions about requirements will be determined in the upcoming weeks reviewer oguzhandemirelx task deadline gmt review deadline gmt
binary_label: 0
---
row 81,607 | id 7,787,763,753 | IssuesEvent | 2018-06-07 00:17:24 | equella/Equella | https://api.github.com/repos/equella/Equella | closed
title: New UI not working in IE11
labels: Ready for Testing Unreleased blocker! bug
body:
The following errors are displayed in the javascript console
SCRIPT438: Object doesn't support property or method 'find'
index.js (41464,13)
SCRIPT5007: Unable to get property 'focus' of undefined or null reference
logon.js (3,2)
index: 1.0
text_combine:
New UI not working in IE11 - The following errors are displayed in the javascript console
SCRIPT438: Object doesn't support property or method 'find'
index.js (41464,13)
SCRIPT5007: Unable to get property 'focus' of undefined or null reference
logon.js (3,2)
label: test
text:
new ui not working in the following errors are displayed in the javascript console object doesn t support property or method find index js unable to get property focus of undefined or null reference logon js
binary_label: 1
---
row 39,120 | id 2,851,140,013 | IssuesEvent | 2015-06-01 02:53:18 | CenterForOpenScience/osf.io | https://api.github.com/repos/CenterForOpenScience/osf.io | closed
title: New permission default for adding new contributors.
labels: 5 - pending review discuss priority - medium
body:
Steps
-------
1. Go to "sharing" tab on project, click add button
2. find and add contributors
Expected
------------
Default permission is not admin.
Actual
--------
Default is admin.
Suggestion
--------
Set to read, read+write, or blank making the admin adding contributors pick one of the permissions options themselves.
I would like to discuss this.
index: 1.0
text_combine:
New permission default for adding new contributors. - Steps
-------
1. Go to "sharing" tab on project, click add button
2. find and add contributors
Expected
------------
Default permission is not admin.
Actual
--------
Default is admin.
Suggestion
--------
Set to read, read+write, or blank making the admin adding contributors pick one of the permissions options themselves.
I would like to discuss this.
label: non_test
text:
new permission default for adding new contributors steps go to sharing tab on project click add button find and add contributors expected default permission is not admin actual default is admin suggestion set to read read write or blank making the admin adding contributors pick one of the permissions options themselves i would like to discuss this
binary_label: 0
---
row 118,412 | id 9,986,480,887 | IssuesEvent | 2019-07-10 19:12:43 | flutter/flutter | https://api.github.com/repos/flutter/flutter | closed
title: Make quick_actions testable
labels: a: tests p: first party p: quick_actions p: tooling plugin severe: new feature
body:
Please add the possibility to unit test apps which use QuickActions. One example which is testable is FirebaseMessaging.
index: 1.0
text_combine:
Make quick_actions testable - Please add the possibility to unit test apps which use QuickActions. One example which is testable is FirebaseMessaging.
label: test
text:
make quick actions testable please add the possibility to unit test apps which use quickactions one example which is testable is firebasemessaging
binary_label: 1
---
row 289,662 | id 21,789,024,317 | IssuesEvent | 2022-05-14 16:00:43 | StanfordLegion/legion | https://api.github.com/repos/StanfordLegion/legion | closed
title: Parts of the installation process are inconsistent
labels: bug planned Documentation
body:
While installing regent I found a few inconsistencies in the installation process.
1. The prerequisites listed on http://regent-lang.org/install/ seem to differ compared to the ones listed on https://github.com/StanfordLegion/legion/tree/stable/language#prerequisites
2. The `circuit.rg` file is not present anymore https://github.com/StanfordLegion/legion/tree/stable/language#running
|
1.0
|
Parts of the installation process are inconsistent - While installing regent I found a few inconsistencies in the installation process.
1. The prerequisites listed on http://regent-lang.org/install/ seem to differ compared to the ones listed on https://github.com/StanfordLegion/legion/tree/stable/language#prerequisites
2. The `circuit.rg` file is not present anymore https://github.com/StanfordLegion/legion/tree/stable/language#running
|
non_test
|
parts of the installation process are inconsistent while installing regent i found a few inconsistencies in the installation process the prerequisites listed on seem to differ compared to the ones listed on the circuit rg file is not present anymore
| 0
|
456,748
| 13,150,986,492
|
IssuesEvent
|
2020-08-09 14:32:02
|
chrisjsewell/docutils
|
https://api.github.com/repos/chrisjsewell/docutils
|
closed
|
Upload wheels to pypi [SF:bugs:275]
|
bugs closed-duplicate priority-5
|
author: aragilar
created: 2015-04-08 06:42:36.599000
assigned: grubert
SF_url: https://sourceforge.net/p/docutils/bugs/275
Currently docutils does not publish any wheels on pypi. Wheels make docutils faster to install (no need to run setup.py, which for a large number of packages can take some time), and is no more difficult than uploading an sdist (see https://packaging.python.org/en/latest/distributing.html#wheels for instructions).
---
commenter: grubert
posted: 2015-04-11 20:26:38.320000
title: #275 Upload wheels to pypi
- **assigned_to**: engelbert gruber
---
commenter: gitpull
posted: 2015-04-13 03:04:51.382000
title: #275 Upload wheels to pypi
attachments:
- https://sourceforge.net/p/docutils/bugs/_discuss/thread/633590c8/96f6/6a83/attachment/alternate
How much of a speed increase / benefit does wheel pose to give docutils? To
my understanding the major benefit from wheels is for packages that use C
speedups (such as lxml) that would have lengthy install times due to
compilation.
On Sat, Apr 11, 2015 at 3:26 PM, engelbert gruber <grubert@users.sf.net>
wrote:
>
> - *assigned_to*: engelbert gruber
>
> ------------------------------
>
> * [bugs:#275] <http://sourceforge.net/p/docutils/bugs/275> Upload wheels
> to pypi*
>
> *Status:* open
> *Group:* Default
> *Created:* Wed Apr 08, 2015 06:42 AM UTC by Aragilar
> *Last Updated:* Wed Apr 08, 2015 06:42 AM UTC
> *Owner:* engelbert gruber
>
> Currently docutils does not publish any wheels on pypi. Wheels make
> docutils faster to install (no need to run setup.py, which for a large
> number of packages can take some time), and is no more difficult than
> uploading an sdist (see
> https://packaging.python.org/en/latest/distributing.html#wheels for
> instructions).
> ------------------------------
>
> Sent from sourceforge.net because docutils-develop@lists.sourceforge.net
> is subscribed to https://sourceforge.net/p/docutils/bugs/
>
> To unsubscribe from further messages, a project admin can change settings
> at https://sourceforge.net/p/docutils/admin/bugs/options. Or, if this is
> a mailing list, you can unsubscribe from the mailing list.
>
>
> ------------------------------------------------------------------------------
> BPM Camp - Free Virtual Workshop May 6th at 10am PDT/1PM EDT
> Develop your own process in accordance with the BPMN 2 standard
> Learn Process modeling best practices with Bonita BPM through live
> exercises
> http://www.bonitasoft.com/be-part-of-it/events/bpm-camp-virtual-
> event?utm_
> source=Sourceforge_BPM_Camp_5_6_15&utm_medium=email&utm_campaign=VA_SF
> _______________________________________________
> Docutils-develop mailing list
> Docutils-develop@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/docutils-develop
>
> Please use "Reply All" to reply to the list.
>
>
---
commenter: aragilar
posted: 2015-04-13 05:37:15.238000
title: #275 Upload wheels to pypi
I've found that wheels provide the most benefit when the setup.py is
more complex, so stuff like C extensions, running 2to3 or generating
docs is where having a wheel speeds up installation. I've timed
installing on my laptop from sdist and from wheel using the latest
python 2.7 (py2) and python 3.4 (py3) in Debian unstable. There should
be no issues with different download times, as both the sdist and the
wheels came from a devpi instance running on my laptop.
No wheel & py3:
real 0m50.128s
user 0m49.812s
sys 0m0.224s
No wheel & py2:
real 0m1.877s
user 0m1.548s
sys 0m0.248s
Wheel & py3
real 0m0.867s
user 0m0.708s
sys 0m0.108s
Wheel & py2
real 0m1.058s
user 0m0.916s
sys 0m0.100s
On 13 April 2015 at 13:04, Tony N <gitpull@users.sf.net> wrote:
> How much of a speed increase / benefit does wheel pose to give docutils? To
> my understanding the major benefit from wheels is for packages that use C
> speedups (such as lxml) that would have lengthy install times due to
> compilation.
>
> On Sat, Apr 11, 2015 at 3:26 PM, engelbert gruber grubert@users.sf.net
> wrote:
>
> assigned_to: engelbert gruber
>
> ________________________________
>
> [bugs:#275] http://sourceforge.net/p/docutils/bugs/275 Upload wheels
> to pypi*
>
> Status: open
> Group: Default
> Created: Wed Apr 08, 2015 06:42 AM UTC by Aragilar
> Last Updated: Wed Apr 08, 2015 06:42 AM UTC
> Owner: engelbert gruber
>
> Currently docutils does not publish any wheels on pypi. Wheels make
> docutils faster to install (no need to run setup.py, which for a large
> number of packages can take some time), and is no more difficult than
> uploading an sdist (see
> https://packaging.python.org/en/latest/distributing.html#wheels for
> instructions).
>
> ________________________________
>
> Sent from sourceforge.net because docutils-develop@lists.sourceforge.net
> is subscribed to https://sourceforge.net/p/docutils/bugs/
>
> To unsubscribe from further messages, a project admin can change settings
> at https://sourceforge.net/p/docutils/admin/bugs/options. Or, if this is
> a mailing list, you can unsubscribe from the mailing list.
>
> ________________________________
>
> BPM Camp - Free Virtual Workshop May 6th at 10am PDT/1PM EDT
> Develop your own process in accordance with the BPMN 2 standard
> Learn Process modeling best practices with Bonita BPM through live
> exercises
> http://www.bonitasoft.com/be-part-of-it/events/bpm-camp-virtual-
> event?utm_
> source=Sourceforge_BPM_Camp_5_6_15&utm_medium=email&utm_campaign=VA_SF
>
> ________________________________
>
> Docutils-develop mailing list
> Docutils-develop@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/docutils-develop
>
> Please use "Reply All" to reply to the list.
>
> ________________________________
>
> [bugs:#275] Upload wheels to pypi
>
> Status: open
> Group: Default
> Created: Wed Apr 08, 2015 06:42 AM UTC by Aragilar
> Last Updated: Sat Apr 11, 2015 08:26 PM UTC
> Owner: engelbert gruber
>
> Currently docutils does not publish any wheels on pypi. Wheels make docutils
> faster to install (no need to run setup.py, which for a large number of
> packages can take some time), and is no more difficult than uploading an
> sdist (see https://packaging.python.org/en/latest/distributing.html#wheels
> for instructions).
>
> ________________________________
>
> Sent from sourceforge.net because you indicated interest in
> https://sourceforge.net/p/docutils/bugs/275/
>
> To unsubscribe from further messages, please visit
> https://sourceforge.net/auth/subscriptions/
--
Don't send me files in proprietary formats (.doc(x), .xls, .ppt etc.).
It isn't good enough for Tim Berners-Lee, and it isn't good enough for
me either. For more information visit
http://www.gnu.org/philosophy/no-word-attachments.html.
Truly great madness cannot be achieved without significant intelligence.
- Henrik Tikkanen
If you're not messing with your sanity, you're not having fun.
- James Tocknell
In theory, there is no difference between theory and practice; In
practice, there is.
---
commenter: milde
posted: 2015-04-13 20:21:23.878000
title: #275 Upload wheels to pypi
- **status**: open --> closed-duplicate
---
commenter: milde
posted: 2015-04-13 20:21:24.015000
title: #275 Upload wheels to pypi
This is a duplicate of https://sourceforge.net/p/docutils/feature-requests/43/
index: 1.0
text_combine:
Upload wheels to pypi [SF:bugs:275] -
author: aragilar
created: 2015-04-08 06:42:36.599000
assigned: grubert
SF_url: https://sourceforge.net/p/docutils/bugs/275
Currently docutils does not publish any wheels on pypi. Wheels make docutils faster to install (no need to run setup.py, which for a large number of packages can take some time), and is no more difficult than uploading an sdist (see https://packaging.python.org/en/latest/distributing.html#wheels for instructions).
---
commenter: grubert
posted: 2015-04-11 20:26:38.320000
title: #275 Upload wheels to pypi
- **assigned_to**: engelbert gruber
---
commenter: gitpull
posted: 2015-04-13 03:04:51.382000
title: #275 Upload wheels to pypi
attachments:
- https://sourceforge.net/p/docutils/bugs/_discuss/thread/633590c8/96f6/6a83/attachment/alternate
How much of a speed increase / benefit does wheel pose to give docutils? To
my understanding the major benefit from wheels is for packages that use C
speedups (such as lxml) that would have lengthy install times due to
compilation.
On Sat, Apr 11, 2015 at 3:26 PM, engelbert gruber <grubert@users.sf.net>
wrote:
>
> - *assigned_to*: engelbert gruber
>
> ------------------------------
>
> * [bugs:#275] <http://sourceforge.net/p/docutils/bugs/275> Upload wheels
> to pypi*
>
> *Status:* open
> *Group:* Default
> *Created:* Wed Apr 08, 2015 06:42 AM UTC by Aragilar
> *Last Updated:* Wed Apr 08, 2015 06:42 AM UTC
> *Owner:* engelbert gruber
>
> Currently docutils does not publish any wheels on pypi. Wheels make
> docutils faster to install (no need to run setup.py, which for a large
> number of packages can take some time), and is no more difficult than
> uploading an sdist (see
> https://packaging.python.org/en/latest/distributing.html#wheels for
> instructions).
> ------------------------------
>
> Sent from sourceforge.net because docutils-develop@lists.sourceforge.net
> is subscribed to https://sourceforge.net/p/docutils/bugs/
>
> To unsubscribe from further messages, a project admin can change settings
> at https://sourceforge.net/p/docutils/admin/bugs/options. Or, if this is
> a mailing list, you can unsubscribe from the mailing list.
>
>
> ------------------------------------------------------------------------------
> BPM Camp - Free Virtual Workshop May 6th at 10am PDT/1PM EDT
> Develop your own process in accordance with the BPMN 2 standard
> Learn Process modeling best practices with Bonita BPM through live
> exercises
> http://www.bonitasoft.com/be-part-of-it/events/bpm-camp-virtual-
> event?utm_
> source=Sourceforge_BPM_Camp_5_6_15&utm_medium=email&utm_campaign=VA_SF
> _______________________________________________
> Docutils-develop mailing list
> Docutils-develop@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/docutils-develop
>
> Please use "Reply All" to reply to the list.
>
>
---
commenter: aragilar
posted: 2015-04-13 05:37:15.238000
title: #275 Upload wheels to pypi
I've found that wheels provide the most benefit when the setup.py is
more complex, so stuff like C extensions, running 2to3 or generating
docs is where having a wheel speeds up installation. I've timed
installing on my laptop from sdist and from wheel using the latest
python 2.7 (py2) and python 3.4 (py3) in Debian unstable. There should
be no issues with different download times, as both the sdist and the
wheels came from a devpi instance running on my laptop.
No wheel & py3:
real 0m50.128s
user 0m49.812s
sys 0m0.224s
No wheel & py2:
real 0m1.877s
user 0m1.548s
sys 0m0.248s
Wheel & py3
real 0m0.867s
user 0m0.708s
sys 0m0.108s
Wheel & py2
real 0m1.058s
user 0m0.916s
sys 0m0.100s
On 13 April 2015 at 13:04, Tony N <gitpull@users.sf.net> wrote:
> How much of a speed increase / benefit does wheel pose to give docutils? To
> my understanding the major benefit from wheels is for packages that use C
> speedups (such as lxml) that would have lengthy install times due to
> compilation.
>
> On Sat, Apr 11, 2015 at 3:26 PM, engelbert gruber grubert@users.sf.net
> wrote:
>
> assigned_to: engelbert gruber
>
> ________________________________
>
> [bugs:#275] http://sourceforge.net/p/docutils/bugs/275 Upload wheels
> to pypi*
>
> Status: open
> Group: Default
> Created: Wed Apr 08, 2015 06:42 AM UTC by Aragilar
> Last Updated: Wed Apr 08, 2015 06:42 AM UTC
> Owner: engelbert gruber
>
> Currently docutils does not publish any wheels on pypi. Wheels make
> docutils faster to install (no need to run setup.py, which for a large
> number of packages can take some time), and is no more difficult than
> uploading an sdist (see
> https://packaging.python.org/en/latest/distributing.html#wheels for
> instructions).
>
> ________________________________
>
> Sent from sourceforge.net because docutils-develop@lists.sourceforge.net
> is subscribed to https://sourceforge.net/p/docutils/bugs/
>
> To unsubscribe from further messages, a project admin can change settings
> at https://sourceforge.net/p/docutils/admin/bugs/options. Or, if this is
> a mailing list, you can unsubscribe from the mailing list.
>
> ________________________________
>
> BPM Camp - Free Virtual Workshop May 6th at 10am PDT/1PM EDT
> Develop your own process in accordance with the BPMN 2 standard
> Learn Process modeling best practices with Bonita BPM through live
> exercises
> http://www.bonitasoft.com/be-part-of-it/events/bpm-camp-virtual-
> event?utm_
> source=Sourceforge_BPM_Camp_5_6_15&utm_medium=email&utm_campaign=VA_SF
>
> ________________________________
>
> Docutils-develop mailing list
> Docutils-develop@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/docutils-develop
>
> Please use "Reply All" to reply to the list.
>
> ________________________________
>
> [bugs:#275] Upload wheels to pypi
>
> Status: open
> Group: Default
> Created: Wed Apr 08, 2015 06:42 AM UTC by Aragilar
> Last Updated: Sat Apr 11, 2015 08:26 PM UTC
> Owner: engelbert gruber
>
> Currently docutils does not publish any wheels on pypi. Wheels make docutils
> faster to install (no need to run setup.py, which for a large number of
> packages can take some time), and is no more difficult than uploading an
> sdist (see https://packaging.python.org/en/latest/distributing.html#wheels
> for instructions).
>
> ________________________________
>
> Sent from sourceforge.net because you indicated interest in
> https://sourceforge.net/p/docutils/bugs/275/
>
> To unsubscribe from further messages, please visit
> https://sourceforge.net/auth/subscriptions/
--
Don't send me files in proprietary formats (.doc(x), .xls, .ppt etc.).
It isn't good enough for Tim Berners-Lee, and it isn't good enough for
me either. For more information visit
http://www.gnu.org/philosophy/no-word-attachments.html.
Truly great madness cannot be achieved without significant intelligence.
- Henrik Tikkanen
If you're not messing with your sanity, you're not having fun.
- James Tocknell
In theory, there is no difference between theory and practice; In
practice, there is.
---
commenter: milde
posted: 2015-04-13 20:21:23.878000
title: #275 Upload wheels to pypi
- **status**: open --> closed-duplicate
---
commenter: milde
posted: 2015-04-13 20:21:24.015000
title: #275 Upload wheels to pypi
This is a duplicate of https://sourceforge.net/p/docutils/feature-requests/43/
label: non_test
text:
upload wheels to pypi author aragilar created assigned grubert sf url currently docutils does not publish any wheels on pypi wheels make docutils faster to install no need to run setup py which for a large number of packages can take some time and is no more difficult than uploading an sdist see for instructions commenter grubert posted title upload wheels to pypi assigned to engelbert gruber commenter gitpull posted title upload wheels to pypi attachments how much of a speed increase benefit does wheel pose to give docutils to my understanding the major benefit from wheels is for packages that use c speedups such as lxml that would have lengthy install times due to compilation on sat apr at pm engelbert gruber wrote assigned to engelbert gruber upload wheels to pypi status open group default created wed apr am utc by aragilar last updated wed apr am utc owner engelbert gruber currently docutils does not publish any wheels on pypi wheels make docutils faster to install no need to run setup py which for a large number of packages can take some time and is no more difficult than uploading an sdist see for instructions sent from sourceforge net because docutils develop lists sourceforge net is subscribed to to unsubscribe from further messages a project admin can change settings at or if this is a mailing list you can unsubscribe from the mailing list bpm camp free virtual workshop may at pdt edt develop your own process in accordance with the bpmn standard learn process modeling best practices with bonita bpm through live exercises event utm source sourceforge bpm camp utm medium email utm campaign va sf docutils develop mailing list docutils develop lists sourceforge net please use reply all to reply to the list commenter aragilar posted title upload wheels to pypi i ve found that wheels provide the most benefit when the setup py is more complex so stuff like c extensions running or generating docs is where having a wheel speeds up installation i ve timed installing 
on my laptop from sdist and from wheel using the latest python and python in debian unstable there should be no issues with different download times as both the sdist and the wheels came from a devpi instance running on my laptop no wheel real user sys no wheel real user sys wheel real user sys wheel real user sys on april at tony n wrote how much of a speed increase benefit does wheel pose to give docutils to my understanding the major benefit from wheels is for packages that use c speedups such as lxml that would have lengthy install times due to compilation on sat apr at pm engelbert gruber grubert users sf net wrote assigned to engelbert gruber upload wheels to pypi status open group default created wed apr am utc by aragilar last updated wed apr am utc owner engelbert gruber currently docutils does not publish any wheels on pypi wheels make docutils faster to install no need to run setup py which for a large number of packages can take some time and is no more difficult than uploading an sdist see for instructions sent from sourceforge net because docutils develop lists sourceforge net is subscribed to to unsubscribe from further messages a project admin can change settings at or if this is a mailing list you can unsubscribe from the mailing list bpm camp free virtual workshop may at pdt edt develop your own process in accordance with the bpmn standard learn process modeling best practices with bonita bpm through live exercises event utm source sourceforge bpm camp utm medium email utm campaign va sf docutils develop mailing list docutils develop lists sourceforge net please use reply all to reply to the list upload wheels to pypi status open group default created wed apr am utc by aragilar last updated sat apr pm utc owner engelbert gruber currently docutils does not publish any wheels on pypi wheels make docutils faster to install no need to run setup py which for a large number of packages can take some time and is no more difficult than uploading an sdist 
see for instructions sent from sourceforge net because you indicated interest in to unsubscribe from further messages please visit don t send me files in proprietary formats doc x xls ppt etc it isn t good enough for tim berners lee and it isn t good enough for me either for more information visit truly great madness cannot be achieved without significant intelligence henrik tikkanen if you re not messing with your sanity you re not having fun james tocknell in theory there is no difference between theory and practice in practice there is commenter milde posted title upload wheels to pypi status open closed duplicate commenter milde posted title upload wheels to pypi this is a duplicate of
binary_label: 0