Unnamed: 0 int64 0 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 5 112 | repo_url stringlengths 34 141 | action stringclasses 3 values | title stringlengths 1 844 | labels stringlengths 4 721 | body stringlengths 1 261k | index stringclasses 12 values | text_combine stringlengths 96 261k | label stringclasses 2 values | text stringlengths 96 248k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
13,608 | 9,995,628,047 | IssuesEvent | 2019-07-11 20:46:23 | portainer/portainer | https://api.github.com/repos/portainer/portainer | opened | Add the service rollback feature to Portainer service details view | area/service-details kind/feature | **Is your feature request related to a problem? Please describe.**
Docker CLI has the ability to rollback a service to a previous configuration `docker service rollback [OPTIONS] SERVICE`, this feature is missing from Portainer.
**Describe the solution you'd like**
Add a _rollback the service_ button on the Service details view, that once clicked will rollback the service to its previous configuration
**Example of what it could look like:**

| 1.0 | Add the service rollback feature to Portainer service details view - **Is your feature request related to a problem? Please describe.**
Docker CLI has the ability to rollback a service to a previous configuration `docker service rollback [OPTIONS] SERVICE`, this feature is missing from Portainer.
**Describe the solution you'd like**
Add a _rollback the service_ button on the Service details view, that once clicked will rollback the service to its previous configuration
**Example of what it could look like:**

| non_priority | add the service rollback feature to portainer service details view is your feature request related to a problem please describe docker cli has the ability to rollback a service to a previous configuration docker service rollback service this feature is missing from portainer describe the solution you d like add a rollback the service button on the service details view that once clicked will rollback the service to its previous configuration example of what it could look like | 0 |
54,004 | 11,171,573,729 | IssuesEvent | 2019-12-28 20:55:38 | ariya/phantomjs | https://api.github.com/repos/ariya/phantomjs | closed | PhantomJS crashed during test | Need code Need more information Need reproduction Unclear stale | Result
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
Script1.groovy: 1: expecting EOF, found 'cd07af8' @ line 1, column 11.
cat /tmp/3cd07af8-09f4-b138-13dbcadf-2b66c1f3.dmp
^
1 error
```
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:302)
at org.codehaus.groovy.control.ErrorCollector.addFatalError(ErrorCollector.java:149)
at org.codehaus.groovy.control.ErrorCollector.addError(ErrorCollector.java:119)
at org.codehaus.groovy.control.ErrorCollector.addError(ErrorCollector.java:131)
at org.codehaus.groovy.control.SourceUnit.addError(SourceUnit.java:359)
at org.codehaus.groovy.antlr.AntlrParserPlugin.transformCSTIntoAST(AntlrParserPlugin.java:142)
at org.codehaus.groovy.antlr.AntlrParserPlugin.parseCST(AntlrParserPlugin.java:108)
at org.codehaus.groovy.control.SourceUnit.parse(SourceUnit.java:236)
at org.codehaus.groovy.control.CompilationUnit$1.call(CompilationUnit.java:161)
at org.codehaus.groovy.control.CompilationUnit.applyToSourceUnits(CompilationUnit.java:846)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:550)
at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:526)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:503)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:302)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:281)
at groovy.lang.GroovyShell.parseClass(GroovyShell.java:731)
at groovy.lang.GroovyShell.parse(GroovyShell.java:743)
at groovy.lang.GroovyShell.evaluate(GroovyShell.java:578)
at groovy.lang.GroovyShell.evaluate(GroovyShell.java:618)
at groovy.lang.GroovyShell.evaluate(GroovyShell.java:589)
at hudson.util.RemotingDiagnostics$Script.call(RemotingDiagnostics.java:139)
at hudson.util.RemotingDiagnostics$Script.call(RemotingDiagnostics.java:111)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:328)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at hudson.remoting.Engine$1$1.run(Engine.java:63)
at java.lang.Thread.run(Thread.java:745)
```
| 1.0 | PhantomJS crashed during test - Result
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
Script1.groovy: 1: expecting EOF, found 'cd07af8' @ line 1, column 11.
cat /tmp/3cd07af8-09f4-b138-13dbcadf-2b66c1f3.dmp
^
1 error
```
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:302)
at org.codehaus.groovy.control.ErrorCollector.addFatalError(ErrorCollector.java:149)
at org.codehaus.groovy.control.ErrorCollector.addError(ErrorCollector.java:119)
at org.codehaus.groovy.control.ErrorCollector.addError(ErrorCollector.java:131)
at org.codehaus.groovy.control.SourceUnit.addError(SourceUnit.java:359)
at org.codehaus.groovy.antlr.AntlrParserPlugin.transformCSTIntoAST(AntlrParserPlugin.java:142)
at org.codehaus.groovy.antlr.AntlrParserPlugin.parseCST(AntlrParserPlugin.java:108)
at org.codehaus.groovy.control.SourceUnit.parse(SourceUnit.java:236)
at org.codehaus.groovy.control.CompilationUnit$1.call(CompilationUnit.java:161)
at org.codehaus.groovy.control.CompilationUnit.applyToSourceUnits(CompilationUnit.java:846)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:550)
at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:526)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:503)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:302)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:281)
at groovy.lang.GroovyShell.parseClass(GroovyShell.java:731)
at groovy.lang.GroovyShell.parse(GroovyShell.java:743)
at groovy.lang.GroovyShell.evaluate(GroovyShell.java:578)
at groovy.lang.GroovyShell.evaluate(GroovyShell.java:618)
at groovy.lang.GroovyShell.evaluate(GroovyShell.java:589)
at hudson.util.RemotingDiagnostics$Script.call(RemotingDiagnostics.java:139)
at hudson.util.RemotingDiagnostics$Script.call(RemotingDiagnostics.java:111)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:328)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at hudson.remoting.Engine$1$1.run(Engine.java:63)
at java.lang.Thread.run(Thread.java:745)
```
| non_priority | phantomjs crashed during test result org codehaus groovy control multiplecompilationerrorsexception startup failed groovy expecting eof found line column cat tmp dmp error at org codehaus groovy control errorcollector failiferrors errorcollector java at org codehaus groovy control errorcollector addfatalerror errorcollector java at org codehaus groovy control errorcollector adderror errorcollector java at org codehaus groovy control errorcollector adderror errorcollector java at org codehaus groovy control sourceunit adderror sourceunit java at org codehaus groovy antlr antlrparserplugin transformcstintoast antlrparserplugin java at org codehaus groovy antlr antlrparserplugin parsecst antlrparserplugin java at org codehaus groovy control sourceunit parse sourceunit java at org codehaus groovy control compilationunit call compilationunit java at org codehaus groovy control compilationunit applytosourceunits compilationunit java at org codehaus groovy control compilationunit dophaseoperation compilationunit java at org codehaus groovy control compilationunit processphaseoperations compilationunit java at org codehaus groovy control compilationunit compile compilationunit java at groovy lang groovyclassloader doparseclass groovyclassloader java at groovy lang groovyclassloader parseclass groovyclassloader java at groovy lang groovyshell parseclass groovyshell java at groovy lang groovyshell parse groovyshell java at groovy lang groovyshell evaluate groovyshell java at groovy lang groovyshell evaluate groovyshell java at groovy lang groovyshell evaluate groovyshell java at hudson util remotingdiagnostics script call remotingdiagnostics java at hudson util remotingdiagnostics script call remotingdiagnostics java at hudson remoting userrequest perform userrequest java at hudson remoting userrequest perform userrequest java at hudson remoting request run request java at hudson remoting interceptingexecutorservice call interceptingexecutorservice java at java util concurrent futuretask run futuretask java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at hudson remoting engine run engine java at java lang thread run thread java | 0 |
89,172 | 17,792,890,083 | IssuesEvent | 2021-08-31 18:20:54 | dotnet/runtime | https://api.github.com/repos/dotnet/runtime | closed | Loader/classloader/regressions/523654/test532654_b/test532654_b.sh failing in CI | area-Codegen-Interpreter-mono | Configuration: `Mono OSX x64 Release @ OSX.1013.Amd64.Open`
Build: https://dev.azure.com/dnceng/public/_build/results?buildId=1303294&view=ms.vss-test-web.build-test-results-tab&runId=38392814&resultId=102537&paneView=debug
```
corerun(30780,0x7000031af000) malloc: *** error for object 0x7fd599073d78: pointer being freed was not allocated
*** set a breakpoint in malloc_error_break to debug
cmdLine:/private/tmp/helix/working/AE0E0907/w/AE3A09AD/e/Loader/classloader/regressions/523654/test532654_b/test532654_b.sh Timed Out (timeout in milliseconds: 600000 from variable __TestTimeout, start: 8/18/2021 7:08:37 PM, end: 8/18/2021 7:18:37 PM)
Return code: -100
Raw output file: /tmp/helix/working/AE0E0907/w/AE3A09AD/uploads/Reports/Loader.classloader/regressions/523654/test532654_b/test532654_b.output.txt
Raw output:
BEGIN EXECUTION
/tmp/helix/working/AE0E0907/p/corerun -p System.Reflection.Metadata.MetadataUpdater.IsSupported=false test532654_b.dll ''
=================================================================
Native Crash Reporting
Got a SIGABRT while executing native code. This usually indicates
a fatal error in the mono runtime or one of the native libraries
used by your application.
=================================================================
Native stacktrace:
0x10bdeebf6 - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : mono_dump_native_crash_info
0x10bd8e48e - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : mono_handle_native_crash
0x10bdee4f2 - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : sigabrt_signal_handler
0x7fff56f3df5a - /usr/lib/system/libsystem_platform.dylib : _sigtramp
0x7fd59840c7f0 - Unknown
0x7fff56cdb1ae - /usr/lib/system/libsystem_c.dylib : abort
0x7fff56dd9822 - /usr/lib/system/libsystem_malloc.dylib : free
0x10be3740e - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : monoeg_g_array_free
0x10be267cf - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : generate
0x10be22917 - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : mono_interp_transform_method
0x10be00ce2 - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : do_transform_method
0x10bdf293c - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : interp_exec_method
0x10bdf03af - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : interp_runtime_invoke
0x10bbff248 - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : mono_runtime_invoke_checked
0x10bc18505 - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : start_wrapper_internal
0x10bc1827e - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : start_wrapper
0x7fff56f47661 - /usr/lib/system/libsystem_pthread.dylib : _pthread_body
0x7fff56f4750d - /usr/lib/system/libsystem_pthread.dylib : _pthread_body
0x7fff56f46bf9 - /usr/lib/system/libsystem_pthread.dylib : thread_start
=================================================================
Telemetry Dumper:
T8: Inside meth<int>
T5: Inside meth<int>
T7: Inside meth<int>
T2: Inside meth<int>
T1: Inside meth<int>
T9: Inside meth<int>
T3: Inside meth<int>
T6: Inside meth<int>
T10: Inside meth<int>
Pkilling 0x123145345970176x from 0x123145354407936x
Pkilling 0x140735600104320x from 0x123145354407936x
Pkilling 0x123145343860736x from 0x123145354407936x
Could not exec mono-hang-watchdog, expected on path '/Users/runner/work/1/s/artifacts/obj/mono/OSX.x64.Release/out/etc/../bin/mono-hang-watchdog' (errno 2)
Entering thread summarizer pause from 0x123145354407936x
Finished thread summarizer pause from 0x123145354407936x.
cmdLine:/private/tmp/helix/working/AE0E0907/w/AE3A09AD/e/Loader/classloader/regressions/523654/test532654_b/test532654_b.sh Timed Out (timeout in milliseconds: 600000 from variable __TestTimeout, start: 8/18/2021 7:08:37 PM, end: 8/18/2021 7:18:37 PM)
Test Harness Exitcode is : -100
To run the test:
set CORE_ROOT=/tmp/helix/working/AE0E0907/p
/private/tmp/helix/working/AE0E0907/w/AE3A09AD/e/Loader/classloader/regressions/523654/test532654_b/test532654_b.sh
Expected: True
Actual: False
``` | 1.0 | Loader/classloader/regressions/523654/test532654_b/test532654_b.sh failing in CI - Configuration: `Mono OSX x64 Release @ OSX.1013.Amd64.Open`
Build: https://dev.azure.com/dnceng/public/_build/results?buildId=1303294&view=ms.vss-test-web.build-test-results-tab&runId=38392814&resultId=102537&paneView=debug
```
corerun(30780,0x7000031af000) malloc: *** error for object 0x7fd599073d78: pointer being freed was not allocated
*** set a breakpoint in malloc_error_break to debug
cmdLine:/private/tmp/helix/working/AE0E0907/w/AE3A09AD/e/Loader/classloader/regressions/523654/test532654_b/test532654_b.sh Timed Out (timeout in milliseconds: 600000 from variable __TestTimeout, start: 8/18/2021 7:08:37 PM, end: 8/18/2021 7:18:37 PM)
Return code: -100
Raw output file: /tmp/helix/working/AE0E0907/w/AE3A09AD/uploads/Reports/Loader.classloader/regressions/523654/test532654_b/test532654_b.output.txt
Raw output:
BEGIN EXECUTION
/tmp/helix/working/AE0E0907/p/corerun -p System.Reflection.Metadata.MetadataUpdater.IsSupported=false test532654_b.dll ''
=================================================================
Native Crash Reporting
Got a SIGABRT while executing native code. This usually indicates
a fatal error in the mono runtime or one of the native libraries
used by your application.
=================================================================
Native stacktrace:
0x10bdeebf6 - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : mono_dump_native_crash_info
0x10bd8e48e - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : mono_handle_native_crash
0x10bdee4f2 - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : sigabrt_signal_handler
0x7fff56f3df5a - /usr/lib/system/libsystem_platform.dylib : _sigtramp
0x7fd59840c7f0 - Unknown
0x7fff56cdb1ae - /usr/lib/system/libsystem_c.dylib : abort
0x7fff56dd9822 - /usr/lib/system/libsystem_malloc.dylib : free
0x10be3740e - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : monoeg_g_array_free
0x10be267cf - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : generate
0x10be22917 - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : mono_interp_transform_method
0x10be00ce2 - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : do_transform_method
0x10bdf293c - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : interp_exec_method
0x10bdf03af - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : interp_runtime_invoke
0x10bbff248 - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : mono_runtime_invoke_checked
0x10bc18505 - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : start_wrapper_internal
0x10bc1827e - /tmp/helix/working/AE0E0907/p/libcoreclr.dylib : start_wrapper
0x7fff56f47661 - /usr/lib/system/libsystem_pthread.dylib : _pthread_body
0x7fff56f4750d - /usr/lib/system/libsystem_pthread.dylib : _pthread_body
0x7fff56f46bf9 - /usr/lib/system/libsystem_pthread.dylib : thread_start
=================================================================
Telemetry Dumper:
T8: Inside meth<int>
T5: Inside meth<int>
T7: Inside meth<int>
T2: Inside meth<int>
T1: Inside meth<int>
T9: Inside meth<int>
T3: Inside meth<int>
T6: Inside meth<int>
T10: Inside meth<int>
Pkilling 0x123145345970176x from 0x123145354407936x
Pkilling 0x140735600104320x from 0x123145354407936x
Pkilling 0x123145343860736x from 0x123145354407936x
Could not exec mono-hang-watchdog, expected on path '/Users/runner/work/1/s/artifacts/obj/mono/OSX.x64.Release/out/etc/../bin/mono-hang-watchdog' (errno 2)
Entering thread summarizer pause from 0x123145354407936x
Finished thread summarizer pause from 0x123145354407936x.
cmdLine:/private/tmp/helix/working/AE0E0907/w/AE3A09AD/e/Loader/classloader/regressions/523654/test532654_b/test532654_b.sh Timed Out (timeout in milliseconds: 600000 from variable __TestTimeout, start: 8/18/2021 7:08:37 PM, end: 8/18/2021 7:18:37 PM)
Test Harness Exitcode is : -100
To run the test:
set CORE_ROOT=/tmp/helix/working/AE0E0907/p
/private/tmp/helix/working/AE0E0907/w/AE3A09AD/e/Loader/classloader/regressions/523654/test532654_b/test532654_b.sh
Expected: True
Actual: False
``` | non_priority | loader classloader regressions b b sh failing in ci configuration mono osx release osx open build corerun malloc error for object pointer being freed was not allocated set a breakpoint in malloc error break to debug cmdline private tmp helix working w e loader classloader regressions b b sh timed out timeout in milliseconds from variable testtimeout start pm end pm return code raw output file tmp helix working w uploads reports loader classloader regressions b b output txt raw output begin execution tmp helix working p corerun p system reflection metadata metadataupdater issupported false b dll native crash reporting got a sigabrt while executing native code this usually indicates a fatal error in the mono runtime or one of the native libraries used by your application native stacktrace tmp helix working p libcoreclr dylib mono dump native crash info tmp helix working p libcoreclr dylib mono handle native crash tmp helix working p libcoreclr dylib sigabrt signal handler usr lib system libsystem platform dylib sigtramp unknown usr lib system libsystem c dylib abort usr lib system libsystem malloc dylib free tmp helix working p libcoreclr dylib monoeg g array free tmp helix working p libcoreclr dylib generate tmp helix working p libcoreclr dylib mono interp transform method tmp helix working p libcoreclr dylib do transform method tmp helix working p libcoreclr dylib interp exec method tmp helix working p libcoreclr dylib interp runtime invoke tmp helix working p libcoreclr dylib mono runtime invoke checked tmp helix working p libcoreclr dylib start wrapper internal tmp helix working p libcoreclr dylib start wrapper usr lib system libsystem pthread dylib pthread body usr lib system libsystem pthread dylib pthread body usr lib system libsystem pthread dylib thread start telemetry dumper inside meth inside meth inside meth inside meth inside meth inside meth inside meth inside meth inside meth pkilling from pkilling from pkilling from could not exec mono hang watchdog expected on path users runner work s artifacts obj mono osx release out etc bin mono hang watchdog errno entering thread summarizer pause from finished thread summarizer pause from cmdline private tmp helix working w e loader classloader regressions b b sh timed out timeout in milliseconds from variable testtimeout start pm end pm test harness exitcode is to run the test set core root tmp helix working p private tmp helix working w e loader classloader regressions b b sh expected true actual false | 0 |
135,966 | 30,453,687,451 | IssuesEvent | 2023-07-16 16:09:29 | JHannTX/angular-practice-food | https://api.github.com/repos/JHannTX/angular-practice-food | opened | Unsubscribe From Observables Where Possible | Code Optimization | There are areas where I didn't unsubscribe from observables, need to look into if it would hurt. It is good practice to unsubscribe from observables. | 1.0 | Unsubscribe From Observables Where Possible - There are areas where I didn't unsubscribe from observables, need to look into if it would hurt. It is good practice to unsubscribe from observables. | non_priority | unsubscribe from observables where possible there are areas where i didn t unsubscribe from observables need to look into if it would hurt it is good practice to unsubscribe from observables | 0 |
243,246 | 18,679,822,007 | IssuesEvent | 2021-11-01 03:05:47 | AY2122S1-CS2113T-W12-2/tp | https://api.github.com/repos/AY2122S1-CS2113T-W12-2/tp | closed | [PE-D] User guide line breaks not rendering correctly | documentation type.Bug severity.VeryLow | 
The line breaks in the FAQ and command summary are not showing correctly.
<!--session: 1635497011155-fc208fd6-b760-43ea-8d2f-31cb1cca642f--><!--Version: Web v3.4.1-->
-------------
Labels: `severity.VeryLow` `type.DocumentationBug`
original: alvintan01/ped#10 | 1.0 | [PE-D] User guide line breaks not rendering correctly - 
The line breaks in the FAQ and command summary are not showing correctly.
<!--session: 1635497011155-fc208fd6-b760-43ea-8d2f-31cb1cca642f--><!--Version: Web v3.4.1-->
-------------
Labels: `severity.VeryLow` `type.DocumentationBug`
original: alvintan01/ped#10 | non_priority | user guide line breaks not rendering correctly the line breaks in the faq and command summary are not showing correctly labels severity verylow type documentationbug original ped | 0 |
20,435 | 11,448,142,518 | IssuesEvent | 2020-02-06 02:14:49 | terraform-providers/terraform-provider-aws | https://api.github.com/repos/terraform-providers/terraform-provider-aws | closed | aws_cloudwatch_log_stream resource produced new value for was present but now absent | bug service/cloudwatch service/cloudwatchlogs | ### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
### Terraform Version
```
$ terraform -v
Terraform v0.12.3
+ provider.aws v2.37.0
```
### Affected Resource(s)
* aws_cloudwatch_log_stream
### Terraform Configuration Files
```hcl
resource "aws_cloudwatch_log_group" "test" {
name = "/test"
}
resource "aws_cloudwatch_log_stream" "test" {
name = "test"
log_group_name = aws_cloudwatch_log_group.test.name
}
```
### Debug Output
Here's a snippet of the debug log when the error occurs. The problem appears to occur when AWS returns a 200 OK with an empty list of `logStreams` for the stream which has just been created. This may be due to an eventual consistency issue that could be overcome by retrying the `DescribeLogStreams` call some number of times until the resource is available.
```
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: 2019/12/30 22:27:14 [DEBUG] [aws-sdk-go] DEBUG: Request logs/DescribeLogStreams Details:
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: ---[ REQUEST POST-SIGN ]-----------------------------
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: POST / HTTP/1.1
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: Host: logs.us-west-2.amazonaws.com
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: User-Agent: aws-sdk-go/1.25.36 (go1.13.3; linux; amd64) APN/1.0 HashiCorp/1.0 Terraform/0.12.3 (+https://www.terraform.io)
...
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: X-Amz-Target: Logs_20140328.DescribeLogStreams
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: Accept-Encoding: gzip
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4:
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: {"logGroupName":"/test","logStreamNamePrefix":"test"}
...
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: ---[ RESPONSE ]--------------------------------------
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: HTTP/1.1 200 OK
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: Connection: close
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: Content-Length: 17
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: Content-Type: application/x-amz-json-1.1
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: Date: Mon, 30 Dec 2019 22:27:14 GMT
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: X-Amzn-Requestid: XXX
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4:
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4:
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: -----------------------------------------------------
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: 2019/12/30 22:27:14 [DEBUG] [aws-sdk-go] {"logStreams":[]}
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: 2019/12/30 22:27:14 [DEBUG] CloudWatch Stream "test" Not Found. Removing from state
2019/12/30 22:27:14 [DEBUG] aws_cloudwatch_log_stream.test: apply errored, but we're indicating that via the Error pointer rather than returning it: Provider produced inconsistent result after apply: When applying changes to aws_cloudwatch_log_stream.test, provider "aws" produced an unexpected new value for was present, but now absent.
```
### Panic Output
### Expected Behavior
The log stream resource should be created without any errors and the `terraform apply` should succeed.
### Actual Behavior
Most of the time, the expected behavior occurs. Intermittently, though, the log stream resource is created in AWS but the `terraform apply` fails with the following error:
```
Error: Provider produced inconsistent result after apply
When applying changes to
aws_cloudwatch_log_stream.test,
provider "aws" produced an unexpected new value for was present, but now
absent.
This is a bug in the provider, which should be reported in the provider's own
issue tracker.
```
Subsequent `terraform apply` attempts fail with the following error:
```
aws_cloudwatch_log_stream.test: Creating...
Error: Creating CloudWatch Log Stream failed: ResourceAlreadyExistsException: The specified log stream already exists
status code: 400, request id: XXX
```
### Steps to Reproduce
1. `terraform apply`
### Important Factoids
### References
* #10549 - The behavior for this case seems similar, only for an `aws_appautoscaling_policy` resource instead of an `aws_cloudwatch_log_stream` one. | 2.0 | aws_cloudwatch_log_stream resource produced new value for was present but now absent - ### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
### Terraform Version
```
$ terraform -v
Terraform v0.12.3
+ provider.aws v2.37.0
```
### Affected Resource(s)
* aws_cloudwatch_log_stream
### Terraform Configuration Files
```hcl
resource "aws_cloudwatch_log_group" "test" {
name = "/test"
}
resource "aws_cloudwatch_log_stream" "test" {
name = "test"
log_group_name = aws_cloudwatch_log_group.test.name
}
```
### Debug Output
Here's a snippet of the debug log when the error occurs. The problem appears to occur when AWS returns a 200 OK with an empty list of `logStreams` for the stream which has just been created. This may be due to an eventual consistency issue that could be overcome by retrying the `DescribeLogStreams` call some number of times until the resource is available.
```
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: 2019/12/30 22:27:14 [DEBUG] [aws-sdk-go] DEBUG: Request logs/DescribeLogStreams Details:
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: ---[ REQUEST POST-SIGN ]-----------------------------
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: POST / HTTP/1.1
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: Host: logs.us-west-2.amazonaws.com
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: User-Agent: aws-sdk-go/1.25.36 (go1.13.3; linux; amd64) APN/1.0 HashiCorp/1.0 Terraform/0.12.3 (+https://www.terraform.io)
...
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: X-Amz-Target: Logs_20140328.DescribeLogStreams
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: Accept-Encoding: gzip
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4:
2019-12-30T22:27:14.323Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: {"logGroupName":"/test","logStreamNamePrefix":"test"}
...
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: ---[ RESPONSE ]--------------------------------------
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: HTTP/1.1 200 OK
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: Connection: close
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: Content-Length: 17
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: Content-Type: application/x-amz-json-1.1
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: Date: Mon, 30 Dec 2019 22:27:14 GMT
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: X-Amzn-Requestid: XXX
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4:
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4:
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: -----------------------------------------------------
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: 2019/12/30 22:27:14 [DEBUG] [aws-sdk-go] {"logStreams":[]}
2019-12-30T22:27:14.644Z [DEBUG] plugin.terraform-provider-aws_v2.37.0_x4: 2019/12/30 22:27:14 [DEBUG] CloudWatch Stream "test" Not Found. Removing from state
2019/12/30 22:27:14 [DEBUG] aws_cloudwatch_log_stream.test: apply errored, but we're indicating that via the Error pointer rather than returning it: Provider produced inconsistent result after apply: When applying changes to aws_cloudwatch_log_stream.test, provider "aws" produced an unexpected new value for was present, but now absent.
```
### Panic Output
### Expected Behavior
The log stream resource should be created without any errors and the `terraform apply` should succeed.
### Actual Behavior
Most of the time, the expected behavior occurs. Intermittently, though, the log stream resource is created in AWS but the `terraform apply` fails with the following error:
```
Error: Provider produced inconsistent result after apply
When applying changes to
aws_cloudwatch_log_stream.test,
provider "aws" produced an unexpected new value for was present, but now
absent.
This is a bug in the provider, which should be reported in the provider's own
issue tracker.
```
Subsequent `terraform apply` attempts fail with the following error:
```
aws_cloudwatch_log_stream.test: Creating...
Error: Creating CloudWatch Log Stream failed: ResourceAlreadyExistsException: The specified log stream already exists
status code: 400, request id: XXX
```
### Steps to Reproduce
1. `terraform apply`
### Important Factoids
### References
* #10549 - The behavior for this case seems similar, only for an `aws_appautoscaling_policy` resource instead of an `aws_cloudwatch_log_stream` one. | non_priority | aws cloudwatch log stream resource produced new value for was present but now absent community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or other comments that do not add relevant new information or questions they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment terraform version terraform v terraform provider aws affected resource s aws cloudwatch log stream terraform configuration files hcl resource aws cloudwatch log group test name test resource aws cloudwatch log stream test name test log group name aws cloudwatch log group test name debug output here s a snippet of the debug log when the error occurs the problem appears to occur when aws returns a ok with an empty list of logstreams for the stream which has just been created this may be due to an eventual consistency issue that could be overcome by retrying the describelogstreams call some number of times until the resource is available plugin terraform provider aws debug request logs describelogstreams details plugin terraform provider aws plugin terraform provider aws post http plugin terraform provider aws host logs us west amazonaws com plugin terraform provider aws user agent aws sdk go linux apn hashicorp terraform plugin terraform provider aws x amz target logs describelogstreams plugin terraform provider aws accept encoding gzip plugin terraform provider aws plugin terraform provider aws loggroupname test logstreamnameprefix test plugin terraform provider aws plugin terraform provider aws http ok plugin terraform provider aws connection close plugin terraform provider aws content length plugin terraform provider aws content type application x amz json plugin terraform provider aws date mon dec gmt plugin terraform provider aws x amzn requestid xxx plugin terraform provider aws plugin terraform provider aws plugin terraform provider aws plugin terraform provider aws logstreams plugin terraform provider aws cloudwatch stream test not found removing from state aws cloudwatch log stream test apply errored but we re indicating that via the error pointer rather than returning it provider produced inconsistent result after apply when applying changes to aws cloudwatch log stream test provider aws produced an unexpected new value for was present but now absent panic output expected behavior the log stream resource should be created without any errors and the terraform apply should succeed actual behavior most of the time the expected behavior occurs intermittently though the log stream resource is created in aws but the terraform apply fails with the following error error provider produced inconsistent result after apply when applying changes to aws cloudwatch log stream test provider aws produced an unexpected new value for was present but now absent this is a bug in the provider which should be reported in the provider s own issue tracker subsequent terraform apply attempts fail with the following error aws cloudwatch log stream test creating error creating cloudwatch log stream failed resourcealreadyexistsexception the specified log stream already exists status code request id xxx steps to reproduce terraform apply important factoids references the behavior for this case seems similar only for an aws appautoscaling policy resource instead of an aws cloudwatch log stream one | 0 |
30,949 | 14,706,470,848 | IssuesEvent | 2021-01-04 19:54:30 | CyclopsMC/IntegratedDynamics | https://api.github.com/repos/CyclopsMC/IntegratedDynamics | closed | Lag spikes with AE2 crafting when connected to logic network | mc-1.12 performance | <!--Thanks in advance for this issue, you're awesome! Please fill in the following template and make sure your title clear and concisely summarizes the issue.-->
#### Issue type:
- :snail: Performance issue <!--Don't change this issue type!-->
____
#### Short description:
AE2 autocrafting lags enormously when it is connected via storage bus to a logic network interface
#### Steps to reproduce the problem:
1. Setup a modest AE2 Autocrafting setup.
2. Place some (or all of the items/fluids) on a logic network with interfaces connected to chests/tanks etc. Attach this system via logic interface/me storage bus to the AE2 System.
3. try to craft something in AE2. After clicking next in the amount window the server thread hangs for a long time
Time can be well over 30 seconds on my FTB Interactions serve
____
#### Versions:
<!--Exact versions of the following mods, not just *latest*.-->
- This mod: 1.1.9
- AE2: rv6-stable7
- Minecraft: 1.12.2
- Forge: 14.23.5.2847
#### Profiler output:
[lagspikeae2idnetwork.zip](https://github.com/CyclopsMC/IntegratedDynamics/files/5718585/lagspikeae2idnetwork.zip)
#### Additional info:
If you want a copy of the world just tell me.
| True | Lag spikes with AE2 crafting when connected to logic network - <!--Thanks in advance for this issue, you're awesome! Please fill in the following template and make sure your title clear and concisely summarizes the issue.-->
#### Issue type:
- :snail: Performance issue <!--Don't change this issue type!-->
____
#### Short description:
AE2 autocrafting lags enormously when it is connected via storage bus to a logic network interface
#### Steps to reproduce the problem:
1. Setup a modest AE2 Autocrafting setup.
2. Place some (or all of the items/fluids) on a logic network with interfaces connected to chests/tanks etc. Attach this system via logic interface/me storage bus to the AE2 System.
3. try to craft something in AE2. After clicking next in the amount window the server thread hangs for a long time
Time can be well over 30 seconds on my FTB Interactions serve
____
#### Versions:
<!--Exact versions of the following mods, not just *latest*.-->
- This mod: 1.1.9
- AE2: rv6-stable7
- Minecraft: 1.12.2
- Forge: 14.23.5.2847
#### Profiler output:
[lagspikeae2idnetwork.zip](https://github.com/CyclopsMC/IntegratedDynamics/files/5718585/lagspikeae2idnetwork.zip)
#### Additional info:
If you want a copy of the world just tell me.
| non_priority | lag spikes with crafting when connected to logic network issue type snail performance issue short description autocrafting lags enormously when it is connected via storage bus to a logic network interface steps to reproduce the problem setup a modest autocrafting setup place some or all of the items fluids on a logic network with interfaces connected to chests tanks etc attach this system via logic interface me storage bus to the system try to craft something in after clicking next in the amount window the server thread hangs for a long time time can be well over seconds on my ftb interactions serve versions this mod minecraft forge profiler output additional info if you want a copy of the world just tell me | 0 |
89,561 | 10,604,060,366 | IssuesEvent | 2019-10-10 17:20:19 | ggerganov/diff-challenge | https://api.github.com/repos/ggerganov/diff-challenge | opened | Automatic PR merging on successful submission | documentation | Making this repo to automatically test the submitted PRs and potentially merge them if they satisfy the challenge requirements was kind of interesting to me so here is a quick summary:
1. Got a cheap [Linode](https://www.linode.com) server
2. Installed [Jenkins](https://jenkins.io) on it
3. In Jenkins - installed the [GitHub Pull Request Builder plugin](https://wiki.jenkins.io/display/JENKINS/GitHub+pull+request+builder+plugin)
4. Created a Jenkins job to run on each pull request and execute the following scripts:
```bash
# require pull-request has single commit
#
if [ "$(git rev-list --count origin/master..HEAD)" != "1" ] ; then
echo "Error: PR must have a single commit"
exit 1
fi
# run the following command in a docker container
#
# $ bash x.sh > diff
#
/home/run.sh
# require non-null output
#
if [ ! -s diff ] ; then
echo "Error: null output"
exit 2
fi
patch -f x.sh < diff
# require output is valid diff
#
if [ ! $? -eq 0 ] ; then
echo "Error: produced patch is invalid"
exit 3
fi
# require the patch reproduced origin/master -- x.sh
#
if [ "$(git diff origin/master -- x.sh)" != "" ] ; then
echo "Error: the patch does not reproduce the original x.sh"
exit 4
fi
# all checks passed - merge the pull-request !
#
/home/merge.sh ${ghprbPullId}
exit 0
```
5. The `/home/run.sh` script runs the submitted `x.sh` script inside a [Docker](https://www.docker.com) container in order to prevent from people running arbitrary code on my server:
```bash
#!/bin/bash
docker create --name sandbox -t ubuntu
docker start sandbox
docker cp ./x.sh sandbox:/x.sh
docker exec sandbox sh ./x.sh > diff
docker stop sandbox
docker rm sandbox
```
6. The `/home/merge.sh` script performs the actual PR merge using the PR number provided by the Jenkins plugin `${ghprbPullId}`:
```bash
#!/bin/bash
GITHUB_TOKEN=XXXXXXXXXXXSECRETXXXXXXXXXXXXXXXXXX
a=$(curl \
-XPUT \
-H "Authorization: token $GITHUB_TOKEN" \
https://api.github.com/repos/ggerganov/diff-challenge/pulls/$1/merge 2>/dev/null | grep merged)
if [ "$a" == "" ] ; then
echo "Merge of PR $1 failed!"
exit 1
fi
echo "Merge of PR $1 successfull!"
exit 0
``` | 1.0 | Automatic PR merging on successful submission - Making this repo to automatically test the submitted PRs and potentially merge them if they satisfy the challenge requirements was kind of interesting to me so here is a quick summary:
1. Got a cheap [Linode](https://www.linode.com) server
2. Installed [Jenkins](https://jenkins.io) on it
3. In Jenkins - installed the [GitHub Pull Request Builder plugin](https://wiki.jenkins.io/display/JENKINS/GitHub+pull+request+builder+plugin)
4. Created a Jenkins job to run on each pull request and execute the following scripts:
```bash
# require pull-request has single commit
#
if [ "$(git rev-list --count origin/master..HEAD)" != "1" ] ; then
echo "Error: PR must have a single commit"
exit 1
fi
# run the following command in a docker container
#
# $ bash x.sh > diff
#
/home/run.sh
# require non-null output
#
if [ ! -s diff ] ; then
echo "Error: null output"
exit 2
fi
patch -f x.sh < diff
# require output is valid diff
#
if [ ! $? -eq 0 ] ; then
echo "Error: produced patch is invalid"
exit 3
fi
# require the patch reproduced origin/master -- x.sh
#
if [ "$(git diff origin/master -- x.sh)" != "" ] ; then
echo "Error: the patch does not reproduce the original x.sh"
exit 4
fi
# all checks passed - merge the pull-request !
#
/home/merge.sh ${ghprbPullId}
exit 0
```
5. The `/home/run.sh` script runs the submitted `x.sh` script inside a [Docker](https://www.docker.com) container in order to prevent from people running arbitrary code on my server:
```bash
#!/bin/bash
docker create --name sandbox -t ubuntu
docker start sandbox
docker cp ./x.sh sandbox:/x.sh
docker exec sandbox sh ./x.sh > diff
docker stop sandbox
docker rm sandbox
```
6. The `/home/merge.sh` script performs the actual PR merge using the PR number provided by the Jenkins plugin `${ghprbPullId}`:
```bash
#!/bin/bash
GITHUB_TOKEN=XXXXXXXXXXXSECRETXXXXXXXXXXXXXXXXXX
a=$(curl \
-XPUT \
-H "Authorization: token $GITHUB_TOKEN" \
https://api.github.com/repos/ggerganov/diff-challenge/pulls/$1/merge 2>/dev/null | grep merged)
if [ "$a" == "" ] ; then
echo "Merge of PR $1 failed!"
exit 1
fi
echo "Merge of PR $1 successfull!"
exit 0
``` | non_priority | automatic pr merging on successful submission making this repo to automatically test the submitted prs and potentially merge them if they satisfy the challenge requirements was kind of interesting to me so here is a quick summary got a cheap server installed on it in jenkins installed the created a jenkins job to run on each pull request and execute the following scripts bash require pull request has single commit if then echo error pr must have a single commit exit fi run the following command in a docker container bash x sh diff home run sh require non null output if then echo error null output exit fi patch f x sh diff require output is valid diff if then echo error produced patch is invalid exit fi require the patch reproduced origin master x sh if then echo error the patch does not reproduce the original x sh exit fi all checks passed merge the pull request home merge sh ghprbpullid exit the home run sh script runs the submitted x sh script inside a container in order to prevent from people running arbitrary code on my server bash bin bash docker create name sandbox t ubuntu docker start sandbox docker cp x sh sandbox x sh docker exec sandbox sh x sh diff docker stop sandbox docker rm sandbox the home merge sh script performs the actual pr merge using the pr number provided by the jenkins plugin ghprbpullid bash bin bash github token xxxxxxxxxxxsecretxxxxxxxxxxxxxxxxxx a curl xput h authorization token github token dev null grep merged if then echo merge of pr failed exit fi echo merge of pr successfull exit | 0 |
20,255 | 26,874,022,498 | IssuesEvent | 2023-02-04 20:31:17 | SatDump/SatDump | https://api.github.com/repos/SatDump/SatDump | opened | L1b product quality flags | enhancement Processing | We need quality flags to make any L2 products, or trust the generated products for research purposes.
- [ ] AVHRR
- [ ] AMSU-A
- [ ] MHS | 1.0 | L1b product quality flags - We need quality flags to make any L2 products, or trust the generated products for research purposes.
- [ ] AVHRR
- [ ] AMSU-A
- [ ] MHS | non_priority | product quality flags we need quality flags to make any products or trust the generated products for research purposes avhrr amsu a mhs | 0 |
7,546 | 18,233,213,115 | IssuesEvent | 2021-10-01 01:29:10 | firstrateconcepts/FusionOfSouls | https://api.github.com/repos/firstrateconcepts/FusionOfSouls | closed | Refactor all Actors and Screens to have proper lifecycle management | architecture | Actors currently mostly build their visuals via constructor. Instead actors should properly draw when added to stage and destroy when removed from stage. In addition, all current timers, tasks, etc should be cancelled on removal
Screens should destroy their stuff on hide which is not currently happening. | 1.0 | Refactor all Actors and Screens to have proper lifecycle management - Actors currently mostly build their visuals via constructor. Instead actors should properly draw when added to stage and destroy when removed from stage. In addition, all current timers, tasks, etc should be cancelled on removal
Screens should destroy their stuff on hide which is not currently happening. | non_priority | refactor all actors and screens to have proper lifecycle management actors currently mostly build their visuals via constructor instead actors should properly draw when added to stage and destroy when removed from stage in addition all current timers tasks etc should be cancelled on removal screens should destroy their stuff on hide which is not currently happening | 0 |
52,314 | 6,609,472,059 | IssuesEvent | 2017-09-19 14:38:37 | unicef/etools-issues | https://api.github.com/repos/unicef/etools-issues | closed | CP Result structures should not be filtered by WBS Marker in Palestine | Backlog enhancement invalid PD/SSFA TOR PMP PMP-Redesign | In Palestine Country Workspace, Only one CP is set up and it is not a primary CP (the WBS is xxxx/PC/xx rather than xxx/A0/xx)
Currently, the "CP Output" drop-down only displays results that are associated with the primary.
The filtering should be switched of for all countries f to allow for data entry. | 1.0 | CP Result structures should not be filtered by WBS Marker in Palestine - In Palestine Country Workspace, Only one CP is set up and it is not a primary CP (the WBS is xxxx/PC/xx rather than xxx/A0/xx)
Currently, the "CP Output" drop-down only displays results that are associated with the primary.
The filtering should be switched of for all countries f to allow for data entry. | non_priority | cp result structures should not be filtered by wbs marker in palestine in palestine country workspace only one cp is set up and it is not a primary cp the wbs is xxxx pc xx rather than xxx xx currently the cp output drop down only displays results that are associated with the primary the filtering should be switched of for all countries f to allow for data entry | 0 |
219,127 | 16,818,963,274 | IssuesEvent | 2021-06-17 10:45:41 | SmartSystemLtd/planningRepo | https://api.github.com/repos/SmartSystemLtd/planningRepo | closed | 3567 Setting mail | documentation | - [x] You need to create mail retail@avto-mechanic.ru
- [x] set up redirection from all seller.NN mails to the created mail ((Configure so that letters arrive at both mails, are not deleted from recipients)
- [x] Send me a list of mails from which redirects were made to the client.
- [x] connect the created mail to Troshin Eugene, RDP - r.map. Laptop-18 (outlook) | 1.0 | 3567 Setting mail - - [x] You need to create mail retail@avto-mechanic.ru
- [x] set up redirection from all seller.NN mails to the created mail ((Configure so that letters arrive at both mails, are not deleted from recipients)
- [x] Send me a list of mails from which redirects were made to the client.
- [x] connect the created mail to Troshin Eugene, RDP - r.map. Laptop-18 (outlook) | non_priority | setting mail you need to create mail retail avto mechanic ru set up redirection from all seller nn mails to the created mail configure so that letters arrive at both mails are not deleted from recipients send me a list of mails from which redirects were made to the client connect the created mail to troshin eugene rdp r map laptop outlook | 0 |
20,412 | 27,072,109,015 | IssuesEvent | 2023-02-14 07:54:03 | bazelbuild/bazel | https://api.github.com/repos/bazelbuild/bazel | closed | Long running Genrule can't finish when "jobs" bigger then 1, thus compilation errors out. | P4 type: support / not a bug (process) team-Rules-CPP stale | ### [BUG?] Description of the problem:
**TLDR**: Long running genrule dependency is not allowed to finish before the compilation of "who depends on it" starts. Thus compilation errors out as the generated files have not been created yet. Genrule takes around 20s with 4 cores to complete.
Maybe not standard procedure but I am trying to compile a non bazel library project once as a dependency of one of my binaries covered under bazel.
For that,
0. Downloaded my project with tf_http_archive
1. I used a genrule that makes a call to cmake and make.
2. Then created a cc_library depend on this genrule output file
3. Then made the binary depend on this library.
If I execute build command with multiple jobs, genrule does not have time to finish, because the compilation of the binary starts straight away.
Scheduling bazel build with one job or having a long queue of actions that delay the build of my binary allows the genrule to finish and the compilation to successfully complete.
This is my first week with bazel and I am not sure how to debug or what is going wrong.
### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
See files below.
### What operating system are you running Bazel on?
Docker tensorflow/tensorflow:nightly-custom-op-ubuntu16 as of Dec/03/2019. bazel 1.1.0
### What's the output of `bazel info release`?
```
INFO: Options provided by the client:
Inherited 'common' options: --isatty=1 --terminal_columns=275
INFO: Reading rc options for 'info' from /working_dir/tensorflow/.bazelrc:
Inherited 'build' options: --apple_platform_type=macos --define framework_shared_object=true --define open_source_build=true --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define=use_fast_cpp_protos=true --define=allow_oversize_protos=true --spawn_strategy=standalone -c opt --announce_rc --define=grpc_no_ares=true --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --config=v2
INFO: Found applicable config definition build:v2 in file /working_dir/tensorflow/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1
INFO: Found applicable config definition build:linux in file /working_dir/tensorflow/.bazelrc: --copt=-w --define=PREFIX=/usr --define=LIBDIR=$(PREFIX)/lib --define=INCLUDEDIR=$(PREFIX)/include --cxxopt=-std=c++14 --host_cxxopt=-std=c++14
release 1.1.0
```
### Have you found anything relevant by searching the web?
Tried the `https://github.com/bazelbuild/rules_foreign_cc` but it does not fit my needs as it is complex to integrate in the framework (not using http_archive but tf_http_archive).
### Files involved:
workspace.bzl
```
tf_http_archive(
name = "systemc",
build_file = clean_dep("//third_party:systemc.BUILD"),
sha256 = "5781b9a351e5afedabc37d145e5f7edec08f3fd5de00ffeb8fa1f3086b1f7b3f",
urls = [
"https://www.accellera.org/images/downloads/standards/systemc/systemc-2.3.3.tar.gz",
"https://www.accellera.org/images/downloads/standards/systemc/systemc-2.3.3.tar.gz",
],
)
```
BUILD
```
cc_binary(
name = "systemc_model",
srcs = [
"systemc_main.sc.cc",
],
deps = [
"@systemc//:systemc",
],
)
```
third_party/systemc.BUILD
```
licenses(["notice"])
package(default_visibility = ["//visibility:public"])
genrule(
name = "libsystemc",
srcs = [],
outs = ["systemc-2.3.3/install/lib/libsystemc.a"],
cmd =
"cmake -DCMAKE_INSTALL_PREFIX=external/systemc/systemc-2.3.3/install -DCMAKE_CXX_STANDARD=14 -DCMAKE_INSTALL_INCLUDEDIR=include/systemc -DBUILD_SHARED_LIBS=off -Bexternal/systemc/systemc-2.3.3/build -Hexternal/systemc/systemc-2.3.3 &&" +
"make -C external/systemc/systemc-2.3.3/build install -j4 &&" +
"cp external/systemc/systemc-2.3.3/install/lib/libsystemc.a $@",
)
cc_library(
name = "systemc",
srcs = ["systemc-2.3.3/install/lib/libsystemc.a"],
hdrs = glob([
"systemc-2.3.3/install/include/system.h",
]),
copts = ["std=c++14"],
data = [":libsystemc"],
includes = [
"systemc-2.3.3/install/include",
"systemc-2.3.3/install/include/systemc",
],
)
```
Success messages when running `bazel build --jobs 1 --explain=file.txt --verbose_explanations tensorflow/lite/tools/systemc:systemc_model`
```
Build options: --apple_platform_type=macos --define='framework_shared_object=true' --define='open_source_build=true' --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define='use_fast_cpp_protos=true' --define='allow_oversize_protos=true' --spawn_strategy=standalone --compilation_mode=opt --announce_rc --define='grpc_no_ares=true' --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --copt=-w --define='PREFIX=/usr' --define='LIBDIR=$(PREFIX)/lib' --define='INCLUDEDIR=$(PREFIX)/include' --cxxopt='-std=c++14' --host_cxxopt='-std=c++14' --config=v2 --define='tf_api_version=2' --action_env='TF2_BEHAVIOR=1' --jobs=1 --explain=file.txt --verbose_explanations
Executing action 'BazelWorkspaceStatusAction stable-status.txt': unconditional execution is requested.
Executing action 'Executing genrule @systemc//:libsystemc': no entry in the cache (action is new).
Executing action 'Creating source manifest for //tensorflow/lite/tools/systemc:systemc_model': no entry in the cache (action is new).
Executing action 'Creating runfiles tree bazel-out/k8-opt/bin/tensorflow/lite/tools/systemc/systemc_model.runfiles': no entry in the cache (action is new).
Executing action 'Writing file tensorflow/lite/tools/systemc/systemc_model-2.params': no entry in the cache (action is new).
Executing action 'Compiling tensorflow/lite/tools/systemc/systemc_main.sc.cc': no entry in the cache (action is new).
Executing action 'Linking tensorflow/lite/tools/systemc/systemc_model': no entry in the cache (action is new).
```
Error messages when running unlimited jobs `bazel build --explain=file.txt --verbose_explanations tensorflow/lite/tools/systemc:systemc_model`
```
Build options: --apple_platform_type=macos --define='framework_shared_object=true' --define='open_source_build=true' --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define='use_fast_cpp_protos=true' --define='allow_oversize_protos=true' --spawn_strategy=standalone --compilation_mode=opt --announce_rc --define='grpc_no_ares=true' --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --copt=-w --define='PREFIX=/usr' --define='LIBDIR=$(PREFIX)/lib' --define='INCLUDEDIR=$(PREFIX)/include' --cxxopt='-std=c++14' --host_cxxopt='-std=c++14' --config=v2 --define='tf_api_version=2' --action_env='TF2_BEHAVIOR=1' --explain=file.txt --verbose_explanations
Executing action 'BazelWorkspaceStatusAction stable-status.txt': unconditional execution is requested.
Executing action 'Creating source manifest for //tensorflow/lite/tools/systemc:systemc_model': no entry in the cache (action is new).
Executing action 'Writing file tensorflow/lite/tools/systemc/systemc_model-2.params': no entry in the cache (action is new).
Executing action 'Executing genrule @systemc//:libsystemc': no entry in the cache (action is new).
Executing action 'Compiling tensorflow/lite/tools/systemc/systemc_main.sc.cc': no entry in the cache (action is new).
Executing action 'Creating runfiles tree bazel-out/k8-opt/bin/tensorflow/lite/tools/systemc/systemc_model.runfiles': no entry in the cache (action is new).
```
and
```
Starting local Bazel server and connecting to it...
INFO: Writing tracer profile to '/home/developer/.cache/bazel/_bazel_developer/881ae704c1434b3a766ac83e64e752e0/command.profile.gz'
INFO: Options provided by the client:
Inherited 'common' options: --isatty=1 --terminal_columns=275
INFO: Reading rc options for 'build' from /working_dir/tensorflow/.bazelrc:
'build' options: --apple_platform_type=macos --define framework_shared_object=true --define open_source_build=true --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define=use_fast_cpp_protos=true --define=allow_oversize_protos=true --spawn_strategy=standalone -c opt --announce_rc --define=grpc_no_ares=true --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --config=v2
INFO: Found applicable config definition build:v2 in file /working_dir/tensorflow/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1
INFO: Found applicable config definition build:linux in file /working_dir/tensorflow/.bazelrc: --copt=-w --define=PREFIX=/usr --define=LIBDIR=$(PREFIX)/lib --define=INCLUDEDIR=$(PREFIX)/include --cxxopt=-std=c++14 --host_cxxopt=-std=c++14
DEBUG: Rule 'io_bazel_rules_docker' indicated that a canonical reproducible form can be obtained by modifying arguments shallow_since = "1556410077 -0400"
DEBUG: Call stack for the definition of repository 'io_bazel_rules_docker' which is a git_repository (rule definition at /home/developer/.cache/bazel/_bazel_developer/881ae704c1434b3a766ac83e64e752e0/external/bazel_tools/tools/build_defs/repo/git.bzl:195:18):
- /home/developer/.cache/bazel/_bazel_developer/881ae704c1434b3a766ac83e64e752e0/external/bazel_toolchains/repositories/repositories.bzl:37:9
- /working_dir/tensorflow/WORKSPACE:37:1
INFO: Analyzed target //tensorflow/lite/tools/systemc:systemc_model (19 packages loaded, 69 targets configured).
INFO: Found 1 target...
INFO: Writing explanation of rebuilds to 'file.txt'
ERROR: /working_dir/tensorflow/tensorflow/lite/tools/systemc/BUILD:22:1: C++ compilation of rule '//tensorflow/lite/tools/systemc:systemc_model' failed (Exit 1)
tensorflow/lite/tools/systemc/systemc_main.sc.cc:3:29: fatal error: systemc/systemc.h: No such file or directory
compilation terminated.
Target //tensorflow/lite/tools/systemc:systemc_model failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 3.761s, Critical Path: 0.04s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
```
| 1.0 | Long running Genrule can't finish when "jobs" bigger then 1, thus compilation errors out. - ### [BUG?] Description of the problem:
**TLDR**: Long running genrule dependency is not allowed to finish before the compilation of "who depends on it" starts. Thus compilation errors out as the generated files have not been created yet. Genrule takes around 20s with 4 cores to complete.
Maybe not standard procedure but I am trying to compile a non bazel library project once as a dependency of one of my binaries covered under bazel.
For that,
0. Downloaded my project with tf_http_archive
1. I used a genrule that makes a call to cmake and make.
2. Then created a cc_library depend on this genrule output file
3. Then made the binary depend on this library.
If I execute build command with multiple jobs, genrule does not have time to finish, because the compilation of the binary starts straight away.
Scheduling bazel build with one job or having a long queue of actions that delay the build of my binary allows the genrule to finish and the compilation to successfully complete.
This is my first week with bazel and I am not sure how to debug or what is going wrong.
### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
See files below.
### What operating system are you running Bazel on?
Docker tensorflow/tensorflow:nightly-custom-op-ubuntu16 as of Dec/03/2019. bazel 1.1.0
### What's the output of `bazel info release`?
```
INFO: Options provided by the client:
Inherited 'common' options: --isatty=1 --terminal_columns=275
INFO: Reading rc options for 'info' from /working_dir/tensorflow/.bazelrc:
Inherited 'build' options: --apple_platform_type=macos --define framework_shared_object=true --define open_source_build=true --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define=use_fast_cpp_protos=true --define=allow_oversize_protos=true --spawn_strategy=standalone -c opt --announce_rc --define=grpc_no_ares=true --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --config=v2
INFO: Found applicable config definition build:v2 in file /working_dir/tensorflow/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1
INFO: Found applicable config definition build:linux in file /working_dir/tensorflow/.bazelrc: --copt=-w --define=PREFIX=/usr --define=LIBDIR=$(PREFIX)/lib --define=INCLUDEDIR=$(PREFIX)/include --cxxopt=-std=c++14 --host_cxxopt=-std=c++14
release 1.1.0
```
### Have you found anything relevant by searching the web?
Tried the `https://github.com/bazelbuild/rules_foreign_cc` but it does not fit my needs as it is complex to integrate in the framework (not using http_archive but tf_http_archive).
### Files involved:
workspace.bzl
```
tf_http_archive(
name = "systemc",
build_file = clean_dep("//third_party:systemc.BUILD"),
sha256 = "5781b9a351e5afedabc37d145e5f7edec08f3fd5de00ffeb8fa1f3086b1f7b3f",
urls = [
"https://www.accellera.org/images/downloads/standards/systemc/systemc-2.3.3.tar.gz",
"https://www.accellera.org/images/downloads/standards/systemc/systemc-2.3.3.tar.gz",
],
)
```
BUILD
```
cc_binary(
name = "systemc_model",
srcs = [
"systemc_main.sc.cc",
],
deps = [
"@systemc//:systemc",
],
)
```
third_party/systemc.BUILD
```
licenses(["notice"])
package(default_visibility = ["//visibility:public"])
genrule(
name = "libsystemc",
srcs = [],
outs = ["systemc-2.3.3/install/lib/libsystemc.a"],
cmd =
"cmake -DCMAKE_INSTALL_PREFIX=external/systemc/systemc-2.3.3/install -DCMAKE_CXX_STANDARD=14 -DCMAKE_INSTALL_INCLUDEDIR=include/systemc -DBUILD_SHARED_LIBS=off -Bexternal/systemc/systemc-2.3.3/build -Hexternal/systemc/systemc-2.3.3 &&" +
"make -C external/systemc/systemc-2.3.3/build install -j4 &&" +
"cp external/systemc/systemc-2.3.3/install/lib/libsystemc.a $@",
)
cc_library(
name = "systemc",
srcs = ["systemc-2.3.3/install/lib/libsystemc.a"],
hdrs = glob([
"systemc-2.3.3/install/include/system.h",
]),
copts = ["std=c++14"],
data = [":libsystemc"],
includes = [
"systemc-2.3.3/install/include",
"systemc-2.3.3/install/include/systemc",
],
)
```
Success messages when running `bazel build --jobs 1 --explain=file.txt --verbose_explanations tensorflow/lite/tools/systemc:systemc_model`
```
Build options: --apple_platform_type=macos --define='framework_shared_object=true' --define='open_source_build=true' --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define='use_fast_cpp_protos=true' --define='allow_oversize_protos=true' --spawn_strategy=standalone --compilation_mode=opt --announce_rc --define='grpc_no_ares=true' --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --copt=-w --define='PREFIX=/usr' --define='LIBDIR=$(PREFIX)/lib' --define='INCLUDEDIR=$(PREFIX)/include' --cxxopt='-std=c++14' --host_cxxopt='-std=c++14' --config=v2 --define='tf_api_version=2' --action_env='TF2_BEHAVIOR=1' --jobs=1 --explain=file.txt --verbose_explanations
Executing action 'BazelWorkspaceStatusAction stable-status.txt': unconditional execution is requested.
Executing action 'Executing genrule @systemc//:libsystemc': no entry in the cache (action is new).
Executing action 'Creating source manifest for //tensorflow/lite/tools/systemc:systemc_model': no entry in the cache (action is new).
Executing action 'Creating runfiles tree bazel-out/k8-opt/bin/tensorflow/lite/tools/systemc/systemc_model.runfiles': no entry in the cache (action is new).
Executing action 'Writing file tensorflow/lite/tools/systemc/systemc_model-2.params': no entry in the cache (action is new).
Executing action 'Compiling tensorflow/lite/tools/systemc/systemc_main.sc.cc': no entry in the cache (action is new).
Executing action 'Linking tensorflow/lite/tools/systemc/systemc_model': no entry in the cache (action is new).
```
Error messages when running unlimited jobs `bazel build --explain=file.txt --verbose_explanations tensorflow/lite/tools/systemc:systemc_model`
```
Build options: --apple_platform_type=macos --define='framework_shared_object=true' --define='open_source_build=true' --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define='use_fast_cpp_protos=true' --define='allow_oversize_protos=true' --spawn_strategy=standalone --compilation_mode=opt --announce_rc --define='grpc_no_ares=true' --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --copt=-w --define='PREFIX=/usr' --define='LIBDIR=$(PREFIX)/lib' --define='INCLUDEDIR=$(PREFIX)/include' --cxxopt='-std=c++14' --host_cxxopt='-std=c++14' --config=v2 --define='tf_api_version=2' --action_env='TF2_BEHAVIOR=1' --explain=file.txt --verbose_explanations
Executing action 'BazelWorkspaceStatusAction stable-status.txt': unconditional execution is requested.
Executing action 'Creating source manifest for //tensorflow/lite/tools/systemc:systemc_model': no entry in the cache (action is new).
Executing action 'Writing file tensorflow/lite/tools/systemc/systemc_model-2.params': no entry in the cache (action is new).
Executing action 'Executing genrule @systemc//:libsystemc': no entry in the cache (action is new).
Executing action 'Compiling tensorflow/lite/tools/systemc/systemc_main.sc.cc': no entry in the cache (action is new).
Executing action 'Creating runfiles tree bazel-out/k8-opt/bin/tensorflow/lite/tools/systemc/systemc_model.runfiles': no entry in the cache (action is new).
```
and
```
Starting local Bazel server and connecting to it...
INFO: Writing tracer profile to '/home/developer/.cache/bazel/_bazel_developer/881ae704c1434b3a766ac83e64e752e0/command.profile.gz'
INFO: Options provided by the client:
Inherited 'common' options: --isatty=1 --terminal_columns=275
INFO: Reading rc options for 'build' from /working_dir/tensorflow/.bazelrc:
'build' options: --apple_platform_type=macos --define framework_shared_object=true --define open_source_build=true --java_toolchain=//third_party/toolchains/java:tf_java_toolchain --host_java_toolchain=//third_party/toolchains/java:tf_java_toolchain --define=use_fast_cpp_protos=true --define=allow_oversize_protos=true --spawn_strategy=standalone -c opt --announce_rc --define=grpc_no_ares=true --noincompatible_remove_legacy_whole_archive --enable_platform_specific_config --config=v2
INFO: Found applicable config definition build:v2 in file /working_dir/tensorflow/.bazelrc: --define=tf_api_version=2 --action_env=TF2_BEHAVIOR=1
INFO: Found applicable config definition build:linux in file /working_dir/tensorflow/.bazelrc: --copt=-w --define=PREFIX=/usr --define=LIBDIR=$(PREFIX)/lib --define=INCLUDEDIR=$(PREFIX)/include --cxxopt=-std=c++14 --host_cxxopt=-std=c++14
DEBUG: Rule 'io_bazel_rules_docker' indicated that a canonical reproducible form can be obtained by modifying arguments shallow_since = "1556410077 -0400"
DEBUG: Call stack for the definition of repository 'io_bazel_rules_docker' which is a git_repository (rule definition at /home/developer/.cache/bazel/_bazel_developer/881ae704c1434b3a766ac83e64e752e0/external/bazel_tools/tools/build_defs/repo/git.bzl:195:18):
- /home/developer/.cache/bazel/_bazel_developer/881ae704c1434b3a766ac83e64e752e0/external/bazel_toolchains/repositories/repositories.bzl:37:9
- /working_dir/tensorflow/WORKSPACE:37:1
INFO: Analyzed target //tensorflow/lite/tools/systemc:systemc_model (19 packages loaded, 69 targets configured).
INFO: Found 1 target...
INFO: Writing explanation of rebuilds to 'file.txt'
ERROR: /working_dir/tensorflow/tensorflow/lite/tools/systemc/BUILD:22:1: C++ compilation of rule '//tensorflow/lite/tools/systemc:systemc_model' failed (Exit 1)
tensorflow/lite/tools/systemc/systemc_main.sc.cc:3:29: fatal error: systemc/systemc.h: No such file or directory
compilation terminated.
Target //tensorflow/lite/tools/systemc:systemc_model failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 3.761s, Critical Path: 0.04s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
```
| non_priority | long running genrule can t finish when jobs bigger then thus compilation errors out description of the problem tldr long running genrule dependency is not allowed to finish before the compilation of who depends on it starts thus compilation errors out as the generated files have not been created yet genrule takes around with cores to complete maybe not standard procedure but i am trying to compile a non bazel library project once as a dependency of one of my binaries covered under bazel for that downloaded my project with tf http archive i used a genrule that makes a call to cmake and make then created a cc library depend on this genrule output file then made the binary depend on this library if i execute build command with multiple jobs genrule does not have time to finish because the compilation of the binary starts straight away scheduling bazel build with one job or having a long queue of actions that delay the build of my binary allows the genrule to finish and the compilation to successfully complete this is my first week with bazel and i am not sure how to debug or what is going wrong bugs what s the simplest easiest way to reproduce this bug please provide a minimal example if possible see files below what operating system are you running bazel on docker tensorflow tensorflow nightly custom op as of dec bazel what s the output of bazel info release info options provided by the client inherited common options isatty terminal columns info reading rc options for info from working dir tensorflow bazelrc inherited build options apple platform type macos define framework shared object true define open source build true java toolchain third party toolchains java tf java toolchain host java toolchain third party toolchains java tf java toolchain define use fast cpp protos true define allow oversize protos true spawn strategy standalone c opt announce rc define grpc no ares true noincompatible remove legacy whole archive enable platform specific config config info found applicable config definition build in file working dir tensorflow bazelrc define tf api version action env behavior info found applicable config definition build linux in file working dir tensorflow bazelrc copt w define prefix usr define libdir prefix lib define includedir prefix include cxxopt std c host cxxopt std c release have you found anything relevant by searching the web tried the but it does not fit my needs as it is complex to integrate in the framework not using http archive but tf http archive files involved workspace bzl tf http archive name systemc build file clean dep third party systemc build urls build cc binary name systemc model srcs systemc main sc cc deps systemc systemc third party systemc build licenses package default visibility genrule name libsystemc srcs outs cmd cmake dcmake install prefix external systemc systemc install dcmake cxx standard dcmake install includedir include systemc dbuild shared libs off bexternal systemc systemc build hexternal systemc systemc make c external systemc systemc build install cp external systemc systemc install lib libsystemc a cc library name systemc srcs hdrs glob systemc install include system h copts data includes systemc install include systemc install include systemc success messages when running bazel build jobs explain file txt verbose explanations tensorflow lite tools systemc systemc model build options apple platform type macos define framework shared object true define open source build true java toolchain third party toolchains java tf java toolchain host java toolchain third party toolchains java tf java toolchain define use fast cpp protos true define allow oversize protos true spawn strategy standalone compilation mode opt announce rc define grpc no ares true noincompatible remove legacy whole archive enable platform specific config copt w define prefix usr define libdir prefix lib define includedir prefix include cxxopt std c host cxxopt std c config define tf api version action env behavior jobs explain file txt verbose explanations executing action bazelworkspacestatusaction stable status txt unconditional execution is requested executing action executing genrule systemc libsystemc no entry in the cache action is new executing action creating source manifest for tensorflow lite tools systemc systemc model no entry in the cache action is new executing action creating runfiles tree bazel out opt bin tensorflow lite tools systemc systemc model runfiles no entry in the cache action is new executing action writing file tensorflow lite tools systemc systemc model params no entry in the cache action is new executing action compiling tensorflow lite tools systemc systemc main sc cc no entry in the cache action is new executing action linking tensorflow lite tools systemc systemc model no entry in the cache action is new error messages when running unlimited jobs bazel build explain file txt verbose explanations tensorflow lite tools systemc systemc model build options apple platform type macos define framework shared object true define open source build true java toolchain third party toolchains java tf java toolchain host java toolchain third party toolchains java tf java toolchain define use fast cpp protos true define allow oversize protos true spawn strategy standalone compilation mode opt announce rc define grpc no ares true noincompatible remove legacy whole archive enable platform specific config copt w define prefix usr define libdir prefix lib define includedir prefix include cxxopt std c host cxxopt std c config define tf api version action env behavior explain file txt verbose explanations executing action bazelworkspacestatusaction stable status txt unconditional execution is requested executing action creating source manifest for tensorflow lite tools systemc systemc model no entry in the cache action is new executing action writing file tensorflow lite tools systemc systemc model params no entry in the cache action is new executing action executing genrule systemc libsystemc no entry in the cache action is new executing action compiling tensorflow lite tools systemc systemc main sc cc no entry in the cache action is new executing action creating runfiles tree bazel out opt bin tensorflow lite tools systemc systemc model runfiles no entry in the cache action is new and starting local bazel server and connecting to it info writing tracer profile to home developer cache bazel bazel developer command profile gz info options provided by the client inherited common options isatty terminal columns info reading rc options for build from working dir tensorflow bazelrc build options apple platform type macos define framework shared object true define open source build true java toolchain third party toolchains java tf java toolchain host java toolchain third party toolchains java tf java toolchain define use fast cpp protos true define allow oversize protos true spawn strategy standalone c opt announce rc define grpc no ares true noincompatible remove legacy whole archive enable platform specific config config info found applicable config definition build in file working dir tensorflow bazelrc define tf api version action env behavior info found applicable config definition build linux in file working dir tensorflow bazelrc copt w define prefix usr define libdir prefix lib define includedir prefix include cxxopt std c host cxxopt std c debug rule io bazel rules docker indicated that a canonical reproducible form can be obtained by modifying arguments shallow since debug call stack for the definition of repository io bazel rules docker which is a git repository rule definition at home developer cache bazel bazel developer external bazel tools tools build defs repo git bzl home developer cache bazel bazel developer external bazel toolchains repositories repositories bzl working dir tensorflow workspace info analyzed target tensorflow lite tools systemc systemc model packages loaded targets configured info found target info writing explanation of rebuilds to file txt error working dir tensorflow tensorflow lite tools systemc build c compilation of rule tensorflow lite tools systemc systemc model failed exit tensorflow lite tools systemc systemc main sc cc fatal error systemc systemc h no such file or directory compilation terminated target tensorflow lite tools systemc systemc model failed to build use verbose failures to see the command lines of failed build steps info elapsed time critical path info processes failed build did not complete successfully | 0 |
259,731 | 22,534,683,403 | IssuesEvent | 2022-06-25 03:26:55 | treeverse/lakeFS | https://api.github.com/repos/treeverse/lakeFS | opened | [Nice to have] Dump/Restore Refs Tests | area/testing team/versioning-engine nice-to-have area/KV | Though not part of gravler, this functionality is heavily dependent on gravlere and its usage of the underlying store. Having tests for these will add to our coverage and performance testing, as they are very exhaustive
**DoD**
DempRefs and RestoreRefs tests over SQL and KV | 1.0 | [Nice to have] Dump/Restore Refs Tests - Though not part of gravler, this functionality is heavily dependent on gravlere and its usage of the underlying store. Having tests for these will add to our coverage and performance testing, as they are very exhaustive
**DoD**
DempRefs and RestoreRefs tests over SQL and KV | non_priority | dump restore refs tests though not part of gravler this functionality is heavily dependent on gravlere and its usage of the underlying store having tests for these will add to our coverage and performance testing as they are very exhaustive dod demprefs and restorerefs tests over sql and kv | 0 |
125,077 | 16,718,672,133 | IssuesEvent | 2021-06-10 02:53:03 | cocoa-mhlw/cocoa | https://api.github.com/repos/cocoa-mhlw/cocoa | closed | ヘッダロゴのスペルミス (Welfar) | bug design released | **Describe the bug**
ヘッダロゴ中、厚生労働省さんの英語名表記にスペルミス (Welfar) があります。
**To Reproduce**
1. COCOA を起動する
2. メインページ上部のヘッダロゴを見る
**Expected behavior**
正しいスペル (Welfare) でロゴが表示される。
**Smartphone (please complete the following information):**
- Device: Essential Phone PH-1
- OS: Android 10
- Version: 1.2.2
----
Internal Tracking ID: BUG 1763
| 1.0 | ヘッダロゴのスペルミス (Welfar) - **Describe the bug**
ヘッダロゴ中、厚生労働省さんの英語名表記にスペルミス (Welfar) があります。
**To Reproduce**
1. COCOA を起動する
2. メインページ上部のヘッダロゴを見る
**Expected behavior**
正しいスペル (Welfare) でロゴが表示される。
**Smartphone (please complete the following information):**
- Device: Essential Phone PH-1
- OS: Android 10
- Version: 1.2.2
----
Internal Tracking ID: BUG 1763
| non_priority | ヘッダロゴのスペルミス welfar describe the bug ヘッダロゴ中、厚生労働省さんの英語名表記にスペルミス welfar があります。 to reproduce cocoa を起動する メインページ上部のヘッダロゴを見る expected behavior 正しいスペル welfare でロゴが表示される。 smartphone please complete the following information device essential phone ph os android version internal tracking id bug | 0 |
220,372 | 24,564,965,090 | IssuesEvent | 2022-10-13 01:29:51 | Baneeishaque/ask-med-pharma_Wordpress | https://api.github.com/repos/Baneeishaque/ask-med-pharma_Wordpress | closed | CVE-2018-11697 (High) detected in multiple libraries - autoclosed | security vulnerability | ## CVE-2018-11697 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.12.0.tgz</b>, <b>node-sass0bd48bbad6fccb0da16d3bdf76ad541f5f45ec70</b>, <b>CSS::Sassv3.6.0</b></p></summary>
<p>
<details><summary><b>node-sass-4.12.0.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.12.0.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.12.0.tgz</a></p>
<p>Path to dependency file: /wp-content/themes/twentynineteen/package.json</p>
<p>Path to vulnerable library: /wp-content/themes/twentynineteen/node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-sass-4.12.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/Baneeishaque/ask-med-pharma_Wordpress/commit/6464f631e15c1e1c888b6d505d934cf06ab4d0a2">6464f631e15c1e1c888b6d505d934cf06ab4d0a2</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in LibSass through 3.5.4. An out-of-bounds read of a memory region was found in the function Sass::Prelexer::exactly() which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service.
<p>Publish Date: 2018-06-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11697>CVE-2018-11697</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2018-06-04</p>
<p>Fix Resolution: 4.14.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2018-11697 (High) detected in multiple libraries - autoclosed - ## CVE-2018-11697 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.12.0.tgz</b>, <b>node-sass0bd48bbad6fccb0da16d3bdf76ad541f5f45ec70</b>, <b>CSS::Sassv3.6.0</b></p></summary>
<p>
<details><summary><b>node-sass-4.12.0.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.12.0.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.12.0.tgz</a></p>
<p>Path to dependency file: /wp-content/themes/twentynineteen/package.json</p>
<p>Path to vulnerable library: /wp-content/themes/twentynineteen/node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-sass-4.12.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/Baneeishaque/ask-med-pharma_Wordpress/commit/6464f631e15c1e1c888b6d505d934cf06ab4d0a2">6464f631e15c1e1c888b6d505d934cf06ab4d0a2</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in LibSass through 3.5.4. An out-of-bounds read of a memory region was found in the function Sass::Prelexer::exactly() which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service.
<p>Publish Date: 2018-06-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11697>CVE-2018-11697</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2018-06-04</p>
<p>Fix Resolution: 4.14.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in multiple libraries autoclosed cve high severity vulnerability vulnerable libraries node sass tgz node css node sass tgz wrapper around libsass library home page a href path to dependency file wp content themes twentynineteen package json path to vulnerable library wp content themes twentynineteen node modules node sass package json dependency hierarchy x node sass tgz vulnerable library found in head commit a href found in base branch master vulnerability details an issue was discovered in libsass through an out of bounds read of a memory region was found in the function sass prelexer exactly which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution step up your open source security game with mend | 0 |
58,099 | 24,329,394,370 | IssuesEvent | 2022-09-30 17:51:37 | hashicorp/terraform-provider-aws | https://api.github.com/repos/hashicorp/terraform-provider-aws | closed | Terraform doesn't detect that cloud formation stack was created successfully | bug service/cloudformation stale | <!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Terraform Version
`terraform 0.11.10`
`provider.aws: version = "~> 1.50"`
### Affected Resource(s)
<!--- Please list the affected resources and data sources. --->
* aws_cloudformation_stack
### Terraform Configuration Files
<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->
```hcl
resource "aws_cloudformation_stack" "MyStack" {
name = "MyStack-${var.deployment}"
parameters {
PARAMS GO HERE...
}
timeouts = {
create = "60m"
update = "60m"
delete = "10m"
}
template_url = "${var.template_url}"
on_failure = "${var.on_stack_creation_failure}"
capabilities = ["CAPABILITY_IAM"]
tags = "${var.default_tags}"
}
```
### Debug Output
Terraform doesn't detect that stack was actually created successfully
### Panic Output
None
<!--- If Terraform produced a panic, please provide a link to a GitHub Gist containing the output of the `crash.log`. --->
### Expected Behavior
It should detect that cloud formation stack was created successfully and continue.
### Actual Behavior
It throws a time out error that the cloud formation stack wasn't created in time, outputting the following message:
> * aws_cloudformation_stack.MyStack: timeout while waiting for state to become 'CREATE_COMPLETE, CREATE_FAILED, DELETE_COMPLETE, DELETE_FAILED, ROLLBACK_COMPLETE, ROLLBACK_FAILED' (last state: 'CREATE_IN_PROGRESS', timeout: 1h5m0s) | 1.0 | Terraform doesn't detect that cloud formation stack was created successfully - <!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Terraform Version
`terraform 0.11.10`
`provider.aws: version = "~> 1.50"`
### Affected Resource(s)
<!--- Please list the affected resources and data sources. --->
* aws_cloudformation_stack
### Terraform Configuration Files
<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->
```hcl
resource "aws_cloudformation_stack" "MyStack" {
name = "MyStack-${var.deployment}"
parameters {
PARAMS GO HERE...
}
timeouts = {
create = "60m"
update = "60m"
delete = "10m"
}
template_url = "${var.template_url}"
on_failure = "${var.on_stack_creation_failure}"
capabilities = ["CAPABILITY_IAM"]
tags = "${var.default_tags}"
}
```
### Debug Output
Terraform doesn't detect that stack was actually created successfully
### Panic Output
None
<!--- If Terraform produced a panic, please provide a link to a GitHub Gist containing the output of the `crash.log`. --->
### Expected Behavior
It should detect that cloud formation stack was created successfully and continue.
### Actual Behavior
It throws a time out error that the cloud formation stack wasn't created in time, outputting the following message:
> * aws_cloudformation_stack.MyStack: timeout while waiting for state to become 'CREATE_COMPLETE, CREATE_FAILED, DELETE_COMPLETE, DELETE_FAILED, ROLLBACK_COMPLETE, ROLLBACK_FAILED' (last state: 'CREATE_IN_PROGRESS', timeout: 1h5m0s) | non_priority | terraform doesn t detect that cloud formation stack was created successfully community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment terraform version terraform provider aws version affected resource s aws cloudformation stack terraform configuration files hcl resource aws cloudformation stack mystack name mystack var deployment parameters params go here timeouts create update delete template url var template url on failure var on stack creation failure capabilities tags var default tags debug output terraform doesn t detect that stack was actually created successfully panic output none expected behavior it should detect that cloud formation stack was created successfully and continue actual behavior it throws a time out error that the cloud formation stack wasn t created in time outputting the following message aws cloudformation stack mystack timeout while waiting for state to become create complete create failed delete complete delete failed rollback complete rollback failed last state create in progress timeout | 0 |
51,561 | 27,142,083,605 | IssuesEvent | 2023-02-16 17:02:02 | dotnet/roslyn | https://api.github.com/repos/dotnet/roslyn | closed | Background analysis performance improvements | Area-Analyzers Feature Request Tenet-Performance Concept-Continuous Improvement | **Issue:** Recent performance measurements for background analysis have shown that for very large files and for large number of open files in a VS session, background analysis leads to a steady increase in memory pressure in the OOP Roslyn ServiceHub process. Majority of the allocations and memory usage is coming from the partial analysis state tracking done in CompilationWithAnalyzers for open file analysis: https://sourceroslyn.io/#Microsoft.CodeAnalysis/DiagnosticAnalyzer/AnalysisState.cs,c6a89afa028d81d8. We also have few feedback reports for the same, for example https://developercommunity.visualstudio.com/t/Code-Analyzer-crashes-for-a-large-file/10016304 shows 50% of memory usage in OOP Roslyn ServiceHub process coming from AnalysisState objects.
**Historical context:** Prior to moving analyzer execution to the OOP Roslyn ServiceHub process, for background analysis, we used to execute all analyzers one by one on each open file to compute diagnostics. This was done as a response to high GC and many UI delays while typing that stemmed from attempting to run all analyzers concurrently inside the devenv.exe process. However, this in-proc sequential analyzer execution led to repeated forking of the compilation inside CompilationWithAnalyzers instance for every analyzer being executed, which caused high allocations and increase in memory pressure. To avoid this, we introduced analyzer state tracking within CompilationWithAnalyzers, whereby we re-use the same underlying compilation for computing all analyzer diagnostics, even if asked sequentially by the CompilationWithAnalyzers host, but need to keep track of which analyzer has completed callbacks for which symbol/node/operation. This state tracking is needed to avoid duplicate callbacks into the analyzer for the same symbol/node/operation, which is needed for functional correctness. However, we need to keep dictionaries and hashsets storing references to all symbol/node/operation executed for corresponding compilation events, which increases the memory pressure. This also affects execution speed as we need to acquire locks for reading and writing into these data structures prior to every callback.
**Potential perf improvements:** With the move to analyzers being executed completely in OOP Roslyn ServiceHub process, we now execute all the analyzers concurrently in OOP. We do execute the compiler analyzer separately upfront though, to ensure that compiler diagnostics get refreshed before we compute analyzer diagnostics. We basically have 4 calls into `CompilationWithAnalyzers`: compiler syntax diagnostics, compiler semantic diagnostics, analyzer syntax diagnostics, analyzer semantic diagnostics. This means lot of reasoning why the partial analyzer state tracking logic was added into `CompilationWithAnalyzers` may no longer be required. We can use the following implementation logic:
1. For compiler syntax and semantic diagnostics, use the [underlying compilation](https://sourceroslyn.io/#Microsoft.CodeAnalysis/DiagnosticAnalyzer/CompilationWithAnalyzers.cs,23) to compute diagnostics without any state tracking.
2. For analyzer diagnostics, we cannot re-use the same [underlying compilation](https://sourceroslyn.io/#Microsoft.CodeAnalysis/DiagnosticAnalyzer/CompilationWithAnalyzers.cs,23) without state tracking, so we just fork the compilation for computing analyzer diagnostics and do not re-use this forked compilation for any other calls into `CompilationWithAnalyzers`. Forking compilation should be fine here as we only make 2 calls into `CompilationWithAnalyzers` for all analyzer syntax and semantic diagnostics. In case a cancellation happens during analyzer execution and the host re-queries for the diagnostics for the same tree, we will use a new forked compilation, so we don't run into the risk of duplicate callbacks into the analyzer for the same compilation.
3. [AnalysisResultBuilder](https://sourceroslyn.io/#Microsoft.CodeAnalysis/DiagnosticAnalyzer/CompilationWithAnalyzers.cs,47) within `CompilationWithAnalyzers` will be changed to de-dupe reported analyzer diagnostics.
I prototyped the above changes over the last week and see significant memory improvements in allocations and peak memory pressure for background analysis. Steady state memory usage for the OOP Roslyn ServiceHub process reduces by almost 30%-40% on opening bunch of files in the C# compiler project in Roslyn.sln, all of this savings coming from no longer holding AnalysisState objects in memory for current project. I also notice vast reduction in allocations from background analysis. I haven't measured the execution time differences, but something I plan to carry out this week. Another big plus here is we can delete a large amount of code and reduce maintenance costs for a part of code base that has had very subtle race conditions over time.
**Fallback approach:** If the above approach of completely getting rid of AnalysisState for partial state tracking does not work out for some reason (perf or functionality), then we can experiment with optimizing memory usage in AnalysisState by tracking spans instead of syntax nodes and operations. This fallback strategy should also lead to good amount of performance improvements. | True | Background analysis performance improvements - **Issue:** Recent performance measurements for background analysis have shown that for very large files and for large number of open files in a VS session, background analysis leads to a steady increase in memory pressure in the OOP Roslyn ServiceHub process. Majority of the allocations and memory usage is coming from the partial analysis state tracking done in CompilationWithAnalyzers for open file analysis: https://sourceroslyn.io/#Microsoft.CodeAnalysis/DiagnosticAnalyzer/AnalysisState.cs,c6a89afa028d81d8. We also have few feedback reports for the same, for example https://developercommunity.visualstudio.com/t/Code-Analyzer-crashes-for-a-large-file/10016304 shows 50% of memory usage in OOP Roslyn ServiceHub process coming from AnalysisState objects.
**Historical context:** Prior to moving analyzer execution to the OOP Roslyn ServiceHub process, for background analysis, we used to execute all analyzers one by one on each open file to compute diagnostics. This was done as a response to high GC and many UI delays while typing that stemmed from attempting to run all analyzers concurrently inside the devenv.exe process. However, this in-proc sequential analyzer execution led to repeated forking of the compilation inside CompilationWithAnalyzers instance for every analyzer being executed, which caused high allocations and increase in memory pressure. To avoid this, we introduced analyzer state tracking within CompilationWithAnalyzers, whereby we re-use the same underlying compilation for computing all analyzer diagnostics, even if asked sequentially by the CompilationWithAnalyzers host, but need to keep track of which analyzer has completed callbacks for which symbol/node/operation. This state tracking is needed to avoid duplicate callbacks into the analyzer for the same symbol/node/operation, which is needed for functional correctness. However, we need to keep dictionaries and hashsets storing references to all symbol/node/operation executed for corresponding compilation events, which increases the memory pressure. This also affects execution speed as we need to acquire locks for reading and writing into these data structures prior to every callback.
**Potential perf improvements:** With the move to analyzers being executed completely in OOP Roslyn ServiceHub process, we now execute all the analyzers concurrently in OOP. We do execute the compiler analyzer separately upfront though, to ensure that compiler diagnostics get refreshed before we compute analyzer diagnostics. We basically have 4 calls into `CompilationWithAnalyzers`: compiler syntax diagnostics, compiler semantic diagnostics, analyzer syntax diagnostics, analyzer semantic diagnostics. This means lot of reasoning why the partial analyzer state tracking logic was added into `CompilationWithAnalyzers` may no longer be required. We can use the following implementation logic:
1. For compiler syntax and semantic diagnostics, use the [underlying compilation](https://sourceroslyn.io/#Microsoft.CodeAnalysis/DiagnosticAnalyzer/CompilationWithAnalyzers.cs,23) to compute diagnostics without any state tracking.
2. For analyzer diagnostics, we cannot re-use the same [underlying compilation](https://sourceroslyn.io/#Microsoft.CodeAnalysis/DiagnosticAnalyzer/CompilationWithAnalyzers.cs,23) without state tracking, so we just fork the compilation for computing analyzer diagnostics and do not re-use this forked compilation for any other calls into `CompilationWithAnalyzers`. Forking compilation should be fine here as we only make 2 calls into `CompilationWithAnalyzers` for all analyzer syntax and semantic diagnostics. In case a cancellation happens during analyzer execution and the host re-queries for the diagnostics for the same tree, we will use a new forked compilation, so we don't run into the risk of duplicate callbacks into the analyzer for the same compilation.
3. [AnalysisResultBuilder](https://sourceroslyn.io/#Microsoft.CodeAnalysis/DiagnosticAnalyzer/CompilationWithAnalyzers.cs,47) within `CompilationWithAnalyzers` will be changed to de-dupe reported analyzer diagnostics.
I prototyped the above changes over the last week and see significant memory improvements in allocations and peak memory pressure for background analysis. Steady state memory usage for the OOP Roslyn ServiceHub process reduces by almost 30%-40% on opening bunch of files in the C# compiler project in Roslyn.sln, all of this savings coming from no longer holding AnalysisState objects in memory for current project. I also notice vast reduction in allocations from background analysis. I haven't measured the execution time differences, but something I plan to carry out this week. Another big plus here is we can delete a large amount of code and reduce maintenance costs for a part of code base that has had very subtle race conditions over time.
**Fallback approach:** If the above approach of completely getting rid of AnalysisState for partial state tracking does not work out for some reason (perf or functionality), then we can experiment with optimizing memory usage in AnalysisState by tracking spans instead of syntax nodes and operations. This fallback strategy should also lead to good amount of performance improvements. | non_priority | background analysis performance improvements issue recent performance measurements for background analysis have shown that for very large files and for large number of open files in a vs session background analysis leads to a steady increase in memory pressure in the oop roslyn servicehub process majority of the allocations and memory usage is coming from the partial analysis state tracking done in compilationwithanalyzers for open file analysis we also have few feedback reports for the same for example shows of memory usage in oop roslyn servicehub process coming from analysisstate objects historical context prior to moving analyzer execution to the oop roslyn servicehub process for background analysis we used to execute all analyzers one by one on each open file to compute diagnostics this was done as a response to high gc and many ui delays while typing that stemmed from attempting to run all analyzers concurrently inside the devenv exe process however this in proc sequential analyzer execution led to repeated forking of the compilation inside compilationwithanalyzers instance for every analyzer being executed which caused high allocations and increase in memory pressure to avoid this we introduced analyzer state tracking within compilationwithanalyzers whereby we re use the same underlying compilation for computing all analyzer diagnostics even if asked sequentially by the compilationwithanalyzers host but need to keep track of which analyzer has completed callbacks for which symbol node operation this state tracking is needed to avoid duplicate callbacks into the analyzer for the same symbol node operation which is needed for functional correctness however we need to keep dictionaries and hashsets storing references to all symbol node operation executed for corresponding compilation events which increases the memory pressure this also affects execution speed as we need to acquire locks for reading and writing into these data structures prior to every callback potential perf improvements with the move to analyzers being executed completely in oop roslyn servicehub process we now execute all the analyzers concurrently in oop we do execute the compiler analyzer separately upfront though to ensure that compiler diagnostics get refreshed before we compute analyzer diagnostics we basically have calls into compilationwithanalyzers compiler syntax diagnostics compiler semantic diagnostics analyzer syntax diagnostics analyzer semantic diagnostics this means lot of reasoning why the partial analyzer state tracking logic was added into compilationwithanalyzers may no longer be required we can use the following implementation logic for compiler syntax and semantic diagnostics use the to compute diagnostics without any state tracking for analyzer diagnostics we cannot re use the same without state tracking so we just fork the compilation for computing analyzer diagnostics and do not re use this forked compilation for any other calls into compilationwithanalyzers forking compilation should be fine here as we only make calls into compilationwithanalyzers for all analyzer syntax and semantic diagnostics in case a cancellation happens during analyzer execution and the host re queries for the diagnostics for the same tree we will use a new forked compilation so we don t run into the risk of duplicate callbacks into the analyzer for the same compilation within compilationwithanalyzers will be changed to de dupe reported analyzer diagnostics i prototyped the above changes over the last week and see significant memory improvements in allocations and peak memory pressure for background analysis steady state memory usage for the oop roslyn servicehub process reduces by almost on opening bunch of files in the c compiler project in roslyn sln all of this savings coming from no longer holding analysisstate objects in memory for current project i also notice vast reduction in allocations from background analysis i haven t measured the execution time differences but something i plan to carry out this week another big plus here is we can delete a large amount of code and reduce maintenance costs for a part of code base that has had very subtle race conditions over time fallback approach if the above approach of completely getting rid of analysisstate for partial state tracking does not work out for some reason perf or functionality then we can experiment with optimizing memory usage in analysisstate by tracking spans instead of syntax nodes and operations this fallback strategy should also lead to good amount of performance improvements | 0 |
130,770 | 27,765,091,853 | IssuesEvent | 2023-03-16 10:55:29 | tidharmws/authlab | https://api.github.com/repos/tidharmws/authlab | opened | Code Security Report: 2 total findings | Mend: code security findings | # Code Security Report
### Scan Metadata
**Latest Scan:** 2023-03-16 10:55am
**Total Findings:** 2 | **New Findings:** 0 | **Resolved Findings:** 0
**Tested Project Files:** 22
**Detected Programming Languages:** 2 (JavaScript / Node.js, Go)
<!-- SAST-MANUAL-SCAN-START -->
- [ ] Check this box to manually trigger a scan
<!-- SAST-MANUAL-SCAN-END -->
### Finding Details
| 1.0 | Code Security Report: 2 total findings - # Code Security Report
### Scan Metadata
**Latest Scan:** 2023-03-16 10:55am
**Total Findings:** 2 | **New Findings:** 0 | **Resolved Findings:** 0
**Tested Project Files:** 22
**Detected Programming Languages:** 2 (JavaScript / Node.js, Go)
<!-- SAST-MANUAL-SCAN-START -->
- [ ] Check this box to manually trigger a scan
<!-- SAST-MANUAL-SCAN-END -->
### Finding Details
| non_priority | code security report total findings code security report scan metadata latest scan total findings new findings resolved findings tested project files detected programming languages javascript node js go check this box to manually trigger a scan finding details | 0 |
307,498 | 23,202,207,056 | IssuesEvent | 2022-08-01 23:04:29 | MarketSquare/robotframework-requests | https://api.github.com/repos/MarketSquare/robotframework-requests | closed | 'file-tuple' reported as a kwarg in the documentation | accepted documentation | I would need to upload a YAML file via a POST request.
From the [documentation](https://marketsquare.github.io/robotframework-requests/doc/RequestsLibrary.html#:~:text=multipart%20encoding%20upload.-,file%2Dtuple,-can%20be%20a), I'm trying the following:
`${response} POST https://abc.xyz/endpoint file-tuple=('file', ${fileobj}, 'application/x-yaml') verify=${False}`
But I get the following error:
`TypeError: request() got an unexpected keyword argument 'file-tuple'`
Robot Framework 5.0.1 (Python 3.9.10 on linux) | 1.0 | 'file-tuple' reported as a kwarg in the documentation - I would need to upload a YAML file via a POST request.
From the [documentation](https://marketsquare.github.io/robotframework-requests/doc/RequestsLibrary.html#:~:text=multipart%20encoding%20upload.-,file%2Dtuple,-can%20be%20a), I'm trying the following:
`${response} POST https://abc.xyz/endpoint file-tuple=('file', ${fileobj}, 'application/x-yaml') verify=${False}`
But I get the following error:
`TypeError: request() got an unexpected keyword argument 'file-tuple'`
Robot Framework 5.0.1 (Python 3.9.10 on linux) | non_priority | file tuple reported as a kwarg in the documentation i would need to upload a yaml file via a post request from the i m trying the following response post file tuple file fileobj application x yaml verify false but i get the following error typeerror request got an unexpected keyword argument file tuple robot framework python on linux | 0 |
97,421 | 8,653,455,656 | IssuesEvent | 2018-11-27 10:53:06 | ubtue/DatenProbleme | https://api.github.com/repos/ubtue/DatenProbleme | closed | Scielo ISSN 0718-9273 From rationality to credibility Tag 520 a | Zotero_AUTO Zotero_Translator ready for testing | Immer wieder, so auch im Fall dieser Zeitschrift, erscheint in 520 a der Begriff Resumen, Abstract am Beginn des Textes.
Hier etwa heiß es:
```
<datafield tag="520" ind1="3" ind2=" ">
<subfield code="a">ResumenA partir de la Ilustración, la ...
```
Könnte man diese Begriffe wie "Resumen", "Abstract", "Résumé" am Anfang in 520 a entfernen?
| 1.0 | Scielo ISSN 0718-9273 From rationality to credibility Tag 520 a - Immer wieder, so auch im Fall dieser Zeitschrift, erscheint in 520 a der Begriff Resumen, Abstract am Beginn des Textes.
Hier etwa heiß es:
```
<datafield tag="520" ind1="3" ind2=" ">
<subfield code="a">ResumenA partir de la Ilustración, la ...
```
Könnte man diese Begriffe wie "Resumen", "Abstract", "Résumé" am Anfang in 520 a entfernen?
| non_priority | scielo issn from rationality to credibility tag a immer wieder so auch im fall dieser zeitschrift erscheint in a der begriff resumen abstract am beginn des textes hier etwa heiß es resumena partir de la ilustración la könnte man diese begriffe wie resumen abstract résumé am anfang in a entfernen | 0 |
82,216 | 15,878,047,411 | IssuesEvent | 2021-04-09 10:27:09 | internetarchive/openlibrary | https://api.github.com/repos/internetarchive/openlibrary | closed | Unable to add book: "Internal Server Error" | Lead: @dhruvmanila Needs: Triage Theme: Unicode Type: Bug | I tried to add a book with the following details:
* Title: `He Huli ʻUlaʻula: A Study in Scarlet`
* Author: `Fevronia H Watkins`
* Publisher: `Independently Published`
* Published: `November 25, 2020`
* ISBN 13: `9798589760521`
When I do I just get "Internal Server Error"
### Evidence / Screenshot (if possible)
<img width="988" alt="Screen Shot 2021-04-08 at 12 16 57 PM" src="https://user-images.githubusercontent.com/921217/114103319-a78b2e00-9864-11eb-8e33-fac2556c9329.png">

### Relevant url?
https://openlibrary.org/books/add
### Steps to Reproduce
<!-- What steps caused you to find the bug? -->
1. Go to to the add books url, add books with the details as listed above
I tried adding debug=true to the html action following [this guide](https://github.com/internetarchive/openlibrary/wiki/Debugging) but didn't see any more detailed errors.
<!-- What actually happened after these steps? What did you expect to happen? -->
* Actual: Error
* Expected: No error
### Details
- **Logged in (Y/N)?** Y
- **Browser type/version?** Brave
- **Operating system?** Mac
- **Environment (prod/dev/local)?** prod
### Proposal & Constraints
<!-- What is the proposed solution / implementation? Is there a precedent of this approach succeeding elsewhere? -->
### Related files
<!-- Files related to this issue; this is super useful for new contributors who might want to help! If you're not sure, leave this blank; a maintainer will add them. -->
### Stakeholders
<!-- @ tag stakeholders of this bug -->
| 1.0 | Unable to add book: "Internal Server Error" - I tried to add a book with the following details:
* Title: `He Huli ʻUlaʻula: A Study in Scarlet`
* Author: `Fevronia H Watkins`
* Publisher: `Independently Published`
* Published: `November 25, 2020`
* ISBN 13: `9798589760521`
When I do I just get "Internal Server Error"
### Evidence / Screenshot (if possible)
<img width="988" alt="Screen Shot 2021-04-08 at 12 16 57 PM" src="https://user-images.githubusercontent.com/921217/114103319-a78b2e00-9864-11eb-8e33-fac2556c9329.png">

### Relevant url?
https://openlibrary.org/books/add
### Steps to Reproduce
<!-- What steps caused you to find the bug? -->
1. Go to to the add books url, add books with the details as listed above
I tried adding debug=true to the html action following [this guide](https://github.com/internetarchive/openlibrary/wiki/Debugging) but didn't see any more detailed errors.
<!-- What actually happened after these steps? What did you expect to happen? -->
* Actual: Error
* Expected: No error
### Details
- **Logged in (Y/N)?** Y
- **Browser type/version?** Brave
- **Operating system?** Mac
- **Environment (prod/dev/local)?** prod
### Proposal & Constraints
<!-- What is the proposed solution / implementation? Is there a precedent of this approach succeeding elsewhere? -->
### Related files
<!-- Files related to this issue; this is super useful for new contributors who might want to help! If you're not sure, leave this blank; a maintainer will add them. -->
### Stakeholders
<!-- @ tag stakeholders of this bug -->
| non_priority | unable to add book internal server error i tried to add a book with the following details title he huli ʻulaʻula a study in scarlet author fevronia h watkins publisher independently published published november isbn when i do i just get internal server error evidence screenshot if possible img width alt screen shot at pm src relevant url steps to reproduce go to to the add books url add books with the details as listed above i tried adding debug true to the html action following but didn t see any more detailed errors actual error expected no error details logged in y n y browser type version brave operating system mac environment prod dev local prod proposal constraints related files stakeholders | 0 |
5,095 | 7,704,373,535 | IssuesEvent | 2018-05-21 12:01:55 | bpp/bpp | https://api.github.com/repos/bpp/bpp | opened | allow three values for option tauprior | compatibility | Allow an optional third value for option `tauprior` (concentration parameter alpha for Dirichlet distribution) which is to be ignored for now. | True | allow three values for option tauprior - Allow an optional third value for option `tauprior` (concentration parameter alpha for Dirichlet distribution) which is to be ignored for now. | non_priority | allow three values for option tauprior allow an optional third value for option tauprior concentration parameter alpha for dirichlet distribution which is to be ignored for now | 0 |
55,105 | 14,219,010,221 | IssuesEvent | 2020-11-17 12:40:40 | BitLucid/ninjawars | https://api.github.com/repos/BitLucid/ninjawars | opened | Bathhouse: not updating it’s information correctly at all. | defect | AC:
- Bathhouse entries update in time again.


| 1.0 | Bathhouse: not updating it’s information correctly at all. - AC:
- Bathhouse entries update in time again.


| non_priority | bathhouse not updating it’s information correctly at all ac bathhouse entries update in time again | 0 |
128,989 | 10,559,075,470 | IssuesEvent | 2019-10-04 10:37:09 | microsoft/azure-pipelines-tasks | https://api.github.com/repos/microsoft/azure-pipelines-tasks | closed | In Azure Pipelines, the published coverage report fail to display some Unicode characters | Area: Test bug route | ## Required Information
Entering this information will route you directly to the right team and expedite traction.
**Question, Bug, or Feature?**
*Type*: Bug
**Enter Task Name**: PublishCodeCoverageResults
## Environment
- Server - Azure Pipelines or TFS on-premises?
- Azure Pipelines
- Agent - Hosted or Private:
- Hosted
## Issue Description
The published reports fail to display some Unicode characters. See [this page](https://dev.azure.com/EFanZh/a142a4f5-d1aa-4ad6-9e36-c326724a6f11/_apis/test/CodeCoverage/browse/3380364/Code%20Coverage%20Report_49/section_12_3_insertion_and_deletion_mod.htm), line 10 of the source code.
I think that is because the generated page is missing the
```html
<meta charset="UTF-8" />
```
line. | 1.0 | In Azure Pipelines, the published coverage report fail to display some Unicode characters - ## Required Information
Entering this information will route you directly to the right team and expedite traction.
**Question, Bug, or Feature?**
*Type*: Bug
**Enter Task Name**: PublishCodeCoverageResults
## Environment
- Server - Azure Pipelines or TFS on-premises?
- Azure Pipelines
- Agent - Hosted or Private:
- Hosted
## Issue Description
The published reports fail to display some Unicode characters. See [this page](https://dev.azure.com/EFanZh/a142a4f5-d1aa-4ad6-9e36-c326724a6f11/_apis/test/CodeCoverage/browse/3380364/Code%20Coverage%20Report_49/section_12_3_insertion_and_deletion_mod.htm), line 10 of the source code.
I think that is because the generated page is missing the
```html
<meta charset="UTF-8" />
```
line. | non_priority | in azure pipelines the published coverage report fail to display some unicode characters required information entering this information will route you directly to the right team and expedite traction question bug or feature type bug enter task name publishcodecoverageresults environment server azure pipelines or tfs on premises azure pipelines agent hosted or private hosted issue description the published reports fail to display some unicode characters see line of the source code i think that is because the generated page is missing the html line | 0 |
133,020 | 10,780,225,450 | IssuesEvent | 2019-11-04 12:29:52 | ethereum/solidity | https://api.github.com/repos/ethereum/solidity | closed | [yul] proto fuzzer: Catch exceptions earlier and report termination reason | testing :hammer: | ## Description
Currently, exceptions thrown by the yul interpreter are caught in the fuzzer harness. The problem with this is that execution traces are not initialized properly. This issue tracks support for fixing this. | 1.0 | [yul] proto fuzzer: Catch exceptions earlier and report termination reason - ## Description
Currently, exceptions thrown by the yul interpreter are caught in the fuzzer harness. The problem with this is that execution traces are not initialized properly. This issue tracks support for fixing this. | non_priority | proto fuzzer catch exceptions earlier and report termination reason description currently exceptions thrown by the yul interpreter are caught in the fuzzer harness the problem with this is that execution traces are not initialized properly this issue tracks support for fixing this | 0 |
52,485 | 12,973,176,843 | IssuesEvent | 2020-07-21 13:40:38 | nearprotocol/nearcore | https://api.github.com/repos/nearprotocol/nearcore | closed | binary build in buildkite has path issue read genesis from file | build | ```
➜ ~ pwd
/home/bo
➜ ~ ./near --home aaa init --genesis /home/bo/output.json --chain-id betanet
Apr 14 18:35:38.514 INFO near: Version: 0.4.13, Build: 118a7a73
thread 'main' panicked at 'Could not read genesis config file.: Os { code: 2, kind: NotFound, message: "No such file or directory" }', /var/lib/buildkite-agent/builds/buildkite-i-0e15d392a9e50ef91-1/nearprotocol/nearcore/core/chain-configs/src/genesis_config.rs:140:14
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
➜ ~ ls output.json
output.json
```
So the /home/bo/output.json is there but near binary cannot open it, probably some path issue | 1.0 | binary build in buildkite has path issue read genesis from file - ```
➜ ~ pwd
/home/bo
➜ ~ ./near --home aaa init --genesis /home/bo/output.json --chain-id betanet
Apr 14 18:35:38.514 INFO near: Version: 0.4.13, Build: 118a7a73
thread 'main' panicked at 'Could not read genesis config file.: Os { code: 2, kind: NotFound, message: "No such file or directory" }', /var/lib/buildkite-agent/builds/buildkite-i-0e15d392a9e50ef91-1/nearprotocol/nearcore/core/chain-configs/src/genesis_config.rs:140:14
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
➜ ~ ls output.json
output.json
```
So the /home/bo/output.json is there but near binary cannot open it, probably some path issue | non_priority | binary build in buildkite has path issue read genesis from file ➜ pwd home bo ➜ near home aaa init genesis home bo output json chain id betanet apr info near version build thread main panicked at could not read genesis config file os code kind notfound message no such file or directory var lib buildkite agent builds buildkite i nearprotocol nearcore core chain configs src genesis config rs note run with rust backtrace environment variable to display a backtrace ➜ ls output json output json so the home bo output json is there but near binary cannot open it probably some path issue | 0 |
2,837 | 5,637,233,359 | IssuesEvent | 2017-04-06 08:37:41 | presscustomizr/customizr | https://api.github.com/repos/presscustomizr/customizr | closed | WooCommerce admin javascript error ( Customizr Pro Theme ) | bug compatibility-issue fixed Urgent | Hi,
This issue is reported by customer on helpscout.
https://secure.helpscout.net/conversation/345010016/25533/?folderId=607541#thread-928281680
Customizr Pro Theme 1.3.4 and WooCommerce 3.0.0
In WooCommerce, Add New Product interface, the Product data tabs are not working.
There is javascript error.
<img width="1670" alt="screen shot 2017-04-06 at 10 19 57 am" src="https://cloud.githubusercontent.com/assets/9284811/24735238/857ab2ea-1ab4-11e7-9693-fa56779c8fba.png">
Works properly in Customizr ( Free ) Theme.
Thank you
| True | WooCommerce admin javascript error ( Customizr Pro Theme ) - Hi,
This issue is reported by customer on helpscout.
https://secure.helpscout.net/conversation/345010016/25533/?folderId=607541#thread-928281680
Customizr Pro Theme 1.3.4 and WooCommerce 3.0.0
In WooCommerce, Add New Product interface, the Product data tabs are not working.
There is javascript error.
<img width="1670" alt="screen shot 2017-04-06 at 10 19 57 am" src="https://cloud.githubusercontent.com/assets/9284811/24735238/857ab2ea-1ab4-11e7-9693-fa56779c8fba.png">
Works properly in Customizr ( Free ) Theme.
Thank you
| non_priority | woocommerce admin javascript error customizr pro theme hi this issue is reported by customer on helpscout customizr pro theme and woocommerce in woocommerce add new product interface the product data tabs are not working there is javascript error img width alt screen shot at am src works properly in customizr free theme thank you | 0 |
79,804 | 9,955,582,814 | IssuesEvent | 2019-07-05 11:30:13 | mash-up-kr/Thing-BackEnd | https://api.github.com/repos/mash-up-kr/Thing-BackEnd | opened | 랭킹 도메인 요구 사항 | design enhancement | # 랭킹 도메인 요구 사항
## 랭킹 도메인 요구사항을 도출한다.
- 카테고리별 유튜버 랭킹을 볼 수 있다.
- 구독자 순으로 유튜버 랭킹을 볼 수 있다.
- 급상승 순으로 유튜버 랭킹을 볼 수 있다.
- 랭킹에서 보여주는 정보는 다음과 같다.
- 유튜버 이름
- 랭킹
- 구독자수
- 조회수
- 썸네일
- 배너
<br>
### 완료 조건
- [ ] 랭킹 도메인 모델을 도출한다.
<br>
## relate to issue
* 도메인도출(#4)
<br>
> #### Reference
> FIXME: fill reference link
> * `[title](link)`
---
<br>
## Check List
- [ ] issue 제목은 유의미한가?
- [ ] issue 내용은 issue 내용만 확인하고도 모르는 사람도 파악할 수 있을 정도로 기술되었는가? (무엇을, 언제, 어디서...)
- [ ] reference가 있다면 추가했는가?
- [ ] 관련 issue가 있다면 추가했는가?
- [ ] 유의미한 Label을 추가했는가?
- [ ] Assginees를 추가했는가?
- [ ] Estimate를 추가했는가?
- [ ] 관련 Milestone이 있다면 추가했는가?
- [ ] 관련 Epics가 있다면 추가했는가?
---
| 1.0 | 랭킹 도메인 요구 사항 - # 랭킹 도메인 요구 사항
## 랭킹 도메인 요구사항을 도출한다.
- 카테고리별 유튜버 랭킹을 볼 수 있다.
- 구독자 순으로 유튜버 랭킹을 볼 수 있다.
- 급상승 순으로 유튜버 랭킹을 볼 수 있다.
- 랭킹에서 보여주는 정보는 다음과 같다.
- 유튜버 이름
- 랭킹
- 구독자수
- 조회수
- 썸네일
- 배너
<br>
### 완료 조건
- [ ] 랭킹 도메인 모델을 도출한다.
<br>
## relate to issue
* 도메인도출(#4)
<br>
> #### Reference
> FIXME: fill reference link
> * `[title](link)`
---
<br>
## Check List
- [ ] issue 제목은 유의미한가?
- [ ] issue 내용은 issue 내용만 확인하고도 모르는 사람도 파악할 수 있을 정도로 기술되었는가? (무엇을, 언제, 어디서...)
- [ ] reference가 있다면 추가했는가?
- [ ] 관련 issue가 있다면 추가했는가?
- [ ] 유의미한 Label을 추가했는가?
- [ ] Assginees를 추가했는가?
- [ ] Estimate를 추가했는가?
- [ ] 관련 Milestone이 있다면 추가했는가?
- [ ] 관련 Epics가 있다면 추가했는가?
---
| non_priority | 랭킹 도메인 요구 사항 랭킹 도메인 요구 사항 랭킹 도메인 요구사항을 도출한다 카테고리별 유튜버 랭킹을 볼 수 있다 구독자 순으로 유튜버 랭킹을 볼 수 있다 급상승 순으로 유튜버 랭킹을 볼 수 있다 랭킹에서 보여주는 정보는 다음과 같다 유튜버 이름 랭킹 구독자수 조회수 썸네일 배너 완료 조건 랭킹 도메인 모델을 도출한다 relate to issue 도메인도출 reference fixme fill reference link link check list issue 제목은 유의미한가 issue 내용은 issue 내용만 확인하고도 모르는 사람도 파악할 수 있을 정도로 기술되었는가 무엇을 언제 어디서 reference가 있다면 추가했는가 관련 issue가 있다면 추가했는가 유의미한 label을 추가했는가 assginees를 추가했는가 estimate를 추가했는가 관련 milestone이 있다면 추가했는가 관련 epics가 있다면 추가했는가 | 0 |
111,027 | 24,053,150,362 | IssuesEvent | 2022-09-16 14:28:22 | haproxy/haproxy | https://api.github.com/repos/haproxy/haproxy | opened | haproxy crashes when built using clang and thread sanitizer | type: code-report | ### Tool Name and Version
clang and thread sanitizer
### Code Report
```plain
[Thread 0x7ffff52fe700 (LWP 9112) exited]
HAProxy version 2.7-dev5-e3e312-97 2022/09/16 - https://haproxy.org/
Status: development branch - not safe for use in production.
Known bugs: https://github.com/haproxy/haproxy/issues?q=is:issue+is:open
Running on: Linux 5.15.0-1019-azure #24~20.04.1-Ubuntu SMP Tue Aug 23 15:52:52 UTC 2022 x86_64
Build options :
TARGET = linux-glibc
CPU = generic
CC = clang
CFLAGS = -O2 -fsanitize=thread -ggdb -Wall -Wextra -Wundef -Wdeclaration-after-statement -Wfatal-errors -Wtype-limits -Wshift-negative-value -Wnull-dereference -fwrapv -Wno-unknown-warning-option -Wno-address-of-packed-member -Wno-unused-label -Wno-sign-compare -Wno-unused-parameter -Wno-clobbered -Wno-missing-field-initializers -Wno-cast-function-type -Wno-string-plus-int -Wno-atomic-alignment -Werror
OPTIONS = USE_OPENSSL=1
DEBUG = -DDEBUG_STRICT -DDEBUG_MEMORY_POOLS
Feature list : +EPOLL -KQUEUE +NETFILTER -PCRE -PCRE_JIT -PCRE2 -PCRE2_JIT +POLL +THREAD -PTHREAD_EMULATION +BACKTRACE -STATIC_PCRE -STATIC_PCRE2 +TPROXY +LINUX_TPROXY +LINUX_SPLICE +LIBCRYPT +CRYPT_H -ENGINE +GETADDRINFO +OPENSSL -LUA +ACCEPT4 -CLOSEFROM -ZLIB +SLZ +CPU_AFFINITY +TFO +NS +DL +RT -DEVICEATLAS -51DEGREES -WURFL -SYSTEMD -OBSOLETE_LINKER +PRCTL -PROCCTL +THREAD_DUMP -EVPORTS -OT -QUIC -PROMEX -MEMORY_PROFILING
Default settings :
bufsize = 16384, maxrewrite = 1024, maxpollevents = 200
Built with multi-threading support (MAX_TGROUPS=16, MAX_THREADS=256, default=2).
Built with OpenSSL version : OpenSSL 1.1.1f 31 Mar 2020
Running on OpenSSL version : OpenSSL 1.1.1f 31 Mar 2020
OpenSSL library supports TLS extensions : yes
OpenSSL library supports SNI : yes
OpenSSL library supports : TLSv1.0 TLSv1.1 TLSv1.2 TLSv1.3
Built with network namespace support.
Built with libslz for stateless compression.
Compression algorithms supported : identity("identity"), deflate("deflate"), raw-deflate("deflate"), gzip("gzip")
Built with transparent proxy support using: IP_TRANSPARENT IPV6_TRANSPARENT IP_FREEBIND
Built without PCRE or PCRE2 support (using libc's regex instead)
Encrypted password support via crypt(3): yes
Built with clang compiler version 10.0.0
Available polling systems :
epoll : pref=300, test result OK
poll : pref=200, test result OK
select : pref=150, test result OK
Total: 3 (3 usable), will use epoll.
Available multiplexer protocols :
(protocols marked as <default> cannot be specified using 'proto' keyword)
h2 : mode=HTTP side=FE|BE mux=H2 flags=HTX|HOL_RISK|NO_UPG
fcgi : mode=HTTP side=BE mux=FCGI flags=HTX|HOL_RISK|NO_UPG
h1 : mode=HTTP side=FE|BE mux=H1 flags=HTX|NO_UPG
<default> : mode=HTTP side=FE|BE mux=H1 flags=HTX
none : mode=TCP side=FE|BE mux=PASS flags=NO_UPG
<default> : mode=TCP side=FE|BE mux=PASS flags=
Available services : none
Available filters :
[BWLIM] bwlim-in
[BWLIM] bwlim-out
[CACHE] cache
[COMP] compression
[FCGI] fcgi-app
[SPOE] spoe
[TRACE] trace
Thread 1 "haproxy" received signal SIGSEGV, Segmentation fault.
0xffffffffffffffff in ?? ()
(gdb) bt full
#0 0xffffffffffffffff in ?? ()
No symbol table info available.
#1 0x0000000000476309 in sigaction ()
No symbol table info available.
#2 0x000000000047615e in signal ()
No symbol table info available.
#3 0x0000000000775991 in deinit_signals () at src/signal.c:158
sig = 93
sh = <optimized out>
shb = <optimized out>
#4 0x00000000006952b5 in deinit () at src/haproxy.c:2624
ua = 0x0
p = 0x0
cur_fd = <optimized out>
p0 = <optimized out>
uap = <optimized out>
pdf = <optimized out>
logb = <optimized out>
log = <optimized out>
wlb = <optimized out>
wl = <optimized out>
bolb = <optimized out>
bol = <optimized out>
pxdfb = <optimized out>
pxdf = <optimized out>
pdfb = <optimized out>
srvdfb = <optimized out>
srvdf = <optimized out>
pcfb = <optimized out>
pcf = <optimized out>
pscfb = <optimized out>
pscf = <optimized out>
ppcfb = <optimized out>
ppcf = <optimized out>
prcfb = <optimized out>
prcf = <optimized out>
tifb = <optimized out>
tif = <optimized out>
tdfb = <optimized out>
--Type <RET> for more, q to quit, c to continue without paging-- c
tdf = <optimized out>
tafb = <optimized out>
taf = <optimized out>
tffb = <optimized out>
tff = <optimized out>
pprsb = <optimized out>
pprs = <optimized out>
#5 0x000000000069616f in deinit_and_exit (status=0) at src/haproxy.c:2786
No locals.
#6 0x000000000069b51a in init_args (argc=<optimized out>, argv=<optimized out>) at src/haproxy.c:1602
flag = 0x7fffffffe749 "vv"
err_msg = <optimized out>
progname = 0x7b0400000060 "haproxy"
flag = <optimized out>
endptr = <optimized out>
c = <optimized out>
ret = <optimized out>
__x = <optimized out>
__x = <optimized out>
#7 main (argc=-6327, argv=0x7fffffffe4b8) at src/haproxy.c:3109
limit = {rlim_cur = 1024, rlim_max = 1048576}
intovf = <optimized out>
pidfd = -1
err = <optimized out>
retry = <optimized out>
(gdb)
```
```
### Additional Information
_No response_
### Output of `haproxy -vv`
```plain
HAProxy version 2.7-dev5-e3e312-97 2022/09/16 - https://haproxy.org/
Status: development branch - not safe for use in production.
Known bugs: https://github.com/haproxy/haproxy/issues?q=is:issue+is:open
Running on: Linux 5.15.0-1019-azure #24~20.04.1-Ubuntu SMP Tue Aug 23 15:52:52 UTC 2022 x86_64
Build options :
TARGET = linux-glibc
CPU = generic
CC = clang
CFLAGS = -O2 -fsanitize=thread -ggdb -Wall -Wextra -Wundef -Wdeclaration-after-statement -Wfatal-errors -Wtype-limits -Wshift-negative-value -Wnull-dereference -fwrapv -Wno-unknown-warning-option -Wno-address-of-packed-member -Wno-unused-label -Wno-sign-compare -Wno-unused-parameter -Wno-clobbered -Wno-missing-field-initializers -Wno-cast-function-type -Wno-string-plus-int -Wno-atomic-alignment -Werror
OPTIONS = USE_OPENSSL=1
DEBUG = -DDEBUG_STRICT -DDEBUG_MEMORY_POOLS
Feature list : +EPOLL -KQUEUE +NETFILTER -PCRE -PCRE_JIT -PCRE2 -PCRE2_JIT +POLL +THREAD -PTHREAD_EMULATION +BACKTRACE -STATIC_PCRE -STATIC_PCRE2 +TPROXY +LINUX_TPROXY +LINUX_SPLICE +LIBCRYPT +CRYPT_H -ENGINE +GETADDRINFO +OPENSSL -LUA +ACCEPT4 -CLOSEFROM -ZLIB +SLZ +CPU_AFFINITY +TFO +NS +DL +RT -DEVICEATLAS -51DEGREES -WURFL -SYSTEMD -OBSOLETE_LINKER +PRCTL -PROCCTL +THREAD_DUMP -EVPORTS -OT -QUIC -PROMEX -MEMORY_PROFILING
Default settings :
bufsize = 16384, maxrewrite = 1024, maxpollevents = 200
Built with multi-threading support (MAX_TGROUPS=16, MAX_THREADS=256, default=2).
Built with OpenSSL version : OpenSSL 1.1.1f 31 Mar 2020
Running on OpenSSL version : OpenSSL 1.1.1f 31 Mar 2020
OpenSSL library supports TLS extensions : yes
OpenSSL library supports SNI : yes
OpenSSL library supports : TLSv1.0 TLSv1.1 TLSv1.2 TLSv1.3
Built with network namespace support.
Built with libslz for stateless compression.
Compression algorithms supported : identity("identity"), deflate("deflate"), raw-deflate("deflate"), gzip("gzip")
Built with transparent proxy support using: IP_TRANSPARENT IPV6_TRANSPARENT IP_FREEBIND
Built without PCRE or PCRE2 support (using libc's regex instead)
Encrypted password support via crypt(3): yes
Built with clang compiler version 10.0.0
Available polling systems :
epoll : pref=300, test result OK
poll : pref=200, test result OK
select : pref=150, test result OK
Total: 3 (3 usable), will use epoll.
Available multiplexer protocols :
(protocols marked as <default> cannot be specified using 'proto' keyword)
h2 : mode=HTTP side=FE|BE mux=H2 flags=HTX|HOL_RISK|NO_UPG
fcgi : mode=HTTP side=BE mux=FCGI flags=HTX|HOL_RISK|NO_UPG
h1 : mode=HTTP side=FE|BE mux=H1 flags=HTX|NO_UPG
<default> : mode=HTTP side=FE|BE mux=H1 flags=HTX
none : mode=TCP side=FE|BE mux=PASS flags=NO_UPG
<default> : mode=TCP side=FE|BE mux=PASS flags=
Available services : none
Available filters :
[BWLIM] bwlim-in
[BWLIM] bwlim-out
[CACHE] cache
[COMP] compression
[FCGI] fcgi-app
[SPOE] spoe
[TRACE] trace
Thread 1 "haproxy" received signal SIGSEGV, Segmentation fault.
```
| 1.0 | haproxy crashes when built using clang and thread sanitizer - ### Tool Name and Version
clang and thread sanitizer
### Code Report
```plain
[Thread 0x7ffff52fe700 (LWP 9112) exited]
HAProxy version 2.7-dev5-e3e312-97 2022/09/16 - https://haproxy.org/
Status: development branch - not safe for use in production.
Known bugs: https://github.com/haproxy/haproxy/issues?q=is:issue+is:open
Running on: Linux 5.15.0-1019-azure #24~20.04.1-Ubuntu SMP Tue Aug 23 15:52:52 UTC 2022 x86_64
Build options :
TARGET = linux-glibc
CPU = generic
CC = clang
CFLAGS = -O2 -fsanitize=thread -ggdb -Wall -Wextra -Wundef -Wdeclaration-after-statement -Wfatal-errors -Wtype-limits -Wshift-negative-value -Wnull-dereference -fwrapv -Wno-unknown-warning-option -Wno-address-of-packed-member -Wno-unused-label -Wno-sign-compare -Wno-unused-parameter -Wno-clobbered -Wno-missing-field-initializers -Wno-cast-function-type -Wno-string-plus-int -Wno-atomic-alignment -Werror
OPTIONS = USE_OPENSSL=1
DEBUG = -DDEBUG_STRICT -DDEBUG_MEMORY_POOLS
Feature list : +EPOLL -KQUEUE +NETFILTER -PCRE -PCRE_JIT -PCRE2 -PCRE2_JIT +POLL +THREAD -PTHREAD_EMULATION +BACKTRACE -STATIC_PCRE -STATIC_PCRE2 +TPROXY +LINUX_TPROXY +LINUX_SPLICE +LIBCRYPT +CRYPT_H -ENGINE +GETADDRINFO +OPENSSL -LUA +ACCEPT4 -CLOSEFROM -ZLIB +SLZ +CPU_AFFINITY +TFO +NS +DL +RT -DEVICEATLAS -51DEGREES -WURFL -SYSTEMD -OBSOLETE_LINKER +PRCTL -PROCCTL +THREAD_DUMP -EVPORTS -OT -QUIC -PROMEX -MEMORY_PROFILING
Default settings :
bufsize = 16384, maxrewrite = 1024, maxpollevents = 200
Built with multi-threading support (MAX_TGROUPS=16, MAX_THREADS=256, default=2).
Built with OpenSSL version : OpenSSL 1.1.1f 31 Mar 2020
Running on OpenSSL version : OpenSSL 1.1.1f 31 Mar 2020
OpenSSL library supports TLS extensions : yes
OpenSSL library supports SNI : yes
OpenSSL library supports : TLSv1.0 TLSv1.1 TLSv1.2 TLSv1.3
Built with network namespace support.
Built with libslz for stateless compression.
Compression algorithms supported : identity("identity"), deflate("deflate"), raw-deflate("deflate"), gzip("gzip")
Built with transparent proxy support using: IP_TRANSPARENT IPV6_TRANSPARENT IP_FREEBIND
Built without PCRE or PCRE2 support (using libc's regex instead)
Encrypted password support via crypt(3): yes
Built with clang compiler version 10.0.0
Available polling systems :
epoll : pref=300, test result OK
poll : pref=200, test result OK
select : pref=150, test result OK
Total: 3 (3 usable), will use epoll.
Available multiplexer protocols :
(protocols marked as <default> cannot be specified using 'proto' keyword)
h2 : mode=HTTP side=FE|BE mux=H2 flags=HTX|HOL_RISK|NO_UPG
fcgi : mode=HTTP side=BE mux=FCGI flags=HTX|HOL_RISK|NO_UPG
h1 : mode=HTTP side=FE|BE mux=H1 flags=HTX|NO_UPG
<default> : mode=HTTP side=FE|BE mux=H1 flags=HTX
none : mode=TCP side=FE|BE mux=PASS flags=NO_UPG
<default> : mode=TCP side=FE|BE mux=PASS flags=
Available services : none
Available filters :
[BWLIM] bwlim-in
[BWLIM] bwlim-out
[CACHE] cache
[COMP] compression
[FCGI] fcgi-app
[SPOE] spoe
[TRACE] trace
Thread 1 "haproxy" received signal SIGSEGV, Segmentation fault.
0xffffffffffffffff in ?? ()
(gdb) bt full
#0 0xffffffffffffffff in ?? ()
No symbol table info available.
#1 0x0000000000476309 in sigaction ()
No symbol table info available.
#2 0x000000000047615e in signal ()
No symbol table info available.
#3 0x0000000000775991 in deinit_signals () at src/signal.c:158
sig = 93
sh = <optimized out>
shb = <optimized out>
#4 0x00000000006952b5 in deinit () at src/haproxy.c:2624
ua = 0x0
p = 0x0
cur_fd = <optimized out>
p0 = <optimized out>
uap = <optimized out>
pdf = <optimized out>
logb = <optimized out>
log = <optimized out>
wlb = <optimized out>
wl = <optimized out>
bolb = <optimized out>
bol = <optimized out>
pxdfb = <optimized out>
pxdf = <optimized out>
pdfb = <optimized out>
srvdfb = <optimized out>
srvdf = <optimized out>
pcfb = <optimized out>
pcf = <optimized out>
pscfb = <optimized out>
pscf = <optimized out>
ppcfb = <optimized out>
ppcf = <optimized out>
prcfb = <optimized out>
prcf = <optimized out>
tifb = <optimized out>
tif = <optimized out>
tdfb = <optimized out>
--Type <RET> for more, q to quit, c to continue without paging-- c
tdf = <optimized out>
tafb = <optimized out>
taf = <optimized out>
tffb = <optimized out>
tff = <optimized out>
pprsb = <optimized out>
pprs = <optimized out>
#5 0x000000000069616f in deinit_and_exit (status=0) at src/haproxy.c:2786
No locals.
#6 0x000000000069b51a in init_args (argc=<optimized out>, argv=<optimized out>) at src/haproxy.c:1602
flag = 0x7fffffffe749 "vv"
err_msg = <optimized out>
progname = 0x7b0400000060 "haproxy"
flag = <optimized out>
endptr = <optimized out>
c = <optimized out>
ret = <optimized out>
__x = <optimized out>
__x = <optimized out>
#7 main (argc=-6327, argv=0x7fffffffe4b8) at src/haproxy.c:3109
limit = {rlim_cur = 1024, rlim_max = 1048576}
intovf = <optimized out>
pidfd = -1
err = <optimized out>
retry = <optimized out>
(gdb)
```
```
### Additional Information
_No response_
### Output of `haproxy -vv`
```plain
HAProxy version 2.7-dev5-e3e312-97 2022/09/16 - https://haproxy.org/
Status: development branch - not safe for use in production.
Known bugs: https://github.com/haproxy/haproxy/issues?q=is:issue+is:open
Running on: Linux 5.15.0-1019-azure #24~20.04.1-Ubuntu SMP Tue Aug 23 15:52:52 UTC 2022 x86_64
Build options :
TARGET = linux-glibc
CPU = generic
CC = clang
CFLAGS = -O2 -fsanitize=thread -ggdb -Wall -Wextra -Wundef -Wdeclaration-after-statement -Wfatal-errors -Wtype-limits -Wshift-negative-value -Wnull-dereference -fwrapv -Wno-unknown-warning-option -Wno-address-of-packed-member -Wno-unused-label -Wno-sign-compare -Wno-unused-parameter -Wno-clobbered -Wno-missing-field-initializers -Wno-cast-function-type -Wno-string-plus-int -Wno-atomic-alignment -Werror
OPTIONS = USE_OPENSSL=1
DEBUG = -DDEBUG_STRICT -DDEBUG_MEMORY_POOLS
Feature list : +EPOLL -KQUEUE +NETFILTER -PCRE -PCRE_JIT -PCRE2 -PCRE2_JIT +POLL +THREAD -PTHREAD_EMULATION +BACKTRACE -STATIC_PCRE -STATIC_PCRE2 +TPROXY +LINUX_TPROXY +LINUX_SPLICE +LIBCRYPT +CRYPT_H -ENGINE +GETADDRINFO +OPENSSL -LUA +ACCEPT4 -CLOSEFROM -ZLIB +SLZ +CPU_AFFINITY +TFO +NS +DL +RT -DEVICEATLAS -51DEGREES -WURFL -SYSTEMD -OBSOLETE_LINKER +PRCTL -PROCCTL +THREAD_DUMP -EVPORTS -OT -QUIC -PROMEX -MEMORY_PROFILING
Default settings :
bufsize = 16384, maxrewrite = 1024, maxpollevents = 200
Built with multi-threading support (MAX_TGROUPS=16, MAX_THREADS=256, default=2).
Built with OpenSSL version : OpenSSL 1.1.1f 31 Mar 2020
Running on OpenSSL version : OpenSSL 1.1.1f 31 Mar 2020
OpenSSL library supports TLS extensions : yes
OpenSSL library supports SNI : yes
OpenSSL library supports : TLSv1.0 TLSv1.1 TLSv1.2 TLSv1.3
Built with network namespace support.
Built with libslz for stateless compression.
Compression algorithms supported : identity("identity"), deflate("deflate"), raw-deflate("deflate"), gzip("gzip")
Built with transparent proxy support using: IP_TRANSPARENT IPV6_TRANSPARENT IP_FREEBIND
Built without PCRE or PCRE2 support (using libc's regex instead)
Encrypted password support via crypt(3): yes
Built with clang compiler version 10.0.0
Available polling systems :
epoll : pref=300, test result OK
poll : pref=200, test result OK
select : pref=150, test result OK
Total: 3 (3 usable), will use epoll.
Available multiplexer protocols :
(protocols marked as <default> cannot be specified using 'proto' keyword)
h2 : mode=HTTP side=FE|BE mux=H2 flags=HTX|HOL_RISK|NO_UPG
fcgi : mode=HTTP side=BE mux=FCGI flags=HTX|HOL_RISK|NO_UPG
h1 : mode=HTTP side=FE|BE mux=H1 flags=HTX|NO_UPG
<default> : mode=HTTP side=FE|BE mux=H1 flags=HTX
none : mode=TCP side=FE|BE mux=PASS flags=NO_UPG
<default> : mode=TCP side=FE|BE mux=PASS flags=
Available services : none
Available filters :
[BWLIM] bwlim-in
[BWLIM] bwlim-out
[CACHE] cache
[COMP] compression
[FCGI] fcgi-app
[SPOE] spoe
[TRACE] trace
Thread 1 "haproxy" received signal SIGSEGV, Segmentation fault.
```
| non_priority | haproxy crashes when built using clang and thread sanitizer tool name and version clang and thread sanitizer code report plain haproxy version status development branch not safe for use in production known bugs running on linux azure ubuntu smp tue aug utc build options target linux glibc cpu generic cc clang cflags fsanitize thread ggdb wall wextra wundef wdeclaration after statement wfatal errors wtype limits wshift negative value wnull dereference fwrapv wno unknown warning option wno address of packed member wno unused label wno sign compare wno unused parameter wno clobbered wno missing field initializers wno cast function type wno string plus int wno atomic alignment werror options use openssl debug ddebug strict ddebug memory pools feature list epoll kqueue netfilter pcre pcre jit jit poll thread pthread emulation backtrace static pcre static tproxy linux tproxy linux splice libcrypt crypt h engine getaddrinfo openssl lua closefrom zlib slz cpu affinity tfo ns dl rt deviceatlas wurfl systemd obsolete linker prctl procctl thread dump evports ot quic promex memory profiling default settings bufsize maxrewrite maxpollevents built with multi threading support max tgroups max threads default built with openssl version openssl mar running on openssl version openssl mar openssl library supports tls extensions yes openssl library supports sni yes openssl library supports built with network namespace support built with libslz for stateless compression compression algorithms supported identity identity deflate deflate raw deflate deflate gzip gzip built with transparent proxy support using ip transparent transparent ip freebind built without pcre or support using libc s regex instead encrypted password support via crypt yes built with clang compiler version available polling systems epoll pref test result ok poll pref test result ok select pref test result ok total usable will use epoll available multiplexer protocols protocols marked as cannot be specified using proto keyword mode http side fe be mux flags htx hol risk no upg fcgi mode http side be mux fcgi flags htx hol risk no upg mode http side fe be mux flags htx no upg mode http side fe be mux flags htx none mode tcp side fe be mux pass flags no upg mode tcp side fe be mux pass flags available services none available filters bwlim in bwlim out cache compression fcgi app spoe trace thread haproxy received signal sigsegv segmentation fault in gdb bt full in no symbol table info available in sigaction no symbol table info available in signal no symbol table info available in deinit signals at src signal c sig sh shb in deinit at src haproxy c ua p cur fd uap pdf logb log wlb wl bolb bol pxdfb pxdf pdfb srvdfb srvdf pcfb pcf pscfb pscf ppcfb ppcf prcfb prcf tifb tif tdfb type for more q to quit c to continue without paging c tdf tafb taf tffb tff pprsb pprs in deinit and exit status at src haproxy c no locals in init args argc argv at src haproxy c flag vv err msg progname haproxy flag endptr c ret x x main argc argv at src haproxy c limit rlim cur rlim max intovf pidfd err retry gdb additional information no response output of haproxy vv plain haproxy version status development branch not safe for use in production known bugs running on linux azure ubuntu smp tue aug utc build options target linux glibc cpu generic cc clang cflags fsanitize thread ggdb wall wextra wundef wdeclaration after statement wfatal errors wtype limits wshift negative value wnull dereference fwrapv wno unknown warning option wno address of packed member wno unused label wno sign compare wno unused parameter wno clobbered wno missing field initializers wno cast function type wno string plus int wno atomic alignment werror options use openssl debug ddebug strict ddebug memory pools feature list epoll kqueue netfilter pcre pcre jit jit poll thread pthread emulation backtrace static pcre static tproxy linux tproxy linux splice libcrypt crypt h engine getaddrinfo openssl lua closefrom zlib slz cpu affinity tfo ns dl rt deviceatlas wurfl systemd obsolete linker prctl procctl thread dump evports ot quic promex memory profiling default settings bufsize maxrewrite maxpollevents built with multi threading support max tgroups max threads default built with openssl version openssl mar running on openssl version openssl mar openssl library supports tls extensions yes openssl library supports sni yes openssl library supports built with network namespace support built with libslz for stateless compression compression algorithms supported identity identity deflate deflate raw deflate deflate gzip gzip built with transparent proxy support using ip transparent transparent ip freebind built without pcre or support using libc s regex instead encrypted password support via crypt yes built with clang compiler version available polling systems epoll pref test result ok poll pref test result ok select pref test result ok total usable will use epoll available multiplexer protocols protocols marked as cannot be specified using proto keyword mode http side fe be mux flags htx hol risk no upg fcgi mode http side be mux fcgi flags htx hol risk no upg mode http side fe be mux flags htx no upg mode http side fe be mux flags htx none mode tcp side fe be mux pass flags no upg mode tcp side fe be mux pass flags available services none available filters bwlim in bwlim out cache compression fcgi app spoe trace thread haproxy received signal sigsegv segmentation fault | 0 |
51,327 | 6,155,460,062 | IssuesEvent | 2017-06-28 14:48:33 | ProjectSidewalk/SidewalkWebpage | https://api.github.com/repos/ProjectSidewalk/SidewalkWebpage | opened | Path does not load on mini-map after a jump | Relaunch Testing | 
Occasionally after a manual jump, the compass will point the user in the direction of a path that hasn't appeared on the mini-map yet. Taking a step in that direction makes the path appear.

| 1.0 | Path does not load on mini-map after a jump - 
Occasionally after a manual jump, the compass will point the user in the direction of a path that hasn't appeared on the mini-map yet. Taking a step in that direction makes the path appear.

| non_priority | path does not load on mini map after a jump occasionally after a manual jump the compass will point the user in the direction of a path that hasn t appeared on the mini map yet taking a step in that direction makes the path appear | 0 |
104,745 | 16,621,080,565 | IssuesEvent | 2021-06-03 01:09:51 | ekediala/bluetooth-scanner | https://api.github.com/repos/ekediala/bluetooth-scanner | opened | CVE-2020-1912 (High) detected in hermes-engine-0.2.1.tgz | security vulnerability | ## CVE-2020-1912 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>hermes-engine-0.2.1.tgz</b></p></summary>
<p>A JavaScript engine optimized for running React Native on Android</p>
<p>Library home page: <a href="https://registry.npmjs.org/hermes-engine/-/hermes-engine-0.2.1.tgz">https://registry.npmjs.org/hermes-engine/-/hermes-engine-0.2.1.tgz</a></p>
<p>Path to dependency file: bluetooth-scanner/package.json</p>
<p>Path to vulnerable library: bluetooth-scanner/node_modules/hermes-engine/package.json</p>
<p>
Dependency Hierarchy:
- react-native-0.61.5.tgz (Root Library)
- :x: **hermes-engine-0.2.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An out-of-bounds read/write vulnerability when executing lazily compiled inner generator functions in Facebook Hermes prior to commit 091835377369c8fd5917d9b87acffa721ad2a168 allows attackers to potentially execute arbitrary code via crafted JavaScript. Note that this is only exploitable if the application using Hermes permits evaluation of untrusted JavaScript. Hence, most React Native applications are not affected.
<p>Publish Date: 2020-09-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1912>CVE-2020-1912</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/facebook/hermes/releases/tag/v0.7.0">https://github.com/facebook/hermes/releases/tag/v0.7.0</a></p>
<p>Release Date: 2020-09-15</p>
<p>Fix Resolution: v0.7.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-1912 (High) detected in hermes-engine-0.2.1.tgz - ## CVE-2020-1912 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>hermes-engine-0.2.1.tgz</b></p></summary>
<p>A JavaScript engine optimized for running React Native on Android</p>
<p>Library home page: <a href="https://registry.npmjs.org/hermes-engine/-/hermes-engine-0.2.1.tgz">https://registry.npmjs.org/hermes-engine/-/hermes-engine-0.2.1.tgz</a></p>
<p>Path to dependency file: bluetooth-scanner/package.json</p>
<p>Path to vulnerable library: bluetooth-scanner/node_modules/hermes-engine/package.json</p>
<p>
Dependency Hierarchy:
- react-native-0.61.5.tgz (Root Library)
- :x: **hermes-engine-0.2.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An out-of-bounds read/write vulnerability when executing lazily compiled inner generator functions in Facebook Hermes prior to commit 091835377369c8fd5917d9b87acffa721ad2a168 allows attackers to potentially execute arbitrary code via crafted JavaScript. Note that this is only exploitable if the application using Hermes permits evaluation of untrusted JavaScript. Hence, most React Native applications are not affected.
<p>Publish Date: 2020-09-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1912>CVE-2020-1912</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/facebook/hermes/releases/tag/v0.7.0">https://github.com/facebook/hermes/releases/tag/v0.7.0</a></p>
<p>Release Date: 2020-09-15</p>
<p>Fix Resolution: v0.7.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in hermes engine tgz cve high severity vulnerability vulnerable library hermes engine tgz a javascript engine optimized for running react native on android library home page a href path to dependency file bluetooth scanner package json path to vulnerable library bluetooth scanner node modules hermes engine package json dependency hierarchy react native tgz root library x hermes engine tgz vulnerable library vulnerability details an out of bounds read write vulnerability when executing lazily compiled inner generator functions in facebook hermes prior to commit allows attackers to potentially execute arbitrary code via crafted javascript note that this is only exploitable if the application using hermes permits evaluation of untrusted javascript hence most react native applications are not affected publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
34,029 | 7,327,856,559 | IssuesEvent | 2018-03-04 15:02:50 | cakephp/cakephp | https://api.github.com/repos/cakephp/cakephp | closed | 3.6: Fatal error loading plugins | Defect plugins | This is a (multiple allowed):
* [x] bug
* [ ] enhancement
* [ ] feature-discussion (RFC)
* CakePHP Version: latest master
### What you did
Using 3.6 pre-beta with a 3.5 app.
And the load command as it always has been:
Plugin::load('Tools', ['bootstrap' => true]);
### What happened
> Fatal error: Cannot redeclare endsWith() (previously declared in /home/vagrant/Apps/x.local/vendor/dereuromark/cakephp-tools/config/bootstrap.php:323) in /home/vagrant/Apps/x.local/vendor/dereuromark/cakephp-tools/config/bootstrap.php on line 323
Seems like it tries to load the plugin now twice?
### What you expected to happen
Still BC - only loading the plugins once (the old way). | 1.0 | 3.6: Fatal error loading plugins - This is a (multiple allowed):
* [x] bug
* [ ] enhancement
* [ ] feature-discussion (RFC)
* CakePHP Version: latest master
### What you did
Using 3.6 pre-beta with a 3.5 app.
And the load command as it always has been:
Plugin::load('Tools', ['bootstrap' => true]);
### What happened
> Fatal error: Cannot redeclare endsWith() (previously declared in /home/vagrant/Apps/x.local/vendor/dereuromark/cakephp-tools/config/bootstrap.php:323) in /home/vagrant/Apps/x.local/vendor/dereuromark/cakephp-tools/config/bootstrap.php on line 323
Seems like it tries to load the plugin now twice?
### What you expected to happen
Still BC - only loading the plugins once (the old way). | non_priority | fatal error loading plugins this is a multiple allowed bug enhancement feature discussion rfc cakephp version latest master what you did using pre beta with a app and the load command as it always has been plugin load tools what happened fatal error cannot redeclare endswith previously declared in home vagrant apps x local vendor dereuromark cakephp tools config bootstrap php in home vagrant apps x local vendor dereuromark cakephp tools config bootstrap php on line seems like it tries to load the plugin now twice what you expected to happen still bc only loading the plugins once the old way | 0 |
27,607 | 6,888,463,695 | IssuesEvent | 2017-11-22 06:05:35 | w3c/aria-practices | https://api.github.com/repos/w3c/aria-practices | closed | Develop example of listbox pattern | code example | The listbox design pattern is at:
http://w3c.github.io/aria-practices/#Listbox
The HTML template and description for two listbox examples is in the examples/listbox/listbox.html file in the repo and can be viewed in RawGit at:
https://rawgit.com/w3c/aria-practices/master/examples/listbox/listbox.html | 1.0 | Develop example of listbox pattern - The listbox design pattern is at:
http://w3c.github.io/aria-practices/#Listbox
The HTML template and description for two listbox examples is in the examples/listbox/listbox.html file in the repo and can be viewed in RawGit at:
https://rawgit.com/w3c/aria-practices/master/examples/listbox/listbox.html | non_priority | develop example of listbox pattern the listbox design pattern is at the html template and description for two listbox examples is in the examples listbox listbox html file in the repo and can be viewed in rawgit at | 0 |
63,812 | 6,885,075,869 | IssuesEvent | 2017-11-21 15:05:15 | brave/browser-ios | https://api.github.com/repos/brave/browser-ios | closed | Manual test run on iOS10 iPhone6 for 1.5.1 | iPhone release-notes/exclude tests | ## Per release specialty tests
- [x] Opening a link in a new tab shows blank tab before loading the page ([#1156](https://github.com/brave/browser-ios/issues/1156))
- [x] Added Share Option for Images ([#1244](https://github.com/brave/browser-ios/issues/1244))
- [x] Add pocket integration suggestion ([#1124](https://github.com/brave/browser-ios/issues/1124))
- [ ] Add ui support for Iphone X iPhone Specific ([#1209](https://github.com/brave/browser-ios/issues/1209))
- [ ] Bookmarks only load when switched to the tab ([#1299](https://github.com/brave/browser-ios/issues/1299))
- [ ] Open link in a new tab causes empty tabs ([#1284](https://github.com/brave/browser-ios/issues/1284))
- [ ] Tab title is missing after app launch for reader mode tab ([#1242](https://github.com/brave/browser-ios/issues/1242))
- [x] Long press new private tab option for iPhone ([#1189](https://github.com/brave/browser-ios/issues/1189))
## Installer
1. [ ] Check that installer is close to the size of last release.
2. [x] Check the Brave version in About and make sure it is EXACTLY as expected.
## Data
1. [ ] Make sure that data from the last version appears in the new version OK.
2. [ ] Test that the previous version's cookies are preserved in the next version.
## Bookmarks
1. [ ] Test that creating a bookmark in the left well works
2. [ ] Test that clicking a bookmark in the left well loads the bookmark
3. [ ] Test that deleting a bookmark in the left well works
4. [ ] Test that creating a bookmark folder works
5. [ ] Test that creating a bookmark inside the created folder works
6. [ ] Test that you are able to add a bookmark directly inside a bookmark folder
7. [ ] Test that you are able to delete a bookmark in edit mode
8. [ ] Test that you are able to delete a bookmark folder with bookmarks inside
9. [ ] Test adding a bookmark domain subpaths is retained and you are successfully able to visit the domain subpath in a new tab
## Context menus
1. [ ] Make sure context menu items in the URL bar work
2. [ ] Make sure context menu items on content work with no selected text.
3. [ ] Make sure context menu items on content work with selected text.
4. [ ] Make sure context menu items on content work inside an editable control (input, textarea, or contenteditable).
5. [ ] Context menu: verify you can Open in Background Tab, and Open in Private Tab
## Find on page
1. [ ] Ensure search box is shown when selected via the share menu
2. [ ] Test successful find
3. [ ] Test forward and backward find navigation
4. [ ] Test failed find shows 0 results
## Private Mode
1. [ ] Create private tab, go to http://google.com, search for 'yumyums', exit private mode, go to http://google.com search box and begin typing 'yumyums' and verify that word is not in the autocomplete list
## Reader Mode
1. [ ] Visit http://m.slashdot.org, open any article, verify the reader mode icon is shown in the URL bar
2. [ ] Verify tapping on the reader mode icon opens the article in reader mode
3. [ ] Edit reader mode settings and open different pages in reader mode and verify if the setting is retained across each article
## History
1. [ ] On youtube.com, thestar.com (or any other site using push state nav), navigate the site and verify history is added. Also note if the progress bar activates and shows progress.
2. [ ] Settings > Clear Private Data, and clear all. Check history is cleared, and top sites are cleared.
## Shields Settings
1. [ ] Enable all switches in settings and visit a site and disable block scripts. Kill and relaunch app and verify if the site shield settings are retained
## Site hacks
1. [ ] Test https://www.twitch.tv/adobe sub-page loads a video and you can play it
## Downloads
1. [ ] Test that you can save an image from a site.
## Fullscreen
1. [ ] Test that entering HTML5 full screen works. And pressing restore to go back exits full screen. (youtube.com)
## Gestures
1. [ ] Test zoom in / out gestures work
2. [ ] Test that navigating to a different origin resets the zoom
3. [ ] Swipe back and forward to navigate, verify this works as expected
## Password Managers
1. [ ] Test tapping on 1Password on the slide out keyboard launches 1Password App and able to select the stored credentials
2. [ ] Test tapping on bitwarden password manager in the autofill field launches the app and autofills the stored data
## Sync
1. [ ] Ensure you are able to scan the QR code and sync with laptop
2. [ ] Ensure the bookmarks from laptop shows up on the mobile after sync completes
3. [ ] Add a bookmark on mobile and check if it gets synced to the laptop
## Bravery settings
1. [ ] Check that HTTPS Everywhere works by loading https://https-everywhere.badssl.com/
2. [ ] Turning HTTPS Everywhere off and shields off both disable the redirect to https://https-everywhere.badssl.com/
3. [ ] Check that block ad and unblock ad works on http://slashdot.org
4. [ ] Check that toggling to blocking and allow ads works as expected.
5. [ ] Test that clicking through a cert error in https://badssl.com/ works.
6. [ ] Test that Safe Browsing works (http://downloadme.org/)
7. [ ] Turning Safe Browsing off and shields off both disable safe browsing for http://downloadme.org/.
8. [ ] Enable block script globally from settings, Visit https://brianbondy.com/, nothing should load. Tap on Shields and disable block script, page should load properly
9. [ ] Test that preferences default Bravery settings take effect on pages with no site settings.
10. [ ] Test that turning on fingerprinting protection in preferences shows 1 fingerprints blocked at https://browserleaks.com/canvas . Test that turning it off in the Bravery menu shows 0 fingerprints blocked.
11. [ ] Test that 3rd party storage results are blank at https://jsfiddle.net/7ke9r14a/7/ when 3rd party cookies are blocked.
12. [ ] Test that audio fingerprint is blocked at https://audiofingerprint.openwpm.com/ when fingerprinting protection is on.
## Content tests
1. [ ] Go to https://brianbondy.com/ and click on the twitter icon on the top right. Test that context menus work in the new twitter tab
2. [ ] Load twitter and click on a tweet so the popup div shows. Click to dismiss and repeat with another div. Make sure it shows
3. [ ] Go to https://trac.torproject.org/projects/tor/login and make sure that the password can be saved. Make sure the saved password is auto-populated when you visit the site again.
4. [ ] Open an email on http://mail.google.com/ or inbox.google.com and click on a link. Make sure it works
5. [ ] Test that PDF is loaded at http://www.orimi.com/pdf-test.pdf
6. [ ] Test that https://mixed-script.badssl.com/ shows up as grey not red (no mixed content scripts are run)
7. [ ] Test that news.google.com sites open in a new tab (due to target being _blank)
## Top sites view
1. [ ] Long-press on top sites to get to deletion mode, and delete a top site (note this will stop that site from showing up again on top sites, so you may not want to do this a site you want to keep there)
## Background
1. [ ] Start loading a page, background the app, wait >5 sec, then bring to front, ensure splash screen is not shown
## Session storage
1. [ ] Test that tabs restore when closed, including active tab.
| 1.0 | Manual test run on iOS10 iPhone6 for 1.5.1 - ## Per release specialty tests
- [x] Opening a link in a new tab shows blank tab before loading the page ([#1156](https://github.com/brave/browser-ios/issues/1156))
- [x] Added Share Option for Images ([#1244](https://github.com/brave/browser-ios/issues/1244))
- [x] Add pocket integration suggestion ([#1124](https://github.com/brave/browser-ios/issues/1124))
- [ ] Add ui support for Iphone X iPhone Specific ([#1209](https://github.com/brave/browser-ios/issues/1209))
- [ ] Bookmarks only load when switched to the tab ([#1299](https://github.com/brave/browser-ios/issues/1299))
- [ ] Open link in a new tab causes empty tabs ([#1284](https://github.com/brave/browser-ios/issues/1284))
- [ ] Tab title is missing after app launch for reader mode tab ([#1242](https://github.com/brave/browser-ios/issues/1242))
- [x] Long press new private tab option for iPhone ([#1189](https://github.com/brave/browser-ios/issues/1189))
## Installer
1. [ ] Check that installer is close to the size of last release.
2. [x] Check the Brave version in About and make sure it is EXACTLY as expected.
## Data
1. [ ] Make sure that data from the last version appears in the new version OK.
2. [ ] Test that the previous version's cookies are preserved in the next version.
## Bookmarks
1. [ ] Test that creating a bookmark in the left well works
2. [ ] Test that clicking a bookmark in the left well loads the bookmark
3. [ ] Test that deleting a bookmark in the left well works
4. [ ] Test that creating a bookmark folder works
5. [ ] Test that creating a bookmark inside the created folder works
6. [ ] Test that you are able to add a bookmark directly inside a bookmark folder
7. [ ] Test that you are able to delete a bookmark in edit mode
8. [ ] Test that you are able to delete a bookmark folder with bookmarks inside
9. [ ] Test adding a bookmark domain subpaths is retained and you are successfully able to visit the domain subpath in a new tab
## Context menus
1. [ ] Make sure context menu items in the URL bar work
2. [ ] Make sure context menu items on content work with no selected text.
3. [ ] Make sure context menu items on content work with selected text.
4. [ ] Make sure context menu items on content work inside an editable control (input, textarea, or contenteditable).
5. [ ] Context menu: verify you can Open in Background Tab, and Open in Private Tab
## Find on page
1. [ ] Ensure search box is shown when selected via the share menu
2. [ ] Test successful find
3. [ ] Test forward and backward find navigation
4. [ ] Test failed find shows 0 results
## Private Mode
1. [ ] Create private tab, go to http://google.com, search for 'yumyums', exit private mode, go to http://google.com search box and begin typing 'yumyums' and verify that word is not in the autocomplete list
## Reader Mode
1. [ ] Visit http://m.slashdot.org, open any article, verify the reader mode icon is shown in the URL bar
2. [ ] Verify tapping on the reader mode icon opens the article in reader mode
3. [ ] Edit reader mode settings and open different pages in reader mode and verify if the setting is retained across each article
## History
1. [ ] On youtube.com, thestar.com (or any other site using push state nav), navigate the site and verify history is added. Also note if the progress bar activates and shows progress.
2. [ ] Settings > Clear Private Data, and clear all. Check history is cleared, and top sites are cleared.
## Shields Settings
1. [ ] Enable all switches in settings and visit a site and disable block scripts. Kill and relaunch app and verify if the site shield settings are retained
## Site hacks
1. [ ] Test https://www.twitch.tv/adobe sub-page loads a video and you can play it
## Downloads
1. [ ] Test that you can save an image from a site.
## Fullscreen
1. [ ] Test that entering HTML5 full screen works. And pressing restore to go back exits full screen. (youtube.com)
## Gestures
1. [ ] Test zoom in / out gestures work
2. [ ] Test that navigating to a different origin resets the zoom
3. [ ] Swipe back and forward to navigate, verify this works as expected
## Password Managers
1. [ ] Test tapping on 1Password on the slide out keyboard launches 1Password App and able to select the stored credentials
2. [ ] Test tapping on bitwarden password manager in the autofill field launches the app and autofills the stored data
## Sync
1. [ ] Ensure you are able to scan the QR code and sync with laptop
2. [ ] Ensure the bookmarks from laptop shows up on the mobile after sync completes
3. [ ] Add a bookmark on mobile and check if it gets synced to the laptop
## Bravery settings
1. [ ] Check that HTTPS Everywhere works by loading https://https-everywhere.badssl.com/
2. [ ] Turning HTTPS Everywhere off and shields off both disable the redirect to https://https-everywhere.badssl.com/
3. [ ] Check that block ad and unblock ad works on http://slashdot.org
4. [ ] Check that toggling to blocking and allow ads works as expected.
5. [ ] Test that clicking through a cert error in https://badssl.com/ works.
6. [ ] Test that Safe Browsing works (http://downloadme.org/)
7. [ ] Turning Safe Browsing off and shields off both disable safe browsing for http://downloadme.org/.
8. [ ] Enable block script globally from settings, Visit https://brianbondy.com/, nothing should load. Tap on Shields and disable block script, page should load properly
9. [ ] Test that preferences default Bravery settings take effect on pages with no site settings.
10. [ ] Test that turning on fingerprinting protection in preferences shows 1 fingerprints blocked at https://browserleaks.com/canvas . Test that turning it off in the Bravery menu shows 0 fingerprints blocked.
11. [ ] Test that 3rd party storage results are blank at https://jsfiddle.net/7ke9r14a/7/ when 3rd party cookies are blocked.
12. [ ] Test that audio fingerprint is blocked at https://audiofingerprint.openwpm.com/ when fingerprinting protection is on.
## Content tests
1. [ ] Go to https://brianbondy.com/ and click on the twitter icon on the top right. Test that context menus work in the new twitter tab
2. [ ] Load twitter and click on a tweet so the popup div shows. Click to dismiss and repeat with another div. Make sure it shows
3. [ ] Go to https://trac.torproject.org/projects/tor/login and make sure that the password can be saved. Make sure the saved password is auto-populated when you visit the site again.
4. [ ] Open an email on http://mail.google.com/ or inbox.google.com and click on a link. Make sure it works
5. [ ] Test that PDF is loaded at http://www.orimi.com/pdf-test.pdf
6. [ ] Test that https://mixed-script.badssl.com/ shows up as grey not red (no mixed content scripts are run)
7. [ ] Test that news.google.com sites open in a new tab (due to target being _blank)
## Top sites view
1. [ ] Long-press on top sites to get to deletion mode, and delete a top site (note this will stop that site from showing up again on top sites, so you may not want to do this a site you want to keep there)
## Background
1. [ ] Start loading a page, background the app, wait >5 sec, then bring to front, ensure splash screen is not shown
## Session storage
1. [ ] Test that tabs restore when closed, including active tab.
| non_priority | manual test run on for per release specialty tests opening a link in a new tab shows blank tab before loading the page added share option for images add pocket integration suggestion add ui support for iphone x iphone specific bookmarks only load when switched to the tab open link in a new tab causes empty tabs tab title is missing after app launch for reader mode tab long press new private tab option for iphone installer check that installer is close to the size of last release check the brave version in about and make sure it is exactly as expected data make sure that data from the last version appears in the new version ok test that the previous version s cookies are preserved in the next version bookmarks test that creating a bookmark in the left well works test that clicking a bookmark in the left well loads the bookmark test that deleting a bookmark in the left well works test that creating a bookmark folder works test that creating a bookmark inside the created folder works test that you are able to add a bookmark directly inside a bookmark folder test that you are able to delete a bookmark in edit mode test that you are able to delete a bookmark folder with bookmarks inside test adding a bookmark domain subpaths is retained and you are successfully able to visit the domain subpath in a new tab context menus make sure context menu items in the url bar work make sure context menu items on content work with no selected text make sure context menu items on content work with selected text make sure context menu items on content work inside an editable control input textarea or contenteditable context menu verify you can open in background tab and open in private tab find on page ensure search box is shown when selected via the share menu test successful find test forward and backward find navigation test failed find shows results private mode create private tab go to search for yumyums exit private mode go to search box and begin typing yumyums and verify that word is not in the autocomplete list reader mode visit open any article verify the reader mode icon is shown in the url bar verify tapping on the reader mode icon opens the article in reader mode edit reader mode settings and open different pages in reader mode and verify if the setting is retained across each article history on youtube com thestar com or any other site using push state nav navigate the site and verify history is added also note if the progress bar activates and shows progress settings clear private data and clear all check history is cleared and top sites are cleared shields settings enable all switches in settings and visit a site and disable block scripts kill and relaunch app and verify if the site shield settings are retained site hacks test sub page loads a video and you can play it downloads test that you can save an image from a site fullscreen test that entering full screen works and pressing restore to go back exits full screen youtube com gestures test zoom in out gestures work test that navigating to a different origin resets the zoom swipe back and forward to navigate verify this works as expected password managers test tapping on on the slide out keyboard launches app and able to select the stored credentials test tapping on bitwarden password manager in the autofill field launches the app and autofills the stored data sync ensure you are able to scan the qr code and sync with laptop ensure the bookmarks from laptop shows up on the mobile after sync completes add a bookmark on mobile and check if it gets synced to the laptop bravery settings check that https everywhere works by loading turning https everywhere off and shields off both disable the redirect to check that block ad and unblock ad works on check that toggling to blocking and allow ads works as expected test that clicking through a cert error in works test that safe browsing works turning safe browsing off and shields off both disable safe browsing for enable block script globally from settings visit nothing should load tap on shields and disable block script page should load properly test that preferences default bravery settings take effect on pages with no site settings test that turning on fingerprinting protection in preferences shows fingerprints blocked at test that turning it off in the bravery menu shows fingerprints blocked test that party storage results are blank at when party cookies are blocked test that audio fingerprint is blocked at when fingerprinting protection is on content tests go to and click on the twitter icon on the top right test that context menus work in the new twitter tab load twitter and click on a tweet so the popup div shows click to dismiss and repeat with another div make sure it shows go to and make sure that the password can be saved make sure the saved password is auto populated when you visit the site again open an email on or inbox google com and click on a link make sure it works test that pdf is loaded at test that shows up as grey not red no mixed content scripts are run test that news google com sites open in a new tab due to target being blank top sites view long press on top sites to get to deletion mode and delete a top site note this will stop that site from showing up again on top sites so you may not want to do this a site you want to keep there background start loading a page background the app wait sec then bring to front ensure splash screen is not shown session storage test that tabs restore when closed including active tab | 0 |
1,856 | 6,577,402,365 | IssuesEvent | 2017-09-12 00:39:53 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | os_router: HA interfaces break os_router module. | affects_2.0 bug_report cloud openstack waiting_on_maintainer | ##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
os_router.py
##### ANSIBLE VERSION
```
ansible 2.0.1.0
```
##### OS / ENVIRONMENT
NA
##### SUMMARY
The HA ports cause issues when deleting a router through this module.
This means that any router updates or deletions through this module will fail.
Currently, for updates, the code retrieves all internal interfaces of a router(including the HA ports), then tries to delete them.
See:
https://github.com/ansible/ansible-modules-core/blob/devel/cloud/openstack/os_router.py#L330
The principle is the same for deletion.
However, neutron does not allow these interfaces to be deleted and will throw an error on any such attempt.
##### STEPS TO REPRODUCE
1. Create a router using the os_router module on an environment running Neutron L3HA using the VRRP protocol(i'm unsure about DVR).
2. Update it's configurations
3. Re-run the playbooks. They will fail when trying to delete the HA ports.
##### EXPECTED RESULTS
The playbooks will fail to run.
| True | os_router: HA interfaces break os_router module. - ##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
os_router.py
##### ANSIBLE VERSION
```
ansible 2.0.1.0
```
##### OS / ENVIRONMENT
NA
##### SUMMARY
The HA ports cause issues when deleting a router through this module.
This means that any router updates or deletions through this module will fail.
Currently, for updates, the code retrieves all internal interfaces of a router(including the HA ports), then tries to delete them.
See:
https://github.com/ansible/ansible-modules-core/blob/devel/cloud/openstack/os_router.py#L330
The principle is the same for deletion.
However, neutron does not allow these interfaces to be deleted and will throw an error on any such attempt.
##### STEPS TO REPRODUCE
1. Create a router using the os_router module on an environment running Neutron L3HA using the VRRP protocol(i'm unsure about DVR).
2. Update it's configurations
3. Re-run the playbooks. They will fail when trying to delete the HA ports.
##### EXPECTED RESULTS
The playbooks will fail to run.
| non_priority | os router ha interfaces break os router module issue type bug report component name os router py ansible version ansible os environment na summary the ha ports cause issues when deleting a router through this module this means that any router updates or deletions through this module will fail currently for updates the code retrieves all internal interfaces of a router including the ha ports then tries to delete them see the principle is the same for deletion however neutron does not allow these interfaces to be deleted and will throw an error on any such attempt steps to reproduce create a router using the os router module on an environment running neutron using the vrrp protocol i m unsure about dvr update it s configurations re run the playbooks they will fail when trying to delete the ha ports expected results the playbooks will fail to run | 0 |
232,537 | 25,578,878,355 | IssuesEvent | 2022-12-01 01:32:14 | BrentWJacobs/gay | https://api.github.com/repos/BrentWJacobs/gay | opened | torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl: 1 vulnerabilities (highest severity is: 9.8) | security vulnerability | <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl</b></p></summary>
<p>Tensors and Dynamic neural networks in Python with strong GPU acceleration</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/7a/fb/b1b11ae95ffa7099ca2e60ed5945e56130cc8740208f42aa77f17e03ab3c/torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/7a/fb/b1b11ae95ffa7099ca2e60ed5945e56130cc8740208f42aa77f17e03ab3c/torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl</a></p>
<p>Path to dependency file: /requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt,/requirements.txt</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/BrentWJacobs/gay/commit/f25cca0fbf5e4573f9ff61881eee8ec29c5ef6d7">f25cca0fbf5e4573f9ff61881eee8ec29c5ef6d7</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (torch version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-45907](https://www.mend.io/vulnerability-database/CVE-2022-45907) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl | Direct | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-45907</summary>
### Vulnerable Library - <b>torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl</b></p>
<p>Tensors and Dynamic neural networks in Python with strong GPU acceleration</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/7a/fb/b1b11ae95ffa7099ca2e60ed5945e56130cc8740208f42aa77f17e03ab3c/torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/7a/fb/b1b11ae95ffa7099ca2e60ed5945e56130cc8740208f42aa77f17e03ab3c/torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl</a></p>
<p>Path to dependency file: /requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/BrentWJacobs/gay/commit/f25cca0fbf5e4573f9ff61881eee8ec29c5ef6d7">f25cca0fbf5e4573f9ff61881eee8ec29c5ef6d7</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In PyTorch before trunk/89695, torch.jit.annotations.parse_type_line can cause arbitrary code execution because eval is used unsafely.
<p>Publish Date: 2022-11-26
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-45907>CVE-2022-45907</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details> | True | torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl: 1 vulnerabilities (highest severity is: 9.8) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl</b></p></summary>
<p>Tensors and Dynamic neural networks in Python with strong GPU acceleration</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/7a/fb/b1b11ae95ffa7099ca2e60ed5945e56130cc8740208f42aa77f17e03ab3c/torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/7a/fb/b1b11ae95ffa7099ca2e60ed5945e56130cc8740208f42aa77f17e03ab3c/torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl</a></p>
<p>Path to dependency file: /requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt,/requirements.txt</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/BrentWJacobs/gay/commit/f25cca0fbf5e4573f9ff61881eee8ec29c5ef6d7">f25cca0fbf5e4573f9ff61881eee8ec29c5ef6d7</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (torch version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-45907](https://www.mend.io/vulnerability-database/CVE-2022-45907) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl | Direct | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-45907</summary>
### Vulnerable Library - <b>torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl</b></p>
<p>Tensors and Dynamic neural networks in Python with strong GPU acceleration</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/7a/fb/b1b11ae95ffa7099ca2e60ed5945e56130cc8740208f42aa77f17e03ab3c/torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/7a/fb/b1b11ae95ffa7099ca2e60ed5945e56130cc8740208f42aa77f17e03ab3c/torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl</a></p>
<p>Path to dependency file: /requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **torch-1.13.0-cp37-cp37m-manylinux1_x86_64.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/BrentWJacobs/gay/commit/f25cca0fbf5e4573f9ff61881eee8ec29c5ef6d7">f25cca0fbf5e4573f9ff61881eee8ec29c5ef6d7</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In PyTorch before trunk/89695, torch.jit.annotations.parse_type_line can cause arbitrary code execution because eval is used unsafely.
<p>Publish Date: 2022-11-26
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-45907>CVE-2022-45907</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details> | non_priority | torch whl vulnerabilities highest severity is vulnerable library torch whl tensors and dynamic neural networks in python with strong gpu acceleration library home page a href path to dependency file requirements txt path to vulnerable library requirements txt requirements txt found in head commit a href vulnerabilities cve severity cvss dependency type fixed in torch version remediation available high torch whl direct n a details cve vulnerable library torch whl tensors and dynamic neural networks in python with strong gpu acceleration library home page a href path to dependency file requirements txt path to vulnerable library requirements txt requirements txt dependency hierarchy x torch whl vulnerable library found in head commit a href found in base branch main vulnerability details in pytorch before trunk torch jit annotations parse type line can cause arbitrary code execution because eval is used unsafely publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href step up your open source security game with mend | 0 |
200,496 | 15,109,252,884 | IssuesEvent | 2021-02-08 17:36:27 | phetsims/vector-addition | https://api.github.com/repos/phetsims/vector-addition | closed | ?fuzzPointers=2 error: "Cannot drag tip when not on graph" | type:automated-testing | From https://github.com/phetsims/aqua/issues/106, this assertion is thrown when fuzzing with ?fuzzPointers=2.
> Cannot drag tip when not on graph.
It didn't happen right away, I didn't encounter this until fuzzing the sim for about 45 seconds. Assigning to the responsible dev for this repo. | 1.0 | ?fuzzPointers=2 error: "Cannot drag tip when not on graph" - From https://github.com/phetsims/aqua/issues/106, this assertion is thrown when fuzzing with ?fuzzPointers=2.
> Cannot drag tip when not on graph.
It didn't happen right away, I didn't encounter this until fuzzing the sim for about 45 seconds. Assigning to the responsible dev for this repo. | non_priority | fuzzpointers error cannot drag tip when not on graph from this assertion is thrown when fuzzing with fuzzpointers cannot drag tip when not on graph it didn t happen right away i didn t encounter this until fuzzing the sim for about seconds assigning to the responsible dev for this repo | 0 |
79,463 | 7,716,579,618 | IssuesEvent | 2018-05-23 11:09:20 | percyfal/pytest-ngsfixtures | https://api.github.com/repos/percyfal/pytest-ngsfixtures | closed | Specific RNA-Seq data set needed | invalid pytest-data | STAR toTranscriptome now produces empty output due to lack of concordance between genome sequence data and transcriptome. Need RNA-Seq specific data. | 1.0 | Specific RNA-Seq data set needed - STAR toTranscriptome now produces empty output due to lack of concordance between genome sequence data and transcriptome. Need RNA-Seq specific data. | non_priority | specific rna seq data set needed star totranscriptome now produces empty output due to lack of concordance between genome sequence data and transcriptome need rna seq specific data | 0 |
78,696 | 9,785,741,240 | IssuesEvent | 2019-06-09 10:27:32 | josdejong/mathjs | https://api.github.com/repos/josdejong/mathjs | closed | Refactor mathjs to a plugin-based library | design decision | This would
1. drastically increase maintainability
2. allow people to contribute more easily by developing plugins (e.g. TypedMatrix)
3. allow to keep your frontend JS small by loading only what you need, e.g. :
``` javascript
var mathjs = require('mathjs') // the core
var mathjs_matrix = require('mathjs-matrix') // the matrix plugin
```
| 1.0 | Refactor mathjs to a plugin-based library - This would
1. drastically increase maintainability
2. allow people to contribute more easily by developing plugins (e.g. TypedMatrix)
3. allow to keep your frontend JS small by loading only what you need, e.g. :
``` javascript
var mathjs = require('mathjs') // the core
var mathjs_matrix = require('mathjs-matrix') // the matrix plugin
```
| non_priority | refactor mathjs to a plugin based library this would drastically increase maintainability allow people to contribute more easily by developing plugins e g typedmatrix allow to keep your frontend js small by loading only what you need e g javascript var mathjs require mathjs the core var mathjs matrix require mathjs matrix the matrix plugin | 0 |
104,212 | 8,968,636,210 | IssuesEvent | 2019-01-29 08:42:08 | OpenTechFund/opentech.fund | https://api.github.com/repos/OpenTechFund/opentech.fund | closed | Status updates for OTF staff | needs tests | **Acceptance criteria**
- [x] Updates to statuses for OTF staff as shown in #786
**QA criteria**
Dev:
- [x] checked feature meets acceptance criteria/conforms exactly to the specification.
- [x] provided good unit test coverage (if this is non-trivial behaviour).
- [x] checked all tests for the project pass with this feature enabled.
- [x] checked code conforms to the project coding standards.
- [x] had code reviewed by another developer and resolved any issues raised.
- [x] tested this feature as an end user of the website/app (Can I get to it? Is it useable? Can I break it? Does it work in an end-to-end context?)
- [x] checked that this feature works on the server/s I am deploying it to.
QA:
* [x] tested this feature as a front end user and it meets the acceptance criteria/conforms to the specification and design.
* [x] checked that the feature works on the server/s deployed to | 1.0 | Status updates for OTF staff - **Acceptance criteria**
- [x] Updates to statuses for OTF staff as shown in #786
**QA criteria**
Dev:
- [x] checked feature meets acceptance criteria/conforms exactly to the specification.
- [x] provided good unit test coverage (if this is non-trivial behaviour).
- [x] checked all tests for the project pass with this feature enabled.
- [x] checked code conforms to the project coding standards.
- [x] had code reviewed by another developer and resolved any issues raised.
- [x] tested this feature as an end user of the website/app (Can I get to it? Is it useable? Can I break it? Does it work in an end-to-end context?)
- [x] checked that this feature works on the server/s I am deploying it to.
QA:
* [x] tested this feature as a front end user and it meets the acceptance criteria/conforms to the specification and design.
* [x] checked that the feature works on the server/s deployed to | non_priority | status updates for otf staff acceptance criteria updates to statuses for otf staff as shown in qa criteria dev checked feature meets acceptance criteria conforms exactly to the specification provided good unit test coverage if this is non trivial behaviour checked all tests for the project pass with this feature enabled checked code conforms to the project coding standards had code reviewed by another developer and resolved any issues raised tested this feature as an end user of the website app can i get to it is it useable can i break it does it work in an end to end context checked that this feature works on the server s i am deploying it to qa tested this feature as a front end user and it meets the acceptance criteria conforms to the specification and design checked that the feature works on the server s deployed to | 0 |
125,331 | 10,340,038,885 | IssuesEvent | 2019-09-03 20:51:03 | rancher/rancher | https://api.github.com/repos/rancher/rancher | closed | [UI] Add ap-east-1 to eks regions | [zube]: To Test kind/enhancement team/ui | <!--
Please search for existing issues first, then read https://rancher.com/docs/rancher/v2.x/en/contributing/#bugs-issues-or-questions to see what we expect in an issue
For security issues, please email security@rancher.com instead of posting a public issue in GitHub. You may (but are not required to) use the GPG key located on Keybase.
-->
**What kind of request is this (question/bug/enhancement/feature request):**
enhancement
**Steps to reproduce (least amount of steps as possible):**
go to create an eks cluster
**Result:**
ap-east-1 is not provided as an option but is now supported by api.
Backend PR:
https://github.com/rancher/kontainer-engine/pull/182 | 1.0 | [UI] Add ap-east-1 to eks regions - <!--
Please search for existing issues first, then read https://rancher.com/docs/rancher/v2.x/en/contributing/#bugs-issues-or-questions to see what we expect in an issue
For security issues, please email security@rancher.com instead of posting a public issue in GitHub. You may (but are not required to) use the GPG key located on Keybase.
-->
**What kind of request is this (question/bug/enhancement/feature request):**
enhancement
**Steps to reproduce (least amount of steps as possible):**
go to create an eks cluster
**Result:**
ap-east-1 is not provided as an option but is now supported by api.
Backend PR:
https://github.com/rancher/kontainer-engine/pull/182 | non_priority | add ap east to eks regions please search for existing issues first then read to see what we expect in an issue for security issues please email security rancher com instead of posting a public issue in github you may but are not required to use the gpg key located on keybase what kind of request is this question bug enhancement feature request enhancement steps to reproduce least amount of steps as possible go to create an eks cluster result ap east is not provided as an option but is now supported by api backend pr | 0 |
44,479 | 5,630,172,396 | IssuesEvent | 2017-04-05 11:30:30 | tarantool/tarantool | https://api.github.com/repos/tarantool/tarantool | closed | box/net.box.test.lua fails occasionally | flaky test | ```
box/net.box.test.lua [ fail ]
Test failed! Result content mismatch:
--- box/net.box.result Thu Mar 30 17:21:23 2017
+++ box/net.box.reject Thu Mar 30 17:24:02 2017
@@ -1614,7 +1614,7 @@
...
nb.error == "Timeout exceeded";
---
-- true
+- false
...
nb:close();
``` | 1.0 | box/net.box.test.lua fails occasionally - ```
box/net.box.test.lua [ fail ]
Test failed! Result content mismatch:
--- box/net.box.result Thu Mar 30 17:21:23 2017
+++ box/net.box.reject Thu Mar 30 17:24:02 2017
@@ -1614,7 +1614,7 @@
...
nb.error == "Timeout exceeded";
---
-- true
+- false
...
nb:close();
``` | non_priority | box net box test lua fails occasionally box net box test lua test failed result content mismatch box net box result thu mar box net box reject thu mar nb error timeout exceeded true false nb close | 0 |
35,263 | 7,673,991,975 | IssuesEvent | 2018-05-15 01:12:05 | adamhope/testing-github-and-projects | https://api.github.com/repos/adamhope/testing-github-and-projects | closed | Media upload modal not behaving after recent update to Blueprint JS | Defect campaign delivery | https://trello.com/c/rDaN1It4/146-media-upload-modal-not-behaving-after-recent-update-to-blueprint-js
After uploading an image on the media library modal for Classic and Digital the user is re-directed to a blank screen. The image is uploaded however. According to @lkarsai this is caused by the recent update to our front end library Blueprint JS and will need to be fixed. | 1.0 | Media upload modal not behaving after recent update to Blueprint JS - https://trello.com/c/rDaN1It4/146-media-upload-modal-not-behaving-after-recent-update-to-blueprint-js
After uploading an image on the media library modal for Classic and Digital the user is re-directed to a blank screen. The image is uploaded however. According to @lkarsai this is caused by the recent update to our front end library Blueprint JS and will need to be fixed. | non_priority | media upload modal not behaving after recent update to blueprint js after uploading an image on the media library modal for classic and digital the user is re directed to a blank screen the image is uploaded however according to lkarsai this is caused by the recent update to our front end library blueprint js and will need to be fixed | 0 |
5,410 | 5,695,676,796 | IssuesEvent | 2017-04-16 01:41:34 | bschwartz10/little_shop_of_orders | https://api.github.com/repos/bschwartz10/little_shop_of_orders | closed | Authenticated users security | Iteration 3 security & admin ready | Background: An authenticated user
As an Authenticated User
I cannot view another user's private data (current or past orders, etc)
I cannot view the administrator screens or use admin functionality
I cannot make myself an admin | True | Authenticated users security - Background: An authenticated user
As an Authenticated User
I cannot view another user's private data (current or past orders, etc)
I cannot view the administrator screens or use admin functionality
I cannot make myself an admin | non_priority | authenticated users security background an authenticated user as an authenticated user i cannot view another user s private data current or past orders etc i cannot view the administrator screens or use admin functionality i cannot make myself an admin | 0 |
129,020 | 12,397,047,992 | IssuesEvent | 2020-05-20 21:46:50 | edgexfoundry/edgex-docs | https://api.github.com/repos/edgexfoundry/edgex-docs | opened | Architecture diagrams need to be updated with latest version | 2-medium documentation geneva | All the architecture diagrams are out of date. They need to be updated with a form of the one shown here.
https://docs.edgexfoundry.org/1.2/microservices/application/ApplServices/ | 1.0 | Architecture diagrams need to be updated with latest version - All the architecture diagrams are out of date. They need to be updated with a form of the one shown here.
https://docs.edgexfoundry.org/1.2/microservices/application/ApplServices/ | non_priority | architecture diagrams need to be updated with latest version all the architecture diagrams are out of date they need to be updated with a form of the one shown here | 0 |
158,428 | 20,025,256,256 | IssuesEvent | 2022-02-01 20:32:50 | timf-app-test/clojure-tools-build | https://api.github.com/repos/timf-app-test/clojure-tools-build | opened | CVE-2015-9251 (Medium) detected in jquery-1.11.0.min.js | security vulnerability | ## CVE-2015-9251 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.11.0.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.0/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.0/jquery.min.js</a></p>
<p>Path to dependency file: /docs/index.html</p>
<p>Path to vulnerable library: /docs/js/jquery.min.js,/docs/js/jquery.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.11.0.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/timf-app-test/clojure-tools-build/commit/6804bb27c3fe3386d200022d8d833e003e1028ce">6804bb27c3fe3386d200022d8d833e003e1028ce</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.
<p>Publish Date: 2018-01-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251>CVE-2015-9251</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p>
<p>Release Date: 2018-01-18</p>
<p>Fix Resolution: jQuery - v3.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.11.0","packageFilePaths":["/docs/index.html"],"isTransitiveDependency":false,"dependencyTree":"jquery:1.11.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jQuery - v3.0.0","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2015-9251","vulnerabilityDetails":"jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | True | CVE-2015-9251 (Medium) detected in jquery-1.11.0.min.js - ## CVE-2015-9251 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.11.0.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.0/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.0/jquery.min.js</a></p>
<p>Path to dependency file: /docs/index.html</p>
<p>Path to vulnerable library: /docs/js/jquery.min.js,/docs/js/jquery.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.11.0.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/timf-app-test/clojure-tools-build/commit/6804bb27c3fe3386d200022d8d833e003e1028ce">6804bb27c3fe3386d200022d8d833e003e1028ce</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.
<p>Publish Date: 2018-01-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251>CVE-2015-9251</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p>
<p>Release Date: 2018-01-18</p>
<p>Fix Resolution: jQuery - v3.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.11.0","packageFilePaths":["/docs/index.html"],"isTransitiveDependency":false,"dependencyTree":"jquery:1.11.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jQuery - v3.0.0","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2015-9251","vulnerabilityDetails":"jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | non_priority | cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file docs index html path to vulnerable library docs js jquery min js docs js jquery min js dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch master vulnerability details jquery before is vulnerable to cross site scripting xss attacks when a cross domain ajax request is performed without the datatype option causing text javascript responses to be executed publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree jquery isminimumfixversionavailable true minimumfixversion jquery isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails jquery before is vulnerable to cross site scripting xss attacks when a cross domain ajax request is performed without the datatype option causing text javascript responses to be executed vulnerabilityurl | 0 |
328,100 | 24,170,419,989 | IssuesEvent | 2022-09-22 18:40:51 | papermerge/papermerge-core | https://api.github.com/repos/papermerge/papermerge-core | closed | Add multiple OCR languages | documentation docker | Hi, it's possible to have an official worktrough to add multiple ocr languages in papermerge?
Many thank's,
Fabio.
**labels: docker, documentaion** (sorry i can't add labels by myself) | 1.0 | Add multiple OCR languages - Hi, it's possible to have an official worktrough to add multiple ocr languages in papermerge?
Many thank's,
Fabio.
**labels: docker, documentaion** (sorry i can't add labels by myself) | non_priority | add multiple ocr languages hi it s possible to have an official worktrough to add multiple ocr languages in papermerge many thank s fabio labels docker documentaion sorry i can t add labels by myself | 0 |
228,248 | 25,169,636,715 | IssuesEvent | 2022-11-11 01:13:10 | turkdevops/play-with-docker | https://api.github.com/repos/turkdevops/play-with-docker | closed | CVE-2021-27918 (High) detected in github.com/miekg/DNS-v1.0.0 - autoclosed | security vulnerability | ## CVE-2021-27918 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>github.com/miekg/DNS-v1.0.0</b></p></summary>
<p>DNS library in Go</p>
<p>Library home page: <a href="https://proxy.golang.org/github.com/miekg/dns/@v/v1.0.0.zip">https://proxy.golang.org/github.com/miekg/dns/@v/v1.0.0.zip</a></p>
<p>
Dependency Hierarchy:
- :x: **github.com/miekg/DNS-v1.0.0** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/turkdevops/play-with-docker/commit/27377d4ea18db54381a8dc972091f3c342337ec9">27377d4ea18db54381a8dc972091f3c342337ec9</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
encoding/xml in Go before 1.15.9 and 1.16.x before 1.16.1 has an infinite loop if a custom TokenReader (for xml.NewTokenDecoder) returns EOF in the middle of an element. This can occur in the Decode, DecodeElement, or Skip method.
<p>Publish Date: 2021-03-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-27918>CVE-2021-27918</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://groups.google.com/g/golang-announce/c/MfiLYjG-RAw">https://groups.google.com/g/golang-announce/c/MfiLYjG-RAw</a></p>
<p>Release Date: 2021-03-11</p>
<p>Fix Resolution: 1.15.9, 1.16.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-27918 (High) detected in github.com/miekg/DNS-v1.0.0 - autoclosed - ## CVE-2021-27918 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>github.com/miekg/DNS-v1.0.0</b></p></summary>
<p>DNS library in Go</p>
<p>Library home page: <a href="https://proxy.golang.org/github.com/miekg/dns/@v/v1.0.0.zip">https://proxy.golang.org/github.com/miekg/dns/@v/v1.0.0.zip</a></p>
<p>
Dependency Hierarchy:
- :x: **github.com/miekg/DNS-v1.0.0** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/turkdevops/play-with-docker/commit/27377d4ea18db54381a8dc972091f3c342337ec9">27377d4ea18db54381a8dc972091f3c342337ec9</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
encoding/xml in Go before 1.15.9 and 1.16.x before 1.16.1 has an infinite loop if a custom TokenReader (for xml.NewTokenDecoder) returns EOF in the middle of an element. This can occur in the Decode, DecodeElement, or Skip method.
<p>Publish Date: 2021-03-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-27918>CVE-2021-27918</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://groups.google.com/g/golang-announce/c/MfiLYjG-RAw">https://groups.google.com/g/golang-announce/c/MfiLYjG-RAw</a></p>
<p>Release Date: 2021-03-11</p>
<p>Fix Resolution: 1.15.9, 1.16.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in github com miekg dns autoclosed cve high severity vulnerability vulnerable library github com miekg dns dns library in go library home page a href dependency hierarchy x github com miekg dns vulnerable library found in head commit a href found in base branch master vulnerability details encoding xml in go before and x before has an infinite loop if a custom tokenreader for xml newtokendecoder returns eof in the middle of an element this can occur in the decode decodeelement or skip method publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
84,814 | 10,418,942,268 | IssuesEvent | 2019-09-15 12:58:06 | matplotlib/matplotlib | https://api.github.com/repos/matplotlib/matplotlib | closed | Matplotlib NavigationToolbar2Tk disappears when reducing window size | Documentation GUI/tk Good first issue | Using the example in the link below, the toolbar will disappear when reducing the height of the window. I have read about this bug in here and it was stated as resolved but I still have that problem.
https://matplotlib.org/gallery/user_interfaces/embedding_in_tk_sgskip.html#sphx-glr-gallery-user-interfaces-embedding-in-tk-sgskip-py
One solution is to use grid and one frame for FigureCanvasTkAgg and one frame for NavigationToolbar2Tk but then the cursor dont change appearance depending on if zoom, pan etc is selected.
Im using Python 3.7.1 (v3.7.1:260ec2c36a, Oct 20 2018, 14:57:15) [MSC v.1915 64 bit (AMD64)] on win32 and matplotlib v 3.0.2
Kind regards, Daniel | 1.0 | Matplotlib NavigationToolbar2Tk disappears when reducing window size - Using the example in the link below, the toolbar will disappear when reducing the height of the window. I have read about this bug in here and it was stated as resolved but I still have that problem.
https://matplotlib.org/gallery/user_interfaces/embedding_in_tk_sgskip.html#sphx-glr-gallery-user-interfaces-embedding-in-tk-sgskip-py
One solution is to use grid and one frame for FigureCanvasTkAgg and one frame for NavigationToolbar2Tk but then the cursor dont change appearance depending on if zoom, pan etc is selected.
Im using Python 3.7.1 (v3.7.1:260ec2c36a, Oct 20 2018, 14:57:15) [MSC v.1915 64 bit (AMD64)] on win32 and matplotlib v 3.0.2
Kind regards, Daniel | non_priority | matplotlib disappears when reducing window size using the example in the link below the toolbar will disappear when reducing the height of the window i have read about this bug in here and it was stated as resolved but i still have that problem one solution is to use grid and one frame for figurecanvastkagg and one frame for but then the cursor dont change appearance depending on if zoom pan etc is selected im using python oct on and matplotlib v kind regards daniel | 0 |
87,992 | 10,562,519,170 | IssuesEvent | 2019-10-04 18:32:14 | alejandrorojas91/Problem-Set-4 | https://api.github.com/repos/alejandrorojas91/Problem-Set-4 | opened | Tongue twisters | documentation | Since you like tongue twisters, add another one to README.md, because why not. | 1.0 | Tongue twisters - Since you like tongue twisters, add another one to README.md, because why not. | non_priority | tongue twisters since you like tongue twisters add another one to readme md because why not | 0 |
209,747 | 16,057,839,305 | IssuesEvent | 2021-04-23 08:17:22 | geosolutions-it/geonode | https://api.github.com/repos/geosolutions-it/geonode | opened | Test GeoNode for the official release | Testing | @ElenaGallo a full, deep and wide testing of master demo is requested, in preparation of the official 3.2 release.
In invite you to stress it and, possibly, extend the tests in the spreadsheet. | 1.0 | Test GeoNode for the official release - @ElenaGallo a full, deep and wide testing of master demo is requested, in preparation of the official 3.2 release.
In invite you to stress it and, possibly, extend the tests in the spreadsheet. | non_priority | test geonode for the official release elenagallo a full deep and wide testing of master demo is requested in preparation of the official release in invite you to stress it and possibly extend the tests in the spreadsheet | 0 |
20,405 | 6,885,374,537 | IssuesEvent | 2017-11-21 15:55:54 | blackbaud/skyux2 | https://api.github.com/repos/blackbaud/skyux2 | closed | Shared karma config does not pass correct command to getSkyPagesConfig() | bug builder | Kudos to @blackbaud-joshlandi for originally finding (and fixing) this bug. `shared.karma.conf.js` expects `argv.command` to exist and passes it to `getSkyPagesConfig`, but since the karma config is reading directly from the command line, it needs to read `_[0]` instead.
The outcome of this, would mean users can't use `skyuxconfig.test.json`. | 1.0 | Shared karma config does not pass correct command to getSkyPagesConfig() - Kudos to @blackbaud-joshlandi for originally finding (and fixing) this bug. `shared.karma.conf.js` expects `argv.command` to exist and passes it to `getSkyPagesConfig`, but since the karma config is reading directly from the command line, it needs to read `_[0]` instead.
The outcome of this, would mean users can't use `skyuxconfig.test.json`. | non_priority | shared karma config does not pass correct command to getskypagesconfig kudos to blackbaud joshlandi for originally finding and fixing this bug shared karma conf js expects argv command to exist and passes it to getskypagesconfig but since the karma config is reading directly from the command line it needs to read instead the outcome of this would mean users can t use skyuxconfig test json | 0 |
436,770 | 30,569,118,191 | IssuesEvent | 2023-07-20 20:22:24 | epiverse-trace/howto | https://api.github.com/repos/epiverse-trace/howto | opened | add the structure of a howto page to about page | documentation | - ingredients
- steps in code
- steps in detail
- related | 1.0 | add the structure of a howto page to about page - - ingredients
- steps in code
- steps in detail
- related | non_priority | add the structure of a howto page to about page ingredients steps in code steps in detail related | 0 |
92,059 | 8,338,171,293 | IssuesEvent | 2018-09-28 13:35:27 | eggjs/egg | https://api.github.com/repos/eggjs/egg | closed | Local unit test errors, but the remote one is OK with me | type: test | I've been running with the problem for a couple of months when contributing to `Egg` project—— I ALWAYS see errors in the next three cases:





After a detail discussion with @fengmk2,I changed my DNS setting to this following:

Then the `dns` unit tests' erros above are GONE~~~
But another new question comes to me:

So I just wonder how we make this the same as what we see in the remote server of Egg, you see that when submitting to the remote server to check CI, everything is right.
| 1.0 | Local unit test errors, but the remote one is OK with me - I've been running with the problem for a couple of months when contributing to `Egg` project—— I ALWAYS see errors in the next three cases:





After a detail discussion with @fengmk2,I changed my DNS setting to this following:

Then the `dns` unit tests' erros above are GONE~~~
But another new question comes to me:

So I just wonder how we make this the same as what we see in the remote server of Egg, you see that when submitting to the remote server to check CI, everything is right.
| non_priority | local unit test errors but the remote one is ok with me i ve been running with the problem for a couple of months when contributing to egg project—— i always see errors in the next three cases: after a detail discussion with ,i changed my dns setting to this following: then the dns unit tests erros above are gone but another new question comes to me: so i just wonder how we make this the same as what we see in the remote server of egg you see that when submitting to the remote server to check ci everything is right | 0 |
380,501 | 26,422,094,056 | IssuesEvent | 2023-01-13 21:40:52 | Tonomy-Foundation/Tonomy-ID | https://api.github.com/repos/Tonomy-Foundation/Tonomy-ID | closed | Create Readthedocs repo and deployment | documentation | Definition of done
- [ ] Install readthedocs in the Tonomy-ID-SDK repository
- [ ] show that it is working in a local environment
- [ ] readthedocs has some example content
Follow up
- [ ] deploy to staging | 1.0 | Create Readthedocs repo and deployment - Definition of done
- [ ] Install readthedocs in the Tonomy-ID-SDK repository
- [ ] show that it is working in a local environment
- [ ] readthedocs has some example content
Follow up
- [ ] deploy to staging | non_priority | create readthedocs repo and deployment definition of done install readthedocs in the tonomy id sdk repository show that it is working in a local environment readthedocs has some example content follow up deploy to staging | 0 |
38,111 | 10,141,949,466 | IssuesEvent | 2019-08-03 18:56:55 | TerryCavanagh/diceydungeonsbeta | https://api.github.com/repos/TerryCavanagh/diceydungeonsbeta | closed | Playing tutorial twice in one session causes fight background not to be loaded the second time | v0.6: 28th June Build | 
Repro: Do tutorial, die to hothead, reset progress from the menu, do tutorial again. No fight backgrounds will appear until you finish the tutorial. ("Floor 2", the first floor after you defeat Jester, will be completely normal.) | 1.0 | Playing tutorial twice in one session causes fight background not to be loaded the second time - 
Repro: Do tutorial, die to hothead, reset progress from the menu, do tutorial again. No fight backgrounds will appear until you finish the tutorial. ("Floor 2", the first floor after you defeat Jester, will be completely normal.) | non_priority | playing tutorial twice in one session causes fight background not to be loaded the second time repro do tutorial die to hothead reset progress from the menu do tutorial again no fight backgrounds will appear until you finish the tutorial floor the first floor after you defeat jester will be completely normal | 0 |
100,278 | 16,486,713,079 | IssuesEvent | 2021-05-24 19:08:21 | CrazyKidJack/WebGoat_2.0_clone | https://api.github.com/repos/CrazyKidJack/WebGoat_2.0_clone | closed | CVE-2020-36048 (High) detected in engine.io-3.2.1.tgz - autoclosed | security vulnerability | ## CVE-2020-36048 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>engine.io-3.2.1.tgz</b></p></summary>
<p>The realtime engine behind Socket.IO. Provides the foundation of a bidirectional connection between client and server</p>
<p>Library home page: <a href="https://registry.npmjs.org/engine.io/-/engine.io-3.2.1.tgz">https://registry.npmjs.org/engine.io/-/engine.io-3.2.1.tgz</a></p>
<p>Path to dependency file: WebGoat_2.0_clone/docs/package.json</p>
<p>Path to vulnerable library: WebGoat_2.0_clone/docs/node_modules/engine.io/package.json</p>
<p>
Dependency Hierarchy:
- browser-sync-2.26.3.tgz (Root Library)
- socket.io-2.1.1.tgz
- :x: **engine.io-3.2.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/CrazyKidJack/WebGoat_2.0_clone/commits/bf2e3239dd01ebad5bdcf3161aa931ddd47755ff">bf2e3239dd01ebad5bdcf3161aa931ddd47755ff</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Engine.IO before 4.0.0 allows attackers to cause a denial of service (resource consumption) via a POST request to the long polling transport.
<p>Publish Date: 2021-01-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36048>CVE-2020-36048</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-36048">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-36048</a></p>
<p>Release Date: 2021-01-08</p>
<p>Fix Resolution: engine.io - 4.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"engine.io","packageVersion":"3.2.1","packageFilePaths":["/docs/package.json"],"isTransitiveDependency":true,"dependencyTree":"browser-sync:2.26.3;socket.io:2.1.1;engine.io:3.2.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"engine.io - 4.0.0"}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-36048","vulnerabilityDetails":"Engine.IO before 4.0.0 allows attackers to cause a denial of service (resource consumption) via a POST request to the long polling transport.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36048","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | CVE-2020-36048 (High) detected in engine.io-3.2.1.tgz - autoclosed - ## CVE-2020-36048 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>engine.io-3.2.1.tgz</b></p></summary>
<p>The realtime engine behind Socket.IO. Provides the foundation of a bidirectional connection between client and server</p>
<p>Library home page: <a href="https://registry.npmjs.org/engine.io/-/engine.io-3.2.1.tgz">https://registry.npmjs.org/engine.io/-/engine.io-3.2.1.tgz</a></p>
<p>Path to dependency file: WebGoat_2.0_clone/docs/package.json</p>
<p>Path to vulnerable library: WebGoat_2.0_clone/docs/node_modules/engine.io/package.json</p>
<p>
Dependency Hierarchy:
- browser-sync-2.26.3.tgz (Root Library)
- socket.io-2.1.1.tgz
- :x: **engine.io-3.2.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/CrazyKidJack/WebGoat_2.0_clone/commits/bf2e3239dd01ebad5bdcf3161aa931ddd47755ff">bf2e3239dd01ebad5bdcf3161aa931ddd47755ff</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Engine.IO before 4.0.0 allows attackers to cause a denial of service (resource consumption) via a POST request to the long polling transport.
<p>Publish Date: 2021-01-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36048>CVE-2020-36048</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-36048">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-36048</a></p>
<p>Release Date: 2021-01-08</p>
<p>Fix Resolution: engine.io - 4.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"engine.io","packageVersion":"3.2.1","packageFilePaths":["/docs/package.json"],"isTransitiveDependency":true,"dependencyTree":"browser-sync:2.26.3;socket.io:2.1.1;engine.io:3.2.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"engine.io - 4.0.0"}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-36048","vulnerabilityDetails":"Engine.IO before 4.0.0 allows attackers to cause a denial of service (resource consumption) via a POST request to the long polling transport.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36048","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_priority | cve high detected in engine io tgz autoclosed cve high severity vulnerability vulnerable library engine io tgz the realtime engine behind socket io provides the foundation of a bidirectional connection between client and server library home page a href path to dependency file webgoat clone docs package json path to vulnerable library webgoat clone docs node modules engine io package json dependency hierarchy browser sync tgz root library socket io tgz x engine io tgz vulnerable library found in head commit a href vulnerability details engine io before allows attackers to cause a denial of service resource consumption via a post request to the long polling transport publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution engine io isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree browser sync socket io engine io isminimumfixversionavailable true minimumfixversion engine io basebranches vulnerabilityidentifier cve vulnerabilitydetails engine io before allows attackers to cause a denial of service resource consumption via a post request to the long polling transport vulnerabilityurl | 0 |
10,675 | 6,850,340,094 | IssuesEvent | 2017-11-14 02:40:03 | Amber-MD/cmake-buildscripts | https://api.github.com/repos/Amber-MD/cmake-buildscripts | closed | add message saying user can turn off Python build | usability | hi,
from below message
```
CMake Error at cmake/PythonConfig.cmake:82 (message):
Missing required Python packages: scipy matplotlib. Please install these
and try again, or set USE_MINICONDA to TRUE to create a python environment
automatically.
```
| True | add message saying user can turn off Python build - hi,
from below message
```
CMake Error at cmake/PythonConfig.cmake:82 (message):
Missing required Python packages: scipy matplotlib. Please install these
and try again, or set USE_MINICONDA to TRUE to create a python environment
automatically.
```
| non_priority | add message saying user can turn off python build hi from below message cmake error at cmake pythonconfig cmake message missing required python packages scipy matplotlib please install these and try again or set use miniconda to true to create a python environment automatically | 0 |
404 | 7,382,852,292 | IssuesEvent | 2018-03-15 07:15:12 | PopulateTools/gobierto | https://api.github.com/repos/PopulateTools/gobierto | closed | Missing admin_preview_token in "view event" links | bug gobierto-people | I got [this exception](https://rollbar.com/Populate/gobierto/items/476/) after clicking on the "view event" link in the admin of [this event](http://madrid.gobify.net/agendas/rafa-garcia/2017-11-22-nuevo-evento-24h-format).
It happend because the person was drafted and the `preview token` wasn't being included in the generated link. | 1.0 | Missing admin_preview_token in "view event" links - I got [this exception](https://rollbar.com/Populate/gobierto/items/476/) after clicking on the "view event" link in the admin of [this event](http://madrid.gobify.net/agendas/rafa-garcia/2017-11-22-nuevo-evento-24h-format).
It happend because the person was drafted and the `preview token` wasn't being included in the generated link. | non_priority | missing admin preview token in view event links i got after clicking on the view event link in the admin of it happend because the person was drafted and the preview token wasn t being included in the generated link | 0 |
320,468 | 23,811,478,006 | IssuesEvent | 2022-09-04 20:33:41 | mikro-orm/mikro-orm | https://api.github.com/repos/mikro-orm/mikro-orm | closed | docs: Clarification for MySQL/Mariadb | documentation | I noticed in the installation guide that there are 2 separate packages for MySQL (`@mikro-orm/mysql`) & MariaDB (`@mikro-orm/mariadb`) but the inline comments for both have the same description for both. Are these currently interchangeable? Is the older one expected to be deprecated soon?
https://github.com/mikro-orm/mikro-orm/blob/ae81ca98a12d04081b9ca934b492d0c916dee30d/docs/versioned_docs/version-5.3/installation.md?plain=1#L14-L15
https://github.com/mikro-orm/mikro-orm/blob/ae81ca98a12d04081b9ca934b492d0c916dee30d/docs/versioned_docs/version-5.3/installation.md?plain=1#L24-L25
The inline comment should be updated accordingly. | 1.0 | docs: Clarification for MySQL/Mariadb - I noticed in the installation guide that there are 2 separate packages for MySQL (`@mikro-orm/mysql`) & MariaDB (`@mikro-orm/mariadb`) but the inline comments for both have the same description for both. Are these currently interchangeable? Is the older one expected to be deprecated soon?
https://github.com/mikro-orm/mikro-orm/blob/ae81ca98a12d04081b9ca934b492d0c916dee30d/docs/versioned_docs/version-5.3/installation.md?plain=1#L14-L15
https://github.com/mikro-orm/mikro-orm/blob/ae81ca98a12d04081b9ca934b492d0c916dee30d/docs/versioned_docs/version-5.3/installation.md?plain=1#L24-L25
The inline comment should be updated accordingly. | non_priority | docs clarification for mysql mariadb i noticed in the installation guide that there are separate packages for mysql mikro orm mysql mariadb mikro orm mariadb but the inline comments for both have the same description for both are these currently interchangeable is the older one expected to be deprecated soon the inline comment should be updated accordingly | 0 |
114,310 | 24,583,522,816 | IssuesEvent | 2022-10-13 17:34:47 | hurl365/fa22-cse110-lab3 | https://api.github.com/repos/hurl365/fa22-cse110-lab3 | opened | [Missing] Missing Different Selectors in my_stylesheet.css | [custom label] code | ### **my_stylesheet.css**
*Insert the missing items underneath*
- [x] .class
- [ ] #id
- [ ] *
- [ ] element
- [ ] [attribute=xxx]
- [ ] p:hover
- [ ] element, element (seletor list)
- [ ] Combinators (element element, element > element, element ~ element, element + element, element.class)
| 1.0 | [Missing] Missing Different Selectors in my_stylesheet.css - ### **my_stylesheet.css**
*Insert the missing items underneath*
- [x] .class
- [ ] #id
- [ ] *
- [ ] element
- [ ] [attribute=xxx]
- [ ] p:hover
- [ ] element, element (seletor list)
- [ ] Combinators (element element, element > element, element ~ element, element + element, element.class)
| non_priority | missing different selectors in my stylesheet css my stylesheet css insert the missing items underneath class id element p hover element element seletor list combinators element element element element element element element element element class | 0 |
4,151 | 6,733,136,976 | IssuesEvent | 2017-10-18 13:57:40 | IT2810/it2810-webutvikling-h17-prosjekt-3-group-7-native | https://api.github.com/repos/IT2810/it2810-webutvikling-h17-prosjekt-3-group-7-native | closed | Lage fungerende frontpage og tile component | feature requirement | Overføre og endre stil slik at appen fungerer på mobil | 1.0 | Lage fungerende frontpage og tile component - Overføre og endre stil slik at appen fungerer på mobil | non_priority | lage fungerende frontpage og tile component overføre og endre stil slik at appen fungerer på mobil | 0 |
30,540 | 4,629,214,625 | IssuesEvent | 2016-09-28 08:31:03 | hengxin/chameleon-transactional-kvstore | https://api.github.com/repos/hengxin/chameleon-transactional-kvstore | opened | Making the transaction commit phase atomic in 2PC protocol | bug doc question refactor test todo | ***Bug Description***:
The specification of snapshot isolation (SI) *explicitly* refers to starting timestamps and commit timestamps of committed transactions. If the events of getting commit timestamp for a transaction, attaching it to updates, updating data items, and (possibly) releasing locks are not atomic, concurrent events of getting start timestamps for new transactions and/or reading data items during this time interval will introduce non-SI anomalies.
***Solutions:***
I have no good solutions to this problem.
Right now I take a conservative strategy which may hurt the system performance.
The basic idea is to prevent the events of getting new *start* timestamps between the time some transaction gets its *commit* timestamp and the time it actually commits.
- [ ] In class `CentralizedTimestampOracle`:
- [ ] separates `getSts()` and `getCts()`
- [ ] implementing locking strategy to make sure that no `getSts()` can be called if some threads are in `getCts()`; see []().
- [ ] In class `RVSITransaction`:
- [ ] `get()` in `begin()` => `getSts()`
- [ ] In class `RVSI2PCPhaserCoordinator`:
- [ ] in method `onPreparePhaseFinished()`: `tsOracle.get()` => `tsOracle.lockStsAndThenGetCts()`
- [ ] in method `onCommitPhaseFinished()`: adding `tsOracle.unlockSts()`
- [ ] In class `SIMaster`:
- [ ] in method `commit()`: moving `table.apply(tx)` into the lock/unlock area
| 1.0 | Making the transaction commit phase atomic in 2PC protocol - ***Bug Description***:
The specification of snapshot isolation (SI) *explicitly* refers to starting timestamps and commit timestamps of committed transactions. If the events of getting commit timestamp for a transaction, attaching it to updates, updating data items, and (possibly) releasing locks are not atomic, concurrent events of getting start timestamps for new transactions and/or reading data items during this time interval will introduce non-SI anomalies.
***Solutions:***
I have no good solutions to this problem.
Right now I take a conservative strategy which may hurt the system performance.
The basic idea is to prevent the events of getting new *start* timestamps between the time some transaction gets its *commit* timestamp and the time it actually commits.
- [ ] In class `CentralizedTimestampOracle`:
- [ ] separates `getSts()` and `getCts()`
- [ ] implementing locking strategy to make sure that no `getSts()` can be called if some threads are in `getCts()`; see []().
- [ ] In class `RVSITransaction`:
- [ ] `get()` in `begin()` => `getSts()`
- [ ] In class `RVSI2PCPhaserCoordinator`:
- [ ] in method `onPreparePhaseFinished()`: `tsOracle.get()` => `tsOracle.lockStsAndThenGetCts()`
- [ ] in method `onCommitPhaseFinished()`: adding `tsOracle.unlockSts()`
- [ ] In class `SIMaster`:
- [ ] in method `commit()`: moving `table.apply(tx)` into the lock/unlock area
| non_priority | making the transaction commit phase atomic in protocol bug description the specification of snapshot isolation si explicitly refers to starting timestamps and commit timestamps of committed transactions if the events of getting commit timestamp for a transaction attaching it to updates updating data items and possibly releasing locks are not atomic concurrent events of getting start timestamps for new transactions and or reading data items during this time interval will introduce non si anomalies solutions i have no good solutions to this problem right now i take a conservative strategy which may hurt the system performance the basic idea is to prevent the events of getting new start timestamps between the time some transaction gets its commit timestamp and the time it actually commits in class centralizedtimestamporacle separates getsts and getcts implementing locking strategy to make sure that no getsts can be called if some threads are in getcts see in class rvsitransaction get in begin getsts in class in method onpreparephasefinished tsoracle get tsoracle lockstsandthengetcts in method oncommitphasefinished adding tsoracle unlocksts in class simaster in method commit moving table apply tx into the lock unlock area | 0 |
56,370 | 6,977,062,608 | IssuesEvent | 2017-12-12 13:26:17 | vaadin/flow | https://api.github.com/repos/vaadin/flow | opened | Dynamically inlining resources for initial response | bootstrap page needs design tutorial | `When building an app, I want to inline different resources to the initial response based on the opened browser, so I may only inline polyfills when necessary or inline transpiled resources for IE11`
Add the same API as with #3010 to `PageConfigurator` for dynamic support.
It is an open question on how this should work
- Provide API to just give a file to a callback and it will be inlined ? (Similarly to annotation)
- Provide API to just inline a text ?
- How to distinct between inlining to `<body>` and `<head>` ? Different methods ?
- How to distinct between appending and prepending ?
### Acceptance Criteria
- I'm able to dynamically inline data on `PageConfigurator` based on e.g. what browser was opened.
- The documentation for bootstrap page has mention and an example of e.g. inlining a polyfill if the browser is IE11 | 1.0 | Dynamically inlining resources for initial response - `When building an app, I want to inline different resources to the initial response based on the opened browser, so I may only inline polyfills when necessary or inline transpiled resources for IE11`
Add the same API as with #3010 to `PageConfigurator` for dynamic support.
It is an open question on how this should work
- Provide API to just give a file to a callback and it will be inlined ? (Similarly to annotation)
- Provide API to just inline a text ?
- How to distinct between inlining to `<body>` and `<head>` ? Different methods ?
- How to distinct between appending and prepending ?
### Acceptance Criteria
- I'm able to dynamically inline data on `PageConfigurator` based on e.g. what browser was opened.
- The documentation for bootstrap page has mention and an example of e.g. inlining a polyfill if the browser is IE11 | non_priority | dynamically inlining resources for initial response when building an app i want to inline different resources to the initial response based on the opened browser so i may only inline polyfills when necessary or inline transpiled resources for add the same api as with to pageconfigurator for dynamic support it is an open question on how this should work provide api to just give a file to a callback and it will be inlined similarly to annotation provide api to just inline a text how to distinct between inlining to and different methods how to distinct between appending and prepending acceptance criteria i m able to dynamically inline data on pageconfigurator based on e g what browser was opened the documentation for bootstrap page has mention and an example of e g inlining a polyfill if the browser is | 0 |
437,588 | 30,604,774,616 | IssuesEvent | 2023-07-22 21:35:40 | epwalsh/obsidian.nvim | https://api.github.com/repos/epwalsh/obsidian.nvim | opened | Config documentation is a bit convoluted | documentation | ### 📚 The doc issue
Currently, configuration information may be found in three different places, `Install and Configure/Lazy.nvim`, `Install and Configure/packer`, and `Configuration Options`. Also, the installation code example for Lazy.nvim only is the only place to find information about dependencies.
These are a handful of larger changes that I thought might warrant some discussion as opposed to the small ones I submitted a PR for earlier.
### Suggest a potential alternative/fix
1) Move dependency information to its own section and remove it from the Lazy.nvim setup, assuming users know how their package managers work. This will drastically cut down on the size of the Lazy.nvim setup portion and add emphasis to the configuration portion.
2) Remove keybinding recommendation from Lazy.nvim setup, moving either to a system I proposed in #157, or to its own keybinding section as a subheader of the Configuration section. This keybind may be skipped over as its part of the installation section and it feels like it belongs in a different place anyway. This will also help to shorten the section.
3) Configuration about templates seems to be partially duplicated. Seems intentional but it's worth taking a think about as it adds some extra clutter and makes the config section difficult to scan quickly.
4) Treesitter markdown additional vim regex highlighting is recommended but it's unclear the effect it will have. We should probably add some comparison pictures to make it more clear. | 1.0 | Config documentation is a bit convoluted - ### 📚 The doc issue
Currently, configuration information may be found in three different places, `Install and Configure/Lazy.nvim`, `Install and Configure/packer`, and `Configuration Options`. Also, the installation code example for Lazy.nvim only is the only place to find information about dependencies.
These are a handful of larger changes that I thought might warrant some discussion as opposed to the small ones I submitted a PR for earlier.
### Suggest a potential alternative/fix
1) Move dependency information to its own section and remove it from the Lazy.nvim setup, assuming users know how their package managers work. This will drastically cut down on the size of the Lazy.nvim setup portion and add emphasis to the configuration portion.
2) Remove keybinding recommendation from Lazy.nvim setup, moving either to a system I proposed in #157, or to its own keybinding section as a subheader of the Configuration section. This keybind may be skipped over as its part of the installation section and it feels like it belongs in a different place anyway. This will also help to shorten the section.
3) Configuration about templates seems to be partially duplicated. Seems intentional but it's worth taking a think about as it adds some extra clutter and makes the config section difficult to scan quickly.
4) Treesitter markdown additional vim regex highlighting is recommended but it's unclear the effect it will have. We should probably add some comparison pictures to make it more clear. | non_priority | config documentation is a bit convoluted 📚 the doc issue currently configuration information may be found in three different places install and configure lazy nvim install and configure packer and configuration options also the installation code example for lazy nvim only is the only place to find information about dependencies these are a handful of larger changes that i thought might warrant some discussion as opposed to the small ones i submitted a pr for earlier suggest a potential alternative fix move dependency information to its own section and remove it from the lazy nvim setup assuming users know how their package managers work this will drastically cut down on the size of the lazy nvim setup portion and add emphasis to the configuration portion remove keybinding recommendation from lazy nvim setup moving either to a system i proposed in or to its own keybinding section as a subheader of the configuration section this keybind may be skipped over as its part of the installation section and it feels like it belongs in a different place anyway this will also help to shorten the section configuration about templates seems to be partially duplicated seems intentional but it s worth taking a think about as it adds some extra clutter and makes the config section difficult to scan quickly treesitter markdown additional vim regex highlighting is recommended but it s unclear the effect it will have we should probably add some comparison pictures to make it more clear | 0 |
26,303 | 19,975,330,336 | IssuesEvent | 2022-01-29 02:04:51 | dotnet/project-system | https://api.github.com/repos/dotnet/project-system | opened | Remove usage of `roslyn-tools` VSIX package creation | Area-Infrastructure | I don't have all the details on this, but here is what I know currently:
- We utilize a tool from `roslyn-tools` to aid in creation of our `.vsix` files
- In our `setup` folder, the 3 projects there, `CommonFiles`, `ProjectSystemSetup`, and `VisualStudioEditorsSetup`, create our `.vsix` files
- There is a different way that the `CommonFiles` vsix is created compared to the other 2 projects
- I believe this to be the project that uses `roslyn-tools` VSIX creation
- I don't know why this is created differently than the other 2 projects
- It might have to do with `CommonFiles` having `.swr` files while the other projects have `.vsixmanifest` files
- The MicroBuild plugin for SWIX may be able to help us do this task
- There is documentation [here](https://devdiv.visualstudio.com/DevDiv/_wiki/wikis/DevDiv.wiki/645/How-to-Build-a-VSmanproj) on building `.vsmanproj` that uses the SWIX plugin
The idea here would be to remove the usage of the `rosyln-tools` style of creating our .vsix files. There is bits-and-pieces of documentation surrounding .vsix in the DevDiv wiki. There would be a bit of investigation in what the tool from `roslyn-tools` is actually doing, so we don't end up missing something. I think this is a good opportunity for improvement and simplification of the infrastructure of the repo.
In doing the OptProf work, I found an interesting situation with our file copying that was needed because of this .vsix process. Here's a quote from the discussion [here](https://github.com/dotnet/project-system/pull/7849#discussion_r787037538):
> Looking into it, the nupkg files do get built when you run build.cmd. I looked at the binlog and these new targets do not get ran. However, the reason that they don't run is interesting. The SignFiles target only exists when the MicroBuild signing plugin is installed. Therefore, the binlog shows this:
>> The target "SignFiles" listed in a BeforeTargets attribute at "C:\Code\project-system\src\Directory.Build.targets (18,42)" does not exist in the project, and will be ignored.
> So, the reason these targets don't run because SignFiles itself doesn't exist locally. So, that's some good knowledge to move forward with. I think keeping this implementation now is fine. However, one of the goals of cleaning up the build process would be reducing the amount of file copying (which is extensive at the moment). I'll put this information into an issue when I make one for replacing the RepoToolset Vsix process. This copying only exists because it needs to feed into that.
| 1.0 | Remove usage of `roslyn-tools` VSIX package creation - I don't have all the details on this, but here is what I know currently:
- We utilize a tool from `roslyn-tools` to aid in creation of our `.vsix` files
- In our `setup` folder, the 3 projects there, `CommonFiles`, `ProjectSystemSetup`, and `VisualStudioEditorsSetup`, create our `.vsix` files
- There is a different way that the `CommonFiles` vsix is created compared to the other 2 projects
- I believe this to be the project that uses `roslyn-tools` VSIX creation
- I don't know why this is created differently than the other 2 projects
- It might have to do with `CommonFiles` having `.swr` files while the other projects have `.vsixmanifest` files
- The MicroBuild plugin for SWIX may be able to help us do this task
- There is documentation [here](https://devdiv.visualstudio.com/DevDiv/_wiki/wikis/DevDiv.wiki/645/How-to-Build-a-VSmanproj) on building `.vsmanproj` that uses the SWIX plugin
The idea here would be to remove the usage of the `rosyln-tools` style of creating our .vsix files. There is bits-and-pieces of documentation surrounding .vsix in the DevDiv wiki. There would be a bit of investigation in what the tool from `roslyn-tools` is actually doing, so we don't end up missing something. I think this is a good opportunity for improvement and simplification of the infrastructure of the repo.
In doing the OptProf work, I found an interesting situation with our file copying that was needed because of this .vsix process. Here's a quote from the discussion [here](https://github.com/dotnet/project-system/pull/7849#discussion_r787037538):
> Looking into it, the nupkg files do get built when you run build.cmd. I looked at the binlog and these new targets do not get ran. However, the reason that they don't run is interesting. The SignFiles target only exists when the MicroBuild signing plugin is installed. Therefore, the binlog shows this:
>> The target "SignFiles" listed in a BeforeTargets attribute at "C:\Code\project-system\src\Directory.Build.targets (18,42)" does not exist in the project, and will be ignored.
> So, the reason these targets don't run because SignFiles itself doesn't exist locally. So, that's some good knowledge to move forward with. I think keeping this implementation now is fine. However, one of the goals of cleaning up the build process would be reducing the amount of file copying (which is extensive at the moment). I'll put this information into an issue when I make one for replacing the RepoToolset Vsix process. This copying only exists because it needs to feed into that.
| non_priority | remove usage of roslyn tools vsix package creation i don t have all the details on this but here is what i know currently we utilize a tool from roslyn tools to aid in creation of our vsix files in our setup folder the projects there commonfiles projectsystemsetup and visualstudioeditorssetup create our vsix files there is a different way that the commonfiles vsix is created compared to the other projects i believe this to be the project that uses roslyn tools vsix creation i don t know why this is created differently than the other projects it might have to do with commonfiles having swr files while the other projects have vsixmanifest files the microbuild plugin for swix may be able to help us do this task there is documentation on building vsmanproj that uses the swix plugin the idea here would be to remove the usage of the rosyln tools style of creating our vsix files there is bits and pieces of documentation surrounding vsix in the devdiv wiki there would be a bit of investigation in what the tool from roslyn tools is actually doing so we don t end up missing something i think this is a good opportunity for improvement and simplification of the infrastructure of the repo in doing the optprof work i found an interesting situation with our file copying that was needed because of this vsix process here s a quote from the discussion looking into it the nupkg files do get built when you run build cmd i looked at the binlog and these new targets do not get ran however the reason that they don t run is interesting the signfiles target only exists when the microbuild signing plugin is installed therefore the binlog shows this the target signfiles listed in a beforetargets attribute at c code project system src directory build targets does not exist in the project and will be ignored so the reason these targets don t run because signfiles itself doesn t exist locally so that s some good knowledge to move forward with i think keeping this implementation now is fine however one of the goals of cleaning up the build process would be reducing the amount of file copying which is extensive at the moment i ll put this information into an issue when i make one for replacing the repotoolset vsix process this copying only exists because it needs to feed into that | 0 |
34,255 | 12,259,131,818 | IssuesEvent | 2020-05-06 16:07:10 | witnet/witnet-rust | https://api.github.com/repos/witnet/witnet-rust | closed | Update Signature Manager API | security 🛡️ | The signature manager API should be updated to be able to generate signatures for both BLS and ECDSA. Currently only the latter is supported. This can be done through extending the API or by adding a flag to the existing methods indicating the algorithm to use. | True | Update Signature Manager API - The signature manager API should be updated to be able to generate signatures for both BLS and ECDSA. Currently only the latter is supported. This can be done through extending the API or by adding a flag to the existing methods indicating the algorithm to use. | non_priority | update signature manager api the signature manager api should be updated to be able to generate signatures for both bls and ecdsa currently only the latter is supported this can be done through extending the api or by adding a flag to the existing methods indicating the algorithm to use | 0 |
205,657 | 15,987,591,445 | IssuesEvent | 2021-04-19 01:06:52 | ElaSparks/ArchECM | https://api.github.com/repos/ElaSparks/ArchECM | opened | Basic translator | Develop Structure documentation | 1. write the basic translator
2. generate documentation
3. rename variables
4. make code style
5. structure a project | 1.0 | Basic translator - 1. write the basic translator
2. generate documentation
3. rename variables
4. make code style
5. structure a project | non_priority | basic translator write the basic translator generate documentation rename variables make code style structure a project | 0 |
71,997 | 31,047,658,780 | IssuesEvent | 2023-08-11 02:21:22 | aws/aws-sdk | https://api.github.com/repos/aws/aws-sdk | closed | (quicksight): add `paginateListGroups` Async Iterator | bug service-api quicksight | ### Describe the feature
Would like to have the `paginateListGroups` Async Iterator function in the QuickSight client.
Other `paginate*` functions already exist, but `paginateListGroups` seems to be missing.
### Use Case
To paginate.
### Proposed Solution
Implement it.
### Other Information
- https://aws.amazon.com/blogs/developer/pagination-using-async-iterators-in-modular-aws-sdk-for-javascript/
- https://github.com/aws/aws-sdk-js-v3/tree/main/clients/client-quicksight/src/pagination
### Acknowledgements
- [ ] I may be able to implement this feature request
- [ ] This feature might incur a breaking change
### SDK version used
3.347.1
### Environment details (OS name and version, etc.)
macOS, Node 16 | 1.0 | (quicksight): add `paginateListGroups` Async Iterator - ### Describe the feature
Would like to have the `paginateListGroups` Async Iterator function in the QuickSight client.
Other `paginate*` functions already exist, but `paginateListGroups` seems to be missing.
### Use Case
To paginate.
### Proposed Solution
Implement it.
### Other Information
- https://aws.amazon.com/blogs/developer/pagination-using-async-iterators-in-modular-aws-sdk-for-javascript/
- https://github.com/aws/aws-sdk-js-v3/tree/main/clients/client-quicksight/src/pagination
### Acknowledgements
- [ ] I may be able to implement this feature request
- [ ] This feature might incur a breaking change
### SDK version used
3.347.1
### Environment details (OS name and version, etc.)
macOS, Node 16 | non_priority | quicksight add paginatelistgroups async iterator describe the feature would like to have the paginatelistgroups async iterator function in the quicksight client other paginate functions already exist but paginatelistgroups seems to be missing use case to paginate proposed solution implement it other information acknowledgements i may be able to implement this feature request this feature might incur a breaking change sdk version used environment details os name and version etc macos node | 0 |
156,497 | 24,624,327,718 | IssuesEvent | 2022-10-16 10:12:10 | roeszler/reabook | https://api.github.com/repos/roeszler/reabook | closed | User Story: Create register.html | feature test ux design | As a **admin**, I can **produce a html template** so that **all users can register*.
| 1.0 | User Story: Create register.html - As a **admin**, I can **produce a html template** so that **all users can register*.
| non_priority | user story create register html as a admin i can produce a html template so that all users can register | 0 |
51,090 | 13,614,893,074 | IssuesEvent | 2020-09-23 13:48:44 | liorzilberg/swagger-parser | https://api.github.com/repos/liorzilberg/swagger-parser | opened | CVE-2020-11112 (High) detected in jackson-databind-2.9.5.jar | security vulnerability | ## CVE-2020-11112 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.5.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: swagger-parser/modules/swagger-compat-spec-parser/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.5/jackson-databind-2.9.5.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.5/jackson-databind-2.9.5.jar,swagger-parser/modules/swagger-parser/target/lib/jackson-databind-2.9.5.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.5.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/liorzilberg/swagger-parser/commits/299682f5b4a2ec420c0c3f91a170670051db10d0">299682f5b4a2ec420c0c3f91a170670051db10d0</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.proxy.provider.remoting.RmiProvider (aka apache/commons-proxy).
<p>Publish Date: 2020-03-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11112>CVE-2020-11112</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11112">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11112</a></p>
<p>Release Date: 2020-03-31</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.4,2.10.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.5","isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.4,2.10.0"}],"vulnerabilityIdentifier":"CVE-2020-11112","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.proxy.provider.remoting.RmiProvider (aka apache/commons-proxy).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11112","cvss3Severity":"high","cvss3Score":"8.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> --> | True | CVE-2020-11112 (High) detected in jackson-databind-2.9.5.jar - ## CVE-2020-11112 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.5.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: swagger-parser/modules/swagger-compat-spec-parser/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.5/jackson-databind-2.9.5.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.5/jackson-databind-2.9.5.jar,swagger-parser/modules/swagger-parser/target/lib/jackson-databind-2.9.5.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.5.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/liorzilberg/swagger-parser/commits/299682f5b4a2ec420c0c3f91a170670051db10d0">299682f5b4a2ec420c0c3f91a170670051db10d0</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.proxy.provider.remoting.RmiProvider (aka apache/commons-proxy).
<p>Publish Date: 2020-03-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11112>CVE-2020-11112</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11112">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11112</a></p>
<p>Release Date: 2020-03-31</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.4,2.10.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.5","isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.4,2.10.0"}],"vulnerabilityIdentifier":"CVE-2020-11112","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.proxy.provider.remoting.RmiProvider (aka apache/commons-proxy).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11112","cvss3Severity":"high","cvss3Score":"8.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> --> | non_priority | cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file swagger parser modules swagger compat spec parser pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar swagger parser modules swagger parser target lib jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache commons proxy provider remoting rmiprovider aka apache commons proxy publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind check this box to open an automated fix pr isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache commons proxy provider remoting rmiprovider aka apache commons proxy vulnerabilityurl | 0 |
158,191 | 20,009,317,909 | IssuesEvent | 2022-02-01 03:02:40 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | zip: missing replication reports | C-enhancement T-server-and-security A-cli-client | The system tables
- `system.replication_stats`
- `system.replication_critical_localities`
- `system.replication_constraint_stats`
should be collected by debug zip.
Found by @AlexTalks during a support issue. | True | zip: missing replication reports - The system tables
- `system.replication_stats`
- `system.replication_critical_localities`
- `system.replication_constraint_stats`
should be collected by debug zip.
Found by @AlexTalks during a support issue. | non_priority | zip missing replication reports the system tables system replication stats system replication critical localities system replication constraint stats should be collected by debug zip found by alextalks during a support issue | 0 |
337,578 | 24,546,276,245 | IssuesEvent | 2022-10-12 09:02:27 | jamie-taylor-rjj/Recruitment-Glossary | https://api.github.com/repos/jamie-taylor-rjj/Recruitment-Glossary | opened | Write the content got the npm/node page | documentation good first issue hacktoberfest contributor-wanted | Be sure to read the readme for this repo with steps for getting started. It _should_ just be a case of using the current LTS of node and doing `npm install` and `npm run` to get the repo up and running.
## Summary
Write a series of paragraphs explaining .NET Maui in the same style as other pages in the application.
- The current .NET Maui page can be found at [https://recruitment-glossary.netlify.app/docs/npm/what-is-npm-and-nodejs/][(https://recruitment-glossary.netlify.app/docs/net/blazor/what-is-blazor/](https://recruitment-glossary.netlify.app/docs/npm/what-is-npm-and-nodejs/))
- An example page to follow would be the .NET page at [https://recruitment-glossary.netlify.app/docs/net/what-is-net/](https://recruitment-glossary.netlify.app/docs/net/what-is-net/)
The important part to remember is that the page should not be the "be all, end all" of documentation. It is meant to be a gentle introduction to the topic, broken into fours separate parts:
- 10,000 ft Overview
- Beginner information
- Intermediary Information
- Advanced Information
This issue can cover just one of the above parts, and others will be created for the remaining parts of the page. | 1.0 | Write the content got the npm/node page - Be sure to read the readme for this repo with steps for getting started. It _should_ just be a case of using the current LTS of node and doing `npm install` and `npm run` to get the repo up and running.
## Summary
Write a series of paragraphs explaining .NET Maui in the same style as other pages in the application.
- The current .NET Maui page can be found at [https://recruitment-glossary.netlify.app/docs/npm/what-is-npm-and-nodejs/][(https://recruitment-glossary.netlify.app/docs/net/blazor/what-is-blazor/](https://recruitment-glossary.netlify.app/docs/npm/what-is-npm-and-nodejs/))
- An example page to follow would be the .NET page at [https://recruitment-glossary.netlify.app/docs/net/what-is-net/](https://recruitment-glossary.netlify.app/docs/net/what-is-net/)
The important part to remember is that the page should not be the "be all, end all" of documentation. It is meant to be a gentle introduction to the topic, broken into fours separate parts:
- 10,000 ft Overview
- Beginner information
- Intermediary Information
- Advanced Information
This issue can cover just one of the above parts, and others will be created for the remaining parts of the page. | non_priority | write the content got the npm node page be sure to read the readme for this repo with steps for getting started it should just be a case of using the current lts of node and doing npm install and npm run to get the repo up and running summary write a series of paragraphs explaining net maui in the same style as other pages in the application the current net maui page can be found at an example page to follow would be the net page at the important part to remember is that the page should not be the be all end all of documentation it is meant to be a gentle introduction to the topic broken into fours separate parts ft overview beginner information intermediary information advanced information this issue can cover just one of the above parts and others will be created for the remaining parts of the page | 0 |
128,532 | 18,059,066,608 | IssuesEvent | 2021-09-20 12:02:06 | bithyve/hexa | https://api.github.com/repos/bithyve/hexa | reopened | Google Drive Backup. | Wallet Security 2.05 | .
- It is showing Level 1 cloud backup is complete and that "Backup on Google" still not confirmed.
- Also when I click on "Backup on Google" nothing is happening.
| True | Google Drive Backup. - .
- It is showing Level 1 cloud backup is complete and that "Backup on Google" still not confirmed.
- Also when I click on "Backup on Google" nothing is happening.
| non_priority | google drive backup it is showing level cloud backup is complete and that backup on google still not confirmed also when i click on backup on google nothing is happening | 0 |
173,519 | 13,426,928,017 | IssuesEvent | 2020-09-06 16:04:34 | widdowquinn/ncfp | https://api.github.com/repos/widdowquinn/ncfp | opened | Mock remote service calls in tests | testing | #### Summary:
CircleCI weekly tests are failing when making remote service calls. These can be mocked to avoid those errors.
| 1.0 | Mock remote service calls in tests - #### Summary:
CircleCI weekly tests are failing when making remote service calls. These can be mocked to avoid those errors.
| non_priority | mock remote service calls in tests summary circleci weekly tests are failing when making remote service calls these can be mocked to avoid those errors | 0 |
235,299 | 19,322,264,717 | IssuesEvent | 2021-12-14 07:31:40 | redhat-developer/odo | https://api.github.com/repos/redhat-developer/odo | closed | refactor test-cmd-devfile-storage | area/testing points/3 | /area testing
## Acceptance Criteria
- [ ] test-cmd-devfile-storage should use new test approach and run successfully.
| 1.0 | refactor test-cmd-devfile-storage - /area testing
## Acceptance Criteria
- [ ] test-cmd-devfile-storage should use new test approach and run successfully.
| non_priority | refactor test cmd devfile storage area testing acceptance criteria test cmd devfile storage should use new test approach and run successfully | 0 |
126,125 | 16,979,422,749 | IssuesEvent | 2021-06-30 06:49:42 | microsoft/vscode | https://api.github.com/repos/microsoft/vscode | closed | No separation between label and detail | *as-designed | Testing #127333
Using:
```
const simpleCompletion = new vscode.CompletionItem({ label: 'label', description: 'description', detail: 'detail'});
```
I expected there to be a space between the label and detail.

| 1.0 | No separation between label and detail - Testing #127333
Using:
```
const simpleCompletion = new vscode.CompletionItem({ label: 'label', description: 'description', detail: 'detail'});
```
I expected there to be a space between the label and detail.

| non_priority | no separation between label and detail testing using const simplecompletion new vscode completionitem label label description description detail detail i expected there to be a space between the label and detail | 0 |
162,088 | 12,619,402,113 | IssuesEvent | 2020-06-13 00:26:00 | dotnet/runtime | https://api.github.com/repos/dotnet/runtime | opened | Mono SIGABRT in System.Drawing.Common affecting some test runs | runtime-mono test-run-core | Affects clean test runs in https://github.com/dotnet/runtime/pull/37536
AzDO run: https://dev.azure.com/dnceng/public/_build/results?buildId=681708&view=logs&j=c6f8dc49-92a1-5760-c098-ba97b8142bfb&t=22b0078b-0469-5ba6-8725-2121fdbae049&l=42
Test log: https://helix.dot.net/api/2019-06-17/jobs/73f57a06-eec2-4676-8076-302ccabec52f/workitems/System.Drawing.Common.Tests/console
```txt
===========================================================================================================
/private/tmp/helix/working/A83A0976/w/A92E096B/e /private/tmp/helix/working/A83A0976/w/A92E096B/e
Discovering: System.Drawing.Common.Tests (method display = ClassAndMethod, method display options = None)
Discovered: System.Drawing.Common.Tests (found 1565 of 1967 test cases)
Starting: System.Drawing.Common.Tests (parallel test collections = on, max threads = 4)
System.Drawing.Drawing2D.Tests.BlendTests.Ctor_LargeCount_ThrowsOutOfMemoryException [SKIP]
Condition(s) not met: "IsNotIntMaxValueArrayIndexSupported"
System.Drawing.Printing.Tests.PrintControllerTests.OnStartPage_InvokeWithPrint_ReturnsNull [SKIP]
Condition(s) not met: "IsAnyInstalledPrinters"
=================================================================
Native Crash Reporting
=================================================================
Got a segv while executing native code. This usually indicates
a fatal error in the mono runtime or one of the native libraries
used by your application.
=================================================================
=================================================================
Native stacktrace:
=================================================================
0x10305ca56 - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : mono_dump_native_crash_info
0x102ffeae5 - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : mono_handle_native_crash
0x1030574c3 - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : altstack_handle_and_restore
0x7fff5d81d916 - /usr/lib/system/libsystem_c.dylib : fclose
0x10771e7c0 - /usr/local/lib/libgdiplus.dylib : gdip_metafile_stop_recording
0x10770ab5e - /usr/local/lib/libgdiplus.dylib : GdipDeleteGraphics
0x108df9d35 - Unknown
0x108ad8c91 - Unknown
0x102f61f7e - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : mono_jit_runtime_invoke
0x10313acf8 - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : mono_runtime_invoke_checked
0x10314288c - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : mono_runtime_try_invoke_array
0x1030ec3e4 - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : ves_icall_InternalInvoke
0x1030f8484 - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : ves_icall_InternalInvoke_raw
<snip>
----- end Wed Jun 10 17:22:30 PDT 2020 ----- exit code 134 ----------------------------------------------------------
exit code 134 means SIGABRT Abort. Managed or native assert, or runtime check such as heap corruption, caused call to abort(). Core dumped.
```
Possibly related: https://github.com/dotnet/runtime/issues/32827, https://github.com/dotnet/runtime/issues/23784, but this issue appears to have a different termination code. | 1.0 | Mono SIGABRT in System.Drawing.Common affecting some test runs - Affects clean test runs in https://github.com/dotnet/runtime/pull/37536
AzDO run: https://dev.azure.com/dnceng/public/_build/results?buildId=681708&view=logs&j=c6f8dc49-92a1-5760-c098-ba97b8142bfb&t=22b0078b-0469-5ba6-8725-2121fdbae049&l=42
Test log: https://helix.dot.net/api/2019-06-17/jobs/73f57a06-eec2-4676-8076-302ccabec52f/workitems/System.Drawing.Common.Tests/console
```txt
===========================================================================================================
/private/tmp/helix/working/A83A0976/w/A92E096B/e /private/tmp/helix/working/A83A0976/w/A92E096B/e
Discovering: System.Drawing.Common.Tests (method display = ClassAndMethod, method display options = None)
Discovered: System.Drawing.Common.Tests (found 1565 of 1967 test cases)
Starting: System.Drawing.Common.Tests (parallel test collections = on, max threads = 4)
System.Drawing.Drawing2D.Tests.BlendTests.Ctor_LargeCount_ThrowsOutOfMemoryException [SKIP]
Condition(s) not met: "IsNotIntMaxValueArrayIndexSupported"
System.Drawing.Printing.Tests.PrintControllerTests.OnStartPage_InvokeWithPrint_ReturnsNull [SKIP]
Condition(s) not met: "IsAnyInstalledPrinters"
=================================================================
Native Crash Reporting
=================================================================
Got a segv while executing native code. This usually indicates
a fatal error in the mono runtime or one of the native libraries
used by your application.
=================================================================
=================================================================
Native stacktrace:
=================================================================
0x10305ca56 - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : mono_dump_native_crash_info
0x102ffeae5 - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : mono_handle_native_crash
0x1030574c3 - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : altstack_handle_and_restore
0x7fff5d81d916 - /usr/lib/system/libsystem_c.dylib : fclose
0x10771e7c0 - /usr/local/lib/libgdiplus.dylib : gdip_metafile_stop_recording
0x10770ab5e - /usr/local/lib/libgdiplus.dylib : GdipDeleteGraphics
0x108df9d35 - Unknown
0x108ad8c91 - Unknown
0x102f61f7e - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : mono_jit_runtime_invoke
0x10313acf8 - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : mono_runtime_invoke_checked
0x10314288c - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : mono_runtime_try_invoke_array
0x1030ec3e4 - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : ves_icall_InternalInvoke
0x1030f8484 - /private/tmp/helix/working/A83A0976/p/shared/Microsoft.NETCore.App/5.0.0/libcoreclr.dylib : ves_icall_InternalInvoke_raw
<snip>
----- end Wed Jun 10 17:22:30 PDT 2020 ----- exit code 134 ----------------------------------------------------------
exit code 134 means SIGABRT Abort. Managed or native assert, or runtime check such as heap corruption, caused call to abort(). Core dumped.
```
Possibly related: https://github.com/dotnet/runtime/issues/32827, https://github.com/dotnet/runtime/issues/23784, but this issue appears to have a different termination code. | non_priority | mono sigabrt in system drawing common affecting some test runs affects clean test runs in azdo run test log txt private tmp helix working w e private tmp helix working w e discovering system drawing common tests method display classandmethod method display options none discovered system drawing common tests found of test cases starting system drawing common tests parallel test collections on max threads system drawing tests blendtests ctor largecount throwsoutofmemoryexception condition s not met isnotintmaxvaluearrayindexsupported system drawing printing tests printcontrollertests onstartpage invokewithprint returnsnull condition s not met isanyinstalledprinters native crash reporting got a segv while executing native code this usually indicates a fatal error in the mono runtime or one of the native libraries used by your application native stacktrace private tmp helix working p shared microsoft netcore app libcoreclr dylib mono dump native crash info private tmp helix working p shared microsoft netcore app libcoreclr dylib mono handle native crash private tmp helix working p shared microsoft netcore app libcoreclr dylib altstack handle and restore usr lib system libsystem c dylib fclose usr local lib libgdiplus dylib gdip metafile stop recording usr local lib libgdiplus dylib gdipdeletegraphics unknown unknown private tmp helix working p shared microsoft netcore app libcoreclr dylib mono jit runtime invoke private tmp helix working p shared microsoft netcore app libcoreclr dylib mono runtime invoke checked private tmp helix working p shared microsoft netcore app libcoreclr dylib mono runtime try invoke array private tmp helix working p shared microsoft netcore app libcoreclr dylib ves icall internalinvoke private tmp helix working p shared microsoft netcore app libcoreclr dylib ves icall internalinvoke raw end wed jun pdt exit code exit code means sigabrt abort managed or native assert or runtime check such as heap corruption caused call to abort core dumped possibly related but this issue appears to have a different termination code | 0 |
113,483 | 17,142,096,140 | IssuesEvent | 2021-07-13 10:45:52 | vincentpham13/serverless-jest | https://api.github.com/repos/vincentpham13/serverless-jest | opened | CVE-2021-23343 (High) detected in path-parse-1.0.6.tgz | security vulnerability | ## CVE-2021-23343 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>path-parse-1.0.6.tgz</b></p></summary>
<p>Node.js path.parse() ponyfill</p>
<p>Library home page: <a href="https://registry.npmjs.org/path-parse/-/path-parse-1.0.6.tgz">https://registry.npmjs.org/path-parse/-/path-parse-1.0.6.tgz</a></p>
<p>Path to dependency file: serverless-jest/package.json</p>
<p>Path to vulnerable library: serverless-jest/node_modules/path-parse/package.json</p>
<p>
Dependency Hierarchy:
- jest-26.6.3.tgz (Root Library)
- core-26.6.3.tgz
- jest-resolve-26.6.2.tgz
- resolve-1.20.0.tgz
- :x: **path-parse-1.0.6.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/vincentpham13/serverless-jest/commit/402b2cba4a7e4bbe0402a0fe9a3dabe615025842">402b2cba4a7e4bbe0402a0fe9a3dabe615025842</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
All versions of package path-parse are vulnerable to Regular Expression Denial of Service (ReDoS) via splitDeviceRe, splitTailRe, and splitPathRe regular expressions. ReDoS exhibits polynomial worst-case time complexity.
<p>Publish Date: 2021-05-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23343>CVE-2021-23343</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/jbgutierrez/path-parse/issues/8">https://github.com/jbgutierrez/path-parse/issues/8</a></p>
<p>Release Date: 2021-05-04</p>
<p>Fix Resolution: path-parse - 1.0.7</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-23343 (High) detected in path-parse-1.0.6.tgz - ## CVE-2021-23343 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>path-parse-1.0.6.tgz</b></p></summary>
<p>Node.js path.parse() ponyfill</p>
<p>Library home page: <a href="https://registry.npmjs.org/path-parse/-/path-parse-1.0.6.tgz">https://registry.npmjs.org/path-parse/-/path-parse-1.0.6.tgz</a></p>
<p>Path to dependency file: serverless-jest/package.json</p>
<p>Path to vulnerable library: serverless-jest/node_modules/path-parse/package.json</p>
<p>
Dependency Hierarchy:
- jest-26.6.3.tgz (Root Library)
- core-26.6.3.tgz
- jest-resolve-26.6.2.tgz
- resolve-1.20.0.tgz
- :x: **path-parse-1.0.6.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/vincentpham13/serverless-jest/commit/402b2cba4a7e4bbe0402a0fe9a3dabe615025842">402b2cba4a7e4bbe0402a0fe9a3dabe615025842</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
All versions of package path-parse are vulnerable to Regular Expression Denial of Service (ReDoS) via splitDeviceRe, splitTailRe, and splitPathRe regular expressions. ReDoS exhibits polynomial worst-case time complexity.
<p>Publish Date: 2021-05-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23343>CVE-2021-23343</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/jbgutierrez/path-parse/issues/8">https://github.com/jbgutierrez/path-parse/issues/8</a></p>
<p>Release Date: 2021-05-04</p>
<p>Fix Resolution: path-parse - 1.0.7</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in path parse tgz cve high severity vulnerability vulnerable library path parse tgz node js path parse ponyfill library home page a href path to dependency file serverless jest package json path to vulnerable library serverless jest node modules path parse package json dependency hierarchy jest tgz root library core tgz jest resolve tgz resolve tgz x path parse tgz vulnerable library found in head commit a href found in base branch master vulnerability details all versions of package path parse are vulnerable to regular expression denial of service redos via splitdevicere splittailre and splitpathre regular expressions redos exhibits polynomial worst case time complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution path parse step up your open source security game with whitesource | 0 |
145,128 | 19,319,951,105 | IssuesEvent | 2021-12-14 03:32:57 | Sprinkle42/jenkins | https://api.github.com/repos/Sprinkle42/jenkins | opened | CVE-2020-15366 (Medium) detected in ajv-6.12.2.tgz | security vulnerability | ## CVE-2020-15366 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ajv-6.12.2.tgz</b></p></summary>
<p>Another JSON Schema Validator</p>
<p>Library home page: <a href="https://registry.npmjs.org/ajv/-/ajv-6.12.2.tgz">https://registry.npmjs.org/ajv/-/ajv-6.12.2.tgz</a></p>
<p>Path to dependency file: jenkins/war/package.json</p>
<p>Path to vulnerable library: jenkins/war/node_modules/ajv/package.json</p>
<p>
Dependency Hierarchy:
- file-loader-6.0.0.tgz (Root Library)
- schema-utils-2.6.6.tgz
- :x: **ajv-6.12.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Sprinkle42/jenkins/commit/9a9fb6059028eaf0b29dacacd5d944a4af38d15c">9a9fb6059028eaf0b29dacacd5d944a4af38d15c</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in ajv.validate() in Ajv (aka Another JSON Schema Validator) 6.12.2. A carefully crafted JSON schema could be provided that allows execution of other code by prototype pollution. (While untrusted schemas are recommended against, the worst case of an untrusted schema should be a denial of service, not execution of code.)
<p>Publish Date: 2020-07-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15366>CVE-2020-15366</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/ajv-validator/ajv/releases/tag/v6.12.3">https://github.com/ajv-validator/ajv/releases/tag/v6.12.3</a></p>
<p>Release Date: 2020-07-15</p>
<p>Fix Resolution: ajv - 6.12.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-15366 (Medium) detected in ajv-6.12.2.tgz - ## CVE-2020-15366 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ajv-6.12.2.tgz</b></p></summary>
<p>Another JSON Schema Validator</p>
<p>Library home page: <a href="https://registry.npmjs.org/ajv/-/ajv-6.12.2.tgz">https://registry.npmjs.org/ajv/-/ajv-6.12.2.tgz</a></p>
<p>Path to dependency file: jenkins/war/package.json</p>
<p>Path to vulnerable library: jenkins/war/node_modules/ajv/package.json</p>
<p>
Dependency Hierarchy:
- file-loader-6.0.0.tgz (Root Library)
- schema-utils-2.6.6.tgz
- :x: **ajv-6.12.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Sprinkle42/jenkins/commit/9a9fb6059028eaf0b29dacacd5d944a4af38d15c">9a9fb6059028eaf0b29dacacd5d944a4af38d15c</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in ajv.validate() in Ajv (aka Another JSON Schema Validator) 6.12.2. A carefully crafted JSON schema could be provided that allows execution of other code by prototype pollution. (While untrusted schemas are recommended against, the worst case of an untrusted schema should be a denial of service, not execution of code.)
<p>Publish Date: 2020-07-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15366>CVE-2020-15366</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/ajv-validator/ajv/releases/tag/v6.12.3">https://github.com/ajv-validator/ajv/releases/tag/v6.12.3</a></p>
<p>Release Date: 2020-07-15</p>
<p>Fix Resolution: ajv - 6.12.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in ajv tgz cve medium severity vulnerability vulnerable library ajv tgz another json schema validator library home page a href path to dependency file jenkins war package json path to vulnerable library jenkins war node modules ajv package json dependency hierarchy file loader tgz root library schema utils tgz x ajv tgz vulnerable library found in head commit a href found in base branch master vulnerability details an issue was discovered in ajv validate in ajv aka another json schema validator a carefully crafted json schema could be provided that allows execution of other code by prototype pollution while untrusted schemas are recommended against the worst case of an untrusted schema should be a denial of service not execution of code publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ajv step up your open source security game with whitesource | 0 |
53,403 | 28,119,261,132 | IssuesEvent | 2023-03-31 13:08:20 | ARK-Builders/arklib | https://api.github.com/repos/ARK-Builders/arklib | opened | Update method should return complete resources with their details | performance | `ResourceIndex` should pass complete resources with their details during `update`.
Right now, only `ResourceId`s are passed. This causes the library clients to reconstruct details again.
See `fun compute` in `Resource.kt` (https://github.com/ARK-Builders/arklib-android). | True | Update method should return complete resources with their details - `ResourceIndex` should pass complete resources with their details during `update`.
Right now, only `ResourceId`s are passed. This causes the library clients to reconstruct details again.
See `fun compute` in `Resource.kt` (https://github.com/ARK-Builders/arklib-android). | non_priority | update method should return complete resources with their details resourceindex should pass complete resources with their details during update right now only resourceid s are passed this causes the library clients to reconstruct details again see fun compute in resource kt | 0 |
46,096 | 18,952,549,282 | IssuesEvent | 2021-11-18 16:32:15 | hashicorp/terraform-provider-aws | https://api.github.com/repos/hashicorp/terraform-provider-aws | closed | resource/aws_route: aws_route does not respect a high create timeout and gives up always after 21 retries | service/ec2 needs-triage | <!---
Please note the following potential times when an issue might be in Terraform core:
* [Configuration Language](https://www.terraform.io/docs/configuration/index.html) or resource ordering issues
* [State](https://www.terraform.io/docs/state/index.html) and [State Backend](https://www.terraform.io/docs/backends/index.html) issues
* [Provisioner](https://www.terraform.io/docs/provisioners/index.html) issues
* [Registry](https://registry.terraform.io/) issues
* Spans resources across multiple providers
If you are running into one of these scenarios, we recommend opening an issue in the [Terraform core repository](https://github.com/hashicorp/terraform/) instead.
--->
<!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Terraform CLI and Terraform AWS Provider Version
<!--- Please run `terraform -v` to show the Terraform core version and provider version(s). If you are not running the latest version of Terraform or the provider, please upgrade because your issue may have already been fixed. [Terraform documentation on provider versioning](https://www.terraform.io/docs/configuration/providers.html#provider-versions). --->
terraform version - 0.12.31
provider-aws version - 3.63.0
### Affected Resource(s)
<!--- Please list the affected resources and data sources. --->
* aws_route
### Terraform Configuration Files
<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->
Please include all Terraform configurations required to reproduce the bug. Bug reports without a functional reproduction may be closed without investigation.
```hcl
provider "aws" {
access_key = var.ACCESS_KEY_ID
secret_key = var.SECRET_ACCESS_KEY
region = "eu-west-1"
}
resource "aws_vpc" "vpc" {
cidr_block = "10.222.0.0/16"
enable_dns_support = true
enable_dns_hostnames = true
}
resource "aws_subnet" "public_utility_z0" {
vpc_id = aws_vpc.vpc.id
cidr_block = "10.222.96.0/26"
availability_zone = "eu-west-1a"
}
resource "aws_eip" "eip_natgw_z0" {
vpc = true
}
resource "aws_nat_gateway" "natgw_z0" {
allocation_id = aws_eip.eip_natgw_z0.id
subnet_id = aws_subnet.public_utility_z0.id
}
resource "aws_route_table" "routetable_private_utility_z0" {
vpc_id = aws_vpc.vpc.id
}
resource "aws_route" "private_utility_z0_nat" {
route_table_id = aws_route_table.routetable_private_utility_z0.id
destination_cidr_block = "0.0.0.0/0"
nat_gateway_id = aws_nat_gateway.natgw_z0.id
timeouts {
create = "5m"
}
}
```
### Debug Output
<!---
Please provide a link to a GitHub Gist containing the complete debug output. Please do NOT paste the debug output in the issue; just paste a link to the Gist.
To obtain the debug output, see the [Terraform documentation on debugging](https://www.terraform.io/docs/internals/debugging.html).
--->
### Panic Output
<!--- If Terraform produced a panic, please provide a link to a GitHub Gist containing the output of the `crash.log`. --->
### Expected Behavior
aws_route to respect the configured timeout:
```hcl
timeouts {
create = "5m"
}
```
### Actual Behavior
Route creation always times out after 21 retries (`2m30s`) and does not respect the configured create timeout:
`terraform apply` of the configuration from above potentially fails with:
```
aws_route.private_utility_z0_nat: Creating...
aws_route.private_utility_z0_nat: Still creating... [10s elapsed]
aws_route.private_utility_z0_nat: Still creating... [20s elapsed]
aws_route.private_utility_z0_nat: Still creating... [30s elapsed]
aws_route.private_utility_z0_nat: Still creating... [40s elapsed]
aws_route.private_utility_z0_nat: Still creating... [50s elapsed]
aws_route.private_utility_z0_nat: Still creating... [1m0s elapsed]
aws_route.private_utility_z0_nat: Still creating... [1m10s elapsed]
aws_route.private_utility_z0_nat: Still creating... [1m20s elapsed]
aws_route.private_utility_z0_nat: Still creating... [1m30s elapsed]
aws_route.private_utility_z0_nat: Still creating... [1m40s elapsed]
aws_route.private_utility_z0_nat: Still creating... [1m50s elapsed]
aws_route.private_utility_z0_nat: Still creating... [2m0s elapsed]
aws_route.private_utility_z0_nat: Still creating... [2m10s elapsed]
aws_route.private_utility_z0_nat: Still creating... [2m20s elapsed]
aws_route.private_utility_z0_nat: Still creating... [2m30s elapsed]
Error: error waiting for Route in Route Table (rtb-1234) with destination (0.0.0.0/0) to become available: couldn't find resource (21 retries)
on tf/main.tf line 245, in resource "aws_route" "private_utility_z0_nat":
245: resource "aws_route" "private_utility_z0_nat" {
```
### Steps to Reproduce
See above.
### References
<!---
Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests
Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor documentation? For example:
--->
* https://github.com/hashicorp/terraform-provider-aws/issues/21525
| 1.0 | resource/aws_route: aws_route does not respect a high create timeout and gives up always after 21 retries - <!---
Please note the following potential times when an issue might be in Terraform core:
* [Configuration Language](https://www.terraform.io/docs/configuration/index.html) or resource ordering issues
* [State](https://www.terraform.io/docs/state/index.html) and [State Backend](https://www.terraform.io/docs/backends/index.html) issues
* [Provisioner](https://www.terraform.io/docs/provisioners/index.html) issues
* [Registry](https://registry.terraform.io/) issues
* Spans resources across multiple providers
If you are running into one of these scenarios, we recommend opening an issue in the [Terraform core repository](https://github.com/hashicorp/terraform/) instead.
--->
<!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Terraform CLI and Terraform AWS Provider Version
<!--- Please run `terraform -v` to show the Terraform core version and provider version(s). If you are not running the latest version of Terraform or the provider, please upgrade because your issue may have already been fixed. [Terraform documentation on provider versioning](https://www.terraform.io/docs/configuration/providers.html#provider-versions). --->
terraform version - 0.12.31
provider-aws version - 3.63.0
### Affected Resource(s)
<!--- Please list the affected resources and data sources. --->
* aws_route
### Terraform Configuration Files
<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->
Please include all Terraform configurations required to reproduce the bug. Bug reports without a functional reproduction may be closed without investigation.
```hcl
provider "aws" {
access_key = var.ACCESS_KEY_ID
secret_key = var.SECRET_ACCESS_KEY
region = "eu-west-1"
}
resource "aws_vpc" "vpc" {
cidr_block = "10.222.0.0/16"
enable_dns_support = true
enable_dns_hostnames = true
}
resource "aws_subnet" "public_utility_z0" {
vpc_id = aws_vpc.vpc.id
cidr_block = "10.222.96.0/26"
availability_zone = "eu-west-1a"
}
resource "aws_eip" "eip_natgw_z0" {
vpc = true
}
resource "aws_nat_gateway" "natgw_z0" {
allocation_id = aws_eip.eip_natgw_z0.id
subnet_id = aws_subnet.public_utility_z0.id
}
resource "aws_route_table" "routetable_private_utility_z0" {
vpc_id = aws_vpc.vpc.id
}
resource "aws_route" "private_utility_z0_nat" {
route_table_id = aws_route_table.routetable_private_utility_z0.id
destination_cidr_block = "0.0.0.0/0"
nat_gateway_id = aws_nat_gateway.natgw_z0.id
timeouts {
create = "5m"
}
}
```
### Debug Output
<!---
Please provide a link to a GitHub Gist containing the complete debug output. Please do NOT paste the debug output in the issue; just paste a link to the Gist.
To obtain the debug output, see the [Terraform documentation on debugging](https://www.terraform.io/docs/internals/debugging.html).
--->
### Panic Output
<!--- If Terraform produced a panic, please provide a link to a GitHub Gist containing the output of the `crash.log`. --->
### Expected Behavior
aws_route to respect the configured timeout:
```hcl
timeouts {
create = "5m"
}
```
### Actual Behavior
Route creation always times out after 21 retries (`2m30s`) and does not respect the configured create timeout:
`terraform apply` of the configuration from above potentially fails with:
```
aws_route.private_utility_z0_nat: Creating...
aws_route.private_utility_z0_nat: Still creating... [10s elapsed]
aws_route.private_utility_z0_nat: Still creating... [20s elapsed]
aws_route.private_utility_z0_nat: Still creating... [30s elapsed]
aws_route.private_utility_z0_nat: Still creating... [40s elapsed]
aws_route.private_utility_z0_nat: Still creating... [50s elapsed]
aws_route.private_utility_z0_nat: Still creating... [1m0s elapsed]
aws_route.private_utility_z0_nat: Still creating... [1m10s elapsed]
aws_route.private_utility_z0_nat: Still creating... [1m20s elapsed]
aws_route.private_utility_z0_nat: Still creating... [1m30s elapsed]
aws_route.private_utility_z0_nat: Still creating... [1m40s elapsed]
aws_route.private_utility_z0_nat: Still creating... [1m50s elapsed]
aws_route.private_utility_z0_nat: Still creating... [2m0s elapsed]
aws_route.private_utility_z0_nat: Still creating... [2m10s elapsed]
aws_route.private_utility_z0_nat: Still creating... [2m20s elapsed]
aws_route.private_utility_z0_nat: Still creating... [2m30s elapsed]
Error: error waiting for Route in Route Table (rtb-1234) with destination (0.0.0.0/0) to become available: couldn't find resource (21 retries)
on tf/main.tf line 245, in resource "aws_route" "private_utility_z0_nat":
245: resource "aws_route" "private_utility_z0_nat" {
```
### Steps to Reproduce
See above.
### References
<!---
Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests
Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor documentation? For example:
--->
* https://github.com/hashicorp/terraform-provider-aws/issues/21525
| non_priority | resource aws route aws route does not respect a high create timeout and gives up always after retries please note the following potential times when an issue might be in terraform core or resource ordering issues and issues issues issues spans resources across multiple providers if you are running into one of these scenarios we recommend opening an issue in the instead community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or other comments that do not add relevant new information or questions they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment terraform cli and terraform aws provider version terraform version provider aws version affected resource s aws route terraform configuration files please include all terraform configurations required to reproduce the bug bug reports without a functional reproduction may be closed without investigation hcl provider aws access key var access key id secret key var secret access key region eu west resource aws vpc vpc cidr block enable dns support true enable dns hostnames true resource aws subnet public utility vpc id aws vpc vpc id cidr block availability zone eu west resource aws eip eip natgw vpc true resource aws nat gateway natgw allocation id aws eip eip natgw id subnet id aws subnet public utility id resource aws route table routetable private utility vpc id aws vpc vpc id resource aws route private utility nat route table id aws route table routetable private utility id destination cidr block nat gateway id aws nat gateway natgw id timeouts create debug output please provide a link to a github gist containing the complete debug output please do not paste the debug output in the issue just paste a link to the gist to obtain the debug output see the panic output expected behavior aws route to respect the configured timeout hcl timeouts create actual behavior route creation always times out after retries and does not respect the configured create timeout terraform apply of the configuration from above potentially fails with aws route private utility nat creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating aws route private utility nat still creating error error waiting for route in route table rtb with destination to become available couldn t find resource retries on tf main tf line in resource aws route private utility nat resource aws route private utility nat steps to reproduce see above references information about referencing github issues are there any other github issues open or closed or pull requests that should be linked here vendor documentation for example | 0 |
257,021 | 27,561,756,109 | IssuesEvent | 2023-03-07 22:44:22 | samqws-marketing/coursera_naptime | https://api.github.com/repos/samqws-marketing/coursera_naptime | closed | CVE-2020-36180 (High) detected in multiple libraries - autoclosed | Mend: dependency security vulnerability | ## CVE-2020-36180 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.9.0.jar</b>, <b>jackson-databind-2.3.3.jar</b>, <b>jackson-databind-2.8.11.4.jar</b></p></summary>
<p>
<details><summary><b>jackson-databind-2.9.0.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.9.0.jar</p>
<p>
Dependency Hierarchy:
- play-ehcache_2.12-2.6.25.jar (Root Library)
- play_2.12-2.6.25.jar
- play-json_2.12-2.6.14.jar
- jackson-datatype-jdk8-2.8.11.jar
- :x: **jackson-databind-2.9.0.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.3.3.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.3.3.jar</p>
<p>
Dependency Hierarchy:
- sbt-plugin-2.4.4.jar (Root Library)
- sbt-js-engine-1.1.3.jar
- npm_2.10-1.1.1.jar
- webjars-locator-0.26.jar
- :x: **jackson-databind-2.3.3.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.8.11.4.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.8.11.4.jar</p>
<p>
Dependency Hierarchy:
- play-ehcache_2.12-2.6.25.jar (Root Library)
- play_2.12-2.6.25.jar
- :x: **jackson-databind-2.8.11.4.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/coursera_naptime/commit/95750513b615ecf0ea9b7e14fb5f71e577d01a1f">95750513b615ecf0ea9b7e14fb5f71e577d01a1f</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.dbcp2.cpdsadapter.DriverAdapterCPDS.
<p>Publish Date: 2021-01-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-36180>CVE-2020-36180</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2021-01-07</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
| True | CVE-2020-36180 (High) detected in multiple libraries - autoclosed - ## CVE-2020-36180 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.9.0.jar</b>, <b>jackson-databind-2.3.3.jar</b>, <b>jackson-databind-2.8.11.4.jar</b></p></summary>
<p>
<details><summary><b>jackson-databind-2.9.0.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.9.0.jar</p>
<p>
Dependency Hierarchy:
- play-ehcache_2.12-2.6.25.jar (Root Library)
- play_2.12-2.6.25.jar
- play-json_2.12-2.6.14.jar
- jackson-datatype-jdk8-2.8.11.jar
- :x: **jackson-databind-2.9.0.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.3.3.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.3.3.jar</p>
<p>
Dependency Hierarchy:
- sbt-plugin-2.4.4.jar (Root Library)
- sbt-js-engine-1.1.3.jar
- npm_2.10-1.1.1.jar
- webjars-locator-0.26.jar
- :x: **jackson-databind-2.3.3.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.8.11.4.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.8.11.4.jar</p>
<p>
Dependency Hierarchy:
- play-ehcache_2.12-2.6.25.jar (Root Library)
- play_2.12-2.6.25.jar
- :x: **jackson-databind-2.8.11.4.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/coursera_naptime/commit/95750513b615ecf0ea9b7e14fb5f71e577d01a1f">95750513b615ecf0ea9b7e14fb5f71e577d01a1f</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.apache.commons.dbcp2.cpdsadapter.DriverAdapterCPDS.
<p>Publish Date: 2021-01-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-36180>CVE-2020-36180</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2021-01-07</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
| non_priority | cve high detected in multiple libraries autoclosed cve high severity vulnerability vulnerable libraries jackson databind jar jackson databind jar jackson databind jar jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library home wss scanner cache com fasterxml jackson core jackson databind bundles jackson databind jar dependency hierarchy play ehcache jar root library play jar play json jar jackson datatype jar x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api path to vulnerable library home wss scanner cache com fasterxml jackson core jackson databind bundles jackson databind jar dependency hierarchy sbt plugin jar root library sbt js engine jar npm jar webjars locator jar x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library home wss scanner cache com fasterxml jackson core jackson databind bundles jackson databind jar dependency hierarchy play ehcache jar root library play jar x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache commons cpdsadapter driveradaptercpds publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution com fasterxml jackson core jackson databind | 0 |
338,840 | 24,601,291,367 | IssuesEvent | 2022-10-14 12:43:56 | napari/napari | https://api.github.com/repos/napari/napari | opened | Update workflow for deploying docs from main repo | documentation task | ## 🧰 Task
Since #5216, we have docs in the github.com/napari/docs repo. As we were planning this move, we thought that this meant that docs need to be built in that repo and then deployed from there. But actually, we can just build in either place:
- in the docs repo, we clone the current napari/napari@main, and then build the docs with the current main or PR within that repo, and (optionally) deploy.
- in the main repo, we just clone the current napari/docs@main, then build the docs with the current main or PR, and then optionally deploy.
Since both repos have deploy keys to napari.github.io, both can do this. As far as I can tell, you won't get into race conditions here — later PRs in one will contain the latest commit in the other one, even if that one has not yet finished building. It *could* happen that one build hangs, but that can happen already — a manual workflow trigger would solve this, or we can just wait till the next PR.
The advantage of this is that we don't need to do the remote workflow trigger, which requires my personal access token, which is yucky.
| 1.0 | Update workflow for deploying docs from main repo - ## 🧰 Task
Since #5216, we have docs in the github.com/napari/docs repo. As we were planning this move, we thought that this meant that docs need to be built in that repo and then deployed from there. But actually, we can just build in either place:
- in the docs repo, we clone the current napari/napari@main, and then build the docs with the current main or PR within that repo, and (optionally) deploy.
- in the main repo, we just clone the current napari/docs@main, then build the docs with the current main or PR, and then optionally deploy.
Since both repos have deploy keys to napari.github.io, both can do this. As far as I can tell, you won't get into race conditions here — later PRs in one will contain the latest commit in the other one, even if that one has not yet finished building. It *could* happen that one build hangs, but that can happen already — a manual workflow trigger would solve this, or we can just wait till the next PR.
The advantage of this is that we don't need to do the remote workflow trigger, which requires my personal access token, which is yucky.
| non_priority | update workflow for deploying docs from main repo 🧰 task since we have docs in the github com napari docs repo as we were planning this move we thought that this meant that docs need to be built in that repo and then deployed from there but actually we can just build in either place in the docs repo we clone the current napari napari main and then build the docs with the current main or pr within that repo and optionally deploy in the main repo we just clone the current napari docs main then build the docs with the current main or pr and then optionally deploy since both repos have deploy keys to napari github io both can do this as far as i can tell you won t get into race conditions here — later prs in one will contain the latest commit in the other one even if that one has not yet finished building it could happen that one build hangs but that can happen already — a manual workflow trigger would solve this or we can just wait till the next pr the advantage of this is that we don t need to do the remote workflow trigger which requires my personal access token which is yucky | 0 |
324,823 | 27,823,304,227 | IssuesEvent | 2023-03-19 13:37:13 | JeriRov/TestTask | https://api.github.com/repos/JeriRov/TestTask | closed | Create a service for a person entity | enhancement test | Create a service that will return a person's age by their id, and a method that will return a person by their id.
<details>
<summary>Tasks</summary>
- [x] create a service and repository
- [x] method returns the person's age by id
- [x] method returns a person by id
- [x] create a unit tests
</details> | 1.0 | Create a service for a person entity - Create a service that will return a person's age by their id, and a method that will return a person by their id.
<details>
<summary>Tasks</summary>
- [x] create a service and repository
- [x] method returns the person's age by id
- [x] method returns a person by id
- [x] create a unit tests
</details> | non_priority | create a service for a person entity create a service that will return a person s age by their id and a method that will return a person by their id tasks create a service and repository method returns the person s age by id method returns a person by id create a unit tests | 0 |
154,152 | 13,539,523,828 | IssuesEvent | 2020-09-16 13:34:40 | JHBitencourt/timeline_tile | https://api.github.com/repos/JHBitencourt/timeline_tile | closed | Inaccurate Doc Comments for LineStyle class | documentation | Version: 0.1.2
There some inaccuracy in the Doc Comments.
- Doc Comments Says the default value is 10, but the constructor initialises the value to 4
| 1.0 | Inaccurate Doc Comments for LineStyle class - Version: 0.1.2
There some inaccuracy in the Doc Comments.
- Doc Comments Says the default value is 10, but the constructor initialises the value to 4
| non_priority | inaccurate doc comments for linestyle class version there some inaccuracy in the doc comments doc comments says the default value is but the constructor initialises the value to | 0 |
20,300 | 4,535,873,232 | IssuesEvent | 2016-09-08 18:40:18 | scikit-learn/scikit-learn | https://api.github.com/repos/scikit-learn/scikit-learn | closed | MSE is negative when returned by cross_val_score | API Bug Documentation Need Contributor | The Mean Square Error returned by sklearn.cross_validation.cross_val_score is always a negative. While being a designed decision so that the output of this function can be used for maximization given some hyperparameters, it's extremely confusing when using cross_val_score directly. At least I asked myself how a the mean of a square can possibly be negative and thought that cross_val_score was not working correctly or did not use the supplied metric. Only after digging in the sklearn source code I realized that the sign was flipped.
This behavior is mentioned in make_scorer in scorer.py, however it's not mentioned in cross_val_score and I think it should be, because otherwise it makes people think that cross_val_score is not working correctly. | 1.0 | MSE is negative when returned by cross_val_score - The Mean Square Error returned by sklearn.cross_validation.cross_val_score is always a negative. While being a designed decision so that the output of this function can be used for maximization given some hyperparameters, it's extremely confusing when using cross_val_score directly. At least I asked myself how a the mean of a square can possibly be negative and thought that cross_val_score was not working correctly or did not use the supplied metric. Only after digging in the sklearn source code I realized that the sign was flipped.
This behavior is mentioned in make_scorer in scorer.py, however it's not mentioned in cross_val_score and I think it should be, because otherwise it makes people think that cross_val_score is not working correctly. | non_priority | mse is negative when returned by cross val score the mean square error returned by sklearn cross validation cross val score is always a negative while being a designed decision so that the output of this function can be used for maximization given some hyperparameters it s extremely confusing when using cross val score directly at least i asked myself how a the mean of a square can possibly be negative and thought that cross val score was not working correctly or did not use the supplied metric only after digging in the sklearn source code i realized that the sign was flipped this behavior is mentioned in make scorer in scorer py however it s not mentioned in cross val score and i think it should be because otherwise it makes people think that cross val score is not working correctly | 0 |
93,056 | 15,872,997,915 | IssuesEvent | 2021-04-09 01:17:55 | rammatzkvosky/patroni | https://api.github.com/repos/rammatzkvosky/patroni | opened | CVE-2020-14343 (High) detected in PyYAML-5.2.tar.gz | security vulnerability | ## CVE-2020-14343 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>PyYAML-5.2.tar.gz</b></p></summary>
<p>YAML parser and emitter for Python</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/8d/c9/e5be955a117a1ac548cdd31e37e8fd7b02ce987f9655f5c7563c656d5dcb/PyYAML-5.2.tar.gz">https://files.pythonhosted.org/packages/8d/c9/e5be955a117a1ac548cdd31e37e8fd7b02ce987f9655f5c7563c656d5dcb/PyYAML-5.2.tar.gz</a></p>
<p>Path to dependency file: patroni/requirements.txt</p>
<p>Path to vulnerable library: patroni/requirements.txt</p>
<p>
Dependency Hierarchy:
- kubernetes-10.0.1-py2.py3-none-any.whl (Root Library)
- :x: **PyYAML-5.2.tar.gz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability was discovered in the PyYAML library in versions before 5.4, where it is susceptible to arbitrary code execution when it processes untrusted YAML files through the full_load method or with the FullLoader loader. Applications that use the library to process untrusted input may be vulnerable to this flaw. This flaw allows an attacker to execute arbitrary code on the system by abusing the python/object/new constructor. This flaw is due to an incomplete fix for CVE-2020-1747.
<p>Publish Date: 2021-02-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14343>CVE-2020-14343</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14343">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14343</a></p>
<p>Release Date: 2021-02-09</p>
<p>Fix Resolution: PyYAML - 5.4</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"PyYAML","packageVersion":"5.2","packageFilePaths":["/requirements.txt"],"isTransitiveDependency":true,"dependencyTree":"kubernetes:10.0.1;PyYAML:5.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"PyYAML - 5.4"}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-14343","vulnerabilityDetails":"A vulnerability was discovered in the PyYAML library in versions before 5.4, where it is susceptible to arbitrary code execution when it processes untrusted YAML files through the full_load method or with the FullLoader loader. Applications that use the library to process untrusted input may be vulnerable to this flaw. This flaw allows an attacker to execute arbitrary code on the system by abusing the python/object/new constructor. This flaw is due to an incomplete fix for CVE-2020-1747.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14343","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> --> | True | CVE-2020-14343 (High) detected in PyYAML-5.2.tar.gz - ## CVE-2020-14343 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>PyYAML-5.2.tar.gz</b></p></summary>
<p>YAML parser and emitter for Python</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/8d/c9/e5be955a117a1ac548cdd31e37e8fd7b02ce987f9655f5c7563c656d5dcb/PyYAML-5.2.tar.gz">https://files.pythonhosted.org/packages/8d/c9/e5be955a117a1ac548cdd31e37e8fd7b02ce987f9655f5c7563c656d5dcb/PyYAML-5.2.tar.gz</a></p>
<p>Path to dependency file: patroni/requirements.txt</p>
<p>Path to vulnerable library: patroni/requirements.txt</p>
<p>
Dependency Hierarchy:
- kubernetes-10.0.1-py2.py3-none-any.whl (Root Library)
- :x: **PyYAML-5.2.tar.gz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability was discovered in the PyYAML library in versions before 5.4, where it is susceptible to arbitrary code execution when it processes untrusted YAML files through the full_load method or with the FullLoader loader. Applications that use the library to process untrusted input may be vulnerable to this flaw. This flaw allows an attacker to execute arbitrary code on the system by abusing the python/object/new constructor. This flaw is due to an incomplete fix for CVE-2020-1747.
<p>Publish Date: 2021-02-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14343>CVE-2020-14343</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14343">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14343</a></p>
<p>Release Date: 2021-02-09</p>
<p>Fix Resolution: PyYAML - 5.4</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"PyYAML","packageVersion":"5.2","packageFilePaths":["/requirements.txt"],"isTransitiveDependency":true,"dependencyTree":"kubernetes:10.0.1;PyYAML:5.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"PyYAML - 5.4"}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-14343","vulnerabilityDetails":"A vulnerability was discovered in the PyYAML library in versions before 5.4, where it is susceptible to arbitrary code execution when it processes untrusted YAML files through the full_load method or with the FullLoader loader. Applications that use the library to process untrusted input may be vulnerable to this flaw. This flaw allows an attacker to execute arbitrary code on the system by abusing the python/object/new constructor. This flaw is due to an incomplete fix for CVE-2020-1747.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14343","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> --> | non_priority | cve high detected in pyyaml tar gz cve high severity vulnerability vulnerable library pyyaml tar gz yaml parser and emitter for python library home page a href path to dependency file patroni requirements txt path to vulnerable library patroni requirements txt dependency hierarchy kubernetes none any whl root library x pyyaml tar gz vulnerable library vulnerability details a vulnerability was discovered in the pyyaml library in versions before where it is susceptible to arbitrary code execution when it processes untrusted yaml files through the full load method or with the fullloader loader applications that use the library to process untrusted input may be vulnerable to this flaw this flaw allows an attacker to execute arbitrary code on the system by abusing the python object new constructor this flaw is due to an incomplete fix for cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution pyyaml isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree kubernetes pyyaml isminimumfixversionavailable true minimumfixversion pyyaml basebranches vulnerabilityidentifier cve vulnerabilitydetails a vulnerability was discovered in the pyyaml library in versions before where it is susceptible to arbitrary code execution when it processes untrusted yaml files through the full load method or with the fullloader loader applications that use the library to process untrusted input may be vulnerable to this flaw this flaw allows an attacker to execute arbitrary code on the system by abusing the python object new constructor this flaw is due to an incomplete fix for cve vulnerabilityurl | 0 |
245,971 | 26,577,361,555 | IssuesEvent | 2023-01-22 01:00:08 | tabacws-sandbox/mattermost-golang | https://api.github.com/repos/tabacws-sandbox/mattermost-golang | opened | github.com/prometheus/client_golang-v1.11.0: 1 vulnerabilities (highest severity is: 7.5) | security vulnerability | <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>github.com/prometheus/client_golang-v1.11.0</b></p></summary>
<p>Prometheus instrumentation library for Go applications</p>
<p>Library home page: <a href="https://proxy.golang.org/github.com/prometheus/client_golang/@v/v1.11.0.zip">https://proxy.golang.org/github.com/prometheus/client_golang/@v/v1.11.0.zip</a></p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/tabacws-sandbox/mattermost-golang/commit/fd9fc1baf3cd86beecdfe1d4b962b3e768b4ff92">fd9fc1baf3cd86beecdfe1d4b962b3e768b4ff92</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (github.com/prometheus/client_golang-v1.11.0 version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-21698](https://www.mend.io/vulnerability-database/CVE-2022-21698) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | github.com/prometheus/client_golang-v1.11.0 | Direct | v1.11.1 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-21698</summary>
### Vulnerable Library - <b>github.com/prometheus/client_golang-v1.11.0</b></p>
<p>Prometheus instrumentation library for Go applications</p>
<p>Library home page: <a href="https://proxy.golang.org/github.com/prometheus/client_golang/@v/v1.11.0.zip">https://proxy.golang.org/github.com/prometheus/client_golang/@v/v1.11.0.zip</a></p>
<p>
Dependency Hierarchy:
- :x: **github.com/prometheus/client_golang-v1.11.0** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/tabacws-sandbox/mattermost-golang/commit/fd9fc1baf3cd86beecdfe1d4b962b3e768b4ff92">fd9fc1baf3cd86beecdfe1d4b962b3e768b4ff92</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
client_golang is the instrumentation library for Go applications in Prometheus, and the promhttp package in client_golang provides tooling around HTTP servers and clients. In client_golang prior to version 1.11.1, HTTP server is susceptible to a Denial of Service through unbounded cardinality, and potential memory exhaustion, when handling requests with non-standard HTTP methods. In order to be affected, an instrumented software must use any of `promhttp.InstrumentHandler*` middleware except `RequestsInFlight`; not filter any specific methods (e.g GET) before middleware; pass metric with `method` label name to our middleware; and not have any firewall/LB/proxy that filters away requests with unknown `method`. client_golang version 1.11.1 contains a patch for this issue. Several workarounds are available, including removing the `method` label name from counter/gauge used in the InstrumentHandler; turning off affected promhttp handlers; adding custom middleware before promhttp handler that will sanitize the request method given by Go http.Request; and using a reverse proxy or web application firewall, configured to only allow a limited set of methods.
<p>Publish Date: 2022-02-15
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-21698>CVE-2022-21698</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/prometheus/client_golang/security/advisories/GHSA-cg3q-j54f-5p7p">https://github.com/prometheus/client_golang/security/advisories/GHSA-cg3q-j54f-5p7p</a></p>
<p>Release Date: 2022-02-15</p>
<p>Fix Resolution: v1.11.1</p>
</p>
<p></p>
</details> | True | github.com/prometheus/client_golang-v1.11.0: 1 vulnerabilities (highest severity is: 7.5) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>github.com/prometheus/client_golang-v1.11.0</b></p></summary>
<p>Prometheus instrumentation library for Go applications</p>
<p>Library home page: <a href="https://proxy.golang.org/github.com/prometheus/client_golang/@v/v1.11.0.zip">https://proxy.golang.org/github.com/prometheus/client_golang/@v/v1.11.0.zip</a></p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/tabacws-sandbox/mattermost-golang/commit/fd9fc1baf3cd86beecdfe1d4b962b3e768b4ff92">fd9fc1baf3cd86beecdfe1d4b962b3e768b4ff92</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (github.com/prometheus/client_golang-v1.11.0 version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-21698](https://www.mend.io/vulnerability-database/CVE-2022-21698) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | github.com/prometheus/client_golang-v1.11.0 | Direct | v1.11.1 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-21698</summary>
### Vulnerable Library - <b>github.com/prometheus/client_golang-v1.11.0</b></p>
<p>Prometheus instrumentation library for Go applications</p>
<p>Library home page: <a href="https://proxy.golang.org/github.com/prometheus/client_golang/@v/v1.11.0.zip">https://proxy.golang.org/github.com/prometheus/client_golang/@v/v1.11.0.zip</a></p>
<p>
Dependency Hierarchy:
- :x: **github.com/prometheus/client_golang-v1.11.0** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/tabacws-sandbox/mattermost-golang/commit/fd9fc1baf3cd86beecdfe1d4b962b3e768b4ff92">fd9fc1baf3cd86beecdfe1d4b962b3e768b4ff92</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
client_golang is the instrumentation library for Go applications in Prometheus, and the promhttp package in client_golang provides tooling around HTTP servers and clients. In client_golang prior to version 1.11.1, HTTP server is susceptible to a Denial of Service through unbounded cardinality, and potential memory exhaustion, when handling requests with non-standard HTTP methods. In order to be affected, an instrumented software must use any of `promhttp.InstrumentHandler*` middleware except `RequestsInFlight`; not filter any specific methods (e.g GET) before middleware; pass metric with `method` label name to our middleware; and not have any firewall/LB/proxy that filters away requests with unknown `method`. client_golang version 1.11.1 contains a patch for this issue. Several workarounds are available, including removing the `method` label name from counter/gauge used in the InstrumentHandler; turning off affected promhttp handlers; adding custom middleware before promhttp handler that will sanitize the request method given by Go http.Request; and using a reverse proxy or web application firewall, configured to only allow a limited set of methods.
<p>Publish Date: 2022-02-15
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-21698>CVE-2022-21698</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/prometheus/client_golang/security/advisories/GHSA-cg3q-j54f-5p7p">https://github.com/prometheus/client_golang/security/advisories/GHSA-cg3q-j54f-5p7p</a></p>
<p>Release Date: 2022-02-15</p>
<p>Fix Resolution: v1.11.1</p>
</p>
<p></p>
</details> | non_priority | github com prometheus client golang vulnerabilities highest severity is vulnerable library github com prometheus client golang prometheus instrumentation library for go applications library home page a href found in head commit a href vulnerabilities cve severity cvss dependency type fixed in github com prometheus client golang version remediation available high github com prometheus client golang direct details cve vulnerable library github com prometheus client golang prometheus instrumentation library for go applications library home page a href dependency hierarchy x github com prometheus client golang vulnerable library found in head commit a href found in base branch master vulnerability details client golang is the instrumentation library for go applications in prometheus and the promhttp package in client golang provides tooling around http servers and clients in client golang prior to version http server is susceptible to a denial of service through unbounded cardinality and potential memory exhaustion when handling requests with non standard http methods in order to be affected an instrumented software must use any of promhttp instrumenthandler middleware except requestsinflight not filter any specific methods e g get before middleware pass metric with method label name to our middleware and not have any firewall lb proxy that filters away requests with unknown method client golang version contains a patch for this issue several workarounds are available including removing the method label name from counter gauge used in the instrumenthandler turning off affected promhttp handlers adding custom middleware before promhttp handler that will sanitize the request method given by go http request and using a reverse proxy or web application firewall configured to only allow a limited set of methods publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution | 0 |
123,308 | 17,772,206,101 | IssuesEvent | 2021-08-30 14:51:13 | kapseliboi/plywood | https://api.github.com/repos/kapseliboi/plywood | opened | WS-2019-0331 (Medium) detected in handlebars-4.1.0.tgz | security vulnerability | ## WS-2019-0331 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.1.0.tgz</b></p></summary>
<p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p>
<p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.0.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.0.tgz</a></p>
<p>Path to dependency file: plywood/package.json</p>
<p>Path to vulnerable library: plywood/node_modules/handlebars/package.json</p>
<p>
Dependency Hierarchy:
- istanbul-0.4.5.tgz (Root Library)
- :x: **handlebars-4.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/kapseliboi/plywood/commit/04b168b8948e7e181add52c41509f5f80da1b070">04b168b8948e7e181add52c41509f5f80da1b070</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Execution vulnerability found in handlebars before 4.5.2. Lookup helper fails to validate templates. Attack may submit templates that execute arbitrary JavaScript in the system.
<p>Publish Date: 2019-11-13
<p>URL: <a href=https://github.com/wycats/handlebars.js/commit/d54137810a49939fd2ad01a91a34e182ece4528e>WS-2019-0331</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1316">https://www.npmjs.com/advisories/1316</a></p>
<p>Release Date: 2019-12-05</p>
<p>Fix Resolution: handlebars - 4.5.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | WS-2019-0331 (Medium) detected in handlebars-4.1.0.tgz - ## WS-2019-0331 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.1.0.tgz</b></p></summary>
<p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p>
<p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.0.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.0.tgz</a></p>
<p>Path to dependency file: plywood/package.json</p>
<p>Path to vulnerable library: plywood/node_modules/handlebars/package.json</p>
<p>
Dependency Hierarchy:
- istanbul-0.4.5.tgz (Root Library)
- :x: **handlebars-4.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/kapseliboi/plywood/commit/04b168b8948e7e181add52c41509f5f80da1b070">04b168b8948e7e181add52c41509f5f80da1b070</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Execution vulnerability found in handlebars before 4.5.2. Lookup helper fails to validate templates. Attack may submit templates that execute arbitrary JavaScript in the system.
<p>Publish Date: 2019-11-13
<p>URL: <a href=https://github.com/wycats/handlebars.js/commit/d54137810a49939fd2ad01a91a34e182ece4528e>WS-2019-0331</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1316">https://www.npmjs.com/advisories/1316</a></p>
<p>Release Date: 2019-12-05</p>
<p>Fix Resolution: handlebars - 4.5.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | ws medium detected in handlebars tgz ws medium severity vulnerability vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file plywood package json path to vulnerable library plywood node modules handlebars package json dependency hierarchy istanbul tgz root library x handlebars tgz vulnerable library found in head commit a href found in base branch master vulnerability details arbitrary code execution vulnerability found in handlebars before lookup helper fails to validate templates attack may submit templates that execute arbitrary javascript in the system publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution handlebars step up your open source security game with whitesource | 0 |
170,644 | 20,883,793,498 | IssuesEvent | 2022-03-23 01:13:31 | snowdensb/dependabot-core | https://api.github.com/repos/snowdensb/dependabot-core | reopened | CVE-2019-1075 (Medium) detected in microsoft.aspnetcore.app.2.1.0.nupkg | security vulnerability | ## CVE-2019-1075 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>microsoft.aspnetcore.app.2.1.0.nupkg</b></p></summary>
<p>Microsoft.AspNetCore.App</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.aspnetcore.app.2.1.0.nupkg">https://api.nuget.org/packages/microsoft.aspnetcore.app.2.1.0.nupkg</a></p>
<p>Path to dependency file: /nuget/spec/fixtures/csproj/basic.csproj</p>
<p>Path to vulnerable library: /canner/.nuget/packages/microsoft.aspnetcore.app/2.1.0/microsoft.aspnetcore.app.2.1.0.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.aspnetcore.app.2.1.0.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/snowdensb/dependabot-core/commit/ba8cd9078c8ce0cb202767d627706711237abf71">ba8cd9078c8ce0cb202767d627706711237abf71</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A spoofing vulnerability exists in ASP.NET Core that could lead to an open redirect, aka 'ASP.NET Core Spoofing Vulnerability'.
<p>Publish Date: 2019-07-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-1075>CVE-2019-1075</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-1075">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-1075</a></p>
<p>Release Date: 2019-07-19</p>
<p>Fix Resolution: v2.1.12,v2.2.6</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Nuget","packageName":"Microsoft.AspNetCore.App","packageVersion":"2.1.0","packageFilePaths":["/nuget/spec/fixtures/csproj/basic.csproj"],"isTransitiveDependency":false,"dependencyTree":"Microsoft.AspNetCore.App:2.1.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v2.1.12,v2.2.6","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2019-1075","vulnerabilityDetails":"A spoofing vulnerability exists in ASP.NET Core that could lead to an open redirect, aka \u0027ASP.NET Core Spoofing Vulnerability\u0027.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-1075","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | True | CVE-2019-1075 (Medium) detected in microsoft.aspnetcore.app.2.1.0.nupkg - ## CVE-2019-1075 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>microsoft.aspnetcore.app.2.1.0.nupkg</b></p></summary>
<p>Microsoft.AspNetCore.App</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.aspnetcore.app.2.1.0.nupkg">https://api.nuget.org/packages/microsoft.aspnetcore.app.2.1.0.nupkg</a></p>
<p>Path to dependency file: /nuget/spec/fixtures/csproj/basic.csproj</p>
<p>Path to vulnerable library: /canner/.nuget/packages/microsoft.aspnetcore.app/2.1.0/microsoft.aspnetcore.app.2.1.0.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.aspnetcore.app.2.1.0.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/snowdensb/dependabot-core/commit/ba8cd9078c8ce0cb202767d627706711237abf71">ba8cd9078c8ce0cb202767d627706711237abf71</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A spoofing vulnerability exists in ASP.NET Core that could lead to an open redirect, aka 'ASP.NET Core Spoofing Vulnerability'.
<p>Publish Date: 2019-07-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-1075>CVE-2019-1075</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-1075">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-1075</a></p>
<p>Release Date: 2019-07-19</p>
<p>Fix Resolution: v2.1.12,v2.2.6</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Nuget","packageName":"Microsoft.AspNetCore.App","packageVersion":"2.1.0","packageFilePaths":["/nuget/spec/fixtures/csproj/basic.csproj"],"isTransitiveDependency":false,"dependencyTree":"Microsoft.AspNetCore.App:2.1.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v2.1.12,v2.2.6","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2019-1075","vulnerabilityDetails":"A spoofing vulnerability exists in ASP.NET Core that could lead to an open redirect, aka \u0027ASP.NET Core Spoofing Vulnerability\u0027.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-1075","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | non_priority | cve medium detected in microsoft aspnetcore app nupkg cve medium severity vulnerability vulnerable library microsoft aspnetcore app nupkg microsoft aspnetcore app library home page a href path to dependency file nuget spec fixtures csproj basic csproj path to vulnerable library canner nuget packages microsoft aspnetcore app microsoft aspnetcore app nupkg dependency hierarchy x microsoft aspnetcore app nupkg vulnerable library found in head commit a href found in base branch main vulnerability details a spoofing vulnerability exists in asp net core that could lead to an open redirect aka asp net core spoofing vulnerability publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree microsoft aspnetcore app isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails a spoofing vulnerability exists in asp net core that could lead to an open redirect aka net core spoofing vulnerability vulnerabilityurl | 0 |
318,019 | 23,699,412,672 | IssuesEvent | 2022-08-29 17:30:46 | opencv/cvat | https://api.github.com/repos/opencv/cvat | closed | All documentation links are broken | bug documentation | Hi, I receive a 404 when trying to view the docs at https://cvat-ai.github.io/cvat/docs/
This is true for every link to documentation both via Github and on the cvat.ai website. | 1.0 | All documentation links are broken - Hi, I receive a 404 when trying to view the docs at https://cvat-ai.github.io/cvat/docs/
This is true for every link to documentation both via Github and on the cvat.ai website. | non_priority | all documentation links are broken hi i receive a when trying to view the docs at this is true for every link to documentation both via github and on the cvat ai website | 0 |
19,811 | 5,946,660,837 | IssuesEvent | 2017-05-26 04:51:56 | pywbem/pywbem | https://api.github.com/repos/pywbem/pywbem | closed | Recorder output is all with indent=2. The norm for everything else is indent=4 | area: code release: mandatory resolution: fixed | Having the indent=2 makes it difficult to work with the results and any editing often leasts to messy results. We would be better if we kept everything at indent=4 | 1.0 | Recorder output is all with indent=2. The norm for everything else is indent=4 - Having the indent=2 makes it difficult to work with the results and any editing often leasts to messy results. We would be better if we kept everything at indent=4 | non_priority | recorder output is all with indent the norm for everything else is indent having the indent makes it difficult to work with the results and any editing often leasts to messy results we would be better if we kept everything at indent | 0 |
181,094 | 21,645,555,526 | IssuesEvent | 2022-05-06 01:06:34 | mgh3326/google-calendar-slackbot | https://api.github.com/repos/mgh3326/google-calendar-slackbot | closed | CVE-2021-21409 (Medium) detected in netty-codec-http2-4.1.50.Final.jar - autoclosed | security vulnerability | ## CVE-2021-21409 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http2-4.1.50.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http2/4.1.50.Final/cb3b530644b17d60984ffd018878e8ff389c384c/netty-codec-http2-4.1.50.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-webflux-2.3.1.RELEASE.jar (Root Library)
- spring-boot-dependencies-2.3.1.RELEASE.pom
- :x: **netty-codec-http2-4.1.50.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/mgh3326/google-calendar-slackbot/commit/5c8728bac16d10c8f6e535a646962cf6e8ab2b0b">5c8728bac16d10c8f6e535a646962cf6e8ab2b0b</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Netty is an open-source, asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers & clients. In Netty (io.netty:netty-codec-http2) before version 4.1.61.Final there is a vulnerability that enables request smuggling. The content-length header is not correctly validated if the request only uses a single Http2HeaderFrame with the endStream set to to true. This could lead to request smuggling if the request is proxied to a remote peer and translated to HTTP/1.1. This is a followup of GHSA-wm47-8v5p-wjpj/CVE-2021-21295 which did miss to fix this one case. This was fixed as part of 4.1.61.Final.
<p>Publish Date: 2021-03-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21409>CVE-2021-21409</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/netty/netty/security/advisories/GHSA-f256-j965-7f32">https://github.com/netty/netty/security/advisories/GHSA-f256-j965-7f32</a></p>
<p>Release Date: 2021-03-30</p>
<p>Fix Resolution: io.netty:netty-codec-http2:4.1.61.Final</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-21409 (Medium) detected in netty-codec-http2-4.1.50.Final.jar - autoclosed - ## CVE-2021-21409 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http2-4.1.50.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http2/4.1.50.Final/cb3b530644b17d60984ffd018878e8ff389c384c/netty-codec-http2-4.1.50.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-webflux-2.3.1.RELEASE.jar (Root Library)
- spring-boot-dependencies-2.3.1.RELEASE.pom
- :x: **netty-codec-http2-4.1.50.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/mgh3326/google-calendar-slackbot/commit/5c8728bac16d10c8f6e535a646962cf6e8ab2b0b">5c8728bac16d10c8f6e535a646962cf6e8ab2b0b</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Netty is an open-source, asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers & clients. In Netty (io.netty:netty-codec-http2) before version 4.1.61.Final there is a vulnerability that enables request smuggling. The content-length header is not correctly validated if the request only uses a single Http2HeaderFrame with the endStream set to to true. This could lead to request smuggling if the request is proxied to a remote peer and translated to HTTP/1.1. This is a followup of GHSA-wm47-8v5p-wjpj/CVE-2021-21295 which did miss to fix this one case. This was fixed as part of 4.1.61.Final.
<p>Publish Date: 2021-03-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21409>CVE-2021-21409</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/netty/netty/security/advisories/GHSA-f256-j965-7f32">https://github.com/netty/netty/security/advisories/GHSA-f256-j965-7f32</a></p>
<p>Release Date: 2021-03-30</p>
<p>Fix Resolution: io.netty:netty-codec-http2:4.1.61.Final</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in netty codec final jar autoclosed cve medium severity vulnerability vulnerable library netty codec final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients path to dependency file build gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec final netty codec final jar dependency hierarchy spring boot starter webflux release jar root library spring boot dependencies release pom x netty codec final jar vulnerable library found in head commit a href vulnerability details netty is an open source asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers clients in netty io netty netty codec before version final there is a vulnerability that enables request smuggling the content length header is not correctly validated if the request only uses a single with the endstream set to to true this could lead to request smuggling if the request is proxied to a remote peer and translated to http this is a followup of ghsa wjpj cve which did miss to fix this one case this was fixed as part of final publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty codec final step up your open source security game with whitesource | 0 |
277,570 | 21,049,546,879 | IssuesEvent | 2022-03-31 19:17:03 | microsoftgraph/microsoft-graph-toolkit | https://api.github.com/repos/microsoftgraph/microsoft-graph-toolkit | closed | [BUG] 403 Forbidden error with PersonCard component and correct scopes | bug Area: Graph Needs: Documentation | I am using the PersonCard component as per the documentation. I have disabled the Organization and Files sections as I do not have the required User.Read.All (for organization) and Sites.Read.All (for files) scopes granted. This leaves the Default and Messages sections enabled, which require the following scopes (per the documentation):
- User.ReadBasic.All
- People.Read
- Contacts.Read
- Presence.Read.All
- Mail.ReadBasic

I have these scopes enabled for my app, but I am getting a 403 Forbidden error - response:
`{"error":{"code":"ErrorInsufficientPermissionsInAccessToken","message":"Exception of type 'Microsoft.Fast.Profile.Core.Exception.ProfileAccessDeniedException' was thrown.","innerError":{"date":"2022-01-19T00:06:12","request-id":"079b9285-288e-4a46-9f2a-9229f90b5b95","client-request-id":"1e48cea3-8bce-c656-ed4c-eef55df3c599"}}}`
When I use the getScopes static method with this configuration it returns the following array:
["Mail.ReadBasic", "User.Read.All", "Contacts.Read", "People.Read"]
I was surprised to see the scope User.Read.All here - as this should only be required for the Organization section which is disabled. There is no mention of requiring User.ReadBasic.All scope as well, so I'm guessing there is a bug somewhere within the component that is incorrectly checking the scopes granted.
Additionally, using this method did not return Mail.ReadBasic, even though i have the Messages section enabled...
**To Reproduce**
Steps to reproduce the behavior:
1. Disable Organization and Files section of PersonCard component:
```
MgtPersonCard.config.sections.organization = false;
MgtPersonCard.config.sections.files = false;
```
2. grant scopes:
- User.ReadBasic.All
- People.Read
- Contacts.Read
- Presence.Read.All
- Mail.ReadBasic
3. add PersonCard component to app and check for 403 error
**Expected behavior**
PersonCard component loads correct sections enabled and doesn't throw 403 errors
**Environment (please complete the following information):**
- OS: Windows 10
- Browser: Chromium
- Framework: React
- Context:
- Version: 2.3.1
- Provider: Msal2 | 1.0 | [BUG] 403 Forbidden error with PersonCard component and correct scopes - I am using the PersonCard component as per the documentation. I have disabled the Organization and Files sections as I do not have the required User.Read.All (for organization) and Sites.Read.All (for files) scopes granted. This leaves the Default and Messages sections enabled, which require the following scopes (per the documentation):
- User.ReadBasic.All
- People.Read
- Contacts.Read
- Presence.Read.All
- Mail.ReadBasic

I have these scopes enabled for my app, but I am getting a 403 Forbidden error - response:
`{"error":{"code":"ErrorInsufficientPermissionsInAccessToken","message":"Exception of type 'Microsoft.Fast.Profile.Core.Exception.ProfileAccessDeniedException' was thrown.","innerError":{"date":"2022-01-19T00:06:12","request-id":"079b9285-288e-4a46-9f2a-9229f90b5b95","client-request-id":"1e48cea3-8bce-c656-ed4c-eef55df3c599"}}}`
When I use the getScopes static method with this configuration it returns the following array:
["Mail.ReadBasic", "User.Read.All", "Contacts.Read", "People.Read"]
I was surprised to see the scope User.Read.All here - as this should only be required for the Organization section which is disabled. There is no mention of requiring User.ReadBasic.All scope as well, so I'm guessing there is a bug somewhere within the component that is incorrectly checking the scopes granted.
Additionally, using this method did not return Mail.ReadBasic, even though i have the Messages section enabled...
**To Reproduce**
Steps to reproduce the behavior:
1. Disable Organization and Files section of PersonCard component:
```
MgtPersonCard.config.sections.organization = false;
MgtPersonCard.config.sections.files = false;
```
2. grant scopes:
- User.ReadBasic.All
- People.Read
- Contacts.Read
- Presence.Read.All
- Mail.ReadBasic
3. add PersonCard component to app and check for 403 error
**Expected behavior**
PersonCard component loads correct sections enabled and doesn't throw 403 errors
**Environment (please complete the following information):**
- OS: Windows 10
- Browser: Chromium
- Framework: React
- Context:
- Version: 2.3.1
- Provider: Msal2 | non_priority | forbidden error with personcard component and correct scopes i am using the personcard component as per the documentation i have disabled the organization and files sections as i do not have the required user read all for organization and sites read all for files scopes granted this leaves the default and messages sections enabled which require the following scopes per the documentation user readbasic all people read contacts read presence read all mail readbasic i have these scopes enabled for my app but i am getting a forbidden error response error code errorinsufficientpermissionsinaccesstoken message exception of type microsoft fast profile core exception profileaccessdeniedexception was thrown innererror date request id client request id when i use the getscopes static method with this configuration it returns the following array i was surprised to see the scope user read all here as this should only be required for the organization section which is disabled there is no mention of requiring user readbasic all scope as well so i m guessing there is a bug somewhere within the component that is incorrectly checking the scopes granted additionally using this method did not return mail readbasic even though i have the messages section enabled to reproduce steps to reproduce the behavior disable organization and files section of personcard component mgtpersoncard config sections organization false mgtpersoncard config sections files false grant scopes user readbasic all people read contacts read presence read all mail readbasic add personcard component to app and check for error expected behavior personcard component loads correct sections enabled and doesn t throw errors environment please complete the following information os windows browser chromium framework react context version provider | 0 |
66,936 | 7,029,274,007 | IssuesEvent | 2017-12-25 21:20:41 | briskhome/briskhome | https://api.github.com/repos/briskhome/briskhome | closed | Unit tests for core.graphql | core.graphql tests ↘ | These plugins are currently not covered by unit tests. We need to fix that. | 1.0 | Unit tests for core.graphql - These plugins are currently not covered by unit tests. We need to fix that. | non_priority | unit tests for core graphql these plugins are currently not covered by unit tests we need to fix that | 0 |
314,594 | 27,012,547,007 | IssuesEvent | 2023-02-10 16:30:17 | elastic/kibana | https://api.github.com/repos/elastic/kibana | closed | Failing test: Jest Tests.x-pack/plugins/cases/public/components/all_cases - AllCasesListGeneric Actions Bulk actions Bulk delete | blocker failed-test skipped-test Team:ResponseOps Feature:Cases v8.7.0 | A test failed on a tracked branch
```
TestingLibraryElementError: Unable to find an element by: [data-test-subj="checkboxSelectAll"]
Ignored nodes: comments, script, style
<body
class=""
>
<div />
</body>
at Object.getElementError (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/config.js:40:19)
at /var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/query-helpers.js:90:38
at /var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/query-helpers.js:62:17
at /var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/query-helpers.js:111:19
at getByTestId (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/x-pack/plugins/cases/public/components/all_cases/all_cases_list.test.tsx:863:34)
at batchedUpdates$1 (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/react-dom/cjs/react-dom.development.js:22380:12)
at act (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/react-dom/cjs/react-dom-test-utils.development.js:1042:14)
at Object.<anonymous> (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/x-pack/plugins/cases/public/components/all_cases/all_cases_list.test.tsx:862:12)
at Promise.then.completed (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/utils.js:289:28)
at new Promise (<anonymous>)
at callAsyncCircusFn (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/utils.js:222:10)
at _callCircusTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:248:40)
at _runTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:184:3)
at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:86:9)
at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:81:9)
at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:81:9)
at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:81:9)
at run (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:26:3)
at runAndTransformResultsToJestFormat (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/legacy-code-todo-rewrite/jestAdapterInit.js:120:21)
at jestAdapter (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/legacy-code-todo-rewrite/jestAdapter.js:79:19)
at runTestInternal (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-runner/build/runTest.js:367:16)
at runTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-runner/build/runTest.js:444:34)
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/25425#018583f8-7d49-4c22-8eb2-200518735190)
<!-- kibanaCiData = {"failed-test":{"test.class":"Jest Tests.x-pack/plugins/cases/public/components/all_cases","test.name":"AllCasesListGeneric Actions Bulk actions Bulk delete","test.failCount":1}} --> | 2.0 | Failing test: Jest Tests.x-pack/plugins/cases/public/components/all_cases - AllCasesListGeneric Actions Bulk actions Bulk delete - A test failed on a tracked branch
```
TestingLibraryElementError: Unable to find an element by: [data-test-subj="checkboxSelectAll"]
Ignored nodes: comments, script, style
<body
class=""
>
<div />
</body>
at Object.getElementError (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/config.js:40:19)
at /var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/query-helpers.js:90:38
at /var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/query-helpers.js:62:17
at /var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/query-helpers.js:111:19
at getByTestId (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/x-pack/plugins/cases/public/components/all_cases/all_cases_list.test.tsx:863:34)
at batchedUpdates$1 (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/react-dom/cjs/react-dom.development.js:22380:12)
at act (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/react-dom/cjs/react-dom-test-utils.development.js:1042:14)
at Object.<anonymous> (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/x-pack/plugins/cases/public/components/all_cases/all_cases_list.test.tsx:862:12)
at Promise.then.completed (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/utils.js:289:28)
at new Promise (<anonymous>)
at callAsyncCircusFn (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/utils.js:222:10)
at _callCircusTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:248:40)
at _runTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:184:3)
at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:86:9)
at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:81:9)
at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:81:9)
at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:81:9)
at run (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:26:3)
at runAndTransformResultsToJestFormat (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/legacy-code-todo-rewrite/jestAdapterInit.js:120:21)
at jestAdapter (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/legacy-code-todo-rewrite/jestAdapter.js:79:19)
at runTestInternal (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-runner/build/runTest.js:367:16)
at runTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-87f3625e8b45c3eb/elastic/kibana-on-merge/kibana/node_modules/jest-runner/build/runTest.js:444:34)
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/25425#018583f8-7d49-4c22-8eb2-200518735190)
<!-- kibanaCiData = {"failed-test":{"test.class":"Jest Tests.x-pack/plugins/cases/public/components/all_cases","test.name":"AllCasesListGeneric Actions Bulk actions Bulk delete","test.failCount":1}} --> | non_priority | failing test jest tests x pack plugins cases public components all cases allcaseslistgeneric actions bulk actions bulk delete a test failed on a tracked branch testinglibraryelementerror unable to find an element by ignored nodes comments script style body class at object getelementerror var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules testing library dom dist config js at var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules testing library dom dist query helpers js at var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules testing library dom dist query helpers js at var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules testing library dom dist query helpers js at getbytestid var lib buildkite agent builds kb spot elastic kibana on merge kibana x pack plugins cases public components all cases all cases list test tsx at batchedupdates var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules react dom cjs react dom development js at act var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules react dom cjs react dom test utils development js at object var lib buildkite agent builds kb spot elastic kibana on merge kibana x pack plugins cases public components all cases all cases list test tsx at promise then completed var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build utils js at new promise at callasynccircusfn var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build utils js at callcircustest var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at runtest var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at runtestsfordescribeblock var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at runtestsfordescribeblock var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at runtestsfordescribeblock var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at runtestsfordescribeblock var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at run var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at runandtransformresultstojestformat var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build legacy code todo rewrite jestadapterinit js at jestadapter var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build legacy code todo rewrite jestadapter js at runtestinternal var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest runner build runtest js at runtest var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest runner build runtest js first failure | 0 |
39,149 | 10,308,392,126 | IssuesEvent | 2019-08-29 10:49:50 | ballerina-platform/ballerina-lang | https://api.github.com/repos/ballerina-platform/ballerina-lang | closed | Improve the log when failing connecting to repository, during compilation | Area/BuildTools Component/Packerina Type/Improvement | **Description:**
```bash
Compiling source
could not connect to remote repository or unexpected response received.
```
Getting the above message during ballerina compile.
IMO, this log is more useful if it indicates which repository it was trying to connect to.
| 1.0 | Improve the log when failing connecting to repository, during compilation - **Description:**
```bash
Compiling source
could not connect to remote repository or unexpected response received.
```
Getting the above message during ballerina compile.
IMO, this log is more useful if it indicates which repository it was trying to connect to.
| non_priority | improve the log when failing connecting to repository during compilation description bash compiling source could not connect to remote repository or unexpected response received getting the above message during ballerina compile imo this log is more useful if it indicates which repository it was trying to connect to | 0 |
201,243 | 22,946,845,724 | IssuesEvent | 2022-07-19 01:27:04 | Chiencc/Spring5-Login_Prioritize-Sample | https://api.github.com/repos/Chiencc/Spring5-Login_Prioritize-Sample | closed | WS-2020-0408 (High) detected in netty-handler-4.1.25.Final.jar - autoclosed | security vulnerability | ## WS-2020-0408 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-handler-4.1.25.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-handler/4.1.25.Final/netty-handler-4.1.25.Final.jar</p>
<p>
Dependency Hierarchy:
- reactor-netty-0.7.8.RELEASE.jar (Root Library)
- :x: **netty-handler-4.1.25.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Chiencc/Spring5-Login_Prioritize-Sample/commit/0c3ecc11668698a0cc14027974dc1381483dcfa8">0c3ecc11668698a0cc14027974dc1381483dcfa8</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was found in all versions of io.netty:netty-all. Host verification in Netty is disabled by default. This can lead to MITM attack in which an attacker can forge valid SSL/TLS certificates for a different hostname in order to intercept traffic that doesn’t intend for him. This is an issue because the certificate is not matched with the host.
<p>Publish Date: 2020-06-22
<p>URL: <a href=https://github.com/netty/netty/issues/10362>WS-2020-0408</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/WS-2020-0408">https://nvd.nist.gov/vuln/detail/WS-2020-0408</a></p>
<p>Release Date: 2020-06-22</p>
<p>Fix Resolution: io.netty:netty-all - 4.1.68.Final-redhat-00001,4.0.0.Final,4.1.67.Final-redhat-00002;io.netty:netty-handler - 4.1.68.Final-redhat-00001,4.1.67.Final-redhat-00001</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | WS-2020-0408 (High) detected in netty-handler-4.1.25.Final.jar - autoclosed - ## WS-2020-0408 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-handler-4.1.25.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-handler/4.1.25.Final/netty-handler-4.1.25.Final.jar</p>
<p>
Dependency Hierarchy:
- reactor-netty-0.7.8.RELEASE.jar (Root Library)
- :x: **netty-handler-4.1.25.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Chiencc/Spring5-Login_Prioritize-Sample/commit/0c3ecc11668698a0cc14027974dc1381483dcfa8">0c3ecc11668698a0cc14027974dc1381483dcfa8</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was found in all versions of io.netty:netty-all. Host verification in Netty is disabled by default. This can lead to MITM attack in which an attacker can forge valid SSL/TLS certificates for a different hostname in order to intercept traffic that doesn’t intend for him. This is an issue because the certificate is not matched with the host.
<p>Publish Date: 2020-06-22
<p>URL: <a href=https://github.com/netty/netty/issues/10362>WS-2020-0408</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/WS-2020-0408">https://nvd.nist.gov/vuln/detail/WS-2020-0408</a></p>
<p>Release Date: 2020-06-22</p>
<p>Fix Resolution: io.netty:netty-all - 4.1.68.Final-redhat-00001,4.0.0.Final,4.1.67.Final-redhat-00002;io.netty:netty-handler - 4.1.68.Final-redhat-00001,4.1.67.Final-redhat-00001</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | ws high detected in netty handler final jar autoclosed ws high severity vulnerability vulnerable library netty handler final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository io netty netty handler final netty handler final jar dependency hierarchy reactor netty release jar root library x netty handler final jar vulnerable library found in head commit a href found in base branch master vulnerability details an issue was found in all versions of io netty netty all host verification in netty is disabled by default this can lead to mitm attack in which an attacker can forge valid ssl tls certificates for a different hostname in order to intercept traffic that doesn’t intend for him this is an issue because the certificate is not matched with the host publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty all final redhat final final redhat io netty netty handler final redhat final redhat step up your open source security game with whitesource | 0 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.