Unnamed: 0
int64 0
832k
| id
float64 2.49B
32.1B
| type
stringclasses 1
value | created_at
stringlengths 19
19
| repo
stringlengths 7
112
| repo_url
stringlengths 36
141
| action
stringclasses 3
values | title
stringlengths 1
744
| labels
stringlengths 4
574
| body
stringlengths 9
211k
| index
stringclasses 10
values | text_combine
stringlengths 96
211k
| label
stringclasses 2
values | text
stringlengths 96
188k
| binary_label
int64 0
1
|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2,703
| 5,560,299,445
|
IssuesEvent
|
2017-03-24 19:04:09
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
closed
|
Process.GetProcessById() is slower than Process.GetProcesses()
|
area-System.Diagnostics.Process enhancement os-windows tenet-performance up for grabs
|
According to the test below, `GetProcessById()` is twice as slow as `GetProcesses()`. I get similar results when testing .NET Framework. Seems like there's room for optimizations in `GetProcessById()`.
Version: dotnet cli version=1.0.1 (.NET Core 4.6.25009.03)
```
Method | Mean | StdDev |
------------------ |----------- |---------- |
GetProcessById | 19.0449 ms | 0.4639 ms |
GetProcessesFirst | 9.8138 ms | 0.1876 ms |
GetProcessesLast | 9.7993 ms | 0.2403 ms |
```
```C#
using System.Diagnostics;
using System.Linq;
using System.Reflection;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;
namespace ConsoleApp72 {
public class Program {
[Benchmark]
public void GetProcessById() {
Process.GetProcessById(0);
}
[Benchmark]
public void GetProcessesFirst() {
Process.GetProcesses().FirstOrDefault(a => a.Id == a.Id);
}
[Benchmark]
public void GetProcessesLast() {
Process.GetProcesses().FirstOrDefault(a => a.Id != a.Id);
}
static void Main(string[] args) {
BenchmarkSwitcher.FromAssembly(typeof(Program).GetTypeInfo().Assembly).Run(args);
}
}
}
```
|
1.0
|
Process.GetProcessById() is slower than Process.GetProcesses() - According to the test below, `GetProcessById()` is twice as slow as `GetProcesses()`. I get similar results when testing .NET Framework. Seems like there's room for optimizations in `GetProcessById()`.
Version: dotnet cli version=1.0.1 (.NET Core 4.6.25009.03)
```
Method | Mean | StdDev |
------------------ |----------- |---------- |
GetProcessById | 19.0449 ms | 0.4639 ms |
GetProcessesFirst | 9.8138 ms | 0.1876 ms |
GetProcessesLast | 9.7993 ms | 0.2403 ms |
```
```C#
using System.Diagnostics;
using System.Linq;
using System.Reflection;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;
namespace ConsoleApp72 {
public class Program {
[Benchmark]
public void GetProcessById() {
Process.GetProcessById(0);
}
[Benchmark]
public void GetProcessesFirst() {
Process.GetProcesses().FirstOrDefault(a => a.Id == a.Id);
}
[Benchmark]
public void GetProcessesLast() {
Process.GetProcesses().FirstOrDefault(a => a.Id != a.Id);
}
static void Main(string[] args) {
BenchmarkSwitcher.FromAssembly(typeof(Program).GetTypeInfo().Assembly).Run(args);
}
}
}
```
|
process
|
process getprocessbyid is slower than process getprocesses according to the test below getprocessbyid is twice as slow as getprocesses i get similar results when testing net framework seems like there s room for optimizations in getprocessbyid version dotnet cli version net core method mean stddev getprocessbyid ms ms getprocessesfirst ms ms getprocesseslast ms ms c using system diagnostics using system linq using system reflection using benchmarkdotnet attributes using benchmarkdotnet running namespace public class program public void getprocessbyid process getprocessbyid public void getprocessesfirst process getprocesses firstordefault a a id a id public void getprocesseslast process getprocesses firstordefault a a id a id static void main string args benchmarkswitcher fromassembly typeof program gettypeinfo assembly run args
| 1
|
1,329
| 3,881,316,964
|
IssuesEvent
|
2016-04-13 03:28:38
|
dataproofer/Dataproofer
|
https://api.github.com/repos/dataproofer/Dataproofer
|
reopened
|
Add new "Use at your own risk" warning for large files
|
engine: processing engine: rendering
|
Currently we just limit to the first 1000 rows
Jason suggested (and I agree) that instead we simply issue a warning about large files and the fact they might take a long time (or break the app)
|
1.0
|
Add new "Use at your own risk" warning for large files - Currently we just limit to the first 1000 rows
Jason suggested (and I agree) that instead we simply issue a warning about large files and the fact they might take a long time (or break the app)
|
process
|
add new use at your own risk warning for large files currently we just limit to the first rows jason suggested and i agree that instead we simply issue a warning about large files and the fact they might take a long time or break the app
| 1
|
376,310
| 11,141,050,047
|
IssuesEvent
|
2019-12-21 19:14:23
|
DiormaisavecunLaudebut/track-id
|
https://api.github.com/repos/DiormaisavecunLaudebut/track-id
|
closed
|
convert audio to wav/mp3
|
back-end high-priority
|
javascript/components/record-audio
This function already works to record an audio file.
Need to find a way to convert it to wav/mp3 and send it to cloudinary by giving it the post_id in the name.
Ps: change cloudinary set-up to be able to send it directly without authorization
|
1.0
|
convert audio to wav/mp3 - javascript/components/record-audio
This function already works to record an audio file.
Need to find a way to convert it to wav/mp3 and send it to cloudinary by giving it the post_id in the name.
Ps: change cloudinary set-up to be able to send it directly without authorization
|
non_process
|
convert audio to wav javascript components record audio this function already works to record an audio file need to find a way to convert it to wav and send it to cloudinary by giving it the post id in the name ps change cloudinary set up to be able to send it directly without authorization
| 0
|
15,109
| 3,925,347,361
|
IssuesEvent
|
2016-04-22 18:36:49
|
diana-hep/carl
|
https://api.github.com/repos/diana-hep/carl
|
opened
|
Notebook: probability calibration
|
Documentation Enhancement
|
Would nice to add a notebook illustrating the usage of `CalibratedClassifierCV`.
|
1.0
|
Notebook: probability calibration - Would nice to add a notebook illustrating the usage of `CalibratedClassifierCV`.
|
non_process
|
notebook probability calibration would nice to add a notebook illustrating the usage of calibratedclassifiercv
| 0
|
439,938
| 30,722,720,391
|
IssuesEvent
|
2023-07-27 17:07:46
|
aws-solutions/media-services-application-mapper
|
https://api.github.com/repos/aws-solutions/media-services-application-mapper
|
closed
|
Describe the AWS permissions needed for the workflow environments
|
documentation
|
If someone forks the repository, they may want to automate their own upload to an AWS bucket. They could do this by supplying environments in their account that match the names we use in our workflows.
We should include a section in one of our guides that describes the resources and policies needed for each environment. Right now, these are primarily S3-related permissions for uploading to different buckets depending on the environment in use.
|
1.0
|
Describe the AWS permissions needed for the workflow environments - If someone forks the repository, they may want to automate their own upload to an AWS bucket. They could do this by supplying environments in their account that match the names we use in our workflows.
We should include a section in one of our guides that describes the resources and policies needed for each environment. Right now, these are primarily S3-related permissions for uploading to different buckets depending on the environment in use.
|
non_process
|
describe the aws permissions needed for the workflow environments if someone forks the repository they may want to automate their own upload to an aws bucket they could do this by supplying environments in their account that match the names we use in our workflows we should include a section in one of our guides that describes the resources and policies needed for each environment right now these are primarily related permissions for uploading to different buckets depending on the environment in use
| 0
|
288,611
| 24,920,347,393
|
IssuesEvent
|
2022-10-30 21:57:07
|
DMTF/libspdm
|
https://api.github.com/repos/DMTF/libspdm
|
opened
|
Add configuration sweep to Actions
|
enhancement test
|
libspdm has a fair amount of boolean configuration macros. In CI/CD we currently test two combinations : everything enabled and most everything disabled. However there are thousands of possible combinations and they should be built to ensure that they can be built, assuming that the combination is legal.
To start I think it would be helpful to have an Action where
- Only one OS and compiler is used. Probably CLANG on Linux.
- No tests are run. The only action performed is making sure the build is successful.
- It is either manually triggered or a cron job; say every week or so.
|
1.0
|
Add configuration sweep to Actions - libspdm has a fair amount of boolean configuration macros. In CI/CD we currently test two combinations : everything enabled and most everything disabled. However there are thousands of possible combinations and they should be built to ensure that they can be built, assuming that the combination is legal.
To start I think it would be helpful to have an Action where
- Only one OS and compiler is used. Probably CLANG on Linux.
- No tests are run. The only action performed is making sure the build is successful.
- It is either manually triggered or a cron job; say every week or so.
|
non_process
|
add configuration sweep to actions libspdm has a fair amount of boolean configuration macros in ci cd we currently test two combinations everything enabled and most everything disabled however there are thousands of possible combinations and they should be built to ensure that they can be built assuming that the combination is legal to start i think it would be helpful to have an action where only one os and compiler is used probably clang on linux no tests are run the only action performed is making sure the build is successful it is either manually triggered or a cron job say every week or so
| 0
|
229,298
| 17,538,546,307
|
IssuesEvent
|
2021-08-12 09:17:00
|
SAP/abap-file-formats
|
https://api.github.com/repos/SAP/abap-file-formats
|
closed
|
Interface file definitions naming
|
documentation
|
we currently have:
```
zif_oo_aff_intf_v1.intf.abap
zif_atc_aff_chkc_v1.intf.abap
```
`INTF` and `CHKC` are the TADIR R3TR names, which are guaranteed to be unique, I suggest leaving out the "_oo" and "_atc" parts
|
1.0
|
Interface file definitions naming - we currently have:
```
zif_oo_aff_intf_v1.intf.abap
zif_atc_aff_chkc_v1.intf.abap
```
`INTF` and `CHKC` are the TADIR R3TR names, which are guaranteed to be unique, I suggest leaving out the "_oo" and "_atc" parts
|
non_process
|
interface file definitions naming we currently have zif oo aff intf intf abap zif atc aff chkc intf abap intf and chkc are the tadir names which are guaranteed to be unique i suggest leaving out the oo and atc parts
| 0
|
44,659
| 5,638,647,377
|
IssuesEvent
|
2017-04-06 12:33:24
|
quinoacomputing/quinoa
|
https://api.github.com/repos/quinoacomputing/quinoa
|
closed
|
Test suite for exceptions
|
enhancement help wanted testing
|
- Automatically switch each potential throw location into a throw (is this even possible?) and test for clean exit (test with e.g. `valgrind`)
- `valgrind` can return a different exit code based on errors detected using the command line argument `--error-exitcode=` to `valgrind`
|
1.0
|
Test suite for exceptions - - Automatically switch each potential throw location into a throw (is this even possible?) and test for clean exit (test with e.g. `valgrind`)
- `valgrind` can return a different exit code based on errors detected using the command line argument `--error-exitcode=` to `valgrind`
|
non_process
|
test suite for exceptions automatically switch each potential throw location into a throw is this even possible and test for clean exit test with e g valgrind valgrind can return a different exit code based on errors detected using the command line argument error exitcode to valgrind
| 0
|
731,164
| 25,204,483,302
|
IssuesEvent
|
2022-11-13 14:12:26
|
SzFMV2022-Osz/AutomatedCar-A
|
https://api.github.com/repos/SzFMV2022-Osz/AutomatedCar-A
|
opened
|
ACC: Ki és bekapcsolhatóság implementálása
|
effort: low priority: normal
|
- [ ] C gomb lenyomására a tempomat kapcsoljon ki/be (InputPacket.CruiseControlInput == TurnOnOrOff)
- [ ] Bekapcsoláskor a célsebessége az aktuális sebesség, de a minimum célsebesség 30 km/h
|
1.0
|
ACC: Ki és bekapcsolhatóság implementálása - - [ ] C gomb lenyomására a tempomat kapcsoljon ki/be (InputPacket.CruiseControlInput == TurnOnOrOff)
- [ ] Bekapcsoláskor a célsebessége az aktuális sebesség, de a minimum célsebesség 30 km/h
|
non_process
|
acc ki és bekapcsolhatóság implementálása c gomb lenyomására a tempomat kapcsoljon ki be inputpacket cruisecontrolinput turnonoroff bekapcsoláskor a célsebessége az aktuális sebesség de a minimum célsebesség km h
| 0
|
73,837
| 9,725,781,505
|
IssuesEvent
|
2019-05-30 09:39:04
|
pardeike/Harmony
|
https://api.github.com/repos/pardeike/Harmony
|
closed
|
Using HarmonyPrepare seems to throw exceptions
|
documentation
|
**Describe the bug**
A `TargetParameterCountException` is thrown when attempting to patch with the use of `HarmonyPrepare` attribute. (This is found when developing a RimWorld mod.)
**To Reproduce**
Steps to reproduce the behavior:
1. The original method to patch is from RuntimeGC, another RimWorld mod. I am attempting to patch `RuntimeGC.WorldPawnCleaner::GC`. The full source of the type and the method can be found here: https://github.com/user19990313/RimWorld-RuntimeGC/blob/master/src/RuntimeGC/RuntimeGC/WorldPawnCleaner.cs
The signature of the method is `public int GC(bool verbose = false)`.
Both the type `RuntimeGC.WorldPawnCleaner` and the method `GC` is `public`, so anyone should be able to access the method without any errors.
2. I intend to detect the presence of RuntimeGC with the use of `HarmonyPrepare`, so that I can prevent the patch if RuntimeGC is not found, as demonstrated below:
```
using Harmony;
using RuntimeGC;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection.Emit;
using System.Text;
using Verse;
namespace Desynchronized.Compatibility.RuntimeGC
{
[HarmonyPatch(typeof(WorldPawnCleaner))]
[HarmonyPatch("GC", MethodType.Normal)]
public class Transpiler_WorldPawnCleaner_GC
{
[HarmonyPrepare]
public static bool DetectRuntimeGC(HarmonyInstance instance)
{
foreach (ModContentPack pack in LoadedModManager.RunningMods)
{
//Log.Error("Current ModContentPack: " + pack.Name);
if (pack.Name.Contains("RuntimeGC"))
{
return true;
}
}
return false;
}
[HarmonyPostfix]
public static void TestFunc()
{
// foo bar foo bar
}
}
}
```
3. HugsLib reports this error when the RimWorld title screen finally shows up:
```
[HugsLib][ERR] Failed to apply Harmony patches for HugsLib.com.vectorial1024.rimworld.desynchronized. Exception was: System.Reflection.TargetParameterCountException: parameters do not match signature
at System.Reflection.MonoMethod.Invoke (System.Object obj, BindingFlags invokeAttr, System.Reflection.Binder binder, System.Object[] parameters, System.Globalization.CultureInfo culture) [0x00000] in <filename unknown>:0
at System.Reflection.MethodBase.Invoke (System.Object obj, System.Object[] parameters) [0x00000] in <filename unknown>:0
at Harmony.PatchProcessor.RunMethod[HarmonyPrepare,Boolean] (Boolean defaultIfNotExisting, System.Object[] parameters) [0x00000] in <filename unknown>:0
at Harmony.PatchProcessor.Patch () [0x00000] in <filename unknown>:0
at Harmony.HarmonyInstance.<PatchAll>b__9_0 (System.Type type) [0x00000] in <filename unknown>:0
at Harmony.CollectionExtensions.Do[Type] (IEnumerable`1 sequence, System.Action`1 action) [0x00000] in <filename unknown>:0
at Harmony.HarmonyInstance.PatchAll (System.Reflection.Assembly assembly) [0x00000] in <filename unknown>:0
at HugsLib.ModBase.ApplyHarmonyPatches () [0x00000] in <filename unknown>:0
Verse.Log:Error(String, Boolean)
HugsLib.Utils.ModLogger:Error(String, Object[])
HugsLib.ModBase:ApplyHarmonyPatches()
HugsLib.HugsLibController:EnumerateChildMods(Boolean)
HugsLib.HugsLibController:LoadReloadInitialize()
Verse.LongEventHandler:RunEventFromAnotherThread(Action)
Verse.LongEventHandler:<UpdateCurrentAsynchronousEvent>m__1()
```
As reference, here's the full log: https://gist.github.com/07b949dccd9120703eac4a907034eb8f
(The `desyncpsycho` error can be ignored; it shouldn't affect this error we are currently investigating.)
**Expected behavior**
No errors/exceptions should be thrown if I use `HarmonyPrepare`.
**Runtime environment (please complete the following information):**
- OS: Windows 10, 64-bit
- (...how do I check which version of .NET is running?)
- Harmony version: 1.2.0.1
- Name of game or host application: RimWorld
**Additional context**
There are a number of occurrences of `HarmonyPrefix`, `HarmonyTranspiler`, and `HarmonyPostfix` in my mod. Those occurrences do not make use of `HarmonyPrepare`. They are all loaded successfully without any errors.
I was originally planning to use `HarmonyPrepare` with `HarmonyTranspiler` to modify the method, but it seems that merely using `HarmonyPrepare` caused errors.
|
1.0
|
Using HarmonyPrepare seems to throw exceptions - **Describe the bug**
A `TargetParameterCountException` is thrown when attempting to patch with the use of `HarmonyPrepare` attribute. (This is found when developing a RimWorld mod.)
**To Reproduce**
Steps to reproduce the behavior:
1. The original method to patch is from RuntimeGC, another RimWorld mod. I am attempting to patch `RuntimeGC.WorldPawnCleaner::GC`. The full source of the type and the method can be found here: https://github.com/user19990313/RimWorld-RuntimeGC/blob/master/src/RuntimeGC/RuntimeGC/WorldPawnCleaner.cs
The signature of the method is `public int GC(bool verbose = false)`.
Both the type `RuntimeGC.WorldPawnCleaner` and the method `GC` is `public`, so anyone should be able to access the method without any errors.
2. I intend to detect the presence of RuntimeGC with the use of `HarmonyPrepare`, so that I can prevent the patch if RuntimeGC is not found, as demonstrated below:
```
using Harmony;
using RuntimeGC;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection.Emit;
using System.Text;
using Verse;
namespace Desynchronized.Compatibility.RuntimeGC
{
[HarmonyPatch(typeof(WorldPawnCleaner))]
[HarmonyPatch("GC", MethodType.Normal)]
public class Transpiler_WorldPawnCleaner_GC
{
[HarmonyPrepare]
public static bool DetectRuntimeGC(HarmonyInstance instance)
{
foreach (ModContentPack pack in LoadedModManager.RunningMods)
{
//Log.Error("Current ModContentPack: " + pack.Name);
if (pack.Name.Contains("RuntimeGC"))
{
return true;
}
}
return false;
}
[HarmonyPostfix]
public static void TestFunc()
{
// foo bar foo bar
}
}
}
```
3. HugsLib reports this error when the RimWorld title screen finally shows up:
```
[HugsLib][ERR] Failed to apply Harmony patches for HugsLib.com.vectorial1024.rimworld.desynchronized. Exception was: System.Reflection.TargetParameterCountException: parameters do not match signature
at System.Reflection.MonoMethod.Invoke (System.Object obj, BindingFlags invokeAttr, System.Reflection.Binder binder, System.Object[] parameters, System.Globalization.CultureInfo culture) [0x00000] in <filename unknown>:0
at System.Reflection.MethodBase.Invoke (System.Object obj, System.Object[] parameters) [0x00000] in <filename unknown>:0
at Harmony.PatchProcessor.RunMethod[HarmonyPrepare,Boolean] (Boolean defaultIfNotExisting, System.Object[] parameters) [0x00000] in <filename unknown>:0
at Harmony.PatchProcessor.Patch () [0x00000] in <filename unknown>:0
at Harmony.HarmonyInstance.<PatchAll>b__9_0 (System.Type type) [0x00000] in <filename unknown>:0
at Harmony.CollectionExtensions.Do[Type] (IEnumerable`1 sequence, System.Action`1 action) [0x00000] in <filename unknown>:0
at Harmony.HarmonyInstance.PatchAll (System.Reflection.Assembly assembly) [0x00000] in <filename unknown>:0
at HugsLib.ModBase.ApplyHarmonyPatches () [0x00000] in <filename unknown>:0
Verse.Log:Error(String, Boolean)
HugsLib.Utils.ModLogger:Error(String, Object[])
HugsLib.ModBase:ApplyHarmonyPatches()
HugsLib.HugsLibController:EnumerateChildMods(Boolean)
HugsLib.HugsLibController:LoadReloadInitialize()
Verse.LongEventHandler:RunEventFromAnotherThread(Action)
Verse.LongEventHandler:<UpdateCurrentAsynchronousEvent>m__1()
```
As reference, here's the full log: https://gist.github.com/07b949dccd9120703eac4a907034eb8f
(The `desyncpsycho` error can be ignored; it shouldn't affect this error we are currently investigating.)
**Expected behavior**
No errors/exceptions should be thrown if I use `HarmonyPrepare`.
**Runtime environment (please complete the following information):**
- OS: Windows 10, 64-bit
- (...how do I check which version of .NET is running?)
- Harmony version: 1.2.0.1
- Name of game or host application: RimWorld
**Additional context**
There are a number of occurrences of `HarmonyPrefix`, `HarmonyTranspiler`, and `HarmonyPostfix` in my mod. Those occurrences do not make use of `HarmonyPrepare`. They are all loaded successfully without any errors.
I was originally planning to use `HarmonyPrepare` with `HarmonyTranspiler` to modify the method, but it seems that merely using `HarmonyPrepare` caused errors.
|
non_process
|
using harmonyprepare seems to throw exceptions describe the bug a targetparametercountexception is thrown when attempting to patch with the use of harmonyprepare attribute this is found when developing a rimworld mod to reproduce steps to reproduce the behavior the original method to patch is from runtimegc another rimworld mod i am attempting to patch runtimegc worldpawncleaner gc the full source of the type and the method can be found here the signature of the method is public int gc bool verbose false both the type runtimegc worldpawncleaner and the method gc is public so anyone should be able to access the method without any errors i intend to detect the presence of runtimegc with the use of harmonyprepare so that i can prevent the patch if runtimegc is not found as demonstrated below using harmony using runtimegc using system using system collections generic using system linq using system reflection emit using system text using verse namespace desynchronized compatibility runtimegc public class transpiler worldpawncleaner gc public static bool detectruntimegc harmonyinstance instance foreach modcontentpack pack in loadedmodmanager runningmods log error current modcontentpack pack name if pack name contains runtimegc return true return false public static void testfunc foo bar foo bar hugslib reports this error when the rimworld title screen finally shows up failed to apply harmony patches for hugslib com rimworld desynchronized exception was system reflection targetparametercountexception parameters do not match signature at system reflection monomethod invoke system object obj bindingflags invokeattr system reflection binder binder system object parameters system globalization cultureinfo culture in at system reflection methodbase invoke system object obj system object parameters in at harmony patchprocessor runmethod boolean defaultifnotexisting system object parameters in at harmony patchprocessor patch in at harmony harmonyinstance b system type type in at harmony collectionextensions do ienumerable sequence system action action in at harmony harmonyinstance patchall system reflection assembly assembly in at hugslib modbase applyharmonypatches in verse log error string boolean hugslib utils modlogger error string object hugslib modbase applyharmonypatches hugslib hugslibcontroller enumeratechildmods boolean hugslib hugslibcontroller loadreloadinitialize verse longeventhandler runeventfromanotherthread action verse longeventhandler m as reference here s the full log the desyncpsycho error can be ignored it shouldn t affect this error we are currently investigating expected behavior no errors exceptions should be thrown if i use harmonyprepare runtime environment please complete the following information os windows bit how do i check which version of net is running harmony version name of game or host application rimworld additional context there are a number of occurrences of harmonyprefix harmonytranspiler and harmonypostfix in my mod those occurrences do not make use of harmonyprepare they are all loaded successfully without any errors i was originally planning to use harmonyprepare with harmonytranspiler to modify the method but it seems that merely using harmonyprepare caused errors
| 0
|
445,793
| 12,836,071,431
|
IssuesEvent
|
2020-07-07 13:50:44
|
ChadGoymer/githapi
|
https://api.github.com/repos/ChadGoymer/githapi
|
closed
|
Shorten temporary path in download_commit()
|
commits effort:0.5 feature priority:3
|
## Description
The archive path in `download_commit()` can be very long, and since Windows has a path length limit of 255 characters we should make this path as short as possible.
## Proposed Solution
When setting the `archive_path`, just use a temporary filename instead of combining repo and ref, as it is not permanent anyway.
|
1.0
|
Shorten temporary path in download_commit() - ## Description
The archive path in `download_commit()` can be very long, and since Windows has a path length limit of 255 characters we should make this path as short as possible.
## Proposed Solution
When setting the `archive_path`, just use a temporary filename instead of combining repo and ref, as it is not permanent anyway.
|
non_process
|
shorten temporary path in download commit description the archive path in download commit can be very long and since windows has a path length limit of characters we should make this path as short as possible proposed solution when setting the archive path just use a temporary filename instead of combining repo and ref as it is not permanent anyway
| 0
|
257,940
| 19,536,219,513
|
IssuesEvent
|
2021-12-31 07:38:37
|
hyojaeKwon/cs496_week1
|
https://api.github.com/repos/hyojaeKwon/cs496_week1
|
closed
|
<개정할 것> Hashtag들 개별 textview로 구성하기
|
documentation
|
Fra1에 있는 hashtag들 개별 textview로 동적생성하여 구현 -> 겉에 동그라미 남기기, 한줄 넘어가면 밑으로 가기
|
1.0
|
<개정할 것> Hashtag들 개별 textview로 구성하기 - Fra1에 있는 hashtag들 개별 textview로 동적생성하여 구현 -> 겉에 동그라미 남기기, 한줄 넘어가면 밑으로 가기
|
non_process
|
hashtag들 개별 textview로 구성하기 있는 hashtag들 개별 textview로 동적생성하여 구현 겉에 동그라미 남기기 한줄 넘어가면 밑으로 가기
| 0
|
85,634
| 16,700,699,443
|
IssuesEvent
|
2021-06-09 01:39:32
|
camptocamp/puppet-postfix
|
https://api.github.com/repos/camptocamp/puppet-postfix
|
closed
|
Character whitelist for destination e-mail addresses
|
bug needs code needs tests triaged wontfix
|
The whitelist for characters in e-mail addresses in postfix_canonical.aug is sadly lacking. In my particular case, it didn't include "+", but https://tools.ietf.org/html/rfc2822 also specifies that quite a number of other non-alpha-numeric characters are also valid (section 3.2.4). I can confirm that adding + to the regex works, so the rest are probably ok too.
|
1.0
|
Character whitelist for destination e-mail addresses - The whitelist for characters in e-mail addresses in postfix_canonical.aug is sadly lacking. In my particular case, it didn't include "+", but https://tools.ietf.org/html/rfc2822 also specifies that quite a number of other non-alpha-numeric characters are also valid (section 3.2.4). I can confirm that adding + to the regex works, so the rest are probably ok too.
|
non_process
|
character whitelist for destination e mail addresses the whitelist for characters in e mail addresses in postfix canonical aug is sadly lacking in my particular case it didn t include but also specifies that quite a number of other non alpha numeric characters are also valid section i can confirm that adding to the regex works so the rest are probably ok too
| 0
|
40,797
| 5,316,503,850
|
IssuesEvent
|
2017-02-13 20:03:02
|
OpenDICOMweb/core
|
https://api.github.com/repos/OpenDICOMweb/core
|
opened
|
Unit Tests failing
|
bug P1 P0 Urgent Test - unit
|
1. test/dataset_test.dart - you can no longer add the same tag to the dataset. each element must have a unique tag.
2. test/entity_test.dart:
Unit Test should not have so much print out. You can delete the print statements or use the logger.
|
1.0
|
Unit Tests failing - 1. test/dataset_test.dart - you can no longer add the same tag to the dataset. each element must have a unique tag.
2. test/entity_test.dart:
Unit Test should not have so much print out. You can delete the print statements or use the logger.
|
non_process
|
unit tests failing test dataset test dart you can no longer add the same tag to the dataset each element must have a unique tag test entity test dart unit test should not have so much print out you can delete the print statements or use the logger
| 0
|
15,447
| 19,662,603,572
|
IssuesEvent
|
2022-01-10 18:37:51
|
prisma/e2e-tests
|
https://api.github.com/repos/prisma/e2e-tests
|
opened
|
Improve `setEnv`
|
kind/improvement process/candidate kind/tech team/client
|
* Add [variable](https://github.com/prisma/e2e-tests/pull/2269#discussion_r781385908) declaration for `HAS_X_FLAG`
* Change encryption to use Node.js instead of OpenSSL
* Re-generate encrypted entries
|
1.0
|
Improve `setEnv` - * Add [variable](https://github.com/prisma/e2e-tests/pull/2269#discussion_r781385908) declaration for `HAS_X_FLAG`
* Change encryption to use Node.js instead of OpenSSL
* Re-generate encrypted entries
|
process
|
improve setenv add declaration for has x flag change encryption to use node js instead of openssl re generate encrypted entries
| 1
|
280,876
| 8,687,861,914
|
IssuesEvent
|
2018-12-03 14:48:55
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.shutterstock.com - see bug description
|
browser-firefox priority-important
|
<!-- @browser: Firefox 64.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:64.0) Gecko/20100101 Firefox/64.0 -->
<!-- @reported_with: desktop-reporter -->
**URL**: https://www.shutterstock.com/blog/2019-color-trends
**Browser / Version**: Firefox 64.0
**Operating System**: Windows 7
**Tested Another Browser**: No
**Problem type**: Something else
**Description**: Images are not loaded in this particular site!
**Steps to Reproduce**:
[](https://webcompat.com/uploads/2018/12/ba789f10-bbfb-4ae5-9265-177249c1a7df.jpeg)
<details>
<summary>Browser Configuration</summary>
<ul>
<li>mixed active content blocked: false</li><li>image.mem.shared: true</li><li>buildID: 20181126173133</li><li>tracking content blocked: true (basic)</li><li>gfx.webrender.blob-images: true</li><li>hasTouchScreen: false</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>gfx.webrender.all: false</li><li>channel: beta</li>
</ul>
<p>Console Messages:</p>
<pre>
[u'[console.log(JQMIGRATE: Migrate is installed, version 1.4.1) https://www.shutterstock.com/blog/_static/??-eJyljuEKwjAMhF/ILoib+yU+inRt6LKlXV1SxLe3iv4QRBAhcATuuzu4ZEPJcfEoMNU7F1yvT2km2cA3g4kUVqvYREovs1uSYlKIxWQugVJlULN1M5xqEAyF2MOKXDlv8iIq79+nMB0x1n4Rnc3ASwArgvoYFCzzfQpTmuUHzpMojGj9n33HeNh2u77tu3bfTzeUbH0J:9:542]']
</pre>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.shutterstock.com - see bug description - <!-- @browser: Firefox 64.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:64.0) Gecko/20100101 Firefox/64.0 -->
<!-- @reported_with: desktop-reporter -->
**URL**: https://www.shutterstock.com/blog/2019-color-trends
**Browser / Version**: Firefox 64.0
**Operating System**: Windows 7
**Tested Another Browser**: No
**Problem type**: Something else
**Description**: Images are not loaded in this particular site!
**Steps to Reproduce**:
[](https://webcompat.com/uploads/2018/12/ba789f10-bbfb-4ae5-9265-177249c1a7df.jpeg)
<details>
<summary>Browser Configuration</summary>
<ul>
<li>mixed active content blocked: false</li><li>image.mem.shared: true</li><li>buildID: 20181126173133</li><li>tracking content blocked: true (basic)</li><li>gfx.webrender.blob-images: true</li><li>hasTouchScreen: false</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>gfx.webrender.all: false</li><li>channel: beta</li>
</ul>
<p>Console Messages:</p>
<pre>
[u'[console.log(JQMIGRATE: Migrate is installed, version 1.4.1) https://www.shutterstock.com/blog/_static/??-eJyljuEKwjAMhF/ILoib+yU+inRt6LKlXV1SxLe3iv4QRBAhcATuuzu4ZEPJcfEoMNU7F1yvT2km2cA3g4kUVqvYREovs1uSYlKIxWQugVJlULN1M5xqEAyF2MOKXDlv8iIq79+nMB0x1n4Rnc3ASwArgvoYFCzzfQpTmuUHzpMojGj9n33HeNh2u77tu3bfTzeUbH0J:9:542]']
</pre>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
see bug description url browser version firefox operating system windows tested another browser no problem type something else description images are not loaded in this particular site steps to reproduce browser configuration mixed active content blocked false image mem shared true buildid tracking content blocked true basic gfx webrender blob images true hastouchscreen false mixed passive content blocked false gfx webrender enabled false gfx webrender all false channel beta console messages from with ❤️
| 0
|
10,247
| 13,102,965,891
|
IssuesEvent
|
2020-08-04 07:44:14
|
GoogleCloudPlatform/python-docs-samples
|
https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples
|
closed
|
Events for Cloud Run for Anthos – GCS and PubSub tutorial
|
type: process
|
Port the Events for Cloud Run samples for Anthos
|
1.0
|
Events for Cloud Run for Anthos – GCS and PubSub tutorial - Port the Events for Cloud Run samples for Anthos
|
process
|
events for cloud run for anthos – gcs and pubsub tutorial port the events for cloud run samples for anthos
| 1
|
16,914
| 22,241,434,085
|
IssuesEvent
|
2022-06-09 05:54:31
|
camunda/zeebe
|
https://api.github.com/repos/camunda/zeebe
|
closed
|
Update metrics exporter for start process instance anywhere
|
team/process-automation
|
The metrics exporter needs to distinguish between process instances started at the root start event and process instances started at other elements. This will allow us to see whether the feature is actively being used in Grafana.
Blocked by #9390
Possibly blocked by #9397 and #9398 for testing
|
1.0
|
Update metrics exporter for start process instance anywhere - The metrics exporter needs to distinguish between process instances started at the root start event and process instances started at other elements. This will allow us to see whether the feature is actively being used in Grafana.
Blocked by #9390
Possibly blocked by #9397 and #9398 for testing
|
process
|
update metrics exporter for start process instance anywhere the metrics exporter needs to distinguish between process instances started at the root start event and process instances started at other elements this will allow us to see whether the feature is actively being used in grafana blocked by possibly blocked by and for testing
| 1
|
234,288
| 7,719,358,736
|
IssuesEvent
|
2018-05-23 19:08:39
|
vmware/vic
|
https://api.github.com/repos/vmware/vic
|
reopened
|
Nightly 05/07/18 VC 6.0 5-25-OPS-User-Grant: failed to validate ops credentials: This operation is not supported on object
|
kind/nightly-blocker priority/p0 team/foundation
|
Nightly 05/07/18, VC 6.0
5-25-OPS-User-Grant
Test case: granted ops-user perms work after upgrade
vic-machine install failed:
```
May 8 2018 04:09:06.094Z ERROR op=14780.1: --------------------
May 8 2018 04:09:06.094Z ERROR op=14780.1: Failed to validate operations credentials: ServerFaultCode: The operation is not supported on the object.
May 8 2018 04:09:06.094Z DEBUG [ END ] op=14780.1 [vic/lib/install/validate.(*Validator).ListIssues:273] [308.982µs]
May 8 2018 04:09:06.094Z DEBUG [ END ] op=14780.1 [vic/lib/install/validate.(*Validator).Validate:299] [2.088876028s]
May 8 2018 04:09:06.094Z ERROR op=14780.1: Create cannot continue: configuration validation failed
May 8 2018 04:09:06.128Z ERROR op=14780.1: --------------------
May 8 2018 04:09:06.128Z ERROR op=14780.1: vic-machine-linux create failed: validation of configuration failed
```
link to download complete logs: https://storage.cloud.google.com/vic-ci-logs/vic_nightly_logs_2018-05-07-20-32-56.zip?authuser=1
vic-machine log:
[vic-machine.log](https://github.com/vmware/vic/files/1985352/vic-machine.log)
Cc: @jzt
|
1.0
|
Nightly 05/07/18 VC 6.0 5-25-OPS-User-Grant: failed to validate ops credentials: This operation is not supported on object - Nightly 05/07/18, VC 6.0
5-25-OPS-User-Grant
Test case: granted ops-user perms work after upgrade
vic-machine install failed:
```
May 8 2018 04:09:06.094Z ERROR op=14780.1: --------------------
May 8 2018 04:09:06.094Z ERROR op=14780.1: Failed to validate operations credentials: ServerFaultCode: The operation is not supported on the object.
May 8 2018 04:09:06.094Z DEBUG [ END ] op=14780.1 [vic/lib/install/validate.(*Validator).ListIssues:273] [308.982µs]
May 8 2018 04:09:06.094Z DEBUG [ END ] op=14780.1 [vic/lib/install/validate.(*Validator).Validate:299] [2.088876028s]
May 8 2018 04:09:06.094Z ERROR op=14780.1: Create cannot continue: configuration validation failed
May 8 2018 04:09:06.128Z ERROR op=14780.1: --------------------
May 8 2018 04:09:06.128Z ERROR op=14780.1: vic-machine-linux create failed: validation of configuration failed
```
link to download complete logs: https://storage.cloud.google.com/vic-ci-logs/vic_nightly_logs_2018-05-07-20-32-56.zip?authuser=1
vic-machine log:
[vic-machine.log](https://github.com/vmware/vic/files/1985352/vic-machine.log)
Cc: @jzt
|
non_process
|
nightly vc ops user grant failed to validate ops credentials this operation is not supported on object nightly vc ops user grant test case granted ops user perms work after upgrade vic machine install failed may error op may error op failed to validate operations credentials serverfaultcode the operation is not supported on the object may debug op may debug op may error op create cannot continue configuration validation failed may error op may error op vic machine linux create failed validation of configuration failed link to download complete logs vic machine log cc jzt
| 0
|
4,740
| 2,610,154,123
|
IssuesEvent
|
2015-02-26 18:49:03
|
chrsmith/republic-at-war
|
https://api.github.com/repos/chrsmith/republic-at-war
|
closed
|
CR20
|
auto-migrated Priority-Medium Type-Defect
|
```
CR20 Engines dont turn off on landing
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 30 Jan 2011 at 2:10
|
1.0
|
CR20 - ```
CR20 Engines dont turn off on landing
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 30 Jan 2011 at 2:10
|
non_process
|
engines dont turn off on landing original issue reported on code google com by gmail com on jan at
| 0
|
1,129
| 3,614,270,619
|
IssuesEvent
|
2016-02-06 00:25:49
|
nodejs/citgm
|
https://api.github.com/repos/nodejs/citgm
|
closed
|
bluebird test suite blows up in child_process
|
child_process weirdness
|
Turns out that process.stdout in a child_process is created using net, whereas in node proper it is made using tty. If a call to any of the methods tty exposes that net does not have, everything will explode
I have submitted a PR to fix this https://github.com/nodejs/node/pull/5061
|
1.0
|
bluebird test suite blows up in child_process - Turns out that process.stdout in a child_process is created using net, whereas in node proper it is made using tty. If a call to any of the methods tty exposes that net does not have, everything will explode
I have submitted a PR to fix this https://github.com/nodejs/node/pull/5061
|
process
|
bluebird test suite blows up in child process turns out that process stdout in a child process is created using net whereas in node proper it is made using tty if a call to any of the methods tty exposes that net does not have everything will explode i have submitted a pr to fix this
| 1
|
10,206
| 13,067,015,519
|
IssuesEvent
|
2020-07-30 23:09:31
|
nion-software/nionswift
|
https://api.github.com/repos/nion-software/nionswift
|
opened
|
Changing complex display mode results in new display data, but no notification
|
f - processing type - bug
|
Do processing on FFT data that uses display data, such as line profile, and change the complex display type. The processing should update. It doesn't.
|
1.0
|
Changing complex display mode results in new display data, but no notification - Do processing on FFT data that uses display data, such as line profile, and change the complex display type. The processing should update. It doesn't.
|
process
|
changing complex display mode results in new display data but no notification do processing on fft data that uses display data such as line profile and change the complex display type the processing should update it doesn t
| 1
|
22,617
| 31,842,692,336
|
IssuesEvent
|
2023-09-14 17:28:36
|
h4sh5/npm-auto-scanner
|
https://api.github.com/repos/h4sh5/npm-auto-scanner
|
opened
|
@nx-bun/task-worker-runner 0.0.3 has 1 guarddog issues
|
npm-silent-process-execution
|
```{"npm-silent-process-execution":[{"code":" const p = (0, _child_process.spawn)('node', [\n scriptPath,\n `\"${this.cachePath}\"`\n ], {\n stdio: 'ignore',\n detached: true,\n ... });","location":"package/src/lib/cache.js:24","message":"This package is silently executing another executable"}]}```
|
1.0
|
@nx-bun/task-worker-runner 0.0.3 has 1 guarddog issues - ```{"npm-silent-process-execution":[{"code":" const p = (0, _child_process.spawn)('node', [\n scriptPath,\n `\"${this.cachePath}\"`\n ], {\n stdio: 'ignore',\n detached: true,\n ... });","location":"package/src/lib/cache.js:24","message":"This package is silently executing another executable"}]}```
|
process
|
nx bun task worker runner has guarddog issues npm silent process execution n stdio ignore n detached true n location package src lib cache js message this package is silently executing another executable
| 1
|
8,625
| 2,875,502,790
|
IssuesEvent
|
2015-06-09 08:36:29
|
nilmtk/nilmtk
|
https://api.github.com/repos/nilmtk/nilmtk
|
opened
|
Consider keeping cache in separate file
|
DataStore and format conversion design Statistics and correlations
|
At the moment, we modify the main dataset HDF5 file to store cached statistics. This has the advantage that the cache is kept with the data. But it has several disadvantages:
* Sometimes the HDF5 can become corrupted (e.g. in #328)
* It slightly complicates our unit tests (because we need to replace modified test files with originals)
So perhaps we should consider keeping the cache in a separate file (maybe even keeping it in the OS temporary directory)?
|
1.0
|
Consider keeping cache in separate file - At the moment, we modify the main dataset HDF5 file to store cached statistics. This has the advantage that the cache is kept with the data. But it has several disadvantages:
* Sometimes the HDF5 can become corrupted (e.g. in #328)
* It slightly complicates our unit tests (because we need to replace modified test files with originals)
So perhaps we should consider keeping the cache in a separate file (maybe even keeping it in the OS temporary directory)?
|
non_process
|
consider keeping cache in separate file at the moment we modify the main dataset file to store cached statistics this has the advantage that the cache is kept with the data but it has several disadvantages sometimes the can become corrupted e g in it slightly complicates our unit tests because we need to replace modified test files with originals so perhaps we should consider keeping the cache in a separate file maybe even keeping it in the os temporary directory
| 0
|
19,202
| 25,338,259,061
|
IssuesEvent
|
2022-11-18 18:51:30
|
hashgraph/hedera-mirror-node
|
https://api.github.com/repos/hashgraph/hedera-mirror-node
|
closed
|
Citus: Update Helm chart
|
enhancement process
|
### Problem
Current v2 portion of helm chart will require updates
Currently there's no citus chart.
We will have to adopt either a manual configuration of the citusdb resource or utilize Azure Arc coordination
### Solution
Depends on outcome of [Citus: Explore multi-cloud deployment solution](https://github.com/hashgraph/hedera-mirror-node/issues/2688)
- Swap timescale subs chart out for Citus
- Add configuration support for single and multi node versions
### Alternatives
_No response_
|
1.0
|
Citus: Update Helm chart - ### Problem
Current v2 portion of helm chart will require updates
Currently there's no citus chart.
We will have to adopt either a manual configuration of the citusdb resource or utilize Azure Arc coordination
### Solution
Depends on outcome of [Citus: Explore multi-cloud deployment solution](https://github.com/hashgraph/hedera-mirror-node/issues/2688)
- Swap timescale subs chart out for Citus
- Add configuration support for single and multi node versions
### Alternatives
_No response_
|
process
|
citus update helm chart problem current portion of helm chart will require updates currently there s no citus chart we will have to adopt either a manual configuration of the citusdb resource or utilize azure arc coordination solution depends on outcome of swap timescale subs chart out for citus add configuration support for single and multi node versions alternatives no response
| 1
|
29,025
| 5,564,540,087
|
IssuesEvent
|
2017-03-26 04:06:23
|
BobKnothe/autoNumeric
|
https://api.github.com/repos/BobKnothe/autoNumeric
|
closed
|
Complete the documentation with the AutoNumeric event lifecycle
|
Documentation
|
Complete the documentation with the AutoNumeric event lifecycle (ie. the events that are sent by an AutoNumeric object during its life).
Events :
- `'autoNumeric:minExceeded'`
- `'autoNumeric:maxExceeded'`
- `'autoNumeric:formatted'`
- `'input'`
- `'change'`
|
1.0
|
Complete the documentation with the AutoNumeric event lifecycle - Complete the documentation with the AutoNumeric event lifecycle (ie. the events that are sent by an AutoNumeric object during its life).
Events :
- `'autoNumeric:minExceeded'`
- `'autoNumeric:maxExceeded'`
- `'autoNumeric:formatted'`
- `'input'`
- `'change'`
|
non_process
|
complete the documentation with the autonumeric event lifecycle complete the documentation with the autonumeric event lifecycle ie the events that are sent by an autonumeric object during its life events autonumeric minexceeded autonumeric maxexceeded autonumeric formatted input change
| 0
|
315,735
| 23,595,136,906
|
IssuesEvent
|
2022-08-23 18:29:53
|
BrSTU-PO4-Pavel-Galanin/6sem_practice
|
https://api.github.com/repos/BrSTU-PO4-Pavel-Galanin/6sem_practice
|
closed
|
Show year, month, date and hours tasks
|
documentation enhancement
|
Реализовать обвновление access токена.
Сверстать страницу вывода тасок на год.
Сверстать страницу вывода тасок за месяц.
Сверстать страницу вывода тасок за день и таски по часам.
Обновить дерево проекта в `README.md`.
|
1.0
|
Show year, month, date and hours tasks - Реализовать обвновление access токена.
Сверстать страницу вывода тасок на год.
Сверстать страницу вывода тасок за месяц.
Сверстать страницу вывода тасок за день и таски по часам.
Обновить дерево проекта в `README.md`.
|
non_process
|
show year month date and hours tasks реализовать обвновление access токена сверстать страницу вывода тасок на год сверстать страницу вывода тасок за месяц сверстать страницу вывода тасок за день и таски по часам обновить дерево проекта в readme md
| 0
|
69,779
| 3,314,697,061
|
IssuesEvent
|
2015-11-06 07:27:28
|
YetiForceCompany/YetiForceCRM
|
https://api.github.com/repos/YetiForceCompany/YetiForceCRM
|
closed
|
[BUG] New PDF - go to next stage without clicking
|
Priority::#2 Normal Type::Bug
|
On new PDF module pages with WYSIWYG editor and related buttons when hovering near "copy to clipboard" button it change color (like on hover). After click in this place system goes to next stage of creating template.

|
1.0
|
[BUG] New PDF - go to next stage without clicking - On new PDF module pages with WYSIWYG editor and related buttons when hovering near "copy to clipboard" button it change color (like on hover). After click in this place system goes to next stage of creating template.

|
non_process
|
new pdf go to next stage without clicking on new pdf module pages with wysiwyg editor and related buttons when hovering near copy to clipboard button it change color like on hover after click in this place system goes to next stage of creating template
| 0
|
235,966
| 19,475,285,022
|
IssuesEvent
|
2021-12-24 10:56:48
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
roachtest: hibernate failed
|
C-test-failure O-robot O-roachtest release-blocker branch-release-21.2
|
roachtest.hibernate [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3993808&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3993808&tab=artifacts#/hibernate) on release-21.2 @ [ec9330f86763371c445acdc317088e24cbcbaac3](https://github.com/cockroachdb/cockroach/commits/ec9330f86763371c445acdc317088e24cbcbaac3):
```
The test failed on branch=release-21.2, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/hibernate/run_1
orm_helpers.go:245,orm_helpers.go:171,java_helpers.go:220,hibernate.go:230,hibernate.go:242,test_runner.go:777:
Tests run on Cockroach v21.2.3-109-gec9330f867
Tests run against hibernate 5.4.30
8106 Total Tests Run
8086 tests passed
20 tests failed
1901 tests skipped
0 tests ignored
0 tests passed unexpectedly
2 tests failed unexpectedly
0 tests expected failed but skipped
0 tests expected failed but not run
---
--- FAIL: org.hibernate.serialization.SessionFactorySerializationTest.testNamedSessionFactorySerialization - unknown (unexpected)
--- FAIL: org.hibernate.serialization.SessionFactorySerializationTest.testUnNamedSessionFactorySerialization - unknown (unexpected)
For a full summary look at the hibernate artifacts
An updated blocklist (hibernateBlockList21_2) is available in the artifacts' hibernate log
```
<details><summary>Reproduce</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
</p>
</details>
/cc @cockroachdb/sql-experience
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*hibernate.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
2.0
|
roachtest: hibernate failed - roachtest.hibernate [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3993808&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3993808&tab=artifacts#/hibernate) on release-21.2 @ [ec9330f86763371c445acdc317088e24cbcbaac3](https://github.com/cockroachdb/cockroach/commits/ec9330f86763371c445acdc317088e24cbcbaac3):
```
The test failed on branch=release-21.2, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/hibernate/run_1
orm_helpers.go:245,orm_helpers.go:171,java_helpers.go:220,hibernate.go:230,hibernate.go:242,test_runner.go:777:
Tests run on Cockroach v21.2.3-109-gec9330f867
Tests run against hibernate 5.4.30
8106 Total Tests Run
8086 tests passed
20 tests failed
1901 tests skipped
0 tests ignored
0 tests passed unexpectedly
2 tests failed unexpectedly
0 tests expected failed but skipped
0 tests expected failed but not run
---
--- FAIL: org.hibernate.serialization.SessionFactorySerializationTest.testNamedSessionFactorySerialization - unknown (unexpected)
--- FAIL: org.hibernate.serialization.SessionFactorySerializationTest.testUnNamedSessionFactorySerialization - unknown (unexpected)
For a full summary look at the hibernate artifacts
An updated blocklist (hibernateBlockList21_2) is available in the artifacts' hibernate log
```
<details><summary>Reproduce</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
</p>
</details>
/cc @cockroachdb/sql-experience
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*hibernate.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
non_process
|
roachtest hibernate failed roachtest hibernate with on release the test failed on branch release cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts hibernate run orm helpers go orm helpers go java helpers go hibernate go hibernate go test runner go tests run on cockroach tests run against hibernate total tests run tests passed tests failed tests skipped tests ignored tests passed unexpectedly tests failed unexpectedly tests expected failed but skipped tests expected failed but not run fail org hibernate serialization sessionfactoryserializationtest testnamedsessionfactoryserialization unknown unexpected fail org hibernate serialization sessionfactoryserializationtest testunnamedsessionfactoryserialization unknown unexpected for a full summary look at the hibernate artifacts an updated blocklist is available in the artifacts hibernate log reproduce see cc cockroachdb sql experience
| 0
|
17,967
| 23,981,733,904
|
IssuesEvent
|
2022-09-13 15:32:41
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Split expression example fails with error "unrecognized value 'split'"
|
devops/prod doc-bug Pri1 devops-cicd-process/tech
|
Hello,
The documentation provides the following example for [split expression](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#split):
```yml
variables:
- name: environments
value: prod1,prod2
steps:
- ${{ each env in split(variables.environments, ',')}}:
- script: ./deploy.sh --environment ${{ env }}
```
I have created the following YAML pipeline:
```yml
parameters:
- name: environments
displayName: 'Environments CSV'
type: string
default: 'prod3,prod4'
pool:
vmImage: windows-latest
variables:
- name: environments
value: prod1,prod2
steps:
- ${{ each env in split(variables.environments, ',')}}:
- pwsh: Write-Host "var environment ${{ env }}"
- ${{ each env in split(parameters.environments, ',')}}:
- pwsh: Write-Host "param environment ${{ env }}"
```
In Azure DevOps Services I get the following errors:
```text
/split-string-expression.yml (Line: 17, Col: 5): Unrecognized value: 'split'. Located at position 1 within expression: split(variables.environments, ','). For more help, refer to https://go.microsoft.com/fwlink/?linkid=842996
/split-string-expression.yml (Line: 19, Col: 5): Unrecognized value: 'split'. Located at position 1 within expression: split(parameters.environments, ','). For more help, refer to https://go.microsoft.com/fwlink/?linkid=842996
```
Please advise.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 77c58a78-a567-e99a-9eb7-62dddd1b90b6
* Version Independent ID: 680a79bc-11de-39fc-43e3-e07dc762db18
* Content: [Expressions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#split)
* Content Source: [docs/pipelines/process/expressions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/expressions.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Split expression example fails with error "unrecognized value 'split'" - Hello,
The documentation provides the following example for [split expression](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#split):
```yml
variables:
- name: environments
value: prod1,prod2
steps:
- ${{ each env in split(variables.environments, ',')}}:
- script: ./deploy.sh --environment ${{ env }}
```
I have created the following YAML pipeline:
```yml
parameters:
- name: environments
displayName: 'Environments CSV'
type: string
default: 'prod3,prod4'
pool:
vmImage: windows-latest
variables:
- name: environments
value: prod1,prod2
steps:
- ${{ each env in split(variables.environments, ',')}}:
- pwsh: Write-Host "var environment ${{ env }}"
- ${{ each env in split(parameters.environments, ',')}}:
- pwsh: Write-Host "param environment ${{ env }}"
```
In Azure DevOps Services I get the following errors:
```text
/split-string-expression.yml (Line: 17, Col: 5): Unrecognized value: 'split'. Located at position 1 within expression: split(variables.environments, ','). For more help, refer to https://go.microsoft.com/fwlink/?linkid=842996
/split-string-expression.yml (Line: 19, Col: 5): Unrecognized value: 'split'. Located at position 1 within expression: split(parameters.environments, ','). For more help, refer to https://go.microsoft.com/fwlink/?linkid=842996
```
Please advise.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 77c58a78-a567-e99a-9eb7-62dddd1b90b6
* Version Independent ID: 680a79bc-11de-39fc-43e3-e07dc762db18
* Content: [Expressions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#split)
* Content Source: [docs/pipelines/process/expressions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/expressions.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
split expression example fails with error unrecognized value split hello the documentation provides the following example for yml variables name environments value steps each env in split variables environments script deploy sh environment env i have created the following yaml pipeline yml parameters name environments displayname environments csv type string default pool vmimage windows latest variables name environments value steps each env in split variables environments pwsh write host var environment env each env in split parameters environments pwsh write host param environment env in azure devops services i get the following errors text split string expression yml line col unrecognized value split located at position within expression split variables environments for more help refer to split string expression yml line col unrecognized value split located at position within expression split parameters environments for more help refer to please advise document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
20,706
| 27,393,949,409
|
IssuesEvent
|
2023-02-28 18:10:05
|
googleapis/python-pubsub
|
https://api.github.com/repos/googleapis/python-pubsub
|
closed
|
Warning: a recent release failed
|
api: pubsub type: process
|
The following release PRs may have failed:
* #867 - The release job was triggered, but has not reported back success.
* #853 - The release job was triggered, but has not reported back success.
|
1.0
|
Warning: a recent release failed - The following release PRs may have failed:
* #867 - The release job was triggered, but has not reported back success.
* #853 - The release job was triggered, but has not reported back success.
|
process
|
warning a recent release failed the following release prs may have failed the release job was triggered but has not reported back success the release job was triggered but has not reported back success
| 1
|
10,599
| 13,427,143,002
|
IssuesEvent
|
2020-09-06 16:53:51
|
Figma-Linux/figma-linux
|
https://api.github.com/repos/Figma-Linux/figma-linux
|
closed
|
Snap Figma 0.6.2 - "-" & "+" keys ALWAYS zoom
|
Kind: Renderer Process Priority: High Status: Done/In test Type: Bug
|
* App version: 0.6.2
* Operating System (Platform and Version): Pop_OS 20.04
* Type of installed package (Snap, AppImage, deb, rpm, pacman): Snap
**Bug description**
Inside of figma button - & + behave as if the ctrl key has been pressed. This means that even when typing text writing in "bla bla - bla bla" will zoom out when you press the - key.
|
1.0
|
Snap Figma 0.6.2 - "-" & "+" keys ALWAYS zoom - * App version: 0.6.2
* Operating System (Platform and Version): Pop_OS 20.04
* Type of installed package (Snap, AppImage, deb, rpm, pacman): Snap
**Bug description**
Inside of figma button - & + behave as if the ctrl key has been pressed. This means that even when typing text writing in "bla bla - bla bla" will zoom out when you press the - key.
|
process
|
snap figma keys always zoom app version operating system platform and version pop os type of installed package snap appimage deb rpm pacman snap bug description inside of figma button behave as if the ctrl key has been pressed this means that even when typing text writing in bla bla bla bla will zoom out when you press the key
| 1
|
11,793
| 5,089,305,954
|
IssuesEvent
|
2017-01-01 14:19:54
|
TNG/JGiven
|
https://api.github.com/repos/TNG/JGiven
|
closed
|
Extract HTML 5 Report into separate project
|
build HTML Report
|
In order to make it easier to develop the HTML report independently of JGiven, it would be useful to put the HTML report into its own project with a separate release cycle. JGiven should still provide a Java artifact with a version matching that of JGiven.
As the HTML 5 report is a pure Javascript project, it would make sense to publish it to NPM.
This is in particular important to easier support different frontends, like https://github.com/jsGiven/jsGiven. Also see https://github.com/jsGiven/jsGiven/issues/53.
|
1.0
|
Extract HTML 5 Report into separate project - In order to make it easier to develop the HTML report independently of JGiven, it would be useful to put the HTML report into its own project with a separate release cycle. JGiven should still provide a Java artifact with a version matching that of JGiven.
As the HTML 5 report is a pure Javascript project, it would make sense to publish it to NPM.
This is in particular important to easier support different frontends, like https://github.com/jsGiven/jsGiven. Also see https://github.com/jsGiven/jsGiven/issues/53.
|
non_process
|
extract html report into separate project in order to make it easier to develop the html report independently of jgiven it would be useful to put the html report into its own project with a separate release cycle jgiven should still provide a java artifact with a version matching that of jgiven as the html report is a pure javascript project it would make sense to publish it to npm this is in particular important to easier support different frontends like also see
| 0
|
19,459
| 25,751,686,706
|
IssuesEvent
|
2022-12-08 13:42:28
|
esmero/ami
|
https://api.github.com/repos/esmero/ami
|
closed
|
Allow multi ADO relationships on AMI CSVs
|
queue queue workers File processing CSV Processing
|
# What?
Right now our relationships can hold a single value. I (yes ME) have a many reasons for this and I will keep my concerns (cyclic, messy graphs, "inventive" compounds, etc) but @alliomeria made a point and I need to listen
The solution involves rewriting a few pieces of the \Drupal\ami\AmiUtilityService::preprocessAmiSet to do more preprocessing of the values and also some better cleanups on the queue worker. A lot of this was made "upfront" waiting for this day.
This will allow values under e.g `ismemberof` like
- `rownumber1;rownumber2`
- `[rownumber1, rownumber2]`
- `uuid1;uuid2`
- `["uuid1","uuid2"]`
- `rownumber1;uuid1`
- `[rownumber1,"uuid2"]`
The preferred way will be semicolon separated to avoid you all using "nice quotes" and breaking stuff.
|
2.0
|
Allow multi ADO relationships on AMI CSVs - # What?
Right now our relationships can hold a single value. I (yes ME) have a many reasons for this and I will keep my concerns (cyclic, messy graphs, "inventive" compounds, etc) but @alliomeria made a point and I need to listen
The solution involves rewriting a few pieces of the \Drupal\ami\AmiUtilityService::preprocessAmiSet to do more preprocessing of the values and also some better cleanups on the queue worker. A lot of this was made "upfront" waiting for this day.
This will allow values under e.g `ismemberof` like
- `rownumber1;rownumber2`
- `[rownumber1, rownumber2]`
- `uuid1;uuid2`
- `["uuid1","uuid2"]`
- `rownumber1;uuid1`
- `[rownumber1,"uuid2"]`
The preferred way will be semicolon separated to avoid you all using "nice quotes" and breaking stuff.
|
process
|
allow multi ado relationships on ami csvs what right now our relationships can hold a single value i yes me have a many reasons for this and i will keep my concerns cyclic messy graphs inventive compounds etc but alliomeria made a point and i need to listen the solution involves rewriting a few pieces of the drupal ami amiutilityservice preprocessamiset to do more preprocessing of the values and also some better cleanups on the queue worker a lot of this was made upfront waiting for this day this will allow values under e g ismemberof like the preferred way will be semicolon separated to avoid you all using nice quotes and breaking stuff
| 1
|
45,103
| 23,911,999,862
|
IssuesEvent
|
2022-09-09 09:05:18
|
flutter/flutter
|
https://api.github.com/repos/flutter/flutter
|
opened
|
LeakCanary 【 SingleViewPresentation$PresentationContext 】
|
created via performance template
|
My app is a mixed project of kotlin and flutter.
When I open a platformview flutterActivity and then close this activity ,
but something leakCanary .
**flutter docror :**
```
Flutter assets will be downloaded from https://storage.flutter-io.cn. Make sure you trust this source!
Doctor summary (to see all details, run flutter doctor -v):
[√] Flutter (Channel stable, 3.3.0, on Microsoft Windows [版本 10.0.18363.836], locale zh-CN)
Checking Android licenses is taking an unexpectedly long time...[√] Android toolchain - develop for Android devices (Android SDK version 32.1.0-rc1)
[√] Chrome - develop for the web
[√] Visual Studio - develop for Windows (Visual Studio Community 2019 16.4.2)
[√] Android Studio (version 2020.3)
[√] Connected device (3 available)
[√] HTTP Host Availability
• No issues found!
```
**log below :**
```
LeakCanary: ├─ io.flutter.plugin.common.MethodChannel instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 24 B in 1 objects
LeakCanary: │ ↓ MethodChannel.messenger
LeakCanary: │ ~~~~~~~~~
LeakCanary: ├─ io.flutter.embedding.engine.dart.DartExecutor instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 37 B in 1 objects
LeakCanary: │ ↓ DartExecutor.flutterJNI
LeakCanary: │ ~~~~~~~~~~
LeakCanary: ├─ io.flutter.embedding.engine.FlutterJNI instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 240 B in 15 objects
LeakCanary: │ ↓ FlutterJNI.platformViewsController
LeakCanary: │ ~~~~~~~~~~~~~~~~~~~~~~~
LeakCanary: ├─ io.flutter.plugin.platform.PlatformViewsController instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 1.9 MB in 3853 objects
LeakCanary: │ context instance of com.tome.ropse.MyApplication
LeakCanary: │ ↓ PlatformViewsController.contextToEmbeddedView
LeakCanary: │ ~~~~~~~~~~~~~~~~~~~~~
LeakCanary: ├─ java.util.HashMap instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 1.9 MB in 3829 objects
LeakCanary: │ ↓ HashMap[key()]
LeakCanary: │ ~~~~~~~
LeakCanary: ├─ android.content.MutableContextWrapper instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 57.6 kB in 874 objects
LeakCanary: │ mBase instance of io.flutter.plugin.platform.SingleViewPresentation$PresentationContext
LeakCanary: │ MutableContextWrapper does not wrap a known Android context
LeakCanary: │ ↓ ContextWrapper.mBase
LeakCanary: │ ~~~~~
LeakCanary: ├─ io.flutter.plugin.platform.SingleViewPresentation$PresentationContext instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 57.6 kB in 873 objects
LeakCanary: │ flutterAppWindowContext instance of com.tome.flutter.page.FlutterPageActivity with mDestroyed = tru
LeakCanary: │ mBase instance of android.app.Presentation$1
LeakCanary: │ SingleViewPresentation$PresentationContext does not wrap a known Android context
LeakCanary: │ ↓ SingleViewPresentation$PresentationContext.flutterAppWindowContext
LeakCanary: │ ~~~~~~~~~~~~~~~~~~~~~~~
LeakCanary: ╰→ com.tome.flutter.page.FlutterPageActivity instance
LeakCanary: Leaking: YES (ObjectWatcher was watching this because com.tome.flutter.page.FlutterPageActivity re
LeakCanary: Activity#onDestroy() callback and Activity#mDestroyed is true)
LeakCanary: Retaining 10.1 kB in 187 objects
LeakCanary: key = ab4bdb00-298c-457d-a56e-d9a05984b6dc
LeakCanary: watchDurationMillis = 71742
LeakCanary: retainedDurationMillis = 66742
LeakCanary: mApplication instance of com.tome.ropse.MyApplication
LeakCanary: mBase instance of android.app.ContextImpl
LeakCanary: ====================================
```
|
True
|
LeakCanary 【 SingleViewPresentation$PresentationContext 】 - My app is a mixed project of kotlin and flutter.
When I open a platformview flutterActivity and then close this activity ,
but something leakCanary .
**flutter docror :**
```
Flutter assets will be downloaded from https://storage.flutter-io.cn. Make sure you trust this source!
Doctor summary (to see all details, run flutter doctor -v):
[√] Flutter (Channel stable, 3.3.0, on Microsoft Windows [版本 10.0.18363.836], locale zh-CN)
Checking Android licenses is taking an unexpectedly long time...[√] Android toolchain - develop for Android devices (Android SDK version 32.1.0-rc1)
[√] Chrome - develop for the web
[√] Visual Studio - develop for Windows (Visual Studio Community 2019 16.4.2)
[√] Android Studio (version 2020.3)
[√] Connected device (3 available)
[√] HTTP Host Availability
• No issues found!
```
**log below :**
```
LeakCanary: ├─ io.flutter.plugin.common.MethodChannel instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 24 B in 1 objects
LeakCanary: │ ↓ MethodChannel.messenger
LeakCanary: │ ~~~~~~~~~
LeakCanary: ├─ io.flutter.embedding.engine.dart.DartExecutor instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 37 B in 1 objects
LeakCanary: │ ↓ DartExecutor.flutterJNI
LeakCanary: │ ~~~~~~~~~~
LeakCanary: ├─ io.flutter.embedding.engine.FlutterJNI instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 240 B in 15 objects
LeakCanary: │ ↓ FlutterJNI.platformViewsController
LeakCanary: │ ~~~~~~~~~~~~~~~~~~~~~~~
LeakCanary: ├─ io.flutter.plugin.platform.PlatformViewsController instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 1.9 MB in 3853 objects
LeakCanary: │ context instance of com.tome.ropse.MyApplication
LeakCanary: │ ↓ PlatformViewsController.contextToEmbeddedView
LeakCanary: │ ~~~~~~~~~~~~~~~~~~~~~
LeakCanary: ├─ java.util.HashMap instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 1.9 MB in 3829 objects
LeakCanary: │ ↓ HashMap[key()]
LeakCanary: │ ~~~~~~~
LeakCanary: ├─ android.content.MutableContextWrapper instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 57.6 kB in 874 objects
LeakCanary: │ mBase instance of io.flutter.plugin.platform.SingleViewPresentation$PresentationContext
LeakCanary: │ MutableContextWrapper does not wrap a known Android context
LeakCanary: │ ↓ ContextWrapper.mBase
LeakCanary: │ ~~~~~
LeakCanary: ├─ io.flutter.plugin.platform.SingleViewPresentation$PresentationContext instance
LeakCanary: │ Leaking: UNKNOWN
LeakCanary: │ Retaining 57.6 kB in 873 objects
LeakCanary: │ flutterAppWindowContext instance of com.tome.flutter.page.FlutterPageActivity with mDestroyed = tru
LeakCanary: │ mBase instance of android.app.Presentation$1
LeakCanary: │ SingleViewPresentation$PresentationContext does not wrap a known Android context
LeakCanary: │ ↓ SingleViewPresentation$PresentationContext.flutterAppWindowContext
LeakCanary: │ ~~~~~~~~~~~~~~~~~~~~~~~
LeakCanary: ╰→ com.tome.flutter.page.FlutterPageActivity instance
LeakCanary: Leaking: YES (ObjectWatcher was watching this because com.tome.flutter.page.FlutterPageActivity re
LeakCanary: Activity#onDestroy() callback and Activity#mDestroyed is true)
LeakCanary: Retaining 10.1 kB in 187 objects
LeakCanary: key = ab4bdb00-298c-457d-a56e-d9a05984b6dc
LeakCanary: watchDurationMillis = 71742
LeakCanary: retainedDurationMillis = 66742
LeakCanary: mApplication instance of com.tome.ropse.MyApplication
LeakCanary: mBase instance of android.app.ContextImpl
LeakCanary: ====================================
```
|
non_process
|
leakcanary 【 singleviewpresentation presentationcontext 】 my app is a mixed project of kotlin and flutter when i open a platformview flutteractivity and then close this activity but something leakcanary flutter docror flutter assets will be downloaded from make sure you trust this source doctor summary to see all details run flutter doctor v flutter channel stable on microsoft windows locale zh cn checking android licenses is taking an unexpectedly long time android toolchain develop for android devices android sdk version chrome develop for the web visual studio develop for windows visual studio community android studio version connected device available http host availability • no issues found log below leakcanary ├─ io flutter plugin common methodchannel instance leakcanary │ leaking unknown leakcanary │ retaining b in objects leakcanary │ ↓ methodchannel messenger leakcanary │ leakcanary ├─ io flutter embedding engine dart dartexecutor instance leakcanary │ leaking unknown leakcanary │ retaining b in objects leakcanary │ ↓ dartexecutor flutterjni leakcanary │ leakcanary ├─ io flutter embedding engine flutterjni instance leakcanary │ leaking unknown leakcanary │ retaining b in objects leakcanary │ ↓ flutterjni platformviewscontroller leakcanary │ leakcanary ├─ io flutter plugin platform platformviewscontroller instance leakcanary │ leaking unknown leakcanary │ retaining mb in objects leakcanary │ context instance of com tome ropse myapplication leakcanary │ ↓ platformviewscontroller contexttoembeddedview leakcanary │ leakcanary ├─ java util hashmap instance leakcanary │ leaking unknown leakcanary │ retaining mb in objects leakcanary │ ↓ hashmap leakcanary │ leakcanary ├─ android content mutablecontextwrapper instance leakcanary │ leaking unknown leakcanary │ retaining kb in objects leakcanary │ mbase instance of io flutter plugin platform singleviewpresentation presentationcontext leakcanary │ mutablecontextwrapper does not wrap a known android context leakcanary │ ↓ contextwrapper mbase leakcanary │ leakcanary ├─ io flutter plugin platform singleviewpresentation presentationcontext instance leakcanary │ leaking unknown leakcanary │ retaining kb in objects leakcanary │ flutterappwindowcontext instance of com tome flutter page flutterpageactivity with mdestroyed tru leakcanary │ mbase instance of android app presentation leakcanary │ singleviewpresentation presentationcontext does not wrap a known android context leakcanary │ ↓ singleviewpresentation presentationcontext flutterappwindowcontext leakcanary │ leakcanary ╰→ com tome flutter page flutterpageactivity instance leakcanary leaking yes objectwatcher was watching this because com tome flutter page flutterpageactivity re leakcanary activity ondestroy callback and activity mdestroyed is true leakcanary retaining kb in objects leakcanary key leakcanary watchdurationmillis leakcanary retaineddurationmillis leakcanary mapplication instance of com tome ropse myapplication leakcanary mbase instance of android app contextimpl leakcanary
| 0
|
8,509
| 11,687,253,089
|
IssuesEvent
|
2020-03-05 12:28:07
|
heim-rs/heim
|
https://api.github.com/repos/heim-rs/heim
|
closed
|
Process::priority for Windows
|
A-process C-enhancement O-windows
|
Similar to #216, there should be Windows-specific method to change process priority
|
1.0
|
Process::priority for Windows - Similar to #216, there should be Windows-specific method to change process priority
|
process
|
process priority for windows similar to there should be windows specific method to change process priority
| 1
|
437,302
| 30,594,712,499
|
IssuesEvent
|
2023-07-21 20:35:42
|
DCC-EX/dcc-ex.github.io
|
https://api.github.com/repos/DCC-EX/dcc-ex.github.io
|
closed
|
Add EX-RAIL crossover coding examples.
|
Documentation
|
There is a slight risk where you join the 2 back-to-back turnouts with one button because someone using a throttle could alter one of them without altering the other and then they're out of sync.
It might be best to write the interlinked ones slightly differently to a standard turnout but it depends on your level of confidence with code...
Perhaps I would have used this technique:
Assuming my interlinked turnouts are A and B
- DON'T define B as a turnout, so nobody could THROW B from a throttle because it would be invisible.
in the ONTHROW(A) or ONCLOSE(A)
use a SERVO command to move B to match A correctly.
(For info of other readers: The same technique works if you are using pin or DCC turnouts)
You cant use a generic CLEVER macro for this, because its a special case so just write the script for A as you did before.
|
1.0
|
Add EX-RAIL crossover coding examples. - There is a slight risk where you join the 2 back-to-back turnouts with one button because someone using a throttle could alter one of them without altering the other and then they're out of sync.
It might be best to write the interlinked ones slightly differently to a standard turnout but it depends on your level of confidence with code...
Perhaps I would have used this technique:
Assuming my interlinked turnouts are A and B
- DON'T define B as a turnout, so nobody could THROW B from a throttle because it would be invisible.
in the ONTHROW(A) or ONCLOSE(A)
use a SERVO command to move B to match A correctly.
(For info of other readers: The same technique works if you are using pin or DCC turnouts)
You cant use a generic CLEVER macro for this, because its a special case so just write the script for A as you did before.
|
non_process
|
add ex rail crossover coding examples there is a slight risk where you join the back to back turnouts with one button because someone using a throttle could alter one of them without altering the other and then they re out of sync it might be best to write the interlinked ones slightly differently to a standard turnout but it depends on your level of confidence with code perhaps i would have used this technique assuming my interlinked turnouts are a and b don t define b as a turnout so nobody could throw b from a throttle because it would be invisible in the onthrow a or onclose a use a servo command to move b to match a correctly for info of other readers the same technique works if you are using pin or dcc turnouts you cant use a generic clever macro for this because its a special case so just write the script for a as you did before
| 0
|
4,173
| 7,108,066,883
|
IssuesEvent
|
2018-01-16 22:20:11
|
ViktorKuryshev/CRM
|
https://api.github.com/repos/ViktorKuryshev/CRM
|
closed
|
ProcV - 052 Создать вкладку ТМ
|
Critical Process
|
- [x] Добавить узел ТМ.
- [x] Внести в навигацию переключение на управление ТМ.
|
1.0
|
ProcV - 052 Создать вкладку ТМ - - [x] Добавить узел ТМ.
- [x] Внести в навигацию переключение на управление ТМ.
|
process
|
procv создать вкладку тм добавить узел тм внести в навигацию переключение на управление тм
| 1
|
14,381
| 17,401,381,320
|
IssuesEvent
|
2021-08-02 20:13:29
|
choderalab/perses
|
https://api.github.com/repos/choderalab/perses
|
closed
|
Maintain the changelog
|
effort: low priority: low process improvement
|
In theory we have a change log here:
https://github.com/choderalab/perses/blob/master/docs/changelog.rst
but we have not maintained it. We can use github releases to fill it out for past releases and going forward and adding an entry to the PR checklist.
|
1.0
|
Maintain the changelog - In theory we have a change log here:
https://github.com/choderalab/perses/blob/master/docs/changelog.rst
but we have not maintained it. We can use github releases to fill it out for past releases and going forward and adding an entry to the PR checklist.
|
process
|
maintain the changelog in theory we have a change log here but we have not maintained it we can use github releases to fill it out for past releases and going forward and adding an entry to the pr checklist
| 1
|
10,636
| 13,446,094,771
|
IssuesEvent
|
2020-09-08 12:28:03
|
MHRA/products
|
https://api.github.com/repos/MHRA/products
|
closed
|
PARs - User guide
|
EPIC - PARs process NEW :new:
|
### User want
As a Medical Writer I would like to read a user guide on how to upload and amend PARs, so that I'm fully informed and can refer back this when i NEED HELP
### Acceptance Criteria
**Customer acceptance criteria**
- [ ] Step by step guide on uploading a new PAR
- [ ] Step by step guide on uploading an amended PAR
- [ ] Details on how to access log files for auditing
**Technical acceptance criteria**
**Data acceptance criteria**
**Testing acceptance criteria**
**Size**
**Value**
**Effort**
### Exit Criteria met
- [ ] Backlog
- [ ] Discovery
- [ ] DUXD
- [ ] Development
- [ ] Quality Assurance
- [ ] Release and Validate
|
1.0
|
PARs - User guide - ### User want
As a Medical Writer I would like to read a user guide on how to upload and amend PARs, so that I'm fully informed and can refer back this when i NEED HELP
### Acceptance Criteria
**Customer acceptance criteria**
- [ ] Step by step guide on uploading a new PAR
- [ ] Step by step guide on uploading an amended PAR
- [ ] Details on how to access log files for auditing
**Technical acceptance criteria**
**Data acceptance criteria**
**Testing acceptance criteria**
**Size**
**Value**
**Effort**
### Exit Criteria met
- [ ] Backlog
- [ ] Discovery
- [ ] DUXD
- [ ] Development
- [ ] Quality Assurance
- [ ] Release and Validate
|
process
|
pars user guide user want as a medical writer i would like to read a user guide on how to upload and amend pars so that i m fully informed and can refer back this when i need help acceptance criteria customer acceptance criteria step by step guide on uploading a new par step by step guide on uploading an amended par details on how to access log files for auditing technical acceptance criteria data acceptance criteria testing acceptance criteria size value effort exit criteria met backlog discovery duxd development quality assurance release and validate
| 1
|
129,125
| 5,088,823,235
|
IssuesEvent
|
2017-01-01 04:03:16
|
remlostime/one
|
https://api.github.com/repos/remlostime/one
|
opened
|
[Bug] Fix edit user page error
|
bug high-priority
|
* NSNotification to tell home profile vc to update user info
* User data sync up with server has error
* UI is too slow
|
1.0
|
[Bug] Fix edit user page error - * NSNotification to tell home profile vc to update user info
* User data sync up with server has error
* UI is too slow
|
non_process
|
fix edit user page error nsnotification to tell home profile vc to update user info user data sync up with server has error ui is too slow
| 0
|
118,830
| 11,993,858,183
|
IssuesEvent
|
2020-04-08 12:46:58
|
galasa-dev/projectmanagement
|
https://api.github.com/repos/galasa-dev/projectmanagement
|
opened
|
IP Network Manager doesn't have a documentation page
|
documentation
|
although this is in ALPHA it should have a doc page
|
1.0
|
IP Network Manager doesn't have a documentation page - although this is in ALPHA it should have a doc page
|
non_process
|
ip network manager doesn t have a documentation page although this is in alpha it should have a doc page
| 0
|
74,436
| 7,428,605,510
|
IssuesEvent
|
2018-03-24 03:37:34
|
ODIQueensland/data-curator
|
https://api.github.com/repos/ODIQueensland/data-curator
|
closed
|
v0.12.0 one automated test fails
|
env:MacOS i:Tests
|
The output from the automated tests looks awesome.
Using Develop branch, when I run `yarn run test` I get an error in the Open csv scenario
<img width="540" alt="screenshot 2018-03-16 06 12 52" src="https://user-images.githubusercontent.com/9379524/37488665-1d4cf728-28e1-11e8-96be-4b462ef49ccb.png">
### Your Environment
* Data Curator version: 0.12.0
* Operating System and version: macOS High Sierra 10.13.3
|
1.0
|
v0.12.0 one automated test fails - The output from the automated tests looks awesome.
Using Develop branch, when I run `yarn run test` I get an error in the Open csv scenario
<img width="540" alt="screenshot 2018-03-16 06 12 52" src="https://user-images.githubusercontent.com/9379524/37488665-1d4cf728-28e1-11e8-96be-4b462ef49ccb.png">
### Your Environment
* Data Curator version: 0.12.0
* Operating System and version: macOS High Sierra 10.13.3
|
non_process
|
one automated test fails the output from the automated tests looks awesome using develop branch when i run yarn run test i get an error in the open csv scenario img width alt screenshot src your environment data curator version operating system and version macos high sierra
| 0
|
444
| 2,873,860,469
|
IssuesEvent
|
2015-06-08 19:20:28
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
Add extension attributes to table
|
feature P2 preprocess
|
Add extension attributes to table elements to simplify transtype specific code:
* `x` coordinate of the cell in table
* `y` coordinate of the cell in table
* `colspan` for cell based on span start and end columns
|
1.0
|
Add extension attributes to table - Add extension attributes to table elements to simplify transtype specific code:
* `x` coordinate of the cell in table
* `y` coordinate of the cell in table
* `colspan` for cell based on span start and end columns
|
process
|
add extension attributes to table add extension attributes to table elements to simplify transtype specific code x coordinate of the cell in table y coordinate of the cell in table colspan for cell based on span start and end columns
| 1
|
4,143
| 7,097,410,871
|
IssuesEvent
|
2018-01-14 19:01:03
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
opened
|
[New Tests] Failed: System.ServiceProcess.Tests.ServiceBaseTests / LogWritten & LogWritten_AutoLog_False
|
area-System.ServiceProcess test bug
|
Affected tests (introduced in PR #26260 - @Anipik):
* LogWritten
* LogWritten_AutoLog_False
Failure of `LogWritten` test:
```
System.InvalidOperationException : Cannot open log Application on computer '.'. This function is not supported on this system
at System.Diagnostics.EventLogInternal.OpenForRead(String currentMachineName) in E:\A\_work\1554\s\corefx\src\System.Diagnostics.EventLog\src\System\Diagnostics\EventLogInternal.cs:line 1112
at System.Diagnostics.EventLogInternal.get_EntryCount() in E:\A\_work\1554\s\corefx\src\System.Diagnostics.EventLog\src\System\Diagnostics\EventLogInternal.cs:line 155
at System.ServiceProcess.Tests.ServiceBaseTests.LogWritten() in E:\A\_work\1554\s\corefx\src\System.ServiceProcess.ServiceController\tests\ServiceBaseTests.cs:line 180
```
## History of failures
1/13 | 20180113.03 | Win10.Nano | LogWritten & LogWritten_AutoLog_False
1/14 | 20180114.01 | Win10.Nano | LogWritten & LogWritten_AutoLog_False
1/14 | 20180114.03 | Win10.Nano | LogWritten & LogWritten_AutoLog_False
|
1.0
|
[New Tests] Failed: System.ServiceProcess.Tests.ServiceBaseTests / LogWritten & LogWritten_AutoLog_False - Affected tests (introduced in PR #26260 - @Anipik):
* LogWritten
* LogWritten_AutoLog_False
Failure of `LogWritten` test:
```
System.InvalidOperationException : Cannot open log Application on computer '.'. This function is not supported on this system
at System.Diagnostics.EventLogInternal.OpenForRead(String currentMachineName) in E:\A\_work\1554\s\corefx\src\System.Diagnostics.EventLog\src\System\Diagnostics\EventLogInternal.cs:line 1112
at System.Diagnostics.EventLogInternal.get_EntryCount() in E:\A\_work\1554\s\corefx\src\System.Diagnostics.EventLog\src\System\Diagnostics\EventLogInternal.cs:line 155
at System.ServiceProcess.Tests.ServiceBaseTests.LogWritten() in E:\A\_work\1554\s\corefx\src\System.ServiceProcess.ServiceController\tests\ServiceBaseTests.cs:line 180
```
## History of failures
1/13 | 20180113.03 | Win10.Nano | LogWritten & LogWritten_AutoLog_False
1/14 | 20180114.01 | Win10.Nano | LogWritten & LogWritten_AutoLog_False
1/14 | 20180114.03 | Win10.Nano | LogWritten & LogWritten_AutoLog_False
|
process
|
failed system serviceprocess tests servicebasetests logwritten logwritten autolog false affected tests introduced in pr anipik logwritten logwritten autolog false failure of logwritten test system invalidoperationexception cannot open log application on computer this function is not supported on this system at system diagnostics eventloginternal openforread string currentmachinename in e a work s corefx src system diagnostics eventlog src system diagnostics eventloginternal cs line at system diagnostics eventloginternal get entrycount in e a work s corefx src system diagnostics eventlog src system diagnostics eventloginternal cs line at system serviceprocess tests servicebasetests logwritten in e a work s corefx src system serviceprocess servicecontroller tests servicebasetests cs line history of failures nano logwritten logwritten autolog false nano logwritten logwritten autolog false nano logwritten logwritten autolog false
| 1
|
1,596
| 4,210,997,360
|
IssuesEvent
|
2016-06-29 12:06:21
|
e-government-ua/iBP
|
https://api.github.com/repos/e-government-ua/iBP
|
closed
|
Хмельницький - Видача дозвіл на порушення об’єктів благоустрою міста Хмельницького
|
In process of testing in work test
|
Доброго дня, прошу розробити нові послуги та підключити їх в тестовий портал
https://test.region.igov.org.ua на логін - khmel_mvk_1
контакти відповідальних з боку державного органу: Малінковська Олена Володимирівна, cnap32@rada.khmelnytsky.com, elenamalinkovskaya@yandex.ru, 0672831292
інфокарта
https://drive.google.com/file/d/0BzhmqPf0m3-EWEsxNXRxUVAzTzg/view
|
1.0
|
Хмельницький - Видача дозвіл на порушення об’єктів благоустрою міста Хмельницького - Доброго дня, прошу розробити нові послуги та підключити їх в тестовий портал
https://test.region.igov.org.ua на логін - khmel_mvk_1
контакти відповідальних з боку державного органу: Малінковська Олена Володимирівна, cnap32@rada.khmelnytsky.com, elenamalinkovskaya@yandex.ru, 0672831292
інфокарта
https://drive.google.com/file/d/0BzhmqPf0m3-EWEsxNXRxUVAzTzg/view
|
process
|
хмельницький видача дозвіл на порушення об’єктів благоустрою міста хмельницького доброго дня прошу розробити нові послуги та підключити їх в тестовий портал на логін khmel mvk контакти відповідальних з боку державного органу малінковська олена володимирівна rada khmelnytsky com elenamalinkovskaya yandex ru інфокарта
| 1
|
232,545
| 25,578,885,524
|
IssuesEvent
|
2022-12-01 01:32:49
|
renfei/SpringCloudDemo
|
https://api.github.com/repos/renfei/SpringCloudDemo
|
opened
|
CVE-2022-41854 (Medium) detected in snakeyaml-1.25.jar
|
security vulnerability
|
## CVE-2022-41854 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.25.jar</b></p></summary>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /demoservice/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.25/snakeyaml-1.25.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.25/snakeyaml-1.25.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.25/snakeyaml-1.25.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.25/snakeyaml-1.25.jar</p>
<p>
Dependency Hierarchy:
- spring-cloud-config-server-2.2.1.RELEASE.jar (Root Library)
- :x: **snakeyaml-1.25.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/renfei/SpringCloudDemo/commit/3e0d29d6560c91c0b0e70b612a1bc5c66f0758cb">3e0d29d6560c91c0b0e70b612a1bc5c66f0758cb</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Those using Snakeyaml to parse untrusted YAML files may be vulnerable to Denial of Service attacks (DOS). If the parser is running on user supplied input, an attacker may supply content that causes the parser to crash by stack overflow. This effect may support a denial of service attack.
<p>Publish Date: 2022-11-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-41854>CVE-2022-41854</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/531/">https://bitbucket.org/snakeyaml/snakeyaml/issues/531/</a></p>
<p>Release Date: 2022-11-11</p>
<p>Fix Resolution: org.yaml:snakeyaml:1.32</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-41854 (Medium) detected in snakeyaml-1.25.jar - ## CVE-2022-41854 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.25.jar</b></p></summary>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /demoservice/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.25/snakeyaml-1.25.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.25/snakeyaml-1.25.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.25/snakeyaml-1.25.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.25/snakeyaml-1.25.jar</p>
<p>
Dependency Hierarchy:
- spring-cloud-config-server-2.2.1.RELEASE.jar (Root Library)
- :x: **snakeyaml-1.25.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/renfei/SpringCloudDemo/commit/3e0d29d6560c91c0b0e70b612a1bc5c66f0758cb">3e0d29d6560c91c0b0e70b612a1bc5c66f0758cb</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Those using Snakeyaml to parse untrusted YAML files may be vulnerable to Denial of Service attacks (DOS). If the parser is running on user supplied input, an attacker may supply content that causes the parser to crash by stack overflow. This effect may support a denial of service attack.
<p>Publish Date: 2022-11-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-41854>CVE-2022-41854</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/531/">https://bitbucket.org/snakeyaml/snakeyaml/issues/531/</a></p>
<p>Release Date: 2022-11-11</p>
<p>Fix Resolution: org.yaml:snakeyaml:1.32</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in snakeyaml jar cve medium severity vulnerability vulnerable library snakeyaml jar yaml parser and emitter for java library home page a href path to dependency file demoservice pom xml path to vulnerable library home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar dependency hierarchy spring cloud config server release jar root library x snakeyaml jar vulnerable library found in head commit a href found in base branch master vulnerability details those using snakeyaml to parse untrusted yaml files may be vulnerable to denial of service attacks dos if the parser is running on user supplied input an attacker may supply content that causes the parser to crash by stack overflow this effect may support a denial of service attack publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org yaml snakeyaml step up your open source security game with mend
| 0
|
106,641
| 13,338,664,737
|
IssuesEvent
|
2020-08-28 11:28:39
|
flutter/flutter
|
https://api.github.com/repos/flutter/flutter
|
closed
|
formKey.currentState.reset() doesn't reset DropdownButtonFormField memory data
|
d: stackoverflow f: material design framework waiting for customer response
|
<!-- Thank you for using Flutter!
If you are looking for support, please check out our documentation
or consider asking a question on Stack Overflow:
* https://flutter.dev/
* https://api.flutter.dev/
* https://stackoverflow.com/questions/tagged/flutter?sort=frequent
If you have found a bug or if our documentation doesn't have an answer
to what you're looking for, then fill our the template below. Please read
our guide to filing a bug first: https://flutter.dev/docs/resources/bug-reports
-->
## Steps to Reproduce
<!-- You must include full steps to reproduce so that we can reproduce the problem. -->
Code as following
--------------------
```
import 'package:flutter/material.dart';
void main() => runApp(MyApp());
class MyApp extends StatelessWidget {
@override
Widget build(BuildContext context) {
return MaterialApp(
title: 'Flutter Demo',
theme: ThemeData(
primarySwatch: Colors.blue,
),
home: MyHomePage(title: 'Flutter Demo Home Page'),
);
}
}
class MyHomePage extends StatefulWidget {
MyHomePage({Key key, this.title}) : super(key: key);
final String title;
@override
_MyHomePageState createState() => _MyHomePageState();
}
class _MyHomePageState extends State<MyHomePage> {
final GlobalKey<FormState> _formKey = GlobalKey<FormState>();
int dropDownValue;
@override
Widget build(BuildContext context) {
return Scaffold(
appBar: AppBar(
title: Text(widget.title),
),
body: Form(
key: _formKey,
child: Padding(
padding: const EdgeInsets.all(20.0),
child: Column(
children: <Widget>[
DropdownButtonFormField<int>(
validator: (int value) =>
value == null ? 'please select' : null,
value: dropDownValue,
hint: Text('select'),
isDense: true,
onSaved: (int value) => dropDownValue = value,
items: <int>[1, 2, 3]
.map((value) => DropdownMenuItem(
value: value,
child: Text(value.toString()),
))
.toList(),
onChanged: (int value) {
setState(() {
dropDownValue = value;
});
},
),
RaisedButton(
child: Text('Validate'),
onPressed: () {
if (_formKey.currentState.validate()) {
_formKey.currentState.save();
print(dropDownValue);
}
},
),
RaisedButton(
child: Text('Reset'),
onPressed: () {
_formKey.currentState.reset();
setState(() {
dropDownValue = null;
});
},
),
Text('selected = $dropDownValue'),
],
),
),
),
);
}
}
```
1. Click on Validate without selecting anything on Dropdown.
2. It validates correctly.
3. Select any item from dropdown
4. Click on Validate and it validates correctly and prints the value into output window.
5. Click on Reset which clears the dropdown selection.
6. Click on Validate and it doesn't validate (even if nothing is selected on dropdown) and prints the value into output window.
7. Click on Reset again
8. Click on Validate and it validates this time.
**Expected results:**
1. It should validate on Step#6 when I reset the formKey.currentState.reset() on step#5
**Actual results:**
1. It holds value into memory and validation passes even if the dropdown has nothing selected on step#6
2. Don't know why but it works on step#8 (second time reset on step#7)
<details>
<summary>Logs</summary>
```
<!--
[ +26 ms] executing: [C:\flutter\] git -c log.showSignature=false log -n 1 --pretty=format:%H
[ +68 ms] Exit code 0 from: git -c log.showSignature=false log -n 1 --pretty=format:%H
[ ] 0b8abb4724aa590dd0f429683339b1e045a1594d
[ +1 ms] executing: [C:\flutter\] git describe --match v*.*.* --first-parent --long --tags
[ +45 ms] Exit code 0 from: git describe --match v*.*.* --first-parent --long --tags
[ ] v1.12.13+hotfix.8-0-g0b8abb472
[ +8 ms] executing: [C:\flutter\] git rev-parse --abbrev-ref --symbolic @{u}
[ +44 ms] Exit code 0 from: git rev-parse --abbrev-ref --symbolic @{u}
[ +1 ms] origin/stable
[ ] executing: [C:\flutter\] git ls-remote --get-url origin
[ +38 ms] Exit code 0 from: git ls-remote --get-url origin
[ ] https://github.com/flutter/flutter.git
[ +88 ms] executing: [C:\flutter\] git rev-parse --abbrev-ref HEAD
[ +39 ms] Exit code 0 from: git rev-parse --abbrev-ref HEAD
[ +1 ms] stable
[ +114 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe devices -l
[ +32 ms] Exit code 0 from: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe devices -l
[ +1 ms] List of devices attached
emulator-5554 device product:sdk_gphone_x86 model:Android_SDK_built_for_x86 device:generic_x86 transport_id:5
[ +19 ms] C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell getprop
[ +67 ms] Artifact Instance of 'AndroidMavenArtifacts' is not required, skipping update.
[ +5 ms] Artifact Instance of 'AndroidInternalBuildArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'IOSEngineArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'FlutterWebSdk' is not required, skipping update.
[ +4 ms] Artifact Instance of 'WindowsEngineArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'MacOSEngineArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'LinuxEngineArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'LinuxFuchsiaSDKArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'MacOSFuchsiaSDKArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'FlutterRunnerSDKArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'FlutterRunnerDebugSymbols' is not required, skipping update.
[ +156 ms] Generating E:\Practice\Flutter\flutter_bug\android\app\src\main\java\io\flutter\plugins\GeneratedPluginRegistrant.java
[ +36 ms] ro.hardware = ranchu
[ +40 ms] Using hardware rendering with device Android SDK built for x86. If you get graphics artifacts, consider enabling software rendering with "--enable-software-rendering".
[ +24 ms] Launching lib\main.dart on Android SDK built for x86 in debug mode...
[ +9 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\build-tools\29.0.3\aapt dump xmltree E:\Practice\Flutter\flutter_bug\build\app\outputs\apk\app.apk AndroidManifest.xml
[ +15 ms] Exit code 0 from: C:\Users\panka\AppData\Local\Android\sdk\build-tools\29.0.3\aapt dump xmltree E:\Practice\Flutter\flutter_bug\build\app\outputs\apk\app.apk AndroidManifest.xml
[ ] N: android=http://schemas.android.com/apk/res/android
E: manifest (line=2)
A: android:versionCode(0x0101021b)=(type 0x10)0x1
A: android:versionName(0x0101021c)="1.0.0" (Raw: "1.0.0")
A: android:compileSdkVersion(0x01010572)=(type 0x10)0x1c
A: android:compileSdkVersionCodename(0x01010573)="9" (Raw: "9")
A: package="com.example.flutter_bug" (Raw: "com.example.flutter_bug")
A: platformBuildVersionCode=(type 0x10)0x1c
A: platformBuildVersionName=(type 0x10)0x9
E: uses-sdk (line=7)
A: android:minSdkVersion(0x0101020c)=(type 0x10)0x10
A: android:targetSdkVersion(0x01010270)=(type 0x10)0x1c
E: uses-permission (line=14)
A: android:name(0x01010003)="android.permission.INTERNET" (Raw: "android.permission.INTERNET")
E: application (line=22)
A: android:label(0x01010001)="flutter_bug" (Raw: "flutter_bug")
A: android:icon(0x01010002)=@0x7f080000
A: android:name(0x01010003)="io.flutter.app.FlutterApplication" (Raw: "io.flutter.app.FlutterApplication")
A: android:debuggable(0x0101000f)=(type 0x12)0xffffffff
A: android:appComponentFactory(0x0101057a)="androidx.core.app.CoreComponentFactory" (Raw: "androidx.core.app.CoreComponentFactory")
E: activity (line=28)
A: android:theme(0x01010000)=@0x7f0a0000
A: android:name(0x01010003)="com.example.flutter_bug.MainActivity" (Raw: "com.example.flutter_bug.MainActivity")
A: android:launchMode(0x0101001d)=(type 0x10)0x1
A: android:configChanges(0x0101001f)=(type 0x11)0x40003fb4
A: android:windowSoftInputMode(0x0101022b)=(type 0x11)0x10
A: android:hardwareAccelerated(0x010102d3)=(type 0x12)0xffffffff
E: intent-filter (line=35)
E: action (line=36)
A: android:name(0x01010003)="android.intent.action.MAIN" (Raw: "android.intent.action.MAIN")
E: category (line=38)
A: android:name(0x01010003)="android.intent.category.LAUNCHER" (Raw: "android.intent.category.LAUNCHER")
E: meta-data (line=45)
A: android:name(0x01010003)="flutterEmbedding" (Raw: "flutterEmbedding")
A: android:value(0x01010024)=(type 0x10)0x2
[ +9 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell -x logcat -v time -t 1
[ +55 ms] Exit code 0 from: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell -x logcat -v time -t 1
[ ] --------- beginning of main
02-24 12:15:51.211 E/GnssHAL_GnssInterface( 1795): gnssSvStatusCb: b: input svInfo.flags is 8
[ +8 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe version
[ +1 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 logcat -v time -T 02-24 12:15:51.211
[ +45 ms] Android Debug Bridge version 1.0.41
Version 29.0.6-6198805
Installed as C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe
[ +4 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe start-server
[ +62 ms] Building APK
[ +20 ms] Running Gradle task 'assembleDebug'...
[ +2 ms] gradle.properties already sets `android.enableR8`
[ +4 ms] Using gradle from E:\Practice\Flutter\flutter_bug\android\gradlew.bat.
[ +9 ms] executing: C:\Program Files\Android\Android Studio\jre\bin\java -version
[ +105 ms] Exit code 0 from: C:\Program Files\Android\Android Studio\jre\bin\java -version
[ +1 ms] openjdk version "1.8.0_202-release"
OpenJDK Runtime Environment (build 1.8.0_202-release-1483-b03)
OpenJDK 64-Bit Server VM (build 25.202-b03, mixed mode)
[ +3 ms] executing: [E:\Practice\Flutter\flutter_bug\android\] E:\Practice\Flutter\flutter_bug\android\gradlew.bat -Pverbose=true -Ptarget=E:\Practice\Flutter\flutter_bug\lib\main.dart
-Ptrack-widget-creation=true -Pfilesystem-scheme=org-dartlang-root -Ptarget-platform=android-x86 assembleDebug
[+3906 ms] > Task :app:compileFlutterBuildDebug
[ +1 ms] [ +24 ms] executing: [C:\flutter\] git -c log.showSignature=false log -n 1 --pretty=format:%H
[ +1 ms] [ +66 ms] Exit code 0 from: git -c log.showSignature=false log -n 1 --pretty=format:%H
[ +1 ms] [ ] 0b8abb4724aa590dd0f429683339b1e045a1594d
[ +1 ms] [ ] executing: [C:\flutter\] git describe --match v*.*.* --first-parent --long --tags
[ +1 ms] [ +42 ms] Exit code 0 from: git describe --match v*.*.* --first-parent --long --tags
[ +1 ms] [ ] v1.12.13+hotfix.8-0-g0b8abb472
[ +1 ms] [ +7 ms] executing: [C:\flutter\] git rev-parse --abbrev-ref --symbolic @{u}
[ ] [ +41 ms] Exit code 0 from: git rev-parse --abbrev-ref --symbolic @{u}
[ +1 ms] [ ] origin/stable
[ ] [ ] executing: [C:\flutter\] git ls-remote --get-url origin
[ ] [ +41 ms] Exit code 0 from: git ls-remote --get-url origin
[ ] [ ] https://github.com/flutter/flutter.git
[ ] [ +80 ms] executing: [C:\flutter\] git rev-parse --abbrev-ref HEAD
[ ] [ +38 ms] Exit code 0 from: git rev-parse --abbrev-ref HEAD
[ ] [ ] stable
[ +1 ms] [ +21 ms] Artifact Instance of 'AndroidMavenArtifacts' is not required, skipping update.
[ +2 ms] [ ] Artifact Instance of 'AndroidGenSnapshotArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'AndroidInternalBuildArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'IOSEngineArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'FlutterWebSdk' is not required, skipping update.
[ +5 ms] [ +3 ms] Artifact Instance of 'WindowsEngineArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'MacOSEngineArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'LinuxEngineArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'LinuxFuchsiaSDKArtifacts' is not required, skipping update.
[ +2 ms] [ ] Artifact Instance of 'MacOSFuchsiaSDKArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'FlutterRunnerSDKArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'FlutterRunnerDebugSymbols' is not required, skipping update.
[ +1 ms] [ +96 ms] Initializing file store
[ +1 ms] [ +17 ms] Done initializing file store
[ +543 ms] [+1440 ms] kernel_snapshot: Starting due to {InvalidatedReason.inputChanged}
[ +2 ms] [ +13 ms] C:\flutter\bin\cache\dart-sdk\bin\dart.exe C:\flutter\bin\cache\artifacts\engine\windows-x64\frontend_server.dart.snapshot --sdk-root
C:\flutter\bin\cache\artifacts\engine\common\flutter_patched_sdk/ --target=flutter -Ddart.developer.causal_async_stacks=true -Ddart.vm.profile=false -Ddart.vm.product=false
--bytecode-options=source-positions,local-var-info,debugger-stops,instance-field-initializers,keep-unreachable-code,avoid-closure-call-instructions --enable-asserts --track-widget-creation
--no-link-platform --packages E:\Practice\Flutter\flutter_bug\.packages --output-dill E:\Practice\Flutter\flutter_bug\.dart_tool\flutter_build\9461e6759bfd07e069714b57cb612713\app.dill --depfile
E:\Practice\Flutter\flutter_bug\.dart_tool\flutter_build\9461e6759bfd07e069714b57cb612713\kernel_snapshot.d package:flutter_bug/main.dart
[+5598 ms] [+5673 ms] kernel_snapshot: Complete
[ +400 ms] [ +362 ms] debug_android_application: Starting due to {InvalidatedReason.inputChanged, InvalidatedReason.outputMissing}
[ +198 ms] [ +155 ms] debug_android_application: Complete
[ +1 ms] [ +18 ms] Persisting file store
[ +2 ms] [ +15 ms] Done persisting file store
[ +1 ms] [ +4 ms] build succeeded.
[ +1 ms] [ +26 ms] "flutter assemble" took 7,883ms.
[ +193 ms] > Task :app:packLibsflutterBuildDebug UP-TO-DATE
[ +2 ms] > Task :app:preBuild UP-TO-DATE
[ +1 ms] > Task :app:preDebugBuild UP-TO-DATE
[ +1 ms] > Task :app:checkDebugManifest UP-TO-DATE
[ ] > Task :app:generateDebugBuildConfig UP-TO-DATE
[ ] > Task :app:cleanMergeDebugAssets
[ ] > Task :app:compileDebugRenderscript NO-SOURCE
[ +1 ms] > Task :app:compileDebugAidl NO-SOURCE
[ +1 ms] > Task :app:mergeDebugShaders UP-TO-DATE
[ +1 ms] > Task :app:compileDebugShaders UP-TO-DATE
[ ] > Task :app:generateDebugAssets UP-TO-DATE
[ +1 ms] > Task :app:mergeDebugAssets
[ +286 ms] > Task :app:copyFlutterAssetsDebug
[ +2 ms] > Task :app:mainApkListPersistenceDebug UP-TO-DATE
[ +1 ms] > Task :app:generateDebugResValues UP-TO-DATE
[ ] > Task :app:generateDebugResources UP-TO-DATE
[ ] > Task :app:mergeDebugResources UP-TO-DATE
[ +1 ms] > Task :app:createDebugCompatibleScreenManifests UP-TO-DATE
[ +99 ms] > Task :app:processDebugManifest UP-TO-DATE
[ +9 ms] > Task :app:processDebugResources UP-TO-DATE
[ +3 ms] > Task :app:compileDebugKotlin UP-TO-DATE
[ +1 ms] > Task :app:javaPreCompileDebug UP-TO-DATE
[ +3 ms] > Task :app:compileDebugJavaWithJavac UP-TO-DATE
[ +1 ms] > Task :app:compileDebugSources UP-TO-DATE
[ +1 ms] > Task :app:processDebugJavaRes NO-SOURCE
[ ] > Task :app:mergeDebugJavaResource UP-TO-DATE
[ ] > Task :app:checkDebugDuplicateClasses UP-TO-DATE
[ +73 ms] > Task :app:transformClassesWithDexBuilderForDebug UP-TO-DATE
[ +4 ms] > Task :app:desugarDebugFileDependencies UP-TO-DATE
[ +1 ms] > Task :app:mergeExtDexDebug UP-TO-DATE
[ ] > Task :app:mergeDexDebug UP-TO-DATE
[ +2 ms] > Task :app:validateSigningDebug UP-TO-DATE
[ +3 ms] > Task :app:signingConfigWriterDebug UP-TO-DATE
[ +83 ms] > Task :app:mergeDebugJniLibFolders UP-TO-DATE
[ +2 ms] > Task :app:mergeDebugNativeLibs UP-TO-DATE
[ +2 ms] > Task :app:stripDebugDebugSymbols UP-TO-DATE
[ +1 ms] Compatible side by side NDK version was not found.
[+1168 ms] > Task :app:packageDebug
[ +1 ms] > Task :app:assembleDebug
[ +1 ms] BUILD SUCCESSFUL in 12s
[ +1 ms] 30 actionable tasks: 5 executed, 25 up-to-date
[ +363 ms] Running Gradle task 'assembleDebug'... (completed in 13.2s)
[ +45 ms] calculateSha: LocalDirectory: 'E:\Practice\Flutter\flutter_bug\build\app\outputs\apk'/app.apk
[ +63 ms] calculateSha: reading file took 60us
[ +442 ms] calculateSha: computing sha took 440us
[ +14 ms] √ Built build\app\outputs\apk\debug\app-debug.apk.
[ +5 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\build-tools\29.0.3\aapt dump xmltree E:\Practice\Flutter\flutter_bug\build\app\outputs\apk\app.apk AndroidManifest.xml
[ +24 ms] Exit code 0 from: C:\Users\panka\AppData\Local\Android\sdk\build-tools\29.0.3\aapt dump xmltree E:\Practice\Flutter\flutter_bug\build\app\outputs\apk\app.apk AndroidManifest.xml
[ +1 ms] N: android=http://schemas.android.com/apk/res/android
E: manifest (line=2)
A: android:versionCode(0x0101021b)=(type 0x10)0x1
A: android:versionName(0x0101021c)="1.0.0" (Raw: "1.0.0")
A: android:compileSdkVersion(0x01010572)=(type 0x10)0x1c
A: android:compileSdkVersionCodename(0x01010573)="9" (Raw: "9")
A: package="com.example.flutter_bug" (Raw: "com.example.flutter_bug")
A: platformBuildVersionCode=(type 0x10)0x1c
A: platformBuildVersionName=(type 0x10)0x9
E: uses-sdk (line=7)
A: android:minSdkVersion(0x0101020c)=(type 0x10)0x10
A: android:targetSdkVersion(0x01010270)=(type 0x10)0x1c
E: uses-permission (line=14)
A: android:name(0x01010003)="android.permission.INTERNET" (Raw: "android.permission.INTERNET")
E: application (line=22)
A: android:label(0x01010001)="flutter_bug" (Raw: "flutter_bug")
A: android:icon(0x01010002)=@0x7f080000
A: android:name(0x01010003)="io.flutter.app.FlutterApplication" (Raw: "io.flutter.app.FlutterApplication")
A: android:debuggable(0x0101000f)=(type 0x12)0xffffffff
A: android:appComponentFactory(0x0101057a)="androidx.core.app.CoreComponentFactory" (Raw: "androidx.core.app.CoreComponentFactory")
E: activity (line=28)
A: android:theme(0x01010000)=@0x7f0a0000
A: android:name(0x01010003)="com.example.flutter_bug.MainActivity" (Raw: "com.example.flutter_bug.MainActivity")
A: android:launchMode(0x0101001d)=(type 0x10)0x1
A: android:configChanges(0x0101001f)=(type 0x11)0x40003fb4
A: android:windowSoftInputMode(0x0101022b)=(type 0x11)0x10
A: android:hardwareAccelerated(0x010102d3)=(type 0x12)0xffffffff
E: intent-filter (line=35)
E: action (line=36)
A: android:name(0x01010003)="android.intent.action.MAIN" (Raw: "android.intent.action.MAIN")
E: category (line=38)
A: android:name(0x01010003)="android.intent.category.LAUNCHER" (Raw: "android.intent.category.LAUNCHER")
E: meta-data (line=45)
A: android:name(0x01010003)="flutterEmbedding" (Raw: "flutterEmbedding")
A: android:value(0x01010024)=(type 0x10)0x2
[ +3 ms] Stopping app 'app.apk' on Android SDK built for x86.
[ +2 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell am force-stop com.example.flutter_bug
[ +213 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell pm list packages com.example.flutter_bug
[ +153 ms] package:com.example.flutter_bug
[ +4 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell cat /data/local/tmp/sky.com.example.flutter_bug.sha1
[ +84 ms] f164cb2b52e37bbbb49c70213c9c88ab262d4d6d
[ +1 ms] Installing APK.
[ +3 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe version
[ +36 ms] Android Debug Bridge version 1.0.41
Version 29.0.6-6198805
Installed as C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe
[ +2 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe start-server
[ +36 ms] Installing build\app\outputs\apk\app.apk...
[ +2 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 install -t -r E:\Practice\Flutter\flutter_bug\build\app\outputs\apk\app.apk
[+1473 ms] Performing Streamed Install
Success
[ +1 ms] Installing build\app\outputs\apk\app.apk... (completed in 1.5s)
[ +2 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell echo -n a3fbf5f4680e3be140d39832844a41c6625c80a1 >
/data/local/tmp/sky.com.example.flutter_bug.sha1
[ +69 ms] Android SDK built for x86 startApp
[ +4 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell am start -a android.intent.action.RUN -f 0x20000000 --ez enable-background-compilation true --ez
enable-dart-profiling true --ez enable-checked-mode true --ez verify-entry-points true com.example.flutter_bug/com.example.flutter_bug.MainActivity
[ +146 ms] Starting: Intent { act=android.intent.action.RUN flg=0x20000000 cmp=com.example.flutter_bug/.MainActivity (has extras) }
[ +1 ms] Waiting for observatory port to be available...
[ +585 ms] D/FlutterActivity( 8971): Using the launch theme as normal theme.
[ +16 ms] D/FlutterActivityAndFragmentDelegate( 8971): Setting up FlutterEngine.
[ +1 ms] D/FlutterActivityAndFragmentDelegate( 8971): No preferred FlutterEngine was provided. Creating a new FlutterEngine for this FlutterFragment.
[ +759 ms] D/FlutterActivityAndFragmentDelegate( 8971): Attaching FlutterEngine to the Activity that owns this Fragment.
[ +39 ms] D/FlutterView( 8971): Attaching to a FlutterEngine: io.flutter.embedding.engine.FlutterEngine@91e62f0
[ +34 ms] D/FlutterActivityAndFragmentDelegate( 8971): Executing Dart entrypoint: main, and sending initial route: /
[ +57 ms] Observatory URL on device: http://127.0.0.1:42143/eUY3e5lREPo=/
[ +3 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 forward tcp:0 tcp:42143
[ +37 ms] 62681
[ +1 ms] Forwarded host port 62681 to device port 42143 for Observatory
[ +15 ms] Connecting to service protocol: http://127.0.0.1:62681/eUY3e5lREPo=/
[ +354 ms] Successfully connected to service protocol: http://127.0.0.1:62681/eUY3e5lREPo=/
[ +3 ms] Sending to VM service: getVM({})
[ +7 ms] Result: {type: VM, name: vm, architectureBits: 32, hostCPU: Intel(R) Core(TM) i7-6820HQ CPU @ 2.70GHz, operatingSystem: android, targetCPU: ia32, version: 2.7.0 (Fri Dec 6 16:26:51 2019 +0100) on
"android_ia32", _profilerMode: VM, _nativeZoneMemoryUsage: ...
[ +5 ms] Sending to VM service: getIsolate({isolateId: isolates/4334389538851163})
[ +5 ms] Sending to VM service: _flutter.listViews({})
[ +14 ms] Result: {type: FlutterViewList, views: [{type: FlutterView, id: _flutterView/0xf1a32210, isolate: {type: @Isolate, fixedId: true, id: isolates/4334389538851163, name:
main.dart$main-4334389538851163, number: 4334389538851163}}]}
[ +7 ms] DevFS: Creating new filesystem on the device (null)
[ +1 ms] Sending to VM service: _createDevFS({fsName: flutter_bug})
[ +31 ms] D/EGL_emulation( 8971): eglMakeCurrent: 0xf1a7f160: ver 3 1 (tinfo 0xe68bcc80)
[ +61 ms] Result: {type: Isolate, id: isolates/4334389538851163, name: main, number: 4334389538851163, _originNumber: 4334389538851163, startTime: 1582546568700, _heaps: {new: {type: HeapSpace, name: new,
vmName: Scavenger, collections: 2, avgCollectionPeriodMillis...
[ +18 ms] Result: {type: FileSystem, name: flutter_bug, uri: file:///data/user/0/com.example.flutter_bug/code_cache/flutter_bugPAMGPJ/flutter_bug/}
[ +1 ms] DevFS: Created new filesystem on the device (file:///data/user/0/com.example.flutter_bug/code_cache/flutter_bugPAMGPJ/flutter_bug/)
[ +3 ms] Updating assets
[ +123 ms] Syncing files to device Android SDK built for x86...
[ +3 ms] Scanning asset files
[ +3 ms] <- reset
[ ] Compiling dart to kernel with 0 updated files
[ +9 ms] C:\flutter\bin\cache\dart-sdk\bin\dart.exe C:\flutter\bin\cache\artifacts\engine\windows-x64\frontend_server.dart.snapshot --sdk-root
C:\flutter\bin\cache\artifacts\engine\common\flutter_patched_sdk/ --incremental --target=flutter -Ddart.developer.causal_async_stacks=true --output-dill
C:\Users\panka\AppData\Local\Temp\flutter_tool.64ff4266-56ff-11ea-a354-e4a7a0043ed0\app.dill --packages E:\Practice\Flutter\flutter_bug\.packages -Ddart.vm.profile=false -Ddart.vm.product=false
--bytecode-options=source-positions,local-var-info,debugger-stops,instance-field-initializers,keep-unreachable-code,avoid-closure-call-instructions --enable-asserts --track-widget-creation
--filesystem-scheme org-dartlang-root
[ +6 ms] <- compile package:flutter_bug/main.dart
[ +1 ms] I/Choreographer( 8971): Skipped 38 frames! The application may be doing too much work on its main thread.
[ +128 ms] D/EGL_emulation( 8971): eglMakeCurrent: 0xe68126c0: ver 3 1 (tinfo 0xe689efe0)
[+5351 ms] Updating files
[ +150 ms] DevFS: Sync finished
[ +5 ms] Syncing files to device Android SDK built for x86... (completed in 5,658ms, longer than expected)
[ +1 ms] Synced 0.9MB.
[ +2 ms] Sending to VM service: _flutter.listViews({})
[ +18 ms] Result: {type: FlutterViewList, views: [{type: FlutterView, id: _flutterView/0xf1a32210, isolate: {type: @Isolate, fixedId: true, id: isolates/4334389538851163, name:
main.dart$main-4334389538851163, number: 4334389538851163}}]}
[ +2 ms] <- accept
[ +1 ms] Connected to _flutterView/0xf1a32210.
[ +3 ms] � To hot reload changes while running, press "r". To hot restart (and rebuild state), press "R".
[ +2 ms] An Observatory debugger and profiler on Android SDK built for x86 is available at: http://127.0.0.1:62681/eUY3e5lREPo=/
[ +3 ms] For a more detailed help message, press "h". To detach, press "d"; to quit, press "q".
[+11083 ms] I/flutter ( 8971): 2
[+3870 ms] I/flutter ( 8971): 2
-->
<!--
E:\Practice\Flutter\flutter_bug>flutter analyze
Analyzing flutter_bug...
No issues found! (ran in 1.8s)
-->
```
```
E:\Practice\Flutter\flutter_bug>flutter doctor -v
[√] Flutter (Channel stable, v1.12.13+hotfix.8, on Microsoft Windows [Version 10.0.18363.657], locale en-GB)
• Flutter version 1.12.13+hotfix.8 at C:\flutter
• Framework revision 0b8abb4724 (13 days ago), 2020-02-11 11:44:36 -0800
• Engine revision e1e6ced81d
• Dart version 2.7.0
[√] Android toolchain - develop for Android devices (Android SDK version 29.0.3)
• Android SDK at C:\Users\panka\AppData\Local\Android\sdk
• Android NDK location not configured (optional; useful for native profiling support)
• Platform android-29, build-tools 29.0.3
• Java binary at: C:\Program Files\Android\Android Studio\jre\bin\java
• Java version OpenJDK Runtime Environment (build 1.8.0_202-release-1483-b03)
• All Android licenses accepted.
[√] Android Studio (version 3.5)
• Android Studio at C:\Program Files\Android\Android Studio
• Flutter plugin version 43.0.1
• Dart plugin version 191.8593
• Java version OpenJDK Runtime Environment (build 1.8.0_202-release-1483-b03)
[√] VS Code (version 1.42.1)
• VS Code at C:\Users\panka\AppData\Local\Programs\Microsoft VS Code
• Flutter extension version 3.8.1
[√] Connected device (1 available)
• Android SDK built for x86 • emulator-5554 • android-x86 • Android 10 (API 29) (emulator)
• No issues found!
```
</details>
|
1.0
|
formKey.currentState.reset() doesn't reset DropdownButtonFormField memory data - <!-- Thank you for using Flutter!
If you are looking for support, please check out our documentation
or consider asking a question on Stack Overflow:
* https://flutter.dev/
* https://api.flutter.dev/
* https://stackoverflow.com/questions/tagged/flutter?sort=frequent
If you have found a bug or if our documentation doesn't have an answer
to what you're looking for, then fill our the template below. Please read
our guide to filing a bug first: https://flutter.dev/docs/resources/bug-reports
-->
## Steps to Reproduce
<!-- You must include full steps to reproduce so that we can reproduce the problem. -->
Code as following
--------------------
```
import 'package:flutter/material.dart';
void main() => runApp(MyApp());
class MyApp extends StatelessWidget {
@override
Widget build(BuildContext context) {
return MaterialApp(
title: 'Flutter Demo',
theme: ThemeData(
primarySwatch: Colors.blue,
),
home: MyHomePage(title: 'Flutter Demo Home Page'),
);
}
}
class MyHomePage extends StatefulWidget {
MyHomePage({Key key, this.title}) : super(key: key);
final String title;
@override
_MyHomePageState createState() => _MyHomePageState();
}
class _MyHomePageState extends State<MyHomePage> {
final GlobalKey<FormState> _formKey = GlobalKey<FormState>();
int dropDownValue;
@override
Widget build(BuildContext context) {
return Scaffold(
appBar: AppBar(
title: Text(widget.title),
),
body: Form(
key: _formKey,
child: Padding(
padding: const EdgeInsets.all(20.0),
child: Column(
children: <Widget>[
DropdownButtonFormField<int>(
validator: (int value) =>
value == null ? 'please select' : null,
value: dropDownValue,
hint: Text('select'),
isDense: true,
onSaved: (int value) => dropDownValue = value,
items: <int>[1, 2, 3]
.map((value) => DropdownMenuItem(
value: value,
child: Text(value.toString()),
))
.toList(),
onChanged: (int value) {
setState(() {
dropDownValue = value;
});
},
),
RaisedButton(
child: Text('Validate'),
onPressed: () {
if (_formKey.currentState.validate()) {
_formKey.currentState.save();
print(dropDownValue);
}
},
),
RaisedButton(
child: Text('Reset'),
onPressed: () {
_formKey.currentState.reset();
setState(() {
dropDownValue = null;
});
},
),
Text('selected = $dropDownValue'),
],
),
),
),
);
}
}
```
1. Click on Validate without selecting anything on Dropdown.
2. It validates correctly.
3. Select any item from dropdown
4. Click on Validate and it validates correctly and prints the value into output window.
5. Click on Reset which clears the dropdown selection.
6. Click on Validate and it doesn't validate (even if nothing is selected on dropdown) and prints the value into output window.
7. Click on Reset again
8. Click on Validate and it validates this time.
**Expected results:**
1. It should validate on Step#6 when I reset the formKey.currentState.reset() on step#5
**Actual results:**
1. It holds value into memory and validation passes even if the dropdown has nothing selected on step#6
2. Don't know why but it works on step#8 (second time reset on step#7)
<details>
<summary>Logs</summary>
```
<!--
[ +26 ms] executing: [C:\flutter\] git -c log.showSignature=false log -n 1 --pretty=format:%H
[ +68 ms] Exit code 0 from: git -c log.showSignature=false log -n 1 --pretty=format:%H
[ ] 0b8abb4724aa590dd0f429683339b1e045a1594d
[ +1 ms] executing: [C:\flutter\] git describe --match v*.*.* --first-parent --long --tags
[ +45 ms] Exit code 0 from: git describe --match v*.*.* --first-parent --long --tags
[ ] v1.12.13+hotfix.8-0-g0b8abb472
[ +8 ms] executing: [C:\flutter\] git rev-parse --abbrev-ref --symbolic @{u}
[ +44 ms] Exit code 0 from: git rev-parse --abbrev-ref --symbolic @{u}
[ +1 ms] origin/stable
[ ] executing: [C:\flutter\] git ls-remote --get-url origin
[ +38 ms] Exit code 0 from: git ls-remote --get-url origin
[ ] https://github.com/flutter/flutter.git
[ +88 ms] executing: [C:\flutter\] git rev-parse --abbrev-ref HEAD
[ +39 ms] Exit code 0 from: git rev-parse --abbrev-ref HEAD
[ +1 ms] stable
[ +114 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe devices -l
[ +32 ms] Exit code 0 from: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe devices -l
[ +1 ms] List of devices attached
emulator-5554 device product:sdk_gphone_x86 model:Android_SDK_built_for_x86 device:generic_x86 transport_id:5
[ +19 ms] C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell getprop
[ +67 ms] Artifact Instance of 'AndroidMavenArtifacts' is not required, skipping update.
[ +5 ms] Artifact Instance of 'AndroidInternalBuildArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'IOSEngineArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'FlutterWebSdk' is not required, skipping update.
[ +4 ms] Artifact Instance of 'WindowsEngineArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'MacOSEngineArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'LinuxEngineArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'LinuxFuchsiaSDKArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'MacOSFuchsiaSDKArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'FlutterRunnerSDKArtifacts' is not required, skipping update.
[ ] Artifact Instance of 'FlutterRunnerDebugSymbols' is not required, skipping update.
[ +156 ms] Generating E:\Practice\Flutter\flutter_bug\android\app\src\main\java\io\flutter\plugins\GeneratedPluginRegistrant.java
[ +36 ms] ro.hardware = ranchu
[ +40 ms] Using hardware rendering with device Android SDK built for x86. If you get graphics artifacts, consider enabling software rendering with "--enable-software-rendering".
[ +24 ms] Launching lib\main.dart on Android SDK built for x86 in debug mode...
[ +9 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\build-tools\29.0.3\aapt dump xmltree E:\Practice\Flutter\flutter_bug\build\app\outputs\apk\app.apk AndroidManifest.xml
[ +15 ms] Exit code 0 from: C:\Users\panka\AppData\Local\Android\sdk\build-tools\29.0.3\aapt dump xmltree E:\Practice\Flutter\flutter_bug\build\app\outputs\apk\app.apk AndroidManifest.xml
[ ] N: android=http://schemas.android.com/apk/res/android
E: manifest (line=2)
A: android:versionCode(0x0101021b)=(type 0x10)0x1
A: android:versionName(0x0101021c)="1.0.0" (Raw: "1.0.0")
A: android:compileSdkVersion(0x01010572)=(type 0x10)0x1c
A: android:compileSdkVersionCodename(0x01010573)="9" (Raw: "9")
A: package="com.example.flutter_bug" (Raw: "com.example.flutter_bug")
A: platformBuildVersionCode=(type 0x10)0x1c
A: platformBuildVersionName=(type 0x10)0x9
E: uses-sdk (line=7)
A: android:minSdkVersion(0x0101020c)=(type 0x10)0x10
A: android:targetSdkVersion(0x01010270)=(type 0x10)0x1c
E: uses-permission (line=14)
A: android:name(0x01010003)="android.permission.INTERNET" (Raw: "android.permission.INTERNET")
E: application (line=22)
A: android:label(0x01010001)="flutter_bug" (Raw: "flutter_bug")
A: android:icon(0x01010002)=@0x7f080000
A: android:name(0x01010003)="io.flutter.app.FlutterApplication" (Raw: "io.flutter.app.FlutterApplication")
A: android:debuggable(0x0101000f)=(type 0x12)0xffffffff
A: android:appComponentFactory(0x0101057a)="androidx.core.app.CoreComponentFactory" (Raw: "androidx.core.app.CoreComponentFactory")
E: activity (line=28)
A: android:theme(0x01010000)=@0x7f0a0000
A: android:name(0x01010003)="com.example.flutter_bug.MainActivity" (Raw: "com.example.flutter_bug.MainActivity")
A: android:launchMode(0x0101001d)=(type 0x10)0x1
A: android:configChanges(0x0101001f)=(type 0x11)0x40003fb4
A: android:windowSoftInputMode(0x0101022b)=(type 0x11)0x10
A: android:hardwareAccelerated(0x010102d3)=(type 0x12)0xffffffff
E: intent-filter (line=35)
E: action (line=36)
A: android:name(0x01010003)="android.intent.action.MAIN" (Raw: "android.intent.action.MAIN")
E: category (line=38)
A: android:name(0x01010003)="android.intent.category.LAUNCHER" (Raw: "android.intent.category.LAUNCHER")
E: meta-data (line=45)
A: android:name(0x01010003)="flutterEmbedding" (Raw: "flutterEmbedding")
A: android:value(0x01010024)=(type 0x10)0x2
[ +9 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell -x logcat -v time -t 1
[ +55 ms] Exit code 0 from: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell -x logcat -v time -t 1
[ ] --------- beginning of main
02-24 12:15:51.211 E/GnssHAL_GnssInterface( 1795): gnssSvStatusCb: b: input svInfo.flags is 8
[ +8 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe version
[ +1 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 logcat -v time -T 02-24 12:15:51.211
[ +45 ms] Android Debug Bridge version 1.0.41
Version 29.0.6-6198805
Installed as C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe
[ +4 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe start-server
[ +62 ms] Building APK
[ +20 ms] Running Gradle task 'assembleDebug'...
[ +2 ms] gradle.properties already sets `android.enableR8`
[ +4 ms] Using gradle from E:\Practice\Flutter\flutter_bug\android\gradlew.bat.
[ +9 ms] executing: C:\Program Files\Android\Android Studio\jre\bin\java -version
[ +105 ms] Exit code 0 from: C:\Program Files\Android\Android Studio\jre\bin\java -version
[ +1 ms] openjdk version "1.8.0_202-release"
OpenJDK Runtime Environment (build 1.8.0_202-release-1483-b03)
OpenJDK 64-Bit Server VM (build 25.202-b03, mixed mode)
[ +3 ms] executing: [E:\Practice\Flutter\flutter_bug\android\] E:\Practice\Flutter\flutter_bug\android\gradlew.bat -Pverbose=true -Ptarget=E:\Practice\Flutter\flutter_bug\lib\main.dart
-Ptrack-widget-creation=true -Pfilesystem-scheme=org-dartlang-root -Ptarget-platform=android-x86 assembleDebug
[+3906 ms] > Task :app:compileFlutterBuildDebug
[ +1 ms] [ +24 ms] executing: [C:\flutter\] git -c log.showSignature=false log -n 1 --pretty=format:%H
[ +1 ms] [ +66 ms] Exit code 0 from: git -c log.showSignature=false log -n 1 --pretty=format:%H
[ +1 ms] [ ] 0b8abb4724aa590dd0f429683339b1e045a1594d
[ +1 ms] [ ] executing: [C:\flutter\] git describe --match v*.*.* --first-parent --long --tags
[ +1 ms] [ +42 ms] Exit code 0 from: git describe --match v*.*.* --first-parent --long --tags
[ +1 ms] [ ] v1.12.13+hotfix.8-0-g0b8abb472
[ +1 ms] [ +7 ms] executing: [C:\flutter\] git rev-parse --abbrev-ref --symbolic @{u}
[ ] [ +41 ms] Exit code 0 from: git rev-parse --abbrev-ref --symbolic @{u}
[ +1 ms] [ ] origin/stable
[ ] [ ] executing: [C:\flutter\] git ls-remote --get-url origin
[ ] [ +41 ms] Exit code 0 from: git ls-remote --get-url origin
[ ] [ ] https://github.com/flutter/flutter.git
[ ] [ +80 ms] executing: [C:\flutter\] git rev-parse --abbrev-ref HEAD
[ ] [ +38 ms] Exit code 0 from: git rev-parse --abbrev-ref HEAD
[ ] [ ] stable
[ +1 ms] [ +21 ms] Artifact Instance of 'AndroidMavenArtifacts' is not required, skipping update.
[ +2 ms] [ ] Artifact Instance of 'AndroidGenSnapshotArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'AndroidInternalBuildArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'IOSEngineArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'FlutterWebSdk' is not required, skipping update.
[ +5 ms] [ +3 ms] Artifact Instance of 'WindowsEngineArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'MacOSEngineArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'LinuxEngineArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'LinuxFuchsiaSDKArtifacts' is not required, skipping update.
[ +2 ms] [ ] Artifact Instance of 'MacOSFuchsiaSDKArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'FlutterRunnerSDKArtifacts' is not required, skipping update.
[ +1 ms] [ ] Artifact Instance of 'FlutterRunnerDebugSymbols' is not required, skipping update.
[ +1 ms] [ +96 ms] Initializing file store
[ +1 ms] [ +17 ms] Done initializing file store
[ +543 ms] [+1440 ms] kernel_snapshot: Starting due to {InvalidatedReason.inputChanged}
[ +2 ms] [ +13 ms] C:\flutter\bin\cache\dart-sdk\bin\dart.exe C:\flutter\bin\cache\artifacts\engine\windows-x64\frontend_server.dart.snapshot --sdk-root
C:\flutter\bin\cache\artifacts\engine\common\flutter_patched_sdk/ --target=flutter -Ddart.developer.causal_async_stacks=true -Ddart.vm.profile=false -Ddart.vm.product=false
--bytecode-options=source-positions,local-var-info,debugger-stops,instance-field-initializers,keep-unreachable-code,avoid-closure-call-instructions --enable-asserts --track-widget-creation
--no-link-platform --packages E:\Practice\Flutter\flutter_bug\.packages --output-dill E:\Practice\Flutter\flutter_bug\.dart_tool\flutter_build\9461e6759bfd07e069714b57cb612713\app.dill --depfile
E:\Practice\Flutter\flutter_bug\.dart_tool\flutter_build\9461e6759bfd07e069714b57cb612713\kernel_snapshot.d package:flutter_bug/main.dart
[+5598 ms] [+5673 ms] kernel_snapshot: Complete
[ +400 ms] [ +362 ms] debug_android_application: Starting due to {InvalidatedReason.inputChanged, InvalidatedReason.outputMissing}
[ +198 ms] [ +155 ms] debug_android_application: Complete
[ +1 ms] [ +18 ms] Persisting file store
[ +2 ms] [ +15 ms] Done persisting file store
[ +1 ms] [ +4 ms] build succeeded.
[ +1 ms] [ +26 ms] "flutter assemble" took 7,883ms.
[ +193 ms] > Task :app:packLibsflutterBuildDebug UP-TO-DATE
[ +2 ms] > Task :app:preBuild UP-TO-DATE
[ +1 ms] > Task :app:preDebugBuild UP-TO-DATE
[ +1 ms] > Task :app:checkDebugManifest UP-TO-DATE
[ ] > Task :app:generateDebugBuildConfig UP-TO-DATE
[ ] > Task :app:cleanMergeDebugAssets
[ ] > Task :app:compileDebugRenderscript NO-SOURCE
[ +1 ms] > Task :app:compileDebugAidl NO-SOURCE
[ +1 ms] > Task :app:mergeDebugShaders UP-TO-DATE
[ +1 ms] > Task :app:compileDebugShaders UP-TO-DATE
[ ] > Task :app:generateDebugAssets UP-TO-DATE
[ +1 ms] > Task :app:mergeDebugAssets
[ +286 ms] > Task :app:copyFlutterAssetsDebug
[ +2 ms] > Task :app:mainApkListPersistenceDebug UP-TO-DATE
[ +1 ms] > Task :app:generateDebugResValues UP-TO-DATE
[ ] > Task :app:generateDebugResources UP-TO-DATE
[ ] > Task :app:mergeDebugResources UP-TO-DATE
[ +1 ms] > Task :app:createDebugCompatibleScreenManifests UP-TO-DATE
[ +99 ms] > Task :app:processDebugManifest UP-TO-DATE
[ +9 ms] > Task :app:processDebugResources UP-TO-DATE
[ +3 ms] > Task :app:compileDebugKotlin UP-TO-DATE
[ +1 ms] > Task :app:javaPreCompileDebug UP-TO-DATE
[ +3 ms] > Task :app:compileDebugJavaWithJavac UP-TO-DATE
[ +1 ms] > Task :app:compileDebugSources UP-TO-DATE
[ +1 ms] > Task :app:processDebugJavaRes NO-SOURCE
[ ] > Task :app:mergeDebugJavaResource UP-TO-DATE
[ ] > Task :app:checkDebugDuplicateClasses UP-TO-DATE
[ +73 ms] > Task :app:transformClassesWithDexBuilderForDebug UP-TO-DATE
[ +4 ms] > Task :app:desugarDebugFileDependencies UP-TO-DATE
[ +1 ms] > Task :app:mergeExtDexDebug UP-TO-DATE
[ ] > Task :app:mergeDexDebug UP-TO-DATE
[ +2 ms] > Task :app:validateSigningDebug UP-TO-DATE
[ +3 ms] > Task :app:signingConfigWriterDebug UP-TO-DATE
[ +83 ms] > Task :app:mergeDebugJniLibFolders UP-TO-DATE
[ +2 ms] > Task :app:mergeDebugNativeLibs UP-TO-DATE
[ +2 ms] > Task :app:stripDebugDebugSymbols UP-TO-DATE
[ +1 ms] Compatible side by side NDK version was not found.
[+1168 ms] > Task :app:packageDebug
[ +1 ms] > Task :app:assembleDebug
[ +1 ms] BUILD SUCCESSFUL in 12s
[ +1 ms] 30 actionable tasks: 5 executed, 25 up-to-date
[ +363 ms] Running Gradle task 'assembleDebug'... (completed in 13.2s)
[ +45 ms] calculateSha: LocalDirectory: 'E:\Practice\Flutter\flutter_bug\build\app\outputs\apk'/app.apk
[ +63 ms] calculateSha: reading file took 60us
[ +442 ms] calculateSha: computing sha took 440us
[ +14 ms] √ Built build\app\outputs\apk\debug\app-debug.apk.
[ +5 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\build-tools\29.0.3\aapt dump xmltree E:\Practice\Flutter\flutter_bug\build\app\outputs\apk\app.apk AndroidManifest.xml
[ +24 ms] Exit code 0 from: C:\Users\panka\AppData\Local\Android\sdk\build-tools\29.0.3\aapt dump xmltree E:\Practice\Flutter\flutter_bug\build\app\outputs\apk\app.apk AndroidManifest.xml
[ +1 ms] N: android=http://schemas.android.com/apk/res/android
E: manifest (line=2)
A: android:versionCode(0x0101021b)=(type 0x10)0x1
A: android:versionName(0x0101021c)="1.0.0" (Raw: "1.0.0")
A: android:compileSdkVersion(0x01010572)=(type 0x10)0x1c
A: android:compileSdkVersionCodename(0x01010573)="9" (Raw: "9")
A: package="com.example.flutter_bug" (Raw: "com.example.flutter_bug")
A: platformBuildVersionCode=(type 0x10)0x1c
A: platformBuildVersionName=(type 0x10)0x9
E: uses-sdk (line=7)
A: android:minSdkVersion(0x0101020c)=(type 0x10)0x10
A: android:targetSdkVersion(0x01010270)=(type 0x10)0x1c
E: uses-permission (line=14)
A: android:name(0x01010003)="android.permission.INTERNET" (Raw: "android.permission.INTERNET")
E: application (line=22)
A: android:label(0x01010001)="flutter_bug" (Raw: "flutter_bug")
A: android:icon(0x01010002)=@0x7f080000
A: android:name(0x01010003)="io.flutter.app.FlutterApplication" (Raw: "io.flutter.app.FlutterApplication")
A: android:debuggable(0x0101000f)=(type 0x12)0xffffffff
A: android:appComponentFactory(0x0101057a)="androidx.core.app.CoreComponentFactory" (Raw: "androidx.core.app.CoreComponentFactory")
E: activity (line=28)
A: android:theme(0x01010000)=@0x7f0a0000
A: android:name(0x01010003)="com.example.flutter_bug.MainActivity" (Raw: "com.example.flutter_bug.MainActivity")
A: android:launchMode(0x0101001d)=(type 0x10)0x1
A: android:configChanges(0x0101001f)=(type 0x11)0x40003fb4
A: android:windowSoftInputMode(0x0101022b)=(type 0x11)0x10
A: android:hardwareAccelerated(0x010102d3)=(type 0x12)0xffffffff
E: intent-filter (line=35)
E: action (line=36)
A: android:name(0x01010003)="android.intent.action.MAIN" (Raw: "android.intent.action.MAIN")
E: category (line=38)
A: android:name(0x01010003)="android.intent.category.LAUNCHER" (Raw: "android.intent.category.LAUNCHER")
E: meta-data (line=45)
A: android:name(0x01010003)="flutterEmbedding" (Raw: "flutterEmbedding")
A: android:value(0x01010024)=(type 0x10)0x2
[ +3 ms] Stopping app 'app.apk' on Android SDK built for x86.
[ +2 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell am force-stop com.example.flutter_bug
[ +213 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell pm list packages com.example.flutter_bug
[ +153 ms] package:com.example.flutter_bug
[ +4 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell cat /data/local/tmp/sky.com.example.flutter_bug.sha1
[ +84 ms] f164cb2b52e37bbbb49c70213c9c88ab262d4d6d
[ +1 ms] Installing APK.
[ +3 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe version
[ +36 ms] Android Debug Bridge version 1.0.41
Version 29.0.6-6198805
Installed as C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe
[ +2 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe start-server
[ +36 ms] Installing build\app\outputs\apk\app.apk...
[ +2 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 install -t -r E:\Practice\Flutter\flutter_bug\build\app\outputs\apk\app.apk
[+1473 ms] Performing Streamed Install
Success
[ +1 ms] Installing build\app\outputs\apk\app.apk... (completed in 1.5s)
[ +2 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell echo -n a3fbf5f4680e3be140d39832844a41c6625c80a1 >
/data/local/tmp/sky.com.example.flutter_bug.sha1
[ +69 ms] Android SDK built for x86 startApp
[ +4 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 shell am start -a android.intent.action.RUN -f 0x20000000 --ez enable-background-compilation true --ez
enable-dart-profiling true --ez enable-checked-mode true --ez verify-entry-points true com.example.flutter_bug/com.example.flutter_bug.MainActivity
[ +146 ms] Starting: Intent { act=android.intent.action.RUN flg=0x20000000 cmp=com.example.flutter_bug/.MainActivity (has extras) }
[ +1 ms] Waiting for observatory port to be available...
[ +585 ms] D/FlutterActivity( 8971): Using the launch theme as normal theme.
[ +16 ms] D/FlutterActivityAndFragmentDelegate( 8971): Setting up FlutterEngine.
[ +1 ms] D/FlutterActivityAndFragmentDelegate( 8971): No preferred FlutterEngine was provided. Creating a new FlutterEngine for this FlutterFragment.
[ +759 ms] D/FlutterActivityAndFragmentDelegate( 8971): Attaching FlutterEngine to the Activity that owns this Fragment.
[ +39 ms] D/FlutterView( 8971): Attaching to a FlutterEngine: io.flutter.embedding.engine.FlutterEngine@91e62f0
[ +34 ms] D/FlutterActivityAndFragmentDelegate( 8971): Executing Dart entrypoint: main, and sending initial route: /
[ +57 ms] Observatory URL on device: http://127.0.0.1:42143/eUY3e5lREPo=/
[ +3 ms] executing: C:\Users\panka\AppData\Local\Android\sdk\platform-tools\adb.exe -s emulator-5554 forward tcp:0 tcp:42143
[ +37 ms] 62681
[ +1 ms] Forwarded host port 62681 to device port 42143 for Observatory
[ +15 ms] Connecting to service protocol: http://127.0.0.1:62681/eUY3e5lREPo=/
[ +354 ms] Successfully connected to service protocol: http://127.0.0.1:62681/eUY3e5lREPo=/
[ +3 ms] Sending to VM service: getVM({})
[ +7 ms] Result: {type: VM, name: vm, architectureBits: 32, hostCPU: Intel(R) Core(TM) i7-6820HQ CPU @ 2.70GHz, operatingSystem: android, targetCPU: ia32, version: 2.7.0 (Fri Dec 6 16:26:51 2019 +0100) on
"android_ia32", _profilerMode: VM, _nativeZoneMemoryUsage: ...
[ +5 ms] Sending to VM service: getIsolate({isolateId: isolates/4334389538851163})
[ +5 ms] Sending to VM service: _flutter.listViews({})
[ +14 ms] Result: {type: FlutterViewList, views: [{type: FlutterView, id: _flutterView/0xf1a32210, isolate: {type: @Isolate, fixedId: true, id: isolates/4334389538851163, name:
main.dart$main-4334389538851163, number: 4334389538851163}}]}
[ +7 ms] DevFS: Creating new filesystem on the device (null)
[ +1 ms] Sending to VM service: _createDevFS({fsName: flutter_bug})
[ +31 ms] D/EGL_emulation( 8971): eglMakeCurrent: 0xf1a7f160: ver 3 1 (tinfo 0xe68bcc80)
[ +61 ms] Result: {type: Isolate, id: isolates/4334389538851163, name: main, number: 4334389538851163, _originNumber: 4334389538851163, startTime: 1582546568700, _heaps: {new: {type: HeapSpace, name: new,
vmName: Scavenger, collections: 2, avgCollectionPeriodMillis...
[ +18 ms] Result: {type: FileSystem, name: flutter_bug, uri: file:///data/user/0/com.example.flutter_bug/code_cache/flutter_bugPAMGPJ/flutter_bug/}
[ +1 ms] DevFS: Created new filesystem on the device (file:///data/user/0/com.example.flutter_bug/code_cache/flutter_bugPAMGPJ/flutter_bug/)
[ +3 ms] Updating assets
[ +123 ms] Syncing files to device Android SDK built for x86...
[ +3 ms] Scanning asset files
[ +3 ms] <- reset
[ ] Compiling dart to kernel with 0 updated files
[ +9 ms] C:\flutter\bin\cache\dart-sdk\bin\dart.exe C:\flutter\bin\cache\artifacts\engine\windows-x64\frontend_server.dart.snapshot --sdk-root
C:\flutter\bin\cache\artifacts\engine\common\flutter_patched_sdk/ --incremental --target=flutter -Ddart.developer.causal_async_stacks=true --output-dill
C:\Users\panka\AppData\Local\Temp\flutter_tool.64ff4266-56ff-11ea-a354-e4a7a0043ed0\app.dill --packages E:\Practice\Flutter\flutter_bug\.packages -Ddart.vm.profile=false -Ddart.vm.product=false
--bytecode-options=source-positions,local-var-info,debugger-stops,instance-field-initializers,keep-unreachable-code,avoid-closure-call-instructions --enable-asserts --track-widget-creation
--filesystem-scheme org-dartlang-root
[ +6 ms] <- compile package:flutter_bug/main.dart
[ +1 ms] I/Choreographer( 8971): Skipped 38 frames! The application may be doing too much work on its main thread.
[ +128 ms] D/EGL_emulation( 8971): eglMakeCurrent: 0xe68126c0: ver 3 1 (tinfo 0xe689efe0)
[+5351 ms] Updating files
[ +150 ms] DevFS: Sync finished
[ +5 ms] Syncing files to device Android SDK built for x86... (completed in 5,658ms, longer than expected)
[ +1 ms] Synced 0.9MB.
[ +2 ms] Sending to VM service: _flutter.listViews({})
[ +18 ms] Result: {type: FlutterViewList, views: [{type: FlutterView, id: _flutterView/0xf1a32210, isolate: {type: @Isolate, fixedId: true, id: isolates/4334389538851163, name:
main.dart$main-4334389538851163, number: 4334389538851163}}]}
[ +2 ms] <- accept
[ +1 ms] Connected to _flutterView/0xf1a32210.
[ +3 ms] � To hot reload changes while running, press "r". To hot restart (and rebuild state), press "R".
[ +2 ms] An Observatory debugger and profiler on Android SDK built for x86 is available at: http://127.0.0.1:62681/eUY3e5lREPo=/
[ +3 ms] For a more detailed help message, press "h". To detach, press "d"; to quit, press "q".
[+11083 ms] I/flutter ( 8971): 2
[+3870 ms] I/flutter ( 8971): 2
-->
<!--
E:\Practice\Flutter\flutter_bug>flutter analyze
Analyzing flutter_bug...
No issues found! (ran in 1.8s)
-->
```
```
E:\Practice\Flutter\flutter_bug>flutter doctor -v
[√] Flutter (Channel stable, v1.12.13+hotfix.8, on Microsoft Windows [Version 10.0.18363.657], locale en-GB)
• Flutter version 1.12.13+hotfix.8 at C:\flutter
• Framework revision 0b8abb4724 (13 days ago), 2020-02-11 11:44:36 -0800
• Engine revision e1e6ced81d
• Dart version 2.7.0
[√] Android toolchain - develop for Android devices (Android SDK version 29.0.3)
• Android SDK at C:\Users\panka\AppData\Local\Android\sdk
• Android NDK location not configured (optional; useful for native profiling support)
• Platform android-29, build-tools 29.0.3
• Java binary at: C:\Program Files\Android\Android Studio\jre\bin\java
• Java version OpenJDK Runtime Environment (build 1.8.0_202-release-1483-b03)
• All Android licenses accepted.
[√] Android Studio (version 3.5)
• Android Studio at C:\Program Files\Android\Android Studio
• Flutter plugin version 43.0.1
• Dart plugin version 191.8593
• Java version OpenJDK Runtime Environment (build 1.8.0_202-release-1483-b03)
[√] VS Code (version 1.42.1)
• VS Code at C:\Users\panka\AppData\Local\Programs\Microsoft VS Code
• Flutter extension version 3.8.1
[√] Connected device (1 available)
• Android SDK built for x86 • emulator-5554 • android-x86 • Android 10 (API 29) (emulator)
• No issues found!
```
</details>
|
non_process
|
formkey currentstate reset doesn t reset dropdownbuttonformfield memory data thank you for using flutter if you are looking for support please check out our documentation or consider asking a question on stack overflow if you have found a bug or if our documentation doesn t have an answer to what you re looking for then fill our the template below please read our guide to filing a bug first steps to reproduce code as following import package flutter material dart void main runapp myapp class myapp extends statelesswidget override widget build buildcontext context return materialapp title flutter demo theme themedata primaryswatch colors blue home myhomepage title flutter demo home page class myhomepage extends statefulwidget myhomepage key key this title super key key final string title override myhomepagestate createstate myhomepagestate class myhomepagestate extends state final globalkey formkey globalkey int dropdownvalue override widget build buildcontext context return scaffold appbar appbar title text widget title body form key formkey child padding padding const edgeinsets all child column children dropdownbuttonformfield validator int value value null please select null value dropdownvalue hint text select isdense true onsaved int value dropdownvalue value items map value dropdownmenuitem value value child text value tostring tolist onchanged int value setstate dropdownvalue value raisedbutton child text validate onpressed if formkey currentstate validate formkey currentstate save print dropdownvalue raisedbutton child text reset onpressed formkey currentstate reset setstate dropdownvalue null text selected dropdownvalue click on validate without selecting anything on dropdown it validates correctly select any item from dropdown click on validate and it validates correctly and prints the value into output window click on reset which clears the dropdown selection click on validate and it doesn t validate even if nothing is selected on dropdown and prints the value into output window click on reset again click on validate and it validates this time expected results it should validate on step when i reset the formkey currentstate reset on step actual results it holds value into memory and validation passes even if the dropdown has nothing selected on step don t know why but it works on step second time reset on step logs executing git c log showsignature false log n pretty format h exit code from git c log showsignature false log n pretty format h executing git describe match v first parent long tags exit code from git describe match v first parent long tags hotfix executing git rev parse abbrev ref symbolic u exit code from git rev parse abbrev ref symbolic u origin stable executing git ls remote get url origin exit code from git ls remote get url origin executing git rev parse abbrev ref head exit code from git rev parse abbrev ref head stable executing c users panka appdata local android sdk platform tools adb exe devices l exit code from c users panka appdata local android sdk platform tools adb exe devices l list of devices attached emulator device product sdk gphone model android sdk built for device generic transport id c users panka appdata local android sdk platform tools adb exe s emulator shell getprop artifact instance of androidmavenartifacts is not required skipping update artifact instance of androidinternalbuildartifacts is not required skipping update artifact instance of iosengineartifacts is not required skipping update artifact instance of flutterwebsdk is not required skipping update artifact instance of windowsengineartifacts is not required skipping update artifact instance of macosengineartifacts is not required skipping update artifact instance of linuxengineartifacts is not required skipping update artifact instance of linuxfuchsiasdkartifacts is not required skipping update artifact instance of macosfuchsiasdkartifacts is not required skipping update artifact instance of flutterrunnersdkartifacts is not required skipping update artifact instance of flutterrunnerdebugsymbols is not required skipping update generating e practice flutter flutter bug android app src main java io flutter plugins generatedpluginregistrant java ro hardware ranchu using hardware rendering with device android sdk built for if you get graphics artifacts consider enabling software rendering with enable software rendering launching lib main dart on android sdk built for in debug mode executing c users panka appdata local android sdk build tools aapt dump xmltree e practice flutter flutter bug build app outputs apk app apk androidmanifest xml exit code from c users panka appdata local android sdk build tools aapt dump xmltree e practice flutter flutter bug build app outputs apk app apk androidmanifest xml n android e manifest line a android versioncode type a android versionname raw a android compilesdkversion type a android compilesdkversioncodename raw a package com example flutter bug raw com example flutter bug a platformbuildversioncode type a platformbuildversionname type e uses sdk line a android minsdkversion type a android targetsdkversion type e uses permission line a android name android permission internet raw android permission internet e application line a android label flutter bug raw flutter bug a android icon a android name io flutter app flutterapplication raw io flutter app flutterapplication a android debuggable type a android appcomponentfactory androidx core app corecomponentfactory raw androidx core app corecomponentfactory e activity line a android theme a android name com example flutter bug mainactivity raw com example flutter bug mainactivity a android launchmode type a android configchanges type a android windowsoftinputmode type a android hardwareaccelerated type e intent filter line e action line a android name android intent action main raw android intent action main e category line a android name android intent category launcher raw android intent category launcher e meta data line a android name flutterembedding raw flutterembedding a android value type executing c users panka appdata local android sdk platform tools adb exe s emulator shell x logcat v time t exit code from c users panka appdata local android sdk platform tools adb exe s emulator shell x logcat v time t beginning of main e gnsshal gnssinterface gnsssvstatuscb b input svinfo flags is executing c users panka appdata local android sdk platform tools adb exe version executing c users panka appdata local android sdk platform tools adb exe s emulator logcat v time t android debug bridge version version installed as c users panka appdata local android sdk platform tools adb exe executing c users panka appdata local android sdk platform tools adb exe start server building apk running gradle task assembledebug gradle properties already sets android using gradle from e practice flutter flutter bug android gradlew bat executing c program files android android studio jre bin java version exit code from c program files android android studio jre bin java version openjdk version release openjdk runtime environment build release openjdk bit server vm build mixed mode executing e practice flutter flutter bug android gradlew bat pverbose true ptarget e practice flutter flutter bug lib main dart ptrack widget creation true pfilesystem scheme org dartlang root ptarget platform android assembledebug task app compileflutterbuilddebug executing git c log showsignature false log n pretty format h exit code from git c log showsignature false log n pretty format h executing git describe match v first parent long tags exit code from git describe match v first parent long tags hotfix executing git rev parse abbrev ref symbolic u exit code from git rev parse abbrev ref symbolic u origin stable executing git ls remote get url origin exit code from git ls remote get url origin executing git rev parse abbrev ref head exit code from git rev parse abbrev ref head stable artifact instance of androidmavenartifacts is not required skipping update artifact instance of androidgensnapshotartifacts is not required skipping update artifact instance of androidinternalbuildartifacts is not required skipping update artifact instance of iosengineartifacts is not required skipping update artifact instance of flutterwebsdk is not required skipping update artifact instance of windowsengineartifacts is not required skipping update artifact instance of macosengineartifacts is not required skipping update artifact instance of linuxengineartifacts is not required skipping update artifact instance of linuxfuchsiasdkartifacts is not required skipping update artifact instance of macosfuchsiasdkartifacts is not required skipping update artifact instance of flutterrunnersdkartifacts is not required skipping update artifact instance of flutterrunnerdebugsymbols is not required skipping update initializing file store done initializing file store kernel snapshot starting due to invalidatedreason inputchanged c flutter bin cache dart sdk bin dart exe c flutter bin cache artifacts engine windows frontend server dart snapshot sdk root c flutter bin cache artifacts engine common flutter patched sdk target flutter ddart developer causal async stacks true ddart vm profile false ddart vm product false bytecode options source positions local var info debugger stops instance field initializers keep unreachable code avoid closure call instructions enable asserts track widget creation no link platform packages e practice flutter flutter bug packages output dill e practice flutter flutter bug dart tool flutter build app dill depfile e practice flutter flutter bug dart tool flutter build kernel snapshot d package flutter bug main dart kernel snapshot complete debug android application starting due to invalidatedreason inputchanged invalidatedreason outputmissing debug android application complete persisting file store done persisting file store build succeeded flutter assemble took task app packlibsflutterbuilddebug up to date task app prebuild up to date task app predebugbuild up to date task app checkdebugmanifest up to date task app generatedebugbuildconfig up to date task app cleanmergedebugassets task app compiledebugrenderscript no source task app compiledebugaidl no source task app mergedebugshaders up to date task app compiledebugshaders up to date task app generatedebugassets up to date task app mergedebugassets task app copyflutterassetsdebug task app mainapklistpersistencedebug up to date task app generatedebugresvalues up to date task app generatedebugresources up to date task app mergedebugresources up to date task app createdebugcompatiblescreenmanifests up to date task app processdebugmanifest up to date task app processdebugresources up to date task app compiledebugkotlin up to date task app javaprecompiledebug up to date task app compiledebugjavawithjavac up to date task app compiledebugsources up to date task app processdebugjavares no source task app mergedebugjavaresource up to date task app checkdebugduplicateclasses up to date task app transformclasseswithdexbuilderfordebug up to date task app desugardebugfiledependencies up to date task app mergeextdexdebug up to date task app mergedexdebug up to date task app validatesigningdebug up to date task app signingconfigwriterdebug up to date task app mergedebugjnilibfolders up to date task app mergedebugnativelibs up to date task app stripdebugdebugsymbols up to date compatible side by side ndk version was not found task app packagedebug task app assembledebug build successful in actionable tasks executed up to date running gradle task assembledebug completed in calculatesha localdirectory e practice flutter flutter bug build app outputs apk app apk calculatesha reading file took calculatesha computing sha took √ built build app outputs apk debug app debug apk executing c users panka appdata local android sdk build tools aapt dump xmltree e practice flutter flutter bug build app outputs apk app apk androidmanifest xml exit code from c users panka appdata local android sdk build tools aapt dump xmltree e practice flutter flutter bug build app outputs apk app apk androidmanifest xml n android e manifest line a android versioncode type a android versionname raw a android compilesdkversion type a android compilesdkversioncodename raw a package com example flutter bug raw com example flutter bug a platformbuildversioncode type a platformbuildversionname type e uses sdk line a android minsdkversion type a android targetsdkversion type e uses permission line a android name android permission internet raw android permission internet e application line a android label flutter bug raw flutter bug a android icon a android name io flutter app flutterapplication raw io flutter app flutterapplication a android debuggable type a android appcomponentfactory androidx core app corecomponentfactory raw androidx core app corecomponentfactory e activity line a android theme a android name com example flutter bug mainactivity raw com example flutter bug mainactivity a android launchmode type a android configchanges type a android windowsoftinputmode type a android hardwareaccelerated type e intent filter line e action line a android name android intent action main raw android intent action main e category line a android name android intent category launcher raw android intent category launcher e meta data line a android name flutterembedding raw flutterembedding a android value type stopping app app apk on android sdk built for executing c users panka appdata local android sdk platform tools adb exe s emulator shell am force stop com example flutter bug executing c users panka appdata local android sdk platform tools adb exe s emulator shell pm list packages com example flutter bug package com example flutter bug executing c users panka appdata local android sdk platform tools adb exe s emulator shell cat data local tmp sky com example flutter bug installing apk executing c users panka appdata local android sdk platform tools adb exe version android debug bridge version version installed as c users panka appdata local android sdk platform tools adb exe executing c users panka appdata local android sdk platform tools adb exe start server installing build app outputs apk app apk executing c users panka appdata local android sdk platform tools adb exe s emulator install t r e practice flutter flutter bug build app outputs apk app apk performing streamed install success installing build app outputs apk app apk completed in executing c users panka appdata local android sdk platform tools adb exe s emulator shell echo n data local tmp sky com example flutter bug android sdk built for startapp executing c users panka appdata local android sdk platform tools adb exe s emulator shell am start a android intent action run f ez enable background compilation true ez enable dart profiling true ez enable checked mode true ez verify entry points true com example flutter bug com example flutter bug mainactivity starting intent act android intent action run flg cmp com example flutter bug mainactivity has extras waiting for observatory port to be available d flutteractivity using the launch theme as normal theme d flutteractivityandfragmentdelegate setting up flutterengine d flutteractivityandfragmentdelegate no preferred flutterengine was provided creating a new flutterengine for this flutterfragment d flutteractivityandfragmentdelegate attaching flutterengine to the activity that owns this fragment d flutterview attaching to a flutterengine io flutter embedding engine flutterengine d flutteractivityandfragmentdelegate executing dart entrypoint main and sending initial route observatory url on device executing c users panka appdata local android sdk platform tools adb exe s emulator forward tcp tcp forwarded host port to device port for observatory connecting to service protocol successfully connected to service protocol sending to vm service getvm result type vm name vm architecturebits hostcpu intel r core tm cpu operatingsystem android targetcpu version fri dec on android profilermode vm nativezonememoryusage sending to vm service getisolate isolateid isolates sending to vm service flutter listviews result type flutterviewlist views type flutterview id flutterview isolate type isolate fixedid true id isolates name main dart main number devfs creating new filesystem on the device null sending to vm service createdevfs fsname flutter bug d egl emulation eglmakecurrent ver tinfo result type isolate id isolates name main number originnumber starttime heaps new type heapspace name new vmname scavenger collections avgcollectionperiodmillis result type filesystem name flutter bug uri file data user com example flutter bug code cache flutter bugpamgpj flutter bug devfs created new filesystem on the device file data user com example flutter bug code cache flutter bugpamgpj flutter bug updating assets syncing files to device android sdk built for scanning asset files reset compiling dart to kernel with updated files c flutter bin cache dart sdk bin dart exe c flutter bin cache artifacts engine windows frontend server dart snapshot sdk root c flutter bin cache artifacts engine common flutter patched sdk incremental target flutter ddart developer causal async stacks true output dill c users panka appdata local temp flutter tool app dill packages e practice flutter flutter bug packages ddart vm profile false ddart vm product false bytecode options source positions local var info debugger stops instance field initializers keep unreachable code avoid closure call instructions enable asserts track widget creation filesystem scheme org dartlang root compile package flutter bug main dart i choreographer skipped frames the application may be doing too much work on its main thread d egl emulation eglmakecurrent ver tinfo updating files devfs sync finished syncing files to device android sdk built for completed in longer than expected synced sending to vm service flutter listviews result type flutterviewlist views type flutterview id flutterview isolate type isolate fixedid true id isolates name main dart main number accept connected to flutterview � to hot reload changes while running press r to hot restart and rebuild state press r an observatory debugger and profiler on android sdk built for is available at for a more detailed help message press h to detach press d to quit press q i flutter i flutter e practice flutter flutter bug flutter analyze analyzing flutter bug no issues found ran in e practice flutter flutter bug flutter doctor v flutter channel stable hotfix on microsoft windows locale en gb • flutter version hotfix at c flutter • framework revision days ago • engine revision • dart version android toolchain develop for android devices android sdk version • android sdk at c users panka appdata local android sdk • android ndk location not configured optional useful for native profiling support • platform android build tools • java binary at c program files android android studio jre bin java • java version openjdk runtime environment build release • all android licenses accepted android studio version • android studio at c program files android android studio • flutter plugin version • dart plugin version • java version openjdk runtime environment build release vs code version • vs code at c users panka appdata local programs microsoft vs code • flutter extension version connected device available • android sdk built for • emulator • android • android api emulator • no issues found
| 0
|
9,770
| 12,754,361,666
|
IssuesEvent
|
2020-06-28 04:54:22
|
bisq-network/bisq
|
https://api.github.com/repos/bisq-network/bisq
|
closed
|
Bisq does not verify taker's funding TX is broadcast to Bitcoin P2P network
|
a:discussion in:trade-process was:dropped
|
### Description
Bisq allows its offers to be taken without verifying the taker's funding TX was actually broadcast to the Bitcoin P2P network.
#### Version
1.1.7
### Steps to reproduce
1) Configure Alice's Bisq node to only connect to a single a Bitcoin node that does not relay transactions to the Bitcoin P2P network (i.e. configure it to have zero peers)
2) Create an offer on Bob's node
3) Take the offer on Alice's node
### Expected behaviour
Alice's node should not attempt to take the offer if it cannot broadcast a valid funding TX to the Bitcoin P2P network.
Bob's node should not allow Alice to accept his offer if Bob cannot verify that Alice's funding TXID has been seen by at least 1 of his Bitcoin peers.
### Actual behaviour
Bob's offer gets accepted state, but the trade can never begin because Alice's funding TX was not broadcast to the Bitcoin P2P network. Additionally, Bob's funds are now stuck in limbo.
### Screenshots
<img width="1196" alt="Screen Shot 2019-10-27 at 19 38 13" src="https://user-images.githubusercontent.com/232186/67633385-a2495600-f8f2-11e9-8d86-645ca873db1b.png">
|
1.0
|
Bisq does not verify taker's funding TX is broadcast to Bitcoin P2P network - ### Description
Bisq allows its offers to be taken without verifying the taker's funding TX was actually broadcast to the Bitcoin P2P network.
#### Version
1.1.7
### Steps to reproduce
1) Configure Alice's Bisq node to only connect to a single a Bitcoin node that does not relay transactions to the Bitcoin P2P network (i.e. configure it to have zero peers)
2) Create an offer on Bob's node
3) Take the offer on Alice's node
### Expected behaviour
Alice's node should not attempt to take the offer if it cannot broadcast a valid funding TX to the Bitcoin P2P network.
Bob's node should not allow Alice to accept his offer if Bob cannot verify that Alice's funding TXID has been seen by at least 1 of his Bitcoin peers.
### Actual behaviour
Bob's offer gets accepted state, but the trade can never begin because Alice's funding TX was not broadcast to the Bitcoin P2P network. Additionally, Bob's funds are now stuck in limbo.
### Screenshots
<img width="1196" alt="Screen Shot 2019-10-27 at 19 38 13" src="https://user-images.githubusercontent.com/232186/67633385-a2495600-f8f2-11e9-8d86-645ca873db1b.png">
|
process
|
bisq does not verify taker s funding tx is broadcast to bitcoin network description bisq allows its offers to be taken without verifying the taker s funding tx was actually broadcast to the bitcoin network version steps to reproduce configure alice s bisq node to only connect to a single a bitcoin node that does not relay transactions to the bitcoin network i e configure it to have zero peers create an offer on bob s node take the offer on alice s node expected behaviour alice s node should not attempt to take the offer if it cannot broadcast a valid funding tx to the bitcoin network bob s node should not allow alice to accept his offer if bob cannot verify that alice s funding txid has been seen by at least of his bitcoin peers actual behaviour bob s offer gets accepted state but the trade can never begin because alice s funding tx was not broadcast to the bitcoin network additionally bob s funds are now stuck in limbo screenshots img width alt screen shot at src
| 1
|
22,402
| 31,142,290,623
|
IssuesEvent
|
2023-08-16 01:44:37
|
cypress-io/cypress
|
https://api.github.com/repos/cypress-io/cypress
|
closed
|
Flaky test: cypress/e2e/commands/actions/type_special_chars.cy.js
|
OS: linux process: flaky test topic: flake ❄️ stage: flake "topic: done()" stale
|
### Link to dashboard or CircleCI failure
https://app.circleci.com/pipelines/github/cypress-io/cypress/41433/workflows/feaa577c-5380-49e0-afb4-bb6a861e1844/jobs/1715705
### Link to failing test in GitHub
https://github.com/cypress-io/cypress/blob/develop/packages/driver/cypress/e2e/commands/actions/type_special_chars.cy.js#L1254
### Analysis
<img width="971" alt="Screen Shot 2022-08-05 at 12 48 33 PM" src="https://user-images.githubusercontent.com/26726429/183150270-047a3bc5-cebd-477a-879b-858e1a2b1486.png">
### Cypress Version
10.4.0
### Other
There's A LOT of flaky tests in this file due to the way `done()` is being called. See list of failing tests in dashboard link above. Using this single issue to flag all flaky tests in that file instead of creating individual issues. Search for this issue number in the codebase to find the test(s) skipped until this issue is fixed
|
1.0
|
Flaky test: cypress/e2e/commands/actions/type_special_chars.cy.js - ### Link to dashboard or CircleCI failure
https://app.circleci.com/pipelines/github/cypress-io/cypress/41433/workflows/feaa577c-5380-49e0-afb4-bb6a861e1844/jobs/1715705
### Link to failing test in GitHub
https://github.com/cypress-io/cypress/blob/develop/packages/driver/cypress/e2e/commands/actions/type_special_chars.cy.js#L1254
### Analysis
<img width="971" alt="Screen Shot 2022-08-05 at 12 48 33 PM" src="https://user-images.githubusercontent.com/26726429/183150270-047a3bc5-cebd-477a-879b-858e1a2b1486.png">
### Cypress Version
10.4.0
### Other
There's A LOT of flaky tests in this file due to the way `done()` is being called. See list of failing tests in dashboard link above. Using this single issue to flag all flaky tests in that file instead of creating individual issues. Search for this issue number in the codebase to find the test(s) skipped until this issue is fixed
|
process
|
flaky test cypress commands actions type special chars cy js link to dashboard or circleci failure link to failing test in github analysis img width alt screen shot at pm src cypress version other there s a lot of flaky tests in this file due to the way done is being called see list of failing tests in dashboard link above using this single issue to flag all flaky tests in that file instead of creating individual issues search for this issue number in the codebase to find the test s skipped until this issue is fixed
| 1
|
360,775
| 25,310,177,409
|
IssuesEvent
|
2022-11-17 16:50:50
|
forem/forem
|
https://api.github.com/repos/forem/forem
|
opened
|
[API v1 docs] Follows
|
area: API area: documentation
|
We are using [rswag](https://github.com/rswag/rswag) to implement API V1 docs and are missing the following Follows endpoints:
- [ ] [Followed tags](https://developers.forem.com/api/v0#tag/follows/operation/getFollowedTags)
|
1.0
|
[API v1 docs] Follows - We are using [rswag](https://github.com/rswag/rswag) to implement API V1 docs and are missing the following Follows endpoints:
- [ ] [Followed tags](https://developers.forem.com/api/v0#tag/follows/operation/getFollowedTags)
|
non_process
|
follows we are using to implement api docs and are missing the following follows endpoints
| 0
|
72,000
| 18,957,246,698
|
IssuesEvent
|
2021-11-18 21:55:11
|
microsoft/fluentui
|
https://api.github.com/repos/microsoft/fluentui
|
closed
|
Add a test to ensure compatibility with TS 3.9
|
Area: Build System Priority 1: High Area: Testing Area: Typescript Effort II: Days
|
Since our consumer TS min bar is 3.9 for `@fluentui/react` version 8, but our repo is compiling with TS 4.1, we need to ensure we don't introduce any TS 3.9-incompatible public APIs or .d.ts.
A "good enough" approach (may not catch all cases but is easy to implement):
- In a temp directory, scaffold a basic package with TS 3.9 and relevant fluentui deps (gzipped local versions, similar to projects-test)
- Make an index file with does `export * from /*all the fluentui deps*/` -- basically just something to get TS to parse all the .d.ts files for exported stuff
- Attempt to compile
OPEN QUESTION: what's the consumer TS minbar for converged? Does it need to be included in this test?
|
1.0
|
Add a test to ensure compatibility with TS 3.9 - Since our consumer TS min bar is 3.9 for `@fluentui/react` version 8, but our repo is compiling with TS 4.1, we need to ensure we don't introduce any TS 3.9-incompatible public APIs or .d.ts.
A "good enough" approach (may not catch all cases but is easy to implement):
- In a temp directory, scaffold a basic package with TS 3.9 and relevant fluentui deps (gzipped local versions, similar to projects-test)
- Make an index file with does `export * from /*all the fluentui deps*/` -- basically just something to get TS to parse all the .d.ts files for exported stuff
- Attempt to compile
OPEN QUESTION: what's the consumer TS minbar for converged? Does it need to be included in this test?
|
non_process
|
add a test to ensure compatibility with ts since our consumer ts min bar is for fluentui react version but our repo is compiling with ts we need to ensure we don t introduce any ts incompatible public apis or d ts a good enough approach may not catch all cases but is easy to implement in a temp directory scaffold a basic package with ts and relevant fluentui deps gzipped local versions similar to projects test make an index file with does export from all the fluentui deps basically just something to get ts to parse all the d ts files for exported stuff attempt to compile open question what s the consumer ts minbar for converged does it need to be included in this test
| 0
|
4,312
| 7,203,329,725
|
IssuesEvent
|
2018-02-06 08:49:18
|
qgis/QGIS-Documentation
|
https://api.github.com/repos/qgis/QGIS-Documentation
|
closed
|
[FEATURE] Drop processing 'Select by Attribute Sum' algorithm
|
Automatic new feature Easy Processing
|
Original commit: https://github.com/qgis/QGIS/commit/d08398f785b9421d0a8df66d34576cd3097da540 by nyalldawson
Tagged as feature to be included in release notes.
Because:
- The use case for this algorithm is very unclear for users - the name
does not describe what the algorithm does, and there's no help
documentation available for the algorithm either. Given this I suspect
that the algorithm is not being put into use.
- The algorithm needs enhancement to be more useful. There's no logic
in place which dictates how neighbouring features are chosen to
dissolve into the selected feature (it's effectively random - you're
just as likely to get a huge narrow polygon stretching across a map as
you are a nice compact cluster). To be more useful the algorithm would
need logic to either minimise the area of the dissolved feature, or
minimise the total number of dissolved features, or ... ?
|
1.0
|
[FEATURE] Drop processing 'Select by Attribute Sum' algorithm - Original commit: https://github.com/qgis/QGIS/commit/d08398f785b9421d0a8df66d34576cd3097da540 by nyalldawson
Tagged as feature to be included in release notes.
Because:
- The use case for this algorithm is very unclear for users - the name
does not describe what the algorithm does, and there's no help
documentation available for the algorithm either. Given this I suspect
that the algorithm is not being put into use.
- The algorithm needs enhancement to be more useful. There's no logic
in place which dictates how neighbouring features are chosen to
dissolve into the selected feature (it's effectively random - you're
just as likely to get a huge narrow polygon stretching across a map as
you are a nice compact cluster). To be more useful the algorithm would
need logic to either minimise the area of the dissolved feature, or
minimise the total number of dissolved features, or ... ?
|
process
|
drop processing select by attribute sum algorithm original commit by nyalldawson tagged as feature to be included in release notes because the use case for this algorithm is very unclear for users the name does not describe what the algorithm does and there s no help documentation available for the algorithm either given this i suspect that the algorithm is not being put into use the algorithm needs enhancement to be more useful there s no logic in place which dictates how neighbouring features are chosen to dissolve into the selected feature it s effectively random you re just as likely to get a huge narrow polygon stretching across a map as you are a nice compact cluster to be more useful the algorithm would need logic to either minimise the area of the dissolved feature or minimise the total number of dissolved features or
| 1
|
21,593
| 29,993,417,007
|
IssuesEvent
|
2023-06-26 02:00:08
|
lizhihao6/get-daily-arxiv-noti
|
https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
|
opened
|
New submissions for Mon, 26 Jun 23
|
event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB
|
## Keyword: events
There is no result
## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
There is no result
## Keyword: ISP
### A Sparse Graph Formulation for Efficient Spectral Image Segmentation
- **Authors:** Rahul Palnitkar, Jeova Farias Sales Rocha Neto
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.13166
- **Pdf link:** https://arxiv.org/pdf/2306.13166
- **Abstract**
Spectral Clustering is one of the most traditional methods to solve segmentation problems. Based on Normalized Cuts, it aims at partitioning an image using an objective function defined by a graph. Despite their mathematical attractiveness, spectral approaches are traditionally neglected by the scientific community due to their practical issues and underperformance. In this paper, we adopt a sparse graph formulation based on the inclusion of extra nodes to a simple grid graph. While the grid encodes the pixel spatial disposition, the extra nodes account for the pixel color data. Applying the original Normalized Cuts algorithm to this graph leads to a simple and scalable method for spectral image segmentation, with an interpretable solution. Our experiments also demonstrate that our proposed methodology over performs traditional spectral algorithms for segmentation.
### Targeted Background Removal Creates Interpretable Feature Visualizations
- **Authors:** Ian E. Nielsen, Erik Grundeland, Joseph Snedeker, Ghulam Rasool, Ravi P. Ramachandran
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Human-Computer Interaction (cs.HC)
- **Arxiv link:** https://arxiv.org/abs/2306.13178
- **Pdf link:** https://arxiv.org/pdf/2306.13178
- **Abstract**
Feature visualization is used to visualize learned features for black box machine learning models. Our approach explores an altered training process to improve interpretability of the visualizations. We argue that by using background removal techniques as a form of robust training, a network is forced to learn more human recognizable features, namely, by focusing on the main object of interest without any distractions from the background. Four different training methods were used to verify this hypothesis. The first used unmodified pictures. The second used a black background. The third utilized Gaussian noise as the background. The fourth approach employed a mix of background removed images and unmodified images. The feature visualization results show that the background removed images reveal a significant improvement over the baseline model. These new results displayed easily recognizable features from their respective classes, unlike the model trained on unmodified data.
### Shape-Constraint Recurrent Flow for 6D Object Pose Estimation
- **Authors:** Yang Hai, Rui Song, Jiaojiao Li, Yinlin Hu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.13266
- **Pdf link:** https://arxiv.org/pdf/2306.13266
- **Abstract**
Most recent 6D object pose methods use 2D optical flow to refine their results. However, the general optical flow methods typically do not consider the target's 3D shape information during matching, making them less effective in 6D object pose estimation. In this work, we propose a shape-constraint recurrent matching framework for 6D object pose estimation. We first compute a pose-induced flow based on the displacement of 2D reprojection between the initial pose and the currently estimated pose, which embeds the target's 3D shape implicitly. Then we use this pose-induced flow to construct the correlation map for the following matching iterations, which reduces the matching space significantly and is much easier to learn. Furthermore, we use networks to learn the object pose based on the current estimated flow, which facilitates the computation of the pose-induced flow for the next iteration and yields an end-to-end system for object pose. Finally, we optimize the optical flow and object pose simultaneously in a recurrent manner. We evaluate our method on three challenging 6D object pose datasets and show that it outperforms the state of the art significantly in both accuracy and efficiency.
### Differentiable Display Photometric Stereo
- **Authors:** Seokjun Choi, Seungwoo Yoon, Giljoo Nam, Seungyong Lee, Seung-Hawn Baek
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.13325
- **Pdf link:** https://arxiv.org/pdf/2306.13325
- **Abstract**
Photometric stereo leverages variations in illumination conditions to reconstruct per-pixel surface normals. The concept of display photometric stereo, which employs a conventional monitor as an illumination source, has the potential to overcome limitations often encountered in bulky and difficult-to-use conventional setups. In this paper, we introduce Differentiable Display Photometric Stereo (DDPS), a method designed to achieve high-fidelity normal reconstruction using an off-the-shelf monitor and camera. DDPS addresses a critical yet often neglected challenge in photometric stereo: the optimization of display patterns for enhanced normal reconstruction. We present a differentiable framework that couples basis-illumination image formation with a photometric-stereo reconstruction method. This facilitates the learning of display patterns that leads to high-quality normal reconstruction through automatic differentiation. Addressing the synthetic-real domain gap inherent in end-to-end optimization, we propose the use of a real-world photometric-stereo training dataset composed of 3D-printed objects. Moreover, to reduce the ill-posed nature of photometric stereo, we exploit the linearly polarized light emitted from the monitor to optically separate diffuse and specular reflections in the captured images. We demonstrate that DDPS allows for learning display patterns optimized for a target configuration and is robust to initialization. We assess DDPS on 3D-printed objects with ground-truth normals and diverse real-world objects, validating that DDPS enables effective photometric-stereo reconstruction.
### 3DSAM-adapter: Holistic Adaptation of SAM from 2D to 3D for Promptable Medical Image Segmentation
- **Authors:** Shizhan Gong, Yuan Zhong, Wenao Ma, Jinpeng Li, Zhao Wang, Jingyang Zhang, Pheng-Ann Heng, Qi Dou
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.13465
- **Pdf link:** https://arxiv.org/pdf/2306.13465
- **Abstract**
Despite that the segment anything model (SAM) achieved impressive results on general-purpose semantic segmentation with strong generalization ability on daily images, its demonstrated performance on medical image segmentation is less precise and not stable, especially when dealing with tumor segmentation tasks that involve objects of small sizes, irregular shapes, and low contrast. Notably, the original SAM architecture is designed for 2D natural images, therefore would not be able to extract the 3D spatial information from volumetric medical data effectively. In this paper, we propose a novel adaptation method for transferring SAM from 2D to 3D for promptable medical image segmentation. Through a holistically designed scheme for architecture modification, we transfer the SAM to support volumetric inputs while retaining the majority of its pre-trained parameters for reuse. The fine-tuning process is conducted in a parameter-efficient manner, wherein most of the pre-trained parameters remain frozen, and only a few lightweight spatial adapters are introduced and tuned. Regardless of the domain gap between natural and medical data and the disparity in the spatial arrangement between 2D and 3D, the transformer trained on natural images can effectively capture the spatial patterns present in volumetric medical images with only lightweight adaptations. We conduct experiments on four open-source tumor segmentation datasets, and with a single click prompt, our model can outperform domain state-of-the-art medical image segmentation models on 3 out of 4 tasks, specifically by 8.25%, 29.87%, and 10.11% for kidney tumor, pancreas tumor, colon cancer segmentation, and achieve similar performance for liver tumor segmentation. We also compare our adaptation method with existing popular adapters, and observed significant performance improvement on most datasets.
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
There is no result
## Keyword: RAW
### Bridging the Performance Gap between DETR and R-CNN for Graphical Object Detection in Document Images
- **Authors:** Tahira Shehzadi, Khurram Azeem Hashmi, Didier Stricker, Marcus Liwicki, Muhammad Zeshan Afzal
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.13526
- **Pdf link:** https://arxiv.org/pdf/2306.13526
- **Abstract**
This paper takes an important step in bridging the performance gap between DETR and R-CNN for graphical object detection. Existing graphical object detection approaches have enjoyed recent enhancements in CNN-based object detection methods, achieving remarkable progress. Recently, Transformer-based detectors have considerably boosted the generic object detection performance, eliminating the need for hand-crafted features or post-processing steps such as Non-Maximum Suppression (NMS) using object queries. However, the effectiveness of such enhanced transformer-based detection algorithms has yet to be verified for the problem of graphical object detection. Essentially, inspired by the latest advancements in the DETR, we employ the existing detection transformer with few modifications for graphical object detection. We modify object queries in different ways, using points, anchor boxes and adding positive and negative noise to the anchors to boost performance. These modifications allow for better handling of objects with varying sizes and aspect ratios, more robustness to small variations in object positions and sizes, and improved image discrimination between objects and non-objects. We evaluate our approach on the four graphical datasets: PubTables, TableBank, NTable and PubLaynet. Upon integrating query modifications in the DETR, we outperform prior works and achieve new state-of-the-art results with the mAP of 96.9\%, 95.7\% and 99.3\% on TableBank, PubLaynet, PubTables, respectively. The results from extensive ablations show that transformer-based methods are more effective for document analysis analogous to other applications. We hope this study draws more attention to the research of using detection transformers in document image analysis.
## Keyword: raw image
There is no result
|
2.0
|
New submissions for Mon, 26 Jun 23 - ## Keyword: events
There is no result
## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
There is no result
## Keyword: ISP
### A Sparse Graph Formulation for Efficient Spectral Image Segmentation
- **Authors:** Rahul Palnitkar, Jeova Farias Sales Rocha Neto
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.13166
- **Pdf link:** https://arxiv.org/pdf/2306.13166
- **Abstract**
Spectral Clustering is one of the most traditional methods to solve segmentation problems. Based on Normalized Cuts, it aims at partitioning an image using an objective function defined by a graph. Despite their mathematical attractiveness, spectral approaches are traditionally neglected by the scientific community due to their practical issues and underperformance. In this paper, we adopt a sparse graph formulation based on the inclusion of extra nodes to a simple grid graph. While the grid encodes the pixel spatial disposition, the extra nodes account for the pixel color data. Applying the original Normalized Cuts algorithm to this graph leads to a simple and scalable method for spectral image segmentation, with an interpretable solution. Our experiments also demonstrate that our proposed methodology over performs traditional spectral algorithms for segmentation.
### Targeted Background Removal Creates Interpretable Feature Visualizations
- **Authors:** Ian E. Nielsen, Erik Grundeland, Joseph Snedeker, Ghulam Rasool, Ravi P. Ramachandran
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Human-Computer Interaction (cs.HC)
- **Arxiv link:** https://arxiv.org/abs/2306.13178
- **Pdf link:** https://arxiv.org/pdf/2306.13178
- **Abstract**
Feature visualization is used to visualize learned features for black box machine learning models. Our approach explores an altered training process to improve interpretability of the visualizations. We argue that by using background removal techniques as a form of robust training, a network is forced to learn more human recognizable features, namely, by focusing on the main object of interest without any distractions from the background. Four different training methods were used to verify this hypothesis. The first used unmodified pictures. The second used a black background. The third utilized Gaussian noise as the background. The fourth approach employed a mix of background removed images and unmodified images. The feature visualization results show that the background removed images reveal a significant improvement over the baseline model. These new results displayed easily recognizable features from their respective classes, unlike the model trained on unmodified data.
### Shape-Constraint Recurrent Flow for 6D Object Pose Estimation
- **Authors:** Yang Hai, Rui Song, Jiaojiao Li, Yinlin Hu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.13266
- **Pdf link:** https://arxiv.org/pdf/2306.13266
- **Abstract**
Most recent 6D object pose methods use 2D optical flow to refine their results. However, the general optical flow methods typically do not consider the target's 3D shape information during matching, making them less effective in 6D object pose estimation. In this work, we propose a shape-constraint recurrent matching framework for 6D object pose estimation. We first compute a pose-induced flow based on the displacement of 2D reprojection between the initial pose and the currently estimated pose, which embeds the target's 3D shape implicitly. Then we use this pose-induced flow to construct the correlation map for the following matching iterations, which reduces the matching space significantly and is much easier to learn. Furthermore, we use networks to learn the object pose based on the current estimated flow, which facilitates the computation of the pose-induced flow for the next iteration and yields an end-to-end system for object pose. Finally, we optimize the optical flow and object pose simultaneously in a recurrent manner. We evaluate our method on three challenging 6D object pose datasets and show that it outperforms the state of the art significantly in both accuracy and efficiency.
### Differentiable Display Photometric Stereo
- **Authors:** Seokjun Choi, Seungwoo Yoon, Giljoo Nam, Seungyong Lee, Seung-Hawn Baek
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.13325
- **Pdf link:** https://arxiv.org/pdf/2306.13325
- **Abstract**
Photometric stereo leverages variations in illumination conditions to reconstruct per-pixel surface normals. The concept of display photometric stereo, which employs a conventional monitor as an illumination source, has the potential to overcome limitations often encountered in bulky and difficult-to-use conventional setups. In this paper, we introduce Differentiable Display Photometric Stereo (DDPS), a method designed to achieve high-fidelity normal reconstruction using an off-the-shelf monitor and camera. DDPS addresses a critical yet often neglected challenge in photometric stereo: the optimization of display patterns for enhanced normal reconstruction. We present a differentiable framework that couples basis-illumination image formation with a photometric-stereo reconstruction method. This facilitates the learning of display patterns that leads to high-quality normal reconstruction through automatic differentiation. Addressing the synthetic-real domain gap inherent in end-to-end optimization, we propose the use of a real-world photometric-stereo training dataset composed of 3D-printed objects. Moreover, to reduce the ill-posed nature of photometric stereo, we exploit the linearly polarized light emitted from the monitor to optically separate diffuse and specular reflections in the captured images. We demonstrate that DDPS allows for learning display patterns optimized for a target configuration and is robust to initialization. We assess DDPS on 3D-printed objects with ground-truth normals and diverse real-world objects, validating that DDPS enables effective photometric-stereo reconstruction.
### 3DSAM-adapter: Holistic Adaptation of SAM from 2D to 3D for Promptable Medical Image Segmentation
- **Authors:** Shizhan Gong, Yuan Zhong, Wenao Ma, Jinpeng Li, Zhao Wang, Jingyang Zhang, Pheng-Ann Heng, Qi Dou
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.13465
- **Pdf link:** https://arxiv.org/pdf/2306.13465
- **Abstract**
Despite that the segment anything model (SAM) achieved impressive results on general-purpose semantic segmentation with strong generalization ability on daily images, its demonstrated performance on medical image segmentation is less precise and not stable, especially when dealing with tumor segmentation tasks that involve objects of small sizes, irregular shapes, and low contrast. Notably, the original SAM architecture is designed for 2D natural images, therefore would not be able to extract the 3D spatial information from volumetric medical data effectively. In this paper, we propose a novel adaptation method for transferring SAM from 2D to 3D for promptable medical image segmentation. Through a holistically designed scheme for architecture modification, we transfer the SAM to support volumetric inputs while retaining the majority of its pre-trained parameters for reuse. The fine-tuning process is conducted in a parameter-efficient manner, wherein most of the pre-trained parameters remain frozen, and only a few lightweight spatial adapters are introduced and tuned. Regardless of the domain gap between natural and medical data and the disparity in the spatial arrangement between 2D and 3D, the transformer trained on natural images can effectively capture the spatial patterns present in volumetric medical images with only lightweight adaptations. We conduct experiments on four open-source tumor segmentation datasets, and with a single click prompt, our model can outperform domain state-of-the-art medical image segmentation models on 3 out of 4 tasks, specifically by 8.25%, 29.87%, and 10.11% for kidney tumor, pancreas tumor, colon cancer segmentation, and achieve similar performance for liver tumor segmentation. We also compare our adaptation method with existing popular adapters, and observed significant performance improvement on most datasets.
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
There is no result
## Keyword: RAW
### Bridging the Performance Gap between DETR and R-CNN for Graphical Object Detection in Document Images
- **Authors:** Tahira Shehzadi, Khurram Azeem Hashmi, Didier Stricker, Marcus Liwicki, Muhammad Zeshan Afzal
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.13526
- **Pdf link:** https://arxiv.org/pdf/2306.13526
- **Abstract**
This paper takes an important step in bridging the performance gap between DETR and R-CNN for graphical object detection. Existing graphical object detection approaches have enjoyed recent enhancements in CNN-based object detection methods, achieving remarkable progress. Recently, Transformer-based detectors have considerably boosted the generic object detection performance, eliminating the need for hand-crafted features or post-processing steps such as Non-Maximum Suppression (NMS) using object queries. However, the effectiveness of such enhanced transformer-based detection algorithms has yet to be verified for the problem of graphical object detection. Essentially, inspired by the latest advancements in the DETR, we employ the existing detection transformer with few modifications for graphical object detection. We modify object queries in different ways, using points, anchor boxes and adding positive and negative noise to the anchors to boost performance. These modifications allow for better handling of objects with varying sizes and aspect ratios, more robustness to small variations in object positions and sizes, and improved image discrimination between objects and non-objects. We evaluate our approach on the four graphical datasets: PubTables, TableBank, NTable and PubLaynet. Upon integrating query modifications in the DETR, we outperform prior works and achieve new state-of-the-art results with the mAP of 96.9\%, 95.7\% and 99.3\% on TableBank, PubLaynet, PubTables, respectively. The results from extensive ablations show that transformer-based methods are more effective for document analysis analogous to other applications. We hope this study draws more attention to the research of using detection transformers in document image analysis.
## Keyword: raw image
There is no result
|
process
|
new submissions for mon jun keyword events there is no result keyword event camera there is no result keyword events camera there is no result keyword white balance there is no result keyword color contrast there is no result keyword awb there is no result keyword isp a sparse graph formulation for efficient spectral image segmentation authors rahul palnitkar jeova farias sales rocha neto subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract spectral clustering is one of the most traditional methods to solve segmentation problems based on normalized cuts it aims at partitioning an image using an objective function defined by a graph despite their mathematical attractiveness spectral approaches are traditionally neglected by the scientific community due to their practical issues and underperformance in this paper we adopt a sparse graph formulation based on the inclusion of extra nodes to a simple grid graph while the grid encodes the pixel spatial disposition the extra nodes account for the pixel color data applying the original normalized cuts algorithm to this graph leads to a simple and scalable method for spectral image segmentation with an interpretable solution our experiments also demonstrate that our proposed methodology over performs traditional spectral algorithms for segmentation targeted background removal creates interpretable feature visualizations authors ian e nielsen erik grundeland joseph snedeker ghulam rasool ravi p ramachandran subjects computer vision and pattern recognition cs cv artificial intelligence cs ai human computer interaction cs hc arxiv link pdf link abstract feature visualization is used to visualize learned features for black box machine learning models our approach explores an altered training process to improve interpretability of the visualizations we argue that by using background removal techniques as a form of robust training a network is forced to learn more human recognizable features namely by focusing on the main object of interest without any distractions from the background four different training methods were used to verify this hypothesis the first used unmodified pictures the second used a black background the third utilized gaussian noise as the background the fourth approach employed a mix of background removed images and unmodified images the feature visualization results show that the background removed images reveal a significant improvement over the baseline model these new results displayed easily recognizable features from their respective classes unlike the model trained on unmodified data shape constraint recurrent flow for object pose estimation authors yang hai rui song jiaojiao li yinlin hu subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract most recent object pose methods use optical flow to refine their results however the general optical flow methods typically do not consider the target s shape information during matching making them less effective in object pose estimation in this work we propose a shape constraint recurrent matching framework for object pose estimation we first compute a pose induced flow based on the displacement of reprojection between the initial pose and the currently estimated pose which embeds the target s shape implicitly then we use this pose induced flow to construct the correlation map for the following matching iterations which reduces the matching space significantly and is much easier to learn furthermore we use networks to learn the object pose based on the current estimated flow which facilitates the computation of the pose induced flow for the next iteration and yields an end to end system for object pose finally we optimize the optical flow and object pose simultaneously in a recurrent manner we evaluate our method on three challenging object pose datasets and show that it outperforms the state of the art significantly in both accuracy and efficiency differentiable display photometric stereo authors seokjun choi seungwoo yoon giljoo nam seungyong lee seung hawn baek subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract photometric stereo leverages variations in illumination conditions to reconstruct per pixel surface normals the concept of display photometric stereo which employs a conventional monitor as an illumination source has the potential to overcome limitations often encountered in bulky and difficult to use conventional setups in this paper we introduce differentiable display photometric stereo ddps a method designed to achieve high fidelity normal reconstruction using an off the shelf monitor and camera ddps addresses a critical yet often neglected challenge in photometric stereo the optimization of display patterns for enhanced normal reconstruction we present a differentiable framework that couples basis illumination image formation with a photometric stereo reconstruction method this facilitates the learning of display patterns that leads to high quality normal reconstruction through automatic differentiation addressing the synthetic real domain gap inherent in end to end optimization we propose the use of a real world photometric stereo training dataset composed of printed objects moreover to reduce the ill posed nature of photometric stereo we exploit the linearly polarized light emitted from the monitor to optically separate diffuse and specular reflections in the captured images we demonstrate that ddps allows for learning display patterns optimized for a target configuration and is robust to initialization we assess ddps on printed objects with ground truth normals and diverse real world objects validating that ddps enables effective photometric stereo reconstruction adapter holistic adaptation of sam from to for promptable medical image segmentation authors shizhan gong yuan zhong wenao ma jinpeng li zhao wang jingyang zhang pheng ann heng qi dou subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract despite that the segment anything model sam achieved impressive results on general purpose semantic segmentation with strong generalization ability on daily images its demonstrated performance on medical image segmentation is less precise and not stable especially when dealing with tumor segmentation tasks that involve objects of small sizes irregular shapes and low contrast notably the original sam architecture is designed for natural images therefore would not be able to extract the spatial information from volumetric medical data effectively in this paper we propose a novel adaptation method for transferring sam from to for promptable medical image segmentation through a holistically designed scheme for architecture modification we transfer the sam to support volumetric inputs while retaining the majority of its pre trained parameters for reuse the fine tuning process is conducted in a parameter efficient manner wherein most of the pre trained parameters remain frozen and only a few lightweight spatial adapters are introduced and tuned regardless of the domain gap between natural and medical data and the disparity in the spatial arrangement between and the transformer trained on natural images can effectively capture the spatial patterns present in volumetric medical images with only lightweight adaptations we conduct experiments on four open source tumor segmentation datasets and with a single click prompt our model can outperform domain state of the art medical image segmentation models on out of tasks specifically by and for kidney tumor pancreas tumor colon cancer segmentation and achieve similar performance for liver tumor segmentation we also compare our adaptation method with existing popular adapters and observed significant performance improvement on most datasets keyword image signal processing there is no result keyword image signal process there is no result keyword compression there is no result keyword raw bridging the performance gap between detr and r cnn for graphical object detection in document images authors tahira shehzadi khurram azeem hashmi didier stricker marcus liwicki muhammad zeshan afzal subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract this paper takes an important step in bridging the performance gap between detr and r cnn for graphical object detection existing graphical object detection approaches have enjoyed recent enhancements in cnn based object detection methods achieving remarkable progress recently transformer based detectors have considerably boosted the generic object detection performance eliminating the need for hand crafted features or post processing steps such as non maximum suppression nms using object queries however the effectiveness of such enhanced transformer based detection algorithms has yet to be verified for the problem of graphical object detection essentially inspired by the latest advancements in the detr we employ the existing detection transformer with few modifications for graphical object detection we modify object queries in different ways using points anchor boxes and adding positive and negative noise to the anchors to boost performance these modifications allow for better handling of objects with varying sizes and aspect ratios more robustness to small variations in object positions and sizes and improved image discrimination between objects and non objects we evaluate our approach on the four graphical datasets pubtables tablebank ntable and publaynet upon integrating query modifications in the detr we outperform prior works and achieve new state of the art results with the map of and on tablebank publaynet pubtables respectively the results from extensive ablations show that transformer based methods are more effective for document analysis analogous to other applications we hope this study draws more attention to the research of using detection transformers in document image analysis keyword raw image there is no result
| 1
|
217,605
| 16,722,763,781
|
IssuesEvent
|
2021-06-10 09:18:09
|
coopdigital/coop-frontend
|
https://api.github.com/repos/coopdigital/coop-frontend
|
opened
|
[Component] InputRadio Component for Storybook docs
|
documentation enhancement
|
TODO:
- [ ] Move the InputRadio component from coop-ds-storybook to this repo, under the react-ui package
- [ ] Remove the pcss imports from jsx
- [ ] Remove the pcss file from component directory
- [ ] Have component styles installed at react-ui package root level as a package dependency
- [ ] Import component style .pcss at a global level in `.storybook/preview.js`
- [ ] Have stories attached to the component that cover all states of the component
- [ ] Add tests for the component using jest and testing-library/react
- [ ] Remove any static html examples from component directory
- [ ] Avoid any state management work for the time being, if it can be avoided.
|
1.0
|
[Component] InputRadio Component for Storybook docs - TODO:
- [ ] Move the InputRadio component from coop-ds-storybook to this repo, under the react-ui package
- [ ] Remove the pcss imports from jsx
- [ ] Remove the pcss file from component directory
- [ ] Have component styles installed at react-ui package root level as a package dependency
- [ ] Import component style .pcss at a global level in `.storybook/preview.js`
- [ ] Have stories attached to the component that cover all states of the component
- [ ] Add tests for the component using jest and testing-library/react
- [ ] Remove any static html examples from component directory
- [ ] Avoid any state management work for the time being, if it can be avoided.
|
non_process
|
inputradio component for storybook docs todo move the inputradio component from coop ds storybook to this repo under the react ui package remove the pcss imports from jsx remove the pcss file from component directory have component styles installed at react ui package root level as a package dependency import component style pcss at a global level in storybook preview js have stories attached to the component that cover all states of the component add tests for the component using jest and testing library react remove any static html examples from component directory avoid any state management work for the time being if it can be avoided
| 0
|
93,896
| 10,779,222,323
|
IssuesEvent
|
2019-11-04 10:07:12
|
ocibuilder/ocibuilder
|
https://api.github.com/repos/ocibuilder/ocibuilder
|
closed
|
Rename spec.yaml to alternative
|
PRIORITY: 1 documentation enhancement
|
**Is your feature request related to a problem? Please describe.**
We want to rename the current spec.yaml to something that is specific to ocibuilder and easily identifiable and searchable
**Describe the solution you'd like**
Come up with a unique alternative
|
1.0
|
Rename spec.yaml to alternative - **Is your feature request related to a problem? Please describe.**
We want to rename the current spec.yaml to something that is specific to ocibuilder and easily identifiable and searchable
**Describe the solution you'd like**
Come up with a unique alternative
|
non_process
|
rename spec yaml to alternative is your feature request related to a problem please describe we want to rename the current spec yaml to something that is specific to ocibuilder and easily identifiable and searchable describe the solution you d like come up with a unique alternative
| 0
|
27,238
| 2,691,276,462
|
IssuesEvent
|
2015-03-31 20:38:46
|
mbenkmann/limux-gosa-test
|
https://api.github.com/repos/mbenkmann/limux-gosa-test
|
closed
|
Client simulator for mass-installation and mass-updating
|
imported Milestone-5 Priority-Must Type-Feature
|
_From [matthias...@gmail.com](https://code.google.com/u/110029356478037175549/) on March 22, 2013 09:36:56_
For testing the installation and updating of lots of clients in parallel, a program should be written that will simulate the behaviour of a client performing the relevant operation as realistically as possible.
Desirable features:
* Mass-removal of LDAP objects created for fake clients in prior tests to get a clean starting point for the testing of new machines mass installation.
* Mass creation of LDAP objects with fake MAC addresses and IPs in preparation of testing mass re-installation or softupdates of existing machines.
* Mass-resetting of LDAP objects for fake clients (in particular faistate) in preparation of a new test.
* Sniffing the network for WOL packages targetting the fake MACs and triggering fake client processes in reaction to them.
* Timed startup of fake clients with parameters like "total fake clients to start", "fake clients per minute to start"
* Gathering of statistics: "Clients successfully WOLed", "Clients successfully installed/updated", "Clients installed/updated per hour", "Clients failed to WOL", "Clients failed to install", "avg. TFTP download rate", "avg. HTTP download rate", "avg. time between here_i_am and activated_for_installation", "avg. time for log file upload"
* Error simulation via parameters like "% of clients that don't WOL", "% of clients that will freeze/reboot/shutdown during the process
* Parameters min-boot-time and max-boot-time that determine a range from which the fake client selects randomly the time it takes from receiving the WOL or being otherwise started up to the time it sends here_i_am. This feature helps uncover bugs like gosa-si-server's reactivate problem.
* Some kind of detailed reporting on the state of all fake clients with information where exactly it is in the installation and with logging of important events (such as si-server messages)
* Send appropriate gosa-si-client messages in the proper order, including in relation downloads.
* Wait for si-server messages before continuing in appropriate places ("waiting for activation")
* React to si-server messages that cause reboot or shutdown to uncover errors like gosa-si-server's reactivate problem.
* Download files via TFTP in the proper order in relation to messages and HTTP downloads
* Download files via HTTP in the proper order in relation to messages and TFTP downloads
* Mount NFS shares and read files via NFS. Order is secondary here since most of the installation happens within the target chroot. But at the beginning of the installation the key files (such as /usr/sbin/gosa-si-client, perl, bash) should be loaded once via NFS to ensure that it works and to identify possible problems such as HTTP downloads from competing installations slowing down NFS or issues with too many parallel NFS mounts.
_Original issue: http://code.google.com/p/go-susi/issues/detail?id=96_
|
1.0
|
Client simulator for mass-installation and mass-updating - _From [matthias...@gmail.com](https://code.google.com/u/110029356478037175549/) on March 22, 2013 09:36:56_
For testing the installation and updating of lots of clients in parallel, a program should be written that will simulate the behaviour of a client performing the relevant operation as realistically as possible.
Desirable features:
* Mass-removal of LDAP objects created for fake clients in prior tests to get a clean starting point for the testing of new machines mass installation.
* Mass creation of LDAP objects with fake MAC addresses and IPs in preparation of testing mass re-installation or softupdates of existing machines.
* Mass-resetting of LDAP objects for fake clients (in particular faistate) in preparation of a new test.
* Sniffing the network for WOL packages targetting the fake MACs and triggering fake client processes in reaction to them.
* Timed startup of fake clients with parameters like "total fake clients to start", "fake clients per minute to start"
* Gathering of statistics: "Clients successfully WOLed", "Clients successfully installed/updated", "Clients installed/updated per hour", "Clients failed to WOL", "Clients failed to install", "avg. TFTP download rate", "avg. HTTP download rate", "avg. time between here_i_am and activated_for_installation", "avg. time for log file upload"
* Error simulation via parameters like "% of clients that don't WOL", "% of clients that will freeze/reboot/shutdown during the process
* Parameters min-boot-time and max-boot-time that determine a range from which the fake client selects randomly the time it takes from receiving the WOL or being otherwise started up to the time it sends here_i_am. This feature helps uncover bugs like gosa-si-server's reactivate problem.
* Some kind of detailed reporting on the state of all fake clients with information where exactly it is in the installation and with logging of important events (such as si-server messages)
* Send appropriate gosa-si-client messages in the proper order, including in relation downloads.
* Wait for si-server messages before continuing in appropriate places ("waiting for activation")
* React to si-server messages that cause reboot or shutdown to uncover errors like gosa-si-server's reactivate problem.
* Download files via TFTP in the proper order in relation to messages and HTTP downloads
* Download files via HTTP in the proper order in relation to messages and TFTP downloads
* Mount NFS shares and read files via NFS. Order is secondary here since most of the installation happens within the target chroot. But at the beginning of the installation the key files (such as /usr/sbin/gosa-si-client, perl, bash) should be loaded once via NFS to ensure that it works and to identify possible problems such as HTTP downloads from competing installations slowing down NFS or issues with too many parallel NFS mounts.
_Original issue: http://code.google.com/p/go-susi/issues/detail?id=96_
|
non_process
|
client simulator for mass installation and mass updating from on march for testing the installation and updating of lots of clients in parallel a program should be written that will simulate the behaviour of a client performing the relevant operation as realistically as possible desirable features mass removal of ldap objects created for fake clients in prior tests to get a clean starting point for the testing of new machines mass installation mass creation of ldap objects with fake mac addresses and ips in preparation of testing mass re installation or softupdates of existing machines mass resetting of ldap objects for fake clients in particular faistate in preparation of a new test sniffing the network for wol packages targetting the fake macs and triggering fake client processes in reaction to them timed startup of fake clients with parameters like total fake clients to start fake clients per minute to start gathering of statistics clients successfully woled clients successfully installed updated clients installed updated per hour clients failed to wol clients failed to install avg tftp download rate avg http download rate avg time between here i am and activated for installation avg time for log file upload error simulation via parameters like of clients that don t wol of clients that will freeze reboot shutdown during the process parameters min boot time and max boot time that determine a range from which the fake client selects randomly the time it takes from receiving the wol or being otherwise started up to the time it sends here i am this feature helps uncover bugs like gosa si server s reactivate problem some kind of detailed reporting on the state of all fake clients with information where exactly it is in the installation and with logging of important events such as si server messages send appropriate gosa si client messages in the proper order including in relation downloads wait for si server messages before continuing in appropriate places waiting for activation react to si server messages that cause reboot or shutdown to uncover errors like gosa si server s reactivate problem download files via tftp in the proper order in relation to messages and http downloads download files via http in the proper order in relation to messages and tftp downloads mount nfs shares and read files via nfs order is secondary here since most of the installation happens within the target chroot but at the beginning of the installation the key files such as usr sbin gosa si client perl bash should be loaded once via nfs to ensure that it works and to identify possible problems such as http downloads from competing installations slowing down nfs or issues with too many parallel nfs mounts original issue
| 0
|
279,200
| 24,206,071,478
|
IssuesEvent
|
2022-09-25 08:35:15
|
IntellectualSites/FastAsyncWorldEdit
|
https://api.github.com/repos/IntellectualSites/FastAsyncWorldEdit
|
closed
|
1.16.5 java.lang.IllegalArgumentException: Unsupported class file major version 61
|
Requires Testing
|
### Server Implementation
Paper
### Server Version
1.16.5
### Describe the bug
i used jdk11 first. It prompts: Unsupported class file major version 61
after i configure jdk17 (-DPaper.IgnoreJavaVersion=true)
it went from "unable to load plugin" to "Fatal error trying ... *.class"
then i try used jdk18. The problem still occurs
when it does not send error logs in the console. the debugging plugin prompts FAWE to handle an exception for (like PlayerMovementEvent, PlayerTeleportEvent) events [class com.sk89q.worldedit.bukkit.BukkitPlayer cannot be cast to class com.sk89q.worldedit.bukkit.BukkitPlayer (com.sk89q.worldedit.bukkit.BukkitPlayer is in unnamed module of loader org.bukkit.plugin.java.PluginClassLoader @1d716adc; com.sk89q.worldedit.bukkit.BukkitPlayer is in unnamed module of loader org.bukkit.plugin.java.PluginClassLoader @46ca75ff)]
### To Reproduce
i am using Akarin-6be3d57(1.16.5 paper fork)
an error occurred after using jdk17
i just wanted to use it. If thats really Akarin problem. i will try to use Tuinity.
i know though that using this core will not be supported. but i still want to ask
### Expected behaviour
successfully running. console are no any errors or spam
### Screenshots / Videos
_No response_
### Error log (if applicable)
_No response_
### Fawe Debugpaste
Fatal error trying to convert FastAsyncWorldEdit v2.4.7-SNAPSHOT-281;10b852d:com/sk89q/worldedit/internal/util/ErrorReporting.class java.lang.IllegalArgumentException: Unsupported class file major version 61 :|
### Fawe Version
FastAsyncWorldEdit-Bukkit-2.4.7-SNAPSHOT-281
### Checklist
- [X] I have included a Fawe debugpaste.
- [X] I am using the newest build from https://ci.athion.net/job/FastAsyncWorldEdit/ and the issue still persists.
### Anything else?
_No response_
|
1.0
|
1.16.5 java.lang.IllegalArgumentException: Unsupported class file major version 61 - ### Server Implementation
Paper
### Server Version
1.16.5
### Describe the bug
i used jdk11 first. It prompts: Unsupported class file major version 61
after i configure jdk17 (-DPaper.IgnoreJavaVersion=true)
it went from "unable to load plugin" to "Fatal error trying ... *.class"
then i try used jdk18. The problem still occurs
when it does not send error logs in the console. the debugging plugin prompts FAWE to handle an exception for (like PlayerMovementEvent, PlayerTeleportEvent) events [class com.sk89q.worldedit.bukkit.BukkitPlayer cannot be cast to class com.sk89q.worldedit.bukkit.BukkitPlayer (com.sk89q.worldedit.bukkit.BukkitPlayer is in unnamed module of loader org.bukkit.plugin.java.PluginClassLoader @1d716adc; com.sk89q.worldedit.bukkit.BukkitPlayer is in unnamed module of loader org.bukkit.plugin.java.PluginClassLoader @46ca75ff)]
### To Reproduce
i am using Akarin-6be3d57(1.16.5 paper fork)
an error occurred after using jdk17
i just wanted to use it. If thats really Akarin problem. i will try to use Tuinity.
i know though that using this core will not be supported. but i still want to ask
### Expected behaviour
successfully running. console are no any errors or spam
### Screenshots / Videos
_No response_
### Error log (if applicable)
_No response_
### Fawe Debugpaste
Fatal error trying to convert FastAsyncWorldEdit v2.4.7-SNAPSHOT-281;10b852d:com/sk89q/worldedit/internal/util/ErrorReporting.class java.lang.IllegalArgumentException: Unsupported class file major version 61 :|
### Fawe Version
FastAsyncWorldEdit-Bukkit-2.4.7-SNAPSHOT-281
### Checklist
- [X] I have included a Fawe debugpaste.
- [X] I am using the newest build from https://ci.athion.net/job/FastAsyncWorldEdit/ and the issue still persists.
### Anything else?
_No response_
|
non_process
|
java lang illegalargumentexception unsupported class file major version server implementation paper server version describe the bug i used first it prompts unsupported class file major version after i configure dpaper ignorejavaversion true it went from unable to load plugin to fatal error trying class then i try used the problem still occurs when it does not send error logs in the console the debugging plugin prompts fawe to handle an exception for like playermovementevent playerteleportevent events to reproduce i am using akarin paper fork an error occurred after using i just wanted to use it if thats really akarin problem i will try to use tuinity i know though that using this core will not be supported but i still want to ask expected behaviour successfully running console are no any errors or spam screenshots videos no response error log if applicable no response fawe debugpaste fatal error trying to convert fastasyncworldedit snapshot com worldedit internal util errorreporting class java lang illegalargumentexception unsupported class file major version fawe version fastasyncworldedit bukkit snapshot checklist i have included a fawe debugpaste i am using the newest build from and the issue still persists anything else no response
| 0
|
360,579
| 25,296,699,695
|
IssuesEvent
|
2022-11-17 07:25:36
|
WordPress/Documentation-Issue-Tracker
|
https://api.github.com/repos/WordPress/Documentation-Issue-Tracker
|
closed
|
Plugin handbook: Incorrect formatted code snippets on "Determining Plugin and Content Directories" page
|
developer documentation tracking issue plugins
|
## Issue Description
Some code snippet on https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/ are incorrect formatted.
## Example of an incorrect formatted code snippet
<img width="688" alt="Screenshot 2022-11-05 at 12 03 40" src="https://user-images.githubusercontent.com/3323310/200102107-3eef388b-bdd2-4ea2-bb69-ff570ba33482.png">
## URL of the Page with the Issue
- https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/
## Section of Page with the issue
- [ ] [Common Usage](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#common-usage)
- [ ] [Plugins](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#plugins)
- [ ] [Themes](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#themes)
- [ ] [Site Home](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#site-home)
- [ ] [WordPress](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#wordpress)
- [ ] [Multisite](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#multisite)
- [ ] [Constants](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#constants)
## Why is this a problem?
These code snippets might have been created before the existence of the Code Syntax Block.
## Suggested Fix
Using the same Code Syntax Block, as used for working code snippets such as https://developer.wordpress.org/themes/block-themes/converting-a-classic-theme-to-a-block-theme/#adding-block-template-parts-in-classic-themes, should solve this problem.
|
1.0
|
Plugin handbook: Incorrect formatted code snippets on "Determining Plugin and Content Directories" page - ## Issue Description
Some code snippet on https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/ are incorrect formatted.
## Example of an incorrect formatted code snippet
<img width="688" alt="Screenshot 2022-11-05 at 12 03 40" src="https://user-images.githubusercontent.com/3323310/200102107-3eef388b-bdd2-4ea2-bb69-ff570ba33482.png">
## URL of the Page with the Issue
- https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/
## Section of Page with the issue
- [ ] [Common Usage](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#common-usage)
- [ ] [Plugins](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#plugins)
- [ ] [Themes](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#themes)
- [ ] [Site Home](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#site-home)
- [ ] [WordPress](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#wordpress)
- [ ] [Multisite](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#multisite)
- [ ] [Constants](https://developer.wordpress.org/plugins/plugin-basics/determining-plugin-and-content-directories/#constants)
## Why is this a problem?
These code snippets might have been created before the existence of the Code Syntax Block.
## Suggested Fix
Using the same Code Syntax Block, as used for working code snippets such as https://developer.wordpress.org/themes/block-themes/converting-a-classic-theme-to-a-block-theme/#adding-block-template-parts-in-classic-themes, should solve this problem.
|
non_process
|
plugin handbook incorrect formatted code snippets on determining plugin and content directories page issue description some code snippet on are incorrect formatted example of an incorrect formatted code snippet img width alt screenshot at src url of the page with the issue section of page with the issue why is this a problem these code snippets might have been created before the existence of the code syntax block suggested fix using the same code syntax block as used for working code snippets such as should solve this problem
| 0
|
372,237
| 11,011,420,965
|
IssuesEvent
|
2019-12-04 16:16:25
|
siddhi-io/siddhi-store-rdbms
|
https://api.github.com/repos/siddhi-io/siddhi-store-rdbms
|
closed
|
Filter condition fails when more than one str:contains() is present in the condition
|
priority/normal severity/major type/bug
|
**Description:**
$subject
```
@info(name = 'query2')
from CheckStockStream join StockTable
on (
str:contains(StockTable.symbol, 'test') AND
str:contains(StockTable.symbol2, 'test1')
)
select CheckStockStream.symbol as checkSymbol, StockTable.symbol as symbol,
StockTable.volume as volume
insert into OutputStream ;
```
The above query resolves to SQL query as,
```
where symbol LIKE test and symbol2 like %test1%
```
**Affected Product Version:**
siddhi store RDBMS 7.0.x
**OS, DB, other environment details and versions:**
H2
|
1.0
|
Filter condition fails when more than one str:contains() is present in the condition - **Description:**
$subject
```
@info(name = 'query2')
from CheckStockStream join StockTable
on (
str:contains(StockTable.symbol, 'test') AND
str:contains(StockTable.symbol2, 'test1')
)
select CheckStockStream.symbol as checkSymbol, StockTable.symbol as symbol,
StockTable.volume as volume
insert into OutputStream ;
```
The above query resolves to SQL query as,
```
where symbol LIKE test and symbol2 like %test1%
```
**Affected Product Version:**
siddhi store RDBMS 7.0.x
**OS, DB, other environment details and versions:**
H2
|
non_process
|
filter condition fails when more than one str contains is present in the condition description subject info name from checkstockstream join stocktable on str contains stocktable symbol test and str contains stocktable select checkstockstream symbol as checksymbol stocktable symbol as symbol stocktable volume as volume insert into outputstream the above query resolves to sql query as where symbol like test and like affected product version siddhi store rdbms x os db other environment details and versions
| 0
|
83,877
| 3,644,431,662
|
IssuesEvent
|
2016-02-15 09:53:26
|
wp-property/wp-property
|
https://api.github.com/repos/wp-property/wp-property
|
closed
|
Adding Slashes if attribute or property type contain apostrophe
|
priority/high type/bug
|
approximately that was changed after 1.42.2
e.g Mortgagee's
https://usabilitydynamics.uservoice.com/admin/tickets/5415/?q=assignee%3Ame+status%3Aopen
|
1.0
|
Adding Slashes if attribute or property type contain apostrophe - approximately that was changed after 1.42.2
e.g Mortgagee's
https://usabilitydynamics.uservoice.com/admin/tickets/5415/?q=assignee%3Ame+status%3Aopen
|
non_process
|
adding slashes if attribute or property type contain apostrophe approximately that was changed after e g mortgagee s
| 0
|
207,823
| 15,837,533,749
|
IssuesEvent
|
2021-04-06 20:55:51
|
metallb/metallb
|
https://api.github.com/repos/metallb/metallb
|
closed
|
FRR: Can't open configuration file
|
bug protocol/BGP testing
|
Running `docker exec frr vtysh -c "show ip bgp summary"` following https://github.com/metallb/metallb/tree/main/dev-env/bgp#bgp-test-env gives me the following:
```
% Can't open configuration file /etc/frr/vtysh.conf due to 'No such file or directory'.
IPv4 Unicast Summary:
BGP router identifier 172.18.0.5, local AS number 64512 vrf-id 0
BGP table version 2
RIB entries 0, using 0 bytes of memory
Peers 3, using 43 KiB of memory
Neighbor V AS MsgRcvd MsgSent TblVer InQ OutQ Up/Down State/PfxRcd PfxSnt
172.18.0.2 4 64512 118 116 0 0 0 00:57:24 0 0
172.18.0.3 4 64512 118 116 0 0 0 00:57:24 0 0
172.18.0.4 4 64512 118 116 0 0 0 00:57:24 0 0
Total number of neighbors 3
```
As can be seen, the expected output is still shown so FRR seems to function correctly, however when trying to understand how the FRR-based environment worked I saw the error above and thought there was a problem. I think we should either get rid of this error or document that it's fine to see it.
```
# docker exec frr vtysh -c "show ver"
% Can't open configuration file /etc/frr/vtysh.conf due to 'No such file or directory'.
FRRouting 7.5-dev_git (eafeb84d9006).
Copyright 1996-2005 Kunihiro Ishiguro, et al.
configured with:
'--prefix=/usr' '--sbindir=/usr/lib/frr' '--sysconfdir=/etc/frr' '--libdir=/usr/lib' '--localstatedir=/var/run/frr' '--enable-systemd=no' '--enable-rpki' '--enable-vtysh' '--enable-multipath=64' '--enable-vty-group=frrvty' '--enable-user=frr' '--enable-group=frr' 'CC=gcc' 'CXX=g++'
```
|
1.0
|
FRR: Can't open configuration file - Running `docker exec frr vtysh -c "show ip bgp summary"` following https://github.com/metallb/metallb/tree/main/dev-env/bgp#bgp-test-env gives me the following:
```
% Can't open configuration file /etc/frr/vtysh.conf due to 'No such file or directory'.
IPv4 Unicast Summary:
BGP router identifier 172.18.0.5, local AS number 64512 vrf-id 0
BGP table version 2
RIB entries 0, using 0 bytes of memory
Peers 3, using 43 KiB of memory
Neighbor V AS MsgRcvd MsgSent TblVer InQ OutQ Up/Down State/PfxRcd PfxSnt
172.18.0.2 4 64512 118 116 0 0 0 00:57:24 0 0
172.18.0.3 4 64512 118 116 0 0 0 00:57:24 0 0
172.18.0.4 4 64512 118 116 0 0 0 00:57:24 0 0
Total number of neighbors 3
```
As can be seen, the expected output is still shown so FRR seems to function correctly, however when trying to understand how the FRR-based environment worked I saw the error above and thought there was a problem. I think we should either get rid of this error or document that it's fine to see it.
```
# docker exec frr vtysh -c "show ver"
% Can't open configuration file /etc/frr/vtysh.conf due to 'No such file or directory'.
FRRouting 7.5-dev_git (eafeb84d9006).
Copyright 1996-2005 Kunihiro Ishiguro, et al.
configured with:
'--prefix=/usr' '--sbindir=/usr/lib/frr' '--sysconfdir=/etc/frr' '--libdir=/usr/lib' '--localstatedir=/var/run/frr' '--enable-systemd=no' '--enable-rpki' '--enable-vtysh' '--enable-multipath=64' '--enable-vty-group=frrvty' '--enable-user=frr' '--enable-group=frr' 'CC=gcc' 'CXX=g++'
```
|
non_process
|
frr can t open configuration file running docker exec frr vtysh c show ip bgp summary following gives me the following can t open configuration file etc frr vtysh conf due to no such file or directory unicast summary bgp router identifier local as number vrf id bgp table version rib entries using bytes of memory peers using kib of memory neighbor v as msgrcvd msgsent tblver inq outq up down state pfxrcd pfxsnt total number of neighbors as can be seen the expected output is still shown so frr seems to function correctly however when trying to understand how the frr based environment worked i saw the error above and thought there was a problem i think we should either get rid of this error or document that it s fine to see it docker exec frr vtysh c show ver can t open configuration file etc frr vtysh conf due to no such file or directory frrouting dev git copyright kunihiro ishiguro et al configured with prefix usr sbindir usr lib frr sysconfdir etc frr libdir usr lib localstatedir var run frr enable systemd no enable rpki enable vtysh enable multipath enable vty group frrvty enable user frr enable group frr cc gcc cxx g
| 0
|
12,434
| 14,930,442,930
|
IssuesEvent
|
2021-01-25 02:57:38
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Points on a line segment are not excluded from the difference algorithm result
|
Bug Feedback Processing stale
|
1. Let's take a line layer
1. Use the "extract vertices" algorithm to generate a point layer from it
1. Add some more points: some snapped to the line segment, others away from the line
1. Run the difference algorithm with point layer as input and line one as overlay
1. Points that I add away from the lines are reported (good!) as well as those that are placed on the segment (bug imho). It will exclude only the points that are at the vertices of the lines.
It's not imho a matter of snapping because:
* i zoomed in to 10000:1 (in m) and the point and line features were still shown aligned
* If I add a new point and I edit the line to snap to it, that point is not reported anymore in the "difference" layer since it's now a vertex
* If I extend a line using the "trim/extend" tool, the point that was previously at the vertex I extend is now reported in the "difference" layer (considered as not on the line anymore).
So is a point considered as a line only if it's at the line vertex position?
master c52a6e711a on Win10
|
1.0
|
Points on a line segment are not excluded from the difference algorithm result - 1. Let's take a line layer
1. Use the "extract vertices" algorithm to generate a point layer from it
1. Add some more points: some snapped to the line segment, others away from the line
1. Run the difference algorithm with point layer as input and line one as overlay
1. Points that I add away from the lines are reported (good!) as well as those that are placed on the segment (bug imho). It will exclude only the points that are at the vertices of the lines.
It's not imho a matter of snapping because:
* i zoomed in to 10000:1 (in m) and the point and line features were still shown aligned
* If I add a new point and I edit the line to snap to it, that point is not reported anymore in the "difference" layer since it's now a vertex
* If I extend a line using the "trim/extend" tool, the point that was previously at the vertex I extend is now reported in the "difference" layer (considered as not on the line anymore).
So is a point considered as a line only if it's at the line vertex position?
master c52a6e711a on Win10
|
process
|
points on a line segment are not excluded from the difference algorithm result let s take a line layer use the extract vertices algorithm to generate a point layer from it add some more points some snapped to the line segment others away from the line run the difference algorithm with point layer as input and line one as overlay points that i add away from the lines are reported good as well as those that are placed on the segment bug imho it will exclude only the points that are at the vertices of the lines it s not imho a matter of snapping because i zoomed in to in m and the point and line features were still shown aligned if i add a new point and i edit the line to snap to it that point is not reported anymore in the difference layer since it s now a vertex if i extend a line using the trim extend tool the point that was previously at the vertex i extend is now reported in the difference layer considered as not on the line anymore so is a point considered as a line only if it s at the line vertex position master on
| 1
|
1,977
| 3,025,918,450
|
IssuesEvent
|
2015-08-03 12:05:56
|
lionheart/openradar-mirror
|
https://api.github.com/repos/lionheart/openradar-mirror
|
opened
|
20421800: Inspectables should sort in file counterpart order for category headers
|
classification:ui/usability reproducible:always status:open
|
#### Description
Summary:
If a category header (with a plus sign in its name) declares additional inspectables on the class, those inspectables appear first in the Attributes inspector, before the inspectables that are declared in the main @interface. Headers should not be sorted in strict ASCII order; consideration should be given to delimiters such as _ and +. Observe how the Assistant editor arranges counterparts so that the main header is always first.
Steps to Reproduce:
1. In a file “Building.h”, declare a class Building to have the inspectable properties “houseNumber” and “streetName”.
2. In a separate file “Building+NYCAdditions.h”, declare a category Buliding (Flat) to have the additional inspectable properties “suiteNumber” and “numberOfRooms”.
3. Insert an instance of Building into a storyboard and switch to the Attributes inspector.
Expected Results:
The “House Number” and “Street Name” fields should come before the “Suite Number” and “Number Of Rooms” fields. “Building+NYCAdditions.h” should be considered to come after “Building.h”, just as it does in the Counterparts section of the assistant menu. (Also note the poor capitalization of “Of”.)
Actual Results:
The “House Number” and “Street Name” fields come after the “Suite Number” and “Number Of Rooms” fields, because “Building+NYCAdditions.h” comes before “Building.h” in ASCII sort order.
Version:
Xcode 6.3b3 (6D554n)
OS X 10.10.2 (14C109)
Notes:
Configuration:
Xcode 6.3b3 (6D554n)
OS X 10.10.2 (14C109)
Xcode 6.3b3 (6D554n)
OS X 10.10.2 (14C109)
Attachments:
-
Product Version: 6D554n
Created: 2015-04-03T22:10:38.044412
Originated: 2015-04-03T00:00:00
Open Radar Link: http://www.openradar.me/20421800
|
True
|
20421800: Inspectables should sort in file counterpart order for category headers - #### Description
Summary:
If a category header (with a plus sign in its name) declares additional inspectables on the class, those inspectables appear first in the Attributes inspector, before the inspectables that are declared in the main @interface. Headers should not be sorted in strict ASCII order; consideration should be given to delimiters such as _ and +. Observe how the Assistant editor arranges counterparts so that the main header is always first.
Steps to Reproduce:
1. In a file “Building.h”, declare a class Building to have the inspectable properties “houseNumber” and “streetName”.
2. In a separate file “Building+NYCAdditions.h”, declare a category Buliding (Flat) to have the additional inspectable properties “suiteNumber” and “numberOfRooms”.
3. Insert an instance of Building into a storyboard and switch to the Attributes inspector.
Expected Results:
The “House Number” and “Street Name” fields should come before the “Suite Number” and “Number Of Rooms” fields. “Building+NYCAdditions.h” should be considered to come after “Building.h”, just as it does in the Counterparts section of the assistant menu. (Also note the poor capitalization of “Of”.)
Actual Results:
The “House Number” and “Street Name” fields come after the “Suite Number” and “Number Of Rooms” fields, because “Building+NYCAdditions.h” comes before “Building.h” in ASCII sort order.
Version:
Xcode 6.3b3 (6D554n)
OS X 10.10.2 (14C109)
Notes:
Configuration:
Xcode 6.3b3 (6D554n)
OS X 10.10.2 (14C109)
Xcode 6.3b3 (6D554n)
OS X 10.10.2 (14C109)
Attachments:
-
Product Version: 6D554n
Created: 2015-04-03T22:10:38.044412
Originated: 2015-04-03T00:00:00
Open Radar Link: http://www.openradar.me/20421800
|
non_process
|
inspectables should sort in file counterpart order for category headers description summary if a category header with a plus sign in its name declares additional inspectables on the class those inspectables appear first in the attributes inspector before the inspectables that are declared in the main interface headers should not be sorted in strict ascii order consideration should be given to delimiters such as and observe how the assistant editor arranges counterparts so that the main header is always first steps to reproduce in a file “building h” declare a class building to have the inspectable properties “housenumber” and “streetname” in a separate file “building nycadditions h” declare a category buliding flat to have the additional inspectable properties “suitenumber” and “numberofrooms” insert an instance of building into a storyboard and switch to the attributes inspector expected results the “house number” and “street name” fields should come before the “suite number” and “number of rooms” fields “building nycadditions h” should be considered to come after “building h” just as it does in the counterparts section of the assistant menu also note the poor capitalization of “of” actual results the “house number” and “street name” fields come after the “suite number” and “number of rooms” fields because “building nycadditions h” comes before “building h” in ascii sort order version xcode os x notes configuration xcode os x xcode os x attachments product version created originated open radar link
| 0
|
300,457
| 22,679,784,397
|
IssuesEvent
|
2022-07-04 08:51:55
|
apple/swift-docc
|
https://api.github.com/repos/apple/swift-docc
|
closed
|
Adopt new GitHub Issues Forms syntax
|
documentation good first issue
|
GitHub has support for a new Forms-style of GitHub Issues that seems like a significant improvement over the default markdown templating system: https://github.blog/changelog/2021-06-23-issues-forms-beta-for-public-repositories/.
There's some documentation on the syntax here: https://docs.github.com/en/communities/using-templates-to-encourage-useful-issues-and-pull-requests/syntax-for-issue-forms.
We should migrate our existing templates over to the new format to make it easier for contributors to file bug reports and feature requests.
|
1.0
|
Adopt new GitHub Issues Forms syntax - GitHub has support for a new Forms-style of GitHub Issues that seems like a significant improvement over the default markdown templating system: https://github.blog/changelog/2021-06-23-issues-forms-beta-for-public-repositories/.
There's some documentation on the syntax here: https://docs.github.com/en/communities/using-templates-to-encourage-useful-issues-and-pull-requests/syntax-for-issue-forms.
We should migrate our existing templates over to the new format to make it easier for contributors to file bug reports and feature requests.
|
non_process
|
adopt new github issues forms syntax github has support for a new forms style of github issues that seems like a significant improvement over the default markdown templating system there s some documentation on the syntax here we should migrate our existing templates over to the new format to make it easier for contributors to file bug reports and feature requests
| 0
|
102,072
| 12,742,870,954
|
IssuesEvent
|
2020-06-26 09:17:38
|
creme-ml/creme
|
https://api.github.com/repos/creme-ml/creme
|
closed
|
Online hierarchical clustering approximations
|
Status: Design phase Type: Feature
|
There's an nice algorithm in [this](https://arxiv.org/pdf/1909.09667.pdf) paper by Google called Online Top Down. It's a relatively simple clustering algorithm. The only issue is that it grows linearly with the amount of observations it is trained. It would be nice to introduce some forgetting mechanism so to focus on a recent window observations, an therefore limit the size of the tree.
There are other online hierarchical clustering algorithms, such as BIRCH and PEARCH. One thing we have to think about is what we mean by clusters, and how to assign cluster numbers. Indeed things are a bit different online because the structures are dynamic. If the clustering structure is a tree, then we can assign strings of 0s (left) and 1s (left) to an observation. For instance an assignment could be `01100011`. Observations that are close together would therefore have similar strings. Also we could assign colours to these strings for visualization.
Food for thought! Feel free to ping me if you want to talk about it further. I've contacted the researchers from Google to see what they think about the forgetting mechanism. If I don't write their answer here, then assume that they haven't responded yet.
|
1.0
|
Online hierarchical clustering approximations - There's an nice algorithm in [this](https://arxiv.org/pdf/1909.09667.pdf) paper by Google called Online Top Down. It's a relatively simple clustering algorithm. The only issue is that it grows linearly with the amount of observations it is trained. It would be nice to introduce some forgetting mechanism so to focus on a recent window observations, an therefore limit the size of the tree.
There are other online hierarchical clustering algorithms, such as BIRCH and PEARCH. One thing we have to think about is what we mean by clusters, and how to assign cluster numbers. Indeed things are a bit different online because the structures are dynamic. If the clustering structure is a tree, then we can assign strings of 0s (left) and 1s (left) to an observation. For instance an assignment could be `01100011`. Observations that are close together would therefore have similar strings. Also we could assign colours to these strings for visualization.
Food for thought! Feel free to ping me if you want to talk about it further. I've contacted the researchers from Google to see what they think about the forgetting mechanism. If I don't write their answer here, then assume that they haven't responded yet.
|
non_process
|
online hierarchical clustering approximations there s an nice algorithm in paper by google called online top down it s a relatively simple clustering algorithm the only issue is that it grows linearly with the amount of observations it is trained it would be nice to introduce some forgetting mechanism so to focus on a recent window observations an therefore limit the size of the tree there are other online hierarchical clustering algorithms such as birch and pearch one thing we have to think about is what we mean by clusters and how to assign cluster numbers indeed things are a bit different online because the structures are dynamic if the clustering structure is a tree then we can assign strings of left and left to an observation for instance an assignment could be observations that are close together would therefore have similar strings also we could assign colours to these strings for visualization food for thought feel free to ping me if you want to talk about it further i ve contacted the researchers from google to see what they think about the forgetting mechanism if i don t write their answer here then assume that they haven t responded yet
| 0
|
54,511
| 6,824,465,991
|
IssuesEvent
|
2017-11-08 06:22:11
|
nawissor/stomer
|
https://api.github.com/repos/nawissor/stomer
|
closed
|
Bookings Button
|
Design in progress Navigation
|
Create a book now button below every section of the special occasions page to automatically send a booking request to the person that handles the bookings.
|
1.0
|
Bookings Button - Create a book now button below every section of the special occasions page to automatically send a booking request to the person that handles the bookings.
|
non_process
|
bookings button create a book now button below every section of the special occasions page to automatically send a booking request to the person that handles the bookings
| 0
|
22,023
| 30,535,168,501
|
IssuesEvent
|
2023-07-19 16:48:03
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
[MLv2] `join-condition-lhs-columns` should have a way to filter out columns from the current join
|
.Backend .metabase-lib .Team/QueryProcessor :hammer_and_wrench:
|
From @kulyk:
> `join-condition-lhs-columns` returns columns from the source table + joined tables as expected. Now imagine I have a question joining Orders with Products. If I wanted to change an LHS column, the method would give me product columns too which is not expected. Would it make sense to pass an optional `join` argument to it to filter out unexpected column? Maybe add another method?
I think an option `join` argument makes sense, because it makes no sense to try to join a join on itself e.g.
```sql
LEFT JOIN my_table ON my_table.id = my_table.id`
```
|
1.0
|
[MLv2] `join-condition-lhs-columns` should have a way to filter out columns from the current join - From @kulyk:
> `join-condition-lhs-columns` returns columns from the source table + joined tables as expected. Now imagine I have a question joining Orders with Products. If I wanted to change an LHS column, the method would give me product columns too which is not expected. Would it make sense to pass an optional `join` argument to it to filter out unexpected column? Maybe add another method?
I think an option `join` argument makes sense, because it makes no sense to try to join a join on itself e.g.
```sql
LEFT JOIN my_table ON my_table.id = my_table.id`
```
|
process
|
join condition lhs columns should have a way to filter out columns from the current join from kulyk join condition lhs columns returns columns from the source table joined tables as expected now imagine i have a question joining orders with products if i wanted to change an lhs column the method would give me product columns too which is not expected would it make sense to pass an optional join argument to it to filter out unexpected column maybe add another method i think an option join argument makes sense because it makes no sense to try to join a join on itself e g sql left join my table on my table id my table id
| 1
|
16,630
| 21,704,112,513
|
IssuesEvent
|
2022-05-10 08:04:35
|
camunda/zeebe
|
https://api.github.com/repos/camunda/zeebe
|
closed
|
Add test implementations for the exporter-api
|
kind/toil team/process-automation area/test
|
**Description**
In order to test exporters with a simplified set up, it's useful to just create your own exporter instance, configure it, pass it a controllable context, etc. This avoids having to setup a whole broker around your exporter just to test it.
This is achievable using mocks, however as the usage of these mocks grow, it means any time the interface changes we have to update all the mocks everywhere. It's much easier to refactor these things, and to write tests, with controlled implementations.
The goal would be to add implementations of the all the interfaces in `exporter-api`, minus the `Exporter` interface itself.
I would propose adding a new special purpose module for this to move away from having catch all modules (e.g. util, test-util), but I'm happy to comply and keep these utilities there.
|
1.0
|
Add test implementations for the exporter-api - **Description**
In order to test exporters with a simplified set up, it's useful to just create your own exporter instance, configure it, pass it a controllable context, etc. This avoids having to setup a whole broker around your exporter just to test it.
This is achievable using mocks, however as the usage of these mocks grow, it means any time the interface changes we have to update all the mocks everywhere. It's much easier to refactor these things, and to write tests, with controlled implementations.
The goal would be to add implementations of the all the interfaces in `exporter-api`, minus the `Exporter` interface itself.
I would propose adding a new special purpose module for this to move away from having catch all modules (e.g. util, test-util), but I'm happy to comply and keep these utilities there.
|
process
|
add test implementations for the exporter api description in order to test exporters with a simplified set up it s useful to just create your own exporter instance configure it pass it a controllable context etc this avoids having to setup a whole broker around your exporter just to test it this is achievable using mocks however as the usage of these mocks grow it means any time the interface changes we have to update all the mocks everywhere it s much easier to refactor these things and to write tests with controlled implementations the goal would be to add implementations of the all the interfaces in exporter api minus the exporter interface itself i would propose adding a new special purpose module for this to move away from having catch all modules e g util test util but i m happy to comply and keep these utilities there
| 1
|
25,792
| 5,199,597,955
|
IssuesEvent
|
2017-01-23 21:19:58
|
esl/jerboa
|
https://api.github.com/repos/esl/jerboa
|
closed
|
List of supported STUN methods
|
documentation
|
To be fair with anyone using our library, we should keep a list of methods supported by it. It could be a checklist, so that we know what is left to implement.
|
1.0
|
List of supported STUN methods - To be fair with anyone using our library, we should keep a list of methods supported by it. It could be a checklist, so that we know what is left to implement.
|
non_process
|
list of supported stun methods to be fair with anyone using our library we should keep a list of methods supported by it it could be a checklist so that we know what is left to implement
| 0
|
1,758
| 4,462,182,630
|
IssuesEvent
|
2016-08-24 09:00:04
|
opentrials/opentrials
|
https://api.github.com/repos/opentrials/opentrials
|
reopened
|
Improved deduplication system
|
3. In Development Processors
|
# Description
Based on current experience of data deduplication I could suggest some implementation improvements and strategies to use. I'll write it as tasks for the reading convenience but it's of course for a discussion (cc @vitorbaptista @pwalsh @benmeg).
Ideas highlights:
- use only identifiers for automatic matching
- improve querying duplicates implementation (facts system)
- use curators for matching records based on pre-processed similarity
# Tasks
- [ ] instead of `trials.facts` use `records.identifiers` (or indexable analogue) to search for the same trial records. Current implementation based on "stateful" `trials.facts` field should be replaced because of maintainability problems (e.g. after hra removing facts should be cleaned from hra strings etc).
- [ ] create processor and database table for calculation trials similarity. We could start with scientific titles match then follow with probabilistic and machine learning methods. As output table like `trial_id_1, trials_id_2, method, similarity, is_marked_as_matching, updated_by_user_id` (not real naming just to show the idea). Could be split for 2 tables etc.
- [ ] create processor for reading similarity table to dynamically merge/unmerge records based on user marked matches
- [ ] create some redirects table because trial pages could be merged at any time (saving history of dedup process is a thing to think about)
---
- [ ] move UI tasks to different issue after discussion
- [ ] create UI for similarity table - as similar records on trial page with ability to mark it as duplicate
- [ ] create UI for similarity table - as a list of similar record pairs ordered by similarity to match by users/curators. So some hired curators or enthusiasts could highly effective processing this list from top to bottom.
|
1.0
|
Improved deduplication system - # Description
Based on current experience of data deduplication I could suggest some implementation improvements and strategies to use. I'll write it as tasks for the reading convenience but it's of course for a discussion (cc @vitorbaptista @pwalsh @benmeg).
Ideas highlights:
- use only identifiers for automatic matching
- improve querying duplicates implementation (facts system)
- use curators for matching records based on pre-processed similarity
# Tasks
- [ ] instead of `trials.facts` use `records.identifiers` (or indexable analogue) to search for the same trial records. Current implementation based on "stateful" `trials.facts` field should be replaced because of maintainability problems (e.g. after hra removing facts should be cleaned from hra strings etc).
- [ ] create processor and database table for calculation trials similarity. We could start with scientific titles match then follow with probabilistic and machine learning methods. As output table like `trial_id_1, trials_id_2, method, similarity, is_marked_as_matching, updated_by_user_id` (not real naming just to show the idea). Could be split for 2 tables etc.
- [ ] create processor for reading similarity table to dynamically merge/unmerge records based on user marked matches
- [ ] create some redirects table because trial pages could be merged at any time (saving history of dedup process is a thing to think about)
---
- [ ] move UI tasks to different issue after discussion
- [ ] create UI for similarity table - as similar records on trial page with ability to mark it as duplicate
- [ ] create UI for similarity table - as a list of similar record pairs ordered by similarity to match by users/curators. So some hired curators or enthusiasts could highly effective processing this list from top to bottom.
|
process
|
improved deduplication system description based on current experience of data deduplication i could suggest some implementation improvements and strategies to use i ll write it as tasks for the reading convenience but it s of course for a discussion cc vitorbaptista pwalsh benmeg ideas highlights use only identifiers for automatic matching improve querying duplicates implementation facts system use curators for matching records based on pre processed similarity tasks instead of trials facts use records identifiers or indexable analogue to search for the same trial records current implementation based on stateful trials facts field should be replaced because of maintainability problems e g after hra removing facts should be cleaned from hra strings etc create processor and database table for calculation trials similarity we could start with scientific titles match then follow with probabilistic and machine learning methods as output table like trial id trials id method similarity is marked as matching updated by user id not real naming just to show the idea could be split for tables etc create processor for reading similarity table to dynamically merge unmerge records based on user marked matches create some redirects table because trial pages could be merged at any time saving history of dedup process is a thing to think about move ui tasks to different issue after discussion create ui for similarity table as similar records on trial page with ability to mark it as duplicate create ui for similarity table as a list of similar record pairs ordered by similarity to match by users curators so some hired curators or enthusiasts could highly effective processing this list from top to bottom
| 1
|
159,161
| 13,756,887,369
|
IssuesEvent
|
2020-10-06 20:40:38
|
XDavid1999/PacketService
|
https://api.github.com/repos/XDavid1999/PacketService
|
closed
|
Actualización del archivo README
|
documentation enhancement update
|
Se actualizará el archivo de descripción del proyecto incluyendo una descripción más específica del mismo.
|
1.0
|
Actualización del archivo README - Se actualizará el archivo de descripción del proyecto incluyendo una descripción más específica del mismo.
|
non_process
|
actualización del archivo readme se actualizará el archivo de descripción del proyecto incluyendo una descripción más específica del mismo
| 0
|
10,513
| 13,284,011,319
|
IssuesEvent
|
2020-08-24 05:09:01
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
opened
|
copr: fast-path Chunk decoding
|
sig/coprocessor type/enhancement
|
## Development Task
After Chunk format is implemented [in this RFC](https://github.com/tikv/rfcs/pull/43), we can now directly construct `ChunkedVec` from Chunk format instead of one-by-one appending elements.
Currently, decoding is done through [this structure](https://github.com/tikv/tikv/blob/master/components/tidb_query_datatype/src/codec/batch/lazy_column.rs). We could do some optimization on this process. If `logical_rows` is identical, we could directly copy what's inside `raw_vec` to `ChunkedVec`, which may reduce overhead on decoding data from TiDB.
|
1.0
|
copr: fast-path Chunk decoding - ## Development Task
After Chunk format is implemented [in this RFC](https://github.com/tikv/rfcs/pull/43), we can now directly construct `ChunkedVec` from Chunk format instead of one-by-one appending elements.
Currently, decoding is done through [this structure](https://github.com/tikv/tikv/blob/master/components/tidb_query_datatype/src/codec/batch/lazy_column.rs). We could do some optimization on this process. If `logical_rows` is identical, we could directly copy what's inside `raw_vec` to `ChunkedVec`, which may reduce overhead on decoding data from TiDB.
|
process
|
copr fast path chunk decoding development task after chunk format is implemented we can now directly construct chunkedvec from chunk format instead of one by one appending elements currently decoding is done through we could do some optimization on this process if logical rows is identical we could directly copy what s inside raw vec to chunkedvec which may reduce overhead on decoding data from tidb
| 1
|
44,200
| 11,405,570,063
|
IssuesEvent
|
2020-01-31 12:25:47
|
precice/precice
|
https://api.github.com/repos/precice/precice
|
closed
|
Deliver solver dummies as examples
|
building enhancement
|
Alongside preCICE, it would be nice that we also install our solver dummies code (and the corresponding preCICE configuration file) as examples.
This would avoid the cases where one has to separately obtain th preCICE sources to:
- build the C++/C/Fortran solver dummy
- run a separately-provided solver dummy without duplicating the `precice-config.xml`
|
1.0
|
Deliver solver dummies as examples - Alongside preCICE, it would be nice that we also install our solver dummies code (and the corresponding preCICE configuration file) as examples.
This would avoid the cases where one has to separately obtain th preCICE sources to:
- build the C++/C/Fortran solver dummy
- run a separately-provided solver dummy without duplicating the `precice-config.xml`
|
non_process
|
deliver solver dummies as examples alongside precice it would be nice that we also install our solver dummies code and the corresponding precice configuration file as examples this would avoid the cases where one has to separately obtain th precice sources to build the c c fortran solver dummy run a separately provided solver dummy without duplicating the precice config xml
| 0
|
5,443
| 8,306,106,412
|
IssuesEvent
|
2018-09-22 15:13:31
|
sojamo/controlp5
|
https://api.github.com/repos/sojamo/controlp5
|
closed
|
controlP5 and opengl incompatibility. how to solve?
|
processing-related wontfix
|
Hello. Basically I figured out what was making the backspace not work properly on a textfield on my code. If i use opengl which I need, I'm not able to keep pressing backspace to delete the text all at once. Instead I need to press and release, press and release etc... Is there any workaround? Everything else works fine (on my main code) like getting the strings from the textfields by pressing buttons etc.
Not functional with opengl
```
import controlP5.*;
import processing.opengl.*;
ControlP5 caixas;
Textfield selected;
void setup() {
size(600, 400,OPENGL);
caixas = new ControlP5(this);
selected = caixas.addTextfield("Name").setPosition(20, 200).setSize(200, 40)
.setAutoClear(false)
.setFont(createFont("Calibri", 17))
.setColorActive(color(255));
}
void draw() {
background(0);
}
```
Fully functional without opengl:
```
import controlP5.*;
ControlP5 caixas;
Textfield selected;
void setup() {
size(600, 400);
caixas = new ControlP5(this);
selected = caixas.addTextfield("Name").setPosition(20, 200).setSize(200, 40)
.setAutoClear(false)
.setFont(createFont("Calibri", 17))
.setColorActive(color(255));
}
void draw() {
background(0);
}
```
|
1.0
|
controlP5 and opengl incompatibility. how to solve? - Hello. Basically I figured out what was making the backspace not work properly on a textfield on my code. If i use opengl which I need, I'm not able to keep pressing backspace to delete the text all at once. Instead I need to press and release, press and release etc... Is there any workaround? Everything else works fine (on my main code) like getting the strings from the textfields by pressing buttons etc.
Not functional with opengl
```
import controlP5.*;
import processing.opengl.*;
ControlP5 caixas;
Textfield selected;
void setup() {
size(600, 400,OPENGL);
caixas = new ControlP5(this);
selected = caixas.addTextfield("Name").setPosition(20, 200).setSize(200, 40)
.setAutoClear(false)
.setFont(createFont("Calibri", 17))
.setColorActive(color(255));
}
void draw() {
background(0);
}
```
Fully functional without opengl:
```
import controlP5.*;
ControlP5 caixas;
Textfield selected;
void setup() {
size(600, 400);
caixas = new ControlP5(this);
selected = caixas.addTextfield("Name").setPosition(20, 200).setSize(200, 40)
.setAutoClear(false)
.setFont(createFont("Calibri", 17))
.setColorActive(color(255));
}
void draw() {
background(0);
}
```
|
process
|
and opengl incompatibility how to solve hello basically i figured out what was making the backspace not work properly on a textfield on my code if i use opengl which i need i m not able to keep pressing backspace to delete the text all at once instead i need to press and release press and release etc is there any workaround everything else works fine on my main code like getting the strings from the textfields by pressing buttons etc not functional with opengl import import processing opengl caixas textfield selected void setup size opengl caixas new this selected caixas addtextfield name setposition setsize setautoclear false setfont createfont calibri setcoloractive color void draw background fully functional without opengl import caixas textfield selected void setup size caixas new this selected caixas addtextfield name setposition setsize setautoclear false setfont createfont calibri setcoloractive color void draw background
| 1
|
781,697
| 27,446,374,606
|
IssuesEvent
|
2023-03-02 14:33:07
|
magento/magento2
|
https://api.github.com/repos/magento/magento2
|
closed
|
[Issue] Only Update Indexer Mode When Required
|
Issue: Confirmed Component: Admin Reproduced on 2.4.x Progress: PR in progress Priority: P3 Area: Framework
|
This issue is automatically created based on existing pull request: magento/magento2#36422: Only Update Indexer Mode When Required
---------
### Description (*)
Updates the indexer mode change mass actions to only change modes when required i.e. when the current index mode is different to the one being applied. This prevents unnecessary trips to the database.
Also updates the messaging displayed to the admin to inform them how many indexers were updated and how many were skipped thanks to this change.
Finally, replaces the deprecated `addSuccess()` function with the non-deprecated `addSuccessMessage()` function for the message manager class when displaying the changed messages to the admin.
### Manual testing scenarios (*)
1. Log into the admin
2. Set any number of indexers to be `Update on Schedule`
3. Observe the success message to confirm the correct number of indexers were updated versus skipped, based on their original mode
### Contribution checklist (*)
- [x] Pull request has a meaningful description of its purpose
- [x] All commits are accompanied by meaningful commit messages
- [ ] All new or changed code is covered with unit/integration tests (if applicable)
- [x] README.md files for modified modules are updated and included in the pull request if any [README.md predefined sections](https://github.com/magento/devdocs/wiki/Magento-module-README.md) require an update
- [ ] All automated tests passed successfully (all builds are green)
|
1.0
|
[Issue] Only Update Indexer Mode When Required - This issue is automatically created based on existing pull request: magento/magento2#36422: Only Update Indexer Mode When Required
---------
### Description (*)
Updates the indexer mode change mass actions to only change modes when required i.e. when the current index mode is different to the one being applied. This prevents unnecessary trips to the database.
Also updates the messaging displayed to the admin to inform them how many indexers were updated and how many were skipped thanks to this change.
Finally, replaces the deprecated `addSuccess()` function with the non-deprecated `addSuccessMessage()` function for the message manager class when displaying the changed messages to the admin.
### Manual testing scenarios (*)
1. Log into the admin
2. Set any number of indexers to be `Update on Schedule`
3. Observe the success message to confirm the correct number of indexers were updated versus skipped, based on their original mode
### Contribution checklist (*)
- [x] Pull request has a meaningful description of its purpose
- [x] All commits are accompanied by meaningful commit messages
- [ ] All new or changed code is covered with unit/integration tests (if applicable)
- [x] README.md files for modified modules are updated and included in the pull request if any [README.md predefined sections](https://github.com/magento/devdocs/wiki/Magento-module-README.md) require an update
- [ ] All automated tests passed successfully (all builds are green)
|
non_process
|
only update indexer mode when required this issue is automatically created based on existing pull request magento only update indexer mode when required description updates the indexer mode change mass actions to only change modes when required i e when the current index mode is different to the one being applied this prevents unnecessary trips to the database also updates the messaging displayed to the admin to inform them how many indexers were updated and how many were skipped thanks to this change finally replaces the deprecated addsuccess function with the non deprecated addsuccessmessage function for the message manager class when displaying the changed messages to the admin manual testing scenarios log into the admin set any number of indexers to be update on schedule observe the success message to confirm the correct number of indexers were updated versus skipped based on their original mode contribution checklist pull request has a meaningful description of its purpose all commits are accompanied by meaningful commit messages all new or changed code is covered with unit integration tests if applicable readme md files for modified modules are updated and included in the pull request if any require an update all automated tests passed successfully all builds are green
| 0
|
10,732
| 13,532,589,562
|
IssuesEvent
|
2020-09-16 00:34:26
|
timberio/vector
|
https://api.github.com/repos/timberio/vector
|
opened
|
Map fields in sinks - Automatically (CIM)
|
Epic domain: processing type: enhancement
|
A key objective for Vector is to be the easiest tool in it's space to setup and operate. In order to achieve this we must reduce the amount of configuration and decisions a user makes when setting up and using Vector. A particularly hairy aspect of this is field mapping. Because Vector integrates with disparate sources and sinks, it requires some level of field mapping.
A good example of this is the `datadog_logs` sink. Datadog defines [reserved fields](https://docs.datadoghq.com/logs/processing/#reserved-attributes) that the user should take advantage of in order to have the best Datadog experience. Currently, Vector [performs limited mapping](https://github.com/timberio/vector/blob/a391434e97a078f6ea92691bba76f9af91ad1866/src/sinks/datadog/logs.rs#L74) based on [global schema settings](https://vector.dev/docs/reference/global-options/#log_schema). This approach is flawed since this global schema is not consistent across all of Vector's sources. The data coming in from the `http` source is likely to be different than the data coming in from the `journald` source. Instead of trying to shoehorn events from disparate sources into a single shape, Vector should allow the event to take any shape and use the source context to make assumptions about where data is located within the event. These assumptions can be used to reduce the configuration and decisions a user makes when configuring Vector.
To go back to the `datadog_logs` sink, Vector can automatically map Datadog's `service` field to the Journald `_SYSTEMD_UNIT` field when setting up a `journald` -> `datadog_logs` pipeline. And while `datadog_logs` is used in this example, this problem is widespread throughout most of Vector's sinks.
Finally, it's worth noting that a defining principle of Vector is to be standard agnostic. This is critical to Vector's adoption since it allows users adopt Vector for legacy and future pipelines. Vector should be able to support the [ECS schema](https://www.elastic.co/guide/en/ecs/current/index.html) as well as the [OpenTelemtry schema](https://github.com/open-telemetry/opentelemetry-proto/blob/master/opentelemetry/proto/logs/v1/logs.proto).
# User Needs
As a user, I need the...
- ABILITY to automatically map fields in the `datadog_logs` sink, when the shape of my data is known.
- ABILITY to manually map fields in the `datadog_logs` sink, when the shape of my data is arbitrary/custom.
- more to come...
# Prior Art
- [Splunk CIM](https://docs.splunk.com/Documentation/CIM/4.16.0/User/Overview) - It turns out Splunk has solved this exact problem with CIMs.
# History
## Field metadata
The concept of event/field metadata has been discussed before. When data is ingested into Vector we should store metadata about that event (the source type, source id, known schema, etc). This was discussed in [issue 1448](https://github.com/timberio/vector/issues/1448). We landed on [adding this metadata into the event itself](https://github.com/timberio/vector/issues/1448#issuecomment-570434900) and allowing users to black/whitelist fields when encoding them. We felt this was much simpler than a concept of private fields that are separate from public fields. This kicked off a slew of changes ([#2272](https://github.com/timberio/vector/issues/2272), [#2273](https://github.com/timberio/vector/issues/2273), [#2147](https://github.com/timberio/vector/issues/2147), [#2269](https://github.com/timberio/vector/issues/2269), [#2270](https://github.com/timberio/vector/issues/2270), [#2271](https://github.com/timberio/vector/issues/2271), [#1150](https://github.com/timberio/vector/issues/1150), [#2477](https://github.com/timberio/vector/issues/2477), [#2480](https://github.com/timberio/vector/issues/2480)) that attempted to add more context to events as they were ingested, but ended up causing unexpected problems. For example, [#2717](https://github.com/timberio/vector/issues/2717) was caused due to the `source` and `source_type` fields that were added as metadata. Additionally, we've heard many users complain about these fields. They were suprised to see these fields in their downstream storages, which suggests that encoding this data at the sink level should be opt-in.
## Schemas
There are a handful of popular log schemas. Such as [ECS](https://www.elastic.co/guide/en/ecs/current/index.html), [OpenTelemetry](https://github.com/open-telemetry/opentelemetry-proto/blob/master/opentelemetry/proto/logs/v1/logs.proto), [GELF](https://docs.graylog.org/en/3.2/pages/gelf.html), and [Syslog](https://tools.ietf.org/html/rfc5424). Vector plans to implement sources for all of these schemas. Given that these are documented schemas, how can Vector attach metadata to events and use the known properties of these schemas to perform mapping?
## Remap syntax
[#3134](https://github.com/timberio/vector/pull/3134) proposed a new remap sytax and [#3341](https://github.com/timberio/vector/pull/3341) implemented it. This could be used to offer mapping at the source and sink level:
```
[sinks.my-dd-sink]
type = "datadog_logs"
remap = """
payload.host = .host
payload.service = ._SYSTEMD_UNIT
payload.event = .
"""
```
This, of course, needs more discussion.
|
1.0
|
Map fields in sinks - Automatically (CIM) - A key objective for Vector is to be the easiest tool in it's space to setup and operate. In order to achieve this we must reduce the amount of configuration and decisions a user makes when setting up and using Vector. A particularly hairy aspect of this is field mapping. Because Vector integrates with disparate sources and sinks, it requires some level of field mapping.
A good example of this is the `datadog_logs` sink. Datadog defines [reserved fields](https://docs.datadoghq.com/logs/processing/#reserved-attributes) that the user should take advantage of in order to have the best Datadog experience. Currently, Vector [performs limited mapping](https://github.com/timberio/vector/blob/a391434e97a078f6ea92691bba76f9af91ad1866/src/sinks/datadog/logs.rs#L74) based on [global schema settings](https://vector.dev/docs/reference/global-options/#log_schema). This approach is flawed since this global schema is not consistent across all of Vector's sources. The data coming in from the `http` source is likely to be different than the data coming in from the `journald` source. Instead of trying to shoehorn events from disparate sources into a single shape, Vector should allow the event to take any shape and use the source context to make assumptions about where data is located within the event. These assumptions can be used to reduce the configuration and decisions a user makes when configuring Vector.
To go back to the `datadog_logs` sink, Vector can automatically map Datadog's `service` field to the Journald `_SYSTEMD_UNIT` field when setting up a `journald` -> `datadog_logs` pipeline. And while `datadog_logs` is used in this example, this problem is widespread throughout most of Vector's sinks.
Finally, it's worth noting that a defining principle of Vector is to be standard agnostic. This is critical to Vector's adoption since it allows users adopt Vector for legacy and future pipelines. Vector should be able to support the [ECS schema](https://www.elastic.co/guide/en/ecs/current/index.html) as well as the [OpenTelemtry schema](https://github.com/open-telemetry/opentelemetry-proto/blob/master/opentelemetry/proto/logs/v1/logs.proto).
# User Needs
As a user, I need the...
- ABILITY to automatically map fields in the `datadog_logs` sink, when the shape of my data is known.
- ABILITY to manually map fields in the `datadog_logs` sink, when the shape of my data is arbitrary/custom.
- more to come...
# Prior Art
- [Splunk CIM](https://docs.splunk.com/Documentation/CIM/4.16.0/User/Overview) - It turns out Splunk has solved this exact problem with CIMs.
# History
## Field metadata
The concept of event/field metadata has been discussed before. When data is ingested into Vector we should store metadata about that event (the source type, source id, known schema, etc). This was discussed in [issue 1448](https://github.com/timberio/vector/issues/1448). We landed on [adding this metadata into the event itself](https://github.com/timberio/vector/issues/1448#issuecomment-570434900) and allowing users to black/whitelist fields when encoding them. We felt this was much simpler than a concept of private fields that are separate from public fields. This kicked off a slew of changes ([#2272](https://github.com/timberio/vector/issues/2272), [#2273](https://github.com/timberio/vector/issues/2273), [#2147](https://github.com/timberio/vector/issues/2147), [#2269](https://github.com/timberio/vector/issues/2269), [#2270](https://github.com/timberio/vector/issues/2270), [#2271](https://github.com/timberio/vector/issues/2271), [#1150](https://github.com/timberio/vector/issues/1150), [#2477](https://github.com/timberio/vector/issues/2477), [#2480](https://github.com/timberio/vector/issues/2480)) that attempted to add more context to events as they were ingested, but ended up causing unexpected problems. For example, [#2717](https://github.com/timberio/vector/issues/2717) was caused due to the `source` and `source_type` fields that were added as metadata. Additionally, we've heard many users complain about these fields. They were suprised to see these fields in their downstream storages, which suggests that encoding this data at the sink level should be opt-in.
## Schemas
There are a handful of popular log schemas. Such as [ECS](https://www.elastic.co/guide/en/ecs/current/index.html), [OpenTelemetry](https://github.com/open-telemetry/opentelemetry-proto/blob/master/opentelemetry/proto/logs/v1/logs.proto), [GELF](https://docs.graylog.org/en/3.2/pages/gelf.html), and [Syslog](https://tools.ietf.org/html/rfc5424). Vector plans to implement sources for all of these schemas. Given that these are documented schemas, how can Vector attach metadata to events and use the known properties of these schemas to perform mapping?
## Remap syntax
[#3134](https://github.com/timberio/vector/pull/3134) proposed a new remap sytax and [#3341](https://github.com/timberio/vector/pull/3341) implemented it. This could be used to offer mapping at the source and sink level:
```
[sinks.my-dd-sink]
type = "datadog_logs"
remap = """
payload.host = .host
payload.service = ._SYSTEMD_UNIT
payload.event = .
"""
```
This, of course, needs more discussion.
|
process
|
map fields in sinks automatically cim a key objective for vector is to be the easiest tool in it s space to setup and operate in order to achieve this we must reduce the amount of configuration and decisions a user makes when setting up and using vector a particularly hairy aspect of this is field mapping because vector integrates with disparate sources and sinks it requires some level of field mapping a good example of this is the datadog logs sink datadog defines that the user should take advantage of in order to have the best datadog experience currently vector based on this approach is flawed since this global schema is not consistent across all of vector s sources the data coming in from the http source is likely to be different than the data coming in from the journald source instead of trying to shoehorn events from disparate sources into a single shape vector should allow the event to take any shape and use the source context to make assumptions about where data is located within the event these assumptions can be used to reduce the configuration and decisions a user makes when configuring vector to go back to the datadog logs sink vector can automatically map datadog s service field to the journald systemd unit field when setting up a journald datadog logs pipeline and while datadog logs is used in this example this problem is widespread throughout most of vector s sinks finally it s worth noting that a defining principle of vector is to be standard agnostic this is critical to vector s adoption since it allows users adopt vector for legacy and future pipelines vector should be able to support the as well as the user needs as a user i need the ability to automatically map fields in the datadog logs sink when the shape of my data is known ability to manually map fields in the datadog logs sink when the shape of my data is arbitrary custom more to come prior art it turns out splunk has solved this exact problem with cims history field metadata the concept of event field metadata has been discussed before when data is ingested into vector we should store metadata about that event the source type source id known schema etc this was discussed in we landed on and allowing users to black whitelist fields when encoding them we felt this was much simpler than a concept of private fields that are separate from public fields this kicked off a slew of changes that attempted to add more context to events as they were ingested but ended up causing unexpected problems for example was caused due to the source and source type fields that were added as metadata additionally we ve heard many users complain about these fields they were suprised to see these fields in their downstream storages which suggests that encoding this data at the sink level should be opt in schemas there are a handful of popular log schemas such as and vector plans to implement sources for all of these schemas given that these are documented schemas how can vector attach metadata to events and use the known properties of these schemas to perform mapping remap syntax proposed a new remap sytax and implemented it this could be used to offer mapping at the source and sink level type datadog logs remap payload host host payload service systemd unit payload event this of course needs more discussion
| 1
|
4,559
| 7,389,385,423
|
IssuesEvent
|
2018-03-16 08:29:00
|
KetchPartners/kmsprint2
|
https://api.github.com/repos/KetchPartners/kmsprint2
|
opened
|
Worklist Opp rework
|
Axure Prototype Process enhancement
|
# Baseline - A queue of estimates to be created and who is assigned with desired criteria:
<img width="359" alt="base7" src="https://user-images.githubusercontent.com/29525920/37510692-65dcfb96-28d2-11e8-8b65-f04c709f1d1a.png">
|
1.0
|
Worklist Opp rework - # Baseline - A queue of estimates to be created and who is assigned with desired criteria:
<img width="359" alt="base7" src="https://user-images.githubusercontent.com/29525920/37510692-65dcfb96-28d2-11e8-8b65-f04c709f1d1a.png">
|
process
|
worklist opp rework baseline a queue of estimates to be created and who is assigned with desired criteria img width alt src
| 1
|
117,543
| 17,496,859,687
|
IssuesEvent
|
2021-08-10 02:23:32
|
argoproj/argo-cd
|
https://api.github.com/repos/argoproj/argo-cd
|
closed
|
Non-cryptographically secure random function
|
bug security TOB
|
If you are trying to resolve an environment-specific issue or have a one-off question about the edge case that does not require a feature then please consider asking a question in argocd slack [channel](https://argoproj.github.io/community/join-slack).
Checklist:
* [ ] I've searched in the docs and FAQ for my answer: https://bit.ly/argocd-faq.
* [ ] I've included steps to reproduce the bug.
* [ ] I've pasted the output of `argocd version`.
**Describe the bug**
A clear and concise description of what the bug is.
The argoproj/pkg utility library implements rand module with a RandString and RandStringCharset functions for generating cryptographically-secure pseudo-random strings. However, this rand modules the math/rand Go module which is not intended for security-sensitive work. Additionally, the Argo CD codebase implements the same logic in its u til/rand/rand module.
|
True
|
Non-cryptographically secure random function - If you are trying to resolve an environment-specific issue or have a one-off question about the edge case that does not require a feature then please consider asking a question in argocd slack [channel](https://argoproj.github.io/community/join-slack).
Checklist:
* [ ] I've searched in the docs and FAQ for my answer: https://bit.ly/argocd-faq.
* [ ] I've included steps to reproduce the bug.
* [ ] I've pasted the output of `argocd version`.
**Describe the bug**
A clear and concise description of what the bug is.
The argoproj/pkg utility library implements rand module with a RandString and RandStringCharset functions for generating cryptographically-secure pseudo-random strings. However, this rand modules the math/rand Go module which is not intended for security-sensitive work. Additionally, the Argo CD codebase implements the same logic in its u til/rand/rand module.
|
non_process
|
non cryptographically secure random function if you are trying to resolve an environment specific issue or have a one off question about the edge case that does not require a feature then please consider asking a question in argocd slack checklist i ve searched in the docs and faq for my answer i ve included steps to reproduce the bug i ve pasted the output of argocd version describe the bug a clear and concise description of what the bug is the argoproj pkg utility library implements rand module with a randstring and randstringcharset functions for generating cryptographically secure pseudo random strings however this rand modules the math rand go module which is not intended for security sensitive work additionally the argo cd codebase implements the same logic in its u til rand rand module
| 0
|
90,202
| 26,006,276,645
|
IssuesEvent
|
2022-12-20 19:42:41
|
NixOS/nixpkgs
|
https://api.github.com/repos/NixOS/nixpkgs
|
closed
|
rtw89-firmware-unstable 2022-12-18 fail to build
|
1.severity: channel blocker 0.kind: build failure
|
### Build log
```
error: builder for '/nix/store/mg1xjg8v5lnm076c5nr1yy2wglmq32ri-rtw89-firmware-unstable-2022-12-18.drv' failed with exit code 1;
last 9 log lines:
> unpacking sources
> unpacking source archive /nix/store/0wkx1wcb3nmg20fkbd8kdsy1kqr6cw49-source
> source root is source
> patching sources
> configuring
> no configure script, doing nothing
> installing
> cp: missing destination file operand after '/nix/store/24g6mmjnlgw2yvg9fvsvfgahxnyg9nl5-rtw89-firmware-unstable-2022-12-18/lib/firmware/rtw89'
> Try 'cp --help' for more information.
For full logs, run 'nix log /nix/store/mg1xjg8v5lnm076c5nr1yy2wglmq32ri-rtw89-firmware-unstable-2022-12-18.drv'.
error: 1 dependencies of derivation '/nix/store/kyvjzkq3dw82h8bbf3vqkf45z2hikh86-rtw89-firmware-unstable-2022-12-18-xz.drv' failed to build
error: 1 dependencies of derivation '/nix/store/qx45vny3mkjrkp9k8k34a7x63vmggb11-firmware.drv' failed to build
error: 1 dependencies of derivation '/nix/store/1pdnri7qgcghaynd1wx8hx1qvsy1v4a2-nixos-system-omega-23.05.20221220.762426b.drv' failed to build
```
|
1.0
|
rtw89-firmware-unstable 2022-12-18 fail to build - ### Build log
```
error: builder for '/nix/store/mg1xjg8v5lnm076c5nr1yy2wglmq32ri-rtw89-firmware-unstable-2022-12-18.drv' failed with exit code 1;
last 9 log lines:
> unpacking sources
> unpacking source archive /nix/store/0wkx1wcb3nmg20fkbd8kdsy1kqr6cw49-source
> source root is source
> patching sources
> configuring
> no configure script, doing nothing
> installing
> cp: missing destination file operand after '/nix/store/24g6mmjnlgw2yvg9fvsvfgahxnyg9nl5-rtw89-firmware-unstable-2022-12-18/lib/firmware/rtw89'
> Try 'cp --help' for more information.
For full logs, run 'nix log /nix/store/mg1xjg8v5lnm076c5nr1yy2wglmq32ri-rtw89-firmware-unstable-2022-12-18.drv'.
error: 1 dependencies of derivation '/nix/store/kyvjzkq3dw82h8bbf3vqkf45z2hikh86-rtw89-firmware-unstable-2022-12-18-xz.drv' failed to build
error: 1 dependencies of derivation '/nix/store/qx45vny3mkjrkp9k8k34a7x63vmggb11-firmware.drv' failed to build
error: 1 dependencies of derivation '/nix/store/1pdnri7qgcghaynd1wx8hx1qvsy1v4a2-nixos-system-omega-23.05.20221220.762426b.drv' failed to build
```
|
non_process
|
firmware unstable fail to build build log error builder for nix store firmware unstable drv failed with exit code last log lines unpacking sources unpacking source archive nix store source source root is source patching sources configuring no configure script doing nothing installing cp missing destination file operand after nix store firmware unstable lib firmware try cp help for more information for full logs run nix log nix store firmware unstable drv error dependencies of derivation nix store firmware unstable xz drv failed to build error dependencies of derivation nix store firmware drv failed to build error dependencies of derivation nix store nixos system omega drv failed to build
| 0
|
207,297
| 15,802,885,794
|
IssuesEvent
|
2021-04-03 11:54:22
|
astpl1998/Jyoti-Ceramic
|
https://api.github.com/repos/astpl1998/Jyoti-Ceramic
|
closed
|
PPD : Tool Trigger Stage base reporting validation
|
17.Testing2_Completed
|
@karthick please find the attachment
[Tool Logic.xlsx](https://github.com/astpl1998/Jyoti-Ceramic/files/5564501/Tool.Logic.xlsx)
|
1.0
|
PPD : Tool Trigger Stage base reporting validation - @karthick please find the attachment
[Tool Logic.xlsx](https://github.com/astpl1998/Jyoti-Ceramic/files/5564501/Tool.Logic.xlsx)
|
non_process
|
ppd tool trigger stage base reporting validation karthick please find the attachment
| 0
|
8,859
| 11,956,576,870
|
IssuesEvent
|
2020-04-04 11:09:01
|
hngskj/labnote
|
https://api.github.com/repos/hngskj/labnote
|
closed
|
Inputs of Mix process view
|
bug process-view
|
When checkboxes are clicked, an error occurs.
Check `amount` in `AppMix.vue`
|
1.0
|
Inputs of Mix process view - When checkboxes are clicked, an error occurs.
Check `amount` in `AppMix.vue`
|
process
|
inputs of mix process view when checkboxes are clicked an error occurs check amount in appmix vue
| 1
|
370,213
| 25,893,757,334
|
IssuesEvent
|
2022-12-14 20:20:57
|
matplotlib/matplotlib
|
https://api.github.com/repos/matplotlib/matplotlib
|
closed
|
[Bug]: The Axes3D does not work as expected.
|
Documentation Community support
|
### Bug summary
The Axes3D does not work as expected.
### Code for reproduction
```python
import matplotlib.pyplot as plt
import numpy as np
from mpl_toolkits.mplot3d import Axes3D
plt.style.use('_mpl-gallery')
# Make data
np.random.seed(19680801)
n = 100
rng = np.random.default_rng()
xs = rng.uniform(23, 32, n)
ys = rng.uniform(0, 100, n)
zs = rng.uniform(-50, -25, n)
# Plot
# fig, ax = plt.subplots(subplot_kw={"projection": "3d"})
fig = plt.figure()
ax = Axes3D(fig)
ax.scatter(xs, ys, zs)
ax.set(xticklabels=[],
yticklabels=[],
zticklabels=[])
plt.show()
```
### Actual outcome
The outcome figure is blank.
### Expected outcome
Should show figure with correct content.
### Additional information
The code works well if changed to
`fig, ax = plt.subplots(subplot_kw={"projection": "3d"})`
### Operating system
Windows
### Matplotlib Version
3.6.2
### Matplotlib Backend
Qt5Agg
### Python version
3.9.15
### Jupyter version
NA
### Installation
conda
|
1.0
|
[Bug]: The Axes3D does not work as expected. - ### Bug summary
The Axes3D does not work as expected.
### Code for reproduction
```python
import matplotlib.pyplot as plt
import numpy as np
from mpl_toolkits.mplot3d import Axes3D
plt.style.use('_mpl-gallery')
# Make data
np.random.seed(19680801)
n = 100
rng = np.random.default_rng()
xs = rng.uniform(23, 32, n)
ys = rng.uniform(0, 100, n)
zs = rng.uniform(-50, -25, n)
# Plot
# fig, ax = plt.subplots(subplot_kw={"projection": "3d"})
fig = plt.figure()
ax = Axes3D(fig)
ax.scatter(xs, ys, zs)
ax.set(xticklabels=[],
yticklabels=[],
zticklabels=[])
plt.show()
```
### Actual outcome
The outcome figure is blank.
### Expected outcome
Should show figure with correct content.
### Additional information
The code works well if changed to
`fig, ax = plt.subplots(subplot_kw={"projection": "3d"})`
### Operating system
Windows
### Matplotlib Version
3.6.2
### Matplotlib Backend
Qt5Agg
### Python version
3.9.15
### Jupyter version
NA
### Installation
conda
|
non_process
|
the does not work as expected bug summary the does not work as expected code for reproduction python import matplotlib pyplot as plt import numpy as np from mpl toolkits import plt style use mpl gallery make data np random seed n rng np random default rng xs rng uniform n ys rng uniform n zs rng uniform n plot fig ax plt subplots subplot kw projection fig plt figure ax fig ax scatter xs ys zs ax set xticklabels yticklabels zticklabels plt show actual outcome the outcome figure is blank expected outcome should show figure with correct content additional information the code works well if changed to fig ax plt subplots subplot kw projection operating system windows matplotlib version matplotlib backend python version jupyter version na installation conda
| 0
|
193,398
| 22,216,147,027
|
IssuesEvent
|
2022-06-08 02:00:42
|
uniquelyparticular/shipengine-request
|
https://api.github.com/repos/uniquelyparticular/shipengine-request
|
closed
|
WS-2019-0332 (Medium) detected in handlebars-4.1.2.tgz - autoclosed
|
security vulnerability
|
## WS-2019-0332 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.1.2.tgz</b></p></summary>
<p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p>
<p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.2.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.2.tgz</a></p>
<p>Path to dependency file: shipengine-request/package.json</p>
<p>Path to vulnerable library: shipengine-request/node_modules/handlebars/package.json</p>
<p>
Dependency Hierarchy:
- semantic-release-15.14.0.tgz (Root Library)
- release-notes-generator-7.1.4.tgz
- conventional-changelog-writer-4.0.3.tgz
- :x: **handlebars-4.1.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uniquelyparticular/shipengine-request/commit/64ac831a846dcf5cca8dd2543c975372a4ab1d86">64ac831a846dcf5cca8dd2543c975372a4ab1d86</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Execution vulnerability found in handlebars before 4.5.3. Lookup helper fails to validate templates. Attack may submit templates that execute arbitrary JavaScript in the system.It is due to an incomplete fix for a WS-2019-0331.
<p>Publish Date: 2019-11-17
<p>URL: <a href=https://github.com/wycats/handlebars.js/commit/198887808780bbef9dba67a8af68ece091d5baa7>WS-2019-0332</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1324">https://www.npmjs.com/advisories/1324</a></p>
<p>Release Date: 2019-12-05</p>
<p>Fix Resolution: handlebars - 4.5.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2019-0332 (Medium) detected in handlebars-4.1.2.tgz - autoclosed - ## WS-2019-0332 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.1.2.tgz</b></p></summary>
<p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p>
<p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.2.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.2.tgz</a></p>
<p>Path to dependency file: shipengine-request/package.json</p>
<p>Path to vulnerable library: shipengine-request/node_modules/handlebars/package.json</p>
<p>
Dependency Hierarchy:
- semantic-release-15.14.0.tgz (Root Library)
- release-notes-generator-7.1.4.tgz
- conventional-changelog-writer-4.0.3.tgz
- :x: **handlebars-4.1.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uniquelyparticular/shipengine-request/commit/64ac831a846dcf5cca8dd2543c975372a4ab1d86">64ac831a846dcf5cca8dd2543c975372a4ab1d86</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Execution vulnerability found in handlebars before 4.5.3. Lookup helper fails to validate templates. Attack may submit templates that execute arbitrary JavaScript in the system.It is due to an incomplete fix for a WS-2019-0331.
<p>Publish Date: 2019-11-17
<p>URL: <a href=https://github.com/wycats/handlebars.js/commit/198887808780bbef9dba67a8af68ece091d5baa7>WS-2019-0332</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1324">https://www.npmjs.com/advisories/1324</a></p>
<p>Release Date: 2019-12-05</p>
<p>Fix Resolution: handlebars - 4.5.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws medium detected in handlebars tgz autoclosed ws medium severity vulnerability vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file shipengine request package json path to vulnerable library shipengine request node modules handlebars package json dependency hierarchy semantic release tgz root library release notes generator tgz conventional changelog writer tgz x handlebars tgz vulnerable library found in head commit a href vulnerability details arbitrary code execution vulnerability found in handlebars before lookup helper fails to validate templates attack may submit templates that execute arbitrary javascript in the system it is due to an incomplete fix for a ws publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution handlebars step up your open source security game with whitesource
| 0
|
3,682
| 6,714,698,595
|
IssuesEvent
|
2017-10-13 17:59:17
|
syndesisio/syndesis
|
https://api.github.com/repos/syndesisio/syndesis
|
opened
|
Find optimal time slots for Team Meetings
|
process/retro
|
Currently, we have the following team meetings:
* Daily Standup (15 mins)
* Retrospective (1 hour every 3 weeks)
* Planning Meeting (1 hour every 3 weeks)
* Sprint Demo (1 hour every 3 weeks)
Each of these meetings is supposed to be mandatory to attend (except for reasons).
However, the scheduled dates are not optimal, so that there is a systematic problem to attend certain meetings. Let's try a bit harder and find a common base.
I'm going to open Doodles for these meetings, especially the standup meeting is critical.
|
1.0
|
Find optimal time slots for Team Meetings - Currently, we have the following team meetings:
* Daily Standup (15 mins)
* Retrospective (1 hour every 3 weeks)
* Planning Meeting (1 hour every 3 weeks)
* Sprint Demo (1 hour every 3 weeks)
Each of these meetings is supposed to be mandatory to attend (except for reasons).
However, the scheduled dates are not optimal, so that there is a systematic problem to attend certain meetings. Let's try a bit harder and find a common base.
I'm going to open Doodles for these meetings, especially the standup meeting is critical.
|
process
|
find optimal time slots for team meetings currently we have the following team meetings daily standup mins retrospective hour every weeks planning meeting hour every weeks sprint demo hour every weeks each of these meetings is supposed to be mandatory to attend except for reasons however the scheduled dates are not optimal so that there is a systematic problem to attend certain meetings let s try a bit harder and find a common base i m going to open doodles for these meetings especially the standup meeting is critical
| 1
|
590,685
| 17,784,519,026
|
IssuesEvent
|
2021-08-31 09:25:44
|
yalla-coop/universal-credit-support
|
https://api.github.com/repos/yalla-coop/universal-credit-support
|
closed
|
Create Organisations page
|
backlog priority-5 front-end backend
|
__Wireframe link__
https://www.figma.com/file/2b1vIUlRczjuwhWr9cE9OY/Wireframes?node-id=1932%3A10513
---
### Acceptance Criteria:
_REMEMBER THAT WHOEVER WORKS ON THIS ISSUE MUST TICK OFF ALL THE POINTS IN THIS LIST UNLESS THERE IS CLEAR AGREEMENT IN THE COMMENTS TO SAY OTHERWISE. **DO NOT REVIEW A PR INVOLVING THIS ISSUE UNLESS THIS HAS BEEN DONE**_
- [x] Get all accounts signed up from db and list on page along with role
- [x] Add ability to change their role (admin, super admin or remove account)
- [x] Remove account would be a hard delete from database
|
1.0
|
Create Organisations page -
__Wireframe link__
https://www.figma.com/file/2b1vIUlRczjuwhWr9cE9OY/Wireframes?node-id=1932%3A10513
---
### Acceptance Criteria:
_REMEMBER THAT WHOEVER WORKS ON THIS ISSUE MUST TICK OFF ALL THE POINTS IN THIS LIST UNLESS THERE IS CLEAR AGREEMENT IN THE COMMENTS TO SAY OTHERWISE. **DO NOT REVIEW A PR INVOLVING THIS ISSUE UNLESS THIS HAS BEEN DONE**_
- [x] Get all accounts signed up from db and list on page along with role
- [x] Add ability to change their role (admin, super admin or remove account)
- [x] Remove account would be a hard delete from database
|
non_process
|
create organisations page wireframe link acceptance criteria remember that whoever works on this issue must tick off all the points in this list unless there is clear agreement in the comments to say otherwise do not review a pr involving this issue unless this has been done get all accounts signed up from db and list on page along with role add ability to change their role admin super admin or remove account remove account would be a hard delete from database
| 0
|
7,220
| 4,823,007,655
|
IssuesEvent
|
2016-11-06 04:57:04
|
cortoproject/corto
|
https://api.github.com/repos/cortoproject/corto
|
closed
|
Add varargs to corto_fromcontent
|
Corto:Usability enhancement
|
To make the `corto_fromcontent` function easier to use, add varargs to the signature, so it becomes:
```
corto_int16 corto_fromcontent(corto_object o, corto_string contentType, corto_string content, ...);
```
The `content` parameter will become a format string which follows `printf` conventions.
|
True
|
Add varargs to corto_fromcontent - To make the `corto_fromcontent` function easier to use, add varargs to the signature, so it becomes:
```
corto_int16 corto_fromcontent(corto_object o, corto_string contentType, corto_string content, ...);
```
The `content` parameter will become a format string which follows `printf` conventions.
|
non_process
|
add varargs to corto fromcontent to make the corto fromcontent function easier to use add varargs to the signature so it becomes corto corto fromcontent corto object o corto string contenttype corto string content the content parameter will become a format string which follows printf conventions
| 0
|
6,966
| 10,119,370,405
|
IssuesEvent
|
2019-07-31 11:18:14
|
linnovate/root
|
https://api.github.com/repos/linnovate/root
|
opened
|
tasks inheritance from discussion
|
Process bug Tasks
|
create a task
create a disucssion
add users as partners with different permissions
**example meeting:**

go to the created task
assign the discussion to it
the partners' permissions are viewers :

|
1.0
|
tasks inheritance from discussion - create a task
create a disucssion
add users as partners with different permissions
**example meeting:**

go to the created task
assign the discussion to it
the partners' permissions are viewers :

|
process
|
tasks inheritance from discussion create a task create a disucssion add users as partners with different permissions example meeting go to the created task assign the discussion to it the partners permissions are viewers
| 1
|
309,308
| 26,660,293,646
|
IssuesEvent
|
2023-01-25 20:26:44
|
MPMG-DCC-UFMG/F01
|
https://api.github.com/repos/MPMG-DCC-UFMG/F01
|
closed
|
Teste de generalizacao para a tag Concursos Públicos - Divulgação de Recursos e Decisões - Muriaé
|
generalization test development template - ABO (21) tag - Concursos Públicos subtag - Divulgação de Recursos e Decisões
|
DoD: Realizar o teste de Generalização do validador da tag Concursos Públicos - Divulgação de Recursos e Decisões para o Município de Muriaé.
|
1.0
|
Teste de generalizacao para a tag Concursos Públicos - Divulgação de Recursos e Decisões - Muriaé - DoD: Realizar o teste de Generalização do validador da tag Concursos Públicos - Divulgação de Recursos e Decisões para o Município de Muriaé.
|
non_process
|
teste de generalizacao para a tag concursos públicos divulgação de recursos e decisões muriaé dod realizar o teste de generalização do validador da tag concursos públicos divulgação de recursos e decisões para o município de muriaé
| 0
|
236,090
| 18,070,127,268
|
IssuesEvent
|
2021-09-21 01:14:27
|
beefproject/beef
|
https://api.github.com/repos/beefproject/beef
|
closed
|
Usability: Beef update text in install instructions is missing bundle command
|
Documentation
|
# Submit Issue
* https://github.com/beefproject/beef/issues
* https://github.com/beefproject/beef/wiki/FAQ
Ensure you're using the [latest version of BeEF](https://github.com/beefproject/beef/releases/tag/v0.5.1.0).
Please do your best to provide as much information as possible. It will help substantially if you can enable and provide debugging logs with your issue. Instructions for enabling debugging logs are below:
1. In the `config.yaml` file of your BeEF root folder set debug and client_debug (lines 11 & 13 respectively) to `true`
* If using a standard installation of `beef-xss` the root folder will typically be `/usr/share/beef-xss`
2. Reproduce your error
3. Retrieve your client-side logs from your browser's developer console (Ctrl + Shift + I)
4. Retrieve your server-side logs from `~/.beef/beef.log`
5. **If using `beef-xss`:** Retrieve your service logs using `journalctl -u beef-xss`
Thank you, this will greatly aid us in identifying the root cause of your issue :)
**If we request additional information and we don't hear back from you within a week, we will be closing the ticket off.**
Feel free to open it back up if you continue to have issues.
## Summary
**Q:** Please provide a brief summary of the issue that you experienced.
Instructions to update beef doesn't include the necessary step of running 'bundle' to ensure gems are updated
## Environment
*Please identify the environment in which your issue occurred.*
1. **BeEF Version:**
2. **Ruby Version:**
3. **Browser Details (e.g. Chrome v81.0):**
4. **Operating System (e.g. OSX Catalina):**
## Configuration
**Q:** Have you made any changes to your BeEF configuration?
Yes, updated it
**Q:** Have you enabled or disabled any BeEF extensions?
No
## Expected vs. Actual Behaviour
**Expected Behaviour:**
<br />
**Actual Behaviour:**
<br />
## Steps to Reproduce
*Please provide steps to reproduce this issue.*
1 update beef and see that gems are not updated
## Additional Information
Please provide any additional information which may be useful in resolving this issue, such as debugging output and relevant screen shots. Debug output can be retrieved by following the instructions towards the top of the issue template.
|
1.0
|
Usability: Beef update text in install instructions is missing bundle command - # Submit Issue
* https://github.com/beefproject/beef/issues
* https://github.com/beefproject/beef/wiki/FAQ
Ensure you're using the [latest version of BeEF](https://github.com/beefproject/beef/releases/tag/v0.5.1.0).
Please do your best to provide as much information as possible. It will help substantially if you can enable and provide debugging logs with your issue. Instructions for enabling debugging logs are below:
1. In the `config.yaml` file of your BeEF root folder set debug and client_debug (lines 11 & 13 respectively) to `true`
* If using a standard installation of `beef-xss` the root folder will typically be `/usr/share/beef-xss`
2. Reproduce your error
3. Retrieve your client-side logs from your browser's developer console (Ctrl + Shift + I)
4. Retrieve your server-side logs from `~/.beef/beef.log`
5. **If using `beef-xss`:** Retrieve your service logs using `journalctl -u beef-xss`
Thank you, this will greatly aid us in identifying the root cause of your issue :)
**If we request additional information and we don't hear back from you within a week, we will be closing the ticket off.**
Feel free to open it back up if you continue to have issues.
## Summary
**Q:** Please provide a brief summary of the issue that you experienced.
Instructions to update beef doesn't include the necessary step of running 'bundle' to ensure gems are updated
## Environment
*Please identify the environment in which your issue occurred.*
1. **BeEF Version:**
2. **Ruby Version:**
3. **Browser Details (e.g. Chrome v81.0):**
4. **Operating System (e.g. OSX Catalina):**
## Configuration
**Q:** Have you made any changes to your BeEF configuration?
Yes, updated it
**Q:** Have you enabled or disabled any BeEF extensions?
No
## Expected vs. Actual Behaviour
**Expected Behaviour:**
<br />
**Actual Behaviour:**
<br />
## Steps to Reproduce
*Please provide steps to reproduce this issue.*
1 update beef and see that gems are not updated
## Additional Information
Please provide any additional information which may be useful in resolving this issue, such as debugging output and relevant screen shots. Debug output can be retrieved by following the instructions towards the top of the issue template.
|
non_process
|
usability beef update text in install instructions is missing bundle command submit issue ensure you re using the please do your best to provide as much information as possible it will help substantially if you can enable and provide debugging logs with your issue instructions for enabling debugging logs are below in the config yaml file of your beef root folder set debug and client debug lines respectively to true if using a standard installation of beef xss the root folder will typically be usr share beef xss reproduce your error retrieve your client side logs from your browser s developer console ctrl shift i retrieve your server side logs from beef beef log if using beef xss retrieve your service logs using journalctl u beef xss thank you this will greatly aid us in identifying the root cause of your issue if we request additional information and we don t hear back from you within a week we will be closing the ticket off feel free to open it back up if you continue to have issues summary q please provide a brief summary of the issue that you experienced instructions to update beef doesn t include the necessary step of running bundle to ensure gems are updated environment please identify the environment in which your issue occurred beef version ruby version browser details e g chrome operating system e g osx catalina configuration q have you made any changes to your beef configuration yes updated it q have you enabled or disabled any beef extensions no expected vs actual behaviour expected behaviour actual behaviour steps to reproduce please provide steps to reproduce this issue update beef and see that gems are not updated additional information please provide any additional information which may be useful in resolving this issue such as debugging output and relevant screen shots debug output can be retrieved by following the instructions towards the top of the issue template
| 0
|
52,327
| 22,152,462,830
|
IssuesEvent
|
2022-06-03 18:26:48
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
closed
|
[Reporting] Remove layout management behaviour
|
Feature:Reporting loe:hours impact:high Team:Reporting Services EnableJiraSync Team:AppServicesUx
|
This issue is intended as a point of departure for other issues that can inform a more well-defined project (with stakeholders). _This issue should be closed once this information has been translated to a more well-defined project._
## Summary
Reporting injects custom CSS and JavaScript into the server-side rendering environment to adjust layout for print. See:
* `x-pack/plugins/reporting/server/lib/screenshots/inject_css.ts`
* `x-pack/plugins/reporting/server/lib/layouts`
This code is responsible for adjusting UI elements outside of the control of Reporting after they have been rendered. The adjustments optimise the UI for the requested media.
## The problem
An implicit contract exists between the UI being rendered (the plugin) and the behaviour of the renderer (Reporting). If either of these changes it will often not be explicit that something may have broken in generated reports until they are re-tested. This "knowledge" that Reporting has is implicit and can be classified as tech debt since it is a burden to maintain and forces Reporting into the domain of what is being rendered not just the process of rendering.
Additionally, any issues encountered with layout are often routed to both the reporting team and the plugin team which creates ambiguity in ownership.
## Remove all layout code that adjusts client components
Reporting should not have any concept of "layout" that extends to the domain of how components should respond to different view port sizes or print media. Reporting should **only** be responsible for rendering a page in the specified size and sharing the result.
This means that clients of reporting become fully responsible for managing layout and should provide a URL to access the print-friendly version of their content.
## Path forward
We want to get consumers of reporting to a place where they are empowered to make decisions about how their app looks in the browser and on print and screen media. We also want to ensure that the UX for getting a printable version of various pieces of UI is consistent for all users (cmd/ctrl+P). This is outlined as [option 2](https://github.com/elastic/kibana/issues/99890#option2) for giving users a way to get PDF/PNG reports without saving.
### Migration path
In conversation with @timroes we spoke about how removal of layout-related code (CSS and JS) in Reporting will require coordination across the Reporting, Core, Maps, Dashboard, Visualize and Canvas teams to achieve.
#### A solution: Pure CSS and print media solution
Plugins could add and maintain some CSS that targets print media. Hiding irrelevant visual elements and repositioning any relevant elements once a browser attempts to the print the page.
##### Challenges
* Today we offer the ability to capture a screenshot "as is" and a screenshot that is optimised for printing on A4 paper, this is "print" vs "screen" media. Will CSS offer everything we might want for creating the kind of layouts we want for those to media for all different views users might want to print?
## Related issues
* Getting reporting to a place where full report state is saved with the report object https://github.com/elastic/kibana/issues/99890
CC @tsullivan
|
2.0
|
[Reporting] Remove layout management behaviour - This issue is intended as a point of departure for other issues that can inform a more well-defined project (with stakeholders). _This issue should be closed once this information has been translated to a more well-defined project._
## Summary
Reporting injects custom CSS and JavaScript into the server-side rendering environment to adjust layout for print. See:
* `x-pack/plugins/reporting/server/lib/screenshots/inject_css.ts`
* `x-pack/plugins/reporting/server/lib/layouts`
This code is responsible for adjusting UI elements outside of the control of Reporting after they have been rendered. The adjustments optimise the UI for the requested media.
## The problem
An implicit contract exists between the UI being rendered (the plugin) and the behaviour of the renderer (Reporting). If either of these changes it will often not be explicit that something may have broken in generated reports until they are re-tested. This "knowledge" that Reporting has is implicit and can be classified as tech debt since it is a burden to maintain and forces Reporting into the domain of what is being rendered not just the process of rendering.
Additionally, any issues encountered with layout are often routed to both the reporting team and the plugin team which creates ambiguity in ownership.
## Remove all layout code that adjusts client components
Reporting should not have any concept of "layout" that extends to the domain of how components should respond to different view port sizes or print media. Reporting should **only** be responsible for rendering a page in the specified size and sharing the result.
This means that clients of reporting become fully responsible for managing layout and should provide a URL to access the print-friendly version of their content.
## Path forward
We want to get consumers of reporting to a place where they are empowered to make decisions about how their app looks in the browser and on print and screen media. We also want to ensure that the UX for getting a printable version of various pieces of UI is consistent for all users (cmd/ctrl+P). This is outlined as [option 2](https://github.com/elastic/kibana/issues/99890#option2) for giving users a way to get PDF/PNG reports without saving.
### Migration path
In conversation with @timroes we spoke about how removal of layout-related code (CSS and JS) in Reporting will require coordination across the Reporting, Core, Maps, Dashboard, Visualize and Canvas teams to achieve.
#### A solution: Pure CSS and print media solution
Plugins could add and maintain some CSS that targets print media. Hiding irrelevant visual elements and repositioning any relevant elements once a browser attempts to the print the page.
##### Challenges
* Today we offer the ability to capture a screenshot "as is" and a screenshot that is optimised for printing on A4 paper, this is "print" vs "screen" media. Will CSS offer everything we might want for creating the kind of layouts we want for those to media for all different views users might want to print?
## Related issues
* Getting reporting to a place where full report state is saved with the report object https://github.com/elastic/kibana/issues/99890
CC @tsullivan
|
non_process
|
remove layout management behaviour this issue is intended as a point of departure for other issues that can inform a more well defined project with stakeholders this issue should be closed once this information has been translated to a more well defined project summary reporting injects custom css and javascript into the server side rendering environment to adjust layout for print see x pack plugins reporting server lib screenshots inject css ts x pack plugins reporting server lib layouts this code is responsible for adjusting ui elements outside of the control of reporting after they have been rendered the adjustments optimise the ui for the requested media the problem an implicit contract exists between the ui being rendered the plugin and the behaviour of the renderer reporting if either of these changes it will often not be explicit that something may have broken in generated reports until they are re tested this knowledge that reporting has is implicit and can be classified as tech debt since it is a burden to maintain and forces reporting into the domain of what is being rendered not just the process of rendering additionally any issues encountered with layout are often routed to both the reporting team and the plugin team which creates ambiguity in ownership remove all layout code that adjusts client components reporting should not have any concept of layout that extends to the domain of how components should respond to different view port sizes or print media reporting should only be responsible for rendering a page in the specified size and sharing the result this means that clients of reporting become fully responsible for managing layout and should provide a url to access the print friendly version of their content path forward we want to get consumers of reporting to a place where they are empowered to make decisions about how their app looks in the browser and on print and screen media we also want to ensure that the ux for getting a printable version of various pieces of ui is consistent for all users cmd ctrl p this is outlined as for giving users a way to get pdf png reports without saving migration path in conversation with timroes we spoke about how removal of layout related code css and js in reporting will require coordination across the reporting core maps dashboard visualize and canvas teams to achieve a solution pure css and print media solution plugins could add and maintain some css that targets print media hiding irrelevant visual elements and repositioning any relevant elements once a browser attempts to the print the page challenges today we offer the ability to capture a screenshot as is and a screenshot that is optimised for printing on paper this is print vs screen media will css offer everything we might want for creating the kind of layouts we want for those to media for all different views users might want to print related issues getting reporting to a place where full report state is saved with the report object cc tsullivan
| 0
|
29,611
| 2,716,632,489
|
IssuesEvent
|
2015-04-10 20:21:48
|
CruxFramework/crux
|
https://api.github.com/repos/CruxFramework/crux
|
closed
|
Refactoring View Container
|
enhancement imported Milestone-M14-C4 Module-CruxCore Priority-Medium
|
_From [juli...@cruxframework.org](https://code.google.com/u/108392056359000771618/) on August 27, 2014 16:47:27_
Refactoring the view container in Crux and create two view containers. One will support parameters and other will support history control.
Change references of view container in crux-widgets and crux-smart-faces
_Original issue: http://code.google.com/p/crux-framework/issues/detail?id=499_
|
1.0
|
Refactoring View Container - _From [juli...@cruxframework.org](https://code.google.com/u/108392056359000771618/) on August 27, 2014 16:47:27_
Refactoring the view container in Crux and create two view containers. One will support parameters and other will support history control.
Change references of view container in crux-widgets and crux-smart-faces
_Original issue: http://code.google.com/p/crux-framework/issues/detail?id=499_
|
non_process
|
refactoring view container from on august refactoring the view container in crux and create two view containers one will support parameters and other will support history control change references of view container in crux widgets and crux smart faces original issue
| 0
|
4,582
| 7,422,635,729
|
IssuesEvent
|
2018-03-23 00:24:42
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
promisify child_process.exec always rejects
|
child_process promises
|
* **Version**:
v8.9.4
* **Platform**:
Ubuntu 16.04
I'm having trouble `promisify`ing `child_process.exec`. When I run with error or without, the promise is rejected. I've followed the docs and wrote this:
```javascript
const { promisify } = require('util');
const execCallback = require('child_process').exec;
const exec = promisify(execCallback);
const fooBar = () => {
const dir = `engine/Users/${user}`;
const fileOne = `${process.cwd()}/${dir}/myFileOne.txt`;
const fileTwo = `${process.cwd()}/${dir}/myFileTwo.txt`;
const command = `diff --unchanged-group-format="" ${fileOne} ${fileTwo}`;
return run(command)
.then((stdout, stderr) => {
console.log('ran diff');
return stdout;
})
.catch(error => console.log('got error running diff'));
};
```
`fooBar` goes through the `catch` block with this `error`:
```javascript
{
killed: false,
code: 1,
signal: null,
cmd: /*the correct diff cmd*/,
stdout: /*the correct diff stdout*/,
stderr: ''
}
```
An actual error (or at least a 'no such file') gives me `code: 2`, `stdout: ''`, `stderr: no such file error string`
|
1.0
|
promisify child_process.exec always rejects - * **Version**:
v8.9.4
* **Platform**:
Ubuntu 16.04
I'm having trouble `promisify`ing `child_process.exec`. When I run with error or without, the promise is rejected. I've followed the docs and wrote this:
```javascript
const { promisify } = require('util');
const execCallback = require('child_process').exec;
const exec = promisify(execCallback);
const fooBar = () => {
const dir = `engine/Users/${user}`;
const fileOne = `${process.cwd()}/${dir}/myFileOne.txt`;
const fileTwo = `${process.cwd()}/${dir}/myFileTwo.txt`;
const command = `diff --unchanged-group-format="" ${fileOne} ${fileTwo}`;
return run(command)
.then((stdout, stderr) => {
console.log('ran diff');
return stdout;
})
.catch(error => console.log('got error running diff'));
};
```
`fooBar` goes through the `catch` block with this `error`:
```javascript
{
killed: false,
code: 1,
signal: null,
cmd: /*the correct diff cmd*/,
stdout: /*the correct diff stdout*/,
stderr: ''
}
```
An actual error (or at least a 'no such file') gives me `code: 2`, `stdout: ''`, `stderr: no such file error string`
|
process
|
promisify child process exec always rejects version platform ubuntu i m having trouble promisify ing child process exec when i run with error or without the promise is rejected i ve followed the docs and wrote this javascript const promisify require util const execcallback require child process exec const exec promisify execcallback const foobar const dir engine users user const fileone process cwd dir myfileone txt const filetwo process cwd dir myfiletwo txt const command diff unchanged group format fileone filetwo return run command then stdout stderr console log ran diff return stdout catch error console log got error running diff foobar goes through the catch block with this error javascript killed false code signal null cmd the correct diff cmd stdout the correct diff stdout stderr an actual error or at least a no such file gives me code stdout stderr no such file error string
| 1
|
7,377
| 10,514,634,636
|
IssuesEvent
|
2019-09-28 02:15:12
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Performance Improvement: Use date range instead of casting
|
.Proposal Enhancement Priority:P1 Query Processor Type:Performance
|
On postgres (9.1), when using the date filter on question creation, selecting the current date results in the following where clause:
cast(column as date) = cast(now() as date)
moving to the following instead results in a huge speed improvement
column >= date_trunc('day',now()) and column < date_trunc('day',now() + '1 day')
currently our read-only replica cries with the following error because the cast causes the query to run too long
```
ERROR: canceling statement due to conflict with recovery
DETAIL: User query might have needed to see row versions that must be removed.
```
⬇️ **Please click the 👍 reaction instead of leaving a `+1` or 👍 comment**
|
1.0
|
Performance Improvement: Use date range instead of casting - On postgres (9.1), when using the date filter on question creation, selecting the current date results in the following where clause:
cast(column as date) = cast(now() as date)
moving to the following instead results in a huge speed improvement
column >= date_trunc('day',now()) and column < date_trunc('day',now() + '1 day')
currently our read-only replica cries with the following error because the cast causes the query to run too long
```
ERROR: canceling statement due to conflict with recovery
DETAIL: User query might have needed to see row versions that must be removed.
```
⬇️ **Please click the 👍 reaction instead of leaving a `+1` or 👍 comment**
|
process
|
performance improvement use date range instead of casting on postgres when using the date filter on question creation selecting the current date results in the following where clause cast column as date cast now as date moving to the following instead results in a huge speed improvement column date trunc day now and column date trunc day now day currently our read only replica cries with the following error because the cast causes the query to run too long error canceling statement due to conflict with recovery detail user query might have needed to see row versions that must be removed ⬇️ please click the 👍 reaction instead of leaving a or 👍 comment
| 1
|
17,135
| 22,667,768,177
|
IssuesEvent
|
2022-07-03 06:20:39
|
django-crispy-forms/django-crispy-forms
|
https://api.github.com/repos/django-crispy-forms/django-crispy-forms
|
closed
|
Tests that just assert a string object
|
Testing/Process
|
While working on bootstrap4 support I came across a couple lines in the tests that just `assert <hardcoded_string>`.
https://github.com/maraujop/django-crispy-forms/blob/dev/crispy_forms/tests/test_form_helper.py#L598
https://github.com/maraujop/django-crispy-forms/blob/dev/crispy_forms/tests/test_form_helper.py#L605
It looks like they were added [here](https://github.com/maraujop/django-crispy-forms/commit/5c3a2680e2a0a607208e897dc5c278436b69f681)
|
1.0
|
Tests that just assert a string object - While working on bootstrap4 support I came across a couple lines in the tests that just `assert <hardcoded_string>`.
https://github.com/maraujop/django-crispy-forms/blob/dev/crispy_forms/tests/test_form_helper.py#L598
https://github.com/maraujop/django-crispy-forms/blob/dev/crispy_forms/tests/test_form_helper.py#L605
It looks like they were added [here](https://github.com/maraujop/django-crispy-forms/commit/5c3a2680e2a0a607208e897dc5c278436b69f681)
|
process
|
tests that just assert a string object while working on support i came across a couple lines in the tests that just assert it looks like they were added
| 1
|
20,503
| 27,167,178,049
|
IssuesEvent
|
2023-02-17 16:12:48
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
`succeeded()` parameter is undocumented
|
devops/prod doc-bug Pri1 devops-cicd-process/tech
|
I couldn't find any documentation that indicated `succeeded` allowed a parameter reference to a particular job. E.g. `succeeded('JobName')`
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 21e5cee4-eaae-3a96-db91-540ac759e83a
* Version Independent ID: 9bdc837c-ffe0-d999-f922-f3a5debc7f92
* Content: [Conditions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml)
* Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/conditions.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
`succeeded()` parameter is undocumented - I couldn't find any documentation that indicated `succeeded` allowed a parameter reference to a particular job. E.g. `succeeded('JobName')`
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 21e5cee4-eaae-3a96-db91-540ac759e83a
* Version Independent ID: 9bdc837c-ffe0-d999-f922-f3a5debc7f92
* Content: [Conditions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml)
* Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/conditions.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
succeeded parameter is undocumented i couldn t find any documentation that indicated succeeded allowed a parameter reference to a particular job e g succeeded jobname document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id eaae version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
3,795
| 6,778,214,984
|
IssuesEvent
|
2017-10-28 07:34:42
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
PDF: Conref regression with preprocess2
|
bug pdf preprocess2
|
When building the docs using dita-ot/dita-ot@19abe1998, there are missing conrefs in the PDF output _(empty bullets on page 12, for example)_.
This is apparently related to recent changes in the conref processing and/or `preprocess2`, as the content appears as expected when building with the `master` branch of the `dita-ot` repo.
|
1.0
|
PDF: Conref regression with preprocess2 - When building the docs using dita-ot/dita-ot@19abe1998, there are missing conrefs in the PDF output _(empty bullets on page 12, for example)_.
This is apparently related to recent changes in the conref processing and/or `preprocess2`, as the content appears as expected when building with the `master` branch of the `dita-ot` repo.
|
process
|
pdf conref regression with when building the docs using dita ot dita ot there are missing conrefs in the pdf output empty bullets on page for example this is apparently related to recent changes in the conref processing and or as the content appears as expected when building with the master branch of the dita ot repo
| 1
|
11,529
| 14,403,568,241
|
IssuesEvent
|
2020-12-03 16:12:54
|
panther-labs/panther
|
https://api.github.com/repos/panther-labs/panther
|
closed
|
Some log type reporting null p_source_id, p_source_label
|
bug p0 team:data processing
|
### Describe the bug
Seeing null `p_source_id`, `p_source_label` columns for data onboarding through S3.
### Steps to reproduce
Steps to reproduce the behavior:
1. Go to Log Analysis -> Sources
2. Onboard a CloudTrail log type
3. Wait until you have received some data
4. Run the query :
```
SELECT p_source_label, p_source_id FROM panther_logs.aws_cloudtrail
```
5. See `NULL`s
### Expected behavior
`p_source_id`, `p_source_label` should ALWAYS be populated
### Environment
How are you deploying or using Panther?
- Panther version or commit: 1.13
### Additional context
This seems to affect data that have been migrated to the new parser scheme
|
1.0
|
Some log type reporting null p_source_id, p_source_label - ### Describe the bug
Seeing null `p_source_id`, `p_source_label` columns for data onboarding through S3.
### Steps to reproduce
Steps to reproduce the behavior:
1. Go to Log Analysis -> Sources
2. Onboard a CloudTrail log type
3. Wait until you have received some data
4. Run the query :
```
SELECT p_source_label, p_source_id FROM panther_logs.aws_cloudtrail
```
5. See `NULL`s
### Expected behavior
`p_source_id`, `p_source_label` should ALWAYS be populated
### Environment
How are you deploying or using Panther?
- Panther version or commit: 1.13
### Additional context
This seems to affect data that have been migrated to the new parser scheme
|
process
|
some log type reporting null p source id p source label describe the bug seeing null p source id p source label columns for data onboarding through steps to reproduce steps to reproduce the behavior go to log analysis sources onboard a cloudtrail log type wait until you have received some data run the query select p source label p source id from panther logs aws cloudtrail see null s expected behavior p source id p source label should always be populated environment how are you deploying or using panther panther version or commit additional context this seems to affect data that have been migrated to the new parser scheme
| 1
|
60,596
| 14,562,633,546
|
IssuesEvent
|
2020-12-17 00:32:24
|
Dima2021/lodash
|
https://api.github.com/repos/Dima2021/lodash
|
opened
|
CVE-2018-1000620 (High) detected in cryptiles-0.2.2.tgz, cryptiles-3.1.2.tgz
|
security vulnerability
|
## CVE-2018-1000620 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>cryptiles-0.2.2.tgz</b>, <b>cryptiles-3.1.2.tgz</b></p></summary>
<p>
<details><summary><b>cryptiles-0.2.2.tgz</b></p></summary>
<p>General purpose crypto utilities</p>
<p>Library home page: <a href="https://registry.npmjs.org/cryptiles/-/cryptiles-0.2.2.tgz">https://registry.npmjs.org/cryptiles/-/cryptiles-0.2.2.tgz</a></p>
<p>Path to dependency file: lodash/package.json</p>
<p>Path to vulnerable library: lodash/node_modules/cryptiles/package.json</p>
<p>
Dependency Hierarchy:
- codecov.io-0.1.6.tgz (Root Library)
- request-2.42.0.tgz
- hawk-1.1.1.tgz
- :x: **cryptiles-0.2.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>cryptiles-3.1.2.tgz</b></p></summary>
<p>General purpose crypto utilities</p>
<p>Library home page: <a href="https://registry.npmjs.org/cryptiles/-/cryptiles-3.1.2.tgz">https://registry.npmjs.org/cryptiles/-/cryptiles-3.1.2.tgz</a></p>
<p>Path to dependency file: lodash/package.json</p>
<p>Path to vulnerable library: lodash/node_modules/request/node_modules/cryptiles/package.json</p>
<p>
Dependency Hierarchy:
- request-2.85.0.tgz (Root Library)
- hawk-6.0.2.tgz
- :x: **cryptiles-3.1.2.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/Dima2021/lodash/commits/50af7e310d557bae73ac2badb6ac6ecda0cebcb0">50af7e310d557bae73ac2badb6ac6ecda0cebcb0</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Eran Hammer cryptiles version 4.1.1 earlier contains a CWE-331: Insufficient Entropy vulnerability in randomDigits() method that can result in An attacker is more likely to be able to brute force something that was supposed to be random.. This attack appear to be exploitable via Depends upon the calling application.. This vulnerability appears to have been fixed in 4.1.2.
<p>Publish Date: 2018-07-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000620>CVE-2018-1000620</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2018-1000620">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2018-1000620</a></p>
<p>Release Date: 2018-07-09</p>
<p>Fix Resolution: v4.1.2</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"cryptiles","packageVersion":"0.2.2","isTransitiveDependency":true,"dependencyTree":"codecov.io:0.1.6;request:2.42.0;hawk:1.1.1;cryptiles:0.2.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v4.1.2"},{"packageType":"javascript/Node.js","packageName":"cryptiles","packageVersion":"3.1.2","isTransitiveDependency":true,"dependencyTree":"request:2.85.0;hawk:6.0.2;cryptiles:3.1.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v4.1.2"}],"vulnerabilityIdentifier":"CVE-2018-1000620","vulnerabilityDetails":"Eran Hammer cryptiles version 4.1.1 earlier contains a CWE-331: Insufficient Entropy vulnerability in randomDigits() method that can result in An attacker is more likely to be able to brute force something that was supposed to be random.. This attack appear to be exploitable via Depends upon the calling application.. This vulnerability appears to have been fixed in 4.1.2.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000620","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2018-1000620 (High) detected in cryptiles-0.2.2.tgz, cryptiles-3.1.2.tgz - ## CVE-2018-1000620 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>cryptiles-0.2.2.tgz</b>, <b>cryptiles-3.1.2.tgz</b></p></summary>
<p>
<details><summary><b>cryptiles-0.2.2.tgz</b></p></summary>
<p>General purpose crypto utilities</p>
<p>Library home page: <a href="https://registry.npmjs.org/cryptiles/-/cryptiles-0.2.2.tgz">https://registry.npmjs.org/cryptiles/-/cryptiles-0.2.2.tgz</a></p>
<p>Path to dependency file: lodash/package.json</p>
<p>Path to vulnerable library: lodash/node_modules/cryptiles/package.json</p>
<p>
Dependency Hierarchy:
- codecov.io-0.1.6.tgz (Root Library)
- request-2.42.0.tgz
- hawk-1.1.1.tgz
- :x: **cryptiles-0.2.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>cryptiles-3.1.2.tgz</b></p></summary>
<p>General purpose crypto utilities</p>
<p>Library home page: <a href="https://registry.npmjs.org/cryptiles/-/cryptiles-3.1.2.tgz">https://registry.npmjs.org/cryptiles/-/cryptiles-3.1.2.tgz</a></p>
<p>Path to dependency file: lodash/package.json</p>
<p>Path to vulnerable library: lodash/node_modules/request/node_modules/cryptiles/package.json</p>
<p>
Dependency Hierarchy:
- request-2.85.0.tgz (Root Library)
- hawk-6.0.2.tgz
- :x: **cryptiles-3.1.2.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/Dima2021/lodash/commits/50af7e310d557bae73ac2badb6ac6ecda0cebcb0">50af7e310d557bae73ac2badb6ac6ecda0cebcb0</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Eran Hammer cryptiles version 4.1.1 earlier contains a CWE-331: Insufficient Entropy vulnerability in randomDigits() method that can result in An attacker is more likely to be able to brute force something that was supposed to be random.. This attack appear to be exploitable via Depends upon the calling application.. This vulnerability appears to have been fixed in 4.1.2.
<p>Publish Date: 2018-07-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000620>CVE-2018-1000620</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2018-1000620">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2018-1000620</a></p>
<p>Release Date: 2018-07-09</p>
<p>Fix Resolution: v4.1.2</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"cryptiles","packageVersion":"0.2.2","isTransitiveDependency":true,"dependencyTree":"codecov.io:0.1.6;request:2.42.0;hawk:1.1.1;cryptiles:0.2.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v4.1.2"},{"packageType":"javascript/Node.js","packageName":"cryptiles","packageVersion":"3.1.2","isTransitiveDependency":true,"dependencyTree":"request:2.85.0;hawk:6.0.2;cryptiles:3.1.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v4.1.2"}],"vulnerabilityIdentifier":"CVE-2018-1000620","vulnerabilityDetails":"Eran Hammer cryptiles version 4.1.1 earlier contains a CWE-331: Insufficient Entropy vulnerability in randomDigits() method that can result in An attacker is more likely to be able to brute force something that was supposed to be random.. This attack appear to be exploitable via Depends upon the calling application.. This vulnerability appears to have been fixed in 4.1.2.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000620","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in cryptiles tgz cryptiles tgz cve high severity vulnerability vulnerable libraries cryptiles tgz cryptiles tgz cryptiles tgz general purpose crypto utilities library home page a href path to dependency file lodash package json path to vulnerable library lodash node modules cryptiles package json dependency hierarchy codecov io tgz root library request tgz hawk tgz x cryptiles tgz vulnerable library cryptiles tgz general purpose crypto utilities library home page a href path to dependency file lodash package json path to vulnerable library lodash node modules request node modules cryptiles package json dependency hierarchy request tgz root library hawk tgz x cryptiles tgz vulnerable library found in head commit a href found in base branch master vulnerability details eran hammer cryptiles version earlier contains a cwe insufficient entropy vulnerability in randomdigits method that can result in an attacker is more likely to be able to brute force something that was supposed to be random this attack appear to be exploitable via depends upon the calling application this vulnerability appears to have been fixed in publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails eran hammer cryptiles version earlier contains a cwe insufficient entropy vulnerability in randomdigits method that can result in an attacker is more likely to be able to brute force something that was supposed to be random this attack appear to be exploitable via depends upon the calling application this vulnerability appears to have been fixed in vulnerabilityurl
| 0
|
8,425
| 11,593,863,139
|
IssuesEvent
|
2020-02-24 14:22:54
|
prisma/specs
|
https://api.github.com/repos/prisma/specs
|
opened
|
Clean labels in GitHub
|
kind/meta process/candidate
|
The labels to be renamed
- photon -> prisma-client
- photonjs -> prisma-client-js
- photongo -> prisma-client-go?
- lift -> migrate
<img width="248" alt="Screen Shot 2020-02-24 at 15 20 56" src="https://user-images.githubusercontent.com/1328733/75159882-5334f580-5719-11ea-9a31-e0fd394cd781.png">
|
1.0
|
Clean labels in GitHub - The labels to be renamed
- photon -> prisma-client
- photonjs -> prisma-client-js
- photongo -> prisma-client-go?
- lift -> migrate
<img width="248" alt="Screen Shot 2020-02-24 at 15 20 56" src="https://user-images.githubusercontent.com/1328733/75159882-5334f580-5719-11ea-9a31-e0fd394cd781.png">
|
process
|
clean labels in github the labels to be renamed photon prisma client photonjs prisma client js photongo prisma client go lift migrate img width alt screen shot at src
| 1
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.