Dataset schema (column, dtype, value range or class count):

| column | dtype | range / classes |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 7 to 112 |
| repo_url | string | length 36 to 141 |
| action | string | 3 classes |
| title | string | length 1 to 744 |
| labels | string | length 4 to 574 |
| body | string | length 9 to 211k |
| index | string | 10 classes |
| text_combine | string | length 96 to 211k |
| label | string | 2 classes |
| text | string | length 96 to 188k |
| binary_label | int64 | 0 to 1 |
---
row: 386,352 | id: 26,679,072,596 | type: IssuesEvent | created_at: 2023-01-26 16:17:40 | action: opened
repo: arbor-sim/arbor (https://api.github.com/repos/arbor-sim/arbor)
title: Units and behavior of diffusive particles
labels: documentation clarification needed
In the [documentation](https://docs.arbor-sim.org/en/latest/fileformat/nmodl.html#units), `mmol/L` is given as the unit for `ion X diffusive concentration`, but it is not stated what this refers to (e.g., to point mechanisms, as is stated for other quantities on the same page). More importantly, the unit of the amount of diffusive particles is neither stated in the documentation nor immediately clear.
Here is an example to make the issue concrete. Consider a compartment with surface area `A = 2⋅π⋅radius⋅length`. To get from a point measurement of the diffusive particle concentration `Xd` to the amount of particles over the whole surface of the compartment, I find that one needs to compute `Xd⋅A⋅10^(-3)m`, with `A` in units of `m^2`. The unit of the amount of diffusive particles would then be `mmol`. Nevertheless, the exact definition of this quantity (and thus its biophysical meaning) remains unclear, because the origin of the factor `10^(-3)m = 1mm` is unclear. I suspect the problem is that the concentration is defined 'per volume', while all the dynamics currently considered by Arbor happen on a surface.
Furthermore, the documentation does not explain how particle diffusion behaves in general (though some information has already been provided in [this PR](https://github.com/arbor-sim/arbor/pull/1729)).
To summarize, it would be nice to have a thorough description of the behavior of particle diffusion in the documentation. Moreover, the unit of the amount of particles (presumably `mmol`) and its definition should be documented, so that not everyone has to work it out again.
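The dimensional bookkeeping in the example can be sketched numerically (a minimal illustration; the compartment dimensions below are hypothetical, and the 1 mm factor is the reporter's empirical observation, not a documented Arbor constant):

```python
import math

# Hypothetical cylindrical compartment; values are illustrative, not from Arbor.
radius = 1e-6   # m
length = 10e-6  # m
A = 2 * math.pi * radius * length      # lateral surface area, m^2

Xd_mmol_per_L = 0.5                    # diffusive concentration, mmol/L
Xd_mol_per_m3 = Xd_mmol_per_L          # 1 mmol/L equals 1 mol/m^3 numerically

# The reporter's empirical rule: amount = Xd * A * 1e-3 m.
# Dimensionally (mol/m^3) * m^2 * m = mol; the open question in the issue is
# why the extra length factor is exactly 1 mm rather than a geometric quantity.
amount = Xd_mol_per_m3 * A * 1e-3      # per the reported conversion
```

This only checks that the units close; it does not resolve the biophysical meaning the issue asks about.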
index: 1.0 | label: non_process | binary_label: 0
---
row: 13,083 | id: 15,429,487,470 | type: IssuesEvent | created_at: 2021-03-06 03:51:26 | action: opened
repo: bigwolftime/gitmentCommentsPlugin (https://api.github.com/repos/bigwolftime/gitmentCommentsPlugin)
title: 操作系统进程调度 (Operating-system process scheduling)
labels: -xblog-system-process-dispatcher-
https://index1024.gitee.io/xblog/system-process-dispatcher/
I. Scheduling metrics
Turnaround time
One metric for measuring the performance of a scheduling algorithm is turnaround time, defined as:
T(turnaround) = T(completion) - T(arrival)
The average turnaround time of multiple tasks is defined as:
T(average turnaround) = (T1(turnaround of task 1) + T2(turnaround of task 2) + ...) / n, where n is the number of tasks
Response time
II. Scheduling policies on a single CPU
1. First In, First Out (FIFO)
FIFO uses a queue: whichever task arrives first runs first. It is simple in logic and easy to implement.
Scenario 1: tasks A, B and C arrive at the system almost simultaneously, queued in the order A, B, C, and each takes 10s to execute. Then A runs during 0-10s, B during 10-20s, and C during 20-30s.
The turnaround time of each task (assuming all tasks arrive at time 0) is:
A: 10 - 0 = 10
B: 20 - 0 = 20
C: 30 - 0 = 30
Average turnaround time = (10 + 20 + 30) / 3 = 20s
In reality, tasks need different amounts of execution time. Scenario 2: under the same setup, change task A's running time to 100s. Then:
A: 100 - 0 = 100
B: 110 - 0 = 110
C: 120 - 0 = 120
Average turnaround time = (100 + 110 + 120) / 3 = 110s
What if we swap the execution order? Scenario 3: tasks A, B, C run for 100s, 10s, 10s, and arrive at the system in the order B, C, A. Then:
B: 10 - 0 = 10
C: 20 - 0 = 20
A: 120 - 0 = 120
Average turnaround time = (10 + 20 + 120) / 3 = 50s
The gap is obvious: under the same "fair" policy, different execution orders yield different average turnaround times. The problem in Scenario 2 is commonly called the convoy effect: short tasks end up queued behind a long-running one.
2. Shortest Job First (SJF)
To improve average turnaround time, SJF was proposed: run the shortest task first, then the next shortest, and so on.
Scenario 3 of FIFO already behaves like SJF, which performs better than plain FIFO.
Scenario: tasks A, B, C take 100s, 10s, 10s; A arrives at 0s, while B and C arrive at 10s (B ahead of C). Turnaround times:
A: 100 - 0 = 100
B: 110 - 10 = 100
C: 120 - 10 = 110
Average turnaround time = (100 + 100 + 110) / 3 ≈ 103.3s
Note: preemptive scheduling is not considered yet; once a task starts, it runs until completion.
As we can see, SJF still suffers from the convoy effect.
3. Shortest Time-to-Completion First (STCF)
The policies above are all non-preemptive. STCF builds on SJF and allows preemption: when a new task arrives that needs less time than the currently running one, the running task is stopped and its context saved, and the new task runs instead.
Scenario: tasks A, B, C take 100s, 10s, 10s; A arrives at 0s, while B and C arrive at 10s (B ahead of C). Turnaround times:
A: 120 - 0 = 120 (A is preempted by B at 10s and only resumes at 30s)
B: 20 - 10 = 10 (B preempts immediately when it arrives at 10s)
C: 30 - 10 = 20 (C runs after B)
Average turnaround time = (120 + 10 + 20) / 3 = 50s
References
Operating Systems: Three Easy Pieces (《操作系统导论》)
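The turnaround numbers above can be checked with a small simulator (a minimal sketch, not the book's code; the task set mirrors the SJF/STCF scenario: A needs 100s and arrives at 0s, B and C need 10s each and arrive at 10s):

```python
def fifo_turnaround(tasks):
    """tasks: list of (name, arrival, burst); served non-preemptively in list order."""
    t, result = 0, {}
    for name, arrival, burst in tasks:
        t = max(t, arrival) + burst          # wait for arrival, then run to completion
        result[name] = t - arrival           # turnaround = completion - arrival
    return result

def stcf_turnaround(tasks):
    """Preemptive shortest-remaining-time-first, simulated in 1-second ticks."""
    remaining = {name: burst for name, _, burst in tasks}
    arrival = {name: arr for name, arr, _ in tasks}
    finish, t = {}, 0
    while remaining:
        ready = [n for n in remaining if arrival[n] <= t]
        if not ready:                        # CPU idle until the next arrival
            t += 1
            continue
        n = min(ready, key=lambda n: remaining[n])  # shortest remaining time wins
        remaining[n] -= 1
        t += 1
        if remaining[n] == 0:
            finish[n] = t
            del remaining[n]
    return {n: finish[n] - arrival[n] for n in finish}

tasks = [("A", 0, 100), ("B", 10, 10), ("C", 10, 10)]
print(fifo_turnaround(tasks))  # {'A': 100, 'B': 100, 'C': 110} -> average ~103.3s
print(stcf_turnaround(tasks))  # {'A': 120, 'B': 10, 'C': 20} -> average 50s
```

Non-preemptive service in arrival order reproduces the SJF figures, and the preemptive variant reproduces the STCF figures, confirming the averages of roughly 103.3s and 50s.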
index: 1.0 | label: process | binary_label: 1
---
row: 10,765 | id: 13,561,164,255 | type: IssuesEvent | created_at: 2020-09-18 03:46:32 | action: closed
repo: dotnet/runtime (https://api.github.com/repos/dotnet/runtime)
title: MaxWorkingSet supported platforms
labels: area-System.Diagnostics.Process blocking-release
https://github.com/dotnet/runtime/blob/3130ac980c97c0664eddb04411f8f765a9f908d8/src/libraries/System.Diagnostics.Process/ref/System.Diagnostics.Process.cs#L45
The code specifies the supported platforms as Windows only.
However, the [documentation](https://github.com/dotnet/docs/blob/3588fdadac34bee137fe519d50c9a187d704d1fd/docs/core/compatibility/unsupported-apis.md) says it throws on Linux only, which implies that both Windows and macOS are supported.
I think macOS is missing from either the documentation or the supported-platforms attribute.
index: 1.0 | label: process | binary_label: 1
---
row: 13,847 | id: 16,610,098,629 | type: IssuesEvent | created_at: 2021-06-02 10:23:17 | action: closed
repo: sanger/General-Backlog-Items (https://api.github.com/repos/sanger/General-Backlog-Items)
title: GPL-730 [Improvement] Standardize deployment command(s) (duplicate of hot reload)
labels: Deployment process improvement
Some projects (Limber, Labwhere) require an extra deployment flag, ```full_deploy=true```, while other projects don't need it. This is because those projects have an issue that prevents stopping and restarting all Puma workers on deployment.
If full_deploy=true is effectively mandatory, we should give it the right default value so we don't have to pass it as an argument for a normal deploy. As it stands, this is error-prone: it is easy to forget whether the project being deployed needs the full_deploy flag.
Acceptance criteria:
- [ ] Give the ```full_deploy``` flag a default value in psd deployment for all projects, so we don't have to specify it for a normal deploy
index: 1.0 | label: process | binary_label: 1
---
row: 500,328 | id: 14,496,250,986 | type: IssuesEvent | created_at: 2020-12-11 12:29:42 | action: closed
repo: StrangeLoopGames/EcoIssues (https://api.github.com/repos/StrangeLoopGames/EcoIssues)
title: [0.9.2 develop-109] Avatar doesn't update hair, beard and clothes during creation or editing.
labels: Category: Gameplay Priority: High Type: Regression
After pressing Save, however, the avatar takes on what was selected during creation. Steps to reproduce:
- Start creating an avatar and select clothes, hair, and beard other than the defaults. The preview does not update:

- Press Save and open Edit; the avatar now looks like what I selected:

```
Exception in view notifySystem.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.
at System.Collections.Generic.Dictionary`2[TKey,TValue].get_Item (TKey key) [0x00000] in <00000000000000000000000000000000>:0
at Avatar.UpdateClothingPart (System.String slot, ItemView itemView, UnityEngine.Color color) [0x00000] in <00000000000000000000000000000000>:0
at Avatar.UpdateClothingPart (System.String slot) [0x00000] in <00000000000000000000000000000000>:0
at Avatar.UpdateAllClothingParts () [0x00000] in <00000000000000000000000000000000>:0
at System.Runtime.Remoting.Contexts.CrossContextDelegate.Invoke () [0x00000] in <00000000000000000000000000000000>:0
at Eco.Shared.View.View.NotifyChanged (System.String propname) [0x00000] in <00000000000000000000000000000000>:0
at Eco.Shared.View.ViewManager.ReceiveViewUpdate (Eco.Shared.Serialization.BSONArray updates) [0x00000] in <00000000000000000000000000000000>:0
at ClientPacketHandler.ReceiveWhile (Eco.Shared.Networking.NetworkClient client, TimeLimit timeLimit) [0x00000] in <00000000000000000000000000000000>:0
at NetworkManager.OnTimeLimitedUpdate (TimeLimit timeLimit) [0x00000] in <00000000000000000000000000000000>:0
at System.Action`1[T].Invoke (T obj) [0x00000] in <00000000000000000000000000000000>:0
at FramePlanner.PlannerGroup.OnUpdate () [0x00000] in <00000000000000000000000000000000>:0
at FramePlanner.FramePlannerSystem.OnUpdate () [0x00000] in <00000000000000000000000000000000>:0
at Unity.Entities.ComponentSystem.Update () [0x00000] in <00000000000000000000000000000000>:0
at Unity.Entities.ComponentSystemGroup.UpdateAllSystems () [0x00000] in <00000000000000000000000000000000>:0
at Unity.Entities.ComponentSystem.Update () [0x00000] in <00000000000000000000000000000000>:0
at System.Runtime.Remoting.Contexts.CrossContextDelegate.Invoke () [0x00000] in <00000000000000000000000000000000>:0
```
video:
https://drive.google.com/file/d/16ra_Yfmy8TgDwElU_HzCibHVTAWzMBTo/view?usp=sharing
log file:
[Player.log](https://github.com/StrangeLoopGames/EcoIssues/files/5516779/Player.log)
index: 1.0 | label: non_process | binary_label: 0
---
row: 43,948 | id: 17,774,478,011 | type: IssuesEvent | created_at: 2021-08-30 17:22:01 | action: closed
repo: Azure/azure-sdk-for-net (https://api.github.com/repos/Azure/azure-sdk-for-net)
title: [Flaky test] WithTimeoutGenericDoesNotThrowsWhenATimeoutDoesNotOccur
labels: Service Bus Client
```
[xUnit.net 00:00:06.58] Microsoft.Azure.ServiceBus.UnitTests.TaskExtensionsTests.WithTimeoutGenericDoesNotThrowsWhenATimeoutDoesNotOccur [FAIL]
X Microsoft.Azure.ServiceBus.UnitTests.TaskExtensionsTests.WithTimeoutGenericDoesNotThrowsWhenATimeoutDoesNotOccur [5s 338ms]
Error Message:
There should not have been a timeout exception thrown.
Expected: False
Actual: True
Stack Trace:
at Microsoft.Azure.ServiceBus.UnitTests.TaskExtensionsTests.WithTimeoutGenericDoesNotThrowsWhenATimeoutDoesNotOccur() in D:\a\1\s\sdk\servicebus\Microsoft.Azure.ServiceBus\tests\Infrastructure.Tests\TaskExtensionTests.cs:line 101
--- End of stack trace from previous location where exception was thrown ---
```
index: 1.0 | label: non_process | binary_label: 0
---
row: 397,191 | id: 11,725,034,894 | type: IssuesEvent | created_at: 2020-03-10 12:10:32 | action: closed
repo: Sergih28/iRacing-season-organizer (https://api.github.com/repos/Sergih28/iRacing-season-organizer)
title: Fix iRLMS and iRELMS not having the correct races
labels: Priority: Critical Status: Available Type: Bug
Because the two series share a page in the PDF, neither has the correct races: some iRELMS races are shown in the iRLMS column.
index: 1.0 | label: non_process | binary_label: 0
---
row: 160,249 | id: 13,784,847,066 | type: IssuesEvent | created_at: 2020-10-08 21:36:32 | action: opened
repo: pingcap/tiup (https://api.github.com/repos/pingcap/tiup)
title: docs: there should be documentation aimed at Ti{DB|KV|Flash} devs
labels: category/documentation category/usability
As a TiKV dev wanting to use TiUp to test my code, I find there is no documentation for me.
index: 1.0 | label: non_process | binary_label: 0
---
row: 6,864 | id: 3,061,942,222 | type: IssuesEvent | created_at: 2015-08-16 02:50:55 | action: closed
repo: hapijs/vision (https://api.github.com/repos/hapijs/vision)
title: Document response variety view
labels: documentation
From the hapi doc:
- `variety` - a string indicating the type of `source` with available values:
- `'view'` - a view generated with [`reply.view()`](#replyviewtemplate-context-options).
index: 1.0 | label: non_process | binary_label: 0
---
row: 7,123 | id: 10,268,937,922 | type: IssuesEvent | created_at: 2019-08-23 07:49:24 | action: closed
repo: qgis/QGIS (https://api.github.com/repos/qgis/QGIS)
title: error at the end of Processing/GDAL warp logs/runs
labels: Bug Processing
On 3.8.1, when running "warp" from GDAL in Processing, the output is created as expected, but if the output is a temporary layer, the log always ends with an error. In this sense, this bug is very similar to:
https://github.com/qgis/QGIS/issues/29346
index: 1.0 | label: process | binary_label: 1
---
row: 4,408 | id: 7,299,004,297 | type: IssuesEvent | created_at: 2018-02-26 18:43:47 | action: closed
repo: UKHomeOffice/dq-aws-transition (https://api.github.com/repos/UKHomeOffice/dq-aws-transition)
title: Create Prod RDS SQL Server MDS instance in Data Ingest Subnet
labels: DQ Data Ingest Production S4 Processing
Create Prod RDS SQL Server MDS instance in Data Ingest Subnet and migrate required MDS views/tables.
- [x] Create Prod RDS SQL Server 2012 R2 instance in Data Ingest Subnet
- [x] Create MDS Database
- [x] Create mds_user User
- [x] Create mdm schema
- [x] Create working tables from each MD View in MDS database in Sungard
- [x] Generate DDL Create Table statements from working tables
- [x] Generate INSERT statements for each working table
- [x] Create tables in RDS SQL Server MDS instance in mdm schema
- [x] Execute INSERT statement in RDS SQL Server MDS instance
- [x] Drop working tables from MDS database in Sungard
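The "Generate INSERT statements for each working table" step can be sketched as a small script (illustrative only: the table name and columns below are hypothetical, not from the MDS schema, and a real migration would normally use SQL Server tooling with parameterized statements):

```python
def insert_statements(table, columns, rows):
    """Render one INSERT statement per row of a working table."""
    col_list = ", ".join(columns)
    stmts = []
    for row in rows:
        values = ", ".join(
            "NULL" if v is None
            else str(v) if isinstance(v, (int, float))
            else "'" + str(v).replace("'", "''") + "'"  # escape single quotes
            for v in row
        )
        stmts.append(f"INSERT INTO {table} ({col_list}) VALUES ({values});")
    return stmts

# Hypothetical working table mirroring an MD view.
stmts = insert_statements(
    "mdm.example_view_wrk",
    ["id", "name"],
    [(1, "alpha"), (2, "o'brien")],
)
for s in stmts:
    print(s)
```

String escaping by doubling single quotes matches T-SQL literal syntax; anything beyond a one-off migration should use parameterized inserts instead.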
index: 1.0 | label: process | binary_label: 1
---
row: 100,173 | id: 12,507,453,430 | type: IssuesEvent | created_at: 2020-06-02 14:09:02 | action: closed
repo: eclipse/codewind (https://api.github.com/repos/eclipse/codewind)
title: Include social feed on our landing page
labels: area/design area/docs area/website kind/enhancement
<!-- Please fill out the following form to suggest an enhancement. If some fields do not apply to your situation, feel free to skip them.-->
**Codewind version:** 0.4
**OS:** macOS
**Che version:**
**IDE extension version:**
**IDE version:**
**Kubernetes cluster:**
**Description of the enhancement:**
We have a number of social feeds linked from our main landing page, but we would like to see a live feed of new content on the landing page itself (similar to the one at the bottom of https://Kabanero.io).
**Proposed solution:**
<!-- Do you have ideas about how your idea could be implemented?-->
index: 1.0 | label: non_process | binary_label: 0
---
row: 158,876 | id: 13,750,584,981 | type: IssuesEvent | created_at: 2020-10-06 12:15:35 | action: opened
repo: competitive-programming-tools/competitive-problems-tools (https://api.github.com/repos/competitive-programming-tools/competitive-problems-tools)
title: Criar Heatmap (Create heatmap)
labels: documentation management
## Create Heatmap
### What it is:
The heatmap helps with planning around the availability of the group's members over the course of the project.
### Description:
<!-- Describe issue briefly, inserting information necessary for the understanding of the future developer. -->
### Acceptance criteria:
<!-- Define acceptance criteria to verify the completeness of the issue -->
- [ ] criteria 1
- [ ] criteria 2
### Tasks:
<!-- Checklist of actions that should possibly be taken to complete the issue. -->
- [ ] Task 1
- [ ] Task 2
|
1.0
|
Criar Heatmap - ## Criar Heatmap
### O que é:
O Heatmap é útil para auxiliar no planejamento quanto a disponibilidade dos integrantes do grupo durante a realização do projeto.
### Description:
<!-- Describe issue briefly, inserting information necessary for the understanding of the future developer. -->
### Acceptance criteria:
<!-- Define acceptance criteria to verify the completeness of the issue -->
- [ ] criteria 1
- [ ] criteria 2
### Tasks:
<!-- Checklist of actions that should possibly be taken to complete the issue. -->
- [ ] Task 1
- [ ] Task 2
|
non_process
|
create heatmap create heatmap what it is the heatmap is useful to help plan around the availability of the group s members while carrying out the project description acceptance criteria criteria criteria tasks task task
| 0
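The heatmap described in the record above reduces to counting, for each (day, hour) slot, how many group members are available. A minimal sketch in Python — the data shape and function name are assumptions for illustration, not part of the original project:

```python
from collections import Counter

def availability_heatmap(members: dict[str, list[tuple[str, int]]]) -> Counter:
    """Count, for each (day, hour) slot, how many members said they are free.

    `members` maps a member name to the (day, hour) slots they are available.
    The resulting Counter is the heatmap: higher counts mean better slots
    for scheduling group work on the project.
    """
    heat: Counter = Counter()
    for slots in members.values():
        heat.update(slots)
    return heat
```

The best meeting slots are then simply `heat.most_common()`.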
|
445,480
| 12,831,513,255
|
IssuesEvent
|
2020-07-07 05:34:44
|
gambitph/Stackable
|
https://api.github.com/repos/gambitph/Stackable
|
closed
|
Container Column Paddings (right and left) are not working
|
[block] container bug high priority
|
Container Column Paddings (right and left) are not working
Backend

Frontend

|
1.0
|
Container Column Paddings (right and left) are not working - Container Column Paddings (right and left) are not working
Backend

Frontend

|
non_process
|
container column paddings right and left are not working container column paddings right and left are not working backend frontend
| 0
|
7,789
| 10,931,984,662
|
IssuesEvent
|
2019-11-23 14:33:23
|
code4romania/expert-consultation-api
|
https://api.github.com/repos/code4romania/expert-consultation-api
|
closed
|
[Document processing] Implement the processing of pdf file without chapters
|
document processing documents help wanted java spring
|
As a user of the legal consultation web app I want to be able to view the contents of a pdf file in the platform, broken into units like chapters and articles.
Some of the pdf files do not contain chapters, and only contain articles.
A strategy needs to be implemented in order to break the contents of a pdf file only into articles, if the pdf does not contain chapters.
An example document can be provided, upon request.
|
1.0
|
[Document processing] Implement the processing of pdf file without chapters - As a user of the legal consultation web app I want to be able to view the contents of a pdf file in the platform, broken into units like chapters and articles.
Some of the pdf files do not contain chapters, and only contain articles.
A strategy needs to be implemented in order to break the contents of a pdf file only into articles, if the pdf does not contain chapters.
An example document can be provided, upon request.
|
process
|
implement the processing of pdf file without chapters as a user of legal consultation web app i want to be able to view the contents of a pdf file in the platform broken into units like chapters and articles some of the pdf files do not contain chapters and only contain articles a strategy needs to be implemented in order to break the contents of a pdf file only into articles if the pdf does not contain chapters an example document can be provided upon request
| 1
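The fallback strategy described in the record above — segmenting into articles only when a document has no chapters — could be sketched like this. This is a minimal illustration; the regex patterns and function name are assumptions, not part of the original project:

```python
import re

# Hypothetical patterns for chapter and article headings in a legal text.
CHAPTER_RE = re.compile(r"^Chapter\s+\d+", re.MULTILINE)
ARTICLE_RE = re.compile(r"^Article\s+\d+", re.MULTILINE)

def split_units(text: str) -> list[str]:
    """Split into chapters when present, otherwise fall back to articles."""
    pattern = CHAPTER_RE if CHAPTER_RE.search(text) else ARTICLE_RE
    starts = [m.start() for m in pattern.finditer(text)]
    if not starts:
        return [text]  # no recognizable structure: keep the whole document
    # Each unit runs from one heading to the start of the next.
    bounds = starts + [len(text)]
    return [text[bounds[i]:bounds[i + 1]].strip() for i in range(len(starts))]
```

With chapters present, `split_units` yields one unit per chapter; without them, one unit per article.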
|
21,594
| 29,994,340,390
|
IssuesEvent
|
2023-06-26 03:20:12
|
phuocduong-agilityio/internship-huy-dao
|
https://api.github.com/repos/phuocduong-agilityio/internship-huy-dao
|
opened
|
Update UI/UX practice
|
In-process
|
- [ ] Remove `useClickOutside`
- [ ] Change data connection to remove `useEffect()`
- [ ] Use modal for a popup
|
1.0
|
Update UI/UX practice - - [ ] Remove `useClickOutside`
- [ ] Change data connection to remove `useEffect()`
- [ ] Use modal for a popup
|
process
|
update ui ux practice remove useclickoutside change data connection to remove useeffect use modal for a popup
| 1
|
136,632
| 12,728,608,429
|
IssuesEvent
|
2020-06-25 03:09:43
|
vmware/govmomi
|
https://api.github.com/repos/vmware/govmomi
|
closed
|
Query on SetRootCAs
|
documentation/examples
|
Hi,
Is there a way to create a cert pool and assign it to the TLS config of the client directly, instead of using the SetRootCAs() function exposed by the soap client?
|
1.0
|
Query on SetRootCAs - Hi,
Is there a way to create a cert pool and assign it to the TLS config of the client directly, instead of using the SetRootCAs() function exposed by the soap client?
|
non_process
|
query on setrootcas hi is there a way to create a cert pool and assign it to the tls config of the client directly instead of using setrootcas function exposed by the soap client
| 0
|
5,692
| 8,560,921,996
|
IssuesEvent
|
2018-11-09 03:47:58
|
thewca/wca-regulations
|
https://api.github.com/repos/thewca/wca-regulations
|
closed
|
Proposal: Require a second scramble checker for possible WRs
|
2019 controversial process proposal
|
It seems that we can't fix the mis-scrambling issue in general, but it keeps happening for high-profile cubers who are obviously fairly likely to set a WR (cf. Jeff Park).
I'd like to propose the following limited scramble checker policy:
--------
If a competitor has **an official PR single better than the WR average**, then every scramble for the competitor in that event must be checked by a second scrambler who:
1) is a **different person** than the one who scrambled the puzzle,
2) must **verify** that the scramble is **correct for the current attempt**, and
3) **sign the score card** to certify this.
Every incorrect scramble must be fixed and *reported to the Delegate*.
--------
Ideally, I'd like to modify score card software to automatically and clearly indicate such competitors, with a column for the scramble checker to sign.
And if this turns out to be successful and practical, we can think about expanding the scope – e.g. applying this for CR or NR candidates, applying this to top competitors from previous rounds, or allowing other competitors to specifically request this under certain conditions.
@thewca/wrc-team: would you be in favor of this proposal if I worked on implementing it?
We can also try this out without immediately integrating this into the Regulations, but I'm getting *really* frustrated with preventable incidents, and want to get consensus that it's worth doing *something* like this.
|
1.0
|
Proposal: Require a second scramble checker for possible WRs - It seems that we can't fix the mis-scrambling issue in general, but it keeps happening for high-profile cubers who are obviously fairly likely to set a WR (cf. Jeff Park).
I'd like to propose the following limited scramble checker policy:
--------
If a competitor has **an official PR single better than the WR average**, then every scramble for the competitor in that event must be checked by a second scrambler who:
1) is a **different person** than the one who scrambled the puzzle,
2) must **verify** that the scramble is **correct for the current attempt**, and
3) **sign the score card** to certify this.
Every incorrect scramble must be fixed and *reported to the Delegate*.
--------
Ideally, I'd like to modify score card software to automatically and clearly indicate such competitors, with a column for the scramble checker to sign.
And if this turns out to be successful and practical, we can think about expanding the scope – e.g. applying this for CR or NR candidates, applying this to top competitors from previous rounds, or allowing other competitors to specifically request this under certain conditions.
@thewca/wrc-team: would you be in favor of this proposal if I worked on implementing it?
We can also try this out without immediately integrating this into the Regulations, but I'm getting *really* frustrated with preventable incidents, and want to get consensus that it's worth doing *something* like this.
|
process
|
proposal require a second scramble checker for possible wrs it seems that we can t fix the mis scrambling issue in general but it keeps happening for high profile cubers who are obviously fairly likely to set a wr cf jeff park i d like to propose the following limited scramble checker policy if a competitor has an official pr single better than the wr average then every scramble for the competitor in that event must be checked by a second scrambler who is a different person than the one who scrambled the puzzle must verify that the scramble is correct for the current attempt and sign the score card to certify this every incorrect scramble must be fixed and reported to the delegate ideally i d like to modify score card software to automatically and clearly indicate such competitors with a column for the scramble checker to sign and if this turns out to be successful and practical we can think about expanding the scope – e g applying this for cr or nr candidates applying this to top competitors from previous rounds or allowing other competitors to specifically request this under certain conditions thewca wrc team would you be in favor of this proposal if i worked on implementing it we can also try this out without immediately integrating this into the regulations but i m getting really frustrated with preventable incidents and want to get consensus that it s worth doing something like this
| 1
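The trigger condition in the proposal above — an official PR single better than the WR average — is a simple comparison. A hedged sketch (function names and the results format are assumptions; times are in seconds):

```python
def needs_second_checker(pr_single: float, wr_average: float) -> bool:
    """Per the proposed policy: every scramble for this competitor must be
    verified and signed by a second, different scrambler when their official
    PR single is better (i.e. lower) than the current WR average."""
    return pr_single < wr_average

def flagged_competitors(pr_singles: dict[str, float], wr_average: float) -> list[str]:
    """Competitors whose score cards need the extra scramble-checker column."""
    return sorted(name for name, pr in pr_singles.items()
                  if needs_second_checker(pr, wr_average))
```

Score card software could call `flagged_competitors` per event to automatically and clearly indicate such competitors, as the proposal suggests.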
|
5,055
| 7,860,930,833
|
IssuesEvent
|
2018-06-21 21:44:34
|
GoogleCloudPlatform/google-cloud-python
|
https://api.github.com/repos/GoogleCloudPlatform/google-cloud-python
|
opened
|
Regen / release vision, texttospeech, iot
|
api: iot api: text to speech api: vision type: process
|
Now that the gapic-generator [fix for shared modules is merged](https://github.com/googleapis/gapic-generator/pull/2063), these libraries should be regenerated and released: the recent releases of those API libraries all contain the re-introduced module overwrites for shared modules (#5473).
|
1.0
|
Regen / release vision, texttospeech, iot - Now that the gapic-generator [fix for shared modules is merged](https://github.com/googleapis/gapic-generator/pull/2063), these libraries should be regenerated and released: the recent releases of those API libraries all contain the re-introduced module overwrites for shared modules (#5473).
|
process
|
regen release vision texttospeech iot now that the gapic generator the recent releases of those api libraries all contain the re introduced module overwrites for shared modules
| 1
|
9,446
| 12,426,847,797
|
IssuesEvent
|
2020-05-24 23:33:05
|
aliceingovernment/aliceingovernment.com
|
https://api.github.com/repos/aliceingovernment/aliceingovernment.com
|
closed
|
tian/ plan necessary steps for reshaping AiG to focus on universities
|
in-process
|
1. Only accept emails ending in @university.edu for each university separately (e.g. MIT accepts @mit.edu ONLY, UNAM accepts @unam.mx ONLY)
use spreadsheet shared with elf
2. DONE---Change copy on /home and /vote
3. Email existent voters and say dynamics are changing
4. Save existent voters with their votes and opinions
5. DONE---Come up with a relevant list of options instead of Drawdown (research on university possibilities / divestment / more green programs / waste management / etc..)
(talk with divestment peeps at universities)
6. Order universities by # of votes (except if you are on a specific uni site, like .com/MIT, where you see MIT first and then the rest of the unis ordered by # of votes)
7. Display a list of universities to choose from, plus an "add one" (fill form in site).
8. Contact existent voters that are university voters and invite them to vote again
|
1.0
|
tian/ plan necessary steps for reshaping AiG to focus on universities - 1. Only accept emails ending in @university.edu for each university separately (e.g. MIT accepts @mit.edu ONLY, UNAM accepts @unam.mx ONLY)
use spreadsheet shared with elf
2. DONE---Change copy on /home and /vote
3. Email existent voters and say dynamics are changing
4. Save existent voters with their votes and opinions
5. DONE---Come up with a relevant list of options instead of Drawdown (research on university possibilities / divestment / more green programs / waste management / etc..)
(talk with divestment peeps at universities)
6. Order universities by # of votes (except if you are on a specific uni site, like .com/MIT, where you see MIT first and then the rest of the unis ordered by # of votes)
7. Display a list of universities to choose from, plus an "add one" (fill form in site).
8. Contact existent voters that are university voters and invite them to vote again
|
process
|
tian plan necessary steps for reshaping aig to focus on universities only accept emails ending in university edu for each university separately e g mit accepts mit edu only unam accepts unam mx only use spreadsheet shared with elf done change copy on home and vote email existent voters and say dynamics are changing save existent voters with their votes and opinions done come up with a relevant list of options instead of drawdown research on university possibilities divestment more green programs waste management etc talk with divestment peeps at universities order universities by of votes except if you are in a specific uni site like com mit you see mit first and then the rest of unis ordered by of votes display list of universities to chose from plus a add one fill form in site contact existent voters that are university voters and invite them to vote again
| 1
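Step 1 above (accept only each university's own email domain) can be sketched as a small validator. The mapping below is built from the examples in the issue and is otherwise an assumption:

```python
# Per-university allowed email domains, from the issue's examples.
ALLOWED_DOMAINS = {"MIT": "mit.edu", "UNAM": "unam.mx"}

def accepts_email(university: str, email: str) -> bool:
    """Accept only addresses in the university's own domain
    (e.g. MIT accepts @mit.edu ONLY, UNAM accepts @unam.mx ONLY)."""
    domain = ALLOWED_DOMAINS.get(university)
    return domain is not None and email.lower().endswith("@" + domain)
```

In practice the mapping would come from the shared spreadsheet the plan mentions, with one entry per participating university.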
|
4,830
| 7,725,908,136
|
IssuesEvent
|
2018-05-24 19:28:54
|
kaching-hq/Privacy-and-Security
|
https://api.github.com/repos/kaching-hq/Privacy-and-Security
|
opened
|
Prepare a "Data Privacy Guide" for our clients
|
Processes
|
- [ ] Present an overview of our security measures
- [ ] Present guidelines on supporting data subject rights (with our apps)
- [ ] Guide readers through setting up secure comms with us
- [ ] Share contact information for data protection purposes
|
1.0
|
Prepare a "Data Privacy Guide" for our clients - - [ ] Present an overview of our security measures
- [ ] Present guidelines on supporting data subject rights (with our apps)
- [ ] Guide readers through setting up secure comms with us
- [ ] Share contact information for data protection purposes
|
process
|
prepare a data privacy guide for our clients present an overview of our security measures present guidelines on supporting data subject rights with our apps guide readers through setting up secure comms with us share contact information for data protection purposes
| 1
|
413,394
| 12,066,565,446
|
IssuesEvent
|
2020-04-16 11:59:54
|
hotosm/tasking-manager
|
https://api.github.com/repos/hotosm/tasking-manager
|
closed
|
Notification Page: alignment issues
|
Component: Frontend Priority: Medium Status: Needs implementation
|
Noticed a few text and icon alignment issues. Also unsure of the list icon next to the notification type in text. Can we just have the notification type as `Mention, Validation, System, Task Comment` instead of a `notification` word appended to each type?

|
1.0
|
Notification Page: alignment issues - Noticed a few text and icon alignment issues. Also unsure of the list icon next to the notification type in text. Can we just have the notification type as `Mention, Validation, System, Task Comment` instead of a `notification` word appended to each type?

|
non_process
|
notification page alignment issues noticed a few text and icon alignment issues also unsure of the list icon next to the notification type in text can we just have the notification type as mention validation system task comment instead of a notification word appended to each type
| 0
|
18,754
| 24,657,093,725
|
IssuesEvent
|
2022-10-18 01:18:17
|
ECP-WarpX/WarpX
|
https://api.github.com/repos/ECP-WarpX/WarpX
|
closed
|
Which software should I use for post-processing?
|
question component: post-processing component: boosted frame
|
Hi,
Thank you for your open source work. After I learned the power of Warpx, I tried running a few examples. In example Beam-driven electron acceleration, the output file is lab_frame_data. I tried to use ParaView and VisIt to post-process the data, but it didn't work.
What tools should I use to process this data?
Hope to hear from you!


|
1.0
|
Which software should I use for post-processing? - Hi,
Thank you for your open source work. After I learned the power of Warpx, I tried running a few examples. In example Beam-driven electron acceleration, the output file is lab_frame_data. I tried to use ParaView and VisIt to post-process the data, but it didn't work.
What tools should I use to process this data?
Hope to hear from you!


|
process
|
which software should i use for post processing? hi, thank you for your open source work after i learned the power of warpx i tried running a few examples in example beam driven electron acceleration the output file is lab frame data i tried to use paraview and visit to post process the data but it didn t work what tools should i use to process this data hope to hear from you
| 1
|
5,760
| 8,598,731,346
|
IssuesEvent
|
2018-11-15 22:47:08
|
gfrebello/qs-trip-planning-procedure
|
https://api.github.com/repos/gfrebello/qs-trip-planning-procedure
|
closed
|
Refactor JDL file
|
Priority:High Process:Implement Requirement
|
The JDL entity-relationship model contains some mistakes such as the absence of prices in FlightReservation and a ChosenAttraction entity. We must fix it and rebuild the database.
|
1.0
|
Refactor JDL file - The JDL entity-relationship model contains some mistakes such as the absence of prices in FlightReservation and a ChosenAttraction entity. We must fix it and rebuild the database.
|
process
|
refactor jdl file the jdl entity relationship model contains some mistakes such as the absence of prices in flightreservation and a chosenattraction entity we must fix it and rebuild the database
| 1
|
7,955
| 11,137,565,415
|
IssuesEvent
|
2019-12-20 19:42:47
|
openopps/openopps-platform
|
https://api.github.com/repos/openopps/openopps-platform
|
closed
|
Track student application withdrawals
|
Apply Process Approved Artifact Needed Requirements Ready State Dept.
|
Who: Students who withdraw their applications
What: Track data related to withdrawals
Why: in order to provide metrics on how many withdrawals occurred
Acceptance Criteria:
When a student application is withdrawn, track the following:
- That the application was withdrawn
- the date the application was withdrawn
|
1.0
|
Track student application withdrawals - Who: Students who withdraw their applications
What: Track data related to withdrawals
Why: in order to provide metrics on how many withdrawals occurred
Acceptance Criteria:
When a student application is withdrawn, track the following:
- That the application was withdrawn
- the date the application was withdrawn
|
process
|
track student application withdrawals who students who withdraw their applications what track data related to withdrawals why in order to provide metrics on how many withdrawals occurred acceptance criteria when a student application is withdrawn track the following that the application was withdrawn the date the application was withdrawn
| 1
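The acceptance criteria above ask for exactly two facts per withdrawn application: that it was withdrawn, and when. A minimal model — the class and field names are illustrative assumptions, not the platform's actual schema:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Application:
    student: str
    withdrawn: bool = False              # criterion 1: that it was withdrawn
    withdrawn_on: Optional[date] = None  # criterion 2: the withdrawal date

    def withdraw(self, when: date) -> None:
        """Record a student's withdrawal so metrics can be reported later."""
        self.withdrawn = True
        self.withdrawn_on = when

def withdrawal_count(apps: list) -> int:
    """The metric the issue asks for: how many withdrawals occurred."""
    return sum(1 for a in apps if a.withdrawn)
```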
|
98,580
| 20,763,171,954
|
IssuesEvent
|
2022-03-15 18:00:48
|
rust-analyzer/rust-analyzer
|
https://api.github.com/repos/rust-analyzer/rust-analyzer
|
closed
|
[VSCode] Allow customisation of inlay hint styling
|
E-has-instructions A-vscode S-actionable A-inlay-hints
|
I personally don't like the new smaller inlay hints, and I think it would be great to be able to customise it.
It was introduced here: #6394
Discussion of customising was held here: #6380
|
1.0
|
[VSCode] Allow customisation of inlay hint styling - I personally don't like the new smaller inlay hints, and I think it would be great to be able to customise it.
It was introduced here: #6394
Discussion of customising was held here: #6380
|
non_process
|
allow customisation of inlay hint styling i personally don t like the new smaller inlay hints and i think it would be great to be able to customise it it was introduced here discussion of customising was held here
| 0
|
2,401
| 5,193,012,012
|
IssuesEvent
|
2017-01-22 15:10:18
|
AllenFang/react-bootstrap-table
|
https://api.github.com/repos/AllenFang/react-bootstrap-table
|
closed
|
Table height, Chrome
|
bug inprocess
|
Hey
If I set the height on the table, it works fine across Firefox and Chrome. If I don't set the height, Firefox seems to render fine but Chrome seems to pin the table at 100% of the screen.
Thanks
Andrew
|
1.0
|
Table height, Chrome - Hey
If I set the height on the table, it works fine across Firefox and Chrome. If I don't set the height, Firefox seems to render fine but Chrome seems to pin the table at 100% of the screen.
Thanks
Andrew
|
process
|
table height chrome hey if i set the height on the table it works fine across firefox and chrome if i don t set the height firefox seems to render fine but chrome seems to pin the table at of the screen thanks andrew
| 1
|
15,108
| 18,844,891,164
|
IssuesEvent
|
2021-11-11 13:56:27
|
googleapis/python-pubsub
|
https://api.github.com/repos/googleapis/python-pubsub
|
opened
|
Make sure type checks pass with mypy as well
|
type: process
|
#500 added type annotation to the codebase, but we only checked that static type analysis passes under `pytype`. A lot of people, on the other hand, use `mypy`, but `mypy` currently reports 97 errors.
If we want users to leverage our type annotations, we should make sure that the checks pass with `mypy` as well, and then we can also add `py.typed` file to declare the library type-checked.
An additional requirement of this issue is that a new nox session named `mypy` needs to be added and enabled by default.
|
1.0
|
Make sure type checks pass with mypy as well - #500 added type annotation to the codebase, but we only checked that static type analysis passes under `pytype`. A lot of people, on the other hand, use `mypy`, but `mypy` currently reports 97 errors.
If we want users to leverage our type annotations, we should make sure that the checks pass with `mypy` as well, and then we can also add `py.typed` file to declare the library type-checked.
An additional requirement of this issue is that a new nox session named `mypy` needs to be added and enabled by default.
|
process
|
make sure type checks pass with mypy as well added type annotation to the codebase but we only checked that static type analysis passes under pytype a lot of people on the other hand use mypy but mypy currently reports errors if we want users to leverage our type annotations we should make sure that the checks pass with mypy as well and then we can also add py typed file to declare the library type checked an additional requirement of this issue is that a new nox session named mypy needs to be added and enabled by default
| 1
|
46,845
| 11,899,416,526
|
IssuesEvent
|
2020-03-30 08:58:26
|
ClickHouse/ClickHouse
|
https://api.github.com/repos/ClickHouse/ClickHouse
|
closed
|
Clickhouse dictionary by view error:code 393 There is no query
|
build
|
**view sql is:**
CREATE VIEW shared_dwd.dim_district AS
SELECT
t1.id AS id,
t1.name AS name,
t1.last_modified_date AS last_modified_datetime,
t2.country_name AS country_name
FROM
shared_ods.view_dist_district t1
GLOBAL ANY LEFT JOIN shared_ods.view_dist_country t2 ON t2.id = t1.country_id AND t2.enabled = '1';
**dictionary file is:**
`<yandex>
<comment>dim_district</comment>
<dictionary>
<name>district</name>
<source>
<clickhouse>
<host>localhost</host>
<port>9000</port>
<user>default</user>
<password>*****</password>
<db>shared_dwd</db>
<table>dim_district</table>
</clickhouse>
</source>
<layout>
<complex_key_hashed />
</layout>
<structure>
<key>
<attribute>
<name>id</name>
<type>String</type>
</attribute>
</key>
<attribute>
<name>name</name>
<type>String</type>
<expression>assumeNotNull(name)</expression>
<null_value></null_value>
<injective>true</injective>
</attribute>
<attribute>
<name>country_name</name>
<type>String</type>
<expression>assumeNotNull(country_name)</expression>
<null_value></null_value>
<injective>true</injective>
</attribute>
</structure>
<lifetime>
<min>10</min>
<max>60</max>
</lifetime>
<odbc>
<invalidate_query>select max(last_modified_datetime) from shared_dwd.dim_district limit 1</invalidate_query>
</odbc>
</dictionary>
</yandex>`
error message is :
ExternalDictionariesLoader: Could not load external dictionary 'label', next update is scheduled at 2020-03-30 16:17:05: Code: 393, e.displayText() = DB::Exception: There is no query, Stack trace:
0. 0x3512b60 StackTrace::StackTrace() /usr/bin/clickhouse
1. 0x351cdaf DB::Exception::Exception(std::string const&, int) /usr/bin/clickhouse
2. 0x61639c9 DB::Context::getQueryContext() const /usr/bin/clickhouse
3. 0x6163a09 DB::Context::getSampleBlockCache() const /usr/bin/clickhouse
4. 0x61f8853 DB::InterpreterSelectWithUnionQuery::getSampleBlock(std::shared_ptr<DB::IAST> const&, DB::Context const&) /usr/bin/clickhouse
5. 0x68a725a DB::getNamesAndTypeListFromTableExpression(DB::ASTTableExpression const&, DB::Context const&) /usr/bin/clickhouse
6. 0x68b4d6c DB::getDatabaseAndTablesWithColumnNames(DB::ASTSelectQuery const&, DB::Context const&) /usr/bin/clickhouse
7. 0x62a9756 DB::SyntaxAnalyzer::analyze(std::shared_ptr<DB::IAST>&, DB::NamesAndTypesList const&, std::vector<std::string, std::allocator<std::string> > const&, s
td::shared_ptr<DB::IStorage>, DB::NamesAndTypesList const&) const /usr/bin/clickhouse
8. 0x61cbf9f ? /usr/bin/clickhouse
9. 0x61cd387 DB::InterpreterSelectQuery::InterpreterSelectQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, std::shared_ptr<DB::IBlockInputStream> const&
, std::shared_ptr<DB::IStorage> const&, DB::SelectQueryOptions const&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
10. 0x61cde4b DB::InterpreterSelectQuery::InterpreterSelectQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions const&, std::vector<s
td::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
11. 0x61f7e63 DB::InterpreterSelectWithUnionQuery::InterpreterSelectWithUnionQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions con
st&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
12. 0x61cda9f DB::InterpreterSelectQuery::InterpreterSelectQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, std::shared_ptr<DB::IBlockInputStream> const
&, std::shared_ptr<DB::IStorage> const&, DB::SelectQueryOptions const&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
13. 0x61cde4b DB::InterpreterSelectQuery::InterpreterSelectQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions const&, std::vector<s
td::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
14. 0x61f7e63 DB::InterpreterSelectWithUnionQuery::InterpreterSelectWithUnionQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions con
st&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
15. 0x61cda9f DB::InterpreterSelectQuery::InterpreterSelectQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, std::shared_ptr<DB::IBlockInputStream> const
&, std::shared_ptr<DB::IStorage> const&, DB::SelectQueryOptions const&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
16. 0x61cde4b DB::InterpreterSelectQuery::InterpreterSelectQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions const&, std::vector<s
td::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
17. 0x61f7e63 DB::InterpreterSelectWithUnionQuery::InterpreterSelectWithUnionQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions con
st&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
18. 0x69675da DB::StorageView::read(std::vector<std::string, std::allocator<std::string> > const&, DB::SelectQueryInfo const&, DB::Context const&, DB::QueryProcess
ingStage::Enum, unsigned long, unsigned int) /usr/bin/clickhouse
19. 0x61eb9c5 void DB::InterpreterSelectQuery::executeFetchColumns<DB::InterpreterSelectQuery::Pipeline>(DB::QueryProcessingStage::Enum, DB::InterpreterSelectQuery
::Pipeline&, std::shared_ptr<DB::SortingInfo> const&, std::shared_ptr<DB::PrewhereInfo> const&, std::vector<std::string, std::allocator<std::string> > const&, DB::
QueryPipeline&) /usr/bin/clickhouse
please help me solve this problem, thank you so much
|
1.0
|
Clickhouse dictionary by view error:code 393 There is no query - **view sql is:**
CREATE VIEW shared_dwd.dim_district AS
SELECT
t1.id AS id,
t1.name AS name,
t1.last_modified_date AS last_modified_datetime,
t2.country_name AS country_name
FROM
shared_ods.view_dist_district t1
GLOBAL ANY LEFT JOIN shared_ods.view_dist_country t2 ON t2.id = t1.country_id AND t2.enabled = '1';
**dictionary file is:**
`<yandex>
<comment>dim_district</comment>
<dictionary>
<name>district</name>
<source>
<clickhouse>
<host>localhost</host>
<port>9000</port>
<user>default</user>
<password>*****</password>
<db>shared_dwd</db>
<table>dim_district</table>
</clickhouse>
</source>
<layout>
<complex_key_hashed />
</layout>
<structure>
<key>
<attribute>
<name>id</name>
<type>String</type>
</attribute>
</key>
<attribute>
<name>name</name>
<type>String</type>
<expression>assumeNotNull(name)</expression>
<null_value></null_value>
<injective>true</injective>
</attribute>
<attribute>
<name>country_name</name>
<type>String</type>
<expression>assumeNotNull(country_name)</expression>
<null_value></null_value>
<injective>true</injective>
</attribute>
</structure>
<lifetime>
<min>10</min>
<max>60</max>
</lifetime>
<odbc>
<invalidate_query>select max(last_modified_datetime) from shared_dwd.dim_district limit 1</invalidate_query>
</odbc>
</dictionary>
</yandex>`
error message is :
ExternalDictionariesLoader: Could not load external dictionary 'label', next update is scheduled at 2020-03-30 16:17:05: Code: 393, e.displayText() = DB::Exception: There is no query, Stack trace:
0. 0x3512b60 StackTrace::StackTrace() /usr/bin/clickhouse
1. 0x351cdaf DB::Exception::Exception(std::string const&, int) /usr/bin/clickhouse
2. 0x61639c9 DB::Context::getQueryContext() const /usr/bin/clickhouse
3. 0x6163a09 DB::Context::getSampleBlockCache() const /usr/bin/clickhouse
4. 0x61f8853 DB::InterpreterSelectWithUnionQuery::getSampleBlock(std::shared_ptr<DB::IAST> const&, DB::Context const&) /usr/bin/clickhouse
5. 0x68a725a DB::getNamesAndTypeListFromTableExpression(DB::ASTTableExpression const&, DB::Context const&) /usr/bin/clickhouse
6. 0x68b4d6c DB::getDatabaseAndTablesWithColumnNames(DB::ASTSelectQuery const&, DB::Context const&) /usr/bin/clickhouse
7. 0x62a9756 DB::SyntaxAnalyzer::analyze(std::shared_ptr<DB::IAST>&, DB::NamesAndTypesList const&, std::vector<std::string, std::allocator<std::string> > const&, std::shared_ptr<DB::IStorage>, DB::NamesAndTypesList const&) const /usr/bin/clickhouse
8. 0x61cbf9f ? /usr/bin/clickhouse
9. 0x61cd387 DB::InterpreterSelectQuery::InterpreterSelectQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, std::shared_ptr<DB::IBlockInputStream> const&, std::shared_ptr<DB::IStorage> const&, DB::SelectQueryOptions const&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
10. 0x61cde4b DB::InterpreterSelectQuery::InterpreterSelectQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions const&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
11. 0x61f7e63 DB::InterpreterSelectWithUnionQuery::InterpreterSelectWithUnionQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions const&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
12. 0x61cda9f DB::InterpreterSelectQuery::InterpreterSelectQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, std::shared_ptr<DB::IBlockInputStream> const&, std::shared_ptr<DB::IStorage> const&, DB::SelectQueryOptions const&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
13. 0x61cde4b DB::InterpreterSelectQuery::InterpreterSelectQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions const&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
14. 0x61f7e63 DB::InterpreterSelectWithUnionQuery::InterpreterSelectWithUnionQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions const&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
15. 0x61cda9f DB::InterpreterSelectQuery::InterpreterSelectQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, std::shared_ptr<DB::IBlockInputStream> const&, std::shared_ptr<DB::IStorage> const&, DB::SelectQueryOptions const&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
16. 0x61cde4b DB::InterpreterSelectQuery::InterpreterSelectQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions const&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
17. 0x61f7e63 DB::InterpreterSelectWithUnionQuery::InterpreterSelectWithUnionQuery(std::shared_ptr<DB::IAST> const&, DB::Context const&, DB::SelectQueryOptions const&, std::vector<std::string, std::allocator<std::string> > const&) /usr/bin/clickhouse
18. 0x69675da DB::StorageView::read(std::vector<std::string, std::allocator<std::string> > const&, DB::SelectQueryInfo const&, DB::Context const&, DB::QueryProcessingStage::Enum, unsigned long, unsigned int) /usr/bin/clickhouse
19. 0x61eb9c5 void DB::InterpreterSelectQuery::executeFetchColumns<DB::InterpreterSelectQuery::Pipeline>(DB::QueryProcessingStage::Enum, DB::InterpreterSelectQuery::Pipeline&, std::shared_ptr<DB::SortingInfo> const&, std::shared_ptr<DB::PrewhereInfo> const&, std::vector<std::string, std::allocator<std::string> > const&, DB::QueryPipeline&) /usr/bin/clickhouse
please help me solve this problem, thank you so much
|
non_process
|
clickhouse dictionary by view error code there is no query view sql is create view shared dwd dim district as select id as id name as name last modified date as last modified datetime country name as country name from shared ods view dist district global any left join shared ods view dist country on id country id and enabled dictionary file is dim district district localhost default shared dwd dim district id string name string assumenotnull name true country name string assumenotnull country name true select max last modified datetime from shared dwd dim district limit error message is externaldictionariesloader could not load external dictionary label next update is scheduled at code e displaytext db exception there is no query stack trace stacktrace stacktrace usr bin clickhouse db exception exception std string const int usr bin clickhouse db context getquerycontext const usr bin clickhouse db context getsampleblockcache const usr bin clickhouse db interpreterselectwithunionquery getsampleblock std shared ptr const db context const usr bin clickhouse db getnamesandtypelistfromtableexpression db asttableexpression const db context const usr bin clickhouse db getdatabaseandtableswithcolumnnames db astselectquery const db context const usr bin clickhouse db syntaxanalyzer analyze std shared ptr db namesandtypeslist const std vector const std shared ptr db namesandtypeslist const const usr bin clickhouse usr bin clickhouse db interpreterselectquery interpreterselectquery std shared ptr const db context const std shared ptr const std shared ptr const db selectqueryoptions const std vector const usr bin clickhouse db interpreterselectquery interpreterselectquery std shared ptr const db context const db selectqueryoptions const std vector std string std allocator const usr bin clickhouse db interpreterselectwithunionquery interpreterselectwithunionquery std shared ptr const db context const db selectqueryoptions const std vector const usr bin clickhouse db interpreterselectquery interpreterselectquery std shared ptr const db context const std shared ptr const std shared ptr const db selectqueryoptions const std vector const usr bin clickhouse db interpreterselectquery interpreterselectquery std shared ptr const db context const db selectqueryoptions const std vector std string std allocator const usr bin clickhouse db interpreterselectwithunionquery interpreterselectwithunionquery std shared ptr const db context const db selectqueryoptions const std vector const usr bin clickhouse db interpreterselectquery interpreterselectquery std shared ptr const db context const std shared ptr const std shared ptr const db selectqueryoptions const std vector const usr bin clickhouse db interpreterselectquery interpreterselectquery std shared ptr const db context const db selectqueryoptions const std vector std string std allocator const usr bin clickhouse db interpreterselectwithunionquery interpreterselectwithunionquery std shared ptr const db context const db selectqueryoptions const std vector const usr bin clickhouse db storageview read std vector const db selectqueryinfo const db context const db queryprocessingstage enum unsigned long unsigned int usr bin clickhouse void db interpreterselectquery executefetchcolumns db queryprocessingstage enum db interpreterselectquery pipeline std shared ptr const std shared ptr const std vector const db querypipeline usr bin clickhouse please help me solve this problem, thank you so much
| 0
|
45,987
| 2,944,362,656
|
IssuesEvent
|
2015-07-03 02:56:15
|
tokenly/swapbot
|
https://api.github.com/repos/tokenly/swapbot
|
closed
|
Add basic markdown formatting to Swapbot descriptions
|
high priority UX
|
Because this looks terrible and i'm spoiled by @cryptonaut420 including it everywhere

|
1.0
|
Add basic markdown formatting to Swapbot descriptions - Because this looks terrible and i'm spoiled by @cryptonaut420 including it everywhere

|
non_process
|
add basic markdown formatting to swapbot descriptions because this looks terrible and i m spoiled by including it everywhere
| 0
|
305,623
| 9,372,070,635
|
IssuesEvent
|
2019-04-03 16:46:45
|
mozilla/addons-code-manager
|
https://api.github.com/repos/mozilla/addons-code-manager
|
opened
|
Update the content of the `Index` component
|
priority: mvp priority: p3
|
The content of the `Index` component was useful to quickly get access to add-ons/versions but those links won't work in stage or prod and we should start relying on different -dev data to maybe catch issues.
I am not sure what we could add instead, maybe [this gif](http://bestanimations.com/Site/Construction/under-construction-gif-6.gif)
|
2.0
|
Update the content of the `Index` component - The content of the `Index` component was useful to quickly get access to add-ons/versions but those links won't work in stage or prod and we should start relying on different -dev data to maybe catch issues.
I am not sure what we could add instead, maybe [this gif](http://bestanimations.com/Site/Construction/under-construction-gif-6.gif)
|
non_process
|
update the content of the index component the content of the index component was useful to quickly get access to add ons versions but those links won t work in stage or prod and we should start relying on different dev data to maybe catch issues i am not sure what we could add instead maybe
| 0
|
294,612
| 9,037,597,293
|
IssuesEvent
|
2019-02-09 12:20:19
|
Tribler/tribler
|
https://api.github.com/repos/Tribler/tribler
|
reopened
|
IPv8/Dispersy dual stack peer explosion
|
_Top Priority_ bug
|
Tribler takes a whole core for nearly 100% after running for days (167h of CPU of htop equal 75% avg. load):


The swp is getting full, maxed out box, but that does not boost user-space CPU usage. Running v7.1.5 mostly idle for 9 days continuously on an Ubuntu 18.04.1 LTS box. Never done a single keyword search, two Fedora seeding (0MB uploaded), but donating encrypted relay bandwidth:

Gathering solid credits:


Reasonable amount of memory, but not growing out of control:

Some amount of errors in the logs:
```
INFO 1548520097.02 community:83 (Request) Timeout for ping to Peer<94.174.45.58:7759, pnK7Ahoxt5xGR/w29hfI5EATiZI=>
INFO 1548520097.70 handler:152 (TftpHandler) TFTP[23593 C 12.124.121.166:7759][8b6cd3e1e5dc97efc3cb469631d4980f6dfe13d3.torrent] timed out
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
INFO 1548520098.94 handler:137 (TftpHandler) TFTP[34101 C 23.252.56.68:7759][8b6cd3e1e5dc97efc3cb469631d4980f6dfe13d3.torrent] started
INFO 1548520101.04 torrent_checker:151 (TorrentChecker) Selected 2 new torrents to check on tracker: udp://tracker.xfsub.com:6868
INFO 1548520101.07 handler:152 (TftpHandler) TFTP[34101 C 23.252.56.68:7759][8b6cd3e1e5dc97efc3cb469631d4980f6dfe13d3.torrent] timed out
INFO 1548520101.17 session:419 (UdpTrackerSession) Error when querying UDP tracker: [Failure instance: Traceback (failure with no frames): <class 'twisted.internet.error.DNSLookupError'>: DNS lookup failed: tracker.xfsub.com.
] udp://tracker.xfsub.com:6868
WARNING 1548520101.17 torrent_checker:262 (TorrentChecker) Got session error for URL udp://tracker.xfsub.com:6868: [Failure instance: Traceback (failure with no frames): <type 'exceptions.ValueError'>: UDP tracker failed for url udp://tracker.xfsub.com:6868 (error: DNS lookup failed: tracker.xfsub.com.)
]
INFO 1548520102.49 handler:137 (TftpHandler) TFTP[1609 C 74.66.207.23:9683][8b6cd3e1e5dc97efc3cb469631d4980f6dfe13d3.torrent] started
WARNING 1548520103.63 handler:240 (TftpHandler) got non-existing session from 2.24.234.96:7759, id = 28608
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<86.126.173.209:19811, F8TPp56do0IvYd8QJ1KyoHijOMg=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<193.165.189.6:26423, rfJDpWM47gT55U9Izwe9R8tGROA=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<5.79.71.187:35035, hiNKfy4uEPKMH9IJ177OKc3zh3c=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<88.21.114.167:7759, s8Ro19O1qVHHvAnAjvAv1IAvbGw=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<70.121.56.5:7759, sLxOpSmBdM0YfABKCIHyZVuAxhk=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<92.20.137.112:55502, r8nBAHqUi9ezNc9TZZHMvTRcrVM=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<81.171.8.17:35200, 7s7HWCaTpTTAi6LSG09/EfZVxRo=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<81.171.8.19:35200, qIqJyrGyUl6VPJIBPmx2FBIZ3Ao=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<81.171.8.18:35160, XO8ZrfspX3GBzL89CMTRQ0YstB8=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<81.171.8.19:35075, PAOwFnIPqursajEC4nwLZgH2Tzw=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<88.89.71.78:7758, o4/QqaweAuzAW85zt/HUQBfQ024=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<185.5.165.113:47892, rWrfbl3Y2xZaRXwwuZk9ZFYjd0I=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<81.171.8.17:35060, vAvMRdf0hPb5xeJ/PTviWHiQHKw=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<81.171.8.17:35010, KLP/0BIzlDf7HiYL2QpV+7oCWHk=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<81.171.27.193:35040, RAVrUhLCsiUUKx+ipKxmX2uV2lQ=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<81.171.27.193:35005, kcfBnK+3zeEGwy8J0LcWurCoZlg=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<37.48.77.235:35010, t07Ao4YBTvmvw6UqH9O7yLx5Mg0=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<37.48.77.237:35035, N+LrqteDPqm2fSi77BOudycsRwI=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<37.48.77.238:35020, E73U8Eh5g1vFIXbMmOq+NqplIRQ=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<95.96.66.166:7759, uNOlxm1LhnBFxlRhwHLzrgG1zQM=>
INFO 1548520103.97 community:83 (Request) Timeout for find to Peer<86.126.173.209:19811, F8TPp56do0IvYd8QJ1KyoHijOMg=>
INFO 1548520104.89 handler:152 (TftpHandler) TFTP[1609 C 74.66.207.23:9683][8b6cd3e1e5dc97efc3cb469631d4980f6dfe13d3.torrent] timed out
INFO 1548520105.00 handler:137 (TftpHandler) TFTP[39475 C 75.169.51.124:10054][8b6cd3e1e5dc97efc3cb469631d4980f6dfe13d3.torrent] started
```
|
1.0
|
IPv8/Dispersy dual stack peer explosion - Tribler takes a whole core for nearly 100% after running for days (167h of CPU of htop equal 75% avg. load):


The swp is getting full, maxed out box, but that does not boost user-space CPU usage. Running v7.1.5 mostly idle for 9 days continuously on an Ubuntu 18.04.1 LTS box. Never done a single keyword search, two Fedora seeding (0MB uploaded), but donating encrypted relay bandwidth:

Gathering solid credits:


Reasonable amount of memory, but not growing out of control:

Some amount of errors in the logs:
```
INFO 1548520097.02 community:83 (Request) Timeout for ping to Peer<94.174.45.58:7759, pnK7Ahoxt5xGR/w29hfI5EATiZI=>
INFO 1548520097.70 handler:152 (TftpHandler) TFTP[23593 C 12.124.121.166:7759][8b6cd3e1e5dc97efc3cb469631d4980f6dfe13d3.torrent] timed out
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
WARNING 1548520098.09 endpoint:56 (UDPEndpoint) Dropping packet due to socket error: [Errno 11] Resource temporarily unavailable
INFO 1548520098.94 handler:137 (TftpHandler) TFTP[34101 C 23.252.56.68:7759][8b6cd3e1e5dc97efc3cb469631d4980f6dfe13d3.torrent] started
INFO 1548520101.04 torrent_checker:151 (TorrentChecker) Selected 2 new torrents to check on tracker: udp://tracker.xfsub.com:6868
INFO 1548520101.07 handler:152 (TftpHandler) TFTP[34101 C 23.252.56.68:7759][8b6cd3e1e5dc97efc3cb469631d4980f6dfe13d3.torrent] timed out
INFO 1548520101.17 session:419 (UdpTrackerSession) Error when querying UDP tracker: [Failure instance: Traceback (failure with no frames): <class 'twisted.internet.error.DNSLookupError'>: DNS lookup failed: tracker.xfsub.com.
] udp://tracker.xfsub.com:6868
WARNING 1548520101.17 torrent_checker:262 (TorrentChecker) Got session error for URL udp://tracker.xfsub.com:6868: [Failure instance: Traceback (failure with no frames): <type 'exceptions.ValueError'>: UDP tracker failed for url udp://tracker.xfsub.com:6868 (error: DNS lookup failed: tracker.xfsub.com.)
]
INFO 1548520102.49 handler:137 (TftpHandler) TFTP[1609 C 74.66.207.23:9683][8b6cd3e1e5dc97efc3cb469631d4980f6dfe13d3.torrent] started
WARNING 1548520103.63 handler:240 (TftpHandler) got non-existing session from 2.24.234.96:7759, id = 28608
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<86.126.173.209:19811, F8TPp56do0IvYd8QJ1KyoHijOMg=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<193.165.189.6:26423, rfJDpWM47gT55U9Izwe9R8tGROA=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<5.79.71.187:35035, hiNKfy4uEPKMH9IJ177OKc3zh3c=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<88.21.114.167:7759, s8Ro19O1qVHHvAnAjvAv1IAvbGw=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<70.121.56.5:7759, sLxOpSmBdM0YfABKCIHyZVuAxhk=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<92.20.137.112:55502, r8nBAHqUi9ezNc9TZZHMvTRcrVM=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<81.171.8.17:35200, 7s7HWCaTpTTAi6LSG09/EfZVxRo=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<81.171.8.19:35200, qIqJyrGyUl6VPJIBPmx2FBIZ3Ao=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<81.171.8.18:35160, XO8ZrfspX3GBzL89CMTRQ0YstB8=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<81.171.8.19:35075, PAOwFnIPqursajEC4nwLZgH2Tzw=>
INFO 1548520103.96 community:83 (Request) Timeout for ping to Peer<88.89.71.78:7758, o4/QqaweAuzAW85zt/HUQBfQ024=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<185.5.165.113:47892, rWrfbl3Y2xZaRXwwuZk9ZFYjd0I=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<81.171.8.17:35060, vAvMRdf0hPb5xeJ/PTviWHiQHKw=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<81.171.8.17:35010, KLP/0BIzlDf7HiYL2QpV+7oCWHk=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<81.171.27.193:35040, RAVrUhLCsiUUKx+ipKxmX2uV2lQ=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<81.171.27.193:35005, kcfBnK+3zeEGwy8J0LcWurCoZlg=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<37.48.77.235:35010, t07Ao4YBTvmvw6UqH9O7yLx5Mg0=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<37.48.77.237:35035, N+LrqteDPqm2fSi77BOudycsRwI=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<37.48.77.238:35020, E73U8Eh5g1vFIXbMmOq+NqplIRQ=>
INFO 1548520103.97 community:83 (Request) Timeout for ping to Peer<95.96.66.166:7759, uNOlxm1LhnBFxlRhwHLzrgG1zQM=>
INFO 1548520103.97 community:83 (Request) Timeout for find to Peer<86.126.173.209:19811, F8TPp56do0IvYd8QJ1KyoHijOMg=>
INFO 1548520104.89 handler:152 (TftpHandler) TFTP[1609 C 74.66.207.23:9683][8b6cd3e1e5dc97efc3cb469631d4980f6dfe13d3.torrent] timed out
INFO 1548520105.00 handler:137 (TftpHandler) TFTP[39475 C 75.169.51.124:10054][8b6cd3e1e5dc97efc3cb469631d4980f6dfe13d3.torrent] started
```
|
non_process
|
dispersy dual stack peer explosion tribler takes a whole core for nearly after running for days of cpu of htop equal avg load the swp is getting full maxed out box but that does not boost user space cpu usage running mostly idle for days continuously on an ubuntu lts box never done a single keyword search two fedora seeding uploaded but donating encrypted relay bandwidth gathering solid credits reasonable amount of memory but not growing out of control some amount of errors in the logs info community request timeout for ping to peer info handler tftphandler tftp timed out warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable warning endpoint udpendpoint dropping packet due to socket error resource temporarily unavailable info handler tftphandler tftp started info torrent checker torrentchecker selected new torrents to check on tracker udp tracker xfsub com info handler tftphandler tftp timed out info session udptrackersession error when querying udp tracker failure instance traceback failure with no frames dns lookup failed tracker xfsub com udp tracker xfsub com warning torrent checker torrentchecker got session error for url udp tracker xfsub com failure instance traceback failure with no frames udp tracker failed for url udp tracker xfsub com error dns lookup failed tracker xfsub com info handler tftphandler tftp started warning handler tftphandler got non existing session from id info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for ping to peer info community request timeout for find to peer info handler tftphandler tftp timed out info handler tftphandler tftp started
| 0
|
241,990
| 26,257,016,500
|
IssuesEvent
|
2023-01-06 02:16:19
|
ams0/openhack-containers
|
https://api.github.com/repos/ams0/openhack-containers
|
closed
|
CVE-2020-9283 (High) detected in github.com/golang/crypto-v0.0.0-20190308221718-c2843e01d9a2 - autoclosed
|
security vulnerability
|
## CVE-2020-9283 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>github.com/golang/crypto-v0.0.0-20190308221718-c2843e01d9a2</b></p></summary>
<p>[mirror] Go supplementary cryptography libraries</p>
<p>Library home page: <a href="https://proxy.golang.org/github.com/golang/crypto/@v/v0.0.0-20190308221718-c2843e01d9a2.zip">https://proxy.golang.org/github.com/golang/crypto/@v/v0.0.0-20190308221718-c2843e01d9a2.zip</a></p>
<p>
Dependency Hierarchy:
- github.com/denisenkom/go-mssqldb-0.9.0 (Root Library)
- :x: **github.com/golang/crypto-v0.0.0-20190308221718-c2843e01d9a2** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ams0/openhack-containers/commit/fbc3a9665f7473faa96484a3fa9b058ad82d7e60">fbc3a9665f7473faa96484a3fa9b058ad82d7e60</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
golang.org/x/crypto before v0.0.0-20200220183623-bac4c82f6975 for Go allows a panic during signature verification in the golang.org/x/crypto/ssh package. A client can attack an SSH server that accepts public keys. Also, a server can attack any SSH client.
<p>Publish Date: 2020-02-20
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-9283>CVE-2020-9283</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9283">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9283</a></p>
<p>Release Date: 2020-02-20</p>
<p>Fix Resolution: github.com/golang/crypto - bac4c82f69751a6dd76e702d54b3ceb88adab236</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-9283 (High) detected in github.com/golang/crypto-v0.0.0-20190308221718-c2843e01d9a2 - autoclosed - ## CVE-2020-9283 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>github.com/golang/crypto-v0.0.0-20190308221718-c2843e01d9a2</b></p></summary>
<p>[mirror] Go supplementary cryptography libraries</p>
<p>Library home page: <a href="https://proxy.golang.org/github.com/golang/crypto/@v/v0.0.0-20190308221718-c2843e01d9a2.zip">https://proxy.golang.org/github.com/golang/crypto/@v/v0.0.0-20190308221718-c2843e01d9a2.zip</a></p>
<p>
Dependency Hierarchy:
- github.com/denisenkom/go-mssqldb-0.9.0 (Root Library)
- :x: **github.com/golang/crypto-v0.0.0-20190308221718-c2843e01d9a2** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ams0/openhack-containers/commit/fbc3a9665f7473faa96484a3fa9b058ad82d7e60">fbc3a9665f7473faa96484a3fa9b058ad82d7e60</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
golang.org/x/crypto before v0.0.0-20200220183623-bac4c82f6975 for Go allows a panic during signature verification in the golang.org/x/crypto/ssh package. A client can attack an SSH server that accepts public keys. Also, a server can attack any SSH client.
<p>Publish Date: 2020-02-20
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-9283>CVE-2020-9283</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9283">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9283</a></p>
<p>Release Date: 2020-02-20</p>
<p>Fix Resolution: github.com/golang/crypto - bac4c82f69751a6dd76e702d54b3ceb88adab236</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in github com golang crypto autoclosed cve high severity vulnerability vulnerable library github com golang crypto go supplementary cryptography libraries library home page a href dependency hierarchy github com denisenkom go mssqldb root library x github com golang crypto vulnerable library found in head commit a href found in base branch master vulnerability details golang org x crypto before for go allows a panic during signature verification in the golang org x crypto ssh package a client can attack an ssh server that accepts public keys also a server can attack any ssh client publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution github com golang crypto step up your open source security game with mend
| 0
|
4,409
| 7,299,072,051
|
IssuesEvent
|
2018-02-26 18:57:00
|
ncbo/bioportal-project
|
https://api.github.com/repos/ncbo/bioportal-project
|
reopened
|
GO ontology is not getting new submissions.
|
ontology processing problem
|
Last submission of GO ontology is dated back to 01/22/2016; however, this ontology should be updating more frequently than that.
GO ontology also missing description.
|
1.0
|
GO ontology is not getting new submissions. - Last submission of GO ontology is dated back to 01/22/2016; however, this ontology should be updating more frequently than that.
GO ontology also missing description.
|
process
|
go ontology is not getting new submissions last submission of go ontology is dated back to however this ontology should be updating more frequently than that go ontology also missing description
| 1
|
6,789
| 9,921,858,639
|
IssuesEvent
|
2019-06-30 22:03:59
|
GroceriStar/fetch-constants
|
https://api.github.com/repos/GroceriStar/fetch-constants
|
closed
|
#### [Groceristar][Favorites][methods]
|
in-process
|
By using names on methods from this [page](https://groceristar.github.io/documentation/docs/groceristar-website-methods-list/favorite-router/favorite-router.html)
In order to make it better, we'll create a set of constants, each for a different method.
Example:
*Get All ingredients with status favorite*
will became `export const GET_ALL_FAVORITE_INGREDIENTS = "GET_ALL_FAVORITE_INGREDIENTS ";`
|
1.0
|
#### [Groceristar][Favorites][methods] -
By using names on methods from this [page](https://groceristar.github.io/documentation/docs/groceristar-website-methods-list/favorite-router/favorite-router.html)
In order to make it better, we'll create a set of constants, each for a different method.
Example:
*Get All ingredients with status favorite*
will became `export const GET_ALL_FAVORITE_INGREDIENTS = "GET_ALL_FAVORITE_INGREDIENTS ";`
|
process
|
by using names on methods from this in order to make it better we ll create a set of constants each for a different method example get all ingredients with status favorite will became export const get all favorite ingredients get all favorite ingredients
| 1
|
133,908
| 29,669,414,853
|
IssuesEvent
|
2023-06-11 08:04:42
|
dtcxzyw/llvm-ci
|
https://api.github.com/repos/dtcxzyw/llvm-ci
|
closed
|
Regressions Report [rv64gc-O3-thinlto] May 3rd 2023, 10:18:25 am
|
regression codegen reasonable
|
## Metadata
+ Workflow URL: https://github.com/dtcxzyw/llvm-ci/actions/runs/4870636533
## Change Logs
from 583d492c630655dc0cd57ad167dec03e6c5d211c to c68e92d941723702810093161be4834f3ca68372
[c68e92d941723702810093161be4834f3ca68372](https://github.com/llvm/llvm-project/commit/c68e92d941723702810093161be4834f3ca68372) Fix MSVC "not all control paths return a value" warning. NFC.
[4e2b4f97a09500fb6ceb4f077c492fac056a6a0a](https://github.com/llvm/llvm-project/commit/4e2b4f97a09500fb6ceb4f077c492fac056a6a0a) [ShrinkWrap] Use underlying object to rule out stack access.
## Regressions (Time)
|Name|Baseline MD5|Current MD5|Baseline Time|Current Time|Ratio|
|:--|:--:|:--:|--:|--:|--:|
|MultiSource/Benchmarks/VersaBench/dbms/dbms|9c4cac2dce3d536d5f5a396c6bd608aa|432ba6a0eff42efa26922eea9178435f|3.727168956|3.727182616|1.000|
## Differences (Size)
|Name|Baseline MD5|Current MD5|Baseline Size|Current Size|Ratio|
|:--|:--:|:--:|--:|--:|--:|
|MicroBenchmarks/Builtins/Int128/Builtins|cf0a8405b8b39420328058659870ca97|1c3be8c5c52460a17c345ab4293bcbd2|196290|196290|1.000|
|MicroBenchmarks/ImageProcessing/AnisotropicDiffusion/AnisotropicDiffusion|c1b882079ae077ccd00a42749a18c93b|ec33c226b569ea65ff9920bc102a7d76|193802|193802|1.000|
|MicroBenchmarks/ImageProcessing/BilateralFiltering/BilateralFilter|3d24018f45d5c5f91a05d480488b4552|ae6113e61886e1f9314301f6c5110a9c|193858|193858|1.000|
|MicroBenchmarks/ImageProcessing/Blur/blur|563ce652fae5db268bae41de87295fdc|8b228a358633f5dc6421f089f8c4fd76|198230|198230|1.000|
|MicroBenchmarks/ImageProcessing/Dilate/Dilate|57548ec16a368c5bec903f0346a4ac77|bb841418b0f4900cc1ba093074ab296b|198458|198458|1.000|
|MicroBenchmarks/ImageProcessing/Dither/Dither|98b20e6f275233f2229802b6dbeb501a|92dd97adcb7b5fa96ae24a91e5049c70|203218|203218|1.000|
|MicroBenchmarks/ImageProcessing/Interpolation/Interpolation|2ea4b8c6eb4176d1f4a8228243ec4be5|8e62f411b4d6e4d155368525a803ff5a|195294|195294|1.000|
|MicroBenchmarks/LCALS/SubsetALambdaLoops/lcalsALambda|d3462a330ebaa22b702ccf292d31680e|03e67c2b636661828f0fc3b459cd5f74|401438|401438|1.000|
|MicroBenchmarks/LCALS/SubsetARawLoops/lcalsARaw|c3a92772e377e9f7480ae65391d72ffe|39ad63cc7444c88887beafec77adc358|401354|401354|1.000|
|MicroBenchmarks/LCALS/SubsetBLambdaLoops/lcalsBLambda|53f8489337cf88d5824bda3b91b9ffcc|df16f9bf438e9497e2dc558119251915|394314|394314|1.000|
|MicroBenchmarks/LCALS/SubsetBRawLoops/lcalsBRaw|9bec9444eccfb4ae66688e6c755b55e8|32aee1f3cf6fbb68058e84564beb0108|393874|393874|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|420046|420046|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|419778|419778|1.000|
|MicroBenchmarks/LoopInterchange/LoopInterchange|9efd5c1e37c5c2b2666b84e406333fc1|fb20c7dad4e293fb0100f53cf1e2e87f|189554|189554|1.000|
|MicroBenchmarks/LoopVectorization/LoopVectorizationBenchmarks|1793ebc5873f575a37a7b358ee772124|2817e69592ee03513ee99667e2ad1a08|313826|313826|1.000|
|MicroBenchmarks/MemFunctions/MemFunctions|450ffb001dd12beaf0b15146f7256660|3375dd034545b26027d6bb47f58de212|357874|357874|1.000|
|MicroBenchmarks/RNG/RandomGeneratorBenchmarks|44b95b006fe54d3105166530d667a07e|e5965c1c65c01e7bbaa622afe90d6f0b|203682|203682|1.000|
|MicroBenchmarks/SLPVectorization/SLPVectorizationBenchmarks|e0d425c01477257338641403edf771fd|8078210a0b9c14b9f9c847ccb2a87070|199394|199394|1.000|
|MicroBenchmarks/harris/harris|e89fb4dd08927ad36165226ac3010444|a1f584703aa2c3f131589c9831217e26|199902|199902|1.000|
|MultiSource/Applications/ALAC/decode/alacconvert-decode|0f98fb576650fbef872eca08b2039043|18e31933231582f25677fdb7aeda3487|77036|77036|1.000|
|MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg|12bdfec963862179b907b4c6cc7f8476|a31ad3b4f5f8aefafab7e71d65f74bf7|88874|88874|1.000|
|MultiSource/Benchmarks/lzbench/lzbench|b1d3032f19bf1f08a500367659dea32e|d4f092aa367ee3e96f7368f8966a8fb6|3464692|3464680|1.000|
|MultiSource/Applications/JM/lencod/lencod|722f6685ffc4667fe1d1b768dab36e11|c259386bb7daf3f567669454b75a6172|830490|830486|1.000|
|MultiSource/Applications/sqlite3/sqlite3|1be77b90041d22b9d05eb6d884ece14c|913d5e690e98d9b628869b37c9529fbc|463754|463750|1.000|
|MultiSource/Applications/SPASS/SPASS|37900209513d78989f935d3fb42f73a6|085a864327b1ddfe32638ff17f872637|543462|543438|1.000|
|MultiSource/Benchmarks/DOE-ProxyApps-C++/CLAMR/CLAMR|74646ef9bb16013ab9be947613d7dbf4|3dd6ec13d04c6107c529a5aecd3fe889|184814|184802|1.000|
|MultiSource/Benchmarks/MiBench/consumer-typeset/consumer-typeset|8ce856bdce658b9b709056b1ac7bc710|121298f3f587eb6c5622f6334374a497|489576|489544|1.000|
|MultiSource/Applications/ClamAV/clamscan|7129cf0cd3bfff994920811fae7f778c|aff6e03df351f2e71de4673e3aecd12a|569138|569098|1.000|
|MultiSource/Benchmarks/Ptrdist/yacr2/yacr2|d8e9e761b05fd5410245fbd289e9f203|a3d04fc3fe955c6ca10ffd111d9bf0fa|49740|49736|1.000|
|MultiSource/Benchmarks/Bullet/bullet|a249850119a3247efc4326adc771eae9|332769a95ad26f2edc9bf2eab9d29874|290544|290520|1.000|
|MultiSource/Benchmarks/MallocBench/gs/gs|bb427abd4b22d8a863d23bd5f2190c82|b013d7db9c61896616f6bb105b0dc9a1|173240|173220|1.000|
|MultiSource/Applications/siod/siod|aeb7dc35c1df72bd439f5fe9a856b9ea|bab7f448df963f0e2e17110bac3d166a|206142|206118|1.000|
|MultiSource/Benchmarks/Prolangs-C/bison/mybison|201cec183ca2fc69e7204017e2958f14|227d84cdcf01b152bef08f3a7dad5a43|68690|68682|1.000|
|MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame|217ebca647d86b87727ba892cbcfb396|a8451a6662101ab5c9f478992edf181f|177958|177930|1.000|
|MultiSource/Benchmarks/7zip/7zip-benchmark|f22748bf49681a431e9b0bd0b1ea0104|ced1f6e564880697c875dada7938c613|1078278|1078078|1.000|
|MultiSource/Applications/spiff/spiff|1d659cc703d9728d77a0f456547a3d3b|a4ea887b9c4f903890355c36f8b521e0|40930|40922|1.000|
|MultiSource/Benchmarks/MiBench/security-rijndael/security-rijndael|5ee150f13516a57daae4ab4a8fea47b4|efd5ecb2fb8f973a38d30cf8614eec79|12136|12132|1.000|
|MultiSource/Benchmarks/VersaBench/dbms/dbms|9c4cac2dce3d536d5f5a396c6bd608aa|432ba6a0eff42efa26922eea9178435f|20422|20414|1.000|
|MultiSource/Benchmarks/Ptrdist/bc/bc|446607d6a85c27a1d7ee1d5d2c1cb840|160056944080a938fd056425941ad9ee|63294|63222|0.999|
|MultiSource/Applications/aha/aha|939e70178a46a8d1676e9cf1bb5ae93e|06d3ad60c9b0d3179852e94c18ea46ec|3280|3256|0.993|
|GeoMeans|N/A|N/A|157119.174|157032.245|0.999|
## Differences (Time)
|Name|Baseline MD5|Current MD5|Baseline Time|Current Time|Ratio|
|:--|:--:|:--:|--:|--:|--:|
|MultiSource/Benchmarks/VersaBench/dbms/dbms|9c4cac2dce3d536d5f5a396c6bd608aa|432ba6a0eff42efa26922eea9178435f|3.727168956|3.727182616|1.000|
|MultiSource/Benchmarks/Prolangs-C/bison/mybison|201cec183ca2fc69e7204017e2958f14|227d84cdcf01b152bef08f3a7dad5a43|0.00779144|0.007791446|1.000|
|MultiSource/Applications/ClamAV/clamscan|7129cf0cd3bfff994920811fae7f778c|aff6e03df351f2e71de4673e3aecd12a|0.451311956|0.451312301|1.000|
|MultiSource/Benchmarks/MiBench/consumer-typeset/consumer-typeset|8ce856bdce658b9b709056b1ac7bc710|121298f3f587eb6c5622f6334374a497|0.646433935|0.646434219|1.000|
|MultiSource/Benchmarks/DOE-ProxyApps-C++/CLAMR/CLAMR|74646ef9bb16013ab9be947613d7dbf4|3dd6ec13d04c6107c529a5aecd3fe889|7.189383391|7.189383409|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_FIRST_SUM_LAMBDA/5001|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|18.13500417109218|18.13500417109292|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_HYDRO_2D_LAMBDA/171|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|84.31603673813255|84.31603673813598|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_DISC_ORD_LAMBDA/5001|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|170.054037900871|170.05403790087792|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_MAT_X_MAT_LAMBDA/171|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|1028.3062393538833|1028.306239353925|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_PIC_2D_RAW/171|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|10.353002248088202|10.353002248088623|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_PIC_2D_RAW/5001|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|302.56806568711517|302.56806568712744|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_GEN_LIN_RECUR_RAW/44217|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|718.5451735112686|718.5451735112978|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_DISC_ORD_RAW/171|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|5.8340012918172715|5.834001291817509|1.000|
|MicroBenchmarks/LoopVectorization/LoopVectorizationBenchmarks.test:benchForTruncOrZextVecInLoopFrom_uint8_t_To_uint64_t_|1793ebc5873f575a37a7b358ee772124|2817e69592ee03513ee99667e2ad1a08|21606.004444718692|21606.00444471957|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_PIC_1D_RAW/5001|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|280.74407220215863|280.74407220217|1.000|
|MicroBenchmarks/MemFunctions/MemFunctions.test:BM_MemCmp<32, LessThanZero, Mid>|450ffb001dd12beaf0b15146f7256660|3375dd034545b26027d6bb47f58de212|8964.001805608741|8964.001805609103|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_ADI_LAMBDA/5001|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|695.0482005958191|695.0482005958334|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_INNER_PROD_RAW/171|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|1.8890003885947337|1.8890003885947722|1.000|
|MicroBenchmarks/LoopVectorization/LoopVectorizationBenchmarks.test:benchVecWithRuntimeChecks4PointersDEqualsA/32|1793ebc5873f575a37a7b358ee772124|2817e69592ee03513ee99667e2ad1a08|395059.08013543935|395059.0801354474|1.000|
|MicroBenchmarks/MemFunctions/MemFunctions.test:BM_MemCmp<7, LessThanZero, Last>|450ffb001dd12beaf0b15146f7256660|3375dd034545b26027d6bb47f58de212|49144.00989890462|49144.00989890562|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_PLANCKIAN_LAMBDA/5001|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|390.09108082498574|390.09108082496994|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_DISC_ORD_LAMBDA/171|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|5.834001300151726|5.834001300151489|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_DISC_ORD_LAMBDA/44217|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|1503.398334763987|1503.3983347639257|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_FIRST_DIFF_RAW/171|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|0.7930001778589473|0.7930001778589151|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_DISC_ORD_RAW/5001|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|170.05403765792616|170.05403765791925|1.000|
|MicroBenchmarks/LoopVectorization/LoopVectorizationBenchmarks.test:benchForTruncOrZextVecInLoopFrom_uint16_t_To_uint32_t_|1793ebc5873f575a37a7b358ee772124|2817e69592ee03513ee99667e2ad1a08|21606.00444471957|21606.004444718692|1.000|
|MicroBenchmarks/LoopVectorization/LoopVectorizationBenchmarks.test:benchForTruncOrZextVecInLoopFrom_uint16_t_To_uint64_t_|1793ebc5873f575a37a7b358ee772124|2817e69592ee03513ee99667e2ad1a08|21606.00444471957|21606.004444718692|1.000|
|MultiSource/Applications/siod/siod|aeb7dc35c1df72bd439f5fe9a856b9ea|bab7f448df963f0e2e17110bac3d166a|13.917935575|13.917935511|1.000|
|MultiSource/Benchmarks/DOE-ProxyApps-C++/PENNANT/PENNANT|5c4e3bac1bae1bac868c67334d2298c6|196a25a63f9beb3f71451a38c2f46ffd|3.348809302|3.348809278|1.000|
|MultiSource/Applications/oggenc/oggenc|3d57052f6d46f17f4efe0cf1507a750d|89da5990044d755a1e3223c635662349|0.479870194|0.479870179|1.000|
|MultiSource/Applications/aha/aha|939e70178a46a8d1676e9cf1bb5ae93e|06d3ad60c9b0d3179852e94c18ea46ec|11.423399295|11.42339888|1.000|
|MultiSource/Applications/SPASS/SPASS|37900209513d78989f935d3fb42f73a6|085a864327b1ddfe32638ff17f872637|13.363899276|13.3638983|1.000|
|MultiSource/Benchmarks/7zip/7zip-benchmark|f22748bf49681a431e9b0bd0b1ea0104|ced1f6e564880697c875dada7938c613|29.624594039|29.624589896|1.000|
|MultiSource/Benchmarks/MallocBench/gs/gs|bb427abd4b22d8a863d23bd5f2190c82|b013d7db9c61896616f6bb105b0dc9a1|0.190009062|0.190009028|1.000|
|MultiSource/Applications/spiff/spiff|1d659cc703d9728d77a0f456547a3d3b|a4ea887b9c4f903890355c36f8b521e0|11.849167827|11.849164704|1.000|
|MultiSource/Benchmarks/Ptrdist/bc/bc|446607d6a85c27a1d7ee1d5d2c1cb840|160056944080a938fd056425941ad9ee|2.564161258|2.564160146|1.000|
|MultiSource/Benchmarks/lzbench/lzbench|b1d3032f19bf1f08a500367659dea32e|d4f092aa367ee3e96f7368f8966a8fb6|483.598336052|483.598068892|1.000|
|MultiSource/Applications/sqlite3/sqlite3|1be77b90041d22b9d05eb6d884ece14c|913d5e690e98d9b628869b37c9529fbc|18.423499385|18.423488985|1.000|
|MultiSource/Benchmarks/Bullet/bullet|a249850119a3247efc4326adc771eae9|332769a95ad26f2edc9bf2eab9d29874|20.793433843|20.714967499|0.996|
|MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg|12bdfec963862179b907b4c6cc7f8476|a31ad3b4f5f8aefafab7e71d65f74bf7|0.014668678|0.014602528|0.995|
|GeoMeans|N/A|N/A|1598.282|1598.198|1.000|
|
1.0
|
Regressions Report [rv64gc-O3-thinlto] May 3rd 2023, 10:18:25 am - ## Metadata
+ Workflow URL: https://github.com/dtcxzyw/llvm-ci/actions/runs/4870636533
## Change Logs
from 583d492c630655dc0cd57ad167dec03e6c5d211c to c68e92d941723702810093161be4834f3ca68372
[c68e92d941723702810093161be4834f3ca68372](https://github.com/llvm/llvm-project/commit/c68e92d941723702810093161be4834f3ca68372) Fix MSVC "not all control paths return a value" warning. NFC.
[4e2b4f97a09500fb6ceb4f077c492fac056a6a0a](https://github.com/llvm/llvm-project/commit/4e2b4f97a09500fb6ceb4f077c492fac056a6a0a) [ShrinkWrap] Use underlying object to rule out stack access.
## Regressions (Time)
|Name|Baseline MD5|Current MD5|Baseline Time|Current Time|Ratio|
|:--|:--:|:--:|--:|--:|--:|
|MultiSource/Benchmarks/VersaBench/dbms/dbms|9c4cac2dce3d536d5f5a396c6bd608aa|432ba6a0eff42efa26922eea9178435f|3.727168956|3.727182616|1.000|
## Differences (Size)
|Name|Baseline MD5|Current MD5|Baseline Size|Current Size|Ratio|
|:--|:--:|:--:|--:|--:|--:|
|MicroBenchmarks/Builtins/Int128/Builtins|cf0a8405b8b39420328058659870ca97|1c3be8c5c52460a17c345ab4293bcbd2|196290|196290|1.000|
|MicroBenchmarks/ImageProcessing/AnisotropicDiffusion/AnisotropicDiffusion|c1b882079ae077ccd00a42749a18c93b|ec33c226b569ea65ff9920bc102a7d76|193802|193802|1.000|
|MicroBenchmarks/ImageProcessing/BilateralFiltering/BilateralFilter|3d24018f45d5c5f91a05d480488b4552|ae6113e61886e1f9314301f6c5110a9c|193858|193858|1.000|
|MicroBenchmarks/ImageProcessing/Blur/blur|563ce652fae5db268bae41de87295fdc|8b228a358633f5dc6421f089f8c4fd76|198230|198230|1.000|
|MicroBenchmarks/ImageProcessing/Dilate/Dilate|57548ec16a368c5bec903f0346a4ac77|bb841418b0f4900cc1ba093074ab296b|198458|198458|1.000|
|MicroBenchmarks/ImageProcessing/Dither/Dither|98b20e6f275233f2229802b6dbeb501a|92dd97adcb7b5fa96ae24a91e5049c70|203218|203218|1.000|
|MicroBenchmarks/ImageProcessing/Interpolation/Interpolation|2ea4b8c6eb4176d1f4a8228243ec4be5|8e62f411b4d6e4d155368525a803ff5a|195294|195294|1.000|
|MicroBenchmarks/LCALS/SubsetALambdaLoops/lcalsALambda|d3462a330ebaa22b702ccf292d31680e|03e67c2b636661828f0fc3b459cd5f74|401438|401438|1.000|
|MicroBenchmarks/LCALS/SubsetARawLoops/lcalsARaw|c3a92772e377e9f7480ae65391d72ffe|39ad63cc7444c88887beafec77adc358|401354|401354|1.000|
|MicroBenchmarks/LCALS/SubsetBLambdaLoops/lcalsBLambda|53f8489337cf88d5824bda3b91b9ffcc|df16f9bf438e9497e2dc558119251915|394314|394314|1.000|
|MicroBenchmarks/LCALS/SubsetBRawLoops/lcalsBRaw|9bec9444eccfb4ae66688e6c755b55e8|32aee1f3cf6fbb68058e84564beb0108|393874|393874|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|420046|420046|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|419778|419778|1.000|
|MicroBenchmarks/LoopInterchange/LoopInterchange|9efd5c1e37c5c2b2666b84e406333fc1|fb20c7dad4e293fb0100f53cf1e2e87f|189554|189554|1.000|
|MicroBenchmarks/LoopVectorization/LoopVectorizationBenchmarks|1793ebc5873f575a37a7b358ee772124|2817e69592ee03513ee99667e2ad1a08|313826|313826|1.000|
|MicroBenchmarks/MemFunctions/MemFunctions|450ffb001dd12beaf0b15146f7256660|3375dd034545b26027d6bb47f58de212|357874|357874|1.000|
|MicroBenchmarks/RNG/RandomGeneratorBenchmarks|44b95b006fe54d3105166530d667a07e|e5965c1c65c01e7bbaa622afe90d6f0b|203682|203682|1.000|
|MicroBenchmarks/SLPVectorization/SLPVectorizationBenchmarks|e0d425c01477257338641403edf771fd|8078210a0b9c14b9f9c847ccb2a87070|199394|199394|1.000|
|MicroBenchmarks/harris/harris|e89fb4dd08927ad36165226ac3010444|a1f584703aa2c3f131589c9831217e26|199902|199902|1.000|
|MultiSource/Applications/ALAC/decode/alacconvert-decode|0f98fb576650fbef872eca08b2039043|18e31933231582f25677fdb7aeda3487|77036|77036|1.000|
|MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg|12bdfec963862179b907b4c6cc7f8476|a31ad3b4f5f8aefafab7e71d65f74bf7|88874|88874|1.000|
|MultiSource/Benchmarks/lzbench/lzbench|b1d3032f19bf1f08a500367659dea32e|d4f092aa367ee3e96f7368f8966a8fb6|3464692|3464680|1.000|
|MultiSource/Applications/JM/lencod/lencod|722f6685ffc4667fe1d1b768dab36e11|c259386bb7daf3f567669454b75a6172|830490|830486|1.000|
|MultiSource/Applications/sqlite3/sqlite3|1be77b90041d22b9d05eb6d884ece14c|913d5e690e98d9b628869b37c9529fbc|463754|463750|1.000|
|MultiSource/Applications/SPASS/SPASS|37900209513d78989f935d3fb42f73a6|085a864327b1ddfe32638ff17f872637|543462|543438|1.000|
|MultiSource/Benchmarks/DOE-ProxyApps-C++/CLAMR/CLAMR|74646ef9bb16013ab9be947613d7dbf4|3dd6ec13d04c6107c529a5aecd3fe889|184814|184802|1.000|
|MultiSource/Benchmarks/MiBench/consumer-typeset/consumer-typeset|8ce856bdce658b9b709056b1ac7bc710|121298f3f587eb6c5622f6334374a497|489576|489544|1.000|
|MultiSource/Applications/ClamAV/clamscan|7129cf0cd3bfff994920811fae7f778c|aff6e03df351f2e71de4673e3aecd12a|569138|569098|1.000|
|MultiSource/Benchmarks/Ptrdist/yacr2/yacr2|d8e9e761b05fd5410245fbd289e9f203|a3d04fc3fe955c6ca10ffd111d9bf0fa|49740|49736|1.000|
|MultiSource/Benchmarks/Bullet/bullet|a249850119a3247efc4326adc771eae9|332769a95ad26f2edc9bf2eab9d29874|290544|290520|1.000|
|MultiSource/Benchmarks/MallocBench/gs/gs|bb427abd4b22d8a863d23bd5f2190c82|b013d7db9c61896616f6bb105b0dc9a1|173240|173220|1.000|
|MultiSource/Applications/siod/siod|aeb7dc35c1df72bd439f5fe9a856b9ea|bab7f448df963f0e2e17110bac3d166a|206142|206118|1.000|
|MultiSource/Benchmarks/Prolangs-C/bison/mybison|201cec183ca2fc69e7204017e2958f14|227d84cdcf01b152bef08f3a7dad5a43|68690|68682|1.000|
|MultiSource/Benchmarks/MiBench/consumer-lame/consumer-lame|217ebca647d86b87727ba892cbcfb396|a8451a6662101ab5c9f478992edf181f|177958|177930|1.000|
|MultiSource/Benchmarks/7zip/7zip-benchmark|f22748bf49681a431e9b0bd0b1ea0104|ced1f6e564880697c875dada7938c613|1078278|1078078|1.000|
|MultiSource/Applications/spiff/spiff|1d659cc703d9728d77a0f456547a3d3b|a4ea887b9c4f903890355c36f8b521e0|40930|40922|1.000|
|MultiSource/Benchmarks/MiBench/security-rijndael/security-rijndael|5ee150f13516a57daae4ab4a8fea47b4|efd5ecb2fb8f973a38d30cf8614eec79|12136|12132|1.000|
|MultiSource/Benchmarks/VersaBench/dbms/dbms|9c4cac2dce3d536d5f5a396c6bd608aa|432ba6a0eff42efa26922eea9178435f|20422|20414|1.000|
|MultiSource/Benchmarks/Ptrdist/bc/bc|446607d6a85c27a1d7ee1d5d2c1cb840|160056944080a938fd056425941ad9ee|63294|63222|0.999|
|MultiSource/Applications/aha/aha|939e70178a46a8d1676e9cf1bb5ae93e|06d3ad60c9b0d3179852e94c18ea46ec|3280|3256|0.993|
|GeoMeans|N/A|N/A|157119.174|157032.245|0.999|
## Differences (Time)
|Name|Baseline MD5|Current MD5|Baseline Time|Current Time|Ratio|
|:--|:--:|:--:|--:|--:|--:|
|MultiSource/Benchmarks/VersaBench/dbms/dbms|9c4cac2dce3d536d5f5a396c6bd608aa|432ba6a0eff42efa26922eea9178435f|3.727168956|3.727182616|1.000|
|MultiSource/Benchmarks/Prolangs-C/bison/mybison|201cec183ca2fc69e7204017e2958f14|227d84cdcf01b152bef08f3a7dad5a43|0.00779144|0.007791446|1.000|
|MultiSource/Applications/ClamAV/clamscan|7129cf0cd3bfff994920811fae7f778c|aff6e03df351f2e71de4673e3aecd12a|0.451311956|0.451312301|1.000|
|MultiSource/Benchmarks/MiBench/consumer-typeset/consumer-typeset|8ce856bdce658b9b709056b1ac7bc710|121298f3f587eb6c5622f6334374a497|0.646433935|0.646434219|1.000|
|MultiSource/Benchmarks/DOE-ProxyApps-C++/CLAMR/CLAMR|74646ef9bb16013ab9be947613d7dbf4|3dd6ec13d04c6107c529a5aecd3fe889|7.189383391|7.189383409|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_FIRST_SUM_LAMBDA/5001|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|18.13500417109218|18.13500417109292|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_HYDRO_2D_LAMBDA/171|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|84.31603673813255|84.31603673813598|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_DISC_ORD_LAMBDA/5001|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|170.054037900871|170.05403790087792|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_MAT_X_MAT_LAMBDA/171|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|1028.3062393538833|1028.306239353925|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_PIC_2D_RAW/171|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|10.353002248088202|10.353002248088623|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_PIC_2D_RAW/5001|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|302.56806568711517|302.56806568712744|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_GEN_LIN_RECUR_RAW/44217|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|718.5451735112686|718.5451735112978|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_DISC_ORD_RAW/171|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|5.8340012918172715|5.834001291817509|1.000|
|MicroBenchmarks/LoopVectorization/LoopVectorizationBenchmarks.test:benchForTruncOrZextVecInLoopFrom_uint8_t_To_uint64_t_|1793ebc5873f575a37a7b358ee772124|2817e69592ee03513ee99667e2ad1a08|21606.004444718692|21606.00444471957|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_PIC_1D_RAW/5001|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|280.74407220215863|280.74407220217|1.000|
|MicroBenchmarks/MemFunctions/MemFunctions.test:BM_MemCmp<32, LessThanZero, Mid>|450ffb001dd12beaf0b15146f7256660|3375dd034545b26027d6bb47f58de212|8964.001805608741|8964.001805609103|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_ADI_LAMBDA/5001|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|695.0482005958191|695.0482005958334|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_INNER_PROD_RAW/171|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|1.8890003885947337|1.8890003885947722|1.000|
|MicroBenchmarks/LoopVectorization/LoopVectorizationBenchmarks.test:benchVecWithRuntimeChecks4PointersDEqualsA/32|1793ebc5873f575a37a7b358ee772124|2817e69592ee03513ee99667e2ad1a08|395059.08013543935|395059.0801354474|1.000|
|MicroBenchmarks/MemFunctions/MemFunctions.test:BM_MemCmp<7, LessThanZero, Last>|450ffb001dd12beaf0b15146f7256660|3375dd034545b26027d6bb47f58de212|49144.00989890462|49144.00989890562|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_PLANCKIAN_LAMBDA/5001|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|390.09108082498574|390.09108082496994|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_DISC_ORD_LAMBDA/171|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|5.834001300151726|5.834001300151489|1.000|
|MicroBenchmarks/LCALS/SubsetCLambdaLoops/lcalsCLambda.test:BM_DISC_ORD_LAMBDA/44217|d1aa67b6ed1014201b3829b3536da5e6|e9fca0c038c618d20a83e4da1b0aa113|1503.398334763987|1503.3983347639257|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_FIRST_DIFF_RAW/171|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|0.7930001778589473|0.7930001778589151|1.000|
|MicroBenchmarks/LCALS/SubsetCRawLoops/lcalsCRaw.test:BM_DISC_ORD_RAW/5001|b53cdc6c3956d959f65049e3a1794bcd|485f7ce23c0122882e4bc224ba36db70|170.05403765792616|170.05403765791925|1.000|
|MicroBenchmarks/LoopVectorization/LoopVectorizationBenchmarks.test:benchForTruncOrZextVecInLoopFrom_uint16_t_To_uint32_t_|1793ebc5873f575a37a7b358ee772124|2817e69592ee03513ee99667e2ad1a08|21606.00444471957|21606.004444718692|1.000|
|MicroBenchmarks/LoopVectorization/LoopVectorizationBenchmarks.test:benchForTruncOrZextVecInLoopFrom_uint16_t_To_uint64_t_|1793ebc5873f575a37a7b358ee772124|2817e69592ee03513ee99667e2ad1a08|21606.00444471957|21606.004444718692|1.000|
|MultiSource/Applications/siod/siod|aeb7dc35c1df72bd439f5fe9a856b9ea|bab7f448df963f0e2e17110bac3d166a|13.917935575|13.917935511|1.000|
|MultiSource/Benchmarks/DOE-ProxyApps-C++/PENNANT/PENNANT|5c4e3bac1bae1bac868c67334d2298c6|196a25a63f9beb3f71451a38c2f46ffd|3.348809302|3.348809278|1.000|
|MultiSource/Applications/oggenc/oggenc|3d57052f6d46f17f4efe0cf1507a750d|89da5990044d755a1e3223c635662349|0.479870194|0.479870179|1.000|
|MultiSource/Applications/aha/aha|939e70178a46a8d1676e9cf1bb5ae93e|06d3ad60c9b0d3179852e94c18ea46ec|11.423399295|11.42339888|1.000|
|MultiSource/Applications/SPASS/SPASS|37900209513d78989f935d3fb42f73a6|085a864327b1ddfe32638ff17f872637|13.363899276|13.3638983|1.000|
|MultiSource/Benchmarks/7zip/7zip-benchmark|f22748bf49681a431e9b0bd0b1ea0104|ced1f6e564880697c875dada7938c613|29.624594039|29.624589896|1.000|
|MultiSource/Benchmarks/MallocBench/gs/gs|bb427abd4b22d8a863d23bd5f2190c82|b013d7db9c61896616f6bb105b0dc9a1|0.190009062|0.190009028|1.000|
|MultiSource/Applications/spiff/spiff|1d659cc703d9728d77a0f456547a3d3b|a4ea887b9c4f903890355c36f8b521e0|11.849167827|11.849164704|1.000|
|MultiSource/Benchmarks/Ptrdist/bc/bc|446607d6a85c27a1d7ee1d5d2c1cb840|160056944080a938fd056425941ad9ee|2.564161258|2.564160146|1.000|
|MultiSource/Benchmarks/lzbench/lzbench|b1d3032f19bf1f08a500367659dea32e|d4f092aa367ee3e96f7368f8966a8fb6|483.598336052|483.598068892|1.000|
|MultiSource/Applications/sqlite3/sqlite3|1be77b90041d22b9d05eb6d884ece14c|913d5e690e98d9b628869b37c9529fbc|18.423499385|18.423488985|1.000|
|MultiSource/Benchmarks/Bullet/bullet|a249850119a3247efc4326adc771eae9|332769a95ad26f2edc9bf2eab9d29874|20.793433843|20.714967499|0.996|
|MultiSource/Benchmarks/mediabench/jpeg/jpeg-6a/cjpeg|12bdfec963862179b907b4c6cc7f8476|a31ad3b4f5f8aefafab7e71d65f74bf7|0.014668678|0.014602528|0.995|
|GeoMeans|N/A|N/A|1598.282|1598.198|1.000|
|
non_process
|
regressions report may am metadata workflow url change logs from to fix msvc quot not all control paths return a value quot warning nfc use underlying object to rule out stack access regressions time name baseline current baseline time current time ratio multisource benchmarks versabench dbms dbms differences size name baseline current baseline size current size ratio microbenchmarks builtins builtins microbenchmarks imageprocessing anisotropicdiffusion anisotropicdiffusion microbenchmarks imageprocessing bilateralfiltering bilateralfilter microbenchmarks imageprocessing blur blur microbenchmarks imageprocessing dilate dilate microbenchmarks imageprocessing dither dither microbenchmarks imageprocessing interpolation interpolation microbenchmarks lcals subsetalambdaloops lcalsalambda microbenchmarks lcals subsetarawloops lcalsaraw microbenchmarks lcals subsetblambdaloops lcalsblambda microbenchmarks lcals subsetbrawloops lcalsbraw microbenchmarks lcals subsetclambdaloops lcalsclambda microbenchmarks lcals subsetcrawloops lcalscraw microbenchmarks loopinterchange loopinterchange microbenchmarks loopvectorization loopvectorizationbenchmarks microbenchmarks memfunctions memfunctions microbenchmarks rng randomgeneratorbenchmarks microbenchmarks slpvectorization slpvectorizationbenchmarks microbenchmarks harris harris multisource applications alac decode alacconvert decode multisource benchmarks mediabench jpeg jpeg cjpeg multisource benchmarks lzbench lzbench multisource applications jm lencod lencod multisource applications multisource applications spass spass multisource benchmarks doe proxyapps c clamr clamr multisource benchmarks mibench consumer typeset consumer typeset multisource applications clamav clamscan multisource benchmarks ptrdist multisource benchmarks bullet bullet multisource benchmarks mallocbench gs gs multisource applications siod siod multisource benchmarks prolangs c bison mybison multisource benchmarks mibench consumer lame consumer lame 
multisource benchmarks benchmark multisource applications spiff spiff multisource benchmarks mibench security rijndael security rijndael multisource benchmarks versabench dbms dbms multisource benchmarks ptrdist bc bc multisource applications aha aha geomeans n a n a differences time name baseline current baseline time current time ratio multisource benchmarks versabench dbms dbms multisource benchmarks prolangs c bison mybison multisource applications clamav clamscan multisource benchmarks mibench consumer typeset consumer typeset multisource benchmarks doe proxyapps c clamr clamr microbenchmarks lcals subsetclambdaloops lcalsclambda test bm first sum lambda microbenchmarks lcals subsetclambdaloops lcalsclambda test bm hydro lambda microbenchmarks lcals subsetclambdaloops lcalsclambda test bm disc ord lambda microbenchmarks lcals subsetclambdaloops lcalsclambda test bm mat x mat lambda microbenchmarks lcals subsetcrawloops lcalscraw test bm pic raw microbenchmarks lcals subsetcrawloops lcalscraw test bm pic raw microbenchmarks lcals subsetcrawloops lcalscraw test bm gen lin recur raw microbenchmarks lcals subsetcrawloops lcalscraw test bm disc ord raw microbenchmarks loopvectorization loopvectorizationbenchmarks test benchfortruncorzextvecinloopfrom t to t microbenchmarks lcals subsetcrawloops lcalscraw test bm pic raw microbenchmarks memfunctions memfunctions test bm memcmp microbenchmarks lcals subsetclambdaloops lcalsclambda test bm adi lambda microbenchmarks lcals subsetcrawloops lcalscraw test bm inner prod raw microbenchmarks loopvectorization loopvectorizationbenchmarks test microbenchmarks memfunctions memfunctions test bm memcmp microbenchmarks lcals subsetclambdaloops lcalsclambda test bm planckian lambda microbenchmarks lcals subsetclambdaloops lcalsclambda test bm disc ord lambda microbenchmarks lcals subsetclambdaloops lcalsclambda test bm disc ord lambda microbenchmarks lcals subsetcrawloops lcalscraw test bm first diff raw microbenchmarks lcals 
subsetcrawloops lcalscraw test bm disc ord raw microbenchmarks loopvectorization loopvectorizationbenchmarks test benchfortruncorzextvecinloopfrom t to t microbenchmarks loopvectorization loopvectorizationbenchmarks test benchfortruncorzextvecinloopfrom t to t multisource applications siod siod multisource benchmarks doe proxyapps c pennant pennant multisource applications oggenc oggenc multisource applications aha aha multisource applications spass spass multisource benchmarks benchmark multisource benchmarks mallocbench gs gs multisource applications spiff spiff multisource benchmarks ptrdist bc bc multisource benchmarks lzbench lzbench multisource applications multisource benchmarks bullet bullet multisource benchmarks mediabench jpeg jpeg cjpeg geomeans n a n a
| 0
|
20,729
| 27,429,259,641
|
IssuesEvent
|
2023-03-01 23:12:07
|
wandb/wandb
|
https://api.github.com/repos/wandb/wandb
|
closed
|
Wandb only terminates one process when using DDP.
|
bug multiprocessing
|
**Describe the bug**
Each time I run a sweep (one or multiple runs per agent), some processes are left running (these hog 100% of the processor they're assigned to and eat up RAM). The data from GPUs is not cleared after finishing a sweep, leading to `CUDA out of memory` error. Afterwards I need to kill all processes individually. After stopping the sweep I get this error.
`/home/jpohjone/miniconda3/envs/models/lib/python3.8/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 6 leaked semaphore objects to clean up at shutdown`
There is no problem when running a normal run (ie. copying and running the same command that the sweep uses).
**To Reproduce**
I can create a small example script later if it seems the bug can't be solved otherwise.
**Expected behavior**
Shut down the script correctly after the sweep is done.
**Screenshots**
Snippet from `htop`.
<img width="647" alt="htop" src="https://user-images.githubusercontent.com/49716607/101246013-45d18080-3719-11eb-8da8-e9c8879bc244.png">
**Operating system and other versions**
- OS: Ubuntu 20.04.1 LTS
- wandb: 10.12
- python: 3.8.5
- torch: 1.7.0
- miniconda3
**Additional context**
I use four GPUs with DDP to distribute jobs, but the problem persists even when I only use 1 GPU.
|
1.0
|
Wandb only terminates one process when using DDP. - **Describe the bug**
Each time I run a sweep (one or multiple runs per agent), some processes are left running (these hog 100% of the processor they're assigned to and eat up RAM). The data from GPUs is not cleared after finishing a sweep, leading to `CUDA out of memory` error. Afterwards I need to kill all processes individually. After stopping the sweep I get this error.
`/home/jpohjone/miniconda3/envs/models/lib/python3.8/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 6 leaked semaphore objects to clean up at shutdown`
There is no problem when running a normal run (ie. copying and running the same command that the sweep uses).
**To Reproduce**
I can create a small example script later if it seems the bug can't be solved otherwise.
**Expected behavior**
Shut down the script correctly after the sweep is done.
**Screenshots**
Snippet from `htop`.
<img width="647" alt="htop" src="https://user-images.githubusercontent.com/49716607/101246013-45d18080-3719-11eb-8da8-e9c8879bc244.png">
**Operating system and other versions**
- OS: Ubuntu 20.04.1 LTS
- wandb: 10.12
- python: 3.8.5
- torch: 1.7.0
- miniconda3
**Additional context**
I use four GPUs with DDP to distribute jobs, but the problem persists even when I only use 1 GPU.
|
process
|
wandb only terminates one process when using ddp describe the bug each time i run a sweep one or multiple runs per agent some processes are left running these hog the processor they re assigned to and eat up ram the data from gpus is not cleared after finishing a sweep leading to cuda out of memory error afterwards i need to kill all processes individually after stopping the sweep i get this error home jpohjone envs models lib multiprocessing resource tracker py userwarning resource tracker there appear to be leaked semaphore objects to clean up at shutdown there is no problem when running a normal run ie copying and running the same command that the sweep uses to reproduce i can create a small example script later i if seems the bug can t be solved otherwise expected behavior shut down the script correctly after sweep is done screenshots snippet from htop img width alt htop src operating system and other versions os ubuntu lt wand python torch additional context i use four gpus with ddp to distribute jobs but the problem persists even when i only use gpu
| 1
|
21,925
| 30,446,558,352
|
IssuesEvent
|
2023-07-15 18:48:21
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
pyutils 0.0.1b18 has 2 GuardDog issues
|
guarddog typosquatting silent-process-execution
|
https://pypi.org/project/pyutils
https://inspector.pypi.io/project/pyutils
```{
"dependency": "pyutils",
"version": "0.0.1b18",
"result": {
"issues": 2,
"errors": {},
"results": {
"typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pytils, python-utils",
"silent-process-execution": [
{
"location": "pyutils/exec_utils.py/pyutils/exec_utils.py:214",
"code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmp8b_5ugxs/pyutils"
}
}```
|
1.0
|
pyutils 0.0.1b18 has 2 GuardDog issues - https://pypi.org/project/pyutils
https://inspector.pypi.io/project/pyutils
```{
"dependency": "pyutils",
"version": "0.0.1b18",
"result": {
"issues": 2,
"errors": {},
"results": {
"typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pytils, python-utils",
"silent-process-execution": [
{
"location": "pyutils/exec_utils.py/pyutils/exec_utils.py:214",
"code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmp8b_5ugxs/pyutils"
}
}```
|
process
|
pyutils has guarddog issues dependency pyutils version result issues errors results typosquatting this package closely ressembles the following package names and might be a typosquatting attempt pytils python utils silent process execution location pyutils exec utils py pyutils exec utils py code subproc subprocess popen n args n stdin subprocess devnull n stdout subprocess devnull n stderr subprocess devnull n message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp pyutils
| 1
|
3,430
| 6,529,654,407
|
IssuesEvent
|
2017-08-30 12:33:51
|
DynareTeam/dynare
|
https://api.github.com/repos/DynareTeam/dynare
|
closed
|
Allow writing steady_state_model-block into LaTeX-document
|
enhancement preprocessor
|
This would make debugging a lot easier
|
1.0
|
Allow writing steady_state_model-block into LaTeX-document - This would make debugging a lot easier
|
process
|
allow writing steady state model block into latex document this would make debugging a lot easier
| 1
|
183,321
| 14,938,358,410
|
IssuesEvent
|
2021-01-25 15:41:03
|
IBM/FHIR
|
https://api.github.com/repos/IBM/FHIR
|
closed
|
Document validation of resource types in fhir-server-config.json with whole-system search
|
documentation search
|
**Is your feature request related to a problem? Please describe.**
We need to document what must be specified in fhir-server-config.json in order for whole-system search to be allowed.
**Describe the solution you'd like**
Documentation should include an explanation of what is checked and example configurations.
**Describe alternatives you've considered**
**Acceptance Criteria**
At least one acceptance criteria is included.
1.
GIVEN [a precondition]
AND [another precondition] WHEN [test step]
AND [test step]
THEN [verification step]
AND [verification step]
**Additional context**
Add any other context or screenshots about the feature request here.
|
1.0
|
Document validation of resource types in fhir-server-config.json with whole-system search - **Is your feature request related to a problem? Please describe.**
We need to document what must be specified in fhir-server-config.json in order for whole-system search to be allowed.
**Describe the solution you'd like**
Documentation should include an explanation of what is checked and example configurations.
**Describe alternatives you've considered**
**Acceptance Criteria**
At least one acceptance criteria is included.
1.
GIVEN [a precondition]
AND [another precondition] WHEN [test step]
AND [test step]
THEN [verification step]
AND [verification step]
**Additional context**
Add any other context or screenshots about the feature request here.
|
non_process
|
document validation of resource types in fhir server config json with whole system search is your feature request related to a problem please describe we need to document what must be specified in fhir server config json in order for whole system search to be allowed describe the solution you d like documentation should include an explanation of what is checked and example configurations describe alternatives you ve considered acceptance criteria at least one acceptance criteria is included given and when and then and additional context add any other context or screenshots about the feature request here
| 0
|
3,781
| 4,045,754,288
|
IssuesEvent
|
2016-05-22 07:35:23
|
minetest/minetest
|
https://api.github.com/repos/minetest/minetest
|
closed
|
Laggy interactions
|
Performance
|
While playing Minetest, I have experienced frequent problems when interacting with the world. This is especially apparent when moving items around in the inventory screens. Note that this happens both in singleplayer and multiplayer.
I think it would make the engine much more polished if this lag did not happen. Does anybody know why this happens?
Also, I don't know if I can help with the code any, but I would like to try. I haven't got much experience with C++, but I have used many other languages.
|
True
|
Laggy interactions - While playing Minetest, I have experienced frequent problems when interacting with the world. This is especially apparent when moving items around in the inventory screens. Note that this happens both in singleplayer and multiplayer.
I think it would make the engine much more polished if this lag did not happen. Does anybody know why this happens?
Also, I don't know if I can help with the code any, but I would like to try. I haven't got much experience with C++, but I have used many other languages.
|
non_process
|
laggy interactions while playing minetest i have experienced frequent problems when interacting with the world this is especially apparent when moving items around in the inventory screens note that this happens both in singleplayer and multiplayer i think it would make the engine much more polished if this lag did not happen does anybody know why this happens also i don t know if i can help with the code any but i would like to try i haven t got much experience with c but i have used many other languages
| 0
|
18,034
| 24,044,358,285
|
IssuesEvent
|
2022-09-16 06:51:09
|
zammad/zammad
|
https://api.github.com/repos/zammad/zammad
|
closed
|
Zammad ignores CSS white-space: pre-wrap; and displays multiline text in a single line
|
bug verified mail processing
|
<!--
Hi there - thanks for filing an issue. Please ensure the following things before creating an issue - thank you! 🤓
Since november 15th we handle all requests, except real bugs, at our community board.
Full explanation: https://community.zammad.org/t/major-change-regarding-github-issues-community-board/21
Please post:
- Feature requests
- Development questions
- Technical questions
on the board -> https://community.zammad.org !
If you think you hit a bug, please continue:
- Search existing issues and the CHANGELOG.md for your issue - there might be a solution already
- Make sure to use the latest version of Zammad if possible
- Add the `log/production.log` file from your system. Attention: Make sure no confidential data is in it!
- Please write the issue in english
- Don't remove the template - otherwise we will close the issue without further comments
- Ask questions about Zammad configuration and usage at our mailinglist. See: https://zammad.org/participate
Note: We always do our best. Unfortunately, sometimes there are too many requests and we can't handle everything at once. If you want to prioritize/escalate your issue, you can do so by means of a support contract (see https://zammad.com/pricing#selfhosted).
* The upper textblock will be removed automatically when you submit your issue *
-->
### Infos:
* Used Zammad version: 5.2.1-7
* Installation method (source, package, ..): docker compose
* Operating system: Ubuntu 20.04
* Database + version: zammad-postgresql-5.2.1-7
* Elasticsearch version: zammad-elasticsearch-5.2.1-7
* Browser + version: Firefox 103.0.2
We use Helpkit which sends mails with multi-line text in a single `<p style="white-space: pre-wrap;">` tag.
With zammad 5.2.1 (and earlier versions) this is displayed without newlines.
### Expected behavior:
The mail content with line breaks.
### Actual behavior:

### Steps to reproduce the behavior:
Raw email (with redacted addresses):
```
Return-Path: <bounces+9937234-a69e-support=redacted.com@em4102.helpkit.so>
Delivered-To: support@redacted.com
Received: from director-04.heinlein-hosting.de ([80.241.60.215])
(using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits))
by dobby31b.heinlein-hosting.de with LMTPS
id 6ORfHuQxA2MF+AEAmyDAhQ
(envelope-from <bounces+9937234-a69e-support=redacted.com@em4102.helpkit.so>)
for <support@redacted.com>; Mon, 22 Aug 2022 09:36:04 +0200
Received: from mx2.mailbox.org ([80.241.60.215])
(using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits))
by director-04.heinlein-hosting.de with LMTPS
id yGmxHOQxA2O/MAEAgupzMw
(envelope-from <bounces+9937234-a69e-support=redacted.com@em4102.helpkit.so>); Mon, 22 Aug 2022 09:36:04 +0200
Authentication-Results: spamfilter05.heinlein-hosting.de (amavisd-new);
dkim=pass (2048-bit key) header.d=helpkit.so
Received: from wrqvvpzp.outbound-mail.sendgrid.net (wrqvvpzp.outbound-mail.sendgrid.net [149.72.131.227])
(using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits)
key-exchange ECDHE (P-384) server-signature RSA-PSS (4096 bits) server-digest SHA256)
(No client certificate requested)
by mx2.mailbox.org (Postfix) with ESMTPS id C7CF2A26BD
for <support@redacted.com>; Mon, 22 Aug 2022 09:35:59 +0200 (CEST)
DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=helpkit.so;
h=content-type:from:mime-version:subject:reply-to:to:cc;
s=s1; bh=ZTdcbZN0K3fd5996wlmaZ7tSz3L3ERKQnptFHuFhBfk=;
b=G157kKzxUX9x2PqZqQn+Ihuzc31fJvQ0Va5S4CWZ6DfmkNHuuUpGUXdryoXBkJK8qRc3
EZUPsqrW40EsxdaybgcBpLLMsZJELqgAivDL4JMfhInm+RefdY4sSwYpKMNONAL3/VqTbs
x4saxBfVV8NlxbZJPNAMATYIquU8DpiG60C5w7j7B1Qnl7Fl3HZvfRv/s+rntOufEFsJS8
+ZCnzSC6a/Q/7HU5R5wRO+1gWEAR9/vyd2VqEV0u4xNTMnom0/cfGl4gtGzdT+gD88YASb
ahKDEp/Ezodew1bPRAXQztcvl+Dp8n9WU7hD3ABpsUmbNRWy8GKjXcixaNUKbmQw==
Received: by filterdrecv-846bc987b5-2ngxl with SMTP id filterdrecv-846bc987b5-2ngxl-1-630331DC-22
2022-08-22 07:35:56.743650717 +0000 UTC m=+2719891.605882063
Received: from OTkzNzIzNA (unknown)
by geopod-ismtpd-4-0 (SG) with HTTP
id JBEh4Z9vTQ63o7MxoxgsiQ
Mon, 22 Aug 2022 07:35:56.589 +0000 (UTC)
Content-Type: multipart/alternative; boundary=d7078f34c4d60d9d32a8fedcabde5a22a2d050218f511318071ccd212f93
Date: Mon, 22 Aug 2022 07:35:56 +0000 (UTC)
From: Nico <notifications@helpkit.so>
Mime-Version: 1.0
Message-ID: <JBEh4Z9vTQ63o7MxoxgsiQ@geopod-ismtpd-4-0>
Subject: Testmail for zammad issue tracker
Reply-To: redacted@redacted.com
X-SG-EID:
=?us-ascii?Q?YBNWZ89bs5mwVSkmxC5Pd6sj6mrqz9n8q8=2F+vrcvrHdMK+mv4btGL1bm7jD0qW?=
=?us-ascii?Q?yhYXd=2FTT7pZjFy7O4v6MAT88TXnrf1xNMHjAPU4?=
=?us-ascii?Q?FyH1c3ImwVTuVLYbtZ92dB9GOa11EPqWLe8l6bl?=
=?us-ascii?Q?kMzbk4ikkT1hPBpEBpULlN6im4wO9LVhyHK2ybd?=
=?us-ascii?Q?oHOwp7fnNwWH1b0SRdogWBBl3yugT65VeMIE5fq?=
=?us-ascii?Q?xo7dz2c6bvC=2FZEEslhPdDaMX9kUPTftYiVWZJB?=
To: support@redacted.com
X-Entity-ID: fv8b11wcRphjSqbTFT5npQ==
X-Rspamd-Score: -7.11 / 15.00 / 15.00
X-MBO-SPAM-Probability:
X-Rspamd-Queue-Id: C7CF2A26BD
--d7078f34c4d60d9d32a8fedcabde5a22a2d050218f511318071ccd212f93
Content-Transfer-Encoding: quoted-printable
Content-Type: text/plain; charset=us-ascii
Mime-Version: 1.0
Hi!
This is a test mail for the zammad issue tracker to show that multiline tex=
t is displayed as a single line.
Lorem ipsum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy ei=
rmod tempor invidunt ut labore et dolore magna aliquyam erat, sed diam volu=
ptua. At vero eos et accusam et justo duo dolores et ea rebum. Stet clita k=
asd gubergren, no sea takimata sanctus est Lorem ipsum dolor sit amet. Lore=
m ipsum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy eirmod=
tempor invidunt ut labore et dolore magna aliquyam erat, sed diam voluptua=
. At vero eos et accusam et justo duo dolores et ea rebum. Stet clita kasd =
gubergren, no sea takimata sanctus est Lorem ipsum dolor sit amet. Lorem ip=
sum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy eirmod tem=
por invidunt ut labore et dolore magna aliquyam erat, sed diam voluptua. At=
vero eos et accusam et justo duo dolores et ea rebum. Stet clita kasd gube=
rgren, no sea takimata sanctus est Lorem ipsum dolor sit amet.=20
Duis autem vel eum iriure dolor in hendrerit in vulputate velit esse molest=
ie consequat, vel illum dolore eu feugiat nulla facilisis at vero eros et a=
ccumsan et iusto odio dignissim qui blandit praesent luptatum zzril delenit=
augue duis dolore te feugait nulla facilisi. Lorem ipsum dolor sit amet, c=
onsectetuer adipiscing elit, sed diam nonummy nibh euismod tincidunt ut lao=
reet dolore magna aliquam erat volutpat.=20
Ut wisi enim ad minim veniam, quis nostrud exerci tation ullamcorper suscip=
it lobortis nisl ut aliquip ex ea commodo consequat. Duis autem vel eum iri=
ure dolor in hendrerit in vulputate velit esse molestie consequat, vel illu=
m dolore eu feugiat nulla facilisis at vero eros et accumsan et iusto odio =
dignissim qui blandit praesent luptatum zzril delenit augue duis dolore te =
feugait nulla facilisi.
--d7078f34c4d60d9d32a8fedcabde5a22a2d050218f511318071ccd212f93
Content-Transfer-Encoding: quoted-printable
Content-Type: text/html; charset=us-ascii
Mime-Version: 1.0
<p style=3D"white-space: pre-wrap;">Hi!
This is a test mail for the zammad issue tracker to show that multiline tex=
t is displayed as a single line.
Lorem ipsum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy ei=
rmod tempor invidunt ut labore et dolore magna aliquyam erat, sed diam volu=
ptua. At vero eos et accusam et justo duo dolores et ea rebum. Stet clita k=
asd gubergren, no sea takimata sanctus est Lorem ipsum dolor sit amet. Lore=
m ipsum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy eirmod=
tempor invidunt ut labore et dolore magna aliquyam erat, sed diam voluptua=
. At vero eos et accusam et justo duo dolores et ea rebum. Stet clita kasd =
gubergren, no sea takimata sanctus est Lorem ipsum dolor sit amet. Lorem ip=
sum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy eirmod tem=
por invidunt ut labore et dolore magna aliquyam erat, sed diam voluptua. At=
vero eos et accusam et justo duo dolores et ea rebum. Stet clita kasd gube=
rgren, no sea takimata sanctus est Lorem ipsum dolor sit amet.=20
Duis autem vel eum iriure dolor in hendrerit in vulputate velit esse molest=
ie consequat, vel illum dolore eu feugiat nulla facilisis at vero eros et a=
ccumsan et iusto odio dignissim qui blandit praesent luptatum zzril delenit=
augue duis dolore te feugait nulla facilisi. Lorem ipsum dolor sit amet, c=
onsectetuer adipiscing elit, sed diam nonummy nibh euismod tincidunt ut lao=
reet dolore magna aliquam erat volutpat.=20
Ut wisi enim ad minim veniam, quis nostrud exerci tation ullamcorper suscip=
it lobortis nisl ut aliquip ex ea commodo consequat. Duis autem vel eum iri=
ure dolor in hendrerit in vulputate velit esse molestie consequat, vel illu=
m dolore eu feugiat nulla facilisis at vero eros et accumsan et iusto odio =
dignissim qui blandit praesent luptatum zzril delenit augue duis dolore te =
feugait nulla facilisi.</p><img src=3D"http://url9025.helpkit.so/wf/open?up=
n=3Dr3XecG9Oeir8G6iSrKDq5Dy8UwfMhF2V9XfwX0wxF44NZqiGiAjKARZpLBlSkL0S4EioOhD=
g7Y-2Bip-2FjrDjpG8lb6DmWGYiAEZ8Z2qi-2BT86z-2Fhb4JMALmwSgr-2F6vyMo6vM-2Fa8HS=
kPbndL2807vJmK3HaPhTgGvTeYSKkq13OVJdyA3akmO66ZZTGAQX1b8-2Fm51-2FzujlqUODw-2=
F-2FFke7r5VWbL66MFmemZmt4oP-2BXdYBDQ-3D" alt=3D"" width=3D"1" height=3D"1" =
border=3D"0" style=3D"height:1px !important;width:1px !important;border-wid=
th:0 !important;margin-top:0 !important;margin-bottom:0 !important;margin-r=
ight:0 !important;margin-left:0 !important;padding-top:0 !important;padding=
-bottom:0 !important;padding-right:0 !important;padding-left:0 !important;"=
/>
--d7078f34c4d60d9d32a8fedcabde5a22a2d050218f511318071ccd212f93--
```
Injecting that mail should trigger that bug.
Alternatively, sign up for Helpkit and use that to send a mail to your zammad instance. Or send an email with a single `<p style="white-space: pre-wrap;">` tag yourself.
Yes I'm sure this is a bug and no feature request or a general question.
|
1.0
|
Zammad ignores CSS white-space: pre-wrap; and displays multiline text in a single line - <!--
Hi there - thanks for filing an issue. Please ensure the following things before creating an issue - thank you! 🤓
Since november 15th we handle all requests, except real bugs, at our community board.
Full explanation: https://community.zammad.org/t/major-change-regarding-github-issues-community-board/21
Please post:
- Feature requests
- Development questions
- Technical questions
on the board -> https://community.zammad.org !
If you think you hit a bug, please continue:
- Search existing issues and the CHANGELOG.md for your issue - there might be a solution already
- Make sure to use the latest version of Zammad if possible
- Add the `log/production.log` file from your system. Attention: Make sure no confidential data is in it!
- Please write the issue in english
- Don't remove the template - otherwise we will close the issue without further comments
- Ask questions about Zammad configuration and usage at our mailinglist. See: https://zammad.org/participate
Note: We always do our best. Unfortunately, sometimes there are too many requests and we can't handle everything at once. If you want to prioritize/escalate your issue, you can do so by means of a support contract (see https://zammad.com/pricing#selfhosted).
* The upper textblock will be removed automatically when you submit your issue *
-->
### Infos:
* Used Zammad version: 5.2.1-7
* Installation method (source, package, ..): docker compose
* Operating system: Ubuntu 20.04
* Database + version: zammad-postgresql-5.2.1-7
* Elasticsearch version: zammad-elasticsearch-5.2.1-7
* Browser + version: Firefox 103.0.2
We use Helpkit which sends mails with multi-line text in a single `<p style="white-space: pre-wrap;">` tag.
With zammad 5.2.1 (and earlier versions) this is displayed without newlines.
### Expected behavior:
The mail content with line breaks.
### Actual behavior:

### Steps to reproduce the behavior:
Raw email (with redacted addresses):
```
Return-Path: <bounces+9937234-a69e-support=redacted.com@em4102.helpkit.so>
Delivered-To: support@redacted.com
Received: from director-04.heinlein-hosting.de ([80.241.60.215])
(using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits))
by dobby31b.heinlein-hosting.de with LMTPS
id 6ORfHuQxA2MF+AEAmyDAhQ
(envelope-from <bounces+9937234-a69e-support=redacted.com@em4102.helpkit.so>)
for <support@redacted.com>; Mon, 22 Aug 2022 09:36:04 +0200
Received: from mx2.mailbox.org ([80.241.60.215])
(using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits))
by director-04.heinlein-hosting.de with LMTPS
id yGmxHOQxA2O/MAEAgupzMw
(envelope-from <bounces+9937234-a69e-support=redacted.com@em4102.helpkit.so>); Mon, 22 Aug 2022 09:36:04 +0200
Authentication-Results: spamfilter05.heinlein-hosting.de (amavisd-new);
dkim=pass (2048-bit key) header.d=helpkit.so
Received: from wrqvvpzp.outbound-mail.sendgrid.net (wrqvvpzp.outbound-mail.sendgrid.net [149.72.131.227])
(using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits)
key-exchange ECDHE (P-384) server-signature RSA-PSS (4096 bits) server-digest SHA256)
(No client certificate requested)
by mx2.mailbox.org (Postfix) with ESMTPS id C7CF2A26BD
for <support@redacted.com>; Mon, 22 Aug 2022 09:35:59 +0200 (CEST)
DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=helpkit.so;
h=content-type:from:mime-version:subject:reply-to:to:cc;
s=s1; bh=ZTdcbZN0K3fd5996wlmaZ7tSz3L3ERKQnptFHuFhBfk=;
b=G157kKzxUX9x2PqZqQn+Ihuzc31fJvQ0Va5S4CWZ6DfmkNHuuUpGUXdryoXBkJK8qRc3
EZUPsqrW40EsxdaybgcBpLLMsZJELqgAivDL4JMfhInm+RefdY4sSwYpKMNONAL3/VqTbs
x4saxBfVV8NlxbZJPNAMATYIquU8DpiG60C5w7j7B1Qnl7Fl3HZvfRv/s+rntOufEFsJS8
+ZCnzSC6a/Q/7HU5R5wRO+1gWEAR9/vyd2VqEV0u4xNTMnom0/cfGl4gtGzdT+gD88YASb
ahKDEp/Ezodew1bPRAXQztcvl+Dp8n9WU7hD3ABpsUmbNRWy8GKjXcixaNUKbmQw==
Received: by filterdrecv-846bc987b5-2ngxl with SMTP id filterdrecv-846bc987b5-2ngxl-1-630331DC-22
2022-08-22 07:35:56.743650717 +0000 UTC m=+2719891.605882063
Received: from OTkzNzIzNA (unknown)
by geopod-ismtpd-4-0 (SG) with HTTP
id JBEh4Z9vTQ63o7MxoxgsiQ
Mon, 22 Aug 2022 07:35:56.589 +0000 (UTC)
Content-Type: multipart/alternative; boundary=d7078f34c4d60d9d32a8fedcabde5a22a2d050218f511318071ccd212f93
Date: Mon, 22 Aug 2022 07:35:56 +0000 (UTC)
From: Nico <notifications@helpkit.so>
Mime-Version: 1.0
Message-ID: <JBEh4Z9vTQ63o7MxoxgsiQ@geopod-ismtpd-4-0>
Subject: Testmail for zammad issue tracker
Reply-To: redacted@redacted.com
X-SG-EID:
=?us-ascii?Q?YBNWZ89bs5mwVSkmxC5Pd6sj6mrqz9n8q8=2F+vrcvrHdMK+mv4btGL1bm7jD0qW?=
=?us-ascii?Q?yhYXd=2FTT7pZjFy7O4v6MAT88TXnrf1xNMHjAPU4?=
=?us-ascii?Q?FyH1c3ImwVTuVLYbtZ92dB9GOa11EPqWLe8l6bl?=
=?us-ascii?Q?kMzbk4ikkT1hPBpEBpULlN6im4wO9LVhyHK2ybd?=
=?us-ascii?Q?oHOwp7fnNwWH1b0SRdogWBBl3yugT65VeMIE5fq?=
=?us-ascii?Q?xo7dz2c6bvC=2FZEEslhPdDaMX9kUPTftYiVWZJB?=
To: support@redacted.com
X-Entity-ID: fv8b11wcRphjSqbTFT5npQ==
X-Rspamd-Score: -7.11 / 15.00 / 15.00
X-MBO-SPAM-Probability:
X-Rspamd-Queue-Id: C7CF2A26BD
--d7078f34c4d60d9d32a8fedcabde5a22a2d050218f511318071ccd212f93
Content-Transfer-Encoding: quoted-printable
Content-Type: text/plain; charset=us-ascii
Mime-Version: 1.0
Hi!
This is a test mail for the zammad issue tracker to show that multiline tex=
t is displayed as a single line.
Lorem ipsum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy ei=
rmod tempor invidunt ut labore et dolore magna aliquyam erat, sed diam volu=
ptua. At vero eos et accusam et justo duo dolores et ea rebum. Stet clita k=
asd gubergren, no sea takimata sanctus est Lorem ipsum dolor sit amet. Lore=
m ipsum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy eirmod=
tempor invidunt ut labore et dolore magna aliquyam erat, sed diam voluptua=
. At vero eos et accusam et justo duo dolores et ea rebum. Stet clita kasd =
gubergren, no sea takimata sanctus est Lorem ipsum dolor sit amet. Lorem ip=
sum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy eirmod tem=
por invidunt ut labore et dolore magna aliquyam erat, sed diam voluptua. At=
vero eos et accusam et justo duo dolores et ea rebum. Stet clita kasd gube=
rgren, no sea takimata sanctus est Lorem ipsum dolor sit amet.=20
Duis autem vel eum iriure dolor in hendrerit in vulputate velit esse molest=
ie consequat, vel illum dolore eu feugiat nulla facilisis at vero eros et a=
ccumsan et iusto odio dignissim qui blandit praesent luptatum zzril delenit=
augue duis dolore te feugait nulla facilisi. Lorem ipsum dolor sit amet, c=
onsectetuer adipiscing elit, sed diam nonummy nibh euismod tincidunt ut lao=
reet dolore magna aliquam erat volutpat.=20
Ut wisi enim ad minim veniam, quis nostrud exerci tation ullamcorper suscip=
it lobortis nisl ut aliquip ex ea commodo consequat. Duis autem vel eum iri=
ure dolor in hendrerit in vulputate velit esse molestie consequat, vel illu=
m dolore eu feugiat nulla facilisis at vero eros et accumsan et iusto odio =
dignissim qui blandit praesent luptatum zzril delenit augue duis dolore te =
feugait nulla facilisi.
--d7078f34c4d60d9d32a8fedcabde5a22a2d050218f511318071ccd212f93
Content-Transfer-Encoding: quoted-printable
Content-Type: text/html; charset=us-ascii
Mime-Version: 1.0
<p style=3D"white-space: pre-wrap;">Hi!
This is a test mail for the zammad issue tracker to show that multiline tex=
t is displayed as a single line.
Lorem ipsum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy ei=
rmod tempor invidunt ut labore et dolore magna aliquyam erat, sed diam volu=
ptua. At vero eos et accusam et justo duo dolores et ea rebum. Stet clita k=
asd gubergren, no sea takimata sanctus est Lorem ipsum dolor sit amet. Lore=
m ipsum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy eirmod=
tempor invidunt ut labore et dolore magna aliquyam erat, sed diam voluptua=
. At vero eos et accusam et justo duo dolores et ea rebum. Stet clita kasd =
gubergren, no sea takimata sanctus est Lorem ipsum dolor sit amet. Lorem ip=
sum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy eirmod tem=
por invidunt ut labore et dolore magna aliquyam erat, sed diam voluptua. At=
vero eos et accusam et justo duo dolores et ea rebum. Stet clita kasd gube=
rgren, no sea takimata sanctus est Lorem ipsum dolor sit amet.=20
Duis autem vel eum iriure dolor in hendrerit in vulputate velit esse molest=
ie consequat, vel illum dolore eu feugiat nulla facilisis at vero eros et a=
ccumsan et iusto odio dignissim qui blandit praesent luptatum zzril delenit=
augue duis dolore te feugait nulla facilisi. Lorem ipsum dolor sit amet, c=
onsectetuer adipiscing elit, sed diam nonummy nibh euismod tincidunt ut lao=
reet dolore magna aliquam erat volutpat.=20
Ut wisi enim ad minim veniam, quis nostrud exerci tation ullamcorper suscip=
it lobortis nisl ut aliquip ex ea commodo consequat. Duis autem vel eum iri=
ure dolor in hendrerit in vulputate velit esse molestie consequat, vel illu=
m dolore eu feugiat nulla facilisis at vero eros et accumsan et iusto odio =
dignissim qui blandit praesent luptatum zzril delenit augue duis dolore te =
feugait nulla facilisi.</p><img src=3D"http://url9025.helpkit.so/wf/open?up=
n=3Dr3XecG9Oeir8G6iSrKDq5Dy8UwfMhF2V9XfwX0wxF44NZqiGiAjKARZpLBlSkL0S4EioOhD=
g7Y-2Bip-2FjrDjpG8lb6DmWGYiAEZ8Z2qi-2BT86z-2Fhb4JMALmwSgr-2F6vyMo6vM-2Fa8HS=
kPbndL2807vJmK3HaPhTgGvTeYSKkq13OVJdyA3akmO66ZZTGAQX1b8-2Fm51-2FzujlqUODw-2=
F-2FFke7r5VWbL66MFmemZmt4oP-2BXdYBDQ-3D" alt=3D"" width=3D"1" height=3D"1" =
border=3D"0" style=3D"height:1px !important;width:1px !important;border-wid=
th:0 !important;margin-top:0 !important;margin-bottom:0 !important;margin-r=
ight:0 !important;margin-left:0 !important;padding-top:0 !important;padding=
-bottom:0 !important;padding-right:0 !important;padding-left:0 !important;"=
/>
--d7078f34c4d60d9d32a8fedcabde5a22a2d050218f511318071ccd212f93--
```
Injecting that mail should trigger that bug.
Alternatively, sign up for Helpkit and use that to send a mail to your zammad instance. Or send an email with a single `<p style="white-space: pre-wrap;">` tag yourself.
Yes I'm sure this is a bug and no feature request or a general question.
|
process
|
zammad ignores css white space pre wrap and displays multiline text in a single line hi there thanks for filing an issue please ensure the following things before creating an issue thank you 🤓 since november we handle all requests except real bugs at our community board full explanation please post feature requests development questions technical questions on the board if you think you hit a bug please continue search existing issues and the changelog md for your issue there might be a solution already make sure to use the latest version of zammad if possible add the log production log file from your system attention make sure no confidential data is in it please write the issue in english don t remove the template otherwise we will close the issue without further comments ask questions about zammad configuration and usage at our mailinglist see note we always do our best unfortunately sometimes there are too many requests and we can t handle everything at once if you want to prioritize escalate your issue you can do so by means of a support contract see the upper textblock will be removed automatically when you submit your issue infos used zammad version installation method source package docker compose operating system ubuntu database version zammad postgresql elasticsearch version zammad elasticsearch browser version firefox we use helpkit which sends mails with multi line text in a single tag with zammad and earlier versions this is displayed without newlines expected behavior the mail content with line breaks actual behavior steps to reproduce the behavior raw email with redacted addresses return path delivered to support redacted com received from director heinlein hosting de using with cipher tls aes gcm bits by heinlein hosting de with lmtps id aeamydahq envelope from for mon aug received from mailbox org using with cipher tls aes gcm bits by director heinlein hosting de with lmtps id maeagupzmw envelope from mon aug authentication results heinlein hosting 
de amavisd new dkim pass bit key header d helpkit so received from wrqvvpzp outbound mail sendgrid net wrqvvpzp outbound mail sendgrid net using with cipher tls aes gcm bits key exchange ecdhe p server signature rsa pss bits server digest no client certificate requested by mailbox org postfix with esmtps id for mon aug cest dkim signature v a rsa c relaxed relaxed d helpkit so h content type from mime version subject reply to to cc s bh b vqtbs s q ahkdep received by filterdrecv with smtp id filterdrecv utc m received from otkznzizna unknown by geopod ismtpd sg with http id mon aug utc content type multipart alternative boundary date mon aug utc from nico mime version message id subject testmail for zammad issue tracker reply to redacted redacted com x sg eid us ascii q vrcvrhdmk us ascii q yhyxd us ascii q us ascii q us ascii q us ascii q to support redacted com x entity id x rspamd score x mbo spam probability x rspamd queue id content transfer encoding quoted printable content type text plain charset us ascii mime version hi this is a test mail for the zammad issue tracker to show that multiline tex t is displayed as a single line lorem ipsum dolor sit amet consetetur sadipscing elitr sed diam nonumy ei rmod tempor invidunt ut labore et dolore magna aliquyam erat sed diam volu ptua at vero eos et accusam et justo duo dolores et ea rebum stet clita k asd gubergren no sea takimata sanctus est lorem ipsum dolor sit amet lore m ipsum dolor sit amet consetetur sadipscing elitr sed diam nonumy eirmod tempor invidunt ut labore et dolore magna aliquyam erat sed diam voluptua at vero eos et accusam et justo duo dolores et ea rebum stet clita kasd gubergren no sea takimata sanctus est lorem ipsum dolor sit amet lorem ip sum dolor sit amet consetetur sadipscing elitr sed diam nonumy eirmod tem por invidunt ut labore et dolore magna aliquyam erat sed diam voluptua at vero eos et accusam et justo duo dolores et ea rebum stet clita kasd gube rgren no sea takimata sanctus est 
lorem ipsum dolor sit amet duis autem vel eum iriure dolor in hendrerit in vulputate velit esse molest ie consequat vel illum dolore eu feugiat nulla facilisis at vero eros et a ccumsan et iusto odio dignissim qui blandit praesent luptatum zzril delenit augue duis dolore te feugait nulla facilisi lorem ipsum dolor sit amet c onsectetuer adipiscing elit sed diam nonummy nibh euismod tincidunt ut lao reet dolore magna aliquam erat volutpat ut wisi enim ad minim veniam quis nostrud exerci tation ullamcorper suscip it lobortis nisl ut aliquip ex ea commodo consequat duis autem vel eum iri ure dolor in hendrerit in vulputate velit esse molestie consequat vel illu m dolore eu feugiat nulla facilisis at vero eros et accumsan et iusto odio dignissim qui blandit praesent luptatum zzril delenit augue duis dolore te feugait nulla facilisi content transfer encoding quoted printable content type text html charset us ascii mime version hi this is a test mail for the zammad issue tracker to show that multiline tex t is displayed as a single line lorem ipsum dolor sit amet consetetur sadipscing elitr sed diam nonumy ei rmod tempor invidunt ut labore et dolore magna aliquyam erat sed diam volu ptua at vero eos et accusam et justo duo dolores et ea rebum stet clita k asd gubergren no sea takimata sanctus est lorem ipsum dolor sit amet lore m ipsum dolor sit amet consetetur sadipscing elitr sed diam nonumy eirmod tempor invidunt ut labore et dolore magna aliquyam erat sed diam voluptua at vero eos et accusam et justo duo dolores et ea rebum stet clita kasd gubergren no sea takimata sanctus est lorem ipsum dolor sit amet lorem ip sum dolor sit amet consetetur sadipscing elitr sed diam nonumy eirmod tem por invidunt ut labore et dolore magna aliquyam erat sed diam voluptua at vero eos et accusam et justo duo dolores et ea rebum stet clita kasd gube rgren no sea takimata sanctus est lorem ipsum dolor sit amet duis autem vel eum iriure dolor in hendrerit in vulputate velit esse molest ie 
consequat vel illum dolore eu feugiat nulla facilisis at vero eros et a ccumsan et iusto odio dignissim qui blandit praesent luptatum zzril delenit augue duis dolore te feugait nulla facilisi lorem ipsum dolor sit amet c onsectetuer adipiscing elit sed diam nonummy nibh euismod tincidunt ut lao reet dolore magna aliquam erat volutpat ut wisi enim ad minim veniam quis nostrud exerci tation ullamcorper suscip it lobortis nisl ut aliquip ex ea commodo consequat duis autem vel eum iri ure dolor in hendrerit in vulputate velit esse molestie consequat vel illu m dolore eu feugiat nulla facilisis at vero eros et accumsan et iusto odio dignissim qui blandit praesent luptatum zzril delenit augue duis dolore te feugait nulla facilisi img src n f alt width height border style height important width important border wid th important margin top important margin bottom important margin r ight important margin left important padding top important padding bottom important padding right important padding left important injecting that mail should trigger that bug alternatively sign up for helpkit and use that to send a mail to your zammad instance or send an email with a single tag yourself yes i m sure this is a bug and no feature request or a general question
| 1
|
20,492
| 27,146,979,408
|
IssuesEvent
|
2023-02-16 20:49:09
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Confusing/Contradictory statements in runtime expression syntax
|
devops/prod doc-bug Pri2 devops-cicd-process/tech
|
In the subsection, _What Syntax Should I Use_ under _Understand Variable Syntax_, you state the following:
> Choose a runtime expression if you are working with [conditions](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops) and [expressions](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops). **The exception to this is if you have a pipeline where it will cause a problem for your empty variable to print out. For example, if you have conditional logic that relies on a variable having a specific value or no value. In that case, you should use a runtime expression**.
(Emphasis mine.)
The bolded statements first assert that you should not use a runtime expression if the generation of an empty value would break your pipeline. They then go on to state that you should use a runtime expression in such a case.
These statements appear to be contradictory. If so, which syntax should we use instead?
[Enter feedback here]
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: dd7e0bd3-1f7d-d7b6-cc72-5ef63c31b46a
* Version Independent ID: dae87abd-b73d-9120-bcdb-6097d4b40f2a
* Content: [Define variables - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch)
* Content Source: [docs/pipelines/process/variables.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/variables.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Confusing/Contradictory statements in runtime expression syntax - In the subsection, _What Syntax Should I Use_ under _Understand Variable Syntax_, you state the following:
> Choose a runtime expression if you are working with [conditions](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops) and [expressions](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops). **The exception to this is if you have a pipeline where it will cause a problem for your empty variable to print out. For example, if you have conditional logic that relies on a variable having a specific value or no value. In that case, you should use a runtime expression**.
(Emphasis mine.)
The bolded statements first assert that you should not use a runtime expression if the generation of an empty value would break your pipeline. They then go on to state that you should use a runtime expression in such a case.
These statements appear to be contradictory. If so, which syntax should we use instead?
[Enter feedback here]
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: dd7e0bd3-1f7d-d7b6-cc72-5ef63c31b46a
* Version Independent ID: dae87abd-b73d-9120-bcdb-6097d4b40f2a
* Content: [Define variables - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch)
* Content Source: [docs/pipelines/process/variables.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/variables.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
confusing contradictory statements in runtime expression syntax in the subsection what syntax should i use under understand variable syntax you state the following choose a runtime expression if you are working with and the exception to this is if you have a pipeline where it will cause a problem for your empty variable to print out for example if you have conditional logic that relies on a variable having a specific value or no value in that case you should use a runtime expression emphasis mine the bolded statements first assert that you should not use a runtime expression if the generation of an empty value would break your pipeline they then go on to state that you should use a runtime expression in such a case these statements appear to be contradictory if so which syntax should we use instead document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id bcdb content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
13,525
| 16,058,312,411
|
IssuesEvent
|
2021-04-23 08:53:31
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
opened
|
Error: [libs/sql-schema-describer/src/sqlite.rs:452:76] get name
|
bug/1-repro-available kind/bug process/candidate team/migrations
|
<!-- If required, please update the title to be clear and descriptive -->
Command: `prisma introspect`
Version: `2.20.1`
Binary Version: `60ba6551f29b17d7d6ce479e5733c70d9c00860e`
Report: https://prisma-errors.netlify.app/report/13207
OS: `arm64 darwin 20.2.0`
JS Stacktrace:
```
Error: [libs/sql-schema-describer/src/sqlite.rs:452:76] get name
at ChildProcess.<anonymous> (/Users/jonnyparris/dev/duvnab/node_modules/prisma/build/index.js:39909:28)
at ChildProcess.emit (node:events:378:20)
at ChildProcess.EventEmitter.emit (node:domain:470:12)
at Process.ChildProcess._handle.onexit (node:internal/child_process:290:12)
```
Rust Stacktrace:
```
0: backtrace::backtrace::trace
1: backtrace::capture::Backtrace::new
2: user_facing_errors::Error::new_in_panic_hook
3: user_facing_errors::panic_hook::set_panic_hook::{{closure}}
4: std::panicking::rust_panic_with_hook
5: std::panicking::begin_panic_handler::{{closure}}
6: std::sys_common::backtrace::__rust_end_short_backtrace
7: _rust_begin_unwind
8: core::panicking::panic_fmt
9: core::option::expect_failed
10: sql_schema_describer::sqlite::SqlSchemaDescriber::get_table::{{closure}}::{{closure}}
11: <tracing::instrument::Instrumented<T> as core::future::future::Future>::poll
12: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
13: <tracing::instrument::Instrumented<T> as core::future::future::Future>::poll
14: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
15: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
16: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
17: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
18: <futures_util::future::either::Either<A,B> as core::future::future::Future>::poll
19: <futures_util::future::future::Then<Fut1,Fut2,F> as core::future::future::Future>::poll
20: <futures_util::future::either::Either<A,B> as core::future::future::Future>::poll
21: introspection_engine::main::{{closure}}
22: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
23: introspection_engine::main
24: std::sys_common::backtrace::__rust_begin_short_backtrace
25: std::rt::lang_start::{{closure}}
26: std::rt::lang_start_internal
27: std::rt::lang_start
```
|
1.0
|
Error: [libs/sql-schema-describer/src/sqlite.rs:452:76] get name - <!-- If required, please update the title to be clear and descriptive -->
Command: `prisma introspect`
Version: `2.20.1`
Binary Version: `60ba6551f29b17d7d6ce479e5733c70d9c00860e`
Report: https://prisma-errors.netlify.app/report/13207
OS: `arm64 darwin 20.2.0`
JS Stacktrace:
```
Error: [libs/sql-schema-describer/src/sqlite.rs:452:76] get name
at ChildProcess.<anonymous> (/Users/jonnyparris/dev/duvnab/node_modules/prisma/build/index.js:39909:28)
at ChildProcess.emit (node:events:378:20)
at ChildProcess.EventEmitter.emit (node:domain:470:12)
at Process.ChildProcess._handle.onexit (node:internal/child_process:290:12)
```
Rust Stacktrace:
```
0: backtrace::backtrace::trace
1: backtrace::capture::Backtrace::new
2: user_facing_errors::Error::new_in_panic_hook
3: user_facing_errors::panic_hook::set_panic_hook::{{closure}}
4: std::panicking::rust_panic_with_hook
5: std::panicking::begin_panic_handler::{{closure}}
6: std::sys_common::backtrace::__rust_end_short_backtrace
7: _rust_begin_unwind
8: core::panicking::panic_fmt
9: core::option::expect_failed
10: sql_schema_describer::sqlite::SqlSchemaDescriber::get_table::{{closure}}::{{closure}}
11: <tracing::instrument::Instrumented<T> as core::future::future::Future>::poll
12: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
13: <tracing::instrument::Instrumented<T> as core::future::future::Future>::poll
14: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
15: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
16: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
17: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
18: <futures_util::future::either::Either<A,B> as core::future::future::Future>::poll
19: <futures_util::future::future::Then<Fut1,Fut2,F> as core::future::future::Future>::poll
20: <futures_util::future::either::Either<A,B> as core::future::future::Future>::poll
21: introspection_engine::main::{{closure}}
22: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
23: introspection_engine::main
24: std::sys_common::backtrace::__rust_begin_short_backtrace
25: std::rt::lang_start::{{closure}}
26: std::rt::lang_start_internal
27: std::rt::lang_start
```
|
process
|
error get name command prisma introspect version binary version report os darwin js stacktrace error get name at childprocess users jonnyparris dev duvnab node modules prisma build index js at childprocess emit node events at childprocess eventemitter emit node domain at process childprocess handle onexit node internal child process rust stacktrace backtrace backtrace trace backtrace capture backtrace new user facing errors error new in panic hook user facing errors panic hook set panic hook closure std panicking rust panic with hook std panicking begin panic handler closure std sys common backtrace rust end short backtrace rust begin unwind core panicking panic fmt core option expect failed sql schema describer sqlite sqlschemadescriber get table closure closure as core future future future poll as core future future future poll as core future future future poll as core future future future poll as core future future future poll as core future future future poll as core future future future poll as core future future future poll as core future future future poll as core future future future poll introspection engine main closure as core future future future poll introspection engine main std sys common backtrace rust begin short backtrace std rt lang start closure std rt lang start internal std rt lang start
| 1
|
298,685
| 25,847,263,156
|
IssuesEvent
|
2022-12-13 07:45:06
|
dart-lang/co19
|
https://api.github.com/repos/dart-lang/co19
|
closed
|
_FROM_DEFERRED_LIBRARY location
|
bad-test
|
With https://dart-review.googlesource.com/c/sdk/+/274141 we changed the location on which `XYZ_FROM_DEFERRED_LIBRARY` is reported. By doing this I broke the following tests:
```
co19/Language/Expressions/Constants/static_constant_t06
co19/Language/Expressions/Constants/static_constant_t07
co19_2/Language/Expressions/Constants/static_constant_t06
co19_2/Language/Expressions/Constants/static_constant_t07
```
|
1.0
|
_FROM_DEFERRED_LIBRARY location - With https://dart-review.googlesource.com/c/sdk/+/274141 we changed the location on which `XYZ_FROM_DEFERRED_LIBRARY` is reported. By doing this I broke the following tests:
```
co19/Language/Expressions/Constants/static_constant_t06
co19/Language/Expressions/Constants/static_constant_t07
co19_2/Language/Expressions/Constants/static_constant_t06
co19_2/Language/Expressions/Constants/static_constant_t07
```
|
non_process
|
from deferred library location with we changed the location on which xyz from deferred library is reported by doing this i broke following tests language expressions constants static constant language expressions constants static constant language expressions constants static constant language expressions constants static constant
| 0
|
2,597
| 5,356,196,634
|
IssuesEvent
|
2017-02-20 15:03:59
|
AllenFang/react-bootstrap-table
|
https://api.github.com/repos/AllenFang/react-bootstrap-table
|
closed
|
Default value not working for select filter in remote mode
|
help wanted inprocess
|
I have a table with two enum columns, and I added a filter on them (the table is in remote mode).
Here is a piece of my code :
```
<TableHeaderColumn dataField="customerType" dataSort
filter={{ type: 'SelectFilter', options: zipObj(props.customerTypes, props.customerTypes),
defaultValue: filterDefaults.customerType }}
>
Type
</TableHeaderColumn>
```
The options object looks like this (zipObj if from the ramda library):
```
{
'EASY': 'EASY',
'SIR': 'SIR'
}
```
and the `defaultValue` is 'EASY'.
I noticed in the code that you use the `defaultValue `attribute on the select tag, but I think it should just be `value`. (See here : https://github.com/AllenFang/react-bootstrap-table/blob/master/src/filters/Select.js#L63)
I tried to change this in my node_modules folder and the right option was selected, but then it had some strange behaviors, the style was the same as when the placeholder is selected, and I couldn't select the placeholder anymore.
Can you help me figure out what's wrong if it's my code, or publish a fix to this problem ? I'm more than happy to do a PR but I don't see the problem besides the `defaultValue` attribute.
|
1.0
|
Default value not working for select filter in remote mode - I have a table with two enum columns, and I added a filter on them (the table is in remote mode).
Here is a piece of my code :
```
<TableHeaderColumn dataField="customerType" dataSort
filter={{ type: 'SelectFilter', options: zipObj(props.customerTypes, props.customerTypes),
defaultValue: filterDefaults.customerType }}
>
Type
</TableHeaderColumn>
```
The options object looks like this (zipObj if from the ramda library):
```
{
'EASY': 'EASY',
'SIR': 'SIR'
}
```
and the `defaultValue` is 'EASY'.
I noticed in the code that you use the `defaultValue `attribute on the select tag, but I think it should just be `value`. (See here : https://github.com/AllenFang/react-bootstrap-table/blob/master/src/filters/Select.js#L63)
I tried to change this in my node_modules folder and the right option was selected, but then it had some strange behaviors, the style was the same as when the placeholder is selected, and I couldn't select the placeholder anymore.
Can you help me figure out what's wrong if it's my code, or publish a fix to this problem ? I'm more than happy to do a PR but I don't see the problem besides the `defaultValue` attribute.
|
process
|
default value not working for select filter in remote mode i have a table with two enum columns and i added a filter on them the table is in remote mode here is a piece of my code tableheadercolumn datafield customertype datasort filter type selectfilter options zipobj props customertypes props customertypes defaultvalue filterdefaults customertype type the options object looks like this zipobj if from the ramda library easy easy sir sir and the defaultvalue is easy i noticed in the code that you use the defaultvalue attribute on the select tag but i think it should just be value see here i tried to change this in my node modules folder and the right option was selected but then it had some strange behaviors the style was the same as when the placeholder is selected and i couldn t select the placeholder anymore can you help me figure out what s wrong if it s my code or publish a fix to this problem i m more than happy to do a pr but i don t see the problem besides the defaultvalue attribute
| 1
|
372,072
| 11,008,762,735
|
IssuesEvent
|
2019-12-04 11:10:30
|
coder3101/cp-editor2
|
https://api.github.com/repos/coder3101/cp-editor2
|
closed
|
Open .cpp file with CP editor
|
enhancement medium_priority windows
|
**Is your feature request related to a problem? Please describe.**
If this feature was added, there would be a solution to the issue regarding the crash of the editor when it opens the file explorer. This bug doesn't allow some users to save/open files directly from the program and is completely explained here: https://github.com/coder3101/cp-editor2/issues/6 .
**Describe the solution you'd like**
When I select and right-click a .cpp file in the file explorer, there's no "Open with CP editor" option. Moreover, when I force the editor to open a .cpp file, it opens as if I opened it by double-clicking the executable file.
Furthermore, this feature exists in nearly every code editor. Therefore adding it would not only improve the program's quality but also benefit users who don't experience the bug.
**Describe alternatives you've considered**
N/A
**Additional context**
Here's a gif of what I've been trying to do: https://i.imgur.com/swpWIcB.gifv
|
1.0
|
Open .cpp file with CP editor - **Is your feature request related to a problem? Please describe.**
If this feature was added, there would be a solution to the issue regarding the crash of the editor when it opens the file explorer. This bug doesn't allow some users to save/open files directly from the program and is completely explained here: https://github.com/coder3101/cp-editor2/issues/6 .
**Describe the solution you'd like**
When I select and right-click a .cpp file in the file explorer, there's no "Open with CP editor" option. Moreover, when I force the editor to open a .cpp file, it opens as if I opened it by double-clicking the executable file.
Furthermore, this feature exists in nearly every code editor. Therefore adding it would not only improve the program's quality but also benefit users who don't experience the bug.
**Describe alternatives you've considered**
N/A
**Additional context**
Here's a gif of what I've been trying to do: https://i.imgur.com/swpWIcB.gifv
|
non_process
|
open cpp file with cp editor is your feature request related to a problem please describe if this feature was added there would be a solution to the issue regarding the crash of the editor when it opens the file explorer this bug doesn t allow some users to save open files directly from the program and is completely explained here describe the solution you d like when i select and right click a cpp file in the file explorer there s no open with cp editor option moreover when i force the editor to open a cpp file it opens as if i opened it by double clicking the executable file furthermore this feature exists in nearly every code editor therefore adding it would not only improve the program s quality but also benefit users who don t experience the bug describe alternatives you ve considered n a additional context here s a gif of what i ve been trying to do
| 0
|
5,109
| 7,885,991,474
|
IssuesEvent
|
2018-06-27 14:03:49
|
cptechinc/soft-dpluso
|
https://api.github.com/repos/cptechinc/soft-dpluso
|
closed
|
Rename Processwire configs
|
Processwire
|
In Processwire go into templates, choose a template
then under advanced there's an area that says rename template, then you can change the name of the template. Rename the following templates
customer-config -> config
actions-config -> config-useractions
dplus-config -> config-dplus
interfax-config -> config-interfax
form-fields-config -> config-form-fields
sales-orders-config -> config-sales-orders
quotes-config -> config-quotes
ii-config -> config-ii
cart-config -> config-cart-config
add config-dashboard template
add config-dashboard to allowable children templates of config
add instance of config dashboard as Dashboard under /config/
Add field show_salespanel (checkbox) "Show Top 25 Customers Panel?"
Add field show_bookingspanel (checkbox) "Show bookings?"
Add fields to config-dashboard
Do this for Bellboy Liquor, Bellboy Bar Supply
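The rename list above can be captured as a simple old-name → new-name mapping (illustration only — the actual renames are performed in the ProcessWire admin UI as described, not via a script):

```python
# Old template name -> new template name, mirroring the list in this issue.
TEMPLATE_RENAMES = {
    "customer-config": "config",
    "actions-config": "config-useractions",
    "dplus-config": "config-dplus",
    "interfax-config": "config-interfax",
    "form-fields-config": "config-form-fields",
    "sales-orders-config": "config-sales-orders",
    "quotes-config": "config-quotes",
    "ii-config": "config-ii",
    "cart-config": "config-cart-config",
}

for old_name, new_name in TEMPLATE_RENAMES.items():
    print(f"{old_name} -> {new_name}")
```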
|
1.0
|
Rename Processwire configs - In Processwire go into templates, choose a template
then under Advanced there's an area that says "rename template", where you can change the name of the template. Rename the following templates:
customer-config -> config
actions-config -> config-useractions
dplus-config -> config-dplus
interfax-config -> config-interfax
form-fields-config -> config-form-fields
sales-orders-config -> config-sales-orders
quotes-config -> config-quotes
ii-config -> config-ii
cart-config -> config-cart-config
add config-dashboard template
add config-dashboard to allowable children templates of config
add instance of config dashboard as Dashboard under /config/
Add field show_salespanel (checkbox) "Show Top 25 Customers Panel?"
Add field show_bookingspanel (checkbox) "Show bookings?"
Add fields to config-dashboard
Do this for Bellboy Liquor, Bellboy Bar Supply
|
process
|
rename processwire configs in processwire go into templates choose a template then under advanced there s an area that says rename template then you can change the name of the template rename the following templates customer config config actions config config useractions dplus config config dplus interfax config config interfax form fields config config form fields sales orders config config sales orders quotes config config quotes ii config config ii cart config config cart config add config dashboard template add config dashboard to allowable children templates of config add instance of config dashboard as dashboard under config add field show salespanel checkbox show top customers panel add field show bookingspanel checkbox show bookings add fields to config dashboard do this for bellboy liquor bellboy bar supply
| 1
|
194,057
| 15,395,935,475
|
IssuesEvent
|
2021-03-03 19:55:17
|
UuuNyaa/blender_mmd_uuunyaa_tools
|
https://api.github.com/repos/UuuNyaa/blender_mmd_uuunyaa_tools
|
opened
|
Write a wiki on how to add asset issues
|
documentation
|
1. create an issue at https://github.com/UuuNyaa/blender_mmd_assets/issues
2. download the issue (e.g. 15) in debug mode in the Python Console
```python
import mmd_uuunyaa_tools.debug
mmd_uuunyaa_tools.debug.debug_assets(15)
bpy.ops.mmd_uuunyaa_tools.reload_asset_jsons()
```

3. do the debugging
4. delete the debug asset
```python
import mmd_uuunyaa_tools.debug
mmd_uuunyaa_tools.debug.debug_assets()
bpy.ops.mmd_uuunyaa_tools.reload_asset_jsons()
```
|
1.0
|
Write a wiki on how to add asset issues - 1. create an issue at https://github.com/UuuNyaa/blender_mmd_assets/issues
2. download the issue (e.g. 15) in debug mode in the Python Console
```python
import mmd_uuunyaa_tools.debug
mmd_uuunyaa_tools.debug.debug_assets(15)
bpy.ops.mmd_uuunyaa_tools.reload_asset_jsons()
```

3. do the debugging
4. delete the debug asset
```python
import mmd_uuunyaa_tools.debug
mmd_uuunyaa_tools.debug.debug_assets()
bpy.ops.mmd_uuunyaa_tools.reload_asset_jsons()
```
|
non_process
|
write a wiki on how to add asset issues create a issue to download the issue eg in debug mode on python console python import mmd uuunyaa tools debug mmd uuunyaa tools debug debug assets bpy ops mmd uuunyaa tools reload asset jsons do debug delete debug asset python import mmd uuunyaa tools debug mmd uuunyaa tools debug debug assets bpy ops mmd uuunyaa tools reload asset jsons
| 0
|
17,726
| 23,626,378,477
|
IssuesEvent
|
2022-08-25 04:37:25
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
reopened
|
Migrate dev fails on migration with unique fields
|
bug/1-unconfirmed kind/bug process/candidate team/schema topic: shadow database topic: prisma migrate dev
|
### Bug description
Hello, guys!
I have a very strange issue. I tried many solutions to resolve it, but with no results.
First, I created the init schema and applied some migrations after it. All worked fine.
For my next task I needed to add unique fields to one of my models.
OK, it seemed easy and I did it. The migration ran successfully.
But now when I run `prisma migrate dev` I get the error ```Error: P1017 Server has closed the connection```.
I did some debugging of the MySQL container and got this log:
```
| 2022-08-20T17:35:36.458136Z 9 [ERROR] InnoDB could not find key no 1 with name Estimate_userId_fkey from dict cache for table prisma_migrate_shadow_db_0d697eb8@002d1ad2@002d4b00@002dba85@002d01fe25c977f7/estimate
mysql | 17:35:36 UTC - mysqld got signal 11 ;
mysql | This could be because you hit a bug. It is also possible that this binary
mysql | or one of the libraries it was linked against is corrupt, improperly built,
mysql | or misconfigured. This error can also be caused by malfunctioning hardware.
mysql | Attempting to collect some information that could help diagnose the problem.
mysql | As this is a crash and something is definitely wrong, the information
mysql | collection process might fail.
mysql |
mysql | key_buffer_size=8388608
mysql | read_buffer_size=131072
mysql | max_used_connections=3
mysql | max_threads=151
mysql | thread_count=2
mysql | connection_count=2
mysql | It is possible that mysqld could use up to
mysql | key_buffer_size + (read_buffer_size + sort_buffer_size)*max_threads = 68196 K bytes of memory
mysql | Hope that's ok; if not, decrease some variables in the equation.
mysql |
mysql | Thread pointer: 0x7f5b74014b30
mysql | Attempting backtrace. You can use the following information to find out
mysql | where mysqld died. If you see no messages after this, something went
mysql | terribly wrong...
mysql | stack_bottom = 7f5b982e2e80 thread_stack 0x40000
mysql | mysqld(my_print_stacktrace+0x2c)[0x55f071d9633c]
mysql | mysqld(handle_fatal_signal+0x479)[0x55f0716ba0c9]
mysql | /lib/x86_64-linux-gnu/libpthread.so.0(+0x110e0)[0x7f5bbde1a0e0]
mysql | mysqld(_ZN11ha_innobase10index_typeEj+0x40)[0x55f071dc4120]
mysql | mysqld(+0xbb17bd)[0x55f071bb17bd]
mysql | mysqld(+0xbbd962)[0x55f071bbd962]
mysql | mysqld(_Z14get_all_tablesP3THDP10TABLE_LISTP4Item+0x6e2)[0x55f071bbe1b2]
mysql | mysqld(+0xba964d)[0x55f071ba964d]
mysql | mysqld(_Z24get_schema_tables_resultP4JOIN23enum_schema_table_state+0x195)[0x55f071bba155]
mysql | mysqld(_ZN4JOIN14prepare_resultEv+0x6d)[0x55f071b9f12d]
mysql | mysqld(_ZN4JOIN4execEv+0x8a)[0x55f071b3011a]
mysql | mysqld(_Z12handle_queryP3THDP3LEXP12Query_resultyy+0x233)[0x55f071b9fb03]
mysql | mysqld(+0x684362)[0x55f071684362]
mysql | mysqld(_Z21mysql_execute_commandP3THDb+0x462d)[0x55f071b630fd]
mysql | mysqld(_ZN18Prepared_statement7executeEP6Stringb+0x356)[0x55f071b8dd56]
mysql | mysqld(_ZN18Prepared_statement12execute_loopEP6StringbPhS2_+0xda)[0x55f071b90cba]
mysql | mysqld(_Z19mysqld_stmt_executeP3THDmmPhm+0xfa)[0x55f071b90faa]
mysql | mysqld(_Z16dispatch_commandP3THDPK8COM_DATA19enum_server_command+0x1a29)[0x55f071b67529]
mysql | mysqld(_Z10do_commandP3THD+0x197)[0x55f071b67f77]
mysql | mysqld(handle_connection+0x278)[0x55f071c25648]
mysql | mysqld(pfs_spawn_thread+0x1b4)[0x55f0720fd324]
mysql | /lib/x86_64-linux-gnu/libpthread.so.0(+0x74a4)[0x7f5bbde104a4]
mysql | /lib/x86_64-linux-gnu/libc.so.6(clone+0x3f)[0x7f5bbc65cd0f]
mysql |
mysql | Trying to get some variables.
mysql | Some pointers may be invalid and cause the dump to abort.
mysql | Query (7f5b741c3d18): SELECT DISTINCT index_name AS index_name, non_unique AS non_unique, Binary column_name AS column_name, seq_in_index AS seq_in_index, Binary table_name AS table_name, sub_part AS partial, Binary collation AS column_order, Binary index_type AS index_type FROM INFORMATION_SCHEMA.STATISTICS WHERE table_schema = ? ORDER BY index_name, seq_in_index
mysql | Connection ID (thread ID): 9
mysql | Status: NOT_KILLED
mysql |
mysql | The manual page at http://dev.mysql.com/doc/mysql/en/crashing.html contains
mysql | information that should help you find out what is causing the crash.
mysql | 2022-08-20T17:35:38.199460Z 0 [Warning] TIMESTAMP with implicit DEFAULT value is deprecated. Please use --explicit_defaults_for_timestamp server option (see documentation for more details).
mysql | 2022-08-20T17:35:38.213287Z 0 [Note] mysqld (mysqld 5.7.28) starting as process 1 ...
mysql | 2022-08-20T17:35:38.223908Z 0 [Warning] Setting lower_case_table_names=2 because file system for /var/lib/mysql/ is case insensitive
mysql | 2022-08-20T17:35:38.226632Z 0 [Note] InnoDB: PUNCH HOLE support available
mysql | 2022-08-20T17:35:38.226720Z 0 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins
mysql | 2022-08-20T17:35:38.226770Z 0 [Note] InnoDB: Uses event mutexes
mysql | 2022-08-20T17:35:38.226802Z 0 [Note] InnoDB: GCC builtin __atomic_thread_fence() is used for memory barrier
mysql | 2022-08-20T17:35:38.227443Z 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
mysql | 2022-08-20T17:35:38.227562Z 0 [Note] InnoDB: Using Linux native AIO
mysql | 2022-08-20T17:35:38.228309Z 0 [Note] InnoDB: Number of pools: 1
mysql | 2022-08-20T17:35:38.228971Z 0 [Note] InnoDB: Using CPU crc32 instructions
mysql | 2022-08-20T17:35:38.231732Z 0 [Note] InnoDB: Initializing buffer pool, total size = 128M, instances = 1, chunk size = 128M
mysql | 2022-08-20T17:35:38.261921Z 0 [Note] InnoDB: Completed initialization of buffer pool
mysql | 2022-08-20T17:35:38.271931Z 0 [Note] InnoDB: If the mysqld execution user is authorized, page cleaner thread priority can be changed. See the man page of setpriority().
```
I also tried upgrading the Prisma packages to the latest version – it did not help.
### How to reproduce
Just clone my repo and follow the instructions.
https://github.com/storageddd/prisma-migration-issue
### Expected behavior
All migrations in dev mode run successfully.
### Prisma information
```
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "mysql"
url = env("DATABASE_URL")
}
model Room {
id Int @id @default(autoincrement())
sid String @unique
status RoomStatus @default(CREATED)
set RoomSet
tasks Task[]
users User[]
}
model Task {
id Int @id @default(autoincrement())
title String
status TaskStatus @default(CREATED)
active Boolean @default(false)
estimate String?
roomId Int
room Room @relation(fields: [roomId], references: [id], onDelete: Cascade)
estimates Estimate[]
}
model User {
id Int @id @default(autoincrement())
sid String
name String
role UserRole
roomId Int
online Boolean @default(true)
room Room @relation(fields: [roomId], references: [id], onDelete: Cascade)
estimates Estimate[]
}
model Estimate {
id Int @id @default(autoincrement())
userId Int
taskId Int
rate String?
task Task @relation(fields: [taskId], references: [id], onDelete: Cascade)
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
@@unique([userId, taskId])
}
enum RoomStatus {
CREATED
IN_PROGRESS
DONE
}
enum RoomSet {
FIBONACCI
FIBONACCI_MOD
STORY_POINTS
T_SHIRTS
}
enum UserRole {
MODERATOR
RECIPIENT
VIEWER
}
enum TaskStatus {
CREATED
IN_PROGRESS
DISCLOSE
SKIPPED
DONE
}
```
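As an aside, the `@@unique([userId, taskId])` attribute on `Estimate` corresponds to a composite unique constraint at the database level. A minimal SQLite sketch (an illustration of the constraint semantics, not Prisma's generated MySQL DDL):

```python
import sqlite3

# Composite UNIQUE over (userId, taskId), as @@unique([userId, taskId]) implies.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Estimate (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        userId INTEGER NOT NULL,
        taskId INTEGER NOT NULL,
        rate TEXT,
        UNIQUE (userId, taskId)
    )
""")
conn.execute("INSERT INTO Estimate (userId, taskId, rate) VALUES (1, 1, '5')")
try:
    # A second row with the same (userId, taskId) pair violates the constraint.
    conn.execute("INSERT INTO Estimate (userId, taskId, rate) VALUES (1, 1, '8')")
    duplicate_rejected = False
except sqlite3.IntegrityError:
    duplicate_rejected = True
```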
### Environment & setup
- OS: Mac OS
- Database: MySQL 5.7
- Node.js version: 16
### Prisma Version
```
prisma : 3.7.0
@prisma/client : 3.7.0
Current platform : darwin
Query Engine (Node-API) : libquery-engine 8746e055198f517658c08a0c426c7eec87f5a85f (at node_modules/@prisma/engines/libquery_engine-darwin.dylib.node)
Migration Engine : migration-engine-cli 8746e055198f517658c08a0c426c7eec87f5a85f (at node_modules/@prisma/engines/migration-engine-darwin)
Introspection Engine : introspection-core 8746e055198f517658c08a0c426c7eec87f5a85f (at node_modules/@prisma/engines/introspection-engine-darwin)
Format Binary : prisma-fmt 8746e055198f517658c08a0c426c7eec87f5a85f (at node_modules/@prisma/engines/prisma-fmt-darwin)
Default Engines Hash : 8746e055198f517658c08a0c426c7eec87f5a85f
Studio : 0.445.0
```
|
1.0
|
Migrate dev fails on migration with unique fields - ### Bug description
Hello, guys!
I have a very strange issue. I tried many solutions to resolve it, but with no results.
First, I created the init schema and applied some migrations after it. All worked fine.
For my next task I needed to add unique fields to one of my models.
OK, it seemed easy and I did it. The migration ran successfully.
But now when I run `prisma migrate dev` I get the error ```Error: P1017 Server has closed the connection```.
I did some debugging of the MySQL container and got this log:
```
| 2022-08-20T17:35:36.458136Z 9 [ERROR] InnoDB could not find key no 1 with name Estimate_userId_fkey from dict cache for table prisma_migrate_shadow_db_0d697eb8@002d1ad2@002d4b00@002dba85@002d01fe25c977f7/estimate
mysql | 17:35:36 UTC - mysqld got signal 11 ;
mysql | This could be because you hit a bug. It is also possible that this binary
mysql | or one of the libraries it was linked against is corrupt, improperly built,
mysql | or misconfigured. This error can also be caused by malfunctioning hardware.
mysql | Attempting to collect some information that could help diagnose the problem.
mysql | As this is a crash and something is definitely wrong, the information
mysql | collection process might fail.
mysql |
mysql | key_buffer_size=8388608
mysql | read_buffer_size=131072
mysql | max_used_connections=3
mysql | max_threads=151
mysql | thread_count=2
mysql | connection_count=2
mysql | It is possible that mysqld could use up to
mysql | key_buffer_size + (read_buffer_size + sort_buffer_size)*max_threads = 68196 K bytes of memory
mysql | Hope that's ok; if not, decrease some variables in the equation.
mysql |
mysql | Thread pointer: 0x7f5b74014b30
mysql | Attempting backtrace. You can use the following information to find out
mysql | where mysqld died. If you see no messages after this, something went
mysql | terribly wrong...
mysql | stack_bottom = 7f5b982e2e80 thread_stack 0x40000
mysql | mysqld(my_print_stacktrace+0x2c)[0x55f071d9633c]
mysql | mysqld(handle_fatal_signal+0x479)[0x55f0716ba0c9]
mysql | /lib/x86_64-linux-gnu/libpthread.so.0(+0x110e0)[0x7f5bbde1a0e0]
mysql | mysqld(_ZN11ha_innobase10index_typeEj+0x40)[0x55f071dc4120]
mysql | mysqld(+0xbb17bd)[0x55f071bb17bd]
mysql | mysqld(+0xbbd962)[0x55f071bbd962]
mysql | mysqld(_Z14get_all_tablesP3THDP10TABLE_LISTP4Item+0x6e2)[0x55f071bbe1b2]
mysql | mysqld(+0xba964d)[0x55f071ba964d]
mysql | mysqld(_Z24get_schema_tables_resultP4JOIN23enum_schema_table_state+0x195)[0x55f071bba155]
mysql | mysqld(_ZN4JOIN14prepare_resultEv+0x6d)[0x55f071b9f12d]
mysql | mysqld(_ZN4JOIN4execEv+0x8a)[0x55f071b3011a]
mysql | mysqld(_Z12handle_queryP3THDP3LEXP12Query_resultyy+0x233)[0x55f071b9fb03]
mysql | mysqld(+0x684362)[0x55f071684362]
mysql | mysqld(_Z21mysql_execute_commandP3THDb+0x462d)[0x55f071b630fd]
mysql | mysqld(_ZN18Prepared_statement7executeEP6Stringb+0x356)[0x55f071b8dd56]
mysql | mysqld(_ZN18Prepared_statement12execute_loopEP6StringbPhS2_+0xda)[0x55f071b90cba]
mysql | mysqld(_Z19mysqld_stmt_executeP3THDmmPhm+0xfa)[0x55f071b90faa]
mysql | mysqld(_Z16dispatch_commandP3THDPK8COM_DATA19enum_server_command+0x1a29)[0x55f071b67529]
mysql | mysqld(_Z10do_commandP3THD+0x197)[0x55f071b67f77]
mysql | mysqld(handle_connection+0x278)[0x55f071c25648]
mysql | mysqld(pfs_spawn_thread+0x1b4)[0x55f0720fd324]
mysql | /lib/x86_64-linux-gnu/libpthread.so.0(+0x74a4)[0x7f5bbde104a4]
mysql | /lib/x86_64-linux-gnu/libc.so.6(clone+0x3f)[0x7f5bbc65cd0f]
mysql |
mysql | Trying to get some variables.
mysql | Some pointers may be invalid and cause the dump to abort.
mysql | Query (7f5b741c3d18): SELECT DISTINCT index_name AS index_name, non_unique AS non_unique, Binary column_name AS column_name, seq_in_index AS seq_in_index, Binary table_name AS table_name, sub_part AS partial, Binary collation AS column_order, Binary index_type AS index_type FROM INFORMATION_SCHEMA.STATISTICS WHERE table_schema = ? ORDER BY index_name, seq_in_index
mysql | Connection ID (thread ID): 9
mysql | Status: NOT_KILLED
mysql |
mysql | The manual page at http://dev.mysql.com/doc/mysql/en/crashing.html contains
mysql | information that should help you find out what is causing the crash.
mysql | 2022-08-20T17:35:38.199460Z 0 [Warning] TIMESTAMP with implicit DEFAULT value is deprecated. Please use --explicit_defaults_for_timestamp server option (see documentation for more details).
mysql | 2022-08-20T17:35:38.213287Z 0 [Note] mysqld (mysqld 5.7.28) starting as process 1 ...
mysql | 2022-08-20T17:35:38.223908Z 0 [Warning] Setting lower_case_table_names=2 because file system for /var/lib/mysql/ is case insensitive
mysql | 2022-08-20T17:35:38.226632Z 0 [Note] InnoDB: PUNCH HOLE support available
mysql | 2022-08-20T17:35:38.226720Z 0 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins
mysql | 2022-08-20T17:35:38.226770Z 0 [Note] InnoDB: Uses event mutexes
mysql | 2022-08-20T17:35:38.226802Z 0 [Note] InnoDB: GCC builtin __atomic_thread_fence() is used for memory barrier
mysql | 2022-08-20T17:35:38.227443Z 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
mysql | 2022-08-20T17:35:38.227562Z 0 [Note] InnoDB: Using Linux native AIO
mysql | 2022-08-20T17:35:38.228309Z 0 [Note] InnoDB: Number of pools: 1
mysql | 2022-08-20T17:35:38.228971Z 0 [Note] InnoDB: Using CPU crc32 instructions
mysql | 2022-08-20T17:35:38.231732Z 0 [Note] InnoDB: Initializing buffer pool, total size = 128M, instances = 1, chunk size = 128M
mysql | 2022-08-20T17:35:38.261921Z 0 [Note] InnoDB: Completed initialization of buffer pool
mysql | 2022-08-20T17:35:38.271931Z 0 [Note] InnoDB: If the mysqld execution user is authorized, page cleaner thread priority can be changed. See the man page of setpriority().
```
I also tried upgrading the Prisma packages to the latest version – it did not help.
### How to reproduce
Just clone my repo and follow the instructions.
https://github.com/storageddd/prisma-migration-issue
### Expected behavior
All migrations in dev mode run successfully.
### Prisma information
```
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "mysql"
url = env("DATABASE_URL")
}
model Room {
id Int @id @default(autoincrement())
sid String @unique
status RoomStatus @default(CREATED)
set RoomSet
tasks Task[]
users User[]
}
model Task {
id Int @id @default(autoincrement())
title String
status TaskStatus @default(CREATED)
active Boolean @default(false)
estimate String?
roomId Int
room Room @relation(fields: [roomId], references: [id], onDelete: Cascade)
estimates Estimate[]
}
model User {
id Int @id @default(autoincrement())
sid String
name String
role UserRole
roomId Int
online Boolean @default(true)
room Room @relation(fields: [roomId], references: [id], onDelete: Cascade)
estimates Estimate[]
}
model Estimate {
id Int @id @default(autoincrement())
userId Int
taskId Int
rate String?
task Task @relation(fields: [taskId], references: [id], onDelete: Cascade)
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
@@unique([userId, taskId])
}
enum RoomStatus {
CREATED
IN_PROGRESS
DONE
}
enum RoomSet {
FIBONACCI
FIBONACCI_MOD
STORY_POINTS
T_SHIRTS
}
enum UserRole {
MODERATOR
RECIPIENT
VIEWER
}
enum TaskStatus {
CREATED
IN_PROGRESS
DISCLOSE
SKIPPED
DONE
}
```
### Environment & setup
- OS: Mac OS
- Database: MySQL 5.7
- Node.js version: 16
### Prisma Version
```
prisma : 3.7.0
@prisma/client : 3.7.0
Current platform : darwin
Query Engine (Node-API) : libquery-engine 8746e055198f517658c08a0c426c7eec87f5a85f (at node_modules/@prisma/engines/libquery_engine-darwin.dylib.node)
Migration Engine : migration-engine-cli 8746e055198f517658c08a0c426c7eec87f5a85f (at node_modules/@prisma/engines/migration-engine-darwin)
Introspection Engine : introspection-core 8746e055198f517658c08a0c426c7eec87f5a85f (at node_modules/@prisma/engines/introspection-engine-darwin)
Format Binary : prisma-fmt 8746e055198f517658c08a0c426c7eec87f5a85f (at node_modules/@prisma/engines/prisma-fmt-darwin)
Default Engines Hash : 8746e055198f517658c08a0c426c7eec87f5a85f
Studio : 0.445.0
```
|
process
|
migrate dev fails on migration with unique fields bug description hello guys i have very strange issue i tried too many solutions for resolve but have no results first i create init schema and apply some migrations after it all works fine in my next task i need to add unique fields to one of my models ok it seems easy and i made it migration runs successful but now if i run prisma migrate dev i catch error error server has closed the connection i make some debug of mysql container and also have log innodb could not find key no with name estimate userid fkey from dict cache for table prisma migrate shadow db estimate mysql utc mysqld got signal mysql this could be because you hit a bug it is also possible that this binary mysql or one of the libraries it was linked against is corrupt improperly built mysql or misconfigured this error can also be caused by malfunctioning hardware mysql attempting to collect some information that could help diagnose the problem mysql as this is a crash and something is definitely wrong the information mysql collection process might fail mysql mysql key buffer size mysql read buffer size mysql max used connections mysql max threads mysql thread count mysql connection count mysql it is possible that mysqld could use up to mysql key buffer size read buffer size sort buffer size max threads k bytes of memory mysql hope that s ok if not decrease some variables in the equation mysql mysql thread pointer mysql attempting backtrace you can use the following information to find out mysql where mysqld died if you see no messages after this something went mysql terribly wrong mysql stack bottom thread stack mysql mysqld my print stacktrace mysql mysqld handle fatal signal mysql lib linux gnu libpthread so mysql mysqld typeej mysql mysqld mysql mysqld mysql mysqld all mysql mysqld mysql mysqld schema tables schema table state mysql mysqld resultev mysql mysqld mysql mysqld resultyy mysql mysqld mysql mysqld execute mysql mysqld mysql mysqld mysql 
mysqld stmt mysql mysqld server command mysql mysqld mysql mysqld handle connection mysql mysqld pfs spawn thread mysql lib linux gnu libpthread so mysql lib linux gnu libc so clone mysql mysql trying to get some variables mysql some pointers may be invalid and cause the dump to abort mysql query select distinct index name as index name non unique as non unique binary column name as column name seq in index as seq in index binary table name as table name sub part as partial binary collation as column order binary index type as index type from information schema statistics where table schema order by index name seq in index mysql connection id thread id mysql status not killed mysql mysql the manual page at contains mysql information that should help you find out what is causing the crash mysql timestamp with implicit default value is deprecated please use explicit defaults for timestamp server option see documentation for more details mysql mysqld mysqld starting as process mysql setting lower case table names because file system for var lib mysql is case insensitive mysql innodb punch hole support available mysql innodb mutexes and rw locks use gcc atomic builtins mysql innodb uses event mutexes mysql innodb gcc builtin atomic thread fence is used for memory barrier mysql innodb compressed tables use zlib mysql innodb using linux native aio mysql innodb number of pools mysql innodb using cpu instructions mysql innodb initializing buffer pool total size instances chunk size mysql innodb completed initialization of buffer pool mysql innodb if the mysqld execution user is authorized page cleaner thread priority can be changed see the man page of setpriority also i tried to upgrade prisma packages to last version – it does not help how to reproduce just clone my repo and run by instructions expected behavior all migrations in dev mode runs successfully prisma information generator client provider prisma client js datasource db provider mysql url env database url model 
room id int id default autoincrement sid string unique status roomstatus default created set roomset tasks task users user model task id int id default autoincrement title string status taskstatus default created active boolean default false estimate string roomid int room room relation fields references ondelete cascade estimates estimate model user id int id default autoincrement sid string name string role userrole roomid int online boolean default true room room relation fields references ondelete cascade estimates estimate model estimate id int id default autoincrement userid int taskid int rate string task task relation fields references ondelete cascade user user relation fields references ondelete cascade unique enum roomstatus created in progress done enum roomset fibonacci fibonacci mod story points t shirts enum userrole moderator recipient viewer enum taskstatus created in progress disclose skipped done environment setup os mac os database mysql node js version prisma version prisma prisma client current platform darwin query engine node api libquery engine at node modules prisma engines libquery engine darwin dylib node migration engine migration engine cli at node modules prisma engines migration engine darwin introspection engine introspection core at node modules prisma engines introspection engine darwin format binary prisma fmt at node modules prisma engines prisma fmt darwin default engines hash studio
| 1
|
19,236
| 25,387,544,156
|
IssuesEvent
|
2022-11-21 23:32:11
|
AMDResearch/omnitrace
|
https://api.github.com/repos/AMDResearch/omnitrace
|
closed
|
Intermediate sampling flushing
|
enhancement sampling process sampling configuration
|
Right now, there is no way to limit the amount of sampling data stored in memory beyond setting `OMNITRACE_SAMPLING_DURATION`. Need to add a way to occasionally flush the data stored in memory and an option to configure it.
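A generic sketch of what such intermediate flushing could look like (a Python illustration of the idea, not Omnitrace's actual implementation; the `flush_threshold` parameter stands in for the proposed configuration option):

```python
import json

class SamplingBuffer:
    """Buffer samples in memory and flush them to disk whenever the
    in-memory count reaches a configurable threshold, bounding memory use."""

    def __init__(self, path, flush_threshold=1000):
        self.path = path
        self.flush_threshold = flush_threshold  # the proposed config knob
        self.buffer = []
        self.flush_count = 0

    def record(self, sample):
        self.buffer.append(sample)
        if len(self.buffer) >= self.flush_threshold:
            self.flush()

    def flush(self):
        # Append buffered samples as JSON Lines, then free the memory.
        if not self.buffer:
            return
        with open(self.path, "a") as f:
            for sample in self.buffer:
                f.write(json.dumps(sample) + "\n")
        self.flush_count += 1
        self.buffer.clear()
```

With a threshold of 2, recording five samples triggers two automatic flushes and leaves one sample buffered for a final flush at shutdown.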
|
1.0
|
Intermediate sampling flushing - Right now, there is no way to limit the amount of sampling data stored in memory beyond setting `OMNITRACE_SAMPLING_DURATION`. Need to add a way to occasionally flush the data stored in memory and an option to configure it.
|
process
|
intermediate sampling flushing right now there is no way to limit the amount of sampling data stored in memory beyond setting omnitrace sampling duration need to add a way to occasionally flush the data stored in memory and an option to configure it
| 1
|
57,185
| 7,043,611,273
|
IssuesEvent
|
2017-12-31 09:33:55
|
Jo-ravar/Tutorials
|
https://api.github.com/repos/Jo-ravar/Tutorials
|
reopened
|
[UI] Design a logo
|
Design
|
### Issue Description
- Come up with your own logo designs and add a screenshot here.
|
1.0
|
[UI] Design a logo - ### Issue Description
- Come up with your own logo designs and add a screenshot here.
|
non_process
|
design a logo issue description come up with your own logo designs and add screenshot here
| 0
|
20,686
| 27,357,198,374
|
IssuesEvent
|
2023-02-27 13:40:14
|
camunda/issues
|
https://api.github.com/repos/camunda/issues
|
opened
|
Improved visual feedback for Process Instance Modification
|
component:operate component:zeebe component:zeebe-process-automation public kind:epic potential:8.2
|
### Value Proposition Statement
Visual feedback after applying Process Instance Modification
### User Problem
**First problem:**
Currently, when I click "Apply" in the summary of modifications in Process Instance Modification, I get the toast notification:

but I don't see the changes in Operate - it takes a couple of seconds to apply the modification.
This is confusing behavior for the user, as the expectation is that when something is shown as modified, it will change. This is especially true for users who don't know that it takes time to send data between Operate->Zeebe->Operate.
Also, currently we do not communicate to the user that the command was not valid and modification cannot be applied.
The status of the finished modification is still shown as "in progress" in the Operations panel

**Second problem:**
We communicate to users that the new token they want to create won’t be affected by the cancel operation performed on the same flow node. But in reality, Zeebe applies activation instructions before the termination, which causes the newly added token to be canceled as well.
Also in the instance history this is visualized as ‘new token will be added’ and ‘ONLY existing tokens will be canceled’.
It’s just wrong on the Operate side: the user expects a new instance to appear, but it doesn’t.
### User Stories
- As a Developer, I can see failure in the send operation (modification command was not valid)
- As a Developer, I can see the status of the modification and the result of the batch operation (move, add, cancel)
- As a Developer, when I click the "apply", I know what is happening
- As a Developer, I know that if I apply action - add and apply on the same flow node, the newly added token will be canceled
- This involves only UX copy/frontend changes
### Implementation Notes
<!-- Notes to consider for implementation, for example:
* In Cawemo we already have the capability to manage templates via the feature that we call “catalog”
* What we would build now is the ability to a) use this feature in the web modeler to create templates and b) when the context pad opens for defining the type of a task, the templates that decorate service tasks are shown
* We should clarify terminology (integrations vs. connectors vs. job workers vs. element templates.) Particularly “element templates” might not be a term that a user intuitively understands.
* See these high level wireframes to capture the idea -->
### Validation Criteria
- Validate this with 2 community members
- Validate this with 2 customers
### Breakdown
> This section links to various sub-issues / -tasks contributing to respective epic phase or phase results where appropriate.
#### Discovery phase ##
<!-- Example: link to "Conduct customer interview with xyz" -->
#### Define phase ##
<!-- Consider: UI, UX, technical design, documentation design -->
<!-- Example: link to "Define User-Journey Flow" or "Define target architecture" -->
Design Planning
* Reviewed by design: Nov 2022
* Designer assigned: @gastonpillet01
* Assignee: Gaston
* Design Brief - [link to design brief](https://docs.google.com/document/d/19dyKN9XPTxfjkSWikO1A-_ZwoxwUEXGs5Tfb9T2lPMU/edit#)
* [CPIM Research report ](https://docs.google.com/document/d/1a7cY5Rz-m2cVSabJfRGBHgYslAaASsf9TDiLiqPikuI/edit#heading=h.cxixej5665y)
Design Deliverables
* HFW [Link to Figma](https://www.figma.com/file/JR6tC1y0RfmJMIUzppJ8gW/Improved-visual-feedback-for-Process-Instance-Modification-%23638?node-id=307%3A1011)
Documentation Planning
<!-- Complex changes must be reviewed during the Define phase by the DRI of Documentation or technical writer. -->
<!-- Briefly describe the anticipated impact to documentation. -->
<!-- Example: "Creates structural changes in docs as UX is reworked." _Add docs reviewer to Epic for feedback._ -->
Risk Management <!-- add link to risk management issue -->
* Risk Class: <!-- e.g. very low | low | medium | high | very high -->
* Risk Treatment: <!-- e.g. avoid | mitigate | transfer | accept -->
#### Implement phase ##
<!-- Example: link to "Implement User Story xyz". Should not only include core implementation, but also documentation. -->
#### Validate phase ##
<!-- Example: link to "Evaluate usage data of last quarter" -->
### Links to additional collateral
<!-- Example: link to relevant support cases -->
|
1.0
|
Improved visual feedback for Process Instance Modification - ### Value Proposition Statement
Visual feedback after applying Process Instance Modification
### User Problem
**First problem:**
Currently, when I click "Apply" in the summary of modifications in Process Instance Modification, I get the toast notification:

but I don't see the changes in Operate - it takes a couple of seconds to apply the modification.
This is confusing behavior for the user, as the expectation is that when something is shown as modified, it will change. This is especially true for users who don't know that it takes time to send data between Operate->Zeebe->Operate.
Also, currently we do not communicate to the user that the command was not valid and modification cannot be applied.
The status of the finished modification is shown as "in progress" in the Operations panel

**Second problem:**
We communicate to users that the new token they want to create won't be affected by the cancel operation performed on the same flow node. But in reality, Zeebe applies activation instructions before the termination. This leads the newly added token to be canceled as well.
Also in the instance history this is visualized as ‘new token will be added’ and ‘ONLY existing tokens will be canceled’.
It’s just wrong on the Operate side. The user expects a new instance to appear, but it doesn’t
### User Stories
- As a Developer, I can see failure in the send operation (modification command was not valid)
- As a Developer, I can see the status of the modification and the result of the batch operation (move, add, cancel)
- As a Developer, when I click the "apply", I know what is happening
- As a Developer, I know that if I apply action - add and apply on the same flow node, the newly added token will be canceled
- This involves only UX copy/frontend changes
### Implementation Notes
<!-- Notes to consider for implementation, for example:
* In Cawemo we already have the capability to manage templates via the feature that we call “catalog”
* What we would build now is the ability to a) use this feature in the web modeler to create templates and b) when the context pad opens for defining the type of a task, the templates that decorate service tasks are shown
* We should clarify terminology (integrations vs. connectors vs. job workers vs. element templates.) Particularly “element templates” might not be a term that a user intuitively understands.
* See these high level wireframes to capture the idea -->
### Validation Criteria
- Validate this with 2 community members
- Validate this with 2 customers
### Breakdown
> This section links to various sub-issues / -tasks contributing to respective epic phase or phase results where appropriate.
#### Discovery phase ##
<!-- Example: link to "Conduct customer interview with xyz" -->
#### Define phase ##
<!-- Consider: UI, UX, technical design, documentation design -->
<!-- Example: link to "Define User-Journey Flow" or "Define target architecture" -->
Design Planning
* Reviewed by design: Nov 2022
* Designer assigned: @gastonpillet01
* Assignee: Gaston
* Design Brief - [link to design brief](https://docs.google.com/document/d/19dyKN9XPTxfjkSWikO1A-_ZwoxwUEXGs5Tfb9T2lPMU/edit#)
* [CPIM Research report ](https://docs.google.com/document/d/1a7cY5Rz-m2cVSabJfRGBHgYslAaASsf9TDiLiqPikuI/edit#heading=h.cxixej5665y)
Design Deliverables
* HFW [Link to Figma](https://www.figma.com/file/JR6tC1y0RfmJMIUzppJ8gW/Improved-visual-feedback-for-Process-Instance-Modification-%23638?node-id=307%3A1011)
Documentation Planning
<!-- Complex changes must be reviewed during the Define phase by the DRI of Documentation or technical writer. -->
<!-- Briefly describe the anticipated impact to documentation. -->
<!-- Example: "Creates structural changes in docs as UX is reworked." _Add docs reviewer to Epic for feedback._ -->
Risk Management <!-- add link to risk management issue -->
* Risk Class: <!-- e.g. very low | low | medium | high | very high -->
* Risk Treatment: <!-- e.g. avoid | mitigate | transfer | accept -->
#### Implement phase ##
<!-- Example: link to "Implement User Story xyz". Should not only include core implementation, but also documentation. -->
#### Validate phase ##
<!-- Example: link to "Evaluate usage data of last quarter" -->
### Links to additional collateral
<!-- Example: link to relevant support cases -->
|
process
|
improved visual feedback for process instance modification value proposition statement visual feedback after applying process instance modification user problem first problem currently when i click apply in the summary of modifications in process instance modification i get the toast notification but i don t see the changes in operate it takes a couple of seconds to apply the modification this is a confusing behavior for user as the expectation is that when i see something was modified it will change this is especially true for users who don t know that it takes time to send data between operate zeebe operate also currently we do not communicate to the user that the command was not valid and modification cannot be applied status of the finished modification is in progress in the operations panel second problem we communicate to users that the new token they want to create won’t be affected of the cancel operation performed on the same flow node but in reality zeebe applies activation instructions before the termination this leads the new added token to be canceled as well also in the instance history this is visualized as ‘new token will be added’ and ‘only existing tokens will be canceled’ it’s just wrong on operate side user expects a new instance to appear but it doesn’t user stories as a developer i can see failure in the send operation modification command was not valid as a developer i can see the status of the modification and the result of the batch operation move add cancel as a developer when i click the apply i know what is happening as a developer i know that if i apply action add and apply on the same flow node the newly added token will be canceled this involves only ux copy frontend changes implementation notes notes to consider for implementation for example in cawemo we already have the capability to manage templates via the feature that we call “catalog” what we would build now is the ability to a use this feature in the web modeler to create 
templates and b when the context pad opens for defining the type of a task the templates that decorate service tasks are shown we should clarify terminology integrations vs connectors vs job workers vs element templates particularly “element templates” might not be a term that a user intuitively understands see these high level wireframes to capture the idea validation criteria validate this with community members validate this with customers breakdown this section links to various sub issues tasks contributing to respective epic phase or phase results where appropriate discovery phase define phase design planning reviewed by design nov designer assigned assignee gaston design brief design deliverables hfw documentation planning risk management risk class risk treatment implement phase validate phase links to additional collateral
| 1
|
46,918
| 13,196,267,739
|
IssuesEvent
|
2020-08-13 20:18:31
|
onlinejudge95/rinnegan
|
https://api.github.com/repos/onlinejudge95/rinnegan
|
closed
|
Password validation during login
|
bug python security server
|
**Describe the bug**
During Sign-In form validation, wrong passwords are not checked
**To Reproduce**
Steps to reproduce the behavior:
1. Go to http://localhost:3000/login
2. Submit correct email ID
3. Submit a wrong password
4. User is logged in
**Expected behavior**
If the password is not the one used by the user during Sign-Up, login should not succeed
|
True
|
Password validation during login - **Describe the bug**
During Sign-In form validation, wrong passwords are not checked
**To Reproduce**
Steps to reproduce the behavior:
1. Go to http://localhost:3000/login
2. Submit correct email ID
3. Submit a wrong password
4. User is logged in
**Expected behavior**
If the password is not the one used by the user during Sign-Up, login should not succeed
|
non_process
|
password validation during login describe the bug during sign in form validation wrong passwords are not checked to reproduce steps to reproduce the behavior go to submit correct email id submit a wrong password user is logged in expected behavior if the password is not the one used by the user during sign up login should not succeed
| 0
|
16,634
| 21,706,735,472
|
IssuesEvent
|
2022-05-10 10:16:33
|
camunda/zeebe
|
https://api.github.com/repos/camunda/zeebe
|
closed
|
Extract index name and record routing logic from ElasticsearchClient
|
kind/toil team/process-automation area/maintainability
|
**Description**
The index names, the record's document IDs, and the routing logic are all part of a public API which consumers of the exported records rely on. To ensure this is tested properly, we can extract that logic out of the `ElasticsearchClient` class and into its own SRP class.
|
1.0
|
Extract index name and record routing logic from ElasticsearchClient - **Description**
The index names, the record's document IDs, and the routing logic are all part of a public API which consumers of the exported records rely on. To ensure this is tested properly, we can extract that logic out of the `ElasticsearchClient` class and into its own SRP class.
|
process
|
extract index name and record routing logic from elasticsearchclient description the index names the record s document ids and the routing logic are all part of a public api which consumers of the exported records rely on to ensure this is tested properly we can extract that logic out of the elasticsearchclient class and into its own srp class
| 1
|
346,524
| 10,416,379,296
|
IssuesEvent
|
2019-09-14 13:06:23
|
A-Team-Rowan-University/a-team-website
|
https://api.github.com/repos/A-Team-Rowan-University/a-team-website
|
opened
|
Sync database to filesystem
|
priority-medium
|
Right now, all the database data lives in the docker container. This means we can never bring down that container, which makes some things more difficult and perilous. It would be better to store the actual data somewhere more permanent, like the /var partition in the host filesystem. This may also make backing up easier.
|
1.0
|
Sync database to filesystem - Right now, all the database data lives in the docker container. This means we can never bring down that container, which makes some things more difficult and perilous. It would be better to store the actual data somewhere more permanent, like the /var partition in the host filesystem. This may also make backing up easier.
|
non_process
|
sync database to filesystem right now all the database data lives in the docker container this means we can never bring down that container which makes some things more difficult and perilous it would be better to store the actual data somewhere more permanent like the var partition in the host filesystem this may also make backing up easier
| 0
|
85,488
| 10,618,355,522
|
IssuesEvent
|
2019-10-13 03:40:09
|
carbon-design-system/ibm-dotcom-library
|
https://api.github.com/repos/carbon-design-system/ibm-dotcom-library
|
opened
|
Pattern: promo, visual design study
|
design design: research design: ux design: visual migrate package: patterns
|
_Wonil-Suh1 created the following on Sep 08:_
### User story
As an IBM.com user, I need a way to learn about a promotion related to the product/solution/offering presented on the page so that I can decide whether I should follow through it or not.
### Notes
Current design is typically 50px high and small. Here is a design exploration that allows a bit more space and potentially a better user experience.

### Acceptance criteria
- [ ] Patterns used by competitors have been collected
- [ ] multiple design directions have been explored
- [ ] Two design directions have been identified
_Original issue: https://github.ibm.com/webstandards/digital-design/issues/1614_
|
4.0
|
Pattern: promo, visual design study - _Wonil-Suh1 created the following on Sep 08:_
### User story
As an IBM.com user, I need a way to learn about a promotion related to the product/solution/offering presented on the page so that I can decide whether I should follow through it or not.
### Notes
Current design is typically 50px high and small. Here is a design exploration that allows a bit more space and potentially a better user experience.

### Acceptance criteria
- [ ] Patterns used by competitors have been collected
- [ ] multiple design directions have been explored
- [ ] Two design directions have been identified
_Original issue: https://github.ibm.com/webstandards/digital-design/issues/1614_
|
non_process
|
pattern promo visual design study wonil created the following on sep user story as an ibm com user i need a way to learn about a promotion related to the product solution offering presented on the page so that i can decide whether i should follow through it or not notes current design is typically high and small here is a design exploration that allows a bit more space and potentially a better user experience acceptance criteria patterns used by competitors have been collected multiple design directions have been explored two design directions have been identified original issue
| 0
|
257,494
| 8,138,153,268
|
IssuesEvent
|
2018-08-20 14:00:41
|
minishift/minishift
|
https://api.github.com/repos/minishift/minishift
|
closed
|
minishift start should return non-zero exit code when it fails with error "could not set oc CLI context for 'minishift' profile"
|
kind/bug priority/major
|
### General information
* Minishift version:
* OS: Linux / macOS / Windows
* Hypervisor: KVM / xhyve / Hyper-V / VirtualBox
In the code we do not exit when we see the error #2679 `could not set oc CLI context for 'minishift' profile` , hence `minishift start` returns `0` as the exit code when it should return a non-zero exit code.
```code
if isRestart {
err = cmdUtil.SetOcContext(minishiftConfig.AllInstancesConfig.ActiveProfile)
if err != nil {
fmt.Println(fmt.Sprintf("Could not set oc CLI context for '%s' profile: %v", profileActions.GetActiveProfile(), err))
}
}
```
### Steps to reproduce
The issue #2679 is fixed now, but if you revert https://github.com/minishift/minishift/commit/6c901e8c31f35d021ee9423312c90b29e0c67040 you can reproduce this issue as it is only visible when code errors out `Could not set oc CLI context for 'minishift' profile: Error during setting 'minishift' as active profile: The specified path to oc '' does not exist`.
1. minishift start
2. minishift stop
3. minishift start
4. echo $?
### Expected
```
$ minishift start
-- Starting profile 'minishift'
xxxxxxx
Starting OpenShift using openshift/origin:v3.9.0 ...
OpenShift server started.
The server is accessible via web console at:
https://192.168.64.47:8443
Could not set oc CLI context for 'minishift' profile: Error during setting 'minishift' as active profile: The specified path to oc '' does not exist
```
```
$ echo $?
1
```
### Actual
```
$ minishift start
-- Starting profile 'minishift'
xxxxxxx
Starting OpenShift using openshift/origin:v3.9.0 ...
OpenShift server started.
The server is accessible via web console at:
https://192.168.64.47:8443
Could not set oc CLI context for 'minishift' profile: Error during setting 'minishift' as active profile: The specified path to oc '' does not exist
```
```
$ echo $?
0
```
|
1.0
|
minishift start should return non-zero exit code when it fails with error "could not set oc CLI context for 'minishift' profile" - ### General information
* Minishift version:
* OS: Linux / macOS / Windows
* Hypervisor: KVM / xhyve / Hyper-V / VirtualBox
In the code we do not exit when we see the error #2679 `could not set oc CLI context for 'minishift' profile` , hence `minishift start` returns `0` as the exit code when it should return a non-zero exit code.
```code
if isRestart {
err = cmdUtil.SetOcContext(minishiftConfig.AllInstancesConfig.ActiveProfile)
if err != nil {
fmt.Println(fmt.Sprintf("Could not set oc CLI context for '%s' profile: %v", profileActions.GetActiveProfile(), err))
}
}
```
### Steps to reproduce
The issue #2679 is fixed now, but if you revert https://github.com/minishift/minishift/commit/6c901e8c31f35d021ee9423312c90b29e0c67040 you can reproduce this issue as it is only visible when code errors out `Could not set oc CLI context for 'minishift' profile: Error during setting 'minishift' as active profile: The specified path to oc '' does not exist`.
1. minishift start
2. minishift stop
3. minishift start
4. echo $?
### Expected
```
$ minishift start
-- Starting profile 'minishift'
xxxxxxx
Starting OpenShift using openshift/origin:v3.9.0 ...
OpenShift server started.
The server is accessible via web console at:
https://192.168.64.47:8443
Could not set oc CLI context for 'minishift' profile: Error during setting 'minishift' as active profile: The specified path to oc '' does not exist
```
```
$ echo $?
1
```
### Actual
```
$ minishift start
-- Starting profile 'minishift'
xxxxxxx
Starting OpenShift using openshift/origin:v3.9.0 ...
OpenShift server started.
The server is accessible via web console at:
https://192.168.64.47:8443
Could not set oc CLI context for 'minishift' profile: Error during setting 'minishift' as active profile: The specified path to oc '' does not exist
```
```
$ echo $?
0
```
|
non_process
|
minishift start should return non zero exit code when it fails with error could not set oc cli context for minishift profile general information minishift version os linux macos windows hypervisor kvm xhyve hyper v virtualbox in the code we do not exit when we see the error could not set oc cli context for minishift profile hence minishift start returns as the exit code when it should return a non zero exit code code if isrestart err cmdutil setoccontext minishiftconfig allinstancesconfig activeprofile if err nil fmt println fmt sprintf could not set oc cli context for s profile v profileactions getactiveprofile err steps to reproduce the issue is fixed now but if you revert you can reproduce this issue as it is only visible when code errors out could not set oc cli context for minishift profile error during setting minishift as active profile the specified path to oc does not exist minishift start minishift stop minishift start echo expected minishift start starting profile minishift xxxxxxx starting openshift using openshift origin openshift server started the server is accessible via web console at could not set oc cli context for minishift profile error during setting minishift as active profile the specified path to oc does not exist echo actual minishift start starting profile minishift xxxxxxx starting openshift using openshift origin openshift server started the server is accessible via web console at could not set oc cli context for minishift profile error during setting minishift as active profile the specified path to oc does not exist echo
| 0
|
21,995
| 30,490,127,988
|
IssuesEvent
|
2023-07-18 07:05:05
|
fkdl0048/BookReview
|
https://api.github.com/repos/fkdl0048/BookReview
|
closed
|
10장 디자인 패턴
|
2023 The Object-Oriented Thought Process
|
[Link](https://github.com/fkdl0048/BookReview/blob/main/The_Object-Oriented_Thought_Process/Chapter10.md)
## 10. 디자인 패턴
업무용 소프트웨어 시스템을 작성하려면 개발자는 사업방식을 완전히 이해해야 한다.
결과적으로 개발자는 종종 회사의 업무 과정에 대해 가장 친밀한 지식을 갖게 된다.
우리가 포유류 클래스를 만들 때, 모든 포유류가 특정 행위와 특성을 공유하기 때문에 이 포유류 클래스를 사용하면 개나 고양이 등의 클래스들을 수없이 만들 수 있다.
이렇게 하다 보면 개, 고양이, 다람쥐 및 기타 포유류를 연구할 때 효과적인데, 이는 **패턴**을 발견할 수 있기 때문이다.
이런 패턴을 기반으로 특정 동물을 살펴보고 그게 포유류인지 아니면 행위와 특성의 패턴이 포유류와 다른 파충류인지 판단할 수 있다.
역사적으로 우리는 이러한 패턴을 사용해왔다.
이러한 패턴은 소프트웨어 분야의 중요한 부분으로 소프트웨어 재사용형태와 밀접한 관련이 있다.(디자인 패턴)
패턴은 재사용 가능한 소프트웨어 개발 개념에 아주 적합하다.
객체지향 개발은 모두 재사용에 관한 것이므로 패턴과 객체지향 개발은 함께 진보한다.
디자인 패턴의 기본 개념은 모범 사례의 원칙과 관련이 있다.
모범 사례에 따르면 우수하고 효율적인 솔루션이 만들어질 때, 이러한 솔루션은 다른 사람들이 실패로부터 배웠던 방식과 이전의 성공에 따른 이익을 얻을 수 있었던 방식을 문서화한다는 것을 의미한다.
### 10.1. 디자인 패턴이 필요한 이유
디자인 패턴의 개념이 반드시 재사용 가능한 소프트웨어의 필요성에서 시작된 것은 아니다.
디자인 패턴에 대한 중요한 업적은 건물과 도시 건설에 관련되어 있다.
패턴은 우리 환경에서 반복적으로 발생하는 문제들을 기술하며, 그 문제에 대한 핵심 해법도 기술하는데, 이런 방식 덕분에 우리는 그러한 해법을 백만 번 거듭해서 사용할 수 있으며, 그럼에도 같은 방식을 두 번 다시 따르지 않아도 된다.
#### 10.1.1. 패턴의 네 가지 요소
*GoF는 패턴을 다음 네 가지 필수 요소로 설명한다.*
- 패턴 이름
- 패턴 이름은 디자인 문제, 솔루션 및 결과를 한두 단어로 설명하는 데 사용할 수 있는 조종간 같은 것
- 패턴에 이름을 붙여두면 어휘가 즉시 늘어나며 더 높은 수준으로 설계를 추상화할 수 있다.
- 패턴에 대한 어휘를 갖추면 동료와 소통할 수 있고 문서로 교신할 수 있으며 자신과도 이야기 할 수 있다.
- 이해하기 쉬워지면 절충안 설계를 다른 사람들에게 쉽게 전달할 수 있다.
- 문제
- 문제는 패턴을 적용할 시점을 설명한다.
- 이것으로 문제 자체와 내용을 설명할 수 있다.
- 알고리즘을 객체로 표현하는 방법과 같은 특정 설계 문제들을 설명할 수 있다.
- 융통성이 없는 설계의 증상인 클래스 구조나 객체 구조를 설명할 수 있다.
- 해법
- 해법은 설계, 관계, 책임 및 협업을 이루는 요소를 설명한다.
- 패턴은 여러 상황에 적용할 수 있는 템플릿과 같기 때문에, 해법은 구체적인 특정 설계나 구현을 설명하지 않는다.
- 그 대신 패턴으로는 설계 문제를 추상적으로 설명하며, 요소를 일반적으로 배치해 설계 문제를 어떻게 해결하는지 보여준다.
- 귀결
- 귀결이란 패턴을 적용한 결과와 절충점을 말한다.
- 귀결이 종종 무시되기도 하지만, 우리가 설계 결정 내용을 설명할 때는 귀결이 설계 대안을 평가하고 적용 패턴의 비용과 이점을 이해하는데 중요하다.
- 소프트웨어 귀결은 시공간상의 절충과 관련이 있다.
- 귀결로 언어 문제 및 구현 문제도 해결할 수 있다.
### 10.2. 스몰토크의 모델/뷰/컨트롤러
MVC는 종종 디자인 패턴의 기원을 설명하는 데 사용된다.
모델/뷰/컨트롤러(MVC)라는 패러다임은 스몰토크에서 사용자 인터페이스를 생성하는데 사용했다.
> 스몰토크
> 스몰토크는 그 당시 가장 인기있는 객체지향 언어였다.
Design Patterns에서는 다음과 같은 방식으로 MVC 컴포넌트들을 정의한다.
모델(model)은 애플리케이션 객체이고 뷰(view)는 화면 표현이며, 컨트롤러(controller)는 사용자 인터페이스가 사용자 입력에 반응하는 방식을 정의한다.
이전 패러다임에서는 모델, 뷰, 컨트롤러가 단일 엔터티 안에 모두 모여 있는 문제가 있었다.
예를 들어, 단일 객체에 이 세 가지 컴포넌트가 모두 들어 있는 식이다.
MVC패러다임에서는 이 세 가지 컴포넌트에 개별적이며 구별되는 인터페이스들이 있다.
따라서 우리는 어떤 애플리케이션의 사용자 인터페이스를 변경하려고 한다면 뷰만 변경하면 된다.
이는 객체지향 개발의`인터페이스 대 구현부`와 관련이 있다.
가능한 인터페이스와 구현부를 분리하려고 한다.
또한 인터페이스들끼리도 서로 최대한 분리하려고 한다.
MVC는 아주 일반적이면서도 기본적인 프로그래밍 문제와 관련된 특정 컴포넌트 간의 인터페이스를 명시적으로 정의한다.
MVC 개념을 따르고 사용자 인터페이스, 비즈니스 로직 및 데이터를 분리하면서 시스템이 훨씬 유연하고 강력해진다.
예를 들어, 사용자 인터페이스가 클라이언트 시스템에 있고, 비즈니스 로직이 애플리케이션 서버에 있고, 데이터가 데이터 서버에 있다고 가정한다면 개발 도중에 비즈니스 로직에 영향 없이 GUI의 형태를 변경할 수 있다.
> **MVC의 단점**
> MVC는 훌륭한 디자인이지만, 선행 디자인에 많은 주의를 기울여야 한다는 점에서 복잡할 수 있다.
> 이는 일반적인 객체지향 디자인의 문제이다.
> 좋은 디자인과 성가신 디자인 사이에는 미세한 경계선이 있다.
> 그렇다면 과연 완전한 디자인과 관련해 시스템이 얼마나 복잡해야 하는가?
> 정답은 없고, 항상 트레이드 오프를 생각하자.
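위에서 설명한 모델/뷰/컨트롤러 분리를 아주 단순화한 스케치다. `CounterModel` 등의 클래스 이름은 설명을 위해 임의로 정한 가상의 것이다.

```java
// 모델: 애플리케이션 데이터와 비즈니스 로직만 담당한다
class CounterModel {
    private int count = 0;
    public void increment() { count++; }
    public int getCount() { return count; }
}

// 뷰: 화면 표현만 담당한다 (여기서는 문자열 생성으로 대신한다)
class CounterView {
    public String render(int count) {
        return "Count: " + count;
    }
}

// 컨트롤러: 사용자 입력을 받아 모델을 갱신하고 뷰로 결과를 만든다
class CounterController {
    private final CounterModel model;
    private final CounterView view;

    public CounterController(CounterModel model, CounterView view) {
        this.model = model;
        this.view = view;
    }

    public String onButtonClick() {
        model.increment();                     // 상태 변경은 모델에서만 일어난다
        return view.render(model.getCount()); // 표현은 뷰에서만 일어난다
    }
}

public class MvcSketch {
    public static void main(String[] args) {
        CounterController controller =
            new CounterController(new CounterModel(), new CounterView());
        System.out.println(controller.onButtonClick()); // Count: 1
        System.out.println(controller.onButtonClick()); // Count: 2
    }
}
```

뷰의 `render`만 바꾸면 모델과 컨트롤러에 영향 없이 화면 표현을 교체할 수 있다는 점이 본문에서 말한 유연성의 핵심이다.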
### 10.3. 디자인 패턴의 종류
Design Patterns에는 세 가지 범주로 분류된 총 23개의 패턴이 있다.
- 생성 패턴(creational patterns)
- 객체를 직접 인스턴스화하지 않은 채로 객체를 만든다.
- 이를 통해 특정 사례에 대해 어떤 객체를 생성해야 할지 결정할 때 프로그램의 유연성이 향상된다.
- 구조 패턴(structural patterns)
- 복잡한 사용자 인터페이스나 계정 데이터와 같은 객체 그룹을 더 큰 구조로 합성할 수 있다.
- 행위 패턴(behavioral patterns)
- 시스템의 객체 간 통신을 정의하고 복잡한 프로그램에서 흐름을 제어하는 방법을 정의한다.
#### 10.3.1. 생성 패턴
생성 패턴은 다음 범주별로 나눠볼 수 있다.
- 추상 팩토리(Abstract Factory)
- 빌더(Builder)
- 팩토리 메서드(Factory Method)
- 프로토타입(Prototype)
- 싱글톤(Singleton)
이 책에선 각 패턴을 자세하게 다루지 않고 디자인 패턴이 뭔지 설명하는 내용이 주를 이룬다.
```
생각
확실히 멘토님께서 디자인 패턴먼저 공부하지 말라고 하신 이유를 다시한번 느낀다.
게임 프로그래밍에 적합한 디자인 패턴도 물론 있지만 해당 디자인 패턴을 공부한다고 억지로 적용하면 오히려 독이다.
현재 구조에서 필요한 범주 생성인지, 행위인지, 구조인지 필요성을 느낄 때 해당 범주내에서 적합한 디자인 패턴을 공부하고 적용하는 것이 더 효율적이고 기억에 남는다는 것.
공부의 순서는 없지만 적합한 방법은 있는 것 같다.
```
##### 팩토리 메서드 디자인 패턴
객체 생성, 즉 인스턴스화는 객체지향 프로그래밍에서 가장 기본적인 개념 중에 하나일 수 있다.
해당 객체가 존재하지 않으면 객체를 사용할 수 없다는 것은 말할 필요도 없다.
코드를 작성할 때 객체를 인스턴스화하는 가장 확실한 방법은 new키워드를 사용하는 것이다.
```java
abstract class Shape{
}
class Circle extends Shape{
}
Circle circle = new Circle();
```
이 코드는 문제없이 동작하지만, 코드에서 Circle이나 해당 문제의 다른 도형들을 인스턴스화 해야 할 곳이 많을 것이다.
대부분의 경우에 Shape를 만들 때마다 처리해야 하는 특정 객체 생성 매개변수가 있을 것이다.
결과적으로 객체 생성 방식을 변경할 때마다 Shape 객체가 인스턴스화되는 모든 위치에서 코드를 변경해야 한다.
한 곳을 변경하면 잠재적으로 다른 많은 곳에서 코드를 변경해야 하므로 코드가 밀접하게 묶이게 된다.
이 접근법의 또 다른 문제점은 클래스를 사용해 **프로그래머에게 객체 작성 로직을 노출**시킨다는 것이다.
이러한 상황을 해결하기 위해 팩토리 메서드 패턴을 구현할 수 있다.
*[팩토리 메서드 패턴 정리 글](https://fkdl0048.github.io/patterns/Patterns_FactoryMethod/)*
팩토리 메서드는 모든 인스턴스화를 캡슐화해 구현 전반에 걸쳐 균일해야 한다.
팩토리를 사용해 인스턴스화하면, 팩토리는 적절하게 인스턴스화한다.
##### 팩토리 메서드 패턴
팩토리 메서드 패턴의 근본적인 목적은 정확한 클래스를 지정하지 않고도 객체를 생성하는 일과 사실상 인터페이스를 사용해 새로운 객체 유형을 생성하는 일을 담당하는 것이다.
어떤 면에서는 팩토리를 래퍼라고 생각할 수 있다.(그렇게 생각했었다..)
객체를 인스턴스화하는 데 중요한 로직이 있을 수 있으며, 우리는 프로그래머(사용자)가 이 로직에 관심을 갖지 않았으면 한다.
즉, 값을 검색하는 로직이 일부 로직 내부에 있을 때 접근자 메서드의 개념과 거의 같다.(게터 세터)
필요한 특정 클래스를 미리 알 수 없을 때 팩토리 메서드를 사용하면 된다.
즉, 이번 예제의 모든 클래스는 Shape의 서브클래스여야 한다.
사실 팩토리 메서드는 필요한걸 정확하게 모를 때 사용된다.
나중에 클래스의 일부를 추가할 수 있게 되며 필요한 게 무엇인지를 알고 있다면 생성자나 세터 메서드를 통해 인스턴스를 주입할 수 있다.
기본적으로 이게 다형성에 대한 정의이다.
위 링크에도 피자로 같은 예제가 있지만 책의 예제로 한번 더 복습한다.
enum형태로 도형의 종류를 정의한다.
```java
enum ShapeType{
CIRCLE,
RECTANGLE,
TRIANGLE
}
```
생성자와 generate()라고 부르는 추상 메서드만으로 Shape 클래스를 추상적인 것으로 정의할 수 있다.
```java
abstract class Shape{
private ShapeType shapeType = null;
public Shape(ShapeType shapeType){
this.shapeType = shapeType;
}
public abstract void generate();
}
```
Circle, Rectangle, Triangle 클래스는 Shape 클래스를 상속받아 구현한다.
```java
class Circle extends Shape{
public Circle(){
super(ShapeType.CIRCLE);
generate();
}
@Override
public void generate(){
System.out.println("Circle");
}
}
class Rectangle extends Shape{
public Rectangle(){
super(ShapeType.RECTANGLE);
generate();
}
@Override
public void generate(){
System.out.println("Rectangle");
}
}
class Triangle extends Shape{
public Triangle(){
super(ShapeType.TRIANGLE);
generate();
}
@Override
public void generate(){
System.out.println("Triangle");
}
}
```
이름에서 알 수 있듯이 ShapeFactory 클래스는 실제적인 팩토리다.
generate()메서드에 초점이 맞춰져 있다.
팩토리는 많은 장점을 제공하지만, generate()메서드는 실제로 Shape를 인스턴스화하는 애플리케이션 내의 유일한 위치이다.
```java
class ShapeFactory{
public static Shape generateShape(ShapeType shapeType){
Shape shape = null;
switch(shapeType){
case CIRCLE:
shape = new Circle();
break;
case RECTANGLE:
shape = new Rectangle();
break;
case TRIANGLE:
shape = new Triangle();
break;
default:
// 예외
break;
}
return shape;
}
}
```
```
생각
모든 디자인 패턴은 핵심적인 부분은 비슷하지만 각각의 패턴도 구현하는 언어, 스타일, 컨벤션의 차이가 다 다르다.
자바라서 그런지 모르겠지만 default보다 차라리 예외를 아래서 잡고 각각 리턴을 하거나, Shape 자체에 static메서드로 두거나, 인터페이스 DI로 좀 더 유연성을 강화할 수도 있다.
개인의 구현 차이와 해당 프로젝트의 규모, 팀 컨벤션을 고려해서 작성하는 것이 바람직한 것 같다.
```
이러한 개별 객체를 인스턴스화하는 전통적인 접근 방식은 프로그래머가 다음과 같이 new 키워드를 사용해 객체를 직접 인스턴스화 하는 것이다.
```java
public class TestFactoryPattern{
public static void main(String[] args){
Shape circle = new Circle();
Shape rectangle = new Rectangle();
Shape triangle = new Triangle();
}
}
```
그러나 팩토리 메서드 패턴을 사용하면 다음과 같이 객체를 인스턴스화 할 수 있다.
```java
public class TestFactoryPattern{
public static void main(String[] args){
Shape circle = ShapeFactory.generateShape(ShapeType.CIRCLE);
Shape rectangle = ShapeFactory.generateShape(ShapeType.RECTANGLE);
Shape triangle = ShapeFactory.generateShape(ShapeType.TRIANGLE);
}
}
```
더 나아가서 `C#`의 리플렉션을 사용하면 팩토리 메서드 패턴을 사용하지 않고도 객체를 인스턴스화 할 수 있다.
```csharp
public class TestFactoryPattern {
    public static void Main(string[] args) {
        Shape circle = (Shape)Activator.CreateInstance(typeof(Circle));
        Shape rectangle = (Shape)Activator.CreateInstance(typeof(Rectangle));
        Shape triangle = (Shape)Activator.CreateInstance(typeof(Triangle));
    }
}
```
하지만 같은 추상화 레벨이나 생성 로직의 통일성을 보장하지 못하기 때문에 접근성이나 가독성이 부족할 수 있는 것 같다.
실제로는 데이터베이스상의 Unit을 읽어와서 객체를 동적으로 생성하여 관리하는 것 같다.
#### 10.3.2. 구조 패턴
구조 패턴(structural pattern)은 객체 그룹에서 더 큰 구조를 만드는 데 사용된다.
다음 일곱 가지 디자인 패턴이 구조 패턴의 범주에 속한다.
- 어댑터(Adapter, 즉 적응자)
- 브리지(Bridge, 즉 가교)
- 컴포지션(Composition, 즉 '합성체')
- 데코레이터(Decorator, 즉 '장식자')
- 파사드(Facade)
- 플라이웨이트(Flyweight)
- 프록시(Proxy)
##### 어댑터 디자인 패턴
어댑터 패턴은 이미 존재하는 클래스에 대해 다른 인터페이스를 작성하는 방법이다.
어댑터 패턴은 기본적으로 클래스 래퍼를 제공한다.
다시 말해, 기존 클래스의 기능을 새로우면서도 이상적으로 볼 때 더 나은 인터페이스로 통합하는(둘러싸는) 새로운 클래스를 만든다.
래퍼의 간단한 예는 자바 클래스인 Integer다.
Integer 클래스는 그 안에 단일 Integer값을 둘러싼다.
객체지향 시스템에서는 모든 것이 객체이기 때문에, 기본적인 int, float등과 같은 기본 데이터 형식은 객체가 아니다.
이러한 기본 데이터 형식들을 바탕으로 함수를 실행해야 할 때 이런 기본 데이터 형식들을 객체로 취급해야 한다.
따라서 래퍼 객체를 작성하고 그 안에 기본 데이터 형식들을 두고 둘러싼다.
다음과 같은 기본 데이터 형식을 사용할 수 있다.
```java
int myInt = 10;
```
그리고 이 기본 데이터 형식을 Integer로 둘러쌀 수 있다.
```java
Integer myInteger = new Integer(myInt);
```
이제 형식 변환을 수행할 수 있으며, 해당 자료를 문자열로 취급할 수 있다.
```java
String myString = myInteger.toString();
```
이 래퍼를 사용하면 원래부터 있던 정수를 객체로 취급해 객체의 모든 장점을 제공할 수 있다.
이것이 어댑터 패턴의 기본 아이디어이다.
어댑터는 말 그대로 한국의 콘센트와 일본의 콘센트를 연결하는 어댑터의 역할을 한다.
사용하는 패키지나 라이브러리의 호환성이나 중간 역할을 위해 래퍼로 감싸는 것이다.
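위 설명을 코드로 옮긴 최소한의 스케치다. `LegacyPrinter`, `Printer` 같은 이름은 예시를 위해 임의로 정한 가상의 것이다.

```java
// 클라이언트가 기대하는 새 인터페이스
interface Printer {
    String print(String text);
}

// 변경할 수 없는 기존(레거시) 클래스 - 다른 시그니처를 가진다
class LegacyPrinter {
    public String printWithBrackets(String text) {
        return "[" + text + "]";
    }
}

// 어댑터: 기존 클래스를 둘러싸서 새 인터페이스에 맞춘다
class PrinterAdapter implements Printer {
    private final LegacyPrinter legacy = new LegacyPrinter();

    @Override
    public String print(String text) {
        return legacy.printWithBrackets(text); // 호출을 기존 구현으로 위임
    }
}

public class AdapterSketch {
    public static void main(String[] args) {
        Printer printer = new PrinterAdapter();
        System.out.println(printer.print("hello")); // [hello]
    }
}
```

클라이언트는 `Printer` 인터페이스만 알면 되므로, 레거시 클래스가 바뀌어도 어댑터만 수정하면 된다.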
#### 10.3.3. 행위 패턴
행위 패턴(behavioral pattern)은 다음과 같은 범주로 구성된다.
- 책임 연쇄(Chain of responsibility)
- 커맨드(Command, 즉 '명령') 패턴
- 인터프리터(Interpreter, 즉 '해설자') 패턴
- 이터레이터(Iterator, 즉 '반복자') 패턴
- 미디에이터(Mediator, 즉 '중재자') 패턴
- 메멘토(Memento, 즉 '기념비') 패턴
- 옵저버(Observer, 즉 '관찰자') 패턴
- 스테이트(State, 즉 '상태') 패턴
- 스트래티지(Strategy, 즉 '전략') 패턴
- 탬플릿 메서드(Template method) 패턴
- 비지터(Visitor, 즉 '방문자') 패턴
##### 이터레이터 디자인 패턴
이터레이터(iterator)는 벡터와 같은 컬렉션을 순회하기 위한 표준 메커니즘을 제공한다.
컬렉션의 각 항목에 한 번에 하나씩 접근할 수 있는 기능을 제공해야 한다.
이터레이터 패턴은 정보 은닉을 제공해 컬렉션의 내부 구조를 안전하게 유지한다.
이터레이터 패턴은 또한 서로 간섭하지 않고 둘 이상의 이터레이터가 작성될 수 있도록 규정한다.
`C#`은 `IEnumerable` 인터페이스를 사용해 이터레이터 패턴을 구현한다.
```csharp
public interface IEnumerable{
IEnumerator GetEnumerator();
}
```
`IEnumerator` 인터페이스는 컬렉션의 각 항목을 순회하는 데 사용된다.
```csharp
public interface IEnumerator{
bool MoveNext();
object Current{get;}
void Reset();
}
```
자바는 다음과 같다.
```java
package Iterator;

import java.util.*;

// java.util.Iterator와의 이름 충돌을 피하기 위해 클래스 이름을 변경
public class IteratorExample {
    public static void main(String[] args) {
        ArrayList<String> list = new ArrayList<String>();
        list.add("one");
        list.add("two");
        list.add("three");
        list.add("four");
        iterate(list);
    }

    public static void iterate(ArrayList<String> list) {
        for (String s : list) {
            System.out.println(s);
        }
    }
}
```
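표준 컬렉션 대신 직접 `Iterable`을 구현하면, 본문에서 말한 정보 은닉을 그대로 보여줄 수 있다. 내부 배열은 숨긴 채 순회만 허용하는 스케치이며, `NameList`는 예시를 위한 가상의 클래스다.

```java
import java.util.Iterator;
import java.util.NoSuchElementException;

// 내부 배열을 숨기고 순회 인터페이스만 노출하는 컬렉션
class NameList implements Iterable<String> {
    private final String[] names;

    public NameList(String... names) {
        this.names = names.clone(); // 내부 표현을 외부와 분리
    }

    @Override
    public Iterator<String> iterator() {
        // 이터레이터마다 독립적인 위치를 가지므로 서로 간섭하지 않는다
        return new Iterator<String>() {
            private int index = 0;

            @Override
            public boolean hasNext() {
                return index < names.length;
            }

            @Override
            public String next() {
                if (!hasNext()) throw new NoSuchElementException();
                return names[index++];
            }
        };
    }
}

public class IteratorSketch {
    public static void main(String[] args) {
        for (String s : new NameList("one", "two", "three")) {
            System.out.println(s);
        }
    }
}
```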
#### 10.3.4. 안티패턴
디자인 패턴은 긍정적인 방식으로 발전하지만, 안티패턴(antipatterns)은 끔찍한 경험들을 모아 놓은 것으로 생각할 수 있다.
안티패턴이란 용어는 특정 유형의 문제를 사전에 해결하기 위해 디자인 패턴이 생성된다는 사실에서 비롯된다.
반면에, 안티패턴은 문제에 대한 반응이고 나쁜 경험으로부터 얻어진다.
요컨대, 디자인 패턴이 견고한 설계 실습을 기반으로 한 반면에, 안티패턴은 피해야할 관행으로 생각할 수 있다.
많은 사람들이 안티패턴이 디자인 패턴보다 더 유용하다고 생각한다.
안티패턴은 이미 발생한 문제를 해결하도록 설계되었기 때문이다.
안티패턴은 기존 설계를 수정하고 실행 가능한 솔루션을 찾을 때까지 해당 설계를 지속적으로 리팩토링한다.
- 안티패턴의 좋은 예 몇가지
- 싱글톤(Singleton)
- 서비스 로케이터(Service Locator)
- 매직 스트링/넘버(Magic String/Number)
- 인터페이스 부풀리기(Interface Bloat)
- 예외에 기반한 코딩(Exception Driven Coding)
- 오류 숨기기/오류 삼키기(Hiding Swallowing Errors)
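목록의 첫 항목인 싱글톤이 왜 안티패턴으로 자주 거론되는지 보여주는 단순화한 스케치다. 스레드 안전성은 고려하지 않았고, `Config`는 예시를 위한 가상의 클래스다.

```java
// 흔히 쓰이는 싱글톤 구현 (지연 초기화, 스레드 안전성은 생략)
class Config {
    private static Config instance;
    private String value = "default";

    private Config() {}

    public static Config getInstance() {
        if (instance == null) {
            instance = new Config();
        }
        return instance;
    }

    public void setValue(String v) { value = v; }
    public String getValue() { return value; }
}

public class SingletonSketch {
    public static void main(String[] args) {
        Config.getInstance().setValue("changed");
        // 전역 가변 상태이므로 코드의 어느 곳에서 읽어도 바뀐 값이 보인다.
        // 테스트 간 격리가 깨지고 의존성이 숨겨지는 점이 안티패턴으로 꼽히는 이유다.
        System.out.println(Config.getInstance().getValue()); // changed
    }
}
```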
### 10.4. 결론
패턴은 일상 생활의 일부이며, 이는 객체지향 설계에 대해 생각해야 하는 방식이다.
정보 기술과 관련된 많은 것들과 마찬가지로 해법의 근원은 **실제 상황에서 발견**된다.
|
1.0
|
10장 디자인 패턴 - [Link](https://github.com/fkdl0048/BookReview/blob/main/The_Object-Oriented_Thought_Process/Chapter10.md)
## 10. 디자인 패턴
업무용 소프트웨어 시스템을 작성하려면 개발자는 사업방식을 완전히 이해해야 한다.
결과적으로 개발자는 종종 회사의 업무 과정에 대해 가장 친밀한 지식을 갖게 된다.
우리가 포유류 클래스를 만들 때, 모든 포유류가 특정 행위와 특성을 공유하기 때문에 이 포유류 클래스를 사용하면 개나 고양이 등의 클래스들을 수없이 만들 수 있다.
이렇게 하다 보면 개, 고양이, 다람쥐 및 기타 포유류를 연구할 때 효과적인데, 이는 **패턴**을 발견할 수 있기 때문이다.
이런 패턴을 기반으로 특정 동물을 살펴보고 그게 포유류인지 아니면 행위와 특성의 패턴이 포유류와 다른 파충류인지 판단할 수 있다.
역사적으로 우리는 이러한 패턴을 사용해왔다.
이러한 패턴은 소프트웨어 분야의 중요한 부분으로 소프트웨어 재사용형태와 밀접한 관련이 있다.(디자인 패턴)
패턴은 재사용 가능한 소프트웨어 개발 개념에 아주 적합하다.
객체지향 개발은 모두 재사용에 관한 것이므로 패턴과 객체지향 개발은 함께 진보한다.
디자인 패턴의 기본 개념은 모범 사례의 원칙과 관련이 있다.
모범 사례에 따르면 우수하고 효율적인 솔루션이 만들어질 때, 이러한 솔루션은 다른 사람들이 실패로부터 배웠던 방식과 이전의 성공에 따른 이익을 얻을 수 있었던 방식을 문서화한다는 것을 의미한다.
### 10.1. 디자인 패턴이 필요한 이유
디자인 패턴의 개념이 반드시 재사용 가능한 소프트웨어의 필요성에서 시작된 것은 아니다.
디자인 패턴에 대한 중요한 업적은 건물과 도시 건설에 관련되어 있다.
패턴은 우리 환경에서 반복적으로 발생하는 문제들을 기술하며, 그 문제에 대한 핵심 해법도 기술하는데, 이런 방식 덕분에 우리는 그러한 해법을 백만 번 거듭해서 사용할 수 있으며, 그럼에도 같은 방식을 두 번 다시 따르지 않아도 된다.
#### 10.1.1. 패턴의 네 가지 요소
*GoF는 패턴을 다음 네 가지 필수 요소로 설명한다.*
- 패턴 이름
- 패턴 이름은 디자인 문제, 솔루션 및 결과를 한두 단어로 설명하는 데 사용할 수 있는 조종간 같은 것
- 패턴에 이름을 붙여두면 어휘가 즉시 늘어나며 더 높은 수준으로 설계를 추상화할 수 있다.
- 패턴에 대한 어휘를 갖추면 동료와 소통할 수 있고 문서로 교신할 수 있으며 자신과도 이야기 할 수 있다.
- 이해하기 쉬워지면 절충안 설계를 다른 사람들에게 쉽게 전달할 수 있다.
- 문제
- 문제는 패턴을 적용할 시점을 설명한다.
- 이것으로 문제 자체와 내용을 설명할 수 있다.
- 알고리즘을 객체로 표현하는 방법과 같은 특정 설계 문제들을 설명할 수 있다.
- 융통성이 없는 설계의 증상인 클래스 구조나 객체 구조를 설명할 수 있다.
- 해법
- 해법은 설계, 관계, 책임 및 협업을 이루는 요소를 설명한다.
- 패턴은 여러 상황에 적용할 수 있는 템플릿과 같기 때문에, 해법은 구체적인 특정 설계나 구현을 설명하지 않는다.
- 그 대신 패턴으로는 설계 문제를 추상적으로 설명하며, 요소를 일반적으로 배치해 설계 문제를 어떻게 해결하는지 보여준다.
- 귀결
- 귀결이란 패턴을 적용한 결과와 절충점을 말한다.
- 귀결이 종종 무시되기도 하지만, 우리가 설계 결정 내용을 설명할 때는 귀결이 설계 대안을 평가하고 적용 패턴의 비용과 이점을 이해하는데 중요하다.
- 소프트웨어 귀결은 시공간상의 절충과 관련이 있다.
- 귀결로 언어 문제 및 구현 문제도 해결할 수 있다.
### 10.2. 스몰토크의 모델/뷰/컨트롤러
MVC는 종종 디자인 패턴의 기원을 설명하는 데 사용된다.
모델/뷰/컨트롤러(MVC)라는 패러다임은 스몰토크에서 사용자 인터페이스를 생성하는데 사용했다.
> 스몰토크
> 스몰토크는 그 당시 가장 인기있는 객체지향 언어였다.
Design Patterns에서는 다음과 같은 방식으로 MVC 컴포넌트들을 정의한다.
모델(model)은 애플리케이션 객체이고 뷰(view)는 화면 표현이며, 컨트롤러(controller)는 사용자 인터페이스가 사용자 입력에 반응하는 방식을 정의한다.
이전 패러다임에서는 모델, 뷰, 컨트롤러가 단일 엔터티 안에 모두 모여 있는 문제가 있었다.
예를 들어, 단일 객체에 이 세 가지 컴포넌트가 모두 들어 있는 식이다.
MVC패러다임에서는 이 세 가지 컴포넌트에 개별적이며 구별되는 인터페이스들이 있다.
따라서 우리는 어떤 애플리케이션의 사용자 인터페이스를 변경하려고 한다면 뷰만 변경하면 된다.
이는 객체지향 개발의`인터페이스 대 구현부`와 관련이 있다.
가능한 인터페이스와 구현부를 분리하려고 한다.
또한 인터페이스들끼리도 서로 최대한 분리하려고 한다.
MVC는 아주 일반적이면서도 기본적인 프로그래밍 문제와 관련된 특정 컴포넌트 간의 인터페이스를 명시적으로 정의한다.
MVC 개념을 따르고 사용자 인터페이스, 비즈니스 로직 및 데이터를 분리하면서 시스템이 훨씬 유연하고 강력해진다.
예를 들어, 사용자 인터페이스가 클라이언트 시스템에 있고, 비즈니스 로직이 애플리케이션 서버에 있고, 데이터가 데이터 서버에 있다고 가정한다면 개발 도중에 비즈니스 로직에 영향 없이 GUI의 형태를 변경할 수 있다.
> **MVC의 단점**
> MVC는 훌륭한 디자인이지만, 선행 디자인에 많은 주의를 기울여야 한다는 점에서 복잡할 수 있다.
> 이는 일반적인 객체지향 디자인의 문제이다.
> 좋은 디자인과 성가신 디자인 사이에는 미세한 경계선이 있다.
> 그렇다면 과연 완전한 디자인과 관련해 시스템이 얼마나 복잡해야 하는가?
> 정답은 없고, 항상 트레이드 오프를 생각하자.
### 10.3. Types of Design Patterns
Design Patterns contains a total of 23 patterns, divided into three categories.
- Creational patterns
  - Create objects without instantiating them directly.
  - This gives the program more flexibility in deciding which objects need to be created for a given case.
- Structural patterns
  - Let you compose groups of objects, such as complex user interfaces or accounting data, into larger structures.
- Behavioral patterns
  - Define the communication between objects in a system and how the flow is controlled in a complex program.
#### 10.3.1. Creational Patterns
The creational patterns can be broken down into the following categories.
- Abstract Factory
- Builder
- Factory Method
- Prototype
- Singleton
This book does not cover each pattern in detail; it mostly explains what design patterns are.
```
Thoughts
I see once again why my mentor told me not to study design patterns first.
There are certainly design patterns well suited to game programming, but studying a pattern and then forcing it into place does more harm than good.
It is more effective, and more memorable, to study and apply a suitable pattern from the relevant category (creational, behavioral, or structural) only once the current structure makes you feel the need for it.
There is no fixed order of study, but there does seem to be a right way to go about it.
```
##### The Factory Method Design Pattern
Object creation, that is, instantiation, may be one of the most fundamental concepts in object-oriented programming.
It goes without saying that you cannot use an object if it does not exist.
When writing code, the most obvious way to instantiate an object is to use the new keyword.
```java
abstract class Shape{
}
class Circle extends Shape{
}
Circle circle = new Circle();
```
This code works fine, but there will be many places in the code where Circle, or the other shapes in the problem, must be instantiated.
In most cases there will be specific object-creation parameters that have to be handled every time a Shape is created.
As a result, every time the way objects are created changes, the code has to change everywhere a Shape object is instantiated.
Changing one place potentially forces code changes in many other places, so the code becomes tightly coupled.
Another problem with this approach is that it **exposes the object-creation logic to the programmer** through the class.
The factory method pattern can be implemented to resolve this situation.
*[Factory method pattern write-up](https://fkdl0048.github.io/patterns/Patterns_FactoryMethod/)*
A factory method encapsulates all instantiation so that it is uniform across the implementation.
When you instantiate through the factory, the factory instantiates appropriately.
##### The Factory Method Pattern
The fundamental purpose of the factory method pattern is to take charge of creating objects without specifying their exact class, effectively using an interface to create new object types.
In a sense, you can think of a factory as a wrapper (that is how I used to think of it).
There may be significant logic involved in instantiating an object, and we would rather the programmer (the user) not have to care about that logic.
In other words, it is much like the concept of accessor methods, where the logic for retrieving a value lives inside some other logic (getters and setters).
Use a factory method when you cannot know in advance which specific class you will need.
That is, every class in this example must be a subclass of Shape.
In fact, factory methods are used when you do not know exactly what you will need.
If parts of a class can be added later and you do know what you need, you can inject the instance through a constructor or a setter method.
Basically, this is the definition of polymorphism.
The link above has a similar pizza example, but I will review it once more with the book's example.
The shape types are defined as an enum.
```java
enum ShapeType{
CIRCLE,
RECTANGLE,
TRIANGLE
}
```
A Shape class can be defined as abstract with just a constructor and an abstract method called generate().
```java
abstract class Shape{
private ShapeType shapeType = null;
public Shape(ShapeType shapeType){
this.shapeType = shapeType;
}
public abstract void generate();
}
```
The Circle, Rectangle, and Triangle classes inherit from the Shape class.
```java
class Circle extends Shape{
public Circle(){
super(ShapeType.CIRCLE);
generate();
}
@Override
public void generate(){
System.out.println("Circle");
}
}
class Rectangle extends Shape{
public Rectangle(){
super(ShapeType.RECTANGLE);
generate();
}
@Override
public void generate(){
System.out.println("Rectangle");
}
}
class Triangle extends Shape{
public Triangle(){
super(ShapeType.TRIANGLE);
generate();
}
@Override
public void generate(){
System.out.println("Triangle");
}
}
```
As the name suggests, the ShapeFactory class is the actual factory.
The focus is on its generateShape() method.
The factory offers many advantages, but the generateShape() method is really the only place in the application where a Shape is instantiated.
```java
class ShapeFactory{
public static Shape generateShape(ShapeType shapeType){
Shape shape = null;
switch(shapeType){
case CIRCLE:
shape = new Circle();
break;
case RECTANGLE:
shape = new Rectangle();
break;
case TRIANGLE:
shape = new Triangle();
break;
default:
// 예외
break;
}
return shape;
}
}
```
```
Thoughts
All design patterns are similar at their core, but each pattern also differs in implementation depending on the language, style, and conventions.
Maybe it is because this is Java, but instead of a default branch you could catch the exception below and return from each case, put a static factory method on Shape itself, or improve flexibility with interface-based DI.
It seems best to write it with individual implementation differences, the scale of the project, and the team's conventions in mind.
```
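Building on that thought, one common Java variant (my own sketch, not from the book; the `ShapeKind` name and the trimmed-down classes are invented here) avoids the switch entirely by letting each enum constant carry its own constructor reference:

```java
import java.util.function.Supplier;

// Trimmed-down stand-ins for the Shape hierarchy above, kept
// self-contained for this sketch.
abstract class Shape {
    public abstract String label();
}

class Circle extends Shape {
    public String label() { return "Circle"; }
}

class Rectangle extends Shape {
    public String label() { return "Rectangle"; }
}

// Each enum constant knows how to build its own Shape, so adding a
// new shape never means editing a central switch statement.
enum ShapeKind {
    CIRCLE(Circle::new),
    RECTANGLE(Rectangle::new);

    private final Supplier<Shape> constructor;

    ShapeKind(Supplier<Shape> constructor) {
        this.constructor = constructor;
    }

    public Shape create() {
        return constructor.get();
    }
}
```

The trade-off is that the enum now depends on the concrete classes, which may or may not suit a given project.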
The traditional approach to instantiating these individual objects is for the programmer to instantiate them directly with the new keyword, as follows.
```java
public class TestFactoryPattern{
public static void main(String[] args){
Shape circle = new Circle();
Shape rectangle = new Rectangle();
Shape triangle = new Triangle();
}
}
```
With the factory method pattern, however, the objects can be instantiated like this.
```java
public class TestFactoryPattern{
public static void main(String[] args){
Shape circle = ShapeFactory.generateShape(ShapeType.CIRCLE);
Shape rectangle = ShapeFactory.generateShape(ShapeType.RECTANGLE);
Shape triangle = ShapeFactory.generateShape(ShapeType.TRIANGLE);
}
}
```
Going further, `C#` reflection can be used to instantiate objects without going through the factory method pattern at all.
```csharp
using System;

public class TestFactoryPattern{
    public static void Main(string[] args){
        Shape circle = (Shape)Activator.CreateInstance(typeof(Circle));
        Shape rectangle = (Shape)Activator.CreateInstance(typeof(Rectangle));
        Shape triangle = (Shape)Activator.CreateInstance(typeof(Triangle));
    }
}
```
However, this approach does not guarantee the same level of abstraction or a uniform creation logic, so it can hurt discoverability and readability.
In practice, it seems that unit definitions are read from a database and the objects are created and managed dynamically.
#### 10.3.2. Structural Patterns
Structural patterns are used to build larger structures out of groups of objects.
The following seven design patterns fall into the structural category.
- Adapter
- Bridge
- Composite
- Decorator
- Facade
- Flyweight
- Proxy
##### The Adapter Design Pattern
The adapter pattern is a way to create a different interface for a class that already exists.
The adapter pattern essentially provides a class wrapper.
In other words, you create a new class that incorporates (wraps) the functionality of an existing class in a new, and ideally better, interface.
A simple example of a wrapper is the Java class Integer.
The Integer class wraps a single int value inside it.
Although everything is supposed to be an object in an object-oriented system, primitive data types such as int and float are not objects.
When we need to run methods against these primitives, we have to treat them as objects.
So we create a wrapper object and wrap the primitive inside it.
A primitive data type can be used like this.
```java
int myInt = 10;
```
And this primitive can be wrapped in an Integer.
```java
Integer myInteger = new Integer(myInt);
```
Now a type conversion can be performed, and the value can be treated as a string.
```java
String myString = myInteger.toString();
```
With this wrapper, the original integer can be treated as an object, gaining all the advantages of an object.
This is the basic idea behind the adapter pattern.
Just as the name says, an adapter plays the role of the plug adapter that connects a Korean outlet to a Japanese plug.
You wrap things in a wrapper for compatibility with, or as an intermediary to, the packages and libraries you use.
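To make the idea concrete with an actual adapter class (a made-up example, not from the book; `LegacyPrinter` and `Printer` are invented names), a small Java sketch might look like this:

```java
// The new interface our application wants to program against.
interface Printer {
    String printUpperCase(String text);
}

// An existing class we cannot change, with an awkward interface.
class LegacyPrinter {
    public String print(String text) {
        return text;
    }
}

// The adapter wraps the legacy class and exposes the new interface,
// translating calls as needed.
class LegacyPrinterAdapter implements Printer {
    private final LegacyPrinter legacy = new LegacyPrinter();

    @Override
    public String printUpperCase(String text) {
        return legacy.print(text).toUpperCase();
    }
}
```

Client code depends only on `Printer`, so the legacy class can later be replaced without touching the callers.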
#### 10.3.3. Behavioral Patterns
The behavioral patterns consist of the following categories.
- Chain of Responsibility
- Command
- Interpreter
- Iterator
- Mediator
- Memento
- Observer
- State
- Strategy
- Template Method
- Visitor
##### The Iterator Design Pattern
An iterator provides a standard mechanism for traversing a collection such as a vector.
Functionality must be provided so that each item in the collection can be accessed one at a time.
The iterator pattern provides information hiding, keeping the internal structure of the collection safe.
The iterator pattern also stipulates that more than one iterator can be created without the iterators interfering with one another.
`C#` implements the iterator pattern using the `IEnumerable` interface.
```csharp
public interface IEnumerable{
IEnumerator GetEnumerator();
}
```
The `IEnumerator` interface is used to traverse each item in the collection.
```csharp
public interface IEnumerator{
bool MoveNext();
object Current{get;}
void Reset();
}
```
In Java, it looks like this.
```java
package Iterator;

import java.util.*;

public class Iterator{
    public static void main(String[] args){
        ArrayList<String> list = new ArrayList<String>();
        list.add("one");
        list.add("two");
        list.add("three");
        list.add("four");
        iterate(list);
    }

    public static void iterate(ArrayList<String> list){
        for(String s : list){
            System.out.println(s);
        }
    }
}
```
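The enhanced for loop above works because ArrayList implements Iterable. As a sketch of the other side of the pattern (my own example; `NameBag` is an invented name), a class can hand out iterators over a private array without ever exposing the array itself, and several iterators can run independently:

```java
import java.util.Iterator;
import java.util.NoSuchElementException;

// A tiny collection that hides its internal array and hands out
// iterators on demand; each iterator keeps its own position.
class NameBag implements Iterable<String> {
    private final String[] names;

    NameBag(String... names) {
        this.names = names.clone();
    }

    @Override
    public Iterator<String> iterator() {
        return new Iterator<String>() {
            private int index = 0;

            @Override
            public boolean hasNext() {
                return index < names.length;
            }

            @Override
            public String next() {
                if (!hasNext()) throw new NoSuchElementException();
                return names[index++];
            }
        };
    }
}
```

Because `NameBag` implements `Iterable`, it can be used directly in an enhanced for loop, just like ArrayList.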
#### 10.3.4. Antipatterns
While design patterns evolved in a positive way, antipatterns can be thought of as a collection of terrible experiences.
The term antipattern derives from the fact that design patterns are created to solve particular types of problems in advance.
Antipatterns, by contrast, are reactions to problems and are learned from bad experiences.
In short, while design patterns are based on solid design practices, antipatterns can be thought of as practices to avoid.
Many people believe that antipatterns are more useful than design patterns, because antipatterns are designed to address problems that have already occurred.
Antipatterns revise an existing design and continually refactor it until a workable solution is found.
- A few good examples of antipatterns
  - Singleton
  - Service Locator
  - Magic Strings/Numbers
  - Interface Bloat
  - Exception-Driven Coding
  - Hiding/Swallowing Errors
### 10.4. Conclusion
Patterns are part of everyday life, and this is how you should be thinking about object-oriented design.
As with many things related to information technology, the roots of these solutions are **discovered in real-world situations**.
|
7,424
| 10,543,359,796
|
IssuesEvent
|
2019-10-02 14:51:01
|
fablabbcn/fablabs.io
|
https://api.github.com/repos/fablabbcn/fablabs.io
|
opened
|
Planned Labs Appear as active labs
|
Approval Process bug
|
**Describe the bug**
When fab labs are approved under the
> planned status of the lab
they are supposed to be placed under this category on the frontend. The problem is that they appear as working labs, rather than carrying the tag or sign saying this is a planned lab, which previous versions showed.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to 'fablabs.io frontend and search for a lab'
2. Click on ''
3. Scroll down to '....'
4. See error
**Expected behavior**
Planned labs **should** appear on the frontend as planned labs, not as active labs; a planned status should not be considered an approved lab.
**Screenshots**
BACKSTAGE description:
<img width="1209" alt="Screen Shot 2019-10-02 at 9 46 11 AM" src="https://user-images.githubusercontent.com/24419466/66054465-a489e180-e4f9-11e9-902a-9c7912530982.png">
Frontend look:
<img width="795" alt="Screen Shot 2019-10-02 at 9 46 35 AM" src="https://user-images.githubusercontent.com/24419466/66054486-ace21c80-e4f9-11e9-909c-5d05e5536179.png">
**Desktop (please complete the following information):**
- OS: macOS 10.14.6
- Browser Chrome
- Version 77.0.3865.90 (Official Build) (64-bit)
**Additional context**
This creates an inaccurate count of labs and shows an inaccurate state of that specific space and lab
|
491,682
| 14,169,054,790
|
IssuesEvent
|
2020-11-12 12:40:52
|
mozilla/addons-server
|
https://api.github.com/repos/mozilla/addons-server
|
closed
|
Add confirmation page when changing a promoted group in the django admin
|
component: subscription priority: p3 state: pull request ready
|
With the new subscription system, changing the group of a promoted add-on might have implications.
For groups that require payment, we should change a few things in Stripe (like cancelling the subscription when we remove an add-on from a promoted group).
This issue is about adding a confirmation ~~page~~ message when the group of a promoted add-on is changed.
|
75,596
| 14,495,503,592
|
IssuesEvent
|
2020-12-11 11:17:06
|
hzi-braunschweig/SORMAS-Project
|
https://api.github.com/repos/hzi-braunschweig/SORMAS-Project
|
closed
|
Prevent Blind SQL Injection [3]
|
Code Quality change important
|
### Description
Using the https://pen1.sormas-oegd.de:443/sormas-rest/cases/push REST endpoint to create a case
with an invalid UUID allows a SQL injection in the function
```
de.symeda.sormas.backend.contact.ContactFacadeEjb.getNonSourceCaseCountForDashboard
public int getNonSourceCaseCountForDashboard(List<String> caseUuids) {
if (CollectionUtils.isEmpty(caseUuids)) {
// Avoid empty IN clause
return 0;
}
Query query = em.createNativeQuery(
String.format(
"SELECT DISTINCT count(case1_.id) FROM contact AS contact0_ LEFT OUTER JOIN cases AS case1_ ON (contact0_.%s_id = case1_.id) WHERE case1_.%s IN (%s)",
Contact.RESULTING_CASE.toLowerCase(),
Case.UUID,
QueryHelper.concatStrings(caseUuids)));
BigInteger count = (BigInteger) query.getSingleResult();
return count.intValue();
}
```
### Steps to Reproduce
1. Using the following request, a case is created with an invalid UUID:
**Request**
```
POST /sormas-rest/cases/push HTTP/1.1
Host: pen1.sormas-oegd.de
Connection: close
Authorization: Basic U3Vydk9mZjpTdXJ2T2Zm
Content-Type: application/json
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like
Gecko) Chrome/86.0.4240.111 Safari/537.36
Accept:
text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;
q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: none
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Sec-Fetch-Dest: document
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.9,de;q=0.8
Content-Length: 2861
[
ERNW Enno Rey Netzwerke GmbH Tel. +49 – 6221 – 48 03 90 Page 35
Carl-Bosch-Str. 4 Fax +49 – 6221 – 41 90 08
69115 Heidelberg VAT-ID DE813376919
BIC: GENODE61WNM IBAN: DE40670923000033145934
{
"creationDate": 1603797391019,
"changeDate": 1603797391019,
"uuid": "'><img src onerror='alert(1);",
[…]
```
**Response**
```
HTTP/1.1 200 OK
[…]
["OK"]
```
2. After navigating in the UI to the contact supervisor's dashboard, an error is reported to the user, as
shown in the screenshot below. The error seems to result from the SQL error caused by injecting the
invalid UUID.

### Expected Behavior
The Injected SQL is escaped and doesn't get interpreted.
### Proposed solution
- [ ] Avoid sql injection by rewriting it to parameter(s) and use batching to avoid parameter overflow (_IterableHelper.executeBatched_)
- [ ] Check that there is no html injection for UUID in the UI
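As an illustration of the first checkbox (a sketch only, not the actual fix; `buildInPlaceholders` and the exact column names are invented here), the string concatenation can be replaced by generated `?` placeholders whose values are then bound as parameters:

```java
// Builds the "?, ?, ?" placeholder list for a parameterized IN clause,
// so user-supplied UUIDs are bound as parameters instead of being
// concatenated into the SQL text.
class QuerySketch {
    static String buildInPlaceholders(int count) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < count; i++) {
            if (i > 0) sb.append(", ");
            sb.append('?');
        }
        return sb.toString();
    }

    static String countQuery(int uuidCount) {
        return "SELECT DISTINCT count(case1_.id) FROM contact AS contact0_ "
            + "LEFT OUTER JOIN cases AS case1_ ON (contact0_.resultingcase_id = case1_.id) "
            + "WHERE case1_.uuid IN (" + buildInPlaceholders(uuidCount) + ")";
    }
}
```

Each UUID would then be bound with `query.setParameter(...)` (or batched via `IterableHelper.executeBatched` as proposed above) rather than interpolated.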
### System Details
* SORMAS version: 1.48.2
* Server URL: https://pen1.sormas-oegd.de:443
* User Role: Contact Supervisor
|
111,149
| 17,016,145,575
|
IssuesEvent
|
2021-07-02 12:20:34
|
anyulled/react-boilerplate
|
https://api.github.com/repos/anyulled/react-boilerplate
|
opened
|
WS-2021-0153 (High) detected in ejs-2.7.4.tgz
|
security vulnerability
|
## WS-2021-0153 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ejs-2.7.4.tgz</b></p></summary>
<p>Embedded JavaScript templates</p>
<p>Library home page: <a href="https://registry.npmjs.org/ejs/-/ejs-2.7.4.tgz">https://registry.npmjs.org/ejs/-/ejs-2.7.4.tgz</a></p>
<p>Path to dependency file: react-boilerplate/package.json</p>
<p>Path to vulnerable library: react-boilerplate/node_modules/ejs</p>
<p>
Dependency Hierarchy:
- offline-plugin-5.0.7.tgz (Root Library)
- :x: **ejs-2.7.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/anyulled/react-boilerplate/commit/94d03db1e2d27b111333e7ac0f80533a6aee46f6">94d03db1e2d27b111333e7ac0f80533a6aee46f6</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Injection vulnerability was found in ejs before 3.1.6. Caused by filename which isn't sanitized for display.
<p>Publish Date: 2021-01-22
<p>URL: <a href=https://github.com/mde/ejs/commit/abaee2be937236b1b8da9a1f55096c17dda905fd>WS-2021-0153</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/mde/ejs/issues/571">https://github.com/mde/ejs/issues/571</a></p>
<p>Release Date: 2021-01-22</p>
<p>Fix Resolution: ejs - 3.1.6</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2021-0153 (High) detected in ejs-2.7.4.tgz
|
non_process
|
| 0
|
50,217
| 3,006,255,819
|
IssuesEvent
|
2015-07-27 09:12:34
|
Itseez/opencv
|
https://api.github.com/repos/Itseez/opencv
|
opened
|
hough circle detection not symmetric, misses obvious circle
|
auto-transferred bug category: imgproc category: video priority: normal
|
Transferred from http://code.opencv.org/issues/2461
```
|| Walter Blume on 2012-10-19 17:20
|| Priority: Normal
|| Affected: None
|| Category: imgproc, video
|| Tracker: Bug
|| Difficulty: None
|| PR: None
|| Platform: None / None
```
hough circle detection not symmetric, misses obvious circle
-----------
```
For some reason the Hough circle detection misses an obvious circle, but if I flip the image, the circle is detected. One would think the results would be similar for the flipped image, but they are not.
Attached are the source code and a test input image that exhibited the problem, plus a screenshot with an arrow showing the obvious circle miss.
Using OpenCV 2.4.2, built with TBB on a Windows 7 64-bit machine
```
History
-------
##### Daniil Osokin on 2012-10-22 06:34
```
Thank you, we will check it.
- Category set to imgproc, video
```
##### Jason Harper on 2014-02-22 23:04
```
The problem is in the non-maxima suppression. It only checks whether a value that passes the threshold is also strictly larger than (not equal to) all of its neighbors. In the first case there are two pixels with equal values next to each other. When that happens, neither pixel is set as a possible center and the circle is ignored. When the image is flipped, one of these pixels gets an extra count, leaving a single peak which gets labeled as a possible center.
I changed the check to >= and it fixed the problem and will submit a pull request soon.
```
##### leonardo bocchi on 2015-06-01 15:59
```
I faced the same problem, and found the same solution.
To avoid duplication, I changed lines 1112-1113 of hough.cpp to read:
adata[base] >= adata[base-1] && adata[base] > adata[base+1] &&
adata[base] >= adata[base-acols-2] && adata[base] > adata[base+acols+2] )
It seems working and no duplicate detections.
```
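The equal-neighbor plateau failure described above is easy to demonstrate outside of OpenCV. The sketch below is a minimal, hypothetical 4-neighbor non-maximum suppression in Python (an illustration, not the actual hough.cpp code): with a strict `>` on every side, two equal adjacent accumulator cells suppress each other and the peak is lost; relaxing the left/up comparisons to `>=`, as in the fix quoted above, keeps exactly one of them.

```python
def find_peaks(acc, threshold, strict=True):
    """4-neighbor non-maximum suppression on a 2D accumulator.

    With strict=True a cell must be strictly greater than all four
    neighbors (the old behavior described in the issue); with
    strict=False the left/up comparisons use >=, so one cell of an
    equal-valued pair still survives (the described fix).
    """
    peaks = []
    rows, cols = len(acc), len(acc[0])
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            v = acc[r][c]
            if v < threshold:
                continue
            if strict:
                ok = (v > acc[r][c - 1] and v > acc[r][c + 1] and
                      v > acc[r - 1][c] and v > acc[r + 1][c])
            else:
                ok = (v >= acc[r][c - 1] and v > acc[r][c + 1] and
                      v >= acc[r - 1][c] and v > acc[r + 1][c])
            if ok:
                peaks.append((r, c))
    return peaks

# Two equal cells side by side: a plateau in the accumulator.
acc = [[0, 0, 0, 0],
       [0, 5, 5, 0],
       [0, 0, 0, 0]]
```

Running it on the two-cell plateau returns no peaks in strict mode and a single peak in relaxed mode, which mirrors the flipped/unflipped behavior reported in the issue.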
|
1.0
|
|
non_process
|
| 0
|
4,554
| 7,388,056,864
|
IssuesEvent
|
2018-03-16 00:15:19
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
reopened
|
ProcessThreadTests.TestStartTimeProperty test failure in CI on Ubuntu
|
area-System.Diagnostics.Process test-run-core
|
https://ci.dot.net/job/dotnet_corefx/job/master/job/ubuntu14.04_debug/654/consoleText
```
System.Diagnostics.Tests.ProcessThreadTests.TestStartTimeProperty [FAIL]
System.ComponentModel.Win32Exception : Unable to retrieve the specified information about the process or thread. It may have exited or may be privileged.
Stack Trace:
at System.Diagnostics.ProcessThread.GetStat()
at System.Diagnostics.ProcessThread.get_StartTime()
/mnt/j/workspace/dotnet_corefx/master/ubuntu14.04_debug/src/System.Diagnostics.Process/tests/ProcessThreadTests.cs(106,0): at System.Diagnostics.Tests.ProcessThreadTests.<TestStartTimeProperty>d__3.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
```
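For context, on Linux a thread's start time comes from `/proc/<pid>/task/<tid>/stat`, and the `Win32Exception` above is the usual symptom of a thread exiting between being enumerated and having its stat file read. The Python sketch below is a hedged illustration of the race-tolerant pattern (not the corefx implementation): threads that disappear mid-iteration are skipped instead of raising.

```python
import os

def thread_start_ticks(pid):
    """Best-effort read of per-thread start times from /proc.

    Field 22 of /proc/<pid>/task/<tid>/stat is the thread start time
    in clock ticks since boot. Threads can exit while we iterate, so
    a missing stat file is treated as "thread gone" rather than an
    error -- the tolerance the failing test appears to lack.
    """
    result = {}
    task_dir = f"/proc/{pid}/task"
    try:
        tids = os.listdir(task_dir)
    except FileNotFoundError:
        return result  # whole process already gone (or no /proc)
    for tid in tids:
        try:
            with open(f"{task_dir}/{tid}/stat") as f:
                data = f.read()
        except (FileNotFoundError, ProcessLookupError):
            continue  # thread exited between listdir() and open()
        # The comm field may contain spaces; split after the final ')'.
        fields = data.rsplit(")", 1)[1].split()
        # state is field 3 -> index 0, so starttime (field 22) is index 19.
        result[int(tid)] = int(fields[19])
    return result
```

A test that asserts on `StartTime` for every enumerated thread inherently races against thread exit, so it either has to tolerate this error or control the target thread's lifetime.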
|
1.0
|
|
process
|
| 1
|
275,501
| 23,919,167,261
|
IssuesEvent
|
2022-09-09 15:12:46
|
blockframes/blockframes
|
https://api.github.com/repos/blockframes/blockframes
|
closed
|
E2E - Add to bucket
|
App - Catalog :earth_africa: Type - e2e test Dev - Test / Quality Assurance July clean up
|
### Description
We want to test that users are able to add titles to their bucket with corresponding terms.
### Context
BDD based on these contracts & related films https://docs.google.com/spreadsheets/d/1KHKdN9HAzDUfbKaoSi6_OHxCjMsMd-vQh0VL_qKSKls/edit#gid=0
MovieId `bR4fTHmDDuOSPrNaz39J` should be added as a title from Pulsar Content orgId (`e1VXeusNJK6pb8kmVnUn`)
### Scenario
1. Connect with marketplace user
2. Go to title list
3. Fills in avails filter:
- Dates: 01/01/2025 - 12/31/2026
- Media: S-VOD
- Territories: World
- Exclusivity: Yes
4. Add "Movie 2" to bucket (should be listed)
5. Add "Movie 4" to bucket
6. Add "Movie 3" to bucket
7. Go to bucket page
8. Check that there are 2 cards for "Movie 2" (one for SVOD, another one for A-VOD) and one card for the other titles
### Future upgrades
New edge cases should be added to that test when we'll be able to create discontinued mandates (#5267):
- 2 orgs selling the same title (media will come from 2 different companies)
- A mandate includes 2 discontinued terms
- 2 org selling the same title with discontinued terms (mixing the 2 other edge cases)
|
2.0
|
|
non_process
|
| 0
|
13,519
| 16,057,043,674
|
IssuesEvent
|
2021-04-23 07:11:23
|
ropensci/software-review-meta
|
https://api.github.com/repos/ropensci/software-review-meta
|
closed
|
Standard testing environment
|
automation process
|
We should set up a docker image for a standard testing environment. Our stack has a lot of system dependencies, so maybe we should use rocker/ropensci, with goodpractice installed.
|
1.0
|
|
process
|
| 1
|
12,833
| 15,213,874,661
|
IssuesEvent
|
2021-02-17 12:26:20
|
trilinos/Trilinos
|
https://api.github.com/repos/trilinos/Trilinos
|
closed
|
Create Trilinos GitHub Issue Lifecycle Model
|
CLOSED_DUE_TO_INACTIVITY MARKED_FOR_CLOSURE process improvement
|
Recent activity with GitHub issues suggests we need an explicit issue lifecycle model to communicate expectations for clients, users and developers.
This issue results from a query by @mwglass. Others who might be interested are: @jwillenbring @bmpersc and @bartlettroscoe
Also, I have added a new label "process improvement" so we can group issues that have this theme.
|
1.0
|
|
process
|
| 1
|
443,386
| 12,793,680,151
|
IssuesEvent
|
2020-07-02 04:49:33
|
open-learning-exchange/planet
|
https://api.github.com/repos/open-learning-exchange/planet
|
closed
|
Community leaders: Add title shows "Updated title" success message
|
low priority
|
- go to community leaders
- add title to members
- message shown as "Title Updated"
It should show "Title added".

|
1.0
|
|
non_process
|
| 0
|
68,717
| 7,108,101,620
|
IssuesEvent
|
2018-01-16 22:28:42
|
Azure/azure-cli
|
https://api.github.com/repos/Azure/azure-cli
|
closed
|
[Test] `check_style` and `run_tests` do not work on Python 2.
|
Test bug
|
Related to https://github.com/Azure/azure-cli/issues/3883
When I run these scripts I get:
```
Traceback (most recent call last):
File "C:\Users\trpresco\Documents\github\env2\Scripts\check_style-script.py", line 11, in <module>
load_entry_point('azure-cli-dev-tools', 'console_scripts', 'check_style')()
File "C:\Users\trpresco\Documents\github\env2\lib\site-packages\pkg_resources\__init__.py", line 561, in load_entry_point
return get_distribution(dist).load_entry_point(group, name)
File "C:\Users\trpresco\Documents\github\env2\lib\site-packages\pkg_resources\__init__.py", line 2626, in load_entry_point
raise ImportError("Entry point %r not found" % ((group, name),))
ImportError: Entry point ('console_scripts', 'check_style') not found
```
```
Traceback (most recent call last):
File "C:\Users\trpresco\Documents\github\env2\Scripts\run_tests-script.py", line 11, in <module>
load_entry_point('azure-cli-dev-tools', 'console_scripts', 'run_tests')()
File "C:\Users\trpresco\Documents\github\env2\lib\site-packages\pkg_resources\__init__.py", line 561, in load_entry_point
return get_distribution(dist).load_entry_point(group, name)
File "C:\Users\trpresco\Documents\github\env2\lib\site-packages\pkg_resources\__init__.py", line 2626, in load_entry_point
raise ImportError("Entry point %r not found" % ((group, name),))
ImportError: Entry point ('console_scripts', 'run_tests') not found
```
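The wrapper scripts generated by setuptools fail like this when the installed distribution does not advertise the requested name under the `console_scripts` group for that interpreter, typically because the dev install ran under a different Python than the one executing the script. As a hedged illustration (not part of azure-cli; the names just mirror the traceback), the modern equivalent of that `pkg_resources` lookup can be done with `importlib.metadata`:

```python
from importlib.metadata import entry_points

def find_console_script(name):
    """Return the console_scripts entry point called `name`, or None.

    Mirrors what a generated *-script.py wrapper does via
    pkg_resources: look the script up in the console_scripts group
    and fail (here: return None) when it is not registered.
    """
    eps = entry_points()
    # entry_points() returns a selectable object on Python >= 3.10
    # and a plain dict on older versions; handle both shapes.
    try:
        group = eps.select(group="console_scripts")
    except AttributeError:
        group = eps.get("console_scripts", [])
    for ep in group:
        if ep.name == name:
            return ep
    return None
```

If `find_console_script("check_style")` returns None under the interpreter in your virtual environment, the wrapper script will raise exactly the `ImportError` shown above.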
---
### Environment summary
Install Method (e.g. pip, interactive script, apt-get, Docker, MSI, edge build) / CLI version (`az --version`) / OS version / Shell Type (e.g. bash, cmd.exe, Bash on Windows)
dev_setup.py / azure-cli (2.0.21) / Win 10 / CMD.exe
|
1.0
|
|
non_process
|
| 0
|
958
| 3,419,126,066
|
IssuesEvent
|
2015-12-08 07:55:50
|
e-government-ua/iBP
|
https://api.github.com/repos/e-government-ua/iBP
|
closed
|
Dnipropetrovsk GolovAPU - Copy extract from the 1:500 topographic plan of the city
|
In process of testing
|
Agreed with GlavAPU.
The applicant submits an application addressed to the head of GlavAPU, together with data on the plot for which copies are needed.
GlavAPU processes the request and returns the ordered copy to the applicant by email.
Grounds for refusal:
1. No information available
2. The plot is outside the city limits
3. The information is classified (strategic facilities)
Deadline: up to 10 days.
Result: an electronic copy with a georeferencing file.
https://test.igov.org.ua/service/1420/general
|
1.0
|
|
process
|
| 1
|
236,308
| 18,092,471,553
|
IssuesEvent
|
2021-09-22 04:24:02
|
girlscript/winter-of-contributing
|
https://api.github.com/repos/girlscript/winter-of-contributing
|
closed
|
ML 1.5 : Probability Distributions (Gaussian, Standard, Poisson) (D)
|
documentation GWOC21 ML Assigned
|
Welcome to 'ML' Team, good to see you here
This issue will help readers acquire all the knowledge one needs about **Probability Distributions (Gaussian, Standard, Poisson)**.
To get assigned to this issue, add your **Batch Number** as mentioned in the "Machine Learning" spreadsheet, the approach you would follow, and the option you prefer (Documentation, Audio, Video). You can go with all three or any number of options you're interested in working on.
Domain : Machine Learning
Points to Note :
- The issues will be assigned on a first come first serve basis, 1 Issue == 1 PR.
- "Issue Title" and "PR Title" should be the same. Include the issue number along with it.
- Changes should be made inside the `Machine Learning/Machine_Learning/Statistics_for_Machine_Learning` Branch.
- Follow Contributing Guidelines & Code of Conduct before start Contributing.
- This issue is only for 'GWOC' contributors of 'Machine Learning' domain.
**To be Mentioned while taking the issue :**
- Full name
- Batch Number
- GitHub Profile Link
- Which type of Contribution you want to make :
- [ ] Documentation (Using Google Colab/Jupyter Notebook) [Upload the documented .ipynb file while making the PR]
All the best. Enjoy your open source journey ahead. 😎
|
1.0
|
|
non_process
|
| 0
|
547
| 3,005,961,466
|
IssuesEvent
|
2015-07-27 06:51:33
|
e-government-ua/i
|
https://api.github.com/repos/e-government-ua/i
|
closed
|
Implement a "Report" ("Звіт") section on the dashboard
|
hi priority In process of testing test
|
When the "Сформувати файл" ("Generate file") button is clicked, offer to save the file using the ready-made service:
https://test.region.igov.org.ua/wf-region/service/rest/file/download_bp_timing?sID_BP_Name=lviv_mvk-1&sDateAt=2015-06-28&sDateTo=2015-07-01
which was written for task https://github.com/e-government-ua/i/issues/313 and whose API is described on our Wiki.
(In practice, you will simply need to substitute the parameters from the interface fields into it.)

Conditions:
- Show the tab to users with the manager role - mngr
- Allow setting the report period and the service (BP, for example: "BP name (lviv_mvk-1)").
- The list of services must include the processes this user has access to
(each BP and each of its stages specifies the group it belongs to)
- Access is determined by membership in particular Activiti groups; see the standard description: http://www.activiti.org/userguide/#_groups
15.15.2. Get a list of groups
GET identity/groups).
|
1.0
|
|
process
|
на дашборде реализовать раздел звіт при нажатии кнопки сформувати файл предлагать сохранять файл используя готовій сервис который написан по задаче и апи которого описано на нашей вики фактически нужно будет просто туда подставлять параметры из полей интерфейса условия вкладку отображать пользователям с ролью руководителя mngr задавать период отчета и услугу бп например название бп lviv mvk в список услуг должны попасть те процессы к которым у этого пользователя есть доступ в каждом бп и его этапе прописана группа к которой он относится доступ определяется по принадлежностям к тем или иным группам в активити и есть в стандартном описании get a list of groups get identity groups
| 1
|
325,664
| 24,056,194,651
|
IssuesEvent
|
2022-09-16 17:08:46
|
squidfunk/mkdocs-material
|
https://api.github.com/repos/squidfunk/mkdocs-material
|
closed
|
Image link for hiding the sidebar is broken in the documentations
|
documentation
|
### Contribution guidelines
- [X] I've read the [contribution guidelines](https://github.com/squidfunk/mkdocs-material/blob/master/CONTRIBUTING.md) and wholeheartedly agree
### I've found a bug and checked that ...
- [X] ... the problem doesn't occur with the `mkdocs` or `readthedocs` themes
- [ ] ... the problem persists when all overrides are removed, i.e. `custom_dir`, `extra_javascript` and `extra_css`
- [ ] ... the documentation does not mention anything about my problem
- [X] ... there are no open or closed issues that are related to my problem
### Description
The screenshot in the documentation showcasing how & how the sidebar looks like when its hidden is broken.
### Expected behaviour
The screenshot should be rendered appropriately.
### Actual behaviour
The Markdown hyperlink is broken hence the screenshot didn't render as expected.

### Steps to reproduce
Head over to [this section](https://squidfunk.github.io/mkdocs-material/setup/setting-up-navigation/#hiding-the-sidebars) of the official documentations.
### Package versions
NA
### Configuration
```yaml
NA
```
### System information
NA
|
1.0
|
Image link for hiding the sidebar is broken in the documentations - ### Contribution guidelines
- [X] I've read the [contribution guidelines](https://github.com/squidfunk/mkdocs-material/blob/master/CONTRIBUTING.md) and wholeheartedly agree
### I've found a bug and checked that ...
- [X] ... the problem doesn't occur with the `mkdocs` or `readthedocs` themes
- [ ] ... the problem persists when all overrides are removed, i.e. `custom_dir`, `extra_javascript` and `extra_css`
- [ ] ... the documentation does not mention anything about my problem
- [X] ... there are no open or closed issues that are related to my problem
### Description
The screenshot in the documentation showcasing how & how the sidebar looks like when its hidden is broken.
### Expected behaviour
The screenshot should be rendered appropriately.
### Actual behaviour
The Markdown hyperlink is broken hence the screenshot didn't render as expected.

### Steps to reproduce
Head over to [this section](https://squidfunk.github.io/mkdocs-material/setup/setting-up-navigation/#hiding-the-sidebars) of the official documentations.
### Package versions
NA
### Configuration
```yaml
NA
```
### System information
NA
|
non_process
|
image link for hiding the sidebar is broken in the documentations contribution guidelines i ve read the and wholeheartedly agree i ve found a bug and checked that the problem doesn t occur with the mkdocs or readthedocs themes the problem persists when all overrides are removed i e custom dir extra javascript and extra css the documentation does not mention anything about my problem there are no open or closed issues that are related to my problem description the screenshot in the documentation showcasing how how the sidebar looks like when its hidden is broken expected behaviour the screenshot should be rendered appropriately actual behaviour the markdown hyperlink is broken hence the screenshot didn t render as expected steps to reproduce head over to of the official documentations package versions na configuration yaml na system information na
| 0
|
227,332
| 18,056,822,754
|
IssuesEvent
|
2021-09-20 09:17:52
|
Xaymar/obs-StreamFX
|
https://api.github.com/repos/Xaymar/obs-StreamFX
|
closed
|
New Unified logging causes crash on start under Linux/Gentoo
|
type:bug status:help-wanted status:testing-required
|
### Operating System
Linux (like Arch Linux)
### OBS Studio Version?
27.0
### StreamFX Version
0.11.0a3
### OBS Studio Log
https://gist.github.com/WPettersson/1175b9225fc3154edb87b3094f8a6b36
### OBS Studio Crash Log
https://gist.github.com/WPettersson/9b7e1ba34ba8a94e969f1c119e00097d
This includes me running it through gdb to get a backtrace, showing that something with logging was involved.
### Current Behavior
When starting obs (with streamfx enabled) the program crashes as it tries to show the "Allow checking for updates" dialog. I checked with gdb, and it seems to be related to the logging.
### Expected Behavior
Not crash when starting up
### Steps to Reproduce the Bug
Possibly: have not yet accepted the StreamFX Github connection permission
Start obs.
### Any additional Information we need to know?
This was all done with master branch (39026720954929906b44c0fe6e1b3279fb4846e2). I changed back to 23207d046eebb943bcfc953d3f5902e79de884e1 aka tag 0.11.0a3 and the crash is no longer present, so I'm guessing it is somehow related to the unified logging update.
|
1.0
|
New Unified logging causes crash on start under Linux/Gentoo - ### Operating System
Linux (like Arch Linux)
### OBS Studio Version?
27.0
### StreamFX Version
0.11.0a3
### OBS Studio Log
https://gist.github.com/WPettersson/1175b9225fc3154edb87b3094f8a6b36
### OBS Studio Crash Log
https://gist.github.com/WPettersson/9b7e1ba34ba8a94e969f1c119e00097d
This includes me running it through gdb to get a backtrace, showing that something with logging was involved.
### Current Behavior
When starting obs (with streamfx enabled) the program crashes as it tries to show the "Allow checking for updates" dialog. I checked with gdb, and it seems to be related to the logging.
### Expected Behavior
Not crash when starting up
### Steps to Reproduce the Bug
Possibly: have not yet accepted the StreamFX Github connection permission
Start obs.
### Any additional Information we need to know?
This was all done with master branch (39026720954929906b44c0fe6e1b3279fb4846e2). I changed back to 23207d046eebb943bcfc953d3f5902e79de884e1 aka tag 0.11.0a3 and the crash is no longer present, so I'm guessing it is somehow related to the unified logging update.
|
non_process
|
new unified logging causes crash on start under linux gentoo operating system linux like arch linux obs studio version streamfx version obs studio log obs studio crash log this includes me running it through gdb to get a backtrace showing that something with logging was involved current behavior when starting obs with streamfx enabled the program crashes as it tries to show the allow checking for updates dialog i checked with gdb and it seems to be related to the logging expected behavior not crash when starting up steps to reproduce the bug possibly have not yet accepted the streamfx github connection permission start obs any additional information we need to know this was all done with master branch i changed back to aka tag and the crash is no longer present so i m guessing it is somehow related to the unified logging update
| 0
|
1,282
| 3,814,199,427
|
IssuesEvent
|
2016-03-28 11:37:02
|
pelias/api
|
https://api.github.com/repos/pelias/api
|
closed
|
[ngram branch] query weighting
|
experiment processed question
|
@hkrishna I'm opening this ticket to track the admin weighting feature we discussed and so I have a place to put some test cases.
the gist of this is that on the *ngram* branch the part of the query which gave higher scores to queries which matched an admin value and scripts seem to have less effect than it used to.
|
1.0
|
[ngram branch] query weighting - @hkrishna I'm opening this ticket to track the admin weighting feature we discussed and so I have a place to put some test cases.
the gist of this is that on the *ngram* branch the part of the query which gave higher scores to queries which matched an admin value and scripts seem to have less effect than it used to.
|
process
|
query weighting hkrishna i m opening this ticket to track the admin weighting feature we discussed and so i have a place to put some test cases the gist of this is that on the ngram branch the part of the query which gave higher scores to queries which matched an admin value and scripts seem to have less effect than it used to
| 1
|
20,466
| 27,128,970,160
|
IssuesEvent
|
2023-02-16 08:21:22
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
Verifying bazel releases installation
|
P3 type: process team-OSS stale
|
The bazel release has a sig file that can be used to verify the sh file, but usually this sig file is used to verify the sha256 file. Either choose to verify the shasum or not remove it from use.
|
1.0
|
Verifying bazel releases installation - The bazel release has a sig file that can be used to verify the sh file, but usually this sig file is used to verify the sha256 file. Either choose to verify the shasum or not remove it from use.
|
process
|
verifying bazel releases installation the bazel release has a sig file that can be used to verify the sh file but usually this sig file is used to verify the file either choose to verify the shasum or not remove it from use
| 1
|
14,473
| 17,595,872,239
|
IssuesEvent
|
2021-08-17 04:59:19
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Model with "Drop field(s)" processing algorithm is not working with QGIS LTR 3.16.9 because of "native:" prefix
|
Feedback Processing Bug Modeller
|
### What is the bug or the crash?
Model with `Drop field(s)` processing algorithm is not working with QGIS LTR 3.16.9 because of `native:` prefix.
### Steps to reproduce the issue
[test_models.zip](https://github.com/qgis/QGIS/files/6975919/test_models.zip) contains two models with the `Drop field(s)` processing algorithm:
- `model_qgis_3_16_9`
- `model_qgis_3_20_1`
Model `model_qgis_3_20_1` is not working with QGIS LTR 3.16.9 despite the `Drop field(s)` algorithm is available.
```
This algorithm cannot be run :-(
The model you are trying to run contains an algorithm that is not available: native:deletecolumn
```
### Versions
<!--StartFragment--><!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">
<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
QGIS version | 3.16.9-Hannover | QGIS code revision | 9f8d2f79
-- | -- | -- | --
Compiled against Qt | 5.15.2 | Running against Qt | 5.15.2
Compiled against GDAL/OGR | 3.3.1 | Running against GDAL/OGR | 3.3.1
Compiled against GEOS | 3.9.1-CAPI-1.14.2 | Running against GEOS | 3.9.1-CAPI-1.14.2
Compiled against SQLite | 3.35.2 | Running against SQLite | 3.35.2
PostgreSQL Client Version | 13.0 | SpatiaLite Version | 5.0.1
QWT Version | 6.1.3 | QScintilla2 Version | 2.11.5
Compiled against PROJ | 8.1.0 | Running against PROJ | Rel. 8.1.0, July 1st, 2021
OS Version | Windows 10 Version 1909
Active python plugins | canvas_clipper; LAStools; plugin_reloader; db_manager; MetaSearch; processing
</body></html><!--EndFragment-->
### Additional context
Having a look at the model XML, for the model created with QGIS 3.16.9 the algorithm is named `qgis:deletecolum` while for QGIS 3.20.1 it is named `native:deletecolum`.
According to https://docs.qgis.org/3.16/en/docs/user_manual/processing/console.html#calling-algorithms-from-the-python-console
> qgis:buffer is an alias for native:buffer and will also work
Both test models are working with QGIS 3.20.1, so there it doesn't matter if the algorithm is prefixed with `native:` or `qgis:`.
QGIS 3.16.9 LTR only accepts the `qgis:` prefix.
|
1.0
|
Model with "Drop field(s)" processing algorithm is not working with QGIS LTR 3.16.9 because of "native:" prefix - ### What is the bug or the crash?
Model with `Drop field(s)` processing algorithm is not working with QGIS LTR 3.16.9 because of `native:` prefix.
### Steps to reproduce the issue
[test_models.zip](https://github.com/qgis/QGIS/files/6975919/test_models.zip) contains two models with the `Drop field(s)` processing algorithm:
- `model_qgis_3_16_9`
- `model_qgis_3_20_1`
Model `model_qgis_3_20_1` is not working with QGIS LTR 3.16.9 despite the `Drop field(s)` algorithm is available.
```
This algorithm cannot be run :-(
The model you are trying to run contains an algorithm that is not available: native:deletecolumn
```
### Versions
<!--StartFragment--><!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">
<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
QGIS version | 3.16.9-Hannover | QGIS code revision | 9f8d2f79
-- | -- | -- | --
Compiled against Qt | 5.15.2 | Running against Qt | 5.15.2
Compiled against GDAL/OGR | 3.3.1 | Running against GDAL/OGR | 3.3.1
Compiled against GEOS | 3.9.1-CAPI-1.14.2 | Running against GEOS | 3.9.1-CAPI-1.14.2
Compiled against SQLite | 3.35.2 | Running against SQLite | 3.35.2
PostgreSQL Client Version | 13.0 | SpatiaLite Version | 5.0.1
QWT Version | 6.1.3 | QScintilla2 Version | 2.11.5
Compiled against PROJ | 8.1.0 | Running against PROJ | Rel. 8.1.0, July 1st, 2021
OS Version | Windows 10 Version 1909
Active python plugins | canvas_clipper; LAStools; plugin_reloader; db_manager; MetaSearch; processing
</body></html><!--EndFragment-->
### Additional context
Having a look at the model XML, for the model created with QGIS 3.16.9 the algorithm is named `qgis:deletecolum` while for QGIS 3.20.1 it is named `native:deletecolum`.
According to https://docs.qgis.org/3.16/en/docs/user_manual/processing/console.html#calling-algorithms-from-the-python-console
> qgis:buffer is an alias for native:buffer and will also work
Both test models are working with QGIS 3.20.1, so there it doesn't matter if the algorithm is prefixed with `native:` or `qgis:`.
QGIS 3.16.9 LTR only accepts the `qgis:` prefix.
|
process
|
model with drop field s processing algorithm is not working with qgis ltr because of native prefix what is the bug or the crash model with drop field s processing algorithm is not working with qgis ltr because of native prefix steps to reproduce the issue contains two models with the drop field s processing algorithm model qgis model qgis model model qgis is not working with qgis ltr despite the drop field s algorithm is available this algorithm cannot be run the model you are trying to run contains an algorithm that is not available native deletecolumn versions doctype html public dtd html en qgis version hannover qgis code revision compiled against qt running against qt compiled against gdal ogr running against gdal ogr compiled against geos capi running against geos capi compiled against sqlite running against sqlite postgresql client version spatialite version qwt version version compiled against proj running against proj rel july os version windows version active python plugins canvas clipper lastools plugin reloader db manager metasearch processing additional context having a look at the model xml for the model created with qgis the algorithm is named qgis deletecolum while for qgis it is named native deletecolum according to qgis buffer is an alias for native buffer and will also work both test models are working with qgis so there it doesn t matter if the algorithm is prefixed with native or qgis qgis ltr only accepts the qgis prefix
| 1
|
176,070
| 28,023,443,310
|
IssuesEvent
|
2023-03-28 07:32:42
|
dotnet/winforms
|
https://api.github.com/repos/dotnet/winforms
|
opened
|
Bindingsource not assigned
|
area: VS designer untriaged
|
### Environment
Visual Studio 2022 - 17.5.3
### .NET version
7.0
### Did this work in a previous version of Visual Studio and/or previous .NET release?
Yes
### Issue description
Assigning a class as data source to a binding source doesn't seem to work within the designer. The data source class can be selected but isn't assigned as data source.
see attached sample project and screenshots


[WinFormsAppTestBindingSource.zip](https://github.com/dotnet/winforms/files/11086587/WinFormsAppTestBindingSource.zip)
### Steps to reproduce
- Create a new winform
- drop / drop a bindingsource to the form
- create a simple class with one property / member
- open datasource of the bindingsource property and assign the class with "Add new object data source"
- select the simple class and press ok
- expected: the class is assigned as data source
- current behaviour: the class isn't assigned
### Diagnostics
_No response_
|
1.0
|
Bindingsource not assigned - ### Environment
Visual Studio 2022 - 17.5.3
### .NET version
7.0
### Did this work in a previous version of Visual Studio and/or previous .NET release?
Yes
### Issue description
Assigning a class as data source to a binding source doesn't seems to work within the designer. The data source class can be selected but isn't assigned as data source.
see attached sample project and screenshots


[WinFormsAppTestBindingSource.zip](https://github.com/dotnet/winforms/files/11086587/WinFormsAppTestBindingSource.zip)
### Steps to reproduce
- Create a new winform
- drop / drop a bindingsource to the form
- create a simple class with one property / member
- open datasource of the bindingsource property and assign the class with "Add new object data source"
- select the simple class and press ok
- expected: the class is assigned as data source
- current behaviour: the class isn't assigned
### Diagnostics
_No response_
|
non_process
|
bindingsource not assigned environment visual studio net version did this work in a previous version of visual studio and or previous net release yes issue description assigning a class as data source to a binding source doesn t seems to work within the designer the data source class can be selected but isn t assigned as data source see attached sample project and screenshots steps to reproduce create a new winform drop drop a bindingsource to the form create a simple class with one property member open datasource of the bindingsource property and assign the class with add new object data source select the simple class and press ok expected the class is assigned as data source current behaviour the class isn t assigned diagnostics no response
| 0
|
1,061
| 3,535,167,043
|
IssuesEvent
|
2016-01-16 08:59:05
|
t3kt/vjzual2
|
https://api.github.com/repos/t3kt/vjzual2
|
closed
|
reversable transform module
|
enhancement video processing
|
for example rotate +32 and -32 and composite the outputs.
also for translate x/y.
it doesn't really make sense for scale, but maybe include it anyway
|
1.0
|
reversable transform module - for example rotate +32 and -32 and composite the outputs.
also for translate x/y.
it doesn't really make sense for scale, but maybe include it anyway
|
process
|
reversable transform module for example rotate and and composite the outputs also for translate x y it doesn t really make sense for scale but maybe include it anyway
| 1
|
20,826
| 27,581,532,861
|
IssuesEvent
|
2023-03-08 16:31:38
|
symfony/symfony
|
https://api.github.com/repos/symfony/symfony
|
closed
|
Can the process component be mocked in version 6?
|
Bug Process Status: Needs Review
|
### Symfony version(s) affected
6.0
### Description
I was working with version 5.4. I had the following class returning a Process object https://github.com/nmeri17/suphle/blob/3a522da6818064f6dbbe4b3571c7a8ea6a2d7898/src/Server/VendorBin.php#L46
In my tests, that method was being mocked, thus phpunit returns a mock process object. All was fine until I upgraded to version 6 last night. Now, when the consumer invokes it, phpunit is simply unable to complete creation of a mock process. The generated class it tries to eval is incorrect so it runs into the syntax error "unexpected static. Expected identifier"
I ultimately got around the obstacle by stubbing setProcessArguments to return `new Process([]);` but I imagine I got away easy cuz the test doesn't actually execute the process ie doesn't require path, actual command. Is this behaviour deliberate? Is there a new alternative to mocking or test doubling the process class?
### How to reproduce
This is the mocking helper method without my stub returning a fresh Process
https://github.com/nmeri17/suphle/blob/3a522da6818064f6dbbe4b3571c7a8ea6a2d7898/tests/Integration/Production/ContributorCommandTest.php#L44
### Possible Solution
_No response_
### Additional Context
_No response_
|
1.0
|
Can the process component be mocked in version 6? - ### Symfony version(s) affected
6.0
### Description
I was working with version 5.4. I had the following class returning a Process object https://github.com/nmeri17/suphle/blob/3a522da6818064f6dbbe4b3571c7a8ea6a2d7898/src/Server/VendorBin.php#L46
In my tests, that method was being mocked, thus phpunit returns a mock process object. All was fine until I upgraded to version 6 last night. Now, when the consumer invokes it, phpunit is simply unable to complete creation of a mock process. The generated class it tries to eval is incorrect so it runs into the syntax error "unexpected static. Expected identifier"
I ultimately got around the obstacle by stubbing setProcessArguments to return `new Process([]);` but I imagine I got away easy cuz the test doesn't actually execute the process ie doesn't require path, actual command. Is this behaviour deliberate? Is there a new alternative to mocking or test doubling the process class?
### How to reproduce
This is the mocking helper method without my stub returning a fresh Process
https://github.com/nmeri17/suphle/blob/3a522da6818064f6dbbe4b3571c7a8ea6a2d7898/tests/Integration/Production/ContributorCommandTest.php#L44
### Possible Solution
_No response_
### Additional Context
_No response_
|
process
|
can the process component be mocked in version symfony version s affected description i was working with version i had the following class returning a process object in my tests that method was being mocked thus phpunit returns a mock process object all was fine until i upgraded to version last night now when the consumer invokes it phpunit is simply unable to complete creation of a mock process the generated class it tries to eval is incorrect so it runs into the syntax error unexpected static expected identifier i ultimately got around the obstacle by stubbing setprocessarguments to return new process but i imagine i got away easy cuz the test doesn t actually execute the process ie doesn t require path actual command is this behaviour deliberate is there a new alternative to mocking or test doubling the process class how to reproduce this is the mocking helper method without my stub returning a fresh process possible solution no response additional context no response
| 1
|
626,482
| 19,824,708,333
|
IssuesEvent
|
2022-01-20 04:13:03
|
phetsims/number-play
|
https://api.github.com/repos/phetsims/number-play
|
opened
|
Improve AccordionBox usages
|
priority:2-high
|
From meeting with @jonathanolson today - he gave some suggestions for how to better define their size and not resize in case of a content size change.
I will also use this issue to remove a lot of duplicated code - there is currently OnesAccordionBox, ObjectAccordionBox, and CompareAccordionBox, and with all of the generalizations that have been made, the functionality of all of these could be simplified to one accordion box like CountingAccordionBox.
|
1.0
|
Improve AccordionBox usages - From meeting with @jonathanolson today - he gave some suggestions for how to better define their size and not resize in case of a content size change.
I will also use this issue to remove a lot of duplicated code - there is currently OnesAccordionBox, ObjectAccordionBox, and CompareAccordionBox, and with all of the generalizations that have been made, the functionality of all of these could be simplified to one accordion box like CountingAccordionBox.
|
non_process
|
improve accordionbox usages from meeting with jonathanolson today he gave some suggestions for how to better define their size and not resize in case of a content size change i will also use this issue to remove a lot of duplicated code there is currently onesaccordionbox objectaccordionbox and compareaccordionbox and with all of the generalizations that have been made the functionality of all of these could be simplified to one accordion box like countingaccordionbox
| 0
|
354,286
| 25,158,742,416
|
IssuesEvent
|
2022-11-10 15:19:55
|
GTMNERR/swmp-quarter-report
|
https://api.github.com/repos/GTMNERR/swmp-quarter-report
|
closed
|
Section 5.3.1 Total Nitrogen Last Sentence
|
documentation
|
Currently reads: This was especially true at Pine Island which only has data in January and February (Figure 5.5 (a) and the remaining sites are typically missing the March data.
Since, Pine Island had additional data added in September and October. I suggest maybe changing the sentence to:
This was especially true at Fort Matanzas which only has data in January, February, and April (Figure 5.5 (c).
or
This was especially true at Fort Matanzas which has the least amount of TN data (Figure 5.5 (c).
|
1.0
|
Section 5.3.1 Total Nitrogen Last Sentence - Currently reads: This was especially true at Pine Island which only has data in January and February (Figure 5.5 (a) and the remaining sites are typically missing the March data.
Since, Pine Island had additional data added in September and October. I suggest maybe changing the sentence to:
This was especially true at Fort Matanzas which only has data in January, February, and April (Figure 5.5 (c).
or
This was especially true at Fort Matanzas which has the least amount of TN data (Figure 5.5 (c).
|
non_process
|
section total nitrogen last sentence currently reads this was especially true at pine island which only has data in january and february figure a and the remaining sites are typically missing the march data since pine island had additional data added in september and october i suggest maybe changing the sentence to this was especially true at fort matanzas which only has data in january february and april figure c or this was especially true at fort matanzas which has the least amount of tn data figure c
| 0
|
15,131
| 18,873,759,589
|
IssuesEvent
|
2021-11-13 17:01:09
|
AmpersandTarski/Ampersand
|
https://api.github.com/repos/AmpersandTarski/Ampersand
|
closed
|
Support for build with Docker container
|
software process
|
I've found this post: https://docs.haskellstack.org/en/stable/docker_integration/
I think very promising for us.
Unfortunately no support for Windows yet.. but there's a fix coming very soon I gues: https://github.com/commercialhaskell/stack/issues/2421 and https://github.com/commercialhaskell/stack/pull/5315
Integration of stack with docker can be specified/configured in `stack.yaml`
```yaml
# Docker integration
# See:
docker:
enable: true
# Note that in a docker-enabled configuration, stack uses the GHC installed in the Docker container by default.
# We want to use the compiler installed by stack
system-ghc: false
```
Tbc
|
1.0
|
Support for build with Docker container - I've found this post: https://docs.haskellstack.org/en/stable/docker_integration/
I think very promising for us.
Unfortunately no support for Windows yet.. but there's a fix coming very soon I gues: https://github.com/commercialhaskell/stack/issues/2421 and https://github.com/commercialhaskell/stack/pull/5315
Integration of stack with docker can be specified/configured in `stack.yaml`
```yaml
# Docker integration
# See:
docker:
enable: true
# Note that in a docker-enabled configuration, stack uses the GHC installed in the Docker container by default.
# We want to use the compiler installed by stack
system-ghc: false
```
Tbc
|
process
|
support for build with docker container i ve found this post i think very promising for us unfortunately no support for windows yet but there s a fix coming very soon i gues and integration of stack with docker can be specified configured in stack yaml yaml docker integration see docker enable true note that in a docker enabled configuration stack uses the ghc installed in the docker container by default we want to use the compiler installed by stack system ghc false tbc
| 1
|
94,942
| 11,941,984,433
|
IssuesEvent
|
2020-04-02 19:26:22
|
MozillaFoundation/Design
|
https://api.github.com/repos/MozillaFoundation/Design
|
closed
|
Update /brand with latest MozFest assets
|
design
|
Once the new style guide https://github.com/MozillaFoundation/Design/issues/483 is done we will need to update our /brand page.
[need to add more info]
cc @sabrinang
|
1.0
|
Update /brand with latest MozFest assets - Once the new style guide https://github.com/MozillaFoundation/Design/issues/483 is done we will need to update our /brand page.
[need to add more info]
cc @sabrinang
|
non_process
|
update brand with latest mozfest assets once the new style guide is done we will need to update our brand page cc sabrinang
| 0
|
16,233
| 20,780,645,960
|
IssuesEvent
|
2022-03-16 14:29:51
|
FOLIO-FSE/folio_migration_tools
|
https://api.github.com/repos/FOLIO-FSE/folio_migration_tools
|
closed
|
Library configuration file: make holdingsMergeCriteria options more intuitive
|
simplify_migration_process
|
Example of current:
`"holdingsMergeCriteria": "lb",
`
Could we name the options something in clear text (e.g. "location_and_callnumber")? Would make it easier to read and work with the config file.
|
1.0
|
Library configuration file: make holdingsMergeCriteria options more intuitive - Example of current:
`"holdingsMergeCriteria": "lb",
`
Could we name the options something in clear text (e.g. "location_and_callnumber")? Would make it easier to read and work with the config file.
|
process
|
library configuration file make holdingsmergecriteria options more intuitive example of current holdingsmergecriteria lb could we name the options something in clear text e g location and callnumber would make it easier to read and work with the config file
| 1
|
75,108
| 20,628,063,450
|
IssuesEvent
|
2022-03-08 01:48:19
|
Rust-for-Linux/linux
|
https://api.github.com/repos/Rust-for-Linux/linux
|
opened
|
Makefile: a grep regular expression in the rust/Makefile may have problem
|
• kbuild
|
https://github.com/Rust-for-Linux/linux/blob/28a9bd2ebd821ab0f412f1d661c62801dbd2f632/rust/Makefile#L273-L274
When I learn the process of rust-bindgen, I find this expression confusing to me. Does the second part of this regular expression have a redundant "$"? If the answer is a yes, should I open a pr for it? Thanks for your advice!
|
1.0
|
Makefile: a grep regular expression in the rust/Makefile may have problem - https://github.com/Rust-for-Linux/linux/blob/28a9bd2ebd821ab0f412f1d661c62801dbd2f632/rust/Makefile#L273-L274
When I learn the process of rust-bindgen, I find this expression confusing to me. Does the second part of this regular expression have a redundant "$"? If the answer is a yes, should I open a pr for it? Thanks for your advice!
|
non_process
|
makefile a grep regular expression in the rust makefile may have problem when i learn the process of rust bindgen i find this expression confusing to me does the second part of this regular expression have a redundant if the answer is a yes should i open a pr for it thanks for your advice
| 0
|
17,263
| 23,043,962,336
|
IssuesEvent
|
2022-07-23 15:49:11
|
andrewzah/openbook
|
https://api.github.com/repos/andrewzah/openbook
|
opened
|
[metadata] support multiple album references
|
enhancement rust-preprocessor lilypond
|
* [ ] update yaml parser to check for array values - models.rs
* [ ] display this information somehow in the song render
|
1.0
|
[metadata] support multiple album references - * [ ] update yaml parser to check for array values - models.rs
* [ ] display this information somehow in the song render
|
process
|
support multiple album references update yaml parser to check for array values models rs display this information somehow in the song render
| 1
|
20,498
| 3,368,357,569
|
IssuesEvent
|
2015-11-22 22:15:06
|
libkml/libkml
|
https://api.github.com/repos/libkml/libkml
|
closed
|
examples/java issues
|
auto-migrated Priority-Medium Type-Defect
|
```
Makefile.am is missing CreateFolder.java
run.sh refers to non-existent ParseKml.java
```
Original issue reported on code.google.com by `kml.b...@gmail.com` on 17 Jun 2009 at 3:23
|
1.0
|
examples/java issues - ```
Makefile.am is missing CreateFolder.java
run.sh refers to non-existent ParseKml.java
```
Original issue reported on code.google.com by `kml.b...@gmail.com` on 17 Jun 2009 at 3:23
|
non_process
|
examples java issues makefile am is missing createfolder java run sh refers to non existent parsekml java original issue reported on code google com by kml b gmail com on jun at
| 0
|
184,137
| 14,970,125,699
|
IssuesEvent
|
2021-01-27 19:08:58
|
LibreTexts/metalc
|
https://api.github.com/repos/LibreTexts/metalc
|
closed
|
Create some examples of installing custom software in the libretexts construction guide
|
documentation good first issue medium priority
|
Our current policy is that we will install packages system-wide for users if they are available in the standard Ubuntu 18.04 APT repos or in the conda forge channel for conda packages (see the FAQ: https://jupyter.libretexts.org/hub/faq#how-can-i-install-custom-packages). We need one or more pages in the Libretexts construction guide that explain this policy (note that libretexts users may never visit the jupyterhub faq). The pages should also give a couple of examples: installing a package from a different conda channel, or one from conda forge that we don't already have installed, maybe a python package from pip, and/or downloading a tarball and compiling it from source. These pages should also clearly state that installing custom packages is an **unsupported feature** and that users are on their own when trying to do this.
The use case we want to give a bit of help on can been seen here with @Miniland1333's attempt at installing a difficult package:
https://github.com/LibreTexts/ckeditor-binder-plugin/pull/137
Construction guide: https://chem.libretexts.org/Courses/Remixer_University/LibreTexts_Construction_Guide/05%3A_Interactive_Elements/5.02%3A_Jupyter_Notebooks_(Executable_Programming_Code_and_Figures)
|
1.0
|
Create some examples of installing custom software in the libretexts construction guide - Our current policy is that we will install packages system-wide for users if they are available in the standard Ubuntu 18.04 APT repos or in the conda forge channel for conda packages (see the FAQ: https://jupyter.libretexts.org/hub/faq#how-can-i-install-custom-packages). We need one or more pages in the Libretexts construction guide that explain this policy (note that libretexts users may never visit the jupyterhub faq). The pages should also give a couple of examples: installing a package from a different conda channel, or one from conda forge that we don't already have installed, maybe a python package from pip, and/or downloading a tarball and compiling it from source. These pages should also clearly state that installing custom packages is an **unsupported feature** and that users are on their own when trying to do this.
The use case we want to give a bit of help on can been seen here with @Miniland1333's attempt at installing a difficult package:
https://github.com/LibreTexts/ckeditor-binder-plugin/pull/137
Construction guide: https://chem.libretexts.org/Courses/Remixer_University/LibreTexts_Construction_Guide/05%3A_Interactive_Elements/5.02%3A_Jupyter_Notebooks_(Executable_Programming_Code_and_Figures)
|
non_process
|
create some examples of installing custom software in the libretexts construction guide our current policy is that we will install packages system wide for users if they are available in the standard ubuntu apt repos or in the conda forge channel for conda packages see the faq we need one or more pages in the libretexts construction guide that explain this policy note that libretexts users may never visit the jupyterhub faq the pages should also give a couple of examples installing a package from a different conda channel or one from conda forge that we don t already have installed maybe a python package from pip and or downloading a tarball and compiling it from source these pages should also clearly state that installing custom packages is an unsupported feature and that users are on their own when trying to do this the use case we want to give a bit of help on can been seen here with s attempt at installing a difficult package construction guide
| 0
|
13,061
| 15,394,677,568
|
IssuesEvent
|
2021-03-03 18:13:03
|
googleapis/env-tests-logging
|
https://api.github.com/repos/googleapis/env-tests-logging
|
closed
|
Generalize library to be used by other languages
|
priority: p2 type: process
|
Hello, as I'm stepping through this test module for Go, starting a list of things that need to be generalized:
- [build_container()](https://github.com/googleapis/env-tests-logging/blob/29ace84ca228f5a0d991f8cac559df382a58226a/envctl/envctl#L45) mkdir path name
|
1.0
|
Generalize library to be used by other languages - Hello, as I'm stepping through this test module for Go, starting a list of things that need to be generalized:
- [build_container()](https://github.com/googleapis/env-tests-logging/blob/29ace84ca228f5a0d991f8cac559df382a58226a/envctl/envctl#L45) mkdir path name
|
process
|
generalize library to be used by other languages hello as i m stepping through this test module for go starting a list of things that need to be generalized mkdir path name
| 1
|
19,974
| 26,456,122,013
|
IssuesEvent
|
2023-01-16 14:33:38
|
solop-develop/frontend-core
|
https://api.github.com/repos/solop-develop/frontend-core
|
closed
|
[Feature Request] Proceso: Opción rápida de limpiar parámetros
|
enhancement (PRC) Processes (UX) User Experience
|
<!--
Note: Is your feature request related to a problem? Please describe.
-->
## Feature request
<!--
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
-->
Se necesita agregar la opción rápida de limpiar los parámetros en los `Procesos`. Esta opción fue agregada recientemente en la búsqueda avanzada.
https://user-images.githubusercontent.com/20288327/212694130-4d6157b1-1910-4781-bfe0-fce01a006c5e.mp4
|
1.0
|
[Feature Request] Proceso: Opción rápida de limpiar parámetros - <!--
Note: Is your feature request related to a problem? Please describe.
-->
## Feature request
<!--
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
-->
Se necesita agregar la opción rápida de limpiar los parámetros en los `Procesos`. Esta opción fue agregada recientemente en la búsqueda avanzada.
https://user-images.githubusercontent.com/20288327/212694130-4d6157b1-1910-4781-bfe0-fce01a006c5e.mp4
|
process
|
proceso opción rápida de limpiar parámetros note is your feature request related to a problem please describe feature request a clear and concise description of what the problem is ex i m always frustrated when se necesita agregar la opción rápida de limpiar los parámetros en los procesos esta opción fue agregada recientemente en la búsqueda avanzada
| 1
|
158,684
| 24,876,973,575
|
IssuesEvent
|
2022-10-27 20:02:01
|
OAGi/Score
|
https://api.github.com/repos/OAGi/Score
|
closed
|
Allow ACC Object Class Term to be revised
|
design change
|
In all syntactical expressions that Score current support, this is not causing backwardly incompatible.
In fact, this is one place GUID plays a good role. As the case when the name needs to change only because misspelled, not semantically.
This is also motivated by OAGIS issues such as https://github.com/OAGi/oagis/issues/284 and https://github.com/OAGi/oagis/issues/279.
As we can see, this will make BIE uplift more interesting.
|
1.0
|
Allow ACC Object Class Term to be revised - In all syntactical expressions that Score current support, this is not causing backwardly incompatible.
In fact, this is one place GUID plays a good role. As the case when the name needs to change only because misspelled, not semantically.
This is also motivated by OAGIS issues such as https://github.com/OAGi/oagis/issues/284 and https://github.com/OAGi/oagis/issues/279.
As we can see, this will make BIE uplift more interesting.
|
non_process
|
allow acc object class term to be revised in all syntactical expressions that score current support this is not causing backwardly incompatible in fact this is one place guid plays a good role as the case when the name needs to change only because misspelled not semantically this is also motivated by oagis issues such as and as we can see this will make bie uplift more interesting
| 0
|
19,814
| 26,202,988,925
|
IssuesEvent
|
2023-01-03 19:23:43
|
hashgraph/hedera-mirror-node
|
https://api.github.com/repos/hashgraph/hedera-mirror-node
|
opened
|
Release checklist 0.71
|
enhancement process
|
### Problem
We need a checklist to verify the release is rolled out successfully.
### Solution
## Preparation
- [x] Milestone field populated on relevant [issues](https://github.com/hashgraph/hedera-mirror-node/issues?q=is%3Aclosed+no%3Amilestone+sort%3Aupdated-desc)
- [x] Nothing open for [milestone](https://github.com/hashgraph/hedera-mirror-node/issues?q=is%3Aopen+sort%3Aupdated-desc+milestone%3A0.71.0)
- [x] GitHub checks for branch are passing
- [x] Automated Kubernetes deployment successful
- [x] Tag release
- [x] Upload release artifacts
- [ ] Manual Submission for GCP Marketplace verification by google
- [ ] Publish marketplace release
- [x] Publish release
## Performance
- [x] Deploy to Kubernetes
- [x] Deploy to VM
- [x] gRPC API performance tests
- [x] Importer performance tests
- [x] REST API performance tests
## Previewnet
- [x] Deploy to Kubernetes
## Staging
- [x] Deploy to Kubernetes
## Testnet
- [ ] Deploy to VM
## Mainnet
- [ ] Deploy to Kubernetes EU
- [ ] Deploy to Kubernetes NA
- [ ] Deploy to VM
- [ ] Deploy to ETL
### Alternatives
_No response_
|
1.0
|
Release checklist 0.71 - ### Problem
We need a checklist to verify the release is rolled out successfully.
### Solution
## Preparation
- [x] Milestone field populated on relevant [issues](https://github.com/hashgraph/hedera-mirror-node/issues?q=is%3Aclosed+no%3Amilestone+sort%3Aupdated-desc)
- [x] Nothing open for [milestone](https://github.com/hashgraph/hedera-mirror-node/issues?q=is%3Aopen+sort%3Aupdated-desc+milestone%3A0.71.0)
- [x] GitHub checks for branch are passing
- [x] Automated Kubernetes deployment successful
- [x] Tag release
- [x] Upload release artifacts
- [ ] Manual Submission for GCP Marketplace verification by google
- [ ] Publish marketplace release
- [x] Publish release
## Performance
- [x] Deploy to Kubernetes
- [x] Deploy to VM
- [x] gRPC API performance tests
- [x] Importer performance tests
- [x] REST API performance tests
## Previewnet
- [x] Deploy to Kubernetes
## Staging
- [x] Deploy to Kubernetes
## Testnet
- [ ] Deploy to VM
## Mainnet
- [ ] Deploy to Kubernetes EU
- [ ] Deploy to Kubernetes NA
- [ ] Deploy to VM
- [ ] Deploy to ETL
### Alternatives
_No response_
|
process
|
release checklist problem we need a checklist to verify the release is rolled out successfully solution preparation milestone field populated on relevant nothing open for github checks for branch are passing automated kubernetes deployment successful tag release upload release artifacts manual submission for gcp marketplace verification by google publish marketplace release publish release performance deploy to kubernetes deploy to vm grpc api performance tests importer performance tests rest api performance tests previewnet deploy to kubernetes staging deploy to kubernetes testnet deploy to vm mainnet deploy to kubernetes eu deploy to kubernetes na deploy to vm deploy to etl alternatives no response
| 1
|
4,528
| 7,371,478,867
|
IssuesEvent
|
2018-03-13 11:54:31
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Link to Deploy device simulation is wrong
|
cxp doc-bug in-process iot-suite triaged
|
It links to this document
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 0c7427cc-eefe-7ae4-625c-cece27796c66
* Version Independent ID: 3def07f9-7684-f8fe-eeae-a46ef2f27343
* Content: [Get started with the Device Simulation solution - Azure | Microsoft Docs](https://docs.microsoft.com/en-us/azure/iot-suite/iot-suite-device-simulation-explore)
* Content Source: [articles/iot-suite/iot-suite-device-simulation-explore.md](https://github.com/Microsoft/azure-docs/blob/master/articles/iot-suite/iot-suite-device-simulation-explore.md)
* Service: **iot-suite**
* GitHub Login: @troyhopwood
* Microsoft Alias: **troyhop**
|
1.0
|
Link to Deploy device simulation is wrong - It links to this document
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 0c7427cc-eefe-7ae4-625c-cece27796c66
* Version Independent ID: 3def07f9-7684-f8fe-eeae-a46ef2f27343
* Content: [Get started with the Device Simulation solution - Azure | Microsoft Docs](https://docs.microsoft.com/en-us/azure/iot-suite/iot-suite-device-simulation-explore)
* Content Source: [articles/iot-suite/iot-suite-device-simulation-explore.md](https://github.com/Microsoft/azure-docs/blob/master/articles/iot-suite/iot-suite-device-simulation-explore.md)
* Service: **iot-suite**
* GitHub Login: @troyhopwood
* Microsoft Alias: **troyhop**
|
process
|
link to deploy device simulation is wrong it links to this document document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id eefe version independent id eeae content content source service iot suite github login troyhopwood microsoft alias troyhop
| 1
|
73,924
| 15,286,453,624
|
IssuesEvent
|
2021-02-23 14:43:22
|
wouter140/PortfolioSite
|
https://api.github.com/repos/wouter140/PortfolioSite
|
closed
|
CVE-2020-11022 (Medium) detected in multiple libraries
|
security vulnerability
|
## CVE-2020-11022 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-3.4.0.min.js</b>, <b>jquery-2.1.1.min.js</b>, <b>jquery-2.1.4.min.js</b>, <b>jquery-1.7.1.min.js</b></p></summary>
<p>
<details><summary><b>jquery-3.4.0.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.0/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.0/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/PortfolioSite/node_modules/js-base64/test/index.html</p>
<p>Path to vulnerable library: /PortfolioSite/node_modules/js-base64/test/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-3.4.0.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-2.1.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.1/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/PortfolioSite/node_modules/react-numeric-input/docs/index.html</p>
<p>Path to vulnerable library: /PortfolioSite/node_modules/react-numeric-input/docs/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.1.1.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-2.1.4.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/PortfolioSite/node_modules/js-base64/.attic/test-moment/index.html</p>
<p>Path to vulnerable library: /PortfolioSite/node_modules/js-base64/.attic/test-moment/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.1.4.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-1.7.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/PortfolioSite/node_modules/sockjs/examples/hapi/html/index.html</p>
<p>Path to vulnerable library: /PortfolioSite/node_modules/sockjs/examples/hapi/html/index.html,/PortfolioSite/node_modules/sockjs/examples/echo/index.html,/PortfolioSite/node_modules/sockjs/examples/multiplex/index.html,/PortfolioSite/node_modules/sockjs/examples/express/index.html,/PortfolioSite/node_modules/sockjs/examples/express-3.x/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.7.1.min.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/wouter140/PortfolioSite/commit/2b69df44c79f486e9017ba8ef4ec9834fd54c67d">2b69df44c79f486e9017ba8ef4ec9834fd54c67d</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11022>CVE-2020-11022</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/">https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jQuery - 3.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-11022 (Medium) detected in multiple libraries - ## CVE-2020-11022 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-3.4.0.min.js</b>, <b>jquery-2.1.1.min.js</b>, <b>jquery-2.1.4.min.js</b>, <b>jquery-1.7.1.min.js</b></p></summary>
<p>
<details><summary><b>jquery-3.4.0.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.0/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.4.0/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/PortfolioSite/node_modules/js-base64/test/index.html</p>
<p>Path to vulnerable library: /PortfolioSite/node_modules/js-base64/test/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-3.4.0.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-2.1.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.1/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/PortfolioSite/node_modules/react-numeric-input/docs/index.html</p>
<p>Path to vulnerable library: /PortfolioSite/node_modules/react-numeric-input/docs/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.1.1.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-2.1.4.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/PortfolioSite/node_modules/js-base64/.attic/test-moment/index.html</p>
<p>Path to vulnerable library: /PortfolioSite/node_modules/js-base64/.attic/test-moment/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.1.4.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-1.7.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/PortfolioSite/node_modules/sockjs/examples/hapi/html/index.html</p>
<p>Path to vulnerable library: /PortfolioSite/node_modules/sockjs/examples/hapi/html/index.html,/PortfolioSite/node_modules/sockjs/examples/echo/index.html,/PortfolioSite/node_modules/sockjs/examples/multiplex/index.html,/PortfolioSite/node_modules/sockjs/examples/express/index.html,/PortfolioSite/node_modules/sockjs/examples/express-3.x/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.7.1.min.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/wouter140/PortfolioSite/commit/2b69df44c79f486e9017ba8ef4ec9834fd54c67d">2b69df44c79f486e9017ba8ef4ec9834fd54c67d</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11022>CVE-2020-11022</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/">https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jQuery - 3.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in multiple libraries cve medium severity vulnerability vulnerable libraries jquery min js jquery min js jquery min js jquery min js jquery min js javascript library for dom operations library home page a href path to dependency file tmp ws scm portfoliosite node modules js test index html path to vulnerable library portfoliosite node modules js test index html dependency hierarchy x jquery min js vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file tmp ws scm portfoliosite node modules react numeric input docs index html path to vulnerable library portfoliosite node modules react numeric input docs index html dependency hierarchy x jquery min js vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file tmp ws scm portfoliosite node modules js attic test moment index html path to vulnerable library portfoliosite node modules js attic test moment index html dependency hierarchy x jquery min js vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file tmp ws scm portfoliosite node modules sockjs examples hapi html index html path to vulnerable library portfoliosite node modules sockjs examples hapi html index html portfoliosite node modules sockjs examples echo index html portfoliosite node modules sockjs examples multiplex index html portfoliosite node modules sockjs examples express index html portfoliosite node modules sockjs examples express x index html dependency hierarchy x jquery min js vulnerable library found in head commit a href vulnerability details in jquery before passing html from untrusted sources even after sanitizing it to one of jquery s dom manipulation methods i e html append and others may execute untrusted code this problem is patched in jquery publish date url a href cvss score details base score metrics exploitability metrics attack vector n a attack complexity n a privileges required n a user interaction n a scope n a impact metrics confidentiality impact n a integrity impact n a availability impact n a for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery step up your open source security game with whitesource
| 0
|
8,707
| 11,848,654,653
|
IssuesEvent
|
2020-03-24 14:06:59
|
allinurl/goaccess
|
https://api.github.com/repos/allinurl/goaccess
|
closed
|
Run Goaccess from local machine to monitor apache log in remote server.
|
log-processing
|
Hello,
I have tried Goaccess to monitor the Apache log file in my local machine (windows 10). Now I am trying to run Goaccess in realtime to monitor the log file in remote server (windows server 2016), but I am unable to do it.
Could you please tell me if there is any ways to do this.
Thanks a lot.
Van Dat
|
1.0
|
Run Goaccess from local machine to monitor apache log in remote server. - Hello,
I have tried Goaccess to monitor the Apache log file in my local machine (windows 10). Now I am trying to run Goaccess in realtime to monitor the log file in remote server (windows server 2016), but I am unable to do it.
Could you please tell me if there is any ways to do this.
Thanks a lot.
Van Dat
|
process
|
run goaccess from local machine to monitor apache log in remote server hello i have tried goaccess to monitor the apache log file in my local machine windows now i am trying to run goaccess in realtime to monitor the log file in remote server windows server but i am unable to do it could you please tell me if there is any ways to do this thanks a lot van dat
| 1
|
154,936
| 5,939,925,954
|
IssuesEvent
|
2017-05-25 07:25:29
|
GluuFederation/oxd
|
https://api.github.com/repos/GluuFederation/oxd
|
closed
|
Associate register_site with access_token client
|
bug High priority
|
Associate register_site with access_token client
|
1.0
|
Associate register_site with access_token client - Associate register_site with access_token client
|
non_process
|
associate register site with access token client associate register site with access token client
| 0
|
276,699
| 30,522,088,533
|
IssuesEvent
|
2023-07-19 08:46:30
|
NS-Mend/Java-Demo-ReachabilityUAT
|
https://api.github.com/repos/NS-Mend/Java-Demo-ReachabilityUAT
|
closed
|
jstl-1.2.jar: 1 vulnerabilities (highest severity is: 7.3) - autoclosed
|
Mend: dependency security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jstl-1.2.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/javax/servlet/jstl/1.2/jstl-1.2.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/NS-Mend/Java-Demo/commit/f1bb3f06009b6c337a3174fd7f89dce12fe88e3e">f1bb3f06009b6c337a3174fd7f89dce12fe88e3e</a></p></details>
#### <img src='https://whitesource-resources.whitesourcesoftware.com/suggestedVersion.png' width=19 height=20> Mend has checked all newer package trees, and you are on the least vulnerable package!
#### Please note: There might be a version that explicitly solves one or more of the vulnerabilities listed below, but we do not recommend it. For more info about the optional fixes, check the "Details" section below.
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (jstl version) | Fix PR available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2015-0254](https://www.mend.io/vulnerability-database/CVE-2015-0254) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.3 | jstl-1.2.jar | Direct | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> CVE-2015-0254</summary>
### Vulnerable Library - <b>jstl-1.2.jar</b></p>
<p></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/javax/servlet/jstl/1.2/jstl-1.2.jar</p>
<p>
Dependency Hierarchy:
- :x: **jstl-1.2.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/NS-Mend/Java-Demo/commit/f1bb3f06009b6c337a3174fd7f89dce12fe88e3e">f1bb3f06009b6c337a3174fd7f89dce12fe88e3e</a></p>
<p>Found in base branches: <b>feature/Demo, master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Apache Standard Taglibs before 1.2.3 allows remote attackers to execute arbitrary code or conduct external XML entity (XXE) attacks via a crafted XSLT extension in a (1) <x:parse> or (2) <x:transform> JSTL XML tag.
<p>Publish Date: 2015-03-09
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-0254>CVE-2015-0254</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://tomcat.apache.org/taglibs/standard/">https://tomcat.apache.org/taglibs/standard/</a></p>
<p>Release Date: 2015-03-09</p>
<p>Fix Resolution: org.apache.taglibs:taglibs-standard-impl:1.2.3</p>
</p>
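<p>The fix resolution names the coordinates <code>org.apache.taglibs:taglibs-standard-impl:1.2.3</code>. A minimal <code>pom.xml</code> sketch of the swap, assuming a standard Maven layout (only the coordinates come from the report; the surrounding structure is illustrative):</p>

```xml
<!-- Remove the vulnerable javax.servlet:jstl:1.2 dependency and declare
     the patched Apache Taglibs implementation named in the report. -->
<dependency>
  <groupId>org.apache.taglibs</groupId>
  <artifactId>taglibs-standard-impl</artifactId>
  <version>1.2.3</version>
</dependency>
```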
<p></p>
<p>In order to enable automatic remediation, please create <a target="_blank" href="https://docs.mend.io/bundle/integrations/page/mend_for_github_com.html#MendforGitHub.com-RemediateSettings(remediateSettings)">workflow rules</a></p>
</details>
|
True
|
jstl-1.2.jar: 1 vulnerabilities (highest severity is: 7.3) - autoclosed - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jstl-1.2.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/javax/servlet/jstl/1.2/jstl-1.2.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/NS-Mend/Java-Demo/commit/f1bb3f06009b6c337a3174fd7f89dce12fe88e3e">f1bb3f06009b6c337a3174fd7f89dce12fe88e3e</a></p></details>
#### <img src='https://whitesource-resources.whitesourcesoftware.com/suggestedVersion.png' width=19 height=20> Mend has checked all newer package trees, and you are on the least vulnerable package!
#### Please note: There might be a version that explicitly solves one or more of the vulnerabilities listed below, but we do not recommend it. For more info about the optional fixes, check the "Details" section below.
|
non_process
|
| 0
|
3,025
| 6,028,389,283
|
IssuesEvent
|
2017-06-08 15:38:36
|
wpninjas/ninja-forms
|
https://api.github.com/repos/wpninjas/ninja-forms
|
closed
|
Unreachable block of code
|
BUG: Minor Dev Question FRONT: Processing Needs Review
|
The following block of code is not reachable. If `$subject` is an array, the method has already returned earlier, at `if( empty( $matches[0] ) ) return parent::replace( $subject );` (for an array, `is_string($subject)` is false, so `$matches` is never populated and the early return always fires).
I'm not sure what the original use case for this code was, or who wrote it, so I can't say exactly what it was supposed to do. My only thought is that the `is_array($subject)` block should be moved before the `is_string($subject)` check, since the method is recursive.
This came up while reviewing https://github.com/wpninjas/ninja-forms/issues/2743
```php
// plugins/ninja-forms/includes/MergeTags/WP.php#L57
// Recursively replace merge tags.
if( is_array( $subject ) ){
foreach( $subject as $i => $s ){
$subject[ $i ] = $this->replace( $s );
}
return $subject;
}
```
The context is
```php
// plugins/ninja-forms/includes/MergeTags/WP.php#L36
public function replace( $subject )
{
/*
* If we are dealing with a post meta merge tag, we need to overwrite the parent replace() method.
*
* Otherwise, we use the parent's method.
*/
/**
* {post_meta:foo} --> meta key is 'foo'
*/
if (is_string($subject)) {
preg_match_all("/{post_meta:(.*?)}/", $subject, $matches );
}
// If not matching merge tags are found, then return early.
if( empty( $matches[0] ) ) return parent::replace( $subject );
// Recursively replace merge tags.
if( is_array( $subject ) ){
foreach( $subject as $i => $s ){
$subject[ $i ] = $this->replace( $s );
}
return $subject;
}
```
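The proposed reordering (recurse into arrays before the string-only match check) can be sketched as follows. This is a hypothetical re-expression in Python rather than the plugin's actual PHP; `replace_tags` and the `[meta:…]` placeholder are made-up names, and returning the subject unchanged stands in for the `parent::replace()` fallback:

```python
import re

# Pattern mirroring the PHP /{post_meta:(.*?)}/ regex.
TAG = re.compile(r"\{post_meta:(.*?)\}")

def replace_tags(subject):
    # Handle the container case FIRST, so an array/list never reaches the
    # "no matches -> return early" branch that made the PHP block dead code.
    if isinstance(subject, list):
        return [replace_tags(s) for s in subject]
    # Non-strings, or strings with no {post_meta:...} tag, fall back
    # unchanged (stand-in for parent::replace()).
    if not isinstance(subject, str) or not TAG.search(subject):
        return subject
    # Replace each tag with a placeholder value for demonstration.
    return TAG.sub(lambda m: f"[meta:{m.group(1)}]", subject)
```

With this ordering, a list subject is recursed into element by element; under the original ordering, the early-return fires before the recursive branch is ever reached.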
|
1.0
|
|
process
|
| 1
|