Dataset schema (per-column type and range of values / string lengths):

| column | type | range / classes |
|---|---|---|
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 7 – 112 |
| repo_url | string | length 36 – 141 |
| action | string | 3 classes |
| title | string | length 1 – 744 |
| labels | string | length 4 – 574 |
| body | string | length 9 – 211k |
| index | string | 10 classes |
| text_combine | string | length 96 – 211k |
| label | string | 2 classes |
| text | string | length 96 – 188k |
| binary_label | int64 | 0 – 1 |
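The schema above suggests `binary_label` is derived from the two-class `label` column (`non_process` → 0, `process` → 1, judging by the rows below). A minimal pandas sketch of that mapping, with made-up rows; the column names follow the schema, everything else is an assumption:

```python
# Hypothetical sketch: derive binary_label from label as the rows below imply.
# The two example rows are invented; only the column names come from the schema.
import pandas as pd

df = pd.DataFrame({
    "type": ["IssuesEvent", "IssuesEvent"],
    "label": ["non_process", "process"],
})
# Map the two label classes to the int64 binary_label column.
df["binary_label"] = (df["label"] == "process").astype(int)
print(df["binary_label"].tolist())  # -> [0, 1]
```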
row: 292,999
id: 25,258,191,487
type: IssuesEvent
created_at: 2022-11-15 20:05:03
repo: eclipse-openj9/openj9
repo_url: https://api.github.com/repos/eclipse-openj9/openj9
action: closed
title: [JDK19] JVMTI framepop02.java#id1 and singlestep01 Segfault
labels: comp:vm test excluded project:loom jdk19 comp:jvmti
body:
`framepop02/framepop02.java#id1` and `SingleStep/singlestep01` pass with the RI. These tests are related to Project Loom. The test failures will only be seen in JDK19+.
Related: #16187.
### Issue
There is a recursive infinite loop. The test registers a JVMTI event callback for `MethodEntry`, and invokes `jvmtiGetThreadInfo` from the `MethodEntry` event callback. `jvmtiGetThreadInfo` loads the `java/lang/Thread$Constants` class for the virtual thread path. While loading this class, the `MethodEntry` event callback is invoked again. Then, we are stuck in an infinite loop: `MethodEntry` event callback -> `jvmtiGetThreadInfo` -> `MethodEntry` event callback -> `jvmtiGetThreadInfo`.
A similar recursive infinite loop is seen in `SingleStep/singlestep01`: `SingleStep` event callback -> `jvmtiGetThreadInfo` -> `SingleStep` event callback -> `jvmtiGetThreadInfo`.
To resolve this issue, I tried adding `java/lang/Thread$Constants` to the `requiredClasses` array in `jclcinit.c`. But `java/lang/Thread$Constants` has a static init block which fails to successfully execute at startup: [Thread.java#L3037-L3065](https://github.com/ibmruntimes/openj9-openjdk-jdk19/blob/4efd563f9c7560c3b69d6dd4fb77dfd27126fa1e/src/java.base/share/classes/java/lang/Thread.java#L3037-L3065).
### Test CMD
```
# FramePop/framepop02
make test TEST="jtreg:test/hotspot/jtreg/serviceability/jvmti/events/FramePop/framepop02" JTREG="JAVA_OPTIONS=--enable-preview -Dvm.continuations=true;VERBOSE=all"
# SingleStep/singlestep01
make test TEST="jtreg:test/hotspot/jtreg/serviceability/jvmti/events/SingleStep/singlestep01/singlestep01.java" JTREG="JAVA_OPTIONS=--enable-preview -Dvm.continuations=true;VERBOSE=all"
```
### Test Output
```
Builder: java.lang.ThreadBuilders$VirtualThreadBuilder@9ffc73c6
Segmentation fault (core dumped)
```
### GDB Native Stack
```
#1246 MethodEntry (jvmti=jvmti@entry=0x7ffbc80a5b88, jni=jni@entry=0x26c300, thr=0x288860, method=method@entry=0x7ffb38009710)
at test/hotspot/jtreg/serviceability/jvmti/events/FramePop/framepop02/libframepop02.cpp:181
#1247 0x00007ffbc50d7ad2 in jvmtiHookMethodEnter (hook=hook@entry=0x7ffbc8018580, eventNum=eventNum@entry=23, eventData=eventData@entry=0x7ffbaa4edf70, userData=userData@entry=0x7ffbc80a5b88)
at /root/openj9-openjdk-jdk/openj9/runtime/jvmti/jvmtiHook.c:334
#1248 0x00007ffbcc36936e in J9HookDispatch (hookInterface=0x7ffbc8018580, taggedEventNum=<optimized out>, eventData=0x7ffbaa4edf70) at /root/openj9-openjdk-jdk/omr/util/hookable/hookable.cpp:235
#1249 0x00007ffbcc865f4d in VM_DebugBytecodeInterpreterCompressed::reportMethodEnter (_pc=<synthetic pointer>: <optimized out>, _sp=<synthetic pointer>: <optimized out>, this=0x7ffbaa4ee1e0)
at /root/openj9-openjdk-jdk/openj9/runtime/vm/BytecodeInterpreter.hpp:1635
#1250 VM_DebugBytecodeInterpreterCompressed::run (this=this@entry=0x7ffbaa4ee1e0, vmThread=<optimized out>) at /root/openj9-openjdk-jdk/openj9/runtime/vm/BytecodeInterpreter.hpp:10304
#1251 0x00007ffbcc864835 in debugBytecodeLoopCompressed (currentThread=<optimized out>) at /root/openj9-openjdk-jdk/openj9/runtime/vm/BytecodeInterpreter.inc:112
#1252 0x00007ffbcc8cbfc2 in c_cInterpreter () at /root/openj9-openjdk-jdk/build/linux-x86_64-server-release/vm/runtime/vm/xcinterp.s:158
#1253 0x00007ffbcc7a411d in sendLoadClass (currentThread=0x7ffbaa4ee320, currentThread@entry=0x26c300, classLoaderObject=0xc15e07e8, classNameObject=0xffe5ce28)
at /root/openj9-openjdk-jdk/openj9/runtime/vm/callin.cpp:467
#1254 0x00007ffbcc7afc7d in callLoadClass (classNotFoundException=<synthetic pointer>, classLoader=0x7ffbc80a18b8, classNameLength=26, className=0x7ffbc510ab12 "java/lang/Thread$Constants",
vmThread=0x26c300) at /root/openj9-openjdk-jdk/openj9/runtime/vm/classsupport.c:703
#1255 arbitratedLoadClass (classNotFoundException=<synthetic pointer>, classLoader=0x7ffbc80a18b8, classNameLength=26, className=0x7ffbc510ab12 "java/lang/Thread$Constants", vmThread=0x26c300)
at /root/openj9-openjdk-jdk/openj9/runtime/vm/classsupport.c:904
#1256 loadNonArrayClass (exception=<synthetic pointer>, options=1, classLoader=0x7ffbc80a18b8, classNameLength=26, className=0x7ffbc510ab12 "java/lang/Thread$Constants", j9module=0x0, vmThread=0x26c300)
at /root/openj9-openjdk-jdk/openj9/runtime/vm/classsupport.c:1103
#1257 internalFindClassInModule (vmThread=vmThread@entry=0x26c300, j9module=j9module@entry=0x0, className=0x7ffbc510ab12 "java/lang/Thread$Constants", classNameLength=classNameLength@entry=26,
classLoader=classLoader@entry=0x7ffbc80a18b8, options=options@entry=1) at /root/openj9-openjdk-jdk/openj9/runtime/vm/classsupport.c:1148
#1258 0x00007ffbcc7b2503 in internalFindClassUTF8 (vmThread=vmThread@entry=0x26c300, className=<optimized out>, classNameLength=classNameLength@entry=26, classLoader=classLoader@entry=0x7ffbc80a18b8,
options=options@entry=1) at /root/openj9-openjdk-jdk/openj9/runtime/vm/classsupport.c:1125
#1259 0x00007ffbcc7d67f5 in findClass (env=0x26c300, name=0x7ffbc510ab12 "java/lang/Thread$Constants") at /root/openj9-openjdk-jdk/openj9/runtime/vm/jnimisc.cpp:520
#1260 0x00007ffbcc7cc180 in gpCheckFindClass (env=<optimized out>, name=<optimized out>) at /root/openj9-openjdk-jdk/openj9/runtime/vm/jnicsup.cpp:335
#1261 0x00007ffbc50e913d in jvmtiGetThreadInfo (env=<optimized out>, thread=0x288910, info_ptr=0x7ffbaa4ee720) at /root/openj9-openjdk-jdk/openj9/runtime/jvmti/jvmtiThread.c:533
#1262 0x00007ffba81e1788 in _jvmtiEnv::GetThreadInfo (info_ptr=0x7ffbaa4ee720, thread=0x288910, this=0x7ffbc80a5b88) at build/linux-x86_64-server-release/support/modules_include/java.base/jvmti.h:1189
#1263 isTestThread (thr=0x288910, jvmti=0x7ffbc80a5b88, jni=0x26c300) at test/hotspot/jtreg/serviceability/jvmti/events/FramePop/framepop02/libframepop02.cpp:75
#1264 MethodEntry (jvmti=jvmti@entry=0x7ffbc80a5b88, jni=jni@entry=0x26c300, thr=0x288910, method=method@entry=0x7ffb38009710)
at test/hotspot/jtreg/serviceability/jvmti/events/FramePop/framepop02/libframepop02.cpp:181
#1265 0x00007ffbc50d7ad2 in jvmtiHookMethodEnter (hook=hook@entry=0x7ffbc8018580, eventNum=eventNum@entry=23, eventData=eventData@entry=0x7ffbaa4eeb30, userData=userData@entry=0x7ffbc80a5b88)
at /root/openj9-openjdk-jdk/openj9/runtime/jvmti/jvmtiHook.c:334
#1266 0x00007ffbcc36936e in J9HookDispatch (hookInterface=0x7ffbc8018580, taggedEventNum=<optimized out>, eventData=0x7ffbaa4eeb30) at /root/openj9-openjdk-jdk/omr/util/hookable/hookable.cpp:235
#1267 0x00007ffbcc865f4d in VM_DebugBytecodeInterpreterCompressed::reportMethodEnter (_pc=<synthetic pointer>: <optimized out>, _sp=<synthetic pointer>: <optimized out>, this=0x7ffbaa4eeda0)
at /root/openj9-openjdk-jdk/openj9/runtime/vm/BytecodeInterpreter.hpp:1635
#1268 VM_DebugBytecodeInterpreterCompressed::run (this=this@entry=0x7ffbaa4eeda0, vmThread=<optimized out>) at /root/openj9-openjdk-jdk/openj9/runtime/vm/BytecodeInterpreter.hpp:10304
#1269 0x00007ffbcc864835 in debugBytecodeLoopCompressed (currentThread=<optimized out>) at /root/openj9-openjdk-jdk/openj9/runtime/vm/BytecodeInterpreter.inc:112
#1270 0x00007ffbcc8cbfc2 in c_cInterpreter () at /root/openj9-openjdk-jdk/build/linux-x86_64-server-release/vm/runtime/vm/xcinterp.s:158
#1271 0x00007ffbcc7a411d in sendLoadClass (currentThread=0x7ffbaa4eeee0, currentThread@entry=0x26c300, classLoaderObject=0xc15e07e8, classNameObject=0xffe5cdc8)
at /root/openj9-openjdk-jdk/openj9/runtime/vm/callin.cpp:467
#1272 0x00007ffbcc7afc7d in callLoadClass (classNotFoundException=<synthetic pointer>, classLoader=0x7ffbc80a18b8, classNameLength=26, className=0x7ffbc510ab12 "java/lang/Thread$Constants",
vmThread=0x26c300) at /root/openj9-openjdk-jdk/openj9/runtime/vm/classsupport.c:703
#1273 arbitratedLoadClass (classNotFoundException=<synthetic pointer>, classLoader=0x7ffbc80a18b8, classNameLength=26, className=0x7ffbc510ab12 "java/lang/Thread$Constants", vmThread=0x26c300)
at /root/openj9-openjdk-jdk/openj9/runtime/vm/classsupport.c:904
#1274 loadNonArrayClass (exception=<synthetic pointer>, options=1, classLoader=0x7ffbc80a18b8, classNameLength=26, className=0x7ffbc510ab12 "java/lang/Thread$Constants", j9module=0x0, vmThread=0x26c300)
at /root/openj9-openjdk-jdk/openj9/runtime/vm/classsupport.c:1103
#1275 internalFindClassInModule (vmThread=vmThread@entry=0x26c300, j9module=j9module@entry=0x0, className=0x7ffbc510ab12 "java/lang/Thread$Constants", classNameLength=classNameLength@entry=26,
classLoader=classLoader@entry=0x7ffbc80a18b8, options=options@entry=1) at /root/openj9-openjdk-jdk/openj9/runtime/vm/classsupport.c:1148
#1276 0x00007ffbcc7b2503 in internalFindClassUTF8 (vmThread=vmThread@entry=0x26c300, className=<optimized out>, classNameLength=classNameLength@entry=26, classLoader=classLoader@entry=0x7ffbc80a18b8,
options=options@entry=1) at /root/openj9-openjdk-jdk/openj9/runtime/vm/classsupport.c:1125
#1277 0x00007ffbcc7d67f5 in findClass (env=0x26c300, name=0x7ffbc510ab12 "java/lang/Thread$Constants") at /root/openj9-openjdk-jdk/openj9/runtime/vm/jnimisc.cpp:520
#1278 0x00007ffbcc7cc180 in gpCheckFindClass (env=<optimized out>, name=<optimized out>) at /root/openj9-openjdk-jdk/openj9/runtime/vm/jnicsup.cpp:335
#1279 0x00007ffbc50e913d in jvmtiGetThreadInfo (env=<optimized out>, thread=0x2889c0, info_ptr=0x7ffbaa4ef2e0) at /root/openj9-openjdk-jdk/openj9/runtime/jvmti/jvmtiThread.c:533
#1280 0x00007ffba81e1788 in _jvmtiEnv::GetThreadInfo (info_ptr=0x7ffbaa4ef2e0, thread=0x2889c0, this=0x7ffbc80a5b88) at build/linux-x86_64-server-release/support/modules_include/java.base/jvmti.h:1189
#1281 isTestThread (thr=0x2889c0, jvmti=0x7ffbc80a5b88, jni=0x26c300) at test/hotspot/jtreg/serviceability/jvmti/events/FramePop/framepop02/libframepop02.cpp:75
#1282 MethodEntry (jvmti=jvmti@entry=0x7ffbc80a5b88, jni=jni@entry=0x26c300, thr=0x2889c0, method=method@entry=0x7ffb38009710)
```
index: 1.0
label: non_process
binary_label: 0
row: 12,676
id: 4,513,659,031
type: IssuesEvent
created_at: 2016-09-04 12:15:54
repo: nextcloud/gallery
repo_url: https://api.github.com/repos/nextcloud/gallery
action: opened
title: GDrive-like grid view
labels: coder wanted enhancement sponsor needed
body:
_From @RoxasShadow on January 8, 2016 0:52_
For me it would be great to have a grid view similar to the Google Drive one, with nicely cropped images that share the same size.
I think it's an elegant and efficient way to look at the image thumbnails.
*Gallery+*
<img width="341" alt="screen shot 2016-01-07 at 12 28 50" src="https://cloud.githubusercontent.com/assets/805144/12169644/4ed1950a-b53a-11e5-805d-df245cab9fd9.png">
*Google Drive*
<img width="1167" alt="screen shot 2016-01-07 at 12 27 45" src="https://cloud.githubusercontent.com/assets/805144/12169715/d1431d9c-b53a-11e5-92d5-0d1ade39b6f6.png">
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/29686884-gdrive-like-grid-view?utm_campaign=plugin&utm_content=tracker%2F9328526&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F9328526&utm_medium=issues&utm_source=github).
</bountysource-plugin>
_Copied from original issue: owncloud/gallery#492_
index: 1.0
label: non_process
binary_label: 0
row: 2,949
id: 5,930,338,028
type: IssuesEvent
created_at: 2017-05-24 00:58:10
repo: triplea-game/triplea
repo_url: https://api.github.com/repos/triplea-game/triplea
action: closed
title: New Issue Labeling process and wiki page
labels: type: process
body:
Creating labels ad-hoc is good, but not having a shared definition of what labels mean is not. We have some notes on the main readme about what labels meant. This was not a complete list.
I copy/pasted our current label set (and created a 'not fixing / ice box' label), and put them in this wiki page: https://github.com/triplea-game/triplea/wiki/Issue-Labelling
So we should feel free to update/add/remove from the label list, as long as we keep the wiki page updated.
I then updated our readme instructions to consolidate and give the link to the labelling wiki: https://github.com/triplea-game/triplea/pull/1390
This ticket is mainly meant as an FYI.
index: 1.0
label: process
binary_label: 1
row: 43,948
id: 5,719,686,767
type: IssuesEvent
created_at: 2017-04-19 22:45:26
repo: brave/browser-laptop
repo_url: https://api.github.com/repos/brave/browser-laptop
action: opened
title: Settings v Preferences mismatch
labels: design feature/about-pages settings
body:
- Did you search for similar issues before submitting this one?
Yes
- Describe the issue you encountered:
I was opening an issue for Sync, and noticed that we use _Settings_ from the top menu:
**_Edit > Settings_** for the `about:preferences` page.
Sync issue opened for reference: https://github.com/brave/sync/issues/79
- Platform (Win7, 8, 10? macOS? Linux distro?):
All (suspected) - need confirmation for macOS.
- Brave Version (revision SHA):
```
Brave: 0.14.2
rev: d4cad892de4c8da087c4efee2e8955caa630675c
Muon: 2.57.10
libchromiumcontent: 57.0.2987.133
V8: 5.7.492.71
Node.js: 7.4.0
Update Channel: dev
os.platform: win32
os.release: 10.0.14393
os.arch: x64
```
- Steps to reproduce:
Captured in this screenshot:

Within the Android browser, this is also referred to as Settings instead of Preferences. I'm not sure how much we can control for naming within the Android Brave browser, but I wanted to match if we're looking to have cross-platform consistency.
We're going to want to make sure we have consistency so the Sync instructions match across platforms.
@bradleyrichter I'm going to assign you on this once since we're talking about UI across platforms. Let me know if there's anything else I can assist with for this.
|
1.0
|
|
non_process
|
| 0
|
46,937
| 13,198,015,237
|
IssuesEvent
|
2020-08-14 01:01:50
|
orenavitov/promoted-builds-plugin
|
https://api.github.com/repos/orenavitov/promoted-builds-plugin
|
opened
|
CVE-2020-2230 (Medium) detected in jenkins-core-2.121.1.jar
|
security vulnerability
|
## CVE-2020-2230 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jenkins-core-2.121.1.jar</b></summary>
<p>Jenkins core code and view files to render HTML.</p>
<p>Path to dependency file: /tmp/ws-scm/promoted-builds-plugin/pom.xml</p>
<p>Path to vulnerable library: epository/org/jenkins-ci/main/jenkins-core/2.121.1/jenkins-core-2.121.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **jenkins-core-2.121.1.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Jenkins 2.251 and earlier, LTS 2.235.3 and earlier does not escape the project naming strategy description, resulting in a stored cross-site scripting (XSS) vulnerability exploitable by users with Overall/Manage permission.
<p>Publish Date: 2020-07-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-2230>CVE-2020-2230</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Adjacent
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.jenkins.io/security/advisory/2020-08-12/">https://www.jenkins.io/security/advisory/2020-08-12/</a></p>
<p>Release Date: 2020-07-21</p>
<p>Fix Resolution: org.jenkins-ci.main:jenkins-core:2.235.4,2.252</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.jenkins-ci.main","packageName":"jenkins-core","packageVersion":"2.121.1","isTransitiveDependency":false,"dependencyTree":"org.jenkins-ci.main:jenkins-core:2.121.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.jenkins-ci.main:jenkins-core:2.235.4,2.252"}],"vulnerabilityIdentifier":"CVE-2020-2230","vulnerabilityDetails":"Jenkins 2.251 and earlier, LTS 2.235.3 and earlier does not escape the project naming strategy description, resulting in a stored cross-site scripting (XSS) vulnerability exploitable by users with Overall/Manage permission.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-2230","cvss3Severity":"medium","cvss3Score":"5.1","cvss3Metrics":{"A":"Low","AC":"High","PR":"Low","S":"Changed","C":"Low","UI":"Required","AV":"Adjacent","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
|
non_process
|
| 0
|
16,542
| 21,567,859,771
|
IssuesEvent
|
2022-05-02 02:31:43
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Custom QNetworkAccessManager for QWebPage segfaults
|
Processing Bug Python Console
|
### What is the bug or the crash?
Using `QWebView`, I needed to accommodate for an authentication procedure where I have to capture some of the javascript responses a web page does. That requires a custom `QNetworkAccessManager` which needs to be connected with the `QWebPage`, see script below (the reason is by default the `QNetworkAccessManager::finished` signal only gets empty responses since `readAll` has already been called by the `QWebPage` at that point).
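The `readAll`-versus-`peek` distinction behind that explanation can be illustrated without Qt at all: a peek-style inspection leaves the data in place for a later consumer, while a destructive read drains the buffer, so any subsequent reader sees nothing. A minimal stdlib sketch (this is plain Python, not Qt code — the buffer stands in for the network reply):

```python
import io

buf = io.BytesIO(b"response body")

# peek-style: inspect without consuming (what reply.peek() does above)
head = buf.getbuffer()[:8].tobytes()
assert head == b"response"

# destructive read: what an earlier readAll() by QWebPage amounts to
consumed = buf.read()
assert consumed == b"response body"

# a second reader now gets nothing -- why a finished() handler sees empty replies
assert buf.read() == b""
```

This is why the issue hooks `readyRead` and calls `peek` instead of relying on the `finished` signal.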
When run within QGIS, that script segfaults for some reason.
However, it's **fine when run standalone when importing from `PyQt5`**, and it segfaults as well when importing from the `qgis.PyQt` namespace. For a standalone run, switch these lines
```python
w = OneLoginWidget()
w.show()
```
with
```python
if __name__ == '__main__':
app = QApplication([])
w = OneLoginWidget()
w.show()
app.exec_()
```
### Steps to reproduce the issue
1. In Processing Toolbox, click the Python dropdown and "Create new script"
2. Paste the below code and hit "Run"
3. Enter your github user/password and login
```python
from typing import Optional
from PyQt5.QtWidgets import QApplication
from PyQt5.QtNetwork import QNetworkAccessManager, QNetworkReply
from PyQt5.QtCore import QUrl
from PyQt5.QtWidgets import QDialog, QVBoxLayout
from PyQt5.QtWebKitWidgets import QWebView, QWebPage
class CustomNam(QNetworkAccessManager):
def __init__(self):
super(CustomNam, self).__init__()
self.reply: Optional[QNetworkReply] = None
def createRequest(self, op, req, device=None):
self.reply = super(CustomNam, self).createRequest(op, req, device)
if op == QNetworkAccessManager.PutOperation:
self.reply.readyRead.connect(self._on_ready_read)
return self.reply
def _on_ready_read(self):
if self.reply:
print(self.reply.peek(self.reply.bytesAvailable()))
class OneLoginWidget(QDialog):
def __init__(self, parent=None):
super(OneLoginWidget, self).__init__(parent)
self.webview = QWebView(self)
self.page = QWebPage(self)
self.page.setNetworkAccessManager(CustomNam())
self.webview.setPage(self.page)
layout = QVBoxLayout()
layout.addWidget(self.webview)
self.setLayout(layout)
self.webview.load(QUrl('https://github.com/login'))
w = OneLoginWidget()
w.show()
```
### Versions
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">
<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" /><style type="text/css">
p, li { white-space: pre-wrap; }
</style></head><body>
QGIS version | 3.24.0-Tisler | QGIS code branch | Release 3.24
-- | -- | -- | --
Compiled against Qt | 5.15.2 | Running against Qt | 5.15.3
Python version | 3.10.2
GDAL/OGR version | 3.4.0
PROJ version | 8.2.0
EPSG Registry database version | v10.038 (2021-10-21)
GEOS version | 3.9.1-CAPI-1.14.2
Compiled against SQLite | 3.37.2 | Running against SQLite | 3.38.0
PDAL version | 2.3.0
PostgreSQL client version | unknown
SpatiaLite version | 5.0.1
QWT version | 6.2.0
QScintilla2 version | 2.13.1
OS version | Manjaro Linux
| | |
Active Python plugins
QuickOSM | 2.0.1
valhalla | 2.2.1
graphio | 1.1.0
plugin_reloader | 0.9.1
tardis | 2.0.0-dev
network_analyst | 0.0.2
quick_map_services | 0.19.29
latlontools | 3.6.3
grassprovider | 2.12.99
sagaprovider | 2.12.99
db_manager | 0.1.20
MetaSearch | 0.3.6
processing | 2.12.99
</body></html>
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [X] I tried with a new QGIS profile
### Additional context
Not all URLs cause a segfault, e.g. https://google.com is ok. Makes me wonder if it's only problematic on (concurrent?) POST/PUT requests.
It seems to be the `createRequest` override. As soon as that is commented out, it works for all URLs. I'm really puzzled about what's causing this.
EDIT: even just overriding `createRequest` with a no-op results in the segfault:
```python
def createRequest(self, op, req, device=None):
return super(CustomNam, self).createRequest(op, req, device)
```
|
1.0
|
|
process
|
| 1
|
79,680
| 9,933,936,249
|
IssuesEvent
|
2019-07-02 13:27:51
|
microsoft/terminal
|
https://api.github.com/repos/microsoft/terminal
|
closed
|
Bug Report
|
Area-Settings Issue-Question Needs-Author-Feedback Product-Terminal Resolution-By-Design
|
# Environment
```none
Windows build number: Microsoft Windows [Version 10.0.18362.207]
Windows Terminal version (if applicable): unsure how to check, fresh install from the Microsoft Store
```
# Steps to reproduce
<!-- A description of how to trigger this bug. -->
Start Windows Terminal.
Attempt to move the window around; it is not possible.
Focusing and unfocusing the Terminal brings it back to default settings.
Close the tab with a middle click; the app crashes.
# Expected behavior
<!-- A description of what you're expecting, possibly containing screenshots or reference material. -->
1) The terminal should be able to move freely
2) The profile.json settings should remain
3) Middle click shouldn't crash the program
# Actual behavior
<!-- What's actually happening? -->
Screen recording of being unable to move the terminal, profile.json settings being lost, crashing on middle click (in order):

profiles.json contents
```
{
"globals" :
{
"alwaysShowTabs" : true,
"defaultProfile" : "{61c54bbd-c2c6-5271-96e7-009a87ff44bf}",
"initialCols" : 120,
"initialRows" : 30,
"keybindings" :
[
{
"command" : "closeTab",
"keys" :
[
"ctrl+w"
]
},
{
"command" : "newTab",
"keys" :
[
"ctrl+t"
]
},
{
"command" : "newTabProfile0",
"keys" :
[
"ctrl+shift+1"
]
},
{
"command" : "newTabProfile1",
"keys" :
[
"ctrl+shift+2"
]
},
{
"command" : "newTabProfile2",
"keys" :
[
"ctrl+shift+3"
]
},
{
"command" : "newTabProfile3",
"keys" :
[
"ctrl+shift+4"
]
},
{
"command" : "newTabProfile4",
"keys" :
[
"ctrl+shift+5"
]
},
{
"command" : "newTabProfile5",
"keys" :
[
"ctrl+shift+6"
]
},
{
"command" : "newTabProfile6",
"keys" :
[
"ctrl+shift+7"
]
},
{
"command" : "newTabProfile7",
"keys" :
[
"ctrl+shift+8"
]
},
{
"command" : "newTabProfile8",
"keys" :
[
"ctrl+shift+9"
]
},
{
"command" : "nextTab",
"keys" :
[
"ctrl+tab"
]
},
{
"command" : "openSettings",
"keys" :
[
"ctrl+,"
]
},
{
"command" : "prevTab",
"keys" :
[
"ctrl+shift+tab"
]
},
{
"command" : "scrollDown",
"keys" :
[
"ctrl+shift+down"
]
},
{
"command" : "scrollDownPage",
"keys" :
[
"ctrl+shift+pgdn"
]
},
{
"command" : "scrollUp",
"keys" :
[
"ctrl+shift+up"
]
},
{
"command" : "scrollUpPage",
"keys" :
[
"ctrl+shift+pgup"
]
},
{
"command" : "switchToTab0",
"keys" :
[
"alt+1"
]
},
{
"command" : "switchToTab1",
"keys" :
[
"alt+2"
]
},
{
"command" : "switchToTab2",
"keys" :
[
"alt+3"
]
},
{
"command" : "switchToTab3",
"keys" :
[
"alt+4"
]
},
{
"command" : "switchToTab4",
"keys" :
[
"alt+5"
]
},
{
"command" : "switchToTab5",
"keys" :
[
"alt+6"
]
},
{
"command" : "switchToTab6",
"keys" :
[
"alt+7"
]
},
{
"command" : "switchToTab7",
"keys" :
[
"alt+8"
]
},
{
"command" : "switchToTab8",
"keys" :
[
"alt+9"
]
}
],
"requestedTheme" : "system",
"showTabsInTitlebar" : true,
"showTerminalTitleInTitlebar" : true
},
"profiles" :
[
{
"acrylicOpacity" : 0.5,
"background" : "#012456",
"closeOnExit" : true,
"colorScheme" : "Campbell",
"commandline" : "powershell.exe",
"cursorColor" : "#FFFFFF",
"cursorShape" : "bar",
"fontFace" : "Consolas",
"fontSize" : 10,
"guid" : "{61c54bbd-c2c6-5271-96e7-009a87ff44bf}",
"historySize" : 9001,
"icon" : "ms-appx:///ProfileIcons/{61c54bbd-c2c6-5271-96e7-009a87ff44bf}.png",
"name" : "Windows PowerShell",
"padding" : "0, 0, 0, 0",
"snapOnInput" : true,
"startingDirectory" : "%USERPROFILE%",
"useAcrylic" : true
},
{
"acrylicOpacity" : 0.75,
"closeOnExit" : true,
"colorScheme" : "Campbell",
"commandline" : "cmd.exe",
"cursorColor" : "#FFFFFF",
"cursorShape" : "bar",
"fontFace" : "Consolas",
"fontSize" : 10,
"guid" : "{0caa0dad-35be-5f56-a8ff-afceeeaa6101}",
"historySize" : 9001,
"icon" : "ms-appx:///ProfileIcons/{0caa0dad-35be-5f56-a8ff-afceeeaa6101}.png",
"name" : "cmd",
"padding" : "0, 0, 0, 0",
"snapOnInput" : true,
"startingDirectory" : "%USERPROFILE%",
"useAcrylic" : true
},
{
"acrylicOpacity" : 0.5,
"closeOnExit" : true,
"colorScheme" : "Campbell",
"commandline" : "wsl.exe -d Ubuntu-18.04",
"cursorColor" : "#FFFFFF",
"cursorShape" : "bar",
"fontFace" : "Consolas",
"fontSize" : 10,
"guid" : "{c6eaf9f4-32a7-5fdc-b5cf-066e8a4b1e40}",
"historySize" : 9001,
"icon" : "ms-appx:///ProfileIcons/{9acb9455-ca41-5af7-950f-6bca1bc9722f}.png",
"name" : "Ubuntu-18.04",
"padding" : "0, 0, 0, 0",
"snapOnInput" : true,
"useAcrylic" : true
}
],
"schemes" :
[
{
"background" : "#0C0C0C",
"black" : "#0C0C0C",
"blue" : "#0037DA",
"brightBlack" : "#767676",
"brightBlue" : "#3B78FF",
"brightCyan" : "#61D6D6",
"brightGreen" : "#16C60C",
"brightPurple" : "#B4009E",
"brightRed" : "#E74856",
"brightWhite" : "#F2F2F2",
"brightYellow" : "#F9F1A5",
"cyan" : "#3A96DD",
"foreground" : "#F2F2F2",
"green" : "#13A10E",
"name" : "Campbell",
"purple" : "#881798",
"red" : "#C50F1F",
"white" : "#CCCCCC",
"yellow" : "#C19C00"
},
{
"background" : "#282C34",
"black" : "#282C34",
"blue" : "#61AFEF",
"brightBlack" : "#5A6374",
"brightBlue" : "#61AFEF",
"brightCyan" : "#56B6C2",
"brightGreen" : "#98C379",
"brightPurple" : "#C678DD",
"brightRed" : "#E06C75",
"brightWhite" : "#DCDFE4",
"brightYellow" : "#E5C07B",
"cyan" : "#56B6C2",
"foreground" : "#DCDFE4",
"green" : "#98C379",
"name" : "One Half Dark",
"purple" : "#C678DD",
"red" : "#E06C75",
"white" : "#DCDFE4",
"yellow" : "#E5C07B"
},
{
"background" : "#FAFAFA",
"black" : "#383A42",
"blue" : "#0184BC",
"brightBlack" : "#4F525D",
"brightBlue" : "#61AFEF",
"brightCyan" : "#56B5C1",
"brightGreen" : "#98C379",
"brightPurple" : "#C577DD",
"brightRed" : "#DF6C75",
"brightWhite" : "#FFFFFF",
"brightYellow" : "#E4C07A",
"cyan" : "#0997B3",
"foreground" : "#383A42",
"green" : "#50A14F",
"name" : "One Half Light",
"purple" : "#A626A4",
"red" : "#E45649",
"white" : "#FAFAFA",
"yellow" : "#C18301"
},
{
"background" : "#073642",
"black" : "#073642",
"blue" : "#268BD2",
"brightBlack" : "#002B36",
"brightBlue" : "#839496",
"brightCyan" : "#93A1A1",
"brightGreen" : "#586E75",
"brightPurple" : "#6C71C4",
"brightRed" : "#CB4B16",
"brightWhite" : "#FDF6E3",
"brightYellow" : "#657B83",
"cyan" : "#2AA198",
"foreground" : "#FDF6E3",
"green" : "#859900",
"name" : "Solarized Dark",
"purple" : "#D33682",
"red" : "#D30102",
"white" : "#EEE8D5",
"yellow" : "#B58900"
},
{
"background" : "#FDF6E3",
"black" : "#073642",
"blue" : "#268BD2",
"brightBlack" : "#002B36",
"brightBlue" : "#839496",
"brightCyan" : "#93A1A1",
"brightGreen" : "#586E75",
"brightPurple" : "#6C71C4",
"brightRed" : "#CB4B16",
"brightWhite" : "#FDF6E3",
"brightYellow" : "#657B83",
"cyan" : "#2AA198",
"foreground" : "#073642",
"green" : "#859900",
"name" : "Solarized Light",
"purple" : "#D33682",
"red" : "#D30102",
"white" : "#EEE8D5",
"yellow" : "#B58900"
}
]
}
```
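The `profiles.json` above follows a simple, regular shape for keybindings: each entry pairs a `command` string with a non-empty `keys` list. A hypothetical sanity check using only Python's stdlib (this helper is not part of Windows Terminal; the field names are taken from the file above):

```python
import json

# Sample mirroring the structure of the keybindings array above.
sample = '''
{"keybindings": [
  {"command": "closeTab", "keys": ["ctrl+w"]},
  {"command": "newTab",   "keys": ["ctrl+t"]}
]}
'''

data = json.loads(sample)
for binding in data["keybindings"]:
    # every binding must name a command and at least one key chord
    assert isinstance(binding["command"], str)
    assert isinstance(binding["keys"], list) and binding["keys"]
print(len(data["keybindings"]))  # → 2
```

Running a check like this against a hand-edited `profiles.json` catches malformed entries before the app silently ignores them.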
|
1.0
|
"name" : "Ubuntu-18.04",
"padding" : "0, 0, 0, 0",
"snapOnInput" : true,
"useAcrylic" : true
}
],
"schemes" :
[
{
"background" : "#0C0C0C",
"black" : "#0C0C0C",
"blue" : "#0037DA",
"brightBlack" : "#767676",
"brightBlue" : "#3B78FF",
"brightCyan" : "#61D6D6",
"brightGreen" : "#16C60C",
"brightPurple" : "#B4009E",
"brightRed" : "#E74856",
"brightWhite" : "#F2F2F2",
"brightYellow" : "#F9F1A5",
"cyan" : "#3A96DD",
"foreground" : "#F2F2F2",
"green" : "#13A10E",
"name" : "Campbell",
"purple" : "#881798",
"red" : "#C50F1F",
"white" : "#CCCCCC",
"yellow" : "#C19C00"
},
{
"background" : "#282C34",
"black" : "#282C34",
"blue" : "#61AFEF",
"brightBlack" : "#5A6374",
"brightBlue" : "#61AFEF",
"brightCyan" : "#56B6C2",
"brightGreen" : "#98C379",
"brightPurple" : "#C678DD",
"brightRed" : "#E06C75",
"brightWhite" : "#DCDFE4",
"brightYellow" : "#E5C07B",
"cyan" : "#56B6C2",
"foreground" : "#DCDFE4",
"green" : "#98C379",
"name" : "One Half Dark",
"purple" : "#C678DD",
"red" : "#E06C75",
"white" : "#DCDFE4",
"yellow" : "#E5C07B"
},
{
"background" : "#FAFAFA",
"black" : "#383A42",
"blue" : "#0184BC",
"brightBlack" : "#4F525D",
"brightBlue" : "#61AFEF",
"brightCyan" : "#56B5C1",
"brightGreen" : "#98C379",
"brightPurple" : "#C577DD",
"brightRed" : "#DF6C75",
"brightWhite" : "#FFFFFF",
"brightYellow" : "#E4C07A",
"cyan" : "#0997B3",
"foreground" : "#383A42",
"green" : "#50A14F",
"name" : "One Half Light",
"purple" : "#A626A4",
"red" : "#E45649",
"white" : "#FAFAFA",
"yellow" : "#C18301"
},
{
"background" : "#073642",
"black" : "#073642",
"blue" : "#268BD2",
"brightBlack" : "#002B36",
"brightBlue" : "#839496",
"brightCyan" : "#93A1A1",
"brightGreen" : "#586E75",
"brightPurple" : "#6C71C4",
"brightRed" : "#CB4B16",
"brightWhite" : "#FDF6E3",
"brightYellow" : "#657B83",
"cyan" : "#2AA198",
"foreground" : "#FDF6E3",
"green" : "#859900",
"name" : "Solarized Dark",
"purple" : "#D33682",
"red" : "#D30102",
"white" : "#EEE8D5",
"yellow" : "#B58900"
},
{
"background" : "#FDF6E3",
"black" : "#073642",
"blue" : "#268BD2",
"brightBlack" : "#002B36",
"brightBlue" : "#839496",
"brightCyan" : "#93A1A1",
"brightGreen" : "#586E75",
"brightPurple" : "#6C71C4",
"brightRed" : "#CB4B16",
"brightWhite" : "#FDF6E3",
"brightYellow" : "#657B83",
"cyan" : "#2AA198",
"foreground" : "#073642",
"green" : "#859900",
"name" : "Solarized Light",
"purple" : "#D33682",
"red" : "#D30102",
"white" : "#EEE8D5",
"yellow" : "#B58900"
}
]
}
```
|
non_process
|
bug report environment none windows build number microsoft windows windows terminal version if applicable unsure how to check fresh install from the microsoft store steps to reproduce start windows terminal attempt to move the window around it is not possible focusing and unfocusing the terminal brings it back to default settings close the tap with a middle click the app crashes expected behavior the terminal should be able to move freely the profile json settings should remain middle click shouldn t crash the program actual behavior screen recording of being unable to move the terminal profile json settings being lost crashing on middle click in order profiles json contents globals alwaysshowtabs true defaultprofile initialcols initialrows keybindings command closetab keys ctrl w command newtab keys ctrl t command keys ctrl shift command keys ctrl shift command keys ctrl shift command keys ctrl shift command keys ctrl shift command keys ctrl shift command keys ctrl shift command keys ctrl shift command keys ctrl shift command nexttab keys ctrl tab command opensettings keys ctrl command prevtab keys ctrl shift tab command scrolldown keys ctrl shift down command scrolldownpage keys ctrl shift pgdn command scrollup keys ctrl shift up command scrolluppage keys ctrl shift pgup command keys alt command keys alt command keys alt command keys alt command keys alt command keys alt command keys alt command keys alt command keys alt requestedtheme system showtabsintitlebar true showterminaltitleintitlebar true profiles acrylicopacity background closeonexit true colorscheme campbell commandline powershell exe cursorcolor ffffff cursorshape bar fontface consolas fontsize guid historysize icon ms appx profileicons png name windows powershell padding snaponinput true startingdirectory userprofile useacrylic true acrylicopacity closeonexit true colorscheme campbell commandline cmd exe cursorcolor ffffff cursorshape bar fontface consolas fontsize guid historysize icon ms appx 
profileicons png name cmd padding snaponinput true startingdirectory userprofile useacrylic true acrylicopacity closeonexit true colorscheme campbell commandline wsl exe d ubuntu cursorcolor ffffff cursorshape bar fontface consolas fontsize guid historysize icon ms appx profileicons png name ubuntu padding snaponinput true useacrylic true schemes background black blue brightblack brightblue brightcyan brightgreen brightpurple brightred brightwhite brightyellow cyan foreground green name campbell purple red white cccccc yellow background black blue brightblack brightblue brightcyan brightgreen brightpurple brightred brightwhite brightyellow cyan foreground green name one half dark purple red white yellow background fafafa black blue brightblack brightblue brightcyan brightgreen brightpurple brightred brightwhite ffffff brightyellow cyan foreground green name one half light purple red white fafafa yellow background black blue brightblack brightblue brightcyan brightgreen brightpurple brightred brightwhite brightyellow cyan foreground green name solarized dark purple red white yellow background black blue brightblack brightblue brightcyan brightgreen brightpurple brightred brightwhite brightyellow cyan foreground green name solarized light purple red white yellow
| 0
|
8,243
| 11,420,168,833
|
IssuesEvent
|
2020-02-03 09:35:09
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
Merge GO:0044054 rounding by symbiont of host cells into GO:0052039 modification by symbiont of host cytoskeleton
|
multi-species process
|
GO:0044054 rounding by symbiont of host cells -> 1 annotation
is a read-out of GO:0052039 modification by symbiont of host cytoskeleton (1 annotation)
These two terms will be merged.
@RLovering just informing you since the annotation to 'GO:0044054 rounding by symbiont of host cells ' is from your group.
Thanks, Pascale
|
1.0
|
Merge GO:0044054 rounding by symbiont of host cells into GO:0052039 modification by symbiont of host cytoskeleton - GO:0044054 rounding by symbiont of host cells -> 1 annotation
is a read-out of GO:0052039 modification by symbiont of host cytoskeleton (1 annotation)
These two terms will be merged.
@RLovering just informing you since the annotation to 'GO:0044054 rounding by symbiont of host cells ' is from your group.
Thanks, Pascale
|
process
|
merge go rounding by symbiont of host cells into go modification by symbiont of host cytoskeleton go rounding by symbiont of host cells annotation is a read out of go modification by symbiont of host cytoskeleton annotation these two terms will be merged rlovering just informing you since the annotation to go rounding by symbiont of host cells is from your group thanks pascale
| 1
|
148,378
| 5,680,706,200
|
IssuesEvent
|
2017-04-13 02:30:34
|
meetinghouse/cms
|
https://api.github.com/repos/meetinghouse/cms
|
opened
|
Dark Theme: Add "active"/"not-active" classes to /projects/tags and /posts/tags sub-menu items.
|
Medium Priority
|
@vivek-chaudhari When you click on a sub-menu item for projects (portfolios) or blog and the page loads, there is no "active" class on the active menu item. Can you add the active/not-active classes functionality to those sub-menus on dark theme like it is on the main menu?
You can see the projects sub-items on Steve's site here: http://revrbacoustics.corbettresearchgroup.com/projects/tags/Bars
And, of course, the posts here: http://shlandscapesitework.com/posts
Thanks,
John
|
1.0
|
Dark Theme: Add "active"/"not-active" classes to /projects/tags and /posts/tags sub-menu items. - @vivek-chaudhari When you click on a sub-menu item for projects (portfolios) or blog and the page loads, there is no "active" class on the active menu item. Can you add the active/not-active classes functionality to those sub-menus on dark theme like it is on the main menu?
You can see the projects sub-items on Steve's site here: http://revrbacoustics.corbettresearchgroup.com/projects/tags/Bars
And, of course, the posts here: http://shlandscapesitework.com/posts
Thanks,
John
|
non_process
|
dark theme add active not active classes to projects tags and posts tags sub menu items vivek chaudhari when you click on a sub menu item for projects portfolios or blog and the page loads there is no active class on the active menu item can you add the active not active classes functionality to those sub menus on dark theme like it is on the main menu you can see the projects sub items on steve s site here and of course the posts here thanks john
| 0
|
7,756
| 10,867,564,318
|
IssuesEvent
|
2019-11-15 00:23:34
|
googleapis/nodejs-containeranalysis
|
https://api.github.com/repos/googleapis/nodejs-containeranalysis
|
closed
|
Release GA
|
type: process
|
Package name: `@google-cloud/containeranalysis`
Current release: **beta**
Proposed release: **GA**
## Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
## Required
- [x] 28 days elapsed since last beta release with new API surface
- [x] Server API is GA
- [x] Package API is stable, and we can commit to backward compatibility
- [x] All dependencies are GA
## Optional
- [x] Most common / important scenarios have descriptive samples
- [x] Public manual methods have at least one usage sample each (excluding overloads)
- [x] Per-API README includes a full description of the API
- [x] Per-API README contains at least one “getting started” sample using the most common API scenario
- [ ] Manual code has been reviewed by API producer
- [ ] Manual code has been reviewed by a DPE responsible for samples
- [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
|
1.0
|
Release GA - Package name: `@google-cloud/containeranalysis`
Current release: **beta**
Proposed release: **GA**
## Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
## Required
- [x] 28 days elapsed since last beta release with new API surface
- [x] Server API is GA
- [x] Package API is stable, and we can commit to backward compatibility
- [x] All dependencies are GA
## Optional
- [x] Most common / important scenarios have descriptive samples
- [x] Public manual methods have at least one usage sample each (excluding overloads)
- [x] Per-API README includes a full description of the API
- [x] Per-API README contains at least one “getting started” sample using the most common API scenario
- [ ] Manual code has been reviewed by API producer
- [ ] Manual code has been reviewed by a DPE responsible for samples
- [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
|
process
|
release ga package name google cloud containeranalysis current release beta proposed release ga instructions check the lists below adding tests documentation as required once all the required boxes are ticked please create a release and close this issue required days elapsed since last beta release with new api surface server api is ga package api is stable and we can commit to backward compatibility all dependencies are ga optional most common important scenarios have descriptive samples public manual methods have at least one usage sample each excluding overloads per api readme includes a full description of the api per api readme contains at least one “getting started” sample using the most common api scenario manual code has been reviewed by api producer manual code has been reviewed by a dpe responsible for samples client libraries page is added to the product documentation in apis reference section of the product s documentation on cloud site
| 1
|
121,426
| 10,168,139,398
|
IssuesEvent
|
2019-08-07 20:02:01
|
davissong/first_reposit
|
https://api.github.com/repos/davissong/first_reposit
|
opened
|
SQA-FTest-Learning Markdown from template
|
test
|
<h3>Tests US: # </h3>
Description: trying to use markdown to show how we can write test scripts from GitHub
Test Cases (External) Link to Test Cases, <h1>trying to replace this</h2>
Where executed from:
- [x] GFE
- [x] CAG
- [x] Both UI and API are on same laptop
*Show Result: via labels like the Pass/Fail label*
|
1.0
|
SQA-FTest-Learning Markdown from template - <h3>Tests US: # </h3>
Description: trying to use markdown to show how we can write test scripts from GitHub
Test Cases (External) Link to Test Cases, <h1>trying to replace this</h2>
Where executed from:
- [x] GFE
- [x] CAG
- [x] Both UI and API are on same laptop
*Show Result: via labels like the Pass/Fail label*
|
non_process
|
sqa ftest learning markdown from template tests us description trying to use markdown to show how we can write test scripts from github test cases external link to test cases trying to replace this where executed from gfe cag both ui and api are on same laptop show result via labels like the pass fail label
| 0
|
551,701
| 16,178,197,157
|
IssuesEvent
|
2021-05-03 10:23:38
|
DIT112-V21/group-09
|
https://api.github.com/repos/DIT112-V21/group-09
|
closed
|
Release and test alpha-1 version
|
JS enhancement high-priority project sprint-2
|
## Objectives
- Finalize the Frontend app - [Milestone 1](https://github.com/DIT112-V21/group-09/milestone/1)
- Setup ElectronJS build workflow and initialize
- Automatically build and deploy alpha-1 release for macOS, Windows and Ubuntu
## Target
- By mid-week around Apr 28th, 2021
- At the latest by the end of Sprint 2
## Next steps after release
- Improve build quality
- Stabilize dependencies and script installations
- Deploy further automated test cases for unit testing
|
1.0
|
Release and test alpha-1 version - ## Objectives
- Finalize the Frontend app - [Milestone 1](https://github.com/DIT112-V21/group-09/milestone/1)
- Setup ElectronJS build workflow and initialize
- Automatically build and deploy alpha-1 release for macOS, Windows and Ubuntu
## Target
- By mid-week around Apr 28th, 2021
- At the latest by the end of Sprint 2
## Next steps after release
- Improve build quality
- Stabilize dependencies and script installations
- Deploy further automated test cases for unit testing
|
non_process
|
release and test alpha version objectives finalize the frontend app setup electronjs build workflow and initialize automatically build and deploy alpha release for macos windows and ubuntu target by mid week around apr at latest by the end of sprint next steps after release improve build quality stabilize dependencies and script installations deploy further automated test cases for unit testing
| 0
|
135,573
| 18,714,904,476
|
IssuesEvent
|
2021-11-03 02:18:27
|
ChoeMinji/react
|
https://api.github.com/repos/ChoeMinji/react
|
opened
|
CVE-2020-7793 (High) detected in multiple libraries
|
security vulnerability
|
## CVE-2020-7793 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ua-parser-js-0.7.22.tgz</b>, <b>ua-parser-js-0.7.12.tgz</b>, <b>ua-parser-js-0.7.20.tgz</b></p></summary>
<p>
<details><summary><b>ua-parser-js-0.7.22.tgz</b></p></summary>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.22.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.22.tgz</a></p>
<p>Path to dependency file: react/fixtures/legacy-jsx-runtimes/react-14/package.json</p>
<p>Path to vulnerable library: react/fixtures/legacy-jsx-runtimes/react-14/node_modules/ua-parser-js/package.json,react/fixtures/legacy-jsx-runtimes/react-15/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- react-dom-15.6.2.tgz (Root Library)
- fbjs-0.8.17.tgz
- :x: **ua-parser-js-0.7.22.tgz** (Vulnerable Library)
</details>
<details><summary><b>ua-parser-js-0.7.12.tgz</b></p></summary>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.12.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.12.tgz</a></p>
<p>Path to dependency file: react/fixtures/fiber-debugger/package.json</p>
<p>Path to vulnerable library: react/fixtures/fiber-debugger/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- react-motion-0.5.0.tgz (Root Library)
- prop-types-15.5.10.tgz
- fbjs-0.8.12.tgz
- :x: **ua-parser-js-0.7.12.tgz** (Vulnerable Library)
</details>
<details><summary><b>ua-parser-js-0.7.20.tgz</b></p></summary>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.20.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.20.tgz</a></p>
<p>Path to dependency file: react/package.json</p>
<p>Path to vulnerable library: react/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- react-devtools-shell-0.0.0.tgz (Root Library)
- react-native-web-0.0.0-26873b469.tgz
- fbjs-1.0.0.tgz
- :x: **ua-parser-js-0.7.20.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/ChoeMinji/react/commit/cfdac8a3b655e30ad4724d1e0f6910d3ca3c2b5e">cfdac8a3b655e30ad4724d1e0f6910d3ca3c2b5e</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package ua-parser-js before 0.7.23 is vulnerable to Regular Expression Denial of Service (ReDoS) in multiple regexes (see linked commit for more info).
<p>Publish Date: 2020-12-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7793>CVE-2020-7793</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/faisalman/ua-parser-js/commit/6d1f26df051ba681463ef109d36c9cf0f7e32b18">https://github.com/faisalman/ua-parser-js/commit/6d1f26df051ba681463ef109d36c9cf0f7e32b18</a></p>
<p>Release Date: 2020-12-11</p>
<p>Fix Resolution: 0.7.23</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-7793 (High) detected in multiple libraries - ## CVE-2020-7793 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ua-parser-js-0.7.22.tgz</b>, <b>ua-parser-js-0.7.12.tgz</b>, <b>ua-parser-js-0.7.20.tgz</b></p></summary>
<p>
<details><summary><b>ua-parser-js-0.7.22.tgz</b></p></summary>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.22.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.22.tgz</a></p>
<p>Path to dependency file: react/fixtures/legacy-jsx-runtimes/react-14/package.json</p>
<p>Path to vulnerable library: react/fixtures/legacy-jsx-runtimes/react-14/node_modules/ua-parser-js/package.json,react/fixtures/legacy-jsx-runtimes/react-15/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- react-dom-15.6.2.tgz (Root Library)
- fbjs-0.8.17.tgz
- :x: **ua-parser-js-0.7.22.tgz** (Vulnerable Library)
</details>
<details><summary><b>ua-parser-js-0.7.12.tgz</b></p></summary>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.12.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.12.tgz</a></p>
<p>Path to dependency file: react/fixtures/fiber-debugger/package.json</p>
<p>Path to vulnerable library: react/fixtures/fiber-debugger/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- react-motion-0.5.0.tgz (Root Library)
- prop-types-15.5.10.tgz
- fbjs-0.8.12.tgz
- :x: **ua-parser-js-0.7.12.tgz** (Vulnerable Library)
</details>
<details><summary><b>ua-parser-js-0.7.20.tgz</b></p></summary>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.20.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.20.tgz</a></p>
<p>Path to dependency file: react/package.json</p>
<p>Path to vulnerable library: react/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- react-devtools-shell-0.0.0.tgz (Root Library)
- react-native-web-0.0.0-26873b469.tgz
- fbjs-1.0.0.tgz
- :x: **ua-parser-js-0.7.20.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/ChoeMinji/react/commit/cfdac8a3b655e30ad4724d1e0f6910d3ca3c2b5e">cfdac8a3b655e30ad4724d1e0f6910d3ca3c2b5e</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package ua-parser-js before 0.7.23 is vulnerable to Regular Expression Denial of Service (ReDoS) in multiple regexes (see linked commit for more info).
<p>Publish Date: 2020-12-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7793>CVE-2020-7793</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/faisalman/ua-parser-js/commit/6d1f26df051ba681463ef109d36c9cf0f7e32b18">https://github.com/faisalman/ua-parser-js/commit/6d1f26df051ba681463ef109d36c9cf0f7e32b18</a></p>
<p>Release Date: 2020-12-11</p>
<p>Fix Resolution: 0.7.23</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries ua parser js tgz ua parser js tgz ua parser js tgz ua parser js tgz lightweight javascript based user agent string parser library home page a href path to dependency file react fixtures legacy jsx runtimes react package json path to vulnerable library react fixtures legacy jsx runtimes react node modules ua parser js package json react fixtures legacy jsx runtimes react node modules ua parser js package json dependency hierarchy react dom tgz root library fbjs tgz x ua parser js tgz vulnerable library ua parser js tgz lightweight javascript based user agent string parser library home page a href path to dependency file react fixtures fiber debugger package json path to vulnerable library react fixtures fiber debugger node modules ua parser js package json dependency hierarchy react motion tgz root library prop types tgz fbjs tgz x ua parser js tgz vulnerable library ua parser js tgz lightweight javascript based user agent string parser library home page a href path to dependency file react package json path to vulnerable library react node modules ua parser js package json dependency hierarchy react devtools shell tgz root library react native web tgz fbjs tgz x ua parser js tgz vulnerable library found in head commit a href found in base branch main vulnerability details the package ua parser js before are vulnerable to regular expression denial of service redos in multiple regexes see linked commit for more info publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
14,620
| 17,766,639,947
|
IssuesEvent
|
2021-08-30 08:23:07
|
googleapis/teeny-request
|
https://api.github.com/repos/googleapis/teeny-request
|
reopened
|
Dependency Dashboard
|
type: process
|
This issue provides visibility into Renovate updates and their statuses. [Learn more](https://docs.renovatebot.com/key-concepts/dashboard/)
## Awaiting Schedule
These updates are awaiting their schedule. Click on a checkbox to get an update now.
- [ ] <!-- unschedule-branch=renovate/actions-setup-node-2.x -->chore(deps): update actions/setup-node action to v2
## Ignored or Blocked
These are blocked by an existing closed PR and will not be recreated unless you click a checkbox below.
- [ ] <!-- recreate-branch=renovate/mocha-9.x -->[chore(deps): update dependency mocha to v9](../pull/230) (`mocha`, `@types/mocha`)
- [ ] <!-- recreate-branch=renovate/sinon-11.x -->[chore(deps): update dependency sinon to v11](../pull/228)
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
1.0
|
Dependency Dashboard - This issue provides visibility into Renovate updates and their statuses. [Learn more](https://docs.renovatebot.com/key-concepts/dashboard/)
## Awaiting Schedule
These updates are awaiting their schedule. Click on a checkbox to get an update now.
- [ ] <!-- unschedule-branch=renovate/actions-setup-node-2.x -->chore(deps): update actions/setup-node action to v2
## Ignored or Blocked
These are blocked by an existing closed PR and will not be recreated unless you click a checkbox below.
- [ ] <!-- recreate-branch=renovate/mocha-9.x -->[chore(deps): update dependency mocha to v9](../pull/230) (`mocha`, `@types/mocha`)
- [ ] <!-- recreate-branch=renovate/sinon-11.x -->[chore(deps): update dependency sinon to v11](../pull/228)
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
process
|
dependency dashboard this issue provides visibility into renovate updates and their statuses awaiting schedule these updates are awaiting their schedule click on a checkbox to get an update now chore deps update actions setup node action to ignored or blocked these are blocked by an existing closed pr and will not be recreated unless you click a checkbox below pull mocha types mocha pull check this box to trigger a request for renovate to run again on this repository
| 1
|
189,633
| 22,047,080,635
|
IssuesEvent
|
2022-05-30 03:51:13
|
madhans23/linux-4.1.15
|
https://api.github.com/repos/madhans23/linux-4.1.15
|
closed
|
CVE-2019-19037 (Medium) detected in linux-stable-rtv4.1.33 - autoclosed
|
security vulnerability
|
## CVE-2019-19037 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/madhans23/linux-4.1.15/commit/f9d19044b0eef1965f9bc412d7d9e579b74ec968">f9d19044b0eef1965f9bc412d7d9e579b74ec968</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/ext4/namei.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/ext4/namei.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ext4_empty_dir in fs/ext4/namei.c in the Linux kernel through 5.3.12 allows a NULL pointer dereference because ext4_read_dirblock(inode,0,DIRENT_HTREE) can be zero.
WhiteSource Note: After conducting further research, WhiteSource has determined that versions v2.6.30-rc1-v4.9.207, v4.10-rc1-v4.14.160, v4.15-rc1--v4.19.91, v5.0-rc1--v5.4.6 and v5.5-rc1--v5.5-rc2 of Linux Kernel are vulnerable to CVE-2019-19037.
<p>Publish Date: 2019-11-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-19037>CVE-2019-19037</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2019-19037">https://www.linuxkernelcves.com/cves/CVE-2019-19037</a></p>
<p>Release Date: 2019-11-21</p>
<p>Fix Resolution: v4.9.208, v4.14.161, v4.19.92, v5.4.7, v5.5-rc3,</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-19037 (Medium) detected in linux-stable-rtv4.1.33 - autoclosed - ## CVE-2019-19037 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/madhans23/linux-4.1.15/commit/f9d19044b0eef1965f9bc412d7d9e579b74ec968">f9d19044b0eef1965f9bc412d7d9e579b74ec968</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/ext4/namei.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/ext4/namei.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ext4_empty_dir in fs/ext4/namei.c in the Linux kernel through 5.3.12 allows a NULL pointer dereference because ext4_read_dirblock(inode,0,DIRENT_HTREE) can be zero.
WhiteSource Note: After conducting further research, WhiteSource has determined that versions v2.6.30-rc1-v4.9.207, v4.10-rc1-v4.14.160, v4.15-rc1--v4.19.91, v5.0-rc1--v5.4.6 and v5.5-rc1--v5.5-rc2 of Linux Kernel are vulnerable to CVE-2019-19037.
<p>Publish Date: 2019-11-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-19037>CVE-2019-19037</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2019-19037">https://www.linuxkernelcves.com/cves/CVE-2019-19037</a></p>
<p>Release Date: 2019-11-21</p>
<p>Fix Resolution: v4.9.208, v4.14.161, v4.19.92, v5.4.7, v5.5-rc3,</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in linux stable autoclosed cve medium severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in head commit a href found in base branch master vulnerable source files fs namei c fs namei c vulnerability details empty dir in fs namei c in the linux kernel through allows a null pointer dereference because read dirblock inode dirent htree can be zero whitesource note after conducting further research whitesource has determined that versions and of linux kernel are vulnerable to cve publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
20,904
| 27,746,311,852
|
IssuesEvent
|
2023-03-15 17:16:31
|
openxla/community
|
https://api.github.com/repos/openxla/community
|
opened
|
Add iree-discuss redirect to appropriate ML on openxla.org
|
process
|
Would be good to have all OpenXLA mailing lists live on openxla.org. Should set up iree-discuss to migrate either to openxla-discuss or dedicated ML on openxla.org.
|
1.0
|
Add iree-discuss redirect to appropriate ML on openxla.org - Would be good to have all OpenXLA mailing lists live on openxla.org. Should set up iree-discuss to migrate either to openxla-discuss or dedicated ML on openxla.org.
|
process
|
add iree discuss redirect to appropriate ml on openxla org would be good to have all openxla mailing lists live on openxla org should set up iree discuss to migrate either to openxla discuss or dedicated ml on openxla org
| 1
|
14,145
| 17,035,635,740
|
IssuesEvent
|
2021-07-05 06:39:13
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[PM] Responsive issue > Customer logo is not getting displayed on error pages
|
Bug P2 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
|
AR: Customer logo is not getting displayed on error pages
ER: Customer logo should be displayed on all the error pages and if the user click on logo then user should be navigate to sign in page

|
3.0
|
[PM] Responsive issue > Customer logo is not getting displayed on error pages - AR: Customer logo is not getting displayed on error pages
ER: Customer logo should be displayed on all the error pages and if the user click on logo then user should be navigate to sign in page

|
process
|
responsive issue customer logo is not getting displayed on error pages ar customer logo is not getting displayed on error pages er customer logo should be displayed on all the error pages and if the user click on logo then user should be navigate to sign in page
| 1
|
6,080
| 8,925,515,077
|
IssuesEvent
|
2019-01-21 23:07:15
|
enKryptIO/ethvm
|
https://api.github.com/repos/enKryptIO/ethvm
|
closed
|
Add to UncleRecord
|
enhancement milestone:1 priority:high project:processing
|
- BlockHeight: Real block number where it appeared
- Position / Index (or another name): The position where the uncle was in the array
- Uncle reward (just reward for the uncle)
Take a look on: https://etherscan.io/uncles
|
1.0
|
Add to UncleRecord - - BlockHeight: Real block number where it appeared
- Position / Index (or another name): The position where the uncle was in the array
- Uncle reward (just reward for the uncle)
Take a look on: https://etherscan.io/uncles
|
process
|
add to unclerecord blockheight real block number where it appeared position index or another name the position where the uncle was in the array uncle reward just reward for the uncle take a look on
| 1
|
15,477
| 19,685,996,288
|
IssuesEvent
|
2022-01-11 22:14:38
|
PyCQA/pylint
|
https://api.github.com/repos/PyCQA/pylint
|
closed
|
ValueError: generator already executing
|
Crash 💥 python 3.9 topic-multiprocessing Requires Astroid Update
|
### Steps to reproduce
This appears very sensitive to the exact current code and environment. My attempts to create a simplified case for testing have failed. I have a commit in a local git repo of the failing state. I can publish that if needed to debug the root cause. There is nothing proprietory in the project.
Commenting out a single line `assert uni.dimensions == 2` avoids the error.
Uncommenting 2 following lines
```py
assert isinstance(uni.neighbourhood, frozenset)
assert uni.neighbourhood == frozenset(test_neighbourhood)
```
avoids the error. Removing other test cases avoids the error. With or without the error condition, pytest can run the code without errors.
```txt .pylintrc
init-hook="from pylint.config import find_pylintrc; import os, sys; sys.path.append(os.path.dirname(find_pylintrc()))"
```
```py tests/test_create_universe.py
# pylint: disable=E1120,E1121
```
```sh
pipenv run pylint tests/test_create_universe.py
```
### Current behavior
```t
--------------------------------------------------------------------
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)
Exception ignored in: <generator object infer_attribute at 0x7f98d58f6b30>
Traceback (most recent call last):
File "/home/phil/.local/share/virtualenvs/cellular-automata-JH0ZnjF-/lib/python3.9/site-packages/astroid/inference.py", line 326, in infer_attribute
yield from owner.igetattr(self.attrname, context)
File "/home/phil/.local/share/virtualenvs/cellular-automata-JH0ZnjF-/lib/python3.9/site-packages/astroid/bases.py", line 232, in igetattr
yield from _infer_stmts(
ValueError: generator already executing
```
### Expected behavior
```t
--------------------------------------------------------------------
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)
```
### pylint --version output
```sh
% uname -r
5.12.13-200.fc33.x86_64
% pipenv --version
pipenv, version 2021.5.29
% pipenv run pylint --version
pylint 2.9.3
astroid 2.6.2
Python 3.9.5 (default, May 14 2021, 00:00:00)
[GCC 10.3.1 20210422 (Red Hat 10.3.1-1)]
% pipenv graph
pylint==2.9.3
- astroid [required: >=2.6.2,<2.7, installed: 2.6.2]
- lazy-object-proxy [required: >=1.4.0, installed: 1.6.0]
- wrapt [required: >=1.11,<1.13, installed: 1.12.1]
- isort [required: >=4.2.5,<6, installed: 5.9.1]
- mccabe [required: >=0.6,<0.7, installed: 0.6.1]
- toml [required: >=0.7.1, installed: 0.10.2]
pytest==6.2.4
- attrs [required: >=19.2.0, installed: 21.2.0]
- iniconfig [required: Any, installed: 1.1.1]
- packaging [required: Any, installed: 20.9]
- pyparsing [required: >=2.0.2, installed: 2.4.7]
- pluggy [required: >=0.12,<1.0.0a1, installed: 0.13.1]
- py [required: >=1.8.2, installed: 1.10.0]
- toml [required: Any, installed: 0.10.2]
```
Needs the main project files being imported for the unittest cases.
|
1.0
|
ValueError: generator already executing - ### Steps to reproduce
This appears very sensitive to the exact current code and environment. My attempts to create a simplified case for testing have failed. I have a commit in a local git repo of the failing state. I can publish that if needed to debug the root cause. There is nothing proprietory in the project.
Commenting out a single line `assert uni.dimensions == 2` avoids the error.
Uncommenting 2 following lines
```py
assert isinstance(uni.neighbourhood, frozenset)
assert uni.neighbourhood == frozenset(test_neighbourhood)
```
avoids the error. Removing other test cases avoids the error. With or without the error condition, pytest can run the code without errors.
```txt .pylintrc
init-hook="from pylint.config import find_pylintrc; import os, sys; sys.path.append(os.path.dirname(find_pylintrc()))"
```
```py tests/test_create_universe.py
# pylint: disable=E1120,E1121
```
```sh
pipenv run pylint tests/test_create_universe.py
```
### Current behavior
```t
--------------------------------------------------------------------
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)
Exception ignored in: <generator object infer_attribute at 0x7f98d58f6b30>
Traceback (most recent call last):
File "/home/phil/.local/share/virtualenvs/cellular-automata-JH0ZnjF-/lib/python3.9/site-packages/astroid/inference.py", line 326, in infer_attribute
yield from owner.igetattr(self.attrname, context)
File "/home/phil/.local/share/virtualenvs/cellular-automata-JH0ZnjF-/lib/python3.9/site-packages/astroid/bases.py", line 232, in igetattr
yield from _infer_stmts(
ValueError: generator already executing
```
### Expected behavior
```t
--------------------------------------------------------------------
Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)
```
### pylint --version output
```sh
% uname -r
5.12.13-200.fc33.x86_64
% pipenv --version
pipenv, version 2021.5.29
% pipenv run pylint --version
pylint 2.9.3
astroid 2.6.2
Python 3.9.5 (default, May 14 2021, 00:00:00)
[GCC 10.3.1 20210422 (Red Hat 10.3.1-1)]
% pipenv graph
pylint==2.9.3
- astroid [required: >=2.6.2,<2.7, installed: 2.6.2]
- lazy-object-proxy [required: >=1.4.0, installed: 1.6.0]
- wrapt [required: >=1.11,<1.13, installed: 1.12.1]
- isort [required: >=4.2.5,<6, installed: 5.9.1]
- mccabe [required: >=0.6,<0.7, installed: 0.6.1]
- toml [required: >=0.7.1, installed: 0.10.2]
pytest==6.2.4
- attrs [required: >=19.2.0, installed: 21.2.0]
- iniconfig [required: Any, installed: 1.1.1]
- packaging [required: Any, installed: 20.9]
- pyparsing [required: >=2.0.2, installed: 2.4.7]
- pluggy [required: >=0.12,<1.0.0a1, installed: 0.13.1]
- py [required: >=1.8.2, installed: 1.10.0]
- toml [required: Any, installed: 0.10.2]
```
Needs the main project files being imported for the unittest cases.
|
process
|
valueerror generator already executing steps to reproduce this appears very sensitive to the exact current code and environment my attempts to create a simplified case for testing have failed i have a commit in a local git repo of the failing state i can publish that if needed to debug the root cause there is nothing proprietory in the project commenting out a single line assert uni dimensions avoids the error uncommenting following lines py assert isinstance uni neighbourhood frozenset assert uni neighbourhood frozenset test neighbourhood avoids the error removing other test cases avoids the error with or without the error condition pytest can run the code without errors txt pylintrc init hook from pylint config import find pylintrc import os sys sys path append os path dirname find pylintrc py tests test create universe py pylint disable sh pipenv run pylint tests test create universe py current behavior t your code has been rated at previous run exception ignored in traceback most recent call last file home phil local share virtualenvs cellular automata lib site packages astroid inference py line in infer attribute yield from owner igetattr self attrname context file home phil local share virtualenvs cellular automata lib site packages astroid bases py line in igetattr yield from infer stmts valueerror generator already executing expected behavior t your code has been rated at previous run pylint version output sh uname r pipenv version pipenv version pipenv run pylint version pylint astroid python default may pipenv graph pylint astroid lazy object proxy wrapt isort mccabe toml pytest attrs iniconfig packaging pyparsing pluggy py toml needs the main project files being imported for the unittest cases
| 1
|
11,300
| 14,105,572,282
|
IssuesEvent
|
2020-11-06 13:42:39
|
paul-buerkner/brms
|
https://api.github.com/repos/paul-buerkner/brms
|
closed
|
PPC for (right-)censored data
|
feature post-processing
|
It seems like posterior predictive checks (PPCs) for censored data (in non-Cox models) are currently performed by leaving out the censored observations from the observed data. I think this is not appropriate: For example, the noncensored observations may be systematically smaller than the censored ones. (Think of the example from pp. 225-226 of [BDA3 (Gelman et al., 2014)](http://www.stat.columbia.edu/~gelman/book/) where a scale fails to weigh objects which are too heavy.)
As already mentioned in #966, a PPC for right-censored data in non-Cox models (e.g. right-censored log-normal or right-censored Weibull models) might be possible by comparing the Kaplan-Meier estimate of the CCDF for the observed data to the posterior CCDFs. With the default settings, there would be 4000 posterior CCDFs, so as usual, only a random subset of them would be used for the overlay plot. The resulting plot should be similar to `bayesplot::ppc_ecdf_overlay()`, but taking the right censoring of the observed data into account and applying the "CCDF = 1 - CDF" transformation (although this transformation is not strictly necessary and would just be a convention from traditional survival analysis).
I guess the approach described above only holds for right censoring (because I only know a Kaplan-Meier estimator for right-censored data), but perhaps similar approaches are possible for other types of censoring.
|
1.0
|
PPC for (right-)censored data - It seems like posterior predictive checks (PPCs) for censored data (in non-Cox models) are currently performed by leaving out the censored observations from the observed data. I think this is not appropriate: For example, the noncensored observations may be systematically smaller than the censored ones. (Think of the example from pp. 225-226 of [BDA3 (Gelman et al., 2014)](http://www.stat.columbia.edu/~gelman/book/) where a scale fails to weigh objects which are too heavy.)
As already mentioned in #966, a PPC for right-censored data in non-Cox models (e.g. right-censored log-normal or right-censored Weibull models) might be possible by comparing the Kaplan-Meier estimate of the CCDF for the observed data to the posterior CCDFs. With the default settings, there would be 4000 posterior CCDFs, so as usual, only a random subset of them would be used for the overlay plot. The resulting plot should be similar to `bayesplot::ppc_ecdf_overlay()`, but taking the right censoring of the observed data into account and applying the "CCDF = 1 - CDF" transformation (although this transformation is not strictly necessary and would just be a convention from traditional survival analysis).
I guess the approach described above only holds for right censoring (because I only know a Kaplan-Meier estimator for right-censored data), but perhaps similar approaches are possible for other types of censoring.
|
process
|
ppc for right censored data it seems like posterior predictive checks ppcs for censored data in non cox models are currently performed by leaving out the censored observations from the observed data i think this is not appropriate for example the noncensored observations may be systematically smaller than the censored ones think of the example from pp of where a scale fails to weigh objects which are too heavy as already mentioned in a ppc for right censored data in non cox models e g right censored log normal or right censored weibull models might be possible by comparing the kaplan meier estimate of the ccdf for the observed data to the posterior ccdfs with the default settings there would be posterior ccdfs so as usual only a random subset of them would be used for the overlay plot the resulting plot should be similar to bayesplot ppc ecdf overlay but taking the right censoring of the observed data into account and applying the ccdf cdf transformation although this transformation is not strictly necessary and would just be a convention from traditional survival analysis i guess the approach described above only holds for right censoring because i only know a kaplan meier estimator for right censored data but perhaps similar approaches are possible for other types of censoring
| 1
|
12,472
| 14,940,966,887
|
IssuesEvent
|
2021-01-25 19:03:06
|
ORNL-AMO/AMO-Tools-Desktop
|
https://api.github.com/repos/ORNL-AMO/AMO-Tools-Desktop
|
opened
|
Process Hst thing asked for?eating Small things
|
Process Heating Quick Fix
|
Aux - why is duty cycle first. it should be last
|
1.0
|
Process Hst thing asked for?eating Small things - Aux - why is duty cycle first. it should be last
|
process
|
process hst thing asked for eating small things aux why is duty cycle first it should be last
| 1
|
159,144
| 6,041,203,894
|
IssuesEvent
|
2017-06-10 21:49:57
|
svof/svof
|
https://api.github.com/repos/svof/svof
|
closed
|
Wrong definition of `a_darkyellow`
|
bug low priority simple difficulty up for grabs
|
> Sent By: Lynara On 2017-03-20 00:54:22
A_darkyellow, a color made by svo, is wrong - it should be {179,179,0}. It is {0,179,0}. Which is a_darkgreen.
|
1.0
|
Wrong definition of `a_darkyellow` - > Sent By: Lynara On 2017-03-20 00:54:22
A_darkyellow, a color made by svo, is wrong - it should be {179,179,0}. It is {0,179,0}. Which is a_darkgreen.
|
non_process
|
wrong definition of a darkyellow sent by lynara on a darkyellow a color made by svo is wrong it should be it is which is a darkgreen
| 0
|
664
| 11,860,896,873
|
IssuesEvent
|
2020-03-25 15:33:08
|
planningcenter/developers
|
https://api.github.com/repos/planningcenter/developers
|
closed
|
Total count is exactly 10000 when doing GET request for People (People) JSON
|
People
|
When going to https://api.planningcenteronline.com/people/v2/people on the browser, json['meta']['total_count'] reveals the real number (27922).

However, I've tried doing a GET request via Postman, an ETL tool, and Python, and json['meta']['total_count'] shows 10000 for all three of them. It won't display any data when the offset is that number or greater.

Is this a bug, or am I missing a header somewhere?
Thanks,
Brad Wyatt
|
1.0
|
Total count is exactly 10000 when doing GET request for People (People) JSON - When going to https://api.planningcenteronline.com/people/v2/people on the browser, json['meta']['total_count'] reveals the real number (27922).

However, I've tried doing a GET request via Postman, an ETL tool, and Python, and json['meta']['total_count'] shows 10000 for all three of them. It won't display any data when the offset is that number or greater.

Is this a bug, or am I missing a header somewhere?
Thanks,
Brad Wyatt
|
non_process
|
total count is exactly when doing get request for people people json when going to on the browser json reveals the real number however i ve tried doing a get request via postman an etl tool and python and json shows for all three of them it won t display any data when the offset is that number or greater is this a bug or am i missing a header somewhere thanks brad wyatt
| 0
|
7,592
| 10,703,989,330
|
IssuesEvent
|
2019-10-24 10:42:40
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
new taxon constraint GO:0002250 adaptive immune response only in animals/never in plants
|
multi-species process quick fix taxon restriction
|
The plant people I am talking to state that plants only have innate immunity.
Therefore can we have a taxon constraint on
https://www.ebi.ac.uk/QuickGO/term/GO:0002250
GO:0002250 adaptive immune response
do you agree @tberardini
|
1.0
|
new taxon constraint GO:0002250 adaptive immune response only in animals/never in plants - The plant people I am talking to state that plants only have innate immunity.
Therefore can we have a taxon constraint on
https://www.ebi.ac.uk/QuickGO/term/GO:0002250
GO:0002250 adaptive immune response
do you agree @tberardini
|
process
|
new taxon constraint go adaptive immune response only in animals never in plants the plant people i am talking to state that plants only have innate immunity therefore can we have a taxon constraint on go adaptive immune response do you agree tberardini
| 1
|
17,663
| 23,485,743,369
|
IssuesEvent
|
2022-08-17 14:17:31
|
sparc4-dev/astropop
|
https://api.github.com/repos/sparc4-dev/astropop
|
opened
|
Cross-Correlation registration need sky subtraction
|
bug image-processing
|
With gradient sky, the cross-correlation is affected by the sky. Aligning images using the sources position may be better if the sky is subtracted during this computation.
|
1.0
|
Cross-Correlation registration need sky subtraction - With gradient sky, the cross-correlation is affected by the sky. Aligning images using the sources position may be better if the sky is subtracted during this computation.
|
process
|
cross correlation registration need sky subtraction with gradient sky the cross correlation is affected by the sky aligning images using the sources position may be better if the sky is subtracted during this computation
| 1
|
37,482
| 6,618,276,919
|
IssuesEvent
|
2017-09-21 07:26:49
|
d4rken/sdmaid-public
|
https://api.github.com/repos/d4rken/sdmaid-public
|
closed
|
Update AppCleaner filter documentation
|
c: AppCleaner Documentation
|
`v4.8.0` switches category names.
* Update documentation screenshots
* Update documentation description
|
1.0
|
Update AppCleaner filter documentation - `v4.8.0` switches category names.
* Update documentation screenshots
* Update documentation description
|
non_process
|
update appcleaner filter documentation switches category names update documentation screenshots update documentation description
| 0
|
31,126
| 4,233,899,869
|
IssuesEvent
|
2016-07-05 09:43:13
|
thetodd/invoice
|
https://api.github.com/repos/thetodd/invoice
|
opened
|
PDF: stream method deprecated
|
redesign
|
The stream method for the pdf view in invoice controller is deprected. Search for a better solution.
|
1.0
|
PDF: stream method deprecated - The stream method for the pdf view in invoice controller is deprected. Search for a better solution.
|
non_process
|
pdf stream method deprecated the stream method for the pdf view in invoice controller is deprected search for a better solution
| 0
|
20,416
| 27,075,836,324
|
IssuesEvent
|
2023-02-14 10:30:24
|
billingran/Newsletter
|
https://api.github.com/repos/billingran/Newsletter
|
closed
|
Optimiser le code pour ne plus répéter de code.
|
processing... Brief 2
|
- [ ] optimiser le code pour ne plus répéter de code.
|
1.0
|
Optimiser le code pour ne plus répéter de code. - - [ ] optimiser le code pour ne plus répéter de code.
|
process
|
optimiser le code pour ne plus répéter de code optimiser le code pour ne plus répéter de code
| 1
|
310,665
| 9,522,594,676
|
IssuesEvent
|
2019-04-27 09:57:39
|
ExchangeUnion/xud
|
https://api.github.com/repos/ExchangeUnion/xud
|
closed
|
Switch to bitcoind and litecoind
|
low priority simnet
|
- [ ] change current simnet setup to use bitcoind and litecoind. That might be a bigger change in conjunction with lnd.
- [ ] adjust xud-simulation tests & other tests to use bitcoind and litecoind. Also quite some changes.
- [ ] adjust in-code documentation to reference bitcoind and litecoind
- [ ] adjust simnet wiki & other external documentation to reference bitcoind and litecoind
|
1.0
|
Switch to bitcoind and litecoind - - [ ] change current simnet setup to use bitcoind and litecoind. That might be a bigger change in conjunction with lnd.
- [ ] adjust xud-simulation tests & other tests to use bitcoind and litecoind. Also quite some changes.
- [ ] adjust in-code documentation to reference bitcoind and litecoind
- [ ] adjust simnet wiki & other external documentation to reference bitcoind and litecoind
|
non_process
|
switch to bitcoind and litecoind change current simnet setup to use bitcoind and litecoind that might be a bigger change in conjunction with lnd adjust xud simulation tests other tests to use bitcoind and litecoind also quite some changes adjust in code documentation to reference bitcoind and litecoind adjust simnet wiki other external documentation to reference bitcoind and litecoind
| 0
|
134,187
| 5,221,396,618
|
IssuesEvent
|
2017-01-27 01:20:58
|
elTiempoVuela/https-finder
|
https://api.github.com/repos/elTiempoVuela/https-finder
|
closed
|
Add "Delay Page Load" Option
|
auto-migrated Priority-Medium Type-Enhancement
|
```
As mentioned, there is the chance that during the load of the unencrypted
version of the website, someone might grab cookies.
Could an option be added to block the page load until HTTPS support is known?
```
Original issue reported on code.google.com by `Sub.Atom...@gmail.com` on 16 Dec 2011 at 9:13
|
1.0
|
Add "Delay Page Load" Option - ```
As mentioned, there is the chance that during the load of the unencrypted
version of the website, someone might grab cookies.
Could an option be added to block the page load until HTTPS support is known?
```
Original issue reported on code.google.com by `Sub.Atom...@gmail.com` on 16 Dec 2011 at 9:13
|
non_process
|
add delay page load option as mentioned there is the chance that during the load of the unencrypted version of the website someone might grab cookies could an option be added to block the page load until https support is known original issue reported on code google com by sub atom gmail com on dec at
| 0
|
43,767
| 7,064,866,108
|
IssuesEvent
|
2018-01-06 13:01:07
|
JuliaReach/LazySets.jl
|
https://api.github.com/repos/JuliaReach/LazySets.jl
|
closed
|
Illustrative examples for the manual
|
documentation
|
let's create a new `examples` folder with some illustrative examples and pictures:
- [x] polygon overapproximations => iterative refinement, see #108
- [x] convex hulls
- [x] interval hulls
- [x] reachability algorithm for linear systems using zonotopes
|
1.0
|
Illustrative examples for the manual - let's create a new `examples` folder with some illustrative examples and pictures:
- [x] polygon overapproximations => iterative refinement, see #108
- [x] convex hulls
- [x] interval hulls
- [x] reachability algorithm for linear systems using zonotopes
|
non_process
|
illustrative examples for the manual let s create a new examples folder with some illustrative examples and pictures polygon overapproximations iterative refinement see convex hulls interval hulls reachability algorithm for linear systems using zonotopes
| 0
|
13,200
| 15,630,964,557
|
IssuesEvent
|
2021-03-22 03:34:59
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
the payload for enabling the customer managed keys is not complete
|
Pri2 automation/svc cxp doc-enhancement process-automation/subsvc triaged
|
i have worked through this documentation to enable the customer managed keys and found that the payload for enabling the encryption is incomplete.
```json
{
"properties": {
"encryption": {
"keySource": "Microsoft.Keyvault",
"keyvaultProperties": {
"keyName": "sample-vault-key",
"keyvaultUri": "https://sample-vault-key12.vault.azure.net",
"keyVersion": "7c73556c521340209371eaf623cc099d"
}
}
}
}
```
yields the following result
```cmd
ERROR: Bad Request({"code":"BadRequest","message":" Identity property or identity type property should not be null while updating Encryption properties."})
```
the payload should be
```json
{
"identity": {
"type": "SystemAssigned"
},
"properties": {
"encryption": {
"keySource": "Microsoft.Keyvault",
"keyvaultProperties": {
"keyName": "sample-vault-key",
"keyvaultUri": "https://sample-vault-key12.vault.azure.net",
"keyVersion": "7c73556c521340209371eaf623cc099d"
}
}
}
}
```
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: b96959e5-3ca5-7725-4f06-4d2abbf4a48e
* Version Independent ID: 377c1f35-a67a-5a79-5dc2-d58c3fbba7ef
* Content: [Encryption of secure assets in Azure Automation](https://docs.microsoft.com/en-us/azure/automation/automation-secure-asset-encryption)
* Content Source: [articles/automation/automation-secure-asset-encryption.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/automation/automation-secure-asset-encryption.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @snehithm
* Microsoft Alias: **snmuvva**
|
1.0
|
the payload for enabling the customer managed keys is not complete - I have worked through this documentation to enable customer-managed keys and found that the payload for enabling encryption is incomplete.
```json
{
"properties": {
"encryption": {
"keySource": "Microsoft.Keyvault",
"keyvaultProperties": {
"keyName": "sample-vault-key",
"keyvaultUri": "https://sample-vault-key12.vault.azure.net",
"keyVersion": "7c73556c521340209371eaf623cc099d"
}
}
}
}
```
yields the following result
```cmd
ERROR: Bad Request({"code":"BadRequest","message":" Identity property or identity type property should not be null while updating Encryption properties."})
```
the payload should be
```json
{
"identity": {
"type": "SystemAssigned"
},
"properties": {
"encryption": {
"keySource": "Microsoft.Keyvault",
"keyvaultProperties": {
"keyName": "sample-vault-key",
"keyvaultUri": "https://sample-vault-key12.vault.azure.net",
"keyVersion": "7c73556c521340209371eaf623cc099d"
}
}
}
}
```
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: b96959e5-3ca5-7725-4f06-4d2abbf4a48e
* Version Independent ID: 377c1f35-a67a-5a79-5dc2-d58c3fbba7ef
* Content: [Encryption of secure assets in Azure Automation](https://docs.microsoft.com/en-us/azure/automation/automation-secure-asset-encryption)
* Content Source: [articles/automation/automation-secure-asset-encryption.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/automation/automation-secure-asset-encryption.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @snehithm
* Microsoft Alias: **snmuvva**
|
process
|
the payload for enabling the customer managed keys is not complete i have worked through this documentation to enable the customer managed keys and found that the payload for enabling the encryption is incomplete json properties encryption keysource microsoft keyvault keyvaultproperties keyname sample vault key keyvaulturi keyversion yields the following result cmd error bad request code badrequest message identity property or identity type property should not be null while updating encryption properties the payload should be json identity type systemassigned properties encryption keysource microsoft keyvault keyvaultproperties keyname sample vault key keyvaulturi keyversion document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login snehithm microsoft alias snmuvva
| 1
|
106,864
| 13,391,776,498
|
IssuesEvent
|
2020-09-02 23:26:53
|
chanzuckerberg/single-cell
|
https://api.github.com/repos/chanzuckerberg/single-cell
|
closed
|
Scientists want predictable downloads of annotated expression matrices from the portal dataset-view
|
design me! epic
|
**Story**
_Scientists want predictable downloads of annotated expression matrices from the portal dataset-view. The size of downloads can be large and require a large time investment to complete. Network interruptions or timeouts should not require scientists to restart a large download._
|
1.0
|
Scientists want predictable downloads of annotated expression matrices from the portal dataset-view - **Story**
_Scientists want predictable downloads of annotated expression matrices from the portal dataset-view. The size of downloads can be large and require a large time investment to complete. Network interruptions or timeouts should not require scientists to restart a large download._
|
non_process
|
scientists want predictable downloads of annotated expression matrices from the portal dataset view story scientists want predictable downloads of annotated expression matrices from the portal dataset view the size of downloads can be large and require a large time investment to complete network interruptions or timeouts should not require scientists to restart a large download
| 0
|
134,578
| 18,471,936,254
|
IssuesEvent
|
2021-10-17 21:55:47
|
samq-ghdemo/JS-Demo
|
https://api.github.com/repos/samq-ghdemo/JS-Demo
|
opened
|
WS-2019-0032 (High) detected in js-yaml-3.5.5.tgz, js-yaml-3.6.1.tgz
|
security vulnerability
|
## WS-2019-0032 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>js-yaml-3.5.5.tgz</b>, <b>js-yaml-3.6.1.tgz</b></p></summary>
<p>
<details><summary><b>js-yaml-3.5.5.tgz</b></p></summary>
<p>YAML 1.2 parser and serializer</p>
<p>Library home page: <a href="https://registry.npmjs.org/js-yaml/-/js-yaml-3.5.5.tgz">https://registry.npmjs.org/js-yaml/-/js-yaml-3.5.5.tgz</a></p>
<p>Path to dependency file: JS-Demo/package.json</p>
<p>Path to vulnerable library: JS-Demo/node_modules/js-yaml/package.json</p>
<p>
Dependency Hierarchy:
- grunt-1.0.3.tgz (Root Library)
- :x: **js-yaml-3.5.5.tgz** (Vulnerable Library)
</details>
<details><summary><b>js-yaml-3.6.1.tgz</b></p></summary>
<p>YAML 1.2 parser and serializer</p>
<p>Library home page: <a href="https://registry.npmjs.org/js-yaml/-/js-yaml-3.6.1.tgz">https://registry.npmjs.org/js-yaml/-/js-yaml-3.6.1.tgz</a></p>
<p>Path to dependency file: JS-Demo/package.json</p>
<p>Path to vulnerable library: JS-Demo/node_modules/coveralls/node_modules/js-yaml/package.json</p>
<p>
Dependency Hierarchy:
- grunt-if-0.2.0.tgz (Root Library)
- grunt-contrib-nodeunit-1.0.0.tgz
- nodeunit-0.9.5.tgz
- tap-7.1.2.tgz
- coveralls-2.13.3.tgz
- :x: **js-yaml-3.6.1.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/JS-Demo/commit/210025573ddd44a379ebb23baeb6e2648a69b3d3">210025573ddd44a379ebb23baeb6e2648a69b3d3</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Versions js-yaml prior to 3.13.0 are vulnerable to Denial of Service. By parsing a carefully-crafted YAML file, the node process stalls and may exhaust system resources leading to a Denial of Service.
<p>Publish Date: 2019-03-20
<p>URL: <a href=https://github.com/nodeca/js-yaml/commit/a567ef3c6e61eb319f0bfc2671d91061afb01235>WS-2019-0032</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/788/versions">https://www.npmjs.com/advisories/788/versions</a></p>
<p>Release Date: 2019-03-20</p>
<p>Fix Resolution: js-yaml - 3.13.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"js-yaml","packageVersion":"3.5.5","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt:1.0.3;js-yaml:3.5.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"js-yaml - 3.13.0"},{"packageType":"javascript/Node.js","packageName":"js-yaml","packageVersion":"3.6.1","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-if:0.2.0;grunt-contrib-nodeunit:1.0.0;nodeunit:0.9.5;tap:7.1.2;coveralls:2.13.3;js-yaml:3.6.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"js-yaml - 3.13.0"}],"baseBranches":["main"],"vulnerabilityIdentifier":"WS-2019-0032","vulnerabilityDetails":"Versions js-yaml prior to 3.13.0 are vulnerable to Denial of Service. By parsing a carefully-crafted YAML file, the node process stalls and may exhaust system resources leading to a Denial of Service.","vulnerabilityUrl":"https://github.com/nodeca/js-yaml/commit/a567ef3c6e61eb319f0bfc2671d91061afb01235","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
WS-2019-0032 (High) detected in js-yaml-3.5.5.tgz, js-yaml-3.6.1.tgz - ## WS-2019-0032 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>js-yaml-3.5.5.tgz</b>, <b>js-yaml-3.6.1.tgz</b></p></summary>
<p>
<details><summary><b>js-yaml-3.5.5.tgz</b></p></summary>
<p>YAML 1.2 parser and serializer</p>
<p>Library home page: <a href="https://registry.npmjs.org/js-yaml/-/js-yaml-3.5.5.tgz">https://registry.npmjs.org/js-yaml/-/js-yaml-3.5.5.tgz</a></p>
<p>Path to dependency file: JS-Demo/package.json</p>
<p>Path to vulnerable library: JS-Demo/node_modules/js-yaml/package.json</p>
<p>
Dependency Hierarchy:
- grunt-1.0.3.tgz (Root Library)
- :x: **js-yaml-3.5.5.tgz** (Vulnerable Library)
</details>
<details><summary><b>js-yaml-3.6.1.tgz</b></p></summary>
<p>YAML 1.2 parser and serializer</p>
<p>Library home page: <a href="https://registry.npmjs.org/js-yaml/-/js-yaml-3.6.1.tgz">https://registry.npmjs.org/js-yaml/-/js-yaml-3.6.1.tgz</a></p>
<p>Path to dependency file: JS-Demo/package.json</p>
<p>Path to vulnerable library: JS-Demo/node_modules/coveralls/node_modules/js-yaml/package.json</p>
<p>
Dependency Hierarchy:
- grunt-if-0.2.0.tgz (Root Library)
- grunt-contrib-nodeunit-1.0.0.tgz
- nodeunit-0.9.5.tgz
- tap-7.1.2.tgz
- coveralls-2.13.3.tgz
- :x: **js-yaml-3.6.1.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/JS-Demo/commit/210025573ddd44a379ebb23baeb6e2648a69b3d3">210025573ddd44a379ebb23baeb6e2648a69b3d3</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Versions js-yaml prior to 3.13.0 are vulnerable to Denial of Service. By parsing a carefully-crafted YAML file, the node process stalls and may exhaust system resources leading to a Denial of Service.
<p>Publish Date: 2019-03-20
<p>URL: <a href=https://github.com/nodeca/js-yaml/commit/a567ef3c6e61eb319f0bfc2671d91061afb01235>WS-2019-0032</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/788/versions">https://www.npmjs.com/advisories/788/versions</a></p>
<p>Release Date: 2019-03-20</p>
<p>Fix Resolution: js-yaml - 3.13.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"js-yaml","packageVersion":"3.5.5","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt:1.0.3;js-yaml:3.5.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"js-yaml - 3.13.0"},{"packageType":"javascript/Node.js","packageName":"js-yaml","packageVersion":"3.6.1","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-if:0.2.0;grunt-contrib-nodeunit:1.0.0;nodeunit:0.9.5;tap:7.1.2;coveralls:2.13.3;js-yaml:3.6.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"js-yaml - 3.13.0"}],"baseBranches":["main"],"vulnerabilityIdentifier":"WS-2019-0032","vulnerabilityDetails":"Versions js-yaml prior to 3.13.0 are vulnerable to Denial of Service. By parsing a carefully-crafted YAML file, the node process stalls and may exhaust system resources leading to a Denial of Service.","vulnerabilityUrl":"https://github.com/nodeca/js-yaml/commit/a567ef3c6e61eb319f0bfc2671d91061afb01235","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
ws high detected in js yaml tgz js yaml tgz ws high severity vulnerability vulnerable libraries js yaml tgz js yaml tgz js yaml tgz yaml parser and serializer library home page a href path to dependency file js demo package json path to vulnerable library js demo node modules js yaml package json dependency hierarchy grunt tgz root library x js yaml tgz vulnerable library js yaml tgz yaml parser and serializer library home page a href path to dependency file js demo package json path to vulnerable library js demo node modules coveralls node modules js yaml package json dependency hierarchy grunt if tgz root library grunt contrib nodeunit tgz nodeunit tgz tap tgz coveralls tgz x js yaml tgz vulnerable library found in head commit a href found in base branch main vulnerability details versions js yaml prior to are vulnerable to denial of service by parsing a carefully crafted yaml file the node process stalls and may exhaust system resources leading to a denial of service publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution js yaml isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree grunt js yaml isminimumfixversionavailable true minimumfixversion js yaml packagetype javascript node js packagename js yaml packageversion packagefilepaths istransitivedependency true dependencytree grunt if grunt contrib nodeunit nodeunit tap coveralls js yaml isminimumfixversionavailable true minimumfixversion js yaml basebranches vulnerabilityidentifier ws vulnerabilitydetails versions js yaml prior to are vulnerable to denial of service by parsing a carefully crafted yaml file the node process stalls and may exhaust system resources leading to a denial of service vulnerabilityurl
| 0
|
20,701
| 27,385,849,614
|
IssuesEvent
|
2023-02-28 13:16:10
|
microsoftgraph/msgraph-metadata
|
https://api.github.com/repos/microsoftgraph/msgraph-metadata
|
closed
|
beta generation failing because of delta function
|
Generator: Metadata Preprocessor metadata-issue blocking
|
[beta weekly generation is currently failing](https://microsoftgraph.visualstudio.com/Graph%20Developer%20Experiences/_build/results?buildId=106293&view=logs&j=4c81e8e4-6eb9-5def-7261-b44ad0fc1b7d&t=163839c6-f69e-5b5d-39e2-5b38e97d3fe3) because [a delta function on directory objects was added](https://dev.azure.com/msazure/One/_git/AD-AggregatorService-Workloads/pullRequest/7582888#1676991425) which conflicts with the derived types delta functions.
I believe we should introduce an xsl fix that removes the functions on the derived types, only for beta, only for kiota based SDK for the time being until the service fixes their definition to unblock our generation pipeline, but I'm open to feedback here first.
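For illustration, the kind of transform such an xsl fix would perform can be sketched outside XSLT with the Python standard library. The CSDL fragment below is made up and heavily simplified (the real Graph metadata is namespaced and far richer), and the set of derived types is an assumption for the example.

```python
import xml.etree.ElementTree as ET

# Made-up, minimal CSDL-like fragment: directoryObject declares delta, and two
# derived types redundantly declare their own delta functions.
csdl = """
<Schema>
  <Function Name="delta" BoundType="directoryObject"/>
  <Function Name="delta" BoundType="user"/>
  <Function Name="delta" BoundType="group"/>
</Schema>
"""

DERIVED_TYPES = {"user", "group"}  # assumed to derive from directoryObject

root = ET.fromstring(csdl)
# Drop the delta functions declared on derived types, keeping the base one.
for fn in list(root.findall("Function")):
    if fn.get("Name") == "delta" and fn.get("BoundType") in DERIVED_TYPES:
        root.remove(fn)

remaining = [fn.get("BoundType") for fn in root.findall("Function")]
print(remaining)  # only the directoryObject delta survives
```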
CC @andrueastman @peombwa @irvinesunday
|
1.0
|
beta generation failing because of delta function - [beta weekly generation is currently failing](https://microsoftgraph.visualstudio.com/Graph%20Developer%20Experiences/_build/results?buildId=106293&view=logs&j=4c81e8e4-6eb9-5def-7261-b44ad0fc1b7d&t=163839c6-f69e-5b5d-39e2-5b38e97d3fe3) because [a delta function on directory objects was added](https://dev.azure.com/msazure/One/_git/AD-AggregatorService-Workloads/pullRequest/7582888#1676991425) which conflicts with the derived types delta functions.
I believe we should introduce an xsl fix that removes the functions on the derived types, only for beta, only for kiota based SDK for the time being until the service fixes their definition to unblock our generation pipeline, but I'm open to feedback here first.
CC @andrueastman @peombwa @irvinesunday
|
process
|
beta generation failing because of delta function because which conflicts with the derived types delta functions i believe we should introduce an xsl fix that removes the functions on the derived types only for beta only for kiota based sdk for the time being until the service fixes their definition to unblock our generation pipeline but i m open to feedback here first cc andrueastman peombwa irvinesunday
| 1
|
80,728
| 3,573,587,270
|
IssuesEvent
|
2016-01-27 07:30:47
|
OpenSRP/opensrp-server
|
https://api.github.com/repos/OpenSRP/opensrp-server
|
closed
|
Immediate schedule generation
|
BANGLADESH High Priority
|
When a front-line (FWA) health worker visits a house and registers a household, the OpenSRP server then generates a pregnancy surveillance schedule (psrf) after 8 weeks if it finds an eligible couple in that household registration. But the workers want to complete the surveillance on the same day they do the household registration, to minimize visits as well as reduce delay in surveillance. This is a very pragmatic case, and they want immediate schedule support from OpenSRP.
One solution we can think of is to reduce the schedule definition window to 0/1 day, but that does not really work, because the immediately generated schedule is triggered by Motech once a day, which cannot handle another visit to that particular household the next day.
|
1.0
|
Immediate schedule generation - When a front-line (FWA) health worker visits a house and registers a household, the OpenSRP server then generates a pregnancy surveillance schedule (psrf) after 8 weeks if it finds an eligible couple in that household registration. But the workers want to complete the surveillance on the same day they do the household registration, to minimize visits as well as reduce delay in surveillance. This is a very pragmatic case, and they want immediate schedule support from OpenSRP.
One solution we can think of is to reduce the schedule definition window to 0/1 day, but that does not really work, because the immediately generated schedule is triggered by Motech once a day, which cannot handle another visit to that particular household the next day.
|
non_process
|
immediate schedule generation when a front line fwa health worker visit a house and register a household then after that registration opensrp server generate pregnancy surveillance schedule psrf after weeks if it finds a eligible couple in that household registration but the workers want to complete the surveillance in a day they are doing household registration to minimize visit and as well as reduce delay in surveillance which is very pragmatic case and they want to get immediate schedule support from opensrp one solution we can think of reduce schedule definition window to day but that really not work because of immediate generated schedule trigger by motech once in a day that can t save next day another visit requirement to that particular household
| 0
|
4,905
| 7,783,585,384
|
IssuesEvent
|
2018-06-06 10:23:43
|
openvstorage/framework
|
https://api.github.com/repos/openvstorage/framework
|
closed
|
Why are Ubuntu and centos 7 vdisk automatic snapshots different?
|
process_wontfix
|
My Ubuntu has automatic snapshots of virtual disks. Why does centos 7 not have automatic snapshots of virtual disks, and what do I need to do?
|
1.0
|
Why are Ubuntu and centos 7 vdisk automatic snapshots different? - My Ubuntu has automatic snapshots of virtual disks. Why does centos 7 not have automatic snapshots of virtual disks, and what do I need to do?
|
process
|
why are ubuntu and centos automatic snapshots different my ubuntu has automatic snapshots of virtual disks why centos does not have automatic snapshots of virtual disks what do i need to do
| 1
|
21,141
| 28,111,846,866
|
IssuesEvent
|
2023-03-31 07:50:45
|
Open-EO/openeo-processes
|
https://api.github.com/repos/Open-EO/openeo-processes
|
closed
|
load_stac_collection
|
new process
|
**Proposed Process ID:** load_stac_collection
## Context
An openEO backend now only supports predefined collections, but properly configured STAC collections can also be loaded. This avoids the need for static config, and simplifies use cases like running an openEO backend out of the box.
There's a few bigger questions here:
- How will we deal with authentication headers? Can we do something generic?
- Would it be useful to be able to override collection metadata? This would provide a workaround for catalogs that don't provide sufficient metadata, like the openeo:gsd values to determine resolution per band.
## Summary
Loads a collection from a STAC catalog.
## Description
Loads a collection from a STAC catalog by its id and returns it as a processable data cube. The data that is added to the data cube can be restricted with the parameters spatial_extent, temporal_extent, bands and properties.
## Parameters
(copy some from load_collection/load_result)
### `collection_id`
**Optional:** no
#### Description
The STAC collection id to use when querying the catalog
#### Data Type
string
### `access_token`
**Optional:** yes
#### Description
Some catalogs may require an authorization header to be sent along with requests, both for catalog access and for data access.
#### Data Type
string
## Return Value
### Description
A data cube for further processing. The dimensions and dimension properties (name, type, labels, reference system and resolution) correspond to the collection's metadata, but the dimension labels are restricted as specified in the parameters.
### Data Type
raster-cube
## Categories (optional)
* Cubes
* Import
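Under this proposal, a process-graph node invoking the new process might look like the following. This is a hypothetical sketch modeled as a plain Python dict: the argument names mirror the parameters above plus the restriction parameters mentioned in the description, none of this is a finalized openEO API, and the collection id is a made-up example.

```python
# Hypothetical process-graph node for the proposed load_stac_collection process.
load_node = {
    "process_id": "load_stac_collection",
    "arguments": {
        "collection_id": "sentinel-2-l2a",   # example STAC collection id
        "access_token": None,                # optional; only for catalogs needing auth
        "spatial_extent": {"west": 16.1, "east": 16.6,
                           "south": 48.6, "north": 49.2},
        "temporal_extent": ["2021-01-01", "2021-03-31"],
        "bands": ["B02", "B03", "B04"],
    },
    "result": True,
}

print(load_node["process_id"])
```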
|
1.0
|
load_stac_collection - **Proposed Process ID:** load_stac_collection
## Context
An openEO backend now only supports predefined collections, but properly configured STAC collections can also be loaded. This avoids the need for static config, and simplifies use cases like running an openEO backend out of the box.
There's a few bigger questions here:
- How will we deal with authentication headers? Can we do something generic?
- Would it be useful to be able to override collection metadata? This would provide a workaround for catalogs that don't provide sufficient metadata, like the openeo:gsd values to determine resolution per band.
## Summary
Loads a collection from a STAC catalog.
## Description
Loads a collection from a STAC catalog by its id and returns it as a processable data cube. The data that is added to the data cube can be restricted with the parameters spatial_extent, temporal_extent, bands and properties.
## Parameters
(copy some from load_collection/load_result)
### `collection_id`
**Optional:** no
#### Description
The STAC collection id to use when querying the catalog
#### Data Type
string
### `access_token`
**Optional:** yes
#### Description
Some catalogs may require an authorization header to be sent along with requests, both for catalog access and for data access.
#### Data Type
string
## Return Value
### Description
A data cube for further processing. The dimensions and dimension properties (name, type, labels, reference system and resolution) correspond to the collection's metadata, but the dimension labels are restricted as specified in the parameters.
### Data Type
raster-cube
## Categories (optional)
* Cubes
* Import
|
process
|
load stac collection proposed process id load stac collection context an openeo backend now only supports predefined collections but properly configured stac collections can also be loaded this avoids the need for static config and simplifies use cases like running an openeo backend out of the box there s a few bigger questions here how will we deal with authentication headers can we do something generic would it be useful to be able to override collection metadata this would provide a workaround for catalogs that don t provide sufficient metadata like the openeo gsd values to determine resolution per band summary loads a collection from a stac catalog description loads a collection from a stac catalog by its id and returns it as a processable data cube the data that is added to the data cube can be restricted with the parameters spatial extent temporal extent bands and properties parameters copy some from load collection load result collection id optional no description the stac collection id to use when querying the catalog data type string access token optional yes description some catalogs may require an authorization header to be sent along with requests both for catalog access and for data access data type string return value description a data cube for further processing the dimensions and dimension properties name type labels reference system and resolution correspond to the collection s metadata but the dimension labels are restricted as specified in the parameters data type raster cube categories optional cubes import
| 1
|
306,769
| 26,494,028,796
|
IssuesEvent
|
2023-01-18 02:46:52
|
microsoft/AzureStorageExplorer
|
https://api.github.com/repos/microsoft/AzureStorageExplorer
|
closed
|
The toolbar and editor for one app configuration are not localized
|
:heavy_check_mark: merged 🧪 testing :globe_with_meridians: localization :gear: app-config
|
**Storage Explorer Version:** 1.28.0-dev
**Build Number:** 20221226.3
**Branch:** main
**Platform/OS:** Windows 10/Linux Ubuntu 22.04/MacOS Ventura 13.1 (Apple M1 Pro)
**Reproduce Languages:** All
**Architecture:** ia32/x64
**How Found:** From running test cases
**Regression From:** Not a regression
## Steps to Reproduce ##
1. Launch Storage Explorer -> Open 'Settings' -> 'Application -> Regional Settings'.
2. Select 'Español' -> Restart Storage Explorer.
3. Expand the 'Application Configurations' node.
4. Open one app configuration.
5. Check whether the toolbar and editor are localized.
## Expected Experience ##
The toolbar and editor are localized.
## Actual Experience ##
The toolbar and editor are not localized.

|
1.0
|
The toolbar and editor for one app configuration are not localized - **Storage Explorer Version:** 1.28.0-dev
**Build Number:** 20221226.3
**Branch:** main
**Platform/OS:** Windows 10/Linux Ubuntu 22.04/MacOS Ventura 13.1 (Apple M1 Pro)
**Reproduce Languages:** All
**Architecture:** ia32/x64
**How Found:** From running test cases
**Regression From:** Not a regression
## Steps to Reproduce ##
1. Launch Storage Explorer -> Open 'Settings' -> 'Application -> Regional Settings'.
2. Select 'Español' -> Restart Storage Explorer.
3. Expand the 'Application Configurations' node.
4. Open one app configuration.
5. Check whether the toolbar and editor are localized.
## Expected Experience ##
The toolbar and editor are localized.
## Actual Experience ##
The toolbar and editor are not localized.

|
non_process
|
the toolbar and editor for one app configuration are not localized storage explorer version dev build number branch main platform os windows linux ubuntu macos ventura apple pro reproduce languages all architecture how found from running test cases regression from not a regression steps to reproduce launch storage explorer open settings application regional settings select español restart storage explorer expand the application configurations node open one app configuration check whether the toolbar and editor are localized expected experience the toolbar and editor are localized actual experience the toolbar and editor are not localized
| 0
|
436,380
| 12,550,374,565
|
IssuesEvent
|
2020-06-06 10:53:18
|
EBIvariation/trait-curation
|
https://api.github.com/repos/EBIvariation/trait-curation
|
opened
|
Main page: implement “Export as spreadsheet” button
|
Priority: Low Scope: Main page
|
This should function in a way similar to #18. Note that this is low priority and should only be implemented after all other UI components are done & tested, unless we identify a use case which makes this more important.

|
1.0
|
Main page: implement “Export as spreadsheet” button - This should function in a way similar to #18. Note that this is low priority and should only be implemented after all other UI components are done & tested, unless we identify a use case which makes this more important.

|
non_process
|
main page implement “export as spreadsheet” button this should function in a way similar to note that this is low priority and should only be implemented after all other ui components are done tested unless we identify a use case which makes this more important
| 0
|
2,444
| 5,224,054,408
|
IssuesEvent
|
2017-01-27 14:25:01
|
DynareTeam/dynare
|
https://api.github.com/repos/DynareTeam/dynare
|
opened
|
do not automatically create *_set_auxiliary_variables.m on preprocessor run
|
enhancement preprocessor
|
As `*_set_auxiliary_variables.m` is sometimes empty, it does not make sense to create it on every run. Only create it if something will be written to it.
This change requires either a flag in `M_` to be tested every time the function is called in the code or to test `exist('*set_auxiliary_variables.m') == 2` and take the appropriate action in the code.
|
1.0
|
do not automatically create *_set_auxiliary_variables.m on preprocessor run - As `*_set_auxiliary_variables.m` is sometimes empty, it does not make sense to create it on every run. Only create it if something will be written to it.
This change requires either a flag in `M_` to be tested every time the function is called in the code or to test `exist('*set_auxiliary_variables.m') == 2` and take the appropriate action in the code.
|
process
|
do not automatically create set auxiliary variables m on preprocessor run as set auxiliary variables m is sometimes empty it does not make sense to create it on every run only create it if something will be written to it this change requires either a flag in m to be tested every time the function is called in the code or to test exist set auxiliary variables m and take the appropriate action in the code
| 1
|
242,375
| 20,244,890,914
|
IssuesEvent
|
2022-02-14 12:50:37
|
commercialhaskell/stackage
|
https://api.github.com/repos/commercialhaskell/stackage
|
closed
|
hspec-core-2.8.4
|
failure: test-suite
|
```
Test suite failure for package hspec-core-2.8.4
spec: exited with: ExitFailure 1
Failures:
test/Test/Hspec/Core/Config/OptionsSpec.hs:41:7:
1) Test.Hspec.Core.Config.Options.parseOptions, with --help, prints help
uncaught exception: IOException of type NoSuchThing
help.txt: openFile: does not exist (No such file or directory)
To rerun use: --match "/Test.Hspec.Core.Config.Options/parseOptions/with --help/prints help/"
Randomized with seed 2076271229
Finished in 0.1542 seconds
329 examples, 1 failure
```
cc @sol
|
1.0
|
hspec-core-2.8.4 - ```
Test suite failure for package hspec-core-2.8.4
spec: exited with: ExitFailure 1
Failures:
test/Test/Hspec/Core/Config/OptionsSpec.hs:41:7:
1) Test.Hspec.Core.Config.Options.parseOptions, with --help, prints help
uncaught exception: IOException of type NoSuchThing
help.txt: openFile: does not exist (No such file or directory)
To rerun use: --match "/Test.Hspec.Core.Config.Options/parseOptions/with --help/prints help/"
Randomized with seed 2076271229
Finished in 0.1542 seconds
329 examples, 1 failure
```
cc @sol
|
non_process
|
hspec core test suite failure for package hspec core spec exited with exitfailure failures test test hspec core config optionsspec hs test hspec core config options parseoptions with help prints help uncaught exception ioexception of type nosuchthing help txt openfile does not exist no such file or directory to rerun use match test hspec core config options parseoptions with help prints help randomized with seed finished in seconds examples failure cc sol
| 0
|
4,935
| 7,795,874,294
|
IssuesEvent
|
2018-06-08 09:35:05
|
StrikeNP/trac_test
|
https://api.github.com/repos/StrikeNP/trac_test
|
closed
|
Update README in directory output_scripts/twp_ice (Trac #168)
|
Migrated from Trac enhancement nielsenb@uwm.edu post_processing
|
In r3915, you've added files used to verify the .nc output for submission to the TWP-ICE intercomparison. Thanks for this.
Could you also please update the README in that same directory in order to describe each of the new files? Perhaps you could augment what you wrote in the svn commit:
". . . verification plots for the
TWP_ICE ensemble.
These scripts verify that the NC data generated for submission is valid.
The twp_ice_output_plot.m recreates figures 1-3 of the case setup
document.
verify_output.bash is a driver script that handles calling all of the
scripts."
What about twp_ice_profile_plot.m and twp_ice_timeseries_plot.m?
Does the verification consist merely in creating plots from the .nc files? Or do overplot things or compare numbers?
Thanks.
Attachments:
Migrated from http://carson.math.uwm.edu/trac/clubb/ticket/168
```json
{
"status": "closed",
"changetime": "2009-08-18T20:52:32",
"description": "In r3915, you've added files used to verify the .nc output for submission to the TWP-ICE intercomparison. Thanks for this.\n\nCould you also please update the README in that same directory in order to describe each of the new files? Perhaps you could augment what you wrote in the svn commit:\n\n\". . . verification plots for the\nTWP_ICE ensemble.\n\nThese scripts verify that the NC data generated for submission is valid.\n\nThe twp_ice_output_plot.m recreates figures 1-3 of the case setup\ndocument.\n\nverify_output.bash is a driver script that handles calling all of the\nscripts.\"\n\nWhat about twp_ice_profile_plot.m and twp_ice_timeseries_plot.m? \n\nDoes the verification consist merely in creating plots from the .nc files? Or do overplot things or compare numbers?\n\nThanks.",
"reporter": "vlarson@uwm.edu",
"cc": "",
"resolution": "Verified by V. Larson",
"_ts": "1250628752000000",
"component": "post_processing",
"summary": "Update README in directory output_scripts/twp_ice",
"priority": "minor",
"keywords": "",
"time": "2009-08-04T15:36:49",
"milestone": "Set up CLUBB for the TWP-ICE intercomparison",
"owner": "nielsenb@uwm.edu",
"type": "enhancement"
}
```
|
1.0
|
Update README in directory output_scripts/twp_ice (Trac #168) - In r3915, you've added files used to verify the .nc output for submission to the TWP-ICE intercomparison. Thanks for this.
Could you also please update the README in that same directory in order to describe each of the new files? Perhaps you could augment what you wrote in the svn commit:
". . . verification plots for the
TWP_ICE ensemble.
These scripts verify that the NC data generated for submission is valid.
The twp_ice_output_plot.m recreates figures 1-3 of the case setup
document.
verify_output.bash is a driver script that handles calling all of the
scripts."
What about twp_ice_profile_plot.m and twp_ice_timeseries_plot.m?
Does the verification consist merely in creating plots from the .nc files? Or do overplot things or compare numbers?
Thanks.
Attachments:
Migrated from http://carson.math.uwm.edu/trac/clubb/ticket/168
```json
{
"status": "closed",
"changetime": "2009-08-18T20:52:32",
"description": "In r3915, you've added files used to verify the .nc output for submission to the TWP-ICE intercomparison. Thanks for this.\n\nCould you also please update the README in that same directory in order to describe each of the new files? Perhaps you could augment what you wrote in the svn commit:\n\n\". . . verification plots for the\nTWP_ICE ensemble.\n\nThese scripts verify that the NC data generated for submission is valid.\n\nThe twp_ice_output_plot.m recreates figures 1-3 of the case setup\ndocument.\n\nverify_output.bash is a driver script that handles calling all of the\nscripts.\"\n\nWhat about twp_ice_profile_plot.m and twp_ice_timeseries_plot.m? \n\nDoes the verification consist merely in creating plots from the .nc files? Or do overplot things or compare numbers?\n\nThanks.",
"reporter": "vlarson@uwm.edu",
"cc": "",
"resolution": "Verified by V. Larson",
"_ts": "1250628752000000",
"component": "post_processing",
"summary": "Update README in directory output_scripts/twp_ice",
"priority": "minor",
"keywords": "",
"time": "2009-08-04T15:36:49",
"milestone": "Set up CLUBB for the TWP-ICE intercomparison",
"owner": "nielsenb@uwm.edu",
"type": "enhancement"
}
```
|
process
|
update readme in directory output scripts twp ice trac in you ve added files used to verify the nc output for submission to the twp ice intercomparison thanks for this could you also please update the readme in that same directory in order to describe each of the new files perhaps you could augment what you wrote in the svn commit verification plots for the twp ice ensemble these scripts verify that the nc data generated for submission is valid the twp ice output plot m recreates figures of the case setup document verify output bash is a driver script that handles calling all of the scripts what about twp ice profile plot m and twp ice timeseries plot m does the verification consist merely in creating plots from the nc files or do overplot things or compare numbers thanks attachments migrated from json status closed changetime description in you ve added files used to verify the nc output for submission to the twp ice intercomparison thanks for this n ncould you also please update the readme in that same directory in order to describe each of the new files perhaps you could augment what you wrote in the svn commit n n verification plots for the ntwp ice ensemble n nthese scripts verify that the nc data generated for submission is valid n nthe twp ice output plot m recreates figures of the case setup ndocument n nverify output bash is a driver script that handles calling all of the nscripts n nwhat about twp ice profile plot m and twp ice timeseries plot m n ndoes the verification consist merely in creating plots from the nc files or do overplot things or compare numbers n nthanks reporter vlarson uwm edu cc resolution verified by v larson ts component post processing summary update readme in directory output scripts twp ice priority minor keywords time milestone set up clubb for the twp ice intercomparison owner nielsenb uwm edu type enhancement
| 1
|
153,188
| 5,886,972,232
|
IssuesEvent
|
2017-05-17 05:35:08
|
CovertJaguar/Railcraft
|
https://api.github.com/repos/CovertJaguar/Railcraft
|
closed
|
multiblock not forming on login
|
bug cannot reproduce needs verification old version priority-low
|
this happens whenever I log into the server in a chunk close to the coke ovens or tanks.
https://youtu.be/lowsT3twtTY
Railcraft_1.7.10-9.12.2.0.jar
Forge 10.13.4.1566
|
1.0
|
multiblock not forming on login - this happens whenever I log into the server in a chunk close to the coke ovens or tanks.
https://youtu.be/lowsT3twtTY
Railcraft_1.7.10-9.12.2.0.jar
Forge 10.13.4.1566
|
non_process
|
multiblock not forming on login this happens whenever i log into the server in a chuck close to the coke ovens or tanks railcraft jar forge
| 0
|
16,832
| 22,062,010,103
|
IssuesEvent
|
2022-05-30 19:20:56
|
tc39/proposal-regexp-v-flag
|
https://api.github.com/repos/tc39/proposal-regexp-v-flag
|
closed
|
Advance to Stage 3
|
process
|
Criteria taken from [the TC39 process document](https://tc39.es/process-document/) minus those from previous stages:
> - [x] Complete spec text
https://github.com/tc39/proposal-regexp-set-notation#specification
> - [x] Designated reviewers have signed off on the current spec text
- [x] @waldemarhorwat: https://github.com/tc39/proposal-regexp-set-notation/issues/54
- [x] @gibson042: https://github.com/tc39/ecma262/pull/2418#pullrequestreview-922486850
- [x] @msaboff: https://github.com/tc39/proposal-regexp-set-notation/issues/55
> - [x] All ECMAScript editors have signed off on the current spec text
- [x] @bakkot: https://github.com/tc39/ecma262/pull/2418
- [x] @michaelficarra: verbally approved during 2022-03-29 TC39 meeting
- [x] @syg: verbally approved during 2022-03-29 TC39 meeting
|
1.0
|
Advance to Stage 3 - Criteria taken from [the TC39 process document](https://tc39.es/process-document/) minus those from previous stages:
> - [x] Complete spec text
https://github.com/tc39/proposal-regexp-set-notation#specification
> - [x] Designated reviewers have signed off on the current spec text
- [x] @waldemarhorwat: https://github.com/tc39/proposal-regexp-set-notation/issues/54
- [x] @gibson042: https://github.com/tc39/ecma262/pull/2418#pullrequestreview-922486850
- [x] @msaboff: https://github.com/tc39/proposal-regexp-set-notation/issues/55
> - [x] All ECMAScript editors have signed off on the current spec text
- [x] @bakkot: https://github.com/tc39/ecma262/pull/2418
- [x] @michaelficarra: verbally approved during 2022-03-29 TC39 meeting
- [x] @syg: verbally approved during 2022-03-29 TC39 meeting
|
process
|
advance to stage criteria taken from minus those from previous stages complete spec text designated reviewers have signed off on the current spec text waldemarhorwat msaboff all ecmascript editors have signed off on the current spec text bakkot michaelficarra verbally approved during meeting syg verbally approved during meeting
| 1
|
16,237
| 20,790,139,460
|
IssuesEvent
|
2022-03-17 00:28:50
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
QP/MBQL: `[:relative-datetime :current]` doesn't work inside `[:between]` filter
|
Type:Bug Priority:P2 Querying/Processor .Backend
|
Combining the two types of syntactic sugar clauses doesn't work properly and results in an error. Should be an easy fix
|
1.0
|
QP/MBQL: `[:relative-datetime :current]` doesn't work inside `[:between]` filter - Combining the two types of syntactic sugar clauses doesn't work properly and results in an error. Should be an easy fix
|
process
|
qp mbql doesn t work inside filter combining the two types of syntactic sugar clauses doesn t work properly and results in an error should be an easy fix
| 1
|
19,390
| 25,528,473,443
|
IssuesEvent
|
2022-11-29 05:52:21
|
open-telemetry/opentelemetry-collector-contrib
|
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
|
closed
|
[servicegraphprocessor] Index out of range panic in updateDurationMetrics method
|
bug priority:p2 processor/servicegraph
|
### What happened?
## Description
Index out of range panic in updateDurationMetrics method.
## Steps to Reproduce
Starting up with servicegraphprocessor configured for a while, the collector got a panic.
### Collector version
v0.63.0
### Environment information
## Environment
OS: macos 12.3.1
Compiler (if manually compiled): go 1.8
### OpenTelemetry Collector configuration
```yaml
extensions:
health_check:
receivers:
otlp:
protocols:
grpc:
http:
otlp/servicegraph: # Dummy receiver for the metrics pipeline
protocols:
grpc:
endpoint: localhost:12345
processors:
batch:
servicegraph:
metrics_exporter: prometheus/servicegraph
latency_histogram_buckets: [2ms, 4ms, 6ms, 8ms, 10ms, 50ms, 100ms, 200ms, 500ms, 800ms, 1s, 1400ms, 2s, 5s, 10s, 15s]
dimensions:
- k8s.cluster.id
- k8s.namespace.name
store:
ttl: 10s
max_items: 100000
exporters:
logging:
prometheus/servicegraph:
endpoint: 0.0.0.0:8889
service:
telemetry:
metrics:
address: 0.0.0.0:8888
pipelines:
traces:
receivers: [otlp]
processors: [servicegraph,batch]
exporters: [logging]
metrics/servicegraph:
receivers: [otlp/servicegraph]
processors: []
exporters: [prometheus/servicegraph]
extensions: [health_check]
```
### Log output
```shell
panic: runtime error: index out of range [16] with length 16 [recovered]
panic: runtime error: index out of range [16] with length 16
goroutine 2747 [running]:
go.opentelemetry.io/otel/sdk/trace.(*recordingSpan).End.func1()
/Users/fraps/go/pkg/mod/go.opentelemetry.io/otel/sdk@v1.11.1/trace/span.go:383 +0x30
go.opentelemetry.io/otel/sdk/trace.(*recordingSpan).End(0x14001838480, {0x0, 0x0, 0x10?})
/Users/fraps/go/pkg/mod/go.opentelemetry.io/otel/sdk@v1.11.1/trace/span.go:415 +0x6c8
panic({0x10abcb220, 0x14000dc95d8})
/usr/local/go/src/runtime/panic.go:838 +0x204
github.com/open-telemetry/opentelemetry-collector-contrib/processor/servicegraphprocessor.(*processor).updateDurationMetrics(0x14000b433f0, {0x14001fa6240, 0x5d}, 0x40ed687b645a1cac)
/Users/fraps/daocloud/github-code/opentelemetry-collector-contrib/processor/servicegraphprocessor/processor.go:331 +0x17c
github.com/open-telemetry/opentelemetry-collector-contrib/processor/servicegraphprocessor.(*processor).aggregateMetricsForEdge(0x14000b433f0, 0x14001a195f0)
/Users/fraps/daocloud/github-code/opentelemetry-collector-contrib/processor/servicegraphprocessor/processor.go:301 +0x19c
github.com/open-telemetry/opentelemetry-collector-contrib/processor/servicegraphprocessor.(*processor).onComplete(0x14000b433f0, 0x14001a195f0)
/Users/fraps/daocloud/github-code/opentelemetry-collector-contrib/processor/servicegraphprocessor/processor.go:273 +0x2f4
github.com/open-telemetry/opentelemetry-collector-contrib/processor/servicegraphprocessor/internal/store.(*store).UpsertEdge(0x14000cc6a80, {0x14001b0fc40, 0x31}, 0x1400184eaa0)
/Users/fraps/daocloud/github-code/opentelemetry-collector-contrib/processor/servicegraphprocessor/internal/store/store.go:77 +0x154
github.com/open-telemetry/opentelemetry-collector-contrib/processor/servicegraphprocessor.(*processor).aggregateMetrics(0x14000b433f0, {0x10b1369c8, 0x14002465290}, {0x0?})
/Users/fraps/daocloud/github-code/opentelemetry-collector-contrib/processor/servicegraphprocessor/processor.go:224 +0x9dc
github.com/open-telemetry/opentelemetry-collector-contrib/processor/servicegraphprocessor.(*processor).ConsumeTraces(0x14000b433f0, {0x10b1369c8, 0x14002465290}, {0x10884f9c7?})
/Users/fraps/daocloud/github-code/opentelemetry-collector-contrib/processor/servicegraphprocessor/processor.go:145 +0x30
go.opentelemetry.io/collector/receiver/otlpreceiver/internal/trace.(*Receiver).Export(0x1400059b170, {0x10b1369c8, 0x14002465200}, {0x10b1163b8?})
/Users/fraps/go/pkg/mod/go.opentelemetry.io/collector@v0.63.2-0.20221031183340-2ed8c0c6ff9c/receiver/otlpreceiver/internal/trace/otlp.go:60 +0xb4
go.opentelemetry.io/collector/pdata/ptrace/ptraceotlp.rawTracesServer.Export({{0x10b0ec300?, 0x1400059b170?}}, {0x10b1369c8?, 0x14002465200?}, 0x10885c200?)
/Users/fraps/go/pkg/mod/go.opentelemetry.io/collector/pdata@v0.63.2-0.20221031183340-2ed8c0c6ff9c/ptrace/ptraceotlp/grpc.go:72 +0xf8
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler.func1({0x10b1369c8, 0x14002465200}, {0x10adc99c0?, 0x14001fda4b0})
/Users/fraps/go/pkg/mod/go.opentelemetry.io/collector/pdata@v0.63.2-0.20221031183340-2ed8c0c6ff9c/internal/data/protogen/collector/trace/v1/trace_service.pb.go:310 +0x78
go.opentelemetry.io/collector/config/configgrpc.enhanceWithClientInformation.func1({0x10b1369c8?, 0x140024651a0?}, {0x10adc99c0, 0x14001fda4b0}, 0x2?, 0x14001fda4c8)
/Users/fraps/go/pkg/mod/go.opentelemetry.io/collector@v0.63.2-0.20221031183340-2ed8c0c6ff9c/config/configgrpc/configgrpc.go:415 +0x54
google.golang.org/grpc.chainUnaryInterceptors.func1.1({0x10b1369c8?, 0x140024651a0?}, {0x10adc99c0?, 0x14001fda4b0?})
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1162 +0x64
go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.UnaryServerInterceptor.func1({0x10b1369c8, 0x140024650e0}, {0x10adc99c0, 0x14001fda4b0}, 0x14001b0d0e0, 0x14001b03fc0)
/Users/fraps/go/pkg/mod/go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc@v0.36.4/interceptor.go:341 +0x34c
google.golang.org/grpc.chainUnaryInterceptors.func1.1({0x10b1369c8?, 0x140024650e0?}, {0x10adc99c0?, 0x14001fda4b0?})
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165 +0x90
google.golang.org/grpc.chainUnaryInterceptors.func1({0x10b1369c8, 0x140024650e0}, {0x10adc99c0, 0x14001fda4b0}, 0x14001b0d0e0, 0x14001fda4c8)
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1167 +0x124
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler({0x10a219640?, 0x14000cd0690}, {0x10b1369c8, 0x140024650e0}, 0x140007ce070, 0x140001ea8a0)
/Users/fraps/go/pkg/mod/go.opentelemetry.io/collector/pdata@v0.63.2-0.20221031183340-2ed8c0c6ff9c/internal/data/protogen/collector/trace/v1/trace_service.pb.go:312 +0x13c
google.golang.org/grpc.(*Server).processUnaryRPC(0x14000c925a0, {0x10b154440, 0x14000142340}, 0x1400184a900, 0x14000d19ec0, 0x10ea5e350, 0x0)
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1340 +0xb90
google.golang.org/grpc.(*Server).handleStream(0x14000c925a0, {0x10b154440, 0x14000142340}, 0x1400184a900, 0x0)
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1713 +0x840
google.golang.org/grpc.(*Server).serveStreams.func1.2()
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:965 +0x88
created by google.golang.org/grpc.(*Server).serveStreams.func1
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:963 +0x298
```
### Additional context
_No response_
|
1.0
|
[servicegraphprocessor] Index out of range panic in updateDurationMetrics method - ### What happened?
## Description
Index out of range panic in updateDurationMetrics method.
## Steps to Reproduce
Starting up with servicegraphprocessor configured for a while, the collector got a panic.
### Collector version
v0.63.0
### Environment information
## Environment
OS: macos 12.3.1
Compiler(if manually compiled): go 1.8
### OpenTelemetry Collector configuration
```yaml
extensions:
health_check:
receivers:
otlp:
protocols:
grpc:
http:
otlp/servicegraph: # Dummy receiver for the metrics pipeline
protocols:
grpc:
endpoint: localhost:12345
processors:
batch:
servicegraph:
metrics_exporter: prometheus/servicegraph
latency_histogram_buckets: [2ms, 4ms, 6ms, 8ms, 10ms, 50ms, 100ms, 200ms, 500ms, 800ms, 1s, 1400ms, 2s, 5s, 10s, 15s]
dimensions:
- k8s.cluster.id
- k8s.namespace.name
store:
ttl: 10s
max_items: 100000
exporters:
logging:
prometheus/servicegraph:
endpoint: 0.0.0.0:8889
service:
telemetry:
metrics:
address: 0.0.0.0:8888
pipelines:
traces:
receivers: [otlp]
processors: [servicegraph,batch]
exporters: [logging]
metrics/servicegraph:
receivers: [otlp/servicegraph]
processors: []
exporters: [prometheus/servicegraph]
extensions: [health_check]
```
### Log output
```shell
panic: runtime error: index out of range [16] with length 16 [recovered]
panic: runtime error: index out of range [16] with length 16
goroutine 2747 [running]:
go.opentelemetry.io/otel/sdk/trace.(*recordingSpan).End.func1()
/Users/fraps/go/pkg/mod/go.opentelemetry.io/otel/sdk@v1.11.1/trace/span.go:383 +0x30
go.opentelemetry.io/otel/sdk/trace.(*recordingSpan).End(0x14001838480, {0x0, 0x0, 0x10?})
/Users/fraps/go/pkg/mod/go.opentelemetry.io/otel/sdk@v1.11.1/trace/span.go:415 +0x6c8
panic({0x10abcb220, 0x14000dc95d8})
/usr/local/go/src/runtime/panic.go:838 +0x204
github.com/open-telemetry/opentelemetry-collector-contrib/processor/servicegraphprocessor.(*processor).updateDurationMetrics(0x14000b433f0, {0x14001fa6240, 0x5d}, 0x40ed687b645a1cac)
/Users/fraps/daocloud/github-code/opentelemetry-collector-contrib/processor/servicegraphprocessor/processor.go:331 +0x17c
github.com/open-telemetry/opentelemetry-collector-contrib/processor/servicegraphprocessor.(*processor).aggregateMetricsForEdge(0x14000b433f0, 0x14001a195f0)
/Users/fraps/daocloud/github-code/opentelemetry-collector-contrib/processor/servicegraphprocessor/processor.go:301 +0x19c
github.com/open-telemetry/opentelemetry-collector-contrib/processor/servicegraphprocessor.(*processor).onComplete(0x14000b433f0, 0x14001a195f0)
/Users/fraps/daocloud/github-code/opentelemetry-collector-contrib/processor/servicegraphprocessor/processor.go:273 +0x2f4
github.com/open-telemetry/opentelemetry-collector-contrib/processor/servicegraphprocessor/internal/store.(*store).UpsertEdge(0x14000cc6a80, {0x14001b0fc40, 0x31}, 0x1400184eaa0)
/Users/fraps/daocloud/github-code/opentelemetry-collector-contrib/processor/servicegraphprocessor/internal/store/store.go:77 +0x154
github.com/open-telemetry/opentelemetry-collector-contrib/processor/servicegraphprocessor.(*processor).aggregateMetrics(0x14000b433f0, {0x10b1369c8, 0x14002465290}, {0x0?})
/Users/fraps/daocloud/github-code/opentelemetry-collector-contrib/processor/servicegraphprocessor/processor.go:224 +0x9dc
github.com/open-telemetry/opentelemetry-collector-contrib/processor/servicegraphprocessor.(*processor).ConsumeTraces(0x14000b433f0, {0x10b1369c8, 0x14002465290}, {0x10884f9c7?})
/Users/fraps/daocloud/github-code/opentelemetry-collector-contrib/processor/servicegraphprocessor/processor.go:145 +0x30
go.opentelemetry.io/collector/receiver/otlpreceiver/internal/trace.(*Receiver).Export(0x1400059b170, {0x10b1369c8, 0x14002465200}, {0x10b1163b8?})
/Users/fraps/go/pkg/mod/go.opentelemetry.io/collector@v0.63.2-0.20221031183340-2ed8c0c6ff9c/receiver/otlpreceiver/internal/trace/otlp.go:60 +0xb4
go.opentelemetry.io/collector/pdata/ptrace/ptraceotlp.rawTracesServer.Export({{0x10b0ec300?, 0x1400059b170?}}, {0x10b1369c8?, 0x14002465200?}, 0x10885c200?)
/Users/fraps/go/pkg/mod/go.opentelemetry.io/collector/pdata@v0.63.2-0.20221031183340-2ed8c0c6ff9c/ptrace/ptraceotlp/grpc.go:72 +0xf8
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler.func1({0x10b1369c8, 0x14002465200}, {0x10adc99c0?, 0x14001fda4b0})
/Users/fraps/go/pkg/mod/go.opentelemetry.io/collector/pdata@v0.63.2-0.20221031183340-2ed8c0c6ff9c/internal/data/protogen/collector/trace/v1/trace_service.pb.go:310 +0x78
go.opentelemetry.io/collector/config/configgrpc.enhanceWithClientInformation.func1({0x10b1369c8?, 0x140024651a0?}, {0x10adc99c0, 0x14001fda4b0}, 0x2?, 0x14001fda4c8)
/Users/fraps/go/pkg/mod/go.opentelemetry.io/collector@v0.63.2-0.20221031183340-2ed8c0c6ff9c/config/configgrpc/configgrpc.go:415 +0x54
google.golang.org/grpc.chainUnaryInterceptors.func1.1({0x10b1369c8?, 0x140024651a0?}, {0x10adc99c0?, 0x14001fda4b0?})
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1162 +0x64
go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.UnaryServerInterceptor.func1({0x10b1369c8, 0x140024650e0}, {0x10adc99c0, 0x14001fda4b0}, 0x14001b0d0e0, 0x14001b03fc0)
/Users/fraps/go/pkg/mod/go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc@v0.36.4/interceptor.go:341 +0x34c
google.golang.org/grpc.chainUnaryInterceptors.func1.1({0x10b1369c8?, 0x140024650e0?}, {0x10adc99c0?, 0x14001fda4b0?})
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1165 +0x90
google.golang.org/grpc.chainUnaryInterceptors.func1({0x10b1369c8, 0x140024650e0}, {0x10adc99c0, 0x14001fda4b0}, 0x14001b0d0e0, 0x14001fda4c8)
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1167 +0x124
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler({0x10a219640?, 0x14000cd0690}, {0x10b1369c8, 0x140024650e0}, 0x140007ce070, 0x140001ea8a0)
/Users/fraps/go/pkg/mod/go.opentelemetry.io/collector/pdata@v0.63.2-0.20221031183340-2ed8c0c6ff9c/internal/data/protogen/collector/trace/v1/trace_service.pb.go:312 +0x13c
google.golang.org/grpc.(*Server).processUnaryRPC(0x14000c925a0, {0x10b154440, 0x14000142340}, 0x1400184a900, 0x14000d19ec0, 0x10ea5e350, 0x0)
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1340 +0xb90
google.golang.org/grpc.(*Server).handleStream(0x14000c925a0, {0x10b154440, 0x14000142340}, 0x1400184a900, 0x0)
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:1713 +0x840
google.golang.org/grpc.(*Server).serveStreams.func1.2()
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:965 +0x88
created by google.golang.org/grpc.(*Server).serveStreams.func1
/Users/fraps/go/pkg/mod/google.golang.org/grpc@v1.50.1/server.go:963 +0x298
```
### Additional context
_No response_
|
process
|
index out of range panic in updatedurationmetrics method what happened description index out of range panic in updatedurationmetrics method steps to reproduce startting up with servicegraphprocessor configured for a while the collector got panic collector version environment information environment os macos compiler if manually compiled go opentelemetry collector configuration yaml extensions health check receivers otlp protocols grpc http otlp servicegraph dummy receiver for the metrics pipeline protocols grpc endpoint localhost processors batch servicegraph metrics exporter prometheus servicegraph latency histogram buckets dimensions cluster id namespace name store ttl max items exporters logging prometheus servicegraph endpoint service telemetry metrics address pipelines traces receivers processors exporters metrics servicegraph receivers processors exporters extensions log output shell panic runtime error index out of range with length panic runtime error index out of range with length goroutine go opentelemetry io otel sdk trace recordingspan end users fraps go pkg mod go opentelemetry io otel sdk trace span go go opentelemetry io otel sdk trace recordingspan end users fraps go pkg mod go opentelemetry io otel sdk trace span go panic usr local go src runtime panic go github com open telemetry opentelemetry collector contrib processor servicegraphprocessor processor updatedurationmetrics users fraps daocloud github code opentelemetry collector contrib processor servicegraphprocessor processor go github com open telemetry opentelemetry collector contrib processor servicegraphprocessor processor aggregatemetricsforedge users fraps daocloud github code opentelemetry collector contrib processor servicegraphprocessor processor go github com open telemetry opentelemetry collector contrib processor servicegraphprocessor processor oncomplete users fraps daocloud github code opentelemetry collector contrib processor servicegraphprocessor processor go github com open 
telemetry opentelemetry collector contrib processor servicegraphprocessor internal store store upsertedge users fraps daocloud github code opentelemetry collector contrib processor servicegraphprocessor internal store store go github com open telemetry opentelemetry collector contrib processor servicegraphprocessor processor aggregatemetrics users fraps daocloud github code opentelemetry collector contrib processor servicegraphprocessor processor go github com open telemetry opentelemetry collector contrib processor servicegraphprocessor processor consumetraces users fraps daocloud github code opentelemetry collector contrib processor servicegraphprocessor processor go go opentelemetry io collector receiver otlpreceiver internal trace receiver export users fraps go pkg mod go opentelemetry io collector receiver otlpreceiver internal trace otlp go go opentelemetry io collector pdata ptrace ptraceotlp rawtracesserver export users fraps go pkg mod go opentelemetry io collector pdata ptrace ptraceotlp grpc go go opentelemetry io collector pdata internal data protogen collector trace traceservice export handler users fraps go pkg mod go opentelemetry io collector pdata internal data protogen collector trace trace service pb go go opentelemetry io collector config configgrpc enhancewithclientinformation users fraps go pkg mod go opentelemetry io collector config configgrpc configgrpc go google golang org grpc chainunaryinterceptors users fraps go pkg mod google golang org grpc server go go opentelemetry io contrib instrumentation google golang org grpc otelgrpc unaryserverinterceptor users fraps go pkg mod go opentelemetry io contrib instrumentation google golang org grpc otelgrpc interceptor go google golang org grpc chainunaryinterceptors users fraps go pkg mod google golang org grpc server go google golang org grpc chainunaryinterceptors users fraps go pkg mod google golang org grpc server go go opentelemetry io collector pdata internal data protogen collector trace 
traceservice export handler users fraps go pkg mod go opentelemetry io collector pdata internal data protogen collector trace trace service pb go google golang org grpc server processunaryrpc users fraps go pkg mod google golang org grpc server go google golang org grpc server handlestream users fraps go pkg mod google golang org grpc server go google golang org grpc server servestreams users fraps go pkg mod google golang org grpc server go created by google golang org grpc server servestreams users fraps go pkg mod google golang org grpc server go additional context no response
| 1
|
4,382
| 7,273,083,873
|
IssuesEvent
|
2018-02-21 02:38:20
|
GoogleCloudPlatform/google-cloud-python
|
https://api.github.com/repos/GoogleCloudPlatform/google-cloud-python
|
closed
|
Logging: Release new version of google-cloud-logging
|
api: logging type: process
|
Hello,
The latest version of Google cloud logging is https://pypi.org/project/google-cloud-logging/#history, in October 2017. We wanted a PyPi release with the latest changes with respect to creating sinks with `unique_writer_identity` set.
We really would prefer not using the latest google-cloud module as we want to use different versions of various google cloud services.
Thanks,
Sneha
|
1.0
|
Logging: Release new version of google-cloud-logging - Hello,
The latest version of Google cloud logging is https://pypi.org/project/google-cloud-logging/#history, in October 2017. We wanted a PyPi release with the latest changes with respect to creating sinks with `unique_writer_identity` set.
We really would prefer not using the latest google-cloud module as we want to use different versions of various google cloud services.
Thanks,
Sneha
|
process
|
logging release new version of google cloud logging hello the latest version of google cloud logging is in october we wanted a pypi release with the latest changes with respect to creating sinks with unique writer identity set we really would prefer not using the latest google cloud module as we want to use different versions of various google cloud services thanks sneha
| 1
|
258,653
| 8,178,616,979
|
IssuesEvent
|
2018-08-28 14:17:07
|
Theophilix/event-table-edit
|
https://api.github.com/repos/Theophilix/event-table-edit
|
closed
|
Frontend: Enable stack view also for large screens
|
enhancement low priority
|
When stack view is chosen, the view stays in toggle mode until browser width is <640px. Enable stack view also for large screen.
|
1.0
|
Frontend: Enable stack view also for large screens - When stack view is chosen, the view stays in toggle mode until browser width is <640px. Enable stack view also for large screen.
|
non_process
|
frontend enable stack view also for large screens when stack view is chosen the view stays in toggle mode until browser width is enable stack view also for large screen
| 0
|
94,720
| 10,851,448,412
|
IssuesEvent
|
2019-11-13 10:47:18
|
geosolutions-it/MapStore2
|
https://api.github.com/repos/geosolutions-it/MapStore2
|
reopened
|
GeoStory User Guide
|
Documentation GeoStory Priority: High Task User Guide
|
### Description
A proper documentation for the GeoStory tool must be provided for users in the user guide.
### Other useful information (optional):
This task must be accomplished within a dedicated branch of mapstore to be merged with master as soon as the first version of the tool is completed.
|
1.0
|
GeoStory User Guide - ### Description
A proper documentation for the GeoStory tool must be provided for users in the user guide.
### Other useful information (optional):
This task must be accomplished within a dedicated branch of mapstore to be merged with master as soon as the first version of the tool is completed.
|
non_process
|
geostory user guide description a proper documentation for the geostory tool must be provided for users in the user guide other useful information optional this task must be accomplished within a dedicated branch of mapstore to be merged with master as soon as the first version of the tool is completed
| 0
|
4,370
| 7,260,515,712
|
IssuesEvent
|
2018-02-18 10:54:27
|
qgis/QGIS-Documentation
|
https://api.github.com/repos/qgis/QGIS-Documentation
|
closed
|
[FEATURE] New algorithms to add Z/M values to existing geometries
|
Automatic new feature Processing
|
Original commit: https://github.com/qgis/QGIS/commit/340cf93f93fb28e1aa9e23b2d80a7c38e6a89d6c by nyalldawson
Allows upgrading geometries to include these dimensions, or
overwriting any existing Z/M values with a new value.
Intended mostly as a test run for QgsProcessingFeatureBasedAlgorithm
|
1.0
|
[FEATURE] New algorithms to add Z/M values to existing geometries - Original commit: https://github.com/qgis/QGIS/commit/340cf93f93fb28e1aa9e23b2d80a7c38e6a89d6c by nyalldawson
Allows upgrading geometries to include these dimensions, or
overwriting any existing Z/M values with a new value.
Intended mostly as a test run for QgsProcessingFeatureBasedAlgorithm
|
process
|
new algorithms to add z m values to existing geometries original commit by nyalldawson allows upgrading geometries to include these dimensions or overwriting any existing z m values with a new value intended mostly as a test run for qgsprocessingfeaturebasedalgorithm
| 1
|
40,078
| 6,797,338,956
|
IssuesEvent
|
2017-11-01 22:24:59
|
glpi-project/php-library-glpi
|
https://api.github.com/repos/glpi-project/php-library-glpi
|
closed
|
Update README
|
documentation
|
Hi, @Naylin15
Please, replace this URL in the README file:
https://dev.flyve.org/glpi/apirest.php
To this one:
https://github.com/glpi-project/glpi/blob/master/apirest.md
And the new install command is:
`composer require glpi-project/php-library-glpi`
Thank you.
|
1.0
|
Update README - Hi, @Naylin15
Please, replace this URL in the README file:
https://dev.flyve.org/glpi/apirest.php
To this one:
https://github.com/glpi-project/glpi/blob/master/apirest.md
And the new install command is:
`composer require glpi-project/php-library-glpi`
Thank you.
|
non_process
|
update readme hi please replace this url in the readme file to this one and the new install command is composer require glpi project php library glpi thank you
| 0
|
15,735
| 19,910,273,748
|
IssuesEvent
|
2022-01-25 16:30:20
|
input-output-hk/high-assurance-legacy
|
https://api.github.com/repos/input-output-hk/high-assurance-legacy
|
closed
|
Add an automatic proof method that shows a given set of possible transitions to be complete
|
type: enhancement reason: wontfix language: isabelle topic: process calculus
|
When proving that a certain relation is a simulation, one often needs to consider all forms of transitions possible from a process of a certain shape. Applying case distinction repeatedly for this purpose is laborious and leads to complicated proofs.
In an informal proof, one would typically state the ultimate cases right away and assume that the reader is able to figure out why the given list of cases is comprehensive. Our goal is to make an analogous approach possible for formal proofs in Isar. The user should be able to state the possible cases using the `consider` construct and let Isabelle automatically proof the elimination rule created by that.
For this purpose, we want to implement a proof method that constructs proofs of such elimination rules. This proof method should make use of elimination rules of transition relations as well as higher-level elimination rules derived from them for more specific shapes of processes. The user should be able to specify the set of rules to apply via a dynamic fact (see Subsubsection 1.3.1 of the [Eisbach User Manual][eisbach]).
[eisbach]:
https://isabelle.in.tum.de/dist/Isabelle2018/doc/eisbach.pdf
"The Eisbach User Manual"
- [ ] Conduct manual experiments to figure out an appropriate proof method definition
- [ ] Research whether Eisbach supports different definitions of a dynamic fact for different interpretations of a locale
- [ ] Implement the proof method
- [ ] Implement higher-level elimination rules
|
1.0
|
Add an automatic proof method that shows a given set of possible transitions to be complete - When proving that a certain relation is a simulation, one often needs to consider all forms of transitions possible from a process of a certain shape. Applying case distinction repeatedly for this purpose is laborious and leads to complicated proofs.
In an informal proof, one would typically state the ultimate cases right away and assume that the reader is able to figure out why the given list of cases is comprehensive. Our goal is to make an analogous approach possible for formal proofs in Isar. The user should be able to state the possible cases using the `consider` construct and let Isabelle automatically proof the elimination rule created by that.
For this purpose, we want to implement a proof method that constructs proofs of such elimination rules. This proof method should make use of elimination rules of transition relations as well as higher-level elimination rules derived from them for more specific shapes of processes. The user should be able to specify the set of rules to apply via a dynamic fact (see Subsubsection 1.3.1 of the [Eisbach User Manual][eisbach]).
[eisbach]:
https://isabelle.in.tum.de/dist/Isabelle2018/doc/eisbach.pdf
"The Eisbach User Manual"
- [ ] Conduct manual experiments to figure out an appropriate proof method definition
- [ ] Research whether Eisbach supports different definitions of a dynamic fact for different interpretations of a locale
- [ ] Implement the proof method
- [ ] Implement higher-level elimination rules
|
process
|
add an automatic proof method that shows a given set of possible transitions to be complete when proving that a certain relation is a simulation one often needs to consider all forms of transitions possible from a process of a certain shape applying case distinction repeatedly for this purpose is laborious and leads to complicated proofs in an informal proof one would typically state the ultimate cases right away and assume that the reader is able to figure out why the given list of cases is comprehensive our goal is to make an analogous approach possible for formal proofs in isar the user should be able to state the possible cases using the consider construct and let isabelle automatically proof the elimination rule created by that for this purpose we want to implement a proof method that constructs proofs of such elimination rules this proof method should make use of elimination rules of transition relations as well as higher level elimination rules derived from them for more specific shapes of processes the user should be able to specify the set of rules to apply via a dynamic fact see subsubsection of the the eisbach user manual conduct manual experiments to figure out an appropriate proof method definition research whether eisbach supports different definitions of a dynamic fact for different interpretations of a locale implement the proof method implement higher level elimination rules
| 1
|
10,280
| 13,132,054,149
|
IssuesEvent
|
2020-08-06 18:10:04
|
googleapis/code-suggester
|
https://api.github.com/repos/googleapis/code-suggester
|
closed
|
Framework-core: handle existing PRs and existing branches
|
type: process
|
- [x] When there is an existing PR on an up-stream repository and a PR from the same branch and same down-stream repository is opened, there will be an error thrown. Ensure that the latest version is made into a PR.
- [x] When there is an existing branch and someone tries to apply changes onto an existing branch, optionally overwrite the existing branch. If overwrite is enabled, overwrite. Otherwise gracefully fail.
### Additional Information
This is for branches on a fork from an upstream-repository, and PRs from a fork on an upstream-repository
|
1.0
|
Framework-core: handle existing PRs and existing branches - - [x] When there is an existing PR on an up-stream repository and a PR from the same branch and same down-stream repository is opened, there will be an error thrown. Ensure that the latest version is made into a PR.
- [x] When there is an existing branch and someone tries to apply changes onto an existing branch, optionally overwrite the existing branch. If overwrite is enabled, overwrite. Otherwise gracefully fail.
### Additional Information
This is for branches on a fork from an upstream-repository, and PRs from a fork on an upstream-repository
|
process
|
framework core handle existing prs and existing branches when there is an existing pr on an up stream repository and a pr from the same branch and same down stream repository is opened there will be an error thrown ensure that the latest version is made into a pr when there is an existing branch and someone tries to apply changes onto an existing branch optionally overwrite the existing branch if overwrite is enabled overwrite otherwise gracefully fail additional information this is for branches on a fork from an upstream repository and prs from a fork on an upstream repository
| 1
|
21,553
| 29,868,206,783
|
IssuesEvent
|
2023-06-20 06:32:10
|
allinurl/goaccess
|
https://api.github.com/repos/allinurl/goaccess
|
closed
|
Automatically parse log files in folder.
|
question log-processing
|
Hi
Love your work with GoAccess.
One thing, would it be possible to have it automatically parse log files in a defined folder?
That way log rotating will not mess-up parsing and you would not have to change the config every time you add a new log from a system. you would to have to tell what ever system it may be, to output logs to that folder in what ever format you have configured I GoAccess.
|
1.0
|
Automatically parse log files in folder. - Hi
Love your work with GoAccess.
One thing, would it be possible to have it automatically parse log files in a defined folder?
That way log rotating will not mess-up parsing and you would not have to change the config every time you add a new log from a system. you would to have to tell what ever system it may be, to output logs to that folder in what ever format you have configured I GoAccess.
|
process
|
automatically parse log files in folder hi love your work with goaccess one thing would it be possible to have it automatically parse log files in a defined folder that way log rotating will not mess up parsing and you would not have to change the config every time you add a new log from a system you would to have to tell what ever system it may be to output logs to that folder in what ever format you have configured i goaccess
| 1
|
22,015
| 11,660,553,571
|
IssuesEvent
|
2020-03-03 03:44:26
|
cityofaustin/atd-geospatial
|
https://api.github.com/repos/cityofaustin/atd-geospatial
|
closed
|
1:1 GIS Training - M. Alonso
|
Service: Geo Type: IT Support Workgroup: OSE
|
Maria would like to improve her GIS skills for queries and using AMANDA data. I am going to work with her 1:1 since she has specific needs with using GIS. Meeting Thursday 2/27 1pm-3pm at OTC.
|
1.0
|
1:1 GIS Training - M. Alonso - Maria would like to improve her GIS skills for queries and using AMANDA data. I am going to work with her 1:1 since she has specific needs with using GIS. Meeting Thursday 2/27 1pm-3pm at OTC.
|
non_process
|
gis training m alonso maria would like to improve her gis skills for queries and using amanda data i am going to work with her since she has specific needs with using gis meeting thursday at otc
| 0
|
19,617
| 25,970,716,728
|
IssuesEvent
|
2022-12-19 11:00:14
|
deepset-ai/haystack
|
https://api.github.com/repos/deepset-ai/haystack
|
closed
|
Incorporate LayoutLM for information extraction from PDFs
|
type:feature topic:preprocessing
|
Hugging Face recently [added](https://twitter.com/huggingface/status/1432717993637818383) LayoutLMv2 and published a [repo](https://github.com/NielsRogge/Transformers-Tutorials/tree/master/LayoutLMv2) with several nice tutorials on how to use them.
Could these models be useful for Haystack's handling of PDFs?
|
1.0
|
Incorporate LayoutLM for information extraction from PDFs - Hugging Face recently [added](https://twitter.com/huggingface/status/1432717993637818383) LayoutLMv2 and published a [repo](https://github.com/NielsRogge/Transformers-Tutorials/tree/master/LayoutLMv2) with several nice tutorials on how to use them.
Could these models be useful for Haystack's handling of PDFs?
|
process
|
incorporate layoutlm for information extraction from pdfs hugging face recently and published a with several nice tutorials on how to use them could these models be useful for haystack s handling of pdfs
| 1
|
261,913
| 19,750,880,849
|
IssuesEvent
|
2022-01-15 03:58:16
|
UBC-MDS/bc_covid_simple_eda
|
https://api.github.com/repos/UBC-MDS/bc_covid_simple_eda
|
closed
|
Edit README
|
documentation
|
- [x] Add summary
- [x] Add `Role within Python Ecosystem` research
- [x] Add Function summary
|
1.0
|
Edit README - - [x] Add summary
- [x] Add `Role within Python Ecosystem` research
- [x] Add Function summary
|
non_process
|
edit readme add summary add role within python ecosystem research add function summary
| 0
|
18,543
| 24,555,077,815
|
IssuesEvent
|
2022-10-12 15:16:33
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[IOS] [Standalone] Updated consent is not getting displayed in the following scenario
|
Bug P1 iOS Process: Fixed Process: Tested dev
|
**Description**
**Pre-condition:** Study should be created with Comprehension test questions in the Study builder and the study should be launched
**Steps:**
1. Sign up or sign in to the mobile app
2. Enroll to the study
3. Go to the study builder and update the consent for enrolled participants
4. Go to the mobile app, click on the review consent pop up
5. Navigate to the comprehension test screen
6. Fail the comprehension test
7. Click on the 'Cancel' button on the retry screen
8. Observe
**AR:** Updated review consent popup is not getting displayed
ER: Updated review consent popup should get displayed to the participant
[Note: If the participant refreshes the page then an updated review consent popup is getting displayed]
https://user-images.githubusercontent.com/71445210/188878223-f373d1d0-4efe-4c10-a3ff-e9cd51fc68fe.MOV
|
2.0
|
[IOS] [Standalone] Updated consent is not getting displayed in the following scenario - **Description**
**Pre-condition:** Study should be created with Comprehension test questions in the Study builder and the study should be launched
**Steps:**
1. Sign up or sign in to the mobile app
2. Enroll to the study
3. Go to the study builder and update the consent for enrolled participants
4. Go to the mobile app, click on the review consent pop up
5. Navigate to the comprehension test screen
6. Fail the comprehension test
7. Click on the 'Cancel' button on the retry screen
8. Observe
**AR:** Updated review consent popup is not getting displayed
ER: Updated review consent popup should get displayed to the participant
[Note: If the participant refreshes the page then an updated review consent popup is getting displayed]
https://user-images.githubusercontent.com/71445210/188878223-f373d1d0-4efe-4c10-a3ff-e9cd51fc68fe.MOV
|
process
|
updated consent is not getting displayed in the following scenario description pre condition study should be created with comprehension test questions in the study builder and the study should be launched steps sign up or sign in to the mobile app enroll to the study go to the study builder and update the consent for enrolled participants go to the mobile app click on the review consent pop up navigate to the comprehension test screen fail the comprehension test click on the cancel button on the retry screen observe ar updated review consent popup is not getting displayed er updated review consent popup should get displayed to the participant
| 1
|
418,321
| 28,114,390,839
|
IssuesEvent
|
2023-03-31 09:37:48
|
natashatanyt/ped
|
https://api.github.com/repos/natashatanyt/ped
|
opened
|
Suggestions for UG
|
severity.VeryLow type.DocumentationBug
|
I feel like more could be done to improve the UG.
For example, more images could be included, or extracts from running the example commands. ie. when you run `add 3 /of jackets` you will get xyz output.
Additionally, there could be a table of contents. Although the UG is short, it does help to give an overview of what the users can expect.
<!--session: 1680252445899-d38e1fb5-400a-435a-9da9-c8cbfa72ba3a-->
<!--Version: Web v3.4.7-->
|
1.0
|
Suggestions for UG - I feel like more could be done to improve the UG.
For example, more images could be included, or extracts from running the example commands. ie. when you run `add 3 /of jackets` you will get xyz output.
Additionally, there could be a table of contents. Although the UG is short, it does help to give an overview of what the users can expect.
<!--session: 1680252445899-d38e1fb5-400a-435a-9da9-c8cbfa72ba3a-->
<!--Version: Web v3.4.7-->
|
non_process
|
suggestions for ug i feel like more could be done to improve the ug for example more images could be included or extracts from running the example commands ie when you run add of jackets you will get xyz output additionally there could be a table of contents although the ug is short it does help to give an overview of what the users can expect
| 0
|
5,879
| 8,701,865,878
|
IssuesEvent
|
2018-12-05 12:52:52
|
dzhw/zofar
|
https://api.github.com/repos/dzhw/zofar
|
opened
|
information about massmailing used by FDZ
|
category: service.processes prio: ? status: discussion
|
FDZ uses a massmailing system for their newsletter - maybe an alternative for zofar ?
|
1.0
|
information about massmailing used by FDZ - FDZ uses a massmailing system for their newsletter - maybe an alternative for zofar ?
|
process
|
information about massmailing used by fdz fdz uses a massmailing system for their newsletter maybe an alternative for zofar
| 1
|
92,365
| 18,843,841,635
|
IssuesEvent
|
2021-11-11 12:48:40
|
appsmithorg/appsmith
|
https://api.github.com/repos/appsmithorg/appsmith
|
closed
|
[Feature]: Allow user to dynamically input spreadsheet url
|
Enhancement Actions Pod Google Sheets BE Coders Pod
|
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Summary
Option to add spreadsheet url dynamically. Ex - A user can enter the googlesheet URL via an input widget and then google API can show the data based on the input widget.
### Why should this be worked on?
Improves the GSheet integration. When there are multiple spreadsheets to connect, right now the user has to create a new API for each of them. This will be very good experience and user does not have to maintain so many API's
|
1.0
|
[Feature]: Allow user to dynamically input spreadsheet url - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Summary
Option to add spreadsheet url dynamically. Ex - A user can enter the googlesheet URL via an input widget and then google API can show the data based on the input widget.
### Why should this be worked on?
Improves the GSheet integration. When there are multiple spreadsheets to connect, right now the user has to create a new API for each of them. This will be very good experience and user does not have to maintain so many API's
|
non_process
|
allow user to dynamically input spreadsheet url is there an existing issue for this i have searched the existing issues summary option to add spreadsheet url dynamically ex a user can enter the googlesheet url via an input widget and then google api can show the data based on the input widget why should this be worked on improves the gsheet integration when there are multiple spreadsheets to connect right now the user has to create a new api for each of them this will be very good experience and user does not have to maintain so many api s
| 0
|
4,113
| 7,058,177,552
|
IssuesEvent
|
2018-01-04 19:18:08
|
log2timeline/plaso
|
https://api.github.com/repos/log2timeline/plaso
|
closed
|
OS detection failing on dean-mac test image
|
preprocessing
|
The Dean-mac test image is not being detecting as OSX correctly, and all parsers are being enabled.
|
1.0
|
OS detection failing on dean-mac test image - The Dean-mac test image is not being detecting as OSX correctly, and all parsers are being enabled.
|
process
|
os detection failing on dean mac test image the dean mac test image is not being detecting as osx correctly and all parsers are being enabled
| 1
|
650,161
| 21,336,847,801
|
IssuesEvent
|
2022-04-18 15:32:05
|
CCAFS/MARLO
|
https://api.github.com/repos/CCAFS/MARLO
|
closed
|
[KT] (AICCRA) Update AICCRA Roadmap
|
Priority - High Type -Task AICCRA
|
The AICCRA roadmap is located at:
https://docs.google.com/presentation/d/1HYk3q4wsAv8mg2U5XjPbZzpYpaOZ3IJbC548vAbve2Q/edit#slide=id.gc7a83534ac_6_178
- [x] Review previous items
- [x] Add new and missing tasks
**Deliverable:**
**Move to Review when:** The information is complete
**Move to Closed when:** Be reviewed by hector
|
1.0
|
[KT] (AICCRA) Update AICCRA Roadmap - The AICCRA roadmap is located at:
https://docs.google.com/presentation/d/1HYk3q4wsAv8mg2U5XjPbZzpYpaOZ3IJbC548vAbve2Q/edit#slide=id.gc7a83534ac_6_178
- [x] Review previous items
- [x] Add new and missing tasks
**Deliverable:**
**Move to Review when:** The information is complete
**Move to Closed when:** Be reviewed by hector
|
non_process
|
aiccra update aiccra roadmap the aiccra roadmap is located at review previous items add new and missing tasks deliverable move to review when the information is complete move to closed when be reviewed by hector
| 0
|
1,718
| 6,574,482,976
|
IssuesEvent
|
2017-09-11 13:03:30
|
ansible/ansible-modules-core
|
https://api.github.com/repos/ansible/ansible-modules-core
|
closed
|
pip module doesn't use proper pip executable from virtualenv
|
affects_2.2 bug_report waiting_on_maintainer
|
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
pip_module
##### ANSIBLE VERSION
```
ansible 2.2.0.0
config file = /code/ansible.cfg
configured module search path = Default w/o overrides
```
Could affect 2.1 also
##### OS / ENVIRONMENT
CentOS 7
##### SUMMARY
pip module uses `/bin/pip2` instead of the one from the virtualenv when virtualenv is specified and is newly created.
This result for example in that it unable to upgrade pip inside the virtualenv with the error
```
OSError: [Errno 13] Permission denied: '/usr/bin/pip'
```
because it tries to unlink the system pip.
##### STEPS TO REPRODUCE
* Use target host with the old pip installed.
* Try to upgrade pip to the latest state via pip module providing some new virtualenv path using unprivileged user having full access to the target virtualenv.
* Notice system pip executable unlink error.
```
- name: ensure pip is latest version in venv
pip:
name: pip
virtualenv: '{{ projects_path }}/venv'
state: latest
become_user: venvuser
```
##### EXPECTED RESULTS
pip module should use pip executable from the virtualenv provided via the module arg.
##### ACTUAL RESULTS
```
fatal: [hostname]: FAILED! => {
"changed": false,
"cmd": "/bin/pip2 install -U pip",
"failed": true,
"invocation": {
"module_args": {
"chdir": null,
"editable": true,
"executable": null,
"extra_args": null,
"name": [
"pip"
],
"requirements": null,
"state": "latest",
"umask": null,
"use_mirrors": true,
"version": null,
"virtualenv": "/srv/www/hostname/venv",
"virtualenv_command": "virtualenv",
"virtualenv_python": null,
"virtualenv_site_packages": false
},
"module_name": "pip"
},
"msg": "stdout: New python executable in /srv/www/hostname/venv/bin/python\nInstalling Setuptools..............................................................................................................................................................................................................................done.\nInstalling Pip.....................................................................................................................................................................................................................................................................................................................................done.\nCollecting pip\n Downloading pip-9.0.0-py2.py3-none-any.whl (1.3MB)\nInstalling collected packages: pip\n Found existing installation: pip 7.1.0\n Uninstalling pip-7.1.0:\n\n:stderr: You are using pip version 7.1.0, however version 9.0.0 is available.\nYou should consider upgrading via the 'pip install --upgrade pip' command.\nException:\nTraceback (most recent call last):\n File \"/usr/lib/python2.7/site-packages/pip/basecommand.py\", line 223, in main\n status = self.run(options, args)\n File \"/usr/lib/python2.7/site-packages/pip/commands/install.py\", line 308, in run\n strip_file_prefix=options.strip_file_prefix,\n File \"/usr/lib/python2.7/site-packages/pip/req/req_set.py\", line 640, in install\n requirement.uninstall(auto_confirm=True)\n File \"/usr/lib/python2.7/site-packages/pip/req/req_install.py\", line 726, in uninstall\n paths_to_remove.remove(auto_confirm)\n File \"/usr/lib/python2.7/site-packages/pip/req/req_uninstall.py\", line 125, in remove\n renames(path, new_path)\n File \"/usr/lib/python2.7/site-packages/pip/utils/__init__.py\", line 314, in renames\n shutil.move(old, new)\n File \"/usr/lib64/python2.7/shutil.py\", line 302, in move\n os.unlink(src)\nOSError: [Errno 13] Permission denied: '/usr/bin/pip'\n\n"
}
```
|
True
|
pip module doesn't use proper pip executable from virtualenv - ##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
pip_module
##### ANSIBLE VERSION
```
ansible 2.2.0.0
config file = /code/ansible.cfg
configured module search path = Default w/o overrides
```
Could affect 2.1 also
##### OS / ENVIRONMENT
CentOS 7
##### SUMMARY
pip module uses `/bin/pip2` instead of the one from the virtualenv when virtualenv is specified and is newly created.
This result for example in that it unable to upgrade pip inside the virtualenv with the error
```
OSError: [Errno 13] Permission denied: '/usr/bin/pip'
```
because it tries to unlink the system pip.
##### STEPS TO REPRODUCE
* Use target host with the old pip installed.
* Try to upgrade pip to the latest state via pip module providing some new virtualenv path using unprivileged user having full access to the target virtualenv.
* Notice system pip executable unlink error.
```
- name: ensure pip is latest version in venv
pip:
name: pip
virtualenv: '{{ projects_path }}/venv'
state: latest
become_user: venvuser
```
##### EXPECTED RESULTS
pip module should use pip executable from the virtualenv provided via the module arg.
##### ACTUAL RESULTS
```
fatal: [hostname]: FAILED! => {
"changed": false,
"cmd": "/bin/pip2 install -U pip",
"failed": true,
"invocation": {
"module_args": {
"chdir": null,
"editable": true,
"executable": null,
"extra_args": null,
"name": [
"pip"
],
"requirements": null,
"state": "latest",
"umask": null,
"use_mirrors": true,
"version": null,
"virtualenv": "/srv/www/hostname/venv",
"virtualenv_command": "virtualenv",
"virtualenv_python": null,
"virtualenv_site_packages": false
},
"module_name": "pip"
},
"msg": "stdout: New python executable in /srv/www/hostname/venv/bin/python\nInstalling Setuptools..............................................................................................................................................................................................................................done.\nInstalling Pip.....................................................................................................................................................................................................................................................................................................................................done.\nCollecting pip\n Downloading pip-9.0.0-py2.py3-none-any.whl (1.3MB)\nInstalling collected packages: pip\n Found existing installation: pip 7.1.0\n Uninstalling pip-7.1.0:\n\n:stderr: You are using pip version 7.1.0, however version 9.0.0 is available.\nYou should consider upgrading via the 'pip install --upgrade pip' command.\nException:\nTraceback (most recent call last):\n File \"/usr/lib/python2.7/site-packages/pip/basecommand.py\", line 223, in main\n status = self.run(options, args)\n File \"/usr/lib/python2.7/site-packages/pip/commands/install.py\", line 308, in run\n strip_file_prefix=options.strip_file_prefix,\n File \"/usr/lib/python2.7/site-packages/pip/req/req_set.py\", line 640, in install\n requirement.uninstall(auto_confirm=True)\n File \"/usr/lib/python2.7/site-packages/pip/req/req_install.py\", line 726, in uninstall\n paths_to_remove.remove(auto_confirm)\n File \"/usr/lib/python2.7/site-packages/pip/req/req_uninstall.py\", line 125, in remove\n renames(path, new_path)\n File \"/usr/lib/python2.7/site-packages/pip/utils/__init__.py\", line 314, in renames\n shutil.move(old, new)\n File \"/usr/lib64/python2.7/shutil.py\", line 302, in move\n os.unlink(src)\nOSError: [Errno 13] Permission denied: '/usr/bin/pip'\n\n"
}
```
|
non_process
|
pip module doesn t use proper pip executable from virtualenv issue type bug report component name pip module ansible version ansible config file code ansible cfg configured module search path default w o overrides could affect also os environment centos summary pip module uses bin instead of the one from the virtualenv when virtualenv is specified and is newly created this result for example in that it unable to upgrade pip inside the virtualenv with the error oserror permission denied usr bin pip because it tries to unlink the system pip steps to reproduce use target host with the old pip installed try to upgrade pip to the latest state via pip module providing some new virtualenv path using unprivileged user having full access to the target virtualenv notice system pip executable unlink error name ensure pip is latest version in venv pip name pip virtualenv projects path venv state latest become user venvuser expected results pip module should use pip executable from the virtualenv provided via the module arg actual results fatal failed changed false cmd bin install u pip failed true invocation module args chdir null editable true executable null extra args null name pip requirements null state latest umask null use mirrors true version null virtualenv srv www hostname venv virtualenv command virtualenv virtualenv python null virtualenv site packages false module name pip msg stdout new python executable in srv www hostname venv bin python ninstalling setuptools done ninstalling pip done ncollecting pip n downloading pip none any whl ninstalling collected packages pip n found existing installation pip n uninstalling pip n n stderr you are using pip version however version is available nyou should consider upgrading via the pip install upgrade pip command nexception ntraceback most recent call last n file usr lib site packages pip basecommand py line in main n status self run options args n file usr lib site packages pip commands install py line in run n strip file prefix options strip file prefix n file usr lib site packages pip req req set py line in install n requirement uninstall auto confirm true n file usr lib site packages pip req req install py line in uninstall n paths to remove remove auto confirm n file usr lib site packages pip req req uninstall py line in remove n renames path new path n file usr lib site packages pip utils init py line in renames n shutil move old new n file usr shutil py line in move n os unlink src noserror permission denied usr bin pip n n
| 0
|
6,405
| 9,487,304,253
|
IssuesEvent
|
2019-04-22 16:28:20
|
allinurl/goaccess
|
https://api.github.com/repos/allinurl/goaccess
|
closed
|
Goaccess takes long time to quit
|
command-line options log-processing
|
There is a considerable time delay before goaccess closes once I send SIGINT to it. Is there any way to skip whatever it is doing throughout that time (other than killing it)? I need to start a new instance on the same port as soon as possible. Right now I use a slight time delay, around 2 minutes just to be sure.
I tried using --keep-db-files and it still takes long. I need it to reset at midnight to show statistics for the current day or, better, display only the last 24 hours. None of these two things seem to be possible currently with realtime html.
|
1.0
|
Goaccess takes long time to quit - There is a considerable time delay before goaccess closes once I send SIGINT to it. Is there any way to skip whatever it is doing throughout that time (other than killing it)? I need to start a new instance on the same port as soon as possible. Right now I use a slight time delay, around 2 minutes just to be sure.
I tried using --keep-db-files and it still takes long. I need it to reset at midnight to show statistics for the current day or, better, display only the last 24 hours. None of these two things seem to be possible currently with realtime html.
|
process
|
goaccess takes long time to quit there is a considerable time delay before goaccess closes once i send sigint to it is there any way to skip whatever it is doing throughout that time other than killing it i need to start a new instance on the same port as soon as possible right now i use a slight time delay around minutes just to be sure i tried using keep db files and it still takes long i need it to reset at midnight to show statistics for the current day or better display only the last hours none of these two things seem to be possible currently with realtime html
| 1
|
7,091
| 10,238,826,654
|
IssuesEvent
|
2019-08-19 16:45:09
|
cncf/sig-security
|
https://api.github.com/repos/cncf/sig-security
|
closed
|
security reviewers must not have a conflict of interest
|
assessment-process help wanted
|
We are practicing this, but we need some language that describes exactly what we believe represents a conflict of interest.
For starters:
- no one who is on the core team of the project should be a security reviewer
- if someone is a user of the project or has contributed a PR, that is fine (and would be positive attribute for a reviewer)
|
1.0
|
security reviewers must not have a conflict of interest - We are practicing this, but we need some language that describes exactly what we believe represents a conflict of interest.
For starters:
- no one who is on the core team of the project should be a security reviewer
- if someone is a user of the project or has contributed a PR, that is fine (and would be positive attribute for a reviewer)
|
process
|
security reviewers must not have a conflict of interest we are practicing this but we need some language that describes exactly what we believe represents a conflict of interest for starters no one who is on the core team of the project should be a security reviewer if someone is a user of the project or has contributed a pr that is fine and would be positive attribute for a reviewer
| 1
|
369,808
| 25,869,076,913
|
IssuesEvent
|
2022-12-14 00:13:38
|
strapi/strapi
|
https://api.github.com/repos/strapi/strapi
|
closed
|
Swagger plugin issue on GCP - Documentation
|
source: plugin:documentation status: pending reproduction
|
## Bug report
### Describe the bug
When accessing the `/documentation` page from the Strapi admin panel on a GCP App Engine (Standard) instance, I am seeing the below errors in GCP logs and Page not found with a 404 error.
`error Error: EROFS: read-only file system, open '/workspace/extensions/documentation/public/index.html'`
The documentation plugin was installed in my LOCAL development environment and docs compiled before deploying to GCP App Engine. Locally it works fine.
### Steps to reproduce the behavior
1. launch your Strapi project
2. active the Swagger documentation plugin
3. request the documentation locally (it will work)
4. upload the Strapi project to Google App Engine (gcloud command or whatever)
5. visit the documentation ( https://YourGoogleProject.appspot.com/documentation/v1.0.0 )
### System
- Node.js version: v14.17.6
- NPM version: 8.1.2
- Strapi version: v3.6.8
- Database: MongoDB
### Additional context
Other users are facing the same problem: https://issueexplorer.com/issue/strapi/documentation/377
|
1.0
|
Swagger plugin issue on GCP - Documentation - ## Bug report
### Describe the bug
When accessing the `/documentation` page from the Strapi admin panel on a GCP App Engine (Standard) instance, I am seeing the below errors in GCP logs and Page not found with a 404 error.
`error Error: EROFS: read-only file system, open '/workspace/extensions/documentation/public/index.html'`
The documentation plugin was installed in my LOCAL development environment and docs compiled before deploying to GCP App Engine. Locally it works fine.
### Steps to reproduce the behavior
1. launch your Strapi project
2. active the Swagger documentation plugin
3. request the documentation locally (it will work)
4. upload the Strapi project to Google App Engine (gcloud command or whatever)
5. visit the documentation ( https://YourGoogleProject.appspot.com/documentation/v1.0.0 )
### System
- Node.js version: v14.17.6
- NPM version: 8.1.2
- Strapi version: v3.6.8
- Database: MongoDB
### Additional context
Other users are facing the same problem: https://issueexplorer.com/issue/strapi/documentation/377
|
non_process
|
swagger plugin issue on gcp documentation bug report describe the bug when accessing the documentation page from the strapi admin panel on a gcp app engine standard instance i am seeing the below errors in gcp logs and page not found with a error error error erofs read only file system open workspace extensions documentation public index html the documentation plugin was installed in my local development environment and docs compiled before deploying to gcp app engine locally it works fine steps to reproduce the behavior launch your strapi project active the swagger documentation plugin request the documentation locally it will work upload the strapi project to google app engine gcloud command or whatever visit the documentation system node js version npm version strapi version database mongodb additional context other users are facing the same problem
| 0
|
648,339
| 21,183,343,638
|
IssuesEvent
|
2022-04-08 10:06:20
|
open62541/open62541
|
https://api.github.com/repos/open62541/open62541
|
closed
|
Error when building package that uses open62541
|
Priority: Low Status: Pending Component: Server
|
## Description
I have written my OPC UA client (opcua_bridge) using open62541. However, when I try and build it (using colcon build) I get the following error:
```
/usr/bin/ld: cannot find -lmbedtls
/usr/bin/ld: cannot find -lmbedx509
/usr/bin/ld: cannot find -lmbedcrypto
collect2: error: ld returned 1 exit status
make[2]: *** [CMakeFiles/opcua_bridge.dir/build.make:152: opcua_bridge] Error 1
make[1]: *** [CMakeFiles/Makefile2:78: CMakeFiles/opcua_bridge.dir/all] Error 2
make: *** [Makefile:141: all] Error 2
```
## Background Information / Reproduction Steps
Operating system: Ubuntu 20.04
Installed using the [debian](https://open62541.org/doc/current/installing.html#debian)
## Fix
I already found how to fix it, I need to run the following:
`sudo apt install libmbedtls-dev`
However, it's not mentioned on the installation page. I also feel like it should be something that's installed automatically when you install open62541.
If you have any other questions, please let me know. I'll be happy to provide more information
|
1.0
|
Error when building package that uses open62541 - ## Description
I have written my OPC UA client (opcua_bridge) using open62541. However, when I try and build it (using colcon build) I get the following error:
```
/usr/bin/ld: cannot find -lmbedtls
/usr/bin/ld: cannot find -lmbedx509
/usr/bin/ld: cannot find -lmbedcrypto
collect2: error: ld returned 1 exit status
make[2]: *** [CMakeFiles/opcua_bridge.dir/build.make:152: opcua_bridge] Error 1
make[1]: *** [CMakeFiles/Makefile2:78: CMakeFiles/opcua_bridge.dir/all] Error 2
make: *** [Makefile:141: all] Error 2
```
## Background Information / Reproduction Steps
Operating system: Ubuntu 20.04
Installed using the [debian](https://open62541.org/doc/current/installing.html#debian)
## Fix
I already found how to fix it, I need to run the following:
`sudo apt install libmbedtls-dev`
However, it's not mentioned on the installation page. I also feel like it should be something that's installed automatically when you install open62541.
If you have any other questions, please let me know. I'll be happy to provide more information
|
non_process
|
error when building package that uses description i have written my opc ua client opcua bridge using however when i try and build it using colcon build i get the following error usr bin ld cannot find lmbedtls usr bin ld cannot find usr bin ld cannot find lmbedcrypto error ld returned exit status make error make error make error background information reproduction steps operating system ubuntu installed using the fix i already found how to fix it i need to run the following sudo apt install libmbedtls dev however it s not mentioned on the installation page i also feel like it should be something that s installed automatically when you install if you have any other questions please let me know i ll be happy to provide more information
| 0
|
7,114
| 10,266,108,356
|
IssuesEvent
|
2019-08-22 20:34:05
|
automotive-edge-computing-consortium/AECC
|
https://api.github.com/repos/automotive-edge-computing-consortium/AECC
|
opened
|
Missing technical item capture process
|
priority:High status:Open type:Process
|
Need a formal method for capturing missing technical items. Need to review and prioritize items. Need to review periodically. Need to be prepared to close/archive issues
|
1.0
|
Missing technical item capture process - Need a formal method for capturing missing technical items. Need to review and prioritize items. Need to review periodically. Need to be prepared to close/archive issues
|
process
|
missing technical item capture process need a formal method for capturing missing technical items need to review and prioritize items need to review periodically need to be prepared to close archive issues
| 1
|
91,144
| 26,282,925,428
|
IssuesEvent
|
2023-01-07 14:06:07
|
skypjack/uvw
|
https://api.github.com/repos/skypjack/uvw
|
opened
|
Failing workflows as github changed several compilers from ubuntu machines.
|
build system
|
I'm working on a PR and it's impossible to rely on the CI as several are failing on build tools installation.
Reference: https://github.com/actions/runner-images/issues/3235
|
1.0
|
Failing workflows as github changed several compilers from ubuntu machines. - I'm working on a PR and it's impossible to rely on the CI as several are failing on build tools installation.
Reference: https://github.com/actions/runner-images/issues/3235
|
non_process
|
failing workflows as github changed several compilers from ubuntu machines i m working on a pr and it s impossible to rely on the ci as several are failing on build tools installation reference
| 0
|
15,590
| 19,715,423,597
|
IssuesEvent
|
2022-01-13 10:30:14
|
chef/chef-oss-practices
|
https://api.github.com/repos/chef/chef-oss-practices
|
closed
|
We should migrate and default to the main branch
|
Development Process
|
We're currently using the master branch everywhere
GitHub now defaults to main everywhere, we should follow this standard where possible
|
1.0
|
We should migrate and default to the main branch - We're currently using the master branch everywhere
GitHub now defaults to main everywhere, we should follow this standard where possible
|
process
|
we should migrate and default to the main branch we re currently using the master branch everywhere github now defaults to main everywhere we should follow this standard where possible
| 1
|
30,948
| 6,370,257,810
|
IssuesEvent
|
2017-08-01 13:50:36
|
PowerDNS/pdns
|
https://api.github.com/repos/PowerDNS/pdns
|
closed
|
dnsdist: add setStaleCacheEntriesTTL to console autocomplete
|
defect dnsdist
|
<!-- Tell us what is issue is about -->
- Program: dnsdist <!-- delete the ones that do not apply -->
- Issue type: Bug report <!-- delete the one that does not apply -->
### Short description
setStaleCacheEntriesTTL is not available in dnsdist console autocompletion as discussed in IRC some time ago. Turns out to be a simple oversight. Please add :-)
|
1.0
|
dnsdist: add setStaleCacheEntriesTTL to console autocomplete - <!-- Tell us what is issue is about -->
- Program: dnsdist <!-- delete the ones that do not apply -->
- Issue type: Bug report <!-- delete the one that does not apply -->
### Short description
setStaleCacheEntriesTTL is not available in dnsdist console autocompletion as discussed in IRC some time ago. Turns out to be a simple oversight. Please add :-)
|
non_process
|
dnsdist add setstalecacheentriesttl to console autocomplete program dnsdist issue type bug report short description setstalecacheentriesttl is not available in dnsdist console autocompletion as discussed in irc some time ago turns out to be a simple oversight please add
| 0
|
5,037
| 7,853,646,247
|
IssuesEvent
|
2018-06-20 18:06:14
|
aspnet/IISIntegration
|
https://api.github.com/repos/aspnet/IISIntegration
|
closed
|
Profile ANCM for performance
|
Task cost: M in-process invalid
|
I have been prioritizing correctness over performance for the implementation so far. One we get the in-process mode to a more correct state, we need to do a sweep to reduce allocations. @davidfowl
|
1.0
|
Profile ANCM for performance - I have been prioritizing correctness over performance for the implementation so far. One we get the in-process mode to a more correct state, we need to do a sweep to reduce allocations. @davidfowl
|
process
|
profile ancm for performance i have been prioritizing correctness over performance for the implementation so far one we get the in process mode to a more correct state we need to do a sweep to reduce allocations davidfowl
| 1
|
20,426
| 27,089,518,549
|
IssuesEvent
|
2023-02-14 19:45:42
|
googleapis/gapic-generator-java
|
https://api.github.com/repos/googleapis/gapic-generator-java
|
opened
|
Add configuration validation for renovate.json
|
type: process priority: p3
|
Look into a way to add check to verify `renovate.json` in presubmit, so things like #1349's missed comma can be caught ahead of renovate bot [issue](https://github.com/googleapis/gapic-generator-java/issues/1352).
|
1.0
|
Add configuration validation for renovate.json - Look into a way to add check to verify `renovate.json` in presubmit, so things like #1349's missed comma can be caught ahead of renovate bot [issue](https://github.com/googleapis/gapic-generator-java/issues/1352).
|
process
|
add configuration validation for renovate json look into a way to add check to verify renovate json in presubmit so things like s missed comma can be caught ahead of renovate bot
| 1
|
51,820
| 10,729,726,924
|
IssuesEvent
|
2019-10-28 16:05:05
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
closed
|
Failed Tests: X-Pack Jest Tests.x-pack/plugins/code/server/lsp.passive launcher can start and end a process
|
Team:Code failed-test
|
```
Stacktrace
Error: expect(received).toBe(expected) // Object.is equality
Expected: "process started"
Received: "socket connected"
at toBe (/var/lib/jenkins/workspace/elastic+kibana+pull-request/JOB/x-pack-intake/node/immutable/kibana/x-pack/plugins/code/server/lsp/abstract_launcher.test.ts:196:42)
at testFn (/var/lib/jenkins/workspace/elastic+kibana+pull-request/JOB/x-pack-intake/node/immutable/kibana/x-pack/plugins/code/server/lsp/abstract_launcher.test.ts:132:5)
at retryUtil (/var/lib/jenkins/workspace/elastic+kibana+pull-request/JOB/x-pack-intake/node/immutable/kibana/x-pack/plugins/code/server/lsp/abstract_launcher.test.ts:136:13)
```
Recent failure: https://kibana-ci.elastic.co/job/elastic+kibana+pull-request/1530/JOB=x-pack-intake,node=immutable/testReport/junit/X-Pack%20Jest%20Tests/x-pack_plugins_code_server_lsp/passive_launcher_can_start_and_end_a_process/
|
1.0
|
Failed Tests: X-Pack Jest Tests.x-pack/plugins/code/server/lsp.passive launcher can start and end a process - ```
Stacktrace
Error: expect(received).toBe(expected) // Object.is equality
Expected: "process started"
Received: "socket connected"
at toBe (/var/lib/jenkins/workspace/elastic+kibana+pull-request/JOB/x-pack-intake/node/immutable/kibana/x-pack/plugins/code/server/lsp/abstract_launcher.test.ts:196:42)
at testFn (/var/lib/jenkins/workspace/elastic+kibana+pull-request/JOB/x-pack-intake/node/immutable/kibana/x-pack/plugins/code/server/lsp/abstract_launcher.test.ts:132:5)
at retryUtil (/var/lib/jenkins/workspace/elastic+kibana+pull-request/JOB/x-pack-intake/node/immutable/kibana/x-pack/plugins/code/server/lsp/abstract_launcher.test.ts:136:13)
```
Recent failure: https://kibana-ci.elastic.co/job/elastic+kibana+pull-request/1530/JOB=x-pack-intake,node=immutable/testReport/junit/X-Pack%20Jest%20Tests/x-pack_plugins_code_server_lsp/passive_launcher_can_start_and_end_a_process/
|
non_process
|
failed tests x pack jest tests x pack plugins code server lsp passive launcher can start and end a process stacktrace error expect received tobe expected object is equality expected process started received socket connected at tobe var lib jenkins workspace elastic kibana pull request job x pack intake node immutable kibana x pack plugins code server lsp abstract launcher test ts at testfn var lib jenkins workspace elastic kibana pull request job x pack intake node immutable kibana x pack plugins code server lsp abstract launcher test ts at retryutil var lib jenkins workspace elastic kibana pull request job x pack intake node immutable kibana x pack plugins code server lsp abstract launcher test ts recent failure
| 0
|
68,701
| 29,482,265,529
|
IssuesEvent
|
2023-06-02 06:58:12
|
hashicorp/terraform-provider-azurerm
|
https://api.github.com/repos/hashicorp/terraform-provider-azurerm
|
closed
|
Cannot disable all storage account bypass exceptions
|
service/storage
|
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Community Note
<!--- Please keep this note for the community --->
* Please vote on this issue by adding a :thumbsup: [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Terraform Version
1.4.6
### AzureRM Provider Version
3.58.0
### Affected Resource(s)/Data Source(s)
azurerm_storage_account
### Terraform Configuration Files
```hcl
resource "azurerm_storage_account" "this" {
name = var.name
resource_group_name = var.resource_group_name
location = var.location
account_kind = var.kind
account_tier = var.tier
account_replication_type = var.replication_type
is_hns_enabled = var.is_hns_enabled
tags = local.data_tags
allow_nested_items_to_be_public = false
identity {
type = "SystemAssigned"
}
network_rules {
default_action = var.network_rules_default_action
bypass = var.network_rules_bypass
ip_rules = []
virtual_network_subnet_ids = []
}
}
```
### Debug Output/Panic Output
```shell
not applicable
```
### Expected Behaviour
Storage account created with no exceptions
### Actual Behaviour
Storage account created with an Exception for Azure services
### Steps to Reproduce
_No response_
### Important Factoids
_No response_
### References
_No response_
|
1.0
|
Cannot disable all storage account bypass exceptions - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Community Note
<!--- Please keep this note for the community --->
* Please vote on this issue by adding a :thumbsup: [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Terraform Version
1.4.6
### AzureRM Provider Version
3.58.0
### Affected Resource(s)/Data Source(s)
azurerm_storage_account
### Terraform Configuration Files
```hcl
resource "azurerm_storage_account" "this" {
name = var.name
resource_group_name = var.resource_group_name
location = var.location
account_kind = var.kind
account_tier = var.tier
account_replication_type = var.replication_type
is_hns_enabled = var.is_hns_enabled
tags = local.data_tags
allow_nested_items_to_be_public = false
identity {
type = "SystemAssigned"
}
network_rules {
default_action = var.network_rules_default_action
bypass = var.network_rules_bypass
ip_rules = []
virtual_network_subnet_ids = []
}
}
```
### Debug Output/Panic Output
```shell
not applicable
```
### Expected Behaviour
Storage account created with no exceptions
### Actual Behaviour
Storage account created with an Exception for Azure services
### Steps to Reproduce
_No response_
### Important Factoids
_No response_
### References
_No response_
|
non_process
|
cannot disable all storage account bypass exceptions is there an existing issue for this i have searched the existing issues community note please vote on this issue by adding a thumbsup to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment terraform version azurerm provider version affected resource s data source s azurerm storage account terraform configuration files hcl resource azurerm storage account this name var name resource group name var resource group name location var location account kind var kind account tier var tier account replication type var replication type is hns enabled var is hns enabled tags local data tags allow nested items to be public false identity type systemassigned network rules default action var network rules default action bypass var network rules bypass ip rules virtual network subnet ids debug output panic output shell not applicable expected behaviour storage account created with no exceptions actual behaviour storage account created with an exception for azure services steps to reproduce no response important factoids no response references no response
| 0
|
21,334
| 29,041,383,781
|
IssuesEvent
|
2023-05-13 02:00:07
|
lizhihao6/get-daily-arxiv-noti
|
https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
|
opened
|
New submissions for Fri, 12 May 23
|
event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB
|
## Keyword: events
### HyperE2VID: Improving Event-Based Video Reconstruction via Hypernetworks
- **Authors:** Burak Ercan, Onur Eker, Canberk Saglam, Aykut Erdem, Erkut Erdem
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2305.06382
- **Pdf link:** https://arxiv.org/pdf/2305.06382
- **Abstract**
Event-based cameras are becoming increasingly popular for their ability to capture high-speed motion with low latency and high dynamic range. However, generating videos from events remains challenging due to the highly sparse and varying nature of event data. To address this, in this study, we propose HyperE2VID, a dynamic neural network architecture for event-based video reconstruction. Our approach uses hypernetworks and dynamic convolutions to generate per-pixel adaptive filters guided by a context fusion module that combines information from event voxel grids and previously reconstructed intensity images. We also employ a curriculum learning strategy to train the network more robustly. Experimental results demonstrate that HyperE2VID achieves better reconstruction quality with fewer parameters and faster inference time than the state-of-the-art methods.
### WeditGAN: Few-shot Image Generation via Latent Space Relocation
- **Authors:** Yuxuan Duan, Li Niu, Yan Hong, Liqing Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2305.06671
- **Pdf link:** https://arxiv.org/pdf/2305.06671
- **Abstract**
In few-shot image generation, directly training GAN models on just a handful of images faces the risk of overfitting. A popular solution is to transfer the models pretrained on large source domains to small target ones. In this work, we introduce WeditGAN, which realizes model transfer by editing the intermediate latent codes $w$ in StyleGANs with learned constant offsets ($\Delta w$), discovering and constructing target latent spaces via simply relocating the distribution of source latent spaces. The established one-to-one mapping between latent spaces can naturally prevents mode collapse and overfitting. Besides, we also propose variants of WeditGAN to further enhance the relocation process by regularizing the direction or finetuning the intensity of $\Delta w$. Experiments on a collection of widely used source/target datasets manifest the capability of WeditGAN in generating realistic and diverse images, which is simple yet highly effective in the research area of few-shot image generation.
## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
There is no result
## Keyword: ISP
### Towards L-System Captioning for Tree Reconstruction
- **Authors:** Jannes S. Magnusson, Anna Hilsmann, Peter Eisert
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2305.06483
- **Pdf link:** https://arxiv.org/pdf/2305.06483
- **Abstract**
This work proposes a novel concept for tree and plant reconstruction by directly inferring a Lindenmayer-System (L-System) word representation from image data in an image captioning approach. We train a model end-to-end which is able to translate given images into L-System words as a description of the displayed tree. To prove this concept, we demonstrate the applicability on 2D tree topologies. Transferred to real image data, this novel idea could lead to more efficient, accurate and semantically meaningful tree and plant reconstruction without using error-prone point cloud extraction, and other processes usually utilized in tree reconstruction. Furthermore, this approach bypasses the need for a predefined L-System grammar and enables species-specific L-System inference without biological knowledge.
### Emotion Recognition for Challenged People Facial Appearance in Social using Neural Network
- **Authors:** P. Deivendran, P. Suresh Babu, G. Malathi, K. Anbazhagan, R. Senthil Kumar
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2305.06842
- **Pdf link:** https://arxiv.org/pdf/2305.06842
- **Abstract**
Human communication is the vocal and non verbal signal to communicate with others. Human expression is a significant biometric object in picture and record databases of surveillance systems. Face appreciation has a serious role in biometric methods and is good-looking for plentiful applications, including visual scrutiny and security. Facial expressions are a form of nonverbal communication; recognizing them helps improve the human machine interaction. This paper proposes an idea for face and enlightenment invariant credit of facial expressions by the images. In order on, the person's face can be computed. Face expression is used in CNN classifier to categorize the acquired picture into different emotion categories. It is a deep, feed-forward artificial neural network. Outcome surpasses human presentation and shows poses alternate performance. Varying lighting conditions can influence the fitting process and reduce recognition precision. Results illustrate that dependable facial appearance credited with changing lighting conditions for separating reasonable facial terminology display emotions is an efficient representation of clean and assorted moving expressions. This process can also manage the proportions of dissimilar basic affecting expressions of those mixed jointly to produce sensible emotional facial expressions. Our system contains a pre-defined data set, which was residential by a statistics scientist and includes all pure and varied expressions. On average, a data set has achieved 92.4% exact validation of the expressions synthesized by our technique. These facial expressions are compared through the pre-defined data-position inside our system. If it recognizes the person in an abnormal condition, an alert will be passed to the nearby hospital/doctor seeing that a message.
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
### Treasure What You Have: Exploiting Similarity in Deep Neural Networks for Efficient Video Processing
- **Authors:** Hadjer Benmeziane, Halima Bouzidi, Hamza Ouarnoughi, Ozcan Ozturk, Smail Niar
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2305.06492
- **Pdf link:** https://arxiv.org/pdf/2305.06492
- **Abstract**
Deep learning has enabled various Internet of Things (IoT) applications. Still, designing models with high accuracy and computational efficiency remains a significant challenge, especially in real-time video processing applications. Such applications exhibit high inter- and intra-frame redundancy, allowing further improvement. This paper proposes a similarity-aware training methodology that exploits data redundancy in video frames for efficient processing. Our approach introduces a per-layer regularization that enhances computation reuse by increasing the similarity of weights during training. We validate our methodology on two critical real-time applications, lane detection and scene parsing. We observe an average compression ratio of approximately 50% and a speedup of \sim 1.5x for different models while maintaining the same accuracy.
### Exploiting Fine-Grained DCT Representations for Hiding Image-Level Messages within JPEG Images
- **Authors:** Junxue Yang, Xin Liao
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2305.06582
- **Pdf link:** https://arxiv.org/pdf/2305.06582
- **Abstract**
Unlike hiding bit-level messages, hiding image-level messages is more challenging, which requires large capacity, high imperceptibility, and high security. Although recent advances in hiding image-level messages have been remarkable, existing schemes are limited to lossless spatial images as covers and cannot be directly applied to JPEG images, the ubiquitous lossy format images in daily life. The difficulties of migration are caused by the lack of targeted design and the loss of details due to lossy decompression and re-compression. Considering that taking DCT densely on $8\times8$ image patches is the core of the JPEG compression standard, we design a novel model called \textsf{EFDR}, which can comprehensively \underline{E}xploit \underline{F}ine-grained \underline{D}CT \underline{R}epresentations and embed the secret image into quantized DCT coefficients to avoid the lossy process. Specifically, we transform the JPEG cover image and hidden secret image into fine-grained DCT representations that compact the frequency and are associated with the inter-block and intra-block correlations. Subsequently, the fine-grained DCT representations are further enhanced by a sub-band features enhancement module. Afterward, a transformer-based invertibility module is designed to fuse enhanced sub-band features. Such a design enables a fine-grained self-attention on each sub-band and captures long-range dependencies while maintaining excellent reversibility for hiding and recovery. To our best knowledge, this is the first attempt to embed a color image of equal size in a color JPEG image. Extensive experiments demonstrate the effectiveness of our \textsf{EFDR} with superior performance.
## Keyword: RAW
### Combo of Thinking and Observing for Outside-Knowledge VQA
- **Authors:** Qingyi Si, Yuchen Mo, Zheng Lin, Huishan Ji, Weiping Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2305.06407
- **Pdf link:** https://arxiv.org/pdf/2305.06407
- **Abstract**
Outside-knowledge visual question answering is a challenging task that requires both the acquisition and the use of open-ended real-world knowledge. Some existing solutions draw external knowledge into the cross-modality space which overlooks the much vaster textual knowledge in natural-language space, while others transform the image into a text that further fuses with the textual knowledge into the natural-language space and completely abandons the use of visual features. In this paper, we are inspired to constrain the cross-modality space into the same space of natural-language space which makes the visual features preserved directly, and the model still benefits from the vast knowledge in natural-language space. To this end, we propose a novel framework consisting of a multimodal encoder, a textual encoder and an answer decoder. Such structure allows us to introduce more types of knowledge including explicit and implicit multimodal and textual knowledge. Extensive experiments validate the superiority of the proposed method which outperforms the state-of-the-art by 6.17% accuracy. We also conduct comprehensive ablations of each component, and systematically study the roles of varying types of knowledge. Codes and knowledge data can be found at https://github.com/PhoebusSi/Thinking-while-Observing.
### DeepSTEP -- Deep Learning-Based Spatio-Temporal End-To-End Perception for Autonomous Vehicles
- **Authors:** Sebastian Huch, Florian Sauerbeck, Johannes Betz
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2305.06820
- **Pdf link:** https://arxiv.org/pdf/2305.06820
- **Abstract**
Autonomous vehicles demand high accuracy and robustness of perception algorithms. To develop efficient and scalable perception algorithms, the maximum information should be extracted from the available sensor data. In this work, we present our concept for an end-to-end perception architecture, named DeepSTEP. The deep learning-based architecture processes raw sensor data from the camera, LiDAR, and RaDAR, and combines the extracted data in a deep fusion network. The output of this deep fusion network is a shared feature space, which is used by perception head networks to fulfill several perception tasks, such as object detection or local mapping. DeepSTEP incorporates multiple ideas to advance state of the art: First, combining detection and localization into a single pipeline allows for efficient processing to reduce computational overhead and further improves overall performance. Second, the architecture leverages the temporal domain by using a self-attention mechanism that focuses on the most important features. We believe that our concept of DeepSTEP will advance the development of end-to-end perception systems. The network will be deployed on our research vehicle, which will be used as a platform for data collection, real-world testing, and validation. In conclusion, DeepSTEP represents a significant advancement in the field of perception for autonomous vehicles. The architecture's end-to-end design, time-aware attention mechanism, and integration of multiple perception tasks make it a promising solution for real-world deployment. This research is a work in progress and presents the first concept of establishing a novel perception pipeline.
## Keyword: raw image
There is no result
|
2.0
|
New submissions for Fri, 12 May 23 - ## Keyword: events
### HyperE2VID: Improving Event-Based Video Reconstruction via Hypernetworks
- **Authors:** Burak Ercan, Onur Eker, Canberk Saglam, Aykut Erdem, Erkut Erdem
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2305.06382
- **Pdf link:** https://arxiv.org/pdf/2305.06382
- **Abstract**
Event-based cameras are becoming increasingly popular for their ability to capture high-speed motion with low latency and high dynamic range. However, generating videos from events remains challenging due to the highly sparse and varying nature of event data. To address this, in this study, we propose HyperE2VID, a dynamic neural network architecture for event-based video reconstruction. Our approach uses hypernetworks and dynamic convolutions to generate per-pixel adaptive filters guided by a context fusion module that combines information from event voxel grids and previously reconstructed intensity images. We also employ a curriculum learning strategy to train the network more robustly. Experimental results demonstrate that HyperE2VID achieves better reconstruction quality with fewer parameters and faster inference time than the state-of-the-art methods.
### WeditGAN: Few-shot Image Generation via Latent Space Relocation
- **Authors:** Yuxuan Duan, Li Niu, Yan Hong, Liqing Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2305.06671
- **Pdf link:** https://arxiv.org/pdf/2305.06671
- **Abstract**
In few-shot image generation, directly training GAN models on just a handful of images faces the risk of overfitting. A popular solution is to transfer the models pretrained on large source domains to small target ones. In this work, we introduce WeditGAN, which realizes model transfer by editing the intermediate latent codes $w$ in StyleGANs with learned constant offsets ($\Delta w$), discovering and constructing target latent spaces via simply relocating the distribution of source latent spaces. The established one-to-one mapping between latent spaces can naturally prevent mode collapse and overfitting. Besides, we also propose variants of WeditGAN to further enhance the relocation process by regularizing the direction or finetuning the intensity of $\Delta w$. Experiments on a collection of widely used source/target datasets manifest the capability of WeditGAN in generating realistic and diverse images, which is simple yet highly effective in the research area of few-shot image generation.
## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
There is no result
## Keyword: ISP
### Towards L-System Captioning for Tree Reconstruction
- **Authors:** Jannes S. Magnusson, Anna Hilsmann, Peter Eisert
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2305.06483
- **Pdf link:** https://arxiv.org/pdf/2305.06483
- **Abstract**
This work proposes a novel concept for tree and plant reconstruction by directly inferring a Lindenmayer-System (L-System) word representation from image data in an image captioning approach. We train a model end-to-end which is able to translate given images into L-System words as a description of the displayed tree. To prove this concept, we demonstrate the applicability on 2D tree topologies. Transferred to real image data, this novel idea could lead to more efficient, accurate and semantically meaningful tree and plant reconstruction without using error-prone point cloud extraction, and other processes usually utilized in tree reconstruction. Furthermore, this approach bypasses the need for a predefined L-System grammar and enables species-specific L-System inference without biological knowledge.
### Emotion Recognition for Challenged People Facial Appearance in Social using Neural Network
- **Authors:** P. Deivendran, P. Suresh Babu, G. Malathi, K. Anbazhagan, R. Senthil Kumar
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2305.06842
- **Pdf link:** https://arxiv.org/pdf/2305.06842
- **Abstract**
Human communication is the vocal and non verbal signal to communicate with others. Human expression is a significant biometric object in picture and record databases of surveillance systems. Face appreciation has a serious role in biometric methods and is good-looking for plentiful applications, including visual scrutiny and security. Facial expressions are a form of nonverbal communication; recognizing them helps improve the human machine interaction. This paper proposes an idea for face and enlightenment invariant credit of facial expressions by the images. In order on, the person's face can be computed. Face expression is used in CNN classifier to categorize the acquired picture into different emotion categories. It is a deep, feed-forward artificial neural network. Outcome surpasses human presentation and shows poses alternate performance. Varying lighting conditions can influence the fitting process and reduce recognition precision. Results illustrate that dependable facial appearance credited with changing lighting conditions for separating reasonable facial terminology display emotions is an efficient representation of clean and assorted moving expressions. This process can also manage the proportions of dissimilar basic affecting expressions of those mixed jointly to produce sensible emotional facial expressions. Our system contains a pre-defined data set, which was residential by a statistics scientist and includes all pure and varied expressions. On average, a data set has achieved 92.4% exact validation of the expressions synthesized by our technique. These facial expressions are compared through the pre-defined data-position inside our system. If it recognizes the person in an abnormal condition, an alert will be passed to the nearby hospital/doctor seeing that a message.
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
### Treasure What You Have: Exploiting Similarity in Deep Neural Networks for Efficient Video Processing
- **Authors:** Hadjer Benmeziane, Halima Bouzidi, Hamza Ouarnoughi, Ozcan Ozturk, Smail Niar
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2305.06492
- **Pdf link:** https://arxiv.org/pdf/2305.06492
- **Abstract**
Deep learning has enabled various Internet of Things (IoT) applications. Still, designing models with high accuracy and computational efficiency remains a significant challenge, especially in real-time video processing applications. Such applications exhibit high inter- and intra-frame redundancy, allowing further improvement. This paper proposes a similarity-aware training methodology that exploits data redundancy in video frames for efficient processing. Our approach introduces a per-layer regularization that enhances computation reuse by increasing the similarity of weights during training. We validate our methodology on two critical real-time applications, lane detection and scene parsing. We observe an average compression ratio of approximately 50% and a speedup of $\sim$1.5x for different models while maintaining the same accuracy.
### Exploiting Fine-Grained DCT Representations for Hiding Image-Level Messages within JPEG Images
- **Authors:** Junxue Yang, Xin Liao
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2305.06582
- **Pdf link:** https://arxiv.org/pdf/2305.06582
- **Abstract**
Unlike hiding bit-level messages, hiding image-level messages is more challenging, which requires large capacity, high imperceptibility, and high security. Although recent advances in hiding image-level messages have been remarkable, existing schemes are limited to lossless spatial images as covers and cannot be directly applied to JPEG images, the ubiquitous lossy format images in daily life. The difficulties of migration are caused by the lack of targeted design and the loss of details due to lossy decompression and re-compression. Considering that taking DCT densely on $8\times8$ image patches is the core of the JPEG compression standard, we design a novel model called \textsf{EFDR}, which can comprehensively \underline{E}xploit \underline{F}ine-grained \underline{D}CT \underline{R}epresentations and embed the secret image into quantized DCT coefficients to avoid the lossy process. Specifically, we transform the JPEG cover image and hidden secret image into fine-grained DCT representations that compact the frequency and are associated with the inter-block and intra-block correlations. Subsequently, the fine-grained DCT representations are further enhanced by a sub-band features enhancement module. Afterward, a transformer-based invertibility module is designed to fuse enhanced sub-band features. Such a design enables a fine-grained self-attention on each sub-band and captures long-range dependencies while maintaining excellent reversibility for hiding and recovery. To our best knowledge, this is the first attempt to embed a color image of equal size in a color JPEG image. Extensive experiments demonstrate the effectiveness of our \textsf{EFDR} with superior performance.
## Keyword: RAW
### Combo of Thinking and Observing for Outside-Knowledge VQA
- **Authors:** Qingyi Si, Yuchen Mo, Zheng Lin, Huishan Ji, Weiping Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2305.06407
- **Pdf link:** https://arxiv.org/pdf/2305.06407
- **Abstract**
Outside-knowledge visual question answering is a challenging task that requires both the acquisition and the use of open-ended real-world knowledge. Some existing solutions draw external knowledge into the cross-modality space which overlooks the much vaster textual knowledge in natural-language space, while others transform the image into a text that further fuses with the textual knowledge into the natural-language space and completely abandons the use of visual features. In this paper, we are inspired to constrain the cross-modality space into the same space of natural-language space which makes the visual features preserved directly, and the model still benefits from the vast knowledge in natural-language space. To this end, we propose a novel framework consisting of a multimodal encoder, a textual encoder and an answer decoder. Such structure allows us to introduce more types of knowledge including explicit and implicit multimodal and textual knowledge. Extensive experiments validate the superiority of the proposed method which outperforms the state-of-the-art by 6.17% accuracy. We also conduct comprehensive ablations of each component, and systematically study the roles of varying types of knowledge. Codes and knowledge data can be found at https://github.com/PhoebusSi/Thinking-while-Observing.
### DeepSTEP -- Deep Learning-Based Spatio-Temporal End-To-End Perception for Autonomous Vehicles
- **Authors:** Sebastian Huch, Florian Sauerbeck, Johannes Betz
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2305.06820
- **Pdf link:** https://arxiv.org/pdf/2305.06820
- **Abstract**
Autonomous vehicles demand high accuracy and robustness of perception algorithms. To develop efficient and scalable perception algorithms, the maximum information should be extracted from the available sensor data. In this work, we present our concept for an end-to-end perception architecture, named DeepSTEP. The deep learning-based architecture processes raw sensor data from the camera, LiDAR, and RaDAR, and combines the extracted data in a deep fusion network. The output of this deep fusion network is a shared feature space, which is used by perception head networks to fulfill several perception tasks, such as object detection or local mapping. DeepSTEP incorporates multiple ideas to advance state of the art: First, combining detection and localization into a single pipeline allows for efficient processing to reduce computational overhead and further improves overall performance. Second, the architecture leverages the temporal domain by using a self-attention mechanism that focuses on the most important features. We believe that our concept of DeepSTEP will advance the development of end-to-end perception systems. The network will be deployed on our research vehicle, which will be used as a platform for data collection, real-world testing, and validation. In conclusion, DeepSTEP represents a significant advancement in the field of perception for autonomous vehicles. The architecture's end-to-end design, time-aware attention mechanism, and integration of multiple perception tasks make it a promising solution for real-world deployment. This research is a work in progress and presents the first concept of establishing a novel perception pipeline.
## Keyword: raw image
There is no result
|
process
|
new submissions for fri may keyword events improving event based video reconstruction via hypernetworks authors burak ercan onur eker canberk saglam aykut erdem erkut erdem subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract event based cameras are becoming increasingly popular for their ability to capture high speed motion with low latency and high dynamic range however generating videos from events remains challenging due to the highly sparse and varying nature of event data to address this in this study we propose a dynamic neural network architecture for event based video reconstruction our approach uses hypernetworks and dynamic convolutions to generate per pixel adaptive filters guided by a context fusion module that combines information from event voxel grids and previously reconstructed intensity images we also employ a curriculum learning strategy to train the network more robustly experimental results demonstrate that achieves better reconstruction quality with fewer parameters and faster inference time than the state of the art methods weditgan few shot image generation via latent space relocation authors yuxuan duan li niu yan hong liqing zhang subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract in few shot image generation directly training gan models on just a handful of images faces the risk of overfitting a popular solution is to transfer the models pretrained on large source domains to small target ones in this work we introduce weditgan which realizes model transfer by editing the intermediate latent codes w in stylegans with learned constant offsets delta w discovering and constructing target latent spaces via simply relocating the distribution of source latent spaces the established one to one mapping between latent spaces can naturally prevents mode collapse and overfitting besides we also propose variants of weditgan to further enhance the relocation process by regularizing the 
direction or finetuning the intensity of delta w experiments on a collection of widely used source target datasets manifest the capability of weditgan in generating realistic and diverse images which is simple yet highly effective in the research area of few shot image generation keyword event camera there is no result keyword events camera there is no result keyword white balance there is no result keyword color contrast there is no result keyword awb there is no result keyword isp towards l system captioning for tree reconstruction authors jannes s magnusson anna hilsmann peter eisert subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract this work proposes a novel concept for tree and plant reconstruction by directly inferring a lindenmayer system l system word representation from image data in an image captioning approach we train a model end to end which is able to translate given images into l system words as a description of the displayed tree to prove this concept we demonstrate the applicability on tree topologies transferred to real image data this novel idea could lead to more efficient accurate and semantically meaningful tree and plant reconstruction without using error prone point cloud extraction and other processes usually utilized in tree reconstruction furthermore this approach bypasses the need for a predefined l system grammar and enables species specific l system inference without biological knowledge emotion recognition for challenged people facial appearance in social using neural network authors p deivendran p suresh babu g malathi k anbazhagan r senthil kumar subjects computer vision and pattern recognition cs cv artificial intelligence cs ai arxiv link pdf link abstract human communication is the vocal and non verbal signal to communicate with others human expression is a significant biometric object in picture and record databases of surveillance systems face appreciation has a serious role in biometric 
methods and is good looking for plentiful applications including visual scrutiny and security facial expressions are a form of nonverbal communication recognizing them helps improve the human machine interaction this paper proposes an idea for face and enlightenment invariant credit of facial expressions by the images in order on the person s face can be computed face expression is used in cnn classifier to categorize the acquired picture into different emotion categories it is a deep feed forward artificial neural network outcome surpasses human presentation and shows poses alternate performance varying lighting conditions can influence the fitting process and reduce recognition precision results illustrate that dependable facial appearance credited with changing lighting conditions for separating reasonable facial terminology display emotions is an efficient representation of clean and assorted moving expressions this process can also manage the proportions of dissimilar basic affecting expressions of those mixed jointly to produce sensible emotional facial expressions our system contains a pre defined data set which was residential by a statistics scientist and includes all pure and varied expressions on average a data set has achieved exact validation of the expressions synthesized by our technique these facial expressions are compared through the pre defined data position inside our system if it recognizes the person in an abnormal condition an alert will be passed to the nearby hospital doctor seeing that a message keyword image signal processing there is no result keyword image signal process there is no result keyword compression treasure what you have exploiting similarity in deep neural networks for efficient video processing authors hadjer benmeziane halima bouzidi hamza ouarnoughi ozcan ozturk smail niar subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract deep learning has enabled various internet of things iot 
applications still designing models with high accuracy and computational efficiency remains a significant challenge especially in real time video processing applications such applications exhibit high inter and intra frame redundancy allowing further improvement this paper proposes a similarity aware training methodology that exploits data redundancy in video frames for efficient processing our approach introduces a per layer regularization that enhances computation reuse by increasing the similarity of weights during training we validate our methodology on two critical real time applications lane detection and scene parsing we observe an average compression ratio of approximately and a speedup of sim for different models while maintaining the same accuracy exploiting fine grained dct representations for hiding image level messages within jpeg images authors junxue yang xin liao subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract unlike hiding bit level messages hiding image level messages is more challenging which requires large capacity high imperceptibility and high security although recent advances in hiding image level messages have been remarkable existing schemes are limited to lossless spatial images as covers and cannot be directly applied to jpeg images the ubiquitous lossy format images in daily life the difficulties of migration are caused by the lack of targeted design and the loss of details due to lossy decompression and re compression considering that taking dct densely on image patches is the core of the jpeg compression standard we design a novel model called textsf efdr which can comprehensively underline e xploit underline f ine grained underline d ct underline r epresentations and embed the secret image into quantized dct coefficients to avoid the lossy process specifically we transform the jpeg cover image and hidden secret image into fine grained dct representations that compact the frequency and are associated 
with the inter block and intra block correlations subsequently the fine grained dct representations are further enhanced by a sub band features enhancement module afterward a transformer based invertibility module is designed to fuse enhanced sub band features such a design enables a fine grained self attention on each sub band and captures long range dependencies while maintaining excellent reversibility for hiding and recovery to our best knowledge this is the first attempt to embed a color image of equal size in a color jpeg image extensive experiments demonstrate the effectiveness of our textsf efdr with superior performance keyword raw combo of thinking and observing for outside knowledge vqa authors qingyi si yuchen mo zheng lin huishan ji weiping wang subjects computer vision and pattern recognition cs cv artificial intelligence cs ai arxiv link pdf link abstract outside knowledge visual question answering is a challenging task that requires both the acquisition and the use of open ended real world knowledge some existing solutions draw external knowledge into the cross modality space which overlooks the much vaster textual knowledge in natural language space while others transform the image into a text that further fuses with the textual knowledge into the natural language space and completely abandons the use of visual features in this paper we are inspired to constrain the cross modality space into the same space of natural language space which makes the visual features preserved directly and the model still benefits from the vast knowledge in natural language space to this end we propose a novel framework consisting of a multimodal encoder a textual encoder and an answer decoder such structure allows us to introduce more types of knowledge including explicit and implicit multimodal and textual knowledge extensive experiments validate the superiority of the proposed method which outperforms the state of the art by accuracy we also conduct comprehensive 
ablations of each component and systematically study the roles of varying types of knowledge codes and knowledge data can be found at deepstep deep learning based spatio temporal end to end perception for autonomous vehicles authors sebastian huch florian sauerbeck johannes betz subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract autonomous vehicles demand high accuracy and robustness of perception algorithms to develop efficient and scalable perception algorithms the maximum information should be extracted from the available sensor data in this work we present our concept for an end to end perception architecture named deepstep the deep learning based architecture processes raw sensor data from the camera lidar and radar and combines the extracted data in a deep fusion network the output of this deep fusion network is a shared feature space which is used by perception head networks to fulfill several perception tasks such as object detection or local mapping deepstep incorporates multiple ideas to advance state of the art first combining detection and localization into a single pipeline allows for efficient processing to reduce computational overhead and further improves overall performance second the architecture leverages the temporal domain by using a self attention mechanism that focuses on the most important features we believe that our concept of deepstep will advance the development of end to end perception systems the network will be deployed on our research vehicle which will be used as a platform for data collection real world testing and validation in conclusion deepstep represents a significant advancement in the field of perception for autonomous vehicles the architecture s end to end design time aware attention mechanism and integration of multiple perception tasks make it a promising solution for real world deployment this research is a work in progress and presents the first concept of establishing a novel perception 
pipeline keyword raw image there is no result
| 1
|
21,520
| 29,804,442,877
|
IssuesEvent
|
2023-06-16 10:31:36
|
bitfocus/companion-module-requests
|
https://api.github.com/repos/bitfocus/companion-module-requests
|
opened
|
Electro-Voice N8000 NetMax 300 MIPS Digital Matrix Controller
|
NOT YET PROCESSED
|
- [x] **I have researched the list of existing Companion modules and requests and have determined this has not yet been requested**
- YES
The name of the device, hardware, or software you would like to control:
- Electro-Voice N8000 NetMax 300 MIPS Digital Matrix Controller
What you would like to be able to make it do from Companion:
- mute input
- mute output
- mute input mixer
- mute output mixer
- command for DSP presets
- noise generator
Direct links or attachments to the ethernet control protocol or API:
[RS-232_for_N8000_1.4.pdf](https://github.com/bitfocus/companion-module-requests/files/11769103/RS-232_for_N8000_1.4.pdf)
|
1.0
|
Electro-Voice N8000 NetMax 300 MIPS Digital Matrix Controller - - [x] **I have researched the list of existing Companion modules and requests and have determined this has not yet been requested**
- YES
The name of the device, hardware, or software you would like to control:
- Electro-Voice N8000 NetMax 300 MIPS Digital Matrix Controller
What you would like to be able to make it do from Companion:
- mute input
- mute output
- mute input mixer
- mute output mixer
- command for DSP presets
- noise generator
Direct links or attachments to the ethernet control protocol or API:
[RS-232_for_N8000_1.4.pdf](https://github.com/bitfocus/companion-module-requests/files/11769103/RS-232_for_N8000_1.4.pdf)
|
process
|
electro voice netmax mips digital matrix controller i have researched the list of existing companion modules and requests and have determined this has not yet been requested yes the name of the device hardware or software you would like to control electro voice netmax mips digital matrix controller what you would like to be able to make it do from companion mute input mute output mute input mixer mute output mixer comand for dsp presets noise generator direct links or attachments to the ethernet control protocol or api
| 1
|
68,629
| 21,769,783,847
|
IssuesEvent
|
2022-05-13 07:55:12
|
vector-im/element-web
|
https://api.github.com/repos/vector-im/element-web
|
closed
|
Can't navigate long topics
|
T-Defect S-Minor A-Room-View Help Wanted Z-Visibility-1 Z-Impact-2 O-Uncommon good first issue Z-IA Z-Labs Z-WTF Team: Delight Z-NewUserJourney
|
I'm in a room with a long topic, and I can't read it all. Specifically, I'm trying to click on a link which is clipped out of bounds, and it's not clickable when viewing room settings making it even more frustrating.
Should we implement a mechanic that reveals more of the room topic? Either by making it reveal more on hover, or scrollable, or with a 'Show more/Show less' interaction. Simple first option:
- Click topic -> modal
- Room avatar & room name as modal title
- Topic as modal body
- X button/ESC to close
|
1.0
|
Can't navigate long topics - I'm in a room with a long topic, and I can't read it all. Specifically, I'm trying to click on a link which is clipped out of bounds, and it's not clickable when viewing room settings making it even more frustrating.
Should we implement a mechanic that reveals more of the room topic? Either by making it reveal more on hover, or scrollable, or with a 'Show more/Show less' interaction. Simple first option:
- Click topic -> modal
- Room avatar & room name as modal title
- Topic as modal body
- X button/ESC to close
|
non_process
|
can t navigate long topics i m in a room with a long topic and i can t read it all specifically i m trying to click on a link which is clipped out of bounds and it s not clickable when viewing room settings making it even more frustrating should we implement a mechanic that reveals more of the room topic either by making it reveal more on hover or scrollable or with a show more show less interaction simple first option click topic modal room avatar room name as modal title topic as modal body x button esc to close
| 0
|
8,676
| 11,809,680,792
|
IssuesEvent
|
2020-03-19 15:19:29
|
MicrosoftDocs/vsts-docs
|
https://api.github.com/repos/MicrosoftDocs/vsts-docs
|
closed
|
Disable all stages at start by default so that only those required can be selected manually
|
Pri1 devops-cicd-process/tech devops/prod doc-bug
|
We would like to provide a pipeline that has various build configurations as different stages but depending on what build configuration they want they need to select appropriate stages.
Currently this is possible if they individually disable stages one by one before starting but by default all stages are always enabled.
If there is a parameter which can set the starting checkbox as disabled it would be helpful so that the pipeline comes up with all unchecked stages and they can choose stages that they want.
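A compile-time sketch of one way to get close to this behavior (the parameter and stage names here are illustrative, not from the reporter's pipeline): expose a boolean parameter per stage that defaults to false, and only include the stage when it is ticked at queue time.

```yaml
parameters:
- name: runStageA
  displayName: Run stage A
  type: boolean
  default: false   # stage is excluded unless explicitly selected

stages:
- ${{ if eq(parameters.runStageA, true) }}:
  - stage: A
    jobs:
    - job: Build
      steps:
      - script: echo building A
```

Because `${{ if }}` is evaluated when the run is compiled, an unticked stage never appears in the run at all, which approximates the "all unchecked by default" behavior requested.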
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 4266f72c-c774-0046-4593-d01eb775d3c3
* Version Independent ID: f20827aa-a6c5-96a8-5969-e576ffbc2e38
* Content: [Stages in Azure Pipelines - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&tabs=yaml&source=docs)
* Content Source: [docs/pipelines/process/stages.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/stages.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Disable all stages at start by default so that only those required can be selected manually - We would like to provide a pipeline that has various build configurations as different stages but depending on what build configuration they want they need to select appropriate stages.
Currently this is possible if they individually disable stages one by one before starting but by default all stages are always enabled.
If there is a parameter which can set the starting checkbox as disabled it would be helpful so that the pipeline comes up with all unchecked stages and they can choose stages that they want.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 4266f72c-c774-0046-4593-d01eb775d3c3
* Version Independent ID: f20827aa-a6c5-96a8-5969-e576ffbc2e38
* Content: [Stages in Azure Pipelines - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&tabs=yaml&source=docs)
* Content Source: [docs/pipelines/process/stages.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/stages.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
disable all stages at start by default so that only those required can be selected manually we would like to provide a pipeline that has various build configurations as different stages but depending on what build configuration they want they need to select appropriate stages currently this is possible if they individually disable stages one by one before starting but by default all stages are always enabled if there is a parameter which can set the starting checkbox as disabled it would be helpful so that the pipeline comes up with all unchecked stages and they can choose stages that they want document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
7,497
| 10,583,898,105
|
IssuesEvent
|
2019-10-08 14:31:20
|
prisma/studio
|
https://api.github.com/repos/prisma/studio
|
closed
|
Pressing Tab while editing a cell should not focus the browser's address bar
|
bug/2-confirmed kind/bug process/candidate
|
Instead, exit the edit mode and move to the next cell horizontally
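A minimal sketch of the requested behavior, assuming a grid addressed by row/column indices (the helper name and handler shape are hypothetical, not Studio's actual API):

```javascript
// Given the current column index and the column count, return the column
// the edit caret should move to when Tab is pressed while editing a cell.
// Shift+Tab moves backwards; both directions wrap horizontally so focus
// never falls through to the browser chrome.
function nextColumnOnTab(col, columnCount, shiftKey = false) {
  const step = shiftKey ? -1 : 1;
  return (col + step + columnCount) % columnCount;
}

// In the grid's keydown handler, the important part is preventDefault(),
// which stops the browser from moving focus to the address bar:
// if (event.key === 'Tab') {
//   event.preventDefault();
//   commitEdit();
//   focusCell(row, nextColumnOnTab(col, cols, event.shiftKey));
// }
```

The `preventDefault()` call is what keeps focus inside the grid; the helper only decides where the caret goes next.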
|
1.0
|
Pressing Tab while editing a cell should not focus the browser's address bar - Instead, exit the edit mode and move to the next cell horizontally
|
process
|
pressing tab while editing a cell should not focus the browser s address bar instead exit the edit mode and move to the next cell horizontally
| 1
|
1,717
| 2,665,174,183
|
IssuesEvent
|
2015-03-20 18:46:37
|
phetsims/arch
|
https://api.github.com/repos/phetsims/arch
|
closed
|
code review for potential merge into master
|
code review
|
This is a consolidation of https://github.com/phetsims/axon/issues/34, https://github.com/phetsims/joist/issues/167 and https://github.com/phetsims/sun/issues/120.
Review arch and the 'arch' branches of axon, joist and sun. Sim examples are in the 'arch' branches of balancing-act and fluid-pressure-and-flow.
Evaluate whether this approach to data collection is (a) suitable for merging into master, and (b) if the design seems scalable enough for future studies or (c) if we should make API changes before proliferating usage.
It would be nice to review our strategy and merge it to master before the next study. The study begins on Monday October 27, so having the review complete by Monday October 13 would give us plenty of time to address the review issues and merge to master, with enough time for testing.
|
1.0
|
code review for potential merge into master - This is a consolidation of https://github.com/phetsims/axon/issues/34, https://github.com/phetsims/joist/issues/167 and https://github.com/phetsims/sun/issues/120.
Review arch and the 'arch' branches of axon, joist and sun. Sim examples are in the 'arch' branches of balancing-act and fluid-pressure-and-flow.
Evaluate whether this approach to data collection is (a) suitable for merging into master, and (b) if the design seems scalable enough for future studies or (c) if we should make API changes before proliferating usage.
It would be nice to review our strategy and merge it to master before the next study. The study begins on Monday October 27, so having the review complete by Monday October 13 would give us plenty of time to address the review issues and merge to master, with enough time for testing.
|
non_process
|
code review for potential merge into master this is a consolidation of and review arch and the arch branches of axon joist and sun sim examples are in the arch branches of balancing act and fluid pressure and flow evaluate whether this approach to data collection is a suitable for merging into master and b if the design seems scalable enough for future studies or c if we should make api changes before proliferating usage it would be nice to review our strategy and merge it to master before the next study the study begins on monday october so having the review complete by monday october would give us plenty of time to address the review issues and merge to master with enough time for testing
| 0
|
21,665
| 30,110,942,025
|
IssuesEvent
|
2023-06-30 07:42:05
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
pipelines conditions.md example error
|
doc-bug Pri1 azure-devops-pipelines/svc azure-devops-pipelines-process/subsvc
|
The `if eq` from this block results in an error in ADO pipelines.
```yaml
# parameters.yml
parameters:
- name: doThing
default: true # value passed to the condition
type: boolean
jobs:
- job: B
steps:
- script: echo I did a thing
condition: ${{ if eq(parameters.doThing, true) }}
```
Shouldn't be an `if` here.
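For comparison, a sketch of the likely intended snippet (this is the reporter's reading of the doc bug, not an official fix): drop the `if` keyword, since `condition:` takes an expression, while `${{ if }}` is only valid as a conditional-insertion key.

```yaml
# parameters.yml
parameters:
- name: doThing
  default: true  # value passed to the condition
  type: boolean

jobs:
- job: B
  steps:
  - script: echo I did a thing
    condition: ${{ eq(parameters.doThing, true) }}
```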
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: 21e5cee4-eaae-3a96-db91-540ac759e83a
* Version Independent ID: 9bdc837c-ffe0-d999-f922-f3a5debc7f92
* Content: [Conditions - Azure Pipelines](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml%2Cstages)
* Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/conditions.md)
* Service: **azure-devops-pipelines**
* Sub-service: **azure-devops-pipelines-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
pipelines conditions.md example error - The `if eq` from this block results in an error in ADO pipelines.
```yaml
# parameters.yml
parameters:
- name: doThing
default: true # value passed to the condition
type: boolean
jobs:
- job: B
steps:
- script: echo I did a thing
condition: ${{ if eq(parameters.doThing, true) }}
```
Shouldn't be an `if` here.
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: 21e5cee4-eaae-3a96-db91-540ac759e83a
* Version Independent ID: 9bdc837c-ffe0-d999-f922-f3a5debc7f92
* Content: [Conditions - Azure Pipelines](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml%2Cstages)
* Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/conditions.md)
* Service: **azure-devops-pipelines**
* Sub-service: **azure-devops-pipelines-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
pipelines conditions md example error the if eq from this block results in an error in ado pipelines yaml parameters yml parameters name dothing default true value passed to the condition type boolean jobs job b steps script echo i did a thing condition if eq parameters dothing true shouldn t be an if here document details ⚠ do not edit this section it is required for learn microsoft com ➟ github issue linking id eaae version independent id content content source service azure devops pipelines sub service azure devops pipelines process github login juliakm microsoft alias jukullam
| 1
|
7,029
| 10,188,946,042
|
IssuesEvent
|
2019-08-11 15:24:42
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
child_process: empty options.env results in ENOMEM error
|
child_process libuv windows
|
* **Version**: v10.16.1
* **Platform**: Windows 10 x64 Version 1803 (OS Build 17134.885)
* **Subsystem**: child_process
When calling another process, the `exec` and `execSync` commands fail with a ENOMEM error when using an empty object for the `options.env` variable. Using any non-`undefined` value at all in the `env` object makes the call work again.
```js
const { execSync } = require('child_process');
execSync('node', {env: {}}) // Fails with ENOMEM error
execSync('node', {env: {x: undefined}}) // Fails with ENOMEM error
execSync('node', {env: {x: 0}}) // Works
```
The error trace:
```
Thrown:
{ Error: spawnSync C:\WINDOWS\system32\cmd.exe ENOMEM
at Object.spawnSync (internal/child_process.js:1002:20)
at spawnSync (child_process.js:614:24)
at execSync (child_process.js:661:13)
errno: 'ENOMEM',
code: 'ENOMEM',
syscall: 'spawnSync C:\\WINDOWS\\system32\\cmd.exe',
path: 'C:\\WINDOWS\\system32\\cmd.exe',
spawnargs: [ '/d', '/s', '/c', '"node"' ],
error: [Circular],
status: null,
signal: null,
output: null,
pid: 0,
stdout: null,
stderr: null }
```
This did not happen in v10.15.*, and seems to affect v10.16.0 as well.
|
1.0
|
child_process: empty options.env results in ENOMEM error - * **Version**: v10.16.1
* **Platform**: Windows 10 x64 Version 1803 (OS Build 17134.885)
* **Subsystem**: child_process
When calling another process, the `exec` and `execSync` commands fail with a ENOMEM error when using an empty object for the `options.env` variable. Using any non-`undefined` value at all in the `env` object makes the call work again.
```js
const { execSync } = require('child_process');
execSync('node', {env: {}}) // Fails with ENOMEM error
execSync('node', {env: {x: undefined}}) // Fails with ENOMEM error
execSync('node', {env: {x: 0}}) // Works
```
The error trace:
```
Thrown:
{ Error: spawnSync C:\WINDOWS\system32\cmd.exe ENOMEM
at Object.spawnSync (internal/child_process.js:1002:20)
at spawnSync (child_process.js:614:24)
at execSync (child_process.js:661:13)
errno: 'ENOMEM',
code: 'ENOMEM',
syscall: 'spawnSync C:\\WINDOWS\\system32\\cmd.exe',
path: 'C:\\WINDOWS\\system32\\cmd.exe',
spawnargs: [ '/d', '/s', '/c', '"node"' ],
error: [Circular],
status: null,
signal: null,
output: null,
pid: 0,
stdout: null,
stderr: null }
```
This did not happen in v10.15.*, and seems to affect v10.16.0 as well.
|
process
|
child process empty options env results in enomem error version platform windows version os build subsystem child process when calling another process the exec and execsync commands fail with a enomem error when using an empty object for the options env variable using any non undefined value at all in the env object makes the call work again js const execsync require child process execsync node env fails with enomem error execsync node env x undefined fails with enomem error execsync node env x works the error trace thrown error spawnsync c windows cmd exe enomem at object spawnsync internal child process js at spawnsync child process js at execsync child process js errno enomem code enomem syscall spawnsync c windows cmd exe path c windows cmd exe spawnargs error status null signal null output null pid stdout null stderr null this did not happen in and seems to affect as well
| 1
|
10,421
| 13,213,868,667
|
IssuesEvent
|
2020-08-16 14:58:37
|
dotnet/runtime
|
https://api.github.com/repos/dotnet/runtime
|
closed
|
Redirecting stdout/stderr of a process throws exception with .NET 5 preview.6 on WSL1
|
area-System.Diagnostics.Process bug
|
<!--This is just a template - feel free to delete any and all of it and replace as appropriate.-->
### Description
This was reported in PowerShell repo (https://github.com/PowerShell/PowerShell/issues/13407), and it turns out to be a .NET 5 issue on Ubuntu 18.04 with WSL1.
#### Repro
On Ubuntu 18.04 with WSL1, run the following in PowerShell 7.1.0-preview.5 (using .NET 5 preview.6)
```
PS:29> date > blah
ResourceUnavailable: Program 'date' failed to run: The operation is not allowed on non-connected sockets.At line:1 char:1
+ date > blah
+ ~~~~~~~~~~~.
```
The exception is:
```
PS:33> $e.Exception.InnerException Type : System.IO.IOException
TargetSite : Void .ctor(System.Net.Sockets.Socket, System.IO.FileAccess, Boolean)
StackTrace : at System.Net.Sockets.NetworkStream..ctor(Socket socket, FileAccess access, Boolean ownsSocket)
at System.Diagnostics.Process.OpenStream(Int32 fd, FileAccess access)
at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo)
at System.Diagnostics.Process.Start()
at System.Management.Automation.NativeCommandProcessor.InitNativeProcess() in /PowerShell/src/System.Management.Automation/engine
/NativeCommandProcessor.cs:line 466
Message : The operation is not allowed on non-connected sockets.
Data : {}
InnerException :
HelpLink :
Source : System.Net.Sockets
HResult : -2146232800
```
### Configuration
* Which version of .NET is the code running on:
- .NET 5 preview.6
* What OS and version, and what distro if applicable:
- Ubuntu 18.04.4 LTS, WSL1
* What is the architecture (x64, x86, ARM, ARM64):
- x64
* Do you know whether it is specific to that configuration:
- not sure, it works fine on my standalone Ubuntu 16.04 machine, but didn't try with other configurations.
### Regression?
Yes, it doesn't repro with PowerShell 7.1.0-preview.3, which is on top of .NET 5 preview.4.
That means the issue was a regression introduced after .NET 5 preview.4.
|
1.0
|
Redirecting stdout/stderr of a process throws exception with .NET 5 preview.6 on WSL1 - <!--This is just a template - feel free to delete any and all of it and replace as appropriate.-->
### Description
This was reported in PowerShell repo (https://github.com/PowerShell/PowerShell/issues/13407), and it turns out to be a .NET 5 issue on Ubuntu 18.04 with WSL1.
#### Repro
On Ubuntu 18.04 with WSL1, run the following in PowerShell 7.1.0-preview.5 (using .NET 5 preview.6)
```
PS:29> date > blah
ResourceUnavailable: Program 'date' failed to run: The operation is not allowed on non-connected sockets.At line:1 char:1
+ date > blah
+ ~~~~~~~~~~~.
```
The exception is:
```
PS:33> $e.Exception.InnerException Type : System.IO.IOException
TargetSite : Void .ctor(System.Net.Sockets.Socket, System.IO.FileAccess, Boolean)
StackTrace : at System.Net.Sockets.NetworkStream..ctor(Socket socket, FileAccess access, Boolean ownsSocket)
at System.Diagnostics.Process.OpenStream(Int32 fd, FileAccess access)
at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo)
at System.Diagnostics.Process.Start()
at System.Management.Automation.NativeCommandProcessor.InitNativeProcess() in /PowerShell/src/System.Management.Automation/engine
/NativeCommandProcessor.cs:line 466
Message : The operation is not allowed on non-connected sockets.
Data : {}
InnerException :
HelpLink :
Source : System.Net.Sockets
HResult : -2146232800
```
### Configuration
* Which version of .NET is the code running on:
- .NET 5 preview.6
* What OS and version, and what distro if applicable:
- Ubuntu 18.04.4 LTS, WSL1
* What is the architecture (x64, x86, ARM, ARM64):
- x64
* Do you know whether it is specific to that configuration:
- not sure, it works fine on my standalone Ubuntu 16.04 machine, but didn't try with other configurations.
### Regression?
Yes, it doesn't repro with PowerShell 7.1.0-preview.3, which is on top of .NET 5 preview.4.
That means the issue was a regression introduced after .NET 5 preview.4.
|
process
|
redirecting stdout stderr of a process throws exception with net preview on description this was reported in powershell repo and it turns out to be a net issue on ubuntu with repro on ubuntu with run the following in powershell preview using net preview ps date blah resourceunavailable program date failed to run the operation is not allowed on non connected sockets at line char date blah the exception is ps e exception innerexception type system io ioexception targetsite void ctor system net sockets socket system io fileaccess boolean stacktrace at system net sockets networkstream ctor socket socket fileaccess access boolean ownssocket at system diagnostics process openstream fd fileaccess access at system diagnostics process startcore processstartinfo startinfo at system diagnostics process start at system management automation nativecommandprocessor initnativeprocess in powershell src system management automation engine nativecommandprocessor cs line message the operation is not allowed on non connected sockets data innerexception helplink source system net sockets hresult configuration which version of net is the code running on net preview what os and version and what distro if applicable ubuntu lts what is the architecture arm do you know whether it is specific to that configuration not sure it works fine on my standalone ubuntu machine but didn t try with other configurations regression yes it doesn t repro with powershell preview which is on top of net preview that means the issue was a regression introduced after net preview
| 1
|
126,858
| 5,006,781,120
|
IssuesEvent
|
2016-12-12 15:06:43
|
openshift/origin-web-console
|
https://api.github.com/repos/openshift/origin-web-console
|
opened
|
Alignment and spacing around <status-icon>s is inconsistent
|
area/styles kind/bug priority/P3
|
In working on #1016, I noticed sometimes <status-icon>s have margin-right. Some times they don't. Some times they use .fa-fw. Some times they don't. The icon size differs based on the context (which is ok), so make sure any fixes for the aforementioned aren't negatively impacted by the size.
|
1.0
|
Alignment and spacing around <status-icon>s is inconsistent - In working on #1016, I noticed sometimes <status-icon>s have margin-right. Some times they don't. Some times they use .fa-fw. Some times they don't. The icon size differs based on the context (which is ok), so make sure any fixes for the aforementioned aren't negatively impacted by the size.
|
non_process
|
alignment and spacing around lt status icon gt s is inconsistent in working on i noticed sometimes lt status icon gt s have margin right some times they don t some times they use fa fw some times they don t the icon size differs based on the context which is ok so make sure any fixes for the aforementioned aren t negatively impacted by the size
| 0
|
44,034
| 23,469,387,233
|
IssuesEvent
|
2022-08-16 20:06:27
|
ansible/ansible
|
https://api.github.com/repos/ansible/ansible
|
closed
|
On systems with slow disks, Ansible 2.10 runs generally much slower than 2.9
|
python3 performance module support:core bug has_pr docs commands affects_2.10
|
##### SUMMARY
At first, I thought this may just be a problem that was caused by 2.10 using collections and shipping with dozens of collections out of the box when you `pip install` it now.
But after exploring further, I found that basic `ansible` commands like `ansible --version` are 3x slower than Ansible 2.9, even if I'm just installing and using `ansible-base`, with _no collections_ installed.
Note that these tests were done on a Raspberry Pi 4 (after noticing it took about 2 minutes to run `ansible --version` on my Pi Zero after upgrading to 2.10). I haven't yet tested on my Mac, where the system's blazing-fast NVM drive and i9 CPU will make the absolute numbers much better—but I would like to see if the _relative_ performance difference is the same.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible (and other `ansible-*` commands)
##### ANSIBLE VERSION
```paste below
$ time ansible --version
ansible 2.10.1
config file = /home/pi/pi-webcam/ansible.cfg
configured module search path = ['/home/pi/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.7/dist-packages/ansible
executable location = /usr/local/bin/ansible
python version = 3.7.3 (default, Jul 25 2020, 13:03:44) [GCC 8.3.0]
```
##### CONFIGURATION
```paste below
N/A
```
##### OS / ENVIRONMENT
Linux (Debian 10, Raspberry Pi OS)
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
pip3 install -y ansible
time ansible --version # run 4 times, discard 1st result
pip3 uninstall -y ansible ansible-base
pip3 install -y ansible-base
time ansible --version # run 4 times, discard 1st result
pip3 uninstall -y ansible ansible-base
pip3 install -y ansible==2.9.*
time ansible --version # run 4 times, discard 1st result
pip3 uninstall -y ansible ansible-base
```
##### EXPECTED RESULTS
2.10.1 version should be at least _similar_, maybe a little slower, but not twice or three times slower.
##### ACTUAL RESULTS
Benchmarking results:
| Ansible version | Pip package size | Time | Delta vs 2.9 |
| --- | --- | --- | --- |
| 2.9.13 | 16.2MB | 2.09s | - |
| 2.10.0 (ansible) | 43.1MB | 6.09s | 3x slower |
| 2.10.1 (ansible-base) | 1.9MB | 6.33s | 3x slower |
|
True
|
On systems with slow disks, Ansible 2.10 runs generally much slower than 2.9 - ##### SUMMARY
At first, I thought this may just be a problem that was caused by 2.10 using collections and shipping with dozens of collections out of the box when you `pip install` it now.
But after exploring further, I found that basic `ansible` commands like `ansible --version` are 3x slower than Ansible 2.9, even if I'm just installing and using `ansible-base`, with _no collections_ installed.
Note that these tests were done on a Raspberry Pi 4 (after noticing it took about 2 minutes to run `ansible --version` on my Pi Zero after upgrading to 2.10). I haven't yet tested on my Mac, where the system's blazing-fast NVM drive and i9 CPU will make the absolute numbers much better—but I would like to see if the _relative_ performance difference is the same.
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
ansible (and other `ansible-*` commands)
##### ANSIBLE VERSION
```paste below
$ time ansible --version
ansible 2.10.1
config file = /home/pi/pi-webcam/ansible.cfg
configured module search path = ['/home/pi/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.7/dist-packages/ansible
executable location = /usr/local/bin/ansible
python version = 3.7.3 (default, Jul 25 2020, 13:03:44) [GCC 8.3.0]
```
##### CONFIGURATION
```paste below
N/A
```
##### OS / ENVIRONMENT
Linux (Debian 10, Raspberry Pi OS)
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
pip3 install -y ansible
time ansible --version # run 4 times, discard 1st result
pip3 uninstall -y ansible ansible-base
pip3 install -y ansible-base
time ansible --version # run 4 times, discard 1st result
pip3 uninstall -y ansible ansible-base
pip3 install -y ansible==2.9.*
time ansible --version # run 4 times, discard 1st result
pip3 uninstall -y ansible ansible-base
```
##### EXPECTED RESULTS
2.10.1 version should be at least _similar_, maybe a little slower, but not twice or three times slower.
##### ACTUAL RESULTS
Benchmarking results:
| Ansible version | Pip package size | Time | Delta vs 2.9 |
| --- | --- | --- | --- |
| 2.9.13 | 16.2MB | 2.09s | - |
| 2.10.0 (ansible) | 43.1MB | 6.09s | 3x slower |
| 2.10.1 (ansible-base) | 1.9MB | 6.33s | 3x slower |
|
non_process
|
on systems with slow disks ansible runs generally much slower than summary at first i thought this may just be a problem that was caused by using collections and shipping with dozens of collections out of the box when you pip install it now but after exploring further i found that basic ansible commands like ansible version are slower than ansible even if i m just installing and using ansible base with no collections installed note that these tests were done on a raspberry pi after noticing it took about minutes to run ansible version on my pi zero after upgrading to i haven t yet tested on my mac where the system s blazing fast nvm drive and cpu will make the absolute numbers much better—but i would like to see if the relative performance difference is the same issue type bug report component name ansible and other ansible commands ansible version paste below time ansible version ansible config file home pi pi webcam ansible cfg configured module search path ansible python module location usr local lib dist packages ansible executable location usr local bin ansible python version default jul configuration paste below n a os environment linux debian raspberry pi os steps to reproduce yaml install y ansible time ansible version run times discard result uninstall y ansible ansible base install y ansible base time ansible version run times discard result uninstall y ansible ansible base install y ansible time ansible version run times discard result uninstall y ansible ansible base expected results version should be at least similar maybe a little slower but not twice or three times slower actual results benchmarking results ansible version pip package size time delta vs ansible slower ansible base slower
| 0
|
3,686
| 6,716,213,393
|
IssuesEvent
|
2017-10-14 04:26:12
|
facebook/osquery
|
https://api.github.com/repos/facebook/osquery
|
closed
|
Convert Linux process_events mode column to octal
|
API change events Hacktoberfest process auditing
|
The events on linux systems from process_events log the "mode" column in decimal. However, the OpenBSM stack on OSX appropriately converts it to octal. This should be consistent (octal) across operating systems.
Context in Slack
https://osquery.slack.com/archives/C08V7KTJB/p1505428514000089
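The conversion itself is a one-liner; a sketch of the requested normalization (the helper name is illustrative, not osquery's actual code):

```javascript
// Render the decimal mode value the Linux audit path reports as the
// octal string the OpenBSM (macOS) path already emits.
const toOctalMode = (decimalMode) => Number(decimalMode).toString(8);
```

For example, the decimal audit value 33188 renders as the familiar `100644`.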
|
1.0
|
Convert Linux process_events mode column to octal - The events on linux systems from process_events log the "mode" column in decimal. However, the OpenBSM stack on OSX appropriately converts it to octal. This should be consistent (octal) across operating systems.
Context in Slack
https://osquery.slack.com/archives/C08V7KTJB/p1505428514000089
|
process
|
convert linux process events mode column to octal the events on linux systems from process events log the mode column in decimal however the openbsm stack on osx appropriately converts it to octal this should be consistent octal across operating systems context in slack
| 1
|
12,799
| 15,180,843,082
|
IssuesEvent
|
2021-02-15 01:28:02
|
Geonovum/disgeo-arch
|
https://api.github.com/repos/Geonovum/disgeo-arch
|
closed
|
3.1 Beleid Samenhangende Objectenregistratie
|
In behandeling - voorstel principes In behandeling - voorstel processen e.d. Principes Processen Functies Componenten
|
In point 2 of the points from the charcoal sketch (houtskoolschets), the term 'supplier' (leverancier) is used. This term is unclear and confusing here. The source document also offers no clarity on this. Is it about authorizing suppliers to submit information from their software on behalf of, for example, a municipality? Or about authorizing third parties to apply changes to the source registration independently, for example external managers of public space? The first case is outside the scope of this document and concerns a technical solution for submitting information via the infrastructure of a third party.
The architecture vision Dis Geo contains 10 principles, of which this document names only 5. It is unclear what the motivation for this is. The omitted principles are relevant to the architecture description, in particular because several of them describe the role of 'everyone' (eenieder), and are thereby an important guideline for the objective the facilities must realise.
Point 7 from the policy vision SOR seems to state that recording and retrieval sit in a separate database. That was at least at one time a principle: separation between registering and providing. But that separation led to separate data models arising for collection and separate ones for retrieval/provision, with issues/fallout, translations and differences in currency as a result. The intention should be: collection has its own dynamic (many checks, collecting in steps), which differs from the dynamic of provision (tailored information/information products, delivering whole objects).
You want to separate the processes, but the information in both processes should precisely not be separated but identical.
Alternative formulation: In the design of a coherent objects registration there is, at the process level, an explicit separation between the collection processes and the functionality for editing, retrieving and presenting them, but at the information/data level there is precisely no separation. At the data level there are the same definitions, uniformity and currency.
Further explanation of point 7: the challenge is twofold:
How do you collect, coherently? How do you provide, across base registrations, up to date, without the user experiencing inconsistencies? This challenge comes with another principle, or at least another formulation:
Principle: collect for use. For example: the collecting party is responsible for the information that is provided and collects with knowledge and understanding of the information need in the provision process.
For example: there is a database conforming to the information model, and the collecting party fills that database as soon as it can. That is then immediately up to date, the collecting party experiences for itself whether the information it collects is usable, and no transformation step is needed anymore, hence no chance of fallout.
Naturally the collecting party will also want to maintain extra information, and that is allowed and possible, but at least one element of the architecture solution should be that the collecting party keeps the data in the provision database in order: a database from which delivery could take place directly, collected nationally in an LV (landelijke voorziening), where the same information model resides.
For this reason, the different objectives a registration must serve should of course also be examined closely. The BAG was set up as a registration with references to municipal decisions concerning buildings, etc. An information model was made for that. A very large share of the users is interested in addresses. A too hastily taken decision to, for example, use the buildings in the BAG as the basis for the BGT can have unpleasant consequences, both on the collection side and on the user side. (The collector, incidentally, is often one of the users.)
This also comes back in 4.2 figure 6 and 5.2.3.1 Derived storage (Afgeleide opslag). See the remarks there.
|
2.0
|
3.1 Beleid Samenhangende Objectenregistratie (Policy for the Coherent Object Registration) - In point 2 of the points from the "houtskoolschets" (charcoal sketch), the term supplier ("leverancier") is used. This term is unclear and confusing here, and the source document offers no clarity either. Does it concern authorizing suppliers to submit information on behalf of, for example, a municipality from their software? Or authorizing third parties to independently make changes in the source registration, for example external managers of public space? The first case is outside the scope of this document and concerns a technical solution for submitting information via a third party's infrastructure.
The Dis Geo architecture vision contains 10 principles, of which this document mentions only 5. The motivation for this is unclear. The omitted principles are relevant to the architecture description, particularly because several of them describe the role of "everyone" ("eenieder"), and are thereby an important guideline for the goal the facilities are meant to achieve.
Point 7 appears to stem from the SOR policy vision: recording and retrieval sit in separate databases. At least, that was once a principle: separation between registering and providing. But that separation led to separate data models for collection and separate ones for retrieval/provision, resulting in issues/fallout, translations, and differences in currency. The intention should be: collection has its own dynamics (many checks, collecting in steps), which differ from the dynamics of provision (tailored information/information products, delivering whole objects).
You want to separate the processes, but the information in both processes must not be separated; it must be identical.
Alternative formulation: In the design of a coherent object registration, there is at the process level an explicit separation between the collection processes and the functionality for editing, retrieving, and presenting them, but at the information/data level there is precisely no separation. At the data level, the same definitions, uniformity, and currency apply.
Further explanation of point 7: The challenge is twofold:
How do you collect coherently? How do you provide, across base registrations, up to date, without the user experiencing inconsistencies? This challenge calls for another principle, or at least a different formulation:
Principle: collect for use. For example: the collecting party is responsible for the information that is provided and collects with knowledge and understanding of the information needs in the provision process.
For example: there is a database conforming to the information model, and the collecting party fills that database as soon as it can. It is then immediately up to date, the collecting party experiences for itself whether the information it collects is usable, and no transformation step is needed, hence no risk of fallout.
Naturally the collecting party will also want to maintain extra information, and that is allowed and possible, but at least one part of the architecture solution should be that the collecting party keeps the data in the provision database in order: a database from which delivery could take place directly, collected nationally in an LV ("landelijke voorziening") holding the same information model.
For this reason, the various goals a registration must serve should of course also be examined carefully. The BAG was set up as a registration with references to the decisions of the municipality concerning buildings, etc. An information model was made for that. A very large share of the users is interested in addresses. A too hastily taken decision, for example to use the buildings in the BAG as the basis for the BGT, can have unpleasant consequences, both for the collection side and for the user side. (The collector, incidentally, is often one of the users.)
This also returns in 4.2 figure 6 and 5.2.3.1 Derived storage ("Afgeleide opslag"). See the comments there.
|
process
|
beleid samenhangende objectenregistratie in point of the points from the charcoal sketch the term supplier is used this term is unclear and confusing here the source document offers no clarity either does it concern authorizing suppliers to submit information on behalf of for example a municipality from their software or authorizing third parties to independently make changes in the source registration for example external managers of public space the first case is outside the scope of this document and concerns a technical solution for submitting information via a third party s infrastructure the dis geo architecture vision contains principles of which this document mentions only unclear what the motivation for this is the omitted principles are relevant to the architecture description particularly because several of them describe the role of everyone and are thereby an important guideline for the goal the facilities are meant to achieve point appears to stem from the sor policy vision recording and retrieval sit in a separate database that was at least once a principle separation between registering and providing but that separation led to separate data models for collection and separate ones for retrieval provision with issues fallout translations and currency differences as a result the intention should be collection has its own dynamics many checks collecting in steps which differ from the dynamics of provision tailored information information products delivering whole objects you want to separate the processes but the information in both processes must not be separated but identical alternative formulation in the design of a coherent object registration there is at the process level an explicit separation between the collection processes and the functionality for editing retrieving and presenting them but at the information data level there is precisely no separation at the data
level the same definitions uniformity and currency apply further explanation of point the challenge is twofold how do you collect coherently how do you provide across base registrations up to date without the user experiencing inconsistencies this challenge calls for another principle or at least a different formulation principle collect for use for example the collecting party is responsible for the information that is provided and collects with knowledge and understanding of the information needs in the provision process for example there is a database conforming to the information model and the collecting party fills that database as soon as it can it is then immediately up to date the collecting party experiences for itself whether the information it collects is usable and no transformation step is needed and thus no risk of fallout naturally the collecting party also wants to maintain extra information and that is allowed and possible but at least one part of the architecture solution should be that the collecting party keeps the data in the provision database in order a database from which delivery could take place directly collected nationally in an lv holding the same information model for this reason the various goals a registration must serve should of course also be examined carefully the bag was set up as a registration with references to the decisions of the municipality concerning buildings etc an information model was made for that a very large share of the users is interested in addresses a too hastily taken decision for example to use the buildings in the bag as the basis for the bgt can have unpleasant consequences both for the collection side and for the user side the collector is incidentally often one of the users in figure and derived storage this also returns see the comments there
| 1
|
244,562
| 18,761,949,047
|
IssuesEvent
|
2021-11-05 17:32:02
|
CMPUT301F21T23/HabitTracker
|
https://api.github.com/repos/CMPUT301F21T23/HabitTracker
|
closed
|
Export and move meeting minutes to Github
|
documentation
|
Take the meeting minutes which we keep on Google Docs, and put them in the main branch of the Github.
|
1.0
|
Export and move meeting minutes to Github - Take the meeting minutes which we keep on Google Docs, and put them in the main branch of the Github.
|
non_process
|
export and move meeting minutes to github take the meeting minutes which we keep on google docs and put them in the main branch of the github
| 0
|
385,080
| 26,616,377,008
|
IssuesEvent
|
2023-01-24 07:33:40
|
CAPSTONE369/369_SERVER
|
https://api.github.com/repos/CAPSTONE369/369_SERVER
|
closed
|
Documentation of regional DB criteria
|
documentation
|
Tasks assigned by @sunnyineverywhere
- [x] Decide the criteria for how to divide the regional groups
e.g.) Seodaemun-gu, Mapo-gu -> Group 1
i.e. the regional groups to be bundled together
- [x] Put them into the DB
For now the idea was to show posts among regions in the same group, but the approach may change later during development
e.g. returning results within a 5km radius based on the user's address…
|
1.0
|
Documentation of regional DB criteria - Tasks assigned by @sunnyineverywhere
- [x] Decide the criteria for how to divide the regional groups
e.g.) Seodaemun-gu, Mapo-gu -> Group 1
i.e. the regional groups to be bundled together
- [x] Put them into the DB
For now the idea was to show posts among regions in the same group, but the approach may change later during development
e.g. returning results within a 5km radius based on the user's address…
|
non_process
|
documentation of regional db criteria sunnyineverywhere tasks assigned decide the criteria for how to divide the regional groups e g seodaemun gu mapo gu group the regional groups to be bundled together put them into the db for now the idea was to show posts among regions in the same group but the approach may change later during development returning results within a radius based on the user s address…
| 0
|
89,265
| 8,199,140,832
|
IssuesEvent
|
2018-08-31 18:58:50
|
elastic/elasticsearch
|
https://api.github.com/repos/elastic/elasticsearch
|
opened
|
[CI] SmokeTestWatcherWithSecurityIT.testSearchInputHasPermissions flaky failures
|
:Core/Watcher >test-failure
|
master and 6.x have been running into many instances of this test failing
here is one reproduce snippet that doesn't always reproduce
```
REPRODUCE WITH: ./gradlew :x-pack:qa:smoke-test-watcher-with-security:integTestRunner \
-Dtests.seed=7EF35AACC2EC51CB \
-Dtests.class=org.elasticsearch.smoketest.SmokeTestWatcherWithSecurityIT \
-Dtests.method="testSearchInputHasPermissions" \
-Dtests.security.manager=true \
-Dtests.locale=sr-Latn-BA \
-Dtests.timezone=Pacific/Fakaofo \
-Dcompiler.java=10 \
-Druntime.java=10
```
|
1.0
|
[CI] SmokeTestWatcherWithSecurityIT.testSearchInputHasPermissions flaky failures - master and 6.x have been running into many instances of this test failing
here is one reproduce snippet that doesn't always reproduce
```
REPRODUCE WITH: ./gradlew :x-pack:qa:smoke-test-watcher-with-security:integTestRunner \
-Dtests.seed=7EF35AACC2EC51CB \
-Dtests.class=org.elasticsearch.smoketest.SmokeTestWatcherWithSecurityIT \
-Dtests.method="testSearchInputHasPermissions" \
-Dtests.security.manager=true \
-Dtests.locale=sr-Latn-BA \
-Dtests.timezone=Pacific/Fakaofo \
-Dcompiler.java=10 \
-Druntime.java=10
```
|
non_process
|
smoketestwatcherwithsecurityit testsearchinputhaspermissions flaky failures master and x have been running into many instances of this test failing here is one reproduce snippet that doesn t always reproduce reproduce with gradlew x pack qa smoke test watcher with security integtestrunner dtests seed dtests class org elasticsearch smoketest smoketestwatcherwithsecurityit dtests method testsearchinputhaspermissions dtests security manager true dtests locale sr latn ba dtests timezone pacific fakaofo dcompiler java druntime java
| 0
|
5,741
| 8,580,870,921
|
IssuesEvent
|
2018-11-13 13:16:45
|
easy-software-ufal/annotations_repos
|
https://api.github.com/repos/easy-software-ufal/annotations_repos
|
opened
|
NakedObjectsGroup/NakedObjectsFramework [Immutable] class throws error when an action is invoked on it.
|
ADA C# wrong processing
|
Issue: `https://github.com/NakedObjectsGroup/NakedObjectsFramework/issues/30`
PR: `https://github.com/NakedObjectsGroup/NakedObjectsFramework/commit/8f01544f1e85be80f8e835024a52be8ac5677c8e`
Multiple pull requests.
I'd put as misuse as they're marking mutable objects as `[Immutable]`.
|
1.0
|
NakedObjectsGroup/NakedObjectsFramework [Immutable] class throws error when an action is invoked on it. - Issue: `https://github.com/NakedObjectsGroup/NakedObjectsFramework/issues/30`
PR: `https://github.com/NakedObjectsGroup/NakedObjectsFramework/commit/8f01544f1e85be80f8e835024a52be8ac5677c8e`
Multiple pull requests.
I'd put as misuse as they're marking mutable objects as `[Immutable]`.
|
process
|
nakedobjectsgroup nakedobjectsframework class throws error when an action is invoked on it issue pr multiple pull requests i d put as misuse as they re marking mutable objects as
| 1
|
286,072
| 21,562,823,670
|
IssuesEvent
|
2022-05-01 12:24:52
|
ISPP-Eventia/eventus
|
https://api.github.com/repos/ISPP-Eventia/eventus
|
opened
|
doc - Marketing Plan
|
documentation
|
## description
detailed description of what should be documented, how, and where.
## to-do
- [ ] Segmentation Model
- [ ] Profiles of the target viewers of the ads
- [ ] Generate hype (proposals and ideas, slogan, promotions, surprises)
- [ ] expected audience size
- [ ] Expected impressions
- [ ] Costs (ads, impressions, analysis, ...)
|
1.0
|
doc - Marketing Plan - ## description
detailed description of what should be documented, how, and where.
## to-do
- [ ] Segmentation Model
- [ ] Profiles of the target viewers of the ads
- [ ] Generate hype (proposals and ideas, slogan, promotions, surprises)
- [ ] expected audience size
- [ ] Expected impressions
- [ ] Costs (ads, impressions, analysis, ...)
|
non_process
|
doc marketing plan description detailed description of what should be documented how and where to do segmentation model profiles of the target viewers of the ads generate hype proposals and ideas slogan promotions surprises expected audience size expected impressions costs ads impressions analysis
| 0
|
460,356
| 13,208,661,137
|
IssuesEvent
|
2020-08-15 06:20:32
|
magento/magento2
|
https://api.github.com/repos/magento/magento2
|
closed
|
Price Filter Into Grid not working for specific data
|
Component: Filter Component: Widget Fixed in 2.4.x Issue: Confirmed Issue: Format is valid Issue: Ready for Work Priority: P2 Progress: ready for dev Reproduced on 2.4.x Severity: S2
|
<!---
Please review our guidelines before adding a new issue: https://github.com/magento/magento2/wiki/Issue-reporting-guidelines
Fields marked with (*) are required. Please don't remove the template.
-->
### Preconditions (*)
1. Magento versions: 2.3.4, 2.3.5, 2.4-develop
### Steps to reproduce (*)
1. Create Grid Block with deprecated functionality extended class Magento\Backend\Block\Widget\Grid\Extended
2. add column type price `
$this->addColumn(
'total_price',
[
'header' => __('Total Price'),
'index' => 'total_price',
'type' => 'price',
'currency_code' => (string)$this->_scopeConfig
->getValue(Currency::XML_PATH_CURRENCY_BASE),
]
);
`
3. apply a filter on this field. Example: '123a'
### Expected result (*)
1. apply filter price to 123 cost
### Actual result (*)
1. Throw exception `Notice: A non well formed numeric value encountered in vendor/magento/module-backend/Block/Widget/Grid/Column/Filter/Price.php on line 197` and redirect on some page
---
Please provide [Severity](https://devdocs.magento.com/guides/v2.3/contributor-guide/contributing.html#backlog) assessment for the Issue as Reporter. This information will help during Confirmation and Issue triage processes.
- [ ] Severity: **S0** _- Affects critical data or functionality and leaves users without workaround._
- [ ] Severity: **S1** _- Affects critical data or functionality and forces users to employ a workaround._
- [ ] Severity: **S2** _- Affects non-critical data or functionality and forces users to employ a workaround._
- [ ] Severity: **S3** _- Affects non-critical data or functionality and does not force users to employ a workaround._
- [ ] Severity: **S4** _- Affects aesthetics, professional look and feel, “quality” or “usability”._
|
1.0
|
Price Filter Into Grid not working for specific data - <!---
Please review our guidelines before adding a new issue: https://github.com/magento/magento2/wiki/Issue-reporting-guidelines
Fields marked with (*) are required. Please don't remove the template.
-->
### Preconditions (*)
1. Magento versions: 2.3.4, 2.3.5, 2.4-develop
### Steps to reproduce (*)
1. Create Grid Block with deprecated functionality extended class Magento\Backend\Block\Widget\Grid\Extended
2. add column type price `
$this->addColumn(
'total_price',
[
'header' => __('Total Price'),
'index' => 'total_price',
'type' => 'price',
'currency_code' => (string)$this->_scopeConfig
->getValue(Currency::XML_PATH_CURRENCY_BASE),
]
);
`
3. apply a filter on this field. Example: '123a'
### Expected result (*)
1. apply filter price to 123 cost
### Actual result (*)
1. Throw exception `Notice: A non well formed numeric value encountered in vendor/magento/module-backend/Block/Widget/Grid/Column/Filter/Price.php on line 197` and redirect on some page
---
Please provide [Severity](https://devdocs.magento.com/guides/v2.3/contributor-guide/contributing.html#backlog) assessment for the Issue as Reporter. This information will help during Confirmation and Issue triage processes.
- [ ] Severity: **S0** _- Affects critical data or functionality and leaves users without workaround._
- [ ] Severity: **S1** _- Affects critical data or functionality and forces users to employ a workaround._
- [ ] Severity: **S2** _- Affects non-critical data or functionality and forces users to employ a workaround._
- [ ] Severity: **S3** _- Affects non-critical data or functionality and does not force users to employ a workaround._
- [ ] Severity: **S4** _- Affects aesthetics, professional look and feel, “quality” or “usability”._
|
non_process
|
price filter into grid not working for specific data please review our guidelines before adding a new issue fields marked with are required please don t remove the template preconditions magento versions develop steps to reproduce create grid block with deprecated functionality extended class magento backend block widget grid extended add column type price this addcolumn total price header total price index total price type price currency code string this scopeconfig getvalue currency xml path currency base apply a filter on this field example expected result apply filter price to cost actual result throw exception notice a non well formed numeric value encountered in vendor magento module backend block widget grid column filter price php on line and redirect on some page please provide assessment for the issue as reporter this information will help during confirmation and issue triage processes severity affects critical data or functionality and leaves users without workaround severity affects critical data or functionality and forces users to employ a workaround severity affects non critical data or functionality and forces users to employ a workaround severity affects non critical data or functionality and does not force users to employ a workaround severity affects aesthetics professional look and feel “quality” or “usability”
| 0
|
8,579
| 11,747,171,742
|
IssuesEvent
|
2020-03-12 13:11:04
|
pwittchen/gesture
|
https://api.github.com/repos/pwittchen/gesture
|
opened
|
release 0.1.1
|
release process
|
- [ ] bump library version
- [ ] update JavaDoc on gh-pages
- [ ] upload Archives to Maven Central Repository
- [ ] close and release artifact on Nexus
- [ ] update `CHANGELOG.md` after Maven Sync
- [ ] update download section in `README.md` after Maven Sync
- [ ] create new GitHub release
|
1.0
|
release 0.1.1 - - [ ] bump library version
- [ ] update JavaDoc on gh-pages
- [ ] upload Archives to Maven Central Repository
- [ ] close and release artifact on Nexus
- [ ] update `CHANGELOG.md` after Maven Sync
- [ ] update download section in `README.md` after Maven Sync
- [ ] create new GitHub release
|
process
|
release bump library version update javadoc on gh pages upload archives to maven central repository close and release artifact on nexus update changelog md after maven sync update download section in readme md after maven sync create new github release
| 1
|
39,838
| 6,776,914,627
|
IssuesEvent
|
2017-10-27 19:48:40
|
nicolargo/glances
|
https://api.github.com/repos/nicolargo/glances
|
closed
|
unable to install with pip
|
documentation install
|
works: (sort of)
```
pip install glances --user
glances
Error while initializing the sensors plugin ([Errno 19] No such device)
# but will load
```
doesn't work:
```
pip install glances[action,browser,cloud,cpuinfo,chart,docker,export,folders,gpu,ip,raid,snmp,web,wifi] --user
no matches found: glances[action,browser,cloud,cpuinfo,chart,docker,export,folders,gpu,ip,raid,snmp,web,wifi]
```
any ideas?
```
pip -V
pip 9.0.1 from /home/vmi/.local/lib/python2.7/site-packages (python 2.7)
lsb_release -a -u
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 16.04.2 LTS
Release: 16.04
Codename: xenial
```
|
1.0
|
unable to install with pip - works: (sort of)
```
pip install glances --user
glances
Error while initializing the sensors plugin ([Errno 19] No such device)
# but will load
```
doesn't work:
```
pip install glances[action,browser,cloud,cpuinfo,chart,docker,export,folders,gpu,ip,raid,snmp,web,wifi] --user
no matches found: glances[action,browser,cloud,cpuinfo,chart,docker,export,folders,gpu,ip,raid,snmp,web,wifi]
```
any ideas?
```
pip -V
pip 9.0.1 from /home/vmi/.local/lib/python2.7/site-packages (python 2.7)
lsb_release -a -u
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 16.04.2 LTS
Release: 16.04
Codename: xenial
```
|
non_process
|
unable to install with pip works sort of pip install glances user glances error while initializing the sensors plugin no such device but will load doesn t work pip install glances user no matches found glances any ideas pip v pip from home vmi local lib site packages python lsb release a u no lsb modules are available distributor id ubuntu description ubuntu lts release codename xenial
| 0
|
305,884
| 9,377,997,691
|
IssuesEvent
|
2019-04-04 11:48:50
|
IBM/watson-assistant-workbench
|
https://api.github.com/repos/IBM/watson-assistant-workbench
|
opened
|
Improve error message in CF test
|
CI Priority: medium discussion functions
|
`2019-04-03 12:19:16,369 functions_test.py ERROR Unexpected response status: 202, response: b'{"activationId": "06617ed8e894483ea17ed8e894183eff"}'` in build https://travis-ci.com/IBM/watson-assistant-workbench/builds/106894871 is probably caused by long response time of function.
|
1.0
|
Improve error message in CF test - `2019-04-03 12:19:16,369 functions_test.py ERROR Unexpected response status: 202, response: b'{"activationId": "06617ed8e894483ea17ed8e894183eff"}'` in build https://travis-ci.com/IBM/watson-assistant-workbench/builds/106894871 is probably caused by long response time of function.
|
non_process
|
improve error message in cf test functions test py error unexpected response status response b activationid in build is probably caused by long response time of function
| 0
|
2,920
| 5,914,496,302
|
IssuesEvent
|
2017-05-22 03:10:57
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
parallel/test-process-setuid-setgid fails if "nobody" user does not exist
|
process test
|
<!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
-->
* **Version**: 7.10.0
* **Platform**: Linux
* **Subsystem**: test
<!-- Enter your issue details below this comment. -->
Some isolated build or test VMs don't have a "nobody" user, causing the parallel/test-process-setuid-setgid test to fail.
|
1.0
|
parallel/test-process-setuid-setgid fails if "nobody" user does not exist - <!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
-->
* **Version**: 7.10.0
* **Platform**: Linux
* **Subsystem**: test
<!-- Enter your issue details below this comment. -->
Some isolated build or test VMs don't have a "nobody" user, causing the parallel/test-process-setuid-setgid test to fail.
|
process
|
parallel test process setuid setgid fails if nobody user does not exist thank you for reporting an issue this issue tracker is for bugs and issues found within node js core if you require more general support please file an issue on our help repo please fill in as much of the template below as you re able version platform linux subsystem test some isolated build or test vms don t have a nobody user causing the parallel test process setuid setgid test to fail
| 1
|