**Column summary**

| column | dtype | stats |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 to 19 |
| repo | stringlengths | 5 to 112 |
| repo_url | stringlengths | 34 to 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 to 757 |
| labels | stringlengths | 4 to 664 |
| body | stringlengths | 3 to 261k |
| index | stringclasses | 10 values |
| text_combine | stringlengths | 96 to 261k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 to 232k |
| binary_label | int64 | 0 to 1 |
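In every record below the two-class `label` column ("defect" / "non_defect") and the integer `binary_label` column move together. A minimal sketch of that mapping (an assumption inferred from the records, not the dataset's published preprocessing code):

```python
# Assumed mapping between the dataset's `label` and `binary_label` columns:
# "defect" -> 1, "non_defect" -> 0 (consistent with every record below).
def to_binary_label(label: str) -> int:
    return 1 if label == "defect" else 0

labels = ["non_defect", "defect", "non_defect"]
binary = [to_binary_label(lbl) for lbl in labels]
print(binary)  # -> [0, 1, 0]
```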
**Record (Unnamed: 0 = 326,984)**
- id: 28,036,026,679
- type: IssuesEvent
- created_at: 2023-03-28 15:07:54
- repo: unifyai/ivy
- repo_url: https://api.github.com/repos/unifyai/ivy
- action: reopened
- title: Fix comparison.test_numpy_greater_equal
- labels: NumPy Frontend Sub Task Failing Test
- text_combine: title + " - " + body
- index: 1.0
- label: non_defect
- binary_label: 0

body:

| backend | status |
|---|---|
| tensorflow | <a href="https://github.com/unifyai/ivy/actions/runs/4530890837/jobs/7980387749" rel="noopener noreferrer" target="_blank"><img src="https://img.shields.io/badge/-failure-red"></a> |
| torch | <a href="https://github.com/unifyai/ivy/actions/runs/4530890837/jobs/7980387749" rel="noopener noreferrer" target="_blank"><img src="https://img.shields.io/badge/-failure-red"></a> |
| numpy | <a href="https://github.com/unifyai/ivy/actions/runs/4530890837/jobs/7980387749" rel="noopener noreferrer" target="_blank"><img src="https://img.shields.io/badge/-failure-red"></a> |
| jax | <a href="https://github.com/unifyai/ivy/actions/runs/4530890837/jobs/7980387749" rel="noopener noreferrer" target="_blank"><img src="https://img.shields.io/badge/-failure-red"></a> |

<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_numpy/test_logic/test_comparison.py::test_numpy_greater_equal[cpu-ivy.functional.backends.jax-False-False]</summary>

```
IndexError: list index out of range
Falsifying example: test_numpy_greater_equal(
    dtypes_values_casting=(['uint32', 'uint32'],
                           [array(0, dtype=uint32), array(0, dtype=uint32)],
                           'no',
                           None),
    where=[array(False)],
    test_flags=FrontendFunctionTestFlags(
        num_positional_args=2,
        with_out=False,
        inplace=False,
        as_variable=[False],
        native_arrays=[False],
        generate_frontend_arrays=True,
    ),
    fn_tree='ivy.functional.frontends.numpy.greater_equal',
    on_device='cpu',
    frontend='numpy',
)

You can reproduce this example by temporarily adding
@reproduce_failure('6.70.0', b'AXicY2BgYGBkgBFgBgAALQAE')
as a decorator on your test case
```
</details>

text:

fix comparison test numpy greater equal tensorflow img src torch img src numpy img src jax img src failed ivy tests test ivy test frontends test numpy test logic test comparison py test numpy greater equal e indexerror list index out of range e falsifying example test numpy greater equal e dtypes values casting e e no e none e where e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e generate frontend arrays true e e fn tree ivy functional frontends numpy greater equal e on device cpu e frontend numpy e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case
**Record (Unnamed: 0 = 35,699)**
- id: 6,491,960,028
- type: IssuesEvent
- created_at: 2017-08-21 11:28:50
- repo: angular/angular-cli
- repo_url: https://api.github.com/repos/angular/angular-cli
- action: closed
- title: Wiki: add link from ng serve page to proxy-config page
- labels: community: help wanted type: documentation
- text_combine: title + " - " + body
- index: 1.0
- label: non_defect
- binary_label: 0

body:

### Bug Report or Feature Request (mark with an `x`)
```
- [ ] bug report -> please search issues before submitting
- [ ] feature request
- [ x ] documentation change (github wiki)
```
Since I can't do a pull request regarding the wiki... This page https://github.com/angular/angular-cli/wiki/serve should include information about the option `--proxy-config` documented here: https://github.com/angular/angular-cli/blob/master/docs/documentation/stories/proxy.md

text:

wiki add link from ng serve page to proxy config page bug report or feature request mark with an x bug report please search issues before submitting feature request documentation change github wiki since i can t do a pull request regarding the wiki this page should include information about the option proxy config documented here
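Comparing `text_combine` with `text` across these records suggests the `text` column was produced by lowercasing, dropping URLs, and replacing digits and punctuation with spaces before collapsing whitespace. A rough sketch of such a normalizer (an assumption; the dataset's actual preprocessing code is not shown here):

```python
import re

def normalize(s: str) -> str:
    """Approximate the dataset's `text` column: lowercase, no URLs,
    no digits or punctuation, single spaces."""
    s = s.lower()
    s = re.sub(r"https?://\S+", " ", s)    # drop URLs
    s = re.sub(r"[^a-z]+", " ", s)         # keep letters only
    return re.sub(r"\s+", " ", s).strip()  # collapse whitespace

print(normalize("Wiki: add link from ng serve page to proxy-config page"))
# -> "wiki add link from ng serve page to proxy config page"
```

On the record above this reproduces the stored `text` value exactly, but it is only a sketch of the apparent transformation, not the canonical pipeline.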
**Record (Unnamed: 0 = 79,616)**
- id: 28,479,545,854
- type: IssuesEvent
- created_at: 2023-04-18 00:39:03
- repo: zed-industries/community
- repo_url: https://api.github.com/repos/zed-industries/community
- action: closed
- title: Zed won't open
- labels: defect panic / crash
- text_combine: title + " - " + body
- index: 1.0
- label: defect
- binary_label: 1

body:

### Check for existing issues
- [X] Completed

### Describe the bug / provide steps to reproduce it
After downloading both Zed 0.80.5 and Zed 0.79.1 (just to be sure, I tried different versions) and installing (opening the .dmg file and dragging to Applications), Zed will not open. I have clicked through from the Applications gui and tried to open using the `zed` binary under `Applications/Zed.app/Contents/MacOS`. In the terminal, I see the following:

```
>RUST_BACKTRACE=full ./zed
thread 'main' panicked at 'could not create config path: Os { code: 13, kind: PermissionDenied, message: "Permission denied" }', crates/zed/src/main.rs:283:56
stack backtrace:
   0: 0x106be6bc4 - <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt::h3c406a4521928e59
   1: 0x106c054b8 - core::fmt::write::hc60df9bae744c40c
   2: 0x106be0a68 - std::io::Write::write_fmt::h1c4129dfc94f7c33
   3: 0x106be69d8 - std::sys_common::backtrace::print::h34db077f1fa49c76
   4: 0x106be8530 - std::panicking::default_hook::{{closure}}::hff4fe3239c020cef
   5: 0x106be8288 - std::panicking::default_hook::hd2988fbcc86fdc46
   6: 0x106be8b54 - std::panicking::rust_panic_with_hook::h01730ad11d62092c
   7: 0x106be8974 - std::panicking::begin_panic_handler::{{closure}}::h30f454e305d4a708
   8: 0x106be702c - std::sys_common::backtrace::__rust_end_short_backtrace::hb7ff1894d55794a1
   9: 0x106be86d0 - _rust_begin_unwind
  10: 0x106d938c0 - core::panicking::panic_fmt::hf1070b3fc33229fa
  11: 0x106d93bf8 - core::result::unwrap_failed::h334387d5f1f555c1
  12: 0x104add188 - Zed::main::h3ede68c77edf0427
  13: 0x104aef790 - std::sys_common::backtrace::__rust_begin_short_backtrace::hb0c48201b725886f
  14: 0x104ac56e0 - std::rt::lang_start::{{closure}}::h231bd61fbeca8b79
  15: 0x106bdaee0 - std::rt::lang_start_internal::h24d09c16ec322e75
  16: 0x104adf460 - _main
```

### Environment
New Apple M2 mac book pro. Cannot open zed, hence cannot run copy system specs from the command palette.

### If applicable, add mockups / screenshots to help explain present your vision of the feature
n/a

### If applicable, attach your `~/Library/Logs/Zed/Zed.log` file to this issue. If you only need the most recent lines, you can run the `zed: open log` command palette action to see the last 1000.
```
>cd ~/Library/Logs/Zed
cd: no such file or directory: /Users/jps/Library/Logs/Zed
```

text:

zed won t open check for existing issues completed describe the bug provide steps to reproduce it after downloading both zed and zed just to be sure i tried different versions and installing opening the dmg file and dragging to applications zed will not open i have clicked through from the applications gui and tried to open using the zed binary under applications zed app contents macos in the terminal i see the following rust backtrace full zed thread main panicked at could not create config path os code kind permissiondenied message permission denied crates zed src main rs stack backtrace fmt core fmt write std io write write fmt std sys common backtrace print std panicking default hook closure std panicking default hook std panicking rust panic with hook std panicking begin panic handler closure std sys common backtrace rust end short backtrace rust begin unwind core panicking panic fmt core result unwrap failed zed main std sys common backtrace rust begin short backtrace std rt lang start closure std rt lang start internal main environment new apple mac book pro cannot open zed hence cannot run copy system specs from the command palette if applicable add mockups screenshots to help explain present your vision of the feature n a if applicable attach your library logs zed zed log file to this issue if you only need the most recent lines you can run the zed open log command palette action to see the last cd library logs zed cd no such file or directory users jps library logs zed
**Record (Unnamed: 0 = 23,078)**
- id: 20,980,510,802
- type: IssuesEvent
- created_at: 2022-03-28 19:25:23
- repo: zcash/zcash
- repo_url: https://api.github.com/repos/zcash/zcash
- action: opened
- title: UX: zcashd: UIWarning() should also log to debug.log
- labels: usability A-logging
- text_combine: title + " - " + body
- index: True
- label: non_defect
- binary_label: 0

body:

I'm getting this warning on the UI (metrics screen) when running `zcashd`:
```
Messages:
- Warning: Error reading wallet.dat! All keys read correctly, but transaction data or address book entries might be missing or incorrect.
```
but nothing is logged to `debug.log` for this condition. Perhaps this and possibly other warnings should appear in the log file. More generally, maybe `UIWarning()` should write its argument to `debug.log` (although we'd want to be careful not to log too much).

text:

ux zcashd uiwarning should also log to debug log i m getting this warning on the ui metrics screen when running zcashd messages warning error reading wallet dat all keys read correctly but transaction data or address book entries might be missing or incorrect but nothing is logged to debug log for this condition perhaps this and possibly other warnings should appear in the log file more generally maybe uiwarning should write its argument to debug log although we d want to be careful not to log too much
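The request in the record above is for warnings to reach both the UI and the log file. A minimal dual-sink sketch (hypothetical Python; zcashd's actual `UIWarning()` is C++, and these names are illustrative only):

```python
import logging

# In-memory stand-ins for the two sinks: the metrics-screen UI
# and a debug.log-style logger.
ui_messages = []
log_lines = []

class ListHandler(logging.Handler):
    """Collects formatted log records in a list (stands in for debug.log)."""
    def emit(self, record):
        log_lines.append(self.format(record))

logger = logging.getLogger("debug.log-sketch")
logger.setLevel(logging.WARNING)
logger.addHandler(ListHandler())

def ui_warning(msg: str) -> None:
    """Show the warning in the UI *and* write it to the log."""
    ui_messages.append(msg)  # UI sink (metrics screen)
    logger.warning(msg)      # log sink (debug.log)

ui_warning("Error reading wallet.dat! Transaction data may be missing.")
```

The guard against over-logging mentioned in the issue (rate limiting or deduplication) would sit inside `ui_warning` before the log call.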
**Record (Unnamed: 0 = 185,511)**
- id: 15,024,021,473
- type: IssuesEvent
- created_at: 2021-02-01 19:02:34
- repo: gessnerfl/terraform-provider-instana
- repo_url: https://api.github.com/repos/gessnerfl/terraform-provider-instana
- action: closed
- title: Application perspetive to be added as arguments.
- labels: documentation question
- text_combine: title + " - " + body
- index: 1.0
- label: non_defect
- binary_label: 0

body:

Hey @gessnerfl doing great work, wanted to have application perspective to be added to instana_alerting_config, Because i can see no argument under alert config so need to add that in your repo to support adding application perspective for our alerts. we are able to use DFQ,all entities option, but not the Application perspective. how to go with that, can we get any help from here

text:

application perspetive to be added as arguments hey gessnerfl doing great work wanted to have application perspective to be added to instana alerting config because i can see no argument under alert config so need to add that in your repo to support adding application perspective for our alerts we are able to use dfq all entities option but not the application perspective how to go with that can we get any help from here
**Record (Unnamed: 0 = 764)**
- id: 3,250,901,201
- type: IssuesEvent
- created_at: 2015-10-19 06:01:30
- repo: t3kt/vjzual2
- repo_url: https://api.github.com/repos/t3kt/vjzual2
- action: opened
- title: create a 3d color-based tiling module
- labels: enhancement video processing
- text_combine: title + " - " + body
- index: 1.0
- label: non_defect
- binary_label: 0

body:

also support using an external source for the tiling data

text:

create a color based tiling module also support using an external source for the tiling data
**Record (Unnamed: 0 = 88,245)**
- id: 10,567,414,342
- type: IssuesEvent
- created_at: 2019-10-06 03:56:35
- repo: ShivamSharma779/The-Rainbow
- repo_url: https://api.github.com/repos/ShivamSharma779/The-Rainbow
- action: closed
- title: Beauty Of Rainbow
- labels: documentation good first issue help wanted
- text_combine: title + " - " + body
- index: 1.0
- label: non_defect
- binary_label: 0

body:

Could you please be more creative with the **Beauty of the rainbow** ![fr (1)](https://user-images.githubusercontent.com/46594668/66263687-87645400-e814-11e9-81ef-b1fa45e43fca.jpg)

text:

beauty of rainbow could you please be more creative with the beauty of the rainbow
**Record (Unnamed: 0 = 42,809)**
- id: 11,273,854,222
- type: IssuesEvent
- created_at: 2020-01-14 17:19:55
- repo: mozilla-lockwise/lockwise-android
- repo_url: https://api.github.com/repos/mozilla-lockwise/lockwise-android
- action: closed
- title: The password is not auto-hidden when creating a new entry
- labels: closed-invalid defect feature-CUD
- text_combine: title + " - " + body
- index: 1.0
- label: defect
- binary_label: 1

body:

## Steps to reproduce
1. Launch lockwise.
2. Login with valid credentials.
3. On the entries, section tap on the `+` button in order to create a new entry.
4. Add the password for the account.

### Expected behavior
The password should be hidden while typing until the user wants to reveal it.

### Actual behavior
The password is not hidden when creating a new entry.

### Device & build information
* Device: **Google Pixel 3a XL(Android 10).**
* Build version: **Latest master version 3.3.0 from 1/13.**

### Notes
Attachments: ![ezgif-2-d5d5e2f8dfaf](https://user-images.githubusercontent.com/42831109/72264253-18707580-3623-11ea-913a-697c88e16d0e.gif)

text:

the password is not auto hidden when creating a new entry steps to reproduce launch lockwise login with valid credentials on the entries section tap on the button in order to create a new entry add the password for the account expected behavior the password should be hidden while typing until the user wants to reveal it actual behavior the password is not hidden when creating a new entry device build information device google pixel xl android build version latest master version from notes attachments
**Record (Unnamed: 0 = 54,380)**
- id: 13,634,617,282
- type: IssuesEvent
- created_at: 2020-09-25 00:23:00
- repo: SAP/fundamental-styles
- repo_url: https://api.github.com/repos/SAP/fundamental-styles
- action: closed
- title: Defect Hunting 0.12.0
- labels: 0.12.0 Bug Defect Hunting
- text_combine: title + " - " + body
- index: 1.0
- label: defect

body:

- [x] Button not inverted on RTL - Wrong Icon - https://github.com/SAP/fundamental-styles/pull/1694 ![image](https://user-images.githubusercontent.com/26483208/92732713-1bb92e00-f377-11ea-84c4-20a44d74ffde.png)
- [x] Breadcrump iframe Height with opened popover - https://github.com/SAP/fundamental-styles/pull/1694 ![image](https://user-images.githubusercontent.com/26483208/92733036-7f435b80-f377-11ea-8e92-633266cc3829.png)
- [x] empty checkbox focus double outline (example form cards with table) - https://github.com/SAP/fundamental-styles/pull/1662 ![image](https://user-images.githubusercontent.com/26483208/92733213-b7e33500-f377-11ea-83fb-e601e40c72b0.png)
- [x] Form Grid - Inputs/Labels got redundant ID/for attributes - Jedrzej
- [x] Form Messages - Popovers don't open - https://github.com/SAP/fundamental-styles/pull/1694
- [x] Feed Input - button - https://github.com/SAP/fundamental-styles/pull/1694 ![image](https://user-images.githubusercontent.com/26483208/92918653-ea506900-f42f-11ea-950b-99b59412d550.png)
- [ ] Feed input - RTL button ![image](https://user-images.githubusercontent.com/26483208/92918718-f9371b80-f42f-11ea-9697-e69c82408a2f.png)
- [x] File Uploader - Doesn't react on click, brings console error - https://github.com/SAP/fundamental-styles/pull/1694
- [x] Generic Tile - actions - https://github.com/SAP/fundamental-styles/pull/1698 ![image](https://user-images.githubusercontent.com/26483208/92918955-2b487d80-f430-11ea-91a4-82b4214a79bb.png)
- [x] List - checkboxes - Repeated ID/For attributes on RTL [#1693](https://github.com/SAP/fundamental-styles/pull/1693)
- [x] Side Navigation - nested list button with multiple focus outlines [#1691](https://github.com/SAP/fundamental-styles/pull/1691) ![image](https://user-images.githubusercontent.com/26483208/93889300-51e89d00-fce9-11ea-8a8c-78d345a37340.png)

text:

defect hunting button not inverted on rtl wrong icon breadcrump iframe height with opened popover empty checkbox focus double outline example form cards with table form grid inputs labels got redundant id for attributes jedrzej form messages popovers don t open feed input button feed input rtl button file uploader doesn t react on click brings console error generic tile actions list checkboxes repeated id for attributes on rtl side navigation nested list button with multiple focus outlines
1
43,274
7,035,814,555
IssuesEvent
2017-12-28 03:30:33
litehelpers/Cordova-sqlcipher-adapter
https://api.github.com/repos/litehelpers/Cordova-sqlcipher-adapter
closed
Ratchet CSS can't display correctly and cant navigate to other page
documentation invalid question user community help
Hi, I am using cordova to build a mobile application but i having a problem with navigation/link to other pages. Is there any suggestion for me. Thanks in advance [https://stackoverflow.com/questions/47439120/ratchet-css-cant-display-correctly-and-cant-navigate-to-other-page](url)
1.0
Ratchet CSS can't display correctly and cant navigate to other page - Hi, I am using cordova to build a mobile application but i having a problem with navigation/link to other pages. Is there any suggestion for me. Thanks in advance [https://stackoverflow.com/questions/47439120/ratchet-css-cant-display-correctly-and-cant-navigate-to-other-page](url)
non_defect
ratchet css can t display correctly and cant navigate to other page hi i am using cordova to build a mobile application but i having a problem with navigation link to other pages is there any suggestion for me thanks in advance url
0
38,017
8,634,509,534
IssuesEvent
2018-11-22 17:06:01
hazelcast/hazelcast
https://api.github.com/repos/hazelcast/hazelcast
closed
The querying performance drop when running inside OSGI
Module: IMap Module: Query Source: Internal Team: Core Type: Perf. Defect
A problem seems to be related to the Java Reflection info caching implemented in `com.hazelcast.query.impl.getters.Extractors`: ``` Getter getGetter(InternalSerializationService serializationService, Object targetObject, String attributeName) { Getter getter = getterCache.getGetter(targetObject.getClass(), attributeName); if (getter == null) { getter = instantiateGetter(serializationService, targetObject, attributeName); if (getter.isCacheable()) { getterCache.putGetter(targetObject.getClass(), attributeName, getter); } } return getter; } ``` For OSGi environments the given method will be returning `false` because obviously Hazelcast and the POJOs are in separate bundles, thus having separate class loaders: ``` boolean isCacheable() { return ReflectionHelper.THIS_CL.equals(method.getDeclaringClass().getClassLoader()); } ``` It looks like this was introduced with [this commit]( https://github.com/hazelcast/hazelcast/commit/57394ddb59a521190e3a7533c656af9f03a28526) as a fix for: https://github.com/hazelcast/hazelcast/issues/1503 Please reach out to me for more details.
1.0
The querying performance drop when running inside OSGI - A problem seems to be related to the Java Reflection info caching implemented in `com.hazelcast.query.impl.getters.Extractors`: ``` Getter getGetter(InternalSerializationService serializationService, Object targetObject, String attributeName) { Getter getter = getterCache.getGetter(targetObject.getClass(), attributeName); if (getter == null) { getter = instantiateGetter(serializationService, targetObject, attributeName); if (getter.isCacheable()) { getterCache.putGetter(targetObject.getClass(), attributeName, getter); } } return getter; } ``` For OSGi environments the given method will be returning `false` because obviously Hazelcast and the POJOs are in separate bundles, thus having separate class loaders: ``` boolean isCacheable() { return ReflectionHelper.THIS_CL.equals(method.getDeclaringClass().getClassLoader()); } ``` It looks like this was introduced with [this commit]( https://github.com/hazelcast/hazelcast/commit/57394ddb59a521190e3a7533c656af9f03a28526) as a fix for: https://github.com/hazelcast/hazelcast/issues/1503 Please reach out to me for more details.
defect
the querying performance drop when running inside osgi a problem seems to be related to the java reflection info caching implemented in com hazelcast query impl getters extractors getter getgetter internalserializationservice serializationservice object targetobject string attributename getter getter gettercache getgetter targetobject getclass attributename if getter null getter instantiategetter serializationservice targetobject attributename if getter iscacheable gettercache putgetter targetobject getclass attributename getter return getter for osgi environments the given method will be returning false because obviously hazelcast and the pojos are in separate bundles thus having separate class loaders boolean iscacheable return reflectionhelper this cl equals method getdeclaringclass getclassloader it looks like this was introduced with as a fix for please reach out to me for more details
1
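The Hazelcast record above describes a getter cache keyed on (class, attribute) that refuses to store entries when the getter's declaring class comes from a different class loader, so OSGi deployments rebuild the getter on every query. A minimal Python sketch of that cache-eligibility pattern (names like `GetterCache` and `resolve_getter` are illustrative, not Hazelcast's actual API):

```python
# Illustrative sketch of a memoizing getter cache with an eligibility
# check, modeled loosely on the getGetter() logic quoted in the record.
# All names here are hypothetical; this is not Hazelcast's API.

class GetterCache:
    def __init__(self):
        self._cache = {}  # (type, attribute) -> getter callable

    def get(self, cls, attribute):
        return self._cache.get((cls, attribute))

    def put(self, cls, attribute, getter):
        self._cache[(cls, attribute)] = getter


def make_getter(attribute):
    # Build a getter once; this stands in for the reflective lookup.
    return lambda obj: getattr(obj, attribute)


def resolve_getter(cache, target, attribute, cacheable):
    # Mirrors getGetter(): consult the cache, build on miss, and only
    # store the result when the eligibility check passes. When it never
    # passes (the OSGi case in the record), every call rebuilds the getter.
    getter = cache.get(type(target), attribute)
    if getter is None:
        getter = make_getter(attribute)
        if cacheable:
            cache.put(type(target), attribute, getter)
    return getter
```

When `cacheable` is always false, analogous to the class-loader mismatch under OSGi, the cache stays empty and every lookup pays the construction cost again, which matches the querying performance drop the record reports.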
61,711
7,495,729,222
IssuesEvent
2018-04-08 00:32:03
jfurrow/flood
https://api.github.com/repos/jfurrow/flood
closed
Right click context menu breaks at browser bottom edge
bug design
## Summary When right clicking on torrents too close to the bottom of the list, the context menu appears off the top of the view. ## Expected Behavior When the context menu would appear too close to the bottom of the screen to be practical, it should begin to be nudged upwards relative to the click location, until when the very last list element is pressed, the entire menu is above the click location. ## Current Behavior If more than four list elements are trying to overflow in the context menu, it is pushed to the top off the browser window, and is mostly obscured ![cursor](https://user-images.githubusercontent.com/4299422/38445266-0e0b5fc2-39b0-11e8-8702-e7d8d11f65cc.png) ## Possible Solution Should adopt the standard functionality in right-click context menus ## Steps to Reproduce (for bugs) 1. Re-size main window so that there are torrents extending to the bottom of the browser window 2. Right-click the last torrent in the list to bring up its context menu 3. Observe context menu behavior ## Context Makes the context menu impossible to use for the last few torrents in the list ## Your Environment * Version used: + Version (stable release) `v1.0.0` + Commit ID (development release) `293aa11d512ee089f1d559b24e6f8bbe707cfb8d` * Environment name and version + Node `v9.0.0` + Chrome `65.0.3325.181 (Official Build) (64-bit)` + OS `Debian GNU/Linux 9.4` * Operating System and version: + macOS `10.13.4`
1.0
Right click context menu breaks at browser bottom edge - ## Summary When right clicking on torrents too close to the bottom of the list, the context menu appears off the top of the view. ## Expected Behavior When the context menu would appear too close to the bottom of the screen to be practical, it should begin to be nudged upwards relative to the click location, until when the very last list element is pressed, the entire menu is above the click location. ## Current Behavior If more than four list elements are trying to overflow in the context menu, it is pushed to the top off the browser window, and is mostly obscured ![cursor](https://user-images.githubusercontent.com/4299422/38445266-0e0b5fc2-39b0-11e8-8702-e7d8d11f65cc.png) ## Possible Solution Should adopt the standard functionality in right-click context menus ## Steps to Reproduce (for bugs) 1. Re-size main window so that there are torrents extending to the bottom of the browser window 2. Right-click the last torrent in the list to bring up its context menu 3. Observe context menu behavior ## Context Makes the context menu impossible to use for the last few torrents in the list ## Your Environment * Version used: + Version (stable release) `v1.0.0` + Commit ID (development release) `293aa11d512ee089f1d559b24e6f8bbe707cfb8d` * Environment name and version + Node `v9.0.0` + Chrome `65.0.3325.181 (Official Build) (64-bit)` + OS `Debian GNU/Linux 9.4` * Operating System and version: + macOS `10.13.4`
non_defect
right click context menu breaks at browser bottom edge summary when right clicking on torrents too close to the bottom of the list the context menu appears off the top of the view expected behavior when the context menu would appear too close to the bottom of the screen to be practical it should begin to be nudged upwards relative to the click location until when the very last list element is pressed the entire menu is above the click location current behavior if more than four list elements are trying to overflow in the context menu it is pushed to the top off the browser window and is mostly obscured possible solution should adopt the standard functionality in right click context menus steps to reproduce for bugs re size main window so that there are torrents extending to the bottom of the browser window right click the last torrent in the list to bring up its context menu observe context menu behavior context makes the context menu impossible to use for the last few torrents in the list your environment version used version stable release commit id development release environment name and version node chrome official build bit os debian gnu linux operating system and version macos
0
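The expected behavior in the Flood record above, nudging the context menu upward near the bottom edge until the whole menu sits above the click, amounts to clamping the menu's top coordinate to the viewport. A hedged sketch of that clamping arithmetic (not Flood's actual code):

```python
def context_menu_top(click_y, menu_height, viewport_height):
    """Place the menu at the click, nudging it up just enough to keep
    it fully on screen; at the very bottom edge the entire menu ends
    up above the click point, as the record's expected behavior asks.
    """
    top = click_y
    overflow = (top + menu_height) - viewport_height
    if overflow > 0:
        top -= overflow          # nudge upward by the overflow amount
    return max(top, 0)           # never push the menu above the viewport
```

A menu taller than the viewport simply pins to the top; the progressive "nudge" the record asks for falls out of subtracting only the overflow rather than flipping the menu wholesale.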
684,549
23,422,137,057
IssuesEvent
2022-08-13 21:35:27
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
Developers targeting browser-wasm can use Web Crypto APIs
arch-wasm area-System.Security User Story Priority:1 Cost:L Team:Libraries
We want to avoid shipping OpenSSL for Browser as that’s not something that aligns well with web nature of WebAssembly as well as it has noticeable impacts on the size of the final app. This would also save us from having to deal with zero-day vulnerabilities as well as support for crypto algorithms which are only secure if they use special CPU instructions (RNG and AES for example). Using the platform native crypto functions is the preferred solution as it does not have any noticeable size and it’s also the most performant solution. All browser have nowadays support for Crypto APIs which we should be able to use to implement the core crypto functions required by BCL. Relevant documentation can be found at https://developer.mozilla.org/en-US/docs/Web/API/SubtleCrypto. The tricky part will be to deal with the async nature of these APIs but we could introduce new async APIs to make the integration easier. .NET has support for old/obscure algorithms and also other features like certificates which are not relevant in browser space and for them, we would keep throwing PNSE. ### Design Proposal The design proposal for how we will enable cryptographic algorithms in .NET for WebAssembly can be found here: https://github.com/dotnet/designs/blob/main/accepted/2021/blazor-wasm-crypto.md We intend on providing the most secure implementations of cryptographic algorithms when available. On browsers which support `SharedArrayBuffer`, we will utilize that as a synchronization mechanism to perform a sync-over-async operation, letting the browser's secure `SubtleCrypto` implementation act as our cryptographic primitive. On browsers which do not support `SharedArrayBuffer`, we will fall back to in-box managed algorithm implementations of these primitives. Where managed algorithm implementations are needed, they can be migrated from the .NET Framework Reference Source. 
* https://github.com/microsoft/referencesource/tree/master/mscorlib/system/security/cryptography * https://github.com/microsoft/referencesource/tree/master/System.Core/System/Security/Cryptography ### Work Items - [x] SHA1 - [x] SHA256 - [x] SHA384 - [x] SHA512 - [x] HMACSHA1 - [x] HMACSHA256 - [x] HMACSHA384 - [x] HMACSHA512 - [x] AES-CBC - [x] AES-ECB and AES-CFB should throw PlatformNotSupportedException, as they are not available in webcrypto - [x] PBKDF2 (via [Rfc2898DeriveBytes](https://docs.microsoft.com/dotnet/api/system.security.cryptography.rfc2898derivebytes)) - [x] HKDF ### Acceptance Criteria - [ ] Integrate into arch-wasm infrastructure - [ ] Ensure a productive development/testing workflow with WebAssembly - [ ] Get all existing unit tests for the algorithms passing in WebAssembly - [ ] Look for opportunities to improve argument validation - [ ] Look for opportunities to modernize the code, such as using Span - [ ] Validate end-to-end scenarios in desktop browsers that support `SharedArrayBuffer` - [ ] Validate end-to-end scenarios in mobile browsers that support `SharedArrayBuffer` - [ ] Validate end-to-end scenarios in desktop browsers that do not support `SharedArrayBuffer` - [ ] Validate end-to-end scenarios in mobile browsers that do not support `SharedArrayBuffer` ### Issues to address - [ ] https://github.com/dotnet/runtime/issues/69741 - [x] https://github.com/dotnet/runtime/issues/69740 - [x] https://github.com/dotnet/runtime/issues/69806
1.0
Developers targeting browser-wasm can use Web Crypto APIs - We want to avoid shipping OpenSSL for Browser as that’s not something that aligns well with web nature of WebAssembly as well as it has noticeable impacts on the size of the final app. This would also save us from having to deal with zero-day vulnerabilities as well as support for crypto algorithms which are only secure if they use special CPU instructions (RNG and AES for example). Using the platform native crypto functions is the preferred solution as it does not have any noticeable size and it’s also the most performant solution. All browser have nowadays support for Crypto APIs which we should be able to use to implement the core crypto functions required by BCL. Relevant documentation can be found at https://developer.mozilla.org/en-US/docs/Web/API/SubtleCrypto. The tricky part will be to deal with the async nature of these APIs but we could introduce new async APIs to make the integration easier. .NET has support for old/obscure algorithms and also other features like certificates which are not relevant in browser space and for them, we would keep throwing PNSE. ### Design Proposal The design proposal for how we will enable cryptographic algorithms in .NET for WebAssembly can be found here: https://github.com/dotnet/designs/blob/main/accepted/2021/blazor-wasm-crypto.md We intend on providing the most secure implementations of cryptographic algorithms when available. On browsers which support `SharedArrayBuffer`, we will utilize that as a synchronization mechanism to perform a sync-over-async operation, letting the browser's secure `SubtleCrypto` implementation act as our cryptographic primitive. On browsers which do not support `SharedArrayBuffer`, we will fall back to in-box managed algorithm implementations of these primitives. Where managed algorithm implementations are needed, they can be migrated from the .NET Framework Reference Source. 
* https://github.com/microsoft/referencesource/tree/master/mscorlib/system/security/cryptography * https://github.com/microsoft/referencesource/tree/master/System.Core/System/Security/Cryptography ### Work Items - [x] SHA1 - [x] SHA256 - [x] SHA384 - [x] SHA512 - [x] HMACSHA1 - [x] HMACSHA256 - [x] HMACSHA384 - [x] HMACSHA512 - [x] AES-CBC - [x] AES-ECB and AES-CFB should throw PlatformNotSupportedException, as they are not available in webcrypto - [x] PBKDF2 (via [Rfc2898DeriveBytes](https://docs.microsoft.com/dotnet/api/system.security.cryptography.rfc2898derivebytes)) - [x] HKDF ### Acceptance Criteria - [ ] Integrate into arch-wasm infrastructure - [ ] Ensure a productive development/testing workflow with WebAssembly - [ ] Get all existing unit tests for the algorithms passing in WebAssembly - [ ] Look for opportunities to improve argument validation - [ ] Look for opportunities to modernize the code, such as using Span - [ ] Validate end-to-end scenarios in desktop browsers that support `SharedArrayBuffer` - [ ] Validate end-to-end scenarios in mobile browsers that support `SharedArrayBuffer` - [ ] Validate end-to-end scenarios in desktop browsers that do not support `SharedArrayBuffer` - [ ] Validate end-to-end scenarios in mobile browsers that do not support `SharedArrayBuffer` ### Issues to address - [ ] https://github.com/dotnet/runtime/issues/69741 - [x] https://github.com/dotnet/runtime/issues/69740 - [x] https://github.com/dotnet/runtime/issues/69806
non_defect
developers targeting browser wasm can use web crypto apis we want to avoid shipping openssl for browser as that’s not something that aligns well with web nature of webassembly as well as it has noticeable impacts on the size of the final app this would also save us from having to deal with zero day vulnerabilities as well as support for crypto algorithms which are only secure if they use special cpu instructions rng and aes for example using the platform native crypto functions is the preferred solution as it does not have any noticeable size and it’s also the most performant solution all browser have nowadays support for crypto apis which we should be able to use to implement the core crypto functions required by bcl relevant documentation can be found at the tricky part will be to deal with the async nature of these apis but we could introduce new async apis to make the integration easier net has support for old obscure algorithms and also other features like certificates which are not relevant in browser space and for them we would keep throwing pnse design proposal the design proposal for how we will enable cryptographic algorithms in net for webassembly can be found here we intend on providing the most secure implementations of cryptographic algorithms when available on browsers which support sharedarraybuffer we will utilize that as a synchronization mechanism to perform a sync over async operation letting the browser s secure subtlecrypto implementation act as our cryptographic primitive on browsers which do not support sharedarraybuffer we will fall back to in box managed algorithm implementations of these primitives where managed algorithm implementations are needed they can be migrated from the net framework reference source work items aes cbc aes ecb and aes cfb should throw platformnotsupportedexception as they are not available in webcrypto via hkdf acceptance criteria integrate into arch wasm infrastructure ensure a productive development testing 
workflow with webassembly get all existing unit tests for the algorithms passing in webassembly look for opportunities to improve argument validation look for opportunities to modernize the code such as using span validate end to end scenarios in desktop browsers that support sharedarraybuffer validate end to end scenarios in mobile browsers that support sharedarraybuffer validate end to end scenarios in desktop browsers that do not support sharedarraybuffer validate end to end scenarios in mobile browsers that do not support sharedarraybuffer issues to address
0
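The sync-over-async trick the dotnet/runtime record above describes, a blocking wait on a shared buffer while another context drives the async operation to completion, can be sketched in Python with a thread and an event standing in for the web worker and `Atomics.wait`. This is an analogy only; the real browser mechanism is `SharedArrayBuffer` plus `Atomics`, with `SubtleCrypto` doing the work:

```python
import threading

def sync_over_async(async_op):
    """Run async_op on another thread and block until it signals done,
    mirroring how a Wasm thread can block on Atomics.wait while a web
    worker drives the browser's async crypto promise to completion.
    """
    done = threading.Event()
    result = {}

    def worker():
        result["value"] = async_op()   # the "async" work happens here
        done.set()                     # analogous to Atomics.notify

    threading.Thread(target=worker).start()
    done.wait()                        # synchronous caller blocks here
    return result["value"]
```

The fallback path in the record (browsers without `SharedArrayBuffer`) avoids this machinery entirely by running managed algorithm implementations inline.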
68,968
22,035,810,962
IssuesEvent
2022-05-28 15:03:32
mautrix/telegram
https://api.github.com/repos/mautrix/telegram
closed
Images with captions don't store Matrix event ID of caption
requires db update bug: defect
Currently the database can only store one event ID per Telegram message (except for edits). However, images with captions are bridged as two messages. This means that: * Replying to the image event from Matrix won't be bridged correctly * Deleting the image event from Matrix won't be bridged correctly * Deleting the message from Telegram will only delete the caption Bridging image messages as inline images is already supported by the bridge, but not supported by most clients. The preferred solution would be implementing matrix-org/matrix-doc#2530 everywhere, but alternatively the database could be updated to support a many-to-one mapping.
1.0
Images with captions don't store Matrix event ID of caption - Currently the database can only store one event ID per Telegram message (except for edits). However, images with captions are bridged as two messages. This means that: * Replying to the image event from Matrix won't be bridged correctly * Deleting the image event from Matrix won't be bridged correctly * Deleting the message from Telegram will only delete the caption Bridging image messages as inline images is already supported by the bridge, but not supported by most clients. The preferred solution would be implementing matrix-org/matrix-doc#2530 everywhere, but alternatively the database could be updated to support a many-to-one mapping.
defect
images with captions don t store matrix event id of caption currently the database can only store one event id per telegram message except for edits however images with captions are bridged as two messages this means that replying to the image event from matrix won t be bridged correctly deleting the image event from matrix won t be bridged correctly deleting the message from telegram will only delete the caption bridging image messages as inline images is already supported by the bridge but not supported by most clients the preferred solution would be implementing matrix org matrix doc everywhere but alternatively the database could be updated to support a many to one mapping
1
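The mautrix/telegram record above asks for a many-to-one mapping so that one Telegram message can own several Matrix event IDs (the image event plus the caption event). A minimal sketch of such a lookup table (hypothetical names, not the bridge's actual schema):

```python
from collections import defaultdict

class MessageMap:
    """Many Matrix event IDs per Telegram message ID, so deleting the
    Telegram message can remove both the image and the caption events
    rather than only the caption, as the record describes."""

    def __init__(self):
        self._tg_to_mx = defaultdict(list)   # telegram id -> [matrix event ids]
        self._mx_to_tg = {}                  # matrix event id -> telegram id

    def link(self, tg_id, mx_event_id):
        self._tg_to_mx[tg_id].append(mx_event_id)
        self._mx_to_tg[mx_event_id] = tg_id

    def matrix_events(self, tg_id):
        return list(self._tg_to_mx.get(tg_id, []))

    def telegram_id(self, mx_event_id):
        # Replies/redactions from Matrix resolve through this direction,
        # so either bridged event maps back to the same Telegram message.
        return self._mx_to_tg.get(mx_event_id)

    def unlink_telegram(self, tg_id):
        # Deleting on the Telegram side removes *all* bridged events.
        events = self._tg_to_mx.pop(tg_id, [])
        for ev in events:
            self._mx_to_tg.pop(ev, None)
        return events
```

With this shape, replying to or redacting either the image or the caption event resolves to the same Telegram message, covering the three broken cases the record lists.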
391,824
11,579,096,147
IssuesEvent
2020-02-21 17:11:06
googleapis/nodejs-pubsub
https://api.github.com/repos/googleapis/nodejs-pubsub
closed
"Failed to connect before the deadline" errors being seen
api: pubsub external priority: p2 type: bug
Since upgrading to the latest pubsub (`"@google-cloud/pubsub": "1.1.5"`) my node is still getting occasional errors from the underlying grpc library. My app is using grpc-js@0.6.9: $ npm ls | grep grpc-js │ │ ├─┬ @grpc/grpc-js@0.6.9 It's not as bad as it was. When I first upgraded to pubsub 1.x it simply didn't work at all, I'd get an error within a few seconds of the app starting and from then no messages would get delivered. Now the problem is much less bad (my app is working), but I still see the following errors in my logs: From `subscription.on(`error`, (err) => {` I see this: {"code":4,"details":"Failed to connect before the deadline","metadata":{"internalRepr":{},"options":{}}} And in my logs I've also noticed these (excuse the formatting): `Error: Failed to add metadata entry @���: Wed, 13 Nov 2019 07:13:16 GMT. Metadata key "@���" contains illegal characters at validate (/home/nodeapp/apps/crypto-trader-prototype/shared-code/node_modules/@grpc/grpc-js/build/src/metadata.js:35:15) at Metadata.add (/home/nodeapp/apps/crypto-trader-prototype/shared-code/node_modules/@grpc/grpc-js/build/src/metadata.js:87:9) at values.split.forEach.v (/home/nodeapp/apps/crypto-trader-prototype/shared-code/node_modules/@grpc/grpc-js/build/src/metadata.js:226:63) at Array.forEach (<anonymous>) at Object.keys.forEach.key (/home/nodeapp/apps/crypto-trader-prototype/shared-code/node_modules/@grpc/grpc-js/build/src/metadata.js:226:43) at Array.forEach (<anonymous>) at Function.fromHttp2Headers (/home/nodeapp/apps/crypto-trader-prototype/shared-code/node_modules/@grpc/grpc-js/build/src/metadata.js:200:30) at ClientHttp2Stream.stream.on (/home/nodeapp/apps/crypto-trader-prototype/shared-code/node_modules/@grpc/grpc-js/build/src/call-stream.js:211:56) at ClientHttp2Stream.emit (events.js:193:13) at emit (internal/http2/core.js:244:8) at processTicksAndRejections (internal/process/task_queues.js:81:17)` .. but the two don't seem to be related. 
I will start running with the debug flags enabled and report back
1.0
"Failed to connect before the deadline" errors being seen - Since upgrading to the latest pubsub (`"@google-cloud/pubsub": "1.1.5"`) my node is still getting occasional errors from the underlying grpc library. My app is using grpc-js@0.6.9: $ npm ls | grep grpc-js │ │ ├─┬ @grpc/grpc-js@0.6.9 It's not as bad as it was. When I first upgraded to pubsub 1.x it simply didn't work at all, I'd get an error within a few seconds of the app starting and from then no messages would get delivered. Now the problem is much less bad (my app is working), but I still see the following errors in my logs: From `subscription.on(`error`, (err) => {` I see this: {"code":4,"details":"Failed to connect before the deadline","metadata":{"internalRepr":{},"options":{}}} And in my logs I've also noticed these (excuse the formatting): `Error: Failed to add metadata entry @���: Wed, 13 Nov 2019 07:13:16 GMT. Metadata key "@���" contains illegal characters at validate (/home/nodeapp/apps/crypto-trader-prototype/shared-code/node_modules/@grpc/grpc-js/build/src/metadata.js:35:15) at Metadata.add (/home/nodeapp/apps/crypto-trader-prototype/shared-code/node_modules/@grpc/grpc-js/build/src/metadata.js:87:9) at values.split.forEach.v (/home/nodeapp/apps/crypto-trader-prototype/shared-code/node_modules/@grpc/grpc-js/build/src/metadata.js:226:63) at Array.forEach (<anonymous>) at Object.keys.forEach.key (/home/nodeapp/apps/crypto-trader-prototype/shared-code/node_modules/@grpc/grpc-js/build/src/metadata.js:226:43) at Array.forEach (<anonymous>) at Function.fromHttp2Headers (/home/nodeapp/apps/crypto-trader-prototype/shared-code/node_modules/@grpc/grpc-js/build/src/metadata.js:200:30) at ClientHttp2Stream.stream.on (/home/nodeapp/apps/crypto-trader-prototype/shared-code/node_modules/@grpc/grpc-js/build/src/call-stream.js:211:56) at ClientHttp2Stream.emit (events.js:193:13) at emit (internal/http2/core.js:244:8) at processTicksAndRejections (internal/process/task_queues.js:81:17)` .. 
but the two don't seem to be related. I will start running with the debug flags enabled and report back
non_defect
failed to connect before the deadline errors being seen since upgrading to the latest pubsub google cloud pubsub my node is still getting occasional errors from the underlying grpc library my app is using grpc js npm ls grep grpc js │ │ ├─┬ grpc grpc js it s not as bad as it was when i first upgraded to pubsub x it simply didn t work at all i d get an error within a few seconds of the app starting and from then no messages would get delivered now the problem is much less bad my app is working but i still see the following errors in my logs from subscription on error err i see this code details failed to connect before the deadline metadata internalrepr options and in my logs i ve also noticed these excuse the formatting error failed to add metadata entry ��� wed nov gmt metadata key ��� contains illegal characters at validate home nodeapp apps crypto trader prototype shared code node modules grpc grpc js build src metadata js at metadata add home nodeapp apps crypto trader prototype shared code node modules grpc grpc js build src metadata js at values split foreach v home nodeapp apps crypto trader prototype shared code node modules grpc grpc js build src metadata js at array foreach at object keys foreach key home nodeapp apps crypto trader prototype shared code node modules grpc grpc js build src metadata js at array foreach at function home nodeapp apps crypto trader prototype shared code node modules grpc grpc js build src metadata js at stream on home nodeapp apps crypto trader prototype shared code node modules grpc grpc js build src call stream js at emit events js at emit internal core js at processticksandrejections internal process task queues js but the two don t seem to be related i will start running with the debug flags enabled and report back
0
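The pubsub record above shows grpc-js rejecting a metadata key containing garbled non-ASCII bytes ("contains illegal characters" from `metadata.js validate()`). A hedged sketch of that style of key check; the character set here (lowercase ASCII letters, digits, `_`, `-`, `.`) is my reading of the gRPC metadata rules, so treat it as an assumption rather than grpc-js's exact regex:

```python
import re

# Approximation of the key validation the record's stack trace points
# at: keys must be non-empty and limited to lowercase ASCII letters,
# digits, '_', '-' and '.'. The exact rule is an assumption here;
# consult the gRPC metadata specification for the authoritative set.
_LEGAL_KEY = re.compile(r"^[0-9a-z_.-]+$")

def is_legal_metadata_key(key):
    return bool(_LEGAL_KEY.match(key))
```

A key like the record's `@���` (an `@` followed by U+FFFD replacement characters) fails this check, which is why the `Metadata.add` call throws before the entry is ever sent.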
313
3,050,734,831
IssuesEvent
2015-08-12 01:13:32
mediapublic/mediapublic
https://api.github.com/repos/mediapublic/mediapublic
opened
Add remaining endpoints to pyramid server
architecture coding
We currently have: /users /user_types /recording_categories /organizations /blogs We are missing: /people /recordings /howtos /playlists I believe these were not implemented by @ryansb during the re-org due to a lack of understanding of what my initial "vision".
1.0
Add remaining endpoints to pyramid server - We currently have: /users /user_types /recording_categories /organizations /blogs We are missing: /people /recordings /howtos /playlists I believe these were not implemented by @ryansb during the re-org due to a lack of understanding of what my initial "vision".
non_defect
add remaining endpoints to pyramid server we currently have users user types recording categories organizations blogs we are missing people recordings howtos playlists i believe these were not implemented by ryansb during the re org due to a lack of understanding of what my initial vision
0
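The mediapublic record above is essentially a set difference between the endpoints the API needs and the ones already implemented. That comparison can be sketched directly, with the endpoint lists taken verbatim from the record:

```python
# Endpoints listed in the record as already present vs. still missing.
implemented = {"/users", "/user_types", "/recording_categories",
               "/organizations", "/blogs"}
required = implemented | {"/people", "/recordings", "/howtos", "/playlists"}

# The work remaining is just the set difference, sorted for stable output.
missing = sorted(required - implemented)
```

Keeping the required list in code like this makes the gap checkable in CI instead of living only in an issue description.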
813,801
30,473,317,878
IssuesEvent
2023-07-17 14:54:02
broadinstitute/gnomad-browser
https://api.github.com/repos/broadinstitute/gnomad-browser
closed
Update "Feedback" page
Priority: High
<!-- * Please describe the feature you would like added and how it will help you. Please be detailed: the more information you provide about the desired feature, and the potential use cases, the more likely we are to be able to implement it. --> The "Feedback" [page](https://gnomad.broadinstitute.org/feedback) is currently not very informative. Would it be possible to update the page to add more information? Here are a few suggested edits: - Update the name of the page from "Feedback" to "Contact" (this page is more about broadly how to contact us than give feedback specifically) - Add text to point users to our Help/Docs & FAQ page and a note about individual level data, maybe: > For questions about gnomAD, check out the [Docs & FAQ](https://gnomad.broadinstitute.org/help). > > Note that, for many reasons (including consent and data usage restrictions), we do not have (and cannot share) phenotype information. Overall, we have limited information that we can share for some cohorts, such as last known age in bins of 5 years (when known) and chromosomal sex. > > Report data problems or feature suggestions by [email](mailto:gnomad@broadinstitute.org). - Update the text about reporting errors in the website to avoid pointing users to the email inbox: > Report errors in the website on [GitHub](https://github.com/broadinstitute/gnomad-browser/issues).
1.0
Update "Feedback" page - <!-- * Please describe the feature you would like added and how it will help you. Please be detailed: the more information you provide about the desired feature, and the potential use cases, the more likely we are to be able to implement it. --> The "Feedback" [page](https://gnomad.broadinstitute.org/feedback) is currently not very informative. Would it be possible to update the page to add more information? Here are a few suggested edits: - Update the name of the page from "Feedback" to "Contact" (this page is more about broadly how to contact us than give feedback specifically) - Add text to point users to our Help/Docs & FAQ page and a note about individual level data, maybe: > For questions about gnomAD, check out the [Docs & FAQ](https://gnomad.broadinstitute.org/help). > > Note that, for many reasons (including consent and data usage restrictions), we do not have (and cannot share) phenotype information. Overall, we have limited information that we can share for some cohorts, such as last known age in bins of 5 years (when known) and chromosomal sex. > > Report data problems or feature suggestions by [email](mailto:gnomad@broadinstitute.org). - Update the text about reporting errors in the website to avoid pointing users to the email inbox: > Report errors in the website on [GitHub](https://github.com/broadinstitute/gnomad-browser/issues).
non_defect
update feedback page please describe the feature you would like added and how it will help you please be detailed the more information you provide about the desired feature and the potential use cases the more likely we are to be able to implement it the feedback is currently not very informative would it be possible to update the page to add more information here are a few suggested edits update the name of the page from feedback to contact this page is more about broadly how to contact us than give feedback specifically add text to point users to our help docs faq page and a note about individual level data maybe for questions about gnomad check out the note that for many reasons including consent and data usage restrictions we do not have and cannot share phenotype information overall we have limited information that we can share for some cohorts such as last known age in bins of years when known and chromosomal sex report data problems or feature suggestions by mailto gnomad broadinstitute org update the text about reporting errors in the website to avoid pointing users to the email inbox report errors in the website on
0
594,133
18,024,567,531
IssuesEvent
2021-09-17 01:34:20
GoogleCloudPlatform/java-docs-samples
https://api.github.com/repos/GoogleCloudPlatform/java-docs-samples
closed
com.example.dataflow.SpannerReadIT: readTableEndToEnd failed
type: bug priority: p1 api: dataflow samples flakybot: issue
This test failed! To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/main/packages/flakybot). If I'm commenting on this issue too often, add the `flakybot: quiet` label and I will stop commenting. --- commit: 51bc9349b73520baa0180c8201fe6560798cf861 buildURL: [Build Status](https://source.cloud.google.com/results/invocations/71a84563-4015-4f92-a157-0f0b856f0f03), [Sponge](http://sponge2/71a84563-4015-4f92-a157-0f0b856f0f03) status: failed <details><summary>Test output</summary><br><pre>org.apache.beam.sdk.Pipeline$PipelineExecutionException: com.google.cloud.spanner.SpannerException: DEADLINE_EXCEEDED: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 29.999642728s. [remote_addr=batch-spanner.googleapis.com/74.125.197.95:443] at com.example.dataflow.SpannerReadIT.readTableEndToEnd(SpannerReadIT.java:181) Caused by: com.google.cloud.spanner.SpannerException: DEADLINE_EXCEEDED: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 29.999642728s. [remote_addr=batch-spanner.googleapis.com/74.125.197.95:443] Caused by: java.util.concurrent.ExecutionException: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 29.999642728s. [remote_addr=batch-spanner.googleapis.com/74.125.197.95:443] Caused by: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 29.999642728s. [remote_addr=batch-spanner.googleapis.com/74.125.197.95:443] Caused by: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 29.999642728s. [remote_addr=batch-spanner.googleapis.com/74.125.197.95:443] </pre></details>
1.0
com.example.dataflow.SpannerReadIT: readTableEndToEnd failed - This test failed! To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/main/packages/flakybot). If I'm commenting on this issue too often, add the `flakybot: quiet` label and I will stop commenting. --- commit: 51bc9349b73520baa0180c8201fe6560798cf861 buildURL: [Build Status](https://source.cloud.google.com/results/invocations/71a84563-4015-4f92-a157-0f0b856f0f03), [Sponge](http://sponge2/71a84563-4015-4f92-a157-0f0b856f0f03) status: failed <details><summary>Test output</summary><br><pre>org.apache.beam.sdk.Pipeline$PipelineExecutionException: com.google.cloud.spanner.SpannerException: DEADLINE_EXCEEDED: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 29.999642728s. [remote_addr=batch-spanner.googleapis.com/74.125.197.95:443] at com.example.dataflow.SpannerReadIT.readTableEndToEnd(SpannerReadIT.java:181) Caused by: com.google.cloud.spanner.SpannerException: DEADLINE_EXCEEDED: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 29.999642728s. [remote_addr=batch-spanner.googleapis.com/74.125.197.95:443] Caused by: java.util.concurrent.ExecutionException: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 29.999642728s. [remote_addr=batch-spanner.googleapis.com/74.125.197.95:443] Caused by: com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 29.999642728s. [remote_addr=batch-spanner.googleapis.com/74.125.197.95:443] Caused by: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 29.999642728s. [remote_addr=batch-spanner.googleapis.com/74.125.197.95:443] </pre></details>
non_defect
com example dataflow spannerreadit readtableendtoend failed this test failed to configure my behavior see if i m commenting on this issue too often add the flakybot quiet label and i will stop commenting commit buildurl status failed test output org apache beam sdk pipeline pipelineexecutionexception com google cloud spanner spannerexception deadline exceeded com google api gax rpc deadlineexceededexception io grpc statusruntimeexception deadline exceeded deadline exceeded after at com example dataflow spannerreadit readtableendtoend spannerreadit java caused by com google cloud spanner spannerexception deadline exceeded com google api gax rpc deadlineexceededexception io grpc statusruntimeexception deadline exceeded deadline exceeded after caused by java util concurrent executionexception com google api gax rpc deadlineexceededexception io grpc statusruntimeexception deadline exceeded deadline exceeded after caused by com google api gax rpc deadlineexceededexception io grpc statusruntimeexception deadline exceeded deadline exceeded after caused by io grpc statusruntimeexception deadline exceeded deadline exceeded after
0
20,295
3,331,545,040
IssuesEvent
2015-11-11 16:16:39
xmindltd/xmind
https://api.github.com/repos/xmindltd/xmind
closed
Please help me to come out from this issues
auto-migrated Priority-Medium Type-Defect
``` What steps will reproduce the problem? 1. I am trying to develop plugin for xmind like same as previously available but can you guide me how can i develop it. which type of plugin i have to create. What is the expected output? What do you see instead? Now i am creating view plugin but I want to create plugin which is already available in xmind ``` Original issue reported on code.google.com by `somani.a...@gmail.com` on 7 Sep 2013 at 6:26
1.0
Please help me to come out from this issues - ``` What steps will reproduce the problem? 1. I am trying to develop plugin for xmind like same as previously available but can you guide me how can i develop it. which type of plugin i have to create. What is the expected output? What do you see instead? Now i am creating view plugin but I want to create plugin which is already available in xmind ``` Original issue reported on code.google.com by `somani.a...@gmail.com` on 7 Sep 2013 at 6:26
defect
please help me to come out from this issues what steps will reproduce the problem i am trying to develop plugin for xmind like same as previously available but can you guide me how can i develop it which type of plugin i have to create what is the expected output what do you see instead now i am creating view plugin but i want to create plugin which is already available in xmind original issue reported on code google com by somani a gmail com on sep at
1
50,884
13,187,938,821
IssuesEvent
2020-08-13 05:05:09
icecube-trac/tix3
https://api.github.com/repos/icecube-trac/tix3
closed
[iceprod2] multiple errors in functions.compress/uncompress (Trac #1602)
Migrated from Trac defect iceprod
Something must have happened, but most of the compression tests are failing, at least on my laptop. <details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1602">https://code.icecube.wisc.edu/ticket/1602</a>, reported by david.schultz and owned by david.schultz</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:11:28", "description": "Something must have happened, but most of the compression tests are failing, at least on my laptop.", "reporter": "david.schultz", "cc": "", "resolution": "fixed", "_ts": "1550067088921308", "component": "iceprod", "summary": "[iceprod2] multiple errors in functions.compress/uncompress", "priority": "critical", "keywords": "", "time": "2016-03-23T20:28:54", "milestone": "", "owner": "david.schultz", "type": "defect" } ``` </p> </details>
1.0
[iceprod2] multiple errors in functions.compress/uncompress (Trac #1602) - Something must have happened, but most of the compression tests are failing, at least on my laptop. <details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1602">https://code.icecube.wisc.edu/ticket/1602</a>, reported by david.schultz and owned by david.schultz</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:11:28", "description": "Something must have happened, but most of the compression tests are failing, at least on my laptop.", "reporter": "david.schultz", "cc": "", "resolution": "fixed", "_ts": "1550067088921308", "component": "iceprod", "summary": "[iceprod2] multiple errors in functions.compress/uncompress", "priority": "critical", "keywords": "", "time": "2016-03-23T20:28:54", "milestone": "", "owner": "david.schultz", "type": "defect" } ``` </p> </details>
defect
multiple errors in functions compress uncompress trac something must have happened but most of the compression tests are failing at least on my laptop migrated from json status closed changetime description something must have happened but most of the compression tests are failing at least on my laptop reporter david schultz cc resolution fixed ts component iceprod summary multiple errors in functions compress uncompress priority critical keywords time milestone owner david schultz type defect
1
76,603
14,652,350,654
IssuesEvent
2020-12-28 01:21:30
joomla/joomla-cms
https://api.github.com/repos/joomla/joomla-cms
closed
Better SEO options in edit articles
J4 Re-evaluate No Code Attached Yet
I think this may improve the SEO option, grouped in one tab. ![screen shot 2017-05-05 at 11 20 34](https://issues.joomla.org/uploads/1/ba939b87f14827f9a64bf859ebc9b87c.png)
1.0
Better SEO options in edit articles - I think this may improve the SEO option, grouped in one tab. ![screen shot 2017-05-05 at 11 20 34](https://issues.joomla.org/uploads/1/ba939b87f14827f9a64bf859ebc9b87c.png)
non_defect
better seo options in edit articles i think this may improve the seo option grouped in one tab
0
61,491
17,023,706,612
IssuesEvent
2021-07-03 03:24:35
tomhughes/trac-tickets
https://api.github.com/repos/tomhughes/trac-tickets
closed
Wrong county/region detected
Component: nominatim Priority: major Resolution: worksforme Type: defect
**[Submitted to the original trac issue database at 7.49am, Thursday, 28th April 2011]** Hello, check this result: http://open.mapquestapi.com/nominatim/v1/details.php?place_id=2133055671 ``` Via Giovanni Bessarione (Type: highway:tertiary, way 109368643, 15, , Polygon, 0 GOTO) Mazara del Vallo (Type: boundary:administrative, relation 39318, 8, , Point, 0.0842660761759271 GOTO) Trapani (Type: boundary:administrative, relation 39166, 6, , Point, 0.201223571443735 GOTO) Isola Grande (Type: boundary:administrative, way 10364827, 4, , Point, 0.247943334970063 GOTO) 91026 (Type: place:postcode, , , Point, 0.0237432108093155 GOTO) Italia (Type: boundary:administrative, relation 365331, 2, , Point, 4.33148222531914 GOTO) ``` The admin_level=4 boundary detected is wrong. In fact, the nearest admin_level=4 is [http://www.openstreetmap.org/browse/way/33900429 the coastline], and should be detected as [http://www.openstreetmap.org/browse/relation/39152 relation "Sicilia"]. I can't exactly tell what is behaviour is due to. "Italia", for example, is a relation spread all over Italy, so I'd say that "Sicilia" would be detected too. Maybe it prefers nearest closed ways instead of nearest (not closed) ways? Kindly, David
1.0
Wrong county/region detected - **[Submitted to the original trac issue database at 7.49am, Thursday, 28th April 2011]** Hello, check this result: http://open.mapquestapi.com/nominatim/v1/details.php?place_id=2133055671 ``` Via Giovanni Bessarione (Type: highway:tertiary, way 109368643, 15, , Polygon, 0 GOTO) Mazara del Vallo (Type: boundary:administrative, relation 39318, 8, , Point, 0.0842660761759271 GOTO) Trapani (Type: boundary:administrative, relation 39166, 6, , Point, 0.201223571443735 GOTO) Isola Grande (Type: boundary:administrative, way 10364827, 4, , Point, 0.247943334970063 GOTO) 91026 (Type: place:postcode, , , Point, 0.0237432108093155 GOTO) Italia (Type: boundary:administrative, relation 365331, 2, , Point, 4.33148222531914 GOTO) ``` The admin_level=4 boundary detected is wrong. In fact, the nearest admin_level=4 is [http://www.openstreetmap.org/browse/way/33900429 the coastline], and should be detected as [http://www.openstreetmap.org/browse/relation/39152 relation "Sicilia"]. I can't exactly tell what is behaviour is due to. "Italia", for example, is a relation spread all over Italy, so I'd say that "Sicilia" would be detected too. Maybe it prefers nearest closed ways instead of nearest (not closed) ways? Kindly, David
defect
wrong county region detected hello check this result via giovanni bessarione type highway tertiary way polygon goto mazara del vallo type boundary administrative relation point goto trapani type boundary administrative relation point goto isola grande type boundary administrative way point goto type place postcode point goto italia type boundary administrative relation point goto the admin level boundary detected is wrong in fact the nearest admin level is and should be detected as i can t exactly tell what is behaviour is due to italia for example is a relation spread all over italy so i d say that sicilia would be detected too maybe it prefers nearest closed ways instead of nearest not closed ways kindly david
1
695,181
23,847,932,214
IssuesEvent
2022-09-06 15:20:56
valora-inc/wallet
https://api.github.com/repos/valora-inc/wallet
closed
[Query] While sending an invite link the URL appears as "https://vlra.app/invite" via application whereas URL seen on Figma screen is : "https://vlara.app/vDewFH..." which URL is correct ?
Priority: P3 wallet qa-report qa-query
Frequency : 100% Repro on build version: iOS alfajores build V 1.40.0, Android Alfajores build V 1.40.0 Repro on device version : Google Pixel 2XL(11.0) , iPhone 12(14.7.1) Pre-condition: 1] User must have downloaded Alfajores build 2] User must have restored account 3] User must be on homepage 4] User must be have some funds in the account 5] User must have provided access to contacts Repro Steps : 1] Tap on send button from homepage 2] Select a contact / enter valid mobile number (Which is non valora user) 3] User will be redirected to enter amount page and then on send invite page 4] tap on send invite link and share the message on 3rd party app 5] Observe the URL Query : URL shared via App is different as compared with Figma screen design when compared Expected Behavior : URL should appear as https://vlara.app/vDewFH. as seen in Figma screen Investigation : - Learn more link is not seen on send invite page as it is seen in Figma screen [attachment](https://www.figma.com/file/8ig95B4sfsYIDIpg3JGP03/Phone-Verification?node-id=1378%3A21270) Attachment : [URL.mp4](https://www.figma.com/file/8ig95B4sfsYIDIpg3JGP03/Phone-Verification?node-id=1378%3A21227) [Figma screen URL](https://www.figma.com/file/8ig95B4sfsYIDIpg3JGP03/Phone-Verification?node-id=1378%3A21227)
1.0
[Query] While sending an invite link the URL appears as "https://vlra.app/invite" via application whereas URL seen on Figma screen is : "https://vlara.app/vDewFH..." which URL is correct ? - Frequency : 100% Repro on build version: iOS alfajores build V 1.40.0, Android Alfajores build V 1.40.0 Repro on device version : Google Pixel 2XL(11.0) , iPhone 12(14.7.1) Pre-condition: 1] User must have downloaded Alfajores build 2] User must have restored account 3] User must be on homepage 4] User must be have some funds in the account 5] User must have provided access to contacts Repro Steps : 1] Tap on send button from homepage 2] Select a contact / enter valid mobile number (Which is non valora user) 3] User will be redirected to enter amount page and then on send invite page 4] tap on send invite link and share the message on 3rd party app 5] Observe the URL Query : URL shared via App is different as compared with Figma screen design when compared Expected Behavior : URL should appear as https://vlara.app/vDewFH. as seen in Figma screen Investigation : - Learn more link is not seen on send invite page as it is seen in Figma screen [attachment](https://www.figma.com/file/8ig95B4sfsYIDIpg3JGP03/Phone-Verification?node-id=1378%3A21270) Attachment : [URL.mp4](https://www.figma.com/file/8ig95B4sfsYIDIpg3JGP03/Phone-Verification?node-id=1378%3A21227) [Figma screen URL](https://www.figma.com/file/8ig95B4sfsYIDIpg3JGP03/Phone-Verification?node-id=1378%3A21227)
non_defect
while sending an invite link the url appears as via application whereas url seen on figma screen is which url is correct frequency repro on build version ios alfajores build v android alfajores build v repro on device version google pixel iphone pre condition user must have downloaded alfajores build user must have restored account user must be on homepage user must be have some funds in the account user must have provided access to contacts repro steps tap on send button from homepage select a contact enter valid mobile number which is non valora user user will be redirected to enter amount page and then on send invite page tap on send invite link and share the message on party app observe the url query url shared via app is different as compared with figma screen design when compared expected behavior url should appear as as seen in figma screen investigation learn more link is not seen on send invite page as it is seen in figma screen attachment
0
22,137
3,603,028,799
IssuesEvent
2016-02-03 17:33:06
bridgedotnet/Bridge
https://api.github.com/repos/bridgedotnet/Bridge
closed
Comments are stripped out of JavaScript output in Release mode
defect
When you build a **Bridge.sln** in `Release` mode. For example, **Testing\www\typescript\3_Classes.js** compared with the same file built in `Debug` mode have the lines get removed marked with the minus sign "-": ``` - /// <reference path="..\..\www\qunit\qunit.d.ts" /> - /// <reference path="..\..\www\typescriptjs\bridge.d.ts" /> - /// <reference path="..\..\www\typescriptjs\classes.d.ts" /> QUnit.module("TypeScript - Classes"); QUnit.test("Inheritance", function (assert) { var animal = new Classes.Animal.$constructor(); QUnit.deepEqual(animal.getName(), "Animal", "Animal name parameterless constructor"); animal = new Classes.Animal.constructor$1("A"); QUnit.deepEqual(animal.getName(), "A", "Animal name"); - // TODO #292 Should not require optional parameters QUnit.deepEqual(animal.move(), 1, "Animal move"); ```
1.0
Comments are stripped out of JavaScript output in Release mode - When you build a **Bridge.sln** in `Release` mode. For example, **Testing\www\typescript\3_Classes.js** compared with the same file built in `Debug` mode have the lines get removed marked with the minus sign "-": ``` - /// <reference path="..\..\www\qunit\qunit.d.ts" /> - /// <reference path="..\..\www\typescriptjs\bridge.d.ts" /> - /// <reference path="..\..\www\typescriptjs\classes.d.ts" /> QUnit.module("TypeScript - Classes"); QUnit.test("Inheritance", function (assert) { var animal = new Classes.Animal.$constructor(); QUnit.deepEqual(animal.getName(), "Animal", "Animal name parameterless constructor"); animal = new Classes.Animal.constructor$1("A"); QUnit.deepEqual(animal.getName(), "A", "Animal name"); - // TODO #292 Should not require optional parameters QUnit.deepEqual(animal.move(), 1, "Animal move"); ```
defect
comments are stripped out of javascript output in release mode when you build a bridge sln in release mode for example testing www typescript classes js compared with the same file built in debug mode have the lines get removed marked with the minus sign qunit module typescript classes qunit test inheritance function assert var animal new classes animal constructor qunit deepequal animal getname animal animal name parameterless constructor animal new classes animal constructor a qunit deepequal animal getname a animal name todo should not require optional parameters qunit deepequal animal move animal move
1
22,964
3,728,896,807
IssuesEvent
2016-03-07 03:45:36
rmjarvis/tmv
https://api.github.com/repos/rmjarvis/tmv
closed
SmallVector has extra 16 bytes of memory usage.
auto-migrated Priority-Medium Type-Defect
``` Gary Bernstein pointed out that SmallVector<double> takes 8*N + 16 bytes of storage, rather than the expected 8*N. SmallMatrix<double>, on the contrary does take 8*M*N bytes. The reason is that SmallVector erroneously has a virtual destructor, which it should not. The 16 bytes are the extra storage for the (useless) vtable. If the extra 16 bytes matter to you, you can remove the virtual specification from the SmallVector destructor. Line 188 in TMV_SmallVector.h. Otherwise, this will be fixed in the next release (0.73). ``` Original issue reported on code.google.com by `mikejarvis17@gmail.com` on 18 Apr 2014 at 2:23
1.0
SmallVector has extra 16 bytes of memory usage. - ``` Gary Bernstein pointed out that SmallVector<double> takes 8*N + 16 bytes of storage, rather than the expected 8*N. SmallMatrix<double>, on the contrary does take 8*M*N bytes. The reason is that SmallVector erroneously has a virtual destructor, which it should not. The 16 bytes are the extra storage for the (useless) vtable. If the extra 16 bytes matter to you, you can remove the virtual specification from the SmallVector destructor. Line 188 in TMV_SmallVector.h. Otherwise, this will be fixed in the next release (0.73). ``` Original issue reported on code.google.com by `mikejarvis17@gmail.com` on 18 Apr 2014 at 2:23
defect
smallvector has extra bytes of memory usage gary bernstein pointed out that smallvector takes n bytes of storage rather than the expected n smallmatrix on the contrary does take m n bytes the reason is that smallvector erroneously has a virtual destructor which it should not the bytes are the extra storage for the useless vtable if the extra bytes matter to you you can remove the virtual specification from the smallvector destructor line in tmv smallvector h otherwise this will be fixed in the next release original issue reported on code google com by gmail com on apr at
1
67,035
20,817,821,729
IssuesEvent
2022-03-18 12:24:03
vector-im/element-web
https://api.github.com/repos/vector-im/element-web
opened
Long names are displayed inconsistently in thread summary
T-Defect
### Steps to reproduce 1. Where are you starting? What can you see? -I change my display name to something long 2. What do you click? -I reply to a thread 3. More steps… -I reply to another thread with longer text ![Screenshot](https://user-images.githubusercontent.com/18530109/159001941-dd999680-a221-402a-a3ab-8146e1fb6f4f.png) ### Outcome #### What did you expect? I expected my name to be displayed consistently across all thread summaries. #### What happened instead? My display name is sometimes wrapped within the summary and sometimes only a part of my name is displayed. ### Operating system Windows 10 Home 21H1 ### Browser information Firefox 98.0.1 (64-bit) ### URL for webapp https://develop.element.io ### Application version Element version: 071a5410b88e-react-cdcf6d0fd182-js-779afbcb3992 Olm version: 3.2.8 ### Homeserver matrix.org ### Will you send logs? No
1.0
Long names are displayed inconsistently in thread summary - ### Steps to reproduce 1. Where are you starting? What can you see? -I change my display name to something long 2. What do you click? -I reply to a thread 3. More steps… -I reply to another thread with longer text ![Screenshot](https://user-images.githubusercontent.com/18530109/159001941-dd999680-a221-402a-a3ab-8146e1fb6f4f.png) ### Outcome #### What did you expect? I expected my name to be displayed consistently across all thread summaries. #### What happened instead? My display name is sometimes wrapped within the summary and sometimes only a part of my name is displayed. ### Operating system Windows 10 Home 21H1 ### Browser information Firefox 98.0.1 (64-bit) ### URL for webapp https://develop.element.io ### Application version Element version: 071a5410b88e-react-cdcf6d0fd182-js-779afbcb3992 Olm version: 3.2.8 ### Homeserver matrix.org ### Will you send logs? No
defect
long names are displayed inconsistently in thread summary steps to reproduce where are you starting what can you see i change my display name to something long what do you click i reply to a thread more steps… i reply to another thread with longer text outcome what did you expect i expected my name to be displayed consistently across all thread summaries what happened instead my display name is sometimes wrapped within the summary and sometimes only a part of my name is displayed operating system windows home browser information firefox bit url for webapp application version element version react js olm version homeserver matrix org will you send logs no
1
22,861
3,727,389,252
IssuesEvent
2016-03-06 08:04:54
godfather1103/mentohust
https://api.github.com/repos/godfather1103/mentohust
closed
尝试破解mentohust-rp的授权码,有买过他路由器的同学希望能提供支持,以打破不良卖家的垄断
auto-migrated Priority-Medium Type-Defect
``` 淘宝有一家mentohust-rp的卖家,218一台,现在想要破解mentohust-rp的注册码。但是反推加密算法需要很多信息,希望买过他路由器的同学可以将路由中的授权码、对应的锐捷账号、以及mac地址发到702130264@qq.com中 或者在下面留言,并备注您的邮箱 有成果后一定回馈提供信息者 ``` Original issue reported on code.google.com by `qu456...@gmail.com` on 18 Aug 2013 at 6:29
1.0
尝试破解mentohust-rp的授权码,有买过他路由器的同学希望能提供支持,以打破不良卖家的垄断 - ``` 淘宝有一家mentohust-rp的卖家,218一台,现在想要破解mentohust-rp的注册码。但是反推加密算法需要很多信息,希望买过他路由器的同学可以将路由中的授权码、对应的锐捷账号、以及mac地址发到702130264@qq.com中 或者在下面留言,并备注您的邮箱 有成果后一定回馈提供信息者 ``` Original issue reported on code.google.com by `qu456...@gmail.com` on 18 Aug 2013 at 6:29
defect
尝试破解mentohust rp的授权码,有买过他路由器的同学希望能提供支持,以打破不良卖家的垄断 淘宝有一家mentohust rp的卖家, ,现在想要破解mentohust rp的注册码。但是反推加密算法需要很多信息,希望买过他路由器的同学可以将路由中的授权码、对应的锐捷账号、以及mac地 qq com中 或者在下面留言,并备注您的邮箱 有成果后一定回馈提供信息者 original issue reported on code google com by gmail com on aug at
1
45,917
13,055,821,972
IssuesEvent
2020-07-30 02:50:20
icecube-trac/tix2
https://api.github.com/repos/icecube-trac/tix2
opened
libarchive review (Trac #261)
Incomplete Migration Migrated from Trac combo core defect
Migrated from https://code.icecube.wisc.edu/ticket/261 ```json { "status": "closed", "changetime": "2014-11-23T03:37:57", "description": "See: #IT282", "reporter": "nega", "cc": "", "resolution": "worksforme", "_ts": "1416713877165085", "component": "combo core", "summary": "libarchive review", "priority": "normal", "keywords": "libarchive", "time": "2011-05-11T20:39:48", "milestone": "", "owner": "olivas", "type": "defect" } ```
1.0
libarchive review (Trac #261) - Migrated from https://code.icecube.wisc.edu/ticket/261 ```json { "status": "closed", "changetime": "2014-11-23T03:37:57", "description": "See: #IT282", "reporter": "nega", "cc": "", "resolution": "worksforme", "_ts": "1416713877165085", "component": "combo core", "summary": "libarchive review", "priority": "normal", "keywords": "libarchive", "time": "2011-05-11T20:39:48", "milestone": "", "owner": "olivas", "type": "defect" } ```
defect
libarchive review trac migrated from json status closed changetime description see reporter nega cc resolution worksforme ts component combo core summary libarchive review priority normal keywords libarchive time milestone owner olivas type defect
1
86,024
24,744,625,261
IssuesEvent
2022-10-21 08:41:31
Software-Hardware-Codesign/AVR-Sandbox
https://api.github.com/repos/Software-Hardware-Codesign/AVR-Sandbox
opened
[WSL Batch Installer] Build a debian wsl installer for windows
build
[WSL Batch Installer] Create a wsl batch installer for windows subsystems for linux kernal and debian via windows batch files.
1.0
[WSL Batch Installer] Build a debian wsl installer for windows - [WSL Batch Installer] Create a wsl batch installer for windows subsystems for linux kernal and debian via windows batch files.
non_defect
build a debian wsl installer for windows create a wsl batch installer for windows subsystems for linux kernal and debian via windows batch files
0
794,356
28,032,877,079
IssuesEvent
2023-03-28 13:27:30
FTC7393/FtcRobotController
https://api.github.com/repos/FTC7393/FtcRobotController
closed
Wait for Fetcher not working in Auto
bug good first issue Auto Extremely high priority
With Fetcher Coordination turned on, the fetcher encoder does not match the requested position we send to it from the Auto opmode. This confused the "wait for fetcher" code and it frequently got stuck. We need to use a different equation to determine when the Fetcher is in a good position for grabbing. Either read the actual command back out, or go back to using the `fetcher.isDone` flag. We can change the margin of error used for the isDone flag if we want.
1.0
Wait for Fetcher not working in Auto - With Fetcher Coordination turned on, the fetcher encoder does not match the requested position we send to it from the Auto opmode. This confused the "wait for fetcher" code and it frequently got stuck. We need to use a different equation to determine when the Fetcher is in a good position for grabbing. Either read the actual command back out, or go back to using the `fetcher.isDone` flag. We can change the margin of error used for the isDone flag if we want.
non_defect
wait for fetcher not working in auto with fetcher coordination turned on the fetcher encoder does not match the requested position we send to it from the auto opmode this confused the wait for fetcher code and it frequently got stuck we need to use a different equation to determine when the fetcher is in a good position for grabbing either read the actual command back out or go back to using the fetcher isdone flag we can change the margin of error used for the isdone flag if we want
0
34,678
7,458,677,699
IssuesEvent
2018-03-30 11:42:14
kerdokullamae/test_koik_issued
https://api.github.com/repos/kerdokullamae/test_koik_issued
closed
Lisada CS portaali abitekstid
C: AVAR P: highest R: fixed T: defect
**Reported by sven syld on 29 Jan 2015 09:50 UTC** '''Ülesannete loetelu ehk anna.ra.ee/et/task/list/''' Siin lehel on kuvatud kõik panustamiseks avatud ülesanded. Enne tööleasumist soovitame siseneda VAU kaudu (link üleval paremal nurgas) - nii jäävad panustused seotuks Sinu kontoga. Lehe paremas servas kuvatakse tublimate kasutajate edetabelit. Punkte saab arhivaari poolt heakskiidetud panustuste eest, seega ei ilmu punktid edetabelisse vahetult peale panustuse salvestamist, vaid siis, kui arhivaar on jõudnud selle üle vaadata. Edetabelis osalemiseks pead arhivaale kirjeldades olema sisse logitud - anonüümseid panustusi edetabel ei kajasta. Töö alustamiseks vali vasakult loetelust huvipakkuv ülesanne klõpsates selle pealkirjal. '''Ülesande objektide loetelu ehk anna.ra.ee/et/task/taskObjectList/''' Vali arhivaal klõpsates selle pealkirjal - seejärel kuvatakse arhivaali kujutis ja andmete sisestamise vorm. '''Panustamise vorm ehk anna.ra.ee/et/proposal/add/''' Uuri arhivaali kujutist ja sisesta tekst sobivasse lahtrisse. Näiteks kui ülesande sisuks on isikute tuvastamine fotodelt, siis vali isiku nimi isikute menüüst, või kui seal sobivat nime ei leidu, siis kirjuta nimi märkuste lahtrisse. Ettepaneku tegemise järel vajuta nuppu "Salvesta." Nupuga "Tagasi" saad minna arhivaalide loetellu järgmist arhivaali valima.
1.0
Add help texts to the CS portal - **Reported by sven syld on 29 Jan 2015 09:50 UTC** '''Task list, i.e. anna.ra.ee/et/task/list/''' This page displays all tasks open for contribution. Before starting work, we recommend logging in via VAU (link in the top right corner) - this way your contributions stay linked to your account. The right side of the page shows a leaderboard of the top contributors. Points are awarded for contributions approved by the archivist, so points do not appear on the leaderboard immediately after a contribution is saved, but only once the archivist has reviewed it. To appear on the leaderboard you must be logged in while describing archival records - anonymous contributions are not counted. To start working, choose a task of interest from the list on the left by clicking its title. '''Task object list, i.e. anna.ra.ee/et/task/taskObjectList/''' Select a record by clicking its title - the record image and the data entry form are then displayed. '''Contribution form, i.e. anna.ra.ee/et/proposal/add/''' Study the record image and enter the text into the appropriate field. For example, if the task is identifying persons in photos, select the person's name from the persons menu, or, if no suitable name is found there, write the name in the comments field. After making a proposal, press the "Save" button. The "Back" button takes you back to the record list to select the next record.
defect
add help texts to the cs portal reported by sven syld on jan utc task list i e anna ra ee et task list this page displays all tasks open for contribution before starting work we recommend logging in via vau link in the top right corner this way contributions stay linked to your account the right side of the page shows a leaderboard of the top contributors points are awarded for contributions approved by the archivist so points do not appear on the leaderboard immediately after a contribution is saved but only once the archivist has reviewed it to appear on the leaderboard you must be logged in while describing archival records anonymous contributions are not counted to start working choose a task of interest from the list on the left by clicking its title task object list i e anna ra ee et task taskobjectlist select a record by clicking its title the record image and the data entry form are then displayed contribution form i e anna ra ee et proposal add study the record image and enter the text into the appropriate field for example if the task is identifying persons in photos select the person s name from the persons menu or if no suitable name is found there write the name in the comments field after making a proposal press the save button the back button takes you back to the record list to select the next record
1
596,225
18,100,369,776
IssuesEvent
2021-09-22 13:39:48
mantidproject/mantid
https://api.github.com/repos/mantidproject/mantid
closed
MSlice issues
High Priority Bug ISIS Team: Excitations/Vesuvio
This is note of the 3 MSlice issues that are to be considered for this release: https://github.com/mantidproject/mslice/issues/630 - auto-generated slice scripts no longer run successfully https://github.com/mantidproject/mslice/issues/628 - Mantid crashes when trying to generate a script from a cut (linux) https://github.com/mantidproject/mslice/issues/629 - MSlice opens to the wrong tab
1.0
MSlice issues - This is note of the 3 MSlice issues that are to be considered for this release: https://github.com/mantidproject/mslice/issues/630 - auto-generated slice scripts no longer run successfully https://github.com/mantidproject/mslice/issues/628 - Mantid crashes when trying to generate a script from a cut (linux) https://github.com/mantidproject/mslice/issues/629 - MSlice opens to the wrong tab
non_defect
mslice issues this is note of the mslice issues that are to be considered for this release auto generated slice scripts no longer run successfully mantid crashes when trying to generate a script from a cut linux mslice opens to the wrong tab
0
15,967
10,417,838,211
IssuesEvent
2019-09-15 02:11:25
ParveenBhadooOfficial/Bhadoo-Cloud
https://api.github.com/repos/ParveenBhadooOfficial/Bhadoo-Cloud
closed
Free Servers for Public Torrenting
Public Service
Post your Free Server URL Below. Current Active Servers https://github.com/ParveenBhadooOfficial/BhadooCloud/wiki/Available-Servers
1.0
Free Servers for Public Torrenting - Post your Free Server URL Below. Current Active Servers https://github.com/ParveenBhadooOfficial/BhadooCloud/wiki/Available-Servers
non_defect
free servers for public torrenting post your free server url below current active servers
0
47,369
13,056,147,725
IssuesEvent
2020-07-30 03:48:17
icecube-trac/tix2
https://api.github.com/repos/icecube-trac/tix2
closed
pybinding of TimeResidual requires 5 args (Trac #435)
Migrated from Trac combo core defect
The 2 last arguments of the TimeResidual function of I3Calculator, don't use their default (I3Constants.n_ice_group, n_ice_phase) in python and need to explicitly given which is not the case in C++ land. only the following works now : phys_services.I3Calculator.time_residual(track, omgeo[omkey].position, pulse.time,dataclasses.I3Constants.n_ice_group,dataclasses.I3Constants.n_ice_phase) and not (yet) : phys_services.I3Calculator.time_residual(track, omgeo[omkey].position, pulse.time) Migrated from https://code.icecube.wisc.edu/ticket/435 ```json { "status": "closed", "changetime": "2012-07-23T23:33:19", "description": "The 2 last arguments of the TimeResidual function of I3Calculator, don't use their default (I3Constants.n_ice_group, n_ice_phase) in python and need to explicitly given which is not the case in C++ land.\n\nonly the following works now :\nphys_services.I3Calculator.time_residual(track, omgeo[omkey].position, pulse.time,dataclasses.I3Constants.n_ice_group,dataclasses.I3Constants.n_ice_phase)\n\nand not (yet) :\nphys_services.I3Calculator.time_residual(track, omgeo[omkey].position, pulse.time)", "reporter": "icecube", "cc": "", "resolution": "fixed", "_ts": "1343086399000000", "component": "combo core", "summary": "pybinding of TimeResidual requires 5 args", "priority": "normal", "keywords": "", "time": "2012-07-23T14:35:48", "milestone": "", "owner": "jvansanten", "type": "defect" } ```
1.0
pybinding of TimeResidual requires 5 args (Trac #435) - The 2 last arguments of the TimeResidual function of I3Calculator, don't use their default (I3Constants.n_ice_group, n_ice_phase) in python and need to explicitly given which is not the case in C++ land. only the following works now : phys_services.I3Calculator.time_residual(track, omgeo[omkey].position, pulse.time,dataclasses.I3Constants.n_ice_group,dataclasses.I3Constants.n_ice_phase) and not (yet) : phys_services.I3Calculator.time_residual(track, omgeo[omkey].position, pulse.time) Migrated from https://code.icecube.wisc.edu/ticket/435 ```json { "status": "closed", "changetime": "2012-07-23T23:33:19", "description": "The 2 last arguments of the TimeResidual function of I3Calculator, don't use their default (I3Constants.n_ice_group, n_ice_phase) in python and need to explicitly given which is not the case in C++ land.\n\nonly the following works now :\nphys_services.I3Calculator.time_residual(track, omgeo[omkey].position, pulse.time,dataclasses.I3Constants.n_ice_group,dataclasses.I3Constants.n_ice_phase)\n\nand not (yet) :\nphys_services.I3Calculator.time_residual(track, omgeo[omkey].position, pulse.time)", "reporter": "icecube", "cc": "", "resolution": "fixed", "_ts": "1343086399000000", "component": "combo core", "summary": "pybinding of TimeResidual requires 5 args", "priority": "normal", "keywords": "", "time": "2012-07-23T14:35:48", "milestone": "", "owner": "jvansanten", "type": "defect" } ```
defect
pybinding of timeresidual requires args trac the last arguments of the timeresidual function of don t use their default n ice group n ice phase in python and need to explicitly given which is not the case in c land only the following works now phys services time residual track omgeo position pulse time dataclasses n ice group dataclasses n ice phase and not yet phys services time residual track omgeo position pulse time migrated from json status closed changetime description the last arguments of the timeresidual function of don t use their default n ice group n ice phase in python and need to explicitly given which is not the case in c land n nonly the following works now nphys services time residual track omgeo position pulse time dataclasses n ice group dataclasses n ice phase n nand not yet nphys services time residual track omgeo position pulse time reporter icecube cc resolution fixed ts component combo core summary pybinding of timeresidual requires args priority normal keywords time milestone owner jvansanten type defect
1
27,736
5,089,721,984
IssuesEvent
2017-01-01 20:39:08
edno/kleis
https://api.github.com/repos/edno/kleis
closed
Check password update
defect test
Possible issue with password update from user's profile page. **TODO**: add tests for password update for both case, profile and user management.
1.0
Check password update - Possible issue with password update from user's profile page. **TODO**: add tests for password update for both case, profile and user management.
defect
check password update possible issue with password update from user s profile page todo add tests for password update for both case profile and user management
1
23,447
3,827,811,926
IssuesEvent
2016-03-31 00:55:03
CompEvol/beast2
https://api.github.com/repos/CompEvol/beast2
closed
MRCA prior selector needs to be clarified
BEAUti defect
This may belong to GUI bug: BEAUti => StarBEAST template: I can create the same named MRCA prior from both "Multi Species Coalescent" and "Priors" panel, which is every confusing. Because of the same MRCA prior selector, I can create MRCA prior for species tree from "Multi Species Coalescent" panel, or MRCA prior for gene tree from "Priors" panel. And xml looks weird: <distribution id="posterior" spec="util.CompoundDistribution"> <distribution id="speciescoalescent" spec="util.CompoundDistribution"> <distribution id="SpeciesTreePopSize.Species" spec="beast.evolution.speciation.SpeciesTreePrior" bottomPopSize="@popSize" gammaParameter="@popMean" taxonset="@taxonsuperset" tree="@Tree.t:Species"/> <distribution id="treePrior.t:26" spec="beast.evolution.speciation.GeneTreeForSpeciesTreeDistribution" speciesTree="@Tree.t:Species" speciesTreePrior="@SpeciesTreePopSize.Species" tree="@Tree.t:26"/> <distribution id="treePrior.t:29" spec="beast.evolution.speciation.GeneTreeForSpeciesTreeDistribution" speciesTree="@Tree.t:Species" speciesTreePrior="@SpeciesTreePopSize.Species" tree="@Tree.t:29"/> <distribution id="treePrior.t:47" spec="beast.evolution.speciation.GeneTreeForSpeciesTreeDistribution" speciesTree="@Tree.t:Species" speciesTreePrior="@SpeciesTreePopSize.Species" tree="@Tree.t:47"/> <distribution id="MRCAPrior.0" spec="beast.math.distributions.MRCAPrior" tree="@Tree.t:Species"> <Exponential id="Exponential.0" name="distr"> <parameter id="RealParameter.0" estimate="false" name="mean">1.0</parameter> </Exponential> </distribution> </distribution> <distribution id="prior" spec="util.CompoundDistribution"> <distribution id="CalibratedYuleModel.t:Species" spec="beast.evolution.speciation.CalibratedYuleModel" birthRate="@birthRateY.t:Species" tree="@Tree.t:Species"/> <prior id="CalibratedYuleBirthRatePrior.t:Species" name="distribution" x="@birthRateY.t:Species"> <Uniform id="Uniform.0" name="distr" upper="1000.0"/> </prior> <prior id="ClockPrior.c:26" 
name="distribution" x="@clockRate.c:26"> <Uniform id="Uniform.01" name="distr" upper="Infinity"/> </prior> <prior id="popMean.prior" name="distribution" x="@popMean"> <OneOnX id="OneOnX.0" name="distr"/> </prior> <distribution id="MRCAPrior.01" spec="beast.math.distributions.MRCAPrior" tree="@Tree.t:Species"> <LogNormal id="LogNormalDistributionModel.0" name="distr"> <parameter id="RealParameter.01" estimate="false" name="M">1.0</parameter> <parameter id="RealParameter.02" estimate="false" lower="0.0" name="S" upper="5.0">1.25</parameter> </LogNormal> </distribution> <distribution id="a.prior" spec="beast.math.distributions.MRCAPrior" tree="@Tree.t:26"> <taxonset id="a" spec="TaxonSet"> <taxon id="Thomomys_bottae_albatus1" spec="Taxon"/> <taxon id="Thomomys_bottae_alpinus1" spec="Taxon"/> <taxon id="Thomomys_bottae_awahnee_a1" spec="Taxon"/> <taxon id="Thomomys_bottae_awahnee_b1" spec="Taxon"/> <taxon id="Thomomys_bottae_bottae1" spec="Taxon"/> </taxonset> <Gamma id="Gamma.0" name="distr"> <parameter id="RealParameter.03" estimate="false" name="alpha">2.0</parameter> <parameter id="RealParameter.04" estimate="false" name="beta">2.0</parameter> </Gamma> </distribution> </distribution> <distribution id="likelihood" spec="util.CompoundDistribution"> <distribution id="treeLikelihood.26" spec="TreeLikelihood" data="@26" tree="@Tree.t:26"> <siteModel id="SiteModel.s:26" spec="SiteModel"> <parameter id="mutationRate.s:26" estimate="false" name="mutationRate">1.0</parameter> <parameter id="gammaShape.s:26" estimate="false" name="shape">1.0</parameter> <parameter id="proportionInvariant.s:26" estimate="false" lower="0.0" name="proportionInvariant" upper="1.0">0.0</parameter> <substModel id="JC69.s:26" spec="JukesCantor"/> </siteModel> <branchRateModel id="StrictClock.c:26" spec="beast.evolution.branchratemodel.StrictClockModel" clock.rate="@clockRate.c:26"/> </distribution> <distribution id="treeLikelihood.29" spec="TreeLikelihood" branchRateModel="@StrictClock.c:26" 
data="@29" siteModel="@SiteModel.s:26" tree="@Tree.t:29"/> <distribution id="treeLikelihood.47" spec="TreeLikelihood" branchRateModel="@StrictClock.c:26" data="@47" siteModel="@SiteModel.s:26" tree="@Tree.t:47"/> </distribution> </distribution>
1.0
MRCA prior selector needs to be clarified - This may belong to GUI bug: BEAUti => StarBEAST template: I can create the same named MRCA prior from both "Multi Species Coalescent" and "Priors" panel, which is every confusing. Because of the same MRCA prior selector, I can create MRCA prior for species tree from "Multi Species Coalescent" panel, or MRCA prior for gene tree from "Priors" panel. And xml looks weird: <distribution id="posterior" spec="util.CompoundDistribution"> <distribution id="speciescoalescent" spec="util.CompoundDistribution"> <distribution id="SpeciesTreePopSize.Species" spec="beast.evolution.speciation.SpeciesTreePrior" bottomPopSize="@popSize" gammaParameter="@popMean" taxonset="@taxonsuperset" tree="@Tree.t:Species"/> <distribution id="treePrior.t:26" spec="beast.evolution.speciation.GeneTreeForSpeciesTreeDistribution" speciesTree="@Tree.t:Species" speciesTreePrior="@SpeciesTreePopSize.Species" tree="@Tree.t:26"/> <distribution id="treePrior.t:29" spec="beast.evolution.speciation.GeneTreeForSpeciesTreeDistribution" speciesTree="@Tree.t:Species" speciesTreePrior="@SpeciesTreePopSize.Species" tree="@Tree.t:29"/> <distribution id="treePrior.t:47" spec="beast.evolution.speciation.GeneTreeForSpeciesTreeDistribution" speciesTree="@Tree.t:Species" speciesTreePrior="@SpeciesTreePopSize.Species" tree="@Tree.t:47"/> <distribution id="MRCAPrior.0" spec="beast.math.distributions.MRCAPrior" tree="@Tree.t:Species"> <Exponential id="Exponential.0" name="distr"> <parameter id="RealParameter.0" estimate="false" name="mean">1.0</parameter> </Exponential> </distribution> </distribution> <distribution id="prior" spec="util.CompoundDistribution"> <distribution id="CalibratedYuleModel.t:Species" spec="beast.evolution.speciation.CalibratedYuleModel" birthRate="@birthRateY.t:Species" tree="@Tree.t:Species"/> <prior id="CalibratedYuleBirthRatePrior.t:Species" name="distribution" x="@birthRateY.t:Species"> <Uniform id="Uniform.0" name="distr" upper="1000.0"/> </prior> 
<prior id="ClockPrior.c:26" name="distribution" x="@clockRate.c:26"> <Uniform id="Uniform.01" name="distr" upper="Infinity"/> </prior> <prior id="popMean.prior" name="distribution" x="@popMean"> <OneOnX id="OneOnX.0" name="distr"/> </prior> <distribution id="MRCAPrior.01" spec="beast.math.distributions.MRCAPrior" tree="@Tree.t:Species"> <LogNormal id="LogNormalDistributionModel.0" name="distr"> <parameter id="RealParameter.01" estimate="false" name="M">1.0</parameter> <parameter id="RealParameter.02" estimate="false" lower="0.0" name="S" upper="5.0">1.25</parameter> </LogNormal> </distribution> <distribution id="a.prior" spec="beast.math.distributions.MRCAPrior" tree="@Tree.t:26"> <taxonset id="a" spec="TaxonSet"> <taxon id="Thomomys_bottae_albatus1" spec="Taxon"/> <taxon id="Thomomys_bottae_alpinus1" spec="Taxon"/> <taxon id="Thomomys_bottae_awahnee_a1" spec="Taxon"/> <taxon id="Thomomys_bottae_awahnee_b1" spec="Taxon"/> <taxon id="Thomomys_bottae_bottae1" spec="Taxon"/> </taxonset> <Gamma id="Gamma.0" name="distr"> <parameter id="RealParameter.03" estimate="false" name="alpha">2.0</parameter> <parameter id="RealParameter.04" estimate="false" name="beta">2.0</parameter> </Gamma> </distribution> </distribution> <distribution id="likelihood" spec="util.CompoundDistribution"> <distribution id="treeLikelihood.26" spec="TreeLikelihood" data="@26" tree="@Tree.t:26"> <siteModel id="SiteModel.s:26" spec="SiteModel"> <parameter id="mutationRate.s:26" estimate="false" name="mutationRate">1.0</parameter> <parameter id="gammaShape.s:26" estimate="false" name="shape">1.0</parameter> <parameter id="proportionInvariant.s:26" estimate="false" lower="0.0" name="proportionInvariant" upper="1.0">0.0</parameter> <substModel id="JC69.s:26" spec="JukesCantor"/> </siteModel> <branchRateModel id="StrictClock.c:26" spec="beast.evolution.branchratemodel.StrictClockModel" clock.rate="@clockRate.c:26"/> </distribution> <distribution id="treeLikelihood.29" spec="TreeLikelihood" 
branchRateModel="@StrictClock.c:26" data="@29" siteModel="@SiteModel.s:26" tree="@Tree.t:29"/> <distribution id="treeLikelihood.47" spec="TreeLikelihood" branchRateModel="@StrictClock.c:26" data="@47" siteModel="@SiteModel.s:26" tree="@Tree.t:47"/> </distribution> </distribution>
defect
mrca prior selector needs to be clarified this may belong to gui bug beauti starbeast template i can create the same named mrca prior from both multi species coalescent and priors panel which is every confusing because of the same mrca prior selector i can create mrca prior for species tree from multi species coalescent panel or mrca prior for gene tree from priors panel and xml looks weird
1
59,584
17,023,168,666
IssuesEvent
2021-07-03 00:40:56
tomhughes/trac-tickets
https://api.github.com/repos/tomhughes/trac-tickets
closed
"unique" segment id's repeated: planet file contains two pairs of duplicate segments
Component: api Priority: critical Resolution: fixed Type: defect
**[Submitted to the original trac issue database at 10.29am, Thursday, 31st May 2007]** Oh dear, the planet file for May 30, 2007 contains two pairs of duplicate segments: they have the same id. Surely id's are unique, yes? Isn't that the point? It doesn't look like they are new, but it wasn't like this last week or I'd have noticed. Is this just a problem generating the planet file, or indicative of an underlying data integrity problem? (There may be other duplicated, but my planet processing program crashed trying to do an insert on a duplicate primary key) Excerpt below... 18591604 and 18591605 are the culprits, and this is a contiguous section of the file. (Also, note that 18591606 doesn't have any useful tags and 18591607 is empty.) ---------------------- <segment id="18591603" from="22542877" to="22542874" timestamp="2007-05-11T08:24:21+01:00"> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS(could be inacurately)" /> </segment> <segment id="18591604" from="22542876" to="22542878" timestamp="2006-12-31T00:20:10+00:00"> <tag k="natural" v="coastline" /> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS" /> </segment> <segment id="18591605" from="22542869" to="22542875" timestamp="2006-12-31T00:20:10+00:00"> <tag k="natural" v="coastline" /> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS" /> </segment> <segment id="18591604" from="22542876" to="22542878" timestamp="2006-12-31T00:20:10+00:00"> <tag k="natural" v="coastline" /> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS" /> </segment> <segment id="18591605" from="22542869" to="22542875" timestamp="2006-12-31T00:20:10+00:00"> <tag k="natural" v="coastline" /> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS" /> </segment> <segment id="18591606" from="22542879" to="22542877" timestamp="2007-05-11T08:24:37+01:00"> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS(could be inacurately)" /> 
</segment> <segment id="18591607" from="22542173" to="22542182" timestamp="2006-12-31T00:20:10+00:00"/> <segment id="18591608" from="22542880" to="22542881" timestamp="2006-12-31T00:20:10+00:00"> <tag k="natural" v="coastline" /> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS" /> </segment>
1.0
"unique" segment id's repeated: planet file contains two pairs of duplicate segments - **[Submitted to the original trac issue database at 10.29am, Thursday, 31st May 2007]** Oh dear, the planet file for May 30, 2007 contains two pairs of duplicate segments: they have the same id. Surely id's are unique, yes? Isn't that the point? It doesn't look like they are new, but it wasn't like this last week or I'd have noticed. Is this just a problem generating the planet file, or indicative of an underlying data integrity problem? (There may be other duplicated, but my planet processing program crashed trying to do an insert on a duplicate primary key) Excerpt below... 18591604 and 18591605 are the culprits, and this is a contiguous section of the file. (Also, note that 18591606 doesn't have any useful tags and 18591607 is empty.) ---------------------- <segment id="18591603" from="22542877" to="22542874" timestamp="2007-05-11T08:24:21+01:00"> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS(could be inacurately)" /> </segment> <segment id="18591604" from="22542876" to="22542878" timestamp="2006-12-31T00:20:10+00:00"> <tag k="natural" v="coastline" /> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS" /> </segment> <segment id="18591605" from="22542869" to="22542875" timestamp="2006-12-31T00:20:10+00:00"> <tag k="natural" v="coastline" /> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS" /> </segment> <segment id="18591604" from="22542876" to="22542878" timestamp="2006-12-31T00:20:10+00:00"> <tag k="natural" v="coastline" /> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS" /> </segment> <segment id="18591605" from="22542869" to="22542875" timestamp="2006-12-31T00:20:10+00:00"> <tag k="natural" v="coastline" /> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS" /> </segment> <segment id="18591606" from="22542879" to="22542877" timestamp="2007-05-11T08:24:37+01:00"> <tag 
k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS(could be inacurately)" /> </segment> <segment id="18591607" from="22542173" to="22542182" timestamp="2006-12-31T00:20:10+00:00"/> <segment id="18591608" from="22542880" to="22542881" timestamp="2006-12-31T00:20:10+00:00"> <tag k="natural" v="coastline" /> <tag k="created_by" v="almien_coastlines" /> <tag k="source" v="PGS" /> </segment>
defect
unique segment id s repeated planet file contains two pairs of duplicate segments oh dear the planet file for may contains two pairs of duplicate segments they have the same id surely id s are unique yes isn t that the point it doesn t look like they are new but it wasn t like this last week or i d have noticed is this just a problem generating the planet file or indicative of an underlying data integrity problem there may be other duplicated but my planet processing program crashed trying to do an insert on a duplicate primary key excerpt below and are the culprits and this is a contiguous section of the file also note that doesn t have any useful tags and is empty
1
25,496
7,720,111,392
IssuesEvent
2018-05-23 21:43:14
hashicorp/packer
https://api.github.com/repos/hashicorp/packer
closed
Amazon-Chroot builder cannot properly mount EBS volumes on new-gen (c5, m5) HVM instances
bug builder/amazon
Version: 1.1.3 Host: Amazon Linux, HVM The amazon-chroot builder relies on Amazon attaching an EBS volume at the same device endpoint as what is described in the console/api. However, on HVM instances, the two do not match and require additional discovery to have a mapping. For example, an EBS volume mapped as `/dev/xvda` is enumerated in the OS as `/dev/nvme0n1`. Additional EBS volumes attached to the instance are associated as `/dev/sd*` but still attached as `/dev/nvme*`. The amazon-chroot builder relies on a direct mapping to mount the newly created EBS volume (ie `/dev/sdg -> /dev/sdg`). Launching a packer build from this instance fails because of this expectation. Problem code: https://github.com/hashicorp/packer/blob/master/builder/amazon/chroot/device.go#L15 See here for output: https://gist.github.com/JoshArrington/09970427b4bad32d73b3b9bb920fbbe4 **To reproduce** Launch an Amazon Linux AMI (I used ami-f2d3638a) with HVM (c5.large). Notce that from the Amazon console, the EBS volume attached is /dev/xvda. Additional volumes attach as /dev/sd*. Use the config in the gist above. **References** Identifying the appropriate device mapping https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/nvme-ebs-volumes.html Device naming for HVM instances https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/device_naming.html
1.0
Amazon-Chroot builder cannot properly mount EBS volumes on new-gen (c5, m5) HVM instances - Version: 1.1.3 Host: Amazon Linux, HVM The amazon-chroot builder relies on Amazon attaching an EBS volume at the same device endpoint as what is described in the console/api. However, on HVM instances, the two do not match and require additional discovery to have a mapping. For example, an EBS volume mapped as `/dev/xvda` is enumerated in the OS as `/dev/nvme0n1`. Additional EBS volumes attached to the instance are associated as `/dev/sd*` but still attached as `/dev/nvme*`. The amazon-chroot builder relies on a direct mapping to mount the newly created EBS volume (ie `/dev/sdg -> /dev/sdg`). Launching a packer build from this instance fails because of this expectation. Problem code: https://github.com/hashicorp/packer/blob/master/builder/amazon/chroot/device.go#L15 See here for output: https://gist.github.com/JoshArrington/09970427b4bad32d73b3b9bb920fbbe4 **To reproduce** Launch an Amazon Linux AMI (I used ami-f2d3638a) with HVM (c5.large). Notce that from the Amazon console, the EBS volume attached is /dev/xvda. Additional volumes attach as /dev/sd*. Use the config in the gist above. **References** Identifying the appropriate device mapping https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/nvme-ebs-volumes.html Device naming for HVM instances https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/device_naming.html
non_defect
amazon chroot builder cannot properly mount ebs volumes on new gen hvm instances version host amazon linux hvm the amazon chroot builder relies on amazon attaching an ebs volume at the same device endpoint as what is described in the console api however on hvm instances the two do not match and require additional discovery to have a mapping for example an ebs volume mapped as dev xvda is enumerated in the os as dev additional ebs volumes attached to the instance are associated as dev sd but still attached as dev nvme the amazon chroot builder relies on a direct mapping to mount the newly created ebs volume ie dev sdg dev sdg launching a packer build from this instance fails because of this expectation problem code see here for output to reproduce launch an amazon linux ami i used ami with hvm large notce that from the amazon console the ebs volume attached is dev xvda additional volumes attach as dev sd use the config in the gist above references identifying the appropriate device mapping device naming for hvm instances
0
79,823
29,192,169,452
IssuesEvent
2023-05-19 21:14:38
dotCMS/core
https://api.github.com/repos/dotCMS/core
reopened
Content type field variables cut off
Type : Defect dotCMS : Admin Tools QA : Passed Internal OKR : User Experience Team : Lunik Release : 23.05
[![Screenshot_2023-01-13_20-49-23.png](https://mrkr.io/s/63c1c3d39165a8659ceb96be/2)](https://mrkr.io/s/63c1c3d39165a8659ceb96be/0) ## Problem Statement When the field variable is too long, it cut off. ## Steps to Reproduce 1. Edit a content type 2. Add rows with 4 columns 3. Add one field to each column with a long title ## Acceptance Criteria - Needs to show Title - Needs to show Type - Needs to show Variable Name - with "Copy" icon - Users have "unlimited" characters... so some truncation is needed. ## Wireframes / Designs / Prototypes - Needs UI --- **Reported by:** Freddy Montes (freddy@dotcms.com) **Source URL:** [http://localhost:8080/dotAdmin/#/content-types-angular/edit/657897dfb36ef211ebfb4128de818787](http://localhost:8080/dotAdmin/#/content-types-angular/edit/657897dfb36ef211ebfb4128de818787) **Issue details:** [Open in Marker.io](https://app.marker.io/i/63c1c3d39165a8659ceb96c1_36d8a4e2737e5abf?advanced=1) <table><tr><td><strong>Device type</strong></td><td>desktop</td></tr><tr><td><strong>Browser</strong></td><td>Chrome 109.0.0.0</td></tr><tr><td><strong>Screen Size</strong></td><td>3360 x 1890</td></tr><tr><td><strong>OS</strong></td><td>OS X 13.0.0</td></tr><tr><td><strong>Viewport Size</strong></td><td>1551 x 1805</td></tr><tr><td><strong>Zoom Level</strong></td><td>100%</td></tr><tr><td><strong>Pixel Ratio</strong></td><td>@&#8203;2x</td></tr></table> ```[tasklist] ### Tasks - [ ] Provide Design @Melissa-dotCMS - [ ] Update Feature ```
1.0
Content type field variables cut off - [![Screenshot_2023-01-13_20-49-23.png](https://mrkr.io/s/63c1c3d39165a8659ceb96be/2)](https://mrkr.io/s/63c1c3d39165a8659ceb96be/0) ## Problem Statement When the field variable is too long, it cut off. ## Steps to Reproduce 1. Edit a content type 2. Add rows with 4 columns 3. Add one field to each column with a long title ## Acceptance Criteria - Needs to show Title - Needs to show Type - Needs to show Variable Name - with "Copy" icon - Users have "unlimited" characters... so some truncation is needed. ## Wireframes / Designs / Prototypes - Needs UI --- **Reported by:** Freddy Montes (freddy@dotcms.com) **Source URL:** [http://localhost:8080/dotAdmin/#/content-types-angular/edit/657897dfb36ef211ebfb4128de818787](http://localhost:8080/dotAdmin/#/content-types-angular/edit/657897dfb36ef211ebfb4128de818787) **Issue details:** [Open in Marker.io](https://app.marker.io/i/63c1c3d39165a8659ceb96c1_36d8a4e2737e5abf?advanced=1) <table><tr><td><strong>Device type</strong></td><td>desktop</td></tr><tr><td><strong>Browser</strong></td><td>Chrome 109.0.0.0</td></tr><tr><td><strong>Screen Size</strong></td><td>3360 x 1890</td></tr><tr><td><strong>OS</strong></td><td>OS X 13.0.0</td></tr><tr><td><strong>Viewport Size</strong></td><td>1551 x 1805</td></tr><tr><td><strong>Zoom Level</strong></td><td>100%</td></tr><tr><td><strong>Pixel Ratio</strong></td><td>@&#8203;2x</td></tr></table> ```[tasklist] ### Tasks - [ ] Provide Design @Melissa-dotCMS - [ ] Update Feature ```
defect
content type field variables cut off problem statement when the field variable is too long it cut off steps to reproduce edit a content type add rows with columns add one field to each column with a long title acceptance criteria needs to show title needs to show type needs to show variable name with copy icon users have unlimited characters so some truncation is needed wireframes designs prototypes needs ui reported by freddy montes freddy dotcms com source url issue details device type desktop browser chrome screen size x os os x viewport size x zoom level pixel ratio tasks provide design melissa dotcms update feature
1
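The acceptance criteria in the record above call for showing long field titles and variable names with "some truncation". A minimal middle-ellipsis sketch of that idea (a hypothetical helper, not dotCMS code):

```python
def truncate_middle(text: str, max_len: int = 24) -> str:
    """Keep the start and end of a long name, replacing the middle with an ellipsis."""
    if len(text) <= max_len:
        return text
    keep = max_len - 1          # characters kept besides the ellipsis
    head = (keep + 1) // 2      # slightly favor the start of the name
    tail = keep - head
    return text[:head] + "…" + text[-tail:]
```

Middle truncation keeps both the prefix and the distinguishing suffix of a variable name visible, which matters when many variables share a common prefix.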
513,622
14,924,045,915
IssuesEvent
2021-01-23 21:46:08
Mitiko/BWDPerf
https://api.github.com/repos/Mitiko/BWDPerf
closed
Encoder emits s token, when it should cover the whole buffer
compression parsing priority
When running the test for multiple dictionaries, the dictionary size is way too big for the buffer, so BWD easily covers the whole buffer with words and we don't even get to the s token, yet an s token is emitted in the compressed file. The test data is enwik4 with options m=12, r=8, b=1_000. The problem occurs in the second buffer, at location 667. At that location the data is `A`, being the third `A` in `[[AAA]]` I suppose this is some sort of a parsing problem. Check which words should've covered that region according to the dictionary calculator and then check why these words weren't used when parsing the stream.
1.0
Encoder emits s token, when it should cover the whole buffer - When running the test for multiple dictionaries, the dictionary size is way too big for the buffer, so BWD easily covers the whole buffer with words and we don't even get to the s token, yet an s token is emitted in the compressed file. The test data is enwik4 with options m=12, r=8, b=1_000. The problem occurs in the second buffer, at location 667. At that location the data is `A`, being the third `A` in `[[AAA]]` I suppose this is some sort of a parsing problem. Check which words should've covered that region according to the dictionary calculator and then check why these words weren't used when parsing the stream.
non_defect
encoder emits s token when it should cover the whole buffer when running the test for multiple dictionaries the dictionary size is way too big for the buffer so bwd easily covers the whole buffer with words and we don t even get to the s token yet an s token is emitted in the compressed file the test data is with options m r b the problem occurs in the second buffer at location at that location the data is a being the third a in i suppose this is some sort of a parsing problem check which words should ve covered that region according to the dictionary calculator and then check why these words weren t used when parsing the stream
0
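The report above suggests checking which dictionary words should have covered the problem location (offset 667 in the second buffer). A hedged debugging sketch, assuming the parser exposes its placements as hypothetical `(start, length, word)` tuples:

```python
def words_covering(placements, offset):
    """Return the words whose [start, start + length) span covers the given buffer offset.

    placements: list of (start, length, word) tuples recorded during parsing.
    """
    return [word for (start, length, word) in placements
            if start <= offset < start + length]
```

An empty result at the offset where the `s` token was emitted would confirm a coverage gap between the dictionary calculator and the parser.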
66,093
19,978,748,247
IssuesEvent
2022-01-29 14:46:40
martinrotter/rssguard
https://api.github.com/repos/martinrotter/rssguard
closed
[BUG]: reader panel resizing not remembered
Type-Defect
### Brief description of the issue Here's what I set it to: ![image](https://user-images.githubusercontent.com/1309656/150525972-edf6c5ee-2b7d-4515-90ec-d17af92d9de5.png) And here is what I come back to next day: ![image](https://user-images.githubusercontent.com/1309656/150525950-29ae55dd-79cf-4400-96d6-18e0c08d7c54.png) ### How to reproduce the bug? restart app? restart windows? wait a day? ### What was the expected result? should be remembered ### What actually happened? it isn't ### Other information I use nowebengine ### Operating system and version * OS: win10 x64 * RSS Guard version: dev
1.0
[BUG]: reader panel resizing not remembered - ### Brief description of the issue Here's what I set it to: ![image](https://user-images.githubusercontent.com/1309656/150525972-edf6c5ee-2b7d-4515-90ec-d17af92d9de5.png) And here is what I come back to next day: ![image](https://user-images.githubusercontent.com/1309656/150525950-29ae55dd-79cf-4400-96d6-18e0c08d7c54.png) ### How to reproduce the bug? restart app? restart windows? wait a day? ### What was the expected result? should be remembered ### What actually happened? it isn't ### Other information I use nowebengine ### Operating system and version * OS: win10 x64 * RSS Guard version: dev
defect
reader panel resizing not remembered brief description of the issue here s what i set it to and here is what i come back to next day how to reproduce the bug restart app restart windows wait a day what was the expected result should be remembered what actually happened it isn t other information i use nowebengine operating system and version os rss guard version dev
1
299,554
25,910,921,694
IssuesEvent
2022-12-15 13:55:58
rootzoll/raspiblitz
https://api.github.com/repos/rootzoll/raspiblitz
closed
Lots of DNS queries on 1.6.2
final testing
Reference: #1656 Hi, even with the latest 1.6.2 release I see a lot of DNS requests on my pihole coming from Raspiblitz. ![grafik](https://user-images.githubusercontent.com/12113170/103173432-39919c80-485b-11eb-8ea7-6f6b45efd99d.png) I disabled the IPv6 address on the eth0 interface to make sure, that v6 is not the root cause. Do you need more information? I'm glad, when I can help.
1.0
Lots of DNS queries on 1.6.2 - Reference: #1656 Hi, even with the latest 1.6.2 release I see a lot of DNS requests on my pihole coming from Raspiblitz. ![grafik](https://user-images.githubusercontent.com/12113170/103173432-39919c80-485b-11eb-8ea7-6f6b45efd99d.png) I disabled the IPv6 address on the eth0 interface to make sure, that v6 is not the root cause. Do you need more information? I'm glad, when I can help.
non_defect
lots of dns queries on reference hi even with the latest release i see a lot of dns requests on my pihole coming from raspiblitz i disabled the address on the interface to make sure that is not the root cause do you need more information i m glad when i can help
0
518,989
15,038,400,487
IssuesEvent
2021-02-02 17:24:06
blogtutor/blog-tutor-support
https://api.github.com/repos/blogtutor/blog-tutor-support
opened
Don't purge Cloudflare cache for "Spotlight Instagram Feeds" Custom Post Types
enhancement high priority
The Spotlight Instagram Feeds plugin creates hundreds of custom post types, and apparently when they refresh their cache it can trigger hundreds of individual Cloudflare cache clear requests, hitting rate limits. The URLs follow the format `domain.com/sl-insta-media/slug/` -- so we can probably just check for `sl-insta-media` and skip the cache clear request for those. Slack conversation here: https://nerd-press.slack.com/archives/C7N19NT6X/p1612214248000700
1.0
Don't purge Cloudflare cache for "Spotlight Instagram Feeds" Custom Post Types - The Spotlight Instagram Feeds plugin creates hundreds of custom post types, and apparently when they refresh their cache it can trigger hundreds of individual Cloudflare cache clear requests, hitting rate limits. The URLs follow the format `domain.com/sl-insta-media/slug/` -- so we can probably just check for `sl-insta-media` and skip the cache clear request for those. Slack conversation here: https://nerd-press.slack.com/archives/C7N19NT6X/p1612214248000700
non_defect
don t purge cloudflare cache for spotlight instagram feeds custom post types the spotlight instagram feeds plugin creates hundreds of custom post types and apparently when they refresh their cache it can trigger hundreds of individual cloudflare cache clear requests hitting rate limits the urls follow the format domain com sl insta media slug so we can probably just check for sl insta media and skip the cache clear request for those slack conversation here
0
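The record above proposes checking for `sl-insta-media` in the URL and skipping the cache clear request for those. A minimal sketch of that check (a hypothetical helper, not the actual plugin code):

```python
from urllib.parse import urlparse

def should_purge(url: str) -> bool:
    """Return False for Spotlight Instagram Feeds custom post type URLs
    (domain.com/sl-insta-media/slug/) so they are excluded from Cloudflare purging."""
    path = urlparse(url).path
    return not path.strip("/").startswith("sl-insta-media")
```

Filtering these URLs before issuing purge requests avoids hundreds of per-post calls that would otherwise hit Cloudflare's rate limits.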
232,756
17,793,631,738
IssuesEvent
2021-08-31 19:15:30
andrescasta777/MovieInfo
https://api.github.com/repos/andrescasta777/MovieInfo
opened
Spring 3
documentation
i. MVC presentation 1. Implementation of the business logic 2. Unit tests of the developed logic ii. Retrospective report iii. User stories to be developed in sprint 3
1.0
Spring 3 - i. MVC presentation 1. Implementation of the business logic 2. Unit tests of the developed logic ii. Retrospective report iii. User stories to be developed in sprint 3
non_defect
spring i mvc presentation implementation of the business logic unit tests of the developed logic ii retrospective report iii user stories to be developed in the sprint
0
254,423
21,785,521,809
IssuesEvent
2022-05-14 04:13:13
milvus-io/milvus
https://api.github.com/repos/milvus-io/milvus
closed
[Bug]:[benchmark][cluster] Gradual increase in response time for continuous data insertion
kind/enhancement triage/accepted stale test/benchmark
### Is there an existing issue for this? - [X] I have searched the existing issues ### Environment ```markdown - Milvus version: - Deployment mode(standalone or cluster):cluster - SDK version(e.g. pymilvus v2.0.0rc2):2.0.1dev - OS(Ubuntu or CentOS): - CPU/Memory: - GPU: - Others: ``` ### Current Behavior <img width="1382" alt="截屏2022-03-01 18 46 40" src="https://user-images.githubusercontent.com/34296482/156155426-05e27cfb-97d2-469d-abff-5ee223ff05a7.png"> Milvus server ``` benchmark-tag-7bdq2-1-etcd-0 1/1 Running 0 3m28s 10.97.17.82 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-etcd-1 1/1 Running 0 3m28s 10.97.16.38 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-etcd-2 1/1 Running 0 3m28s 10.97.16.39 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-datacoord-8448c65667-hg7pg 1/1 Running 0 3m28s 10.97.18.82 qa-node017.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-datanode-7958b574db-smcnd 1/1 Running 0 3m28s 10.97.16.32 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-indexcoord-58c5589754-zldqn 1/1 Running 0 3m28s 10.97.16.34 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-indexnode-84dbc55d8f-7vq5z 1/1 Running 0 3m28s 10.97.16.35 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-proxy-7969c8869b-w5nmz 1/1 Running 0 3m28s 10.97.18.81 qa-node017.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-querycoord-6dc969f597-xbx62 1/1 Running 0 3m28s 10.97.16.30 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-querynode-697755ff87-ssgj8 1/1 Running 0 3m28s 10.97.17.78 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-rootcoord-85d4c5d6ff-4z24v 1/1 Running 0 3m28s 10.97.17.73 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-minio-0 1/1 Running 0 3m28s 10.97.17.81 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-minio-1 1/1 Running 0 3m28s 10.97.19.242 qa-node016.zilliz.local <none> <none> 
benchmark-tag-7bdq2-1-minio-2 1/1 Running 0 3m28s 10.97.18.84 qa-node017.zilliz.local <none> <none> benchmark-tag-7bdq2-1-minio-3 1/1 Running 0 3m28s 10.97.12.113 qa-node015.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-autorecovery-7694fdfdd-mmdk4 1/1 Running 0 3m28s 10.97.16.33 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-bastion-df8cc4d6-7m6k4 1/1 Running 0 3m28s 10.97.16.31 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-bookkeeper-0 1/1 Running 0 3m28s 10.97.17.84 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-bookkeeper-1 1/1 Running 0 104s 10.97.18.89 qa-node017.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-bookkeeper-2 1/1 Running 0 66s 10.97.12.118 qa-node015.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-broker-c88f7c974-6vnxg 1/1 Running 0 3m28s 10.97.17.75 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-proxy-7b675bb5dd-jpjpb 2/2 Running 0 3m28s 10.97.19.240 qa-node016.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-zookeeper-0 1/1 Running 0 3m28s 10.97.17.80 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-zookeeper-1 1/1 Running 0 3m11s 10.97.18.86 qa-node017.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-zookeeper-2 1/1 Running 0 2m53s 10.97.12.115 qa-node015.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-zookeeper-metadata-fk9hc 0/1 Completed 0 3m28s 10.97.18.80 qa-node017.zilliz.local <none> <none> ``` server-configmap server-cluster-dn8c32g-in2c2g-qn2c2g client-configmap pre2.0-client-insert-locust-client1-nb30k ### Expected Behavior ``` { "config.yaml": "locust_insert_performance: collections: - collection_name: sift_10w_128_l2 ni_per: 50000 build_index: false index_type: ivf_sq8 index_param: nlist: 1024 task: connection_num: 1 clients_num: 1 spawn_rate: 1 during_time: 600 types: - type: insert weight: 1 params: ni_per: 30000 " } ``` ### Steps To Reproduce ``` 1、create collection 2、build index of 
ivf_sq8 3、insert 10w vectors 4、flush 5、build index with the same params 6、load 7、locust concurrent: insert 30k ``` ### Anything else? _No response_
1.0
[Bug]:[benchmark][cluster] Gradual increase in response time for continuous data insertion - ### Is there an existing issue for this? - [X] I have searched the existing issues ### Environment ```markdown - Milvus version: - Deployment mode(standalone or cluster):cluster - SDK version(e.g. pymilvus v2.0.0rc2):2.0.1dev - OS(Ubuntu or CentOS): - CPU/Memory: - GPU: - Others: ``` ### Current Behavior <img width="1382" alt="截屏2022-03-01 18 46 40" src="https://user-images.githubusercontent.com/34296482/156155426-05e27cfb-97d2-469d-abff-5ee223ff05a7.png"> Milvus server ``` benchmark-tag-7bdq2-1-etcd-0 1/1 Running 0 3m28s 10.97.17.82 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-etcd-1 1/1 Running 0 3m28s 10.97.16.38 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-etcd-2 1/1 Running 0 3m28s 10.97.16.39 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-datacoord-8448c65667-hg7pg 1/1 Running 0 3m28s 10.97.18.82 qa-node017.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-datanode-7958b574db-smcnd 1/1 Running 0 3m28s 10.97.16.32 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-indexcoord-58c5589754-zldqn 1/1 Running 0 3m28s 10.97.16.34 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-indexnode-84dbc55d8f-7vq5z 1/1 Running 0 3m28s 10.97.16.35 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-proxy-7969c8869b-w5nmz 1/1 Running 0 3m28s 10.97.18.81 qa-node017.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-querycoord-6dc969f597-xbx62 1/1 Running 0 3m28s 10.97.16.30 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-querynode-697755ff87-ssgj8 1/1 Running 0 3m28s 10.97.17.78 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-milvus-rootcoord-85d4c5d6ff-4z24v 1/1 Running 0 3m28s 10.97.17.73 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-minio-0 1/1 Running 0 3m28s 10.97.17.81 qa-node014.zilliz.local <none> <none> 
benchmark-tag-7bdq2-1-minio-1 1/1 Running 0 3m28s 10.97.19.242 qa-node016.zilliz.local <none> <none> benchmark-tag-7bdq2-1-minio-2 1/1 Running 0 3m28s 10.97.18.84 qa-node017.zilliz.local <none> <none> benchmark-tag-7bdq2-1-minio-3 1/1 Running 0 3m28s 10.97.12.113 qa-node015.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-autorecovery-7694fdfdd-mmdk4 1/1 Running 0 3m28s 10.97.16.33 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-bastion-df8cc4d6-7m6k4 1/1 Running 0 3m28s 10.97.16.31 qa-node013.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-bookkeeper-0 1/1 Running 0 3m28s 10.97.17.84 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-bookkeeper-1 1/1 Running 0 104s 10.97.18.89 qa-node017.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-bookkeeper-2 1/1 Running 0 66s 10.97.12.118 qa-node015.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-broker-c88f7c974-6vnxg 1/1 Running 0 3m28s 10.97.17.75 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-proxy-7b675bb5dd-jpjpb 2/2 Running 0 3m28s 10.97.19.240 qa-node016.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-zookeeper-0 1/1 Running 0 3m28s 10.97.17.80 qa-node014.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-zookeeper-1 1/1 Running 0 3m11s 10.97.18.86 qa-node017.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-zookeeper-2 1/1 Running 0 2m53s 10.97.12.115 qa-node015.zilliz.local <none> <none> benchmark-tag-7bdq2-1-pulsar-zookeeper-metadata-fk9hc 0/1 Completed 0 3m28s 10.97.18.80 qa-node017.zilliz.local <none> <none> ``` server-configmap server-cluster-dn8c32g-in2c2g-qn2c2g client-configmap pre2.0-client-insert-locust-client1-nb30k ### Expected Behavior ``` { "config.yaml": "locust_insert_performance: collections: - collection_name: sift_10w_128_l2 ni_per: 50000 build_index: false index_type: ivf_sq8 index_param: nlist: 1024 task: connection_num: 1 clients_num: 1 spawn_rate: 1 during_time: 600 types: - type: insert 
weight: 1 params: ni_per: 30000 " } ``` ### Steps To Reproduce ``` 1、create collection 2、build index of ivf_sq8 3、insert 10w vectors 4、flush 5、build index with the same params 6、load 7、locust concurrent: insert 30k ``` ### Anything else? _No response_
non_defect
gradual increase in response time for continuous data insertion is there an existing issue for this i have searched the existing issues environment markdown milvus version deployment mode standalone or cluster cluster sdk version e g pymilvus os ubuntu or centos cpu memory gpu others current behavior img width alt src milvus server benchmark tag etcd running qa zilliz local benchmark tag etcd running qa zilliz local benchmark tag etcd running qa zilliz local benchmark tag milvus datacoord running qa zilliz local benchmark tag milvus datanode smcnd running qa zilliz local benchmark tag milvus indexcoord zldqn running qa zilliz local benchmark tag milvus indexnode running qa zilliz local benchmark tag milvus proxy running qa zilliz local benchmark tag milvus querycoord running qa zilliz local benchmark tag milvus querynode running qa zilliz local benchmark tag milvus rootcoord running qa zilliz local benchmark tag minio running qa zilliz local benchmark tag minio running qa zilliz local benchmark tag minio running qa zilliz local benchmark tag minio running qa zilliz local benchmark tag pulsar autorecovery running qa zilliz local benchmark tag pulsar bastion running qa zilliz local benchmark tag pulsar bookkeeper running qa zilliz local benchmark tag pulsar bookkeeper running qa zilliz local benchmark tag pulsar bookkeeper running qa zilliz local benchmark tag pulsar broker running qa zilliz local benchmark tag pulsar proxy jpjpb running qa zilliz local benchmark tag pulsar zookeeper running qa zilliz local benchmark tag pulsar zookeeper running qa zilliz local benchmark tag pulsar zookeeper running qa zilliz local benchmark tag pulsar zookeeper metadata completed qa zilliz local server configmap server cluster client configmap client insert locust expected behavior config yaml locust insert performance collections collection name sift ni per build index false index type ivf index param nlist task connection num clients num spawn rate during time types type insert 
weight params ni per steps to reproduce 、create collection 、build index of ivf 、insert vectors 、flush 、build index with the same params 、load 、locust concurrent insert anything else no response
0
358,281
25,185,296,470
IssuesEvent
2022-11-11 17:19:34
aws/aws-cli
https://api.github.com/repos/aws/aws-cli
closed
Cannot reapply for SES prod access via the CLI after initial denial
documentation feature-request service-api sesv2
Confirm by changing [ ] to [x] below to ensure that it's a bug: - [x] I've gone though the [User Guide](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-welcome.html) and the [API reference](https://docs.aws.amazon.com/cli/latest/reference/) - [x] I've searched for [previous similar issues](https://github.com/aws/aws-cli/issues) and didn't find any solution **Describe the bug** I initially applied to get prod access (i.e. out of the sandbox) for Amazon SES via the AWS dashboard but got denied because I didn't provide enough detail. Online documentation said I would have to re-apply via the AWS CLI. This is the command I tried to do ``` aws sesv2 put-account-details \ --profile ${ADMIN_PROFILE} \ --production-access-enabled \ --mail-type TRANSACTIONAL \ --website-url ${MY_SITE} \ --use-case-description "...." \ --additional-contact-email-addresses ${EMAIL_ADDRESS}. \ --contact-language EN \ ``` But it's giving me this error message: ``` An error occurred (ConflictException) when calling the PutAccountDetails operation: None ``` It didn't give me a lot more info when adding the `-debug` flag. Logs posted below. **SDK version number** 2.4.10 **Platform/OS/Hardware/Device** What are you running the cli on? macOS Monterrey on M1 laptop, Python/3.8.8 Darwin/21.2.0 exe/x86_64 prompt/of **To Reproduce (observed behavior)** Steps to reproduce the behavior - Run the command shown above **Expected behavior** A clear and concise description of what you expected to happen. - After running the command, I expect to be able to reapply for prod access programmatically. **Logs/output** Get full traceback and error logs by adding `--debug` to the command. This seems to be the important bit, but doesn't give much of a reason of why it failed. 
``` ': 'https://email.us-east-1.amazonaws.com/v2/email/account/details', 'context': {'client_region': 'us-east-1', 'client_config': <botocore.config.Config object at 0x7fe82963c790>, 'has_streaming_input': False, 'auth_type': None}} 2022-01-12 22:19:21,817 - MainThread - botocore.hooks - DEBUG - Event request-created.sesv2.PutAccountDetails: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7fe82963c880>> 2022-01-12 22:19:21,817 - MainThread - botocore.hooks - DEBUG - Event choose-signer.sesv2.PutAccountDetails: calling handler <function set_operation_specific_signer at 0x7fe8288a9af0> 2022-01-12 22:19:21,818 - MainThread - botocore.auth - DEBUG - Calculating signature using v4 auth. 2022-01-12 22:19:21,818 - MainThread - botocore.auth - DEBUG - CanonicalRequest: POST /v2/email/account/details content-type:application/json host:email.us-east-1.amazonaws.com x-amz-date:20220113T031921Z content-type;host;x-amz-date f0ad0c2ee8f3139b5f2fbf52802456e0213f0c7c94f56083b0266f74a1cf0094 2022-01-12 22:19:21,818 - MainThread - botocore.auth - DEBUG - StringToSign: AWS4-HMAC-SHA256 20220113T031921Z 20220113/us-east-1/ses/aws4_request 2a0059f64ac792463be9a9cdfbf6d60c9b90954e8d2787e4ebebb29105d96a4c 2022-01-12 22:19:21,818 - MainThread - botocore.auth - DEBUG - Signature: 0c1cab687b59e3c77bf8182afe1bcab4ff03763142a6818b07cf56a4d04035f5 2022-01-12 22:19:21,818 - MainThread - botocore.endpoint - DEBUG - Sending http request: <AWSPreparedRequest stream_output=False, method=POST, url=https://email.us-east-1.amazonaws.com/v2/email/account/details, headers={'Content-Type': b'application/json', 'User-Agent': b'aws-cli/2.4.10 Python/3.8.8 Darwin/21.2.0 exe/x86_64 prompt/off command/sesv2.put-account-details', 'X-Amz-Date': b'20220113T031921Z', 'Authorization': b'AWS4-HMAC-SHA256 Credential=AKIAUQOYYAVWDHCZ3U4O/20220113/us-east-1/ses/aws4_request, SignedHeaders=content-type;host;x-amz-date, 
Signature=0c1cab687b59e3c77bf8182afe1bcab4ff03763142a6818b07cf56a4d04035f5', 'Content-Length': '692'}> 2022-01-12 22:19:21,819 - MainThread - botocore.httpsession - DEBUG - Certificate path: /usr/local/aws-cli/awscli/botocore/cacert.pem 2022-01-12 22:19:21,819 - MainThread - urllib3.connectionpool - DEBUG - Starting new HTTPS connection (1): email.us-east-1.amazonaws.com:443 2022-01-12 22:19:22,002 - MainThread - urllib3.connectionpool - DEBUG - https://email.us-east-1.amazonaws.com:443 "POST /v2/email/account/details HTTP/1.1" 409 16 2022-01-12 22:19:22,003 - MainThread - botocore.parsers - DEBUG - Response headers: {'Date': 'Thu, 13 Jan 2022 03:19:21 GMT', 'Content-Type': 'application/json', 'Content-Length': '16', 'Connection': 'keep-alive', 'x-amzn-RequestId': '29f067d8-3cdd-4e1d-b2f1-36d6100ca738', 'x-amzn-ErrorType': 'ConflictException'} 2022-01-12 22:19:22,003 - MainThread - botocore.parsers - DEBUG - Response body: b'{"message":null}' 2022-01-12 22:19:22,006 - MainThread - botocore.parsers - DEBUG - Response headers: {'Date': 'Thu, 13 Jan 2022 03:19:21 GMT', 'Content-Type': 'application/json', 'Content-Length': '16', 'Connection': 'keep-alive', 'x-amzn-RequestId': '29f067d8-3cdd-4e1d-b2f1-36d6100ca738', 'x-amzn-ErrorType': 'ConflictException'} 2022-01-12 22:19:22,006 - MainThread - botocore.parsers - DEBUG - Response body: b'{"message":null}' 2022-01-12 22:19:22,006 - MainThread - botocore.hooks - DEBUG - Event needs-retry.sesv2.PutAccountDetails: calling handler <bound method RetryHandler.needs_retry of <botocore.retries.standard.RetryHandler object at 0x7fe8296822e0>> 2022-01-12 22:19:22,006 - MainThread - botocore.retries.standard - DEBUG - Not retrying request. 
2022-01-12 22:19:22,006 - MainThread - botocore.hooks - DEBUG - Event after-call.sesv2.PutAccountDetails: calling handler <bound method RetryQuotaChecker.release_retry_quota of <botocore.retries.standard.RetryQuotaChecker object at 0x7fe82963ce20>> 2022-01-12 22:19:22,007 - MainThread - awscli.clidriver - DEBUG - Exception caught in main() Traceback (most recent call last): File "awscli/clidriver.py", line 459, in main File "awscli/clidriver.py", line 594, in __call__ File "awscli/clidriver.py", line 770, in __call__ File "awscli/clidriver.py", line 901, in invoke File "awscli/clidriver.py", line 913, in _make_client_call File "awscli/botocore/client.py", line 281, in _api_call File "awscli/botocore/client.py", line 609, in _make_api_call botocore.errorfactory.ConflictException: An error occurred (ConflictException) when calling the PutAccountDetails operation: None An error occurred (ConflictException) when calling the PutAccountDetails operation: None ``` **Additional context** Add any other context about the problem here. N/A
1.0
Cannot reapply for SES prod access via the CLI after initial denial - Confirm by changing [ ] to [x] below to ensure that it's a bug: - [x] I've gone though the [User Guide](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-welcome.html) and the [API reference](https://docs.aws.amazon.com/cli/latest/reference/) - [x] I've searched for [previous similar issues](https://github.com/aws/aws-cli/issues) and didn't find any solution **Describe the bug** I initially applied to get prod access (i.e. out of the sandbox) for Amazon SES via the AWS dashboard but got denied because I didn't provide enough detail. Online documentation said I would have to re-apply via the AWS CLI. This is the command I tried to do ``` aws sesv2 put-account-details \ --profile ${ADMIN_PROFILE} \ --production-access-enabled \ --mail-type TRANSACTIONAL \ --website-url ${MY_SITE} \ --use-case-description "...." \ --additional-contact-email-addresses ${EMAIL_ADDRESS}. \ --contact-language EN \ ``` But it's giving me this error message: ``` An error occurred (ConflictException) when calling the PutAccountDetails operation: None ``` It didn't give me a lot more info when adding the `-debug` flag. Logs posted below. **SDK version number** 2.4.10 **Platform/OS/Hardware/Device** What are you running the cli on? macOS Monterrey on M1 laptop, Python/3.8.8 Darwin/21.2.0 exe/x86_64 prompt/of **To Reproduce (observed behavior)** Steps to reproduce the behavior - Run the command shown above **Expected behavior** A clear and concise description of what you expected to happen. - After running the command, I expect to be able to reapply for prod access programmatically. **Logs/output** Get full traceback and error logs by adding `--debug` to the command. This seems to be the important bit, but doesn't give much of a reason of why it failed. 
``` ': 'https://email.us-east-1.amazonaws.com/v2/email/account/details', 'context': {'client_region': 'us-east-1', 'client_config': <botocore.config.Config object at 0x7fe82963c790>, 'has_streaming_input': False, 'auth_type': None}} 2022-01-12 22:19:21,817 - MainThread - botocore.hooks - DEBUG - Event request-created.sesv2.PutAccountDetails: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7fe82963c880>> 2022-01-12 22:19:21,817 - MainThread - botocore.hooks - DEBUG - Event choose-signer.sesv2.PutAccountDetails: calling handler <function set_operation_specific_signer at 0x7fe8288a9af0> 2022-01-12 22:19:21,818 - MainThread - botocore.auth - DEBUG - Calculating signature using v4 auth. 2022-01-12 22:19:21,818 - MainThread - botocore.auth - DEBUG - CanonicalRequest: POST /v2/email/account/details content-type:application/json host:email.us-east-1.amazonaws.com x-amz-date:20220113T031921Z content-type;host;x-amz-date f0ad0c2ee8f3139b5f2fbf52802456e0213f0c7c94f56083b0266f74a1cf0094 2022-01-12 22:19:21,818 - MainThread - botocore.auth - DEBUG - StringToSign: AWS4-HMAC-SHA256 20220113T031921Z 20220113/us-east-1/ses/aws4_request 2a0059f64ac792463be9a9cdfbf6d60c9b90954e8d2787e4ebebb29105d96a4c 2022-01-12 22:19:21,818 - MainThread - botocore.auth - DEBUG - Signature: 0c1cab687b59e3c77bf8182afe1bcab4ff03763142a6818b07cf56a4d04035f5 2022-01-12 22:19:21,818 - MainThread - botocore.endpoint - DEBUG - Sending http request: <AWSPreparedRequest stream_output=False, method=POST, url=https://email.us-east-1.amazonaws.com/v2/email/account/details, headers={'Content-Type': b'application/json', 'User-Agent': b'aws-cli/2.4.10 Python/3.8.8 Darwin/21.2.0 exe/x86_64 prompt/off command/sesv2.put-account-details', 'X-Amz-Date': b'20220113T031921Z', 'Authorization': b'AWS4-HMAC-SHA256 Credential=AKIAUQOYYAVWDHCZ3U4O/20220113/us-east-1/ses/aws4_request, SignedHeaders=content-type;host;x-amz-date, 
Signature=0c1cab687b59e3c77bf8182afe1bcab4ff03763142a6818b07cf56a4d04035f5', 'Content-Length': '692'}> 2022-01-12 22:19:21,819 - MainThread - botocore.httpsession - DEBUG - Certificate path: /usr/local/aws-cli/awscli/botocore/cacert.pem 2022-01-12 22:19:21,819 - MainThread - urllib3.connectionpool - DEBUG - Starting new HTTPS connection (1): email.us-east-1.amazonaws.com:443 2022-01-12 22:19:22,002 - MainThread - urllib3.connectionpool - DEBUG - https://email.us-east-1.amazonaws.com:443 "POST /v2/email/account/details HTTP/1.1" 409 16 2022-01-12 22:19:22,003 - MainThread - botocore.parsers - DEBUG - Response headers: {'Date': 'Thu, 13 Jan 2022 03:19:21 GMT', 'Content-Type': 'application/json', 'Content-Length': '16', 'Connection': 'keep-alive', 'x-amzn-RequestId': '29f067d8-3cdd-4e1d-b2f1-36d6100ca738', 'x-amzn-ErrorType': 'ConflictException'} 2022-01-12 22:19:22,003 - MainThread - botocore.parsers - DEBUG - Response body: b'{"message":null}' 2022-01-12 22:19:22,006 - MainThread - botocore.parsers - DEBUG - Response headers: {'Date': 'Thu, 13 Jan 2022 03:19:21 GMT', 'Content-Type': 'application/json', 'Content-Length': '16', 'Connection': 'keep-alive', 'x-amzn-RequestId': '29f067d8-3cdd-4e1d-b2f1-36d6100ca738', 'x-amzn-ErrorType': 'ConflictException'} 2022-01-12 22:19:22,006 - MainThread - botocore.parsers - DEBUG - Response body: b'{"message":null}' 2022-01-12 22:19:22,006 - MainThread - botocore.hooks - DEBUG - Event needs-retry.sesv2.PutAccountDetails: calling handler <bound method RetryHandler.needs_retry of <botocore.retries.standard.RetryHandler object at 0x7fe8296822e0>> 2022-01-12 22:19:22,006 - MainThread - botocore.retries.standard - DEBUG - Not retrying request. 
2022-01-12 22:19:22,006 - MainThread - botocore.hooks - DEBUG - Event after-call.sesv2.PutAccountDetails: calling handler <bound method RetryQuotaChecker.release_retry_quota of <botocore.retries.standard.RetryQuotaChecker object at 0x7fe82963ce20>> 2022-01-12 22:19:22,007 - MainThread - awscli.clidriver - DEBUG - Exception caught in main() Traceback (most recent call last): File "awscli/clidriver.py", line 459, in main File "awscli/clidriver.py", line 594, in __call__ File "awscli/clidriver.py", line 770, in __call__ File "awscli/clidriver.py", line 901, in invoke File "awscli/clidriver.py", line 913, in _make_client_call File "awscli/botocore/client.py", line 281, in _api_call File "awscli/botocore/client.py", line 609, in _make_api_call botocore.errorfactory.ConflictException: An error occurred (ConflictException) when calling the PutAccountDetails operation: None An error occurred (ConflictException) when calling the PutAccountDetails operation: None ``` **Additional context** Add any other context about the problem here. N/A
non_defect
cannot reapply for ses prod access via the cli after initial denial confirm by changing to below to ensure that it s a bug i ve gone though the and the i ve searched for and didn t find any solution describe the bug i initially applied to get prod access i e out of the sandbox for amazon ses via the aws dashboard but got denied because i didn t provide enough detail online documentation said i would have to re apply via the aws cli this is the command i tried to do aws put account details profile admin profile production access enabled mail type transactional website url my site use case description additional contact email addresses email address contact language en but it s giving me this error message an error occurred conflictexception when calling the putaccountdetails operation none it didn t give me a lot more info when adding the debug flag logs posted below sdk version number platform os hardware device what are you running the cli on macos monterrey on laptop python darwin exe prompt of to reproduce observed behavior steps to reproduce the behavior run the command shown above expected behavior a clear and concise description of what you expected to happen after running the command i expect to be able to reapply for prod access programmatically logs output get full traceback and error logs by adding debug to the command this seems to be the important bit but doesn t give much of a reason of why it failed context client region us east client config has streaming input false auth type none mainthread botocore hooks debug event request created putaccountdetails calling handler mainthread botocore hooks debug event choose signer putaccountdetails calling handler mainthread botocore auth debug calculating signature using auth mainthread botocore auth debug canonicalrequest post email account details content type application json host email us east amazonaws com x amz date content type host x amz date mainthread botocore auth debug stringtosign hmac us east ses 
request mainthread botocore auth debug signature mainthread botocore endpoint debug sending http request mainthread botocore httpsession debug certificate path usr local aws cli awscli botocore cacert pem mainthread connectionpool debug starting new https connection email us east amazonaws com mainthread connectionpool debug post email account details http mainthread botocore parsers debug response headers date thu jan gmt content type application json content length connection keep alive x amzn requestid x amzn errortype conflictexception mainthread botocore parsers debug response body b message null mainthread botocore parsers debug response headers date thu jan gmt content type application json content length connection keep alive x amzn requestid x amzn errortype conflictexception mainthread botocore parsers debug response body b message null mainthread botocore hooks debug event needs retry putaccountdetails calling handler mainthread botocore retries standard debug not retrying request mainthread botocore hooks debug event after call putaccountdetails calling handler mainthread awscli clidriver debug exception caught in main traceback most recent call last file awscli clidriver py line in main file awscli clidriver py line in call file awscli clidriver py line in call file awscli clidriver py line in invoke file awscli clidriver py line in make client call file awscli botocore client py line in api call file awscli botocore client py line in make api call botocore errorfactory conflictexception an error occurred conflictexception when calling the putaccountdetails operation none an error occurred conflictexception when calling the putaccountdetails operation none additional context add any other context about the problem here n a
0
30,803
6,288,551,786
IssuesEvent
2017-07-19 17:13:54
googlei18n/libphonenumber
https://api.github.com/repos/googlei18n/libphonenumber
closed
No location for prefix 00333875
priority-medium type-defect
Imported from [Google Code issue #450](https://code.google.com/p/libphonenumber/issues/detail?id=450) created by [alessandro.casella@hisolution.it](https://code.google.com/u/112157349565416887571/) on 2014-04-10T13:14:29.000Z: --- <b>What steps will reproduce the problem?</b> 1. Send a request to analyze French phone number 003338757xxxx 2. Search for location in response 3. Notice that returned location is &quot;France&quot; <b>What is the expected output? What do you see instead?</b> I expected location &quot;Metz&quot; as the phone number's one. Returned location is generic &quot;France&quot; instead. <b>What version of the product are you using? On what operating system?</b> Python library version 6.0.0a on Linux (Ubuntu 12.04 LTS). The same problem using on-line web interface on http://libphonenumber.appspot.com/. <b>Please provide any additional information below.</b> I noticed that the prefix '3338757' is missing in python library files included in 'geodata' directory ('data4.py' file contains French prefixes but not '3338757').
1.0
No location for prefix 00333875 - Imported from [Google Code issue #450](https://code.google.com/p/libphonenumber/issues/detail?id=450) created by [alessandro.casella@hisolution.it](https://code.google.com/u/112157349565416887571/) on 2014-04-10T13:14:29.000Z: --- <b>What steps will reproduce the problem?</b> 1. Send a request to analyze French phone number 003338757xxxx 2. Search for location in response 3. Notice that returned location is &quot;France&quot; <b>What is the expected output? What do you see instead?</b> I expected location &quot;Metz&quot; as the phone number's one. Returned location is generic &quot;France&quot; instead. <b>What version of the product are you using? On what operating system?</b> Python library version 6.0.0a on Linux (Ubuntu 12.04 LTS). The same problem using on-line web interface on http://libphonenumber.appspot.com/. <b>Please provide any additional information below.</b> I noticed that the prefix '3338757' is missing in python library files included in 'geodata' directory ('data4.py' file contains French prefixes but not '3338757').
defect
no location for prefix imported from created by on what steps will reproduce the problem send a request to analyze french phone number search for location in response notice that returned location is quot france quot what is the expected output what do you see instead i expected location quot metz quot as the phone number s one returned location is generic quot france quot instead what version of the product are you using on what operating system python library version on linux ubuntu lts the same problem using on line web interface on please provide any additional information below i noticed that the prefix is missing in python library files included in geodata directory py file contains french prefixes but not
1
651,195
21,469,109,987
IssuesEvent
2022-04-26 07:54:08
bitfoundation/bitframework
https://api.github.com/repos/bitfoundation/bitframework
closed
Missing validations in the Sign-In page in the `TodoTemplate` project
bug area / project template high priority
Currently, the validation for the Email & password fields is not complete on the sign-in page. It is required to find a proper solution to apply these necessary validations on this page.
1.0
Missing validations in the Sign-In page in the `TodoTemplate` project - Currently, the validation for the Email & password fields is not complete on the sign-in page. It is required to find a proper solution to apply these necessary validations on this page.
non_defect
missing validations in the sign in page in the todotemplate project currently the validation for the email password fields is not complete on the sign in page it is required to find a proper solution to apply these necessary validations on this page
0
488,606
14,079,856,308
IssuesEvent
2020-11-04 15:23:53
TIBCOSoftware/genxdm
https://api.github.com/repos/TIBCOSoftware/genxdm
closed
Should there be a method corresponding to dm:node-name as well as its current two components?
Component-API Priority-Medium enhancement question
``` Currently: String getLocalName(N node) (and similarly for Reader/Informer/Cursor with no parameter) String getNamespaceURI(N node) (ditto) The accessor in the data model specification would be more closely modeled as: javax.xml.namespace.QName getName(N node) (or getNodeName(N node)) For a variety of reasons, which will be included in the comments below, this pattern was not chosen for the original development. The question is now officially reopened for discussion and resolution on the merits. ``` Original issue reported on code.google.com by `aale...@gmail.com` on 5 Nov 2010 at 6:16
1.0
Should there be a method corresponding to dm:node-name as well as its current two components? - ``` Currently: String getLocalName(N node) (and similarly for Reader/Informer/Cursor with no parameter) String getNamespaceURI(N node) (ditto) The accessor in the data model specification would be more closely modeled as: javax.xml.namespace.QName getName(N node) (or getNodeName(N node)) For a variety of reasons, which will be included in the comments below, this pattern was not chosen for the original development. The question is now officially reopened for discussion and resolution on the merits. ``` Original issue reported on code.google.com by `aale...@gmail.com` on 5 Nov 2010 at 6:16
non_defect
should there be a method corresponding to dm node name as well as its current two components currently string getlocalname n node and similarly for reader informer cursor with no parameter string getnamespaceuri n node ditto the accessor in the data model specification would be more closely modeled as javax xml namespace qname getname n node or getnodename n node for a variety of reasons which will be included in the comments below this pattern was not chosen for the original development the question is now officially reopened for discussion and resolution on the merits original issue reported on code google com by aale gmail com on nov at
0
3,006
3,059,850,452
IssuesEvent
2015-08-14 17:15:42
mikrosimage/loudness_validator
https://api.github.com/repos/mikrosimage/loudness_validator
closed
use SCons Options (to display on help)
build
refers to 12.1.5 http://www.scons.org/doc/1.0.0/HTML/scons-user/c1965.html Change our ARGUMENTS, gets in SConstruct (https://github.com/mikrosimage/loudness_validator/blob/develop/SConstruct#L18)
1.0
use SCons Options (to display on help) - refers to 12.1.5 http://www.scons.org/doc/1.0.0/HTML/scons-user/c1965.html Change our ARGUMENTS, gets in SConstruct (https://github.com/mikrosimage/loudness_validator/blob/develop/SConstruct#L18)
non_defect
use scons options to display on help refers to change our arguments gets in sconstruct
0
72,087
23,930,779,838
IssuesEvent
2022-09-10 14:07:06
vector-im/element-android
https://api.github.com/repos/vector-im/element-android
opened
Unable to set up cross-signing when using CAS login
T-Defect
### Steps to reproduce I have a homeserver set up on a YunoHost with CAS login. When I try to set up cross-signing, Element asks for my user's password but then says my password is incorrect. ### Outcome #### What did you expect? Element would recognize that I signed in using CAS and provide that option while setting up cross-signing. #### What happened instead? Unable to set up cross-signing. ### Your phone model Pixel 6 ### Operating system version 12 ### Application version and app store version 1.4.34 from F-droid ### Homeserver matrix.schmidthaus.rocks Synapse 1.65.0 ### Will you send logs? Yes ### Are you willing to provide a PR? No
1.0
Unable to set up cross-signing when using CAS login - ### Steps to reproduce I have a homeserver set up on a YunoHost with CAS login. When I try to set up cross-signing, Element asks for my user's password but then says my password is incorrect. ### Outcome #### What did you expect? Element would recognize that I signed in using CAS and provide that option while setting up cross-signing. #### What happened instead? Unable to set up cross-signing. ### Your phone model Pixel 6 ### Operating system version 12 ### Application version and app store version 1.4.34 from F-droid ### Homeserver matrix.schmidthaus.rocks Synapse 1.65.0 ### Will you send logs? Yes ### Are you willing to provide a PR? No
defect
unable to set up cross signing when using cas login steps to reproduce i have a homeserver set up on a yunohost with cas login when i try to set up cross signing element asks for my user s password but then says my password is incorrect outcome what did you expect element would recognize that i signed in using cas and provide that option while setting up cross signing what happened instead unable to set up cross signing your phone model pixel operating system version application version and app store version from f droid homeserver matrix schmidthaus rocks synapse will you send logs yes are you willing to provide a pr no
1
452,301
32,056,586,681
IssuesEvent
2023-09-24 06:28:27
neohere97/SkySync
https://api.github.com/repos/neohere97/SkySync
closed
ENS160 Average Current Rough Estimation
documentation
| Parameter | Value | |-----------------------|---------------| | Operating Voltage | 1.8V, Max: 1.98V | | I/O Supply Voltage | upto 3.6V | | Current - Deep Sleep | 0.01mA (10uA) | | Current - Idle | 2mA | | Current - Standard | 29mA | | State | Current (mA) | Power |Weight | |-------------------|-------|-------| ------------| | Deep Sleep | 0.01 | 18 uW | 0.7 | | Active | 29 | 52.2 mW | 0.3 | The sensor needs 3 minutes to warm up, considering one measurement every 10minutes Weighted Average Current = **8.7mA**, at 1.8V Weighted Average Power = **15.66mW**
1.0
ENS160 Average Current Rough Estimation - | Parameter | Value | |-----------------------|---------------| | Operating Voltage | 1.8V, Max: 1.98V | | I/O Supply Voltage | upto 3.6V | | Current - Deep Sleep | 0.01mA (10uA) | | Current - Idle | 2mA | | Current - Standard | 29mA | | State | Current (mA) | Power |Weight | |-------------------|-------|-------| ------------| | Deep Sleep | 0.01 | 18 uW | 0.7 | | Active | 29 | 52.2 mW | 0.3 | The sensor needs 3 minutes to warm up, considering one measurement every 10minutes Weighted Average Current = **8.7mA**, at 1.8V Weighted Average Power = **15.66mW**
non_defect
average current rough estimation parameter value operating voltage max i o supply voltage upto current deep sleep current idle current standard state current ma power weight deep sleep uw active mw the sensor needs minutes to warm up considering one measurement every weighted average current at weighted average power
0
280,979
21,315,307,035
IssuesEvent
2022-04-16 06:58:54
mazx4960/pe
https://api.github.com/repos/mazx4960/pe
opened
Instructions unclear for `add` command in UG
type.DocumentationBug severity.VeryLow
The user guide did not provide clear instructions on what counts as being the same student in the address book. (i.e. I could add students with the same name but I cannot add students with the same student id) ![image.png](https://raw.githubusercontent.com/mazx4960/pe/main/files/55c429c4-96a5-486e-a84c-4e322bbfdf20.png) <!--session: 1650088281927-88477934-0a85-4799-bc5a-4e78286ccbd2--> <!--Version: Web v3.4.2-->
1.0
Instructions unclear for `add` command in UG - The user guide did not provide clear instructions on what counts as being the same student in the address book. (i.e. I could add students with the same name but I cannot add students with the same student id) ![image.png](https://raw.githubusercontent.com/mazx4960/pe/main/files/55c429c4-96a5-486e-a84c-4e322bbfdf20.png) <!--session: 1650088281927-88477934-0a85-4799-bc5a-4e78286ccbd2--> <!--Version: Web v3.4.2-->
non_defect
instructions unclear for add command in ug the user guide did not provide clear instructions on what counts as being the same student in the address book i e i could add students with the same name but i cannot add students with the same student id
0
444,988
31,159,704,464
IssuesEvent
2023-08-16 15:11:42
webrecorder/browsertrix-cloud
https://api.github.com/repos/webrecorder/browsertrix-cloud
opened
1.6 docs update pass
documentation
### Context We updated the application with new terminology and features (collections!) but this has not yet been reflected fully in the documentation. ### Requirements This issue encompasses a few different areas of focus... - Terminology unification update pass (part of #922) - Add a section on collections to the user docs - Add uploading content to the docs - Add admonition type usage guidelines to the docs styleguide
1.0
1.6 docs update pass - ### Context We updated the application with new terminology and features (collections!) but this has not yet been reflected fully in the documentation. ### Requirements This issue encompasses a few different areas of focus... - Terminology unification update pass (part of #922) - Add a section on collections to the user docs - Add uploading content to the docs - Add admonition type usage guidelines to the docs styleguide
non_defect
docs update pass context we updated the application with new terminology and features collections but this has not yet been reflected fully in the documentation requirements this issue encompasses a few different areas of focus terminology unification update pass part of add a section on collections to the user docs add uploading content to the docs add admonition type usage guidelines to the docs styleguide
0
81,099
30,709,098,985
IssuesEvent
2023-07-27 08:32:40
scipy/scipy
https://api.github.com/repos/scipy/scipy
closed
BUG (likely): DBSCAN producing strange results on a geographical dataset
defect
### Describe your issue. The issue has been described under this StackOverflow question: [https://stackoverflow.com/questions/76774329/dbscan-producing-strange-results-on-a-ships-location-dataset](https://stackoverflow.com/questions/76774329/dbscan-producing-strange-results-on-a-ships-location-dataset) The dataset: https://pastebin.pl/view/86b44fe9 ### Reproducing Code Example ```python from sklearn.cluster import DBSCAN file = open("86b44fe9.txt", "r") lines = file.readlines() x = [] for p in lines: poi = p.rstrip("\n").split(" ") x.append([float(poi[2]), float(poi[3])]) # Parameters of DBSCAN min_samples = 42 eps = 0.001 dbscan = DBSCAN(eps, min_samples).fit(x) # fitting the model labels = dbscan.labels_ out_file = open("clusters.txt", "w") for p in range(len(lines)): poi = lines[p].rstrip("\n").split(" ") out_file.write(poi[0] + " ") out_file.write(poi[1] + " ") out_file.write(str(labels[p])) out_file.write("\n") out_file.close() ``` ### Error message ```shell There is likely a logic error. 
``` ### SciPy/NumPy/Python version and system information ```shell 1.10.1 1.23.5 sys.version_info(major=3, minor=8, micro=3, releaselevel='final', serial=0) Build Dependencies: blas: detection method: pkgconfig found: true include directory: c:/opt/openblas/if_32/64/include lib directory: c:/opt/openblas/if_32/64/lib name: openblas openblas configuration: USE_64BITINT= DYNAMIC_ARCH=1 DYNAMIC_OLDER= NO_CBLAS= NO_LAPACK= NO_LAPACKE= NO_AFFINITY=1 USE_OPENMP= PRESCOTT MAX_THREADS=4 pc file directory: c:/opt/openblas/if_32/64/lib/pkgconfig version: 0.3.18 lapack: detection method: pkgconfig found: true include directory: c:/opt/openblas/if_32/64/include lib directory: c:/opt/openblas/if_32/64/lib name: openblas openblas configuration: USE_64BITINT= DYNAMIC_ARCH=1 DYNAMIC_OLDER= NO_CBLAS= NO_LAPACK= NO_LAPACKE= NO_AFFINITY=1 USE_OPENMP= PRESCOTT MAX_THREADS=4 pc file directory: c:/opt/openblas/if_32/64/lib/pkgconfig version: 0.3.18 Compilers: c: commands: cc linker: ld.bfd name: gcc version: 10.3.0 c++: commands: c++ linker: ld.bfd name: gcc version: 10.3.0 cython: commands: cython linker: cython name: cython version: 0.29.33 fortran: commands: gfortran linker: ld.bfd name: gcc version: 10.3.0 pythran: include directory: C:\Users\runneradmin\AppData\Local\Temp\pip-build-env-yajwq4a4\overlay\Lib\site-packages\pythran version: 0.12.1 Machine Information: build: cpu: x86_64 endian: little family: x86_64 system: windows cross-compiled: false host: cpu: x86_64 endian: little family: x86_64 system: windows Python Information: path: C:\Users\runneradmin\AppData\Local\Temp\cibw-run-5o1gjg1j\cp38-win_amd64\build\venv\Scripts\python.exe version: '3.8' ```
1.0
BUG (likely): DBSCAN producing strange results on a geographical dataset - ### Describe your issue. The issue has been described under this StackOverflow question: [https://stackoverflow.com/questions/76774329/dbscan-producing-strange-results-on-a-ships-location-dataset](https://stackoverflow.com/questions/76774329/dbscan-producing-strange-results-on-a-ships-location-dataset) The dataset: https://pastebin.pl/view/86b44fe9 ### Reproducing Code Example ```python from sklearn.cluster import DBSCAN file = open("86b44fe9.txt", "r") lines = file.readlines() x = [] for p in lines: poi = p.rstrip("\n").split(" ") x.append([float(poi[2]), float(poi[3])]) # Parameters of DBSCAN min_samples = 42 eps = 0.001 dbscan = DBSCAN(eps, min_samples).fit(x) # fitting the model labels = dbscan.labels_ out_file = open("clusters.txt", "w") for p in range(len(lines)): poi = lines[p].rstrip("\n").split(" ") out_file.write(poi[0] + " ") out_file.write(poi[1] + " ") out_file.write(str(labels[p])) out_file.write("\n") out_file.close() ``` ### Error message ```shell There is likely a logic error. 
``` ### SciPy/NumPy/Python version and system information ```shell 1.10.1 1.23.5 sys.version_info(major=3, minor=8, micro=3, releaselevel='final', serial=0) Build Dependencies: blas: detection method: pkgconfig found: true include directory: c:/opt/openblas/if_32/64/include lib directory: c:/opt/openblas/if_32/64/lib name: openblas openblas configuration: USE_64BITINT= DYNAMIC_ARCH=1 DYNAMIC_OLDER= NO_CBLAS= NO_LAPACK= NO_LAPACKE= NO_AFFINITY=1 USE_OPENMP= PRESCOTT MAX_THREADS=4 pc file directory: c:/opt/openblas/if_32/64/lib/pkgconfig version: 0.3.18 lapack: detection method: pkgconfig found: true include directory: c:/opt/openblas/if_32/64/include lib directory: c:/opt/openblas/if_32/64/lib name: openblas openblas configuration: USE_64BITINT= DYNAMIC_ARCH=1 DYNAMIC_OLDER= NO_CBLAS= NO_LAPACK= NO_LAPACKE= NO_AFFINITY=1 USE_OPENMP= PRESCOTT MAX_THREADS=4 pc file directory: c:/opt/openblas/if_32/64/lib/pkgconfig version: 0.3.18 Compilers: c: commands: cc linker: ld.bfd name: gcc version: 10.3.0 c++: commands: c++ linker: ld.bfd name: gcc version: 10.3.0 cython: commands: cython linker: cython name: cython version: 0.29.33 fortran: commands: gfortran linker: ld.bfd name: gcc version: 10.3.0 pythran: include directory: C:\Users\runneradmin\AppData\Local\Temp\pip-build-env-yajwq4a4\overlay\Lib\site-packages\pythran version: 0.12.1 Machine Information: build: cpu: x86_64 endian: little family: x86_64 system: windows cross-compiled: false host: cpu: x86_64 endian: little family: x86_64 system: windows Python Information: path: C:\Users\runneradmin\AppData\Local\Temp\cibw-run-5o1gjg1j\cp38-win_amd64\build\venv\Scripts\python.exe version: '3.8' ```
defect
bug likely dbscan producing strange results on a geographical dataset describe your issue the issue has been described under this stackoverflow question the dataset reproducing code example python from sklearn cluster import dbscan file open txt r lines file readlines x for p in lines poi p rstrip n split x append float poi parameters of dbscan min samples eps dbscan dbscan eps min samples fit x fitting the model labels dbscan labels out file open clusters txt w for p in range len lines poi lines rstrip n split out file write poi out file write poi out file write str labels out file write n out file close error message shell there is likely a logic error scipy numpy python version and system information shell sys version info major minor micro releaselevel final serial build dependencies blas detection method pkgconfig found true include directory c opt openblas if include lib directory c opt openblas if lib name openblas openblas configuration use dynamic arch dynamic older no cblas no lapack no lapacke no affinity use openmp prescott max threads pc file directory c opt openblas if lib pkgconfig version lapack detection method pkgconfig found true include directory c opt openblas if include lib directory c opt openblas if lib name openblas openblas configuration use dynamic arch dynamic older no cblas no lapack no lapacke no affinity use openmp prescott max threads pc file directory c opt openblas if lib pkgconfig version compilers c commands cc linker ld bfd name gcc version c commands c linker ld bfd name gcc version cython commands cython linker cython name cython version fortran commands gfortran linker ld bfd name gcc version pythran include directory c users runneradmin appdata local temp pip build env overlay lib site packages pythran version machine information build cpu endian little family system windows cross compiled false host cpu endian little family system windows python information path c users runneradmin appdata local temp cibw run win build venv 
scripts python exe version
1
76,794
3,493,483,673
IssuesEvent
2016-01-05 02:28:55
brodieG/unitizer
https://api.github.com/repos/brodieG/unitizer
closed
Unexpected Exit When Browsing To some Tests
bug high priority
Happened with `review("tests/unitizer/alike.unitizer/")`, started at test number two, and at end of section got: ``` > alike(c(a = 1, b = 2), c(1, 2)) [1] "should be type \"character\" (is \"NULL\") for \"names\"" unitizer> Y = lists ======================================================================== - Passed ----------------------------------------------------------------------- The 6 tests in this section passed. Keep tests in store ([Y]es, [N]o, [P]rev, [B]rowse, [Q]uit, [H]elp)? > lst <- list(list(1, 2), list(3, list(4, list(5, list(6, 6.1, + 6.2))))) Error in if ((curr.sub.sec.obj@show.out || x@review == 0L) && nchar(item.main@data@output)) screen_out(item.main@data@output, : missing value where TRUE/FALSE needed Unexpectedly exited before storing unitizer; tests were not saved or changed. ```
1.0
Unexpected Exit When Browsing To some Tests - Happened with `review("tests/unitizer/alike.unitizer/")`, started at test number two, and at end of section got: ``` > alike(c(a = 1, b = 2), c(1, 2)) [1] "should be type \"character\" (is \"NULL\") for \"names\"" unitizer> Y = lists ======================================================================== - Passed ----------------------------------------------------------------------- The 6 tests in this section passed. Keep tests in store ([Y]es, [N]o, [P]rev, [B]rowse, [Q]uit, [H]elp)? > lst <- list(list(1, 2), list(3, list(4, list(5, list(6, 6.1, + 6.2))))) Error in if ((curr.sub.sec.obj@show.out || x@review == 0L) && nchar(item.main@data@output)) screen_out(item.main@data@output, : missing value where TRUE/FALSE needed Unexpectedly exited before storing unitizer; tests were not saved or changed. ```
non_defect
unexpected exit when browsing to some tests happened with review tests unitizer alike unitizer started at test number two and at end of section got alike c a b c should be type character is null for names unitizer y lists passed the tests in this section passed keep tests in store es o rev rowse uit elp lst list list list list list list error in if curr sub sec obj show out x review nchar item main data output screen out item main data output missing value where true false needed unexpectedly exited before storing unitizer tests were not saved or changed
0
40,317
9,955,121,214
IssuesEvent
2019-07-05 10:08:01
vector-im/riot-web
https://api.github.com/repos/vector-im/riot-web
closed
Missing messages whilst client suspended
bug defect p1 🔥 Fire 🔥
Reports of messages being missing in a room in the time between closing a laptop lid in the evening and opening it in the morning.
1.0
Missing messages whilst client suspended - Reports of messages being missing in a room in the time between closing a laptop lid in the evening and opening it in the morning.
defect
missing messages whilst client suspended reports of messages being missing in a room in the time between closing a laptop lid in the evening and opening it in the morning
1
69,475
22,365,344,119
IssuesEvent
2022-06-16 03:00:30
NREL/EnergyPlus
https://api.github.com/repos/NREL/EnergyPlus
closed
Cooling coil sizing uses StdRhoAir to size water coils while using actual air density for other coil types
Defect
Issue overview -------------- Coil models use either StdRhoAir or actual air density during the coil sizing procedure. All coils will now use StdRhoAir for sizing coil capacity. ### Details Some additional details for this issue (if relevant): - Platform (Operating system, version) - Version of EnergyPlus (if using an intermediate build, include SHA) - Unmethours link or helpdesk ticket number ### Checklist Add to this list or remove from it as applicable. This is a simple templated set of guidelines. - [ ] Defect file added (list location of defect file here) - [ ] Ticket added to Pivotal for defect (development team task) - [ ] Pull request created (the pull request will have additional tasks related to reviewing changes that fix this defect)
1.0
Cooling coil sizing uses StdRhoAir to size water coils while using actual air density for other coil types - Issue overview -------------- Coil models use either StdRhoAir or actual air density during the coil sizing procedure. All coils will now use StdRhoAir for sizing coil capacity. ### Details Some additional details for this issue (if relevant): - Platform (Operating system, version) - Version of EnergyPlus (if using an intermediate build, include SHA) - Unmethours link or helpdesk ticket number ### Checklist Add to this list or remove from it as applicable. This is a simple templated set of guidelines. - [ ] Defect file added (list location of defect file here) - [ ] Ticket added to Pivotal for defect (development team task) - [ ] Pull request created (the pull request will have additional tasks related to reviewing changes that fix this defect)
defect
cooling coil sizing uses stdrhoair to size water coils while using actual air density for other coil types issue overview coil models use either stdrhoair or actual air density during the coil sizing procedure all coils will now use stdrhoair for sizing coil capacity details some additional details for this issue if relevant platform operating system version version of energyplus if using an intermediate build include sha unmethours link or helpdesk ticket number checklist add to this list or remove from it as applicable this is a simple templated set of guidelines defect file added list location of defect file here ticket added to pivotal for defect development team task pull request created the pull request will have additional tasks related to reviewing changes that fix this defect
1
111,749
11,741,305,023
IssuesEvent
2020-03-11 21:26:22
americanexpress/one-app
https://api.github.com/repos/americanexpress/one-app
opened
Creating Recipe for Making API Calls
documentation good first issue
Create a recipe on how to make API calls within One App. This documentation should be added here: https://github.com/americanexpress/one-app/blob/master/docs/recipes/Making-An-API-Call.md *Topics that should be Covered* * How to make calls using the [fetch api](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch) * How to make calls using [Iguazu](https://github.com/americanexpress/iguazu)
1.0
Creating Recipe for Making API Calls - Create a recipe on how to make API calls within One App. This documentation should be added here: https://github.com/americanexpress/one-app/blob/master/docs/recipes/Making-An-API-Call.md *Topics that should be Covered* * How to make calls using the [fetch api](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch) * How to make calls using [Iguazu](https://github.com/americanexpress/iguazu)
non_defect
creating recipe for making api calls create a recipe on how to make api calls within one app this documentation should be added here topics that should be covered how to make calls using the how to make calls using
0
43,627
11,781,223,253
IssuesEvent
2020-03-16 21:52:56
cakephp/bake
https://api.github.com/repos/cakephp/bake
closed
AppController of plugin should be optional
Defect
`cake bake all ExposedUsers -p Sandbox` leads to: ``` Exception: Class 'Sandbox\Controller\AppController' not found In [/home/vagrant/Apps/sandbox.local/plugins/Sandbox/src/Controller/ExposedUsersController.php, line 13] Error: [Error] Class 'Sandbox\Controller\AppController' not found in /home/vagrant/Apps/sandbox.local/plugins/Sandbox/src/Controller/ExposedUsersController.php on line 13 Stack Trace: - /home/vagrant/Apps/sandbox.local/vendor/composer/ClassLoader.php:444 - /home/vagrant/Apps/sandbox.local/vendor/composer/ClassLoader.php:322 - spl_autoload_call - [internal], line ??- class_exists - [internal], line ??- /home/vagrant/Apps/sandbox.local/vendor/cakephp/bake/src/Command/TestCommand.php:243 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/bake/src/Command/TestCommand.php:120 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/bake/src/Command/ControllerCommand.php:201 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/bake/src/Command/ControllerCommand.php:147 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/bake/src/Command/ControllerCommand.php:64 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/bake/src/Command/AllCommand.php:93 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/cakephp/src/Console/BaseCommand.php:175 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/cakephp/src/Console/CommandRunner.php:336 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/cakephp/src/Console/CommandRunner.php:171 - /home/vagrant/Apps/sandbox.local/bin/cake.php:12 ``` If no AppController is found inside the plugin, it should silently accept the normal app AppController instead (which most use anyway, there is no need to have one per plugin by default). The controller code generated is ```php namespace Sandbox\Controller; class ExposedUsersController extends AppController ``` And missing `use App\Controller\AppController;` here.
1.0
AppController of plugin should be optional - `cake bake all ExposedUsers -p Sandbox` leads to: ``` Exception: Class 'Sandbox\Controller\AppController' not found In [/home/vagrant/Apps/sandbox.local/plugins/Sandbox/src/Controller/ExposedUsersController.php, line 13] Error: [Error] Class 'Sandbox\Controller\AppController' not found in /home/vagrant/Apps/sandbox.local/plugins/Sandbox/src/Controller/ExposedUsersController.php on line 13 Stack Trace: - /home/vagrant/Apps/sandbox.local/vendor/composer/ClassLoader.php:444 - /home/vagrant/Apps/sandbox.local/vendor/composer/ClassLoader.php:322 - spl_autoload_call - [internal], line ??- class_exists - [internal], line ??- /home/vagrant/Apps/sandbox.local/vendor/cakephp/bake/src/Command/TestCommand.php:243 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/bake/src/Command/TestCommand.php:120 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/bake/src/Command/ControllerCommand.php:201 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/bake/src/Command/ControllerCommand.php:147 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/bake/src/Command/ControllerCommand.php:64 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/bake/src/Command/AllCommand.php:93 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/cakephp/src/Console/BaseCommand.php:175 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/cakephp/src/Console/CommandRunner.php:336 - /home/vagrant/Apps/sandbox.local/vendor/cakephp/cakephp/src/Console/CommandRunner.php:171 - /home/vagrant/Apps/sandbox.local/bin/cake.php:12 ``` If no AppController is found inside the plugin, it should silently accept the normal app AppController instead (which most use anyway, there is no need to have one per plugin by default). The controller code generated is ```php namespace Sandbox\Controller; class ExposedUsersController extends AppController ``` And missing `use App\Controller\AppController;` here.
defect
appcontroller of plugin should be optional cake bake all exposedusers p sandbox leads to exception class sandbox controller appcontroller not found in error class sandbox controller appcontroller not found in home vagrant apps sandbox local plugins sandbox src controller exposeduserscontroller php on line stack trace home vagrant apps sandbox local vendor composer classloader php home vagrant apps sandbox local vendor composer classloader php spl autoload call line class exists line home vagrant apps sandbox local vendor cakephp bake src command testcommand php home vagrant apps sandbox local vendor cakephp bake src command testcommand php home vagrant apps sandbox local vendor cakephp bake src command controllercommand php home vagrant apps sandbox local vendor cakephp bake src command controllercommand php home vagrant apps sandbox local vendor cakephp bake src command controllercommand php home vagrant apps sandbox local vendor cakephp bake src command allcommand php home vagrant apps sandbox local vendor cakephp cakephp src console basecommand php home vagrant apps sandbox local vendor cakephp cakephp src console commandrunner php home vagrant apps sandbox local vendor cakephp cakephp src console commandrunner php home vagrant apps sandbox local bin cake php if no appcontroller is found inside the plugin it should silently accept the normal app appcontroller instead which most use anyway there is no need to have one per plugin by default the controller code generated is php namespace sandbox controller class exposeduserscontroller extends appcontroller and missing use app controller appcontroller here
1
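Each record above carries both the raw issue text and a lowercased, punctuation-free copy (e.g. the `appcontroller of plugin should be optional ...` line). The exact pipeline that produced that column is not shown anywhere in this dump, so the following is only a guess at it: lowercase, strip URLs, and keep alphabetic runs. The function name and regexes are assumptions, not part of the dataset.

```python
import re

def clean_issue_text(raw: str) -> str:
    """Approximate the lowercased 'cleaned text' column seen in each record.

    Assumed steps (inferred from comparing raw and cleaned columns):
    lowercase everything, drop URLs, then keep only alphabetic runs,
    joined by single spaces. Digits and punctuation disappear.
    """
    text = re.sub(r"https?://\S+", " ", raw.lower())  # remove URLs first
    return " ".join(re.findall(r"[a-z]+", text))      # keep alphabetic tokens
```

Note this is a sketch under those assumptions; the real preprocessing may also have stripped code fences or stack traces before tokenizing.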
46,627
13,055,949,516
IssuesEvent
2020-07-30 03:12:34
icecube-trac/tix2
https://api.github.com/repos/icecube-trac/tix2
opened
icetray-inspect - on OSX `icetray.dylib` is included as an inspectable project (Trac #1604)
Incomplete Migration Migrated from Trac defect infrastructure
Migrated from https://code.icecube.wisc.edu/ticket/1604 ```json { "status": "closed", "changetime": "2019-02-13T14:11:42", "description": "dumping `args` before `icetray-inspect` starts its magic:\n\n{{{\n~/i3/combo/build\n\u276f ./env-shell.sh /Users/nega/i3/combo/build/bin/icetray-inspect --sphinx --sphinx-references --all --no-params --title=\"IceTracy Quick Reference\" -o /Users/nega/i3/combo/build/sphinx_src/source/icetray_quick_reference.rst\n['astro', 'bayesian_priors', 'CascadeVariables', 'clast', 'clsim', 'cmc', 'coinc_twc', 'CoincSuite', 'common_variables', 'common_variables__direct_hits', 'common_variables__hit_multiplicity', 'common_variables__hit_statistics', 'common_variables__track_characteristics', 'CommonVariables', 'coordinate_service', 'core_removal', 'corsika_reader', 'cramer_rao', 'credo', 'cscd_llh', 'daq_decode', 'dataclasses', 'dataio', 'DeepCore_Filter', 'diplopia', 'dipolefit', 'DOMLauncher', 'DomTools', 'double_muon', 'dst', 'earthmodel_service', 'examples_simulation', 'fill_ratio', 'filter_tools', 'filterscripts', 'finiteReco', 'frame_object_diff', 'full_event_followup', 'g4_tankresponse', 'gulliver', 'gulliver_modules', 'hdfwriter', 'HiveSplitter', 'i3math', 'IceHive', 'icepick', 'icetray', 'icetray.dylib', 'icetray_test', 'improvedLinefit', 'interfaces', 'ipdf', 'KalmanFilter', 'level3_filter_cascade', 'level3_filter_lowen', 'level3_filter_muon', 'lilliput', 'linefit', 'mue', 'MuonGun', 'neutrino_generator', 'NoiseEngine', 'paraboloid', 'payload_parsing', 'photonics_service', 'photospline', 'phys_services', 'portia', 'ppc', 'ppc_eff', 'ppc_pick', 'PROPOSAL', 'recclasses', 'rootwriter', 'shield', 'shovelart', 'shovelio', 'sim_services', 'simclasses', 'simprod', 'SLOPtools', 'smallshower_filter', 'spline_reco', 'static_twc', 'steamshovel', 'STTools', 'tableio', 'tensor_of_inertia', 'test_unregistered', 'topeventcleaning', 'TopologicalSplitter', 'toprec', 'topsimulator', 'tpx', 'trigger_sim', 'trigger_splitter', 'truncated_energy', 
'VHESelfVeto', 'vuvuzela', 'WaveCalibrator', 'wavereform', 'weighting', 'wimpsim_reader', 'xppc']\n}}}\n\n`icetray-inspect` then proceeds to fail:\n{{{\nicepick\nLoading icepick...........................................ok\nicetray\nicetray.dylib\n*** Failure loading 'libicetray.dylib'.\n*** Load external libraries without an extension.\n*** e.g. please omit '.dylib' or '.so'.\n\n}}}\n", "reporter": "nega", "cc": "", "resolution": "fixed", "_ts": "1550067102580394", "component": "infrastructure", "summary": "icetray-inspect - on OSX `icetray.dylib` is included as an inspectable project", "priority": "major", "keywords": "documentation", "time": "2016-03-28T17:34:57", "milestone": "", "owner": "kjmeagher", "type": "defect" } ```
1.0
icetray-inspect - on OSX `icetray.dylib` is included as an inspectable project (Trac #1604) - Migrated from https://code.icecube.wisc.edu/ticket/1604 ```json { "status": "closed", "changetime": "2019-02-13T14:11:42", "description": "dumping `args` before `icetray-inspect` starts its magic:\n\n{{{\n~/i3/combo/build\n\u276f ./env-shell.sh /Users/nega/i3/combo/build/bin/icetray-inspect --sphinx --sphinx-references --all --no-params --title=\"IceTracy Quick Reference\" -o /Users/nega/i3/combo/build/sphinx_src/source/icetray_quick_reference.rst\n['astro', 'bayesian_priors', 'CascadeVariables', 'clast', 'clsim', 'cmc', 'coinc_twc', 'CoincSuite', 'common_variables', 'common_variables__direct_hits', 'common_variables__hit_multiplicity', 'common_variables__hit_statistics', 'common_variables__track_characteristics', 'CommonVariables', 'coordinate_service', 'core_removal', 'corsika_reader', 'cramer_rao', 'credo', 'cscd_llh', 'daq_decode', 'dataclasses', 'dataio', 'DeepCore_Filter', 'diplopia', 'dipolefit', 'DOMLauncher', 'DomTools', 'double_muon', 'dst', 'earthmodel_service', 'examples_simulation', 'fill_ratio', 'filter_tools', 'filterscripts', 'finiteReco', 'frame_object_diff', 'full_event_followup', 'g4_tankresponse', 'gulliver', 'gulliver_modules', 'hdfwriter', 'HiveSplitter', 'i3math', 'IceHive', 'icepick', 'icetray', 'icetray.dylib', 'icetray_test', 'improvedLinefit', 'interfaces', 'ipdf', 'KalmanFilter', 'level3_filter_cascade', 'level3_filter_lowen', 'level3_filter_muon', 'lilliput', 'linefit', 'mue', 'MuonGun', 'neutrino_generator', 'NoiseEngine', 'paraboloid', 'payload_parsing', 'photonics_service', 'photospline', 'phys_services', 'portia', 'ppc', 'ppc_eff', 'ppc_pick', 'PROPOSAL', 'recclasses', 'rootwriter', 'shield', 'shovelart', 'shovelio', 'sim_services', 'simclasses', 'simprod', 'SLOPtools', 'smallshower_filter', 'spline_reco', 'static_twc', 'steamshovel', 'STTools', 'tableio', 'tensor_of_inertia', 'test_unregistered', 'topeventcleaning', 'TopologicalSplitter', 
'toprec', 'topsimulator', 'tpx', 'trigger_sim', 'trigger_splitter', 'truncated_energy', 'VHESelfVeto', 'vuvuzela', 'WaveCalibrator', 'wavereform', 'weighting', 'wimpsim_reader', 'xppc']\n}}}\n\n`icetray-inspect` then proceeds to fail:\n{{{\nicepick\nLoading icepick...........................................ok\nicetray\nicetray.dylib\n*** Failure loading 'libicetray.dylib'.\n*** Load external libraries without an extension.\n*** e.g. please omit '.dylib' or '.so'.\n\n}}}\n", "reporter": "nega", "cc": "", "resolution": "fixed", "_ts": "1550067102580394", "component": "infrastructure", "summary": "icetray-inspect - on OSX `icetray.dylib` is included as an inspectable project", "priority": "major", "keywords": "documentation", "time": "2016-03-28T17:34:57", "milestone": "", "owner": "kjmeagher", "type": "defect" } ```
defect
icetray inspect on osx icetray dylib is included as an inspectable project trac migrated from json status closed changetime description dumping args before icetray inspect starts its magic n n n combo build n env shell sh users nega combo build bin icetray inspect sphinx sphinx references all no params title icetracy quick reference o users nega combo build sphinx src source icetray quick reference rst n n n n icetray inspect then proceeds to fail n nicepick nloading icepick ok nicetray nicetray dylib n failure loading libicetray dylib n load external libraries without an extension n e g please omit dylib or so n n n reporter nega cc resolution fixed ts component infrastructure summary icetray inspect on osx icetray dylib is included as an inspectable project priority major keywords documentation time milestone owner kjmeagher type defect
1
131,046
18,214,669,637
IssuesEvent
2021-09-30 01:40:53
catalpainternational/canoe
https://api.github.com/repos/catalpainternational/canoe
opened
Bero: visual assets for content development
Design
### Overview As part of the development of the visual system and assets for core Bero, we need to create a number of illustrations that will be used by content creators. We want to use the visual system of Bero while creating a catalogue of illustrations and graphics that will be used in lessons and courses. ### Concept A catalogue of visual assets that can be reused by content creators when preparing content for Bero. The initial usage will be for the 'How to create a great course on Bero' but later the visual assets should be accessible and available for other courses to be developed by Catalpa and Catalpa's partners. We want to use the same style and design direction that we are using for illustration and animations for Bero screens, keeping simplicity, a mid level of abstraction and a format that fits well in the content cards of Bero lessons and courses. For individual concepts that we aim to explore in each card, please see this spreadsheet - sheet 'Core Bero' rows 28 to 46 and sheet 'How to Bero course' ### Process - [ ] Create static illustration - [ ] Approve static illustration - [ ] Export static illustration and upload it to the Bero product drive - [ ] Inform Product lead (Juliana) ### Conditions of satisfaction (assets - ordered by priority - please follow this order when working on it) - [ ] Community card - [ ] Expected lesson duration card - [ ] How Bero works - [ ] Download content (screenshot of app UI) - [ ] Explore the app (screenshot of app UI) - [ ] Social learning (screenshot of app UI) - [ ] Quiz/question - [ ] Talk to your team - [ ] Taking notes - [ ] Group activities - [ ] Learning objectives - [ ] Grading language level - [ ] Mobile learning - [ ] Screen size - [ ] Mobile first - [ ] Mix the media - [ ] References card - [ ] Glossary card - [ ] Summary card - [ ] 5 E's IBL Cycle - [ ] The learning pyramid ### Resources - [Briefing and creative 
direction](https://docs.google.com/document/d/1o1RvoAmWyBUoVeiwVZis5A9TSs9bdS2kT7h8EOUEj1o/edit#) - [Check list Bero visual assets](https://docs.google.com/spreadsheets/d/1qurBq1GE5VbYy1jnH34uXtWwC7R9enhm1T0-_BN_1ZE/edit#gid=1579069067) - [Screen UI - example of type of content card the visual assets will be on](https://kumulheltskul.com/#/site/kumul-helt-skul/learn/emergency-department/welcome/navigating-the-essential-emergency-care-systems-training-program:content:1) - [Screen mockup - Core DS](https://projects.invisionapp.com/d/main?origin=v7#/console/20251865/430145918/preview?scrollOffset=5148.75)
1.0
Bero: visual assets for content development - ### Overview As part of the development of the visual system and assets for core Bero, we need to create a number of illustrations that will be used by content creators. We want to use the visual system of Bero while creating a catalogue of illustrations and graphics that will be used in lessons and courses. ### Concept A catalogue of visual assets that can be reused by content creators when preparing content for Bero. The initial usage will be for the 'How to create a great course on Bero' but later the visual assets should be accessible and available for other courses to be developed by Catalpa and Catalpa's partners. We want to use the same style and design direction that we are using for illustration and animations for Bero screens, keeping simplicity, a mid level of abstraction and a format that fits well in the content cards of Bero lessons and courses. For individual concepts that we aim to explore in each card, please see this spreadsheet - sheet 'Core Bero' rows 28 to 46 and sheet 'How to Bero course' ### Process - [ ] Create static illustration - [ ] Approve static illustration - [ ] Export static illustration and upload it to the Bero product drive - [ ] Inform Product lead (Juliana) ### Conditions of satisfaction (assets - ordered by priority - please follow this order when working on it) - [ ] Community card - [ ] Expected lesson duration card - [ ] How Bero works - [ ] Download content (screenshot of app UI) - [ ] Explore the app (screenshot of app UI) - [ ] Social learning (screenshot of app UI) - [ ] Quiz/question - [ ] Talk to your team - [ ] Taking notes - [ ] Group activities - [ ] Learning objectives - [ ] Grading language level - [ ] Mobile learning - [ ] Screen size - [ ] Mobile first - [ ] Mix the media - [ ] References card - [ ] Glossary card - [ ] Summary card - [ ] 5 E's IBL Cycle - [ ] The learning pyramid ### Resources - [Briefing and creative 
direction](https://docs.google.com/document/d/1o1RvoAmWyBUoVeiwVZis5A9TSs9bdS2kT7h8EOUEj1o/edit#) - [Check list Bero visual assets](https://docs.google.com/spreadsheets/d/1qurBq1GE5VbYy1jnH34uXtWwC7R9enhm1T0-_BN_1ZE/edit#gid=1579069067) - [Screen UI - example of type of content card the visual assets will be on](https://kumulheltskul.com/#/site/kumul-helt-skul/learn/emergency-department/welcome/navigating-the-essential-emergency-care-systems-training-program:content:1) - [Screen mockup - Core DS](https://projects.invisionapp.com/d/main?origin=v7#/console/20251865/430145918/preview?scrollOffset=5148.75)
non_defect
bero visual assets for content development overview as part of the development of the visual system and assets for core bero we need to create a number of illustrations that will be used by content creators we want to use the visual system of bero while creating a catalogue of illustrations and graphics that will be used in lessons and courses concept a catalogue of visual assets that can be reused by content creators when preparing content for bero the initial usage will be for the how to create a great course on bero but later the visual assets should be accessible and available for other courses to be developed by catalpa and catalpa s partners we want to use the same style and design direction that we are using for illustration and animations for bero screens keeping simplicity a mid level of abstraction and a format that fits well in the content cards of bero lessons and courses for individual concepts that we aim to explore in each card please see this spreadsheet sheet core bero rows to and sheet how to bero course process create static illustration approve static illustration export static illustration and upload it to the bero product drive inform product lead juliana conditions of satisfaction assets ordered by priority please follow this order when working on it community card expected lesson duration card how bero works download content screenshot of app ui explore the app screenshot of app ui social learning screenshot of app ui quiz question talk to your team taking notes group activities learning objectives grading language level mobile learning screen size mobile first mix the media references card glossary card summary card e s ibl cycle the learning pyramid resources
0
40,320
9,955,805,161
IssuesEvent
2019-07-05 12:11:38
contao/contao
https://api.github.com/repos/contao/contao
opened
Use the "resname" attribute in .xliff files
defect
As described in the [Xliff wiki](https://wiki.oasis-open.org/xliff/FAQ#IdvsResname), the original identifier of a text item should be stored in the `resname` attribute of the `<trans-unit>` tag. We are currently (ab)using the `id` attribute, which is why the Contao .xliff files are not compatible with the Symfony translator (see https://github.com/symfony/symfony/issues/12241). We should adjust our Xliff loader to handle the attribute and update our .xliff files accordingly.
1.0
Use the "resname" attribute in .xliff files - As described in the [Xliff wiki](https://wiki.oasis-open.org/xliff/FAQ#IdvsResname), the original identifier of a text item should be stored in the `resname` attribute of the `<trans-unit>` tag. We are currently (ab)using the `id` attribute, which is why the Contao .xliff files are not compatible with the Symfony translator (see https://github.com/symfony/symfony/issues/12241). We should adjust our Xliff loader to handle the attribute and update our .xliff files accordingly.
defect
use the resname attribute in xliff files as described in the the original identifier of a text item should be stored in the resname attribute of the tag we are currently ab using the id attribute which is why the contao xliff files are not compatible with the symfony translator see we should adjust our xliff loader to handle the attribute and update our xliff files accordingly
1
48,178
13,067,497,870
IssuesEvent
2020-07-31 00:39:07
icecube-trac/tix2
https://api.github.com/repos/icecube-trac/tix2
closed
[cmake] python3 and the shebang (Trac #1912)
Migrated from Trac cmake defect
If you build cmake with a python binary that isn't called `python`, such as `python3`, then `icetray-inspect` fails. Likely `dataio-pyshovel` and other python scripts fail too. I guess the solution is to modify the shebang of all python executables as a cmake step? Something like: https://github.com/ros/catkin/pull/574/files Migrated from https://code.icecube.wisc.edu/ticket/1912 ```json { "status": "closed", "changetime": "2019-09-18T05:50:20", "description": "If you build cmake with a python binary that isn't called `python`, such as `python3`, then `icetray-inspect` fails. Likely `dataio-pyshovel` and other python scripts fail too.\n\nI guess the solution is to modify the shebang of all python executables as a cmake step? Something like:\nhttps://github.com/ros/catkin/pull/574/files", "reporter": "david.schultz", "cc": "kjmeagher", "resolution": "worksforme", "_ts": "1568785820929050", "component": "cmake", "summary": "[cmake] python3 and the shebang", "priority": "normal", "keywords": "", "time": "2016-11-17T22:54:14", "milestone": "Long-Term Future", "owner": "nega", "type": "defect" } ```
1.0
[cmake] python3 and the shebang (Trac #1912) - If you build cmake with a python binary that isn't called `python`, such as `python3`, then `icetray-inspect` fails. Likely `dataio-pyshovel` and other python scripts fail too. I guess the solution is to modify the shebang of all python executables as a cmake step? Something like: https://github.com/ros/catkin/pull/574/files Migrated from https://code.icecube.wisc.edu/ticket/1912 ```json { "status": "closed", "changetime": "2019-09-18T05:50:20", "description": "If you build cmake with a python binary that isn't called `python`, such as `python3`, then `icetray-inspect` fails. Likely `dataio-pyshovel` and other python scripts fail too.\n\nI guess the solution is to modify the shebang of all python executables as a cmake step? Something like:\nhttps://github.com/ros/catkin/pull/574/files", "reporter": "david.schultz", "cc": "kjmeagher", "resolution": "worksforme", "_ts": "1568785820929050", "component": "cmake", "summary": "[cmake] python3 and the shebang", "priority": "normal", "keywords": "", "time": "2016-11-17T22:54:14", "milestone": "Long-Term Future", "owner": "nega", "type": "defect" } ```
defect
and the shebang trac if you build cmake with a python binary that isn t called python such as then icetray inspect fails likely dataio pyshovel and other python scripts fail too i guess the solution is to modify the shebang of all python executables as a cmake step something like migrated from json status closed changetime description if you build cmake with a python binary that isn t called python such as then icetray inspect fails likely dataio pyshovel and other python scripts fail too n ni guess the solution is to modify the shebang of all python executables as a cmake step something like n reporter david schultz cc kjmeagher resolution worksforme ts component cmake summary and the shebang priority normal keywords time milestone long term future owner nega type defect
1
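The cmake ticket above suggests rewriting the shebang of Python executables as a build step (similar to the linked catkin patch). A minimal sketch of that idea, assuming a post-build script is acceptable; this is illustrative only and is not the fix that was actually applied (the ticket was closed "worksforme"):

```python
import re
from pathlib import Path

def rewrite_shebang(path: Path, interpreter: str = "python3") -> bool:
    """Rewrite a leading '#!...python' shebang line to use the given
    interpreter via /usr/bin/env. Returns True if the file was changed.

    Hypothetical helper: the function name and env-based replacement are
    assumptions, not part of the icetray build system.
    """
    text = path.read_text()
    new, n = re.subn(r"\A#!.*\bpython[\d.]*\b",
                     f"#!/usr/bin/env {interpreter}", text, count=1)
    if n:
        path.write_text(new)
    return bool(n)
```

In a real build this would be run over every installed script whose first line invokes `python`, leaving non-Python executables untouched.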
78,253
27,393,886,205
IssuesEvent
2023-02-28 18:07:19
primefaces-extensions/primefaces-extensions
https://api.github.com/repos/primefaces-extensions/primefaces-extensions
closed
InputPhone: GEO IP lookup showcase not working?
defect
### Describe the bug See: https://www.primefaces.org/showcase-ext/sections/inputphone/geoIpLookup.jsf ![image](https://user-images.githubusercontent.com/4399574/221915899-3791bf15-fbdf-41db-b8aa-2bd24719263c.png) ### Reproducer _No response_ ### Expected behavior _No response_ ### PrimeFaces Extensions version 12.0.4 ### JSF implementation MyFaces ### JSF version 2.3 ### Browser(s) _No response_
1.0
InputPhone: GEO IP lookup showcase not working? - ### Describe the bug See: https://www.primefaces.org/showcase-ext/sections/inputphone/geoIpLookup.jsf ![image](https://user-images.githubusercontent.com/4399574/221915899-3791bf15-fbdf-41db-b8aa-2bd24719263c.png) ### Reproducer _No response_ ### Expected behavior _No response_ ### PrimeFaces Extensions version 12.0.4 ### JSF implementation MyFaces ### JSF version 2.3 ### Browser(s) _No response_
defect
inputphone geo ip lookup showcase not working describe the bug see reproducer no response expected behavior no response primefaces extensions version jsf implementation myfaces jsf version browser s no response
1
137,638
12,768,352,919
IssuesEvent
2020-06-30 00:10:06
zio/zio-web
https://api.github.com/repos/zio/zio-web
closed
Encode OpenAPI Specification
documentation enhancement
The services of zio-http should have a type safe descriptions that are seperate from implementation and can be used to generate documentation in common standards. OpenAPI is the gold standard for REST APIs and should be fully supported. As a first step a scala representation of the OpenApi Specification should be implemented. /cc @mijicd
1.0
Encode OpenAPI Specification - The services of zio-http should have a type safe descriptions that are seperate from implementation and can be used to generate documentation in common standards. OpenAPI is the gold standard for REST APIs and should be fully supported. As a first step a scala representation of the OpenApi Specification should be implemented. /cc @mijicd
non_defect
encode openapi specification the services of zio http should have a type safe descriptions that are seperate from implementation and can be used to generate documentation in common standards openapi is the gold standard for rest apis and should be fully supported as a first step a scala representation of the openapi specification should be implemented cc mijicd
0
67,223
20,961,585,556
IssuesEvent
2022-03-27 21:44:15
abedmaatalla/foursquared
https://api.github.com/repos/abedmaatalla/foursquared
closed
places problems
Priority-Medium Type-Defect auto-migrated
``` im having problems with my places page on my droid when i go to this page it tells me a surprising new problem has occurred any way this can be fixed would be great!! ``` Original issue reported on code.google.com by `farmvill...@yahoo.com` on 28 Jul 2010 at 3:41
1.0
places problems - ``` im having problems with my places page on my droid when i go to this page it tells me a surprising new problem has occurred any way this can be fixed would be great!! ``` Original issue reported on code.google.com by `farmvill...@yahoo.com` on 28 Jul 2010 at 3:41
defect
places problems im having problems with my places page on my droid when i go to this page it tells me a surprising new problem has occurred any way this can be fixed would be great original issue reported on code google com by farmvill yahoo com on jul at
1
1,156
2,532,101,256
IssuesEvent
2015-01-23 13:44:19
Financial-Times/o-overlay
https://api.github.com/repos/Financial-Times/o-overlay
closed
Cross icon
design
Currently is an image. Agreed today to change to an font character icon. @AlbertoElias
1.0
Cross icon - Currently is an image. Agreed today to change to an font character icon. @AlbertoElias
non_defect
cross icon currently is an image agreed today to change to an font character icon albertoelias
0
69,317
9,299,251,723
IssuesEvent
2019-03-23 01:26:07
Roxxers/roxbot
https://api.github.com/repos/Roxxers/roxbot
closed
Documentation Needs Finishing
documentation enhancement
Documentation is unfinished. - [ ] Missing Installing for other UNIX systems and Windows. - [ ] Needs tutorial on how to write cog and configuring Roxbot. - [ ] Missing command docs for 8 Ball command
1.0
Documentation Needs Finishing - Documentation is unfinished. - [ ] Missing Installing for other UNIX systems and Windows. - [ ] Needs tutorial on how to write cog and configuring Roxbot. - [ ] Missing command docs for 8 Ball command
non_defect
documentation needs finishing documentation is unfinished missing installing for other unix systems and windows needs tutorial on how to write cog and configuring roxbot missing command docs for ball command
0
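The records in this dump repeat a fixed cycle of fields (row index, event id, event type, timestamp, repo, repo URL, action, title, labels, body, score, combined text, label, cleaned text, binary label). A minimal parser sketch under that assumption; the field names are guesses from the header, and wrapped multi-line bodies would need to be re-joined before grouping:

```python
# Assumed column order, inferred from the file header; not authoritative.
FIELDS = ["row_index", "event_id", "event_type", "created_at", "repo",
          "repo_url", "action", "title", "labels", "body", "score",
          "text_combine", "label", "cleaned_text", "binary_label"]

def parse_records(lines):
    """Group a flat stream of field lines into dicts, one per record.

    Assumes each record spans exactly len(FIELDS) physical lines; bodies
    that wrapped onto extra lines must be merged back first.
    """
    step = len(FIELDS)
    for i in range(0, len(lines) - step + 1, step):
        yield dict(zip(FIELDS, lines[i:i + step]))
```

This is only a reading aid for the dump's layout, not the loader used to build the dataset.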
39,070
9,197,243,375
IssuesEvent
2019-03-07 09:31:53
netty/netty
https://api.github.com/repos/netty/netty
closed
HttpContentDecompressor does not wait until the whole GZIP member is read before issuing a message
defect
A minimal example project reproducing this bug can be found here: https://github.com/andreisilviudragnea/reactor-webclient-netty-bug The Spring `WebClient` sometimes hangs indefinitely because of a possible bug in the [`reactor-netty`](https://github.com/reactor/reactor-netty) and/or [`netty`](https://github.com/netty/netty) projects. The root of this bug is that the `channel.read()` call in [`FluxReceive.drainReceiver()`](https://github.com/reactor/reactor-netty/blob/master/src/main/java/reactor/netty/channel/FluxReceive.java#L237) is sometimes not issued enough times for the Netty channel to finish reading the message. This happens when `receiverFastpath == false`, which is the case when [`Jackson2JsonDecoder`](https://github.com/spring-projects/spring-framework/blob/master/spring-web/src/main/java/org/springframework/http/codec/json/Jackson2JsonDecoder.java) is used by Spring's `WebClient`. The `receiverFastpath == false` condition is a consequence of the Reactor [`flatMap`](https://github.com/reactor/reactor-core/blob/master/reactor-core/src/main/java/reactor/core/publisher/Flux.java#L4823) operator from [`Jackson2Tokenizer`](https://github.com/spring-projects/spring-framework/blob/master/spring-web/src/main/java/org/springframework/http/codec/json/Jackson2Tokenizer.java#L86), which requests at most `maxConcurrency` (with a default of `32`) elements from the upstream publisher. This bug is very hard to reproduce because it happens when gzip encoding is used on a very small response. From what I understand, it happens because gzipping a very small response (like `[]`) expands the response size significantly and because the [`HttpContentDecompressor`](https://github.com/netty/netty/blob/4.1/codec-http/src/main/java/io/netty/handler/codec/http/HttpContentDecompressor.java) issues the gzipped response before the corresponding CRC is received. 
The `channel.read()` signals get issued after the `HttpContentDecompressor` sends the message up the `ChannelPipeline`, but if the reads from the socket are "too chunked", there won't be another chance for the `HttpContentDecompressor` to emit a `LastHttpContent`. I have attached an example `WebClient` and a simple TCP server to reproduce this bug. I know that the example seems contrived, but I tried to reproduce this bug in the most simple way possible that I could find. In our setup, the request which hangs is made against an `https` endpoint, so the Netty `SslHandler` also processes the response in the pipeline, before `HttpContentDecompressor`, which made the bug even harder to track and reproduce. The Netty Nio event loop thread which processes the response gets stuck in the [`NioEventLoop`](https://github.com/netty/netty/blob/4.1/transport/src/main/java/io/netty/channel/nio/NioEventLoop.java#L423) indefinitely in this case, from what I have tested so far. I do not know yet where the problem lies, but I believe that maybe if the `HttpContentDecompressor` would issue `channel.read()` operations until in consumes a whole gzip member, before sending the decompressed data up the pipeline, maybe the pipeline would not get stuck. However, I believe that this problem can still occur even in other circumstances where the reads from the socket return many small chunks. ### Netty version 4.1.33.Final ### JVM version (e.g. `java -version`) java version "1.8.0_192" Java(TM) SE Runtime Environment (build 1.8.0_192-b12) Java HotSpot(TM) 64-Bit Server VM (build 25.192-b12, mixed mode) ### OS version (e.g. `uname -a`) Darwin macbook-pro-5.eur.adobe.com 18.2.0 Darwin Kernel Version 18.2.0: Thu Dec 20 20:46:53 PST 2018; root:xnu-4903.241.1~1/RELEASE_X86_64 x86_64
1.0
HttpContentDecompressor does not wait until the whole GZIP member is read before issuing a message - A minimal example project reproducing this bug can be found here: https://github.com/andreisilviudragnea/reactor-webclient-netty-bug The Spring `WebClient` sometimes hangs indefinitely because of a possible bug in the [`reactor-netty`](https://github.com/reactor/reactor-netty) and/or [`netty`](https://github.com/netty/netty) projects. The root of this bug is that the `channel.read()` call in [`FluxReceive.drainReceiver()`](https://github.com/reactor/reactor-netty/blob/master/src/main/java/reactor/netty/channel/FluxReceive.java#L237) is sometimes not issued enough times for the Netty channel to finish reading the message. This happens when `receiverFastpath == false`, which is the case when [`Jackson2JsonDecoder`](https://github.com/spring-projects/spring-framework/blob/master/spring-web/src/main/java/org/springframework/http/codec/json/Jackson2JsonDecoder.java) is used by Spring's `WebClient`. The `receiverFastpath == false` condition is a consequence of the Reactor [`flatMap`](https://github.com/reactor/reactor-core/blob/master/reactor-core/src/main/java/reactor/core/publisher/Flux.java#L4823) operator from [`Jackson2Tokenizer`](https://github.com/spring-projects/spring-framework/blob/master/spring-web/src/main/java/org/springframework/http/codec/json/Jackson2Tokenizer.java#L86), which requests at most `maxConcurrency` (with a default of `32`) elements from the upstream publisher. This bug is very hard to reproduce because it happens when gzip encoding is used on a very small response. From what I understand, it happens because gzipping a very small response (like `[]`) expands the response size significantly and because the [`HttpContentDecompressor`](https://github.com/netty/netty/blob/4.1/codec-http/src/main/java/io/netty/handler/codec/http/HttpContentDecompressor.java) issues the gzipped response before the corresponding CRC is received. 
The `channel.read()` signals get issued after the `HttpContentDecompressor` sends the message up the `ChannelPipeline`, but if the reads from the socket are "too chunked", there won't be another chance for the `HttpContentDecompressor` to emit a `LastHttpContent`. I have attached an example `WebClient` and a simple TCP server to reproduce this bug. I know that the example seems contrived, but I tried to reproduce this bug in the most simple way possible that I could find. In our setup, the request which hangs is made against an `https` endpoint, so the Netty `SslHandler` also processes the response in the pipeline, before `HttpContentDecompressor`, which made the bug even harder to track and reproduce. The Netty Nio event loop thread which processes the response gets stuck in the [`NioEventLoop`](https://github.com/netty/netty/blob/4.1/transport/src/main/java/io/netty/channel/nio/NioEventLoop.java#L423) indefinitely in this case, from what I have tested so far. I do not know yet where the problem lies, but I believe that maybe if the `HttpContentDecompressor` would issue `channel.read()` operations until it consumes a whole gzip member, before sending the decompressed data up the pipeline, maybe the pipeline would not get stuck. However, I believe that this problem can still occur even in other circumstances where the reads from the socket return many small chunks. ### Netty version 4.1.33.Final ### JVM version (e.g. `java -version`) java version "1.8.0_192" Java(TM) SE Runtime Environment (build 1.8.0_192-b12) Java HotSpot(TM) 64-Bit Server VM (build 25.192-b12, mixed mode) ### OS version (e.g. `uname -a`) Darwin macbook-pro-5.eur.adobe.com 18.2.0 Darwin Kernel Version 18.2.0: Thu Dec 20 20:46:53 PST 2018; root:xnu-4903.241.1~1/RELEASE_X86_64 x86_64
defect
httpcontentdecompressor does not wait until the whole gzip member is read before issuing a message a minimal example project reproducing this bug can be found here the spring webclient sometimes hangs indefinitely because of a possible bug in the and or projects the root of this bug is that the channel read call in is sometimes not issued enough times for the netty channel to finish reading the message this happens when receiverfastpath false which is the case when is used by spring s webclient the receiverfastpath false condition is a consequence of the reactor operator from which requests at most maxconcurrency with a default of elements from the upstream publisher this bug is very hard to reproduce because it happens when gzip encoding is used on a very small response from what i understand it happens because gzipping a very small response like expands the response size significantly and because the issues the gzipped response before the corresponding crc is received the channel read signals get issued after the httpcontentdecompressor sends the message up the channelpipeline but if the reads from the socket are too chunked there won t be another chance for the httpcontentdecompressor to emit a lasthttpcontent i have attached an example webclient and a simple tcp server to reproduce this bug i know that the example seems contrived but i tried to reproduce this bug in the most simple way possible that i could find in our setup the request which hangs is made against an https endpoint so the netty sslhandler also processes the response in the pipeline before httpcontentdecompressor which made the bug even harder to track and reproduce the netty nio event loop thread which processes the response gets stuck in the indefinitely in this case from what i have tested so far i do not know yet where the problem lies but i believe that maybe if the httpcontentdecompressor would issue channel read operations until it consumes a whole gzip member before sending the 
decompressed data up the pipeline maybe the pipeline would not get stuck however i believe that this problem can still occur even in other circumstances where the reads from the socket return many small chunks netty version final jvm version e g java version java version java tm se runtime environment build java hotspot tm bit server vm build mixed mode os version e g uname a darwin macbook pro eur adobe com darwin kernel version thu dec pst root xnu release
1
292,619
22,032,672,464
IssuesEvent
2022-05-28 04:41:30
AllTheMods/ATM-7
https://api.github.com/repos/AllTheMods/ATM-7
closed
Add extra lore to ATM ores to avoid confusion
documentation enhancement
They should add this under the ATM ore's lore. "Cannot be quarried or mined automatically." I've seen many people get confused and try to mine these ores using Digital Miners, Quarries, Builders, etc. And obviously come up with nothing. It'd save the time of the player and the people of #atm7-help if this was made clear on the ore itself.
1.0
Add extra lore to ATM ores to avoid confusion - They should add this under the ATM ore's lore. "Cannot be quarried or mined automatically." I've seen many people get confused and try to mine these ores using Digital Miners, Quarries, Builders, etc. And obviously come up with nothing. It'd save the time of the player and the people of #atm7-help if this was made clear on the ore itself.
non_defect
add extra lore to atm ores to avoid confusion they should add this under the atm ore s lore cannot be quarried or mined automatically ive seen many people get confused and try to mine these ores using digital miners quarries builders etc and obviously come up with nothing it d save the time of the player and the people of help if this was made clear on the ore itself
0
43,147
11,500,505,020
IssuesEvent
2020-02-12 15:40:19
ProcessMaker/modeler
https://api.github.com/repos/ProcessMaker/modeler
closed
Remove the word "New" from all default element names
Defect
The following nodes have the word "New" in their default names, and their names should be updated to omit that word: * All Tasks except for the base (user) task. * All Gateways * Pools * Text Annotations
1.0
Remove the word "New" from all default element names - The following nodes have the word "New" in their default names, and their names should be updated to omit that word: * All Tasks except for the base (user) task. * All Gateways * Pools * Text Annotations
defect
remove the word new from all default element names the following nodes have the word new in their default names and their names should be updated to omit that word all tasks except for the base user task all gateways pools text annotations
1
305,641
23,124,273,185
IssuesEvent
2022-07-28 02:50:25
microsoft/vscode-cordova
https://api.github.com/repos/microsoft/vscode-cordova
closed
Rename parameter runtimeArgs to chromeArgs and expose it to public
enhancement documentation copiedtobacklog
Hello, This is not an issue but a fix or feature I need a way to add parameter to chrome when run Serve to the browser. To handle CORS exception for example. I make a fix : I add this line 1373 in cordovaDebugAdapter.ts : chromeArgs.push(...args.chromeArgs); just after the chromeArgs object initialization. And in my launch.json i'm now able to push some arguments to chrome. "chromeArgs": ["--disable-web-security", "--ignore-certificate-errors 'http://127.0.0.1:8100'", "--remote-debugging-port=9222"],
1.0
Rename parameter runtimeArgs to chromeArgs and expose it to public - Hello, This is not an issue but a fix or feature I need a way to add parameter to chrome when run Serve to the browser. To handle CORS exception for example. I make a fix : I add this line 1373 in cordovaDebugAdapter.ts : chromeArgs.push(...args.chromeArgs); just after the chromeArgs object initialization. And in my launch.json i'm now able to push some arguments to chrome. "chromeArgs": ["--disable-web-security", "--ignore-certificate-errors 'http://127.0.0.1:8100'", "--remote-debugging-port=9222"],
non_defect
rename parameter runtimeargs to chromeargs and expose it to public hello this is not an issue but a fix or feature i need a way to add parameter to chrome when run serve to the browser to handle cors exception for example i make a fix i add this line in cordovadebugadapter ts chromeargs push args chromeargs just after the chromeargs object initialization and in my launch json i m now able to push some arguments to chrome chromeargs
0
144,588
19,292,271,616
IssuesEvent
2021-12-12 01:21:23
ikonovalov/spring-distributed-tracing
https://api.github.com/repos/ikonovalov/spring-distributed-tracing
opened
CVE-2021-43797 (Medium) detected in netty-codec-http-4.1.34.Final.jar
security vulnerability
## CVE-2021-43797 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http-4.1.34.Final.jar</b></p></summary> <p>Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.</p> <p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p> <p>Path to dependency file: /spring-distributed-tracing/admin-server/pom.xml</p> <p>Path to vulnerable library: /root/.m2/repository/io/netty/netty-codec-http/4.1.34.Final/netty-codec-http-4.1.34.Final.jar</p> <p> Dependency Hierarchy: - spring-boot-admin-starter-server-2.1.4.jar (Root Library) - spring-boot-admin-server-2.1.4.jar - spring-boot-starter-webflux-2.1.4.RELEASE.jar - spring-boot-starter-reactor-netty-2.1.4.RELEASE.jar - reactor-netty-0.8.6.RELEASE.jar - :x: **netty-codec-http-4.1.34.Final.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers & clients. Netty prior to version 4.1.7.1.Final skips control chars when they are present at the beginning / end of the header name. It should instead fail fast as these are not allowed by the spec and could lead to HTTP request smuggling. Failing to do the validation might cause netty to "sanitize" header names before it forward these to another remote system when used as proxy. This remote system can't see the invalid usage anymore, and therefore does not do the validation itself. Users should upgrade to version 4.1.7.1.Final to receive a patch. 
<p>Publish Date: 2021-12-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43797>CVE-2021-43797</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-wx5j-54mm-rqqq">https://github.com/advisories/GHSA-wx5j-54mm-rqqq</a></p> <p>Release Date: 2021-12-09</p> <p>Fix Resolution: io.netty:netty-codec-http:4.1.71.Final</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-43797 (Medium) detected in netty-codec-http-4.1.34.Final.jar - ## CVE-2021-43797 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http-4.1.34.Final.jar</b></p></summary> <p>Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.</p> <p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p> <p>Path to dependency file: /spring-distributed-tracing/admin-server/pom.xml</p> <p>Path to vulnerable library: /root/.m2/repository/io/netty/netty-codec-http/4.1.34.Final/netty-codec-http-4.1.34.Final.jar</p> <p> Dependency Hierarchy: - spring-boot-admin-starter-server-2.1.4.jar (Root Library) - spring-boot-admin-server-2.1.4.jar - spring-boot-starter-webflux-2.1.4.RELEASE.jar - spring-boot-starter-reactor-netty-2.1.4.RELEASE.jar - reactor-netty-0.8.6.RELEASE.jar - :x: **netty-codec-http-4.1.34.Final.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers & clients. Netty prior to version 4.1.7.1.Final skips control chars when they are present at the beginning / end of the header name. It should instead fail fast as these are not allowed by the spec and could lead to HTTP request smuggling. Failing to do the validation might cause netty to "sanitize" header names before it forward these to another remote system when used as proxy. This remote system can't see the invalid usage anymore, and therefore does not do the validation itself. Users should upgrade to version 4.1.7.1.Final to receive a patch. 
<p>Publish Date: 2021-12-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43797>CVE-2021-43797</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-wx5j-54mm-rqqq">https://github.com/advisories/GHSA-wx5j-54mm-rqqq</a></p> <p>Release Date: 2021-12-09</p> <p>Fix Resolution: io.netty:netty-codec-http:4.1.71.Final</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_defect
cve medium detected in netty codec http final jar cve medium severity vulnerability vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file spring distributed tracing admin server pom xml path to vulnerable library root repository io netty netty codec http final netty codec http final jar dependency hierarchy spring boot admin starter server jar root library spring boot admin server jar spring boot starter webflux release jar spring boot starter reactor netty release jar reactor netty release jar x netty codec http final jar vulnerable library vulnerability details netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers clients netty prior to version final skips control chars when they are present at the beginning end of the header name it should instead fail fast as these are not allowed by the spec and could lead to http request smuggling failing to do the validation might cause netty to sanitize header names before it forward these to another remote system when used as proxy this remote system can t see the invalid usage anymore and therefore does not do the validation itself users should upgrade to version final to receive a patch publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty codec http final step up your open source security game with whitesource
0
655,997
21,716,420,475
IssuesEvent
2022-05-10 18:23:19
azgs/geomapmaker
https://api.github.com/repos/azgs/geomapmaker
closed
Validity checks on submit of map unit polys
wontfix high priority
We need to implement validity checks on map unit poly geometries (including that they do not overlap other polys) when they are submitted.
1.0
Validity checks on submit of map unit polys - We need to implement validity checks on map unit poly geometries (including that they do not overlap other polys) when they are submitted.
non_defect
validity checks on submit of map unit polys we need to implement validity checks on map unit poly geometries including that they do not overlap other polys when they are submitted
0
117,700
15,165,706,780
IssuesEvent
2021-02-12 15:24:30
department-of-veterans-affairs/va.gov-team
https://api.github.com/repos/department-of-veterans-affairs/va.gov-team
closed
[Design] Claims Status: Allow the Notification Letter to be accessed once the package has been sent via US Mail
Claim Status Tool design discovery vsa-benefits
## Issue Description This is coming from NCC and we had an idea that this would be an update to the existing tool. It looks like once we get confirmation that the letter has been sent, we would like to allow them to download it or view it (as PDF) within the tool. This ticket is to figure out where to place the letter in the design of the existing tool. Some goals: 1. Determine where in the UX and overall CST flow this letter should appear (in the decision details page, assumingly) 2. Determine whether it's just a download link or whether we display it. ## Acceptance Criteria - [x] Design showing the letter --- ## How to configure this issue - [x] **Attached to a Milestone** (when will this be completed?) - [x] **Attached to an Epic** (what body of work is this a part of?) - [x] **Labeled with Team** (`product support`, `analytics-insights`, `operations`, `service-design`, `tools-be`, `tools-fe`) - [x] **Labeled with Practice Area** (`backend`, `frontend`, `devops`, `design`, `research`, `product`, `ia`, `qa`, `analytics`, `contact center`, `research`, `accessibility`, `content`) - [x] **Labeled with Type** (`bug`, `request`, `discovery`, `documentation`, etc.)
1.0
[Design] Claims Status: Allow the Notification Letter to be accessed once the package has been sent via US Mail - ## Issue Description This is coming from NCC and we had an idea that this would be an update to the existing tool. It looks like once we get confirmation that the letter has been sent, we would like to allow them to download it or view it (as PDF) within the tool. This ticket is to figure out where to place the letter in the design of the existing tool. Some goals: 1. Determine where in the UX and overall CST flow this letter should appear (in the decision details page, assumingly) 2. Determine whether it's just a download link or whether we display it. ## Acceptance Criteria - [x] Design showing the letter --- ## How to configure this issue - [x] **Attached to a Milestone** (when will this be completed?) - [x] **Attached to an Epic** (what body of work is this a part of?) - [x] **Labeled with Team** (`product support`, `analytics-insights`, `operations`, `service-design`, `tools-be`, `tools-fe`) - [x] **Labeled with Practice Area** (`backend`, `frontend`, `devops`, `design`, `research`, `product`, `ia`, `qa`, `analytics`, `contact center`, `research`, `accessibility`, `content`) - [x] **Labeled with Type** (`bug`, `request`, `discovery`, `documentation`, etc.)
non_defect
claims status allow the notification letter to be accessed once the package has been sent via us mail issue description this is coming from ncc and we had an idea that this would be an update to the existing tool it looks like once we get confirmation that the letter has been sent we would like to allow them to download it or view it as pdf within the tool this ticket is to figure out where to place the letter in the design of the existing tool some goals determine where in the ux and overall cst flow this letter should appear in the decision details page assumingly determine whether it s just a download link or whether we display it acceptance criteria design showing the letter how to configure this issue attached to a milestone when will this be completed attached to an epic what body of work is this a part of labeled with team product support analytics insights operations service design tools be tools fe labeled with practice area backend frontend devops design research product ia qa analytics contact center research accessibility content labeled with type bug request discovery documentation etc
0
36,208
7,868,225,117
IssuesEvent
2018-06-23 18:46:12
StrikeNP/trac_test
https://api.github.com/repos/StrikeNP/trac_test
closed
What happened to cgils_s6 ?? (Trac #674)
Migrated from Trac clubb_src defect
**Introduction** As of r6818, `cgils_s6` and all later cases have failed to run in the nightly tests. It appears that the problem is a failure of the "Cholesky decomposition" routine. The following is printed several times for the `cgils_s6` case: ```text Cholesky_factor: leading minor of order 10 is not positive definite. factorization failed. a_input= 1.00 0.300 1.00 0.000E+00 0.000E+00 1.00 0.108 0.324E-01 0.408 1.00 0.673E-01 0.202E-01 0.000E+00 0.000E+00 1.00 0.721E-01 0.216E-01 0.781 0.189 0.000E+00 1.00 -0.961E-01-0.288E-01 0.528 0.475 0.000E+00 0.367 1.00 0.180E-01 0.541E-02 0.000E+00 0.000E+00 0.915 0.000E+00 0.000E+00 1.00 0.480E-01 0.144E-01 0.877 0.275 0.000E+00 0.963 0.575 0.000E+00 1.00 0.336 0.101 0.661 0.367 0.000E+00 0.516 0.824 0.000E+00 0.678 1.00 ... ``` Now, one might instantly wonder why all the rest of the cases fail as well. Shouldn't just the `cgils_s6` case fail? Well, we have the following line in `run_scm_all.bash`: ```text RESULT=`./run_scm.bash $OPTIONS ${RUN_CASE[$x]} 2>&1` ``` So, it seems that the entirety of the model diagnostic console output is stored in a single BASH variable. When we have the Cholesky error message printed a gazillion times (a fairly refined estimate), it seems that `./run_scm_all.bash` is failing to store all of it. In fact, this is the error that happens: ```text [1]+ Segmentation fault (core dumped) ./run_scm_all.bash --nightly > out.log 2>&1 ``` This would explain why the `./run_scm_all.bash` script immediately terminates!! **Technical spec** Several questions remain: 1. Why do we stuff all the output from `./run_scm.bash` into a single variable? It seems that we do this in order to check the final output message to make sure that CLUBB ran successfully. It seems like better practice to make use of a temporary file and "grep" instead. It is worth mentioning that this is really a bug in Bash. No robust program should "seg-fault" just because it received too much input. 
Rather, it should have printed an error message and gracefully quit. There isn't really much we can do about that. 2. It is kind of odd that a Cholesky matrix is generated from the correlation matrix regardless of whether or not SILHS is called. Should we only generate the Cholesky matrix if SILHS is called? 3. Any ideas as to why r6818 causes the Cholesky decomposition to fail for `cgils_s6` (and maybe later cases)? Attachments: [plot_explicit_ta_configs.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_explicit_ta_configs.maff) [plot_new_pdf_config_1_plot_2.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_new_pdf_config_1_plot_2.maff) [plot_combo_pdf_run_3.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_combo_pdf_run_3.maff) [plot_input_fields_rtp3_thlp3_1.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_input_fields_rtp3_thlp3_1.maff) [plot_new_pdf_20180522_test_1.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_new_pdf_20180522_test_1.maff) [plot_attempts_8_10.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_attempts_8_10.maff) [plot_attempt_8_only.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_attempt_8_only.maff) [plot_beta_1p3.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_beta_1p3.maff) [plot_beta_1p3_all.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_beta_1p3_all.maff) Migrated from http://carson.math.uwm.edu/trac/clubb/ticket/674 ```json { "status": "closed", "changetime": "2015-08-18T20:57:58", "description": "'''Introduction'''\n\nAs of r6818, `cgils_s6` and all later cases have failed to run in the nightly tests. 
It appears that the problem is a failure of the \"Cholesky decomposition\" routine. The following is printed several times for the `cgils_s6` case:\n{{{\n Cholesky_factor: leading minor of order 10 is not positive definite.\n factorization failed.\n a_input=\n 1.00 \n 0.300 1.00 \n 0.000E+00 0.000E+00 1.00 \n 0.108 0.324E-01 0.408 1.00 \n 0.673E-01 0.202E-01 0.000E+00 0.000E+00 1.00 \n 0.721E-01 0.216E-01 0.781 0.189 0.000E+00 1.00 \n-0.961E-01-0.288E-01 0.528 0.475 0.000E+00 0.367 1.00 \n 0.180E-01 0.541E-02 0.000E+00 0.000E+00 0.915 0.000E+00 0.000E+00 1.00 \n 0.480E-01 0.144E-01 0.877 0.275 0.000E+00 0.963 0.575 0.000E+00 1.00 \n 0.336 0.101 0.661 0.367 0.000E+00 0.516 0.824 0.000E+00 0.678 1.00\n... \n}}}\n\nNow, one might instantly wonder why all the rest of the cases fail as well. Shouldn't just the `cgils_s6` case fail? Well, we have the following line in `run_scm_all.bash`:\n{{{\nRESULT=`./run_scm.bash $OPTIONS ${RUN_CASE[$x]} 2>&1`\n}}}\n\nSo, it seems that the entirety of the model diagnostic console output is stored in a single BASH variable. When we have the Cholesky error message printed a gazillion times (a fairly refined estimate), it seems that `./run_scm_all.bash` is failing to store all of it. In fact, this is the error that happens:\n\n{{{\n[1]+ Segmentation fault (core dumped) ./run_scm_all.bash --nightly > out.log 2>&1\n}}}\n\nThis would explain why the `./run_scm_all.bash` script immediately terminates!!\n\n'''Technical spec'''\n\nSeveral questions remain:\n\n1. Why do we stuff all the output from `./run_scm.bash` into a single variable? It seems that we do this in order to check the final output message to make sure that CLUBB ran successfully. It seems like better practice to make use of a temporary file and \"grep\" instead.\n\n It is worth mentioning that this is really a bug in Bash. No robust program should \"seg-fault\" just because it received too much input. Rather, it should have printed an error message and gracefully quit. 
There isn't really much we can do about that.\n\n2. It is kind of odd that a Cholesky matrix is generated from the correlation matrix regardless of whether or not SILHS is called. Should we only generate the Cholesky matrix if SILHS is called?\n\n3. Any ideas as to why r6818 causes the Cholesky decomposition to fail for `cgils_s6` (and maybe later cases)?", "reporter": "raut@uwm.edu", "cc": "vlarson@uwm.edu, bmg2@uwm.edu", "resolution": "fixed", "_ts": "1439931478267592", "component": "clubb_src", "summary": "What happened to cgils_s6 ??", "priority": "critical", "keywords": "", "time": "2014-04-04T23:15:18", "milestone": "4. Fix bugs", "owner": "", "type": "defect" } ```
1.0
What happened to cgils_s6 ?? (Trac #674) - **Introduction** As of r6818, `cgils_s6` and all later cases have failed to run in the nightly tests. It appears that the problem is a failure of the "Cholesky decomposition" routine. The following is printed several times for the `cgils_s6` case: ```text Cholesky_factor: leading minor of order 10 is not positive definite. factorization failed. a_input= 1.00 0.300 1.00 0.000E+00 0.000E+00 1.00 0.108 0.324E-01 0.408 1.00 0.673E-01 0.202E-01 0.000E+00 0.000E+00 1.00 0.721E-01 0.216E-01 0.781 0.189 0.000E+00 1.00 -0.961E-01-0.288E-01 0.528 0.475 0.000E+00 0.367 1.00 0.180E-01 0.541E-02 0.000E+00 0.000E+00 0.915 0.000E+00 0.000E+00 1.00 0.480E-01 0.144E-01 0.877 0.275 0.000E+00 0.963 0.575 0.000E+00 1.00 0.336 0.101 0.661 0.367 0.000E+00 0.516 0.824 0.000E+00 0.678 1.00 ... ``` Now, one might instantly wonder why all the rest of the cases fail as well. Shouldn't just the `cgils_s6` case fail? Well, we have the following line in `run_scm_all.bash`: ```text RESULT=`./run_scm.bash $OPTIONS ${RUN_CASE[$x]} 2>&1` ``` So, it seems that the entirety of the model diagnostic console output is stored in a single BASH variable. When we have the Cholesky error message printed a gazillion times (a fairly refined estimate), it seems that `./run_scm_all.bash` is failing to store all of it. In fact, this is the error that happens: ```text [1]+ Segmentation fault (core dumped) ./run_scm_all.bash --nightly > out.log 2>&1 ``` This would explain why the `./run_scm_all.bash` script immediately terminates!! **Technical spec** Several questions remain: 1. Why do we stuff all the output from `./run_scm.bash` into a single variable? It seems that we do this in order to check the final output message to make sure that CLUBB ran successfully. It seems like better practice to make use of a temporary file and "grep" instead. It is worth mentioning that this is really a bug in Bash. 
No robust program should "seg-fault" just because it received too much input. Rather, it should have printed an error message and gracefully quit. There isn't really much we can do about that. 2. It is kind of odd that a Cholesky matrix is generated from the correlation matrix regardless of whether or not SILHS is called. Should we only generate the Cholesky matrix if SILHS is called? 3. Any ideas as to why r6818 causes the Cholesky decomposition to fail for `cgils_s6` (and maybe later cases)? Attachments: [plot_explicit_ta_configs.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_explicit_ta_configs.maff) [plot_new_pdf_config_1_plot_2.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_new_pdf_config_1_plot_2.maff) [plot_combo_pdf_run_3.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_combo_pdf_run_3.maff) [plot_input_fields_rtp3_thlp3_1.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_input_fields_rtp3_thlp3_1.maff) [plot_new_pdf_20180522_test_1.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_new_pdf_20180522_test_1.maff) [plot_attempts_8_10.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_attempts_8_10.maff) [plot_attempt_8_only.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_attempt_8_only.maff) [plot_beta_1p3.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_beta_1p3.maff) [plot_beta_1p3_all.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_beta_1p3_all.maff) Migrated from http://carson.math.uwm.edu/trac/clubb/ticket/674 ```json { "status": "closed", "changetime": "2015-08-18T20:57:58", "description": "'''Introduction'''\n\nAs of r6818, `cgils_s6` and all later cases have 
failed to run in the nightly tests. It appears that the problem is a failure of the \"Cholesky decomposition\" routine. The following is printed several times for the `cgils_s6` case:\n{{{\n Cholesky_factor: leading minor of order 10 is not positive definite.\n factorization failed.\n a_input=\n 1.00 \n 0.300 1.00 \n 0.000E+00 0.000E+00 1.00 \n 0.108 0.324E-01 0.408 1.00 \n 0.673E-01 0.202E-01 0.000E+00 0.000E+00 1.00 \n 0.721E-01 0.216E-01 0.781 0.189 0.000E+00 1.00 \n-0.961E-01-0.288E-01 0.528 0.475 0.000E+00 0.367 1.00 \n 0.180E-01 0.541E-02 0.000E+00 0.000E+00 0.915 0.000E+00 0.000E+00 1.00 \n 0.480E-01 0.144E-01 0.877 0.275 0.000E+00 0.963 0.575 0.000E+00 1.00 \n 0.336 0.101 0.661 0.367 0.000E+00 0.516 0.824 0.000E+00 0.678 1.00\n... \n}}}\n\nNow, one might instantly wonder why all the rest of the cases fail as well. Shouldn't just the `cgils_s6` case fail? Well, we have the following line in `run_scm_all.bash`:\n{{{\nRESULT=`./run_scm.bash $OPTIONS ${RUN_CASE[$x]} 2>&1`\n}}}\n\nSo, it seems that the entirety of the model diagnostic console output is stored in a single BASH variable. When we have the Cholesky error message printed a gazillion times (a fairly refined estimate), it seems that `./run_scm_all.bash` is failing to store all of it. In fact, this is the error that happens:\n\n{{{\n[1]+ Segmentation fault (core dumped) ./run_scm_all.bash --nightly > out.log 2>&1\n}}}\n\nThis would explain why the `./run_scm_all.bash` script immediately terminates!!\n\n'''Technical spec'''\n\nSeveral questions remain:\n\n1. Why do we stuff all the output from `./run_scm.bash` into a single variable? It seems that we do this in order to check the final output message to make sure that CLUBB ran successfully. It seems like better practice to make use of a temporary file and \"grep\" instead.\n\n It is worth mentioning that this is really a bug in Bash. No robust program should \"seg-fault\" just because it received too much input. 
Rather, it should have printed an error message and gracefully quit. There isn't really much we can do about that.\n\n2. It is kind of odd that a Cholesky matrix is generated from the correlation matrix regardless of whether or not SILHS is called. Should we only generate the Cholesky matrix if SILHS is called?\n\n3. Any ideas as to why r6818 causes the Cholesky decomposition to fail for `cgils_s6` (and maybe later cases)?", "reporter": "raut@uwm.edu", "cc": "vlarson@uwm.edu, bmg2@uwm.edu", "resolution": "fixed", "_ts": "1439931478267592", "component": "clubb_src", "summary": "What happened to cgils_s6 ??", "priority": "critical", "keywords": "", "time": "2014-04-04T23:15:18", "milestone": "4. Fix bugs", "owner": "", "type": "defect" } ```
defect
what happened to cgils trac introduction as of cgils and all later cases have failed to run in the nightly tests it appears that the problem is a failure of the cholesky decomposition routine the following is printed several times for the cgils case text cholesky factor leading minor of order is not positive definite factorization failed a input now one might instantly wonder why all the rest of the cases fail as well shouldn t just the cgils case fail well we have the following line in run scm all bash text result run scm bash options run case so it seems that the entirety of the model diagnostic console output is stored in a single bash variable when we have the cholesky error message printed a gazillion times a fairly refined estimate it seems that run scm all bash is failing to store all of it in fact this is the error that happens text segmentation fault core dumped run scm all bash nightly out log this would explain why the run scm all bash script immediately terminates technical spec several questions remain why do we stuff all the output from run scm bash into a single variable it seems that we do this in order to check the final output message to make sure that clubb ran successfully it seems like better practice to make use of a temporary file and grep instead it is worth mentioning that this is really a bug in bash no robust program should seg fault just because it received too much input rather it should have printed an error message and gracefully quit there isn t really much we can do about that it is kind of odd that a cholesky matrix is generated from the correlation matrix regardless of whether or not silhs is called should we only generate the cholesky matrix if silhs is called any ideas as to why causes the cholesky decomposition to fail for cgils and maybe later cases attachments migrated from json status closed changetime description introduction n nas of cgils and all later cases have failed to run in the nightly tests it appears that the 
problem is a failure of the cholesky decomposition routine the following is printed several times for the cgils case n n cholesky factor leading minor of order is not positive definite n factorization failed n a input n n n n n n n n n n n n n nnow one might instantly wonder why all the rest of the cases fail as well shouldn t just the cgils case fail well we have the following line in run scm all bash n nresult run scm bash options run case n n nso it seems that the entirety of the model diagnostic console output is stored in a single bash variable when we have the cholesky error message printed a gazillion times a fairly refined estimate it seems that run scm all bash is failing to store all of it in fact this is the error that happens n n n segmentation fault core dumped run scm all bash nightly out log n n nthis would explain why the run scm all bash script immediately terminates n n technical spec n nseveral questions remain n why do we stuff all the output from run scm bash into a single variable it seems that we do this in order to check the final output message to make sure that clubb ran successfully it seems like better practice to make use of a temporary file and grep instead n n it is worth mentioning that this is really a bug in bash no robust program should seg fault just because it received too much input rather it should have printed an error message and gracefully quit there isn t really much we can do about that n it is kind of odd that a cholesky matrix is generated from the correlation matrix regardless of whether or not silhs is called should we only generate the cholesky matrix if silhs is called n any ideas as to why causes the cholesky decomposition to fail for cgils and maybe later cases reporter raut uwm edu cc vlarson uwm edu uwm edu resolution fixed ts component clubb src summary what happened to cgils priority critical keywords time milestone fix bugs owner type defect
1
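The "temporary file and grep" fix proposed in the CLUBB ticket above can be sketched in Python. This is a minimal illustration, not the actual `run_scm_all.bash` logic: the command and success marker below are hypothetical stand-ins, since the real script's output format isn't shown here.

```python
import subprocess
import sys
import tempfile

def run_case_logged(cmd, success_marker):
    """Run a command, streaming its output to a temp file instead of
    capturing it all in a shell variable / in memory, then scan the
    file for a success marker (the temp-file-plus-grep approach)."""
    with tempfile.NamedTemporaryFile(mode="w+", suffix=".log") as log:
        # stdout/stderr go straight to the file descriptor, so a huge
        # log (e.g. a Cholesky error printed thousands of times) never
        # has to be held in one variable at once.
        subprocess.run(cmd, stdout=log, stderr=subprocess.STDOUT, check=False)
        log.seek(0)
        # Scan line by line, like grep would.
        return any(success_marker in line for line in log)

# Hypothetical usage; a real harness would invoke run_scm.bash here:
ok = run_case_logged(
    [sys.executable, "-c", "print('CLUBB ran successfully')"],
    "ran successfully",
)
```

The key design choice is that the log lives on disk and is scanned incrementally, so output volume can grow without bound without exhausting the caller's memory.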
11,636
2,660,011,422
IssuesEvent
2015-03-19 01:33:33
perfsonar/project
https://api.github.com/repos/perfsonar/project
closed
Graph does not load if one of servers is down
Component-GUI Milestone-Release3.5 Priority-Medium Type-Defect
Original [issue 1018](https://code.google.com/p/perfsonar-ps/issues/detail?id=1018) created by arlake228 on 2014-12-08T19:00:52.000Z: We have a graph that points at multiple MAs. One of the MA hosts is not up and this causes none of the graph to load even though the other MA is up and has data. It appears to be due to an unhandled 'undefined' value: TypeError: undefined is not an object (evaluating 'ps_data.packet_retransmits_src') graphWidget.js:280 Example graph where cern-272-owamp is down and cern-272-pt1.es.net is up (possible the owamp host will be up in the next couple days, thus it may start working): http://stats.es.net/serviceTest/graphWidget.cgi?url=http://cern-272-pt1.es.net:8085/esmond/perfsonar/archive&source=198.124.238.6&dest=198.124.238.18&url=http://cern-272-pt1.es.net:8085/esmond/perfsonar/archive&source=198.124.238.6&dest=198.124.252.130&url=http://cern-272-owamp.es.net:8085/esmond/perfsonar/archive&source=198.124.238.46&dest=198.124.238.18&url=http://cern-272-owamp.es.net:8085/esmond/perfsonar/archive&source=198.124.238.46&dest=198.124.252.130&
1.0
Graph does not load if one of servers is down - Original [issue 1018](https://code.google.com/p/perfsonar-ps/issues/detail?id=1018) created by arlake228 on 2014-12-08T19:00:52.000Z: We have a graph that points at multiple MAs. One of the MA hosts is not up and this causes none of the graph to load even though the other MA is up and has data. It appears to be due to an unhandled 'undefined' value: TypeError: undefined is not an object (evaluating 'ps_data.packet_retransmits_src') graphWidget.js:280 Example graph where cern-272-owamp is down and cern-272-pt1.es.net is up (possible the owamp host will be up in the next couple days, thus it may start working): http://stats.es.net/serviceTest/graphWidget.cgi?url=http://cern-272-pt1.es.net:8085/esmond/perfsonar/archive&source=198.124.238.6&dest=198.124.238.18&url=http://cern-272-pt1.es.net:8085/esmond/perfsonar/archive&source=198.124.238.6&dest=198.124.252.130&url=http://cern-272-owamp.es.net:8085/esmond/perfsonar/archive&source=198.124.238.46&dest=198.124.238.18&url=http://cern-272-owamp.es.net:8085/esmond/perfsonar/archive&source=198.124.238.46&dest=198.124.252.130&
defect
graph does not load if one of servers is down original created by on we have a graph that points at multiple mas one of the ma hosts is not up and this causes none of the graph to load even though the other ma is up and has data it appears to be due to an unhandled undefined value typeerror undefined is not an object evaluating ps data packet retransmits src graphwidget js example graph where cern owamp is down and cern es net is up possible the owamp host will be up in the next couple days thus it may start working
1
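A defensive pattern for the failure mode in the perfSONAR record above — one unreachable measurement archive aborting the whole graph via an unhandled `undefined` — is to treat every per-server response and every field as possibly missing. The sketch below is illustrative Python only; it does not reproduce `graphWidget.js`, and the field name is taken from the error message in the report.

```python
def merge_series(responses):
    """Combine per-server measurement responses into one series,
    skipping servers that failed entirely and fields that are absent,
    so one down host does not take out the whole graph."""
    merged = []
    for resp in responses:
        if resp is None:  # server was down / request failed
            continue
        # .get() returns None instead of raising when the field was
        # never populated for this host (cf. packet_retransmits_src)
        series = resp.get("packet_retransmits_src")
        if series:
            merged.extend(series)
    return merged

# One healthy response, one dead host, one response missing the field:
data = merge_series([
    {"packet_retransmits_src": [1, 2]},
    None,
    {"throughput": [99]},
])
```

The same guard in JavaScript would be an existence check on `ps_data` and its property before dereferencing it.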
116,785
15,012,331,269
IssuesEvent
2021-02-01 01:14:07
microsoft/vscode
https://api.github.com/repos/microsoft/vscode
closed
Icons not showing in Visual Studio Code header
*as-designed
Dear Sir/Madam, I am having a problem where some of the icons are not showing in my Visual Studio Code. I am unable to find tools on the toolbar at the top of the screen. Please could you help me with this? Kind regards, Jamie (please respond to jamielukephillips@gmail.com)
1.0
Icons not showing in Visual Studio Code header - Dear Sir/Madam, I am having a problem where some of the icons are not showing in my Visual Studio Code. I am unable to find tools on the toolbar at the top of the screen. Please could you help me with this? Kind regards, Jamie (please respond to jamielukephillips@gmail.com)
non_defect
icons not showing in visual studio code header dear sir madam i have having a problem where some of the icons are not showing my visual studio code i am unable to find tools on the toolbar at the top of the screen please could you help me with this kind regards jamie please respond to jamielukephillips gmail com
0
11,302
2,648,925,702
IssuesEvent
2015-03-14 12:05:54
jancona/android-on-freerunner
https://api.github.com/repos/jancona/android-on-freerunner
reopened
Manual Installation on SD Card
auto-migrated invalid Priority-Medium Type-Defect
``` About a year ago I successfully built and installed Android cupcake on a 4GB SD card, with the following partitioning: sdx1: FAT32 1GB android_dat -> EMPTY sdx2: ext3 1GB android_system -> copied built filesystem as described at http://code.google.com/p/android-on-freerunner/wiki/AndroidOnSd#Copy_to_/_Update _your_SD_installation sdx3: ext2 8MB android_kernel -> copied kernel from /boot and renamed to uImage.bin I used U-Boot to boot, from part3 and part2. Android would boot reliably though it was not really usable at that stage. Now I decided to try and install the android-on-freerunner-cupcake-sd-stable-201101.tar.gz from the Downloads tab. Meanwhile I use FR as my daily phone (mostly with Om or QtMoko) and did not want to touch the NAND or the boot loader. So I created on a new SD card 8GB the same partitioning as above (only with larger partitions), then extracted the build image there, set the permissions as above and placed the kernel in its own small partition. Result: Kernel boots successfully from U-Boot, then the screen turns black (or rather grey) and the AUX button blinks quickly. Phone is stuck there. My questions: 1. Is it ok to partition like that (kernel on separate partition)? In the wiki page it seems like everything is on a single partition, but to me it is not clear how that could work, since U-Boot expects the kernel in its own partition (see also my comment http://code.google.com/p/android-on-freerunner/issues/detail?id=7#c77) 2. Is this likely a HW/SD card issue? (AUX button blinking fast) 3. Any other parameters to adjust when copying a build to SD card? I really would like to find a reliable way to install and test different AoF builds without having to use installation scripts which expect a particular configuration and/or Qi bootloader. I don't mind writing/updating a wiki page if I succeed. Thanks a lot for any help. Marco ``` Original issue reported on code.google.com by `marcoj...@gmail.com` on 9 Sep 2011 at 8:31
1.0
Manual Installation on SD Card - ``` About a year ago I successfully built and installed Android cupcake on a 4GB SD card, with the following partitioning: sdx1: FAT32 1GB android_dat -> EMPTY sdx2: ext3 1GB android_system -> copied built filesystem as described at http://code.google.com/p/android-on-freerunner/wiki/AndroidOnSd#Copy_to_/_Update _your_SD_installation sdx3: ext2 8MB android_kernel -> copied kernel from /boot and renamed to uImage.bin I used U-Boot to boot, from part3 and part2. Android would boot reliably though it was not really usable at that stage. Now I decided to try and install the android-on-freerunner-cupcake-sd-stable-201101.tar.gz from the Downloads tab. Meanwhile I use FR as my daily phone (mostly with Om or QtMoko) and did not want to touch the NAND or the boot loader. So I created on a new SD card 8GB the same partitioning as above (only with larger partitions), then extracted the build image there, set the permissions as above and placed the kernel in its own small partition. Result: Kernel boots successfully from U-Boot, then the screen turns black (or rather grey) and the AUX button blinks quickly. Phone is stuck there. My questions: 1. Is it ok to partition like that (kernel on separate partition)? In the wiki page it seems like everything is on a single partition, but to me it is not clear how that could work, since U-Boot expects the kernel in its own partition (see also my comment http://code.google.com/p/android-on-freerunner/issues/detail?id=7#c77) 2. Is this likely a HW/SD card issue? (AUX button blinking fast) 3. Any other parameters to adjust when copying a build to SD card? I really would like to find a reliable way to install and test different AoF builds without having to use installation scripts which expect a particular configuration and/or Qi bootloader. I don't mind writing/updating a wiki page if I succeed. Thanks a lot for any help. 
Marco ``` Original issue reported on code.google.com by `marcoj...@gmail.com` on 9 Sep 2011 at 8:31
defect
manual installation on sd card about a year ago i successfully built and installed android cupcake on a sd card with the following partitioning android dat empty android system copied built filesystem as described at your sd installation android kernel copied kernel from boot and renamed to uimage bin i used u boot to boot from and android would boot reliably though it was not really usable at that stage now i decided to try and install the android on freerunner cupcake sd stable tar gz from the downloads tab meanwhile i use fr as my daily phone mostly with om or qtmoko and did not want to touch the nand or the boot loader so i created on a new sd card the same partitioning as above only with larger partitions then extracted the build image there set the permissions as above and placed the kernel in its own small partition result kernel boots successfully from u boot then the screen turns black or rather grey and the aux button blinks quickly phone is stuck there my questions is it ok to partition like that kernel on separate partition in the wiki page it seems like everything is on a single partition but to me it is not clear how that could work since u boot expects the kernel in its own partition see also my comment is this likely a hw sd card issue aux button blinking fast any other parameters to adjust when copying a build to sd card i really would like to find a reliable way to install and test different aof builds without having to use installation scripts which expect a particular configuration and or qi bootloader i don t mind writing updating a wiki page if i succeed thanks a lot for any help marco original issue reported on code google com by marcoj gmail com on sep at
1
30,523
6,151,200,076
IssuesEvent
2017-06-28 01:29:04
cakephp/cakephp
https://api.github.com/repos/cakephp/cakephp
closed
Chronos issue suddenly appears
Defect
This is a (multiple allowed): - [x] bug - CakePHP Version: Current Release. - Platform and Target: apache on xampp with php7 ### What you did I have not done any php backend things yet. I have merely been styling the app before I get started on the back end. ### Expected Behavior Everything was working just fine and then suddenly I get this error: Fatal error: Class Cake\Chronos\Chronos contains 1 abstract method and must therefore be declared abstract or implement the remaining methods (DateTimeInterface::format) in C:\xampp\htdocs\vw\vendor\cakephp\chronos\src\Chronos.php on line 50
1.0
Chronos issue suddenly appears - This is a (multiple allowed): - [x] bug - CakePHP Version: Current Release. - Platform and Target: apache on xampp with php7 ### What you did I have not done any php backend things yet. I have merely been styling the app before I get started on the back end. ### Expected Behavior Everything was working just fine and then suddenly I get this error: Fatal error: Class Cake\Chronos\Chronos contains 1 abstract method and must therefore be declared abstract or implement the remaining methods (DateTimeInterface::format) in C:\xampp\htdocs\vw\vendor\cakephp\chronos\src\Chronos.php on line 50
defect
chronos issue suddenly appears this is a multiple allowed bug cakephp version current release platform and target apache on xampp with what you did i have not done any php backend things yet i have merely been styling the app before i get started on the back end expected behavior everything was working just fine and then suddenly i get this error fatal error class cake chronos chronos contains abstract method and must therefore be declared abstract or implement the remaining methods datetimeinterface format in c xampp htdocs vw vendor cakephp chronos src chronos php on line
1
65,096
19,095,921,390
IssuesEvent
2021-11-29 16:39:42
cakephp/cakephp
https://api.github.com/repos/cakephp/cakephp
closed
Router issue after changes in #16119
defect
### Description After changes coming from #16119, routeReverse stopped working for url '/authors/1/articles/1' (still works for /authors/1/articles/10). In this usecase author_id is the same as the passed id for the article and it is deleted. The reverse route receives ``` ( [author_id] => 1 [pass] => Array ( [0] => 1 ) [controller] => Articles [action] => view [_method] => GET [map] => Array ( ) [plugin] => [_matchedRoute] => /authors/{author_id}/articles/{id} ) ``` Reverting the reverseToArray changes fixes the issue. ### CakePHP Version 4.3.2 ### PHP Version _No response_
1.0
Router issue after changes in #16119 - ### Description After changes coming from #16119, routeReverse stopped working for url '/authors/1/articles/1' (still works for /authors/1/articles/10). In this usecase author_id is the same as the passed id for the article and it is deleted. The reverse route receives ``` ( [author_id] => 1 [pass] => Array ( [0] => 1 ) [controller] => Articles [action] => view [_method] => GET [map] => Array ( ) [plugin] => [_matchedRoute] => /authors/{author_id}/articles/{id} ) ``` Reverting the reverseToArray changes fixes the issue. ### CakePHP Version 4.3.2 ### PHP Version _No response_
defect
router issue after changes in description after changes comming from routereverse stop working for url authors articles still works for authors articles in this usecase author id is same as passed id for article and it is deleted the reverse route receive array articles view get array authors author id articles id reverting the reversetoarray changes fixes issue cakephp version php version no response
1
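The collision in the router report above — `author_id` sharing the value `1` with the passed article id — can be reproduced with a toy sketch. This is not CakePHP's actual `reverseToArray`; it only shows why removing matched route values from the pass list by *value* (rather than by key/position) drops the wrong argument when two parameters happen to be equal.

```python
def buggy_reverse(params, pass_args):
    """Removing by value: when author_id == id (both '1'),
    the passed article id is discarded along with author_id."""
    return [p for p in pass_args if p not in params.values()]

def fixed_reverse(params, pass_args):
    """Named params and pass args live under distinct keys, so the
    pass list is kept untouched and named params resolve separately."""
    return list(pass_args)

# /authors/1/articles/1 : author_id and article id collide
collide_buggy = buggy_reverse({"author_id": "1"}, ["1"])   # article id lost
collide_fixed = fixed_reverse({"author_id": "1"}, ["1"])
# /authors/1/articles/10 : distinct values, so the bug stays hidden
distinct = buggy_reverse({"author_id": "1"}, ["10"])
```

This matches the symptom in the report: `/authors/1/articles/10` still reverses correctly while `/authors/1/articles/1` does not.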
321,914
23,878,380,649
IssuesEvent
2022-09-07 21:32:28
j01101111sh/metabase-tools
https://api.github.com/repos/j01101111sh/metabase-tools
closed
Add additional information into pyproject.toml and related files
documentation
Additional classifiers need to be added to pyproject.toml, config.py and setup.cfg
1.0
Add additional information into pyproject.toml and related files - Additional classifiers need to be added to pyproject.toml, config.py and setup.cfg
non_defect
add additional information into pyproject toml and related files additional classifiers need to be added to pyproject toml config py and setup cfg
0
85,426
3,690,582,635
IssuesEvent
2016-02-25 20:33:36
BCGamer/website
https://api.github.com/repos/BCGamer/website
closed
Bootstrap navbar menu button appears when there are no items
bug low priority
When the screen is reduced below 900 px the navbar reduces to a simple bootstrap navbar with a standard menu button. This button should not appear if the navbar is empty. Its element should be wrapped in a {% if %} statement.
1.0
Bootstrap navbar menu button appears when there are no items - When the screen is reduced below 900 px the navbar reduces to a simple bootstrap navbar with a standard menu button. This button should not appear if the navbar is empty. Its element should be wrapped in a {% if %} statement.
non_defect
bootstrap navbar menu button appears when there are no items when the screen is reduced below px the navbar reduces to a simple bootstrap navbar with a standard menu button this button should not appear if the navbar is empty it s element should be wrapped in a if statement
0
53,969
13,890,313,902
IssuesEvent
2020-10-19 09:07:03
istio/istio
https://api.github.com/repos/istio/istio
closed
Evaluate authentication filter in WASM
area/security kind/enhancement lifecycle/stale
(This is used to request new product features, please visit <https://discuss.istio.io> for questions on using Istio) **Describe the feature request** This issue is to track the evaluation of migrating the istio authentication filter to WASM which allows us to get rid of the dependency on the istio/proxy repo and with more flexibility of the filter deployment. We need to evaluate the functionality, performance, testing and debuggability of the WASM approach compared to the current implementation. I believe the functionality would not be a concern as the authentication filter only uses a very small set of Envoy API, mostly the dynamic metadata and connection->ssl(). I'm more concerned about the testing/debuggability at this moment. @incfly I will soon create a separate branch in istio/proxy with a minimal skeleton authn-v2 filter in WASM using the null-vm for our evaluation. cc @PiotrSikora @jplevyak @diemtvu @mandarjog @kyessenov @bianpengyuan @silentdai **Describe alternatives you've considered** We could also consider to upstream the authN filter to Envoy if we only want the benefit of getting rid of the dependency on istio/proxy repo after the evaluation of the pros/cons of migrating to WASM. This will be tracked separately if needed. **Affected product area (please put an X in all that apply)** [ ] Configuration Infrastructure [ ] Docs [ ] Installation [ ] Networking [ ] Performance and Scalability [ ] Policies and Telemetry [X] Security [ ] Test and Release [ ] User Experience [ ] Developer Infrastructure **Additional context**
True
Evaluate authentication filter in WASM - (This is used to request new product features, please visit <https://discuss.istio.io> for questions on using Istio) **Describe the feature request** This issue is to track the evaluation of migrating the istio authentication filter to WASM which allows us to get rid of the dependency on the istio/proxy repo and with more flexibility of the filter deployment. We need to evaluate the functionality, performance, testing and debuggability of the WASM approach compared to the current implementation. I believe the functionality would not be a concern as the authentication filter only uses a very small set of Envoy API, mostly the dynamic metadata and connection->ssl(). I'm more concerned about the testing/debuggability at this moment. @incfly I will soon create a separate branch in istio/proxy with a minimal skeleton authn-v2 filter in WASM using the null-vm for our evaluation. cc @PiotrSikora @jplevyak @diemtvu @mandarjog @kyessenov @bianpengyuan @silentdai **Describe alternatives you've considered** We could also consider to upstream the authN filter to Envoy if we only want the benefit of getting rid of the dependency on istio/proxy repo after the evaluation of the pros/cons of migrating to WASM. This will be tracked separately if needed. **Affected product area (please put an X in all that apply)** [ ] Configuration Infrastructure [ ] Docs [ ] Installation [ ] Networking [ ] Performance and Scalability [ ] Policies and Telemetry [X] Security [ ] Test and Release [ ] User Experience [ ] Developer Infrastructure **Additional context**
non_defect
evaluate authentication filter in wasm this is used to request new product features please visit for questions on using istio describe the feature request this issue is to track the evaluation of migrating the istio authentication filter to wasm which allows us to get rid of the dependency on the istio proxy repo and with more flexibility of the filter deployment we need to evaluate the functionality performance testing and debuggability of the wasm approach compared to the current implementation i believe the functionality would not be a concern as the authentication filter only uses a very small set of envoy api mostly the dynamic metadata and connection ssl i m more concerned about the testing debuggability at this moment incfly i will soon create a separate branch in istio proxy with a minimal skeleton authn filter in wasm using the null vm for our evaluation cc piotrsikora jplevyak diemtvu mandarjog kyessenov bianpengyuan silentdai describe alternatives you ve considered we could also consider to upstream the authn filter to envoy if we only want the benefit of getting rid of the dependency on istio proxy repo after the evaluation of the pros cons of migrating to wasm this will be tracked separately if needed affected product area please put an x in all that apply configuration infrastructure docs installation networking performance and scalability policies and telemetry security test and release user experience developer infrastructure additional context
0
26,703
4,777,610,077
IssuesEvent
2016-10-27 16:46:15
wheeler-microfluidics/microdrop
https://api.github.com/repos/wheeler-microfluidics/microdrop
closed
config object upgrade to 0.1 does not set 'enabled_plugins' (Trac #30)
defect microdrop Migrated from Trac
Migrated from http://microfluidics.utoronto.ca/ticket/30 ```json { "status": "closed", "changetime": "2014-04-17T19:39:01", "description": "", "reporter": "cfobel", "cc": "", "resolution": "fixed", "_ts": "1397763541728826", "component": "microdrop", "summary": "config object upgrade to 0.1 does not set 'enabled_plugins'", "priority": "major", "keywords": "", "version": "0.1", "time": "2012-01-04T23:02:54", "milestone": "Microdrop 1.0", "owner": "cfobel", "type": "defect" } ```
1.0
config object upgrade to 0.1 does not set 'enabled_plugins' (Trac #30) - Migrated from http://microfluidics.utoronto.ca/ticket/30 ```json { "status": "closed", "changetime": "2014-04-17T19:39:01", "description": "", "reporter": "cfobel", "cc": "", "resolution": "fixed", "_ts": "1397763541728826", "component": "microdrop", "summary": "config object upgrade to 0.1 does not set 'enabled_plugins'", "priority": "major", "keywords": "", "version": "0.1", "time": "2012-01-04T23:02:54", "milestone": "Microdrop 1.0", "owner": "cfobel", "type": "defect" } ```
defect
config object upgrade to does not set enabled plugins trac migrated from json status closed changetime description reporter cfobel cc resolution fixed ts component microdrop summary config object upgrade to does not set enabled plugins priority major keywords version time milestone microdrop owner cfobel type defect
1
330,790
24,277,500,147
IssuesEvent
2022-09-28 14:49:18
dagster-io/dagster
https://api.github.com/repos/dagster-io/dagster
opened
[docs] - Add docs for env vars and secrets in Dagster Cloud
documentation
### What's the issue or suggestion? This issue serves to track research / content work for adding documentation about environment variables and secrets to Dagster Cloud. ### Additional information _No response_ ### Message from the maintainers Impacted by this issue? Give it a 👍! We factor engagement into prioritization.
1.0
[docs] - Add docs for env vars and secrets in Dagster Cloud - ### What's the issue or suggestion? This issue serves to track research / content work for adding documentation about environment variables and secrets to Dagster Cloud. ### Additional information _No response_ ### Message from the maintainers Impacted by this issue? Give it a 👍! We factor engagement into prioritization.
non_defect
add docs for env vars and secrets in dagster cloud what s the issue or suggestion this issue serves to track research content work for adding documentation about environment variables and secrets to dagster cloud additional information no response message from the maintainers impacted by this issue give it a 👍 we factor engagement into prioritization
0
19,142
13,546,860,286
IssuesEvent
2020-09-17 02:29:09
OctopusDeploy/Issues
https://api.github.com/repos/OctopusDeploy/Issues
opened
Importing an ActionTemplate does not validate the FeedId
area/usability kind/bug
# Prerequisites - [x] I have verified the problem exists in the latest version - [x] I have searched [open](https://github.com/OctopusDeploy/Issues/issues) and [closed](https://github.com/OctopusDeploy/Issues/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aclosed) issues to make sure it isn't already reported - [x] I have written a descriptive issue title - [x] I have linked the original source of this report - [x] I have tagged the issue appropriately (area/*, kind/bug, tag/regression?) # The bug Importing an ActionTemplate that contains a package reference saves successfully even if the FeedId does not exist in the space the template is being imported to ## What I expected to happen The import should save only if the FeedId is valid for the space, otherwise, an appropriate error should be returned. ## Steps to reproduce 1. Go to 'Library' > 'Step Templates' 2. Click on 'Import' 3. Import a step template which contains a package reference from an external package feed: ``` { "Name": "script template", "ActionType": "Octopus.Script", "Packages": [ { "Name": "Newtonsoft.Json", "PackageId": "Newtonsoft.Json", "FeedId": "Feeds-1001", "AcquisitionLocation": "Server", "Properties": { "Extract": "True", "SelectionMode": "immediate" } } ], "Properties": { "Octopus.Action.Script.ScriptSource": "Inline", "Octopus.Action.Script.Syntax": "PowerShell", "Octopus.Action.Script.ScriptBody": "write-host \"hello from the template\"" }, "$Meta": { "ExportedAt": "2020-09-17T02:12:59.926Z", "OctopusVersion": "0.0.0-local", "Type": "ActionTemplate" } } ``` 4. No error will appear but when viewing the step template, the package reference will not have a value set for the feed. 
### Screen capture ![image](https://user-images.githubusercontent.com/5336529/93412526-ab635d00-f8e0-11ea-867a-aab0cfa223c6.png) ## Affected versions **Octopus Server:** 2018.8.0 ## Workarounds Ensure the Feed ID is valid for the space to which the template is being imported, or fix the feed on the package references after importing
True
Importing an ActionTemplate does not validate the FeedId - # Prerequisites - [x] I have verified the problem exists in the latest version - [x] I have searched [open](https://github.com/OctopusDeploy/Issues/issues) and [closed](https://github.com/OctopusDeploy/Issues/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aclosed) issues to make sure it isn't already reported - [x] I have written a descriptive issue title - [x] I have linked the original source of this report - [x] I have tagged the issue appropriately (area/*, kind/bug, tag/regression?) # The bug Importing an ActionTemplate that contains a package reference saves successfully even if the FeedId does not exist in the space the template is being imported to ## What I expected to happen The import should save only if the FeedId is valid for the space, otherwise, an appropriate error should be returned. ## Steps to reproduce 1. Go to 'Library' > 'Step Templates' 2. Click on 'Import' 3. Import a step template which contains a package reference from an external package feed: ``` { "Name": "script template", "ActionType": "Octopus.Script", "Packages": [ { "Name": "Newtonsoft.Json", "PackageId": "Newtonsoft.Json", "FeedId": "Feeds-1001", "AcquisitionLocation": "Server", "Properties": { "Extract": "True", "SelectionMode": "immediate" } } ], "Properties": { "Octopus.Action.Script.ScriptSource": "Inline", "Octopus.Action.Script.Syntax": "PowerShell", "Octopus.Action.Script.ScriptBody": "write-host \"hello from the template\"" }, "$Meta": { "ExportedAt": "2020-09-17T02:12:59.926Z", "OctopusVersion": "0.0.0-local", "Type": "ActionTemplate" } } ``` 4. No error will appear but when viewing the step template, the package reference will not have a value set for the feed. 
### Screen capture ![image](https://user-images.githubusercontent.com/5336529/93412526-ab635d00-f8e0-11ea-867a-aab0cfa223c6.png) ## Affected versions **Octopus Server:** 2018.8.0 ## Workarounds Ensure the Feed ID is valid for the space to which the template is being imported, or fix the feed on the package references after importing
non_defect
importing an actiontemplate does not validate the feedid prerequisites i have verified the problem exists in the latest version i have searched and issues to make sure it isn t already reported i have written a descriptive issue title i have linked the original source of this report i have tagged the issue appropriately area kind bug tag regression the bug importing an actiontemplate that contains a package reference saves successfully even if the feedid does not exist in the space the template is being imported to what i expected to happen the import should save only if the feedid is valid for the space otherwise an appropriate error should be returned steps to reproduce go to library step templates click on import import a step template which contains a package reference from an external package feed name script template actiontype octopus script packages name newtonsoft json packageid newtonsoft json feedid feeds acquisitionlocation server properties extract true selectionmode immediate properties octopus action script scriptsource inline octopus action script syntax powershell octopus action script scriptbody write host hello from the template meta exportedat octopusversion local type actiontemplate no error will appear but when viewing the step template the package reference will not have a value set for the feed screen capture affected versions octopus server workarounds ensure the feed id is valid for the space to which the template is being imported or fix the feed on the package references after importing
0
10,881
2,622,512,094
IssuesEvent
2015-03-04 03:39:30
tswast/pywiiuse
https://api.github.com/repos/tswast/pywiiuse
closed
example.py does not recogonize which button I pressed on wii
auto-migrated Priority-Medium Type-Defect
``` What steps will reproduce the problem? 1. I just connect my wii and ran the program What is the expected output? What do you see instead? example.py should tell me which button I pressed, but it does not. It only knows that and event has happened, but does not know which button I pressed. What version of the product are you using? On what operating system? I'm using windows 7 64 bits. But I installed 32 bits python 2.5 Thanks. ``` Original issue reported on code.google.com by `yangyh0...@gmail.com` on 14 Jun 2011 at 8:33
1.0
example.py does not recogonize which button I pressed on wii - ``` What steps will reproduce the problem? 1. I just connect my wii and ran the program What is the expected output? What do you see instead? example.py should tell me which button I pressed, but it does not. It only knows that and event has happened, but does not know which button I pressed. What version of the product are you using? On what operating system? I'm using windows 7 64 bits. But I installed 32 bits python 2.5 Thanks. ``` Original issue reported on code.google.com by `yangyh0...@gmail.com` on 14 Jun 2011 at 8:33
defect
example py does not recogonize which button i pressed on wii what steps will reproduce the problem i just connect my wii and ran the program what is the expected output what do you see instead example py should tell me which button i pressed but it does not it only knows that and event has happened but does not know which button i pressed what version of the product are you using on what operating system i m using windows bits but i installed bits python thanks original issue reported on code google com by gmail com on jun at
1
33,872
9,210,526,647
IssuesEvent
2019-03-09 06:15:55
depthlove/depthlove.github.io
https://api.github.com/repos/depthlove/depthlove.github.io
closed
Porting a deep learning project: using cmake to build an iOS platform library | Minmin.Sun Blog
/2017/05/21/use-cmake-to-build-ios-lib/ Gitalk
https://depthlove.github.io/2017/05/21/use-cmake-to-build-ios-lib/ Last year (2016), while porting a deep learning project to mobile, I used cmake; I am now posting the walkthrough I wrote at the time, mainly as a reminder to myself. Without further ado, straight to the point. 1. Download and install X11. About X11 for Mac: Mac no longer ships with X11, but the XQuartz project provides X11 server and client libraries. The XQuartz project provides X11 server and client libraries for MacOS; the URL is https://www.xquartz.
1.0
Porting a deep learning project: using cmake to build an iOS platform library | Minmin.Sun Blog - https://depthlove.github.io/2017/05/21/use-cmake-to-build-ios-lib/ Last year (2016), while porting a deep learning project to mobile, I used cmake; I am now posting the walkthrough I wrote at the time, mainly as a reminder to myself. Without further ado, straight to the point. 1. Download and install X11. About X11 for Mac: Mac no longer ships with X11, but the XQuartz project provides X11 server and client libraries. The XQuartz project provides X11 server and client libraries for MacOS; the URL is https://www.xquartz.
non_defect
深度学习项目移植之使用 cmake 编译 ios 平台库 minmin sun blog 去年( )做深度学习项目的移动端移植用到了 cmake,现在把当时写的一篇使用流程贴出来,主要目的是备忘。废话不多说,直接进入正题。 下载 并安装关于 mac 版 ,mac 不再随附 ,但 xquartz 项目会提供 服务器和客户端库。xquartz 项目提供适用于 macos 的 服务器和客户端库,网址是
0
72,715
24,256,213,948
IssuesEvent
2022-09-27 18:00:46
zed-industries/feedback
https://api.github.com/repos/zed-industries/feedback
opened
Cancel jump to definition on paste before mouseup
defect triage
### Check for existing issues - [X] Completed ### Describe the bug I often accidentally jump to a definition when instead I am just wanting to place the cursor and paste. While this is mostly due to bad timing between my left and right hand, I never noticed that in other editors. After digging into it, I can confirm that VSCode and Sublime cancel the jump to definition when text is pasted before the mouseup event is fired. ### To reproduce 1. Copy some code 2. Keep <kbd>⌘</kbd> pressed (for the duration of all the following steps) 3. Mousedown somewhere where a `Jump to Definition` is possible (just down, don't release yet) 4. Paste the code (<kbd>⌘</kbd> + <kbd>v</kbd>) 5. Release the mouse (mouseup) 6. Zed will jump to the definition ### Expected behavior Zed is not jumping to definition. The jump to definition is canceled by pasting the code (or changing the buffer in general maybe). ### Environment Zed 0.55.0 – /Applications/Zed.app \nmacOS 12.6 \narchitecture arm64 ### If applicable, add mockups / screenshots to help explain present your vision of the feature _No response_ ### If applicable, attach your `~/Library/Logs/Zed/Zed.log` file to this issue _No response_
1.0
Cancel jump to definition on paste before mouseup - ### Check for existing issues - [X] Completed ### Describe the bug I often accidentally jump to a definition when instead I am just wanting to place the cursor and paste. While this is mostly due to bad timing between my left and right hand, I never noticed that in other editors. After digging into it, I can confirm that VSCode and Sublime cancel the jump to definition when text is pasted before the mouseup event is fired. ### To reproduce 1. Copy some code 2. Keep <kbd>⌘</kbd> pressed (for the duration of all the following steps) 3. Mousedown somewhere where a `Jump to Definition` is possible (just down, don't release yet) 4. Paste the code (<kbd>⌘</kbd> + <kbd>v</kbd>) 5. Release the mouse (mouseup) 6. Zed will jump to the definition ### Expected behavior Zed is not jumping to definition. The jump to definition is canceled by pasting the code (or changing the buffer in general maybe). ### Environment Zed 0.55.0 – /Applications/Zed.app \nmacOS 12.6 \narchitecture arm64 ### If applicable, add mockups / screenshots to help explain present your vision of the feature _No response_ ### If applicable, attach your `~/Library/Logs/Zed/Zed.log` file to this issue _No response_
defect
cancel jump to definition on paste before mouseup check for existing issues completed describe the bug i often accidentally jump to a definition when instead i am just wanting to place the cursor and paste while this is mostly due to bad timing between my left and right hand i never noticed that in other editors after digging into it i can confirm that vscode and sublime cancel the jump to definition when text is pasted before the mouseup event is fired to reproduce copy some code keep ⌘ pressed for the duration of all the following steps mousedown somewhere where a jump to definition is possible just down don t release yet paste the code ⌘ v release the mouse mouseup zed will jump to the definition expected behavior zed is not jumping to definition the jump to definition is canceled by pasting the code or changing the buffer in general maybe environment zed – applications zed app nmacos narchitecture if applicable add mockups screenshots to help explain present your vision of the feature no response if applicable attach your library logs zed zed log file to this issue no response
1
235,299
19,322,264,717
IssuesEvent
2021-12-14 07:31:40
redhat-developer/odo
https://api.github.com/repos/redhat-developer/odo
closed
refactor test-cmd-devfile-storage
area/testing points/3
/area testing ## Acceptance Criteria - [ ] test-cmd-devfile-storage should use new test approach and run successfully.
1.0
refactor test-cmd-devfile-storage - /area testing ## Acceptance Criteria - [ ] test-cmd-devfile-storage should use new test approach and run successfully.
non_defect
refactor test cmd devfile storage area testing acceptance criteria test cmd devfile storage should use new test approach and run successfully
0
16,123
2,872,987,041
IssuesEvent
2015-06-08 14:54:09
msimpson/pixelcity
https://api.github.com/repos/msimpson/pixelcity
closed
sky is rendered as a lot of vertical lines
auto-migrated Priority-Medium Type-Defect
``` What steps will reproduce the problem? happens all the time What is the expected output? What do you see instead? I expect to see the sky like in the pictures but instead I get alot of vertical blue lines. What version of the product are you using? On what operating system? latest 1.011 on a vista ultimate 64. running a radeon 4870, Q6600@3.6 GHZ & 2GB RAM. Please provide any additional information below. at least the FPS is a constant 60-65. ``` Original issue reported on code.google.com by `zivpe...@gmail.com` on 14 May 2009 at 11:10
1.0
sky is rendered as a lot of vertical lines - ``` What steps will reproduce the problem? happens all the time What is the expected output? What do you see instead? I expect to see the sky like in the pictures but instead I get alot of vertical blue lines. What version of the product are you using? On what operating system? latest 1.011 on a vista ultimate 64. running a radeon 4870, Q6600@3.6 GHZ & 2GB RAM. Please provide any additional information below. at least the FPS is a constant 60-65. ``` Original issue reported on code.google.com by `zivpe...@gmail.com` on 14 May 2009 at 11:10
defect
sky is rendered as a lot of vertical lines what steps will reproduce the problem happens all the time what is the expected output what do you see instead i expect to see the sky like in the pictures but instead i get alot of vertical blue lines what version of the product are you using on what operating system latest on a vista ultimate running a radeon ghz ram please provide any additional information below at least the fps is a constant original issue reported on code google com by zivpe gmail com on may at
1
121,161
10,152,067,412
IssuesEvent
2019-08-05 22:11:58
kubernetes/kubernetes
https://api.github.com/repos/kubernetes/kubernetes
closed
Promote Node Tests to Conformance
area/conformance area/test sig/testing
Using the list of tested stable endpoints referenced in https://github.com/kubernetes/kubernetes/issues/78748, we can pick out the endpoints which are part of the Node kind: [replaceCoreV1NodeStatus](https://apisnoop.cncf.io/?zoomed=operationId-stable-core-replaceCoreV1NodeStatus&showUntested=false&showConformanceTested=false) [replaceCoreV1Node](https://apisnoop.cncf.io/?zoomed=operationId-stable-core-replaceCoreV1Node&showUntested=false&showConformanceTested=false) [patchCoreV1Node](https://apisnoop.cncf.io/?zoomed=operationId-stable-core-patchCoreV1Node&showUntested=false&showConformanceTested=false) [readCoreV1Node](https://apisnoop.cncf.io/?zoomed=operationId-stable-core-readCoreV1Node&showUntested=false&showConformanceTested=false) These are stable core endpoints which have existing testing, potentially making for low-hanging fruit to upgrade to Conformance. /area test /area conformance /sig testing
2.0
Promote Node Tests to Conformance - Using the list of tested stable endpoints referenced in https://github.com/kubernetes/kubernetes/issues/78748, we can pick out the endpoints which are part of the Node kind: [replaceCoreV1NodeStatus](https://apisnoop.cncf.io/?zoomed=operationId-stable-core-replaceCoreV1NodeStatus&showUntested=false&showConformanceTested=false) [replaceCoreV1Node](https://apisnoop.cncf.io/?zoomed=operationId-stable-core-replaceCoreV1Node&showUntested=false&showConformanceTested=false) [patchCoreV1Node](https://apisnoop.cncf.io/?zoomed=operationId-stable-core-patchCoreV1Node&showUntested=false&showConformanceTested=false) [readCoreV1Node](https://apisnoop.cncf.io/?zoomed=operationId-stable-core-readCoreV1Node&showUntested=false&showConformanceTested=false) These are stable core endpoints which have existing testing, potentially making for low-hanging fruit to upgrade to Conformance. /area test /area conformance /sig testing
non_defect
promote node tests to conformance using the list of tested stable endpoints referenced in we can pick out the endpoints which are part of the node kind these are stable core endpoints which have existing testing potentially making for low hanging fruit to upgrade to conformance area test area conformance sig testing
0
46,071
13,055,847,729
IssuesEvent
2020-07-30 02:54:53
icecube-trac/tix2
https://api.github.com/repos/icecube-trac/tix2
opened
boost 1.38 delete issue with Release builds on Ubuntu 8.04 LTS (Trac #537)
IceTray Incomplete Migration Migrated from Trac defect
Migrated from https://code.icecube.wisc.edu/ticket/537 ```json { "status": "closed", "changetime": "2014-11-23T03:37:56", "description": "Several tests failing with a Release build on Ubuntu 8.04 LTS and boost 1.38.00\n\nRecompling with RelWithDebInfo:\n\nStarting program: /usr/bin/python ./icetray/resources/scripts/decode_i3trayinfo.py \n...\nProgram received signal SIGSEGV, Segmentation fault.\n[Switching to Thread 0x7fe1147eb6e0 (LWP 2104)]\n0x00007fe1134b6b19 in boost::detail::sp_counted_impl_p<I3TrayInfo>::dispose (\n this=<value optimized out>)\n at /opt/users/blaufuss/i3tools/include/boost-1.38.0/boost/checked_delete.hpp:34\n34 delete x;\n\n", "reporter": "blaufuss", "cc": "", "resolution": "fixed", "_ts": "1416713876900096", "component": "IceTray", "summary": "boost 1.38 delete issue with Release builds on Ubuntu 8.04 LTS", "priority": "normal", "keywords": "", "time": "2009-02-23T19:30:54", "milestone": "", "owner": "troy", "type": "defect" } ```
1.0
boost 1.38 delete issue with Release builds on Ubuntu 8.04 LTS (Trac #537) - Migrated from https://code.icecube.wisc.edu/ticket/537 ```json { "status": "closed", "changetime": "2014-11-23T03:37:56", "description": "Several tests failing with a Release build on Ubuntu 8.04 LTS and boost 1.38.00\n\nRecompling with RelWithDebInfo:\n\nStarting program: /usr/bin/python ./icetray/resources/scripts/decode_i3trayinfo.py \n...\nProgram received signal SIGSEGV, Segmentation fault.\n[Switching to Thread 0x7fe1147eb6e0 (LWP 2104)]\n0x00007fe1134b6b19 in boost::detail::sp_counted_impl_p<I3TrayInfo>::dispose (\n this=<value optimized out>)\n at /opt/users/blaufuss/i3tools/include/boost-1.38.0/boost/checked_delete.hpp:34\n34 delete x;\n\n", "reporter": "blaufuss", "cc": "", "resolution": "fixed", "_ts": "1416713876900096", "component": "IceTray", "summary": "boost 1.38 delete issue with Release builds on Ubuntu 8.04 LTS", "priority": "normal", "keywords": "", "time": "2009-02-23T19:30:54", "milestone": "", "owner": "troy", "type": "defect" } ```
defect
boost delete issue with release builds on ubuntu lts trac migrated from json status closed changetime description several tests failing with a release build on ubuntu lts and boost n nrecompling with relwithdebinfo n nstarting program usr bin python icetray resources scripts decode py n nprogram received signal sigsegv segmentation fault n in boost detail sp counted impl p dispose n this n at opt users blaufuss include boost boost checked delete hpp delete x n n reporter blaufuss cc resolution fixed ts component icetray summary boost delete issue with release builds on ubuntu lts priority normal keywords time milestone owner troy type defect
1
80,420
30,281,289,611
IssuesEvent
2023-07-08 05:04:33
vector-im/element-web
https://api.github.com/repos/vector-im/element-web
opened
"You can't see earlier messages" in a non-encrypted room
T-Defect
### Steps to reproduce 1. Use Element for some time 2. Go to `#heisenbridge:vi.fi` yesterday (a public room without encryption) ### Outcome #### What did you expect? I should continue being able to see most of the room history. #### What happened instead? Most of the room history is now unavailable. ![image](https://github.com/vector-im/element-web/assets/160894/f7e2dc25-5a8d-493b-bdcd-a9701a2916b4) - Notably, unlike other filed issues on similar problems, this room is *not* encrypted. - If I click on a link leading to an older event in history, Element does show the old history, however then I cannot scroll to the bottom of the room to see the latest history. - The "Clear cache and reload" button had no effect. - The history does load fine from a matrix.org account. One other user in the channel reporting having the same problem, though. ### Operating system Arch Linux x86_64 ### Browser information Firefox 114.0.2 (64-bit) ### URL for webapp Self-hosted ### Application version Element version: 1.11.35 Olm version: 3.2.14 ### Homeserver Synapse 1.87.0 ### Will you send logs? Yes
1.0
"You can't see earlier messages" in a non-encrypted room - ### Steps to reproduce 1. Use Element for some time 2. Go to `#heisenbridge:vi.fi` yesterday (a public room without encryption) ### Outcome #### What did you expect? I should continue being able to see most of the room history. #### What happened instead? Most of the room history is now unavailable. ![image](https://github.com/vector-im/element-web/assets/160894/f7e2dc25-5a8d-493b-bdcd-a9701a2916b4) - Notably, unlike other filed issues on similar problems, this room is *not* encrypted. - If I click on a link leading to an older event in history, Element does show the old history, however then I cannot scroll to the bottom of the room to see the latest history. - The "Clear cache and reload" button had no effect. - The history does load fine from a matrix.org account. One other user in the channel reporting having the same problem, though. ### Operating system Arch Linux x86_64 ### Browser information Firefox 114.0.2 (64-bit) ### URL for webapp Self-hosted ### Application version Element version: 1.11.35 Olm version: 3.2.14 ### Homeserver Synapse 1.87.0 ### Will you send logs? Yes
defect
you can t see earlier messages in a non encrypted room steps to reproduce use element for some time go to heisenbridge vi fi yesterday a public room without encryption outcome what did you expect i should continue being able to see most of the room history what happened instead most of the room history is now unavailable notably unlike other filed issues on similar problems this room is not encrypted if i click on a link leading to an older event in history element does show the old history however then i cannot scroll to the bottom of the room to see the latest history the clear cache and reload button had no effect the history does load fine from a matrix org account one other user in the channel reporting having the same problem though operating system arch linux browser information firefox bit url for webapp self hosted application version element version olm version homeserver synapse will you send logs yes
1
135,958
18,722,222,427
IssuesEvent
2021-11-03 13:05:00
KDWSS/dd-trace-java
https://api.github.com/repos/KDWSS/dd-trace-java
opened
CVE-2018-19362 (High) detected in multiple libraries
security vulnerability
## CVE-2018-19362 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.7.9.1.jar</b>, <b>jackson-databind-2.8.11.1.jar</b>, <b>jackson-databind-2.7.4.jar</b>, <b>jackson-databind-2.7.5.jar</b>, <b>jackson-databind-2.6.6.jar</b>, <b>jackson-databind-2.8.11.2.jar</b>, <b>jackson-databind-2.8.9.jar</b>, <b>jackson-databind-2.6.5.jar</b>, <b>jackson-databind-2.8.11.jar</b>, <b>jackson-databind-2.8.5.jar</b>, <b>jackson-databind-2.9.1.jar</b>, <b>jackson-databind-2.9.7.jar</b>, <b>jackson-databind-2.6.4.jar</b>, <b>jackson-databind-2.7.8.jar</b>, <b>jackson-databind-2.7.1.jar</b>, <b>jackson-databind-2.3.2.jar</b>, <b>jackson-databind-2.7.9.3.jar</b>, <b>jackson-databind-2.3.3.jar</b>, <b>jackson-databind-2.8.4.jar</b>, <b>jackson-databind-2.5.3.jar</b>, <b>jackson-databind-2.9.4.jar</b>, <b>jackson-databind-2.5.4.jar</b>, <b>jackson-databind-2.9.0.jar</b>, <b>jackson-databind-2.8.7.jar</b>, <b>jackson-databind-2.8.3.jar</b></p></summary> <p> <details><summary><b>jackson-databind-2.7.9.1.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/appsec/weblog/weblog-spring-app/weblog-spring-app.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.7.9.1/85343e40e4f68d4a25226d53736646abaf0ae039/jackson-databind-2.7.9.1.jar,/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.7.9.1/85343e40e4f68d4a25226d53736646abaf0ae039/jackson-databind-2.7.9.1.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.7.9.1.jar** (Vulnerable Library) </details> 
<details><summary><b>jackson-databind-2.8.11.1.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/benchmark-integration/play-perftest/play-perftest.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.11.1/341edc63fdd8b44e17b2c36abbc9b451d8fd05a5/jackson-databind-2.8.11.1.jar</p> <p> Dependency Hierarchy: - play_2.12-2.6.20.jar (Root Library) - jjwt-0.7.0.jar - :x: **jackson-databind-2.8.11.1.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.7.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/vertx-web-3.4/vertx-web-3.4.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.7.4/1e9c6f3659644aeac84872c3b62d8e363bf4c96d/jackson-databind-2.7.4.jar</p> <p> Dependency Hierarchy: - vertx-web-3.4.0.jar (Root Library) - vertx-core-3.4.0.jar - :x: **jackson-databind-2.7.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.7.5.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/spring-cloud-zuul-2/spring-cloud-zuul-2.gradle</p> <p>Path to vulnerable library: 
/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.7.5/ca7084021d9f213003eafe2583d1783d3d6a3685/jackson-databind-2.7.5.jar</p> <p> Dependency Hierarchy: - zuul-core-1.3.1.jar (Root Library) - archaius-core-0.7.6.jar - :x: **jackson-databind-2.7.5.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.6.6.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/aws-java-sqs-1.0/aws-java-sqs-1.0.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.6.6/5108dde6049374ba980b360e1ecff49847baba4a/jackson-databind-2.6.6.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.6.6/5108dde6049374ba980b360e1ecff49847baba4a/jackson-databind-2.6.6.jar</p> <p> Dependency Hierarchy: - aws-java-sdk-kinesis-1.11.106.jar (Root Library) - aws-java-sdk-core-1.11.106.jar - :x: **jackson-databind-2.6.6.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.11.2.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/spring-webmvc-3.1/spring-webmvc-3.1.gradle</p> <p>Path to vulnerable library: 
/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.11.2/2c5051e8e84d2c16316b758ebf746f9e90bef5a4/jackson-databind-2.8.11.2.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.11.2/2c5051e8e84d2c16316b758ebf746f9e90bef5a4/jackson-databind-2.8.11.2.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-web-1.5.17.RELEASE.jar (Root Library) - :x: **jackson-databind-2.8.11.2.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.9.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/play-2.6/play-2.6.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.9/4dfca3975be3c1a98eacb829e70f02e9a71bc159/jackson-databind-2.8.9.jar</p> <p> Dependency Hierarchy: - play_2.11-2.6.0.jar (Root Library) - jjwt-0.7.0.jar - :x: **jackson-databind-2.8.9.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.6.5.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/transport-2/transport-2.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.6.5/d50be1723a09befd903887099ff2014ea9020333/jackson-databind-2.6.5.jar</p> <p> Dependency Hierarchy: - spring-data-elasticsearch-2.0.0.RELEASE.jar (Root Library) - :x: **jackson-databind-2.6.5.jar** (Vulnerable Library) </details> 
<details><summary><b>jackson-databind-2.8.11.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/akka-http-10.0/akka-http-10.0.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.11/569a9f220273024523799dba9dd358121b0ee09/jackson-databind-2.8.11.jar</p> <p> Dependency Hierarchy: - lagom-javadsl-testkit_2.11-1.4.0.jar (Root Library) - lagom-persistence-core_2.11-1.4.0.jar - play_2.11-2.6.11.jar - jjwt-0.7.0.jar - :x: **jackson-databind-2.8.11.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.5.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/kafka-streams-0.11/kafka-streams-0.11.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.5/b3035f37e674c04dafe36a660c3815cc59f764e2/jackson-databind-2.8.5.jar</p> <p> Dependency Hierarchy: - kafka-streams-0.11.0.0.jar (Root Library) - connect-json-0.11.0.0.jar - :x: **jackson-databind-2.8.5.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.9.1.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle</p> <p>Path to vulnerable library: 
/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.1/716da1830a2043f18882fc036ec26eb32cbe5aff/jackson-databind-2.9.1.jar</p> <p> Dependency Hierarchy: - spring-data-elasticsearch-3.0.0.RELEASE.jar (Root Library) - :x: **jackson-databind-2.9.1.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.9.7.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/couchbase-2.6/couchbase-2.6.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.7/e6faad47abd3179666e89068485a1b88a195ceb7/jackson-databind-2.9.7.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.7/e6faad47abd3179666e89068485a1b88a195ceb7/jackson-databind-2.9.7.jar</p> <p> Dependency Hierarchy: - encryption-1.0.0.jar (Root Library) - :x: **jackson-databind-2.9.7.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.6.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/couchbase-2.0/couchbase-2.0.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.6.4/f2abadd10891512268b16a1a1a6f81890f3e2976/jackson-databind-2.6.4.jar</p> <p> Dependency Hierarchy: - spring-data-couchbase-2.0.0.RELEASE.jar (Root Library) - :x: **jackson-databind-2.6.4.jar** (Vulnerable Library) </details> 
<details><summary><b>jackson-databind-2.7.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-smoke-tests/play-2.5/play-2.5.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.7.8/9bc551426f1e19b4e2d87bb4bb2e19f8ecf8d578/jackson-databind-2.7.8.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.7.8/9bc551426f1e19b4e2d87bb4bb2e19f8ecf8d578/jackson-databind-2.7.8.jar</p> <p> Dependency Hierarchy: - play_2.11-2.5.19.jar (Root Library) - :x: **jackson-databind-2.7.8.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.7.1.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.7.1/14d88822bca655de7aa6ed3e4c498d115505710a/jackson-databind-2.7.1.jar</p> <p> Dependency Hierarchy: - play-java_2.11-2.5.0.jar (Root Library) - play_2.11-2.5.0.jar - jackson-datatype-jdk8-2.7.1.jar - :x: **jackson-databind-2.7.1.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.3.2.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/dropwizard/dropwizard-views/dropwizard-views.gradle</p> <p>Path to vulnerable library: 
/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.3.2/c75edc740a6d8cb1cef6fa82fa594e0bce561916/jackson-databind-2.3.2.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.3.2/c75edc740a6d8cb1cef6fa82fa594e0bce561916/jackson-databind-2.3.2.jar</p> <p> Dependency Hierarchy: - play-java-ws_2.11-2.3.10.jar (Root Library) - play_2.11-2.3.10.jar - :x: **jackson-databind-2.3.2.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.7.9.3.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/datastax-cassandra-3/datastax-cassandra-3.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.7.9.3/fc6d8373d2f5a012473c764c3556704be6da15e/jackson-databind-2.7.9.3.jar</p> <p> Dependency Hierarchy: - cassandra-driver-core-3.11.0.jar (Root Library) - :x: **jackson-databind-2.7.9.3.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.3.3.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/jax-rs-annotations-1/jax-rs-annotations-1.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.3.3/63b77400b5f1cf83a81823562c48d3120ef5518e/jackson-databind-2.3.3.jar</p> <p> Dependency Hierarchy: - dropwizard-testing-0.7.1.jar (Root Library) - dropwizard-core-0.7.1.jar - dropwizard-jackson-0.7.1.jar - :x: **jackson-databind-2.3.3.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.4.jar</b></p></summary> 
<p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/spring-rabbit/spring-rabbit.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.4/1c36c81e79cacdf48116afba8495e3393d267ba1/jackson-databind-2.8.4.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.4/1c36c81e79cacdf48116afba8495e3393d267ba1/jackson-databind-2.8.4.jar</p> <p> Dependency Hierarchy: - spring-rabbit-2.0.0.RELEASE.jar (Root Library) - http-client-1.3.0.RELEASE.jar - :x: **jackson-databind-2.8.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.5.3.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/aws-java-sdk-1.11.0/aws-java-sdk-1.11.0.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.5.3/c37875ff66127d93e5f672708cb2dcc14c8232ab/jackson-databind-2.5.3.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.5.3/c37875ff66127d93e5f672708cb2dcc14c8232ab/jackson-databind-2.5.3.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.5.3/c37875ff66127d93e5f672708cb2dcc14c8232ab/jackson-databind-2.5.3.jar</p> <p> Dependency Hierarchy: - play_2.11-2.4.0.jar (Root Library) - :x: **jackson-databind-2.5.3.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.9.4.jar</b></p></summary> <p>General 
data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/spring-webflux-5/spring-webflux-5.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.4/498bbc3b94f566982c7f7c6d4d303fce365529be/jackson-databind-2.9.4.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-webflux-2.0.0.RELEASE.jar (Root Library) - spring-boot-starter-json-2.0.0.RELEASE.jar - :x: **jackson-databind-2.9.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.5.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-smoke-tests/play-2.4/play-2.4.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.5.4/5dfa42af84584b4a862ea488da84bbbebbb06c35/jackson-databind-2.5.4.jar</p> <p> Dependency Hierarchy: - play_2.11-2.4.11.jar (Root Library) - jackson-datatype-jsr310-2.5.4.jar - :x: **jackson-databind-2.5.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.9.0.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/vertx-rx-3.5/vertx-rx-3.5.gradle</p> <p>Path to vulnerable library: 
/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.0/14fb5f088cc0b0dc90a73ba745bcade4961a3ee3/jackson-databind-2.9.0.jar</p> <p> Dependency Hierarchy: - vertx-rx-java2-3.5.0.jar (Root Library) - vertx-core-3.5.0.jar - :x: **jackson-databind-2.9.0.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.7.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/twilio/twilio.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.7/6c3257ef458ac58a8da69a6dca3d2a15286d88c8/jackson-databind-2.8.7.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.7/6c3257ef458ac58a8da69a6dca3d2a15286d88c8/jackson-databind-2.8.7.jar</p> <p> Dependency Hierarchy: - ratpack-core-1.5.0.jar (Root Library) - :x: **jackson-databind-2.8.7.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.3.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-smoke-tests/log-injection/log-injection.gradle</p> <p>Path to vulnerable library: /caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.3/cea3788c72271d45676ce32c0665991674b24cc5/jackson-databind-2.8.3.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.8.3.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a 
href="https://github.com/KDWSS/dd-trace-java/commit/2819174635979a19573ec0ce8e3e2b63a3848079">2819174635979a19573ec0ce8e3e2b63a3848079</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.8 might allow attackers to have unspecified impact by leveraging failure to block the jboss-common-core class from polymorphic deserialization. <p>Publish Date: 2019-01-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-19362>CVE-2018-19362</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19362">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19362</a></p> <p>Release Date: 2019-01-02</p> <p>Fix Resolution: 2.9.8</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.9.1","packageFilePaths":["/dd-java-agent/appsec/weblog/weblog-spring-app/weblog-spring-app.gradle"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.7.9.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.11.1","packageFilePaths":["/dd-java-agent/benchmark-integration/play-perftest/play-perftest.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play_2.12:2.6.20;io.jsonwebtoken:jjwt:0.7.0;com.fasterxml.jackson.core:jackson-databind:2.8.11.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.4","packageFilePaths":["/dd-java-agent/instrumentation/vertx-web-3.4/vertx-web-3.4.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.vertx:vertx-web:3.4.0;io.vertx:vertx-core:3.4.0;com.fasterxml.jackson.core:jackson-databind:2.7.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.5","packageFilePaths":["/dd-java-agent/instrumentation/spring-cloud-zuul-2/spring-cloud
-zuul-2.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.netflix.zuul:zuul-core:1.3.1;com.netflix.archaius:archaius-core:0.7.6;com.fasterxml.jackson.core:jackson-databind:2.7.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.6.6","packageFilePaths":["/dd-java-agent/instrumentation/aws-java-sqs-1.0/aws-java-sqs-1.0.gradle","/dd-java-agent/instrumentation/aws-java-sdk-1.11.0/aws-java-sdk-1.11.0.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.amazonaws:aws-java-sdk-kinesis:1.11.106;com.amazonaws:aws-java-sdk-core:1.11.106;com.fasterxml.jackson.core:jackson-databind:2.6.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.11.2","packageFilePaths":["/dd-java-agent/instrumentation/spring-webmvc-3.1/spring-webmvc-3.1.gradle","/dd-java-agent/instrumentation/elasticsearch/transport-2/transport-2.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:1.5.17.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.8.11.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.9","packageFilePaths":["/dd-java-agent/instrumentation/play-2.6/play-2.6.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play_2.11:2.6.0;io.jsonwebtoken:jjwt:0.7.0;com.fasterxml.jackson.core:jackson-databind:2.8.9","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.6.5","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/transport-2/transport-2.gradle"],"isTransitiveDependency":true,"de
pendencyTree":"org.springframework.data:spring-data-elasticsearch:2.0.0.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.6.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.11","packageFilePaths":["/dd-java-agent/instrumentation/akka-http-10.0/akka-http-10.0.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.lightbend.lagom:lagom-javadsl-testkit_2.11:1.4.0;com.lightbend.lagom:lagom-persistence-core_2.11:1.4.0;com.typesafe.play:play_2.11:2.6.11;io.jsonwebtoken:jjwt:0.7.0;com.fasterxml.jackson.core:jackson-databind:2.8.11","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.5","packageFilePaths":["/dd-java-agent/instrumentation/kafka-streams-0.11/kafka-streams-0.11.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.apache.kafka:kafka-streams:0.11.0.0;org.apache.kafka:connect-json:0.11.0.0;com.fasterxml.jackson.core:jackson-databind:2.8.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.1","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.data:spring-data-elasticsearch:3.0.0.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.9.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.7","packageFilePaths":["/dd-java-agent/instrumentation/couchbase-2.6/couchbase-2.6.gradle","/dd-java-agent/instrumentation/aws-java-sdk-2.2/aws-java-sdk-2.2.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.couchbase.cli
ent:encryption:1.0.0;com.fasterxml.jackson.core:jackson-databind:2.9.7","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.6.4","packageFilePaths":["/dd-java-agent/instrumentation/couchbase-2.0/couchbase-2.0.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.data:spring-data-couchbase:2.0.0.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.6.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.8","packageFilePaths":["/dd-smoke-tests/play-2.5/play-2.5.gradle","/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play_2.11:2.5.19;com.fasterxml.jackson.core:jackson-databind:2.7.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.1","packageFilePaths":["/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-java_2.11:2.5.0;com.typesafe.play:play_2.11:2.5.0;com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.7.1;com.fasterxml.jackson.core:jackson-databind:2.7.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.3.2","packageFilePaths":["/dd-java-agent/instrumentation/dropwizard/dropwizard-views/dropwizard-views.gradle","/dd-java-agent/instrumentation/play-2.3/play-2.3.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-java-ws_2.11:2.3.10;com.typesafe.play:play_2.11:2.3.10;com.fasterxml.jackson.core:jackson-databind:2.3.2","isMinimumFixVersionAvailable":true,
"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.9.3","packageFilePaths":["/dd-java-agent/instrumentation/datastax-cassandra-3/datastax-cassandra-3.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.datastax.cassandra:cassandra-driver-core:3.11.0;com.fasterxml.jackson.core:jackson-databind:2.7.9.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.3.3","packageFilePaths":["/dd-java-agent/instrumentation/jax-rs-annotations-1/jax-rs-annotations-1.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.dropwizard:dropwizard-testing:0.7.1;io.dropwizard:dropwizard-core:0.7.1;io.dropwizard:dropwizard-jackson:0.7.1;com.fasterxml.jackson.core:jackson-databind:2.3.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.4","packageFilePaths":["/dd-java-agent/instrumentation/spring-rabbit/spring-rabbit.gradle","/dd-java-agent/instrumentation/finatra-2.9/finatra-2.9.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.amqp:spring-rabbit:2.0.0.RELEASE;com.rabbitmq:http-client:1.3.0.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.8.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.5.3","packageFilePaths":["/dd-java-agent/instrumentation/aws-java-sdk-1.11.0/aws-java-sdk-1.11.0.gradle","/dd-java-agent/instrumentation/aws-java-sqs-1.0/aws-java-sqs-1.0.gradle","/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play_2.11:2.4.0;com.fasterxml.jackson.core:jackson-databind:2.5.3","isMinimumFixVer
sionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.4","packageFilePaths":["/dd-java-agent/instrumentation/spring-webflux-5/spring-webflux-5.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-webflux:2.0.0.RELEASE;org.springframework.boot:spring-boot-starter-json:2.0.0.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.9.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.5.4","packageFilePaths":["/dd-smoke-tests/play-2.4/play-2.4.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play_2.11:2.4.11;com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.5.4;com.fasterxml.jackson.core:jackson-databind:2.5.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.0","packageFilePaths":["/dd-java-agent/instrumentation/vertx-rx-3.5/vertx-rx-3.5.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.vertx:vertx-rx-java2:3.5.0;io.vertx:vertx-core:3.5.0;com.fasterxml.jackson.core:jackson-databind:2.9.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.7","packageFilePaths":["/dd-java-agent/instrumentation/twilio/twilio.gradle","/dd-java-agent/instrumentation/ratpack-1.5/ratpack-1.5.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.ratpack:ratpack-core:1.5.0;com.fasterxml.jackson.core:jackson-databind:2.8.7","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.3","p
ackageFilePaths":["/dd-smoke-tests/log-injection/log-injection.gradle"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.8.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2018-19362","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.8 might allow attackers to have unspecified impact by leveraging failure to block the jboss-common-core class from polymorphic deserialization.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-19362","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
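Most of the vulnerable `jackson-databind` versions listed above are pulled in transitively, so bumping a direct dependency is not always enough. One way to apply the suggested fix (2.9.8) in an affected Gradle module is to force the resolved version; the block below is an illustrative sketch, not taken from this repository's build files:

```groovy
// Sketch: force every configuration in the module to resolve the patched
// jackson-databind, overriding the transitive 2.x versions listed above.
configurations.all {
    resolutionStrategy {
        force 'com.fasterxml.jackson.core:jackson-databind:2.9.8'
    }
}
```

Running `./gradlew dependencyInsight --dependency jackson-databind` afterwards can be used to confirm which version each configuration actually resolves.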
/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.3.2/c75edc740a6d8cb1cef6fa82fa594e0bce561916/jackson-databind-2.3.2.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.3.2/c75edc740a6d8cb1cef6fa82fa594e0bce561916/jackson-databind-2.3.2.jar</p> <p> Dependency Hierarchy: - play-java-ws_2.11-2.3.10.jar (Root Library) - play_2.11-2.3.10.jar - :x: **jackson-databind-2.3.2.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.7.9.3.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/datastax-cassandra-3/datastax-cassandra-3.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.7.9.3/fc6d8373d2f5a012473c764c3556704be6da15e/jackson-databind-2.7.9.3.jar</p> <p> Dependency Hierarchy: - cassandra-driver-core-3.11.0.jar (Root Library) - :x: **jackson-databind-2.7.9.3.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.3.3.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/jax-rs-annotations-1/jax-rs-annotations-1.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.3.3/63b77400b5f1cf83a81823562c48d3120ef5518e/jackson-databind-2.3.3.jar</p> <p> Dependency Hierarchy: - dropwizard-testing-0.7.1.jar (Root Library) - dropwizard-core-0.7.1.jar - dropwizard-jackson-0.7.1.jar - :x: **jackson-databind-2.3.3.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.4.jar</b></p></summary> 
<p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/spring-rabbit/spring-rabbit.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.4/1c36c81e79cacdf48116afba8495e3393d267ba1/jackson-databind-2.8.4.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.4/1c36c81e79cacdf48116afba8495e3393d267ba1/jackson-databind-2.8.4.jar</p> <p> Dependency Hierarchy: - spring-rabbit-2.0.0.RELEASE.jar (Root Library) - http-client-1.3.0.RELEASE.jar - :x: **jackson-databind-2.8.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.5.3.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/aws-java-sdk-1.11.0/aws-java-sdk-1.11.0.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.5.3/c37875ff66127d93e5f672708cb2dcc14c8232ab/jackson-databind-2.5.3.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.5.3/c37875ff66127d93e5f672708cb2dcc14c8232ab/jackson-databind-2.5.3.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.5.3/c37875ff66127d93e5f672708cb2dcc14c8232ab/jackson-databind-2.5.3.jar</p> <p> Dependency Hierarchy: - play_2.11-2.4.0.jar (Root Library) - :x: **jackson-databind-2.5.3.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.9.4.jar</b></p></summary> <p>General 
data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/spring-webflux-5/spring-webflux-5.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.4/498bbc3b94f566982c7f7c6d4d303fce365529be/jackson-databind-2.9.4.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-webflux-2.0.0.RELEASE.jar (Root Library) - spring-boot-starter-json-2.0.0.RELEASE.jar - :x: **jackson-databind-2.9.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.5.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-smoke-tests/play-2.4/play-2.4.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.5.4/5dfa42af84584b4a862ea488da84bbbebbb06c35/jackson-databind-2.5.4.jar</p> <p> Dependency Hierarchy: - play_2.11-2.4.11.jar (Root Library) - jackson-datatype-jsr310-2.5.4.jar - :x: **jackson-databind-2.5.4.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.9.0.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/vertx-rx-3.5/vertx-rx-3.5.gradle</p> <p>Path to vulnerable library: 
/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.0/14fb5f088cc0b0dc90a73ba745bcade4961a3ee3/jackson-databind-2.9.0.jar</p> <p> Dependency Hierarchy: - vertx-rx-java2-3.5.0.jar (Root Library) - vertx-core-3.5.0.jar - :x: **jackson-databind-2.9.0.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.7.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/twilio/twilio.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.7/6c3257ef458ac58a8da69a6dca3d2a15286d88c8/jackson-databind-2.8.7.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.7/6c3257ef458ac58a8da69a6dca3d2a15286d88c8/jackson-databind-2.8.7.jar</p> <p> Dependency Hierarchy: - ratpack-core-1.5.0.jar (Root Library) - :x: **jackson-databind-2.8.7.jar** (Vulnerable Library) </details> <details><summary><b>jackson-databind-2.8.3.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: dd-trace-java/dd-smoke-tests/log-injection/log-injection.gradle</p> <p>Path to vulnerable library: /caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.3/cea3788c72271d45676ce32c0665991674b24cc5/jackson-databind-2.8.3.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.8.3.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a 
href="https://github.com/KDWSS/dd-trace-java/commit/2819174635979a19573ec0ce8e3e2b63a3848079">2819174635979a19573ec0ce8e3e2b63a3848079</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.8 might allow attackers to have unspecified impact by leveraging failure to block the jboss-common-core class from polymorphic deserialization. <p>Publish Date: 2019-01-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-19362>CVE-2018-19362</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19362">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19362</a></p> <p>Release Date: 2019-01-02</p> <p>Fix Resolution: 2.9.8</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.9.1","packageFilePaths":["/dd-java-agent/appsec/weblog/weblog-spring-app/weblog-spring-app.gradle"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.7.9.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.11.1","packageFilePaths":["/dd-java-agent/benchmark-integration/play-perftest/play-perftest.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play_2.12:2.6.20;io.jsonwebtoken:jjwt:0.7.0;com.fasterxml.jackson.core:jackson-databind:2.8.11.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.4","packageFilePaths":["/dd-java-agent/instrumentation/vertx-web-3.4/vertx-web-3.4.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.vertx:vertx-web:3.4.0;io.vertx:vertx-core:3.4.0;com.fasterxml.jackson.core:jackson-databind:2.7.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.5","packageFilePaths":["/dd-java-agent/instrumentation/spring-cloud-zuul-2/spring-cloud
-zuul-2.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.netflix.zuul:zuul-core:1.3.1;com.netflix.archaius:archaius-core:0.7.6;com.fasterxml.jackson.core:jackson-databind:2.7.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.6.6","packageFilePaths":["/dd-java-agent/instrumentation/aws-java-sqs-1.0/aws-java-sqs-1.0.gradle","/dd-java-agent/instrumentation/aws-java-sdk-1.11.0/aws-java-sdk-1.11.0.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.amazonaws:aws-java-sdk-kinesis:1.11.106;com.amazonaws:aws-java-sdk-core:1.11.106;com.fasterxml.jackson.core:jackson-databind:2.6.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.11.2","packageFilePaths":["/dd-java-agent/instrumentation/spring-webmvc-3.1/spring-webmvc-3.1.gradle","/dd-java-agent/instrumentation/elasticsearch/transport-2/transport-2.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:1.5.17.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.8.11.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.9","packageFilePaths":["/dd-java-agent/instrumentation/play-2.6/play-2.6.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play_2.11:2.6.0;io.jsonwebtoken:jjwt:0.7.0;com.fasterxml.jackson.core:jackson-databind:2.8.9","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.6.5","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/transport-2/transport-2.gradle"],"isTransitiveDependency":true,"de
pendencyTree":"org.springframework.data:spring-data-elasticsearch:2.0.0.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.6.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.11","packageFilePaths":["/dd-java-agent/instrumentation/akka-http-10.0/akka-http-10.0.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.lightbend.lagom:lagom-javadsl-testkit_2.11:1.4.0;com.lightbend.lagom:lagom-persistence-core_2.11:1.4.0;com.typesafe.play:play_2.11:2.6.11;io.jsonwebtoken:jjwt:0.7.0;com.fasterxml.jackson.core:jackson-databind:2.8.11","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.5","packageFilePaths":["/dd-java-agent/instrumentation/kafka-streams-0.11/kafka-streams-0.11.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.apache.kafka:kafka-streams:0.11.0.0;org.apache.kafka:connect-json:0.11.0.0;com.fasterxml.jackson.core:jackson-databind:2.8.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.1","packageFilePaths":["/dd-java-agent/instrumentation/elasticsearch/transport-5.3/transport-5.3.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.data:spring-data-elasticsearch:3.0.0.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.9.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.7","packageFilePaths":["/dd-java-agent/instrumentation/couchbase-2.6/couchbase-2.6.gradle","/dd-java-agent/instrumentation/aws-java-sdk-2.2/aws-java-sdk-2.2.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.couchbase.cli
ent:encryption:1.0.0;com.fasterxml.jackson.core:jackson-databind:2.9.7","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.6.4","packageFilePaths":["/dd-java-agent/instrumentation/couchbase-2.0/couchbase-2.0.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.data:spring-data-couchbase:2.0.0.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.6.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.8","packageFilePaths":["/dd-smoke-tests/play-2.5/play-2.5.gradle","/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play_2.11:2.5.19;com.fasterxml.jackson.core:jackson-databind:2.7.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.1","packageFilePaths":["/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-java_2.11:2.5.0;com.typesafe.play:play_2.11:2.5.0;com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.7.1;com.fasterxml.jackson.core:jackson-databind:2.7.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.3.2","packageFilePaths":["/dd-java-agent/instrumentation/dropwizard/dropwizard-views/dropwizard-views.gradle","/dd-java-agent/instrumentation/play-2.3/play-2.3.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-java-ws_2.11:2.3.10;com.typesafe.play:play_2.11:2.3.10;com.fasterxml.jackson.core:jackson-databind:2.3.2","isMinimumFixVersionAvailable":true,
"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.7.9.3","packageFilePaths":["/dd-java-agent/instrumentation/datastax-cassandra-3/datastax-cassandra-3.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.datastax.cassandra:cassandra-driver-core:3.11.0;com.fasterxml.jackson.core:jackson-databind:2.7.9.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.3.3","packageFilePaths":["/dd-java-agent/instrumentation/jax-rs-annotations-1/jax-rs-annotations-1.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.dropwizard:dropwizard-testing:0.7.1;io.dropwizard:dropwizard-core:0.7.1;io.dropwizard:dropwizard-jackson:0.7.1;com.fasterxml.jackson.core:jackson-databind:2.3.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.4","packageFilePaths":["/dd-java-agent/instrumentation/spring-rabbit/spring-rabbit.gradle","/dd-java-agent/instrumentation/finatra-2.9/finatra-2.9.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.amqp:spring-rabbit:2.0.0.RELEASE;com.rabbitmq:http-client:1.3.0.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.8.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.5.3","packageFilePaths":["/dd-java-agent/instrumentation/aws-java-sdk-1.11.0/aws-java-sdk-1.11.0.gradle","/dd-java-agent/instrumentation/aws-java-sqs-1.0/aws-java-sqs-1.0.gradle","/dd-java-agent/instrumentation/play-2.4/play-2.4.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play_2.11:2.4.0;com.fasterxml.jackson.core:jackson-databind:2.5.3","isMinimumFixVer
sionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.4","packageFilePaths":["/dd-java-agent/instrumentation/spring-webflux-5/spring-webflux-5.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-webflux:2.0.0.RELEASE;org.springframework.boot:spring-boot-starter-json:2.0.0.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.9.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.5.4","packageFilePaths":["/dd-smoke-tests/play-2.4/play-2.4.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play_2.11:2.4.11;com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.5.4;com.fasterxml.jackson.core:jackson-databind:2.5.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.0","packageFilePaths":["/dd-java-agent/instrumentation/vertx-rx-3.5/vertx-rx-3.5.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.vertx:vertx-rx-java2:3.5.0;io.vertx:vertx-core:3.5.0;com.fasterxml.jackson.core:jackson-databind:2.9.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.7","packageFilePaths":["/dd-java-agent/instrumentation/twilio/twilio.gradle","/dd-java-agent/instrumentation/ratpack-1.5/ratpack-1.5.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.ratpack:ratpack-core:1.5.0;com.fasterxml.jackson.core:jackson-databind:2.8.7","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.3","p
ackageFilePaths":["/dd-smoke-tests/log-injection/log-injection.gradle"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.8.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.9.8"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2018-19362","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.8 might allow attackers to have unspecified impact by leveraging failure to block the jboss-common-core class from polymorphic deserialization.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-19362","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
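Because most of the vulnerable `jackson-databind` occurrences above are transitive, the suggested remediation (upgrade to 2.9.8, the minimum fix version for CVE-2018-19362) is easiest to apply by pinning the version at resolution time rather than editing every module. A minimal sketch for a Groovy-DSL Gradle build; the coordinates come from the report, but applying this in a shared build script (rather than per-module) is an assumption about this repository's layout:

```groovy
// Force every configuration to resolve jackson-databind at the patched version.
// 2.9.8 is the minimum fix version reported for CVE-2018-19362; direct and
// transitive requests for older versions are overridden at resolution time.
configurations.all {
    resolutionStrategy {
        force 'com.fasterxml.jackson.core:jackson-databind:2.9.8'
    }
}
```

After applying, `./gradlew dependencyInsight --dependency jackson-databind` can be used to confirm which version each configuration now resolves and which dependency paths requested the older ones. Note that forcing a version across widely varying consumers (2.3.x through 2.9.x in the list above) may surface binary-compatibility issues in older frameworks, so the instrumentation test suites should be re-run after the change.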
dd java agent instrumentation vertx rx vertx rx gradle path to vulnerable library home wss scanner gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy vertx rx jar root library vertx core jar x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file dd trace java dd java agent instrumentation twilio twilio gradle path to vulnerable library home wss scanner gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar home wss scanner gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy ratpack core jar root library x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file dd trace java dd smoke tests log injection log injection gradle path to vulnerable library caches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind x before might allow attackers to have unspecified impact by leveraging failure to block the jboss common core class from polymorphic deserialization publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false 
dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com typesafe play play io jsonwebtoken jjwt com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree io vertx vertx web io vertx vertx core com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com netflix zuul zuul core com netflix archaius archaius core com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com amazonaws aws java sdk kinesis com amazonaws aws java sdk core com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree org springframework boot spring boot starter web release com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com typesafe play play io jsonwebtoken jjwt com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename 
jackson databind packageversion packagefilepaths istransitivedependency true dependencytree org springframework data spring data elasticsearch release com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com lightbend lagom lagom javadsl testkit com lightbend lagom lagom persistence core com typesafe play play io jsonwebtoken jjwt com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree org apache kafka kafka streams org apache kafka connect json com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree org springframework data spring data elasticsearch release com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com couchbase client encryption com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree org springframework data spring data couchbase release com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree 
com typesafe play play com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com typesafe play play java com typesafe play play com fasterxml jackson datatype jackson datatype com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com typesafe play play java ws com typesafe play play com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com datastax cassandra cassandra driver core com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree io dropwizard dropwizard testing io dropwizard dropwizard core io dropwizard dropwizard jackson com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree org springframework amqp spring rabbit release com rabbitmq http client release com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com typesafe play play com fasterxml jackson core jackson databind 
isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree org springframework boot spring boot starter webflux release org springframework boot spring boot starter json release com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree com typesafe play play com fasterxml jackson datatype jackson datatype com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree io vertx vertx rx io vertx vertx core com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree io ratpack ratpack core com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before might allow attackers to have unspecified impact by leveraging failure to block the jboss common core class from polymorphic deserialization vulnerabilityurl
0
16,599
2,920,434,475
IssuesEvent
2015-06-24 18:55:09
ashanbh/chrome-rest-client
https://api.github.com/repos/ashanbh/chrome-rest-client
closed
base-64 response did not display
auto-migrated Priority-Medium Type-Defect
``` What steps will reproduce the problem? 1. Our REST resource returns a JSON array with 2 variables. A status string and a base64 encoded PDF. Expect the JSON array. The same function works in the REST Console app. Instead the response header populates, but the output gives me a green PacMan icon that does not disappear. On what operating system, browser and browser version? Win 7 x64, Chrome 29.0.1547.0 dev-m Please provide any additional information below. This bug has existed since I started using the program about a year ago. ``` Original issue reported on code.google.com by `johnschram@gmail.com` on 22 Jul 2013 at 9:10
1.0
base-64 response did not display - ``` What steps will reproduce the problem? 1. Our REST resource returns a JSON array with 2 variables. A status string and a base64 encoded PDF. Expect the JSON array. The same function works in the REST Console app. Instead the response header populates, but the output gives me a green PacMan icon that does not disappear. On what operating system, browser and browser version? Win 7 x64, Chrome 29.0.1547.0 dev-m Please provide any additional information below. This bug has existed since I started using the program about a year ago. ``` Original issue reported on code.google.com by `johnschram@gmail.com` on 22 Jul 2013 at 9:10
defect
base response did not display what steps will reproduce the problem our rest resource returns a json array with variables a status string and a encoded pdf expect the json array the same function works in the rest console app instead the response header populates but the output gives me a green pacman icon that does not disappear on what operating system browser and browser version win chrome dev m please provide any additional information below this bug has existed since i started using the program about a year ago original issue reported on code google com by johnschram gmail com on jul at
1
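The chrome-rest-client record above describes a REST response carrying a status string plus a base64-encoded PDF inside a JSON payload. A minimal sketch of extracting and decoding such a payload follows; the field names `status` and `document` are hypothetical stand-ins, since the issue does not show the actual JSON keys:

```python
import base64
import json


def extract_pdf(response_text: str) -> tuple[str, bytes]:
    """Parse a JSON body holding a status string and a base64-encoded PDF."""
    payload = json.loads(response_text)
    status = payload["status"]
    # b64decode returns the raw PDF bytes, ready to write to a .pdf file.
    pdf_bytes = base64.b64decode(payload["document"])
    return status, pdf_bytes


# Round-trip demonstration with a tiny stand-in for real PDF bytes.
fake_pdf = b"%PDF-1.4 minimal"
body = json.dumps({
    "status": "ok",
    "document": base64.b64encode(fake_pdf).decode("ascii"),
})
status, pdf = extract_pdf(body)
```

A client that renders the decoded bytes directly (rather than trying to pretty-print them as text) avoids the stuck-spinner behavior the issue reports.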
131,139
27,831,583,150
IssuesEvent
2023-03-20 05:38:49
WordPress/openverse
https://api.github.com/repos/WordPress/openverse
closed
Plausible DB setup is not idempotent
🟩 priority: low 🛠 goal: fix 💻 aspect: code 🧱 stack: analytics
## Description <!-- Concisely describe the bug. Compare your experience with what you expected to happen. --> <!-- For example: "I clicked the 'submit' button and instead of seeing a thank you message, I saw a blank page." --> When you run `just init` locally, you get the following error output: ``` ERROR: duplicate key value violates unique constraint "api_keys_pkey" DETAIL: Key (id)=(1) already exists. {"error":"domain This domain has already been taken. Perhaps one of your team members registered it? If that's not the case, please contact support@plausible.io"}% ``` The Plausible set up should not try to register the domain if it is already registered. Setting this as low priority because it does not affect how Plausible or anything else works.
1.0
Plausible DB setup is not idempotent - ## Description <!-- Concisely describe the bug. Compare your experience with what you expected to happen. --> <!-- For example: "I clicked the 'submit' button and instead of seeing a thank you message, I saw a blank page." --> When you run `just init` locally, you get the following error output: ``` ERROR: duplicate key value violates unique constraint "api_keys_pkey" DETAIL: Key (id)=(1) already exists. {"error":"domain This domain has already been taken. Perhaps one of your team members registered it? If that's not the case, please contact support@plausible.io"}% ``` The Plausible set up should not try to register the domain if it is already registered. Setting this as low priority because it does not affect how Plausible or anything else works.
non_defect
plausible db setup is not idempotent description when you run just init locally you get the following error output error duplicate key value violates unique constraint api keys pkey detail key id already exists error domain this domain has already been taken perhaps one of your team members registered it if that s not the case please contact support plausible io the plausible set up should not try to register the domain if it is already registered setting this as low priority because it does not affect how plausible or anything else works
0
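The fix the openverse record asks for — skip registration when the domain is already registered — is a standard check-before-create guard. A minimal sketch against an in-memory registry; the real Plausible API and database are not modeled here:

```python
def register_domain(registry: set[str], domain: str) -> bool:
    """Register a domain only if it is absent; return True when a
    registration actually happened.

    Calling this twice with the same domain is a no-op the second time,
    which is what makes the setup step idempotent.
    """
    if domain in registry:
        return False
    registry.add(domain)
    return True


registry: set[str] = set()
first = register_domain(registry, "localhost")   # performs the registration
second = register_domain(registry, "localhost")  # already present: no-op
```

In the real setup script the existence check would be an API lookup (or an upsert / `ON CONFLICT DO NOTHING` insert) rather than a set membership test, but the shape of the guard is the same.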
8,042
4,150,420,564
IssuesEvent
2016-06-15 17:17:28
gogglesmm/gogglesmm
https://api.github.com/repos/gogglesmm/gogglesmm
reopened
cmake build system
buildsystem
Improvements to be made to cmake build system in regards to #51, #52, and #53 - [x] Fix shared library build of libgap - [x] libgap when build as shared library should link to FOX. - [x] CMAKE_INSTALL_LIBDIR should be usable as relative and absolute path - [x] Install GAP plugins in correct LIBDIR. - [ ] Make -lto compilation optional and disable by default.
1.0
cmake build system - Improvements to be made to cmake build system in regards to #51, #52, and #53 - [x] Fix shared library build of libgap - [x] libgap when build as shared library should link to FOX. - [x] CMAKE_INSTALL_LIBDIR should be usable as relative and absolute path - [x] Install GAP plugins in correct LIBDIR. - [ ] Make -lto compilation optional and disable by default.
non_defect
cmake build system improvements to be made to cmake build system in regards to and fix shared library build of libgap libgap when build as shared library should link to fox cmake install libdir should be usable as relative and absolute path install gap plugins in correct libdir make lto compilation optional and disable by default
0
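The CMAKE_INSTALL_LIBDIR item in the record above — make the variable usable as both a relative and an absolute path — boils down to joining the value onto the install prefix only when it is relative. A minimal sketch of that resolution rule in Python (the prefix and directory values are illustrative, not taken from the project):

```python
import posixpath


def resolve_libdir(prefix: str, libdir: str) -> str:
    """Mimic the GNUInstallDirs convention: a relative LIBDIR is joined
    to the install prefix, while an absolute LIBDIR is used as given."""
    if posixpath.isabs(libdir):
        return libdir
    return posixpath.join(prefix, libdir)


rel = resolve_libdir("/usr/local", "lib64")   # relative value: joined to prefix
abs_ = resolve_libdir("/usr/local", "/opt/lib")  # absolute value: used verbatim
```

CMake's own `GNUInstallDirs` module applies the same rule when it derives `CMAKE_INSTALL_FULL_LIBDIR`, which is why build scripts that always prepend the prefix break for absolute user-supplied values.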