Dataset schema (column, dtype, observed lengths/ranges/distinct values):

| Column | Dtype | Stats |
| --- | --- | --- |
| Unnamed: 0 | int64 | 0 – 825k |
| id | float64 | 2.49B – 32.1B |
| type | string | 1 distinct value |
| created_at | string | length 19 |
| repo | string | length 7 – 61 |
| repo_url | string | length 36 – 90 |
| action | string | 3 distinct values |
| title | string | length 4 – 228 |
| labels | string | length 4 – 352 |
| body | string | length 48 – 210k |
| index | string | 4 distinct values |
| text_combine | string | length 96 – 211k |
| label | string | 2 distinct values |
| text | string | length 96 – 146k |
| binary_label | int64 | 0 – 1 |
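The column list above can be summarized as a record type in TypeScript (the language of the article embedded later in this dump). This is only a sketch: the interface name `IssueRow` is ours, and the `'...'` field values stand in for the long text columns rather than reproducing them.

```typescript
// Hypothetical row type inferred from the schema above.
interface IssueRow {
  'Unnamed: 0': number;  // int64, 0 – 825k
  id: number;            // float64, ~2.49B – 32.1B
  type: string;          // 1 distinct value ("IssuesEvent")
  created_at: string;    // 19-char timestamp
  repo: string;          // "owner/name"
  repo_url: string;      // GitHub API repo URL
  action: string;        // one of 3 values, e.g. "closed"
  title: string;
  labels: string;
  body: string;
  index: string;         // one of 4 values
  text_combine: string;  // title + " - " + body
  label: string;         // "port" | "non_port"
  text: string;          // cleaned, lower-cased text
  binary_label: number;  // 0 or 1
}

// A row built from the first record in the dump (long text fields truncated):
const row: IssueRow = {
  'Unnamed: 0': 584,
  id: 7986119894,
  type: 'IssuesEvent',
  created_at: '2018-07-19 00:07:47',
  repo: 'rust-lang-nursery/stdsimd',
  repo_url: 'https://api.github.com/repos/rust-lang-nursery/stdsimd',
  action: 'closed',
  title: 'floating-point sum / product are buggy w.r.t. NaNs',
  labels: 'A-portable',
  body: '...',
  index: 'True',
  text_combine: '...',
  label: 'port',
  text: '...',
  binary_label: 1
};
```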
584
7,986,119,894
IssuesEvent
2018-07-19 00:07:47
rust-lang-nursery/stdsimd
https://api.github.com/repos/rust-lang-nursery/stdsimd
closed
floating-point sum / product are buggy w.r.t. NaNs
A-portable
Due to https://bugs.llvm.org/show_bug.cgi?id=36732, `wrapping_sum` / `wrapping_product` are implemented with fast-math flags unconditionally enabled, which results in inconsistencies like them returning a `NaN` for which the `nan.is_nan()` method returns `false`... We'll probably need to work-around these issues here in `stdsimd`.
True
floating-point sum / product are buggy w.r.t. NaNs - Due to https://bugs.llvm.org/show_bug.cgi?id=36732, `wrapping_sum` / `wrapping_product` are implemented with fast-math flags unconditionally enabled, which results in inconsistencies like them returning a `NaN` for which the `nan.is_nan()` method returns `false`... We'll probably need to work-around these issues here in `stdsimd`.
port
floating point sum product are buggy w r t nans due to wrapping sum wrapping product are implemented with fast math flags unconditionally enabled which results in inconsistencies like them returning a nan for which the nan is nan method returns false we ll probably need to work around these issues here in stdsimd
1
40,848
2,868,945,909
IssuesEvent
2015-06-05 22:07:21
dart-lang/pub
https://api.github.com/repos/dart-lang/pub
closed
Use new Link() instead of shelling out to ln/mklink
bug Fixed Priority-Medium
<a href="https://github.com/munificent"><img src="https://avatars.githubusercontent.com/u/46275?v=3" align="left" width="96" height="96"hspace="10"></img></a> **Issue by [munificent](https://github.com/munificent)** _Originally opened as dart-lang/sdk#9467_ ---- Now that dart:io has an API for creating symlinks, we should use that.
1.0
Use new Link() instead of shelling out to ln/mklink - <a href="https://github.com/munificent"><img src="https://avatars.githubusercontent.com/u/46275?v=3" align="left" width="96" height="96"hspace="10"></img></a> **Issue by [munificent](https://github.com/munificent)** _Originally opened as dart-lang/sdk#9467_ ---- Now that dart:io has an API for creating symlinks, we should use that.
non_port
use new link instead of shelling out to ln mklink issue by originally opened as dart lang sdk now that dart io has an api for creating symlinks we should use that
0
1,741
25,410,141,818
IssuesEvent
2022-11-22 18:14:34
golang/vulndb
https://api.github.com/repos/golang/vulndb
closed
x/vulndb: potential Go vuln in github.com/hashicorp/consul: GHSA-gw2g-hhc9-wgjh
excluded: NOT_IMPORTABLE
In GitHub Security Advisory [GHSA-gw2g-hhc9-wgjh](https://github.com/advisories/GHSA-gw2g-hhc9-wgjh), there is a vulnerability in the following Go packages or modules: | Unit | Fixed | Vulnerable Ranges | | - | - | - | | [github.com/hashicorp/consul](https://pkg.go.dev/github.com/hashicorp/consul) | 1.14.0 | >= 1.13.0, < 1.14.0 | See [doc/triage.md](https://github.com/golang/vulndb/blob/master/doc/triage.md) for instructions on how to triage this report. ``` modules: - module: TODO versions: - introduced: 1.13.0 fixed: 1.14.0 packages: - package: github.com/hashicorp/consul description: HashiCorp Consul and Consul Enterprise 1.13.0 up to 1.13.3 do not filter cluster filtering's imported nodes and services for HTTP or RPC endpoints used by the UI. Fixed in 1.14.0. cves: - CVE-2022-3920 ghsas: - GHSA-gw2g-hhc9-wgjh ```
True
x/vulndb: potential Go vuln in github.com/hashicorp/consul: GHSA-gw2g-hhc9-wgjh - In GitHub Security Advisory [GHSA-gw2g-hhc9-wgjh](https://github.com/advisories/GHSA-gw2g-hhc9-wgjh), there is a vulnerability in the following Go packages or modules: | Unit | Fixed | Vulnerable Ranges | | - | - | - | | [github.com/hashicorp/consul](https://pkg.go.dev/github.com/hashicorp/consul) | 1.14.0 | >= 1.13.0, < 1.14.0 | See [doc/triage.md](https://github.com/golang/vulndb/blob/master/doc/triage.md) for instructions on how to triage this report. ``` modules: - module: TODO versions: - introduced: 1.13.0 fixed: 1.14.0 packages: - package: github.com/hashicorp/consul description: HashiCorp Consul and Consul Enterprise 1.13.0 up to 1.13.3 do not filter cluster filtering's imported nodes and services for HTTP or RPC endpoints used by the UI. Fixed in 1.14.0. cves: - CVE-2022-3920 ghsas: - GHSA-gw2g-hhc9-wgjh ```
port
x vulndb potential go vuln in github com hashicorp consul ghsa wgjh in github security advisory there is a vulnerability in the following go packages or modules unit fixed vulnerable ranges see for instructions on how to triage this report modules module todo versions introduced fixed packages package github com hashicorp consul description hashicorp consul and consul enterprise up to do not filter cluster filtering s imported nodes and services for http or rpc endpoints used by the ui fixed in cves cve ghsas ghsa wgjh
1
410
6,552,008,599
IssuesEvent
2017-09-05 16:37:43
Shinmera/portacle
https://api.github.com/repos/Shinmera/portacle
closed
Fails to start on Fedora 25
portability
Portacle 0.12 won't start on Fedora 25. `portacle.desktop` shown as `Portacle` didn't do anything, so I ran the `Exec` command from within the same directory. ``` bash -c 'cd $(dirname %k) && ./portacle.run' p11-kit: couldn't load module: /usr/lib/x86_64-linux-gnu/pkcs11/p11-kit-trust.so: /usr/lib/x86_64-linux-gnu/pkcs11/p11-kit-trust.so: cannot open shared object file: No such file or directory p11-kit: couldn't load module: /usr/lib/x86_64-linux-gnu/pkcs11/gnome-keyring-pkcs11.so: /usr/lib/x86_64-linux-gnu/pkcs11/gnome-keyring-pkcs11.so: cannot open shared object file: No such file or directory Warning: arch-dependent data dir '/home/linus/portacle/lin/emacs/libexec/emacs/25.1/x86_64-unknown-linux-gnu/': No such file or directory Warning: Lisp directory '/home/linus/portacle/lin/emacs/share/emacs/25.1/lisp': No such file or directory GLib: Cannot convert message: Conversion from character set 'UTF-8' to 'ISO-8859-1' is not supported (emacs:28472): Gtk-WARNING **: Conversion from character set 'ISO-8859-1' to 'UTF-8' is not supported (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "adwaita", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: 
"pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "adwaita", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "adwaita", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in 
module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "adwaita", GLib-GIO-Message: Using the 'memory' GSettings backend. Your settings will not be saved or shared with other applications. (emacs:28472): GLib-CRITICAL **: g_error_new_literal: assertion 'message != NULL' failed Fatal error 11: Segmentation fault Backtrace: /home/martin/portacle//lin/emacs/bin/emacs[0x4f7742] /home/martin/portacle//lin/emacs/bin/emacs[0x4df2e9] /home/martin/portacle//lin/emacs/bin/emacs[0x4f65be] /home/martin/portacle//lin/emacs/bin/emacs[0x4f67c3] /home/martin/portacle//lin/emacs/bin/emacs[0x4f67fa] /home/martin/portacle//lin/lib/libpthread.so.0(+0x10330)[0x7f4ea484b330] /home/martin/portacle//lin/lib/libgdk_pixbuf-2.0.so.0(+0xa253)[0x7f4ea7096253] /home/martin/portacle//lin/lib/libgdk_pixbuf-2.0.so.0(gdk_pixbuf_get_formats+0xd)[0x7f4ea709866d] /home/martin/portacle//lin/lib/libgtk-x11-2.0.so.0(+0xff2dc)[0x7f4ea79d32dc] /home/martin/portacle//lin/lib/libgobject-2.0.so.0(g_type_create_instance+0x1eb)[0x7f4ea6c2ee3b] /home/martin/portacle//lin/lib/libgobject-2.0.so.0(+0x15355)[0x7f4ea6c13355] /home/martin/portacle//lin/lib/libgobject-2.0.so.0(g_object_newv+0x22d)[0x7f4ea6c1510d] /home/martin/portacle//lin/lib/libgobject-2.0.so.0(g_object_new+0xec)[0x7f4ea6c158bc] 
/home/martin/portacle//lin/lib/libgtk-x11-2.0.so.0(gtk_icon_theme_get_for_screen+0x77)[0x7f4ea79d35c7] /home/martin/portacle//lin/emacs/bin/emacs[0x4d91a5] /home/martin/portacle//lin/emacs/bin/emacs[0x4da968] /home/martin/portacle//lin/emacs/bin/emacs[0x4ca9a0] /home/martin/portacle//lin/emacs/bin/emacs[0x4cb464] /home/martin/portacle//lin/emacs/bin/emacs[0x550832] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x550282] /home/martin/portacle//lin/emacs/bin/emacs[0x550643] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x550643] /home/martin/portacle//lin/emacs/bin/emacs[0x551b70] /home/martin/portacle//lin/emacs/bin/emacs[0x55073a] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x550643] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x550643] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x550643] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x550643] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x54f7f3] /home/martin/portacle//lin/emacs/bin/emacs[0x54fb3e] /home/martin/portacle//lin/emacs/bin/emacs[0x553001] /home/martin/portacle//lin/emacs/bin/emacs[0x54f0dd] /home/martin/portacle//lin/emacs/bin/emacs[0x4e1ddc] /home/martin/portacle//lin/emacs/bin/emacs[0x54f08b] ... ./portacle.run: line 7: 28472 Segmentation fault (core dumped) "$ROOT/lin/launcher/portacle" "$@" $ ```
True
Fails to start on Fedora 25 - Portacle 0.12 won't start on Fedora 25. `portacle.desktop` shown as `Portacle` didn't do anything, so I ran the `Exec` command from within the same directory. ``` bash -c 'cd $(dirname %k) && ./portacle.run' p11-kit: couldn't load module: /usr/lib/x86_64-linux-gnu/pkcs11/p11-kit-trust.so: /usr/lib/x86_64-linux-gnu/pkcs11/p11-kit-trust.so: cannot open shared object file: No such file or directory p11-kit: couldn't load module: /usr/lib/x86_64-linux-gnu/pkcs11/gnome-keyring-pkcs11.so: /usr/lib/x86_64-linux-gnu/pkcs11/gnome-keyring-pkcs11.so: cannot open shared object file: No such file or directory Warning: arch-dependent data dir '/home/linus/portacle/lin/emacs/libexec/emacs/25.1/x86_64-unknown-linux-gnu/': No such file or directory Warning: Lisp directory '/home/linus/portacle/lin/emacs/share/emacs/25.1/lisp': No such file or directory GLib: Cannot convert message: Conversion from character set 'UTF-8' to 'ISO-8859-1' is not supported (emacs:28472): Gtk-WARNING **: Conversion from character set 'ISO-8859-1' to 'UTF-8' is not supported (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "adwaita", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate 
theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "adwaita", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "adwaita", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to 
locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "pixmap", (emacs:28472): Gtk-WARNING **: Unable to locate theme engine in module_path: "adwaita", GLib-GIO-Message: Using the 'memory' GSettings backend. Your settings will not be saved or shared with other applications. (emacs:28472): GLib-CRITICAL **: g_error_new_literal: assertion 'message != NULL' failed Fatal error 11: Segmentation fault Backtrace: /home/martin/portacle//lin/emacs/bin/emacs[0x4f7742] /home/martin/portacle//lin/emacs/bin/emacs[0x4df2e9] /home/martin/portacle//lin/emacs/bin/emacs[0x4f65be] /home/martin/portacle//lin/emacs/bin/emacs[0x4f67c3] /home/martin/portacle//lin/emacs/bin/emacs[0x4f67fa] /home/martin/portacle//lin/lib/libpthread.so.0(+0x10330)[0x7f4ea484b330] /home/martin/portacle//lin/lib/libgdk_pixbuf-2.0.so.0(+0xa253)[0x7f4ea7096253] /home/martin/portacle//lin/lib/libgdk_pixbuf-2.0.so.0(gdk_pixbuf_get_formats+0xd)[0x7f4ea709866d] /home/martin/portacle//lin/lib/libgtk-x11-2.0.so.0(+0xff2dc)[0x7f4ea79d32dc] /home/martin/portacle//lin/lib/libgobject-2.0.so.0(g_type_create_instance+0x1eb)[0x7f4ea6c2ee3b] /home/martin/portacle//lin/lib/libgobject-2.0.so.0(+0x15355)[0x7f4ea6c13355] /home/martin/portacle//lin/lib/libgobject-2.0.so.0(g_object_newv+0x22d)[0x7f4ea6c1510d] /home/martin/portacle//lin/lib/libgobject-2.0.so.0(g_object_new+0xec)[0x7f4ea6c158bc] 
/home/martin/portacle//lin/lib/libgtk-x11-2.0.so.0(gtk_icon_theme_get_for_screen+0x77)[0x7f4ea79d35c7] /home/martin/portacle//lin/emacs/bin/emacs[0x4d91a5] /home/martin/portacle//lin/emacs/bin/emacs[0x4da968] /home/martin/portacle//lin/emacs/bin/emacs[0x4ca9a0] /home/martin/portacle//lin/emacs/bin/emacs[0x4cb464] /home/martin/portacle//lin/emacs/bin/emacs[0x550832] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x550282] /home/martin/portacle//lin/emacs/bin/emacs[0x550643] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x550643] /home/martin/portacle//lin/emacs/bin/emacs[0x551b70] /home/martin/portacle//lin/emacs/bin/emacs[0x55073a] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x550643] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x550643] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x550643] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x550643] /home/martin/portacle//lin/emacs/bin/emacs[0x583d15] /home/martin/portacle//lin/emacs/bin/emacs[0x54f7f3] /home/martin/portacle//lin/emacs/bin/emacs[0x54fb3e] /home/martin/portacle//lin/emacs/bin/emacs[0x553001] /home/martin/portacle//lin/emacs/bin/emacs[0x54f0dd] /home/martin/portacle//lin/emacs/bin/emacs[0x4e1ddc] /home/martin/portacle//lin/emacs/bin/emacs[0x54f08b] ... ./portacle.run: line 7: 28472 Segmentation fault (core dumped) "$ROOT/lin/launcher/portacle" "$@" $ ```
port
fails to start on fedora portacle won t start on fedora portacle desktop shown as portacle didn t do anything so i ran the exec command from within the same directory bash c cd dirname k portacle run kit couldn t load module usr lib linux gnu kit trust so usr lib linux gnu kit trust so cannot open shared object file no such file or directory kit couldn t load module usr lib linux gnu gnome keyring so usr lib linux gnu gnome keyring so cannot open shared object file no such file or directory warning arch dependent data dir home linus portacle lin emacs libexec emacs unknown linux gnu no such file or directory warning lisp directory home linus portacle lin emacs share emacs lisp no such file or directory glib cannot convert message conversion from character set utf to iso is not supported emacs gtk warning conversion from character set iso to utf is not supported emacs gtk warning unable to locate theme engine in module path adwaita emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap 
emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path adwaita emacs gtk warning unable to locate theme engine in module path adwaita emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path pixmap emacs gtk warning unable to locate theme engine in module path adwaita glib gio message using the memory gsettings backend your settings will not be saved or shared with other applications emacs glib critical g error new literal assertion message 
null failed fatal error segmentation fault backtrace home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin lib libpthread so home martin portacle lin lib libgdk pixbuf so home martin portacle lin lib libgdk pixbuf so gdk pixbuf get formats home martin portacle lin lib libgtk so home martin portacle lin lib libgobject so g type create instance home martin portacle lin lib libgobject so home martin portacle lin lib libgobject so g object newv home martin portacle lin lib libgobject so g object new home martin portacle lin lib libgtk so gtk icon theme get for screen home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs home martin portacle lin emacs bin emacs portacle run line segmentation fault core dumped root lin launcher portacle
1
1,606
23,245,041,986
IssuesEvent
2022-08-03 19:15:22
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
Excluded from the TOC
azure-supportability/svc triaged assigned-to-author doc-enhancement Pri2
This page seems to be excluded from the TOC by the commit https://github.com/MicrosoftDocs/azure-docs/commit/f273f46a4e4fdd644c06857c12bc911bf75fd068#diff-d29a77de58e37b3c6d943a73d35641b4 --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: fe04a05a-f753-ee1f-5171-fd2a81a030f9 * Version Independent ID: 1507a7ac-11cc-81a7-e932-9e8ef9644e2b * Content: [Azure Resource Manager vCPU quota increase requests](https://docs.microsoft.com/en-us/azure/azure-portal/supportability/resource-manager-core-quotas-request) * Content Source: [articles/azure-portal/supportability/resource-manager-core-quotas-request.md](https://github.com/Microsoft/azure-docs/blob/master/articles/azure-portal/supportability/resource-manager-core-quotas-request.md) * Service: **azure-supportability** * GitHub Login: @sowmyavenkat86 * Microsoft Alias: **svenkat**
True
Excluded from the TOC - This page seems to be excluded from the TOC by the commit https://github.com/MicrosoftDocs/azure-docs/commit/f273f46a4e4fdd644c06857c12bc911bf75fd068#diff-d29a77de58e37b3c6d943a73d35641b4 --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: fe04a05a-f753-ee1f-5171-fd2a81a030f9 * Version Independent ID: 1507a7ac-11cc-81a7-e932-9e8ef9644e2b * Content: [Azure Resource Manager vCPU quota increase requests](https://docs.microsoft.com/en-us/azure/azure-portal/supportability/resource-manager-core-quotas-request) * Content Source: [articles/azure-portal/supportability/resource-manager-core-quotas-request.md](https://github.com/Microsoft/azure-docs/blob/master/articles/azure-portal/supportability/resource-manager-core-quotas-request.md) * Service: **azure-supportability** * GitHub Login: @sowmyavenkat86 * Microsoft Alias: **svenkat**
port
excluded from the toc this page seems to be excluded from the toc by the commit document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service azure supportability github login microsoft alias svenkat
1
135,876
30,442,800,804
IssuesEvent
2023-07-15 09:20:51
linwu-hi/coding-time
https://api.github.com/repos/linwu-hi/coding-time
opened
poker
javascript typescript dart leetcode 数据结构和算法 data-structures algorithms
# TypeScript in Practice: Ranking Poker Hands

[Run it online](https://code.juejin.cn/pen/7254739493366333499)

We'll implement poker hand ranking in TypeScript. First we define the data types we need, and then we focus on the hand-detection algorithm, which has several interesting points.

## Types and conversions

Let's define the types we need. `Rank` and `Suit` are obvious [union types](https://www.typescriptlang.org/docs/handbook/2/everyday-types.html#union-types).

```ts
type Rank =
  | 'A' | '2' | '3' | '4' | '5' | '6' | '7'
  | '8' | '9' | '10' | 'J' | 'Q' | 'K';

type Suit = '♥' | '♦' | '♠' | '♣';
```

We'll work internally with `Card` objects, converting rank and suit to numbers: ranks run from 1 (Ace) to 13 (King), and suits from 1 (hearts) to 4 (clubs). The `rankToNumber()` and `suitToNumber()` functions handle the conversions from `Rank` and `Suit` values to numbers.

```ts
type Card = { rank: number; suit: number };

const rankToNumber = (rank: Rank): number =>
  rank === 'A'
    ? 1
    : rank === 'J'
    ? 11
    : rank === 'Q'
    ? 12
    : rank === 'K'
    ? 13
    : Number(rank);

const suitToNumber = (suit: Suit): number =>
  suit === '♥'
    ? 1
    : suit === '♦'
    ? 2
    : suit === '♠'
    ? 3
    : /* suit === "♣" */ 4;
```

![-](./images/3.png)

These types are for internal use; we must also define the result type of the hand-detection algorithm. We need an [enum](https://www.typescriptlang.org/docs/handbook/enums.html) for the possible hand values, ordered from lowest ("high card") to highest ("royal flush").

```ts
enum Hand {
  HighCard,
  OnePair,
  TwoPairs,
  ThreeOfAKind,
  Straight,
  Flush,
  FullHouse,
  FourOfAKind,
  StraightFlush,
  RoyalFlush
}
```

## What hand do we have?

Let's start by defining the `handRank()` function we're going to build. It receives a tuple of five card strings and returns a `Hand` result.

```ts
export function handRank(
  cardStrings: [string, string, string, string, string]
): Hand {
  . . .
}
```

Since working with strings is more trouble than we need, we convert the card strings into `Card` objects with numeric `rank` and `suit` values, which makes the code easier to write.

```ts
const cards: Card[] = cardStrings.map((str: string) => ({
  rank: rankToNumber(
    str.substring(0, str.length - 1) as Rank
  ),
  suit: suitToNumber(str.at(-1) as Suit)
}));
. . .
// continues...
```

![-](./images/4.png)

The key to valuing a player's hand is knowing how many cards there are of each rank, and then how many of each count we have. For example, with three Jacks and two Kings, the count for J is 3 and the count for K is 2. Knowing we have one count of three and one count of two, we can determine we have a full house. Another example: with two Queens, two Aces, and a 5, we get two counts of two and one count of one: two pairs.

Generating the counts is simple. We want the Aces' count at `countByRank[1]`, so we won't use the initial position of the `countByRank` array. Similarly, suit counts will live at `countBySuit[1]` through `countBySuit[4]`, so we won't use that array's initial position either.

```ts
// ...continued
. . .
const countBySuit = new Array(5).fill(0);
const countByRank = new Array(15).fill(0);
const countBySet = new Array(5).fill(0);

cards.forEach((card: Card) => {
  countByRank[card.rank]++;
  countBySuit[card.suit]++;
});
countByRank.forEach(
  (count: number) => count && countBySet[count]++
);
. . .
// continues...
```

Let's not forget that an Ace may sit at the start of a straight (A-2-3-4-5) or at its end (10-J-Q-K-A). We can handle this by copying the Aces' count to the slot after the Kings.

```ts
// ...continued
. . .
countByRank[14] = countByRank[1];
. . .
// continues...
```

Now we can start identifying hands. Several of them can be recognized just by looking at the set counts:

```ts
// ...continued
. . .
if (countBySet[4] === 1 && countBySet[1] === 1)
  return Hand.FourOfAKind;
else if (countBySet[3] && countBySet[2] === 1)
  return Hand.FullHouse;
else if (countBySet[3] && countBySet[1] === 2)
  return Hand.ThreeOfAKind;
else if (countBySet[2] === 2 && countBySet[1] === 1)
  return Hand.TwoPairs;
else if (countBySet[2] === 1 && countBySet[1] === 3)
  return Hand.OnePair;
. . .
// continues...
```

For example, if there are four cards of the same rank, we know the player has four of a kind. You might ask: if `countBySet[4] === 1`, why also test that `countBySet[1] === 1`? If four cards share a rank, there can only be one other card, right? The answer is ["defensive programming"](https://en.wikipedia.org/wiki/Defensive_programming): mistakes creep in while developing code, and being more specific in our tests helps track them down.

The cases above cover every hand in which some rank appears more than once. We must now handle the rest: straights, flushes, and "high card".

```ts
// ...continued
. . .
else if (countBySet[1] === 5) {
  if (countByRank.join('').includes('11111'))
    return !countBySuit.includes(5)
      ? Hand.Straight
      : countByRank.slice(10).join('') === '11111'
      ? Hand.RoyalFlush
      : Hand.StraightFlush;
  else {
    return countBySuit.includes(5)
      ? Hand.Flush
      : Hand.HighCard;
  }
} else {
  throw new Error(
    'Unknown hand! This cannot happen! Bad logic!'
); } ``` 这里我们再次进行防御性编程;即使我们知道我们有五个不同的等级,我们也确保逻辑工作良好,甚至在出现问题时抛出一个`throw`。 我们如何测试顺子?我们应该有五个连续的等级。如果我们查看`countByRank`数组,它应该有五个连续的1,所以通过执行`countByRank.join()`并检查生成的字符串是否包含`11111`,我们可以确定是顺子。 ![-](./images/5.png) 我们必须区分几种情况: * 如果没有五张相同花色的牌,那么它是一个普通的顺子 * 如果所有牌都是相同花色,如果顺子以一张A结束,则为皇家同花顺 * 如果所有牌都是相同花色,但我们不以A结束,那么我们有一个同花顺 如果我们没有顺子,只有两种可能性: * 如果所有牌都是相同花色,我们有一个同花 * 如果不是所有牌都是相同花色,我们有一个“高牌” 完整的函数如下所示: ```ts export function handRank( cardStrings: [string, string, string, string, string] ): Hand { const cards: Card[] = cardStrings.map((str: string) => ({ rank: rankToNumber( str.substring(0, str.length - 1) as Rank ), suit: suitToNumber(str.at(-1) as Suit) })); // We won't use the [0] place in the following arrays const countBySuit = new Array(5).fill(0); const countByRank = new Array(15).fill(0); const countBySet = new Array(5).fill(0); cards.forEach((card: Card) => { countByRank[card.rank]++; countBySuit[card.suit]++; }); countByRank.forEach( (count: number) => count && countBySet[count]++ ); // count the A also as a 14, for straights countByRank[14] = countByRank[1]; if (countBySet[4] === 1 && countBySet[1] === 1) return Hand.FourOfAKind; else if (countBySet[3] && countBySet[2] === 1) return Hand.FullHouse; else if (countBySet[3] && countBySet[1] === 2) return Hand.ThreeOfAKind; else if (countBySet[2] === 2 && countBySet[1] === 1) return Hand.TwoPairs; else if (countBySet[2] === 1 && countBySet[1] === 3) return Hand.OnePair; else if (countBySet[1] === 5) { if (countByRank.join('').includes('11111')) return !countBySuit.includes(5) ? Hand.Straight : countByRank.slice(10).join('') === '11111' ? Hand.RoyalFlush : Hand.StraightFlush; else { /* !countByRank.join("").includes("11111") */ return countBySuit.includes(5) ? Hand.Flush : Hand.HighCard; } } else { throw new Error( 'Unknown hand! This cannot happen! Bad logic!' 
); } } ``` ## 测试代码 ```ts console.log(handRank(['3♥', '5♦', '8♣', 'A♥', '6♠'])); // 0 console.log(handRank(['3♥', '5♦', '8♣', 'A♥', '5♠'])); // 1 console.log(handRank(['3♥', '5♦', '3♣', 'A♥', '5♠'])); // 2 console.log(handRank(['3♥', '5♦', '8♣', '5♥', '5♠'])); // 3 console.log(handRank(['3♥', '2♦', 'A♣', '5♥', '4♠'])); // 4 console.log(handRank(['J♥', '10♦', 'A♣', 'Q♥', 'K♠'])); // 4 console.log(handRank(['3♥', '4♦', '7♣', '5♥', '6♠'])); // 4 console.log(handRank(['3♥', '4♥', '9♥', '5♥', '6♥'])); // 5 console.log(handRank(['3♥', '5♦', '3♣', '5♥', '3♠'])); // 6 console.log(handRank(['3♥', '3♦', '3♣', '5♥', '3♠'])); // 7 console.log(handRank(['3♥', '4♥', '7♥', '5♥', '6♥'])); // 8 console.log(handRank(['K♥', 'Q♥', 'A♥', '10♥', 'J♥'])); // 9 ``` [在线运行](https://code.juejin.cn/pen/7254739493366333499)
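顺子检测里 `countByRank.join('')` 的技巧可以单独拿出来验证。下面是一个基于上文思路的独立示意代码(并非原文函数,输入假设为 1~13 的数字等级,A=1,函数名仅为示意):

```typescript
// 独立示意:用计数数组加 join('') 检测顺子(A 可作最小或最大)。
function isStraight(ranks: number[]): boolean {
  const countByRank: number[] = new Array(15).fill(0); // 下标 0 不用
  for (const r of ranks) countByRank[r]++;
  countByRank[14] = countByRank[1]; // A 也按 14 计,处理 10-J-Q-K-A
  return countByRank.join('').includes('11111'); // 五个连续的 1 即顺子
}

console.log(isStraight([1, 2, 3, 4, 5]));     // true:A-2-3-4-5
console.log(isStraight([10, 11, 12, 13, 1])); // true:10-J-Q-K-A
console.log(isStraight([2, 3, 4, 5, 7]));     // false:有缺口
```

注意这个技巧依赖于五张牌等级互不相同(即 `countBySet[1] === 5` 的分支),否则计数里会出现大于 1 的数字,但也不会误判出 `'11111'`。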
1.0
non_port
0
928
12,219,309,256
IssuesEvent
2020-05-01 21:23:33
ocaml/opam
https://api.github.com/repos/ocaml/opam
closed
generate success/failure graphs
AREA: PLATFORM AREA: PORTABILITY KIND: FEATURE WISH
@yallop had the good idea of generating % success/failure graphs for builds across the repository. This will likely be an `opam-admin` option to generate the graph, so tracking it here (not a 1.2 blocker)
True
generate success/failure graphs - @yallop had the good idea of generating % success/failure graphs for builds across the repository. This will likely be an `opam-admin` option to generate the graph, so tracking it here (not a 1.2 blocker)
port
generate success failure graphs yallop had the good idea of generating success failure graphs for builds across the repository this will likely be an opam admin option to generate the graph so tracking it here not a blocker
1
448,616
12,954,406,331
IssuesEvent
2020-07-20 03:33:40
GoogleChrome/lighthouse
https://api.github.com/repos/GoogleChrome/lighthouse
closed
Tensorflow model assets import degrading performance scores randomly
needs-priority pending-close question
Hello lighthouse team, I'm trying to load a tfjs model locally for my web app ( in the end ) for a later point of time. All of my page metrics are fine except the fact that either lighthouse is giving me poor performance score or giving me score of 90+ with note : " page loaded too slowly to finish within the time limit ". Is there a way I can avoid my model import to affect my performance scores since I want to start loading my model as soon as possible after initial load. Link : https://toxicity-detector.web.app/ ( Checked with lighthouse chrome extension ) With regards, Aditya
1.0
Tensorflow model assets import degrading performance scores randomly - Hello lighthouse team, I'm trying to load a tjfs model locally for my web app ( in the end ) for a later point of time. All of my page metrics are fine except the fact that either lighthouse is giving me poor performance score or either giving me score of 90+ with note : " page loaded too slowly to finish within the time limit ". Is there a way I can avoid my model import to affect my performance scores since I want to start loading my model as soon as poosible after initial load. Link : https://toxicity-detector.web.app/ ( Checked with lighthouse chrome extension ) With regards, Aditya
non_port
tensorflow model assets import degrading performance scores randomly hello lighthouse team i m trying to load a tjfs model locally for my web app in the end for a later point of time all of my page metrics are fine except the fact that either lighthouse is giving me poor performance score or either giving me score of with note page loaded too slowly to finish within the time limit is there a way i can avoid my model import to affect my performance scores since i want to start loading my model as soon as poosible after initial load link checked with lighthouse chrome extension with regards aditya
0
1,978
30,925,044,823
IssuesEvent
2023-08-06 11:35:24
microsoft/winget-cli
https://api.github.com/repos/microsoft/winget-cli
closed
Symlinks are not created for portable installations
Portable
### Brief description of your issue When installing a portable app, the executable is correctly extracted and stored into the correct location, but the symlink referenced in the respective sqlite DB is not created. IMPORTANT: the chosen example is restic, but it is not limited to this tool. A different example would be VirusTotal.YARA ### Steps to reproduce Using the example of restic: 1. Run `winget install restic` ### Expected behavior - restic is installed to the specified portablePackageUserRoot - a symlink `C:\Users\<USER>\AppData\Local\Microsoft\WinGet\Links\restic.exe` is created pointing to the installed file is created - the directory `C:\Users\<USER>\AppData\Local\Microsoft\WinGet\Links\` is added to the users `PATH` It is possible to start the tool using `restic` ### Actual behavior - restic is installed to the specified portablePackageUserRoot - the directory `<portablePackageUserRoot>\restic.restic_Microsoft.Winget.Source_8wekyb3d8bbwe` is added to the users `PATH` This makes the tool not available under the alias `restic`, but only `restic_0.15.2_windows_amd64` which is the executable stored to disk ### Environment ```shell Windows Package Manager v1.5.1881 Copyright (c) Microsoft Corporation. All rights reserved. 
Windows: Windows.Desktop v10.0.19045.3271 System Architecture: X64 Package: Microsoft.DesktopAppInstaller v1.20.1881.0 Winget Directories ------------------------------------------------------------------------------------------------------------------------------- Logs %LOCALAPPDATA%\Packages\Microsoft.DesktopAppInstaller_8wekyb3d8bbwe\LocalState\DiagOutputDir User Settings %LOCALAPPDATA%\Packages\Microsoft.DesktopAppInstaller_8wekyb3d8bbwe\LocalState\settings.json Portable Links Directory (User) %LOCALAPPDATA%\Microsoft\WinGet\Links Portable Links Directory (Machine) C:\Program Files\WinGet\Links Portable Package Root (User) %USERPROFILE%\tools\ Portable Package Root C:\Program Files\WinGet\Packages Portable Package Root (x86) C:\Program Files (x86)\WinGet\Packages Links --------------------------------------------------------------------------- Privacy Statement https://aka.ms/winget-privacy License Agreement https://aka.ms/winget-license Third Party Notices https://aka.ms/winget-3rdPartyNotice Homepage https://aka.ms/winget Windows Store Terms https://www.microsoft.com/en-us/storedocs/terms-of-sale Admin Setting State -------------------------------------------------- LocalManifestFiles Disabled BypassCertificatePinningForMicrosoftStore Disabled InstallerHashOverride Disabled LocalArchiveMalwareScanOverride Disabled ```
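The expected post-install state the report describes (a portable executable plus an alias symlink in a Links directory) can be sketched with Node's symlink API. This is an illustrative model only, not winget's actual implementation; the paths and file names are stand-ins:

```typescript
// Hypothetical sketch: a portable exe in a package root, and an alias
// symlink in a "Links" directory that would sit on PATH.
import { mkdtempSync, writeFileSync, symlinkSync, readlinkSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';

const pkgDir = mkdtempSync(join(tmpdir(), 'pkg-'));     // stands in for the portable package root
const linksDir = mkdtempSync(join(tmpdir(), 'links-')); // stands in for the Links directory

const exe = join(pkgDir, 'restic_0.15.2_windows_amd64');
writeFileSync(exe, 'fake exe');

// This is the step the issue says is missing: the alias link.
symlinkSync(exe, join(linksDir, 'restic'));

console.log(readlinkSync(join(linksDir, 'restic')) === exe); // true
```

With the link in place, invoking the short alias resolves to the versioned executable, which is the behavior the issue expects from `winget install restic`.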
True
Symlinks are not created for portable installations - ### Brief description of your issue When installing a portable app, the executable is correctly extracted and stored into the correct location, but the symlink referenced in the respective sqlite DB is not created. IMPORTANT: the chosen example is restic, but it is not limited to this tool. A different example would be VirusTotal.YARA ### Steps to reproduce Using the example of restic: 1. Run `winget install restic` ### Expected behavior - restic is installed to the specified portablePackageUserRoot - a symlink `C:\Users\<USER>\AppData\Local\Microsoft\WinGet\Links\restic.exe` is created pointing to the installed file is created - the directory `C:\Users\<USER>\AppData\Local\Microsoft\WinGet\Links\` is added to the users `PATH` It is possible to start the tool using `restic` ### Actual behavior - restic is installed to the specified portablePackageUserRoot - the directory `<portablePackageUserRoot>\restic.restic_Microsoft.Winget.Source_8wekyb3d8bbwe` is added to the users `PATH` This makes the tool not available under the alias `restic`, but only `restic_0.15.2_windows_amd64` which is the executable stored to disk ### Environment ```shell Windows Package Manager v1.5.1881 Copyright (c) Microsoft Corporation. All rights reserved. 
Windows: Windows.Desktop v10.0.19045.3271 System Architecture: X64 Package: Microsoft.DesktopAppInstaller v1.20.1881.0 Winget Directories ------------------------------------------------------------------------------------------------------------------------------- Logs %LOCALAPPDATA%\Packages\Microsoft.DesktopAppInstaller_8wekyb3d8bbwe\LocalState\DiagOutputDir User Settings %LOCALAPPDATA%\Packages\Microsoft.DesktopAppInstaller_8wekyb3d8bbwe\LocalState\settings.json Portable Links Directory (User) %LOCALAPPDATA%\Microsoft\WinGet\Links Portable Links Directory (Machine) C:\Program Files\WinGet\Links Portable Package Root (User) %USERPROFILE%\tools\ Portable Package Root C:\Program Files\WinGet\Packages Portable Package Root (x86) C:\Program Files (x86)\WinGet\Packages Links --------------------------------------------------------------------------- Privacy Statement https://aka.ms/winget-privacy License Agreement https://aka.ms/winget-license Third Party Notices https://aka.ms/winget-3rdPartyNotice Homepage https://aka.ms/winget Windows Store Terms https://www.microsoft.com/en-us/storedocs/terms-of-sale Admin Setting State -------------------------------------------------- LocalManifestFiles Disabled BypassCertificatePinningForMicrosoftStore Disabled InstallerHashOverride Disabled LocalArchiveMalwareScanOverride Disabled ```
port
symlinks are not created for portable installations brief description of your issue when installing a portable app the executable is correctly extracted and stored into the correct location but the symlink referenced in the respective sqlite db is not created important the chosen example is restic but it is not limited to this tool a different example would be virustotal yara steps to reproduce using the example of restic run winget install restic expected behavior restic is installed to the specified portablepackageuserroot a symlink c users appdata local microsoft winget links restic exe is created pointing to the installed file is created the directory c users appdata local microsoft winget links is added to the users path it is possible to start the tool using restic actual behavior restic is installed to the specified portablepackageuserroot the directory restic restic microsoft winget source is added to the users path this makes the tool not available under the alias restic but only restic windows which is the executable stored to disk environment shell windows package manager copyright c microsoft corporation all rights reserved windows windows desktop system architecture package microsoft desktopappinstaller winget directories logs localappdata packages microsoft desktopappinstaller localstate diagoutputdir user settings localappdata packages microsoft desktopappinstaller localstate settings json portable links directory user localappdata microsoft winget links portable links directory machine c program files winget links portable package root user userprofile tools portable package root c program files winget packages portable package root c program files winget packages links privacy statement license agreement third party notices homepage windows store terms admin setting state localmanifestfiles disabled bypasscertificatepinningformicrosoftstore disabled installerhashoverride disabled localarchivemalwarescanoverride disabled
1
297,842
9,182,303,999
IssuesEvent
2019-03-05 12:30:40
servicemesher/istio-official-translation
https://api.github.com/repos/servicemesher/istio-official-translation
closed
content/docs/examples/advanced-gateways/_index.md
lang/zh pending priority/P0 sync/update version/1.1
文件路径:content/docs/examples/advanced-gateways/_index.md [源码](https://github.com/istio/istio.github.io/tree/master/content/docs/examples/advanced-gateways/_index.md) [网址](https://istio.io//docs/examples/advanced-gateways/_index.htm) ```diff diff --git a/content/docs/examples/advanced-gateways/_index.md b/content/docs/examples/advanced-gateways/_index.md index 364a9901..68a4ed5e 100644 --- a/content/docs/examples/advanced-gateways/_index.md +++ b/content/docs/examples/advanced-gateways/_index.md @@ -1,5 +1,5 @@ --- -title: Edge Traffic Management +title: Advanced Edge Traffic Management description: A variety of advanced examples for managing traffic at the edge (i.e., ingress and egress traffic) of an Istio service mesh. weight: 61 keywords: [ingress, egress, gateway] ```
1.0
content/docs/examples/advanced-gateways/_index.md - 文件路径:content/docs/examples/advanced-gateways/_index.md [源码](https://github.com/istio/istio.github.io/tree/master/content/docs/examples/advanced-gateways/_index.md) [网址](https://istio.io//docs/examples/advanced-gateways/_index.htm) ```diff diff --git a/content/docs/examples/advanced-gateways/_index.md b/content/docs/examples/advanced-gateways/_index.md index 364a9901..68a4ed5e 100644 --- a/content/docs/examples/advanced-gateways/_index.md +++ b/content/docs/examples/advanced-gateways/_index.md @@ -1,5 +1,5 @@ --- -title: Edge Traffic Management +title: Advanced Edge Traffic Management description: A variety of advanced examples for managing traffic at the edge (i.e., ingress and egress traffic) of an Istio service mesh. weight: 61 keywords: [ingress, egress, gateway] ```
non_port
content docs examples advanced gateways index md 文件路径:content docs examples advanced gateways index md diff diff git a content docs examples advanced gateways index md b content docs examples advanced gateways index md index a content docs examples advanced gateways index md b content docs examples advanced gateways index md title edge traffic management title advanced edge traffic management description a variety of advanced examples for managing traffic at the edge i e ingress and egress traffic of an istio service mesh weight keywords
0
512,572
14,900,877,629
IssuesEvent
2021-01-21 15:52:01
department-of-veterans-affairs/va.gov-team
https://api.github.com/repos/department-of-veterans-affairs/va.gov-team
opened
Front end display when Walk ins value = "unknown"
Q121-priority frontend frontend-vamc vsa vsa-facilities
## Issue Description Within the Audiology accordion for [University Drive](https://www.va.gov/pittsburgh-health-care/locations/pittsburgh-va-medical-center-university-drive/), the CMS value for "Walkins accepted" = Unknown. - On Staging, "Walkins accepted" = No. - On Prod, "Walkins accepted" does not appear at all. ### Possible Values for "Walk ins" and display settings If the value for "Walk ins" = "Yes" in CMS, the front end should display "Walk-ins accepted? Yes" If the value for "Walk ins" = "No" in CMS, the front end should display "Walk-ins accepted? No" If the value for "Walk ins" = "Unknown" in CMS, do not display the line "Walk-ins accepted" --- ## Tasks - [ ] Ensure the front end renders appropriately based on logic displayed above. ## Acceptance Criteria - [ ] When the value for "Walk ins" = "Yes" in CMS, the front end displays "Walk-ins accepted? Yes" - [ ] When the value for "Walk ins" = "No" in CMS, the front end displays "Walk-ins accepted? No" - [ ] When the value for "Walk ins" = "Unknown" in CMS, the line "Walk-ins accepted" is not displayed at all. This can be validated using the Audiology accordion for [University Drive](https://www.va.gov/pittsburgh-health-care/locations/pittsburgh-va-medical-center-university-drive/) --- ## How to configure this issue - [ ] **Attached to a Milestone** (when will this be completed?) - [ ] **Attached to an Epic** (what body of work is this a part of?) - [ ] **Labeled with Team** (`product support`, `analytics-insights`, `operations`, `service-design`, `tools-be`, `tools-fe`) - [ ] **Labeled with Practice Area** (`backend`, `frontend`, `devops`, `design`, `research`, `product`, `ia`, `qa`, `analytics`, `contact center`, `research`, `accessibility`, `content`) - [ ] **Labeled with Type** (`bug`, `request`, `discovery`, `documentation`, etc.)
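The display rule in the acceptance criteria amounts to a small three-way mapping. A minimal sketch, assuming a hypothetical helper name (this is not the actual va.gov frontend code):

```typescript
// Hypothetical sketch of the "Walk ins" display rule described in the issue.
function walkInsLine(value: 'Yes' | 'No' | 'Unknown'): string | null {
  if (value === 'Yes') return 'Walk-ins accepted? Yes';
  if (value === 'No') return 'Walk-ins accepted? No';
  return null; // "Unknown": do not render the line at all
}

console.log(walkInsLine('Yes'));     // Walk-ins accepted? Yes
console.log(walkInsLine('Unknown')); // null
```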
1.0
Front end display when Walk ins value = "unknown" - ## Issue Description Within the Audiology accordion for [University](Drive](https://www.va.gov/pittsburgh-health-care/locations/pittsburgh-va-medical-center-university-drive/), the CMS value for "Walkins accepted" = Unknown. - On Staging, "Walkins accepted" = No. - On Prod, "Walkins accepted" does not appear at all. ### Possible Values for "Walk ins" and display settings If the value for "Walk ins" = "Yes" in CMS, the front end should display "Walk-ins accepted? Yes" If the value for "Walk ins" = "No" in CMS, the front end should display "Walk-ins accepted? No" If the value for "Walk ins" = "Unknown" in CMS, do not display the line "Walk-ins accepted" --- ## Tasks - [ ] Ensure the front end renders appropriately based on logic displayed above. ## Acceptance Criteria - [ ] When the value for "Walk ins" = "Yes" in CMS, the front end displays "Walk-ins accepted? Yes" - [ ] When the value for "Walk ins" = "No" in CMS, the front end displays "Walk-ins accepted? No" - [ ] When the value for "Walk ins" = "Unknown" in CMS, the line "Walk-ins accepted" is not displayed at all. This can be validated using the Audiology accordion for [University](Drive](https://www.va.gov/pittsburgh-health-care/locations/pittsburgh-va-medical-center-university-drive/) --- ## How to configure this issue - [ ] **Attached to a Milestone** (when will this be completed?) - [ ] **Attached to an Epic** (what body of work is this a part of?) - [ ] **Labeled with Team** (`product support`, `analytics-insights`, `operations`, `service-design`, `tools-be`, `tools-fe`) - [ ] **Labeled with Practice Area** (`backend`, `frontend`, `devops`, `design`, `research`, `product`, `ia`, `qa`, `analytics`, `contact center`, `research`, `accessibility`, `content`) - [ ] **Labeled with Type** (`bug`, `request`, `discovery`, `documentation`, etc.)
non_port
front end display when walk ins value unknown issue description within the audiology accordion for drive the cms value for walkins accepted unknown on staging walkins accepted no on prod walkins accepted does not appear at all possible values for walk ins and display settings if the value for walk ins yes in cms the front end should display walk ins accepted yes if the value for walk ins no in cms the front end should display walk ins accepted no if the value for walk ins unknown in cms do not display the line walk ins accepted tasks ensure the front end renders appropriately based on logic displayed above acceptance criteria when the value for walk ins yes in cms the front end displays walk ins accepted yes when the value for walk ins no in cms the front end displays walk ins accepted no when the value for walk ins unknown in cms the line walk ins accepted is not displayed at all this can be validated using the audiology accordion for drive how to configure this issue attached to a milestone when will this be completed attached to an epic what body of work is this a part of labeled with team product support analytics insights operations service design tools be tools fe labeled with practice area backend frontend devops design research product ia qa analytics contact center research accessibility content labeled with type bug request discovery documentation etc
0
416,660
28,094,312,334
IssuesEvent
2023-03-30 14:51:38
VeryGoodOpenSource/dart_frog
https://api.github.com/repos/VeryGoodOpenSource/dart_frog
opened
docs: Custom Handler for non-file-based Routing
documentation
**Description** The docs do not currently state how to implement a custom router if file-based routing is not desired. This was requested in #467 and #530. I provided an example based on shelf_router in #530, but any other router would be fine too. **Requirements** - [ ] Provide a clear example on how to implement a non-file-based router - [ ] Provide references in Routing/Middleware/Custom Server Entrypoint sections
1.0
docs: Custom Handler for non-file-based Routing - **Description** The docs do not currently state how to implement a custom router if file-based routing is not desired. This was requested in #467 and #530. I provided an example based on shelf_router in #530, but any other router would be fine too. **Requirements** - [ ] Provide a clear example on how to implement a non-file-based router - [ ] Provide references in Routing/Middleware/Custom Server Entrypoint sections
non_port
docs custom handler for non file based routing description the docs do not currently state how to implement a custom router if file based routing is not desired this was requested in and i provided an example based on shelf router in but any other router would be fine too requirements provide a clear example on how to implement a non file based router provide references in routing middleware custom server entrypoint sections
0
1,410
2,544,426,081
IssuesEvent
2015-01-29 09:49:34
cogizz/metamodelsfilter_textcombine
https://api.github.com/repos/cogizz/metamodelsfilter_textcombine
closed
Fatal error: Class 'MetaModels\DcGeneral\Events\Table\FilterSetting\DrawSetting' not found
bug testing
Contao 3.3.5 metamodels/core dev-tng (5a98d965) I just installed metamodelsfilter_textcombine version dev-tng (b0c17243) via Composer. When I try to edit the attribute settings for my filter settings, I get the following error message: Fatal error: Class 'MetaModels\DcGeneral\Events\Table\FilterSetting\DrawSetting' not found in /.../.../.../.../.../composer/vendor/cogizz/metamodelsfilter_textcombine/src/system/modules/metamodelsfilter_textcombine/MetaModels/DcGeneral/Events/Table/FilterSetting/DrawTextCombineSetting.php on line 34 What can I do?
1.0
Fatal error: Class 'MetaModels\DcGeneral\Events\Table\FilterSetting\DrawSetting' not found - Contao 3.3.5 metamodels/core dev-tng (5a98d965) I just installed metamodelsfilter_textcombine version dev-tng (b0c17243) via Composer. When I try to edit the attribute settings for my filter settings, I get the following error message: Fatal error: Class 'MetaModels\DcGeneral\Events\Table\FilterSetting\DrawSetting' not found in /.../.../.../.../.../composer/vendor/cogizz/metamodelsfilter_textcombine/src/system/modules/metamodelsfilter_textcombine/MetaModels/DcGeneral/Events/Table/FilterSetting/DrawTextCombineSetting.php on line 34 What can I do?
non_port
fatal error class metamodels dcgeneral events table filtersetting drawsetting not found contao metamodels core dev tng i just installed metamodelsfilter textcombine version dev tng via composer when i try to edit the attribute settings for my filter settings i get the following error message fatal error class metamodels dcgeneral events table filtersetting drawsetting not found in composer vendor cogizz metamodelsfilter textcombine src system modules metamodelsfilter textcombine metamodels dcgeneral events table filtersetting drawtextcombinesetting php on line what can i do
0
827
10,597,104,797
IssuesEvent
2019-10-09 23:17:11
Azure/azure-functions-host
https://api.github.com/repos/Azure/azure-functions-host
reopened
Missing FunctionName in some logs
P1 Supportability
There are a few areas where "FunctionName" is missing from our logs: - where Source contains "Host.Triggers.Timer" - where Source contains "Script.Host" and where Summary contains "updated status: Last=" - where Summary contains 'functions are in error' and Summary contains ".Run"
True
Missing FunctionName in some logs - There are a few areas where "FunctionName" is missing from our logs: - where Source contains "Host.Triggers.Timer" - where Source contains "Script.Host" and where Summary contains "updated status: Last=" - where Summary contains 'functions are in error' and Summary contains ".Run"
port
missing functionname in some logs there are a few areas where functionname is missing from our logs where source contains host triggers timer where source contains script host and where summary contains updated status last where summary contains functions are in error and summary contains run
1
166,857
14,079,884,977
IssuesEvent
2020-11-04 15:26:01
OpenBankingToolkit/openbanking-reference-implementation
https://api.github.com/repos/OpenBankingToolkit/openbanking-reference-implementation
closed
Event Notification API - Documentation
documentation fixed: smiths
## Story As a customer I want to test that `Event Notification API` implementation works properly. ## Acceptance criteria - Documentation created - Following the documentation as a developer I can test the `Event Notification API`. ## Tasks _OPTIONAL_ List of tasks required to implement this story - [x] Update de documentation on docs application service. ### Release Notes Affected App: DOCS Description: Add the Event Notification section to the documentation application. <end release notes>
1.0
Event Notification API - Documentation - ## Story As a customer I want to test that `Event Notification API` implementation works properly. ## Acceptance criteria - Documentation created - Following the documentation as a developer I can test the `Event Notification API`. ## Tasks _OPTIONAL_ List of tasks required to implement this story - [x] Update de documentation on docs application service. ### Release Notes Affected App: DOCS Description: Add the Event Notification section to the documentation application. <end release notes>
non_port
event notification api documentation story as a customer i want to test that event notification api implementation works properly acceptance criteria documentation created following the documentation as a developer i can test the event notification api tasks optional list of tasks required to implement this story update de documentation on docs application service release notes affected app docs description add the event notification section to the documentation application
0
359,480
25,239,833,048
IssuesEvent
2022-11-15 06:10:27
arcus-azure/arcus.templates
https://api.github.com/repos/arcus-azure/arcus.templates
closed
Update README with available project templates
documentation
**Is your feature request related to a problem? Please describe.** Currently, we only list the Web API and the Azure Service Bus worker project templates. **Describe the solution you'd like** List all the available project templates, or point them directly to our feature documentation, as it would become a rather long list if we would list (and maintain) them all in the README file.
1.0
Update README with available project templates - **Is your feature request related to a problem? Please describe.** Currently, we only list the Web API and the Azure Service Bus worker project templates. **Describe the solution you'd like** List all the available project templates, or point them directly to our feature documentation, as it would become a rather long list if we would list (and maintain) them all in the README file.
non_port
update readme with available project templates is your feature request related to a problem please describe currently we only list the web api and the azure service bus worker project templates describe the solution you d like list all the available project templates or point them directly to our feature documentation as it would become a rather long list if we would list and maintain them all in the readme file
0
216,598
16,663,631,080
IssuesEvent
2021-06-06 19:36:11
Kuifje02/vrpy
https://api.github.com/repos/Kuifje02/vrpy
opened
from_numpy_matrix() is ambiguous when edges have weight 0
documentation invalid
When using the [networkx.from_numpy_matrix()](https://networkx.org/documentation/stable/reference/generated/networkx.convert_matrix.from_numpy_matrix.html) method, there is a problem with edges with weight 0. Such edges are not created. Should put a warning in the docs.
1.0
from_numpy_matrix() is ambiguous when edges have weight 0 - When using the [networkx.from_numpy_matrix()](https://networkx.org/documentation/stable/reference/generated/networkx.convert_matrix.from_numpy_matrix.html) method, there is a problem with edges with weight 0. Such edges are not created. Should put a warning in the docs.
non_port
from numpy matrix is ambiguous when edges have weight when using the method there is a problem with edges with weight such edges are not created should put a warning in the docs
0
313,827
26,957,886,098
IssuesEvent
2023-02-08 16:04:15
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
opened
Fix converters.test_from_backend_module
Sub Task Ivy Stateful API Failing Test
| | | |---|---| |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4002898723/jobs/6870536606" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/4002898723/jobs/6870536606" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4002898723/jobs/6870536606" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |jax|<a href="https://github.com/unifyai/ivy/actions/runs/4123979403/jobs/7122701774" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> <details> <summary>Not found</summary> Not found </details>
1.0
Fix converters.test_from_backend_module - | | | |---|---| |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4002898723/jobs/6870536606" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/4002898723/jobs/6870536606" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4002898723/jobs/6870536606" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |jax|<a href="https://github.com/unifyai/ivy/actions/runs/4123979403/jobs/7122701774" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> <details> <summary>Not found</summary> Not found </details>
non_port
fix converters test from backend module tensorflow img src torch img src numpy img src jax img src not found not found
0
126,453
26,858,310,216
IssuesEvent
2023-02-03 16:15:07
eclipse-theia/theia
https://api.github.com/repos/eclipse-theia/theia
closed
Backport Tabs API to 2023/1 Community Release
vscode
<!-- Please fill out the following content for a feature request. --> <!-- Please provide a clear description of the feature and any relevant information. --> ### Feature Description: There has been agreement in the community to backport the tabs API to the Jan. 2023 Community Release candidate. See https://github.com/eclipse-theia/theia/pull/12109
1.0
Backport Tabs API to 2023/1 Community Release - <!-- Please fill out the following content for a feature request. --> <!-- Please provide a clear description of the feature and any relevant information. --> ### Feature Description: There has been agreement in the community to backport the tabs API to the Jan. 2023 Community Release candidate. See https://github.com/eclipse-theia/theia/pull/12109
non_port
backport tabs api to community release feature description there has been agreement in the community to backport the tabs api to the jan community release candidate see
0
724
9,708,775,139
IssuesEvent
2019-05-28 08:35:35
microsoft/vscode
https://api.github.com/repos/microsoft/vscode
closed
Unable to write program user data when invoking VS Code Portable in a singularity image
bug install-update portable-mode
**Problem description:** I am trying to install vs code inside a singularity image. I, unfortunately, haven't been able to do this as I keep running into some problems. I first tried to install vs code inside the container using the[ .dep package](https://code.visualstudio.com/Download). However, as inside this container, the *vs code* program doesn't have write permissions to the user data and data folder on the main system it wont start. To solve this I tried using the portable version as this is explained in the [vscode portable documentation](https://code.visualstudio.com/docs/editor/portable). Unfortunately, also this gave me the `user-data and data directories should be writable` error. **System information:** - VSCode Version: 1.34.0 - OS Version: Ubuntu 16.04 (Singularity container) **Steps to Reproduce:** 1. Install singularity according to [this guide](https://www.sylabs.io/guides/3.2/user-guide/). 2. Build a ubuntu 16.04 singularity image by running the following command: `sudo singularity build --sandbox ubuntu1604 docker://ubuntu:16.04` 3. When finished run the shell as sudo by using `sudo singularity run --nv --writable ubuntu1604` 4. Download the lates tar.gz and unzip it: ``` curl -L "https://go.microsoft.com/fwlink/?LinkID=620884" > vscode-stable.tar.gz tar xzf vscode-stable.tar.gz ``` 5. Go into the *VSCode-linux-x64* folder. 6. Create a *user-data* and *data* folder as explained in the [vscode portable documentation](https://code.visualstudio.com/docs/editor/portable). 7. Try to run the VSCode program by executing `sh ./bin/code`. 8. You will now get the following error message: ![image](https://user-images.githubusercontent.com/17570430/58173708-5ab40380-7c9c-11e9-81dc-96abfaf69f64.png) This can be solved by running the shell as sudo but as this has some risks I was wondering if I can solve the error so that I can run vscode as a normal user from within a singularity container. 
**Extra information:** Does this issue occur when all extensions are disabled?: Yes **--verbose output:** ``Gtk-Message: Failed to load module "appmenu-gtk-module" Gtk-Message: Failed to load module "canberra-gtk-module" Gtk-Message: Failed to load module "canberra-gtk-module" [9043:0522/143121.884272:ERROR:bus.cc(394)] Failed to connect to the bus: Failed to connect to socket /var/run/dbus/system_bus_socket: No such file or directory Gtk-Message: GtkDialog mapped without a transient parent. This is discouraged. [main 2019-05-22T12:31:27.080Z] Error: listen EACCES /run/user/1000/vscode-5f7fbcc5-1.34.0-main.sock at Server.setupListenHandle [as _listen2] (net.js:1313:19) at listenInCluster (net.js:1378:12) at Server.listen (net.js:1477:5) at Promise (/jan/VSCode-linux-x64/resources/app/out/vs/code/electron-main/main.js:184:637) at new Promise (<anonymous>) at Object.t.serve (/jan/VSCode-linux-x64/resources/app/out/vs/code/electron-main/main.js:184:574) at n (/jan/VSCode-linux-x64/resources/app/out/vs/code/electron-main/main.js:490:263) at R (/jan/VSCode-linux-x64/resources/app/out/vs/code/electron-main/main.js:492:559) at l.invokeFunction (/jan/VSCode-linux-x64/resources/app/out/vs/code/electron-main/main.js:221:331) at then (/jan/VSCode-linux-x64/resources/app/out/vs/code/electron-main/main.js:494:347) [main 2019-05-22T12:31:27.084Z] Lifecycle#kill()``
True
Unable to write program user data when invoking VS Code Portable in a singularity image - **Problem description:** I am trying to install vs code inside a singularity image. I, unfortunately, haven't been able to do this as I keep running into some problems. I first tried to install vs code inside the container using the[ .dep package](https://code.visualstudio.com/Download). However, as inside this container, the *vs code* program doesn't have write permissions to the user data and data folder on the main system it wont start. To solve this I tried using the portable version as this is explained in the [vscode portable documentation](https://code.visualstudio.com/docs/editor/portable). Unfortunately, also this gave me the `user-data and data directories should be writable` error. **System information:** - VSCode Version: 1.34.0 - OS Version: Ubuntu 16.04 (Singularity container) **Steps to Reproduce:** 1. Install singularity according to [this guide](https://www.sylabs.io/guides/3.2/user-guide/). 2. Build a ubuntu 16.04 singularity image by running the following command: `sudo singularity build --sandbox ubuntu1604 docker://ubuntu:16.04` 3. When finished run the shell as sudo by using `sudo singularity run --nv --writable ubuntu1604` 4. Download the lates tar.gz and unzip it: ``` curl -L "https://go.microsoft.com/fwlink/?LinkID=620884" > vscode-stable.tar.gz tar xzf vscode-stable.tar.gz ``` 5. Go into the *VSCode-linux-x64* folder. 6. Create a *user-data* and *data* folder as explained in the [vscode portable documentation](https://code.visualstudio.com/docs/editor/portable). 7. Try to run the VSCode program by executing `sh ./bin/code`. 8. 
You will now get the following error message: ![image](https://user-images.githubusercontent.com/17570430/58173708-5ab40380-7c9c-11e9-81dc-96abfaf69f64.png) This can be solved by running the shell as sudo but as this has some risks I was wondering if I can solve the error so that I can run vscode as a normal user from within a singularity container. **Extra information:** Does this issue occur when all extensions are disabled?: Yes **--verbose output:** ``Gtk-Message: Failed to load module "appmenu-gtk-module" Gtk-Message: Failed to load module "canberra-gtk-module" Gtk-Message: Failed to load module "canberra-gtk-module" [9043:0522/143121.884272:ERROR:bus.cc(394)] Failed to connect to the bus: Failed to connect to socket /var/run/dbus/system_bus_socket: No such file or directory Gtk-Message: GtkDialog mapped without a transient parent. This is discouraged. [main 2019-05-22T12:31:27.080Z] Error: listen EACCES /run/user/1000/vscode-5f7fbcc5-1.34.0-main.sock at Server.setupListenHandle [as _listen2] (net.js:1313:19) at listenInCluster (net.js:1378:12) at Server.listen (net.js:1477:5) at Promise (/jan/VSCode-linux-x64/resources/app/out/vs/code/electron-main/main.js:184:637) at new Promise (<anonymous>) at Object.t.serve (/jan/VSCode-linux-x64/resources/app/out/vs/code/electron-main/main.js:184:574) at n (/jan/VSCode-linux-x64/resources/app/out/vs/code/electron-main/main.js:490:263) at R (/jan/VSCode-linux-x64/resources/app/out/vs/code/electron-main/main.js:492:559) at l.invokeFunction (/jan/VSCode-linux-x64/resources/app/out/vs/code/electron-main/main.js:221:331) at then (/jan/VSCode-linux-x64/resources/app/out/vs/code/electron-main/main.js:494:347) [main 2019-05-22T12:31:27.084Z] Lifecycle#kill()``
port
unable to write program user data when invoking vs code portable in a singularity image problem description i am trying to install vs code inside a singularity image i unfortunately haven t been able to do this as i keep running into some problems i first tried to install vs code inside the container using the however as inside this container the vs code program doesn t have write permissions to the user data and data folder on the main system it wont start to solve this i tried using the portable version as this is explained in the unfortunately also this gave me the user data and data directories should be writable error system information vscode version os version ubuntu singularity container steps to reproduce install singularity according to build a ubuntu singularity image by running the following command sudo singularity build sandbox docker ubuntu when finished run the shell as sudo by using sudo singularity run nv writable download the lates tar gz and unzip it curl l vscode stable tar gz tar xzf vscode stable tar gz go into the vscode linux folder create a user data and data folder as explained in the try to run the vscode program by executing sh bin code you will now get the following error message this can be solved by running the shell as sudo but as this has some risks i was wondering if i can solve the error so that i can run vscode as a normal user from within a singularity container extra information does this issue occur when all extensions are disabled yes verbose output gtk message failed to load module appmenu gtk module gtk message failed to load module canberra gtk module gtk message failed to load module canberra gtk module failed to connect to the bus failed to connect to socket var run dbus system bus socket no such file or directory gtk message gtkdialog mapped without a transient parent this is discouraged error listen eacces run user vscode main sock at server setuplistenhandle net js at listenincluster net js at server listen net js at 
promise jan vscode linux resources app out vs code electron main main js at new promise at object t serve jan vscode linux resources app out vs code electron main main js at n jan vscode linux resources app out vs code electron main main js at r jan vscode linux resources app out vs code electron main main js at l invokefunction jan vscode linux resources app out vs code electron main main js at then jan vscode linux resources app out vs code electron main main js lifecycle kill
1
253,689
21,699,294,615
IssuesEvent
2022-05-10 00:58:25
damccorm/test-migration-target
https://api.github.com/repos/damccorm/test-migration-target
opened
Enable org.apache.beam.sdk.transforms.GroupByKeyTest$WindowTests.testWindowFnPostMerging
P3 test runner-samza portability-samza
Imported from Jira [BEAM-12886](https://issues.apache.org/jira/browse/BEAM-12886). Original Jira may contain additional context. Reported by: kw2542. This issue has child subcomponents which were not migrated over. See the original Jira for more information.
1.0
Enable org.apache.beam.sdk.transforms.GroupByKeyTest$WindowTests.testWindowFnPostMerging - Imported from Jira [BEAM-12886](https://issues.apache.org/jira/browse/BEAM-12886). Original Jira may contain additional context. Reported by: kw2542. This issue has child subcomponents which were not migrated over. See the original Jira for more information.
non_port
enable org apache beam sdk transforms groupbykeytest windowtests testwindowfnpostmerging imported from jira original jira may contain additional context reported by this issue has child subcomponents which were not migrated over see the original jira for more information
0
636
8,551,440,249
IssuesEvent
2018-11-07 18:03:13
arangodb/arangodb
https://api.github.com/repos/arangodb/arangodb
closed
More descriptive error messages for ArangoSearch
1 Feature 3 ArangoSearch supportability
Using an unsupported analyzer results in the error `Query: AQL: unsupported SEARCH condition (while optimizing plan)`. While there is a more specific warning in the logs, the user usually does not see that and the error message could be more descriptive. Using a scorer with an illegal argument (e.g. `FOR doc IN view SORT BM25(doc.name)` or `SORT BM25({})`), the error message `Scorer function is designed to be used with ArangoSearch view only` is given. This could be more specific, especially as this will probably be a common mistake. Passing an illegal argument to `EXISTS`, e.g. `EXISTS("str")` results in the error message `Filter function is designed to be used with ArangoSearch view only (while optimizing ast)`.
True
More descriptive error messages for ArangoSearch - Using an unsupported analyzer results in the error `Query: AQL: unsupported SEARCH condition (while optimizing plan)`. While there is a more specific warning in the logs, the user usually does not see that and the error message could be more descriptive. Using a scorer with an illegal argument (e.g. `FOR doc IN view SORT BM25(doc.name)` or `SORT BM25({})`), the error message `Scorer function is designed to be used with ArangoSearch view only` is given. This could be more specific, especially as this will probably be a common mistake. Passing an illegal argument to `EXISTS`, e.g. `EXISTS("str")` results in the error message `Filter function is designed to be used with ArangoSearch view only (while optimizing ast)`.
port
more descriptive error messages for arangosearch using an unsupported analyzer results in the error query aql unsupported search condition while optimizing plan while there is a more specific warning in the logs the user usually does not see that and the error message could be more descriptive using a scorer with an illegal argument e g for doc in view sort doc name or sort the error message scorer function is designed to be used with arangosearch view only is given this could be more specific especially as this will probably be a common mistake passing an illegal argument to exists e g exists str results in the error message filter function is designed to be used with arangosearch view only while optimizing ast
1
1,918
30,209,568,167
IssuesEvent
2023-07-05 11:53:05
chapel-lang/chapel
https://api.github.com/repos/chapel-lang/chapel
closed
Chapel does not yet support LLVM 15
area: Compiler type: Portability
### Summary of Problem Since `clang` version 15 has become the main version distributed by the Arch Linux repositories, Chapel has been unable to build since it permits versions between 11 and 14. I manage the Chapel packages for the Arch User Repository, and wanted to check in with the team on the best way to handle this. I optimistically tried to change [this line](https://github.com/chapel-lang/chapel/blob/8ca23c39b7f97e3f1a30d6a5e16242d2559b9ec8/util/chplenv/chpl_llvm.py#L18), but unfortunately I got an error (somewhat deep) in the build process. The main options on my side are: - constraining the package version for `clang` (I hear this is not so simple, but may be good for robustness of the package in the long run) - personally downgrade my `clang` version and put temporary warnings on the package page (I really don't like this option) Build commands: ``` ./configure --prefix=/usr make ``` I'll be investigating the dependency constraints, but if anybody has suggestions that would be great too. ### Configuration Information - Output of `chpl --version`: `<currently erroring, was running 1.30 pre-release>` - Output of `$CHPL_HOME/util/printchplenv --anonymize`: - Back-end compiler and version, e.g. `gcc --version` or `clang --version`: `clang version 15.0.7`
True
Chapel does not yet support LLVM 15 - ### Summary of Problem Since `clang` version 15 has become the main version distributed by the Arch Linux repositories, Chapel has been unable to build since it permits versions between 11 and 14. I manage the Chapel packages for the Arch User Repository, and wanted to check in with the team on the best way to handle this. I optimistically tried to change [this line](https://github.com/chapel-lang/chapel/blob/8ca23c39b7f97e3f1a30d6a5e16242d2559b9ec8/util/chplenv/chpl_llvm.py#L18), but unfortunately I got an error (somewhat deep) in the build process. The main options on my side are: - constraining the package version for `clang` (I hear this is not so simple, but may be good for robustness of the package in the long run) - personally downgrade my `clang` version and put temporary warnings on the package page (I really don't like this option) Build commands: ``` ./configure --prefix=/usr make ``` I'll be investigating the dependency constraints, but if anybody has suggestions that would be great too. ### Configuration Information - Output of `chpl --version`: `<currently erroring, was running 1.30 pre-release>` - Output of `$CHPL_HOME/util/printchplenv --anonymize`: - Back-end compiler and version, e.g. `gcc --version` or `clang --version`: `clang version 15.0.7`
port
chapel does not yet support llvm summary of problem since clang version has become the main version distributed by the arch linux repositories chapel has been unable to build since it permits versions between and i manage the chapel packages for the arch user repository and wanted to check in with the team on the best way to handle this i optimistically tried to change but unfortunately i got an error somewhat deep in the build process the main options on my side are constraining the package version for clang i hear this is not so simple but may be good for robustness of the package in the long run personally downgrade my clang version and put temporary warnings on the package page i really don t like this option build commands configure prefix usr make i ll be investigating the dependency constraints but if anybody has suggestions that would be great too configuration information output of chpl version output of chpl home util printchplenv anonymize back end compiler and version e g gcc version or clang version clang version
1
653,693
21,610,692,062
IssuesEvent
2022-05-04 09:47:17
celo-org/celo-monorepo
https://api.github.com/repos/celo-org/celo-monorepo
closed
Add Forno API Key to ODIS
Priority: P0 Component: ODIS Component: Identity
Forno recently added an API key for rate limiting requests. We need to add API keys for the Combiner and Signers to prevent them from getting rate limited. Release TODO: - [ ] Test Combiner in staging - [ ] Test Signer in staging - [ ] Test Combiner in Alfajores - [ ] Test Signers in Alfajores - [ ] Release Combiner to mainnet - [ ] Release to mainnet cLabs signers - [ ] Release to mainnet partner signers (Note: Let's try to lump this with another change to limit overhead for partner signers, as they just upgraded to 1.1.9)
1.0
Add Forno API Key to ODIS - Forno recently added an API key for rate limiting requests. We need to add API keys for the Combiner and Signers to prevent them from getting rate limited. Release TODO: - [ ] Test Combiner in staging - [ ] Test Signer in staging - [ ] Test Combiner in Alfajores - [ ] Test Signers in Alfajores - [ ] Release Combiner to mainnet - [ ] Release to mainnet cLabs signers - [ ] Release to mainnet partner signers (Note: Let's try to lump this with another change to limit overhead for partner signers, as they just upgraded to 1.1.9)
non_port
add forno api key to odis forno recently added an api key for rate limiting requests we need to add api keys for the combiner and signers to prevent them from getting rate limited release todo test combiner in staging test signer in staging test combiner in alfajores test signers in alfajores release combiner to mainnet release to mainnet clabs signers release to mainnet partner signers note let s try to lump this with another change to limit overhead for partner signers as they just upgraded to
0
171,750
27,172,097,762
IssuesEvent
2023-02-17 20:27:13
dotnet/roslyn
https://api.github.com/repos/dotnet/roslyn
closed
Format align chain method calls in different lines
Area-IDE Feature Request Need Design Review IDE-Formatter
**Version Used**: **Steps to Reproduce**: select the following code and run ctrl + K F in VS ```c# buildCommand .ExecuteWithoutRestore() .Should() .Fail() .And .HaveStdOutContaining("NETSDK1004"); ``` **Expected Behavior**: ```c# buildCommand .ExecuteWithoutRestore() .Should() .Fail() .And .HaveStdOutContaining("NETSDK1004"); ``` **Actual Behavior**: No change
1.0
Format align chain method calls in different lines - **Version Used**: **Steps to Reproduce**: select the following code and run ctrl + K F in VS ```c# buildCommand .ExecuteWithoutRestore() .Should() .Fail() .And .HaveStdOutContaining("NETSDK1004"); ``` **Expected Behavior**: ```c# buildCommand .ExecuteWithoutRestore() .Should() .Fail() .And .HaveStdOutContaining("NETSDK1004"); ``` **Actual Behavior**: No change
non_port
format align chain method calls in different lines version used steps to reproduce select the following code and run ctrl k f in vs c buildcommand executewithoutrestore should fail and havestdoutcontaining expected behavior c buildcommand executewithoutrestore should fail and havestdoutcontaining actual behavior no change
0
241
4,793,268,113
IssuesEvent
2016-10-31 17:42:04
wahern/cqueues
https://api.github.com/repos/wahern/cqueues
closed
clock_getTime() with new Mac OS
packaging/portability
cqueues fails to make on Mac OS. It throws errors about clock_getTime(). I fixed it by removing this chunk of code in: `../src/cqueues.c` ``` #if __APPLE__ #include <time.h> /* struct timespec */ #include <errno.h> /* errno EINVAL */ #include <sys/time.h> /* TIMEVAL_TO_TIMESPEC struct timeval gettimeofday(3) */ #include <mach/mach_time.h> /* mach_timebase_info_data_t mach_timebase_info() mach_absolute_time() */ #define CLOCK_REALTIME 0 #define CLOCK_VIRTUAL 1 #define CLOCK_PROF 2 #define CLOCK_MONOTONIC 3 static mach_timebase_info_data_t clock_timebase = { .numer = 1, .denom = 1, }; /* clock_timebase */ void clock_gettime_init(void) __attribute__((constructor)); void clock_gettime_init(void) { if (mach_timebase_info(&clock_timebase) != KERN_SUCCESS) __builtin_abort(); } /* clock_gettime_init() */ static int clock_gettime(int clockid, struct timespec *ts) { switch (clockid) { case CLOCK_REALTIME: { struct timeval tv; if (0 != gettimeofday(&tv, 0)) return -1; TIMEVAL_TO_TIMESPEC(&tv, ts); return 0; } case CLOCK_MONOTONIC: { unsigned long long abt; abt = mach_absolute_time(); abt = abt * clock_timebase.numer / clock_timebase.denom; ts->tv_sec = abt / 1000000000UL; ts->tv_nsec = abt % 1000000000UL; return 0; } default: errno = EINVAL; return -1; } /* switch() */ } /* clock_gettime() */ #endif /* __APPLE__ */ ```
True
clock_getTime() with new Mac OS - cqueues fails to make on Mac OS. It throws errors about clock_getTime(). I fixed it by removing this chunk of code in: `../src/cqueues.c` ``` #if __APPLE__ #include <time.h> /* struct timespec */ #include <errno.h> /* errno EINVAL */ #include <sys/time.h> /* TIMEVAL_TO_TIMESPEC struct timeval gettimeofday(3) */ #include <mach/mach_time.h> /* mach_timebase_info_data_t mach_timebase_info() mach_absolute_time() */ #define CLOCK_REALTIME 0 #define CLOCK_VIRTUAL 1 #define CLOCK_PROF 2 #define CLOCK_MONOTONIC 3 static mach_timebase_info_data_t clock_timebase = { .numer = 1, .denom = 1, }; /* clock_timebase */ void clock_gettime_init(void) __attribute__((constructor)); void clock_gettime_init(void) { if (mach_timebase_info(&clock_timebase) != KERN_SUCCESS) __builtin_abort(); } /* clock_gettime_init() */ static int clock_gettime(int clockid, struct timespec *ts) { switch (clockid) { case CLOCK_REALTIME: { struct timeval tv; if (0 != gettimeofday(&tv, 0)) return -1; TIMEVAL_TO_TIMESPEC(&tv, ts); return 0; } case CLOCK_MONOTONIC: { unsigned long long abt; abt = mach_absolute_time(); abt = abt * clock_timebase.numer / clock_timebase.denom; ts->tv_sec = abt / 1000000000UL; ts->tv_nsec = abt % 1000000000UL; return 0; } default: errno = EINVAL; return -1; } /* switch() */ } /* clock_gettime() */ #endif /* __APPLE__ */ ```
port
clock gettime with new mac os cqueues fails to make on mac os it throws errors about clock gettime i fixed it by removing this chunk of code in src cqueues c if apple include struct timespec include errno einval include timeval to timespec struct timeval gettimeofday include mach timebase info data t mach timebase info mach absolute time define clock realtime define clock virtual define clock prof define clock monotonic static mach timebase info data t clock timebase numer denom clock timebase void clock gettime init void attribute constructor void clock gettime init void if mach timebase info clock timebase kern success builtin abort clock gettime init static int clock gettime int clockid struct timespec ts switch clockid case clock realtime struct timeval tv if gettimeofday tv return timeval to timespec tv ts return case clock monotonic unsigned long long abt abt mach absolute time abt abt clock timebase numer clock timebase denom ts tv sec abt ts tv nsec abt return default errno einval return switch clock gettime endif apple
1
480
6,963,712,408
IssuesEvent
2017-12-08 18:32:07
usnistgov/hiperc
https://api.github.com/repos/usnistgov/hiperc
closed
rethink BCs
enhancement portability
Use of `fp_t bc[2][2]` is opaque. Try simplifying, without losing generality if possible.
True
rethink BCs - Use of `fp_t bc[2][2]` is opaque. Try simplifying, without losing generality if possible.
port
rethink bcs use of fp t bc is opaque try simplifying without losing generality if possible
1
1,463
21,693,235,094
IssuesEvent
2022-05-09 17:21:36
damccorm/test-migration-target
https://api.github.com/repos/damccorm/test-migration-target
opened
Add support for timers in Spark portable streaming
P3 runner-spark improvement portability-spark
Add support for timely processing (using timers) for streaming on the portable Spark runner. Validates runner tests relying on timers (e.g. UsesTimersInParDo) should pass Imported from Jira [BEAM-10755](https://issues.apache.org/jira/browse/BEAM-10755). Original Jira may contain additional context. Reported by: annaqin.
True
Add support for timers in Spark portable streaming - Add support for timely processing (using timers) for streaming on the portable Spark runner. Validates runner tests relying on timers (e.g. UsesTimersInParDo) should pass Imported from Jira [BEAM-10755](https://issues.apache.org/jira/browse/BEAM-10755). Original Jira may contain additional context. Reported by: annaqin.
port
add support for timers in spark portable streaming add support for timely processing using timers for streaming on the portable spark runner validates runner tests relying on timers e g usestimersinpardo should pass imported from jira original jira may contain additional context reported by annaqin
1
68,365
3,286,721,280
IssuesEvent
2015-10-29 05:22:08
metamaps/metamaps_gen002
https://api.github.com/repos/metamaps/metamaps_gen002
opened
custom metacodes
ruby uservoice priority
Uservoice people have requested the ability to create their own metacodes. This functionality exists for admins, but isn't pretty. We need to discuss this more.
1.0
custom metacodes - Uservoice people have requested the ability to create their own metacodes. This functionality exists for admins, but isn't pretty. We need to discuss this more.
non_port
custom metacodes uservoice people have requested the ability to create their own metacodes this functionality exists for admins but isn t pretty we need to discuss this more
0
1,068
13,675,364,378
IssuesEvent
2020-09-29 12:35:21
openwall/john
https://api.github.com/repos/openwall/john
closed
md5crypt-opencl fails on NVIDIA with CL_INVALID_COMMAND_QUEUE
portability
``` ~/work/extern/arch/aur/john-git » john --test --format=md5crypt-opencl john: /opt/cuda/lib64/libOpenCL.so.1: no version information available (required by john) Device 0: GeForce GTX TITAN Benchmarking: md5crypt-opencl, crypt(3) $1$ [MD5 OpenCL]... OpenCL CL_INVALID_COMMAND_QUEUE error in opencl_cryptmd5_fmt_plug.c:227 - Error releasing memory mappings ``` john is built from git ([this AUR package](https://aur.archlinux.org/packages/john-git/), commit 7eeb2bfe1008a8a153daab4a71e9f0c391a17ff7). Build info: ``` john: /opt/cuda/lib64/libOpenCL.so.1: no version information available (required by john) Version: 1.8.0-jumbo-1-5593-g7eeb2bfe1+ Build: linux-gnu 64-bit AVX-ac OMP SIMD: AVX, interleaving: MD4:3 MD5:3 SHA1:1 SHA256:1 SHA512:1 System-wide exec: /usr/libexec/john System-wide home: /usr/share/john Private home: ~/.john $JOHN is /usr/share/john/ Format interface version: 14 Max. number of reported tunable costs: 3 Rec file version: REC4 Charset file version: CHR3 CHARSET_MIN: 1 (0x01) CHARSET_MAX: 255 (0xff) CHARSET_LENGTH: 24 SALT_HASH_SIZE: 1048576 Max. Markov mode level: 400 Max. Markov mode password length: 30 gcc version: 6.3.1 GNU libc version: 2.25 (loaded: 2.25) OpenCL headers version: 2.1 Crypto library: OpenSSL OpenSSL library version: 0100020bf OpenSSL 1.0.2k 26 Jan 2017 GMP library version: 6.1.2 File locking: fcntl() fseek(): fseek ftell(): ftell fopen(): fopen memmem(): System's ``` However, [the stable package (1.8.0.jumbo1)](https://www.archlinux.org/packages/community/x86_64/john/) fails this test too. 
Output of `john --list=opencl-devices`: ``` john: /opt/cuda/lib64/libOpenCL.so.1: no version information available (required by john) Platform #0 name: NVIDIA CUDA, version: OpenCL 1.2 CUDA 8.0.0 Device #0 (0) name: GeForce GTX TITAN Device vendor: NVIDIA Corporation Device type: GPU (LE) Device version: OpenCL 1.2 CUDA Driver version: 378.13 [recommended] Native vector widths: char 1, short 1, int 1, long 1 Preferred vector width: char 1, short 1, int 1, long 1 Global Memory: 5.0 GB Global Memory Cache: 224.2 KB Local Memory: 48.0 KB (Local) Max memory alloc. size: 1.0 GB Max clock (MHz): 875 Profiling timer res.: 1000 ns Max Work Group Size: 1024 Parallel compute cores: 14 CUDA cores: 2688 (14 x 192) Speed index: 2352000 Warp size: 32 Max. GPRs/work-group: 65536 Compute capability: 3.5 (sm_35) Kernel exec. timeout: no NVML id: 0 PCI device topology: 05:00.0 PCI lanes: 16/16 Fan speed: 32% Temperature: 47°C Utilization: 0% ``` System is running Arch Linux x86_64 with kernel 4.10.8-1-ARCH and the 378.13-5 nvidia driver package. I noticed a few issues regarding md5crypt-opencl on AMD cards, but none on NVIDIA.
True
md5crypt-opencl fails on NVIDIA with CL_INVALID_COMMAND_QUEUE - ``` ~/work/extern/arch/aur/john-git » john --test --format=md5crypt-opencl john: /opt/cuda/lib64/libOpenCL.so.1: no version information available (required by john) Device 0: GeForce GTX TITAN Benchmarking: md5crypt-opencl, crypt(3) $1$ [MD5 OpenCL]... OpenCL CL_INVALID_COMMAND_QUEUE error in opencl_cryptmd5_fmt_plug.c:227 - Error releasing memory mappings ``` john is built from git ([this AUR package](https://aur.archlinux.org/packages/john-git/), commit 7eeb2bfe1008a8a153daab4a71e9f0c391a17ff7). Build info: ``` john: /opt/cuda/lib64/libOpenCL.so.1: no version information available (required by john) Version: 1.8.0-jumbo-1-5593-g7eeb2bfe1+ Build: linux-gnu 64-bit AVX-ac OMP SIMD: AVX, interleaving: MD4:3 MD5:3 SHA1:1 SHA256:1 SHA512:1 System-wide exec: /usr/libexec/john System-wide home: /usr/share/john Private home: ~/.john $JOHN is /usr/share/john/ Format interface version: 14 Max. number of reported tunable costs: 3 Rec file version: REC4 Charset file version: CHR3 CHARSET_MIN: 1 (0x01) CHARSET_MAX: 255 (0xff) CHARSET_LENGTH: 24 SALT_HASH_SIZE: 1048576 Max. Markov mode level: 400 Max. Markov mode password length: 30 gcc version: 6.3.1 GNU libc version: 2.25 (loaded: 2.25) OpenCL headers version: 2.1 Crypto library: OpenSSL OpenSSL library version: 0100020bf OpenSSL 1.0.2k 26 Jan 2017 GMP library version: 6.1.2 File locking: fcntl() fseek(): fseek ftell(): ftell fopen(): fopen memmem(): System's ``` However, [the stable package (1.8.0.jumbo1)](https://www.archlinux.org/packages/community/x86_64/john/) fails this test too. 
Output of `john --list=opencl-devices`: ``` john: /opt/cuda/lib64/libOpenCL.so.1: no version information available (required by john) Platform #0 name: NVIDIA CUDA, version: OpenCL 1.2 CUDA 8.0.0 Device #0 (0) name: GeForce GTX TITAN Device vendor: NVIDIA Corporation Device type: GPU (LE) Device version: OpenCL 1.2 CUDA Driver version: 378.13 [recommended] Native vector widths: char 1, short 1, int 1, long 1 Preferred vector width: char 1, short 1, int 1, long 1 Global Memory: 5.0 GB Global Memory Cache: 224.2 KB Local Memory: 48.0 KB (Local) Max memory alloc. size: 1.0 GB Max clock (MHz): 875 Profiling timer res.: 1000 ns Max Work Group Size: 1024 Parallel compute cores: 14 CUDA cores: 2688 (14 x 192) Speed index: 2352000 Warp size: 32 Max. GPRs/work-group: 65536 Compute capability: 3.5 (sm_35) Kernel exec. timeout: no NVML id: 0 PCI device topology: 05:00.0 PCI lanes: 16/16 Fan speed: 32% Temperature: 47°C Utilization: 0% ``` System is running Arch Linux x86_64 with kernel 4.10.8-1-ARCH and the 378.13-5 nvidia driver package. I noticed a few issues regarding md5crypt-opencl on AMD cards, but none on NVIDIA.
port
opencl fails on nvidia with cl invalid command queue work extern arch aur john git » john test format opencl john opt cuda libopencl so no version information available required by john device geforce gtx titan benchmarking opencl crypt opencl cl invalid command queue error in opencl fmt plug c error releasing memory mappings john is built from git commit build info john opt cuda libopencl so no version information available required by john version jumbo build linux gnu bit avx ac omp simd avx interleaving system wide exec usr libexec john system wide home usr share john private home john john is usr share john format interface version max number of reported tunable costs rec file version charset file version charset min charset max charset length salt hash size max markov mode level max markov mode password length gcc version gnu libc version loaded opencl headers version crypto library openssl openssl library version openssl jan gmp library version file locking fcntl fseek fseek ftell ftell fopen fopen memmem system s however fails this test too output of john list opencl devices john opt cuda libopencl so no version information available required by john platform name nvidia cuda version opencl cuda device name geforce gtx titan device vendor nvidia corporation device type gpu le device version opencl cuda driver version native vector widths char short int long preferred vector width char short int long global memory gb global memory cache kb local memory kb local max memory alloc size gb max clock mhz profiling timer res ns max work group size parallel compute cores cuda cores x speed index warp size max gprs work group compute capability sm kernel exec timeout no nvml id pci device topology pci lanes fan speed temperature °c utilization system is running arch linux with kernel arch and the nvidia driver package i noticed a few issues regarding opencl on amd cards but none on nvidia
1
400,833
11,781,486,375
IssuesEvent
2020-03-16 22:36:24
LBNL-ETA/BEDES-Manager
https://api.github.com/repos/LBNL-ETA/BEDES-Manager
closed
Update "Download Sample File" to download the correct sample csv file
high priority
In import-csv, "Download Sample File" downloads the previous (with incorrect headers) sample csv file. Fix it to download the latest sample csv file (with the updated headers)
1.0
Update "Download Sample File" to download the correct sample csv file - In import-csv, "Download Sample File" downloads the previous (with incorrect headers) sample csv file. Fix it to download the latest sample csv file (with the updated headers)
non_port
update download sample file to download the correct sample csv file in import csv download sample file downloads the previous with incorrect headers sample csv file fix it to download the latest sample csv file with the updated headers
0
1,343
19,058,578,530
IssuesEvent
2021-11-26 02:19:59
PCSX2/pcsx2
https://api.github.com/repos/PCSX2/pcsx2
closed
Static Analysis of PVS-Studio
Enhancement / Feature Request Portability
Hello, I had the chance to examine PCSX2 project with PVS-Studio Static Analyzer. Github doesn't support attaching Excel files, but here is a link from my Dropbox: https://www.dropbox.com/s/sswkcfqj9g1x6vq/pcsx2_suite_2013.xlsx?dl=0 Again the report of the analysis is in Excel format. It is definitely worth looking at!! Even if some of the reported code mistakes could be false positive, or something that the devs of PCSX2 overlooked. You can get get more information and a detailed explanation about a specific error Codes using the online documentation: http://www.viva64.com/en/Vxxxx replacing "xxxx" with the code number. The good thing about the report, it gives you the file name, line number and a brief message explaining the error, visit viva64 to get the full explanation. I have seen some possible memory leaks in the report or dangerous bit shifting. Note: I didn't do the whole solution of PCSX2 suite, only PCSX2 project. If anybody interested, I could upload that too. There is another static analysis which is free and also good called CppCheck, Again, if anybody is interested, I could upload a report of that too. I hope this will help. Regards, Rebel_X
True
Static Analysis of PVS-Studio - Hello, I had the chance to examine PCSX2 project with PVS-Studio Static Analyzer. Github doesn't support attaching Excel files, but here is a link from my Dropbox: https://www.dropbox.com/s/sswkcfqj9g1x6vq/pcsx2_suite_2013.xlsx?dl=0 Again the report of the analysis is in Excel format. It is definitely worth looking at!! Even if some of the reported code mistakes could be false positive, or something that the devs of PCSX2 overlooked. You can get get more information and a detailed explanation about a specific error Codes using the online documentation: http://www.viva64.com/en/Vxxxx replacing "xxxx" with the code number. The good thing about the report, it gives you the file name, line number and a brief message explaining the error, visit viva64 to get the full explanation. I have seen some possible memory leaks in the report or dangerous bit shifting. Note: I didn't do the whole solution of PCSX2 suite, only PCSX2 project. If anybody interested, I could upload that too. There is another static analysis which is free and also good called CppCheck, Again, if anybody is interested, I could upload a report of that too. I hope this will help. Regards, Rebel_X
port
static analysis of pvs studio hello i had the chance to examine project with pvs studio static analyzer github doesn t support attaching excel files but here is a link from my dropbox again the report of the analysis is in excel format it is definitely worth looking at even if some of the reported code mistakes could be false positive or something that the devs of overlooked you can get get more information and a detailed explanation about a specific error codes using the online documentation replacing xxxx with the code number the good thing about the report it gives you the file name line number and a brief message explaining the error visit to get the full explanation i have seen some possible memory leaks in the report or dangerous bit shifting note i didn t do the whole solution of suite only project if anybody interested i could upload that too there is another static analysis which is free and also good called cppcheck again if anybody is interested i could upload a report of that too i hope this will help regards rebel x
1
1,940
30,512,337,249
IssuesEvent
2023-07-18 22:05:48
alcionai/corso
https://api.github.com/repos/alcionai/corso
opened
Add service level isolation
supportability
### What happened? While getting user info during backup, we try to [discover](https://github.com/alcionai/corso/blob/22f990a709b996a1f1783ec81ebb909430a79111/src/pkg/services/m365/api/users.go#L174C18-L174C18) all services which are enabled for the user. This approach has potential drawbacks. For example, assume that the user intends to an exchange backup. The backup may fail during service discovery if the user does not have a onedrive or if corso doesn't have file permissions. While we do handle such scenarios gracefully, we have noticed that graph api may change error messages on us, leading to backup failures. Proposed changes: 1. Corso should only discover & enable requested services. If the user is asking to do exchange backups, we should not attempt onedrive/sharepoint discovery. 2. This can be taken to another level by forcing isolation within a service (e.g. calendar/mail/contacts in exchange). ### Corso Version? Corso v0.10.0 ### Where are you running Corso? Linux ### Relevant log output _No response_
True
Add service level isolation - ### What happened? While getting user info during backup, we try to [discover](https://github.com/alcionai/corso/blob/22f990a709b996a1f1783ec81ebb909430a79111/src/pkg/services/m365/api/users.go#L174C18-L174C18) all services which are enabled for the user. This approach has potential drawbacks. For example, assume that the user intends to an exchange backup. The backup may fail during service discovery if the user does not have a onedrive or if corso doesn't have file permissions. While we do handle such scenarios gracefully, we have noticed that graph api may change error messages on us, leading to backup failures. Proposed changes: 1. Corso should only discover & enable requested services. If the user is asking to do exchange backups, we should not attempt onedrive/sharepoint discovery. 2. This can be taken to another level by forcing isolation within a service (e.g. calendar/mail/contacts in exchange). ### Corso Version? Corso v0.10.0 ### Where are you running Corso? Linux ### Relevant log output _No response_
port
add service level isolation what happened while getting user info during backup we try to all services which are enabled for the user this approach has potential drawbacks for example assume that the user intends to an exchange backup the backup may fail during service discovery if the user does not have a onedrive or if corso doesn t have file permissions while we do handle such scenarios gracefully we have noticed that graph api may change error messages on us leading to backup failures proposed changes corso should only discover enable requested services if the user is asking to do exchange backups we should not attempt onedrive sharepoint discovery this can be taken to another level by forcing isolation within a service e g calendar mail contacts in exchange corso version corso where are you running corso linux relevant log output no response
1
1,779
26,174,511,338
IssuesEvent
2023-01-02 07:52:38
primefaces/primeng
https://api.github.com/repos/primefaces/primeng
closed
Tab key in p-dialog with p-InputNumber
Type: Bug LTS-PORTABLE
[x] bug report => Search github for a similar issue or PR before submitting [ ] feature request => Please check if request is not on the roadmap already https://github.com/primefaces/primeng/wiki/Roadmap [ ] support request => Please do not submit support request here, instead see http://forum.primefaces.org/viewforum.php?f=35 **Plunkr Case (Bug Reports)** https://stackblitz.com/edit/github-dialog-tab?embed=1&file=src/app/app.component.html **Current behavior** * Click Button Show * Set cursor on first input field * Move cursor by pressing tab or shift+tab key * Try to override or to correct values of the input fields **Expected behavior** The whole value of a input should be selected in a p-dialog, easier to override * **Angular version:** 10.X * **PrimeNG version:** 10.0.2 (Possibly any version) * **Browser:** [all]
True
Tab key in p-dialog with p-InputNumber - [x] bug report => Search github for a similar issue or PR before submitting [ ] feature request => Please check if request is not on the roadmap already https://github.com/primefaces/primeng/wiki/Roadmap [ ] support request => Please do not submit support request here, instead see http://forum.primefaces.org/viewforum.php?f=35 **Plunkr Case (Bug Reports)** https://stackblitz.com/edit/github-dialog-tab?embed=1&file=src/app/app.component.html **Current behavior** * Click Button Show * Set cursor on first input field * Move cursor by pressing tab or shift+tab key * Try to override or to correct values of the input fields **Expected behavior** The whole value of a input should be selected in a p-dialog, easier to override * **Angular version:** 10.X * **PrimeNG version:** 10.0.2 (Possibly any version) * **Browser:** [all]
port
tab key in p dialog with p inputnumber bug report search github for a similar issue or pr before submitting feature request please check if request is not on the roadmap already support request please do not submit support request here instead see plunkr case bug reports current behavior click button show set cursor on first input field move cursor by pressing tab or shift tab key try to override or to correct values of the input fields expected behavior the whole value of a input should be selected in a p dialog easier to override angular version x primeng version possibly any version browser
1
1,467
21,694,545,716
IssuesEvent
2022-05-09 18:41:11
damccorm/test-migration-target
https://api.github.com/repos/damccorm/test-migration-target
opened
Rework dependency structure of Flink job server jar
P3 improvement runner-flink portability-flink
Enabling the strict dependency checker (BEAM-10961) revealed that we are unnecessarily making :runners:flink:1.x a compile dependency of :runners:flink:1.x:job-server. :runners:flink:1.x is not needed at compile time at all, so it can probably be a runtimeOnly dependency instead. Imported from Jira [BEAM-11664](https://issues.apache.org/jira/browse/BEAM-11664). Original Jira may contain additional context. Reported by: ibzib.
True
Rework dependency structure of Flink job server jar - Enabling the strict dependency checker (BEAM-10961) revealed that we are unnecessarily making :runners:flink:1.x a compile dependency of :runners:flink:1.x:job-server. :runners:flink:1.x is not needed at compile time at all, so it can probably be a runtimeOnly dependency instead. Imported from Jira [BEAM-11664](https://issues.apache.org/jira/browse/BEAM-11664). Original Jira may contain additional context. Reported by: ibzib.
port
rework dependency structure of flink job server jar enabling the strict dependency checker beam revealed that we are unnecessarily making runners flink x a compile dependency of runners flink x job server runners flink x is not needed at compile time at all so it can probably be a runtimeonly dependency instead imported from jira original jira may contain additional context reported by ibzib
1
1,933
30,347,684,811
IssuesEvent
2023-07-11 16:31:11
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
Need to remove section "Memory dump collection"
azure-supportability/svc triaged assigned-to-author doc-enhancement Pri1
[Enter feedback here] The section pertaining to "Memory dump collection" needs to be removed in this section. We are reviewing this information and will update this section at a later time. --- #### Document Details ⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.* * ID: 43a60f94-0826-8cdf-04b3-c5f20d299720 * Version Independent ID: 7d1dcfe7-4e68-6b98-1003-9154e7a0b22d * Content: [How to create an Azure support request - Azure supportability](https://learn.microsoft.com/en-us/azure/azure-portal/supportability/how-to-create-azure-support-request#memory-dump-collection) * Content Source: [articles/azure-portal/supportability/how-to-create-azure-support-request.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/azure-portal/supportability/how-to-create-azure-support-request.md) * Service: **azure-supportability** * GitHub Login: @JnHs * Microsoft Alias: **jenhayes**
True
Need to remove section "Memory dump collection" - [Enter feedback here] The section pertaining to "Memory dump collection" needs to be removed in this section. We are reviewing this information and will update this section at a later time. --- #### Document Details ⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.* * ID: 43a60f94-0826-8cdf-04b3-c5f20d299720 * Version Independent ID: 7d1dcfe7-4e68-6b98-1003-9154e7a0b22d * Content: [How to create an Azure support request - Azure supportability](https://learn.microsoft.com/en-us/azure/azure-portal/supportability/how-to-create-azure-support-request#memory-dump-collection) * Content Source: [articles/azure-portal/supportability/how-to-create-azure-support-request.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/azure-portal/supportability/how-to-create-azure-support-request.md) * Service: **azure-supportability** * GitHub Login: @JnHs * Microsoft Alias: **jenhayes**
port
need to remove section memory dump collection the section pertaining to memory dump collection needs to be removed in this section we are reviewing this information and will update this section at a later time document details ⚠ do not edit this section it is required for learn microsoft com ➟ github issue linking id version independent id content content source service azure supportability github login jnhs microsoft alias jenhayes
1
1,493
6,122,924,471
IssuesEvent
2017-06-23 02:02:42
Endogix/WebFormWeaver
https://api.github.com/repos/Endogix/WebFormWeaver
opened
Export form structure
architecture business logic feature presentation logic
Allow the user to export the form structure when they are finished editing the form, through a button near the bottom. Exporting will show a modal with the code - as optionally either HTML code, JSON, or XML. There should also be the option to specify an endpoint that this script can connect to through AJAX to save to a server, with another option to specify a function to call when the connection is complete (most likely to be a redirect function). ## Acceptance criteria User should be able to: - [ ] Export the form as HTML code - [ ] Export the form as a JSON string - [ ] Export the form as an XML string - [ ] Export the form to an endpoint through AJAX (either JSON or XML) - [ ] Configure a function in the options to call when then connection to the endpoint is complete
1.0
Export form structure - Allow the user to export the form structure when they are finished editing the form, through a button near the bottom. Exporting will show a modal with the code - as optionally either HTML code, JSON, or XML. There should also be the option to specify an endpoint that this script can connect to through AJAX to save to a server, with another option to specify a function to call when the connection is complete (most likely to be a redirect function). ## Acceptance criteria User should be able to: - [ ] Export the form as HTML code - [ ] Export the form as a JSON string - [ ] Export the form as an XML string - [ ] Export the form to an endpoint through AJAX (either JSON or XML) - [ ] Configure a function in the options to call when then connection to the endpoint is complete
non_port
export form structure allow the user to export the form structure when they are finished editing the form through a button near the bottom exporting will show a modal with the code as optionally either html code json or xml there should also be the option to specify an endpoint that this script can connect to through ajax to save to a server with another option to specify a function to call when the connection is complete most likely to be a redirect function acceptance criteria user should be able to export the form as html code export the form as a json string export the form as an xml string export the form to an endpoint through ajax either json or xml configure a function in the options to call when then connection to the endpoint is complete
0
415
6,575,568,093
IssuesEvent
2017-09-11 16:31:30
openucx/ucx
https://api.github.com/repos/openucx/ucx
closed
Compilation failure with clang 3.6.1
bug portability
> ./configure --prefix=$PWD/inst --disable-numa CC=clang && make -Bj breaks down the compilation on master branch with the following output: ``` CC async/libucs_la-pipe.lo CC async/libucs_la-thread.lo CC config/libucs_la-global_opts.lo config/global_opts.c:32:30: error: initializer overrides prior initialization of this subobject [-Werror,-Winitializer-overrides] .stats_dest = "", ^~ config/global_opts.c:29:30: note: previous initialization is here .stats_dest = "", ^~ config/global_opts.c:34:30: error: initializer overrides prior initialization of this subobject [-Werror,-Winitializer-overrides] .memtrack_dest = "", ^~ config/global_opts.c:31:30: note: previous initialization is here .memtrack_dest = "", ^~ 2 errors generated. make[2]: *** [config/libucs_la-global_opts.lo] Error 1 ``` and ``` wireup/wireup.c:548:24: error: equality comparison with extraneous parentheses [-Werror,-Wparentheses-equality] if ((ep->cfg_index == new_cfg_index)) { ~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~ wireup/wireup.c:548:24: note: remove extraneous parentheses around the comparison to silence this warning if ((ep->cfg_index == new_cfg_index)) { ~ ^ ~ wireup/wireup.c:548:24: note: use '=' to turn this equality comparison into an assignment if ((ep->cfg_index == new_cfg_index)) { ```
True
Compilation failure with clang 3.6.1 - > ./configure --prefix=$PWD/inst --disable-numa CC=clang && make -Bj breaks down the compilation on master branch with the following output: ``` CC async/libucs_la-pipe.lo CC async/libucs_la-thread.lo CC config/libucs_la-global_opts.lo config/global_opts.c:32:30: error: initializer overrides prior initialization of this subobject [-Werror,-Winitializer-overrides] .stats_dest = "", ^~ config/global_opts.c:29:30: note: previous initialization is here .stats_dest = "", ^~ config/global_opts.c:34:30: error: initializer overrides prior initialization of this subobject [-Werror,-Winitializer-overrides] .memtrack_dest = "", ^~ config/global_opts.c:31:30: note: previous initialization is here .memtrack_dest = "", ^~ 2 errors generated. make[2]: *** [config/libucs_la-global_opts.lo] Error 1 ``` and ``` wireup/wireup.c:548:24: error: equality comparison with extraneous parentheses [-Werror,-Wparentheses-equality] if ((ep->cfg_index == new_cfg_index)) { ~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~ wireup/wireup.c:548:24: note: remove extraneous parentheses around the comparison to silence this warning if ((ep->cfg_index == new_cfg_index)) { ~ ^ ~ wireup/wireup.c:548:24: note: use '=' to turn this equality comparison into an assignment if ((ep->cfg_index == new_cfg_index)) { ```
label: port
binary_label: 1
Unnamed: 0: 1,654
id: 23,804,439,341
type: IssuesEvent
created_at: 2022-09-03 20:26:06
repo: systemd/systemd
repo_url: https://api.github.com/repos/systemd/systemd
action: reopened
title: TEST-29-PORTABLE is flaky under sanitizers
labels: bug 🐛 tests portabled
### systemd version the issue has been seen with

latest main

### Used distribution

Arch Linux

### Linux kernel version used

_No response_

### CPU architectures issue was seen on

_No response_

### Component

systemd-portabled, tests

### Expected behaviour you didn't see

TEST-29-PORTABLE should pass reliably(ish).

### Unexpected behaviour you saw

Recently I noticed an uptrend in TEST-29-PORTABLE related fails, mostly concentrated around failing `minimal-app0.service`:

```
[ 22.571421] systemd[1]: Starting minimal-app0.service...
[ 22.653735] systemd[375]: Allocating context for crypt device /usr/share/minimal_0.verity.
[ 22.654544] systemd[376]: Trying to open and read device /usr/share/minimal_0.verity with direct-io.
[ 22.654817] systemd[375]: Trying to open and read device /usr/share/minimal_0.verity with direct-io.
[ 22.657815] systemd[376]: Initialising device-mapper backend library.
[ 22.657977] systemd[376]: Trying to load VERITY crypt type from device /usr/share/minimal_0.verity.
[ 22.658121] systemd[1]: Sent message type=signal sender=n/a destination=n/a path=/org/freedesktop/systemd1/job/351 interface=org.freedesktop.DBus.Properties member=PropertiesChanged cookie=937 reply_cookie=0 signature=sa{sv}as error-name=n/a error-message=n/a
[ 22.658283] systemd[376]: Crypto backend (OpenSSL 1.1.1q 5 Jul 2022) initialized in cryptsetup library version 2.4.3.
[ 22.658407] systemd[376]: Detected kernel Linux 5.18.12-arch1-1 x86_64.
[ 22.821073] kernel: device-mapper: uevent: version 1.0.3
[ 22.821315] kernel: device-mapper: ioctl: 4.46.0-ioctl (2022-02-22) initialised: dm-devel@redhat.com
[ 22.825921] kernel: loop2: detected capacity change from 0 to 184
[ 22.827129] kernel: loop3: detected capacity change from 0 to 184
[ 22.767974] systemd[1]: sys-devices-virtual-block-dm\x2d0.device: Changed dead -> plugged
[ 22.768109] systemd[1]: Sent message type=signal sender=n/a destination=n/a path=/org/freedesktop/systemd1 interface=org.freedesktop.systemd1.Manager member=UnitNew cookie=1060 reply_cookie=0 signature=so error-name=n/a error-message=n/a
[ 22.768245] systemd[1]: Sent message type=signal sender=n/a destination=n/a path=/org/freedesktop/systemd1 interface=org.freedesktop.systemd1.Manager member=UnitNew cookie=1061 reply_cookie=0 signature=so error-name=n/a error-message=n/a
[ 22.768369] systemd[1]: dev-disk-by\x2ddiskseq-17.device: Job 372 dev-disk-by\x2ddiskseq-17.device/nop finished, result=done
[ 22.768522] systemd[1]: Sent message type=signal sender=n/a destination=n/a path=/org/freedesktop/systemd1 interface=org.freedesktop.systemd1.Manager member=JobRemoved cookie=1062 reply_cookie=0 signature=uoss error-name=n/a error-message=n/a
[ 22.768677] systemd[1]: dev-loop3.device: Job 373 dev-loop3.device/nop finished, result=done
[ 22.768808] systemd[1]: Sent message type=signal sender=n/a destination=n/a path=/org/freedesktop/systemd1 interface=org.freedesktop.systemd1.Manager member=JobRemoved cookie=1063 reply_cookie=0 signature=uoss error-name=n/a error-message=n/a
...
[ 22.770481] systemd[1]: Sent message type=signal sender=n/a destination=n/a path=/org/freedesktop/systemd1/unit/dev_2ddisk_2dby_5cx2ddiskseq_2d17_2edevice interface=org.freedesktop.DBus.Properties member=PropertiesChanged cookie=1070 reply_cookie=0 signature=sa{sv}as error-name=n/a error-message=n/a
[ 22.770637] systemd[376]: device-mapper: create ioctl on f01cff8db2a0f9ea70a3261ea6a34050d9377fa64831d98427569e8d94cfa567-verity CRYPT-VERITY-38f962cef7174fb4b132c46a4a8a6aa3-f01cff8db2a0f9ea70a3261ea6a34050d9377fa64831d98427569e8d94cfa567-verity failed: Device or resource busy
[ 22.770805] systemd[376]: Udev cookie 0xd4dd849 (semid 1) decremented to 1
...
[ 22.973176] systemd[375]: Applying namespace mount on /run/systemd/unit-root/run/host/os-release
[ 22.973301] systemd[376]: Successfully mounted /run/systemd/inaccessible/dir to /run/systemd/unit-root/run/credentials
[ 22.973447] systemd[376]: Applying namespace mount on /run/systemd/unit-root/run/host/os-release
[ 22.973587] systemd[375]: Followed source symlinks /etc/os-release → /usr/lib/os-release.
[ 22.973735] systemd[375]: Bind-mounting /usr/lib/os-release on /run/systemd/unit-root/run/host/os-release (MS_BIND|MS_REC "")...
[ 22.973861] systemd[375]: Failed to mount /usr/lib/os-release (type n/a) on /run/systemd/unit-root/run/host/os-release (MS_BIND|MS_REC ""): No such file or directory
[ 22.973989] systemd[376]: Followed source symlinks /etc/os-release → /usr/lib/os-release.
[ 22.974118] systemd[376]: Bind-mounting /usr/lib/os-release on /run/systemd/unit-root/run/host/os-release (MS_BIND|MS_REC "")...
[ 22.974288] systemd[376]: Failed to mount /usr/lib/os-release (type n/a) on /run/systemd/unit-root/run/host/os-release (MS_BIND|MS_REC ""): No such file or directory
[ 22.974418] systemd[375]: Bind-mounting /usr/lib/os-release on /run/systemd/unit-root/run/host/os-release (MS_BIND|MS_REC "")...
[ 22.974551] systemd[376]: Failed to create destination mount point node '/run/systemd/unit-root/run/host/os-release': Operation not permitted
[ 22.974701] systemd[375]: Successfully mounted /usr/lib/os-release to /run/systemd/unit-root/run/host/os-release
[ 22.974831] systemd[376]: Failed to mount /usr/lib/os-release to /run/systemd/unit-root/run/host/os-release: No such file or directory
[ 22.974965] systemd[375]: Applying namespace mount on /run/systemd/unit-root/run/systemd/incoming
[ 22.975088] systemd[375]: Followed source symlinks /run/systemd/propagate/minimal-app0-foo.service → /run/systemd/propagate/minimal-app0-foo.service.
[ 22.975220] systemd[376]: Releasing crypt device /dev/loop3 context.
[ 22.975352] systemd[375]: Bind-mounting /run/systemd/propagate/minimal-app0-foo.service on /run/systemd/unit-root/run/systemd/incoming (MS_BIND "")...
[ 22.975474] systemd[376]: Releasing device-mapper backend.
[ 22.975600] systemd[375]: Successfully mounted /run/systemd/propagate/minimal-app0-foo.service to /run/systemd/unit-root/run/systemd/incoming
[ 22.975736] systemd[375]: Applying namespace mount on /run/systemd/unit-root/sys
[ 22.975856] systemd[375]: Bind-mounting /sys on /run/systemd/unit-root/sys (MS_BIND|MS_REC "")...
[ 22.975973] systemd[375]: Applying namespace mount on /run/systemd/unit-root/tmp
[ 22.976103] systemd[375]: Bind-mounting /tmp/systemd-private-aa79f66eaddb4df58ede18f5c18ba3bf-minimal-app0-foo.service-ANJ00G/tmp on /run/systemd/unit-root/tmp (MS_BIND|MS_REC "")...
[ 22.976268] systemd[375]: Successfully mounted /tmp/systemd-private-aa79f66eaddb4df58ede18f5c18ba3bf-minimal-app0-foo.service-ANJ00G/tmp to /run/systemd/unit-root/tmp
[ 22.976466] systemd[375]: Applying namespace mount on /run/systemd/unit-root/var/tmp
[ 22.976601] systemd[375]: Bind-mounting /var/tmp/systemd-private-aa79f66eaddb4df58ede18f5c18ba3bf-minimal-app0-foo.service-bQZ4Ew/tmp on /run/systemd/unit-root/var/tmp (MS_BIND|MS_REC "")...
[ 22.976764] systemd[375]: Successfully mounted /var/tmp/systemd-private-aa79f66eaddb4df58ede18f5c18ba3bf-minimal-app0-foo.service-bQZ4Ew/tmp to /run/systemd/unit-root/var/tmp
[ 22.976881] systemd[375]: Remounted /run/systemd/unit-root/etc/machine-id.
[ 22.977019] systemd[375]: Remounted /run/systemd/unit-root/etc/resolv.conf.
[ 22.977137] systemd[375]: Remounted /run/systemd/unit-root/run/credentials.
[ 22.977256] systemd[375]: Remounted /run/systemd/unit-root/run/host/os-release.
[ 22.977380] systemd[375]: Remounted /run/systemd/unit-root/run/systemd/incoming.
[ 22.978290] systemd[375]: Remounted /run/systemd/unit-root/proc.
[ 22.979550] systemd[375]: Remounted /run/systemd/unit-root/run/credentials.
[ 22.981318] systemd[375]: Remounted /run/systemd/unit-root/sys.
[ 22.981501] systemd[375]: Remounted /run/systemd/unit-root/sys/fs/bpf.
[ 22.981646] systemd[375]: Remounted /run/systemd/unit-root/sys/fs/pstore.
[ 22.981786] systemd[375]: Remounted /run/systemd/unit-root/sys/kernel/config.
[ 22.981921] systemd[375]: Remounted /run/systemd/unit-root/sys/fs/cgroup.
[ 22.982050] systemd[375]: Remounted /run/systemd/unit-root/sys/kernel/tracing.
[ 22.982180] systemd[375]: Remounted /run/systemd/unit-root/sys/kernel/security.
[ 22.982314] systemd[375]: Remounted /run/systemd/unit-root/sys/kernel/debug.
[ 22.982552] systemd[375]: Releasing crypt device /usr/share/minimal_0.verity context.
[ 22.982719] systemd[375]: Releasing device-mapper backend.
[ 22.982846] systemd[375]: Closing read only fd for /usr/share/minimal_0.verity.
[ 22.983047] systemd[375]: Closed loop /dev/loop3 (/usr/share/minimal_0.verity).
[ 22.986724] systemd[375]: minimal-app0-foo.service: Executing: cat /usr/lib/os-release
[ 23.000841] systemd[376]: minimal-app0.service: Failed to set up mount namespacing: /run/systemd/unit-root/run/host/os-release: No such file or directory
[ 23.001204] systemd[376]: minimal-app0.service: Failed at step NAMESPACE spawning cat: No such file or directory
[ 23.009991] systemd[1]: systemd-journald.service: Received EPOLLHUP on stored fd 43 (stored), closing.
[ 23.010264] systemd[1]: minimal-app0.service: Control group is empty.
[ 23.010423] systemd[1]: Received SIGCHLD from PID 376 ((cat)).
[ 23.010650] systemd[1]: Child 376 ((cat)) died (code=exited, status=226/NAMESPACE)
[ 23.010804] systemd[1]: minimal-app0.service: Child 376 belongs to minimal-app0.service.
[ 23.011028] systemd[1]: minimal-app0.service: Control process exited, code=exited, status=226/NAMESPACE
[ 23.011234] systemd[1]: minimal-app0.service: Got final SIGCHLD for state start-pre.
[ 23.011431] systemd[1]: minimal-app0.service: Failed with result 'exit-code'.
```

Full journals:

* [systemd.journal.tar.gz](https://github.com/systemd/systemd/files/9213531/systemd.journal.tar.gz)
* [systemd.journal.tar.gz](https://github.com/systemd/systemd/files/9213553/systemd.journal.tar.gz)

So far I've seen this only in the CentOS CI sanitizer run, so I'm opening this as (not only) a tracker while I dig deeper.

### Steps to reproduce the problem

_No response_

### Additional program output to the terminal or log subsystem illustrating the issue

_No response_
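A notable pattern in the journal: two unit setup processes (PIDs 375 and 376) prepare the same mount-point node `/run/systemd/unit-root/run/host/os-release` concurrently; 375 creates the node and bind-mounts onto it, while 376's create fails and its later mount sees ENOENT. As a rough, hypothetical illustration of that check-then-create race (plain C against a temp file, not systemd code — the path and helper are made up), using `O_CREAT|O_EXCL` to stand in for the atomic node creation:

```c
#include <assert.h>
#include <errno.h>
#include <fcntl.h>
#include <string.h>
#include <unistd.h>

/* Hypothetical sketch: each "worker" tries to create the same mount-point
 * node before bind-mounting onto it. open(2) with O_CREAT|O_EXCL is atomic,
 * so exactly one attempt can succeed; the loser sees EEXIST, much like one
 * of the two racing setup processes in the journal fails to create the
 * os-release node and then trips over a path it never made itself. */
static const char *attempt_create(const char *path)
{
    int fd = open(path, O_CREAT | O_EXCL | O_WRONLY, 0644);
    if (fd >= 0) {
        close(fd);
        return "created";
    }
    return errno == EEXIST ? "lost race" : "error";
}
```

Under this reading, the second attempt is guaranteed to fail once the first has won, which would make the flakiness a timing question of whether both setups overlap at all.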
index: True
label: port
binary_label: 1
Unnamed: 0: 222
id: 4,622,806,536
type: IssuesEvent
created_at: 2016-09-27 08:53:59
repo: ocaml/opam-repository
repo_url: https://api.github.com/repos/ocaml/opam-repository
action: closed
title: lablgl for windows
labels: portability
from @blue-prawn:

```
Hi,

On my Windows test environment I can not use "lablgl-20120306" because of
the glShader module that I myself provided before I get a Windows test
environment. But the vanilla "lablgl-1.04" works fine.
If I've understood correctly Opam is a source package manager, so maybe
opam could provide "lablgl.1.04" for Windows users that have a similar
environment than mine?

My windows test environment is:
Windows 7 starter with protz's ocaml version 4.00.1

Well in this environment only lablgl itself compiles out of the box.
LablGlut does not. Togl (Tk) does not either.
(I think that the devel libs for both glut and tkgl are there)
LablGlut and Togl are for windowing, so if we use OCamlSDL instead this is
just fine, we don't need these. OCamlSDL1 compiles out of the box in my win env.

In this environment I compile the Vanilla LablGL like this:

cp Makefile.config.mingw Makefile.config
# need to add -I/usr/include, the patch below does this:
patch < Makefile.config-cygwin-needs-I_usr_include.patch
# don't build anything else than lablgl itself or we get errors:
make lib
make libopt
LABLGL_INSTALLDIR="/tmp/mylblgl0"
make install INSTALLDIR="$LABLGL_INSTALLDIR"
# it seems that the file "dlllablgl.dll" is not installed, so:
cp src/dlllablgl.dll "$LABLGL_INSTALLDIR"/

Now a hello-world works fine.
(I'm providing this hello at the end after the config patch)

===============
$ cat Makefile.config-cygwin-needs-I_usr_include.patch
+++ Makefile.config~	2013-03-12 18:11:58.945140000 +0100
--- Makefile.config	2013-03-30 13:55:08.524315300 +0100
@@ -35,7 +35,7 @@ -lws2_32 -luser32 -lgdi32

 # Where to find OpenGL/Mesa/Glut headers and libraries
-GLINCLUDES = -DHAS_GLEXT_H -DGL_GLEXT_PROTOTYPES -DGLU_VERSION_1_3
+GLINCLUDES = -I/usr/include -DHAS_GLEXT_H -DGL_GLEXT_PROTOTYPES -DGLU_VERSION_1_3
 GLLIBS = -lglu32 -lopengl32
 GLLIBS0 = $(GLLIBS)
 GLUTLIBS = -lglut32
============
$ cat lblgl_hello.ml
let display () =
  GlClear.color (0.0, 0.0, 0.0);
  GlClear.clear [`color];
  GlDraw.color (1.0, 1.0, 0.0);
  GlMat.mode `projection;
  GlMat.load_identity ();
  GlMat.ortho ~x:(-1.0,1.0) ~y:(-1.0,1.0) ~z:(-1.0,1.0);
  GlDraw.begins `polygon;
  GlDraw.vertex ~x:(-0.5) ~y:(-0.5) ();
  GlDraw.vertex ~x:(-0.5) ~y:(0.5) ();
  GlDraw.vertex ~x:(0.5) ~y:(0.5) ();
  GlDraw.vertex ~x:(0.5) ~y:(-0.5) ();
  GlDraw.ends ();
  Gl.flush ()

let () =
  Sdl.init [`EVERYTHING];
  let _ = Sdlvideo.set_video_mode ~w:640 ~h:400
      [`HWSURFACE; `DOUBLEBUF; `OPENGL; `RESIZABLE] in
  for i = 1 to 6 do
    display ();
    Sdlgl.swap_buffers ();
    Sdltimer.delay 1000;
  done;
  Sdl.quit ()

# using full windows path, because protz's ocaml doesn't like cygwin ones:
$ ocaml bigarray.cma -I +site-lib/sdl sdl.cma -I C:/cygwin/tmp/mylblgl0 lablgl.cma lblgl_hello.ml
===========
# completely untested, just copied and modified from "lablgl.20120306"
$ cat opam
opam-version: "1"
maintainer: "contact@ocamlpro.com"
homepage: "https://forge.ocamlcore.org/projects/lablgl/"
authors: [ "Jacques Garrigue"
           "Isaac Trotts"
           "Erick Tryzelaar"
           "Christophe Raffali" ]
# "Jon Harrop" contributed the files "gluTess.ml" / "ml_glutess.c"
# should he be considered as one of the authors? or just a contributor?
build: [
  ["cp" "Makefile.config.ex" "Makefile.config"]
  ["cp" "Makefile.config.osx" "Makefile.config"] {"%{os}%" = "darwin"}
  ["cp" "Makefile.config.mingw" "Makefile.config"] {"%{os}%" = "windows 7 starter with protz's ocaml 4"}
  ["mkdir" "-p" "%{bin}%"]
  # [make "glut"]    ### do not build this one, it doesn't compile out of the box in my W env
  # [make "glutopt"] ### do not build this one, it doesn't compile out of the box in my W env
  ### in my W env, only build lablgl.{cma,cmxa}
  [make "lib"]
  [make "libopt"]
  [make "install" "-C" "src" "BINDIR=%{bin}%" "INSTALLDIR=%{lib}%/lablgl" "DLLDIR=%{lib}%/stublibs"]
  # it seems that the makefile forgets to install this one:
  ["cp" "src/dlllablgl.dll" "DLLDIR=%{lib}%/stublibs"] {"%{os:env}%" = "win7::protz"}
  ### do not install LablGlut, it doesn't compile out of the box in my W env
  ### [make "install" "-C" "LablGlut/src" "BINDIR=%{bin}%" "INSTALLDIR=%{lib}%/lablgl" "DLLDIR=%{lib}%/stublibs"]
]

--
Cheers
```
True
port
1
44,819
13,091,611,483
IssuesEvent
2020-08-03 07:00:10
elikkatzgit/git_test2
https://api.github.com/repos/elikkatzgit/git_test2
opened
CVE-2019-14892 (High) detected in jackson-databind-2.9.8.jar
security vulnerability
## CVE-2019-14892 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /tmp/ws-scm/git_test2/pom.xml</p> <p>Path to vulnerable library: canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.8/jackson-databind-2.9.8.jar,/git_test2/target/myfinal-1.0.0-BUILD-SNAPSHOT/WEB-INF/lib/jackson-databind-2.9.8.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.9.8.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/elikkatzgit/git_test2/commits/6c7791e0dbfe9f7fe7e387607f242d4f69208017">6c7791e0dbfe9f7fe7e387607f242d4f69208017</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A flaw was discovered in jackson-databind in versions before 2.9.10, 2.8.11.5 and 2.6.7.3, where it would permit polymorphic deserialization of a malicious object using commons-configuration 1 and 2 JNDI classes. An attacker could use this flaw to execute arbitrary code. 
<p>Publish Date: 2020-03-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-14892>CVE-2019-14892</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2462">https://github.com/FasterXML/jackson-databind/issues/2462</a></p> <p>Release Date: 2020-03-02</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.6.7.3,2.7.9.7,2.8.11.5,2.9.10</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.8","isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.6.7.3,2.7.9.7,2.8.11.5,2.9.10"}],"vulnerabilityIdentifier":"CVE-2019-14892","vulnerabilityDetails":"A flaw was discovered in jackson-databind in versions before 2.9.10, 2.8.11.5 and 2.6.7.3, where it would permit polymorphic deserialization of a malicious object using commons-configuration 1 
and 2 JNDI classes. An attacker could use this flaw to execute arbitrary code.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-14892","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
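The suggested fix above lists one minimum patched release per 2.x minor line. As a purely illustrative sanity check — not part of the advisory, and with invented class/method names — the comparison against those fixed versions can be sketched like this:

```java
import java.util.Arrays;
import java.util.Map;

// Illustrative sketch: is a given jackson-databind version at or above the
// patched release for its 2.x minor line (per the fix resolution above)?
public class JacksonVersionCheck {
    // Minimum patched versions from the advisory, keyed by "major.minor".
    static final Map<String, String> FIXED = Map.of(
            "2.6", "2.6.7.3",
            "2.7", "2.7.9.7",
            "2.8", "2.8.11.5",
            "2.9", "2.9.10");

    static int[] parse(String v) {
        return Arrays.stream(v.split("\\.")).mapToInt(Integer::parseInt).toArray();
    }

    // Numeric, segment-by-segment comparison; missing segments count as 0.
    static int compare(String a, String b) {
        int[] x = parse(a), y = parse(b);
        for (int i = 0; i < Math.max(x.length, y.length); i++) {
            int xi = i < x.length ? x[i] : 0;
            int yi = i < y.length ? y[i] : 0;
            if (xi != yi) return Integer.compare(xi, yi);
        }
        return 0;
    }

    static boolean isPatched(String version) {
        String[] parts = version.split("\\.");
        String fixed = FIXED.get(parts[0] + "." + parts[1]);
        return fixed != null && compare(version, fixed) >= 0;
    }

    public static void main(String[] args) {
        System.out.println(isPatched("2.9.8"));  // the flagged version: below 2.9.10
        System.out.println(isPatched("2.9.10")); // first patched release on the 2.9 line
    }
}
```

Real dependency auditing should of course rely on the build tool's dependency report rather than ad-hoc string comparison; this only mirrors the version ranges the advisory states.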
True
non_port
0
25,488
25,280,935,755
IssuesEvent
2022-11-16 15:44:07
spiral/framework
https://api.github.com/repos/spiral/framework
closed
JsonPayloadMiddleware should not try to decode body for GET requests
Bug Usability
Currently, GET requests with `Content-Type: application/json` will throw an exception because `JsonPayloadMiddleware` tries to decode their empty body. I think the middleware should fall back to an empty array in such cases. Correction: it fails for any kind of request with an empty body.
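The guard being requested is small. A hedged sketch of the intended behaviour (the real middleware is PHP in spiral/framework; `parsePayload` and `decodeJson` here are invented stand-ins): decode only when the body is non-empty, otherwise return an empty payload instead of throwing.

```java
import java.util.Collections;
import java.util.Map;

// Sketch of the requested fix: an empty or absent body yields an empty
// payload; only a non-empty JSON body is actually decoded.
public class JsonPayloadGuard {
    static Map<String, Object> parsePayload(String contentType, String body) {
        if (!"application/json".equals(contentType) || body == null || body.isBlank()) {
            return Collections.emptyMap(); // GET / empty-body requests: no payload, no exception
        }
        return decodeJson(body); // malformed non-empty JSON can still fail loudly
    }

    // Stand-in for a real JSON decoder, only so the sketch is self-contained.
    static Map<String, Object> decodeJson(String body) {
        return Map.of("raw", body);
    }
}
```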
True
non_port
0
160,875
20,120,315,320
IssuesEvent
2022-02-08 01:06:56
AkshayMukkavilli/Tensorflow
https://api.github.com/repos/AkshayMukkavilli/Tensorflow
opened
CVE-2022-23566 (High) detected in tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl
security vulnerability
## CVE-2022-23566 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary> <p>TensorFlow is an open source machine learning framework for everyone.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/d2/ea/ab2c8c0e81bd051cc1180b104c75a865ab0fc66c89be992c4b20bbf6d624/tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/d2/ea/ab2c8c0e81bd051cc1180b104c75a865ab0fc66c89be992c4b20bbf6d624/tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl</a></p> <p>Path to dependency file: /Tensorflow/src/requirements.txt</p> <p>Path to vulnerable library: /teSource-ArchiveExtractor_5ea86033-7612-4210-97f3-8edb65806ddf/20190525011619_2843/20190525011537_depth_0/2/tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64/tensorflow-1.13.1.data/purelib/tensorflow</p> <p> Dependency Hierarchy: - :x: **tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Tensorflow is an Open Source Machine Learning Framework. TensorFlow is vulnerable to a heap OOB write in `Grappler`. The `set_output` function writes to an array at the specified index. Hence, this gives a malicious user a write primitive. The fix will be included in TensorFlow 2.8.0. We will also cherrypick this commit on TensorFlow 2.7.1, TensorFlow 2.6.3, and TensorFlow 2.5.3, as these are also affected and still in supported range. 
<p>Publish Date: 2022-02-04 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-23566>CVE-2022-23566</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-5qw5-89mw-wcg2">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-5qw5-89mw-wcg2</a></p> <p>Release Date: 2022-02-04</p> <p>Fix Resolution: tensorflow - 2.5.3,2.6.3,2.7.1,2.8.0;tensorflow-cpu - 2.5.3,2.6.3,2.7.1,2.8.0;tensorflow-gpu - 2.5.3,2.6.3,2.7.1,2.8.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
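The bug class named above — an attacker-influenced index used for an indexed write without validation — looks roughly the same in any language. A generic sketch, not the actual TensorFlow/Grappler code (and note Java arrays would trap a bad write with an exception rather than corrupt memory, so this only mirrors the validation the C++ fix adds):

```java
// Generic illustration of the heap-OOB-write bug class: an unvalidated index
// used for a write becomes a write primitive. The "patched" shape rejects
// out-of-range indices before touching the buffer.
public class SetOutputSketch {
    private final double[] outputs;

    public SetOutputSketch(int size) {
        this.outputs = new double[size];
    }

    // Validate the index before writing instead of writing blindly.
    public boolean setOutput(int index, double value) {
        if (index < 0 || index >= outputs.length) {
            return false; // reject the out-of-range write
        }
        outputs[index] = value;
        return true;
    }

    public double get(int index) {
        return outputs[index];
    }
}
```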
True
non_port
0
807,897
30,023,443,319
IssuesEvent
2023-06-27 02:49:15
AnOpenSauceDev/Methane-mod
https://api.github.com/repos/AnOpenSauceDev/Methane-mod
closed
[Issue] 1.20 is broken
enhancement compat Methane-Next priority:medium
## Description of what happened
Methane worked fine till 23w14a (by editing fabric.mod.json), but it broke for 23w16a.

## Mods used
- methane 1.7
- cloth config 10.0.96
- fabric api for 23w16a

## possible ways to replicate this bug
Join a server; then it crashes. Maybe you already want to update Methane to 1.20. Might be an issue with the MatrixStack screen, not sure though. It seems to be a fundamental change, because all 1.19.4 mods that worked fine for 23w14a broke for 23w16a (the game crashes while launching/joining a server).

## crashreport
---- Minecraft Crash Report ----
// Shall we play a game?

Time: 2023-04-21 01:03:34
Description: Unexpected error

java.lang.AbstractMethodError: Receiver class me.wolfie.methane.client.HudRenderListener does not define or inherit an implementation of the resolved method 'abstract void onHudRender(net.minecraft.class_332, float)' of interface net.fabricmc.fabric.api.client.rendering.v1.HudRenderCallback.
	at net.fabricmc.fabric.api.client.rendering.v1.HudRenderCallback.lambda$static$0(HudRenderCallback.java:27)
	at net.minecraft.class_329.handler$zhd000$fabric-rendering-v1$render(class_329.java:1393)
	at net.minecraft.class_329.method_1753(class_329.java:371)
	at net.minecraft.class_757.method_3192(class_757.java:918)
	at net.minecraft.class_310.method_1523(class_310.java:1203)
	at net.minecraft.class_310.method_1514(class_310.java:786)
	at net.minecraft.client.main.Main.main(Main.java:240)
	at net.fabricmc.loader.impl.game.minecraft.MinecraftGameProvider.launch(MinecraftGameProvider.java:462)
	at net.fabricmc.loader.impl.launch.knot.Knot.launch(Knot.java:74)
	at net.fabricmc.loader.impl.launch.knot.KnotClient.main(KnotClient.java:23)

A detailed walkthrough of the error, its code path and all known details is as follows:
---------------------------------------------------------------------------------------

-- Head --
Thread: Render thread
Stacktrace:
	at net.fabricmc.fabric.api.client.rendering.v1.HudRenderCallback.lambda$static$0(HudRenderCallback.java:27)
	at
net.minecraft.class_329.handler$zhd000$fabric-rendering-v1$render(class_329.java:1393) at net.minecraft.class_329.method_1753(class_329.java:371) -- Affected level -- Details: All players: 1 total; [class_746['Cyclopropinon'/2857, l='ClientLevel', x=192.87, y=133.00, z=-246.02]] Chunk stats: 729, 248 Level dimension: minecraft:the_nether Level spawn location: World: (0,73,0), Section: (at 0,9,0 in 0,4,0; chunk contains blocks 0,0,0 to 15,255,15), Region: (0,0; contains chunks 0,0 to 31,31, blocks 0,0,0 to 511,255,511) Level time: 119784655 game time, 129434541 day time Server brand: fabric Server type: Non-integrated multiplayer server Stacktrace: at net.minecraft.class_638.method_8538(class_638.java:455) at net.minecraft.class_310.method_1587(class_310.java:2394) at net.minecraft.class_310.method_1514(class_310.java:810) at net.minecraft.client.main.Main.main(Main.java:240) at net.fabricmc.loader.impl.game.minecraft.MinecraftGameProvider.launch(MinecraftGameProvider.java:462) at net.fabricmc.loader.impl.launch.knot.Knot.launch(Knot.java:74) at net.fabricmc.loader.impl.launch.knot.KnotClient.main(KnotClient.java:23) -- Last reload -- Details: Reload number: 1 Reload reason: initial Finished: Yes Packs: vanilla, Fabric Mods -- System Details -- Details: Minecraft Version: 23w16a Minecraft Version ID: 23w16a Operating System: Windows 10 (amd64) version 10.0 Java Version: 17.0.3, Microsoft Java VM Version: OpenJDK 64-Bit Server VM (mixed mode), Microsoft Memory: 359664912 bytes (343 MiB) / 671088640 bytes (640 MiB) up to 2147483648 bytes (2048 MiB) CPUs: 4 Processor Vendor: GenuineIntel Processor Name: Intel(R) Core(TM) i5 CPU M 560 @ 2.67GHz Identifier: Intel64 Family 6 Model 37 Stepping 5 Microarchitecture: Westmere (Client) Frequency (GHz): 2.66 Number of physical packages: 1 Number of physical CPUs: 2 Number of logical CPUs: 4 Graphics card #0 name: NVIDIA NVS 3100M Graphics card #0 vendor: NVIDIA (0x10de) Graphics card #0 VRAM (MB): 512.00 Graphics card #0 
deviceId: 0x0a6c Graphics card #0 versionInfo: DriverVersion=21.21.13.4201 Memory slot #0 capacity (MB): 4096.00 Memory slot #0 clockSpeed (GHz): 1.33 Memory slot #0 type: DDR3 Virtual memory max (MB): 15219.67 Virtual memory used (MB): 10379.44 Swap memory total (MB): 11264.00 Swap memory used (MB): 2109.98 JVM Flags: 9 total; -XX:HeapDumpPath=MojangTricksIntelDriversForPerformance_javaw.exe_minecraft.exe.heapdump -Xss1M -Xmx2G -XX:+UnlockExperimentalVMOptions -XX:+UseG1GC -XX:G1NewSizePercent=20 -XX:G1ReservePercent=20 -XX:MaxGCPauseMillis=50 -XX:G1HeapRegionSize=32M Fabric Mods: cloth-config: Cloth Config v10 10.0.96 cloth-basic-math: cloth-basic-math 0.6.1 fabric-api: Fabric API 0.78.1+1.20 fabric-api-base: Fabric API Base 0.4.23+9ff28bce67 fabric-api-lookup-api-v1: Fabric API Lookup API (v1) 1.6.26+eff2638667 fabric-biome-api-v1: Fabric Biome API (v1) 13.0.6+348a9c6467 fabric-block-api-v1: Fabric Block API (v1) 1.0.5+e022e5d167 fabric-blockrenderlayer-v1: Fabric BlockRenderLayer Registration (v1) 1.1.33+c2e6f67467 fabric-client-tags-api-v1: Fabric Client Tags 1.0.14+1134c5b867 fabric-command-api-v1: Fabric Command API (v1) 1.2.26+f71b366f67 fabric-command-api-v2: Fabric Command API (v2) 2.2.5+df5b2a9d67 fabric-commands-v0: Fabric Commands (v0) 0.2.43+df3654b367 fabric-containers-v0: Fabric Containers (v0) 0.1.55+df3654b367 fabric-content-registries-v0: Fabric Content Registries (v0) 4.0.0+eff2638667 fabric-convention-tags-v1: Fabric Convention Tags 1.4.0+9a7c5daa67 fabric-crash-report-info-v1: Fabric Crash Report Info (v1) 0.2.14+aeb40ebe67 fabric-data-generation-api-v1: Fabric Data Generation API (v1) 12.0.0+eff2638667 fabric-dimensions-v1: Fabric Dimensions API (v1) 2.1.44+7f87f8fa67 fabric-entity-events-v1: Fabric Entity Events (v1) 1.5.14+eff2638667 fabric-events-interaction-v0: Fabric Events Interaction (v0) 0.4.42+a1ccd7bf67 fabric-events-lifecycle-v0: Fabric Events Lifecycle (v0) 0.2.53+df3654b367 fabric-game-rule-api-v1: Fabric Game Rule API (v1) 
1.0.33+eff2638667 fabric-item-api-v1: Fabric Item API (v1) 2.1.18+eff2638667 fabric-item-group-api-v1: Fabric Item Group API (v1) 4.0.0+eff2638667 fabric-key-binding-api-v1: Fabric Key Binding API (v1) 1.0.32+c477957e67 fabric-keybindings-v0: Fabric Key Bindings (v0) 0.2.30+df3654b367 fabric-lifecycle-events-v1: Fabric Lifecycle Events (v1) 2.2.14+5da15ca167 fabric-loot-api-v2: Fabric Loot API (v2) 1.1.29+eff2638667 fabric-loot-tables-v1: Fabric Loot Tables (v1) 1.1.33+9e7660c667 fabric-message-api-v1: Fabric Message API (v1) 5.1.0+1ee8be4067 fabric-mining-level-api-v1: Fabric Mining Level API (v1) 2.1.39+eff2638667 fabric-models-v0: Fabric Models (v0) 0.3.29+11ba9c3b67 fabric-networking-api-v1: Fabric Networking API (v1) 1.3.2+eff2638667 fabric-networking-v0: Fabric Networking (v0) 0.3.42+df3654b367 fabric-object-builder-api-v1: Fabric Object Builder API (v1) 10.0.0+eff2638667 fabric-particles-v1: Fabric Particles (v1) 1.0.22+f1e4495b67 fabric-recipe-api-v1: Fabric Recipe API (v1) 1.0.9+a1ccd7bf67 fabric-registry-sync-v0: Fabric Registry Sync (v0) 2.1.5+eff2638667 fabric-renderer-api-v1: Fabric Renderer API (v1) 2.2.5+eff2638667 fabric-renderer-indigo: Fabric Renderer - Indigo 1.1.1+81e8c57667 fabric-renderer-registries-v1: Fabric Renderer Registries (v1) 3.2.38+df3654b367 fabric-rendering-data-attachment-v1: Fabric Rendering Data Attachment (v1) 0.3.27+afca2f3e67 fabric-rendering-fluids-v1: Fabric Rendering Fluids (v1) 3.0.20+f1e4495b67 fabric-rendering-v0: Fabric Rendering (v0) 1.1.41+df3654b367 fabric-rendering-v1: Fabric Rendering (v1) 3.0.0+eff2638667 fabric-resource-conditions-api-v1: Fabric Resource Conditions API (v1) 2.3.0+e6c7d4ee67 fabric-resource-loader-v0: Fabric Resource Loader (v0) 0.11.1+03ffe37867 fabric-screen-api-v1: Fabric Screen API (v1) 2.0.0+eff2638667 fabric-screen-handler-api-v1: Fabric Screen Handler API (v1) 1.3.21+eff2638667 fabric-sound-api-v1: Fabric Sound API (v1) 1.0.8+75e9821167 fabric-transfer-api-v1: Fabric Transfer API (v1) 
3.1.1+eff2638667 fabric-transitive-access-wideners-v1: Fabric Transitive Access Wideners (v1) 4.0.1+848ffaab67 fabricloader: Fabric Loader 0.14.19 java: OpenJDK 64-Bit Server VM 17 methane: Methane 1.7 minecraft: Minecraft 1.20-alpha.23.16.a Launched Version: fabric-loader-0.14.19-23w16a Backend library: LWJGL version 3.3.1 SNAPSHOT Backend API: NVS 3100M/PCIe/SSE2 GL version 3.2.0, NVIDIA Corporation Window size: 1440x837 GL Caps: Using framebuffer using OpenGL 3.2 GL debug messages: Using VBOs: Yes Is Modded: Definitely; Client brand changed to 'fabric' Type: Client (map_client.txt) Graphics mode: fancy Resource Packs: fabric Current Language: en_us CPU: 4x Intel(R) Core(TM) i5 CPU M 560 @ 2.67GHz
1.0
[Issue] 1.20 is broken - ## Description of what happened methane worked fine till 23w14a (by editing fabric.mod.json), but it broke for 23w16a ## Mods used methane 1.7 cloth config 10.0.96 fabric api for 23w16a ## possible ways to replicate this bug join a server; then it crashes maybe u already want to update methane to 1.20 might be an issue with MatrixStack screen. not sure tho it seems to be a fundamental change, because all 1.19.4 mods that worked fine for 23w14a broke for 23w16a (game crashes while launching/joining server) ## crashreport ---- Minecraft Crash Report ---- // Shall we play a game? Time: 2023-04-21 01:03:34 Description: Unexpected error java.lang.AbstractMethodError: Receiver class me.wolfie.methane.client.HudRenderListener does not define or inherit an implementation of the resolved method 'abstract void onHudRender(net.minecraft.class_332, float)' of interface net.fabricmc.fabric.api.client.rendering.v1.HudRenderCallback. at net.fabricmc.fabric.api.client.rendering.v1.HudRenderCallback.lambda$static$0(HudRenderCallback.java:27) at net.minecraft.class_329.handler$zhd000$fabric-rendering-v1$render(class_329.java:1393) at net.minecraft.class_329.method_1753(class_329.java:371) at net.minecraft.class_757.method_3192(class_757.java:918) at net.minecraft.class_310.method_1523(class_310.java:1203) at net.minecraft.class_310.method_1514(class_310.java:786) at net.minecraft.client.main.Main.main(Main.java:240) at net.fabricmc.loader.impl.game.minecraft.MinecraftGameProvider.launch(MinecraftGameProvider.java:462) at net.fabricmc.loader.impl.launch.knot.Knot.launch(Knot.java:74) at net.fabricmc.loader.impl.launch.knot.KnotClient.main(KnotClient.java:23) A detailed walkthrough of the error, its code path and all known details is as follows: --------------------------------------------------------------------------------------- -- Head -- Thread: Render thread Stacktrace: at 
net.fabricmc.fabric.api.client.rendering.v1.HudRenderCallback.lambda$static$0(HudRenderCallback.java:27) at net.minecraft.class_329.handler$zhd000$fabric-rendering-v1$render(class_329.java:1393) at net.minecraft.class_329.method_1753(class_329.java:371) -- Affected level -- Details: All players: 1 total; [class_746['Cyclopropinon'/2857, l='ClientLevel', x=192.87, y=133.00, z=-246.02]] Chunk stats: 729, 248 Level dimension: minecraft:the_nether Level spawn location: World: (0,73,0), Section: (at 0,9,0 in 0,4,0; chunk contains blocks 0,0,0 to 15,255,15), Region: (0,0; contains chunks 0,0 to 31,31, blocks 0,0,0 to 511,255,511) Level time: 119784655 game time, 129434541 day time Server brand: fabric Server type: Non-integrated multiplayer server Stacktrace: at net.minecraft.class_638.method_8538(class_638.java:455) at net.minecraft.class_310.method_1587(class_310.java:2394) at net.minecraft.class_310.method_1514(class_310.java:810) at net.minecraft.client.main.Main.main(Main.java:240) at net.fabricmc.loader.impl.game.minecraft.MinecraftGameProvider.launch(MinecraftGameProvider.java:462) at net.fabricmc.loader.impl.launch.knot.Knot.launch(Knot.java:74) at net.fabricmc.loader.impl.launch.knot.KnotClient.main(KnotClient.java:23) -- Last reload -- Details: Reload number: 1 Reload reason: initial Finished: Yes Packs: vanilla, Fabric Mods -- System Details -- Details: Minecraft Version: 23w16a Minecraft Version ID: 23w16a Operating System: Windows 10 (amd64) version 10.0 Java Version: 17.0.3, Microsoft Java VM Version: OpenJDK 64-Bit Server VM (mixed mode), Microsoft Memory: 359664912 bytes (343 MiB) / 671088640 bytes (640 MiB) up to 2147483648 bytes (2048 MiB) CPUs: 4 Processor Vendor: GenuineIntel Processor Name: Intel(R) Core(TM) i5 CPU M 560 @ 2.67GHz Identifier: Intel64 Family 6 Model 37 Stepping 5 Microarchitecture: Westmere (Client) Frequency (GHz): 2.66 Number of physical packages: 1 Number of physical CPUs: 2 Number of logical CPUs: 4 Graphics card #0 name: NVIDIA 
NVS 3100M Graphics card #0 vendor: NVIDIA (0x10de) Graphics card #0 VRAM (MB): 512.00 Graphics card #0 deviceId: 0x0a6c Graphics card #0 versionInfo: DriverVersion=21.21.13.4201 Memory slot #0 capacity (MB): 4096.00 Memory slot #0 clockSpeed (GHz): 1.33 Memory slot #0 type: DDR3 Virtual memory max (MB): 15219.67 Virtual memory used (MB): 10379.44 Swap memory total (MB): 11264.00 Swap memory used (MB): 2109.98 JVM Flags: 9 total; -XX:HeapDumpPath=MojangTricksIntelDriversForPerformance_javaw.exe_minecraft.exe.heapdump -Xss1M -Xmx2G -XX:+UnlockExperimentalVMOptions -XX:+UseG1GC -XX:G1NewSizePercent=20 -XX:G1ReservePercent=20 -XX:MaxGCPauseMillis=50 -XX:G1HeapRegionSize=32M Fabric Mods: cloth-config: Cloth Config v10 10.0.96 cloth-basic-math: cloth-basic-math 0.6.1 fabric-api: Fabric API 0.78.1+1.20 fabric-api-base: Fabric API Base 0.4.23+9ff28bce67 fabric-api-lookup-api-v1: Fabric API Lookup API (v1) 1.6.26+eff2638667 fabric-biome-api-v1: Fabric Biome API (v1) 13.0.6+348a9c6467 fabric-block-api-v1: Fabric Block API (v1) 1.0.5+e022e5d167 fabric-blockrenderlayer-v1: Fabric BlockRenderLayer Registration (v1) 1.1.33+c2e6f67467 fabric-client-tags-api-v1: Fabric Client Tags 1.0.14+1134c5b867 fabric-command-api-v1: Fabric Command API (v1) 1.2.26+f71b366f67 fabric-command-api-v2: Fabric Command API (v2) 2.2.5+df5b2a9d67 fabric-commands-v0: Fabric Commands (v0) 0.2.43+df3654b367 fabric-containers-v0: Fabric Containers (v0) 0.1.55+df3654b367 fabric-content-registries-v0: Fabric Content Registries (v0) 4.0.0+eff2638667 fabric-convention-tags-v1: Fabric Convention Tags 1.4.0+9a7c5daa67 fabric-crash-report-info-v1: Fabric Crash Report Info (v1) 0.2.14+aeb40ebe67 fabric-data-generation-api-v1: Fabric Data Generation API (v1) 12.0.0+eff2638667 fabric-dimensions-v1: Fabric Dimensions API (v1) 2.1.44+7f87f8fa67 fabric-entity-events-v1: Fabric Entity Events (v1) 1.5.14+eff2638667 fabric-events-interaction-v0: Fabric Events Interaction (v0) 0.4.42+a1ccd7bf67 fabric-events-lifecycle-v0: 
Fabric Events Lifecycle (v0) 0.2.53+df3654b367 fabric-game-rule-api-v1: Fabric Game Rule API (v1) 1.0.33+eff2638667 fabric-item-api-v1: Fabric Item API (v1) 2.1.18+eff2638667 fabric-item-group-api-v1: Fabric Item Group API (v1) 4.0.0+eff2638667 fabric-key-binding-api-v1: Fabric Key Binding API (v1) 1.0.32+c477957e67 fabric-keybindings-v0: Fabric Key Bindings (v0) 0.2.30+df3654b367 fabric-lifecycle-events-v1: Fabric Lifecycle Events (v1) 2.2.14+5da15ca167 fabric-loot-api-v2: Fabric Loot API (v2) 1.1.29+eff2638667 fabric-loot-tables-v1: Fabric Loot Tables (v1) 1.1.33+9e7660c667 fabric-message-api-v1: Fabric Message API (v1) 5.1.0+1ee8be4067 fabric-mining-level-api-v1: Fabric Mining Level API (v1) 2.1.39+eff2638667 fabric-models-v0: Fabric Models (v0) 0.3.29+11ba9c3b67 fabric-networking-api-v1: Fabric Networking API (v1) 1.3.2+eff2638667 fabric-networking-v0: Fabric Networking (v0) 0.3.42+df3654b367 fabric-object-builder-api-v1: Fabric Object Builder API (v1) 10.0.0+eff2638667 fabric-particles-v1: Fabric Particles (v1) 1.0.22+f1e4495b67 fabric-recipe-api-v1: Fabric Recipe API (v1) 1.0.9+a1ccd7bf67 fabric-registry-sync-v0: Fabric Registry Sync (v0) 2.1.5+eff2638667 fabric-renderer-api-v1: Fabric Renderer API (v1) 2.2.5+eff2638667 fabric-renderer-indigo: Fabric Renderer - Indigo 1.1.1+81e8c57667 fabric-renderer-registries-v1: Fabric Renderer Registries (v1) 3.2.38+df3654b367 fabric-rendering-data-attachment-v1: Fabric Rendering Data Attachment (v1) 0.3.27+afca2f3e67 fabric-rendering-fluids-v1: Fabric Rendering Fluids (v1) 3.0.20+f1e4495b67 fabric-rendering-v0: Fabric Rendering (v0) 1.1.41+df3654b367 fabric-rendering-v1: Fabric Rendering (v1) 3.0.0+eff2638667 fabric-resource-conditions-api-v1: Fabric Resource Conditions API (v1) 2.3.0+e6c7d4ee67 fabric-resource-loader-v0: Fabric Resource Loader (v0) 0.11.1+03ffe37867 fabric-screen-api-v1: Fabric Screen API (v1) 2.0.0+eff2638667 fabric-screen-handler-api-v1: Fabric Screen Handler API (v1) 1.3.21+eff2638667 
fabric-sound-api-v1: Fabric Sound API (v1) 1.0.8+75e9821167 fabric-transfer-api-v1: Fabric Transfer API (v1) 3.1.1+eff2638667 fabric-transitive-access-wideners-v1: Fabric Transitive Access Wideners (v1) 4.0.1+848ffaab67 fabricloader: Fabric Loader 0.14.19 java: OpenJDK 64-Bit Server VM 17 methane: Methane 1.7 minecraft: Minecraft 1.20-alpha.23.16.a Launched Version: fabric-loader-0.14.19-23w16a Backend library: LWJGL version 3.3.1 SNAPSHOT Backend API: NVS 3100M/PCIe/SSE2 GL version 3.2.0, NVIDIA Corporation Window size: 1440x837 GL Caps: Using framebuffer using OpenGL 3.2 GL debug messages: Using VBOs: Yes Is Modded: Definitely; Client brand changed to 'fabric' Type: Client (map_client.txt) Graphics mode: fancy Resource Packs: fabric Current Language: en_us CPU: 4x Intel(R) Core(TM) i5 CPU M 560 @ 2.67GHz
non_port
is broken description of what happened methane worked fine till by editing fabric mod json but it broke for mods used methane cloth config fabric api for possible ways to replicate this bug join a server then it crashes maybe u already want to update methane to might be an issue with matrixstack screen not sure tho it seems to be a fundamental change because all mods that worked fine for broke for game crashes while launching joining server crashreport minecraft crash report shall we play a game time description unexpected error java lang abstractmethoderror receiver class me wolfie methane client hudrenderlistener does not define or inherit an implementation of the resolved method abstract void onhudrender net minecraft class float of interface net fabricmc fabric api client rendering hudrendercallback at net fabricmc fabric api client rendering hudrendercallback lambda static hudrendercallback java at net minecraft class handler fabric rendering render class java at net minecraft class method class java at net minecraft class method class java at net minecraft class method class java at net minecraft class method class java at net minecraft client main main main main java at net fabricmc loader impl game minecraft minecraftgameprovider launch minecraftgameprovider java at net fabricmc loader impl launch knot knot launch knot java at net fabricmc loader impl launch knot knotclient main knotclient java a detailed walkthrough of the error its code path and all known details is as follows head thread render thread stacktrace at net fabricmc fabric api client rendering hudrendercallback lambda static hudrendercallback java at net minecraft class handler fabric rendering render class java at net minecraft class method class java affected level details all players total chunk stats level dimension minecraft the nether level spawn location world section at in chunk contains blocks to region contains chunks to blocks to level time game time day time server brand fabric 
server type non integrated multiplayer server stacktrace at net minecraft class method class java at net minecraft class method class java at net minecraft class method class java at net minecraft client main main main main java at net fabricmc loader impl game minecraft minecraftgameprovider launch minecraftgameprovider java at net fabricmc loader impl launch knot knot launch knot java at net fabricmc loader impl launch knot knotclient main knotclient java last reload details reload number reload reason initial finished yes packs vanilla fabric mods system details details minecraft version minecraft version id operating system windows version java version microsoft java vm version openjdk bit server vm mixed mode microsoft memory bytes mib bytes mib up to bytes mib cpus processor vendor genuineintel processor name intel r core tm cpu m identifier family model stepping microarchitecture westmere client frequency ghz number of physical packages number of physical cpus number of logical cpus graphics card name nvidia nvs graphics card vendor nvidia graphics card vram mb graphics card deviceid graphics card versioninfo driverversion memory slot capacity mb memory slot clockspeed ghz memory slot type virtual memory max mb virtual memory used mb swap memory total mb swap memory used mb jvm flags total xx heapdumppath mojangtricksinteldriversforperformance javaw exe minecraft exe heapdump xx unlockexperimentalvmoptions xx xx xx xx maxgcpausemillis xx fabric mods cloth config cloth config cloth basic math cloth basic math fabric api fabric api fabric api base fabric api base fabric api lookup api fabric api lookup api fabric biome api fabric biome api fabric block api fabric block api fabric blockrenderlayer fabric blockrenderlayer registration fabric client tags api fabric client tags fabric command api fabric command api fabric command api fabric command api fabric commands fabric commands fabric containers fabric containers fabric content registries fabric content 
registries fabric convention tags fabric convention tags fabric crash report info fabric crash report info fabric data generation api fabric data generation api fabric dimensions fabric dimensions api fabric entity events fabric entity events fabric events interaction fabric events interaction fabric events lifecycle fabric events lifecycle fabric game rule api fabric game rule api fabric item api fabric item api fabric item group api fabric item group api fabric key binding api fabric key binding api fabric keybindings fabric key bindings fabric lifecycle events fabric lifecycle events fabric loot api fabric loot api fabric loot tables fabric loot tables fabric message api fabric message api fabric mining level api fabric mining level api fabric models fabric models fabric networking api fabric networking api fabric networking fabric networking fabric object builder api fabric object builder api fabric particles fabric particles fabric recipe api fabric recipe api fabric registry sync fabric registry sync fabric renderer api fabric renderer api fabric renderer indigo fabric renderer indigo fabric renderer registries fabric renderer registries fabric rendering data attachment fabric rendering data attachment fabric rendering fluids fabric rendering fluids fabric rendering fabric rendering fabric rendering fabric rendering fabric resource conditions api fabric resource conditions api fabric resource loader fabric resource loader fabric screen api fabric screen api fabric screen handler api fabric screen handler api fabric sound api fabric sound api fabric transfer api fabric transfer api fabric transitive access wideners fabric transitive access wideners fabricloader fabric loader java openjdk bit server vm methane methane minecraft minecraft alpha a launched version fabric loader backend library lwjgl version snapshot backend api nvs pcie gl version nvidia corporation window size gl caps using framebuffer using opengl gl debug messages using vbos yes is modded 
definitely client brand changed to fabric type client map client txt graphics mode fancy resource packs fabric current language en us cpu intel r core tm cpu m
0
31,959
6,020,849,946
IssuesEvent
2017-06-07 17:21:43
pilosa/pilosa
https://api.github.com/repos/pilosa/pilosa
closed
Adjust TopN documentation to include the `inverse` argument.
documentation
@travisturner commented on [Mon May 15 2017](https://github.com/pilosa/www/issues/100) See this PR for information: https://github.com/pilosa/pilosa/pull/551
1.0
Adjust TopN documentation to include the `inverse` argument. - @travisturner commented on [Mon May 15 2017](https://github.com/pilosa/www/issues/100) See this PR for information: https://github.com/pilosa/pilosa/pull/551
non_port
adjust topn documentation to include the inverse argument travisturner commented on see this pr for information
0
177
3,967,599,580
IssuesEvent
2016-05-03 16:45:26
jemalloc/jemalloc
https://api.github.com/repos/jemalloc/jemalloc
closed
Test suite failure on multiple architectures with --enable-prof
notabug portability
Hi, On Debian, --enable-prof was [briefly enabled](https://bugs.debian.org/767342) but it was [quickly reverted](https://bugs.debian.org/807548) as it resulted into test build failures [across multiple architectures](https://buildd.debian.org/status/logs.php?pkg=jemalloc&ver=3.6.0-4) (32-bit but also 64-bit, e.g. arm64). This was with 3.6.0, but I've reproduced this just now on an arm64 host with 4.0.4. The failure is this: ``` === test/unit/prof_accum === thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: 
(bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 5 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger 
backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed 
assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 5 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 5 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 5 > 1: Expected larger backtrace count increase test_idump: fail --- pass: 0/1, skip: 0/1, fail: 1/1 --- ``` This was with: ``` (sid_arm64-dchroot)paravoid@asachi:~/jemalloc-4.0.4$ gcc -v Using built-in specs. COLLECT_GCC=gcc COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/5/lto-wrapper Target: aarch64-linux-gnu Configured with: ../src/configure -v --with-pkgversion='Debian 5.3.1-9' --with-bugurl=file:///usr/share/doc/gcc-5/README.Bugs --enable-languages=c,ada,c++,java,go,d,fortran,objc,obj-c++ --prefix=/usr --program-suffix=-5 --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --with-sysroot=/ --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --enable-plugin --with-system-zlib --disable-browser-plugin --enable-java-awt=gtk --enable-gtk-cairo --with-java-home=/usr/lib/jvm/java-1.5.0-gcj-5-arm64/jre --enable-java-home --with-jvm-root-dir=/usr/lib/jvm/java-1.5.0-gcj-5-arm64 --with-jvm-jar-dir=/usr/lib/jvm-exports/java-1.5.0-gcj-5-arm64 --with-arch-directory=aarch64 --with-ecj-jar=/usr/share/java/eclipse-ecj.jar --enable-multiarch --enable-fix-cortex-a53-843419 --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu Thread model: posix gcc version 5.3.1 20160220 (Debian 5.3.1-9) ``` Debugging this is over my head, but I'd be happy to test potential fixes on arm64 or other architectures (using Debian's porterbox network).
True
Test suite failure on multiple architectures with --enable-prof - Hi, On Debian, --enable-prof was [briefly enabled](https://bugs.debian.org/767342) but it was [quickly reverted](https://bugs.debian.org/807548) as it resulted into test build failures [across multiple architectures](https://buildd.debian.org/status/logs.php?pkg=jemalloc&ver=3.6.0-4) (32-bit but also 64-bit, e.g. arm64). This was with 3.6.0, but I've reproduced this just now on an arm64 host with 4.0.4. The failure is this: ``` === test/unit/prof_accum === thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count 
increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 5 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: 
(bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 6 > 1: Expected larger 
backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 5 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 5 > 1: Expected larger backtrace count increase thd_start:test/unit/prof_accum.c:52: Failed assertion: (bt_count_prev+(i-i_prev)) <= (bt_count) --> 5 > 1: Expected larger backtrace count increase test_idump: fail --- pass: 0/1, skip: 0/1, fail: 1/1 --- ``` This was with: ``` (sid_arm64-dchroot)paravoid@asachi:~/jemalloc-4.0.4$ gcc -v Using built-in specs. COLLECT_GCC=gcc COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/5/lto-wrapper Target: aarch64-linux-gnu Configured with: ../src/configure -v --with-pkgversion='Debian 5.3.1-9' --with-bugurl=file:///usr/share/doc/gcc-5/README.Bugs --enable-languages=c,ada,c++,java,go,d,fortran,objc,obj-c++ --prefix=/usr --program-suffix=-5 --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --with-sysroot=/ --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --enable-plugin --with-system-zlib --disable-browser-plugin --enable-java-awt=gtk --enable-gtk-cairo --with-java-home=/usr/lib/jvm/java-1.5.0-gcj-5-arm64/jre --enable-java-home --with-jvm-root-dir=/usr/lib/jvm/java-1.5.0-gcj-5-arm64 --with-jvm-jar-dir=/usr/lib/jvm-exports/java-1.5.0-gcj-5-arm64 --with-arch-directory=aarch64 --with-ecj-jar=/usr/share/java/eclipse-ecj.jar --enable-multiarch --enable-fix-cortex-a53-843419 --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu Thread model: posix gcc version 5.3.1 20160220 (Debian 5.3.1-9) ``` Debugging this is over my head, but I'd be happy to test potential fixes on arm64 or other architectures (using Debian's 
porterbox network).
port
test suite failure on multiple architectures with enable prof hi on debian enable prof was but it was as it resulted into test build failures bit but also bit e g this was with but i ve reproduced this just now on an host with the failure is this test unit prof accum thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit 
prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected 
larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase thd start test unit prof accum c failed assertion bt count prev i i prev expected larger backtrace count increase test idump fail pass skip fail this was with sid dchroot paravoid asachi jemalloc gcc v using built in specs collect gcc gcc collect lto wrapper usr lib gcc linux gnu lto wrapper target linux gnu configured with src configure v with pkgversion debian with bugurl file usr share doc gcc readme bugs enable languages c ada c java go d fortran objc obj c prefix usr program suffix enable shared enable linker build id libexecdir usr lib without included gettext enable threads posix libdir usr lib enable nls with sysroot enable clocale gnu enable libstdcxx debug enable libstdcxx time yes with default libstdcxx abi new enable gnu unique object disable libquadmath enable plugin with system zlib disable browser plugin enable java awt gtk enable gtk cairo with java home usr lib jvm java gcj jre enable java home with jvm root dir usr lib jvm java gcj with jvm jar dir usr lib jvm exports java gcj with arch directory with ecj jar usr share java eclipse ecj jar enable multiarch enable fix cortex enable checking release build linux gnu host linux gnu target linux gnu thread model posix gcc version debian debugging this is over my head but i d be happy to test 
potential fixes on or other architectures using debian s porterbox network
1
282,179
8,704,290,844
IssuesEvent
2018-12-05 18:59:52
AICrowd/AIcrowd
https://api.github.com/repos/AICrowd/AIcrowd
closed
Drafts challenges are publicly visible (no access check done)
high priority
_From @spMohanty on April 26, 2018 15:39_ https://www.crowdai.org/challenges/marlo-2018 _Copied from original issue: crowdAI/crowdai#724_
1.0
Drafts challenges are publicly visible (no access check done) - _From @spMohanty on April 26, 2018 15:39_ https://www.crowdai.org/challenges/marlo-2018 _Copied from original issue: crowdAI/crowdai#724_
non_port
drafts challenges are publicly visible no access check done from spmohanty on april copied from original issue crowdai crowdai
0
523
7,374,251,975
IssuesEvent
2018-03-13 19:44:53
nasa/multipath-tcp-tools
https://api.github.com/repos/nasa/multipath-tcp-tools
closed
Compilation issues
bug portability
I had to do the following change through the project to make it compile: ``` - tcp_header_length = ((tcp_header->th_off & 0xf0 >> 4) * 4); + tcp_header_length = ((tcp_header->doff & 0xf0 >> 4) * 4); ``` It's probably because on your system you have ```__FAVOR_BSD``` set, because ```netinet/tcp.h``` does: ``` # ifdef __FAVOR_BSD [...] ``` I think you could probably fix that issue by defining ```__FAVOR_BSD``` in the Makefile.
True
Compilation issues - I had to do the following change through the project to make it compile: ``` - tcp_header_length = ((tcp_header->th_off & 0xf0 >> 4) * 4); + tcp_header_length = ((tcp_header->doff & 0xf0 >> 4) * 4); ``` It's probably because on your system you have ```__FAVOR_BSD``` set, because ```netinet/tcp.h``` does: ``` # ifdef __FAVOR_BSD [...] ``` I think you could probably fix that issue by defining ```__FAVOR_BSD``` in the Makefile.
port
compilation issues i had to do the following change through the project to make it compile tcp header length tcp header th off tcp header length tcp header doff it s probably because on your system you have favor bsd set because netinet tcp h does ifdef favor bsd i think you could probably fix that issue by defining favor bsd in the makefile
1
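An editor's aside on the record above: one way to sidestep the `th_off`/`doff` field-name split (which glibc toggles via `__FAVOR_BSD`) is to read the data-offset nibble from the raw header bytes instead of the struct. This is a hypothetical sketch, not code from the multipath-tcp-tools project; the byte-12 layout comes from the TCP header format itself:

```cpp
#include <cassert>

// Read the TCP header length (in bytes) straight from the wire bytes.
// The data offset occupies the upper four bits of byte 12 of the TCP
// header and is counted in 32-bit words, so no struct tcphdr field
// names (th_off vs. doff) are needed at all.
inline unsigned tcp_header_length(const unsigned char *tcp) {
    return static_cast<unsigned>(tcp[12] >> 4) * 4U;
}
```

Note this also avoids the precedence pitfall in the original expression, where `th_off & 0xf0 >> 4` parses as `th_off & (0xf0 >> 4)` because `>>` binds tighter than `&`.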
22,052
6,228,242,526
IssuesEvent
2017-07-10 22:46:23
XceedBoucherS/TestImport5
https://api.github.com/repos/XceedBoucherS/TestImport5
closed
Feature Request: Property Grid Addons
CodePlex
<b>jogibear9988[CodePlex]</b> <br />I'd like to see a small Preview Window in the Property Grid(like in VS), if the selected Object is a UiElement. (I've implemented this on the wpg PropertyGrid, but i like yours much more http://wpg.codeplex.com/) nbsp Also a better Brush Editor, a one wich supports Gradients would be nice (I've used the one from SharpDevelop in WPG, but this is more a ugly hack)
1.0
Feature Request: Property Grid Addons - <b>jogibear9988[CodePlex]</b> <br />I'd like to see a small Preview Window in the Property Grid(like in VS), if the selected Object is a UiElement. (I've implemented this on the wpg PropertyGrid, but i like yours much more http://wpg.codeplex.com/) nbsp Also a better Brush Editor, a one wich supports Gradients would be nice (I've used the one from SharpDevelop in WPG, but this is more a ugly hack)
non_port
feature request property grid addons i d like to see a small preview window in the property grid like in vs if the selected object is a uielement i ve implemented this on the wpg propertygrid but i like yours much more nbsp also a better brush editor a one wich supports gradients would be nice i ve used the one from sharpdevelop in wpg but this is more a ugly hack
0
20,733
15,977,415,584
IssuesEvent
2021-04-17 04:55:48
HauntedBees/Uprooted
https://api.github.com/repos/HauntedBees/Uprooted
closed
autosave places you in a weird spot
bug usability
you often end up stuck inside the inn and entering it again after you load the save ed. note: was likely caused by using noclip. smooth.
True
autosave places you in a weird spot - you often end up stuck inside the inn and entering it again after you load the save ed. note: was likely caused by using noclip. smooth.
non_port
autosave places you in a weird spot you often end up stuck inside the inn and entering it again after you load the save ed note was likely caused by using noclip smooth
0
963
12,371,137,588
IssuesEvent
2020-05-18 18:01:45
verilator/verilator
https://api.github.com/repos/verilator/verilator
closed
Fix trace for older GCC's
area: portability new
The recent trace changes trip up older compilers, please take a look and feel free to push changes. See e.g. (There are other problems in the cron runs, so ignore those not related to your change please) https://travis-ci.com/github/verilator/verilator/jobs/336002727 ``` home/travis/build/verilator/verilator/test_regress/../include/verilated_trace_imp.cpp:464:26: required from here /home/travis/build/verilator/verilator/test_regress/../include/verilated_trace.h:122:12: error: non-static const member ‘void* const VerilatedTrace<VerilatedVcd>::CallbackRecord::m_userp’, can’t use default assignment operator struct CallbackRecord { ^ In file included from /usr/include/c++/5/vector:69:0, from /home/travis/build/verilator/verilator/test_regress/../include/verilated_trace.h:28, from /home/travis/build/verilator/verilator/test_regress/../include/verilated_vcd_c.h:24, from /home/travis/build/verilator/verilator/test_regress/../include/verilated_vcd_c.cpp:24: /usr/include/c++/5/bits/vector.tcc:343:16: note: synthesized method ‘VerilatedTrace<VerilatedVcd>::CallbackRecord& VerilatedTrace<VerilatedVcd>::CallbackRecord::operator=(const VerilatedTrace<VerilatedVcd>::CallbackRecord&)’ first required here *__position = __x_copy; ^ /home/travis/build/verilator/verilator/test_regress/../include/verilated.mk:213: recipe for target 'verilated_vcd_c.o' failed make[1]: *** [verilated_vcd_c.o] Error 1 ```
True
Fix trace for older GCC's - The recent trace changes trip up older compilers, please take a look and feel free to push changes. See e.g. (There are other problems in the cron runs, so ignore those not related to your change please) https://travis-ci.com/github/verilator/verilator/jobs/336002727 ``` home/travis/build/verilator/verilator/test_regress/../include/verilated_trace_imp.cpp:464:26: required from here /home/travis/build/verilator/verilator/test_regress/../include/verilated_trace.h:122:12: error: non-static const member ‘void* const VerilatedTrace<VerilatedVcd>::CallbackRecord::m_userp’, can’t use default assignment operator struct CallbackRecord { ^ In file included from /usr/include/c++/5/vector:69:0, from /home/travis/build/verilator/verilator/test_regress/../include/verilated_trace.h:28, from /home/travis/build/verilator/verilator/test_regress/../include/verilated_vcd_c.h:24, from /home/travis/build/verilator/verilator/test_regress/../include/verilated_vcd_c.cpp:24: /usr/include/c++/5/bits/vector.tcc:343:16: note: synthesized method ‘VerilatedTrace<VerilatedVcd>::CallbackRecord& VerilatedTrace<VerilatedVcd>::CallbackRecord::operator=(const VerilatedTrace<VerilatedVcd>::CallbackRecord&)’ first required here *__position = __x_copy; ^ /home/travis/build/verilator/verilator/test_regress/../include/verilated.mk:213: recipe for target 'verilated_vcd_c.o' failed make[1]: *** [verilated_vcd_c.o] Error 1 ```
port
fix trace for older gcc s the recent trace changes trip up older compilers please take a look and feel free to push changes see e g there are other problems in the cron runs so ignore those not related to your change please home travis build verilator verilator test regress include verilated trace imp cpp required from here home travis build verilator verilator test regress include verilated trace h error non static const member ‘void const verilatedtrace callbackrecord m userp’ can’t use default assignment operator struct callbackrecord in file included from usr include c vector from home travis build verilator verilator test regress include verilated trace h from home travis build verilator verilator test regress include verilated vcd c h from home travis build verilator verilator test regress include verilated vcd c cpp usr include c bits vector tcc note synthesized method ‘verilatedtrace callbackrecord verilatedtrace callbackrecord operator const verilatedtrace callbackrecord ’ first required here position x copy home travis build verilator verilator test regress include verilated mk recipe for target verilated vcd c o failed make error
1
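An editor's aside on the verilator record above: the GCC 5 error reduces to a `const` data member suppressing the implicitly generated copy-assignment operator, which older libstdc++ `std::vector` insertion paths still required. This hypothetical reduction (not the actual verilator code) shows the shape of the fix, i.e. dropping the `const`:

```cpp
#include <vector>
#include <cassert>

// With 'void* const userp;' the implicit operator= is deleted and
// std::vector<CallbackRecord>::insert fails to compile on older
// libstdc++ (as in the quoted build log). A plain non-const member
// restores the default copy assignment.
struct CallbackRecord {
    void* userp;  // was 'void* const m_userp' in the failing build
    int   id;
};

inline int front_id_after_insert() {
    std::vector<CallbackRecord> v;
    v.push_back(CallbackRecord{nullptr, 1});
    v.insert(v.begin(), CallbackRecord{nullptr, 0});  // needs operator=
    return v.front().id;
}
```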
437
6,715,871,396
IssuesEvent
2017-10-13 23:55:14
wahern/cqueues
https://api.github.com/repos/wahern/cqueues
closed
Build fails against luajit 2.1
packaging/portability
LuaJIT 2.1 provides some lua 5.2 symbols. These end up clashing with our compat implementations. Build log from https://travis-ci.org/daurnimator/lua-http/jobs/261447848: ``` enabling Lua 5.1 cp /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/config.h.guess /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/config.h cp /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/config.h /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/config.h mkdir -p /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/5.1 gcc -O2 -std=gnu99 -fPIC -g -Wall -Wextra -Wno-missing-field-initializers -Wno-override-init -Wno-unused -O2 -fPIC -DLUA_COMPAT_APIINTCASTS -I/home/travis/hererocks/include -D_REENTRANT -D_THREAD_SAFE -D_GNU_SOURCE -I"/usr/include" -I"/usr/include" -DCQUEUES_VENDOR='"william@25thandClement.com"' -DCQUEUES_VERSION=20161215L -c -o /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/5.1/cqueues.o /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/cqueues.c In file included from /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/cqueues.h:44:0, from /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/cqueues.c:51: /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/compat52.h:31:13: error: static declaration of ‘luaL_setmetatable’ follows non-static declaration /home/travis/hererocks/include/lauxlib.h:92:18: note: previous declaration of ‘luaL_setmetatable’ was here /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/compat52.h:42:14: error: static declaration of ‘luaL_testudata’ follows non-static declaration /home/travis/hererocks/include/lauxlib.h:91:19: note: previous declaration of ‘luaL_testudata’ was here /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/compat52.h:57:13: error: static declaration of ‘luaL_setfuncs’ follows non-static declaration 
/home/travis/hererocks/include/lauxlib.h:88:18: note: previous declaration of ‘luaL_setfuncs’ was here /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/compat52.h:71:0: warning: "luaL_newlibtable" redefined [enabled by default] /home/travis/hererocks/include/lauxlib.h:123:0: note: this is the location of the previous definition /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/compat52.h:74:0: warning: "luaL_newlib" redefined [enabled by default] /home/travis/hererocks/include/lauxlib.h:125:0: note: this is the location of the previous definition make: *** [/tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/5.1/cqueues.o] Error 1 ```
True
Build fails against luajit 2.1 - LuaJIT 2.1 provides some lua 5.2 symbols. These end up clashing with our compat implementations. Build log from https://travis-ci.org/daurnimator/lua-http/jobs/261447848: ``` enabling Lua 5.1 cp /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/config.h.guess /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/config.h cp /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/config.h /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/config.h mkdir -p /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/5.1 gcc -O2 -std=gnu99 -fPIC -g -Wall -Wextra -Wno-missing-field-initializers -Wno-override-init -Wno-unused -O2 -fPIC -DLUA_COMPAT_APIINTCASTS -I/home/travis/hererocks/include -D_REENTRANT -D_THREAD_SAFE -D_GNU_SOURCE -I"/usr/include" -I"/usr/include" -DCQUEUES_VENDOR='"william@25thandClement.com"' -DCQUEUES_VERSION=20161215L -c -o /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/5.1/cqueues.o /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/cqueues.c In file included from /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/cqueues.h:44:0, from /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/cqueues.c:51: /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/compat52.h:31:13: error: static declaration of ‘luaL_setmetatable’ follows non-static declaration /home/travis/hererocks/include/lauxlib.h:92:18: note: previous declaration of ‘luaL_setmetatable’ was here /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/compat52.h:42:14: error: static declaration of ‘luaL_testudata’ follows non-static declaration /home/travis/hererocks/include/lauxlib.h:91:19: note: previous declaration of ‘luaL_testudata’ was here /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/compat52.h:57:13: error: static declaration of ‘luaL_setfuncs’ follows non-static declaration 
/home/travis/hererocks/include/lauxlib.h:88:18: note: previous declaration of ‘luaL_setfuncs’ was here /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/compat52.h:71:0: warning: "luaL_newlibtable" redefined [enabled by default] /home/travis/hererocks/include/lauxlib.h:123:0: note: this is the location of the previous definition /tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/compat52.h:74:0: warning: "luaL_newlib" redefined [enabled by default] /home/travis/hererocks/include/lauxlib.h:125:0: note: this is the location of the previous definition make: *** [/tmp/luarocks_cqueues-20161215.51-0-QiLoDX/cqueues-rel-20161215/src/5.1/cqueues.o] Error 1 ```
port
build fails against luajit luajit provides some lua symbols these end up clashing with our compat implementations build log from enabling lua cp tmp luarocks cqueues qilodx cqueues rel config h guess tmp luarocks cqueues qilodx cqueues rel config h cp tmp luarocks cqueues qilodx cqueues rel config h tmp luarocks cqueues qilodx cqueues rel src config h mkdir p tmp luarocks cqueues qilodx cqueues rel src gcc std fpic g wall wextra wno missing field initializers wno override init wno unused fpic dlua compat apiintcasts i home travis hererocks include d reentrant d thread safe d gnu source i usr include i usr include dcqueues vendor william com dcqueues version c o tmp luarocks cqueues qilodx cqueues rel src cqueues o tmp luarocks cqueues qilodx cqueues rel src cqueues c in file included from tmp luarocks cqueues qilodx cqueues rel src cqueues h from tmp luarocks cqueues qilodx cqueues rel src cqueues c tmp luarocks cqueues qilodx cqueues rel src h error static declaration of ‘lual setmetatable’ follows non static declaration home travis hererocks include lauxlib h note previous declaration of ‘lual setmetatable’ was here tmp luarocks cqueues qilodx cqueues rel src h error static declaration of ‘lual testudata’ follows non static declaration home travis hererocks include lauxlib h note previous declaration of ‘lual testudata’ was here tmp luarocks cqueues qilodx cqueues rel src h error static declaration of ‘lual setfuncs’ follows non static declaration home travis hererocks include lauxlib h note previous declaration of ‘lual setfuncs’ was here tmp luarocks cqueues qilodx cqueues rel src h warning lual newlibtable redefined home travis hererocks include lauxlib h note this is the location of the previous definition tmp luarocks cqueues qilodx cqueues rel src h warning lual newlib redefined home travis hererocks include lauxlib h note this is the location of the previous definition make error
1
49,980
10,437,017,847
IssuesEvent
2019-09-17 20:55:58
The-Four-Lords/CongressCrush
https://api.github.com/repos/The-Four-Lords/CongressCrush
closed
% Avance
code
La barra de progreso avanza en funcion de la puntuación y no de los escaños ganados. #8
1.0
% Avance - La barra de progreso avanza en funcion de la puntuación y no de los escaños ganados. #8
non_port
avance la barra de progreso avanza en funcion de la puntuación y no de los escaños ganados
0
1,854
27,399,167,637
IssuesEvent
2023-02-28 22:29:02
golang/vulndb
https://api.github.com/repos/golang/vulndb
closed
x/vulndb: potential Go vuln in gogs.io/gogs: GHSA-pfvh-p8qp-9ww9
excluded: NOT_IMPORTABLE
In GitHub Security Advisory [GHSA-pfvh-p8qp-9ww9](https://github.com/advisories/GHSA-pfvh-p8qp-9ww9), there is a vulnerability in the following Go packages or modules: | Unit | Fixed | Vulnerable Ranges | | - | - | - | | [gogs.io/gogs](https://pkg.go.dev/gogs.io/gogs) | 0.12.11 | < 0.12.11 | Cross references: - Module gogs.io/gogs appears in issue #369 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #377 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #473 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #554 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #556 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #557 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #562 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #566 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #570 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #583 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #597 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #642 EFFECTIVELY_PRIVATE - Module gogs.io/gogs appears in issue #749 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #788 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #797 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #822 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #831 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #1060 NOT_IMPORTABLE See [doc/triage.md](https://github.com/golang/vulndb/blob/master/doc/triage.md) for instructions on how to triage this report. ``` modules: - module: gogs.io/gogs versions: - fixed: 0.12.11 packages: - package: gogs.io/gogs description: | ### Impact The malicious user is able to update a crafted `config` file into repository's `.git` directory in combination with crafted file deletion to gain SSH access to the server on case-insensitive file systems. 
All installations with [repository upload enabled (default)](https://github.com/gogs/gogs/blob/f36eeedbf89328ee70cc3a2e239f6314f9021f58/conf/app.ini#L127-L129) on case-insensitive file systems (Windows, macOS, etc.) are affected. ### Patches Make sanitization of upload path to `.git` directory to be case-insensitive. Users should upgrade to 0.12.11 or the latest 0.13.0+dev. ### Workarounds Disable [repository upload](https://github.com/gogs/gogs/blob/f36eeedbf89328ee70cc3a2e239f6314f9021f58/conf/app.ini#L127-L129). ### References https://huntr.dev/bounties/18cf9256-23ab-4098-a769-85f8da130f97/ ### For more information If you have any questions or comments about this advisory, please post on https://github.com/gogs/gogs/issues/7030. cves: - CVE-2022-2024 ghsas: - GHSA-pfvh-p8qp-9ww9 references: - advisory: https://github.com/gogs/gogs/security/advisories/GHSA-pfvh-p8qp-9ww9 - web: https://nvd.nist.gov/vuln/detail/CVE-2022-2024 - report: https://github.com/gogs/gogs/issues/7030 - fix: https://github.com/gogs/gogs/commit/15d0d6a94be0098a8227b6b95bdf2daed105ec41 - web: https://github.com/gogs/gogs/blob/f36eeedbf89328ee70cc3a2e239f6314f9021f58/conf/app.ini#L127-L129 - web: https://huntr.dev/bounties/18cf9256-23ab-4098-a769-85f8da130f97 - advisory: https://github.com/advisories/GHSA-pfvh-p8qp-9ww9 ```
True
x/vulndb: potential Go vuln in gogs.io/gogs: GHSA-pfvh-p8qp-9ww9 - In GitHub Security Advisory [GHSA-pfvh-p8qp-9ww9](https://github.com/advisories/GHSA-pfvh-p8qp-9ww9), there is a vulnerability in the following Go packages or modules: | Unit | Fixed | Vulnerable Ranges | | - | - | - | | [gogs.io/gogs](https://pkg.go.dev/gogs.io/gogs) | 0.12.11 | < 0.12.11 | Cross references: - Module gogs.io/gogs appears in issue #369 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #377 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #473 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #554 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #556 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #557 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #562 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #566 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #570 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #583 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #597 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #642 EFFECTIVELY_PRIVATE - Module gogs.io/gogs appears in issue #749 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #788 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #797 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #822 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #831 NOT_IMPORTABLE - Module gogs.io/gogs appears in issue #1060 NOT_IMPORTABLE See [doc/triage.md](https://github.com/golang/vulndb/blob/master/doc/triage.md) for instructions on how to triage this report. ``` modules: - module: gogs.io/gogs versions: - fixed: 0.12.11 packages: - package: gogs.io/gogs description: | ### Impact The malicious user is able to update a crafted `config` file into repository's `.git` directory in combination with crafted file deletion to gain SSH access to the server on case-insensitive file systems. 
All installations with [repository upload enabled (default)](https://github.com/gogs/gogs/blob/f36eeedbf89328ee70cc3a2e239f6314f9021f58/conf/app.ini#L127-L129) on case-insensitive file systems (Windows, macOS, etc.) are affected. ### Patches Make sanitization of upload path to `.git` directory to be case-insensitive. Users should upgrade to 0.12.11 or the latest 0.13.0+dev. ### Workarounds Disable [repository upload](https://github.com/gogs/gogs/blob/f36eeedbf89328ee70cc3a2e239f6314f9021f58/conf/app.ini#L127-L129). ### References https://huntr.dev/bounties/18cf9256-23ab-4098-a769-85f8da130f97/ ### For more information If you have any questions or comments about this advisory, please post on https://github.com/gogs/gogs/issues/7030. cves: - CVE-2022-2024 ghsas: - GHSA-pfvh-p8qp-9ww9 references: - advisory: https://github.com/gogs/gogs/security/advisories/GHSA-pfvh-p8qp-9ww9 - web: https://nvd.nist.gov/vuln/detail/CVE-2022-2024 - report: https://github.com/gogs/gogs/issues/7030 - fix: https://github.com/gogs/gogs/commit/15d0d6a94be0098a8227b6b95bdf2daed105ec41 - web: https://github.com/gogs/gogs/blob/f36eeedbf89328ee70cc3a2e239f6314f9021f58/conf/app.ini#L127-L129 - web: https://huntr.dev/bounties/18cf9256-23ab-4098-a769-85f8da130f97 - advisory: https://github.com/advisories/GHSA-pfvh-p8qp-9ww9 ```
port
x vulndb potential go vuln in gogs io gogs ghsa pfvh in github security advisory there is a vulnerability in the following go packages or modules unit fixed vulnerable ranges cross references module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue effectively private module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable module gogs io gogs appears in issue not importable see for instructions on how to triage this report modules module gogs io gogs versions fixed packages package gogs io gogs description impact the malicious user is able to update a crafted config file into repository s git directory in combination with crafted file deletion to gain ssh access to the server on case insensitive file systems all installations with on case insensitive file systems windows macos etc are affected patches make sanitization of upload path to git directory to be case insensitive users should upgrade to or the latest dev workarounds disable references for more information if you have any questions or comments about this advisory please post on cves cve ghsas ghsa pfvh references advisory web report fix web web advisory
1
1,530
22,156,599,511
IssuesEvent
2022-06-03 23:54:26
apache/beam
https://api.github.com/repos/apache/beam
opened
Ensure all RunnerHarnesses provide a valid RunnerApi.IsBounded value on all PCollections
new feature P3 runner-core portable-metrics-bugs
Fixing this requires updating 4 locations. * Dataflow RunnerHarness * FNAPDoFnRunner * UnifiedWorker * Shared libraries for this proto generation, which should cover OSS runners **** Remove the workaround in ProcessBundleHandler.java which will assume that all PCollections are bounded, if not set. See PCollectionTranslation.fromProto which should be always passed a valid value and not default to error or assume the PCollection is bounded.   Context \=== When I was updating the java SDK to conditionally serialize some elements to reported a sampled byte size metric, I encountered this.   Its due to to the refactoring in my [PR/8416](https://github.com/apache/beam/pull/8416), the RehydratedComponents was pulled up a level, and shared now among all the calls to createRunnerForPTransform in the various PtransfomRunnerFactories.   I is now triggering some code paths which were not previously triggered for all types of PTransforms/PCollections, causing this error to occur.    jsonPayload: {   exception:  "org.apache.beam.vendor.guava.v20_0.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: Cannot convert unknown org.apache.beam.model.pipeline.v1.RunnerApi.IsBounded to org.apache.beam.sdk.values.PCollection.IsBounded: UNSPECIFIED at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2214) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache.get(LocalCache.java:4053) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4057) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4986) at org.apache.beam.runners.core.construction.RehydratedComponents.getPCollection(RehydratedComponents.java:144) at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry.getMultiplexingConsumer(PCollectionConsumerRegistry.java:145) at 
org.apache.beam.fn.harness.DoFnPTransformRunnerFactory$Context.<init\>(DoFnPTransformRunnerFactory.java:284) at org.apache.beam.fn.harness.DoFnPTransformRunnerFactory.createRunnerForPTransform(DoFnPTransformRunnerFactory.java:97) at org.apache.beam.fn.harness.DoFnPTransformRunnerFactory.createRunnerForPTransform(DoFnPTransformRunnerFactory.java:63) at org.apache.beam.fn.harness.control.ProcessBundleHandler.createRunnerAndConsumersForPTransformRecursively(ProcessBundleHandler.java:198) at org.apache.beam.fn.harness.control.ProcessBundleHandler.createRunnerAndConsumersForPTransformRecursively(ProcessBundleHandler.java:166) at org.apache.beam.fn.harness.control.ProcessBundleHandler.createRunnerAndConsumersForPTransformRecursively(ProcessBundleHandler.java:166) at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:306) at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:160) at org.apache.beam.fn.harness.control.BeamFnControlClient.lambda$processInstructionRequests$0(BeamFnControlClient.java:144) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: java.lang.IllegalArgumentException: Cannot convert unknown org.apache.beam.model.pipeline.v1.RunnerApi.IsBounded to org.apache.beam.sdk.values.PCollection.IsBounded: UNSPECIFIED at org.apache.beam.runners.core.construction.PCollectionTranslation.fromProto(PCollectionTranslation.java:88) at org.apache.beam.runners.core.construction.PCollectionTranslation.fromProto(PCollectionTranslation.java:56) at org.apache.beam.runners.core.construction.RehydratedComponents$3.load(RehydratedComponents.java:103) at org.apache.beam.runners.core.construction.RehydratedComponents$3.load(RehydratedComponents.java:93) at 
org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208) ... 17 more "     job:  "2019-05-29_03_31_14-4799355109250203557"     logger:  "org.apache.beam.fn.harness.control.BeamFnControlClient"     *message:  "Exception while trying to handle InstructionRequest -28"*     portability_worker_id:  "1"     thread:  "16"     worker:  "testpipeline-pabloem-0529-05290331-75o8-harness-htz8"    }   The root of the issue is that the ProcessBundleDescriptors are invalid. The RunnerHarnesses are not setting the org.apache.beam.model.pipeline.v1.RunnerApi.IsBounded which breaks the specification and leads to this error.     Imported from Jira [BEAM-7452](https://issues.apache.org/jira/browse/BEAM-7452). Original Jira may contain additional context. Reported by: ajamato@google.com.
True
Ensure all RunnerHarnesses provide a valid RunnerApi.IsBounded value on all PCollections - Fixing this requires updating 4 locations. * Dataflow RunnerHarness * FNAPDoFnRunner * UnifiedWorker * Shared libraries for this proto generation, which should cover OSS runners **** Remove the workaround in ProcessBundleHandler.java which will assume that all PCollections are bounded, if not set. See PCollectionTranslation.fromProto which should be always passed a valid value and not default to error or assume the PCollection is bounded.   Context \=== When I was updating the java SDK to conditionally serialize some elements to reported a sampled byte size metric, I encountered this.   Its due to to the refactoring in my [PR/8416](https://github.com/apache/beam/pull/8416), the RehydratedComponents was pulled up a level, and shared now among all the calls to createRunnerForPTransform in the various PtransfomRunnerFactories.   I is now triggering some code paths which were not previously triggered for all types of PTransforms/PCollections, causing this error to occur.    
jsonPayload: {   exception:  "org.apache.beam.vendor.guava.v20_0.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: Cannot convert unknown org.apache.beam.model.pipeline.v1.RunnerApi.IsBounded to org.apache.beam.sdk.values.PCollection.IsBounded: UNSPECIFIED at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2214) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache.get(LocalCache.java:4053) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4057) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4986) at org.apache.beam.runners.core.construction.RehydratedComponents.getPCollection(RehydratedComponents.java:144) at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry.getMultiplexingConsumer(PCollectionConsumerRegistry.java:145) at org.apache.beam.fn.harness.DoFnPTransformRunnerFactory$Context.<init\>(DoFnPTransformRunnerFactory.java:284) at org.apache.beam.fn.harness.DoFnPTransformRunnerFactory.createRunnerForPTransform(DoFnPTransformRunnerFactory.java:97) at org.apache.beam.fn.harness.DoFnPTransformRunnerFactory.createRunnerForPTransform(DoFnPTransformRunnerFactory.java:63) at org.apache.beam.fn.harness.control.ProcessBundleHandler.createRunnerAndConsumersForPTransformRecursively(ProcessBundleHandler.java:198) at org.apache.beam.fn.harness.control.ProcessBundleHandler.createRunnerAndConsumersForPTransformRecursively(ProcessBundleHandler.java:166) at org.apache.beam.fn.harness.control.ProcessBundleHandler.createRunnerAndConsumersForPTransformRecursively(ProcessBundleHandler.java:166) at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:306) at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:160) at 
org.apache.beam.fn.harness.control.BeamFnControlClient.lambda$processInstructionRequests$0(BeamFnControlClient.java:144) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: java.lang.IllegalArgumentException: Cannot convert unknown org.apache.beam.model.pipeline.v1.RunnerApi.IsBounded to org.apache.beam.sdk.values.PCollection.IsBounded: UNSPECIFIED at org.apache.beam.runners.core.construction.PCollectionTranslation.fromProto(PCollectionTranslation.java:88) at org.apache.beam.runners.core.construction.PCollectionTranslation.fromProto(PCollectionTranslation.java:56) at org.apache.beam.runners.core.construction.RehydratedComponents$3.load(RehydratedComponents.java:103) at org.apache.beam.runners.core.construction.RehydratedComponents$3.load(RehydratedComponents.java:93) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295) at org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208) ... 17 more "     job:  "2019-05-29_03_31_14-4799355109250203557"     logger:  "org.apache.beam.fn.harness.control.BeamFnControlClient"     *message:  "Exception while trying to handle InstructionRequest -28"*     portability_worker_id:  "1"     thread:  "16"     worker:  "testpipeline-pabloem-0529-05290331-75o8-harness-htz8"    }   The root of the issue is that the ProcessBundleDescriptors are invalid. The RunnerHarnesses are not setting the org.apache.beam.model.pipeline.v1.RunnerApi.IsBounded which breaks the specification and leads to this error.     
Imported from Jira [BEAM-7452](https://issues.apache.org/jira/browse/BEAM-7452). Original Jira may contain additional context. Reported by: ajamato@google.com.
port
ensure all runnerharnesses provide a valid runnerapi isbounded value on all pcollections fixing this requires updating locations dataflow runnerharness fnapdofnrunner unifiedworker shared libraries for this proto generation which should cover oss runners remove the workaround in processbundlehandler java which will assume that all pcollections are bounded if not set see pcollectiontranslation fromproto which should be always passed a valid value and not default to error or assume the pcollection is bounded   context when i was updating the java sdk to conditionally serialize some elements to reported a sampled byte size metric i encountered this   its due to to the refactoring in my the rehydratedcomponents was pulled up a level and shared now among all the calls to createrunnerforptransform in the various ptransfomrunnerfactories   i is now triggering some code paths which were not previously triggered for all types of ptransforms pcollections causing this error to occur    jsonpayload   exception   org apache beam vendor guava com google common util concurrent uncheckedexecutionexception java lang illegalargumentexception cannot convert unknown org apache beam model pipeline runnerapi isbounded to org apache beam sdk values pcollection isbounded unspecified at org apache beam vendor guava com google common cache localcache segment get localcache java at org apache beam vendor guava com google common cache localcache get localcache java at org apache beam vendor guava com google common cache localcache getorload localcache java at org apache beam vendor guava com google common cache localcache localloadingcache get localcache java at org apache beam runners core construction rehydratedcomponents getpcollection rehydratedcomponents java at org apache beam fn harness data pcollectionconsumerregistry getmultiplexingconsumer pcollectionconsumerregistry java at org apache beam fn harness dofnptransformrunnerfactory context dofnptransformrunnerfactory java at org apache 
beam fn harness dofnptransformrunnerfactory createrunnerforptransform dofnptransformrunnerfactory java at org apache beam fn harness dofnptransformrunnerfactory createrunnerforptransform dofnptransformrunnerfactory java at org apache beam fn harness control processbundlehandler createrunnerandconsumersforptransformrecursively processbundlehandler java at org apache beam fn harness control processbundlehandler createrunnerandconsumersforptransformrecursively processbundlehandler java at org apache beam fn harness control processbundlehandler createrunnerandconsumersforptransformrecursively processbundlehandler java at org apache beam fn harness control processbundlehandler processbundle processbundlehandler java at org apache beam fn harness control beamfncontrolclient delegateoninstructionrequesttype beamfncontrolclient java at org apache beam fn harness control beamfncontrolclient lambda processinstructionrequests beamfncontrolclient java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java caused by java lang illegalargumentexception cannot convert unknown org apache beam model pipeline runnerapi isbounded to org apache beam sdk values pcollection isbounded unspecified at org apache beam runners core construction pcollectiontranslation fromproto pcollectiontranslation java at org apache beam runners core construction pcollectiontranslation fromproto pcollectiontranslation java at org apache beam runners core construction rehydratedcomponents load rehydratedcomponents java at org apache beam runners core construction rehydratedcomponents load rehydratedcomponents java at org apache beam vendor guava com google common cache localcache loadingvaluereference loadfuture localcache java at org apache beam vendor guava com google common cache localcache segment loadsync localcache java at org apache beam vendor guava com google common 
cache localcache segment lockedgetorload localcache java at org apache beam vendor guava com google common cache localcache segment get localcache java more     job       logger   org apache beam fn harness control beamfncontrolclient     message   exception while trying to handle instructionrequest     portability worker id       thread       worker   testpipeline pabloem harness       the root of the issue is that the processbundledescriptors are invalid the runnerharnesses are not setting the org apache beam model pipeline runnerapi isbounded which breaks the specification and leads to this error     imported from jira original jira may contain additional context reported by ajamato google com
1
70,820
23,329,787,739
IssuesEvent
2022-08-09 03:09:00
primefaces/primeng
https://api.github.com/repos/primefaces/primeng
closed
DataTable with virtual scroll and expand mode is not working as expected
defect
Reported By PRO User; > A) In a table with virtual scroll and expand mode, the table expands but the vertical scroll bar is no longer always visible. The columns do not align correctly when the table width is small. Both these things do not happen in a table without virtual scroll. >B) In a table with virtual scroll and set widths, if you shrink a column, there is a double horizontal scrollbar. Exp; ``` <div class="card"> <h5>A) Virtual Scroll with Expand Mode</h5> <p-table [columns]="cols" [value]="cars" [scrollable]="true" [rows]="100" scrollHeight="250px" [virtualScroll]="true" [virtualScrollItemSize]="34" styleClass="p-datatable-gridlines" [resizableColumns]="true" columnResizeMode="expand"> <ng-template pTemplate="header" let-columns> <tr> <th pResizableColumn *ngFor="let col of columns"> {{col.header}} </th> </tr> </ng-template> <ng-template pTemplate="body" let-rowData let-columns="columns"> <tr style="height:34px"> <td *ngFor="let col of columns"> {{rowData[col.field]}} </td> </tr> </ng-template> </p-table> </div> <div class="card"> <h5> B) Virtual Scroll with Expand Mode and Wide Widths</h5> <p-table [columns]="cols" [value]="cars" [scrollable]="true" [rows]="100" scrollHeight="250px" [virtualScroll]="true" [virtualScrollItemSize]="34" styleClass="p-datatable-gridlines" [resizableColumns]="true" columnResizeMode="expand"> <ng-template pTemplate="header" let-columns> <tr> <th style="width:200px" pResizableColumn *ngFor="let col of columns"> {{col.header}} </th> </tr> </ng-template> <ng-template pTemplate="body" let-rowData let-columns="columns"> <tr style="height:34px"> <td style="width:200px" *ngFor="let col of columns"> {{rowData[col.field]}} </td> </tr> </ng-template> </p-table> </div> ```
1.0
DataTable with virtual scroll and expand mode is not working as expected - Reported By PRO User; > A) In a table with virtual scroll and expand mode, the table expands but the vertical scroll bar is no longer always visible. The columns do not align correctly when the table width is small. Both these things do not happen in a table without virtual scroll. >B) In a table with virtual scroll and set widths, if you shrink a column, there is a double horizontal scrollbar. Exp; ``` <div class="card"> <h5>A) Virtual Scroll with Expand Mode</h5> <p-table [columns]="cols" [value]="cars" [scrollable]="true" [rows]="100" scrollHeight="250px" [virtualScroll]="true" [virtualScrollItemSize]="34" styleClass="p-datatable-gridlines" [resizableColumns]="true" columnResizeMode="expand"> <ng-template pTemplate="header" let-columns> <tr> <th pResizableColumn *ngFor="let col of columns"> {{col.header}} </th> </tr> </ng-template> <ng-template pTemplate="body" let-rowData let-columns="columns"> <tr style="height:34px"> <td *ngFor="let col of columns"> {{rowData[col.field]}} </td> </tr> </ng-template> </p-table> </div> <div class="card"> <h5> B) Virtual Scroll with Expand Mode and Wide Widths</h5> <p-table [columns]="cols" [value]="cars" [scrollable]="true" [rows]="100" scrollHeight="250px" [virtualScroll]="true" [virtualScrollItemSize]="34" styleClass="p-datatable-gridlines" [resizableColumns]="true" columnResizeMode="expand"> <ng-template pTemplate="header" let-columns> <tr> <th style="width:200px" pResizableColumn *ngFor="let col of columns"> {{col.header}} </th> </tr> </ng-template> <ng-template pTemplate="body" let-rowData let-columns="columns"> <tr style="height:34px"> <td style="width:200px" *ngFor="let col of columns"> {{rowData[col.field]}} </td> </tr> </ng-template> </p-table> </div> ```
non_port
datatable with virtual scroll and expand mode is not working as expected reported by pro user a in a table with virtual scroll and expand mode the table expands but the vertical scroll bar is no longer always visible the columns do not align correctly when the table width is small both these things do not happen in a table without virtual scroll b in a table with virtual scroll and set widths if you shrink a column there is a double horizontal scrollbar exp a virtual scroll with expand mode p table cols cars true scrollheight true styleclass p datatable gridlines true columnresizemode expand col header rowdata b virtual scroll with expand mode and wide widths p table cols cars true scrollheight true styleclass p datatable gridlines true columnresizemode expand col header rowdata
0
598
8,066,614,320
IssuesEvent
2018-08-04 17:59:31
dpteam/GLQuake3D
https://api.github.com/repos/dpteam/GLQuake3D
closed
Get rid of ASM and only use C
portability
This is good for portability reasons; modern compilers are good enough at optimization that we may as well drop any remaining ASM code in favor of C code.
True
Get rid of ASM and only use C - This is good for portability reasons; modern compilers are good enough at optimization that we may as well drop any remaining ASM code in favor of C code.
port
get rid of asm and only use c this is good for portability reasons modern compilers are good enough at optimization so we can as well drop any remaining asm code in favor of c code
1
1,864
27,585,460,712
IssuesEvent
2023-03-08 19:23:13
golang/vulndb
https://api.github.com/repos/golang/vulndb
closed
x/vulndb: potential Go vuln in github.com/answerdev/answer: GHSA-qrwm-xqfr-4vhv
excluded: NOT_IMPORTABLE
In GitHub Security Advisory [GHSA-qrwm-xqfr-4vhv](https://github.com/advisories/GHSA-qrwm-xqfr-4vhv), there is a vulnerability in the following Go packages or modules: | Unit | Fixed | Vulnerable Ranges | | - | - | - | | [github.com/answerdev/answer](https://pkg.go.dev/github.com/answerdev/answer) | 1.0.6 | < 1.0.6 | Cross references: - Module github.com/answerdev/answer appears in issue #1541 EFFECTIVELY_PRIVATE - Module github.com/answerdev/answer appears in issue #1550 NOT_IMPORTABLE - Module github.com/answerdev/answer appears in issue #1551 NOT_IMPORTABLE - Module github.com/answerdev/answer appears in issue #1552 EFFECTIVELY_PRIVATE - Module github.com/answerdev/answer appears in issue #1553 NOT_IMPORTABLE - Module github.com/answerdev/answer appears in issue #1554 EFFECTIVELY_PRIVATE - Module github.com/answerdev/answer appears in issue #1592 NOT_IMPORTABLE See [doc/triage.md](https://github.com/golang/vulndb/blob/master/doc/triage.md) for instructions on how to triage this report. ``` modules: - module: github.com/answerdev/answer versions: - fixed: 1.0.6 packages: - package: github.com/answerdev/answer description: Cross-site Scripting (XSS) - Stored in GitHub repository answerdev/answer prior to 1.0.6. cves: - CVE-2023-1242 ghsas: - GHSA-qrwm-xqfr-4vhv references: - web: https://nvd.nist.gov/vuln/detail/CVE-2023-1242 - fix: https://github.com/answerdev/answer/commit/90bfa0dcc7b49482f1d1e31aee3ab073f3c13dd9 - web: https://huntr.dev/bounties/71c24c5e-ceb2-45cf-bda7-fa195d37e289 - advisory: https://github.com/advisories/GHSA-qrwm-xqfr-4vhv ```
True
x/vulndb: potential Go vuln in github.com/answerdev/answer: GHSA-qrwm-xqfr-4vhv - In GitHub Security Advisory [GHSA-qrwm-xqfr-4vhv](https://github.com/advisories/GHSA-qrwm-xqfr-4vhv), there is a vulnerability in the following Go packages or modules: | Unit | Fixed | Vulnerable Ranges | | - | - | - | | [github.com/answerdev/answer](https://pkg.go.dev/github.com/answerdev/answer) | 1.0.6 | < 1.0.6 | Cross references: - Module github.com/answerdev/answer appears in issue #1541 EFFECTIVELY_PRIVATE - Module github.com/answerdev/answer appears in issue #1550 NOT_IMPORTABLE - Module github.com/answerdev/answer appears in issue #1551 NOT_IMPORTABLE - Module github.com/answerdev/answer appears in issue #1552 EFFECTIVELY_PRIVATE - Module github.com/answerdev/answer appears in issue #1553 NOT_IMPORTABLE - Module github.com/answerdev/answer appears in issue #1554 EFFECTIVELY_PRIVATE - Module github.com/answerdev/answer appears in issue #1592 NOT_IMPORTABLE See [doc/triage.md](https://github.com/golang/vulndb/blob/master/doc/triage.md) for instructions on how to triage this report. ``` modules: - module: github.com/answerdev/answer versions: - fixed: 1.0.6 packages: - package: github.com/answerdev/answer description: Cross-site Scripting (XSS) - Stored in GitHub repository answerdev/answer prior to 1.0.6. cves: - CVE-2023-1242 ghsas: - GHSA-qrwm-xqfr-4vhv references: - web: https://nvd.nist.gov/vuln/detail/CVE-2023-1242 - fix: https://github.com/answerdev/answer/commit/90bfa0dcc7b49482f1d1e31aee3ab073f3c13dd9 - web: https://huntr.dev/bounties/71c24c5e-ceb2-45cf-bda7-fa195d37e289 - advisory: https://github.com/advisories/GHSA-qrwm-xqfr-4vhv ```
port
x vulndb potential go vuln in github com answerdev answer ghsa qrwm xqfr in github security advisory there is a vulnerability in the following go packages or modules unit fixed vulnerable ranges cross references module github com answerdev answer appears in issue effectively private module github com answerdev answer appears in issue not importable module github com answerdev answer appears in issue not importable module github com answerdev answer appears in issue effectively private module github com answerdev answer appears in issue not importable module github com answerdev answer appears in issue effectively private module github com answerdev answer appears in issue not importable see for instructions on how to triage this report modules module github com answerdev answer versions fixed packages package github com answerdev answer description cross site scripting xss stored in github repository answerdev answer prior to cves cve ghsas ghsa qrwm xqfr references web fix web advisory
1
783
10,325,877,472
IssuesEvent
2019-09-01 21:08:35
unitsofmeasurement/uom-demos
https://api.github.com/repos/unitsofmeasurement/uom-demos
closed
ME 8.2 on actual device
device device:raspberryPi documents help wanted portability portability:ME ready task
The Java ME 8.2 SDK offers support for a few actual devices, especially the Raspberry Pi. Try to install it on an actual Raspberry Pi and document the steps to reproduce with the demo.
True
ME 8.2 on actual device - The Java ME 8.2 SDK offers support for a few actual devices, especially the Raspberry Pi. Try to install it on an actual Raspberry Pi and document the steps to reproduce with the demo.
port
me on actual device java me sdk offers support for a few actual devices especially raspberry pi try to install it on an actual raspberry pi and document steps to reproduce with the demo
1
140,917
18,927,439,779
IssuesEvent
2021-11-17 11:01:59
Seagate/cortx-prvsnr
https://api.github.com/repos/Seagate/cortx-prvsnr
closed
CVE-2021-3144 (High) detected in salt-3002.2.tar.gz
needs-attention security_vulnerability
## CVE-2021-3144 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>salt-3002.2.tar.gz</b></p></summary> <p>Portable, distributed, remote execution and configuration management system</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/b5/45/a20ff8a3cad48b50a924ee9c65f2df0e214de4fa282c4feef2e1d6a0b886/salt-3002.2.tar.gz">https://files.pythonhosted.org/packages/b5/45/a20ff8a3cad48b50a924ee9c65f2df0e214de4fa282c4feef2e1d6a0b886/salt-3002.2.tar.gz</a></p> <p>Path to dependency file: cortx-prvsnr/lr-cli</p> <p>Path to vulnerable library: /lr-cli,/api/python/provisioner/commands/configure,/api/python</p> <p> Dependency Hierarchy: - :x: **salt-3002.2.tar.gz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/Seagate/cortx-prvsnr/commits/91dd9ff4b628dd14653923b67b0bbbf274155a9e">91dd9ff4b628dd14653923b67b0bbbf274155a9e</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In SaltStack Salt before 3002.5, eauth tokens can be used once after expiration. (They might be used to run command against the salt master or minions.) 
<p>Publish Date: 2021-02-27 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3144>CVE-2021-3144</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/saltstack/salt/blob/master/CHANGELOG.md">https://github.com/saltstack/salt/blob/master/CHANGELOG.md</a></p> <p>Release Date: 2021-02-27</p> <p>Fix Resolution: v3002.3</p> </p> </details> <p></p>
True
CVE-2021-3144 (High) detected in salt-3002.2.tar.gz - ## CVE-2021-3144 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>salt-3002.2.tar.gz</b></p></summary> <p>Portable, distributed, remote execution and configuration management system</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/b5/45/a20ff8a3cad48b50a924ee9c65f2df0e214de4fa282c4feef2e1d6a0b886/salt-3002.2.tar.gz">https://files.pythonhosted.org/packages/b5/45/a20ff8a3cad48b50a924ee9c65f2df0e214de4fa282c4feef2e1d6a0b886/salt-3002.2.tar.gz</a></p> <p>Path to dependency file: cortx-prvsnr/lr-cli</p> <p>Path to vulnerable library: /lr-cli,/api/python/provisioner/commands/configure,/api/python</p> <p> Dependency Hierarchy: - :x: **salt-3002.2.tar.gz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/Seagate/cortx-prvsnr/commits/91dd9ff4b628dd14653923b67b0bbbf274155a9e">91dd9ff4b628dd14653923b67b0bbbf274155a9e</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In SaltStack Salt before 3002.5, eauth tokens can be used once after expiration. (They might be used to run command against the salt master or minions.) 
<p>Publish Date: 2021-02-27 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3144>CVE-2021-3144</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/saltstack/salt/blob/master/CHANGELOG.md">https://github.com/saltstack/salt/blob/master/CHANGELOG.md</a></p> <p>Release Date: 2021-02-27</p> <p>Fix Resolution: v3002.3</p> </p> </details> <p></p>
non_port
cve high detected in salt tar gz cve high severity vulnerability vulnerable library salt tar gz portable distributed remote execution and configuration management system library home page a href path to dependency file cortx prvsnr lr cli path to vulnerable library lr cli api python provisioner commands configure api python dependency hierarchy x salt tar gz vulnerable library found in head commit a href vulnerability details in saltstack salt before eauth tokens can be used once after expiration they might be used to run command against the salt master or minions publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
0
1,483
21,707,957,119
IssuesEvent
2022-05-10 11:24:46
damccorm/test-migration-target
https://api.github.com/repos/damccorm/test-migration-target
opened
Improvements of portability framework to make it usable in other projects
P3 improvement runner-core portability sdk-py-harness
The Flink community will use Beam's portability framework for its multi-language support, such as Python UDF execution. This is an umbrella JIRA which tracks all the improvement requirements collected from the Flink community. Details of the discussion can be found in [1]. [1] [https://lists.apache.org/thread.html/4025a19ba267f0b1b842af20b63ac794306259a3226e07c87bda62ba@%3Cdev.beam.apache.org%3E](https://lists.apache.org/thread.html/4025a19ba267f0b1b842af20b63ac794306259a3226e07c87bda62ba@%3Cdev.beam.apache.org%3E) Imported from Jira [BEAM-7944](https://issues.apache.org/jira/browse/BEAM-7944). Original Jira may contain additional context. Reported by: sunjincheng121.
True
Improvements of portability framework to make it usable in other projects - The Flink community will use Beam's portability framework for its multi-language support, such as Python UDF execution. This is an umbrella JIRA which tracks all the improvement requirements collected from the Flink community. Details of the discussion can be found in [1]. [1] [https://lists.apache.org/thread.html/4025a19ba267f0b1b842af20b63ac794306259a3226e07c87bda62ba@%3Cdev.beam.apache.org%3E](https://lists.apache.org/thread.html/4025a19ba267f0b1b842af20b63ac794306259a3226e07c87bda62ba@%3Cdev.beam.apache.org%3E) Imported from Jira [BEAM-7944](https://issues.apache.org/jira/browse/BEAM-7944). Original Jira may contain additional context. Reported by: sunjincheng121.
port
improvements of portability framework to make it usable in other projects the flink community will use beam s portability framework for its multi language support such as python udf execution this is an umbrella jira which tracks all the improvement requirements collected from the flink community details of the discussion can be found in   imported from jira original jira may contain additional context reported by
1
391,128
26,881,209,315
IssuesEvent
2023-02-05 17:03:48
Seneca-CDOT/starchart
https://api.github.com/repos/Seneca-CDOT/starchart
closed
Prisma introduction in wiki
documentation
Currently, there's no introduction for Prisma in our wiki to serve as a starting point for looking into the tool. While the official site has [good documentation](https://www.prisma.io/docs/concepts/overview/what-is-prisma), we might still want to add a helpful entry for navigating this according to how we want to use it. Some points to touch on: - How to set it up - How to connect to a MySQL database - How to do CRUD operations
1.0
Prisma introduction in wiki - Currently, there's no introduction for Prisma in our wiki to serve as a starting point for looking into the tool. While the official site has [good documentation](https://www.prisma.io/docs/concepts/overview/what-is-prisma), we might still want to add a helpful entry for navigating this according to how we want to use it. Some points to touch on: - How to set it up - How to connect to a MySQL database - How to do CRUD operations
non_port
prisma introduction in wiki currently there s no introduction for prisma in our wiki to serve as a starting point for looking into the tool while the official site has we might still want to add a helpful entry for navigating this according to how we want to use it some points to touch on how to set it up how to connect to a mysql database how to do crud operations
0
1,652
23,796,593,592
IssuesEvent
2022-09-02 20:32:30
golang/vulndb
https://api.github.com/repos/golang/vulndb
closed
x/vulndb: potential Go vuln in github.com/zitadel/zitadel: CVE-2022-36051
excluded: NOT_IMPORTABLE
CVE-2022-36051 references [github.com/zitadel/zitadel](https://github.com/zitadel/zitadel), which may be a Go module. Description: ZITADEL combines the ease of Auth0 and the versatility of Keycloak.**Actions**, introduced in ZITADEL **1.42.0** on the API and **1.56.0** for Console, is a feature, where users with role.`ORG_OWNER` are able to create Javascript Code, which is invoked by the system at certain points during the login. **Actions**, for example, allow creating authorizations (user grants) on newly created users programmatically. Due to a missing authorization check, **Actions** were able to grant authorizations for projects that belong to other organizations inside the same Instance. Granting authorizations via API and Console is not affected by this vulnerability. There is currently no known workaround, users should update. References: - NIST: https://nvd.nist.gov/vuln/detail/CVE-2022-36051 - JSON: https://github.com/CVEProject/cvelist/tree/33138126b6cf9be5834bbcd5b2c6a82d76e8c905/2022/36xxx/CVE-2022-36051.json - web: https://github.com/zitadel/zitadel/security/advisories/GHSA-c8fj-4pm8-mp2c - web: https://github.com/zitadel/zitadel/releases/tag/v1.87.1 - web: https://github.com/zitadel/zitadel/releases/tag/v2.2.0 - Imported by: https://pkg.go.dev/github.com/zitadel/zitadel?tab=importedby See [doc/triage.md](https://github.com/golang/vulndb/blob/master/doc/triage.md) for instructions on how to triage this report. ``` modules: - module: github.com/zitadel/zitadel packages: - package: zitadel description: |+ ZITADEL combines the ease of Auth0 and the versatility of Keycloak.**Actions**, introduced in ZITADEL **1.42.0** on the API and **1.56.0** for Console, is a feature, where users with role.`ORG_OWNER` are able to create Javascript Code, which is invoked by the system at certain points during the login. **Actions**, for example, allow creating authorizations (user grants) on newly created users programmatically. 
Due to a missing authorization check, **Actions** were able to grant authorizations for projects that belong to other organizations inside the same Instance. Granting authorizations via API and Console is not affected by this vulnerability. There is currently no known workaround, users should update. cves: - CVE-2022-36051 references: - web: https://github.com/zitadel/zitadel/security/advisories/GHSA-c8fj-4pm8-mp2c - web: https://github.com/zitadel/zitadel/releases/tag/v1.87.1 - web: https://github.com/zitadel/zitadel/releases/tag/v2.2.0 ```
True
x/vulndb: potential Go vuln in github.com/zitadel/zitadel: CVE-2022-36051 - CVE-2022-36051 references [github.com/zitadel/zitadel](https://github.com/zitadel/zitadel), which may be a Go module. Description: ZITADEL combines the ease of Auth0 and the versatility of Keycloak.**Actions**, introduced in ZITADEL **1.42.0** on the API and **1.56.0** for Console, is a feature, where users with role.`ORG_OWNER` are able to create Javascript Code, which is invoked by the system at certain points during the login. **Actions**, for example, allow creating authorizations (user grants) on newly created users programmatically. Due to a missing authorization check, **Actions** were able to grant authorizations for projects that belong to other organizations inside the same Instance. Granting authorizations via API and Console is not affected by this vulnerability. There is currently no known workaround, users should update. References: - NIST: https://nvd.nist.gov/vuln/detail/CVE-2022-36051 - JSON: https://github.com/CVEProject/cvelist/tree/33138126b6cf9be5834bbcd5b2c6a82d76e8c905/2022/36xxx/CVE-2022-36051.json - web: https://github.com/zitadel/zitadel/security/advisories/GHSA-c8fj-4pm8-mp2c - web: https://github.com/zitadel/zitadel/releases/tag/v1.87.1 - web: https://github.com/zitadel/zitadel/releases/tag/v2.2.0 - Imported by: https://pkg.go.dev/github.com/zitadel/zitadel?tab=importedby See [doc/triage.md](https://github.com/golang/vulndb/blob/master/doc/triage.md) for instructions on how to triage this report. ``` modules: - module: github.com/zitadel/zitadel packages: - package: zitadel description: |+ ZITADEL combines the ease of Auth0 and the versatility of Keycloak.**Actions**, introduced in ZITADEL **1.42.0** on the API and **1.56.0** for Console, is a feature, where users with role.`ORG_OWNER` are able to create Javascript Code, which is invoked by the system at certain points during the login. 
**Actions**, for example, allow creating authorizations (user grants) on newly created users programmatically. Due to a missing authorization check, **Actions** were able to grant authorizations for projects that belong to other organizations inside the same Instance. Granting authorizations via API and Console is not affected by this vulnerability. There is currently no known workaround, users should update. cves: - CVE-2022-36051 references: - web: https://github.com/zitadel/zitadel/security/advisories/GHSA-c8fj-4pm8-mp2c - web: https://github.com/zitadel/zitadel/releases/tag/v1.87.1 - web: https://github.com/zitadel/zitadel/releases/tag/v2.2.0 ```
port
x vulndb potential go vuln in github com zitadel zitadel cve cve references which may be a go module description zitadel combines the ease of and the versatility of keycloak actions introduced in zitadel on the api and for console is a feature where users with role org owner are able to create javascript code which is invoked by the system at certain points during the login actions for example allow creating authorizations user grants on newly created users programmatically due to a missing authorization check actions were able to grant authorizations for projects that belong to other organizations inside the same instance granting authorizations via api and console is not affected by this vulnerability there is currently no known workaround users should update references nist json web web web imported by see for instructions on how to triage this report modules module github com zitadel zitadel packages package zitadel description zitadel combines the ease of and the versatility of keycloak actions introduced in zitadel on the api and for console is a feature where users with role org owner are able to create javascript code which is invoked by the system at certain points during the login actions for example allow creating authorizations user grants on newly created users programmatically due to a missing authorization check actions were able to grant authorizations for projects that belong to other organizations inside the same instance granting authorizations via api and console is not affected by this vulnerability there is currently no known workaround users should update cves cve references web web web
1
416,119
12,139,743,413
IssuesEvent
2020-04-23 19:21:36
hyphacoop/organizing
https://api.github.com/repos/hyphacoop/organizing
opened
Host design jam x2 to refine internal look and feel
[priority-★★☆] look&feel wg:business-planning
<sup>_This initial comment is collaborative and open to modification by all._</sup> ## Task Summary 🎟️ **Re-ticketed from:** # 🗣 **Loomio:** N/A 📅 **Due date:** N/A 🎯 **Success criteria:** Have developed look and feel for our internal spaces. Building on #77 and coming out of convo in `2020-04-23 bizdev call`, we wanted to carry the visual look through handbook and chat ## To Do - [ ] have discussion abt current icons in chat and look and feel - [ ] jam on ideas for handbook + chat - ...
1.0
Host design jam x2 to refine internal look and feel - <sup>_This initial comment is collaborative and open to modification by all._</sup> ## Task Summary 🎟️ **Re-ticketed from:** # 🗣 **Loomio:** N/A 📅 **Due date:** N/A 🎯 **Success criteria:** Have developed look and feel for our internal spaces. Building on #77 and coming out of convo in `2020-04-23 bizdev call`, we wanted to carry the visual look through handbook and chat ## To Do - [ ] have discussion abt current icons in chat and look and feel - [ ] jam on ideas for handbook + chat - ...
non_port
host design jam to refine internal look and feel this initial comment is collaborative and open to modification by all task summary 🎟️ re ticketed from 🗣 loomio n a 📅 due date n a 🎯 success criteria have developed look and feel for our internal spaces building on and coming out of convo in bizdev call we wanted to carry the visual look through handbook and chat to do have discussion abt current icons in chat and look and feel jam on ideas for handbook chat
0
1,944
30,539,778,535
IssuesEvent
2023-07-19 20:20:21
nmlgc/ssg
https://api.github.com/repos/nmlgc/ssg
opened
Migrate non-graphics code to SDL
Portability ~€60-100
The industry-standard base library for portable game engines. Since Ember2528 plans to fund a complete Linux port, it makes sense to move away from Win32 sooner rather than later. This way, we avoid writing any more Win32-exclusive code for things like [SC88Pro recordings](https://github.com/nmlgc/ssg/issues/9), only for them to be rewritten in a more portable way later. This issue is about using SDL for: * [ ] Basic window creation * [ ] Sound effects Keyboard and joypad input is tracked in #22. MIDI will continue to use pbg's original native Win32 code. SDL_mixer would be a strict downgrade, because it [can only output to the default "MIDI mapper" device](https://github.com/libsdl-org/SDL_mixer/blob/334672d85d09f16b4ecb5e2041a0ad78128c3369/src/codecs/native_midi/native_midi_win32.c#L47), which can no longer be easily changed as of Windows 7. This obviously means that we stop maintaining all direct Win32 code. We're still going to leave the files in the repo, but just no longer compile them, allowing future pushes to still fund their continued maintenance. After all, [the ReC98-adjacent community loves tinkering with old Windows systems](https://lainnet.superglobalmegacorp.com/misc/np2gdi.html)
True
Migrate non-graphics code to SDL - The industry-standard base library for portable game engines. Since Ember2528 plans to fund a complete Linux port, it makes sense to move away from Win32 sooner rather than later. This way, we avoid writing any more Win32-exclusive code for things like [SC88Pro recordings](https://github.com/nmlgc/ssg/issues/9), only for them to be rewritten in a more portable way later. This issue is about using SDL for: * [ ] Basic window creation * [ ] Sound effects Keyboard and joypad input is tracked in #22. MIDI will continue to use pbg's original native Win32 code. SDL_mixer would be a strict downgrade, because it [can only output to the default "MIDI mapper" device](https://github.com/libsdl-org/SDL_mixer/blob/334672d85d09f16b4ecb5e2041a0ad78128c3369/src/codecs/native_midi/native_midi_win32.c#L47), which can no longer be easily changed as of Windows 7. This obviously means that we stop maintaining all direct Win32 code. We're still going to leave the files in the repo, but just no longer compile them, allowing future pushes to still fund their continued maintenance. After all, [the ReC98-adjacent community loves tinkering with old Windows systems](https://lainnet.superglobalmegacorp.com/misc/np2gdi.html)
port
migrate non graphics code to sdl the industry standard base library for portable game engines since plans to fund a complete linux port it makes sense to move away from sooner rather than later this way we avoid writing any more exclusive code for things like only for them to be rewritten in a more portable way later this issue is about using sdl for basic window creation sound effects keyboard and joypad input is tracked in midi will continue to use pbg s original native code sdl mixer would be a strict downgrade because it which can no longer be easily changed as of windows this obviously means that we stop maintaining all direct code we re still going to leave the files in the repo but just no longer compile them allowing future pushes to still fund their continued maintenance after all
1
1,694
24,626,329,488
IssuesEvent
2022-10-16 15:12:44
verilator/verilator
https://api.github.com/repos/verilator/verilator
closed
Make test fails on stable
area: portability resolution: answered
Can you attach an example that shows the issue? (Must be openly licensed, ideally in test_regress format.) ``` $ ... # follow git install instructions up to but not including make install $ make test ------------------------------------------------------------ making verilator in src make -C src make[1]: Entering directory '/home/donovick/src/verilator/src' make -C obj_dbg -j 1 TGT=../../bin/verilator_bin_dbg VL_DEBUG=1 -f ../Makefile_obj serial make[2]: Entering directory '/home/donovick/src/verilator/src/obj_dbg' make[2]: Nothing to be done for 'serial'. make[2]: Leaving directory '/home/donovick/src/verilator/src/obj_dbg' make -C obj_dbg TGT=../../bin/verilator_bin_dbg VL_DEBUG=1 -f ../Makefile_obj make[2]: Entering directory '/home/donovick/src/verilator/src/obj_dbg' Compile flags: g++ -Og -ggdb -gz -DVL_DEBUG -D_GLIBCXX_DEBUG -MMD -I. -I.. -I.. -I../../include -I../../include -MP -faligned-new -Wno-unused-parameter -Wno-shadow -fno-builtin-malloc -fno-builtin-calloc -fno-builtin-realloc -fno-builtin-free -DDEFENV_SYSTEMC="" -DDEFENV_SYSTEMC_ARCH="" -DDEFENV_SYSTEMC_INCLUDE="" -DDEFENV_SYSTEMC_LIBDIR="" -DDEFENV_VERILATOR_ROOT="/usr/local/share/verilator" make[2]: Leaving directory '/home/donovick/src/verilator/src/obj_dbg' make -C obj_dbg TGT=../../bin/verilator_coverage_bin_dbg VL_DEBUG=1 VL_VLCOV=1 -f ../Makefile_obj serial_vlcov make[2]: Entering directory '/home/donovick/src/verilator/src/obj_dbg' make[2]: Nothing to be done for 'serial_vlcov'. make[2]: Leaving directory '/home/donovick/src/verilator/src/obj_dbg' make -C obj_dbg TGT=../../bin/verilator_coverage_bin_dbg VL_DEBUG=1 VL_VLCOV=1 -f ../Makefile_obj make[2]: Entering directory '/home/donovick/src/verilator/src/obj_dbg' Compile flags: g++ -Og -ggdb -gz -DVL_DEBUG -D_GLIBCXX_DEBUG -MMD -I. -I.. -I.. 
-I../../include -I../../include -MP -faligned-new -Wno-unused-parameter -Wno-shadow -fno-builtin-malloc -fno-builtin-calloc -fno-builtin-realloc -fno-builtin-free -DDEFENV_SYSTEMC="" -DDEFENV_SYSTEMC_ARCH="" -DDEFENV_SYSTEMC_INCLUDE="" -DDEFENV_SYSTEMC_LIBDIR="" -DDEFENV_VERILATOR_ROOT="/usr/local/share/verilator" make[2]: Leaving directory '/home/donovick/src/verilator/src/obj_dbg' make -C obj_opt -j 1 TGT=../../bin/verilator_bin -f ../Makefile_obj serial make[2]: Entering directory '/home/donovick/src/verilator/src/obj_opt' make[2]: Nothing to be done for 'serial'. make[2]: Leaving directory '/home/donovick/src/verilator/src/obj_opt' make -C obj_opt TGT=../../bin/verilator_bin -f ../Makefile_obj make[2]: Entering directory '/home/donovick/src/verilator/src/obj_opt' Compile flags: g++ -O3 -MMD -I. -I.. -I.. -I../../include -I../../include -MP -faligned-new -Wno-unused-parameter -Wno-shadow -fno-builtin-malloc -fno-builtin-calloc -fno-builtin-realloc -fno-builtin-free -DDEFENV_SYSTEMC="" -DDEFENV_SYSTEMC_ARCH="" -DDEFENV_SYSTEMC_INCLUDE="" -DDEFENV_SYSTEMC_LIBDIR="" -DDEFENV_VERILATOR_ROOT="/usr/local/share/verilator" make[2]: Leaving directory '/home/donovick/src/verilator/src/obj_opt' make[1]: Leaving directory '/home/donovick/src/verilator/src' test_regress/t/t_a1_first_cc.pl ====================================================================== dist/t_a1_first_cc: ================================================== -Skip: dist/t_a1_first_cc: scenario 'dist' not enabled for test dist/t_a1_first_cc: -Skip: Skip: scenario 'dist' not enabled for test ==SUMMARY: Passed 0 Failed 0 Unsup 0 Time 0:00 ====================================================================== vlt/t_a1_first_cc: ================================================== perl ../bin/verilator --debug --debugi 0 --gdbbt --no-dump-tree -V warning: File "/home/donovick/src/verilator/test_regress/.gdbinit" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load". 
No stack. warning: File "/home/donovick/src/verilator/test_regress/.gdbinit" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load". To enable execution of this file add add-auto-load-safe-path /home/donovick/src/verilator/test_regress/.gdbinit line to your configuration file "/home/donovick/config/gdb/gdbinit". To completely disable this security protection add set auto-load safe-path / line to your configuration file "/home/donovick/config/gdb/gdbinit". For more information about this security protection see the "Auto-loading safe path" section in the GDB manual. E.g., run from the shell: info "(gdb)Auto-loading safe path" [Thread debugging using libthread_db enabled] Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1". Starting Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd Starting Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd Copyright 2003-2022 by Wilson Snyder. Verilator is free software; you can redistribute it and/or modify the Verilator internals under the terms of either the GNU Lesser General Public License Version 3 or the Perl Artistic License Version 2.0. See https://verilator.org for documentation Summary of configuration: Compiled in defaults if not in environment: SYSTEMC = SYSTEMC_ARCH = SYSTEMC_INCLUDE = SYSTEMC_LIBDIR = VERILATOR_ROOT = /usr/local/share/verilator SystemC system-wide = 1 Environment: MAKE = make PERL = SYSTEMC = SYSTEMC_ARCH = SYSTEMC_INCLUDE = SYSTEMC_LIBDIR = VERILATOR_BIN = VERILATOR_ROOT = /home/donovick/src/verilator/test_regress/.. Features (based on environment or compiled-in support): SystemC found = 1 [Inferior 1 (process 271264) exited normally] No stack. 
perl /home/donovick/src/verilator/test_regress/../bin/verilator --prefix Vt_a1_first_cc ../obj_vlt/t_a1_first_cc/Vt_a1_first_cc__main.cpp --exe --make gmake --x-assign unique -cc -Mdir obj_vlt/t_a1_first_cc --fdedup --debug-check --comp-limit-members 10 --debug --debugi 0 --gdbbt --no-dump-tree --trace --clk clk -f input.vc +define+TEST_OBJ_DIR=obj_vlt/t_a1_first_cc t/t_a1_first_cc.v > obj_vlt/t_a1_first_cc/vlt_compile.log warning: File "/home/donovick/src/verilator/test_regress/.gdbinit" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load". No stack. warning: File "/home/donovick/src/verilator/test_regress/.gdbinit" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load". To enable execution of this file add add-auto-load-safe-path /home/donovick/src/verilator/test_regress/.gdbinit line to your configuration file "/home/donovick/config/gdb/gdbinit". To completely disable this security protection add set auto-load safe-path / line to your configuration file "/home/donovick/config/gdb/gdbinit". For more information about this security protection see the "Auto-loading safe path" section in the GDB manual. E.g., run from the shell: info "(gdb)Auto-loading safe path" [Thread debugging using libthread_db enabled] Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1". Starting Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd Starting Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd [Inferior 1 (process 271306) exited normally] No stack. make -C obj_vlt/t_a1_first_cc -f /home/donovick/src/verilator/test_regress/Makefile_obj --no-print-directory VM_PREFIX=Vt_a1_first_cc TEST_OBJ_DIR=obj_vlt/t_a1_first_cc CPPFLAGS_DRIVER=-DT_A1_FIRST_CC OPT_FAST=-O0 OPT_GLOBAL=-O0 Vt_a1_first_cc > obj_vlt/t_a1_first_cc/vlt_gcc.log driver: Entering directory '/home/donovick/src/verilator/test_regress/obj_vlt/t_a1_first_cc' ccache g++ -I. 
-MMD -I/home/donovick/src/verilator/test_regress/../include -I/home/donovick/src/verilator/test_regress/../include/vltstd -DVM_COVERAGE=0 -DVM_SC=0 -DVM_TRACE=1 -DVM_TRACE_FST=0 -DVM_TRACE_VCD=1 -faligned-new -fcf-protection=none -Wno-bool-operation -Wno-sign-compare -Wno-uninitialized -Wno-unused-but-set-variable -Wno-unused-parameter -Wno-unused-variable -Wno-shadow -std=gnu++14 -DVERILATOR=1 -DVL_DEBUG=1 -DTEST_OBJ_DIR=obj_vlt/t_a1_first_cc -DVM_PREFIX=Vt_a1_first_cc -DVM_PREFIX_INCLUDE="<Vt_a1_first_cc.h>" -DVM_PREFIX_ROOT_INCLUDE="<Vt_a1_first_cc___024root.h>" -DT_A1_FIRST_CC -O0 -c -o Vt_a1_first_cc__main.o ../../obj_vlt/t_a1_first_cc/Vt_a1_first_cc__main.cpp g++ Vt_a1_first_cc__main.o verilated.o verilated_vcd_c.o Vt_a1_first_cc__ALL.a -o Vt_a1_first_cc driver: Leaving directory '/home/donovick/src/verilator/test_regress/obj_vlt/t_a1_first_cc' obj_vlt/t_a1_first_cc/Vt_a1_first_cc > obj_vlt/t_a1_first_cc/vlt_sim.log *-* All Finished *-* - t/t_a1_first_cc.v:17: Verilog $finish vlt/t_a1_first_cc: Self PASSED ==SUMMARY: Passed 1 Failed 0 Unsup 0 Time 0:09 ==SUMMARY: Passed 1 Failed 0 Unsup 0 Time 0:09 ====================================================================== TESTS DONE, PASSED: Passed 1 Failed 0 Unsup 0 Time 0:09 test_regress/t/t_a2_first_sc.pl ====================================================================== dist/t_a2_first_sc: ================================================== -Skip: dist/t_a2_first_sc: scenario 'dist' not enabled for test dist/t_a2_first_sc: -Skip: Skip: scenario 'dist' not enabled for test ==SUMMARY: Passed 0 Failed 0 Unsup 0 Time 0:00 ====================================================================== vlt/t_a2_first_sc: ================================================== perl /home/donovick/src/verilator/test_regress/../bin/verilator --prefix Vt_a2_first_sc ../obj_vlt/t_a2_first_sc/Vt_a2_first_sc__main.cpp --exe --make gmake --x-assign unique -cc -Mdir obj_vlt/t_a2_first_sc --fdedup --debug-check --comp-limit-members 10 
--debug --debugi 0 --gdbbt --no-dump-tree -sc --trace --clk clk -f input.vc +define+TEST_OBJ_DIR=obj_vlt/t_a2_first_sc t/t_a1_first_cc.v > obj_vlt/t_a2_first_sc/vlt_compile.log warning: File "/home/donovick/src/verilator/test_regress/.gdbinit" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load". No stack. warning: File "/home/donovick/src/verilator/test_regress/.gdbinit" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load". To enable execution of this file add add-auto-load-safe-path /home/donovick/src/verilator/test_regress/.gdbinit line to your configuration file "/home/donovick/config/gdb/gdbinit". To completely disable this security protection add set auto-load safe-path / line to your configuration file "/home/donovick/config/gdb/gdbinit". For more information about this security protection see the "Auto-loading safe path" section in the GDB manual. E.g., run from the shell: info "(gdb)Auto-loading safe path" [Thread debugging using libthread_db enabled] Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1". Starting Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd Starting Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd [Inferior 1 (process 271370) exited normally] No stack. make -C obj_vlt/t_a2_first_sc -f /home/donovick/src/verilator/test_regress/Makefile_obj --no-print-directory VM_PREFIX=Vt_a2_first_sc TEST_OBJ_DIR=obj_vlt/t_a2_first_sc CPPFLAGS_DRIVER=-DT_A2_FIRST_SC OPT_FAST=-O0 OPT_GLOBAL=-O0 Vt_a2_first_sc > obj_vlt/t_a2_first_sc/vlt_gcc.log driver: Entering directory '/home/donovick/src/verilator/test_regress/obj_vlt/t_a2_first_sc' ccache g++ -I. 
-MMD -I/home/donovick/src/verilator/test_regress/../include -I/home/donovick/src/verilator/test_regress/../include/vltstd -DVM_COVERAGE=0 -DVM_SC=1 -DVM_TRACE=1 -DVM_TRACE_FST=0 -DVM_TRACE_VCD=1 -faligned-new -fcf-protection=none -Wno-bool-operation -Wno-sign-compare -Wno-uninitialized -Wno-unused-but-set-variable -Wno-unused-parameter -Wno-unused-variable -Wno-shadow -std=gnu++14 -DVERILATOR=1 -DVL_DEBUG=1 -DTEST_OBJ_DIR=obj_vlt/t_a2_first_sc -DVM_PREFIX=Vt_a2_first_sc -DVM_PREFIX_INCLUDE="<Vt_a2_first_sc.h>" -DVM_PREFIX_ROOT_INCLUDE="<Vt_a2_first_sc___024root.h>" -DT_A2_FIRST_SC -O0 -c -o Vt_a2_first_sc__main.o ../../obj_vlt/t_a2_first_sc/Vt_a2_first_sc__main.cpp g++ Vt_a2_first_sc__main.o verilated.o verilated_vcd_c.o verilated_vcd_sc.o Vt_a2_first_sc__ALL.a -lsystemc -o Vt_a2_first_sc /usr/bin/ld: Vt_a2_first_sc__main.o: in function `__static_initialization_and_destruction_0(int, int)': Vt_a2_first_sc__main.cpp:(.text+0x597): undefined reference to `sc_core::sc_api_version_2_3_3_cxx201402L<&sc_core::SC_DISABLE_VIRTUAL_BIND_UNDEFINED_>::sc_api_version_2_3_3_cxx201402L(sc_core::sc_writer_policy)' /usr/bin/ld: verilated.o: in function `__static_initialization_and_destruction_0(int, int)': verilated.cpp:(.text+0xe694): undefined reference to `sc_core::sc_api_version_2_3_3_cxx201402L<&sc_core::SC_DISABLE_VIRTUAL_BIND_UNDEFINED_>::sc_api_version_2_3_3_cxx201402L(sc_core::sc_writer_policy)' /usr/bin/ld: verilated_vcd_c.o: in function `__static_initialization_and_destruction_0(int, int)': verilated_vcd_c.cpp:(.text+0x4fee): undefined reference to `sc_core::sc_api_version_2_3_3_cxx201402L<&sc_core::SC_DISABLE_VIRTUAL_BIND_UNDEFINED_>::sc_api_version_2_3_3_cxx201402L(sc_core::sc_writer_policy)' /usr/bin/ld: verilated_vcd_sc.o: in function `__static_initialization_and_destruction_0(int, int)': verilated_vcd_sc.cpp:(.text+0x41a): undefined reference to 
`sc_core::sc_api_version_2_3_3_cxx201402L<&sc_core::SC_DISABLE_VIRTUAL_BIND_UNDEFINED_>::sc_api_version_2_3_3_cxx201402L(sc_core::sc_writer_policy)' /usr/bin/ld: Vt_a2_first_sc__ALL.a(Vt_a2_first_sc__ALL.o): in function `__static_initialization_and_destruction_0(int, int)': Vt_a2_first_sc__ALL.cpp:(.text+0x129d): undefined reference to `sc_core::sc_api_version_2_3_3_cxx201402L<&sc_core::SC_DISABLE_VIRTUAL_BIND_UNDEFINED_>::sc_api_version_2_3_3_cxx201402L(sc_core::sc_writer_policy)' collect2: error: ld returned 1 exit status make[1]: *** [Vt_a2_first_sc.mk:65: Vt_a2_first_sc] Error 1 driver: Leaving directory '/home/donovick/src/verilator/test_regress/obj_vlt/t_a2_first_sc' %Warning: vlt/t_a2_first_sc: Exec of make failed: ccache g++ -I. -MMD -I/home/donovick/src/verilator/test_regress/../include -I/home/donovick/src/verilator/test_regress/../include/vltstd -DVM_COVERAGE=0 -DVM_SC=1 -DVM_TRACE=1 -DVM_TRACE_FST=0 -DVM_TRACE_VCD=1 -faligned-new -fcf-protection=none -Wno-bool-operation -Wno-sign-compare -Wno-uninitialized -Wno-unused-but-set-variable -Wno-unused-parameter -Wno-unused-variable -Wno-shadow -std=gnu++14 -DVERILATOR=1 -DVL_DEBUG=1 -DTEST_OBJ_DIR=obj_vlt/t_a2_first_sc -DVM_PREFIX=Vt_a2_first_sc -DVM_PREFIX_INCLUDE="<Vt_a2_first_sc.h>" -DVM_PREFIX_ROOT_INCLUDE="<Vt_a2_first_sc___024root.h>" -DT_A2_FIRST_SC -O0 -c -o Vt_a2_first_sc__main.o ../../obj_vlt/t_a2_first_sc/Vt_a2_first_sc__main.cpp vlt/t_a2_first_sc: %Error: Exec of make failed: ccache g++ -I. 
-MMD -I/home/donovick/src/verilator/test_regress/../include -I/home/donovick/src/verilator/test_regress/../include/vltstd -DVM_COVERAGE=0 -DVM_SC=1 -DVM_TRACE=1 -DVM_TRACE_FST=0 -DVM_TRACE_VCD=1 -faligned-new -fcf-protection=none -Wno-bool-operation -Wno-sign-compare -Wno-uninitialized -Wno-unused-but-set-variable -Wno-unused-parameter -Wno-unused-variable -Wno-shadow -std=gnu++14 -DVERILATOR=1 -DVL_DEBUG=1 -DTEST_OBJ_DIR=obj_vlt/t_a2_first_sc -DVM_PREFIX=Vt_a2_first_sc -DVM_PREFIX_INCLUDE="<Vt_a2_first_sc.h>" -DVM_PREFIX_ROOT_INCLUDE="<Vt_a2_first_sc___024root.h>" -DT_A2_FIRST_SC -O0 -c -o Vt_a2_first_sc__main.o ../../obj_vlt/t_a2_first_sc/Vt_a2_first_sc__main.cpp vlt/t_a2_first_sc: FAILED: Exec of make failed: ccache g++ -I. -MMD -I/home/donovick/src/verilator/test_regress/../include -I/home/donovick/src/verilator/test_regress/../include/vltstd -DVM_COVERAGE=0 -DVM_SC=1 -DVM_TRACE=1 -DVM_TRACE_FST=0 -DVM_TRACE_VCD=1 -faligned-new -fcf-protection=none -Wno-bool-operation -Wno-sign-compare -Wno-uninitialized -Wno-unused-but-set-variable -Wno-unused-parameter -Wno-unused-variable -Wno-shadow -std=gnu++14 -DVERILATOR=1 -DVL_DEBUG=1 -DTEST_OBJ_DIR=obj_vlt/t_a2_first_sc -DVM_PREFIX=Vt_a2_first_sc -DVM_PREFIX_INCLUDE="<Vt_a2_first_sc.h>" -DVM_PREFIX_ROOT_INCLUDE="<Vt_a2_first_sc___024root.h>" -DT_A2_FIRST_SC -O0 -c -o Vt_a2_first_sc__main.o ../../obj_vlt/t_a2_first_sc/Vt_a2_first_sc__main.cpp sc/Vt_a2_first_sc__main.cpp ==SUMMARY: Passed 0 Failed 1 Unsup 0 Time 0:05 ==SUMMARY: Passed 0 Failed 1 Unsup 0 Time 0:05 ====================================================================== #vlt/t_a2_first_sc: %Error: Exec of make failed: ccache g++ -I. 
-MMD -I/home/donovick/src/verilator/test_regress/../include -I/home/donovick/src/verilator/test_regress/../include/vltstd -DVM_COVERAGE=0 -DVM_SC=1 -DVM_TRACE=1 -DVM_TRACE_FST=0 -DVM_TRACE_VCD=1 -faligned-new -fcf-protection=none -Wno-bool-operation -Wno-sign-compare -Wno-uninitialized -Wno-unused-but-set-variable -Wno-unused-parameter -Wno-unused-variable -Wno-shadow -std=gnu++14 -DVERILATOR=1 -DVL_DEBUG=1 -DTEST_OBJ_DIR=obj_vlt/t_a2_first_sc -DVM_PREFIX=Vt_a2_first_sc -DVM_PREFIX_INCLUDE="<Vt_a2_first_sc.h>" -DVM_PREFIX_ROOT_INCLUDE="<Vt_a2_first_sc___024root.h>" -DT_A2_FIRST_SC -O0 -c -o Vt_a2_first_sc__main.o ../../obj_vlt/t_a2_first_sc/Vt_a2_first_sc__main.cpp make && test_regress/t/t_a2_first_sc.pl --vlt TESTS DONE, FAILED: Passed 0 Failed 1 Unsup 0 Time 0:05 make: *** [Makefile:166: smoke-test] Error 10 ``` What 'verilator --version' are you using? Did you try it with the git master version? ``` $ ./bin/verilator --version Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd ``` What OS and distribution are you using? Linux/Debian
True
Make test fails on stable - Can you attach an example that shows the issue? (Must be openly licensed, ideally in test_regress format.) ``` $ ... # follow git install instructions up to but not including make install $ make test ------------------------------------------------------------ making verilator in src make -C src make[1]: Entering directory '/home/donovick/src/verilator/src' make -C obj_dbg -j 1 TGT=../../bin/verilator_bin_dbg VL_DEBUG=1 -f ../Makefile_obj serial make[2]: Entering directory '/home/donovick/src/verilator/src/obj_dbg' make[2]: Nothing to be done for 'serial'. make[2]: Leaving directory '/home/donovick/src/verilator/src/obj_dbg' make -C obj_dbg TGT=../../bin/verilator_bin_dbg VL_DEBUG=1 -f ../Makefile_obj make[2]: Entering directory '/home/donovick/src/verilator/src/obj_dbg' Compile flags: g++ -Og -ggdb -gz -DVL_DEBUG -D_GLIBCXX_DEBUG -MMD -I. -I.. -I.. -I../../include -I../../include -MP -faligned-new -Wno-unused-parameter -Wno-shadow -fno-builtin-malloc -fno-builtin-calloc -fno-builtin-realloc -fno-builtin-free -DDEFENV_SYSTEMC="" -DDEFENV_SYSTEMC_ARCH="" -DDEFENV_SYSTEMC_INCLUDE="" -DDEFENV_SYSTEMC_LIBDIR="" -DDEFENV_VERILATOR_ROOT="/usr/local/share/verilator" make[2]: Leaving directory '/home/donovick/src/verilator/src/obj_dbg' make -C obj_dbg TGT=../../bin/verilator_coverage_bin_dbg VL_DEBUG=1 VL_VLCOV=1 -f ../Makefile_obj serial_vlcov make[2]: Entering directory '/home/donovick/src/verilator/src/obj_dbg' make[2]: Nothing to be done for 'serial_vlcov'. make[2]: Leaving directory '/home/donovick/src/verilator/src/obj_dbg' make -C obj_dbg TGT=../../bin/verilator_coverage_bin_dbg VL_DEBUG=1 VL_VLCOV=1 -f ../Makefile_obj make[2]: Entering directory '/home/donovick/src/verilator/src/obj_dbg' Compile flags: g++ -Og -ggdb -gz -DVL_DEBUG -D_GLIBCXX_DEBUG -MMD -I. -I.. -I.. 
-I../../include -I../../include -MP -faligned-new -Wno-unused-parameter -Wno-shadow -fno-builtin-malloc -fno-builtin-calloc -fno-builtin-realloc -fno-builtin-free -DDEFENV_SYSTEMC="" -DDEFENV_SYSTEMC_ARCH="" -DDEFENV_SYSTEMC_INCLUDE="" -DDEFENV_SYSTEMC_LIBDIR="" -DDEFENV_VERILATOR_ROOT="/usr/local/share/verilator" make[2]: Leaving directory '/home/donovick/src/verilator/src/obj_dbg' make -C obj_opt -j 1 TGT=../../bin/verilator_bin -f ../Makefile_obj serial make[2]: Entering directory '/home/donovick/src/verilator/src/obj_opt' make[2]: Nothing to be done for 'serial'. make[2]: Leaving directory '/home/donovick/src/verilator/src/obj_opt' make -C obj_opt TGT=../../bin/verilator_bin -f ../Makefile_obj make[2]: Entering directory '/home/donovick/src/verilator/src/obj_opt' Compile flags: g++ -O3 -MMD -I. -I.. -I.. -I../../include -I../../include -MP -faligned-new -Wno-unused-parameter -Wno-shadow -fno-builtin-malloc -fno-builtin-calloc -fno-builtin-realloc -fno-builtin-free -DDEFENV_SYSTEMC="" -DDEFENV_SYSTEMC_ARCH="" -DDEFENV_SYSTEMC_INCLUDE="" -DDEFENV_SYSTEMC_LIBDIR="" -DDEFENV_VERILATOR_ROOT="/usr/local/share/verilator" make[2]: Leaving directory '/home/donovick/src/verilator/src/obj_opt' make[1]: Leaving directory '/home/donovick/src/verilator/src' test_regress/t/t_a1_first_cc.pl ====================================================================== dist/t_a1_first_cc: ================================================== -Skip: dist/t_a1_first_cc: scenario 'dist' not enabled for test dist/t_a1_first_cc: -Skip: Skip: scenario 'dist' not enabled for test ==SUMMARY: Passed 0 Failed 0 Unsup 0 Time 0:00 ====================================================================== vlt/t_a1_first_cc: ================================================== perl ../bin/verilator --debug --debugi 0 --gdbbt --no-dump-tree -V warning: File "/home/donovick/src/verilator/test_regress/.gdbinit" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load". 
No stack. warning: File "/home/donovick/src/verilator/test_regress/.gdbinit" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load". To enable execution of this file add add-auto-load-safe-path /home/donovick/src/verilator/test_regress/.gdbinit line to your configuration file "/home/donovick/config/gdb/gdbinit". To completely disable this security protection add set auto-load safe-path / line to your configuration file "/home/donovick/config/gdb/gdbinit". For more information about this security protection see the "Auto-loading safe path" section in the GDB manual. E.g., run from the shell: info "(gdb)Auto-loading safe path" [Thread debugging using libthread_db enabled] Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1". Starting Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd Starting Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd Copyright 2003-2022 by Wilson Snyder. Verilator is free software; you can redistribute it and/or modify the Verilator internals under the terms of either the GNU Lesser General Public License Version 3 or the Perl Artistic License Version 2.0. See https://verilator.org for documentation Summary of configuration: Compiled in defaults if not in environment: SYSTEMC = SYSTEMC_ARCH = SYSTEMC_INCLUDE = SYSTEMC_LIBDIR = VERILATOR_ROOT = /usr/local/share/verilator SystemC system-wide = 1 Environment: MAKE = make PERL = SYSTEMC = SYSTEMC_ARCH = SYSTEMC_INCLUDE = SYSTEMC_LIBDIR = VERILATOR_BIN = VERILATOR_ROOT = /home/donovick/src/verilator/test_regress/.. Features (based on environment or compiled-in support): SystemC found = 1 [Inferior 1 (process 271264) exited normally] No stack. 
perl /home/donovick/src/verilator/test_regress/../bin/verilator --prefix Vt_a1_first_cc ../obj_vlt/t_a1_first_cc/Vt_a1_first_cc__main.cpp --exe --make gmake --x-assign unique -cc -Mdir obj_vlt/t_a1_first_cc --fdedup --debug-check --comp-limit-members 10 --debug --debugi 0 --gdbbt --no-dump-tree --trace --clk clk -f input.vc +define+TEST_OBJ_DIR=obj_vlt/t_a1_first_cc t/t_a1_first_cc.v > obj_vlt/t_a1_first_cc/vlt_compile.log warning: File "/home/donovick/src/verilator/test_regress/.gdbinit" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load". No stack. warning: File "/home/donovick/src/verilator/test_regress/.gdbinit" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load". To enable execution of this file add add-auto-load-safe-path /home/donovick/src/verilator/test_regress/.gdbinit line to your configuration file "/home/donovick/config/gdb/gdbinit". To completely disable this security protection add set auto-load safe-path / line to your configuration file "/home/donovick/config/gdb/gdbinit". For more information about this security protection see the "Auto-loading safe path" section in the GDB manual. E.g., run from the shell: info "(gdb)Auto-loading safe path" [Thread debugging using libthread_db enabled] Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1". Starting Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd Starting Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd [Inferior 1 (process 271306) exited normally] No stack. make -C obj_vlt/t_a1_first_cc -f /home/donovick/src/verilator/test_regress/Makefile_obj --no-print-directory VM_PREFIX=Vt_a1_first_cc TEST_OBJ_DIR=obj_vlt/t_a1_first_cc CPPFLAGS_DRIVER=-DT_A1_FIRST_CC OPT_FAST=-O0 OPT_GLOBAL=-O0 Vt_a1_first_cc > obj_vlt/t_a1_first_cc/vlt_gcc.log driver: Entering directory '/home/donovick/src/verilator/test_regress/obj_vlt/t_a1_first_cc' ccache g++ -I. 
-MMD -I/home/donovick/src/verilator/test_regress/../include -I/home/donovick/src/verilator/test_regress/../include/vltstd -DVM_COVERAGE=0 -DVM_SC=0 -DVM_TRACE=1 -DVM_TRACE_FST=0 -DVM_TRACE_VCD=1 -faligned-new -fcf-protection=none -Wno-bool-operation -Wno-sign-compare -Wno-uninitialized -Wno-unused-but-set-variable -Wno-unused-parameter -Wno-unused-variable -Wno-shadow -std=gnu++14 -DVERILATOR=1 -DVL_DEBUG=1 -DTEST_OBJ_DIR=obj_vlt/t_a1_first_cc -DVM_PREFIX=Vt_a1_first_cc -DVM_PREFIX_INCLUDE="<Vt_a1_first_cc.h>" -DVM_PREFIX_ROOT_INCLUDE="<Vt_a1_first_cc___024root.h>" -DT_A1_FIRST_CC -O0 -c -o Vt_a1_first_cc__main.o ../../obj_vlt/t_a1_first_cc/Vt_a1_first_cc__main.cpp g++ Vt_a1_first_cc__main.o verilated.o verilated_vcd_c.o Vt_a1_first_cc__ALL.a -o Vt_a1_first_cc driver: Leaving directory '/home/donovick/src/verilator/test_regress/obj_vlt/t_a1_first_cc' obj_vlt/t_a1_first_cc/Vt_a1_first_cc > obj_vlt/t_a1_first_cc/vlt_sim.log *-* All Finished *-* - t/t_a1_first_cc.v:17: Verilog $finish vlt/t_a1_first_cc: Self PASSED ==SUMMARY: Passed 1 Failed 0 Unsup 0 Time 0:09 ==SUMMARY: Passed 1 Failed 0 Unsup 0 Time 0:09 ====================================================================== TESTS DONE, PASSED: Passed 1 Failed 0 Unsup 0 Time 0:09 test_regress/t/t_a2_first_sc.pl ====================================================================== dist/t_a2_first_sc: ================================================== -Skip: dist/t_a2_first_sc: scenario 'dist' not enabled for test dist/t_a2_first_sc: -Skip: Skip: scenario 'dist' not enabled for test ==SUMMARY: Passed 0 Failed 0 Unsup 0 Time 0:00 ====================================================================== vlt/t_a2_first_sc: ================================================== perl /home/donovick/src/verilator/test_regress/../bin/verilator --prefix Vt_a2_first_sc ../obj_vlt/t_a2_first_sc/Vt_a2_first_sc__main.cpp --exe --make gmake --x-assign unique -cc -Mdir obj_vlt/t_a2_first_sc --fdedup --debug-check --comp-limit-members 10 
--debug --debugi 0 --gdbbt --no-dump-tree -sc --trace --clk clk -f input.vc +define+TEST_OBJ_DIR=obj_vlt/t_a2_first_sc t/t_a1_first_cc.v > obj_vlt/t_a2_first_sc/vlt_compile.log warning: File "/home/donovick/src/verilator/test_regress/.gdbinit" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load". No stack. warning: File "/home/donovick/src/verilator/test_regress/.gdbinit" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load". To enable execution of this file add add-auto-load-safe-path /home/donovick/src/verilator/test_regress/.gdbinit line to your configuration file "/home/donovick/config/gdb/gdbinit". To completely disable this security protection add set auto-load safe-path / line to your configuration file "/home/donovick/config/gdb/gdbinit". For more information about this security protection see the "Auto-loading safe path" section in the GDB manual. E.g., run from the shell: info "(gdb)Auto-loading safe path" [Thread debugging using libthread_db enabled] Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1". Starting Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd Starting Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd [Inferior 1 (process 271370) exited normally] No stack. make -C obj_vlt/t_a2_first_sc -f /home/donovick/src/verilator/test_regress/Makefile_obj --no-print-directory VM_PREFIX=Vt_a2_first_sc TEST_OBJ_DIR=obj_vlt/t_a2_first_sc CPPFLAGS_DRIVER=-DT_A2_FIRST_SC OPT_FAST=-O0 OPT_GLOBAL=-O0 Vt_a2_first_sc > obj_vlt/t_a2_first_sc/vlt_gcc.log driver: Entering directory '/home/donovick/src/verilator/test_regress/obj_vlt/t_a2_first_sc' ccache g++ -I. 
-MMD -I/home/donovick/src/verilator/test_regress/../include -I/home/donovick/src/verilator/test_regress/../include/vltstd -DVM_COVERAGE=0 -DVM_SC=1 -DVM_TRACE=1 -DVM_TRACE_FST=0 -DVM_TRACE_VCD=1 -faligned-new -fcf-protection=none -Wno-bool-operation -Wno-sign-compare -Wno-uninitialized -Wno-unused-but-set-variable -Wno-unused-parameter -Wno-unused-variable -Wno-shadow -std=gnu++14 -DVERILATOR=1 -DVL_DEBUG=1 -DTEST_OBJ_DIR=obj_vlt/t_a2_first_sc -DVM_PREFIX=Vt_a2_first_sc -DVM_PREFIX_INCLUDE="<Vt_a2_first_sc.h>" -DVM_PREFIX_ROOT_INCLUDE="<Vt_a2_first_sc___024root.h>" -DT_A2_FIRST_SC -O0 -c -o Vt_a2_first_sc__main.o ../../obj_vlt/t_a2_first_sc/Vt_a2_first_sc__main.cpp g++ Vt_a2_first_sc__main.o verilated.o verilated_vcd_c.o verilated_vcd_sc.o Vt_a2_first_sc__ALL.a -lsystemc -o Vt_a2_first_sc /usr/bin/ld: Vt_a2_first_sc__main.o: in function `__static_initialization_and_destruction_0(int, int)': Vt_a2_first_sc__main.cpp:(.text+0x597): undefined reference to `sc_core::sc_api_version_2_3_3_cxx201402L<&sc_core::SC_DISABLE_VIRTUAL_BIND_UNDEFINED_>::sc_api_version_2_3_3_cxx201402L(sc_core::sc_writer_policy)' /usr/bin/ld: verilated.o: in function `__static_initialization_and_destruction_0(int, int)': verilated.cpp:(.text+0xe694): undefined reference to `sc_core::sc_api_version_2_3_3_cxx201402L<&sc_core::SC_DISABLE_VIRTUAL_BIND_UNDEFINED_>::sc_api_version_2_3_3_cxx201402L(sc_core::sc_writer_policy)' /usr/bin/ld: verilated_vcd_c.o: in function `__static_initialization_and_destruction_0(int, int)': verilated_vcd_c.cpp:(.text+0x4fee): undefined reference to `sc_core::sc_api_version_2_3_3_cxx201402L<&sc_core::SC_DISABLE_VIRTUAL_BIND_UNDEFINED_>::sc_api_version_2_3_3_cxx201402L(sc_core::sc_writer_policy)' /usr/bin/ld: verilated_vcd_sc.o: in function `__static_initialization_and_destruction_0(int, int)': verilated_vcd_sc.cpp:(.text+0x41a): undefined reference to 
`sc_core::sc_api_version_2_3_3_cxx201402L<&sc_core::SC_DISABLE_VIRTUAL_BIND_UNDEFINED_>::sc_api_version_2_3_3_cxx201402L(sc_core::sc_writer_policy)' /usr/bin/ld: Vt_a2_first_sc__ALL.a(Vt_a2_first_sc__ALL.o): in function `__static_initialization_and_destruction_0(int, int)': Vt_a2_first_sc__ALL.cpp:(.text+0x129d): undefined reference to `sc_core::sc_api_version_2_3_3_cxx201402L<&sc_core::SC_DISABLE_VIRTUAL_BIND_UNDEFINED_>::sc_api_version_2_3_3_cxx201402L(sc_core::sc_writer_policy)' collect2: error: ld returned 1 exit status make[1]: *** [Vt_a2_first_sc.mk:65: Vt_a2_first_sc] Error 1 driver: Leaving directory '/home/donovick/src/verilator/test_regress/obj_vlt/t_a2_first_sc' %Warning: vlt/t_a2_first_sc: Exec of make failed: ccache g++ -I. -MMD -I/home/donovick/src/verilator/test_regress/../include -I/home/donovick/src/verilator/test_regress/../include/vltstd -DVM_COVERAGE=0 -DVM_SC=1 -DVM_TRACE=1 -DVM_TRACE_FST=0 -DVM_TRACE_VCD=1 -faligned-new -fcf-protection=none -Wno-bool-operation -Wno-sign-compare -Wno-uninitialized -Wno-unused-but-set-variable -Wno-unused-parameter -Wno-unused-variable -Wno-shadow -std=gnu++14 -DVERILATOR=1 -DVL_DEBUG=1 -DTEST_OBJ_DIR=obj_vlt/t_a2_first_sc -DVM_PREFIX=Vt_a2_first_sc -DVM_PREFIX_INCLUDE="<Vt_a2_first_sc.h>" -DVM_PREFIX_ROOT_INCLUDE="<Vt_a2_first_sc___024root.h>" -DT_A2_FIRST_SC -O0 -c -o Vt_a2_first_sc__main.o ../../obj_vlt/t_a2_first_sc/Vt_a2_first_sc__main.cpp vlt/t_a2_first_sc: %Error: Exec of make failed: ccache g++ -I. 
-MMD -I/home/donovick/src/verilator/test_regress/../include -I/home/donovick/src/verilator/test_regress/../include/vltstd -DVM_COVERAGE=0 -DVM_SC=1 -DVM_TRACE=1 -DVM_TRACE_FST=0 -DVM_TRACE_VCD=1 -faligned-new -fcf-protection=none -Wno-bool-operation -Wno-sign-compare -Wno-uninitialized -Wno-unused-but-set-variable -Wno-unused-parameter -Wno-unused-variable -Wno-shadow -std=gnu++14 -DVERILATOR=1 -DVL_DEBUG=1 -DTEST_OBJ_DIR=obj_vlt/t_a2_first_sc -DVM_PREFIX=Vt_a2_first_sc -DVM_PREFIX_INCLUDE="<Vt_a2_first_sc.h>" -DVM_PREFIX_ROOT_INCLUDE="<Vt_a2_first_sc___024root.h>" -DT_A2_FIRST_SC -O0 -c -o Vt_a2_first_sc__main.o ../../obj_vlt/t_a2_first_sc/Vt_a2_first_sc__main.cpp vlt/t_a2_first_sc: FAILED: Exec of make failed: ccache g++ -I. -MMD -I/home/donovick/src/verilator/test_regress/../include -I/home/donovick/src/verilator/test_regress/../include/vltstd -DVM_COVERAGE=0 -DVM_SC=1 -DVM_TRACE=1 -DVM_TRACE_FST=0 -DVM_TRACE_VCD=1 -faligned-new -fcf-protection=none -Wno-bool-operation -Wno-sign-compare -Wno-uninitialized -Wno-unused-but-set-variable -Wno-unused-parameter -Wno-unused-variable -Wno-shadow -std=gnu++14 -DVERILATOR=1 -DVL_DEBUG=1 -DTEST_OBJ_DIR=obj_vlt/t_a2_first_sc -DVM_PREFIX=Vt_a2_first_sc -DVM_PREFIX_INCLUDE="<Vt_a2_first_sc.h>" -DVM_PREFIX_ROOT_INCLUDE="<Vt_a2_first_sc___024root.h>" -DT_A2_FIRST_SC -O0 -c -o Vt_a2_first_sc__main.o ../../obj_vlt/t_a2_first_sc/Vt_a2_first_sc__main.cpp sc/Vt_a2_first_sc__main.cpp ==SUMMARY: Passed 0 Failed 1 Unsup 0 Time 0:05 ==SUMMARY: Passed 0 Failed 1 Unsup 0 Time 0:05 ====================================================================== #vlt/t_a2_first_sc: %Error: Exec of make failed: ccache g++ -I. 
-MMD -I/home/donovick/src/verilator/test_regress/../include -I/home/donovick/src/verilator/test_regress/../include/vltstd -DVM_COVERAGE=0 -DVM_SC=1 -DVM_TRACE=1 -DVM_TRACE_FST=0 -DVM_TRACE_VCD=1 -faligned-new -fcf-protection=none -Wno-bool-operation -Wno-sign-compare -Wno-uninitialized -Wno-unused-but-set-variable -Wno-unused-parameter -Wno-unused-variable -Wno-shadow -std=gnu++14 -DVERILATOR=1 -DVL_DEBUG=1 -DTEST_OBJ_DIR=obj_vlt/t_a2_first_sc -DVM_PREFIX=Vt_a2_first_sc -DVM_PREFIX_INCLUDE="<Vt_a2_first_sc.h>" -DVM_PREFIX_ROOT_INCLUDE="<Vt_a2_first_sc___024root.h>" -DT_A2_FIRST_SC -O0 -c -o Vt_a2_first_sc__main.o ../../obj_vlt/t_a2_first_sc/Vt_a2_first_sc__main.cpp make && test_regress/t/t_a2_first_sc.pl --vlt TESTS DONE, FAILED: Passed 0 Failed 1 Unsup 0 Time 0:05 make: *** [Makefile:166: smoke-test] Error 10 ``` What 'verilator --version' are you using? Did you try it with the git master version? ``` $ ./bin/verilator --version Verilator 4.228 2022-10-01 rev v4.228-28-g04fff7ebd ``` What OS and distribution are you using? Linux/Debian
port
1
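The undefined references in the record above all point at the same symbol, `sc_core::sc_api_version_2_3_3_cxx201402L<...>`, which SystemC emits as a link-time guard so that user code and the installed library must agree on the SystemC version and the C++ standard: the six-digit suffix is the `__cplusplus` value (`201402L` is C++14, matching the test's `-std=gnu++14`), so an unresolved guard usually means libsystemc was built under a different standard. As an illustrative sketch (the helper and lookup table below are my own, not part of Verilator or SystemC):

```python
import re

# Illustrative helper (not part of Verilator or SystemC): decode the C++
# standard baked into a SystemC API-version guard symbol such as
# "sc_api_version_2_3_3_cxx201402L". The six digits before the trailing
# "L" are the __cplusplus macro value the library was configured for.
CPLUSPLUS_VALUES = {"201103": "C++11", "201402": "C++14", "201703": "C++17"}

def systemc_symbol_standard(symbol):
    m = re.search(r"cxx(\d{6})L", symbol)
    if not m:
        return None  # not an API-version guard symbol
    return CPLUSPLUS_VALUES.get(m.group(1), "unknown")

print(systemc_symbol_standard(
    "sc_core::sc_api_version_2_3_3_cxx201402L"))  # C++14
```

If the decoded standard disagrees with the flags in the failing compile line, the usual remedy is to rebuild SystemC (or the Verilator tests) with a matching `-std` setting, though the exact fix depends on how libsystemc was installed.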
203,404
23,154,905,922
IssuesEvent
2022-07-29 12:04:48
turkdevops/prism
https://api.github.com/repos/turkdevops/prism
closed
CVE-2020-28469 (High) detected in glob-parent-3.1.0.tgz, glob-parent-5.1.1.tgz - autoclosed
security vulnerability
## CVE-2020-28469 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>glob-parent-3.1.0.tgz</b>, <b>glob-parent-5.1.1.tgz</b></p></summary> <p> <details><summary><b>glob-parent-3.1.0.tgz</b></p></summary> <p>Strips glob magic from a string to provide the parent directory path</p> <p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/glob-parent/package.json</p> <p> Dependency Hierarchy: - gulp-4.0.2.tgz (Root Library) - vinyl-fs-3.0.3.tgz - glob-stream-6.1.0.tgz - :x: **glob-parent-3.1.0.tgz** (Vulnerable Library) </details> <details><summary><b>glob-parent-5.1.1.tgz</b></p></summary> <p>Extract the non-magic parent path from a glob string.</p> <p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/fast-glob/node_modules/glob-parent/package.json</p> <p> Dependency Hierarchy: - del-6.0.0.tgz (Root Library) - globby-11.0.1.tgz - fast-glob-3.2.4.tgz - :x: **glob-parent-5.1.1.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/turkdevops/prism/commit/c015b23efb1d53d038dc8d04e7411d690d7075ca">c015b23efb1d53d038dc8d04e7411d690d7075ca</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package glob-parent before 5.1.2. The enclosure regex used to check for strings ending in enclosure containing path separator. 
<p>Publish Date: 2021-06-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469>CVE-2020-28469</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469</a></p> <p>Release Date: 2021-06-03</p> <p>Fix Resolution (glob-parent): 5.1.2</p> <p>Direct dependency fix Resolution (del): 6.1.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-28469 (High) detected in glob-parent-3.1.0.tgz, glob-parent-5.1.1.tgz - autoclosed - ## CVE-2020-28469 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>glob-parent-3.1.0.tgz</b>, <b>glob-parent-5.1.1.tgz</b></p></summary> <p> <details><summary><b>glob-parent-3.1.0.tgz</b></p></summary> <p>Strips glob magic from a string to provide the parent directory path</p> <p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/glob-parent/package.json</p> <p> Dependency Hierarchy: - gulp-4.0.2.tgz (Root Library) - vinyl-fs-3.0.3.tgz - glob-stream-6.1.0.tgz - :x: **glob-parent-3.1.0.tgz** (Vulnerable Library) </details> <details><summary><b>glob-parent-5.1.1.tgz</b></p></summary> <p>Extract the non-magic parent path from a glob string.</p> <p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/fast-glob/node_modules/glob-parent/package.json</p> <p> Dependency Hierarchy: - del-6.0.0.tgz (Root Library) - globby-11.0.1.tgz - fast-glob-3.2.4.tgz - :x: **glob-parent-5.1.1.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/turkdevops/prism/commit/c015b23efb1d53d038dc8d04e7411d690d7075ca">c015b23efb1d53d038dc8d04e7411d690d7075ca</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package glob-parent before 5.1.2. 
The enclosure regex used to check for strings ending in enclosure containing path separator. <p>Publish Date: 2021-06-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469>CVE-2020-28469</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469</a></p> <p>Release Date: 2021-06-03</p> <p>Fix Resolution (glob-parent): 5.1.2</p> <p>Direct dependency fix Resolution (del): 6.1.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_port
cve high detected in glob parent tgz glob parent tgz autoclosed cve high severity vulnerability vulnerable libraries glob parent tgz glob parent tgz glob parent tgz strips glob magic from a string to provide the parent directory path library home page a href path to dependency file package json path to vulnerable library node modules glob parent package json dependency hierarchy gulp tgz root library vinyl fs tgz glob stream tgz x glob parent tgz vulnerable library glob parent tgz extract the non magic parent path from a glob string library home page a href path to dependency file package json path to vulnerable library node modules fast glob node modules glob parent package json dependency hierarchy del tgz root library globby tgz fast glob tgz x glob parent tgz vulnerable library found in head commit a href found in base branch master vulnerability details this affects the package glob parent before the enclosure regex used to check for strings ending in enclosure containing path separator publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution glob parent direct dependency fix resolution del step up your open source security game with mend
0
1,130
14,480,401,310
IssuesEvent
2020-12-10 11:07:23
ToFuProject/tofu
https://api.github.com/repos/ToFuProject/tofu
closed
tofu 1.4.8 fails for Windows only on conda-forge
Fixed in devel geom - bug portability
The attempt to export tofu 1.4.8 to conda forge reveals that all unit tests are passed except for [Windows platforms](https://dev.azure.com/conda-forge/feedstock-builds/_build/results?buildId=249918&view=logs&j=533b006e-c27f-58a3-8df4-b5a79e785988&t=dccf4699-a398-5507-50c7-f61f6dad5a20) and for python 3.6 and 3.7 (3.8 and 3.9 are fine) There are 2 errors in the surface and volume sampling routines: ``` ====================================================================== ERROR: tests.tests01_geom.tests01_GG.test07_Ves_Vmesh_Tor ---------------------------------------------------------------------- Traceback (most recent call last): File "D:\bld\tofu_1607509981720\_test_env\lib\site-packages\nose\case.py", line 197, in runTest self.test(*self.arg) File "D:\bld\tofu_1607509981720\test_tmp\tofu\tests\tests01_geom\tests01_GG.py", line 309, in test07_Ves_Vmesh_Tor margin=1.e-9) File "tofu\geom\_GG.pyx", line 1102, in tofu.geom._GG._Ves_Vmesh_Tor_SubFromD_cython ValueError: Buffer dtype mismatch, expected 'long' but got 'long long' ====================================================================== ERROR: tests.tests01_geom.tests03_core.Test01_Struct.test15_get_sampleV ---------------------------------------------------------------------- Traceback (most recent call last): File "D:\bld\tofu_1607509981720\test_tmp\tofu\tests\tests01_geom\tests03_core.py", line 452, in test15_get_sampleV algo='new') File "D:\bld\tofu_1607509981720\_test_env\lib\site-packages\tofu\geom\_core.py", line 1432, in get_sampleV return _comp._Ves_get_sampleV(*args, **kwdargs) File "D:\bld\tofu_1607509981720\_test_env\lib\site-packages\tofu\geom\_comp.py", line 712, in _Ves_get_sampleV num_threads=num_threads, File "tofu\geom\_GG.pyx", line 1102, in tofu.geom._GG._Ves_Vmesh_Tor_SubFromD_cython ValueError: Buffer dtype mismatch, expected 'long' but got 'long long' During handling of the above exception, another exception occurred: Traceback (most recent call last): File 
"D:\bld\tofu_1607509981720\_test_env\lib\site-packages\nose\case.py", line 197, in runTest self.test(*self.arg) File "D:\bld\tofu_1607509981720\test_tmp\tofu\tests\tests01_geom\tests03_core.py", line 462, in test15_get_sampleV raise Exception(msg) Exception: Buffer dtype mismatch, expected 'long' but got 'long long' Failed for Tor_Ves_VesIn - ii = 0 - Lim = [] - domain = None - algo = 'new' ---------------------------------------------------------------------- Ran 127 tests in 140.733s ``` @lasofivec could you have a look at these ? I feel that we already encountered that kind of issue, but can't remember how we fixed it, any memories @flothesof ?
True
tofu 1.4.8 fails for Windows only on conda-forge - The attempt to export tofu 1.4.8 to conda forge reveals that all unit tests are passed except for [Windows platforms](https://dev.azure.com/conda-forge/feedstock-builds/_build/results?buildId=249918&view=logs&j=533b006e-c27f-58a3-8df4-b5a79e785988&t=dccf4699-a398-5507-50c7-f61f6dad5a20) and for python 3.6 and 3.7 (3.8 and 3.9 are fine) There are 2 errors in the surface and volume sampling routines: ``` ====================================================================== ERROR: tests.tests01_geom.tests01_GG.test07_Ves_Vmesh_Tor ---------------------------------------------------------------------- Traceback (most recent call last): File "D:\bld\tofu_1607509981720\_test_env\lib\site-packages\nose\case.py", line 197, in runTest self.test(*self.arg) File "D:\bld\tofu_1607509981720\test_tmp\tofu\tests\tests01_geom\tests01_GG.py", line 309, in test07_Ves_Vmesh_Tor margin=1.e-9) File "tofu\geom\_GG.pyx", line 1102, in tofu.geom._GG._Ves_Vmesh_Tor_SubFromD_cython ValueError: Buffer dtype mismatch, expected 'long' but got 'long long' ====================================================================== ERROR: tests.tests01_geom.tests03_core.Test01_Struct.test15_get_sampleV ---------------------------------------------------------------------- Traceback (most recent call last): File "D:\bld\tofu_1607509981720\test_tmp\tofu\tests\tests01_geom\tests03_core.py", line 452, in test15_get_sampleV algo='new') File "D:\bld\tofu_1607509981720\_test_env\lib\site-packages\tofu\geom\_core.py", line 1432, in get_sampleV return _comp._Ves_get_sampleV(*args, **kwdargs) File "D:\bld\tofu_1607509981720\_test_env\lib\site-packages\tofu\geom\_comp.py", line 712, in _Ves_get_sampleV num_threads=num_threads, File "tofu\geom\_GG.pyx", line 1102, in tofu.geom._GG._Ves_Vmesh_Tor_SubFromD_cython ValueError: Buffer dtype mismatch, expected 'long' but got 'long long' During handling of the above exception, another exception occurred: Traceback 
(most recent call last): File "D:\bld\tofu_1607509981720\_test_env\lib\site-packages\nose\case.py", line 197, in runTest self.test(*self.arg) File "D:\bld\tofu_1607509981720\test_tmp\tofu\tests\tests01_geom\tests03_core.py", line 462, in test15_get_sampleV raise Exception(msg) Exception: Buffer dtype mismatch, expected 'long' but got 'long long' Failed for Tor_Ves_VesIn - ii = 0 - Lim = [] - domain = None - algo = 'new' ---------------------------------------------------------------------- Ran 127 tests in 140.733s ``` @lasofivec could you have a look at these ? I feel that we already encountered that kind of issue, but can't remember how we fixed it, any memories @flothesof ?
port
tofu fails for windows only on conda forge the attempt to export tofu to conda forge reveals that all unit tests are passed except for and for python and and are fine there are errors in the surface and volume sampling routines error tests geom gg ves vmesh tor traceback most recent call last file d bld tofu test env lib site packages nose case py line in runtest self test self arg file d bld tofu test tmp tofu tests geom gg py line in ves vmesh tor margin e file tofu geom gg pyx line in tofu geom gg ves vmesh tor subfromd cython valueerror buffer dtype mismatch expected long but got long long error tests geom core struct get samplev traceback most recent call last file d bld tofu test tmp tofu tests geom core py line in get samplev algo new file d bld tofu test env lib site packages tofu geom core py line in get samplev return comp ves get samplev args kwdargs file d bld tofu test env lib site packages tofu geom comp py line in ves get samplev num threads num threads file tofu geom gg pyx line in tofu geom gg ves vmesh tor subfromd cython valueerror buffer dtype mismatch expected long but got long long during handling of the above exception another exception occurred traceback most recent call last file d bld tofu test env lib site packages nose case py line in runtest self test self arg file d bld tofu test tmp tofu tests geom core py line in get samplev raise exception msg exception buffer dtype mismatch expected long but got long long failed for tor ves vesin ii lim domain none algo new ran tests in lasofivec could you have a look at these i feel that we already encountered that kind of issue but can t remember how we fixed it any memories flothesof
1
756
10,143,265,679
IssuesEvent
2019-08-04 10:33:07
droe/sslsplit
https://api.github.com/repos/droe/sslsplit
closed
Certificate error: loading src server certificate failed
bug merged-to-develop portability
This is very similar to #246. I'm running Kali 2019.1 with SSLsplit 0.5.4 and see the "loading src server certificate failed" error.. I tested the same commands using SSLsplit 0.5.2 and everything works fine. Below are the commands that I used to set up the server, client and SSLsplit. The failing output from the server, client and SSLsplit are attached. # Server The default options were used for the openssl req command. ``` openssl genrsa -out server.key 2048 openssl req -new -key server.key -out server.csr openssl x509 -req -in server.csr -signkey server.key -out server_signed_cert.pem openssl s_server -accept 9999 -key server.key -cert server_signed_cert.pem -debug ``` # SSLsplit The default options were used for the openssl req command. ``` openssl genrsa -out fake.key 2048 openssl req -new -key fake.key -out fake.csr openssl x509 -req -in fake.csr -signkey fake.key -out fake_signed_cert.pem sslsplit -D -c fake_signed_cert.pem -k fake.key ssl 0.0.0.0 8888 192.168.128.129 9999 ``` # Connections Initial connection directly to server to prove the server/client works fine: `openssl s_client -connect 192.168.128.129:9999` Follow up connection via SSLsplit: `openssl s_client -connect 192.168.128.134:8888` [openssl-client-trace.txt](https://github.com/droe/sslsplit/files/2919469/openssl-client-trace.txt) [openssl-server-trace.txt](https://github.com/droe/sslsplit/files/2919470/openssl-server-trace.txt) [sslsplit-trace.txt](https://github.com/droe/sslsplit/files/2919471/sslsplit-trace.txt)
True
Certificate error: loading src server certificate failed - This is very similar to #246. I'm running Kali 2019.1 with SSLsplit 0.5.4 and see the "loading src server certificate failed" error.. I tested the same commands using SSLsplit 0.5.2 and everything works fine. Below are the commands that I used to set up the server, client and SSLsplit. The failing output from the server, client and SSLsplit are attached. # Server The default options were used for the openssl req command. ``` openssl genrsa -out server.key 2048 openssl req -new -key server.key -out server.csr openssl x509 -req -in server.csr -signkey server.key -out server_signed_cert.pem openssl s_server -accept 9999 -key server.key -cert server_signed_cert.pem -debug ``` # SSLsplit The default options were used for the openssl req command. ``` openssl genrsa -out fake.key 2048 openssl req -new -key fake.key -out fake.csr openssl x509 -req -in fake.csr -signkey fake.key -out fake_signed_cert.pem sslsplit -D -c fake_signed_cert.pem -k fake.key ssl 0.0.0.0 8888 192.168.128.129 9999 ``` # Connections Initial connection directly to server to prove the server/client works fine: `openssl s_client -connect 192.168.128.129:9999` Follow up connection via SSLsplit: `openssl s_client -connect 192.168.128.134:8888` [openssl-client-trace.txt](https://github.com/droe/sslsplit/files/2919469/openssl-client-trace.txt) [openssl-server-trace.txt](https://github.com/droe/sslsplit/files/2919470/openssl-server-trace.txt) [sslsplit-trace.txt](https://github.com/droe/sslsplit/files/2919471/sslsplit-trace.txt)
port
certificate error loading src server certificate failed this is very similar to i m running kali with sslsplit and see the loading src server certificate failed error i tested the same commands using sslsplit and everything works fine below are the commands that i used to set up the server client and sslsplit the failing output from the server client and sslsplit are attached server the default options were used for the openssl req command openssl genrsa out server key openssl req new key server key out server csr openssl req in server csr signkey server key out server signed cert pem openssl s server accept key server key cert server signed cert pem debug sslsplit the default options were used for the openssl req command openssl genrsa out fake key openssl req new key fake key out fake csr openssl req in fake csr signkey fake key out fake signed cert pem sslsplit d c fake signed cert pem k fake key ssl connections initial connection directly to server to prove the server client works fine openssl s client connect follow up connection via sslsplit openssl s client connect
1
1,641
23,612,565,609
IssuesEvent
2022-08-24 13:32:27
Samsung/thorvg
https://api.github.com/repos/Samsung/thorvg
closed
Windows and clang: compilation error in src/loaders/svg/tvgSvgSceneBuilder.cpp
portability
error: ``` ../src/lib/tvgMath.h:52:56: error: use of undeclared identifier 'M_PI_2' if (radian < FLT_EPSILON || mathEqual(radian, float(M_PI_2)) || mathEqual(radian, float(M_PI))) return true; ^ ../src/lib/tvgMath.h:52:92: error: use of undeclared identifier 'M_PI' if (radian < FLT_EPSILON || mathEqual(radian, float(M_PI_2)) || mathEqual(radian, float(M_PI))) return true; ^ ../src/lib/tvgMath.h:122:37: error: use of undeclared identifier 'M_PI' auto radian = degree / 180.0f * M_PI; ^ ``` Problem: * First, on Windows, one must define `_USE_MATH_DEFINES` before including `math.h` to get M_PI and other macros. * tvgSvgSceneBuilder.cpp includes first `cstring`, which includes `math.h`, and **after**, it includes `tvgMath.h`. So `_USE_MATH_DEFINES` is not defined when `math.h` is included from `cstring` * when `tvgMath.h` is included, `_USE_MATH_DEFINES` is defined, but `math.h` is not included anymore. Hence the problem. solution: in tvgSvgSceneBuilder.cpp, include `tvgMath.h` before the strings includes or else define `_USE_MATH_DEFINES` globally what do you think of this solutions ?
True
Windows and clang: compilation error in src/loaders/svg/tvgSvgSceneBuilder.cpp - error: ``` ../src/lib/tvgMath.h:52:56: error: use of undeclared identifier 'M_PI_2' if (radian < FLT_EPSILON || mathEqual(radian, float(M_PI_2)) || mathEqual(radian, float(M_PI))) return true; ^ ../src/lib/tvgMath.h:52:92: error: use of undeclared identifier 'M_PI' if (radian < FLT_EPSILON || mathEqual(radian, float(M_PI_2)) || mathEqual(radian, float(M_PI))) return true; ^ ../src/lib/tvgMath.h:122:37: error: use of undeclared identifier 'M_PI' auto radian = degree / 180.0f * M_PI; ^ ``` Problem: * First, on Windows, one must define `_USE_MATH_DEFINES` before including `math.h` to get M_PI and other macros. * tvgSvgSceneBuilder.cpp includes first `cstring`, which includes `math.h`, and **after**, it includes `tvgMath.h`. So `_USE_MATH_DEFINES` is not defined when `math.h` is included from `cstring` * when `tvgMath.h` is included, `_USE_MATH_DEFINES` is defined, but `math.h` is not included anymore. Hence the problem. solution: in tvgSvgSceneBuilder.cpp, include `tvgMath.h` before the strings includes or else define `_USE_MATH_DEFINES` globally what do you think of this solutions ?
port
windows and clang compilation error in src loaders svg tvgsvgscenebuilder cpp error src lib tvgmath h error use of undeclared identifier m pi if radian flt epsilon mathequal radian float m pi mathequal radian float m pi return true src lib tvgmath h error use of undeclared identifier m pi if radian flt epsilon mathequal radian float m pi mathequal radian float m pi return true src lib tvgmath h error use of undeclared identifier m pi auto radian degree m pi problem first on windows one must define use math defines before including math h to get m pi and other macros tvgsvgscenebuilder cpp includes first cstring which includes math h and after it includes tvgmath h so use math defines is not defined when math h is included from cstring when tvgmath h is included use math defines is defined but math h is not included anymore hence the problem solution in tvgsvgscenebuilder cpp include tvgmath h before the strings includes or else define use math defines globally what do you think of this solutions
1
499,917
14,482,424,206
IssuesEvent
2020-12-10 13:58:07
traefik/traefik
https://api.github.com/repos/traefik/traefik
closed
IngressRoute resource allows cross-namespace routing by default
area/provider/k8s/crd kind/bug/confirmed priority/P1
<!-- PLEASE FOLLOW THE ISSUE TEMPLATE TO HELP TRIAGE AND SUPPORT! --> ### Do you want to request a *feature* or report a *bug*? <!-- DO NOT FILE ISSUES FOR GENERAL SUPPORT QUESTIONS. The issue tracker is for reporting bugs and feature requests only. For end-user related support questions, please refer to one of the following: - the Traefik community forum: https://community.containo.us/ --> Bug <!-- The configurations between 1.X and 2.X are NOT compatible. Please have a look here https://doc.traefik.io/traefik/getting-started/configuration-overview/. --> ### What did you do? This is a replication and investigation report for the issue reported in https://github.com/containous/traefik/issues/7151 and discussed on the community forums https://community.traefik.io/t/cross-namespaces-ingressroutes-and-services/7419/19 - Prior to Traefik v2.1, `IngressRoute` was not capable of routing to services outside of its own "root" namespace - This limitation appears to coincide with the constraints on K8S native `Ingress` based on an [issue reported](https://github.com/containous/traefik/issues/5748#issuecomment-547323939) running Traefik 2.0.4 - This limitation was removed with the introduction of [this PR](https://github.com/traefik/traefik/pull/5711), and shipped with Traefik v2.1 to current. - While a user can [define which namespaces can be watched](https://doc.traefik.io/traefik/providers/kubernetes-crd/#namespaces), this does not constrain the user's ability to restrict `IngressRoute` to their respective namespaces, which is based on an expectation set by native K8S `Ingress` objects and the behavior of `IngressRoute` prior to 2.1. - This behavior is specific to Traefik and `IngressRoute` and I could reproduce the regressed behavior on v2.0, and produce the cross-namespace behavior on 2.3 on K8S 1.16, 1.17, and 1.18 (this is not something K8S will restrict) <!-- HOW TO WRITE A GOOD BUG REPORT? - Respect the issue template as much as possible. 
- The title should be short and descriptive. - Explain the conditions which led you to report this issue: the context. - The context should lead to something, an idea or a problem that you’re facing. - Remain clear and concise. - Format your messages to help the reader focus on what matters and understand the structure of your message, use Markdown syntax https://help.github.com/articles/github-flavored-markdown --> ### What did you expect to see? `IngressRoute` provider configuration to contain a flag that enables cross-namespace routing behavior (this is a worthwhile feature, IMO). ### What did you see instead? By default, `IngressRoute` will allow users to cross-namespace boundaries even though the user would expect this behavior to be disallowed given current constraints on `Ingress` ### Output of `traefik version`: (_What version of Traefik are you using?_) * Traefik 2.0.4 * Traefik 2.3 <!-- `latest` is not considered as a valid version. For the Traefik Docker image: docker run [IMAGE] version ex: docker run traefik version --> ### What is your environment & configuration (arguments, toml, provider, platform, ...)? See original issue for environment / configuration. Additional tests were run on: * Kubernetes 1.16 * Kubernetes 1.17 <!-- Add more configuration information here. --> ### If applicable, please paste the log output in DEBUG level (`--log.level=DEBUG` switch) n/a
1.0
IngressRoute resource allows cross-namespace routing by default - <!-- PLEASE FOLLOW THE ISSUE TEMPLATE TO HELP TRIAGE AND SUPPORT! --> ### Do you want to request a *feature* or report a *bug*? <!-- DO NOT FILE ISSUES FOR GENERAL SUPPORT QUESTIONS. The issue tracker is for reporting bugs and feature requests only. For end-user related support questions, please refer to one of the following: - the Traefik community forum: https://community.containo.us/ --> Bug <!-- The configurations between 1.X and 2.X are NOT compatible. Please have a look here https://doc.traefik.io/traefik/getting-started/configuration-overview/. --> ### What did you do? This is a replication and investigation report for the issue reported in https://github.com/containous/traefik/issues/7151 and discussed on the community forums https://community.traefik.io/t/cross-namespaces-ingressroutes-and-services/7419/19 - Prior to Traefik v2.1, `IngressRoute` was not capable of routing to services outside of its own "root" namespace - This limitation appears to coincide with the constraints on K8S native `Ingress` based on an [issue reported](https://github.com/containous/traefik/issues/5748#issuecomment-547323939) running Traefik 2.0.4 - This limitation was removed with the introduction of [this PR](https://github.com/traefik/traefik/pull/5711), and shipped with Traefik v2.1 to current. - While a user can [define which namespaces can be watched](https://doc.traefik.io/traefik/providers/kubernetes-crd/#namespaces), this does not constrain the user's ability to restrict `IngressRoute` to their respective namespaces, which is based on an expectation set by native K8S `Ingress` objects and the behavior of `IngressRoute` prior to 2.1. - This behavior is specific to Traefik and `IngressRoute` and I could reproduce the regressed behavior on v2.0, and produce the cross-namespace behavior on 2.3 on K8S 1.16, 1.17, and 1.18 (this is not something K8S will restrict) <!-- HOW TO WRITE A GOOD BUG REPORT? 
- Respect the issue template as much as possible. - The title should be short and descriptive. - Explain the conditions which led you to report this issue: the context. - The context should lead to something, an idea or a problem that you’re facing. - Remain clear and concise. - Format your messages to help the reader focus on what matters and understand the structure of your message, use Markdown syntax https://help.github.com/articles/github-flavored-markdown --> ### What did you expect to see? `IngressRoute` provider configuration to contain a flag that enables cross-namespace routing behavior (this is a worthwhile feature, IMO). ### What did you see instead? By default, `IngressRoute` will allow users to cross-namespace boundaries even though the user would expect this behavior to be disallowed given current constraints on `Ingress` ### Output of `traefik version`: (_What version of Traefik are you using?_) * Traefik 2.0.4 * Traefik 2.3 <!-- `latest` is not considered as a valid version. For the Traefik Docker image: docker run [IMAGE] version ex: docker run traefik version --> ### What is your environment & configuration (arguments, toml, provider, platform, ...)? See original issue for environment / configuration. Additional tests were run on: * Kubernetes 1.16 * Kubernetes 1.17 <!-- Add more configuration information here. --> ### If applicable, please paste the log output in DEBUG level (`--log.level=DEBUG` switch) n/a
non_port
ingressroute resource allows cross namespace routing by default do you want to request a feature or report a bug do not file issues for general support questions the issue tracker is for reporting bugs and feature requests only for end user related support questions please refer to one of the following the traefik community forum bug the configurations between x and x are not compatible please have a look here what did you do this is a replication and investigation report for the issue reported in and discussed on the community forums prior to traefik ingressroute was not capable of routing to services outside of its own root namespace this limitation appears to coincide with the constraints on native ingress based on an running traefik this limitation was removed with the introduction of and shipped with traefik to current while a user can this does not constrain the user s ability to restrict ingressroute to their respective namespaces which is based on an expectation set by native ingress objects and the behavior of ingressroute prior to this behavior is specific to traefik and ingressroute and i could reproduce the regressed behavior on and produce the cross namespace behavior on on and this is not something will restrict how to write a good bug report respect the issue template as much as possible the title should be short and descriptive explain the conditions which led you to report this issue the context the context should lead to something an idea or a problem that you’re facing remain clear and concise format your messages to help the reader focus on what matters and understand the structure of your message use markdown syntax what did you expect to see ingressroute provider configuration to contain a flag that enables cross namespace routing behavior this is a worthwhile feature imo what did you see instead by default ingressroute will allow users to cross namespace boundaries even though the user would expect this behavior to be disallowed given current constraints on ingress output of traefik version what version of traefik are you using traefik traefik latest is not considered as a valid version for the traefik docker image docker run version ex docker run traefik version what is your environment configuration arguments toml provider platform see original issue for environment configuration additional tests were run on kubernetes kubernetes add more configuration information here if applicable please paste the log output in debug level log level debug switch n a
0
280,204
30,805,164,182
IssuesEvent
2023-08-01 06:33:15
Satheesh575555/linux-4.1.15
https://api.github.com/repos/Satheesh575555/linux-4.1.15
reopened
CVE-2022-25258 (Medium) detected in linuxlinux-4.6
Mend: dependency security vulnerability
## CVE-2022-25258 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.6</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in drivers/usb/gadget/composite.c in the Linux kernel before 5.16.10. The USB Gadget subsystem lacks certain validation of interface OS descriptor requests (ones with a large array index and ones associated with NULL function pointer retrieval). Memory corruption might occur. <p>Publish Date: 2022-02-16 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25258>CVE-2022-25258</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Physical - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25258">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25258</a></p> <p>Release Date: 2022-02-16</p> <p>Fix Resolution: v5.17-rc4</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-25258 (Medium) detected in linuxlinux-4.6 - ## CVE-2022-25258 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.6</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in drivers/usb/gadget/composite.c in the Linux kernel before 5.16.10. The USB Gadget subsystem lacks certain validation of interface OS descriptor requests (ones with a large array index and ones associated with NULL function pointer retrieval). Memory corruption might occur. <p>Publish Date: 2022-02-16 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25258>CVE-2022-25258</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Physical - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25258">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25258</a></p> <p>Release Date: 2022-02-16</p> <p>Fix Resolution: v5.17-rc4</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_port
cve medium detected in linuxlinux cve medium severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in base branch master vulnerable source files vulnerability details an issue was discovered in drivers usb gadget composite c in the linux kernel before the usb gadget subsystem lacks certain validation of interface os descriptor requests ones with a large array index and ones associated with null function pointer retrieval memory corruption might occur publish date url a href cvss score details base score metrics exploitability metrics attack vector physical attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
0
791
10,353,476,024
IssuesEvent
2019-09-05 11:41:00
ToFuProject/tofu
https://api.github.com/repos/ToFuProject/tofu
closed
Support of python 3.7
Fixed in the next release enhancement portability
I was trying to configure Travis tests with Python version 3.7, after some configurations, I just realized there is a bottom line: > conda.exceptions.UnsatisfiableError: The following specifications were found to be in conflict: > - polygon3 -> python[version='>=3.6,<3.7.0a0'] > - python=3.7 So we either try to move forward without the polygon library or we drop the support of python 3.7 (knowing that for the future this still will have to be solved)
True
Support of python 3.7 - I was trying to configure Travis tests with Python version 3.7, after some configurations, I just realized there is a bottom line: > conda.exceptions.UnsatisfiableError: The following specifications were found to be in conflict: > - polygon3 -> python[version='>=3.6,<3.7.0a0'] > - python=3.7 So we either try to move forward without the polygon library or we drop the support of python 3.7 (knowing that for the future this still will have to be solved)
port
support of python i was trying to configure travis tests with python version after some configurations i just realized there is a bottom line conda exceptions unsatisfiableerror the following specifications were found to be in conflict python python so we either try to move forward without the polygon library or we drop the support of python knowing that for the future this still will have to be solved
1
755
10,142,903,362
IssuesEvent
2019-08-04 06:47:50
roc-project/roc
https://api.github.com/repos/roc-project/roc
closed
Add support for musl libc
portability
Currently it's assumed that glibc is installed when the target is linux. This is not always the case. A distro like alpine, which makes a lot of sense for devices like the rpi 0, ships with musl by default. ``` $ gcc -v -E 2>&1| grep Target: Target: armv6-alpine-linux-musleabihf ```
True
Add support for musl libc - Currently it's assumed that glibc is installed when the target is linux. This is not always the case. A distro like alpine, which makes a lot of sense for devices like the rpi 0, ships with musl by default. ``` $ gcc -v -E 2>&1| grep Target: Target: armv6-alpine-linux-musleabihf ```
port
add support for musl libc currently it s assumed that glibc is installed when the target is linux this is not always the case a distro like alpine which makes a lot of sense for devices like the rpi ships with musl by default gcc v e grep target target alpine linux musleabihf
1
31,459
5,953,732,962
IssuesEvent
2017-05-27 10:42:48
ada-students/apprentice-handbook
https://api.github.com/repos/ada-students/apprentice-handbook
closed
Pseudocode
documentation in-progress
* **Year**: One * **Subject**: DSA * **Chapter**: Basic ideas / skills * **Section(s)**: * [ ] How to write pseudocode I'm assigning @joehalloran to this one 🙄
1.0
Pseudocode - * **Year**: One * **Subject**: DSA * **Chapter**: Basic ideas / skills * **Section(s)**: * [ ] How to write pseudocode I'm assigning @joehalloran to this one 🙄
non_port
pseudocode year one subject dsa chapter basic ideas skills section s how to write pseudocode i m assigning joehalloran to this one 🙄
0
16,903
22,215,784,060
IssuesEvent
2022-06-08 01:23:21
hashgraph/hedera-json-rpc-relay
https://api.github.com/repos/hashgraph/hedera-json-rpc-relay
opened
ENhance acceptance tests to run against deployed environments
enhancement P2 process
### Problem The acceptance tests were meant to run against any environment regardless of local vs deployed. However, during implementation we missed making the Mirror client configureable ### Solution Update acceptance test file and allow configuration of the Mirror Client to point to any IP not just localhost ### Alternatives _No response_
1.0
ENhance acceptance tests to run against deployed environments - ### Problem The acceptance tests were meant to run against any environment regardless of local vs deployed. However, during implementation we missed making the Mirror client configureable ### Solution Update acceptance test file and allow configuration of the Mirror Client to point to any IP not just localhost ### Alternatives _No response_
non_port
enhance acceptance tests to run against deployed environments problem the acceptance tests were meant to run against any environment regardless of local vs deployed however during implementation we missed making the mirror client configureable solution update acceptance test file and allow configuration of the mirror client to point to any ip not just localhost alternatives no response
0
181,290
6,658,182,132
IssuesEvent
2017-09-30 16:06:57
enforcer574/smashclub
https://api.github.com/repos/enforcer574/smashclub
opened
Log in by email address
Minor Change Priority 3
Change the system to use email addresses for login instead of gamertags. This improves security and allows gamertag changing.
1.0
Log in by email address - Change the system to use email addresses for login instead of gamertags. This improves security and allows gamertag changing.
non_port
log in by email address change the system to use email addresses for login instead of gamertags this improves security and allows gamertag changing
0
75,283
9,834,461,664
IssuesEvent
2019-06-17 09:44:54
ethereum/solidity
https://api.github.com/repos/ethereum/solidity
closed
[DOCS] Ensure warnings, notes etc are in logical orders
documentation :book:
## Description In my opinion when there are multiple sequences of notes, warnings etc, warnings should appear on top.
1.0
[DOCS] Ensure warnings, notes etc are in logical orders - ## Description In my opinion when there are multiple sequences of notes, warnings etc, warnings should appear on top.
non_port
ensure warnings notes etc are in logical orders description in my opinion when there are multiple sequences of notes warnings etc warnings should appear on top
0
66,604
27,525,455,299
IssuesEvent
2023-03-06 17:42:45
hashicorp/terraform-provider-aws
https://api.github.com/repos/hashicorp/terraform-provider-aws
closed
Error creating SES configuration set: ConfigurationSetAlreadyExists: Configuration set <ses-verify-email-identity-config-set> already exists.
bug service/ses stale
Hi, I am getting the error below when I execute a aws_ses_configuration_set resource. The resource does not exist prior to me creating it. AWS shows the resource existing after the script stops. When I delete the configuration set manually from AWS and rerun the script, I get the same result. Error occurred: Error: Error creating SES configuration set: ConfigurationSetAlreadyExists: Configuration set <ses-verify-email-identity-config-set> already exists. status code: 400, request id: 8b484c22-5c36-44ed-bd21-2215a2bc7716 on ../ses/main.tf line 1, in resource "aws_ses_configuration_set" "ses_verify_email_identity_configuration_set": 1: resource "aws_ses_configuration_set" "ses_verify_email_identity_configuration_set" { ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Terraform Version v0.12.3 <!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code ---> ```hcl resource "aws_ses_configuration_set" "ses_verify_email_identity_configuration_set" { name = "ses-verify-email-identity-config-set" } resource "aws_ses_event_destination" "ses" { name = "ses-verify-email-identity-destination-sns" configuration_set_name = "${aws_ses_configuration_set.ses_verify_email_identity_configuration_set.name}" enabled = true matching_types = ["bounce", "send", "reject", "complaint", "delivery", "open", "click", "renderingFailure"] sns_destination { topic_arn = "${var.sns_verify_email_identity_topic_arn}" } } ```
1.0
Error creating SES configuration set: ConfigurationSetAlreadyExists: Configuration set <ses-verify-email-identity-config-set> already exists. - Hi, I am getting the error below when I execute a aws_ses_configuration_set resource. The resource does not exist prior to me creating it. AWS shows the resource existing after the script stops. When I delete the configuration set manually from AWS and rerun the script, I get the same result. Error occurred: Error: Error creating SES configuration set: ConfigurationSetAlreadyExists: Configuration set <ses-verify-email-identity-config-set> already exists. status code: 400, request id: 8b484c22-5c36-44ed-bd21-2215a2bc7716 on ../ses/main.tf line 1, in resource "aws_ses_configuration_set" "ses_verify_email_identity_configuration_set": 1: resource "aws_ses_configuration_set" "ses_verify_email_identity_configuration_set" { ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Terraform Version v0.12.3 <!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code ---> ```hcl resource "aws_ses_configuration_set" "ses_verify_email_identity_configuration_set" { name = "ses-verify-email-identity-config-set" } resource "aws_ses_event_destination" "ses" { name = "ses-verify-email-identity-destination-sns" configuration_set_name = "${aws_ses_configuration_set.ses_verify_email_identity_configuration_set.name}" enabled = true matching_types = ["bounce", "send", "reject", "complaint", "delivery", "open", "click", "renderingFailure"] sns_destination { topic_arn = "${var.sns_verify_email_identity_topic_arn}" } } ```
non_port
error creating ses configuration set configurationsetalreadyexists configuration set already exists hi i am getting the error below when i execute a aws ses configuration set resource the resource does not exist prior to me creating it aws shows the resource existing after the script stops when i delete the configuration set manually from aws and rerun the script i get the same result error occurred error error creating ses configuration set configurationsetalreadyexists configuration set already exists status code request id on ses main tf line in resource aws ses configuration set ses verify email identity configuration set resource aws ses configuration set ses verify email identity configuration set community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment terraform version hcl resource aws ses configuration set ses verify email identity configuration set name ses verify email identity config set resource aws ses event destination ses name ses verify email identity destination sns configuration set name aws ses configuration set ses verify email identity configuration set name enabled true matching types sns destination topic arn var sns verify email identity topic arn
0
473
6,897,163,333
IssuesEvent
2017-11-23 23:36:37
craigbarnes/dte
https://api.github.com/repos/craigbarnes/dte
closed
Use terminfo or a built-in TerminalInfo entry to handle the Backspace key
portability
The default <kbd>Backspace</kbd> key binding is currently just handled by the simplistic command `bind ^\? erase` in [`config/binding/default`]. However, the equivalence between <kbd>Backspace</kbd> and `^?` isn't applicable to some terminals, including the FreeBSD VT. This is a portability problem and should be fixed by handling <kbd>Backspace</kbd> as a special key and querying terminfo (or a built-in `TerminalInfo` entry) for the correct escape sequence to use. [`config/binding/default`]: https://github.com/craigbarnes/dte/blob/9b0bc56bfc4633b015b06998792a9d3d64e6af4e/config/binding/default#L11
True
Use terminfo or a built-in TerminalInfo entry to handle the Backspace key - The default <kbd>Backspace</kbd> key binding is currently just handled by the simplistic command `bind ^\? erase` in [`config/binding/default`]. However, the equivalence between <kbd>Backspace</kbd> and `^?` isn't applicable to some terminals, including the FreeBSD VT. This is a portability problem and should be fixed by handling <kbd>Backspace</kbd> as a special key and querying terminfo (or a built-in `TerminalInfo` entry) for the correct escape sequence to use. [`config/binding/default`]: https://github.com/craigbarnes/dte/blob/9b0bc56bfc4633b015b06998792a9d3d64e6af4e/config/binding/default#L11
port
use terminfo or a built in terminalinfo entry to handle the backspace key the default backspace key binding is currently just handled by the simplistic command bind erase in however the equivalence between backspace and isn t applicable to some terminals including the freebsd vt this is a portability problem and should be fixed by handling backspace as a special key and querying terminfo or a built in terminalinfo entry for the correct escape sequence to use
1
204,778
7,090,850,603
IssuesEvent
2018-01-12 10:34:40
advaita-krishna-das/gem
https://api.github.com/repos/advaita-krishna-das/gem
opened
Export comments to file
High priority enhancement ui
Взято из: #104 Comments view: На этапе комментирования сделать возможность сохранить комментарии в документе и экспортировать их в виде текстового файла.
1.0
Export comments to file - Взято из: #104 Comments view: На этапе комментирования сделать возможность сохранить комментарии в документе и экспортировать их в виде текстового файла.
non_port
export comments to file взято из comments view на этапе комментирования сделать возможность сохранить комментарии в документе и экспортировать их в виде текстового файла
0
187,342
15,096,906,261
IssuesEvent
2021-02-07 16:45:46
hackforla/civic-opportunity
https://api.github.com/repos/hackforla/civic-opportunity
opened
Recruiting Pipelines documents and messaging
documentation
### Overview Gather all 'Recruiting Pipelines' documents and messaging in one place ### Action Items Identify the following three documents for each CoP: - 'Recruiting Pipelines' google form - 'Recruiting Pipelines' response spreadsheet - Messaging document or examples of messaging in Slack Link to documents in Resources for the each CoP: - [ ] UI/UX - Design - [ ] UI/UX - Research - [ ] Data Science - [ ] Development (programming) - [ ] Ops - [ ] Product Management - [ ] Marketing - [ ] Fundraising ### Resources/Instructions REPLACE THIS TEXT -If there is a website which has documentation that helps with this issue provide the link(s) here.
1.0
Recruiting Pipelines documents and messaging - ### Overview Gather all 'Recruiting Pipelines' documents and messaging in one place ### Action Items Identify the following three documents for each CoP: - 'Recruiting Pipelines' google form - 'Recruiting Pipelines' response spreadsheet - Messaging document or examples of messaging in Slack Link to documents in Resources for the each CoP: - [ ] UI/UX - Design - [ ] UI/UX - Research - [ ] Data Science - [ ] Development (programming) - [ ] Ops - [ ] Product Management - [ ] Marketing - [ ] Fundraising ### Resources/Instructions REPLACE THIS TEXT -If there is a website which has documentation that helps with this issue provide the link(s) here.
non_port
recruiting pipelines documents and messaging overview gather all recruiting pipelines documents and messaging in one place action items identify the following three documents for each cop recruiting pipelines google form recruiting pipelines response spreadsheet messaging document or examples of messaging in slack link to documents in resources for the each cop ui ux design ui ux research data science development programming ops product management marketing fundraising resources instructions replace this text if there is a website which has documentation that helps with this issue provide the link s here
0
574
7,974,160,684
IssuesEvent
2018-07-17 03:40:00
PowerShell/PowerShell
https://api.github.com/repos/PowerShell/PowerShell
closed
Respect CompatiblePSEditions field in module manifests, with sane default
Area-Engine Area-Portability Issue-Enhancement Resolution-Fixed
For Windows PowerShell/PowerShell Core modules to have a clear cross-compatibility story, we need PowerShell Core to respect the `CompatiblePSEditions` field. This means, for PowerShell Core: * `Get-Module -ListAvailable` lists only those modules on the module path that are Core compatible * `Import-Module` will only import a module that is Core compatible and throw an error otherwise * The absence of the `CompatiblePSEditions` field should imply a sane default. The current suggested default is `CompatiblePSEditions = @('Desktop')`, i.e. compatible with Windows PowerShell (since there is no such field in Windows PowerShell and modules written for it only would lack this field by default).
True
Respect CompatiblePSEditions field in module manifests, with sane default - For Windows PowerShell/PowerShell Core modules to have a clear cross-compatibility story, we need PowerShell Core to respect the `CompatiblePSEditions` field. This means, for PowerShell Core: * `Get-Module -ListAvailable` lists only those modules on the module path that are Core compatible * `Import-Module` will only import a module that is Core compatible and throw an error otherwise * The absence of the `CompatiblePSEditions` field should imply a sane default. The current suggested default is `CompatiblePSEditions = @('Desktop')`, i.e. compatible with Windows PowerShell (since there is no such field in Windows PowerShell and modules written for it only would lack this field by default).
port
respect compatiblepseditions field in module manifests with sane default for windows powershell powershell core modules to have a clear cross compatibility story we need powershell core to respect the compatiblepseditions field this means for powershell core get module listavailable lists only those modules on the module path that are core compatible import module will only import a module that is core compatible and throw an error otherwise the absence of the compatiblepseditions field should imply a sane default the current suggested default is compatiblepseditions desktop i e compatible with windows powershell since there is no such field in windows powershell and modules written for it only would lack this field by default
1
37
2,716,599,877
IssuesEvent
2015-04-10 20:08:49
svaarala/duktape
https://api.github.com/repos/svaarala/duktape
closed
Warnings (1.2.x) when compiling with fastint support
portability
I wanted to test out the new fastint support, but when defining DUK_OPT_FASTINT and compiling with /W3 in MSVC, I get the following warnings: ``` 1>duk_api_stack.c(84): warning C4244: 'return' : conversion from 'duk_int64_t' to 'duk_int_t', possible loss of data 1>duk_api_stack.c(141): warning C4244: 'return' : conversion from 'duk_int64_t' to 'duk_uint_t', possible loss of data 1>duk_tval.c(78): warning C4146: unary minus operator applied to unsigned type, result still unsigned ``` These look potentially disastrous. Compiler is MSVC 2013 (x86 cl.exe 18.0).
True
Warnings (1.2.x) when compiling with fastint support - I wanted to test out the new fastint support, but when defining DUK_OPT_FASTINT and compiling with /W3 in MSVC, I get the following warnings: ``` 1>duk_api_stack.c(84): warning C4244: 'return' : conversion from 'duk_int64_t' to 'duk_int_t', possible loss of data 1>duk_api_stack.c(141): warning C4244: 'return' : conversion from 'duk_int64_t' to 'duk_uint_t', possible loss of data 1>duk_tval.c(78): warning C4146: unary minus operator applied to unsigned type, result still unsigned ``` These look potentially disastrous. Compiler is MSVC 2013 (x86 cl.exe 18.0).
port
warnings x when compiling with fastint support i wanted to test out the new fastint support but when defining duk opt fastint and compiling with in msvc i get the following warnings duk api stack c warning return conversion from duk t to duk int t possible loss of data duk api stack c warning return conversion from duk t to duk uint t possible loss of data duk tval c warning unary minus operator applied to unsigned type result still unsigned these look potentially disastrous compiler is msvc cl exe
1
1,783
26,206,435,721
IssuesEvent
2023-01-03 23:17:11
golang/vulndb
https://api.github.com/repos/golang/vulndb
closed
x/vulndb: potential Go vuln in github.com/jessfraz/pastebinit: CVE-2018-25059
excluded: NOT_IMPORTABLE
CVE-2018-25059 references [github.com/jessfraz/pastebinit](https://github.com/jessfraz/pastebinit), which may be a Go module. Description: A vulnerability was found in pastebinit up to 0.2.2 and classified as critical. Affected by this issue is the function pasteHandler of the file server.go. The manipulation of the argument r.URL.Path leads to path traversal. Upgrading to version 0.2.3 is able to address this issue. The name of the patch is 1af2facb6d95976c532b7f8f82747d454a092272. It is recommended to upgrade the affected component. The identifier of this vulnerability is VDB-217040. References: - NIST: https://nvd.nist.gov/vuln/detail/CVE-2018-25059 - JSON: https://github.com/CVEProject/cvelist/tree/b4495c7a770bb3c933d6b51ee1aa9b8af831654f/2018/25xxx/CVE-2018-25059.json - web: https://vuldb.com/?id.217040 - web: https://vuldb.com/?ctiid.217040 - fix: https://github.com/jessfraz/pastebinit/pull/3 - fix: https://github.com/jessfraz/pastebinit/commit/1af2facb6d95976c532b7f8f82747d454a092272 - web: https://github.com/jessfraz/pastebinit/releases/tag/v0.2.3 - Imported by: https://pkg.go.dev/github.com/jessfraz/pastebinit?tab=importedby Cross references: No existing reports found with this module or alias. See [doc/triage.md](https://github.com/golang/vulndb/blob/master/doc/triage.md) for instructions on how to triage this report. ``` modules: - module: github.com/jessfraz/pastebinit packages: - package: pastebinit description: | A vulnerability was found in pastebinit up to 0.2.2 and classified as critical. Affected by this issue is the function pasteHandler of the file server.go. The manipulation of the argument r.URL.Path leads to path traversal. Upgrading to version 0.2.3 is able to address this issue. The name of the patch is 1af2facb6d95976c532b7f8f82747d454a092272. It is recommended to upgrade the affected component. The identifier of this vulnerability is VDB-217040. Eine Schwachstelle wurde in pastebinit bis 0.2.2 gefunden. Sie wurde als kritisch eingestuft. Davon betroffen ist die Funktion pasteHandler der Datei server.go. Durch Beeinflussen des Arguments r.URL.Path mit unbekannten Daten kann eine path traversal-Schwachstelle ausgenutzt werden. Ein Aktualisieren auf die Version 0.2.3 vermag dieses Problem zu lösen. Der Patch wird als 1af2facb6d95976c532b7f8f82747d454a092272 bezeichnet. Als bestmögliche Massnahme wird das Einspielen eines Upgrades empfohlen. cves: - CVE-2018-25059 references: - web: https://vuldb.com/?id.217040 - web: https://vuldb.com/?ctiid.217040 - fix: https://github.com/jessfraz/pastebinit/pull/3 - fix: https://github.com/jessfraz/pastebinit/commit/1af2facb6d95976c532b7f8f82747d454a092272 - web: https://github.com/jessfraz/pastebinit/releases/tag/v0.2.3 ```
True
x/vulndb: potential Go vuln in github.com/jessfraz/pastebinit: CVE-2018-25059 - CVE-2018-25059 references [github.com/jessfraz/pastebinit](https://github.com/jessfraz/pastebinit), which may be a Go module. Description: A vulnerability was found in pastebinit up to 0.2.2 and classified as critical. Affected by this issue is the function pasteHandler of the file server.go. The manipulation of the argument r.URL.Path leads to path traversal. Upgrading to version 0.2.3 is able to address this issue. The name of the patch is 1af2facb6d95976c532b7f8f82747d454a092272. It is recommended to upgrade the affected component. The identifier of this vulnerability is VDB-217040. References: - NIST: https://nvd.nist.gov/vuln/detail/CVE-2018-25059 - JSON: https://github.com/CVEProject/cvelist/tree/b4495c7a770bb3c933d6b51ee1aa9b8af831654f/2018/25xxx/CVE-2018-25059.json - web: https://vuldb.com/?id.217040 - web: https://vuldb.com/?ctiid.217040 - fix: https://github.com/jessfraz/pastebinit/pull/3 - fix: https://github.com/jessfraz/pastebinit/commit/1af2facb6d95976c532b7f8f82747d454a092272 - web: https://github.com/jessfraz/pastebinit/releases/tag/v0.2.3 - Imported by: https://pkg.go.dev/github.com/jessfraz/pastebinit?tab=importedby Cross references: No existing reports found with this module or alias. See [doc/triage.md](https://github.com/golang/vulndb/blob/master/doc/triage.md) for instructions on how to triage this report. ``` modules: - module: github.com/jessfraz/pastebinit packages: - package: pastebinit description: | A vulnerability was found in pastebinit up to 0.2.2 and classified as critical. Affected by this issue is the function pasteHandler of the file server.go. The manipulation of the argument r.URL.Path leads to path traversal. Upgrading to version 0.2.3 is able to address this issue. The name of the patch is 1af2facb6d95976c532b7f8f82747d454a092272. It is recommended to upgrade the affected component. The identifier of this vulnerability is VDB-217040. 
Eine Schwachstelle wurde in pastebinit bis 0.2.2 gefunden. Sie wurde als kritisch eingestuft. Davon betroffen ist die Funktion pasteHandler der Datei server.go. Durch Beeinflussen des Arguments r.URL.Path mit unbekannten Daten kann eine path traversal-Schwachstelle ausgenutzt werden. Ein Aktualisieren auf die Version 0.2.3 vermag dieses Problem zu lösen. Der Patch wird als 1af2facb6d95976c532b7f8f82747d454a092272 bezeichnet. Als bestmögliche Massnahme wird das Einspielen eines Upgrades empfohlen. cves: - CVE-2018-25059 references: - web: https://vuldb.com/?id.217040 - web: https://vuldb.com/?ctiid.217040 - fix: https://github.com/jessfraz/pastebinit/pull/3 - fix: https://github.com/jessfraz/pastebinit/commit/1af2facb6d95976c532b7f8f82747d454a092272 - web: https://github.com/jessfraz/pastebinit/releases/tag/v0.2.3 ```
port
x vulndb potential go vuln in github com jessfraz pastebinit cve cve references which may be a go module description a vulnerability was found in pastebinit up to and classified as critical affected by this issue is the function pastehandler of the file server go the manipulation of the argument r url path leads to path traversal upgrading to version is able to address this issue the name of the patch is it is recommended to upgrade the affected component the identifier of this vulnerability is vdb references nist json web web fix fix web imported by cross references no existing reports found with this module or alias see for instructions on how to triage this report modules module github com jessfraz pastebinit packages package pastebinit description a vulnerability was found in pastebinit up to and classified as critical affected by this issue is the function pastehandler of the file server go the manipulation of the argument r url path leads to path traversal upgrading to version is able to address this issue the name of the patch is it is recommended to upgrade the affected component the identifier of this vulnerability is vdb eine schwachstelle wurde in pastebinit bis gefunden sie wurde als kritisch eingestuft davon betroffen ist die funktion pastehandler der datei server go durch beeinflussen des arguments r url path mit unbekannten daten kann eine path traversal schwachstelle ausgenutzt werden ein aktualisieren auf die version vermag dieses problem zu lösen der patch wird als bezeichnet als bestmögliche massnahme wird das einspielen eines upgrades empfohlen cves cve references web web fix fix web
1
79
3,005,768,663
IssuesEvent
2015-07-27 04:11:21
stedolan/jq
https://api.github.com/repos/stedolan/jq
closed
Setup CI build?
portability
Dear all, Not exactly sure if this has been raised before, any chance we could setup CI e.g. Travis to do the testing? I just cloned the code and it failed tests. Best, Dong
True
Setup CI build? - Dear all, Not exactly sure if this has been raised before, any chance we could setup CI e.g. Travis to do the testing? I just cloned the code and it failed tests. Best, Dong
port
setup ci build dear all not exactly sure if this has been raised before any chance we could setup ci e g travis to do the testing i just cloned the code and it failed tests best dong
1
121,510
10,171,097,143
IssuesEvent
2019-08-08 07:31:52
eclipsesource/modelserver
https://api.github.com/repos/eclipsesource/modelserver
closed
JUnit test coverage for command client API: compound
test
JUnit test coverage for `compound` command in the client API: - [ ] EP call - [ ] subscription notifications
1.0
JUnit test coverage for command client API: compound - JUnit test coverage for `compound` command in the client API: - [ ] EP call - [ ] subscription notifications
non_port
junit test coverage for command client api compound junit test coverage for compound command in the client api ep call subscription notifications
0
306,394
9,392,640,174
IssuesEvent
2019-04-07 03:07:47
tra38/Paranoia_Super_Mission_Generator
https://api.github.com/repos/tra38/Paranoia_Super_Mission_Generator
closed
Campaign Generator
low-priority
**User Story (MVP)** - User should be able to run a "loosely connected" campaign, where the only thing connecting one-shots are (a) references to past sessions, and (b) certain characters making appearances again. **Post-MVP** - A "fully connected" campaign, with multiple references to past sessions and an actual plot of some sort. (This may very well be maddening to code.) **Notes** - If there is no demand at my tabletop group for campaigns, then this user story probably won't be worked on. It, however, would be cool to have. **Priority** - Low
1.0
Campaign Generator - **User Story (MVP)** - User should be able to run a "loosely connected" campaign, where the only thing connecting one-shots are (a) references to past sessions, and (b) certain characters making appearances again. **Post-MVP** - A "fully connected" campaign, with multiple references to past sessions and an actual plot of some sort. (This may very well be maddening to code.) **Notes** - If there is no demand at my tabletop group for campaigns, then this user story probably won't be worked on. It, however, would be cool to have. **Priority** - Low
non_port
campaign generator user story mvp user should be able to run a loosely connected campaign where the only thing connecting one shots are a references to past sessions and b certain characters making appearances again post mvp a fully connected campaign with multiple references to past sessions and an actual plot of some sort this may very well be maddening to code notes if there is no demand at my tabletop group for campaigns then this user story probably won t be worked on it however would be cool to have priority low
0
1,365
19,598,730,734
IssuesEvent
2022-01-05 21:25:39
Azure/Azure-Functions
https://api.github.com/repos/Azure/Azure-Functions
closed
Create breaking change detector (main detector/aggregator)
feature supportability
Work to track the detector that will provide migration help from 3.0 to 4.0 We need detectors for the following: https://github.com/Azure/Azure-Functions/issues?q=is%3Aissue+is%3Aopen+label%3A%22Breaking+Change%3A+Approved%22 This issue tracks building the main detector "collection" page and drive work with owners of the different breaking changes to implement the detectors.
True
Create breaking change detector (main detector/aggregator) - Work to track the detector that will provide migration help from 3.0 to 4.0 We need detectors for the following: https://github.com/Azure/Azure-Functions/issues?q=is%3Aissue+is%3Aopen+label%3A%22Breaking+Change%3A+Approved%22 This issue tracks building the main detector "collection" page and drive work with owners of the different breaking changes to implement the detectors.
port
create breaking change detector main detector aggregator work to track the detector that will provide migration help from to we need detectors for the following this issue tracks building the main detector collection page and drive work with owners of the different breaking changes to implement the detectors
1
368,032
10,864,992,349
IssuesEvent
2019-11-14 18:00:55
siteorigin/so-widgets-bundle
https://api.github.com/repos/siteorigin/so-widgets-bundle
opened
SO Widget Block: Features defaults aren't present
bug priority-2
In the SO Widget Block, the SiteOrigin Features defaults aren't present. <img width="229" alt="Edit_Page_‹_SiteOrigin_—_WordPress" src="https://user-images.githubusercontent.com/789159/68883301-65b29280-0719-11ea-91d5-598d9d1a67ce.png"> Icon container size Icon size Features per row Empty, empty and zero are incorrect defaults for those fields.
1.0
SO Widget Block: Features defaults aren't present - In the SO Widget Block, the SiteOrigin Features defaults aren't present. <img width="229" alt="Edit_Page_‹_SiteOrigin_—_WordPress" src="https://user-images.githubusercontent.com/789159/68883301-65b29280-0719-11ea-91d5-598d9d1a67ce.png"> Icon container size Icon size Features per row Empty, empty and zero are incorrect defaults for those fields.
non_port
so widget block features defaults aren t present in the so widget block the siteorigin features defaults aren t present img width alt edit page ‹ siteorigin — wordpress src icon container size icon size features per row empty empty and zero are incorrect defaults for those fields
0
1,013
12,892,161,291
IssuesEvent
2020-07-13 19:05:21
esnet/iperf
https://api.github.com/repos/esnet/iperf
closed
Compilation fails on Solaris 10 Sparc - Iperf 3.1.3
Help Wanted portability
# /usr/sfw/bin/gmake (...) Making all in src gmake[1]: Entering directory `/srv/source/iperf-3.1.3/src' /usr/sfw/bin/gmake all-am gmake[2]: Entering directory`/srv/source/iperf-3.1.3/src' /bin/bash ../libtool --tag=CC --mode=link cc -g -g -g -R/srv//source/iperf-3.1.3/src/.libs/ -L/srv/source/iperf-3.1.3/src/.libs/ -o iperf3 iperf3-main.o libiperf.la -lsctp -lnsl -lsocket -lresolv -lrt -lm libtool: link: cc -g -g -g -o .libs/iperf3 iperf3-main.o -L/srv/source/iperf-3.1.3/src/.libs/ ./.libs/libiperf.so -lsctp -lnsl -lsocket -lresolv -lrt -lm -R/usr/local/lib -R/srv//source/iperf-3.1.3/src/.libs/ Undefined first referenced symbol in file htonll ./.libs/libiperf.so ntohll ./.libs/libiperf.so ld: fatal: symbol referencing errors. No output written to .libs/iperf3 gmake[2]: *** [iperf3] Error 1 gmake[2]: Leaving directory `/srv/source/iperf-3.1.3/src' gmake[1]: *** [all] Error 2 gmake[1]: Leaving directory`/srv/source/iperf-3.1.3/src' gmake: *** [all-recursive] Error 1
True
Compilation fails on Solaris 10 Sparc - Iperf 3.1.3 - # /usr/sfw/bin/gmake (...) Making all in src gmake[1]: Entering directory `/srv/source/iperf-3.1.3/src' /usr/sfw/bin/gmake all-am gmake[2]: Entering directory`/srv/source/iperf-3.1.3/src' /bin/bash ../libtool --tag=CC --mode=link cc -g -g -g -R/srv//source/iperf-3.1.3/src/.libs/ -L/srv/source/iperf-3.1.3/src/.libs/ -o iperf3 iperf3-main.o libiperf.la -lsctp -lnsl -lsocket -lresolv -lrt -lm libtool: link: cc -g -g -g -o .libs/iperf3 iperf3-main.o -L/srv/source/iperf-3.1.3/src/.libs/ ./.libs/libiperf.so -lsctp -lnsl -lsocket -lresolv -lrt -lm -R/usr/local/lib -R/srv//source/iperf-3.1.3/src/.libs/ Undefined first referenced symbol in file htonll ./.libs/libiperf.so ntohll ./.libs/libiperf.so ld: fatal: symbol referencing errors. No output written to .libs/iperf3 gmake[2]: *** [iperf3] Error 1 gmake[2]: Leaving directory `/srv/source/iperf-3.1.3/src' gmake[1]: *** [all] Error 2 gmake[1]: Leaving directory`/srv/source/iperf-3.1.3/src' gmake: *** [all-recursive] Error 1
port
compilation fails on solaris sparc iperf usr sfw bin gmake making all in src gmake entering directory srv source iperf src usr sfw bin gmake all am gmake entering directory srv source iperf src bin bash libtool tag cc mode link cc g g g r srv source iperf src libs l srv source iperf src libs o main o libiperf la lsctp lnsl lsocket lresolv lrt lm libtool link cc g g g o libs main o l srv source iperf src libs libs libiperf so lsctp lnsl lsocket lresolv lrt lm r usr local lib r srv source iperf src libs undefined first referenced symbol in file htonll libs libiperf so ntohll libs libiperf so ld fatal symbol referencing errors no output written to libs gmake error gmake leaving directory srv source iperf src gmake error gmake leaving directory srv source iperf src gmake error
1
211,873
23,851,162,663
IssuesEvent
2022-09-06 18:04:17
xmidt-org/talaria
https://api.github.com/repos/xmidt-org/talaria
closed
CVE-2022-29526 (Medium) detected in github.com/hashicorp/go-sockaddr-v1.0.2 - autoclosed
security vulnerability
## CVE-2022-29526 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>github.com/hashicorp/go-sockaddr-v1.0.2</b></p></summary> <p>IP Address/UNIX Socket convenience functions for Go</p> <p> Dependency Hierarchy: - github.com/xmidt-org/webpa-common/v2-v2.0.7-dev.1 (Root Library) - github.com/hashicorp/consul/api-v1.13.1 - github.com/hashicorp/serf-v0.9.8 - github.com/hashicorp/memberlist-v0.3.0 - :x: **github.com/hashicorp/go-sockaddr-v1.0.2** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/xmidt-org/talaria/commit/5585120e6948118205d4650798b0373c28b1ec78">5585120e6948118205d4650798b0373c28b1ec78</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Go before 1.17.10 and 1.18.x before 1.18.2 has Incorrect Privilege Assignment. When called with a non-zero flags parameter, the Faccessat function could incorrectly report that a file is accessible. <p>Publish Date: 2022-06-23 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-29526>CVE-2022-29526</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://security-tracker.debian.org/tracker/CVE-2022-29526">https://security-tracker.debian.org/tracker/CVE-2022-29526</a></p> <p>Release Date: 2022-06-23</p> <p>Fix Resolution: go1.17.10,go1.18.2,go1.19</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-29526 (Medium) detected in github.com/hashicorp/go-sockaddr-v1.0.2 - autoclosed - ## CVE-2022-29526 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>github.com/hashicorp/go-sockaddr-v1.0.2</b></p></summary> <p>IP Address/UNIX Socket convenience functions for Go</p> <p> Dependency Hierarchy: - github.com/xmidt-org/webpa-common/v2-v2.0.7-dev.1 (Root Library) - github.com/hashicorp/consul/api-v1.13.1 - github.com/hashicorp/serf-v0.9.8 - github.com/hashicorp/memberlist-v0.3.0 - :x: **github.com/hashicorp/go-sockaddr-v1.0.2** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/xmidt-org/talaria/commit/5585120e6948118205d4650798b0373c28b1ec78">5585120e6948118205d4650798b0373c28b1ec78</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Go before 1.17.10 and 1.18.x before 1.18.2 has Incorrect Privilege Assignment. When called with a non-zero flags parameter, the Faccessat function could incorrectly report that a file is accessible. <p>Publish Date: 2022-06-23 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-29526>CVE-2022-29526</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://security-tracker.debian.org/tracker/CVE-2022-29526">https://security-tracker.debian.org/tracker/CVE-2022-29526</a></p> <p>Release Date: 2022-06-23</p> <p>Fix Resolution: go1.17.10,go1.18.2,go1.19</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_port
cve medium detected in github com hashicorp go sockaddr autoclosed cve medium severity vulnerability vulnerable library github com hashicorp go sockaddr ip address unix socket convenience functions for go dependency hierarchy github com xmidt org webpa common dev root library github com hashicorp consul api github com hashicorp serf github com hashicorp memberlist x github com hashicorp go sockaddr vulnerable library found in head commit a href found in base branch main vulnerability details go before and x before has incorrect privilege assignment when called with a non zero flags parameter the faccessat function could incorrectly report that a file is accessible publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
0
780
10,288,867,277
IssuesEvent
2019-08-27 14:20:38
MicrosoftDocs/sql-docs
https://api.github.com/repos/MicrosoftDocs/sql-docs
closed
fail to delete empty database from ssms
Pri1 sql/prod support-request supportability/tech
Hi! I try to delete a empty database by using ssms. But the console shows this message: "Cannot drop the database 'AuthServer', because it does not exist or you do not have permission." But there is one one user 'admin' in this whole database instance. Any idea about this? Thank you in advance!!! --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: f6cb23e7-dfcf-8e7e-9042-aa4ce18fdfe3 * Version Independent ID: 71341cc5-767d-74d1-d0d9-81c1c87540ad * Content: [Delete a Database - SQL Server](https://docs.microsoft.com/en-us/sql/relational-databases/databases/delete-a-database?view=sql-server-2017#feedback) * Content Source: [docs/relational-databases/databases/delete-a-database.md](https://github.com/MicrosoftDocs/sql-docs/blob/live/docs/relational-databases/databases/delete-a-database.md) * Product: **sql** * Technology: **supportability** * GitHub Login: @stevestein * Microsoft Alias: **sstein**
True
fail to delete empty database from ssms - Hi! I try to delete a empty database by using ssms. But the console shows this message: "Cannot drop the database 'AuthServer', because it does not exist or you do not have permission." But there is one one user 'admin' in this whole database instance. Any idea about this? Thank you in advance!!! --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: f6cb23e7-dfcf-8e7e-9042-aa4ce18fdfe3 * Version Independent ID: 71341cc5-767d-74d1-d0d9-81c1c87540ad * Content: [Delete a Database - SQL Server](https://docs.microsoft.com/en-us/sql/relational-databases/databases/delete-a-database?view=sql-server-2017#feedback) * Content Source: [docs/relational-databases/databases/delete-a-database.md](https://github.com/MicrosoftDocs/sql-docs/blob/live/docs/relational-databases/databases/delete-a-database.md) * Product: **sql** * Technology: **supportability** * GitHub Login: @stevestein * Microsoft Alias: **sstein**
port
fail to delete empty database from ssms hi i try to delete a empty database by using ssms but the console shows this message cannot drop the database authserver because it does not exist or you do not have permission but there is one one user admin in this whole database instance any idea about this thank you in advance document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id dfcf version independent id content content source product sql technology supportability github login stevestein microsoft alias sstein
1
1,761
25,945,290,710
IssuesEvent
2022-12-16 23:39:14
MicrosoftDocs/sql-docs
https://api.github.com/repos/MicrosoftDocs/sql-docs
closed
The code snippet is misleading and inaccurate
sql/prod supportability/tech doc-bug assigned-to-author Pri1
The code snippet is misleading and inaccurate. There is no need to specify a Database in a `Use` statement to query `sys.databases`. Furthermore, for those unfamiliar with TSQL, the reference to a Database called "AdventureWorks2012" is likely to confuse; especially if their SQL Server instance does not contain a Database by that name. #### Document Details ⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.* * ID: 0ef0bcf2-f4b6-0bc0-a568-36f425aa4c24 * Version Independent ID: faec9433-5f72-8ddc-28bf-d0821701a634 * Content: [View list of databases on SQL Server - SQL Server](https://learn.microsoft.com/en-us/sql/relational-databases/databases/view-a-list-of-databases-on-an-instance-of-sql-server?view=sql-server-ver16) * Content Source: [docs/relational-databases/databases/view-a-list-of-databases-on-an-instance-of-sql-server.md](https://github.com/MicrosoftDocs/sql-docs/blob/live/docs/relational-databases/databases/view-a-list-of-databases-on-an-instance-of-sql-server.md) * Product: **sql** * Technology: **supportability** * GitHub Login: @WilliamDAssafMSFT * Microsoft Alias: **wiassaf**
True
The code snippet is misleading and inaccurate - The code snippet is misleading and inaccurate. There is no need to specify a Database in a `Use` statement to query `sys.databases`. Furthermore, for those unfamiliar with TSQL, the reference to a Database called "AdventureWorks2012" is likely to confuse; especially if their SQL Server instance does not contain a Database by that name. #### Document Details ⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.* * ID: 0ef0bcf2-f4b6-0bc0-a568-36f425aa4c24 * Version Independent ID: faec9433-5f72-8ddc-28bf-d0821701a634 * Content: [View list of databases on SQL Server - SQL Server](https://learn.microsoft.com/en-us/sql/relational-databases/databases/view-a-list-of-databases-on-an-instance-of-sql-server?view=sql-server-ver16) * Content Source: [docs/relational-databases/databases/view-a-list-of-databases-on-an-instance-of-sql-server.md](https://github.com/MicrosoftDocs/sql-docs/blob/live/docs/relational-databases/databases/view-a-list-of-databases-on-an-instance-of-sql-server.md) * Product: **sql** * Technology: **supportability** * GitHub Login: @WilliamDAssafMSFT * Microsoft Alias: **wiassaf**
port
the code snippet is misleading and inaccurate the code snippet is misleading and inaccurate there is no need to specify a database in a use statement to query sys databases furthermore for those unfamiliar with tsql the reference to a database called is likely to confuse especially if their sql server instance does not contain a database by that name document details ⚠ do not edit this section it is required for learn microsoft com ➟ github issue linking id version independent id content content source product sql technology supportability github login williamdassafmsft microsoft alias wiassaf
1
332,299
10,090,865,724
IssuesEvent
2019-07-26 12:53:41
brave/brave-browser
https://api.github.com/repos/brave/brave-browser
closed
Move RewardsInternalsInfo to mojo interface
dev-concern feature/rewards priority/P5
We need to move `RewardsInternalsInfo` in `vendor/bat-native-ledger/include/bat/ledger/rewards_internals_info.h` into mojo interface. Example how to do this https://github.com/brave/brave-core/pull/2432
1.0
Move RewardsInternalsInfo to mojo interface - We need to move `RewardsInternalsInfo` in `vendor/bat-native-ledger/include/bat/ledger/rewards_internals_info.h` into mojo interface. Example how to do this https://github.com/brave/brave-core/pull/2432
non_port
move rewardsinternalsinfo to mojo interface we need to move rewardsinternalsinfo in vendor bat native ledger include bat ledger rewards internals info h into mojo interface example how to do this
0
90,793
18,256,732,113
IssuesEvent
2021-10-03 06:31:50
UNIST-Almight/ps-study-2021-fall
https://api.github.com/repos/UNIST-Almight/ps-study-2021-fall
closed
[초급반 3회차] C, D번 문제 코드 리뷰 (kts0921)
code review beginner week 3
[C번 문제 암호 만들기](http://boj.kr/d8087a8c8e8e48f7a480cf218a92fb5b) ```c++ void func(int index, int pre, int vowel, int consonant) { if (index == L) { if (vowel > 0 && consonant > 1) cout << password << "\n"; return; } for (int i = pre + 1; i < C; i++) { password[index] = arr[i]; func(index + 1, i, vowel + isVowel[arr[i] - 'a'], consonant + !isVowel[arr[i] - 'a']); } } ``` 백트래킹 함수를 위처럼 참고 자료를 보고 짰습니다. 함수의 매개변수가 4개가 사용된 건 처음인데 매개변수가 많은 경우가 PS에서 흔한가요? 매개변수를 줄이는 쪽으로 최적화를 해야하는지 궁금합니다. --- [D번 문제 N-Queen](http://boj.kr/4884228052b041cfb26d0473e9a83dc8) 이 문제도 자료를 보고 만들었습니다. 자유롭게 피드백 주시면 감사합니다. --- 30분 정도 늦게 올려서 죄송합니다..😢
1.0
[초급반 3회차] C, D번 문제 코드 리뷰 (kts0921) - [C번 문제 암호 만들기](http://boj.kr/d8087a8c8e8e48f7a480cf218a92fb5b) ```c++ void func(int index, int pre, int vowel, int consonant) { if (index == L) { if (vowel > 0 && consonant > 1) cout << password << "\n"; return; } for (int i = pre + 1; i < C; i++) { password[index] = arr[i]; func(index + 1, i, vowel + isVowel[arr[i] - 'a'], consonant + !isVowel[arr[i] - 'a']); } } ``` 백트래킹 함수를 위처럼 참고 자료를 보고 짰습니다. 함수의 매개변수가 4개가 사용된 건 처음인데 매개변수가 많은 경우가 PS에서 흔한가요? 매개변수를 줄이는 쪽으로 최적화를 해야하는지 궁금합니다. --- [D번 문제 N-Queen](http://boj.kr/4884228052b041cfb26d0473e9a83dc8) 이 문제도 자료를 보고 만들었습니다. 자유롭게 피드백 주시면 감사합니다. --- 30분 정도 늦게 올려서 죄송합니다..😢
non_port
c d번 문제 코드 리뷰 c void func int index int pre int vowel int consonant if index l if vowel consonant cout password n return for int i pre i c i password arr func index i vowel isvowel a consonant isvowel a 백트래킹 함수를 위처럼 참고 자료를 보고 짰습니다 함수의 매개변수가 사용된 건 처음인데 매개변수가 많은 경우가 ps에서 흔한가요 매개변수를 줄이는 쪽으로 최적화를 해야하는지 궁금합니다 이 문제도 자료를 보고 만들었습니다 자유롭게 피드백 주시면 감사합니다 정도 늦게 올려서 죄송합니다 😢
0
845
2,911,795,231
IssuesEvent
2015-06-22 12:06:27
searchisko/searchisko
https://api.github.com/repos/searchisko/searchisko
opened
Investigate search-guard plugin
enhancement security
[Search Guard](https://github.com/floragunncom/search-guard) is a free and open source plugin for Elasticsearch which provides security features. It seems to have some interesting features (like field level security for example - which is something that we have implemented ourselves for the main [search api](http://docs.jbossorg.apiary.io/#searchapi) but we are missing it for [registered queries](http://docs.jbossorg.apiary.io/#managementapiregisteredqueries) now).
True
Investigate search-guard plugin - [Search Guard](https://github.com/floragunncom/search-guard) is a free and open source plugin for Elasticsearch which provides security features. It seems to have some interesting features (like field level security for example - which is something that we have implemented ourselves for the main [search api](http://docs.jbossorg.apiary.io/#searchapi) but we are missing it for [registered queries](http://docs.jbossorg.apiary.io/#managementapiregisteredqueries) now).
non_port
investigate search guard plugin is a free and open source plugin for elasticsearch which provides security features it seems to have some interesting features like field level security for example which is something that we have implemented ourselves for the main but we are missing it for now
0
137,151
30,636,772,398
IssuesEvent
2023-07-24 18:24:18
ita-social-projects/StreetCode
https://api.github.com/repos/ita-social-projects/StreetCode
opened
[Admin/Interesting facts block] The "Підпис фото" field in the "Wow-факти" modal window limits up to 100 symbols
bug (Epic#2) Admin/New StreetCode
**Environment:** OS: Windows 10 Pro **Browser:** Google Chrome Version 111.0.5563.112. **Reproducible:** always. **Build found:** commit [d494c37](https://github.com/ita-social-projects/StreetCode/commit/d494c372c230bf30fef322fdb50405a1c708c55b) **Type:** Functional **Priority:** Medium **Severity:** Low **Preconditions:** 1. Go to the site. 2. Login as admin. 3. Open the new StreetCode page or the StreetCode page for editing. **Steps to reproduce** 1 Scroll down to the "Wow-факти" block. 3. Click on "+" button. 4. Pay atention to the "Підпис фото" field. **Actual result** The "Підпис фото" field in the "Wow-факти" modal window limits up to 100 symbols. **Expected result** The "Підпис фото" field in the "Wow-факти" modal window limits up to 200 symbols. **User story and test case links** User story #123 Test case #420 <img width="981" alt="Підпис фото" src="https://github.com/ita-social-projects/StreetCode/assets/135837034/d1723e5e-df22-4085-a421-7d6747c647b8">
1.0
[Admin/Interesting facts block] The "Підпис фото" field in the "Wow-факти" modal window limits up to 100 symbols - **Environment:** OS: Windows 10 Pro **Browser:** Google Chrome Version 111.0.5563.112. **Reproducible:** always. **Build found:** commit [d494c37](https://github.com/ita-social-projects/StreetCode/commit/d494c372c230bf30fef322fdb50405a1c708c55b) **Type:** Functional **Priority:** Medium **Severity:** Low **Preconditions:** 1. Go to the site. 2. Login as admin. 3. Open the new StreetCode page or the StreetCode page for editing. **Steps to reproduce** 1 Scroll down to the "Wow-факти" block. 3. Click on "+" button. 4. Pay atention to the "Підпис фото" field. **Actual result** The "Підпис фото" field in the "Wow-факти" modal window limits up to 100 symbols. **Expected result** The "Підпис фото" field in the "Wow-факти" modal window limits up to 200 symbols. **User story and test case links** User story #123 Test case #420 <img width="981" alt="Підпис фото" src="https://github.com/ita-social-projects/StreetCode/assets/135837034/d1723e5e-df22-4085-a421-7d6747c647b8">
non_port
the підпис фото field in the wow факти modal window limits up to symbols environment os windows pro browser google chrome version reproducible always build found commit type functional priority medium severity low preconditions go to the site login as admin open the new streetcode page or the streetcode page for editing steps to reproduce scroll down to the wow факти block click on button pay atention to the підпис фото field actual result the підпис фото field in the wow факти modal window limits up to symbols expected result the підпис фото field in the wow факти modal window limits up to symbols user story and test case links user story test case img width alt підпис фото src
0
785
10,348,824,905
IssuesEvent
2019-09-04 20:45:11
Azure/azure-webjobs-sdk
https://api.github.com/repos/Azure/azure-webjobs-sdk
opened
Detect if the v2 ApplicationInsights Service for v2 has been replaced by customer dependency injection.
Supportability
Customers are mistakenly overwriting the Application Insights dependency in Startup which results in missing data and dependencies not showing up in the Application Map. This is referred to in the documentation here. https://docs.microsoft.com/en-us/azure/azure-functions/functions-dotnet-dependency-injection#logging-services and here https://docs.microsoft.com/en-us/azure/azure-functions/functions-monitoring#version-2x-3 #### Repro steps Provide the steps required to reproduce the problem 1. Create a New Functions Project 2. Add a Startup class. 3. Inject your own instance of TelemetryClient serviceCollection.AddSingleton<TelemetryClient>(sp => { var configuration = TelemetryConfiguration.CreateDefault(); configuration.TelemetryProcessorChainBuilder.Use(next => { var qpProcessor = new QuickPulseTelemetryProcessor(next); qpProcessor.Initialize(configuration); return qpProcessor; }); #### Expected behavior The runtime can log an error indicating that this may be missing and we can build a detector to share the above links with the customer. #### Actual behavior The Application Insights data and dependency tracking is affected and currently this needs a code review to detect and fix. #### Known workarounds No workarounds exist.
True
port
1
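The detection the issue asks for — noticing that a host-provided default registration was silently replaced in user Startup code — can be sketched language-agnostically. The following is a minimal JavaScript sketch, not Azure Functions or .NET code: the `ServiceCollection` class, the `"host"`/`"user"` origin tags, and the warning text are all illustrative assumptions.

```javascript
// Minimal sketch: a DI container that records the origin of each
// registration, so the host can detect user overrides of built-ins
// at startup instead of requiring a manual code review.
class ServiceCollection {
  constructor() {
    this.registrations = new Map(); // token -> { factory, origin }
    this.warnings = [];
  }
  addSingleton(token, factory, origin = "user") {
    const existing = this.registrations.get(token);
    if (existing && existing.origin === "host" && origin === "user") {
      // The situation the issue wants surfaced: a host default
      // (e.g. the TelemetryClient registration) was replaced.
      this.warnings.push(
        `Default registration for '${token}' was replaced by user code; ` +
        `features that depend on the host default may stop working.`
      );
    }
    this.registrations.set(token, { factory, origin });
  }
  resolve(token) {
    const reg = this.registrations.get(token);
    return reg ? reg.factory() : undefined;
  }
}

// Usage: the host registers its default first, then user Startup overrides it.
const services = new ServiceCollection();
services.addSingleton("TelemetryClient", () => ({ source: "host-default" }), "host");
services.addSingleton("TelemetryClient", () => ({ source: "user-custom" }));

console.log(services.warnings.length); // 1 — the override was detected
console.log(services.resolve("TelemetryClient").source); // "user-custom"
```

A real implementation would hook the actual container's registration path; the point is only that recording who registered each service makes the override detectable and loggable at startup.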
183,386
21,721,724,193
IssuesEvent
2022-05-11 01:20:15
raindigi/reaction
https://api.github.com/repos/raindigi/reaction
closed
CVE-2021-29060 (Medium) detected in color-string-1.5.3.tgz - autoclosed
security vulnerability
## CVE-2021-29060 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>color-string-1.5.3.tgz</b></p></summary> <p>Parser and generator for CSS color strings</p> <p>Library home page: <a href="https://registry.npmjs.org/color-string/-/color-string-1.5.3.tgz">https://registry.npmjs.org/color-string/-/color-string-1.5.3.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/color-string/package.json</p> <p> Dependency Hierarchy: - sharp-0.20.5.tgz (Root Library) - color-3.1.0.tgz - :x: **color-string-1.5.3.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/raindigi/reaction/commit/c3f5e6b9d647cd1f977b184ae9c079f1ae297353">c3f5e6b9d647cd1f977b184ae9c079f1ae297353</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A Regular Expression Denial of Service (ReDOS) vulnerability was discovered in Color-String version 1.5.5 and below which occurs when the application is provided and checks a crafted invalid HWB string. <p>Publish Date: 2021-06-21 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-29060>CVE-2021-29060</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-257v-vj4p-3w2h">https://github.com/advisories/GHSA-257v-vj4p-3w2h</a></p> <p>Release Date: 2021-06-21</p> <p>Fix Resolution: color-string - 1.5.5</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
non_port
0
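The fix information in the record above (color-string vulnerable below 1.5.5) can be turned into a simple version check. This is a minimal JavaScript sketch assuming plain `major.minor.patch` version strings; `parseVersion` and `isVulnerable` are hypothetical helper names, and real tooling should use a proper semver library instead (pre-release tags and version ranges are not handled here).

```javascript
// Minimal sketch: decide whether an installed color-string version is
// affected by CVE-2021-29060, which the advisory above says is fixed
// in 1.5.5. Helper names are illustrative, not part of any real tool.
const FIXED_VERSION = [1, 5, 5];

function parseVersion(v) {
  return v.split(".").map(Number); // "1.5.3" -> [1, 5, 3]
}

function isVulnerable(installed) {
  const parts = parseVersion(installed);
  for (let i = 0; i < 3; i++) {
    if (parts[i] < FIXED_VERSION[i]) return true;   // below the fix
    if (parts[i] > FIXED_VERSION[i]) return false;  // above the fix
  }
  return false; // exactly 1.5.5 already contains the fix
}

console.log(isVulnerable("1.5.3")); // true  (the version in this report)
console.log(isVulnerable("1.5.5")); // false (contains the fix)
```

The same comparison explains the suggested remediation: upgrading the transitive `color-string` dependency to 1.5.5 or later moves it out of the vulnerable range.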