| Unnamed: 0 (int64, 0-832k) | id (float64, 2.49B-32.1B) | type (stringclasses, 1 value) | created_at (stringlengths, 19-19) | repo (stringlengths, 7-112) | repo_url (stringlengths, 36-141) | action (stringclasses, 3 values) | title (stringlengths, 1-744) | labels (stringlengths, 4-574) | body (stringlengths, 9-211k) | index (stringclasses, 10 values) | text_combine (stringlengths, 96-211k) | label (stringclasses, 2 values) | text (stringlengths, 96-188k) | binary_label (int64, 0-1) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
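Judging from the rows below, the `binary_label` column appears to be derived from `label` ("process" rows carry 1, "non_process" rows carry 0). A minimal sketch of that mapping, using field names taken from the header above:

```python
# Map the dataset's string label to its binary form, as suggested by the
# rows below: "process" -> 1, "non_process" -> 0.
def to_binary_label(label: str) -> int:
    mapping = {"process": 1, "non_process": 0}
    if label not in mapping:
        raise ValueError(f"unknown label: {label!r}")
    return mapping[label]

# Two toy rows mirroring the first two records of the table.
rows = [
    {"label": "non_process"},
    {"label": "process"},
]
binary = [to_binary_label(r["label"]) for r in rows]
```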
103,932
| 11,386,447,189
|
IssuesEvent
|
2020-01-29 13:18:14
|
fitbenchmarking/fitbenchmarking
|
https://api.github.com/repos/fitbenchmarking/fitbenchmarking
|
opened
|
Update outdated main README
|
Documentation
|
**Description of the documentation**
The main README on https://github.com/fitbenchmarking/fitbenchmarking is out of date and has broken links. Replace it with the new readthedocs documentation.
|
1.0
|
Update outdated main README - **Description of the documentation**
The main README on https://github.com/fitbenchmarking/fitbenchmarking is out of date and has broken links. Replace it with the new readthedocs documentation.
|
non_process
|
update outdated main readme description of the documentation the main readme on is out of date and with broken links replace it with new readthedocs documentation
| 0
|
79,744
| 23,033,246,288
|
IssuesEvent
|
2022-07-22 15:49:49
|
google-coral/edgetpu
|
https://api.github.com/repos/google-coral/edgetpu
|
closed
|
ValueError: Failed to load delegate from libedgetpu.so.1.0 on coral usb tpu
|
type:build/install subtype:ubuntu/linux Hardware:USB Accelerator
|
### Description
I'm on Ubuntu 22.04 LTS (GNU/Linux 5.15.0-39-generic x86_64), with Docker version 20.10.14 (build a224086349) and python3 3.10.4-0ubuntu2.
Running lsusb gives: Bus 001 Device 002: ID 1a6e:089a Global Unichip Corp.
There is no way to initialize the USB accelerator.
I run Frigate in Docker and get:
Process detector:coral:
[2022-07-13 11:54:51] frigate.edgetpu ERROR : No EdgeTPU was detected. If you do not have a Coral device yet, you must configure CPU detectors.
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 160, in load_delegate
delegate = Delegate(library, options)
File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 119, in init
raise ValueError(capture.message)
ValueError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
self.run()
File "/usr/lib/python3.8/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/opt/frigate/frigate/edgetpu.py", line 136, in run_detector
object_detector = LocalObjectDetector(
File "/opt/frigate/frigate/edgetpu.py", line 44, in init
edge_tpu_delegate = load_delegate("libedgetpu.so.1.0", device_config)
File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 162, in load_delegate
raise ValueError('Failed to load delegate from {}\n{}'.format(
ValueError: Failed to load delegate from libedgetpu.so.1.0
My environment again: Ubuntu 22.04 LTS (GNU/Linux 5.15.0-39-generic x86_64), Docker version 20.10.14 (build a224086349), python3 3.10.4-0ubuntu2.
Can you please help me?
<details><summary>Click to expand!</summary>
### Issue Type
Build/Install
### Operating System
Linux
### Coral Device
USB Accelerator
### Other Devices
_No response_
### Programming Language
_No response_
### Relevant Log Output
```shell
Process detector:coral:
[2022-07-13 18:16:22] frigate.edgetpu ERROR : No EdgeTPU was detected. If you do not have a Coral device yet, you must configure CPU detectors.
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 160, in load_delegate
delegate = Delegate(library, options)
File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 119, in init
raise ValueError(capture.message)
ValueError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
self.run()
File "/usr/lib/python3.8/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/opt/frigate/frigate/edgetpu.py", line 136, in run_detector
object_detector = LocalObjectDetector(
File "/opt/frigate/frigate/edgetpu.py", line 44, in init
edge_tpu_delegate = load_delegate("libedgetpu.so.1.0", device_config)
File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 162, in load_delegate
raise ValueError('Failed to load delegate from {}\n{}'.format(
ValueError: Failed to load delegate from libedgetpu.so.1.0
```
</details>
|
1.0
|
ValueError: Failed to load delegate from libedgetpu.so.1.0 on coral usb tpu - ### Description
I'm on Ubuntu 22.04 LTS (GNU/Linux 5.15.0-39-generic x86_64), with Docker version 20.10.14 (build a224086349) and python3 3.10.4-0ubuntu2.
Running lsusb gives: Bus 001 Device 002: ID 1a6e:089a Global Unichip Corp.
There is no way to initialize the USB accelerator.
I run Frigate in Docker and get:
Process detector:coral:
[2022-07-13 11:54:51] frigate.edgetpu ERROR : No EdgeTPU was detected. If you do not have a Coral device yet, you must configure CPU detectors.
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 160, in load_delegate
delegate = Delegate(library, options)
File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 119, in init
raise ValueError(capture.message)
ValueError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
self.run()
File "/usr/lib/python3.8/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/opt/frigate/frigate/edgetpu.py", line 136, in run_detector
object_detector = LocalObjectDetector(
File "/opt/frigate/frigate/edgetpu.py", line 44, in init
edge_tpu_delegate = load_delegate("libedgetpu.so.1.0", device_config)
File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 162, in load_delegate
raise ValueError('Failed to load delegate from {}\n{}'.format(
ValueError: Failed to load delegate from libedgetpu.so.1.0
My environment again: Ubuntu 22.04 LTS (GNU/Linux 5.15.0-39-generic x86_64), Docker version 20.10.14 (build a224086349), python3 3.10.4-0ubuntu2.
Can you please help me?
<details><summary>Click to expand!</summary>
### Issue Type
Build/Install
### Operating System
Linux
### Coral Device
USB Accelerator
### Other Devices
_No response_
### Programming Language
_No response_
### Relevant Log Output
```shell
Process detector:coral:
[2022-07-13 18:16:22] frigate.edgetpu ERROR : No EdgeTPU was detected. If you do not have a Coral device yet, you must configure CPU detectors.
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 160, in load_delegate
delegate = Delegate(library, options)
File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 119, in init
raise ValueError(capture.message)
ValueError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
self.run()
File "/usr/lib/python3.8/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/opt/frigate/frigate/edgetpu.py", line 136, in run_detector
object_detector = LocalObjectDetector(
File "/opt/frigate/frigate/edgetpu.py", line 44, in init
edge_tpu_delegate = load_delegate("libedgetpu.so.1.0", device_config)
File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 162, in load_delegate
raise ValueError('Failed to load delegate from {}\n{}'.format(
ValueError: Failed to load delegate from libedgetpu.so.1.0
```
</details>
|
non_process
|
valueerror failed to load delegate from libedgetpu so on coral usb tpu description im in ubuntu lts gnu linux generic docker docker version build and run lsusb with the result bus device id global unichip corp there is no way to initialize usb acelerator i run in docker frigate and get process detector coral frigate edgetpu error no edgetpu was detected if you do not have a coral device yet you must configure cpu detectors traceback most recent call last file usr lib dist packages tflite runtime interpreter py line in load delegate delegate delegate library options file usr lib dist packages tflite runtime interpreter py line in init raise valueerror capture message valueerror during handling of the above exception another exception occurred traceback most recent call last file usr lib multiprocessing process py line in bootstrap self run file usr lib multiprocessing process py line in run self target self args self kwargs file opt frigate frigate edgetpu py line in run detector object detector localobjectdetector file opt frigate frigate edgetpu py line in init edge tpu delegate load delegate libedgetpu so device config file usr lib dist packages tflite runtime interpreter py line in load delegate raise valueerror failed to load delegate from n format valueerror failed to load delegate from libedgetpu so im in ubuntu lts gnu linux generic docker docker version build and can you please help me click to expand issue type build install operating system linux coral device usb accelerator other devices no response programming language no response relevant log output shell process detector coral frigate edgetpu error no edgetpu was detected if you do not have a coral device yet you must configure cpu detectors traceback most recent call last file usr lib dist packages tflite runtime interpreter py line in load delegate delegate delegate library options file usr lib dist packages tflite runtime interpreter py line in init raise valueerror capture message valueerror 
during handling of the above exception another exception occurred traceback most recent call last file usr lib multiprocessing process py line in bootstrap self run file usr lib multiprocessing process py line in run self target self args self kwargs file opt frigate frigate edgetpu py line in run detector object detector localobjectdetector file opt frigate frigate edgetpu py line in init edge tpu delegate load delegate libedgetpu so device config file usr lib dist packages tflite runtime interpreter py line in load delegate raise valueerror failed to load delegate from n format valueerror failed to load delegate from libedgetpu so
| 0
|
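The traceback in the row above comes from tflite_runtime's `load_delegate` failing to load `libedgetpu.so.1.0`. A quick way to check whether the shared library itself is resolvable, before involving TensorFlow Lite at all, is a plain `ctypes` probe. This is a diagnostic sketch, not the Frigate code; the library name is taken from the log:

```python
import ctypes

def can_load_shared_library(name: str) -> bool:
    """Return True if the dynamic linker can resolve and load `name`.

    If this returns False for "libedgetpu.so.1.0", tflite_runtime's
    load_delegate() will fail the same way as in the traceback above:
    the Edge TPU runtime is missing from the container, or device/udev
    permissions prevent it from being opened.
    """
    try:
        ctypes.CDLL(name)
        return True
    except OSError:
        return False
```

On the affected host, `can_load_shared_library("libedgetpu.so.1.0")` returning False would point at a missing or inaccessible Edge TPU runtime inside the container, which is the usual cause of this error.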
343
| 2,793,268,811
|
IssuesEvent
|
2015-05-11 09:47:49
|
ecodistrict/IDSSDashboard
|
https://api.github.com/repos/ecodistrict/IDSSDashboard
|
closed
|
Export function
|
enhancement form feedback 09102014 process step: assess alternatives
|
At the moment the export function seems to be placed under the step Assess Alternatives; I would give it its own step.
|
1.0
|
Export function - At the moment the export function seems to be placed under the step Assess Alternatives; I would give it its own step.
|
process
|
export function at the moment it seems that the export function is put under the step assess alternatives i would give it it’s own
| 1
|
2,053
| 4,862,769,069
|
IssuesEvent
|
2016-11-14 13:34:54
|
openvstorage/framework
|
https://api.github.com/repos/openvstorage/framework
|
closed
|
File contains parsing errors: /opt/OpenvStorage/config/arakoon_cacc.ini during extend
|
priority_urgent process_duplicate type_bug
|
During the setup of the second node on one of our CI environments:
```
2016-11-13 23:12:27 76700 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 45 - INFO - Running "extranode" hooks
2016-11-13 23:12:27 76700 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 46 - INFO - Executing albacontroller._add_base_configuration
2016-11-13 23:12:27 81900 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 47 - INFO - Executing albanodecontroller.model_albanodes
2016-11-13 23:12:27 83300 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 48 - DEBUG - Avahi installed
2016-11-13 23:12:27 83400 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 49 - INFO - Announcing service
2016-11-13 23:12:27 86500 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 50 - DEBUG - 10.100.198.2 - Restarting service avahi-daemon
2016-11-13 23:12:27 91100 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 51 - DEBUG - 10.100.198.2 - Service avahi-daemon restarted
2016-11-13 23:12:27 97600 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 52 - INFO - Extra node complete
2016-11-13 23:12:27 97600 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 53 - INFO - Analyzing cluster layout
2016-11-13 23:12:27 97800 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 54 - DEBUG - 1 nodes for cluster ovsdb found
2016-11-13 23:12:27 98000 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 55 - INFO - Promoting node
2016-11-13 23:12:28 03900 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 56 - INFO - Joining Arakoon configuration cluster
2016-11-13 23:12:36 41100 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 88 - INFO - Joining Arakoon OVS DB cluster
2016-11-13 23:12:38 47600 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 99 - ERROR -
Failed to promote node, rolling back
Traceback (most recent call last):
File "/opt/OpenvStorage/ovs/lib/setup.py", line 419, in setup_node
configure_rabbitmq=configure_rabbitmq)
File "/opt/OpenvStorage/ovs/lib/setup.py", line 1269, in _promote_node
base_dir=Configuration.get('/ovs/framework/paths|ovsdb'))
File "/opt/OpenvStorage/ovs/extensions/db/arakoon/ArakoonInstaller.py", line 435, in extend_cluster
ArakoonInstaller._deploy(config, filesystem=filesystem)
File "/opt/OpenvStorage/ovs/extensions/db/arakoon/ArakoonInstaller.py", line 629, in _deploy
config_path = Configuration.get_configuration_path(config.config_path)
File "/opt/OpenvStorage/ovs/extensions/generic/configuration.py", line 101, in get_configuration_path
key=key)
File "/opt/OpenvStorage/ovs/extensions/generic/configuration.py", line 414, in _passthrough
return getattr(ArakoonConfiguration, method)(*args, **kwargs)
File "/opt/OpenvStorage/ovs/extensions/db/arakoon/configuration.py", line 51, in get_configuration_path
parser.readfp(config_file)
File "/usr/lib/python2.7/ConfigParser.py", line 324, in readfp
self._read(fp, filename)
File "/usr/lib/python2.7/ConfigParser.py", line 546, in _read
raise e
ParsingError: File contains parsing errors: /opt/OpenvStorage/config/arakoon_cacc.ini
[line 20]: 'on/config/tlogs\n'
2016-11-13 23:12:38 48300 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 100 - ERROR - File contains parsing errors: /opt/OpenvStorage/config/arakoon_cacc.ini
[line 20]: 'on/config/tlogs\n'
Traceback (most recent call last):
File "/opt/OpenvStorage/ovs/lib/setup.py", line 419, in setup_node
configure_rabbitmq=configure_rabbitmq)
File "/opt/OpenvStorage/ovs/lib/setup.py", line 1269, in _promote_node
base_dir=Configuration.get('/ovs/framework/paths|ovsdb'))
File "/opt/OpenvStorage/ovs/extensions/db/arakoon/ArakoonInstaller.py", line 435, in extend_cluster
ArakoonInstaller._deploy(config, filesystem=filesystem)
File "/opt/OpenvStorage/ovs/extensions/db/arakoon/ArakoonInstaller.py", line 629, in _deploy
config_path = Configuration.get_configuration_path(config.config_path)
File "/opt/OpenvStorage/ovs/extensions/generic/configuration.py", line 101, in get_configuration_path
key=key)
File "/opt/OpenvStorage/ovs/extensions/generic/configuration.py", line 414, in _passthrough
return getattr(ArakoonConfiguration, method)(*args, **kwargs)
File "/opt/OpenvStorage/ovs/extensions/db/arakoon/configuration.py", line 51, in get_configuration_path
parser.readfp(config_file)
File "/usr/lib/python2.7/ConfigParser.py", line 324, in readfp
self._read(fp, filename)
File "/usr/lib/python2.7/ConfigParser.py", line 546, in _read
raise e
ParsingError: File contains parsing errors: /opt/OpenvStorage/config/arakoon_cacc.ini
[line 20]: 'on/config/tlogs\n'
```
|
1.0
|
File contains parsing errors: /opt/OpenvStorage/config/arakoon_cacc.ini during extend - During the setup of the second node on one of our CI environments:
```
2016-11-13 23:12:27 76700 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 45 - INFO - Running "extranode" hooks
2016-11-13 23:12:27 76700 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 46 - INFO - Executing albacontroller._add_base_configuration
2016-11-13 23:12:27 81900 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 47 - INFO - Executing albanodecontroller.model_albanodes
2016-11-13 23:12:27 83300 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 48 - DEBUG - Avahi installed
2016-11-13 23:12:27 83400 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 49 - INFO - Announcing service
2016-11-13 23:12:27 86500 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 50 - DEBUG - 10.100.198.2 - Restarting service avahi-daemon
2016-11-13 23:12:27 91100 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 51 - DEBUG - 10.100.198.2 - Service avahi-daemon restarted
2016-11-13 23:12:27 97600 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 52 - INFO - Extra node complete
2016-11-13 23:12:27 97600 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 53 - INFO - Analyzing cluster layout
2016-11-13 23:12:27 97800 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 54 - DEBUG - 1 nodes for cluster ovsdb found
2016-11-13 23:12:27 98000 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 55 - INFO - Promoting node
2016-11-13 23:12:28 03900 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 56 - INFO - Joining Arakoon configuration cluster
2016-11-13 23:12:36 41100 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 88 - INFO - Joining Arakoon OVS DB cluster
2016-11-13 23:12:38 47600 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 99 - ERROR -
Failed to promote node, rolling back
Traceback (most recent call last):
File "/opt/OpenvStorage/ovs/lib/setup.py", line 419, in setup_node
configure_rabbitmq=configure_rabbitmq)
File "/opt/OpenvStorage/ovs/lib/setup.py", line 1269, in _promote_node
base_dir=Configuration.get('/ovs/framework/paths|ovsdb'))
File "/opt/OpenvStorage/ovs/extensions/db/arakoon/ArakoonInstaller.py", line 435, in extend_cluster
ArakoonInstaller._deploy(config, filesystem=filesystem)
File "/opt/OpenvStorage/ovs/extensions/db/arakoon/ArakoonInstaller.py", line 629, in _deploy
config_path = Configuration.get_configuration_path(config.config_path)
File "/opt/OpenvStorage/ovs/extensions/generic/configuration.py", line 101, in get_configuration_path
key=key)
File "/opt/OpenvStorage/ovs/extensions/generic/configuration.py", line 414, in _passthrough
return getattr(ArakoonConfiguration, method)(*args, **kwargs)
File "/opt/OpenvStorage/ovs/extensions/db/arakoon/configuration.py", line 51, in get_configuration_path
parser.readfp(config_file)
File "/usr/lib/python2.7/ConfigParser.py", line 324, in readfp
self._read(fp, filename)
File "/usr/lib/python2.7/ConfigParser.py", line 546, in _read
raise e
ParsingError: File contains parsing errors: /opt/OpenvStorage/config/arakoon_cacc.ini
[line 20]: 'on/config/tlogs\n'
2016-11-13 23:12:38 48300 +0100 - ovsnode02-198 - 22773/139921333040896 - lib/setup - 100 - ERROR - File contains parsing errors: /opt/OpenvStorage/config/arakoon_cacc.ini
[line 20]: 'on/config/tlogs\n'
Traceback (most recent call last):
File "/opt/OpenvStorage/ovs/lib/setup.py", line 419, in setup_node
configure_rabbitmq=configure_rabbitmq)
File "/opt/OpenvStorage/ovs/lib/setup.py", line 1269, in _promote_node
base_dir=Configuration.get('/ovs/framework/paths|ovsdb'))
File "/opt/OpenvStorage/ovs/extensions/db/arakoon/ArakoonInstaller.py", line 435, in extend_cluster
ArakoonInstaller._deploy(config, filesystem=filesystem)
File "/opt/OpenvStorage/ovs/extensions/db/arakoon/ArakoonInstaller.py", line 629, in _deploy
config_path = Configuration.get_configuration_path(config.config_path)
File "/opt/OpenvStorage/ovs/extensions/generic/configuration.py", line 101, in get_configuration_path
key=key)
File "/opt/OpenvStorage/ovs/extensions/generic/configuration.py", line 414, in _passthrough
return getattr(ArakoonConfiguration, method)(*args, **kwargs)
File "/opt/OpenvStorage/ovs/extensions/db/arakoon/configuration.py", line 51, in get_configuration_path
parser.readfp(config_file)
File "/usr/lib/python2.7/ConfigParser.py", line 324, in readfp
self._read(fp, filename)
File "/usr/lib/python2.7/ConfigParser.py", line 546, in _read
raise e
ParsingError: File contains parsing errors: /opt/OpenvStorage/config/arakoon_cacc.ini
[line 20]: 'on/config/tlogs\n'
```
|
process
|
file contains parsing errors opt openvstorage config arakoon cacc ini during extend during the setup of the second node on one of our ci environments lib setup info running extranode hooks lib setup info executing albacontroller add base configuration lib setup info executing albanodecontroller model albanodes lib setup debug avahi installed lib setup info announcing service lib setup debug restarting service avahi daemon lib setup debug service avahi daemon restarted lib setup info extra node complete lib setup info analyzing cluster layout lib setup debug nodes for cluster ovsdb found lib setup info promoting node lib setup info joining arakoon configuration cluster lib setup info joining arakoon ovs db cluster lib setup error failed to promote node rolling back traceback most recent call last file opt openvstorage ovs lib setup py line in setup node configure rabbitmq configure rabbitmq file opt openvstorage ovs lib setup py line in promote node base dir configuration get ovs framework paths ovsdb file opt openvstorage ovs extensions db arakoon arakooninstaller py line in extend cluster arakooninstaller deploy config filesystem filesystem file opt openvstorage ovs extensions db arakoon arakooninstaller py line in deploy config path configuration get configuration path config config path file opt openvstorage ovs extensions generic configuration py line in get configuration path key key file opt openvstorage ovs extensions generic configuration py line in passthrough return getattr arakoonconfiguration method args kwargs file opt openvstorage ovs extensions db arakoon configuration py line in get configuration path parser readfp config file file usr lib configparser py line in readfp self read fp filename file usr lib configparser py line in read raise e parsingerror file contains parsing errors opt openvstorage config arakoon cacc ini on config tlogs n lib setup error file contains parsing errors opt openvstorage config arakoon cacc ini on config tlogs n 
traceback most recent call last file opt openvstorage ovs lib setup py line in setup node configure rabbitmq configure rabbitmq file opt openvstorage ovs lib setup py line in promote node base dir configuration get ovs framework paths ovsdb file opt openvstorage ovs extensions db arakoon arakooninstaller py line in extend cluster arakooninstaller deploy config filesystem filesystem file opt openvstorage ovs extensions db arakoon arakooninstaller py line in deploy config path configuration get configuration path config config path file opt openvstorage ovs extensions generic configuration py line in get configuration path key key file opt openvstorage ovs extensions generic configuration py line in passthrough return getattr arakoonconfiguration method args kwargs file opt openvstorage ovs extensions db arakoon configuration py line in get configuration path parser readfp config file file usr lib configparser py line in readfp self read fp filename file usr lib configparser py line in read raise e parsingerror file contains parsing errors opt openvstorage config arakoon cacc ini on config tlogs n
| 1
|
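The `ParsingError` in the row above is Python's ConfigParser rejecting a truncated line (`'on/config/tlogs'`) in arakoon_cacc.ini: a value fragment that is neither a section header nor a `key = value` pair. The same failure mode can be reproduced with the standard library (Python 3 `configparser` here; the trace above is Python 2's ConfigParser, but the behaviour matches):

```python
import configparser

# A config whose last line was truncated mid-value, like the
# 'on/config/tlogs' fragment reported in the traceback above.
broken_ini = """\
[global]
tlog_dir = /opt/OpenvStorage/config/tlogs
on/config/tlogs
"""

def parse_ini(text: str):
    """Return a parser on success, or None if the text is malformed."""
    parser = configparser.ConfigParser()
    try:
        parser.read_string(text)
        return parser
    except configparser.ParsingError:
        return None
```

`parse_ini(broken_ini)` returns None because the dangling fragment has no `=`/`:` delimiter and is not indented as a continuation line, which is exactly the condition the Arakoon installer hit.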
15,033
| 18,755,264,843
|
IssuesEvent
|
2021-11-05 09:55:23
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
opened
|
Help users of migrate to recover from a failed migration deployment
|
process/candidate topic: migrate team/migrations topic: migrate deploy
|
## Problem
We currently do nothing to help diagnose the failed state other than telling users what migration failed, and with which error. We could do more.
## Suggested solution
TBD
## Alternatives
Ø
## Additional context
- https://www.prisma.io/docs/guides/database/production-troubleshooting
|
1.0
|
Help users of migrate to recover from a failed migration deployment - ## Problem
We currently do nothing to help diagnose the failed state other than telling users what migration failed, and with which error. We could do more.
## Suggested solution
TBD
## Alternatives
Ø
## Additional context
- https://www.prisma.io/docs/guides/database/production-troubleshooting
|
process
|
help users of migrate to recover from a failed migration deployment problem we currently do nothing to help diagnose the failed state other than telling users what migration failed and with which error we could do more suggested solution tbd alternatives ø additional context
| 1
|
115,564
| 4,676,017,309
|
IssuesEvent
|
2016-10-07 10:06:25
|
CS2103AUG2016-T14-C2/main
|
https://api.github.com/repos/CS2103AUG2016-T14-C2/main
|
closed
|
Update AboutUs.md
|
priority.medium
|
In particular, indicate which member is in charge of which component (i.e. UI, Logic, Storage, etc.).
Note: Being 'in charge' of a component does not mean you work in that component only. Rather, you are expected to know it well and possibly review/approve changes done to that component by others.
Delete details/photos of original developers. Include your own details and photos instead.
Include the name and photo of the project mentor (i.e. your phase C tutor). His/her photo can be taken from the Teaching Team page.
|
1.0
|
Update AboutUs.md - In particular, indicate which member is in charge of which component (i.e. UI, Logic, Storage, etc.).
Note: Being 'in charge' of a component does not mean you work in that component only. Rather, you are expected to know it well and possibly review/approve changes done to that component by others.
Delete details/photos of original developers. Include your own details and photos instead.
Include the name and photo of the project mentor (i.e. your phase C tutor). His/her photo can be taken from the Teaching Team page.
|
non_process
|
update aboutus md in particular indicate which member is in charge of which component i e ui logic storage etc note being in charge of a component does not mean you work in that component only rather you are expected to know it well and possibly review approve changes done to that component by others delete details photos of original developers include your own details and photos instead include the name and photo of the project mentor i e your phase c tutor his her photo can be taken from the teaching team page
| 0
|
815,641
| 30,565,533,493
|
IssuesEvent
|
2023-07-20 17:28:08
|
kubernetes/ingress-nginx
|
https://api.github.com/repos/kubernetes/ingress-nginx
|
closed
|
ssl passthrough not working with --enable-ssl-passthrough flag and annotation
|
lifecycle/frozen needs-kind needs-triage needs-priority
|
**What happened**:
I am running the controller in SSL passthrough mode by passing the argument `--enable-ssl-passthrough`.
I am also annotating the Ingress with `nginx.ingress.kubernetes.io/ssl-passthrough: "true"`.
Controller args:
```yaml
containers:
- args:
- /nginx-ingress-controller
- --default-backend-service=$(POD_NAMESPACE)/nginx-ingress-default-backend
- --publish-service=$(POD_NAMESPACE)/nginx-ingress-controller
- --election-id=ingress-controller-leader
- --controller-class=k8s.io/ingress-nginx
- --ingress-class=nginx
- --configmap=$(POD_NAMESPACE)/nginx-ingress-controller
- --enable-ssl-passthrough
```
Please note I'm not passing any certificate here, e.g. `--default-ssl-certificate=$(POD_NAMESPACE)/nginx-tls-secret`.
This is my ingress resource
```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
annotations:
kubernetes.io/ingress.class: nginx
nginx.ingress.kubernetes.io/backend-protocol: HTTPS
nginx.ingress.kubernetes.io/ssl-passthrough: "true"
nginx.ingress.kubernetes.io/ssl-redirect: "true"
name: ingress1
namespace: mynamespace
spec:
rules:
- http:
paths:
- backend:
service:
name: service-for-ingress-instance1
port:
number: 443
path: /myservice/instance1
pathType: ImplementationSpecific
```
My expectation is that "the controller sends TLS connections directly to the backend instead of letting NGINX decrypt the communication", as per the [documentation](https://kubernetes.github.io/ingress-nginx/user-guide/nginx-configuration/annotations/#ssl-passthrough).
But in this case it is not sending traffic to the corresponding backend; with pathType: **Prefix** it also doesn't work.
The moment I pass a certificate in the controller args (`--default-ssl-certificate=$(POD_NAMESPACE)/nginx-tls-secret`), it works and sends traffic to the backend as expected.
**My questions:**
**1. Is it mandatory to pass a certificate to the controller when I use the above ingress resource (path-based routing)?
2. Does ssl-passthrough not work with path-based routing?**
**What you expected to happen**:
Without a certificate, path-based routing should still work.
**NGINX Ingress controller version** (exec into the pod and run nginx-ingress-controller --version.): 1.1.2
**Kubernetes version** (use `kubectl version`): 1.23.4
|
1.0
|
ssl passthrough not working with --enable-ssl-passthrough flag and annotation - **What happened**:
I am running the controller in SSL passthrough mode by passing the argument `--enable-ssl-passthrough`.
I am also annotating the Ingress with `nginx.ingress.kubernetes.io/ssl-passthrough: "true"`.
Controller args:
```yaml
containers:
- args:
- /nginx-ingress-controller
- --default-backend-service=$(POD_NAMESPACE)/nginx-ingress-default-backend
- --publish-service=$(POD_NAMESPACE)/nginx-ingress-controller
- --election-id=ingress-controller-leader
- --controller-class=k8s.io/ingress-nginx
- --ingress-class=nginx
- --configmap=$(POD_NAMESPACE)/nginx-ingress-controller
- --enable-ssl-passthrough
```
Please note I'm not passing any certificate here, e.g. `--default-ssl-certificate=$(POD_NAMESPACE)/nginx-tls-secret`.
This is my ingress resource
```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
annotations:
kubernetes.io/ingress.class: nginx
nginx.ingress.kubernetes.io/backend-protocol: HTTPS
nginx.ingress.kubernetes.io/ssl-passthrough: "true"
nginx.ingress.kubernetes.io/ssl-redirect: "true"
name: ingress1
namespace: mynamespace
spec:
rules:
- http:
paths:
- backend:
service:
name: service-for-ingress-instance1
port:
number: 443
path: /myservice/instance1
pathType: ImplementationSpecific
```
My expectation is that "the controller sends TLS connections directly to the backend instead of letting NGINX decrypt the communication", as per the [documentation](https://kubernetes.github.io/ingress-nginx/user-guide/nginx-configuration/annotations/#ssl-passthrough).
But in this case it is not sending traffic to the corresponding backend; with pathType: **Prefix** it also doesn't work.
The moment I pass a certificate in the controller args (`--default-ssl-certificate=$(POD_NAMESPACE)/nginx-tls-secret`), it works and sends traffic to the backend as expected.
**My questions:**
**1. Is it mandatory to pass a certificate to the controller when I use the above ingress resource (path-based routing)?
2. Does ssl-passthrough not work with path-based routing?**
**What you expected to happen**:
Without a certificate, path-based routing should still work.
**NGINX Ingress controller version** (exec into the pod and run `nginx-ingress-controller --version`): 1.1.2
**Kubernetes version** (use `kubectl version`): 1.23.4
|
non_process
|
ssl passthrough not working with enable ssl passthrough flag and annotation what happened i am running a controller in ssl passthrough mode passing the argument enable ssl passthrough also annotating ingress using nginx ingress kubernetes io ssl passthrough true controller args javascript containers args nginx ingress controller default backend service pod namespace nginx ingress default backend publish service pod namespace nginx ingress controller election id ingress controller leader controller class io ingress nginx ingress class nginx configmap pod namespace nginx ingress controller enable ssl passthrough please note i m not passing any certificate here e g default ssl certificate pod namespace nginx tls secret this is my ingress resource javascript apiversion networking io kind ingress metadata annotations kubernetes io ingress class nginx nginx ingress kubernetes io backend protocol https nginx ingress kubernetes io ssl passthrough true nginx ingress kubernetes io ssl redirect true name namespace mynamespace spec rules http paths backend service name service for ingress port number path myservice pathtype implementationspecific my expectation is the controller to send tls connections directly to the backend instead of letting nginx decrypt the communication as per but in this case it is not sending traffic to the corresponding backend with pathtype prefix also doesn t work the moment i pass a certificate in controller args default ssl certificate pod namespace nginx tls secret it works and sends traffic to backend as expected my questions is it mandatory to pass certificate to the controller when i use the above ingress resource path based routing does ssl passthrough not work with path based routing but in this case it is not sending traffic to the corresponding backend with pathtype prefix also doesn t work what you expected to happen without cert the path based routing should be working nginx ingress controller version exec into the pod and run nginx 
ingress controller version kubernetes version use kubectl version
| 0
|
21,331
| 29,040,855,786
|
IssuesEvent
|
2023-05-13 00:31:16
|
devssa/onde-codar-em-salvador
|
https://api.github.com/repos/devssa/onde-codar-em-salvador
|
closed
|
[Remoto] QA Analyst na Coodesh
|
SALVADOR TESTE PJ PHP JAVASCRIPT LARAVEL TYPESCRIPT NODE.JS GO REACT REQUISITOS REMOTO CYPRESS PROCESSOS GITHUB INGLÊS UMA C LIDERANÇA QA METODOLOGIAS ÁGEIS Stale
|
## Descrição da vaga:
Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios.
Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/vagas/qa-analyst-194604120?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋
<p>A <strong>Nickelpay </strong>busca <strong>QA Analyst</strong> para compor seu time!</p>
<p>Somos uma fintech focada em fazer diferente. Trazer facilidade, conforto e confiança no processo de gestão de contas é nosso dever. A Nickelpay tem uma proposta forte que atrela Contas Digitais, Gestão Centralizada e BPO Financeiro. </p>
<p></p>
## Nickelpay:
<p>Somos uma fintech focada em fazer diferente. Trazer facilidade, conforto e confiança no processo de gestão de contas é nosso dever. A Nickelpay tem uma proposta forte que atrela Contas Digitais, Gestão Centralizada e BPO Financeiro. </p>
<p>Código próprio, em constante desenvolvimento, queremos pessoas que queiram fazer a diferença, pois temos liberdade para criar, desenvolver e implementar.</p><a href='https://coodesh.com/empresas/nickelpay'>Veja mais no site</a>
## Habilidades:
- Cypress
- Javascript
- Typescript
- Node.js
- React.js
- React Native
- Detox
## Local:
100% Remoto
## Requisitos:
- Experiência anterior como Analista de Testes/QA;
- Conhecimentos em processos e metodologias ágeis;
- Experiência com automação de teste para aplicação Web;
- Experiência com testes em Cypress;
- Experiência com uma ou mais dessas linguagens: React, Node, Go Lang, PHP/Laravel, React Native;
- Experiência com Detox.
## Diferenciais:
- Experiência com teste de vulnerabilidades.
## Benefícios:
- 20 dias de recesso remunerado (após 12 meses);
- 5 dias entre natal e ano novo;
- Day off aniversário;
- Ajuda de custo com cursos na área e de Inglês;
- Incentivo à liderança e desenvolvimento de carreira.
## Como se candidatar:
Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [QA Analyst na Nickelpay](https://coodesh.com/vagas/qa-analyst-194604120?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open)
Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação.
## Labels
#### Alocação
Remoto
#### Regime
PJ
#### Categoria
Testes/Q.A
|
1.0
|
[Remoto] QA Analyst na Coodesh - ## Descrição da vaga:
Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios.
Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/vagas/qa-analyst-194604120?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋
<p>A <strong>Nickelpay </strong>busca <strong>QA Analyst</strong> para compor seu time!</p>
<p>Somos uma fintech focada em fazer diferente. Trazer facilidade, conforto e confiança no processo de gestão de contas é nosso dever. A Nickelpay tem uma proposta forte que atrela Contas Digitais, Gestão Centralizada e BPO Financeiro. </p>
<p></p>
## Nickelpay:
<p>Somos uma fintech focada em fazer diferente. Trazer facilidade, conforto e confiança no processo de gestão de contas é nosso dever. A Nickelpay tem uma proposta forte que atrela Contas Digitais, Gestão Centralizada e BPO Financeiro. </p>
<p>Código próprio, em constante desenvolvimento, queremos pessoas que queiram fazer a diferença, pois temos liberdade para criar, desenvolver e implementar.</p><a href='https://coodesh.com/empresas/nickelpay'>Veja mais no site</a>
## Habilidades:
- Cypress
- Javascript
- Typescript
- Node.js
- React.js
- React Native
- Detox
## Local:
100% Remoto
## Requisitos:
- Experiência anterior como Analista de Testes/QA;
- Conhecimentos em processos e metodologias ágeis;
- Experiência com automação de teste para aplicação Web;
- Experiência com testes em Cypress;
- Experiência com uma ou mais dessas linguagens: React, Node, Go Lang, PHP/Laravel, React Native;
- Experiência com Detox.
## Diferenciais:
- Experiência com teste de vulnerabilidades.
## Benefícios:
- 20 dias de recesso remunerado (após 12 meses);
- 5 dias entre natal e ano novo;
- Day off aniversário;
- Ajuda de custo com cursos na área e de Inglês;
- Incentivo à liderança e desenvolvimento de carreira.
## Como se candidatar:
Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [QA Analyst na Nickelpay](https://coodesh.com/vagas/qa-analyst-194604120?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open)
Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação.
## Labels
#### Alocação
Remoto
#### Regime
PJ
#### Categoria
Testes/Q.A
|
process
|
qa analyst na coodesh descrição da vaga esta é uma vaga de um parceiro da plataforma coodesh ao candidatar se você terá acesso as informações completas sobre a empresa e benefícios fique atento ao redirecionamento que vai te levar para uma url com o pop up personalizado de candidatura 👋 a nickelpay busca qa analyst para compor seu time somos uma fintech focada em fazer diferente trazer facilidade conforto e confiança no processo de gestão de contas é nosso dever a nickelpay tem uma proposta forte que atrela contas digitais gestão centralizada e bpo financeiro nbsp nickelpay somos uma fintech focada em fazer diferente trazer facilidade conforto e confiança no processo de gestão de contas é nosso dever a nickelpay tem uma proposta forte que atrela contas digitais gestão centralizada e bpo financeiro nbsp código próprio em constante desenvolvimento queremos pessoas que queiram fazer a diferença pois temos liberdade para criar desenvolver e implementar habilidades cypress javascript typescript node js react js react native detox local remoto requisitos experiência anterior como analista de testes qa conhecimentos em processos e metodologias ágeis experiência com automação de teste para aplicação web experiência com testes em cypress experiência com uma ou mais dessas linguagens react node go lang php laravel react native experiência com detox diferenciais experiência com teste de vulnerabilidades benefícios dias de recesso remunerado após meses dias entre natal e ano novo day off aniversário ajuda de custo com cursos na área e de inglês incentivo à liderança e desenvolvimento de carreira como se candidatar candidatar se exclusivamente através da plataforma coodesh no link a seguir após candidatar se via plataforma coodesh e validar o seu login você poderá acompanhar e receber todas as interações do processo por lá utilize a opção pedir feedback entre uma etapa e outra na vaga que se candidatou isso fará com que a pessoa recruiter responsável pelo processo na empresa 
receba a notificação labels alocação remoto regime pj categoria testes q a
| 1
|
4,495
| 7,346,430,565
|
IssuesEvent
|
2018-03-07 20:42:06
|
GoogleCloudPlatform/google-cloud-dotnet
|
https://api.github.com/repos/GoogleCloudPlatform/google-cloud-dotnet
|
closed
|
Update dependencies
|
type: process
|
We should update to the latest version of Google.Protobuf and Google.Protobuf.Tools, and potentially Grpc.Core, although we might want to hold off on the latter due to https://github.com/grpc/grpc/issues/14021.
In particular, Google.Protobuf now supports:
- Retaining unknown fields
- Better comparisons of `double` values in the face of NaN (which would allow a slight test simplification)
This needs to be done in the GAX repo as well as here. This issue is basically to avoid me forgetting to do this; it's not urgent, but it would be good to get done.
|
1.0
|
Update dependencies - We should update to the latest version of Google.Protobuf and Google.Protobuf.Tools, and potentially Grpc.Core, although we might want to hold off on the latter due to https://github.com/grpc/grpc/issues/14021.
In particular, Google.Protobuf now supports:
- Retaining unknown fields
- Better comparisons of `double` values in the face of NaN (which would allow a slight test simplification)
This needs to be done in the GAX repo as well as here. This issue is basically to avoid me forgetting to do this; it's not urgent, but it would be good to get done.
|
process
|
update dependencies we should update to the latest version of google protobuf and google protobuf tools and potentially grpc core although we might want to hold off on the latter due to in particular google protobuf now supports retaining unknown fields better comparisons of double values in the face of nan which would allow a slight test simplification this needs to be done in the gax repo as well as here this issue is basically to avoid me forgetting to do this it s not urgent but it would be good to get done
| 1
|
67,140
| 7,036,519,538
|
IssuesEvent
|
2017-12-28 09:17:54
|
edenlabllc/ehealth.api
|
https://api.github.com/repos/edenlabllc/ehealth.api
|
closed
|
FE fixes for Auth
|
epic/Auth FE kind/task priority/medium status/test
|
1. Add description for Error 400

2. Paint the phone number entry field

|
1.0
|
FE fixes for Auth - 1. Add description for Error 400

2. Paint the phone number entry field

|
non_process
|
fe fixes for auth add description for error paint the phone number entry field
| 0
|
160,361
| 20,099,780,698
|
IssuesEvent
|
2022-02-07 01:34:38
|
jbarrus/diagram-js
|
https://api.github.com/repos/jbarrus/diagram-js
|
opened
|
karma-4.2.0.tgz: 9 vulnerabilities (highest severity is: 9.4)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>karma-4.2.0.tgz</b></p></summary>
<p>Spectacular Test Runner for JavaScript.</p>
<p>Library home page: <a href="https://registry.npmjs.org/karma/-/karma-4.2.0.tgz">https://registry.npmjs.org/karma/-/karma-4.2.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /diagram-js/node_modules/karma/package.json</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2021-31597](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-31597) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.4 | xmlhttprequest-ssl-1.5.5.tgz | Transitive | 5.0.8 | ❌ |
| [WS-2020-0443](https://github.com/socketio/socket.io/commit/f78a575f66ab693c3ea96ea88429ddb1a44c86c7) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.1 | socket.io-2.1.1.tgz | Transitive | 5.0.8 | ❌ |
| [CVE-2020-28502](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28502) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.1 | xmlhttprequest-ssl-1.5.5.tgz | Transitive | 5.0.8 | ❌ |
| [CVE-2020-36048](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36048) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | engine.io-3.2.1.tgz | Transitive | 6.0.0 | ❌ |
| [CVE-2020-36049](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36049) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | socket.io-parser-3.2.0.tgz | Transitive | 5.0.8 | ❌ |
| [CVE-2020-28469](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | glob-parent-5.0.0.tgz | Transitive | 4.3.0 | ❌ |
| [WS-2020-0091](https://github.com/http-party/node-http-proxy/pull/1447) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | http-proxy-1.17.0.tgz | Transitive | 4.3.0 | ❌ |
| [CVE-2022-0437](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0437) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.4 | karma-4.2.0.tgz | Direct | karma - v6.3.14 | ❌ |
| [CVE-2020-28481](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28481) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.3 | socket.io-2.1.1.tgz | Transitive | 5.0.8 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-31597</summary>
### Vulnerable Library - <b>xmlhttprequest-ssl-1.5.5.tgz</b></p>
<p>XMLHttpRequest for Node</p>
<p>Library home page: <a href="https://registry.npmjs.org/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz">https://registry.npmjs.org/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/xmlhttprequest-ssl/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- socket.io-2.1.1.tgz
- socket.io-client-2.1.1.tgz
- engine.io-client-3.2.1.tgz
- :x: **xmlhttprequest-ssl-1.5.5.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
The xmlhttprequest-ssl package before 1.6.1 for Node.js disables SSL certificate validation by default, because rejectUnauthorized (when the property exists but is undefined) is considered to be false within the https.request function of Node.js. In other words, no certificate is ever rejected.
<p>Publish Date: 2021-04-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-31597>CVE-2021-31597</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-31597">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-31597</a></p>
<p>Release Date: 2021-04-23</p>
<p>Fix Resolution (xmlhttprequest-ssl): 1.6.1</p>
<p>Direct dependency fix Resolution (karma): 5.0.8</p>
</p>
<p></p>
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> WS-2020-0443</summary>
### Vulnerable Library - <b>socket.io-2.1.1.tgz</b></p>
<p>node.js realtime framework server</p>
<p>Library home page: <a href="https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz">https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/socket.io/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- :x: **socket.io-2.1.1.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
In socket.io in versions 1.0.0 to 2.3.0 is vulnerable to Cross-Site Websocket Hijacking, it allows an attacker to bypass origin protection using special symbols include "`" and "$".
<p>Publish Date: 2020-02-20
<p>URL: <a href=https://github.com/socketio/socket.io/commit/f78a575f66ab693c3ea96ea88429ddb1a44c86c7>WS-2020-0443</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>8.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://hackerone.com/reports/931197">https://hackerone.com/reports/931197</a></p>
<p>Release Date: 2020-02-20</p>
<p>Fix Resolution (socket.io): 2.4.0</p>
<p>Direct dependency fix Resolution (karma): 5.0.8</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-28502</summary>
### Vulnerable Library - <b>xmlhttprequest-ssl-1.5.5.tgz</b></p>
<p>XMLHttpRequest for Node</p>
<p>Library home page: <a href="https://registry.npmjs.org/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz">https://registry.npmjs.org/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/xmlhttprequest-ssl/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- socket.io-2.1.1.tgz
- socket.io-client-2.1.1.tgz
- engine.io-client-3.2.1.tgz
- :x: **xmlhttprequest-ssl-1.5.5.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
This affects the package xmlhttprequest before 1.7.0; all versions of package xmlhttprequest-ssl. Provided requests are sent synchronously (async=False on xhr.open), malicious user input flowing into xhr.send could result in arbitrary code being injected and run.
<p>Publish Date: 2021-03-05
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28502>CVE-2020-28502</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>8.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-h4j5-c7cj-74xg">https://github.com/advisories/GHSA-h4j5-c7cj-74xg</a></p>
<p>Release Date: 2021-03-05</p>
<p>Fix Resolution (xmlhttprequest-ssl): 1.6.1</p>
<p>Direct dependency fix Resolution (karma): 5.0.8</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-36048</summary>
### Vulnerable Library - <b>engine.io-3.2.1.tgz</b></p>
<p>The realtime engine behind Socket.IO. Provides the foundation of a bidirectional connection between client and server</p>
<p>Library home page: <a href="https://registry.npmjs.org/engine.io/-/engine.io-3.2.1.tgz">https://registry.npmjs.org/engine.io/-/engine.io-3.2.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/engine.io/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- socket.io-2.1.1.tgz
- :x: **engine.io-3.2.1.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
Engine.IO before 4.0.0 allows attackers to cause a denial of service (resource consumption) via a POST request to the long polling transport.
<p>Publish Date: 2021-01-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36048>CVE-2020-36048</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-36048">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-36048</a></p>
<p>Release Date: 2021-01-08</p>
<p>Fix Resolution (engine.io): 4.0.0-alpha.0</p>
<p>Direct dependency fix Resolution (karma): 6.0.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-36049</summary>
### Vulnerable Library - <b>socket.io-parser-3.2.0.tgz</b></p>
<p>socket.io protocol parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/socket.io-parser/-/socket.io-parser-3.2.0.tgz">https://registry.npmjs.org/socket.io-parser/-/socket.io-parser-3.2.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/socket.io-parser/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- socket.io-2.1.1.tgz
- :x: **socket.io-parser-3.2.0.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
socket.io-parser before 3.4.1 allows attackers to cause a denial of service (memory consumption) via a large packet because a concatenation approach is used.
<p>Publish Date: 2021-01-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36049>CVE-2020-36049</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-xfhh-g9f5-x4m4">https://github.com/advisories/GHSA-xfhh-g9f5-x4m4</a></p>
<p>Release Date: 2021-01-08</p>
<p>Fix Resolution (socket.io-parser): 3.3.2</p>
<p>Direct dependency fix Resolution (karma): 5.0.8</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-28469</summary>
### Vulnerable Library - <b>glob-parent-5.0.0.tgz</b></p>
<p>Extract the non-magic parent path from a glob string.</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-5.0.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-5.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- chokidar-3.0.2.tgz
- :x: **glob-parent-5.0.0.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
This affects the package glob-parent before 5.1.2. The enclosure regex used to check for strings ending in enclosure containing path separator.
<p>Publish Date: 2021-06-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469>CVE-2020-28469</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469</a></p>
<p>Release Date: 2021-06-03</p>
<p>Fix Resolution (glob-parent): 5.1.2</p>
<p>Direct dependency fix Resolution (karma): 4.3.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> WS-2020-0091</summary>
### Vulnerable Library - <b>http-proxy-1.17.0.tgz</b></p>
<p>HTTP proxying for the masses</p>
<p>Library home page: <a href="https://registry.npmjs.org/http-proxy/-/http-proxy-1.17.0.tgz">https://registry.npmjs.org/http-proxy/-/http-proxy-1.17.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/http-proxy/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- :x: **http-proxy-1.17.0.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
Versions of http-proxy prior to 1.18.1 are vulnerable to Denial of Service. An HTTP request with a long body triggers an ERR_HTTP_HEADERS_SENT unhandled exception that crashes the proxy server. This is only possible when the proxy server sets headers in the proxy request using the proxyReq.setHeader function.
<p>Publish Date: 2020-05-14
<p>URL: <a href=https://github.com/http-party/node-http-proxy/pull/1447>WS-2020-0091</a></p>
</p>
<p></p>
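For illustration only (the listener body below is a hypothetical example, not from the advisory), this is the pattern the description refers to: an http-proxy `proxyReq` listener that sets headers on the outgoing proxied request. In http-proxy before 1.18.1 this combination could crash the proxy with an unhandled ERR_HTTP_HEADERS_SENT on requests with long bodies; the fix is upgrading — the sketch only shows where `setHeader` is involved:

```javascript
// "proxyReq" event listener as used with http-proxy
// (proxy.on("proxyReq", onProxyReq)). Setting headers here is what
// triggered the crash on affected versions when a long body arrived.
function onProxyReq(proxyReq, req) {
  proxyReq.setHeader("x-forwarded-host", (req.headers && req.headers.host) || "");
}
```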
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1486">https://www.npmjs.com/advisories/1486</a></p>
<p>Release Date: 2020-05-14</p>
<p>Fix Resolution (http-proxy): 1.18.1</p>
<p>Direct dependency fix Resolution (karma): 4.3.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-0437</summary>
### Vulnerable Library - <b>karma-4.2.0.tgz</b></p>
<p>Spectacular Test Runner for JavaScript.</p>
<p>Library home page: <a href="https://registry.npmjs.org/karma/-/karma-4.2.0.tgz">https://registry.npmjs.org/karma/-/karma-4.2.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /diagram-js/node_modules/karma/package.json</p>
<p>
Dependency Hierarchy:
- :x: **karma-4.2.0.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
Cross-site Scripting (XSS) - DOM in NPM karma prior to 6.3.14.
<p>Publish Date: 2022-02-05
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0437>CVE-2022-0437</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2022-0437">https://nvd.nist.gov/vuln/detail/CVE-2022-0437</a></p>
<p>Release Date: 2022-02-05</p>
<p>Fix Resolution: karma - v6.3.14</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-28481</summary>
### Vulnerable Library - <b>socket.io-2.1.1.tgz</b></p>
<p>node.js realtime framework server</p>
<p>Library home page: <a href="https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz">https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/socket.io/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- :x: **socket.io-2.1.1.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
The package socket.io before 2.4.0 are vulnerable to Insecure Defaults due to CORS Misconfiguration. All domains are whitelisted by default.
<p>Publish Date: 2021-01-19
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28481>CVE-2020-28481</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>4.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28481">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28481</a></p>
<p>Release Date: 2021-01-19</p>
<p>Fix Resolution (socket.io): 2.4.0</p>
<p>Direct dependency fix Resolution (karma): 5.0.8</p>
</p>
<p></p>
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
<!-- <REMEDIATE>[{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.0.8","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2021-31597","vulnerabilityDetails":"The xmlhttprequest-ssl package before 1.6.1 for Node.js disables SSL certificate validation by default, because rejectUnauthorized (when the property exists but is undefined) is considered to be false within the https.request function of Node.js. In other words, no certificate is ever rejected.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-31597","cvss3Severity":"high","cvss3Score":"9.4","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.0.8","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"WS-2020-0443","vulnerabilityDetails":"In socket.io in versions 1.0.0 to 2.3.0 is vulnerable to Cross-Site Websocket Hijacking, it allows an attacker to bypass origin protection using special symbols include \"`\" and 
\"$\".","vulnerabilityUrl":"https://github.com/socketio/socket.io/commit/f78a575f66ab693c3ea96ea88429ddb1a44c86c7","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Network","I":"High"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.0.8","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-28502","vulnerabilityDetails":"This affects the package xmlhttprequest before 1.7.0; all versions of package xmlhttprequest-ssl. Provided requests are sent synchronously (async\u003dFalse on xhr.open), malicious user input flowing into xhr.send could result in arbitrary code being injected and run.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28502","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"6.0.0","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-36048","vulnerabilityDetails":"Engine.IO before 4.0.0 allows attackers to cause a denial of service (resource consumption) via a POST request to the long polling 
transport.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36048","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.0.8","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-36049","vulnerabilityDetails":"socket.io-parser before 3.4.1 allows attackers to cause a denial of service (memory consumption) via a large packet because a concatenation approach is used.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36049","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.3.0","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-28469","vulnerabilityDetails":"This affects the package glob-parent before 5.1.2. 
The enclosure regex used to check for strings ending in enclosure containing path separator.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.3.0","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"WS-2020-0091","vulnerabilityDetails":"Versions of http-proxy prior to 1.18.1 are vulnerable to Denial of Service. An HTTP request with a long body triggers an ERR_HTTP_HEADERS_SENT unhandled exception that crashes the proxy server. This is only possible when the proxy server sets headers in the proxy request using the proxyReq.setHeader function.","vulnerabilityUrl":"https://github.com/http-party/node-http-proxy/pull/1447","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"karma - v6.3.14","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2022-0437","vulnerabilityDetails":"Cross-site Scripting (XSS) - DOM in NPM karma prior to 
6.3.14.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0437","cvss3Severity":"medium","cvss3Score":"5.4","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.0.8","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-28481","vulnerabilityDetails":"The package socket.io before 2.4.0 are vulnerable to Insecure Defaults due to CORS Misconfiguration. All domains are whitelisted by default.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28481","cvss3Severity":"medium","cvss3Score":"4.3","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"None"},"extraData":{}}]</REMEDIATE> -->
|
True
|
karma-4.2.0.tgz: 9 vulnerabilities (highest severity is: 9.4) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>karma-4.2.0.tgz</b></p></summary>
<p>Spectacular Test Runner for JavaScript.</p>
<p>Library home page: <a href="https://registry.npmjs.org/karma/-/karma-4.2.0.tgz">https://registry.npmjs.org/karma/-/karma-4.2.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /diagram-js/node_modules/karma/package.json</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2021-31597](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-31597) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.4 | xmlhttprequest-ssl-1.5.5.tgz | Transitive | 5.0.8 | ❌ |
| [WS-2020-0443](https://github.com/socketio/socket.io/commit/f78a575f66ab693c3ea96ea88429ddb1a44c86c7) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.1 | socket.io-2.1.1.tgz | Transitive | 5.0.8 | ❌ |
| [CVE-2020-28502](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28502) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.1 | xmlhttprequest-ssl-1.5.5.tgz | Transitive | 5.0.8 | ❌ |
| [CVE-2020-36048](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36048) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | engine.io-3.2.1.tgz | Transitive | 6.0.0 | ❌ |
| [CVE-2020-36049](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36049) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | socket.io-parser-3.2.0.tgz | Transitive | 5.0.8 | ❌ |
| [CVE-2020-28469](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | glob-parent-5.0.0.tgz | Transitive | 4.3.0 | ❌ |
| [WS-2020-0091](https://github.com/http-party/node-http-proxy/pull/1447) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | http-proxy-1.17.0.tgz | Transitive | 4.3.0 | ❌ |
| [CVE-2022-0437](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0437) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.4 | karma-4.2.0.tgz | Direct | karma - v6.3.14 | ❌ |
| [CVE-2020-28481](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28481) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.3 | socket.io-2.1.1.tgz | Transitive | 5.0.8 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-31597</summary>
### Vulnerable Library - <b>xmlhttprequest-ssl-1.5.5.tgz</b></p>
<p>XMLHttpRequest for Node</p>
<p>Library home page: <a href="https://registry.npmjs.org/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz">https://registry.npmjs.org/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/xmlhttprequest-ssl/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- socket.io-2.1.1.tgz
- socket.io-client-2.1.1.tgz
- engine.io-client-3.2.1.tgz
- :x: **xmlhttprequest-ssl-1.5.5.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
The xmlhttprequest-ssl package before 1.6.1 for Node.js disables SSL certificate validation by default, because rejectUnauthorized (when the property exists but is undefined) is considered to be false within the https.request function of Node.js. In other words, no certificate is ever rejected.
<p>Publish Date: 2021-04-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-31597>CVE-2021-31597</a></p>
</p>
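The failure mode is a plain JavaScript truthiness pitfall: a property that exists but is `undefined` behaves like `false` in a boolean check. A minimal sketch of the flawed default handling (hypothetical logic, not the library's actual source):

```javascript
// Sketch of the bug class (hypothetical, not xmlhttprequest-ssl's real code):
// an options object where rejectUnauthorized exists but is undefined
// falls through to "do not verify certificates".
function effectiveRejectUnauthorized(options) {
  // Buggy pattern: relies on truthiness, so undefined disables verification.
  return options.rejectUnauthorized ? true : false;
}

// Safe pattern: only an explicit `false` disables verification;
// a missing or undefined value keeps the secure default.
function safeRejectUnauthorized(options) {
  return options.rejectUnauthorized !== false;
}

const opts = { rejectUnauthorized: undefined };
console.log(effectiveRejectUnauthorized(opts)); // false -> certs never checked
console.log(safeRejectUnauthorized(opts));      // true  -> secure default kept
```

The fixed release applies the second pattern, so only an explicit opt-out disables certificate validation.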
<p></p>
### CVSS 3 Score Details (<b>9.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-31597">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-31597</a></p>
<p>Release Date: 2021-04-23</p>
<p>Fix Resolution (xmlhttprequest-ssl): 1.6.1</p>
<p>Direct dependency fix Resolution (karma): 5.0.8</p>
</p>
<p></p>
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> WS-2020-0443</summary>
### Vulnerable Library - <b>socket.io-2.1.1.tgz</b></p>
<p>node.js realtime framework server</p>
<p>Library home page: <a href="https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz">https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/socket.io/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- :x: **socket.io-2.1.1.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
socket.io versions 1.0.0 through 2.3.0 are vulnerable to Cross-Site WebSocket Hijacking; an attacker can bypass origin protection using special symbols, including "`" and "$".
<p>Publish Date: 2020-02-20
<p>URL: <a href=https://github.com/socketio/socket.io/commit/f78a575f66ab693c3ea96ea88429ddb1a44c86c7>WS-2020-0443</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>8.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://hackerone.com/reports/931197">https://hackerone.com/reports/931197</a></p>
<p>Release Date: 2020-02-20</p>
<p>Fix Resolution (socket.io): 2.4.0</p>
<p>Direct dependency fix Resolution (karma): 5.0.8</p>
</p>
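If upgrading karma itself is not immediately possible, the transitive socket.io dependency can be forced to the patched release with Yarn classic's `resolutions` field (npm 8+ offers a similar `overrides` field). A package.json fragment, assuming Yarn classic:

```json
{
  "resolutions": {
    "**/socket.io": "^2.4.0"
  }
}
```

After editing, re-run the install and confirm the lockfile now resolves socket.io at 2.4.0 or later.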
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-28502</summary>
### Vulnerable Library - <b>xmlhttprequest-ssl-1.5.5.tgz</b></p>
<p>XMLHttpRequest for Node</p>
<p>Library home page: <a href="https://registry.npmjs.org/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz">https://registry.npmjs.org/xmlhttprequest-ssl/-/xmlhttprequest-ssl-1.5.5.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/xmlhttprequest-ssl/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- socket.io-2.1.1.tgz
- socket.io-client-2.1.1.tgz
- engine.io-client-3.2.1.tgz
- :x: **xmlhttprequest-ssl-1.5.5.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
This affects the package xmlhttprequest before 1.7.0 and all versions of package xmlhttprequest-ssl. If requests are sent synchronously (async=false on xhr.open), malicious user input flowing into xhr.send could result in arbitrary code being injected and run.
<p>Publish Date: 2021-03-05
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28502>CVE-2020-28502</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>8.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-h4j5-c7cj-74xg">https://github.com/advisories/GHSA-h4j5-c7cj-74xg</a></p>
<p>Release Date: 2021-03-05</p>
<p>Fix Resolution (xmlhttprequest-ssl): 1.6.1</p>
<p>Direct dependency fix Resolution (karma): 5.0.8</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-36048</summary>
### Vulnerable Library - <b>engine.io-3.2.1.tgz</b></p>
<p>The realtime engine behind Socket.IO. Provides the foundation of a bidirectional connection between client and server</p>
<p>Library home page: <a href="https://registry.npmjs.org/engine.io/-/engine.io-3.2.1.tgz">https://registry.npmjs.org/engine.io/-/engine.io-3.2.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/engine.io/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- socket.io-2.1.1.tgz
- :x: **engine.io-3.2.1.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
Engine.IO before 4.0.0 allows attackers to cause a denial of service (resource consumption) via a POST request to the long polling transport.
<p>Publish Date: 2021-01-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36048>CVE-2020-36048</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-36048">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-36048</a></p>
<p>Release Date: 2021-01-08</p>
<p>Fix Resolution (engine.io): 4.0.0-alpha.0</p>
<p>Direct dependency fix Resolution (karma): 6.0.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-36049</summary>
### Vulnerable Library - <b>socket.io-parser-3.2.0.tgz</b></p>
<p>socket.io protocol parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/socket.io-parser/-/socket.io-parser-3.2.0.tgz">https://registry.npmjs.org/socket.io-parser/-/socket.io-parser-3.2.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/socket.io-parser/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- socket.io-2.1.1.tgz
- :x: **socket.io-parser-3.2.0.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
socket.io-parser before 3.4.1 allows attackers to cause a denial of service (memory consumption) via a large packet because a concatenation approach is used.
<p>Publish Date: 2021-01-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36049>CVE-2020-36049</a></p>
</p>
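The underlying cost model: rebuilding a payload with string concatenation copies the entire accumulated buffer on every chunk (quadratic work), which is what lets one large packet burn memory and CPU. An illustrative comparison (not socket.io-parser's actual code):

```javascript
// Illustrative only -- not socket.io-parser's real implementation.
// Concatenation copies everything received so far on each chunk: O(n^2).
function reassembleByConcat(chunks) {
  let payload = '';
  for (const chunk of chunks) {
    payload += chunk; // re-copies the whole accumulated string
  }
  return payload;
}

// Collecting chunks and joining once at the end is linear.
function reassembleByJoin(chunks) {
  const parts = [];
  for (const chunk of chunks) {
    parts.push(chunk); // O(1) per chunk
  }
  return parts.join(''); // single final copy
}

const chunks = Array.from({ length: 1000 }, () => 'x'.repeat(100));
console.log(reassembleByConcat(chunks) === reassembleByJoin(chunks)); // true
```

Both produce the same payload; the patched parser avoids the repeated-copy approach so oversized packets can no longer amplify memory churn.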
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-xfhh-g9f5-x4m4">https://github.com/advisories/GHSA-xfhh-g9f5-x4m4</a></p>
<p>Release Date: 2021-01-08</p>
<p>Fix Resolution (socket.io-parser): 3.3.2</p>
<p>Direct dependency fix Resolution (karma): 5.0.8</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-28469</summary>
### Vulnerable Library - <b>glob-parent-5.0.0.tgz</b></p>
<p>Extract the non-magic parent path from a glob string.</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-5.0.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-5.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- chokidar-3.0.2.tgz
- :x: **glob-parent-5.0.0.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
This affects the package glob-parent before 5.1.2. The enclosure regex used to check for strings ending in an enclosure containing a path separator is vulnerable to regular expression denial of service (ReDoS).
<p>Publish Date: 2021-06-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469>CVE-2020-28469</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469</a></p>
<p>Release Date: 2021-06-03</p>
<p>Fix Resolution (glob-parent): 5.1.2</p>
<p>Direct dependency fix Resolution (karma): 4.3.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> WS-2020-0091</summary>
### Vulnerable Library - <b>http-proxy-1.17.0.tgz</b></p>
<p>HTTP proxying for the masses</p>
<p>Library home page: <a href="https://registry.npmjs.org/http-proxy/-/http-proxy-1.17.0.tgz">https://registry.npmjs.org/http-proxy/-/http-proxy-1.17.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/http-proxy/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- :x: **http-proxy-1.17.0.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
Versions of http-proxy prior to 1.18.1 are vulnerable to Denial of Service. An HTTP request with a long body triggers an ERR_HTTP_HEADERS_SENT unhandled exception that crashes the proxy server. This is only possible when the proxy server sets headers in the proxy request using the proxyReq.setHeader function.
<p>Publish Date: 2020-05-14
<p>URL: <a href=https://github.com/http-party/node-http-proxy/pull/1447>WS-2020-0091</a></p>
</p>
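The crash happens when a header is set after the response headers have already been flushed to the client. A minimal sketch of the failure mode and the defensive guard, using mock objects rather than http-proxy's internals:

```javascript
// Mock-based sketch (not http-proxy's actual source). Calling setHeader()
// after the headers are flushed throws ERR_HTTP_HEADERS_SENT and, if
// unhandled, crashes the proxy process. Guarding on res.headersSent
// turns the crash into a no-op.
function setProxyHeader(res, name, value) {
  if (res.headersSent) {
    return false; // headers already flushed; setting another would throw
  }
  res.setHeader(name, value);
  return true;
}

// Stand-ins for http.ServerResponse in the two states.
const freshRes = {
  headersSent: false,
  headers: {},
  setHeader(n, v) { this.headers[n] = v; }
};
const flushedRes = {
  headersSent: true,
  headers: {},
  setHeader() { throw new Error('ERR_HTTP_HEADERS_SENT'); }
};

console.log(setProxyHeader(freshRes, 'x-forwarded-for', '10.0.0.1'));   // true
console.log(setProxyHeader(flushedRes, 'x-forwarded-for', '10.0.0.1')); // false (no throw)
```

http-proxy 1.18.1 hardens the equivalent path so a long-bodied request can no longer trigger the unhandled exception.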
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1486">https://www.npmjs.com/advisories/1486</a></p>
<p>Release Date: 2020-05-14</p>
<p>Fix Resolution (http-proxy): 1.18.1</p>
<p>Direct dependency fix Resolution (karma): 4.3.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-0437</summary>
### Vulnerable Library - <b>karma-4.2.0.tgz</b></p>
<p>Spectacular Test Runner for JavaScript.</p>
<p>Library home page: <a href="https://registry.npmjs.org/karma/-/karma-4.2.0.tgz">https://registry.npmjs.org/karma/-/karma-4.2.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /diagram-js/node_modules/karma/package.json</p>
<p>
Dependency Hierarchy:
- :x: **karma-4.2.0.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
DOM-based Cross-site Scripting (XSS) in the NPM package karma prior to 6.3.14.
<p>Publish Date: 2022-02-05
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0437>CVE-2022-0437</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2022-0437">https://nvd.nist.gov/vuln/detail/CVE-2022-0437</a></p>
<p>Release Date: 2022-02-05</p>
<p>Fix Resolution: karma - v6.3.14</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-28481</summary>
### Vulnerable Library - <b>socket.io-2.1.1.tgz</b></p>
<p>node.js realtime framework server</p>
<p>Library home page: <a href="https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz">https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/socket.io/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.2.0.tgz (Root Library)
- :x: **socket.io-2.1.1.tgz** (Vulnerable Library)
</p>
<p></p>
### Vulnerability Details
<p>
The package socket.io before 2.4.0 is vulnerable to Insecure Defaults due to CORS misconfiguration: all domains are whitelisted by default.
<p>Publish Date: 2021-01-19
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28481>CVE-2020-28481</a></p>
</p>
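Until the upgrade lands, the permissive default can be tightened by whitelisting origins explicitly. A configuration sketch for a socket.io 2.x server (the origin value is a placeholder; adjust to your deployment):

```javascript
// Configuration sketch for socket.io 2.x. Replaces the permissive
// default (all domains whitelisted) with a single trusted origin.
const io = require('socket.io')(httpServer, {
  origins: 'https://app.example.com:443'
});
```

In 2.4.0 and later the same option remains available, but the insecure default is corrected.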
<p></p>
### CVSS 3 Score Details (<b>4.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28481">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28481</a></p>
<p>Release Date: 2021-01-19</p>
<p>Fix Resolution (socket.io): 2.4.0</p>
<p>Direct dependency fix Resolution (karma): 5.0.8</p>
</p>
<p></p>
</details>
<!-- <REMEDIATE>[{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.0.8","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2021-31597","vulnerabilityDetails":"The xmlhttprequest-ssl package before 1.6.1 for Node.js disables SSL certificate validation by default, because rejectUnauthorized (when the property exists but is undefined) is considered to be false within the https.request function of Node.js. In other words, no certificate is ever rejected.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-31597","cvss3Severity":"high","cvss3Score":"9.4","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.0.8","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"WS-2020-0443","vulnerabilityDetails":"In socket.io in versions 1.0.0 to 2.3.0 is vulnerable to Cross-Site Websocket Hijacking, it allows an attacker to bypass origin protection using special symbols include \"`\" and 
\"$\".","vulnerabilityUrl":"https://github.com/socketio/socket.io/commit/f78a575f66ab693c3ea96ea88429ddb1a44c86c7","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Network","I":"High"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.0.8","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-28502","vulnerabilityDetails":"This affects the package xmlhttprequest before 1.7.0; all versions of package xmlhttprequest-ssl. Provided requests are sent synchronously (async\u003dFalse on xhr.open), malicious user input flowing into xhr.send could result in arbitrary code being injected and run.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28502","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"6.0.0","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-36048","vulnerabilityDetails":"Engine.IO before 4.0.0 allows attackers to cause a denial of service (resource consumption) via a POST request to the long polling 
transport.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36048","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.0.8","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-36049","vulnerabilityDetails":"socket.io-parser before 3.4.1 allows attackers to cause a denial of service (memory consumption) via a large packet because a concatenation approach is used.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36049","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.3.0","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-28469","vulnerabilityDetails":"This affects the package glob-parent before 5.1.2. 
The enclosure regex used to check for strings ending in enclosure containing path separator.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.3.0","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"WS-2020-0091","vulnerabilityDetails":"Versions of http-proxy prior to 1.18.1 are vulnerable to Denial of Service. An HTTP request with a long body triggers an ERR_HTTP_HEADERS_SENT unhandled exception that crashes the proxy server. This is only possible when the proxy server sets headers in the proxy request using the proxyReq.setHeader function.","vulnerabilityUrl":"https://github.com/http-party/node-http-proxy/pull/1447","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"karma - v6.3.14","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2022-0437","vulnerabilityDetails":"Cross-site Scripting (XSS) - DOM in NPM karma prior to 
6.3.14.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0437","cvss3Severity":"medium","cvss3Score":"5.4","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}},{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"karma","packageVersion":"4.2.0","packageFilePaths":["/package.json"],"isTransitiveDependency":false,"dependencyTree":"karma:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.0.8","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-28481","vulnerabilityDetails":"The package socket.io before 2.4.0 are vulnerable to Insecure Defaults due to CORS Misconfiguration. All domains are whitelisted by default.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28481","cvss3Severity":"medium","cvss3Score":"4.3","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"None"},"extraData":{}}]</REMEDIATE> -->
|
non_process
|
karma tgz vulnerabilities highest severity is vulnerable library karma tgz spectacular test runner for javascript library home page a href path to dependency file package json path to vulnerable library diagram js node modules karma package json vulnerabilities cve severity cvss dependency type fixed in remediation available high xmlhttprequest ssl tgz transitive ❌ high socket io tgz transitive ❌ high xmlhttprequest ssl tgz transitive ❌ high engine io tgz transitive ❌ high socket io parser tgz transitive ❌ high glob parent tgz transitive ❌ high http proxy tgz transitive ❌ medium karma tgz direct karma ❌ medium socket io tgz transitive ❌ details cve vulnerable library xmlhttprequest ssl tgz xmlhttprequest for node library home page a href path to dependency file package json path to vulnerable library node modules xmlhttprequest ssl package json dependency hierarchy karma tgz root library socket io tgz socket io client tgz engine io client tgz x xmlhttprequest ssl tgz vulnerable library vulnerability details the xmlhttprequest ssl package before for node js disables ssl certificate validation by default because rejectunauthorized when the property exists but is undefined is considered to be false within the https request function of node js in other words no certificate is ever rejected publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution xmlhttprequest ssl direct dependency fix resolution karma step up your open source security game with whitesource ws vulnerable library socket io tgz node js realtime framework server library home page a href path to dependency file package json path to vulnerable library node modules 
socket io package json dependency hierarchy karma tgz root library x socket io tgz vulnerable library vulnerability details in socket io in versions to is vulnerable to cross site websocket hijacking it allows an attacker to bypass origin protection using special symbols include and publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution socket io direct dependency fix resolution karma step up your open source security game with whitesource cve vulnerable library xmlhttprequest ssl tgz xmlhttprequest for node library home page a href path to dependency file package json path to vulnerable library node modules xmlhttprequest ssl package json dependency hierarchy karma tgz root library socket io tgz socket io client tgz engine io client tgz x xmlhttprequest ssl tgz vulnerable library vulnerability details this affects the package xmlhttprequest before all versions of package xmlhttprequest ssl provided requests are sent synchronously async false on xhr open malicious user input flowing into xhr send could result in arbitrary code being injected and run publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution xmlhttprequest ssl direct dependency fix resolution karma step up your open source security game with whitesource cve vulnerable library engine io tgz the realtime engine behind 
socket io provides the foundation of a bidirectional connection between client and server library home page a href path to dependency file package json path to vulnerable library node modules engine io package json dependency hierarchy karma tgz root library socket io tgz x engine io tgz vulnerable library vulnerability details engine io before allows attackers to cause a denial of service resource consumption via a post request to the long polling transport publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution engine io alpha direct dependency fix resolution karma step up your open source security game with whitesource cve vulnerable library socket io parser tgz socket io protocol parser library home page a href path to dependency file package json path to vulnerable library node modules socket io parser package json dependency hierarchy karma tgz root library socket io tgz x socket io parser tgz vulnerable library vulnerability details socket io parser before allows attackers to cause a denial of service memory consumption via a large packet because a concatenation approach is used publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution socket io parser direct dependency fix resolution karma step up your open source security game with whitesource cve vulnerable library glob parent 
tgz extract the non magic parent path from a glob string library home page a href path to dependency file package json path to vulnerable library node modules glob parent package json dependency hierarchy karma tgz root library chokidar tgz x glob parent tgz vulnerable library vulnerability details this affects the package glob parent before the enclosure regex used to check for strings ending in enclosure containing path separator publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution glob parent direct dependency fix resolution karma step up your open source security game with whitesource ws vulnerable library http proxy tgz http proxying for the masses library home page a href path to dependency file package json path to vulnerable library node modules http proxy package json dependency hierarchy karma tgz root library x http proxy tgz vulnerable library vulnerability details versions of http proxy prior to are vulnerable to denial of service an http request with a long body triggers an err http headers sent unhandled exception that crashes the proxy server this is only possible when the proxy server sets headers in the proxy request using the proxyreq setheader function publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution http proxy direct dependency fix resolution karma step 
up your open source security game with whitesource cve vulnerable library karma tgz spectacular test runner for javascript library home page a href path to dependency file package json path to vulnerable library diagram js node modules karma package json dependency hierarchy x karma tgz vulnerable library vulnerability details cross site scripting xss dom in npm karma prior to publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution karma step up your open source security game with whitesource cve vulnerable library socket io tgz node js realtime framework server library home page a href path to dependency file package json path to vulnerable library node modules socket io package json dependency hierarchy karma tgz root library x socket io tgz vulnerable library vulnerability details the package socket io before are vulnerable to insecure defaults due to cors misconfiguration all domains are whitelisted by default publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution socket io direct dependency fix resolution karma step up your open source security game with whitesource istransitivedependency false dependencytree karma isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails the xmlhttprequest ssl package before 
for node js disables ssl certificate validation by default because rejectunauthorized when the property exists but is undefined is considered to be false within the https request function of node js in other words no certificate is ever rejected vulnerabilityurl istransitivedependency false dependencytree karma isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier ws vulnerabilitydetails in socket io in versions to is vulnerable to cross site websocket hijacking it allows an attacker to bypass origin protection using special symbols include and vulnerabilityurl istransitivedependency false dependencytree karma isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails this affects the package xmlhttprequest before all versions of package xmlhttprequest ssl provided requests are sent synchronously async on xhr open malicious user input flowing into xhr send could result in arbitrary code being injected and run vulnerabilityurl istransitivedependency false dependencytree karma isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails engine io before allows attackers to cause a denial of service resource consumption via a post request to the long polling transport vulnerabilityurl istransitivedependency false dependencytree karma isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails socket io parser before allows attackers to cause a denial of service memory consumption via a large packet because a concatenation approach is used vulnerabilityurl istransitivedependency false dependencytree karma isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails this affects the package glob parent before the enclosure regex used to check for 
strings ending in enclosure containing path separator vulnerabilityurl istransitivedependency false dependencytree karma isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier ws vulnerabilitydetails versions of http proxy prior to are vulnerable to denial of service an http request with a long body triggers an err http headers sent unhandled exception that crashes the proxy server this is only possible when the proxy server sets headers in the proxy request using the proxyreq setheader function vulnerabilityurl istransitivedependency false dependencytree karma isminimumfixversionavailable true minimumfixversion karma isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails cross site scripting xss dom in npm karma prior to vulnerabilityurl istransitivedependency false dependencytree karma isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails the package socket io before are vulnerable to insecure defaults due to cors misconfiguration all domains are whitelisted by default vulnerabilityurl
| 0
|
17,698
| 23,547,576,855
|
IssuesEvent
|
2022-08-21 10:54:15
|
ForNeVeR/Cesium
|
https://api.github.com/repos/ForNeVeR/Cesium
|
closed
|
Add support for #ifdef/#else/#endif
|
kind:feature area:preprocessor good-first-issue
|
Right now `#ifdef/#endif` is supported, but you cannot include `#else`.
|
1.0
|
Add support for #ifdef/#else/#endif - Right now `#ifdef/#endif` is supported, but you cannot include `#else`.
|
process
|
add support for ifdef else endif right now ifdef endif supported but you cannot include else
| 1
|
18,260
| 24,341,714,129
|
IssuesEvent
|
2022-10-01 19:50:39
|
quark-engine/quark-engine
|
https://api.github.com/repos/quark-engine/quark-engine
|
closed
|
CLI gives an outdated path to the default ruleset
|
issue-processing-state-06
|
**Describe the bug**
Currently, the CLI shows an outdated path to the built-in ruleset.
After running `freshquark`, we have the following output.

The last line shows that the path to the built-in rules is `$HOME/.quark-engine/quark-rules/<rule_name>.json`.
However, the path is incorrect since quark-engine/quark-rules#26 had moved all the rules into a new folder called `rules`. Thus, the current path to the ruleset should be `$HOME/.quark-engine/quark-rules/rules/<rule_name>.json`.
**Expected behavior**
The CLI shows the correct path to the ruleset.
**To reproduce**
```bash
freshquark
```
**Environment**
- Ubuntu 22.04
- Quark-Engine v22.7.1
|
1.0
|
CLI gives an outdated path to the default ruleset - **Describe the bug**
Currently, the CLI shows an outdated path to the built-in ruleset.
After running `freshquark`, we have the following output.

The last line shows that the path to the built-in rules is `$HOME/.quark-engine/quark-rules/<rule_name>.json`.
However, the path is incorrect since quark-engine/quark-rules#26 had moved all the rules into a new folder called `rules`. Thus, the current path to the ruleset should be `$HOME/.quark-engine/quark-rules/rules/<rule_name>.json`.
**Expected behavior**
The CLI shows the correct path to the ruleset.
**To reproduce**
```bash
freshquark
```
**Environment**
- Ubuntu 22.04
- Quark-Engine v22.7.1
|
process
|
cli gives an outdated path to the default ruleset describe the bug currently the cli shows an outdated path to the built in ruleset after running freshquark we have the following output the last line shows that the path to the built in rules is home quark engine quark rules json however the path is incorrect since quark engine quark rules had moved all the rules into a new folder called rules thus the current path to the ruleset should be home quark engine quark rules rules json expected behavior the cli shows the correct path to the ruleset to reproduce bash freshquark environment ubuntu quark engine
| 1
|
37
| 2,507,170,375
|
IssuesEvent
|
2015-01-12 16:33:21
|
GsDevKit/gsApplicationTools
|
https://api.github.com/repos/GsDevKit/gsApplicationTools
|
closed
|
remove `ports` iv from GemServer
|
in process
|
forgot to remove it when I was replacing `ports` with `portOrResourceNameList`
|
1.0
|
remove `ports` iv from GemServer - forgot to remove it when I was replacing `ports` with `portOrResourceNameList`
|
process
|
remove ports iv from gemserver forgot to remove it when i was replacing ports with portorresourcenamelist
| 1
|
9,131
| 12,200,457,074
|
IssuesEvent
|
2020-04-30 04:47:06
|
nkumar115/Data-Science
|
https://api.github.com/repos/nkumar115/Data-Science
|
opened
|
Different Types of Joins in Python - Pandas
|
Data - PreProcessing
|
**TL;DR**
Different Types of Joins in Python - Pandas
**Link**
https://www.analyticsvidhya.com/blog/2020/02/joins-in-pandas-master-the-different-types-of-joins-in-python/
**Author**
Analytics Vidhya
**Outcomes**
1. Inner Join in Pandas
2. Full Join in Pandas
3. Left Join in Pandas
4. Right Join in Pandas
|
1.0
|
Different Types of Joins in Python - Pandas - **TL;DR**
Different Types of Joins in Python - Pandas
**Link**
https://www.analyticsvidhya.com/blog/2020/02/joins-in-pandas-master-the-different-types-of-joins-in-python/
**Author**
Analytics Vidhya
**Outcomes**
1. Inner Join in Pandas
2. Full Join in Pandas
3. Left Join in Pandas
4. Right Join in Pandas
|
process
|
different types of joins in python pandas tl dr different types of joins in python pandas link author analytics vidhya outcomes inner join in pandas full join in pandas left join in pandas right join in pandas
| 1
|
11,031
| 13,838,640,214
|
IssuesEvent
|
2020-10-14 06:37:59
|
amor71/LiuAlgoTrader
|
https://api.github.com/repos/amor71/LiuAlgoTrader
|
closed
|
improvements to build process
|
in-process
|
1. fix issue w/ package versioning post-deploy
2. fix badge issues on pypi page
3. auto-build & deploy a new docker file when a new Liu is published
|
1.0
|
improvements to build process - 1. fix issue w/ package versioning post-deploy
2. fix badge issues on pypi page
3. auto-build & deploy a new docker file when a new Liu is published
|
process
|
improvements to build process fix issue w package versioning post deploy fix badge issues on pypi page auto build deploy a new docker file when a new liu is published
| 1
|
169,572
| 13,152,604,550
|
IssuesEvent
|
2020-08-09 23:18:06
|
microsoft/STL
|
https://api.github.com/repos/microsoft/STL
|
opened
|
Add test coverage for deque::erase(iter, iter) avoiding self-move-assigns
|
test
|
#1118 was fixed by #1148, so `deque::erase(iter, iter)` avoids performing self-move-assigns when called with empty ranges. We should have test coverage for this scenario. Our test for `vector` can be extended:
https://github.com/microsoft/STL/blob/19135668f4110210e663bfc0502d3359470bbd18/tests/std/tests/Dev10_881629_vector_erase_return_value/test.cpp#L10-L11
|
1.0
|
Add test coverage for deque::erase(iter, iter) avoiding self-move-assigns - #1118 was fixed by #1148, so `deque::erase(iter, iter)` avoids performing self-move-assigns when called with empty ranges. We should have test coverage for this scenario. Our test for `vector` can be extended:
https://github.com/microsoft/STL/blob/19135668f4110210e663bfc0502d3359470bbd18/tests/std/tests/Dev10_881629_vector_erase_return_value/test.cpp#L10-L11
|
non_process
|
add test coverage for deque erase iter iter avoiding self move assigns was fixed by so deque erase iter iter avoids performing self move assigns when called with empty ranges we should have test coverage for this scenario our test for vector can be extended
| 0
|
31,553
| 5,959,503,233
|
IssuesEvent
|
2017-05-29 11:17:27
|
wunderkraut/WunderTools
|
https://api.github.com/repos/wunderkraut/WunderTools
|
closed
|
Review & make new site setup documentation better
|
Documentation
|
[Documentation](/wunderkraut/WunderTools/blob/master/docs/Setup.md) for spinning up a new D8 site with wundertools is outdated. Make it better.
|
1.0
|
Review & make new site setup documentation better - [Documentation](/wunderkraut/WunderTools/blob/master/docs/Setup.md) for spinning up a new D8 site with wundertools is outdated. Make it better.
|
non_process
|
review make new site setup documentation better wunderkraut wundertools blob master docs setup md for spinning up a new site with wundertools is outdated make it better
| 0
|
74,720
| 20,325,471,763
|
IssuesEvent
|
2022-02-18 05:05:12
|
numpy/numpy
|
https://api.github.com/repos/numpy/numpy
|
closed
|
'setup.py install --skip-build' misses generated files
|
component: numpy.distutils 57 - Close? component: build
|
### Reproducing code example:
```
python3.8 setup.py build config_fc --noopt --noarch --fcompiler=gnu95
python3.8 setup.py install --skip-build --root=<some-directory> config_fc --noopt --noarch --fcompiler=gnu95
```
### Error message:
The install tree is missing the following files that are normally installed (without `--skip-build`):
```
/usr/lib/python3.8/site-packages/numpy/core/include/numpy/__multiarray_api.h
/usr/lib/python3.8/site-packages/numpy/core/include/numpy/multiarray_api.txt
/usr/lib/python3.8/site-packages/numpy/core/include/numpy/_numpyconfig.h
/usr/lib/python3.8/site-packages/numpy/core/include/numpy/__ufunc_api.h
/usr/lib/python3.8/site-packages/numpy/core/include/numpy/ufunc_api.txt
/usr/lib/python3.8/site-packages/numpy/core/lib/npy-pkg-config/mlib.ini
/usr/lib/python3.8/site-packages/numpy/core/lib/npy-pkg-config/npymath.ini
```
### Numpy/Python version information:
```
$ python -c 'import sys, numpy; print(numpy.__version__, sys.version)'
1.17.4 3.8.2 (default, Apr 4 2020, 10:16:10)
[GCC 9.2.0]
```
|
1.0
|
'setup.py install --skip-build' misses generated files - ### Reproducing code example:
```
python3.8 setup.py build config_fc --noopt --noarch --fcompiler=gnu95
python3.8 setup.py install --skip-build --root=<some-directory> config_fc --noopt --noarch --fcompiler=gnu95
```
### Error message:
The install tree is missing the following files that are normally installed (without `--skip-build`):
```
/usr/lib/python3.8/site-packages/numpy/core/include/numpy/__multiarray_api.h
/usr/lib/python3.8/site-packages/numpy/core/include/numpy/multiarray_api.txt
/usr/lib/python3.8/site-packages/numpy/core/include/numpy/_numpyconfig.h
/usr/lib/python3.8/site-packages/numpy/core/include/numpy/__ufunc_api.h
/usr/lib/python3.8/site-packages/numpy/core/include/numpy/ufunc_api.txt
/usr/lib/python3.8/site-packages/numpy/core/lib/npy-pkg-config/mlib.ini
/usr/lib/python3.8/site-packages/numpy/core/lib/npy-pkg-config/npymath.ini
```
### Numpy/Python version information:
```
$ python -c 'import sys, numpy; print(numpy.__version__, sys.version)'
1.17.4 3.8.2 (default, Apr 4 2020, 10:16:10)
[GCC 9.2.0]
```
|
non_process
|
setup py install skip build misses generated files reproducing code example setup py build config fc noopt noarch fcompiler setup py install skip build root config fc noopt noarch fcompiler error message the install tree is missing the following files that are normally installed without skip build usr lib site packages numpy core include numpy multiarray api h usr lib site packages numpy core include numpy multiarray api txt usr lib site packages numpy core include numpy numpyconfig h usr lib site packages numpy core include numpy ufunc api h usr lib site packages numpy core include numpy ufunc api txt usr lib site packages numpy core lib npy pkg config mlib ini usr lib site packages numpy core lib npy pkg config npymath ini numpy python version information python c import sys numpy print numpy version sys version default apr
| 0
|
15,144
| 18,895,775,564
|
IssuesEvent
|
2021-11-15 17:43:18
|
nion-software/nionswift
|
https://api.github.com/repos/nion-software/nionswift
|
opened
|
Consider reintroducing separation between regions of interest ("regions") and graphics
|
type - enhancement stage - planning level - difficult f - graphics f - processing f - regions-of-interest
|
We might want to step back and distinguish between a *region*, which is a subset of data; and a *graphic*, which is the region rendered on a line plot or image display.
We currently have cases where a separation would be helpful. A use case would be the display slice of SI-like data, which is currently stored as two fractional values in the `DisplayDataChannel` and represented graphically by an interval graphic on any associated pick line plot. An associated issue is that [there is no way to reconstitute the slice graphic after removing it](https://github.com/nion-software/nionswift/issues/23).
This would also be helpful in cases in EELS processing where having a region separate from a graphic would be used to track internal regions which aren't displayed to the user.
Open questions are where these region objects would reside? and how would they be linked to graphics?
|
1.0
|
Consider reintroducing separation between regions of interest ("regions") and graphics - We might want to step back and distinguish between a *region*, which is a subset of data; and a *graphic*, which is the region rendered on a line plot or image display.
We currently have cases where a separation would be helpful. A use case would be the display slice of SI-like data, which is currently stored as two fractional values in the `DisplayDataChannel` and represented graphically by an interval graphic on any associated pick line plot. An associated issue is that [there is no way to reconstitute the slice graphic after removing it](https://github.com/nion-software/nionswift/issues/23).
This would also be helpful in cases in EELS processing where having a region separate from a graphic would be used to track internal regions which aren't displayed to the user.
Open questions are where these region objects would reside? and how would they be linked to graphics?
|
process
|
consider reintroducing separation between regions of interest regions and graphics we might want to step back and distinguish between a region which is a subset of data and a graphic which is the region rendered on a line plot or image display we currently have cases where a separation would be helpful a use case would be the display slice of si like data which is currently stored as two fractional values in the displaydatachannel and represented graphically by an interval graphic on any associated pick line plot an associated issue is that this would also be helpful in cases in eels processing where having a region separate from a graphic would be used to track internal regions which aren t displayed to the user open questions are where these region objects would reside and how would they be linked to graphics
| 1
|
46,096
| 24,361,858,138
|
IssuesEvent
|
2022-10-03 12:23:41
|
jupyterlab/jupyterlab
|
https://api.github.com/repos/jupyterlab/jupyterlab
|
closed
|
Sluggish UI when long, math-heavy notebooks are open.
|
bug tag:Performance
|
## Description
We're noticing the UI in Lab 3.0.x (x ~ 5-7) getting sluggish (slow menu switching, latency when highlighting menu entries or typing, tab switching) when notebooks that are both long and have a lot of math are open. [This notebook](https://github.com/UCB-stat-159-s21/site/blob/main/Notes/tests.ipynb) was the one we noticed with, but it might be just "long and lots of math" that's the problem.
Unfortunately it's tricky to pin down exactly what the problem is - I've tried opening it several times, locally and on the UC Berkeley, cloud-hosted campus hub, and I get inconsistent results. Sometimes my Lab instance gets sluggish while it's open, but other times it seems fairly normal.
## Reproduce
To the extent it happens, I suggest
1. Download [that notebook](https://github.com/UCB-stat-159-s21/site/blob/main/Notes/tests.ipynb)
2. Experiment with moving around, opening menus, having a few other notebooks open and switching back and forth, etc.
## Expected behavior
UI should remain responsive and without high latency on interactive actions (menu open/hover, typing, tab switching, etc).
## Context
I see folks in #4292 also discussing rendering issues, though there the conversation went more towards codemirror issues, so I'm not sure if the math aspect is the main problem, or "lots of codemirror instances", or some other combination of factors.
@ellisonbg mentioned we might want to test the [KaTeX extension](https://github.com/jupyterlab/jupyter-renderers/tree/master/packages/katex-extension) that I see @jasongrout has made recent commits to.
The notebooks the above is part of (all part of a class I'm co-teaching with @pbstark) have a ton of math so it's possible this alternate renderer is not enough for us, but it would be useful to know if it makes a significant difference in rendering speed and/or responsiveness of the UI. If anyone has a chance to test it out, that would be great. I'll report more if I do.
- Operating System and version: macOS Big Sur.
- Browser and version: Chrome 88 (current).
- JupyterLab version: 3.0.5, 3.0.7.
|
True
|
Sluggish UI when long, math-heavy notebooks are open. - ## Description
We're noticing the UI in Lab 3.0.x (x ~ 5-7) getting sluggish (slow menu switching, latency when highlighting menu entries or typing, tab switching) when notebooks that are both long and have a lot of math are open. [This notebook](https://github.com/UCB-stat-159-s21/site/blob/main/Notes/tests.ipynb) was the one we noticed with, but it might be just "long and lots of math" that's the problem.
Unfortunately it's tricky to pin down exactly what the problem is - I've tried opening it several times, locally and on the UC Berkeley, cloud-hosted campus hub, and I get inconsistent results. Sometimes my Lab instance gets sluggish while it's open, but other times it seems fairly normal.
## Reproduce
To the extent it happens, I suggest
1. Download [that notebook](https://github.com/UCB-stat-159-s21/site/blob/main/Notes/tests.ipynb)
2. Experiment with moving around, opening menus, having a few other notebooks open and switching back and forth, etc.
## Expected behavior
UI should remain responsive and without high latency on interactive actions (menu open/hover, typing, tab switching, etc).
## Context
I see folks in #4292 also discussing rendering issues, though there the conversation went more towards codemirror issues, so I'm not sure if the math aspect is the main problem, or "lots of codemirror instances", or some other combination of factors.
@ellisonbg mentioned we might want to test the [KaTeX extension](https://github.com/jupyterlab/jupyter-renderers/tree/master/packages/katex-extension) that I see @jasongrout has made recent commits to.
The notebooks the above is part of (all part of a class I'm co-teaching with @pbstark) have a ton of math so it's possible this alternate renderer is not enough for us, but it would be useful to know if it makes a significant difference in rendering speed and/or responsiveness of the UI. If anyone has a chance to test it out, that would be great. I'll report more if I do.
- Operating System and version: macOS Big Sur.
- Browser and version: Chrome 88 (current).
- JupyterLab version: 3.0.5, 3.0.7.
|
non_process
|
sluggish ui when long math heavy notebooks are open description we re noticing the ui in lab x x getting sluggish slow menu switching latency when highlighting menu entries or typing tab switching when notebooks that are both long and have a lot of math are open was the one we noticed with but it might be just long and lots of math that s the problem unfortunately it s tricky to pin down exactly what the problem is i ve tried opening it several times locally and on the uc berkeley cloud hosted campus hub and i get inconsistent results sometimes my lab instance gets sluggish while it s open but other times it seems fairly normal reproduce to the extent it happens i suggest download experiment with moving around opening menus having a few other notebooks open and switching back and forth etc expected behavior ui should remain responsive and without high latency on interactive actions menu open hover typing tab switching etc context i see folks in also discussing rendering issues though there the conversation went more towards codemirror issues so i m not sure if the math aspect is the main problem or lots of codemirror instances or some other combination of factors ellisonbg mentioned we might want to test the that i see jasongrout has made recent commits to the notebooks the above is part of all part of a class i m co teaching with pbstark have a ton of math so it s possible this alternate renderer is not enough for us but it would be useful to know if it makes a significant difference in rendering speed and or responsiveness of the ui if anyone has a chance to test it out that would be great i ll report more if i do operating system and version macos big sur browser and version chrome current jupyterlab version
| 0
|
535,288
| 15,685,936,007
|
IssuesEvent
|
2021-03-25 11:52:47
|
abpframework/abp
|
https://api.github.com/repos/abpframework/abp
|
closed
|
[Angular-UI] create a page component to wrap content
|
effort-8 feature priority:high ui-angular
|
Will be available from `@abp/ng.components` package as a secondary entry point.
It'll be imported from `@abp/ng.components/page`.
|
1.0
|
[Angular-UI] create a page component to wrap content - Will be available from `@abp/ng.components` package as a secondary entry point.
It'll be imported from `@abp/ng.components/page`.
|
non_process
|
create a page component to wrap content will be available from abp ng components package as a secondary entry point it ll be imported from abp ng components page
| 0
|
2,262
| 5,093,712,130
|
IssuesEvent
|
2017-01-03 08:07:01
|
openvstorage/framework
|
https://api.github.com/repos/openvstorage/framework
|
closed
|
Update to ovs setup reorder the questions asked and breaks the interactive setup
|
process_moved
|
This changes the order of the interactive setup ... please move the question for the public IP after the cluster_name, as we can now ask to create a new cluster, get a question for the IP, and then the question for the cluster_name.
This breaks the interactive setup and requires an update of the documentation too ...
|
1.0
|
Update to ovs setup reorder the questions asked and breaks the interactive setup - This changes the order of the interactive setup ... please move the question for the public IP after the cluster_name, as we can now ask to create a new cluster, get a question for the IP, and then the question for the cluster_name.
This breaks the interactive setup and requires an update of the documentation too ...
|
process
|
update to ovs setup reorder the questions asked and breaks the interactive setup this changes the order of the interactive setup please move the question for the public ip after the cluster name as we can now ask to create a new cluster get a question for the ip and then the question for the cluster name this breaks the interactive setup and requires an update of the documentation too
| 1
|
119,545
| 12,034,224,983
|
IssuesEvent
|
2020-04-13 15:39:37
|
allofphysicsgraph/proofofconcept
|
https://api.github.com/repos/allofphysicsgraph/proofofconcept
|
opened
|
generate exploded graph that shows all relations and indices
|
documentation enhancement
|
Currently the graphs show only the inference rules and the expressions' LaTeX.
For the developer view, showing the full graph with local IDs, global IDs, inference rules, LaTeX, and step ID would be useful.
|
1.0
|
generate exploded graph that shows all relations and indices - Currently the graphs show only the inference rules and the expressions' LaTeX.
For the developer view, showing the full graph with local IDs, global IDs, inference rules, LaTeX, and step ID would be useful.
|
non_process
|
generate exploded graph that shows all relations and indices currently the graphs show only the inference rules and the expressions latex for the developer view showing the full graph with local ids global ids inference rules latex and step id would be useful
| 0
|
21,843
| 30,320,473,092
|
IssuesEvent
|
2023-07-10 18:49:42
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Serialization load fails when filter values comes "from another model or question"
|
Type:Bug Priority:P1 .Backend .Regression Operation/Serialization .Team/QueryProcessor :hammer_and_wrench:
|
### Describe the bug
If you have a dashboard filter that fetches its values from a GUI or SQL question, the "load" command will fail
### To Reproduce
1) create a question (e.g. table people)
2) add it to a dashboard
3) create 2 more questions, one gui (table people) and one sql (select * from people)
4) create a dashboard
5) add the question of step 1
6) add a filter that gets the value from a question or model (can be from either of the questions from step 3, both will fail)
### Expected behavior
load should work
### Logs
<details>
```
2023-06-14 23:28:41,775 ERROR serialization.upsert :: Error inserting :metabase.models.dashboard/Dashboard "abc - Duplicate3": :parameters must be a sequence of maps with :id and :type keys
clojure.lang.ExceptionInfo: :parameters must be a sequence of maps with :id and :type keys {:parameters [#ordered/map ([:name "Text 3"] [:slug "text_3"] [:id "fc8bec5d"] [:type "string/="] [:sectionId "string"] [:values_source_type "card"] [:values_source_config #ordered/map ([:card_id "/collections/root/cards/people_sql"] [:value_field ["field" "source" #ordered/map ([:base-type "type/Text"])]])])], :toucan2/context-trace [[toucan2.tools.before-insert/before-insert {:toucan2.tools.before-insert/model :metabase.models.dashboard/Dashboard, :toucan2.tools.before-insert/row #ordered/map ([:points_of_interest nil] [:enable_embedding false] [:show_in_getting_started false] [:position nil] [:name "abc - Duplicate3"] [:archived false] [:collection_position nil] [:embedding_params nil] [:cache_ttl nil] [:public_uuid nil] [:caveats nil] [:parameters [#ordered/map ([:name "Text 3"] [:slug "text_3"] [:id "fc8bec5d"] [:type "string/="] [:sectionId "string"] [:values_source_type "card"] [:values_source_config #ordered/map ([:card_id "/collections/root/cards/people_sql"] [:value_field ["field" "source" #ordered/map ([:base-type "type/Text"])]])])]] [:description nil] [:collection_id nil] [:creator_id 1])}] ["resolve connection" {:toucan2.connection/connectable metabase.db.connection.ApplicationDB}] ["resolve connection" {:toucan2.connection/connectable :default}] ["resolve connection" {:toucan2.connection/connectable nil}] ["with resolved query" {:toucan2.pipeline/resolved-query {}}] ["with parsed args" {:toucan2.pipeline/query-type :toucan.query-type/insert.instances, :toucan2.pipeline/parsed-args {:rows [#ordered/map ([:points_of_interest nil] [:enable_embedding false] [:show_in_getting_started false] [:position nil] [:name "abc - Duplicate3"] [:archived false] [:collection_position nil] [:embedding_params nil] [:cache_ttl nil] [:public_uuid nil] [:caveats nil] [:parameters [#ordered/map ([:name "Text 3"] [:slug "text_3"] [:id "fc8bec5d"] [:type "string/="] [:sectionId 
"string"] [:values_source_type "card"] [:values_source_config #ordered/map ([:card_id "/collections/root/cards/people_sql"] [:value_field ["field" "source" #ordered/map ([:base-type "type/Text"])]])])]] [:description nil] [:collection_id nil] [:creator_id 1])]}}] ["with model" {:toucan2.pipeline/model :metabase.models.dashboard/Dashboard}] ["with unparsed args" {:toucan2.pipeline/query-type :toucan.query-type/insert.instances, :toucan2.pipeline/unparsed-args (:metabase.models.dashboard/Dashboard #ordered/map ([:points_of_interest nil] [:enable_embedding false] [:show_in_getting_started false] [:position nil] [:name "abc - Duplicate3"] [:archived false] [:collection_position nil] [:embedding_params nil] [:cache_ttl nil] [:public_uuid nil] [:caveats nil] [:parameters [#ordered/map ([:name "Text 3"] [:slug "text_3"] [:id "fc8bec5d"] [:type "string/="] [:sectionId "string"] [:values_source_type "card"] [:values_source_config #ordered/map ([:card_id "/collections/root/cards/people_sql"] [:value_field ["field" "source" #ordered/map ([:base-type "type/Text"])]])])]] [:description nil] [:collection_id nil] [:creator_id 1]))}]]}
at metabase.models.params$assert_valid_parameters.invokeStatic(params.clj:39)
at metabase.models.params$assert_valid_parameters.invoke(params.clj:35)
at metabase.models.dashboard$pre_insert.invokeStatic(dashboard.clj:90)
at metabase.models.dashboard$pre_insert.invoke(dashboard.clj:86)
at toucan.models$define_method_with_IModel_method_primary_method_pre_insert$before_insert_primary_method_model__41677.invoke(models.clj:405)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19601.invoke(threaded.clj:70)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.invoke(core.clj:2587)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.invoke(threaded.clj:43)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:55)
at methodical.impl.standard$invoke_multifn.invoke(standard.clj:47)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:193)
at toucan2.tools.before_insert$do_before_insert_to_rows$fn__24766.invoke(before_insert.clj:19)
at clojure.core$mapv$fn__8535.invoke(core.clj:6979)
at clojure.lang.PersistentVector.reduce(PersistentVector.java:343)
at clojure.core$reduce.invokeStatic(core.clj:6885)
at clojure.core$mapv.invokeStatic(core.clj:6970)
at clojure.core$mapv.invoke(core.clj:6970)
at toucan2.tools.before_insert$do_before_insert_to_rows.invokeStatic(before_insert.clj:15)
at toucan2.tools.before_insert$do_before_insert_to_rows.invoke(before_insert.clj:14)
at clojure.core$update.invokeStatic(core.clj:6233)
at clojure.core$update.invoke(core.clj:6223)
at toucan2.tools.before_insert$build_primary_method_toucan_query_type_insert__STAR__toucan2_tools_before_insert_before_insert_toucan_map_backend_honeysql2.invokeStatic(before_insert.clj:42)
at toucan2.tools.before_insert$build_primary_method_toucan_query_type_insert__STAR__toucan2_tools_before_insert_before_insert_toucan_map_backend_honeysql2.invoke(before_insert.clj:34)
at clojure.lang.AFn.applyToHelper(AFn.java:171)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$partial$fn__5908.doInvoke(core.clj:2639)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:457)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19605.invoke(threaded.clj:72)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.doInvoke(core.clj:2589)
at clojure.lang.RestFn.invoke(RestFn.java:467)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.invoke(threaded.clj:45)
at clojure.lang.AFn.applyToHelper(AFn.java:165)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:457)
at metabase.db.setup$build_around_method_default.invokeStatic(setup.clj:197)
at metabase.db.setup$build_around_method_default.invoke(setup.clj:190)
at clojure.lang.AFn.applyToHelper(AFn.java:171)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$partial$fn__5908.doInvoke(core.clj:2639)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:457)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:61)
at methodical.impl.standard$invoke_multifn.invoke(standard.clj:47)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:197)
at clojure.lang.Var.invoke(Var.java:399)
at toucan2.pipeline$transduce_query_primary_method_default.invokeStatic(pipeline.clj:311)
at toucan2.pipeline$transduce_query_primary_method_default.invoke(pipeline.clj:309)
at clojure.lang.AFn.applyToHelper(AFn.java:178)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$partial$fn__5908.doInvoke(core.clj:2639)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:482)
at toucan.db$transduce_query_primary_method_default_toucan1_model_default.invokeStatic(db.clj:104)
at toucan.db$transduce_query_primary_method_default_toucan1_model_default.invoke(db.clj:99)
at clojure.lang.AFn.applyToHelper(AFn.java:178)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$partial$fn__5908.doInvoke(core.clj:2639)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:482)
at toucan2.tools.after$transduce_query_primary_method_toucan2_tools_after_query_type_toucan2_tools_after_model_default.invokeStatic(after.clj:91)
at toucan2.tools.after$transduce_query_primary_method_toucan2_tools_after_query_type_toucan2_tools_after_model_default.invoke(after.clj:80)
at clojure.lang.AFn.applyToHelper(AFn.java:178)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$partial$fn__5908.doInvoke(core.clj:2639)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:667)
at clojure.core$apply.invoke(core.clj:662)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19607.invoke(threaded.clj:79)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.doInvoke(core.clj:2589)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$apply.doInvoke(core.clj:662)
at clojure.lang.RestFn.invoke(RestFn.java:533)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.doInvoke(threaded.clj:46)
at clojure.lang.RestFn.applyTo(RestFn.java:151)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:482)
at toucan2.tools.before_insert$transduce_query_around_method_toucan_query_type_insert__STAR__toucan2_tools_before_insert_before_insert_default$with_connection_STAR___24779$with_transaction_STAR___24780.invoke(before_insert.clj:32)
at toucan2.connection$bind_current_connectable_fn$fn__23886.invoke(connection.clj:104)
at toucan2.connection$do_with_transaction_primary_method_java_sql_Connection$fn__23948.invoke(connection.clj:277)
at next.jdbc.transaction$transact_STAR_.invokeStatic(transaction.clj:72)
at next.jdbc.transaction$transact_STAR_.invoke(transaction.clj:51)
at next.jdbc.transaction$fn__23374.invokeStatic(transaction.clj:122)
at next.jdbc.transaction$fn__23374.invoke(transaction.clj:115)
at next.jdbc.protocols$fn__22534$G__22529__22543.invoke(protocols.clj:58)
at next.jdbc$transact.invokeStatic(jdbc.clj:381)
at next.jdbc$transact.invoke(jdbc.clj:373)
at toucan2.connection$do_with_transaction_primary_method_java_sql_Connection.invokeStatic(connection.clj:276)
at toucan2.connection$do_with_transaction_primary_method_java_sql_Connection.invoke(connection.clj:270)
at clojure.lang.AFn.applyToHelper(AFn.java:165)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:457)
at clojure.core$partial$fn__5908.invoke(core.clj:2643)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19603.invoke(threaded.clj:71)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.invoke(core.clj:2588)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.invoke(threaded.clj:44)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at metabase.db.connection$do_with_transaction_around_method_java_sql_Connection.invokeStatic(connection.clj:145)
at metabase.db.connection$do_with_transaction_around_method_java_sql_Connection.invoke(connection.clj:141)
at clojure.lang.AFn.applyToHelper(AFn.java:165)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:457)
at clojure.core$partial$fn__5908.invoke(core.clj:2643)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at toucan2.connection$do_with_transaction_around_method_toucan2_connection_default.invokeStatic(connection.clj:268)
at toucan2.connection$do_with_transaction_around_method_toucan2_connection_default.invoke(connection.clj:264)
at clojure.lang.AFn.applyToHelper(AFn.java:165)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:457)
at clojure.core$partial$fn__5908.invoke(core.clj:2643)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:58)
at methodical.impl.standard$invoke_multifn.invoke(standard.clj:47)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:195)
at toucan2.tools.before_insert$transduce_query_around_method_toucan_query_type_insert__STAR__toucan2_tools_before_insert_before_insert_default$with_connection_STAR___24779.invoke(before_insert.clj:31)
at toucan2.connection$bind_current_connectable_fn$fn__23886.invoke(connection.clj:104)
at toucan2.connection$bind_current_connectable_fn$fn__23886.invoke(connection.clj:104)
at toucan2.connection$bind_current_connectable_fn$fn__23886.invoke(connection.clj:104)
at toucan2.connection$do_with_connection_primary_method_javax_sql_DataSource.invokeStatic(connection.clj:213)
at toucan2.connection$do_with_connection_primary_method_javax_sql_DataSource.invoke(connection.clj:210)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19601.invoke(threaded.clj:70)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.invoke(core.clj:2587)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.invoke(threaded.clj:43)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at toucan2.connection$do_with_connection_around_method_toucan2_connection_default.invokeStatic(connection.clj:118)
at toucan2.connection$do_with_connection_around_method_toucan2_connection_default.invoke(connection.clj:106)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:55)
at methodical.impl.standard$invoke_multifn.invoke(standard.clj:47)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:193)
at metabase.db.connection$do_with_connection_primary_method_default.invokeStatic(connection.clj:139)
at metabase.db.connection$do_with_connection_primary_method_default.invoke(connection.clj:137)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19601.invoke(threaded.clj:70)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.invoke(core.clj:2587)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.invoke(threaded.clj:43)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at toucan2.connection$do_with_connection_around_method_toucan2_connection_default.invokeStatic(connection.clj:118)
at toucan2.connection$do_with_connection_around_method_toucan2_connection_default.invoke(connection.clj:106)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:55)
at methodical.impl.standard$invoke_multifn.invoke(standard.clj:47)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:193)
at toucan2.connection$do_with_connection_primary_method_.invokeStatic(connection.clj:204)
at toucan2.connection$do_with_connection_primary_method_.invoke(connection.clj:194)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19601.invoke(threaded.clj:70)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.invoke(core.clj:2587)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.invoke(threaded.clj:43)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at toucan2.connection$do_with_connection_around_method_toucan2_connection_default.invokeStatic(connection.clj:118)
at toucan2.connection$do_with_connection_around_method_toucan2_connection_default.invoke(connection.clj:106)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:55)
at methodical.impl.standard$invoke_multifn.invoke(standard.clj:47)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:193)
at toucan2.tools.before_insert$transduce_query_around_method_toucan_query_type_insert__STAR__toucan2_tools_before_insert_before_insert_default.invokeStatic(before_insert.clj:31)
at toucan2.tools.before_insert$transduce_query_around_method_toucan_query_type_insert__STAR__toucan2_tools_before_insert_before_insert_default.invoke(before_insert.clj:26)
at clojure.lang.AFn.applyToHelper(AFn.java:178)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$partial$fn__5908.doInvoke(core.clj:2639)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$apply.doInvoke(core.clj:662)
at clojure.lang.RestFn.invoke(RestFn.java:533)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:65)
at methodical.impl.standard$invoke_multifn.doInvoke(standard.clj:47)
at clojure.lang.RestFn.invoke(RestFn.java:594)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:199)
at toucan2.pipeline$transduce_query_STAR_.invokeStatic(pipeline.clj:318)
at toucan2.pipeline$transduce_query_STAR_.invoke(pipeline.clj:314)
at toucan2.pipeline$transduce_with_model.invokeStatic(pipeline.clj:333)
at toucan2.pipeline$transduce_with_model.invoke(pipeline.clj:320)
at toucan2.pipeline$transduce_parsed.invokeStatic(pipeline.clj:349)
at toucan2.pipeline$transduce_parsed.invoke(pipeline.clj:335)
at toucan2.pipeline$transduce_unparsed.invokeStatic(pipeline.clj:357)
at toucan2.pipeline$transduce_unparsed.invoke(pipeline.clj:351)
at toucan2.pipeline$transduce_unparsed_with_default_rf.invokeStatic(pipeline.clj:414)
at toucan2.pipeline$transduce_unparsed_with_default_rf.invoke(pipeline.clj:408)
at toucan2.insert$insert_returning_instances_BANG_.invokeStatic(insert.clj:154)
at toucan2.insert$insert_returning_instances_BANG_.doInvoke(insert.clj:141)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at toucan.db$insert_BANG_.invokeStatic(db.clj:379)
at toucan.db$insert_BANG_.invoke(db.clj:376)
at metabase_enterprise.serialization.upsert$insert_many_individually_BANG_$iter__77408__77412$fn__77413$fn__77414$fn__77415.invoke(upsert.clj:85)
at metabase_enterprise.serialization.upsert$insert_many_individually_BANG_$iter__77408__77412$fn__77413$fn__77414.invoke(upsert.clj:83)
at metabase_enterprise.serialization.upsert$insert_many_individually_BANG_$iter__77408__77412$fn__77413.invoke(upsert.clj:80)
at clojure.lang.LazySeq.sval(LazySeq.java:42)
at clojure.lang.LazySeq.seq(LazySeq.java:51)
at clojure.lang.RT.seq(RT.java:535)
at clojure.core$seq__5467.invokeStatic(core.clj:139)
at clojure.core$map$fn__5935.invoke(core.clj:2763)
at clojure.lang.LazySeq.sval(LazySeq.java:42)
at clojure.lang.LazySeq.seq(LazySeq.java:51)
at clojure.lang.RT.seq(RT.java:535)
at clojure.core$seq__5467.invokeStatic(core.clj:139)
at clojure.core$map$fn__5939.invoke(core.clj:2774)
at clojure.lang.LazySeq.sval(LazySeq.java:42)
at clojure.lang.LazySeq.seq(LazySeq.java:58)
at clojure.lang.RT.seq(RT.java:535)
at clojure.core$seq__5467.invokeStatic(core.clj:139)
at clojure.core$concat$cat__5560$fn__5561.invoke(core.clj:736)
at clojure.lang.LazySeq.sval(LazySeq.java:42)
at clojure.lang.LazySeq.seq(LazySeq.java:51)
at clojure.lang.ChunkedCons.chunkedNext(ChunkedCons.java:59)
at clojure.lang.ChunkedCons.next(ChunkedCons.java:43)
at clojure.lang.RT.length(RT.java:1782)
at clojure.lang.RT.seqToArray(RT.java:1723)
at clojure.lang.LazySeq.toArray(LazySeq.java:132)
at clojure.lang.RT.toArray(RT.java:1696)
at clojure.core$to_array.invokeStatic(core.clj:346)
at clojure.core$sort.invokeStatic(core.clj:3114)
at clojure.core$sort_by.invokeStatic(core.clj:3120)
at clojure.core$sort_by.invokeStatic(core.clj:3120)
at clojure.core$sort_by.invoke(core.clj:3120)
at metabase_enterprise.serialization.upsert$maybe_upsert_many_BANG_.invokeStatic(upsert.clj:155)
at metabase_enterprise.serialization.upsert$maybe_upsert_many_BANG_.invoke(upsert.clj:119)
at metabase_enterprise.serialization.load$load_dashboards.invokeStatic(load.clj:444)
at metabase_enterprise.serialization.load$load_dashboards.invoke(load.clj:440)
at metabase_enterprise.serialization.load$fn__104888.invokeStatic(load.clj:524)
at metabase_enterprise.serialization.load$fn__104888.invoke(load.clj:521)
at clojure.lang.MultiFn.invoke(MultiFn.java:234)
at metabase_enterprise.serialization.load$load_collections$iter__105092__105096$fn__105097$fn__105098.invoke(load.clj:757)
at metabase_enterprise.serialization.load$load_collections$iter__105092__105096$fn__105097.invoke(load.clj:745)
at clojure.lang.LazySeq.sval(LazySeq.java:42)
at clojure.lang.LazySeq.seq(LazySeq.java:51)
at clojure.lang.RT.seq(RT.java:535)
at clojure.core$seq__5467.invokeStatic(core.clj:139)
at clojure.core$apply.invokeStatic(core.clj:662)
at clojure.core$apply.invoke(core.clj:662)
at metabase_enterprise.serialization.load$load_collections.invokeStatic(load.clj:762)
at metabase_enterprise.serialization.load$load_collections.invoke(load.clj:734)
at metabase_enterprise.serialization.load$fn__105145.invokeStatic(load.clj:767)
at metabase_enterprise.serialization.load$fn__105145.invoke(load.clj:765)
at clojure.lang.MultiFn.invoke(MultiFn.java:234)
at metabase_enterprise.serialization.cmd$fn__105546$v1_load__105551$fn__105552.invoke(cmd.clj:61)
at metabase_enterprise.serialization.cmd$fn__105546$v1_load__105551.invoke(cmd.clj:47)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.Var.applyTo(Var.java:705)
at clojure.core$apply.invokeStatic(core.clj:667)
at clojure.core$apply.invoke(core.clj:662)
at metabase.cmd$call_enterprise.invokeStatic(cmd.clj:146)
at metabase.cmd$call_enterprise.doInvoke(cmd.clj:136)
at clojure.lang.RestFn.invoke(RestFn.java:439)
at metabase.cmd$load.invokeStatic(cmd.clj:157)
at metabase.cmd$load.doInvoke(cmd.clj:148)
at clojure.lang.RestFn.invoke(RestFn.java:410)
at clojure.lang.AFn.applyToHelper(AFn.java:154)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.core$apply.invokeStatic(core.clj:667)
at clojure.core$apply.invoke(core.clj:662)
at metabase.cmd$run_cmd$fn__107444.invoke(cmd.clj:264)
at metabase.cmd$run_cmd.invokeStatic(cmd.clj:264)
at metabase.cmd$run_cmd.invoke(cmd.clj:255)
at clojure.lang.Var.invoke(Var.java:388)
at metabase.core$run_cmd.invokeStatic(core.clj:166)
at metabase.core$run_cmd.invoke(core.clj:164)
at metabase.core$_main.invokeStatic(core.clj:188)
at metabase.core$_main.doInvoke(core.clj:183)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.lang.Var.applyTo(Var.java:705)
at clojure.core$apply.invokeStatic(core.clj:667)
at clojure.core$apply.invoke(core.clj:662)
at metabase.bootstrap$_main.invokeStatic(bootstrap.clj:31)
at metabase.bootstrap$_main.doInvoke(bootstrap.clj:28)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at metabase.bootstrap.main(Unknown Source)
```
</details>
### Information about your Metabase installation
```JSON
46.x
```
### Severity
P1
### Additional context
_No response_
|
1.0
|
Serialization load fails when filter values come "from another model or question" - ### Describe the bug
If you have a dashboard with a filter that fetches its values from a GUI or SQL question, the "load" command will fail
### To Reproduce
1) create a question (e.g. table people)
2) add it to a dashboard
3) create 2 more questions, one GUI (table people) and one SQL (select * from people)
4) create a dashboard
5) add the question from step 1
6) add a filter that gets its values from a question or model (can be from either of the questions from step 3; both will fail)
### Expected behavior
the load command should succeed
### Logs
<details>
```
2023-06-14 23:28:41,775 ERROR serialization.upsert :: Error inserting :metabase.models.dashboard/Dashboard "abc - Duplicate3": :parameters must be a sequence of maps with :id and :type keys
clojure.lang.ExceptionInfo: :parameters must be a sequence of maps with :id and :type keys {:parameters [#ordered/map ([:name "Text 3"] [:slug "text_3"] [:id "fc8bec5d"] [:type "string/="] [:sectionId "string"] [:values_source_type "card"] [:values_source_config #ordered/map ([:card_id "/collections/root/cards/people_sql"] [:value_field ["field" "source" #ordered/map ([:base-type "type/Text"])]])])], :toucan2/context-trace [[toucan2.tools.before-insert/before-insert {:toucan2.tools.before-insert/model :metabase.models.dashboard/Dashboard, :toucan2.tools.before-insert/row #ordered/map ([:points_of_interest nil] [:enable_embedding false] [:show_in_getting_started false] [:position nil] [:name "abc - Duplicate3"] [:archived false] [:collection_position nil] [:embedding_params nil] [:cache_ttl nil] [:public_uuid nil] [:caveats nil] [:parameters [#ordered/map ([:name "Text 3"] [:slug "text_3"] [:id "fc8bec5d"] [:type "string/="] [:sectionId "string"] [:values_source_type "card"] [:values_source_config #ordered/map ([:card_id "/collections/root/cards/people_sql"] [:value_field ["field" "source" #ordered/map ([:base-type "type/Text"])]])])]] [:description nil] [:collection_id nil] [:creator_id 1])}] ["resolve connection" {:toucan2.connection/connectable metabase.db.connection.ApplicationDB}] ["resolve connection" {:toucan2.connection/connectable :default}] ["resolve connection" {:toucan2.connection/connectable nil}] ["with resolved query" {:toucan2.pipeline/resolved-query {}}] ["with parsed args" {:toucan2.pipeline/query-type :toucan.query-type/insert.instances, :toucan2.pipeline/parsed-args {:rows [#ordered/map ([:points_of_interest nil] [:enable_embedding false] [:show_in_getting_started false] [:position nil] [:name "abc - Duplicate3"] [:archived false] [:collection_position nil] [:embedding_params nil] [:cache_ttl nil] [:public_uuid nil] [:caveats nil] [:parameters [#ordered/map ([:name "Text 3"] [:slug "text_3"] [:id "fc8bec5d"] [:type "string/="] [:sectionId 
"string"] [:values_source_type "card"] [:values_source_config #ordered/map ([:card_id "/collections/root/cards/people_sql"] [:value_field ["field" "source" #ordered/map ([:base-type "type/Text"])]])])]] [:description nil] [:collection_id nil] [:creator_id 1])]}}] ["with model" {:toucan2.pipeline/model :metabase.models.dashboard/Dashboard}] ["with unparsed args" {:toucan2.pipeline/query-type :toucan.query-type/insert.instances, :toucan2.pipeline/unparsed-args (:metabase.models.dashboard/Dashboard #ordered/map ([:points_of_interest nil] [:enable_embedding false] [:show_in_getting_started false] [:position nil] [:name "abc - Duplicate3"] [:archived false] [:collection_position nil] [:embedding_params nil] [:cache_ttl nil] [:public_uuid nil] [:caveats nil] [:parameters [#ordered/map ([:name "Text 3"] [:slug "text_3"] [:id "fc8bec5d"] [:type "string/="] [:sectionId "string"] [:values_source_type "card"] [:values_source_config #ordered/map ([:card_id "/collections/root/cards/people_sql"] [:value_field ["field" "source" #ordered/map ([:base-type "type/Text"])]])])]] [:description nil] [:collection_id nil] [:creator_id 1]))}]]}
at metabase.models.params$assert_valid_parameters.invokeStatic(params.clj:39)
at metabase.models.params$assert_valid_parameters.invoke(params.clj:35)
at metabase.models.dashboard$pre_insert.invokeStatic(dashboard.clj:90)
at metabase.models.dashboard$pre_insert.invoke(dashboard.clj:86)
at toucan.models$define_method_with_IModel_method_primary_method_pre_insert$before_insert_primary_method_model__41677.invoke(models.clj:405)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19601.invoke(threaded.clj:70)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.invoke(core.clj:2587)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.invoke(threaded.clj:43)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:55)
at methodical.impl.standard$invoke_multifn.invoke(standard.clj:47)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:193)
at toucan2.tools.before_insert$do_before_insert_to_rows$fn__24766.invoke(before_insert.clj:19)
at clojure.core$mapv$fn__8535.invoke(core.clj:6979)
at clojure.lang.PersistentVector.reduce(PersistentVector.java:343)
at clojure.core$reduce.invokeStatic(core.clj:6885)
at clojure.core$mapv.invokeStatic(core.clj:6970)
at clojure.core$mapv.invoke(core.clj:6970)
at toucan2.tools.before_insert$do_before_insert_to_rows.invokeStatic(before_insert.clj:15)
at toucan2.tools.before_insert$do_before_insert_to_rows.invoke(before_insert.clj:14)
at clojure.core$update.invokeStatic(core.clj:6233)
at clojure.core$update.invoke(core.clj:6223)
at toucan2.tools.before_insert$build_primary_method_toucan_query_type_insert__STAR__toucan2_tools_before_insert_before_insert_toucan_map_backend_honeysql2.invokeStatic(before_insert.clj:42)
at toucan2.tools.before_insert$build_primary_method_toucan_query_type_insert__STAR__toucan2_tools_before_insert_before_insert_toucan_map_backend_honeysql2.invoke(before_insert.clj:34)
at clojure.lang.AFn.applyToHelper(AFn.java:171)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$partial$fn__5908.doInvoke(core.clj:2639)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:457)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19605.invoke(threaded.clj:72)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.doInvoke(core.clj:2589)
at clojure.lang.RestFn.invoke(RestFn.java:467)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.invoke(threaded.clj:45)
at clojure.lang.AFn.applyToHelper(AFn.java:165)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:457)
at metabase.db.setup$build_around_method_default.invokeStatic(setup.clj:197)
at metabase.db.setup$build_around_method_default.invoke(setup.clj:190)
at clojure.lang.AFn.applyToHelper(AFn.java:171)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$partial$fn__5908.doInvoke(core.clj:2639)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:457)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:61)
at methodical.impl.standard$invoke_multifn.invoke(standard.clj:47)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:197)
at clojure.lang.Var.invoke(Var.java:399)
at toucan2.pipeline$transduce_query_primary_method_default.invokeStatic(pipeline.clj:311)
at toucan2.pipeline$transduce_query_primary_method_default.invoke(pipeline.clj:309)
at clojure.lang.AFn.applyToHelper(AFn.java:178)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$partial$fn__5908.doInvoke(core.clj:2639)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:482)
at toucan.db$transduce_query_primary_method_default_toucan1_model_default.invokeStatic(db.clj:104)
at toucan.db$transduce_query_primary_method_default_toucan1_model_default.invoke(db.clj:99)
at clojure.lang.AFn.applyToHelper(AFn.java:178)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$partial$fn__5908.doInvoke(core.clj:2639)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:482)
at toucan2.tools.after$transduce_query_primary_method_toucan2_tools_after_query_type_toucan2_tools_after_model_default.invokeStatic(after.clj:91)
at toucan2.tools.after$transduce_query_primary_method_toucan2_tools_after_query_type_toucan2_tools_after_model_default.invoke(after.clj:80)
at clojure.lang.AFn.applyToHelper(AFn.java:178)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$partial$fn__5908.doInvoke(core.clj:2639)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:667)
at clojure.core$apply.invoke(core.clj:662)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19607.invoke(threaded.clj:79)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.doInvoke(core.clj:2589)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$apply.doInvoke(core.clj:662)
at clojure.lang.RestFn.invoke(RestFn.java:533)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.doInvoke(threaded.clj:46)
at clojure.lang.RestFn.applyTo(RestFn.java:151)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:482)
at toucan2.tools.before_insert$transduce_query_around_method_toucan_query_type_insert__STAR__toucan2_tools_before_insert_before_insert_default$with_connection_STAR___24779$with_transaction_STAR___24780.invoke(before_insert.clj:32)
at toucan2.connection$bind_current_connectable_fn$fn__23886.invoke(connection.clj:104)
at toucan2.connection$do_with_transaction_primary_method_java_sql_Connection$fn__23948.invoke(connection.clj:277)
at next.jdbc.transaction$transact_STAR_.invokeStatic(transaction.clj:72)
at next.jdbc.transaction$transact_STAR_.invoke(transaction.clj:51)
at next.jdbc.transaction$fn__23374.invokeStatic(transaction.clj:122)
at next.jdbc.transaction$fn__23374.invoke(transaction.clj:115)
at next.jdbc.protocols$fn__22534$G__22529__22543.invoke(protocols.clj:58)
at next.jdbc$transact.invokeStatic(jdbc.clj:381)
at next.jdbc$transact.invoke(jdbc.clj:373)
at toucan2.connection$do_with_transaction_primary_method_java_sql_Connection.invokeStatic(connection.clj:276)
at toucan2.connection$do_with_transaction_primary_method_java_sql_Connection.invoke(connection.clj:270)
at clojure.lang.AFn.applyToHelper(AFn.java:165)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:457)
at clojure.core$partial$fn__5908.invoke(core.clj:2643)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19603.invoke(threaded.clj:71)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.invoke(core.clj:2588)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.invoke(threaded.clj:44)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at metabase.db.connection$do_with_transaction_around_method_java_sql_Connection.invokeStatic(connection.clj:145)
at metabase.db.connection$do_with_transaction_around_method_java_sql_Connection.invoke(connection.clj:141)
at clojure.lang.AFn.applyToHelper(AFn.java:165)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:457)
at clojure.core$partial$fn__5908.invoke(core.clj:2643)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at toucan2.connection$do_with_transaction_around_method_toucan2_connection_default.invokeStatic(connection.clj:268)
at toucan2.connection$do_with_transaction_around_method_toucan2_connection_default.invoke(connection.clj:264)
at clojure.lang.AFn.applyToHelper(AFn.java:165)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:457)
at clojure.core$partial$fn__5908.invoke(core.clj:2643)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:58)
at methodical.impl.standard$invoke_multifn.invoke(standard.clj:47)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:195)
at toucan2.tools.before_insert$transduce_query_around_method_toucan_query_type_insert__STAR__toucan2_tools_before_insert_before_insert_default$with_connection_STAR___24779.invoke(before_insert.clj:31)
at toucan2.connection$bind_current_connectable_fn$fn__23886.invoke(connection.clj:104)
at toucan2.connection$bind_current_connectable_fn$fn__23886.invoke(connection.clj:104)
at toucan2.connection$bind_current_connectable_fn$fn__23886.invoke(connection.clj:104)
at toucan2.connection$do_with_connection_primary_method_javax_sql_DataSource.invokeStatic(connection.clj:213)
at toucan2.connection$do_with_connection_primary_method_javax_sql_DataSource.invoke(connection.clj:210)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19601.invoke(threaded.clj:70)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.invoke(core.clj:2587)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.invoke(threaded.clj:43)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at toucan2.connection$do_with_connection_around_method_toucan2_connection_default.invokeStatic(connection.clj:118)
at toucan2.connection$do_with_connection_around_method_toucan2_connection_default.invoke(connection.clj:106)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:55)
at methodical.impl.standard$invoke_multifn.invoke(standard.clj:47)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:193)
at metabase.db.connection$do_with_connection_primary_method_default.invokeStatic(connection.clj:139)
at metabase.db.connection$do_with_connection_primary_method_default.invoke(connection.clj:137)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19601.invoke(threaded.clj:70)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.invoke(core.clj:2587)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.invoke(threaded.clj:43)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at toucan2.connection$do_with_connection_around_method_toucan2_connection_default.invokeStatic(connection.clj:118)
at toucan2.connection$do_with_connection_around_method_toucan2_connection_default.invoke(connection.clj:106)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:55)
at methodical.impl.standard$invoke_multifn.invoke(standard.clj:47)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:193)
at toucan2.connection$do_with_connection_primary_method_.invokeStatic(connection.clj:204)
at toucan2.connection$do_with_connection_primary_method_.invoke(connection.clj:194)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.combo.threaded$fn__19599$fn__19600$fn__19601.invoke(threaded.clj:70)
at methodical.impl.combo.threaded$reducer_fn$fn__19569$fn__19573.invoke(threaded.clj:23)
at clojure.lang.ArrayChunk.reduce(ArrayChunk.java:58)
at clojure.core.protocols$fn__8244.invokeStatic(protocols.clj:136)
at clojure.core.protocols$fn__8244.invoke(protocols.clj:124)
at clojure.core.protocols$fn__8204$G__8199__8213.invoke(protocols.clj:19)
at clojure.core.protocols$seq_reduce.invokeStatic(protocols.clj:31)
at clojure.core.protocols$fn__8236.invokeStatic(protocols.clj:75)
at clojure.core.protocols$fn__8236.invoke(protocols.clj:75)
at clojure.core.protocols$fn__8178$G__8173__8191.invoke(protocols.clj:13)
at clojure.core$reduce.invokeStatic(core.clj:6886)
at clojure.core$reduce.invoke(core.clj:6868)
at methodical.impl.combo.threaded$reducer_fn$fn__19569.invoke(threaded.clj:21)
at clojure.core$comp$fn__5876.invoke(core.clj:2587)
at methodical.impl.combo.threaded$combine_with_threader$fn__19579.invoke(threaded.clj:43)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at toucan2.connection$do_with_connection_around_method_toucan2_connection_default.invokeStatic(connection.clj:118)
at toucan2.connection$do_with_connection_around_method_toucan2_connection_default.invoke(connection.clj:106)
at clojure.lang.AFn.applyToHelper(AFn.java:160)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:436)
at clojure.core$partial$fn__5908.invoke(core.clj:2642)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:55)
at methodical.impl.standard$invoke_multifn.invoke(standard.clj:47)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:193)
at toucan2.tools.before_insert$transduce_query_around_method_toucan_query_type_insert__STAR__toucan2_tools_before_insert_before_insert_default.invokeStatic(before_insert.clj:31)
at toucan2.tools.before_insert$transduce_query_around_method_toucan_query_type_insert__STAR__toucan2_tools_before_insert_before_insert_default.invoke(before_insert.clj:26)
at clojure.lang.AFn.applyToHelper(AFn.java:178)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$partial$fn__5908.doInvoke(core.clj:2639)
at clojure.lang.RestFn.applyTo(RestFn.java:146)
at clojure.lang.AFunction$1.doInvoke(AFunction.java:31)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.core$apply.invokeStatic(core.clj:675)
at clojure.core$apply.doInvoke(core.clj:662)
at clojure.lang.RestFn.invoke(RestFn.java:533)
at methodical.impl.standard$invoke_multifn.invokeStatic(standard.clj:65)
at methodical.impl.standard$invoke_multifn.doInvoke(standard.clj:47)
at clojure.lang.RestFn.invoke(RestFn.java:594)
at methodical.impl.standard.StandardMultiFn.invoke(standard.clj:199)
at toucan2.pipeline$transduce_query_STAR_.invokeStatic(pipeline.clj:318)
at toucan2.pipeline$transduce_query_STAR_.invoke(pipeline.clj:314)
at toucan2.pipeline$transduce_with_model.invokeStatic(pipeline.clj:333)
at toucan2.pipeline$transduce_with_model.invoke(pipeline.clj:320)
at toucan2.pipeline$transduce_parsed.invokeStatic(pipeline.clj:349)
at toucan2.pipeline$transduce_parsed.invoke(pipeline.clj:335)
at toucan2.pipeline$transduce_unparsed.invokeStatic(pipeline.clj:357)
at toucan2.pipeline$transduce_unparsed.invoke(pipeline.clj:351)
at toucan2.pipeline$transduce_unparsed_with_default_rf.invokeStatic(pipeline.clj:414)
at toucan2.pipeline$transduce_unparsed_with_default_rf.invoke(pipeline.clj:408)
at toucan2.insert$insert_returning_instances_BANG_.invokeStatic(insert.clj:154)
at toucan2.insert$insert_returning_instances_BANG_.doInvoke(insert.clj:141)
at clojure.lang.RestFn.invoke(RestFn.java:421)
at toucan.db$insert_BANG_.invokeStatic(db.clj:379)
at toucan.db$insert_BANG_.invoke(db.clj:376)
at metabase_enterprise.serialization.upsert$insert_many_individually_BANG_$iter__77408__77412$fn__77413$fn__77414$fn__77415.invoke(upsert.clj:85)
at metabase_enterprise.serialization.upsert$insert_many_individually_BANG_$iter__77408__77412$fn__77413$fn__77414.invoke(upsert.clj:83)
at metabase_enterprise.serialization.upsert$insert_many_individually_BANG_$iter__77408__77412$fn__77413.invoke(upsert.clj:80)
at clojure.lang.LazySeq.sval(LazySeq.java:42)
at clojure.lang.LazySeq.seq(LazySeq.java:51)
at clojure.lang.RT.seq(RT.java:535)
at clojure.core$seq__5467.invokeStatic(core.clj:139)
at clojure.core$map$fn__5935.invoke(core.clj:2763)
at clojure.lang.LazySeq.sval(LazySeq.java:42)
at clojure.lang.LazySeq.seq(LazySeq.java:51)
at clojure.lang.RT.seq(RT.java:535)
at clojure.core$seq__5467.invokeStatic(core.clj:139)
at clojure.core$map$fn__5939.invoke(core.clj:2774)
at clojure.lang.LazySeq.sval(LazySeq.java:42)
at clojure.lang.LazySeq.seq(LazySeq.java:58)
at clojure.lang.RT.seq(RT.java:535)
at clojure.core$seq__5467.invokeStatic(core.clj:139)
at clojure.core$concat$cat__5560$fn__5561.invoke(core.clj:736)
at clojure.lang.LazySeq.sval(LazySeq.java:42)
at clojure.lang.LazySeq.seq(LazySeq.java:51)
at clojure.lang.ChunkedCons.chunkedNext(ChunkedCons.java:59)
at clojure.lang.ChunkedCons.next(ChunkedCons.java:43)
at clojure.lang.RT.length(RT.java:1782)
at clojure.lang.RT.seqToArray(RT.java:1723)
at clojure.lang.LazySeq.toArray(LazySeq.java:132)
at clojure.lang.RT.toArray(RT.java:1696)
at clojure.core$to_array.invokeStatic(core.clj:346)
at clojure.core$sort.invokeStatic(core.clj:3114)
at clojure.core$sort_by.invokeStatic(core.clj:3120)
at clojure.core$sort_by.invokeStatic(core.clj:3120)
at clojure.core$sort_by.invoke(core.clj:3120)
at metabase_enterprise.serialization.upsert$maybe_upsert_many_BANG_.invokeStatic(upsert.clj:155)
at metabase_enterprise.serialization.upsert$maybe_upsert_many_BANG_.invoke(upsert.clj:119)
at metabase_enterprise.serialization.load$load_dashboards.invokeStatic(load.clj:444)
at metabase_enterprise.serialization.load$load_dashboards.invoke(load.clj:440)
at metabase_enterprise.serialization.load$fn__104888.invokeStatic(load.clj:524)
at metabase_enterprise.serialization.load$fn__104888.invoke(load.clj:521)
at clojure.lang.MultiFn.invoke(MultiFn.java:234)
at metabase_enterprise.serialization.load$load_collections$iter__105092__105096$fn__105097$fn__105098.invoke(load.clj:757)
at metabase_enterprise.serialization.load$load_collections$iter__105092__105096$fn__105097.invoke(load.clj:745)
at clojure.lang.LazySeq.sval(LazySeq.java:42)
at clojure.lang.LazySeq.seq(LazySeq.java:51)
at clojure.lang.RT.seq(RT.java:535)
at clojure.core$seq__5467.invokeStatic(core.clj:139)
at clojure.core$apply.invokeStatic(core.clj:662)
at clojure.core$apply.invoke(core.clj:662)
at metabase_enterprise.serialization.load$load_collections.invokeStatic(load.clj:762)
at metabase_enterprise.serialization.load$load_collections.invoke(load.clj:734)
at metabase_enterprise.serialization.load$fn__105145.invokeStatic(load.clj:767)
at metabase_enterprise.serialization.load$fn__105145.invoke(load.clj:765)
at clojure.lang.MultiFn.invoke(MultiFn.java:234)
at metabase_enterprise.serialization.cmd$fn__105546$v1_load__105551$fn__105552.invoke(cmd.clj:61)
at metabase_enterprise.serialization.cmd$fn__105546$v1_load__105551.invoke(cmd.clj:47)
at clojure.lang.AFn.applyToHelper(AFn.java:156)
at clojure.lang.AFn.applyTo(AFn.java:144)
at clojure.lang.Var.applyTo(Var.java:705)
at clojure.core$apply.invokeStatic(core.clj:667)
at clojure.core$apply.invoke(core.clj:662)
at metabase.cmd$call_enterprise.invokeStatic(cmd.clj:146)
at metabase.cmd$call_enterprise.doInvoke(cmd.clj:136)
at clojure.lang.RestFn.invoke(RestFn.java:439)
at metabase.cmd$load.invokeStatic(cmd.clj:157)
at metabase.cmd$load.doInvoke(cmd.clj:148)
at clojure.lang.RestFn.invoke(RestFn.java:410)
at clojure.lang.AFn.applyToHelper(AFn.java:154)
at clojure.lang.RestFn.applyTo(RestFn.java:132)
at clojure.core$apply.invokeStatic(core.clj:667)
at clojure.core$apply.invoke(core.clj:662)
at metabase.cmd$run_cmd$fn__107444.invoke(cmd.clj:264)
at metabase.cmd$run_cmd.invokeStatic(cmd.clj:264)
at metabase.cmd$run_cmd.invoke(cmd.clj:255)
at clojure.lang.Var.invoke(Var.java:388)
at metabase.core$run_cmd.invokeStatic(core.clj:166)
at metabase.core$run_cmd.invoke(core.clj:164)
at metabase.core$_main.invokeStatic(core.clj:188)
at metabase.core$_main.doInvoke(core.clj:183)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at clojure.lang.Var.applyTo(Var.java:705)
at clojure.core$apply.invokeStatic(core.clj:667)
at clojure.core$apply.invoke(core.clj:662)
at metabase.bootstrap$_main.invokeStatic(bootstrap.clj:31)
at metabase.bootstrap$_main.doInvoke(bootstrap.clj:28)
at clojure.lang.RestFn.applyTo(RestFn.java:137)
at metabase.bootstrap.main(Unknown Source)
```
</details>
### Information about your Metabase installation
```JSON
46.x
```
### Severity
P1
### Additional context
_No response_
bootstrap main unknown source information about your metabase installation json x severity additional context no response
| 1
|
21,087
| 28,041,538,835
|
IssuesEvent
|
2023-03-28 18:55:33
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
[MLv2] Port `queries/utils/description.js` to MLv2
|
.Frontend .Backend .metabase-lib .Team/QueryProcessor :hammer_and_wrench:
|
Braden suggested https://github.com/metabase/metabase/blob/6265ac4fffe4889efa8175cb75fc78d10affaa1a/frontend/src/metabase-lib/queries/utils/description.js as a good file to port over to MLv2. I think porting something sooner rather than later would be a good way to make sure everything is actually working, even from JS land.
We will want to make sure the following issues are done before attempting this:
- #28883
- #28867
|
1.0
|
[MLv2] Port `queries/utils/description.js` to MLv2 - Braden suggested https://github.com/metabase/metabase/blob/6265ac4fffe4889efa8175cb75fc78d10affaa1a/frontend/src/metabase-lib/queries/utils/description.js as a good file to port over to MLv2. I think porting something sooner rather than later would be a good way to make sure everything is actually working, even from JS land.
We will want to make sure the following issues are done before attempting this:
- #28883
- #28867
|
process
|
port queries utils description js to braden suggested as a good file to port over to i think porting something sooner rather than later would be a good way to make sure everything is actually working even from js land we will want to make sure the following issues are done before attempting this
| 1
|
24,421
| 5,065,298,028
|
IssuesEvent
|
2016-12-23 11:24:59
|
loklak/loklak_server
|
https://api.github.com/repos/loklak/loklak_server
|
opened
|
Auto-Generate HTML site on each commit from md files in development branch
|
documentation
|
We have got a basic html site that uses md files in the gh-pages branch here http://loklak.github.io/loklak_server/
Please automatically render this page using the md files from the development branch on each commit. You might use travis to trigger this.
Ensure all links work in the rendered site.
|
1.0
|
Auto-Generate HTML site on each commit from md files in development branch - We have got a basic html site that uses md files in the gh-pages branch here http://loklak.github.io/loklak_server/
Please automatically render this page using the md files from the development branch on each commit. You might use travis to trigger this.
Ensure all links work in the rendered site.
|
non_process
|
auto generate html site on each commit from md files in development branch we have got a basic html site that uses md files in the gh pages branch here please automatically render this page using the md files from the development branch on each commit you might use travis to trigger this ensure all links work in the rendered site
| 0
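The loklak record above asks for the Markdown files in the development branch to be rendered to an HTML site on each commit, e.g. via Travis. A minimal sketch of the name-mapping step such a build would perform — the `render_targets` helper name, the `docs/` layout, and the pandoc/gh-pages steps mentioned in the comments are illustrative assumptions, not details taken from the issue:

```shell
# Sketch only: map each Markdown source to the .html file a site build
# would emit. A real CI job would additionally run a converter such as
# pandoc on each pair and push the generated files to the gh-pages branch.
render_targets() {
  for src in "$@"; do
    printf '%s\n' "${src%.md}.html"   # strip the .md suffix, append .html
  done
}
```

For example, `render_targets docs/index.md` prints `docs/index.html`; wiring this into Travis would mean running the conversion in the build script on each commit and deploying the output.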
|
63,004
| 8,651,115,444
|
IssuesEvent
|
2018-11-27 01:34:04
|
docpad/website
|
https://api.github.com/repos/docpad/website
|
closed
|
Start making "official" tutorials
|
affects documentation
|
@balupton
I am constantly looking for useful stuff and today I found [Kirby](http://getkirby.com/), a file based CMS. (No, not that Kirby. :wink:)
Anyway, one thing I noticed and that I liked is that the dev asks the users what they would like to see a tutorial about next, and whatever grabs his attention or is most requested he does a [tutorial on the blog](http://getkirby.com/).
Thought why don't we do something similar? In this case it would probably be me doing some of the tutorials, but I still think it's nice to have something like this.
|
1.0
|
Start making "official" tutorials - @balupton
I am constantly looking for useful stuff and today I found [Kirby](http://getkirby.com/), a file based CMS. (No, not that Kirby. :wink:)
Anyway, one thing I noticed and that I liked is that the dev asks the users what they would like to see a tutorial about next, and whatever grabs his attention or is most requested he does a [tutorial on the blog](http://getkirby.com/).
Thought why don't we do something similar? In this case it would probably be me doing some of the tutorials, but I still think it's nice to have something like this.
|
non_process
|
start making official tutorials balupton i am constantly looking for useful stuff and today i found a file based cms no not that kirby wink anyway one thing i noticed and that i liked is that the dev asks the users what they would like to see a tutorial about next and whatever grabs his attention or is most requested he does a thought why don t we do something similar in this case it would probably be me doing some of the tutorials but i still think it s nice to have something like this
| 0
|
21,089
| 28,043,513,131
|
IssuesEvent
|
2023-03-28 20:31:27
|
microsoft/typespec
|
https://api.github.com/repos/microsoft/typespec
|
closed
|
Docker image does not exist
|
bug :pushpin: WS: Process Tools & Automation
|
I'm trying to follow the steps described here: https://github.com/microsoft/typespec/tree/main/docker but the image cannot be found in the registry described:
```
starttypespec docker run -v "${pwd}:/wd" --workdir="/wd" -t azsdkengsys.azurecr.io/tsp install
Unable to find image 'azsdkengsys.azurecr.io/tsp:latest' locally
docker: Error response from daemon: manifest for azsdkengsys.azurecr.io/tsp:latest not found: manifest unknown: manifest tagged by "latest" is not found.
See 'docker run --help'.
➜ starttypespec docker run -v "${pwd}:/wd" --workdir="/wd" -t azsdkengsys.azurecr.io/typespec install
Unable to find image 'azsdkengsys.azurecr.io/typespec:latest' locally
docker: Error response from daemon: manifest for azsdkengsys.azurecr.io/typespec:latest not found: manifest unknown: manifest tagged by "latest" is not found.
See 'docker run --help'.
➜ starttypespec docker run -v "${pwd}:/wd" --workdir="/wd" -t azsdkengsys.azurecr.io/typespec:alpine install
Unable to find image 'azsdkengsys.azurecr.io/typespec:alpine' locally
docker: Error response from daemon: manifest for azsdkengsys.azurecr.io/typespec:alpine not found: manifest unknown: manifest tagged by "alpine" is not found.
See 'docker run --help'.
➜ starttypespec docker run -v "${pwd}:/wd" --workdir="/wd" -t azsdkengsys.azurecr.io/tsp:alpine install
Unable to find image 'azsdkengsys.azurecr.io/tsp:alpine' locally
docker: Error response from daemon: manifest for azsdkengsys.azurecr.io/tsp:alpine not found: manifest unknown: manifest tagged by "alpine" is not found.
See 'docker run --help'.
```
|
1.0
|
Docker image does not exist - I'm trying to follow the steps described here: https://github.com/microsoft/typespec/tree/main/docker but the image cannot be found in the registry described:
```
starttypespec docker run -v "${pwd}:/wd" --workdir="/wd" -t azsdkengsys.azurecr.io/tsp install
Unable to find image 'azsdkengsys.azurecr.io/tsp:latest' locally
docker: Error response from daemon: manifest for azsdkengsys.azurecr.io/tsp:latest not found: manifest unknown: manifest tagged by "latest" is not found.
See 'docker run --help'.
➜ starttypespec docker run -v "${pwd}:/wd" --workdir="/wd" -t azsdkengsys.azurecr.io/typespec install
Unable to find image 'azsdkengsys.azurecr.io/typespec:latest' locally
docker: Error response from daemon: manifest for azsdkengsys.azurecr.io/typespec:latest not found: manifest unknown: manifest tagged by "latest" is not found.
See 'docker run --help'.
➜ starttypespec docker run -v "${pwd}:/wd" --workdir="/wd" -t azsdkengsys.azurecr.io/typespec:alpine install
Unable to find image 'azsdkengsys.azurecr.io/typespec:alpine' locally
docker: Error response from daemon: manifest for azsdkengsys.azurecr.io/typespec:alpine not found: manifest unknown: manifest tagged by "alpine" is not found.
See 'docker run --help'.
➜ starttypespec docker run -v "${pwd}:/wd" --workdir="/wd" -t azsdkengsys.azurecr.io/tsp:alpine install
Unable to find image 'azsdkengsys.azurecr.io/tsp:alpine' locally
docker: Error response from daemon: manifest for azsdkengsys.azurecr.io/tsp:alpine not found: manifest unknown: manifest tagged by "alpine" is not found.
See 'docker run --help'.
```
|
process
|
docker image does not exist i m trying to follow the steps described here but the image cannot be found in the registry described starttypespec docker run v pwd wd workdir wd t azsdkengsys azurecr io tsp install unable to find image azsdkengsys azurecr io tsp latest locally docker error response from daemon manifest for azsdkengsys azurecr io tsp latest not found manifest unknown manifest tagged by latest is not found see docker run help ➜ starttypespec docker run v pwd wd workdir wd t azsdkengsys azurecr io typespec install unable to find image azsdkengsys azurecr io typespec latest locally docker error response from daemon manifest for azsdkengsys azurecr io typespec latest not found manifest unknown manifest tagged by latest is not found see docker run help ➜ starttypespec docker run v pwd wd workdir wd t azsdkengsys azurecr io typespec alpine install unable to find image azsdkengsys azurecr io typespec alpine locally docker error response from daemon manifest for azsdkengsys azurecr io typespec alpine not found manifest unknown manifest tagged by alpine is not found see docker run help ➜ starttypespec docker run v pwd wd workdir wd t azsdkengsys azurecr io tsp alpine install unable to find image azsdkengsys azurecr io tsp alpine locally docker error response from daemon manifest for azsdkengsys azurecr io tsp alpine not found manifest unknown manifest tagged by alpine is not found see docker run help
| 1
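The typespec record above fails with `manifest unknown` for every tag tried. A usual first debugging step is to list which tags the registry actually serves via the Docker Registry HTTP API v2 `/v2/<name>/tags/list` endpoint. The sketch below only builds that URL from a `<registry>/<repository>` reference — actually fetching it (with curl, plus an auth token for private registries such as ACR) is left out, and the helper name is ours, not part of any tool:

```shell
# Sketch: derive the Registry HTTP API v2 tags-list URL from an image
# reference such as azsdkengsys.azurecr.io/tsp. Fetching this URL shows
# which tags exist, which is how "manifest unknown" errors are usually
# narrowed down.
tags_list_url() {
  registry="${1%%/*}"    # everything before the first '/'
  repository="${1#*/}"   # everything after the first '/'
  printf 'https://%s/v2/%s/tags/list\n' "$registry" "$repository"
}
```

e.g. `tags_list_url azsdkengsys.azurecr.io/tsp` prints `https://azsdkengsys.azurecr.io/v2/tsp/tags/list`.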
|
3,345
| 9,529,027,144
|
IssuesEvent
|
2019-04-29 10:03:45
|
ietf-tapswg/api-drafts
|
https://api.github.com/repos/ietf-tapswg/api-drafts
|
opened
|
Detailed author review of Arch text for -03 to prepare -04
|
Architecture ready for text
|
TAPS Architecture - Additional comments on rev -03 (detailed read-through).
Note: This adds to my previous set of review issues - these could be handled at the same time.
Section 1 :
- Can we make reading easier:
“This flexibility does not only enable
faster deployment of new feature and protocols, but it can also
support applications with racing and fallback mechanisms which
otherwise need to be implemented in each application separately.”
- suggest:
“This flexibility enables
faster deployment of new features and protocols. It can also
support applications with racing and fallback mechanisms, which
otherwise need to be implemented in each application separately.”
—
“Although following the Transport
Services Architecture does of course not mean that “
- do we need /of course/ (this reads oddly).
—
“from one system to the another”
- should be:
“from one system to another”
==========
Section 1.1 :
“The model of using sockets for networking can be represented as
follows”
- could we insert “traditional” or “existing” or something like that before “model”, since a new reader may not realise this is the description of what exists rather what this document develops?
—
“, which provides the interface to the implementations of
UDP and TCP (typically implemented in the system's kernel), which in
turn send data over the available network layer interfaces.”
- Two “, which” in a long sentence, could we write:
- “. This API provides the interface to the implementations of
UDP and TCP (typically implemented in the system's kernel), which in
turn send data over the available network layer interfaces.”
—
“The Implementation”
- Is capitalised “I” needed?
—
“for mapping the API into the various available transport”
- Is this “into” or “to” - I think the interface maps to these?
—
“There are a few key departures that Transport Services makes from the
sockets API:”
- a “few key” seems odd to me in a spec.
“There are key differences between the architecture of the Transport Services system and the architecture of the sockets API:” - or something like that?
==========
Section 1.2 :
“and changes to available network links”:
- insert “the” and “to” so it does not mean that it makes the change:
- “and changes in the available network links”
—
Section 1.3 :
“HTTP/1.1 uses character delimiters to segment messages
over a stream; TLS record headers carry a version, content type, and
length; and HTTP/2 uses”
- I think this text needs several references to the specs?
==========
Section 3 :
“ The following considerations were used in the design of this
architecture.”
- If these were only a list of considerations, then we do not need “RFC2119” language after all in this section. If they are actually the basis of the architecture, then we need to change the word “considerations” to “requirements” or something. I actually think it is helpful for the basis of the design to be expressed in RFC2119 language.
- I suggest the best option could be simply to omit the sentence?
—
“that include some transport security
protocol are eligible to be used.”
- The word “some” is wrong, and also “protocol”.
- I think this should be “a transport security function”?
—
Section 3.3 :
“The abstract API definition
[I-D.ietf-taps-interface] describes this interface and is aimed at
application developers.”
- “aimed at” seems odd because of “and” and “aimed”.
“The abstract API definition
[I-D.ietf-taps-interface] describes this interface. This is expected to be utilised by
application developers.”
—
“It is expected that all
implementations of Transport Services will offer the entire mandatory
API, but that some features will not be functional in certain
implementations.”
- This seems to me to still be possible to read the “mandatory” part as optional, which can’t be. Is it also OK to say this instead:
“It is expected that all
implementations of Transport Services system will offer the entire mandatory
API. However, some features provided below the API may not be functional in certain
implementations.”
—
“All implementations are REQUIRED to offer
sufficient APIs to use the distilled minimal set of features offered
by transport protocols ...“
- What does the plural “APIs” mean?
- I suggest we write:
“All implementations are REQUIRED to offer
an API that is sufficient to provide the distilled minimal set of features offered
by transport protocols …”
—
“, but it is possible that some very
constrained devices might not have, for example, a full TCP
implementation beneath the API.”
- I’m not fond of RFC2119 constructs that include an exception clause. Could we write this instead?
“For example, it is possible that some very
constrained devices might not have a full TCP
implementation beneath the API [I.D.ietf-lwig-tcp-constrained-node-networks-07].”
- This ref is I believe in cross-WG WGLC and should now be stable.
—
“It is expected that this document will be
updated and supplemented as new protocols and protocol features are
developed.”
- could we replace “this document” by the [REF], to avoid the possibility of misreading/misquoting it as the architecture document itself?
—
“defines new protocols that require any changes
to a remote system.”
- to me the first part here is that there is no implementation work required at the RECEIVER. I think that should be clear.
—
“ The Transport Services system MUST be deployable
on one side only.”
- I think the second point is that it can be deployed in this way which places design constraints on the system, specifically that the SENDER is a one-sided system. Is it possible we could put this in a separate para instead, e.g.
“The Transport Services system MUST be deployable
on one side only. A Transport Services system implemented at the sending endpoint can communicate with a remote endpoint on any existing system that implements the transport protocol(s) selected by the TAPS System. Similarly a Transport Services system at a listening endpoint can communicate using a transport protocol at a sending endpoint implemented in an existing system.”
==========
Section 4.1 :
“establish
communication and send and receive data.”
- insert comma after “communication” or “then” after first “and”
—
“A Preconnection can be fully specified and
represent a single possible Connection”
- is this /represents/?
—
“The Remote Endpoint MUST be specified in the
Preconnection is used to Initiate connections.”
- seems broken English, Is this:
“The Remote Endpoint MUST be specified if a
Preconnection is used to Initiate connections.”
—
in 4.1.2:
“Changes made
to Connection Properties after establishment take effect on a
best-effort basis.”
- My understanding was that this does **not** change the protocol selection, but can change the way the protocol uses the path. If I was correct, then it would be good to add a sentence to say this.
—
In 4.1.3:
“prepare any required
local or remote state to be able to send and/or receive Messages.”
- what becomes able?
“prepare any required
local or remote state to enable the endpoint to send and/or receive Messages.”
—
In 4.1.4:
“If a received Message is incomplete or corrupted, it
might or might not be usable by certain applications.”
- I agree this is possible, but I’d like to assert this really must **not** be the default and it is important to highlight that. Is it possible to say: “If a received Message is incomplete or corrupted, the default action is to not pass the data to the application. A certain application can override this default and the data might or might not be usable by that application.”
—
“Message Properties can be used to annotate specific Messages. “
- I agree, however I think it is really important to say that these annotations for sending exist only within the Local Endpoint and do not cause additional bytes to be communicated to the Remote Endpoint. For example, they could change the DSCP. [[We need to choose words carefully, because the DSCP is actually communicated across the path and could be sent end-to-end, but it is **not** additional data added to the message on the wire.]]
—
“When receiving Messages, Message
Properties can contain per-protocol properties.”
- Again, I think it is important to say the Remote Endpoint can add per-protocol properties to the messages it receives.
—
“properties specific to how the Message's content
is to be sent. “
- I think we should delete the apostrophe, or alternatively we could write “content of the message”.
—
“Status of the Send operation can be delivered back
to the application in an event (Section 4.1.5).”
- could be :
“The Status of the Send operation can be delivered back
to the sending application in an event (Section 4.1.5).“
—
In 4.1.5:
“This list of events that can be delivered to an application is not
exhaustive, but gives the top-level categories of events.”
Could be:
“This section provides the top-level categories of events that can be delivered to an application. This list is not
exhaustive.”
—
In 4.2:
“The Transport System Implementation Concepts”
- We do not need to define this as a capitalised term.
—
“are allowed be multiplexed”
- insert /to/ before /be/.
—
“Applications can use can use their explicitly defined”
- please rephrase:
“An application can explicitly define “
—
“including
any state it has necessary”
- please remove /it has/
—
“and a Transport Services system's policies and
heuristics.”
- could we instead write:
“and the heuristics or policies of the Transport Services system.”
—
In 4.2.3:
“Transitioning between different Protocol Stacks is in some
cases controlled by properties that only change when application code
is updated. “ - is this statement about the present case, or the case with TAPS, this is not clear, but I think the former?
—
“This functionality
can be a powerful driver of new protocol adoption, but needs to”
- I agree - is it protocol or stack functionality, I think this could be better as:
“This functionality in the API
can be a powerful driver of new protocol adoption, but needs to”
—
“Both stacks MUST offer the same transport services”
- this could use a reference to mindset perhaps?
—
1.
Could be better with a reference. We could use [RFC8303] and [RFC8304] to contrast the two?
—
2.
“However, if the
application does not require reliability, then a Protocol Stack
that adds unnecessary reliability might be allowed as an
equivalent Protocol Stack as long as it does not conflict with
any other application-requested properties.”
- allowing or not allowing seems odd, could we write:
3.
“However, if the
application does not require reliability, then a Protocol Stack
that adds reliability could be regarded as an
equivalent Protocol Stack providing this would not conflict with
other application-requested properties.”
—
In 4.2.4:
“properties of the Implementation”
- implementation should not be capitalised.
—
“Transport System Implementation”
- implementation should not be capitalised.
—
“The interface to specify these Groups”
-groups should not be capitalised.
—
Normative references:
[I-D.ietf-taps-interface]
[RFC8174]
[RFC2119]
|
1.0
|
Detailed author review of Arch text for -03 to prepare -04 - TAPS Architecture - Additional comments on rev -03 (detailed read-through).
Note: This adds to my previous set of review issues - these could be handled at the same time.
Section 1 :
- Can we make reading easier:
“This flexibility does not only enable
faster deployment of new feature and protocols, but it can also
support applications with racing and fallback mechanisms which
otherwise need to be implemented in each application separately.”
- suggest:
“This flexibility enables
faster deployment of new features and protocols. It can also
support applications with racing and fallback mechanisms, which
otherwise need to be implemented in each application separately.”
—
“Although following the Transport
Services Architecture does of course not mean that “
- do we need /of course/ (this reads oddly).
—
“from one system to the another”
- should be:
“from one system to another”
==========
Section 1.1 :
“The model of using sockets for networking can be represented as
follows”
- could we insert “traditional” or “existing” or something like that before “model”, since a new reader may not realise this is the description of what exists rather what this document develops?
—
“, which provides the interface to the implementations of
UDP and TCP (typically implemented in the system's kernel), which in
turn send data over the available network layer interfaces.”
- Two “, which” in a long sentence, could we write:
- “. This API provides the interface to the implementations of
UDP and TCP (typically implemented in the system's kernel), which in
turn send data over the available network layer interfaces.”
—
“The Implementation”
- Is capitalised “I” needed?
—
“for mapping the API into the various available transport”
- Is this “into” or “to” - I think the interface maps to these?
—
“There are a few key departures that Transport Services makes from the
sockets API:”
- a “few key” seems odd to me in a spec.
“There are key differences between the architecture of the Transport Services system and the architecture of the sockets API:” - or something like that?
==========
Section 1.2 :
“and changes to available network links”:
- insert “the” and “to” so it does not mean that it makes the change:
- “and changes in the available network links”
—
Section 1.3 :
“HTTP/1.1 uses character delimiters to segment messages
over a stream; TLS record headers carry a version, content type, and
length; and HTTP/2 uses”
- I think this text needs several references to the specs?
==========
Section 3 :
“ The following considerations were used in the design of this
architecture.”
- If these were only a list of considerations, then we do not need “RFC2119” language after all in this section. If they are actually the basis of the architecture, then we need to change the word “considerations” to “requirements” or something. I actually think it is helpful for the basis of the design to be expressed in RFC2119 language.
- I suggest the best option could be simply to omit the sentence?
—
“that include some transport security
protocol are eligible to be used.”
- The word “some” is wrong, and also “protocol”.
- I think this should be “a transport security function”?
—
Section 3.3 :
“The abstract API definition
[I-D.ietf-taps-interface] describes this interface and is aimed at
application developers.”
- “aimed at” seems odd because of “and” and “aimed”.
“The abstract API definition
[I-D.ietf-taps-interface] describes this interface. This is expected to be utilised by
application developers.”
—
“It is expected that all
implementations of Transport Services will offer the entire mandatory
API, but that some features will not be functional in certain
implementations.”
- This seems to me to still be possible to read the “mandatory” part as optional, which can’t be. Is it also OK to say this instead:
“It is expected that all
implementations of Transport Services system will offer the entire mandatory
API. However, some features provided below the API may not be functional in certain
implementations.”
—
“All implementations are REQUIRED to offer
sufficient APIs to use the distilled minimal set of features offered
by transport protocols ...“
- What does the plural “APIs” mean?
- I suggest we write:
“All implementations are REQUIRED to offer
an API that is sufficient to provide the distilled minimal set of features offered
by transport protocols …”
—
“, but it is possible that some very
constrained devices might not have, for example, a full TCP
implementation beneath the API.”
- I’m not fond of RFC2119 constructs that include an exception clause. Could we write this instead?
“For example, it is possible that some very
constrained devices might not have a full TCP
implementation beneath the API [I.D.ietf-lwig-tcp-constrained-node-networks-07].”
- This ref is I believe in cross-WG WGLC and should now be stable.
—
“It is expected that this document will be
updated and supplemented as new protocols and protocol features are
developed.”
- could we replace “this document” by the [REF], to avoid the possibility of misreading/misquoting it as the architecture document itself?
—
“defines new protocols that require any changes
to a remote system.”
- to me the first part here is that there is no implementation work required at the RECEIVER. I think that should be clear.
—
“ The Transport Services system MUST be deployable
on one side only.”
- I think the second point is that it can be deployed in this way which places design constraints on the system, specifically that the SENDER is a one-sided system. Is it possible we could put this in a separate para instead, e.g.
“The Transport Services system MUST be deployable
on one side only. A Transport Services system implemented at the sending endpoint can communicate with a remote endpoint on any existing system that implements the transport protocol(s) selected by the TAPS System. Similarly a Transport Services system at a listening endpoint can communicate using a transport protocol at a sending endpoint implemented in an existing system.”
==========
Section 4.1 :
“establish
communication and send and receive data.”
- insert comma after “communication” or “then” after first “and”
—
“A Preconnection can be fully specified and
represent a single possible Connection”
- is this /represents/?
—
“The Remote Endpoint MUST be specified in the
Preconnection is used to Initiate connections.”
- seems broken English, Is this:
“The Remote Endpoint MUST be specified if a
Preconnection is used to Initiate connections.”
—
in 4.1.2:
“Changes made
to Connection Properties after establishment take effect on a
best-effort basis.”
- My understanding was that this does **not** change the protocol selection, but can change the way the protocol uses the path. If I was correct, then it would be good to add a sentence to say this.
—
In 4.1.3:
“prepare any required
local or remote state to be able to send and/or receive Messages.”
- what becomes able?
“prepare any required
local or remote state to enable the endpoint to send and/or receive Messages.”
—
In 4.1.4:
“If a received Message is incomplete or corrupted, it
might or might not be usable by certain applications.”
- I agree this is possible, but I’d like to assert this really must **not** be the default and it is important to highlight that. Is it possible to say: “If a received Message is incomplete or corrupted, the default action is to not pass the data to the application. A certain application can override this default and the data might or might not be usable by that application.”
—
“Message Properties can be used to annotate specific Messages. “
- I agree, however I think it is really important to say that these annotations for sending exist only within the Local Endpoint and do not cause additional bytes to be communicated to the Remote Endpoint. For example, they could change the DSCP. [[We need to choose words carefully, because the DSCP is actually communicated across the path and could be sent end-to-end, but it is **not** additional data added to the message on the wire.]]
—
“When receiving Messages, Message
Properties can contain per-protocol properties.”
- Again, I think it is important to say the Remote Endpoint can add per-protocol properties to the messages it receives.
—
“properties specific to how the Message's content
is to be sent. “
- I think we should delete the apostrophe, or alternatively we could write “content of the message”.
—
“Status of the Send operation can be delivered back
to the application in an event (Section 4.1.5).”
- could be :
“The Status of the Send operation can be delivered back
to the sending application in an event (Section 4.1.5).“
—
In 4.1.5:
“This list of events that can be delivered to an application is not
exhaustive, but gives the top-level categories of events.”
Could be:
“This section provides the top-level categories of events that can be delivered to an application. This list is not
exhaustive.”
—
In 4.2:
“The Transport System Implementation Concepts”
- We do not need to define this as a capitalised term.
—
“are allowed be multiplexed”
- insert /to/ before /be/.
—
“Applications can use can use their explicitly defined”
- please rephrase:
“An application can explicitly define “
—
“including
any state it has necessary”
- please remove /it has/
—
“and a Transport Services system's policies and
heuristics.”
- could we instead write:
“and the heuristics or policies of the Transport Services system.”
—
In 4.2.3:
“Transitioning between different Protocol Stacks is in some
cases controlled by properties that only change when application code
is updated. “ - is this statement about the present case, or the case with TAPS? This is not clear, but I think the former.
—
“This functionality
can be a powerful driver of new protocol adoption, but needs to”
- I agree - is it protocol or stack functionality, I think this could be better as:
“This functionality in the API
can be a powerful driver of new protocol adoption, but needs to”
—
“Both stacks MUST offer the same transport services”
- this could use a reference to mindset perhaps?
—
1.
Could better with a reference. We could use [RFC8303] and [RFC8304] to contrast the two?
—
2.
“However, if the
application does not require reliability, then a Protocol Stack
that adds unnecessary reliability might be allowed as an
equivalent Protocol Stack as long as it does not conflict with
any other application-requested properties.”
- allowing or not allowing seems odd, could we write:
3.
“However, if the
application does not require reliability, then a Protocol Stack
that adds reliability could be regarded as an
equivalent Protocol Stack providing this would not conflict with
other application-requested properties.”
—
In 4.2.4:
“properties of the Implementation”
- implementation should not be capitalised.
—
“Transport System Implementation”
- implementation should not be capitalised.
—
“The interface to specify these Groups”
-groups should not be capitalised.
—
Normative references:
[I-D.ietf-taps-interface]
[RFC8174]
[RFC2119]
|
non_process
|
detailed author review of arch text for to prepare taps architecture additional comments on rev detailed read through note this adds to my previous set of review issues these could be handled at the same time section can we make reading easier “this flexibility does not only enable faster deployment of new feature and protocols but it can also support applications with racing and fallback mechanisms which otherwise need to be implemented in each application separately ” suggest “this flexibility enables faster deployment of new feature and protocols it can also support applications with racing and fallback mechanisms which otherwise need to be implemented in each application separately ”” — “although following the transport services architecture does of course not mean that “ do we need of course this reads oddly — “from one system to the another” should be “from one system to another” section “the model of using sockets for networking can be represented as follows” could we insert “traditional” or “existing” or something like that before “model” since a new reader may not realise this is the description of what exists rather what this document develops — “ which provides the interface to the implementations of udp and tcp typically implemented in the system s kernel which in turn send data over the available network layer interfaces ” two “ which” in a long sentence could we write “ this api provides the interface to the implementations of udp and tcp typically implemented in the system s kernel which in turn send data over the available network layer interfaces ” — “the implementation” is capitalised “i” needed — “for mapping the api into the various available transport” is this “into” or “to” i think the interface maps to these — “there are a few key departures that transport services makes from the sockets api ” a “few key” seems odd to me in a spec “there are key differences between the architecture of the transport services system and the architecture of the 
sockets api ” or something like that section “and changes to available network links” insert “the” and “to” so it does not mean that it makes the change “and changes in the available network links” — section “http uses character delimiters to segment messages over a stream tls record headers carry a version content type and length and http uses” i think this needs text needs several references to the specs section “ the following considerations were used in the design of this architecture ” if these were only a list of considerations then we we do not need “ ” language after all in this section if they are actually the basis of the architecture then we need to change the words “considerations” to “requirements” or something i actually think it is helpful for the basis for the design should be expressed in language i suggest the best option could be simply to omit the sentence — “that include some transport security protocol are eligible to be used ” the word “some” is wrong and also “protocol” i think this should be “a transport security function” — section “the abstract api definition describes this interface and is aimed at application developers ” “aimed at” seems odd because of “and” and “aimed” “the abstract api definition describes this interface this is expected to be utilised by application developers ” — “it is expected that all implementations of transport services will offer the entire mandatory api but that some features will not be functional in certain implementations ” this seems to me to still be possible to read the “mandatory” part os optional which can’t be is it also ok to say this instead “it is expected that all implementations of transport services system will offer the entire mandatory api however some features provided below the api may not be functional in certain implementations ” — “all implementations are required to offer sufficient apis to use the distilled minimal set of features offered by transport protocols “ what does the plural 
“apis” mean i suggest we write “all implementations are required to offer an api that is sufficient to provide the distilled minimal set of features offered by transport protocols …” — “ but it is possible that some very constrained devices might not have for example a full tcp implementation beneath the api ” i’m not fond of constructs that include an exception clause could we write this instead “for example it is possible that some very constrained devices might not have for example a full tcp implementation beneath the api ” this ref is i believe in cross wg wglc and should now be stable — “it is expected that this document will be updated and supplemented as new protocols and protocol features are developed ” could we replace “this document” by the to avoid the possibility of misreading misquoting it as the architecture document itself — “defines new protocols that require any changes to a remote system ” to me the first part here is that there is no implementation work required at the receiver i think that should be clear — “ the transport services system must be deployable on one side only ” i think the second point is that it can be deployed in this way which places design constraints on the system specifically that the sender is a one sided system is it possible we could put this in a separate para instead e g “the transport services system must be deployable on one side only a transport services system implemented at the sending endpoint can communicate with a remote endpoint on any existing system that implements the transport protocol s selected by the taps system similarly a transport services system at a listening endpoint can communicate using a using a transport protocol at a sending endpoint implemented in an existing system ” section “establish communication and send and receive data ” insert comma after “communication” or “then” after first “and” — “a preconnection can be fully specified and represent a single possible connection” is this 
represents — “the remote endpoint must be specified in the preconnection is used to initiate connections ” seems broken english is this “the remote endpoint must be specified if a preconnection is used to initiate connections ” — in “changes made to connection properties after establishment take effect on a best effort basis ” my understanding was that this does not change the protocol selection but can change the way the protocol uses the path if i was correct then it would be good to add a sentence to say this — in “prepare any required local or remote state to be able to send and or receive messages ” what becomes able “prepare any required local or remote state to enable the endpoint to send and or receive messages ” — in “if a received message is incomplete or corrupted it might or might not be usable by certain applications ” i agree this is possible but i’d like to assert this really must not be the default and it is important to highlight that is it possible to say “if a received message is incomplete or corrupted the default action is to not pass the data to the application a certain application can override this default and the data might or might not be usable by that application ” — “message properties can be used to annotate specific messages “ i agree however i think it is really important to say that these annotations for sending exist only within the local endpoint and doe not cause additional bytes to be communicated to the remote endpoint for example they could change the dscp for instance — “when receiving messages message properties can contain per protocol properties ” again i think it is important to say the remote endpoint can add per protocol properties to the messages it receives — “properties specific to how the message s content is to be sent “ i think we should delete the apostrophe or alternatively we could write “content of the message” — “status of the send operation can be delivered back to the application in an event section ” could 
be “the status of the send operation can be delivered back to the sending application in an event section “ — in “this list of events that can be delivered to an application is not exhaustive but gives the top level categories of events ” could be “this section provides the top level categories of events events that can be delivered to an application this list is not exhaustive ” — in “the transport system implementation concepts” we do not need to define this as a capitalised term — “are allowed be multiplexed” insert to before be — “applications can use can use their explicitly defined” please rephrase “an application can explicitly define “ — “including any state it has necessary” please remove it has — “and a transport services system s policies and heuristics ” could we instead write “and the heuristics or policies of the transport services system ” — in “transitioning between different protocol stacks is in some cases controlled by properties that only change when application code is updated “ is this statement about the present case or the case with taps this is not clear but i think the former — “this functionality can be a powerful driver of new protocol adoption but needs to” i agree is it protocol or stack functionality i think this could be better as “this functionality in the api can be a powerful driver of new protocol adoption but needs to” — “both stacks must offer the same transport services” this could use a reference to mindset perhaps — could better with a reference we could use and to contrast the two — “however if the application does not require reliability then a protocol stack that adds unnecessary reliability might be allowed as an equivalent protocol stack as long as it does not conflict with any other application requested properties ” allowing or not allowing seems odd could we write “however if the application does not require reliability then a protocol stack that adds reliability could be regarded as an equivalent protocol stack 
providing this would not conflict with other application requested properties ” — in “properties of the implementation” implementation should not be capitalised — “transport system implementation” implementation should not be capitalised — “the interface to specify these groups” groups should not be capitalised — normative references
| 0
|
10,904
| 13,684,308,361
|
IssuesEvent
|
2020-09-30 04:30:12
|
GoogleCloudPlatform/cloud-ops-sandbox
|
https://api.github.com/repos/GoogleCloudPlatform/cloud-ops-sandbox
|
closed
|
Figure out better renovate bot experience/replacement
|
priority: p2 release blocking type: process
|
Improve Renovate bot on (1) the noise and (2) the breaking e2e tests:
- [x] 1. Triage each PR coming from the bot by applying the labels "type: cleanup" and "priority: p1"
- [x] 2. Autorun e2e tests on renovate bot PRs
- [x] 2.1 Update workflow bot s.t. repo bots CAN trigger workflows & it wont be cancelled
- [x] 3. Add "automerge" label so automerge bot will merge renovate bot PRs as soon as the CI that included e2e tests passes
- [ ] Add yoshi-approver to auto-approve renovate-bot PRs that pass e2e/other checks
- [x] 4. Configure selective scheduling to reduce Renovate bot noise
Not configurable in Renovatebot, needs external solution:
- x Auto-add the PR to our [github project (external to repo)](https://github.com/orgs/GoogleCloudPlatform/projects/10)
- x Auto tag prs with upcoming milestone (if available)
Maybe I add some automations using labels
|
1.0
|
Figure out better renovate bot experience/replacement - Improve Renovate bot on (1) the noise and (2) the breaking e2e tests:
- [x] 1. Triage each PR coming from the bot by applying the labels "type: cleanup" and "priority: p1"
- [x] 2. Autorun e2e tests on renovate bot PRs
- [x] 2.1 Update workflow bot s.t. repo bots CAN trigger workflows & it wont be cancelled
- [x] 3. Add "automerge" label so automerge bot will merge renovate bot PRs as soon as the CI that included e2e tests passes
- [ ] Add yoshi-approver to auto-approve renovate-bot PRs that pass e2e/other checks
- [x] 4. Configure selective scheduling to reduce Renovate bot noise
Not configurable in Renovatebot, needs external solution:
- x Auto-add the PR to our [github project (external to repo)](https://github.com/orgs/GoogleCloudPlatform/projects/10)
- x Auto tag prs with upcoming milestone (if available)
Maybe I add some automations using labels
|
process
|
figure out better renovate bot experience replacement improve renovate bot on the noise and the breaking tests triage each pr coming from the bot by applying the labels type cleanup and priority autorun tests on renovate bot prs update workflow bot s t repo bots can trigger workflows it wont be cancelled add automerge label so automerge bot will merge renovate bot prs as soon as the ci that included tests passes add yoshi approver to auto approve renovate bot prs that pass other checks configure selective scheduling to reduce renovate bot noise not configurable in renovatebot needs external solution x auto add the pr to our x auto tag prs with upcoming milestone if available maybe i add some automations using labels
| 1
|
946
| 3,410,703,526
|
IssuesEvent
|
2015-12-04 21:28:08
|
MaretEngineering/MROV
|
https://api.github.com/repos/MaretEngineering/MROV
|
closed
|
Duplicate code to create proper length of strings
|
enhancement Processing
|
From lines 211 to 243 in the Processing code there are two pieces of code that look almost exactly the same. This should be turned into a function and replaced with function calls.
```Java
String toSend = "!";
// makes all thrust value strings going out 3 characters long + the "/"
for (int counter = 0; counter < 4; counter++) {
thrustValues[counter] += 256;
if (thrustValues[counter] < 10) {
toSend+= "00" + str(thrustValues[counter]) + "/";
}
if (thrustValues[counter] >=10 && thrustValues[counter] < 100) {
toSend+= "0" + str(thrustValues[counter]) + "/";
}
if (thrustValues[counter] >= 100) {
toSend+= str(thrustValues[counter]) + "/";
}
}
//makes the diff value 3 characters long + "/"
diff += 256;
if (diff == 512) {
diff -= 1;
} else if (diff == 0) {
diff += 1;
}
if (diff < 10) {
toSend+= "00" + str(diff) + "/";
toSend+= "00" + str(diff) + "/";
}
if (diff >= 10 && diff < 100) {
toSend+= "0" + str(diff) + "/";
toSend+= "0" + str(diff) + "/";
}
if (diff >= 100) {
toSend+= str(diff) + "/";
toSend+= str(diff) + "/";
}
```
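A minimal sketch of the refactor the issue asks for: the three duplicated zero-padding `if` blocks collapse into one helper. The class and method names (`ThrustFormatter`, `pad3`) and the use of `String.format` are my own suggestions for illustration, not code from the repo; the Processing sketch could equally keep its `str()`-based style inside such a helper.

```java
// Hypothetical refactor sketch; names are invented, not from the MROV repo.
public class ThrustFormatter {

    // Zero-pads a value to 3 digits and appends the "/" delimiter,
    // replacing the three duplicated if-branches in the original code.
    static String pad3(int value) {
        return String.format("%03d/", value);
    }

    public static void main(String[] args) {
        int[] thrustValues = {5, 42, 300, 7};
        StringBuilder toSend = new StringBuilder("!");
        for (int i = 0; i < 4; i++) {
            // Same +256 offset as the original loop.
            toSend.append(pad3(thrustValues[i] + 256));
        }
        // Clamp mirrors the original diff bounds check (512 -> 511, 0 -> 1),
        // and diff is appended twice, as in the original.
        int diff = Math.min(511, Math.max(1, 100 + 256));
        toSend.append(pad3(diff)).append(pad3(diff));
        System.out.println(toSend); // !261/298/556/263/356/356/
    }
}
```

Since every value sent is in the range 1–511 after the offset and clamp, the 3-digit padding assumption always holds.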
|
1.0
|
Duplicate code to create proper length of strings - From lines 211 to 243 in the Processing code there are two pieces of code that look almost exactly the same. This should be turned into a function and replaced with function calls.
```Java
String toSend = "!";
// makes all thrust value strings going out 3 characters long + the "/"
for (int counter = 0; counter < 4; counter++) {
thrustValues[counter] += 256;
if (thrustValues[counter] < 10) {
toSend+= "00" + str(thrustValues[counter]) + "/";
}
if (thrustValues[counter] >=10 && thrustValues[counter] < 100) {
toSend+= "0" + str(thrustValues[counter]) + "/";
}
if (thrustValues[counter] >= 100) {
toSend+= str(thrustValues[counter]) + "/";
}
}
//makes the diff value 3 characters long + "/"
diff += 256;
if (diff == 512) {
diff -= 1;
} else if (diff == 0) {
diff += 1;
}
if (diff < 10) {
toSend+= "00" + str(diff) + "/";
toSend+= "00" + str(diff) + "/";
}
if (diff >= 10 && diff < 100) {
toSend+= "0" + str(diff) + "/";
toSend+= "0" + str(diff) + "/";
}
if (diff >= 100) {
toSend+= str(diff) + "/";
toSend+= str(diff) + "/";
}
```
|
process
|
duplicate code to create proper length of strings from line to in the processing there are two pieces of code that look almost exactly the same this should be turned into a function and replaced with function calls java string tosend makes all thrust value strings going out characters long the for int counter counter counter thrustvalues if thrustvalues tosend str thrustvalues if thrustvalues thrustvalues tosend str thrustvalues if thrustvalues tosend str thrustvalues makes the diff value characters long diff if diff diff else if diff diff if diff tosend str diff tosend str diff if diff diff tosend str diff tosend str diff if diff tosend str diff tosend str diff
| 1
|
5,355
| 8,182,355,425
|
IssuesEvent
|
2018-08-29 04:32:48
|
Microsoft/LightGBM
|
https://api.github.com/repos/Microsoft/LightGBM
|
closed
|
ValueError: negative dimensions are not allowed
|
in-process
|
## Environment info
Operating System: Debian
CPU: x86_64
C++/Python/R version: python 3.4
## Error message
File "/home/beda/venv/system3/lib/python3.4/site-packages/shap/explainers/tree.py", line 144, in shap_values
phi = self.model.predict(X, num_iteration=tree_limit, pred_contrib=True)
File "/home/beda/venv/system3/lib/python3.4/site-packages/lightgbm/basic.py", line 1802, in predict
return predictor.predict(data, num_iteration, raw_score, pred_leaf, pred_contrib, data_has_header, is_reshape)
File "/home/beda/venv/system3/lib/python3.4/site-packages/lightgbm/basic.py", line 434, in predict
predict_type)
File "/home/beda/venv/system3/lib/python3.4/site-packages/lightgbm/basic.py", line 518, in __pred_for_csr
preds = np.zeros(n_preds, dtype=np.float64)
ValueError: negative dimensions are not allowed
## Steps to reproduce
1. Call LGBMClassifier.predict with pred_contrib=True
2. As data pass CSR matrix with shape (50000, 48793) or bigger
Probably 32-bit integer overflow when computing number of predictions.
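The suspected overflow is easy to check with 32-bit arithmetic: with `pred_contrib=True` the prediction count is roughly rows × (num_features + 1) — my assumption about LightGBM's per-row output count, not verified against its source — and 50000 × 48794 already exceeds the 32-bit signed maximum, wrapping to a negative value:

```java
// Sketch of the suspected 32-bit overflow; per-row output count is assumed.
public class PredsOverflow {
    public static void main(String[] args) {
        int rows = 50000;
        int perRow = 48793 + 1;               // num_features + 1 outputs per row (assumed)
        int nPreds32 = rows * perRow;         // 32-bit multiply wraps around
        long nPreds64 = (long) rows * perRow; // widening before the multiply avoids it
        System.out.println(nPreds32); // -1855267296 -> "negative dimensions"
        System.out.println(nPreds64); // 2439700000
    }
}
```

Computing the count in 64-bit (as in the `nPreds64` line) would avoid the negative size passed to the array allocation.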
|
1.0
|
ValueError: negative dimensions are not allowed - ## Environment info
Operating System: Debian
CPU: x86_64
C++/Python/R version: python 3.4
## Error message
File "/home/beda/venv/system3/lib/python3.4/site-packages/shap/explainers/tree.py", line 144, in shap_values
phi = self.model.predict(X, num_iteration=tree_limit, pred_contrib=True)
File "/home/beda/venv/system3/lib/python3.4/site-packages/lightgbm/basic.py", line 1802, in predict
return predictor.predict(data, num_iteration, raw_score, pred_leaf, pred_contrib, data_has_header, is_reshape)
File "/home/beda/venv/system3/lib/python3.4/site-packages/lightgbm/basic.py", line 434, in predict
predict_type)
File "/home/beda/venv/system3/lib/python3.4/site-packages/lightgbm/basic.py", line 518, in __pred_for_csr
preds = np.zeros(n_preds, dtype=np.float64)
ValueError: negative dimensions are not allowed
## Steps to reproduce
1. Call LGBMClassifier.predict with pred_contrib=True
2. As data pass CSR matrix with shape (50000, 48793) or bigger
Probably 32-bit integer overflow when computing number of predictions.
|
process
|
valueerror negative dimensions are not allowed environment info operating system debian cpu c python r version python error message file home beda venv lib site packages shap explainers tree py line in shap values phi self model predict x num iteration tree limit pred contrib true file home beda venv lib site packages lightgbm basic py line in predict return predictor predict data num iteration raw score pred leaf pred contrib data has header is reshape file home beda venv lib site packages lightgbm basic py line in predict predict type file home beda venv lib site packages lightgbm basic py line in pred for csr preds np zeros n preds dtype np valueerror negative dimensions are not allowed steps to reproduce call lgbmclassifier predict with pred contrib true as data pass csr matrix with shape or bigger probably bit integer overflow when computing number of predictions
| 1
|
37,548
| 12,483,882,970
|
IssuesEvent
|
2020-05-30 11:53:00
|
MadeByEmil/Diademos
|
https://api.github.com/repos/MadeByEmil/Diademos
|
closed
|
WS-2020-0068 (High) detected in yargs-parser-13.1.2.tgz
|
security vulnerability
|
## WS-2020-0068 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>yargs-parser-13.1.2.tgz</b></p></summary>
<p>the mighty option parser used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/yargs-parser/-/yargs-parser-13.1.2.tgz">https://registry.npmjs.org/yargs-parser/-/yargs-parser-13.1.2.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/Diademos/JSLib/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/Diademos/JSLib/node_modules/yargs-parser/package.json</p>
<p>
Dependency Hierarchy:
- mercury-parser-2.2.0.tgz (Root Library)
- :x: **yargs-parser-13.1.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/MadeByEmil/Diademos/commit/f543d4c0692c1f8e5f50b5e861d65d888a636fd4">f543d4c0692c1f8e5f50b5e861d65d888a636fd4</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Affected versions of yargs-parser are vulnerable to prototype pollution. Arguments are not properly sanitized, allowing an attacker to modify the prototype of Object, causing the addition or modification of an existing property that will exist on all objects. Parsing the argument --foo.__proto__.bar baz' adds a bar property with value baz to all objects. This is only exploitable if attackers have control over the arguments being passed to yargs-parser.
<p>Publish Date: 2020-05-01
<p>URL: <a href=https://www.npmjs.com/advisories/1500>WS-2020-0068</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Adjacent
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/package/yargs-parser">https://www.npmjs.com/package/yargs-parser</a></p>
<p>Release Date: 2020-05-04</p>
<p>Fix Resolution: https://www.npmjs.com/package/yargs-parser/v/18.1.2,https://www.npmjs.com/package/yargs-parser/v/15.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2020-0068 (High) detected in yargs-parser-13.1.2.tgz - ## WS-2020-0068 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>yargs-parser-13.1.2.tgz</b></p></summary>
<p>the mighty option parser used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/yargs-parser/-/yargs-parser-13.1.2.tgz">https://registry.npmjs.org/yargs-parser/-/yargs-parser-13.1.2.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/Diademos/JSLib/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/Diademos/JSLib/node_modules/yargs-parser/package.json</p>
<p>
Dependency Hierarchy:
- mercury-parser-2.2.0.tgz (Root Library)
- :x: **yargs-parser-13.1.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/MadeByEmil/Diademos/commit/f543d4c0692c1f8e5f50b5e861d65d888a636fd4">f543d4c0692c1f8e5f50b5e861d65d888a636fd4</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Affected versions of yargs-parser are vulnerable to prototype pollution. Arguments are not properly sanitized, allowing an attacker to modify the prototype of Object, causing the addition or modification of an existing property that will exist on all objects. Parsing the argument --foo.__proto__.bar baz' adds a bar property with value baz to all objects. This is only exploitable if attackers have control over the arguments being passed to yargs-parser.
<p>Publish Date: 2020-05-01
<p>URL: <a href=https://www.npmjs.com/advisories/1500>WS-2020-0068</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Adjacent
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/package/yargs-parser">https://www.npmjs.com/package/yargs-parser</a></p>
<p>Release Date: 2020-05-04</p>
<p>Fix Resolution: https://www.npmjs.com/package/yargs-parser/v/18.1.2,https://www.npmjs.com/package/yargs-parser/v/15.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws high detected in yargs parser tgz ws high severity vulnerability vulnerable library yargs parser tgz the mighty option parser used by yargs library home page a href path to dependency file tmp ws scm diademos jslib package json path to vulnerable library tmp ws scm diademos jslib node modules yargs parser package json dependency hierarchy mercury parser tgz root library x yargs parser tgz vulnerable library found in head commit a href vulnerability details affected versions of yargs parser are vulnerable to prototype pollution arguments are not properly sanitized allowing an attacker to modify the prototype of object causing the addition or modification of an existing property that will exist on all objects parsing the argument foo proto bar baz adds a bar property with value baz to all objects this is only exploitable if attackers have control over the arguments being passed to yargs parser publish date url a href cvss score details base score metrics exploitability metrics attack vector adjacent attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
95,328
| 11,981,700,392
|
IssuesEvent
|
2020-04-07 11:38:12
|
Kathrin92/MY-DIY
|
https://api.github.com/repos/Kathrin92/MY-DIY
|
opened
|
Create a design prototype with Adobe XD to have a structure and a guideline for my project.
|
Design User-Story
|
#### User Story:
As a user, I want to plan my DIY project with an app which is intuitive and has an appealing design.
#### To Do:
* [ ] get inspired by several designs online and similar app's
* [ ] style all needed pages (page structure, colour scheme, font, etc.)
* [ ] get reviews for the design
* [ ] implement the suggested changes, if useful and convinced
#### To keep in mind:
* user experience!
* prefer clean but modern design
* eye-guided structure
|
1.0
|
Create a design prototype with Adobe XD to have a structure and a guideline for my project. - #### User Story:
As a user, I want to plan my DIY project with an app which is intuitive and has an appealing design.
#### To Do:
* [ ] get inspired by several designs online and similar app's
* [ ] style all needed pages (page structure, colour scheme, font, etc.)
* [ ] get reviews for the design
* [ ] implement the suggested changes, if useful and convinced
#### To keep in mind:
* user experience!
* prefer clean but modern design
* eye-guided structure
|
non_process
|
create a design prototype with adobe xd to have a structure and a guideline for my project user story as a user i want to plan my diy project with an app which is intuitive and has an appealing design to do get inspired by several designs online and similar app s style all needed pages page structure colour scheme font etc get reviews for the design implement the suggested changes if useful and convinced to keep in mind user experience prefer clean but modern design eye guided structure
| 0
|
89,096
| 17,785,872,072
|
IssuesEvent
|
2021-08-31 10:58:44
|
libjxl/libjxl
|
https://api.github.com/repos/libjxl/libjxl
|
closed
|
problem to encode with grayscale ICC profile
|
encoder api
|
I am trying to encode JXL file using C API.
I set basic info `uses_original_profile = JXL_TRUE;` and to set grayscale ICC profile via `JxlEncoderSetICCProfile` but I get following error during the process:
```
lib/jxl/enc_external_image.cc:113: JXL_FAILURE: Buffer size is too small
lib/jxl/enc_external_image.cc:328: JXL_RETURN_IF_ERROR code=1: ConvertFromExternal( jxl::Span<const uint8_t>(static_cast<const uint8_t*>(buffer), size), xsize, ysize, c_current, pixel_format.num_channels == 2 || pixel_format.num_channels == 4, false, bitdepth, pixel_format.endianness, false, pool, ib, float_in)
```
I used `num_color_channels = 1;` and pixel format `num_channels = 1;` and `bits_per_sample = 16;` and `data_type = JXL_TYPE_UINT16;`
Similar code with `uses_original_profile = JXL_FALSE;` and `JxlEncoderSetColorEncoding` works with grayscale images so I believe I calculate buffer size correctly.
When I want to save RGB ICC profile, everything works but my attempts to save grayscale ICC lead to failure.
The grayscale ICC profile is generated by GIMP.
|
1.0
|
problem to encode with grayscale ICC profile - I am trying to encode JXL file using C API.
I set basic info `uses_original_profile = JXL_TRUE;` and to set grayscale ICC profile via `JxlEncoderSetICCProfile` but I get following error during the process:
```
lib/jxl/enc_external_image.cc:113: JXL_FAILURE: Buffer size is too small
lib/jxl/enc_external_image.cc:328: JXL_RETURN_IF_ERROR code=1: ConvertFromExternal( jxl::Span<const uint8_t>(static_cast<const uint8_t*>(buffer), size), xsize, ysize, c_current, pixel_format.num_channels == 2 || pixel_format.num_channels == 4, false, bitdepth, pixel_format.endianness, false, pool, ib, float_in)
```
I used `num_color_channels = 1;` and pixel format `num_channels = 1;` and `bits_per_sample = 16;` and `data_type = JXL_TYPE_UINT16;`
Similar code with `uses_original_profile = JXL_FALSE;` and `JxlEncoderSetColorEncoding` works with grayscale images so I believe I calculate buffer size correctly.
When I want to save RGB ICC profile, everything works but my attempts to save grayscale ICC lead to failure.
The grayscale ICC profile is generated by GIMP.
|
non_process
|
problem to encode with grayscale icc profile i am trying to encode jxl file using c api i set basic info uses original profile jxl true and to set grayscale icc profile via jxlencoderseticcprofile but i get following error during the process lib jxl enc external image cc jxl failure buffer size is too small lib jxl enc external image cc jxl return if error code convertfromexternal jxl span static cast buffer size xsize ysize c current pixel format num channels pixel format num channels false bitdepth pixel format endianness false pool ib float in i used num color channels and pixel format num channels and bits per sample and data type jxl type similar code with uses original profile jxl false and jxlencodersetcolorencoding works with grayscale images so i believe i calculate buffer size correctly when i want to save rgb icc profile everything works but my attempts to save grayscale icc lead to failure the grayscale icc profile is generated by gimp
| 0
|
3,785
| 6,762,731,586
|
IssuesEvent
|
2017-10-25 09:00:12
|
BlesseNtumble/GalaxySpace
|
https://api.github.com/repos/BlesseNtumble/GalaxySpace
|
closed
|
[1.1.8 STABLE] Exception caught during firing event net.minecraftforge.event.entity.living.LivingEvent
|
fixed in the process of correcting
|
Installed on a Thermos server.
Flew to Venus; as a result, the server started kicking me off.
The server produces the following errors:
[01:23:22 INFO]: 4erk[/5.18.239.71:12721] logged in with entity id 4888 at ([DIM-1006] -125.5, 161.19375, 87.5)
[01:23:22 INFO]: Sending server configs to client for com.enderio.core.common.config.ConfigHandler
[01:23:22 INFO]: Serialized Player data saved in PlayerCache.dat
[01:23:22 INFO]: You're using the latest recommended version of GT++.
[01:23:22 INFO]: Sending server configs to client for tterrag.wailaplugins.config.WPConfigHandler
[01:23:22 INFO]: Player GCEntityPlayerMP['4erk'/4888, l='DIM-1006', x=-125.50, y=161.19, z=87.50](4erk at -125.5,161.19375,87.5) connected. Sending ping
[01:23:29 ERROR]: Exception caught during firing event net.minecraftforge.event.entity.living.LivingEvent$LivingUpdateEvent@6ba979e5:
java.lang.NoClassDefFoundError: net/minecraft/client/Minecraft
at galaxyspace.SolarSystem.planets.venus.dimension.WorldProviderVenus.getYPosLightning(Unknown Source) ~[WorldProviderVenus.class:?]
at galaxyspace.SolarSystem.core.handler.GSLightningStormHandler.spawnLightning(Unknown Source) ~[GSLightningStormHandler.class:?]
at galaxyspace.SolarSystem.core.events.GSEventHandler.onEntityUpdate(Unknown Source) ~[GSEventHandler.class:?]
at cpw.mods.fml.common.eventhandler.ASMEventHandler_246_GSEventHandler_onEntityUpdate_LivingUpdateEvent.invoke(.dynamic) ~[?:?]
at cpw.mods.fml.common.eventhandler.ASMEventHandler.invoke(ASMEventHandler.java:54) ~[ASMEventHandler.class:1.7.10-1614.57]
at cpw.mods.fml.common.eventhandler.EventBus.post(EventBus.java:140) [EventBus.class:1.7.10-1614.57]
at net.minecraftforge.common.ForgeHooks.onLivingUpdate(ForgeHooks.java:298) [ForgeHooks.class:1.7.10-1614.57]
at net.minecraft.entity.EntityLivingBase.func_70071_h_(EntityLivingBase.java:1901) [sv.class:?]
at net.minecraft.entity.player.EntityPlayer.func_70071_h_(EntityPlayer.java:315) [yz.class:?]
at net.minecraft.entity.player.EntityPlayerMP.func_71127_g(EntityPlayerMP.java:399) [mw.class:?]
at net.minecraft.network.NetHandlerPlayServer.func_147347_a(NetHandlerPlayServer.java:416) [nh.class:?]
at net.minecraft.network.play.client.C03PacketPlayer.func_148833_a(C03PacketPlayer.java:36) [jd.class:?]
at net.minecraft.network.play.client.C03PacketPlayer$C05PacketPlayerLook.func_148833_a(C03PacketPlayer.java:182) [jg.class:?]
at net.minecraft.network.NetworkManager.func_74428_b(NetworkManager.java:245) [ej.class:?]
at net.minecraft.network.NetworkSystem.func_151269_c(NetworkSystem.java:181) [nc.class:?]
at net.minecraft.server.MinecraftServer.func_71190_q(MinecraftServer.java:1023) [MinecraftServer.class:?]
at net.minecraft.server.dedicated.DedicatedServer.func_71190_q(DedicatedServer.java:432) [lt.class:?]
at net.minecraft.server.MinecraftServer.func_71217_p(MinecraftServer.java:841) [MinecraftServer.class:?]
at net.minecraft.server.MinecraftServer.run(MinecraftServer.java:693) [MinecraftServer.class:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_144]
[01:23:29 ERROR]: Index: 4 Listeners:
[01:23:29 ERROR]: 0: NORMAL
[01:23:29 ERROR]: 1: ASM: crazypants.enderio.machine.spawner.BlockPoweredSpawner@3fe57c73 onLivingUpdate(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 ERROR]: 2: ASM: com.rwtema.extrautils.item.ItemAngelRing$EventHandlerRing@5f517ae9 entTick(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 ERROR]: 3: ASM: micdoodle8.mods.galacticraft.core.event.EventHandlerGC@4e8e31ca entityLivingEvent(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 ERROR]: 4: ASM: galaxyspace.SolarSystem.core.events.GSEventHandler@6d333660 onEntityUpdate(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 ERROR]: 5: ASM: com.rwtema.extrautils.EventHandlerServer@45a47c24 updateEntity(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 ERROR]: 6: ASM: com.rwtema.extrautils.EventHandlerSiege@25478853 Siege(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 ERROR]: 7: ASM: galaxyspace.SolarSystem.core.achievements.AchEvent@780732c6 onEntityUpdate(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 WARN]: Failed to handle packet for /5.18.239.71:12721
net.minecraft.util.ReportedException: Ticking player
at net.minecraft.entity.player.EntityPlayerMP.func_71127_g(EntityPlayerMP.java:477) ~[mw.class:?]
at net.minecraft.network.NetHandlerPlayServer.func_147347_a(NetHandlerPlayServer.java:416) ~[nh.class:?]
at net.minecraft.network.play.client.C03PacketPlayer.func_148833_a(C03PacketPlayer.java:36) ~[jd.class:?]
at net.minecraft.network.play.client.C03PacketPlayer$C05PacketPlayerLook.func_148833_a(C03PacketPlayer.java:182) ~[jg.class:?]
at net.minecraft.network.NetworkManager.func_74428_b(NetworkManager.java:245) ~[ej.class:?]
at net.minecraft.network.NetworkSystem.func_151269_c(NetworkSystem.java:181) [nc.class:?]
at net.minecraft.server.MinecraftServer.func_71190_q(MinecraftServer.java:1023) [MinecraftServer.class:?]
at net.minecraft.server.dedicated.DedicatedServer.func_71190_q(DedicatedServer.java:432) [lt.class:?]
at net.minecraft.server.MinecraftServer.func_71217_p(MinecraftServer.java:841) [MinecraftServer.class:?]
at net.minecraft.server.MinecraftServer.run(MinecraftServer.java:693) [MinecraftServer.class:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_144]
Caused by: java.lang.NoClassDefFoundError: net/minecraft/client/Minecraft
at galaxyspace.SolarSystem.planets.venus.dimension.WorldProviderVenus.getYPosLightning(Unknown Source) ~[WorldProviderVenus.class:?]
at galaxyspace.SolarSystem.core.handler.GSLightningStormHandler.spawnLightning(Unknown Source) ~[GSLightningStormHandler.class:?]
at galaxyspace.SolarSystem.core.events.GSEventHandler.onEntityUpdate(Unknown Source) ~[GSEventHandler.class:?]
at cpw.mods.fml.common.eventhandler.ASMEventHandler_246_GSEventHandler_onEntityUpdate_LivingUpdateEvent.invoke(.dynamic) ~[?:?]
at cpw.mods.fml.common.eventhandler.ASMEventHandler.invoke(ASMEventHandler.java:54) ~[ASMEventHandler.class:1.7.10-1614.57]
at cpw.mods.fml.common.eventhandler.EventBus.post(EventBus.java:140) ~[EventBus.class:1.7.10-1614.57]
at net.minecraftforge.common.ForgeHooks.onLivingUpdate(ForgeHooks.java:298) ~[ForgeHooks.class:1.7.10-1614.57]
at net.minecraft.entity.EntityLivingBase.func_70071_h_(EntityLivingBase.java:1901) ~[sv.class:?]
at net.minecraft.entity.player.EntityPlayer.func_70071_h_(EntityPlayer.java:315) ~[yz.class:?]
at net.minecraft.entity.player.EntityPlayerMP.func_71127_g(EntityPlayerMP.java:399) ~[mw.class:?]
... 10 more
[01:23:29 INFO]: 4erk lost connection: Internal server error
[01:23:29 INFO]: 4erk left the game.
Forge 1614
|
1.0
|
[1.1.8 STABLE] Exception caught during firing event net.minecraftforge.event.entity.living.LivingEvent - Installed on a Thermos server.
Flew to Venus; as a result, the server started kicking me off.
The server produces the following errors:
[01:23:22 INFO]: 4erk[/5.18.239.71:12721] logged in with entity id 4888 at ([DIM-1006] -125.5, 161.19375, 87.5)
[01:23:22 INFO]: Sending server configs to client for com.enderio.core.common.config.ConfigHandler
[01:23:22 INFO]: Serialized Player data saved in PlayerCache.dat
[01:23:22 INFO]: You're using the latest recommended version of GT++.
[01:23:22 INFO]: Sending server configs to client for tterrag.wailaplugins.config.WPConfigHandler
[01:23:22 INFO]: Player GCEntityPlayerMP['4erk'/4888, l='DIM-1006', x=-125.50, y=161.19, z=87.50](4erk at -125.5,161.19375,87.5) connected. Sending ping
[01:23:29 ERROR]: Exception caught during firing event net.minecraftforge.event.entity.living.LivingEvent$LivingUpdateEvent@6ba979e5:
java.lang.NoClassDefFoundError: net/minecraft/client/Minecraft
at galaxyspace.SolarSystem.planets.venus.dimension.WorldProviderVenus.getYPosLightning(Unknown Source) ~[WorldProviderVenus.class:?]
at galaxyspace.SolarSystem.core.handler.GSLightningStormHandler.spawnLightning(Unknown Source) ~[GSLightningStormHandler.class:?]
at galaxyspace.SolarSystem.core.events.GSEventHandler.onEntityUpdate(Unknown Source) ~[GSEventHandler.class:?]
at cpw.mods.fml.common.eventhandler.ASMEventHandler_246_GSEventHandler_onEntityUpdate_LivingUpdateEvent.invoke(.dynamic) ~[?:?]
at cpw.mods.fml.common.eventhandler.ASMEventHandler.invoke(ASMEventHandler.java:54) ~[ASMEventHandler.class:1.7.10-1614.57]
at cpw.mods.fml.common.eventhandler.EventBus.post(EventBus.java:140) [EventBus.class:1.7.10-1614.57]
at net.minecraftforge.common.ForgeHooks.onLivingUpdate(ForgeHooks.java:298) [ForgeHooks.class:1.7.10-1614.57]
at net.minecraft.entity.EntityLivingBase.func_70071_h_(EntityLivingBase.java:1901) [sv.class:?]
at net.minecraft.entity.player.EntityPlayer.func_70071_h_(EntityPlayer.java:315) [yz.class:?]
at net.minecraft.entity.player.EntityPlayerMP.func_71127_g(EntityPlayerMP.java:399) [mw.class:?]
at net.minecraft.network.NetHandlerPlayServer.func_147347_a(NetHandlerPlayServer.java:416) [nh.class:?]
at net.minecraft.network.play.client.C03PacketPlayer.func_148833_a(C03PacketPlayer.java:36) [jd.class:?]
at net.minecraft.network.play.client.C03PacketPlayer$C05PacketPlayerLook.func_148833_a(C03PacketPlayer.java:182) [jg.class:?]
at net.minecraft.network.NetworkManager.func_74428_b(NetworkManager.java:245) [ej.class:?]
at net.minecraft.network.NetworkSystem.func_151269_c(NetworkSystem.java:181) [nc.class:?]
at net.minecraft.server.MinecraftServer.func_71190_q(MinecraftServer.java:1023) [MinecraftServer.class:?]
at net.minecraft.server.dedicated.DedicatedServer.func_71190_q(DedicatedServer.java:432) [lt.class:?]
at net.minecraft.server.MinecraftServer.func_71217_p(MinecraftServer.java:841) [MinecraftServer.class:?]
at net.minecraft.server.MinecraftServer.run(MinecraftServer.java:693) [MinecraftServer.class:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_144]
[01:23:29 ERROR]: Index: 4 Listeners:
[01:23:29 ERROR]: 0: NORMAL
[01:23:29 ERROR]: 1: ASM: crazypants.enderio.machine.spawner.BlockPoweredSpawner@3fe57c73 onLivingUpdate(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 ERROR]: 2: ASM: com.rwtema.extrautils.item.ItemAngelRing$EventHandlerRing@5f517ae9 entTick(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 ERROR]: 3: ASM: micdoodle8.mods.galacticraft.core.event.EventHandlerGC@4e8e31ca entityLivingEvent(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 ERROR]: 4: ASM: galaxyspace.SolarSystem.core.events.GSEventHandler@6d333660 onEntityUpdate(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 ERROR]: 5: ASM: com.rwtema.extrautils.EventHandlerServer@45a47c24 updateEntity(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 ERROR]: 6: ASM: com.rwtema.extrautils.EventHandlerSiege@25478853 Siege(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 ERROR]: 7: ASM: galaxyspace.SolarSystem.core.achievements.AchEvent@780732c6 onEntityUpdate(Lnet/minecraftforge/event/entity/living/LivingEvent$LivingUpdateEvent;)V
[01:23:29 WARN]: Failed to handle packet for /5.18.239.71:12721
net.minecraft.util.ReportedException: Ticking player
at net.minecraft.entity.player.EntityPlayerMP.func_71127_g(EntityPlayerMP.java:477) ~[mw.class:?]
at net.minecraft.network.NetHandlerPlayServer.func_147347_a(NetHandlerPlayServer.java:416) ~[nh.class:?]
at net.minecraft.network.play.client.C03PacketPlayer.func_148833_a(C03PacketPlayer.java:36) ~[jd.class:?]
at net.minecraft.network.play.client.C03PacketPlayer$C05PacketPlayerLook.func_148833_a(C03PacketPlayer.java:182) ~[jg.class:?]
at net.minecraft.network.NetworkManager.func_74428_b(NetworkManager.java:245) ~[ej.class:?]
at net.minecraft.network.NetworkSystem.func_151269_c(NetworkSystem.java:181) [nc.class:?]
at net.minecraft.server.MinecraftServer.func_71190_q(MinecraftServer.java:1023) [MinecraftServer.class:?]
at net.minecraft.server.dedicated.DedicatedServer.func_71190_q(DedicatedServer.java:432) [lt.class:?]
at net.minecraft.server.MinecraftServer.func_71217_p(MinecraftServer.java:841) [MinecraftServer.class:?]
at net.minecraft.server.MinecraftServer.run(MinecraftServer.java:693) [MinecraftServer.class:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_144]
Caused by: java.lang.NoClassDefFoundError: net/minecraft/client/Minecraft
at galaxyspace.SolarSystem.planets.venus.dimension.WorldProviderVenus.getYPosLightning(Unknown Source) ~[WorldProviderVenus.class:?]
at galaxyspace.SolarSystem.core.handler.GSLightningStormHandler.spawnLightning(Unknown Source) ~[GSLightningStormHandler.class:?]
at galaxyspace.SolarSystem.core.events.GSEventHandler.onEntityUpdate(Unknown Source) ~[GSEventHandler.class:?]
at cpw.mods.fml.common.eventhandler.ASMEventHandler_246_GSEventHandler_onEntityUpdate_LivingUpdateEvent.invoke(.dynamic) ~[?:?]
at cpw.mods.fml.common.eventhandler.ASMEventHandler.invoke(ASMEventHandler.java:54) ~[ASMEventHandler.class:1.7.10-1614.57]
at cpw.mods.fml.common.eventhandler.EventBus.post(EventBus.java:140) ~[EventBus.class:1.7.10-1614.57]
at net.minecraftforge.common.ForgeHooks.onLivingUpdate(ForgeHooks.java:298) ~[ForgeHooks.class:1.7.10-1614.57]
at net.minecraft.entity.EntityLivingBase.func_70071_h_(EntityLivingBase.java:1901) ~[sv.class:?]
at net.minecraft.entity.player.EntityPlayer.func_70071_h_(EntityPlayer.java:315) ~[yz.class:?]
at net.minecraft.entity.player.EntityPlayerMP.func_71127_g(EntityPlayerMP.java:399) ~[mw.class:?]
... 10 more
[01:23:29 INFO]: 4erk lost connection: Internal server error
[01:23:29 INFO]: 4erk left the game.
Forge 1614
|
process
|
exception caught during firing event net minecraftforge event entity living livingevent установлена на сервер thermos прилетел на венеру в итоге стало выкидывать с сервера на сервера выдает такие ошибки logged in with entity id at sending server configs to client for com enderio core common config confighandler serialized player data saved in playercache dat you re using the latest recommended version of gt sending server configs to client for tterrag wailaplugins config wpconfighandler player gcentityplayermp at connected sending ping exception caught during firing event net minecraftforge event entity living livingevent livingupdateevent java lang noclassdeffounderror net minecraft client minecraft at galaxyspace solarsystem planets venus dimension worldprovidervenus getyposlightning unknown source at galaxyspace solarsystem core handler gslightningstormhandler spawnlightning unknown source at galaxyspace solarsystem core events gseventhandler onentityupdate unknown source at cpw mods fml common eventhandler asmeventhandler gseventhandler onentityupdate livingupdateevent invoke dynamic at cpw mods fml common eventhandler asmeventhandler invoke asmeventhandler java at cpw mods fml common eventhandler eventbus post eventbus java at net minecraftforge common forgehooks onlivingupdate forgehooks java at net minecraft entity entitylivingbase func h entitylivingbase java at net minecraft entity player entityplayer func h entityplayer java at net minecraft entity player entityplayermp func g entityplayermp java at net minecraft network nethandlerplayserver func a nethandlerplayserver java at net minecraft network play client func a java at net minecraft network play client func a java at net minecraft network networkmanager func b networkmanager java at net minecraft network networksystem func c networksystem java at net minecraft server minecraftserver func q minecraftserver java at net minecraft server dedicated dedicatedserver func q dedicatedserver java at net 
minecraft server minecraftserver func p minecraftserver java at net minecraft server minecraftserver run minecraftserver java at java lang thread run thread java index listeners normal asm crazypants enderio machine spawner blockpoweredspawner onlivingupdate lnet minecraftforge event entity living livingevent livingupdateevent v asm com rwtema extrautils item itemangelring eventhandlerring enttick lnet minecraftforge event entity living livingevent livingupdateevent v asm mods galacticraft core event eventhandlergc entitylivingevent lnet minecraftforge event entity living livingevent livingupdateevent v asm galaxyspace solarsystem core events gseventhandler onentityupdate lnet minecraftforge event entity living livingevent livingupdateevent v asm com rwtema extrautils eventhandlerserver updateentity lnet minecraftforge event entity living livingevent livingupdateevent v asm com rwtema extrautils eventhandlersiege siege lnet minecraftforge event entity living livingevent livingupdateevent v asm galaxyspace solarsystem core achievements achevent onentityupdate lnet minecraftforge event entity living livingevent livingupdateevent v failed to handle packet for net minecraft util reportedexception ticking player at net minecraft entity player entityplayermp func g entityplayermp java at net minecraft network nethandlerplayserver func a nethandlerplayserver java at net minecraft network play client func a java at net minecraft network play client func a java at net minecraft network networkmanager func b networkmanager java at net minecraft network networksystem func c networksystem java at net minecraft server minecraftserver func q minecraftserver java at net minecraft server dedicated dedicatedserver func q dedicatedserver java at net minecraft server minecraftserver func p minecraftserver java at net minecraft server minecraftserver run minecraftserver java at java lang thread run thread java caused by java lang noclassdeffounderror net minecraft client minecraft at 
galaxyspace solarsystem planets venus dimension worldprovidervenus getyposlightning unknown source at galaxyspace solarsystem core handler gslightningstormhandler spawnlightning unknown source at galaxyspace solarsystem core events gseventhandler onentityupdate unknown source at cpw mods fml common eventhandler asmeventhandler gseventhandler onentityupdate livingupdateevent invoke dynamic at cpw mods fml common eventhandler asmeventhandler invoke asmeventhandler java at cpw mods fml common eventhandler eventbus post eventbus java at net minecraftforge common forgehooks onlivingupdate forgehooks java at net minecraft entity entitylivingbase func h entitylivingbase java at net minecraft entity player entityplayer func h entityplayer java at net minecraft entity player entityplayermp func g entityplayermp java more lost connection internal server error left the game фордж
| 1
|
554,342
| 16,418,371,631
|
IssuesEvent
|
2021-05-19 09:31:50
|
ahmedkaludi/accelerated-mobile-pages
|
https://api.github.com/repos/ahmedkaludi/accelerated-mobile-pages
|
closed
|
Create a filter and hook from which user can control Mobile redirection
|
Urgent [Priority: HIGH] enhancement
|
User can be able to control the Mobile redirection via code regardless of what is set in the options panel
https://secure.helpscout.net/conversation/1168448486/130528?folderId=1060554
|
1.0
|
Create a filter and hook from which user can control Mobile redirection - User can be able to control the Mobile redirection via code regardless of what is set in the options panel
https://secure.helpscout.net/conversation/1168448486/130528?folderId=1060554
|
non_process
|
create a filter and hook from which user can control mobile redirection user can be able to control the mobile redirection via code regardless of what is set in the options panel
| 0
|
350,490
| 31,896,397,657
|
IssuesEvent
|
2023-09-18 02:24:28
|
mobilitysol/monitorweb
|
https://api.github.com/repos/mobilitysol/monitorweb
|
closed
|
🛑 Server Mobility Testing is down
|
status server-mobility-testing
|
In [`c80616e`](https://github.com/mobilitysol/monitorweb/commit/c80616e455a4fd589a9d1c325eae5a7f62ea0270), Server Mobility Testing (https://mobilitysol.com:30443) was **down**:
- HTTP code: 0
- Response time: 0 ms
|
1.0
|
🛑 Server Mobility Testing is down - In [`c80616e`](https://github.com/mobilitysol/monitorweb/commit/c80616e455a4fd589a9d1c325eae5a7f62ea0270), Server Mobility Testing (https://mobilitysol.com:30443) was **down**:
- HTTP code: 0
- Response time: 0 ms
|
non_process
|
🛑 server mobility testing is down in server mobility testing was down http code response time ms
| 0
|
12,283
| 14,798,702,271
|
IssuesEvent
|
2021-01-13 00:25:36
|
allinurl/goaccess
|
https://api.github.com/repos/allinurl/goaccess
|
closed
|
timestamp in milliseconds
|
add enhancement log-processing log/date/time format
|
I got a log from a CDN provider with a timestamp in milliseconds!
I see there is a time format specifier %s - for UNIX timestamp in seconds - and %f - for timestamps in microseconds - but none for timestamps in milliseconds or did I overlook something?
Otherwise, this can be seen as a feature request, it would be great being able to read it in without further processing.
|
1.0
|
timestamp in milliseconds - I got a log from a CDN provider with a timestamp in milliseconds!
I see there is a time format specifier %s - for UNIX timestamp in seconds - and %f - for timestamps in microseconds - but none for timestamps in milliseconds or did I overlook something?
Otherwise, this can be seen as a feature request, it would be great being able to read it in without further processing.
|
process
|
timestamp in milliseconds i got a log from a cdn provider with a timestamp in milliseconds i see there is a time format specifier s for unix timestamp in seconds and f for timestamps in microseconds but none for timestamps in milliseconds or did i overlook something otherwise this can be seen as a feature request it would be great being able to read it in without further processing
| 1
|
93,228
| 26,897,930,426
|
IssuesEvent
|
2023-02-06 13:47:32
|
vitessio/vitess
|
https://api.github.com/repos/vitessio/vitess
|
opened
|
Enhancement: Remove dependency on k8s libraries except where strictly necessary
|
Type: Enhancement Component: Build/CI Type: CI/Build
|
### Feature Description
Today we use the K8s `apimachinery` sets API pretty widely inside Vitess. This causes issues when for example in other projects you need to pin to a specific k8s version. This is common for example specifically the operator, as in https://github.com/planetscale/vitess-operator/blob/main/go.mod.
In recent versions, k8s deprecated the old Set APIs for new ones and we fixed that deprecation, but that means we can't now pin to older k8s if needed.
We should have our own Set implementation and not use the one from Kubernetes to make sure we don't have this complex dependency that breaks in these cases.
### Use Case(s)
-
|
2.0
|
Enhancement: Remove dependency on k8s libraries except where strictly necessary - ### Feature Description
Today we use the K8s `apimachinery` sets API pretty widely inside Vitess. This causes issues when for example in other projects you need to pin to a specific k8s version. This is common for example specifically the operator, as in https://github.com/planetscale/vitess-operator/blob/main/go.mod.
In recent versions, k8s deprecated the old Set APIs for new ones and we fixed that deprecation, but that means we can't now pin to older k8s if needed.
We should have our own Set implementation and not use the one from Kubernetes to make sure we don't have this complex dependency that breaks in these cases.
### Use Case(s)
-
|
non_process
|
enhancement remove dependency on libraries except where strictly necessary feature description today we use the apimachinery sets api pretty widely inside vitess this causes issues when for example in other projects you need to pin to a specific version this is common for example specifically the operator as in in recent versions deprecated the old set apis for new ones and we fixed that deprecation but that means we can t now pin to older if needed we should have our own set implementation and not use the one from kubernetes to make sure we don t have this complex dependency that breaks in these cases use case s
| 0
|
3,461
| 6,545,138,032
|
IssuesEvent
|
2017-09-04 01:58:10
|
BlesseNtumble/GalaxySpace
|
https://api.github.com/repos/BlesseNtumble/GalaxySpace
|
closed
|
Assembly machine eating materials
|
in the process of correcting priority
|
When using any item transport to attempt to automate assembly IC2 copper ingots the system fails with consequences. I can hand place the ingots in the machine with no issue.
AE- ic2 copper simply vanish (not in machine, not stored in ME interface, no longer on ME network) tin places in ok but only into slot 2
Thermal Dynamics - servo wont push out of chest to feed, retriever doesnt see machine as a valid destination
BC - works but only for slot 2 and only first item, every item that follow just vanishes
This bug is easily repeatable. Let me know if you need any other info or pics/vid of my setup or anything else.
edit: running on KCauldron 1.7.10 r0.1
|
1.0
|
Assembly machine eating materials - When using any item transport to attempt to automate assembly IC2 copper ingots the system fails with consequences. I can hand place the ingots in the machine with no issue.
AE- ic2 copper simply vanish (not in machine, not stored in ME interface, no longer on ME network) tin places in ok but only into slot 2
Thermal Dynamics - servo wont push out of chest to feed, retriever doesnt see machine as a valid destination
BC - works but only for slot 2 and only first item, every item that follow just vanishes
This bug is easily repeatable. Let me know if you need any other info or pics/vid of my setup or anything else.
edit: running on KCauldron 1.7.10 r0.1
|
process
|
assembly machine eating materials when using any item transport to attempt to automate assembly copper ingots the system fails with consequences i can hand place the ingots in the machine with no issue ae copper simply vanish not in machine not stored in me interface no longer on me network tin places in ok but only into slot thermal dynamics servo wont push out of chest to feed retriever doesnt see machine as a valid destination bc works but only for slot and only first item every item that follow just vanishes this bug is easily repeatable let me know if you need any other info or pics vid of my setup or anything else edit running on kcauldron
| 1
|
75,692
| 20,959,326,188
|
IssuesEvent
|
2022-03-27 15:06:08
|
microsoft/azure-pipelines-tasks
|
https://api.github.com/repos/microsoft/azure-pipelines-tasks
|
closed
|
Publish Build Artifact from Linux Agent
|
enhancement stale Task: PublishBuildArtifacts
|
**Question, Bug, or Feature?**
*Type*: Feature
**Enter Task Name**: Publish build artifacts
Hi team,
at the time of this writing the Publish build artifacts task only supports Windows agents when used in the file share mode. Is there any plan to support the scenario also for Linux agents?

|
1.0
|
Publish Build Artifact from Linux Agent - **Question, Bug, or Feature?**
*Type*: Feature
**Enter Task Name**: Publish build artifacts
Hi team,
at the time of this writing the Publish build artifacts task only supports Windows agents when used in the file share mode. Is there any plan to support the scenario also for Linux agents?

|
non_process
|
publish build artifact from linux agent question bug or feature type feature enter task name publish build artifacts hi team at the time of this writing the publish build artifacts task only supports windows agents when used in the file share mode is there any plan to support the scenario also for linux agents
| 0
|
266,030
| 28,298,869,431
|
IssuesEvent
|
2023-04-10 02:49:59
|
nidhi7598/linux-4.19.72
|
https://api.github.com/repos/nidhi7598/linux-4.19.72
|
closed
|
CVE-2022-3564 (High) detected in linuxlinux-4.19.254 - autoclosed
|
Mend: dependency security vulnerability
|
## CVE-2022-3564 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.254</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/nidhi7598/linux-4.19.72/commit/10a8c99e4f60044163c159867bc6f5452c1c36e5">10a8c99e4f60044163c159867bc6f5452c1c36e5</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/bluetooth/l2cap_core.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability classified as critical was found in Linux Kernel. Affected by this vulnerability is the function l2cap_reassemble_sdu of the file net/bluetooth/l2cap_core.c of the component Bluetooth. The manipulation leads to use after free. It is recommended to apply a patch to fix this issue. The associated identifier of this vulnerability is VDB-211087.
<p>Publish Date: 2022-10-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-3564>CVE-2022-3564</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Adjacent
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2022-3564">https://www.linuxkernelcves.com/cves/CVE-2022-3564</a></p>
<p>Release Date: 2022-10-17</p>
<p>Fix Resolution: v4.9.333,v4.14.299,v4.19.265,v5.4.224,v5.10.154,v5.15.78,v6.0.8,v6.1-rc4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
|
non_process
|
cve high detected in linuxlinux autoclosed cve high severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch master vulnerable source files net bluetooth core c vulnerability details a vulnerability classified as critical was found in linux kernel affected by this vulnerability is the function reassemble sdu of the file net bluetooth core c of the component bluetooth the manipulation leads to use after free it is recommended to apply a patch to fix this issue the associated identifier of this vulnerability is vdb publish date url a href cvss score details base score metrics exploitability metrics attack vector adjacent attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
30,917
| 25,169,087,784
|
IssuesEvent
|
2022-11-11 00:27:39
|
patternfly/patternfly
|
https://api.github.com/repos/patternfly/patternfly
|
closed
|
Bug - pf-svg classes not included in react-styles
|
bug infrastructure needs triage released
|
**Describe the problem**
The CI build is failing in https://github.com/patternfly/patternfly-react/pull/5275 due to [these missing CSS classes](https://github.com/patternfly/patternfly/blob/main/src/patternfly/base/patternfly-icons.scss#L4-L30), which do not get pulled into `react-styles`.
This is because `patternfly-base.scss` doesn't include them, as they're not in the import chain.
**How do you reproduce the problem?**
https://patternfly-react-pr-5275.surge.sh/components/button#variant-examples
The button variants that include icons are oversized due to the missing styles.
|
1.0
|
|
non_process
|
bug pf svg classes not included in react styles describe the problem the ci build is failing in due to which do not get pulled into react styles this is because patternfly base scss doesn t include them as they re not in the import chain how do you reproduce the problem the button variants that include icons are oversized due to the missing styles
| 0
|
14,412
| 17,464,584,680
|
IssuesEvent
|
2021-08-06 15:03:33
|
googleapis/python-pubsub
|
https://api.github.com/repos/googleapis/python-pubsub
|
opened
|
Add yoshi-python group to CODEOWNERS
|
type: process
|
As discussed offline, the [yoshi-python](https://github.com/orgs/googleapis/teams/yoshi-python) group should be added to CODEOWNERS, so that the reviews by that group's members are sufficient to merge a green PR.
|
1.0
|
|
process
|
add yoshi python group to codeowners as discussed offline the group should be added to codeowners so that the reviews by that group s members are sufficient to merge a green pr
| 1
|
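For context on the row above, adding a team to CODEOWNERS is a one-line change; a sketch follows (the `*` path pattern is an assumption about how broadly the team should own the repository — the actual entry may scope ownership to specific paths):

```
# .github/CODEOWNERS (hypothetical entry)
# Members of the yoshi-python team become code owners for the whole
# repository, so their reviews satisfy required-review checks.
*   @googleapis/yoshi-python
```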
6,359
| 4,237,576,882
|
IssuesEvent
|
2016-07-05 22:22:53
|
biocaddie/prototype_issues
|
https://api.github.com/repos/biocaddie/prototype_issues
|
opened
|
Synonym Function/Advanced Search
|
Enhancement Usability V1.0
|
The synonym function does not work if you use the advanced search function.
|
True
|
|
non_process
|
synonym function advanced search the synonym function does not work if you use the advanced search function
| 0
|
22,108
| 30,640,030,382
|
IssuesEvent
|
2023-07-24 21:03:35
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
Release 6.3.0 - July 2023
|
P1 type: process release team-OSS
|
# Status of Bazel 6.3.0
- Expected first release candidate date: 2023-07-13
- Expected release date: 2023-07-24
- [List of release blockers](https://github.com/bazelbuild/bazel/milestone/53)
To report a release-blocking bug, please add a comment with the text `@bazel-io flag` to the issue. A release manager will triage it and add it to the milestone.
To cherry-pick a mainline commit into 6.3.0, simply send a PR against the `release-6.3.0` branch.
**Task list:**
- [x] Create release candidate
- [x] Check downstream projects
- [x] Create [draft release announcement](https://docs.google.com/document/d/1pu2ARPweOCTxPsRR8snoDtkC9R51XWRyBXeiC6Ql5so/edit)
- [x] Push the release and notify package maintainers
- [ ] Update the documentation
- [x] Update the [release page](https://github.com/bazelbuild/bazel/releases/)
|
1.0
|
|
process
|
release july status of bazel expected first release candidate date expected release date to report a release blocking bug please add a comment with the text bazel io flag to the issue a release manager will triage it and add it to the milestone to cherry pick a mainline commit into simply send a pr against the release branch task list create release candidate check downstream projects create push the release and notify package maintainers update the documentation update the
| 1
|
1,727
| 4,385,892,913
|
IssuesEvent
|
2016-08-08 10:39:13
|
inasafe/inasafe
|
https://api.github.com/repos/inasafe/inasafe
|
closed
|
Building Type Report Table mixing with zero and no data
|
Aggregation Current sprint Postprocessing
|
# Problem
The Building Type Report Table still reports zero (0) and No data when an IF is run with aggregation. As a result, the report becomes too long and is truncated at the side.
The table still uses the OSM preset/attributes even though the exposure data being used does not come from the OSM downloader.



# Detail
Test on Dock IF : Earthquake with Building Point
Hazard Data : EQ_TTU (Polygon)
Exposure : Infrastuvture_BIG (Point)
Aggregation : Yes (Village Level)
# proposed solution
1. There is no need to report zero (0) results; skipping them saves space.
2. The detailed Building Type report should use the existing exposure that was run in the IF.
# CC
@ismailsunni @samnawi @Charlotte-Morgan
|
1.0
|
|
process
|
building type report table mixing with zero and no data problem the building type report table still reporting zero o and no data if run if with aggregation the implication is the report become too long and truncation in the lateral side the table still use osm preset attribute although the exposure data that being used is not come from osm downloader detail test on dock if earthquake with building point hazard data eq ttu polygon exposure infrastuvture big point aggregation yes village level proposed solution no need report the zero result this can save space the detail building type report should use existing exposure that run in if cc ismailsunni samnawi charlotte morgan
| 1
|
202,061
| 23,053,923,012
|
IssuesEvent
|
2022-07-25 01:17:16
|
Xi0ngfei/e-mart-backend
|
https://api.github.com/repos/Xi0ngfei/e-mart-backend
|
opened
|
spring-boot-starter-data-jpa-1.4.4.RELEASE.jar: 3 vulnerabilities (highest severity is: 7.5)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-boot-starter-data-jpa-1.4.4.RELEASE.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /emart-auth2-service/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/Xi0ngfei/e-mart-backend/commit/9e2cdf0fabfba0aa30b3a80420cea42d1b714754">9e2cdf0fabfba0aa30b3a80420cea42d1b714754</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2018-1000632](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000632) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | dom4j-1.6.1.jar | Transitive | 1.5.0.RELEASE | ❌ |
| [CVE-2020-25638](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-25638) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.4 | hibernate-core-5.0.11.Final.jar | Transitive | 1.5.0.RELEASE | ❌ |
| [CVE-2019-14900](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-14900) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | hibernate-core-5.0.11.Final.jar | Transitive | 1.5.0.RELEASE | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2018-1000632</summary>
### Vulnerable Library - <b>dom4j-1.6.1.jar</b></p>
<p>dom4j: the flexible XML framework for Java</p>
<p>Library home page: <a href="http://dom4j.org">http://dom4j.org</a></p>
<p>Path to dependency file: /emart-auth2-service/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-data-jpa-1.4.4.RELEASE.jar (Root Library)
- hibernate-core-5.0.11.Final.jar
- :x: **dom4j-1.6.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Xi0ngfei/e-mart-backend/commit/9e2cdf0fabfba0aa30b3a80420cea42d1b714754">9e2cdf0fabfba0aa30b3a80420cea42d1b714754</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
dom4j version prior to version 2.1.1 contains a CWE-91: XML Injection vulnerability in Class: Element. Methods: addElement, addAttribute that can result in an attacker tampering with XML documents through XML injection. This attack appears to be exploitable via an attacker specifying attributes or elements in the XML document. This vulnerability appears to have been fixed in 2.1.1 or later.
<p>Publish Date: 2018-08-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000632>CVE-2018-1000632</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632</a></p>
<p>Release Date: 2018-08-20</p>
<p>Fix Resolution (dom4j:dom4j): 20040902.021138</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-data-jpa): 1.5.0.RELEASE</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-25638</summary>
### Vulnerable Library - <b>hibernate-core-5.0.11.Final.jar</b></p>
<p>The core O/RM functionality as provided by Hibernate</p>
<p>Library home page: <a href="http://hibernate.org">http://hibernate.org</a></p>
<p>Path to dependency file: /emart-auth2-service/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/hibernate/hibernate-core/5.0.11.Final/hibernate-core-5.0.11.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-data-jpa-1.4.4.RELEASE.jar (Root Library)
- :x: **hibernate-core-5.0.11.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Xi0ngfei/e-mart-backend/commit/9e2cdf0fabfba0aa30b3a80420cea42d1b714754">9e2cdf0fabfba0aa30b3a80420cea42d1b714754</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A flaw was found in hibernate-core in versions prior to and including 5.4.23.Final. A SQL injection in the implementation of the JPA Criteria API can permit unsanitized literals when a literal is used in the SQL comments of the query. This flaw could allow an attacker to access unauthorized information or possibly conduct further attacks. The highest threat from this vulnerability is to data confidentiality and integrity.
<p>Publish Date: 2020-12-02
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-25638>CVE-2020-25638</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://in.relation.to/2020/11/19/hibernate-orm-5424-final-release/">https://in.relation.to/2020/11/19/hibernate-orm-5424-final-release/</a></p>
<p>Release Date: 2020-12-02</p>
<p>Fix Resolution (org.hibernate:hibernate-core): 5.3.20.Final</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-data-jpa): 1.5.0.RELEASE</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2019-14900</summary>
### Vulnerable Library - <b>hibernate-core-5.0.11.Final.jar</b></p>
<p>The core O/RM functionality as provided by Hibernate</p>
<p>Library home page: <a href="http://hibernate.org">http://hibernate.org</a></p>
<p>Path to dependency file: /emart-auth2-service/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/hibernate/hibernate-core/5.0.11.Final/hibernate-core-5.0.11.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-data-jpa-1.4.4.RELEASE.jar (Root Library)
- :x: **hibernate-core-5.0.11.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Xi0ngfei/e-mart-backend/commit/9e2cdf0fabfba0aa30b3a80420cea42d1b714754">9e2cdf0fabfba0aa30b3a80420cea42d1b714754</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A flaw was found in Hibernate ORM in versions before 5.3.18, 5.4.18 and 5.5.0.Beta1. A SQL injection in the implementation of the JPA Criteria API can permit unsanitized literals when a literal is used in the SELECT or GROUP BY parts of the query. This flaw could allow an attacker to access unauthorized information or possibly conduct further attacks.
<p>Publish Date: 2020-07-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-14900>CVE-2019-14900</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14900">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14900</a></p>
<p>Release Date: 2020-07-06</p>
<p>Fix Resolution (org.hibernate:hibernate-core): 5.1.10.Final</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-data-jpa): 1.5.0.RELEASE</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
|
True
|
spring-boot-starter-data-jpa-1.4.4.RELEASE.jar: 3 vulnerabilities (highest severity is: 7.5) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-boot-starter-data-jpa-1.4.4.RELEASE.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /emart-auth2-service/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/Xi0ngfei/e-mart-backend/commit/9e2cdf0fabfba0aa30b3a80420cea42d1b714754">9e2cdf0fabfba0aa30b3a80420cea42d1b714754</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2018-1000632](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000632) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | dom4j-1.6.1.jar | Transitive | 1.5.0.RELEASE | ❌ |
| [CVE-2020-25638](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-25638) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.4 | hibernate-core-5.0.11.Final.jar | Transitive | 1.5.0.RELEASE | ❌ |
| [CVE-2019-14900](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-14900) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | hibernate-core-5.0.11.Final.jar | Transitive | 1.5.0.RELEASE | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2018-1000632</summary>
### Vulnerable Library - <b>dom4j-1.6.1.jar</b></p>
<p>dom4j: the flexible XML framework for Java</p>
<p>Library home page: <a href="http://dom4j.org">http://dom4j.org</a></p>
<p>Path to dependency file: /emart-auth2-service/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-data-jpa-1.4.4.RELEASE.jar (Root Library)
- hibernate-core-5.0.11.Final.jar
- :x: **dom4j-1.6.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Xi0ngfei/e-mart-backend/commit/9e2cdf0fabfba0aa30b3a80420cea42d1b714754">9e2cdf0fabfba0aa30b3a80420cea42d1b714754</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
dom4j version prior to version 2.1.1 contains a CWE-91: XML Injection vulnerability in Class: Element. Methods: addElement, addAttribute that can result in an attacker tampering with XML documents through XML injection. This attack appear to be exploitable via an attacker specifying attributes or elements in the XML document. This vulnerability appears to have been fixed in 2.1.1 or later.
<p>Publish Date: 2018-08-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000632>CVE-2018-1000632</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632</a></p>
<p>Release Date: 2018-08-20</p>
<p>Fix Resolution (dom4j:dom4j): 20040902.021138</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-data-jpa): 1.5.0.RELEASE</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-25638</summary>
### Vulnerable Library - <b>hibernate-core-5.0.11.Final.jar</b></p>
<p>The core O/RM functionality as provided by Hibernate</p>
<p>Library home page: <a href="http://hibernate.org">http://hibernate.org</a></p>
<p>Path to dependency file: /emart-auth2-service/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/hibernate/hibernate-core/5.0.11.Final/hibernate-core-5.0.11.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-data-jpa-1.4.4.RELEASE.jar (Root Library)
- :x: **hibernate-core-5.0.11.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Xi0ngfei/e-mart-backend/commit/9e2cdf0fabfba0aa30b3a80420cea42d1b714754">9e2cdf0fabfba0aa30b3a80420cea42d1b714754</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A flaw was found in hibernate-core in versions prior to and including 5.4.23.Final. A SQL injection in the implementation of the JPA Criteria API can permit unsanitized literals when a literal is used in the SQL comments of the query. This flaw could allow an attacker to access unauthorized information or possibly conduct further attacks. The highest threat from this vulnerability is to data confidentiality and integrity.
<p>Publish Date: 2020-12-02
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-25638>CVE-2020-25638</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://in.relation.to/2020/11/19/hibernate-orm-5424-final-release/">https://in.relation.to/2020/11/19/hibernate-orm-5424-final-release/</a></p>
<p>Release Date: 2020-12-02</p>
<p>Fix Resolution (org.hibernate:hibernate-core): 5.3.20.Final</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-data-jpa): 1.5.0.RELEASE</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2019-14900</summary>
### Vulnerable Library - <b>hibernate-core-5.0.11.Final.jar</b></p>
<p>The core O/RM functionality as provided by Hibernate</p>
<p>Library home page: <a href="http://hibernate.org">http://hibernate.org</a></p>
<p>Path to dependency file: /emart-auth2-service/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/hibernate/hibernate-core/5.0.11.Final/hibernate-core-5.0.11.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-data-jpa-1.4.4.RELEASE.jar (Root Library)
- :x: **hibernate-core-5.0.11.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Xi0ngfei/e-mart-backend/commit/9e2cdf0fabfba0aa30b3a80420cea42d1b714754">9e2cdf0fabfba0aa30b3a80420cea42d1b714754</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A flaw was found in Hibernate ORM in versions before 5.3.18, 5.4.18 and 5.5.0.Beta1. A SQL injection in the implementation of the JPA Criteria API can permit unsanitized literals when a literal is used in the SELECT or GROUP BY parts of the query. This flaw could allow an attacker to access unauthorized information or possibly conduct further attacks.
<p>Publish Date: 2020-07-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-14900>CVE-2019-14900</a></p>
</p>
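The class of flaw described above — a value spliced into the generated SQL as a literal instead of being bound as a parameter — can be illustrated outside Hibernate with a minimal sketch. This uses plain Python/sqlite3, not the JPA Criteria API, and the table and values are invented for the demonstration:

```python
import sqlite3

# Minimal sketch of the literal-vs-parameter distinction behind this
# CVE class; plain sqlite3, NOT Hibernate or the JPA Criteria API.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

attacker_input = "x' OR '1'='1"

# Unsafe: the value is spliced into the SQL text as a literal, so the
# quote inside attacker_input rewrites the query's meaning (injection).
unsafe_sql = f"SELECT name FROM users WHERE role = '{attacker_input}'"
leaked = conn.execute(unsafe_sql).fetchall()   # every row comes back

# Safe: the value travels as a bound parameter, never as SQL text.
safe = conn.execute(
    "SELECT name FROM users WHERE role = ?", (attacker_input,)
).fetchall()                                   # no rows match
```

The upgrade recommended in the report fixes the literal handling inside Hibernate's query rendering; the sketch only shows why unsanitized literals are dangerous in the first place.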
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14900">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14900</a></p>
<p>Release Date: 2020-07-06</p>
<p>Fix Resolution (org.hibernate:hibernate-core): 5.1.10.Final</p>
<p>Direct dependency fix Resolution (org.springframework.boot:spring-boot-starter-data-jpa): 1.5.0.RELEASE</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
|
non_process
|
spring boot starter data jpa release jar vulnerabilities highest severity is vulnerable library spring boot starter data jpa release jar path to dependency file emart service pom xml path to vulnerable library home wss scanner repository jar found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available high jar transitive release high hibernate core final jar transitive release medium hibernate core final jar transitive release details cve vulnerable library jar the flexible xml framework for java library home page a href path to dependency file emart service pom xml path to vulnerable library home wss scanner repository jar dependency hierarchy spring boot starter data jpa release jar root library hibernate core final jar x jar vulnerable library found in head commit a href found in base branch master vulnerability details version prior to version contains a cwe xml injection vulnerability in class element methods addelement addattribute that can result in an attacker tampering with xml documents through xml injection this attack appear to be exploitable via an attacker specifying attributes or elements in the xml document this vulnerability appears to have been fixed in or later publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution direct dependency fix resolution org springframework boot spring boot starter data jpa release step up your open source security game with mend cve vulnerable library hibernate core final jar the core o rm functionality as provided by hibernate library home page a href path to dependency file emart service pom xml path to vulnerable library home 
wss scanner repository org hibernate hibernate core final hibernate core final jar dependency hierarchy spring boot starter data jpa release jar root library x hibernate core final jar vulnerable library found in head commit a href found in base branch master vulnerability details a flaw was found in hibernate core in versions prior to and including final a sql injection in the implementation of the jpa criteria api can permit unsanitized literals when a literal is used in the sql comments of the query this flaw could allow an attacker to access unauthorized information or possibly conduct further attacks the highest threat from this vulnerability is to data confidentiality and integrity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org hibernate hibernate core final direct dependency fix resolution org springframework boot spring boot starter data jpa release step up your open source security game with mend cve vulnerable library hibernate core final jar the core o rm functionality as provided by hibernate library home page a href path to dependency file emart service pom xml path to vulnerable library home wss scanner repository org hibernate hibernate core final hibernate core final jar dependency hierarchy spring boot starter data jpa release jar root library x hibernate core final jar vulnerable library found in head commit a href found in base branch master vulnerability details a flaw was found in hibernate orm in versions before and a sql injection in the implementation of the jpa criteria api can permit unsanitized literals when a literal is used in the select or group by parts of the query this flaw 
could allow an attacker to access unauthorized information or possibly conduct further attacks publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org hibernate hibernate core final direct dependency fix resolution org springframework boot spring boot starter data jpa release step up your open source security game with mend
| 0
|
211,120
| 16,174,846,269
|
IssuesEvent
|
2021-05-03 04:00:31
|
backdrop/backdrop-issues
|
https://api.github.com/repos/backdrop/backdrop-issues
|
closed
|
Make sure Backdrop works in PHP 8
|
pr - needs testing pr - works for me status - has pull request type - task
|
**Description of the need**
PHP 8 will be out soon, and we will need to make some changes so that Backdrop can run on it.
**Additional information**
There is a meta issue for adding PHP 8 support for Drupal 9, but I was not able to find information about PHP 8 support in older versions of Drupal.
https://www.drupal.org/project/drupal/issues/3109885
---
PR by @hosef https://github.com/backdrop/backdrop/pull/3412
|
1.0
|
Make sure Backdrop works in PHP 8 - **Description of the need**
PHP 8 will be out soon, and we will need to make some changes so that Backdrop can run on it.
**Additional information**
There is a meta issue for adding PHP 8 support for Drupal 9, but I was not able to find information about PHP 8 support in older versions of Drupal.
https://www.drupal.org/project/drupal/issues/3109885
---
PR by @hosef https://github.com/backdrop/backdrop/pull/3412
|
non_process
|
make sure backdrop works in php description of the need php will be out soon and we will need to make some changes so that backdrop can run on it additional information there is a meta issue for adding php support for drupal but i was not able to find information about php support in older versions of drupal pr by hosef
| 0
|
687
| 3,172,309,374
|
IssuesEvent
|
2015-09-23 07:15:59
|
tomchristie/django-rest-framework
|
https://api.github.com/repos/tomchristie/django-rest-framework
|
closed
|
Review supported Django Versions and Compat
|
Process
|
Following the discussion in #3405
* [ ] Decide what versions of Django are we going to support.
* [ ] Review code in `compat.py` in accordance with that.
|
1.0
|
Review supported Django Versions and Compat - Following the discussion in #3405
* [ ] Decide what versions of Django are we going to support.
* [ ] Review code in `compat.py` in accordance with that.
|
process
|
review supported django versions and compat following the discussion in decide what versions of django are we going to support review code in compat py in accordance with that
| 1
|
9,081
| 12,150,752,651
|
IssuesEvent
|
2020-04-24 18:34:31
|
gkiar/reproreading
|
https://api.github.com/repos/gkiar/reproreading
|
opened
|
Paper: Prediction of Membrane Transport Proteins and Their Substrate Specificities Using Primary Sequence Information
|
machine-learning processing reproducibility
|
URL: [https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0100278](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0100278)
(replicated by Hamid in https://github.com/gkiar/reproducibility-bioinfo)
### This paper does...
- Develops a predictive model (based upon SVM) to identify transporting substrates associated with membrane transport proteins.
- Curated a dataset that categorizes transport proteins based on their transported substrates.
- (Data compilation): Is it reasonable that they remove those which are excessively similar to one another?
- (don't have a ton of notes... seems reasonably straight forward)
|
1.0
|
Paper: Prediction of Membrane Transport Proteins and Their Substrate Specificities Using Primary Sequence Information - URL: [https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0100278](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0100278)
(replicated by Hamid in https://github.com/gkiar/reproducibility-bioinfo)
### This paper does...
- Develops a predictive model (based upon SVM) to identify transporting substrates associated with membrane transport proteins.
- Curated a dataset that categorizes transport proteins based on their transported substrates.
- (Data compilation): Is it reasonable that they remove those which are excessively similar to one another?
- (don't have a ton of notes... seems reasonably straight forward)
|
process
|
paper prediction of membrane transport proteins and their substrate specificities using primary sequence information url replicated by hamid in this paper does develops a predictive model based upon svm to identify transporting substrates associated with membrane transport proteins curated a dataset that categorizes transport proteins based on their transported substrates data compilation is it reasonable that they remove those which are excessively similar to one another don t have a ton of notes seems reasonably straight forward
| 1
|
1,249
| 3,785,579,448
|
IssuesEvent
|
2016-03-20 16:13:15
|
e-government-ua/i
|
https://api.github.com/repos/e-government-ua/i
|
closed
|
Implement displaying links to files attached along with a comment on the dashboard, and make it possible to download them by clicking the link.
|
active In process of testing question test _dashboard-js
|
- [x] 1) Similar to what is described in item 4 of task https://github.com/e-government-ua/i/issues/1113
Append, on a separate line (below the comment itself), a link phrase to the file of the form "Attached file: file1.ext" whenever the sData field of the comment object contains an attached-file marker:
"aFile":[{"sFileName":"file.ext", "sKey":"978G9-g97B6-g6gtg-iYlkj9j-Jh9jG"}]
where sFileName is the file name (the storage mechanism and format are likewise described in item 3, https://github.com/e-government-ua/i/issues/1112 )
- [x] 2) The link must point to an intermediate service on the Node server, passing only the message number and the process number (example: "/api/downloadFileOfComment?nID_Message=2342354&nID_Process=324252")
- [x] 3) The Node service (for item 2) must use the Java service written per item 2 of task
https://github.com/e-government-ua/i/issues/1115 and return a byte array so that the file is downloaded when the link (item 2) is clicked.
|
1.0
|
Implement displaying links to files attached along with a comment on the dashboard, and make it possible to download them by clicking the link. - - [x] 1) Similar to what is described in item 4 of task https://github.com/e-government-ua/i/issues/1113
Append, on a separate line (below the comment itself), a link phrase to the file of the form "Attached file: file1.ext" whenever the sData field of the comment object contains an attached-file marker:
"aFile":[{"sFileName":"file.ext", "sKey":"978G9-g97B6-g6gtg-iYlkj9j-Jh9jG"}]
where sFileName is the file name (the storage mechanism and format are likewise described in item 3, https://github.com/e-government-ua/i/issues/1112 )
- [x] 2) The link must point to an intermediate service on the Node server, passing only the message number and the process number (example: "/api/downloadFileOfComment?nID_Message=2342354&nID_Process=324252")
- [x] 3) The Node service (for item 2) must use the Java service written per item 2 of task
https://github.com/e-government-ua/i/issues/1115 and return a byte array so that the file is downloaded when the link (item 2) is clicked.
|
process
|
implement displaying links to files attached along with a comment on the dashboard and make it possible to download them by clicking the link similar to what is described in item of task append on a separate line below the comment itself a link phrase to the file of the form attached file ext whenever the sdata field of the comment object contains an attached file marker afile where sfilename is the file name the storage mechanism and format are likewise described in item the link must point to an intermediate service on the node server passing only the message number and the process number example api downloadfileofcomment nid message nid process the node service for item must use the java service written per item of task and return a byte array so that the file is downloaded when the link is clicked item
| 1
|
2,736
| 5,623,046,794
|
IssuesEvent
|
2017-04-04 14:11:11
|
DynareTeam/dynare
|
https://api.github.com/repos/DynareTeam/dynare
|
closed
|
use of tags in write_latex_dynamic_model
|
enhancement preprocessor
|
would it be possible to add tag info in the write_latex_dynamic_model, e.g. by inserting a simple text line with tags info before each equation?
|
1.0
|
use of tags in write_latex_dynamic_model - would it be possible to add tag info in the write_latex_dynamic_model, e.g. by inserting a simple text line with tags info before each equation?
|
process
|
use of tags in write latex dynamic model would it be possible to add tag info in the write latex dynamic model e g by inserting a simple text line with tags info before each equation
| 1
|
117,848
| 4,728,422,783
|
IssuesEvent
|
2016-10-18 15:54:52
|
CS2103AUG2016-W13-C3/main
|
https://api.github.com/repos/CS2103AUG2016-W13-C3/main
|
opened
|
As a user, I want to browse through my finished events and tasks...
|
priority.medium type.story
|
So that I can refer back to what I have done
|
1.0
|
As a user, I want to browse through my finished events and tasks... - So that I can refer back to what I have done
|
non_process
|
as a user i want to browse through my finished events and tasks so that i can refer back to what i have done
| 0
|
6,660
| 9,781,836,746
|
IssuesEvent
|
2019-06-07 21:01:41
|
googleapis/google-cloud-java
|
https://api.github.com/repos/googleapis/google-cloud-java
|
closed
|
Reduce size of gh-pages
|
type: process
|
Probably just retain latest/ instead of every single version every generated.
|
1.0
|
Reduce size of gh-pages - Probably just retain latest/ instead of every single version every generated.
|
process
|
reduce size of gh pages probably just retain latest instead of every single version every generated
| 1
|
25,005
| 4,166,882,924
|
IssuesEvent
|
2016-06-20 07:01:38
|
Zhandos-/fuzzy-search-tools
|
https://api.github.com/repos/Zhandos-/fuzzy-search-tools
|
closed
|
ArrayIndexOutOfBoundsException when exceeding DEFAULT_LENGTH
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
1. Pass a string longer than DEFAULT_LENGTH
What is the expected output? What do you see instead?
It should increase currentRow, previousRow and transpositionRow. Instead, an
ArrayIndexOutOfBoundsException is thrown.
I believe the problem lies with these checks:
http://code.google.com/p/fuzzy-search-tools/source/browse/trunk/src/ru/fuzzysearch/DamerauLevensteinMetric.java#42
http://code.google.com/p/fuzzy-search-tools/source/browse/trunk/src/ru/fuzzysearch/DamerauLevensteinMetric.java#103
Shouldn't it be >=?
```
Original issue reported on code.google.com by `GoncaloS...@gmail.com` on 1 Jun 2012 at 5:01
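The boundary bug the reporter points at (a `>` check where `>=` is needed before growing the row buffers) can be reproduced with a small Python sketch. The names here (`DEFAULT_LENGTH`, the grow-on-demand row) are illustrative reconstructions, not the library's actual code:

```python
DEFAULT_LENGTH = 4

def last_row_cell(word, use_fixed_check):
    # An edit-distance row needs len(word) + 1 cells.
    capacity = DEFAULT_LENGTH
    # The suspect condition: ">" never grows at the exact boundary,
    # ">=" does. (Hypothetical mirror of the checks at lines 42/103.)
    needs_grow = (len(word) >= capacity) if use_fixed_check else (len(word) > capacity)
    if needs_grow:
        capacity = len(word) + 1
    row = [0] * capacity
    for i in range(len(word) + 1):   # fill cells 0..len(word)
        row[i] = i                   # overruns the row if capacity is short
    return row[len(word)]

# A word of exactly DEFAULT_LENGTH: the ">" variant keeps the old
# capacity and overruns, the ">=" variant grows and succeeds.
try:
    last_row_cell("abcd", use_fixed_check=False)
    overran = False
except IndexError:
    overran = True

fixed = last_row_cell("abcd", use_fixed_check=True)
```

In this sketch `overran` is `True` for the `>` variant while the `>=` variant returns normally, matching the reporter's suggestion that the comparison should be `>=`.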
|
1.0
|
ArrayIndexOutOfBoundsException when exceeding DEFAULT_LENGTH - ```
What steps will reproduce the problem?
1. Pass a string longer than DEFAULT_LENGTH
What is the expected output? What do you see instead?
It should increase currentRow, previousRow and transpositionRow. Instead, an
ArrayIndexOutOfBoundsException is thrown.
I believe the problem lies with these checks:
http://code.google.com/p/fuzzy-search-tools/source/browse/trunk/src/ru/fuzzysearch/DamerauLevensteinMetric.java#42
http://code.google.com/p/fuzzy-search-tools/source/browse/trunk/src/ru/fuzzysearch/DamerauLevensteinMetric.java#103
Shouldn't it be >=?
```
Original issue reported on code.google.com by `GoncaloS...@gmail.com` on 1 Jun 2012 at 5:01
|
non_process
|
arrayindexoutofboundsexception when exceeding default length what steps will reproduce the problem pass a string longer than default length what is the expected output what do you see instead it should increase currentrow previousrow and transpositionrow instead an arrayindexoutofboundsexception is thrown i believe the problem lies with these checks ch dameraulevensteinmetric java ch dameraulevensteinmetric java shouldn t it be original issue reported on code google com by goncalos gmail com on jun at
| 0
|
9,627
| 12,566,180,191
|
IssuesEvent
|
2020-06-08 10:44:40
|
citation-style-language/csl-evolution
|
https://api.github.com/repos/citation-style-language/csl-evolution
|
closed
|
define versioning process for styles, schema, etc.
|
process
|
Creating new versions of the schema and spec is technically straightforward.
But how are we thinking to deal with style versioning related to the above, given the thousands of styles we have to maintain?
Will a 1.1 release require completely new style files, that we maintain in parallel with 1.0 styles?
What about 2.0?
@rmzelle suggested [here](https://discourse.citationstyles.org/t/csl-1-2-planning/1476/3) that master would be the current schema, and we'd branch for earlier style versions.
That makes sense, but I think the below is still relevant.
Could we possibly avoid or minimize the need for separate branches?
Maybe we could add a compatibility section to the spec so we ahead of time make this much easier for us to manage?
This issue should settle and describe whatever the strategy will be.
|
1.0
|
define versioning process for styles, schema, etc. - Creating new versions of the schema and spec is technically straightforward.
But how are we thinking to deal with style versioning related to the above, given the thousands of styles we have to maintain?
Will a 1.1 release require completely new style files, that we maintain in parallel with 1.0 styles?
What about 2.0?
@rmzelle suggested [here](https://discourse.citationstyles.org/t/csl-1-2-planning/1476/3) that master would be the current schema, and we'd branch for earlier style versions.
That makes sense, but I think the below is still relevant.
Could we possibly avoid or minimize the need for separate branches?
Maybe we could add a compatibility section to the spec so we ahead of time make this much easier for us to manage?
This issue should settle and describe whatever the strategy will be.
|
process
|
define versioning process for styles schema etc creating new versions of the schema and spec is technically straightforward but how are we thinking to deal with style versioning related to the above given the thousands of styles we have to maintain will a release require completely new style files that we maintain in parallel with styles what about rmzelle suggested that master would be the current schema and we d branch for earlier style versions that makes sense but i think the below is still relevant could we possibly avoid or minimize the need for separate branches maybe we could add a compatibility section to the spec so we ahead of time make this much easier for us to manage this issue should settle and describe whatever the strategy will be
| 1
|
7,004
| 10,146,617,793
|
IssuesEvent
|
2019-08-05 08:36:36
|
aiidateam/aiida-core
|
https://api.github.com/repos/aiidateam/aiida-core
|
opened
|
Engine exception: `AttributeError: 'NoneType' object has no attribute 'append'`
|
priority/important topic/engine topic/processes type/bug
|
This exception occurs very rarely when a process submits a child process and in an attempt to add itself as a broadcast subscriber, the following exception is thrown:
```
2019-08-04 20:00:33 [429466 | REPORT]: [879754|PwBaseWorkChain|on_except]: Traceback (most recent call last):
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/process_states.py", line 220, in execute
result = self.run_fn(*self.args, **self.kwargs)
File "/home/aiida/code/aiida/env/3dd/aiida-core/aiida/engine/processes/workchains/workchain.py", line 181, in _do_step
finished, stepper_result = self._stepper.step()
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/workchains.py", line 281, in step
finished, result = self._child_stepper.step()
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/workchains.py", line 232, in step
return True, self._fn(self._workchain)
File "/home/aiida/code/aiida/env/3dd/aiida-quantumespresso/aiida_quantumespresso/workflows/pw/base.py", line 145, in validate_kpoints
kpoints = create_kpoints_from_distance(**inputs)
File "/home/aiida/code/aiida/env/3dd/aiida-core/aiida/engine/processes/functions.py", line 197, in decorated_function
result, _ = run_get_node(*args, **kwargs)
File "/home/aiida/code/aiida/env/3dd/aiida-core/aiida/engine/processes/functions.py", line 147, in run_get_node
process = process_class(inputs=inputs, runner=runner)
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/base/state_machine.py", line 188, in __call__
call_with_super_check(inst.init)
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/base/utils.py", line 29, in call_with_super_check
fn(*args, **kwargs)
File "/home/aiida/code/aiida/env/3dd/aiida-core/aiida/engine/processes/process.py", line 134, in init
super(Process, self).init()
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/base/utils.py", line 16, in new_fn
fn(self, *args, **kwargs)
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/processes.py", line 290, in init
self.broadcast_receive, identifier=str(self.pid))
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/kiwipy/rmq/communicator.py", line 592, in add_broadcast_subscriber
return self._run_task(coro)
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/kiwipy/rmq/communicator.py", line 677, in _run_task
return self.tornado_to_kiwi_future(self._create_task(coro)).result(timeout=self.TASK_TIMEOUT)
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/kiwipy/rmq/communicator.py", line 656, in tornado_to_kiwi_future
self.loop().add_future(tornado_future, done)
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/tornado/ioloop.py", line 597, in add_future
lambda future: self.add_callback(callback, future))
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/tornado/concurrent.py", line 270, in add_done_callback
self._callbacks.append(fn)
AttributeError: 'NoneType' object has no attribute 'append'
```
This is most likely a bug/problem in `kiwipy/tornado` but the result is that the parent process excepts and the child process remains in the `CREATED` state as the required continuation task is not sent to RabbitMQ.
|
1.0
|
Engine exception: `AttributeError: 'NoneType' object has no attribute 'append'` - This exception occurs very rarely when a process submits a child process and in an attempt to add itself as a broadcast subscriber, the following exception is thrown:
```
2019-08-04 20:00:33 [429466 | REPORT]: [879754|PwBaseWorkChain|on_except]: Traceback (most recent call last):
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/process_states.py", line 220, in execute
result = self.run_fn(*self.args, **self.kwargs)
File "/home/aiida/code/aiida/env/3dd/aiida-core/aiida/engine/processes/workchains/workchain.py", line 181, in _do_step
finished, stepper_result = self._stepper.step()
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/workchains.py", line 281, in step
finished, result = self._child_stepper.step()
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/workchains.py", line 232, in step
return True, self._fn(self._workchain)
File "/home/aiida/code/aiida/env/3dd/aiida-quantumespresso/aiida_quantumespresso/workflows/pw/base.py", line 145, in validate_kpoints
kpoints = create_kpoints_from_distance(**inputs)
File "/home/aiida/code/aiida/env/3dd/aiida-core/aiida/engine/processes/functions.py", line 197, in decorated_function
result, _ = run_get_node(*args, **kwargs)
File "/home/aiida/code/aiida/env/3dd/aiida-core/aiida/engine/processes/functions.py", line 147, in run_get_node
process = process_class(inputs=inputs, runner=runner)
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/base/state_machine.py", line 188, in __call__
call_with_super_check(inst.init)
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/base/utils.py", line 29, in call_with_super_check
fn(*args, **kwargs)
File "/home/aiida/code/aiida/env/3dd/aiida-core/aiida/engine/processes/process.py", line 134, in init
super(Process, self).init()
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/base/utils.py", line 16, in new_fn
fn(self, *args, **kwargs)
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/plumpy/processes.py", line 290, in init
self.broadcast_receive, identifier=str(self.pid))
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/kiwipy/rmq/communicator.py", line 592, in add_broadcast_subscriber
return self._run_task(coro)
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/kiwipy/rmq/communicator.py", line 677, in _run_task
return self.tornado_to_kiwi_future(self._create_task(coro)).result(timeout=self.TASK_TIMEOUT)
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/kiwipy/rmq/communicator.py", line 656, in tornado_to_kiwi_future
self.loop().add_future(tornado_future, done)
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/tornado/ioloop.py", line 597, in add_future
lambda future: self.add_callback(callback, future))
File "/home/aiida/.virtualenvs/aiida_3dd/lib/python2.7/site-packages/tornado/concurrent.py", line 270, in add_done_callback
self._callbacks.append(fn)
AttributeError: 'NoneType' object has no attribute 'append'
```
This is most likely a bug/problem in `kiwipy/tornado` but the result is that the parent process excepts and the child process remains in the `CREATED` state as the required continuation task is not sent to RabbitMQ.
|
process
|
engine exception attributeerror nonetype object has no attribute append this exception occurs very rarely when a process submits a child process and in an attempt to add itself as a broadcast subscriber the following exception is thrown traceback most recent call last file home aiida virtualenvs aiida lib site packages plumpy process states py line in execute result self run fn self args self kwargs file home aiida code aiida env aiida core aiida engine processes workchains workchain py line in do step finished stepper result self stepper step file home aiida virtualenvs aiida lib site packages plumpy workchains py line in step finished result self child stepper step file home aiida virtualenvs aiida lib site packages plumpy workchains py line in step return true self fn self workchain file home aiida code aiida env aiida quantumespresso aiida quantumespresso workflows pw base py line in validate kpoints kpoints create kpoints from distance inputs file home aiida code aiida env aiida core aiida engine processes functions py line in decorated function result run get node args kwargs file home aiida code aiida env aiida core aiida engine processes functions py line in run get node process process class inputs inputs runner runner file home aiida virtualenvs aiida lib site packages plumpy base state machine py line in call call with super check inst init file home aiida virtualenvs aiida lib site packages plumpy base utils py line in call with super check fn args kwargs file home aiida code aiida env aiida core aiida engine processes process py line in init super process self init file home aiida virtualenvs aiida lib site packages plumpy base utils py line in new fn fn self args kwargs file home aiida virtualenvs aiida lib site packages plumpy processes py line in init self broadcast receive identifier str self pid file home aiida virtualenvs aiida lib site packages kiwipy rmq communicator py line in add broadcast subscriber return self run task coro file home aiida 
virtualenvs aiida lib site packages kiwipy rmq communicator py line in run task return self tornado to kiwi future self create task coro result timeout self task timeout file home aiida virtualenvs aiida lib site packages kiwipy rmq communicator py line in tornado to kiwi future self loop add future tornado future done file home aiida virtualenvs aiida lib site packages tornado ioloop py line in add future lambda future self add callback callback future file home aiida virtualenvs aiida lib site packages tornado concurrent py line in add done callback self callbacks append fn attributeerror nonetype object has no attribute append this is most likely a bug problem in kiwipy tornado but the result is that the parent process excepts and the child process remains in the created state as the required continuation task is not sent to rabbitmq
| 1
|
202,133
| 15,257,579,850
|
IssuesEvent
|
2021-02-21 02:03:34
|
urapadmin/kiosk
|
https://api.github.com/repos/urapadmin/kiosk
|
closed
|
moving pinboard button to menu area on all (?) forms
|
enhancement / extension filemaker test-stage
|
Related to #790 -- pinboard buttons currently cut the titles on several of the forms, and moving them left to the menu area looks neater.
|
1.0
|
moving pinboard button to menu area on all (?) forms - Related to #790 -- pinboard buttons currently cut the titles on several of the forms, and moving them left to the menu area looks neater.
|
non_process
|
moving pinboard button to menu area on all forms related to pinboard buttons currently cut the titles on several of the forms and moving them left to the menu area looks neater
| 0
|
261,787
| 19,726,624,552
|
IssuesEvent
|
2022-01-13 20:39:28
|
intel/cve-bin-tool
|
https://api.github.com/repos/intel/cve-bin-tool
|
opened
|
doc: make CONTRIBUTING.md show up on readthedocs
|
good first issue documentation
|
https://cve-bin-tool.readthedocs.io/en/latest/ isn't showing the new CONTRIBUTING.md docs. It's probably because my symbolic link trick doesn't work on windows and I did that PR on a different machine than usual. Find a way to fix it (that does NOT involve moving CONTRIBUTING.md; it needs to stay in the root directory).
Remember you can build your local copy of what readthedocs does by going into the documentation directory and typing `make html` (if you haven't done this before, you might need to install the recommended tools in `doc/requirements.txt` first)
|
1.0
|
doc: make CONTRIBUTING.md show up on readthedocs - https://cve-bin-tool.readthedocs.io/en/latest/ isn't showing the new CONTRIBUTING.md docs. It's probably because my symbolic link trick doesn't work on windows and I did that PR on a different machine than usual. Find a way to fix it (that does NOT involve moving CONTRIBUTING.md; it needs to stay in the root directory).
Remember you can build your local copy of what readthedocs does by going into the documentation directory and typing `make html` (if you haven't done this before, you might need to install the recommended tools in `doc/requirements.txt` first)
|
non_process
|
doc make contributing md show up on readthedocs isn t showing the new contributing md docs it s probably because my symbolic link trick doesn t work on windows and i did that pr on a different machine than usual find a way to fix it that does not involve moving contribuing md it needs to stay in the root directory remember you can build your local copy of what readthedocs does by going into the documentation directory and typing make html if you haven t done this before you might need to install the recommended tools in doc requirements txt first
| 0
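The row above describes a symlink trick for `CONTRIBUTING.md` that breaks on Windows. As an illustrative aside (paths such as `doc/` are assumptions, not the repo's actual layout), a portable alternative is to copy the file into the docs tree at build time instead of linking it:

```python
import shutil
from pathlib import Path

def mirror_contributing(repo_root: Path) -> Path:
    """Copy CONTRIBUTING.md from the repo root into the docs tree.

    Unlike a symbolic link, a plain file copy behaves the same on
    Windows checkouts, which is where the symlink trick reportedly
    stops working.
    """
    src = repo_root / "CONTRIBUTING.md"
    dst = repo_root / "doc" / "CONTRIBUTING.md"
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copyfile(src, dst)
    return dst
```

The copy step would run before `make html`, so the source file stays in the root directory as required.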
|
853
| 2,757,631,730
|
IssuesEvent
|
2015-04-27 15:50:32
|
dreal/dreal3
|
https://api.github.com/repos/dreal/dreal3
|
opened
|
performance difference between commit ead6f1b02623 and commit 5b23325408e1
|
performance
|
reported by @pzuliani.
```lisp
(set-logic QF_NRA_ODE)
(declare-fun v () Real)
(declare-fun v_0_0 () Real)
(declare-fun v_0_t () Real)
(declare-fun x () Real)
(declare-fun x_0_0 () Real)
(declare-fun x_0_t () Real)
(declare-fun y () Real)
(declare-fun y_0_0 () Real)
(declare-fun y_0_t () Real)
(declare-fun z () Real)
(declare-fun z_0_0 () Real)
(declare-fun z_0_t () Real)
(declare-fun alphay () Real)
(declare-fun alphay_0_0 () Real)
(declare-fun alphay_0_t () Real)
(declare-fun betax () Real)
(declare-fun betax_0_0 () Real)
(declare-fun betax_0_t () Real)
(declare-fun t () Real)
(declare-fun t_0_0 () Real)
(declare-fun t_0_t () Real)
(declare-fun time_0 () Real)
(define-ode flow_1 ((= d/dt[alphay] 0.0)(= d/dt[betax] 0.0)(= d/dt[t] 1.0)(= d/dt[v] (+ (* (- (- (- (/ 0.0197 (+ 1.0 (exp (* (- 10.0 z) 1.0)))) (/ betax (+ 1.0 (exp (* (- z 10.0) 2.0))))) (* 5.0E-5 (- 1.0 (/ z 12.0)))) 0.01) x) (+ 0.02 (+ (* 5.0E-5 (* (- 1.0 (/ z 12.0)) x)) (* (- (* alphay (- 1.0 (* 1.0 (/ z 12.0)))) 0.0168) y)))))(= d/dt[x] (+ (* (- (- (- (/ 0.0197 (+ 1.0 (exp (* (- 10.0 z) 1.0)))) (/ betax (+ 1.0 (exp (* (- z 10.0) 2.0))))) (* 5.0E-5 (- 1.0 (/ z 12.0)))) 0.01) x) 0.02))(= d/dt[y] (+ (* 5.0E-5 (* (- 1.0 (/ z 12.0)) x)) (* (- (* alphay (- 1.0 (* 1.0 (/ z 12.0)))) 0.0168) y)))(= d/dt[z] (+ (* (- 0.0 z) 0.08) 0.03))))
(assert (>= v_0_0 0))
(assert (<= v_0_0 100))
(assert (>= v_0_t 0))
(assert (<= v_0_t 100))
(assert (>= x_0_0 0))
(assert (<= x_0_0 100))
(assert (>= x_0_t 0))
(assert (<= x_0_t 100))
(assert (>= y_0_0 0))
(assert (<= y_0_0 100))
(assert (>= y_0_t 0))
(assert (<= y_0_t 100))
(assert (>= z_0_0 0))
(assert (<= z_0_0 100))
(assert (>= z_0_t 0))
(assert (<= z_0_t 100))
(assert (>= alphay_0_0 0))
(assert (<= alphay_0_0 0.025))
(assert (>= alphay_0_t 0))
(assert (<= alphay_0_t 0.025))
(assert (>= betax_0_0 0))
(assert (<= betax_0_0 0.025))
(assert (>= betax_0_t 0))
(assert (<= betax_0_t 0.025))
(assert (>= time_0 0))
(assert (<= time_0 83))
(assert (>= t_0_0 0))
(assert (<= t_0_0 83))
(assert (>= t_0_t 0))
(assert (<= t_0_t 83))
(assert
(and
(= [v_0_t x_0_t y_0_t z_0_t alphay_0_t betax_0_t t_0_t] (integral 0. time_0 [v_0_0 x_0_0 y_0_0 z_0_0 alphay_0_0 betax_0_0 t_0_0] flow_1))
(= alphay_0_0 alphay_0_t)
(= betax_0_0 betax_0_t)
(= t_0_0 0)
(>= v_0_0 19.0998)
(<= v_0_0 19.1002)
(>= x_0_0 18.9998)
(<= x_0_0 19.0002)
(>= y_0_0 0.099999)
(<= y_0_0 0.100001)
(>= z_0_0 12.4999)
(<= z_0_0 12.5001)
(= t_0_t 83)
(>= v_0_t 1.1)
(<= v_0_t 3.9)
)
)
(check-sat)
(exit)
```
|
True
|
performance difference between commit ead6f1b02623 and commit 5b23325408e1 - reported by @pzuliani.
```lisp
(set-logic QF_NRA_ODE)
(declare-fun v () Real)
(declare-fun v_0_0 () Real)
(declare-fun v_0_t () Real)
(declare-fun x () Real)
(declare-fun x_0_0 () Real)
(declare-fun x_0_t () Real)
(declare-fun y () Real)
(declare-fun y_0_0 () Real)
(declare-fun y_0_t () Real)
(declare-fun z () Real)
(declare-fun z_0_0 () Real)
(declare-fun z_0_t () Real)
(declare-fun alphay () Real)
(declare-fun alphay_0_0 () Real)
(declare-fun alphay_0_t () Real)
(declare-fun betax () Real)
(declare-fun betax_0_0 () Real)
(declare-fun betax_0_t () Real)
(declare-fun t () Real)
(declare-fun t_0_0 () Real)
(declare-fun t_0_t () Real)
(declare-fun time_0 () Real)
(define-ode flow_1 ((= d/dt[alphay] 0.0)(= d/dt[betax] 0.0)(= d/dt[t] 1.0)(= d/dt[v] (+ (* (- (- (- (/ 0.0197 (+ 1.0 (exp (* (- 10.0 z) 1.0)))) (/ betax (+ 1.0 (exp (* (- z 10.0) 2.0))))) (* 5.0E-5 (- 1.0 (/ z 12.0)))) 0.01) x) (+ 0.02 (+ (* 5.0E-5 (* (- 1.0 (/ z 12.0)) x)) (* (- (* alphay (- 1.0 (* 1.0 (/ z 12.0)))) 0.0168) y)))))(= d/dt[x] (+ (* (- (- (- (/ 0.0197 (+ 1.0 (exp (* (- 10.0 z) 1.0)))) (/ betax (+ 1.0 (exp (* (- z 10.0) 2.0))))) (* 5.0E-5 (- 1.0 (/ z 12.0)))) 0.01) x) 0.02))(= d/dt[y] (+ (* 5.0E-5 (* (- 1.0 (/ z 12.0)) x)) (* (- (* alphay (- 1.0 (* 1.0 (/ z 12.0)))) 0.0168) y)))(= d/dt[z] (+ (* (- 0.0 z) 0.08) 0.03))))
(assert (>= v_0_0 0))
(assert (<= v_0_0 100))
(assert (>= v_0_t 0))
(assert (<= v_0_t 100))
(assert (>= x_0_0 0))
(assert (<= x_0_0 100))
(assert (>= x_0_t 0))
(assert (<= x_0_t 100))
(assert (>= y_0_0 0))
(assert (<= y_0_0 100))
(assert (>= y_0_t 0))
(assert (<= y_0_t 100))
(assert (>= z_0_0 0))
(assert (<= z_0_0 100))
(assert (>= z_0_t 0))
(assert (<= z_0_t 100))
(assert (>= alphay_0_0 0))
(assert (<= alphay_0_0 0.025))
(assert (>= alphay_0_t 0))
(assert (<= alphay_0_t 0.025))
(assert (>= betax_0_0 0))
(assert (<= betax_0_0 0.025))
(assert (>= betax_0_t 0))
(assert (<= betax_0_t 0.025))
(assert (>= time_0 0))
(assert (<= time_0 83))
(assert (>= t_0_0 0))
(assert (<= t_0_0 83))
(assert (>= t_0_t 0))
(assert (<= t_0_t 83))
(assert
(and
(= [v_0_t x_0_t y_0_t z_0_t alphay_0_t betax_0_t t_0_t] (integral 0. time_0 [v_0_0 x_0_0 y_0_0 z_0_0 alphay_0_0 betax_0_0 t_0_0] flow_1))
(= alphay_0_0 alphay_0_t)
(= betax_0_0 betax_0_t)
(= t_0_0 0)
(>= v_0_0 19.0998)
(<= v_0_0 19.1002)
(>= x_0_0 18.9998)
(<= x_0_0 19.0002)
(>= y_0_0 0.099999)
(<= y_0_0 0.100001)
(>= z_0_0 12.4999)
(<= z_0_0 12.5001)
(= t_0_t 83)
(>= v_0_t 1.1)
(<= v_0_t 3.9)
)
)
(check-sat)
(exit)
```
|
non_process
|
performance difference between commit and commit reported by pzuliani lisp set logic qf nra ode declare fun v real declare fun v real declare fun v t real declare fun x real declare fun x real declare fun x t real declare fun y real declare fun y real declare fun y t real declare fun z real declare fun z real declare fun z t real declare fun alphay real declare fun alphay real declare fun alphay t real declare fun betax real declare fun betax real declare fun betax t real declare fun t real declare fun t real declare fun t t real declare fun time real define ode flow d dt d dt d dt d dt exp z betax exp z z x z x alphay z y d dt exp z betax exp z z x d dt z x alphay z y d dt z assert v assert v assert v t assert v t assert x assert x assert x t assert x t assert y assert y assert y t assert y t assert z assert z assert z t assert z t assert alphay assert alphay assert alphay t assert alphay t assert betax assert betax assert betax t assert betax t assert time assert time assert t assert t assert t t assert t t assert and integral time flow alphay alphay t betax betax t t v v x x y y z z t t v t v t check sat exit
| 0
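The dReal benchmark above pins the initial state to tight closed intervals. As a sketch only (interval values copied from the benchmark's asserts), a membership check for a candidate initial point looks like:

```python
# Initial-state intervals copied from the asserts in the benchmark above.
BOUNDS = {
    "v": (19.0998, 19.1002),
    "x": (18.9998, 19.0002),
    "y": (0.099999, 0.100001),
    "z": (12.4999, 12.5001),
}

def in_initial_box(point):
    # True when every variable lies inside its closed interval.
    return all(lo <= point[name] <= hi for name, (lo, hi) in BOUNDS.items())
```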
|
284,541
| 21,443,795,305
|
IssuesEvent
|
2022-04-25 02:32:38
|
orbitmechanic/USAMobile
|
https://api.github.com/repos/orbitmechanic/USAMobile
|
closed
|
Update USA Wallet demo video (8 hours)
|
documentation
|
Create a new demo video to show to investors/the world by Friday.
|
1.0
|
Update USA Wallet demo video (8 hours) - Create a new demo video to show to investors/the world by Friday.
|
non_process
|
update usa wallet demo video hours create a new demo video to show to investors the world by friday
| 0
|
44,361
| 12,112,143,217
|
IssuesEvent
|
2020-04-21 13:23:10
|
department-of-veterans-affairs/va.gov-cms
|
https://api.github.com/repos/department-of-veterans-affairs/va.gov-cms
|
opened
|
Various users can't select Listing pages they have access to
|
Critical defect Defect
|
**Describe the defect**
Permissions for selecting listing pages in field_listing are not as expected.
**To Reproduce**
- As louis.scavnicky@va.gov (content_admin) or @ryan.stubblebine@va.gov (content_publisher), go to
- https://staging.cms.va.gov/node/add/event
- https://staging.cms.va.gov/node/add/press_release
- https://staging.cms.va.gov/node/add/news_story
- You should be able to select a Listing for each of the 3 content types. As of now, you can only select it for events.
- Try these URLs again as admin (eg kevin.walsh). Then you can only select the listing field for news_release, but not events or
**Expected behavior**
- [ ] content_admin and admin should be able to bypass all content permissions.
- [ ] content_publisher should be able to select any Listing page that they have access to edit.
|
2.0
|
Various users can't select Listing pages they have access to - **Describe the defect**
Permissions for selecting listing pages in field_listing are not as expected.
**To Reproduce**
- As louis.scavnicky@va.gov (content_admin) or @ryan.stubblebine@va.gov (content_publisher), go to
- https://staging.cms.va.gov/node/add/event
- https://staging.cms.va.gov/node/add/press_release
- https://staging.cms.va.gov/node/add/news_story
- You should be able to select a Listing for each of the 3 content types. As of now, you can only select it for events.
- Try these URLs again as admin (eg kevin.walsh). Then you can only select the listing field for news_release, but not events or
**Expected behavior**
- [ ] content_admin and admin should be able to bypass all content permissions.
- [ ] content_publisher should be able to select any Listing page that they have access to edit.
|
non_process
|
various users can t select listing pages they have access to describe the defect permissions for selecting listing pages in field listing are not as expected to reproduce as louis scavnicky va gov content admin or ryan stubblebine va gov content publisher go to you should be able to select a listing for each of the content types as of now you can only select it for events try these urls again as admin eg kevin walsh then you can only select the listing field for news release but not events or expected behavior content admin and admin should be able to bypass all content permissions content publisher should be able to select any listing page that they have access to edit
| 0
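The expected behavior in the row above reduces to a role-based rule. A minimal sketch (role names taken from the issue text; the helper itself is hypothetical, not Drupal's permission API):

```python
def can_select_listing(role, has_edit_access):
    # Admin-level roles bypass all content permissions entirely;
    # publishers may pick any listing page they can already edit.
    if role in ("admin", "content_admin"):
        return True
    if role == "content_publisher":
        return has_edit_access
    return False
```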
|
16,698
| 21,797,947,489
|
IssuesEvent
|
2022-05-15 22:19:53
|
TheUltimateC0der/listrr.pro
|
https://api.github.com/repos/TheUltimateC0der/listrr.pro
|
closed
|
Filter by beginning of name only
|
processing:server-side version:v2
|
I want to create a list of TV shows that start with a letter, like "A". When I create a filter like this with search in title "A*" it seems to come back with all shows containing the letter A, instead of starting with the letter A.
Any way to tell Listrr to search beginning of title instead of anywhere in title?
|
1.0
|
Filter by beginning of name only - I want to create a list of TV shows that start with a letter, like "A". When I create a filter like this with search in title "A*" it seems to come back with all shows containing the letter A, instead of starting with the letter A.
Any way to tell Listrr to search beginning of title instead of anywhere in title?
|
process
|
filter by beginning of name only i want to create a list of tv shows that start with a letter like a when i create a filter like this with search in title a it seems to come back with all shows containing the letter a instead of starting with the letter a any way to tell listrr to search beginning of title instead of anywhere in title
| 1
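The distinction requested in the Listrr row above is prefix matching versus substring matching. A minimal sketch of the two behaviors:

```python
def matches_anywhere(title, letter):
    # Behaviour described in the issue: substring search,
    # so "A" matches any title containing an "a".
    return letter.lower() in title.lower()

def matches_prefix(title, letter):
    # Requested behaviour: only titles that start with the letter.
    return title.lower().startswith(letter.lower())
```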
|
13,593
| 16,164,104,006
|
IssuesEvent
|
2021-05-01 06:39:28
|
ooi-data/CE04OSPS-SF01B-2A-CTDPFA107-streamed-ctdpf_sbe43_sample
|
https://api.github.com/repos/ooi-data/CE04OSPS-SF01B-2A-CTDPFA107-streamed-ctdpf_sbe43_sample
|
opened
|
🛑 Processing failed: ResponseParserError
|
process
|
## Overview
`ResponseParserError` found in `processing_task` task during run ended on 2021-05-01T06:39:27.973410.
## Details
Flow name: `CE04OSPS-SF01B-2A-CTDPFA107-streamed-ctdpf_sbe43_sample`
Task name: `processing_task`
Error type: `ResponseParserError`
Error message: Unable to parse response (no element found: line 2, column 0), invalid XML received. Further retries may succeed:
b'<?xml version="1.0" encoding="UTF-8"?>\n'
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 452, in _parse_xml_string_to_dom
root = parser.close()
File "<string>", line None
xml.etree.ElementTree.ParseError: no element found: line 2, column 0
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/share/miniconda/envs/harvester/lib/python3.8/site-packages/ooi_harvester/processor/pipeline.py", line 71, in processing_task
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/ooi_harvester/processor/__init__.py", line 311, in finalize_zarr
source_store.fs.delete(source_store.root, recursive=True)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/spec.py", line 1151, in delete
return self.rm(path, recursive=recursive, maxdepth=maxdepth)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 72, in wrapper
return sync(self.loop, func, *args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 53, in sync
raise result[0]
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 20, in _runner
result[0] = await coro
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 1510, in _rm
await asyncio.gather(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 1498, in _bulk_delete
await self._call_s3("delete_objects", kwargs, Bucket=bucket, Delete=delete_keys)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 252, in _call_s3
raise translate_boto_error(err)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 233, in _call_s3
out = await method(**additional_kwargs)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 140, in _make_api_call
http, parsed_response = await self._make_request(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 160, in _make_request
return await self._endpoint.make_request(operation_model, request_dict)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 101, in _send_request
success_response, exception = await self._get_response(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 120, in _get_response
success_response, exception = await self._do_get_response(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 180, in _do_get_response
parsed_response = parser.parse(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 245, in parse
parsed = self._do_parse(response, shape)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 809, in _do_parse
self._add_modeled_parse(response, shape, final_parsed)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 818, in _add_modeled_parse
self._parse_payload(response, shape, member_shapes, final_parsed)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 858, in _parse_payload
original_parsed = self._initial_body_parse(response['body'])
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 944, in _initial_body_parse
return self._parse_xml_string_to_dom(xml_string)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 454, in _parse_xml_string_to_dom
raise ResponseParserError(
botocore.parsers.ResponseParserError: Unable to parse response (no element found: line 2, column 0), invalid XML received. Further retries may succeed:
b'<?xml version="1.0" encoding="UTF-8"?>\n'
```
</details>
|
1.0
|
🛑 Processing failed: ResponseParserError - ## Overview
`ResponseParserError` found in `processing_task` task during run ended on 2021-05-01T06:39:27.973410.
## Details
Flow name: `CE04OSPS-SF01B-2A-CTDPFA107-streamed-ctdpf_sbe43_sample`
Task name: `processing_task`
Error type: `ResponseParserError`
Error message: Unable to parse response (no element found: line 2, column 0), invalid XML received. Further retries may succeed:
b'<?xml version="1.0" encoding="UTF-8"?>\n'
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 452, in _parse_xml_string_to_dom
root = parser.close()
File "<string>", line None
xml.etree.ElementTree.ParseError: no element found: line 2, column 0
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/share/miniconda/envs/harvester/lib/python3.8/site-packages/ooi_harvester/processor/pipeline.py", line 71, in processing_task
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/ooi_harvester/processor/__init__.py", line 311, in finalize_zarr
source_store.fs.delete(source_store.root, recursive=True)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/spec.py", line 1151, in delete
return self.rm(path, recursive=recursive, maxdepth=maxdepth)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 72, in wrapper
return sync(self.loop, func, *args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 53, in sync
raise result[0]
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 20, in _runner
result[0] = await coro
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 1510, in _rm
await asyncio.gather(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 1498, in _bulk_delete
await self._call_s3("delete_objects", kwargs, Bucket=bucket, Delete=delete_keys)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 252, in _call_s3
raise translate_boto_error(err)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 233, in _call_s3
out = await method(**additional_kwargs)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 140, in _make_api_call
http, parsed_response = await self._make_request(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 160, in _make_request
return await self._endpoint.make_request(operation_model, request_dict)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 101, in _send_request
success_response, exception = await self._get_response(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 120, in _get_response
success_response, exception = await self._do_get_response(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 180, in _do_get_response
parsed_response = parser.parse(
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 245, in parse
parsed = self._do_parse(response, shape)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 809, in _do_parse
self._add_modeled_parse(response, shape, final_parsed)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 818, in _add_modeled_parse
self._parse_payload(response, shape, member_shapes, final_parsed)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 858, in _parse_payload
original_parsed = self._initial_body_parse(response['body'])
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 944, in _initial_body_parse
return self._parse_xml_string_to_dom(xml_string)
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 454, in _parse_xml_string_to_dom
raise ResponseParserError(
botocore.parsers.ResponseParserError: Unable to parse response (no element found: line 2, column 0), invalid XML received. Further retries may succeed:
b'<?xml version="1.0" encoding="UTF-8"?>\n'
```
</details>
|
process
|
🛑 processing failed responseparsererror overview responseparsererror found in processing task task during run ended on details flow name streamed ctdpf sample task name processing task error type responseparsererror error message unable to parse response no element found line column invalid xml received further retries may succeed b n traceback traceback most recent call last file srv conda envs notebook lib site packages botocore parsers py line in parse xml string to dom root parser close file line none xml etree elementtree parseerror no element found line column during handling of the above exception another exception occurred traceback most recent call last file usr share miniconda envs harvester lib site packages ooi harvester processor pipeline py line in processing task file srv conda envs notebook lib site packages ooi harvester processor init py line in finalize zarr source store fs delete source store root recursive true file srv conda envs notebook lib site packages fsspec spec py line in delete return self rm path recursive recursive maxdepth maxdepth file srv conda envs notebook lib site packages fsspec asyn py line in wrapper return sync self loop func args kwargs file srv conda envs notebook lib site packages fsspec asyn py line in sync raise result file srv conda envs notebook lib site packages fsspec asyn py line in runner result await coro file srv conda envs notebook lib site packages core py line in rm await asyncio gather file srv conda envs notebook lib site packages core py line in bulk delete await self call delete objects kwargs bucket bucket delete delete keys file srv conda envs notebook lib site packages core py line in call raise translate boto error err file srv conda envs notebook lib site packages core py line in call out await method additional kwargs file srv conda envs notebook lib site packages aiobotocore client py line in make api call http parsed response await self make request file srv conda envs notebook lib site packages 
aiobotocore client py line in make request return await self endpoint make request operation model request dict file srv conda envs notebook lib site packages aiobotocore endpoint py line in send request success response exception await self get response file srv conda envs notebook lib site packages aiobotocore endpoint py line in get response success response exception await self do get response file srv conda envs notebook lib site packages aiobotocore endpoint py line in do get response parsed response parser parse file srv conda envs notebook lib site packages botocore parsers py line in parse parsed self do parse response shape file srv conda envs notebook lib site packages botocore parsers py line in do parse self add modeled parse response shape final parsed file srv conda envs notebook lib site packages botocore parsers py line in add modeled parse self parse payload response shape member shapes final parsed file srv conda envs notebook lib site packages botocore parsers py line in parse payload original parsed self initial body parse response file srv conda envs notebook lib site packages botocore parsers py line in initial body parse return self parse xml string to dom xml string file srv conda envs notebook lib site packages botocore parsers py line in parse xml string to dom raise responseparsererror botocore parsers responseparsererror unable to parse response no element found line column invalid xml received further retries may succeed b n
| 1
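The error message in the row above says "Further retries may succeed". A generic retry-with-backoff wrapper illustrates the usual mitigation; this is a sketch, not the actual s3fs/botocore retry machinery, and `TransientParseError` is a stand-in name:

```python
import time

class TransientParseError(Exception):
    """Stand-in for botocore's ResponseParserError in this sketch."""

def retry(fn, attempts=3, base_delay=0.01):
    # Re-invoke fn with exponential backoff on the transient error;
    # re-raise once the attempt budget is exhausted.
    for attempt in range(attempts):
        try:
            return fn()
        except TransientParseError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```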
|
20,362
| 3,348,254,871
|
IssuesEvent
|
2015-11-17 00:38:33
|
cakephp/cakephp
|
https://api.github.com/repos/cakephp/cakephp
|
closed
|
Negative Integers are interpreted as "string" in marshaller
|
Defect
|
Post data of negative integers are patched to the entity as string.
This happened after I upgraded to 3.1.4.
I found where you convert the data;
\Cake\Database\Type\IntegerType::mashal($value) {
...
}
also, I confirmed negative integers are not recognized as integers.
This seems to be the problem because all integer field of all databases accept negative integers,
and there are no reason to exclude negative integers from this function.
These days no one recommends == instead of ===,
and using === cause errors when negative integers are set to the post data.
Is this already argued?
|
1.0
|
Negative Integers are interpreted as "string" in marshaller - Post data of negative integers are patched to the entity as string.
This happened after I upgraded to 3.1.4.
I found where you convert the data;
\Cake\Database\Type\IntegerType::mashal($value) {
...
}
also, I confirmed negative integers are not recognized as integers.
This seems to be the problem because all integer field of all databases accept negative integers,
and there are no reason to exclude negative integers from this function.
These days no one recommends == instead of ===,
and using === cause errors when negative integers are set to the post data.
Is this already argued?
|
non_process
|
negative integers are interpreted as string in marshaller post data of negative integers are patched to the entity as string this happened after i upgraded to i found where you convert the data cake database type integertype mashal value also i confirmed negative integers are not recognized as integers this seems to be the problem because all integer field of all databases accept negative integers and there are no reason to exclude negative integers from this function these days no one recommends instead of and using cause errors when negative integers are set to the post data is this already argued
| 0
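The marshalling defect in the row above hinges on how numeric strings are detected. A minimal Python sketch (hypothetical helper names, not CakePHP's actual code) shows why a naive digit check rejects negative integers while a sign-aware check accepts them:

```python
def naive_is_int(value):
    # str.isdigit() is True only for unsigned digit runs,
    # so "-5" is wrongly rejected -- the bug pattern in the issue.
    return value.isdigit()

def sign_aware_is_int(value):
    # Strip a single leading sign before the digit check.
    if value and value[0] in "+-":
        value = value[1:]
    return value.isdigit()
```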
|
11,025
| 13,820,850,101
|
IssuesEvent
|
2020-10-13 00:42:43
|
tokio-rs/tokio
|
https://api.github.com/repos/tokio-rs/tokio
|
closed
|
tokio::process::Command leaves zombies when child future is dropped
|
A-tokio C-bug M-process T-docs
|
**Version**
```
└── tokio v0.2.22
└── tokio-macros v0.2.5
```
**Platform**
Linux zita 5.4.44-2-pve #1 SMP PVE 5.4.44-2 (Wed, 01 Jul 2020 16:37:57 +0200) x86_64 GNU/Linux
**Description**
when i start a child process with 'kill_on_drop(true)'
and before the future is finished i drop it, there is a zombie
left until a new Child future is polled.
short reproducer: https://play.rust-lang.org/?version=stable&mode=debug&edition=2018&gist=6746973d696b5122bc212e4bf6436092
i commented out the stdin code for the playground
when run locally it is easier to see that there is a defunct process when looking at e.g., 'ps ax | grep sleep' output
as long as no new child future is polled
i would expect that someone (e.g. the drop handler?) polls the future once again
so that the reaper can call waitpid once more...
|
1.0
|
tokio::process::Command leaves zombies when child future is dropped - **Version**
```
└── tokio v0.2.22
└── tokio-macros v0.2.5
```
**Platform**
Linux zita 5.4.44-2-pve #1 SMP PVE 5.4.44-2 (Wed, 01 Jul 2020 16:37:57 +0200) x86_64 GNU/Linux
**Description**
when i start a child process with 'kill_on_drop(true)'
and before the future is finished i drop it, there is a zombie
left until a new Child future is polled.
short reproducer: https://play.rust-lang.org/?version=stable&mode=debug&edition=2018&gist=6746973d696b5122bc212e4bf6436092
i commented out the stdin code for the playground
when run locally it is easier to see that there is a defunct process when looking at e.g., 'ps ax | grep sleep' output
as long as no new child future is polled
i would expect that someone (e.g. the drop handler?) polls the future once again
so that the reaper can call waitpid once more...
|
process
|
tokio process command leaves zombies when child future is dropped version └── tokio └── tokio macros platform linux zita pve smp pve wed jul gnu linux description when i start a child process with kill on drop true and before the future is finished i drop it there is a zombie left until a new child future is polled short reproducer i commented out the stdin code for the playground when run locally it is easier to see that there is a defunct process when looking at e g ps ax grep sleep output as long as no new child future is polled i would expect that someone e g the drop handler polls the future once again so that the reaper can call waitpid once more
| 1
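The zombie behavior reported in the tokio row above is not tokio-specific. A minimal sketch in Python (assuming a POSIX host with a `sleep` binary) shows that a killed child stays a zombie until some code calls `waitpid` — the reaping step the dropped future never reaches:

```python
import os
import signal
import subprocess

def spawn_and_kill():
    """Start a long-running child and kill it without reaping it.

    Until someone calls waitpid, the dead child remains a zombie --
    the same state the dropped tokio future leaves behind.
    """
    child = subprocess.Popen(["sleep", "60"])
    child.send_signal(signal.SIGKILL)
    return child

def reap(child):
    # waitpid collects the exit status and removes the zombie entry;
    # this is the step that never runs when the future is dropped.
    _, status = os.waitpid(child.pid, 0)
    return status
```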
|
15,934
| 20,160,311,892
|
IssuesEvent
|
2022-02-09 20:45:32
|
googleapis/google-cloud-go
|
https://api.github.com/repos/googleapis/google-cloud-go
|
opened
|
automate promotion of beta clients to GA
|
type: process
|
- set, in code, a time when a project should be marked as GA
- once time has passed mark the client as GA
- Make sure in regen PR there is a conventional commit added to trigger a release PR.
- If this is the only client in the module should be a to v1 PR(this might be harder to automate)
|
1.0
|
automate promotion of beta clients to GA - - set, in code, a time when a project should be marked as GA
- once time has passed mark the client as GA
- Make sure in regen PR there is a conventional commit added to trigger a release PR.
- If this is the only client in the module should be a to v1 PR(this might be harder to automate)
|
process
|
automate promotion of beta clients to ga set in code a time when a project should be marked as ga once time has passed mark the client as ga make sure in regen pr there is a conventional commit added to trigger a release pr if this is the only client in the module should be a to pr this might be harder to automate
| 1
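The promotion steps listed in the row above reduce to a date comparison. A hypothetical sketch (the `ga_date` field and helper name are assumptions, not the real release tooling):

```python
from datetime import date

def should_promote(ga_date, today):
    # Mark the client GA once the configured promotion date has passed.
    return today >= ga_date
```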
|
664,937
| 22,293,283,010
|
IssuesEvent
|
2022-06-12 17:27:32
|
CookieJarApps/SmartCookieWeb
|
https://api.github.com/repos/CookieJarApps/SmartCookieWeb
|
opened
|
[Bug] Make the currently active tab more distinct within the drawer while using AMOLED theme.
|
bug P2: Medium priority
|
In Light and Dark themes it's easy to tell which one of the tabs shown in drawer is the active one, because of the contrast of its background compared to the rest of the drawer. But it's relatively harder to tell the active tab apart while using AMOLED theme, specially when brightness is set at low or lowest.
|
1.0
|
[Bug] Make the currently active tab more distinct within the drawer while using AMOLED theme. - In Light and Dark themes it's easy to tell which one of the tabs shown in drawer is the active one, because of the contrast of its background compared to the rest of the drawer. But it's relatively harder to tell the active tab apart while using AMOLED theme, specially when brightness is set at low or lowest.
|
non_process
|
make the currently active tab more distinct within the drawer while using amoled theme in light and dark themes it s easy to tell which one of the tabs shown in drawer is the active one because of the contrast of its background compared to the rest of the drawer but it s relatively harder to tell the active tab apart while using amoled theme specially when brightness is set at low or lowest
| 0
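The readability problem in the row above can be quantified with the WCAG relative-luminance contrast ratio; a sketch for grayscale sRGB values in [0, 1] (for gray, the channel's linear value equals the luminance):

```python
def relative_luminance(channel):
    # sRGB channel in [0, 1] -> linear-light value (WCAG 2.x formula).
    if channel <= 0.03928:
        return channel / 12.92
    return ((channel + 0.055) / 1.055) ** 2.4

def contrast_ratio(fg, bg):
    # Ratio of lighter to darker luminance, ranging from 1 to 21;
    # a low ratio is what makes the active AMOLED tab hard to spot.
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)
```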
|
121,749
| 16,019,974,223
|
IssuesEvent
|
2021-04-20 21:17:43
|
Azure/WALinuxAgent
|
https://api.github.com/repos/Azure/WALinuxAgent
|
closed
|
[BUG][2.2.54.2] waagent-network-setup.service failed
|
by design triaged
|
**Describe the bug: A clear and concise description of what the bug is.**
waagent-network-setup.service failed because "can't open file '/var/lib/waagent/WALinuxAgent-2.2.54.2/bin/WALinuxAgent-2.2.54.2-py2.7.egg': [Errno 2] No such file or directory"
```
2021-04-13T08:58:20.898332Z INFO ExtHandler ExtHandler Distro: redhat-8.5; OSUtil: RedhatOSUtil; AgentService: waagent; Python: 3.6.8; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.14.0;
2021-04-13T08:58:20.903325Z INFO ExtHandler ExtHandler WALinuxAgent-2.2.54.2 running as process 5342
2021-04-13T08:58:20.906831Z INFO ExtHandler ExtHandler Cgroup monitoring is not supported on ['redhat', '8.5', 'Ootpa', 'Red Hat Enterprise Linux']
2021-04-13T08:58:20.907598Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
2021-04-13T08:58:21.782166Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service
2021-04-13T08:58:21.793557Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service already enabled. No change needed.
2021-04-13T08:58:21.811422Z INFO ExtHandler ExtHandler Logs from the waagent-network-setup.service since system boot:
-- Logs begin at Tue 2021-04-13 16:56:51 CST, end at Tue 2021-04-13 16:58:21 CST. --
Apr 13 16:57:11 localhost.localdomain python3.6[555]: /usr/bin/python3.6: can't open file '/var/lib/waagent/WALinuxAgent-2.2.54.2/bin/WALinuxAgent-2.2.54.2-py2.7.egg': [Errno 2] No such file or directory
Apr 13 16:57:11 localhost.localdomain systemd[1]: waagent-network-setup.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Apr 13 16:57:11 localhost.localdomain systemd[1]: waagent-network-setup.service: Failed with result 'exit-code'.
Apr 13 16:57:11 localhost.localdomain systemd[1]: Failed to start Setup network rules for WALinuxAgent.
```
Note: Please add some context which would help us understand the problem better
1. Start a RHEL-8 VM with WALinuxAgent installed. firewalld service is disabled.
2. Waiting for WALinuxAgent-2.2.54.2 downloaded and started.
3. Check the waagent-network-setup.service
```
# systemctl status waagent-network-setup
Warning: The unit file, source configuration file or drop-ins of waagent-network-setup.service changed on disk. Run 'systemctl daemon-reload' to reload units.
● waagent-network-setup.service - Setup network rules for WALinuxAgent
Loaded: loaded (/usr/lib/systemd/system/waagent-network-setup.service; enabled; vendor preset: disabled)
Drop-In: /usr/lib/systemd/system/waagent-network-setup.service.d
└─10-environment.conf
Active: failed (Result: exit-code) since Tue 2021-04-13 16:57:11 CST; 31min ago
Main PID: 555 (code=exited, status=2)
Apr 13 16:57:11 localhost.localdomain python3.6[555]: /usr/bin/python3.6: can't open file '/var/lib/waagent/WALinuxAgent-2.2.54.2/bin/WALinuxAgent-2.2.54.2-py2.7.egg': [Errno 2] No such file or directory
Apr 13 16:57:11 localhost.localdomain systemd[1]: waagent-network-setup.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Apr 13 16:57:11 localhost.localdomain systemd[1]: waagent-network-setup.service: Failed with result 'exit-code'.
Apr 13 16:57:11 localhost.localdomain systemd[1]: Failed to start Setup network rules for WALinuxAgent.
```
**Distro and WALinuxAgent details (please complete the following information):**
- Distro and Version: RHEL-8.5
- WALinuxAgent version
WALinuxAgent-2.2.49.2 running on redhat 8.5
Python: 3.6.8
Goal state agent: 2.2.54.2
|
1.0
|
[BUG][2.2.54.2] waagent-network-setup.service failed - **Describe the bug: A clear and concise description of what the bug is.**
waagent-network-setup.service failed because "can't open file '/var/lib/waagent/WALinuxAgent-2.2.54.2/bin/WALinuxAgent-2.2.54.2-py2.7.egg': [Errno 2] No such file or directory"
```
2021-04-13T08:58:20.898332Z INFO ExtHandler ExtHandler Distro: redhat-8.5; OSUtil: RedhatOSUtil; AgentService: waagent; Python: 3.6.8; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.14.0;
2021-04-13T08:58:20.903325Z INFO ExtHandler ExtHandler WALinuxAgent-2.2.54.2 running as process 5342
2021-04-13T08:58:20.906831Z INFO ExtHandler ExtHandler Cgroup monitoring is not supported on ['redhat', '8.5', 'Ootpa', 'Red Hat Enterprise Linux']
2021-04-13T08:58:20.907598Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
2021-04-13T08:58:21.782166Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service
2021-04-13T08:58:21.793557Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service already enabled. No change needed.
2021-04-13T08:58:21.811422Z INFO ExtHandler ExtHandler Logs from the waagent-network-setup.service since system boot:
-- Logs begin at Tue 2021-04-13 16:56:51 CST, end at Tue 2021-04-13 16:58:21 CST. --
Apr 13 16:57:11 localhost.localdomain python3.6[555]: /usr/bin/python3.6: can't open file '/var/lib/waagent/WALinuxAgent-2.2.54.2/bin/WALinuxAgent-2.2.54.2-py2.7.egg': [Errno 2] No such file or directory
Apr 13 16:57:11 localhost.localdomain systemd[1]: waagent-network-setup.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Apr 13 16:57:11 localhost.localdomain systemd[1]: waagent-network-setup.service: Failed with result 'exit-code'.
Apr 13 16:57:11 localhost.localdomain systemd[1]: Failed to start Setup network rules for WALinuxAgent.
```
Note: Please add some context which would help us understand the problem better
1. Start a RHEL-8 VM with WALinuxAgent installed. firewalld service is disabled.
2. Waiting for WALinuxAgent-2.2.54.2 downloaded and started.
3. Check the waagent-network-setup.service
```
# systemctl status waagent-network-setup
Warning: The unit file, source configuration file or drop-ins of waagent-network-setup.service changed on disk. Run 'systemctl daemon-reload' to reload units.
● waagent-network-setup.service - Setup network rules for WALinuxAgent
Loaded: loaded (/usr/lib/systemd/system/waagent-network-setup.service; enabled; vendor preset: disabled)
Drop-In: /usr/lib/systemd/system/waagent-network-setup.service.d
└─10-environment.conf
Active: failed (Result: exit-code) since Tue 2021-04-13 16:57:11 CST; 31min ago
Main PID: 555 (code=exited, status=2)
Apr 13 16:57:11 localhost.localdomain python3.6[555]: /usr/bin/python3.6: can't open file '/var/lib/waagent/WALinuxAgent-2.2.54.2/bin/WALinuxAgent-2.2.54.2-py2.7.egg': [Errno 2] No such file or directory
Apr 13 16:57:11 localhost.localdomain systemd[1]: waagent-network-setup.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Apr 13 16:57:11 localhost.localdomain systemd[1]: waagent-network-setup.service: Failed with result 'exit-code'.
Apr 13 16:57:11 localhost.localdomain systemd[1]: Failed to start Setup network rules for WALinuxAgent.
```
**Distro and WALinuxAgent details (please complete the following information):**
- Distro and Version: RHEL-8.5
- WALinuxAgent version
WALinuxAgent-2.2.49.2 running on redhat 8.5
Python: 3.6.8
Goal state agent: 2.2.54.2
|
non_process
|
waagent network setup service failed describe the bug a clear and concise description of what the bug is waagent network setup service failed because can t open file var lib waagent walinuxagent bin walinuxagent egg no such file or directory info exthandler exthandler distro redhat osutil redhatosutil agentservice waagent python systemd true lisdrivers absent logrotate logrotate info exthandler exthandler walinuxagent running as process info exthandler exthandler cgroup monitoring is not supported on info exthandler exthandler starting setup for persistent firewall rules info exthandler exthandler firewalld service not running unavailable trying to set up waagent network setup service info exthandler exthandler service waagent network setup service already enabled no change needed info exthandler exthandler logs from the waagent network setup service since system boot logs begin at tue cst end at tue cst apr localhost localdomain usr bin can t open file var lib waagent walinuxagent bin walinuxagent egg no such file or directory apr localhost localdomain systemd waagent network setup service main process exited code exited status invalidargument apr localhost localdomain systemd waagent network setup service failed with result exit code apr localhost localdomain systemd failed to start setup network rules for walinuxagent note please add some context which would help us understand the problem better start a rhel vm with walinuxagent installed firewalld service is disabled waiting for walinuxagent downloaded and started check the waagent network setup service systemctl status waagent network setup warning the unit file source configuration file or drop ins of waagent network setup service changed on disk run systemctl daemon reload to reload units ● waagent network setup service setup network rules for walinuxagent loaded loaded usr lib systemd system waagent network setup service enabled vendor preset disabled drop in usr lib systemd system waagent network setup 
service d └─ environment conf active failed result exit code since tue cst ago main pid code exited status apr localhost localdomain usr bin can t open file var lib waagent walinuxagent bin walinuxagent egg no such file or directory apr localhost localdomain systemd waagent network setup service main process exited code exited status invalidargument apr localhost localdomain systemd waagent network setup service failed with result exit code apr localhost localdomain systemd failed to start setup network rules for walinuxagent distro and walinuxagent details please complete the following information distro and version rhel walinuxagent version walinuxagent running on redhat python goal state agent
| 0
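The failure in the record above comes down to a systemd `ExecStart` line that still points at an egg file from an older agent version, a file that no longer exists after the upgrade. As an illustrative aside (not part of WALinuxAgent — the function name and regex are assumptions), a minimal Python sketch of flagging `ExecStart` script paths that are missing on disk:

```python
import os
import re

def missing_execstart_paths(unit_text):
    """Return script paths named on ExecStart= lines that do not exist.

    Matches the first argument after the interpreter, e.g. the .egg file in
    'ExecStart=/usr/bin/python3.6 /var/lib/waagent/.../WALinuxAgent-...-py2.7.egg'.
    """
    missing = []
    for line in unit_text.splitlines():
        match = re.match(r"\s*ExecStart=\S+\s+(\S+)", line)
        if match and not os.path.exists(match.group(1)):
            missing.append(match.group(1))
    return missing
```

Run against the rendered unit (including its `10-environment.conf` drop-in), a check like this would have surfaced the stale `WALinuxAgent-2.2.54.2-py2.7.egg` path before the service exited with status 2.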
|
188,330
| 15,158,923,315
|
IssuesEvent
|
2021-02-12 02:35:07
|
chakra-ui/chakra-ui
|
https://api.github.com/repos/chakra-ui/chakra-ui
|
closed
|
Link component listed twice within the documentation
|
Topic: Documentation 📚
|
Once under layout and once under navigation
|
1.0
|
Link component listed twice within the documentation - Once under layout and once under navigation
|
non_process
|
link component listed twice within the documentation once under layout and once under navigation
| 0
|
308,202
| 23,237,572,712
|
IssuesEvent
|
2022-08-03 13:07:47
|
Dharmik48/meme-generator
|
https://api.github.com/repos/Dharmik48/meme-generator
|
closed
|
Improve README.md
|
documentation good first issue EddieHub:good-first-issue
|
The current README.md is very lame, doesn't have any proper instructions regarding the project. So we need to add those.
Some of the sections that need to be added are Contribution, Install steps, Description, Screenshots, etc.
|
1.0
|
Improve README.md - The current README.md is very lame, doesn't have any proper instructions regarding the project. So we need to add those.
Some of the sections that need to be added are Contribution, Install steps, Description, Screenshots, etc.
|
non_process
|
improve readme md the current readme md is very lame doesn t have any proper instructions regarding the project so we need to add those some of the sections that need to added are contribution install steps description screenshots etc
| 0
|
21,681
| 30,122,701,667
|
IssuesEvent
|
2023-06-30 16:29:51
|
UnitTestBot/UTBotJava
|
https://api.github.com/repos/UnitTestBot/UTBotJava
|
opened
|
Introduce another approach to patch annotations
|
ctg-enhancement comp-instrumented-process comp-spring
|
**Description**
Some annotations patching in runtime is required for Spring projects. For example, to construct a context for concrete execution.
Currently we use the `ConfigurationManager.patchAnnotation` method, but it is not very reliable because it relies on `sun.reflect`, which may be JDK specific and incompatible with the user's JDK version.
The proper way is to apply it in instrumentation during transformation by adding a specific visitor.
Sergey Pospelov will provide us further instructions.
|
1.0
|
Introduce another approach to patch annotations - **Description**
Some annotations patching in runtime is required for Spring projects. For example, to construct a context for concrete execution.
Currently we use the `ConfigurationManager.patchAnnotation` method, but it is not very reliable because it relies on `sun.reflect`, which may be JDK specific and incompatible with the user's JDK version.
The proper way is to apply it in instrumentation during transformation by adding a specific visitor.
Sergey Pospelov will provide us further instructions.
|
process
|
introduce another approach to patch annotations description some annotations patching in runtime is required for spring projects for example to construct a context for concrete execution currently we use configurationmanager patchannotation method put it is not very reliable because of using sun relect that may be jdk specific and is not compatible with the version of current user the proper way is to apply it in instrumentation during transformation by adding a specific visitor sergey pospelov will provide us further instructions
| 1
|
344,698
| 30,753,806,576
|
IssuesEvent
|
2023-07-28 22:26:43
|
pytorch/pytorch
|
https://api.github.com/repos/pytorch/pytorch
|
closed
|
DISABLED test_cond_export (__main__.AutomaticDynamicShapesMiscTests)
|
triaged module: flaky-tests skipped module: dynamic shapes module: dynamo
|
Platforms: linux, rocm, slow
This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_cond_export&suite=AutomaticDynamicShapesMiscTests) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/14386388501).
Over the past 3 hours, it has been determined flaky in 2 workflow(s) with 2 failures and 2 successes.
**Debugging instructions (after clicking on the recent samples link):**
DO NOT ASSUME THINGS ARE OKAY IF THE CI IS GREEN. We now shield flaky tests from developers so CI will thus be green but it will be harder to parse the logs.
To find relevant log snippets:
1. Click on the workflow logs linked above
2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work.
3. Grep for `test_cond_export`
4. There should be several instances run (as flaky tests are rerun in CI) from which you can study the logs.
Test file path: `dynamo/test_dynamic_shapes.py`
cc @ezyang @voznesenskym @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @ipiszy @chenyang78
|
1.0
|
DISABLED test_cond_export (__main__.AutomaticDynamicShapesMiscTests) - Platforms: linux, rocm, slow
This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_cond_export&suite=AutomaticDynamicShapesMiscTests) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/14386388501).
Over the past 3 hours, it has been determined flaky in 2 workflow(s) with 2 failures and 2 successes.
**Debugging instructions (after clicking on the recent samples link):**
DO NOT ASSUME THINGS ARE OKAY IF THE CI IS GREEN. We now shield flaky tests from developers so CI will thus be green but it will be harder to parse the logs.
To find relevant log snippets:
1. Click on the workflow logs linked above
2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work.
3. Grep for `test_cond_export`
4. There should be several instances run (as flaky tests are rerun in CI) from which you can study the logs.
Test file path: `dynamo/test_dynamic_shapes.py`
cc @ezyang @voznesenskym @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @ipiszy @chenyang78
|
non_process
|
disabled test cond export main automaticdynamicshapesmisctests platforms linux rocm slow this test was disabled because it is failing in ci see and the most recent trunk over the past hours it has been determined flaky in workflow s with failures and successes debugging instructions after clicking on the recent samples link do not assume things are okay if the ci is green we now shield flaky tests from developers so ci will thus be green but it will be harder to parse the logs to find relevant log snippets click on the workflow logs linked above click on the test step of the job so that it is expanded otherwise the grepping will not work grep for test cond export there should be several instances run as flaky tests are rerun in ci from which you can study the logs test file path dynamo test dynamic shapes py cc ezyang voznesenskym penguinwu eikanwang guobing chen xiaobingsuper zhuhaozhe blzheng xia weiwen wenzhe nrv jiayisunx ipiszy
| 0
|
12,868
| 15,255,034,133
|
IssuesEvent
|
2021-02-20 14:30:30
|
TeamPotry/CustomPart
|
https://api.github.com/repos/TeamPotry/CustomPart
|
closed
|
Improvement requests
|
enhancement help wanted in_process
|
- [ ] Modularization
> In progress.
### Global variables
- [x] __Remove `MaxPartGlobalSlot` (cancelled.)__
- [x] Remove `NeedHelpPart`
- [ ] Remove `cvarPropSize`
> Range-based -> whether to check that the body touches the prop
- [x] Put all part information (cooldown, duration, etc.) into a single methodmap.
- [x] __Fix the flag variable `CPFLAG_DONOTCLEARSLOT` not being checked on death (confirmed)__
> With the part 'Cursed Eye', if the teleport destination is a stuck spot, the player respawns and all part slots are lost.
---
### Parts
- [x] Create the `CPPart` methodmap
> Move all part-related forwards here.
- [x] Make sure parts are not preserved into the next round.
- [x] __Pre-store part information on the part prop. (Cancelled.)__
> If a certain player cannot use a part, hide it from that player.
- [ ] Allow item effect settings inside the part config
- [ ] Add a rainbow part tier
---
### Part props
- [ ] In range-based touch detection, decide whether the body overlaps the prop.
- [ ] Fix props sinking below the floor when several players die at once
---
### Part encyclopedia
- [x] Multi-language support (HUD labels as well)
- [ ] Tag support
- [ ] Precache sound assets from the part config.
---
### HUD
- [ ] Fix the very rare HUD corruption whose cause is still unknown,
- [ ] On acquiring a part, announce which part was acquired via `tutorial_text`.
---
### Sub-plugins
- [ ] Move the FF2 integration into a sub-plugin.
> Currently the FF2-related code has only been removed; it has not been moved yet.
|
1.0
|
Improvement requests - - [ ] Modularization
> In progress.
### Global variables
- [x] __Remove `MaxPartGlobalSlot` (cancelled.)__
- [x] Remove `NeedHelpPart`
- [ ] Remove `cvarPropSize`
> Range-based -> whether to check that the body touches the prop
- [x] Put all part information (cooldown, duration, etc.) into a single methodmap.
- [x] __Fix the flag variable `CPFLAG_DONOTCLEARSLOT` not being checked on death (confirmed)__
> With the part 'Cursed Eye', if the teleport destination is a stuck spot, the player respawns and all part slots are lost.
---
### Parts
- [x] Create the `CPPart` methodmap
> Move all part-related forwards here.
- [x] Make sure parts are not preserved into the next round.
- [x] __Pre-store part information on the part prop. (Cancelled.)__
> If a certain player cannot use a part, hide it from that player.
- [ ] Allow item effect settings inside the part config
- [ ] Add a rainbow part tier
---
### Part props
- [ ] In range-based touch detection, decide whether the body overlaps the prop.
- [ ] Fix props sinking below the floor when several players die at once
---
### Part encyclopedia
- [x] Multi-language support (HUD labels as well)
- [ ] Tag support
- [ ] Precache sound assets from the part config.
---
### HUD
- [ ] Fix the very rare HUD corruption whose cause is still unknown,
- [ ] On acquiring a part, announce which part was acquired via `tutorial_text`.
---
### Sub-plugins
- [ ] Move the FF2 integration into a sub-plugin.
> Currently the FF2-related code has only been removed; it has not been moved yet.
|
process
|
improvement requests modularization in progress global variables remove maxpartglobalslot cancelled remove needhelppart remove cvarpropsize range based whether the body touches the prop put all part information such as cooldown and duration into a single methodmap fix the flag variable cpflag donotclearslot not being checked on death confirmed with the part cursed eye if the teleport destination is a stuck spot the player respawns and all part slots are lost parts create the cppart methodmap move all part related forwards here make sure parts are not preserved into the next round pre store part information on the part prop cancelled if a certain player cannot use a part hide it from that player allow item effect settings inside the part config add a rainbow part tier part props in range based touch detection decide whether the body overlaps the prop fix props sinking below the floor when several players die at once part encyclopedia multi language support hud labels as well tag support precache sound assets from the part config hud fix the very rare hud corruption whose cause is still unknown on acquiring a part announce which part was acquired via tutorial text sub plugins move the ff2 integration into a sub plugin currently the ff2 related code has only been removed not yet moved
| 1
|
18,290
| 5,619,534,887
|
IssuesEvent
|
2017-04-04 02:05:46
|
VATSIM-UK/core
|
https://api.github.com/repos/VATSIM-UK/core
|
closed
|
Refactor Models\Mship\Account to use magic-attributes rather than methods
|
acknowledged code refactor
|
In gitlab by @A-Lawrence on Sep 4, 2016, 18:17
* `hasPassword()` => `has_password`
* `hasPasswordExpired()` => `has_password_expired` or even `password_expired`
* `mandatory_password` => `has_mandatory_password`
|
1.0
|
Refactor Models\Mship\Account to use magic-attributes rather than methods - In gitlab by @A-Lawrence on Sep 4, 2016, 18:17
* `hasPassword()` => `has_password`
* `hasPasswordExpired()` => `has_password_expired` or even `password_expired`
* `mandatory_password` => `has_mandatory_password`
|
non_process
|
refactor models mship account to use magic attributes rather than methods in gitlab by a lawrence on sep haspassword has password haspasswordexpired has password expired or even password expired mandatory password has mandatory password
| 0
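The refactor in the record above swaps accessor methods for magic attributes (the Laravel accessor pattern). A rough analogy in Python, using `@property` with hypothetical fields (not the VATSIM-UK codebase), shows the same method-to-attribute move:

```python
class Account:
    """Sketch: expose has_password as an attribute rather than a method."""

    def __init__(self, password=None, password_expired=False):
        self._password = password
        self._password_expired = password_expired

    @property
    def has_password(self):
        # Was hasPassword(); callers now read account.has_password
        return self._password is not None

    @property
    def has_password_expired(self):
        # Was hasPasswordExpired(); no parentheses at the call site
        return self._password_expired
```

The call-site change is the whole point of the refactor: `account.has_password` reads like state, while `account.hasPassword()` reads like work.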
|
6,061
| 8,900,991,974
|
IssuesEvent
|
2019-01-17 00:15:45
|
edgi-govdata-archiving/web-monitoring
|
https://api.github.com/repos/edgi-govdata-archiving/web-monitoring
|
closed
|
Important Change Identification Road Map/Plan
|
processing stale
|
Based on the discussion with @suchthis @danielballan @trinberg in a recent call, here is the plan for important change identification and prioritization.
The task will follow this road map:
1. Classification of changes into two categories "primary" (worth taking a second look at) and "secondary" (less important than primary changes but can be looked at for some meaningful information). This will happen after passing the changes through the first filtering or pre-filtering layer which will tag the indisputably insignificant changes (date/time, social media, contact info etc.) and thus we will have three categories in the end - primary, secondary, and insignificant.
2. Improvement of classification model based on feedback and results.
3. Assigning a numerical priority or score to each of the changes in the primary and secondary category based on some features of the changes. The different categories will be separately prioritised and there will be no relation between the priorities of primary and secondary. For example - A high priority in secondary (let's say 0.9) will still be less than a low priority in primary (let's say 0.2).
4. Improvement of prioritization model based on feedback and results.
This map will be followed for all types of models i.e. text, source, and other changes based on different differs. Each model will have its own road map based on this general plan.
The creation of each type of model will have three basic steps or parts:
1. Dataset creation through extraction of relevant information from the source ( for example text from source). This will be followed by required pre-processing.
2. Model training using the dataset and validating its performance on a test set.
3. Real time classification/prioritization of new changes by passing them through the trained model.
The correctly classified data will be added to the dataset and the model will be retrained periodically.
Each change will only be tagged and classified and none of them will be removed from the list. This is to ensure that any important change which is incorrectly classified by the model isn't deleted or removed.
This issue is to define a clear process to follow for classification and prioritization. New contributors can also use this to add their own prioritization models.
This is open for discussion and suggestions are welcome. :)
|
1.0
|
Important Change Identification Road Map/Plan - Based on the discussion with @suchthis @danielballan @trinberg in a recent call, here is the plan for important change identification and prioritization.
The task will follow this road map:
1. Classification of changes into two categories "primary" (worth taking a second look at) and "secondary" (less important than primary changes but can be looked at for some meaningful information). This will happen after passing the changes through the first filtering or pre-filtering layer which will tag the indisputably insignificant changes (date/time, social media, contact info etc.) and thus we will have three categories in the end - primary, secondary, and insignificant.
2. Improvement of classification model based on feedback and results.
3. Assigning a numerical priority or score to each of the changes in the primary and secondary category based on some features of the changes. The different categories will be separately prioritised and there will be no relation between the priorities of primary and secondary. For example - A high priority in secondary (let's say 0.9) will still be less than a low priority in primary (let's say 0.2).
4. Improvement of prioritization model based on feedback and results.
This map will be followed for all types of models i.e. text, source, and other changes based on different differs. Each model will have its own road map based on this general plan.
The creation of each type of model will have three basic steps or parts:
1. Dataset creation through extraction of relevant information from the source ( for example text from source). This will be followed by required pre-processing.
2. Model training using the dataset and validating its performance on a test set.
3. Real time classification/prioritization of new changes by passing them through the trained model.
The correctly classified data will be added to the dataset and the model will be retrained periodically.
Each change will only be tagged and classified and none of them will be removed from the list. This is to ensure that any important change which is incorrectly classified by the model isn't deleted or removed.
This issue is to define a clear process to follow for classification and prioritization. New contributors can also use this to add their own prioritization models.
This is open for discussion and suggestions are welcome. :)
|
process
|
important change identification road map plan based on the discussion with suchthis danielballan trinberg in a recent call here is the plan for important change identification and prioritization the task will follow this road map classification of changes into two categories primary worth taking a second look at and secondary less important than primary changes but can be looked at for some meaningful information this will happen after passing the changes through the first filtering or pre filtering layer which will tag the indisputable insignificant changes date time social media contact info etc and thus we will have three categories in the end primary secondary and insignificant improvement of classification model based on feedback and results assigning a numerical priority or score to each of the changes in the primary and secondary category based on some features of the changes the different categories will be separately prioritised and there will be no relation between the priorities of primary and secondary for example a high priority in secondary let s say will still be less than a low priority in primary let s say improvement of prioritization model based on feedback and results this map will be followed for all types of models i e text source and other changes based on different differs each model will have its own road map based on this general plan the creation of each type of model will have three basic steps or parts dataset creation through extraction of relevant information from the source for example text from source this will be followed by required pre processing model training using the dataset and validating its performance on a test set real time classification prioritization of new changes by passing them through the trained model the correctly classified data will be added to the dataset and the model will be retrained periodically each change will only be tagged and classified and none of them will be removed from the list this is to ensure 
that any important change which is incorrectly classified by the model isn t deleted or removed this issue is to define a clear process to follow for classification and prioritization new contributors can also use this to add their own prioritization models this is open for discussion and suggestions are welcome
| 1
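The road map in the record above (pre-filter insignificant changes, classify the rest as primary or secondary, then score within each category on independent scales, removing nothing) can be sketched in a few lines. This is an illustrative Python outline, not the project's code; `classify` and `score` stand in for the trained models, and the marker list is a placeholder:

```python
INSIGNIFICANT_MARKERS = ("date/time", "social media", "contact info")

def triage_change(change, classify, score):
    """Tag a change as insignificant, or classify and prioritize it.

    Nothing is ever removed from the list: every change keeps its text
    and simply gains a category tag (plus a priority within primary or
    secondary, the two scales being independent of each other).
    """
    text = change["text"].lower()
    # Layer 1: tag indisputably insignificant changes.
    if any(marker in text for marker in INSIGNIFICANT_MARKERS):
        return {**change, "category": "insignificant", "priority": None}
    # Layer 2: primary vs secondary, each with its own priority scale.
    category = classify(text)                # "primary" or "secondary"
    return {**change, "category": category, "priority": score(text, category)}
```

Because misclassified changes are only tagged, a reviewer can still promote them later and feed the correction back into periodic retraining, as the plan describes.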
|
3,112
| 6,136,341,362
|
IssuesEvent
|
2017-06-26 09:07:07
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
child_process: special handling of process.on('message')
|
child_process feature request process
|
<!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
* **Version**: *
* **Platform**: *
* **Subsystem**: child_process
<!-- Enter your issue details below this comment. -->
```js
process.on('message');
```
has a special meaning in the context of IPC between parent and child. The problem is that a `'message'` event could be triggered by other code, or listened to outside of the IPC context, so we cannot give it any special treatment.
I suggest adding `'IPCMessage'`, which only the IPC channel can trigger; registering a listener for it would fail if an IPC channel was not established.
Ref: https://github.com/nodejs/help/issues/693#issuecomment-310525958
[edit]
The intention is to emit both events: `message` for backwards compatibility, and `IPCMessage` for a validated IPC only events.
|
2.0
|
child_process: special handling of process.on('message') - <!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
* **Version**: *
* **Platform**: *
* **Subsystem**: child_process
<!-- Enter your issue details below this comment. -->
```js
process.on('message');
```
has a special meaning in the context of IPC between parent and child. The problem is that a `'message'` event could be triggered by other code, or listened to outside of the IPC context, so we cannot give it any special treatment.
I suggest adding `'IPCMessage'`, which only the IPC channel can trigger; registering a listener for it would fail if an IPC channel was not established.
Ref: https://github.com/nodejs/help/issues/693#issuecomment-310525958
[edit]
The intention is to emit both events: `message` for backwards compatibility, and `IPCMessage` for a validated IPC only events.
|
process
|
child process special handling of process on message thank you for reporting an issue this issue tracker is for bugs and issues found within node js core if you require more general support please file an issue on our help repo please fill in as much of the template below as you re able version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able version platform subsystem child process js process on message has a special meaning in the context of ipc between parent and child the problem is a message event could be triggered other code or listened to outside of ipc context so we can not do any special treatment for it i suggest adding ipcmessage that only the ipc channel can trigger and registering a listener to would fail if an ipc channel was not established ref the intention is to emit both events message for backwards compatibility and ipcmessage for a validated ipc only events
| 1
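The proposal in the record above is a reserved event name that only the IPC channel may emit, and that cannot be subscribed to without an established channel. A language-neutral sketch of that contract in Python (the class and names are hypothetical, not the Node.js API):

```python
class ChannelEmitter:
    """Sketch of a reserved, channel-only event.

    'ipc_message' plays the role of the proposed 'IPCMessage': only the
    channel itself may emit it, and listening fails without a channel.
    """

    RESERVED = "ipc_message"

    def __init__(self, has_channel=False):
        self.has_channel = has_channel
        self._listeners = {}

    def on(self, event, handler):
        # Registering for the reserved event requires an established channel.
        if event == self.RESERVED and not self.has_channel:
            raise RuntimeError("no IPC channel established")
        self._listeners.setdefault(event, []).append(handler)

    def emit(self, event, payload, _from_channel=False):
        # User code cannot spoof the reserved event.
        if event == self.RESERVED and not _from_channel:
            raise RuntimeError("only the IPC channel may emit this event")
        for handler in self._listeners.get(event, []):
            handler(payload)
```

A plain `'message'` event stays freely emittable for backwards compatibility; only the reserved name gets the validation, mirroring the issue's intent to emit both.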
|
9,269
| 12,300,994,363
|
IssuesEvent
|
2020-05-11 14:47:06
|
dotnet/runtime
|
https://api.github.com/repos/dotnet/runtime
|
closed
|
System.DllNotFoundException: Unable to load DLL 'libproc':
|
area-System.Diagnostics.Process bug os-mac-os-x untriaged up-for-grabs
|
_From @livarcocc on October 6, 2017 16:16_
_From @Petermarcu on October 6, 2017 16:3_
@m2b commented on [Thu Apr 20 2017](https://github.com/dotnet/core/issues/599)
**Here is the simple test project code**
using System;
using Xunit;

namespace ITVizion.ActionBoard.RepositoryInMemory
{
    public class Test
    {
        [Fact]
        public void Test1()
        {
            Assert.Equal(1,1);
        }
    }
}
=========================================================================
**Here is the command window output. dotnet crashes!**
blackielanetMBP:actionboard.git blackie$ dotnet test ITVizion.ActionBoard.RepositoryInMemory.Test/ITVizion.ActionBoard.RepositoryInMemory.Test.csproj
Welcome to .NET Core!
---------------------
Learn more about .NET Core @ https://aka.ms/dotnet-docs. Use dotnet --help to see available commands or go to https://aka.ms/dotnet-cli-docs.
Telemetry
--------------
The .NET Core tools collect usage data in order to improve your experience. The data is anonymous and does not include command-line arguments. The data is collected by Microsoft and shared with the community.
You can opt out of telemetry by setting a DOTNET_CLI_TELEMETRY_OPTOUT environment variable to 1 using your favorite shell.
You can read more about .NET Core tools telemetry @ https://aka.ms/dotnet-cli-telemetry.
Configuring...
-------------------
A command is running to initially populate your local package cache, to improve restore speed and enable offline access. This command will take up to a minute to complete and will only happen once.
Decompressing 100% 5896 ms
Expanding 100% 7278 ms
Failed to create prime the NuGet cache. restore failed with: 134
Unhandled Exception: System.DllNotFoundException: Unable to load DLL 'libproc': The specified module could not be found.
(Exception from HRESULT: 0x8007007E)
at Interop.libproc.proc_pidpath(Int32 pid, Byte* buffer, UInt32 bufferSize)
at Interop.libproc.proc_pidpath(Int32 pid)
at System.Diagnostics.Process.ResolvePath(String filename)
at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo)
at System.Diagnostics.Process.Start()
at Microsoft.DotNet.Cli.ForwardingApp.Execute()
at Microsoft.DotNet.Tools.MSBuild.MSBuildForwardingApp.Execute()
at Microsoft.DotNet.Tools.Test.TestCommand.<>c__DisplayClass0_0.<Run>b__0()
at Microsoft.DotNet.Cli.CommandLine.CommandLineApplication.Execute(String[] args)
at Microsoft.DotNet.Tools.Test.TestCommand.Run(String[] args)
at Microsoft.DotNet.Cli.Program.ProcessArgs(String[] args, ITelemetry telemetryClient)
at Microsoft.DotNet.Cli.Program.Main(String[] args)
Abort trap: 6
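The managed frames above end in `Interop.libproc.proc_pidpath`, a P/Invoke into the native `libproc` library that the host failed to load. As a rough illustration of that failure mode (using Python's `ctypes` rather than .NET, and a deliberately made-up library name — this is not the runtime's actual loader code), asking the dynamic loader for a library it cannot resolve raises an error instead of returning a handle, which is what surfaces in .NET as `DllNotFoundException`:

```python
import ctypes

# Illustrative sketch only. DllNotFoundException("Unable to load DLL
# 'libproc'") means the OS dynamic loader (dlopen on macOS) could not
# resolve the library named in the [DllImport("libproc")] attribute.
# Any loader shows the same behavior for a missing library:
try:
    # "libdoesnotexist_demo.so" is a hypothetical, intentionally
    # missing name standing in for an unresolvable native library.
    ctypes.CDLL("libdoesnotexist_demo.so")
except OSError as exc:
    print("loader error:", exc)
```

The remedy on the .NET side is generally to ensure the native library is visible on the loader's search path; the exact probing rules are a host implementation detail.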
=========================================================================
**Here is the Mac Problem Report**
Process: dotnet [26183]
Path: /usr/local/share/dotnet/dotnet
Identifier: dotnet
Version: 0
Code Type: X86-64 (Native)
Parent Process: dotnet [26170]
Responsible: dotnet [26183]
User ID: 501
Date/Time: 2017-04-20 17:28:26.552 -0700
OS Version: Mac OS X 10.12.4 (16E195)
Report Version: 12
Anonymous UUID: E3FC06FA-F158-98C8-DDAD-74F8F018A97A
Sleep/Wake UUID: ECCEF7E0-8ED4-4B9C-9EF0-00E0D60B0BBE
Time Awake Since Boot: 470000 seconds
Time Since Wake: 5600 seconds
System Integrity Protection: disabled
Crashed Thread: 0 Dispatch queue: com.apple.main-thread
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Exception Note: EXC_CORPSE_NOTIFY
Application Specific Information:
abort() called
Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 libsystem_kernel.dylib 0x00007fffa33bed42 __pthread_kill + 10
1 libsystem_pthread.dylib 0x00007fffa34ac5bf pthread_kill + 90
2 libsystem_c.dylib 0x00007fffa3324420 abort + 129
3 libcoreclr.dylib 0x0000000104dc1312 PROCEndProcess(void*, unsigned int, int) + 226
4 libcoreclr.dylib 0x00000001050928c1 UnwindManagedExceptionPass1(PAL_SEHException&, _CONTEXT*) + 833
5 libcoreclr.dylib 0x0000000105092979 DispatchManagedException(PAL_SEHException&, bool) + 73
6 libcoreclr.dylib 0x0000000104f1c6c9 PreStubWorker + 937
7 libcoreclr.dylib 0x00000001050a0579 ThePreStub + 92
8 ??? 0x000000010c1cad79 0 + 4498173305
9 ??? 0x000000010c1c66b1 0 + 4498155185
10 ??? 0x000000010c1c6181 0 + 4498153857
11 ??? 0x000000010c1c51cb 0 + 4498149835
12 ??? 0x000000010bc5cabd 0 + 4492479165
13 ??? 0x000000010bc574b2 0 + 4492457138
14 ??? 0x000000010bc6271b 0 + 4492502811
15 ??? 0x000000010bc5f21f 0 + 4492489247
16 ??? 0x000000010bc55bc2 0 + 4492450754
17 ??? 0x000000010bc5d4df 0 + 4492481759
18 ??? 0x000000010bc5cf8d 0 + 4492480397
19 libcoreclr.dylib 0x000000010509f9a1 CallDescrWorkerInternal + 124
20 libcoreclr.dylib 0x0000000104f8bb43 MethodDescCallSite::CallTargetWorker(unsigned long const*, unsigned long*, int) + 707
21 libcoreclr.dylib 0x0000000104e59fc4 RunMain(MethodDesc*, short, int*, PtrArray**) + 932
22 libcoreclr.dylib 0x0000000104e5a18d Assembly::ExecuteMainMethod(PtrArray**, int) + 221
23 libcoreclr.dylib 0x0000000104e9762a CorHost2::ExecuteAssembly(unsigned int, char16_t const*, int, char16_t const**, unsigned int*) + 442
24 libcoreclr.dylib 0x0000000104dca4f3 coreclr_execute_assembly + 259
25 libhostpolicy.dylib 0x0000000104d027e8 coreclr::execute_assembly(void*, unsigned int, int, char const**, char const*, unsigned int*) + 152
26 libhostpolicy.dylib 0x0000000104cf1388 run(arguments_t const&) + 20568
27 libhostpolicy.dylib 0x0000000104cf1d0d corehost_main + 1405
28 libhostfxr.dylib 0x0000000104c3c90e execute_app(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, corehost_init_t*, int, char const**) + 446
29 libhostfxr.dylib 0x0000000104c52679 fx_muxer_t::read_config_and_execute(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::unordered_map<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > > >, std::__1::hash<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > >, std::__1::equal_to<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > >, std::__1::allocator<std::__1::pair<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const, std::__1::vector<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > > > > > > const&, int, char const**, host_mode_t) + 8441
30 libhostfxr.dylib 0x0000000104c504a9 fx_muxer_t::parse_args_and_execute(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, int, int, char const**, bool, host_mode_t, bool*) + 8425
31 libhostfxr.dylib 0x0000000104c53a09 fx_muxer_t::execute(int, char const**) + 4537
32 libhostfxr.dylib 0x0000000104c3c985 hostfxr_main + 53
33 dotnet 0x0000000104b707ac run(int, char const**) + 1420
34 dotnet 0x0000000104b708ee main + 158
35 libdyld.dylib 0x00007fffa3290235 start + 1
Thread 1:
0 libsystem_kernel.dylib 0x00007fffa33b734a mach_msg_trap + 10
1 libsystem_kernel.dylib 0x00007fffa33b6797 mach_msg + 55
2 libcoreclr.dylib 0x0000000104dc761a MachMessage::Receive(unsigned int) + 74
3 libcoreclr.dylib 0x0000000104dc6539 SEHExceptionThread(void*) + 105
4 libsystem_pthread.dylib 0x00007fffa34a99af _pthread_body + 180
5 libsystem_pthread.dylib 0x00007fffa34a98fb _pthread_start + 286
6 libsystem_pthread.dylib 0x00007fffa34a9101 thread_start + 13
Thread 2:
0 libsystem_kernel.dylib 0x00007fffa33c019e poll + 10
1 libcoreclr.dylib 0x0000000104db9afe CorUnix::CPalSynchronizationManager::ThreadPrepareForShutdown() + 30
2 libcoreclr.dylib 0x0000000104dbb729 CorUnix::CPalSynchronizationManager::WorkerThread(void*) + 1177
3 libcoreclr.dylib 0x0000000104dc3e48 CorUnix::CPalThread::ThreadEntry(void*) + 328
4 libsystem_pthread.dylib 0x00007fffa34a99af _pthread_body + 180
5 libsystem_pthread.dylib 0x00007fffa34a98fb _pthread_start + 286
6 libsystem_pthread.dylib 0x00007fffa34a9101 thread_start + 13
Thread 3:
0 libsystem_kernel.dylib 0x00007fffa33bea3e __open + 10
1 libcoreclr.dylib 0x0000000104e36b0f TwoWayPipe::WaitForConnection() + 31
2 libcoreclr.dylib 0x0000000104e2e8c1 DbgTransportSession::TransportWorker() + 145
3 libcoreclr.dylib 0x0000000104e2d4e9 DbgTransportSession::TransportWorkerStatic(void*) + 9
4 libcoreclr.dylib 0x0000000104dc3e48 CorUnix::CPalThread::ThreadEntry(void*) + 328
5 libsystem_pthread.dylib 0x00007fffa34a99af _pthread_body + 180
6 libsystem_pthread.dylib 0x00007fffa34a98fb _pthread_start + 286
7 libsystem_pthread.dylib 0x00007fffa34a9101 thread_start + 13
Thread 4:
0 libsystem_kernel.dylib 0x00007fffa33bebf2 __psynch_cvwait + 10
1 libsystem_pthread.dylib 0x00007fffa34aa86e _pthread_cond_wait + 712
2 libcoreclr.dylib 0x0000000104db97a2 CorUnix::CPalSynchronizationManager::ThreadNativeWait(CorUnix::_ThreadNativeWaitData*, unsigned int, CorUnix::ThreadWakeupReason*, unsigned int*) + 306
3 libcoreclr.dylib 0x0000000104db93f6 CorUnix::CPalSynchronizationManager::BlockThread(CorUnix::CPalThread*, unsigned int, bool, bool, CorUnix::ThreadWakeupReason*, unsigned int*) + 390
4 libcoreclr.dylib 0x0000000104dbe3e8 CorUnix::InternalWaitForMultipleObjectsEx(CorUnix::CPalThread*, unsigned int, void* const*, int, unsigned int, int) + 1912
5 libcoreclr.dylib 0x0000000104e2bc73 DebuggerRCThread::MainLoop() + 755
6 libcoreclr.dylib 0x0000000104e2b927 DebuggerRCThread::ThreadProc() + 263
7 libcoreclr.dylib 0x0000000104e2b5d4 DebuggerRCThread::ThreadProcStatic(void*) + 132
8 libcoreclr.dylib 0x0000000104dc3e48 CorUnix::CPalThread::ThreadEntry(void*) + 328
9 libsystem_pthread.dylib 0x00007fffa34a99af _pthread_body + 180
10 libsystem_pthread.dylib 0x00007fffa34a98fb _pthread_start + 286
11 libsystem_pthread.dylib 0x00007fffa34a9101 thread_start + 13
Thread 5:
0 libsystem_kernel.dylib 0x00007fffa33bebf2 __psynch_cvwait + 10
1 libsystem_pthread.dylib 0x00007fffa34aa86e _pthread_cond_wait + 712
2 libcoreclr.dylib 0x0000000104db9785 CorUnix::CPalSynchronizationManager::ThreadNativeWait(CorUnix::_ThreadNativeWaitData*, unsigned int, CorUnix::ThreadWakeupReason*, unsigned int*) + 277
3 libcoreclr.dylib 0x0000000104db93f6 CorUnix::CPalSynchronizationManager::BlockThread(CorUnix::CPalThread*, unsigned int, bool, bool, CorUnix::ThreadWakeupReason*, unsigned int*) + 390
4 libcoreclr.dylib 0x0000000104dbe3e8 CorUnix::InternalWaitForMultipleObjectsEx(CorUnix::CPalThread*, unsigned int, void* const*, int, unsigned int, int) + 1912
5 libcoreclr.dylib 0x0000000104dbe636 WaitForSingleObjectEx + 70
6 libcoreclr.dylib 0x000000010507b340 CLREventBase::WaitEx(unsigned int, WaitMode, PendingSync*) + 176
7 libcoreclr.dylib 0x0000000104fd0e6f FinalizerThread::WaitForFinalizerEvent(CLREvent*) + 31
8 libcoreclr.dylib 0x0000000104fd0fe3 FinalizerThread::FinalizerThreadWorker(void*) + 115
9 libcoreclr.dylib 0x0000000104f4b15a ManagedThreadBase_DispatchOuter(ManagedThreadCallState*) + 378
10 libcoreclr.dylib 0x0000000104f4b859 ManagedThreadBase::FinalizerBase(void (*)(void*)) + 73
11 libcoreclr.dylib 0x0000000104fd13cc FinalizerThread::FinalizerThreadStart(void*) + 204
12 libcoreclr.dylib 0x0000000104dc3e48 CorUnix::CPalThread::ThreadEntry(void*) + 328
13 libsystem_pthread.dylib 0x00007fffa34a99af _pthread_body + 180
14 libsystem_pthread.dylib 0x00007fffa34a98fb _pthread_start + 286
15 libsystem_pthread.dylib 0x00007fffa34a9101 thread_start + 13
Thread 6:
0 libsystem_kernel.dylib 0x00007fffa33b734a mach_msg_trap + 10
1 libsystem_kernel.dylib 0x00007fffa33b6797 mach_msg + 55
2 libclrjit.dylib 0x0000000105bb174a MachMessage::Receive(unsigned int) + 74
3 libclrjit.dylib 0x0000000105bb0669 SEHExceptionThread(void*) + 105
4 libsystem_pthread.dylib 0x00007fffa34a99af _pthread_body + 180
5 libsystem_pthread.dylib 0x00007fffa34a98fb _pthread_start + 286
6 libsystem_pthread.dylib 0x00007fffa34a9101 thread_start + 13
Thread 7:
0 libsystem_kernel.dylib 0x00007fffa33bf44e __workq_kernreturn + 10
1 libsystem_pthread.dylib 0x00007fffa34a9695 _pthread_wqthread + 1426
2 libsystem_pthread.dylib 0x00007fffa34a90f1 start_wqthread + 13
Thread 0 crashed with X86 Thread State (64-bit):
rax: 0x0000000000000000 rbx: 0x0000000000000006 rcx: 0x00007fff5b0927c8 rdx: 0x0000000000000000
rdi: 0x0000000000000307 rsi: 0x0000000000000006 rbp: 0x00007fff5b0927f0 rsp: 0x00007fff5b0927c8
r8: 0x00007faa1b8088f8 r9: 0x00000000007a682f r10: 0x0000000008000000 r11: 0x0000000000000206
r12: 0x00007fff5b092e30 r13: 0x0000000000000000 r14: 0x00007fffac1f43c0 r15: 0x0000000000000000
rip: 0x00007fffa33bed42 rfl: 0x0000000000000206 cr2: 0x00007fffac1d6128
Logical CPU: 0
Error Code: 0x02000148
Trap Number: 133
Binary Images:
0x104b65000 - 0x104b81ff3 +dotnet (0) <3D909F9D-70EC-3599-8378-D1E3D3F3C4C2> /usr/local/share/dotnet/dotnet
0x104bbf000 - 0x104c62ff7 +libhostfxr.dylib (0) <C0FDA7FE-45E6-3289-8926-3DA0A4901323> /usr/local/share/dotnet/host/fxr/1.1.0/libhostfxr.dylib
0x104c8c000 - 0x104d48ff3 +libhostpolicy.dylib (0) <BFE7338C-2137-3433-A8F8-FEB0390BAF83> /usr/local/share/dotnet/shared/Microsoft.NETCore.App/1.1.1/libhostpolicy.dylib
0x104d79000 - 0x10529cfff +libcoreclr.dylib (0) <40FA6CB2-53C9-3176-8516-FE40B2D8F3C8> /usr/local/share/dotnet/shared/Microsoft.NETCore.App/1.1.1/libcoreclr.dylib
0x1055cc000 - 0x105609dc7 dyld (433.5) <8239D0D7-66F6-3C44-A77F-586F74525DA3> /usr/lib/dyld
0x1059f4000 - 0x105bf6fff +libclrjit.dylib (0) <C4B9490D-6AC6-3CEE-9A9C-90495E36699C> /usr/local/share/dotnet/shared/Microsoft.NETCore.App/1.1.1/libclrjit.dylib
0x105cc0000 - 0x105cc8fff +System.Globalization.Native.dylib (0) <60FCF655-55BA-37A9-924C-157C9E8E7AC1> /usr/local/share/dotnet/shared/Microsoft.NETCore.App/1.1.1/System.Globalization.Native.dylib
0x10759d000 - 0x1075a4ffb +System.Native.dylib (0) <5F1F2A95-1960-379D-A897-AF4F76F06D11> /usr/local/share/dotnet/shared/Microsoft.NETCore.App/1.1.1/System.Native.dylib
0x7fff8a057000 - 0x7fff8a057fff com.apple.Accelerate (1.11 - Accelerate 1.11) <D7745BB9-42FD-3443-9265-151413E4C8AD> /System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate
0x7fff8a070000 - 0x7fff8a6b0fe3 com.apple.vImage (8.1 - ???) <B1786726-6477-327E-83F4-8EFF4D15DFAC> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vImage.framework/Versions/A/vImage
0x7fff8a6b1000 - 0x7fff8a878fef libBLAS.dylib (1185.50.4) <7AF8DB9A-E33B-30EB-8767-87AFB951E231> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
0x7fff8a879000 - 0x7fff8a890fff libBNNS.dylib (15) <26F32264-148E-35E5-A280-CA035CB3D6F0> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBNNS.dylib
0x7fff8a891000 - 0x7fff8ac9cfff libLAPACK.dylib (1185.50.4) <51B2BABA-F736-3663-885A-65A8991D3560> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libLAPACK.dylib
0x7fff8ac9d000 - 0x7fff8acb3fff libLinearAlgebra.dylib (1185.50.4) <0EC25E70-05DC-3615-85B6-81721566CF1D> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libLinearAlgebra.dylib
0x7fff8acb4000 - 0x7fff8acbafff libQuadrature.dylib (3) <EF56C8E6-DE22-3C69-B543-A8648F335FDD> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libQuadrature.dylib
0x7fff8acbb000 - 0x7fff8acceff7 libSparseBLAS.dylib (1185.50.4) <0BDCF6A7-0228-3719-81C7-B6EBC7911320> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libSparseBLAS.dylib
0x7fff8accf000 - 0x7fff8ae1cff7 libvDSP.dylib (600) <9D9CFF97-2A64-341A-AB35-DC0C76068B9C> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libvDSP.dylib
0x7fff8ae1d000 - 0x7fff8aed4fff libvMisc.dylib (600) <661B825D-274E-3B85-B160-89873AD65A62> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libvMisc.dylib
0x7fff8aed5000 - 0x7fff8aed5fff com.apple.Accelerate.vecLib (3.11 - vecLib 3.11) <B8F2814E-0927-3905-A394-EFEB5636DE76> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/vecLib
0x7fff8bf7f000 - 0x7fff8bf7ffff com.apple.ApplicationServices (48 - 48) <847E54B5-DEA4-3B50-93CE-4FC67789F179> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/ApplicationServices
0x7fff8bf80000 - 0x7fff8bfeeff7 com.apple.ApplicationServices.ATS (377 - 422.2) <012ACEE0-9A90-3998-8495-734E105117C0> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/ATS
0x7fff8c088000 - 0x7fff8c1b7ff7 libFontParser.dylib (194.11) <635DF6EF-18DF-3D06-90AA-88C509B43068> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/Resources/libFontParser.dylib
0x7fff8c1b8000 - 0x7fff8c202fff libFontRegistry.dylib (196.4) <EA96AE47-3369-3DEA-BB82-98348ADBD85B> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/Resources/libFontRegistry.dylib
0x7fff8c2fe000 - 0x7fff8c3a8ff7 com.apple.ColorSync (4.12.0 - 502.2) <ACA4001E-A0E3-33F6-9CD6-EEC2AA15E322> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ColorSync.framework/Versions/A/ColorSync
0x7fff8c3a9000 - 0x7fff8c3f9ff7 com.apple.HIServices (1.22 - 591) <D16A5699-F3A2-3AF5-93B1-9E2F487F1577> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/HIServices.framework/Versions/A/HIServices
0x7fff8c3fa000 - 0x7fff8c409ff3 com.apple.LangAnalysis (1.7.0 - 1.7.0) <2CBE7F61-2056-3F96-99A1-0D527796AFA6> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/LangAnalysis.framework/Versions/A/LangAnalysis
0x7fff8c40a000 - 0x7fff8c457fff com.apple.print.framework.PrintCore (12 - 491) <5027FD58-F0EE-33E4-8577-934CA06CD2AF> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/PrintCore.framework/Versions/A/PrintCore
0x7fff8c458000 - 0x7fff8c493fff com.apple.QD (3.12 - 313) <B339C41D-8CDF-3342-8414-F9717DCCADD4> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/QD.framework/Versions/A/QD
0x7fff8c494000 - 0x7fff8c49fff7 com.apple.speech.synthesis.framework (6.3.3 - 6.3.3) <5808F070-8199-36C9-B3C9-F9B98D5AA359> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/SpeechSynthesis.framework/Versions/A/SpeechSynthesis
0x7fff8c4a0000 - 0x7fff8c6acfff com.apple.audio.toolbox.AudioToolbox (1.14 - 1.14) <6EEF498D-8233-3622-B34B-49FDD9D4DF14> /System/Library/Frameworks/AudioToolbox.framework/Versions/A/AudioToolbox
0x7fff8c816000 - 0x7fff8cbf0ff7 com.apple.CFNetwork (811.4.18 - 811.4.18) <9CE329E8-6177-3474-976D-F5C63FC875CD> /System/Library/Frameworks/CFNetwork.framework/Versions/A/CFNetwork
0x7fff8d20c000 - 0x7fff8d299fff com.apple.audio.CoreAudio (4.3.0 - 4.3.0) <184D9C49-248F-3374-944C-FD1A99A6B32E> /System/Library/Frameworks/CoreAudio.framework/Versions/A/CoreAudio
0x7fff8d2ae000 - 0x7fff8d5adff3 com.apple.CoreData (120 - 754.2) <C9933C8C-85D5-3FB9-8D6D-DB80AB3F496B> /System/Library/Frameworks/CoreData.framework/Versions/A/CoreData
0x7fff8d65b000 - 0x7fff8daeefff com.apple.CoreFoundation (6.9 - 1349.65) <2B7C4BA4-D69E-3651-93DF-3930880B5084> /System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation
0x7fff8daef000 - 0x7fff8e192ff7 com.apple.CoreGraphics (2.0 - 1070.22) <1676F5EC-AEE3-3C52-97C4-43CBF705EA2A> /System/Library/Frameworks/CoreGraphics.framework/Versions/A/CoreGraphics
0x7fff8e53b000 - 0x7fff8e53bfff com.apple.CoreServices (775.19 - 775.19) <8AA95E32-AB13-3792-B248-FA150D8E6583> /System/Library/Frameworks/CoreServices.framework/Versions/A/CoreServices
0x7fff8e53c000 - 0x7fff8e58dfff com.apple.AE (712.5 - 712.5) <F0B36ABC-C0D4-370E-8257-11A7F351EC7F> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/AE.framework/Versions/A/AE
0x7fff8e58e000 - 0x7fff8e869ff7 com.apple.CoreServices.CarbonCore (1159.6 - 1159.6) <08AC074C-965B-3EDF-8E45-0707C8DE9EAD> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/CarbonCore.framework/Versions/A/CarbonCore
0x7fff8e86a000 - 0x7fff8e89dfff com.apple.DictionaryServices (1.2 - 274) <D23866E2-F7C8-3984-A9D4-96552CCDE573> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/DictionaryServices.framework/Versions/A/DictionaryServices
0x7fff8e89e000 - 0x7fff8e8a6ff3 com.apple.CoreServices.FSEvents (1230.50.1 - 1230.50.1) <2AD1B0E5-7214-37C4-8D11-A27C9CAC0F74> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/FSEvents.framework/Versions/A/FSEvents
0x7fff8e8a7000 - 0x7fff8ea13ff7 com.apple.LaunchServices (775.19 - 775.19) <1CF81B5F-BA1A-3FC6-B4F9-E0A319AE94D0> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/LaunchServices.framework/Versions/A/LaunchServices
0x7fff8ea14000 - 0x7fff8eac4ffb com.apple.Metadata (10.7.0 - 1075.40) <F205A001-250D-3D9A-8375-0F7A834C46E6> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/Metadata.framework/Versions/A/Metadata
0x7fff8eac5000 - 0x7fff8eb24fff com.apple.CoreServices.OSServices (775.19 - 775.19) <724312AC-5CE8-333C-BC35-BC5AB1583D9A> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/OSServices.framework/Versions/A/OSServices
0x7fff8eb25000 - 0x7fff8eb95fff com.apple.SearchKit (1.4.0 - 1.4.0) <7A6DDA2B-03F1-3137-BA9E-1CC211973E26> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/SearchKit.framework/Versions/A/SearchKit
0x7fff8eb96000 - 0x7fff8ebdbff7 com.apple.coreservices.SharedFileList (38 - 38) <DA096678-93AB-3291-BDE2-482F1D544589> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/SharedFileList.framework/Versions/A/SharedFileList
0x7fff8ec64000 - 0x7fff8edb0ff3 com.apple.CoreText (352.0 - 544.12) <1ED17C4A-9E2D-3537-8C5F-FB675492A002> /System/Library/Frameworks/CoreText.framework/Versions/A/CoreText
0x7fff8ef56000 - 0x7fff8ef5bfff com.apple.DiskArbitration (2.7 - 2.7) <A4DCD470-D8EA-37E9-BDCA-A2B469754C12> /System/Library/Frameworks/DiskArbitration.framework/Versions/A/DiskArbitration
0x7fff8f0ed000 - 0x7fff8f493ff3 com.apple.Foundation (6.9 - 1349.64) <49C8DA40-9E5B-33F9-B092-F50115B59E95> /System/Library/Frameworks/Foundation.framework/Versions/C/Foundation
0x7fff8f4bf000 - 0x7fff8f4f0ff7 com.apple.GSS (4.0 - 2.0) <6FADED0B-0425-3567-A75A-040C5A4638EB> /System/Library/Frameworks/GSS.framework/Versions/A/GSS
0x7fff8f654000 - 0x7fff8f6e9fff com.apple.framework.IOKit (2.0.2 - 1324.50.21) <BA7DC917-35A9-3D1B-BBEC-ADF4495A166D> /System/Library/Frameworks/IOKit.framework/Versions/A/IOKit
0x7fff8f6ea000 - 0x7fff8f6f0ffb com.apple.IOSurface (159.6 - 159.6) <661BFCC0-85AB-3343-853E-3797932871D4> /System/Library/Frameworks/IOSurface.framework/Versions/A/IOSurface
0x7fff8f743000 - 0x7fff8f8a5fff com.apple.ImageIO.framework (3.3.0 - 1599.7) <EFABDE90-A1B0-3211-978B-FF1414355087> /System/Library/Frameworks/ImageIO.framework/Versions/A/ImageIO
0x7fff8f8a6000 - 0x7fff8f8aafff libGIF.dylib (1599.7) <6004F3A9-A9F3-3287-A105-72870ED4537A> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libGIF.dylib
0x7fff8f8ab000 - 0x7fff8f99cff7 libJP2.dylib (1599.7) <447C19DA-1EC7-3145-9C03-392084CEC012> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libJP2.dylib
0x7fff8f99d000 - 0x7fff8f9c0fff libJPEG.dylib (1599.7) <CA292CD5-38A5-33B2-B84C-185E46ABDD85> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libJPEG.dylib
0x7fff8f9c1000 - 0x7fff8f9e7fff libPng.dylib (1599.7) <5EFC9938-CA0F-3AAD-AB70-210089939A74> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libPng.dylib
0x7fff8f9e8000 - 0x7fff8f9eaff3 libRadiance.dylib (1599.7) <AE2355C1-1C5F-3F41-A156-3D0CE09FBF6D> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libRadiance.dylib
0x7fff8f9eb000 - 0x7fff8fa39ff7 libTIFF.dylib (1599.7) <5CE8FC45-4B15-355F-AF40-8A98F0ADC9CF> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libTIFF.dylib
0x7fff907a0000 - 0x7fff907b9ff7 com.apple.Kerberos (3.0 - 1) <B9D242EB-E325-3A21-9812-C77CBBFB0D51> /System/Library/Frameworks/Kerberos.framework/Versions/A/Kerberos
0x7fff918dc000 - 0x7fff918e4fff com.apple.NetFS (6.0 - 4.0) <14A24D00-5673-330A-959D-87F72040DEFF> /System/Library/Frameworks/NetFS.framework/Versions/A/NetFS
0x7fff91cdb000 - 0x7fff91cf4ffb com.apple.CFOpenDirectory (10.12 - 194) <2D856BB1-4865-3B54-A39A-CCBB25A4A935> /System/Library/Frameworks/OpenDirectory.framework/Versions/A/Frameworks/CFOpenDirectory.framework/Versions/A/CFOpenDirectory
0x7fff91cf5000 - 0x7fff91d00ff7 com.apple.OpenDirectory (10.12 - 194) <D5977817-7507-3005-8DDC-AB059672BEA0> /System/Library/Frameworks/OpenDirectory.framework/Versions/A/OpenDirectory
0x7fff93d19000 - 0x7fff9401afff com.apple.security (7.0 - 57740.51.3) <E8E40839-4F2C-3954-9870-9F9BA185BC81> /System/Library/Frameworks/Security.framework/Versions/A/Security
0x7fff9401b000 - 0x7fff94090fff com.apple.securityfoundation (6.0 - 55132.50.7) <2A013E36-EEB5-3E9A-AAA7-8E10EC49E75C> /System/Library/Frameworks/SecurityFoundation.framework/Versions/A/SecurityFoundation
0x7fff940bb000 - 0x7fff940beffb com.apple.xpc.ServiceManagement (1.0 - 1) <00B5C305-37B4-378A-BCAE-5EC441A889C8> /System/Library/Frameworks/ServiceManagement.framework/Versions/A/ServiceManagement
0x7fff94445000 - 0x7fff944b4ff7 com.apple.SystemConfiguration (1.14 - 1.14) <A4B97859-CB45-3910-9785-0CAF015B46BC> /System/Library/Frameworks/SystemConfiguration.framework/Versions/A/SystemConfiguration
0x7fff96e54000 - 0x7fff96edf97f com.apple.AppleJPEG (1.0 - 1) <B9E9570D-04A4-34E4-B756-D200043B25B8> /System/Library/PrivateFrameworks/AppleJPEG.framework/Versions/A/AppleJPEG
0x7fff98966000 - 0x7fff9896fffb com.apple.CommonAuth (4.0 - 2.0) <216950CB-269F-3476-BA17-D6363AC49FBC> /System/Library/PrivateFrameworks/CommonAuth.framework/Versions/A/CommonAuth
0x7fff990b7000 - 0x7fff990c7fff com.apple.CoreEmoji (1.0 - 40.3.3) <E9A28301-2D79-3A97-A046-028258A6ABE5> /System/Library/PrivateFrameworks/CoreEmoji.framework/Versions/A/CoreEmoji
0x7fff9c14c000 - 0x7fff9c1c2ff3 com.apple.Heimdal (4.0 - 2.0) <8F9C9041-66D5-36C9-8A4C-1658035C311D> /System/Library/PrivateFrameworks/Heimdal.framework/Versions/A/Heimdal
0x7fff9c903000 - 0x7fff9cabafff com.apple.LanguageModeling (1.0 - 123.2.5) <E7EDBA2B-8B97-3EC8-BDB1-232287E07581> /System/Library/PrivateFrameworks/LanguageModeling.framework/Versions/A/LanguageModeling
0x7fff9d5d8000 - 0x7fff9d600fff com.apple.MultitouchSupport.framework (368.14 - 368.14) <930109A4-6949-377F-AD30-F9B542CBAE1C> /System/Library/PrivateFrameworks/MultitouchSupport.framework/Versions/A/MultitouchSupport
0x7fff9d6b2000 - 0x7fff9d6bdfff com.apple.NetAuth (6.2 - 6.2) <97F487D6-8089-31A8-B68C-6C1EAC6ED1B5> /System/Library/PrivateFrameworks/NetAuth.framework/Versions/A/NetAuth
0x7fffa0515000 - 0x7fffa051bff7 com.apple.TCC (1.0 - 1) <911B534B-4AC7-34E4-935E-E42ECD008CBC> /System/Library/PrivateFrameworks/TCC.framework/Versions/A/TCC
0x7fffa1924000 - 0x7fffa1926ffb com.apple.loginsupport (1.0 - 1) <F3140B97-12C3-35A7-9D3D-43DA2D13C113> /System/Library/PrivateFrameworks/login.framework/Versions/A/Frameworks/loginsupport.framework/Versions/A/loginsupport
0x7fffa197b000 - 0x7fffa1996ff7 libCRFSuite.dylib (34) <F78B7F5F-0B4F-35C6-AA2F-84EE9CB22137> /usr/lib/libCRFSuite.dylib
0x7fffa1997000 - 0x7fffa19a2fff libChineseTokenizer.dylib (21) <0886E908-A825-36AF-B94B-2361FD8BC2A1> /usr/lib/libChineseTokenizer.dylib
0x7fffa1a34000 - 0x7fffa1a35ff3 libDiagnosticMessagesClient.dylib (102) <84A04D24-0E60-3810-A8C0-90A65E2DF61A> /usr/lib/libDiagnosticMessagesClient.dylib
0x7fffa1c6d000 - 0x7fffa1c6dfff libOpenScriptingUtil.dylib (172) <90743888-C1E8-34E3-924E-1A754B2B63B9> /usr/lib/libOpenScriptingUtil.dylib
0x7fffa1c73000 - 0x7fffa1c74ffb libSystem.B.dylib (1238.51.1) <D9B20A4F-87BC-36CB-9405-80E105666725> /usr/lib/libSystem.B.dylib
0x7fffa1ce0000 - 0x7fffa1d0bff3 libarchive.2.dylib (41.50.2) <B4F507BC-B24E-3BE7-B658-94D798E2CD81> /usr/lib/libarchive.2.dylib
0x7fffa1e05000 - 0x7fffa1e05ff3 libauto.dylib (187) <34388D0B-C539-3C1B-9408-2BC152162E43> /usr/lib/libauto.dylib
0x7fffa1e06000 - 0x7fffa1e16ff3 libbsm.0.dylib (34) <20084796-B04D-3B35-A003-EA11459557A9> /usr/lib/libbsm.0.dylib
0x7fffa1e17000 - 0x7fffa1e25ff7 libbz2.1.0.dylib (38) <25D9FACF-5583-348A-80A0-2B51DCE37680> /usr/lib/libbz2.1.0.dylib
0x7fffa1e26000 - 0x7fffa1e7cff7 libc++.1.dylib (307.5) <0B43BB5D-E6EB-3464-8DE9-B41AC8ED9D1C> /usr/lib/libc++.1.dylib
0x7fffa1e7d000 - 0x7fffa1ea7fff libc++abi.dylib (307.3) <30199352-88BF-30BD-8CFF-2A4FBE247523> /usr/lib/libc++abi.dylib
0x7fffa1ea8000 - 0x7fffa1eb8ffb libcmph.dylib (6) <2B5D405E-2D0B-3320-ABD6-622934C86ABE> /usr/lib/libcmph.dylib
0x7fffa1eb9000 - 0x7fffa1ecfff7 libcompression.dylib (39) <BDAA8CC7-0BFC-36EC-9E75-58BDC15AC3B6> /usr/lib/libcompression.dylib
0x7fffa1ed0000 - 0x7fffa1ed0ff7 libcoretls.dylib (121.50.4) <64B1001E-10F6-3542-A3B2-C4B49F51817F> /usr/lib/libcoretls.dylib
0x7fffa1ed1000 - 0x7fffa1ed2ff3 libcoretls_cfhelpers.dylib (121.50.4) <1A10303E-5EB0-3C7C-9165-021FCDFD934D> /usr/lib/libcoretls_cfhelpers.dylib
0x7fffa220f000 - 0x7fffa2262ff7 libcups.2.dylib (450) <F7AC4FF1-9755-3CFF-8CE3-F4FFACC43BEC> /usr/lib/libcups.2.dylib
0x7fffa22dd000 - 0x7fffa22ddfff libenergytrace.dylib (15) <A1B040A2-7977-3097-9ADF-34FF181EB970> /usr/lib/libenergytrace.dylib
0x7fffa22ed000 - 0x7fffa22f2ff7 libheimdal-asn1.dylib (498.50.8) <A40E3196-235E-34CE-AD9A-8D1AFC5DE004> /usr/lib/libheimdal-asn1.dylib
0x7fffa22f3000 - 0x7fffa23e5ff7 libiconv.2.dylib (50) <42125B35-81D7-3FC4-9475-A26DBE10884D> /usr/lib/libiconv.2.dylib
0x7fffa23e6000 - 0x7fffa260bffb libicucore.A.dylib (57163.0.1) <325E1C97-1C45-3A7E-9AFB-D1328E31D879> /usr/lib/libicucore.A.dylib
0x7fffa2611000 - 0x7fffa2612fff liblangid.dylib (126) <2085E7A7-9A34-3735-87F4-F174EF8EABF0> /usr/lib/liblangid.dylib
0x7fffa2613000 - 0x7fffa262cffb liblzma.5.dylib (10) <44BD0279-99DD-36B5-8A6E-C11432E2098D> /usr/lib/liblzma.5.dylib
0x7fffa262d000 - 0x7fffa2643ff7 libmarisa.dylib (5) <9030D214-5D0F-30CB-AC03-902C63909362> /usr/lib/libmarisa.dylib
0x7fffa2644000 - 0x7fffa28ecff7 libmecabra.dylib (744.8) <D429FCC9-42A4-38B3-8784-44024BC859EF> /usr/lib/libmecabra.dylib
0x7fffa291f000 - 0x7fffa2999ff3 libnetwork.dylib (856.50.56) <021B3FCF-6CFC-359D-845A-8A6AD7C54D73> /usr/lib/libnetwork.dylib
0x7fffa299a000 - 0x7fffa2d6fbc7 libobjc.A.dylib (709) <54CD8D1A-5C98-3559-B13A-932B3D3DD338> /usr/lib/libobjc.A.dylib
0x7fffa2d72000 - 0x7fffa2d76fff libpam.2.dylib (21.30.1) <71EB0D88-DE84-3C8D-A2C5-58AA282BC5BC> /usr/lib/libpam.2.dylib
0x7fffa2d77000 - 0x7fffa2da8ff7 libpcap.A.dylib (67.50.2) <D4A7EFB6-15FE-3C9C-A47C-1CA3CB75D06C> /usr/lib/libpcap.A.dylib
0x7fffa2dc5000 - 0x7fffa2de1ffb libresolv.9.dylib (64) <A244AE4C-00B0-396C-98FF-97FE4DB3DA30> /usr/lib/libresolv.9.dylib
0x7fffa2e31000 - 0x7fffa2f77fff libsqlite3.dylib (254.5) <71E9B5E9-67D8-329E-86A6-894B885A542E> /usr/lib/libsqlite3.dylib
0x7fffa306c000 - 0x7fffa3079fff libxar.1.dylib (357) <69547C64-E811-326F-BBED-490C6361BDCB> /usr/lib/libxar.1.dylib
0x7fffa307a000 - 0x7fffa3169ffb libxml2.2.dylib (30.15) <99A58C37-98A2-3430-942A-D6038C1A198C> /usr/lib/libxml2.2.dylib
0x7fffa316a000 - 0x7fffa3193fff libxslt.1.dylib (15.9) <71FFCDFF-4AAF-394C-8452-92F301FB1A46> /usr/lib/libxslt.1.dylib
0x7fffa3194000 - 0x7fffa31a5ff3 libz.1.dylib (67) <46E3FFA2-4328-327A-8D34-A03E20BFFB8E> /usr/lib/libz.1.dylib
0x7fffa31b4000 - 0x7fffa31b8ff7 libcache.dylib (79) <093A4DAB-8385-3D47-A350-E20CB7CCF7BF> /usr/lib/system/libcache.dylib
0x7fffa31b9000 - 0x7fffa31c3fff libcommonCrypto.dylib (60092.50.5) <BE8380C5-C09D-3F48-A502-AEBB58231067> /usr/lib/system/libcommonCrypto.dylib
0x7fffa31c4000 - 0x7fffa31cbfff libcompiler_rt.dylib (62) <55D47421-772A-32AB-B529-1A46C2F43B4D> /usr/lib/system/libcompiler_rt.dylib
0x7fffa31cc000 - 0x7fffa31d4fff libcopyfile.dylib (138) <819BEA3C-DF11-3E3D-A1A1-5A51C5BF1961> /usr/lib/system/libcopyfile.dylib
0x7fffa31d5000 - 0x7fffa3258fdf libcorecrypto.dylib (442.50.19) <8A39EE06-121C-3731-A9E9-35847064B3EE> /usr/lib/system/libcorecrypto.dylib
0x7fffa3259000 - 0x7fffa328afff libdispatch.dylib (703.50.37) <D122E712-9593-31CA-BAC4-4A54410BF4A0> /usr/lib/system/libdispatch.dylib
0x7fffa328b000 - 0x7fffa3290ffb libdyld.dylib (433.5) <129D3B44-FB21-3750-9A68-48B5C3DC632B> /usr/lib/system/libdyld.dylib
0x7fffa3291000 - 0x7fffa3291ffb libkeymgr.dylib (28) <7AA011A9-DC21-3488-BF73-3B5B14D1FDD6> /usr/lib/system/libkeymgr.dylib
0x7fffa3292000 - 0x7fffa329effb libkxld.dylib (3789.51.2) <0BD544C8-A376-3F91-8426-564B4F7FE7E6> /usr/lib/system/libkxld.dylib
0x7fffa329f000 - 0x7fffa329ffff liblaunch.dylib (972.50.27) <037D198D-9B02-3EF9-A8E9-6F43EA555A9E> /usr/lib/system/liblaunch.dylib
0x7fffa32a0000 - 0x7fffa32a5ff3 libmacho.dylib (898) <17D5D855-F6C3-3B04-B680-E9BF02EF8AED> /usr/lib/system/libmacho.dylib
0x7fffa32a6000 - 0x7fffa32a8ff3 libquarantine.dylib (85.50.1) <7B32EA91-AB8B-32A4-8E52-9D3ED46CAC8E> /usr/lib/system/libquarantine.dylib
0x7fffa32a9000 - 0x7fffa32aaffb libremovefile.dylib (45) <38D4CB9C-10CD-30D3-8B7B-A515EC75FE85> /usr/lib/system/libremovefile.dylib
0x7fffa32ab000 - 0x7fffa32c3ff7 libsystem_asl.dylib (349.50.5) <096E4228-3B7C-30A6-8B13-EC909A64499A> /usr/lib/system/libsystem_asl.dylib
0x7fffa32c4000 - 0x7fffa32c4ff7 libsystem_blocks.dylib (67) <10DC5404-73AB-35B3-A277-A8AFECB476EB> /usr/lib/system/libsystem_blocks.dylib
0x7fffa32c5000 - 0x7fffa3352fef libsystem_c.dylib (1158.50.2) <B03F8915-1E9B-3C84-AED5-68E2E0031630> /usr/lib/system/libsystem_c.dylib
0x7fffa3353000 - 0x7fffa3356ffb libsystem_configuration.dylib (888.51.2) <872C8A42-0871-3424-830B-84E587A75D27> /usr/lib/system/libsystem_configuration.dylib
0x7fffa3357000 - 0x7fffa335afff libsystem_coreservices.dylib (41.4) <FD0915E8-9C43-3FCB-94E0-33C45DF028CD> /usr/lib/system/libsystem_coreservices.dylib
0x7fffa335b000 - 0x7fffa3373fff libsystem_coretls.dylib (121.50.4) <EC6FCF07-DCFB-3A03-9CC9-6DD3709974C6> /usr/lib/system/libsystem_coretls.dylib
0x7fffa3374000 - 0x7fffa337afff libsystem_dnssd.dylib (765.50.9) <FF02A197-7CEF-3684-8155-E5E225051E44> /usr/lib/system/libsystem_dnssd.dylib
0x7fffa337b000 - 0x7fffa33a4ff7 libsystem_info.dylib (503.50.4) <611DB84C-BF70-3F92-8702-B9F28A900920> /usr/lib/system/libsystem_info.dylib
0x7fffa33a5000 - 0x7fffa33c7ff7 libsystem_kernel.dylib (3789.51.2) <FC51D7B0-8292-3F6A-9231-64340B237EB7> /usr/lib/system/libsystem_kernel.dylib
0x7fffa33c8000 - 0x7fffa340ffe7 libsystem_m.dylib (3121.6) <A790C9A5-DD24-32F5-8FD7-33BFCE79AC87> /usr/lib/system/libsystem_m.dylib
0x7fffa3410000 - 0x7fffa342eff7 libsystem_malloc.dylib (116.50.8) <48D1BBA3-914E-3C65-AF70-C33B4A1B5233> /usr/lib/system/libsystem_malloc.dylib
0x7fffa342f000 - 0x7fffa3488ffb libsystem_network.dylib (856.50.56) <FDE14243-4328-3EFD-824C-C0D314D7B540> /usr/lib/system/libsystem_network.dylib
0x7fffa3489000 - 0x7fffa3492ff3 libsystem_networkextension.dylib (563.50.32) <D5381DA9-529C-3588-BE16-A2245DE93423> /usr/lib/system/libsystem_networkextension.dylib
0x7fffa3493000 - 0x7fffa349cff3 libsystem_notify.dylib (165.20.1) <B8160190-A069-3B3A-BDF6-2AA408221FAE> /usr/lib/system/libsystem_notify.dylib
0x7fffa349d000 - 0x7fffa34a5fe7 libsystem_platform.dylib (126.50.8) <5940EAB7-84D6-34DC-9B38-111648B2B589> /usr/lib/system/libsystem_platform.dylib
0x7fffa34a6000 - 0x7fffa34b0ff7 libsystem_pthread.dylib (218.51.1) <62A84A68-431D-3B54-A7B6-31367CCF2884> /usr/lib/system/libsystem_pthread.dylib
0x7fffa34b1000 - 0x7fffa34b4ff7 libsystem_sandbox.dylib (592.50.47) <87A2327D-B7A1-3E4C-A85D-D3D9484003DB> /usr/lib/system/libsystem_sandbox.dylib
0x7fffa34b5000 - 0x7fffa34b6ff3 libsystem_secinit.dylib (24.50.4) <F78B847B-3565-3E4B-98A6-F7AD40392E2D> /usr/lib/system/libsystem_secinit.dylib
0x7fffa34b7000 - 0x7fffa34beffb libsystem_symptoms.dylib (532.50.47) <9CF6A47C-8343-3E85-9C27-A8D98E726A8B> /usr/lib/system/libsystem_symptoms.dylib
0x7fffa34bf000 - 0x7fffa34d2ff7 libsystem_trace.dylib (518.51.1) <E1D540D8-CC88-3901-92BA-FC4B802FE0E8> /usr/lib/system/libsystem_trace.dylib
0x7fffa34d3000 - 0x7fffa34d8ffb libunwind.dylib (35.3) <3D50D8A8-C460-334D-A519-2DA841102C6B> /usr/lib/system/libunwind.dylib
0x7fffa34d9000 - 0x7fffa3502ff7 libxpc.dylib (972.50.27) <ABC45890-DA23-3A4A-B50B-1384BD4CBBDF> /usr/lib/system/libxpc.dylib
External Modification Summary:
Calls made by other processes targeting this process:
task_for_pid: 0
thread_create: 0
thread_set_state: 0
Calls made by this process:
task_for_pid: 0
thread_create: 0
thread_set_state: 0
Calls made by all processes on this machine:
task_for_pid: 317840
thread_create: 0
thread_set_state: 0
VM Region Summary:
ReadOnly portion of Libraries: Total=204.7M resident=0K(0%) swapped_out_or_unallocated=204.7M(100%)
Writable regions: Total=97.1M written=0K(0%) resident=0K(0%) swapped_out=0K(0%) unallocated=97.1M(100%)
VIRTUAL REGION
REGION TYPE SIZE COUNT (non-coalesced)
=========== ======= =======
Dispatch continuations 16.0M 2
Kernel Alloc Once 8K 2
MALLOC 65.0M 12
MALLOC guard page 16K 4
MALLOC_LARGE (reserved) 1168K 3 reserved VM address space (unallocated)
STACK GUARD 56.0M 9
Stack 11.6M 9
VM_ALLOCATE 2.3G 77
VM_ALLOCATE (reserved) 64K 2 reserved VM address space (unallocated)
__DATA 9928K 145
__LINKEDIT 116.5M 10
__TEXT 88.2M 149
__UNICODE 556K 2
mapped file 40.3M 130
shared memory 8K 3
=========== ======= =======
TOTAL 2.7G 544
TOTAL, minus reserved VM space 2.7G 544
Model: MacBookPro10,1, BootROM MBP101.00EE.B12, 4 processors, Intel Core i7, 2.3 GHz, 16 GB, SMC 2.3f35
Graphics: Intel HD Graphics 4000, Intel HD Graphics 4000, Built-In
Graphics: NVIDIA GeForce GT 650M, NVIDIA GeForce GT 650M, PCIe, 1024 MB
Memory Module: BANK 0/DIMM0, 8 GB, DDR3, 1600 MHz, 0x802C, 0x384B54463531323634485A2D314736453120
Memory Module: BANK 1/DIMM0, 8 GB, DDR3, 1600 MHz, 0x802C, 0x384B54463531323634485A2D314736453120
AirPort: spairport_wireless_card_type_airport_extreme (0x14E4, 0xEF), Broadcom BCM43xx 1.0 (7.21.171.124.1a2)
Bluetooth: Version 5.0.4f18, 3 services, 27 devices, 1 incoming serial ports
Network Service: Thunderbolt Ethernet, Ethernet, en4
PCI Card: Apple 57762-A0, Ethernet Controller, Thunderbolt@195,0,0
Serial ATA Device: APPLE SSD SM512E, 500.28 GB
USB Device: USB 2.0 Bus
USB Device: Hub
USB Device: FaceTime HD Camera (Built-in)
USB Device: USB 2.0 Bus
USB Device: Hub
USB Device: Hub
USB Device: Apple Internal Keyboard / Trackpad
USB Device: BRCM20702 Hub
USB Device: Bluetooth USB Host Controller
USB Device: USB 3.0 Bus
Thunderbolt Bus: MacBook Pro, Apple Inc., 23.4
Thunderbolt Device: Thunderbolt to Gigabit Ethernet Adapter, Apple Inc., 3, 5.5
---
@Petermarcu commented on [Wed Jun 14 2017](https://github.com/dotnet/core/issues/599#issuecomment-308614973)
@m2b , Any chance you can share more about which version of everything you are running and how you installed it? Through the installer or from the tar.gz?
---
@Petermarcu commented on [Wed Jun 28 2017](https://github.com/dotnet/core/issues/599#issuecomment-311856464)
I'm going to close this because we need more info. Please reopen if you have more info you can share. Thanks!
---
@jpcarrascal commented on [Sun Jul 09 2017](https://github.com/dotnet/core/issues/599#issuecomment-313953236)
I have exactly the same problem.
I am running OS X El Capitan (10.11.6).
`dotnet --version` output is 1.0.4.
I tried both the installer and running the binary from the tar.gz and the result is the same. The only difference is that in the former case I also get a "dotnet quit unexpectedly" window.
I'd appreciate any help and I'll be happy to provide more diagnostics information if needed.
Thanks!
JP
---
@richlander commented on [Sun Jul 09 2017](https://github.com/dotnet/core/issues/599#issuecomment-313994521)
/cc @livarcocc
_Copied from original issue: dotnet/cli#7777_
_Copied from original issue: dotnet/core-setup#3279_
0x7fff9c903000 - 0x7fff9cabafff com.apple.LanguageModeling (1.0 - 123.2.5) <E7EDBA2B-8B97-3EC8-BDB1-232287E07581> /System/Library/PrivateFrameworks/LanguageModeling.framework/Versions/A/LanguageModeling
0x7fff9d5d8000 - 0x7fff9d600fff com.apple.MultitouchSupport.framework (368.14 - 368.14) <930109A4-6949-377F-AD30-F9B542CBAE1C> /System/Library/PrivateFrameworks/MultitouchSupport.framework/Versions/A/MultitouchSupport
0x7fff9d6b2000 - 0x7fff9d6bdfff com.apple.NetAuth (6.2 - 6.2) <97F487D6-8089-31A8-B68C-6C1EAC6ED1B5> /System/Library/PrivateFrameworks/NetAuth.framework/Versions/A/NetAuth
0x7fffa0515000 - 0x7fffa051bff7 com.apple.TCC (1.0 - 1) <911B534B-4AC7-34E4-935E-E42ECD008CBC> /System/Library/PrivateFrameworks/TCC.framework/Versions/A/TCC
0x7fffa1924000 - 0x7fffa1926ffb com.apple.loginsupport (1.0 - 1) <F3140B97-12C3-35A7-9D3D-43DA2D13C113> /System/Library/PrivateFrameworks/login.framework/Versions/A/Frameworks/loginsupport.framework/Versions/A/loginsupport
0x7fffa197b000 - 0x7fffa1996ff7 libCRFSuite.dylib (34) <F78B7F5F-0B4F-35C6-AA2F-84EE9CB22137> /usr/lib/libCRFSuite.dylib
0x7fffa1997000 - 0x7fffa19a2fff libChineseTokenizer.dylib (21) <0886E908-A825-36AF-B94B-2361FD8BC2A1> /usr/lib/libChineseTokenizer.dylib
0x7fffa1a34000 - 0x7fffa1a35ff3 libDiagnosticMessagesClient.dylib (102) <84A04D24-0E60-3810-A8C0-90A65E2DF61A> /usr/lib/libDiagnosticMessagesClient.dylib
0x7fffa1c6d000 - 0x7fffa1c6dfff libOpenScriptingUtil.dylib (172) <90743888-C1E8-34E3-924E-1A754B2B63B9> /usr/lib/libOpenScriptingUtil.dylib
0x7fffa1c73000 - 0x7fffa1c74ffb libSystem.B.dylib (1238.51.1) <D9B20A4F-87BC-36CB-9405-80E105666725> /usr/lib/libSystem.B.dylib
0x7fffa1ce0000 - 0x7fffa1d0bff3 libarchive.2.dylib (41.50.2) <B4F507BC-B24E-3BE7-B658-94D798E2CD81> /usr/lib/libarchive.2.dylib
0x7fffa1e05000 - 0x7fffa1e05ff3 libauto.dylib (187) <34388D0B-C539-3C1B-9408-2BC152162E43> /usr/lib/libauto.dylib
0x7fffa1e06000 - 0x7fffa1e16ff3 libbsm.0.dylib (34) <20084796-B04D-3B35-A003-EA11459557A9> /usr/lib/libbsm.0.dylib
0x7fffa1e17000 - 0x7fffa1e25ff7 libbz2.1.0.dylib (38) <25D9FACF-5583-348A-80A0-2B51DCE37680> /usr/lib/libbz2.1.0.dylib
0x7fffa1e26000 - 0x7fffa1e7cff7 libc++.1.dylib (307.5) <0B43BB5D-E6EB-3464-8DE9-B41AC8ED9D1C> /usr/lib/libc++.1.dylib
0x7fffa1e7d000 - 0x7fffa1ea7fff libc++abi.dylib (307.3) <30199352-88BF-30BD-8CFF-2A4FBE247523> /usr/lib/libc++abi.dylib
0x7fffa1ea8000 - 0x7fffa1eb8ffb libcmph.dylib (6) <2B5D405E-2D0B-3320-ABD6-622934C86ABE> /usr/lib/libcmph.dylib
0x7fffa1eb9000 - 0x7fffa1ecfff7 libcompression.dylib (39) <BDAA8CC7-0BFC-36EC-9E75-58BDC15AC3B6> /usr/lib/libcompression.dylib
0x7fffa1ed0000 - 0x7fffa1ed0ff7 libcoretls.dylib (121.50.4) <64B1001E-10F6-3542-A3B2-C4B49F51817F> /usr/lib/libcoretls.dylib
0x7fffa1ed1000 - 0x7fffa1ed2ff3 libcoretls_cfhelpers.dylib (121.50.4) <1A10303E-5EB0-3C7C-9165-021FCDFD934D> /usr/lib/libcoretls_cfhelpers.dylib
0x7fffa220f000 - 0x7fffa2262ff7 libcups.2.dylib (450) <F7AC4FF1-9755-3CFF-8CE3-F4FFACC43BEC> /usr/lib/libcups.2.dylib
0x7fffa22dd000 - 0x7fffa22ddfff libenergytrace.dylib (15) <A1B040A2-7977-3097-9ADF-34FF181EB970> /usr/lib/libenergytrace.dylib
0x7fffa22ed000 - 0x7fffa22f2ff7 libheimdal-asn1.dylib (498.50.8) <A40E3196-235E-34CE-AD9A-8D1AFC5DE004> /usr/lib/libheimdal-asn1.dylib
0x7fffa22f3000 - 0x7fffa23e5ff7 libiconv.2.dylib (50) <42125B35-81D7-3FC4-9475-A26DBE10884D> /usr/lib/libiconv.2.dylib
0x7fffa23e6000 - 0x7fffa260bffb libicucore.A.dylib (57163.0.1) <325E1C97-1C45-3A7E-9AFB-D1328E31D879> /usr/lib/libicucore.A.dylib
0x7fffa2611000 - 0x7fffa2612fff liblangid.dylib (126) <2085E7A7-9A34-3735-87F4-F174EF8EABF0> /usr/lib/liblangid.dylib
0x7fffa2613000 - 0x7fffa262cffb liblzma.5.dylib (10) <44BD0279-99DD-36B5-8A6E-C11432E2098D> /usr/lib/liblzma.5.dylib
0x7fffa262d000 - 0x7fffa2643ff7 libmarisa.dylib (5) <9030D214-5D0F-30CB-AC03-902C63909362> /usr/lib/libmarisa.dylib
0x7fffa2644000 - 0x7fffa28ecff7 libmecabra.dylib (744.8) <D429FCC9-42A4-38B3-8784-44024BC859EF> /usr/lib/libmecabra.dylib
0x7fffa291f000 - 0x7fffa2999ff3 libnetwork.dylib (856.50.56) <021B3FCF-6CFC-359D-845A-8A6AD7C54D73> /usr/lib/libnetwork.dylib
0x7fffa299a000 - 0x7fffa2d6fbc7 libobjc.A.dylib (709) <54CD8D1A-5C98-3559-B13A-932B3D3DD338> /usr/lib/libobjc.A.dylib
0x7fffa2d72000 - 0x7fffa2d76fff libpam.2.dylib (21.30.1) <71EB0D88-DE84-3C8D-A2C5-58AA282BC5BC> /usr/lib/libpam.2.dylib
0x7fffa2d77000 - 0x7fffa2da8ff7 libpcap.A.dylib (67.50.2) <D4A7EFB6-15FE-3C9C-A47C-1CA3CB75D06C> /usr/lib/libpcap.A.dylib
0x7fffa2dc5000 - 0x7fffa2de1ffb libresolv.9.dylib (64) <A244AE4C-00B0-396C-98FF-97FE4DB3DA30> /usr/lib/libresolv.9.dylib
0x7fffa2e31000 - 0x7fffa2f77fff libsqlite3.dylib (254.5) <71E9B5E9-67D8-329E-86A6-894B885A542E> /usr/lib/libsqlite3.dylib
0x7fffa306c000 - 0x7fffa3079fff libxar.1.dylib (357) <69547C64-E811-326F-BBED-490C6361BDCB> /usr/lib/libxar.1.dylib
0x7fffa307a000 - 0x7fffa3169ffb libxml2.2.dylib (30.15) <99A58C37-98A2-3430-942A-D6038C1A198C> /usr/lib/libxml2.2.dylib
0x7fffa316a000 - 0x7fffa3193fff libxslt.1.dylib (15.9) <71FFCDFF-4AAF-394C-8452-92F301FB1A46> /usr/lib/libxslt.1.dylib
0x7fffa3194000 - 0x7fffa31a5ff3 libz.1.dylib (67) <46E3FFA2-4328-327A-8D34-A03E20BFFB8E> /usr/lib/libz.1.dylib
0x7fffa31b4000 - 0x7fffa31b8ff7 libcache.dylib (79) <093A4DAB-8385-3D47-A350-E20CB7CCF7BF> /usr/lib/system/libcache.dylib
0x7fffa31b9000 - 0x7fffa31c3fff libcommonCrypto.dylib (60092.50.5) <BE8380C5-C09D-3F48-A502-AEBB58231067> /usr/lib/system/libcommonCrypto.dylib
0x7fffa31c4000 - 0x7fffa31cbfff libcompiler_rt.dylib (62) <55D47421-772A-32AB-B529-1A46C2F43B4D> /usr/lib/system/libcompiler_rt.dylib
0x7fffa31cc000 - 0x7fffa31d4fff libcopyfile.dylib (138) <819BEA3C-DF11-3E3D-A1A1-5A51C5BF1961> /usr/lib/system/libcopyfile.dylib
0x7fffa31d5000 - 0x7fffa3258fdf libcorecrypto.dylib (442.50.19) <8A39EE06-121C-3731-A9E9-35847064B3EE> /usr/lib/system/libcorecrypto.dylib
0x7fffa3259000 - 0x7fffa328afff libdispatch.dylib (703.50.37) <D122E712-9593-31CA-BAC4-4A54410BF4A0> /usr/lib/system/libdispatch.dylib
0x7fffa328b000 - 0x7fffa3290ffb libdyld.dylib (433.5) <129D3B44-FB21-3750-9A68-48B5C3DC632B> /usr/lib/system/libdyld.dylib
0x7fffa3291000 - 0x7fffa3291ffb libkeymgr.dylib (28) <7AA011A9-DC21-3488-BF73-3B5B14D1FDD6> /usr/lib/system/libkeymgr.dylib
0x7fffa3292000 - 0x7fffa329effb libkxld.dylib (3789.51.2) <0BD544C8-A376-3F91-8426-564B4F7FE7E6> /usr/lib/system/libkxld.dylib
0x7fffa329f000 - 0x7fffa329ffff liblaunch.dylib (972.50.27) <037D198D-9B02-3EF9-A8E9-6F43EA555A9E> /usr/lib/system/liblaunch.dylib
0x7fffa32a0000 - 0x7fffa32a5ff3 libmacho.dylib (898) <17D5D855-F6C3-3B04-B680-E9BF02EF8AED> /usr/lib/system/libmacho.dylib
0x7fffa32a6000 - 0x7fffa32a8ff3 libquarantine.dylib (85.50.1) <7B32EA91-AB8B-32A4-8E52-9D3ED46CAC8E> /usr/lib/system/libquarantine.dylib
0x7fffa32a9000 - 0x7fffa32aaffb libremovefile.dylib (45) <38D4CB9C-10CD-30D3-8B7B-A515EC75FE85> /usr/lib/system/libremovefile.dylib
0x7fffa32ab000 - 0x7fffa32c3ff7 libsystem_asl.dylib (349.50.5) <096E4228-3B7C-30A6-8B13-EC909A64499A> /usr/lib/system/libsystem_asl.dylib
0x7fffa32c4000 - 0x7fffa32c4ff7 libsystem_blocks.dylib (67) <10DC5404-73AB-35B3-A277-A8AFECB476EB> /usr/lib/system/libsystem_blocks.dylib
0x7fffa32c5000 - 0x7fffa3352fef libsystem_c.dylib (1158.50.2) <B03F8915-1E9B-3C84-AED5-68E2E0031630> /usr/lib/system/libsystem_c.dylib
0x7fffa3353000 - 0x7fffa3356ffb libsystem_configuration.dylib (888.51.2) <872C8A42-0871-3424-830B-84E587A75D27> /usr/lib/system/libsystem_configuration.dylib
0x7fffa3357000 - 0x7fffa335afff libsystem_coreservices.dylib (41.4) <FD0915E8-9C43-3FCB-94E0-33C45DF028CD> /usr/lib/system/libsystem_coreservices.dylib
0x7fffa335b000 - 0x7fffa3373fff libsystem_coretls.dylib (121.50.4) <EC6FCF07-DCFB-3A03-9CC9-6DD3709974C6> /usr/lib/system/libsystem_coretls.dylib
0x7fffa3374000 - 0x7fffa337afff libsystem_dnssd.dylib (765.50.9) <FF02A197-7CEF-3684-8155-E5E225051E44> /usr/lib/system/libsystem_dnssd.dylib
0x7fffa337b000 - 0x7fffa33a4ff7 libsystem_info.dylib (503.50.4) <611DB84C-BF70-3F92-8702-B9F28A900920> /usr/lib/system/libsystem_info.dylib
0x7fffa33a5000 - 0x7fffa33c7ff7 libsystem_kernel.dylib (3789.51.2) <FC51D7B0-8292-3F6A-9231-64340B237EB7> /usr/lib/system/libsystem_kernel.dylib
0x7fffa33c8000 - 0x7fffa340ffe7 libsystem_m.dylib (3121.6) <A790C9A5-DD24-32F5-8FD7-33BFCE79AC87> /usr/lib/system/libsystem_m.dylib
0x7fffa3410000 - 0x7fffa342eff7 libsystem_malloc.dylib (116.50.8) <48D1BBA3-914E-3C65-AF70-C33B4A1B5233> /usr/lib/system/libsystem_malloc.dylib
0x7fffa342f000 - 0x7fffa3488ffb libsystem_network.dylib (856.50.56) <FDE14243-4328-3EFD-824C-C0D314D7B540> /usr/lib/system/libsystem_network.dylib
0x7fffa3489000 - 0x7fffa3492ff3 libsystem_networkextension.dylib (563.50.32) <D5381DA9-529C-3588-BE16-A2245DE93423> /usr/lib/system/libsystem_networkextension.dylib
0x7fffa3493000 - 0x7fffa349cff3 libsystem_notify.dylib (165.20.1) <B8160190-A069-3B3A-BDF6-2AA408221FAE> /usr/lib/system/libsystem_notify.dylib
0x7fffa349d000 - 0x7fffa34a5fe7 libsystem_platform.dylib (126.50.8) <5940EAB7-84D6-34DC-9B38-111648B2B589> /usr/lib/system/libsystem_platform.dylib
0x7fffa34a6000 - 0x7fffa34b0ff7 libsystem_pthread.dylib (218.51.1) <62A84A68-431D-3B54-A7B6-31367CCF2884> /usr/lib/system/libsystem_pthread.dylib
0x7fffa34b1000 - 0x7fffa34b4ff7 libsystem_sandbox.dylib (592.50.47) <87A2327D-B7A1-3E4C-A85D-D3D9484003DB> /usr/lib/system/libsystem_sandbox.dylib
0x7fffa34b5000 - 0x7fffa34b6ff3 libsystem_secinit.dylib (24.50.4) <F78B847B-3565-3E4B-98A6-F7AD40392E2D> /usr/lib/system/libsystem_secinit.dylib
0x7fffa34b7000 - 0x7fffa34beffb libsystem_symptoms.dylib (532.50.47) <9CF6A47C-8343-3E85-9C27-A8D98E726A8B> /usr/lib/system/libsystem_symptoms.dylib
0x7fffa34bf000 - 0x7fffa34d2ff7 libsystem_trace.dylib (518.51.1) <E1D540D8-CC88-3901-92BA-FC4B802FE0E8> /usr/lib/system/libsystem_trace.dylib
0x7fffa34d3000 - 0x7fffa34d8ffb libunwind.dylib (35.3) <3D50D8A8-C460-334D-A519-2DA841102C6B> /usr/lib/system/libunwind.dylib
0x7fffa34d9000 - 0x7fffa3502ff7 libxpc.dylib (972.50.27) <ABC45890-DA23-3A4A-B50B-1384BD4CBBDF> /usr/lib/system/libxpc.dylib
External Modification Summary:
Calls made by other processes targeting this process:
task_for_pid: 0
thread_create: 0
thread_set_state: 0
Calls made by this process:
task_for_pid: 0
thread_create: 0
thread_set_state: 0
Calls made by all processes on this machine:
task_for_pid: 317840
thread_create: 0
thread_set_state: 0
VM Region Summary:
ReadOnly portion of Libraries: Total=204.7M resident=0K(0%) swapped_out_or_unallocated=204.7M(100%)
Writable regions: Total=97.1M written=0K(0%) resident=0K(0%) swapped_out=0K(0%) unallocated=97.1M(100%)
VIRTUAL REGION
REGION TYPE SIZE COUNT (non-coalesced)
=========== ======= =======
Dispatch continuations 16.0M 2
Kernel Alloc Once 8K 2
MALLOC 65.0M 12
MALLOC guard page 16K 4
MALLOC_LARGE (reserved) 1168K 3 reserved VM address space (unallocated)
STACK GUARD 56.0M 9
Stack 11.6M 9
VM_ALLOCATE 2.3G 77
VM_ALLOCATE (reserved) 64K 2 reserved VM address space (unallocated)
__DATA 9928K 145
__LINKEDIT 116.5M 10
__TEXT 88.2M 149
__UNICODE 556K 2
mapped file 40.3M 130
shared memory 8K 3
=========== ======= =======
TOTAL 2.7G 544
TOTAL, minus reserved VM space 2.7G 544
Model: MacBookPro10,1, BootROM MBP101.00EE.B12, 4 processors, Intel Core i7, 2.3 GHz, 16 GB, SMC 2.3f35
Graphics: Intel HD Graphics 4000, Intel HD Graphics 4000, Built-In
Graphics: NVIDIA GeForce GT 650M, NVIDIA GeForce GT 650M, PCIe, 1024 MB
Memory Module: BANK 0/DIMM0, 8 GB, DDR3, 1600 MHz, 0x802C, 0x384B54463531323634485A2D314736453120
Memory Module: BANK 1/DIMM0, 8 GB, DDR3, 1600 MHz, 0x802C, 0x384B54463531323634485A2D314736453120
AirPort: spairport_wireless_card_type_airport_extreme (0x14E4, 0xEF), Broadcom BCM43xx 1.0 (7.21.171.124.1a2)
Bluetooth: Version 5.0.4f18, 3 services, 27 devices, 1 incoming serial ports
Network Service: Thunderbolt Ethernet, Ethernet, en4
PCI Card: Apple 57762-A0, Ethernet Controller, Thunderbolt@195,0,0
Serial ATA Device: APPLE SSD SM512E, 500.28 GB
USB Device: USB 2.0 Bus
USB Device: Hub
USB Device: FaceTime HD Camera (Built-in)
USB Device: USB 2.0 Bus
USB Device: Hub
USB Device: Hub
USB Device: Apple Internal Keyboard / Trackpad
USB Device: BRCM20702 Hub
USB Device: Bluetooth USB Host Controller
USB Device: USB 3.0 Bus
Thunderbolt Bus: MacBook Pro, Apple Inc., 23.4
Thunderbolt Device: Thunderbolt to Gigabit Ethernet Adapter, Apple Inc., 3, 5.5
---
@Petermarcu commented on [Wed Jun 14 2017](https://github.com/dotnet/core/issues/599#issuecomment-308614973)
@m2b, any chance you can share more about which version of everything you are running and how you installed it? Through the installer or from the tar.gz?
---
@Petermarcu commented on [Wed Jun 28 2017](https://github.com/dotnet/core/issues/599#issuecomment-311856464)
I'm going to close this because we need more info. Please reopen if you have more info you can share. Thanks!
---
@jpcarrascal commented on [Sun Jul 09 2017](https://github.com/dotnet/core/issues/599#issuecomment-313953236)
I have exactly the same problem.
I am running OS X El Capitan (10.11.6).
`dotnet --version` output is 1.0.4.
I tried both the installer and running the binary from the tar.gz and the result is the same. The only difference is that in the former case I also get a "dotnet quit unexpectedly" window.
I'd appreciate any help and I'll be happy to provide more diagnostics information if needed.
Thanks!
JP
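
For anyone collecting the diagnostics requested above, here is a minimal shell sketch. All commands are standard macOS / .NET CLI tools; the `/usr/local/share/dotnet` path is the default used by the macOS installer and is an assumption — adjust it for a tar.gz layout:

```shell
# Gather the environment details typically requested for this kind of report.
# Each command degrades gracefully if it is unavailable on the current system.
dotnet --info 2>/dev/null || echo "dotnet is not on PATH"
sw_vers 2>/dev/null || true        # macOS product name/version (macOS only)
csrutil status 2>/dev/null || true # System Integrity Protection state (macOS only)
# Native libraries the dotnet host links against; the install path below is
# the macOS installer default (assumption) and may differ for a tar.gz install.
otool -L /usr/local/share/dotnet/dotnet 2>/dev/null || true
```

The SIP state is worth including because the crash report above shows "System Integrity Protection: disabled", and `otool -L` shows whether the host resolves its native dependencies from the expected locations.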
---
@richlander commented on [Sun Jul 09 2017](https://github.com/dotnet/core/issues/599#issuecomment-313994521)
/cc @livarcocc
_Copied from original issue: dotnet/cli#7777_
_Copied from original issue: dotnet/core-setup#3279_
reserved vm address space unallocated stack guard stack vm allocate vm allocate reserved reserved vm address space unallocated data linkedit text unicode mapped file shared memory total total minus reserved vm space model bootrom processors intel core ghz gb smc graphics intel hd graphics intel hd graphics built in graphics nvidia geforce gt nvidia geforce gt pcie mb memory module bank gb mhz memory module bank gb mhz airport spairport wireless card type airport extreme broadcom bluetooth version services devices incoming serial ports network service thunderbolt ethernet ethernet pci card apple ethernet controller thunderbolt serial ata device apple ssd gb usb device usb bus usb device hub usb device facetime hd camera built in usb device usb bus usb device hub usb device hub usb device apple internal keyboard trackpad usb device hub usb device bluetooth usb host controller usb device usb bus thunderbolt bus macbook pro apple inc thunderbolt device thunderbolt to gigabit ethernet adapter apple inc petermarcu commented on any chance you can share more about which version of everything you are running and how you installed it through the installer or from the tar gz petermarcu commented on i m going to close this because we need more info please reopen if you have more info you can share thanks jpcarrascal commented on i have exactly the same problem i am running os x el capitan dotnet version output is i tried both the installer and running the binary from the tar gz and the result is the same the only difference is that in the former case i also get a dotnet quit unexpectedly window i d appreciate any help and i ll be happy to provide more diagnostics information if needed thanks jp richlander commented on cc livarcocc copied from original issue dotnet cli copied from original issue dotnet core setup
| 1
|
13,950
| 24,047,441,084
|
IssuesEvent
|
2022-09-16 09:35:39
|
banillie/project_tracker
|
https://api.github.com/repos/banillie/project_tracker
|
closed
|
Recheck sql_proxy connection for development
|
learning requirement
|
Need to recheck whether I'm inadvertently touching the production db, when using the sql-proxy connect. The proxy should connect to a different db in the sql instance. However, some testing I recently did seemed to have appeared in the production db when I deployed it, which is a bit baffling.
|
1.0
|
Recheck sql_proxy connection for development - Need to recheck whether I'm inadvertently touching the production db, when using the sql-proxy connect. The proxy should connect to a different db in the sql instance. However, some testing I recently did seemed to have appeared in the production db when I deployed it, which is a bit baffling.
|
non_process
|
recheck sql proxy connection for development need to recheck whether i m inadvertently touching the production db when using the sql proxy connect the proxy should connect to a different db in the sql instance however some testing i recently did seemed to have appeared in the production db when i deployed it which is a bit baffling
| 0
|
328,838
| 24,200,824,159
|
IssuesEvent
|
2022-09-24 14:42:19
|
cahamo/cahamo.github.io
|
https://api.github.com/repos/cahamo/cahamo.github.io
|
closed
|
Update domain name used in cookies.html
|
bug documentation
|
The domain is referred to as `cahamo.github.io` - should now be `tips.delphidabbler.com`.
|
1.0
|
Update domain name used in cookies.html - The domain is referred to as `cahamo.github.io` - should now be `tips.delphidabbler.com`.
|
non_process
|
update domain name used in cookies html the domain is referred to as cahamo github io should now be tips delphidabbler com
| 0
|
21,039
| 27,979,788,306
|
IssuesEvent
|
2023-03-26 02:00:07
|
lizhihao6/get-daily-arxiv-noti
|
https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
|
opened
|
New submissions for Fri, 24 Mar 23
|
event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB
|
## Keyword: events
### Dense-Localizing Audio-Visual Events in Untrimmed Videos: A Large-Scale Benchmark and Baseline
- **Authors:** Tiantian Geng, Teng Wang, Jinming Duan, Runmin Cong, Feng Zheng
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Sound (cs.SD); Audio and Speech Processing (eess.AS)
- **Arxiv link:** https://arxiv.org/abs/2303.12930
- **Pdf link:** https://arxiv.org/pdf/2303.12930
- **Abstract**
Existing audio-visual event localization (AVE) handles manually trimmed videos with only a single instance in each of them. However, this setting is unrealistic as natural videos often contain numerous audio-visual events with different categories. To better adapt to real-life applications, in this paper we focus on the task of dense-localizing audio-visual events, which aims to jointly localize and recognize all audio-visual events occurring in an untrimmed video. The problem is challenging as it requires fine-grained audio-visual scene and context understanding. To tackle this problem, we introduce the first Untrimmed Audio-Visual (UnAV-100) dataset, which contains 10K untrimmed videos with over 30K audio-visual events. Each video has 2.8 audio-visual events on average, and the events are usually related to each other and might co-occur as in real-life scenes. Next, we formulate the task using a new learning-based framework, which is capable of fully integrating audio and visual modalities to localize audio-visual events with various lengths and capture dependencies between them in a single pass. Extensive experiments demonstrate the effectiveness of our method as well as the significance of multi-scale cross-modal perception and dependency modeling for this task.
### Ablating Concepts in Text-to-Image Diffusion Models
- **Authors:** Nupur Kumari, Bingliang Zhang, Sheng-Yu Wang, Eli Shechtman, Richard Zhang, Jun-Yan Zhu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Graphics (cs.GR); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2303.13516
- **Pdf link:** https://arxiv.org/pdf/2303.13516
- **Abstract**
Large-scale text-to-image diffusion models can generate high-fidelity images with powerful compositional ability. However, these models are typically trained on an enormous amount of Internet data, often containing copyrighted material, licensed images, and personal photos. Furthermore, they have been found to replicate the style of various living artists or memorize exact training samples. How can we remove such copyrighted concepts or images without retraining the model from scratch? To achieve this goal, we propose an efficient method of ablating concepts in the pretrained model, i.e., preventing the generation of a target concept. Our algorithm learns to match the image distribution for a target style, instance, or text prompt we wish to ablate to the distribution corresponding to an anchor concept. This prevents the model from generating target concepts given its text condition. Extensive experiments show that our method can successfully prevent the generation of the ablated concept while preserving closely related concepts in the model.
### Three ways to improve feature alignment for open vocabulary detection
- **Authors:** Relja Arandjelović, Alex Andonian, Arthur Mensch, Olivier J. Hénaff, Jean-Baptiste Alayrac, Andrew Zisserman
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2303.13518
- **Pdf link:** https://arxiv.org/pdf/2303.13518
- **Abstract**
The core problem in zero-shot open vocabulary detection is how to align visual and text features, so that the detector performs well on unseen classes. Previous approaches train the feature pyramid and detection head from scratch, which breaks the vision-text feature alignment established during pretraining, and struggles to prevent the language model from forgetting unseen classes. We propose three methods to alleviate these issues. Firstly, a simple scheme is used to augment the text embeddings which prevents overfitting to a small number of classes seen during training, while simultaneously saving memory and computation. Secondly, the feature pyramid network and the detection head are modified to include trainable gated shortcuts, which encourages vision-text feature alignment and guarantees it at the start of detection training. Finally, a self-training approach is used to leverage a larger corpus of image-text pairs thus improving detection performance on classes with no human annotated bounding boxes. Our three methods are evaluated on the zero-shot version of the LVIS benchmark, each of them showing clear and significant benefits. Our final network achieves the new stateof-the-art on the mAP-all metric and demonstrates competitive performance for mAP-rare, as well as superior transfer to COCO and Objects365.
## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
### Efficient Meshy Neural Fields for Animatable Human Avatars
- **Authors:** Xiaoke Huang, Yiji Cheng, Yansong Tang, Xiu Li, Jie Zhou, Jiwen Lu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Graphics (cs.GR)
- **Arxiv link:** https://arxiv.org/abs/2303.12965
- **Pdf link:** https://arxiv.org/pdf/2303.12965
- **Abstract**
Efficiently digitizing high-fidelity animatable human avatars from videos is a challenging and active research topic. Recent volume rendering-based neural representations open a new way for human digitization with their friendly usability and photo-realistic reconstruction quality. However, they are inefficient for long optimization times and slow inference speed; their implicit nature results in entangled geometry, materials, and dynamics of humans, which are hard to edit afterward. Such drawbacks prevent their direct applicability to downstream applications, especially the prominent rasterization-based graphic ones. We present EMA, a method that Efficiently learns Meshy neural fields to reconstruct animatable human Avatars. It jointly optimizes explicit triangular canonical mesh, spatial-varying material, and motion dynamics, via inverse rendering in an end-to-end fashion. Each above component is derived from separate neural fields, relaxing the requirement of a template, or rigging. The mesh representation is highly compatible with the efficient rasterization-based renderer, thus our method only takes about an hour of training and can render in real-time. Moreover, only minutes of optimization is enough for plausible reconstruction results. The disentanglement of meshes enables direct downstream applications. Extensive experiments illustrate the very competitive performance and significant speed boost against previous methods. We also showcase applications including novel pose synthesis, material editing, and relighting. The project page: https://xk-huang.github.io/ema/.
## Keyword: ISP
### CP$^3$: Channel Pruning Plug-in for Point-based Networks
- **Authors:** Yaomin Huang, Ning Liu, Zhengping Che, Zhiyuan Xu, Chaomin Shen, Yaxin Peng, Guixu Zhang, Xinmei Liu, Feifei Feng, Jian Tang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2303.13097
- **Pdf link:** https://arxiv.org/pdf/2303.13097
- **Abstract**
Channel pruning can effectively reduce both computational cost and memory footprint of the original network while keeping a comparable accuracy performance. Though great success has been achieved in channel pruning for 2D image-based convolutional networks (CNNs), existing works seldom extend the channel pruning methods to 3D point-based neural networks (PNNs). Directly implementing the 2D CNN channel pruning methods to PNNs undermine the performance of PNNs because of the different representations of 2D images and 3D point clouds as well as the network architecture disparity. In this paper, we proposed CP$^3$, which is a Channel Pruning Plug-in for Point-based network. CP$^3$ is elaborately designed to leverage the characteristics of point clouds and PNNs in order to enable 2D channel pruning methods for PNNs. Specifically, it presents a coordinate-enhanced channel importance metric to reflect the correlation between dimensional information and individual channel features, and it recycles the discarded points in PNN's sampling process and reconsiders their potentially-exclusive information to enhance the robustness of channel pruning. Experiments on various PNN architectures show that CP$^3$ constantly improves state-of-the-art 2D CNN pruning approaches on different point cloud tasks. For instance, our compressed PointNeXt-S on ScanObjectNN achieves an accuracy of 88.52% with a pruning rate of 57.8%, outperforming the baseline pruning methods with an accuracy gain of 1.94%.
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
### Improvement of Color Image Analysis Using a New Hybrid Face Recognition Algorithm based on Discrete Wavelets and Chebyshev Polynomials
- **Authors:** Hassan Mohamed Muhi-Aldeen, Maha Ammar Mustafa, Asma A. Abdulrahman, Jabbar Abed Eleiwy, Fouad S. Tahir, Yurii Khlaponin
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2303.13158
- **Pdf link:** https://arxiv.org/pdf/2303.13158
- **Abstract**
This work is unique in the use of discrete wavelets that were built from or derived from Chebyshev polynomials of the second and third kind, filter the Discrete Second Chebyshev Wavelets Transform (DSCWT), and derive two effective filters. The Filter Discrete Third Chebyshev Wavelets Transform (FDTCWT) is used in the process of analyzing color images and removing noise and impurities that accompany the image, as well as because of the large amount of data that makes up the image as it is taken. These data are massive, making it difficult to deal with each other during transmission. However to address this issue, the image compression technique is used, with the image not losing information due to the readings that were obtained, and the results were satisfactory. Mean Square Error (MSE), Peak Signal Noise Ratio (PSNR), Bit Per Pixel (BPP), and Compression Ratio (CR) Coronavirus is the initial treatment, while the processing stage is done with network training for Convolutional Neural Networks (CNN) with Discrete Second Chebeshev Wavelets Convolutional Neural Network (DSCWCNN) and Discrete Third Chebeshev Wavelets Convolutional Neural Network (DTCWCNN) to create an efficient algorithm for face recognition, and the best results were achieved in accuracy and in the least amount of time. Two samples of color images that were made or implemented were used. The proposed theory was obtained with fast and good results; the results are evident shown in the tables below.
### Enhancement of theColor Image Compression Using a New Algorithm based on Discrete Hermite Wavelet Transform
- **Authors:** Hassan Mohamed Muhi-Aldeen, Asma A. Abdulrahman, Jabbar Abed Eleiwy, Fouad S. Tahir, Yurii Khlaponin
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2303.13175
- **Pdf link:** https://arxiv.org/pdf/2303.13175
- **Abstract**
The Internet has turned the entire world into a small village;this is because it has made it possible to share millions of images and videos. However, sending and receiving a huge amount of data is considered to be a main challenge. To address this issue, a new algorithm is required to reduce image bits and represent the data in a compressed form. Nevertheless, image compression is an important application for transferring large files and images. This requires appropriate and efficient transfers in this field to achieve the task and reach the best results. In this work, we propose a new algorithm based on discrete Hermite wavelets transformation (DHWT) that shows the efficiency and quality of the color images. By compressing the color image, this method analyzes it and divides it into approximate coefficients and detail coefficients after adding the wavelets into MATLAB. With Multi-Resolution Analyses (MRA), the appropriate filter is derived, and the mathematical aspects prove to be validated by testing a new filter and performing its operation. After the decomposition of the rows and upon the process of the reconstruction, taking the inverse of the filter and dealing with the columns of the matrix, the original matrix is improved by measuring the parameters of the image to achieve the best quality of the resulting image, such as the peak signal-to-noise ratio (PSNR), compression ratio (CR), bits per pixel (BPP), and mean square error (MSE).
## Keyword: RAW
### Efficient Meshy Neural Fields for Animatable Human Avatars
- **Authors:** Xiaoke Huang, Yiji Cheng, Yansong Tang, Xiu Li, Jie Zhou, Jiwen Lu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Graphics (cs.GR)
- **Arxiv link:** https://arxiv.org/abs/2303.12965
- **Pdf link:** https://arxiv.org/pdf/2303.12965
- **Abstract**
Efficiently digitizing high-fidelity animatable human avatars from videos is a challenging and active research topic. Recent volume rendering-based neural representations open a new way for human digitization with their friendly usability and photo-realistic reconstruction quality. However, they are inefficient for long optimization times and slow inference speed; their implicit nature results in entangled geometry, materials, and dynamics of humans, which are hard to edit afterward. Such drawbacks prevent their direct applicability to downstream applications, especially the prominent rasterization-based graphic ones. We present EMA, a method that Efficiently learns Meshy neural fields to reconstruct animatable human Avatars. It jointly optimizes explicit triangular canonical mesh, spatial-varying material, and motion dynamics, via inverse rendering in an end-to-end fashion. Each above component is derived from separate neural fields, relaxing the requirement of a template, or rigging. The mesh representation is highly compatible with the efficient rasterization-based renderer, thus our method only takes about an hour of training and can render in real-time. Moreover, only minutes of optimization is enough for plausible reconstruction results. The disentanglement of meshes enables direct downstream applications. Extensive experiments illustrate the very competitive performance and significant speed boost against previous methods. We also showcase applications including novel pose synthesis, material editing, and relighting. The project page: https://xk-huang.github.io/ema/.
### VADER: Video Alignment Differencing and Retrieval
- **Authors:** Alexander Black, Simon Jenni, Tu Bui, Md. Mehrab Tanjim, Stefano Petrangeli, Ritwik Sinha, Viswanathan Swaminathan, John Collomosse
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.13193
- **Pdf link:** https://arxiv.org/pdf/2303.13193
- **Abstract**
We propose VADER, a spatio-temporal matching, alignment, and change summarization method to help fight misinformation spread via manipulated videos. VADER matches and coarsely aligns partial video fragments to candidate videos using a robust visual descriptor and scalable search over adaptively chunked video content. A transformer-based alignment module then refines the temporal localization of the query fragment within the matched video. A space-time comparator module identifies regions of manipulation between aligned content, invariant to any changes due to any residual temporal misalignments or artifacts arising from non-editorial changes of the content. Robustly matching video to a trusted source enables conclusions to be drawn on video provenance, enabling informed trust decisions on content encountered.
### Multi-granularity Interaction Simulation for Unsupervised Interactive Segmentation
- **Authors:** Kehan Li, Yian Zhao, Zhennan Wang, Zesen Cheng, Peng Jin, Xiangyang Ji, Li Yuan, Chang Liu, Jie Chen
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.13399
- **Pdf link:** https://arxiv.org/pdf/2303.13399
- **Abstract**
Interactive segmentation enables users to segment as needed by providing cues of objects, which introduces human-computer interaction for many fields, such as image editing and medical image analysis. Typically, massive and expansive pixel-level annotations are spent to train deep models by object-oriented interactions with manually labeled object masks. In this work, we reveal that informative interactions can be made by simulation with semantic-consistent yet diverse region exploration in an unsupervised paradigm. Concretely, we introduce a Multi-granularity Interaction Simulation (MIS) approach to open up a promising direction for unsupervised interactive segmentation. Drawing on the high-quality dense features produced by recent self-supervised models, we propose to gradually merge patches or regions with similar features to form more extensive regions and thus, every merged region serves as a semantic-meaningful multi-granularity proposal. By randomly sampling these proposals and simulating possible interactions based on them, we provide meaningful interaction at multiple granularities to teach the model to understand interactions. Our MIS significantly outperforms non-deep learning unsupervised methods and is even comparable with some previous deep-supervised methods without any annotation.
## Keyword: raw image
There is no result
|
2.0
|
New submissions for Fri, 24 Mar 23 - ## Keyword: events
### Dense-Localizing Audio-Visual Events in Untrimmed Videos: A Large-Scale Benchmark and Baseline
- **Authors:** Tiantian Geng, Teng Wang, Jinming Duan, Runmin Cong, Feng Zheng
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Sound (cs.SD); Audio and Speech Processing (eess.AS)
- **Arxiv link:** https://arxiv.org/abs/2303.12930
- **Pdf link:** https://arxiv.org/pdf/2303.12930
- **Abstract**
Existing audio-visual event localization (AVE) handles manually trimmed videos with only a single instance in each of them. However, this setting is unrealistic as natural videos often contain numerous audio-visual events with different categories. To better adapt to real-life applications, in this paper we focus on the task of dense-localizing audio-visual events, which aims to jointly localize and recognize all audio-visual events occurring in an untrimmed video. The problem is challenging as it requires fine-grained audio-visual scene and context understanding. To tackle this problem, we introduce the first Untrimmed Audio-Visual (UnAV-100) dataset, which contains 10K untrimmed videos with over 30K audio-visual events. Each video has 2.8 audio-visual events on average, and the events are usually related to each other and might co-occur as in real-life scenes. Next, we formulate the task using a new learning-based framework, which is capable of fully integrating audio and visual modalities to localize audio-visual events with various lengths and capture dependencies between them in a single pass. Extensive experiments demonstrate the effectiveness of our method as well as the significance of multi-scale cross-modal perception and dependency modeling for this task.
### Ablating Concepts in Text-to-Image Diffusion Models
- **Authors:** Nupur Kumari, Bingliang Zhang, Sheng-Yu Wang, Eli Shechtman, Richard Zhang, Jun-Yan Zhu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Graphics (cs.GR); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2303.13516
- **Pdf link:** https://arxiv.org/pdf/2303.13516
- **Abstract**
Large-scale text-to-image diffusion models can generate high-fidelity images with powerful compositional ability. However, these models are typically trained on an enormous amount of Internet data, often containing copyrighted material, licensed images, and personal photos. Furthermore, they have been found to replicate the style of various living artists or memorize exact training samples. How can we remove such copyrighted concepts or images without retraining the model from scratch? To achieve this goal, we propose an efficient method of ablating concepts in the pretrained model, i.e., preventing the generation of a target concept. Our algorithm learns to match the image distribution for a target style, instance, or text prompt we wish to ablate to the distribution corresponding to an anchor concept. This prevents the model from generating target concepts given its text condition. Extensive experiments show that our method can successfully prevent the generation of the ablated concept while preserving closely related concepts in the model.
### Three ways to improve feature alignment for open vocabulary detection
- **Authors:** Relja Arandjelović, Alex Andonian, Arthur Mensch, Olivier J. Hénaff, Jean-Baptiste Alayrac, Andrew Zisserman
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2303.13518
- **Pdf link:** https://arxiv.org/pdf/2303.13518
- **Abstract**
The core problem in zero-shot open vocabulary detection is how to align visual and text features, so that the detector performs well on unseen classes. Previous approaches train the feature pyramid and detection head from scratch, which breaks the vision-text feature alignment established during pretraining, and struggles to prevent the language model from forgetting unseen classes. We propose three methods to alleviate these issues. Firstly, a simple scheme is used to augment the text embeddings which prevents overfitting to a small number of classes seen during training, while simultaneously saving memory and computation. Secondly, the feature pyramid network and the detection head are modified to include trainable gated shortcuts, which encourages vision-text feature alignment and guarantees it at the start of detection training. Finally, a self-training approach is used to leverage a larger corpus of image-text pairs thus improving detection performance on classes with no human annotated bounding boxes. Our three methods are evaluated on the zero-shot version of the LVIS benchmark, each of them showing clear and significant benefits. Our final network achieves the new stateof-the-art on the mAP-all metric and demonstrates competitive performance for mAP-rare, as well as superior transfer to COCO and Objects365.
## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
### Efficient Meshy Neural Fields for Animatable Human Avatars
- **Authors:** Xiaoke Huang, Yiji Cheng, Yansong Tang, Xiu Li, Jie Zhou, Jiwen Lu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Graphics (cs.GR)
- **Arxiv link:** https://arxiv.org/abs/2303.12965
- **Pdf link:** https://arxiv.org/pdf/2303.12965
- **Abstract**
Efficiently digitizing high-fidelity animatable human avatars from videos is a challenging and active research topic. Recent volume rendering-based neural representations open a new way for human digitization with their friendly usability and photo-realistic reconstruction quality. However, they are inefficient for long optimization times and slow inference speed; their implicit nature results in entangled geometry, materials, and dynamics of humans, which are hard to edit afterward. Such drawbacks prevent their direct applicability to downstream applications, especially the prominent rasterization-based graphic ones. We present EMA, a method that Efficiently learns Meshy neural fields to reconstruct animatable human Avatars. It jointly optimizes explicit triangular canonical mesh, spatial-varying material, and motion dynamics, via inverse rendering in an end-to-end fashion. Each above component is derived from separate neural fields, relaxing the requirement of a template, or rigging. The mesh representation is highly compatible with the efficient rasterization-based renderer, thus our method only takes about an hour of training and can render in real-time. Moreover, only minutes of optimization is enough for plausible reconstruction results. The disentanglement of meshes enables direct downstream applications. Extensive experiments illustrate the very competitive performance and significant speed boost against previous methods. We also showcase applications including novel pose synthesis, material editing, and relighting. The project page: https://xk-huang.github.io/ema/.
## Keyword: ISP
### CP$^3$: Channel Pruning Plug-in for Point-based Networks
- **Authors:** Yaomin Huang, Ning Liu, Zhengping Che, Zhiyuan Xu, Chaomin Shen, Yaxin Peng, Guixu Zhang, Xinmei Liu, Feifei Feng, Jian Tang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2303.13097
- **Pdf link:** https://arxiv.org/pdf/2303.13097
- **Abstract**
Channel pruning can effectively reduce both computational cost and memory footprint of the original network while keeping a comparable accuracy performance. Though great success has been achieved in channel pruning for 2D image-based convolutional networks (CNNs), existing works seldom extend the channel pruning methods to 3D point-based neural networks (PNNs). Directly implementing the 2D CNN channel pruning methods to PNNs undermine the performance of PNNs because of the different representations of 2D images and 3D point clouds as well as the network architecture disparity. In this paper, we proposed CP$^3$, which is a Channel Pruning Plug-in for Point-based network. CP$^3$ is elaborately designed to leverage the characteristics of point clouds and PNNs in order to enable 2D channel pruning methods for PNNs. Specifically, it presents a coordinate-enhanced channel importance metric to reflect the correlation between dimensional information and individual channel features, and it recycles the discarded points in PNN's sampling process and reconsiders their potentially-exclusive information to enhance the robustness of channel pruning. Experiments on various PNN architectures show that CP$^3$ constantly improves state-of-the-art 2D CNN pruning approaches on different point cloud tasks. For instance, our compressed PointNeXt-S on ScanObjectNN achieves an accuracy of 88.52% with a pruning rate of 57.8%, outperforming the baseline pruning methods with an accuracy gain of 1.94%.
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
### Improvement of Color Image Analysis Using a New Hybrid Face Recognition Algorithm based on Discrete Wavelets and Chebyshev Polynomials
- **Authors:** Hassan Mohamed Muhi-Aldeen, Maha Ammar Mustafa, Asma A. Abdulrahman, Jabbar Abed Eleiwy, Fouad S. Tahir, Yurii Khlaponin
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2303.13158
- **Pdf link:** https://arxiv.org/pdf/2303.13158
- **Abstract**
This work is unique in the use of discrete wavelets that were built from or derived from Chebyshev polynomials of the second and third kind, filter the Discrete Second Chebyshev Wavelets Transform (DSCWT), and derive two effective filters. The Filter Discrete Third Chebyshev Wavelets Transform (FDTCWT) is used in the process of analyzing color images and removing noise and impurities that accompany the image, as well as because of the large amount of data that makes up the image as it is taken. These data are massive, making it difficult to deal with each other during transmission. However, to address this issue, the image compression technique is used, with the image not losing information due to the readings that were obtained, and the results were satisfactory. Mean Square Error (MSE), Peak Signal Noise Ratio (PSNR), Bit Per Pixel (BPP), and Compression Ratio (CR) are measured. Coronavirus is the initial treatment, while the processing stage is done with network training for Convolutional Neural Networks (CNN) with Discrete Second Chebyshev Wavelets Convolutional Neural Network (DSCWCNN) and Discrete Third Chebyshev Wavelets Convolutional Neural Network (DTCWCNN) to create an efficient algorithm for face recognition, and the best results were achieved in accuracy and in the least amount of time. Two samples of color images that were made or implemented were used. The proposed theory was obtained with fast and good results; the results are evident in the tables below.
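The quality measures the abstract relies on (MSE, PSNR, BPP, CR) have standard textbook definitions that can be written down directly; the helper names and the sample pixel values below are illustrative assumptions, not taken from the paper.

```python
import math

def mse(ref, test):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((a - b) ** 2 for a, b in zip(ref, test)) / len(ref)

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB (infinite for identical inputs)."""
    err = mse(ref, test)
    return float("inf") if err == 0 else 10 * math.log10(peak ** 2 / err)

def bpp(compressed_bits, num_pixels):
    """Bits per pixel of the compressed representation."""
    return compressed_bits / num_pixels

def compression_ratio(original_bits, compressed_bits):
    return original_bits / compressed_bits

ref = [52, 55, 61, 59]
deg = [54, 55, 60, 58]  # MSE 1.5, PSNR roughly 46.4 dB for 8-bit pixels
```

Lower MSE and higher PSNR indicate a closer reconstruction, while lower BPP and higher CR indicate stronger compression; a codec is tuned by trading these off.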
### Enhancement of the Color Image Compression Using a New Algorithm based on Discrete Hermite Wavelet Transform
- **Authors:** Hassan Mohamed Muhi-Aldeen, Asma A. Abdulrahman, Jabbar Abed Eleiwy, Fouad S. Tahir, Yurii Khlaponin
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2303.13175
- **Pdf link:** https://arxiv.org/pdf/2303.13175
- **Abstract**
The Internet has turned the entire world into a small village;this is because it has made it possible to share millions of images and videos. However, sending and receiving a huge amount of data is considered to be a main challenge. To address this issue, a new algorithm is required to reduce image bits and represent the data in a compressed form. Nevertheless, image compression is an important application for transferring large files and images. This requires appropriate and efficient transfers in this field to achieve the task and reach the best results. In this work, we propose a new algorithm based on discrete Hermite wavelets transformation (DHWT) that shows the efficiency and quality of the color images. By compressing the color image, this method analyzes it and divides it into approximate coefficients and detail coefficients after adding the wavelets into MATLAB. With Multi-Resolution Analyses (MRA), the appropriate filter is derived, and the mathematical aspects prove to be validated by testing a new filter and performing its operation. After the decomposition of the rows and upon the process of the reconstruction, taking the inverse of the filter and dealing with the columns of the matrix, the original matrix is improved by measuring the parameters of the image to achieve the best quality of the resulting image, such as the peak signal-to-noise ratio (PSNR), compression ratio (CR), bits per pixel (BPP), and mean square error (MSE).
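The discrete Hermite wavelet filters themselves are not given in the abstract. As a minimal stand-in for the decompose-then-reconstruct pipeline it describes, one level of a 1-D Haar transform (a different, much simpler wavelet, used here purely for illustration) can be sketched as:

```python
# One level of an orthonormal 1-D Haar wavelet transform, standing in
# for the paper's Hermite wavelet filters.
ROOT2 = 2 ** 0.5

def haar_decompose(signal):
    """Even-length signal -> (approximation, detail) coefficient lists."""
    approx = [(signal[i] + signal[i + 1]) / ROOT2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / ROOT2 for i in range(0, len(signal), 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Inverse transform: interleave (a+d)/sqrt(2) and (a-d)/sqrt(2)."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / ROOT2, (a - d) / ROOT2])
    return out

x = [4.0, 2.0, 5.0, 7.0]
approx, detail = haar_decompose(x)
# Perfect reconstruction up to floating-point rounding.
assert max(abs(a - b) for a, b in zip(x, haar_reconstruct(approx, detail))) < 1e-9
```

A codec then quantizes or discards small detail coefficients before encoding, which is where the compression, and hence the PSNR/MSE trade-off reported in the paper, comes from.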
## Keyword: RAW
### Efficient Meshy Neural Fields for Animatable Human Avatars
- **Authors:** Xiaoke Huang, Yiji Cheng, Yansong Tang, Xiu Li, Jie Zhou, Jiwen Lu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Graphics (cs.GR)
- **Arxiv link:** https://arxiv.org/abs/2303.12965
- **Pdf link:** https://arxiv.org/pdf/2303.12965
- **Abstract**
Efficiently digitizing high-fidelity animatable human avatars from videos is a challenging and active research topic. Recent volume rendering-based neural representations open a new way for human digitization with their friendly usability and photo-realistic reconstruction quality. However, they are inefficient for long optimization times and slow inference speed; their implicit nature results in entangled geometry, materials, and dynamics of humans, which are hard to edit afterward. Such drawbacks prevent their direct applicability to downstream applications, especially the prominent rasterization-based graphic ones. We present EMA, a method that Efficiently learns Meshy neural fields to reconstruct animatable human Avatars. It jointly optimizes explicit triangular canonical mesh, spatial-varying material, and motion dynamics, via inverse rendering in an end-to-end fashion. Each above component is derived from separate neural fields, relaxing the requirement of a template, or rigging. The mesh representation is highly compatible with the efficient rasterization-based renderer, thus our method only takes about an hour of training and can render in real-time. Moreover, only minutes of optimization is enough for plausible reconstruction results. The disentanglement of meshes enables direct downstream applications. Extensive experiments illustrate the very competitive performance and significant speed boost against previous methods. We also showcase applications including novel pose synthesis, material editing, and relighting. The project page: https://xk-huang.github.io/ema/.
### VADER: Video Alignment Differencing and Retrieval
- **Authors:** Alexander Black, Simon Jenni, Tu Bui, Md. Mehrab Tanjim, Stefano Petrangeli, Ritwik Sinha, Viswanathan Swaminathan, John Collomosse
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.13193
- **Pdf link:** https://arxiv.org/pdf/2303.13193
- **Abstract**
We propose VADER, a spatio-temporal matching, alignment, and change summarization method to help fight misinformation spread via manipulated videos. VADER matches and coarsely aligns partial video fragments to candidate videos using a robust visual descriptor and scalable search over adaptively chunked video content. A transformer-based alignment module then refines the temporal localization of the query fragment within the matched video. A space-time comparator module identifies regions of manipulation between aligned content, invariant to any changes due to any residual temporal misalignments or artifacts arising from non-editorial changes of the content. Robustly matching video to a trusted source enables conclusions to be drawn on video provenance, enabling informed trust decisions on content encountered.
### Multi-granularity Interaction Simulation for Unsupervised Interactive Segmentation
- **Authors:** Kehan Li, Yian Zhao, Zhennan Wang, Zesen Cheng, Peng Jin, Xiangyang Ji, Li Yuan, Chang Liu, Jie Chen
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.13399
- **Pdf link:** https://arxiv.org/pdf/2303.13399
- **Abstract**
Interactive segmentation enables users to segment as needed by providing cues of objects, which introduces human-computer interaction for many fields, such as image editing and medical image analysis. Typically, massive and expansive pixel-level annotations are spent to train deep models by object-oriented interactions with manually labeled object masks. In this work, we reveal that informative interactions can be made by simulation with semantic-consistent yet diverse region exploration in an unsupervised paradigm. Concretely, we introduce a Multi-granularity Interaction Simulation (MIS) approach to open up a promising direction for unsupervised interactive segmentation. Drawing on the high-quality dense features produced by recent self-supervised models, we propose to gradually merge patches or regions with similar features to form more extensive regions and thus, every merged region serves as a semantic-meaningful multi-granularity proposal. By randomly sampling these proposals and simulating possible interactions based on them, we provide meaningful interaction at multiple granularities to teach the model to understand interactions. Our MIS significantly outperforms non-deep learning unsupervised methods and is even comparable with some previous deep-supervised methods without any annotation.
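The merging step described above can be caricatured as a greedy loop that repeatedly fuses the pair of regions with the most similar features until none are close enough. The scalar feature model, the threshold, and the function name are illustrative assumptions, not the MIS implementation.

```python
def merge_regions(features, threshold):
    """features: region id -> (mean_feature, size).
    Greedily merge the closest pair until no pair is within `threshold`."""
    regions = dict(features)
    while len(regions) > 1:
        ids = list(regions)
        dist, a, b = min(
            (abs(regions[p][0] - regions[q][0]), p, q)
            for i, p in enumerate(ids) for q in ids[i + 1:]
        )
        if dist > threshold:
            break
        (fa, na), (fb, nb) = regions.pop(a), regions.pop(b)
        # Size-weighted mean keeps the merged feature representative.
        regions[a] = ((fa * na + fb * nb) / (na + nb), na + nb)
    return regions

proposals = merge_regions({1: (0.10, 4), 2: (0.12, 4), 3: (0.90, 8)}, threshold=0.1)
```

Each intermediate state of `regions` corresponds to one granularity level; sampling proposals across these levels is what yields the multi-granularity interactions the paper simulates.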
## Keyword: raw image
There is no result
|
process
|
new submissions for fri mar keyword events dense localizing audio visual events in untrimmed videos a large scale benchmark and baseline authors tiantian geng teng wang jinming duan runmin cong feng zheng subjects computer vision and pattern recognition cs cv multimedia cs mm sound cs sd audio and speech processing eess as arxiv link pdf link abstract existing audio visual event localization ave handles manually trimmed videos with only a single instance in each of them however this setting is unrealistic as natural videos often contain numerous audio visual events with different categories to better adapt to real life applications in this paper we focus on the task of dense localizing audio visual events which aims to jointly localize and recognize all audio visual events occurring in an untrimmed video the problem is challenging as it requires fine grained audio visual scene and context understanding to tackle this problem we introduce the first untrimmed audio visual unav dataset which contains untrimmed videos with over audio visual events each video has audio visual events on average and the events are usually related to each other and might co occur as in real life scenes next we formulate the task using a new learning based framework which is capable of fully integrating audio and visual modalities to localize audio visual events with various lengths and capture dependencies between them in a single pass extensive experiments demonstrate the effectiveness of our method as well as the significance of multi scale cross modal perception and dependency modeling for this task ablating concepts in text to image diffusion models authors nupur kumari bingliang zhang sheng yu wang eli shechtman richard zhang jun yan zhu subjects computer vision and pattern recognition cs cv graphics cs gr machine learning cs lg arxiv link pdf link abstract large scale text to image diffusion models can generate high fidelity images with powerful compositional ability however these 
models are typically trained on an enormous amount of internet data often containing copyrighted material licensed images and personal photos furthermore they have been found to replicate the style of various living artists or memorize exact training samples how can we remove such copyrighted concepts or images without retraining the model from scratch to achieve this goal we propose an efficient method of ablating concepts in the pretrained model i e preventing the generation of a target concept our algorithm learns to match the image distribution for a target style instance or text prompt we wish to ablate to the distribution corresponding to an anchor concept this prevents the model from generating target concepts given its text condition extensive experiments show that our method can successfully prevent the generation of the ablated concept while preserving closely related concepts in the model three ways to improve feature alignment for open vocabulary detection authors relja arandjelović alex andonian arthur mensch olivier j hénaff jean baptiste alayrac andrew zisserman subjects computer vision and pattern recognition cs cv artificial intelligence cs ai machine learning cs lg arxiv link pdf link abstract the core problem in zero shot open vocabulary detection is how to align visual and text features so that the detector performs well on unseen classes previous approaches train the feature pyramid and detection head from scratch which breaks the vision text feature alignment established during pretraining and struggles to prevent the language model from forgetting unseen classes we propose three methods to alleviate these issues firstly a simple scheme is used to augment the text embeddings which prevents overfitting to a small number of classes seen during training while simultaneously saving memory and computation secondly the feature pyramid network and the detection head are modified to include trainable gated shortcuts which encourages vision text 
feature alignment and guarantees it at the start of detection training finally a self training approach is used to leverage a larger corpus of image text pairs thus improving detection performance on classes with no human annotated bounding boxes our three methods are evaluated on the zero shot version of the lvis benchmark each of them showing clear and significant benefits our final network achieves the new stateof the art on the map all metric and demonstrates competitive performance for map rare as well as superior transfer to coco and keyword event camera there is no result keyword events camera there is no result keyword white balance there is no result keyword color contrast there is no result keyword awb efficient meshy neural fields for animatable human avatars authors xiaoke huang yiji cheng yansong tang xiu li jie zhou jiwen lu subjects computer vision and pattern recognition cs cv graphics cs gr arxiv link pdf link abstract efficiently digitizing high fidelity animatable human avatars from videos is a challenging and active research topic recent volume rendering based neural representations open a new way for human digitization with their friendly usability and photo realistic reconstruction quality however they are inefficient for long optimization times and slow inference speed their implicit nature results in entangled geometry materials and dynamics of humans which are hard to edit afterward such drawbacks prevent their direct applicability to downstream applications especially the prominent rasterization based graphic ones we present ema a method that efficiently learns meshy neural fields to reconstruct animatable human avatars it jointly optimizes explicit triangular canonical mesh spatial varying material and motion dynamics via inverse rendering in an end to end fashion each above component is derived from separate neural fields relaxing the requirement of a template or rigging the mesh representation is highly compatible with the efficient 
rasterization based renderer thus our method only takes about an hour of training and can render in real time moreover only minutes of optimization is enough for plausible reconstruction results the disentanglement of meshes enables direct downstream applications extensive experiments illustrate the very competitive performance and significant speed boost against previous methods we also showcase applications including novel pose synthesis material editing and relighting the project page keyword isp cp channel pruning plug in for point based networks authors yaomin huang ning liu zhengping che zhiyuan xu chaomin shen yaxin peng guixu zhang xinmei liu feifei feng jian tang subjects computer vision and pattern recognition cs cv artificial intelligence cs ai arxiv link pdf link abstract channel pruning can effectively reduce both computational cost and memory footprint of the original network while keeping a comparable accuracy performance though great success has been achieved in channel pruning for image based convolutional networks cnns existing works seldom extend the channel pruning methods to point based neural networks pnns directly implementing the cnn channel pruning methods to pnns undermine the performance of pnns because of the different representations of images and point clouds as well as the network architecture disparity in this paper we proposed cp which is a channel pruning plug in for point based network cp is elaborately designed to leverage the characteristics of point clouds and pnns in order to enable channel pruning methods for pnns specifically it presents a coordinate enhanced channel importance metric to reflect the correlation between dimensional information and individual channel features and it recycles the discarded points in pnn s sampling process and reconsiders their potentially exclusive information to enhance the robustness of channel pruning experiments on various pnn architectures show that cp constantly improves state of the art 
cnn pruning approaches on different point cloud tasks for instance our compressed pointnext s on scanobjectnn achieves an accuracy of with a pruning rate of outperforming the baseline pruning methods with an accuracy gain of keyword image signal processing there is no result keyword image signal process there is no result keyword compression improvement of color image analysis using a new hybrid face recognition algorithm based on discrete wavelets and chebyshev polynomials authors hassan mohamed muhi aldeen maha ammar mustafa asma a abdulrahman jabbar abed eleiwy fouad s tahir yurii khlaponin subjects computer vision and pattern recognition cs cv image and video processing eess iv arxiv link pdf link abstract this work is unique in the use of discrete wavelets that were built from or derived from chebyshev polynomials of the second and third kind filter the discrete second chebyshev wavelets transform dscwt and derive two effective filters the filter discrete third chebyshev wavelets transform fdtcwt is used in the process of analyzing color images and removing noise and impurities that accompany the image as well as because of the large amount of data that makes up the image as it is taken these data are massive making it difficult to deal with each other during transmission however to address this issue the image compression technique is used with the image not losing information due to the readings that were obtained and the results were satisfactory mean square error mse peak signal noise ratio psnr bit per pixel bpp and compression ratio cr coronavirus is the initial treatment while the processing stage is done with network training for convolutional neural networks cnn with discrete second chebeshev wavelets convolutional neural network dscwcnn and discrete third chebeshev wavelets convolutional neural network dtcwcnn to create an efficient algorithm for face recognition and the best results were achieved in accuracy and in the least amount of time two 
samples of color images that were made or implemented were used the proposed theory was obtained with fast and good results the results are evident shown in the tables below enhancement of thecolor image compression using a new algorithm based on discrete hermite wavelet transform authors hassan mohamed muhi aldeen asma a abdulrahman jabbar abed eleiwy fouad s tahir yurii khlaponin subjects computer vision and pattern recognition cs cv image and video processing eess iv arxiv link pdf link abstract the internet has turned the entire world into a small village this is because it has made it possible to share millions of images and videos however sending and receiving a huge amount of data is considered to be a main challenge to address this issue a new algorithm is required to reduce image bits and represent the data in a compressed form nevertheless image compression is an important application for transferring large files and images this requires appropriate and efficient transfers in this field to achieve the task and reach the best results in this work we propose a new algorithm based on discrete hermite wavelets transformation dhwt that shows the efficiency and quality of the color images by compressing the color image this method analyzes it and divides it into approximate coefficients and detail coefficients after adding the wavelets into matlab with multi resolution analyses mra the appropriate filter is derived and the mathematical aspects prove to be validated by testing a new filter and performing its operation after the decomposition of the rows and upon the process of the reconstruction taking the inverse of the filter and dealing with the columns of the matrix the original matrix is improved by measuring the parameters of the image to achieve the best quality of the resulting image such as the peak signal to noise ratio psnr compression ratio cr bits per pixel bpp and mean square error mse keyword raw efficient meshy neural fields for animatable human 
avatars authors xiaoke huang yiji cheng yansong tang xiu li jie zhou jiwen lu subjects computer vision and pattern recognition cs cv graphics cs gr arxiv link pdf link abstract efficiently digitizing high fidelity animatable human avatars from videos is a challenging and active research topic recent volume rendering based neural representations open a new way for human digitization with their friendly usability and photo realistic reconstruction quality however they are inefficient for long optimization times and slow inference speed their implicit nature results in entangled geometry materials and dynamics of humans which are hard to edit afterward such drawbacks prevent their direct applicability to downstream applications especially the prominent rasterization based graphic ones we present ema a method that efficiently learns meshy neural fields to reconstruct animatable human avatars it jointly optimizes explicit triangular canonical mesh spatial varying material and motion dynamics via inverse rendering in an end to end fashion each above component is derived from separate neural fields relaxing the requirement of a template or rigging the mesh representation is highly compatible with the efficient rasterization based renderer thus our method only takes about an hour of training and can render in real time moreover only minutes of optimization is enough for plausible reconstruction results the disentanglement of meshes enables direct downstream applications extensive experiments illustrate the very competitive performance and significant speed boost against previous methods we also showcase applications including novel pose synthesis material editing and relighting the project page vader video alignment differencing and retrieval authors alexander black simon jenni tu bui md mehrab tanjim stefano petrangeli ritwik sinha viswanathan swaminathan john collomosse subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract we propose vader a 
spatio temporal matching alignment and change summarization method to help fight misinformation spread via manipulated videos vader matches and coarsely aligns partial video fragments to candidate videos using a robust visual descriptor and scalable search over adaptively chunked video content a transformer based alignment module then refines the temporal localization of the query fragment within the matched video a space time comparator module identifies regions of manipulation between aligned content invariant to any changes due to any residual temporal misalignments or artifacts arising from non editorial changes of the content robustly matching video to a trusted source enables conclusions to be drawn on video provenance enabling informed trust decisions on content encountered multi granularity interaction simulation for unsupervised interactive segmentation authors kehan li yian zhao zhennan wang zesen cheng peng jin xiangyang ji li yuan chang liu jie chen subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract interactive segmentation enables users to segment as needed by providing cues of objects which introduces human computer interaction for many fields such as image editing and medical image analysis typically massive and expansive pixel level annotations are spent to train deep models by object oriented interactions with manually labeled object masks in this work we reveal that informative interactions can be made by simulation with semantic consistent yet diverse region exploration in an unsupervised paradigm concretely we introduce a multi granularity interaction simulation mis approach to open up a promising direction for unsupervised interactive segmentation drawing on the high quality dense features produced by recent self supervised models we propose to gradually merge patches or regions with similar features to form more extensive regions and thus every merged region serves as a semantic meaningful multi granularity 
proposal by randomly sampling these proposals and simulating possible interactions based on them we provide meaningful interaction at multiple granularities to teach the model to understand interactions our mis significantly outperforms non deep learning unsupervised methods and is even comparable with some previous deep supervised methods without any annotation keyword raw image there is no result
| 1
|
815,859
| 30,575,291,884
|
IssuesEvent
|
2023-07-21 04:32:30
|
Aadesh-Baral/OSMLocalizer
|
https://api.github.com/repos/Aadesh-Baral/OSMLocalizer
|
closed
|
Validated localized features are not being counted in endpoint `stats/home` after validation mode.
|
bug Priority: High
|
We have a field in the stats/home endpoint that returns the total features localized, but features that have been validated aren't being counted as localized because their status is set to validated. These features should still be counted as localized.
|
1.0
|
Validated localized features are not being counted in endpoint `stats/home` after validation mode. - We have a field in the stats/home endpoint that returns the total features localized, but features that have been validated aren't being counted as localized because their status is set to validated. These features should still be counted as localized.
|
non_process
|
validated localized features are not being counted in endpoint stats home after validation mode we have a field in stats home endpoint that returns total features localized but those features that are validated aren t being counted as localized due to status set as validated these features should still be counted as validated
| 0
|
14,880
| 18,287,167,514
|
IssuesEvent
|
2021-10-05 11:38:15
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[PM] Alignment is not proper in My account screen
|
Bug P2 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
|
A/R:- Alignment is not proper for **Update** and **Cancel** buttons
E/R:- Alignment should be proper for all the elements on the screen

|
3.0
|
[PM] Alignment is not proper in My account screen - A/R:- Alignment is not proper for **Update** and **Cancel** buttons
E/R:- Alignment should be proper for all the elements on the screen

|
process
|
alignment is not proper in my account screen a r alignment is not proper for update and cancel buttons e r alignment should be proper for all the elements on the screen
| 1
|
17,256
| 23,039,491,456
|
IssuesEvent
|
2022-07-23 00:37:25
|
alchemistry/alchemlyb
|
https://api.github.com/repos/alchemistry/alchemlyb
|
opened
|
upgrade to use pymbar 4
|
enhancement preprocessors estimators docs CI
|
pymbar 4 has been released and pymbar 3 will not be maintained anymore (see Discussion #205).
alchemlyb will move to pymbar 4 and we need to
- [ ] update CI and dependency handling for pymbar >= 4
- [ ] update code to use new API
- [ ] update docs (installation etc)
- [ ] update conda package https://github.com/conda-forge/alchemlyb-feedstock/
|
1.0
|
upgrade to use pymbar 4 - pymbar 4 has been released and pymbar 3 will not be maintained anymore (see Discussion #205).
alchemlyb will move to pymbar 4 and we need to
- [ ] update CI and dependency handling for pymbar >= 4
- [ ] update code to use new API
- [ ] update docs (installation etc)
- [ ] update conda package https://github.com/conda-forge/alchemlyb-feedstock/
|
process
|
upgrade to use pymbar pymbar has been released and pymbar will not me maintained anymore see discussion alchemlyb will move to pymbar and we need to update ci and dependency handling for pymbar update code to use new api update docs installation etc update conda package
| 1
|
9,749
| 12,735,896,450
|
IssuesEvent
|
2020-06-25 16:00:51
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Several incorrect functions appear on Azure DevOps 2019 edition
|
Pri2 devops-cicd-process/tech devops/prod
|
None of these functions are available on 2019:
- `lower`
- `upper`
- `replace`
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 77c58a78-a567-e99a-9eb7-62dddd1b90b6
* Version Independent ID: 680a79bc-11de-39fc-43e3-e07dc762db18
* Content: [Expressions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops-2019#functions)
* Content Source: [docs/pipelines/process/expressions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/expressions.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Several incorrect functions appear on Azure DevOps 2019 edition - None of these functions are available on 2019:
- `lower`
- `upper`
- `replace`
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 77c58a78-a567-e99a-9eb7-62dddd1b90b6
* Version Independent ID: 680a79bc-11de-39fc-43e3-e07dc762db18
* Content: [Expressions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops-2019#functions)
* Content Source: [docs/pipelines/process/expressions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/expressions.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
several incorrect functions appear on azure devops edition none of these functions are available on lower upper replace document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
65,872
| 14,761,954,506
|
IssuesEvent
|
2021-01-09 01:08:25
|
jgeraigery/gradle
|
https://api.github.com/repos/jgeraigery/gradle
|
opened
|
CVE-2020-36188 (Medium) detected in jackson-databind-2.8.11.1.jar, jackson-databind-2.9.4.jar
|
security vulnerability
|
## CVE-2020-36188 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.8.11.1.jar</b>, <b>jackson-databind-2.9.4.jar</b></p></summary>
<p>
<details><summary><b>jackson-databind-2.8.11.1.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: gradle/subprojects/docs/src/snippets/play/multiproject/groovy/modules/user/build.gradle</p>
<p>Path to vulnerable library: gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.11.1/341edc63fdd8b44e17b2c36abbc9b451d8fd05a5/jackson-databind-2.8.11.1.jar</p>
<p>
Dependency Hierarchy:
- play-guice_2.12-2.6.15.jar (Root Library)
- play_2.12-2.6.15.jar
- jjwt-0.7.0.jar
- :x: **jackson-databind-2.8.11.1.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.9.4.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: gradle/subprojects/docs/src/snippets/kotlinDsl/multiProjectBuild/kotlin/http/build.gradle.kts</p>
<p>Path to vulnerable library: gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.4/498bbc3b94f566982c7f7c6d4d303fce365529be/jackson-databind-2.9.4.jar</p>
<p>
Dependency Hierarchy:
- ratpack-core-1.5.4.jar (Root Library)
- :x: **jackson-databind-2.9.4.jar** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.JNDIConnectionSource.
<p>Publish Date: 2021-01-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36188>CVE-2020-36188</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.2</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2996">https://github.com/FasterXML/jackson-databind/issues/2996</a></p>
<p>Release Date: 2021-01-06</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.11.1","isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-guice_2.12:2.6.15;com.typesafe.play:play_2.12:2.6.15;io.jsonwebtoken:jjwt:0.7.0;com.fasterxml.jackson.core:jackson-databind:2.8.11.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.4","isTransitiveDependency":true,"dependencyTree":"io.ratpack:ratpack-core:1.5.4;com.fasterxml.jackson.core:jackson-databind:2.9.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"}],"vulnerabilityIdentifier":"CVE-2020-36188","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.JNDIConnectionSource.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36188","cvss3Severity":"medium","cvss3Score":"4.2","cvss3Metrics":{"A":"Low","AC":"High","PR":"Low","S":"Unchanged","C":"Low","UI":"Required","AV":"Local","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-36188 (Medium) detected in jackson-databind-2.8.11.1.jar, jackson-databind-2.9.4.jar - ## CVE-2020-36188 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.8.11.1.jar</b>, <b>jackson-databind-2.9.4.jar</b></p></summary>
<p>
<details><summary><b>jackson-databind-2.8.11.1.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: gradle/subprojects/docs/src/snippets/play/multiproject/groovy/modules/user/build.gradle</p>
<p>Path to vulnerable library: gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.8.11.1/341edc63fdd8b44e17b2c36abbc9b451d8fd05a5/jackson-databind-2.8.11.1.jar</p>
<p>
Dependency Hierarchy:
- play-guice_2.12-2.6.15.jar (Root Library)
- play_2.12-2.6.15.jar
- jjwt-0.7.0.jar
- :x: **jackson-databind-2.8.11.1.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.9.4.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: gradle/subprojects/docs/src/snippets/kotlinDsl/multiProjectBuild/kotlin/http/build.gradle.kts</p>
<p>Path to vulnerable library: gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.4/498bbc3b94f566982c7f7c6d4d303fce365529be/jackson-databind-2.9.4.jar</p>
<p>
Dependency Hierarchy:
- ratpack-core-1.5.4.jar (Root Library)
- :x: **jackson-databind-2.9.4.jar** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.JNDIConnectionSource.
<p>Publish Date: 2021-01-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36188>CVE-2020-36188</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.2</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2996">https://github.com/FasterXML/jackson-databind/issues/2996</a></p>
<p>Release Date: 2021-01-06</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.11.1","isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-guice_2.12:2.6.15;com.typesafe.play:play_2.12:2.6.15;io.jsonwebtoken:jjwt:0.7.0;com.fasterxml.jackson.core:jackson-databind:2.8.11.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.4","isTransitiveDependency":true,"dependencyTree":"io.ratpack:ratpack-core:1.5.4;com.fasterxml.jackson.core:jackson-databind:2.9.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"}],"vulnerabilityIdentifier":"CVE-2020-36188","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.JNDIConnectionSource.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36188","cvss3Severity":"medium","cvss3Score":"4.2","cvss3Metrics":{"A":"Low","AC":"High","PR":"Low","S":"Unchanged","C":"Low","UI":"Required","AV":"Local","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in jackson databind jar jackson databind jar cve medium severity vulnerability vulnerable libraries jackson databind jar jackson databind jar jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file gradle subprojects docs src snippets play multiproject groovy modules user build gradle path to vulnerable library gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy play guice jar root library play jar jjwt jar x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file gradle subprojects docs src snippets kotlindsl multiprojectbuild kotlin http build gradle kts path to vulnerable library gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy ratpack core jar root library x jackson databind jar vulnerable library vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to com newrelic agent deps ch qos logback core db jndiconnectionsource publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction required scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to com newrelic agent deps ch 
qos logback core db jndiconnectionsource vulnerabilityurl
| 0
|
11,892
| 14,688,497,433
|
IssuesEvent
|
2021-01-02 03:10:06
|
e4exp/paper_manager_abstract
|
https://api.github.com/repos/e4exp/paper_manager_abstract
|
opened
|
Shortformer: Better Language Modeling using Shorter Inputs
|
2020 Natural Language Processing Transformer _read_later
|
* https://arxiv.org/abs/2012.15832
* 2020
We explore the benefits of reducing the input length of transformers.
First, we show that initially training a model on short subsequences before moving on to longer ones reduces overall training time and, surprisingly, substantially improves perplexity.
Next, we show how to improve the efficiency of recurrence methods in transformers, which let a model condition on previously processed tokens (when generating sequences larger than the maximum length the transformer can handle at once).
Existing methods require computationally expensive relative position embeddings; we introduce a simple alternative of adding absolute position embeddings to queries and keys instead of to word embeddings, which efficiently produces superior results.
Combining these techniques speeds up training by 65%, makes generation nine times faster, and substantially improves perplexity on WikiText-103, without adding any parameters.
|
1.0
|
Shortformer: Better Language Modeling using Shorter Inputs - * https://arxiv.org/abs/2012.15832
* 2020
We explore the benefits of reducing the input length of transformers.
First, we show that initially training a model on short subsequences before moving on to longer ones reduces overall training time and, surprisingly, substantially improves perplexity.
Next, we show how to improve the efficiency of recurrence methods in transformers, which let a model condition on previously processed tokens (when generating sequences larger than the maximum length the transformer can handle at once).
Existing methods require computationally expensive relative position embeddings; we introduce a simple alternative of adding absolute position embeddings to queries and keys instead of to word embeddings, which efficiently produces superior results.
Combining these techniques speeds up training by 65%, makes generation nine times faster, and substantially improves perplexity on WikiText-103, without adding any parameters.
|
process
|
shortformer better language modeling using shorter inputs we explore the benefits of reducing transformer input length first we show that initially training a model on short subsequences before moving on to longer ones reduces overall training time and surprisingly substantially improves perplexity next we show how to improve the efficiency of recurrence methods in transformers which let a model condition on previously processed tokens when generating sequences larger than the maximum length the transformer can handle at once existing methods require computationally expensive relative position embeddings we introduce a simple alternative of adding absolute position embeddings to queries and keys instead of to word embeddings which efficiently produces superior results combining these techniques speeds up training makes generation faster and substantially improves perplexity on wikitext without adding parameters
| 1
|
113,118
| 14,369,208,146
|
IssuesEvent
|
2020-12-01 09:28:24
|
dspeyer/ritualEngine
|
https://api.github.com/repos/dspeyer/ritualEngine
|
closed
|
Decide whether spectators can see videoconferences
|
Needs Design
|
The primary reason against is that they have to take up a slot in a Twilio room, plus privacy-expectations stuff.
|
1.0
|
Decide whether spectators can see videoconferences - The primary reason against is that they have to take up a slot in a Twilio room, plus privacy-expectations stuff.
|
non_process
|
decide whether spectators can see videoconferences the primary reason against is that they have to take up a slot in a twilio room plus privacy expectations stuff
| 0
|
3,427
| 6,529,491,163
|
IssuesEvent
|
2017-08-30 11:53:43
|
DynareTeam/dynare
|
https://api.github.com/repos/DynareTeam/dynare
|
opened
|
Backward models with lagged exogenous variables
|
bug preprocessor
|
The Jacobian matrix of backward models with lagged exogenous variables is wrong, columns for the lagged exogenous variables are missing. It is unclear to me why we do not simply add auxiliary endogenous variables for the lagged exogenous variables (as in the case of backward/forward models). I suppose that the simplest fix is to add auxiliary variables (rather than augmenting the size of the Jacobian matrix).
An example is available [here](https://mnemosyne.adjemian.eu/snippets/1)
|
1.0
|
Backward models with lagged exogenous variables - The Jacobian matrix of backward models with lagged exogenous variables is wrong, columns for the lagged exogenous variables are missing. It is unclear to me why we do not simply add auxiliary endogenous variables for the lagged exogenous variables (as in the case of backward/forward models). I suppose that the simplest fix is to add auxiliary variables (rather than augmenting the size of the Jacobian matrix).
An example is available [here](https://mnemosyne.adjemian.eu/snippets/1)
|
process
|
backward models with lagged exogenous variables the jacobian matrix of backward models with lagged exogenous variables is wrong columns for the lagged exogenous variables are missing it is unclear to me why we do not simply add auxiliary endogenous variables for the lagged exogenous variables as in the case of backward forward models i suppose that the simplest fix is to add auxiliary variables rather than augmenting the size of the jacobian matrix an example is available
| 1
|
341,788
| 30,600,008,693
|
IssuesEvent
|
2023-07-22 08:43:53
|
opensearch-project/index-management
|
https://api.github.com/repos/opensearch-project/index-management
|
closed
|
[AUTOCUT] Integration Test failed for index-management: 2.8.0 rpm distribution
|
autocut v2.8.0 integ-test-failure
|
The integration test failed at distribution level for component index-management<br>Version: 2.8.0<br>Distribution: rpm<br>Architecture: arm64<br>Platform: linux<br><br>Please check the logs: https://build.ci.opensearch.org/job/integ-test/5087/display/redirect<br><br> * Steps to reproduce: See https://github.com/opensearch-project/opensearch-build/tree/main/src/test_workflow#integration-tests<br>* Access components yml file:<br> - [With security](https://ci.opensearch.org/ci/dbc/integ-test/2.8.0/7935/linux/arm64/rpm/test-results/5087/integ-test/index-management/with-security/index-management.yml) (if applicable)<br> - [Without security](https://ci.opensearch.org/ci/dbc/integ-test/2.8.0/7935/linux/arm64/rpm/test-results/5087/integ-test/index-management/without-security/index-management.yml) (if applicable)<br><br> _Note: All in one test report manifest with all the details coming soon. See https://github.com/opensearch-project/opensearch-build/issues/1274_
|
1.0
|
[AUTOCUT] Integration Test failed for index-management: 2.8.0 rpm distribution - The integration test failed at distribution level for component index-management<br>Version: 2.8.0<br>Distribution: rpm<br>Architecture: arm64<br>Platform: linux<br><br>Please check the logs: https://build.ci.opensearch.org/job/integ-test/5087/display/redirect<br><br> * Steps to reproduce: See https://github.com/opensearch-project/opensearch-build/tree/main/src/test_workflow#integration-tests<br>* Access components yml file:<br> - [With security](https://ci.opensearch.org/ci/dbc/integ-test/2.8.0/7935/linux/arm64/rpm/test-results/5087/integ-test/index-management/with-security/index-management.yml) (if applicable)<br> - [Without security](https://ci.opensearch.org/ci/dbc/integ-test/2.8.0/7935/linux/arm64/rpm/test-results/5087/integ-test/index-management/without-security/index-management.yml) (if applicable)<br><br> _Note: All in one test report manifest with all the details coming soon. See https://github.com/opensearch-project/opensearch-build/issues/1274_
|
non_process
|
integration test failed for index management rpm distribution the integration test failed at distribution level for component index management version distribution rpm architecture platform linux please check the logs steps to reproduce see access components yml file if applicable if applicable note all in one test report manifest with all the details coming soon see
| 0
|
49,835
| 13,187,278,733
|
IssuesEvent
|
2020-08-13 02:54:35
|
icecube-trac/tix3
|
https://api.github.com/repos/icecube-trac/tix3
|
opened
|
wimpsim-reader - default options are invalid (Trac #2155)
|
Incomplete Migration Migrated from Trac combo simulation defect
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/2155">https://code.icecube.wisc.edu/ticket/2155</a>, reported by grenzi and owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:15:23",
"description": "In [http://software.icecube.wisc.edu/documentation/inspect/wimpsim_reader.html?highlight=i3wimpsim#I3WimpSimReader I3WimpSimReader] we can read:\n\n Param EndMJD:\tDefault = nan, MJD to end simulation; if unspecified: read everything\n\nBut if I try not to set it (and take the NAN default) I receive this error\n\n\n{{{\nERROR (dataclasses): Calling with NAN not possible; will do nothing (I3Time.cxx:142 in void I3Time::SetModJulianTimeDouble(double))\n}}}\n\nThe same is for `StartMJD`.\n\n",
"reporter": "grenzi",
"cc": "",
"resolution": "fixed",
"_ts": "1550067323910946",
"component": "combo simulation",
"summary": "wimpsim-reader - default options are invalid",
"priority": "normal",
"keywords": "",
"time": "2018-05-17T15:48:52",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
wimpsim-reader - default options are invalid (Trac #2155) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/2155">https://code.icecube.wisc.edu/ticket/2155</a>, reported by grenzi and owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:15:23",
"description": "In [http://software.icecube.wisc.edu/documentation/inspect/wimpsim_reader.html?highlight=i3wimpsim#I3WimpSimReader I3WimpSimReader] we can read:\n\n Param EndMJD:\tDefault = nan, MJD to end simulation; if unspecified: read everything\n\nBut if I try not to set it (and take the NAN default) I receive this error\n\n\n{{{\nERROR (dataclasses): Calling with NAN not possible; will do nothing (I3Time.cxx:142 in void I3Time::SetModJulianTimeDouble(double))\n}}}\n\nThe same is for `StartMJD`.\n\n",
"reporter": "grenzi",
"cc": "",
"resolution": "fixed",
"_ts": "1550067323910946",
"component": "combo simulation",
"summary": "wimpsim-reader - default options are invalid",
"priority": "normal",
"keywords": "",
"time": "2018-05-17T15:48:52",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
|
non_process
|
wimpsim reader default options are invalid trac migrated from json status closed changetime description in we can read n n param endmjd tdefault nan mjd to end simulation if unspecified read everything n nbut if i try not to set it and take the nan default i receive this error n n n nerror dataclasses calling with nan not possible will do nothing cxx in void setmodjuliantimedouble double n n nthe same is for startmjd n n reporter grenzi cc resolution fixed ts component combo simulation summary wimpsim reader default options are invalid priority normal keywords time milestone owner nega type defect
| 0
|
39,821
| 16,097,577,263
|
IssuesEvent
|
2021-04-27 03:44:53
|
terraform-providers/terraform-provider-azurerm
|
https://api.github.com/repos/terraform-providers/terraform-provider-azurerm
|
closed
|
Support for `blobIndexMatch` in `azurerm_storage_management_policy`
|
enhancement service/storage
|
<!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Description
<!--- Please leave a helpful description of the feature request here. --->
Support to find and manage Azure Blob data with blob index tags

### New or Affected Resource(s)
<!--- Please list the new or affected resources and data sources. --->
* azurerm_storage_management_policy
### Potential Terraform Configuration
<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->
```hcl
resource "azurerm_storage_management_policy" "test" {
storage_account_id = azurerm_storage_account.test.id
rule {
name = "rule1"
enabled = true
filters {
prefix_match = ["container1/prefix1"]
blob_types = ["blockBlob"]
blob_index_match {
tag_name = "tag1"
tag_op = "=="
tag_value = "val1"
}
blob_index_match {
tag_name = "tag2"
tag_op = "=="
tag_value = "val2"
}
}
actions {
base_blob {
...
}
}
}
}
```
### References
<!---
Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests
Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor blog posts or documentation? For example:
* https://azure.microsoft.com/en-us/roadmap/virtual-network-service-endpoint-for-azure-cosmos-db/
--->
* https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-index-how-to?tabs=azure-portal#lifecycle-management-with-blob-index-tag-filters
|
1.0
|
Support for `blobIndexMatch` in `azurerm_storage_management_policy` - <!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Description
<!--- Please leave a helpful description of the feature request here. --->
Support to find and manage Azure Blob data with blob index tags

### New or Affected Resource(s)
<!--- Please list the new or affected resources and data sources. --->
* azurerm_storage_management_policy
### Potential Terraform Configuration
<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->
```hcl
resource "azurerm_storage_management_policy" "test" {
storage_account_id = azurerm_storage_account.test.id
rule {
name = "rule1"
enabled = true
filters {
prefix_match = ["container1/prefix1"]
blob_types = ["blockBlob"]
blob_index_match {
tag_name = "tag1"
tag_op = "=="
tag_value = "val1"
}
blob_index_match {
tag_name = "tag2"
tag_op = "=="
tag_value = "val2"
}
}
actions {
base_blob {
...
}
}
}
}
```
### References
<!---
Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests
Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor blog posts or documentation? For example:
* https://azure.microsoft.com/en-us/roadmap/virtual-network-service-endpoint-for-azure-cosmos-db/
--->
* https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-index-how-to?tabs=azure-portal#lifecycle-management-with-blob-index-tag-filters
|
non_process
|
support for blobindexmatch in azurerm storage management policy community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment description support to find and manage azure blob data with blob index tags new or affected resource s azurerm storage management policy potential terraform configuration hcl resource azurerm storage management policy test storage account id azurerm storage account test id rule name enabled true filters prefix match blob types blob index match tag name tag op tag value blob index match tag name tag op tag value actions base blob references information about referencing github issues are there any other github issues open or closed or pull requests that should be linked here vendor blog posts or documentation for example
| 0
|
14,084
| 16,963,226,409
|
IssuesEvent
|
2021-06-29 07:46:49
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
Release 3.1.0 lacks an arm64 build for linux.
|
team-XProduct type: support / not a bug (process)
|
~~~
root@nano-4gb-jp441:/workspace/io-0.16.0# bazel build -s --verbose_failures //tensorflow_io/...
2021/05/28 11:05:41 Downloading https://releases.bazel.build/3.1.0/release/bazel-3.1.0-linux-arm64...
2021/05/28 11:05:42 could not download Bazel: HTTP GET https://releases.bazel.build/3.1.0/release/bazel-3.1.0-linux-arm64 failed with error 404
~~~
I think release 3.1.0 lacks an arm64 build for linux. I know 3.1.0 is the old one, but I have to build through it. Could you update the 3.1.0 assets to include linux-arm64?
|
1.0
|
Release 3.1.0 lacks an arm64 build for linux. - ~~~
root@nano-4gb-jp441:/workspace/io-0.16.0# bazel build -s --verbose_failures //tensorflow_io/...
2021/05/28 11:05:41 Downloading https://releases.bazel.build/3.1.0/release/bazel-3.1.0-linux-arm64...
2021/05/28 11:05:42 could not download Bazel: HTTP GET https://releases.bazel.build/3.1.0/release/bazel-3.1.0-linux-arm64 failed with error 404
~~~
I think release 3.1.0 lacks an arm64 build for linux. I know 3.1.0 is the old one, but I have to build through it. Could you update the 3.1.0 assets to include linux-arm64?
|
process
|
release lacks an build for linux root nano workspace io bazel build s verbose failures tensorflow io downloading could not download bazel http get failed with error i think release lacks an build for linux i know is the old one but i have to build through it could you update the assets to include linux
| 1
|
35,312
| 6,444,320,596
|
IssuesEvent
|
2017-08-12 09:49:46
|
ev3dev-lang-java/ev3dev-lang-java
|
https://api.github.com/repos/ev3dev-lang-java/ev3dev-lang-java
|
closed
|
Review LICENCE info on lejos-commons & lejos-navigation
|
documentation in progress LeJOS operations
|
The review the licence file included in the library to add a GPL file.
https://sourceforge.net/p/lejos/wiki-nxt/Licensing%20Issues/
```
Preferred license
The options for the preferred license are GPL, LGPL, or something like Sun's openjdk GPL license with CLASSPATH exception.
The main issue between GPL and LGPL is whether we wish to allow our libraries to be used in proprietary products. We could issue some code in GPL and some in LGPL. My understanding is that MPL 1.0 is closer to LGPL than to GPL.
If we want to use openjdk, I think we need to issue our Java code as GPL with a classpath exception.
```
|
1.0
|
Review LICENCE info on lejos-commons & lejos-navigation - The review the licence file included in the library to add a GPL file.
https://sourceforge.net/p/lejos/wiki-nxt/Licensing%20Issues/
```
Preferred license
The options for the preferred license are GPL, LGPL, or something like Sun's openjdk GPL license with CLASSPATH exception.
The main issue between GPL and LGPL is whether we wish to allow our libraries to be used in proprietary products. We could issue some code in GPL and some in LGPL. My understanding is that MPL 1.0 is closer to LGPL than to GPL.
If we want to use openjdk, I think we need to issue our Java code as GPL with a classpath exception.
```
|
non_process
|
review licence info on lejos commons lejos navigation the review the licence file included in the library to add a gpl file preferred license the options for the preferred license are gpl lgpl or something like sun s openjdk gpl license with classpath exception the main issue between gpl and lgpl is whether we wish to allow our libraries to be used in proprietary products we could issue some code in gpl and some in lgpl my understanding is that mpl is closer to lgpl than to gpl if we want to use openjdk i think we need to issue our java code as gpl with a classpath exception
| 0
|
274,859
| 23,873,203,321
|
IssuesEvent
|
2022-09-07 16:28:23
|
brave/brave-browser
|
https://api.github.com/repos/brave/brave-browser
|
closed
|
Verified Publisher Rewards icon is not updated after publishers list update
|
bug feature/rewards QA/Yes QA/Test-Plan-Specified
|
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
Test case 1
1. Clean install
2. Open `kjozwiak.github.io`
3. Enable rewards through Rewards Panel
4. Open Rewards Panel
5. Click `Refresh Status`
Test case 2
1. Clean install
2. Enable rewards
3. Make sure publishers list is downloaded
4. Close Brave
5. Delete `kjozwiak.github.io` entry from `publishers_list`
6. Run Brave
7. Open `kjozwiak.github.io`
8. Open Rewards Panel
9. Click `Refresh Status`
Note: Reload of the page will show proper icon
## Actual result:
<!--Please add screenshots if needed-->
Regular Rewards(BAT) icon is shown in the URL bar


## Expected result:
Verified publisher Rewards(BAT) icon is shown in the URL bar

## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
Easily reproduced
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
Brave | 0.66.87 Chromium: 75.0.3770.80 (Official Build) beta (64-bit)
-- | --
Revision | 9a9aa15057b6b2cc0909bdcf638c0b65ecd516f2-refs/branch-heads/3770@{#948}
OS | Windows 7 Service Pack 1 (Build 7601.24465)
cc @brave/legacy_qa @NejcZdovc @rossmoody
|
1.0
|
Verified Publisher Rewards icon is not updated after publishers list update - ## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
Test case 1
1. Clean install
2. Open `kjozwiak.github.io`
3. Enable rewards through Rewards Panel
4. Open Rewards Panel
5. Click `Refresh Status`
Test case 2
1. Clean install
2. Enable rewards
3. Make sure publishers list is downloaded
4. Close Brave
5. Delete `kjozwiak.github.io` entry from `publishers_list`
6. Run Brave
7. Open `kjozwiak.github.io`
8. Open Rewards Panel
9. Click `Refresh Status`
Note: Reload of the page will show proper icon
## Actual result:
<!--Please add screenshots if needed-->
Regular Rewards(BAT) icon is shown in the URL bar


## Expected result:
Verified publisher Rewards(BAT) icon is shown in the URL bar

## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
Easily reproduced
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
Brave | 0.66.87 Chromium: 75.0.3770.80 (Official Build) beta (64-bit)
-- | --
Revision | 9a9aa15057b6b2cc0909bdcf638c0b65ecd516f2-refs/branch-heads/3770@{#948}
OS | Windows 7 Service Pack 1 (Build 7601.24465)
cc @brave/legacy_qa @NejcZdovc @rossmoody
|
non_process
|
verified publisher rewards icon is not updated after publishers list update steps to reproduce test case clean install open kjozwiak github io enable rewards through rewards panel open rewards panel click refresh status test case clean install enable rewards make sure publishers list is downloaded close brave delete kjozwiak github io entry from publishers list run brave open kjozwiak github io open rewards panel click refresh status note reload of the page will show proper icon actual result regular rewards bat icon is shown in the url bar expected result verified publisher rewards bat icon is shown in the url bar reproduces how often easily reproduced brave version brave version info brave chromium official build beta bit revision refs branch heads os windows service pack build cc brave legacy qa nejczdovc rossmoody
| 0
|
4,419
| 7,300,078,200
|
IssuesEvent
|
2018-02-26 22:16:53
|
aspnet/IISIntegration
|
https://api.github.com/repos/aspnet/IISIntegration
|
closed
|
Add logging to file for in process
|
in-process
|
Add logging to the file specified in web.config for the managed application. In the out of process scenario, we currently clone the logging file handle for both ANCM and the managed application. I believe we technically don't need to clone the file handle across native and managed, but it may cause difficulties for closing the handle and guaranteeing no extra writes occur.
|
1.0
|
Add logging to file for in process - Add logging to the file specified in web.config for the managed application. In the out of process scenario, we currently clone the logging file handle for both ANCM and the managed application. I believe we technically don't need to clone the file handle across native and managed, but it may cause difficulties for closing the handle and guaranteeing no extra writes occur.
|
process
|
add logging to file for in process add logging to the file specified in web config for the managed application in the out of process scenario we currently clone the logging file handle for both ancm and the managed application i believe we technically don t need to clone the file handle across native and managed but it may cause difficulties for closing the handle and guaranteeing no extra writes occur
| 1
|
177,040
| 21,464,528,702
|
IssuesEvent
|
2022-04-26 01:19:07
|
EcommEasy/EcommEasy-Store
|
https://api.github.com/repos/EcommEasy/EcommEasy-Store
|
closed
|
WS-2019-0332 (Medium) detected in handlebars-4.1.1.tgz - autoclosed
|
security vulnerability
|
## WS-2019-0332 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.1.1.tgz</b></p></summary>
<p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p>
<p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.1.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.1.tgz</a></p>
<p>Path to dependency file: /EcommEasy-Store/package.json</p>
<p>Path to vulnerable library: EcommEasy-Store/node_modules/handlebars/package.json</p>
<p>
Dependency Hierarchy:
- :x: **handlebars-4.1.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/EcommEasy/EcommEasy-Store/commit/5741eb749a071020b9cbfaa9d02e738a630ee181">5741eb749a071020b9cbfaa9d02e738a630ee181</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Execution vulnerability found in handlebars before 4.5.3. Lookup helper fails to validate templates. Attackers may submit templates that execute arbitrary JavaScript in the system. It is due to an incomplete fix for WS-2019-0331.
<p>Publish Date: 2019-11-17
<p>URL: <a href=https://github.com/wycats/handlebars.js/commit/198887808780bbef9dba67a8af68ece091d5baa7>WS-2019-0332</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1324">https://www.npmjs.com/advisories/1324</a></p>
<p>Release Date: 2019-12-05</p>
<p>Fix Resolution: handlebars - 4.5.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2019-0332 (Medium) detected in handlebars-4.1.1.tgz - autoclosed - ## WS-2019-0332 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.1.1.tgz</b></p></summary>
<p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p>
<p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.1.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.1.tgz</a></p>
<p>Path to dependency file: /EcommEasy-Store/package.json</p>
<p>Path to vulnerable library: EcommEasy-Store/node_modules/handlebars/package.json</p>
<p>
Dependency Hierarchy:
- :x: **handlebars-4.1.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/EcommEasy/EcommEasy-Store/commit/5741eb749a071020b9cbfaa9d02e738a630ee181">5741eb749a071020b9cbfaa9d02e738a630ee181</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Arbitrary Code Execution vulnerability found in handlebars before 4.5.3. Lookup helper fails to validate templates. Attackers may submit templates that execute arbitrary JavaScript in the system. It is due to an incomplete fix for WS-2019-0331.
<p>Publish Date: 2019-11-17
<p>URL: <a href=https://github.com/wycats/handlebars.js/commit/198887808780bbef9dba67a8af68ece091d5baa7>WS-2019-0332</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1324">https://www.npmjs.com/advisories/1324</a></p>
<p>Release Date: 2019-12-05</p>
<p>Fix Resolution: handlebars - 4.5.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws medium detected in handlebars tgz autoclosed ws medium severity vulnerability vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file ecommeasy store package json path to vulnerable library ecommeasy store node modules handlebars package json dependency hierarchy x handlebars tgz vulnerable library found in head commit a href vulnerability details arbitrary code execution vulnerability found in handlebars before lookup helper fails to validate templates attack may submit templates that execute arbitrary javascript in the system it is due to an incomplete fix for a ws publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution handlebars step up your open source security game with whitesource
| 0