Column schema (name, dtype, observed range):

- Unnamed: 0: int64, 0 to 832k
- id: float64, 2.49B to 32.1B
- type: string, 1 class (IssuesEvent)
- created_at: string, length 19
- repo: string, length 7 to 112
- repo_url: string, length 36 to 141
- action: string, 3 classes
- title: string, length 1 to 744
- labels: string, length 4 to 574
- body: string, length 9 to 211k
- index: string, 10 classes
- text_combine: string, length 96 to 211k
- label: string, 2 classes (process, non_process)
- text: string, length 96 to 188k
- binary_label: int64, 0 or 1
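A minimal sketch of filtering records of this shape by `binary_label`. The two toy rows below are abbreviated stand-ins for the real records (only a few of the columns, same names as the schema), not the actual data:

```python
import pandas as pd

# Toy stand-in records mirroring a few columns of this dump; not the real rows.
rows = pd.DataFrame(
    {
        "type": ["IssuesEvent", "IssuesEvent"],
        "repo": ["Axelrod-Python/Axelrod", "openvstorage/alba-asdmanager"],
        "action": ["closed", "closed"],
        "label": ["non_process", "process"],
        "binary_label": [0, 1],
    }
)

# `binary_label` mirrors `label`: 1 for "process", 0 for "non_process",
# so filtering on either column selects the same records.
process_rows = rows[rows["binary_label"] == 1]
print(process_rows["repo"].tolist())
```

Filtering on the integer column rather than the string label is usually the cheaper choice when the two columns encode the same thing.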
Row 13,334
- id: 3,702,660,113
- type: IssuesEvent
- created_at: 2016-02-29 17:34:48
- repo: Axelrod-Python/Axelrod
- repo_url: https://api.github.com/repos/Axelrod-Python/Axelrod
- action: closed
- title: Reorganize overview of strategies
- labels: documentation
- body:
I think we should de-emphasize the strategies from Axelrod's original tournaments unless they are particularly interesting or well-known now. Additionally, there are **many** strategies in the library that do not appear in the overview at all. I suggest we split the file into:
* One overview with all strategies, in alphabetical order
* Auxiliary files for various tournaments, if desired.
- index: 1.0
- text_combine:
Reorganize overview of strategies - I think we should de-emphasize the strategies from Axelrod's original tournaments unless they are particularly interesting or well-known now. Additionally, there are **many** strategies in the library that do not appear in the overview at all. I suggest we split the file into:
* One overview with all strategies, in alphabetical order
* Auxiliary files for various tournaments, if desired.
- label: non_process
- text:
reorganize overview of strategies i think we should de emphasize the strategies from axelrod s original tournaments unless they are particularly interesting or well known now additionally there are many strategies in the library that do not appear in the overview at all i suggest we split the file into one overview with all strategies in alphabetical order auxiliary files for various tournaments if desired
- binary_label: 0

Row 2,158
- id: 5,006,367,154
- type: IssuesEvent
- created_at: 2016-12-12 13:56:45
- repo: openvstorage/alba-asdmanager
- repo_url: https://api.github.com/repos/openvstorage/alba-asdmanager
- action: closed
- title: Set file descriptor limit for maintenance process upgrade script
- labels: process_wontfix type_bug
- body:
https://github.com/openvstorage/alba-asdmanager/pull/156 was merged without update logic for existing maintenance services.
I was testing the reviewers to see if they would spot this, @khenderick failed the test ;-)
- index: 1.0
- text_combine:
Set file descriptor limit for maintenance process upgrade script - https://github.com/openvstorage/alba-asdmanager/pull/156 was merged without update logic for existing maintenance services.
I was testing the reviewers to see if they would spot this, @khenderick failed the test ;-)
- label: process
- text:
set file descriptor limit for maintenance process upgrade script was merged without update logic for existing maintenance services i was testing the reviewers to see if they would spot this khenderick failed the test
- binary_label: 1

Row 22,522
- id: 11,643,486,981
- type: IssuesEvent
- created_at: 2020-02-29 13:51:34
- repo: aonez/Keka
- repo_url: https://api.github.com/repos/aonez/Keka
- action: closed
- title: Beta 1.2.0 partially hanging on long compressions
- labels: performance
- body:
Using the 1.2.0-dev beta that had the ACE stuff added in for testing. Seems to be 1.2.0-dev (3701), more specifically. I mostly use Keka as just a Dock drop target to do nothing but 7Z (max) compression. I'm finding that it seems to (partially) hang more than 50% of the time on any fairly large/long compressions, and is more apt to do so, it seems, when switching focus away from the app to do something else. The progress bar stops, you get a "beachball" cursor, and if you leave the app, nothing happens when you try to get back into it (by Dock icon click or by Cmd-Tab). It will often work right on a second or later attempt, but not always. In reality, the app is actually still working in the background, as can be confirmed by keeping an eye on the file size of the archive being written out. However, after it is done, it does not play its sound, and the app remains unresponsive.
For example, I kept getting a "beachball" in the app when compressing a 1.3 GB Native Instruments Maschine library (one big file) I don't need any time soon. It guestimates 6 minutes (at my performance settings), and becomes unresponsive after about 1 (but actually got the job done in 10–15). The non-beta version has no trouble with the file, and gets the job done in about 1.5 minutes (though hogging CPU and RAM), plays its completion sound, and is working properly afterward (in my case it quits unless I have it open doing something else, as intended). The beta version just sits there, stuck. The archive it created while semi-hung tests out as valid (in The Archive Browser).
I'm doubting this has to do with the ACE addition, since extracting ACEs and compressing 7Zs wouldn't seem to be related to each other. But who knows.
I am testing some of the new Performance settings:
* Max. sim. ops.: 2
* Max. sim. comp.: 1
* Max. sim. extr.: 2
* Max. thread/op.: 2
(Aside: I can confirm that this stuff works to stop the app from using up all available RAM and CPU, at the cost of speed, as expected.)
File access is volume-wide on every volume. Inherit quarantine is off. Default action: always compress. Compression: 7z, slowest/most. Save: next to original, same base name. Exclude res. forks. Play sound when done. Ask before quit, enable Notification Ctr., use unified toolbar, close app when no windows open, auto-updates off. Keka not set as default app for anything.
Oh, and as I noted elsewhere, it crashed to desktop one time, after running the beta for the first time and opening preferences and picking some panes to look at.
System: macOS 10.13.6, mid-2010 Mac Pro, 40 GB RAM, loads of disk/SSD space.
- index: True
- text_combine:
Beta 1.2.0 partially hanging on long compressions - Using the 1.2.0-dev beta that had the ACE stuff added in for testing. Seems to be 1.2.0-dev (3701), more specifically. I mostly use Keka as just a Dock drop target to do nothing but 7Z (max) compression. I'm finding that it seems to (partially) hang more than 50% of the time on any fairly large/long compressions, and is more apt to do so, it seems, when switching focus away from the app to do something else. The progress bar stops, you get a "beachball" cursor, and if you leave the app, nothing happens when you try to get back into it (by Dock icon click or by Cmd-Tab). It will often work right on a second or later attempt, but not always. In reality, the app is actually still working in the background, as can be confirmed by keeping an eye on the file size of the archive being written out. However, after it is done, it does not play its sound, and the app remains unresponsive.
For example, I kept getting a "beachball" in the app when compressing a 1.3 GB Native Instruments Maschine library (one big file) I don't need any time soon. It guestimates 6 minutes (at my performance settings), and becomes unresponsive after about 1 (but actually got the job done in 10–15). The non-beta version has no trouble with the file, and gets the job done in about 1.5 minutes (though hogging CPU and RAM), plays its completion sound, and is working properly afterward (in my case it quits unless I have it open doing something else, as intended). The beta version just sits there, stuck. The archive it created while semi-hung tests out as valid (in The Archive Browser).
I'm doubting this has to do with the ACE addition, since extracting ACEs and compressing 7Zs wouldn't seem to be related to each other. But who knows.
I am testing some of the new Performance settings:
* Max. sim. ops.: 2
* Max. sim. comp.: 1
* Max. sim. extr.: 2
* Max. thread/op.: 2
(Aside: I can confirm that this stuff works to stop the app from using up all available RAM and CPU, at the cost of speed, as expected.)
File access is volume-wide on every volume. Inherit quarantine is off. Default action: always compress. Compression: 7z, slowest/most. Save: next to original, same base name. Exclude res. forks. Play sound when done. Ask before quit, enable Notification Ctr., use unified toolbar, close app when no windows open, auto-updates off. Keka not set as default app for anything.
Oh, and as I noted elsewhere, it crashed to desktop one time, after running the beta for the first time and opening preferences and picking some panes to look at.
System: macOS 10.13.6, mid-2010 Mac Pro, 40 GB RAM, loads of disk/SSD space.
- label: non_process
- text:
beta partially hanging on long compressions using the dev beta that had the ace stuff added in for testing seems to be dev more specifically i mostly use keka as just a dock drop target to do nothing but max compression i m finding that it seems to partially hang more than of the time on any fairly large long compressions and is more apt to do so it seems when switching focus away from the app to do something else the progress bar stops you get a beachball cursor and if you leave the app nothing happens when you try to get back into it by dock icon click or by cmd tab it will often work right on a second or later attempt but not always in reality the app is actually still working in the background as can be confirmed by keeping an eye on the file size of the archive being written out however after it is done it does not play its sound and the app remains unresponsive for example i kept getting a beachball in the app when compressing a gb native instruments maschine library one big file i don t need any time soon it guestimates minutes at my performance settings and becomes unresponsive after about but actually got the job done in – the non beta version has no trouble with the file and gets the job done in about minutes though hogging cpu and ram plays its completion sound and is working properly afterward in my case it quits unless i have it open doing something else as intended the beta version just sits there stuck the archive it created while semi hung tests out as valid in the archive browser i m doubting this has to do with the ace addition since extracting aces and compressing wouldn t seem to related to each other but who knows i am testing some of the new performance settings max sim ops max sim comp max sim extr max thread op aside i can confirm that this stuff works to stop the app from using up all available ram and cpu at the cost of speed as expected file access is volume wide on every volume inherit quarantine is off default action always compress 
compression slowest most save next to original same base name exclude res forks play sound when done ask before quit enable notification ctr use unified toolbar close app when no windows open auto updates off keka not set as default app for anything oh and as i noted elsewhere it crashed to desktop one time after running the beta for the first time and opening preferences and picking some panes to look at system macos mid mac pro gb ram loads of disk ssd space
- binary_label: 0

Row 271,107
- id: 29,269,836,051
- type: IssuesEvent
- created_at: 2023-05-24 01:01:14
- repo: ShiftLeft-Corp-Test/NodeGoat
- repo_url: https://api.github.com/repos/ShiftLeft-Corp-Test/NodeGoat
- action: closed
- title: event-source-polyfill-1.0.26.tgz: 1 vulnerabilities (highest severity is: 5.3) - autoclosed
- labels: Mend: dependency security vulnerability
- body:
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>event-source-polyfill-1.0.26.tgz</b></p></summary>
<p>A polyfill for http://www.w3.org/TR/eventsource/ </p>
<p>Library home page: <a href="https://registry.npmjs.org/event-source-polyfill/-/event-source-polyfill-1.0.26.tgz">https://registry.npmjs.org/event-source-polyfill/-/event-source-polyfill-1.0.26.tgz</a></p>
<p>Path to dependency file: /MSC/package.json</p>
<p>Path to vulnerable library: /MSC/node_modules/event-source-polyfill/package.json</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (event-source-polyfill version) | Fix PR available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [MSC-2022-1557](https://my.diffend.io/npm/event-source-polyfill/prev/1.0.26) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Medium | 5.3 | event-source-polyfill-1.0.26.tgz | Direct | N/A | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> MSC-2022-1557</summary>
### Vulnerable Library - <b>event-source-polyfill-1.0.26.tgz</b></p>
<p>A polyfill for http://www.w3.org/TR/eventsource/ </p>
<p>Library home page: <a href="https://registry.npmjs.org/event-source-polyfill/-/event-source-polyfill-1.0.26.tgz">https://registry.npmjs.org/event-source-polyfill/-/event-source-polyfill-1.0.26.tgz</a></p>
<p>Path to dependency file: /MSC/package.json</p>
<p>Path to vulnerable library: /MSC/node_modules/event-source-polyfill/package.json</p>
<p>
Dependency Hierarchy:
- :x: **event-source-polyfill-1.0.26.tgz** (Vulnerable Library)
<p>Found in base branch: <b>Test</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
This package has been identified by Mend as containing potential malicious functionality. The severity of the functionality can change depending on where the library is running (user's machine or backend server). The following risks were identified: Protestware – this package contains code that differs from the stated functionality. This can be as simple as opening a web URL to protest or, in some cases, could delete files.
<p>Publish Date: 2022-06-29
<p>URL: <a href=https://my.diffend.io/npm/event-source-polyfill/prev/1.0.26>MSC-2022-1557</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
- index: True
- text_combine:
event-source-polyfill-1.0.26.tgz: 1 vulnerabilities (highest severity is: 5.3) - autoclosed - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>event-source-polyfill-1.0.26.tgz</b></p></summary>
<p>A polyfill for http://www.w3.org/TR/eventsource/ </p>
<p>Library home page: <a href="https://registry.npmjs.org/event-source-polyfill/-/event-source-polyfill-1.0.26.tgz">https://registry.npmjs.org/event-source-polyfill/-/event-source-polyfill-1.0.26.tgz</a></p>
<p>Path to dependency file: /MSC/package.json</p>
<p>Path to vulnerable library: /MSC/node_modules/event-source-polyfill/package.json</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (event-source-polyfill version) | Fix PR available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [MSC-2022-1557](https://my.diffend.io/npm/event-source-polyfill/prev/1.0.26) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Medium | 5.3 | event-source-polyfill-1.0.26.tgz | Direct | N/A | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> MSC-2022-1557</summary>
### Vulnerable Library - <b>event-source-polyfill-1.0.26.tgz</b></p>
<p>A polyfill for http://www.w3.org/TR/eventsource/ </p>
<p>Library home page: <a href="https://registry.npmjs.org/event-source-polyfill/-/event-source-polyfill-1.0.26.tgz">https://registry.npmjs.org/event-source-polyfill/-/event-source-polyfill-1.0.26.tgz</a></p>
<p>Path to dependency file: /MSC/package.json</p>
<p>Path to vulnerable library: /MSC/node_modules/event-source-polyfill/package.json</p>
<p>
Dependency Hierarchy:
- :x: **event-source-polyfill-1.0.26.tgz** (Vulnerable Library)
<p>Found in base branch: <b>Test</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
This package has been identified by Mend as containing potential malicious functionality. The severity of the functionality can change depending on where the library is running (user's machine or backend server). The following risks were identified: Protestware – this package contains code that differs from the stated functionality. This can be as simple as opening a web URL to protest or, in some cases, could delete files.
<p>Publish Date: 2022-06-29
<p>URL: <a href=https://my.diffend.io/npm/event-source-polyfill/prev/1.0.26>MSC-2022-1557</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
- label: non_process
- text:
event source polyfill tgz vulnerabilities highest severity is autoclosed vulnerable library event source polyfill tgz a polyfill for library home page a href path to dependency file msc package json path to vulnerable library msc node modules event source polyfill package json vulnerabilities cve severity cvss dependency type fixed in event source polyfill version fix pr available medium event source polyfill tgz direct n a details msc vulnerable library event source polyfill tgz a polyfill for library home page a href path to dependency file msc package json path to vulnerable library msc node modules event source polyfill package json dependency hierarchy x event source polyfill tgz vulnerable library found in base branch test vulnerability details this package has been identified by mend as containing potential malicious functionality the severity of the functionality can change depending on where the library is running user s machine or backend server the following risks were identified protestware – this package contains code that differs from the stated functionality this can be as simple as opening a web url to protest or in some cases could delete files publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue
- binary_label: 0

Row 55,392
- id: 30,729,900,304
- type: IssuesEvent
- created_at: 2023-07-27 23:55:49
- repo: coq/coq
- repo_url: https://api.github.com/repos/coq/coq
- action: opened
- title: native compilation should have a special case for `Definition ... := Eval native_compute in ...`
- labels: kind: performance part: native compiler kind: enhancement
- body:
Related to #17870 and [this Zulip thread](https://coq.zulipchat.com/#narrow/stream/237656-Coq-devs-.26-plugin-devs/topic/Avoiding.20running.20the.20vm.20.2F.20native.20compiler.20twice), I would like it to be the case that if I write
```coq
From Coq Require Import PArray Uint63 PrimFloat.
Definition arrays :=
make 4096 (make 64 ((*make 64*) 0%float)).
Set NativeCompute Timing.
Time Definition arrays_val := Eval native_compute in arrays.
(* native_compute: Conversion to native code done in 0.00002
native_compute: Compilation done in 0.26909
native_compute: Evaluation done in 0.00112
native_compute: Reification done in 0.00964
arrays_val is defined
Finished transaction in 0.447 secs (0.148u,0.031s) (successful)
*)
Time Definition arrays_val_eq : arrays_val = arrays.
Proof. native_cast_no_check (eq_refl arrays). Time Qed.
(* Finished transaction in 13.784 secs (0.512u,0.03s) (successful) *)
```
I do not have to pay the cost of compiling `arrays_val` (in the project I'm working on, this cost is 10--15 minutes per definition).
Instead, it should be the case that when I write `Definition foo := Eval native_compute in bar.` (at least with no arguments), the native compiler should preemptively emit a definition for the constant `foo` which just aliases the lazy value used to compute the result of `Eval native_compute in`.
- index: True
- text_combine:
native compilation should have a special case for `Definition ... := Eval native_compute in ...` - Related to #17870 and [this Zulip thread](https://coq.zulipchat.com/#narrow/stream/237656-Coq-devs-.26-plugin-devs/topic/Avoiding.20running.20the.20vm.20.2F.20native.20compiler.20twice), I would like it to be the case that if I write
```coq
From Coq Require Import PArray Uint63 PrimFloat.
Definition arrays :=
make 4096 (make 64 ((*make 64*) 0%float)).
Set NativeCompute Timing.
Time Definition arrays_val := Eval native_compute in arrays.
(* native_compute: Conversion to native code done in 0.00002
native_compute: Compilation done in 0.26909
native_compute: Evaluation done in 0.00112
native_compute: Reification done in 0.00964
arrays_val is defined
Finished transaction in 0.447 secs (0.148u,0.031s) (successful)
*)
Time Definition arrays_val_eq : arrays_val = arrays.
Proof. native_cast_no_check (eq_refl arrays). Time Qed.
(* Finished transaction in 13.784 secs (0.512u,0.03s) (successful) *)
```
I do not have to pay the cost of compiling `arrays_val` (in the project I'm working on, this cost is 10--15 minutes per definition).
Instead, it should be the case that when I write `Definition foo := Eval native_compute in bar.` (at least with no arguments), the native compiler should preemptively emit a definition for the constant `foo` which just aliases the lazy value used to compute the result of `Eval native_compute in`.
- label: non_process
- text:
native compilation should have a special case for definition eval native compute in related to and i would like it to be the case that if i write coq from coq require import parray primfloat definition arrays make make make float set nativecompute timing time definition arrays val eval native compute in arrays native compute conversion to native code done in native compute compilation done in native compute evaluation done in native compute reification done in arrays val is defined finished transaction in secs successful time definition arrays val eq arrays val arrays proof native cast no check eq refl arrays time qed finished transaction in secs successful i do not have to pay the cost of compiling arrays val in the project i m working on this cost is minutes per definition instead it should be the case that when i write definition foo eval native compute in bar at least with no arguments the native compiler should preemptively emit a definition for the constant foo which just aliases the lazy value used to compute the result of eval native compute in
- binary_label: 0

Row 150,203
- id: 5,740,660,952
- type: IssuesEvent
- created_at: 2017-04-24 00:48:05
- repo: chrislo27/RhythmHeavenRemixEditor2
- repo_url: https://api.github.com/repos/chrislo27/RhythmHeavenRemixEditor2
- action: closed
- title: Variation menus need to be a thing.
- labels: feature request invalid low priority
- body:
### Prerequisites
* [X] This issue specifically has something to do with RHRE2
* [X] I have attempted to [look for similar issues](https://github.com/chrislo27/RhythmHeavenRemixEditor/issues?utf8=%E2%9C%93&q=is%3Aissue)
already
### Description
With the growing amount of multi-language games the editor can use, and French coming soon, it's starting to hold the editor back. I suggest we do something about it. I suggest the creation of variation folders. Variation menus contain every variation of a single game. You can access them by hovering your cursor over a game that has multiple variations. Variation menus are drop-down menus that show a list of variations for that game. The editor would use the most recent sounds by default. The variations would still have their own data.jsons. This would be a great way to merge games like Karate Man, Munchy Monk, Airboarder, Space Soccer, Marching Orders, and the Count-Ins. ESPECIALLY the Count-Ins and Karate Man.
- index: 1.0
- text_combine:
Variation menus need to be a thing. - ### Prerequisites
* [X] This issue specifically has something to do with RHRE2
* [X] I have attempted to [look for similar issues](https://github.com/chrislo27/RhythmHeavenRemixEditor/issues?utf8=%E2%9C%93&q=is%3Aissue)
already
### Description
With the growing amount of multi-language games the editor can use, and French coming soon, it's starting to hold the editor back. I suggest we do something about it. I suggest the creation of variation folders. Variation menus contain every variation of a single game. You can access them by hovering your cursor over a game that has multiple variations. Variation menus are drop-down menus that show a list of variations for that game. The editor would use the most recent sounds by default. The variations would still have their own data.jsons. This would be a great way to merge games like Karate Man, Munchy Monk, Airboarder, Space Soccer, Marching Orders, and the Count-Ins. ESPECIALLY the Count-Ins and Karate Man.
- label: non_process
- text:
variation menus need to be a thing prerequisites this issue specifically has something to do with i have attempted to already description with the growing amount of multi language games the editor can use and french coming soon it s starting to hold the editor back i suggest we do something about it i suggest the creation of variation folders variation menus contain every variation of a single game you can access them by hovering your cursor over a game that has multiple variations variation menus are drop down menus that show a list of variations for that game the editor would use the most recent sounds by default the variations would still have their own data jsons this would be a great way to merge games like karate man munchy monk airboarder space soccer marching orders and the count ins especially the count ins and karate man
- binary_label: 0

Row 225,094
- id: 24,807,981,444
- type: IssuesEvent
- created_at: 2022-10-25 07:05:40
- repo: sast-automation-dev/react-security-45
- repo_url: https://api.github.com/repos/sast-automation-dev/react-security-45
- action: opened
- title: react-16.2.0.tgz: 4 vulnerabilities (highest severity is: 7.5)
- labels: security vulnerability
- body:
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>react-16.2.0.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /security_examples/package.json</p>
<p>Path to vulnerable library: /fullstack_auth/frontend/node_modules/ua-parser-js/package.json,/authentication/node_modules/ua-parser-js/package.json,/security_examples/node_modules/ua-parser-js/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/react-security-45/commit/1460900f4af566f33edff7a5ece266441b72ef87">1460900f4af566f33edff7a5ece266441b72ef87</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (react version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2021-27292](https://www.mend.io/vulnerability-database/CVE-2021-27292) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | ua-parser-js-0.7.17.tgz | Transitive | 16.3.0 | ✅ |
| [CVE-2020-7733](https://www.mend.io/vulnerability-database/CVE-2020-7733) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | ua-parser-js-0.7.17.tgz | Transitive | 16.3.0 | ✅ |
| [CVE-2020-7793](https://www.mend.io/vulnerability-database/CVE-2020-7793) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | ua-parser-js-0.7.17.tgz | Transitive | 16.3.0 | ✅ |
| [CVE-2020-15168](https://www.mend.io/vulnerability-database/CVE-2020-15168) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | node-fetch-1.7.3.tgz | Transitive | 16.5.0 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-27292</summary>
### Vulnerable Library - <b>ua-parser-js-0.7.17.tgz</b></p>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz</a></p>
<p>Path to dependency file: /fullstack_auth/frontend/package.json</p>
<p>Path to vulnerable library: /fullstack_auth/frontend/node_modules/ua-parser-js/package.json,/authentication/node_modules/ua-parser-js/package.json,/security_examples/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- react-16.2.0.tgz (Root Library)
- fbjs-0.8.16.tgz
- :x: **ua-parser-js-0.7.17.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/react-security-45/commit/1460900f4af566f33edff7a5ece266441b72ef87">1460900f4af566f33edff7a5ece266441b72ef87</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
ua-parser-js >= 0.7.14, fixed in 0.7.24, uses a regular expression which is vulnerable to denial of service. If an attacker sends a malicious User-Agent header, ua-parser-js will get stuck processing it for an extended period of time.
<p>Publish Date: 2021-03-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-27292>CVE-2021-27292</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2021-03-17</p>
<p>Fix Resolution (ua-parser-js): 0.7.24</p>
<p>Direct dependency fix Resolution (react): 16.3.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-7733</summary>
### Vulnerable Library - <b>ua-parser-js-0.7.17.tgz</b></p>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz</a></p>
<p>Path to dependency file: /fullstack_auth/frontend/package.json</p>
<p>Path to vulnerable library: /fullstack_auth/frontend/node_modules/ua-parser-js/package.json,/authentication/node_modules/ua-parser-js/package.json,/security_examples/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- react-16.2.0.tgz (Root Library)
- fbjs-0.8.16.tgz
- :x: **ua-parser-js-0.7.17.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/react-security-45/commit/1460900f4af566f33edff7a5ece266441b72ef87">1460900f4af566f33edff7a5ece266441b72ef87</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The package ua-parser-js before 0.7.22 is vulnerable to Regular Expression Denial of Service (ReDoS) via the regex for Redmi Phones and Mi Pad Tablets UA.
<p>Publish Date: 2020-09-16
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-7733>CVE-2020-7733</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7733">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7733</a></p>
<p>Release Date: 2020-09-16</p>
<p>Fix Resolution (ua-parser-js): 0.7.22</p>
<p>Direct dependency fix Resolution (react): 16.3.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-7793</summary>
### Vulnerable Library - <b>ua-parser-js-0.7.17.tgz</b></p>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz</a></p>
<p>Path to dependency file: /fullstack_auth/frontend/package.json</p>
<p>Path to vulnerable library: /fullstack_auth/frontend/node_modules/ua-parser-js/package.json,/authentication/node_modules/ua-parser-js/package.json,/security_examples/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- react-16.2.0.tgz (Root Library)
- fbjs-0.8.16.tgz
- :x: **ua-parser-js-0.7.17.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/react-security-45/commit/1460900f4af566f33edff7a5ece266441b72ef87">1460900f4af566f33edff7a5ece266441b72ef87</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The package ua-parser-js before 0.7.23 is vulnerable to Regular Expression Denial of Service (ReDoS) in multiple regexes (see the linked commit for more info).
<p>Publish Date: 2020-12-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-7793>CVE-2020-7793</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2020-12-11</p>
<p>Fix Resolution (ua-parser-js): 0.7.23</p>
<p>Direct dependency fix Resolution (react): 16.3.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-15168</summary>
### Vulnerable Library - <b>node-fetch-1.7.3.tgz</b></p>
<p>A light-weight module that brings window.fetch to node.js and io.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-fetch/-/node-fetch-1.7.3.tgz">https://registry.npmjs.org/node-fetch/-/node-fetch-1.7.3.tgz</a></p>
<p>Path to dependency file: /fullstack_auth/frontend/package.json</p>
<p>Path to vulnerable library: /fullstack_auth/frontend/node_modules/node-fetch/package.json,/security_examples/node_modules/node-fetch/package.json,/authentication/node_modules/node-fetch/package.json</p>
<p>
Dependency Hierarchy:
- react-16.2.0.tgz (Root Library)
- fbjs-0.8.16.tgz
- isomorphic-fetch-2.2.1.tgz
- :x: **node-fetch-1.7.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/react-security-45/commit/1460900f4af566f33edff7a5ece266441b72ef87">1460900f4af566f33edff7a5ece266441b72ef87</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
node-fetch before versions 2.6.1 and 3.0.0-beta.9 did not honor the size option after following a redirect, which means that when a content size was over the limit, a FetchError would never get thrown and the process would end without failure. For most people, this fix will have little or no impact. However, if you are relying on node-fetch to gate files above a size, the impact could be significant, for example: if you don't double-check the size of the data after fetch() has completed, your JS thread could get tied up doing work on a large file (DoS) and/or cost you money in computing.
<p>Publish Date: 2020-09-10
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-15168>CVE-2020-15168</a></p>
</p>
<p></p>
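The advisory's mitigation advice, double-checking the size of the data after the fetch has completed, is language-agnostic. A minimal defensive sketch (in Python rather than JavaScript, with a hypothetical 1 MB cap; both the function name and the limit are assumptions for illustration):

```python
MAX_BYTES = 1_000_000  # hypothetical cap; choose what your service can afford

def enforce_size_limit(body: bytes, limit: int = MAX_BYTES) -> bytes:
    # node-fetch < 2.6.1 could silently ignore its `size` option after a
    # redirect, so the robust pattern is to re-verify the final payload
    # once the download has completed, regardless of any upfront limit.
    if len(body) > limit:
        raise ValueError(f"payload of {len(body)} bytes exceeds the {limit}-byte limit")
    return body
```

The point is not the helper itself but where it runs: after the response body is fully materialized, not before the request is issued.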
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/node-fetch/node-fetch/security/advisories/GHSA-w7rc-rwvf-8q5r">https://github.com/node-fetch/node-fetch/security/advisories/GHSA-w7rc-rwvf-8q5r</a></p>
<p>Release Date: 2020-09-17</p>
<p>Fix Resolution (node-fetch): 2.6.1</p>
<p>Direct dependency fix Resolution (react): 16.5.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
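All three ua-parser-js advisories in this record reduce to the same question: does the bundled version fall inside the affected half-open range? A minimal sketch of that check (the naive dotted-version parsing is an assumption for illustration; real tooling should use a proper semver library):

```python
def parse(version: str):
    # Naive dotted-version parser; ignores pre-release and build tags.
    return tuple(int(part) for part in version.split("."))

def is_affected(installed: str, introduced: str, fixed: str) -> bool:
    # Affected iff introduced <= installed < fixed (half-open range).
    return parse(introduced) <= parse(installed) < parse(fixed)

# CVE-2021-27292: ua-parser-js >= 0.7.14, fixed in 0.7.24,
# so the bundled 0.7.17 is inside the affected range.
```

This is why every suggested fix above resolves to bumping the direct dependency (react) until the transitive ua-parser-js lands at or beyond the fixed version.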
|
True
|
react-16.2.0.tgz: 4 vulnerabilities (highest severity is: 7.5) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>react-16.2.0.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /security_examples/package.json</p>
<p>Path to vulnerable library: /fullstack_auth/frontend/node_modules/ua-parser-js/package.json,/authentication/node_modules/ua-parser-js/package.json,/security_examples/node_modules/ua-parser-js/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/react-security-45/commit/1460900f4af566f33edff7a5ece266441b72ef87">1460900f4af566f33edff7a5ece266441b72ef87</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (react version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2021-27292](https://www.mend.io/vulnerability-database/CVE-2021-27292) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | ua-parser-js-0.7.17.tgz | Transitive | 16.3.0 | ✅ |
| [CVE-2020-7733](https://www.mend.io/vulnerability-database/CVE-2020-7733) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | ua-parser-js-0.7.17.tgz | Transitive | 16.3.0 | ✅ |
| [CVE-2020-7793](https://www.mend.io/vulnerability-database/CVE-2020-7793) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | ua-parser-js-0.7.17.tgz | Transitive | 16.3.0 | ✅ |
| [CVE-2020-15168](https://www.mend.io/vulnerability-database/CVE-2020-15168) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | node-fetch-1.7.3.tgz | Transitive | 16.5.0 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-27292</summary>
### Vulnerable Library - <b>ua-parser-js-0.7.17.tgz</b></p>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz</a></p>
<p>Path to dependency file: /fullstack_auth/frontend/package.json</p>
<p>Path to vulnerable library: /fullstack_auth/frontend/node_modules/ua-parser-js/package.json,/authentication/node_modules/ua-parser-js/package.json,/security_examples/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- react-16.2.0.tgz (Root Library)
- fbjs-0.8.16.tgz
- :x: **ua-parser-js-0.7.17.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/react-security-45/commit/1460900f4af566f33edff7a5ece266441b72ef87">1460900f4af566f33edff7a5ece266441b72ef87</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
ua-parser-js >= 0.7.14, fixed in 0.7.24, uses a regular expression which is vulnerable to denial of service. If an attacker sends a malicious User-Agent header, ua-parser-js will get stuck processing it for an extended period of time.
<p>Publish Date: 2021-03-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-27292>CVE-2021-27292</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2021-03-17</p>
<p>Fix Resolution (ua-parser-js): 0.7.24</p>
<p>Direct dependency fix Resolution (react): 16.3.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-7733</summary>
### Vulnerable Library - <b>ua-parser-js-0.7.17.tgz</b></p>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz</a></p>
<p>Path to dependency file: /fullstack_auth/frontend/package.json</p>
<p>Path to vulnerable library: /fullstack_auth/frontend/node_modules/ua-parser-js/package.json,/authentication/node_modules/ua-parser-js/package.json,/security_examples/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- react-16.2.0.tgz (Root Library)
- fbjs-0.8.16.tgz
- :x: **ua-parser-js-0.7.17.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/react-security-45/commit/1460900f4af566f33edff7a5ece266441b72ef87">1460900f4af566f33edff7a5ece266441b72ef87</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The package ua-parser-js before 0.7.22 is vulnerable to Regular Expression Denial of Service (ReDoS) via the regex for Redmi Phones and Mi Pad Tablets UA.
<p>Publish Date: 2020-09-16
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-7733>CVE-2020-7733</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7733">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7733</a></p>
<p>Release Date: 2020-09-16</p>
<p>Fix Resolution (ua-parser-js): 0.7.22</p>
<p>Direct dependency fix Resolution (react): 16.3.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-7793</summary>
### Vulnerable Library - <b>ua-parser-js-0.7.17.tgz</b></p>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz</a></p>
<p>Path to dependency file: /fullstack_auth/frontend/package.json</p>
<p>Path to vulnerable library: /fullstack_auth/frontend/node_modules/ua-parser-js/package.json,/authentication/node_modules/ua-parser-js/package.json,/security_examples/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- react-16.2.0.tgz (Root Library)
- fbjs-0.8.16.tgz
- :x: **ua-parser-js-0.7.17.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/react-security-45/commit/1460900f4af566f33edff7a5ece266441b72ef87">1460900f4af566f33edff7a5ece266441b72ef87</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The package ua-parser-js before 0.7.23 is vulnerable to Regular Expression Denial of Service (ReDoS) in multiple regexes (see the linked commit for more info).
<p>Publish Date: 2020-12-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-7793>CVE-2020-7793</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2020-12-11</p>
<p>Fix Resolution (ua-parser-js): 0.7.23</p>
<p>Direct dependency fix Resolution (react): 16.3.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-15168</summary>
### Vulnerable Library - <b>node-fetch-1.7.3.tgz</b></p>
<p>A light-weight module that brings window.fetch to node.js and io.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-fetch/-/node-fetch-1.7.3.tgz">https://registry.npmjs.org/node-fetch/-/node-fetch-1.7.3.tgz</a></p>
<p>Path to dependency file: /fullstack_auth/frontend/package.json</p>
<p>Path to vulnerable library: /fullstack_auth/frontend/node_modules/node-fetch/package.json,/security_examples/node_modules/node-fetch/package.json,/authentication/node_modules/node-fetch/package.json</p>
<p>
Dependency Hierarchy:
- react-16.2.0.tgz (Root Library)
- fbjs-0.8.16.tgz
- isomorphic-fetch-2.2.1.tgz
- :x: **node-fetch-1.7.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/react-security-45/commit/1460900f4af566f33edff7a5ece266441b72ef87">1460900f4af566f33edff7a5ece266441b72ef87</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
node-fetch before versions 2.6.1 and 3.0.0-beta.9 did not honor the size option after following a redirect, which means that when a content size was over the limit, a FetchError would never get thrown and the process would end without failure. For most people, this fix will have little or no impact. However, if you are relying on node-fetch to gate files above a size, the impact could be significant, for example: if you don't double-check the size of the data after fetch() has completed, your JS thread could get tied up doing work on a large file (DoS) and/or cost you money in computing.
<p>Publish Date: 2020-09-10
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-15168>CVE-2020-15168</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/node-fetch/node-fetch/security/advisories/GHSA-w7rc-rwvf-8q5r">https://github.com/node-fetch/node-fetch/security/advisories/GHSA-w7rc-rwvf-8q5r</a></p>
<p>Release Date: 2020-09-17</p>
<p>Fix Resolution (node-fetch): 2.6.1</p>
<p>Direct dependency fix Resolution (react): 16.5.0</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
non_process
|
react tgz vulnerabilities highest severity is vulnerable library react tgz path to dependency file security examples package json path to vulnerable library fullstack auth frontend node modules ua parser js package json authentication node modules ua parser js package json security examples node modules ua parser js package json found in head commit a href vulnerabilities cve severity cvss dependency type fixed in react version remediation available high ua parser js tgz transitive high ua parser js tgz transitive high ua parser js tgz transitive medium node fetch tgz transitive details cve vulnerable library ua parser js tgz lightweight javascript based user agent string parser library home page a href path to dependency file fullstack auth frontend package json path to vulnerable library fullstack auth frontend node modules ua parser js package json authentication node modules ua parser js package json security examples node modules ua parser js package json dependency hierarchy react tgz root library fbjs tgz x ua parser js tgz vulnerable library found in head commit a href found in base branch master vulnerability details ua parser js fixed in uses a regular expression which is vulnerable to denial of service if an attacker sends a malicious user agent header ua parser js will get stuck processing it for an extended period of time publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution ua parser js direct dependency fix resolution react rescue worker helmet automatic remediation is available for this issue cve vulnerable library ua parser js tgz lightweight javascript based user agent string parser library home page a href path to 
dependency file fullstack auth frontend package json path to vulnerable library fullstack auth frontend node modules ua parser js package json authentication node modules ua parser js package json security examples node modules ua parser js package json dependency hierarchy react tgz root library fbjs tgz x ua parser js tgz vulnerable library found in head commit a href found in base branch master vulnerability details the package ua parser js before are vulnerable to regular expression denial of service redos via the regex for redmi phones and mi pad tablets ua publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ua parser js direct dependency fix resolution react rescue worker helmet automatic remediation is available for this issue cve vulnerable library ua parser js tgz lightweight javascript based user agent string parser library home page a href path to dependency file fullstack auth frontend package json path to vulnerable library fullstack auth frontend node modules ua parser js package json authentication node modules ua parser js package json security examples node modules ua parser js package json dependency hierarchy react tgz root library fbjs tgz x ua parser js tgz vulnerable library found in head commit a href found in base branch master vulnerability details the package ua parser js before are vulnerable to regular expression denial of service redos in multiple regexes see linked commit for more info publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics 
confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution ua parser js direct dependency fix resolution react rescue worker helmet automatic remediation is available for this issue cve vulnerable library node fetch tgz a light weight module that brings window fetch to node js and io js library home page a href path to dependency file fullstack auth frontend package json path to vulnerable library fullstack auth frontend node modules node fetch package json security examples node modules node fetch package json authentication node modules node fetch package json dependency hierarchy react tgz root library fbjs tgz isomorphic fetch tgz x node fetch tgz vulnerable library found in head commit a href found in base branch master vulnerability details node fetch before versions and beta did not honor the size option after following a redirect which means that when a content size was over the limit a fetcherror would never get thrown and the process would end without failure for most people this fix will have a little or no impact however if you are relying on node fetch to gate files above a size the impact could be significant for example if you don t double check the size of the data after fetch has completed your js thread could get tied up doing work on a large file dos and or cost you money in computing publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution node fetch direct dependency fix resolution react rescue worker helmet automatic remediation is available for this issue rescue worker helmet 
automatic remediation is available for this issue
| 0
|
691,019
| 23,680,924,093
|
IssuesEvent
|
2022-08-28 19:42:05
|
StrangeLoopGames/EcoIssues
|
https://api.github.com/repos/StrangeLoopGames/EcoIssues
|
closed
|
[0.9.3 Staging-1960]Antialiasing doesn't seem to apply on every object
|
Priority: Medium Category: Optimization
|
Build: 0.9.3 Staging-1960
### Issue
Setting antialiasing to SMAA High, the highest level available in-game, doesn't seem to apply to every in-game object.
There are noticeable changes, but on some objects there are none at all. Moreover, even at SMAA High, the smoothing of rugged edges looks as if the setting were SMAA Low rather than High. Its performance cost, which should be heavy at this level, is also minimal: a drop of less than 1 to 2 frames per second when enabled.
Comparison Screenshot

Expected Result:
|
1.0
|
[0.9.3 Staging-1960]Antialiasing doesn't seem to apply on every object - Build: 0.9.3 Staging-1960
### Issue
Setting antialiasing to SMAA High, the highest level available in-game, doesn't seem to apply to every in-game object.
There are noticeable changes, but on some objects there are none at all. Moreover, even at SMAA High, the smoothing of rugged edges looks as if the setting were SMAA Low rather than High. Its performance cost, which should be heavy at this level, is also minimal: a drop of less than 1 to 2 frames per second when enabled.
Comparison Screenshot

Expected Result:
|
non_process
|
antialiasing doesn t seem to apply on every object build staging issue setting antialiasing to smaa high which is the highest level of antialiasing in game doesn t seem to apply to every in game object there are noticeable changes but on some objects there aren t any at all however the way it works with smaa high the changes made on the rugged edges feels like it s set to smaa low and not high furthermore its performance impact which should have been heavy is minimal with less than to frames per second decrease in graphical impact when enabled comparison screenshot expected result
| 0
|
172,185
| 21,040,472,603
|
IssuesEvent
|
2022-03-31 11:52:02
|
LynRodWS/alcor
|
https://api.github.com/repos/LynRodWS/alcor
|
opened
|
WS-2022-0107 (High) detected in multiple libraries
|
security vulnerability
|
## WS-2022-0107 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>spring-beans-5.1.8.RELEASE.jar</b>, <b>spring-beans-5.2.4.RELEASE.jar</b>, <b>spring-beans-5.2.6.RELEASE.jar</b>, <b>spring-beans-5.2.5.RELEASE.jar</b>, <b>spring-beans-5.2.7.RELEASE.jar</b></p></summary>
<p>
<details><summary><b>spring-beans-5.1.8.RELEASE.jar</b></p></summary>
<p>Spring Beans</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /lib/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.1.8.RELEASE/spring-beans-5.1.8.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.1.8.RELEASE/spring-beans-5.1.8.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.1.8.RELEASE/spring-beans-5.1.8.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.1.6.RELEASE.jar (Root Library)
- spring-web-5.1.8.RELEASE.jar
- :x: **spring-beans-5.1.8.RELEASE.jar** (Vulnerable Library)
</details>
<details><summary><b>spring-beans-5.2.4.RELEASE.jar</b></p></summary>
<p>Spring Beans</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /services/route_manager/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.4.RELEASE/spring-beans-5.2.4.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.2.5.RELEASE.jar (Root Library)
- spring-web-5.2.4.RELEASE.jar
- :x: **spring-beans-5.2.4.RELEASE.jar** (Vulnerable Library)
</details>
<details><summary><b>spring-beans-5.2.6.RELEASE.jar</b></p></summary>
<p>Spring Beans</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /services/security_group_manager/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.6.RELEASE/spring-beans-5.2.6.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.3.0.RELEASE.jar (Root Library)
- spring-web-5.2.6.RELEASE.jar
- :x: **spring-beans-5.2.6.RELEASE.jar** (Vulnerable Library)
</details>
<details><summary><b>spring-beans-5.2.5.RELEASE.jar</b></p></summary>
<p>Spring Beans</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /services/port_manager/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.2.6.RELEASE.jar (Root Library)
- spring-web-5.2.5.RELEASE.jar
- :x: **spring-beans-5.2.5.RELEASE.jar** (Vulnerable Library)
</details>
<details><summary><b>spring-beans-5.2.7.RELEASE.jar</b></p></summary>
<p>Spring Beans</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /services/network_acl_manager/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.7.RELEASE/spring-beans-5.2.7.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.3.1.RELEASE.jar (Root Library)
- spring-web-5.2.7.RELEASE.jar
- :x: **spring-beans-5.2.7.RELEASE.jar** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability in spring-beans, a component associated with Spring Core, allows attackers under certain circumstances to achieve remote code execution; this vulnerability is also known as ״Spring4Shell״ or ״SpringShell״.
The current PoC attack works by creating a specially crafted request that manipulates the ClassLoader to achieve RCE (remote code execution).
Please note that the ease of exploitation may vary with the code implementation.
Currently, the attack depends on the following environment: Tomcat version 9 or above and JDK version 9 or above.
WhiteSource's research team is carefully observing developments and researching the case. We will keep this page and our WhiteSource resources up to date.
This is a temporary WhiteSource ID until an official CVE ID is released.
<p>Publish Date: 2022-03-30
<p>URL: <a href=https://www.cyberkendra.com/2022/03/springshell-rce-0-day-vulnerability.html>WS-2022-0107</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.1.8.RELEASE","packageFilePaths":["/lib/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.1.6.RELEASE;org.springframework:spring-web:5.1.8.RELEASE;org.springframework:spring-beans:5.1.8.RELEASE","isMinimumFixVersionAvailable":false,"isBinary":false},{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.2.4.RELEASE","packageFilePaths":["/services/route_manager/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.2.5.RELEASE;org.springframework:spring-web:5.2.4.RELEASE;org.springframework:spring-beans:5.2.4.RELEASE","isMinimumFixVersionAvailable":false,"isBinary":false},{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.2.6.RELEASE","packageFilePaths":["/services/security_group_manager/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.3.0.RELEASE;org.springframework:spring-web:5.2.6.RELEASE;org.springframework:spring-beans:5.2.6.RELEASE","isMinimumFixVersionAvailable":false,"isBinary":false},{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.2.5.RELEASE","packageFilePaths":["/services/port_manager/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.2.6.RELEASE;org.springframework:spring-web:5.2.5.RELEASE;org.springframework:spring-beans:5.2.5.RELEASE","isMinimumFixVersionAvailable":false,"isBinary":false},{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.2.7.RELEASE","packageFilePaths":["/services/network_acl_manager/pom.xml"],"isTransi
tiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.3.1.RELEASE;org.springframework:spring-web:5.2.7.RELEASE;org.springframework:spring-beans:5.2.7.RELEASE","isMinimumFixVersionAvailable":false,"isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2022-0107","vulnerabilityDetails":"Vulnerability in Spring-beans which is a component associated with Spring Core allows attackers under certain circumstances to achieve remote code execution, this vulnerability is also known as ״Spring4Shell״ or ״SpringShell״. \n \nThe current POC related to the attack is done by creating a specially crafted request which manipulates ClassLoader to successfully achieve RCE (Remote Code Execution).\n \nPlease note that the ease of exploitation may diverge by the code implementation.\nCurrently, the attack depends on the following environment - Tomcat version 9 or above, JDK version 9 or above.\n \nWhiteSource\u0027s research team is carefully observing developments and researching the case. We will keep updating this page and our WhiteSource resources with updates. \n \nThis is a temporary WhiteSource ID until an official CVE ID will be released.\n","vulnerabilityUrl":"https://www.cyberkendra.com/2022/03/springshell-rce-0-day-vulnerability.html","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
WS-2022-0107 (High) detected in multiple libraries - ## WS-2022-0107 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>spring-beans-5.1.8.RELEASE.jar</b>, <b>spring-beans-5.2.4.RELEASE.jar</b>, <b>spring-beans-5.2.6.RELEASE.jar</b>, <b>spring-beans-5.2.5.RELEASE.jar</b>, <b>spring-beans-5.2.7.RELEASE.jar</b></p></summary>
<p>
<details><summary><b>spring-beans-5.1.8.RELEASE.jar</b></p></summary>
<p>Spring Beans</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /lib/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.1.8.RELEASE/spring-beans-5.1.8.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.1.8.RELEASE/spring-beans-5.1.8.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.1.8.RELEASE/spring-beans-5.1.8.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.1.6.RELEASE.jar (Root Library)
- spring-web-5.1.8.RELEASE.jar
- :x: **spring-beans-5.1.8.RELEASE.jar** (Vulnerable Library)
</details>
<details><summary><b>spring-beans-5.2.4.RELEASE.jar</b></p></summary>
<p>Spring Beans</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /services/route_manager/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.4.RELEASE/spring-beans-5.2.4.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.2.5.RELEASE.jar (Root Library)
- spring-web-5.2.4.RELEASE.jar
- :x: **spring-beans-5.2.4.RELEASE.jar** (Vulnerable Library)
</details>
<details><summary><b>spring-beans-5.2.6.RELEASE.jar</b></p></summary>
<p>Spring Beans</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /services/security_group_manager/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.6.RELEASE/spring-beans-5.2.6.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.3.0.RELEASE.jar (Root Library)
- spring-web-5.2.6.RELEASE.jar
- :x: **spring-beans-5.2.6.RELEASE.jar** (Vulnerable Library)
</details>
<details><summary><b>spring-beans-5.2.5.RELEASE.jar</b></p></summary>
<p>Spring Beans</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /services/port_manager/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.5.RELEASE/spring-beans-5.2.5.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.2.6.RELEASE.jar (Root Library)
- spring-web-5.2.5.RELEASE.jar
- :x: **spring-beans-5.2.5.RELEASE.jar** (Vulnerable Library)
</details>
<details><summary><b>spring-beans-5.2.7.RELEASE.jar</b></p></summary>
<p>Spring Beans</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /services/network_acl_manager/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-beans/5.2.7.RELEASE/spring-beans-5.2.7.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.3.1.RELEASE.jar (Root Library)
- spring-web-5.2.7.RELEASE.jar
- :x: **spring-beans-5.2.7.RELEASE.jar** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Vulnerability in Spring-beans which is a component associated with Spring Core allows attackers under certain circumstances to achieve remote code execution, this vulnerability is also known as "Spring4Shell" or "SpringShell".
The current POC related to the attack is done by creating a specially crafted request which manipulates ClassLoader to successfully achieve RCE (Remote Code Execution).
Please note that the ease of exploitation may diverge by the code implementation.
Currently, the attack depends on the following environment - Tomcat version 9 or above, JDK version 9 or above.
WhiteSource's research team is carefully observing developments and researching the case. We will keep updating this page and our WhiteSource resources with updates.
This is a temporary WhiteSource ID until an official CVE ID will be released.
<p>Publish Date: 2022-03-30
<p>URL: <a href=https://www.cyberkendra.com/2022/03/springshell-rce-0-day-vulnerability.html>WS-2022-0107</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.1.8.RELEASE","packageFilePaths":["/lib/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.1.6.RELEASE;org.springframework:spring-web:5.1.8.RELEASE;org.springframework:spring-beans:5.1.8.RELEASE","isMinimumFixVersionAvailable":false,"isBinary":false},{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.2.4.RELEASE","packageFilePaths":["/services/route_manager/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.2.5.RELEASE;org.springframework:spring-web:5.2.4.RELEASE;org.springframework:spring-beans:5.2.4.RELEASE","isMinimumFixVersionAvailable":false,"isBinary":false},{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.2.6.RELEASE","packageFilePaths":["/services/security_group_manager/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.3.0.RELEASE;org.springframework:spring-web:5.2.6.RELEASE;org.springframework:spring-beans:5.2.6.RELEASE","isMinimumFixVersionAvailable":false,"isBinary":false},{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.2.5.RELEASE","packageFilePaths":["/services/port_manager/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.2.6.RELEASE;org.springframework:spring-web:5.2.5.RELEASE;org.springframework:spring-beans:5.2.5.RELEASE","isMinimumFixVersionAvailable":false,"isBinary":false},{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.2.7.RELEASE","packageFilePaths":["/services/network_acl_manager/pom.xml"],"isTransi
tiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.3.1.RELEASE;org.springframework:spring-web:5.2.7.RELEASE;org.springframework:spring-beans:5.2.7.RELEASE","isMinimumFixVersionAvailable":false,"isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2022-0107","vulnerabilityDetails":"Vulnerability in Spring-beans which is a component associated with Spring Core allows attackers under certain circumstances to achieve remote code execution, this vulnerability is also known as ״Spring4Shell״ or ״SpringShell״. \n \nThe current POC related to the attack is done by creating a specially crafted request which manipulates ClassLoader to successfully achieve RCE (Remote Code Execution).\n \nPlease note that the ease of exploitation may diverge by the code implementation.\nCurrently, the attack depends on the following environment - Tomcat version 9 or above, JDK version 9 or above.\n \nWhiteSource\u0027s research team is carefully observing developments and researching the case. We will keep updating this page and our WhiteSource resources with updates. \n \nThis is a temporary WhiteSource ID until an official CVE ID will be released.\n","vulnerabilityUrl":"https://www.cyberkendra.com/2022/03/springshell-rce-0-day-vulnerability.html","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
ws high detected in multiple libraries ws high severity vulnerability vulnerable libraries spring beans release jar spring beans release jar spring beans release jar spring beans release jar spring beans release jar spring beans release jar spring beans library home page a href path to dependency file lib pom xml path to vulnerable library home wss scanner repository org springframework spring beans release spring beans release jar home wss scanner repository org springframework spring beans release spring beans release jar home wss scanner repository org springframework spring beans release spring beans release jar dependency hierarchy spring boot starter web release jar root library spring web release jar x spring beans release jar vulnerable library spring beans release jar spring beans library home page a href path to dependency file services route manager pom xml path to vulnerable library home wss scanner repository org springframework spring beans release spring beans release jar dependency hierarchy spring boot starter web release jar root library spring web release jar x spring beans release jar vulnerable library spring beans release jar spring beans library home page a href path to dependency file services security group manager pom xml path to vulnerable library home wss scanner repository org springframework spring beans release spring beans release jar dependency hierarchy spring boot starter web release jar root library spring web release jar x spring beans release jar vulnerable library spring beans release jar spring beans library home page a href path to dependency file services port manager pom xml path to vulnerable library home wss scanner repository org springframework spring beans release spring beans release jar home wss scanner repository org springframework spring beans release spring beans release jar home wss scanner repository org springframework spring beans release spring beans release jar home wss scanner repository org 
springframework spring beans release spring beans release jar home wss scanner repository org springframework spring beans release spring beans release jar home wss scanner repository org springframework spring beans release spring beans release jar home wss scanner repository org springframework spring beans release spring beans release jar home wss scanner repository org springframework spring beans release spring beans release jar home wss scanner repository org springframework spring beans release spring beans release jar home wss scanner repository org springframework spring beans release spring beans release jar home wss scanner repository org springframework spring beans release spring beans release jar dependency hierarchy spring boot starter web release jar root library spring web release jar x spring beans release jar vulnerable library spring beans release jar spring beans library home page a href path to dependency file services network acl manager pom xml path to vulnerable library home wss scanner repository org springframework spring beans release spring beans release jar dependency hierarchy spring boot starter web release jar root library spring web release jar x spring beans release jar vulnerable library found in base branch master vulnerability details vulnerability in spring beans which is a component associated with spring core allows attackers under certain circumstances to achieve remote code execution this vulnerability is also known as ״ ״ or ״springshell״ the current poc related to the attack is done by creating a specially crafted request which manipulates classloader to successfully achieve rce remote code execution please note that the ease of exploitation may diverge by the code implementation currently the attack depends on the following environment tomcat version or above jdk version or above whitesource s research team is carefully observing developments and researching the case we will keep updating this page and our whitesource 
resources with updates this is a temporary whitesource id until an official cve id will be released publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree org springframework boot spring boot starter web release org springframework spring web release org springframework spring beans release isminimumfixversionavailable false isbinary false packagetype java groupid org springframework packagename spring beans packageversion release packagefilepaths istransitivedependency true dependencytree org springframework boot spring boot starter web release org springframework spring web release org springframework spring beans release isminimumfixversionavailable false isbinary false packagetype java groupid org springframework packagename spring beans packageversion release packagefilepaths istransitivedependency true dependencytree org springframework boot spring boot starter web release org springframework spring web release org springframework spring beans release isminimumfixversionavailable false isbinary false packagetype java groupid org springframework packagename spring beans packageversion release packagefilepaths istransitivedependency true dependencytree org springframework boot spring boot starter web release org springframework spring web release org springframework spring beans release isminimumfixversionavailable false isbinary false packagetype java groupid org springframework packagename spring beans packageversion release packagefilepaths istransitivedependency true dependencytree org springframework boot spring boot starter web release org springframework spring web 
release org springframework spring beans release isminimumfixversionavailable false isbinary false basebranches vulnerabilityidentifier ws vulnerabilitydetails vulnerability in spring beans which is a component associated with spring core allows attackers under certain circumstances to achieve remote code execution this vulnerability is also known as ״ ״ or ״springshell״ n nthe current poc related to the attack is done by creating a specially crafted request which manipulates classloader to successfully achieve rce remote code execution n nplease note that the ease of exploitation may diverge by the code implementation ncurrently the attack depends on the following environment tomcat version or above jdk version or above n nwhitesource research team is carefully observing developments and researching the case we will keep updating this page and our whitesource resources with updates n nthis is a temporary whitesource id until an official cve id will be released n vulnerabilityurl
| 0
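The WS-2022-0107 record above reports spring-beans arriving transitively through several `spring-boot-starter-web` versions, with no minimum fix version available at scan time. As an illustration only (the patched version below comes from the later official advisory for this issue, CVE-2022-22965, not from the record itself), a Maven `dependencyManagement` override can force the transitive jar up to a fixed release until the Boot starter itself is upgraded:

```xml
<!-- Sketch only: pins the transitive spring-beans dependency above the
     vulnerable 5.1.x-5.2.7 range flagged in the record. 5.2.20.RELEASE is
     the patched 5.2.x line for Spring4Shell (CVE-2022-22965); verify the
     current advisory before adopting a version. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-beans</artifactId>
      <version>5.2.20.RELEASE</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Because the jar is pulled in through `spring-boot-starter-web` (see the dependency hierarchies in the record), confirm the override took effect with `mvn dependency:tree` in each affected module; upgrading the Boot starter to a patched release (2.5.12+ or 2.6.6+) is the cleaner permanent fix, with the pin as an interim override.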
|
12,078
| 14,739,958,944
|
IssuesEvent
|
2021-01-07 08:15:29
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Billing Cycles - Email Attachment Error
|
anc-process anp-1 ant-bug ant-child/secondary
|
In GitLab by @kdjstudios on Oct 1, 2018, 08:52
Hello Team,
Due to #1141 we have created a new ticket to correctly resolve the underlining issue. Please see what is needed below.
- Need to generate a GUI error when an attached file goes missing on the server; this error needs to happen before the emails are sent out.
- We need to know how and why attachment files are going missing in the first place.
- We need to establish a fix to prevent files from being removed from the server.
|
1.0
|
Billing Cycles - Email Attachment Error - In GitLab by @kdjstudios on Oct 1, 2018, 08:52
Hello Team,
Due to #1141 we have created a new ticket to correctly resolve the underlining issue. Please see what is needed below.
- Need to generate a GUI error when an attached file goes missing on the server; this error needs to happen before the emails are sent out.
- We need to know how and why attachment files are going missing in the first place.
- We need to establish a fix to prevent files from being removed from the server.
|
process
|
billing cycles email attachment error in gitlab by kdjstudios on oct hello team due to we have created a new ticket to correctly resolve the underlining issue please see what is needed below need to generate an gui error when an attached file goes missing on the server this error need to happen before the emails are sent out we need to know how and why attachment files are going missing in the first place we need to establish a fix to prevent files from being removed from the server
| 1
|
64,704
| 14,677,224,924
|
IssuesEvent
|
2020-12-30 22:34:39
|
GooseWSS/ksa
|
https://api.github.com/repos/GooseWSS/ksa
|
opened
|
CVE-2020-17510 (High) detected in shiro-web-1.2.0.jar
|
security vulnerability
|
## CVE-2020-17510 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>shiro-web-1.2.0.jar</b></p></summary>
<p>Apache Shiro is a powerful and flexible open-source security framework that cleanly handles
authentication, authorization, enterprise session management, single sign-on and cryptography services.</p>
<p>Path to dependency file: ksa/ksa-web-root/ksa-security-web/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,ksa/ksa-web-root/ksa-web/target/ROOT/WEB-INF/lib/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar</p>
<p>
Dependency Hierarchy:
- :x: **shiro-web-1.2.0.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/GooseWSS/ksa/commit/92b4fc1a7755c8d454d53e9ae803447b86a9521a">92b4fc1a7755c8d454d53e9ae803447b86a9521a</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Apache Shiro before 1.7.0, when using Apache Shiro with Spring, a specially crafted HTTP request may cause an authentication bypass.
<p>Publish Date: 2020-11-05
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-17510>CVE-2020-17510</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://lists.apache.org/thread.html/rc2cff2538b683d480426393eecf1ce8dd80e052fbef49303b4f47171%40%3Cdev.shiro.apache.org%3E">https://lists.apache.org/thread.html/rc2cff2538b683d480426393eecf1ce8dd80e052fbef49303b4f47171%40%3Cdev.shiro.apache.org%3E</a></p>
<p>Release Date: 2020-11-05</p>
<p>Fix Resolution: org.apache.shiro:shiro-web:1.7.0</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.shiro","packageName":"shiro-web","packageVersion":"1.2.0","isTransitiveDependency":false,"dependencyTree":"org.apache.shiro:shiro-web:1.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.shiro:shiro-web:1.7.0"}],"vulnerabilityIdentifier":"CVE-2020-17510","vulnerabilityDetails":"Apache Shiro before 1.7.0, when using Apache Shiro with Spring, a specially crafted HTTP request may cause an authentication bypass.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-17510","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-17510 (High) detected in shiro-web-1.2.0.jar - ## CVE-2020-17510 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>shiro-web-1.2.0.jar</b></p></summary>
<p>Apache Shiro is a powerful and flexible open-source security framework that cleanly handles
authentication, authorization, enterprise session management, single sign-on and cryptography services.</p>
<p>Path to dependency file: ksa/ksa-web-root/ksa-security-web/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar,ksa/ksa-web-root/ksa-web/target/ROOT/WEB-INF/lib/shiro-web-1.2.0.jar,canner/.m2/repository/org/apache/shiro/shiro-web/1.2.0/shiro-web-1.2.0.jar</p>
<p>
Dependency Hierarchy:
- :x: **shiro-web-1.2.0.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/GooseWSS/ksa/commit/92b4fc1a7755c8d454d53e9ae803447b86a9521a">92b4fc1a7755c8d454d53e9ae803447b86a9521a</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Apache Shiro before 1.7.0, when using Apache Shiro with Spring, a specially crafted HTTP request may cause an authentication bypass.
<p>Publish Date: 2020-11-05
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-17510>CVE-2020-17510</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://lists.apache.org/thread.html/rc2cff2538b683d480426393eecf1ce8dd80e052fbef49303b4f47171%40%3Cdev.shiro.apache.org%3E">https://lists.apache.org/thread.html/rc2cff2538b683d480426393eecf1ce8dd80e052fbef49303b4f47171%40%3Cdev.shiro.apache.org%3E</a></p>
<p>Release Date: 2020-11-05</p>
<p>Fix Resolution: org.apache.shiro:shiro-web:1.7.0</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.shiro","packageName":"shiro-web","packageVersion":"1.2.0","isTransitiveDependency":false,"dependencyTree":"org.apache.shiro:shiro-web:1.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.shiro:shiro-web:1.7.0"}],"vulnerabilityIdentifier":"CVE-2020-17510","vulnerabilityDetails":"Apache Shiro before 1.7.0, when using Apache Shiro with Spring, a specially crafted HTTP request may cause an authentication bypass.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-17510","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in shiro web jar cve high severity vulnerability vulnerable library shiro web jar apache shiro is a powerful and flexible open source security framework that cleanly handles authentication authorization enterprise session management single sign on and cryptography services path to dependency file ksa ksa web root ksa security web pom xml path to vulnerable library canner repository org apache shiro shiro web shiro web jar canner repository org apache shiro shiro web shiro web jar canner repository org apache shiro shiro web shiro web jar canner repository org apache shiro shiro web shiro web jar canner repository org apache shiro shiro web shiro web jar canner repository org apache shiro shiro web shiro web jar canner repository org apache shiro shiro web shiro web jar ksa ksa web root ksa web target root web inf lib shiro web jar canner repository org apache shiro shiro web shiro web jar dependency hierarchy x shiro web jar vulnerable library found in head commit a href found in base branch master vulnerability details apache shiro before when using apache shiro with spring a specially crafted http request may cause an authentication bypass publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache shiro shiro web rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails apache shiro before when using apache shiro with spring a specially crafted http request may cause an authentication bypass vulnerabilityurl
| 0
|
4,473
| 7,341,274,826
|
IssuesEvent
|
2018-03-07 01:12:00
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Confusion between 'upgrade' and 'update' domains
|
cxp doc-bug in-process service-fabric triaged
|
There are multiple places in this page where Upgrade Domains are referred to as 'Update Domain'. This causes the page to be low in search rankings. A sev 2 customer impacting bug happened this week because they didn't configure their upgrade domains properly.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: b6e372ba-4d67-6573-aab2-d0ce63776c82
* Version Independent ID: c91844b0-2fee-bd5b-c7f9-a52d16cddf4b
* [Content](https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-application-upgrade#feedback)
* [Content Source](https://github.com/Microsoft/azure-docs/blob/master/articles/service-fabric/service-fabric-application-upgrade.md)
* Service: service-fabric
|
1.0
|
Confusion between 'upgrade' and 'update' domains - There are multiple places in this page where Upgrade Domains are referred to as 'Update Domain'. This causes the page to be low in search rankings. A sev 2 customer impacting bug happened this week because they didn't configure their upgrade domains properly.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: b6e372ba-4d67-6573-aab2-d0ce63776c82
* Version Independent ID: c91844b0-2fee-bd5b-c7f9-a52d16cddf4b
* [Content](https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-application-upgrade#feedback)
* [Content Source](https://github.com/Microsoft/azure-docs/blob/master/articles/service-fabric/service-fabric-application-upgrade.md)
* Service: service-fabric
|
process
|
confusion between upgrade and update domains there are multiple places in this page where upgrade domains are referred to as update domain this causes the page to be low in search rankings a sev customer impacting bug happened this week because they didn t configure their upgrade domains properly document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id service service fabric
| 1
|
366,526
| 25,589,686,920
|
IssuesEvent
|
2022-12-01 12:03:50
|
Componolit/RecordFlux
|
https://api.github.com/repos/Componolit/RecordFlux
|
closed
|
Sequence type is not referenced anywhere in LR
|
bug documentation small
|
The `sequence_type` production is not referenced anywhere. It should probably become an alternative of `basic_declaration` in the [language reference](https://github.com/Componolit/RecordFlux/blob/main/doc/language_reference/index.rst).
EDIT: same for `type_derivation`.
|
1.0
|
Sequence type is not referenced anywhere in LR - The `sequence_type` production is not referenced anywhere. It should probably become an alternative of `basic_declaration` in the [language reference](https://github.com/Componolit/RecordFlux/blob/main/doc/language_reference/index.rst).
EDIT: same for `type_derivation`.
|
non_process
|
sequence type is not referenced anywhere in lr the sequence type production is not referenced anywhere it should probably become an alternative of basic declaration in the edit same for type derivation
| 0
|
576
| 3,041,353,576
|
IssuesEvent
|
2015-08-07 20:46:32
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
Branch filtering does not seem to work with entire DITA Maps [DOT 2.x develop branch]
|
bug filtering P2 preprocess
|
I uploaded a sample project here:
http://www.oxygenxml.com/forum/files/branchFiltering.zip
you should publish to XHTML the test.ditamap which has content like:
<!DOCTYPE map PUBLIC "-//OASIS//DTD DITA 1.3 Map//EN" "map.dtd">
<map>
<title>DITA Topic Map</title>
<topicref href="flowers.ditamap" format="ditamap">
<ditavalref href="expert.ditaval"/>
</topicref>
<topicref href="flowers.ditamap" format="ditamap">
<ditavalref href="novice.ditaval"/>
</topicref>
</map>
and in the resulting TOC the flowers content appears twice but it does not seem to be profiled in any way.
|
1.0
|
Branch filtering does not seem to work with entire DITA Maps [DOT 2.x develop branch] - I uploaded a sample project here:
http://www.oxygenxml.com/forum/files/branchFiltering.zip
you should publish to XHTML the test.ditamap which has content like:
<!DOCTYPE map PUBLIC "-//OASIS//DTD DITA 1.3 Map//EN" "map.dtd">
<map>
<title>DITA Topic Map</title>
<topicref href="flowers.ditamap" format="ditamap">
<ditavalref href="expert.ditaval"/>
</topicref>
<topicref href="flowers.ditamap" format="ditamap">
<ditavalref href="novice.ditaval"/>
</topicref>
</map>
and in the resulting TOC the flowers content appears twice but it does not seem to be profiled in any way.
|
process
|
branch filtering does not seem to work with entire dita maps i uploaded a sample project here you should publish to xhtml the test ditamap which has content like dita topic map and in the resulting toc the flowers content appears twice but it does not seem to be profiled in any way
| 1
|
49,300
| 20,730,475,330
|
IssuesEvent
|
2022-03-14 08:58:45
|
reapit/foundations
|
https://api.github.com/repos/reapit/foundations
|
opened
|
Spike: Ideal architecture for Deployment Service
|
front-end iaas deployment-service task
|
**Background context or User story:**
_Post initial release, we should look at improving the architecture for the Deployment service_
**Specification or Acceptance Criteria:**
- Is the current 1:1 between lambdas and endpoints correct? Would Express / Nest make more sense?
- Can we break out into multiple services?
|
1.0
|
Spike: Ideal architecture for Deployment Service - **Background context or User story:**
_Post initial release, we should look at improving the architecture for the Deployment service_
**Specification or Acceptance Criteria:**
- Is the current 1:1 between lambdas and endpoints correct? Would Express / Nest make more sense?
- Can we break out into multiple services?
|
non_process
|
spike ideal architecture for deployment service background context or user story post initial release we should look at improving the architecture for the deployment service specification or acceptance criteria is the current between lambdas and endpoints correct would express nest make more sense can we break out into multiple services
| 0
|
21,170
| 14,407,897,806
|
IssuesEvent
|
2020-12-03 22:40:47
|
dotnet/aspnetcore
|
https://api.github.com/repos/dotnet/aspnetcore
|
closed
|
Download, configure and run the official source code of ASP.NET Core 3.x, report this error after executing the restore command (下载、配置、运行 ASP.NET Core 3.x 官方源码,执行restore命令之后报此错误!!)
|
Needs: Attention :wave: area-infrastructure
|
C:\Users\Administrator\.nuget\packages\microsoft.build.tasks.git\1.1.0-beta-20206-02\build\Microsoft.Build.Tasks.Git.targets(24,5): error : Unable to locate repository with working directory that contains directory 'D:\aspnetcore-master\src\Analyzers\Internal.AspNetCore.Analyzers\src'. [D:\aspnetcore-master\src\Analyzers\Internal.AspNetCore.Analyzers\src\Internal.AspNetCore.Analyzers.csproj]
C:\Users\Administrator\.nuget\packages\microsoft.build.tasks.git\1.1.0-beta-20206-02\build\Microsoft.Build.Tasks.Git.targets(47,5): error : Unable to locate repository with working directory that contains directory 'D:\aspnetcore-master\src\Analyzers\Internal.AspNetCore.Analyzers\src'. [D:\aspnetcore-master\src\Analyzers\Internal.AspNetCore.Analyzers\src\Internal.AspNetCore.Analyzers.csproj]
C:\Users\Administrator\.nuget\packages\microsoft.sourcelink.common\1.1.0-beta-20206-02\build\Microsoft.SourceLink.Common.targets(52,5): error : Source control information is not available - the generated source link is empty. [D:\aspnetcore-master\src\Analyzers\Internal.AspNetCore.Analyzers\src\Internal.AspNetCore.Analyzers.csproj]
Internal.AspNetCore.Analyzers -> D:\aspnetcore-master\artifacts\bin\Internal.AspNetCore.Analyzers\Release\netstandard1.3\Internal.AspNetCore.Analyzers.dll
C:\Users\Administrator\.nuget\packages\microsoft.build.tasks.git\1.1.0-beta-20206-02\build\Microsoft.Build.Tasks.Git.targets(24,5): error : Unable to locate repository with working directory that contains directory 'D:\aspnetcore-master\eng\tools\RepoTasks'. [D:\aspnetcore-master\eng\tools\RepoTasks\RepoTasks.csproj]
C:\Users\Administrator\.nuget\packages\microsoft.build.tasks.git\1.1.0-beta-20206-02\build\Microsoft.Build.Tasks.Git.targets(47,5): error : Unable to locate repository with working directory that contains directory 'D:\aspnetcore-master\eng\tools\RepoTasks'. [D:\aspnetcore-master\eng\tools\RepoTasks\RepoTasks.csproj]
C:\Users\Administrator\.nuget\packages\microsoft.sourcelink.common\1.1.0-beta-20206-02\build\Microsoft.SourceLink.Common.targets(52,5): error : Source control information is not available - the generated source link is empty. [D:\aspnetcore-master\eng\tools\RepoTasks\RepoTasks.csproj]
RepoTasks -> D:\aspnetcore-master\artifacts\bin\RepoTasks\Release\net5.0\RepoTasks.dll
C:\Users\Administrator\.nuget\packages\microsoft.build.tasks.git\1.1.0-beta-20206-02\build\Microsoft.Build.Tasks.Git.targets(24,5): error : Unable to locate repository with working directory that contains directory 'D:\aspnetcore-master\eng\tools\RepoTasks'. [D:\aspnetcore-master\eng\tools\RepoTasks\RepoTasks.csproj]
C:\Users\Administrator\.nuget\packages\microsoft.build.tasks.git\1.1.0-beta-20206-02\build\Microsoft.Build.Tasks.Git.targets(47,5): error : Unable to locate repository with working directory that contains directory 'D:\aspnetcore-master\eng\tools\RepoTasks'. [D:\aspnetcore-master\eng\tools\RepoTasks\RepoTasks.csproj]
C:\Users\Administrator\.nuget\packages\microsoft.sourcelink.common\1.1.0-beta-20206-02\build\Microsoft.SourceLink.Common.targets(52,5): error : Source control information is not available - the generated source link is empty. [D:\aspnetcore-master\eng\tools\RepoTasks\RepoTasks.csproj]
RepoTasks -> D:\aspnetcore-master\artifacts\bin\RepoTasks\Release\net472\RepoTasks.dll
Build failed.
|
1.0
|
Download, configure and run the official source code of ASP.NET Core 3.x, report this error after executing the restore command (下载、配置、运行 ASP.NET Core 3.x 官方源码,执行restore命令之后报此错误!!) -
C:\Users\Administrator\.nuget\packages\microsoft.build.tasks.git\1.1.0-beta-20206-02\build\Microsoft.Build.Tasks.Git.targets(24,5): error : Unable to locate repository with working directory that contains directory 'D:\aspnetcore-master\src\Analyzers\Internal.AspNetCore.Analyzers\src'. [D:\aspnetcore-master\src\Analyzers\Internal.AspNetCore.Analyzers\src\Internal.AspNetCore.Analyzers.csproj]
C:\Users\Administrator\.nuget\packages\microsoft.build.tasks.git\1.1.0-beta-20206-02\build\Microsoft.Build.Tasks.Git.targets(47,5): error : Unable to locate repository with working directory that contains directory 'D:\aspnetcore-master\src\Analyzers\Internal.AspNetCore.Analyzers\src'. [D:\aspnetcore-master\src\Analyzers\Internal.AspNetCore.Analyzers\src\Internal.AspNetCore.Analyzers.csproj]
C:\Users\Administrator\.nuget\packages\microsoft.sourcelink.common\1.1.0-beta-20206-02\build\Microsoft.SourceLink.Common.targets(52,5): error : Source control information is not available - the generated source link is empty. [D:\aspnetcore-master\src\Analyzers\Internal.AspNetCore.Analyzers\src\Internal.AspNetCore.Analyzers.csproj]
Internal.AspNetCore.Analyzers -> D:\aspnetcore-master\artifacts\bin\Internal.AspNetCore.Analyzers\Release\netstandard1.3\Internal.AspNetCore.Analyzers.dll
C:\Users\Administrator\.nuget\packages\microsoft.build.tasks.git\1.1.0-beta-20206-02\build\Microsoft.Build.Tasks.Git.targets(24,5): error : Unable to locate repository with working directory that contains directory 'D:\aspnetcore-master\eng\tools\RepoTasks'. [D:\aspnetcore-master\eng\tools\RepoTasks\RepoTasks.csproj]
C:\Users\Administrator\.nuget\packages\microsoft.build.tasks.git\1.1.0-beta-20206-02\build\Microsoft.Build.Tasks.Git.targets(47,5): error : Unable to locate repository with working directory that contains directory 'D:\aspnetcore-master\eng\tools\RepoTasks'. [D:\aspnetcore-master\eng\tools\RepoTasks\RepoTasks.csproj]
C:\Users\Administrator\.nuget\packages\microsoft.sourcelink.common\1.1.0-beta-20206-02\build\Microsoft.SourceLink.Common.targets(52,5): error : Source control information is not available - the generated source link is empty. [D:\aspnetcore-master\eng\tools\RepoTasks\RepoTasks.csproj]
RepoTasks -> D:\aspnetcore-master\artifacts\bin\RepoTasks\Release\net5.0\RepoTasks.dll
C:\Users\Administrator\.nuget\packages\microsoft.build.tasks.git\1.1.0-beta-20206-02\build\Microsoft.Build.Tasks.Git.targets(24,5): error : Unable to locate repository with working directory that contains directory 'D:\aspnetcore-master\eng\tools\RepoTasks'. [D:\aspnetcore-master\eng\tools\RepoTasks\RepoTasks.csproj]
C:\Users\Administrator\.nuget\packages\microsoft.build.tasks.git\1.1.0-beta-20206-02\build\Microsoft.Build.Tasks.Git.targets(47,5): error : Unable to locate repository with working directory that contains directory 'D:\aspnetcore-master\eng\tools\RepoTasks'. [D:\aspnetcore-master\eng\tools\RepoTasks\RepoTasks.csproj]
C:\Users\Administrator\.nuget\packages\microsoft.sourcelink.common\1.1.0-beta-20206-02\build\Microsoft.SourceLink.Common.targets(52,5): error : Source control information is not available - the generated source link is empty. [D:\aspnetcore-master\eng\tools\RepoTasks\RepoTasks.csproj]
RepoTasks -> D:\aspnetcore-master\artifacts\bin\RepoTasks\Release\net472\RepoTasks.dll
Build failed.
|
non_process
|
download configure and run the official source code of asp net core x report this error after executing the restore command 下载、配置、运行 asp net core x 官方源码,执行restore命令之后报此错误!! c users administrator nuget packages microsoft build tasks git beta build microsoft build tasks git targets error unable to locate repository with working directory that contains directory d aspnetcore master src analyzers internal aspnetcore analyzers src c users administrator nuget packages microsoft build tasks git beta build microsoft build tasks git targets error unable to locate repository with working directory that contains directory d aspnetcore master src analyzers internal aspnetcore analyzers src c users administrator nuget packages microsoft sourcelink common beta build microsoft sourcelink common targets error source control information is not available the generated source link is empty internal aspnetcore analyzers d aspnetcore master artifacts bin internal aspnetcore analyzers release internal aspnetcore analyzers dll c users administrator nuget packages microsoft build tasks git beta build microsoft build tasks git targets error unable to locate repository with working directory that contains directory d aspnetcore master eng tools repotasks c users administrator nuget packages microsoft build tasks git beta build microsoft build tasks git targets error unable to locate repository with working directory that contains directory d aspnetcore master eng tools repotasks c users administrator nuget packages microsoft sourcelink common beta build microsoft sourcelink common targets error source control information is not available the generated source link is empty repotasks d aspnetcore master artifacts bin repotasks release repotasks dll c users administrator nuget packages microsoft build tasks git beta build microsoft build tasks git targets error unable to locate repository with working directory that contains directory d aspnetcore master eng tools repotasks c users 
administrator nuget packages microsoft build tasks git beta build microsoft build tasks git targets error unable to locate repository with working directory that contains directory d aspnetcore master eng tools repotasks c users administrator nuget packages microsoft sourcelink common beta build microsoft sourcelink common targets error source control information is not available the generated source link is empty repotasks d aspnetcore master artifacts bin repotasks release repotasks dll build failed
| 0
|
123,201
| 17,772,187,104
|
IssuesEvent
|
2021-08-30 14:50:05
|
kapseliboi/johnny-five
|
https://api.github.com/repos/kapseliboi/johnny-five
|
opened
|
CVE-2018-16487 (Medium) detected in lodash-4.6.1.tgz, lodash-3.10.1.tgz
|
security vulnerability
|
## CVE-2018-16487 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-4.6.1.tgz</b>, <b>lodash-3.10.1.tgz</b></p></summary>
<p>
<details><summary><b>lodash-4.6.1.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.6.1.tgz">https://registry.npmjs.org/lodash/-/lodash-4.6.1.tgz</a></p>
<p>Path to dependency file: johnny-five/package.json</p>
<p>Path to vulnerable library: johnny-five/node_modules/grunt-jscs/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-jscs-3.0.1.tgz (Root Library)
- :x: **lodash-4.6.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-3.10.1.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p>
<p>Path to dependency file: johnny-five/package.json</p>
<p>Path to vulnerable library: johnny-five/node_modules/jsdoctypeparser/node_modules/lodash/package.json,johnny-five/node_modules/jscs/node_modules/lodash/package.json,johnny-five/node_modules/xmlbuilder/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-jscs-3.0.1.tgz (Root Library)
- jscs-3.0.7.tgz
- :x: **lodash-3.10.1.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/kapseliboi/johnny-five/commit/e9c804e0b9bcf7cdcf4bc55c904465e172e2cf6a">e9c804e0b9bcf7cdcf4bc55c904465e172e2cf6a</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A prototype pollution vulnerability was found in lodash <4.17.11 where the functions merge, mergeWith, and defaultsDeep can be tricked into adding or modifying properties of Object.prototype.
<p>Publish Date: 2019-02-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-16487>CVE-2018-16487</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2018-16487">https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2018-16487</a></p>
<p>Release Date: 2019-02-01</p>
<p>Fix Resolution: 4.17.11</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2018-16487 (Medium) detected in lodash-4.6.1.tgz, lodash-3.10.1.tgz - ## CVE-2018-16487 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-4.6.1.tgz</b>, <b>lodash-3.10.1.tgz</b></p></summary>
<p>
<details><summary><b>lodash-4.6.1.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.6.1.tgz">https://registry.npmjs.org/lodash/-/lodash-4.6.1.tgz</a></p>
<p>Path to dependency file: johnny-five/package.json</p>
<p>Path to vulnerable library: johnny-five/node_modules/grunt-jscs/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-jscs-3.0.1.tgz (Root Library)
- :x: **lodash-4.6.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-3.10.1.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p>
<p>Path to dependency file: johnny-five/package.json</p>
<p>Path to vulnerable library: johnny-five/node_modules/jsdoctypeparser/node_modules/lodash/package.json,johnny-five/node_modules/jscs/node_modules/lodash/package.json,johnny-five/node_modules/xmlbuilder/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-jscs-3.0.1.tgz (Root Library)
- jscs-3.0.7.tgz
- :x: **lodash-3.10.1.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/kapseliboi/johnny-five/commit/e9c804e0b9bcf7cdcf4bc55c904465e172e2cf6a">e9c804e0b9bcf7cdcf4bc55c904465e172e2cf6a</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A prototype pollution vulnerability was found in lodash <4.17.11 where the functions merge, mergeWith, and defaultsDeep can be tricked into adding or modifying properties of Object.prototype.
<p>Publish Date: 2019-02-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-16487>CVE-2018-16487</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2018-16487">https://bugzilla.redhat.com/show_bug.cgi?id=CVE-2018-16487</a></p>
<p>Release Date: 2019-02-01</p>
<p>Fix Resolution: 4.17.11</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in lodash tgz lodash tgz cve medium severity vulnerability vulnerable libraries lodash tgz lodash tgz lodash tgz lodash modular utilities library home page a href path to dependency file johnny five package json path to vulnerable library johnny five node modules grunt jscs node modules lodash package json dependency hierarchy grunt jscs tgz root library x lodash tgz vulnerable library lodash tgz the modern build of lodash modular utilities library home page a href path to dependency file johnny five package json path to vulnerable library johnny five node modules jsdoctypeparser node modules lodash package json johnny five node modules jscs node modules lodash package json johnny five node modules xmlbuilder node modules lodash package json dependency hierarchy grunt jscs tgz root library jscs tgz x lodash tgz vulnerable library found in head commit a href found in base branch main vulnerability details a prototype pollution vulnerability was found in lodash where the functions merge mergewith and defaultsdeep can be tricked into adding or modifying properties of object prototype publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
5,773
| 13,152,598,823
|
IssuesEvent
|
2020-08-09 23:15:59
|
docwhite/dreamdrugs
|
https://api.github.com/repos/docwhite/dreamdrugs
|
closed
|
Explicitly route socket.io traffic from nginx so just the correct headers are added.
|
architecture enhancement
|
Right now the react application and the sockets share the same location.
|
1.0
|
Explicitly route socket.io traffic from nginx so just the correct headers are added. - Right now the react application and the sockets share the same location.
|
non_process
|
explicitly route socket io traffic from nginx so just the correct headers are added right now the react application and the sockets share the same location
| 0
|
21,040
| 27,980,749,743
|
IssuesEvent
|
2023-03-26 05:17:21
|
open-telemetry/opentelemetry-collector-contrib
|
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
|
closed
|
[k8sprocessor] Watch pods in a set of namespaces
|
enhancement help wanted Stale processor/k8sattributes closed as inactive
|
**Is your feature request related to a problem? Please describe.**
The processor is currently able to watch pods in a single namespace, on a single node, or across the entire cluster. In multi-tenant clusters, watching every pod is at best inefficient and worst forbidden, while looking at a single node is best-suited for DaemonSet deployments. This means the only option left is to have a processor or collector for each namespace of a multi-namespace application.
**Describe the solution you'd like**
It would be great if instead of a single namespace as is currently the case:
```yaml
k8s_tagger:
filter:
namespace: foo
```
A list of namespaces could be passed:
```yaml
k8s_tagger:
filter:
namespaces:
- foo
- bar
- baz
```
**Describe alternatives you've considered**
- Run an OTel Collector deployment per namespace, either consolidated in a single namespace, or in the respective namespaces.
- Configure an instance of `k8s_tagger` for each namespace and list them in the pipeline, assuming the correct one will pick up a given pod and all others will ignore it.
- Configure a distinct pipeline for each namespace, each with a `k8s_tagger` instance filtering by that namespace. Each could use a receiver listening on a dedicated port.
**Additional context**
Prometheus's [K8s service discovery](https://prometheus.io/docs/prometheus/latest/configuration/configuration/#kubernetes_sd_config) is capable of watching multiple namespaces at once:
```yaml
kubernetes_sd_configs:
- namespaces:
names:
- foo
- bar
- baz
```
|
1.0
|
[k8sprocessor] Watch pods in a set of namespaces - **Is your feature request related to a problem? Please describe.**
The processor is currently able to watch pods in a single namespace, on a single node, or across the entire cluster. In multi-tenant clusters, watching every pod is at best inefficient and worst forbidden, while looking at a single node is best-suited for DaemonSet deployments. This means the only option left is to have a processor or collector for each namespace of a multi-namespace application.
**Describe the solution you'd like**
It would be great if instead of a single namespace as is currently the case:
```yaml
k8s_tagger:
filter:
namespace: foo
```
A list of namespaces could be passed:
```yaml
k8s_tagger:
filter:
namespaces:
- foo
- bar
- baz
```
**Describe alternatives you've considered**
- Run an OTel Collector deployment per namespace, either consolidated in a single namespace, or in the respective namespaces.
- Configure an instance of `k8s_tagger` for each namespace and list them in the pipeline, assuming the correct one will pick up a given pod and all others will ignore it.
- Configure a distinct pipeline for each namespace, each with a `k8s_tagger` instance filtering by that namespace. Each could use a receiver listening on a dedicated port.
**Additional context**
Prometheus's [K8s service discovery](https://prometheus.io/docs/prometheus/latest/configuration/configuration/#kubernetes_sd_config) is capable of watching multiple namespaces at once:
```yaml
kubernetes_sd_configs:
- namespaces:
names:
- foo
- bar
- baz
```
|
process
|
watch pods in a set of namespaces is your feature request related to a problem please describe the processor is currently able to watch pods in a single namespace on a single node or across the entire cluster in multi tenant clusters watching every pod is at best inefficient and worst forbidden while looking at a single node is best suited for daemonset deployments this means the only option left is to have a processor or collector for each namespace of a multi namespace application describe the solution you d like it would be great if instead of a single namespace as is currently the case yaml tagger filter namespace foo a list of namespaces could be passed yaml tagger filter namespaces foo bar baz describe alternatives you ve considered run an otel collector deployment per namespace either consolidated in a single namespace or in the respective namespaces configure an instance of tagger for each namespace and list them in the pipeline assuming the correct one will pick up a given pod and all others will ignore it configure a distinct pipeline for each namespace each with a tagger instance filtering by that namespace each could use a receiver listening on a dedicated port additional context prometheus s is capable of watching multiple namespaces at once yaml kubernetes sd configs namespaces names foo bar baz
| 1
|
16,987
| 22,351,248,724
|
IssuesEvent
|
2022-06-15 12:16:27
|
python/cpython
|
https://api.github.com/repos/python/cpython
|
closed
|
NetBSD: do not use POSIX semaphores
|
type-bug interpreter-core 3.11 expert-multiprocessing
|
BPO | [46045](https://bugs.python.org/issue46045)
--- | :---
Nosy | @0-wiz-0, @serhiy-storchaka
PRs | <li>python/cpython#30047</li>
<sup>*Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.*</sup>
<details><summary>Show more details</summary><p>
GitHub fields:
```python
assignee = None
closed_at = None
created_at = <Date 2021-12-11.10:31:04.604>
labels = ['interpreter-core', 'type-bug', '3.11']
title = 'NetBSD: do not use POSIX semaphores'
updated_at = <Date 2022-01-18.22:48:10.053>
user = 'https://github.com/0-wiz-0'
```
bugs.python.org fields:
```python
activity = <Date 2022-01-18.22:48:10.053>
actor = 'wiz'
assignee = 'none'
closed = False
closed_date = None
closer = None
components = ['Interpreter Core']
creation = <Date 2021-12-11.10:31:04.604>
creator = 'wiz'
dependencies = []
files = []
hgrepos = []
issue_num = 46045
keywords = ['patch']
message_count = 4.0
messages = ['408291', '410581', '410888', '410910']
nosy_count = 2.0
nosy_names = ['wiz', 'serhiy.storchaka']
pr_nums = ['30047']
priority = 'normal'
resolution = None
stage = 'patch review'
status = 'open'
superseder = None
type = 'behavior'
url = 'https://bugs.python.org/issue46045'
versions = ['Python 3.11']
```
</p></details>
|
1.0
|
NetBSD: do not use POSIX semaphores - BPO | [46045](https://bugs.python.org/issue46045)
--- | :---
Nosy | @0-wiz-0, @serhiy-storchaka
PRs | <li>python/cpython#30047</li>
<sup>*Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.*</sup>
<details><summary>Show more details</summary><p>
GitHub fields:
```python
assignee = None
closed_at = None
created_at = <Date 2021-12-11.10:31:04.604>
labels = ['interpreter-core', 'type-bug', '3.11']
title = 'NetBSD: do not use POSIX semaphores'
updated_at = <Date 2022-01-18.22:48:10.053>
user = 'https://github.com/0-wiz-0'
```
bugs.python.org fields:
```python
activity = <Date 2022-01-18.22:48:10.053>
actor = 'wiz'
assignee = 'none'
closed = False
closed_date = None
closer = None
components = ['Interpreter Core']
creation = <Date 2021-12-11.10:31:04.604>
creator = 'wiz'
dependencies = []
files = []
hgrepos = []
issue_num = 46045
keywords = ['patch']
message_count = 4.0
messages = ['408291', '410581', '410888', '410910']
nosy_count = 2.0
nosy_names = ['wiz', 'serhiy.storchaka']
pr_nums = ['30047']
priority = 'normal'
resolution = None
stage = 'patch review'
status = 'open'
superseder = None
type = 'behavior'
url = 'https://bugs.python.org/issue46045'
versions = ['Python 3.11']
```
</p></details>
|
process
|
netbsd do not use posix semaphores bpo nosy wiz serhiy storchaka prs python cpython note these values reflect the state of the issue at the time it was migrated and might not reflect the current state show more details github fields python assignee none closed at none created at labels title netbsd do not use posix semaphores updated at user bugs python org fields python activity actor wiz assignee none closed false closed date none closer none components creation creator wiz dependencies files hgrepos issue num keywords message count messages nosy count nosy names pr nums priority normal resolution none stage patch review status open superseder none type behavior url versions
| 1
|
6,906
| 10,058,075,263
|
IssuesEvent
|
2019-07-22 13:11:01
|
syndesisio/syndesis
|
https://api.github.com/repos/syndesisio/syndesis
|
closed
|
Base docker S2I image with camel-fuse.x.y.z dependencies
|
cat/discussion cat/enhancement cat/process
|
Coming from a Retro: The creation of a docker image with dependencies could help as a way to facilitate the development environment set-up.
|
1.0
|
Base docker S2I image with camel-fuse.x.y.z dependencies - Coming from a Retro: The creation of a docker image with dependencies could help as a way to facilitate the development environment set-up.
|
process
|
base docker image with camel fuse x y z dependencies coming from a retro the creation of a docker image with dependencies could help as a way to facilitate the development environment set up
| 1
|
531
| 2,999,851,743
|
IssuesEvent
|
2015-07-23 21:11:56
|
meteor/meteor
|
https://api.github.com/repos/meteor/meteor
|
opened
|
Preview of batch plugins and other 1.2 features
|
Project:Release Process
|
This issue is a good place to track concerns found with the `PLUGINS-PREVIEW@1` release.
We've been working hard over the past few months on a number of features for Meteor 1.2. @Slava and I have been focused on adding new capabilities to build plugins, and we have a preview release ready! Try it out with `meteor --release PLUGINS-PREVIEW@1`.
This release adds new plugin APIs: `Plugin.registerCompiler`, `Plugin.registerLinter`, and `Plugin.registerMinifier`.
`registerCompiler` is a replacement for the now-deprecated `registerSourceHandler` feature. Compilers are like source handlers, but they always run when you're building your app instead of when packages are published, and they have access to all of the relevant files in your app and its packages at once. This lets us implement things like CSS preprocessor `@import`s that work across package boundaries.
`registerLinter` allows you to write linters that handle different file types and show warnings when you run your app or publish a package (or run the new `meteor lint` command).
`registerMinifier` allows you to define your own minifiers to use instead of the standard minifiers, which are no longer baked into the tool but now can be replaced.
The new APIs are documented at https://github.com/meteor/meteor/wiki/Build-Plugins-API
Some notes:
* You'll need to add the `standard-minifiers` package to your app when testing this. (When this is released for real as Meteor 1.2, `meteor update` will add that to your app automatically.)
* The `less` and `stylus` packages now support cross-package imports! If you specify a file like `@import "{package}/file.js"` (with the curly braces), it will load the file from that package. Use `@import "{}/file.js"` to load a file from your app.
* There's a new core `jshint` package which is a linter for js files using JSHint.
* We've made one backwards-incompatible change to packages. To include static assets in packages, you now need to explicitly specify `{isAsset: true}` in your `api.addFiles` call. This is not necessary for static assets in apps, and doesn't affect published packages, just package sources.
* Because of this change to these packages, we've bumped their major version numbers. Packages that include `less` files will need to publish a new version using this preview (or an RC) before you can use them.
* We've improved rebuild time in `meteor run` in various ways, including via new caches and new APIs that allow plugins to define their own caches.
* There's a lot of other stuff in here too! For example, there's a new core `ecmascript` package which uses Babel to let you write Meteor code in ES2015 (similar to `grigio:babel`). There's automatic compression on the wire for DDP. There's some improvements to Livequery oplog tailing performance. There are several months worth of incremental bugfixes. Pretty exciting!
We're not quite ready for the formal 1.2 release candidate process, but we'd love to hear how this release works for your apps! And plugin authors (or those interested in becoming plugin authors), we'd love to see how the new APIs work for you!
|
1.0
|
Preview of batch plugins and other 1.2 features - This issue is a good place to track concerns found with the `PLUGINS-PREVIEW@1` release.
We've been working hard over the past few months on a number of features for Meteor 1.2. @Slava and I have been focused on adding new capabilities to build plugins, and we have a preview release ready! Try it out with `meteor --release PLUGINS-PREVIEW@1`.
This release adds new plugin APIs: `Plugin.registerCompiler`, `Plugin.registerLinter`, and `Plugin.registerMinifier`.
`registerCompiler` is a replacement for the now-deprecated `registerSourceHandler` feature. Compilers are like source handlers, but they always run when you're building your app instead of when packages are published, and they have access to all of the relevant files in your app and its packages at once. This lets us implement things like CSS preprocessor `@import`s that work across package boundaries.
`registerLinter` allows you to write linters that handle different file types and show warnings when you run your app or publish a package (or run the new `meteor lint` command).
`registerMinifier` allows you to define your own minifiers to use instead of the standard minifiers, which are no longer baked into the tool but now can be replaced.
The new APIs are documented at https://github.com/meteor/meteor/wiki/Build-Plugins-API
Some notes:
* You'll need to add the `standard-minifiers` package to your app when testing this. (When this is released for real as Meteor 1.2, `meteor update` will add that to your app automatically.)
* The `less` and `stylus` packages now support cross-package imports! If you specify a file like `@import "{package}/file.js"` (with the curly braces), it will load the file from that package. Use `@import "{}/file.js"` to load a file from your app.
* There's a new core `jshint` package which is a linter for js files using JSHint.
* We've made one backwards-incompatible change to packages. To include static assets in packages, you now need to explicitly specify `{isAsset: true}` in your `api.addFiles` call. This is not necessary for static assets in apps, and doesn't affect published packages, just package sources.
* Because of this change to these packages, we've bumped their major version numbers. Packages that include `less` files will need to publish a new version using this preview (or an RC) before you can use them.
* We've improved rebuild time in `meteor run` in various ways, including via new caches and new APIs that allow plugins to define their own caches.
* There's a lot of other stuff in here too! For example, there's a new core `ecmascript` package which uses Babel to let you write Meteor code in ES2015 (similar to `grigio:babel`). There's automatic compression on the wire for DDP. There's some improvements to Livequery oplog tailing performance. There are several months worth of incremental bugfixes. Pretty exciting!
We're not quite ready for the formal 1.2 release candidate process, but we'd love to hear how this release works for your apps! And plugin authors (or those interested in becoming plugin authors), we'd love to see how the new APIs work for you!
|
process
|
preview of batch plugins and other features this issue is a good place to track concerns found with the plugins preview release we ve been working hard over the past few months on a number of features for meteor slava and i have been focused on adding new capabilities to build plugins and we have a preview release ready try it out with meteor release plugins preview this release adds new plugin apis plugin registercompiler plugin registerlinter and plugin registerminifier registercompiler is a replacement for the now deprecated registersourcehandler feature compilers are like source handlers but they always run when you re building your app instead of when packages are published and they have access to all of the relevant files in your app and its packages at once this lets us implement things like css preprocessor import s that work across package boundaries registerlinter allows you to write linters that handle different file types and show warnings when you run your app or publish a package or run the new meteor lint command registerminifier allows you to define your own minifiers to use instead of the standard minifiers which are no longer baked into the tool but now can be replaced the new apis are documented at some notes you ll need to add the standard minifiers package to your app when testing this when this is released for real as meteor meteor update will add that to your app automatically the less and stylus packages now support cross package imports if you specify a file like import package file js with the curly braces it will load the file from that package use import file js to load a file from your app there s a new core jshint package which is a linter for js files using jshint we ve made one backwards incompatible change to packages to include static assets in packages you now need to explicitly specify isasset true in your api addfiles call this is not necessary for static assets in apps and doesn t affect published packages just package sources 
because of this change to these packages we ve bumped their major version numbers packages that include less files will need to publish a new version using this preview or an rc before you can use them we ve improved rebuild time in meteor run in various ways including via new caches and new apis that allow plugins to define their own caches there s a lot of other stuff in here too for example there s a new core ecmascript package which uses babel to let you write meteor code in similar to grigio babel there s automatic compression on the wire for ddp there s some improvements to livequery oplog tailing performance there are several months worth of incremental bugfixes pretty exciting were not quite ready for the formal release candidate process but we d love to hear how this release works for your apps and plugin authors or those interested in becoming plugin authors we d love to see how the new apis work for you
| 1
|
5,677
| 8,558,097,296
|
IssuesEvent
|
2018-11-08 17:16:01
|
googleapis/google-cloud-python
|
https://api.github.com/repos/googleapis/google-cloud-python
|
closed
|
Suppress 'PendingDeprecationWarnings' in GAPIC-based clients
|
api: automl api: bigquerydatatransfer api: bigtable api: cloudasset api: cloudkms api: cloudtasks api: dataproc api: dlp api: monitoring api: pubsub api: speech api: texttospeech api: videointelligence api: vision type: process
|
Clients which pass through `channel` rather than `transport` to their low-level `api` constructors trigger `PendingDeprecationWarnings`:
```python
<filename>:<lineno>: PendingDeprecationWarning: The `channel` argument is deprecated; use `transport` instead.
```
Tests for those packages routinely trigger those warnings (repeatedly).
|
1.0
|
Suppress 'PendingDeprecationWarnings' in GAPIC-based clients - Clients which pass through `channel` rather than `transport` to their low-level `api` constructors trigger `PendingDeprecationWarnings`:
```python
<filename>:<lineno>: PendingDeprecationWarning: The `channel` argument is deprecated; use `transport` instead.
```
Tests for those packages routinely trigger those warnings (repeatedly).
|
process
|
suppress pendingdeprecationwarnings in gapic based clients clients which pass through channel rather than transport to their low level api constructors trigger pendingdeprecationwarnings python pendingdeprecationwarning the channel argument is deprecated use transport instead tests for those packages routinely trigger those warnings repeatedly
| 1
|
43,897
| 5,575,987,295
|
IssuesEvent
|
2017-03-28 04:39:48
|
MyersResearchGroup/iBioSim
|
https://api.github.com/repos/MyersResearchGroup/iBioSim
|
closed
|
FEATURE: add ability to put SBO term on species for its type
|
FEATURE Needs Testing
|
Use SBO term to indicate if species is protein, small molecule, RNA, complex, etc.
|
1.0
|
FEATURE: add ability to put SBO term on species for its type - Use SBO term to indicate if species is protein, small molecule, RNA, complex, etc.
|
non_process
|
feature add ability to put sbo term on species for its type use sbo term to indicate if species is protein small molecule rna complex etc
| 0
|
5,796
| 8,640,295,816
|
IssuesEvent
|
2018-11-24 03:41:30
|
teracyhq-incubator/teracy-dev-core
|
https://api.github.com/repos/teracyhq-incubator/teracy-dev-core
|
closed
|
should display friendly log messages for the not configured variable key(s)
|
affected:develop affected:v0.4.0 comp:processors prio:major reso:fixed stag:under-review type:improvement
|
Expect: Friendly message to guide users
Actual: scary log
```
/Users/hoatle/teracy-dev/workspace/teracy-dev-core/lib/teracy-dev-core/processors/variables.rb:38:in `%'
/Users/hoatle/teracy-dev/workspace/teracy-dev-core/lib/teracy-dev-core/processors/variables.rb:38:in `process'
/Users/hoatle/teracy-dev/lib/teracy-dev/processors/manager.rb:40:in `block in process'
/Users/hoatle/teracy-dev/lib/teracy-dev/processors/manager.rb:37:in `each'
/Users/hoatle/teracy-dev/lib/teracy-dev/processors/manager.rb:37:in `process'
/Users/hoatle/teracy-dev/lib/teracy-dev/loader.rb:173:in `process'
/Users/hoatle/teracy-dev/lib/teracy-dev/loader.rb:138:in `build_settings'
/Users/hoatle/teracy-dev/lib/teracy-dev/loader.rb:33:in `start'
/Users/hoatle/teracy-dev/lib/teracy-dev.rb:31:in `<module:TeracyDev>'
/Users/hoatle/teracy-dev/lib/teracy-dev.rb:10:in `<top (required)>'
/opt/vagrant/embedded/lib/ruby/2.4.0/rubygems/core_ext/kernel_require.rb:55:in `require'
/opt/vagrant/embedded/lib/ruby/2.4.0/rubygems/core_ext/kernel_require.rb:55:in `require'
/Users/hoatle/teracy-dev/Vagrantfile:21:in `<top (required)>'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:239:in `load'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:239:in `block in procs_for_path'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config.rb:53:in `block in capture_configures'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config.rb:48:in `synchronize'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config.rb:48:in `capture_configures'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:237:in `procs_for_path'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:223:in `procs_for_source'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:67:in `block in set'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:61:in `each'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:61:in `set'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/environment.rb:488:in `config_loader'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/environment.rb:793:in `vagrantfile'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/environment.rb:178:in `initialize'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/bin/vagrant:144:in `new'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/bin/vagrant:144:in `<main>'
Vagrant failed to initialize at a very early stage:
There was an error loading a Vagrantfile. The file being loaded
and the error message are shown below. This is usually caused by
a syntax error.
Path: /Users/hoatle/teracy-dev/Vagrantfile
Line number: 38
Message: KeyError: key{node_name_prefix} not found
```
|
1.0
|
should display friendly log messages for the not configured variable key(s) - Expect: Friendly message to guide users
Actual: scary log
```
/Users/hoatle/teracy-dev/workspace/teracy-dev-core/lib/teracy-dev-core/processors/variables.rb:38:in `%'
/Users/hoatle/teracy-dev/workspace/teracy-dev-core/lib/teracy-dev-core/processors/variables.rb:38:in `process'
/Users/hoatle/teracy-dev/lib/teracy-dev/processors/manager.rb:40:in `block in process'
/Users/hoatle/teracy-dev/lib/teracy-dev/processors/manager.rb:37:in `each'
/Users/hoatle/teracy-dev/lib/teracy-dev/processors/manager.rb:37:in `process'
/Users/hoatle/teracy-dev/lib/teracy-dev/loader.rb:173:in `process'
/Users/hoatle/teracy-dev/lib/teracy-dev/loader.rb:138:in `build_settings'
/Users/hoatle/teracy-dev/lib/teracy-dev/loader.rb:33:in `start'
/Users/hoatle/teracy-dev/lib/teracy-dev.rb:31:in `<module:TeracyDev>'
/Users/hoatle/teracy-dev/lib/teracy-dev.rb:10:in `<top (required)>'
/opt/vagrant/embedded/lib/ruby/2.4.0/rubygems/core_ext/kernel_require.rb:55:in `require'
/opt/vagrant/embedded/lib/ruby/2.4.0/rubygems/core_ext/kernel_require.rb:55:in `require'
/Users/hoatle/teracy-dev/Vagrantfile:21:in `<top (required)>'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:239:in `load'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:239:in `block in procs_for_path'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config.rb:53:in `block in capture_configures'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config.rb:48:in `synchronize'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config.rb:48:in `capture_configures'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:237:in `procs_for_path'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:223:in `procs_for_source'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:67:in `block in set'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:61:in `each'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/config/loader.rb:61:in `set'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/environment.rb:488:in `config_loader'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/environment.rb:793:in `vagrantfile'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/lib/vagrant/environment.rb:178:in `initialize'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/bin/vagrant:144:in `new'
/opt/vagrant/embedded/gems/2.2.0/gems/vagrant-2.2.0/bin/vagrant:144:in `<main>'
Vagrant failed to initialize at a very early stage:
There was an error loading a Vagrantfile. The file being loaded
and the error message are shown below. This is usually caused by
a syntax error.
Path: /Users/hoatle/teracy-dev/Vagrantfile
Line number: 38
Message: KeyError: key{node_name_prefix} not found
```
|
process
|
should display friendly log messages for the not configured variable key s expect friendly message to guide users actual scary log users hoatle teracy dev workspace teracy dev core lib teracy dev core processors variables rb in users hoatle teracy dev workspace teracy dev core lib teracy dev core processors variables rb in process users hoatle teracy dev lib teracy dev processors manager rb in block in process users hoatle teracy dev lib teracy dev processors manager rb in each users hoatle teracy dev lib teracy dev processors manager rb in process users hoatle teracy dev lib teracy dev loader rb in process users hoatle teracy dev lib teracy dev loader rb in build settings users hoatle teracy dev lib teracy dev loader rb in start users hoatle teracy dev lib teracy dev rb in users hoatle teracy dev lib teracy dev rb in opt vagrant embedded lib ruby rubygems core ext kernel require rb in require opt vagrant embedded lib ruby rubygems core ext kernel require rb in require users hoatle teracy dev vagrantfile in opt vagrant embedded gems gems vagrant lib vagrant config loader rb in load opt vagrant embedded gems gems vagrant lib vagrant config loader rb in block in procs for path opt vagrant embedded gems gems vagrant lib vagrant config rb in block in capture configures opt vagrant embedded gems gems vagrant lib vagrant config rb in synchronize opt vagrant embedded gems gems vagrant lib vagrant config rb in capture configures opt vagrant embedded gems gems vagrant lib vagrant config loader rb in procs for path opt vagrant embedded gems gems vagrant lib vagrant config loader rb in procs for source opt vagrant embedded gems gems vagrant lib vagrant config loader rb in block in set opt vagrant embedded gems gems vagrant lib vagrant config loader rb in each opt vagrant embedded gems gems vagrant lib vagrant config loader rb in set opt vagrant embedded gems gems vagrant lib vagrant environment rb in config loader opt vagrant embedded gems gems vagrant lib vagrant environment 
rb in vagrantfile opt vagrant embedded gems gems vagrant lib vagrant environment rb in initialize opt vagrant embedded gems gems vagrant bin vagrant in new opt vagrant embedded gems gems vagrant bin vagrant in vagrant failed to initialize at a very early stage there was an error loading a vagrantfile the file being loaded and the error message are shown below this is usually caused by a syntax error path users hoatle teracy dev vagrantfile line number message keyerror key node name prefix not found
| 1
|
14,608
| 17,703,702,394
|
IssuesEvent
|
2021-08-25 03:36:27
|
2i2c-org/team-compass
|
https://api.github.com/repos/2i2c-org/team-compass
|
opened
|
Re-purpose our `prio` labels to be `impact` instead
|
:label: team-process type: task
|
### Description
I propose that we remove the `prio:` GitHub labels and instead replace them with labels based on **impact** (so, `impact: low / med / high`).
We currently use a few GitHub labels to encode "priority": `prio:low / med / high`. I have found that these labels are under-specified. Priority is very context- and person-dependent, and also changes fairly frequently (e.g. what is low-priority yesterday may be high-priority tomorrow). Moreover, we currently have two different ways to encode priority: one is with the label, and the other is via an issue's placement in our development backlog / ordering on a column.
Instead I think we should use "impact" and define the meaning of this label specific to an issue's functional area. For example:
- the impact of an infrastructure feature might be defined by the number of users that would benefit from it (all users == high, ~half users == med, ~quarter users == low) OR particularly important users could trigger "high" as well.
- the impact of a bug might be defined by whether functionality is critically impaired or just cosmetic, or whether it is complex to reproduce or is reproduced by everyone
- the impact of an administrative task might be defined by whether it is required to get some essential work done, or just general housekeeping
There would still be a degree of subjectivity and judgment there, but I think this is still more concrete than "priority". In addition, the person that applies an "impact" label should ensure that the value/benefit is described in the issue well enough to justify the label.
### Value / benefit
If we instead used "impact" labels, I believe that it would be easier to have a concrete definition for each issue that would be more stable over time. Moreover, impact maps relatively nicely onto our "value/benefit/user story" practice in describing our team issues. Impact would be an important deciding factor in prioritization (in general, we want to work on things that are high-impact!) but it wouldn't **define** the priority, it would just be an important factor for it.
### Implementation details
I propose that we try this out on the `pilot-hubs/` repository for a month and see if it improves the signal-to-noise of the issues, or helps us triage and prioritize more easily.
If nobody objects, I'll plan to remove all `prio:` labels from pilot-hubs, and replace them with "impact" labels (and potentially re-assign labels to issues as needed).
### Tasks to complete
- [ ] Nobody objects to trying this workflow in pilot-hubs/
- [ ] Swap out `prio:` labels for `impact:` labels in `pilot-hubs/`
- [ ] Revisit this practice in October and decide if we should continue it
- [ ] Decide whether to apply it to other repositories
- [ ] Write up in the team compass
### Updates
_No response_
|
1.0
|
Re-purpose our `prio` labels to be `impact` instead - ### Description
I propose that we remove the `prio:` GitHub labels and instead replace them with labels based on **impact** (so, `impact: low / med / high`).
We currently use a few GitHub labels to encode "priority": `prio:low / med / high`. I have found that these labels are under-specified. Priority is very context- and person-dependent, and also changes fairly frequently (e.g. what is low-priority yesterday may be high-priority tomorrow). Moreover, we currently have two different ways to encode priority: one is with the label, and the other is via an issue's placement in our development backlog / ordering on a column.
Instead I think we should use "impact" and define the meaning of this label specific to an issue's functional area. For example:
- the impact of an infrastructure feature might be defined by the number of users that would benefit from it (all users == high, ~half users == med, ~quarter users == low) OR particularly important users could trigger "high" as well.
- the impact of a bug might be defined by whether functionality is critically impaired or just cosmetic, or whether it is complex to reproduce or is reproduced by everyone
- the impact of an administrative task might be defined by whether it is required to get some essential work done, or just general housekeeping
There would still be a degree of subjectivity and judgment there, but I think this is still more concrete than "priority". In addition, the person that applies an "impact" label should ensure that the value/benefit is described in the issue well enough to justify the label.
### Value / benefit
If we instead used "impact" labels, I believe that it would be easier to have a concrete definition for each issue that would be more stable over time. Moreover, impact maps relatively nicely onto our "value/benefit/user story" practice in describing our team issues. Impact would be an important deciding factor in prioritization (in general, we want to work on things that are high-impact!) but it wouldn't **define** the priority, it would just be an important factor for it.
### Implementation details
I propose that we try this out on the `pilot-hubs/` repository for a month and see if it improves the signal-to-noise of the issues, or helps us triage and prioritize more easily.
If nobody objects, I'll plan to remove all `prio:` labels from pilot-hubs, and replace them with "impact" labels (and potentially re-assign labels to issues as needed).
### Tasks to complete
- [ ] Nobody objects to trying this workflow in pilot-hubs/
- [ ] Swap out `prio:` labels for `impact:` labels in `pilot-hubs/`
- [ ] Revisit this practice in October and decide if we should continue it
- [ ] Decide whether to apply it to other repositories
- [ ] Write up in the team compass
### Updates
_No response_
|
process
|
re purpose our prio labels to be impact instead description i propose that we remove the prio github labels and instead replace them with labels based on impact so impact low med high we currently use a few github labels to encode priority prio low med high i have found that these labels are under specified priority is very context and person dependent and also changes fairly frequently e g what is low priority yesterday may be high priority tomorrow moreover we currently have two different ways to encode priority one is with the label and the other is via an issue s placement in our development backlog ordering on a column instead i think we should use impact and define the meaning of this label specific to an issue s functional area for example the impact of an infrastructure feature might be defined by the number of users that would benefit from it all users high half users med quarter users low or particularly important users could trigger high as well the impact of a bug might be defined by whether functionality is critically impaired or just cosmetic or whether it is complex to reproduce or is reproduced by everyone the impact of an administrative task might be defined by whether it is required to get some essential work done or just general housekeeping there would still be a degree of subjectivity and judgment there but i think this is still more concrete than priority in addition the person that applies an impact label should ensure that the value benefit is described in the issue well enough to justify the label value benefit if we instead used impact labels i believe that it would be easier to have a concrete definition for each issue that would be more stable over time moreover impact maps relatively nicely onto our value benefit user story practice in describing our team issues impact would be an important deciding factor in prioritization in general we want to work on things that are high impact but it wouldn t define the priority it would just be an 
important factor for it implementation details i propose that we try this out on the pilot hubs repository for a month and see if it improves the signal to noise of the issues or helps us triage and prioritize more easily if nobody objects i ll plan to remove all prio labels from pilot hubs and replace them with impact labels and potentially re assign labels to issues as needed tasks to complete nobody objects to trying this workflow in pilot hubs swap out prio labels for impact labels in pilot hubs revisit this practice in october and decide if we should continue it decide whether to apply it to other repositories write up in the team compass updates no response
| 1
|
240,021
| 18,290,662,510
|
IssuesEvent
|
2021-10-05 14:56:06
|
pitt-crc/bank
|
https://api.github.com/repos/pitt-crc/bank
|
closed
|
Add documentation for all functions, classes, and modules.
|
documentation
|
Documentation has been added gradually over the past several pull requests. Make a dedicated push to finish adding any missing docstrings.
|
1.0
|
Add documentation for all functions, classes, and modules. - Documentation has been added gradually over the past several pull requests. Make a dedicated push to finish adding any missing docstrings.
|
non_process
|
add documentation for all functions classes and modules documentation has been add gradually over the past several pull requests make a dedicated push to finish adding any missing docstring
| 0
|
437
| 2,716,595,325
|
IssuesEvent
|
2015-04-10 20:06:56
|
globaleaks/GlobaLeaks
|
https://api.github.com/repos/globaleaks/GlobaLeaks
|
opened
|
The PGP key for E2E could be generated in the background while the user chooses their password
|
F: Security OTF-D2.1: OpenPGP.js encrypted files for Whistleblower OTF-D2.2: Full (files+submissions) OpenPGP.js encryption for whistleblower/receiver (Hushmail-like) U: Receiver U: Whistleblower
|
Currently the PGP key used for E2E is generated when the user clicks the password save button.
GlobaLeaks could optimize the process by starting key generation immediately when the user logs in, so that when the user clicks the password save button the application can simply encrypt the password with the scrypt secret derived from it.
The same can be done for the whistleblower: when the WB initializes the submission, the system can start generating the PGP key, concurrently generate the receipt and apply scrypt to the receipt, so that at submission time the only operation still to be performed will probably be the encryption of the wb_fields.
\cc @origliante
|
True
|
The PGP key for E2E could be generated in the background while the user chooses their password - Currently the PGP key used for E2E is generated when the user clicks the password save button.
GlobaLeaks could optimize the process by starting key generation immediately when the user logs in, so that when the user clicks the password save button the application can simply encrypt the password with the scrypt secret derived from it.
The same can be done for the whistleblower: when the WB initializes the submission, the system can start generating the PGP key, concurrently generate the receipt and apply scrypt to the receipt, so that at submission time the only operation still to be performed will probably be the encryption of the wb_fields.
\cc @origliante
|
non_process
|
the pgp key for could be generated in background while the user chose it s password currently the pgp key used for is generated when the user click on the password save button globaleaks could optimize the process generating the key starting immediately from the moment the user logs in so that at that when the user click the password save button the application can simply encrypt the password with the scrypt secret generated starting from the password the same can be done for the whistleblower when the wb initialize the submission the system can start generating the pgp concurrently generate the the receipt apply the scrypt to the receipt so that at the time of the submission the only operation still to be performed will probably be the encryption of the wb fields cc origliante
| 0
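The overlap the record above proposes (start the expensive key generation as soon as the user begins entering a password, then reuse the result at save time) can be sketched as follows. All names here are hypothetical; this is not GlobaLeaks' actual API, and the generator and scrypt step are stand-ins passed in as functions:

```javascript
// Idempotent background start: the first call kicks off key generation,
// later calls (e.g. the password-save handler) reuse the same result.
let pendingKey = null;

function startKeygen(generate) {
  if (!pendingKey) pendingKey = generate();
  return pendingKey;
}

// Hypothetical save handler: by the time this runs, generation has
// usually already happened, so only the scrypt step remains.
function onPasswordSave(generate, scrypt, password) {
  const key = startKeygen(generate);
  return { key, secret: scrypt(password) };
}
```

The same idempotent-start pattern covers the whistleblower path: submission initialization calls `startKeygen` early, and the final submit only awaits whatever is still pending.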
|
9,142
| 12,203,188,957
|
IssuesEvent
|
2020-04-30 10:10:29
|
MHRA/products
|
https://api.github.com/repos/MHRA/products
|
closed
|
AUTO BATCH PROCESS - documentation required for network access
|
EPIC - Auto Batch Process :oncoming_automobile: HIGH PRIORITY :arrow_double_up: TASK :rescue_worker_helmet:
|
Requires some POC documentation with networking diagrams.
|
1.0
|
AUTO BATCH PROCESS - documentation required for network access - Requires some POC documentation with networking diagrams.
|
process
|
auto batch process documentation required for network access requires some poc documentation with networking diagrams
| 1
|
14,826
| 18,167,436,868
|
IssuesEvent
|
2021-09-27 15:58:33
|
opensearch-project/data-prepper
|
https://api.github.com/repos/opensearch-project/data-prepper
|
closed
|
Grok Prepper Configuration and Boilerplate
|
plugin - processor
|
This is a subtask of the issue for a grok processor: [https://github.com/opensearch-project/data-prepper/issues/256](url).
Grok prepper field names and default values need to be set up, as well as the reading of the configuration and boilerplate for implementing AbstractPrepper.
|
1.0
|
Grok Prepper Configuration and Boilerplate - This is a subtask of the issue for a grok processor: [https://github.com/opensearch-project/data-prepper/issues/256](url).
Grok prepper field names and default values need to be set up, as well as the reading of the configuration and boilerplate for implementing AbstractPrepper.
|
process
|
grok prepper configuration and boilerplate this is a subtask of the issue for a grok processor url grok prepper field names and default values need to be set up as well as the reading of the configuration and boilerplate for implementing abstractprepper
| 1
|
1,177
| 3,671,610,530
|
IssuesEvent
|
2016-02-22 08:35:58
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
uncaughtException only catches the first uncaughtException.
|
process
|
Example:
```node
process.on('uncaughtException', (err) => {
console.log(`Caught exception: ${err}`);
causeError();
});
function causeError(){
console.log(notDefined);
}
causeError();
```
Output:
```
Caught exception: ReferenceError: notDefined is not defined
C:\fail.js:7
console.log(notDefined);
^
ReferenceError: notDefined is not defined
at causeError (C:\fail.js:7:13)
at process.<anonymous> (C:\fail.js:3:3)
at emitOne (events.js:77:13)
at process.emit (events.js:169:7)
at process._fatalException (node.js:223:26)
[Finished in 0.256s]
```
Tested on 64-bit Windows and Ubuntu. Versions 4.2.3 and 0.10.41
|
1.0
|
uncaughtException only catches the first uncaughtException. - Example:
```node
process.on('uncaughtException', (err) => {
console.log(`Caught exception: ${err}`);
causeError();
});
function causeError(){
console.log(notDefined);
}
causeError();
```
Output:
```
Caught exception: ReferenceError: notDefined is not defined
C:\fail.js:7
console.log(notDefined);
^
ReferenceError: notDefined is not defined
at causeError (C:\fail.js:7:13)
at process.<anonymous> (C:\fail.js:3:3)
at emitOne (events.js:77:13)
at process.emit (events.js:169:7)
at process._fatalException (node.js:223:26)
[Finished in 0.256s]
```
Tested on 64-bit Windows and Ubuntu. Versions 4.2.3 and 0.10.41
|
process
|
uncaughtexception only catches the first uncaughtexception example node process on uncaughtexception err console log caught exception err causeerror function causeerror console log notdefined causeerror output caught exception referenceerror notdefined is not defined c fail js console log notdefined referenceerror notdefined is not defined at causeerror c fail js at process c fail js at emitone events js at process emit events js at process fatalexception node js tested on bit windows and ubuntu versions and
| 1
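The behavior in the record above is intentional: Node routes only the first uncaught exception through the handler, and an error thrown inside the handler itself is fatal. A common mitigation is to make the handler re-entrancy-safe; the guard below is an illustrative sketch, not Node's own mechanism:

```javascript
// Re-entrancy-safe uncaughtException handler: if the handler itself
// fails while a previous exception is being processed, bail out
// instead of raising a second uncaught error.
let handling = false;

function onUncaught(err) {
  if (handling) return false;   // already inside the handler: give up
  handling = true;
  try {
    console.error(`Caught exception: ${err.message}`);
    return true;
  } finally {
    handling = false;           // reset for the next exception
  }
}

process.on('uncaughtException', onUncaught);
```

Returning a status from the handler is only for illustration; in practice the `handling` branch would log minimally and call `process.exit(1)`.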
|
10,434
| 13,220,064,940
|
IssuesEvent
|
2020-08-17 11:42:30
|
km4ack/pi-build
|
https://api.github.com/repos/km4ack/pi-build
|
closed
|
Add sourceforge backup to wsjtx
|
enhancement in process
|
Add sourceforge as backup site in case princeton.edu fails. See this post https://groups.io/g/KM4ACK-Pi/topic/wsjtx_and_fldigi_not/75361351?p=,,,20,0,0,0::recentpostdate%2Fsticky,,,20,2,0,75361351
|
1.0
|
Add sourceforge backup to wsjtx - Add sourceforge as backup site in case princeton.edu fails. See this post https://groups.io/g/KM4ACK-Pi/topic/wsjtx_and_fldigi_not/75361351?p=,,,20,0,0,0::recentpostdate%2Fsticky,,,20,2,0,75361351
|
process
|
add sourceforge backup to wsjtx add sourceforge as backup site in case princeton edu fails see this post
| 1
|
15,503
| 19,703,264,168
|
IssuesEvent
|
2022-01-12 18:52:08
|
googleapis/java-area120-tables
|
https://api.github.com/repos/googleapis/java-area120-tables
|
opened
|
Your .repo-metadata.json file has a problem 🤒
|
type: process repo-metadata: lint
|
You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'area120-tables' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
1.0
|
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'area120-tables' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
process
|
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 release level must be equal to one of the allowed values in repo metadata json api shortname tables invalid in repo metadata json ☝️ once you correct these problems you can close this issue reach out to go github automation if you have any questions
| 1
|
751,492
| 26,247,876,671
|
IssuesEvent
|
2023-01-05 16:38:51
|
mit-cml/appinventor-sources
|
https://api.github.com/repos/mit-cml/appinventor-sources
|
closed
|
The blocks area did not load properly / Empty block area
|
bug issue: noted for future Work status: to confirm affects: master priority: medium
|
**Describe the bug**
When I load my project, an error message appears: "The block area did not load properly. Changes to the blocks for screen ... will not be saved." (refer to the picture at the bottom; I've censored the code, as I don't know if it contains sensitive information). I don't know what I did wrong. Yesterday I saved my project, and today the error message appears and the block view is empty. The backup from yesterday (.aia export to computer) leads to the same issue. How can I get back my code?
I've remembered: I added a string block with some unprintable ASCII characters; maybe this led to the issue?
**Affects**
- [ ] Designer
- [X] Blocks editor
- [ ] Companion
- [ ] Compiled apps
- [ ] Buildserver
- [ ] Debugging
- [ ] Other... (please describe)
> 
--
edit: This issue occurs, among others, when a backspace appears in a string block. But the question remains:
- How can the code be recovered after use of an illegal character?
|
1.0
|
The blocks area did not load properly / Empty block area - **Describe the bug**
When I load my project, an error message appears: "The block area did not load properly. Changes to the blocks for screen ... will not be saved." (refer to the picture at the bottom; I've censored the code, as I don't know if it contains sensitive information). I don't know what I did wrong. Yesterday I saved my project, and today the error message appears and the block view is empty. The backup from yesterday (.aia export to computer) leads to the same issue. How can I get back my code?
I've remembered: I added a string block with some unprintable ASCII characters; maybe this led to the issue?
**Affects**
- [ ] Designer
- [X] Blocks editor
- [ ] Companion
- [ ] Compiled apps
- [ ] Buildserver
- [ ] Debugging
- [ ] Other... (please describe)
> 
--
edit: This issue occurs, among others, when a backspace appears in a string block. But the question remains:
- How can the code be recovered after use of an illegal character?
|
non_process
|
the blocks area did not load properly empty block area describe the bug when i load my project there appears an error message the block area did not load properly changes to the blocks for screen will not be saved refer to the picture at the bottom i ve censored the code i don t know if it contains sensitive information i don t know what i ve did wrong yesterday i saved my project and today the errormessage appears and the block view is empty the backup from yesterday aia export to computer leads to the same issue how i can get back my code ive remembered i ve added a string block with some unprintable characters from ascii maybe this led to this issue affects designer blocks editor companion compiled apps buildserver debugging other please describe edit this issue occurs among other when a backspace appears in a string block but the question remain how to get back the code in case of use of an illegal character
| 0
|
17,085
| 22,588,588,174
|
IssuesEvent
|
2022-06-28 17:28:22
|
microsoft/vscode
|
https://api.github.com/repos/microsoft/vscode
|
closed
|
"Kill" button does nothing on one task terminal
|
bug terminal-process
|
I have been testing the task shell integration (running some tasks, resizing the terminal to see if the decoration stayed in the right place) and now one of my task terminals is unkillable:

I had just killed 2 other task terminals without issue.
|
1.0
|
"Kill" button does nothing on one task terminal - I have been testing the task shell integration (running some tasks, resizing the terminal to see if the decoration stayed in the right place) and now one of my task terminals is unkillable:

I had just killed 2 other task terminals without issue.
|
process
|
kill button does nothing on one task terminal i have been testing the task shell integration running some tasks resizing the terminal to see if the decoration stayed in the right place and now one of my task terminals is unkillable i had just killed other task terminals without issue
| 1
|
12,785
| 15,166,931,809
|
IssuesEvent
|
2021-02-12 17:02:57
|
nanoframework/Home
|
https://api.github.com/repos/nanoframework/Home
|
closed
|
Add check for multidimensional arrays in MDP
|
Area: Metadata Processor Type: Bug
|
### Details about Problem
**nanoFramework area:** MDP
### Description
The new MDP is missing a check for multidimensional arrays. It should throw an exception when parsing an assembly that contains those.
|
1.0
|
Add check for multidimensional arrays in MDP - ### Details about Problem
**nanoFramework area:** MDP
### Description
The new MDP is missing a check for multidimensional arrays. It should throw an exception when parsing an assembly that contains those.
|
process
|
add check for multidimensional arrays in mdp details about problem nanoframework area mdp description the new mdp is missing a check for multidimensional arrays it should throw an exception when parsing an assembly that contains those
| 1
|
213,255
| 23,972,156,728
|
IssuesEvent
|
2022-09-13 08:39:22
|
snowdensb/dependabot-core
|
https://api.github.com/repos/snowdensb/dependabot-core
|
reopened
|
CVE-2016-10735 (Medium) detected in bootstrap-3.3.4.min.js
|
security vulnerability
|
## CVE-2016-10735 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.4.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.4/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.4/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.4.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/snowdensb/dependabot-core/commit/ba8cd9078c8ce0cb202767d627706711237abf71">ba8cd9078c8ce0cb202767d627706711237abf71</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap 3.x before 3.4.0 and 4.x-beta before 4.0.0-beta.2, XSS is possible in the data-target attribute, a different vulnerability than CVE-2018-14041.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10735>CVE-2016-10735</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: bootstrap - 3.4.0, 4.0.0-beta.2</p>
</p>
</details>
<p></p>
|
True
|
CVE-2016-10735 (Medium) detected in bootstrap-3.3.4.min.js - ## CVE-2016-10735 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.4.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.4/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.4/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.4.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/snowdensb/dependabot-core/commit/ba8cd9078c8ce0cb202767d627706711237abf71">ba8cd9078c8ce0cb202767d627706711237abf71</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap 3.x before 3.4.0 and 4.x-beta before 4.0.0-beta.2, XSS is possible in the data-target attribute, a different vulnerability than CVE-2018-14041.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10735>CVE-2016-10735</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: bootstrap - 3.4.0, 4.0.0-beta.2</p>
</p>
</details>
<p></p>
|
non_process
|
cve medium detected in bootstrap min js cve medium severity vulnerability vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library js dependency hierarchy x bootstrap min js vulnerable library found in head commit a href found in base branch main vulnerability details in bootstrap x before and x beta before beta xss is possible in the data target attribute a different vulnerability than cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution bootstrap beta
| 0
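The class of fix for this CVE can be illustrated with a small validator (illustrative only, not Bootstrap's actual patch): before a `data-target` value is handed to a DOM query, reject anything that is not a plain selector.

```javascript
// Accept only simple #id or .class style selectors in data-target;
// anything containing markup (the CVE-2016-10735 XSS vector) is rejected.
function safeTarget(dataTarget) {
  return /^[#.][\w-]+$/.test(dataTarget) ? dataTarget : null;
}
```

The allowed grammar here is deliberately narrow; a real implementation would match whatever selector forms the component legitimately needs, but still fail closed.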
|
147,171
| 19,500,442,861
|
IssuesEvent
|
2021-12-28 01:33:13
|
talevy013/TestTal
|
https://api.github.com/repos/talevy013/TestTal
|
opened
|
CVE-2018-7489 (High) detected in jackson-databind-2.6.7.1.jar
|
security vulnerability
|
## CVE-2018-7489 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.6.7.1.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /m2/repository/com/fasterxml/jackson/core/jackson-databind/2.6.7.1/jackson-databind-2.6.7.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.6.7.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/talevy013/TestTal/commit/f001f7f069d9289dded859e738eb111a8fd2e984">f001f7f069d9289dded859e738eb111a8fd2e984</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind before 2.7.9.3, 2.8.x before 2.8.11.1 and 2.9.x before 2.9.5 allows unauthenticated remote code execution because of an incomplete fix for the CVE-2017-7525 deserialization flaw. This is exploitable by sending maliciously crafted JSON input to the readValue method of the ObjectMapper, bypassing a blacklist that is ineffective if the c3p0 libraries are available in the classpath.
<p>Publish Date: 2018-02-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-7489>CVE-2018-7489</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-7489">https://nvd.nist.gov/vuln/detail/CVE-2018-7489</a></p>
<p>Release Date: 2018-02-26</p>
<p>Fix Resolution: 2.8.11.1,2.9.5</p>
</p>
</details>
<p></p>
|
True
|
CVE-2018-7489 (High) detected in jackson-databind-2.6.7.1.jar - ## CVE-2018-7489 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.6.7.1.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /m2/repository/com/fasterxml/jackson/core/jackson-databind/2.6.7.1/jackson-databind-2.6.7.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.6.7.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/talevy013/TestTal/commit/f001f7f069d9289dded859e738eb111a8fd2e984">f001f7f069d9289dded859e738eb111a8fd2e984</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind before 2.7.9.3, 2.8.x before 2.8.11.1 and 2.9.x before 2.9.5 allows unauthenticated remote code execution because of an incomplete fix for the CVE-2017-7525 deserialization flaw. This is exploitable by sending maliciously crafted JSON input to the readValue method of the ObjectMapper, bypassing a blacklist that is ineffective if the c3p0 libraries are available in the classpath.
<p>Publish Date: 2018-02-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-7489>CVE-2018-7489</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-7489">https://nvd.nist.gov/vuln/detail/CVE-2018-7489</a></p>
<p>Release Date: 2018-02-26</p>
<p>Fix Resolution: 2.8.11.1,2.9.5</p>
</p>
</details>
<p></p>
|
non_process
|
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file pom xml path to vulnerable library repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in head commit a href vulnerability details fasterxml jackson databind before x before and x before allows unauthenticated remote code execution because of an incomplete fix for the cve deserialization flaw this is exploitable by sending maliciously crafted json input to the readvalue method of the objectmapper bypassing a blacklist that is ineffective if the libraries are available in the classpath publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
| 0
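The root cause named in this record, a blocklist that is "ineffective if the c3p0 libraries are available in the classpath", generalizes beyond Jackson: a blocklist of dangerous types fails open the moment one gadget type is missing from it, while an allowlist fails closed. A language-agnostic sketch (type names are illustrative):

```javascript
// Blocklist vs allowlist type gating for polymorphic deserialization.
// The blocklist below is incomplete on purpose: it does not know about
// the c3p0 gadget, so the bypass sails through.
const BLOCKLIST = new Set(['org.springframework.beans.factory.config.PropertyPathFactoryBean']);
const ALLOWLIST = new Set(['com.example.Order', 'com.example.Customer']);

function blocklistAllows(typeName) { return !BLOCKLIST.has(typeName); }
function allowlistAllows(typeName) { return ALLOWLIST.has(typeName); }
```

This is why later jackson-databind releases moved toward allowlist-style `PolymorphicTypeValidator` configuration rather than ever-growing blacklists.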
|
79,176
| 15,161,359,873
|
IssuesEvent
|
2021-02-12 08:54:56
|
OSIPI/DCE-DSC-MRI_CodeCollection
|
https://api.github.com/repos/OSIPI/DCE-DSC-MRI_CodeCollection
|
opened
|
Call open for contributions regarding 'signal extraction (ROI/masking)'
|
Code Contributions
|
We are looking for code contributions to select the time series data from the 4-D S(t) data. Here a choice will be made whether the data will be analysed as an image (2-3 spatial dimensions) or as multiple ROI time series (2D, i.e. ROI index + time). In case of ROI analysis, average time curves will be determined from all of the voxels in each ROI. As a result, other outputs in the pipeline, like T1 and BAT, should also be vectors of length NROI.

Please follow the documentation on new code contributions.
|
1.0
|
Call open for contributions regarding 'signal extraction (ROI/masking)' - We are looking for code contributions to select the time series data from the 4-D S(t) data. Here a choice will be made whether the data will be analysed as an image (2-3 spatial dimensions) or as multiple ROI time series (2D, i.e. ROI index + time). In case of ROI analysis, average time curves will be determined from all of the voxels in each ROI. As a result, other outputs in the pipeline, like T1 and BAT, should also be vectors of length NROI.

Please follow the documentation on new code contributions.
|
non_process
|
call open for contributions regarding signal extraction roi masking we are looking for code contributions to select the time series data from the d s t data here a choice will be made whether the data will be analysed as an image spatial dimensions or as multiple roi time series i e roi index time in case of roi analysis average time curves will be determined from all of the voxels in each roi as a result other outputs in the pipeline like and bat should also be vectors of length nroi please follow the documentation on new code contributions
| 0
|
366,406
| 10,820,496,033
|
IssuesEvent
|
2019-11-08 16:30:29
|
RCSideProjects/RAMP-Vue-Testing
|
https://api.github.com/repos/RCSideProjects/RAMP-Vue-Testing
|
opened
|
Add Group Entry as a separate class from LayerState
|
help wanted priority: high
|
Currently, there are a lot of edge cases that need to be handled when toggling visibility, and this becomes excessive (especially in the case of nested group entries as elements of a visibility set). It is probably a good idea to make Group Entry (similar to Visibility Sets) its own class to better handle these cases.
|
1.0
|
Add Group Entry as a separate class from LayerState - Currently, there are a lot of edge cases that need to be handled when toggling visibility, and this becomes excessive (especially in the case of nested group entries as elements of a visibility set). It is probably a good idea to make Group Entry (similar to Visibility Sets) its own class to better handle these cases.
|
non_process
|
add group entry as a separate class from layerstate currently there are a lot of edge cases that needs to be taken care of when toggling visibility and becomes excessive especially in the case of nested group entries as elements of a visibility set it is probably a good idea to make group entry similar to visibility sets as its own class to more take care of these cases
| 0
|
216,258
| 16,654,277,170
|
IssuesEvent
|
2021-06-05 08:19:58
|
MarlinFirmware/Marlin
|
https://api.github.com/repos/MarlinFirmware/Marlin
|
closed
|
[BUG] Behavior of PRINTJOB_TIMER_AUTOSTART does not match description in Configuration.h
|
C: Documentation C: Temperatures
|
### Did you test the latest `bugfix-2.0.x` code?
Yes, and the problem still exists.
### Bug Description
With `PRINTJOB_TIMER_AUTOSTART` enabled, the print job timer can be started as expected, but *stopping* the print job timer does not behave as described by the comments in `Configuration.h`, and the behavior has changed recently.
Specifically, the print job timer can not be stopped by either `M104 S0` or `M109 S0` unless the bed and chamber (if applicable) target temperatures are below`BED_MINTEMP` and `CHAMBER_MINTEMP`, respectively.
The comments for `PRINTJOB_TIMER_AUTOSTART` state:
```c
/**
* Print Job Timer
*
* Automatically start and stop the print job timer on M104/M109/M190.
*
* M104 (hotend, no wait) - high temp = none, low temp = stop timer
* M109 (hotend, wait) - high temp = start timer, low temp = stop timer
* M190 (bed, wait) - high temp = start timer, low temp = none
* ...
 */
```
This change in behavior is significant, since it requires the commands in a slicer's 'end gcode' and host gcode snippets (ie. OctoPrint cancel gcode) to be ordered correctly; the bed must be turned off before the hotend or the printer will still 'think' it's printing a job.
This is also likely to be confusing for users who are not familiar with the change in behavior, because some slicers' default 'end gcode' snippets have the cooldown in the 'wrong' order to trigger stopping the job with the new behavior. Additionally, the default 'cancel gcode' script in OctoPrint also calls `M104 S0` before `M140 S0`: https://docs.octoprint.org/en/master/features/gcode_scripts.html#bundled-scripts
I believe this behavior changed with commit aee971bcaf2d8b7157985f36f6705015ef334238.
Prior to that commit, `M140` could also trigger the print job to be stopped:
https://github.com/MarlinFirmware/Marlin/blob/e5ff55a1be7646b6159e6dedac50bfbe57e6dfa0/Marlin/src/gcode/temp/M140_M190.cpp#L73
After, neither `M140` nor `M190` could trigger the print job to be stopped:
https://github.com/MarlinFirmware/Marlin/blob/aee971bcaf2d8b7157985f36f6705015ef334238/Marlin/src/gcode/temp/M140_M190.cpp#L86
Note, the ability to stop the job with `M140` *also* doesn't match the comment in Configuration.h, but it did result in behavior closer to what's described.
### Bug Timeline
new with 2.0.8, likely aee971bcaf2d8b7157985f36f6705015ef334238
### Expected behavior
Print job timer to be stopped with `M104 S0` as described in Configuration.h
### Actual behavior
Print job timer is not stopped by `M104 S0` if the bed target temp is still set above `BED_MINTEMP`.
### Steps to Reproduce
1. `M190 S60` (or an appropriate bed temp)
2. `M109 S140` (or an appropriate hotend temp)
3. `M104 S0` or `M109 S0`
4. `M140 S0`
```terminal
Send: M190 S60
Recv: //action:notification Bed Heating...
Recv: echo:busy: processing
(repeated multiple times as bed heats)
[...]
Recv: //action:notification Printing...
Recv: ok
Send: M109 S140
Recv: echo:busy: processing
(repeated multiple times as hotend heats)
[...]
Recv: //action:notification Printing...
Recv: ok
Send: M104 S0
Recv: ok
Send: M140 S0
Recv: //action:notification Bed Cooling...
Recv: ok
```
Note that the host action notification 'Printing...' is sent after both `M190 S60` and `M109 S140`; this signifies that both of those commands caused the print job timer to start/resume.
At this point, the print job timer is still active. The LCD menu has 'Stop Print,' 'Pause Print' and 'Tune' items available.
If steps 4 and 3 are swapped, the desired outcome is achieved:
1. `M190 S60` (or an appropriate bed temp)
2. `M109 S140` (or an appropriate hotend temp)
4. `M140 S0`
3. `M104 S0`
```terminal
Send: M190 S60
Recv: //action:notification Bed Heating...
Recv: echo:busy: processing
(repeated multiple times as bed heats)
[...]
Recv: //action:notification Printing...
Recv: ok
Send: M109 S140
Recv: echo:busy: processing
(repeated multiple times as hotend heats)
[...]
Recv: //action:notification Printing...
Recv: ok
Send: M140 S0
Recv: //action:notification Bed Cooling...
Recv: ok
Send: M104 S0
Recv: //action:notification Ender-3 v2 Ready.
Recv: ok
```
Note that the final command resulted in a host notification of 'Ender-3 v2 Ready.' This signifies that the print job timer was stopped. The LCD does *not* have 'Stop,' 'Pause' and 'Tune' menu available at this point, as expected.
### Version of Marlin Firmware
bugfix-2.0.x (74be64a1ec40ecdb9d4adb5cc8c72afbb2af1651)
### Printer model
Ender 3 Pro & Ender 3 V2
### Electronics
v4.2.7 and v4.2.2 boards
### Add-ons
_No response_
### Your Slicer
Prusa Slicer
### Host Software
OctoPrint
### Additional information & file uploads
Configuration Files:
Ender 3 Pro:
[ender3pro-config.zip](https://github.com/MarlinFirmware/Marlin/files/6583202/ender3pro-config.zip)
Ender 3 V2 (same as pro but 4.2.2 board and slightly different bed geom):
[ender3v2-config.zip](https://github.com/MarlinFirmware/Marlin/files/6583198/ender3v2-config.zip)
|
1.0
|
[BUG] Behavior of PRINTJOB_TIMER_AUTOSTART does not match description in Configuration.h - ### Did you test the latest `bugfix-2.0.x` code?
Yes, and the problem still exists.
### Bug Description
With `PRINTJOB_TIMER_AUTOSTART` enabled, the print job timer can be started as expected, but *stopping* the print job timer does not behave as described by the comments in `Configuration.h`, and the behavior has changed recently.
Specifically, the print job timer cannot be stopped by either `M104 S0` or `M109 S0` unless the bed and chamber (if applicable) target temperatures are below `BED_MINTEMP` and `CHAMBER_MINTEMP`, respectively.
The comments for `PRINTJOB_TIMER_AUTOSTART` state:
```c
/**
* Print Job Timer
*
* Automatically start and stop the print job timer on M104/M109/M190.
*
* M104 (hotend, no wait) - high temp = none, low temp = stop timer
* M109 (hotend, wait) - high temp = start timer, low temp = stop timer
* M190 (bed, wait) - high temp = start timer, low temp = none
* ...
 */
```
This change in behavior is significant, since it requires the commands in a slicer's 'end gcode' and host gcode snippets (ie. OctoPrint cancel gcode) to be ordered correctly; the bed must be turned off before the hotend or the printer will still 'think' it's printing a job.
This is also likely to be confusing for users who are not familiar with the change in behavior, because some slicers' default 'end gcode' snippets have the cooldown in the 'wrong' order to trigger stopping the job with the new behavior. Additionally, the default 'cancel gcode' script in OctoPrint also calls `M104 S0` before `M140 S0`: https://docs.octoprint.org/en/master/features/gcode_scripts.html#bundled-scripts
I believe this behavior changed with commit aee971bcaf2d8b7157985f36f6705015ef334238.
Prior to that commit, `M140` could also trigger the print job to be stopped:
https://github.com/MarlinFirmware/Marlin/blob/e5ff55a1be7646b6159e6dedac50bfbe57e6dfa0/Marlin/src/gcode/temp/M140_M190.cpp#L73
After, neither `M140` nor `M190` could trigger the print job to be stopped:
https://github.com/MarlinFirmware/Marlin/blob/aee971bcaf2d8b7157985f36f6705015ef334238/Marlin/src/gcode/temp/M140_M190.cpp#L86
Note, the ability to stop the job with `M140` *also* doesn't match the comment in Configuration.h, but it did result in behavior closer to what's described.
### Bug Timeline
new with 2.0.8, likely aee971bcaf2d8b7157985f36f6705015ef334238
### Expected behavior
Print job timer to be stopped with `M104 S0` as described in Configuration.h
### Actual behavior
Print job timer is not stopped by `M104 S0` if the bed target temp is still set above `BED_MINTEMP`.
### Steps to Reproduce
1. `M190 S60` (or an appropriate bed temp)
2. `M109 S140` (or an appropriate hotend temp)
3. `M104 S0` or `M109 S0`
4. `M140 S0`
```terminal
Send: M190 S60
Recv: //action:notification Bed Heating...
Recv: echo:busy: processing
(repeated multiple times as bed heats)
[...]
Recv: //action:notification Printing...
Recv: ok
Send: M109 S140
Recv: echo:busy: processing
(repeated multiple times as hotend heats)
[...]
Recv: //action:notification Printing...
Recv: ok
Send: M104 S0
Recv: ok
Send: M140 S0
Recv: //action:notification Bed Cooling...
Recv: ok
```
Note that the host action notification 'Printing...' is sent after both `M190 S60` and `M109 S140`; this signifies that both of those commands caused the print job timer to start/resume.
At this point, the print job timer is still active. The LCD menu has 'Stop Print,' 'Pause Print' and 'Tune' items available.
If steps 4 and 3 are swapped, the desired outcome is achieved:
1. `M190 S60` (or an appropriate bed temp)
2. `M109 S140` (or an appropriate hotend temp)
4. `M140 S0`
3. `M104 S0`
```terminal
Send: M190 S60
Recv: //action:notification Bed Heating...
Recv: echo:busy: processing
(repeated multiple times as bed heats)
[...]
Recv: //action:notification Printing...
Recv: ok
Send: M109 S140
Recv: echo:busy: processing
(repeated multiple times as hotend heats)
[...]
Recv: //action:notification Printing...
Recv: ok
Send: M140 S0
Recv: //action:notification Bed Cooling...
Recv: ok
Send: M104 S0
Recv: //action:notification Ender-3 v2 Ready.
Recv: ok
```
Note that the final command resulted in a host notification of 'Ender-3 v2 Ready.' This signifies that the print job timer was stopped. The LCD does *not* have 'Stop,' 'Pause' and 'Tune' menu available at this point, as expected.
### Version of Marlin Firmware
bugfix-2.0.x (74be64a1ec40ecdb9d4adb5cc8c72afbb2af1651)
### Printer model
Ender 3 Pro & Ender 3 V2
### Electronics
v4.2.7 and v4.2.2 boards
### Add-ons
_No response_
### Your Slicer
Prusa Slicer
### Host Software
OctoPrint
### Additional information & file uploads
Configuration Files:
Ender 3 Pro:
[ender3pro-config.zip](https://github.com/MarlinFirmware/Marlin/files/6583202/ender3pro-config.zip)
Ender 3 V2 (same as pro but 4.2.2 board and slightly different bed geom):
[ender3v2-config.zip](https://github.com/MarlinFirmware/Marlin/files/6583198/ender3v2-config.zip)
|
non_process
|
behavior of printjob timer autostart does not match description in configuration h did you test the latest bugfix x code yes and the problem still exists bug description with printjob timer autostart enabled the print job time can be started as expected but stopping the print job timer does not behave as described by the comments in configuration h and the behavior has changed recently specifically the print job timer can not be stopped by either or unless the bed and chamber if applicable target temperatures are below bed mintemp and chamber mintemp respectively the comments for printjob timer autostart state c print job timer automatically start and stop the print job timer on hotend no wait high temp none low temp stop timer hotend wait high temp start timer low temp stop timer bed wait high temp start timer low temp none this change in behavior is significant since it requires the commands in a slicer s end gcode and host gcode snippets ie octoprint cancel gcode to be ordered correctly the bed must be turned off before the hotend or the printer will still think it s printing a job this is also likely to be confusing for users who are not familiar with the change in behavior because some slicers default end gcode snippets have the cooldown in the wrong order to trigger stopping the job with the new behavior additionally the default cancel gcode script in octoprint also calls before i believe this behavior changed with commit prior to that commit could also trigger the print job to be stopped after neither nor could trigger the print job to be stopped note the ability to stop the job with also doesn t match the comment in configuration h but it did result in behavior closer to what s described bug timeline new with likely expected behavior print job timer to be stopped with as described in configuration h actual behavior print job timer is not stopped by if the bed target temp is still set above bed mintemp steps to reproduce or an appropriate bed temp or an 
appropriate hotend temp or terminal send recv action notification bed heating recv echo busy processing repeated multiple times as bed heats recv action notification printing recv ok send recv echo busy processing repeated multiple times as hotend heats recv action notification printing recv ok send recv ok send recv action notification bed cooling recv ok note that the host action notification printing is sent after both and this signifies that both of those commands caused the print job timer to start resume at this point the printer job timer is still active the lcd menu has stop print pause print and tune items available if steps and are swapped the desired outcome is achieved or an appropriate bed temp or an appropriate hotend temp terminal send recv action notification bed heating recv echo busy processing repeated multiple times as bed heats recv action notification printing recv ok send recv echo busy processing repeated multiple times as hotend heats recv action notification printing recv ok send recv action notification bed cooling recv ok send recv action notification ender ready recv ok note that the final command resulted in a host notification of ender ready this signifies that the print job timer was stopped the lcd does not have stop pause and tune menu available at this point as expected version of marlin firmware bugfix x printer model ender pro ender electronics and boards add ons no response your slicer prusa slicer host software octoprint additional information file uploads configuration files ender pro ender same as pro but board and slightly different bed geom
| 0
|
22,012
| 30,515,637,424
|
IssuesEvent
|
2023-07-19 02:36:24
|
dart-lang/linter
|
https://api.github.com/repos/dart-lang/linter
|
closed
|
☂️ process: define a lint proposal process
|
type-task P3 meta process
|
*Tracking issue for a light-weight process that brings lint requests that are (intentionally) loose and open-ended to actionable proposals.*
## Goals
The process should be
* lightweight
* open
* formal but not litigious
encourage feedback from
* the community
* platform / language / analyzer teams
and help measure buy-in from
* "canonical" lint owners
* 1P/ Google3 customers
* flutter
Ultimately the goal is to introduce exactly as much process as would benefit taking a lint rule idea to something that might be implemented and widely adopted.
---
## Tasks
- [x] #3012
- [x] 🏷️ introduce new issue tags (to identify proposals and reflect status)
- [x] new labels: `lint proposal`, `status: pending`, `status: accepted`, `status: closed`
- [x] 🪜 add a new "proposal" issue template (https://github.com/dart-lang/linter/pull/3010)
|
1.0
|
☂️ process: define a lint proposal process - *Tracking issue for a light-weight process that brings lint requests that are (intentionally) loose and open-ended to actionable proposals.*
## Goals
The process should be
* lightweight
* open
* formal but not litigious
encourage feedback from
* the community
* platform / language / analyzer teams
and help measure buy-in from
* "canonical" lint owners
* 1P/ Google3 customers
* flutter
Ultimately the goal is to introduce exactly as much process as would benefit taking a lint rule idea to something that might be implemented and widely adopted.
---
## Tasks
- [x] #3012
- [x] 🏷️ introduce new issue tags (to identify proposals and reflect status)
- [x] new labels: `lint proposal`, `status: pending`, `status: accepted`, `status: closed`
- [x] 🪜 add a new "proposal" issue template (https://github.com/dart-lang/linter/pull/3010)
|
process
|
☂️ process define a lint proposal process tracking issue for a light weight process that brings lint requests that are intentionally loose and open ended to actionable proposals goals the process should be lightweight open formal but not litigious encourage feedback from the community platform language analyzer teams and help measure buy in from canonical lint owners customers flutter ultimately the goal is to introduce exactly as much process as would benefit taking a lint rule idea to something that might be implemented and widely adopted tasks 🏷️ introduce new issue tags to identify proposals and reflect status new labels lint proposal status pending status accepted status closed 🪜 add a new proposal issue template
| 1
|
248,981
| 21,092,664,195
|
IssuesEvent
|
2022-04-04 07:19:15
|
proarc/proarc-client
|
https://api.github.com/repos/proarc/proarc-client
|
closed
|
Validation of a new batch - screen freezes
|
1 chyba 6 k testování 7 návrh na zavření 6c otestováno: KNAV
|
I loaded new data and clicked continue. Without editing the loaded scans (i.e., they only had an automatically created index but no pagination), I ran the validation and the screen froze (see below). At this stage of processing the validation is pointless, but it can be triggered by mistake (I ran it deliberately and expected an error to be reported). Refresh or the back arrow allows leaving the screen.

|
2.0
|
Validation of a new batch - screen freezes - I loaded new data and clicked continue. Without editing the loaded scans (i.e., they only had an automatically created index but no pagination), I ran the validation and the screen froze (see below). At this stage of processing the validation is pointless, but it can be triggered by mistake (I ran it deliberately and expected an error to be reported). Refresh or the back arrow allows leaving the screen.

|
non_process
|
validation of a new batch screen freezes i loaded new data and clicked continue without editing the loaded scans i e they only had an automatically created index but no pagination i ran the validation and the screen froze see below at this stage of processing the validation is pointless but it can be triggered by mistake i ran it deliberately and expected an error to be reported refresh or the back arrow allows leaving the screen
| 0
|
296,548
| 22,306,373,276
|
IssuesEvent
|
2022-06-13 13:23:10
|
twardokus/v2verifier
|
https://api.github.com/repos/twardokus/v2verifier
|
closed
|
Update documentation for version 3.0
|
documentation
|
README needs to be updated with new installation instructions.
|
1.0
|
Update documentation for version 3.0 - README needs to be updated with new installation instructions.
|
non_process
|
update documentation for version readme needs to be updated with new installation instructions
| 0
|
9,782
| 12,800,962,806
|
IssuesEvent
|
2020-07-02 18:09:04
|
brucemiller/LaTeXML
|
https://api.github.com/repos/brucemiller/LaTeXML
|
opened
|
Experimental accessibility annotations for MathML
|
enhancement postprocessing
|
Since the last call of the "Accessibility for MathML" has started discussing prototype ideas, we should try to offer a demo capability in a standalone branch of latexml? And at least discuss implementation paths.
My thoughts on what I'd like to try:
- a math post-processor that acts as an alternative to the Content MathML processor, running after/in parallel to the Presentation post-processor and adding the attributes. Maybe the easiest first stab is to run both pmml and cmml and walk through the `xref`s adding attributes...
- once we have these generated, I'm thinking of writing a small javascript application that uses the attributes to serialize out a full English word verbalization of each formula.
- iterate over enough examples to feel it's reasonable... ?
|
1.0
|
Experimental accessibility annotations for MathML - Since the last call of the "Accessibility for MathML" has started discussing prototype ideas, we should try to offer a demo capability in a standalone branch of latexml? And at least discuss implementation paths.
My thoughts on what I'd like to try:
- a math post-processor that acts as an alternative to the Content MathML processor, running after/in parallel to the Presentation post-processor and adding the attributes. Maybe the easiest first stab is to run both pmml and cmml and walk through the `xref`s adding attributes...
- once we have these generated, I'm thinking of writing a small javascript application that uses the attributes to serialize out a full English word verbalization of each formula.
- iterate over enough examples to feel it's reasonable... ?
|
process
|
experimental accessibility annotations for mathml since the last call of the accessibility for mathml has started discussing prototype ideas we should try to offer a demo capability in a standalone branch of latexml and at least discuss implementation paths my thoughts on what i d like to try a math post processor that acts as an alternative to the content mathml processor running after in parallel to the presentation post processor and adding the attributes maybe the easiest first stab is to run both pmml and cmml and walk through the xref s adding attributes once we have these generated i m thinking of writing a small javascript application that uses the attributes to serialize out a full english word verbalization of each formula iterate over enough examples to feel it s reasonable
| 1
|
60,062
| 7,313,123,368
|
IssuesEvent
|
2018-02-28 23:31:40
|
chapel-lang/chapel
|
https://api.github.com/repos/chapel-lang/chapel
|
closed
|
Arithmetic on enumerators yields different type
|
area: Language stat: Won't fix / Ain't broke type: Design user issue
|
### Summary of Problem
Is there a purpose to setting the internal representation of an `enum` if arithmetic yields the default `int(64)`? Also, why does that happen?
### Steps to Reproduce
```chpl
type T_E = uint(8);
enum E {a = 1: T_E, b, c};
var e/*: T_E */ = E.a + E.b + E.c;
writeln("expected 6: uint(8)");
writeln("output ", e, ": ", e.type: string); // 6: int(64)
```
### Configuration Information
`chpl version 1.17.0 pre-release (2f0af5c5db)`
|
1.0
|
Arithmetic on enumerators yields different type - ### Summary of Problem
Is there a purpose to setting the internal representation of an `enum` if arithmetic yields the default `int(64)`? Also, why does that happen?
### Steps to Reproduce
```chpl
type T_E = uint(8);
enum E {a = 1: T_E, b, c};
var e/*: T_E */ = E.a + E.b + E.c;
writeln("expected 6: uint(8)");
writeln("output ", e, ": ", e.type: string); // 6: int(64)
```
### Configuration Information
`chpl version 1.17.0 pre-release (2f0af5c5db)`
|
non_process
|
arithmetic on enumerators yields different type summary of problem is there a purpose to setting the internal representation of an enum if arithmetic yields the default int also why does that happen steps to reproduce chpl type t e uint enum e a t e b c var e t e e a e b e c writeln expected uint writeln output e e type string int configuration information chpl version pre release
| 0
|
207,264
| 23,435,999,710
|
IssuesEvent
|
2022-08-15 09:53:31
|
Gal-Doron/Baragon-test-1
|
https://api.github.com/repos/Gal-Doron/Baragon-test-1
|
opened
|
logback-classic-1.2.3.jar: 1 vulnerabilities (highest severity is: 6.6)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>logback-classic-1.2.3.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /BaragonAgentService/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-test-1/commit/cf5a99e4d7eb323b58cf89eea9fff80401db0bcb">cf5a99e4d7eb323b58cf89eea9fff80401db0bcb</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2021-42550](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.6 | detected in multiple dependencies | Transitive | 1.2.8 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-42550</summary>
### Vulnerable Libraries - <b>logback-core-1.2.3.jar</b>, <b>logback-classic-1.2.3.jar</b></p>
<p>
### <b>logback-core-1.2.3.jar</b></p>
<p>logback-core module</p>
<p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p>
<p>Path to dependency file: /BaragonServiceIntegrationTests/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar</p>
<p>
Dependency Hierarchy:
- logback-classic-1.2.3.jar (Root Library)
- :x: **logback-core-1.2.3.jar** (Vulnerable Library)
### <b>logback-classic-1.2.3.jar</b></p>
<p>logback-classic module</p>
<p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p>
<p>Path to dependency file: /BaragonAgentService/pom.xml</p>
<p>Path to vulnerable library: /sitory/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/sitory/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/sitory/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar</p>
<p>
Dependency Hierarchy:
- :x: **logback-classic-1.2.3.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-test-1/commit/cf5a99e4d7eb323b58cf89eea9fff80401db0bcb">cf5a99e4d7eb323b58cf89eea9fff80401db0bcb</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In logback version 1.2.7 and prior versions, an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from LDAP servers.
<p>Publish Date: 2021-12-16
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550>CVE-2021-42550</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=VE-2021-42550">https://cve.mitre.org/cgi-bin/cvename.cgi?name=VE-2021-42550</a></p>
<p>Release Date: 2021-12-16</p>
<p>Fix Resolution (ch.qos.logback:logback-core): 1.2.8</p>
<p>Direct dependency fix Resolution (ch.qos.logback:logback-classic): 1.2.8</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
True
|
logback-classic-1.2.3.jar: 1 vulnerabilities (highest severity is: 6.6) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>logback-classic-1.2.3.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /BaragonAgentService/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-test-1/commit/cf5a99e4d7eb323b58cf89eea9fff80401db0bcb">cf5a99e4d7eb323b58cf89eea9fff80401db0bcb</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2021-42550](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.6 | detected in multiple dependencies | Transitive | 1.2.8 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-42550</summary>
### Vulnerable Libraries - <b>logback-core-1.2.3.jar</b>, <b>logback-classic-1.2.3.jar</b></p>
<p>
### <b>logback-core-1.2.3.jar</b></p>
<p>logback-core module</p>
<p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p>
<p>Path to dependency file: /BaragonServiceIntegrationTests/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-core/1.2.3/logback-core-1.2.3.jar</p>
<p>
Dependency Hierarchy:
- logback-classic-1.2.3.jar (Root Library)
- :x: **logback-core-1.2.3.jar** (Vulnerable Library)
### <b>logback-classic-1.2.3.jar</b></p>
<p>logback-classic module</p>
<p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p>
<p>Path to dependency file: /BaragonAgentService/pom.xml</p>
<p>Path to vulnerable library: /sitory/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/sitory/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/sitory/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar,/home/wss-scanner/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar</p>
<p>
Dependency Hierarchy:
- :x: **logback-classic-1.2.3.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-test-1/commit/cf5a99e4d7eb323b58cf89eea9fff80401db0bcb">cf5a99e4d7eb323b58cf89eea9fff80401db0bcb</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In logback version 1.2.7 and prior versions, an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from LDAP servers.
<p>Publish Date: 2021-12-16
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550>CVE-2021-42550</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=VE-2021-42550">https://cve.mitre.org/cgi-bin/cvename.cgi?name=VE-2021-42550</a></p>
<p>Release Date: 2021-12-16</p>
<p>Fix Resolution (ch.qos.logback:logback-core): 1.2.8</p>
<p>Direct dependency fix Resolution (ch.qos.logback:logback-classic): 1.2.8</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
non_process
|
logback classic jar vulnerabilities highest severity is vulnerable library logback classic jar path to dependency file baragonagentservice pom xml path to vulnerable library home wss scanner repository ch qos logback logback core logback core jar home wss scanner repository ch qos logback logback core logback core jar home wss scanner repository ch qos logback logback core logback core jar home wss scanner repository ch qos logback logback core logback core jar found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available medium detected in multiple dependencies transitive details cve vulnerable libraries logback core jar logback classic jar logback core jar logback core module library home page a href path to dependency file baragonserviceintegrationtests pom xml path to vulnerable library home wss scanner repository ch qos logback logback core logback core jar home wss scanner repository ch qos logback logback core logback core jar home wss scanner repository ch qos logback logback core logback core jar home wss scanner repository ch qos logback logback core logback core jar dependency hierarchy logback classic jar root library x logback core jar vulnerable library logback classic jar logback classic module library home page a href path to dependency file baragonagentservice pom xml path to vulnerable library sitory ch qos logback logback classic logback classic jar sitory ch qos logback logback classic logback classic jar sitory ch qos logback logback classic logback classic jar home wss scanner repository ch qos logback logback classic logback classic jar dependency hierarchy x logback classic jar vulnerable library found in head commit a href found in base branch master vulnerability details in logback version and prior versions an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from ldap servers publish date url a 
href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ch qos logback logback core direct dependency fix resolution ch qos logback logback classic rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue
| 0
|
509
| 3,868,521,446
|
IssuesEvent
|
2016-04-10 00:41:12
|
duckduckgo/zeroclickinfo-spice
|
https://api.github.com/repos/duckduckgo/zeroclickinfo-spice
|
closed
|
XKCD: Cached comics are outdated
|
Maintainer Approved
|
The ZCI shows a different comic than the website does. It is not up to date.
Didn't check the source, but it seems as if it is not able to update properly.
------
IA Page: http://duck.co/ia/view/xkcd
[Maintainer](http://docs.duckduckhack.com/maintaining/guidelines.html): @sdball
|
True
|
XKCD: Cached comics are outdated - The ZCI shows a different comic than the website does. It is not up to date.
Didn't check the source, but it seems as if it is not able to update properly.
------
IA Page: http://duck.co/ia/view/xkcd
[Maintainer](http://docs.duckduckhack.com/maintaining/guidelines.html): @sdball
|
non_process
|
xkcd cached comics are outdated the zci shows another comic than the website it is not up to date didn t check the source but it seems as if it is not able to update properly ia page sdball
| 0
|
4,631
| 7,477,481,260
|
IssuesEvent
|
2018-04-04 08:28:05
|
pingcap/tikv
|
https://api.github.com/repos/pingcap/tikv
|
opened
|
some un-pushed builtin UDFs
|
coprocessor help wanted
|
These functions are not pushed down to tikv:
## Mysql specific types
- [ ] ExprType_MysqlBit ExprType = 101
- [ ] ExprType_MysqlHex ExprType = 105
- [ ] ExprType_MysqlSet ExprType = 106
- [ ] ExprType_MysqlJson ExprType = 108
## Unary operations
- [ ] ExprType_Neg ExprType = 1002
- [ ] ExprType_BitNeg ExprType = 1003
## Bit operations
- [ ] ExprType_BitAnd ExprType = 2101
- [ ] ExprType_BitOr ExprType = 2102
- [ ] ExprType_BitXor ExprType = 2103
- [ ] ExprType_LeftShift ExprType = 2104
- [ ] ExprType_RighShift ExprType = 2105
## Arithmetic
- [ ] ExprType_IntDiv ExprType = 2205
- [ ] ExprType_Mod ExprType = 2206
## Logic operations
- [ ] ExprType_Xor ExprType = 2303
## Aggregate functions
- [ ] ExprType_GroupConcat ExprType = 3007
## Math functions
- [ ] ExprType_Abs ExprType = 3101
- [ ] ExprType_Pow ExprType = 3102
- [ ] ExprType_Round ExprType = 3103
## String functions
- [ ] ExprType_Concat ExprType = 3201
- [ ] ExprType_ConcatWS ExprType = 3202
- [ ] ExprType_Left ExprType = 3203
- [ ] ExprType_Length ExprType = 3204
- [ ] ExprType_Lower ExprType = 3205
- [ ] ExprType_Repeat ExprType = 3206
- [ ] ExprType_Replace ExprType = 3207
- [ ] ExprType_Upper ExprType = 3208
- [ ] ExprType_Strcmp ExprType = 3209
- [ ] ExprType_Convert ExprType = 3210
- [ ] ExprType_Cast ExprType = 3211
- [ ] ExprType_Substring ExprType = 3212
- [ ] ExprType_SubstringIndex ExprType = 3213
- [ ] ExprType_Locate ExprType = 3214
- [ ] ExprType_Trim ExprType = 3215
## Control flow functions
- [ ] ExprType_NullIf ExprType = 3302
## Time functions
- [ ] ExprType_Date ExprType = 3401
- [ ] ExprType_DateAdd ExprType = 3402
- [ ] ExprType_DateSub ExprType = 3403
- [ ] ExprType_Year ExprType = 3411
- [ ] ExprType_YearWeek ExprType = 3412
- [ ] ExprType_Month ExprType = 3421
- [ ] ExprType_Week ExprType = 3431
- [ ] ExprType_Weekday ExprType = 3432
- [ ] ExprType_WeekOfYear ExprType = 3433
- [ ] ExprType_Day ExprType = 3441
- [ ] ExprType_DayName ExprType = 3442
- [ ] ExprType_DayOfYear ExprType = 3443
- [ ] ExprType_DayOfMonth ExprType = 3444
- [ ] ExprType_DayOfWeek ExprType = 3445
- [ ] ExprType_Hour ExprType = 3451
- [ ] ExprType_Minute ExprType = 3452
- [ ] ExprType_Second ExprType = 3453
- [ ] ExprType_Microsecond ExprType = 3454
- [ ] ExprType_Extract ExprType = 3461
## Other functions
- [ ] ExprType_Greatest ExprType = 3502
- [ ] ExprType_Least ExprType = 3503
- [ ] ExprType_IsTruth ExprType = 4002
- [ ] ExprType_ExprRow ExprType = 4004
- [ ] ExprType_RLike ExprType = 4006
- [ ] ExprType_ScalarFunc ExprType = 10000
## Json functions
- [ ] ExprType_JsonValid ExprType = 3606
- [ ] ExprType_JsonContains ExprType = 3611
- [ ] ExprType_JsonContainsPath ExprType = 3613
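For any of these, most of the push-down work is reproducing MySQL's evaluation semantics exactly in the coprocessor. As a hedged illustration (plain Python, not tikv's actual evaluator API), here is what an `ExprType_Mod` (2206) implementation has to get right about NULLs, division by zero, and sign:

```python
# Hedged sketch -- not tikv's actual evaluator API. Illustrates the MySQL
# semantics a pushed-down ExprType_Mod (2206) has to reproduce exactly:
# NULL propagates, x % 0 yields NULL, and the result takes the dividend's sign.
def eval_mod(lhs, rhs):
    if lhs is None or rhs is None:
        return None                  # SQL NULL propagates
    if rhs == 0:
        return None                  # MySQL: x % 0 -> NULL
    r = abs(lhs) % abs(rhs)
    return -r if lhs < 0 else r      # sign follows the dividend

assert eval_mod(7, 3) == 1
assert eval_mod(-7, 3) == -1         # not 2, as Python's own % would give
assert eval_mod(7, 0) is None
```

The sign rule is the easy one to get wrong: Python's and Go's `%` follow the divisor or dividend differently than MySQL's truncating division, so a naive port silently disagrees with TiDB on negative operands.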
|
1.0
|
some un-pushed builtin UDFs - These functions are not pushed down to tikv:
## Mysql specific types
- [ ] ExprType_MysqlBit ExprType = 101
- [ ] ExprType_MysqlHex ExprType = 105
- [ ] ExprType_MysqlSet ExprType = 106
- [ ] ExprType_MysqlJson ExprType = 108
## Unary operations
- [ ] ExprType_Neg ExprType = 1002
- [ ] ExprType_BitNeg ExprType = 1003
## Bit operations
- [ ] ExprType_BitAnd ExprType = 2101
- [ ] ExprType_BitOr ExprType = 2102
- [ ] ExprType_BitXor ExprType = 2103
- [ ] ExprType_LeftShift ExprType = 2104
- [ ] ExprType_RighShift ExprType = 2105
## Arithmetic
- [ ] ExprType_IntDiv ExprType = 2205
- [ ] ExprType_Mod ExprType = 2206
## Logic operations
- [ ] ExprType_Xor ExprType = 2303
## Aggregate functions
- [ ] ExprType_GroupConcat ExprType = 3007
## Math functions
- [ ] ExprType_Abs ExprType = 3101
- [ ] ExprType_Pow ExprType = 3102
- [ ] ExprType_Round ExprType = 3103
## String functions
- [ ] ExprType_Concat ExprType = 3201
- [ ] ExprType_ConcatWS ExprType = 3202
- [ ] ExprType_Left ExprType = 3203
- [ ] ExprType_Length ExprType = 3204
- [ ] ExprType_Lower ExprType = 3205
- [ ] ExprType_Repeat ExprType = 3206
- [ ] ExprType_Replace ExprType = 3207
- [ ] ExprType_Upper ExprType = 3208
- [ ] ExprType_Strcmp ExprType = 3209
- [ ] ExprType_Convert ExprType = 3210
- [ ] ExprType_Cast ExprType = 3211
- [ ] ExprType_Substring ExprType = 3212
- [ ] ExprType_SubstringIndex ExprType = 3213
- [ ] ExprType_Locate ExprType = 3214
- [ ] ExprType_Trim ExprType = 3215
## Control flow functions
- [ ] ExprType_NullIf ExprType = 3302
## Time functions
- [ ] ExprType_Date ExprType = 3401
- [ ] ExprType_DateAdd ExprType = 3402
- [ ] ExprType_DateSub ExprType = 3403
- [ ] ExprType_Year ExprType = 3411
- [ ] ExprType_YearWeek ExprType = 3412
- [ ] ExprType_Month ExprType = 3421
- [ ] ExprType_Week ExprType = 3431
- [ ] ExprType_Weekday ExprType = 3432
- [ ] ExprType_WeekOfYear ExprType = 3433
- [ ] ExprType_Day ExprType = 3441
- [ ] ExprType_DayName ExprType = 3442
- [ ] ExprType_DayOfYear ExprType = 3443
- [ ] ExprType_DayOfMonth ExprType = 3444
- [ ] ExprType_DayOfWeek ExprType = 3445
- [ ] ExprType_Hour ExprType = 3451
- [ ] ExprType_Minute ExprType = 3452
- [ ] ExprType_Second ExprType = 3453
- [ ] ExprType_Microsecond ExprType = 3454
- [ ] ExprType_Extract ExprType = 3461
## Other functions
- [ ] ExprType_Greatest ExprType = 3502
- [ ] ExprType_Least ExprType = 3503
- [ ] ExprType_IsTruth ExprType = 4002
- [ ] ExprType_ExprRow ExprType = 4004
- [ ] ExprType_RLike ExprType = 4006
- [ ] ExprType_ScalarFunc ExprType = 10000
## Json functions
- [ ] ExprType_JsonValid ExprType = 3606
- [ ] ExprType_JsonContains ExprType = 3611
- [ ] ExprType_JsonContainsPath ExprType = 3613
|
process
|
some un pushed builtin udfs these functions are not pushed down to tikv mysql specific types exprtype mysqlbit exprtype exprtype mysqlhex exprtype exprtype mysqlset exprtype exprtype mysqljson exprtype unary operations exprtype neg exprtype exprtype bitneg exprtype bit operations exprtype bitand exprtype exprtype bitor exprtype exprtype bitxor exprtype exprtype leftshift exprtype exprtype righshift exprtype arithmatic exprtype intdiv exprtype exprtype mod exprtype logic operations exprtype xor exprtype aggregate functions exprtype groupconcat exprtype math functions exprtype abs exprtype exprtype pow exprtype exprtype round exprtype string functions exprtype concat exprtype exprtype concatws exprtype exprtype left exprtype exprtype length exprtype exprtype lower exprtype exprtype repeat exprtype exprtype replace exprtype exprtype upper exprtype exprtype strcmp exprtype exprtype convert exprtype exprtype cast exprtype exprtype substring exprtype exprtype substringindex exprtype exprtype locate exprtype exprtype trim exprtype control flow functions exprtype nullif exprtype time functions exprtype date exprtype exprtype dateadd exprtype exprtype datesub exprtype exprtype year exprtype exprtype yearweek exprtype exprtype month exprtype exprtype week exprtype exprtype weekday exprtype exprtype weekofyear exprtype exprtype day exprtype exprtype dayname exprtype exprtype dayofyear exprtype exprtype dayofmonth exprtype exprtype dayofweek exprtype exprtype hour exprtype exprtype minute exprtype exprtype second exprtype exprtype microsecond exprtype exprtype extract exprtype other functions exprtype greatest exprtype exprtype least exprtype exprtype istruth exprtype exprtype exprrow exprtype exprtype rlike exprtype exprtype scalarfunc exprtype json functions exprtype jsonvalid exprtype exprtype jsoncontains exprtype exprtype jsoncontainspath exprtype
| 1
|
14,429
| 17,480,888,268
|
IssuesEvent
|
2021-08-09 01:55:16
|
km4ack/pi-build
|
https://api.github.com/repos/km4ack/pi-build
|
reopened
|
GridTracker upgrade needed
|
in process
|
Maybe it's related to the bug I found, but GridTracker has updated. Will the upgrade script prompt me to update it, or how do I go about upgrading it "properly"?
|
1.0
|
GridTracker upgrade needed - Maybe it's related to the bug I found, but GridTracker has updated. Will the upgrade script prompt me to update it, or how do I go about upgrading it "properly"?
|
process
|
gridtracker upgrade needed maybe it s related to the bug i found but gridtracker has updated will the upgrade script prompt me to update it or how do i go about upgrading it properly
| 1
|
11,227
| 14,004,992,310
|
IssuesEvent
|
2020-10-28 17:49:58
|
NixOS/nixpkgs
|
https://api.github.com/repos/NixOS/nixpkgs
|
reopened
|
Post 20.09 cleanup
|
6.topic: release process
|
- [ ] Add metadata around kernel "flavors" (e.g. zen, xen, hardened) to allow for easier conditional `meta.broken` cc @NeQuissimus
- [ ] backport changes, and then mark appropriate linux modules as broken
- [ ] go through the failing nixosTests, mark failing tests as broken
- [ ] "forward port" some of the python breakages. https://github.com/NixOS/nixpkgs/commit/87b50c25ba9f67c918353226fb202a75ea090e59
|
1.0
|
Post 20.09 cleanup - - [ ] Add metadata around kernel "flavors" (e.g. zen, xen, hardened) to allow for easier conditional `meta.broken` cc @NeQuissimus
- [ ] backport changes, and then mark appropriate linux modules as broken
- [ ] go through the failing nixosTests, mark failing tests as broken
- [ ] "forward port" some of the python breakages. https://github.com/NixOS/nixpkgs/commit/87b50c25ba9f67c918353226fb202a75ea090e59
|
process
|
post cleanup add metadata around kernel flavors e g zen xen hardened to allow for easier conditional meta broken cc nequissimus backport changes and then mark appropriate linux modules as broken go through the failing nixostests mark failing tests as broken forward port some of the python breakages
| 1
|
18,031
| 24,038,242,285
|
IssuesEvent
|
2022-09-15 21:27:00
|
dtcenter/MET
|
https://api.github.com/repos/dtcenter/MET
|
closed
|
Fix the truncated station_id name in the output from IODA2NC
|
type: bug requestor: METplus Team MET: PreProcessing Tools (Point) priority: high
|
*Replace italics below with details for this issue.*
## Describe the Problem ##
*Provide a clear and concise description of the bug here.*
The MET point obs NetCDF file generated by ioda2nc contains truncated station ids. The station id comes from the "report_identifier@MetaData" variable. An IODA input file (at seneca:/d1/personal/kalb/ioda/raob_all_v1_20201215T1200Z.nc4) has a "station_id@MetaData" variable instead of "report_identifier@MetaData". MET's NetCDF dimension is 40 bytes (enough to contain the original station IDs).
The metadata variable for station_id@MetaData variable:
Station_ids from the IODA:
station_id@MetaData =
"89009 23 4gIUS02",
"89009 23 4gIUS04",
"89009 23 4gIUS02",
...
"68538-99-9gIUK02",
"68538-99-9gIUS10",
"68538-99-9gIUK04",
"68538-99-9gIUK06",
Generated MET point obs NetCDF file.
hdr_sid_table =
"89009 23",
"89664 23",
"89664 23 4n",
"89625 23",
...
"68538-99-9gIUK02",
"68538-99-9gIUS10",
"68538-99-9gIUK04",
"68538-99-9gIUK06",
### Expected Behavior ###
*Provide a clear and concise description of what you expected to happen here.*
Do not truncate the string (just strip out trailing white space characters).
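A minimal sketch of that expectation (illustrative Python, not MET's C++ code; the 40-byte width matches the reported header dimension): pad the id into the fixed-width char field and strip only trailing whitespace on the way out, so ids with embedded spaces like `"89009 23 4gIUS02"` survive intact:

```python
# Illustrative only -- pins down the requested behavior, not MET internals.
HDR_SID_WIDTH = 40  # reported size of MET's station-id NetCDF dimension

def pack_station_id(sid, width=HDR_SID_WIDTH):
    """Space-pad a station id into a fixed-width char field without truncating."""
    sid = sid.rstrip()  # strip trailing whitespace only
    if len(sid) > width:
        raise ValueError("station id exceeds header dimension")
    return sid.ljust(width)

def unpack_station_id(field):
    return field.rstrip()  # embedded spaces are preserved

assert unpack_station_id(pack_station_id("89009 23 4gIUS02")) == "89009 23 4gIUS02"
```

The buggy behavior corresponds to splitting on whitespace (yielding `"89009 23"`); the fix treats the whole 40-byte field as one string and only trims the trailing pad.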
### Environment ###
Describe your runtime environment:
1. Machine: Linux Workstation (seneca)
2. OS: RedHat Linux
3. Software version number(s): 11.0 beta2
### To Reproduce ###
Describe the steps to reproduce the behavior:
1. login to seneca
2. Modify IODA2NC config
```
metadata_map = [
{ key = "message_type"; val = "msg_type,station_ob"; },
{ key = "station_id"; val = "station_id,report_identifier"; },
{ key = "pressure"; val = "air_pressure,pressure"; },
{ key = "height"; val = "height,height_above_mean_sea_level"; },
{ key = "elevation"; val = ""; }
];
```
3. run ioda2nc
```
./ioda2nc /d1/personal/hsoh/data/IODA_files/raob_all_v1_20201215T1200Z.nc4 out_raob_all_air_temperature.nc /d1/personal/hsoh/git/features/feature_2215_ioda2nc_message_type/MET/share/met/config/IODA2NCConfig -obs_var air_temperature -v 4
```
### Relevant Deadlines ###
*List relevant project deadlines here or state NONE.*
### Funding Source ###
2799991
## Define the Metadata ##
### Assignee ###
- [ ] Select **engineer(s)** or **no engineer** required
- [ ] Select **scientist(s)** or **no scientist** required
### Labels ###
- [ ] Select **component(s)**
- [ ] Select **priority**
- [ ] Select **requestor(s)**
### Projects and Milestone ###
- [ ] Select **Organization** level **Project** for support of the current coordinated release
- [ ] Select **Repository** level **Project** for development toward the next official release or add **alert: NEED PROJECT ASSIGNMENT** label
- [ ] Select **Milestone** as the next bugfix version
## Define Related Issue(s) ##
Consider the impact to the other METplus components.
- [ ] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdatadb](https://github.com/dtcenter/METdatadb/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose)
## Bugfix Checklist ##
See the [METplus Workflow](https://metplus.readthedocs.io/en/latest/Contributors_Guide/github_workflow.html) for details.
- [ ] Complete the issue definition above, including the **Time Estimate** and **Funding Source**.
- [ ] Fork this repository or create a branch of **main_\<Version>**.
Branch name: `bugfix_<Issue Number>_main_<Version>_<Description>`
- [ ] Fix the bug and test your changes.
- [ ] Add/update log messages for easier debugging.
- [ ] Add/update unit tests.
- [ ] Add/update documentation.
- [ ] Push local changes to GitHub.
- [ ] Submit a pull request to merge into **main_\<Version>**.
Pull request: `bugfix <Issue Number> main_<Version> <Description>`
- [ ] Define the pull request metadata, as permissions allow.
Select: **Reviewer(s)** and **Linked issues**
Select: **Organization** level software support **Project** for the current coordinated release
Select: **Milestone** as the next bugfix version
- [ ] Iterate until the reviewer(s) accept and merge your changes.
- [ ] Delete your fork or branch.
- [ ] Complete the steps above to fix the bug on the **develop** branch.
Branch name: `bugfix_<Issue Number>_develop_<Description>`
Pull request: `bugfix <Issue Number> develop <Description>`
Select: **Reviewer(s)** and **Linked issues**
Select: **Repository** level development cycle **Project** for the next official release
Select: **Milestone** as the next official version
- [ ] Close this issue.
|
1.0
|
Fix the truncated station_id name in the output from IODA2NC - *Replace italics below with details for this issue.*
## Describe the Problem ##
*Provide a clear and concise description of the bug here.*
The MET point obs NetCDF file generated by ioda2nc contains truncated station ids. The station id comes from the "report_identifier@MetaData" variable. An IODA input file (at seneca:/d1/personal/kalb/ioda/raob_all_v1_20201215T1200Z.nc4) has a "station_id@MetaData" variable instead of "report_identifier@MetaData". MET's NetCDF dimension is 40 bytes (enough to contain the original station IDs).
The metadata variable for station_id@MetaData variable:
Station_ids from the IODA:
station_id@MetaData =
"89009 23 4gIUS02",
"89009 23 4gIUS04",
"89009 23 4gIUS02",
...
"68538-99-9gIUK02",
"68538-99-9gIUS10",
"68538-99-9gIUK04",
"68538-99-9gIUK06",
Generated MET point obs NetCDF file.
hdr_sid_table =
"89009 23",
"89664 23",
"89664 23 4n",
"89625 23",
...
"68538-99-9gIUK02",
"68538-99-9gIUS10",
"68538-99-9gIUK04",
"68538-99-9gIUK06",
### Expected Behavior ###
*Provide a clear and concise description of what you expected to happen here.*
Do not truncate the string (just strip out trailing white space characters).
### Environment ###
Describe your runtime environment:
1. Machine: Linux Workstation (seneca)
2. OS: RedHat Linux
3. Software version number(s): 11.0 beta2
### To Reproduce ###
Describe the steps to reproduce the behavior:
1. login to seneca
2. Modify IODA2NC config
```
metadata_map = [
{ key = "message_type"; val = "msg_type,station_ob"; },
{ key = "station_id"; val = "station_id,report_identifier"; },
{ key = "pressure"; val = "air_pressure,pressure"; },
{ key = "height"; val = "height,height_above_mean_sea_level"; },
{ key = "elevation"; val = ""; }
];
```
3. run ioda2nc
```
./ioda2nc /d1/personal/hsoh/data/IODA_files/raob_all_v1_20201215T1200Z.nc4 out_raob_all_air_temperature.nc /d1/personal/hsoh/git/features/feature_2215_ioda2nc_message_type/MET/share/met/config/IODA2NCConfig -obs_var air_temperature -v 4
```
### Relevant Deadlines ###
*List relevant project deadlines here or state NONE.*
### Funding Source ###
2799991
## Define the Metadata ##
### Assignee ###
- [ ] Select **engineer(s)** or **no engineer** required
- [ ] Select **scientist(s)** or **no scientist** required
### Labels ###
- [ ] Select **component(s)**
- [ ] Select **priority**
- [ ] Select **requestor(s)**
### Projects and Milestone ###
- [ ] Select **Organization** level **Project** for support of the current coordinated release
- [ ] Select **Repository** level **Project** for development toward the next official release or add **alert: NEED PROJECT ASSIGNMENT** label
- [ ] Select **Milestone** as the next bugfix version
## Define Related Issue(s) ##
Consider the impact to the other METplus components.
- [ ] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdatadb](https://github.com/dtcenter/METdatadb/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose)
## Bugfix Checklist ##
See the [METplus Workflow](https://metplus.readthedocs.io/en/latest/Contributors_Guide/github_workflow.html) for details.
- [ ] Complete the issue definition above, including the **Time Estimate** and **Funding Source**.
- [ ] Fork this repository or create a branch of **main_\<Version>**.
Branch name: `bugfix_<Issue Number>_main_<Version>_<Description>`
- [ ] Fix the bug and test your changes.
- [ ] Add/update log messages for easier debugging.
- [ ] Add/update unit tests.
- [ ] Add/update documentation.
- [ ] Push local changes to GitHub.
- [ ] Submit a pull request to merge into **main_\<Version>**.
Pull request: `bugfix <Issue Number> main_<Version> <Description>`
- [ ] Define the pull request metadata, as permissions allow.
Select: **Reviewer(s)** and **Linked issues**
Select: **Organization** level software support **Project** for the current coordinated release
Select: **Milestone** as the next bugfix version
- [ ] Iterate until the reviewer(s) accept and merge your changes.
- [ ] Delete your fork or branch.
- [ ] Complete the steps above to fix the bug on the **develop** branch.
Branch name: `bugfix_<Issue Number>_develop_<Description>`
Pull request: `bugfix <Issue Number> develop <Description>`
Select: **Reviewer(s)** and **Linked issues**
Select: **Repository** level development cycle **Project** for the next official release
Select: **Milestone** as the next official version
- [ ] Close this issue.
|
process
|
fix the truncated station id name in the output from replace italics below with details for this issue describe the problem provide a clear and concise description of the bug here generated met point obs netcdf file by contains truncated station ids the station id comes form report identifier metadata variable an ioda input file at seneca personal kalb ioda raob all has station id metadata variable instead of report identifier metadata the met s netcdf dimension is bytes enough to contain the original station ids the metadata variable for station id metadata variable station ids from the ioda station id metadata generated met point obs netcdf file hdr sid table expected behavior provide a clear and concise description of what you expected to happen here do not truncate the string just strip out trailing white space characters environment describe your runtime environment machine linux workstation seneca os redhat linux software version number s to reproduce describe the steps to reproduce the behavior login to seneca modify config metadata map key message type val msg type station ob key station id val station id report identifier key pressure val air pressure pressure key height val height height above mean sea level key elevation val run personal hsoh data ioda files raob all out raob all air temperature nc personal hsoh git features feature message type met share met config obs var air temperature v relevant deadlines list relevant project deadlines here or state none funding source define the metadata assignee select engineer s or no engineer required select scientist s or no scientist required labels select component s select priority select requestor s projects and milestone select organization level project for support of the current coordinated release select repository level project for development toward the next official release or add alert need project assignment label select milestone as the next bugfix version define related issue s consider the 
impact to the other metplus components bugfix checklist see the for details complete the issue definition above including the time estimate and funding source fork this repository or create a branch of main branch name bugfix main fix the bug and test your changes add update log messages for easier debugging add update unit tests add update documentation push local changes to github submit a pull request to merge into main pull request bugfix main define the pull request metadata as permissions allow select reviewer s and linked issues select organization level software support project for the current coordinated release select milestone as the next bugfix version iterate until the reviewer s accept and merge your changes delete your fork or branch complete the steps above to fix the bug on the develop branch branch name bugfix develop pull request bugfix develop select reviewer s and linked issues select repository level development cycle project for the next official release select milestone as the next official version close this issue
| 1
|
361,021
| 10,700,871,067
|
IssuesEvent
|
2019-10-24 01:49:21
|
Novusphere/discussions-app
|
https://api.github.com/repos/Novusphere/discussions-app
|
closed
|
Data/login persistence errors pertaining to login
|
bug high priority
|
Login as lmao11 with password lmao11
Logout (Disconnect)
Login - Brian Key - continue as lmao11
Enter lmao11 for password
"Incorrect brain key or password"
Again, I assume this relates to bugs that stem from logging in/out, etc.
|
1.0
|
Data/login persistence errors pertaining to login - Login as lmao11 with password lmao11
Logout (Disconnect)
Login - Brian Key - continue as lmao11
Enter lmao11 for password
"Incorrect brain key or password"
Again, I assume this relates to bugs that stem from logging in/out, etc.
|
non_process
|
data login persistence errors pertaining to login login as with password logout disconnect login brian key continue as enter for password incorrect brain key or password again i assume this relates to bugs that stem from logging in out etc
| 0
|
22,257
| 30,809,189,940
|
IssuesEvent
|
2023-08-01 09:15:17
|
UnitTestBot/UTBotJava
|
https://api.github.com/repos/UnitTestBot/UTBotJava
|
opened
|
Could not initialize class `S***L***` from sandbox in instrumented process
|
ctg-bug spec-internal comp-instrumented-process comp-spring
|
**Description**
Instrumented process errors:
Could not initialize class `S***L***` from sandbox
**To Reproduce**
1. Install [UnitTestBot plugin built from main](https://github.com/UnitTestBot/UTBotJava/actions/runs/5716372497) in IntelliJ IDEA
2. Open `sm***t` project
3. Generate tests for S***L*** class
**Expected behavior**
Either the class should be instantiated, or tests should be correctly disabled due to the sandbox.
**Actual behavior**
Error tests with failures from the instrumented process are generated.
There is one test `Disabled due to sandbox` - but without class instantiation.
There are NoClassDefFoundErrors in utbot-engine-current.log
**Screenshots, logs**
~~~java
@Test
@Disabled(value = "Disabled due to sandbox")
public void test***1() {
MockedStatic mockedStatic = null;
try {
mockedStatic = mockStatic(LoggerFactory.class);
(mockedStatic.when(() -> LoggerFactory.getLogger(any(Class.class)))).thenReturn(((Logger) null));
/* This test fails because method [***.S***L***.***] produces [java.security.AccessControlException: access denied ("java.util.PropertyPermission" "APP_HOME" "read")] */
} finally {
mockedStatic.close();
}
}
///endregion
///region Errors report for ***
public void test***_errors() {
// Couldn't generate some tests. List of errors:
//
// 1 occurrences of:
// Concrete execution failed
}
///endregion
~~~
~~~java
Caused by: com.jetbrains.rd.util.reactive.RdFault: InvocationPhase, reason: org.utbot.instrumentation.instrumentation.execution.phases.ExecutionPhaseError: InvocationPhase
at org.utbot.instrumentation.instrumentation.execution.phases.InvocationPhase.wrapError(InvocationPhase.kt:22)
at org.utbot.instrumentation.instrumentation.execution.phases.ExecutionPhaseKt.start(ExecutionPhase.kt:30)
at org.utbot.instrumentation.instrumentation.execution.phases.PhasesController.executePhaseInTimeout(PhasesController.kt:56)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation$invoke$1$1.invoke(SimpleUtExecutionInstrumentation.kt:63)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation$invoke$1$1.invoke(SimpleUtExecutionInstrumentation.kt:55)
at org.utbot.instrumentation.instrumentation.execution.UtExecutionInstrumentation$invoke$1.invoke(UtExecutionInstrumentation.kt:46)
at org.utbot.instrumentation.instrumentation.execution.UtExecutionInstrumentation$invoke$1.invoke(UtExecutionInstrumentation.kt:45)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation.invoke(SimpleUtExecutionInstrumentation.kt:55)
at org.utbot.instrumentation.instrumentation.execution.UtExecutionInstrumentation$DefaultImpls.invoke(UtExecutionInstrumentation.kt:45)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation.invoke(SimpleUtExecutionInstrumentation.kt:22)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation.invoke(SimpleUtExecutionInstrumentation.kt:22)
at org.utbot.instrumentation.process.InstrumentedProcessMainKt$setup$2.invoke(InstrumentedProcessMain.kt:132)
at org.utbot.instrumentation.process.InstrumentedProcessMainKt$setup$2.invoke(InstrumentedProcessMain.kt:129)
at org.utbot.rd.IdleWatchdog$measureTimeForActiveCall$1$2$1.invoke(ClientProcessUtil.kt:115)
at org.utbot.rd.IdleWatchdog.wrapActive(ClientProcessUtil.kt:88)
at org.utbot.rd.IdleWatchdog$measureTimeForActiveCall$1.invoke(ClientProcessUtil.kt:114)
at com.jetbrains.rd.framework.IRdEndpoint$set$1.invoke(TaskInterfaces.kt:182)
at com.jetbrains.rd.framework.IRdEndpoint$set$1.invoke(TaskInterfaces.kt:182)
at com.jetbrains.rd.framework.impl.RdCall.onWireReceived(RdTask.kt:362)
at com.jetbrains.rd.framework.MessageBroker$invoke$2$2.invoke(MessageBroker.kt:57)
at com.jetbrains.rd.framework.MessageBroker$invoke$2$2.invoke(MessageBroker.kt:56)
at com.jetbrains.rd.framework.impl.ProtocolContexts.readMessageContextAndInvoke(ProtocolContexts.kt:148)
at com.jetbrains.rd.framework.MessageBroker$invoke$2.invoke(MessageBroker.kt:56)
at com.jetbrains.rd.framework.MessageBroker$invoke$2.invoke(MessageBroker.kt:54)
at com.jetbrains.rd.util.threading.SingleThreadSchedulerBase.queue$lambda-3(SingleThreadScheduler.kt:41)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class ***.S***L***
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.utbot.instrumentation.instrumentation.InvokeInstrumentation$invoke$2$result$1.invoke-IoAF18A(InvokeInstrumentation.kt:61)
at org.utbot.instrumentation.instrumentation.InvokeInstrumentation$invoke$2$result$1.invoke(InvokeInstrumentation.kt:59)
at org.utbot.instrumentation.process.SecurityKt$runSandbox$1$1.invoke(Security.kt:40)
at org.utbot.instrumentation.process.SecurityKt$sandbox$1.invoke(Security.kt:62)
at org.utbot.instrumentation.process.SecurityKt$sandbox$2.invoke(Security.kt:78)
at org.utbot.instrumentation.process.SecurityKt$sandbox$3.invoke(Security.kt:83)
at org.utbot.instrumentation.process.SecurityKt$sandbox$4.run(Security.kt:89)
at java.security.AccessController.doPrivileged(Native Method)
at org.utbot.instrumentation.process.SecurityKt.sandbox(Security.kt:89)
at org.utbot.instrumentation.process.SecurityKt.sandbox(Security.kt:83)
at org.utbot.instrumentation.process.SecurityKt.sandbox(Security.kt:78)
at org.utbot.instrumentation.process.SecurityKt.sandbox(Security.kt:62)
at org.utbot.instrumentation.process.SecurityKt.runSandbox(Security.kt:40)
at org.utbot.instrumentation.process.SecurityKt.runSandbox$default(Security.kt:38)
at org.utbot.instrumentation.instrumentation.InvokeInstrumentation.invoke-BWLJW6A(InvokeInstrumentation.kt:59)
at org.utbot.instrumentation.instrumentation.InvokeInstrumentation.invoke(InvokeInstrumentation.kt:21)
at org.utbot.instrumentation.instrumentation.Instrumentation$DefaultImpls.invoke$default(Instrumentation.kt:21)
at org.utbot.instrumentation.instrumentation.execution.phases.InvocationPhase.invoke-0E7RQCE(InvocationPhase.kt:31)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation$invoke$1$1$concreteResult$1.invoke-IoAF18A(SimpleUtExecutionInstrumentation.kt:64)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation$invoke$1$1$concreteResult$1.invoke(SimpleUtExecutionInstrumentation.kt:63)
at org.utbot.instrumentation.instrumentation.execution.phases.PhasesController$executePhaseInTimeout$1$result$1.invoke(PhasesController.kt:62)
at org.utbot.common.ThreadBasedExecutor$invokeWithTimeout$1.invoke(ThreadUtil.kt:44)
at org.utbot.common.ThreadBasedExecutor$ensureThreadIsAlive$1.invoke(ThreadUtil.kt:91)
at org.utbot.common.ThreadBasedExecutor$ensureThreadIsAlive$1.invoke(ThreadUtil.kt:87)
at kotlin.concurrent.ThreadsKt$thread$thread$1.run(Thread.kt:30)
at com.jetbrains.rd.framework.RdTaskResult$Companion.read(TaskInterfaces.kt:30) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.impl.CallSiteWiredRdTask.onWireReceived(RdTask.kt:106) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.MessageBroker$invoke$2$2.invoke(MessageBroker.kt:57) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.MessageBroker$invoke$2$2.invoke(MessageBroker.kt:56) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.impl.ProtocolContexts.readMessageContextAndInvoke(ProtocolContexts.kt:148) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.MessageBroker$invoke$2.invoke(MessageBroker.kt:56) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.MessageBroker$invoke$2.invoke(MessageBroker.kt:54) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.impl.RdCall$createResponseScheduler$1.queue$execute(RdTask.kt:280) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.impl.RdCall$createResponseScheduler$1.access$queue$execute(RdTask.kt:269) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.impl.RdCall$createResponseScheduler$1$queue$1.invokeSuspend(RdTask.kt:289) ~[rd-framework-2023.1.2.jar:?]
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33) ~[kotlin-stdlib-1.8.10.jar:1.8.10-release-430(1.8.10)]
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106) ~[kotlinx-coroutines-core-jvm-1.6.3.jar:?]
... 28 more
~~~
**Environment**
IntelliJ IDEA version - 2023.1 Community
Project - Maven
JDK - 1.8
|
1.0
|
Could not initialize class `S***L***` from sandbox in instrumented process - **Description**
Instrumented process errors:
Could not initialize class `S***L***` from sandbox
**To Reproduce**
1. Install [UnitTestBot plugin built from main](https://github.com/UnitTestBot/UTBotJava/actions/runs/5716372497) in IntelliJ IDEA
2. Open `sm***t` project
3. Generate tests for S***L*** class
**Expected behavior**
The class should be instantiated, or tests correctly disabled due to the sandbox should be generated.
**Actual behavior**
Tests reporting failures from the instrumented process are generated.
There is one test marked `Disabled due to sandbox` - but it does not instantiate the class.
There are NoClassDefFoundErrors in utbot-engine-current.log
**Screenshots, logs**
~~~java
@Test
@Disabled(value = "Disabled due to sandbox")
public void test***1() {
MockedStatic mockedStatic = null;
try {
mockedStatic = mockStatic(LoggerFactory.class);
(mockedStatic.when(() -> LoggerFactory.getLogger(any(Class.class)))).thenReturn(((Logger) null));
/* This test fails because method [***.S***L***.***] produces [java.security.AccessControlException: access denied ("java.util.PropertyPermission" "APP_HOME" "read")] */
} finally {
mockedStatic.close();
}
}
///endregion
///region Errors report for ***
public void test***_errors() {
// Couldn't generate some tests. List of errors:
//
// 1 occurrences of:
// Concrete execution failed
}
///endregion
~~~
~~~java
Caused by: com.jetbrains.rd.util.reactive.RdFault: InvocationPhase, reason: org.utbot.instrumentation.instrumentation.execution.phases.ExecutionPhaseError: InvocationPhase
at org.utbot.instrumentation.instrumentation.execution.phases.InvocationPhase.wrapError(InvocationPhase.kt:22)
at org.utbot.instrumentation.instrumentation.execution.phases.ExecutionPhaseKt.start(ExecutionPhase.kt:30)
at org.utbot.instrumentation.instrumentation.execution.phases.PhasesController.executePhaseInTimeout(PhasesController.kt:56)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation$invoke$1$1.invoke(SimpleUtExecutionInstrumentation.kt:63)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation$invoke$1$1.invoke(SimpleUtExecutionInstrumentation.kt:55)
at org.utbot.instrumentation.instrumentation.execution.UtExecutionInstrumentation$invoke$1.invoke(UtExecutionInstrumentation.kt:46)
at org.utbot.instrumentation.instrumentation.execution.UtExecutionInstrumentation$invoke$1.invoke(UtExecutionInstrumentation.kt:45)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation.invoke(SimpleUtExecutionInstrumentation.kt:55)
at org.utbot.instrumentation.instrumentation.execution.UtExecutionInstrumentation$DefaultImpls.invoke(UtExecutionInstrumentation.kt:45)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation.invoke(SimpleUtExecutionInstrumentation.kt:22)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation.invoke(SimpleUtExecutionInstrumentation.kt:22)
at org.utbot.instrumentation.process.InstrumentedProcessMainKt$setup$2.invoke(InstrumentedProcessMain.kt:132)
at org.utbot.instrumentation.process.InstrumentedProcessMainKt$setup$2.invoke(InstrumentedProcessMain.kt:129)
at org.utbot.rd.IdleWatchdog$measureTimeForActiveCall$1$2$1.invoke(ClientProcessUtil.kt:115)
at org.utbot.rd.IdleWatchdog.wrapActive(ClientProcessUtil.kt:88)
at org.utbot.rd.IdleWatchdog$measureTimeForActiveCall$1.invoke(ClientProcessUtil.kt:114)
at com.jetbrains.rd.framework.IRdEndpoint$set$1.invoke(TaskInterfaces.kt:182)
at com.jetbrains.rd.framework.IRdEndpoint$set$1.invoke(TaskInterfaces.kt:182)
at com.jetbrains.rd.framework.impl.RdCall.onWireReceived(RdTask.kt:362)
at com.jetbrains.rd.framework.MessageBroker$invoke$2$2.invoke(MessageBroker.kt:57)
at com.jetbrains.rd.framework.MessageBroker$invoke$2$2.invoke(MessageBroker.kt:56)
at com.jetbrains.rd.framework.impl.ProtocolContexts.readMessageContextAndInvoke(ProtocolContexts.kt:148)
at com.jetbrains.rd.framework.MessageBroker$invoke$2.invoke(MessageBroker.kt:56)
at com.jetbrains.rd.framework.MessageBroker$invoke$2.invoke(MessageBroker.kt:54)
at com.jetbrains.rd.util.threading.SingleThreadSchedulerBase.queue$lambda-3(SingleThreadScheduler.kt:41)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class ***.S***L***
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.utbot.instrumentation.instrumentation.InvokeInstrumentation$invoke$2$result$1.invoke-IoAF18A(InvokeInstrumentation.kt:61)
at org.utbot.instrumentation.instrumentation.InvokeInstrumentation$invoke$2$result$1.invoke(InvokeInstrumentation.kt:59)
at org.utbot.instrumentation.process.SecurityKt$runSandbox$1$1.invoke(Security.kt:40)
at org.utbot.instrumentation.process.SecurityKt$sandbox$1.invoke(Security.kt:62)
at org.utbot.instrumentation.process.SecurityKt$sandbox$2.invoke(Security.kt:78)
at org.utbot.instrumentation.process.SecurityKt$sandbox$3.invoke(Security.kt:83)
at org.utbot.instrumentation.process.SecurityKt$sandbox$4.run(Security.kt:89)
at java.security.AccessController.doPrivileged(Native Method)
at org.utbot.instrumentation.process.SecurityKt.sandbox(Security.kt:89)
at org.utbot.instrumentation.process.SecurityKt.sandbox(Security.kt:83)
at org.utbot.instrumentation.process.SecurityKt.sandbox(Security.kt:78)
at org.utbot.instrumentation.process.SecurityKt.sandbox(Security.kt:62)
at org.utbot.instrumentation.process.SecurityKt.runSandbox(Security.kt:40)
at org.utbot.instrumentation.process.SecurityKt.runSandbox$default(Security.kt:38)
at org.utbot.instrumentation.instrumentation.InvokeInstrumentation.invoke-BWLJW6A(InvokeInstrumentation.kt:59)
at org.utbot.instrumentation.instrumentation.InvokeInstrumentation.invoke(InvokeInstrumentation.kt:21)
at org.utbot.instrumentation.instrumentation.Instrumentation$DefaultImpls.invoke$default(Instrumentation.kt:21)
at org.utbot.instrumentation.instrumentation.execution.phases.InvocationPhase.invoke-0E7RQCE(InvocationPhase.kt:31)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation$invoke$1$1$concreteResult$1.invoke-IoAF18A(SimpleUtExecutionInstrumentation.kt:64)
at org.utbot.instrumentation.instrumentation.execution.SimpleUtExecutionInstrumentation$invoke$1$1$concreteResult$1.invoke(SimpleUtExecutionInstrumentation.kt:63)
at org.utbot.instrumentation.instrumentation.execution.phases.PhasesController$executePhaseInTimeout$1$result$1.invoke(PhasesController.kt:62)
at org.utbot.common.ThreadBasedExecutor$invokeWithTimeout$1.invoke(ThreadUtil.kt:44)
at org.utbot.common.ThreadBasedExecutor$ensureThreadIsAlive$1.invoke(ThreadUtil.kt:91)
at org.utbot.common.ThreadBasedExecutor$ensureThreadIsAlive$1.invoke(ThreadUtil.kt:87)
at kotlin.concurrent.ThreadsKt$thread$thread$1.run(Thread.kt:30)
at com.jetbrains.rd.framework.RdTaskResult$Companion.read(TaskInterfaces.kt:30) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.impl.CallSiteWiredRdTask.onWireReceived(RdTask.kt:106) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.MessageBroker$invoke$2$2.invoke(MessageBroker.kt:57) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.MessageBroker$invoke$2$2.invoke(MessageBroker.kt:56) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.impl.ProtocolContexts.readMessageContextAndInvoke(ProtocolContexts.kt:148) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.MessageBroker$invoke$2.invoke(MessageBroker.kt:56) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.MessageBroker$invoke$2.invoke(MessageBroker.kt:54) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.impl.RdCall$createResponseScheduler$1.queue$execute(RdTask.kt:280) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.impl.RdCall$createResponseScheduler$1.access$queue$execute(RdTask.kt:269) ~[rd-framework-2023.1.2.jar:?]
at com.jetbrains.rd.framework.impl.RdCall$createResponseScheduler$1$queue$1.invokeSuspend(RdTask.kt:289) ~[rd-framework-2023.1.2.jar:?]
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33) ~[kotlin-stdlib-1.8.10.jar:1.8.10-release-430(1.8.10)]
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106) ~[kotlinx-coroutines-core-jvm-1.6.3.jar:?]
... 28 more
~~~
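The `NoClassDefFoundError: Could not initialize class ...` in the trace above is the JVM's cached record of an earlier static-initializer failure: once a class's static initializer throws (here, presumably because the sandbox denied the `PropertyPermission`), the class is marked erroneous, and every later use fails with this cached error instead of the original exception. A minimal, hypothetical sketch of that mechanism (the class and exception here are invented stand-ins, not the real `S***L***`):

```java
// Demonstrates JVM class-initialization failure caching: the first use of a
// class whose static initializer throws produces ExceptionInInitializerError;
// every subsequent use produces NoClassDefFoundError "Could not initialize class".
public class InitFailureDemo {

    // Stand-in for a class whose static initializer fails (e.g. under a sandbox).
    static class Flaky {
        static {
            if (true) { // wrapped in if(true) so the block stays compilable
                throw new RuntimeException("simulated sandbox denial");
            }
        }
        static int value = 42;
    }

    public static void main(String[] args) {
        try {
            System.out.println(Flaky.value); // first use: initializer runs and throws
        } catch (ExceptionInInitializerError e) {
            System.out.println("first access: " + e);
        }
        try {
            System.out.println(Flaky.value); // second use: JVM replays the cached failure
        } catch (NoClassDefFoundError e) {
            System.out.println("second access: " + e);
        }
    }
}
```

This is why the instrumented process keeps reporting "Could not initialize class" on every execution after the first denied initialization.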
**Environment**
IntelliJ IDEA version - 2023.1 Community
Project - Maven
JDK - 1.8
|
process
|
could not initialize class s l from sandbox in instrumented process description instrumented process errors could not initialize class s l from sandbox to reproduce install in intellij idea open sm t project generate tests for s l class expected behavior class should be instantiated or correctly disabled due to sandbox tests should be generated actual behavior error tests with failures from instrumented process are generated there is one test disabled due to sandbox but without class instantiation there are noclassdeffounderrors in utbot engine current log screenshots logs java test disabled value disabled due to sandbox public void test mockedstatic mockedstatic null try mockedstatic mockstatic loggerfactory class mockedstatic when loggerfactory getlogger any class class thenreturn logger null this test fails because method produces finally mockedstatic close endregion region errors report for public void test errors couldn t generate some tests list of errors occurrences of concrete execution failed endregion java caused by com jetbrains rd util reactive rdfault invocationphase reason org utbot instrumentation instrumentation execution phases executionphaseerror invocationphase at org utbot instrumentation instrumentation execution phases invocationphase wraperror invocationphase kt at org utbot instrumentation instrumentation execution phases executionphasekt start executionphase kt at org utbot instrumentation instrumentation execution phases phasescontroller executephaseintimeout phasescontroller kt at org utbot instrumentation instrumentation execution simpleutexecutioninstrumentation invoke invoke simpleutexecutioninstrumentation kt at org utbot instrumentation instrumentation execution simpleutexecutioninstrumentation invoke invoke simpleutexecutioninstrumentation kt at org utbot instrumentation instrumentation execution utexecutioninstrumentation invoke invoke utexecutioninstrumentation kt at org utbot instrumentation instrumentation execution 
utexecutioninstrumentation invoke invoke utexecutioninstrumentation kt at org utbot instrumentation instrumentation execution simpleutexecutioninstrumentation invoke simpleutexecutioninstrumentation kt at org utbot instrumentation instrumentation execution utexecutioninstrumentation defaultimpls invoke utexecutioninstrumentation kt at org utbot instrumentation instrumentation execution simpleutexecutioninstrumentation invoke simpleutexecutioninstrumentation kt at org utbot instrumentation instrumentation execution simpleutexecutioninstrumentation invoke simpleutexecutioninstrumentation kt at org utbot instrumentation process instrumentedprocessmainkt setup invoke instrumentedprocessmain kt at org utbot instrumentation process instrumentedprocessmainkt setup invoke instrumentedprocessmain kt at org utbot rd idlewatchdog measuretimeforactivecall invoke clientprocessutil kt at org utbot rd idlewatchdog wrapactive clientprocessutil kt at org utbot rd idlewatchdog measuretimeforactivecall invoke clientprocessutil kt at com jetbrains rd framework irdendpoint set invoke taskinterfaces kt at com jetbrains rd framework irdendpoint set invoke taskinterfaces kt at com jetbrains rd framework impl rdcall onwirereceived rdtask kt at com jetbrains rd framework messagebroker invoke invoke messagebroker kt at com jetbrains rd framework messagebroker invoke invoke messagebroker kt at com jetbrains rd framework impl protocolcontexts readmessagecontextandinvoke protocolcontexts kt at com jetbrains rd framework messagebroker invoke invoke messagebroker kt at com jetbrains rd framework messagebroker invoke invoke messagebroker kt at com jetbrains rd util threading singlethreadschedulerbase queue lambda singlethreadscheduler kt at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java caused by java lang noclassdeffounderror could not initialize class s l 
at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org utbot instrumentation instrumentation invokeinstrumentation invoke result invoke invokeinstrumentation kt at org utbot instrumentation instrumentation invokeinstrumentation invoke result invoke invokeinstrumentation kt at org utbot instrumentation process securitykt runsandbox invoke security kt at org utbot instrumentation process securitykt sandbox invoke security kt at org utbot instrumentation process securitykt sandbox invoke security kt at org utbot instrumentation process securitykt sandbox invoke security kt at org utbot instrumentation process securitykt sandbox run security kt at java security accesscontroller doprivileged native method at org utbot instrumentation process securitykt sandbox security kt at org utbot instrumentation process securitykt sandbox security kt at org utbot instrumentation process securitykt sandbox security kt at org utbot instrumentation process securitykt sandbox security kt at org utbot instrumentation process securitykt runsandbox security kt at org utbot instrumentation process securitykt runsandbox default security kt at org utbot instrumentation instrumentation invokeinstrumentation invoke invokeinstrumentation kt at org utbot instrumentation instrumentation invokeinstrumentation invoke invokeinstrumentation kt at org utbot instrumentation instrumentation instrumentation defaultimpls invoke default instrumentation kt at org utbot instrumentation instrumentation execution phases invocationphase invoke invocationphase kt at org utbot instrumentation instrumentation execution simpleutexecutioninstrumentation invoke concreteresult invoke simpleutexecutioninstrumentation kt at org utbot instrumentation instrumentation execution simpleutexecutioninstrumentation 
invoke concreteresult invoke simpleutexecutioninstrumentation kt at org utbot instrumentation instrumentation execution phases phasescontroller executephaseintimeout result invoke phasescontroller kt at org utbot common threadbasedexecutor invokewithtimeout invoke threadutil kt at org utbot common threadbasedexecutor ensurethreadisalive invoke threadutil kt at org utbot common threadbasedexecutor ensurethreadisalive invoke threadutil kt at kotlin concurrent threadskt thread thread run thread kt at com jetbrains rd framework rdtaskresult companion read taskinterfaces kt at com jetbrains rd framework impl callsitewiredrdtask onwirereceived rdtask kt at com jetbrains rd framework messagebroker invoke invoke messagebroker kt at com jetbrains rd framework messagebroker invoke invoke messagebroker kt at com jetbrains rd framework impl protocolcontexts readmessagecontextandinvoke protocolcontexts kt at com jetbrains rd framework messagebroker invoke invoke messagebroker kt at com jetbrains rd framework messagebroker invoke invoke messagebroker kt at com jetbrains rd framework impl rdcall createresponsescheduler queue execute rdtask kt at com jetbrains rd framework impl rdcall createresponsescheduler access queue execute rdtask kt at com jetbrains rd framework impl rdcall createresponsescheduler queue invokesuspend rdtask kt at kotlin coroutines jvm internal basecontinuationimpl resumewith continuationimpl kt at kotlinx coroutines dispatchedtask run dispatchedtask kt more environment intellij idea version community project maven jdk
| 1
|
46,981
| 11,941,992,292
|
IssuesEvent
|
2020-04-02 19:27:12
|
GoogleContainerTools/skaffold
|
https://api.github.com/repos/GoogleContainerTools/skaffold
|
closed
|
Jib multi-module builds fails when building with google cloud build
|
area/build build/jib kind/bug kind/question priority/p2
|
### Actual behavior
The build fails: the parent pom.xml lists all of the submodules, but Jib only copies the modules relevant to the build into the build context. This causes the Maven build to fail because a module is missing. I tested this using the jib-multimodule example @ https://github.com/GoogleContainerTools/skaffold/blob/master/examples/jib-multimodule
### Expected behavior
Running a Google Cloud Build for a single module of a multi-module Maven project with Jib should succeed. The parent pom used in the build context should not contain the modules that aren't present in the build context. (Or the whole project has to be copied over into the build context, but that seems counterproductive.)
### Information
- Skaffold version: 1.1.0
- Operating system: Windows 10 (Home) Version 10.0.18362 Build 18362
- Contents of skaffold.yaml:
```yaml
apiVersion: skaffold/v2alpha1
kind: Config
build:
artifacts:
- image: gcr.io/k8s-skaffold/skaffold-jib-1
jib:
project: project1
- image: gcr.io/k8s-skaffold/skaffold-jib-2
jib:
project: :skaffold-project-2
profiles:
- name: gcb
build:
googleCloudBuild:
projectId: some-GC-project
```
### Steps to reproduce the behavior
1. Use the skaffold.yaml above with the jib-multimodule example project @ https://github.com/GoogleContainerTools/skaffold/blob/master/examples/jib-multimodule
2. Edit the projectId in the skaffold.yaml for a valid one :))
3. run `skaffold build -p gcb`
4. Output of the command:
```
skaffold build -p gcb
Generating tags...
- gcr.io/k8s-skaffold/skaffold-jib-1 -> gcr.io/k8s-skaffold/skaffold-jib-1:v1.1.0-59-g7ae5aab7e-dirty
- gcr.io/k8s-skaffold/skaffold-jib-2 -> gcr.io/k8s-skaffold/skaffold-jib-2:v1.1.0-59-g7ae5aab7e-dirty
Checking cache...
- gcr.io/k8s-skaffold/skaffold-jib-1: Not found. Building
- gcr.io/k8s-skaffold/skaffold-jib-2: Not found. Building
Building [gcr.io/k8s-skaffold/skaffold-jib-1]...
Pushing code to gs://some-GC-project_cloudbuild/source/some-GC-project-191229db1121117b3e4ad330058f752b.tar.gz
Logs are available at
https://console.cloud.google.com/m/cloudstorage/b/some-GC-project_cloudbuild/o/log-b12534bb-8243-4369-9a98-1641cf438bb5.txt
starting build "b12534bb-8243-4369-9a98-1641cf438bb5"
FETCHSOURCE
Fetching storage object: gs://some-GC-project_cloudbuild/source/some-GC-project-191229db1121117b3e4ad330058f752b.tar.gz#1578396083573298
Copying gs://some-GC-project_cloudbuild/source/some-GC-project-191229db1121117b3e4ad330058f752b.tar.gz#1578396083573298...
/ [1 files][ 1.2 KiB/ 1.2 KiB]
Operation completed over 1 objects/1.2 KiB.
BUILD
Already have image (with digest): gcr.io/cloud-builders/mvn
[INFO] Scanning for projects...
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/boot/spring-boot-starter-parent/2.0.5.RELEASE/spring-boot-starter-parent-2.0.5.RELEASE.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/boot/spring-boot-starter-parent/2.0.5.RELEASE/spring-boot-starter-parent-2.0.5.RELEASE.pom (12 kB at 15 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/boot/spring-boot-dependencies/2.0.5.RELEASE/spring-boot-dependencies-2.0.5.RELEASE.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/boot/spring-boot-dependencies/2.0.5.RELEASE/spring-boot-dependencies-2.0.5.RELEASE.pom (137 kB at 783 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/com/fasterxml/jackson/jackson-bom/2.9.6/jackson-bom-2.9.6.pom
Downloaded from central: https://repo.maven.apache.org/maven2/com/fasterxml/jackson/jackson-bom/2.9.6/jackson-bom-2.9.6.pom (12 kB at 208 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/com/fasterxml/jackson/jackson-parent/2.9.1.1/jackson-parent-2.9.1.1.pom
Downloaded from central: https://repo.maven.apache.org/maven2/com/fasterxml/jackson/jackson-parent/2.9.1.1/jackson-parent-2.9.1.1.pom (8.0 kB at 149 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/com/fasterxml/oss-parent/33/oss-parent-33.pom
Downloaded from central: https://repo.maven.apache.org/maven2/com/fasterxml/oss-parent/33/oss-parent-33.pom (22 kB at 326 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/io/netty/netty-bom/4.1.29.Final/netty-bom-4.1.29.Final.pom
Downloaded from central: https://repo.maven.apache.org/maven2/io/netty/netty-bom/4.1.29.Final/netty-bom-4.1.29.Final.pom (7.9 kB at 136 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/sonatype/oss/oss-parent/7/oss-parent-7.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/sonatype/oss/oss-parent/7/oss-parent-7.pom (4.8 kB at 83 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/io/projectreactor/reactor-bom/Bismuth-SR11/reactor-bom-Bismuth-SR11.pom
Downloaded from central: https://repo.maven.apache.org/maven2/io/projectreactor/reactor-bom/Bismuth-SR11/reactor-bom-Bismuth-SR11.pom (3.6 kB at 67 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/logging/log4j/log4j-bom/2.10.0/log4j-bom-2.10.0.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/logging/log4j/log4j-bom/2.10.0/log4j-bom-2.10.0.pom (5.6 kB at 83 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/logging/logging-parent/1/logging-parent-1.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/logging/logging-parent/1/logging-parent-1.pom (3.2 kB at 56 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/apache/18/apache-18.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/apache/18/apache-18.pom (16 kB at 280 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/eclipse/jetty/jetty-bom/9.4.12.v20180830/jetty-bom-9.4.12.v20180830.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/eclipse/jetty/jetty-bom/9.4.12.v20180830/jetty-bom-9.4.12.v20180830.pom (18 kB at 293 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/spring-framework-bom/5.0.9.RELEASE/spring-framework-bom-5.0.9.RELEASE.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/spring-framework-bom/5.0.9.RELEASE/spring-framework-bom-5.0.9.RELEASE.pom (5.3 kB at 97 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/data/spring-data-releasetrain/Kay-SR10/spring-data-releasetrain-Kay-SR10.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/data/spring-data-releasetrain/Kay-SR10/spring-data-releasetrain-Kay-SR10.pom (4.5 kB at 80 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/data/build/spring-data-build/2.0.10.RELEASE/spring-data-build-2.0.10.RELEASE.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/data/build/spring-data-build/2.0.10.RELEASE/spring-data-build-2.0.10.RELEASE.pom (6.6 kB at 129 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/integration/spring-integration-bom/5.0.8.RELEASE/spring-integration-bom-5.0.8.RELEASE.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/integration/spring-integration-bom/5.0.8.RELEASE/spring-integration-bom-5.0.8.RELEASE.pom (8.9 kB at 148 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/security/spring-security-bom/5.0.8.RELEASE/spring-security-bom-5.0.8.RELEASE.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/security/spring-security-bom/5.0.8.RELEASE/spring-security-bom-5.0.8.RELEASE.pom (4.8 kB at 78 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/session/spring-session-bom/Apple-SR5/spring-session-bom-Apple-SR5.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/session/spring-session-bom/Apple-SR5/spring-session-bom-Apple-SR5.pom (3.0 kB at 66 kB/s)
[ERROR] [ERROR] Some problems were encountered while processing the POMs:
[ERROR] Child module /workspace/project2 of /workspace/pom.xml does not exist @
@
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]
[ERROR] The project org.skaffold:parent:0.1.0 (/workspace/pom.xml) has 1 error
[ERROR] Child module /workspace/project2 of /workspace/pom.xml does not exist
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
ERROR
ERROR: build step 0 "gcr.io/cloud-builders/mvn" failed: exit status 1
time="2020-01-07T12:21:41+01:00" level=fatal msg="build failed: build failed: building [gcr.io/k8s-skaffold/skaffold-jib-1]: cloud build failed: FAILURE"
```
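The `Child module /workspace/project2 of /workspace/pom.xml does not exist` error above happens because the uploaded context contains the parent pom verbatim while only one module directory was copied. A hypothetical sketch (invented code, not Skaffold's actual implementation) of the fix the issue asks for — keeping only the `<modules>` entries whose directories are actually present in the build context:

```java
// Sketch: filter a parent pom's <module> entries down to the directories
// that were actually copied into the build context, so Maven's reactor
// does not fail on a missing child module.
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class ModuleFilter {

    // Returns the module names declared in pomXml whose directories exist under root.
    static List<String> presentModules(String pomXml, Path root) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(pomXml.getBytes(StandardCharsets.UTF_8)));
        NodeList modules = doc.getElementsByTagName("module");
        List<String> kept = new ArrayList<>();
        for (int i = 0; i < modules.getLength(); i++) {
            String name = modules.item(i).getTextContent().trim();
            if (Files.isDirectory(root.resolve(name))) {
                kept.add(name); // this module's directory made it into the context
            }
        }
        return kept;
    }

    public static void main(String[] args) throws Exception {
        Path ctx = Files.createTempDirectory("context");
        Files.createDirectory(ctx.resolve("project1")); // only project1 is in the context
        String pom = "<project><modules>"
                + "<module>project1</module>"
                + "<module>project2</module>"
                + "</modules></project>";
        System.out.println(presentModules(pom, ctx)); // project2 is filtered out
    }
}
```

An alternative workaround, without rewriting the pom, would be to build only the selected module and its local dependencies (for example with Maven's `-pl <module> -am` reactor flags), so the missing sibling is never resolved.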
|
2.0
|
Jib multi-module builds fails when building with google cloud build - ### Actual behavior
The build fails: the parent pom.xml lists all of the submodules, but Jib only copies the modules relevant to the build into the build context. This causes the Maven build to fail because a module is missing. I tested this using the jib-multimodule example @ https://github.com/GoogleContainerTools/skaffold/blob/master/examples/jib-multimodule
### Expected behavior
Running a Google Cloud Build for a single module of a multi-module Maven project with Jib should succeed. The parent pom used in the build context should not contain the modules that aren't present in the build context. (Or the whole project has to be copied over into the build context, but that seems counterproductive.)
### Information
- Skaffold version: 1.1.0
- Operating system: Windows 10 (Home) Version 10.0.18362 Build 18362
- Contents of skaffold.yaml:
```yaml
apiVersion: skaffold/v2alpha1
kind: Config
build:
artifacts:
- image: gcr.io/k8s-skaffold/skaffold-jib-1
jib:
project: project1
- image: gcr.io/k8s-skaffold/skaffold-jib-2
jib:
project: :skaffold-project-2
profiles:
- name: gcb
build:
googleCloudBuild:
projectId: some-GC-project
```
### Steps to reproduce the behavior
1. Use the skaffold.yaml above with the jib-multimodule example project @ https://github.com/GoogleContainerTools/skaffold/blob/master/examples/jib-multimodule
2. Edit the projectId in the skaffold.yaml for a valid one :))
3. run `skaffold build -p gcb`
4. Output of the command:
```
skaffold build -p gcb
Generating tags...
- gcr.io/k8s-skaffold/skaffold-jib-1 -> gcr.io/k8s-skaffold/skaffold-jib-1:v1.1.0-59-g7ae5aab7e-dirty
- gcr.io/k8s-skaffold/skaffold-jib-2 -> gcr.io/k8s-skaffold/skaffold-jib-2:v1.1.0-59-g7ae5aab7e-dirty
Checking cache...
- gcr.io/k8s-skaffold/skaffold-jib-1: Not found. Building
- gcr.io/k8s-skaffold/skaffold-jib-2: Not found. Building
Building [gcr.io/k8s-skaffold/skaffold-jib-1]...
Pushing code to gs://some-GC-project_cloudbuild/source/some-GC-project-191229db1121117b3e4ad330058f752b.tar.gz
Logs are available at
https://console.cloud.google.com/m/cloudstorage/b/some-GC-project_cloudbuild/o/log-b12534bb-8243-4369-9a98-1641cf438bb5.txt
starting build "b12534bb-8243-4369-9a98-1641cf438bb5"
FETCHSOURCE
Fetching storage object: gs://some-GC-project_cloudbuild/source/some-GC-project-191229db1121117b3e4ad330058f752b.tar.gz#1578396083573298
Copying gs://some-GC-project_cloudbuild/source/some-GC-project-191229db1121117b3e4ad330058f752b.tar.gz#1578396083573298...
/ [1 files][ 1.2 KiB/ 1.2 KiB]
Operation completed over 1 objects/1.2 KiB.
BUILD
Already have image (with digest): gcr.io/cloud-builders/mvn
[INFO] Scanning for projects...
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/boot/spring-boot-starter-parent/2.0.5.RELEASE/spring-boot-starter-parent-2.0.5.RELEASE.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/boot/spring-boot-starter-parent/2.0.5.RELEASE/spring-boot-starter-parent-2.0.5.RELEASE.pom (12 kB at 15 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/boot/spring-boot-dependencies/2.0.5.RELEASE/spring-boot-dependencies-2.0.5.RELEASE.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/boot/spring-boot-dependencies/2.0.5.RELEASE/spring-boot-dependencies-2.0.5.RELEASE.pom (137 kB at 783 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/com/fasterxml/jackson/jackson-bom/2.9.6/jackson-bom-2.9.6.pom
Downloaded from central: https://repo.maven.apache.org/maven2/com/fasterxml/jackson/jackson-bom/2.9.6/jackson-bom-2.9.6.pom (12 kB at 208 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/com/fasterxml/jackson/jackson-parent/2.9.1.1/jackson-parent-2.9.1.1.pom
Downloaded from central: https://repo.maven.apache.org/maven2/com/fasterxml/jackson/jackson-parent/2.9.1.1/jackson-parent-2.9.1.1.pom (8.0 kB at 149 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/com/fasterxml/oss-parent/33/oss-parent-33.pom
Downloaded from central: https://repo.maven.apache.org/maven2/com/fasterxml/oss-parent/33/oss-parent-33.pom (22 kB at 326 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/io/netty/netty-bom/4.1.29.Final/netty-bom-4.1.29.Final.pom
Downloaded from central: https://repo.maven.apache.org/maven2/io/netty/netty-bom/4.1.29.Final/netty-bom-4.1.29.Final.pom (7.9 kB at 136 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/sonatype/oss/oss-parent/7/oss-parent-7.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/sonatype/oss/oss-parent/7/oss-parent-7.pom (4.8 kB at 83 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/io/projectreactor/reactor-bom/Bismuth-SR11/reactor-bom-Bismuth-SR11.pom
Downloaded from central: https://repo.maven.apache.org/maven2/io/projectreactor/reactor-bom/Bismuth-SR11/reactor-bom-Bismuth-SR11.pom (3.6 kB at 67 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/logging/log4j/log4j-bom/2.10.0/log4j-bom-2.10.0.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/logging/log4j/log4j-bom/2.10.0/log4j-bom-2.10.0.pom (5.6 kB at 83 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/logging/logging-parent/1/logging-parent-1.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/logging/logging-parent/1/logging-parent-1.pom (3.2 kB at 56 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/apache/18/apache-18.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/apache/18/apache-18.pom (16 kB at 280 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/eclipse/jetty/jetty-bom/9.4.12.v20180830/jetty-bom-9.4.12.v20180830.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/eclipse/jetty/jetty-bom/9.4.12.v20180830/jetty-bom-9.4.12.v20180830.pom (18 kB at 293 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/spring-framework-bom/5.0.9.RELEASE/spring-framework-bom-5.0.9.RELEASE.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/spring-framework-bom/5.0.9.RELEASE/spring-framework-bom-5.0.9.RELEASE.pom (5.3 kB at 97 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/data/spring-data-releasetrain/Kay-SR10/spring-data-releasetrain-Kay-SR10.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/data/spring-data-releasetrain/Kay-SR10/spring-data-releasetrain-Kay-SR10.pom (4.5 kB at 80 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/data/build/spring-data-build/2.0.10.RELEASE/spring-data-build-2.0.10.RELEASE.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/data/build/spring-data-build/2.0.10.RELEASE/spring-data-build-2.0.10.RELEASE.pom (6.6 kB at 129 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/integration/spring-integration-bom/5.0.8.RELEASE/spring-integration-bom-5.0.8.RELEASE.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/integration/spring-integration-bom/5.0.8.RELEASE/spring-integration-bom-5.0.8.RELEASE.pom (8.9 kB at 148 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/security/spring-security-bom/5.0.8.RELEASE/spring-security-bom-5.0.8.RELEASE.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/security/spring-security-bom/5.0.8.RELEASE/spring-security-bom-5.0.8.RELEASE.pom (4.8 kB at 78 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/springframework/session/spring-session-bom/Apple-SR5/spring-session-bom-Apple-SR5.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/springframework/session/spring-session-bom/Apple-SR5/spring-session-bom-Apple-SR5.pom (3.0 kB at 66 kB/s)
[ERROR] [ERROR] Some problems were encountered while processing the POMs:
[ERROR] Child module /workspace/project2 of /workspace/pom.xml does not exist @
@
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]
[ERROR] The project org.skaffold:parent:0.1.0 (/workspace/pom.xml) has 1 error
[ERROR] Child module /workspace/project2 of /workspace/pom.xml does not exist
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
ERROR
ERROR: build step 0 "gcr.io/cloud-builders/mvn" failed: exit status 1
time="2020-01-07T12:21:41+01:00" level=fatal msg="build failed: build failed: building [gcr.io/k8s-skaffold/skaffold-jib-1]: cloud build failed: FAILURE"
```
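Until jib/skaffold handles this case, the workaround the report implies — ship a parent pom into the build context that only lists the submodules actually copied over — could be sketched as below. `prune_missing_modules` and the namespace-free pom are illustrative assumptions, not part of skaffold or jib; a real Maven pom also carries an XML namespace, omitted here for brevity.

```python
import xml.etree.ElementTree as ET

def prune_missing_modules(pom_xml: str, present_dirs) -> str:
    """Return the parent pom with <module> entries dropped when their
    directory is absent from the build context.

    present_dirs is the set of module directories that actually exist in
    the context (in practice it would come from os.listdir('/workspace')).
    """
    root = ET.fromstring(pom_xml)
    modules = root.find('modules')
    if modules is not None:
        for module in list(modules):
            # Remove modules (e.g. /workspace/project2) that were not copied over.
            if (module.text or '').strip() not in present_dirs:
                modules.remove(module)
    return ET.tostring(root, encoding='unicode')
```

With only `project1` present in the context, `project2` is dropped from `<modules>`, which would avoid the "Child module ... does not exist" failure above.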
|
non_process
|
jib multi module builds fails when building with google cloud build actual behavior the build fails as the parent pom xml contains a listing of all the submodules only the modules relevant to the build are being copied over by jib however this causes the maven build to fail because a module is missing i tested this using the jib multimodule example expected behavior running a google cloud build for a single module of a multi module maven projects with jib should succeed the parent pom used in the build context should not contain the modules that aren t present in the build context or the whole project has to be copied over into the build context but that seems counterproductive information skaffold version operating system windows home version build contents of skaffold yaml yaml apiversion skaffold kind config build artifacts image gcr io skaffold skaffold jib jib project image gcr io skaffold skaffold jib jib project skaffold project profiles name gcb build googlecloudbuild projectid some gc project steps to reproduce the behavior use the skaffold yaml above with the jib multimodule example project edit the projectid in the skaffold yaml for a valid one run skaffold build p gcb output of the command skaffold build p gcb generating tags gcr io skaffold skaffold jib gcr io skaffold skaffold jib dirty gcr io skaffold skaffold jib gcr io skaffold skaffold jib dirty checking cache gcr io skaffold skaffold jib not found building gcr io skaffold skaffold jib not found building building pushing code to gs some gc project cloudbuild source some gc project tar gz logs are available at starting build fetchsource fetching storage object gs some gc project cloudbuild source some gc project tar gz copying gs some gc project cloudbuild source some gc project tar gz operation completed over objects kib build already have image with digest gcr io cloud builders mvn scanning for projects downloading from central downloaded from central kb at kb s downloading from central 
downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s downloading from central downloaded from central kb at kb s some problems were encountered while processing the poms child module workspace of workspace pom xml does not exist the build could not read project the project org skaffold parent workspace pom xml has error child module workspace of workspace pom xml does not exist to see the full stack trace of the errors re run maven with the e switch re run maven using the x switch to enable full debug logging for more information about the errors and possible solutions please read the following articles error error build step gcr io cloud builders mvn failed exit status time level fatal msg build failed build failed building cloud build failed failure
| 0
|
2,326
| 11,771,088,389
|
IssuesEvent
|
2020-03-15 22:14:14
|
AdExNetwork/adex-market
|
https://api.github.com/repos/AdExNetwork/adex-market
|
opened
|
automatic updating of Cloudflare WAF
|
automation enhancement
|
See `scrpits/get-waf`
Use the `cloudflare` npm module to automate updating of the rule
|
1.0
|
automatic updating of Cloudflare WAF - See `scrpits/get-waf`
Use the `cloudflare` npm module to automate updating of the rule
|
non_process
|
automatic updating of cloudflare waf see scrpits get waf use the cloudflare npm module to automate updating of the rule
| 0
|
13,940
| 16,717,543,973
|
IssuesEvent
|
2021-06-10 00:11:54
|
Leviatan-Analytics/LA-data-processing
|
https://api.github.com/repos/Leviatan-Analytics/LA-data-processing
|
closed
|
Test text recognition (EasyOCR vs Pytesseract) [2]
|
Data Processing Sprint 2 Week 2
|
Estimated time: 2 hs per assignee
Compare the two libraries based on time, recognition accuracy, etc.
Output: Document with the different aspects we tested and the result of those tests.
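The timing half of the comparison could be harnessed as below; `benchmark_ocr`, the engine names, and the image paths are placeholders — in practice the callables would wrap `easyocr.Reader.readtext` and `pytesseract.image_to_string`.

```python
import time
from statistics import mean
from typing import Callable, Dict, List

def benchmark_ocr(engines: Dict[str, Callable[[str], str]],
                  images: List[str], runs: int = 3) -> Dict[str, float]:
    """Return mean wall-clock seconds per engine over all images and runs."""
    results = {}
    for name, recognize in engines.items():
        timings = []
        for _ in range(runs):
            start = time.perf_counter()
            for image in images:
                recognize(image)  # OCR call under test
            timings.append(time.perf_counter() - start)
        results[name] = mean(timings)
    return results
```

Accuracy would be measured separately, e.g. by comparing each engine's output against ground-truth strings for the same images.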
|
1.0
|
Test text recognition (EasyOCR vs Pytesseract) [2] - Estimated time: 2 hs per assignee
Compare the two libraries based on time, recognition accuracy, etc.
Output: Document with the different aspects we tested and the result of those tests.
|
process
|
test text recognition easyocr vs pytesseract estimated time hs per assignee compare the two libraries based on time recognition accuracy etc output document with the different aspects we tested and the result of those tests
| 1
|
16,046
| 20,192,236,836
|
IssuesEvent
|
2022-02-11 07:05:44
|
qgis/QGIS-Documentation
|
https://api.github.com/repos/qgis/QGIS-Documentation
|
closed
|
[processing][needs-docs] By default, hide algorithms with known issues from toolbox
|
Automatic new feature Processing 3.4
|
Original commit: https://github.com/qgis/QGIS/commit/237c74536fbd0d3d9853d2a43c5b1b9989ce8e0f by nyalldawson
And add a Processing setting to allow these to be shown. When shown, they
are highlighted in red with a tooltip explaining that the algorithm
has known issues
(cherry picked from commit 63d648738d893890a48ab3fea175ded725f648e2)
(cherry picked from commit 0b412166e621701fac903067dec5a86089a4d15c)
|
1.0
|
[processing][needs-docs] By default, hide algorithms with known issues from toolbox - Original commit: https://github.com/qgis/QGIS/commit/237c74536fbd0d3d9853d2a43c5b1b9989ce8e0f by nyalldawson
And add a Processing setting to allow these to be shown. When shown, they
are highlighted in red with a tooltip explaining that the algorithm
has known issues
(cherry picked from commit 63d648738d893890a48ab3fea175ded725f648e2)
(cherry picked from commit 0b412166e621701fac903067dec5a86089a4d15c)
|
process
|
by default hide algorithms with known issues from toolbox original commit by nyalldawson and add a processing setting to allow these to be shown when shown they are highlighted in red with a tooltip explaining that the algorithm has known issues cherry picked from commit cherry picked from commit
| 1
|
15,410
| 19,598,938,489
|
IssuesEvent
|
2022-01-05 21:42:55
|
deepset-ai/haystack
|
https://api.github.com/repos/deepset-ai/haystack
|
closed
|
UserWarning in Tutorial2 finetuning notebook
|
type:bug topic:preprocessing
|
**Describe the bug**
This is not really a bug but there is a UserWarning when training the model because a list is directly converted to a tensor:
`haystack/modeling/data_handler/dataset.py:65: UserWarning: Creating a tensor from a list of numpy.ndarrays is extremely slow. Please consider converting the list to a single numpy.ndarray with numpy.array() before converting to a tensor. (Triggered internally at ../torch/csrc/utils/tensor_new.cpp:201.)`
This does not look so nice and it's also easily fixed.
**Error message**
`haystack/modeling/data_handler/dataset.py:65: UserWarning: Creating a tensor from a list of numpy.ndarrays is extremely slow. Please consider converting the list to a single numpy.ndarray with numpy.array() before converting to a tensor. (Triggered internally at ../torch/csrc/utils/tensor_new.cpp:201.)`
**Expected behavior**
The list could be converted to an np.array first. I don't know if it really has any performance benefits in this case but it will at least avoid the UserWarning. I was actually interested in the performance difference and made a little script to compare the two approaches:
```python
import torch
import numpy as np
import timeit

def compare_speed(runs, conversion=False):
    lst = range(1000000)
    if conversion:
        lst = np.array(lst)
    times = []
    for run in range(runs):
        t0 = timeit.default_timer()
        tense = torch.tensor(lst, dtype=torch.long)
        t1 = timeit.default_timer()
        times.append(t1 - t0)
    print(f'Executed {runs} runs converting {"np.array" if conversion else "list"} to tensor. Took {sum(times)/runs} s on average.')

if __name__ == '__main__':
    compare_speed(1000)
    compare_speed(1000, True)
```
This results in:
```
Executed 1000 runs converting list to tensor. Took 0.022728791695000018 s on average.
Executed 1000 runs converting np.array to tensor. Took 0.00040228743899997 s on average.
```
Which is still quite a difference.
My understanding is that
https://github.com/deepset-ai/haystack/blob/3e0ef1cc8a6e6cef44056fd18f4e94089fc90311/haystack/modeling/data_handler/dataset.py#L65
Could simply be changed to:
```python
cur_tensor = torch.tensor(np.array([sample[t_name] for sample in features]), dtype=torch.long)
```
**Additional context**
Add any other context about the problem here, like document types / preprocessing steps / settings of reader etc.
**To Reproduce**
Steps to reproduce the behavior
**FAQ Check**
- [ ] Have you had a look at [our new FAQ page](https://haystack.deepset.ai/overview/faq)?
**System:**
- OS:
- GPU/CPU:
- Haystack version (commit or version number):
- DocumentStore:
- Reader:
- Retriever:
|
1.0
|
UserWarning in Tutorial2 finetuning notebook - **Describe the bug**
This is not really a bug but there is a UserWarning when training the model because a list is directly converted to a tensor:
`haystack/modeling/data_handler/dataset.py:65: UserWarning: Creating a tensor from a list of numpy.ndarrays is extremely slow. Please consider converting the list to a single numpy.ndarray with numpy.array() before converting to a tensor. (Triggered internally at ../torch/csrc/utils/tensor_new.cpp:201.)`
This does not look so nice and it's also easily fixed.
**Error message**
`haystack/modeling/data_handler/dataset.py:65: UserWarning: Creating a tensor from a list of numpy.ndarrays is extremely slow. Please consider converting the list to a single numpy.ndarray with numpy.array() before converting to a tensor. (Triggered internally at ../torch/csrc/utils/tensor_new.cpp:201.)`
**Expected behavior**
The list could be converted to an np.array first. I don't know if it really has any performance benefits in this case but it will at least avoid the UserWarning. I was actually interested in the performance difference and made a little script to compare the two approaches:
```python
import torch
import numpy as np
import timeit

def compare_speed(runs, conversion=False):
    lst = range(1000000)
    if conversion:
        lst = np.array(lst)
    times = []
    for run in range(runs):
        t0 = timeit.default_timer()
        tense = torch.tensor(lst, dtype=torch.long)
        t1 = timeit.default_timer()
        times.append(t1 - t0)
    print(f'Executed {runs} runs converting {"np.array" if conversion else "list"} to tensor. Took {sum(times)/runs} s on average.')

if __name__ == '__main__':
    compare_speed(1000)
    compare_speed(1000, True)
```
This results in:
```
Executed 1000 runs converting list to tensor. Took 0.022728791695000018 s on average.
Executed 1000 runs converting np.array to tensor. Took 0.00040228743899997 s on average.
```
Which is still quite a difference.
My understanding is that
https://github.com/deepset-ai/haystack/blob/3e0ef1cc8a6e6cef44056fd18f4e94089fc90311/haystack/modeling/data_handler/dataset.py#L65
Could simply be changed to:
```python
cur_tensor = torch.tensor(np.array([sample[t_name] for sample in features]), dtype=torch.long)
```
**Additional context**
Add any other context about the problem here, like document types / preprocessing steps / settings of reader etc.
**To Reproduce**
Steps to reproduce the behavior
**FAQ Check**
- [ ] Have you had a look at [our new FAQ page](https://haystack.deepset.ai/overview/faq)?
**System:**
- OS:
- GPU/CPU:
- Haystack version (commit or version number):
- DocumentStore:
- Reader:
- Retriever:
|
process
|
userwarning in finetuning notebook describe the bug this is not really a bug but there is a userwarning when training the model because a list is directly converted to a tensor haystack modeling data handler dataset py userwarning creating a tensor from a list of numpy ndarrays is extremely slow please consider converting the list to a single numpy ndarray with numpy array before converting to a tensor triggered internally at torch csrc utils tensor new cpp this does not look so nice and it s also easily fixed error message haystack modeling data handler dataset py userwarning creating a tensor from a list of numpy ndarrays is extremely slow please consider converting the list to a single numpy ndarray with numpy array before converting to a tensor triggered internally at torch csrc utils tensor new cpp expected behavior the list could be converted to an np array first i don t know if it really has any performance benefits in this case but it will at least avoid the userwarning i was actually interested in the performance difference and made a little script to compare the two approaches python import torch import numpy as np import timeit def compare speed runs conversion false lst range if conversion lst np array lst times for run in range runs timeit default timer tense torch tensor lst dtype torch long timeit default timer times append print f executed runs runs converting np array if conversion else list to tensor took sum times runs ms on average if name main compare speed compare speed true this results in executed runs converting list to tensor took ms on average executed runs converting np array to tensor took ms on average which is still quite a difference my understanding is that could simply be changed to python cur tensor torch tensor np array for sample in features dtype torch long additional context add any other context about the problem here like document types preprocessing steps settings of reader etc to reproduce steps to reproduce the behavior 
faq check have you had a look at system os gpu cpu haystack version commit or version number documentstore reader retriever
| 1
|
18,316
| 24,431,375,412
|
IssuesEvent
|
2022-10-06 08:23:08
|
w3c/webauthn
|
https://api.github.com/repos/w3c/webauthn
|
closed
|
Create GitHub issue templates
|
type:process stat:pr-open
|
Create new issue templates for the WebAuthn repo:
* New Use Case or Feature
* Technical Change
* Editorial Change
* WG Administrivia
* "How do I..." < external link to fido-dev >
* Deployment Question < external link to fido-dev >
|
1.0
|
Create GitHub issue templates - Create new issue templates for the WebAuthn repo:
* New Use Case or Feature
* Technical Change
* Editorial Change
* WG Administrivia
* "How do I..." < external link to fido-dev >
* Deployment Question < external link to fido-dev >
|
process
|
create github issue templates create new issue templates for the webauthn repo new use case or feature technical change editorial change wg administrivia how do i deployment question
| 1
|
14,405
| 17,458,818,522
|
IssuesEvent
|
2021-08-06 07:29:58
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
opened
|
test(client): happy blog-env test has no assertion
|
process/candidate topic: prisma-client topic: tests tech/typescript team/client
|
This test has no assertion. [adding "expect.assertions(1)" at the beginning makes the test fail.](https://github.com/prisma/prisma/pull/6225)
https://github.com/prisma/prisma/blob/main/packages/client/src/__tests__/integration/happy/blog-env/test.ts
On first look, it looks like it's trying to test `PrismaClientInitializationError` but the code never errors.
We should check and properly test `PrismaClientInitializationError`
Context https://prisma-company.slack.com/archives/C016KUHB1R6/p1616400507047100
|
1.0
|
test(client): happy blog-env test has no assertion - This test has no assertion. [adding "expect.assertions(1)" at the beginning makes the test fail.](https://github.com/prisma/prisma/pull/6225)
https://github.com/prisma/prisma/blob/main/packages/client/src/__tests__/integration/happy/blog-env/test.ts
On first look, it looks like it's trying to test `PrismaClientInitializationError` but the code never errors.
We should check and properly test `PrismaClientInitializationError`
Context https://prisma-company.slack.com/archives/C016KUHB1R6/p1616400507047100
|
process
|
test client happy blog env test has no assertion this test has no assertion on first look it looks like it s trying to test prismaclientinitializationerror but the code never errors we should check and properly test prismaclientinitializationerror context
| 1
|
671,549
| 22,766,047,899
|
IssuesEvent
|
2022-07-08 04:33:45
|
MenheraBot/MenheraBot
|
https://api.github.com/repos/MenheraBot/MenheraBot
|
opened
|
[REFACTOR]: Rewrite roleplay battle
|
🔵 [Priority] Low ❤️ Suggestion
|
PvP and PvE were poorly made, rewrite it, and maybe refactor all roleplay system
|
1.0
|
[REFACTOR]: Rewrite roleplay battle - PvP and PvE were poorly made, rewrite it, and maybe refactor all roleplay system
|
non_process
|
rewrite roleplay battle pvp and pve were poorly made rewrite it and maybe refactor all roleplay system
| 0
|
11,885
| 14,680,943,924
|
IssuesEvent
|
2020-12-31 11:40:25
|
ewen-lbh/portfolio
|
https://api.github.com/repos/ewen-lbh/portfolio
|
opened
|
Lint work/tag names to forbid some
|
processing
|
Forbidden:
- future (see #52)
- about
- contact
- made-with
- to
- using
- thanks
- credits
- legal
- license
- legalese
- blog
- source
- goto
- go
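Assuming the lint ends up scripted in Python (the real build tooling may use another language), the check over the forbidden names above could look like:

```python
# Reserved route names that work/tag slugs must not shadow.
FORBIDDEN_NAMES = {
    "future", "about", "contact", "made-with", "to", "using", "thanks",
    "credits", "legal", "license", "legalese", "blog", "source", "goto", "go",
}

def lint_names(names):
    """Return the work/tag names that collide with a reserved route."""
    return [name for name in names if name.lower() in FORBIDDEN_NAMES]
```

The lint would fail the build whenever `lint_names` returns a non-empty list.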
|
1.0
|
Lint work/tag names to forbid some - Forbidden:
- future (see #52)
- about
- contact
- made-with
- to
- using
- thanks
- credits
- legal
- license
- legalese
- blog
- source
- goto
- go
|
process
|
lint work tag names to forbid some forbidden future see about contact made with to using thanks credits legal license legalese blog source goto go
| 1
|
20,222
| 26,812,591,752
|
IssuesEvent
|
2023-02-02 00:05:35
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
"Protect access to repositories in YAML pipelines" setting affects the usage of the Access Token
|
devops/prod doc-bug cba Pri2 devops-cicd-process/tech
|
Greetings,
It would be good to include the reference to "Protect access to repositories in YAML pipelines" setting, as it affects the scope of the Access Token.
Reference : https://developercommunity.visualstudio.com/t/error-creating-repository-in-azure-devops-tf401027/1653143.
If you are okay, I am happy to update the FAQ with the details and submit a PR.
Please let me know.
Regards,
Vudaya
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: cdd0749b-b681-8192-3849-f38e6fc7f138
* Version Independent ID: fb5170d3-2e45-3caf-be17-e012bd523660
* Content: [Understand job access tokens - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/access-tokens?view=azure-devops&tabs=yaml)
* Content Source: [docs/pipelines/process/access-tokens.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/access-tokens.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @vijayma
* Microsoft Alias: **vijayma**
|
1.0
|
"Protect access to repositories in YAML pipelines" setting affects the usage of the Access Token -
Greetings,
It would be good to include the reference to "Protect access to repositories in YAML pipelines" setting, as it affects the scope of the Access Token.
Reference : https://developercommunity.visualstudio.com/t/error-creating-repository-in-azure-devops-tf401027/1653143.
If you are okay, I am happy to update the FAQ with the details and submit a PR.
Please let me know.
Regards,
Vudaya
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: cdd0749b-b681-8192-3849-f38e6fc7f138
* Version Independent ID: fb5170d3-2e45-3caf-be17-e012bd523660
* Content: [Understand job access tokens - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/access-tokens?view=azure-devops&tabs=yaml)
* Content Source: [docs/pipelines/process/access-tokens.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/access-tokens.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @vijayma
* Microsoft Alias: **vijayma**
|
process
|
protect access to repositories in yaml pipelines setting affects the usage of the access token greetings it would be good to include the reference to protect access to repositories in yaml pipelines setting as it affects the scope of the access token reference if you are okay i am happy to update the faq with the details and submit a pr please let me know regards vudaya document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login vijayma microsoft alias vijayma
| 1
|
12,466
| 3,615,780,237
|
IssuesEvent
|
2016-02-07 00:23:40
|
frishkorn/timeClock
|
https://api.github.com/repos/frishkorn/timeClock
|
opened
|
Trim Version History
|
documentation
|
Trim version history in timeClock.ino. Remove from start to release 1.0. It's grown large and if someone needs to find that information, they can look at the history in GitHub.
|
1.0
|
Trim Version History - Trim version history in timeClock.ino. Remove from start to release 1.0. It's grown large and if someone needs to find that information, they can look at the history in GitHub.
|
non_process
|
trim version history trim version history in timeclock ino remove from start to release it s grown large and if someone needs to find that information they can look at the history in github
| 0
|
253,645
| 27,300,744,399
|
IssuesEvent
|
2023-02-24 01:34:19
|
panasalap/linux-4.19.72_1
|
https://api.github.com/repos/panasalap/linux-4.19.72_1
|
closed
|
CVE-2019-19065 (Medium) detected in linux-yoctov5.4.51 - autoclosed
|
security vulnerability
|
## CVE-2019-19065 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.19.72/commit/c5a08fe8179013aad614165d792bc5b436591df6">c5a08fe8179013aad614165d792bc5b436591df6</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/infiniband/hw/hfi1/sdma.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/infiniband/hw/hfi1/sdma.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
** DISPUTED ** A memory leak in the sdma_init() function in drivers/infiniband/hw/hfi1/sdma.c in the Linux kernel before 5.3.9 allows attackers to cause a denial of service (memory consumption) by triggering rhashtable_init() failures, aka CID-34b3be18a04e. NOTE: This has been disputed as not a vulnerability because "rhashtable_init() can only fail if it is passed invalid values in the second parameter's struct, but when invoked from sdma_init() that is a pointer to a static const struct, so an attacker could only trigger failure if they could corrupt kernel memory (in which case a small memory leak is not a significant problem)."
<p>Publish Date: 2019-11-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-19065>CVE-2019-19065</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19065">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19065</a></p>
<p>Release Date: 2020-08-24</p>
<p>Fix Resolution: v5.4-rc3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-19065 (Medium) detected in linux-yoctov5.4.51 - autoclosed - ## CVE-2019-19065 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.19.72/commit/c5a08fe8179013aad614165d792bc5b436591df6">c5a08fe8179013aad614165d792bc5b436591df6</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/infiniband/hw/hfi1/sdma.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/infiniband/hw/hfi1/sdma.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
** DISPUTED ** A memory leak in the sdma_init() function in drivers/infiniband/hw/hfi1/sdma.c in the Linux kernel before 5.3.9 allows attackers to cause a denial of service (memory consumption) by triggering rhashtable_init() failures, aka CID-34b3be18a04e. NOTE: This has been disputed as not a vulnerability because "rhashtable_init() can only fail if it is passed invalid values in the second parameter's struct, but when invoked from sdma_init() that is a pointer to a static const struct, so an attacker could only trigger failure if they could corrupt kernel memory (in which case a small memory leak is not a significant problem)."
<p>Publish Date: 2019-11-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-19065>CVE-2019-19065</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19065">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19065</a></p>
<p>Release Date: 2020-08-24</p>
<p>Fix Resolution: v5.4-rc3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in linux autoclosed cve medium severity vulnerability vulnerable library linux yocto linux embedded kernel library home page a href found in head commit a href found in base branch master vulnerable source files drivers infiniband hw sdma c drivers infiniband hw sdma c vulnerability details disputed a memory leak in the sdma init function in drivers infiniband hw sdma c in the linux kernel before allows attackers to cause a denial of service memory consumption by triggering rhashtable init failures aka cid note this has been disputed as not a vulnerability because rhashtable init can only fail if it is passed invalid values in the second parameter s struct but when invoked from sdma init that is a pointer to a static const struct so an attacker could only trigger failure if they could corrupt kernel memory in which case a small memory leak is not a significant problem publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
192,187
| 14,610,462,983
|
IssuesEvent
|
2020-12-22 00:26:38
|
github-vet/rangeloop-pointer-findings
|
https://api.github.com/repos/github-vet/rangeloop-pointer-findings
|
closed
|
amoghe/rocketship: commander/modules/host/users_test.go; 3 LoC
|
fresh test tiny
|
Found a possible issue in [amoghe/rocketship](https://www.github.com/amoghe/rocketship) at [commander/modules/host/users_test.go](https://github.com/amoghe/rocketship/blob/6759108114ee7d79b565baab58d6e124b864c5da/commander/modules/host/users_test.go#L233-L235)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call which takes a reference to u at line 234 may start a goroutine
[Click here to see the code in its original context.](https://github.com/amoghe/rocketship/blob/6759108114ee7d79b565baab58d6e124b864c5da/commander/modules/host/users_test.go#L233-L235)
<details>
<summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary>
```go
for _, u := range users {
c.Assert(ts.db.Create(&u).Error, IsNil)
}
```
</details>
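The flagged pattern, a reference to the loop variable escaping its iteration, has a direct analogue in Python closures; as an illustration (not part of the original report):

```python
# Classic loop-variable capture: each closure sees the *variable*, not the
# value it held at that iteration -- analogous to taking &u inside a Go
# range loop (pre-Go 1.22 semantics).
def make_callbacks_buggy(items):
    return [lambda: i for i in items]

# Fix: snapshot the value per iteration, the same idea as the `u := u`
# shadow-copy idiom in Go.
def make_callbacks_fixed(items):
    return [lambda i=i: i for i in items]

buggy = [f() for f in make_callbacks_buggy([1, 2, 3])]  # every callback returns 3
fixed = [f() for f in make_callbacks_fixed([1, 2, 3])]  # returns 1, 2, 3
```

In the Go snippet above, the equivalent fix is to copy `u` inside the loop body before taking its address.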
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 6759108114ee7d79b565baab58d6e124b864c5da
|
1.0
|
amoghe/rocketship: commander/modules/host/users_test.go; 3 LoC -
Found a possible issue in [amoghe/rocketship](https://www.github.com/amoghe/rocketship) at [commander/modules/host/users_test.go](https://github.com/amoghe/rocketship/blob/6759108114ee7d79b565baab58d6e124b864c5da/commander/modules/host/users_test.go#L233-L235)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call which takes a reference to u at line 234 may start a goroutine
[Click here to see the code in its original context.](https://github.com/amoghe/rocketship/blob/6759108114ee7d79b565baab58d6e124b864c5da/commander/modules/host/users_test.go#L233-L235)
<details>
<summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary>
```go
for _, u := range users {
c.Assert(ts.db.Create(&u).Error, IsNil)
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 6759108114ee7d79b565baab58d6e124b864c5da
|
non_process
|
amoghe rocketship commander modules host users test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call which takes a reference to u at line may start a goroutine click here to show the line s of go which triggered the analyzer go for u range users c assert ts db create u error isnil leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
| 0
|
233,307
| 17,860,036,705
|
IssuesEvent
|
2021-09-05 19:53:44
|
celery/celery
|
https://api.github.com/repos/celery/celery
|
opened
|
Mention that worker kills child processes on systems supporting `prctl.PDEATHSIG`
|
Issue Type: Bug Report Category: Documentation
|
<!--
Please fill this template entirely and do not erase parts of it.
We reserve the right to close without a response
bug reports which are incomplete.
-->
# Checklist
<!--
To check an item on the list replace [ ] with [x].
-->
- [x] I have checked the [issues list](https://github.com/celery/celery/issues?utf8=%E2%9C%93&q=is%3Aissue+label%3A%22Category%3A+Documentation%22+)
for similar or identical bug reports.
- [x] I have checked the [pull requests list](https://github.com/celery/celery/pulls?q=is%3Apr+label%3A%22Category%3A+Documentation%22)
for existing proposed fixes.
- [x] I have checked the [commit log](https://github.com/celery/celery/commits/master)
to find out if the bug was already fixed in the master branch.
- [x] I have included all related issues and possible duplicate issues in this issue
(If there are none, check this box anyway).
## Related Issues and Possible Duplicates
#### Related Issues
- None
#### Possible Duplicates
- None
# Description
Documentation https://docs.celeryproject.org/en/master/userguide/workers.html#stopping-the-worker is outdated due to the merged PR #6942.
# Suggestions
The documentation should mention that, on some systems, the Celery worker's main process is able to kill all child processes using the SIGKILL signal.
|
1.0
|
Mention that worker kills child processes on systems supporting `prctl.PDEATHSIG` - <!--
Please fill this template entirely and do not erase parts of it.
We reserve the right to close without a response
bug reports which are incomplete.
-->
# Checklist
<!--
To check an item on the list replace [ ] with [x].
-->
- [x] I have checked the [issues list](https://github.com/celery/celery/issues?utf8=%E2%9C%93&q=is%3Aissue+label%3A%22Category%3A+Documentation%22+)
for similar or identical bug reports.
- [x] I have checked the [pull requests list](https://github.com/celery/celery/pulls?q=is%3Apr+label%3A%22Category%3A+Documentation%22)
for existing proposed fixes.
- [x] I have checked the [commit log](https://github.com/celery/celery/commits/master)
to find out if the bug was already fixed in the master branch.
- [x] I have included all related issues and possible duplicate issues in this issue
(If there are none, check this box anyway).
## Related Issues and Possible Duplicates
#### Related Issues
- None
#### Possible Duplicates
- None
# Description
Documentation https://docs.celeryproject.org/en/master/userguide/workers.html#stopping-the-worker is outdated due to the merged PR #6942.
# Suggestions
The documentation should mention that, on some systems, the Celery worker's main process is able to kill all child processes using the SIGKILL signal.
|
non_process
|
mention that worker kills child processes on systems supporting prctl pdeathsig please fill this template entirely and do not erase parts of it we reserve the right to close without a response bug reports which are incomplete checklist to check an item on the list replace with i have checked the for similar or identical bug reports i have checked the for existing proposed fixes i have checked the to find out if the bug was already fixed in the master branch i have included all related issues and possible duplicate issues in this issue if there are none check this box anyway related issues and possible duplicates related issues none possible duplicates none description documentation is outdated due merged pr suggestions documentation should mention that celery worker main process in some systems should be able to kill all child processes using sigkill signal
| 0
|
204,902
| 15,954,405,649
|
IssuesEvent
|
2021-04-15 13:33:23
|
srijan-sivakumar/redant
|
https://api.github.com/repos/srijan-sivakumar/redant
|
closed
|
Standardizing the Code
|
documentation enhancement
|
* Ops:
1. docstring doesn't match the actual return:
Ex: peer_ops
```python
def peer_status(self, node: str):
"""
Checks the status of the peers
Args:
node (str): Node on which command has to be executed.
Returns:
tuple: Tuple containing three elements (ret, out, err).
The first element 'ret' is of type 'int' and
is the return value
of command execution.
The second element 'out' is of type 'str'
and is the stdout value
of the command execution.
The third element 'err' is of type 'str'
and is the stderr value
of the command execution.
"""
cmd = 'gluster --xml peer status'
self.logger.info(f"Running {cmd} on node {node}")
ret = self.execute_command(node, cmd)
if ret['error_code'] != 0:
self.logger.error(ret['msg']['opErrstr'])
raise Exception(ret['msg']['opErrstr'])
self.logger.info(f"Successfully ran {cmd} on {node}")
return ret
```
Here we are returning just `ret`, which is a dictionary, but the docstring states it is a tuple.
2. In tests we are not using the ret value:
Test: peer_probe_detach
```python
def run_test(self):
"""
In this testcase:
1) glusterd service is started
2) peer probe of a server
3) list the storage pool
4) peer detach
5) glusterd is stopped
"""
try:
server1 = self.server_list[0]
server2 = self.server_list[1]
self.redant.start_glusterd(server1)
self.redant.peer_probe(server2, server1)
self.redant.pool_list(server1)
self.redant.peer_detach(server1, server2)
self.redant.stop_glusterd(server1)
except Exception as e:
self.TEST_RES = False
print(f"Test is failed:{e}")
```
If we are not using the ret value, then returning it might not be needed.
* Class names
In some classes snake_case is used, while in others CamelCase. This is the initial stage, so keeping the name format consistent will be better.
* Certain ops do not return `ret` or anything at all.
```python
def volume_mount(self, server: str, volname: str,
path: str, node: str=None):
"""
Mounts the gluster volume to the client's filesystem.
Args:
node (str): The client node in the cluster where volume
mount is to be run
server (str): Hostname or IP address
volname (str): Name of volume to be mounted
path (str): The path of the mount directory(mount point)
"""
cmd = f"mount -t glusterfs {server}:/{volname} {path}"
self.logger.info(f"Running {cmd} on node {node}")
ret = self.execute_command(node=node, cmd=cmd)
if int(ret["error_code"]) != 0:
self.logger.error(ret["error_msg"])
raise Exception(ret["error_msg"])
self.logger.info(f"Successfully ran {cmd} on {node}")
def volume_unmount(self, path: str, node: str=None):
"""
Unmounts the gluster volume .
Args:
node (str): The client node in the cluster where volume
unmount is to be run
server (str): Hostname or IP address
volname (str): Name of volume to be mounted
path (str): The path of the mount directory(mount point)
"""
cmd = f"umount {path}"
self.logger.info(f"Running {cmd} on node {node}")
ret = self.execute_command(node=node, cmd=cmd)
if int(ret["error_code"]) != 0:
self.logger.error(ret["error_msg"])
raise Exception(ret["error_msg"])
self.logger.info(f"Successfully ran {cmd} on {node}")
```
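Tying these points together, a minimal sketch of an ops method whose docstring matches the actual dict return (the `execute_command` stub below is hypothetical, standing in for the framework's real executor):

```python
def execute_command(node, cmd):
    # Hypothetical stub standing in for the framework's real executor.
    return {"error_code": 0, "msg": f"ran {cmd!r} on {node}"}

def peer_status(node: str) -> dict:
    """
    Checks the status of the peers.

    Args:
        node (str): Node on which the command has to be executed.

    Returns:
        dict: Result of the command execution, containing at least
        'error_code' and 'msg' -- matching what the method actually
        returns, instead of the (ret, out, err) tuple described in
        the original docstring.
    """
    cmd = 'gluster --xml peer status'
    ret = execute_command(node, cmd)
    if ret['error_code'] != 0:
        raise Exception(ret['msg'])
    return ret
```

With the docstring and the return value agreeing, callers that do use `ret` know to index it as a dict rather than unpack a tuple.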
|
1.0
|
Standardizing the Code - * Ops:
1. docstring doesn't match the actual return:
Ex: peer_ops
```python
def peer_status(self, node: str):
"""
Checks the status of the peers
Args:
node (str): Node on which command has to be executed.
Returns:
tuple: Tuple containing three elements (ret, out, err).
The first element 'ret' is of type 'int' and
is the return value
of command execution.
The second element 'out' is of type 'str'
and is the stdout value
of the command execution.
The third element 'err' is of type 'str'
and is the stderr value
of the command execution.
"""
cmd = 'gluster --xml peer status'
self.logger.info(f"Running {cmd} on node {node}")
ret = self.execute_command(node, cmd)
if ret['error_code'] != 0:
self.logger.error(ret['msg']['opErrstr'])
raise Exception(ret['msg']['opErrstr'])
self.logger.info(f"Successfully ran {cmd} on {node}")
return ret
```
Here we are returning just `ret`, which is a dictionary, but the docstring states it is a tuple.
2. In tests we are not using the ret value:
Test: peer_probe_detach
```python
def run_test(self):
"""
In this testcase:
1) glusterd service is started
2) peer probe of a server
3) list the storage pool
4) peer detach
5) glusterd is stopped
"""
try:
server1 = self.server_list[0]
server2 = self.server_list[1]
self.redant.start_glusterd(server1)
self.redant.peer_probe(server2, server1)
self.redant.pool_list(server1)
self.redant.peer_detach(server1, server2)
self.redant.stop_glusterd(server1)
except Exception as e:
self.TEST_RES = False
print(f"Test is failed:{e}")
```
If we are not using the ret value, then returning it might not be needed.
* Class names
In some classes snake_case is used, while in others CamelCase. This is the initial stage, so keeping the name format consistent will be better.
* Certain ops do not return `ret` or anything at all.
```python
def volume_mount(self, server: str, volname: str,
path: str, node: str=None):
"""
Mounts the gluster volume to the client's filesystem.
Args:
node (str): The client node in the cluster where volume
mount is to be run
server (str): Hostname or IP address
volname (str): Name of volume to be mounted
path (str): The path of the mount directory(mount point)
"""
cmd = f"mount -t glusterfs {server}:/{volname} {path}"
self.logger.info(f"Running {cmd} on node {node}")
ret = self.execute_command(node=node, cmd=cmd)
if int(ret["error_code"]) != 0:
self.logger.error(ret["error_msg"])
raise Exception(ret["error_msg"])
self.logger.info(f"Successfully ran {cmd} on {node}")
def volume_unmount(self, path: str, node: str=None):
"""
Unmounts the gluster volume .
Args:
node (str): The client node in the cluster where volume
unmount is to be run
server (str): Hostname or IP address
volname (str): Name of volume to be mounted
path (str): The path of the mount directory(mount point)
"""
cmd = f"umount {path}"
self.logger.info(f"Running {cmd} on node {node}")
ret = self.execute_command(node=node, cmd=cmd)
if int(ret["error_code"]) != 0:
self.logger.error(ret["error_msg"])
raise Exception(ret["error_msg"])
self.logger.info(f"Successfully ran {cmd} on {node}")
```
|
non_process
|
standardizing the code ops docstring doesn t match the actual return ex peer ops js def peer status self node str checks the status of the peers args node str node on which command has to be executed returns tuple tuple containing three elements ret out err the first element ret is of type int and is the return value of command execution the second element out is of type str and is the stdout value of the command execution the third element err is of type str and is the stderr value of the command execution cmd gluster xml peer status self logger info f running cmd on node node ret self execute command node cmd if ret self logger error ret raise exception ret self logger info f successfully ran cmd on node return ret here we are returning just ret which is a dictionary but the docstring is stating it as tuple in tests we are not using the ret value test peer probe detach js def run test self in this testcase glusterd service is started peer probe of a server list the storage pool peer detach glusterd is stopped try self server list self server list self redant start glusterd self redant peer probe self redant pool list self redant peer detach self redant stop glusterd except exception as e self test res false print f test is failed e if we are not using the ret value then returning that might not be needed class names in some classes snakecase is used while in some camelcase this is the initial stage so keeping the name formats standard will be better certain ops are not returning ret or anything js def volume mount self server str volname str path str node str none mounts the gluster volume to the client s filesystem args node str the client node in the cluster where volume mount is to be run server str hostname or ip address volname str name of volume to be mounted path str the path of the mount directory mount point cmd f mount t glusterfs server volname path self logger info f running cmd on node node ret self execute command node node cmd cmd if int ret self 
logger error ret raise exception ret self logger info f successfully ran cmd on node def volume unmount self path str node str none unmounts the gluster volume args node str the client node in the cluster where volume unmount is to be run server str hostname or ip address volname str name of volume to be mounted path str the path of the mount directory mount point cmd f umount path self logger info f running cmd on node node ret self execute command node node cmd cmd if int ret self logger error ret raise exception ret self logger info f successfully ran cmd on node
| 0
|
62,363
| 12,214,106,120
|
IssuesEvent
|
2020-05-01 09:00:45
|
vapor/docs
|
https://api.github.com/repos/vapor/docs
|
closed
|
Nginx deploy, serving files: try_files should be in location block
|
invalid code
|
I couldn't get nginx to serve static files. Putting try_files in a location block made it work.
```nginx
server {
...
# Serve all public/static files via nginx and then fallback to Vapor for the rest
location / {
try_files $uri @proxy;
}
location @proxy {
...
}
}
```
|
1.0
|
Nginx deploy, serving files: try_files should be in location block - I couldn't get nginx to serve static files. Putting try_files in a location block made it work.
```nginx
server {
...
# Serve all public/static files via nginx and then fallback to Vapor for the rest
location / {
try_files $uri @proxy;
}
location @proxy {
...
}
}
```
|
non_process
|
nginx deploy serving files try files should be in location block i couldn t get nginx to serve static files putting try files in a location block made it work sh server serve all public static files via nginx and then fallback to vapor for the rest location try files uri proxy location proxy
| 0
|
316,783
| 9,657,322,406
|
IssuesEvent
|
2019-05-20 08:17:47
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.google.com - see bug description
|
browser-fenix engine-gecko priority-critical
|
<!-- @browser: Firefox Mobile 68.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 -->
<!-- @reported_with: -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://www.google.com/?gws_rd=ssl
**Browser / Version**: Firefox Mobile 68.0
**Operating System**: Android
**Tested Another Browser**: Yes
**Problem type**: Something else
**Description**: Google showing old mobile site
**Steps to Reproduce**:
Visit Google.com
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.google.com - see bug description - <!-- @browser: Firefox Mobile 68.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 -->
<!-- @reported_with: -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://www.google.com/?gws_rd=ssl
**Browser / Version**: Firefox Mobile 68.0
**Operating System**: Android
**Tested Another Browser**: Yes
**Problem type**: Something else
**Description**: Google showing old mobile site
**Steps to Reproduce**:
Visit Google.com
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
see bug description url browser version firefox mobile operating system android tested another browser yes problem type something else description google showing old mobile site steps to reproduce visit google com browser configuration none from with ❤️
| 0
|
22,624
| 31,847,425,408
|
IssuesEvent
|
2023-09-14 21:09:10
|
open-telemetry/opentelemetry-collector-contrib
|
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
|
closed
|
Transform processor - expected string but got pcommon.Map
|
bug processor/transform needs triage
|
### Component(s)
processor/transform
### What happened?
## Description
Fluent Bit is sending logs to the otelcol-contrib otlphttp receiver; otelcol-contrib extracts the kubernetes.namespace_name value using the transform processor and sets the loki.tenant hint so that the loki exporter sends logs to the loki endpoint.
## Steps to Reproduce
## Expected Result
It should set loki.tenant = mytenant.
## Actual Result
warn ottl@v0.82.0/parser.go:211 failed to execute statement {"kind": "processor", "name": "transform", "pipeline": "logs", "error": "expected string but got pcommon.Map",
### Collector version
0.82
### Environment information
## Environment
OS: AWS EKS
### OpenTelemetry Collector configuration
```yaml
Name: otel-hub-statefulset
Namespace: otel-hub
Labels: app.kubernetes.io/instance=otel-hub
app.kubernetes.io/managed-by=Helm
app.kubernetes.io/name=otel-hub
app.kubernetes.io/version=0.82.0
helm.sh/chart=otel-hub-0.82.0_da6ca26aca3a.3
helm.toolkit.fluxcd.io/name=otel-hub
helm.toolkit.fluxcd.io/namespace=otel-hub
Annotations: meta.helm.sh/release-name: otel-hub
meta.helm.sh/release-namespace: otel-hub
Data
====
relay:
----
exporters:
logging:
verbosity: detailed
loki:
endpoint: http://XXXX/loki/api/v1/push
retry_on_failure:
enabled: true
initial_interval: 1s
max_elapsed_time: 120s
max_interval: 300s
sending_queue:
storage: file_storage/psq
tls:
insecure: false
insecure_skip_verify: true
splunk_hec/logs:
endpoint: https://splunksandbox.bpweb.bp.com:8088/services/collector
retry_on_failure:
enabled: true
initial_interval: 10s
max_elapsed_time: 60s
max_interval: 60s
sending_queue:
storage: file_storage/psq
timeout: 30s
tls:
insecure: false
insecure_skip_verify: true
token: 95c4beb5-c1b0-4735-bcd3-310cea0c9fa8
extensions:
health_check:
endpoint: 0.0.0.0:13133
processors:
memory_limiter:
check_interval: 5s
limit_percentage: 80
spike_limit_percentage: 25
transform:
error_mode: ignore
log_statements:
- context: log
statements:
- set(attributes["cache"], ParseJSON(body)) where IsMatch(body[""], "^\\{")
- set(attributes["tenant"], attributes["cache"]["kubernetes"]["namespace_name"])
- set(attributes["loki.tenant"], "tenant")
- delete_key(attributes, "cache")
receivers:
otlp:
protocols:
http:
endpoint: 0.0.0.0:4318
include_metadata: true
service:
extensions:
- health_check
pipelines:
logs:
exporters:
- logging
- loki
processors:
- transform
receivers:
- otlp
```
### Log output
```shell
ObservedTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2023-09-14 15:54:35.813509554 +0000 UTC
SeverityText:
SeverityNumber: Unspecified(0)
Body: Map({"@timestamp":"2023-09-14T15:54:35.813509554+00:00","cluster":"XXX","kubernetes":{"annotations":{"XXX-linux"},"container_hash":"XX","container_image":"XX","container_name":"XX","docker_id":"XXX","host":"XX","labels":{"app":"XX","controller-revision-hash":"XX","pod-template-generation":"XX"},"namespace_name":"mynamespace","pod_id":"XX","pod_name":"XX"},"logtag":"X","message":"time=\"2XX\" level=debug msg=\"XX\" Duration=\"XX\" Method=GET RequestURL=/readiness Route=ReadinessProbe StatusCode=200 logLayer=rest_frontend requestID=XX requestSource=REST workflow=\"trident_rest=logger\"","stream":"stderr"})
Attributes:
-> loki.tenant: Str(loki_tenant)
Trace ID:
Span ID:
Flags: 0
{"kind": "exporter", "data_type": "logs", "name": "logging"}
2023-09-14T15:54:37.887Z error exporterhelper/queued_retry.go:391 Exporting failed. The error is not retryable. Dropping data. {"kind": "exporter", "data_type": "logs", "name": "loki", "error":
"Permanent error: HTTP 401 \"Unauthorized\": no org id", "dropped_items": 4}
```
### Additional context
Below telemetry is being exported from Fluent Bit to Otelcol-contrib and then sending to loki endpoint.
ObservedTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2023-09-14 15:54:35.813509554 +0000 UTC
SeverityText:
SeverityNumber: Unspecified(0)
Body: Map({"@timestamp":"2023-09-14T15:54:35.813509554+00:00","cluster":"XXX","kubernetes":{"annotations":{"XXX-linux"},"container_hash":"XX","container_image":"XX","container_name":"XX","docker_id":"XXX","host":"XX","labels":{"app":"XX","controller-revision-hash":"XX","pod-template-generation":"XX"},"namespace_name":"mynamespace","pod_id":"XX","pod_name":"XX"},"logtag":"X","message":"time=\"2XX\" level=debug msg=\"XX\" Duration=\"XX\" Method=GET RequestURL=/readiness Route=ReadinessProbe StatusCode=200 logLayer=rest_frontend requestID=XX requestSource=REST workflow=\"trident_rest=logger\"","stream":"stderr"})
Attributes:
-> loki.tenant: Str(loki_tenant)
Trace ID:
Span ID:
Flags: 0
{"kind": "exporter", "data_type": "logs", "name": "logging"}
2023-09-14T15:54:37.887Z error exporterhelper/queued_retry.go:391 Exporting failed. The error is not retryable. Dropping data. {"kind": "exporter", "data_type": "logs", "name": "loki", "error":
"Permanent error: HTTP 401 \"Unauthorized\": no org id", "dropped_items": 4}
1) It works if the log body is a string: the transform processor is able to extract namespace_name and set the org id.
2) It fails with the below error if the log body is a map.
warn ottl@v0.82.0/parser.go:211 failed to execute statement {"kind": "processor", "name": "transform", "pipeline": "logs", "error": "expected string but got pcommon.Map",
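One way to make the statements tolerate both body shapes is to branch on the body type, a sketch assuming the OTTL `IsMap` and `IsString` converters available in recent collector releases (not a verified configuration). Note the original config sets the literal string "tenant" as loki.tenant; here the extracted tenant attribute is forwarded instead, which appears to be the intent:

```yaml
transform:
  error_mode: ignore
  log_statements:
    - context: log
      statements:
        # Structured body delivered by Fluent Bit: read the map directly
        - set(attributes["tenant"], body["kubernetes"]["namespace_name"]) where IsMap(body)
        # String body: parse the JSON first, then read from the cache
        - set(attributes["cache"], ParseJSON(body)) where IsString(body) and IsMatch(body, "^\\{")
        - set(attributes["tenant"], attributes["cache"]["kubernetes"]["namespace_name"]) where attributes["cache"] != nil
        - set(attributes["loki.tenant"], attributes["tenant"])
        - delete_key(attributes, "cache")
```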
|
1.0
|
Transform processor - expected string but got pcommon.Map - ### Component(s)
processor/transform
### What happened?
## Description
Fluent Bit is sending logs to the otelcol-contrib otlphttp receiver; otelcol-contrib extracts the kubernetes.namespace_name value using the transform processor and sets the loki.tenant hint so that the loki exporter sends logs to the loki endpoint.
## Steps to Reproduce
## Expected Result
It should set loki.tenant = mytenant.
## Actual Result
warn ottl@v0.82.0/parser.go:211 failed to execute statement {"kind": "processor", "name": "transform", "pipeline": "logs", "error": "expected string but got pcommon.Map",
### Collector version
0.82
### Environment information
## Environment
OS: AWS EKS
### OpenTelemetry Collector configuration
```yaml
Name: otel-hub-statefulset
Namespace: otel-hub
Labels: app.kubernetes.io/instance=otel-hub
app.kubernetes.io/managed-by=Helm
app.kubernetes.io/name=otel-hub
app.kubernetes.io/version=0.82.0
helm.sh/chart=otel-hub-0.82.0_da6ca26aca3a.3
helm.toolkit.fluxcd.io/name=otel-hub
helm.toolkit.fluxcd.io/namespace=otel-hub
Annotations: meta.helm.sh/release-name: otel-hub
meta.helm.sh/release-namespace: otel-hub
Data
====
relay:
----
exporters:
logging:
verbosity: detailed
loki:
endpoint: http://XXXX/loki/api/v1/push
retry_on_failure:
enabled: true
initial_interval: 1s
max_elapsed_time: 120s
max_interval: 300s
sending_queue:
storage: file_storage/psq
tls:
insecure: false
insecure_skip_verify: true
splunk_hec/logs:
endpoint: https://splunksandbox.bpweb.bp.com:8088/services/collector
retry_on_failure:
enabled: true
initial_interval: 10s
max_elapsed_time: 60s
max_interval: 60s
sending_queue:
storage: file_storage/psq
timeout: 30s
tls:
insecure: false
insecure_skip_verify: true
token: 95c4beb5-c1b0-4735-bcd3-310cea0c9fa8
extensions:
health_check:
endpoint: 0.0.0.0:13133
processors:
memory_limiter:
check_interval: 5s
limit_percentage: 80
spike_limit_percentage: 25
transform:
error_mode: ignore
log_statements:
- context: log
statements:
- set(attributes["cache"], ParseJSON(body)) where IsMatch(body[""], "^\\{")
- set(attributes["tenant"], attributes["cache"]["kubernetes"]["namespace_name"])
- set(attributes["loki.tenant"], "tenant")
- delete_key(attributes, "cache")
receivers:
otlp:
protocols:
http:
endpoint: 0.0.0.0:4318
include_metadata: true
service:
extensions:
- health_check
pipelines:
logs:
exporters:
- logging
- loki
processors:
- transform
receivers:
- otlp
```
### Log output
```shell
ObservedTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2023-09-14 15:54:35.813509554 +0000 UTC
SeverityText:
SeverityNumber: Unspecified(0)
Body: Map({"@timestamp":"2023-09-14T15:54:35.813509554+00:00","cluster":"XXX","kubernetes":{"annotations":{"XXX-linux"},"container_hash":"XX","container_image":"XX","container_name":"XX","docker_id":"XXX","host":"XX","labels":{"app":"XX","controller-revision-hash":"XX","pod-template-generation":"XX"},"namespace_name":"mynamespace","pod_id":"XX","pod_name":"XX"},"logtag":"X","message":"time=\"2XX\" level=debug msg=\"XX\" Duration=\"XX\" Method=GET RequestURL=/readiness Route=ReadinessProbe StatusCode=200 logLayer=rest_frontend requestID=XX requestSource=REST workflow=\"trident_rest=logger\"","stream":"stderr"})
Attributes:
-> loki.tenant: Str(loki_tenant)
Trace ID:
Span ID:
Flags: 0
{"kind": "exporter", "data_type": "logs", "name": "logging"}
2023-09-14T15:54:37.887Z error exporterhelper/queued_retry.go:391 Exporting failed. The error is not retryable. Dropping data. {"kind": "exporter", "data_type": "logs", "name": "loki", "error":
"Permanent error: HTTP 401 \"Unauthorized\": no org id", "dropped_items": 4}
```
### Additional context
Below telemetry is being exported from Fluent Bit to Otelcol-contrib and then sending to loki endpoint.
ObservedTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2023-09-14 15:54:35.813509554 +0000 UTC
SeverityText:
SeverityNumber: Unspecified(0)
Body: Map({"@timestamp":"2023-09-14T15:54:35.813509554+00:00","cluster":"XXX","kubernetes":{"annotations":{"XXX-linux"},"container_hash":"XX","container_image":"XX","container_name":"XX","docker_id":"XXX","host":"XX","labels":{"app":"XX","controller-revision-hash":"XX","pod-template-generation":"XX"},"namespace_name":"mynamespace","pod_id":"XX","pod_name":"XX"},"logtag":"X","message":"time=\"2XX\" level=debug msg=\"XX\" Duration=\"XX\" Method=GET RequestURL=/readiness Route=ReadinessProbe StatusCode=200 logLayer=rest_frontend requestID=XX requestSource=REST workflow=\"trident_rest=logger\"","stream":"stderr"})
Attributes:
-> loki.tenant: Str(loki_tenant)
Trace ID:
Span ID:
Flags: 0
{"kind": "exporter", "data_type": "logs", "name": "logging"}
2023-09-14T15:54:37.887Z error exporterhelper/queued_retry.go:391 Exporting failed. The error is not retryable. Dropping data. {"kind": "exporter", "data_type": "logs", "name": "loki", "error":
"Permanent error: HTTP 401 \"Unauthorized\": no org id", "dropped_items": 4}
1) It is working if the log body is a string. Transform processor able to extract namespace_name and set orgid.
2) It is failing the with below error if the log body is a map.
warn ottl@v0.82.0/parser.go:211 failed to execute statement {"kind": "processor", "name": "transform", "pipeline": "logs", "error": "expected string but got pcommon.Map",
|
process
|
tranform processor expected string but got pcommon map component s processor transform what happened description fluent bit is sending logs to otelcol contrib otlphttp receiver and otelcol contrib extract kubernetes namespace name value using transform processor and set loki tenant hint so that loki exporter sends logs to loki endpoint steps to reproduce expected result it should set loki tenant mytenant actual result warn ottl parser go failed to execute statement kind processor name transform pipeline logs error expected string but got pcommon map collector version environment information environment os aws eks opentelemetry collector configuration yaml name otel hub statefulset namespace otel hub labels app kubernetes io instance otel hub app kubernetes io managed by helm app kubernetes io name otel hub app kubernetes io version helm sh chart otel hub helm toolkit fluxcd io name otel hub helm toolkit fluxcd io namespace otel hub annotations meta helm sh release name otel hub meta helm sh release namespace otel hub data relay exporters logging verbosity detailed loki endpoint retry on failure enabled true initial interval max elapsed time max interval sending queue storage file storage psq tls insecure false insecure skip verify true splunk hec logs endpoint retry on failure enabled true initial interval max elapsed time max interval sending queue storage file storage psq timeout tls insecure false insecure skip verify true token extensions health check endpoint processors memory limiter check interval limit percentage spike limit percentage transform error mode ignore log statements context log statements set attributes parsejson body where ismatch body set attributes attributes set attributes tenant delete key attributes cache receivers otlp protocols http endpoint include metadata true service extensions health check pipelines logs exporters logging loki processors transform receivers otlp log output shell observedtimestamp utc timestamp utc severitytext 
severitynumber unspecified body map timestamp cluster xxx kubernetes annotations xxx linux container hash xx container image xx container name xx docker id xxx host xx labels app xx controller revision hash xx pod template generation xx namespace name mynamespace pod id xx pod name xx logtag x message time level debug msg xx duration xx method get requesturl readiness route readinessprobe statuscode loglayer rest frontend requestid xx requestsource rest workflow trident rest logger stream stderr attributes loki tenant str loki tenant trace id span id flags kind exporter data type logs name logging error exporterhelper queued retry go exporting failed the error is not retryable dropping data kind exporter data type logs name loki error permanent error http unauthorized no org id dropped items additional context below telemetry is being exported from fluent bit to otelcol contrib and then sending to loki endpoint observedtimestamp utc timestamp utc severitytext severitynumber unspecified body map timestamp cluster xxx kubernetes annotations xxx linux container hash xx container image xx container name xx docker id xxx host xx labels app xx controller revision hash xx pod template generation xx namespace name mynamespace pod id xx pod name xx logtag x message time level debug msg xx duration xx method get requesturl readiness route readinessprobe statuscode loglayer rest frontend requestid xx requestsource rest workflow trident rest logger stream stderr attributes loki tenant str loki tenant trace id span id flags kind exporter data type logs name logging error exporterhelper queued retry go exporting failed the error is not retryable dropping data kind exporter data type logs name loki error permanent error http unauthorized no org id dropped items it is working if the log body is a string transform processor able to extract namespace name and set orgid it is failing the with below error if the log body is a map warn ottl parser go failed to execute statement kind 
processor name transform pipeline logs error expected string but got pcommon map
| 1
|
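The row above reports that the transform processor's `ParseJSON(body)` statement fails with `expected string but got pcommon.Map` when Fluent Bit delivers the log body as a map rather than a JSON string. A minimal sketch of a version-dependent workaround is below: index the map body directly and keep the `ParseJSON` path only for string bodies. This is illustrative, not the exact fix from any release — the `body["kubernetes"]["namespace_name"]` path is taken from the record shown, and the `IsMap`/`IsString` converters are assumed to be available in the collector version in use (they were added to OTTL incrementally, so check the contrib release notes for your version).

```yaml
processors:
  transform:
    error_mode: ignore
    log_statements:
      - context: log
        statements:
          # Map-typed body (as Fluent Bit sends it here): ParseJSON(body) would
          # fail with "expected string but got pcommon.Map", so index directly.
          - set(attributes["loki.tenant"], body["kubernetes"]["namespace_name"]) where IsMap(body)
          # String-typed body: keep the original parse-then-extract path.
          - merge_maps(cache, ParseJSON(body), "upsert") where IsString(body)
          - set(attributes["loki.tenant"], cache["kubernetes"]["namespace_name"]) where IsString(body)
```

With the `loki.tenant` resource/log attribute set on both paths, the Loki exporter can forward the tenant hint and the `HTTP 401 "no org id"` drop reported above should no longer occur for map-bodied records.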
248,835
| 7,936,722,239
|
IssuesEvent
|
2018-07-09 10:20:39
|
telstra/open-kilda
|
https://api.github.com/repos/telstra/open-kilda
|
closed
|
HTTP Status 500 from NB on null latency in Neo4j
|
bug priority/2-high
|
```
<!doctype html><html lang="en"><head><title>HTTP Status 500 – Internal Server Error</title><style type="text/css">h1 {font-family:Tahoma,Arial,sans-serif;color:white;background-color:#525D76;font-size:22px;} h2 {font-family:Tahoma,Arial,sans-serif;color:white;background-color:#525D76;font-size:16px;} h3 {font-family:Tahoma,Arial,sans-serif;color:white;background-color:#525D76;font-size:14px;} body {font-family:Tahoma,Arial,sans-serif;color:black;background-color:white;} b {font-family:Tahoma,Arial,sans-serif;color:white;background-color:#525D76;} p {font-family:Tahoma,Arial,sans-serif;background:white;color:black;font-size:12px;} a {color:black;} a.name {color:black;} .line {height:1px;background-color:#525D76;border:none;}</style></head><body><h1>HTTP Status 500 – Internal Server Error</h1><hr class="line" /><p><b>Type</b> Exception Report</p><p><b>Message</b> Request processing failed; nested exception is org.springframework.web.client.RestClientException: Could not extract response: no suitable HttpMessageConverter found for response type [class [Lorg.openkilda.messaging.info.event.IslInfoData;] and content type [text/html;charset=utf-8]</p><p><b>Description</b> The server encountered an unexpected condition that prevented it from fulfilling the request.</p><p><b>Exception</b></p><pre>org.springframework.web.util.NestedServletException: Request processing failed; nested exception is org.springframework.web.client.RestClientException: Could not extract response: no suitable HttpMessageConverter found for response type [class [Lorg.openkilda.messaging.info.event.IslInfoData;] and content type [text/html;charset=utf-8]
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:982)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
javax.servlet.http.HttpServlet.service(HttpServlet.java:635)
org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
javax.servlet.http.HttpServlet.service(HttpServlet.java:742)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
org.openkilda.northbound.utils.RequestCorrelationFilter.doFilterInternal(RequestCorrelationFilter.java:57)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:215)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:197)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
</pre><p><b>Root Cause</b></p><pre>org.springframework.web.client.RestClientException: Could not extract response: no suitable HttpMessageConverter found for response type [class [Lorg.openkilda.messaging.info.event.IslInfoData;] and content type [text/html;charset=utf-8]
org.springframework.web.client.HttpMessageConverterExtractor.extractData(HttpMessageConverterExtractor.java:110)
org.springframework.web.client.RestTemplate$ResponseEntityResponseExtractor.extractData(RestTemplate.java:917)
org.springframework.web.client.RestTemplate$ResponseEntityResponseExtractor.extractData(RestTemplate.java:901)
org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:655)
org.springframework.web.client.RestTemplate.execute(RestTemplate.java:613)
org.springframework.web.client.RestTemplate.exchange(RestTemplate.java:531)
org.openkilda.northbound.service.impl.LinkServiceImpl.getLinks(LinkServiceImpl.java:83)
org.openkilda.northbound.controller.LinkController.getLinks(LinkController.java:55)
sun.reflect.GeneratedMethodAccessor148.invoke(Unknown Source)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)
org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:827)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:738)
org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)
org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
javax.servlet.http.HttpServlet.service(HttpServlet.java:635)
org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
javax.servlet.http.HttpServlet.service(HttpServlet.java:742)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
org.openkilda.northbound.utils.RequestCorrelationFilter.doFilterInternal(RequestCorrelationFilter.java:57)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:215)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:197)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
</pre><p><b>Note</b> The full stack trace of the root cause is available in the server logs.</p><hr class="line" /><h3>Apache Tomcat/8.5.16</h3></body></html>
```
|
1.0
|
HTTP Status 500 from NB on null latency in Neo4j - ```
<!doctype html><html lang="en"><head><title>HTTP Status 500 – Internal Server Error</title><style type="text/css">h1 {font-family:Tahoma,Arial,sans-serif;color:white;background-color:#525D76;font-size:22px;} h2 {font-family:Tahoma,Arial,sans-serif;color:white;background-color:#525D76;font-size:16px;} h3 {font-family:Tahoma,Arial,sans-serif;color:white;background-color:#525D76;font-size:14px;} body {font-family:Tahoma,Arial,sans-serif;color:black;background-color:white;} b {font-family:Tahoma,Arial,sans-serif;color:white;background-color:#525D76;} p {font-family:Tahoma,Arial,sans-serif;background:white;color:black;font-size:12px;} a {color:black;} a.name {color:black;} .line {height:1px;background-color:#525D76;border:none;}</style></head><body><h1>HTTP Status 500 – Internal Server Error</h1><hr class="line" /><p><b>Type</b> Exception Report</p><p><b>Message</b> Request processing failed; nested exception is org.springframework.web.client.RestClientException: Could not extract response: no suitable HttpMessageConverter found for response type [class [Lorg.openkilda.messaging.info.event.IslInfoData;] and content type [text/html;charset=utf-8]</p><p><b>Description</b> The server encountered an unexpected condition that prevented it from fulfilling the request.</p><p><b>Exception</b></p><pre>org.springframework.web.util.NestedServletException: Request processing failed; nested exception is org.springframework.web.client.RestClientException: Could not extract response: no suitable HttpMessageConverter found for response type [class [Lorg.openkilda.messaging.info.event.IslInfoData;] and content type [text/html;charset=utf-8]
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:982)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
javax.servlet.http.HttpServlet.service(HttpServlet.java:635)
org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
javax.servlet.http.HttpServlet.service(HttpServlet.java:742)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
org.openkilda.northbound.utils.RequestCorrelationFilter.doFilterInternal(RequestCorrelationFilter.java:57)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:215)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:197)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
</pre><p><b>Root Cause</b></p><pre>org.springframework.web.client.RestClientException: Could not extract response: no suitable HttpMessageConverter found for response type [class [Lorg.openkilda.messaging.info.event.IslInfoData;] and content type [text/html;charset=utf-8]
org.springframework.web.client.HttpMessageConverterExtractor.extractData(HttpMessageConverterExtractor.java:110)
org.springframework.web.client.RestTemplate$ResponseEntityResponseExtractor.extractData(RestTemplate.java:917)
org.springframework.web.client.RestTemplate$ResponseEntityResponseExtractor.extractData(RestTemplate.java:901)
org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:655)
org.springframework.web.client.RestTemplate.execute(RestTemplate.java:613)
org.springframework.web.client.RestTemplate.exchange(RestTemplate.java:531)
org.openkilda.northbound.service.impl.LinkServiceImpl.getLinks(LinkServiceImpl.java:83)
org.openkilda.northbound.controller.LinkController.getLinks(LinkController.java:55)
sun.reflect.GeneratedMethodAccessor148.invoke(Unknown Source)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)
org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:827)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:738)
org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)
org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
javax.servlet.http.HttpServlet.service(HttpServlet.java:635)
org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
javax.servlet.http.HttpServlet.service(HttpServlet.java:742)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
org.openkilda.northbound.utils.RequestCorrelationFilter.doFilterInternal(RequestCorrelationFilter.java:57)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:215)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:197)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
</pre><p><b>Note</b> The full stack trace of the root cause is available in the server logs.</p><hr class="line" /><h3>Apache Tomcat/8.5.16</h3></body></html>
```
|
non_process
|
http status from nb on null latency in http status – internal server error font family tahoma arial sans serif color white background color font size font family tahoma arial sans serif color white background color font size font family tahoma arial sans serif color white background color font size body font family tahoma arial sans serif color black background color white b font family tahoma arial sans serif color white background color p font family tahoma arial sans serif background white color black font size a color black a name color black line height background color border none http status – internal server error type exception report message request processing failed nested exception is org springframework web client restclientexception could not extract response no suitable httpmessageconverter found for response type and content type description the server encountered an unexpected condition that prevented it from fulfilling the request exception org springframework web util nestedservletexception request processing failed nested exception is org springframework web client restclientexception could not extract response no suitable httpmessageconverter found for response type and content type org springframework web servlet frameworkservlet processrequest frameworkservlet java org springframework web servlet frameworkservlet doget frameworkservlet java javax servlet http httpservlet service httpservlet java org springframework web servlet frameworkservlet service frameworkservlet java javax servlet http httpservlet service httpservlet java org apache tomcat websocket server wsfilter dofilter wsfilter java org openkilda northbound utils requestcorrelationfilter dofilterinternal requestcorrelationfilter java org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web access intercept 
filtersecurityinterceptor invoke filtersecurityinterceptor java org springframework security web access intercept filtersecurityinterceptor dofilter filtersecurityinterceptor java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web access exceptiontranslationfilter dofilter exceptiontranslationfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web session sessionmanagementfilter dofilter sessionmanagementfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web authentication anonymousauthenticationfilter dofilter anonymousauthenticationfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web servletapi securitycontextholderawarerequestfilter dofilter securitycontextholderawarerequestfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web savedrequest requestcacheawarefilter dofilter requestcacheawarefilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web authentication org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web authentication logout logoutfilter dofilter logoutfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web header headerwriterfilter dofilterinternal headerwriterfilter java org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java org springframework 
security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web context securitycontextpersistencefilter dofilter securitycontextpersistencefilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web context request async webasyncmanagerintegrationfilter dofilterinternal webasyncmanagerintegrationfilter java org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web filterchainproxy dofilterinternal filterchainproxy java org springframework security web filterchainproxy dofilter filterchainproxy java org springframework web filter delegatingfilterproxy invokedelegate delegatingfilterproxy java org springframework web filter delegatingfilterproxy dofilter delegatingfilterproxy java org springframework web filter characterencodingfilter dofilterinternal characterencodingfilter java org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java root cause org springframework web client restclientexception could not extract response no suitable httpmessageconverter found for response type and content type org springframework web client httpmessageconverterextractor extractdata httpmessageconverterextractor java org springframework web client resttemplate responseentityresponseextractor extractdata resttemplate java org springframework web client resttemplate responseentityresponseextractor extractdata resttemplate java org springframework web client resttemplate doexecute resttemplate java org springframework web client resttemplate execute resttemplate java org springframework web client resttemplate exchange resttemplate java org openkilda northbound service impl linkserviceimpl getlinks linkserviceimpl java org openkilda northbound controller 
linkcontroller getlinks linkcontroller java sun reflect invoke unknown source sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java java lang reflect method invoke method java org springframework web method support invocablehandlermethod doinvoke invocablehandlermethod java org springframework web method support invocablehandlermethod invokeforrequest invocablehandlermethod java org springframework web servlet mvc method annotation servletinvocablehandlermethod invokeandhandle servletinvocablehandlermethod java org springframework web servlet mvc method annotation requestmappinghandleradapter invokehandlermethod requestmappinghandleradapter java org springframework web servlet mvc method annotation requestmappinghandleradapter handleinternal requestmappinghandleradapter java org springframework web servlet mvc method abstracthandlermethodadapter handle abstracthandlermethodadapter java org springframework web servlet dispatcherservlet dodispatch dispatcherservlet java org springframework web servlet dispatcherservlet doservice dispatcherservlet java org springframework web servlet frameworkservlet processrequest frameworkservlet java org springframework web servlet frameworkservlet doget frameworkservlet java javax servlet http httpservlet service httpservlet java org springframework web servlet frameworkservlet service frameworkservlet java javax servlet http httpservlet service httpservlet java org apache tomcat websocket server wsfilter dofilter wsfilter java org openkilda northbound utils requestcorrelationfilter dofilterinternal requestcorrelationfilter java org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web access intercept filtersecurityinterceptor invoke filtersecurityinterceptor java org springframework security web access intercept filtersecurityinterceptor 
dofilter filtersecurityinterceptor java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web access exceptiontranslationfilter dofilter exceptiontranslationfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web session sessionmanagementfilter dofilter sessionmanagementfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web authentication anonymousauthenticationfilter dofilter anonymousauthenticationfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web servletapi securitycontextholderawarerequestfilter dofilter securitycontextholderawarerequestfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web savedrequest requestcacheawarefilter dofilter requestcacheawarefilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web authentication org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web authentication logout logoutfilter dofilter logoutfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web header headerwriterfilter dofilterinternal headerwriterfilter java org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web context 
securitycontextpersistencefilter dofilter securitycontextpersistencefilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web context request async webasyncmanagerintegrationfilter dofilterinternal webasyncmanagerintegrationfilter java org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java org springframework security web filterchainproxy dofilterinternal filterchainproxy java org springframework security web filterchainproxy dofilter filterchainproxy java org springframework web filter delegatingfilterproxy invokedelegate delegatingfilterproxy java org springframework web filter delegatingfilterproxy dofilter delegatingfilterproxy java org springframework web filter characterencodingfilter dofilterinternal characterencodingfilter java org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java note the full stack trace of the root cause is available in the server logs apache tomcat
| 0
|
127,962
| 18,024,762,623
|
IssuesEvent
|
2021-09-17 02:00:32
|
victorlmneves/fed-pug-boilerplate
|
https://api.github.com/repos/victorlmneves/fed-pug-boilerplate
|
opened
|
CVE-2021-3795 (Medium) detected in semver-regex-2.0.0.tgz
|
security vulnerability
|
## CVE-2021-3795 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>semver-regex-2.0.0.tgz</b></p></summary>
<p>Regular expression for matching semver versions</p>
<p>Library home page: <a href="https://registry.npmjs.org/semver-regex/-/semver-regex-2.0.0.tgz">https://registry.npmjs.org/semver-regex/-/semver-regex-2.0.0.tgz</a></p>
<p>Path to dependency file: fed-pug-boilerplate/package.json</p>
<p>Path to vulnerable library: fed-pug-boilerplate/node_modules/semver-regex/package.json</p>
<p>
Dependency Hierarchy:
- imagemin-pngquant-7.0.0.tgz (Root Library)
- pngquant-bin-5.0.2.tgz
- bin-wrapper-4.1.0.tgz
- bin-version-check-4.0.0.tgz
- bin-version-3.1.0.tgz
- find-versions-3.2.0.tgz
- :x: **semver-regex-2.0.0.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
semver-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3795>CVE-2021-3795</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/sindresorhus/semver-regex/releases/tag/v4.0.1">https://github.com/sindresorhus/semver-regex/releases/tag/v4.0.1</a></p>
<p>Release Date: 2021-09-15</p>
<p>Fix Resolution: semver-regex - 3.1.3,4.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-3795 (Medium) detected in semver-regex-2.0.0.tgz - ## CVE-2021-3795 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>semver-regex-2.0.0.tgz</b></p></summary>
<p>Regular expression for matching semver versions</p>
<p>Library home page: <a href="https://registry.npmjs.org/semver-regex/-/semver-regex-2.0.0.tgz">https://registry.npmjs.org/semver-regex/-/semver-regex-2.0.0.tgz</a></p>
<p>Path to dependency file: fed-pug-boilerplate/package.json</p>
<p>Path to vulnerable library: fed-pug-boilerplate/node_modules/semver-regex/package.json</p>
<p>
Dependency Hierarchy:
- imagemin-pngquant-7.0.0.tgz (Root Library)
- pngquant-bin-5.0.2.tgz
- bin-wrapper-4.1.0.tgz
- bin-version-check-4.0.0.tgz
- bin-version-3.1.0.tgz
- find-versions-3.2.0.tgz
- :x: **semver-regex-2.0.0.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
semver-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3795>CVE-2021-3795</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/sindresorhus/semver-regex/releases/tag/v4.0.1">https://github.com/sindresorhus/semver-regex/releases/tag/v4.0.1</a></p>
<p>Release Date: 2021-09-15</p>
<p>Fix Resolution: semver-regex - 3.1.3,4.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in semver regex tgz cve medium severity vulnerability vulnerable library semver regex tgz regular expression for matching semver versions library home page a href path to dependency file fed pug boilerplate package json path to vulnerable library fed pug boilerplate node modules semver regex package json dependency hierarchy imagemin pngquant tgz root library pngquant bin tgz bin wrapper tgz bin version check tgz bin version tgz find versions tgz x semver regex tgz vulnerable library found in base branch master vulnerability details semver regex is vulnerable to inefficient regular expression complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector n a attack complexity n a privileges required n a user interaction n a scope n a impact metrics confidentiality impact n a integrity impact n a availability impact n a for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution semver regex step up your open source security game with whitesource
| 0
|
305,753
| 9,376,572,170
|
IssuesEvent
|
2019-04-04 08:17:49
|
Calipsoplus/calipsoplus-backend
|
https://api.github.com/repos/Calipsoplus/calipsoplus-backend
|
opened
|
Create testing configuration to run on a single server
|
enhancement priority
|
Need testing environment like Travis which uses a single virtual machine to run all tests.
Might be possible to set backend and frontend ip address to 0.0.0.0 in the unit test configuration
|
1.0
|
Create testing configuration to run on a single server - Need testing environment like Travis which uses a single virtual machine to run all tests.
Might be possible to set backend and frontend ip address to 0.0.0.0 in the unit test configuration
|
non_process
|
create testing configuration to run on a single server need testing environment like travis which uses a single virtual machine to run all tests might be possible to set backend and frontend ip address to in the unit test configuration
| 0
|
11,176
| 13,957,695,133
|
IssuesEvent
|
2020-10-24 08:11:27
|
alexanderkotsev/geoportal
|
https://api.github.com/repos/alexanderkotsev/geoportal
|
opened
|
PT: harvesting portugal data
|
Geoportal Harvesting process PT - Portugal
|
From: Ricardo Deus [ricardo.joao.deus@gmail.com]
Sent: 21 December 2018 13:00
To: JRC INSPIRE SUPPORT
Subject: harvesting portugal data
Dear Sir's
My name is Ricardo i'm from Portugal.
I consult your geoportal and i found that the information from Portugal is not updated, the last harvesting is from 06-12-2018. Can you inform about the reason for that date for the harvesting process.
Best regards
Ricardo Deus
|
1.0
|
PT: harvesting portugal data - From: Ricardo Deus [ricardo.joao.deus@gmail.com]
Sent: 21 December 2018 13:00
To: JRC INSPIRE SUPPORT
Subject: harvesting portugal data
Dear Sir's
My name is Ricardo i'm from Portugal.
I consult your geoportal and i found that the information from Portugal is not updated, the last harvesting is from 06-12-2018. Can you inform about the reason for that date for the harvesting process.
Best regards
Ricardo Deus
|
process
|
pt harvesting portugal data from ricardo deus sent december to jrc inspire support subject harvesting portugal data dear sir s my name is ricardo i m from portugal i consult your geoportal and i found that the information from portugal is not updated the last harvesting is from can you inform about the reason for that date for the harvesting process best regards ricardo deus
| 1
|
20,167
| 26,720,343,489
|
IssuesEvent
|
2023-01-29 03:06:07
|
vesoft-inc/nebula
|
https://api.github.com/repos/vesoft-inc/nebula
|
closed
|
[TTL] If there is no compaction, the modification to ttl may not take effect
|
type/bug doc affected wontfix severity/minor affects/master process/done
|
**Please check the FAQ documentation before raising an issue**
<!-- Please check the [FAQ](https://docs.nebula-graph.com.cn/master/20.appendix/0.FAQ/) documentation and old issues before raising an issue in case someone has asked the same question that you are asking. -->
**Describe the bug (__required__)**
```
create tag ttl_tag2(age int, ttl timestamp) ttl_duration=600, ttl_col="ttl";
insert vertex ttl_tag2(age,ttl) VALUES "1":(10,now());
match (v:ttl_tag2) return v limit 10; // visible
alter tag ttl_tag2 ttl_duration=5;
match (v:ttl_tag2) return v limit 10; // not visible
alter tag ttl_tag2 ttl_duration=600;
match (v:ttl_tag2) return v limit 10; // visible
```
**Your Environments (__required__)**
* OS: `uname -a`
* Compiler: `g++ --version` or `clang++ --version`
* CPU: `lscpu`
* Commit id (e.g. `a3ffc7d8`)
**How To Reproduce(__required__)**
Steps to reproduce the behavior:
1. Step 1
2. Step 2
3. Step 3
**Expected behavior**
<!-- A clear and concise description of what you expected to happen. -->
**Additional context**
<!-- Provide logs and configs, or any other context to trace the problem. -->
|
1.0
|
[TTL] If there is no compaction, the modification to ttl may not take effect - **Please check the FAQ documentation before raising an issue**
<!-- Please check the [FAQ](https://docs.nebula-graph.com.cn/master/20.appendix/0.FAQ/) documentation and old issues before raising an issue in case someone has asked the same question that you are asking. -->
**Describe the bug (__required__)**
```
create tag ttl_tag2(age int, ttl timestamp) ttl_duration=600, ttl_col="ttl";
insert vertex ttl_tag2(age,ttl) VALUES "1":(10,now());
match (v:ttl_tag2) return v limit 10; // visible
alter tag ttl_tag2 ttl_duration=5;
match (v:ttl_tag2) return v limit 10; // not visible
alter tag ttl_tag2 ttl_duration=600;
match (v:ttl_tag2) return v limit 10; // visible
```
**Your Environments (__required__)**
* OS: `uname -a`
* Compiler: `g++ --version` or `clang++ --version`
* CPU: `lscpu`
* Commit id (e.g. `a3ffc7d8`)
**How To Reproduce(__required__)**
Steps to reproduce the behavior:
1. Step 1
2. Step 2
3. Step 3
**Expected behavior**
<!-- A clear and concise description of what you expected to happen. -->
**Additional context**
<!-- Provide logs and configs, or any other context to trace the problem. -->
|
process
|
if there is no compaction the modification to ttl may not take effect please check the faq documentation before raising an issue describe the bug required create tag ttl age int ttl timestamp ttl duration ttl col ttl insert vertex ttl age ttl values now match v ttl return v limit visible alter tag ttl ttl duration match v ttl return v limit not visible alter tag ttl ttl duration match v ttl return v limit visible your environments required os uname a compiler g version or clang version cpu lscpu commit id e g how to reproduce required steps to reproduce the behavior step step step expected behavior additional context
| 1
|
54,558
| 13,912,440,656
|
IssuesEvent
|
2020-10-20 18:52:33
|
jgeraigery/LocalCatalogManager
|
https://api.github.com/repos/jgeraigery/LocalCatalogManager
|
closed
|
CVE-2019-14892 (High) detected in jackson-databind-2.8.5.jar - autoclosed
|
security vulnerability
|
## CVE-2019-14892 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.5.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: LocalCatalogManager/lcm-server/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.5/jackson-databind-2.8.5.jar,canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.5/jackson-databind-2.8.5.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.5/jackson-databind-2.8.5.jar,canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.5/jackson-databind-2.8.5.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.8.5.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jgeraigery/LocalCatalogManager/commit/b8c24e199f2d440dea3ce3cc2c66ada102d5d922">b8c24e199f2d440dea3ce3cc2c66ada102d5d922</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A flaw was discovered in jackson-databind in versions before 2.9.10, 2.8.11.5 and 2.6.7.3, where it would permit polymorphic deserialization of a malicious object using commons-configuration 1 and 2 JNDI classes. An attacker could use this flaw to execute arbitrary code.
<p>Publish Date: 2020-03-02
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-14892>CVE-2019-14892</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2462">https://github.com/FasterXML/jackson-databind/issues/2462</a></p>
<p>Release Date: 2020-03-02</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.6.7.3,2.7.9.7,2.8.11.5,2.9.10</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.5","isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.8.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.6.7.3,2.7.9.7,2.8.11.5,2.9.10"}],"vulnerabilityIdentifier":"CVE-2019-14892","vulnerabilityDetails":"A flaw was discovered in jackson-databind in versions before 2.9.10, 2.8.11.5 and 2.6.7.3, where it would permit polymorphic deserialization of a malicious object using commons-configuration 1 and 2 JNDI classes. An attacker could use this flaw to execute arbitrary code.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-14892","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2019-14892 (High) detected in jackson-databind-2.8.5.jar - autoclosed - ## CVE-2019-14892 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.5.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: LocalCatalogManager/lcm-server/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.5/jackson-databind-2.8.5.jar,canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.5/jackson-databind-2.8.5.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.5/jackson-databind-2.8.5.jar,canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.5/jackson-databind-2.8.5.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.8.5.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jgeraigery/LocalCatalogManager/commit/b8c24e199f2d440dea3ce3cc2c66ada102d5d922">b8c24e199f2d440dea3ce3cc2c66ada102d5d922</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A flaw was discovered in jackson-databind in versions before 2.9.10, 2.8.11.5 and 2.6.7.3, where it would permit polymorphic deserialization of a malicious object using commons-configuration 1 and 2 JNDI classes. An attacker could use this flaw to execute arbitrary code.
<p>Publish Date: 2020-03-02
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-14892>CVE-2019-14892</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2462">https://github.com/FasterXML/jackson-databind/issues/2462</a></p>
<p>Release Date: 2020-03-02</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.6.7.3,2.7.9.7,2.8.11.5,2.9.10</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.5","isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.8.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.6.7.3,2.7.9.7,2.8.11.5,2.9.10"}],"vulnerabilityIdentifier":"CVE-2019-14892","vulnerabilityDetails":"A flaw was discovered in jackson-databind in versions before 2.9.10, 2.8.11.5 and 2.6.7.3, where it would permit polymorphic deserialization of a malicious object using commons-configuration 1 and 2 JNDI classes. An attacker could use this flaw to execute arbitrary code.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-14892","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in jackson databind jar autoclosed cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file localcatalogmanager lcm server pom xml path to vulnerable library canner repository com fasterxml jackson core jackson databind jackson databind jar canner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar canner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details a flaw was discovered in jackson databind in versions before and where it would permit polymorphic deserialization of a malicious object using commons configuration and jndi classes an attacker could use this flaw to execute arbitrary code publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails a flaw was discovered in jackson databind in versions before and where it would permit polymorphic deserialization of a malicious object using commons configuration and jndi classes an attacker could use this flaw to execute arbitrary code vulnerabilityurl
| 0
|
6,122
| 8,996,344,853
|
IssuesEvent
|
2019-02-02 00:50:18
|
bow-simulation/virtualbow
|
https://api.github.com/repos/bow-simulation/virtualbow
|
closed
|
Use Gitlab-CI to build and run tests automatically
|
area: software process prio: normal type: improvement
|
In GitLab by @spfeifer on Dec 7, 2018, 11:52
This is a first step towards #33
Optional Bonus: Provide development build artifacts (AppImage, deb, rpm)
|
1.0
|
Use Gitlab-CI to build and run tests automatically - In GitLab by @spfeifer on Dec 7, 2018, 11:52
This is a first step towards #33
Optional Bonus: Provide development build artifacts (AppImage, deb, rpm)
|
process
|
use gitlab ci to build and run tests automatically in gitlab by spfeifer on dec this is a first step towards optional bonus provide development build artifacts appimage deb rpm
| 1
|
18,606
| 24,577,686,087
|
IssuesEvent
|
2022-10-13 13:31:53
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
clean: true = clean: all?
|
devops/prod doc-bug Pri1 devops-cicd-process/tech
|
You write in the section on workspaces:
> When the Clean setting is true it is equivalent to specifying clean: true for every [checkout](https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/steps-checkout) step in your pipeline.
But what is `clean: true` ? Before, you defined `clean: outputs | resources | all`. Is it one of those? (maybe a predecessor?)
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 95227592-b2a9-62be-59d1-fe8c7f4eae8e
* Version Independent ID: abf0c659-ce8c-b187-2784-22ddc061d69f
* Content: [Jobs in Azure Pipelines and TFS - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops&tabs=yaml)
* Content Source: [docs/pipelines/process/phases.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/phases.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
clean: true = clean: all? - You write in the section on workspaces:
> When the Clean setting is true it is equivalent to specifying clean: true for every [checkout](https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/steps-checkout) step in your pipeline.
But what is `clean: true` ? Before, you defined `clean: outputs | resources | all`. Is it one of those? (maybe a predecessor?)
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 95227592-b2a9-62be-59d1-fe8c7f4eae8e
* Version Independent ID: abf0c659-ce8c-b187-2784-22ddc061d69f
* Content: [Jobs in Azure Pipelines and TFS - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops&tabs=yaml)
* Content Source: [docs/pipelines/process/phases.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/phases.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
clean true clean all you write in the section on workspaces when the clean setting is true it is equivalent to specifying clean true for every step in your pipeline but what is clean true before you defined clean outputs resources all is it one of those maybe a predecessor document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
506,907
| 14,675,721,731
|
IssuesEvent
|
2020-12-30 18:18:49
|
pumpingstationone/WA2AD
|
https://api.github.com/repos/pumpingstationone/WA2AD
|
opened
|
Check for accounts with no match in WA
|
enhancement high priority
|
Can we add functionality that will check if there are any accounts that don't match a WA account and disable them? Also log the accounts with no matching WA ID.
|
1.0
|
Check for accounts with no match in WA - Can we add functionality that will check if there are any accounts that don't match a WA account and disable them? Also log the accounts with no matching WA ID.
|
non_process
|
check for accounts with no match in wa can we add functionality that will check if there are any accounts that don t match a wa account and disable them also log the accounts with no matching wa id
| 0
|
384,219
| 26,578,298,448
|
IssuesEvent
|
2023-01-22 04:34:15
|
suspensive/react
|
https://api.github.com/repos/suspensive/react
|
closed
|
[Feature]: Add related link about suspense for who need to understand it
|
documentation
|
### **Package Scope**
docs
<!--
Is this feature added to an existing package?
Or make a new one?
-->
- [x] Add to an existing package
- [ ] New package
<!-- Write the package name here -->
Package name:
### **Overview**
<!-- A clear and concise description about the feature -->
### **Describe the solution you'd like**
<!-- A clear and concise description of what you want to happen -->
### **Additional context**
<!-- Add any other context or screenshots about the feature request here -->
|
1.0
|
[Feature]: Add related link about suspense for who need to understand it - ### **Package Scope**
docs
<!--
Is this feature added to an existing package?
Or make a new one?
-->
- [x] Add to an existing package
- [ ] New package
<!-- Write the package name here -->
Package name:
### **Overview**
<!-- A clear and concise description about the feature -->
### **Describe the solution you'd like**
<!-- A clear and concise description of what you want to happen -->
### **Additional context**
<!-- Add any other context or screenshots about the feature request here -->
|
non_process
|
add related link about suspense for who need to understand it package scope docs is this feature added to an existing package or make a new one add to an existing package new package package name overview describe the solution you d like additional context
| 0
|
22,416
| 31,158,408,246
|
IssuesEvent
|
2023-08-16 14:28:41
|
UnitTestBot/UTBotJava
|
https://api.github.com/repos/UnitTestBot/UTBotJava
|
opened
|
Investigate transformation of Integration tests generated by Fuzzing into Unit tests
|
ctg-enhancement comp-fuzzing comp-instrumented-process comp-spring
|
**Description**
Investigate transformation of Integration tests generated by Fuzzing into Unit tests.
Possible solutions how to do this are suggested below.
Need to try this live and analyze the results.
**Expected behavior**
Transform saving and retrieving data from repositories into mocks.
Use selected Mocking strategy to replace external method calls with mocks (outside of class/package).
We know what these methods are returning from the concrete execution, so we can mock them with these values.
Use JUnit4/JUnit5 runner instead of @SpringBootTest.
Do not include contextLoads() test.
**Environment**
IntelliJ IDEA 2023.* Ultimate/Community
**Potential alternatives**
- #2321
|
1.0
|
Investigate transformation of Integration tests generated by Fuzzing into Unit tests - **Description**
Investigate transformation of Integration tests generated by Fuzzing into Unit tests.
Possible solutions how to do this are suggested below.
Need to try this live and analyze the results.
**Expected behavior**
Transform saving and retrieving data from repositories into mocks.
Use selected Mocking strategy to replace external method calls with mocks (outside of class/package).
We know what these methods are returning from the concrete execution, so we can mock them with these values.
Use JUnit4/JUnit5 runner instead of @SpringBootTest.
Do not include contextLoads() test.
**Environment**
IntelliJ IDEA 2023.* Ultimate/Community
**Potential alternatives**
- #2321
|
process
|
investigate transformation of integration tests generated by fuzzing into unit tests description investigate transformation of integration tests generated by fuzzing into unit tests possible solutions how to do this are suggested below need to try this live and analyze the results expected behavior transform saving and retrieving data from repositories into mocks use selected mocking strategy to replace external method calls with mocks outside of class package we know what these methods are returning from the concrete execution so we can mock them with these values use runner instead of springboottest do not include contextloads test environment intellij idea ultimate community potential alternatives
| 1
|
44,949
| 13,097,422,482
|
IssuesEvent
|
2020-08-03 17:26:07
|
jtimberlake/COSMOS
|
https://api.github.com/repos/jtimberlake/COSMOS
|
opened
|
CVE-2018-20676 (Medium) detected in bootstrap-3.0.3.js, bootstrap-3.0.3.min.js
|
security vulnerability
|
## CVE-2018-20676 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-3.0.3.js</b>, <b>bootstrap-3.0.3.min.js</b></p></summary>
<p>
<details><summary><b>bootstrap-3.0.3.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.0.3/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.0.3/js/bootstrap.js</a></p>
<p>Path to vulnerable library: /COSMOS/test/performance/config/tools/handbook_creator/assets/js/bootstrap.js,/COSMOS/demo/config/tools/handbook_creator/assets/js/bootstrap.js,/COSMOS/autohotkey/config/tools/handbook_creator/assets/js/bootstrap.js,/COSMOS/install/config/tools/handbook_creator/assets/js/bootstrap.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.0.3.js** (Vulnerable Library)
</details>
<details><summary><b>bootstrap-3.0.3.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.0.3/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.0.3/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /COSMOS/autohotkey/config/tools/handbook_creator/assets/js/bootstrap.min.js,/COSMOS/demo/config/tools/handbook_creator/assets/js/bootstrap.min.js,/COSMOS/test/performance/config/tools/handbook_creator/assets/js/bootstrap.min.js,/COSMOS/install/config/tools/handbook_creator/assets/js/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.0.3.min.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/jtimberlake/COSMOS/commit/e967c77766e1b731bec2bab4f5fafb6af874c2c1">e967c77766e1b731bec2bab4f5fafb6af874c2c1</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 3.4.0, XSS is possible in the tooltip data-viewport attribute.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20676>CVE-2018-20676</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20676">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20676</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: bootstrap - 3.4.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.0.3","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.0.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"bootstrap - 3.4.0"},{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.0.3","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.0.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"bootstrap - 3.4.0"}],"vulnerabilityIdentifier":"CVE-2018-20676","vulnerabilityDetails":"In Bootstrap before 3.4.0, XSS is possible in the tooltip data-viewport attribute.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20676","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2018-20676 (Medium) detected in bootstrap-3.0.3.js, bootstrap-3.0.3.min.js - ## CVE-2018-20676 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-3.0.3.js</b>, <b>bootstrap-3.0.3.min.js</b></p></summary>
<p>
<details><summary><b>bootstrap-3.0.3.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.0.3/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.0.3/js/bootstrap.js</a></p>
<p>Path to vulnerable library: /COSMOS/test/performance/config/tools/handbook_creator/assets/js/bootstrap.js,/COSMOS/demo/config/tools/handbook_creator/assets/js/bootstrap.js,/COSMOS/autohotkey/config/tools/handbook_creator/assets/js/bootstrap.js,/COSMOS/install/config/tools/handbook_creator/assets/js/bootstrap.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.0.3.js** (Vulnerable Library)
</details>
<details><summary><b>bootstrap-3.0.3.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.0.3/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.0.3/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /COSMOS/autohotkey/config/tools/handbook_creator/assets/js/bootstrap.min.js,/COSMOS/demo/config/tools/handbook_creator/assets/js/bootstrap.min.js,/COSMOS/test/performance/config/tools/handbook_creator/assets/js/bootstrap.min.js,/COSMOS/install/config/tools/handbook_creator/assets/js/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.0.3.min.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/jtimberlake/COSMOS/commit/e967c77766e1b731bec2bab4f5fafb6af874c2c1">e967c77766e1b731bec2bab4f5fafb6af874c2c1</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 3.4.0, XSS is possible in the tooltip data-viewport attribute.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20676>CVE-2018-20676</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20676">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20676</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: bootstrap - 3.4.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.0.3","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.0.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"bootstrap - 3.4.0"},{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.0.3","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.0.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"bootstrap - 3.4.0"}],"vulnerabilityIdentifier":"CVE-2018-20676","vulnerabilityDetails":"In Bootstrap before 3.4.0, XSS is possible in the tooltip data-viewport attribute.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20676","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in bootstrap js bootstrap min js cve medium severity vulnerability vulnerable libraries bootstrap js bootstrap min js bootstrap js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library cosmos test performance config tools handbook creator assets js bootstrap js cosmos demo config tools handbook creator assets js bootstrap js cosmos autohotkey config tools handbook creator assets js bootstrap js cosmos install config tools handbook creator assets js bootstrap js dependency hierarchy x bootstrap js vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library cosmos autohotkey config tools handbook creator assets js bootstrap min js cosmos demo config tools handbook creator assets js bootstrap min js cosmos test performance config tools handbook creator assets js bootstrap min js cosmos install config tools handbook creator assets js bootstrap min js dependency hierarchy x bootstrap min js vulnerable library found in head commit a href vulnerability details in bootstrap before xss is possible in the tooltip data viewport attribute publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution bootstrap isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails in bootstrap before xss is possible in the tooltip data viewport attribute vulnerabilityurl
| 0
|
191,076
| 15,269,124,544
|
IssuesEvent
|
2021-02-22 12:21:53
|
neoul-gaji/WebsiteFE
|
https://api.github.com/repos/neoul-gaji/WebsiteFE
|
opened
|
Bootstrap Vue 사용법
|
documentation
|
### Bootstrap Vue Docs
**1. Utility (ex. Sizing, Spacing)**
https://getbootstrap.com/docs/4.5/utilities/borders/
**2. Component (ex. Button, Form)**
https://bootstrap-vue.org/docs/components
최대한 위의 공식 문서 참고해서 개발해주세요!
|
1.0
|
Bootstrap Vue 사용법 - ### Bootstrap Vue Docs
**1. Utility (ex. Sizing, Spacing)**
https://getbootstrap.com/docs/4.5/utilities/borders/
**2. Component (ex. Button, Form)**
https://bootstrap-vue.org/docs/components
최대한 위의 공식 문서 참고해서 개발해주세요!
|
non_process
|
bootstrap vue 사용법 bootstrap vue docs utility ex sizing spacing component ex button form 최대한 위의 공식 문서 참고해서 개발해주세요
| 0
|
2,613
| 5,389,768,668
|
IssuesEvent
|
2017-02-25 06:35:38
|
jlm2017/jlm-video-subtitles
|
https://api.github.com/repos/jlm2017/jlm-video-subtitles
|
opened
|
[subtitles] [fr] JEAN-LUC MÉLENCHON INVITÉ DE L’ÉMISSION POLITIQUE
|
Language: French Process: Someone is working on this issue Process: [1] Writing in progress
|
# Titre vidéo
JEAN-LUC MÉLENCHON INVITÉ DE L’ÉMISSION POLITIQUE -
# Date de mise en ligne
24/02/2017
# URL
https://youtu.be/GSxTrpViZno
# Youtube subtitles language
Sous -titres : Français
# Duration
2:15 h
# Subtitles URL
https://www.youtube.com/timedtext_editor?action_mde_edit_form=1&ref=wt&v=GSxTrpViZno&ui=hd&bl=watch&lang=fr&tab=captions
|
2.0
|
[subtitles] [fr] JEAN-LUC MÉLENCHON INVITÉ DE L’ÉMISSION POLITIQUE - # Titre vidéo
JEAN-LUC MÉLENCHON INVITÉ DE L’ÉMISSION POLITIQUE -
# Date de mise en ligne
24/02/2017
# URL
https://youtu.be/GSxTrpViZno
# Youtube subtitles language
Sous -titres : Français
# Duration
2:15 h
# Subtitles URL
https://www.youtube.com/timedtext_editor?action_mde_edit_form=1&ref=wt&v=GSxTrpViZno&ui=hd&bl=watch&lang=fr&tab=captions
|
process
|
jean luc mélenchon invité de l’émission politique titre vidéo jean luc mélenchon invité de l’émission politique date de mise en ligne url youtube subtitles language sous titres français duration h subtitles url
| 1
|
65,040
| 12,519,116,976
|
IssuesEvent
|
2020-06-03 13:59:16
|
atilacamurca/glossario-friends
|
https://api.github.com/repos/atilacamurca/glossario-friends
|
closed
|
Modificar query da busca para usar episódios
|
bug code
|
Atualmente a query está usando posts ao invés de episódios
|
1.0
|
Modificar query da busca para usar episódios - Atualmente a query está usando posts ao invés de episódios
|
non_process
|
modificar query da busca para usar episódios atualmente a query está usando posts ao invés de episódios
| 0
|
77,776
| 21,958,838,698
|
IssuesEvent
|
2022-05-24 14:16:11
|
dotnet/runtime
|
https://api.github.com/repos/dotnet/runtime
|
closed
|
The "MonoAOTCompiler" task failed unexpectedly
|
area-Build-mono
|
### Description
AOT build fails for simple Blazor WebAssembly App when using .NET 7 Preview 3 SDK.
```
Optimizing assemblies for size may change the behavior of the app. Be sure to test after publishing. See: https://aka.ms/dotnet-illink
AOT'ing 28 assemblies
C:\Program Files\dotnet\packs\Microsoft.NET.Runtime.WebAssembly.Sdk\7.0.0-preview.3.22175.4\Sdk\WasmApp.Native.targets(567,5): Error MSB4018: The "MonoAOTCompiler" task failed unexpectedly.
System.IO.FileNotFoundException: Could not load file or assembly 'System.Reflection.Metadata, Version=6.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The system cannot find the file specified.
File name: 'System.Reflection.Metadata, Version=6.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'
at MonoAOTCompiler.FilterAssemblies(IEnumerable`1 assemblies)
at MonoAOTCompiler.ExecuteInternal()
at MonoAOTCompiler.Execute()
at Microsoft.Build.BackEnd.TaskExecutionHost.Microsoft.Build.BackEnd.ITaskExecutionHost.Execute()
at Microsoft.Build.BackEnd.TaskBuilder.<ExecuteInstantiatedTask>d__26.MoveNext()
```
### Reproduction Steps
- Install .NET 7 Preview 3 SDK and wasm-tools workload
- Create a new 'Blazor WebAssembly App' with Visual Studio 2022 Preview
- Publish the app using AOT compilation
### Expected behavior
Publish succeeds.
### Actual behavior
Publish fails.
### Regression?
_No response_
### Known Workarounds
_No response_
### Configuration
_No response_
### Other information
_No response_
|
1.0
|
The "MonoAOTCompiler" task failed unexpectedly - ### Description
AOT build fails for simple Blazor WebAssembly App when using .NET 7 Preview 3 SDK.
```
Optimizing assemblies for size may change the behavior of the app. Be sure to test after publishing. See: https://aka.ms/dotnet-illink
AOT'ing 28 assemblies
C:\Program Files\dotnet\packs\Microsoft.NET.Runtime.WebAssembly.Sdk\7.0.0-preview.3.22175.4\Sdk\WasmApp.Native.targets(567,5): Error MSB4018: The "MonoAOTCompiler" task failed unexpectedly.
System.IO.FileNotFoundException: Could not load file or assembly 'System.Reflection.Metadata, Version=6.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The system cannot find the file specified.
File name: 'System.Reflection.Metadata, Version=6.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'
at MonoAOTCompiler.FilterAssemblies(IEnumerable`1 assemblies)
at MonoAOTCompiler.ExecuteInternal()
at MonoAOTCompiler.Execute()
at Microsoft.Build.BackEnd.TaskExecutionHost.Microsoft.Build.BackEnd.ITaskExecutionHost.Execute()
at Microsoft.Build.BackEnd.TaskBuilder.<ExecuteInstantiatedTask>d__26.MoveNext()
```
### Reproduction Steps
- Install .NET 7 Preview 3 SDK and wasm-tools workload
- Create a new 'Blazor WebAssembly App' with Visual Studio 2022 Preview
- Publish the app using AOT compilation
### Expected behavior
Publish succeeds.
### Actual behavior
Publish fails.
### Regression?
_No response_
### Known Workarounds
_No response_
### Configuration
_No response_
### Other information
_No response_
|
non_process
|
the monoaotcompiler task failed unexpectedly description aot build fails for simple blazor webassembly app when using net preview sdk optimizing assemblies for size may change the behavior of the app be sure to test after publishing see aot ing assemblies c program files dotnet packs microsoft net runtime webassembly sdk preview sdk wasmapp native targets error the monoaotcompiler task failed unexpectedly system io filenotfoundexception could not load file or assembly system reflection metadata version culture neutral publickeytoken or one of its dependencies the system cannot find the file specified file name system reflection metadata version culture neutral publickeytoken at monoaotcompiler filterassemblies ienumerable assemblies at monoaotcompiler executeinternal at monoaotcompiler execute at microsoft build backend taskexecutionhost microsoft build backend itaskexecutionhost execute at microsoft build backend taskbuilder d movenext reproduction steps install net preview sdk and wasm tools workload create a new blazor webassembly app with visual studio preview publish the app using aot compilation expected behavior publish succeeds actual behavior publish fails regression no response known workarounds no response configuration no response other information no response
| 0
|
42,512
| 2,870,938,041
|
IssuesEvent
|
2015-06-07 17:36:17
|
Naoghuman/Dream-Better-Worlds
|
https://api.github.com/repos/Naoghuman/Dream-Better-Worlds
|
closed
|
Remove in the menu the entry Exercises.
|
development refactoring issue fixed priority low
|
Remove in the menu the entry Exercises. Comment out only.
|
1.0
|
Remove in the menu the entry Exercises. - Remove in the menu the entry Exercises. Comment out only.
|
non_process
|
remove in the menu the entry exercises remove in the menu the entry exercises comment out only
| 0
|
2,684
| 5,534,161,545
|
IssuesEvent
|
2017-03-21 14:53:01
|
powertac/powertac-server
|
https://api.github.com/repos/powertac/powertac-server
|
closed
|
Use of powertac.version property defeats maven release process.
|
Bug Process
|
The process of releasing the powertac-server multi-module currently requires the user to type in the release and development version numbers for each module to clear "remaining SNAPSHOT dependencies", after which it still fails saying `on project powertac-server: The version could not be updated: ${powertac.version}`. It is not clear exactly which instance of `${powertac.version}` it's complaining about at the end, possibly the one in the build clause under the maven-javadoc-plugin.
|
1.0
|
Use of powertac.version property defeats maven release process. - The process of releasing the powertac-server multi-module currently requires the user to type in the release and development version numbers for each module to clear "remaining SNAPSHOT dependencies", after which it still fails saying `on project powertac-server: The version could not be updated: ${powertac.version}`. It is not clear exactly which instance of `${powertac.version}` it's complaining about at the end, possibly the one in the build clause under the maven-javadoc-plugin.
|
process
|
use of powertac version property defeats maven release process the process of releasing the powertac server multi module currently requires the user to type in the release and development version numbers for each module to clear remaining snapshot dependencies after which it still fails saying on project powertac server the version could not be updated powertac version it is not clear exactly which instance of powertac version it s complaining about at the end possibly the one in the build clause under the maven javadoc plugin
| 1
|
160,487
| 20,102,954,468
|
IssuesEvent
|
2022-02-07 07:27:48
|
pazhanivel07/linux-4.19.72
|
https://api.github.com/repos/pazhanivel07/linux-4.19.72
|
opened
|
CVE-2016-3695 (Medium) detected in linux-yoctov5.4.51
|
security vulnerability
|
## CVE-2016-3695 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in HEAD commit: <a href="https://github.com/pazhanivel07/linux-4.19.72/commit/ce28e4f7a922d93d9b737061ae46827305c8c30a">ce28e4f7a922d93d9b737061ae46827305c8c30a</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/acpi/apei/einj.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/acpi/apei/einj.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The einj_error_inject function in drivers/acpi/apei/einj.c in the Linux kernel allows local users to simulate hardware errors and consequently cause a denial of service by leveraging failure to disable APEI error injection through EINJ when securelevel is set.
<p>Publish Date: 2017-12-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-3695>CVE-2016-3695</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Change files</p>
<p>Origin: <a href="https://github.com/mjg59/linux/commit/d7a6be58edc01b1c66ecd8fcc91236bfbce0a420">https://github.com/mjg59/linux/commit/d7a6be58edc01b1c66ecd8fcc91236bfbce0a420</a></p>
<p>Release Date: 2016-04-03</p>
<p>Fix Resolution: Replace or update the following file: einj.c</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2016-3695 (Medium) detected in linux-yoctov5.4.51 - ## CVE-2016-3695 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in HEAD commit: <a href="https://github.com/pazhanivel07/linux-4.19.72/commit/ce28e4f7a922d93d9b737061ae46827305c8c30a">ce28e4f7a922d93d9b737061ae46827305c8c30a</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/acpi/apei/einj.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/acpi/apei/einj.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The einj_error_inject function in drivers/acpi/apei/einj.c in the Linux kernel allows local users to simulate hardware errors and consequently cause a denial of service by leveraging failure to disable APEI error injection through EINJ when securelevel is set.
<p>Publish Date: 2017-12-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-3695>CVE-2016-3695</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
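The 5.5 above follows mechanically from the listed metrics. As a sketch (weights and coefficients taken from the CVSS 3.0 specification, scope unchanged; not part of the original report), the base score can be recomputed like this:

```javascript
// Sketch: recomputing the CVSS 3.0 base score for AV:L/AC:L/PR:L/UI:N/S:U/C:N/I:N/A:H.
// Metric weights are the published CVSS 3.0 values for scope "Unchanged".
const AV_LOCAL = 0.55;   // Attack Vector: Local
const AC_LOW = 0.77;     // Attack Complexity: Low
const PR_LOW = 0.62;     // Privileges Required: Low (scope unchanged)
const UI_NONE = 0.85;    // User Interaction: None
const C_NONE = 0;        // Confidentiality impact: None
const I_NONE = 0;        // Integrity impact: None
const A_HIGH = 0.56;     // Availability impact: High

// Impact Sub-Score and Impact (scope unchanged uses the 6.42 coefficient).
const iss = 1 - (1 - C_NONE) * (1 - I_NONE) * (1 - A_HIGH);
const impact = 6.42 * iss;

// Exploitability sub-score.
const exploitability = 8.22 * AV_LOCAL * AC_LOW * PR_LOW * UI_NONE;

// The spec rounds *up* to one decimal place.
const roundup = (x) => Math.ceil(x * 10) / 10;
const baseScore = roundup(Math.min(impact + exploitability, 10));

console.log(baseScore); // 5.5
```

The availability-only impact (`A:H` with `C:N/I:N`) is what keeps this at Medium severity despite the low attack complexity.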
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Change files</p>
<p>Origin: <a href="https://github.com/mjg59/linux/commit/d7a6be58edc01b1c66ecd8fcc91236bfbce0a420">https://github.com/mjg59/linux/commit/d7a6be58edc01b1c66ecd8fcc91236bfbce0a420</a></p>
<p>Release Date: 2016-04-03</p>
<p>Fix Resolution: Replace or update the following file: einj.c</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in linux cve medium severity vulnerability vulnerable library linux yocto linux embedded kernel library home page a href found in head commit a href found in base branch master vulnerable source files drivers acpi apei einj c drivers acpi apei einj c vulnerability details the einj error inject function in drivers acpi apei einj c in the linux kernel allows local users to simulate hardware errors and consequently cause a denial of service by leveraging failure to disable apei error injection through einj when securelevel is set publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type change files origin a href release date fix resolution replace or update the following file einj c step up your open source security game with whitesource
| 0
|
8,413
| 11,579,139,103
|
IssuesEvent
|
2020-02-21 17:16:33
|
prisma/prisma-client-js
|
https://api.github.com/repos/prisma/prisma-client-js
|
closed
|
`JSON.parse(dmmfString)` fails on prisma schema with docs
|
process/candidate
|
Generating Prisma Client creates a file that parses the JSON string containing the DMMF representation of the schema:
https://github.com/prisma/prisma-client-js/blob/4e67206f85e43d9a06e082027f77fe7a1c7f1d95/packages/photon/src/generation/TSClient.ts#L384-L387
Unfortunately, a raw `\r` character is invalid inside a JSON string, which results in an `Unexpected token in JSON` error. Simple example:
```js
const str = '{ "field": "docs\r" }'; // the JS parser turns \r into a raw carriage return
JSON.parse(str); // throws SyntaxError: unescaped control characters are not allowed in JSON strings
```
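One way a generator could sidestep this (a sketch of the general technique, not necessarily how Prisma resolved it) is to run the serialized DMMF through `JSON.stringify` a second time when emitting the source file, so quotes and control characters such as `\r` survive one round of JavaScript parsing:

```javascript
// Sketch: safely embedding a JSON payload in generated JS source.
// The docstring contains a carriage return, as in a schema comment.
const dmmf = { documentation: 'Role enum comment\r' };
const dmmfString = JSON.stringify(dmmf); // \r becomes the two characters "\" + "r"

// Naive template: when the generated file is loaded, the JS parser
// consumes the escape, leaving a raw CR inside the JSON string -> SyntaxError.
const naiveSource = `JSON.parse('${dmmfString}')`;

// Safer template: stringify the *string* itself, so the emitted literal
// carries doubled escapes ("\\r") and round-trips cleanly.
const safeSource = `JSON.parse(${JSON.stringify(dmmfString)})`;

// eval() stands in for requiring the generated file.
let naiveThrows = false;
try { eval(naiveSource); } catch (e) { naiveThrows = true; }
console.log(naiveThrows); // true
console.log(eval(safeSource).documentation === dmmf.documentation); // true
```

The same double-stringify trick covers `\n`, `\t`, quotes, and backslashes in docs, since `JSON.stringify` escapes every character that would otherwise be mangled by the host language's string-literal parsing.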
My Prisma schema:
```prisma
datasource db {
provider = "sqlite"
url = "file:../dev.db"
}
type Numeric = Float
generator client {
provider = "prisma-client-js"
binaryTargets = ["windows", "debian-openssl-1.1.x"]
output = "../prisma/generated/client"
}
generator typegraphql {
provider = "../src/cli/dev.ts"
output = "../prisma/generated/type-graphql"
emitDMMF = true
}
/// Role enum comment
enum Role {
// USER = "User"
USER
// ADMIN = "Admin"
ADMIN
}
/// User model comment
model User {
/// User model field comment
id Int @id @default(autoincrement())
email String @unique
name String?
age Int
balance Numeric
amount Float
posts Post[]
// maybePosts Post[]?
role Role
// address Address
// address2 embed {
// street String
// zipCode String
// }
}
// embed Address {
// street String
// zipCode String
// }
enum PostKind {
BLOG
ADVERT
}
model Post {
uuid String @default(cuid()) @id
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
published Boolean
title String
content String?
author User
// coAuthor User?
kind PostKind?
}
```
Generated DMMF string:
```js
/**
* DMMF
*/
const dmmfString = '{"datamodel":{"enums":[{"name":"Role","values":[{"name":"USER","dbName":null},{"name":"ADMIN","dbName":null}],"dbName":null,"documentation":"Role enum comment\r"},{"name":"PostKind","values":[{"name":"BLOG","dbName":null},{"name":"ADVERT","dbName":null}],"dbName":null}],"models":[{"name":"User","isEmbedded":false,"dbName":null,"fields":[{"name":"id","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":true,"type":"Int","default":{"name":"autoincrement","returnType":"Int","args":[]},"isGenerated":false,"isUpdatedAt":false,"documentation":"User model field comment\r"},{"name":"email","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":true,"isId":false,"type":"String","isGenerated":false,"isUpdatedAt":false},{"name":"name","kind":"scalar","dbNames":[],"isList":false,"isRequired":false,"isUnique":false,"isId":false,"type":"String","isGenerated":false,"isUpdatedAt":false},{"name":"age","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"Int","isGenerated":false,"isUpdatedAt":false},{"name":"balance","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"Float","isGenerated":false,"isUpdatedAt":false},{"name":"amount","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"Float","isGenerated":false,"isUpdatedAt":false},{"name":"posts","kind":"object","dbNames":[],"isList":true,"isRequired":false,"isUnique":false,"isId":false,"type":"Post","relationName":"PostToUser","relationToFields":[],"relationOnDelete":"NONE","isGenerated":false,"isUpdatedAt":false},{"name":"role","kind":"enum","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"Role","isGenerated":false,"isUpdatedAt":false}],"isGenerated":false,"documentation":"User model 
comment\r","idFields":[]},{"name":"Post","isEmbedded":false,"dbName":null,"fields":[{"name":"uuid","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":true,"type":"String","default":{"name":"cuid","returnType":"String","args":[]},"isGenerated":false,"isUpdatedAt":false},{"name":"createdAt","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"DateTime","default":{"name":"now","returnType":"DateTime","args":[]},"isGenerated":false,"isUpdatedAt":false},{"name":"updatedAt","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"DateTime","isGenerated":false,"isUpdatedAt":true},{"name":"published","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"Boolean","isGenerated":false,"isUpdatedAt":false},{"name":"title","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"String","isGenerated":false,"isUpdatedAt":false},{"name":"content","kind":"scalar","dbNames":[],"isList":false,"isRequired":false,"isUnique":false,"isId":false,"type":"String","isGenerated":false,"isUpdatedAt":false},{"name":"author","kind":"object","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"User","relationName":"PostToUser","relationToFields":["id"],"relationOnDelete":"NONE","isGenerated":false,"isUpdatedAt":false},{"name":"kind","kind":"enum","dbNames":[],"isList":false,"isRequired":false,"isUnique":false,"isId":false,"type":"PostKind","isGenerated":false,"isUpdatedAt":false}],"isGenerated":false,"idFields":[]}]},"mappings":[{"model":"User","plural":"users","findOne":"findOneUser","findMany":"findManyUser","create":"createOneUser","delete":"deleteOneUser","update":"updateOneUser","deleteMany":"deleteManyUser","updateMany":"updateManyUser","upsert":"upsertOneUser","aggregate":"aggregateUser"},{"model":"Post","plural":"posts","findOne":"findOnePost","fin
dMany":"findManyPost","create":"createOnePost","delete":"deleteOnePost","update":"updateOnePost","deleteMany":"deleteManyPost","updateMany":"updateManyPost","upsert":"upsertOnePost","aggregate":"aggregatePost"}],"schema":{"enums":[{"name":"OrderByArg","values":["asc","desc"]},{"name":"Role","values":["USER","ADMIN"]},{"name":"PostKind","values":["BLOG","ADVERT"]}],"outputTypes":[{"name":"Post","fields":[{"name":"uuid","args":[],"outputType":{"type":"String","kind":"scalar","isRequired":true,"isList":false}},{"name":"createdAt","args":[],"outputType":{"type":"DateTime","kind":"scalar","isRequired":true,"isList":false}},{"name":"updatedAt","args":[],"outputType":{"type":"DateTime","kind":"scalar","isRequired":true,"isList":false}},{"name":"published","args":[],"outputType":{"type":"Boolean","kind":"scalar","isRequired":true,"isList":false}},{"name":"title","args":[],"outputType":{"type":"String","kind":"scalar","isRequired":true,"isList":false}},{"name":"content","args":[],"outputType":{"type":"String","kind":"scalar","isRequired":false,"isList":false}},{"name":"author","args":[],"outputType":{"type":"User","kind":"object","isRequired":true,"isList":false}},{"name":"kind","args":[],"outputType":{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}}]},{"name":"User","fields":[{"name":"id","args":[],"outputType":{"type":"Int","kind":"scalar","isRequired":true,"isList":false}},{"name":"email","args":[],"outputType":{"type":"String","kind":"scalar","isRequired":true,"isList":false}},{"name":"name","args":[],"outputType":{"type":"String","kind":"scalar","isRequired":false,"isList":false}},{"name":"age","args":[],"outputType":{"type":"Int","kind":"scalar","isRequired":true,"isList":false}},{"name":"balance","args":[],"outputType":{"type":"Float","kind":"scalar","isRequired":true,"isList":false}},{"name":"amount","args":[],"outputType":{"type":"Float","kind":"scalar","isRequired":true,"isList":false}},{"name":"posts","args":[{"name":"where","inputType":[{"type"
:"PostWhereInput","kind":"object","isRequired":false,"isList":false}]},{"name":"orderBy","inputType":[{"isList":false,"isRequired":false,"type":"PostOrderByInput","kind":"object"}]},{"name":"skip","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"after","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"before","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"first","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"last","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":false,"isList":true}},{"name":"role","args":[],"outputType":{"type":"Role","kind":"enum","isRequired":true,"isList":false}}]},{"name":"AggregateUser","fields":[{"name":"count","args":[],"outputType":{"type":"Int","kind":"scalar","isRequired":true,"isList":false}}]},{"name":"AggregatePost","fields":[{"name":"count","args":[],"outputType":{"type":"Int","kind":"scalar","isRequired":true,"isList":false}}]},{"name":"Query","fields":[{"name":"findManyUser","args":[{"name":"where","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":false}]},{"name":"orderBy","inputType":[{"isList":false,"isRequired":false,"type":"UserOrderByInput","kind":"object"}]},{"name":"skip","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"after","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"before","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"first","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"last","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]}],"outputType":{"type":"User","kind":"object","isRequi
red":true,"isList":true}},{"name":"aggregateUser","args":[],"outputType":{"type":"AggregateUser","kind":"object","isRequired":true,"isList":false}},{"name":"findOneUser","args":[{"name":"where","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"User","kind":"object","isRequired":false,"isList":false}},{"name":"findManyPost","args":[{"name":"where","inputType":[{"type":"PostWhereInput","kind":"object","isRequired":false,"isList":false}]},{"name":"orderBy","inputType":[{"isList":false,"isRequired":false,"type":"PostOrderByInput","kind":"object"}]},{"name":"skip","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"after","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"before","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"first","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"last","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":true,"isList":true}},{"name":"aggregatePost","args":[],"outputType":{"type":"AggregatePost","kind":"object","isRequired":true,"isList":false}},{"name":"findOnePost","args":[{"name":"where","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":false,"isList":false}}]},{"name":"BatchPayload","fields":[{"name":"count","args":[],"outputType":{"type":"Int","kind":"scalar","isRequired":true,"isList":false}}]},{"name":"Mutation","fields":[{"name":"createOneUser","args":[{"name":"data","inputType":[{"type":"UserCreateInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"User","kind":"object","isRequired":true,"isList":false}},{"name":"deleteOneUser","args":[{"name":"where","inputT
ype":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"User","kind":"object","isRequired":false,"isList":false}},{"name":"updateOneUser","args":[{"name":"data","inputType":[{"type":"UserUpdateInput","kind":"object","isRequired":true,"isList":false}]},{"name":"where","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"User","kind":"object","isRequired":false,"isList":false}},{"name":"upsertOneUser","args":[{"name":"where","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]},{"name":"create","inputType":[{"type":"UserCreateInput","kind":"object","isRequired":true,"isList":false}]},{"name":"update","inputType":[{"type":"UserUpdateInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"User","kind":"object","isRequired":true,"isList":false}},{"name":"updateManyUser","args":[{"name":"data","inputType":[{"type":"UserUpdateManyMutationInput","kind":"object","isRequired":true,"isList":false}]},{"name":"where","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":false}]}],"outputType":{"type":"BatchPayload","kind":"object","isRequired":true,"isList":false}},{"name":"deleteManyUser","args":[{"name":"where","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":false}]}],"outputType":{"type":"BatchPayload","kind":"object","isRequired":true,"isList":false}},{"name":"createOnePost","args":[{"name":"data","inputType":[{"type":"PostCreateInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":true,"isList":false}},{"name":"deleteOnePost","args":[{"name":"where","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":false,"isList":false}},{"name":"updateOnePost","args":[{"name":"d
ata","inputType":[{"type":"PostUpdateInput","kind":"object","isRequired":true,"isList":false}]},{"name":"where","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":false,"isList":false}},{"name":"upsertOnePost","args":[{"name":"where","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]},{"name":"create","inputType":[{"type":"PostCreateInput","kind":"object","isRequired":true,"isList":false}]},{"name":"update","inputType":[{"type":"PostUpdateInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":true,"isList":false}},{"name":"updateManyPost","args":[{"name":"data","inputType":[{"type":"PostUpdateManyMutationInput","kind":"object","isRequired":true,"isList":false}]},{"name":"where","inputType":[{"type":"PostWhereInput","kind":"object","isRequired":false,"isList":false}]}],"outputType":{"type":"BatchPayload","kind":"object","isRequired":true,"isList":false}},{"name":"deleteManyPost","args":[{"name":"where","inputType":[{"type":"PostWhereInput","kind":"object","isRequired":false,"isList":false}]}],"outputType":{"type":"BatchPayload","kind":"object","isRequired":true,"isList":false}},{"name":"executeRaw","args":[{"name":"query","inputType":[{"type":"String","kind":"scalar","isRequired":true,"isList":false}]},{"name":"parameters","inputType":[{"type":"Json","kind":"scalar","isRequired":false,"isList":false}]}],"outputType":{"type":"Json","kind":"scalar","isRequired":true,"isList":false}}]}],"inputTypes":[{"name":"PostWhereInput","fields":[{"name":"uuid","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"StringFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"createdAt","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"},{"type":"DateTimeFilter","isList":fa
lse,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"updatedAt","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"},{"type":"DateTimeFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"published","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Boolean"},{"type":"BooleanFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"title","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"StringFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"content","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"NullableStringFilter","isList":false,"isRequired":false,"kind":"object"},{"type":"null","isList":false,"isRequired":false,"kind":"scalar"}],"isRelationFilter":false},{"name":"kind","inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"PostKind"},{"type":"NullablePostKindFilter","isList":false,"isRequired":false,"kind":"object"},{"type":"null","isList":false,"isRequired":false,"kind":"scalar"}],"isRelationFilter":false},{"name":"AND","inputType":[{"type":"PostWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"OR","inputType":[{"type":"PostWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"NOT","inputType":[{"type":"PostWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"author","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":false}],"isRelationFilter":true}],"isWhereType":true,"atLeastOne":false},{"name":"UserWhereInput","fields":[{"name":"id","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"},{"type":"IntFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false}
,{"name":"email","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"StringFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"name","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"NullableStringFilter","isList":false,"isRequired":false,"kind":"object"},{"type":"null","isList":false,"isRequired":false,"kind":"scalar"}],"isRelationFilter":false},{"name":"age","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"},{"type":"IntFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"balance","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"},{"type":"FloatFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"amount","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"},{"type":"FloatFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"posts","inputType":[{"type":"PostFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false,"nullEqualsUndefined":true},{"name":"role","inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"Role"},{"type":"RoleFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"AND","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"OR","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"NOT","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true}],"isWhereType":true,"atLeastOne":false},{"name":"UserWhereUniqueInput","fields":[{"name":"id","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"email","inputType":[{"type":"Stri
ng","kind":"scalar","isRequired":false,"isList":false}]}],"atLeastOne":true},{"name":"PostWhereUniqueInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]}],"atLeastOne":true},{"name":"PostCreateWithoutAuthorInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"createdAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"updatedAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"published","inputType":[{"type":"Boolean","kind":"scalar","isRequired":true,"isList":false}]},{"name":"title","inputType":[{"type":"String","kind":"scalar","isRequired":true,"isList":false}]},{"name":"content","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"kind","inputType":[{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}]}]},{"name":"PostCreateManyWithoutAuthorInput","fields":[{"name":"create","inputType":[{"type":"PostCreateWithoutAuthorInput","kind":"object","isRequired":false,"isList":true}]},{"name":"connect","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":true}]}]},{"name":"UserCreateInput","fields":[{"name":"email","inputType":[{"type":"String","kind":"scalar","isRequired":true,"isList":false}]},{"name":"name","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"age","inputType":[{"type":"Int","kind":"scalar","isRequired":true,"isList":false}]},{"name":"balance","inputType":[{"type":"Float","kind":"scalar","isRequired":true,"isList":false}]},{"name":"amount","inputType":[{"type":"Float","kind":"scalar","isRequired":true,"isList":false}]},{"name":"role","inputType":[{"type":"Role","kind":"enum","isRequired":true,"isList":false}]},{"name":"posts","inputType":[{"type":"PostCreateManyWithoutAuthorInput","kind":"object","isRequired"
:false,"isList":false}]}]},{"name":"PostUpdateWithoutAuthorDataInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"createdAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"updatedAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"published","inputType":[{"type":"Boolean","kind":"scalar","isRequired":false,"isList":false}]},{"name":"title","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"content","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"kind","inputType":[{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}]}]},{"name":"PostUpdateWithWhereUniqueWithoutAuthorInput","fields":[{"name":"where","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]},{"name":"data","inputType":[{"type":"PostUpdateWithoutAuthorDataInput","kind":"object","isRequired":true,"isList":false}]}]},{"name":"PostScalarWhereInput","fields":[{"name":"uuid","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"StringFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"createdAt","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"},{"type":"DateTimeFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"updatedAt","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"},{"type":"DateTimeFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"published","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Boolean"},{"type":"BooleanFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"title","inputType":[{"isList":false,"isRequi
red":false,"kind":"scalar","type":"String"},{"type":"StringFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"content","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"NullableStringFilter","isList":false,"isRequired":false,"kind":"object"},{"type":"null","isList":false,"isRequired":false,"kind":"scalar"}],"isRelationFilter":false},{"name":"kind","inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"PostKind"},{"type":"NullablePostKindFilter","isList":false,"isRequired":false,"kind":"object"},{"type":"null","isList":false,"isRequired":false,"kind":"scalar"}],"isRelationFilter":false},{"name":"AND","inputType":[{"type":"PostScalarWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"OR","inputType":[{"type":"PostScalarWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"NOT","inputType":[{"type":"PostScalarWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true}],"isWhereType":true,"atLeastOne":false},{"name":"PostUpdateManyDataInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"createdAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"updatedAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"published","inputType":[{"type":"Boolean","kind":"scalar","isRequired":false,"isList":false}]},{"name":"title","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"content","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"kind","inputType":[{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}]}]},{"name":"PostUpdateManyWithWhereNestedInput","fields":[{"name":"where","inputType":[{"type":"PostScalarWhereInp
ut","kind":"object","isRequired":true,"isList":false}]},{"name":"data","inputType":[{"type":"PostUpdateManyDataInput","kind":"object","isRequired":true,"isList":false}]}]},{"name":"PostUpsertWithWhereUniqueWithoutAuthorInput","fields":[{"name":"where","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]},{"name":"update","inputType":[{"type":"PostUpdateWithoutAuthorDataInput","kind":"object","isRequired":true,"isList":false}]},{"name":"create","inputType":[{"type":"PostCreateWithoutAuthorInput","kind":"object","isRequired":true,"isList":false}]}]},{"name":"PostUpdateManyWithoutAuthorInput","fields":[{"name":"create","inputType":[{"type":"PostCreateWithoutAuthorInput","kind":"object","isRequired":false,"isList":true}]},{"name":"connect","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":true}]},{"name":"set","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":true}]},{"name":"disconnect","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":true}]},{"name":"delete","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":true}]},{"name":"update","inputType":[{"type":"PostUpdateWithWhereUniqueWithoutAuthorInput","kind":"object","isRequired":false,"isList":true}]},{"name":"updateMany","inputType":[{"type":"PostUpdateManyWithWhereNestedInput","kind":"object","isRequired":false,"isList":true}]},{"name":"deleteMany","inputType":[{"type":"PostScalarWhereInput","kind":"object","isRequired":false,"isList":true}]},{"name":"upsert","inputType":[{"type":"PostUpsertWithWhereUniqueWithoutAuthorInput","kind":"object","isRequired":false,"isList":true}]}]},{"name":"UserUpdateInput","fields":[{"name":"id","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"email","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"name","inputType"
:[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"age","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"balance","inputType":[{"type":"Float","kind":"scalar","isRequired":false,"isList":false}]},{"name":"amount","inputType":[{"type":"Float","kind":"scalar","isRequired":false,"isList":false}]},{"name":"role","inputType":[{"type":"Role","kind":"enum","isRequired":false,"isList":false}]},{"name":"posts","inputType":[{"type":"PostUpdateManyWithoutAuthorInput","kind":"object","isRequired":false,"isList":false}]}]},{"name":"UserUpdateManyMutationInput","fields":[{"name":"id","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"email","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"name","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"age","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"balance","inputType":[{"type":"Float","kind":"scalar","isRequired":false,"isList":false}]},{"name":"amount","inputType":[{"type":"Float","kind":"scalar","isRequired":false,"isList":false}]},{"name":"role","inputType":[{"type":"Role","kind":"enum","isRequired":false,"isList":false}]}]},{"name":"UserCreateWithoutPostsInput","fields":[{"name":"email","inputType":[{"type":"String","kind":"scalar","isRequired":true,"isList":false}]},{"name":"name","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"age","inputType":[{"type":"Int","kind":"scalar","isRequired":true,"isList":false}]},{"name":"balance","inputType":[{"type":"Float","kind":"scalar","isRequired":true,"isList":false}]},{"name":"amount","inputType":[{"type":"Float","kind":"scalar","isRequired":true,"isList":false}]},{"name":"role","inputType":[{"type":"Role","kind":"enum","isRequired":true,"isList":false}]}]},{"name":"UserCreateOneWithoutPostsInput","fields":[{"name"
:"create","inputType":[{"type":"UserCreateWithoutPostsInput","kind":"object","isRequired":false,"isList":false}]},{"name":"connect","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]}]},{"name":"PostCreateInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"createdAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"updatedAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"published","inputType":[{"type":"Boolean","kind":"scalar","isRequired":true,"isList":false}]},{"name":"title","inputType":[{"type":"String","kind":"scalar","isRequired":true,"isList":false}]},{"name":"content","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"kind","inputType":[{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}]},{"name":"author","inputType":[{"type":"UserCreateOneWithoutPostsInput","kind":"object","isRequired":true,"isList":false}]}]},{"name":"UserUpdateWithoutPostsDataInput","fields":[{"name":"id","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"email","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"name","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"age","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"balance","inputType":[{"type":"Float","kind":"scalar","isRequired":false,"isList":false}]},{"name":"amount","inputType":[{"type":"Float","kind":"scalar","isRequired":false,"isList":false}]},{"name":"role","inputType":[{"type":"Role","kind":"enum","isRequired":false,"isList":false}]}]},{"name":"UserUpsertWithoutPostsInput","fields":[{"name":"update","inputType":[{"type":"UserUpdateWithoutPostsDataInput","kind":"object","isRequired":true,"isList":false}
]},{"name":"create","inputType":[{"type":"UserCreateWithoutPostsInput","kind":"object","isRequired":true,"isList":false}]}]},{"name":"UserUpdateOneRequiredWithoutPostsInput","fields":[{"name":"create","inputType":[{"type":"UserCreateWithoutPostsInput","kind":"object","isRequired":false,"isList":false}]},{"name":"connect","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"update","inputType":[{"type":"UserUpdateWithoutPostsDataInput","kind":"object","isRequired":false,"isList":false}]},{"name":"upsert","inputType":[{"type":"UserUpsertWithoutPostsInput","kind":"object","isRequired":false,"isList":false}]}]},{"name":"PostUpdateInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"createdAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"updatedAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"published","inputType":[{"type":"Boolean","kind":"scalar","isRequired":false,"isList":false}]},{"name":"title","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"content","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"kind","inputType":[{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}]},{"name":"author","inputType":[{"type":"UserUpdateOneRequiredWithoutPostsInput","kind":"object","isRequired":false,"isList":false}]}]},{"name":"PostUpdateManyMutationInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"createdAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"updatedAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"published","inputType":[{"type":"Boolean","kind":"scalar","isRequired":false,"isList":f
alse}]},{"name":"title","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"content","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"kind","inputType":[{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}]}]},{"name":"StringFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"isList":false,"isRequired":false,"kind":"scalar","type":"StringFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"lt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"lte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"gt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"gte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"contains","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"startsWith","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"endsWith","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]}],"atLeastOne":false},{"name":"DateTimeFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"}]},{"name":"not","isRelationFilter":false,"inp
utType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"},{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTimeFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"DateTime"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"DateTime"}]},{"name":"lt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"}]},{"name":"lte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"}]},{"name":"gt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"}]},{"name":"gte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"}]}],"atLeastOne":false},{"name":"BooleanFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Boolean"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Boolean"},{"isList":false,"isRequired":false,"kind":"scalar","type":"BooleanFilter"}]}],"atLeastOne":false},{"name":"NullableStringFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"isList":false,"isRequired":false,"kind":"scalar","type":"null"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"isList":false,"isRequired":false,"kind":"scalar","type":"null"},{"isList":false,"isRequired":false,"kind":"scalar","type":"NullableStringFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired
":false,"kind":"scalar","type":"String"}]},{"name":"lt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"lte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"gt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"gte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"contains","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"startsWith","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"endsWith","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]}],"atLeastOne":false},{"name":"NullablePostKindFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"PostKind"},{"isList":false,"isRequired":false,"kind":"enum","type":"null"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"PostKind"},{"isList":false,"isRequired":false,"kind":"enum","type":"null"},{"isList":false,"isRequired":false,"kind":"enum","type":"NullablePostKindFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"enum","type":"PostKind"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"enum","type":"PostKind"}]}],"atLeastOne":false},{"name":"IntFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"},{"isList":false,"isRequired":false
,"kind":"scalar","type":"IntFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"Int"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"Int"}]},{"name":"lt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"}]},{"name":"lte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"}]},{"name":"gt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"}]},{"name":"gte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"}]}],"atLeastOne":false},{"name":"FloatFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"},{"isList":false,"isRequired":false,"kind":"scalar","type":"FloatFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"Float"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"Float"}]},{"name":"lt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"}]},{"name":"lte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"}]},{"name":"gt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"}]},{"name":"gte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"}]}],"atLeastOne":false},{"name":"PostFilter","fields":[{"name":"every","isRelationFilter":true,"inputType":[{"isList":false,"isRequired":false,"kind":"object",
"type":"PostWhereInput"}]},{"name":"some","isRelationFilter":true,"inputType":[{"isList":false,"isRequired":false,"kind":"object","type":"PostWhereInput"}]},{"name":"none","isRelationFilter":true,"inputType":[{"isList":false,"isRequired":false,"kind":"object","type":"PostWhereInput"}]}],"atLeastOne":false},{"name":"RoleFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"Role"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"Role"},{"isList":false,"isRequired":false,"kind":"enum","type":"RoleFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"enum","type":"Role"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"enum","type":"Role"}]}],"atLeastOne":false},{"name":"UserOrderByInput","atLeastOne":true,"atMostOne":true,"isOrderType":true,"fields":[{"name":"id","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"email","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"name","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"age","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"balance","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"amount","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"role","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false}]},{"name":"PostOrderByInput","atLeastOne":true,"atMostOne":true,"isOrderType":true,"fields":[{"name":"uuid","inputType":[{"type
":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"createdAt","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"updatedAt","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"published","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"title","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"content","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"kind","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false}]}]}}'
```
Received error:
```
Unexpected token in JSON at position 158
    at JSON.parse (<anonymous>)
    at Object.<anonymous> (F:\#Projekty\#Github\typegraphql-prisma\experiments\prisma\generated\client\index.js:1011:19)
    at Module._compile (internal/modules/cjs/loader.js:778:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:789:10)
    at Module.load (internal/modules/cjs/loader.js:653:32)
    at tryModuleLoad (internal/modules/cjs/loader.js:593:12)
    at Function.Module._load (internal/modules/cjs/loader.js:585:3)
    at Module.require (internal/modules/cjs/loader.js:692:17)
    at require (internal/modules/cjs/helpers.js:25:18)
    at Object.generate [as onGenerate] (F:\#Projekty\#Github\typegraphql-prisma\src\cli\prisma-generator.ts:13:30)
    at GeneratorProcess.handleResponse (F:\#Projekty\#Github\typegraphql-prisma\experiments\node_modules\prisma2\build\index.js:2:177310)
    at LineStream.s.default.on.e (F:\#Projekty\#Github\typegraphql-prisma\experiments\node_modules\prisma2\build\index.js:2:176926)
    at LineStream.emit (events.js:198:13)
    at LineStream.EventEmitter.emit (domain.js:448:20)
    at addChunk (_stream_readable.js:288:12)
    at readableAddChunk (_stream_readable.js:269:11)
    at LineStream.Readable.push (_stream_readable.js:224:10)
    at LineStream.Transform.push (_stream_transform.js:151:32)
    at LineStream._pushBuffer (F:\#Projekty\#Github\typegraphql-prisma\experiments\node_modules\prisma2\build\index.js:2:840724)
    at LineStream._transform (F:\#Projekty\#Github\typegraphql-prisma\experiments\node_modules\prisma2\build\index.js:2:840543)
  code: -32000, data: null }
```
`JSON.parse(dmmfString)` fails on prisma schema with docs - Generating the Prisma Client produces a file that parses a JSON string containing the DMMF representation of the schema:
https://github.com/prisma/prisma-client-js/blob/4e67206f85e43d9a06e082027f77fe7a1c7f1d95/packages/photon/src/generation/TSClient.ts#L384-L387
Unfortunately, a raw `\r` character is invalid inside a JSON string, so parsing fails with an `Unexpected token in JSON` error. Simple example:
```js
// "\r" here is a real carriage-return character inside the JSON text,
// which JSON.parse rejects as an unescaped control character.
const str = '{ "field": "docs\r" }';
JSON.parse(str); // throws SyntaxError: Unexpected token in JSON
```
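One way the generator could avoid this whole class of error is to let `JSON.stringify` produce the embedded literal instead of splicing the raw DMMF string between quotes. This is only a sketch of the idea, not the actual Prisma fix, and the variable names are mine:

```js
// Hypothetical mitigation sketch: JSON.stringify escapes control characters
// such as \r into the two-character sequence \\r, yielding valid JSON.
const docs = "User model comment\r";          // doc string ending in a carriage return
const unsafe = '{ "field": "' + docs + '" }'; // raw \r embedded -> invalid JSON
const safe = JSON.stringify({ field: docs }); // \r escaped -> valid JSON

let unsafeParses = true;
try { JSON.parse(unsafe); } catch (e) { unsafeParses = false; }
console.log(unsafeParses);                    // false: raw control char rejected
console.log(JSON.parse(safe).field === docs); // true: round-trips cleanly
```

Per the JSON grammar, control characters (U+0000 through U+001F) must always be escaped inside strings, which is exactly what `JSON.stringify` guarantees.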
My Prisma schema:
```prisma
datasource db {
provider = "sqlite"
url = "file:../dev.db"
}
type Numeric = Float
generator client {
provider = "prisma-client-js"
binaryTargets = ["windows", "debian-openssl-1.1.x"]
output = "../prisma/generated/client"
}
generator typegraphql {
provider = "../src/cli/dev.ts"
output = "../prisma/generated/type-graphql"
emitDMMF = true
}
/// Role enum comment
enum Role {
// USER = "User"
USER
// ADMIN = "Admin"
ADMIN
}
/// User model comment
model User {
/// User model field comment
id Int @id @default(autoincrement())
email String @unique
name String?
age Int
balance Numeric
amount Float
posts Post[]
// maybePosts Post[]?
role Role
// address Address
// address2 embed {
// street String
// zipCode String
// }
}
// embed Address {
// street String
// zipCode String
// }
enum PostKind {
BLOG
ADVERT
}
model Post {
uuid String @default(cuid()) @id
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
published Boolean
title String
content String?
author User
// coAuthor User?
kind PostKind?
}
```
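A likely source of the stray `\r` (my assumption, consistent with the `"documentation":"Role enum comment\r"` entries in the DMMF dump below) is CRLF line endings on Windows: splitting the schema text on `"\n"` alone leaves a trailing `"\r"` on every line, and it survives into the extracted doc strings. A minimal demonstration:

```js
// Naive line handling of a CRLF-terminated schema line: the "\r" stays
// attached after splitting on "\n" and ends up in the documentation string.
const crlfSchemaLine = "/// Role enum comment\r\n";
const doc = crlfSchemaLine
  .split("\n")[0]             // "/// Role enum comment\r" -- "\r" still attached
  .replace(/^\/\/\/ ?/, "");  // strip the doc-comment marker
console.log(JSON.stringify(doc)); // "Role enum comment\r"
```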
Generated DMMF string:
```js
/**
* DMMF
*/
const dmmfString = '{"datamodel":{"enums":[{"name":"Role","values":[{"name":"USER","dbName":null},{"name":"ADMIN","dbName":null}],"dbName":null,"documentation":"Role enum comment\r"},{"name":"PostKind","values":[{"name":"BLOG","dbName":null},{"name":"ADVERT","dbName":null}],"dbName":null}],"models":[{"name":"User","isEmbedded":false,"dbName":null,"fields":[{"name":"id","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":true,"type":"Int","default":{"name":"autoincrement","returnType":"Int","args":[]},"isGenerated":false,"isUpdatedAt":false,"documentation":"User model field comment\r"},{"name":"email","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":true,"isId":false,"type":"String","isGenerated":false,"isUpdatedAt":false},{"name":"name","kind":"scalar","dbNames":[],"isList":false,"isRequired":false,"isUnique":false,"isId":false,"type":"String","isGenerated":false,"isUpdatedAt":false},{"name":"age","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"Int","isGenerated":false,"isUpdatedAt":false},{"name":"balance","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"Float","isGenerated":false,"isUpdatedAt":false},{"name":"amount","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"Float","isGenerated":false,"isUpdatedAt":false},{"name":"posts","kind":"object","dbNames":[],"isList":true,"isRequired":false,"isUnique":false,"isId":false,"type":"Post","relationName":"PostToUser","relationToFields":[],"relationOnDelete":"NONE","isGenerated":false,"isUpdatedAt":false},{"name":"role","kind":"enum","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"Role","isGenerated":false,"isUpdatedAt":false}],"isGenerated":false,"documentation":"User model 
comment\r","idFields":[]},{"name":"Post","isEmbedded":false,"dbName":null,"fields":[{"name":"uuid","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":true,"type":"String","default":{"name":"cuid","returnType":"String","args":[]},"isGenerated":false,"isUpdatedAt":false},{"name":"createdAt","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"DateTime","default":{"name":"now","returnType":"DateTime","args":[]},"isGenerated":false,"isUpdatedAt":false},{"name":"updatedAt","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"DateTime","isGenerated":false,"isUpdatedAt":true},{"name":"published","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"Boolean","isGenerated":false,"isUpdatedAt":false},{"name":"title","kind":"scalar","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"String","isGenerated":false,"isUpdatedAt":false},{"name":"content","kind":"scalar","dbNames":[],"isList":false,"isRequired":false,"isUnique":false,"isId":false,"type":"String","isGenerated":false,"isUpdatedAt":false},{"name":"author","kind":"object","dbNames":[],"isList":false,"isRequired":true,"isUnique":false,"isId":false,"type":"User","relationName":"PostToUser","relationToFields":["id"],"relationOnDelete":"NONE","isGenerated":false,"isUpdatedAt":false},{"name":"kind","kind":"enum","dbNames":[],"isList":false,"isRequired":false,"isUnique":false,"isId":false,"type":"PostKind","isGenerated":false,"isUpdatedAt":false}],"isGenerated":false,"idFields":[]}]},"mappings":[{"model":"User","plural":"users","findOne":"findOneUser","findMany":"findManyUser","create":"createOneUser","delete":"deleteOneUser","update":"updateOneUser","deleteMany":"deleteManyUser","updateMany":"updateManyUser","upsert":"upsertOneUser","aggregate":"aggregateUser"},{"model":"Post","plural":"posts","findOne":"findOnePost","fin
dMany":"findManyPost","create":"createOnePost","delete":"deleteOnePost","update":"updateOnePost","deleteMany":"deleteManyPost","updateMany":"updateManyPost","upsert":"upsertOnePost","aggregate":"aggregatePost"}],"schema":{"enums":[{"name":"OrderByArg","values":["asc","desc"]},{"name":"Role","values":["USER","ADMIN"]},{"name":"PostKind","values":["BLOG","ADVERT"]}],"outputTypes":[{"name":"Post","fields":[{"name":"uuid","args":[],"outputType":{"type":"String","kind":"scalar","isRequired":true,"isList":false}},{"name":"createdAt","args":[],"outputType":{"type":"DateTime","kind":"scalar","isRequired":true,"isList":false}},{"name":"updatedAt","args":[],"outputType":{"type":"DateTime","kind":"scalar","isRequired":true,"isList":false}},{"name":"published","args":[],"outputType":{"type":"Boolean","kind":"scalar","isRequired":true,"isList":false}},{"name":"title","args":[],"outputType":{"type":"String","kind":"scalar","isRequired":true,"isList":false}},{"name":"content","args":[],"outputType":{"type":"String","kind":"scalar","isRequired":false,"isList":false}},{"name":"author","args":[],"outputType":{"type":"User","kind":"object","isRequired":true,"isList":false}},{"name":"kind","args":[],"outputType":{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}}]},{"name":"User","fields":[{"name":"id","args":[],"outputType":{"type":"Int","kind":"scalar","isRequired":true,"isList":false}},{"name":"email","args":[],"outputType":{"type":"String","kind":"scalar","isRequired":true,"isList":false}},{"name":"name","args":[],"outputType":{"type":"String","kind":"scalar","isRequired":false,"isList":false}},{"name":"age","args":[],"outputType":{"type":"Int","kind":"scalar","isRequired":true,"isList":false}},{"name":"balance","args":[],"outputType":{"type":"Float","kind":"scalar","isRequired":true,"isList":false}},{"name":"amount","args":[],"outputType":{"type":"Float","kind":"scalar","isRequired":true,"isList":false}},{"name":"posts","args":[{"name":"where","inputType":[{"type"
:"PostWhereInput","kind":"object","isRequired":false,"isList":false}]},{"name":"orderBy","inputType":[{"isList":false,"isRequired":false,"type":"PostOrderByInput","kind":"object"}]},{"name":"skip","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"after","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"before","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"first","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"last","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":false,"isList":true}},{"name":"role","args":[],"outputType":{"type":"Role","kind":"enum","isRequired":true,"isList":false}}]},{"name":"AggregateUser","fields":[{"name":"count","args":[],"outputType":{"type":"Int","kind":"scalar","isRequired":true,"isList":false}}]},{"name":"AggregatePost","fields":[{"name":"count","args":[],"outputType":{"type":"Int","kind":"scalar","isRequired":true,"isList":false}}]},{"name":"Query","fields":[{"name":"findManyUser","args":[{"name":"where","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":false}]},{"name":"orderBy","inputType":[{"isList":false,"isRequired":false,"type":"UserOrderByInput","kind":"object"}]},{"name":"skip","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"after","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"before","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"first","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"last","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]}],"outputType":{"type":"User","kind":"object","isRequi
red":true,"isList":true}},{"name":"aggregateUser","args":[],"outputType":{"type":"AggregateUser","kind":"object","isRequired":true,"isList":false}},{"name":"findOneUser","args":[{"name":"where","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"User","kind":"object","isRequired":false,"isList":false}},{"name":"findManyPost","args":[{"name":"where","inputType":[{"type":"PostWhereInput","kind":"object","isRequired":false,"isList":false}]},{"name":"orderBy","inputType":[{"isList":false,"isRequired":false,"type":"PostOrderByInput","kind":"object"}]},{"name":"skip","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"after","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"before","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"first","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"last","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":true,"isList":true}},{"name":"aggregatePost","args":[],"outputType":{"type":"AggregatePost","kind":"object","isRequired":true,"isList":false}},{"name":"findOnePost","args":[{"name":"where","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":false,"isList":false}}]},{"name":"BatchPayload","fields":[{"name":"count","args":[],"outputType":{"type":"Int","kind":"scalar","isRequired":true,"isList":false}}]},{"name":"Mutation","fields":[{"name":"createOneUser","args":[{"name":"data","inputType":[{"type":"UserCreateInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"User","kind":"object","isRequired":true,"isList":false}},{"name":"deleteOneUser","args":[{"name":"where","inputT
ype":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"User","kind":"object","isRequired":false,"isList":false}},{"name":"updateOneUser","args":[{"name":"data","inputType":[{"type":"UserUpdateInput","kind":"object","isRequired":true,"isList":false}]},{"name":"where","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"User","kind":"object","isRequired":false,"isList":false}},{"name":"upsertOneUser","args":[{"name":"where","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]},{"name":"create","inputType":[{"type":"UserCreateInput","kind":"object","isRequired":true,"isList":false}]},{"name":"update","inputType":[{"type":"UserUpdateInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"User","kind":"object","isRequired":true,"isList":false}},{"name":"updateManyUser","args":[{"name":"data","inputType":[{"type":"UserUpdateManyMutationInput","kind":"object","isRequired":true,"isList":false}]},{"name":"where","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":false}]}],"outputType":{"type":"BatchPayload","kind":"object","isRequired":true,"isList":false}},{"name":"deleteManyUser","args":[{"name":"where","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":false}]}],"outputType":{"type":"BatchPayload","kind":"object","isRequired":true,"isList":false}},{"name":"createOnePost","args":[{"name":"data","inputType":[{"type":"PostCreateInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":true,"isList":false}},{"name":"deleteOnePost","args":[{"name":"where","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":false,"isList":false}},{"name":"updateOnePost","args":[{"name":"d
ata","inputType":[{"type":"PostUpdateInput","kind":"object","isRequired":true,"isList":false}]},{"name":"where","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":false,"isList":false}},{"name":"upsertOnePost","args":[{"name":"where","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]},{"name":"create","inputType":[{"type":"PostCreateInput","kind":"object","isRequired":true,"isList":false}]},{"name":"update","inputType":[{"type":"PostUpdateInput","kind":"object","isRequired":true,"isList":false}]}],"outputType":{"type":"Post","kind":"object","isRequired":true,"isList":false}},{"name":"updateManyPost","args":[{"name":"data","inputType":[{"type":"PostUpdateManyMutationInput","kind":"object","isRequired":true,"isList":false}]},{"name":"where","inputType":[{"type":"PostWhereInput","kind":"object","isRequired":false,"isList":false}]}],"outputType":{"type":"BatchPayload","kind":"object","isRequired":true,"isList":false}},{"name":"deleteManyPost","args":[{"name":"where","inputType":[{"type":"PostWhereInput","kind":"object","isRequired":false,"isList":false}]}],"outputType":{"type":"BatchPayload","kind":"object","isRequired":true,"isList":false}},{"name":"executeRaw","args":[{"name":"query","inputType":[{"type":"String","kind":"scalar","isRequired":true,"isList":false}]},{"name":"parameters","inputType":[{"type":"Json","kind":"scalar","isRequired":false,"isList":false}]}],"outputType":{"type":"Json","kind":"scalar","isRequired":true,"isList":false}}]}],"inputTypes":[{"name":"PostWhereInput","fields":[{"name":"uuid","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"StringFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"createdAt","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"},{"type":"DateTimeFilter","isList":fa
lse,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"updatedAt","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"},{"type":"DateTimeFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"published","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Boolean"},{"type":"BooleanFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"title","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"StringFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"content","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"NullableStringFilter","isList":false,"isRequired":false,"kind":"object"},{"type":"null","isList":false,"isRequired":false,"kind":"scalar"}],"isRelationFilter":false},{"name":"kind","inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"PostKind"},{"type":"NullablePostKindFilter","isList":false,"isRequired":false,"kind":"object"},{"type":"null","isList":false,"isRequired":false,"kind":"scalar"}],"isRelationFilter":false},{"name":"AND","inputType":[{"type":"PostWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"OR","inputType":[{"type":"PostWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"NOT","inputType":[{"type":"PostWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"author","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":false}],"isRelationFilter":true}],"isWhereType":true,"atLeastOne":false},{"name":"UserWhereInput","fields":[{"name":"id","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"},{"type":"IntFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false}
,{"name":"email","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"StringFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"name","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"NullableStringFilter","isList":false,"isRequired":false,"kind":"object"},{"type":"null","isList":false,"isRequired":false,"kind":"scalar"}],"isRelationFilter":false},{"name":"age","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"},{"type":"IntFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"balance","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"},{"type":"FloatFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"amount","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"},{"type":"FloatFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"posts","inputType":[{"type":"PostFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false,"nullEqualsUndefined":true},{"name":"role","inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"Role"},{"type":"RoleFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"AND","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"OR","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"NOT","inputType":[{"type":"UserWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true}],"isWhereType":true,"atLeastOne":false},{"name":"UserWhereUniqueInput","fields":[{"name":"id","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"email","inputType":[{"type":"Stri
ng","kind":"scalar","isRequired":false,"isList":false}]}],"atLeastOne":true},{"name":"PostWhereUniqueInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]}],"atLeastOne":true},{"name":"PostCreateWithoutAuthorInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"createdAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"updatedAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"published","inputType":[{"type":"Boolean","kind":"scalar","isRequired":true,"isList":false}]},{"name":"title","inputType":[{"type":"String","kind":"scalar","isRequired":true,"isList":false}]},{"name":"content","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"kind","inputType":[{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}]}]},{"name":"PostCreateManyWithoutAuthorInput","fields":[{"name":"create","inputType":[{"type":"PostCreateWithoutAuthorInput","kind":"object","isRequired":false,"isList":true}]},{"name":"connect","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":true}]}]},{"name":"UserCreateInput","fields":[{"name":"email","inputType":[{"type":"String","kind":"scalar","isRequired":true,"isList":false}]},{"name":"name","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"age","inputType":[{"type":"Int","kind":"scalar","isRequired":true,"isList":false}]},{"name":"balance","inputType":[{"type":"Float","kind":"scalar","isRequired":true,"isList":false}]},{"name":"amount","inputType":[{"type":"Float","kind":"scalar","isRequired":true,"isList":false}]},{"name":"role","inputType":[{"type":"Role","kind":"enum","isRequired":true,"isList":false}]},{"name":"posts","inputType":[{"type":"PostCreateManyWithoutAuthorInput","kind":"object","isRequired"
:false,"isList":false}]}]},{"name":"PostUpdateWithoutAuthorDataInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"createdAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"updatedAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"published","inputType":[{"type":"Boolean","kind":"scalar","isRequired":false,"isList":false}]},{"name":"title","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"content","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"kind","inputType":[{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}]}]},{"name":"PostUpdateWithWhereUniqueWithoutAuthorInput","fields":[{"name":"where","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]},{"name":"data","inputType":[{"type":"PostUpdateWithoutAuthorDataInput","kind":"object","isRequired":true,"isList":false}]}]},{"name":"PostScalarWhereInput","fields":[{"name":"uuid","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"StringFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"createdAt","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"},{"type":"DateTimeFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"updatedAt","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"},{"type":"DateTimeFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"published","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Boolean"},{"type":"BooleanFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"title","inputType":[{"isList":false,"isRequi
red":false,"kind":"scalar","type":"String"},{"type":"StringFilter","isList":false,"isRequired":false,"kind":"object"}],"isRelationFilter":false},{"name":"content","inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"type":"NullableStringFilter","isList":false,"isRequired":false,"kind":"object"},{"type":"null","isList":false,"isRequired":false,"kind":"scalar"}],"isRelationFilter":false},{"name":"kind","inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"PostKind"},{"type":"NullablePostKindFilter","isList":false,"isRequired":false,"kind":"object"},{"type":"null","isList":false,"isRequired":false,"kind":"scalar"}],"isRelationFilter":false},{"name":"AND","inputType":[{"type":"PostScalarWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"OR","inputType":[{"type":"PostScalarWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true},{"name":"NOT","inputType":[{"type":"PostScalarWhereInput","kind":"object","isRequired":false,"isList":true}],"isRelationFilter":true}],"isWhereType":true,"atLeastOne":false},{"name":"PostUpdateManyDataInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"createdAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"updatedAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"published","inputType":[{"type":"Boolean","kind":"scalar","isRequired":false,"isList":false}]},{"name":"title","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"content","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"kind","inputType":[{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}]}]},{"name":"PostUpdateManyWithWhereNestedInput","fields":[{"name":"where","inputType":[{"type":"PostScalarWhereInp
ut","kind":"object","isRequired":true,"isList":false}]},{"name":"data","inputType":[{"type":"PostUpdateManyDataInput","kind":"object","isRequired":true,"isList":false}]}]},{"name":"PostUpsertWithWhereUniqueWithoutAuthorInput","fields":[{"name":"where","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":true,"isList":false}]},{"name":"update","inputType":[{"type":"PostUpdateWithoutAuthorDataInput","kind":"object","isRequired":true,"isList":false}]},{"name":"create","inputType":[{"type":"PostCreateWithoutAuthorInput","kind":"object","isRequired":true,"isList":false}]}]},{"name":"PostUpdateManyWithoutAuthorInput","fields":[{"name":"create","inputType":[{"type":"PostCreateWithoutAuthorInput","kind":"object","isRequired":false,"isList":true}]},{"name":"connect","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":true}]},{"name":"set","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":true}]},{"name":"disconnect","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":true}]},{"name":"delete","inputType":[{"type":"PostWhereUniqueInput","kind":"object","isRequired":false,"isList":true}]},{"name":"update","inputType":[{"type":"PostUpdateWithWhereUniqueWithoutAuthorInput","kind":"object","isRequired":false,"isList":true}]},{"name":"updateMany","inputType":[{"type":"PostUpdateManyWithWhereNestedInput","kind":"object","isRequired":false,"isList":true}]},{"name":"deleteMany","inputType":[{"type":"PostScalarWhereInput","kind":"object","isRequired":false,"isList":true}]},{"name":"upsert","inputType":[{"type":"PostUpsertWithWhereUniqueWithoutAuthorInput","kind":"object","isRequired":false,"isList":true}]}]},{"name":"UserUpdateInput","fields":[{"name":"id","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"email","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"name","inputType"
:[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"age","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"balance","inputType":[{"type":"Float","kind":"scalar","isRequired":false,"isList":false}]},{"name":"amount","inputType":[{"type":"Float","kind":"scalar","isRequired":false,"isList":false}]},{"name":"role","inputType":[{"type":"Role","kind":"enum","isRequired":false,"isList":false}]},{"name":"posts","inputType":[{"type":"PostUpdateManyWithoutAuthorInput","kind":"object","isRequired":false,"isList":false}]}]},{"name":"UserUpdateManyMutationInput","fields":[{"name":"id","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"email","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"name","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"age","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"balance","inputType":[{"type":"Float","kind":"scalar","isRequired":false,"isList":false}]},{"name":"amount","inputType":[{"type":"Float","kind":"scalar","isRequired":false,"isList":false}]},{"name":"role","inputType":[{"type":"Role","kind":"enum","isRequired":false,"isList":false}]}]},{"name":"UserCreateWithoutPostsInput","fields":[{"name":"email","inputType":[{"type":"String","kind":"scalar","isRequired":true,"isList":false}]},{"name":"name","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"age","inputType":[{"type":"Int","kind":"scalar","isRequired":true,"isList":false}]},{"name":"balance","inputType":[{"type":"Float","kind":"scalar","isRequired":true,"isList":false}]},{"name":"amount","inputType":[{"type":"Float","kind":"scalar","isRequired":true,"isList":false}]},{"name":"role","inputType":[{"type":"Role","kind":"enum","isRequired":true,"isList":false}]}]},{"name":"UserCreateOneWithoutPostsInput","fields":[{"name"
:"create","inputType":[{"type":"UserCreateWithoutPostsInput","kind":"object","isRequired":false,"isList":false}]},{"name":"connect","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]}]},{"name":"PostCreateInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"createdAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"updatedAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"published","inputType":[{"type":"Boolean","kind":"scalar","isRequired":true,"isList":false}]},{"name":"title","inputType":[{"type":"String","kind":"scalar","isRequired":true,"isList":false}]},{"name":"content","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"kind","inputType":[{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}]},{"name":"author","inputType":[{"type":"UserCreateOneWithoutPostsInput","kind":"object","isRequired":true,"isList":false}]}]},{"name":"UserUpdateWithoutPostsDataInput","fields":[{"name":"id","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"email","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"name","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"age","inputType":[{"type":"Int","kind":"scalar","isRequired":false,"isList":false}]},{"name":"balance","inputType":[{"type":"Float","kind":"scalar","isRequired":false,"isList":false}]},{"name":"amount","inputType":[{"type":"Float","kind":"scalar","isRequired":false,"isList":false}]},{"name":"role","inputType":[{"type":"Role","kind":"enum","isRequired":false,"isList":false}]}]},{"name":"UserUpsertWithoutPostsInput","fields":[{"name":"update","inputType":[{"type":"UserUpdateWithoutPostsDataInput","kind":"object","isRequired":true,"isList":false}
]},{"name":"create","inputType":[{"type":"UserCreateWithoutPostsInput","kind":"object","isRequired":true,"isList":false}]}]},{"name":"UserUpdateOneRequiredWithoutPostsInput","fields":[{"name":"create","inputType":[{"type":"UserCreateWithoutPostsInput","kind":"object","isRequired":false,"isList":false}]},{"name":"connect","inputType":[{"type":"UserWhereUniqueInput","kind":"object","isRequired":false,"isList":false}]},{"name":"update","inputType":[{"type":"UserUpdateWithoutPostsDataInput","kind":"object","isRequired":false,"isList":false}]},{"name":"upsert","inputType":[{"type":"UserUpsertWithoutPostsInput","kind":"object","isRequired":false,"isList":false}]}]},{"name":"PostUpdateInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"createdAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"updatedAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"published","inputType":[{"type":"Boolean","kind":"scalar","isRequired":false,"isList":false}]},{"name":"title","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"content","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"kind","inputType":[{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}]},{"name":"author","inputType":[{"type":"UserUpdateOneRequiredWithoutPostsInput","kind":"object","isRequired":false,"isList":false}]}]},{"name":"PostUpdateManyMutationInput","fields":[{"name":"uuid","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"createdAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"updatedAt","inputType":[{"type":"DateTime","kind":"scalar","isRequired":false,"isList":false}]},{"name":"published","inputType":[{"type":"Boolean","kind":"scalar","isRequired":false,"isList":f
alse}]},{"name":"title","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"content","inputType":[{"type":"String","kind":"scalar","isRequired":false,"isList":false}]},{"name":"kind","inputType":[{"type":"PostKind","kind":"enum","isRequired":false,"isList":false}]}]},{"name":"StringFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"isList":false,"isRequired":false,"kind":"scalar","type":"StringFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"lt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"lte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"gt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"gte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"contains","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"startsWith","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"endsWith","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]}],"atLeastOne":false},{"name":"DateTimeFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"}]},{"name":"not","isRelationFilter":false,"inp
utType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"},{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTimeFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"DateTime"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"DateTime"}]},{"name":"lt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"}]},{"name":"lte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"}]},{"name":"gt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"}]},{"name":"gte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"DateTime"}]}],"atLeastOne":false},{"name":"BooleanFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Boolean"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Boolean"},{"isList":false,"isRequired":false,"kind":"scalar","type":"BooleanFilter"}]}],"atLeastOne":false},{"name":"NullableStringFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"isList":false,"isRequired":false,"kind":"scalar","type":"null"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"},{"isList":false,"isRequired":false,"kind":"scalar","type":"null"},{"isList":false,"isRequired":false,"kind":"scalar","type":"NullableStringFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired
":false,"kind":"scalar","type":"String"}]},{"name":"lt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"lte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"gt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"gte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"contains","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"startsWith","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]},{"name":"endsWith","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"String"}]}],"atLeastOne":false},{"name":"NullablePostKindFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"PostKind"},{"isList":false,"isRequired":false,"kind":"enum","type":"null"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"PostKind"},{"isList":false,"isRequired":false,"kind":"enum","type":"null"},{"isList":false,"isRequired":false,"kind":"enum","type":"NullablePostKindFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"enum","type":"PostKind"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"enum","type":"PostKind"}]}],"atLeastOne":false},{"name":"IntFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"},{"isList":false,"isRequired":false
,"kind":"scalar","type":"IntFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"Int"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"Int"}]},{"name":"lt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"}]},{"name":"lte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"}]},{"name":"gt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"}]},{"name":"gte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Int"}]}],"atLeastOne":false},{"name":"FloatFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"},{"isList":false,"isRequired":false,"kind":"scalar","type":"FloatFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"Float"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"scalar","type":"Float"}]},{"name":"lt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"}]},{"name":"lte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"}]},{"name":"gt","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"}]},{"name":"gte","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"scalar","type":"Float"}]}],"atLeastOne":false},{"name":"PostFilter","fields":[{"name":"every","isRelationFilter":true,"inputType":[{"isList":false,"isRequired":false,"kind":"object",
"type":"PostWhereInput"}]},{"name":"some","isRelationFilter":true,"inputType":[{"isList":false,"isRequired":false,"kind":"object","type":"PostWhereInput"}]},{"name":"none","isRelationFilter":true,"inputType":[{"isList":false,"isRequired":false,"kind":"object","type":"PostWhereInput"}]}],"atLeastOne":false},{"name":"RoleFilter","fields":[{"name":"equals","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"Role"}]},{"name":"not","isRelationFilter":false,"inputType":[{"isList":false,"isRequired":false,"kind":"enum","type":"Role"},{"isList":false,"isRequired":false,"kind":"enum","type":"RoleFilter"}]},{"name":"in","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"enum","type":"Role"}]},{"name":"notIn","isRelationFilter":false,"inputType":[{"isList":true,"isRequired":false,"kind":"enum","type":"Role"}]}],"atLeastOne":false},{"name":"UserOrderByInput","atLeastOne":true,"atMostOne":true,"isOrderType":true,"fields":[{"name":"id","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"email","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"name","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"age","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"balance","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"amount","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"role","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false}]},{"name":"PostOrderByInput","atLeastOne":true,"atMostOne":true,"isOrderType":true,"fields":[{"name":"uuid","inputType":[{"type
":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"createdAt","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"updatedAt","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"published","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"title","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"content","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false},{"name":"kind","inputType":[{"type":"OrderByArg","isList":false,"isRequired":false,"kind":"enum"}],"isRelationFilter":false}]}]}}'
```
Received error:
```
Unexpected token in JSON at position 158
    at JSON.parse (<anonymous>)
    at Object.<anonymous> (F:\#Projekty\#Github\typegraphql-prisma\experiments\prisma\generated\client\index.js:1011:19)
    at Module._compile (internal/modules/cjs/loader.js:778:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:789:10)
    at Module.load (internal/modules/cjs/loader.js:653:32)
    at tryModuleLoad (internal/modules/cjs/loader.js:593:12)
    at Function.Module._load (internal/modules/cjs/loader.js:585:3)
    at Module.require (internal/modules/cjs/loader.js:692:17)
    at require (internal/modules/cjs/helpers.js:25:18)
    at Object.generate [as onGenerate] (F:\#Projekty\#Github\typegraphql-prisma\src\cli\prisma-generator.ts:13:30)
    at GeneratorProcess.handleResponse (F:\#Projekty\#Github\typegraphql-prisma\experiments\node_modules\prisma2\build\index.js:2:177310)
    at LineStream.s.default.on.e (F:\#Projekty\#Github\typegraphql-prisma\experiments\node_modules\prisma2\build\index.js:2:176926)
    at LineStream.emit (events.js:198:13)
    at LineStream.EventEmitter.emit (domain.js:448:20)
    at addChunk (_stream_readable.js:288:12)
    at readableAddChunk (_stream_readable.js:269:11)
    at LineStream.Readable.push (_stream_readable.js:224:10)
    at LineStream.Transform.push (_stream_transform.js:151:32)
    at LineStream._pushBuffer (F:\#Projekty\#Github\typegraphql-prisma\experiments\node_modules\prisma2\build\index.js:2:840724)
    at LineStream._transform (F:\#Projekty\#Github\typegraphql-prisma\experiments\node_modules\prisma2\build\index.js:2:840543)
  code: -32000, data: null }
```
inputtype isrelationfilter false name and inputtype isrelationfilter true name or inputtype isrelationfilter true name not inputtype isrelationfilter true name author inputtype isrelationfilter true iswheretype true atleastone false name userwhereinput fields isrelationfilter false name email inputtype isrelationfilter false name name inputtype isrelationfilter false name age inputtype isrelationfilter false name balance inputtype isrelationfilter false name amount inputtype isrelationfilter false name posts inputtype isrelationfilter false nullequalsundefined true name role inputtype isrelationfilter false name and inputtype isrelationfilter true name or inputtype isrelationfilter true name not inputtype isrelationfilter true iswheretype true atleastone false name userwhereuniqueinput fields name email inputtype atleastone true name postwhereuniqueinput fields atleastone true name postcreatewithoutauthorinput fields name createdat inputtype name updatedat inputtype name published inputtype name title inputtype name content inputtype name kind inputtype name postcreatemanywithoutauthorinput fields name connect inputtype name usercreateinput fields name name inputtype name age inputtype name balance inputtype name amount inputtype name role inputtype name posts inputtype name postupdatewithoutauthordatainput fields name createdat inputtype name updatedat inputtype name published inputtype name title inputtype name content inputtype name kind inputtype name postupdatewithwhereuniquewithoutauthorinput fields name data inputtype name postscalarwhereinput fields isrelationfilter false name createdat inputtype isrelationfilter false name updatedat inputtype isrelationfilter false name published inputtype isrelationfilter false name title inputtype isrelationfilter false name content inputtype isrelationfilter false name kind inputtype isrelationfilter false name and inputtype isrelationfilter true name or inputtype isrelationfilter true name not inputtype 
isrelationfilter true iswheretype true atleastone false name postupdatemanydatainput fields name createdat inputtype name updatedat inputtype name published inputtype name title inputtype name content inputtype name kind inputtype name postupdatemanywithwherenestedinput fields name data inputtype name postupsertwithwhereuniquewithoutauthorinput fields name update inputtype name create inputtype name postupdatemanywithoutauthorinput fields name connect inputtype name set inputtype name disconnect inputtype name delete inputtype name update inputtype name updatemany inputtype name deletemany inputtype name upsert inputtype name userupdateinput fields name email inputtype name name inputtype name age inputtype name balance inputtype name amount inputtype name role inputtype name posts inputtype name userupdatemanymutationinput fields name email inputtype name name inputtype name age inputtype name balance inputtype name amount inputtype name role inputtype name usercreatewithoutpostsinput fields name name inputtype name age inputtype name balance inputtype name amount inputtype name role inputtype name usercreateonewithoutpostsinput fields name connect inputtype name postcreateinput fields name createdat inputtype name updatedat inputtype name published inputtype name title inputtype name content inputtype name kind inputtype name author inputtype name userupdatewithoutpostsdatainput fields name email inputtype name name inputtype name age inputtype name balance inputtype name amount inputtype name role inputtype name userupsertwithoutpostsinput fields name create inputtype name userupdateonerequiredwithoutpostsinput fields name connect inputtype name update inputtype name upsert inputtype name postupdateinput fields name createdat inputtype name updatedat inputtype name published inputtype name title inputtype name content inputtype name kind inputtype name author inputtype name postupdatemanymutationinput fields name createdat inputtype name updatedat inputtype name 
published inputtype name title inputtype name content inputtype name kind inputtype name stringfilter fields name not isrelationfilter false inputtype name in isrelationfilter false inputtype name notin isrelationfilter false inputtype name lt isrelationfilter false inputtype name lte isrelationfilter false inputtype name gt isrelationfilter false inputtype name gte isrelationfilter false inputtype name contains isrelationfilter false inputtype name startswith isrelationfilter false inputtype name endswith isrelationfilter false inputtype atleastone false name datetimefilter fields name not isrelationfilter false inputtype name in isrelationfilter false inputtype name notin isrelationfilter false inputtype name lt isrelationfilter false inputtype name lte isrelationfilter false inputtype name gt isrelationfilter false inputtype name gte isrelationfilter false inputtype atleastone false name booleanfilter fields name not isrelationfilter false inputtype atleastone false name nullablestringfilter fields name not isrelationfilter false inputtype name in isrelationfilter false inputtype name notin isrelationfilter false inputtype name lt isrelationfilter false inputtype name lte isrelationfilter false inputtype name gt isrelationfilter false inputtype name gte isrelationfilter false inputtype name contains isrelationfilter false inputtype name startswith isrelationfilter false inputtype name endswith isrelationfilter false inputtype atleastone false name nullablepostkindfilter fields name not isrelationfilter false inputtype name in isrelationfilter false inputtype name notin isrelationfilter false inputtype atleastone false name intfilter fields name not isrelationfilter false inputtype name in isrelationfilter false inputtype name notin isrelationfilter false inputtype name lt isrelationfilter false inputtype name lte isrelationfilter false inputtype name gt isrelationfilter false inputtype name gte isrelationfilter false inputtype atleastone false name floatfilter 
fields name not isrelationfilter false inputtype name in isrelationfilter false inputtype name notin isrelationfilter false inputtype name lt isrelationfilter false inputtype name lte isrelationfilter false inputtype name gt isrelationfilter false inputtype name gte isrelationfilter false inputtype atleastone false name postfilter fields name some isrelationfilter true inputtype name none isrelationfilter true inputtype atleastone false name rolefilter fields name not isrelationfilter false inputtype name in isrelationfilter false inputtype name notin isrelationfilter false inputtype atleastone false name userorderbyinput atleastone true atmostone true isordertype true fields isrelationfilter false name email inputtype isrelationfilter false name name inputtype isrelationfilter false name age inputtype isrelationfilter false name balance inputtype isrelationfilter false name amount inputtype isrelationfilter false name role inputtype isrelationfilter false name postorderbyinput atleastone true atmostone true isordertype true fields isrelationfilter false name createdat inputtype isrelationfilter false name updatedat inputtype isrelationfilter false name published inputtype isrelationfilter false name title inputtype isrelationfilter false name content inputtype isrelationfilter false name kind inputtype isrelationfilter false received error in json at position token at json parse at object f projekty github typegraphql prisma experiments prisma generated client index js phq at module compile internal modules cjs loader js at object module extensions js internal modules loader js at module load internal modules cjs loader js at trymoduleload internal modules cjs loader js at function module load internal modules cjs js at module require internal modules cjs loader js at require internal modules cjs helpers js at object generate f projekty github typegraphql prisma src cli prisma generator ts ith at generatorprocess handleresponse f projekty g ithub typegraphql 
prisma experiments node modules build index js bui at linestream s default on e f projekty github typegraphql prisma experiments node modules buitypld index js ind at linestream emit events js at linestream eventemitter emit domain js at addchunk stream readable js at readableaddchunk stream readable js at linestream readable push stream readable js at linestream transform push stream transform js at linestream pushbuffer f projekty github typegraphql prisma experiments node modules build egrindex js ex at linestream transform f projekty github typegraphql prisma experiments node modules build igrandex js code data null
| 1
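The record above describes `JSON.parse` failing on a generated Prisma client file because a raw carriage return ended up inside the embedded DMMF JSON string. A minimal sketch of the failure and the usual generator-side fix (the field name below is illustrative; the real generator serializes the whole DMMF document):

```javascript
// A raw carriage return (U+000D) inside a JSON string literal is an
// unescaped control character, which the JSON grammar forbids — so
// JSON.parse throws "Unexpected token", matching the stack trace above.
const bad = '{"documentation": "User model field comment\r"}'; // raw \r inside the JSON string

let parseFailed = false;
try {
  JSON.parse(bad);
} catch (e) {
  parseFailed = true; // SyntaxError: Unexpected token
}

// Fix on the generator side: serialize with JSON.stringify, which emits
// the two-character escape sequence \r instead of the raw control char.
const good = JSON.stringify({ documentation: "User model field comment\r" });
const parsed = JSON.parse(good); // round-trips cleanly

console.log(parseFailed, parsed.documentation.endsWith("\r")); // true true
```

The same applies to `\n` and every other character below U+0020: inside a JSON string they must appear as escape sequences, never as raw bytes.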
|
126,535
| 17,930,043,904
|
IssuesEvent
|
2021-09-10 08:01:44
|
Daniel-luu-7/eko-sport
|
https://api.github.com/repos/Daniel-luu-7/eko-sport
|
closed
|
CVE-2021-33502 (High) detected in normalize-url-1.9.1.tgz, normalize-url-3.3.0.tgz
|
security vulnerability
|
## CVE-2021-33502 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>normalize-url-1.9.1.tgz</b>, <b>normalize-url-3.3.0.tgz</b></p></summary>
<p>
<details><summary><b>normalize-url-1.9.1.tgz</b></p></summary>
<p>Normalize a URL</p>
<p>Library home page: <a href="https://registry.npmjs.org/normalize-url/-/normalize-url-1.9.1.tgz">https://registry.npmjs.org/normalize-url/-/normalize-url-1.9.1.tgz</a></p>
<p>Path to dependency file: eko-sport/package.json</p>
<p>Path to vulnerable library: eko-sport/node_modules/normalize-url/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-4.0.3.tgz (Root Library)
- mini-css-extract-plugin-0.11.3.tgz
- :x: **normalize-url-1.9.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>normalize-url-3.3.0.tgz</b></p></summary>
<p>Normalize a URL</p>
<p>Library home page: <a href="https://registry.npmjs.org/normalize-url/-/normalize-url-3.3.0.tgz">https://registry.npmjs.org/normalize-url/-/normalize-url-3.3.0.tgz</a></p>
<p>Path to dependency file: eko-sport/package.json</p>
<p>Path to vulnerable library: eko-sport/node_modules/postcss-normalize-url/node_modules/normalize-url/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-4.0.3.tgz (Root Library)
- optimize-css-assets-webpack-plugin-5.0.4.tgz
- cssnano-4.1.11.tgz
- cssnano-preset-default-4.0.8.tgz
- postcss-normalize-url-4.0.1.tgz
- :x: **normalize-url-3.3.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/Daniel-luu-7/eko-sport/commit/25f11a0739b51a447fe0c4fb550e92b7b693aec3">25f11a0739b51a447fe0c4fb550e92b7b693aec3</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The normalize-url package before 4.5.1, 5.x before 5.3.1, and 6.x before 6.0.1 for Node.js has a ReDoS (regular expression denial of service) issue because it has exponential performance for data: URLs.
<p>Publish Date: 2021-05-24
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-33502>CVE-2021-33502</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33502">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33502</a></p>
<p>Release Date: 2021-05-24</p>
<p>Fix Resolution: normalize-url - 4.5.1, 5.3.1, 6.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-33502 (High) detected in normalize-url-1.9.1.tgz, normalize-url-3.3.0.tgz - ## CVE-2021-33502 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>normalize-url-1.9.1.tgz</b>, <b>normalize-url-3.3.0.tgz</b></p></summary>
<p>
<details><summary><b>normalize-url-1.9.1.tgz</b></p></summary>
<p>Normalize a URL</p>
<p>Library home page: <a href="https://registry.npmjs.org/normalize-url/-/normalize-url-1.9.1.tgz">https://registry.npmjs.org/normalize-url/-/normalize-url-1.9.1.tgz</a></p>
<p>Path to dependency file: eko-sport/package.json</p>
<p>Path to vulnerable library: eko-sport/node_modules/normalize-url/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-4.0.3.tgz (Root Library)
- mini-css-extract-plugin-0.11.3.tgz
- :x: **normalize-url-1.9.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>normalize-url-3.3.0.tgz</b></p></summary>
<p>Normalize a URL</p>
<p>Library home page: <a href="https://registry.npmjs.org/normalize-url/-/normalize-url-3.3.0.tgz">https://registry.npmjs.org/normalize-url/-/normalize-url-3.3.0.tgz</a></p>
<p>Path to dependency file: eko-sport/package.json</p>
<p>Path to vulnerable library: eko-sport/node_modules/postcss-normalize-url/node_modules/normalize-url/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-4.0.3.tgz (Root Library)
- optimize-css-assets-webpack-plugin-5.0.4.tgz
- cssnano-4.1.11.tgz
- cssnano-preset-default-4.0.8.tgz
- postcss-normalize-url-4.0.1.tgz
- :x: **normalize-url-3.3.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/Daniel-luu-7/eko-sport/commit/25f11a0739b51a447fe0c4fb550e92b7b693aec3">25f11a0739b51a447fe0c4fb550e92b7b693aec3</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The normalize-url package before 4.5.1, 5.x before 5.3.1, and 6.x before 6.0.1 for Node.js has a ReDoS (regular expression denial of service) issue because it has exponential performance for data: URLs.
<p>Publish Date: 2021-05-24
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-33502>CVE-2021-33502</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33502">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33502</a></p>
<p>Release Date: 2021-05-24</p>
<p>Fix Resolution: normalize-url - 4.5.1, 5.3.1, 6.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in normalize url tgz normalize url tgz cve high severity vulnerability vulnerable libraries normalize url tgz normalize url tgz normalize url tgz normalize a url library home page a href path to dependency file eko sport package json path to vulnerable library eko sport node modules normalize url package json dependency hierarchy react scripts tgz root library mini css extract plugin tgz x normalize url tgz vulnerable library normalize url tgz normalize a url library home page a href path to dependency file eko sport package json path to vulnerable library eko sport node modules postcss normalize url node modules normalize url package json dependency hierarchy react scripts tgz root library optimize css assets webpack plugin tgz cssnano tgz cssnano preset default tgz postcss normalize url tgz x normalize url tgz vulnerable library found in head commit a href found in base branch main vulnerability details the normalize url package before x before and x before for node js has a redos regular expression denial of service issue because it has exponential performance for data urls publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution normalize url step up your open source security game with whitesource
| 0
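CVE-2021-33502 is a ReDoS: normalize-url's handling of `data:` URLs had exponential-time regex behavior. The pattern below is illustrative only — not normalize-url's actual regex — but it shows the shape of the problem, where a nested quantifier backtracks exponentially on a near-matching input:

```javascript
// Illustrative ReDoS shape, NOT normalize-url's real pattern: the nested
// quantifier (a+)+ lets the engine partition the run of "a"s in ~2^n ways,
// and the trailing "!" forces it to try all of them before failing.
const vulnerable = /^(a+)+$/;

// 20 chars keeps this fast (~2^20 backtracking steps); each extra "a"
// roughly doubles the work, which is the denial-of-service vector.
const input = "a".repeat(20) + "!";
const matched = vulnerable.test(input);

console.log(matched); // false — the "!" can never match
```

The remediation in the record — upgrading to normalize-url 4.5.1, 5.3.1, or 6.0.1 — replaces the vulnerable `data:` URL handling so matching no longer blows up on attacker-controlled input.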
|
21,980
| 30,470,956,469
|
IssuesEvent
|
2023-07-17 13:37:01
|
h4sh5/npm-auto-scanner
|
https://api.github.com/repos/h4sh5/npm-auto-scanner
|
opened
|
electron-updater-with-gitea 1.0.1 has 2 guarddog issues
|
npm-silent-process-execution
|
```{"npm-silent-process-execution":[{"code":" child_process_1.spawn(destination, [], {\n detached: true,\n stdio: \"ignore\",\n env,\n }).unref();","location":"package/out/AppImageUpdater.js:97","message":"This package is silently executing another executable"},{"code":" const process = child_process_1.spawn(exe, args, {\n detached: true,\n stdio: \"ignore\",\n });","location":"package/out/NsisUpdater.js:212","message":"This package is silently executing another executable"}]}```
|
1.0
|
electron-updater-with-gitea 1.0.1 has 2 guarddog issues - ```{"npm-silent-process-execution":[{"code":" child_process_1.spawn(destination, [], {\n detached: true,\n stdio: \"ignore\",\n env,\n }).unref();","location":"package/out/AppImageUpdater.js:97","message":"This package is silently executing another executable"},{"code":" const process = child_process_1.spawn(exe, args, {\n detached: true,\n stdio: \"ignore\",\n });","location":"package/out/NsisUpdater.js:212","message":"This package is silently executing another executable"}]}```
|
process
|
electron updater with gitea has guarddog issues npm silent process execution n detached true n stdio ignore n env n unref location package out appimageupdater js message this package is silently executing another executable code const process child process spawn exe args n detached true n stdio ignore n location package out nsisupdater js message this package is silently executing another executable
| 1
|
725,284
| 24,957,222,815
|
IssuesEvent
|
2022-11-01 12:52:32
|
AY2223S1-CS2103T-T08-4/tp
|
https://api.github.com/repos/AY2223S1-CS2103T-T08-4/tp
|
closed
|
[PE-D][Tester D] Addresponse exception not handled well
|
priority.High
|
The results box(? the one that shows system messages) does not update upon inputting 'addresponse' and an index, without the specified m/ parameter.

<!--session: 1666943676792-73040aaa-cf69-4f70-ad32-6fd7aa351f3d-->
<!--Version: Web v3.4.4-->
-------------
Labels: `type.FeatureFlaw` `severity.Low`
original: ningtan11/ped#6
|
1.0
|
[PE-D][Tester D] Addresponse exception not handled well - The results box(? the one that shows system messages) does not update upon inputting 'addresponse' and an index, without the specified m/ parameter.

<!--session: 1666943676792-73040aaa-cf69-4f70-ad32-6fd7aa351f3d-->
<!--Version: Web v3.4.4-->
-------------
Labels: `type.FeatureFlaw` `severity.Low`
original: ningtan11/ped#6
|
non_process
|
addresponse exception not handled well the results box the one that shows system messages does not update upon inputting addresponse and an index without the specified m parameter labels type featureflaw severity low original ped
| 0
|
21,238
| 28,358,197,229
|
IssuesEvent
|
2023-04-12 08:53:06
|
inmanta/inmanta
|
https://api.github.com/repos/inmanta/inmanta
|
closed
|
release OSS
|
process
|
*See inmanta/inmanta-core#5699 for blockers*
- [x] Ensure that the tests succeed on the development branch (core, extensions and modules) [1]
- [x] Perform a patch release for all modules with unreleased dependency updates. See helper scripts in `irt/scripts/release`. [2]
- [x] Ensure that the compiler version (`src/inmanta/__init__.py`) has been bumped since the last release or no improvements
have been made to the compiler since. [3]
- [x] Ensure that the year part of the version number in the `inmanta` repo is correct
- [x] [run RC release job](https://jenkins.inmanta.com/job/releases/job/inmanta-oss-release/) [4]
- [x] [run next build](https://jenkins.inmanta.com/job/releases/job/inmanta-oss-build/) [5]
- [x] [run next build documentation](https://jenkins.inmanta.com/job/docs/job/inmanta-oss-docs-build/)
- Run the RC tests (These tests can be run in parallel)
- [x] Manually click through all new web-console features (see change entries). Ensure you are running on the correct version of the web-console by force reloading the application in your browser.
- [x] Quickstart ([automated](https://jenkins.inmanta.com/job/integration-tests/job/network-quickstart/))
- [x] Modulesets ([automated](https://jenkins.inmanta.com/job/modules/job/specific_module_set/job/master/))
- [x] [run project invariant check](https://jenkins.inmanta.com/job/product/job/check-all-project-invariants/job/master/) to guard against patching mistakes [6]
- [x] [run stable release job](https://jenkins.inmanta.com/job/releases/job/inmanta-oss-release/) [7]
- [x] [run stable build](https://jenkins.inmanta.com/job/releases/job/inmanta-oss-build/) [8]
- [x] [run stable build documentation](https://jenkins.inmanta.com/job/docs/job/inmanta-oss-docs-build/) [9]
- [x] inform marketing people
- [x] Setup a longevity test instance for the new release. Add a new `LongevityTest` instance to the
[`infra` project](https://code.inmanta.com/inmanta-infra/inmanta-infra) then deploy the changes via the
management orchestrator.
|
1.0
|
release OSS - *See inmanta/inmanta-core#5699 for blockers*
- [x] Ensure that the tests succeed on the development branch (core, extensions and modules) [1]
- [x] Perform a patch release for all modules with unreleased dependency updates. See helper scripts in `irt/scripts/release`. [2]
- [x] Ensure that the compiler version (`src/inmanta/__init__.py`) has been bumped since the last release or no improvements
have been made to the compiler since. [3]
- [x] Ensure that the year part of the version number in the `inmanta` repo is correct
- [x] [run RC release job](https://jenkins.inmanta.com/job/releases/job/inmanta-oss-release/) [4]
- [x] [run next build](https://jenkins.inmanta.com/job/releases/job/inmanta-oss-build/) [5]
- [x] [run next build documentation](https://jenkins.inmanta.com/job/docs/job/inmanta-oss-docs-build/)
- Run the RC tests (These tests can be run in parallel)
- [x] Manually click through all new web-console features (see change entries). Ensure you are running on the correct version of the web-console by force reloading the application in your browser.
- [x] Quickstart ([automated](https://jenkins.inmanta.com/job/integration-tests/job/network-quickstart/))
- [x] Modulesets ([automated](https://jenkins.inmanta.com/job/modules/job/specific_module_set/job/master/))
- [x] [run project invariant check](https://jenkins.inmanta.com/job/product/job/check-all-project-invariants/job/master/) to guard against patching mistakes [6]
- [x] [run stable release job](https://jenkins.inmanta.com/job/releases/job/inmanta-oss-release/) [7]
- [x] [run stable build](https://jenkins.inmanta.com/job/releases/job/inmanta-oss-build/) [8]
- [x] [run stable build documentation](https://jenkins.inmanta.com/job/docs/job/inmanta-oss-docs-build/) [9]
- [x] inform marketing people
- [x] Setup a longevity test instance for the new release. Add a new `LongevityTest` instance to the
[`infra` project](https://code.inmanta.com/inmanta-infra/inmanta-infra) then deploy the changes via the
management orchestrator.
|
process
|
release oss see inmanta inmanta core for blockers ensure that the tests succeed on the development branch core extensions and modules perform a patch release for all modules with unreleased dependency updates see helper scripts in irt scripts release ensure that the compiler version src inmanta init py has been bumped since the last release or no improvements have been made to the compiler since ensure that the year part of the version number in the inmanta repo is correct run the rc tests these tests can be run in parallel manually click through all new web console features see change entries ensure you are running on the correct version of the web console by force reloading the application in your browser quickstart modulesets to guard against patching mistakes inform marketing people setup a longevity test instance for the new release add a new longevitytest instance to the then deploy the changes via the management orchestrator
| 1
|
239,637
| 26,231,995,398
|
IssuesEvent
|
2023-01-05 01:37:22
|
vwasthename/iaf
|
https://api.github.com/repos/vwasthename/iaf
|
reopened
|
CVE-2016-10735 (Medium) detected in bootstrap-3.3.6.js
|
security vulnerability
|
## CVE-2016-10735 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.6.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.js</a></p>
<p>Path to vulnerable library: /gui/js/bootstrap/bootstrap.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.6.js** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap 3.x before 3.4.0 and 4.x-beta before 4.0.0-beta.2, XSS is possible in the data-target attribute, a different vulnerability than CVE-2018-14041.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2016-10735>CVE-2016-10735</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: bootstrap - 3.4.0, 4.0.0-beta.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2016-10735 (Medium) detected in bootstrap-3.3.6.js - ## CVE-2016-10735 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.6.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.js</a></p>
<p>Path to vulnerable library: /gui/js/bootstrap/bootstrap.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.6.js** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap 3.x before 3.4.0 and 4.x-beta before 4.0.0-beta.2, XSS is possible in the data-target attribute, a different vulnerability than CVE-2018-14041.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2016-10735>CVE-2016-10735</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: bootstrap - 3.4.0, 4.0.0-beta.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in bootstrap js cve medium severity vulnerability vulnerable library bootstrap js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library gui js bootstrap bootstrap js dependency hierarchy x bootstrap js vulnerable library found in base branch master vulnerability details in bootstrap x before and x beta before beta xss is possible in the data target attribute a different vulnerability than cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution bootstrap beta step up your open source security game with mend
| 0
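The XSS vector described in the row above (markup smuggled through Bootstrap's `data-target` attribute) can also be guarded against at the application level. The following is an illustrative, hypothetical pre-flight check in the spirit of the 3.4.0 fix — it is not Bootstrap's actual patch, and the function name is invented for this sketch:

```javascript
// Hypothetical guard: a data-target value should look like a plain CSS
// selector. Reject anything containing characters that could let the
// value be interpreted as HTML markup (the CVE-2016-10735 vector).
function isSafeTarget(selector) {
  return typeof selector === 'string' && !/[<>"']/.test(selector);
}
```

With such a check, `isSafeTarget('#myModal')` passes while an injected payload like `<img src=x onerror=alert(1)>` is rejected before it ever reaches the plugin. Upgrading to bootstrap 3.4.0 / 4.0.0-beta.2, as the suggested fix states, remains the real remediation.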
|
16,210
| 10,449,760,286
|
IssuesEvent
|
2019-09-19 09:06:25
|
Azure/azure-cli
|
https://api.github.com/repos/Azure/azure-cli
|
closed
|
az vm create recommends deprecated Set-AzureRmMarketplaceTerms instead of az vm image accept-terms
|
Marketplace Ordering Service Attention
|
**Describe the bug**
Using az vm create on an image with terms results in a message about using deprecated PowerShell cmdlets to accept (see 'additional content' below), instead of recommending 'az vm image accept-terms'.
**To Reproduce**
Attempt to create an az vm with an image with terms:
```
az vm create \
--resource-group Fedora \
--name Build \
--image tunnelbiz:fedora:fedora30:0.0.3 \
--admin-username azureuser \
--generate-ssh-keys
```
**Environment summary**
apt-get
2.0.72
Pengwin
Bash
**Additional context**
```
Azure Error: MarketplacePurchaseEligibilityFailed
Message: Marketplace purchase eligibilty check returned errors. See inner errors for details.
Exception Details:
Error Code: BadRequest
Message: Offer with PublisherId: tunnelbiz, OfferId: fedora cannot be purchased due to validation errors. See details for more information.[{"Legal terms have not been accepted for this item on this subscription: 'e2391b26-a1f2-4713-b4b1-92857f6083d1'. To accept legal terms using PowerShell, please use Get-AzureRmMarketplaceTerms and Set-AzureRmMarketplaceTerms API(https://go.microsoft.com/fwlink/?linkid=862451) or deploy via the Azure portal to accept the terms":"StoreApi"}]
```
|
1.0
|
az vm create recommends deprecated Set-AzureRmMarketplaceTerms instead of az vm image accept-terms - **Describe the bug**
Using az vm create on an image with terms results in a message about using deprecated PowerShell cmdlets to accept (see 'additional content' below), instead of recommending 'az vm image accept-terms'.
**To Reproduce**
Attempt to create an az vm with an image with terms:
```
az vm create \
--resource-group Fedora \
--name Build \
--image tunnelbiz:fedora:fedora30:0.0.3 \
--admin-username azureuser \
--generate-ssh-keys
```
**Environment summary**
apt-get
2.0.72
Pengwin
Bash
**Additional context**
```
Azure Error: MarketplacePurchaseEligibilityFailed
Message: Marketplace purchase eligibilty check returned errors. See inner errors for details.
Exception Details:
Error Code: BadRequest
Message: Offer with PublisherId: tunnelbiz, OfferId: fedora cannot be purchased due to validation errors. See details for more information.[{"Legal terms have not been accepted for this item on this subscription: 'e2391b26-a1f2-4713-b4b1-92857f6083d1'. To accept legal terms using PowerShell, please use Get-AzureRmMarketplaceTerms and Set-AzureRmMarketplaceTerms API(https://go.microsoft.com/fwlink/?linkid=862451) or deploy via the Azure portal to accept the terms":"StoreApi"}]
```
|
non_process
|
az vm create recommends deprecated set azurermmarketplaceterms instead of az vm image accept terms describe the bug using az vm create on an image with terms results in a message about using deprecated powershell cmdlets to accept see additional content below instead of recommending az vm image accept terms to reproduce attempt to create an az vm with an image with terms az vm create resource group fedora name build image tunnelbiz fedora admin username azureuser generate ssh keys environment summary apt get pengwin bash additional context azure error marketplacepurchaseeligibilityfailed message marketplace purchase eligibilty check returned errors see inner errors for details exception details error code badrequest message offer with publisherid tunnelbiz offerid fedora cannot be purchased due to validation errors see details for more information
| 0
|
3,382
| 6,506,083,040
|
IssuesEvent
|
2017-08-24 07:29:57
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
child_process.exec and .execSync returning exit code 1 without an error
|
child_process windows
|
<!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
* **Version**: 6.11.2
* **Platform**: Windows 10 v1703 x64
* **Subsystem**: child_process
<!-- Enter your issue details below this comment. -->
I'm trying to use exec to interface with the `netsh wlan` commands on Windows, and I've been having no trouble up until for some reason I get an exit code of 1 for a very specific command and circumstance. Most commands work fine with no issue, but when trying to call `netsh wlan show profiles` which lists all wireless network profiles for the system, I get an error ONLY when there are no profiles at all:
`Error: Command failed: netsh wlan show profiles`
Any profiles exist and the command returns fine. It only takes one, as in the example output below.
Output from `netsh wlan show profiles` w/o profiles: (error)
```
Profiles on interface Wi=Fi:
Group policy profiles (read only)
---------------------------------
<None>
User profiles
-------------
<None>
```
Output from `netsh wlan show profiles` w/ profiles: (no error)
```
Profiles on interface Wi=Fi:
Group policy profiles (read only)
---------------------------------
<None>
User profiles
-------------
All User Profile : MyWiFiNetwork
```
The "error" doesn't seem to prevent the command from executing either. stdout is passed properly and I can proceed if I ignore the error, but then I won't catch any actual errors.
I tried replacing exec with execSync since I couldn't find anything trying to debug the async version and still got the same error. I ended up finding that your binding returns a status of 1 for some reason.
[`var result = spawn_sync.spawn(options);`](https://github.com/nodejs/node/blob/1524458c54d76e7ae7595606cda364edbf1aa4f9/lib/internal/child_process.js#L941)
The result object returned:
```
{
output: Array[3] [null, Uint8Array[162], Uint8Array[0] ],
pid: /*pid*/,
signal: null,
status: 1
}
```
It also has a prototype but this looks like a standard Object prototype.
I have no idea how to try and debug this binding, so that is about as deep of information as I can give on that front, but I have replicated this on multiple machines in isolated code (below).
```
const execSync = require('child_process').execSync;
try {
let out = execSync('netsh wlan show profiles').toString();
console.log(out);
} catch(err) {
console.error(err);
}
```
Note: also tried `const cp = require('child_process');` followed by `cp.execSync()` later but this had no effect on behavior.
|
1.0
|
child_process.exec and .execSync returning exit code 1 without an error - <!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
* **Version**: 6.11.2
* **Platform**: Windows 10 v1703 x64
* **Subsystem**: child_process
<!-- Enter your issue details below this comment. -->
I'm trying to use exec to interface with the `netsh wlan` commands on Windows, and I've been having no trouble up until for some reason I get an exit code of 1 for a very specific command and circumstance. Most commands work fine with no issue, but when trying to call `netsh wlan show profiles` which lists all wireless network profiles for the system, I get an error ONLY when there are no profiles at all:
`Error: Command failed: netsh wlan show profiles`
Any profiles exist and the command returns fine. It only takes one, as in the example output below.
Output from `netsh wlan show profiles` w/o profiles: (error)
```
Profiles on interface Wi=Fi:
Group policy profiles (read only)
---------------------------------
<None>
User profiles
-------------
<None>
```
Output from `netsh wlan show profiles` w/ profiles: (no error)
```
Profiles on interface Wi=Fi:
Group policy profiles (read only)
---------------------------------
<None>
User profiles
-------------
All User Profile : MyWiFiNetwork
```
The "error" doesn't seem to prevent the command from executing either. stdout is passed properly and I can proceed if I ignore the error, but then I won't catch any actual errors.
I tried replacing exec with execSync since I couldn't find anything trying to debug the async version and still got the same error. I ended up finding that your binding returns a status of 1 for some reason.
[`var result = spawn_sync.spawn(options);`](https://github.com/nodejs/node/blob/1524458c54d76e7ae7595606cda364edbf1aa4f9/lib/internal/child_process.js#L941)
The result object returned:
```
{
output: Array[3] [null, Uint8Array[162], Uint8Array[0] ],
pid: /*pid*/,
signal: null,
status: 1
}
```
It also has a prototype but this looks like a standard Object prototype.
I have no idea how to try and debug this binding, so that is about as deep of information as I can give on that front, but I have replicated this on multiple machines in isolated code (below).
```
const execSync = require('child_process').execSync;
try {
let out = execSync('netsh wlan show profiles').toString();
console.log(out);
} catch(err) {
console.error(err);
}
```
Note: also tried `const cp = require('child_process');` followed by `cp.execSync()` later but this had no effect on behavior.
|
process
|
child process exec and execsync returning exit code without an error thank you for reporting an issue this issue tracker is for bugs and issues found within node js core if you require more general support please file an issue on our help repo please fill in as much of the template below as you re able version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able version platform windows subsystem child process i m trying to use exec to interface with the netsh wlan commands on windows and i ve been having no trouble up until for some reason i get an exit code of for a very specific command and circumstance most commands work fine with no issue but when trying to call netsh wlan show profiles which lists all wireless network profiles for the system i get an error only when there are no profiles at all error command failed netsh wlan show profiles any profiles exist and the command returns fine it only takes one as in the example output below output from netsh wlan show profiles w o profiles error profiles on interface wi fi group policy profiles read only user profiles output from netsh wlan show profiles w profiles no error profiles on interface wi fi group policy profiles read only user profiles all user profile mywifinetwork the error doesn t seem to prevent the command from executing either stdout is passed properly and i can proceed if i ignore the error but then i won t catch any actual errors i tried replacing exec with execsync since i couldn t find anything trying to debug the async version and still got the same error i ended up finding that your binding returns a status of for some reason the result object returned output array pid pid signal null status it also has a prototype but this looks like a standard object prototype i have no idea how to try and debug this binding so that is about as deep of information as i can give on that front but i have replicated this on multiple machines in isolated code below const execsync require child process execsync try let out execsync netsh wlan show profiles tostring console log out catch err console error err note also tried const cp require child process followed by cp execsync later but this had no effect on behavior
| 1
|
86,259
| 8,030,280,158
|
IssuesEvent
|
2018-07-27 19:00:35
|
tendermint/tendermint
|
https://api.github.com/repos/tendermint/tendermint
|
closed
|
consensus: Evidence not included in block for gossip
|
bug consensus security test
|
Submitted via [HackerOne](https://hackerone.com/tendermint):
**Summary:**
Evidences not included in block for gossip so there's no chance to punish Byzantine nodes
**Description:**
Refer to below code snippets in consensus/state.go, createProposal() function:
```
// Mempool validated transactions
txs := cs.mempool.Reap(cs.config.MaxBlockSizeTxs)
block, parts := cs.state.MakeBlock(cs.Height, txs, commit)
evidence := cs.evpool.PendingEvidence()
block.AddEvidence(evidence)
```
The newly created block has been divided into parts BEFORE adding evidences. In this case, block parts will be gossiped without last block's evidences, so we could not pick out Byzantine nodes and punish them.
## Steps To Reproduce:
(Add details for how we can reproduce the issue)
1. Set up a Byzantine node who will broadcast conflicting votes
2. Check Byzantine node list in honest node's BeginBlock() function
## Supporting Material/References:
* List any additional material (e.g. screenshots, logs, etc.)
## Impact
Since Byzantine nodes will never be accused, there's no cost for cheating. Attackers may have chance to take charge of more than 1/3 malicious nodes and cheat the whole system without any punishment.
|
1.0
|
consensus: Evidence not included in block for gossip - Submitted via [HackerOne](https://hackerone.com/tendermint):
**Summary:**
Evidences not included in block for gossip so there's no chance to punish Byzantine nodes
**Description:**
Refer to below code snippets in consensus/state.go, createProposal() function:
```
// Mempool validated transactions
txs := cs.mempool.Reap(cs.config.MaxBlockSizeTxs)
block, parts := cs.state.MakeBlock(cs.Height, txs, commit)
evidence := cs.evpool.PendingEvidence()
block.AddEvidence(evidence)
```
The newly created block has been divided into parts BEFORE adding evidences. In this case, block parts will be gossiped without last block's evidences, so we could not pick out Byzantine nodes and punish them.
## Steps To Reproduce:
(Add details for how we can reproduce the issue)
1. Set up a Byzantine node who will broadcast conflicting votes
2. Check Byzantine node list in honest node's BeginBlock() function
## Supporting Material/References:
* List any additional material (e.g. screenshots, logs, etc.)
## Impact
Since Byzantine nodes will never be accused, there's no cost for cheating. Attackers may have chance to take charge of more than 1/3 malicious nodes and cheat the whole system without any punishment.
|
non_process
|
consensus evidence not included in block for gossip submitted via summary evidences not included in block for gossip so there s no chance to punish byzantine nodes description refer to below code snippets in consensus state go createproposal function mempool validated transactions txs cs mempool reap cs config maxblocksizetxs block parts cs state makeblock cs height txs commit evidence cs evpool pendingevidence block addevidence evidence the newly created block has been divided into parts before adding evidences in this case block parts will be gossiped without last block s evidences so we could not pick out byzantine nodes and punish them steps to reproduce add details for how we can reproduce the issue set up a byzantine node who will broadcast conflicting votes check byzantine node list in honest node s beginblock function supporting material references list any additional material e g screenshots logs etc impact since byzantine nodes will never be accused there s no cost for cheating attackers may have chance to take charge of more than malicious nodes and cheat the whole system without any punishment
| 0
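The consensus bug in the row above is purely an ordering problem: the block is split into gossip parts before `AddEvidence` runs, so the evidence never reaches peers. A toy model of that ordering (plain JavaScript, not Tendermint's actual Go code — `makeParts` here just stands in for serializing the block into gossip parts) makes the difference concrete:

```javascript
// Stand-in for splitting a block into gossip parts: whatever is in the
// block at this moment is what peers will see.
function makeParts(block) {
  return JSON.stringify(block);
}

// Buggy ordering (as reported): parts are fixed before evidence lands.
function createProposalBuggy(txs, evidence) {
  const block = { txs, evidence: [] };
  const parts = makeParts(block); // snapshot taken here...
  block.evidence = evidence;      // ...so this mutation never gossips
  return { block, parts };
}

// Fixed ordering: attach evidence first, then derive the parts.
function createProposalFixed(txs, evidence) {
  const block = { txs, evidence };
  const parts = makeParts(block);
  return { block, parts };
}
```

In the buggy variant the gossiped parts omit the evidence entirely, which is exactly why Byzantine nodes would go unpunished; reordering the two steps is the essence of the fix.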
|