| Unnamed: 0 (int64, 0 to 832k) | id (float64, 2.49B to 32.1B) | type (string, 1 class) | created_at (string, length 19) | repo (string, lengths 7 to 112) | repo_url (string, lengths 36 to 141) | action (string, 3 classes) | title (string, lengths 1 to 744) | labels (string, lengths 4 to 574) | body (string, lengths 9 to 211k) | index (string, 10 classes) | text_combine (string, lengths 96 to 211k) | label (string, 2 classes) | text (string, lengths 96 to 188k) | binary_label (int64, 0 to 1) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
33,265
| 6,189,037,802
|
IssuesEvent
|
2017-07-04 11:53:35
|
ehcache/ehcache3
|
https://api.github.com/repos/ehcache/ehcache3
|
closed
|
Errors in documentation
|
documentation in progress
|
* http://www.ehcache.org/documentation/3.3/getting-started.html#creating-a-cache-manager-with-clustering-support - callouts do not match code lines.
* http://www.ehcache.org/documentation/3.3/clustered-cache.html#clustered-unspecified-inheritance - review the text and clarify the `clustered()` feature
* http://www.ehcache.org/documentation/3.3/thread-pools.html#let-s-start-with-a-bit-of-theory - review the section and the last sentence in particular
* http://www.ehcache.org/documentation/3.3/serializers-copiers.html#how-is-a-serializer-configured - by _class name_ should be by _class_
* http://www.ehcache.org/documentation/3.3/serializers-copiers.html#lifecycle-instances-vs-class-names - confusing sentence "It will need to dispose of any resource the serializer might hold upon and/or persisting and reloading the serializer's state."
* http://www.ehcache.org/documentation/3.3/serializers-copiers.html#writing-your-own-serializer - section before code extract has some duplication, check
* http://www.ehcache.org/documentation/3.3/eviction-advisor.html - review text in before last note, unclear
* http://www.ehcache.org/documentation/3.3/cache-event-listeners.html - Improve table headers
* http://www.ehcache.org/documentation/3.3/cache-event-listeners.html#event-processing-queues - missing callout text
* http://www.ehcache.org/documentation/3.3/tiering.html#disk - callouts out of order
* http://www.ehcache.org/documentation/3.3/tiering.html#clustered - review sentence
* http://www.ehcache.org/documentation/3.3/tiering.html#destroy-persistent-tiers - some weird sentences
* http://www.ehcache.org/documentation/3.3/xa.html#xa-cache-with-three-tiers-and-persistence - in sample, use `persistence` method
|
1.0
|
Errors in documentation - * http://www.ehcache.org/documentation/3.3/getting-started.html#creating-a-cache-manager-with-clustering-support - callouts do not match code lines.
* http://www.ehcache.org/documentation/3.3/clustered-cache.html#clustered-unspecified-inheritance - review the text and clarify the `clustered()` feature
* http://www.ehcache.org/documentation/3.3/thread-pools.html#let-s-start-with-a-bit-of-theory - review the section and the last sentence in particular
* http://www.ehcache.org/documentation/3.3/serializers-copiers.html#how-is-a-serializer-configured - by _class name_ should be by _class_
* http://www.ehcache.org/documentation/3.3/serializers-copiers.html#lifecycle-instances-vs-class-names - confusing sentence "It will need to dispose of any resource the serializer might hold upon and/or persisting and reloading the serializer's state."
* http://www.ehcache.org/documentation/3.3/serializers-copiers.html#writing-your-own-serializer - section before code extract has some duplication, check
* http://www.ehcache.org/documentation/3.3/eviction-advisor.html - review text in before last note, unclear
* http://www.ehcache.org/documentation/3.3/cache-event-listeners.html - Improve table headers
* http://www.ehcache.org/documentation/3.3/cache-event-listeners.html#event-processing-queues - missing callout text
* http://www.ehcache.org/documentation/3.3/tiering.html#disk - callouts out of order
* http://www.ehcache.org/documentation/3.3/tiering.html#clustered - review sentence
* http://www.ehcache.org/documentation/3.3/tiering.html#destroy-persistent-tiers - some weird sentences
* http://www.ehcache.org/documentation/3.3/xa.html#xa-cache-with-three-tiers-and-persistence - in sample, use `persistence` method
|
non_process
|
errors in documentation callouts do not match code lines review the text and clarify the clustered feature review the section and the last sentence in particular by class name should be by class confusing sentence it will need to dispose of any resource the serializer might hold upon and or persisting and reloading the serializer s state section before code extract has some duplication check review text in before last note unclear improve table headers missing callout text callouts out of order review sentence some weird sentences in sample use persistence method
| 0
|
5,757
| 8,598,515,114
|
IssuesEvent
|
2018-11-15 22:02:29
|
HumanCellAtlas/dcp-community
|
https://api.github.com/repos/HumanCellAtlas/dcp-community
|
closed
|
Rename rfc-proposed and rfc-final-review
|
rfc-process
|
- [x] Rename *rfc-proposed* to *rfc-community-review*
- [x] Rename *rfc-final-review* to *rfc-oversight-review*
- [x] Clarify that *rfc-oversight-review* is limited to oversight reviewers and not the general community
|
1.0
|
Rename rfc-proposed and rfc-final-review - - [x] Rename *rfc-proposed* to *rfc-community-review*
- [x] Rename *rfc-final-review* to *rfc-oversight-review*
- [x] Clarify that *rfc-oversight-review* is limited to oversight reviewers and not the general community
|
process
|
rename rfc proposed and rfc final review rename rfc proposed to rfc community review rename rfc final review to rfc oversight review clarify that rfc oversight review is limited to oversight reviewers and not the general community
| 1
|
641,456
| 20,826,993,608
|
IssuesEvent
|
2022-03-18 22:35:47
|
popovs/OPPtools
|
https://api.github.com/repos/popovs/OPPtools
|
closed
|
Improve shapefile output
|
high priority
|
Modify shapefile output to the following:
- [x] Clearer column names indicating % population use vs. number of individuals etc.
~Simplify shapefile bins to: 100% of population using, 75% using, 50% using, and 25% using~
~Red KBA polyline output~
|
1.0
|
Improve shapefile output - Modify shapefile output to the following:
- [x] Clearer column names indicating % population use vs. number of individuals etc.
~Simplify shapefile bins to: 100% of population using, 75% using, 50% using, and 25% using~
~Red KBA polyline output~
|
non_process
|
improve shapefile output modify shapefile output to the following clearer column names indicating population use vs number of individuals etc simplify shapefile bins to of population using using using and using red kba polyline output
| 0
|
376,617
| 26,208,669,263
|
IssuesEvent
|
2023-01-04 02:51:12
|
gbj/leptos
|
https://api.github.com/repos/gbj/leptos
|
closed
|
Server Function URLs are hard to debug
|
documentation enhancement server
|
When using server functions, there is no way to see the keys in the [`REGISTERED_SERVER_FUNCTIONS` static](https://github.com/gbj/leptos/blob/a2943c464985b2d3baa9d044e38e57ca3a381a52/leptos_server/src/lib.rs#LL103C16-L103C43), and there appears to be no logging on the server around server functions (I'm using the axum integration).
This makes it difficult to debug when setting up for the first time as the main source of information is the server's response to a misconfigured server function:
```
Could not find a server function at that route.
```
The error indicates that I am hitting the right server-side handler, but apparently the key is wrong as the server function is not found.
It would help to either have some additional logging, or error messaging when a server function misses on the server in development, or additional documentation concerning how the urls are expected to be matched.
|
1.0
|
Server Function URLs are hard to debug - When using server functions, there is no way to see the keys in the [`REGISTERED_SERVER_FUNCTIONS` static](https://github.com/gbj/leptos/blob/a2943c464985b2d3baa9d044e38e57ca3a381a52/leptos_server/src/lib.rs#LL103C16-L103C43), and there appears to be no logging on the server around server functions (I'm using the axum integration).
This makes it difficult to debug when setting up for the first time as the main source of information is the server's response to a misconfigured server function:
```
Could not find a server function at that route.
```
The error indicates that I am hitting the right server-side handler, but apparently the key is wrong as the server function is not found.
It would help to either have some additional logging, or error messaging when a server function misses on the server in development, or additional documentation concerning how the urls are expected to be matched.
|
non_process
|
server function urls are hard to debug when using server functions there is no way to see the keys in the and there appears to be no logging on the server around server functions i m using the axum integration this makes it difficult to debug when setting up for the first time as the main source of information is the server s response to a misconfigured server function could not find a server function at that route the error indicates that i am hitting the right server side handler but apparently the key is wrong as the server function is not found it would help to either have some additional logging or error messaging when a server function misses on the server in development or additional documentation concerning how the urls are expected to be matched
| 0
|
1,275
| 2,511,016,099
|
IssuesEvent
|
2015-01-14 01:23:06
|
couchbase/couchbase-lite-net
|
https://api.github.com/repos/couchbase/couchbase-lite-net
|
closed
|
ReplicationTest.TestGetReplication sometimes fails
|
bug P4: minor priority-medium size-large
|
When `TestGetReplication` runs by itself, or with the other tests in `ReplicationTest` it always passes. However, when running all 87 tests it always fails. When it fails, it fails because it calls `replication.Stop()` but `replication.IsActive` returns true.
|
1.0
|
ReplicationTest.TestGetReplication sometimes fails - When `TestGetReplication` runs by itself, or with the other tests in `ReplicationTest` it always passes. However, when running all 87 tests it always fails. When it fails, it fails because it calls `replication.Stop()` but `replication.IsActive` returns true.
|
non_process
|
replicationtest testgetreplication sometimes fails when testgetreplication runs by itself or with the other tests in replicationtest it always passes however when running all tests it always fails when it fails it fails because it calls replication stop but replication isactive returns true
| 0
|
185,824
| 15,034,109,011
|
IssuesEvent
|
2021-02-02 12:26:21
|
cegonse/cest
|
https://api.github.com/repos/cegonse/cest
|
opened
|
Assert functions should have HTML documentation
|
documentation
|
Add documentation for all available assertions, along with examples.
|
1.0
|
Assert functions should have HTML documentation - Add documentation for all available assertions, along with examples.
|
non_process
|
assert functions should have html documentation add documentation for all available assertions along with examples
| 0
|
662,892
| 22,154,906,806
|
IssuesEvent
|
2022-06-03 21:14:30
|
Qiskit/qiskit-ibm-runtime
|
https://api.github.com/repos/Qiskit/qiskit-ibm-runtime
|
closed
|
Mention in tutorials about Qiskit classes like `VQEClient` will work only with legacy runtime
|
enhancement priority: low
|
**What is the expected feature or enhancement?**
Mention in tutorials about Qiskit classes that use runtime like ex: `VQEClient` will work only with legacy runtime
**Acceptance criteria**
|
1.0
|
Mention in tutorials about Qiskit classes like `VQEClient` will work only with legacy runtime - **What is the expected feature or enhancement?**
Mention in tutorials about Qiskit classes that use runtime like ex: `VQEClient` will work only with legacy runtime
**Acceptance criteria**
|
non_process
|
mention in tutorials about qiskit classes like vqeclient will work only with legacy runtime what is the expected feature or enhancement mention in tutorials about qiskit classes that use runtime like ex vqeclient will work only with legacy runtime acceptance criteria
| 0
|
449,455
| 31,846,058,501
|
IssuesEvent
|
2023-09-14 20:02:25
|
uprm-inso4101-2023-2024-S1/semester-project-fitquest-gameified-fitness-application
|
https://api.github.com/repos/uprm-inso4101-2023-2024-S1/semester-project-fitquest-gameified-fitness-application
|
closed
|
Documentation for the stopwatch feature. -> Team 2
|
documentation
|
# Docs!
Write documentation for the stopwatch feature. Describe how the stopwatch feature works.
Also create a "docs" folder on the main directory. So that from now on all documentation goes to that folder.
Note: You can use markdown for the documentation.
|
1.0
|
Documentation for the stopwatch feature. -> Team 2 - # Docs!
Write documentation for the stopwatch feature. Describe how the stopwatch feature works.
Also create a "docs" folder on the main directory. So that from now on all documentation goes to that folder.
Note: You can use markdown for the documentation.
|
non_process
|
documentation for the stopwatch feature team docs write documentation for the stopwatch feature describe how the stopwatch feature works also create a docs folder on the main directory so that from now on all documentation goes to that folder note you can use markdown for the documentation
| 0
|
63,262
| 8,671,120,243
|
IssuesEvent
|
2018-11-29 18:16:10
|
Azure/azure-iot-sdk-c
|
https://api.github.com/repos/Azure/azure-iot-sdk-c
|
closed
|
Fix the SDK API definitions
|
area-documentation
|
- **OS and version used:** <VERSION> ALL
- **SDK version used:** <VERSION> ALL
# Description of the issue:
In Visual Studio, open the telemetry sample project and go to the definition of one of the SDK's methods, like **IoTHubClient_LL_CreateFromConnectionString**.
VS takes me to a non-standard C method definition like the one below:
`MOCKABLE_FUNCTION(, IOTHUB_CLIENT_LL_HANDLE, IoTHubClient_LL_CreateFromConnectionString, const char*, connectionString, IOTHUB_CLIENT_TRANSPORT_PROVIDER, protocol); `
|
1.0
|
Fix the SDK API definitions - - **OS and version used:** <VERSION> ALL
- **SDK version used:** <VERSION> ALL
# Description of the issue:
In Visual Studio, open the telemetry sample project and go to the definition of one of the SDK's methods, like **IoTHubClient_LL_CreateFromConnectionString**.
VS takes me to a non-standard C method definition like the one below:
`MOCKABLE_FUNCTION(, IOTHUB_CLIENT_LL_HANDLE, IoTHubClient_LL_CreateFromConnectionString, const char*, connectionString, IOTHUB_CLIENT_TRANSPORT_PROVIDER, protocol); `
|
non_process
|
fix the sdk api definitions os and version used all sdk version used all description of the issue in visual studio open the telemetry sample project and go to the definition of one of the sdks methods like the iothubclient ll createfromconnectionstring vs takes me to a non standard c method definition like the one bellow mockable function iothub client ll handle iothubclient ll createfromconnectionstring const char connectionstring iothub client transport provider protocol
| 0
|
411,619
| 12,026,654,537
|
IssuesEvent
|
2020-04-12 15:04:31
|
sarpik/turbo-schedule
|
https://api.github.com/repos/sarpik/turbo-schedule
|
closed
|
Get rid of multiple `srcDir`s - they cause a lot of nasty bugs
|
help wanted priority/high
|
The `server/` has multiple `srcDir`s (`src`, `test`, `script` etc.), but typescript only allows one `outDir`.
This causes a lot of issues once you need to do anything outside the `src` directory, since once the app is built, the paths will be different (one level deeper).
We've been having work-arounds for a little while now, but it's crazy annoying, causes lots of bugs (for example, see #26), and it's just not worth it.
Don't forget to get rid of the work-arounds we had once fixing, lol
|
1.0
|
Get rid of multiple `srcDir`s - they cause a lot of nasty bugs - The `server/` has multiple `srcDir`s (`src`, `test`, `script` etc.), but typescript only allows one `outDir`.
This causes a lot of issues once you need to do anything outside the `src` directory, since once the app is built, the paths will be different (one level deeper).
We've been having work-arounds for a little while now, but it's crazy annoying, causes lots of bugs (for example, see #26), and it's just not worth it.
Don't forget to get rid of the work-arounds we had once fixing, lol
|
non_process
|
get rid of multiple srcdir s they cause a lot of nasty bugs the server has multiple srcdir s src test script etc but typescript only allows one outdir this causes a lot of issues once you need to anything outside the src directory since once the app is built the paths will be different one level deeper we ve been having work arounds for a little while now but it s crazy annoying causes lots of bugs for example see and it s just not worth it don t forget to get rid of the work arounds we had once fixing lol
| 0
|
11,051
| 13,883,733,314
|
IssuesEvent
|
2020-10-18 13:19:27
|
osquery/osquery
|
https://api.github.com/repos/osquery/osquery
|
closed
|
Configurable Audit backlog limit setting on Linux
|
Linux easy good-first-issue process auditing
|
The Audit backlog limit on Linux is currently hardcoded:
https://github.com/osquery/osquery/blob/444b2cc017218d2ac4c25ab1e431be809e744821/osquery/events/linux/auditdnetlink.cpp#L308
A command line option could be implemented to make this configurable
|
1.0
|
Configurable Audit backlog limit setting on Linux - The Audit backlog limit on Linux is currently hardcoded:
https://github.com/osquery/osquery/blob/444b2cc017218d2ac4c25ab1e431be809e744821/osquery/events/linux/auditdnetlink.cpp#L308
A command line option could be implemented to make this configurable
|
process
|
configurable audit backlog limit setting on linux the audit backlog limit on linux is currently hardcoded a command line option could be implemented to make this configurable
| 1
|
232,749
| 7,674,602,244
|
IssuesEvent
|
2018-05-15 05:07:25
|
rancher/rancher
|
https://api.github.com/repos/rancher/rancher
|
opened
|
AWS hosts added to custom host by passing "--address" does not get public ip address.
|
kind/bug priority/0 version/2.0
|
Rancher version - Build from master
Steps to reproduce the problem:
Create a custom cluster.
Add 1 node by providing "--address" and "--internal-address" in the agent register command:
```sudo docker run -d --privileged --restart=unless-stopped --net=host -v /etc/kubernetes:/etc/kubernetes -v /var/run:/var/run rancher/rancher-agent:master --server https://<ip> --token fd59bx7qx9mb8hxjpnjjdk4zxl4wm575ppwgb8x7v64dftxq5qddjt --ca-checksum 26d9971cdac09ad1c213ed4e462fc193c65c047266c4da5f0a6807c3953f6366 --address <ip1> --internal-address <ip2> --etcd --controlplane --worker```
The cluster gets provisioned successfully, but the nodes get only the internal address.
"externalIpAddress" of the node gets set to the internal ip address.
|
1.0
|
AWS hosts added to custom host by passing "--address" does not get public ip address. - Rancher version - Build from master
Steps to reproduce the problem:
Create a custom cluster.
Add 1 node by providing "--address" and "--internal-address" in the agent register command:
```sudo docker run -d --privileged --restart=unless-stopped --net=host -v /etc/kubernetes:/etc/kubernetes -v /var/run:/var/run rancher/rancher-agent:master --server https://<ip> --token fd59bx7qx9mb8hxjpnjjdk4zxl4wm575ppwgb8x7v64dftxq5qddjt --ca-checksum 26d9971cdac09ad1c213ed4e462fc193c65c047266c4da5f0a6807c3953f6366 --address <ip1> --internal-address <ip2> --etcd --controlplane --worker```
The cluster gets provisioned successfully, but the nodes get only the internal address.
"externalIpAddress" of the node gets set to the internal ip address.
|
non_process
|
aws hosts added to custom host by passing address does not get public ip address rancher version build from master steps to reproduce the problem create a custom cluster add node by providing address and internal address in agent register command sudo docker run d privileged restart unless stopped net host v etc kubernetes etc kubernetes v var run var run rancher rancher agent master server token ca checksum address internal address etcd controlplane worker cluster gets provisioned successfully but the nodes get only internal address externalipaddress of the node gets set to the internal ip address
| 0
|
320,360
| 27,432,940,511
|
IssuesEvent
|
2023-03-02 03:54:24
|
FehamIsmail/-hiza_tech-soen341project2023-
|
https://api.github.com/repos/FehamIsmail/-hiza_tech-soen341project2023-
|
opened
|
AT 5.2 Employer can modify job posting
|
Acceptance Test
|
**Issue tracking**
This acceptance test is for #18
**User Acceptance Flow:**
1. Log in as an employer
2. Navigate to the job listing page
3. Click on a job posting to edit
4. Attempt to modify a required field to an invalid value (e.g. empty job title)
5. Submit the form
6. Verify that an error message is displayed for the invalid field
7. Verify that the job posting is not updated in the database
8. Verify that the employer stays on the same page to correct the error
|
1.0
|
AT 5.2 Employer can modify job posting - **Issue tracking**
This acceptance test is for #18
**User Acceptance Flow:**
1. Log in as an employer
2. Navigate to the job listing page
3. Click on a job posting to edit
4. Attempt to modify a required field to an invalid value (e.g. empty job title)
5. Submit the form
6. Verify that an error message is displayed for the invalid field
7. Verify that the job posting is not updated in the database
8. Verify that the employer stays on the same page to correct the error
|
non_process
|
at employer can modify job posting issue tracking this acceptance test is for user acceptance flow log in as an employer navigate to the job listing page click on a job posting to edit attempt to modify a required field to an invalid value e g empty job title submit the form verify that an error message is displayed for the invalid field verify that the job posting is not updated in the database verify that the employer stays on the same page to correct the error
| 0
|
11,038
| 13,851,270,408
|
IssuesEvent
|
2020-10-15 03:31:08
|
tartley/colorama
|
https://api.github.com/repos/tartley/colorama
|
closed
|
Verify that built wheels work before releasing to PyPI
|
process
|
In 2019, I made a quick README change, rebuilt a new wheel version, and uploaded it to PyPI. Unfortunately, the wheel build failed in some way, and I didn't notice, so I ended up uploading a broken wheel for [v0.4.2](https://pypi.org/project/colorama/0.4.2/).
In order to prevent this happening, we should automate (or, failing that, document) a process for testing built wheels, by uploading them to test.pypi.org and checking we can "pip install" them from there.
Also, can/should I delete 0.4.2 from PyPI? Or mark it as bad in some way?
(A related effort is https://github.com/tartley/colorama/issues/278, to run tests against a package
built from top level 'src' directory.)
|
1.0
|
Verify that built wheels work before releasing to PyPI - In 2019, I made a quick README change, rebuilt a new wheel version, and uploaded it to PyPI. Unfortunately, the wheel build failed in some way, and I didn't notice, so I ended up uploading a broken wheel for [v0.4.2](https://pypi.org/project/colorama/0.4.2/).
In order to prevent this happening, we should automate (or, failing that, document) a process for testing built wheels, by uploading them to test.pypi.org and checking we can "pip install" them from there.
Also, can/should I delete 0.4.2 from PyPI? Or mark it as bad in some way?
(A related effort is https://github.com/tartley/colorama/issues/278, to run tests against a package
built from top level 'src' directory.)
|
process
|
verify that built wheels work before releasing to pypi in i made a quick readme change rebuilt a new wheel version and uploaded it to pypi unfortunately the wheel build failed in some way and i didn t notice so i ended up uploading a broken wheel for in order to prevent this happening i we should automate or failing that document a process for testing built wheels by uploading them to test pypi org and checking we can pip install them from there also can should i delete from pypi or mark it as bad in some way a related effort is to run tests against a package built from top level src directory
| 1
|
2,834
| 5,786,187,779
|
IssuesEvent
|
2017-05-01 09:08:20
|
Nir-Cohen/Bishvil
|
https://api.github.com/repos/Nir-Cohen/Bishvil
|
closed
|
Mission 3 - SRS: Feedback
|
On Processes
|
## TODO:
- Please attach here the Client review.
https://drive.google.com/open?id=0BzHnFrVO7-T5Nnk5TDNNemJSeEU
## Summary of our meeting:
There is a clear product; the risks are unknown; the client apparently has not fully settled on what the product is. There are existing solutions for the product.
|
1.0
|
Mission 3 - SRS: Feedback - ## TODO:
- Please attach here the Client review.
https://drive.google.com/open?id=0BzHnFrVO7-T5Nnk5TDNNemJSeEU
## Summary of our meeting:
There is a clear product; the risks are unknown; the client apparently has not fully settled on what the product is. There are existing solutions for the product.
|
process
|
mission srs feedback todo please attach here the client review summary of our meeting there is a clear product the risks are unknown the client apparently has not fully settled on what the product is there are existing solutions for the product
| 1
|
12,261
| 14,787,482,562
|
IssuesEvent
|
2021-01-12 07:41:37
|
darktable-org/darktable
|
https://api.github.com/repos/darktable-org/darktable
|
closed
|
[masks] drawn mask, path cannot be removed anymore and also opacity says 0% where it is actually 100%
|
bug: pending priority: high scope: image processing
|
**Describe the bug/issue**
Current Master:
An applied drawn path cannot be removed anymore with a right mouse click. At the same time, the mask opacity in the toast message says 0% where it is actually 100%
[edit] not only path is affected, also gradient, I assume all...
**To Reproduce**
_Please amend or remove, what is/isn't applicable_
1. Go to Exposure
2. Create a drawn path and
3. make mask indicator visible
4. Try to adjust opacity by ctrl+scroll and try to remove the mask with right click
5. see error
**Expected behavior**
former behaviour e.g. of 3.4.0
**Screenshots**

**Which commit introduced the error**
```
4f8915dc3a73f8211ea16f5f94ce48df2861d71f is the first bad commit
commit 4f8915dc3a73f8211ea16f5f94ce48df2861d71f
Author: Philipp Lutz <philipp.lutz@gmx.de>
Date: Sun Jan 3 23:44:39 2021 +0100
[masks/imageio] track module name changes in mask manager
Additionally refactor masks.c to reduce code duplication
src/develop/imageop.c | 2 +-
src/develop/masks.h | 2 ++
src/develop/masks/masks.c | 73 +++++++++++++++++++++++++++--------------------
3 files changed, 45 insertions(+), 32 deletions(-)
```
@da-phil may I ping you here!?
**Platform**
* darktable version : 3.5.0+543~g54e7e09eb
* OS : Linux - kernel 5.10.2-gentoo
* Distro : Gentoo Base System release 2.7
* Processor : Intel(R) Core(TM) i9-9900K CPU @ 3.60GHz
* Memory : 32 GB (4 x 8 GB) + 5GB Swap
* Graphics card0 : GeForce GTX 1060 6GB
* Graphics card1 : GeForce RTX 2070 SUPER
* Graphics driver : nvidia-drivers-460.27.04
* OpenCL installed : Yes (opencl-headers-2020.06.16)
* OpenCL activated : Yes
* Xorg : xorg-server-1.20.10
* Desktop : KDE 5
* GTK+ : gtk+-3.24.22
* gcc : x86_64-pc-linux-gnu-9.3.0
* cflags : CMAKE_FLAGS="-march=native-O2-mtune=native-pipe"
* CMAKE_BUILD_TYPE : "RelWithDebInfo"
**Additional context**
- Can you reproduce with another Darktable version(s)? **bisected, so several versions**
- Can you reproduce with a RAW or Jpeg or both? **RAW**
- Is the issue still present using an empty/new config-dir (e.g. start darktable with --configdir "/tmp")? **yes**
- [edit] opencl on/off does not matter
- (edit] complete empty history stack does not matter
|
1.0
|
[masks] drawn mask, path cannot be removed anymore and also opacity says 0% where it is actually 100% - **Describe the bug/issue**
Current Master:
An applied drawn path cannot be removed anymore with a right mouse click. At the same time, the mask opacity in the toast message says 0% where it is actually 100%
[edit] not only path is affected, also gradient, I assume all...
**To Reproduce**
_Please amend or remove, what is/isn't applicable_
1. Go to Exposure
2. Create a drawn path and
3. make mask indicator visible
4. Try to adjust opacity by ctrl+scroll and try to remove the mask with right click
5. see error
**Expected behavior**
former behaviour e.g. of 3.4.0
**Screenshots**

**Which commit introduced the error**
```
4f8915dc3a73f8211ea16f5f94ce48df2861d71f is the first bad commit
commit 4f8915dc3a73f8211ea16f5f94ce48df2861d71f
Author: Philipp Lutz <philipp.lutz@gmx.de>
Date: Sun Jan 3 23:44:39 2021 +0100
[masks/imageio] track module name changes in mask manager
Additionally refactor masks.c to reduce code duplication
src/develop/imageop.c | 2 +-
src/develop/masks.h | 2 ++
src/develop/masks/masks.c | 73 +++++++++++++++++++++++++++--------------------
3 files changed, 45 insertions(+), 32 deletions(-)
```
@da-phil may I ping you here!?
**Platform**
* darktable version : 3.5.0+543~g54e7e09eb
* OS : Linux - kernel 5.10.2-gentoo
* Distro : Gentoo Base System release 2.7
* Processor : Intel(R) Core(TM) i9-9900K CPU @ 3.60GHz
* Memory : 32 GB (4 x 8 GB) + 5GB Swap
* Graphics card0 : GeForce GTX 1060 6GB
* Graphics card1 : GeForce RTX 2070 SUPER
* Graphics driver : nvidia-drivers-460.27.04
* OpenCL installed : Yes (opencl-headers-2020.06.16)
* OpenCL activated : Yes
* Xorg : xorg-server-1.20.10
* Desktop : KDE 5
* GTK+ : gtk+-3.24.22
* gcc : x86_64-pc-linux-gnu-9.3.0
* cflags : CMAKE_FLAGS="-march=native-O2-mtune=native-pipe"
* CMAKE_BUILD_TYPE : "RelWithDebInfo"
**Additional context**
- Can you reproduce with another Darktable version(s)? **bisected, so several versions**
- Can you reproduce with a RAW or Jpeg or both? **RAW**
- Is the issue still present using an empty/new config-dir (e.g. start darktable with --configdir "/tmp")? **yes**
- [edit] opencl on/off does not matter
- (edit] complete empty history stack does not matter
|
process
|
drawn mask path cannot be removed anymore and also opacity says where it is actually describe the bug issue current master an applied drawn path cannot be removed anymore with right mouse click at the same time the mask opacity in toas message says where actually it is not only path is affected also gradient i assume all to reproduce please amend or remove what is isn t applicable go to exposure create a drawn path and make mask indicator visible try to adjust opacity by ctrl scroll and try to remove the mask with right click see error expected behavior former behaviour e g of screenshots which commit intruduced the error is the first bad commit commit author philipp lutz date sun jan track module name changes in mask manager additionally refactor masks c to reduce code duplication src develop imageop c src develop masks h src develop masks masks c files changed insertions deletions da phil may i ping you here platform darktable version os linux kernel gentoo distro gentoo base system release processor intel r core tm cpu memory gb x gb swap graphics geforce gtx graphics geforce rtx super graphics driver nvidia drivers opencl installed yes opencl headers opencl activated yes xorg xorg server desktop kde gtk gtk gcc pc linux gnu cflags cmake flags march native mtune native pipe cmake build type relwithdebinfo additional context can you reproduce with another darktable version s bisected so several versions can you reproduce with a raw or jpeg or both raw is the issue still present using an empty new config dir e g start darktable with configdir tmp yes opencl on off does not matter edit complete empty history stack does not matter
| 1
|
10,059
| 13,044,161,776
|
IssuesEvent
|
2020-07-29 03:47:26
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
closed
|
UCP: Migrate scalar function `AddDateIntString` from TiDB
|
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
|
## Description
Port the scalar function `AddDateIntString` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @mapleFU
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr
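The port described above targets TiKV's RPN expression framework. As a loose, self-contained sketch only (the real signature, the `tidb_query` time types, and the `#[rpn_fn]` macro are not reproduced here), the core of an add-date operation can be modeled with a date held as days since an epoch:

```rust
// Illustrative only: models AddDate semantics with a date stored as
// days since an epoch. The actual TiKV implementation operates on
// tidb_query time types inside an #[rpn_fn]-annotated function.
fn add_date_days(days_since_epoch: i64, interval_days: i64) -> Option<i64> {
    // Overflow maps to None, mirroring SQL's NULL-on-invalid-result behavior.
    days_since_epoch.checked_add(interval_days)
}
```

The real function must additionally parse the string interval argument and validate the resulting calendar date, which is where most of the porting effort lies.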
|
2.0
|
UCP: Migrate scalar function `AddDateIntString` from TiDB -
## Description
Port the scalar function `AddDateIntString` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @mapleFU
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr
|
process
|
ucp migrate scalar function adddateintstring from tidb description port the scalar function adddateintstring from tidb to coprocessor score mentor s maplefu recommended skills rust programming learning materials already implemented expressions ported from tidb
| 1
|
75,642
| 9,880,689,798
|
IssuesEvent
|
2019-06-24 13:11:49
|
bow-swift/bow
|
https://api.github.com/repos/bow-swift/bow
|
closed
|
Document Optics instances
|
documentation
|
## Description
Add API reference for the following instances:
- [x] ArrayK
- [x] Array
- [x] Const
- [x] EitherK
- [x] Either
- [x] Id
- [x] Iod
- [x] NonEmptyArray
- [x] Option
- [x] String
- [x] Try
- [x] Validated
|
1.0
|
Document Optics instances - ## Description
Add API reference for the following instances:
- [x] ArrayK
- [x] Array
- [x] Const
- [x] EitherK
- [x] Either
- [x] Id
- [x] Iod
- [x] NonEmptyArray
- [x] Option
- [x] String
- [x] Try
- [x] Validated
|
non_process
|
document optics instances description add api reference for the following instances arrayk array const eitherk either id iod nonemptyarray option string try validated
| 0
|
8,518
| 11,699,386,246
|
IssuesEvent
|
2020-03-06 15:33:12
|
googleapis/java-billing
|
https://api.github.com/repos/googleapis/java-billing
|
closed
|
Promote to GA
|
api: cloudbilling type: process
|
Package name: **google-cloud-billing**
Current release: **beta**
Proposed release: **GA**
## Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
## Required
- [ ] 28 days elapsed since last beta release with new API surface
- [x] Server API is GA
- [x] Package API is stable, and we can commit to backward compatibility
- [x] At least one integration/smoke test is defined and passing
- [x] All dependencies are GA
## Optional
- [ ] Most common / important scenarios have descriptive samples
- [ ] Public manual methods have at least one usage sample each (excluding overloads)
- [ ] Per-API README includes a full description of the API
- [ ] Per-API README contains at least one “getting started” sample using the most common API scenario
- [ ] Manual code has been reviewed by API producer
- [ ] Manual code has been reviewed by a DPE responsible for samples
- [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
|
1.0
|
Promote to GA - Package name: **google-cloud-billing**
Current release: **beta**
Proposed release: **GA**
## Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
## Required
- [ ] 28 days elapsed since last beta release with new API surface
- [x] Server API is GA
- [x] Package API is stable, and we can commit to backward compatibility
- [x] At least one integration/smoke test is defined and passing
- [x] All dependencies are GA
## Optional
- [ ] Most common / important scenarios have descriptive samples
- [ ] Public manual methods have at least one usage sample each (excluding overloads)
- [ ] Per-API README includes a full description of the API
- [ ] Per-API README contains at least one “getting started” sample using the most common API scenario
- [ ] Manual code has been reviewed by API producer
- [ ] Manual code has been reviewed by a DPE responsible for samples
- [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
|
process
|
promote to ga package name google cloud billing current release beta proposed release ga instructions check the lists below adding tests documentation as required once all the required boxes are ticked please create a release and close this issue required days elapsed since last beta release with new api surface server api is ga package api is stable and we can commit to backward compatibility at least one integration smoke test is defined and passing all dependencies are ga optional most common important scenarios have descriptive samples public manual methods have at least one usage sample each excluding overloads per api readme includes a full description of the api per api readme contains at least one “getting started” sample using the most common api scenario manual code has been reviewed by api producer manual code has been reviewed by a dpe responsible for samples client libraries page is added to the product documentation in apis reference section of the product s documentation on cloud site
| 1
|
16,502
| 21,485,025,183
|
IssuesEvent
|
2022-04-26 22:01:50
|
googleapis/repo-automation-bots
|
https://api.github.com/repos/googleapis/repo-automation-bots
|
reopened
|
Use dedicated service accounts for bot deployments
|
type: process priority: p1
|
The service accounts for the bot deployment are:
- web-frontend@repo-automation-bots.iam.gserviceaccount.com
for Cloud Run frontend deployment
- web-backend@repo-automation-bots.iam.gserviceaccount.com
for normal backend deployment
- policy-bot-backend@repo-automation-bots.iam.gserviceaccount.com
If a bot needs additional permissions, it's advised that we create a dedicated service account. Policy bot needs BigQuery write permission.
--
- [x] auto-approve
- [x] auto-label
- [x] bazel-bot
- [x] blunderbuss
- [x] canary-bot
- [x] cherry-pick-bot
- [x] conventional-commit-lint
- [x] do-not-merge
- [x] failurechecker
- [x] flakybot
- [x] generated-files-bot
- [x] header-checker-lint
- [x] label-sync
- [x] merge-on-green
- [x] owl-bot
- [x] policy
- [x] release-please
- [x] release-trigger
- [x] repo-metadata-lint
- [x] secret-rotator
- [x] snippet-bot
- [x] sync-repo-settings
- [x] trusted-contribution
|
1.0
|
Use dedicated service accounts for bot deployments - The service accounts for the bot deployment are:
- web-frontend@repo-automation-bots.iam.gserviceaccount.com
for Cloud Run frontend deployment
- web-backend@repo-automation-bots.iam.gserviceaccount.com
for normal backend deployment
- policy-bot-backend@repo-automation-bots.iam.gserviceaccount.com
If a bot needs additional permissions, it's advised that we create a dedicated service account. Policy bot needs BigQuery write permission.
--
- [x] auto-approve
- [x] auto-label
- [x] bazel-bot
- [x] blunderbuss
- [x] canary-bot
- [x] cherry-pick-bot
- [x] conventional-commit-lint
- [x] do-not-merge
- [x] failurechecker
- [x] flakybot
- [x] generated-files-bot
- [x] header-checker-lint
- [x] label-sync
- [x] merge-on-green
- [x] owl-bot
- [x] policy
- [x] release-please
- [x] release-trigger
- [x] repo-metadata-lint
- [x] secret-rotator
- [x] snippet-bot
- [x] sync-repo-settings
- [x] trusted-contribution
|
process
|
use dedicated service accounts for bot deployments the service accounts for the bot deployment are web frontend repo automation bots iam gserviceaccount com for cloud run frontend deployment web backend repo automation bots iam gserviceaccount com for normal backend deployment policy bot backend repo automation bots iam gserviceaccount com if a bot needs additional permissions it s advised that we create a dedicated service account policy bot needs bigquery write permission auto approve auto label bazel bot blunderbuss canary bot cherry pick bot conventional commit lint do not merge failurechecker flakybot generated files bot header checker lint label sync merge on green owl bot policy release please release trigger repo metadata lint secret rotator snippet bot sync repo settings trusted contribution
| 1
|
22,388
| 31,142,285,827
|
IssuesEvent
|
2023-08-16 01:44:14
|
cypress-io/cypress
|
https://api.github.com/repos/cypress-io/cypress
|
closed
|
Flaky test: timeouts on windows
|
OS: windows process: flaky test topic: flake ❄️ stage: flake stale
|
### Link to dashboard or CircleCI failure
https://app.circleci.com/pipelines/github/cypress-io/cypress/41761/workflows/958cb274-e0b3-4e6c-873f-2a60038625c7/jobs/1731172/parallel-runs/5
### Link to failing test in GitHub
Here's the link to the `taskInternal` function in question: https://github.com/cypress-io/cypress/blob/develop/packages/frontend-shared/cypress/e2e/support/e2eSupport.ts#L456
### Analysis
Digging through the logs/artifacts we see interesting hints such as:
1. A look at the logs shows repeated instances of MaxListenersExceededWarning:
```
(node:1252) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 dev-server:specs:changed listeners added to [EventEmitter]. Use emitter.setMaxListeners() to increase limit
(Use `node --trace-warnings ...` to show where the warning was created)
```
2. A closer look at the logs shows a performance warning from `NexusSlowGuard`:
<img width="1061" alt="Screen Shot 2022-08-12 at 10 23 27 AM" src="https://user-images.githubusercontent.com/26726429/184425330-361131be-8751-410f-8b4c-9d31ff881751.png">
3. In the video, we don't see a loading spinner between this test and the previous test -- it's still showing the previous test. From our own log in the reporter we see "no commands were issued in this test". Perhaps teardown not working as expected?
<img width="1555" alt="Screen Shot 2022-08-12 at 10 15 00 AM" src="https://user-images.githubusercontent.com/26726429/184424734-b9d9bb5f-786c-4cac-b015-d9e1fd44a101.png">
### Cypress Version
10.4.0
### Other
List of other timeout failures on windows with the same `MaxListenersExceededWarning` and `NexusSlowGuard` logs:
- https://app.circleci.com/pipelines/github/cypress-io/cypress/41805/workflows/c2045215-5c0b-4913-87f1-f7f78aeb6db2/jobs/1733233/tests
|
1.0
|
Flaky test: timeouts on windows - ### Link to dashboard or CircleCI failure
https://app.circleci.com/pipelines/github/cypress-io/cypress/41761/workflows/958cb274-e0b3-4e6c-873f-2a60038625c7/jobs/1731172/parallel-runs/5
### Link to failing test in GitHub
Here's the link to the `taskInternal` function in question: https://github.com/cypress-io/cypress/blob/develop/packages/frontend-shared/cypress/e2e/support/e2eSupport.ts#L456
### Analysis
Digging through the logs/artifacts we see interesting hints such as:
1. A look at the logs shows repeated instances of MaxListenersExceededWarning:
```
(node:1252) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 dev-server:specs:changed listeners added to [EventEmitter]. Use emitter.setMaxListeners() to increase limit
(Use `node --trace-warnings ...` to show where the warning was created)
```
2. A closer look at the logs shows a performance warning from `NexusSlowGuard`:
<img width="1061" alt="Screen Shot 2022-08-12 at 10 23 27 AM" src="https://user-images.githubusercontent.com/26726429/184425330-361131be-8751-410f-8b4c-9d31ff881751.png">
3. In the video, we don't see a loading spinner between this test and the previous test -- it's still showing the previous test. From our own log in the reporter we see "no commands were issued in this test". Perhaps teardown not working as expected?
<img width="1555" alt="Screen Shot 2022-08-12 at 10 15 00 AM" src="https://user-images.githubusercontent.com/26726429/184424734-b9d9bb5f-786c-4cac-b015-d9e1fd44a101.png">
### Cypress Version
10.4.0
### Other
List of other timeout failures on windows with the same `MaxListenersExceededWarning` and `NexusSlowGuard` logs:
- https://app.circleci.com/pipelines/github/cypress-io/cypress/41805/workflows/c2045215-5c0b-4913-87f1-f7f78aeb6db2/jobs/1733233/tests
|
process
|
flaky test timeouts on windows link to dashboard or circleci failure link to failing test in github here s the link to the taskinternal function in question analysis digging through the logs artifacts we see interesting hints such as a look at the logs shows repeated instances of maxlistenersexceededwarning node maxlistenersexceededwarning possible eventemitter memory leak detected dev server specs changed listeners added to use emitter setmaxlisteners to increase limit use node trace warnings to show where the warning was created a closer look at the logs shows a performance warning from nexusslowguard img width alt screen shot at am src in the video we don t see a loading spinner between this test and the previous test it s still showing the previous test from our own log in the reporter we see no commands were issued in this test perhaps teardown not working as expected img width alt screen shot at am src cypress version other list of other timeout failures on windows with the same maxlistenersexceededwarning and nexusslowguard logs
| 1
|
10,229
| 13,094,490,749
|
IssuesEvent
|
2020-08-03 12:30:34
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Raster Merge produces negative Dimension
|
Bug Feedback Processing
|
The Merging Function produces a negative dimension, as already posted in the previous version
Error Protocol:
> QGIS-Version: 3.10.2-A Coruña
QGIS-Codeversion: 616ad4531b
Qt-Version: 5.11.2
GDAL-Version: 3.0.2
GEOS-Version: 3.8.0-CAPI-1.13.1
PROJ-Version: Rel. 6.2.1, November 1st, 2019
Verarbeite Algorithmus…
Algorithmus Verschmelzen startet…
Input parameters:
{ 'DATA_TYPE' : 5, 'EXTRA' : '', 'INPUT' : ['C:/Users/Riedel/Projekte/fun/wandern/334445638_dgm1.xyz','C:/Users/Riedel/Projekte/fun/wandern/334445640_dgm1.xyz','C:/Users/Riedel/Projekte/fun/wandern/334465636_dgm1.xyz','C:/Users/Riedel/Projekte/fun/wandern/334465638_dgm1.xyz','C:/Users/Riedel/Projekte/fun/wandern/334465640_dgm1.xyz','C:/Users/Riedel/Projekte/fun/wandern/334485638_dgm1.xyz'], 'NODATA_INPUT' : None, 'NODATA_OUTPUT' : None, 'OPTIONS' : '', 'OUTPUT' : 'C:/Users/Riedel/Projekte/fun/wandern/test.tif', 'PCT' : False, 'SEPARATE' : False }
GDAL command:
python3 -m gdal_merge -ot Float32 -of GTiff -o C:/Users/Riedel/Projekte/fun/wandern/test.tif --optfile C:\Users\Riedel\AppData\Local\Temp/processing_7c6cecac1d6744d6a5c2bb3a8d321471/4ee4855bf4a845aa9964efafe07f5beb/mergeInputFiles.txt
GDAL command output:
ERROR 1: Attempt to create 6000x-1999 dataset is illegal,sizes must be larger than zero.
Creation failed, terminating gdal_merge.
Execution completed in 18.71 seconds
Results:
{'OUTPUT': 'C:/Users/Riedel/Projekte/fun/wandern/test.tif'}
Lade Ergebnis Layer
Die folgenden Layer wurden nicht erzeugt.<ul><li>C:/Users/Riedel/Projekte/fun/wandern/test.tif</li></ul>Im 'Protokoll-Fenster' im QGIS-Hauptfenster sind mehr Informationen zur Ausführung des Algorithmus zu finden.
_Originally posted by @GGDRriedel in https://github.com/qgis/QGIS/issues/34284#issuecomment-646559409_
|
1.0
|
Raster Merge produces negative Dimension - The Merging Function produces a negative dimension, as already posted in the previous version
Error Protocol:
> QGIS-Version: 3.10.2-A Coruña
QGIS-Codeversion: 616ad4531b
Qt-Version: 5.11.2
GDAL-Version: 3.0.2
GEOS-Version: 3.8.0-CAPI-1.13.1
PROJ-Version: Rel. 6.2.1, November 1st, 2019
Verarbeite Algorithmus…
Algorithmus Verschmelzen startet…
Input parameters:
{ 'DATA_TYPE' : 5, 'EXTRA' : '', 'INPUT' : ['C:/Users/Riedel/Projekte/fun/wandern/334445638_dgm1.xyz','C:/Users/Riedel/Projekte/fun/wandern/334445640_dgm1.xyz','C:/Users/Riedel/Projekte/fun/wandern/334465636_dgm1.xyz','C:/Users/Riedel/Projekte/fun/wandern/334465638_dgm1.xyz','C:/Users/Riedel/Projekte/fun/wandern/334465640_dgm1.xyz','C:/Users/Riedel/Projekte/fun/wandern/334485638_dgm1.xyz'], 'NODATA_INPUT' : None, 'NODATA_OUTPUT' : None, 'OPTIONS' : '', 'OUTPUT' : 'C:/Users/Riedel/Projekte/fun/wandern/test.tif', 'PCT' : False, 'SEPARATE' : False }
GDAL command:
python3 -m gdal_merge -ot Float32 -of GTiff -o C:/Users/Riedel/Projekte/fun/wandern/test.tif --optfile C:\Users\Riedel\AppData\Local\Temp/processing_7c6cecac1d6744d6a5c2bb3a8d321471/4ee4855bf4a845aa9964efafe07f5beb/mergeInputFiles.txt
GDAL command output:
ERROR 1: Attempt to create 6000x-1999 dataset is illegal,sizes must be larger than zero.
Creation failed, terminating gdal_merge.
Execution completed in 18.71 seconds
Results:
{'OUTPUT': 'C:/Users/Riedel/Projekte/fun/wandern/test.tif'}
Lade Ergebnis Layer
Die folgenden Layer wurden nicht erzeugt.<ul><li>C:/Users/Riedel/Projekte/fun/wandern/test.tif</li></ul>Im 'Protokoll-Fenster' im QGIS-Hauptfenster sind mehr Informationen zur Ausführung des Algorithmus zu finden.
_Originally posted by @GGDRriedel in https://github.com/qgis/QGIS/issues/34284#issuecomment-646559409_
|
process
|
raster merge produces negative dimension the merging function produces a negative dimension as already posted in the previous version error protocol qgis version a coruña qgis codeversion qt version gdal version geos version capi proj version rel november verarbeite algorithmus… algorithmus verschmelzen startet… input parameters data type extra input nodata input none nodata output none options output c users riedel projekte fun wandern test tif pct false separate false gdal command m gdal merge ot of gtiff o c users riedel projekte fun wandern test tif optfile c users riedel appdata local temp processing mergeinputfiles txt gdal command output error attempt to create dataset is illegal sizes must be larger than zero creation failed terminating gdal merge execution completed in seconds results output c users riedel projekte fun wandern test tif lade ergebnis layer die folgenden layer wurden nicht erzeugt c users riedel projekte fun wandern test tif im protokoll fenster im qgis hauptfenster sind mehr informationen zur ausführung des algorithmus zu finden originally posted by ggdrriedel in
| 1
|
20,797
| 27,546,793,617
|
IssuesEvent
|
2023-03-07 12:23:19
|
ExpertSDR3/ExpertSDR3-BUG-TRACKER
|
https://api.github.com/repos/ExpertSDR3/ExpertSDR3-BUG-TRACKER
|
closed
|
XVTR setup - Incorrectly displayed frequency of the transverter for the 24GHz band
|
bug in process
|
HI,
ESDR31 1.0.3 - beta, Win10, 16GB RAM
When in the XVTR settings, I enter RX / TX frequency 24048.000 MHz and IF 145.000 MHz, the display shows the wrong frequency 11.999.980, 480
The setting of 10,368,000 (3cm band) and lower XVTR bands is correct
|
1.0
|
XVTR setup - Incorrectly displayed frequency of the transverter for the 24GHz band - HI,
ESDR31 1.0.3 - beta, Win10, 16GB RAM
When in the XVTR settings, I enter RX / TX frequency 24048.000 MHz and IF 145.000 MHz, the display shows the wrong frequency 11.999.980, 480
The setting of 10,368,000 (3cm band) and lower XVTR bands is correct
|
process
|
xvtr setup incorrectly displayed frequency of the transverter for the band hi beta ram when in the xvtr settings i enter rx tx frequency mhz and if mhz the display shows the wrong frequency the setting of band and lower xvtr bands is correct
| 1
|
73,374
| 15,253,659,286
|
IssuesEvent
|
2021-02-20 08:42:01
|
gsylvie/madness
|
https://api.github.com/repos/gsylvie/madness
|
closed
|
CVE-2015-5262 Medium Severity Vulnerability detected by WhiteSource - autoclosed
|
security vulnerability
|
## CVE-2015-5262 - Medium Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>httpclient-4.3.5.jar</b></p></summary>
<p>null</p>
<p>path: /root/.m2/repository/org/apache/httpcomponents/httpclient/4.3.5/httpclient-4.3.5.jar</p>
<p>
Dependency Hierarchy:
- httpasyncclient-4.0.2.jar (Root Library)
- :x: **httpclient-4.3.5.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
http/conn/ssl/SSLConnectionSocketFactory.java in Apache HttpComponents HttpClient before 4.3.6 ignores the http.socket.timeout configuration setting during an SSL handshake, which allows remote attackers to cause a denial of service (HTTPS call hang) via unspecified vectors.
<p>Publish Date: 2015-10-27
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-5262>CVE-2015-5262</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://www.securitytracker.com/id/1033743">http://www.securitytracker.com/id/1033743</a></p>
<p>Release Date: 2017-12-31</p>
<p>Fix Resolution: The vendor has issued a fix (4.3.6).
The vendor has also issued a source code fix, available at:
http://svn.apache.org/viewvc/httpcomponents/httpclient/branches/4.3.x/httpclient/src/main/java/org/apache/http/conn/ssl/SSLConnectionSocketFactory.java?r1=1560975&r2=1626784</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2015-5262 Medium Severity Vulnerability detected by WhiteSource - autoclosed - ## CVE-2015-5262 - Medium Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>httpclient-4.3.5.jar</b></p></summary>
<p>null</p>
<p>path: /root/.m2/repository/org/apache/httpcomponents/httpclient/4.3.5/httpclient-4.3.5.jar</p>
<p>
Dependency Hierarchy:
- httpasyncclient-4.0.2.jar (Root Library)
- :x: **httpclient-4.3.5.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
http/conn/ssl/SSLConnectionSocketFactory.java in Apache HttpComponents HttpClient before 4.3.6 ignores the http.socket.timeout configuration setting during an SSL handshake, which allows remote attackers to cause a denial of service (HTTPS call hang) via unspecified vectors.
<p>Publish Date: 2015-10-27
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-5262>CVE-2015-5262</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://www.securitytracker.com/id/1033743">http://www.securitytracker.com/id/1033743</a></p>
<p>Release Date: 2017-12-31</p>
<p>Fix Resolution: The vendor has issued a fix (4.3.6).
The vendor has also issued a source code fix, available at:
http://svn.apache.org/viewvc/httpcomponents/httpclient/branches/4.3.x/httpclient/src/main/java/org/apache/http/conn/ssl/SSLConnectionSocketFactory.java?r1=1560975&r2=1626784</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium severity vulnerability detected by whitesource autoclosed cve medium severity vulnerability vulnerable library httpclient jar null path root repository org apache httpcomponents httpclient httpclient jar dependency hierarchy httpasyncclient jar root library x httpclient jar vulnerable library vulnerability details http conn ssl sslconnectionsocketfactory java in apache httpcomponents httpclient before ignores the http socket timeout configuration setting during an ssl handshake which allows remote attackers to cause a denial of service https call hang via unspecified vectors publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution the vendor has issued a fix the vendor has also issued a source code fix available at step up your open source security game with whitesource
| 0
|
116,072
| 9,819,382,740
|
IssuesEvent
|
2019-06-13 21:50:33
|
ethereum/eth2.0-specs
|
https://api.github.com/repos/ethereum/eth2.0-specs
|
closed
|
Invalid transfer sanity test
|
CI/tests
|
For all our config the `MAX_TRANSFERS` value is 0. However, we have tests that contain transfer, and it's expected to not error!
|
1.0
|
Invalid transfer sanity test - For all our config the `MAX_TRANSFERS` value is 0. However, we have tests that contain transfer, and it's expected to not error!
|
non_process
|
invalid transfer sanity test for all our config the max transfers value is however we have tests that contain transfer and it s expected to not error
| 0
|
15,822
| 20,016,356,434
|
IssuesEvent
|
2022-02-01 12:28:59
|
digitalmethodsinitiative/4cat
|
https://api.github.com/repos/digitalmethodsinitiative/4cat
|
closed
|
Merge 'render rankflow' and 'interactive flowchart' processors
|
enhancement processors
|
SVG can contain JavaScript and be interactive, so the interactive elements of the 'interactive flowchart' graph could be integrated into the SVG that 'render rankflow' produces
|
1.0
|
Merge 'render rankflow' and 'interactive flowchart' processors - SVG can contain JavaScript and be interactive, so the interactive elements of the 'interactive flowchart' graph could be integrated into the SVG that 'render rankflow' produces
|
process
|
merge render rankflow and interactive flowchart processors svg can contain javascript and be interactive so the interactive elements of the interactive flowchart graph could be integrated into the svg that render rankflow produces
| 1
|
101,394
| 4,116,686,972
|
IssuesEvent
|
2016-06-08 02:15:31
|
joemcgill/jojo2016
|
https://api.github.com/repos/joemcgill/jojo2016
|
closed
|
Add newsletter signups to the end of each post
|
enhancement priority
|
At the bottom of each post, we could add a little box that promotes the email list and includes an input form for adding your email address.
* Get text from Joanna
* She had an example site as well.
|
1.0
|
Add newsletter signups to the end of each post - At the bottom of each post, we could add a little box that promotes the email list and includes an input form for adding your email address.
* Get text from Joanna
* She had an example site as well.
|
non_process
|
add newsletter signups to the end of each post at the bottom of each post we could add a little box that promotes the email list and includes an input form for adding your email address get text from joanna she had an example site as well
| 0
|
7,616
| 10,727,042,308
|
IssuesEvent
|
2019-10-28 10:43:53
|
prisma/prisma2
|
https://api.github.com/repos/prisma/prisma2
|
closed
|
Remove panics
|
area/binaries kind/improvement process/candidate
|
**Problem**
Every time I see a rust panic my eyes glaze over, I feel helpless & I google around for other solutions.
```
➜ prisma prisma2 lift save
Error: Error in migration engine: thread 'tokio-runtime-worker-1' panicked at 'get data_type', src/libcore/option.rs:1034:5
stack backtrace:
0: std::panicking::default_hook::{{closure}}
1: std::panicking::default_hook
2: std::panicking::rust_panic_with_hook
3: std::panicking::continue_panic_fmt
4: rust_begin_unwind
5: core::panicking::panic_fmt
6: core::option::expect_failed
7: core::ops::function::impls::<impl core::ops::function::FnOnce<A> for &mut F>::call_once
8: <alloc::vec::Vec<T> as alloc::vec::SpecExtend<T,I>>::from_iter
9: sql_schema_describer::mysql::SqlSchemaDescriber::get_table
10: <core::iter::adapters::Map<I,F> as core::iter::traits::iterator::Iterator>::fold
11: <alloc::vec::Vec<T> as alloc::vec::SpecExtend<T,I>>::from_iter
12: <sql_schema_describer::mysql::SqlSchemaDescriber as sql_schema_describer::SqlSchemaDescriberBackend>::describe
13: <sql_migration_connector::sql_database_migration_inferrer::SqlDatabaseMigrationInferrer as migration_connector::database_migration_inferrer::DatabaseMigrationInferrer<sql_migration_connector::sql_migration::SqlMigration>>::infer
14: <migration_core::api::MigrationApi<C,D> as migration_core::api::GenericApi>::infer_migration_steps
15: migration_core::api::rpc::RpcApi::create_sync_handler
16: tokio_executor::enter::exit
17: tokio_threadpool::blocking::blocking
18: <futures::future::lazy::Lazy<F,R> as futures::future::Future>::poll
19: futures::future::chain::Chain<A,B,C>::poll
20: <futures::future::then::Then<A,B,F> as futures::future::Future>::poll
21: <futures::future::lazy::Lazy<F,R> as futures::future::Future>::poll
22: futures::future::chain::Chain<A,B,C>::poll
23: <futures::future::then::Then<A,B,F> as futures::future::Future>::poll
24: <futures::future::map::Map<A,F> as futures::future::Future>::poll
25: <futures::future::either::Either<A,B> as futures::future::Future>::poll
26: <futures::future::map::Map<A,F> as futures::future::Future>::poll
27: <futures::future::map_err::MapErr<A,F> as futures::future::Future>::poll
28: <futures::stream::and_then::AndThen<S,F,U> as futures::stream::Stream>::poll
29: <futures::stream::forward::Forward<T,U> as futures::future::Future>::poll
30: <futures::future::map::Map<A,F> as futures::future::Future>::poll
31: <futures::future::map_err::MapErr<A,F> as futures::future::Future>::poll
32: futures::task_impl::std::set
33: std::panicking::try::do_call
34: __rust_maybe_catch_panic
35: tokio_threadpool::task::Task::run
36: tokio_threadpool::worker::Worker::run_task
37: tokio_threadpool::worker::Worker::run
38: std::thread::local::LocalKey<T>::with
39: std::thread::local::LocalKey<T>::with
40: tokio_reactor::with_default
41: tokio::runtime::threadpool::builder::Builder::build::{{closure}}
42: std::thread::local::LocalKey<T>::with
43: std::thread::local::LocalKey<T>::with
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
```
**Proposed Solution**
I'd like us to get in a better habit of returning errors for these cases, rather than panicking. Here's the code snippet that caused the panic above:
https://github.com/prisma/prisma-engine/blob/a348799e88f99b9d287bedf9149abb764b0382f4/libs/sql-schema-describer/src/mysql.rs#L93
Concretely this means turning panickable code fragments like `.expect(...)` to `match` statements and returning an error like `Unable to match the "timestamp without timezone" data type`. It's more code, but also more actionable and way more user-friendly.
Ideally we could also add a rust linter that will catch panickable code from the stdlib.
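The proposed `.expect(...)` → `match` conversion can be sketched as follows. This is a hypothetical stand-in, not the prisma-engine API: the `DescriberError` type and the `HashMap` row shape are inventions for illustration.

```rust
// Hypothetical stand-in for the describer code path: instead of
// `row.get("data_type").expect("get data_type")` (which panics when the
// column is absent), match on the Option and surface a typed error.
#[derive(Debug, PartialEq)]
enum DescriberError {
    MissingColumn(&'static str),
}

fn get_data_type(
    row: &std::collections::HashMap<String, String>,
) -> Result<String, DescriberError> {
    match row.get("data_type") {
        Some(v) => Ok(v.clone()),
        None => Err(DescriberError::MissingColumn("data_type")),
    }
}
```

The caller then propagates the error with `?` instead of unwinding, so the CLI can print an actionable message rather than a backtrace.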
|
1.0
|
Remove panics - **Problem**
Every time I see a rust panic my eyes glaze over, I feel helpless & I google around for other solutions.
```
➜ prisma prisma2 lift save
Error: Error in migration engine: thread 'tokio-runtime-worker-1' panicked at 'get data_type', src/libcore/option.rs:1034:5
stack backtrace:
0: std::panicking::default_hook::{{closure}}
1: std::panicking::default_hook
2: std::panicking::rust_panic_with_hook
3: std::panicking::continue_panic_fmt
4: rust_begin_unwind
5: core::panicking::panic_fmt
6: core::option::expect_failed
7: core::ops::function::impls::<impl core::ops::function::FnOnce<A> for &mut F>::call_once
8: <alloc::vec::Vec<T> as alloc::vec::SpecExtend<T,I>>::from_iter
9: sql_schema_describer::mysql::SqlSchemaDescriber::get_table
10: <core::iter::adapters::Map<I,F> as core::iter::traits::iterator::Iterator>::fold
11: <alloc::vec::Vec<T> as alloc::vec::SpecExtend<T,I>>::from_iter
12: <sql_schema_describer::mysql::SqlSchemaDescriber as sql_schema_describer::SqlSchemaDescriberBackend>::describe
13: <sql_migration_connector::sql_database_migration_inferrer::SqlDatabaseMigrationInferrer as migration_connector::database_migration_inferrer::DatabaseMigrationInferrer<sql_migration_connector::sql_migration::SqlMigration>>::infer
14: <migration_core::api::MigrationApi<C,D> as migration_core::api::GenericApi>::infer_migration_steps
15: migration_core::api::rpc::RpcApi::create_sync_handler
16: tokio_executor::enter::exit
17: tokio_threadpool::blocking::blocking
18: <futures::future::lazy::Lazy<F,R> as futures::future::Future>::poll
19: futures::future::chain::Chain<A,B,C>::poll
20: <futures::future::then::Then<A,B,F> as futures::future::Future>::poll
21: <futures::future::lazy::Lazy<F,R> as futures::future::Future>::poll
22: futures::future::chain::Chain<A,B,C>::poll
23: <futures::future::then::Then<A,B,F> as futures::future::Future>::poll
24: <futures::future::map::Map<A,F> as futures::future::Future>::poll
25: <futures::future::either::Either<A,B> as futures::future::Future>::poll
26: <futures::future::map::Map<A,F> as futures::future::Future>::poll
27: <futures::future::map_err::MapErr<A,F> as futures::future::Future>::poll
28: <futures::stream::and_then::AndThen<S,F,U> as futures::stream::Stream>::poll
29: <futures::stream::forward::Forward<T,U> as futures::future::Future>::poll
30: <futures::future::map::Map<A,F> as futures::future::Future>::poll
31: <futures::future::map_err::MapErr<A,F> as futures::future::Future>::poll
32: futures::task_impl::std::set
33: std::panicking::try::do_call
34: __rust_maybe_catch_panic
35: tokio_threadpool::task::Task::run
36: tokio_threadpool::worker::Worker::run_task
37: tokio_threadpool::worker::Worker::run
38: std::thread::local::LocalKey<T>::with
39: std::thread::local::LocalKey<T>::with
40: tokio_reactor::with_default
41: tokio::runtime::threadpool::builder::Builder::build::{{closure}}
42: std::thread::local::LocalKey<T>::with
43: std::thread::local::LocalKey<T>::with
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
```
**Proposed Solution**
I'd like us to get in a better habit of returning errors for these cases, rather than panicking. Here's the code snippet that caused the panic above:
https://github.com/prisma/prisma-engine/blob/a348799e88f99b9d287bedf9149abb764b0382f4/libs/sql-schema-describer/src/mysql.rs#L93
Concretely this means turning panickable code fragments like `.expect(...)` to `match` statements and returning an error like `Unable to match the "timestamp without timezone" data type`. It's more code, but also more actionable and way more user-friendly.
Ideally we could also add a rust linter that will catch panickable code from the stdlib.
|
process
|
remove panics problem everytime i see rust panic my eyes glaze over i feel helpless i google around for other solutions ➜ prisma lift save error error in migration engine thread tokio runtime worker panicked at get data type src libcore option rs stack backtrace std panicking default hook closure std panicking default hook std panicking rust panic with hook std panicking continue panic fmt rust begin unwind core panicking panic fmt core option expect failed core ops function impls for mut f call once as alloc vec specextend from iter sql schema describer mysql sqlschemadescriber get table as core iter traits iterator iterator fold as alloc vec specextend from iter describe infer as migration core api genericapi infer migration steps migration core api rpc rpcapi create sync handler tokio executor enter exit tokio threadpool blocking blocking as futures future future poll futures future chain chain poll as futures future future poll as futures future future poll futures future chain chain poll as futures future future poll as futures future future poll as futures future future poll as futures future future poll as futures future future poll as futures stream stream poll as futures future future poll as futures future future poll as futures future future poll futures task impl std set std panicking try do call rust maybe catch panic tokio threadpool task task run tokio threadpool worker worker run task tokio threadpool worker worker run std thread local localkey with std thread local localkey with tokio reactor with default tokio runtime threadpool builder builder build closure std thread local localkey with std thread local localkey with note some details are omitted run with rust backtrace full for a verbose backtrace proposed solution i d like us to get in a better habit of returning errors for these cases rather than panicking here s the code snippet that caused the panic above concretely this means turning panickable code fragments like expect to match 
statements and returning an error like unable to match the timestamp without timezone data type it s more code but also more actionable and way more user friendly ideally we could also add a rust linter that will catch panickable code from the stdlib
| 1
|
198,030
| 6,968,865,845
|
IssuesEvent
|
2017-12-11 00:53:27
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
travel.state.gov - site is not usable
|
browser-firefox priority-normal
|
<!-- @browser: Firefox 59.0 -->
<!-- @ua_header: Mozilla/5.0 (X11; Linux x86_64; rv:59.0) Gecko/20100101 Firefox/59.0 -->
<!-- @reported_with: desktop-reporter -->
**URL**: https://travel.state.gov/content/travel/en/contact-us.html
**Browser / Version**: Firefox 59.0
**Operating System**: Linux
**Tested Another Browser**: Yes
**Problem type**: Site is not usable
**Description**: Images are missing from page, cannot visit linked pages
**Steps to Reproduce**:
Navigated to this page, (linked) images are missing from
* U.S. Visas Contacts
* U.S. Passports
* Intercountry Adoption Contacts
* International Parental Child Abduction Contacts
* Legal Resources
Works in Chrome for Linux
[](https://webcompat.com/uploads/2017/12/e6e70874-e07e-44b6-9bf1-36412f9429b4.jpg)
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
travel.state.gov - site is not usable - <!-- @browser: Firefox 59.0 -->
<!-- @ua_header: Mozilla/5.0 (X11; Linux x86_64; rv:59.0) Gecko/20100101 Firefox/59.0 -->
<!-- @reported_with: desktop-reporter -->
**URL**: https://travel.state.gov/content/travel/en/contact-us.html
**Browser / Version**: Firefox 59.0
**Operating System**: Linux
**Tested Another Browser**: Yes
**Problem type**: Site is not usable
**Description**: Images are missing from page, cannot visit linked pages
**Steps to Reproduce**:
Navigated to this page, (linked) images are missing from
* U.S. Visas Contacts
* U.S. Passports
* Intercountry Adoption Contacts
* International Parental Child Abduction Contacts
* Legal Resources
Works in Chrome for Linux
[](https://webcompat.com/uploads/2017/12/e6e70874-e07e-44b6-9bf1-36412f9429b4.jpg)
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
travel state gov site is not usable url browser version firefox operating system linux tested another browser yes problem type site is not usable description images are missing from page cannot visit linked pages steps to reproduce navigated to this page linked images are missing from u s visas contacts u s passports intercountry adoption contacts international parental child abduction contacts legal resources works in chrome for linux from with ❤️
| 0
|
66,634
| 27,529,626,849
|
IssuesEvent
|
2023-03-06 20:54:23
|
cityofaustin/atd-data-tech
|
https://api.github.com/repos/cityofaustin/atd-data-tech
|
closed
|
Create Viewer for Dark Signal Response Meeting
|
Type: Map Request Service: Geo Workgroup: ATD
|
**Viewer Requested to include**
- [x] All Signals that ATD manages
- [x] Streets/Street names and what level the street is from the Austin Strategic Mobility Plan (ASMP)
- [x] Roadway speed limits (if possible)
- [x] A good aerial
|
1.0
|
Create Viewer for Dark Signal Response Meeting - **Viewer Requested to include**
- [x] All Signals that ATD manages
- [x] Streets/Street names and what level the street is from the Austin Strategic Mobility Plan (ASMP)
- [x] Roadway speed limits (if possible)
- [x] A good aerial
|
non_process
|
create viewer for dark signal response meeting viewer requested to include all signals that atd manages streets street names and what level the street is from the austin strategic mobility plan asmp roadway speed limits if possible a good aerial
| 0
|
59,310
| 6,647,069,094
|
IssuesEvent
|
2017-09-28 01:31:06
|
equella/Equella
|
https://api.github.com/repos/equella/Equella
|
closed
|
Notification email text and template tweaks
|
enhancement Ready for 6.5 GA Testing
|
Several of the notification emails need their text tweaked to make more sense.
|
1.0
|
Notification email text and template tweaks - Several of the notification emails need their text tweaked to make more sense.
|
non_process
|
notification email text and template tweaks several of the notification emails need their text tweaked to make more sense
| 0
|
223,859
| 7,461,327,073
|
IssuesEvent
|
2018-03-31 01:25:28
|
Motoxpro/WorldCupStatsSite
|
https://api.github.com/repos/Motoxpro/WorldCupStatsSite
|
closed
|
Keep an eye on 2016 World Champs R&R Scrape
|
Bug Data Collection Data/Backend Medium Priority Data Issue
|
Didn't get scraped, not sure why. Keep an eye on it when executing main.py
|
1.0
|
Keep an eye on 2016 World Champs R&R Scrape - Didn't get scraped, not sure why. Keep an eye on it when executing main.py
|
non_process
|
keep an eye on world champs r r scrape didn t get scraped not sure why keep an eye on it when executing main py
| 0
|
286,961
| 21,631,262,535
|
IssuesEvent
|
2022-05-05 09:57:14
|
dotnet/diagnostics
|
https://api.github.com/repos/dotnet/diagnostics
|
closed
|
ILogger guidance
|
enhancement documentation
|
We've long had confusing guidance (or absent guidance) about where to use ILogger and where to use EventSource. Even within the .NET organization informed opinions vary about where each option is best suited and if it is reasonable to offer simple guidance that tells .NET developers to just use one and ignore the other. This issue is an attempt to bring that discussion together in one recorded place rather than the various emails, meetings, and ad-hoc conversations where it often pops up.
An ideal outcome would be that we have documented guidance on docs.microsoft.com that not only describes how to use a particular logging technology, but also gives clear advice about which logging technology the .NET team recommends to use. The advice should be clear, concise, and handle most (not necessarily all) situations.
The best documented history on this issue I am aware of: https://github.com/aspnet/Logging/issues/332 (from 2016)
As a strawman I'll propose the guidance is "Reference the most recent version of Microsoft.Extensions.Logging and log with the ILogger interface." At minimum this is lacking in detail which needs to be fleshed out, but possibly we'll find there are some more substantive problems that need to be addressed. This issue is an opportunity to poke at this strawman in hopes of either shoring it up or changing it to something that works better before we document it.
This is an initial set of concerns I want to get addressed either via my own research or feedback from others:
1. When some code wants to log, where does the instance of ILogger come from?
For libraries that were designed with ILogger and dependency injection in mind from the start there is an [established pattern](https://docs.microsoft.com/en-us/aspnet/core/fundamentals/logging/?view=aspnetcore-3.1#create-logs) of putting an ILogger typed parameter in the constructor. However if you had a pre-existing library that doesn't have such constructors are you intended to add them? This would also require the ILogger instance to be passed as an argument through every method that eventually calls one of those constructors. This appears extremely invasive to API design. Using a static storage location and some static or singleton entrypoint to initialize it might be a reasonable way forward?
2. What code component is in charge of creating ILogger instances from the ILoggerFactory?
In the DI scenario the DI container does this task, but without a DI container whose job is it?
3. Are there any cases we are aware of where it is unreasonable to take the assembly dependency on M.E.L?
For example M.E.L has had some breaking changes in the past, is this likely to lead apps to encounter unresolvable versioning conflicts at build time or runtime? There are some low-level framework libraries that can't reference M.E.L. for layering reasons, but these can probably be ignored because it isn't a restriction typical .Net developers would encounter.
4. EventSource supports a scenario where out-of-process tools can retrieve logs from an instrumented library even if the app author who included the library made zero effort to configure logging or didn't configure logging for specific libraries. Is a similar capability possible from ILogger or do we think the value of this scenario is low enough that we can ignore its absence?
5. I've heard people say they think EventSource isn't suitable for logs primarily intended for human consumption whereas
ILogger isn't suitable for profiling style events intended for automated consumption. Are there technical reasons to make such
a distinction? Would doing both of these tasks with ILogger lead to confusion or a worse experience?
6. EventSource has features that support activity tracking, keywords, tasks, opcodes, tags, ETW/EventPipe/Lttng integration, and dynamic discovery/subscription. Does ILogger provide the same capabilities (for example via the EventSourceLoggerProvider bridge), or suitable alternatives, or we think the features are suitably corner case that they can be ignored in most cases?
7. We tied EventCounters to EventSource, does it look weird to tell people to create EventSources for counters at the same time we are recommending them to ignore EventSource for logging? New metric APIs that don't have direct coupling on EventSource might alleviate this?
More opinions, concerns, questions, and feedback are all encouraged. Thanks!
cc @shirhatti @brianrob @maryamariyan @tarekgh @dotnet/dotnet-diag @davidfowl @reyang @cijothomas
|
1.0
|
ILogger guidance - We've long had confusing guidance (or absent guidance) about where to use ILogger and where to use EventSource. Even within the .NET organization informed opinions vary about where each option is best suited and if it is reasonable to offer simple guidance that tells .NET developers to just use one and ignore the other. This issue is an attempt to bring that discussion together in one recorded place rather than the various emails, meetings, and ad-hoc conversations where it often pops up.
An ideal outcome would be that we have documented guidance on docs.microsoft.com that not only describes how to use a particular logging technology, but also gives clear advice about which logging technology the .NET team recommends to use. The advice should be clear, concise, and handle most (not necessarily all) situations.
The best documented history on this issue I am aware of: https://github.com/aspnet/Logging/issues/332 (from 2016)
As a strawman I'll propose the guidance is "Reference the most recent version of Microsoft.Extensions.Logging and log with the ILogger interface." At minimum this is lacking in detail which needs to be fleshed out, but possibly we'll find there are some more substantive problems that need to be addressed. This issue is an opportunity to poke at this strawman in hopes of either shoring it up or changing it to something that works better before we document it.
This is an initial set of concerns I want to get addressed either via my own research or feedback from others:
1. When some code wants to log, where does the instance of ILogger come from?
For libraries that were designed with ILogger and dependency injection in mind from the start there is an [established pattern](https://docs.microsoft.com/en-us/aspnet/core/fundamentals/logging/?view=aspnetcore-3.1#create-logs) of putting an ILogger typed parameter in the constructor. However if you had a pre-existing library that doesn't have such constructors are you intended to add them? This would also require the ILogger instance to be passed as an argument through every method that eventually calls one of those constructors. This appears extremely invasive to API design. Using a static storage location and some static or singleton entrypoint to initialize it might be a reasonable way forward?
2. What code component is in charge of creating ILogger instances from the ILoggerFactory?
In the DI scenario the DI container does this task, but without a DI container whose job is it?
3. Are there any cases we are aware of where it is unreasonable to take the assembly dependency on M.E.L?
For example M.E.L has had some breaking changes in the past, is this likely to lead apps to encounter unresolvable versioning conflicts at build time or runtime? There are some low-level framework libraries that can't reference M.E.L. for layering reasons, but these can probably be ignored because it isn't a restriction typical .Net developers would encounter.
4. EventSource supports a scenario where out-of-process tools can retrieve logs from an instrumented library even if the app author who included the library made zero effort to configure logging or didn't configure logging for specific libraries. Is a similar capability possible from ILogger or do we think the value of this scenario is low enough that we can ignore its absence?
5. I've heard people say they think EventSource isn't suitable for logs primarily intended for human consumption whereas
ILogger isn't suitable for profiling style events intended for automated consumption. Are there technical reasons to make such
a distinction? Would doing both of these tasks with ILogger lead to confusion or a worse experience?
6. EventSource has features that support activity tracking, keywords, tasks, opcodes, tags, ETW/EventPipe/Lttng integration, and dynamic discovery/subscription. Does ILogger provide the same capabilities (for example via the EventSourceLoggerProvider bridge), or suitable alternatives, or we think the features are suitably corner case that they can be ignored in most cases?
7. We tied EventCounters to EventSource, does it look weird to tell people to create EventSources for counters at the same time we are recommending them to ignore EventSource for logging? New metric APIs that don't have direct coupling on EventSource might alleviate this?
More opinions, concerns, questions, and feedback are all encouraged. Thanks!
cc @shirhatti @brianrob @maryamariyan @tarekgh @dotnet/dotnet-diag @davidfowl @reyang @cijothomas
|
non_process
|
ilogger guidance we ve long had confusing guidance or absent guidance about where to use ilogger and where to use eventsource even within the net organization informed opinions vary about where each option is best suited and if it is reasonable to offer simple guidance that tells net developers to just use one and ignore the other this issue is an attempt to bring that discussion together in one recorded place rather than the various emails meetings and ad hoc conversations where it often pops up an ideal outcome would be that we have documented guidance on docs microsoft com that not only describes how to use a particular logging technology but also gives clear advice about which logging technology the net team recommends to use the advice should be clear concise and handle most not necessarily all situations the best documented history on this issue i am aware of from as a strawman i ll propose the guidance is reference the most recent version of microsoft extensions logging and log with the ilogger interface at minimum this is lacking in detail which needs to be fleshed out but possibly we ll find there are some more substantive problems that need to be addressed this issue is an opportunity to poke at this strawman in hopes of either shoring it up or changing it to something that works better before we document it this is an initial set of concerns i want to get addressed either via my own research or feedback from others when some code wants to log where does the instance of ilogger come from for libraries that were designed with ilogger and dependency injection in mind from the start there is an of putting an ilogger typed parameter in the constructor however if you had a pre existing library that doesn t have such constructors are you intended to add them this would also require the ilogger instance to be passed as an argument through every method that eventually calls one of those constructors this appears extremely invasive on api design using a static 
storage location and some static or singleton entrypoint to initialize it might be a reasonable way forward what code component is in charge of creating ilogger instances from the iloggerfactory in the di scenario the di container does this task but without a di container whose job is it are there any cases we are aware of where it is unreasonable to take the assembly dependency on m e l for example m e l has had some breaking changes in the past is this likely to lead apps to encounter unresolvable versioning conflicts at build time or runtime there are some low level framework libraries that can t reference m e l for layering reasons but these can probably be ignored because it isn t a restriction typical net developers would encounter eventsource supports a scenario where out of process tools can retrieve logs from an instrumented library even if the app author who included the library made zero effort to configure logging or didn t configure logging for specific libraries is a similar capability possible from ilogger or do we think the value of this scenario is low enough that we can ignore its absence i ve heard people say they think eventsource isn t suitable for logs primarily intended for human consumption whereas ilogger isn t suitable for profiling style events intended for automated consumption are there technical reasons to make such a distinction would doing both of these tasks with ilogger lead to confusion or a worse experience eventsource has features that support activity tracking keywords tasks opcodes tags etw eventpipe lttng integration and dynamic discovery subscription does ilogger provide the same capabilities for example via the eventsourceloggerprovider bridge or suitable alternatives or we think the features are suitably corner case that they can be ignored in most cases we tied eventcounters to eventsource does it look weird to tell people to create eventsources for counters at the same time we are recommending them to ignore eventsource 
for logging new metric apis that don t have direct coupling on eventsource might alleviate this more opinions concerns questions and feedback are all encouraged thanks cc shirhatti brianrob maryamariyan tarekgh dotnet dotnet diag davidfowl reyang cijothomas
| 0
|
9,280
| 12,303,812,459
|
IssuesEvent
|
2020-05-11 19:22:00
|
nextgenhealthcare/connect
|
https://api.github.com/repos/nextgenhealthcare/connect
|
closed
|
Error script in addition to preprocessor and postprocessor scripts
|
destination error errorprocessor processor queues
|
I have seen many enquiries about custom error handling in channels with persistent queues.
All suggested workarounds seem troublesome to me because they often require the creation of extra channels which is not good from a maintenance point of view.
Additional channel parameters (retry counts, etc) are almost impossible to define because of the specific problems (errors) end users are facing.
May I suggest a more generic solution for this problem.
Currently the following scripts can be defined for a channel:
1) deploy
2) shutdown
3) preprocessor
4) postprocessor
I would like to add:
5) errorprocessor
The error processor is called for each destination connector, even if there are no errors. In the errorprocessor script the user should have
access to the actual responsemap of the destination connector. From that data the user can figure out what he wants. With the return value
he can set the number of retries:
0 = no retries (do what is done normally: sent, errored or queued)
1 = retry 1 time
2 = retry 2 times
3 = etc.
The user can also force a message to error by throwing an exception.
I think this simple scheme will help a lot of users with many problems in the past and future.
I did not go into the Mirth Connect source code but maybe the way persistent queues are designed prevents this approach.
I am looking forward to your comments.
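The retry contract proposed above (return value = number of retries, thrown exception = force the message to error) could be sketched roughly as follows. Note that `Status`, `Response`, and the timeout heuristic are hypothetical stand-ins for illustration, not Mirth Connect's actual connector API:

```rust
// Hedged sketch of the proposed errorprocessor contract: given a
// destination connector's response, return how many retries to perform.
// 0 = default behaviour (sent / errored / queued), N = retry N times.
// Returning Err is the analogue of throwing an exception in the script,
// forcing the message to error.

#[derive(Debug)]
enum Status {
    Sent,
    Error,
    Queued,
}

struct Response {
    status: Status,
    message: String,
}

fn error_processor(resp: &Response) -> Result<u32, String> {
    match resp.status {
        // success or already queued: keep the normal behaviour
        Status::Sent | Status::Queued => Ok(0),
        // transient-looking failure: retry a few times
        Status::Error if resp.message.contains("timeout") => Ok(3),
        // anything else: force the message to error
        Status::Error => Err(format!("permanent failure: {}", resp.message)),
    }
}
```

In a real channel the decision logic would inspect the connector's response map rather than a plain string, but the shape of the contract stays the same.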
Imported Issue. Original Details:
Jira Issue Key: MIRTH-2136
Reporter: ncpunt
Created: 2012-04-20T04:44:30.000-0700
|
2.0
|
Error script in addition to preprocessor and postprocessor scripts - I have seen many enquiries about custom error handling in channels with persistent queues.
All suggested workarounds seem troublesome to me because they often require the creation of extra channels which is not good from a maintenance point of view.
Additional channel parameters (retry counts, etc) are almost impossible to define because of the specific problems (errors) end users are facing.
May I suggest a more generic solution for this problem.
Currently the following scripts can be defined for a channel:
1) deploy
2) shutdown
3) preprocessor
4) postprocessor
I would like to add:
5) errorprocessor
The error processor is called for each destination connector, even if there are no errors. In the errorprocessor script the user should have
access to the actual responsemap of the destination connector. From that data the user can figure out what he wants. With the return value
he can set the number of retries:
0 = no retries (do what is done normally: sent, errored or queued)
1 = retry 1 time
2 = retry 2 times
3 = etc.
The user can also force a message to error by throwing an exception.
I think this simple scheme will help a lot of users with many problems in the past and future.
I did not go into the Mirth Connect source code but maybe the way persistent queues are designed prevents this approach.
I am looking forward to your comments.
Imported Issue. Original Details:
Jira Issue Key: MIRTH-2136
Reporter: ncpunt
Created: 2012-04-20T04:44:30.000-0700
|
process
|
error script in addition to preprocessor and postprocessor scripts i have seen many enquiries about custom error handling in channels with persistent queues all suggested workarounds seem troublesome to me because they often require the creation of extra channels which is not good from a maintenance point of view additional channel parameters retry counts etc are almost impossible to define because of the specific problems errors end users are facing may i suggest a more generic solution for this problem currently the following scripts can be defined for a channel deploy shutdown preprocessor postprocessor i would like to add errorprocessor the error processor is called for each destination connector even if there are no errors in the errorprocessor script the user should have access to the actual responsemap of the destination connector from that data the user can figure out what he wants with the return value he can set the number of retries no retries do what is done normally sent errored or queued retry time retry times etc the user can also force an message to error by throwing an exception i think this simple scheme will help a lot of users with many problems in the past and future i did not go into the mith connect source code but maybe the way persistent queues are designed prevent this approach i am looking forward to your comments imported issue original details jira issue key mirth reporter ncpunt created
| 1
|
71,458
| 7,245,231,248
|
IssuesEvent
|
2018-02-14 17:23:49
|
FreeUKGen/FreeUKRegProductIssues
|
https://api.github.com/repos/FreeUKGen/FreeUKRegProductIssues
|
closed
|
97949111 Spelling of Gazetteer (Eric)
|
testing
|
Issue reported by **REGManager** at 2018-02-07 16:05:11 UTC
Time: 2018-02-07T16:03:20+00:00
Session ID: 72255f5ec470b6f579f440a80a819657
Problem Page URL: [/places/53cbf613eca9eb03a6009ca3/edit](/places/53cbf613eca9eb03a6009ca3/edit)
Previous Page URL: [https://www.freereg.org.uk/places/53cbf613eca9eb03a6009ca3](https://www.freereg.org.uk/places/53cbf613eca9eb03a6009ca3)
Reported Issue:
In the Edit Place screen
http://www13.freereg.org.uk/uploads/feedback/screenshot/5a7b23b7f493fda05ed5ecc9/Capture.JPG

|
1.0
|
97949111 Spelling of Gazetteer (Eric) - Issue reported by **REGManager** at 2018-02-07 16:05:11 UTC
Time: 2018-02-07T16:03:20+00:00
Session ID: 72255f5ec470b6f579f440a80a819657
Problem Page URL: [/places/53cbf613eca9eb03a6009ca3/edit](/places/53cbf613eca9eb03a6009ca3/edit)
Previous Page URL: [https://www.freereg.org.uk/places/53cbf613eca9eb03a6009ca3](https://www.freereg.org.uk/places/53cbf613eca9eb03a6009ca3)
Reported Issue:
In the Edit Place screen
http://www13.freereg.org.uk/uploads/feedback/screenshot/5a7b23b7f493fda05ed5ecc9/Capture.JPG

|
non_process
|
spelling of gazetteer eric issue reported by regmanager at utc time session id problem page url places edit previous page url reported issue in the edit place screen
| 0
|
240,357
| 26,256,304,192
|
IssuesEvent
|
2023-01-06 01:14:39
|
attesch/zencart
|
https://api.github.com/repos/attesch/zencart
|
opened
|
CVE-2021-23807 (High) detected in jsonpointer-4.0.1.tgz
|
security vulnerability
|
## CVE-2021-23807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jsonpointer-4.0.1.tgz</b></p></summary>
<p>Simple JSON Addressing.</p>
<p>Library home page: <a href="https://registry.npmjs.org/jsonpointer/-/jsonpointer-4.0.1.tgz">https://registry.npmjs.org/jsonpointer/-/jsonpointer-4.0.1.tgz</a></p>
<p>Path to dependency file: /zencart/admin/includes/template/javascript/gridstack.js-master/package.json</p>
<p>Path to vulnerable library: /admin/includes/template/javascript/gridstack.js-master/node_modules/jsonpointer/package.json</p>
<p>
Dependency Hierarchy:
- coveralls-2.13.3.tgz (Root Library)
- request-2.79.0.tgz
- har-validator-2.0.6.tgz
- is-my-json-valid-2.20.0.tgz
- :x: **jsonpointer-4.0.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package jsonpointer before 5.0.0. A type confusion vulnerability can lead to a bypass of a previous Prototype Pollution fix when the pointer components are arrays.
<p>Publish Date: 2021-11-03
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-23807>CVE-2021-23807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23807">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23807</a></p>
<p>Release Date: 2021-11-03</p>
<p>Fix Resolution (jsonpointer): 5.0.0</p>
<p>Direct dependency fix Resolution (coveralls): 3.0.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-23807 (High) detected in jsonpointer-4.0.1.tgz - ## CVE-2021-23807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jsonpointer-4.0.1.tgz</b></p></summary>
<p>Simple JSON Addressing.</p>
<p>Library home page: <a href="https://registry.npmjs.org/jsonpointer/-/jsonpointer-4.0.1.tgz">https://registry.npmjs.org/jsonpointer/-/jsonpointer-4.0.1.tgz</a></p>
<p>Path to dependency file: /zencart/admin/includes/template/javascript/gridstack.js-master/package.json</p>
<p>Path to vulnerable library: /admin/includes/template/javascript/gridstack.js-master/node_modules/jsonpointer/package.json</p>
<p>
Dependency Hierarchy:
- coveralls-2.13.3.tgz (Root Library)
- request-2.79.0.tgz
- har-validator-2.0.6.tgz
- is-my-json-valid-2.20.0.tgz
- :x: **jsonpointer-4.0.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package jsonpointer before 5.0.0. A type confusion vulnerability can lead to a bypass of a previous Prototype Pollution fix when the pointer components are arrays.
<p>Publish Date: 2021-11-03</p>
<p>URL: <a href="https://www.mend.io/vulnerability-database/CVE-2021-23807">CVE-2021-23807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23807">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23807</a></p>
<p>Release Date: 2021-11-03</p>
<p>Fix Resolution (jsonpointer): 5.0.0</p>
<p>Direct dependency fix Resolution (coveralls): 3.0.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in jsonpointer tgz cve high severity vulnerability vulnerable library jsonpointer tgz simple json addressing library home page a href path to dependency file zencart admin includes template javascript gridstack js master package json path to vulnerable library admin includes template javascript gridstack js master node modules jsonpointer package json dependency hierarchy coveralls tgz root library request tgz har validator tgz is my json valid tgz x jsonpointer tgz vulnerable library vulnerability details this affects the package jsonpointer before a type confusion vulnerability can lead to a bypass of a previous prototype pollution fix when the pointer components are arrays publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jsonpointer direct dependency fix resolution coveralls step up your open source security game with mend
| 0
|
424,780
| 29,176,660,413
|
IssuesEvent
|
2023-05-19 08:29:46
|
JoTec2002/TINF21C_AAS_Management
|
https://api.github.com/repos/JoTec2002/TINF21C_AAS_Management
|
closed
|
Improvements SRS
|
documentation open
|
Due to the feedback in our presentation please update the **SRS** file (Business Processes).
Please also update the filename with following updates (Versioning)
|
1.0
|
Improvements SRS - Due to the feedback in our presentation please update the **SRS** file (Business Processes).
Please also update the filename with following updates (Versioning)
|
non_process
|
improvements srs due to the feedback in our presentation please update the srs file business processes please also update the filename with following updates versioning
| 0
|
437,786
| 30,609,496,307
|
IssuesEvent
|
2023-07-23 12:37:47
|
novemberizing/eva-old
|
https://api.github.com/repos/novemberizing/eva-old
|
closed
|
Additional implementation list
|
DOCUMENTATION ENHANCEMENT
|
- [ ] Multi event engine: think about how the event engine can be used and provide guidance for the various approaches.
- [ ] Support SIGNAL events
- [ ] Handle SIGPIPE (so that's what raises the exception.)
- [ ] Client pool (horizontal partitioning / vertical partitioning): about half of this is implemented.
- [ ] Organize the implementation so far. (With rough diagrams...)
- [ ] Unsure whether to delete or keep the descriptor check function. - Delete it. (The check function is not used anywhere so far.)
- [ ] Research: find the number at which an application's file-open descriptors start, beyond the system descriptors.
This is a tricky part.
Stream handling is needed, and I want users to be able to use stream objects without thinking about them.
But the stream does not exist anywhere here,
because the descriptor does not hold a stream.
Is there a way to keep the design consistent and still get stream optimization when reading through a stream?
Only via the PREPROCESSOR?
I put a descriptor ON hook in; placing it there should work....
For sockets: xsocketon (descriptor, xdescriptoreventtype_in, buffer, n);
- [ ] IPv6 support ...
- [ ] Logo - EVA
- [ ] Implement FUNCTION TRACE. (Put it in the log module.)
So this is what implementing the stack was good for,....
- [ ] Implement TRAVERSE for the map.
- [ ] A simple Twitter command line interface
- [ ] Apply ATEXIT
- [ ] HASH
- [ ] B+ TREE
- [ ] Look into ways to run tests automatically.
- [ ] For the echo test, build a program that sends a large file over multiple sockets.
- [ ] Provide zero copy as a mask. (If the zero-copy mask is present in the engine's process, turn the WRITE, READ, WRITE logic into a WRITE performed at READ time. Make SEND FILE possible.)
- [ ] It is time to organize the documentation.
- [ ] Build STRING-related functions.
- [ ] Build something regex-related that can replace POSIX.
|
1.0
|
Additional implementation list - - [ ] Multi event engine: think about how the event engine can be used and provide guidance for the various approaches.
- [ ] Support SIGNAL events
- [ ] Handle SIGPIPE (so that's what raises the exception.)
- [ ] Client pool (horizontal partitioning / vertical partitioning): about half of this is implemented.
- [ ] Organize the implementation so far. (With rough diagrams...)
- [ ] Unsure whether to delete or keep the descriptor check function. - Delete it. (The check function is not used anywhere so far.)
- [ ] Research: find the number at which an application's file-open descriptors start, beyond the system descriptors.
This is a tricky part.
Stream handling is needed, and I want users to be able to use stream objects without thinking about them.
But the stream does not exist anywhere here,
because the descriptor does not hold a stream.
Is there a way to keep the design consistent and still get stream optimization when reading through a stream?
Only via the PREPROCESSOR?
I put a descriptor ON hook in; placing it there should work....
For sockets: xsocketon (descriptor, xdescriptoreventtype_in, buffer, n);
- [ ] IPv6 support ...
- [ ] Logo - EVA
- [ ] Implement FUNCTION TRACE. (Put it in the log module.)
So this is what implementing the stack was good for,....
- [ ] Implement TRAVERSE for the map.
- [ ] A simple Twitter command line interface
- [ ] Apply ATEXIT
- [ ] HASH
- [ ] B+ TREE
- [ ] Look into ways to run tests automatically.
- [ ] For the echo test, build a program that sends a large file over multiple sockets.
- [ ] Provide zero copy as a mask. (If the zero-copy mask is present in the engine's process, turn the WRITE, READ, WRITE logic into a WRITE performed at READ time. Make SEND FILE possible.)
- [ ] It is time to organize the documentation.
- [ ] Build STRING-related functions.
- [ ] Build something regex-related that can replace POSIX.
|
non_process
|
추가적인 구현 리스트 멀티 이벤트 엔진 이벤트 엔진을 활용할 수 있는 방법을 고민해보고 다양한 방법을 가이드할 수 있도록 하자 signal 이벤트 지원 sigpipe 에 대한 처리 예외가 발생하는구나 클라이언트 풀 수평적 분할 수직적 분할 반 정도는 구현이 되었다 지금까지의 구현을 정리하자 그림으로 적당히 디스크립터의 체크 함수는 지울까 사용할까 고민이 된다 지우자 현재까지는 체크 함수를 사용하고 있지 않다 자료조사 시스템 디스크립터 외에 어플리케이션에서 파일 오픈 시에 시작하는 디스크립터의 번호를 알자 고민스러운 부분이다 스트림 처리를 해야 하는데 사용자가 스트림 객체를 고민없이 사용하도록 하고 싶다 그렇지만 여기에는 어디에서 스트림이 존재하지 않는다 디스크립터는 스트림을 가지고 있지 않기 때문이다 설계의 통일성을 가지면서 스트림을 사용해서 읽기를 수행했을 때 스트림의 최적화를 이룰 수 있는 방법이 있을까 preprocessor 뿐인가 디스크립터 on 을 넣어 두었는데 여기에 두면 되겠구나 소켓의 경우 xsocketon descriptor xdescriptoreventtype in buffer n 지원 로고 eva function trace 를 구현하자 로그 모듈에 넣고 스택을 구현했다는 것은 이럴 때 사용할만 하구만 맵의 traverse 를 구현해야겠다 간단한 트위터 커맨드 라인 인터페이스 atexit 적용하기 hash b tree 자동으로 테스트할 수 있는 방법을 알아보자 에코 테스트를 진행할때 큰 파일을 여러 소켓을 이용해서 전송하는 프로그램을 만들어서 진행하자 제로 카피는 마스크로 제공하자 엔진의 프로세스에서 제로 카피 마스크가 존재하면 write read write 로직을 read 수행 시에 write 를 하도록 하자 send file 을 수행할 수 있도록 하자 문서를 정리할 때이다 string 관련 함수를 만들자 정규 표현식 관련하여 posix 를 대체할만한 무엇인가를 만들자
| 0
|
17,395
| 23,210,337,316
|
IssuesEvent
|
2022-08-02 09:34:23
|
deepset-ai/haystack
|
https://api.github.com/repos/deepset-ai/haystack
|
closed
|
Add support for custom trained PunktTokenizer in PreProcessor
|
type:feature topic:preprocessing journey:advanced
|
### Discussed in https://github.com/deepset-ai/haystack/discussions/2773
<div type='discussions-op-text'>
<sup>Originally posted by **danielbichuetti** July 6, 2022</sup>
Hi,
Today, the PreProcessor makes use of the NLTK PunktTokenizer. The default one is great, except in some specific domains, like the legal one, where there are many abbreviations that it messes up a little.
I would like to propose and offer myself to implement the possibility to set a custom trained PunktTokenizer for any set of languages.
For example, user defines a directory with the file name using the ISO pattern where Haystack will then search for the language, if not found default to default one.
What do you think about this feature ? It wouldn't interfere and just improve specific cases (which are many in NLP domain).
Have a great day!</div>
**Is your feature request related to a problem? Please describe.**
Today, the PreProcessor makes use of the NLTK PunktTokenizer. The default one is great, except in some specific domains, like the legal one, where there are many abbreviations that it messes up a little.
**Describe the solution you'd like**
Introduce a parameter `tokenizer_model_folder` on PreProcessor which would represent a directory where custom models could be stored using ISO like:
- pt.pickle
- en.pickle
If for that specific language, a model is present on this folder, PreProcessor would use it. If not, fallback to default one. Since pre-processing is a task for NLP that has a close connection to the domain of the text and the specific task, if anyone wants he could have a folder legal, another medical and so on. And when calling PreProcessor could setup a parameter `model_folder=`
**Describe alternatives you've considered**
Another idea might be to support different sentence tokenizers in general. This would however be more time-consuming as anything else than passing primitives to nodes's __init__ is discouraged as it doesn't work with YAML definitions.
**Additional context**
@danielbichuetti did a test of the default models for NLTK, Spacy and Stanza. The best default model for portuguese (my scenario) was Stanza. But we could get lots of improvements on NLTK using PunktTrainer with a small corpus of legal documents with some abbreviations. These errors in split sentences probably occur in other domains that make usage of lots of dots inside sentences.
When making tests using GPT-3 which has a huge max token size @danielbichuetti got questions not being answered which were present in the text, just because of the bad sentence split. If it happens breaking the law fundaments (article) of a judicial decision, models won't be able to correctly infer. Or when it breaks the judge name and so on. On the law domain, these abbreviations often carry a very important information.
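The per-language fallback described above (a `<iso>.pickle` model per language, defaulting to the stock model when none is present) can be sketched roughly as follows. The function name, the `tokenizer_model_folder` parameter, and the `default_loader` hook are illustrative assumptions drawn from this proposal, not an existing Haystack API:

```python
import os
import pickle

def load_sentence_tokenizer(tokenizer_model_folder, language, default_loader=None):
    """Return a custom-trained sentence tokenizer (e.g. a pickled NLTK
    PunktSentenceTokenizer) for `language` if `<language>.pickle` exists in
    `tokenizer_model_folder`; otherwise fall back to `default_loader`
    (e.g. a wrapper around nltk.data.load)."""
    if tokenizer_model_folder:
        path = os.path.join(tokenizer_model_folder, f"{language}.pickle")
        if os.path.isfile(path):
            with open(path, "rb") as f:
                return pickle.load(f)  # domain-specific model, e.g. legal texts
    return default_loader(language) if default_loader else None
```

With a `legal/` folder holding `pt.pickle`, a Portuguese legal model would be picked up automatically, while languages without a custom model would still use the default Punkt model.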
|
1.0
|
Add support for custom trained PunktTokenizer in PreProcessor - ### Discussed in https://github.com/deepset-ai/haystack/discussions/2773
<div type='discussions-op-text'>
<sup>Originally posted by **danielbichuetti** July 6, 2022</sup>
Hi,
Today, the PreProcessor makes use of the NLTK PunktTokenizer. The default one is great, except in some specific domains, like the legal one, where there are many abbreviations that it messes up a little.
I would like to propose and offer myself to implement the possibility to set a custom trained PunktTokenizer for any set of languages.
For example, user defines a directory with the file name using the ISO pattern where Haystack will then search for the language, if not found default to default one.
What do you think about this feature ? It wouldn't interfere and just improve specific cases (which are many in NLP domain).
Have a great day!</div>
**Is your feature request related to a problem? Please describe.**
Today, the PreProcessor makes use of the NLTK PunktTokenizer. The default one is great, except in some specific domains, like the legal one, where there are many abbreviations that it messes up a little.
**Describe the solution you'd like**
Introduce a parameter `tokenizer_model_folder` on PreProcessor which would represent a directory where custom models could be stored using ISO like:
- pt.pickle
- en.pickle
If for that specific language, a model is present on this folder, PreProcessor would use it. If not, fallback to default one. Since pre-processing is a task for NLP that has a close connection to the domain of the text and the specific task, if anyone wants he could have a folder legal, another medical and so on. And when calling PreProcessor could setup a parameter `model_folder=`
**Describe alternatives you've considered**
Another idea might be to support different sentence tokenizers in general. This would however be more time-consuming as anything else than passing primitives to nodes's __init__ is discouraged as it doesn't work with YAML definitions.
**Additional context**
@danielbichuetti did a test of the default models for NLTK, Spacy and Stanza. The best default model for portuguese (my scenario) was Stanza. But we could get lots of improvements on NLTK using PunktTrainer with a small corpus of legal documents with some abbreviations. These errors in split sentences probably occur in other domains that make usage of lots of dots inside sentences.
When making tests using GPT-3 which has a huge max token size @danielbichuetti got questions not being answered which were present in the text, just because of the bad sentence split. If it happens breaking the law fundaments (article) of a judicial decision, models won't be able to correctly infer. Or when it breaks the judge name and so on. On the law domain, these abbreviations often carry a very important information.
|
process
|
add support for custom trained punkttokenizer in preprocessor discussed in originally posted by danielbichuetti july hi today the preprocessor makes usages of nltk punkttokenizer the default one is great except for some specific domains like the legal one where there are many abbreviations that it messes up a little i would like to propose and offer myself to implement the possibility to set a custom trained punkttokenizer for any set of languages for example user defines a directory with the file name using the iso pattern where haystack will then search for the language if not found default to default one what do you think about this feature it wouldn t interfere and just improve specific cases which are many in nlp domain have a great day is your feature request related to a problem please describe today the preprocessor makes usages of nltk punkttokenizer the default one is great except for some specific domains like the legal one where there are many abbreviations that it messes up a little describe the solution you d like introduce a parameter tokenizer model folder on preprocessor which would represent a directory where custom models could be stored using iso like pt pickle en pickle if for that specific language a model is present on this folder preprocessor would use it if not fallback to default one since pre processing is a task for nlp that has a close connection to the domain of the text and the specific task if anyone wants he could have a folder legal another medical and so on and when calling preprocessor could setup a parameter model folder describe alternatives you ve considered another idea might be to support different sentence tokenizers in general this would however be more time consuming as anything else than passing primitives to nodes s init is discouraged as it doesn t work with yaml definitions additional context danielbichuetti did a test of the default models for nltk spacy and stanza the best default model for portuguese my scenario 
was stanza but we could get lots of improvements on nltk using punkttrainer with a small corpus of legal documents with some abbreviations these errors in split sentences probably occur in other domains that make usage of lots of dots inside sentences when making tests using gpt which has a huge max token size danielbichuetti got questions not being answered which were present in the text just because of the bad sentence split if it happens breaking the law fundaments article of a judicial decision models won t be able to correctly infer or when it breaks the judge name and so on on the law domain these abbreviations often carry a very important information
| 1
|
94,099
| 19,476,241,090
|
IssuesEvent
|
2021-12-24 12:58:48
|
Onelinerhub/onelinerhub
|
https://api.github.com/repos/Onelinerhub/onelinerhub
|
closed
|
Short solution needed: "How to load docker image" (docker)
|
help wanted good first issue code docker
|
Please help us write most modern and shortest code solution for this issue:
**How to load docker image** (technology: [docker](https://onelinerhub.com/docker))
### Fast way
Just write the code solution in the comments.
### Preferred way
1. Create pull request with a new code file inside [inbox folder](https://github.com/Onelinerhub/onelinerhub/tree/main/inbox).
2. Don't forget to use comments to make solution explained.
3. Link to this issue in comments of pull request.
|
1.0
|
Short solution needed: "How to load docker image" (docker) - Please help us write most modern and shortest code solution for this issue:
**How to load docker image** (technology: [docker](https://onelinerhub.com/docker))
### Fast way
Just write the code solution in the comments.
### Preferred way
1. Create pull request with a new code file inside [inbox folder](https://github.com/Onelinerhub/onelinerhub/tree/main/inbox).
2. Don't forget to use comments to make solution explained.
3. Link to this issue in comments of pull request.
|
non_process
|
short solution needed how to load docker image docker please help us write most modern and shortest code solution for this issue how to load docker image technology fast way just write the code solution in the comments prefered way create pull request with a new code file inside don t forget to use comments to make solution explained link to this issue in comments of pull request
| 0
|
6,903
| 10,055,978,796
|
IssuesEvent
|
2019-07-22 08:02:59
|
CymChad/BaseRecyclerViewAdapterHelper
|
https://api.github.com/repos/CymChad/BaseRecyclerViewAdapterHelper
|
closed
|
Problem with BaseMultiItemQuickAdapter
|
processing
|
When the menu is expanded, clearing all of the items under the menu causes those items to stay under the menu, and the menu can no longer be collapsed.
getSubItems().clear();
|
1.0
|
Problem with BaseMultiItemQuickAdapter - When the menu is expanded, clearing all of the items under the menu causes those items to stay under the menu, and the menu can no longer be collapsed.
getSubItems().clear();
|
process
|
basemultiitemquickadapter的问题 在菜单展开的时候 去清理菜单下面的所有的item 会导致item一直在菜单下面 无法折叠 getsubitems clear
| 1
|
27,504
| 13,260,751,778
|
IssuesEvent
|
2020-08-20 18:41:38
|
nion-software/nionswift
|
https://api.github.com/repos/nion-software/nionswift
|
opened
|
Expand partial writing capabilities to allow passing a data source rather than sourcing from an array
|
f - acquisition f - performance stage - planning type - enhancement
|
This could cut out a step of copying when using partial data writing during acquisition if the acquisition device can pass itself as a source directly to the data item writing machinery.
|
True
|
Expand partial writing capabilities to allow passing a data source rather than sourcing from an array - This could cut out a step of copying when using partial data writing during acquisition if the acquisition device can pass itself as a source directly to the data item writing machinery.
|
non_process
|
expand partial writing capabilities to allow passing a data source rather than sourcing from an array this could cut out a step of copying when using partial data writing during acquisition if the acquisition device can pass itself as a source directly to the data item writing machinery
| 0
|
22,694
| 32,005,185,666
|
IssuesEvent
|
2023-09-21 14:27:50
|
X-Sharp/XSharpPublic
|
https://api.github.com/repos/X-Sharp/XSharpPublic
|
closed
|
Preprocessor bug in WP (XBase++ dialect)
|
bug duplicate Compiler Preprocessor
|
**Describe the bug**
The preprocessor translates the UDC incorrectly. I believe this error is similar to the one described in the [ticket](https://github.com/X-Sharp/XSharpPublic/issues/1284)
**To Reproduce .prg**
```
// watch point
#xcommand WP [/<x:S,F,SF,FS>] [/<y:CAPS,C>] [/<m:MUTE,M>] [/OBJ <obj> = <mes> [,<objn> = <mesn> ]] <list,...> [<file>] ;
=> wpRouter({<list>},<(x)>,<file>,<(y)>,<(m)>,{[{<"obj">,{|o| <mes>}}] [,{<"objn">,{|o| <mesn>} }] })
procedure Main()
local a, b, c
wp a,b,c
wp /sf /m a,b,c
return
```
**Expected behavior (xBase++ ppo**
Output
```
procedure Main()
local a, b, c
wpRouter({a,b,c},,,,,{ })
wpRouter({a,b,c},"SF",,,"M",{ })
return
```
**Actual behavior (X# ppo)**
I do not pretend to change the case of strings `"sf"`, `"m"`.
```
procedure Main()
local a, b, c
wpRouter({a,b,},,c,,,{ })
wpRouter({a,b,},"sf",c,,"m",{ })
return
```
**Additional context**
X# Compiler version 2.16.0.5 (public)
-dialect:xBase++ -xpp1 -lb -memvar -vo1 -vo3 -vo5 -vo10 -vo15 -vo16
|
1.0
|
Preprocessor bug in WP (XBase++ dialect) - **Describe the bug**
The preprocessor translates the UDC incorrectly. I believe this error is similar to the one described in the [ticket](https://github.com/X-Sharp/XSharpPublic/issues/1284)
**To Reproduce .prg**
```
// watch point
#xcommand WP [/<x:S,F,SF,FS>] [/<y:CAPS,C>] [/<m:MUTE,M>] [/OBJ <obj> = <mes> [,<objn> = <mesn> ]] <list,...> [<file>] ;
=> wpRouter({<list>},<(x)>,<file>,<(y)>,<(m)>,{[{<"obj">,{|o| <mes>}}] [,{<"objn">,{|o| <mesn>} }] })
procedure Main()
local a, b, c
wp a,b,c
wp /sf /m a,b,c
return
```
**Expected behavior (xBase++ ppo**
Output
```
procedure Main()
local a, b, c
wpRouter({a,b,c},,,,,{ })
wpRouter({a,b,c},"SF",,,"M",{ })
return
```
**Actual behavior (X# ppo)**
I do not pretend to change the case of strings `"sf"`, `"m"`.
```
procedure Main()
local a, b, c
wpRouter({a,b,},,c,,,{ })
wpRouter({a,b,},"sf",c,,"m",{ })
return
```
**Additional context**
X# Compiler version 2.16.0.5 (public)
-dialect:xBase++ -xpp1 -lb -memvar -vo1 -vo3 -vo5 -vo10 -vo15 -vo16
|
process
|
preprocessor bug in wp xbase dialect describe the bug preprocessor translates incorrectly udc i believe that this error is similar to the one described in the to reproduce prg watch point xcommand wp wprouter procedure main local a b c wp a b c wp sf m a b c return expected behavior xbase ppo output procedure main local a b c wprouter a b c wprouter a b c sf m return actual behavior x ppo i do not pretend to change the case of strings sf m procedure main local a b c wprouter a b c wprouter a b sf c m return additional context x compiler version public dialect xbase lb memvar
| 1
|
5,556
| 8,394,982,335
|
IssuesEvent
|
2018-10-10 03:46:50
|
rchain/bounties
|
https://api.github.com/repos/rchain/bounties
|
closed
|
Usability Testing and Updates to e-sign Invoice
|
Development invoice-process
|
When filling in the RContributor Invoice Agreement form, I noticed some usability problems and hit a bug.
- The Ethereum address field is too short. It should display 42 chars: size="42"
- The Reward Voted field should only accept numeric chars.
- Placing a Dollar sign to the left of the Reward Voted field would be a useful hint.
- Displaying placeholder text in the Reward Voted field would mitigate confusion about the amount format: placeholder="1234.56"
- Typically the positive button ("I Agree") is located on the right in order to move forward. The negative button ("I Disagree") should be on the left to indicate non-continuation. Or, the buttons should be stacked, with the most likely course of action ("I Agree") on top.
After clicking the "I Agree" button, a Javascript alert pops up saying:
```Data provided doesn't match the values on file.```
Afaik the data matches the invoice. This alert appears regardless of what I enter in the Reward Voted field: #### | #,### | ####.## | #,###.## | $####.
Bug: the md5 checksum generated in the Javascript for my ETH address is incorrect.
Since the form can't be submitted due to the alert, I clicked the ("I Disagree") button. The form submitted. It went to invoice.rhobot.net/../disagree and replied "Thank you for contributing!". Maybe it should say "Thank you anyway!".
|
1.0
|
Usability Testing and Updates to e-sign Invoice - When filling in the RContributor Invoice Agreement form, I noticed some usability problems and hit a bug.
- The Ethereum address field is too short. It should display 42 chars: size="42"
- The Reward Voted field should only accept numeric chars.
- Placing a Dollar sign to the left of the Reward Voted field would be a useful hint.
- Displaying placeholder text in the Reward Voted field would mitigate confusion about the amount format: placeholder="1234.56"
- Typically the positive button ("I Agree") is located on the right in order to move forward. The negative button ("I Disagree") should be on the left to indicate non-continuation. Or, the buttons should be stacked, with the most likely course of action ("I Agree") on top.
After clicking the "I Agree" button, a Javascript alert pops up saying:
```Data provided doesn't match the values on file.```
Afaik the data matches the invoice. This alert appears regardless of what I enter in the Reward Voted field: #### | #,### | ####.## | #,###.## | $####.
Bug: the md5 checksum generated in the Javascript for my ETH address is incorrect.
Since the form can't be submitted due to the alert, I clicked the ("I Disagree") button. The form submitted. It went to invoice.rhobot.net/../disagree and replied "Thank you for contributing!". Maybe it should say "Thank you anyway!".
|
process
|
usability testing and updates to e sign invoice when filling in the rcontributor invoice agreement form i noticed some usability problems and hit a bug the ethereum address field is too short it should display chars size the reward voted field should only accept numeric chars placing a dollar sign to the left of the reward voted field would be a useful hint displaying placeholder text in the reward voted field would mitigate confusion about the amount format placeholder typically the positive button i agree is located on the right in order to move forward the negative button i disagree should be on the left to indicate non continuation or the buttons should be stacked with the most likely course of action i agree on top after clicking the i agree button a javascript alert pops up saying data provided doesn t match the values on file afaik the data matches the invoice this alert appears regardless of what i enter in the reward voted field bug the checksum generated in the javascript for my eth address is incorrect since the form can t be submitted due to the alert i clicked the i disagree button the form submitted it went to invoice rhobot net disagree and replied thank you for contributing maybe it should say thank you anyway
| 1
|
115,369
| 4,668,059,991
|
IssuesEvent
|
2016-10-06 00:05:38
|
CovertJaguar/Railcraft
|
https://api.github.com/repos/CovertJaguar/Railcraft
|
closed
|
Furnance Minecart and Liquid Unloader issue
|
bug needs verification priority-medium
|
I don't know if this is a bug or a feature but for some reason the vanilla Furnace Minecart exhibits strange behavior when it passes across a Locking Track on top of Liquid Unloader. For some reason the Locking Tracks locks the Furnace cart in place and it doesn't move until the fuel runs out or if it gets a manual redstone signal. I tried different modes both on the track and the loader but the same issue cropped up every time. I even tried to filter the Liquid Unloader but to no avail. I'm not sure what to do to solve the issue (I'm very new to Railcraft)
Just thought I should mention it if it was an issue (I'm running the latest RailCraft version for 1.7.10 in a custom modpack)
|
1.0
|
Furnance Minecart and Liquid Unloader issue - I don't know if this is a bug or a feature but for some reason the vanilla Furnace Minecart exhibits strange behavior when it passes across a Locking Track on top of Liquid Unloader. For some reason the Locking Tracks locks the Furnace cart in place and it doesn't move until the fuel runs out or if it gets a manual redstone signal. I tried different modes both on the track and the loader but the same issue cropped up every time. I even tried to filter the Liquid Unloader but to no avail. I'm not sure what to do to solve the issue (I'm very new to Railcraft)
Just thought I should mention it if it was an issue (I'm running the latest RailCraft version for 1.7.10 in a custom modpack)
|
non_process
|
furnance minecart and liquid unloader issue i don t know if this is a bug or a feature but for some reason the vanilla furnace minecart exhibits strange behavior when it passes across a locking track on top of liquid unloader for some reason the locking tracks locks the furnace cart in place and it doesn t move until the fuel runs out or if it gets a manual redstone signal i tried different modes both on the track and the loader but the same issue cropped up every time i even tried to filter the liquid unloader but to no avail i m not sure what to do to solve the issue i m very new to railcraft just thought i should mention it if it was an issue i m running the latest railcraft version for in a custom modpack
| 0
|
31,503
| 8,705,818,896
|
IssuesEvent
|
2018-12-05 23:52:13
|
hashicorp/packer
|
https://api.github.com/repos/hashicorp/packer
|
closed
|
Delegate AWS region error checking to AWS Go SDK
|
builder/amazon enhancement
|
The error checking that occurs on a user-passed AWS region should be removed and delegated to the AWS Go SDK to future proof `packer` for the introduction of new regions/endpoints. The Go SDK has a much more comprehensive description of which regions and endpoints are valid, and since Amazon supports it, it's more likely to more quickly match their service offerings as those offerings change.
|
1.0
|
Delegate AWS region error checking to AWS Go SDK - The error checking that occurs on a user-passed AWS region should be removed and delegated to the AWS Go SDK to future proof `packer` for the introduction of new regions/endpoints. The Go SDK has a much more comprehensive description of which regions and endpoints are valid, and since Amazon supports it, it's more likely to more quickly match their service offerings as those offerings change.
|
non_process
|
delegate aws region error checking to aws go sdk the error checking that occurs on a user passed aws region should be removed and delegated to the aws go sdk to future proof packer for the introduction of new regions endpoints the go sdk has a much more comprehensive description of which regions and endpoints are valid and since amazon supports it it s more likely to more quickly match their service offerings as those offerings change
| 0
|
17,156
| 22,716,745,130
|
IssuesEvent
|
2022-07-06 03:18:34
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Wrong results converting vectors to Float32 rasters - Rasterize, Rasterization, Round Raster algorithms
|
Feedback stale Raster Processing Bug
|
Because decimal to float (IEEE754) conversion is not an exact conversion, I'm getting wrong results in reclassifications done after converting vectors to rasters.
I have a polygons layer with a "value" field that has 2 decimal places.
Converting it to raster with **gdal_rasterize** (**Rasterize (vector to raster)**), the output is different depending on whether I choose Float32 or Float64.
Here is a sample Project: https://cld.pt/dl/download/61894ebd-ade0-4f12-8aa3-1405ab2690cc/Float_Issue_Project.zip
These are the commands I use (inside QGIS or in CLI):
```
gdal_rasterize -l polygons -a value -tr 5.0 5.0 -a_nodata 0.0 -te 61738.6841 107228.3171 63464.9242 108429.7802 -ot Float32 -of GTiff D:\Testes\Float_Issue\polygons.gpkg D:/Testes/Float_Issue/polygons_rasterize_float32.tif
Info:
Name polygons_rasterize_float32
Path D:\Testes\Float_Issue\polygons_rasterize_float32.tif
CRS EPSG:3763 - ETRS89 / Portugal TM06 - Projected
Extent 61738.6840999999985797,107229.7801999999937834 : 63463.6840999999985797,108429.7801999999937834
Unit meters
Width 345
Height 240
Data type Float32 - Thirty two bit floating point
GDAL Driver Description GTiff
GDAL Driver Metadata GeoTIFF
Dataset Description D:\Testes\Float_Issue\polygons_rasterize_float32.tif
Compression
Band 1
STATISTICS_MAXIMUM=18537
STATISTICS_MEAN=3782.009546874
STATISTICS_MINIMUM=0.12999999523163
STATISTICS_STDDEV=7001.5216220675
STATISTICS_VALID_PERCENT=100
More information
AREA_OR_POINT=Area
Dimensions X: 345 Y: 240 Bands: 1
Origin 61738.7,108430
Pixel Size 5,-5
```
```
gdal_rasterize -l polygons -a value -tr 5.0 5.0 -a_nodata 0.0 -te 61738.6841 107228.3171 63464.9242 108429.7802 -ot Float64 -of GTiff D:\Testes\Float_Issue\polygons.gpkg D:/Testes/Float_Issue/polygons_rasterize_float64.tif
Info:
Name polygons_rasterize_float64
Path D:\Testes\Float_Issue\polygons_rasterize_float64.tif
CRS EPSG:3763 - ETRS89 / Portugal TM06 - Projected
Extent 61738.6840999999985797,107229.7801999999937834 : 63463.6840999999985797,108429.7801999999937834
Unit meters
Width 345
Height 240
Data type Float64 - Sixty four bit floating point
GDAL Driver Description GTiff
GDAL Driver Metadata GeoTIFF
Dataset Description D:/Testes/Float_Issue/polygons_rasterize_float64.tif
Compression
Band 1
STATISTICS_MAXIMUM=18537
STATISTICS_MEAN=3782.0095468599
STATISTICS_MINIMUM=0.13
STATISTICS_STDDEV=7001.521622075
STATISTICS_VALID_PERCENT=100
More information
AREA_OR_POINT=Area
Dimensions X: 345 Y: 240 Bands: 1
Origin 61738.7,108430
Pixel Size 5,-5
```
Looking at output in Value Tool:

So, if the conversion is to a Float64 raster, pixel values have 2 decimal places, as the original data.
If the conversion is to a Float32 raster, several pixel values assume 16 decimal places.
```
Float32: 0.2199999988079071
Float64: 0.22
Original value: 0.22
```
This is due, as Andrea Giudiceandrea explained on the dev mailing list, to the decimal-to-floating-point conversion, and can be confirmed here:
https://www.binaryconvert.com/convert_double.html
The most accurate representation of **0.22** in
float single precision (**32 bit**) is **0.2199999988079071044921875**
float double precision (**64 bit**) is **0.220000000000000001110223024625**
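These representations can be checked directly in Python (a quick sketch, independent of GDAL/QGIS): packing 0.22 into an IEEE 754 single-precision encoding and unpacking it again yields the nearest representable Float32 value, which is slightly below 0.22.

```python
import struct

def to_float32(x: float) -> float:
    # Round-trip through a 4-byte IEEE 754 single-precision encoding,
    # returning the nearest representable Float32 value as a Python float.
    return struct.unpack('<f', struct.pack('<f', x))[0]

v32 = to_float32(0.22)
print(v32)         # 0.2199999988079071 -> slightly below 0.22
print(v32 < 0.22)  # True: the Float32 value falls on the wrong side of a 0.22 bin edge
```

This is exactly why a `[0.22 - 0.5[` reclassification bin misses pixels written through a Float32 band.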
Where this becomes a serious problem is when the values are used, for instance, for a reclassification process (with **Reclassify by Table**). Imagine these rules:
```
[0 - 0.22[ = 1
[0.22 - 0.5[ = 2
>= 0.5 = 3
```
Pixels 0.22 are reclassified as 2;
Pixels 0.2199999988079071 are reclassified as 1.

So, Float64 gives the right results and Float32 gives wrong results.
However, if instead of gdal_rasterize I use **OTB Rasterization**, no matter whether I use float (Float32) or double (Float64) dtype, I always get wrong results, just like gdal_rasterize with Float32.
This can be really tricky and lead to wrong results.
I would say that, as a workaround, maybe making Float64 the default GDAL -ot <type> could be safer?
Also the **Round Raster** algorithm fails here, because if the input is a Float32 holding, for instance, 0.2199999988079071044921875, the output will be of the same type and is not rounded.
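The Round Raster limitation can be illustrated the same way (a plain-Python sketch, not the actual algorithm): rounding the value as a double does recover 0.22, but writing the result back into a Float32 band necessarily re-introduces the nearest representable single-precision value.

```python
import struct

def to_float32(x: float) -> float:
    # Nearest representable IEEE 754 single-precision value, as a Python float.
    return struct.unpack('<f', struct.pack('<f', x))[0]

v32 = to_float32(0.22)        # 0.2199999988079071
rounded = round(v32, 2)       # rounding as a double recovers 0.22...
stored = to_float32(rounded)  # ...but storing into a Float32 band undoes it
print(rounded == 0.22)        # True
print(stored == v32)          # True: a Float32 output cannot hold 0.22 exactly
```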
|
1.0
|
Wrong results converting vectors to Float32 rasters - Rasterize, Rasterization, Round Raster algorithms - Because decimal-to-float (IEEE 754) conversion is not exact, I'm getting wrong results in reclassifications done after converting vectors to rasters.
I have a polygons layer with a "value" field that has 2 decimal places.
Converting it to raster with **gdal_rasterize** (**Rasterize (vector to raster)**), the output is different if I choose it as Float32 or Float64.
Here is a sample Project: https://cld.pt/dl/download/61894ebd-ade0-4f12-8aa3-1405ab2690cc/Float_Issue_Project.zip
These are the commands I use (inside QGIS or in CLI):
```
gdal_rasterize -l polygons -a value -tr 5.0 5.0 -a_nodata 0.0 -te 61738.6841 107228.3171 63464.9242 108429.7802 -ot Float32 -of GTiff D:\Testes\Float_Issue\polygons.gpkg D:/Testes/Float_Issue/polygons_rasterize_float32.tif
Info:
Name polygons_rasterize_float32
Path D:\Testes\Float_Issue\polygons_rasterize_float32.tif
CRS EPSG:3763 - ETRS89 / Portugal TM06 - Projected
Extent 61738.6840999999985797,107229.7801999999937834 : 63463.6840999999985797,108429.7801999999937834
Unit meters
Width 345
Height 240
Data type Float32 - Thirty two bit floating point
GDAL Driver Description GTiff
GDAL Driver Metadata GeoTIFF
Dataset Description D:\Testes\Float_Issue\polygons_rasterize_float32.tif
Compression
Band 1
STATISTICS_MAXIMUM=18537
STATISTICS_MEAN=3782.009546874
STATISTICS_MINIMUM=0.12999999523163
STATISTICS_STDDEV=7001.5216220675
STATISTICS_VALID_PERCENT=100
More information
AREA_OR_POINT=Area
Dimensions X: 345 Y: 240 Bands: 1
Origin 61738.7,108430
Pixel Size 5,-5
```
```
gdal_rasterize -l polygons -a value -tr 5.0 5.0 -a_nodata 0.0 -te 61738.6841 107228.3171 63464.9242 108429.7802 -ot Float64 -of GTiff D:\Testes\Float_Issue\polygons.gpkg D:/Testes/Float_Issue/polygons_rasterize_float64.tif
Info:
Name polygons_rasterize_float64
Path D:\Testes\Float_Issue\polygons_rasterize_float64.tif
CRS EPSG:3763 - ETRS89 / Portugal TM06 - Projected
Extent 61738.6840999999985797,107229.7801999999937834 : 63463.6840999999985797,108429.7801999999937834
Unit meters
Width 345
Height 240
Data type Float64 - Sixty four bit floating point
GDAL Driver Description GTiff
GDAL Driver Metadata GeoTIFF
Dataset Description D:/Testes/Float_Issue/polygons_rasterize_float64.tif
Compression
Band 1
STATISTICS_MAXIMUM=18537
STATISTICS_MEAN=3782.0095468599
STATISTICS_MINIMUM=0.13
STATISTICS_STDDEV=7001.521622075
STATISTICS_VALID_PERCENT=100
More information
AREA_OR_POINT=Area
Dimensions X: 345 Y: 240 Bands: 1
Origin 61738.7,108430
Pixel Size 5,-5
```
Looking at output in Value Tool:

So, if the conversion is to a Float64 raster, pixel values have 2 decimal places, as the original data.
If the conversion is to a Float32 raster, several pixel values assume 16 decimal places.
```
Float32: 0.2199999988079071
Float64: 0.22
Original value: 0.22
```
This is due, as Andrea Giudiceandrea explained on the dev mailing list, to the decimal-to-floating-point conversion, and can be confirmed here:
https://www.binaryconvert.com/convert_double.html
The most accurate representation of **0.22** in
float single precision (**32 bit**) is **0.2199999988079071044921875**
float double precision (**64 bit**) is **0.220000000000000001110223024625**
Where this becomes a serious problem is when the values are used, for instance, for a reclassification process (with **Reclassify by Table**). Imagine these rules:
```
[0 - 0.22[ = 1
[0.22 - 0.5[ = 2
>= 0.5 = 3
```
Pixels 0.22 are reclassified as 2;
Pixels 0.2199999988079071 are reclassified as 1.

So, Float64 gives the right results and Float32 gives wrong results.
However, if instead of gdal_rasterize I use **OTB Rasterization**, no matter whether I use float (Float32) or double (Float64) dtype, I always get wrong results, just like gdal_rasterize with Float32.
This can be really tricky and lead to wrong results.
I would say that, as a workaround, maybe making Float64 the default GDAL -ot <type> could be safer?
Also the **Round Raster** algorithm fails here, because if the input is a Float32 holding, for instance, 0.2199999988079071044921875, the output will be of the same type and is not rounded.
|
process
|
wrong results converting vectors to rasters rasterize rasterization round raster algorithms because decimal to float conversion is not an exact conversion i m getting wrong results in reclassifications done after converting vectors to rasters i have a polygons layer with a value field that has decimal places converting it to raster with gdal rasterize rasterize vetor to raster the output is different if i choose it as or here is a sample project these are the commands i use inside qgis or in cli gdal rasterize l polygons a value tr a nodata te ot of gtiff d testes float issue polygons gpkg d testes float issue polygons rasterize tif info name polygons rasterize path d testes float issue polygons rasterize tif crs epsg portugal projected extent unit meters width height data type thirty two bit floating point gdal driver description gtiff gdal driver metadata geotiff dataset description d testes float issue polygons rasterize tif compression band statistics maximum statistics mean statistics minimum statistics stddev statistics valid percent more information area or point area dimensions x y bands origin pixel size gdal rasterize l polygons a value tr a nodata te ot of gtiff d testes float issue polygons gpkg d testes float issue polygons rasterize tif info name polygons rasterize path d testes float issue polygons rasterize tif crs epsg portugal projected extent unit meters width height data type sixty four bit floating point gdal driver description gtiff gdal driver metadata geotiff dataset description d testes float issue polygons rasterize tif compression band statistics maximum statistics mean statistics minimum statistics stddev statistics valid percent more information area or point area dimensions x y bands origin pixel size looking at output in value tool so if the conversion is to a raster pixel values have decimal places as the original data if the conversion is to a raster several pixel values assume decimal places original value this is due as andrea 
giudiceandrea said in the dev mailing list to the decimal to floating point numbers conversion and can be confirmed here the most accurate representation of in float single precision bit is float double precision bit is where this becomes a serious problem is when the values are used for instance for a reclassification process with reclassify by table imagine these rules pixels are reclassified as pixels are reclassified as so gives the right results and gives wrong results however if instead of gdal rasterize i use otb rasterization no matter whether i use float or double dtype i always get wrong results just like gdal rasterize with this can be really tricky and lead to wrong results i would say that as a workaround maybe making as gdal ot default could be safer also the round raster algorithm fails here because if the input is a for instance with the output will be as the same type and it is not rounded
| 1
|
58,919
| 14,355,540,167
|
IssuesEvent
|
2020-11-30 10:14:46
|
ghosind/node-repeat-checker
|
https://api.github.com/repos/ghosind/node-repeat-checker
|
opened
|
CVE-2020-7774 (High) detected in y18n-4.0.0.tgz
|
security vulnerability
|
## CVE-2020-7774 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>y18n-4.0.0.tgz</b></p></summary>
<p>the bare-bones internationalization library used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/y18n/-/y18n-4.0.0.tgz">https://registry.npmjs.org/y18n/-/y18n-4.0.0.tgz</a></p>
<p>Path to dependency file: node-repeat-checker/package.json</p>
<p>Path to vulnerable library: node-repeat-checker/node_modules/y18n/package.json</p>
<p>
Dependency Hierarchy:
- mocha-7.1.1.tgz (Root Library)
- yargs-13.3.2.tgz
- :x: **y18n-4.0.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghosind/node-repeat-checker/commit/13f861331bc471356af151327167aec87efb6ae4">13f861331bc471356af151327167aec87efb6ae4</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package y18n before 5.0.5. PoC by po6ix: const y18n = require('y18n')(); y18n.setLocale('__proto__'); y18n.updateLocale({polluted: true}); console.log(polluted); // true
<p>Publish Date: 2020-11-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7774>CVE-2020-7774</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7774">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7774</a></p>
<p>Release Date: 2020-11-17</p>
<p>Fix Resolution: 5.0.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-7774 (High) detected in y18n-4.0.0.tgz - ## CVE-2020-7774 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>y18n-4.0.0.tgz</b></p></summary>
<p>the bare-bones internationalization library used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/y18n/-/y18n-4.0.0.tgz">https://registry.npmjs.org/y18n/-/y18n-4.0.0.tgz</a></p>
<p>Path to dependency file: node-repeat-checker/package.json</p>
<p>Path to vulnerable library: node-repeat-checker/node_modules/y18n/package.json</p>
<p>
Dependency Hierarchy:
- mocha-7.1.1.tgz (Root Library)
- yargs-13.3.2.tgz
- :x: **y18n-4.0.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghosind/node-repeat-checker/commit/13f861331bc471356af151327167aec87efb6ae4">13f861331bc471356af151327167aec87efb6ae4</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package y18n before 5.0.5. PoC by po6ix: const y18n = require('y18n')(); y18n.setLocale('__proto__'); y18n.updateLocale({polluted: true}); console.log(polluted); // true
<p>Publish Date: 2020-11-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7774>CVE-2020-7774</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7774">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7774</a></p>
<p>Release Date: 2020-11-17</p>
<p>Fix Resolution: 5.0.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in tgz cve high severity vulnerability vulnerable library tgz the bare bones internationalization library used by yargs library home page a href path to dependency file node repeat checker package json path to vulnerable library node repeat checker node modules package json dependency hierarchy mocha tgz root library yargs tgz x tgz vulnerable library found in head commit a href found in base branch master vulnerability details this affects the package before poc by const require setlocale proto updatelocale polluted true console log polluted true publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
138,615
| 30,922,243,284
|
IssuesEvent
|
2023-08-06 03:19:08
|
Haidoe/arc
|
https://api.github.com/repos/Haidoe/arc
|
closed
|
Productions page: Add possibility to delete a production
|
priority-medium code
|
## Bug Report
**Reporter: ❗️**
@rjung07
**Describe the bug: ❗️**
A clear and concise description of what the bug is.
**Steps to reproduce: ❗️**
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Screenshots or Video**
If applicable, add screenshots / video to help explain your problem.
**Expected behavior: ❗️**
A clear and concise description of what you expected to happen.
**Actual behavior: ❗️**
A clear and concise description of what actually happened.
**Possible Solution:**
If you have any ideas or suggestions on how to fix the bug, please mention them here.
**Environment:**
- Device & Operating System:
- Browser and Version(if applicable):
- Jira Ticket(if applicable):
- Any other relevant information about your environment.
**Additional context:**
Add any other context about the problem here.
**Follow up checklist: ❗️**
- [ ] Add Assignee
- [ ] Label priority ( priority-low, priority-medium, priority-high )
- [ ] Label Milestone ( Alpha, Beta )
- [ ] Label Issue Type ( Style, Code, API )
|
1.0
|
Productions page: Add possibility to delete a production - ## Bug Report
**Reporter: ❗️**
@rjung07
**Describe the bug: ❗️**
A clear and concise description of what the bug is.
**Steps to reproduce: ❗️**
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Screenshots or Video**
If applicable, add screenshots / video to help explain your problem.
**Expected behavior: ❗️**
A clear and concise description of what you expected to happen.
**Actual behavior: ❗️**
A clear and concise description of what actually happened.
**Possible Solution:**
If you have any ideas or suggestions on how to fix the bug, please mention them here.
**Environment:**
- Device & Operating System:
- Browser and Version(if applicable):
- Jira Ticket(if applicable):
- Any other relevant information about your environment.
**Additional context:**
Add any other context about the problem here.
**Follow up checklist: ❗️**
- [ ] Add Assignee
- [ ] Label priority ( priority-low, priority-medium, priority-high )
- [ ] Label Milestone ( Alpha, Beta )
- [ ] Label Issue Type ( Style, Code, API )
|
non_process
|
productions page add possibility to delete a production bug report reporter ❗️ describe the bug ❗️ a clear and concise description of what the bug is steps to reproduce ❗️ go to click on scroll down to see error screenshots or video if applicable add screenshots video to help explain your problem expected behavior ❗️ a clear and concise description of what you expected to happen actual behavior ❗️ a clear and concise description of what actually happened possible solution if you have any ideas or suggestions on how to fix the bug please mention them here environment device operating system browser and version if applicable jira ticket if applicable any other relevant information about your environment additional context add any other context about the problem here follow up checklist ❗️ add assignee label priority priority low priority medium priority high label milestone alpha beta label issue type style code api
| 0
|
19,828
| 26,217,730,387
|
IssuesEvent
|
2023-01-04 12:27:00
|
scverse/scanpy
|
https://api.github.com/repos/scverse/scanpy
|
closed
|
Move from 88 to 120 characters with Black
|
Question Development Process 🚀
|
This will be the discussion that no one ever wants since developers like us tend to bring our mice and keyboards to fight when it comes to questions like these, but hey I thought at some point it should be had anyways haha
@ivirshup recently introduced black as the formatting standard for Scanpy. I am not a fan of blacks formatting, but the idea of consistent formatting for big open source projects is great! So +1 from me
Anyways, currently black formats with 88 characters per line, which makes, especially with black, for lots and lots of line breaks and encourages bad practices like unspecific short variable names etc. Modern Python programming is not C programming from the 80s. Have a read of Linus' rant on the 80-character limit in the Linux kernel and why the Linux kernel does **not** enforce it: https://lkml.org/lkml/2020/5/29/1038
Applying black with a 120-character limit removes about 1500 lines. That's 1500 fewer lines to scroll past, and in my opinion the result is more readable.
What do you think?
|
1.0
|
Move from 88 to 120 characters with Black - This will be the discussion that no one ever wants since developers like us tend to bring our mice and keyboards to fight when it comes to questions like these, but hey I thought at some point it should be had anyways haha
@ivirshup recently introduced black as the formatting standard for Scanpy. I am not a fan of blacks formatting, but the idea of consistent formatting for big open source projects is great! So +1 from me
Anyways, currently black formats with 88 characters per line, which makes, especially with black, for lots and lots of line breaks and encourages bad practices like unspecific short variable names etc. Modern Python programming is not C programming from the 80s. Have a read of Linus' rant on the 80-character limit in the Linux kernel and why the Linux kernel does **not** enforce it: https://lkml.org/lkml/2020/5/29/1038
Applying black with a 120-character limit removes about 1500 lines. That's 1500 fewer lines to scroll past, and in my opinion the result is more readable.
What do you think?
|
process
|
move from to characters with black this will be the discussion that no one ever wants since developers like us tend to bring our mice and keyboards to fight when it comes to questions like these but hey i thought at some point it should be had anyways haha ivirshup recently introduced black as the formatting standard for scanpy i am not a fan of blacks formatting but the idea of consistent formatting for big open source projects is great so from me anyways currently black formats with characters per line which makes especially with black for lots and lots of line breaks and encourages bad practices like unspecific short variable names etc modern python programming is not c programming from the have a read at linus rant on the character limit in the linux kernel and why the linux kernel does not enforce it applying black with a characters limit removes about lines that s lines that you have to scroll less and in my opinion the result is more readable what do you think
| 1
|
336,302
| 30,186,600,943
|
IssuesEvent
|
2023-07-04 12:33:12
|
TencentBlueKing/bk-repo
|
https://api.github.com/repos/TencentBlueKing/bk-repo
|
closed
|
generic仓库文件展示列表上方面包屑在文件夹名称超长之后导致页面显示效果有问题且导致目录树不能拖动
|
bug frontend for test tested
|
<!-- Please use this template while reporting a bug and provide as much info as possible. Not doing so may result in your bug not being addressed in a timely manner. Thanks!-->
**What happened**:
In a generic repository, when a folder name in the breadcrumbs above the file list is too long, the page layout breaks and the directory tree can no longer be resized by dragging.
**What you expected to happen**:
The page should display correctly and the directory tree should be resizable by dragging;
when the breadcrumb contains more than two entries, the middle folder names should be elided, and each folder name should be limited to 200px, with an ellipsis shown when that limit is exceeded.

**How to reproduce it (as minimally and precisely as possible)**:
Nest several folders with overly long names, or use one particularly long folder name.
**Anything else we need to know?**:

**Environment**:
- bk-repo version (use `cat VERSION` in installed dir):
- Cloud provider or hardware configuration:
- OS (e.g: `cat /etc/os-release`):
- Kernel (e.g. `uname -a`):
- Install tools:
- Others:
|
2.0
|
generic仓库文件展示列表上方面包屑在文件夹名称超长之后导致页面显示效果有问题且导致目录树不能拖动 - <!-- Please use this template while reporting a bug and provide as much info as possible. Not doing so may result in your bug not being addressed in a timely manner. Thanks!-->
**What happened**:
In a generic repository, when a folder name in the breadcrumbs above the file list is too long, the page layout breaks and the directory tree can no longer be resized by dragging.
**What you expected to happen**:
The page should display correctly and the directory tree should be resizable by dragging;
when the breadcrumb contains more than two entries, the middle folder names should be elided, and each folder name should be limited to 200px, with an ellipsis shown when that limit is exceeded.

**How to reproduce it (as minimally and precisely as possible)**:
Nest several folders with overly long names, or use one particularly long folder name.
**Anything else we need to know?**:

**Environment**:
- bk-repo version (use `cat VERSION` in installed dir):
- Cloud provider or hardware configuration:
- OS (e.g: `cat /etc/os-release`):
- Kernel (e.g. `uname -a`):
- Install tools:
- Others:
|
non_process
|
generic仓库文件展示列表上方面包屑在文件夹名称超长之后导致页面显示效果有问题且导致目录树不能拖动 what happened generic仓库文件展示列表上方面包屑在文件夹名称超长之后导致页面显示效果有问题且导致目录树不能拖动 what you expected to happen 页面显示正常且目录树可以正常拖动改变宽度; 面包屑内容超过两个后中间文件夹名称省略, ,超过显示省略号 how to reproduce it as minimally and precisely as possible 多个文件夹名称超长的嵌套或者一个特别长的文件夹名称 anything else we need to know environment bk repo version use cat version in installed dir cloud provider or hardware configuration os e g cat etc os release kernel e g uname a install tools others
| 0
|
9,603
| 12,545,046,058
|
IssuesEvent
|
2020-06-05 18:14:39
|
varys-main/ps-tools
|
https://api.github.com/repos/varys-main/ps-tools
|
closed
|
Docker - Anpassungen
|
processing
|
# User Story
- The spelling should be corrected and the answers to prompts (J/N) should be standardized. The default option should also be made easier to recognize.
- Loading the NAV-ContainerHelper module should be explained better and/or moved into a dedicated function. It should be possible to update the module (no installation of multiple versions).
- The gu modules should be cleaned up.
# Tasks
- [x] Spelling/explanations
- [x] NAV-ContainerHelper
- [x] gu modules
- [x] Update the help
# Implementation
- Fixed umlauts in the warmup script
- DockerCreate output now in German
- Installing/uninstalling the NAV-ContainerHelper module was moved into dedicated menu entries
- The download of Docker images was moved out of the DockerCreate function.
- The help was migrated into the doc repository and updated.
# Known Problems
|
1.0
|
Docker - Anpassungen - # User Story
- The spelling should be corrected and the answers to prompts (J/N) should be standardized. The default option should also be made easier to recognize.
- Loading the NAV-ContainerHelper module should be explained better and/or moved into a dedicated function. It should be possible to update the module (no installation of multiple versions).
- The gu modules should be cleaned up.
# Tasks
- [x] Spelling/explanations
- [x] NAV-ContainerHelper
- [x] gu modules
- [x] Update the help
# Implementation
- Fixed umlauts in the warmup script
- DockerCreate output now in German
- Installing/uninstalling the NAV-ContainerHelper module was moved into dedicated menu entries
- The download of Docker images was moved out of the DockerCreate function.
- The help was migrated into the doc repository and updated.
# Known Problems
|
process
|
docker anpassungen user story die rechtschreibung soll angepasst werden und die antworten auf abfragen j n sollen standardisiert werden zudem soll die standard option besser ersichtlich sein das laden des moduls nav containerhelper soll besser erklärt werden und oder in eine eigene funktion ausgelagert werden das modul soll aktualisiert werden können keine installation mehrerer versionen die gu module sollen bereinigt werden tasks rechtschreibung erklärung nav containerhelper gu module hilfe aktualisieren implementation umlaute im warmup skript korrigiert ausgaben in dockercreate auf deutsch die installation deinstallation des moduls nav containerhelper wurde in eigene menü einträge ausgelagert der download von docker images wurde aus der dockercreate funktion ausgelagert die hilfe wurde in das doc repository übernommen und aktualisiert known problems
| 1
|
32,312
| 15,326,567,211
|
IssuesEvent
|
2021-02-26 03:56:23
|
hajimehoshi/ebiten
|
https://api.github.com/repos/hajimehoshi/ebiten
|
closed
|
internal/shareable: (*Image).makeShared might be unexpectedly slow on Metal
|
os:ios os:macos performance
|
@wasedaigo reported. Unfortunately I cannot reproduce his slowness, but sometimes `makeShared` can be quite slow.
|
True
|
internal/shareable: (*Image).makeShared might be unexpectedly slow on Metal - @wasedaigo reported. Unfortunately I cannot reproduce his slowness, but sometimes `makeShared` can be quite slow.
|
non_process
|
internal shareable image makeshared might be unexpectedly slow on metal wasedaigo reported unfortunately i cannot reproduce his slowness but sometimes makeshared can be quite slow
| 0
|
100,174
| 21,183,290,717
|
IssuesEvent
|
2022-04-08 10:03:25
|
tijlleenders/ZinZen
|
https://api.github.com/repos/tijlleenders/ZinZen
|
closed
|
control theme with css vars
|
clean_code 2h
|
Currently the theme is controlled in the component code:
```
<Button
variant={darkModeStatus ? "brown" : "peach"}
size="lg"
className={
darkModeStatus
? "dashboard-choice-dark1"
: "dashboard-choice-light1"
}
>
```
This is not necessary as components can reference the css var.
For example: "foreground-color".
Depending on the theme chosen this will change the CSS settings and thereby change the component colors.
In short: the component shouldn't have to know about the theme.
Example implementation: https://css-tricks.com/easy-dark-mode-and-multiple-color-themes-in-react/
|
1.0
|
control theme with css vars - Currently the theme is controlled in the component code:
```
<Button
variant={darkModeStatus ? "brown" : "peach"}
size="lg"
className={
darkModeStatus
? "dashboard-choice-dark1"
: "dashboard-choice-light1"
}
>
```
This is not necessary as components can reference the css var.
For example: "foreground-color".
Depending on the theme chosen this will change the CSS settings and thereby change the component colors.
In short: the component shouldn't have to know about the theme.
Example implementation: https://css-tricks.com/easy-dark-mode-and-multiple-color-themes-in-react/
|
non_process
|
control theme with css vars currently the theme is controlled in the component code button variant darkmodestatus brown peach size lg classname darkmodestatus dashboard choice dashboard choice this is not necessary as components can reference the css var for example foreground color depending on the theme chosen this will change the css settings and thereby change the component colors in short the component shouldn t have to know about the theme example implementation
| 0
|
12,090
| 14,740,070,008
|
IssuesEvent
|
2021-01-07 08:27:51
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Sarasota - SA Billing - Late Fee Account List
|
anc-process anp-important ant-bug has attachment
|
In GitLab by @kdjstudios on Oct 3, 2018, 11:05
[Sarasota.xlsx](/uploads/66633f8d0277b1657e305d94194fa157/Sarasota.xlsx)
HD: http://www.servicedesk.answernet.com/profiles/ticket/2018-10-03-63235/conversation
|
1.0
|
Sarasota - SA Billing - Late Fee Account List - In GitLab by @kdjstudios on Oct 3, 2018, 11:05
[Sarasota.xlsx](/uploads/66633f8d0277b1657e305d94194fa157/Sarasota.xlsx)
HD: http://www.servicedesk.answernet.com/profiles/ticket/2018-10-03-63235/conversation
|
process
|
sarasota sa billing late fee account list in gitlab by kdjstudios on oct uploads sarasota xlsx hd
| 1
|
257,059
| 27,561,761,874
|
IssuesEvent
|
2023-03-07 22:44:45
|
samqws-marketing/walmartlabs-concord
|
https://api.github.com/repos/samqws-marketing/walmartlabs-concord
|
closed
|
CVE-2019-10746 (High) detected in mixin-deep-1.3.1.tgz - autoclosed
|
security vulnerability
|
## CVE-2019-10746 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mixin-deep-1.3.1.tgz</b></p></summary>
<p>Deeply mix the properties of objects into the first object. Like merge-deep, but doesn't clone.</p>
<p>Library home page: <a href="https://registry.npmjs.org/mixin-deep/-/mixin-deep-1.3.1.tgz">https://registry.npmjs.org/mixin-deep/-/mixin-deep-1.3.1.tgz</a></p>
<p>Path to dependency file: /console2/package.json</p>
<p>Path to vulnerable library: /console2/node_modules/mixin-deep/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-4.0.3.tgz (Root Library)
- webpack-4.44.2.tgz
- micromatch-3.1.10.tgz
- snapdragon-0.8.2.tgz
- base-0.11.2.tgz
- :x: **mixin-deep-1.3.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/walmartlabs-concord/commit/b9420f3b9e73a9d381266ece72f7afb756f35a76">b9420f3b9e73a9d381266ece72f7afb756f35a76</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
mixin-deep is vulnerable to Prototype Pollution in versions before 1.3.2 and version 2.0.0. The function mixin-deep could be tricked into adding or modifying properties of Object.prototype using a constructor payload.
<p>Publish Date: 2019-08-23
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-10746>CVE-2019-10746</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2019-08-23</p>
<p>Fix Resolution (mixin-deep): 1.3.2</p>
<p>Direct dependency fix Resolution (react-scripts): 5.0.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
|
True
|
CVE-2019-10746 (High) detected in mixin-deep-1.3.1.tgz - autoclosed - ## CVE-2019-10746 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mixin-deep-1.3.1.tgz</b></p></summary>
<p>Deeply mix the properties of objects into the first object. Like merge-deep, but doesn't clone.</p>
<p>Library home page: <a href="https://registry.npmjs.org/mixin-deep/-/mixin-deep-1.3.1.tgz">https://registry.npmjs.org/mixin-deep/-/mixin-deep-1.3.1.tgz</a></p>
<p>Path to dependency file: /console2/package.json</p>
<p>Path to vulnerable library: /console2/node_modules/mixin-deep/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-4.0.3.tgz (Root Library)
- webpack-4.44.2.tgz
- micromatch-3.1.10.tgz
- snapdragon-0.8.2.tgz
- base-0.11.2.tgz
- :x: **mixin-deep-1.3.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/walmartlabs-concord/commit/b9420f3b9e73a9d381266ece72f7afb756f35a76">b9420f3b9e73a9d381266ece72f7afb756f35a76</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
mixin-deep is vulnerable to Prototype Pollution in versions before 1.3.2 and version 2.0.0. The function mixin-deep could be tricked into adding or modifying properties of Object.prototype using a constructor payload.
<p>Publish Date: 2019-08-23
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-10746>CVE-2019-10746</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2019-08-23</p>
<p>Fix Resolution (mixin-deep): 1.3.2</p>
<p>Direct dependency fix Resolution (react-scripts): 5.0.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
|
non_process
|
cve high detected in mixin deep tgz autoclosed cve high severity vulnerability vulnerable library mixin deep tgz deeply mix the properties of objects into the first object like merge deep but doesn t clone library home page a href path to dependency file package json path to vulnerable library node modules mixin deep package json dependency hierarchy react scripts tgz root library webpack tgz micromatch tgz snapdragon tgz base tgz x mixin deep tgz vulnerable library found in head commit a href found in base branch master vulnerability details mixin deep is vulnerable to prototype pollution in versions before and version the function mixin deep could be tricked into adding or modifying properties of object prototype using a constructor payload publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution mixin deep direct dependency fix resolution react scripts check this box to open an automated fix pr
| 0
|
188,502
| 15,164,173,487
|
IssuesEvent
|
2021-02-12 13:21:47
|
AgentsUnited/demonstrator
|
https://api.github.com/repos/AgentsUnited/demonstrator
|
closed
|
Create folder with demo WOOL scripts that can be added to WOOL Web Service
|
documentation enhancement
|
The demo WOOL scripts should be added to the resources folder in the WOOL Web Service. It is probably best to also upload these files to the demonstrator repo (as they do not really belong in the WOOL repo). This would allow others to run the demo on their own devices.
|
1.0
|
Create folder with demo WOOL scripts that can be added to WOOL Web Service - The demo WOOL scripts should be added to the resources folder in the WOOL Web Service. It is probably best to also upload these files to the demonstrator repo (as they do not really belong in the WOOL repo). This would allow others to run the demo on their own devices.
|
non_process
|
create folder with demo wool scripts that can be added to wool web service the demo wool scripts should be added to the resources folder in the wool web service it is probably best to also upload these files to the demonstrator repo as they do not really belong in the wool repo this would allow others to run the demo on their own devices
| 0
|
22,193
| 2,645,772,535
|
IssuesEvent
|
2015-03-13 02:06:28
|
prikhi/evoluspencil
|
https://api.github.com/repos/prikhi/evoluspencil
|
closed
|
Enhancement in arrangement feature
|
1 star bug imported Priority-Medium
|
_From [fabian...@gmail.com](https://code.google.com/u/117377675167064933363/) on December 08, 2008 11:18:48_
It would be nice to provide keyboard shortcuts for the arrangements that
can be applied to a shape. Maybe these:
CTRL+Up = Arrangement / Bring Forward
CTRL+Down = Arrangement / Send Backward
CTRL+PgUp = Arrangement / Bring to Front
CTRL+PgDown = Arrangement / Send to Back
Thanks.
_Original issue: http://code.google.com/p/evoluspencil/issues/detail?id=79_
|
1.0
|
Enhancement in arrangement feature - _From [fabian...@gmail.com](https://code.google.com/u/117377675167064933363/) on December 08, 2008 11:18:48_
It would be nice to provide keyboard shortcuts for the arrangements that
can be applied to a shape. Maybe these:
CTRL+Up = Arrangement / Bring Forward
CTRL+Down = Arrangement / Send Backward
CTRL+PgUp = Arrangement / Bring to Front
CTRL+PgDown = Arrangement / Send to Back
Thanks.
_Original issue: http://code.google.com/p/evoluspencil/issues/detail?id=79_
|
non_process
|
enhancement in arrangement feature from on december it would be nice to provide keyboard shortcuts for the arrangements that can be applied to a shape maybe these ctrl up arrangement bring forward ctrl down arrangement send backward ctrl pgup arrangement bring to front ctrl pgdown arrangement send to back thanks original issue
| 0
|
136,368
| 11,047,445,401
|
IssuesEvent
|
2019-12-09 18:59:24
|
dexpenses/dexpenses-extract
|
https://api.github.com/repos/dexpenses/dexpenses-extract
|
closed
|
Implement test receipt normal/hannover-goertz-debit
|
enhancement test-data
|
Receipt to implement:

|
1.0
|
Implement test receipt normal/hannover-goertz-debit - Receipt to implement:

|
non_process
|
implement test receipt normal hannover goertz debit receipt to implement normal hannover goertz debit
| 0
|
12,570
| 14,985,993,186
|
IssuesEvent
|
2021-01-28 20:39:50
|
superfluid-finance/protocol-monorepo
|
https://api.github.com/repos/superfluid-finance/protocol-monorepo
|
closed
|
Etherscan code verification
|
squad:PROCESS
|
# Checklist
- [ ] Verify all the proxy contracts.
- [ ] For all new logic contracts update, do etherscan.
- [ ] (If possible) add extra deployment information when doing etherscan.
# Notes:
- Since adding comments wouldn't change compilation bytecode output, it would be convenient to have those deployment information as code comments on etherscan.
|
1.0
|
Etherscan code verification - # Checklist
- [ ] Verify all the proxy contracts.
- [ ] For all new logic contracts update, do etherscan.
- [ ] (If possible) add extra deployment information when doing etherscan.
# Notes:
- Since adding comments wouldn't change compilation bytecode output, it would be convenient to have those deployment information as code comments on etherscan.
|
process
|
etherscan code verification checklist verify all the proxy contracts for all new logic contracts update do etherscan if possible add extra deployment information when doing etherscan notes since adding comments wouldn t change compilation bytecode output it would be convenient to have those deployment information as code comments on etherscan
| 1
|
253,690
| 27,300,794,695
|
IssuesEvent
|
2023-02-24 01:38:39
|
panasalap/linux-4.19.72_1
|
https://api.github.com/repos/panasalap/linux-4.19.72_1
|
closed
|
CVE-2021-28971 (Medium) detected in linux-yoctov5.4.51 - autoclosed
|
security vulnerability
|
## CVE-2021-28971 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.19.72/commit/c5a08fe8179013aad614165d792bc5b436591df6">c5a08fe8179013aad614165d792bc5b436591df6</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In intel_pmu_drain_pebs_nhm in arch/x86/events/intel/ds.c in the Linux kernel through 5.11.8 on some Haswell CPUs, userspace applications (such as perf-fuzzer) can cause a system crash because the PEBS status in a PEBS record is mishandled, aka CID-d88d05a9e0b6.
<p>Publish Date: 2021-03-22
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-28971>CVE-2021-28971</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2021-28971">https://www.linuxkernelcves.com/cves/CVE-2021-28971</a></p>
<p>Release Date: 2021-03-22</p>
<p>Fix Resolution: v4.9.263, v4.14.227, v4.19.183, v5.4.108, v5.10.26, v5.11.9, v5.12-rc4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-28971 (Medium) detected in linux-yoctov5.4.51 - autoclosed - ## CVE-2021-28971 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.19.72/commit/c5a08fe8179013aad614165d792bc5b436591df6">c5a08fe8179013aad614165d792bc5b436591df6</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In intel_pmu_drain_pebs_nhm in arch/x86/events/intel/ds.c in the Linux kernel through 5.11.8 on some Haswell CPUs, userspace applications (such as perf-fuzzer) can cause a system crash because the PEBS status in a PEBS record is mishandled, aka CID-d88d05a9e0b6.
<p>Publish Date: 2021-03-22
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-28971>CVE-2021-28971</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2021-28971">https://www.linuxkernelcves.com/cves/CVE-2021-28971</a></p>
<p>Release Date: 2021-03-22</p>
<p>Fix Resolution: v4.9.263, v4.14.227, v4.19.183, v5.4.108, v5.10.26, v5.11.9, v5.12-rc4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in linux autoclosed cve medium severity vulnerability vulnerable library linux yocto linux embedded kernel library home page a href found in head commit a href found in base branch master vulnerable source files vulnerability details in intel pmu drain pebs nhm in arch events intel ds c in the linux kernel through on some haswell cpus userspace applications such as perf fuzzer can cause a system crash because the pebs status in a pebs record is mishandled aka cid publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
14,501
| 17,571,988,066
|
IssuesEvent
|
2021-08-14 22:19:04
|
zymex22/Project-RimFactory-Revived
|
https://api.github.com/repos/zymex22/Project-RimFactory-Revived
|
opened
|
Architect Icons triggers GUI Error if the Lite mode is used
|
C# Incompatibility
|
Example: https://gist.github.com/HugsLibRecordKeeper/60ac04ddcd90f01b197a0d4b2a5e3022
This is caused as the mod implements its own `CacheDesPanels`.
that grabs values of the original exactly once on startup.
if the Lite mode is used we update invoke `RimWorld.MainTabWindow_Architect.CacheDesPanels` and as the `Architect Icons` never gets a refresh errors are caused.
as i can't find a way to get an instance of: https://github.com/marcin212/architect-icons/blob/master/Source/ArchitectIcons/MainTabWindow_Architect.cs#L72 i plan to open a PR that includes a `public static` method that we can easily hit.
|
True
|
Architect Icons triggers GUI Error if the Lite mode is used - Example: https://gist.github.com/HugsLibRecordKeeper/60ac04ddcd90f01b197a0d4b2a5e3022
This is caused as the mod implements its own `CacheDesPanels`.
that grabs values of the original exactly once on startup.
if the Lite mode is used we update invoke `RimWorld.MainTabWindow_Architect.CacheDesPanels` and as the `Architect Icons` never gets a refresh errors are caused.
as i can't find a way to get an instance of: https://github.com/marcin212/architect-icons/blob/master/Source/ArchitectIcons/MainTabWindow_Architect.cs#L72 i plan to open a PR that includes a `public static` method that we can easily hit.
|
non_process
|
architect icons triggers gui error if the lite mode is used example this is caused as the mod implements its own cachedespanels that grabs values of the original exactly once on startup if the lite mode is used we update invoke rimworld maintabwindow architect cachedespanels and as the architect icons never gets a refresh errors are caused as i can t find a way to get an instance of i plan to open a pr that includes a public static method that we can easily hit
| 0
|
3,806
| 6,792,529,419
|
IssuesEvent
|
2017-11-01 01:00:45
|
rubberduck-vba/Rubberduck
|
https://api.github.com/repos/rubberduck-vba/Rubberduck
|
closed
|
ParserState caching issue?
|
bug critical parse-tree-processing
|
Might be a one-off, but seems identifier references don't *consistently* get cleaned up:

That selection was the location of the *other* reference to `SageContext` (`With New SageContext`) in the first parse, now shifted a number of lines below.
|
1.0
|
ParserState caching issue? - Might be a one-off, but seems identifier references don't *consistently* get cleaned up:

That selection was the location of the *other* reference to `SageContext` (`With New SageContext`) in the first parse, now shifted a number of lines below.
|
process
|
parserstate caching issue might be a one off but seems identifier references don t consistently get cleaned up that selection was the location of the other reference to sagecontext with new sagecontext in the first parse now shifted a number of lines below
| 1
|
305,125
| 9,359,597,815
|
IssuesEvent
|
2019-04-02 07:21:13
|
aiidateam/aiida_core
|
https://api.github.com/repos/aiidateam/aiida_core
|
closed
|
Use period as separator for namespaced input and output links
|
priority/nice-to-have requires discussion topic/workflows type/feature request type/question
|
Currently, if a Process exposes inputs into a specific namespace, the link names will be a concatenation of the namespace and the input port, joined by an underscore. However, in specifying namespaces, we use `.` as the separator. Should these two made to be congruent?
|
1.0
|
Use period as separator for namespaced input and output links - Currently, if a Process exposes inputs into a specific namespace, the link names will be a concatenation of the namespace and the input port, joined by an underscore. However, in specifying namespaces, we use `.` as the separator. Should these two made to be congruent?
|
non_process
|
use period as separator for namespaced input and output links currently if a process exposes inputs into a specific namespace the link names will be a concatenation of the namespace and the input port joined by an underscore however in specifying namespaces we use as the separator should these two made to be congruent
| 0
|
45,171
| 13,107,048,186
|
IssuesEvent
|
2020-08-04 14:43:08
|
AOSC-Dev/aosc-os-abbs
|
https://api.github.com/repos/AOSC-Dev/aosc-os-abbs
|
opened
|
openslp: CVE-2019-5544
|
P1 security to-stable
|
<!-- Please remove items do not apply. -->
**CVE IDs:** CVE-2019-5544
**Other security advisory IDs:** RHSA-2019:4240-01
**Description:**
* openslp: Heap-based buffer overflow in ProcessSrvRqst() in slpd_process.c
leading to remote code execution (CVE-2019-5544)
**Patches:** from CentOS
**PoC(s):** <!-- Please list links to available PoCs (Proofs of Concept). -->
**Architectural progress (Mainline):**
<!-- Please remove any architecture to which the security vulnerabilities do not apply. -->
- [ ] AMD64 `amd64`
- [ ] 32-bit Optional Environment `optenv32`
- [ ] AArch64 `arm64`
**Architectural progress (Retro):**
<!-- Please remove any architecture to which the security vulnerabilities do not apply. -->
- [ ] ARMv5t+ `armel`
- [ ] ARMv7 `armhf`
- [ ] i486 `i486`
<!-- If the specified package is `noarch`, please use the stub below. -->
<!-- - [ ] Architecture-independent `noarch` -->
|
True
|
openslp: CVE-2019-5544 - <!-- Please remove items do not apply. -->
**CVE IDs:** CVE-2019-5544
**Other security advisory IDs:** RHSA-2019:4240-01
**Description:**
* openslp: Heap-based buffer overflow in ProcessSrvRqst() in slpd_process.c
leading to remote code execution (CVE-2019-5544)
**Patches:** from CentOS
**PoC(s):** <!-- Please list links to available PoCs (Proofs of Concept). -->
**Architectural progress (Mainline):**
<!-- Please remove any architecture to which the security vulnerabilities do not apply. -->
- [ ] AMD64 `amd64`
- [ ] 32-bit Optional Environment `optenv32`
- [ ] AArch64 `arm64`
**Architectural progress (Retro):**
<!-- Please remove any architecture to which the security vulnerabilities do not apply. -->
- [ ] ARMv5t+ `armel`
- [ ] ARMv7 `armhf`
- [ ] i486 `i486`
<!-- If the specified package is `noarch`, please use the stub below. -->
<!-- - [ ] Architecture-independent `noarch` -->
|
non_process
|
openslp cve cve ids cve other security advisory ids rhsa description openslp heap based buffer overflow in processsrvrqst in slpd process c leading to remote code execution cve patches from centos poc s architectural progress mainline bit optional environment architectural progress retro armel armhf
| 0
|
1,005
| 3,471,163,312
|
IssuesEvent
|
2015-12-23 13:44:21
|
refugeetech/platform
|
https://api.github.com/repos/refugeetech/platform
|
closed
|
Determine standup time and frequency
|
Open Process
|
The team should meet relatively frequently to coordinate efforts. Determine a time and schedule that works for everyone to have standups. These meetings should happen at least twice weekly and be no longer than 15 minutes.
|
1.0
|
Determine standup time and frequency - The team should meet relatively frequently to coordinate efforts. Determine a time and schedule that works for everyone to have standups. These meetings should happen at least twice weekly and be no longer than 15 minutes.
|
process
|
determine standup time and frequency the team should meet relatively frequently to coordinate efforts determine a time and schedule that works for everyone to have standups these meetings should happen at least twice weekly and be no longer than minutes
| 1
|
236,224
| 25,976,487,411
|
IssuesEvent
|
2022-12-19 15:15:08
|
thinktecture-labs/cloud-native-sample
|
https://api.github.com/repos/thinktecture-labs/cloud-native-sample
|
opened
|
Dockle report for Notification Service
|
containers security
|
The container image for Notification Service (`notification:3ac48c8e238020da1b1ad0d0380a5e67fd7849f3`) was scanned during CI for 3ac48c8e238020da1b1ad0d0380a5e67fd7849f3 using dockle. Please see the findings mentioned below:
## Dockle results
```json
{
"summary": {
"fatal": 1,
"warn": 0,
"info": 3,
"skip": 0,
"pass": 12
},
"details": [
{
"code": "CIS-DI-0009",
"title": "Use COPY instead of ADD in Dockerfile",
"level": "FATAL",
"alerts": [
"Use COPY : /bin/sh -c #(nop) ADD file:1d0cc49ce68a65be88e80d24167798e49ea0bf171568fe7d3abeac6253a75849 in / "
]
},
{
"code": "CIS-DI-0005",
"title": "Enable Content trust for Docker",
"level": "INFO",
"alerts": [
"export DOCKER_CONTENT_TRUST=1 before docker pull/build"
]
},
{
"code": "CIS-DI-0006",
"title": "Add HEALTHCHECK instruction to the container image",
"level": "INFO",
"alerts": [
"not found HEALTHCHECK statement"
]
},
{
"code": "CIS-DI-0008",
"title": "Confirm safety of setuid/setgid files",
"level": "INFO",
"alerts": [
"setuid file: urwxr-xr-x usr/bin/umount",
"setuid file: urwxr-xr-x usr/bin/mount",
"setgid file: grwxr-xr-x usr/bin/chage",
"setuid file: urwxr-xr-x usr/bin/su",
"setgid file: grwxr-xr-x usr/sbin/pam_extrausers_chkpwd",
"setgid file: grwxr-xr-x usr/bin/expiry",
"setuid file: urwxr-xr-x usr/bin/gpasswd",
"setgid file: grwxr-xr-x usr/bin/wall",
"setuid file: urwxr-xr-x usr/bin/newgrp",
"setuid file: urwxr-xr-x usr/bin/passwd",
"setgid file: grwxr-xr-x usr/sbin/unix_chkpwd",
"setuid file: urwxr-xr-x usr/bin/chfn",
"setuid file: urwxr-xr-x usr/bin/chsh"
]
}
]
}
```
|
True
|
Dockle report for Notification Service - The container image for Notification Service (`notification:3ac48c8e238020da1b1ad0d0380a5e67fd7849f3`) was scanned during CI for 3ac48c8e238020da1b1ad0d0380a5e67fd7849f3 using dockle. Please see the findings mentioned below:
## Dockle results
```json
{
"summary": {
"fatal": 1,
"warn": 0,
"info": 3,
"skip": 0,
"pass": 12
},
"details": [
{
"code": "CIS-DI-0009",
"title": "Use COPY instead of ADD in Dockerfile",
"level": "FATAL",
"alerts": [
"Use COPY : /bin/sh -c #(nop) ADD file:1d0cc49ce68a65be88e80d24167798e49ea0bf171568fe7d3abeac6253a75849 in / "
]
},
{
"code": "CIS-DI-0005",
"title": "Enable Content trust for Docker",
"level": "INFO",
"alerts": [
"export DOCKER_CONTENT_TRUST=1 before docker pull/build"
]
},
{
"code": "CIS-DI-0006",
"title": "Add HEALTHCHECK instruction to the container image",
"level": "INFO",
"alerts": [
"not found HEALTHCHECK statement"
]
},
{
"code": "CIS-DI-0008",
"title": "Confirm safety of setuid/setgid files",
"level": "INFO",
"alerts": [
"setuid file: urwxr-xr-x usr/bin/umount",
"setuid file: urwxr-xr-x usr/bin/mount",
"setgid file: grwxr-xr-x usr/bin/chage",
"setuid file: urwxr-xr-x usr/bin/su",
"setgid file: grwxr-xr-x usr/sbin/pam_extrausers_chkpwd",
"setgid file: grwxr-xr-x usr/bin/expiry",
"setuid file: urwxr-xr-x usr/bin/gpasswd",
"setgid file: grwxr-xr-x usr/bin/wall",
"setuid file: urwxr-xr-x usr/bin/newgrp",
"setuid file: urwxr-xr-x usr/bin/passwd",
"setgid file: grwxr-xr-x usr/sbin/unix_chkpwd",
"setuid file: urwxr-xr-x usr/bin/chfn",
"setuid file: urwxr-xr-x usr/bin/chsh"
]
}
]
}
```
|
non_process
|
dockle report for notification service the container image for notification service notification was scanned during ci for using dockle please see the findings mentioned below dockle results json summary fatal warn info skip pass details code cis di title use copy instead of add in dockerfile level fatal alerts use copy bin sh c nop add file in code cis di title enable content trust for docker level info alerts export docker content trust before docker pull build code cis di title add healthcheck instruction to the container image level info alerts not found healthcheck statement code cis di title confirm safety of setuid setgid files level info alerts setuid file urwxr xr x usr bin umount setuid file urwxr xr x usr bin mount setgid file grwxr xr x usr bin chage setuid file urwxr xr x usr bin su setgid file grwxr xr x usr sbin pam extrausers chkpwd setgid file grwxr xr x usr bin expiry setuid file urwxr xr x usr bin gpasswd setgid file grwxr xr x usr bin wall setuid file urwxr xr x usr bin newgrp setuid file urwxr xr x usr bin passwd setgid file grwxr xr x usr sbin unix chkpwd setuid file urwxr xr x usr bin chfn setuid file urwxr xr x usr bin chsh
| 0
|
438,107
| 12,619,307,788
|
IssuesEvent
|
2020-06-13 00:01:20
|
knative/docs
|
https://api.github.com/repos/knative/docs
|
closed
|
Document testing (in Contributing section?)
|
kind/good-first-issue lifecycle/rotten priority/2
|
**Describe the change you'd like to see**
A new possible contributor would have a hard time finding testgrid.knative.dev, the PR dashboard https://gubernator.knative.dev/pr, or performance testing results at https://mako.dev/project?name=Knative.
|
1.0
|
Document testing (in Contributing section?) - **Describe the change you'd like to see**
A new possible contributor would have a hard time finding testgrid.knative.dev, the PR dashboard https://gubernator.knative.dev/pr, or performance testing results at https://mako.dev/project?name=Knative.
|
non_process
|
document testing in contributing section describe the change you d like to see a new possible contributor would have a hard time finding testgrid knative dev the pr dashboard or performance testing results at
| 0
|
727,710
| 25,044,891,766
|
IssuesEvent
|
2022-11-05 05:18:21
|
AY2223S1-CS2103T-F12-1/tp
|
https://api.github.com/repos/AY2223S1-CS2103T-F12-1/tp
|
closed
|
As a property agent, I want a way to view all of my existing information stored in the database in a clean, visually appealing manner.
|
priority.low type.story
|
Make a nice GUI
|
1.0
|
As a property agent, I want a way to view all of my existing information stored in the database in a clean, visually appealing manner. - Make a nice GUI
|
non_process
|
as a property agent i want a way to view all of my existing information stored in the database in a clean visually appealing manner make a nice gui
| 0
|
224,426
| 24,773,348,361
|
IssuesEvent
|
2022-10-23 12:28:50
|
sast-automation-dev/easybuggy4django-41
|
https://api.github.com/repos/sast-automation-dev/easybuggy4django-41
|
opened
|
Django-2.0.3-py3-none-any.whl: 11 vulnerabilities (highest severity is: 9.8)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p></summary>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (Django version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2019-19844](https://www.mend.io/vulnerability-database/CVE-2019-19844) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | Django-2.0.3-py3-none-any.whl | Direct | 2.2.9 | ✅ |
| [CVE-2019-14234](https://www.mend.io/vulnerability-database/CVE-2019-14234) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | Django-2.0.3-py3-none-any.whl | Direct | 2.1.11 | ✅ |
| [CVE-2020-9402](https://www.mend.io/vulnerability-database/CVE-2020-9402) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.8 | Django-2.0.3-py3-none-any.whl | Direct | 2.2.11 | ✅ |
| [CVE-2019-6975](https://www.mend.io/vulnerability-database/CVE-2019-6975) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | Django-2.0.3-py3-none-any.whl | Direct | 2.0.12 | ✅ |
| [CVE-2016-7401](https://www.mend.io/vulnerability-database/CVE-2016-7401) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | Django-2.0.3-py3-none-any.whl | Direct | 2.0.8 | ✅ |
| [CVE-2019-3498](https://www.mend.io/vulnerability-database/CVE-2019-3498) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | Django-2.0.3-py3-none-any.whl | Direct | 2.0.10 | ✅ |
| [CVE-2020-13596](https://www.mend.io/vulnerability-database/CVE-2020-13596) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | Django-2.0.3-py3-none-any.whl | Direct | 2.2.13 | ✅ |
| [CVE-2018-14574](https://www.mend.io/vulnerability-database/CVE-2018-14574) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | Django-2.0.3-py3-none-any.whl | Direct | 2.0.8 | ✅ |
| [CVE-2020-13254](https://www.mend.io/vulnerability-database/CVE-2020-13254) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.9 | Django-2.0.3-py3-none-any.whl | Direct | 2.2.13 | ✅ |
| [CVE-2021-28658](https://www.mend.io/vulnerability-database/CVE-2021-28658) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | Django-2.0.3-py3-none-any.whl | Direct | django-2.2.20, 3.0.14, 3.1.8, 3.2 | ✅ |
| [CVE-2018-16984](https://www.mend.io/vulnerability-database/CVE-2018-16984) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.9 | Django-2.0.3-py3-none-any.whl | Direct | 2.1.2 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-19844</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Django before 1.11.27, 2.x before 2.2.9, and 3.x before 3.0.1 allows account takeover. A suitably crafted email address (that is equal to an existing user's email address after case transformation of Unicode characters) would allow an attacker to be sent a password reset token for the matched user account. (One mitigation in the new releases is to send password reset tokens only to the registered user email address.)
<p>Publish Date: Dec 18, 2019 7:15:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-19844>CVE-2019-19844</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19844">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19844</a></p>
<p>Release Date: Jan 8, 2020 4:15:00 AM</p>
<p>Fix Resolution: 2.2.9</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-14234</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
An issue was discovered in Django 1.11.x before 1.11.23, 2.1.x before 2.1.11, and 2.2.x before 2.2.4. Due to an error in shallow key transformation, key and index lookups for django.contrib.postgres.fields.JSONField, and key lookups for django.contrib.postgres.fields.HStoreField, were subject to SQL injection. This could, for example, be exploited via crafted use of "OR 1=1" in a key or index name to return all records, using a suitably crafted dictionary, with dictionary expansion, as the **kwargs passed to the QuerySet.filter() function.
<p>Publish Date: Aug 9, 2019 1:15:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-14234>CVE-2019-14234</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.djangoproject.com/weblog/2019/aug/01/security-releases/">https://www.djangoproject.com/weblog/2019/aug/01/security-releases/</a></p>
<p>Release Date: Aug 28, 2019 1:15:00 PM</p>
<p>Fix Resolution: 2.1.11</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-9402</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Django 1.11 before 1.11.29, 2.2 before 2.2.11, and 3.0 before 3.0.4 allows SQL Injection if untrusted data is used as a tolerance parameter in GIS functions and aggregates on Oracle. By passing a suitably crafted tolerance to GIS functions and aggregates on Oracle, it was possible to break escaping and inject malicious SQL.
<p>Publish Date: Mar 5, 2020 3:15:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-9402>CVE-2020-9402</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>8.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9402">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9402</a></p>
<p>Release Date: Jul 14, 2020 5:28:00 PM</p>
<p>Fix Resolution: 2.2.11</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-6975</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Django 1.11.x before 1.11.19, 2.0.x before 2.0.11, and 2.1.x before 2.1.6 allows Uncontrolled Memory Consumption via a malicious attacker-supplied value to the django.utils.numberformat.format() function.
<p>Publish Date: Feb 11, 2019 1:29:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-6975>CVE-2019-6975</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.djangoproject.com/weblog/2019/feb/11/security-releases/">https://www.djangoproject.com/weblog/2019/feb/11/security-releases/</a></p>
<p>Release Date: Aug 24, 2020 5:37:00 PM</p>
<p>Fix Resolution: 2.0.12</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2016-7401</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The cookie parsing code in Django before 1.8.15 and 1.9.x before 1.9.10, when used on a site with Google Analytics, allows remote attackers to bypass an intended CSRF protection mechanism by setting arbitrary cookies.
<p>Publish Date: Oct 3, 2016 6:59:13 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2016-7401>CVE-2016-7401</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2016-7401">https://nvd.nist.gov/vuln/detail/CVE-2016-7401</a></p>
<p>Release Date: Oct 3, 2016 6:59:13 PM</p>
<p>Fix Resolution: 2.0.8</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2019-3498</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Django 1.11.x before 1.11.18, 2.0.x before 2.0.10, and 2.1.x before 2.1.5, an Improper Neutralization of Special Elements in Output Used by a Downstream Component issue exists in django.views.defaults.page_not_found(), leading to content spoofing (in a 404 error page) if a user fails to recognize that a crafted URL has malicious content.
<p>Publish Date: Jan 9, 2019 11:29:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-3498>CVE-2019-3498</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.djangoproject.com/weblog/2019/jan/04/security-releases/">https://www.djangoproject.com/weblog/2019/jan/04/security-releases/</a></p>
<p>Release Date: Jan 9, 2019 11:29:00 PM</p>
<p>Fix Resolution: 2.0.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-13596</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
An issue was discovered in Django 2.2 before 2.2.13 and 3.0 before 3.0.7. Query parameters generated by the Django admin ForeignKeyRawIdWidget were not properly URL encoded, leading to a possibility of an XSS attack.
<p>Publish Date: Jun 3, 2020 2:15:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-13596>CVE-2020-13596</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.djangoproject.com/weblog/2020/jun/03/security-releases/">https://www.djangoproject.com/weblog/2020/jun/03/security-releases/</a></p>
<p>Release Date: Jun 3, 2020 2:15:00 PM</p>
<p>Fix Resolution: 2.2.13</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2018-14574</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
django.middleware.common.CommonMiddleware in Django 1.11.x before 1.11.15 and 2.0.x before 2.0.8 has an Open Redirect.
<p>Publish Date: Aug 3, 2018 5:29:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-14574>CVE-2018-14574</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-14574">https://nvd.nist.gov/vuln/detail/CVE-2018-14574</a></p>
<p>Release Date: Aug 3, 2018 5:29:00 PM</p>
<p>Fix Resolution: 2.0.8</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-13254</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
An issue was discovered in Django 2.2 before 2.2.13 and 3.0 before 3.0.7. In cases where a memcached backend does not perform key validation, passing malformed cache keys could result in a key collision, and potential data leakage.
<p>Publish Date: Jun 3, 2020 2:15:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-13254>CVE-2020-13254</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.9</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.djangoproject.com/weblog/2020/jun/03/security-releases/">https://www.djangoproject.com/weblog/2020/jun/03/security-releases/</a></p>
<p>Release Date: Jun 3, 2020 2:15:00 PM</p>
<p>Fix Resolution: 2.2.13</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-28658</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Django 2.2 before 2.2.20, 3.0 before 3.0.14, and 3.1 before 3.1.8, MultiPartParser allowed directory traversal via uploaded files with suitably crafted file names. Built-in upload handlers were not affected by this vulnerability.
<p>Publish Date: Apr 6, 2021 3:15:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-28658>CVE-2021-28658</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-28658">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-28658</a></p>
<p>Release Date: Apr 6, 2021 3:15:00 PM</p>
<p>Fix Resolution: django-2.2.20, 3.0.14, 3.1.8, 3.2</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2018-16984</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
An issue was discovered in Django 2.1 before 2.1.2, in which unprivileged users can read the password hashes of arbitrary accounts. The read-only password widget used by the Django Admin to display an obfuscated password hash was bypassed if a user has only the "view" permission (new in Django 2.1), resulting in display of the entire password hash to those users. This may result in a vulnerability for sites with legacy user accounts using insecure hashes.
<p>Publish Date: Oct 2, 2018 6:29:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-16984>CVE-2018-16984</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>4.9</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-16984">https://nvd.nist.gov/vuln/detail/CVE-2018-16984</a></p>
<p>Release Date: Oct 2, 2018 6:29:01 PM</p>
<p>Fix Resolution: 2.1.2</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
True
|
Django-2.0.3-py3-none-any.whl: 11 vulnerabilities (highest severity is: 9.8) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p></summary>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (Django version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2019-19844](https://www.mend.io/vulnerability-database/CVE-2019-19844) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | Django-2.0.3-py3-none-any.whl | Direct | 2.2.9 | ✅ |
| [CVE-2019-14234](https://www.mend.io/vulnerability-database/CVE-2019-14234) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | Django-2.0.3-py3-none-any.whl | Direct | 2.1.11 | ✅ |
| [CVE-2020-9402](https://www.mend.io/vulnerability-database/CVE-2020-9402) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.8 | Django-2.0.3-py3-none-any.whl | Direct | 2.2.11 | ✅ |
| [CVE-2019-6975](https://www.mend.io/vulnerability-database/CVE-2019-6975) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | Django-2.0.3-py3-none-any.whl | Direct | 2.0.12 | ✅ |
| [CVE-2016-7401](https://www.mend.io/vulnerability-database/CVE-2016-7401) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | Django-2.0.3-py3-none-any.whl | Direct | 2.0.8 | ✅ |
| [CVE-2019-3498](https://www.mend.io/vulnerability-database/CVE-2019-3498) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | Django-2.0.3-py3-none-any.whl | Direct | 2.0.10 | ✅ |
| [CVE-2020-13596](https://www.mend.io/vulnerability-database/CVE-2020-13596) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | Django-2.0.3-py3-none-any.whl | Direct | 2.2.13 | ✅ |
| [CVE-2018-14574](https://www.mend.io/vulnerability-database/CVE-2018-14574) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | Django-2.0.3-py3-none-any.whl | Direct | 2.0.8 | ✅ |
| [CVE-2020-13254](https://www.mend.io/vulnerability-database/CVE-2020-13254) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.9 | Django-2.0.3-py3-none-any.whl | Direct | 2.2.13 | ✅ |
| [CVE-2021-28658](https://www.mend.io/vulnerability-database/CVE-2021-28658) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | Django-2.0.3-py3-none-any.whl | Direct | django-2.2.20, 3.0.14, 3.1.8, 3.2 | ✅ |
| [CVE-2018-16984](https://www.mend.io/vulnerability-database/CVE-2018-16984) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.9 | Django-2.0.3-py3-none-any.whl | Direct | 2.1.2 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-19844</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Django before 1.11.27, 2.x before 2.2.9, and 3.x before 3.0.1 allows account takeover. A suitably crafted email address (that is equal to an existing user's email address after case transformation of Unicode characters) would allow an attacker to be sent a password reset token for the matched user account. (One mitigation in the new releases is to send password reset tokens only to the registered user email address.)
<p>Publish Date: Dec 18, 2019 7:15:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-19844>CVE-2019-19844</a></p>
</p>
<p></p>
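The collision described above can be reproduced outside Django: certain non-ASCII characters map onto ASCII letters under case transformation, so two visually distinct addresses compare equal after upper-casing. A minimal Python sketch (illustrative only, not Django's actual lookup code):

```python
def emails_collide(a: str, b: str) -> bool:
    # True when two *distinct* addresses become equal after case
    # transformation -- the matching condition CVE-2019-19844 exploits.
    return a != b and a.upper() == b.upper()

# U+0131 (LATIN SMALL LETTER DOTLESS I) upper-cases to ASCII 'I',
# so this crafted address collides with the victim's address.
victim = "mike@example.org"
crafted = "m\u0131ke@example.org"
print(emails_collide(victim, crafted))  # True
```

This is why the fix sends reset tokens only to the address stored on the account, rather than to the address submitted in the form.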
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19844">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19844</a></p>
<p>Release Date: Jan 8, 2020 4:15:00 AM</p>
<p>Fix Resolution: 2.2.9</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-14234</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
An issue was discovered in Django 1.11.x before 1.11.23, 2.1.x before 2.1.11, and 2.2.x before 2.2.4. Due to an error in shallow key transformation, key and index lookups for django.contrib.postgres.fields.JSONField, and key lookups for django.contrib.postgres.fields.HStoreField, were subject to SQL injection. This could, for example, be exploited via crafted use of "OR 1=1" in a key or index name to return all records, using a suitably crafted dictionary, with dictionary expansion, as the **kwargs passed to the QuerySet.filter() function.
<p>Publish Date: Aug 9, 2019 1:15:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-14234>CVE-2019-14234</a></p>
</p>
<p></p>
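The injection requires an untrusted string to reach the lookup *name* (not the value) of a `filter(**kwargs)` call. A hedged mitigation sketch, using a hypothetical `data` JSON field and a simple identifier allowlist; this is not the upstream fix, which corrected the key transformation itself:

```python
import re

# Only identifier-style keys may be interpolated into a lookup such as
# Model.objects.filter(**{f"data__{key}": value}).
_SAFE_KEY = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def safe_json_lookup(key: str, value):
    """Build filter kwargs for a JSONField key, rejecting anything that
    could smuggle SQL fragments like "OR 1=1" into the key position."""
    if not _SAFE_KEY.match(key):
        raise ValueError(f"unsafe lookup key: {key!r}")
    return {f"data__{key}": value}

print(safe_json_lookup("color", "red"))  # {'data__color': 'red'}
```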
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.djangoproject.com/weblog/2019/aug/01/security-releases/">https://www.djangoproject.com/weblog/2019/aug/01/security-releases/</a></p>
<p>Release Date: Aug 28, 2019 1:15:00 PM</p>
<p>Fix Resolution: 2.1.11</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-9402</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Django 1.11 before 1.11.29, 2.2 before 2.2.11, and 3.0 before 3.0.4 allows SQL Injection if untrusted data is used as a tolerance parameter in GIS functions and aggregates on Oracle. By passing a suitably crafted tolerance to GIS functions and aggregates on Oracle, it was possible to break escaping and inject malicious SQL.
<p>Publish Date: Mar 5, 2020 3:15:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-9402>CVE-2020-9402</a></p>
</p>
<p></p>
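Because the tolerance is a numeric parameter, the simplest defensive measure for callers is to coerce it to a float before it can reach query construction. A hypothetical helper sketching that idea (the actual fix escapes the value inside Django's GIS backend):

```python
def validated_tolerance(raw) -> float:
    """Accept only genuinely numeric tolerances; any string carrying
    SQL fragments fails float() and is rejected up front."""
    try:
        return float(raw)
    except (TypeError, ValueError):
        raise ValueError(f"tolerance must be numeric, got {raw!r}")

print(validated_tolerance("0.05"))  # 0.05
```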
### CVSS 3 Score Details (<b>8.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9402">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9402</a></p>
<p>Release Date: Jul 14, 2020 5:28:00 PM</p>
<p>Fix Resolution: 2.2.11</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-6975</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Django 1.11.x before 1.11.19, 2.0.x before 2.0.11, and 2.1.x before 2.1.6 allows Uncontrolled Memory Consumption via a malicious attacker-supplied value to the django.utils.numberformat.format() function.
<p>Publish Date: Feb 11, 2019 1:29:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-6975>CVE-2019-6975</a></p>
</p>
<p></p>
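The exhaustion vector is a `Decimal` with an enormous exponent: expanding it to positional notation materialises a string with billions of digits. A guarded formatter sketch under that assumption (Django's real fix similarly falls back to scientific notation past a digit cutoff):

```python
from decimal import Decimal

def guarded_format(value: Decimal, max_digits: int = 200) -> str:
    """Refuse to expand numbers whose positional form would be huge,
    the memory-exhaustion vector of CVE-2019-6975."""
    _, digits, exponent = value.as_tuple()
    if abs(exponent) + len(digits) > max_digits:
        # Compact scientific notation instead of a giant digit string.
        return f"{value:.6e}"
    return str(value)

print(guarded_format(Decimal("123.45")))  # 123.45
print(guarded_format(Decimal("9e9999")))  # compact, not 10,000 digits
```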
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.djangoproject.com/weblog/2019/feb/11/security-releases/">https://www.djangoproject.com/weblog/2019/feb/11/security-releases/</a></p>
<p>Release Date: Aug 24, 2020 5:37:00 PM</p>
<p>Fix Resolution: 2.0.12</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2016-7401</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The cookie parsing code in Django before 1.8.15 and 1.9.x before 1.9.10, when used on a site with Google Analytics, allows remote attackers to bypass an intended CSRF protection mechanism by setting arbitrary cookies.
<p>Publish Date: Oct 3, 2016 6:59:13 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2016-7401>CVE-2016-7401</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2016-7401">https://nvd.nist.gov/vuln/detail/CVE-2016-7401</a></p>
<p>Release Date: Oct 3, 2016 6:59:13 PM</p>
<p>Fix Resolution: 2.0.8</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2019-3498</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Django 1.11.x before 1.11.18, 2.0.x before 2.0.10, and 2.1.x before 2.1.5, an Improper Neutralization of Special Elements in Output Used by a Downstream Component issue exists in django.views.defaults.page_not_found(), leading to content spoofing (in a 404 error page) if a user fails to recognize that a crafted URL has malicious content.
<p>Publish Date: Jan 9, 2019 11:29:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-3498>CVE-2019-3498</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.djangoproject.com/weblog/2019/jan/04/security-releases/">https://www.djangoproject.com/weblog/2019/jan/04/security-releases/</a></p>
<p>Release Date: Jan 9, 2019 11:29:00 PM</p>
<p>Fix Resolution: 2.0.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-13596</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
An issue was discovered in Django 2.2 before 2.2.13 and 3.0 before 3.0.7. Query parameters generated by the Django admin ForeignKeyRawIdWidget were not properly URL encoded, leading to a possibility of an XSS attack.
<p>Publish Date: Jun 3, 2020 2:15:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-13596>CVE-2020-13596</a></p>
</p>
<p></p>
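The underlying mistake is interpolating raw values into a generated link instead of percent-encoding them. A small illustration of the difference, using a made-up admin URL and parameter value:

```python
from urllib.parse import urlencode

params = {"_to_field": 'id" onclick="alert(1)'}

# Unsafe: raw interpolation leaves quotes intact, enabling attribute
# injection in the rendered admin link.
unsafe = "/admin/app/model/?" + "&".join(f"{k}={v}" for k, v in params.items())

# Safe: urlencode percent-encodes every reserved character.
safe = "/admin/app/model/?" + urlencode(params)
print(safe)
```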
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.djangoproject.com/weblog/2020/jun/03/security-releases/">https://www.djangoproject.com/weblog/2020/jun/03/security-releases/</a></p>
<p>Release Date: Jun 3, 2020 2:15:00 PM</p>
<p>Fix Resolution: 2.2.13</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2018-14574</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
django.middleware.common.CommonMiddleware in Django 1.11.x before 1.11.15 and 2.0.x before 2.0.8 has an Open Redirect.
<p>Publish Date: Aug 3, 2018 5:29:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-14574>CVE-2018-14574</a></p>
</p>
<p></p>
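The bypass relies on scheme-relative targets such as `//evil.example`, which have no scheme but do carry a network location. A minimal check in the spirit of the fix (Django's own validator, `url_has_allowed_host_and_scheme`, is stricter):

```python
from urllib.parse import urlparse

def is_local_redirect(target: str) -> bool:
    """Reject absolute and scheme-relative redirect targets,
    the open-redirect vector of CVE-2018-14574."""
    parsed = urlparse(target)
    return not parsed.scheme and not parsed.netloc

print(is_local_redirect("/accounts/profile/"))    # True
print(is_local_redirect("//evil.example"))        # False
print(is_local_redirect("https://evil.example"))  # False
```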
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-14574">https://nvd.nist.gov/vuln/detail/CVE-2018-14574</a></p>
<p>Release Date: Aug 3, 2018 5:29:00 PM</p>
<p>Fix Resolution: 2.0.8</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-13254</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
An issue was discovered in Django 2.2 before 2.2.13 and 3.0 before 3.0.7. In cases where a memcached backend does not perform key validation, passing malformed cache keys could result in a key collision, and potential data leakage.
<p>Publish Date: Jun 3, 2020 2:15:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-13254>CVE-2020-13254</a></p>
</p>
<p></p>
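The memcached text protocol forbids whitespace and control characters in keys and caps key length at 250 bytes; a backend that skips these checks lets a crafted key alias another entry. A validation sketch mirroring those protocol rules:

```python
def validate_cache_key(key: str, max_length: int = 250) -> str:
    """Enforce memcached's key constraints: no whitespace or control
    characters, length <= 250. Malformed keys risk the collisions
    described in CVE-2020-13254."""
    if len(key) > max_length or any(ord(c) < 33 or ord(c) == 127 for c in key):
        raise ValueError(f"invalid cache key: {key!r}")
    return key

print(validate_cache_key("sessions:user:42"))  # sessions:user:42
```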
### CVSS 3 Score Details (<b>5.9</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.djangoproject.com/weblog/2020/jun/03/security-releases/">https://www.djangoproject.com/weblog/2020/jun/03/security-releases/</a></p>
<p>Release Date: Jun 3, 2020 2:15:00 PM</p>
<p>Fix Resolution: 2.2.13</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-28658</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Django 2.2 before 2.2.20, 3.0 before 3.0.14, and 3.1 before 3.1.8, MultiPartParser allowed directory traversal via uploaded files with suitably crafted file names. Built-in upload handlers were not affected by this vulnerability.
<p>Publish Date: Apr 6, 2021 3:15:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-28658>CVE-2021-28658</a></p>
</p>
<p></p>
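The traversal payload is a client-supplied file name containing directory components, e.g. `../../etc/passwd`. A sanitisation sketch along the lines of the upstream fix, which strips the path down to its base name:

```python
import os

def sanitize_upload_name(name: str) -> str:
    """Drop any directory components (both '/' and '\\' separators)
    from a client-supplied file name, defeating the traversal
    payloads described in CVE-2021-28658."""
    base = os.path.basename(name.replace("\\", "/"))
    if base in ("", ".", ".."):
        raise ValueError(f"suspicious file name: {name!r}")
    return base

print(sanitize_upload_name("../../etc/passwd"))  # passwd
print(sanitize_upload_name("report.pdf"))        # report.pdf
```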
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-28658">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-28658</a></p>
<p>Release Date: Apr 6, 2021 3:15:00 PM</p>
<p>Fix Resolution: django-2.2.20, 3.0.14, 3.1.8, 3.2</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2018-16984</summary>
### Vulnerable Library - <b>Django-2.0.3-py3-none-any.whl</b></p>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl">https://files.pythonhosted.org/packages/3d/81/7e6cf5cb6f0f333946b5d3ee22e17c3c3f329d3bfeb86943a2a3cd861092/Django-2.0.3-py3-none-any.whl</a></p>
<p>Path to dependency file: /easybuggy4django-41</p>
<p>Path to vulnerable library: /easybuggy4django-41,/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.0.3-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-41/commit/38a3155da23d81cc9375f9627133f9556f58a9ad">38a3155da23d81cc9375f9627133f9556f58a9ad</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
An issue was discovered in Django 2.1 before 2.1.2, in which unprivileged users can read the password hashes of arbitrary accounts. The read-only password widget used by the Django Admin to display an obfuscated password hash was bypassed if a user has only the "view" permission (new in Django 2.1), resulting in display of the entire password hash to those users. This may result in a vulnerability for sites with legacy user accounts using insecure hashes.
<p>Publish Date: Oct 2, 2018 6:29:00 PM
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-16984>CVE-2018-16984</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>4.9</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-16984">https://nvd.nist.gov/vuln/detail/CVE-2018-16984</a></p>
<p>Release Date: Oct 2, 2018 6:29:01 PM</p>
<p>Fix Resolution: 2.1.2</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
on a site with google analytics allows remote attackers to bypass an intended csrf protection mechanism by setting arbitrary cookies publish date oct pm url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date oct pm fix resolution rescue worker helmet automatic remediation is available for this issue cve vulnerable library django none any whl a high level python web framework that encourages rapid development and clean pragmatic design library home page a href path to dependency file path to vulnerable library requirements txt dependency hierarchy x django none any whl vulnerable library found in head commit a href found in base branch master vulnerability details in django x before x before and x before an improper neutralization of special elements in output used by a downstream component issue exists in django views defaults page not found leading to content spoofing in a error page if a user fails to recognize that a crafted url has malicious content publish date jan pm url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date jan pm fix resolution rescue worker helmet automatic remediation is available for this issue cve vulnerable library django none any whl a high level python web framework that encourages rapid development and clean pragmatic design library home page a href path to dependency file path to vulnerable library 
requirements txt dependency hierarchy x django none any whl vulnerable library found in head commit a href found in base branch master vulnerability details an issue was discovered in django before and before query parameters generated by the django admin foreignkeyrawidwidget were not properly url encoded leading to a possibility of an xss attack publish date jun pm url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date jun pm fix resolution rescue worker helmet automatic remediation is available for this issue cve vulnerable library django none any whl a high level python web framework that encourages rapid development and clean pragmatic design library home page a href path to dependency file path to vulnerable library requirements txt dependency hierarchy x django none any whl vulnerable library found in head commit a href found in base branch master vulnerability details django middleware common commonmiddleware in django x before and x before has an open redirect publish date aug pm url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date aug pm fix resolution rescue worker helmet automatic remediation is available for this issue cve vulnerable library django none any whl a high level python web framework that encourages rapid development and clean pragmatic design library home page a href path to dependency file path to vulnerable 
library requirements txt dependency hierarchy x django none any whl vulnerable library found in head commit a href found in base branch master vulnerability details an issue was discovered in django before and before in cases where a memcached backend does not perform key validation passing malformed cache keys could result in a key collision and potential data leakage publish date jun pm url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date jun pm fix resolution rescue worker helmet automatic remediation is available for this issue cve vulnerable library django none any whl a high level python web framework that encourages rapid development and clean pragmatic design library home page a href path to dependency file path to vulnerable library requirements txt dependency hierarchy x django none any whl vulnerable library found in head commit a href found in base branch master vulnerability details in django before before and before multipartparser allowed directory traversal via uploaded files with suitably crafted file names built in upload handlers were not affected by this vulnerability publish date apr pm url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date apr pm fix resolution django rescue worker helmet automatic remediation is available for this issue cve vulnerable library django none any whl a high level python web framework 
that encourages rapid development and clean pragmatic design library home page a href path to dependency file path to vulnerable library requirements txt dependency hierarchy x django none any whl vulnerable library found in head commit a href found in base branch master vulnerability details an issue was discovered in django before in which unprivileged users can read the password hashes of arbitrary accounts the read only password widget used by the django admin to display an obfuscated password hash was bypassed if a user has only the view permission new in django resulting in display of the entire password hash to those users this may result in a vulnerability for sites with legacy user accounts using insecure hashes publish date oct pm url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date oct pm fix resolution rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue
| 0
|
66,814
| 16,724,794,718
|
IssuesEvent
|
2021-06-10 11:43:37
|
surge-synthesizer/surge
|
https://api.github.com/repos/surge-synthesizer/surge
|
reopened
|
JUCE Cursor Hiding
|
Rebuild with JUCE UX
|
Subject says it all, really
but in this case, the CursorHiding API we have may be one we want to keep
|
1.0
|
JUCE Cursor Hiding - Subject says it all, really
but in this case, the CursorHiding API we have may be one we want to keep
|
non_process
|
juce cursor hiding subject says it all really but in this case the cursorhiding api we have may be one we want to keep
| 0
|
82,477
| 15,946,423,387
|
IssuesEvent
|
2021-04-15 01:02:04
|
scdoja/suum
|
https://api.github.com/repos/scdoja/suum
|
closed
|
Responsive Design
|
CODE: User-Interface
|
- It adjusts the app to the width and height of various phone sizes. (Look into vertical positioning property)
- Add responsive styling for all components & screens!
|
1.0
|
Responsive Design - - It adjusts the app to the width and height of various phone sizes. (Look into vertical positioning property)
- Add responsive styling for all components & screens!
|
non_process
|
responsive design it adjust the app to the width and height of various phone sizes look into vertical positioning property add responsive styling for all components screens
| 0
|
531,833
| 15,526,123,712
|
IssuesEvent
|
2021-03-13 00:11:03
|
googleapis/java-dlp
|
https://api.github.com/repos/googleapis/java-dlp
|
closed
|
dlp.snippets.InspectTests: testInspectDatastoreEntity failed
|
api: dlp priority: p2 type: bug
|
Note: #254 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.
----
commit: 042340a89b8666385c0990d53d640eb780ffab78
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/db061d86-024b-4036-8645-d55d8b226090), [Sponge](http://sponge2/db061d86-024b-4036-8645-d55d8b226090)
status: failed
<details><summary>Test output</summary><br><pre>expected to contain:
Job status: DONE
but was:
Job created: projects/java-docs-samples-testing/locations/global/dlpJobs/i-3133936587365543019
Job was not completed after 15 minutes.
at dlp.snippets.InspectTests.testInspectDatastoreEntity(InspectTests.java:332)
</pre></details>
|
1.0
|
dlp.snippets.InspectTests: testInspectDatastoreEntity failed - Note: #254 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.
----
commit: 042340a89b8666385c0990d53d640eb780ffab78
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/db061d86-024b-4036-8645-d55d8b226090), [Sponge](http://sponge2/db061d86-024b-4036-8645-d55d8b226090)
status: failed
<details><summary>Test output</summary><br><pre>expected to contain:
Job status: DONE
but was:
Job created: projects/java-docs-samples-testing/locations/global/dlpJobs/i-3133936587365543019
Job was not completed after 15 minutes.
at dlp.snippets.InspectTests.testInspectDatastoreEntity(InspectTests.java:332)
</pre></details>
|
non_process
|
dlp snippets inspecttests testinspectdatastoreentity failed note was also for this test but it was closed more than days ago so i didn t mark it flaky commit buildurl status failed test output expected to contain job status done but was job created projects java docs samples testing locations global dlpjobs i job was not completed after minutes at dlp snippets inspecttests testinspectdatastoreentity inspecttests java
| 0
|
350,109
| 10,478,461,879
|
IssuesEvent
|
2019-09-24 00:08:11
|
BCcampus/edehr
|
https://api.github.com/repos/BCcampus/edehr
|
closed
|
Nav panel links should include background
|
Effort - Low Epic - Navigation Priority - Medium
|
**Expected behaviour**
The entire rectangle that the link is in should be part of the link.
**Actual behaviour**
Only the text in the box is a link.
|
1.0
|
Nav panel links should include background - **Expected behaviour**
The entire rectangle that the link is in should be part of the link.
**Actual behaviour**
Only the text in the box is a link.
|
non_process
|
nav panel links should include background expected behaviour the entire rectangle that the link is in should be part of the link actual behaviour only the text in the box is a link
| 0
|
157,853
| 13,723,188,570
|
IssuesEvent
|
2020-10-03 07:59:57
|
101Loop/APIManager-Flutter
|
https://api.github.com/repos/101Loop/APIManager-Flutter
|
closed
|
create Pull Request template
|
documentation good first issue hacktoberfest
|
Whoever is reading this, plz note that I need a detailed PR template asking for:
- why this change is required
- is this tested, how?
- will this affect any functionality?
- is this backward compatible?
something like this and in detail... this is just an example... and plz ask to contribute first... I'll assign it to you, otherwise it might get assigned to someone else or maybe I'll do it myself, which will result in wasting your own time :slightly_smiling_face: So plz do LMK before doing it
Thanks
|
1.0
|
create Pull Request template - Whoever is reading this, plz note that I need a detailed PR template asking for:
- why this change is required
- is this tested, how?
- will this affect any functionality?
- is this backward compatible?
something like this and in detail... this is just an example... and plz ask to contribute first... I'll assign it to you, otherwise it might get assigned to someone else or maybe I'll do it myself, which will result in wasting your own time :slightly_smiling_face: So plz do LMK before doing it
Thanks
|
non_process
|
create pull request template whoever is reading this plz note that i need a detailed pr template asking for why this change is required is this tested how will this affect any functionality is this backward compatible something like this and in detail this is just an example and plz ask to contribute first i ll assign it to you otherwise it might get assigned to someone else or maybe i ll do it myself which will result in wasting your own time slightly smiling face so plz do lmk before doing it thanks
| 0
|
16,813
| 22,060,914,185
|
IssuesEvent
|
2022-05-30 17:41:27
|
bitPogo/kmock
|
https://api.github.com/repos/bitPogo/kmock
|
closed
|
Optimise the Processor
|
enhancement kmock-processor
|
## Description
<!--- Provide a detailed introduction to the issue itself, and why you consider it to be a bug -->
The processor currently has (too) many loops. They can be reduced by using an emitter like approach and other micro optimisations.
|
1.0
|
Optimise the Processor - ## Description
<!--- Provide a detailed introduction to the issue itself, and why you consider it to be a bug -->
The processor currently has (too) many loops. They can be reduced by using an emitter like approach and other micro optimisations.
|
process
|
optimise the processor description the processor currently has too many loops they can be reduced by using an emitter like approach and other micro optimisations
| 1
|
4,482
| 7,344,511,149
|
IssuesEvent
|
2018-03-07 14:53:22
|
UKHomeOffice/dq-aws-transition
|
https://api.github.com/repos/UKHomeOffice/dq-aws-transition
|
closed
|
Test End-to-End Job_55_SMM_ACL Wherescape Job in NotProd
|
DQ Data Pipeline Production SSM processing
|
Task Estimate: 3 hours
All tasks complete and expected files and data
- [x] End-to-End Job_55_SMM_ACL tested
- [x] Batch 1 data tested
- [x] Batch 2, 3 data tested
- [x] Batch 4 data tested
- [x] Job running in NotProd from Wherescape
|
1.0
|
Test End-to-End Job_55_SMM_ACL Wherescape Job in NotProd - Task Estimate: 3 hours
All tasks complete and expected files and data
- [x] End-to-End Job_55_SMM_ACL tested
- [x] Batch 1 data tested
- [x] Batch 2, 3 data tested
- [x] Batch 4 data tested
- [x] Job running in NotProd from Wherescape
|
process
|
test end to end job smm acl wherescape job in notprod task estimate hours all tasks complete and expected files and data end to end job smm acl tested batch data tested batch data tested batch data tested job running in notprod from wherescape
| 1
|
153,667
| 19,708,551,162
|
IssuesEvent
|
2022-01-13 01:40:14
|
artsking/linux-4.19.72_CVE-2020-14386
|
https://api.github.com/repos/artsking/linux-4.19.72_CVE-2020-14386
|
opened
|
CVE-2020-0427 (Medium) detected in linux-yoctov5.4.51
|
security vulnerability
|
## CVE-2020-0427 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/pinctrl/devicetree.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/pinctrl/devicetree.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In create_pinctrl of core.c, there is a possible out of bounds read due to a use after free. This could lead to local information disclosure with no additional execution privileges needed. User interaction is not needed for exploitation.Product: AndroidVersions: Android kernelAndroid ID: A-140550171
<p>Publish Date: 2020-09-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-0427>CVE-2020-0427</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2020-0427">https://www.linuxkernelcves.com/cves/CVE-2020-0427</a></p>
<p>Release Date: 2020-07-21</p>
<p>Fix Resolution: v4.14.161,v4.19.92,v5.4.7,v5.5-rc1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-0427 (Medium) detected in linux-yoctov5.4.51 - ## CVE-2020-0427 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/pinctrl/devicetree.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/pinctrl/devicetree.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In create_pinctrl of core.c, there is a possible out of bounds read due to a use after free. This could lead to local information disclosure with no additional execution privileges needed. User interaction is not needed for exploitation.Product: AndroidVersions: Android kernelAndroid ID: A-140550171
<p>Publish Date: 2020-09-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-0427>CVE-2020-0427</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2020-0427">https://www.linuxkernelcves.com/cves/CVE-2020-0427</a></p>
<p>Release Date: 2020-07-21</p>
<p>Fix Resolution: v4.14.161,v4.19.92,v5.4.7,v5.5-rc1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in linux cve medium severity vulnerability vulnerable library linux yocto linux embedded kernel library home page a href found in base branch master vulnerable source files drivers pinctrl devicetree c drivers pinctrl devicetree c vulnerability details in create pinctrl of core c there is a possible out of bounds read due to a use after free this could lead to local information disclosure with no additional execution privileges needed user interaction is not needed for exploitation product androidversions android kernelandroid id a publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
102,548
| 12,807,099,218
|
IssuesEvent
|
2020-07-03 10:45:00
|
mozilla/m-response
|
https://api.github.com/repos/mozilla/m-response
|
closed
|
Help doc images should have width restrictions
|
Design
|
When an image is placed in a help doc the width is not restricted to the size of the sidebar.
In the image below we see the image is presented without any scaling and so expands off the page.
<img width="522" alt="Screenshot 2019-08-20 at 09 51 49" src="https://user-images.githubusercontent.com/42435754/63333136-cd328f00-c330-11e9-9362-b45ba1560d4d.png">
Can we have images max width 100% so that they all fit within the sidebar?
|
1.0
|
Help doc images should have width restrictions - When an image is placed in a help doc the width is not restricted to the size of the sidebar.
In the image below we see the image is presented without any scaling and so expands off the page.
<img width="522" alt="Screenshot 2019-08-20 at 09 51 49" src="https://user-images.githubusercontent.com/42435754/63333136-cd328f00-c330-11e9-9362-b45ba1560d4d.png">
Can we have images max width 100% so that they all fit within the sidebar?
|
non_process
|
help doc images should have width restrictions when an image is placed in a help doc the width is not restricted to the size of the sidebar in the image below we see the image is presented without any scaling and so expands off the page img width alt screenshot at src can we have images max width so that they all fit within the sidebar
| 0
|
132,368
| 28,134,206,491
|
IssuesEvent
|
2023-04-01 07:07:03
|
hassanhabib/Standard.AI.OpenAI
|
https://api.github.com/repos/hassanhabib/Standard.AI.OpenAI
|
closed
|
CODE RUB: Remove empty statement
|
CODE RUB
|
Remove empty statement at `ImageGenerationService.Exceptions.cs`
|
1.0
|
CODE RUB: Remove empty statement - Remove empty statement at `ImageGenerationService.Exceptions.cs`
|
non_process
|
code rub remove empty statement remove empty statement at imagegenerationservice exceptions cs
| 0
|
258,090
| 27,563,853,919
|
IssuesEvent
|
2023-03-08 01:11:11
|
LynRodWS/alcor
|
https://api.github.com/repos/LynRodWS/alcor
|
opened
|
CVE-2023-25194 (Medium) detected in multiple libraries
|
security vulnerability
|
## CVE-2023-25194 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>kafka-clients-2.3.1.jar</b>, <b>kafka-clients-2.3.0.jar</b>, <b>kafka-clients-2.5.0.jar</b></p></summary>
<p>
<details><summary><b>kafka-clients-2.3.1.jar</b></p></summary>
<p></p>
<p>Library home page: <a href="https://kafka.apache.org">https://kafka.apache.org</a></p>
<p>
Dependency Hierarchy:
- common-0.1.0-SNAPSHOT.jar (Root Library)
- :x: **kafka-clients-2.3.1.jar** (Vulnerable Library)
</details>
<details><summary><b>kafka-clients-2.3.0.jar</b></p></summary>
<p></p>
<p>Library home page: <a href="https://kafka.apache.org">https://kafka.apache.org</a></p>
<p>Path to dependency file: /lib/pom.xml</p>
<p>Path to vulnerable library: /canner/.m2/repository/org/apache/kafka/kafka-clients/2.3.0/kafka-clients-2.3.0.jar</p>
<p>
Dependency Hierarchy:
- :x: **kafka-clients-2.3.0.jar** (Vulnerable Library)
</details>
<details><summary><b>kafka-clients-2.5.0.jar</b></p></summary>
<p></p>
<p>Library home page: <a href="https://kafka.apache.org">https://kafka.apache.org</a></p>
<p>
Dependency Hierarchy:
- common-0.1.0-SNAPSHOT.jar (Root Library)
- :x: **kafka-clients-2.5.0.jar** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A possible security vulnerability has been identified in Apache Kafka Connect. This requires access to a Kafka Connect worker, and the ability to create/modify connectors on it with an arbitrary Kafka client SASL JAAS config and a SASL-based security protocol, which has been possible on Kafka Connect clusters since Apache Kafka 2.3.0. When configuring the connector via the Kafka Connect REST API, an authenticated operator can set the `sasl.jaas.config` property for any of the connector's Kafka clients to "com.sun.security.auth.module.JndiLoginModule", which can be done via the `producer.override.sasl.jaas.config`, `consumer.override.sasl.jaas.config`, or `admin.override.sasl.jaas.config` properties. This will allow the server to connect to the attacker's LDAP server and deserialize the LDAP response, which the attacker can use to execute java deserialization gadget chains on the Kafka connect server. Attacker can cause unrestricted deserialization of untrusted data (or) RCE vulnerability when there are gadgets in the classpath. Since Apache Kafka 3.0.0, users are allowed to specify these properties in connector configurations for Kafka Connect clusters running with out-of-the-box configurations. Before Apache Kafka 3.0.0, users may not specify these properties unless the Kafka Connect cluster has been reconfigured with a connector client override policy that permits them. Since Apache Kafka 3.4.0, we have added a system property ("-Dorg.apache.kafka.disallowed.login.modules") to disable the problematic login modules usage in SASL JAAS configuration. Also by default "com.sun.security.auth.module.JndiLoginModule" is disabled in Apache Kafka 3.4.0. We advise the Kafka Connect users to validate connector configurations and only allow trusted JNDI configurations. Also examine connector dependencies for vulnerable versions and either upgrade their connectors, upgrading that specific dependency, or removing the connectors as options for remediation. 
Finally, in addition to leveraging the "org.apache.kafka.disallowed.login.modules" system property, Kafka Connect users can also implement their own connector client config override policy, which can be used to control which Kafka client properties can be overridden directly in a connector config and which cannot.
<p>Publish Date: 2023-02-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-25194>CVE-2023-25194</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://kafka.apache.org/cve-list">https://kafka.apache.org/cve-list</a></p>
<p>Release Date: 2023-02-07</p>
<p>Fix Resolution: 3.4.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
|
True
|
CVE-2023-25194 (Medium) detected in multiple libraries - ## CVE-2023-25194 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>kafka-clients-2.3.1.jar</b>, <b>kafka-clients-2.3.0.jar</b>, <b>kafka-clients-2.5.0.jar</b></p></summary>
<p>
<details><summary><b>kafka-clients-2.3.1.jar</b></p></summary>
<p></p>
<p>Library home page: <a href="https://kafka.apache.org">https://kafka.apache.org</a></p>
<p>
Dependency Hierarchy:
- common-0.1.0-SNAPSHOT.jar (Root Library)
- :x: **kafka-clients-2.3.1.jar** (Vulnerable Library)
</details>
<details><summary><b>kafka-clients-2.3.0.jar</b></p></summary>
<p></p>
<p>Library home page: <a href="https://kafka.apache.org">https://kafka.apache.org</a></p>
<p>Path to dependency file: /lib/pom.xml</p>
<p>Path to vulnerable library: /canner/.m2/repository/org/apache/kafka/kafka-clients/2.3.0/kafka-clients-2.3.0.jar</p>
<p>
Dependency Hierarchy:
- :x: **kafka-clients-2.3.0.jar** (Vulnerable Library)
</details>
<details><summary><b>kafka-clients-2.5.0.jar</b></summary>
<p></p>
<p>Library home page: <a href="https://kafka.apache.org">https://kafka.apache.org</a></p>
<p>
Dependency Hierarchy:
- common-0.1.0-SNAPSHOT.jar (Root Library)
- :x: **kafka-clients-2.5.0.jar** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A possible security vulnerability has been identified in Apache Kafka Connect. This requires access to a Kafka Connect worker, and the ability to create/modify connectors on it with an arbitrary Kafka client SASL JAAS config and a SASL-based security protocol, which has been possible on Kafka Connect clusters since Apache Kafka 2.3.0. When configuring the connector via the Kafka Connect REST API, an authenticated operator can set the `sasl.jaas.config` property for any of the connector's Kafka clients to "com.sun.security.auth.module.JndiLoginModule", which can be done via the `producer.override.sasl.jaas.config`, `consumer.override.sasl.jaas.config`, or `admin.override.sasl.jaas.config` properties. This will allow the server to connect to the attacker's LDAP server and deserialize the LDAP response, which the attacker can use to execute java deserialization gadget chains on the Kafka connect server. Attacker can cause unrestricted deserialization of untrusted data (or) RCE vulnerability when there are gadgets in the classpath. Since Apache Kafka 3.0.0, users are allowed to specify these properties in connector configurations for Kafka Connect clusters running with out-of-the-box configurations. Before Apache Kafka 3.0.0, users may not specify these properties unless the Kafka Connect cluster has been reconfigured with a connector client override policy that permits them. Since Apache Kafka 3.4.0, we have added a system property ("-Dorg.apache.kafka.disallowed.login.modules") to disable the problematic login modules usage in SASL JAAS configuration. Also by default "com.sun.security.auth.module.JndiLoginModule" is disabled in Apache Kafka 3.4.0. We advise the Kafka Connect users to validate connector configurations and only allow trusted JNDI configurations. Also examine connector dependencies for vulnerable versions and either upgrade their connectors, upgrading that specific dependency, or removing the connectors as options for remediation. 
Finally, in addition to leveraging the "org.apache.kafka.disallowed.login.modules" system property, Kafka Connect users can also implement their own connector client config override policy, which can be used to control which Kafka client properties can be overridden directly in a connector config and which cannot.
<p>Publish Date: 2023-02-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-25194>CVE-2023-25194</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://kafka.apache.org/cve-list">https://kafka.apache.org/cve-list</a></p>
<p>Release Date: 2023-02-07</p>
<p>Fix Resolution: 3.4.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
|
non_process
|
cve medium detected in multiple libraries cve medium severity vulnerability vulnerable libraries kafka clients jar kafka clients jar kafka clients jar kafka clients jar library home page a href dependency hierarchy common snapshot jar root library x kafka clients jar vulnerable library kafka clients jar library home page a href path to dependency file lib pom xml path to vulnerable library canner repository org apache kafka kafka clients kafka clients jar dependency hierarchy x kafka clients jar vulnerable library kafka clients jar library home page a href dependency hierarchy common snapshot jar root library x kafka clients jar vulnerable library found in base branch master vulnerability details a possible security vulnerability has been identified in apache kafka connect this requires access to a kafka connect worker and the ability to create modify connectors on it with an arbitrary kafka client sasl jaas config and a sasl based security protocol which has been possible on kafka connect clusters since apache kafka when configuring the connector via the kafka connect rest api an authenticated operator can set the sasl jaas config property for any of the connector s kafka clients to com sun security auth module jndiloginmodule which can be done via the producer override sasl jaas config consumer override sasl jaas config or admin override sasl jaas config properties this will allow the server to connect to the attacker s ldap server and deserialize the ldap response which the attacker can use to execute java deserialization gadget chains on the kafka connect server attacker can cause unrestricted deserialization of untrusted data or rce vulnerability when there are gadgets in the classpath since apache kafka users are allowed to specify these properties in connector configurations for kafka connect clusters running with out of the box configurations before apache kafka users may not specify these properties unless the kafka connect cluster has been reconfigured 
with a connector client override policy that permits them since apache kafka we have added a system property dorg apache kafka disallowed login modules to disable the problematic login modules usage in sasl jaas configuration also by default com sun security auth module jndiloginmodule is disabled in apache kafka we advise the kafka connect users to validate connector configurations and only allow trusted jndi configurations also examine connector dependencies for vulnerable versions and either upgrade their connectors upgrading that specific dependency or removing the connectors as options for remediation finally in addition to leveraging the org apache kafka disallowed login modules system property kafka connect users can also implement their own connector client config override policy which can be used to control which kafka client properties can be overridden directly in a connector config and which cannot publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution check this box to open an automated fix pr
| 0
|
134,574
| 18,471,935,410
|
IssuesEvent
|
2021-10-17 21:55:36
|
samq-ghdemo/JS-Demo
|
https://api.github.com/repos/samq-ghdemo/JS-Demo
|
opened
|
CVE-2018-16492 (High) detected in extend-3.0.0.tgz
|
security vulnerability
|
## CVE-2018-16492 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>extend-3.0.0.tgz</b></summary>
<p>Port of jQuery.extend for node.js and the browser</p>
<p>Library home page: <a href="https://registry.npmjs.org/extend/-/extend-3.0.0.tgz">https://registry.npmjs.org/extend/-/extend-3.0.0.tgz</a></p>
<p>Path to dependency file: JS-Demo/package.json</p>
<p>Path to vulnerable library: JS-Demo/node_modules/npm/node_modules/request/node_modules/extend/package.json</p>
<p>
Dependency Hierarchy:
- grunt-npm-install-0.3.1.tgz (Root Library)
- npm-3.10.10.tgz
- request-2.75.0.tgz
- :x: **extend-3.0.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/JS-Demo/commit/210025573ddd44a379ebb23baeb6e2648a69b3d3">210025573ddd44a379ebb23baeb6e2648a69b3d3</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A prototype pollution vulnerability was found in module extend <2.0.2, ~<3.0.2 that allows an attacker to inject arbitrary properties onto Object.prototype.
<p>Publish Date: 2019-02-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-16492>CVE-2018-16492</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://hackerone.com/reports/381185">https://hackerone.com/reports/381185</a></p>
<p>Release Date: 2019-02-01</p>
<p>Fix Resolution: extend - v3.0.2,v2.0.2</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"extend","packageVersion":"3.0.0","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-npm-install:0.3.1;npm:3.10.10;request:2.75.0;extend:3.0.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"extend - v3.0.2,v2.0.2"}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2018-16492","vulnerabilityDetails":"A prototype pollution vulnerability was found in module extend \u003c2.0.2, ~\u003c3.0.2 that allows an attacker to inject arbitrary properties onto Object.prototype.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-16492","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2018-16492 (High) detected in extend-3.0.0.tgz - ## CVE-2018-16492 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>extend-3.0.0.tgz</b></summary>
<p>Port of jQuery.extend for node.js and the browser</p>
<p>Library home page: <a href="https://registry.npmjs.org/extend/-/extend-3.0.0.tgz">https://registry.npmjs.org/extend/-/extend-3.0.0.tgz</a></p>
<p>Path to dependency file: JS-Demo/package.json</p>
<p>Path to vulnerable library: JS-Demo/node_modules/npm/node_modules/request/node_modules/extend/package.json</p>
<p>
Dependency Hierarchy:
- grunt-npm-install-0.3.1.tgz (Root Library)
- npm-3.10.10.tgz
- request-2.75.0.tgz
- :x: **extend-3.0.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/JS-Demo/commit/210025573ddd44a379ebb23baeb6e2648a69b3d3">210025573ddd44a379ebb23baeb6e2648a69b3d3</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A prototype pollution vulnerability was found in module extend <2.0.2, ~<3.0.2 that allows an attacker to inject arbitrary properties onto Object.prototype.
<p>Publish Date: 2019-02-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-16492>CVE-2018-16492</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://hackerone.com/reports/381185">https://hackerone.com/reports/381185</a></p>
<p>Release Date: 2019-02-01</p>
<p>Fix Resolution: extend - v3.0.2,v2.0.2</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"extend","packageVersion":"3.0.0","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-npm-install:0.3.1;npm:3.10.10;request:2.75.0;extend:3.0.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"extend - v3.0.2,v2.0.2"}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2018-16492","vulnerabilityDetails":"A prototype pollution vulnerability was found in module extend \u003c2.0.2, ~\u003c3.0.2 that allows an attacker to inject arbitrary properties onto Object.prototype.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-16492","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in extend tgz cve high severity vulnerability vulnerable library extend tgz port of jquery extend for node js and the browser library home page a href path to dependency file js demo package json path to vulnerable library js demo node modules npm node modules request node modules extend package json dependency hierarchy grunt npm install tgz root library npm tgz request tgz x extend tgz vulnerable library found in head commit a href found in base branch main vulnerability details a prototype pollution vulnerability was found in module extend that allows an attacker to inject arbitrary properties onto object prototype publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution extend isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree grunt npm install npm request extend isminimumfixversionavailable true minimumfixversion extend basebranches vulnerabilityidentifier cve vulnerabilitydetails a prototype pollution vulnerability was found in module extend that allows an attacker to inject arbitrary properties onto object prototype vulnerabilityurl
| 0
|
4,453
| 7,320,170,195
|
IssuesEvent
|
2018-03-02 05:31:13
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Add command to enable widgets
|
bug cxp in-process machine-learning triaged
|
I found that among the prerequisites it is needed the following command
jupyter nbextension enable --py widgetsnbextension
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: c66347d2-1537-5a87-2cf4-5e6d9a0e3ad4
* Version Independent ID: 711fdff8-894e-b451-a50b-6069edf5d2c5
* [Content](https://docs.microsoft.com/en-us/azure/machine-learning/preview/scenario-image-classification-using-cntk)
* [Content Source](https://github.com/Microsoft/azure-docs/blob/master/articles/machine-learning/preview/scenario-image-classification-using-cntk.md)
* Service: machine-learning
|
1.0
|
Add command to enable widgets - I found that among the prerequisites it is needed the following command
jupyter nbextension enable --py widgetsnbextension
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: c66347d2-1537-5a87-2cf4-5e6d9a0e3ad4
* Version Independent ID: 711fdff8-894e-b451-a50b-6069edf5d2c5
* [Content](https://docs.microsoft.com/en-us/azure/machine-learning/preview/scenario-image-classification-using-cntk)
* [Content Source](https://github.com/Microsoft/azure-docs/blob/master/articles/machine-learning/preview/scenario-image-classification-using-cntk.md)
* Service: machine-learning
|
process
|
add command to enable widgets i found that among the prerequisites it is needed the following command jupyter nbextension enable py widgetsnbextension document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id service machine learning
| 1
|
15,710
| 19,848,725,867
|
IssuesEvent
|
2022-01-21 09:51:47
|
ooi-data/RS03AXPS-PC03A-06-VADCPA301-streamed-vadcp_pd0_beam_parsed
|
https://api.github.com/repos/ooi-data/RS03AXPS-PC03A-06-VADCPA301-streamed-vadcp_pd0_beam_parsed
|
opened
|
🛑 Processing failed: KeyError
|
process
|
## Overview
`KeyError` found in `processing_task` task during run ended on 2022-01-21T09:51:46.446459.
## Details
Flow name: `RS03AXPS-PC03A-06-VADCPA301-streamed-vadcp_pd0_beam_parsed`
Task name: `processing_task`
Error type: `KeyError`
Error message: 'vadcp_beam_error_dim_0'
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 1395, in _construct_dataarray
variable = self._variables[name]
KeyError: 'vadcp_beam_error_dim_0'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing
final_path = finalize_data_stream(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream
append_to_zarr(mod_ds, final_store, enc, logger=logger)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 344, in append_to_zarr
mod_ds = _prepare_ds_to_append(store, mod_ds)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 133, in _prepare_ds_to_append
existing_shape = tuple(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 134, in <genexpr>
ds_to_append[dim].shape[0] for dim, size in new_var.sizes.items()
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 1499, in __getitem__
return self._construct_dataarray(key)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 1397, in _construct_dataarray
_, name, variable = _get_virtual_variable(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 170, in _get_virtual_variable
ref_var = variables[ref_name]
KeyError: 'vadcp_beam_error_dim_0'
```
</details>
|
1.0
|
🛑 Processing failed: KeyError - ## Overview
`KeyError` found in `processing_task` task during run ended on 2022-01-21T09:51:46.446459.
## Details
Flow name: `RS03AXPS-PC03A-06-VADCPA301-streamed-vadcp_pd0_beam_parsed`
Task name: `processing_task`
Error type: `KeyError`
Error message: 'vadcp_beam_error_dim_0'
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 1395, in _construct_dataarray
variable = self._variables[name]
KeyError: 'vadcp_beam_error_dim_0'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing
final_path = finalize_data_stream(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream
append_to_zarr(mod_ds, final_store, enc, logger=logger)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 344, in append_to_zarr
mod_ds = _prepare_ds_to_append(store, mod_ds)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 133, in _prepare_ds_to_append
existing_shape = tuple(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 134, in <genexpr>
ds_to_append[dim].shape[0] for dim, size in new_var.sizes.items()
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 1499, in __getitem__
return self._construct_dataarray(key)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 1397, in _construct_dataarray
_, name, variable = _get_virtual_variable(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/dataset.py", line 170, in _get_virtual_variable
ref_var = variables[ref_name]
KeyError: 'vadcp_beam_error_dim_0'
```
</details>
|
process
|
🛑 processing failed keyerror overview keyerror found in processing task task during run ended on details flow name streamed vadcp beam parsed task name processing task error type keyerror error message vadcp beam error dim traceback traceback most recent call last file srv conda envs notebook lib site packages xarray core dataset py line in construct dataarray variable self variables keyerror vadcp beam error dim during handling of the above exception another exception occurred traceback most recent call last file srv conda envs notebook lib site packages ooi harvester processor pipeline py line in processing final path finalize data stream file srv conda envs notebook lib site packages ooi harvester processor init py line in finalize data stream append to zarr mod ds final store enc logger logger file srv conda envs notebook lib site packages ooi harvester processor init py line in append to zarr mod ds prepare ds to append store mod ds file srv conda envs notebook lib site packages ooi harvester processor utils py line in prepare ds to append existing shape tuple file srv conda envs notebook lib site packages ooi harvester processor utils py line in ds to append shape for dim size in new var sizes items file srv conda envs notebook lib site packages xarray core dataset py line in getitem return self construct dataarray key file srv conda envs notebook lib site packages xarray core dataset py line in construct dataarray name variable get virtual variable file srv conda envs notebook lib site packages xarray core dataset py line in get virtual variable ref var variables keyerror vadcp beam error dim
| 1
|
716
| 3,206,290,791
|
IssuesEvent
|
2015-10-04 21:40:34
|
pwittchen/NetworkEvents
|
https://api.github.com/repos/pwittchen/NetworkEvents
|
closed
|
Release 2.1.2
|
release process
|
**Initial release notes**:
- bumped target SDK version to 23
- bumped buildToolsVersion to 23.0.1
- removed `CHANGE_NETWORK_STATE` and `INTERNET` permissions from `AndroidManifest.xml`, because they're no longer required
**Things to do**:
- [x] bump library version to 2.1.2
- [x] upload Archives to Maven Central Repository
- [x] bump library version in `README.md` after Maven Sync
- [x] update `CHANGELOG.md` after Maven Sync
- [x] create new GitHub release
|
1.0
|
Release 2.1.2 - **Initial release notes**:
- bumped target SDK version to 23
- bumped buildToolsVersion to 23.0.1
- removed `CHANGE_NETWORK_STATE` and `INTERNET` permissions from `AndroidManifest.xml`, because they're no longer required
**Things to do**:
- [x] bump library version to 2.1.2
- [x] upload Archives to Maven Central Repository
- [x] bump library version in `README.md` after Maven Sync
- [x] update `CHANGELOG.md` after Maven Sync
- [x] create new GitHub release
|
process
|
release initial release notes bumped target sdk version to bumped buildtoolsversion to removed change network state and internet permissions from androidmanifest xml because they re no longer required things to do bump library version to upload archives to maven central repository bump library version in readme md after maven sync update changelog md after maven sync create new github release
| 1
|
12,345
| 14,883,834,771
|
IssuesEvent
|
2021-01-20 13:51:38
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
Customer logo should be displayed in About and Terms of Service pages
|
Bug P2 Participant manager Process: Dev Process: Fixed Process: Tested QA Process: Tested dev
|
AR : Logo is not displayed in 'About' and 'Terms of Service' pages
ER : Customer logo should be displayed in 'About' and 'Terms of Service' pages

|
4.0
|
Customer logo should be displayed in About and Terms of Service pages - AR : Logo is not displayed in 'About' and 'Terms of Service' pages
ER : Customer logo should be displayed in 'About' and 'Terms of Service' pages

|
process
|
customer logo should be displayed in about and terms of service pages ar logo is not displayed in about and terms of service pages er customer logo should be displayed in about and terms of service pages
| 1
|
94,529
| 11,882,571,480
|
IssuesEvent
|
2020-03-27 14:36:44
|
InterprocomOfficial/OpenEAM_DC
|
https://api.github.com/repos/InterprocomOfficial/OpenEAM_DC
|
closed
|
Проектирование и разработка универсального механизма изменения информационной модели БД.
|
designing
|
Для обеспечения возможности дальнейшего развития ПП необходим механизм, позволяющий в рамках обновления ПП на основании входящего структурированного файла с описанием изменений, расширять и изменять заложенную информационную модель.
|
1.0
|
Проектирование и разработка универсального механизма изменения информационной модели БД. - Для обеспечения возможности дальнейшего развития ПП необходим механизм, позволяющий в рамках обновления ПП на основании входящего структурированного файла с описанием изменений, расширять и изменять заложенную информационную модель.
|
non_process
|
проектирование и разработка универсального механизма изменения информационной модели бд для обеспечения возможности дальнейшего развития пп необходим механизм позволяющий в рамках обновления пп на основании входящего структурированного файла с описанием изменений расширять и изменять заложенную информационную модель
| 0
|
54,280
| 29,999,897,444
|
IssuesEvent
|
2023-06-26 08:37:19
|
redhat-developer/intellij-quarkus
|
https://api.github.com/repos/redhat-developer/intellij-quarkus
|
opened
|
Clicking Next button is slow in Quarkus wizard
|
quarkus wizard performance
|
Following changes on the Quarkus wizard in #969, @angelozerr mentioned:
> I have noticed that click on Next button is slow. I think it is because it collect Quarkus extension, but I think we should manage a progress monitor or something like this which explains that Quarkus extensions are collected. At first I though it didn't work and click on Next button several times.
2 remote calls are performed on the UI thread :
- QuarkusModuleInfoStep loads the Quarkus versions
- QuarkusExtensionsStep loads the Quarkus extensions
|
True
|
Clicking Next button is slow in Quarkus wizard - Following changes on the Quarkus wizard in #969, @angelozerr mentioned:
> I have noticed that click on Next button is slow. I think it is because it collect Quarkus extension, but I think we should manage a progress monitor or something like this which explains that Quarkus extensions are collected. At first I though it didn't work and click on Next button several times.
2 remote calls are performed on the UI thread :
- QuarkusModuleInfoStep loads the Quarkus versions
- QuarkusExtensionsStep loads the Quarkus extensions
|
non_process
|
clicking next button is slow in quarkus wizard following changes on the quarkus wizard in angelozerr mentioned i have noticed that click on next button is slow i think it is because it collect quarkus extension but i think we should manage a progress monitor or something like this which explains that quarkus extensions are collected at first i though it didn t work and click on next button several times remote calls are performed on the ui thread quarkusmoduleinfostep loads the quarkus versions quarkusextensionsstep loads the quarkus extensions
| 0
|
767,952
| 26,948,310,585
|
IssuesEvent
|
2023-02-08 09:48:57
|
LikeLion-VJS10/TAING10
|
https://api.github.com/repos/LikeLion-VJS10/TAING10
|
closed
|
🔥README.md 파일(팀원소개, 플젝 소개 등)
|
📃 Docs High Priority
|
- README.md 파일은 꼭 작성해주세요. 팀원 소개와 프로젝트의 소개 등 자유롭게 작성해주시면 됩니다.
- 좋은 예제나 템플릿을 갖고 있다면 공유부탁.
|
1.0
|
🔥README.md 파일(팀원소개, 플젝 소개 등) - - README.md 파일은 꼭 작성해주세요. 팀원 소개와 프로젝트의 소개 등 자유롭게 작성해주시면 됩니다.
- 좋은 예제나 템플릿을 갖고 있다면 공유부탁.
|
non_process
|
🔥readme md 파일 팀원소개 플젝 소개 등 readme md 파일은 꼭 작성해주세요 팀원 소개와 프로젝트의 소개 등 자유롭게 작성해주시면 됩니다 좋은 예제나 템플릿을 갖고 있다면 공유부탁
| 0
|
20,436
| 27,099,251,534
|
IssuesEvent
|
2023-02-15 07:07:39
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
Cleanup legacy code paths for old PY2 default
|
P4 type: process team-Rules-Python stale
|
Since #6647 has been closed as of 0.25, we should remove the flags `--incompatible_py3_is_default` and `--incompatible_py2_outputs_are_suffixed`.
Blocked on #7903, and the fact that these flags are still in use internally to Google.
|
1.0
|
Cleanup legacy code paths for old PY2 default - Since #6647 has been closed as of 0.25, we should remove the flags `--incompatible_py3_is_default` and `--incompatible_py2_outputs_are_suffixed`.
Blocked on #7903, and the fact that these flags are still in use internally to Google.
|
process
|
cleanup legacy code paths for old default since has been closed as of we should remove the flags incompatible is default and incompatible outputs are suffixed blocked on and the fact that these flags are still in use internally to google
| 1
|
55,320
| 14,369,868,774
|
IssuesEvent
|
2020-12-01 10:19:36
|
SAP/fundamental-ngx
|
https://api.github.com/repos/SAP/fundamental-ngx
|
opened
|
Bug: (docs) Tabs example – Programmatic Selection works incorrect
|
Defect Hunting bug core
|
#### Is this a bug, enhancement, or feature request?
bug
#### Briefly describe your proposal.
When the user click "Select Tab 2" and then click "Select Tab 1", tabs do not switching to "tab 1":

#### Which versions of Angular and Fundamental Library for Angular are affected? (If this is a feature request, use current version.)
fundamental-ngx: v 0.25.0
#### If this is a bug, please provide steps for reproducing it.
#### Please provide relevant source code if applicable.
#### Is there anything else we should know?
|
1.0
|
Bug: (docs) Tabs example – Programmatic Selection works incorrect - #### Is this a bug, enhancement, or feature request?
bug
#### Briefly describe your proposal.
When the user click "Select Tab 2" and then click "Select Tab 1", tabs do not switching to "tab 1":

#### Which versions of Angular and Fundamental Library for Angular are affected? (If this is a feature request, use current version.)
fundamental-ngx: v 0.25.0
#### If this is a bug, please provide steps for reproducing it.
#### Please provide relevant source code if applicable.
#### Is there anything else we should know?
|
non_process
|
bug docs tabs example – programmatic selection works incorrect is this a bug enhancement or feature request bug briefly describe your proposal when the user click select tab and then click select tab tabs do not switching to tab which versions of angular and fundamental library for angular are affected if this is a feature request use current version fundamental ngx v if this is a bug please provide steps for reproducing it please provide relevant source code if applicable is there anything else we should know
| 0
|
158,923
| 20,035,847,516
|
IssuesEvent
|
2022-02-02 11:48:10
|
kapseliboi/bui
|
https://api.github.com/repos/kapseliboi/bui
|
opened
|
WS-2018-0122 (High) detected in superstatic-4.0.1.tgz
|
security vulnerability
|
## WS-2018-0122 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>superstatic-4.0.1.tgz</b></p></summary>
<p>A static file server for fancy apps</p>
<p>Library home page: <a href="https://registry.npmjs.org/superstatic/-/superstatic-4.0.1.tgz">https://registry.npmjs.org/superstatic/-/superstatic-4.0.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/superstatic/package.json</p>
<p>
Dependency Hierarchy:
- docpress-0.7.4.tgz (Root Library)
- metalsmith-start-2.0.1.tgz
- :x: **superstatic-4.0.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/kapseliboi/bui/commit/26ea1662970a2fd4bc1c6a2c8e21dca1e039dbdf">26ea1662970a2fd4bc1c6a2c8e21dca1e039dbdf</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Path Traversal (Windows only).
superstatic verifies that current dir is not evaded by checking the presense of ../ in the decoded path, but on Windows, ..\ works.
<p>Publish Date: 2018-05-09
<p>URL: <a href=https://hackerone.com/reports/319951>WS-2018-0122</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: High
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://hackerone.com/reports/319951">https://hackerone.com/reports/319951</a></p>
<p>Release Date: 2018-05-09</p>
<p>Fix Resolution: 5.0.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2018-0122 (High) detected in superstatic-4.0.1.tgz - ## WS-2018-0122 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>superstatic-4.0.1.tgz</b></p></summary>
<p>A static file server for fancy apps</p>
<p>Library home page: <a href="https://registry.npmjs.org/superstatic/-/superstatic-4.0.1.tgz">https://registry.npmjs.org/superstatic/-/superstatic-4.0.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/superstatic/package.json</p>
<p>
Dependency Hierarchy:
- docpress-0.7.4.tgz (Root Library)
- metalsmith-start-2.0.1.tgz
- :x: **superstatic-4.0.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/kapseliboi/bui/commit/26ea1662970a2fd4bc1c6a2c8e21dca1e039dbdf">26ea1662970a2fd4bc1c6a2c8e21dca1e039dbdf</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Path Traversal (Windows only).
superstatic verifies that current dir is not evaded by checking the presense of ../ in the decoded path, but on Windows, ..\ works.
<p>Publish Date: 2018-05-09
<p>URL: <a href=https://hackerone.com/reports/319951>WS-2018-0122</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: High
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://hackerone.com/reports/319951">https://hackerone.com/reports/319951</a></p>
<p>Release Date: 2018-05-09</p>
<p>Fix Resolution: 5.0.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws high detected in superstatic tgz ws high severity vulnerability vulnerable library superstatic tgz a static file server for fancy apps library home page a href path to dependency file package json path to vulnerable library node modules superstatic package json dependency hierarchy docpress tgz root library metalsmith start tgz x superstatic tgz vulnerable library found in head commit a href found in base branch master vulnerability details path traversal windows only superstatic verifies that current dir is not evaded by checking the presense of in the decoded path but on windows works publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact high availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
62,326
| 8,597,114,704
|
IssuesEvent
|
2018-11-15 17:41:55
|
agda/agda
|
https://api.github.com/repos/agda/agda
|
closed
|
A nice way to hide instances?
|
documentation instance status: working-as-intended
|
Suppose I'm trying to create a module to put some module-local stuff inside so Agda won't get confused when I create things outside the scope of the module.
However, Agda's instance resolution still get confused by module-local instances:
A very small reproducible example:
```agda
module InstanceResolution where
-- Type is an alias to Set https://ice1000.org/lagda/lib.Base.html#1144
-- List is just https://ice1000.org/lagda/lib.types.List.html#197
record Monoid {i} (A : Type i) : Type i where
inductive
field
mempty : A
_<>_ : A -> A -> A
open Monoid {{...}} public
module IWantToHideSomethingInsideThisModule where
instance
{-# TERMINATING #-}
ThisIsTheInstanceIWantToHide : Monoid (List A)
ThisIsTheInstanceIWantToHide = record
{ mempty = nil
; _<>_ = λ where
nil b -> b
(a :: as) b -> a :: (as <> b)
}
module OkLetUsSeeWhatHappensInThisModule where
instance
{-# TERMINATING #-}
ListMonoid : Monoid (List A)
Monoid.mempty ListMonoid = nil
Monoid._<>_ ListMonoid nil b = b
Monoid._<>_ ListMonoid (a :: as) b = a :: (as <> b)
_ : List Nat
_ = mempty <> mempty -- complains that there's two instances, while I didn't find a way to hide the first one
```
Temporary solution: hide the first declaration inside a `where` statement of an arbitrary declaration
|
1.0
|
A nice way to hide instances? - Suppose I'm trying to create a module to put some module-local stuff inside so Agda won't get confused when I create things outside the scope of the module.
However, Agda's instance resolution still get confused by module-local instances:
A very small reproducible example:
```agda
module InstanceResolution where
-- Type is an alias to Set https://ice1000.org/lagda/lib.Base.html#1144
-- List is just https://ice1000.org/lagda/lib.types.List.html#197
record Monoid {i} (A : Type i) : Type i where
inductive
field
mempty : A
_<>_ : A -> A -> A
open Monoid {{...}} public
module IWantToHideSomethingInsideThisModule where
instance
{-# TERMINATING #-}
ThisIsTheInstanceIWantToHide : Monoid (List A)
ThisIsTheInstanceIWantToHide = record
{ mempty = nil
; _<>_ = λ where
nil b -> b
(a :: as) b -> a :: (as <> b)
}
module OkLetUsSeeWhatHappensInThisModule where
instance
{-# TERMINATING #-}
ListMonoid : Monoid (List A)
Monoid.mempty ListMonoid = nil
Monoid._<>_ ListMonoid nil b = b
Monoid._<>_ ListMonoid (a :: as) b = a :: (as <> b)
_ : List Nat
_ = mempty <> mempty -- complains that there's two instances, while I didn't find a way to hide the first one
```
Temporary solution: hide the first declaration inside a `where` statement of an arbitrary declaration
|
non_process
|
a nice way to hide instances suppose i m trying to create a module to put some module local stuff inside so agda won t get confused when i create things outside the scope of the module however agda s instance resolution still get confused by module local instances a very small reproducible example agda module instanceresolution where type is an alias to set list is just record monoid i a type i type i where inductive field mempty a a a a open monoid public module iwanttohidesomethinginsidethismodule where instance terminating thisistheinstanceiwanttohide monoid list a thisistheinstanceiwanttohide record mempty nil λ where nil b b a as b a as b module okletusseewhathappensinthismodule where instance terminating listmonoid monoid list a monoid mempty listmonoid nil monoid listmonoid nil b b monoid listmonoid a as b a as b list nat mempty mempty complains that there s two instances while i didn t find a way to hide the first one temporary solution hide the first declaration inside a where statement of an arbitrary declaration
| 0
|
823,251
| 30,962,059,164
|
IssuesEvent
|
2023-08-08 05:20:58
|
hackforla/tdm-calculator
|
https://api.github.com/repos/hackforla/tdm-calculator
|
closed
|
Change the Download Summary button to Print Summary on Page 5
|
enhancement role: front-end level: medium priority: MUST HAVE p-Feature - Final Summary Page pg 5 p-Feature: print/download
|
### Overview
We need to change the `DOWNLOAD SUMMARY` button on page 5 to `PRINT SUMMARY` per stakeholders' request on 2023-06-13
### Action Items
- [ ] Change the `DOWNLOAD SUMMARY` button to `PRINT SUMMARY`
### Resources/Instructions

- Please reference #1314
|
1.0
|
Change the Download Summary button to Print Summary on Page 5 - ### Overview
We need to change the `DOWNLOAD SUMMARY` button on page 5 to `PRINT SUMMARY` per stakeholders' request on 2023-06-13
### Action Items
- [ ] Change the `DOWNLOAD SUMMARY` button to `PRINT SUMMARY`
### Resources/Instructions

- Please reference #1314
|
non_process
|
change the download summary button to print summary on page overview we need to change the download summary button on page to print summary per stakeholders request on action items change the download summary button to print summary resources instructions please reference
| 0
|
18,631
| 24,580,358,978
|
IssuesEvent
|
2022-10-13 15:10:57
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Consent API] State is getting displayed as 'Active' in the following scenarios
|
Bug P0 Process: Fixed Process: Tested QA Process: Tested dev
|
State should be changed in primary consent when withdrawn from the study
**AR:** State is getting displayed as 'Active'
**ER:** State should be displayed as 'Revoked'
**Note:** Issue needs to be fixed even when participants deletes their app account
|
3.0
|
[Consent API] State is getting displayed as 'Active' in the following scenarios - State should be changed in primary consent when withdrawn from the study
**AR:** State is getting displayed as 'Active'
**ER:** State should be displayed as 'Revoked'
**Note:** Issue needs to be fixed even when participants deletes their app account
|
process
|
state is getting displayed as active in the following scenarios state should be changed in primary consent when withdrawn from the study ar state is getting displayed as active er state should be displayed as revoked note issue needs to be fixed even when participants deletes their app account
| 1
|
17,115
| 22,635,143,235
|
IssuesEvent
|
2022-06-30 18:10:42
|
googleapis/synthtool
|
https://api.github.com/repos/googleapis/synthtool
|
closed
|
Multiple broken links in README file.
|
type: docs type: process
|
Found broken links in `README.md` file at `line 13` and `line 21`.
|
1.0
|
Multiple broken links in README file. - Found broken links in `README.md` file at `line 13` and `line 21`.
|
process
|
multiple broken links in readme file found broken links in readme md file at line and line
| 1
|
8,065
| 4,154,623,750
|
IssuesEvent
|
2016-06-16 12:23:46
|
pydata/pandas
|
https://api.github.com/repos/pydata/pandas
|
closed
|
installlation error
|
Build
|
traceback(most recet call last):
file "setup,py" , line 33 , in <module>
ver =Cython. __version__
atrribute error: 'module' object has no attribute ' __version__ '
|
1.0
|
installlation error - traceback(most recet call last):
file "setup,py" , line 33 , in <module>
ver =Cython. __version__
atrribute error: 'module' object has no attribute ' __version__ '
|
non_process
|
installlation error traceback most recet call last file setup py line in ver cython version atrribute error module object has no attribute version
| 0
|
1,024
| 3,481,655,087
|
IssuesEvent
|
2015-12-29 17:34:11
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
child_process.execFile / libuv spawn does not use self-binary
|
child_process
|
The child_process.execFile API (which finally leads to a new ChildProcess() and spawn call via the internal wrapper) spawns a process that uses the native /usr/bin/env node binary and NOT the currently used one.
All versions (including 5.1.1) are affected. Problem is visible among all Debian based servers and systems. Probably all GNU/unix systems are affected as shebang leads to the specific problem.
**STEPS TO REPRODUCE**
- Download new nodejs binary to ~/mytest/node-new
- Download old nodejs binary to ~/mytest/node-old (every version applies and 5.1.1 is affected, too - so just use a different version number for the sake of comparison)
**REDUCED TEST CASE**
The `~/mytest/test.js`:
```javascript
console.log(process.version);
var cp = require('child_process');
cp.execFile(__dirname + '/child.js', [
'foo',
'bar'
], {
cwd: __dirname
}, function(error, stdout, stderr) {
console.log('execution finished', stdout);
});
```
The `~/mytest/child.js` with the initial shebang that leads to the problem:
```javascript
#!/usr/bin/env node
console.log(process.version);
```
Now execute the following command in the Terminal / bash:
```bash
cd ~/mytest;
sudo cp ./node-old /usr/bin/node; # Yes, we WANT this for reproduction of issue
sudo chmod 0777 /usr/bin/node; # Just to be sure it's executable
chmod +x ./node-new;
chmod +x ./test.js;
chmod +x ./child.js;
./node-new ./test.js;
```
**TEST OUTPUT**
```bash
./node-new app.js;
v5.1.1
execution finished v4.2.4
```
Is this the wanted behaviour?
It is nowhere documented and child_process as an API name suggests that forks of the self-binary are made and no new processes are spawned when JS files are executed.
|
1.0
|
child_process.execFile / libuv spawn does not use self-binary - The child_process.execFile API (which finally leads to a new ChildProcess() and spawn call via the internal wrapper) spawns a process that uses the native /usr/bin/env node binary and NOT the currently used one.
All versions (including 5.1.1) are affected. Problem is visible among all Debian based servers and systems. Probably all GNU/unix systems are affected as shebang leads to the specific problem.
**STEPS TO REPRODUCE**
- Download new nodejs binary to ~/mytest/node-new
- Download old nodejs binary to ~/mytest/node-old (every version applies and 5.1.1 is affected, too - so just use a different version number for the sake of comparison)
**REDUCED TEST CASE**
The `~/mytest/test.js`:
```javascript
console.log(process.version);
var cp = require('child_process');
cp.execFile(__dirname + '/child.js', [
'foo',
'bar'
], {
cwd: __dirname
}, function(error, stdout, stderr) {
console.log('execution finished', stdout);
});
```
The `~/mytest/child.js` with the initial shebang that leads to the problem:
```javascript
#!/usr/bin/env node
console.log(process.version);
```
Now execute the following command in the Terminal / bash:
```bash
cd ~/mytest;
sudo cp ./node-old /usr/bin/node; # Yes, we WANT this for reproduction of issue
sudo chmod 0777 /usr/bin/node; # Just to be sure it's executable
chmod +x ./node-new;
chmod +x ./test.js;
chmod +x ./child.js;
./node-new ./test.js;
```
**TEST OUTPUT**
```bash
./node-new app.js;
v5.1.1
execution finished v4.2.4
```
Is this the wanted behaviour?
It is nowhere documented and child_process as an API name suggests that forks of the self-binary are made and no new processes are spawned when JS files are executed.
|
process
|
child process execfile libuv spawn does not use self binary the child process execfile api which finally leads to a new childprocess and spawn call via the internal wrapper spawns a process that uses the native usr bin env node binary and not the currently used one all versions including are affected problem is visible among all debian based servers and systems probably all gnu unix systems are affected as shebang leads to the specific problem steps to reproduce download new nodejs binary to mytest node new download old nodejs binary to mytest node old every version applies and is affected too so just use a different version number for the sake of comparison reduced test case the mytest test js javascript console log process version var cp require child process cp execfile dirname child js foo bar cwd dirname function error stdout stderr console log execution finished stdout the mytest child js with the initial shebang that leads to the problem javascript usr bin env node console log process version now execute the following command in the terminal bash bash cd mytest sudo cp node old usr bin node yes we want this for reproduction of issue sudo chmod usr bin node just to be sure it s executable chmod x node new chmod x test js chmod x child js node new test js test output bash node new app js execution finished is this the wanted behaviour it is nowhere documented and child process as an api name suggests that forks of the self binary are made and no new processes are spawned when js files are executed
| 1
|
15,901
| 11,758,452,346
|
IssuesEvent
|
2020-03-13 15:26:19
|
twosixlabs/armory
|
https://api.github.com/repos/twosixlabs/armory
|
opened
|
Add utility to download saved weights for baseline models
|
baseline models infrastructure
|
Baseline model weights need to be downloaded for users. This will likely require specific directory structure for keras/pytorch/tf/tf2 weights.
|
1.0
|
Add utility to download saved weights for baseline models - Baseline model weights need to be downloaded for users. This will likely require specific directory structure for keras/pytorch/tf/tf2 weights.
|
non_process
|
add utility to download saved weights for baseline models baseline model weights need to be downloaded for users this will likely require specific directory structure for keras pytorch tf weights
| 0
|
15,551
| 27,398,100,014
|
IssuesEvent
|
2023-02-28 21:29:18
|
NASA-PDS/pds4-information-model
|
https://api.github.com/repos/NASA-PDS/pds4-information-model
|
closed
|
CCB-362: Add a permissible value of nm/mm to Units_of_Misc
|
requirement pending-scr p.must-have needs:triage
|
https://pds-jira.jpl.nasa.gov/browse/CCB-362
CCB-362 Add a permissible value of nm/mm to Units_of_Misc
The request is to add the permissible value “nm/mm” and its value meaning to the unit of measure <Units_of_Misc>.
This change is needed to provide a unit of measure for slope parameters of a linear variable filter.
|
1.0
|
CCB-362: Add a permissible value of nm/mm to Units_of_Misc - https://pds-jira.jpl.nasa.gov/browse/CCB-362
CCB-362 Add a permissible value of nm/mm to Units_of_Misc
The request is to add the permissible value “nm/mm” and its value meaning to the unit of measure <Units_of_Misc>.
This change is needed to provide a unit of measure for slope parameters of a linear variable filter.
|
non_process
|
ccb add a permissible value of nm mm to units of misc ccb add a permissible value of nm mm to units of misc the request is to add the permissible value “nm mm” and its value meaning to the unit of measure this change is needed to provide a unit of measure for slope parameters of a linear variable filter
| 0
|