Unnamed: 0 int64 0 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 5 112 | repo_url stringlengths 34 141 | action stringclasses 3 values | title stringlengths 1 844 | labels stringlengths 4 721 | body stringlengths 1 261k | index stringclasses 12 values | text_combine stringlengths 96 261k | label stringclasses 2 values | text stringlengths 96 248k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
270,314 | 20,597,409,039 | IssuesEvent | 2022-03-05 18:25:54 | ReactiveX/RxPY | https://api.github.com/repos/ReactiveX/RxPY | closed | Getting Started out of date | documentation | The Getting Started in the notebooks uses Observer and from_iterable, which are not used anymore by the library.
The getting started should be simple to do and the current one asks for digging into the documentation. | 1.0 | Getting Started out of date - The Getting Started in the notebooks uses Observer and from_iterable, which are not used anymore by the library.
The getting started should be simple to do and the current one asks for digging into the documentation. | non_priority | getting started out of date the getting started in the notebooks uses observer and from iterable which are not used anymore by the library the getting started should be simple to do and the current one asks for digging into the documentation | 0 |
314,617 | 27,013,901,649 | IssuesEvent | 2023-02-10 17:32:57 | AFM-SPM/TopoStats | https://api.github.com/repos/AFM-SPM/TopoStats | closed | Remove unnecessary images from tests | testing | Tests take ages to run, I introduced comparing Matplotlib generated images to many of the unittests but I think there are too many in there now.
We should leave those in for explicit testing of `plottingfuncs` and `plotting` (work in progress!) but all the other arrays are checked by their sums and so the graphs themselves are a bit unnecessary.
This would also reduce the size of the repository to download (although I fear they will remain buried in the Git history unless I undertake some attach to remove them, not sure I could do that without breaking things). | 1.0 | Remove unnecessary images from tests - Tests take ages to run, I introduced comparing Matplotlib generated images to many of the unittests but I think there are too many in there now.
We should leave those in for explicit testing of `plottingfuncs` and `plotting` (work in progress!) but all the other arrays are checked by their sums and so the graphs themselves are a bit unnecessary.
This would also reduce the size of the repository to download (although I fear they will remain buried in the Git history unless I undertake some attach to remove them, not sure I could do that without breaking things). | non_priority | remove unnecessary images from tests tests take ages to run i introduced comparing matplotlib generated images to many of the unittests but i think there are too many in there now we should leave those in for explicit testing of plottingfuncs and plotting work in progress but all the other arrays are checked by their sums and so the graphs themselves are a bit unnecessary this would also reduce the size of the repository to download although i fear they will remain buried in the git history unless i undertake some attach to remove them not sure i could do that without breaking things | 0 |
378,978 | 26,346,843,589 | IssuesEvent | 2023-01-10 23:04:10 | gr0vity-dev/nano-node-tracker | https://api.github.com/repos/gr0vity-dev/nano-node-tracker | opened | Break up and document node source | documentation quality improvements | 2018-08-28T16:21:23Z cryptocodeA common feedback from new devs looking at the code is the large and mostly undocumented source files.Breaking it up makes navigating easier and will reduce compile times, esp. when editing headers which many files depend on due to their scope. is already done.(This issue is work in progress, suggestions welcome)**documentation**- [ ] Add class/member docs where needed- [ ] Move existing docs to doxygen format**node.cpp**- [x] break up bootstrap #2274- [x] port mapping #1038 - [x] configuration #1296- [x] peering #1315 - [x] signature checker #1700- [x] rep crawler #1803 - [x] election #2030 - [x] network (network and udp_buffer classes)- [x] gap_cache/online_reps #2070- [x] vote processor #2041 - [x] logging #1262 - [x] block_processor #1633- [x] move port/address parsing code to common.cpp (declared in common.hpp already) #1263 - [x] alarm/operation #2173 | 1.0 | Break up and document node source - 2018-08-28T16:21:23Z cryptocodeA common feedback from new devs looking at the code is the large and mostly undocumented source files.Breaking it up makes navigating easier and will reduce compile times, esp. when editing headers which many files depend on due to their scope. 
is already done.(This issue is work in progress, suggestions welcome)**documentation**- [ ] Add class/member docs where needed- [ ] Move existing docs to doxygen format**node.cpp**- [x] break up bootstrap #2274- [x] port mapping #1038 - [x] configuration #1296- [x] peering #1315 - [x] signature checker #1700- [x] rep crawler #1803 - [x] election #2030 - [x] network (network and udp_buffer classes)- [x] gap_cache/online_reps #2070- [x] vote processor #2041 - [x] logging #1262 - [x] block_processor #1633- [x] move port/address parsing code to common.cpp (declared in common.hpp already) #1263 - [x] alarm/operation #2173 | non_priority | break up and document node source cryptocodea common feedback from new devs looking at the code is the large and mostly undocumented source files breaking it up makes navigating easier and will reduce compile times esp when editing headers which many files depend on due to their scope is already done this issue is work in progress suggestions welcome documentation add class member docs where needed move existing docs to doxygen format node cpp break up bootstrap port mapping configuration peering signature checker rep crawler election network network and udp buffer classes gap cache online reps vote processor logging block processor move port address parsing code to common cpp declared in common hpp already alarm operation | 0 |
86,892 | 10,850,140,545 | IssuesEvent | 2019-11-13 08:08:16 | vaadin/flow | https://api.github.com/repos/vaadin/flow | closed | Change generated TS types for nullable Java types | ccdm needs design | The TS type wrappers generated by Connect should fix the DX issue with nullability: https://github.com/vaadin/vaadin-connect/issues/404.
Current implementation:
Java:
```java
/* Category.java */
public class Category {
private Long id;
    private String name;
}
/* CategoryService.java */
@VaadinService
public class CategoryService {
public List<Category> list(String filter) {
return repository.find(filter);
}
}
```
TypeScript:
```typescript
/* generated/com/vaadin/starter/beveragebuddy/backend/Category.ts */
export default interface Category {
id?: number | null;
name?: string | null;
}
/* generated/CategoryService.ts */
export function list(
filter?: string | null
): Promise<Array<Category | null> | null> {
return client.call('CategoryService', 'list', {filter});
}
```
**Alternative 1: generate non-nullable TS types and make JVM throw instead of sending nulls**
TypeScript developers do not need to check for nulls when consuming values returned from a `@VaadinService`. Java developers do need to check for nulls before returning anything from a `@VaadinService`, but there is no tool to force these checks at the build time. When unexpected nulls happen they cause run-time errors on the server.
```typescript
/* generated/com/vaadin/starter/beveragebuddy/backend/Category.ts */
export default interface Category {
id?: number;
name?: string;
}
/* generated/CategoryService.ts */
export function list(
filter?: string
): Promise<Array<Category>> {
return client.call('CategoryService', 'list', {filter});
}
```
Add run-time server-side non-null checks that would make the request handler fail and return an 'error' response to the client. That way, if the client gets an OK response it is guaranteed to be non-null.
**Alternative 2: generate nullable TS types, but make them look prettier**
TypeScript developers are forced by the compiler to check for nulls when consuming values returned from a `@VaadinService`. Java developers do not need to check for nulls before returning anything from a `@VaadinService`. Unexpected nulls do not happen.
```typescript
/* generated/com/vaadin/starter/beveragebuddy/backend/Category.ts */
export interface NonNullCategory {
id?: number;
name?: string;
}
type Category = NonNullCategory | undefined;
export default Category;
/* generated/CategoryService.ts */
export function list(
filter?: string
): Promise<Array<Category>> {
return client.call('CategoryService', 'list', {filter});
}
```
An extension to this point would be supporting the `@NotNull` annotation in Java. Parameters, object properties and method return values annotated with it would translate to non-nullable types in TS. See https://github.com/vaadin/vaadin-connect/issues/388.
**Alternative 3: generate nullable TS types, but remove [`strictNullChecks`](https://www.typescriptlang.org/docs/handbook/compiler-options.html)**
At some point either TypeScript or Java developers need to check for nulls, but there is no tool to force these checks at the build time. When unexpected nulls happen they cause run-time errors in the browser.
```typescript
/* generated/com/vaadin/starter/beveragebuddy/backend/Category.ts */
export default interface Category {
id?: number;
name?: string;
}
/* generated/CategoryService.ts */
export function list(
filter?: string
): Promise<Array<Category>> {
return client.call('CategoryService', 'list', {filter});
}
``` | 1.0 | Change generated TS types for nullable Java types - The TS type wrappers generated by Connect should fix the DX issue with nullability: https://github.com/vaadin/vaadin-connect/issues/404.
Current implementation:
Java:
```java
/* Category.java */
public class Category {
private Long id;
    private String name;
}
/* CategoryService.java */
@VaadinService
public class CategoryService {
public List<Category> list(String filter) {
return repository.find(filter);
}
}
```
TypeScript:
```typescript
/* generated/com/vaadin/starter/beveragebuddy/backend/Category.ts */
export default interface Category {
id?: number | null;
name?: string | null;
}
/* generated/CategoryService.ts */
export function list(
filter?: string | null
): Promise<Array<Category | null> | null> {
return client.call('CategoryService', 'list', {filter});
}
```
**Alternative 1: generate non-nullable TS types and make JVM throw instead of sending nulls**
TypeScript developers do not need to check for nulls when consuming values returned from a `@VaadinService`. Java developers do need to check for nulls before returning anything from a `@VaadinService`, but there is no tool to force these checks at the build time. When unexpected nulls happen they cause run-time errors on the server.
```typescript
/* generated/com/vaadin/starter/beveragebuddy/backend/Category.ts */
export default interface Category {
id?: number;
name?: string;
}
/* generated/CategoryService.ts */
export function list(
filter?: string
): Promise<Array<Category>> {
return client.call('CategoryService', 'list', {filter});
}
```
Add run-time server-side non-null checks that would make the request handler fail and return an 'error' response to the client. That way, if the client gets an OK response it is guaranteed to be non-null.
**Alternative 2: generate nullable TS types, but make them look prettier**
TypeScript developers are forced by the compiler to check for nulls when consuming values returned from a `@VaadinService`. Java developers do not need to check for nulls before returning anything from a `@VaadinService`. Unexpected nulls do not happen.
```typescript
/* generated/com/vaadin/starter/beveragebuddy/backend/Category.ts */
export interface NonNullCategory {
id?: number;
name?: string;
}
type Category = NonNullCategory | undefined;
export default Category;
/* generated/CategoryService.ts */
export function list(
filter?: string
): Promise<Array<Category>> {
return client.call('CategoryService', 'list', {filter});
}
```
An extension to this point would be supporting the `@NotNull` annotation in Java. Parameters, object properties and method return values annotated with it would translate to non-nullable types in TS. See https://github.com/vaadin/vaadin-connect/issues/388.
**Alternative 3: generate nullable TS types, but remove [`strictNullChecks`](https://www.typescriptlang.org/docs/handbook/compiler-options.html)**
At some point either TypeScript or Java developers need to check for nulls, but there is no tool to force these checks at the build time. When unexpected nulls happen they cause run-time errors in the browser.
```typescript
/* generated/com/vaadin/starter/beveragebuddy/backend/Category.ts */
export default interface Category {
id?: number;
name?: string;
}
/* generated/CategoryService.ts */
export function list(
filter?: string
): Promise<Array<Category>> {
return client.call('CategoryService', 'list', {filter});
}
``` | non_priority | change generated ts types for nullable java types the ts type wrappers generated by connect should fix the dx issue with nullability current implementation java java category java public class category private long id private string name categoryservice java vaadinservice public class categoryservice public list list string filter return repository find filter typescript typescript generated com vaadin starter beveragebuddy backend category ts export default interface category id number null name string null generated categoryservice ts export function list filter string null promise null return client call categoryservice list filter alternative generate non nullable ts types and make jvm throw instead of sending nulls typescript developers do not need to check for nulls when consuming values returned from a vaadinservice java developers do need to check for nulls before returning anything from a vaadinservice but there is no tool to force these checks at the build time when unexpected nulls happen they cause run time errors on the server typescript generated com vaadin starter beveragebuddy backend category ts export default interface category id number name string generated categoryservice ts export function list filter string promise return client call categoryservice list filter add run time server side non null checks that would make the request handler fail and return an error response to the client that way if the client gets an ok response it is guaranteed to be non null alternative generate nullable ts types but make them look prettier typescript developers are forced by the compiler to check for nulls when consuming values returned from a vaadinservice java developers do not need to check for nulls before returning anything from a vaadinservice unexpected nulls do not happen typescript generated com vaadin starter beveragebuddy backend category ts export interface nonnullcategory id number name string type category nonnullcategory 
undefined export default category generated categoryservice ts export function list filter string promise return client call categoryservice list filter an extension to this point would be supporting the notnull annotation in java parameters object properties and method return values annotated with it would translate to non nullable types in ts see alternative generate nullable ts types but remove at some point either typescript or java developers need to check for nulls but there is no tool to force these checks at the build time when unexpected nulls happen they cause run time errors in the browser typescript generated com vaadin starter beveragebuddy backend category ts export default interface category id number name string generated categoryservice ts export function list filter string promise return client call categoryservice list filter | 0 |
356,094 | 25,176,107,693 | IssuesEvent | 2022-11-11 09:24:12 | totsukatomofumi/pe | https://api.github.com/repos/totsukatomofumi/pe | opened | Missing punctuation | type.DocumentationBug severity.Low | 
The missing comma made this hard to read at first. Page 16 of User Guide.
<!--session: 1668154060078-7154a181-3c17-4a5d-8442-00e667ee7736-->
<!--Version: Web v3.4.4--> | 1.0 | Missing punctuation - 
The missing comma made this hard to read at first. Page 16 of User Guide.
<!--session: 1668154060078-7154a181-3c17-4a5d-8442-00e667ee7736-->
<!--Version: Web v3.4.4--> | non_priority | missing punctuation the missing comma made this hard to read at first page of user guide | 0 |
4,781 | 5,289,808,618 | IssuesEvent | 2017-02-08 18:17:30 | hydroshare/hydroshare | https://api.github.com/repos/hydroshare/hydroshare | closed | Add requires.io badge to Github | in progress SECURITY | Allow easy checking for out of date Python dependencies
- [x] rework [hs_docker_base](https://github.com/hydroshare/hs_docker_base) to comply to requires.io formatting.
- Existing requirements are not brought in by a file, but rather by direct definition in the Dockerfile
- [ ] ~update Celery to 4.0.0 to allow for the removal of Pickle ( reference #1799 )~
- [x] define as hs_docker_base:1.9.5 in [docker hub](https://hub.docker.com/r/mjstealey/hs_docker_base/) | True | Add requires.io badge to Github - Allow easy checking for out of date Python dependencies
- [x] rework [hs_docker_base](https://github.com/hydroshare/hs_docker_base) to comply to requires.io formatting.
- Existing requirements are not brought in by a file, but rather by direct definition in the Dockerfile
- [ ] ~update Celery to 4.0.0 to allow for the removal of Pickle ( reference #1799 )~
- [x] define as hs_docker_base:1.9.5 in [docker hub](https://hub.docker.com/r/mjstealey/hs_docker_base/) | non_priority | add requires io badge to github allow easy checking for out of date python dependencies rework to comply to requires io formatting existing requirements are not brought in by a file but rather by direct definition in the dockerfile update celery to to allow for the removal of pickle reference define as hs docker base in | 0 |
182,004 | 21,664,473,871 | IssuesEvent | 2022-05-07 01:28:11 | rzr/rzr-presentation-gstreamer | https://api.github.com/repos/rzr/rzr-presentation-gstreamer | closed | WS-2016-0031 (High) detected in ws-0.8.0.tgz - autoclosed | security vulnerability | ## WS-2016-0031 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ws-0.8.0.tgz</b></p></summary>
<p>simple to use, blazing fast and thoroughly tested websocket client, server and console for node.js, up-to-date against RFC-6455</p>
<p>Library home page: <a href="https://registry.npmjs.org/ws/-/ws-0.8.0.tgz">https://registry.npmjs.org/ws/-/ws-0.8.0.tgz</a></p>
<p>Path to dependency file: rzr-presentation-gstreamer/reveal.js-master/plugin/multiplex/package.json</p>
<p>Path to vulnerable library: rzr-presentation-gstreamer/reveal.js-master/node_modules/ws/package.json,rzr-presentation-gstreamer/reveal.js-master/node_modules/ws/package.json</p>
<p>
Dependency Hierarchy:
- socket.io-1.3.7.tgz (Root Library)
- engine.io-1.5.4.tgz
- :x: **ws-0.8.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/rzr/rzr-presentation-gstreamer/commit/f6343cce620a1e142125782fb78520213496b307">f6343cce620a1e142125782fb78520213496b307</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
DoS in ws module due to excessively large websocket message.
<p>Publish Date: 2016-06-23
<p>URL: <a href=https://github.com/nodejs/node/issues/7388>WS-2016-0031</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/WS-2016-0031">https://nvd.nist.gov/vuln/detail/WS-2016-0031</a></p>
<p>Release Date: 2016-06-23</p>
<p>Fix Resolution: ws - 1.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | WS-2016-0031 (High) detected in ws-0.8.0.tgz - autoclosed - ## WS-2016-0031 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ws-0.8.0.tgz</b></p></summary>
<p>simple to use, blazing fast and thoroughly tested websocket client, server and console for node.js, up-to-date against RFC-6455</p>
<p>Library home page: <a href="https://registry.npmjs.org/ws/-/ws-0.8.0.tgz">https://registry.npmjs.org/ws/-/ws-0.8.0.tgz</a></p>
<p>Path to dependency file: rzr-presentation-gstreamer/reveal.js-master/plugin/multiplex/package.json</p>
<p>Path to vulnerable library: rzr-presentation-gstreamer/reveal.js-master/node_modules/ws/package.json,rzr-presentation-gstreamer/reveal.js-master/node_modules/ws/package.json</p>
<p>
Dependency Hierarchy:
- socket.io-1.3.7.tgz (Root Library)
- engine.io-1.5.4.tgz
- :x: **ws-0.8.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/rzr/rzr-presentation-gstreamer/commit/f6343cce620a1e142125782fb78520213496b307">f6343cce620a1e142125782fb78520213496b307</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
DoS in ws module due to excessively large websocket message.
<p>Publish Date: 2016-06-23
<p>URL: <a href=https://github.com/nodejs/node/issues/7388>WS-2016-0031</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/WS-2016-0031">https://nvd.nist.gov/vuln/detail/WS-2016-0031</a></p>
<p>Release Date: 2016-06-23</p>
<p>Fix Resolution: ws - 1.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | ws high detected in ws tgz autoclosed ws high severity vulnerability vulnerable library ws tgz simple to use blazing fast and thoroughly tested websocket client server and console for node js up to date against rfc library home page a href path to dependency file rzr presentation gstreamer reveal js master plugin multiplex package json path to vulnerable library rzr presentation gstreamer reveal js master node modules ws package json rzr presentation gstreamer reveal js master node modules ws package json dependency hierarchy socket io tgz root library engine io tgz x ws tgz vulnerable library found in head commit a href found in base branch master vulnerability details dos in ws module due to excessively large websocket message publish date url a href cvss score details base score metrics exploitability metrics attack vector n a attack complexity n a privileges required n a user interaction n a scope n a impact metrics confidentiality impact n a integrity impact n a availability impact n a for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ws step up your open source security game with whitesource | 0 |
10,630 | 2,962,553,209 | IssuesEvent | 2015-07-10 01:34:15 | TypeStrong/atom-typescript | https://api.github.com/repos/TypeStrong/atom-typescript | closed | Typescript Plugin Reload | by-design | Switching git branches doesn't prompt a reload of atom-typescript
After switching between branches, compilation errors will appear because of inconsistent state. I need to close and re-open atom for these errors to go away. | 1.0 | Typescript Plugin Reload - Switching git branches doesn't prompt a reload of atom-typescript
After switching between branches, compilation errors will appear because of inconsistent state. I need to close and re-open atom for these errors to go away. | non_priority | typescript plugin reload switching git branches doesn t prompt a reload of atom typescript after switching between branches compilation errors will appear because of inconsistent state i need to close and re open atom for these errors to go away | 0 |
233,263 | 25,758,329,224 | IssuesEvent | 2022-12-08 18:12:00 | whitesource-ps/ws-cleanup-tool | https://api.github.com/repos/whitesource-ps/ws-cleanup-tool | opened | certifi-2022.6.15-py3-none-any.whl: 1 vulnerabilities (highest severity is: 6.8) | security vulnerability | <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>certifi-2022.6.15-py3-none-any.whl</b></p></summary>
<p>Python package for providing Mozilla's CA Bundle.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/e9/06/d3d367b7af6305b16f0d28ae2aaeb86154fa91f144f036c2d5002a5a202b/certifi-2022.6.15-py3-none-any.whl">https://files.pythonhosted.org/packages/e9/06/d3d367b7af6305b16f0d28ae2aaeb86154fa91f144f036c2d5002a5a202b/certifi-2022.6.15-py3-none-any.whl</a></p>
<p>Path to dependency file: /requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt,/tmp/ws-scm/ws-cleanup-tool</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (certifi version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-23491](https://www.mend.io/vulnerability-database/CVE-2022-23491) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.8 | certifi-2022.6.15-py3-none-any.whl | Direct | certifi - 2022.12.07 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-23491</summary>
### Vulnerable Library - <b>certifi-2022.6.15-py3-none-any.whl</b></p>
<p>Python package for providing Mozilla's CA Bundle.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/e9/06/d3d367b7af6305b16f0d28ae2aaeb86154fa91f144f036c2d5002a5a202b/certifi-2022.6.15-py3-none-any.whl">https://files.pythonhosted.org/packages/e9/06/d3d367b7af6305b16f0d28ae2aaeb86154fa91f144f036c2d5002a5a202b/certifi-2022.6.15-py3-none-any.whl</a></p>
<p>Path to dependency file: /requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt,/tmp/ws-scm/ws-cleanup-tool</p>
<p>
Dependency Hierarchy:
- :x: **certifi-2022.6.15-py3-none-any.whl** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Certifi is a curated collection of Root Certificates for validating the trustworthiness of SSL certificates while verifying the identity of TLS hosts. Certifi 2022.12.07 removes root certificates from "TrustCor" from the root store. These are in the process of being removed from Mozilla's trust store. TrustCor's root certificates are being removed pursuant to an investigation prompted by media reporting that TrustCor's ownership also operated a business that produced spyware. Conclusions of Mozilla's investigation can be found in the linked google group discussion.
<p>Publish Date: 2022-12-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-23491>CVE-2022-23491</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2022-23491">https://www.cve.org/CVERecord?id=CVE-2022-23491</a></p>
<p>Release Date: 2022-12-07</p>
<p>Fix Resolution: certifi - 2022.12.07</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p> | True | certifi-2022.6.15-py3-none-any.whl: 1 vulnerabilities (highest severity is: 6.8) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>certifi-2022.6.15-py3-none-any.whl</b></p></summary>
<p>Python package for providing Mozilla's CA Bundle.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/e9/06/d3d367b7af6305b16f0d28ae2aaeb86154fa91f144f036c2d5002a5a202b/certifi-2022.6.15-py3-none-any.whl">https://files.pythonhosted.org/packages/e9/06/d3d367b7af6305b16f0d28ae2aaeb86154fa91f144f036c2d5002a5a202b/certifi-2022.6.15-py3-none-any.whl</a></p>
<p>Path to dependency file: /requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt,/tmp/ws-scm/ws-cleanup-tool</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (certifi version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-23491](https://www.mend.io/vulnerability-database/CVE-2022-23491) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.8 | certifi-2022.6.15-py3-none-any.whl | Direct | certifi - 2022.12.07 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-23491</summary>
### Vulnerable Library - <b>certifi-2022.6.15-py3-none-any.whl</b></p>
<p>Python package for providing Mozilla's CA Bundle.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/e9/06/d3d367b7af6305b16f0d28ae2aaeb86154fa91f144f036c2d5002a5a202b/certifi-2022.6.15-py3-none-any.whl">https://files.pythonhosted.org/packages/e9/06/d3d367b7af6305b16f0d28ae2aaeb86154fa91f144f036c2d5002a5a202b/certifi-2022.6.15-py3-none-any.whl</a></p>
<p>Path to dependency file: /requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt,/tmp/ws-scm/ws-cleanup-tool</p>
<p>
Dependency Hierarchy:
- :x: **certifi-2022.6.15-py3-none-any.whl** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Certifi is a curated collection of Root Certificates for validating the trustworthiness of SSL certificates while verifying the identity of TLS hosts. Certifi 2022.12.07 removes root certificates from "TrustCor" from the root store. These are in the process of being removed from Mozilla's trust store. TrustCor's root certificates are being removed pursuant to an investigation prompted by media reporting that TrustCor's ownership also operated a business that produced spyware. Conclusions of Mozilla's investigation can be found in the linked google group discussion.
<p>Publish Date: 2022-12-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-23491>CVE-2022-23491</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2022-23491">https://www.cve.org/CVERecord?id=CVE-2022-23491</a></p>
<p>Release Date: 2022-12-07</p>
<p>Fix Resolution: certifi - 2022.12.07</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p> | non_priority | 0 |
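A quick way to act on the certifi advisory above is to check whether a pinned version predates the fixed 2022.12.07 release. certifi uses date-based versions (YYYY.MM.DD), so a plain tuple comparison suffices; a minimal sketch (the helper name is illustrative):

```python
def includes_trustcor_removal(version: str) -> bool:
    """certifi versions are dated YYYY.MM.DD; 2022.12.07 dropped the TrustCor roots."""
    parts = tuple(int(p) for p in version.split("."))
    return parts >= (2022, 12, 7)

# The wheel pinned above is 2022.6.15, which predates the fix:
print(includes_trustcor_removal("2022.6.15"))   # False
print(includes_trustcor_removal("2022.12.07"))  # True
```

In practice the remediation is simply upgrading the pin, e.g. `certifi>=2022.12.07` in requirements.txt.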
223,090 | 24,711,644,089 | IssuesEvent | 2022-10-20 01:35:57 | raindigi/site-preview | https://api.github.com/repos/raindigi/site-preview | closed | CVE-2021-3805 (High) detected in object-path-0.11.4.tgz - autoclosed | security vulnerability | ## CVE-2021-3805 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>object-path-0.11.4.tgz</b></p></summary>
<p>Access deep object properties using a path</p>
<p>Library home page: <a href="https://registry.npmjs.org/object-path/-/object-path-0.11.4.tgz">https://registry.npmjs.org/object-path/-/object-path-0.11.4.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/object-path/package.json</p>
<p>
Dependency Hierarchy:
- gatsby-1.9.277.tgz (Root Library)
- gatsby-cli-1.1.58.tgz
- yurnalist-0.2.1.tgz
- :x: **object-path-0.11.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/raindigi/site-preview/git/commits/5b58d9941528c1a41f80dabfe33e36195928235b">5b58d9941528c1a41f80dabfe33e36195928235b</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
object-path is vulnerable to Improperly Controlled Modification of Object Prototype Attributes ('Prototype Pollution')
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3805>CVE-2021-3805</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/571e3baf-7c46-46e3-9003-ba7e4e623053/">https://huntr.dev/bounties/571e3baf-7c46-46e3-9003-ba7e4e623053/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution: object-path - 0.11.8</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | non_priority | 0 |
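For context on the report above: object-path's job is deep get/set through dot-separated key paths, and the CVE stems from path segments like `__proto__` reaching JavaScript's prototype chain. A rough Python analogue of that API (illustrative only — Python dicts have no prototype chain, so the pollution itself does not translate):

```python
def set_path(obj: dict, path: str, value) -> None:
    """Walk dot-separated keys, creating intermediate dicts, then assign the leaf."""
    keys = path.split(".")
    for key in keys[:-1]:
        obj = obj.setdefault(key, {})
    obj[keys[-1]] = value

def get_path(obj: dict, path: str, default=None):
    """Companion getter: return default when any segment is missing."""
    cur = obj
    for key in path.split("."):
        if not isinstance(cur, dict) or key not in cur:
            return default
        cur = cur[key]
    return cur

cfg = {}
set_path(cfg, "server.tls.port", 8443)
print(get_path(cfg, "server.tls.port"))      # 8443
print(get_path(cfg, "server.missing", "-"))  # -
```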
70,565 | 13,492,754,775 | IssuesEvent | 2020-09-11 18:32:23 | dotnet/interactive | https://api.github.com/repos/dotnet/interactive | closed | Failed execution in vscode notebook display green tick | Area-VS Code Extension Impact-Medium bug | The cell state is success even if the submission failed

| 1.0 | non_priority | 0 |
8,950 | 7,516,991,102 | IssuesEvent | 2018-04-12 00:57:48 | LighthouseBlog/Blog | https://api.github.com/repos/LighthouseBlog/Blog | opened | Authentication not redirecting after a certain amount of time | bug security | <!--
PLEASE HELP US PROCESS GITHUB ISSUES FASTER BY PROVIDING THE FOLLOWING INFORMATION.
ISSUES MISSING IMPORTANT INFORMATION MAY BE CLOSED WITHOUT INVESTIGATION.
-->
## I'm submitting a...
<!-- Check one of the following options with "x" -->
<pre><code>
[ ] Regression (a behavior that used to work and stopped working in a new release)
[x] Bug report <!-- Please search GitHub for a similar issue or PR before submitting -->
[ ] Feature request
[ ] Support request
</code></pre>
## Current behavior
<!-- Describe how the issue manifests. -->
When I am authenticated, my token is not getting silently renewed (a different issue). Even after that, when I am not authenticated, the nav bar will change, but I am not redirected to the home page.
## Expected behavior
<!-- Describe what the desired behavior would be. -->
When I am not authenticated and I hit the api, redirect me to the home page.
## What is the motivation for this change?
<!-- Describe the motivation or the concrete use case. -->
Security...
## Environment
<pre><code>
Browser:
- [ ] Chrome (desktop) version XX
- [ ] Firefox version XX
- [ ] Safari (desktop) version XX
- [ ] Edge version XX
For Tooling issues:
- Node version: XX <!-- run `node --version` -->
- Platform: <!-- Mac, Linux, Windows -->
Others:
<!-- Anything else relevant? Operating system version, IDE, package manager, HTTP server, ... -->
</code></pre>
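The redirect behaviour requested above typically hinges on a client-side expiry check of the bearer token before (or after) hitting the API. A hedged sketch of such a check — it reads the JWT `exp` claim without verifying the signature, and the function name is illustrative:

```python
import base64
import json
import time

def jwt_expired(token, now=None):
    """Decode the (unverified!) JWT payload and compare its exp claim to the clock."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)   # restore stripped base64 padding
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    return claims["exp"] <= (time.time() if now is None else now)

# Demo with a hand-built, unsigned token whose exp is far in the past:
demo_payload = base64.urlsafe_b64encode(json.dumps({"exp": 1}).encode()).decode().rstrip("=")
demo_token = "hdr." + demo_payload + ".sig"
print(jwt_expired(demo_token))  # True
```

When this check (or a 401 from the server) fires, the app would clear its session and navigate to the home page; silent renewal — the other issue mentioned — happens before `exp` is reached.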
| True | non_priority | 0 |
175,231 | 27,811,241,651 | IssuesEvent | 2023-03-18 06:00:16 | HHS/OPRE-OPS | https://api.github.com/repos/HHS/OPRE-OPS | closed | Experiment on Figma Components | design | ### Goals
* Determine if Figma components ease the iterative and creative design process
* Determine how much current reusability there is with our design
* Assess what tasks will be on future stories, or totally net-new stories should result
### Tasks
- [x] Meet with and learn from other Flexioneers about design systems in Figma
- [x] Create reusable Figma component and adapt current pages to use that component.
- [x] Make judgements on time savings, if any
### Additional Context
### Resources
| 1.0 | non_priority | 0 |
244,959 | 20,734,563,084 | IssuesEvent | 2022-03-14 12:36:34 | hyperledger/cactus | https://api.github.com/repos/hyperledger/cactus | closed | test: fix yarn test:all on GCP instances | bug Flaky-Test-Automation Tests | **Describe the bug**
I spawned an Ubuntu 20 VM, installed Java 8, the latest Docker, and Node 16.13, ran configure (success), and then npm run test:all. There are tests that fail.
**To Reproduce**
Spawn a VM on Google Cloud (c2-standard-8, 70 GB SSD, 8 cores, 32 GB RAM) and run npm configure. After that, run npm run test:all
**Expected behavior**
The OpenAPI tests fail, always. Part of them:
FAIL packages/cactus-plugin-keychain-vault/src/test/typescript/integration/openapi/openapi-validation.test.ts 8 failed of 22 14s
✖ Error: Request failed with status code 500
✖ Error: Request failed with status code 500
✖ Error: Request failed with status code 500
✖ Error: Request failed with status code 500
✖ Endpoint setKeychainEntryV1 with fake=4: response.status === 400 OK
✖ TypeError: _b.data.map is not a function
✖ Endpoint getKeychainEntryV1 with fake=4: response.status === 400 OK
✖ TypeError: _b.data.map is not a function
[**Logs/Stack traces**
](https://gist.github.com/RafaelAPB/df52004cf5bda1b7538a6ebacc16b15f)
**Operating system name, version, build:**
**Hyperledger Cactus release version or commit (git rev-parse --short HEAD):**
commit ed6e16413f00031392451c4d4896903fece3a7c6
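The `TypeError: _b.data.map is not a function` entries above usually mean the response body was a single error object (or undefined) where the test expected an array. One defensive pattern — shown here as a Python sketch with an illustrative name, though the actual suite is TypeScript — is to normalize the payload before iterating:

```python
def as_list(payload):
    """Normalize an API error payload: None -> [], single object -> [obj], list -> list."""
    if payload is None:
        return []
    return payload if isinstance(payload, list) else [payload]

print(as_list({"msg": "bad request"}))        # [{'msg': 'bad request'}]
print(as_list([{"msg": "a"}, {"msg": "b"}]))  # [{'msg': 'a'}, {'msg': 'b'}]
```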
| 2.0 | non_priority | 0 |
15,702 | 19,848,397,889 | IssuesEvent | 2022-01-21 09:31:07 | ooi-data/RS03ASHS-MJ03B-09-BOTPTA304-streamed-botpt_nano_sample | https://api.github.com/repos/ooi-data/RS03ASHS-MJ03B-09-BOTPTA304-streamed-botpt_nano_sample | opened | 🛑 Processing failed: ValueError | process | ## Overview
`ValueError` found in `processing_task` task during run ended on 2022-01-21T09:31:06.652844.
## Details
Flow name: `RS03ASHS-MJ03B-09-BOTPTA304-streamed-botpt_nano_sample`
Task name: `processing_task`
Error type: `ValueError`
Error message: cannot reshape array of size 1209600 into shape (25000000,)
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing
final_path = finalize_data_stream(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream
append_to_zarr(mod_ds, final_store, enc, logger=logger)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr
_append_zarr(store, mod_ds)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr
existing_arr.append(var_data.values)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2305, in append
return self._write_op(self._append_nosync, data, axis=axis)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2211, in _write_op
return self._synchronized_op(f, *args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2201, in _synchronized_op
result = f(*args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2341, in _append_nosync
self[append_selection] = data
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1224, in __setitem__
self.set_basic_selection(selection, value, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1319, in set_basic_selection
return self._set_basic_selection_nd(selection, value, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1610, in _set_basic_selection_nd
self._set_selection(indexer, value, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1682, in _set_selection
self._chunk_setitems(lchunk_coords, lchunk_selection, chunk_values,
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1871, in _chunk_setitems
cdatas = [self._process_for_setitem(key, sel, val, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1871, in <listcomp>
cdatas = [self._process_for_setitem(key, sel, val, fields=fields)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1950, in _process_for_setitem
chunk = self._decode_chunk(cdata)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 2003, in _decode_chunk
chunk = chunk.reshape(expected_shape or self._chunks, order=self._order)
ValueError: cannot reshape array of size 1209600 into shape (25000000,)
```
</details>
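The ValueError at the bottom of the trace reduces to a NumPy reshape whose element counts disagree: the chunk decoded from the store holds 1,209,600 values while the array metadata promises 25,000,000 per chunk — typically a sign that the final store was written with different chunking than the data being appended. A minimal reproduction of the error class:

```python
import numpy as np

stored = np.zeros(1_209_600)       # elements actually present in the decoded chunk
expected_shape = (25_000_000,)     # shape the zarr chunk metadata promises
try:
    stored.reshape(expected_shape)
    ok = True
except ValueError as err:
    ok = False
    print(err)  # cannot reshape array of size 1209600 into shape (25000000,)
```

The usual remedies are rechunking the incoming dataset to match the target store's chunk size before appending, or rewriting the store with consistent chunks.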
| 1.0 | non_priority | 0 |
40,209 | 9,908,257,073 | IssuesEvent | 2019-06-27 17:49:22 | idaholab/moose | https://api.github.com/repos/idaholab/moose | opened | MOOSEDocs: links are same color as note box heading background | T: defect | ## Bug Description
There is at least one instance where a link is used in a note box header: the "Core Extension" page of the MOOSEDocs documentation. The link "Markdown" is invisible in the first note. The fix may just be to not use links in note headers, rather than change colors.
## Steps to Reproduce
See above.
## Impact
The link text is invisible.
| 1.0 | non_priority | 0 |
326,532 | 24,089,774,175 | IssuesEvent | 2022-09-19 13:55:36 | DaveyJH/ci-portfolio-four | https://api.github.com/repos/DaveyJH/ci-portfolio-four | closed | Docs: Add live site links and repo link | documentation 1 sprint-12 | **Details:**
Add live site links and repo link
- Render
- Heroku | 1.0 | non_priority | 0 |
15,990 | 20,188,203,128 | IssuesEvent | 2022-02-11 01:17:42 | savitamittalmsft/WAS-SEC-TEST | https://api.github.com/repos/savitamittalmsft/WAS-SEC-TEST | opened | Automatically remove/obfuscate personally identifiable information (PII) for this workload | WARP-Import WAF FEB 2021 Security Performance and Scalability Capacity Management Processes Health Modeling & Monitoring Application Level Monitoring | <a href="https://docs.microsoft.com/azure/search/cognitive-search-skill-pii-detection">Automatically remove/obfuscate personally identifiable information (PII) for this workload</a>
<p><b>Why Consider This?</b></p>
Extra care should be taken around logging of sensitive application areas. PII (contact information, payment information etc.) should not be stored in any application logs and protective measures should be applied (such as obfuscation).
<p><b>Context</b></p>
<p><b>Suggested Actions</b></p>
<p><span>Deploy PII detection and removal/obfuscation solution for this workload.</span></p>
<p><b>Learn More</b></p>
<p><span>Machine learning tools like </span><a href="https://docs.microsoft.com/en-us/azure/search/cognitive-search-skill-pii-detection" target="_blank"><span>Cognitive Search PII detection</span></a><span> can help with this.</span></p> | 1.0 | Automatically remove/obfuscate personally identifiable information (PII) for this workload - <a href="https://docs.microsoft.com/azure/search/cognitive-search-skill-pii-detection">Automatically remove/obfuscate personally identifiable information (PII) for this workload</a>
<p><b>Why Consider This?</b></p>
Extra care should be taken around logging of sensitive application areas. PII (contact information, payment information etc.) should not be stored in any application logs and protective measures should be applied (such as obfuscation).
<p><b>Context</b></p>
<p><b>Suggested Actions</b></p>
<p><span>Deploy PII detection and removal/obfuscation solution for this workload.</span></p>
<p><b>Learn More</b></p>
<p><span>Machine learning tools like </span><a href="https://docs.microsoft.com/en-us/azure/search/cognitive-search-skill-pii-detection" target="_blank"><span>Cognitive Search PII detection</span></a><span> can help with this.</span></p> | non_priority | automatically remove obfuscate personally identifiable information pii for this workload why consider this extra care should be taken around logging of sensitive application areas pii contact information payment information etc should not be stored in any application logs and protective measures should be applied such as obfuscation context suggested actions deploy pii detection and removal obfuscation solution for this workload learn more machine learning tools like cognitive search pii detection can help with this | 0 |
175,753 | 21,327,565,329 | IssuesEvent | 2022-04-18 02:16:38 | fluent/fluent-bit | https://api.github.com/repos/fluent/fluent-bit | closed | Security Vulnerabilities for fluent-bit:1.8.10 | Stale security | ## Bug Report
**Describe the bug**
Hi team,
While security-scanning the latest release version (`v1.8.10`) of our fluent-bit image, we found two vulnerabilities (high and medium severity) that were introduced via the fluent-bit binary installation.
It would be highly appreciated if you could either rescore them based on your usage or fix these security threats in your next release.
Thanks in advance!
| Library | Vulnerability | Description | NVD | Used Version | Unaffected Version(s) |
|:-------:|:--------------:|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------:|:------------:|:---------------------:|
| c-ares | CVE-2021-3672 | A flaw was found in c-ares library, where a missing input validation check of host names returned by DNS (Domain Name Servers) can lead to output of wrong hostnames which might potentially lead to Domain Hijacking. The highest threat from this vulnerability is to confidentiality and integrity as well as system availability. | https://nvd.nist.gov/vuln/detail/CVE-2021-3672 | 1.17.1 | >= 1.17.2 |
| sqlite3 | CVE-2021-20227 | A flaw was found in SQLite's SELECT query functionality (src/select.c). This flaw allows an attacker who is capable of running SQL queries locally on the SQLite database to cause a denial of service or possible code execution by triggering a use-after-free. The highest threat from this vulnerability is to system availability. | https://nvd.nist.gov/vuln/detail/CVE-2021-20227 | 3.33.0 | >= 3.34.1 | | True | Security Vulnerabilities for fluent-bit:1.8.10 - ## Bug Report
**Describe the bug**
Hi team,
While security-scanning the latest release version (`v1.8.10`) of our fluent-bit image, we found two vulnerabilities (high and medium severity) that were introduced via the fluent-bit binary installation.
It would be highly appreciated if you could either rescore them based on your usage or fix these security threats in your next release.
Thanks in advance!
| Library | Vulnerability | Description | NVD | Used Version | Unaffected Version(s) |
|:-------:|:--------------:|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------:|:------------:|:---------------------:|
| c-ares | CVE-2021-3672 | A flaw was found in c-ares library, where a missing input validation check of host names returned by DNS (Domain Name Servers) can lead to output of wrong hostnames which might potentially lead to Domain Hijacking. The highest threat from this vulnerability is to confidentiality and integrity as well as system availability. | https://nvd.nist.gov/vuln/detail/CVE-2021-3672 | 1.17.1 | >= 1.17.2 |
| sqlite3 | CVE-2021-20227 | A flaw was found in SQLite's SELECT query functionality (src/select.c). This flaw allows an attacker who is capable of running SQL queries locally on the SQLite database to cause a denial of service or possible code execution by triggering a use-after-free. The highest threat from this vulnerability is to system availability. | https://nvd.nist.gov/vuln/detail/CVE-2021-20227 | 3.33.0 | >= 3.34.1 | | non_priority | security vulnerabilities for fluent bit bug report describe the bug hi team security scanning the latest release version for our fluent bit image we have found two vulnerabilities high and medium severity that got in via the fluent bit binary installation would be highly appreciated if you could either rescore them based on your usage or fix those security threats within your next release thanks in advance library vulnerability description nvd used version unaffected version s c ares cve a flaw was found in c ares library where a missing input validation check of host names returned by dns domain name servers can lead to output of wrong hostnames which might potentially lead to domain hijacking the highest threat from this vulnerability is to confidentiality and integrity as well as system availability cve a flaw was found in sqlite s select query functionality src select c this flaw allows an attacker who is capable of running sql queries locally on the sqlite database to cause a denial of service or possible code execution by triggering a use after free the highest threat from this vulnerability is to system availability | 0 |
30,736 | 8,583,555,625 | IssuesEvent | 2018-11-13 20:04:10 | travis-ci/travis-ci | https://api.github.com/repos/travis-ci/travis-ci | closed | Passing build artifacts across stages | build stages feature-request important locked | We want to achieve a 2 stage CI:
* A `build` stage that compiles a program. 2 builds: mac and linux.
* A `test` stage that use the compiled program. Multiple builds: 2 OSes * number of different language versions (e.g. `python: 2.7` and `python: 3.5`).
It is hard to pass the compiled program to the `test` stage. 2 approaches I thought of:
* use Travis's caching: Not feasible since the `language:` / `python:` keys are different between the `build` stage and the `test` stage. The cache wouldn't be shared according to https://docs.travis-ci.com/user/caching#Caches-and-build-matrices. Don't even consider what would happen when there are parallel builds of different commits, or when one restart a build of an older commit.
* use custom S3: It can work in our own repo, however, without exposing our S3 credentials, our forks wouldn't be able to use CI (unless they override the S3 credentials with their own).
It would be great if there is some ad-hoc feature for passing build artifacts across stages.
For reference, the GitLab CI artifacts is pretty perfect: https://docs.gitlab.com/ee/ci/yaml/#artifacts
Their `artifacts` from all previous stages are passed by default, but also can be fine-tuned with `dependencies`. | 1.0 | Passing build artifacts across stages - We want to achieve a 2 stage CI:
* A `build` stage that compiles a program. 2 builds: mac and linux.
* A `test` stage that use the compiled program. Multiple builds: 2 OSes * number of different language versions (e.g. `python: 2.7` and `python: 3.5`).
It is hard to pass the compiled program to the `test` stage. 2 approaches I thought of:
* use Travis's caching: Not feasible since the `language:` / `python:` keys are different between the `build` stage and the `test` stage. The cache wouldn't be shared according to https://docs.travis-ci.com/user/caching#Caches-and-build-matrices. Don't even consider what would happen when there are parallel builds of different commits, or when one restart a build of an older commit.
* use custom S3: It can work in our own repo, however, without exposing our S3 credentials, our forks wouldn't be able to use CI (unless they override the S3 credentials with their own).
It would be great if there is some ad-hoc feature for passing build artifacts across stages.
For reference, the GitLab CI artifacts is pretty perfect: https://docs.gitlab.com/ee/ci/yaml/#artifacts
Their `artifacts` from all previous stages are passed by default, but also can be fine-tuned with `dependencies`. | non_priority | passing build artifacts across stages we want to achieve a stage ci a build stage that compiles a program builds mac and linux a test stage that use the compiled program multiple builds oses number of different language versions e g python and python it is hard to pass the compiled program to the test stage approaches i thought of use travis s caching not feasible since the language python keys are different between the build stage and the test stage the cache wouldn t be shared according to don t even consider what would happen when there are parallel builds of different commits or when one restart a build of an older commit use custom it can work in our own repo however without exposing our credentials our forks wouldn t be able to use ci unless they override the credentials with their own it would be great if there is some ad hoc feature for passing build artifacts across stages for reference the gitlab ci artifacts is pretty perfect their artifacts from all previous stages are passed by default but also can be fine tuned with dependencies | 0 |
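The GitLab mechanism this record points to can be sketched roughly as follows. This is an illustrative `.gitlab-ci.yml` fragment only — the job names, artifact paths, and scripts are assumptions for the example, not taken from the issue:

```yaml
stages:
  - build
  - test

build:linux:
  stage: build
  script:
    - make
  artifacts:
    paths:
      - bin/myprogram        # hypothetical compiled output

test:python:
  stage: test
  script:
    - ./run-tests.sh bin/myprogram   # hypothetical test runner
  dependencies:
    - build:linux            # fetch artifacts only from this build job
```

With `dependencies` omitted, GitLab passes artifacts from all jobs of all previous stages by default, which is the behavior the issue asks Travis to mirror.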
7,999 | 4,120,203,029 | IssuesEvent | 2016-06-08 17:08:44 | Azure/azure-iot-gateway-sdk | https://api.github.com/repos/Azure/azure-iot-gateway-sdk | closed | Building on Raspbian - Raspberry PI 3 - Errors and workaround | build | Trying to build the gateway sources on a PI 3, using the **build.sh** script, fails with the following error:
`[ 43%] Building C object deps/azure-iot-sdks/c/azure-uamqp-c/tests/session_unittests/CMakeFiles/session_unittests_exe.dir/main.c.o
Linking CXX executable session_unittests_exe
Linking CXX executable mqtt_client_unittests_exe
[ 43%] Built target session_unittests_exe
[ 43%] Built target mqtt_client_unittests_exe
The bug is not reproducible, so it is likely a hardware or OS problem.
deps/azure-iot-sdks/c/azure-uamqp-c/tests/saslclientio_unittests/CMakeFiles/saslclientio_unittests_exe.dir/build.make:54: recipe for target 'deps/azure-iot-sdks/c/azure-uamqp-c/tests/saslclientio_unittests/CMakeFiles/saslclientio_unittests_exe.dir/saslclientio_unittests.cpp.o' failed
make[2]: *** [deps/azure-iot-sdks/c/azure-uamqp-c/tests/saslclientio_unittests/CMakeFiles/saslclientio_unittests_exe.dir/saslclientio_unittests.cpp.o] Error 1
CMakeFiles/Makefile2:4415: recipe for target 'deps/azure-iot-sdks/c/azure-uamqp-c/tests/saslclientio_unittests/CMakeFiles/saslclientio_unittests_exe.dir/all' failed
make[1]: *** [deps/azure-iot-sdks/c/azure-uamqp-c/tests/saslclientio_unittests/CMakeFiles/saslclientio_unittests_exe.dir/all] Error 2
Makefile:127: recipe for target 'all' failed
make: *** [all] Error 2
`
A manual build using **make -j $(nproc)** fails as well, whereas omitting the **-j $(nproc)** parameter lets the build compile successfully.
Afterwards, tests can be run manually using **ctest -j $(nproc) -C Debug --output-on-failure**; most of the tests will run successfully, except for **umocktypes_c_unittests**:
`99% tests passed, 1 tests failed out of 97
Total Test time (real) = 4.13 sec
The following tests FAILED:
12 - umocktypes_c_unittests (Failed)
Errors while running CTest`
I am going to test the gateway code further. Trying examples, etc.. If you need additional files like logs, let me know. | 1.0 | Building on Raspbian - Raspberry PI 3 - Errors and workaround - Trying to build the gateway sources on a PI 3, using the **build.sh** script, fails with the following error:
`[ 43%] Building C object deps/azure-iot-sdks/c/azure-uamqp-c/tests/session_unittests/CMakeFiles/session_unittests_exe.dir/main.c.o
Linking CXX executable session_unittests_exe
Linking CXX executable mqtt_client_unittests_exe
[ 43%] Built target session_unittests_exe
[ 43%] Built target mqtt_client_unittests_exe
The bug is not reproducible, so it is likely a hardware or OS problem.
deps/azure-iot-sdks/c/azure-uamqp-c/tests/saslclientio_unittests/CMakeFiles/saslclientio_unittests_exe.dir/build.make:54: recipe for target 'deps/azure-iot-sdks/c/azure-uamqp-c/tests/saslclientio_unittests/CMakeFiles/saslclientio_unittests_exe.dir/saslclientio_unittests.cpp.o' failed
make[2]: *** [deps/azure-iot-sdks/c/azure-uamqp-c/tests/saslclientio_unittests/CMakeFiles/saslclientio_unittests_exe.dir/saslclientio_unittests.cpp.o] Error 1
CMakeFiles/Makefile2:4415: recipe for target 'deps/azure-iot-sdks/c/azure-uamqp-c/tests/saslclientio_unittests/CMakeFiles/saslclientio_unittests_exe.dir/all' failed
make[1]: *** [deps/azure-iot-sdks/c/azure-uamqp-c/tests/saslclientio_unittests/CMakeFiles/saslclientio_unittests_exe.dir/all] Error 2
Makefile:127: recipe for target 'all' failed
make: *** [all] Error 2
`
A manual build using **make -j $(nproc)** fails as well, whereas omitting the **-j $(nproc)** parameter lets the build compile successfully.
Afterwards, tests can be run manually using **ctest -j $(nproc) -C Debug --output-on-failure**; most of the tests will run successfully, except for **umocktypes_c_unittests**:
`99% tests passed, 1 tests failed out of 97
Total Test time (real) = 4.13 sec
The following tests FAILED:
12 - umocktypes_c_unittests (Failed)
Errors while running CTest`
I am going to test the gateway code further. Trying examples, etc.. If you need additional files like logs, let me know. | non_priority | building on raspbian raspberry pi errors and workaround trying to build the gateway sources on a pi using the build sh script fails with the following error building c object deps azure iot sdks c azure uamqp c tests session unittests cmakefiles session unittests exe dir main c o linking cxx executable session unittests exe linking cxx executable mqtt client unittests exe built target session unittests exe built target mqtt client unittests exe the bug is not reproducible so it is likely a hardware or os problem deps azure iot sdks c azure uamqp c tests saslclientio unittests cmakefiles saslclientio unittests exe dir build make recipe for target deps azure iot sdks c azure uamqp c tests saslclientio unittests cmakefiles saslclientio unittests exe dir saslclientio unittests cpp o failed make error cmakefiles recipe for target deps azure iot sdks c azure uamqp c tests saslclientio unittests cmakefiles saslclientio unittests exe dir all failed make error makefile recipe for target all failed make error a manual build using make j nproc fails as well whereas omitting the j nproc parameter for make in general compiles successfully afterwards test can be run manually using ctest j nproc c debug output on failure most of the tests will run successfully except for the unmocktypes c unittests tests passed tests failed out of total test time real sec the following tests failed umocktypes c unittests failed errors while running ctest i am going to test the gateway code further trying examples etc if you need additional files like logs let me know | 0 |
58,421 | 8,257,946,063 | IssuesEvent | 2018-09-13 07:35:42 | sebcrozet/ncollide | https://api.github.com/repos/sebcrozet/ncollide | closed | Update the section about event handling on the user guide. | documentation | In particular, the callbacks for event handling are no longer present on ncollide 0.14. | 1.0 | Update the section about event handling on the user guide. - In particular, the callbacks for event handling are no longer present on ncollide 0.14. | non_priority | update the section about event handling on the user guide in particular the callbacks for event handling are no longer present on ncollide | 0 |
52,381 | 22,171,451,792 | IssuesEvent | 2022-06-06 01:28:25 | hashicorp/terraform-provider-azurerm | https://api.github.com/repos/hashicorp/terraform-provider-azurerm | reopened | Support for adding service account permissions while setting up azurerm_storage_sync_cloud_endpoint | question service/storage | ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Community Note
<!--- Please keep this note for the community --->
* Please vote on this issue by adding a :thumbsup: [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Description
When working with azurerm_storage_sync_cloud_endpoint there is a comment in the documentation that the service principal Microsoft.StorageSync needs to be granted "Reader and Data Access". This makes sense, however there does not seem to be any way to do this via terraform (at least not that I can find). Since this is a required step, it would seem to make sense that this should be done automatically, or at the very least there should be another way to complete this in a terraform step.
Two suggested options could be
Option 1: add support to the existing resource azurerm_storage_sync_cloud_endpoint to automatically add it.
Option 2: Add support for azuread_service_principal to retrieve built in service principals
```hcl
data "azuread_service_principal" "af_sasp" {
display_name = "Microsoft.StorageAccount"
}
resource "azurerm_role_assignment" "ad_role_storageaccount" {
scope = azurerm_storage_account.filesync.id
role_definition_name = "Reader and Data Access"
principal_id = data.azuread_service_principal.af_sasp.id
}
```
or
```hcl
data "azuread_service_principal" "af_sasp" {
display_name = "Microsoft.StorageSync"
}
resource "azurerm_role_assignment" "ad_role_storageaccount" {
scope = azurerm_storage_account.filesync.id
role_definition_name = "Reader and Data Access"
principal_id = data.azuread_service_principal.af_sasp.id
}
```
### New or Affected Resource(s)/Data Source(s)
azurerm_storage_sync_cloud_endpoint
### Potential Terraform Configuration
_No response_
### References
_No response_ | 1.0 | Support for adding service account permissions while setting up azurerm_storage_sync_cloud_endpoint - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Community Note
<!--- Please keep this note for the community --->
* Please vote on this issue by adding a :thumbsup: [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Description
When working with azurerm_storage_sync_cloud_endpoint there is a comment in the documentation that the service principal Microsoft.StorageSync needs to be granted "Reader and Data Access". This makes sense, however there does not seem to be any way to do this via terraform (at least not that I can find). Since this is a required step, it would seem to make sense that this should be done automatically, or at the very least there should be another way to complete this in a terraform step.
Two suggested options could be
Option 1: add support to the existing resource azurerm_storage_sync_cloud_endpoint to automatically add it.
Option 2: Add support for azuread_service_principal to retrieve built in service principals
```hcl
data "azuread_service_principal" "af_sasp" {
display_name = "Microsoft.StorageAccount"
}
resource "azurerm_role_assignment" "ad_role_storageaccount" {
scope = azurerm_storage_account.filesync.id
role_definition_name = "Reader and Data Access"
principal_id = data.azuread_service_principal.af_sasp.id
}
```
or
```hcl
data "azuread_service_principal" "af_sasp" {
display_name = "Microsoft.StorageSync"
}
resource "azurerm_role_assignment" "ad_role_storageaccount" {
scope = azurerm_storage_account.filesync.id
role_definition_name = "Reader and Data Access"
principal_id = data.azuread_service_principal.af_sasp.id
}
```
### New or Affected Resource(s)/Data Source(s)
azurerm_storage_sync_cloud_endpoint
### Potential Terraform Configuration
_No response_
### References
_No response_ | non_priority | support for adding service account permissions while setting up azurerm storage sync cloud endpoint is there an existing issue for this i have searched the existing issues community note please vote on this issue by adding a thumbsup to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment description when working with azurerm storage sync cloud endpoint there is a comment in the documentation that the service principal microsoft storagesync needs to be granted reader and data access this makes sense however there does not seem to be any way to do this via terraform at least not that i can find since this is a required step it would seem to make sense that this should be down automatically or at the very least there should be another way to complete this in a terraform step two suggested options could be option add support to the existing resource azurerm storage sync cloud endpoint to automatically add it option add support for azuread service principal to retrieve built in service principals hcl data azuread service principal af sasp display name microsoft storageaccount resource azurerm role assignment ad role storageaccount scope azurerm storage account filesync id role definition name reader and data access principal id data azuread service principal af sasp id or hcl data azuread service principal af sasp display name microsoft storagesync resource azurerm role assignment ad role storageaccount scope azurerm storage account filesync id role definition name reader and data access principal id data azuread service principal af sasp id new or affected resource s data source s azurerm storage sync cloud endpoint potential terraform configuration no response references no response | 0 
|
155,982 | 12,289,317,639 | IssuesEvent | 2020-05-09 20:53:52 | openshift/openshift-azure | https://api.github.com/repos/openshift/openshift-azure | closed | readiness failed on DeploymentConfig | kind/test-flake lifecycle/rotten | /kind test-flake
During a scale up/down, readiness failed during the sanity check:
`time="2019-12-10T13:48:45Z" level=error msg="templateinstance e2e-test-h3dty/e2e-test-h3dty failed: Failed (readiness failed on DeploymentConfig e2e-test-h3dty/django-psql-persistent)" func="github.com/openshift/openshift-azure/test/e2e/standard.(*SanityChecker).CreateTestApp" file="/home/prow/go/src/github.com/openshift/openshift-azure/test/e2e/standard/sanitychecker.go:85"` | 1.0 | readiness failed on DeploymentConfig - /kind test-flake
During a scale up/down, readiness failed during the sanity check:
`time="2019-12-10T13:48:45Z" level=error msg="templateinstance e2e-test-h3dty/e2e-test-h3dty failed: Failed (readiness failed on DeploymentConfig e2e-test-h3dty/django-psql-persistent)" func="github.com/openshift/openshift-azure/test/e2e/standard.(*SanityChecker).CreateTestApp" file="/home/prow/go/src/github.com/openshift/openshift-azure/test/e2e/standard/sanitychecker.go:85"` | non_priority | readiness failed on deploymentconfig kind test flake during scale up down readiness failed during sanity check time level error msg templateinstance test test failed failed readiness failed on deploymentconfig test django psql persistent func github com openshift openshift azure test standard sanitychecker createtestapp file home prow go src github com openshift openshift azure test standard sanitychecker go | 0 |
191 | 4,513,527,470 | IssuesEvent | 2016-09-04 10:20:46 | backdrop-ops/backdropcms.org | https://api.github.com/repos/backdrop-ops/backdropcms.org | opened | Project release node title not updated when GitHub release is changed. | type - github-automation | @docwilmot has made an initial release of https://github.com/backdrop-contrib/block_disabler and made the common mistake to tag the release as 1.x-1.x (https://github.com/backdrop-contrib/block_disabler/issues/1). Even though the release was edited and changed to 1.x-1.0.0, the respective project node on b.org still shows 1.x-1.x and points to the original release that 404s.
I know that we can manually edit the project release node on b.org and fix this, but can we please have it so that release nodes on b.org are updated upon GitHub release renames? Thanx.
PS: @Gormartsen haven't heard from you lately. Hope you're doing OK. | 1.0 | Project release node title not updated when GitHub release is changed. - @docwilmot has made an initial release of https://github.com/backdrop-contrib/block_disabler and made the common mistake to tag the release as 1.x-1.x (https://github.com/backdrop-contrib/block_disabler/issues/1). Even though the release was edited and changed to 1.x-1.0.0, the respective project node on b.org still shows 1.x-1.x and points to the original release that 404s.
I know that we can manually edit the project release node on b.org and fix this, but can we please have it so that release nodes on b.org are updated upon GitHub release renames? Thanx.
PS: @Gormartsen haven't heard from you lately. Hope you're doing OK. | non_priority | project release node title not updated when github release is changed docwilmot has made an initial release of and made the common mistake to tag the release as x x even though the release was edited and changed to x the respective project node on b org still shows x x and points to the original release that i know that we can manually edit the project release node on b org and fix this but can we please have it so that release nodes on b org are updated upon github release renames thanx ps gormartsen haven t heard from you lately hope you re doing ok | 0 |
28,449 | 12,840,675,958 | IssuesEvent | 2020-07-07 21:30:28 | terraform-providers/terraform-provider-azurerm | https://api.github.com/repos/terraform-providers/terraform-provider-azurerm | closed | azurerm_api_management_identity_provider_aad keeps updating client_secret | service/api-management | ### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
### Terraform (and AzureRM Provider) Version
Terraform v0.12.26
+ provider.azurerm v2.13.0
### Affected Resource(s)
azurerm_api_management_identity_provider_aad
### Terraform Configuration Files
```hcl
resource "azurerm_api_management_identity_provider_aad" "this" {
api_management_name = azurerm_api_management.this.name
resource_group_name = azurerm_resource_group.apim.name
client_id = azuread_application.apim.application_id
client_secret = random_password.apim.id
allowed_tenants = [local.tenant_id]
}
```
### Expected Behavior
Plan with no changes.
### Actual Behavior
Apply creates the resource; the subsequent plan wants to update client_secret:
```bash
...
azurerm_api_management_identity_provider_aad.this: Modifications complete after 1s [..]
...
terraform apply
...
# azurerm_api_management_identity_provider_aad.this will be updated in-place
~ resource "azurerm_api_management_identity_provider_aad" "this" {
allowed_tenants = [
"XXX",
]
api_management_name = "apim"
client_id = "xxx"
+ client_secret = (sensitive value)
id = "/subscriptions/xxx/resourceGroups/rn-name/providers/Microsoft.ApiManagement/service/apim/identityProviders/Aad"
resource_group_name = "rg-name"
}
```
### Steps to Reproduce
1. Create a plan with apim service and aad ID provider.
2. `terraform apply`
3. `terraform plan`
| 1.0 | azurerm_api_management_identity_provider_aad keeps updating client_secret - ### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
### Terraform (and AzureRM Provider) Version
Terraform v0.12.26
+ provider.azurerm v2.13.0
### Affected Resource(s)
azurerm_api_management_identity_provider_aad
### Terraform Configuration Files
```hcl
resource "azurerm_api_management_identity_provider_aad" "this" {
api_management_name = azurerm_api_management.this.name
resource_group_name = azurerm_resource_group.apim.name
client_id = azuread_application.apim.application_id
client_secret = random_password.apim.id
allowed_tenants = [local.tenant_id]
}
```
### Expected Behavior
Plan with no changes.
### Actual Behavior
Apply creates the resource; the subsequent plan wants to update client_secret:
```bash
...
azurerm_api_management_identity_provider_aad.this: Modifications complete after 1s [..]
...
terraform apply
...
# azurerm_api_management_identity_provider_aad.this will be updated in-place
~ resource "azurerm_api_management_identity_provider_aad" "this" {
allowed_tenants = [
"XXX",
]
api_management_name = "apim"
client_id = "xxx"
+ client_secret = (sensitive value)
id = "/subscriptions/xxx/resourceGroups/rn-name/providers/Microsoft.ApiManagement/service/apim/identityProviders/Aad"
resource_group_name = "rg-name"
}
```
### Steps to Reproduce
1. Create a plan with apim service and aad ID provider.
2. `terraform apply`
3. `terraform plan`
| non_priority | azurerm api management identity provider aad keeps updating client secret community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment terraform and azurerm provider version terraform provider azurerm affected resource s azurerm api management identity provider aad terraform configuration files hcl resource azurerm api management identity provider aad this api management name azurerm api management this name resource group name azurerm resource group apim name client id azuread application apim application id client secret random password apim id allowed tenants expected behavior plan with no changes actual behavior apply creates the resource the consecutive plan wants to update client secret bash azurerm api management identity provider aad this modifications complete after terraform apply azurerm api management identity provider aad this will be updated in place resource azurerm api management identity provider aad this allowed tenants xxx api management name apim client id xxx client secret sensitive value id subscriptions xxx resourcegroups rn name providers microsoft apimanagement service apim identityproviders aad resource group name rg name steps to reproduce create a plan with apim service and aad id provider terraform apply terraform plan | 0 |
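For readers hitting the same perpetual diff: the API Management service never returns the secret, so the provider cannot detect drift on it. One commonly used workaround sketch — using Terraform's standard `lifecycle` meta-argument, not an official fix from this thread — is to ignore changes to that attribute:

```hcl
resource "azurerm_api_management_identity_provider_aad" "this" {
  api_management_name = azurerm_api_management.this.name
  resource_group_name = azurerm_resource_group.apim.name
  client_id           = azuread_application.apim.application_id
  client_secret       = random_password.apim.result # random_password exposes the value as .result, not .id
  allowed_tenants     = [local.tenant_id]

  # The service never returns the secret, so ignore drift on it to
  # suppress the perpetual in-place update; rotate the secret by
  # explicitly replacing the resource instead.
  lifecycle {
    ignore_changes = [client_secret]
  }
}
```

The trade-off is that out-of-band changes to the secret will no longer be reconciled by `terraform plan`.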
41,577 | 6,919,579,185 | IssuesEvent | 2017-11-29 15:51:26 | samvera-labs/valkyrie | https://api.github.com/repos/samvera-labs/valkyrie | opened | Document how to create an array with its order preserved | documentation | I want to do something like:
``` ruby
class ThingWithAnOrderedAttribute < Valkyrie::Resource
attribute :ordered_array, Valkyrie::Types::Array
end
```
And know that the elements in `ordered_array` have their order preserved. It would also be doubly-nice if we could use something like:
``` ruby
class ThingWithOrderedOtherThings < Valkyrie::Resource
attribute :ordered_other_things, Types::Strict::Array.member(MyTypes::OtherThing)
end
```
310,240 | 26,707,711,241 | IssuesEvent | 2023-01-27 19:50:25 | MPMG-DCC-UFMG/F01 | https://api.github.com/repos/MPMG-DCC-UFMG/F01 | closed | Generalization test for the "Informações Institucionais - Leis Municipais" tag - Brumadinho | generalization test development template - ABO (21) subtag - Registro das Competências subtag - Leis Municipais | DoD: Run the generalization test of the validator for the "Informações Institucionais - Leis Municipais" tag for the municipality of Brumadinho.
190,955 | 14,589,514,239 | IssuesEvent | 2020-12-19 02:20:52 | red/red | https://api.github.com/repos/red/red | closed | unix-style LF + read/lines = trash being read | status.built status.tested test.written type.bug | **Describe the bug**
```
Red []
recycle/off
make-dir %tmp$
files: []
repeat i 100 [
append files f: rejoin [%tmp$/drw i ".red"]
write/binary f append/dup copy {^/} "11 " i
]
foreach f files [read/lines probe f]
```
Running this script I'm getting:
```
%tmp$/drw1.red
%tmp$/drw2.red
%tmp$/drw3.red
%tmp$/drw4.red
%tmp$/drw5.red
%tmp$/drw6.red
*** Access Error: invalid UTF-8 encoding: #{9867C408}
*** Where: read
*** Stack:
```
In `#{9867C408}` bytes 2 and 3 are random.
**Expected behavior**
No errors.
**Platform version**
```
Red 0.6.4 for Windows built 4-Dec-2020/7:41:48+03:00 commit #382dd4e
```
404,282 | 27,456,952,938 | IssuesEvent | 2023-03-02 22:16:38 | pfmc-assessments/pfmc_assessment_handbook | https://api.github.com/repos/pfmc-assessments/pfmc_assessment_handbook | opened | add more info on reasons for choosing discard approach? | documentation question | In two independent settings today I was asked about approaches to modeling discards. The section in our handbook on discards, which @chantelwetzel-noaa recently expanded, has lots of really valuable detail about the different ways to model them: https://pfmc-assessments.github.io/pfmc_assessment_handbook/01-data-sources.html#notes-and-best-practices-for-observer-data-and-discards.
However, I think it would be useful to provide a bit more guidance on why you would choose to estimate them within the model vs. outside it. Here's a rough representation of my thinking on this subject. Missing from this are the nuances of recreational data, where we have things like death-by-depth estimates applied by the states.
Primary reason for discarding | What to do if discards are high or length comps differ | What to do if discards are low or length comps very similar
-- | -- | --
Catch limits or lack of market for all sizes (e.g. dogfish) | Model separate discard fleet (challenge is extrapolation to period without discard data) | Combine discard and retained catch and ignore discard comps or combine retained + discard comps
Size limits or lack of market for small fish (e.g. lingcod) | Model retention within the assessment (probably needs to be time-varying) | Combine discard and retained catch and ignore discard comps or combine retained + discard comps
Does this reasoning match how other people are thinking about it? Would it be useful to add?
Tagging @31ingrid and @brianlangseth-NOAA.
32,057 | 8,790,658,315 | IssuesEvent | 2018-12-21 09:50:12 | openshiftio/openshift.io | https://api.github.com/repos/openshiftio/openshift.io | closed | Build Service: Connect with fabric8-environement | area/pipelines sprint/current team/build-cd type/task |
Get json schema from fabric8-environement service
Fabric8-Build issue: https://github.com/fabric8-services/fabric8-build-service/issues/41
Get json schema from fabric8-environement service
Fabric8-Build issue: https://github.com/fabric8-services/fabric8-build-service/issues/41
151,113 | 19,648,495,301 | IssuesEvent | 2022-01-10 01:52:18 | Thezone1975/tabliss | https://api.github.com/repos/Thezone1975/tabliss | opened | WS-2019-0605 (Medium) detected in opennmsopennms-source-23.0.0-1 | security vulnerability | ## WS-2019-0605 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-23.0.0-1</b></p></summary>
<p>
<p>A Java based fault and performance management system</p>
<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/tabliss/node_modules/node-sass/src/libsass/src/lexer.cpp</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In sass versions between 3.2.0 and 3.6.3, libsass may read 1 byte outside an allocated buffer while parsing a specially crafted CSS rule.
<p>Publish Date: 2019-07-16
<p>URL: <a href=https://github.com/sass/libsass/commit/7a21c79e321927363a153dc5d7e9c492365faf9b>WS-2019-0605</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.2</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://osv.dev/vulnerability/OSV-2020-734">https://osv.dev/vulnerability/OSV-2020-734</a></p>
<p>Release Date: 2019-07-16</p>
<p>Fix Resolution: 3.6.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
150,039 | 13,308,180,018 | IssuesEvent | 2020-08-26 00:11:21 | nwg-piotr/autotiling | https://api.github.com/repos/nwg-piotr/autotiling | closed | Wrong manual install instructions and AUR package | documentation packaging | Manual install is not possible according to instructions because of #14
```
==> Checking for dependencies
==> Making package: autotiling-git r64.126c07d-1 (Ne 23. srpna 2020, 00:15:53)
==> Checking runtime dependencies...
==> Checking buildtime dependencies...
==> Retrieving sources...
-> Updating autotiling git repo...
Fetching origin
==> Validating source files with md5sums...
autotiling ... Skipped
==> Extracting sources...
-> Creating working copy of autotiling git repo...
Switched to a new branch 'makepkg'
==> Starting pkgver()...
==> Removing existing $pkgdir/ directory...
==> Entering fakeroot environment...
==> Starting package()...
install: cannot stat 'autotiling.py': No such file or directory
==> ERROR: A failure occurred in package().
Aborting...
```
26,231 | 19,759,597,487 | IssuesEvent | 2022-01-16 07:08:14 | WordPress/performance | https://api.github.com/repos/WordPress/performance | closed | Implement GitHub action to ensure necessary labels and milestone are present on pull requests | [Type] Enhancement Infrastructure no milestone | Based on https://github.com/WordPress/performance/issues/41#issuecomment-994746944, we should have a GitHub workflow that ensures that every pull request has the necessary labels and milestones. This can later be set up as a requirement that has to be met before a pull request can be merged.
Here are the specifications that the GitHub workflow should implement (based on #41 as well):
* Every pull request must have one of the `[Type] xyz` labels. This is used to group pull requests in the changelog.
* Every pull request must have _either_ one of the `[Focus] xyz` labels _or_ the `Infrastructure` label. This is used to further categorize and prefix pull requests in the changelog.
* Every pull request must have _either_ a milestone set _or_ be marked with the `no milestone` label. This is used to define which release a pull request belongs to, or to explicitly define that a certain issue is "release-irrelevant" and therefore does not need a milestone.
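The three rules could be sketched as a small validation function (hypothetical names; the real check would run as a GitHub Actions workflow reading the pull request's labels and milestone from the event payload):

```python
def check_pr(labels, milestone):
    """Collect rule violations for a pull request.

    labels: set of label-name strings from the PR; milestone: milestone
    title or None. Mirrors the three rules: a [Type] label, a [Focus] or
    Infrastructure label, and a milestone or the `no milestone` label.
    """
    errors = []
    if not any(name.startswith("[Type]") for name in labels):
        errors.append("needs one of the [Type] labels")
    if not any(name.startswith("[Focus]") for name in labels) and "Infrastructure" not in labels:
        errors.append("needs a [Focus] label or the Infrastructure label")
    if milestone is None and "no milestone" not in labels:
        errors.append("needs a milestone or the `no milestone` label")
    return errors
```

An empty result would mean the pull request is mergeable under these rules; a non-empty list could be printed and used to fail the workflow step.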
118,963 | 15,383,336,899 | IssuesEvent | 2021-03-03 02:31:53 | microsoft/vscode | https://api.github.com/repos/microsoft/vscode | closed | Getting started: NVDA does not allow up / down navigation | *as-designed accessibility | Refs: #117313
1. Turn on NVDA screen reader on windows
2. VS Code -> Getting Started
3. Try to use up / down to navigate between getting started buttons -> does not work 🐛
NVDA seems to be eating up the up and down keys.
I am not sure why that is the case. Probably because NVDA thinks you are in Browse mode and then does not allow up / down.
@MarcoZehe might have an idea
40,108 | 5,183,251,492 | IssuesEvent | 2017-01-20 00:03:33 | codeforamerica/syracuse_biz_portal | https://api.github.com/repos/codeforamerica/syracuse_biz_portal | closed | Account page wireframes | design specs feature ready | These are designs for the account page. We've decided that the other stuff (notebooks/checklists/etc) will live on its own page.
# On load
## Mobile

## Desktop

- [ ] Display name
- [ ] Display email
- [ ] Button to change password
- [ ] Desktop is using the 7 column grid for content (same as the category and step pages), but we can adjust this if it looks weird
---
# Change password


- [ ] Explain password requirements, using whatever language we use at sign up
- [ ] Enter old password (I'm assuming this is best practice, but correct me if I'm wrong)
- [ ] Enter new password twice
- [ ] Real time validation for password complexity and/or password matching would be great! If not, regular validation
- [ ] Save button that takes you to confirmation
- [ ] Cancel button (secondary button style) that takes you to load state
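The checks above (old password, match, complexity) could be backed by something like this sketch — the complexity rules are placeholders, since the wireframes say to explain the password requirements but don't specify what they are:

```python
import re

def validate_password_change(old_password_ok, new_password, confirmation):
    """Collect validation errors for the change-password form.

    old_password_ok: result of the server-side check of the old password.
    The length/digit rules below are placeholder requirements.
    """
    errors = []
    if not old_password_ok:
        errors.append("old password is incorrect")
    if new_password != confirmation:
        errors.append("new passwords do not match")
    if len(new_password) < 8 or not re.search(r"\d", new_password):
        errors.append("password must be at least 8 characters and include a digit")
    return errors
```

Returning a list (rather than failing on the first problem) fits the real-time validation idea: the UI can show every unmet requirement at once.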
---
# Confirmation

- [ ] Confirm successful change
102,860 | 12,827,681,416 | IssuesEvent | 2020-07-06 18:59:03 | cityofaustin/techstack | https://api.github.com/repos/cityofaustin/techstack | closed | Assigning departments to topics and topic collections? | Content type: Topic Page Content type: Topic collection page Feature: Staff Permissions & Roles Joplin Alpha Site Content Team: Content Team: Design | We're implemented department permissions that make it so that a user selects a department when creating a page, so that it has departmental permissions. But we're thinking of limiting the creation of topic and topic collections to admins, who will decide the direction of the IA. Does this mean we don't need this section here?
 | 1.0 | Assigning departments to topics and topic collections? - We're implemented department permissions that make it so that a user selects a department when creating a page, so that it has departmental permissions. But we're thinking of limiting the creation of topic and topic collections to admins, who will decide the direction of the IA. Does this mean we don't need this section here?
 | non_priority | assigning departments to topics and topic collections we re implemented department permissions that make it so that a user selects a department when creating a page so that it has departmental permissions but we re thinking of limiting the creation of topic and topic collections to admins who will decide the direction of the ia does this mean we don t need this section here | 0 |
317,545 | 23,678,542,016 | IssuesEvent | 2022-08-28 13:02:32 | rashmi-carol-dsouza/baklava-or-not-baklava | https://api.github.com/repos/rashmi-carol-dsouza/baklava-or-not-baklava | closed | Design Web App Architecture | documentation | # Web App Architecture
## UI

## Local System Design

## General Requirements
- User opens app which uses a camera to take a picture
- User clicks picture
- UI shows loading state
- UI indicates - Yes it's Baklava or No It's Not
## Specific Requirements
- [x] Working on Larry's and Rashmi's cell phones
## Components
- [x] Web Server: Flask API, Postman for testing
- [x] DB: MongoDB
## Data Entities
- [x] Photo
- [x] Image Filename: `upload-uuid.jpg`
- [x] Metadata: `{ id: String, isBaklava: Boolean }`
## API Design
- [x] `POST /photo` - https://stackoverflow.com/a/28027640
- [x] Response: JSON: { isBaklava: Boolean }
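The Photo entity and the `POST /photo` contract above can be tied together in a short sketch (Flask wiring omitted; `classify` is a stub, since the issue doesn't specify the model):

```python
import json
import uuid

def classify(image_bytes):
    """Stub for the baklava classifier -- placeholder logic only:
    the real model is out of scope for this sketch."""
    return image_bytes.startswith(b"BAKLAVA")

def handle_photo(image_bytes):
    """Sketch of the POST /photo handler: build the Photo metadata
    entity, the upload filename, and the JSON response body."""
    metadata = {"id": str(uuid.uuid4()), "isBaklava": classify(image_bytes)}
    filename = "upload-%s.jpg" % metadata["id"]  # Image Filename: upload-uuid.jpg
    body = json.dumps({"isBaklava": metadata["isBaklava"]})  # response contract
    return body, metadata, filename
```

In the real app the handler would read the image from the multipart request, store the file and metadata in MongoDB, and return `body` with a 200 status.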
128,390 | 17,532,554,203 | IssuesEvent | 2021-08-12 00:33:13 | Subscribie/subscribie | https://api.github.com/repos/Subscribie/subscribie | closed | Send emails asynchronously | needs-user-story needs-design needs-qa needs-marketing | **Is your feature request related to a problem? Please describe.**
Don't block on email sending
**Describe the solution you'd like**
Move the task of sending email to external system
**Describe alternatives you've considered**
**Additional context**
Add any other context or screenshots about the feature request here.
Ref
- #495
- #654
24,040 | 3,874,930,758 | IssuesEvent | 2016-04-11 22:19:07 | Microsoft/TypeScript | https://api.github.com/repos/Microsoft/TypeScript | reopened | Suggestion Backlog Slog, 4/11/2016 | Design Notes |
* #2671 Allow omitted elements in tuple type annotations
* #1579 `nameof` operator
* #3802 Allow merging classes and modules across files
* #2957 Reopen static and instance side of classes
* #7285 Allow subclass constructors without super() call
* #7657 Type guards in Array.prototype.filter
* #7738 Execute property initializer expressions at the expected time
* #7763 implicitly infer parameter types for the implementation signature of an overloaded method
* #7485 All Optional Object Interface
* #4142 Contextually type IIFE parameters by their arguments
* #7782 Use LHS type parameters as inference sites for RHS call/new expr
* #4325 dotted names in export assignments
* #3716 Down-level lexical 'arguments' binding for arrow functions
179,529 | 13,885,448,199 | IssuesEvent | 2020-10-18 20:03:23 | daniel-norris/neu_ui | https://api.github.com/repos/daniel-norris/neu_ui | opened | Create a CardText test | good first issue hacktoberfest tests | **Is your feature request related to a problem? Please describe.**
Test coverage across the application is low. We need to build confidence that the components have the expected behaviour that we want and to help mitigate any regression in the future.
**Describe the solution you'd like**
We need to implement better test coverage across the library. Ideally each component should be accompanied by a test case written using Jest. At a minimum the test should check whether the component successfully shows any child props. If you are able to include tests for any additional functionality then that would be appreciated!
We should have a test case covering the CardText component implemented using Jest. More info on Jest can be found here.
For examples of how this is done, take a look at existing test cases in the library. An example would be the CardHeader or Input components.
You can run your tests using `npm run test` or to see test coverage across the library `npm run test:cov`.
This is part of epic #19.
22,046 | 3,932,373,810 | IssuesEvent | 2016-04-25 15:32:44 | NetsBlox/NetsBlox | https://api.github.com/repos/NetsBlox/NetsBlox | closed | periodically failing tests on travis | bug minor testing under review | Sometimes the tests will fail on travis because it can't get phantomjs. Caching the node_modules directory should help with this
286,207 | 24,729,818,020 | IssuesEvent | 2022-10-20 16:36:15 | pytorch/pytorch | https://api.github.com/repos/pytorch/pytorch | reopened | DISABLED test_noncontiguous_samples_nn_functional_conv_transpose2d_cuda_float32 (__main__.TestCommonCUDA) | module: flaky-tests skipped module: unknown | Platforms: linux
This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_noncontiguous_samples_nn_functional_conv_transpose2d_cuda_float32&suite=TestCommonCUDA) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/8751614432).
Over the past 3 hours, it has been determined flaky in 5 workflow(s) with 7 failures and 5 successes.
**Debugging instructions (after clicking on the recent samples link):**
DO NOT BE ALARMED IF THE CI IS GREEN. We now shield flaky tests from developers so CI will thus be green but it will be harder to parse the logs.
To find relevant log snippets:
1. Click on the workflow logs linked above
2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work.
3. Grep for `test_noncontiguous_samples_nn_functional_conv_transpose2d_cuda_float32`
4. There should be several instances run (as flaky tests are rerun in CI) from which you can study the logs.
295,920 | 22,283,484,896 | IssuesEvent | 2022-06-11 08:28:36 | a2n-s/scripts | https://api.github.com/repos/a2n-s/scripts | closed | Correct the headers and the doc | documentation enhancement good first issue | As scripts have been renamed recently, the headers of the scripts and the mentions to the old file names are incorrect.
This issue proposes to **correct the headers and to change the mentions to the old names to the new ones**.
**IMPLEMENTATION DETAILS**:
let's take the example of the `battery` script.
1. it has been renamed from `./scripts/battery.sh` to `./scripts/a2n-s-battery` -> change every occurrence of `battery.sh` to `a2n-s-battery`. below is an extract of the diff:
```diff
@@ -145,3 +146,3 @@ help () {
#
- echo "battery.sh:"
+ echo "a2n-s-battery:"
echo " a script to check the state of the battery and throw appropriate notifications."
@@ -149,3 +150,3 @@ help () {
echo "Usage:"
- echo " battery.sh [-hcp]"
+ echo " a2n-s-battery [-hcp]"
echo ""
```
2. correct the header with the following diff
```diff
diff --git a/scripts/a2n-s-battery b/scripts/a2n-s-battery
index 057d33e..3e92545 100755
--- a/scripts/a2n-s-battery
+++ b/scripts/a2n-s-battery
@@ -3,27 +3,29 @@
# __ _|_ )_ _ ___ ___ personal page: https://a2n-s.github.io/
# / _` |/ /| ' \___(_-< github page: https://github.com/a2n-s
# \__,_/___|_||_| /__/ my dotfiles: https://github.com/a2n-s/dotfiles
-# _ __ _ __ _ _ _ _
-# | | / / | |__ / / | |__ __ _| |_| |_ ___ _ _ _ _ __| |_
-# _| | / / | '_ \ / / | '_ \/ _` | _| _/ -_) '_| || |_(_-< ' \
-# (_)_| /_/ |_.__/ /_/ |_.__/\__,_|\__|\__\___|_| \_, (_)__/_||_|
-# |__/
+# ___ _ _ _
+# __ _|_ )_ _ ___ ______| |__ __ _| |_| |_ ___ _ _ _ _
+# / _` |/ /| ' \___(_-<___| '_ \/ _` | _| _/ -_) '_| || |
+# \__,_/___|_||_| /__/ |_.__/\__,_|\__|\__\___|_| \_, |
+# |__/
```
79,619 | 10,134,213,761 | IssuesEvent | 2019-08-02 06:49:49 | kyma-project/console | https://api.github.com/repos/kyma-project/console | closed | First header in documentation is ignored | area/console area/documentation bug | <!-- Thank you for your contribution. Before you submit the issue:
1. Search open and closed issues for duplicates.
2. Read the contributing guidelines.
-->
**Description**
<!-- Provide a clear and concise description of the problem.
Describe where it appears, when it occurred, and what it affects. -->
<!-- Provide relevant technical details such as the Kubernetes version, the cluster name and provider, the Kyma version, the browser name and version, or the operating system. -->
When the header is in the first line of a document, it is ignored. This can be a common case when external documentation is used.
**Expected result**
<!-- Describe what you expect to happen. -->

**Actual result**
<!-- Describe what happens instead. -->

**Steps to reproduce**
<!-- List the steps to follow to reproduce the bug. Attach any files, links, code samples, or screenshots that could help in investigating the problem. -->
Use the following markdown:
```markdown
## First Header
Some text
### Next Header
Some additional text
```
**Troubleshooting**
<!-- Describe the steps you have already taken to solve the issue. -->
24,307 | 12,259,119,485 | IssuesEvent | 2020-05-06 16:06:03 | beakerbrowser/beaker | https://api.github.com/repos/beakerbrowser/beaker | closed | High CPU usage on loading a large file | performance | Operating System: Ubuntu 16.04
Beaker Version: 0.7.9
I have a [large TiddlyWiki file](https://www.dropbox.com/s/fzf67dpktmfau0d/notes.html?dl=1) (55MB due to a lot of embedded images) that I want to open using Beaker. Opening the HTML file works as expected, but if I add it as a site's index.html and open the site, I get some problems: The page shows up and behaves as expected, but the loading indicator continues indefinitely, and meanwhile Beaker uses all of my CPU.
I do not experience these problems with a fresh TiddlyWiki, so the problem must be something with this specific file. Does Beaker have a problem with large files in general?
My next step is to try gradually importing the content into a new TiddlyWiki to see when the problem begins to occur, but this would take ages, so if anyone can help me figure out what is causing Beaker to get stuck like this, it would be greatly appreciated.
265,491 | 20,100,341,645 | IssuesEvent | 2022-02-07 02:45:39 | tanyaleepr/portfolio-generator | https://api.github.com/repos/tanyaleepr/portfolio-generator | opened | Prompt user for more input | documentation | **Description**
_Profile questions_
- Name
- GitHub account name
- About me
_Project questions_
- Project name
- Project description
- Programming Languages
- Project link
209,668 | 23,730,737,894 | IssuesEvent | 2022-08-31 01:18:24 | melsorg/github-scanner-test | https://api.github.com/repos/melsorg/github-scanner-test | closed | WS-2015-0018 (Medium) detected in semver-1.1.4.tgz - autoclosed | security vulnerability | ## WS-2015-0018 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>semver-1.1.4.tgz</b></p></summary>
<p>The semantic version parser used by npm.</p>
<p>Library home page: <a href="https://registry.npmjs.org/semver/-/semver-1.1.4.tgz">https://registry.npmjs.org/semver/-/semver-1.1.4.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/github-scanner-test/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/github-scanner-test/node_modules/semver/package.json</p>
<p>
Dependency Hierarchy:
- npm2es-0.4.5.tgz (Root Library)
- npm-normalize-0.2.9.tgz
- :x: **semver-1.1.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/melsorg/github-scanner-test/commit/d78b9e5410cf312856c3d176abc7fe70ea70dc53">d78b9e5410cf312856c3d176abc7fe70ea70dc53</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Semver is vulnerable to regular expression denial of service (ReDoS) when extremely long version strings are parsed.
<p>Publish Date: 2015-04-04
<p>URL: <a href=https://github.com/npm/node-semver/commit/c80180d8341a8ada0236815c29a2be59864afd70>WS-2015-0018</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nodesecurity.io/advisories/31">https://nodesecurity.io/advisories/31</a></p>
<p>Release Date: 2015-04-04</p>
<p>Fix Resolution: Update to a version 4.3.2 or greater</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
27,897 | 22,587,842,992 | IssuesEvent | 2022-06-28 16:46:23 | LanDinh/khaleesi-ninja | https://api.github.com/repos/LanDinh/khaleesi-ninja | opened | Use named ports for gRPC probes | layer:infrastructure waiting | As per kubernetes 1.24, it's not possible to use named ports for gRPC probes
47,527 | 12,044,043,508 | IssuesEvent | 2020-04-14 13:26:37 | gradle/gradle | https://api.github.com/repos/gradle/gradle | closed | Fix NullPointerExceptions for Linux watchers | @build-cache a:bug from:member | Currently, we receive `NullPointerException`s when trying to update the watchers on Linux. The exception happens here: https://github.com/gradle/gradle/blob/f5d92e72e2c13c82071b199ad44231642febb5b0/subprojects/core/src/main/java/org/gradle/internal/vfs/LinuxFileWatcherRegistry.java#L65
It seems like we don't have the directories which we used when we added the snapshot, so something may be wrong with our bookkeeping.
This is happening mostly in relocation tests, e.g. in [Antlr3RelocationIntegrationTest.project is relocatable](https://builds.gradle.org/viewLog.html?buildId=33467324&buildTypeId=Gradle_Check_VfsRetention_28_bucket41).
---
cc: @gradle/build-cache
23,720 | 7,370,586,981 | IssuesEvent | 2018-03-13 09:00:41 | scalacenter/bloop | https://api.github.com/repos/scalacenter/bloop | closed | install.py in release is broken | bug build install | The [install.py](https://github.com/scalacenter/bloop/releases/download/v1.0.0-M6/install.py) that makes it into the release assets on GitHub is broken as it contains python expressions before the almighty import `from __future__`.
The following should just be moved below the imports.
```
NAILGUN_COMMIT = "ebe7ab23"
BLOOP_VERSION = "1.0.0-M6"
```
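The constraint behind this report — a `from __future__` import is only legal before any other statement in a module — can be demonstrated directly with Python's built-in `compile()`. This sketch only illustrates the rule; it is not bloop's actual fix:

```python
# A `from __future__` import must be the first statement after the module
# docstring; any expression before it is a SyntaxError at compile time.
broken = (
    'NAILGUN_COMMIT = "ebe7ab23"\n'
    "from __future__ import print_function\n"
)
fixed = (
    "from __future__ import print_function\n"
    'NAILGUN_COMMIT = "ebe7ab23"\n'
)


def compiles(source):
    """True if `source` parses as a valid Python module."""
    try:
        compile(source, "<install.py>", "exec")
        return True
    except SyntaxError:
        return False
```

Moving the assignments below the imports, as the issue suggests, is exactly the `broken` → `fixed` reordering above.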
175,232 | 14,519,514,640 | IssuesEvent | 2020-12-14 03:03:44 | SpencerTSterling/ColorGame | https://api.github.com/repos/SpencerTSterling/ColorGame | closed | Core Game Mechanics | documentation enhancement | These mechanics are necessary for both game modes.
Basic Game Mechanics:
- Buttons need to be randomly generated colors
- A button needs to be randomly chosen to be the "odd one out"
- Points are rewarded when the correct button is pressed (+10 points)
- OPTIONALLY: more points are rewarded if the correct button is pressed in a short amount of time (approximately 2 seconds for an extra 5 points).
Some of the methods needed:
- [x] GenerateColor to generate the RGB code
- [x] AssignCorrectButton to select which button will be the correct one pressed
- [x] SetupGameBoard to change game button colors, giving an altered color to the correct button
- [x] CheckCorrect to check if the player selected the correct button
- [x] UpdateScore to update the player’s score when they press the right button
- [x] GameOver to inform the player if they won or lost
- [ ] OPTIONALLY: RewardExtraPoints when a player presses the right button quickly
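A minimal Python sketch of the listed methods (snake_case stand-ins for the names above; the +10 base, the +5 quick-press bonus, and the ~2-second window come from the issue, everything else is assumed):

```python
import random


def generate_color(rng=random):
    """GenerateColor: one random RGB triple for a button."""
    return tuple(rng.randrange(256) for _ in range(3))


def assign_correct_button(button_count, rng=random):
    """AssignCorrectButton: pick which button is the odd one out."""
    return rng.randrange(button_count)


def check_correct(pressed, correct):
    """CheckCorrect: did the player press the odd-one-out button?"""
    return pressed == correct


def update_score(score, elapsed_seconds, base=10, fast_bonus=5, fast_window=2.0):
    """UpdateScore (+10), with the optional quick-press bonus (+5 within ~2 s)."""
    bonus = fast_bonus if elapsed_seconds <= fast_window else 0
    return score + base + bonus
```

`SetupGameBoard` and `GameOver` are UI-side and left out; the functions above cover only the scoring and selection logic the checklist names.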
194,085 | 15,396,798,890 | IssuesEvent | 2021-03-03 21:10:19 | antoinezanardi/werewolves-assistant-web | https://api.github.com/repos/antoinezanardi/werewolves-assistant-web | closed | Update README.md | documentation | Update the `README.md` file with:
- Travis explanations for CI.
- New `npm` version on live server. | 1.0 | Update README.md - Update the `README.md` file with:
- Travis explanations for CI.
- New `npm` version on live server. | non_priority | update readme md update the readme md file with travis explanations for ci new npm version on live server | 0 |
220,606 | 17,210,387,021 | IssuesEvent | 2021-07-19 02:50:32 | dapr/dapr | https://api.github.com/repos/dapr/dapr | reopened | E2E test to verify trace export | P2 area/test/e2e kind/observability size/XS triaged/resolved | <!-- If you need to report a security issue with Dapr, send an email to daprct@microsoft.com. -->
## In what area(s)?
<!-- Remove the '> ' to select -->
/area test-and-release
## Describe the feature
As part of #2337 we want to have E2E tests to make sure traces are correctly exported.
## Release Note
<!-- How should the fix for this issue be communicated in our release notes? It can be populated later. -->
<!-- Keep it as a single line. Examples: -->
<!-- RELEASE NOTE: **ADD** New feature in Dapr. -->
<!-- RELEASE NOTE: **FIX** Bug in runtime. -->
<!-- RELEASE NOTE: **UPDATE** Runtime dependency. -->
RELEASE NOTE: **ADD** E2E test to verify trace export
| 1.0 | E2E test to verify trace export - <!-- If you need to report a security issue with Dapr, send an email to daprct@microsoft.com. -->
## In what area(s)?
<!-- Remove the '> ' to select -->
/area test-and-release
## Describe the feature
As part of #2337 we want to have E2E tests to make sure traces are correctly exported.
## Release Note
<!-- How should the fix for this issue be communicated in our release notes? It can be populated later. -->
<!-- Keep it as a single line. Examples: -->
<!-- RELEASE NOTE: **ADD** New feature in Dapr. -->
<!-- RELEASE NOTE: **FIX** Bug in runtime. -->
<!-- RELEASE NOTE: **UPDATE** Runtime dependency. -->
RELEASE NOTE: **ADD** E2E test to verify trace export
| non_priority | test to verify trace export in what area s to select area test and release describe the feature as part of we want to have tests to make sure traces are correctly exported release note release note add test to verify trace export | 0 |
13,837 | 3,362,943,364 | IssuesEvent | 2015-11-20 09:38:51 | redmatrix/hubzilla | https://api.github.com/repos/redmatrix/hubzilla | closed | Editing a post will remove community tags and saved folders from post | bug retest please UX | Have not investigated further yet. | 1.0 | Editing a post will remove community tags and saved folders from post - Have not investigated further yet. | non_priority | editing a post will remove community tags and saved folders from post have not investigated further yet | 0 |
349,833 | 24,957,583,062 | IssuesEvent | 2022-11-01 13:09:04 | atorus-research/Tplyr | https://api.github.com/repos/atorus-research/Tplyr | closed | Typo in `f_str()` documentation on valid variables | documentation | `f_str()` docs say 'variance' instead of 'var' for the variance summary in desc layers.
https://github.com/atorus-research/Tplyr/blob/03f7650d932fd2fb5b9609c77ac6b220c991a0c7/R/format.R#L83 | 1.0 | Typo in `f_str()` documentation on valid variables - `f_str()` docs say 'variance' instead of 'var' for the variance summary in desc layers.
https://github.com/atorus-research/Tplyr/blob/03f7650d932fd2fb5b9609c77ac6b220c991a0c7/R/format.R#L83 | non_priority | typo in f str documentation on valid variables f str docs say variance instead of var for the variance summary in desc layers | 0 |
124,227 | 10,300,478,575 | IssuesEvent | 2019-08-28 12:58:59 | wsi-cogs/frontend | https://api.github.com/repos/wsi-cogs/frontend | closed | Meaning of the deadlines is unclear to users | UX/UI user-testing | The five deadlines that make up a rotation are only described in a single short phrase each in the application, and there's no documentation of them elsewhere either. Members of the Graduate Office are sometimes unsure of the current state of the rotation, which this won't be helping.
There should be some kind of information somewhere that describes what each of the deadlines actually means. | 1.0 | Meaning of the deadlines is unclear to users - The five deadlines that make up a rotation are only described in a single short phrase each in the application, and there's no documentation of them elsewhere either. Members of the Graduate Office are sometimes unsure of the current state of the rotation, which this won't be helping.
There should be some kind of information somewhere that describes what each of the deadlines actually means. | non_priority | meaning of the deadlines is unclear to users the five deadlines that make up a rotation are only described in a single short phrase each in the application and there s no documentation of them elsewhere either members of the graduate office are sometimes unsure of the current state of the rotation which this won t be helping there should be some kind of information somewhere that describes what each of the deadlines actually means | 0 |
115,685 | 9,809,131,958 | IssuesEvent | 2019-06-12 17:11:57 | firebase/firebase-ios-sdk | https://api.github.com/repos/firebase/firebase-ios-sdk | closed | ABTExperimentsToSetFromPayloads crash | api: abtesting api: analytics api: remoteconfig | Our most common crash now is a crash in what seems to be something related to remote config. In Crashlytics its title is `ABTExperimentsToSetFromPayloads` and this is the message:
```
Fatal Exception: NSGenericException
*** Collection <__NSArrayM: 0x281ca66d0> was mutated while being enumerated.
ABTExperimentsToSetFromPayloads
```
We are unable to reproduce it ourselves, but the crash happens just after users start the app, according to our logs. We sometimes get a few logs such as `session_started`, so it's most likely not at the very start, but just after it. There seems to be no pattern of iOS version or device type, etc.
The first crash I can find regarding this was with the Firebase SDK `5.2.1`, but it could have started before that version as well. We are seeing this crash more frequently now, but that is most likely related to more users on our platform.
### Stacktrace
```
Fatal Exception: NSGenericException
0 CoreFoundation 0x2251d4518 __exceptionPreprocess
1 libobjc.A.dylib 0x2243af9f8 objc_exception_throw
2 CoreFoundation 0x2251d3de0 -[__NSSingleObjectEnumerator initWithObject:]
3 Equilab 0x10325946c ABTExperimentsToSetFromPayloads
4 Equilab 0x103259c54 -[FIRExperimentController updateExperimentsInBackgroundQueueWithServiceOrigin:events:policy:lastStartTime:payloads:]
5 Equilab 0x103259b14 __99-[FIRExperimentController updateExperimentsWithServiceOrigin:events:policy:lastStartTime:payloads:]_block_invoke
6 libdispatch.dylib 0x224c14a38 _dispatch_call_block_and_release
7 libdispatch.dylib 0x224c157d4 _dispatch_client_callout
8 libdispatch.dylib 0x224bf8afc _dispatch_root_queue_drain
9 libdispatch.dylib 0x224bf9248 _dispatch_worker_thread2
10 libsystem_pthread.dylib 0x224df51b4 _pthread_wqthread
11 libsystem_pthread.dylib 0x224df7cd4 start_wqthrea
```
### Environment
* Xcode version: 10.2
* Firebase SDK version: 5.20.2
* Firebase Component: Remote config? Not sure which component it is.
* Component version: N/A But we are running latest version of all components under main lib version 5.20.2.
ABTExperimentsToSetFromPayloads crash - Our most common crash now is a crash in what seems to be something related to remote config. In Crashlytics its title is `ABTExperimentsToSetFromPayloads` and this is the message:
```
Fatal Exception: NSGenericException
*** Collection <__NSArrayM: 0x281ca66d0> was mutated while being enumerated.
ABTExperimentsToSetFromPayloads
```
We are unable to reproduce it ourselves, but the crash happens just after users start the app, according to our logs. We sometimes get a few logs such as `session_started`, so it's most likely not at the very start, but just after it. There seems to be no pattern of iOS version or device type, etc.
The first crash I can find regarding this was with the Firebase SDK `5.2.1`, but it could have started before that version as well. We are seeing this crash more frequently now, but that is most likely related to more users on our platform.
### Stacktrace
```
Fatal Exception: NSGenericException
0 CoreFoundation 0x2251d4518 __exceptionPreprocess
1 libobjc.A.dylib 0x2243af9f8 objc_exception_throw
2 CoreFoundation 0x2251d3de0 -[__NSSingleObjectEnumerator initWithObject:]
3 Equilab 0x10325946c ABTExperimentsToSetFromPayloads
4 Equilab 0x103259c54 -[FIRExperimentController updateExperimentsInBackgroundQueueWithServiceOrigin:events:policy:lastStartTime:payloads:]
5 Equilab 0x103259b14 __99-[FIRExperimentController updateExperimentsWithServiceOrigin:events:policy:lastStartTime:payloads:]_block_invoke
6 libdispatch.dylib 0x224c14a38 _dispatch_call_block_and_release
7 libdispatch.dylib 0x224c157d4 _dispatch_client_callout
8 libdispatch.dylib 0x224bf8afc _dispatch_root_queue_drain
9 libdispatch.dylib 0x224bf9248 _dispatch_worker_thread2
10 libsystem_pthread.dylib 0x224df51b4 _pthread_wqthread
11 libsystem_pthread.dylib 0x224df7cd4 start_wqthrea
```
### Environment
* Xcode version: 10.2
* Firebase SDK version: 5.20.2
* Firebase Component: Remote config? Not sure which component it is.
* Component version: N/A But we are running latest version of all components under main lib version 5.20.2.
| non_priority | abtexperimentstosetfrompayloads crash our most common crash now is a a crash in what seems to be something related to remote config in crashlytics its title is abtexperimentstosetfrompayloads and this is the message fatal exception nsgenericexception collection was mutated while being enumerated abtexperimentstosetfrompayloads we are unable to reproduce it ourselves but the crash happens just after users start the app according to our logs we sometimes get a few logs such as session started so its most likely not at the very start but just after it there seems to be no pattern of ios version or device type etc the first crash i can find regarding this was with the firebase sdk but could have started before that version as well we are seeing this crash more frequently now but most likely related to more users on our platform stacktrace fatal exception nsgenericexception corefoundation exceptionpreprocess libobjc a dylib objc exception throw corefoundation equilab abtexperimentstosetfrompayloads equilab equilab block invoke libdispatch dylib dispatch call block and release libdispatch dylib dispatch client callout libdispatch dylib dispatch root queue drain libdispatch dylib dispatch worker libsystem pthread dylib pthread wqthread libsystem pthread dylib start wqthrea environment xcode version firebase sdk version firebase component remote config not sure which component it is component version n a but we are running latest version of all components under main lib version | 0 |
29,920 | 7,134,600,075 | IssuesEvent | 2018-01-22 21:26:32 | opencode18/Girls-who-code | https://api.github.com/repos/opencode18/Girls-who-code | opened | Change the events only limited to india | Advanced: 30 Points Opencode18 | Collect information about events like conferences that happen in India, and update the list.
This PR requires decent research; if you are not doing it, we won't merge the PR.
P.S. add hack in the north ❤️ | 1.0 | Change the events only limited to india - Collect information about events like conferences that happen in India, and update the list.
This PR requires decent research; if you are not doing it, we won't merge the PR.
P.S add hack in the north ❤️ | non_priority | change the events only limited to india collect infromation about events like confernces and events that happen in india and add update the list this pr requires decent reasearch if you are not doing it we wont merge the pr p s add hack in the north ❤️ | 0 |
67,876 | 17,094,047,093 | IssuesEvent | 2021-07-08 21:57:51 | typeorm/typeorm | https://api.github.com/repos/typeorm/typeorm | closed | SQL Server error when requesting additional returning columns | bug comp: query builder driver: mssql | ## Issue Description
#5361 changed the way SQL Server uses `OUTPUT` in `INSERT`/`UPDATE` queries, causing an error if additional columns are requested using `InsertQueryBuilder.returning()`.
### Expected Behavior
```typescript
connection.createQueryBuilder(Post, 'post')
.insert()
.values({ title: "TITLE" })
.returning(["text"])
.execute();
connection.createQueryBuilder(Post, 'post')
.update()
.set({ title: "TITLE" })
.returning(["text"])
.execute();
```
Both should execute without error and return the additional column "text".
### Actual Behavior
```sql
DECLARE @OutputTable TABLE ("id" int);
INSERT INTO "post"("title", "text") OUTPUT INSERTED."text", INSERTED."id" INTO @OutputTable VALUES (@0, @1);
SELECT * FROM @OutputTable
```
`QueryFailedError: Error: Column name or number of supplied values does not match table definition.`
```sql
UPDATE "post" SET "title" = @1 OUTPUT INSERTED."text" INTO @OutputTable WHERE "id" IN (@0)
```
`QueryFailedError: Error: Must declare the table variable "@OutputTable".`
### Relevant Database Driver(s)
- [ ] `aurora-data-api`
- [ ] `aurora-data-api-pg`
- [ ] `better-sqlite3`
- [ ] `cockroachdb`
- [ ] `cordova`
- [ ] `expo`
- [ ] `mongodb`
- [ ] `mysql`
- [ ] `nativescript`
- [ ] `oracle`
- [ ] `postgres`
- [ ] `react-native`
- [ ] `sap`
- [ ] `sqlite`
- [ ] `sqlite-abstract`
- [ ] `sqljs`
- [x] `sqlserver`
### Are you willing to resolve this issue by submitting a Pull Request?
- [x] Yes, I have the time, and I know how to start.
- [ ] Yes, I have the time, but I don't know how to start. I would need guidance.
- [ ] No, I don't have the time, although I believe I could do it if I had the time...
- [ ] No, I don't have the time and I wouldn't even know how to start.
| 1.0 | SQL Server error when requesting additional returning columns - ## Issue Description
#5361 changed the way SQL Server uses `OUTPUT` in `INSERT`/`UPDATE` queries, causing an error if additional columns are requested using `InsertQueryBuilder.returning()`.
### Expected Behavior
```typescript
connection.createQueryBuilder(Post, 'post')
.insert()
.values({ title: "TITLE" })
.returning(["text"])
.execute();
connection.createQueryBuilder(Post, 'post')
.update()
.set({ title: "TITLE" })
.returning(["text"])
.execute();
```
Both should execute without error and return the additional column "text".
### Actual Behavior
```sql
DECLARE @OutputTable TABLE ("id" int);
INSERT INTO "post"("title", "text") OUTPUT INSERTED."text", INSERTED."id" INTO @OutputTable VALUES (@0, @1);
SELECT * FROM @OutputTable
```
`QueryFailedError: Error: Column name or number of supplied values does not match table definition.`
```sql
UPDATE "post" SET "title" = @1 OUTPUT INSERTED."text" INTO @OutputTable WHERE "id" IN (@0)
```
`QueryFailedError: Error: Must declare the table variable "@OutputTable".`
### Relevant Database Driver(s)
- [ ] `aurora-data-api`
- [ ] `aurora-data-api-pg`
- [ ] `better-sqlite3`
- [ ] `cockroachdb`
- [ ] `cordova`
- [ ] `expo`
- [ ] `mongodb`
- [ ] `mysql`
- [ ] `nativescript`
- [ ] `oracle`
- [ ] `postgres`
- [ ] `react-native`
- [ ] `sap`
- [ ] `sqlite`
- [ ] `sqlite-abstract`
- [ ] `sqljs`
- [x] `sqlserver`
### Are you willing to resolve this issue by submitting a Pull Request?
- [x] Yes, I have the time, and I know how to start.
- [ ] Yes, I have the time, but I don't know how to start. I would need guidance.
- [ ] No, I don't have the time, although I believe I could do it if I had the time...
- [ ] No, I don't have the time and I wouldn't even know how to start.
| non_priority | sql server error when requesting additional returning columns issue description changed the way sql server uses output in insert update queries causing an error if additional columns are requested using insertquerybuilder returning expected behavior typescript connection createquerybuilder post post insert values title title returning execute connection createquerybuilder post post update set title title returning execute should both execute without error and return additional column text actual behavior sql declare outputtable table id int insert into post title text output inserted text inserted id into outputtable values select from outputtable queryfailederror error column name or number of supplied values does not match table definition sql update post set title output inserted text into outputtable where id in queryfailederror error must declare the table variable outputtable relevant database driver s aurora data api aurora data api pg better cockroachdb cordova expo mongodb mysql nativescript oracle postgres react native sap sqlite sqlite abstract sqljs sqlserver are you willing to resolve this issue by submitting a pull request yes i have the time and i know how to start yes i have the time but i don t know how to start i would need guidance no i don t have the time although i believe i could do it if i had the time no i don t have the time and i wouldn t even know how to start | 0 |
13,545 | 16,088,313,038 | IssuesEvent | 2021-04-26 13:55:52 | geneontology/go-ontology | https://api.github.com/repos/geneontology/go-ontology | closed | host cell apoplast question | multi-species process quick fix |
host cell apoplast -> host apoplast
(it isn't a cellular thing?) | 1.0 | host cell apoplast question -
host cell apoplast -> host apoplast
(it isn't a cellular thing?) | non_priority | host cell apoplast question host cell apoplast host apoplast it isn t a cellular thing | 0 |
133,727 | 18,948,509,333 | IssuesEvent | 2021-11-18 12:53:42 | medicotary/Medicotary | https://api.github.com/repos/medicotary/Medicotary | closed | Make the vendors page | enhancement Design | it would be awesome if I could see the list of all vendors which supply me some medicines, and use the contact info to contact them
- [x] top search bar
- [x] Vendor table | 1.0 | Make the vendors page - it would be awesome if I could see the list of all vendors which supply me some medicines, and use the contact info to contact them
- [x] top search bar
- [x] Vendor table | non_priority | make the vendors page it would be awesome if i could see the list of all vendors which supply me some medicines and use the contact info to contact them top search bar vendor table | 0 |
4,588 | 3,407,146,884 | IssuesEvent | 2015-12-04 00:46:29 | bokeh/bokeh | https://api.github.com/repos/bokeh/bokeh | opened | CI Build breakdown | tag: build type: task | Some suggestions:
* JS, docs, and flake tests only need to run once, suggest one py3 test for all three
* Consider python + integration tests could be merged into one build, to share "install" cost.
* longest tests (both "examples" builds) should be run first together, so that their start times are not unnecessarily staggered. The current order adds ~5-10 minutes to every full run. | 1.0 | CI Build breakdown - Some suggestions:
* JS, docs, and flake tests only need to run once, suggest one py3 test for all three
* Consider python + integration tests could be merged into one build, to share "install" cost.
* longest tests (both "examples" builds) should be run first together, so that their start times are not unnecessarily staggered. The current order adds ~5-10 minutes to every full run. | non_priority | ci build breakdown some suggestions js docs and flake tests only need to run once suggest one test for all three consider python integration tests could be merged into one build to share install cost longest tests both examples builds should be run first together so that their start times are not unnecessarily staggered the current order adds minutes to every full run | 0 |
140,773 | 32,058,364,064 | IssuesEvent | 2023-09-24 11:11:06 | coder/modules | https://api.github.com/repos/coder/modules | closed | VS Code Web | module-idea coder_script | This is different than code-server: `code serve-web`:
- [x] Variable to accept license (user must put true as a variable, default is false)
- [x] coder_app
- [x] coder_script | 1.0 | VS Code Web - This is different than code-server: `code serve-web`:
- [x] Variable to accept license (user must put true as a variable, default is false)
- [x] coder_app
- [x] coder_script | non_priority | vs code web this is different than code server code serve web variable to accept license user must put true as a variable default is false coder app coder script | 0 |
7,312 | 3,535,371,803 | IssuesEvent | 2016-01-16 13:11:52 | itchio/itch | https://api.github.com/repos/itchio/itch | closed | Add opts-checking code and start using it throughout tasks/utils | code quality | It's just preconditions — for example, `util/deploy` shouldn't start if `stage_path` isn't specified, etc. | 1.0 | Add opts-checking code and start using it throughout tasks/utils - It's just preconditions — for example, `util/deploy` shouldn't start if `stage_path` isn't specified, etc. | non_priority | add opts checking code and start using it throughout tasks utils it s just preconditions — for example util deploy shouldn t start if stage path isn t specified etc | 0 |
37,572 | 18,536,389,379 | IssuesEvent | 2021-10-21 12:00:32 | timescale/timescaledb | https://api.github.com/repos/timescale/timescaledb | reopened | Select with a bound on a sequence field in a compressed hypertable with an index is very slow/fails the database | bug performance investigate compression severity-p3 hypertables | ### Tables
```
CREATE TABLE IF NOT EXISTS series_values (
time TIMESTAMPTZ NOT NULL,
value DOUBLE PRECISION NOT NULL,
series_id INTEGER NOT NULL,
seq BIGSERIAL
);
SELECT create_hypertable('series_values', 'time');
ALTER TABLE series_values SET (
timescaledb.compress,
timescaledb.compress_segmentby = 'series_id',
timescaledb.compress_orderby = 'time DESC, seq DESC'
);
SELECT set_chunk_time_interval('series_values', INTERVAL '1h');
SELECT add_compress_chunks_policy('series_values', INTERVAL '2h');
SELECT add_drop_chunks_policy('series_values', INTERVAL '6 months');
CREATE INDEX IF NOT EXISTS series_values_series_id_idx ON series_values USING BTREE (series_id, time desc);
CREATE INDEX IF NOT EXISTS series_values_seq_idx ON series_values USING BTREE (seq DESC);
CREATE OR REPLACE VIEW data AS
SELECT
vals.time AS time,
series.metric AS metric,
series.labels AS labels,
vals.value AS value,
vals.seq as seq
FROM
series_values AS vals
INNER JOIN series ON vals.series_id = series.id;
```
### Query
```
SELECT seq FROM data WHERE seq >= 70000000 ORDER BY seq ASC LIMIT 1;
```
There are approximately 80000000 (80mln) rows in the table in total.
### `explain analyze` of the query
(Not exactly that query, but with a different bound because `explain analyze` with the original bound hangs the database as well as the query itself):
```
Limit (cost=3393.15..3393.49 rows=1 width=8) (actual time=223.840..223.841 rows=1 loops=1)
-> Nested Loop (cost=3393.15..124266.80 rows=349756 width=8) (actual time=223.834..223.834 rows=1 loops=1)
-> Merge Append (cost=3392.87..21937.62 rows=349756 width=12) (actual time=219.550..219.551 rows=1 loops=1)
Sort Key: vals.seq
-> Sort (cost=67.15..69.65 rows=1000 width=12) (actual time=0.861..0.862 rows=0 loops=1)
Sort Key: vals.seq
Sort Method: quicksort Memory: 17kB
-> Custom Scan (DecompressChunk) on _hyper_1_1_chunk vals (cost=17.32..17.32 rows=1000 width=12) (actual time=0.841..0.842 rows=0 loops=1)
Filter: (seq >= 80000062)
-> Seq Scan on compress_hyper_2_36_chunk (cost=0.00..17.32 rows=1 width=56) (actual time=0.836..0.837 rows=0 loops=1)
Filter: (_ts_meta_max_2 >= 80000062)
Rows Removed by Filter: 266
-> Sort (cost=95.15..97.65 rows=1000 width=12) (actual time=0.938..0.938 rows=0 loops=1)
Sort Key: vals_1.seq
Sort Method: quicksort Memory: 17kB
-> Custom Scan (DecompressChunk) on _hyper_1_38_chunk vals_1 (cost=45.33..45.33 rows=1000 width=12) (actual time=0.925..0.926 rows=0 loops=1)
Filter: (seq >= 80000062)
-> Seq Scan on compress_hyper_2_43_chunk (cost=0.00..45.33 rows=1 width=56) (actual time=0.923..0.923 rows=0 loops=1)
Filter: (_ts_meta_max_2 >= 80000062)
Rows Removed by Filter: 266
-> Sort (cost=77.15..79.65 rows=1000 width=12) (actual time=0.916..0.916 rows=0 loops=1)
Sort Key: vals_2.seq
Sort Method: quicksort Memory: 17kB
-> Custom Scan (DecompressChunk) on _hyper_1_42_chunk vals_2 (cost=27.32..27.32 rows=1000 width=12) (actual time=0.904..0.904 rows=0 loops=1)
Filter: (seq >= 80000062)
-> Seq Scan on compress_hyper_2_45_chunk (cost=0.00..27.32 rows=1 width=56) (actual time=0.901..0.901 rows=0 loops=1)
Filter: (_ts_meta_max_2 >= 80000062)
Rows Removed by Filter: 266
-> Sort (cost=214.64..217.14 rows=1000 width=12) (actual time=11.089..11.089 rows=0 loops=1)
Sort Key: vals_3.seq
Sort Method: quicksort Memory: 17kB
-> Custom Scan (DecompressChunk) on _hyper_1_48_chunk vals_3 (cost=164.81..164.81 rows=1000 width=12) (actual time=11.072..11.072 rows=0 loops=1)
Filter: (seq >= 80000062)
-> Seq Scan on compress_hyper_2_60_chunk (cost=0.00..164.81 rows=1 width=56) (actual time=11.067..11.068 rows=0 loops=1)
Filter: (_ts_meta_max_2 >= 80000062)
Rows Removed by Filter: 5185
-> Sort (cost=638.53..641.03 rows=1000 width=12) (actual time=42.984..42.984 rows=0 loops=1)
Sort Key: vals_4.seq
Sort Method: quicksort Memory: 17kB
-> Custom Scan (DecompressChunk) on _hyper_1_52_chunk vals_4 (cost=588.70..588.70 rows=1000 width=12) (actual time=42.950..42.950 rows=0 loops=1)
Filter: (seq >= 80000062)
-> Seq Scan on compress_hyper_2_167_chunk (cost=0.00..588.70 rows=1 width=56) (actual time=42.943..42.944 rows=0 loops=1)
Filter: (_ts_meta_max_2 >= 80000062)
Rows Removed by Filter: 20136
-> Sort (cost=722.04..724.54 rows=1000 width=12) (actual time=48.335..48.336 rows=0 loops=1)
Sort Key: vals_5.seq
Sort Method: quicksort Memory: 17kB
-> Custom Scan (DecompressChunk) on _hyper_1_57_chunk vals_5 (cost=672.21..672.21 rows=1000 width=12) (actual time=48.302..48.303 rows=0 loops=1)
Filter: (seq >= 80000062)
-> Seq Scan on compress_hyper_2_186_chunk (cost=0.00..672.21 rows=1 width=56) (actual time=48.296..48.296 rows=0 loops=1)
Filter: (_ts_meta_max_2 >= 80000062)
Rows Removed by Filter: 22817
-> Sort (cost=365.38..367.88 rows=1000 width=12) (actual time=22.637..22.637 rows=0 loops=1)
Sort Key: vals_6.seq
Sort Method: quicksort Memory: 17kB
-> Custom Scan (DecompressChunk) on _hyper_1_97_chunk vals_6 (cost=315.55..315.55 rows=1000 width=12) (actual time=22.604..22.604 rows=0 loops=1)
Filter: (seq >= 80000062)
-> Seq Scan on compress_hyper_2_220_chunk (cost=0.00..315.55 rows=1 width=56) (actual time=22.598..22.598 rows=0 loops=1)
Filter: (_ts_meta_max_2 >= 80000062)
Rows Removed by Filter: 10524
-> Sort (cost=99.33..101.83 rows=1000 width=12) (actual time=2.963..2.963 rows=0 loops=1)
Sort Key: vals_7.seq
Sort Method: quicksort Memory: 17kB
-> Custom Scan (DecompressChunk) on _hyper_1_170_chunk vals_7 (cost=49.50..49.50 rows=1000 width=12) (actual time=2.929..2.930 rows=0 loops=1)
Filter: (seq >= 80000062)
-> Seq Scan on compress_hyper_2_172_chunk (cost=0.00..49.50 rows=1 width=56) (actual time=2.925..2.926 rows=0 loops=1)
Filter: (_ts_meta_max_2 >= 80000062)
Rows Removed by Filter: 1160
-> Sort (cost=222.12..224.62 rows=1000 width=12) (actual time=12.335..12.336 rows=0 loops=1)
Sort Key: vals_8.seq
Sort Method: quicksort Memory: 17kB
-> Custom Scan (DecompressChunk) on _hyper_1_175_chunk vals_8 (cost=172.29..172.29 rows=1000 width=12) (actual time=12.314..12.315 rows=0 loops=1)
Filter: (seq >= 80000062)
-> Seq Scan on compress_hyper_2_221_chunk (cost=0.00..172.29 rows=1 width=56) (actual time=12.309..12.310 rows=0 loops=1)
Filter: (_ts_meta_max_2 >= 80000062)
Rows Removed by Filter: 5783
-> Sort (cost=150.32..152.82 rows=1000 width=12) (actual time=6.604..6.605 rows=0 loops=1)
Sort Key: vals_9.seq
Sort Method: quicksort Memory: 17kB
-> Custom Scan (DecompressChunk) on _hyper_1_222_chunk vals_9 (cost=100.49..100.49 rows=1000 width=12) (actual time=6.578..6.578 rows=0 loops=1)
Filter: (seq >= 80000062)
-> Seq Scan on compress_hyper_2_407_chunk (cost=0.00..100.49 rows=1 width=56) (actual time=6.572..6.573 rows=0 loops=1)
Filter: (_ts_meta_max_2 >= 80000062)
Rows Removed by Filter: 3079
-> Sort (cost=450.57..453.07 rows=1000 width=12) (actual time=65.976..65.976 rows=0 loops=1)
Sort Key: vals_10.seq
Sort Method: quicksort Memory: 17kB
-> Custom Scan (DecompressChunk) on _hyper_1_265_chunk vals_10 (cost=400.74..400.74 rows=1000 width=12) (actual time=65.952..65.952 rows=0 loops=1)
Filter: (seq >= 80000062)
-> Seq Scan on compress_hyper_2_413_chunk (cost=0.00..400.74 rows=1 width=56) (actual time=65.946..65.946 rows=0 loops=1)
Filter: (_ts_meta_max_2 >= 80000062)
Rows Removed by Filter: 13019
-> Index Scan Backward using _hyper_1_408_chunk_series_values_seq_idx on _hyper_1_408_chunk vals_11 (cost=0.42..333.11 rows=10993 width=12) (actual time=3.800..3.800 rows=1 loops=1)
Index Cond: (seq >= 80000062)
-> Index Scan Backward using _hyper_1_414_chunk_series_values_seq_idx on _hyper_1_414_chunk vals_12 (cost=0.42..9952.58 rows=327763 width=12) (actual time=0.074..0.074 rows=1 loops=1)
Index Cond: (seq >= 80000062)
-> Index Only Scan using series_pkey on series (cost=0.28..0.29 rows=1 width=4) (actual time=4.263..4.264 rows=1 loops=1)
Index Cond: (id = vals.series_id)
Heap Fetches: 1
Planning Time: 30.596 ms
Execution Time: 225.835 ms
```
### Discussion
All rows in the hypertable `data` are globally sorted by `seq` (as well as `time`). I'd expect `SELECT seq FROM data WHERE seq >= 70000000 ORDER BY seq ASC LIMIT 1;` to quickly skim through the metadata of compressed chunks, find the first chunk that potentially has the seq above the bound, uncompress it (alone), find a sole `seq`, then _use that found seq from the first uncompressed chunk_ to go through the metadata again and find that none of the other chunks could possibly have `seq` less than the one already found, and return.
In practice, the database apparently decompresses a lot of chunks.
The database also probably tries to decompress them in parallel without the limit on the available memory, because if it did, I would expect the query to be just very slow, but not to break the database. In practice, however, with X in `seq >= X` low enough the database fails altogether, maybe due to OOM.
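To make the expectation concrete, here is a hedged sketch (not part of the original report) of the chunk-level metadata lookup the reporter expects the planner to perform. The chunk name and the `_ts_meta_max_2` column are taken from the `EXPLAIN` output above; `_ts_meta_min_2` is assumed to be the matching per-chunk minimum that TimescaleDB keeps for the second `compress_orderby` column (`seq`):

```sql
-- Hedged sketch, not from the original report.
-- The pruning the reporter expects amounts to consulting the per-chunk
-- min/max metadata for "seq" instead of decompressing each chunk.
-- Chunk name taken from the EXPLAIN output above.
SELECT min(_ts_meta_min_2) AS lowest_candidate_seq
FROM _timescaledb_internal.compress_hyper_2_36_chunk
WHERE _ts_meta_max_2 >= 70000000;

-- A possible workaround (assumption: the caller can also bound "time"):
-- a time predicate lets chunk exclusion skip whole chunks up front,
-- so only a few chunks are ever decompressed.
SELECT seq
FROM data
WHERE seq >= 70000000
  AND time >= now() - INTERVAL '2 hours'
ORDER BY seq ASC
LIMIT 1;
```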
183,395 | 31,465,208,098 | IssuesEvent | 2023-08-30 01:02:57 | WordPress/gutenberg | https://api.github.com/repos/WordPress/gutenberg | closed | Group block and Row block borders overlapping on content | [Block] Group [Feature] Design Tools | ### Description
Borders added to Group blocks and Row blocks overlap the content.
### Step-by-step reproduction instructions
1. Go to FSE.
2. Add a group block, and add a /para block and a list block inside the group block.
3. Add a border to the group block with rounded corners.
4. Change the border colour and size (optional, for better visibility).
5. Publish and view the page.
### Screenshots, screen recording, code snippet

### Environment info
WP 5.9 GB: 12.5.4 TT2
Browser Brave on Windows 10
### Please confirm that you have searched existing issues in the repo.
Yes
### Please confirm that you have tested with all plugins deactivated except Gutenberg.
Yes
56,950 | 7,018,437,325 | IssuesEvent | 2017-12-21 13:44:36 | TechnionYearlyProject/Roommates | https://api.github.com/repos/TechnionYearlyProject/Roommates | closed | Login Form properties | Design | Login
These are the properties we are going to fetch from the **POST /users/login** request:
| Field | Requirement | Comment |
|----------|-------------|---------|
| email | MANDATORY | |
| password | MANDATORY | |
The response includes the user, with an x-auth code in the header.
Regards.
139,329 | 12,853,880,096 | IssuesEvent | 2020-07-09 00:06:07 | seanpm2001/horsin-around-in-the-barn | https://api.github.com/repos/seanpm2001/horsin-around-in-the-barn | opened | 3 files couldn't be uploaded | documentation enhancement good first issue wontfix |
***
### 3 files couldn't be uploaded
I couldn't upload the following 3 projects to this repository, due to the GitHub 25 Megabyte file limit:
`//horsingaroundinthebarn.com/Root/Media/WelcomeVideoProjectDraft1.mp4` - Size: 31,717,824 bytes (31.71 Megabytes)
`//horsingaroundinthebarn.com/Horsingaroundinthebarnwebsite_version1.zip` - Size: 89,832,591 bytes (89.83 Megabytes)
`//horsingaroundinthebarn.com/Horsingaroundinthebarnwebsite_version2.zip` - Size: 131,575,962 bytes (131.57 Megabytes)
I likely won't fix this, unless GitHub raises the limit, or there is a workaround.
***
210,161 | 23,739,023,575 | IssuesEvent | 2022-08-31 10:41:27 | SacleuxBenoit/test-nodejs | https://api.github.com/repos/SacleuxBenoit/test-nodejs | closed | CVE-2021-3664 (Medium) detected in url-parse-1.4.7.tgz - autoclosed | security vulnerability | ## CVE-2021-3664 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>url-parse-1.4.7.tgz</b></p></summary>
<p>Small footprint URL parser that works seamlessly across Node.js and browser environments</p>
<p>Library home page: <a href="https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz">https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz</a></p>
<p>Path to dependency file: /front/package.json</p>
<p>Path to vulnerable library: /front/node_modules/url-parse/package.json</p>
<p>
Dependency Hierarchy:
- cli-service-4.2.3.tgz (Root Library)
- webpack-dev-server-3.10.3.tgz
- sockjs-client-1.4.0.tgz
- :x: **url-parse-1.4.7.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
url-parse is vulnerable to URL Redirection to Untrusted Site
<p>Publish Date: 2021-07-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3664>CVE-2021-3664</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-3664">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-3664</a></p>
<p>Release Date: 2021-07-26</p>
<p>Fix Resolution (url-parse): 1.5.2</p>
<p>Direct dependency fix Resolution (@vue/cli-service): 4.3.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
113,329 | 9,636,215,401 | IssuesEvent | 2019-05-16 04:56:53 | owncloud/encryption | https://api.github.com/repos/owncloud/encryption | opened | Acceptance test: Check users are required to re-login after recreating master key. | QA-team dev:acceptance-tests | Part of #35207
Acceptance test: Check users are required to re-login after recreating master key.
introduced by https://github.com/owncloud/core/pull/34596
46,690 | 13,180,992,699 | IssuesEvent | 2020-08-12 13:40:39 | mibo32/fitbit-api-example-java | https://api.github.com/repos/mibo32/fitbit-api-example-java | opened | CVE-2018-8014 (High) detected in tomcat-embed-core-8.5.4.jar | security vulnerability | ## CVE-2018-8014 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tomcat-embed-core-8.5.4.jar</b></p></summary>
<p>Core Tomcat implementation</p>
<p>Library home page: <a href="http://tomcat.apache.org/">http://tomcat.apache.org/</a></p>
<p>Path to dependency file: /tmp/ws-scm/fitbit-api-example-java/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/8.5.4/tomcat-embed-core-8.5.4.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-1.4.0.RELEASE.jar (Root Library)
- spring-boot-starter-tomcat-1.4.0.RELEASE.jar
- :x: **tomcat-embed-core-8.5.4.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/mibo32/fitbit-api-example-java/commit/fdd0855fc8f3b846f83506299759cd9cc820e5d2">fdd0855fc8f3b846f83506299759cd9cc820e5d2</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The defaults settings for the CORS filter provided in Apache Tomcat 9.0.0.M1 to 9.0.8, 8.5.0 to 8.5.31, 8.0.0.RC1 to 8.0.52, 7.0.41 to 7.0.88 are insecure and enable 'supportsCredentials' for all origins. It is expected that users of the CORS filter will have configured it appropriately for their environment rather than using it in the default configuration. Therefore, it is expected that most users will not be impacted by this issue.
<p>Publish Date: 2018-05-16
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-8014>CVE-2018-8014</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-8014">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-8014</a></p>
<p>Release Date: 2018-05-16</p>
<p>Fix Resolution: org.apache.tomcat.embed:tomcat-embed-core:9.0.10,8.5.32,8.0.53,7.0.90,org.apache.tomcat:tomcat-catalina:9.0.10,8.5.32,8.0.53,7.0.90</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2018-8014 (High) detected in tomcat-embed-core-8.5.4.jar - ## CVE-2018-8014 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tomcat-embed-core-8.5.4.jar</b></p></summary>
<p>Core Tomcat implementation</p>
<p>Library home page: <a href="http://tomcat.apache.org/">http://tomcat.apache.org/</a></p>
<p>Path to dependency file: /tmp/ws-scm/fitbit-api-example-java/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/8.5.4/tomcat-embed-core-8.5.4.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-1.4.0.RELEASE.jar (Root Library)
- spring-boot-starter-tomcat-1.4.0.RELEASE.jar
- :x: **tomcat-embed-core-8.5.4.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/mibo32/fitbit-api-example-java/commit/fdd0855fc8f3b846f83506299759cd9cc820e5d2">fdd0855fc8f3b846f83506299759cd9cc820e5d2</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The defaults settings for the CORS filter provided in Apache Tomcat 9.0.0.M1 to 9.0.8, 8.5.0 to 8.5.31, 8.0.0.RC1 to 8.0.52, 7.0.41 to 7.0.88 are insecure and enable 'supportsCredentials' for all origins. It is expected that users of the CORS filter will have configured it appropriately for their environment rather than using it in the default configuration. Therefore, it is expected that most users will not be impacted by this issue.
<p>Publish Date: 2018-05-16
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-8014>CVE-2018-8014</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-8014">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-8014</a></p>
<p>Release Date: 2018-05-16</p>
<p>Fix Resolution: org.apache.tomcat.embed:tomcat-embed-core:9.0.10,8.5.32,8.0.53,7.0.90,org.apache.tomcat:tomcat-catalina:9.0.10,8.5.32,8.0.53,7.0.90</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in tomcat embed core jar cve high severity vulnerability vulnerable library tomcat embed core jar core tomcat implementation library home page a href path to dependency file tmp ws scm fitbit api example java pom xml path to vulnerable library home wss scanner repository org apache tomcat embed tomcat embed core tomcat embed core jar dependency hierarchy spring boot starter web release jar root library spring boot starter tomcat release jar x tomcat embed core jar vulnerable library found in head commit a href vulnerability details the defaults settings for the cors filter provided in apache tomcat to to to to are insecure and enable supportscredentials for all origins it is expected that users of the cors filter will have configured it appropriately for their environment rather than using it in the default configuration therefore it is expected that most users will not be impacted by this issue publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache tomcat embed tomcat embed core org apache tomcat tomcat catalina step up your open source security game with whitesource | 0 |
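For a Maven project that pulls Tomcat in transitively through Spring Boot (as in the dependency hierarchy above), one way to apply the suggested fix is to override the managed Tomcat version in the POM. This is a sketch, assuming the Spring Boot parent POM is in use; the `tomcat.version` property comes from Spring Boot's dependency management:

```xml
<properties>
  <!-- Force the patched Tomcat release instead of the managed 8.5.4 -->
  <tomcat.version>8.5.32</tomcat.version>
</properties>
```

After the override, `mvn dependency:tree` should show `tomcat-embed-core:8.5.32` under `spring-boot-starter-tomcat`.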
217,146 | 24,313,218,070 | IssuesEvent | 2022-09-30 02:02:58 | RG4421/ampere-centos-kernel | https://api.github.com/repos/RG4421/ampere-centos-kernel | reopened | CVE-2022-0492 (High) detected in linuxv5.2 | security vulnerability | ## CVE-2022-0492 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.2</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p>
<p>Found in base branch: <b>amp-centos-8.0-kernel</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/kernel/cgroup/cgroup-v1.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/kernel/cgroup/cgroup-v1.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability was found in the Linux kernel’s cgroup_release_agent_write in the kernel/cgroup/cgroup-v1.c function. This flaw, under certain circumstances, allows the use of the cgroups v1 release_agent feature to escalate privileges and bypass the namespace isolation unexpectedly.
<p>Publish Date: 2022-03-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0492>CVE-2022-0492</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://security-tracker.debian.org/tracker/CVE-2022-0492">https://security-tracker.debian.org/tracker/CVE-2022-0492</a></p>
<p>Release Date: 2022-03-03</p>
<p>Fix Resolution: v5.17-rc3</p>
</p>
</details>
<p></p>
| True | CVE-2022-0492 (High) detected in linuxv5.2 - ## CVE-2022-0492 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.2</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p>
<p>Found in base branch: <b>amp-centos-8.0-kernel</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/kernel/cgroup/cgroup-v1.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/kernel/cgroup/cgroup-v1.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability was found in the Linux kernel’s cgroup_release_agent_write in the kernel/cgroup/cgroup-v1.c function. This flaw, under certain circumstances, allows the use of the cgroups v1 release_agent feature to escalate privileges and bypass the namespace isolation unexpectedly.
<p>Publish Date: 2022-03-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0492>CVE-2022-0492</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://security-tracker.debian.org/tracker/CVE-2022-0492">https://security-tracker.debian.org/tracker/CVE-2022-0492</a></p>
<p>Release Date: 2022-03-03</p>
<p>Fix Resolution: v5.17-rc3</p>
</p>
</details>
<p></p>
| non_priority | cve high detected in cve high severity vulnerability vulnerable library linux kernel source tree library home page a href found in base branch amp centos kernel vulnerable source files kernel cgroup cgroup c kernel cgroup cgroup c vulnerability details a vulnerability was found in the linux kernel’s cgroup release agent write in the kernel cgroup cgroup c function this flaw under certain circumstances allows the use of the cgroups release agent feature to escalate privileges and bypass the namespace isolation unexpectedly publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution | 0 |
192,310 | 15,343,074,831 | IssuesEvent | 2021-02-27 18:46:05 | ethereum-optimism/community-hub | https://api.github.com/repos/ethereum-optimism/community-hub | opened | L2 FAQ/Noob Questions | documentation good first issue help wanted | Would like to pick out the best questions from this thread and give them answers:
https://twitter.com/AdamScochran/status/1365365202749915136 | 1.0 | L2 FAQ/Noob Questions - Would like to pick out the best questions from this thread and give them answers:
https://twitter.com/AdamScochran/status/1365365202749915136 | non_priority | faq noob questions would like to pick out the best questions from this thread and give them answers | 0 |
196,339 | 14,856,190,876 | IssuesEvent | 2021-01-18 13:49:48 | elastic/kibana | https://api.github.com/repos/elastic/kibana | closed | Failing test: "after all" hook for "Creates and activates a new EQL rule" - Detection rules, EQL "after all" hook for "Creates and activates a new EQL rule" | Team: SecuritySolution Team:Detections and Resp failed-test | A test failed on a tracked branch
```
CypressError: `cy.request()` failed on:
http://elastic:changeme@localhost:6151/api/solutions/security/graphql
The response we received from your web server was:
> 400: Bad Request
This was considered a failure because the status code was not `2xx` or `3xx`.
If you do not want status codes to cause failures pass the option: `failOnStatusCode: false`
-----------------------------------------------------------
The request we sent was:
Method: POST
URL: http://elastic:changeme@localhost:6151/api/solutions/security/graphql
Headers: {
"Connection": "keep-alive",
"kbn-xsrf": "delete-signals",
"user-agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) HeadlessChrome/87.0.4280.88 Safari/537.36",
"accept": "*/*",
"accept-encoding": "gzip, deflate",
"authorization": "Basic ZWxhc3RpYzpjaGFuZ2VtZQ==",
"content-type": "application/json",
"content-length": 157
}
Body: {"operationName":"DeleteTimelineMutation","variables":{"id":[null]},"query":"mutation DeleteTimelineMutation($id: [ID!]!) {\n deleteTimeline(id: $id)\n}\n"}
-----------------------------------------------------------
The response we got was:
Status: 400 - Bad Request
Headers: {
"content-type": "application/json; charset=utf-8",
"kbn-name": "kibana-ci-immutable-ubuntu-18-tests-xxl-1607634196457586066",
"kbn-license-sig": "11a88980626736dd699d8f9f75172106e6a26228aefff99ea2b4f132147816db",
"cache-control": "private, no-cache, no-store, must-revalidate",
"content-length": "213",
"date": "Thu, 10 Dec 2020 21:43:20 GMT",
"connection": "keep-alive",
"keep-alive": "timeout=120"
}
Body: {
"message": "{\"errors\":[{\"message\":\"Variable \\\"$id\\\" got invalid value [null]; Expected non-nullable type ID! not to be null at value[0].\",\"locations\":[{\"line\":1,\"column\":33}]}]}",
"status_code": 400
}
https://on.cypress.io/request
Because this error occurred during a `after all` hook we are skipping the remaining tests in the current suite: `Detection rules, EQL`
Although you have test retries enabled, we do not retry tests when `before all` or `after all` hooks fail
at http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:158924:21
at tryCatcher (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:10584:23)
at Promise._settlePromiseFromHandler (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:8519:31)
at Promise._settlePromise (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:8576:18)
at Promise._settlePromise0 (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:8621:10)
at Promise._settlePromises (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:8701:18)
at _drainQueueStep (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:5291:12)
at _drainQueue (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:5284:9)
at Async.../../node_modules/bluebird/js/release/async.js.Async._drainQueues (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:5300:5)
at Async.drainQueues (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:5170:14)
From Your Spec Code:
at Object.deleteTimeline (http://localhost:6151/__cypress/tests?p=cypress/integration/alerts_detection_rules_eql.spec.ts:21539:8)
at Context.eval (http://localhost:6151/__cypress/tests?p=cypress/integration/alerts_detection_rules_eql.spec.ts:20336:21)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+master/10294/)
<!-- kibanaCiData = {"failed-test":{"test.class":"\"after all\" hook for \"Creates and activates a new EQL rule\"","test.name":"Detection rules, EQL \"after all\" hook for \"Creates and activates a new EQL rule\"","test.failCount":19}} --> | 1.0 | Failing test: "after all" hook for "Creates and activates a new EQL rule" - Detection rules, EQL "after all" hook for "Creates and activates a new EQL rule" - A test failed on a tracked branch
```
CypressError: `cy.request()` failed on:
http://elastic:changeme@localhost:6151/api/solutions/security/graphql
The response we received from your web server was:
> 400: Bad Request
This was considered a failure because the status code was not `2xx` or `3xx`.
If you do not want status codes to cause failures pass the option: `failOnStatusCode: false`
-----------------------------------------------------------
The request we sent was:
Method: POST
URL: http://elastic:changeme@localhost:6151/api/solutions/security/graphql
Headers: {
"Connection": "keep-alive",
"kbn-xsrf": "delete-signals",
"user-agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) HeadlessChrome/87.0.4280.88 Safari/537.36",
"accept": "*/*",
"accept-encoding": "gzip, deflate",
"authorization": "Basic ZWxhc3RpYzpjaGFuZ2VtZQ==",
"content-type": "application/json",
"content-length": 157
}
Body: {"operationName":"DeleteTimelineMutation","variables":{"id":[null]},"query":"mutation DeleteTimelineMutation($id: [ID!]!) {\n deleteTimeline(id: $id)\n}\n"}
-----------------------------------------------------------
The response we got was:
Status: 400 - Bad Request
Headers: {
"content-type": "application/json; charset=utf-8",
"kbn-name": "kibana-ci-immutable-ubuntu-18-tests-xxl-1607634196457586066",
"kbn-license-sig": "11a88980626736dd699d8f9f75172106e6a26228aefff99ea2b4f132147816db",
"cache-control": "private, no-cache, no-store, must-revalidate",
"content-length": "213",
"date": "Thu, 10 Dec 2020 21:43:20 GMT",
"connection": "keep-alive",
"keep-alive": "timeout=120"
}
Body: {
"message": "{\"errors\":[{\"message\":\"Variable \\\"$id\\\" got invalid value [null]; Expected non-nullable type ID! not to be null at value[0].\",\"locations\":[{\"line\":1,\"column\":33}]}]}",
"status_code": 400
}
https://on.cypress.io/request
Because this error occurred during a `after all` hook we are skipping the remaining tests in the current suite: `Detection rules, EQL`
Although you have test retries enabled, we do not retry tests when `before all` or `after all` hooks fail
at http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:158924:21
at tryCatcher (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:10584:23)
at Promise._settlePromiseFromHandler (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:8519:31)
at Promise._settlePromise (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:8576:18)
at Promise._settlePromise0 (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:8621:10)
at Promise._settlePromises (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:8701:18)
at _drainQueueStep (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:5291:12)
at _drainQueue (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:5284:9)
at Async.../../node_modules/bluebird/js/release/async.js.Async._drainQueues (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:5300:5)
at Async.drainQueues (http://elastic:changeme@localhost:6151/__cypress/runner/cypress_runner.js:5170:14)
From Your Spec Code:
at Object.deleteTimeline (http://localhost:6151/__cypress/tests?p=cypress/integration/alerts_detection_rules_eql.spec.ts:21539:8)
at Context.eval (http://localhost:6151/__cypress/tests?p=cypress/integration/alerts_detection_rules_eql.spec.ts:20336:21)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+master/10294/)
<!-- kibanaCiData = {"failed-test":{"test.class":"\"after all\" hook for \"Creates and activates a new EQL rule\"","test.name":"Detection rules, EQL \"after all\" hook for \"Creates and activates a new EQL rule\"","test.failCount":19}} --> | non_priority | failing test after all hook for creates and activates a new eql rule detection rules eql after all hook for creates and activates a new eql rule a test failed on a tracked branch cypresserror cy request failed on the response we received from your web server was bad request this was considered a failure because the status code was not or if you do not want status codes to cause failures pass the option failonstatuscode false the request we sent was method post url headers connection keep alive kbn xsrf delete signals user agent mozilla linux applewebkit khtml like gecko headlesschrome safari accept accept encoding gzip deflate authorization basic content type application json content length body operationname deletetimelinemutation variables id query mutation deletetimelinemutation id n deletetimeline id id n n the response we got was status bad request headers content type application json charset utf kbn name kibana ci immutable ubuntu tests xxl kbn license sig cache control private no cache no store must revalidate content length date thu dec gmt connection keep alive keep alive timeout body message errors expected non nullable type id not to be null at value locations status code because this error occurred during a after all hook we are skipping the remaining tests in the current suite detection rules eql although you have test retries enabled we do not retry tests when before all or after all hooks fail at at trycatcher at promise settlepromisefromhandler at promise settlepromise at promise at promise settlepromises at drainqueuestep at drainqueue at async node modules bluebird js release async js async drainqueues at async drainqueues from your spec code at object deletetimeline at context eval first 
failure | 0 |
290,923 | 21,911,217,861 | IssuesEvent | 2022-05-21 04:26:53 | danielrivard/homeassistant-innova | https://api.github.com/repos/danielrivard/homeassistant-innova | closed | How to configure multiple units | documentation | @danielrivard Thanks already for this amazing job!
To configure multiple units, and be able to control each unit (note that platform must be there for each unit!):
```yaml
climate:
  - platform: innova
    host: 192.168.1.134
    scan_interval: 1200
  - platform: innova
    host: 192.168.1.126
    scan_interval: 1200
  - platform: innova
    host: 192.168.1.129
    scan_interval: 1200
```
| 1.0 | How to configure multiple units - @danielrivard Thanks already for this amazing job!
To configure multiple units, and be able to control each unit (note that platform must be there for each unit!):
```yaml
climate:
  - platform: innova
    host: 192.168.1.134
    scan_interval: 1200
  - platform: innova
    host: 192.168.1.126
    scan_interval: 1200
  - platform: innova
    host: 192.168.1.129
    scan_interval: 1200
```
| non_priority | how to configure multiple units danielrivard thanks allready for this amazing job to configure multiple units and be able to control each unit note that platform must be there for each unit climate platform innova host scan interval platform innova host scan interval platform innova host scan interval | 0 |
58,558 | 24,481,151,938 | IssuesEvent | 2022-10-08 21:16:22 | LiskHQ/lisk-service | https://api.github.com/repos/LiskHQ/lisk-service | opened | Add endpoint to dry run transactions | service/gateway type: improvement service/blockchain-connector service/blockchain-indexer | ### Description
Lisk SDK v6.0.0-alpha.4 adds support to dry-run transactions. Users should be able to utilize the same when using Lisk Service.
### Acceptance Criteria
- The following endpoints are available
- Integration tests are implemented
- Swagger specs are up-to-date
### Additional information
#### Endpoints
**HTTP**: `POST /transactions/dryrun`
**RPC**: `post.transactions.dryrun`
#### Request body params
Parameter | Type | Validation | Default | Comment
-- | -- | -- | -- | --
transaction | String | /^\b[0-9a-fA-F]+\b$/ | (empty) |
#### Response
```
``` | 3.0 | Add endpoint to dry run transactions - ### Description
Lisk SDK v6.0.0-alpha.4 adds support to dry-run transactions. Users should be able to utilize the same when using Lisk Service.
### Acceptance Criteria
- The following endpoints are available
- Integration tests are implemented
- Swagger specs are up-to-date
### Additional information
#### Endpoints
**HTTP**: `POST /transactions/dryrun`
**RPC**: `post.transactions.dryrun`
#### Request body params
Parameter | Type | Validation | Default | Comment
-- | -- | -- | -- | --
transaction | String | /^\b[0-9a-fA-F]+\b$/ | (empty) |
#### Response
```
``` | non_priority | add endpoint to dry run transactions description lisk sdk alpha adds support to dry run transactions users should be able to utilize the same when using lisk service acceptance criteria the following endpoints are available integration tests are implemented swagger specs are up to date additional information endpoints http post transactions dryrun rpc post transactions dryrun request body params parameter type validation default comment transaction string b b empty response | 0 |
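For the HTTP variant described above, a client request could be built as follows. This is a sketch: the base URL, port, and function name are assumptions (the endpoint does not exist yet); the hex validation is the pattern from the request-body table.

```javascript
// Build a POST request for the proposed /transactions/dryrun endpoint.
// Validation pattern taken from the request body params table above.
const TX_HEX_RE = /^\b[0-9a-fA-F]+\b$/;

function buildDryRunRequest(transaction, baseUrl = 'http://localhost:9901/api/v3') {
  if (typeof transaction !== 'string' || !TX_HEX_RE.test(transaction)) {
    throw new Error('transaction must be a non-empty hex string');
  }
  return {
    url: `${baseUrl}/transactions/dryrun`,
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ transaction }),
    },
  };
}

// Hypothetical usage in a browser or Node client:
//   const req = buildDryRunRequest(signedTxHex);
//   fetch(req.url, req.options).then(res => res.json());
```

The RPC variant would carry the same `transaction` field as the params of the `post.transactions.dryrun` method.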
90,211 | 15,856,084,949 | IssuesEvent | 2021-04-08 01:29:03 | rgordon95/advanced-react-redux-demo | https://api.github.com/repos/rgordon95/advanced-react-redux-demo | opened | CVE-2020-7660 (High) detected in serialize-javascript-1.6.1.tgz | security vulnerability | ## CVE-2020-7660 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>serialize-javascript-1.6.1.tgz</b></p></summary>
<p>Serialize JavaScript to a superset of JSON that includes regular expressions and functions.</p>
<p>Library home page: <a href="https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.6.1.tgz">https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.6.1.tgz</a></p>
<p>Path to dependency file: /advanced-react-redux-demo/package.json</p>
<p>Path to vulnerable library: advanced-react-redux-demo/node_modules/serialize-javascript/package.json</p>
<p>
Dependency Hierarchy:
- webpack-4.29.6.tgz (Root Library)
- terser-webpack-plugin-1.2.3.tgz
- :x: **serialize-javascript-1.6.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
serialize-javascript prior to 3.1.0 allows remote attackers to inject arbitrary code via the function "deleteFunctions" within "index.js".
<p>Publish Date: 2020-06-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7660>CVE-2020-7660</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7660">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7660</a></p>
<p>Release Date: 2020-06-01</p>
<p>Fix Resolution: serialize-javascript - 3.1.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-7660 (High) detected in serialize-javascript-1.6.1.tgz - ## CVE-2020-7660 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>serialize-javascript-1.6.1.tgz</b></p></summary>
<p>Serialize JavaScript to a superset of JSON that includes regular expressions and functions.</p>
<p>Library home page: <a href="https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.6.1.tgz">https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.6.1.tgz</a></p>
<p>Path to dependency file: /advanced-react-redux-demo/package.json</p>
<p>Path to vulnerable library: advanced-react-redux-demo/node_modules/serialize-javascript/package.json</p>
<p>
Dependency Hierarchy:
- webpack-4.29.6.tgz (Root Library)
- terser-webpack-plugin-1.2.3.tgz
- :x: **serialize-javascript-1.6.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
serialize-javascript prior to 3.1.0 allows remote attackers to inject arbitrary code via the function "deleteFunctions" within "index.js".
<p>Publish Date: 2020-06-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7660>CVE-2020-7660</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7660">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7660</a></p>
<p>Release Date: 2020-06-01</p>
<p>Fix Resolution: serialize-javascript - 3.1.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in serialize javascript tgz cve high severity vulnerability vulnerable library serialize javascript tgz serialize javascript to a superset of json that includes regular expressions and functions library home page a href path to dependency file advanced react redux demo package json path to vulnerable library advanced react redux demo node modules serialize javascript package json dependency hierarchy webpack tgz root library terser webpack plugin tgz x serialize javascript tgz vulnerable library vulnerability details serialize javascript prior to allows remote attackers to inject arbitrary code via the function deletefunctions within index js publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution serialize javascript step up your open source security game with whitesource | 0 |
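Because serialize-javascript arrives here transitively through webpack's terser plugin, bumping it directly in `dependencies` is not enough. With Yarn, a `resolutions` entry in `package.json` can pin the transitive dependency to the fixed release (a sketch; assumes Yarn is the package manager in use):

```json
{
  "resolutions": {
    "serialize-javascript": "^3.1.0"
  }
}
```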
341,003 | 24,679,735,818 | IssuesEvent | 2022-10-18 20:10:08 | project-chip/connectedhomeip | https://api.github.com/repos/project-chip/connectedhomeip | closed | Update QEMU build instructions with pre-reqs | documentation V1.X stale | #### Problem
QEMU needs libgcrypt20-dev installed before the config is run otherwise it's not buildable.
#### Proposed Solution
Add prereq installation instructions to the readme.
| 1.0 | Update QEMU build instructions with pre-reqs - #### Problem
QEMU needs libgcrypt20-dev installed before the config is run otherwise it's not buildable.
#### Proposed Solution
Add preerq installation instructions to the readme.
| non_priority | update qemu build instructions with pre reqs problem qemu needs dev installed before the config is run otherwise it s not buildable proposed solution add preerq installation instructions to the readme | 0 |
448,763 | 31,811,658,296 | IssuesEvent | 2023-09-13 17:16:46 | vijayk3327/JavaScript-JQuery | https://api.github.com/repos/vijayk3327/JavaScript-JQuery | opened | How to include header and footer in all pages uses of JavaScript function in HTML file | documentation question | In this post we are going to learn how to include a header and footer in all pages using a JavaScript function in an HTML file.
**[👉 Get source code live demo link:-](https://www.w3web.net/how-to-add-header-and-footer-in-javascript/)**
<img src="https://www.w3web.net/wp-content/uploads/2023/04/include-min.gif"/>
**Step 1: Create HTML file: include.html**
`<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
<title>How to add header and footer in JavaScript</title>
<link href="stylesheet.css" rel="stylesheet" type="text/css" />
<script type="text/javascript" src="common.js"></script>
</head>
<body>
<!--Start Header-->
<script type="text/javascript">
header ();
</script>
<!--End Header-->
<!--Start Middle-->
<div id="mdlContainer">
<div class="mdlLftMain">
<script type="text/javascript">
leftMenu();
</script>
</div>
<div class="mdlRtMain">
<h1 style="font-size:16px; color:#0085e3;">Home</h1>
<p>Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.</p>
<script type="text/javascript">
showTxt();
</script>
</div>
<div class="clr"></div>
</div>
<!--End Middle-->
<!--Start Footer-->
<script type="text/javascript">
footerContainer();
</script>
<!--End Footer-->
</body>
</html>`
**Step 2: Create HTML (product.html)**
`<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
<title>How to add header and footer in JavaScript</title>
<link href="stylesheet.css" rel="stylesheet" type="text/css" />
<script type="text/javascript" src="common.js"></script>
</head>
<body>
<!--Start Header-->
<script type="text/javascript">
header ();
</script>
<!--End Header-->
<!--Start Middle-->
<div id="mdlContainer">
<div class="mdlLftMain">
<script type="text/javascript">
leftMenu();
</script>
</div>
<div class="mdlRtMain">
<h1 style="font-size:16px; color:#0085e3;">Product</h1>
<p>Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.</p>
</div>
<div class="clr"></div>
</div>
<!--End Middle-->
<!--Start Footer-->
<script type="text/javascript">
footerContainer();
</script>
<!--End Footer-->
</body>
</html>`
**Step 3: Create HTML (about.html)**
`<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
<title>How to add header and footer in JavaScript</title>
<link href="stylesheet.css" rel="stylesheet" type="text/css" />
<script type="text/javascript" src="common.js"></script>
</head>
<body>
<!--Start Header-->
<script type="text/javascript">
header ();
</script>
<!--End Header-->
<!--Start Middle-->
<div id="mdlContainer">
<div class="mdlLftMain">
<script type="text/javascript">
leftMenu();
</script>
</div>
<div class="mdlRtMain">
<h1 style="font-size:16px; color:#0085e3;">About</h1>
<p>Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.</p>
</div>
<div class="clr"></div>
</div>
<!--End Middle-->
<!--Start Footer-->
<script type="text/javascript">
footerContainer();
</script>
<!--End Footer-->
</body>
</html>`
**Step 4: Create HTML (contact.html)**
`<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
<title>How to add header and footer in JavaScript</title>
<link href="stylesheet.css" rel="stylesheet" type="text/css" />
<script type="text/javascript" src="common.js"></script>
</head>
<body>
<!--Start Header-->
<script type="text/javascript">
header ();
</script>
<!--End Header-->
<!--Start Middle-->
<div id="mdlContainer">
<div class="mdlLftMain">
<script type="text/javascript">
leftMenu();
</script>
</div>
<div class="mdlRtMain">
<h1 style="font-size:16px; color:#0085e3;">Contact</h1>
<p>Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.</p>
</div>
<div class="clr"></div>
</div>
<!--End Middle-->
<!--Start Footer-->
<script type="text/javascript">
footerContainer();
</script>
<!--End Footer-->
</body>
</html>`
**Step 5: Create HTML (product_1.html)**
`<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
<title>How to add header and footer in JavaScript</title>
<link href="stylesheet.css" rel="stylesheet" type="text/css" />
<script type="text/javascript" src="common.js"></script>
</head>
<body>
<!--Start Header-->
<script type="text/javascript">
header ();
</script>
<!--End Header-->
<!--Start Middle-->
<div id="mdlContainer">
<div class="mdlLftMain">
<script type="text/javascript">
leftMenu();
</script>
</div>
<div class="mdlRtMain">
<h1 style="font-size:16px; color:#0085e3;">Product1</h1>
<p>Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.</p>
</div>
<div class="clr"></div>
</div>
<!--End Middle-->
<!--Start Footer-->
<script type="text/javascript">
footerContainer();
</script>
<!--End Footer-->
</body>
</html>`
**Step 6: Create JavaScript (common.js)**
`function header () {
with(document) {
write("<div id='headerMain'>");
write("<div class='logo'>Logo")
write("</div>");
write("<div class='menuMain'>");
write("<ul class='menu'>");
write("<li><a href='include.html' id='rgtlink1' name='rgtlink1'>Home</a>");
write("</li>")
write("<li><a href='product.html' id='rgtlink2' name='rgtlink2' onmouseover='show(\"t1\",\"rgtlink2\")' onmouseout='hide(\"t1\",\"rgtlink2\")'>Products</a>");
write("<div style='display:none;' id='t1' onmouseover='show(\"t1\",\"rgtlink2\")' onmouseout='hide(\"t1\",\"rgtlink2\")' class='sublnkBox'>");
write("<ul>");
write("<li><a href='product_1.html' id='rgtlink01' name='rgtlink01'>Product1</a></li>");
write("<li><a href='product_2.html' id='a02' name='a02'>Product2</a></li>");
write("<li><a href='#'>Product3</a></li>");
write("<li><a href='#'>Product4</a></li>");
write("<li><a href='#'>Product5</a></li>");
write("</ul>");
write("</div>");
write("</li>")
write("<li><a href='about.html' id='a1' name='a1'>About Us</a>");
write("</li>")
write("<li><a href='contact.html' id='rgtlink3' name='rgtlink3'>Contact Us</a>");
write("</li>")
write("</ul>");
write("</div>");
write("</div>");
}
imgsearch();
}
function footerContainer(){
with(document){
write("<div id='footerMain'>");
write("<div class='footerLnk'><a href='#'>Home</a> | <a href='#'>About Us</a> | <a href='#'>Contact Us</a></div>");
write("</div>");
}
}
function leftMenu(){
with(document){
write("<ul class='leftMenu'>");
write("<li><a href='#'>Listing1</a></li>");
write("<li><a href='#'>Listing2</a></li>");
write("<li><a href='#'>Listing3</a></li>");
write("<li><a href='#'>Listing4</a></li>");
write("<li><a href='#'>Listing5</a></li>");
write("<li><a href='#'>Listing6</a></li>");
write("<li><a href='#'>Listing7</a></li>");
write("<li><a href='#'>Listing8</a></li>");
write("</ul>");
}
}
function showTxt(){
with(document){
write("<div><a href='#' onmouseover='showDiv(\"d1\")' onmouseout='hideDiv(\"d1\")'>Click here</a></div>");
write("<div style='display:none;' id='d1'>This is default Description. This is default Description.This is default Description.</div>");
}
}
var act='';
function show(sid,cid)
{
if(document.getElementById(cid).className=='sel')
{
act=cid;
}
document.getElementById(sid).style.display='block';
document.getElementById(cid).className='sel';
}
function hide(sid,cid)
{
document.getElementById(sid).style.display='none';
document.getElementById(cid).className='';
if(act==cid)
{
document.getElementById(act).className='sel';
}
}
function showDiv(d)
{
document.getElementById(d).style.display='block';
}
function hideDiv(d)
{
document.getElementById(d).style.display='none';
}
function imgsearch() {
var sPath = window.location.pathname;
var sPage = sPath.substring(sPath.lastIndexOf('/') + 1);
//alert(sPage)
///////////////////mark the menu item for the current page as selected///////////////////////
if (sPage.toLowerCase() == 'include.html'.toLowerCase()) {
setdeafulturlwith('', 'rgtlink1', 'rgtlink1');
}
if (sPage.toLowerCase() == 'about.html'.toLowerCase()) {
setdeafulturlwith('', 'a1', 'a1');
}
//Start Sub menu//
if (sPage.toLowerCase() == 'product_1.html'.toLowerCase()) {
setdeafulturlwith('', 'rgtlink2', 'rgtlink2');
setdeafulturlwith('', 'rgtlink01', 'rgtlink01');
}
if (sPage.toLowerCase() == 'product_2.html'.toLowerCase()) {
setdeafulturlwith('', 'rgtlink2', 'rgtlink2');
setdeafulturlwith('', 'a02', 'a02');
}
//End Sub menu//
if (sPage.toLowerCase() == 'product.html'.toLowerCase()) {
setdeafulturlwith('', 'rgtlink2', 'rgtlink2');
}
if (sPage.toLowerCase() == 'contact.html'.toLowerCase()) {
setdeafulturlwith('', 'rgtlink3', 'rgtlink3');
}
}
function setdeafulturlwith(pgname, hrefid, td_id) {
document.getElementById(hrefid).removeAttribute("href");
document.getElementById(td_id).onclick = '';
document.getElementById(td_id).className = "sel";
}`
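The chain of `if` blocks in `imgsearch()` boils down to a lookup table. As an illustrative, testable sketch (`activeIdsFor` is a made-up name, not part of the tutorial's code), the page-to-menu-id mapping for the main pages can be expressed as a pure function:

```javascript
// Sketch: imgsearch()'s page-to-menu mapping extracted as a pure function,
// so it can be tested without a DOM. activeIdsFor is an illustrative name,
// not part of the original common.js.
function activeIdsFor(page) {
  var map = {
    "include.html": ["rgtlink1"],
    "about.html": ["a1"],
    "product.html": ["rgtlink2"],
    "product_1.html": ["rgtlink2", "rgtlink01"],
    "contact.html": ["rgtlink3"]
  };
  return map[page.toLowerCase()] || [];
}

// imgsearch() could then loop over the ids instead of repeating if blocks:
// activeIdsFor(sPage).forEach(function (id) { setdeafulturlwith('', id, id); });
var ids = activeIdsFor("Product_1.html");
```

This keeps the page/menu relationship in one data structure, so adding a page means adding one map entry rather than another `if` block.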
**Step 7: Create Style CSS (stylesheet.css)**
` body{ margin:0; padding:0; font-family:Arial, Helvetica, sans-serif; color:#333333; font-size:12px;}
.over{ overflow:hidden;}
.fl{ float:left;}
.fr{ float:right;}
.clr{ clear:both;}
#headerMain{ height:100px; border-bottom:1px #ccc solid; width:800px; margin:auto;}
.logo{ font-size:18px; color:#000000; float:left; padding:30px 0 0 0;}
.menuMain{ width:500px; float:right; padding-top:75px;}
ul.menu{ margin:0; padding:0; list-style:none;}
ul.menu li{ font-size:12px; color:#000000; float:left; border-right:1px #ccc solid; position:relative;}
ul.menu li a, ul.menu li a:visited{ color:#000000; text-decoration:none; background:#eee; padding:5px 10px; display:block;}
ul.menu li a:hover, ul.menu li a.sel, ul.menu li a.sel:visited{ color:#ff0000; text-decoration:none; background:#d1d1d1;}
ul.menu li .sublnkBox{ width:200px; border:1px #ccc solid; border-top:0; border-bottom:0; background:#eee; position:absolute; top:25px; left:0;}
ul.menu li .sublnkBox ul{ margin:0; padding:0; list-style:none;}
ul.menu li .sublnkBox ul li{font-size:12px; color:#000000; float:none; border-right:0; border-bottom:1px #ccc solid;}
ul.menu li .sublnkBox ul li a, ul.menu li .sublnkBox ul li a:visited{color:#000000; text-decoration:none; background:none; padding:5px 10px; display:block;}
ul.menu li .sublnkBox ul li a:hover{color:#ff0000; text-decoration:none; background:#d1d1d1;}
#mdlContainer{ width:800px; margin:auto; padding:20px 0 0 0; min-height:500px;}
.mdlLftMain{ width:200px; float:left; border-right:1px #ccc solid; min-height:450px;}
ul.leftMenu{ margin:0; padding:0; list-style:none;}
ul.leftMenu li{ color:#000000; font-size:12px; border-bottom:1px #ccc solid;}
ul.leftMenu li a, ul.leftMenu li a:visited{ color:#000000; text-decoration:none; background:#eee; padding:5px 10px; display:block;}
ul.leftMenu li a:hover{ color:#ff0000; text-decoration:none; background:#ccc;}
.mdlRtMain{ width:575px; float:right;}
#footerMain{ width:800px; margin:auto; border-top:1px #ccc solid; padding:10px 0 0 0;}
.footerLnk{ font-size:12px; color:#333333;}
.footerLnk a, .footerLnk a:visited{ color:#333333; text-decoration:none; margin:0 10px 0 10px;}
.footerLnk a:hover{ color:#ff0000; text-decoration:none;}`
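`document.write()` works here because the scripts run while the page is still parsing, but it is legacy practice and blocks rendering. As a hedged sketch of a more modern approach (the function and parameter names below are illustrative, not part of the tutorial), the shared markup can be built as a string and injected once per page with `insertAdjacentHTML`:

```javascript
// Build the shared header markup as a plain string, as a modern alternative
// to the document.write() calls in common.js. Illustrative sketch only.
function buildHeader(links) {
  var items = links.map(function (l) {
    return "<li><a href='" + l.href + "' id='" + l.id + "'>" + l.label + "</a></li>";
  }).join("");
  return "<div id='headerMain'>" +
         "<div class='logo'>Logo</div>" +
         "<div class='menuMain'><ul class='menu'>" + items + "</ul></div>" +
         "</div>";
}

var headerHtml = buildHeader([
  { href: "include.html", id: "rgtlink1", label: "Home" },
  { href: "product.html", id: "rgtlink2", label: "Products" }
]);

// In the browser this would replace the inline <script>header();</script> blocks:
// document.body.insertAdjacentHTML("afterbegin", headerHtml);
```

Because the function only returns a string, it can be unit-tested in Node without a browser, and the same string could also be rendered server-side.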
35,387 | 7,724,997,005 | IssuesEvent | 2018-05-24 16:34:08 | bridgedotnet/Bridge | https://api.github.com/repos/bridgedotnet/Bridge | closed | Broken cast operators with Template attribute | defect | In our project we have C# bindings to the PlayCanvas 3D web engine. We do that via external classes and Template attributes. Every 3D engine has 2D and 3D vectors. It's very common to cast 2D vectors to 3D ones by adding a zero z component, and to cast 3D vectors back to 2D by dropping the z component. Below you can see a very simple example of doing that by passing 2D vectors to a function that works with 3D ones. Test1 uses implicit casting; Test2 does it explicitly.
### Steps To Reproduce
```csharp
using Bridge;
[External]
[Constructor("new pc.Vec2")]
public class Vector2
{
public float x;
public float y;
[Template("new pc.Vec2({0}, {1})")]
public extern Vector2(float x, float y);
[Template("new pc.Vec3({0}.x, {0}.y, 0)")]
public extern static implicit operator Vector3(Vector2 source);
}
[External]
[Constructor("new pc.Vec3")]
public class Vector3
{
public float x;
public float y;
public float z;
[Template("new pc.Vec3({0}, {1}, {2})")]
public extern Vector3(float x, float y, float z);
[Template("new pc.Vec2({0}.x, {0}.y)")]
public extern static implicit operator Vector2(Vector3 source);
}
public class Utility
{
public Vector3 F(Vector3 v) {
return new Vector3(v.x + 1, v.y + 2, v.z + 3);
}
}
public class Program
{
public static void Test1() {
var u = new Utility();
Vector2 v0 = new Vector2(1, 1);
Vector2 v1 = u.F(v0);
}
public static void Test2() {
var u = new Utility();
Vector2 v0 = new Vector2(1, 1);
Vector3 v0_ = v0;
Vector3 v1_ = u.F(v0_);
Vector2 v1 = v0_;
}
public static void Main() {
Test1();
Test2();
}
}
```
Sadly this example can't be compiled via deck.net; it gives a `System.Exception: ReferenceError: pc is not defined` error. But it can be done in an empty Bridge project.
### Code generated for Test1:
```js
Test1: function () {
var u = new Utility();
var v0 = new pc.Vec2(1, 1);
var v1 = new pc.Vec2(new pc.Vec3(v0.x, v0.y, 0).x, new pc.Vec3(v0.x, v0.y, 0).y);
}
```
### Code generated for Test2:
```js
Test2: function () {
var u = new Utility();
var v0 = new pc.Vec2(1, 1);
var v0_ = new pc.Vec3(v0.x, v0.y, 0);
var v1_ = u.F(v0_);
var v1 = new pc.Vec2(v0_.x, v0_.y);
}
```
Test1 code is obviously incorrect because there is no `u.F()` call at all.
The bug reproduces for me on both Bridge 16.5.0 and 17.0.0. My guess is that all 16.x versions are affected too. | 1.0 | Broken cast operators with Template attribute - In our project we have C# bindings to the PlayCanvas 3D web engine. We do that via external classes and Template attributes. Every 3D engine has 2D and 3D vectors. It's very common to cast 2D vectors to 3D ones by adding a zero z component, and to cast 3D vectors back to 2D by dropping the z component. Below you can see a very simple example of doing that by passing 2D vectors to a function that works with 3D ones. Test1 uses implicit casting; Test2 does it explicitly.
### Steps To Reproduce
```csharp
using Bridge;
[External]
[Constructor("new pc.Vec2")]
public class Vector2
{
public float x;
public float y;
[Template("new pc.Vec2({0}, {1})")]
public extern Vector2(float x, float y);
[Template("new pc.Vec3({0}.x, {0}.y, 0)")]
public extern static implicit operator Vector3(Vector2 source);
}
[External]
[Constructor("new pc.Vec3")]
public class Vector3
{
public float x;
public float y;
public float z;
[Template("new pc.Vec3({0}, {1}, {2})")]
public extern Vector3(float x, float y, float z);
[Template("new pc.Vec2({0}.x, {0}.y)")]
public extern static implicit operator Vector2(Vector3 source);
}
public class Utility
{
public Vector3 F(Vector3 v) {
return new Vector3(v.x + 1, v.y + 2, v.z + 3);
}
}
public class Program
{
public static void Test1() {
var u = new Utility();
Vector2 v0 = new Vector2(1, 1);
Vector2 v1 = u.F(v0);
}
public static void Test2() {
var u = new Utility();
Vector2 v0 = new Vector2(1, 1);
Vector3 v0_ = v0;
Vector3 v1_ = u.F(v0_);
Vector2 v1 = v0_;
}
public static void Main() {
Test1();
Test2();
}
}
```
Sadly this example can't be compiled via deck.net; it gives a `System.Exception: ReferenceError: pc is not defined` error. But it can be done in an empty Bridge project.
### Code generated for Test1:
```js
Test1: function () {
var u = new Utility();
var v0 = new pc.Vec2(1, 1);
var v1 = new pc.Vec2(new pc.Vec3(v0.x, v0.y, 0).x, new pc.Vec3(v0.x, v0.y, 0).y);
}
```
### Code generated for Test2:
```js
Test2: function () {
var u = new Utility();
var v0 = new pc.Vec2(1, 1);
var v0_ = new pc.Vec3(v0.x, v0.y, 0);
var v1_ = u.F(v0_);
var v1 = new pc.Vec2(v0_.x, v0_.y);
}
```
Test1 code is obviously incorrect because there is no `u.F()` call at all.
Bug is reproducing for me while running both 16.5.0 and 17.0.0 Bridge versions. My guess is that all 16.x versions are affected too. | non_priority | broken cast operators with template attribute in our project we have csharp bindings to playcanvas web engine we do that via external classes and template attributes every engine has and vectors it s very common to cast vectors to ones by adding zero z component it s also very common to cast vectors to vectors by dropping z component below you can see a very simple example of doing that by passing vectors to function that works with ones does use implicit casting does that explicitly steps to reproduce csharp using bridge public class public float x public float y public extern float x float y public extern static implicit operator source public class public float x public float y public float z public extern float x float y float z public extern static implicit operator source public class utility public f v return new v x v y v z public class program public static void var u new utility new u f public static void var u new utility new u f public static void main sadly this result can t be compiled via deck net gives me system exception referenceerror pc is not defined error but this can be done in empty bridge project code generated for js function var u new utility var new pc var new pc new pc x y x new pc x y y code generated for js function var u new utility var new pc var new pc x y var u f var new pc x y code is obviously incorrect because there is no u f call at all bug is reproducing for me while running both and bridge versions my guess is that all x versions are affected too | 0 |
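To make the expected lowering concrete, here is a small self-contained JavaScript sketch (the `pc.Vec2`/`pc.Vec3` classes below are minimal stand-ins, not the real PlayCanvas bindings) of what a correct translation of Test1 would have to do: apply the Vector2-to-Vector3 cast template to the argument, actually call `u.F()`, and then apply the Vector3-to-Vector2 cast template to the result.

```javascript
// Minimal stand-ins for the external bindings (assumed shapes, not PlayCanvas):
const pc = {
  Vec2: class { constructor(x, y) { this.x = x; this.y = y; } },
  Vec3: class { constructor(x, y, z) { this.x = x; this.y = y; this.z = z; } },
};

// Utility.F from the C# repro above.
function F(v) { return new pc.Vec3(v.x + 1, v.y + 2, v.z + 3); }

// Correct lowering of Test1:
const v0 = new pc.Vec2(1, 1);
const tmp = F(new pc.Vec3(v0.x, v0.y, 0)); // implicit Vector2 -> Vector3 cast template
const v1 = new pc.Vec2(tmp.x, tmp.y);      // implicit Vector3 -> Vector2 cast template
console.log(v1.x, v1.y); // 2 3
```

Note that the buggy generated code applies both cast templates directly to `v0` and never calls `F` at all, which is why the result is wrong.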
184,007 | 14,966,010,337 | IssuesEvent | 2021-01-27 14:05:52 | cengage/react-magma | https://api.github.com/repos/cengage/react-magma | closed | Add link to github in the docs. | documentation styles/theme | Right now we don't have anything in the docs pointing at github ... probably should add that, either as a general github icon or a link in all the API sections to their corresponding components | 1.0 | Add link to github in the docs. - Right now we don't have anything in the docs pointing at github ... probably should add that, either as a general github icon or a link in all the API sections to their corresponding components | non_priority | add link to github in the docs right now we don t have anything in the docs pointing at github probably should add that either as a general github icon or a link in all the api sections to their corresponding components | 0 |
173,348 | 14,408,420,907 | IssuesEvent | 2020-12-03 23:45:29 | nucypher/rust-umbral | https://api.github.com/repos/nucypher/rust-umbral | opened | Synchronize wording and argument names in docs and examples | API Python WASM documentation help wanted | Currently different terminology may be used in function argument names, comments and examples. Also the comments may be worded differently. Need to go through them and check.
In particular, we need to decide on the best name for the participating parties to use ("delegating/receiving" is good for the keys, but what would be the nouns for the parties, to use in function docstrings? Delegator/recipient?). "Alice" and "Bob" seem fine for the examples, but may be too specific to use in the API. | 1.0 | Synchronize wording and argument names in docs and examples - Currently different terminology may be used in function argument names, comments and examples. Also the comments may be worded differently. Need to go through them and check.
In particular, we need to decide on the best name for the participating parties to use ("delegating/receiving" is good for the keys, but what would be the nouns for the parties, to use in function docstrings? Delegator/recipient?). "Alice" and "Bob" seem fine for the examples, but may be too specific to use in the API. | non_priority | synchronize wording and argument names in docs and examples currently different terminology may be used in function argument names comments and examples also the comments may be worded differently need to go through them and check in particular we need to decide on the best name for the participating parties to use delegating receiving is good for the keys but what would be the nouns for the parties to use in function docstrings delegator recipient alice and bob seem fine for the examples but may be too specific to use in the api | 0 |
58,349 | 24,421,791,385 | IssuesEvent | 2022-10-05 21:03:05 | hashicorp/terraform-provider-aws | https://api.github.com/repos/hashicorp/terraform-provider-aws | closed | Add support for stateful_rule_group_reference override in AWS managed rules | enhancement good first issue service/networkfirewall | <!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Description
As a follow up to https://github.com/hashicorp/terraform-provider-aws/pull/22355, kindly add support for Override in the `stateful_rule_group_reference` for aws_networkfirewall_firewall_policy resource.
**Currently supported**
```
resource "aws_networkfirewall_firewall_policy" "example" {
name = "example"
firewall_policy {
stateless_default_actions = ["aws:pass"]
stateless_fragment_default_actions = ["aws:drop"]
    stateless_rule_group_reference {
      priority     = 1
      resource_arn = aws_networkfirewall_rule_group.example.arn
    }
    stateful_rule_group_reference {
      priority     = 2
      resource_arn = aws_networkfirewall_rule_group.example.arn
    }
  }
}
```
**feature request**
```
resource "aws_networkfirewall_firewall_policy" "example" {
name = "example"
firewall_policy {
stateless_default_actions = ["aws:pass"]
stateless_fragment_default_actions = ["aws:drop"]
    stateless_rule_group_reference {
      priority = 1
      override {
        action = ["DROP_TO_ALERT"]
      }
      resource_arn = aws_networkfirewall_rule_group.example.arn
    }
    stateful_rule_group_reference {
      priority = 2
      override {
        action = ["DROP_TO_ALERT"]
      }
      resource_arn = aws_networkfirewall_rule_group.example.arn
    }
  }
}
```
### New or Affected Resource(s)
<!--- Please list the new or affected resources and data sources. --->
* aws_XXXXX
### Potential Terraform Configuration
<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->
```hcl
# Copy-paste your Terraform configurations here - for large Terraform configs,
# please use a service like Dropbox and share a link to the ZIP file. For
# security, you can also encrypt the files using our GPG public key.
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/network-firewall.html
'StatefulRuleGroupReferences': [
{
'ResourceArn': 'string',
'Priority': 123,
'Override': {
'Action': 'DROP_TO_ALERT'
}
},
],
```
### References
<!---
Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests
Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor blog posts or documentation? For example:
* https://aws.amazon.com/about-aws/whats-new/2018/04/introducing-amazon-ec2-fleet/
--->
* #0000
| 1.0 | Add support for stateful_rule_group_reference override in AWS managed rules - <!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Description
As a follow up to https://github.com/hashicorp/terraform-provider-aws/pull/22355, kindly add support for Override in the `stateful_rule_group_reference` for aws_networkfirewall_firewall_policy resource.
**Currently supported**
```
resource "aws_networkfirewall_firewall_policy" "example" {
name = "example"
firewall_policy {
stateless_default_actions = ["aws:pass"]
stateless_fragment_default_actions = ["aws:drop"]
    stateless_rule_group_reference {
      priority     = 1
      resource_arn = aws_networkfirewall_rule_group.example.arn
    }
    stateful_rule_group_reference {
      priority     = 2
      resource_arn = aws_networkfirewall_rule_group.example.arn
    }
  }
}
```
**feature request**
```
resource "aws_networkfirewall_firewall_policy" "example" {
name = "example"
firewall_policy {
stateless_default_actions = ["aws:pass"]
stateless_fragment_default_actions = ["aws:drop"]
    stateless_rule_group_reference {
      priority = 1
      override {
        action = ["DROP_TO_ALERT"]
      }
      resource_arn = aws_networkfirewall_rule_group.example.arn
    }
    stateful_rule_group_reference {
      priority = 2
      override {
        action = ["DROP_TO_ALERT"]
      }
      resource_arn = aws_networkfirewall_rule_group.example.arn
    }
  }
}
```
### New or Affected Resource(s)
<!--- Please list the new or affected resources and data sources. --->
* aws_XXXXX
### Potential Terraform Configuration
<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->
```hcl
# Copy-paste your Terraform configurations here - for large Terraform configs,
# please use a service like Dropbox and share a link to the ZIP file. For
# security, you can also encrypt the files using our GPG public key.
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/network-firewall.html
'StatefulRuleGroupReferences': [
{
'ResourceArn': 'string',
'Priority': 123,
'Override': {
'Action': 'DROP_TO_ALERT'
}
},
],
```
### References
<!---
Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests
Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor blog posts or documentation? For example:
* https://aws.amazon.com/about-aws/whats-new/2018/04/introducing-amazon-ec2-fleet/
--->
* #0000
| non_priority | add support for stateful rule group reference override in aws managed rules community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or other comments that do not add relevant new information or questions they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment description as a follow up to kindly add support for override in the stateful rule group reference for aws networkfirewall firewall policy resource currently supported resource aws networkfirewall firewall policy example name example firewall policy stateless default actions stateless fragment default actions stateless rule group reference priority resource arn aws networkfirewall rule group example arn stateful rule group reference priority resource arn aws networkfirewall rule group example arn feature request resource aws networkfirewall firewall policy example name example firewall policy stateless default actions stateless fragment default actions stateless rule group reference priority overrride action resource arn aws networkfirewall rule group example arn stateful rule group reference priority overrride action resource arn aws networkfirewall rule group example arn new or affected resource s aws xxxxx potential terraform configuration hcl copy paste your terraform configurations here for large terraform configs please use a service like dropbox and share a link to the zip file for security you can also encrypt the files using our gpg public key statefulrulegroupreferences resourcearn string priority override action drop to alert references information about referencing github issues are there any other github issues open or closed or pull requests that should be linked here vendor blog posts or documentation for example | 0 |
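For reference, the boto3 excerpt above maps to a payload like the following. This JavaScript helper is purely illustrative (it is not part of the Terraform provider or the AWS SDK, and the ARN is a placeholder), sketching how an optional `override` block would translate into the API's `Override` field:

```javascript
// Build one StatefulRuleGroupReferences entry for the network-firewall API,
// optionally attaching an Override, per the boto3 docs quoted in the issue.
function statefulReference(resourceArn, priority, overrideAction) {
  const ref = { ResourceArn: resourceArn, Priority: priority };
  if (overrideAction !== undefined) {
    ref.Override = { Action: overrideAction }; // e.g. "DROP_TO_ALERT"
  }
  return ref;
}

console.log(statefulReference("arn:aws:network-firewall:example", 2, "DROP_TO_ALERT"));
```

Omitting the third argument yields a reference with no `Override` key, matching today's provider behavior.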
32,084 | 6,029,661,547 | IssuesEvent | 2017-06-08 18:33:09 | mailwatch/MailWatch | https://api.github.com/repos/mailwatch/MailWatch | closed | Problem with session_start() | documentation | With the latest develop branch.
`Notice: session_start(): ps_files_cleanup_dir: opendir(/var/lib/php5/sessions) failed: Permission denied (13) in /var/www/html/mailscanner/login.php on line 37`
```
# ll /var/lib/php5/sessions/
-rw------- 1 mailwatch mailwatch 109 mars 31 11:26 sess_6vfvmt6vfah8u8gv4mmtroqac6
```
The MailWatch vhost runs as mailwatch/mailwatch using Apache 2.3 + mpm_itk.
I rebooted the dev server to be sure.
| 1.0 | Problem with session_start() - With the latest develop branch.
`Notice: session_start(): ps_files_cleanup_dir: opendir(/var/lib/php5/sessions) failed: Permission denied (13) in /var/www/html/mailscanner/login.php on line 37`
```
# ll /var/lib/php5/sessions/
-rw------- 1 mailwatch mailwatch 109 mars 31 11:26 sess_6vfvmt6vfah8u8gv4mmtroqac6
```
The MailWatch vhost runs as mailwatch/mailwatch using Apache 2.3 + mpm_itk.
I rebooted the dev server to be sure.
| non_priority | problem with session start with last develop notice session start ps files cleanup dir opendir var lib sessions failed permission denied in var www html mailscanner login php on line ll var lib sessions rw mailwatch mailwatch mars sess the mail watch vhost run as mailwatch mailwatch using apache mpm itk i rebooted the dev server to be sure | 0 |
2,876 | 3,686,400,723 | IssuesEvent | 2016-02-25 01:05:08 | bzdeck/bzdeck | https://api.github.com/repos/bzdeck/bzdeck | opened | Retrieve multiple bugs and users whenever possible | maintenance performance | Due to [Bug 1169040 - Cannot retrieve multiple bugs/users when one or more entries are private or not found](https://bugzilla.mozilla.org/show_bug.cgi?id=1169040), `UserCollection` doesn't have the `fetch` method. `BugCollection` has the method, but it's not always used. Until the bug is fixed, to reduce the number of requests and to avoid the 429 Too Many Requests error, we can:
1. Retrieve 10 bugs/users per request
2. Retrieve individual bug/user when Bugzilla returns an error
3. Combine them using `Promise` | True | Retrieve multiple bugs and users whenever possible - Due to [Bug 1169040 - Cannot retrieve multiple bugs/users when one or more entries are private or not found](https://bugzilla.mozilla.org/show_bug.cgi?id=1169040), `UserCollection` doesn't have the `fetch` method. `BugCollection` has the method, but it's not always used. Until the bug is fixed, to reduce the number of requests and to avoid the 429 Too Many Requests error, we can:
1. Retrieve 10 bugs/users per request
2. Retrieve individual bug/user when Bugzilla returns an error
3. Combine them using `Promise` | non_priority | retrieve multiple bugs and users whenever possible due to usercollection doesn t have the fetch method bugcollection has the method but it s not always used until the bug is fixed to reduce the number of requests and to avoid the too many requests error we can retrieve bugs users per request retrieve individual bug user when bugzilla returns an error combine them using promise | 0 |
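The three-step workaround above can be sketched as follows. This is a hedged illustration in plain JavaScript: `fetchMany` and `fetchOne` are hypothetical callbacks standing in for the Bugzilla REST calls, not bzdeck's actual API.

```javascript
// Request items in chunks of 10, and fall back to per-item requests whenever a
// chunk fails (e.g. because one entry is private or not found).
async function fetchInChunks(ids, fetchMany, fetchOne, chunkSize = 10) {
  const results = [];
  for (let i = 0; i < ids.length; i += chunkSize) {
    const chunk = ids.slice(i, i + chunkSize);
    try {
      results.push(...await fetchMany(chunk)); // 1. one request per chunk
    } catch (e) {
      // 2. retry individually so one bad entry doesn't poison the whole chunk
      const singles = await Promise.allSettled(chunk.map(fetchOne));
      results.push(...singles.filter(s => s.status === "fulfilled").map(s => s.value));
    }
  }
  return results; // 3. combined via Promise machinery
}
```

Private or missing entries simply drop out of the combined result instead of failing the batch.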
10,631 | 3,411,480,482 | IssuesEvent | 2015-12-05 03:54:23 | square/okhttp | https://api.github.com/repos/square/okhttp | closed | No "Content-Length" header if missing RequestBody.contentLength() | documentation | I had a request using a RequestBody object. I did not override the contentLength() method, but set the "Content-Length" header manually with:
Request request = new Request.Builder()
.url(uploadUri)
.post(requestBody)
.addHeader("Content-Length", String.valueOf(fileSize))
.build();
But my header was removed by OkHttp because of this code in Call class:
Response getResponse(Request request, boolean forWebSocket) throws IOException {
RequestBody body = request.body();
if(body != null) {
Builder followUpCount = request.newBuilder();
MediaType response = body.contentType();
if(response != null) {
followUpCount.header("Content-Type", response.toString());
}
long followUp = body.contentLength();
if(followUp != -1L) {
followUpCount.header("Content-Length", Long.toString(followUp));
followUpCount.removeHeader("Transfer-Encoding");
} else {
followUpCount.header("Transfer-Encoding", "chunked");
followUpCount.removeHeader("Content-Length");
}
request = followUpCount.build();
}
So basically, the lib checks whether the body has a contentLength, and if it doesn't, it removes the header set by me. I would expect an exception to be thrown, or the header not to be removed. | 1.0 | No "Content-Length" header if missing RequestBody.contentLength() - I had a request using a RequestBody object. I did not override the contentLength() method, but set the "Content-Length" header manually with:
Request request = new Request.Builder()
.url(uploadUri)
.post(requestBody)
.addHeader("Content-Length", String.valueOf(fileSize))
.build();
But my header was removed by OkHttp because of this code in Call class:
Response getResponse(Request request, boolean forWebSocket) throws IOException {
RequestBody body = request.body();
if(body != null) {
Builder followUpCount = request.newBuilder();
MediaType response = body.contentType();
if(response != null) {
followUpCount.header("Content-Type", response.toString());
}
long followUp = body.contentLength();
if(followUp != -1L) {
followUpCount.header("Content-Length", Long.toString(followUp));
followUpCount.removeHeader("Transfer-Encoding");
} else {
followUpCount.header("Transfer-Encoding", "chunked");
followUpCount.removeHeader("Content-Length");
}
request = followUpCount.build();
}
So basically how it works now is that the lib checks if body has contentLength and if it doesn't then it removes header set by me. I would expect exception being thrown, or header not removed. | non_priority | no content length header if missing requestbody contentlength i had request using requestbody object i did not override contenttype method but set content length header manually with request request new request builder url uploaduri post requestbody addheader content length string valueof filesize build but my header was removed by okhttp because of this code in call class response getresponse request request boolean forwebsocket throws ioexception requestbody body request body if body null builder followupcount request newbuilder mediatype response body contenttype if response null followupcount header content type response tostring long followup body contentlength if followup followupcount header content length long tostring followup followupcount removeheader transfer encoding else followupcount header transfer encoding chunked followupcount removeheader content length request followupcount build so basically how it works now is that the lib checks if body has contentlength and if it doesn t then it removes header set by me i would expect exception being thrown or header not removed | 0 |
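The behavior the quoted `Call` code produces can be sketched language-agnostically. The following JavaScript function is not OkHttp code; it is a toy model of the same decision: a known `contentLength()` always wins, and an unknown length (-1) forces chunked encoding and silently drops any manually set `Content-Length` header.

```javascript
// Toy model of OkHttp's header normalization for a request with a body.
function normalizeHeaders(userHeaders, contentLength) {
  const h = { ...userHeaders };
  if (contentLength !== -1) {
    h["Content-Length"] = String(contentLength); // library value overwrites user's
    delete h["Transfer-Encoding"];
  } else {
    h["Transfer-Encoding"] = "chunked";
    delete h["Content-Length"]; // the manually set header is silently discarded
  }
  return h;
}

console.log(normalizeHeaders({ "Content-Length": "1024" }, -1));
// { 'Transfer-Encoding': 'chunked' }
```

This is why overriding `contentLength()` on the `RequestBody`, rather than setting the header by hand, is the way to get a fixed-length upload.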
69,789 | 30,476,308,733 | IssuesEvent | 2023-07-17 16:46:10 | MicrosoftDocs/azure-docs | https://api.github.com/repos/MicrosoftDocs/azure-docs | closed | Move the end to the beginning | service-fabric/svc triaged assigned-to-author doc-enhancement Pri2 | It was painful and laborious to read this article and then discover the best was at the end, beginning with the section **Scenarios and configurations.**
When you go shopping you walk down aisles and scan. When you see what you are looking for, if you were required to spend many minutes reading technical details before you get to a brief, simple, elegant summary, you'd move onto the next product.
Put the summary on the top so developers can quickly find what they are looking for. They can read the details later. Provide links from the top of the page to the details and code samples.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 2639218d-5e54-aba0-ca0a-e53025b06d15
* Version Independent ID: fec67d60-26a4-ba4c-45cf-f53c74281659
* Content: [Service communication with the ASP.NET Core - Azure Service Fabric](https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-reliable-services-communication-aspnetcore)
* Content Source: [articles/service-fabric/service-fabric-reliable-services-communication-aspnetcore.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/service-fabric/service-fabric-reliable-services-communication-aspnetcore.md)
* Service: **service-fabric**
* GitHub Login: @vturecek
* Microsoft Alias: **vturecek** | 1.0 | Move the end to the beginning - It was painful and laborious to read this article and then discover the best was at the end, beginning with the section **Scenarios and configurations.**
When you go shopping you walk down aisles and scan. When you see what you are looking for, if you were required to spend many minutes reading technical details before you get to a brief, simple, elegant summary, you'd move onto the next product.
Put the summary on the top so developers can quickly find what they are looking for. They can read the details later. Provide links from the top of the page to the details and code samples.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 2639218d-5e54-aba0-ca0a-e53025b06d15
* Version Independent ID: fec67d60-26a4-ba4c-45cf-f53c74281659
* Content: [Service communication with the ASP.NET Core - Azure Service Fabric](https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-reliable-services-communication-aspnetcore)
* Content Source: [articles/service-fabric/service-fabric-reliable-services-communication-aspnetcore.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/service-fabric/service-fabric-reliable-services-communication-aspnetcore.md)
* Service: **service-fabric**
* GitHub Login: @vturecek
* Microsoft Alias: **vturecek** | non_priority | move the end to the beginning it was painful and laborious to read this article and then discover the best was at the end beginning with the section scenarios and configurations when you go shopping you walk down aisles and scan when you see what you are looking for if you were required to spend many minutes reading technical details before you get to a brief simple elegant summary you d move onto the next product put the summary on the top so developers can quickly find what they are looking for they can read the details later provide links from the top of the page to the details and code samples document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service service fabric github login vturecek microsoft alias vturecek | 0 |
2,338 | 11,777,472,170 | IssuesEvent | 2020-03-16 14:52:25 | coolOrangeLabs/powerGateTemplate | https://api.github.com/repos/coolOrangeLabs/powerGateTemplate | closed | Questionnaire: Release of assemblies | DE PGClient_Automation | ## Question
Which checks should be performed on the release of assemblies that could lead to the release being interrupted?
## TaskList
| Github Issue | Order | Estimate |
| - | - | - |
| # | 1. | ? h | | 1.0 | Questionnaire: Release of assemblies - ## Question
Which checks should be performed on the release of assemblies that could lead to the release being interrupted?
## TaskList
| Github Issue | Order | Estimate |
| - | - | - |
| # | 1. | ? h | | non_priority | questionnaire release of assemblies question which checks should be performed on the release of assemblies that could lead to the release being interrupted tasklist github issue order estimate h | 0 |
19,348 | 10,349,155,950 | IssuesEvent | 2019-09-04 21:36:48 | cordova-rtc/cordova-plugin-iosrtc | https://api.github.com/repos/cordova-rtc/cordova-plugin-iosrtc | closed | Github Security Alerts lodash.template >=4.5.0 | security | Source is following packages, on devDependencies impacted, waiting for `gulp-header` and `gulp-derequire` update.
```
gulp-header@1.8.12 > lodash.template@4.5.0
gulp-derequire@2.1.0 > gulp-util@3.0.8 > lodash.template@3.6.2
```
CVE-2019-10744 More information:
> Affected versions of lodash are vulnerable to Prototype Pollution.
> The function defaultsDeep could be tricked into adding or modifying properties of Object.prototype using a constructor payload.
Source: https://github.com/lodash/lodash/pull/4336
| True | Github Security Alerts lodash.template >=4.5.0 - Source is following packages, on devDependencies impacted, waiting for `gulp-header` and `gulp-derequire` update.
```
gulp-header@1.8.12 > lodash.template@4.5.0
gulp-derequire@2.1.0 > gulp-util@3.0.8 > lodash.template@3.6.2
```
CVE-2019-10744 More information:
> Affected versions of lodash are vulnerable to Prototype Pollution.
> The function defaultsDeep could be tricked into adding or modifying properties of Object.prototype using a constructor payload.
Source: https://github.com/lodash/lodash/pull/4336
| non_priority | github security alerts lodash template source is following packages on devdependencies impacted waiting for gulp header and gulp derequire update gulp header lodash template gulp derequire gulp util lodash template cve more information affected versions of lodash are vulnerable to prototype pollution the function defaultsdeep could be tricked into adding or modifying properties of object prototype using a constructor payload source | 0 |
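To see why the advisory's "constructor payload" is dangerous, here is a deliberately vulnerable toy deep-merge in JavaScript. This is not lodash's implementation; it merely reproduces the same class of flaw: recursing into inherited keys like `constructor` and `prototype` lets untrusted JSON write onto `Object.prototype`.

```javascript
// A naive defaultsDeep-style merge with no guard for dangerous keys.
function naiveDefaultsDeep(target, source) {
  for (const key of Object.keys(source)) {
    const val = source[key];
    if (val && typeof val === "object") {
      if (!target[key]) target[key] = {};
      naiveDefaultsDeep(target[key], val); // walks into constructor/prototype
    } else if (target[key] === undefined) {
      target[key] = val;
    }
  }
  return target;
}

// The "constructor payload" from the advisory, arriving as untrusted JSON:
const payload = JSON.parse('{"constructor": {"prototype": {"polluted": true}}}');
naiveDefaultsDeep({}, payload);

const probe = {}.polluted;   // reads from Object.prototype
console.log(probe);          // true: every object now inherits "polluted"
delete Object.prototype.polluted; // undo the pollution after the demo
```

Patched lodash versions (and any safe merge) skip `__proto__`, `constructor`, and `prototype` keys entirely.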
150,920 | 13,385,431,838 | IssuesEvent | 2020-09-02 13:27:45 | woodwardwebdev/one_page_portfolio | https://api.github.com/repos/woodwardwebdev/one_page_portfolio | opened | General file structure and documentation | documentation help wanted | As a self-taught developer learning from home, I rarely get to see 'professional' grade projects. I would like to see how someone with experience would set out the folders/files and what documentation I am missing to further improve the project and make it easier to work with others. | 1.0 | General file structure and documentation - As a self-taught developer learning from home, I rarely get to see 'professional' grade projects. I would like to see how someone with experience would set out the folders/files and what documentation I am missing to further improve the project and make it easier to work with others. | non_priority | general file structure and documentation as a self taught developer learning from home i rarely get to see professional grade projects i would like to see how someone with experience would set out the folders files and what documentation i am missing to further improve the project and make it easier to work with others | 0 |
133,767 | 10,862,188,754 | IssuesEvent | 2019-11-14 12:47:12 | servo/servo | https://api.github.com/repos/servo/servo | closed | Enter and backspace don't trigger onkeypress events for text inputs | A-content/dom A-input C-assigned C-has-manual-testcase | `./mach run -d 'data:text/html,<input value="hi" onkeypress="console.log(event)">'` | 1.0 | Enter and backspace don't trigger onkeypress events for text inputs - `./mach run -d 'data:text/html,<input value="hi" onkeypress="console.log(event)">'` | non_priority | enter and backspace don t trigger onkeypress events for text inputs mach run d data text html | 0 |
358,391 | 25,190,246,996 | IssuesEvent | 2022-11-11 23:22:41 | deislabs/ratify | https://api.github.com/repos/deislabs/ratify | closed | ECR and Cosign Walkthrough is Broken | bug documentation | - Cosign verification flow doesn't work with changes to ORAS store
- Fix must catch the 404 error returned by ECR when referrers API is queried
- Previous ORAS implementation: https://github.com/deislabs/ratify/blob/ee5d9d8213bf267721eb74934f61eee37a982311/pkg/referrerstore/oras/oras.go#L119 | 1.0 | ECR and Cosign Walkthrough is Broken - - Cosign verification flow doesn't work with changes to ORAS store
- Fix must catch the 404 error returned by ECR when referrers API is queried
- Previous ORAS implementation: https://github.com/deislabs/ratify/blob/ee5d9d8213bf267721eb74934f61eee37a982311/pkg/referrerstore/oras/oras.go#L119 | non_priority | ecr and cosign walkthrough is broken cosign verification flow doesn t work with changes to oras store fix must catch the error returned by ecr when referrers api is queried previous oras implementation | 0 |
13,807 | 8,374,235,340 | IssuesEvent | 2018-10-05 13:07:18 | dbs-leipzig/gradoop | https://api.github.com/repos/dbs-leipzig/gradoop | closed | performance of toTransaction() on a filtered graph collection | flink performance | Scenario: I have many small graphs and one huge graph in an initial graphCollection. I create a graphCollection with the small graphs by calling the select method with a filter function that selects those graphs. The toTransaction call takes more time than for an initial graphCollection without the huge graph. A possible cause is that the flatMap generates tuples for the huge graph because the graphId is contained in the idSets for each vertex.
| True | performance of toTransaction() on a filtered graph collection - Scenario: I have many small graphs and one huge graph in an initial graphCollection. I create a graphCollection with the small graphs by calling the select method with a filter function that selects those graphs. The toTransaction call takes more time than for an initial graphCollection without the huge graph. A possible cause is that the flatMap generates tuples for the huge graph because the graphId is contained in the idSets for each vertex.
| non_priority | performance of totransaction on a filtered graph collection scenario i have many small graphs and one huge graph in an initial graphcollection i create a graphcollection with the small graphs by calling the select method with a filter function that selects those graphs the totransaction call takes more time than for an initial graphcollection without the huge graph a possible cause is that the flatmap generates tuples for the huge graph because the graphid is contained in the idsets for each vertex | 0 |
77,998 | 9,652,322,254 | IssuesEvent | 2019-05-18 16:08:44 | nanograv/enterprise | https://api.github.com/repos/nanograv/enterprise | opened | use list comprehension instead of map / filter | design | The code mixes list comprehensions and `map` / `filter` with `lambda` statements. I find list comprehension to be much more readable, and `lambda` statements are generally slower than their alternatives.
I say we:
* replace all `map` and `filter` calls with list comprehension.
* eliminate `lambda` statements wherever possible | 1.0 | use list comprehension instead of map / filter - The code mixes list comprehensions and `map` / `filter` with `lambda` statements. I find list comprehension to be much more readable, and `lambda` statements are generally slower than their alternatives.
I say we:
* replace all `map` and `filter` calls with list comprehension.
* eliminate `lambda` statements wherever possible | non_priority | use list comprehension instead of map filter the code mixes list comprehensions and map filter with lambda statements i find list comprehension to be much more readable and lambda statements are generally slower than their alternatives i say we replace all map and filter calls with list comprehension eliminate lambda statements wherever possible | 0 |
224,000 | 17,651,322,700 | IssuesEvent | 2021-08-20 13:35:41 | ismail0234/status | https://api.github.com/repos/ismail0234/status | closed | 🛑 TEST WEBSITE 2 is down | status test-website-2 | In [`57458e8`](https://github.com/ismail0234/status/commit/57458e8cfb750fc0325e6913229064e6e510c7fc
), TEST WEBSITE 2 (https://asdasdasdasd2.com) was **down**:
- HTTP code: 0
- Response time: 0 ms
| 1.0 | 🛑 TEST WEBSITE 2 is down - In [`57458e8`](https://github.com/ismail0234/status/commit/57458e8cfb750fc0325e6913229064e6e510c7fc
), TEST WEBSITE 2 (https://asdasdasdasd2.com) was **down**:
- HTTP code: 0
- Response time: 0 ms
| non_priority | 🛑 test website is down in test website was down http code response time ms | 0 |
140,063 | 21,004,611,450 | IssuesEvent | 2022-03-29 21:05:16 | WatershedXiaolan/Xiaolan-s-Gitbook | https://api.github.com/repos/WatershedXiaolan/Xiaolan-s-Gitbook | opened | How many chat servers do we need? | system design | Let’s plan for 500 million connections at any time. Assuming a modern server can handle 50K concurrent connections at any time, we would need 10K such servers. | 1.0 | How many chat servers do we need? - Let’s plan for 500 million connections at any time. Assuming a modern server can handle 50K concurrent connections at any time, we would need 10K such servers. | non_priority | how many chat servers do we need let’s plan for million connections at any time assuming a modern server can handle concurrent connections at any time we would need such servers | 0 |
24,408 | 7,494,927,109 | IssuesEvent | 2018-04-07 15:25:34 | DynamoRIO/dynamorio | https://api.github.com/repos/DynamoRIO/dynamorio | closed | Statically linked client does not work | Component-Build OpSys-Windows Status-UserOrAppError | Hi,
I found that a statically linked client does not work. The same client works fine if dynamorio is used dynamically (using `dynamorio.dll`) .
I've created a very simple PoC to test it [here](https://github.com/illera88/DR_static_test).
The PoC simply prints a string to console using `dr_printf`.
To test, just use Cmake with `STATIC_DR` for a static build (which will use `configure_DynamoRIO_static`) or don't set to create a build that makes use of `dynaomorio.dll`.
If statically linked the string `This must be printed` does not get printed. Otherwise it does.
Regards | 1.0 | Statically linked client does not work - Hi,
I found that a statically linked client does not work. The same client works fine if dynamorio is used dynamically (using `dynamorio.dll`) .
I've created a very simple PoC to test it [here](https://github.com/illera88/DR_static_test).
The PoC simply prints a string to console using `dr_printf`.
To test, just use Cmake with `STATIC_DR` for a static build (which will use `configure_DynamoRIO_static`) or don't set to create a build that makes use of `dynaomorio.dll`.
If statically linked the string `This must be printed` does not get printed. Otherwise it does.
Regards | non_priority | statically linked client does not work hi i found that a statically linked client does not work the same client works fine if dynamorio is used dynamically using dynamorio dll i ve created a very simple poc to test it the poc simply prints a string to console using dr printf to test just use cmake with static dr for a static build which will use configure dynamorio static or don t set to create a build that makes use of dynaomorio dll if statically linked the string this must be printed does not get printed otherwise it does regards | 0 |
182,739 | 21,675,118,322 | IssuesEvent | 2022-05-08 15:35:08 | Hieunc-NT/WebGoat.NET | https://api.github.com/repos/Hieunc-NT/WebGoat.NET | opened | microsoft.visualstudio.web.codegeneration.design.5.0.0.nupkg: 1 vulnerabilities (highest severity is: 9.8) | security vulnerability | <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>microsoft.visualstudio.web.codegeneration.design.5.0.0.nupkg</b></p></summary>
<p></p>
<p>Path to dependency file: /WebGoatCore/WebGoatCore.csproj</p>
<p>Path to vulnerable library: /home/wss-scanner/.nuget/packages/system.text.encodings.web/4.5.0/system.text.encodings.web.4.5.0.nupkg</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/Hieunc-NT/WebGoat.NET/commit/61712e7a2c048106c69793f517a867129002f670">61712e7a2c048106c69793f517a867129002f670</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2021-26701](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-26701) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | system.text.encodings.web.4.5.0.nupkg | Transitive | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-26701</summary>
### Vulnerable Library - <b>system.text.encodings.web.4.5.0.nupkg</b></p>
<p>Provides types for encoding and escaping strings for use in JavaScript, HyperText Markup Language (H...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/system.text.encodings.web.4.5.0.nupkg">https://api.nuget.org/packages/system.text.encodings.web.4.5.0.nupkg</a></p>
<p>Path to dependency file: /WebGoatCore/WebGoatCore.csproj</p>
<p>Path to vulnerable library: /home/wss-scanner/.nuget/packages/system.text.encodings.web/4.5.0/system.text.encodings.web.4.5.0.nupkg</p>
<p>
Dependency Hierarchy:
- microsoft.visualstudio.web.codegeneration.design.5.0.0.nupkg (Root Library)
- microsoft.visualstudio.web.codegenerators.mvc.5.0.0.nupkg
- microsoft.visualstudio.web.codegeneration.5.0.0.nupkg
- microsoft.visualstudio.web.codegeneration.entityframeworkcore.5.0.0.nupkg
- microsoft.visualstudio.web.codegeneration.core.5.0.0.nupkg
- microsoft.visualstudio.web.codegeneration.templating.5.0.0.nupkg
- microsoft.aspnetcore.razor.runtime.2.2.0.nupkg
- microsoft.aspnetcore.html.abstractions.2.2.0.nupkg
- :x: **system.text.encodings.web.4.5.0.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Hieunc-NT/WebGoat.NET/commit/61712e7a2c048106c69793f517a867129002f670">61712e7a2c048106c69793f517a867129002f670</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
.NET Core Remote Code Execution Vulnerability This CVE ID is unique from CVE-2021-24112.
<p>Publish Date: 2021-02-25
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-26701>CVE-2021-26701</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/dotnet/announcements/issues/178">https://github.com/dotnet/announcements/issues/178</a></p>
<p>Release Date: 2021-02-25</p>
<p>Fix Resolution: System.Text.Encodings.Web - 4.5.1,4.7.2,5.0.1</p>
</p>
<p></p>
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
<!-- <REMEDIATE>[{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Nuget","packageName":"System.Text.Encodings.Web","packageVersion":"4.5.0","packageFilePaths":["/WebGoatCore/WebGoatCore.csproj"],"isTransitiveDependency":true,"dependencyTree":"Microsoft.VisualStudio.Web.CodeGeneration.Design:5.0.0;Microsoft.VisualStudio.Web.CodeGenerators.Mvc:5.0.0;Microsoft.VisualStudio.Web.CodeGeneration:5.0.0;Microsoft.VisualStudio.Web.CodeGeneration.EntityFrameworkCore:5.0.0;Microsoft.VisualStudio.Web.CodeGeneration.Core:5.0.0;Microsoft.VisualStudio.Web.CodeGeneration.Templating:5.0.0;Microsoft.AspNetCore.Razor.Runtime:2.2.0;Microsoft.AspNetCore.Html.Abstractions:2.2.0;System.Text.Encodings.Web:4.5.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"System.Text.Encodings.Web - 4.5.1,4.7.2,5.0.1","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-26701","vulnerabilityDetails":".NET Core Remote Code Execution Vulnerability This CVE ID is unique from CVE-2021-24112.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-26701","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}]</REMEDIATE> --> | True | microsoft.visualstudio.web.codegeneration.design.5.0.0.nupkg: 1 vulnerabilities (highest severity is: 9.8) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>microsoft.visualstudio.web.codegeneration.design.5.0.0.nupkg</b></p></summary>
<p></p>
<p>Path to dependency file: /WebGoatCore/WebGoatCore.csproj</p>
<p>Path to vulnerable library: /home/wss-scanner/.nuget/packages/system.text.encodings.web/4.5.0/system.text.encodings.web.4.5.0.nupkg</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/Hieunc-NT/WebGoat.NET/commit/61712e7a2c048106c69793f517a867129002f670">61712e7a2c048106c69793f517a867129002f670</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2021-26701](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-26701) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | system.text.encodings.web.4.5.0.nupkg | Transitive | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-26701</summary>
### Vulnerable Library - <b>system.text.encodings.web.4.5.0.nupkg</b></p>
<p>Provides types for encoding and escaping strings for use in JavaScript, HyperText Markup Language (H...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/system.text.encodings.web.4.5.0.nupkg">https://api.nuget.org/packages/system.text.encodings.web.4.5.0.nupkg</a></p>
<p>Path to dependency file: /WebGoatCore/WebGoatCore.csproj</p>
<p>Path to vulnerable library: /home/wss-scanner/.nuget/packages/system.text.encodings.web/4.5.0/system.text.encodings.web.4.5.0.nupkg</p>
<p>
Dependency Hierarchy:
- microsoft.visualstudio.web.codegeneration.design.5.0.0.nupkg (Root Library)
- microsoft.visualstudio.web.codegenerators.mvc.5.0.0.nupkg
- microsoft.visualstudio.web.codegeneration.5.0.0.nupkg
- microsoft.visualstudio.web.codegeneration.entityframeworkcore.5.0.0.nupkg
- microsoft.visualstudio.web.codegeneration.core.5.0.0.nupkg
- microsoft.visualstudio.web.codegeneration.templating.5.0.0.nupkg
- microsoft.aspnetcore.razor.runtime.2.2.0.nupkg
- microsoft.aspnetcore.html.abstractions.2.2.0.nupkg
- :x: **system.text.encodings.web.4.5.0.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Hieunc-NT/WebGoat.NET/commit/61712e7a2c048106c69793f517a867129002f670">61712e7a2c048106c69793f517a867129002f670</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
.NET Core Remote Code Execution Vulnerability This CVE ID is unique from CVE-2021-24112.
<p>Publish Date: 2021-02-25
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-26701>CVE-2021-26701</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/dotnet/announcements/issues/178">https://github.com/dotnet/announcements/issues/178</a></p>
<p>Release Date: 2021-02-25</p>
<p>Fix Resolution: System.Text.Encodings.Web - 4.5.1,4.7.2,5.0.1</p>
</p>
<p></p>
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
<!-- <REMEDIATE>[{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Nuget","packageName":"System.Text.Encodings.Web","packageVersion":"4.5.0","packageFilePaths":["/WebGoatCore/WebGoatCore.csproj"],"isTransitiveDependency":true,"dependencyTree":"Microsoft.VisualStudio.Web.CodeGeneration.Design:5.0.0;Microsoft.VisualStudio.Web.CodeGenerators.Mvc:5.0.0;Microsoft.VisualStudio.Web.CodeGeneration:5.0.0;Microsoft.VisualStudio.Web.CodeGeneration.EntityFrameworkCore:5.0.0;Microsoft.VisualStudio.Web.CodeGeneration.Core:5.0.0;Microsoft.VisualStudio.Web.CodeGeneration.Templating:5.0.0;Microsoft.AspNetCore.Razor.Runtime:2.2.0;Microsoft.AspNetCore.Html.Abstractions:2.2.0;System.Text.Encodings.Web:4.5.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"System.Text.Encodings.Web - 4.5.1,4.7.2,5.0.1","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-26701","vulnerabilityDetails":".NET Core Remote Code Execution Vulnerability This CVE ID is unique from CVE-2021-24112.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-26701","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}]</REMEDIATE> --> | non_priority | microsoft visualstudio web codegeneration design nupkg vulnerabilities highest severity is vulnerable library microsoft visualstudio web codegeneration design nupkg path to dependency file webgoatcore webgoatcore csproj path to vulnerable library home wss scanner nuget packages system text encodings web system text encodings web nupkg found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available high system text encodings web nupkg transitive n a details cve vulnerable library system text encodings web nupkg provides types for encoding and escaping strings for use in javascript hypertext 
markup language h library home page a href path to dependency file webgoatcore webgoatcore csproj path to vulnerable library home wss scanner nuget packages system text encodings web system text encodings web nupkg dependency hierarchy microsoft visualstudio web codegeneration design nupkg root library microsoft visualstudio web codegenerators mvc nupkg microsoft visualstudio web codegeneration nupkg microsoft visualstudio web codegeneration entityframeworkcore nupkg microsoft visualstudio web codegeneration core nupkg microsoft visualstudio web codegeneration templating nupkg microsoft aspnetcore razor runtime nupkg microsoft aspnetcore html abstractions nupkg x system text encodings web nupkg vulnerable library found in head commit a href found in base branch master vulnerability details net core remote code execution vulnerability this cve id is unique from cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution system text encodings web step up your open source security game with whitesource istransitivedependency true dependencytree microsoft visualstudio web codegeneration design microsoft visualstudio web codegenerators mvc microsoft visualstudio web codegeneration microsoft visualstudio web codegeneration entityframeworkcore microsoft visualstudio web codegeneration core microsoft visualstudio web codegeneration templating microsoft aspnetcore razor runtime microsoft aspnetcore html abstractions system text encodings web isminimumfixversionavailable true minimumfixversion system text encodings web isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails net core remote code execution 
vulnerability this cve id is unique from cve vulnerabilityurl | 0 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.