Unnamed: 0
int64 0
832k
| id
float64 2.49B
32.1B
| type
stringclasses 1
value | created_at
stringlengths 19
19
| repo
stringlengths 7
112
| repo_url
stringlengths 36
141
| action
stringclasses 3
values | title
stringlengths 1
744
| labels
stringlengths 4
574
| body
stringlengths 9
211k
| index
stringclasses 10
values | text_combine
stringlengths 96
211k
| label
stringclasses 2
values | text
stringlengths 96
188k
| binary_label
int64 0
1
|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
226,252
| 24,946,796,288
|
IssuesEvent
|
2022-11-01 01:27:34
|
Thezone1975/tabliss
|
https://api.github.com/repos/Thezone1975/tabliss
|
closed
|
CVE-2022-37599 (High) detected in loader-utils-1.1.0.tgz - autoclosed
|
security vulnerability
|
## CVE-2022-37599 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>loader-utils-1.1.0.tgz</b></p></summary>
<p>utils for webpack loaders</p>
<p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.1.0.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.1.0.tgz</a></p>
<p>Path to dependency file: /tabliss/package.json</p>
<p>Path to vulnerable library: /node_modules/loader-utils/package.json</p>
<p>
Dependency Hierarchy:
- copy-webpack-plugin-4.5.1.tgz (Root Library)
- :x: **loader-utils-1.1.0.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A Regular expression denial of service (ReDoS) flaw was found in Function interpolateName in interpolateName.js in webpack loader-utils 2.0.0 via the resourcePath variable in interpolateName.js.
<p>Publish Date: 2022-10-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37599>CVE-2022-37599</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-37599 (High) detected in loader-utils-1.1.0.tgz - autoclosed - ## CVE-2022-37599 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>loader-utils-1.1.0.tgz</b></p></summary>
<p>utils for webpack loaders</p>
<p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.1.0.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.1.0.tgz</a></p>
<p>Path to dependency file: /tabliss/package.json</p>
<p>Path to vulnerable library: /node_modules/loader-utils/package.json</p>
<p>
Dependency Hierarchy:
- copy-webpack-plugin-4.5.1.tgz (Root Library)
- :x: **loader-utils-1.1.0.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A Regular expression denial of service (ReDoS) flaw was found in Function interpolateName in interpolateName.js in webpack loader-utils 2.0.0 via the resourcePath variable in interpolateName.js.
<p>Publish Date: 2022-10-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37599>CVE-2022-37599</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in loader utils tgz autoclosed cve high severity vulnerability vulnerable library loader utils tgz utils for webpack loaders library home page a href path to dependency file tabliss package json path to vulnerable library node modules loader utils package json dependency hierarchy copy webpack plugin tgz root library x loader utils tgz vulnerable library vulnerability details a regular expression denial of service redos flaw was found in function interpolatename in interpolatename js in webpack loader utils via the resourcepath variable in interpolatename js publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href step up your open source security game with mend
| 0
|
17,353
| 23,175,297,293
|
IssuesEvent
|
2022-07-31 10:27:39
|
X-Sharp/XSharpPublic
|
https://api.github.com/repos/X-Sharp/XSharpPublic
|
closed
|
#xtranslate doesn't recognize the Regular match marker (xBase++ dialect)
|
bug Compiler Preprocessor
|
Describe the bug
#xtranslate expression is not recognized by the preprocessor (xBase++ dialect).
To Reproduce
```
#xtranslate ASTR(<x>) => ALLTRIM(STR(<x>))
if astr(n := val(fldleft)) == fldleft
endif
```
Expected behavior (xBase ppo)
```
if ALLTRIM(STR(n := val(fldleft))) == fldleft
endif
```
Actual behavior (x# ppo)
```
if astr(n := val(fldleft)) == fldleft
endif
```
I use version 2.12.2.0
|
1.0
|
#xtranslate doesn't recognize the Regular match marker (xBase++ dialect) - Describe the bug
#xtranslate expression is not recognized by the preprocessor (xBase++ dialect).
To Reproduce
```
#xtranslate ASTR(<x>) => ALLTRIM(STR(<x>))
if astr(n := val(fldleft)) == fldleft
endif
```
Expected behavior (xBase ppo)
```
if ALLTRIM(STR(n := val(fldleft))) == fldleft
endif
```
Actual behavior (x# ppo)
```
if astr(n := val(fldleft)) == fldleft
endif
```
I use version 2.12.2.0
|
process
|
xtranslate doesn t recognize the regular match marker xbase dialect describe the bug xtranslate expression is not recognized by the preprocessor xbase dialect to reproduce xtranslate astr alltrim str if astr n val fldleft fldleft endif expected behavior xbase ppo if alltrim str n val fldleft fldleft endif actual behavior x ppo if astr n val fldleft fldleft endif i use version
| 1
|
113,770
| 24,485,074,442
|
IssuesEvent
|
2022-10-09 10:18:10
|
dsc-tiu/open-source-ml-ai
|
https://api.github.com/repos/dsc-tiu/open-source-ml-ai
|
reopened
|
Basic machine learning algorithms.
|
hacktoberfest hacktoberfest-starters intermediate-ml code and documentation
|
Here, you are required to give self implemented machine learning algorithms. This means, submitting code that works similar to the inbuilt machine learning algorithms or error metrics like RMSE, MAE, simple regressors and classifiers.
Folder structure to follow:
```
├── Self-implemented ML Algorithms
│ ├── Error metrics
│ │ ├──Mean Absolute Error (Create sub-folder here)
│ │ ├──Root Mean Absolute Error (Create sub-folder here) and so on.
│ ├── ML Algorithms
│ │ ├──Linear Regression (Create sub-folder here)
│ │ ├──Logistic Regression (Create sub-folder here) and so on.
│ │ ├──Decision Trees(Create sub-folder here)
│ │ ├──Support Vector Machines(Create sub-folder here)
```
**P.S. If you use any datasets to check the correctness of your algorithms, then do include the links in the relevant markdown files.**
If you are interested, add a comment to this issue and I will allocate the issue to you.
Please see, you will be allotted a small portion of the topics mentioned above just so all interested people have a chance to contribute.
You can also try implementing some metric or algorithm that is not mentioned in this issue. Do so by mentioning what you plan on implementing in the comments below or creating a new issue and I will assign the issue to you.
Also notice the labels for each issue, that will give you an idea about what kind of work you are expected to do.
Cheers! 😄
|
1.0
|
Basic machine learning algorithms. - Here, you are required to give self implemented machine learning algorithms. This means, submitting code that works similar to the inbuilt machine learning algorithms or error metrics like RMSE, MAE, simple regressors and classifiers.
Folder structure to follow:
```
├── Self-implemented ML Algorithms
│ ├── Error metrics
│ │ ├──Mean Absolute Error (Create sub-folder here)
│ │ ├──Root Mean Absolute Error (Create sub-folder here) and so on.
│ ├── ML Algorithms
│ │ ├──Linear Regression (Create sub-folder here)
│ │ ├──Logistic Regression (Create sub-folder here) and so on.
│ │ ├──Decision Trees(Create sub-folder here)
│ │ ├──Support Vector Machines(Create sub-folder here)
```
**P.S. If you use any datasets to check the correctness of your algorithms, then do include the links in the relevant markdown files.**
If you are interested, add a comment to this issue and I will allocate the issue to you.
Please see, you will be allotted a small portion of the topics mentioned above just so all interested people have a chance to contribute.
You can also try implementing some metric or algorithm that is not mentioned in this issue. Do so by mentioning what you plan on implementing in the comments below or creating a new issue and I will assign the issue to you.
Also notice the labels for each issue, that will give you an idea about what kind of work you are expected to do.
Cheers! 😄
|
non_process
|
basic machine learning algorithms here you are required to give self implemented machine learning algorithms this means submitting code that works similar to the inbuilt machine learning algorithms or error metrics like rmse mae simple regressors and classifiers folder structure to follow ├── self implemented ml algorithms │ ├── error metrics │ │ ├──mean absolute error create sub folder here │ │ ├──root mean absolute error create sub folder here and so on │ ├── ml algorithms │ │ ├──linear regression create sub folder here │ │ ├──logistic regression create sub folder here and so on │ │ ├──decision trees create sub folder here │ │ ├──support vector machines create sub folder here p s if you use any datasets to check the correctness of your algorithms then do include the links in the relevant markdown files if you are interested add a comment to this issue and i will allocate the issue to you please see you will be allotted a small portion of the topics mentioned above just so all interested people have a chance to contribute you can also try implementing some metric or algorithm that is not mentioned in this issue do so by mentioning what you plan on implementing in the comments below or creating a new issue and i will assign the issue to you also notice the labels for each issue that will give you an idea about what kind of work you are expected to do cheers 😄
| 0
|
335,514
| 10,154,583,350
|
IssuesEvent
|
2019-08-06 08:23:30
|
bbc/simorgh
|
https://api.github.com/repos/bbc/simorgh
|
closed
|
Local Test Data errors
|
articles-current-epic articles-features-stream bug high priority
|
**Describe the bug**
When I run simorgh locally and visit any of these following test pages, I see console errors
http://localhost:7080/news/articles/c0000000002o
http://localhost:7080/news/articles/c0000000003o
http://localhost:7080/news/articles/c0000000004o
http://localhost:7080/news/articles/c0000000005o
http://localhost:7080/news/articles/c0000000006o
http://localhost:7080/news/articles/c0000000009o
http://localhost:7080/news/articles/c0000000012o
http://localhost:7080/news/articles/c0000000013o
http://localhost:7080/news/articles/c0000000015o
http://localhost:7080/news/articles/c0000000016o
http://localhost:7080/news/articles/c0000000017o
http://localhost:7080/news/articles/c0000000018o
http://localhost:7080/news/articles/c0000000019o
http://localhost:7080/news/articles/c0000000022o
**To Reproduce**
Steps to reproduce the behavior:
1. `git checkout latest && npm ci && npm run dev`
2. Open dev console in a browser & visit http://localhost:7080/news/articles/c0000000002o
3. See error
```
Warning: Failed prop type: Invalid prop `data.data.content.model.blocks[2]` supplied to `ArticleContainer`.
in ArticleContainer (created by Route)
in Route (created by a)
in Switch (created by a)
in a (created by Route)
in Route (created by withRouter(a))
in withRouter(a) (created by s)
in Router (created by BrowserRouter)
in BrowserRouter (created by s)
in s (at client.js:21) proxyConsole.js:54
Warning: Failed prop type: Invalid prop `blocks[0]` supplied to `TextContainer`.
in TextContainer (at Blocks/index.jsx:20)
in Blocks (at Article/index.jsx:95)
in div (created by GridItemConstrained)
in GridItemConstrained (at Article/index.jsx:94)
in div (created by Wrapper)
in Wrapper (created by OatWrapper)
in OatWrapper (at Article/index.jsx:93)
in main (at Article/index.jsx:83)
in PlatformContextProvider (at Article/index.jsx:76)
in ServiceContextProvider (at Article/index.jsx:75)
in ArticleContainer (created by Route)
in Route (created by a)
in Switch (created by a)
in a (created by Route)
in Route (created by withRouter(a))
in withRouter(a) (created by s)
in Router (created by BrowserRouter)
in BrowserRouter (created by s)
in s (at client.js:21) proxyConsole.js:54
Warning: Failed prop type: Invalid prop `blocks[1]` supplied to `ParagraphContainer`.
in ParagraphContainer (at Blocks/index.jsx:20)
in Blocks (at Text/index.jsx:19)
in TextContainer (at Blocks/index.jsx:20)
in Blocks (at Article/index.jsx:95)
in div (created by GridItemConstrained)
in GridItemConstrained (at Article/index.jsx:94)
in div (created by Wrapper)
in Wrapper (created by OatWrapper)
in OatWrapper (at Article/index.jsx:93)
in main (at Article/index.jsx:83)
in PlatformContextProvider (at Article/index.jsx:76)
in ServiceContextProvider (at Article/index.jsx:75)
in ArticleContainer (created by Route)
in Route (created by a)
in Switch (created by a)
in a (created by Route)
in Route (created by withRouter(a))
in withRouter(a) (created by s)
in Router (created by BrowserRouter)
in BrowserRouter (created by s)
in s (at client.js:21)
```
**Expected behavior**
No such errors.
**Screenshots**
Screenshot of the error pasted above.
<img width="597" alt="screen shot 2018-12-04 at 12 28 07" src="https://user-images.githubusercontent.com/3028997/49441947-eb1a8c80-f7bf-11e8-9fa7-daa27a70e662.png">
**Desktop (please complete the following information):**
- OS: Mac OS X
- Browser: Firefox Nightly v 65
- [x] Initially labelled with ["bug"](https://github.com/BBC-News/simorgh/labels/bug)
|
1.0
|
Local Test Data errors - **Describe the bug**
When I run simorgh locally and visit any of these following test pages, I see console errors
http://localhost:7080/news/articles/c0000000002o
http://localhost:7080/news/articles/c0000000003o
http://localhost:7080/news/articles/c0000000004o
http://localhost:7080/news/articles/c0000000005o
http://localhost:7080/news/articles/c0000000006o
http://localhost:7080/news/articles/c0000000009o
http://localhost:7080/news/articles/c0000000012o
http://localhost:7080/news/articles/c0000000013o
http://localhost:7080/news/articles/c0000000015o
http://localhost:7080/news/articles/c0000000016o
http://localhost:7080/news/articles/c0000000017o
http://localhost:7080/news/articles/c0000000018o
http://localhost:7080/news/articles/c0000000019o
http://localhost:7080/news/articles/c0000000022o
**To Reproduce**
Steps to reproduce the behavior:
1. `git checkout latest && npm ci && npm run dev`
2. Open dev console in a browser & visit http://localhost:7080/news/articles/c0000000002o
3. See error
```
Warning: Failed prop type: Invalid prop `data.data.content.model.blocks[2]` supplied to `ArticleContainer`.
in ArticleContainer (created by Route)
in Route (created by a)
in Switch (created by a)
in a (created by Route)
in Route (created by withRouter(a))
in withRouter(a) (created by s)
in Router (created by BrowserRouter)
in BrowserRouter (created by s)
in s (at client.js:21) proxyConsole.js:54
Warning: Failed prop type: Invalid prop `blocks[0]` supplied to `TextContainer`.
in TextContainer (at Blocks/index.jsx:20)
in Blocks (at Article/index.jsx:95)
in div (created by GridItemConstrained)
in GridItemConstrained (at Article/index.jsx:94)
in div (created by Wrapper)
in Wrapper (created by OatWrapper)
in OatWrapper (at Article/index.jsx:93)
in main (at Article/index.jsx:83)
in PlatformContextProvider (at Article/index.jsx:76)
in ServiceContextProvider (at Article/index.jsx:75)
in ArticleContainer (created by Route)
in Route (created by a)
in Switch (created by a)
in a (created by Route)
in Route (created by withRouter(a))
in withRouter(a) (created by s)
in Router (created by BrowserRouter)
in BrowserRouter (created by s)
in s (at client.js:21) proxyConsole.js:54
Warning: Failed prop type: Invalid prop `blocks[1]` supplied to `ParagraphContainer`.
in ParagraphContainer (at Blocks/index.jsx:20)
in Blocks (at Text/index.jsx:19)
in TextContainer (at Blocks/index.jsx:20)
in Blocks (at Article/index.jsx:95)
in div (created by GridItemConstrained)
in GridItemConstrained (at Article/index.jsx:94)
in div (created by Wrapper)
in Wrapper (created by OatWrapper)
in OatWrapper (at Article/index.jsx:93)
in main (at Article/index.jsx:83)
in PlatformContextProvider (at Article/index.jsx:76)
in ServiceContextProvider (at Article/index.jsx:75)
in ArticleContainer (created by Route)
in Route (created by a)
in Switch (created by a)
in a (created by Route)
in Route (created by withRouter(a))
in withRouter(a) (created by s)
in Router (created by BrowserRouter)
in BrowserRouter (created by s)
in s (at client.js:21)
```
**Expected behavior**
No such errors.
**Screenshots**
Screenshot of the error pasted above.
<img width="597" alt="screen shot 2018-12-04 at 12 28 07" src="https://user-images.githubusercontent.com/3028997/49441947-eb1a8c80-f7bf-11e8-9fa7-daa27a70e662.png">
**Desktop (please complete the following information):**
- OS: Mac OS X
- Browser: Firefox Nightly v 65
- [x] Initially labelled with ["bug"](https://github.com/BBC-News/simorgh/labels/bug)
|
non_process
|
local test data errors describe the bug when i run simorgh locally and visit any of these following test pages i see console errors to reproduce steps to reproduce the behavior git checkout latest npm ci npm run dev open dev console in a browser visit see error warning failed prop type invalid prop data data content model blocks supplied to articlecontainer in articlecontainer created by route in route created by a in switch created by a in a created by route in route created by withrouter a in withrouter a created by s in router created by browserrouter in browserrouter created by s in s at client js proxyconsole js warning failed prop type invalid prop blocks supplied to textcontainer in textcontainer at blocks index jsx in blocks at article index jsx in div created by griditemconstrained in griditemconstrained at article index jsx in div created by wrapper in wrapper created by oatwrapper in oatwrapper at article index jsx in main at article index jsx in platformcontextprovider at article index jsx in servicecontextprovider at article index jsx in articlecontainer created by route in route created by a in switch created by a in a created by route in route created by withrouter a in withrouter a created by s in router created by browserrouter in browserrouter created by s in s at client js proxyconsole js warning failed prop type invalid prop blocks supplied to paragraphcontainer in paragraphcontainer at blocks index jsx in blocks at text index jsx in textcontainer at blocks index jsx in blocks at article index jsx in div created by griditemconstrained in griditemconstrained at article index jsx in div created by wrapper in wrapper created by oatwrapper in oatwrapper at article index jsx in main at article index jsx in platformcontextprovider at article index jsx in servicecontextprovider at article index jsx in articlecontainer created by route in route created by a in switch created by a in a created by route in route created by withrouter a in withrouter a created by s in router created by browserrouter in browserrouter created by s in s at client js expected behavior no such errors screenshots screenshot of the error pasted above img width alt screen shot at src desktop please complete the following information os mac os x browser firefox nightly v initially labelled with
| 0
|
166,425
| 20,718,495,798
|
IssuesEvent
|
2022-03-13 01:56:16
|
jinuem/IonicV2Tabs
|
https://api.github.com/repos/jinuem/IonicV2Tabs
|
opened
|
CVE-2021-32803 (High) detected in tar-2.2.1.tgz
|
security vulnerability
|
## CVE-2021-32803 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-2.2.1.tgz</b></p></summary>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-2.2.1.tgz">https://registry.npmjs.org/tar/-/tar-2.2.1.tgz</a></p>
<p>Path to dependency file: /IonicV2Tabs/package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- app-scripts-1.3.5.tgz (Root Library)
- node-sass-4.5.0.tgz
- node-gyp-3.8.0.tgz
- :x: **tar-2.2.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The npm package "tar" (aka node-tar) before versions 6.1.2, 5.0.7, 4.4.15, and 3.2.3 has an arbitrary File Creation/Overwrite vulnerability via insufficient symlink protection. `node-tar` aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary `stat` calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory. This order of operations resulted in the directory being created and added to the `node-tar` directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where `node-tar` checks for symlinks occur. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass `node-tar` symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.3, 4.4.15, 5.0.7 and 6.1.2.
<p>Publish Date: 2021-08-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803>CVE-2021-32803</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw">https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw</a></p>
<p>Release Date: 2021-08-03</p>
<p>Fix Resolution: tar - 3.2.3, 4.4.15, 5.0.7, 6.1.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-32803 (High) detected in tar-2.2.1.tgz - ## CVE-2021-32803 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-2.2.1.tgz</b></p></summary>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-2.2.1.tgz">https://registry.npmjs.org/tar/-/tar-2.2.1.tgz</a></p>
<p>Path to dependency file: /IonicV2Tabs/package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- app-scripts-1.3.5.tgz (Root Library)
- node-sass-4.5.0.tgz
- node-gyp-3.8.0.tgz
- :x: **tar-2.2.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The npm package "tar" (aka node-tar) before versions 6.1.2, 5.0.7, 4.4.15, and 3.2.3 has an arbitrary File Creation/Overwrite vulnerability via insufficient symlink protection. `node-tar` aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary `stat` calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory. This order of operations resulted in the directory being created and added to the `node-tar` directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where `node-tar` checks for symlinks occur. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass `node-tar` symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.3, 4.4.15, 5.0.7 and 6.1.2.
<p>Publish Date: 2021-08-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803>CVE-2021-32803</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw">https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw</a></p>
<p>Release Date: 2021-08-03</p>
<p>Fix Resolution: tar - 3.2.3, 4.4.15, 5.0.7, 6.1.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in tar tgz cve high severity vulnerability vulnerable library tar tgz tar for node library home page a href path to dependency file package json path to vulnerable library node modules tar package json dependency hierarchy app scripts tgz root library node sass tgz node gyp tgz x tar tgz vulnerable library vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite vulnerability via insufficient symlink protection node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory this order of operations resulted in the directory being created and added to the node tar directory cache when a directory is present in the directory cache subsequent calls to mkdir for that directory are skipped however this is also where node tar checks for symlinks occur by first creating a directory and then replacing that directory with a symlink it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite this issue was addressed in releases and publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar step up your open source security game with whitesource
| 0
|
196,415
| 22,441,737,521
|
IssuesEvent
|
2022-06-21 02:07:33
|
KingdomB/mern-ecommerce
|
https://api.github.com/repos/KingdomB/mern-ecommerce
|
opened
|
CVE-2022-33987 (Medium) detected in got-9.6.0.tgz
|
security vulnerability
|
## CVE-2022-33987 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>got-9.6.0.tgz</b></p></summary>
<p>Simplified HTTP requests</p>
<p>Library home page: <a href="https://registry.npmjs.org/got/-/got-9.6.0.tgz">https://registry.npmjs.org/got/-/got-9.6.0.tgz</a></p>
<p>Path to dependency file: /ecommerce-back-end/node_modules/got/package.json</p>
<p>Path to vulnerable library: /ecommerce-back-end/node_modules/got/package.json</p>
<p>
Dependency Hierarchy:
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The got package before 12.1.0 for Node.js allows a redirect to a UNIX socket.
<p>Publish Date: 2022-06-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-33987>CVE-2022-33987</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-33987">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-33987</a></p>
<p>Release Date: 2022-06-18</p>
<p>Fix Resolution: 12.0.0-beta.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-33987 (Medium) detected in got-9.6.0.tgz - ## CVE-2022-33987 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>got-9.6.0.tgz</b></p></summary>
<p>Simplified HTTP requests</p>
<p>Library home page: <a href="https://registry.npmjs.org/got/-/got-9.6.0.tgz">https://registry.npmjs.org/got/-/got-9.6.0.tgz</a></p>
<p>Path to dependency file: /ecommerce-back-end/node_modules/got/package.json</p>
<p>Path to vulnerable library: /ecommerce-back-end/node_modules/got/package.json</p>
<p>
Dependency Hierarchy:
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The got package before 12.1.0 for Node.js allows a redirect to a UNIX socket.
<p>Publish Date: 2022-06-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-33987>CVE-2022-33987</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-33987">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-33987</a></p>
<p>Release Date: 2022-06-18</p>
<p>Fix Resolution: 12.0.0-beta.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in got tgz cve medium severity vulnerability vulnerable library got tgz simplified http requests library home page a href path to dependency file ecommerce back end node modules got package json path to vulnerable library ecommerce back end node modules got package json dependency hierarchy found in base branch master vulnerability details the got package before for node js allows a redirect to a unix socket publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution beta step up your open source security game with mend
| 0
|
778,438
| 27,316,186,887
|
IssuesEvent
|
2023-02-24 15:52:38
|
aquasecurity/trivy
|
https://api.github.com/repos/aquasecurity/trivy
|
closed
|
Add Go license detection
|
kind/feature priority/important-longterm scan/license
|
## Description
Detect licenses of go modules.
1. Assume `go mod download`
2. Looks for license files in $GOPATH/pkg/mod
- e.g. $GOPATH/pkg/mod/github.com/stretchr/testify@v1.7.0/LICENSE
3. Classify the license type
## Related
https://github.com/aquasecurity/trivy/issues/1835
|
1.0
|
Add Go license detection - ## Description
Detect licenses of go modules.
1. Assume `go mod download`
2. Looks for license files in $GOPATH/pkg/mod
- e.g. $GOPATH/pkg/mod/github.com/stretchr/testify@v1.7.0/LICENSE
3. Classify the license type
## Related
https://github.com/aquasecurity/trivy/issues/1835
|
non_process
|
add go license detection description detect licenses of go modules assume go mod download looks for license files in gopath pkg mod e g gopath pkg mod github com stretchr testify license classify the license type related
| 0
|
14,633
| 17,768,059,033
|
IssuesEvent
|
2021-08-30 10:05:47
|
KI-Vorlesung/kitest
|
https://api.github.com/repos/KI-Vorlesung/kitest
|
closed
|
Literatur, Zitieren
|
WEB SLIDES PRE-PROCESSING
|
In Pandoc kann man mit `@key` und dem in Pandoc enthaltenen `citeproc` Literaturverweise auflösen und Literaturverzeichnisse erstellen. Hugo kennt dieses Konzept nicht.
Für Hugo wurde in diesem Projekt ein Mechanismus mit einem Data-Template `readings.yaml` (erinnert an ein stark vereinfachtes Bib-File) und einem Shortcode bzw. Partial und dem Key `readings` im YAML-Header einer Seite implementiert.
Pandoc/Citeproc kann aber neben Bibtex auch YAML und andere Formate lesen. Es kann sogar zwischen den Formaten übersetzen: Mit `pandoc -s -f biblatex -t markdown <bibtex-file>` kann man aus einem Bibtex-File ein entsprechendes YAML-File erzeugen (https://pandoc.org/MANUAL.html#citations).
Idee:
- [x] BibTeX: Die vorhandenen Bibtex-Files mit dem Projekt versionieren und pflegen. Das ermöglicht auch die Integration mit verbreiteten Reference Management Tools ([JabRef](https://www.jabref.org/), [Zotero](https://www.zotero.org/), [Mendeley](https://www.mendeley.com), ...).
- [x] Vorverarbeitung: In der Vorverarbeitung wird das Bibtex-File mit Pandoc als YAML-Data-Template exportiert: `pandoc -s -f biblatex -t markdown <bibtex-file> -o ./static/readings.yaml`
- [x] YAML/WEB: Format des generierten YAML klären und den Shortcode und das Partial entsprechend anpassen, damit die Seitengenerierung mit Hugo weiterhin funktioniert.
- [x] SLIDES: Beim Erzeugen von Folien wie bisher Pandoc mit Citeproc und dem Bibtex-File laufen lassen. Da auf Slides oft kein eigenes Literaturverzeichnis existiert, evtl. statt des Keys+Link analog zu Webseite ein "Author: Titel, Jahr" ersetzen? Zusätzlich im Defaults-File die Optionen `metadata: suppress-bibliography: true, link-citations: false` beibehalten!
- [x] Vorverarbeitung/WEB: Pandoc+Filter, um aus den `@key` in der Seite einen Text mit "Author: Titel, Jahr" zu machen und in das Literaturverzeichnis verlinken. Zusätzlich muss die Variable `readings` der Seite entsprechend ergänzt werden.
- Filter:
- Start: Metadata extrahieren
- Auf Elemente vom Typ `Inline`: `Cite` reagieren
- `el.t == "Cite"; el.citations[i] => Referenz`
- Element `@Key` durch Link auf Literaturverzeichnis ersetzen: `[Key](#key)`
- Key zu Meta-Schlüssel `readings` hinzufügen (Achtung: `readings` fehlt, doppelte Einträge)
- Ende: Metadata zurückschreiben
- Shortcode/Partial `bib.html`:
- Anker `#key` zu jedem Eintrag generieren
Beispiel:
``` yaml
readings:
- key: "AIMA"
- key: "Ertel"
comment: "Kapitel 2 und 3"
```
```json
"readings":{"t":"MetaList","c":[
{"t":"MetaMap","c":
{"key":{"t":"MetaInlines","c":[{"t":"Str","c":"AIMA"}]}}
},
{"t":"MetaMap","c":
{"comment":{"t":"MetaInlines","c":[{"t":"Str","c":"Kapitel"},{"t":"Space"},{"t":"Str","c":"2"},{"t":"Space"},{"t":"Str","c":"und"},{"t":"Space"},{"t":"Str","c":"3"}]},
"key":{"t":"MetaInlines","c":[{"t":"Str","c":"Ertel"}]}}
},
{"t":"MetaMap","c":{
"key":{"t":"MetaInlines","c":[{"t":"Str","c":"Wuppie"}]}}
},
{"t":"MetaMap","c":
{"comment":{"t":"MetaInlines","c":[{"t":"Str","c":"UGH!!"}]},
"key":{"t":"MetaInlines","c":[{"t":"Str","c":"Fluppie"}]}}
}
]}
```
```markdown
Blablab @Russell2020 blublbubl [@Russell2020] bla [@Russell2020, S.10] ...
Hier eine Mehrfachzitierung [@Russell2020; @Pedregosa2011]
```
```json
{"t":"Para","c":[
{"t":"Str","c":"Blablab"},
{"t":"Space"},
{"t":"Cite","c":[
[{"citationId":"Russell2020",
"citationPrefix":[],
"citationSuffix":[],
"citationMode":{"t":"AuthorInText"},
"citationNoteNum":1,
"citationHash":0}],
[{"t":"Str","c":"@Russell2020"}]]},
{"t":"Space"},
{"t":"Str","c":"blublbubl"},
{"t":"Space"},
{"t":"Cite","c":[
[{"citationId":"Russell2020",
"citationPrefix":[],
"citationSuffix":[],
"citationMode":{"t":"NormalCitation"},
"citationNoteNum":2,
"citationHash":0}],
[{"t":"Str","c":"[@Russell2020]"}]]},
{"t":"Space"},
{"t":"Str","c":"bla"},
{"t":"Space"},
{"t":"Cite","c":[
[{"citationId":"Russell2020",
"citationPrefix":[],
"citationSuffix":[{"t":"Str","c":","},{"t":"Space"},{"t":"Str","c":"S.10"}],
"citationMode":{"t":"NormalCitation"},
"citationNoteNum":3,
"citationHash":0}],
[{"t":"Str","c":"[@Russell2020,"},{"t":"Space"},{"t":"Str","c":"S.10]"}]]},
{"t":"Space"},
{"t":"Str","c":"…"}]},
{"t":"Para","c":[
{"t":"Str","c":"Hier"},
{"t":"Space"},
{"t":"Str","c":"eine"},
{"t":"Space"},
{"t":"Str","c":"Mehrfachzitierung"},
{"t":"Space"},
{"t":"Cite","c":[
[
{"citationId":"Russell2020",
"citationPrefix":[],
"citationSuffix":[],
"citationMode":{"t":"NormalCitation"},
"citationNoteNum":4,
"citationHash":0},
{"citationId":"Pedregosa2011",
"citationPrefix":[],
"citationSuffix":[],
"citationMode":{"t":"NormalCitation"},
"citationNoteNum":4,
"citationHash":0}
],
[{"t":"Str","c":"[@Russell2020;"},{"t":"Space"},{"t":"Str","c":"@Pedregosa2011]"}]]}]},
```
---
siehe auch https://github.com/Compilerbau/Lecture/issues/14
|
1.0
|
Literatur, Zitieren - In Pandoc kann man mit `@key` und dem in Pandoc enthaltenen `citeproc` Literaturverweise auflösen und Literaturverzeichnisse erstellen. Hugo kennt dieses Konzept nicht.
Für Hugo wurde in diesem Projekt ein Mechanismus mit einem Data-Template `readings.yaml` (erinnert an ein stark vereinfachtes Bib-File) und einem Shortcode bzw. Partial und dem Key `readings` im YAML-Header einer Seite implementiert.
Pandoc/Citeproc kann aber neben Bibtex auch YAML und andere Formate lesen. Es kann sogar zwischen den Formaten übersetzen: Mit `pandoc -s -f biblatex -t markdown <bibtex-file>` kann man aus einem Bibtex-File ein entsprechendes YAML-File erzeugen (https://pandoc.org/MANUAL.html#citations).
Idee:
- [x] BibTeX: Die vorhandenen Bibtex-Files mit dem Projekt versionieren und pflegen. Das ermöglicht auch die Integration mit verbreiteten Reference Management Tools ([JabRef](https://www.jabref.org/), [Zotero](https://www.zotero.org/), [Mendeley](https://www.mendeley.com), ...).
- [x] Vorverarbeitung: In der Vorverarbeitung wird das Bibtex-File mit Pandoc als YAML-Data-Template exportiert: `pandoc -s -f biblatex -t markdown <bibtex-file> -o ./static/readings.yaml`
- [x] YAML/WEB: Format des generierten YAML klären und den Shortcode und das Partial entsprechend anpassen, damit die Seitengenerierung mit Hugo weiterhin funktioniert.
- [x] SLIDES: Beim Erzeugen von Folien wie bisher Pandoc mit Citeproc und dem Bibtex-File laufen lassen. Da auf Slides oft kein eigenes Literaturverzeichnis existiert, evtl. statt des Keys+Link analog zu Webseite ein "Author: Titel, Jahr" ersetzen? Zusätzlich im Defaults-File die Optionen `metadata: suppress-bibliography: true, link-citations: false` beibehalten!
- [x] Vorverarbeitung/WEB: Pandoc+Filter, um aus den `@key` in der Seite einen Text mit "Author: Titel, Jahr" zu machen und in das Literaturverzeichnis verlinken. Zusätzlich muss die Variable `readings` der Seite entsprechend ergänzt werden.
- Filter:
- Start: Metadata extrahieren
- Auf Elemente vom Typ `Inline`: `Cite` reagieren
- `el.t == "Cite"; el.citations[i] => Referenz`
- Element `@Key` durch Link auf Literaturverzeichnis ersetzen: `[Key](#key)`
- Key zu Meta-Schlüssel `readings` hinzufügen (Achtung: `readings` fehlt, doppelte Einträge)
- Ende: Metadata zurückschreiben
- Shortcode/Partial `bib.html`:
- Anker `#key` zu jedem Eintrag generieren
Beispiel:
``` yaml
readings:
- key: "AIMA"
- key: "Ertel"
comment: "Kapitel 2 und 3"
```
```json
"readings":{"t":"MetaList","c":[
{"t":"MetaMap","c":
{"key":{"t":"MetaInlines","c":[{"t":"Str","c":"AIMA"}]}}
},
{"t":"MetaMap","c":
{"comment":{"t":"MetaInlines","c":[{"t":"Str","c":"Kapitel"},{"t":"Space"},{"t":"Str","c":"2"},{"t":"Space"},{"t":"Str","c":"und"},{"t":"Space"},{"t":"Str","c":"3"}]},
"key":{"t":"MetaInlines","c":[{"t":"Str","c":"Ertel"}]}}
},
{"t":"MetaMap","c":{
"key":{"t":"MetaInlines","c":[{"t":"Str","c":"Wuppie"}]}}
},
{"t":"MetaMap","c":
{"comment":{"t":"MetaInlines","c":[{"t":"Str","c":"UGH!!"}]},
"key":{"t":"MetaInlines","c":[{"t":"Str","c":"Fluppie"}]}}
}
]}
```
```markdown
Blablab @Russell2020 blublbubl [@Russell2020] bla [@Russell2020, S.10] ...
Hier eine Mehrfachzitierung [@Russell2020; @Pedregosa2011]
```
```json
{"t":"Para","c":[
{"t":"Str","c":"Blablab"},
{"t":"Space"},
{"t":"Cite","c":[
[{"citationId":"Russell2020",
"citationPrefix":[],
"citationSuffix":[],
"citationMode":{"t":"AuthorInText"},
"citationNoteNum":1,
"citationHash":0}],
[{"t":"Str","c":"@Russell2020"}]]},
{"t":"Space"},
{"t":"Str","c":"blublbubl"},
{"t":"Space"},
{"t":"Cite","c":[
[{"citationId":"Russell2020",
"citationPrefix":[],
"citationSuffix":[],
"citationMode":{"t":"NormalCitation"},
"citationNoteNum":2,
"citationHash":0}],
[{"t":"Str","c":"[@Russell2020]"}]]},
{"t":"Space"},
{"t":"Str","c":"bla"},
{"t":"Space"},
{"t":"Cite","c":[
[{"citationId":"Russell2020",
"citationPrefix":[],
"citationSuffix":[{"t":"Str","c":","},{"t":"Space"},{"t":"Str","c":"S.10"}],
"citationMode":{"t":"NormalCitation"},
"citationNoteNum":3,
"citationHash":0}],
[{"t":"Str","c":"[@Russell2020,"},{"t":"Space"},{"t":"Str","c":"S.10]"}]]},
{"t":"Space"},
{"t":"Str","c":"…"}]},
{"t":"Para","c":[
{"t":"Str","c":"Hier"},
{"t":"Space"},
{"t":"Str","c":"eine"},
{"t":"Space"},
{"t":"Str","c":"Mehrfachzitierung"},
{"t":"Space"},
{"t":"Cite","c":[
[
{"citationId":"Russell2020",
"citationPrefix":[],
"citationSuffix":[],
"citationMode":{"t":"NormalCitation"},
"citationNoteNum":4,
"citationHash":0},
{"citationId":"Pedregosa2011",
"citationPrefix":[],
"citationSuffix":[],
"citationMode":{"t":"NormalCitation"},
"citationNoteNum":4,
"citationHash":0}
],
[{"t":"Str","c":"[@Russell2020;"},{"t":"Space"},{"t":"Str","c":"@Pedregosa2011]"}]]}]},
```
---
siehe auch https://github.com/Compilerbau/Lecture/issues/14
|
process
|
literatur zitieren in pandoc kann man mit key und dem in pandoc enthaltenen citeproc literaturverweise auflösen und literaturverzeichnisse erstellen hugo kennt dieses konzept nicht für hugo wurde in diesem projekt ein mechanismus mit einem data template readings yaml erinnert an ein stark vereinfachtes bib file und einem shortcode bzw partial und dem key readings im yaml header einer seite implementiert pandoc citeproc kann aber neben bibtex auch yaml und andere formate lesen es kann sogar zwischen den formaten übersetzen mit pandoc s f biblatex t markdown kann man aus einem bibtex file ein entsprechendes yaml file erzeugen idee bibtex die vorhandenen bibtex files mit dem projekt versionieren und pflegen das ermöglicht auch die integration mit verbreiteten reference management tools vorverarbeitung in der vorverarbeitung wird das bibtex file mit pandoc als yaml data template exportiert pandoc s f biblatex t markdown o static readings yaml yaml web format des generierten yaml klären und den shortcode und das partial entsprechend anpassen damit die seitengenerierung mit hugo weiterhin funktioniert slides beim erzeugen von folien wie bisher pandoc mit citeproc und dem bibtex file laufen lassen da auf slides oft kein eigenes literaturverzeichnis existiert evtl statt des keys link analog zu webseite ein author titel jahr ersetzen zusätzlich im defaults file die optionen metadata suppress bibliography true link citations false beibehalten vorverarbeitung web pandoc filter um aus den key in der seite einen text mit author titel jahr zu machen und in das literaturverzeichnis verlinken zusätzlich muss die variable readings der seite entsprechend ergänzt werden filter start metadata extrahieren auf elemente vom typ inline cite reagieren el t cite el citations referenz element key durch link auf literaturverzeichnis ersetzen key key zu meta schlüssel readings hinzufügen achtung readings fehlt doppelte einträge ende metadata zurückschreiben shortcode partial bib html anker key zu jedem eintrag generieren beispiel yaml readings key aima key ertel comment kapitel und json readings t metalist c t metamap c key t metainlines c t metamap c comment t metainlines c key t metainlines c t metamap c key t metainlines c t metamap c comment t metainlines c key t metainlines c markdown blablab blublbubl bla hier eine mehrfachzitierung json t para c t str c blablab t space t cite c citationid citationprefix citationsuffix citationmode t authorintext citationnotenum citationhash t space t str c blublbubl t space t cite c citationid citationprefix citationsuffix citationmode t normalcitation citationnotenum citationhash t space t str c bla t space t cite c citationid citationprefix citationsuffix citationmode t normalcitation citationnotenum citationhash t space t str c … t para c t str c hier t space t str c eine t space t str c mehrfachzitierung t space t cite c citationid citationprefix citationsuffix citationmode t normalcitation citationnotenum citationhash citationid citationprefix citationsuffix citationmode t normalcitation citationnotenum citationhash siehe auch
| 1
|
10,719
| 13,522,867,791
|
IssuesEvent
|
2020-09-15 09:09:11
|
chavarera/python-mini-projects
|
https://api.github.com/repos/chavarera/python-mini-projects
|
closed
|
Write a script to reduce image file size
|
Assigned Image-processing Learn beginner boring-stuffs good first issue
|
# Description
Write a script to reduce image file size
## Type of issue
- [x] Feature (New Script)
## Checklist:
- [x] I have read the project guidelines.
- [x] I have checked all the existing projects, before submitting a new project issue.
- [x] I have checked previous issues to avoid duplicates.
- [x] This issue will be meaningful for the project.
<!-- Uncomment this in case you have a issue related to a bug in existing code.-->
<!--
- [ ] I have added screenshots of the bug
- [ ] I have added steps to reproduce the bug
- [ ] I have proposed a possible solution for the bug
-->
|
1.0
|
Write a script to reduce image file size - # Description
Write a script to reduce image file size
## Type of issue
- [x] Feature (New Script)
## Checklist:
- [x] I have read the project guidelines.
- [x] I have checked all the existing projects, before submitting a new project issue.
- [x] I have checked previous issues to avoid duplicates.
- [x] This issue will be meaningful for the project.
<!-- Uncomment this in case you have a issue related to a bug in existing code.-->
<!--
- [ ] I have added screenshots of the bug
- [ ] I have added steps to reproduce the bug
- [ ] I have proposed a possible solution for the bug
-->
|
process
|
write a script to reduce image file size description write a script to reduce image file size type of issue feature new script checklist i have read the project guidelines i have checked all the existing projects before submitting a new project issue i have checked previous issues to avoid duplicates this issue will be meaningful for the project i have added screenshots of the bug i have added steps to reproduce the bug i have proposed a possible solution for the bug
| 1
|
11,511
| 14,396,080,317
|
IssuesEvent
|
2020-12-03 05:27:29
|
LoveShack-Inc/represent
|
https://api.github.com/repos/LoveShack-Inc/represent
|
opened
|
CT PDFs we don't currently process
|
processor
|
## Desc
Ongoing list of PDFs that we don't currently process, but that we should
#### Tally Sheets
Examples
- https://www.cga.ct.gov/2011/TS/s/pdf/2011SB-00366-R00APP-CV103-TS.pdf
- https://www.cga.ct.gov/2017/TS/h/pdf/2017HB-07008-R00HS-CV23-TS.pdf
- https://www.cga.ct.gov/2013/TS/h/pdf/2013HB-05062-R00HSG-CV66-TS.pdf
|
1.0
|
CT PDFs we don't currently process - ## Desc
Ongoing list of PDFs that we don't currently process, but that we should
#### Tally Sheets
Examples
- https://www.cga.ct.gov/2011/TS/s/pdf/2011SB-00366-R00APP-CV103-TS.pdf
- https://www.cga.ct.gov/2017/TS/h/pdf/2017HB-07008-R00HS-CV23-TS.pdf
- https://www.cga.ct.gov/2013/TS/h/pdf/2013HB-05062-R00HSG-CV66-TS.pdf
|
process
|
ct pdfs we don t currently process desc ongoing list of pdfs that we don t currently process but that we should tally sheets examples
| 1
|
17,765
| 23,697,548,250
|
IssuesEvent
|
2022-08-29 15:51:47
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
closed
|
Prisma v4 breaks support for empty `dbgenerated()` - invalid migration created with no schema change
|
bug/2-confirmed kind/bug process/candidate team/schema topic: dbgenerated
|
### Bug description
Using Postgres [generated columns](https://www.postgresql.org/docs/current/ddl-generated-columns.html) via `@default(dbgenerated())` with custom Postgres `GENERATED AWAYS AS ... STORED` SQL in migration worked just fine through Prisma v3; breaks with Prisma 4.0.0 onwards.
Reproduction repo: https://github.com/andyjy/prisma-dbgenerated-bug-repro
### Reproduction details:
`schema.prisma`:
```
model table {
id String @id
hereBeDragons String @default(dbgenerated())
}
```
..with custom migration to use Postgres generated field:
`migration.sql`:
```
CREATE TABLE "table" (
...
-- manually edited migration to make column generated
"hereBeDragons" TEXT NOT NULL GENERATED ALWAYS AS (
'this row ID is: '::text || "id") STORED,
...
```
Worked just fine through prisma 3.15.2; prisma 4.0.0 onwards produces invalid migration with no schema change
that attempts to DROP DEFAULT (which we don't want to do, and fails since Postgres generated columns require
use of DROP EXPRESSION if we were actually to want to convert the generated column to a normal one).
`...invalid_failed_migration/migration.sql`:
```
ALTER TABLE "table" ALTER COLUMN "hereBeDragons" DROP DEFAULT;
```
(unwanted new migration fails with error:)
```
Database error code: 42601
Database error:
ERROR: column "hereBeDragons" of relation "table" is a generated column
HINT: Use ALTER TABLE ... ALTER COLUMN ... DROP EXPRESSION instead.
```
### How to reproduce
Reproduction repo: https://github.com/andyjy/prisma-dbgenerated-bug-repro
### Expected behavior
Prisma 4 continues to support this schema and migration history with no attempt
to generate a new migration dropping the default value for the column.
### Prisma information
Schema above + in reproduction repo
### Environment & setup
- OS: MacOS
- Database: PostgreSQL
- Node.js version: v16.16.0
### Prisma Version
works up to `3.15.2`, fails since `4.0.0` (tested up to `4.2.1`)
|
1.0
|
Prisma v4 breaks support for empty `dbgenerated()` - invalid migration created with no schema change - ### Bug description
Using Postgres [generated columns](https://www.postgresql.org/docs/current/ddl-generated-columns.html) via `@default(dbgenerated())` with custom Postgres `GENERATED AWAYS AS ... STORED` SQL in migration worked just fine through Prisma v3; breaks with Prisma 4.0.0 onwards.
Reproduction repo: https://github.com/andyjy/prisma-dbgenerated-bug-repro
### Reproduction details:
`schema.prisma`:
```
model table {
id String @id
hereBeDragons String @default(dbgenerated())
}
```
..with custom migration to use Postgres generated field:
`migration.sql`:
```
CREATE TABLE "table" (
...
-- manually edited migration to make column generated
"hereBeDragons" TEXT NOT NULL GENERATED ALWAYS AS (
'this row ID is: '::text || "id") STORED,
...
```
Worked just fine through prisma 3.15.2; prisma 4.0.0 onwards produces invalid migration with no schema change
that attempts to DROP DEFAULT (which we don't want to do, and fails since Postgres generated columns require
use of DROP EXPRESSION if we were actually to want to convert the generated column to a normal one).
`...invalid_failed_migration/migration.sql`:
```
ALTER TABLE "table" ALTER COLUMN "hereBeDragons" DROP DEFAULT;
```
(unwanted new migration fails with error:)
```
Database error code: 42601
Database error:
ERROR: column "hereBeDragons" of relation "table" is a generated column
HINT: Use ALTER TABLE ... ALTER COLUMN ... DROP EXPRESSION instead.
```
### How to reproduce
Reproduction repo: https://github.com/andyjy/prisma-dbgenerated-bug-repro
### Expected behavior
Prisma 4 continues to support this schema and migration history with no attempt
to generate a new migration dropping the default value for the column.
### Prisma information
Schema above + in reproduction repo
### Environment & setup
- OS: MacOS
- Database: PostgreSQL
- Node.js version: v16.16.0
### Prisma Version
works up to `3.15.2`, fails since `4.0.0` (tested up to `4.2.1`)
|
process
|
prisma breaks support for empty dbgenerated invalid migration created with no schema change bug description using postgres via default dbgenerated with custom postgres generated aways as stored sql in migration worked just fine through prisma breaks with prisma onwards reproduction repo reproduction details schema prisma model table id string id herebedragons string default dbgenerated with custom migration to use postgres generated field migration sql create table table manually edited migration to make column generated herebedragons text not null generated always as this row id is text id stored worked just fine through prisma prisma onwards produces invalid migration with no schema change that attempts to drop default which we don t want to do and fails since postgres generated columns require use of drop expression if we were actually to want to convert the generated column to a normal one invalid failed migration migration sql alter table table alter column herebedragons drop default unwanted new migration fails with error database error code database error error column herebedragons of relation table is a generated column hint use alter table alter column drop expression instead how to reproduce reproduction repo expected behavior prisma continues to support this schema and migration history with no attempt to generate a new migration dropping the default value for the column prisma information schema above in reproduction repo environment setup os macos database postgresql node js version prisma version works up to fails since tested up to
| 1
|
14,652
| 17,776,605,436
|
IssuesEvent
|
2021-08-30 20:06:13
|
googleapis/google-cloud-node
|
https://api.github.com/repos/googleapis/google-cloud-node
|
closed
|
Migration to OwlBot
|
type: process
|
**cloud-debug-nodejs:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**cloud-profiler-nodejs:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**cloud-trace-nodejs:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**code-suggester:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**gapic-generator-typescript:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**gax-nodejs:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**gaxios:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**gcp-metadata:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**gcs-resumable-upload:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**github-repo-automation:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**google-api-nodejs-client:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**google-auth-library-nodejs:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**google-cloud-node:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**google-cloudevents:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**google-cloudevents-nodejs:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**google-p12-pem:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**googleapis-gen:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**jsdoc-fresh:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**jsdoc-region-tag:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**node-gtoken:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-access-approval:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-ai-platform:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-analytics-admin:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-analytics-data:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-api-gateway:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-area120-tables:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-artifact-registry:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-asset:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-assured-workloads:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-automl:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-bigquery:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-bigquery-connection:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-bigquery-data-transfer:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-bigquery-reservation:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-bigquery-storage:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-bigtable:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-billing:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-billing-budgets:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-binary-authorization:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-channel:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-cloud-container:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-cloudbuild:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-common:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-compute:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-containeranalysis:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-data-qna:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-datacatalog:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-datalabeling:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-dataproc:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-dataproc-metastore:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-datastore:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-datastore-kvstore:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-datastore-session:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-dialogflow:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-dialogflow-cx:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-dlp:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-dns:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-document-ai:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-domains:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-error-reporting:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-firestore:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-firestore-session:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-functions:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-game-servers:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-gce-images:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-gke-hub:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-googleapis-common:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-grafeas:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-iam-credentials:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-iot:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-kms:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-language:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-local-auth:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-logging:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-logging-bunyan:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-logging-winston:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-managed-identities:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-media-translation:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-memcache:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-monitoring:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-monitoring-dashboards:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-network-connectivity:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-notebooks:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-org-policy:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-os-config:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-os-login:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-paginator:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-phishing-protection:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-policy-troubleshooter:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-precise-date:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-projectify:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-promisify:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-proto-files:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-pubsub:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-rcloadenv:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-recaptcha-enterprise:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-recommender:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-redis:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-resource:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-retail:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-scheduler:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-secret-manager:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-security-center:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-security-private-ca:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-service-directory:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-spanner:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-speech:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-storage:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-talent:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-tasks:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-text-to-speech:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-translate:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-video-intelligence:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-video-transcoder:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-vision:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-web-risk:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-web-security-scanner:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-workflows:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**release-please:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**repo-automation-bots:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**sloth:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**teeny-request:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
|
1.0
|
Migration to OwlBot - **cloud-debug-nodejs:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**cloud-profiler-nodejs:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**cloud-trace-nodejs:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**code-suggester:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**gapic-generator-typescript:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**gax-nodejs:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**gaxios:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**gcp-metadata:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**gcs-resumable-upload:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**github-repo-automation:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**google-api-nodejs-client:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**google-auth-library-nodejs:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**google-cloud-node:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**google-cloudevents:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**google-cloudevents-nodejs:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**google-p12-pem:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**googleapis-gen:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**jsdoc-fresh:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**jsdoc-region-tag:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**node-gtoken:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-access-approval:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-ai-platform:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-analytics-admin:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-analytics-data:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-api-gateway:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-area120-tables:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-artifact-registry:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-asset:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-assured-workloads:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-automl:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-bigquery:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-bigquery-connection:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-bigquery-data-transfer:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-bigquery-reservation:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-bigquery-storage:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-bigtable:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-billing:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-billing-budgets:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-binary-authorization:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-channel:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-cloud-container:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-cloudbuild:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-common:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-compute:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-containeranalysis:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-data-qna:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-datacatalog:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-datalabeling:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-dataproc:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-dataproc-metastore:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-datastore:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-datastore-kvstore:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-datastore-session:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-dialogflow:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-dialogflow-cx:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-dlp:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-dns:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-document-ai:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-domains:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-error-reporting:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-firestore:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-firestore-session:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-functions:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-game-servers:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-gce-images:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-gke-hub:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-googleapis-common:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-grafeas:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-iam-credentials:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-iot:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-kms:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-language:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-local-auth:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-logging:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-logging-bunyan:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-logging-winston:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-managed-identities:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-media-translation:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-memcache:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-monitoring:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-monitoring-dashboards:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-network-connectivity:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-notebooks:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-org-policy:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-os-config:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-os-login:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-paginator:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-phishing-protection:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-policy-troubleshooter:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-precise-date:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-projectify:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-promisify:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-proto-files:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-pubsub:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-rcloadenv:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-recaptcha-enterprise:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-recommender:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-redis:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-resource:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-retail:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-scheduler:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-secret-manager:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-security-center:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-security-private-ca:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-service-directory:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-spanner:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-speech:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-storage:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-talent:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-tasks:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-text-to-speech:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-translate:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-video-intelligence:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-video-transcoder:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-vision:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-web-risk:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-web-security-scanner:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**nodejs-workflows:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**release-please:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**repo-automation-bots:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**sloth:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
**teeny-request:**
- [ ] .OwlBot.lock generated
- [ ] .OwlBot.yaml generated
- [ ] owlbot.py generated
|
process
|
migration to owlbot cloud debug nodejs owlbot lock generated owlbot yaml generated owlbot py generated cloud profiler nodejs owlbot lock generated owlbot yaml generated owlbot py generated cloud trace nodejs owlbot lock generated owlbot yaml generated owlbot py generated code suggester owlbot lock generated owlbot yaml generated owlbot py generated gapic generator typescript owlbot lock generated owlbot yaml generated owlbot py generated gax nodejs owlbot lock generated owlbot yaml generated owlbot py generated gaxios owlbot lock generated owlbot yaml generated owlbot py generated gcp metadata owlbot lock generated owlbot yaml generated owlbot py generated gcs resumable upload owlbot lock generated owlbot yaml generated owlbot py generated github repo automation owlbot lock generated owlbot yaml generated owlbot py generated google api nodejs client owlbot lock generated owlbot yaml generated owlbot py generated google auth library nodejs owlbot lock generated owlbot yaml generated owlbot py generated google cloud node owlbot lock generated owlbot yaml generated owlbot py generated google cloudevents owlbot lock generated owlbot yaml generated owlbot py generated google cloudevents nodejs owlbot lock generated owlbot yaml generated owlbot py generated google pem owlbot lock generated owlbot yaml generated owlbot py generated googleapis gen owlbot lock generated owlbot yaml generated owlbot py generated jsdoc fresh owlbot lock generated owlbot yaml generated owlbot py generated jsdoc region tag owlbot lock generated owlbot yaml generated owlbot py generated node gtoken owlbot lock generated owlbot yaml generated owlbot py generated nodejs access approval owlbot lock generated owlbot yaml generated owlbot py generated nodejs ai platform owlbot lock generated owlbot yaml generated owlbot py generated nodejs analytics admin owlbot lock generated owlbot yaml generated owlbot py generated nodejs analytics data owlbot lock generated owlbot yaml generated owlbot py generated nodejs api gateway owlbot lock generated owlbot yaml generated owlbot py generated nodejs tables owlbot lock generated owlbot yaml generated owlbot py generated nodejs artifact registry owlbot lock generated owlbot yaml generated owlbot py generated nodejs asset owlbot lock generated owlbot yaml generated owlbot py generated nodejs assured workloads owlbot lock generated owlbot yaml generated owlbot py generated nodejs automl owlbot lock generated owlbot yaml generated owlbot py generated nodejs bigquery owlbot lock generated owlbot yaml generated owlbot py generated nodejs bigquery connection owlbot lock generated owlbot yaml generated owlbot py generated nodejs bigquery data transfer owlbot lock generated owlbot yaml generated owlbot py generated nodejs bigquery reservation owlbot lock generated owlbot yaml generated owlbot py generated nodejs bigquery storage owlbot lock generated owlbot yaml generated owlbot py generated nodejs bigtable owlbot lock generated owlbot yaml generated owlbot py generated nodejs billing owlbot lock generated owlbot yaml generated owlbot py generated nodejs billing budgets owlbot lock generated owlbot yaml generated owlbot py generated nodejs binary authorization owlbot lock generated owlbot yaml generated owlbot py generated nodejs channel owlbot lock generated owlbot yaml generated owlbot py generated nodejs cloud container owlbot lock generated owlbot yaml generated owlbot py generated nodejs cloudbuild owlbot lock generated owlbot yaml generated owlbot py generated nodejs common owlbot lock generated owlbot yaml generated owlbot py generated nodejs compute owlbot lock generated owlbot yaml generated owlbot py generated nodejs containeranalysis owlbot lock generated owlbot yaml generated owlbot py generated nodejs data qna owlbot lock generated owlbot yaml generated owlbot py generated nodejs datacatalog owlbot lock generated owlbot yaml generated owlbot py generated nodejs datalabeling owlbot lock generated owlbot yaml generated owlbot py generated nodejs dataproc owlbot lock generated owlbot yaml generated owlbot py generated nodejs dataproc metastore owlbot lock generated owlbot yaml generated owlbot py generated nodejs datastore owlbot lock generated owlbot yaml generated owlbot py generated nodejs datastore kvstore owlbot lock generated owlbot yaml generated owlbot py generated nodejs datastore session owlbot lock generated owlbot yaml generated owlbot py generated nodejs dialogflow owlbot lock generated owlbot yaml generated owlbot py generated nodejs dialogflow cx owlbot lock generated owlbot yaml generated owlbot py generated nodejs dlp owlbot lock generated owlbot yaml generated owlbot py generated nodejs dns owlbot lock generated owlbot yaml generated owlbot py generated nodejs document ai owlbot lock generated owlbot yaml generated owlbot py generated nodejs domains owlbot lock generated owlbot yaml generated owlbot py generated nodejs error reporting owlbot lock generated owlbot yaml generated owlbot py generated nodejs firestore owlbot lock generated owlbot yaml generated owlbot py generated nodejs firestore session owlbot lock generated owlbot yaml generated owlbot py generated nodejs functions owlbot lock generated owlbot yaml generated owlbot py generated nodejs game servers owlbot lock generated owlbot yaml generated owlbot py generated nodejs gce images owlbot lock generated owlbot yaml generated owlbot py generated nodejs gke hub owlbot lock generated owlbot yaml generated owlbot py generated nodejs googleapis common owlbot lock generated owlbot yaml generated owlbot py generated nodejs grafeas owlbot lock generated owlbot yaml generated owlbot py generated nodejs iam credentials owlbot lock generated owlbot yaml generated owlbot py generated nodejs iot owlbot lock generated owlbot yaml generated owlbot py generated nodejs kms owlbot lock generated owlbot yaml generated owlbot py generated nodejs language owlbot lock generated owlbot yaml generated owlbot py generated nodejs local auth owlbot lock generated owlbot yaml generated owlbot py generated nodejs logging owlbot lock generated owlbot yaml generated owlbot py generated nodejs logging bunyan owlbot lock generated owlbot yaml generated owlbot py generated nodejs logging winston owlbot lock generated owlbot yaml generated owlbot py generated nodejs managed identities owlbot lock generated owlbot yaml generated owlbot py generated nodejs media translation owlbot lock generated owlbot yaml generated owlbot py generated nodejs memcache owlbot lock generated owlbot yaml generated owlbot py generated nodejs monitoring owlbot lock generated owlbot yaml generated owlbot py generated nodejs monitoring dashboards owlbot lock generated owlbot yaml generated owlbot py generated nodejs network connectivity owlbot lock generated owlbot yaml generated owlbot py generated nodejs notebooks owlbot lock generated owlbot yaml generated owlbot py generated nodejs org policy owlbot lock generated owlbot yaml generated owlbot py generated nodejs os config owlbot lock generated owlbot yaml generated owlbot py generated nodejs os login owlbot lock generated owlbot yaml generated owlbot py generated nodejs paginator owlbot lock generated owlbot yaml generated owlbot py generated nodejs phishing protection owlbot lock generated owlbot yaml generated owlbot py generated nodejs policy troubleshooter owlbot lock generated owlbot yaml generated owlbot py generated nodejs precise date owlbot lock generated owlbot yaml generated owlbot py generated nodejs projectify owlbot lock generated owlbot yaml generated owlbot py generated nodejs promisify owlbot lock generated owlbot yaml generated owlbot py generated nodejs proto files owlbot lock generated owlbot yaml generated owlbot py generated nodejs pubsub owlbot lock generated owlbot yaml generated owlbot py generated nodejs rcloadenv owlbot lock generated owlbot yaml generated owlbot py generated nodejs recaptcha enterprise owlbot lock generated owlbot yaml generated owlbot py generated nodejs recommender owlbot lock generated owlbot yaml generated owlbot py generated nodejs redis owlbot lock generated owlbot yaml generated owlbot py generated nodejs resource owlbot lock generated owlbot yaml generated owlbot py generated nodejs retail owlbot lock generated owlbot yaml generated owlbot py generated nodejs scheduler owlbot lock generated owlbot yaml generated owlbot py generated nodejs secret manager owlbot lock generated owlbot yaml generated owlbot py generated nodejs security center owlbot lock generated owlbot yaml generated owlbot py generated nodejs security private ca owlbot lock generated owlbot yaml generated owlbot py generated nodejs service directory owlbot lock generated owlbot yaml generated owlbot py generated nodejs spanner owlbot lock generated owlbot yaml generated owlbot py generated nodejs speech owlbot lock generated owlbot yaml generated owlbot py generated nodejs storage owlbot lock generated owlbot yaml generated owlbot py generated nodejs talent owlbot lock generated owlbot yaml generated owlbot py generated nodejs tasks owlbot lock generated owlbot yaml generated owlbot py generated nodejs text to speech owlbot lock generated owlbot yaml generated owlbot py generated nodejs translate owlbot lock generated owlbot yaml generated owlbot py generated nodejs video intelligence owlbot lock generated owlbot yaml generated owlbot py generated nodejs video transcoder owlbot lock generated owlbot yaml generated owlbot py generated nodejs vision owlbot lock generated owlbot yaml generated owlbot py generated nodejs web risk owlbot lock generated owlbot yaml generated owlbot py generated nodejs web security scanner owlbot lock generated owlbot yaml generated owlbot py generated nodejs workflows owlbot lock generated owlbot yaml generated owlbot py generated release please owlbot lock generated owlbot yaml generated owlbot py generated repo automation bots owlbot lock generated owlbot yaml generated owlbot py generated sloth owlbot lock generated owlbot yaml generated owlbot py generated teeny request owlbot lock generated owlbot yaml generated owlbot py generated
| 1
|
11,037
| 13,850,852,937
|
IssuesEvent
|
2020-10-15 02:23:27
|
unicode-org/icu4x
|
https://api.github.com/repos/unicode-org/icu4x
|
closed
|
Add docs on how to plug DataProvider for Components
|
C-data C-process T-docs
|
While working on #189 I'd like to add docs on how to plug `DataProvider` into a `Component`.
For such documentations I historically preferred Wiki over PR because its easier for people to collaborate on it, and it's not part of the formal codebase.
Would everyone be ok if I added a wiki to this repo and wrote the initial version of the article?
|
1.0
|
Add docs on how to plug DataProvider for Components - While working on #189 I'd like to add docs on how to plug `DataProvider` into a `Component`.
For such documentations I historically preferred Wiki over PR because its easier for people to collaborate on it, and it's not part of the formal codebase.
Would everyone be ok if I added a wiki to this repo and wrote the initial version of the article?
|
process
|
add docs on how to plug dataprovider for components while working on i d like to add docs on how to plug dataprovider into a component for such documentations i historically preferred wiki over pr because its easier for people to collaborate on it and it s not part of the formal codebase would everyone be ok if i added a wiki to this repo and wrote the initial version of the article
| 1
|
5,211
| 7,988,397,660
|
IssuesEvent
|
2018-07-19 10:54:40
|
peterwebster/henson
|
https://api.github.com/repos/peterwebster/henson
|
closed
|
Create approved list of HTML entities for pasting into XML
|
medium priority process refinement
|
@DurHHHI through no fault of her own (I think) introduced an additional character in the middle of the entity references for e acute etc. (Sometimes what you see is not all you get when copying from a webpage). I'm fixing this in 14 and 15, but before we do any more XML work in 16 or subsequent volumes, I need to create a .txt file from which @KPalmerHeathman and Hilary can safely copy and paste.
|
1.0
|
Create approved list of HTML entities for pasting into XML - @DurHHHI through no fault of her own (I think) introduced an additional character in the middle of the entity references for e acute etc. (Sometimes what you see is not all you get when copying from a webpage). I'm fixing this in 14 and 15, but before we do any more XML work in 16 or subsequent volumes, I need to create a .txt file from which @KPalmerHeathman and Hilary can safely copy and paste.
|
process
|
create approved list of html entities for pasting into xml durhhhi through no fault of her own i think introduced an additional character in the middle of the entity references for e acute etc sometimes what you see is not all you get when copying from a webpage i m fixing this in and but before we do any more xml work in or subsequent volumes i need to create a txt file from which kpalmerheathman and hilary can safely copy and paste
| 1
|
3,760
| 6,734,286,723
|
IssuesEvent
|
2017-10-18 17:30:56
|
cypress-io/cypress
|
https://api.github.com/repos/cypress-io/cypress
|
opened
|
Jazz up the Readme
|
process: contributing
|
The `.README.md` for this repo is kind of sad. 😢 I would like to see the readme more focused on explaining what cypress is, why you would use cypress, **simple install instructions**, etc - for our users and have contributing the secondary focus.
**Ideas to jazz up**
- Add header from old readme branch.
- Embed 'Why Cypress' video directly into readme.
- Have simple 'npm install' instructions.
- Add some cool badges. (gitter, npm package)
- Don't make it too long - refer to the docs when necessary.
|
1.0
|
Jazz up the Readme - The `.README.md` for this repo is kind of sad. 😢 I would like to see the readme more focused on explaining what cypress is, why you would use cypress, **simple install instructions**, etc - for our users and have contributing the secondary focus.
**Ideas to jazz up**
- Add header from old readme branch.
- Embed 'Why Cypress' video directly into readme.
- Have simple 'npm install' instructions.
- Add some cool badges. (gitter, npm package)
- Don't make it too long - refer to the docs when necessary.
|
process
|
jazz up the readme the readme md for this repo is kind of sad 😢 i would like to see the readme more focused on explaining what cypress is why you would use cypress simple install instructions etc for our users and have contributing the secondary focus ideas to jazz up add header from old readme branch embed why cypress video directly into readme have simple npm install instructions add some cool badges gitter npm package don t make it too long refer to the docs when necessary
| 1
|
8,469
| 11,641,077,496
|
IssuesEvent
|
2020-02-29 01:19:22
|
MicrosoftDocs/vsts-docs
|
https://api.github.com/repos/MicrosoftDocs/vsts-docs
|
closed
|
available in Azure DevOps Server, timeline?
|
Pri1 devops-cicd-process/tech devops/prod
|
according to your feature timeline this will be available during 2020, this is highly requested by our developers, do you have a more specific release timeline for this feature?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 5aeeaace-1c5b-a51b-e41f-f25b806155b8
* Version Independent ID: fd7ff690-b2e4-41c7-a342-e528b911c6e1
* Content: [Deployment jobs - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops&viewFallbackFrom=azure-devops-2019#feedback)
* Content Source: [docs/pipelines/process/deployment-jobs.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/deployment-jobs.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
available in Azure DevOps Server, timeline? - according to your feature timeline this will be available during 2020, this is highly requested by our developers, do you have a more specific release timeline for this feature?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 5aeeaace-1c5b-a51b-e41f-f25b806155b8
* Version Independent ID: fd7ff690-b2e4-41c7-a342-e528b911c6e1
* Content: [Deployment jobs - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops&viewFallbackFrom=azure-devops-2019#feedback)
* Content Source: [docs/pipelines/process/deployment-jobs.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/deployment-jobs.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
available in azure devops server timeline according to your feature timeline this will be available during this is highly requested by our developers do you have a more specific release timeline for this feature document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
34,250
| 7,431,748,797
|
IssuesEvent
|
2018-03-25 17:43:46
|
Yahkal/replicaisland
|
https://api.github.com/repos/Yahkal/replicaisland
|
closed
|
in memory #34, the orb is not moving even when i tilt my tab. probably coz it doesnt support tilting. please remove the spikes coz it is impossible otherwise. i would like to finish the game very much but alas!
|
Priority-Medium Type-Defect auto-migrated
|
```
What steps will reproduce the problem?
1.
2.
3.
What is the expected output? What do you see instead?
What version of the product are you using? On what operating system?
Please provide any additional information below.
```
Original issue reported on code.google.com by `party....@gmail.com` on 9 Apr 2013 at 5:33
|
1.0
|
in memory #34, the orb is not moving even when i tilt my tab. probably coz it doesnt support tilting. please remove the spikes coz it is impossible otherwise. i would like to finish the game very much but alas! - ```
What steps will reproduce the problem?
1.
2.
3.
What is the expected output? What do you see instead?
What version of the product are you using? On what operating system?
Please provide any additional information below.
```
Original issue reported on code.google.com by `party....@gmail.com` on 9 Apr 2013 at 5:33
|
non_process
|
in memory the orb is not moving even when i tilt my tab probably coz it doesnt support tilting please remove the spikes coz it is impossible otherwise i would like to finish the game very much but alas what steps will reproduce the problem what is the expected output what do you see instead what version of the product are you using on what operating system please provide any additional information below original issue reported on code google com by party gmail com on apr at
| 0
|
464,197
| 13,307,798,258
|
IssuesEvent
|
2020-08-25 23:05:33
|
certbot/certbot
|
https://api.github.com/repos/certbot/certbot
|
opened
|
Audit Certbot error output
|
area: error handling area: ui / ux priority: significant
|
Under the "Open design questions" section of https://docs.google.com/document/d/1aKnQYEzCrZgYX-iuE-ErOdIKERUuLmZSX_BDb7d0M6I/edit?usp=sharing, there are number of cases of Certbot error output that have not been looked at. We should look at these cases and suggest improvements that we can make to Certbot's output.
|
1.0
|
Audit Certbot error output - Under the "Open design questions" section of https://docs.google.com/document/d/1aKnQYEzCrZgYX-iuE-ErOdIKERUuLmZSX_BDb7d0M6I/edit?usp=sharing, there are number of cases of Certbot error output that have not been looked at. We should look at these cases and suggest improvements that we can make to Certbot's output.
|
non_process
|
audit certbot error output under the open design questions section of there are number of cases of certbot error output that have not been looked at we should look at these cases and suggest improvements that we can make to certbot s output
| 0
|
190,378
| 6,817,898,370
|
IssuesEvent
|
2017-11-07 01:55:50
|
minishift/minishift
|
https://api.github.com/repos/minishift/minishift
|
closed
|
`minishift start' fails with 'open config.json: The system cannot find the file specified'
|
kind/bug os/windows priority/major
|
I was running minishift for a while, stopping and starting it. Then suddenly after several errors during 'minishift stop' it starts failing with message
```
open C:\Users\eskimo\.minishift\machines\minishift\config.json: The system cannot find the file specified'
```
Even if I do
``` shell
minishift config vm-driver virualbox
No Minishift instance exists. New vm-driver setting will be applied on next 'minishift start'
```
next
``` shell
minishift start
open C:\Users\eskimo\.minishift\machines\minishift\config.json: The system cannot find the file specified.
```
still complains about missing config.json
|
1.0
|
`minishift start' fails with 'open config.json: The system cannot find the file specified' - I was running minishift for a while, stopping and starting it. Then suddenly after several errors during 'minishift stop' it starts failing with message
```
open C:\Users\eskimo\.minishift\machines\minishift\config.json: The system cannot find the file specified'
```
Even if I do
``` shell
minishift config vm-driver virualbox
No Minishift instance exists. New vm-driver setting will be applied on next 'minishift start'
```
next
``` shell
minishift start
open C:\Users\eskimo\.minishift\machines\minishift\config.json: The system cannot find the file specified.
```
still complains about missing config.json
|
non_process
|
minishift start fails with open config json the system cannot find the file specified i was running minishift for a while stopping and starting it then suddenly after several errors during minishift stop it starts failing with message open c users eskimo minishift machines minishift config json the system cannot find the file specified even if i do shell minishift config vm driver virualbox no minishift instance exists new vm driver setting will be applied on next minishift start next shell minishift start open c users eskimo minishift machines minishift config json the system cannot find the file specified still complains about missing config json
| 0
|
756,699
| 26,482,285,258
|
IssuesEvent
|
2023-01-17 15:28:57
|
planetary-social/scuttlego
|
https://api.github.com/repos/planetary-social/scuttlego
|
closed
|
Call a callback when running migrations
|
enhancement priority/high
|
We need to call a progress callback when running migrations.
See https://github.com/planetary-social/planetary-ios/issues/1007.
|
1.0
|
Call a callback when running migrations - We need to call a progress callback when running migrations.
See https://github.com/planetary-social/planetary-ios/issues/1007.
|
non_process
|
call a callback when running migrations we need to call a progress callback when running migrations see
| 0
|
15,622
| 10,327,671,891
|
IssuesEvent
|
2019-09-02 07:37:59
|
Azure/azure-libraries-for-java
|
https://api.github.com/repos/Azure/azure-libraries-for-java
|
closed
|
No webapps folder is created when creating a new Linux web app with Tomcat
|
App Services
|
When creating a Windows web app with web container Tomcat, folder `webapps` will be created when the web app is created, but for Linux web app, there is no `webapps` folder created until users browse the web app.
From the experience with Windows web apps, we assume that there is always a `webapps` folder, and we continue the same operations for Linux web apps. And it causes the operation's failure.
Does this inconsistent behavior by design?
|
1.0
|
No webapps folder is created when creating a new Linux web app with Tomcat - When creating a Windows web app with web container Tomcat, folder `webapps` will be created when the web app is created, but for Linux web app, there is no `webapps` folder created until users browse the web app.
From the experience with Windows web apps, we assume that there is always a `webapps` folder, and we continue the same operations for Linux web apps. And it causes the operation's failure.
Does this inconsistent behavior by design?
|
non_process
|
no webapps folder is created when creating a new linux web app with tomcat when creating a windows web app with web container tomcat folder webapps will be created when the web app is created but for linux web app there is no webapps folder created until users browse the web app from the experience with windows web apps we assume that there is always a webapps folder and we continue the same operations for linux web apps and it causes the operation s failure does this inconsistent behavior by design
| 0
|
623
| 3,089,284,547
|
IssuesEvent
|
2015-08-25 20:44:58
|
spootTheLousy/saguaro
|
https://api.github.com/repos/spootTheLousy/saguaro
|
opened
|
Deleting the only post on a new page leaves the html page
|
Bug: Minor Post/text processing
|
Example:
There is one post on page 2
I delete that post.
Page 2 still exists, although the image is deleted.
This would probably only be an issue on slower moving boards that don't hit the page limit.
FIX: I'll have to tweak delete_post to destroy the file as well.
|
1.0
|
Deleting the only post on a new page leaves the html page - Example:
There is one post on page 2
I delete that post.
Page 2 still exists, although the image is deleted.
This would probably only be an issue on slower moving boards that don't hit the page limit.
FIX: I'll have to tweak delete_post to destroy the file as well.
|
process
|
deleting the only post on a new page leaves the html page example there is one post on page i delete that post page still exists although the image is deleted this would probably only be an issue on slower moving boards that don t hit the page limit fix i ll have to tweak delete post to destroy the file as well
| 1
|
17,546
| 12,141,054,681
|
IssuesEvent
|
2020-04-23 21:39:58
|
tailscale/tailscale
|
https://api.github.com/repos/tailscale/tailscale
|
closed
|
Adapt to changing network environments better
|
L4 Most users P4 Halts deployment T6 Major usability connectivity
|
Overall tracking bug for network change reactivity.
When the network environment changes (e.g. switch from LTE to wifi, NAT gateway reboots and loses all its mappings), nodes should reestablish connectivity gracefully.
We currently adapt in some cases, but not others. We should always notice changes to connectivity, and adapt gracefully to them.
┆Issue is synchronized with this [Asana task](https://app.asana.com/0/1171685499906294/1171687424845760) by [Unito](https://www.unito.io/learn-more)
|
True
|
Adapt to changing network environments better - Overall tracking bug for network change reactivity.
When the network environment changes (e.g. switch from LTE to wifi, NAT gateway reboots and loses all its mappings), nodes should reestablish connectivity gracefully.
We currently adapt in some cases, but not others. We should always notice changes to connectivity, and adapt gracefully to them.
┆Issue is synchronized with this [Asana task](https://app.asana.com/0/1171685499906294/1171687424845760) by [Unito](https://www.unito.io/learn-more)
|
non_process
|
adapt to changing network environments better overall tracking bug for network change reactivity when the network environment changes e g switch from lte to wifi nat gateway reboots and loses all its mappings nodes should reestablish connectivity gracefully we currently adapt in some cases but not others we should always notice changes to connectivity and adapt gracefully to them ┆issue is synchronized with this by
| 0
|
344,565
| 30,751,685,731
|
IssuesEvent
|
2023-07-28 19:55:08
|
saltstack/salt
|
https://api.github.com/repos/saltstack/salt
|
opened
|
[Increase Test Coverage] Batch 125
|
Tests
|
Increase the code coverage percent on the following files to at least 80%.
Please be aware that currently the percentage might be inaccurate if the module uses salt due to #64696
File | Percent
salt/utils/vsan.py 9
salt/ext/win_inet_pton.py 49
salt/modules/cloud.py 30
salt/modules/dummyproxy_service.py 41
salt/modules/ldapmod.py 51
|
1.0
|
[Increase Test Coverage] Batch 125 - Increase the code coverage percent on the following files to at least 80%.
Please be aware that currently the percentage might be inaccurate if the module uses salt due to #64696
File | Percent
salt/utils/vsan.py 9
salt/ext/win_inet_pton.py 49
salt/modules/cloud.py 30
salt/modules/dummyproxy_service.py 41
salt/modules/ldapmod.py 51
|
non_process
|
batch increase the code coverage percent on the following files to at least please be aware that currently the percentage might be inaccurate if the module uses salt due to file percent salt utils vsan py salt ext win inet pton py salt modules cloud py salt modules dummyproxy service py salt modules ldapmod py
| 0
|
11,474
| 14,343,218,079
|
IssuesEvent
|
2020-11-28 08:07:22
|
assimp/assimp
|
https://api.github.com/repos/assimp/assimp
|
closed
|
Extra face in FBX file.
|
Bug Postprocessing
|
I recently started modeling a vehicle in blender and exported it to my game engine through an FBX file. The FBX is brought into my game engine using assimp. I noticed that there is an extra face on the imported mesh, which does not appear when opening the FBX file in other applications. It is, however, visible in the assimp viewer, so it seems this is a bug in assimp. This is what the mesh looks like in blender:

And here is the same mesh in the assimp viewer:

I highlighted the extra triangle in red. I uploaded the problematic fbx file to: http://www.sourcedrive.net/assimp_fbx_problem/chassis.fbx
I am testing this with assimp master, downloaded on 2. November 2020.
|
1.0
|
Extra face in FBX file. - I recently started modeling a vehicle in blender and exported it to my game engine through an FBX file. The FBX is brought into my game engine using assimp. I noticed that there is an extra face on the imported mesh, which does not appear when opening the FBX file in other applications. It is, however, visible in the assimp viewer, so it seems this is a bug in assimp. This is what the mesh looks like in blender:

And here is the same mesh in the assimp viewer:

I highlighted the extra triangle in red. I uploaded the problematic fbx file to: http://www.sourcedrive.net/assimp_fbx_problem/chassis.fbx
I am testing this with assimp master, downloaded on 2. November 2020.
|
process
|
extra face in fbx file i recently started modeling a vehicle in blender and exported it to my game engine through an fbx file the fbx is brought into my game engine using assimp i noticed that there is an extra face on the imported mesh which does not appear when opening the fbx file in other applications it is however visible in the assimp viewer so it seems this is a bug in assimp this is what the mesh looks like in blender and here is the same mesh in the assimp viewer i highlighted the extra triangle in red i uploaded the problematic fbx file to i am testing this with assimp master downloaded on november
| 1
|
3,768
| 6,737,011,181
|
IssuesEvent
|
2017-10-19 07:45:23
|
jimbrown75/Permit-Vision-Enhancements
|
https://api.github.com/repos/jimbrown75/Permit-Vision-Enhancements
|
closed
|
Signatures not resetting when permit issued
|
enhancement Should Fix Take Forward Verified by PTW Process Lead
|
In the system on steps 5, 6 and 7 of the permit details, the signatures of the previous users who signed for actions like Permit Issue, Permit Accept, handover, etc. are not being reset when the permit is issued.
Here you can see that the permit is LIVE and the signature page seems to have been signed already. We are expecting to have these signatures reset when the permit is issued and only applied again when the signatures take place after the permit was LIVE.

For a full explanation of the expected behaviour please review the attached word file.
[Signature not reset on permit issue.docx](https://github.com/jimbrown75/Permit-Vision-Enhancements/files/1319460/Signature.not.reset.on.permit.issue.docx)
|
1.0
|
Signatures not resetting when permit issued - In the system on steps 5, 6 and 7 of the permit details, the signatures of the previous users who signed for actions like Permit Issue, Permit Accept, handover, etc. are not being reset when the permit is issued.
Here you can see that the permit is LIVE and the signature page seems to have been signed already. We are expecting to have these signatures reset when the permit is issued and only applied again when the signatures take place after the permit was LIVE.

For a full explanation of the expected behaviour please review the attached word file.
[Signature not reset on permit issue.docx](https://github.com/jimbrown75/Permit-Vision-Enhancements/files/1319460/Signature.not.reset.on.permit.issue.docx)
|
process
|
signatures not resetting when permit issued in the system on steps and of the permit details the signatures of the previous users who signed for actions like permit issue permit accept handover etc are not being reset when the permit is issued here you can see that the permit is live and the signature page seems to have been signed already we are expecting to have these signatures reset when the permit is issued and only applied again when the signatures take place after the permit was live for a full explanation of the expected behaviour please review the attached word file
| 1
|
196,515
| 14,876,637,021
|
IssuesEvent
|
2021-01-20 01:18:30
|
ObliqueNET/Server
|
https://api.github.com/repos/ObliqueNET/Server
|
closed
|
Pokemon spawning like crazy
|
needs testing pixelmon
|
Pokemon after a bit of standing around start to over spawn and get crowded
https://i.gyazo.com/thumb/1200/fc201eacfb87bab01b63a1437b4ba077-png.jpg
https://cdn.discordapp.com/attachments/151095626916954113/800051124705951754/unknown.png
|
1.0
|
Pokemon spawning like crazy - Pokemon after a bit of standing around start to over spawn and get crowded
https://i.gyazo.com/thumb/1200/fc201eacfb87bab01b63a1437b4ba077-png.jpg
https://cdn.discordapp.com/attachments/151095626916954113/800051124705951754/unknown.png
|
non_process
|
pokemon spawning like crazy pokemon after a bit of standing around start to over spawn and get crowded
| 0
|
667,764
| 22,499,501,951
|
IssuesEvent
|
2022-06-23 10:29:05
|
MiSTer-devel/PSX_MiSTer
|
https://api.github.com/repos/MiSTer-devel/PSX_MiSTer
|
closed
|
Evil Dead - Hail to the King USA
|
Priority-1
|
**Issue**
Game loads as normal but when you get into game you can't actually leave your inventory to start playing
**Reproduce**
Start a new game, watch or skip the FMV's, Try and press Triangle to leave the inventory
**Workaround**
none
**BIOS**
7001
**CD Image**
Redump bin/cue
**Core Version**
PSX_DUALSDRAM20220508_3
|
1.0
|
Evil Dead - Hail to the King USA - **Issue**
Game loads as normal but when you get into game you can't actually leave your inventory to start playing
**Reproduce**
Start a new game, watch or skip the FMV's, Try and press Triangle to leave the inventory
**Workaround**
none
**BIOS**
7001
**CD Image**
Redump bin/cue
**Core Version**
PSX_DUALSDRAM20220508_3
|
non_process
|
evil dead hail to the king usa issue game loads as normal but when you get into game you can t actually leave your inventory to start playing reproduce start a new game watch or skip the fmv s try and press triangle to leave the inventory workaround none bios cd image redump bin cue core version psx
| 0
|
666,788
| 22,386,841,038
|
IssuesEvent
|
2022-06-17 01:51:25
|
ctm/mb2-doc
|
https://api.github.com/repos/ctm/mb2-doc
|
opened
|
Fix lammer redemption in GUI
|
bug high priority easy regression
|
Fix lammer redemption to work with the new background.
I never tested it and not only do we not get a white background, but the alignment / placement appears to be off. I'd like to get this fixed in time for tomorrow's Pot-Limit Oklahoma tournament.
|
1.0
|
Fix lammer redemption in GUI - Fix lammer redemption to work with the new background.
I never tested it and not only do we not get a white background, but the alignment / placement appears to be off. I'd like to get this fixed in time for tomorrow's Pot-Limit Oklahoma tournament.
|
non_process
|
fix lammer redemption in gui fix lammer redemption to work with the new background i never tested it and not only do we not get a white background but the alignment placement appears to be off i d like to get this fixed in time for tomorrow s pot limit oklahoma tournament
| 0
|
574,823
| 17,023,880,147
|
IssuesEvent
|
2021-07-03 04:20:22
|
tomhughes/trac-tickets
|
https://api.github.com/repos/tomhughes/trac-tickets
|
closed
|
Add iD and openstreetmap-carto to trac other trackers list
|
Component: admin Priority: minor Resolution: invalid Type: enhancement
|
**[Submitted to the original trac issue database at 9.20pm, Monday, 23rd September 2013]**
Currently when creating tickets there is a list of some projects with different issue trackers, currently JOSM and JXAPI.
It would be good if iD and openstreetmap-carto were added to the list, with trackers of https://github.com/systemed/iD/issues and https://github.com/gravitystorm/openstreetmap-carto/issues
|
1.0
|
Add iD and openstreetmap-carto to trac other trackers list - **[Submitted to the original trac issue database at 9.20pm, Monday, 23rd September 2013]**
Currently when creating tickets there is a list of some projects with different issue trackers, currently JOSM and JXAPI.
It would be good if iD and openstreetmap-carto were added to the list, with trackers of https://github.com/systemed/iD/issues and https://github.com/gravitystorm/openstreetmap-carto/issues
|
non_process
|
add id and openstreetmap carto to trac other trackers list currently when creating tickets there is a list of some projects with different issue trackers currently josm and jxapi it would be good if id and openstreetmap carto were added to the list with trackers of and
| 0
|
352,564
| 32,076,842,961
|
IssuesEvent
|
2023-09-25 11:35:49
|
unifyai/ivy
|
https://api.github.com/repos/unifyai/ivy
|
closed
|
Fix tensor.test_torch_tensor_addmm
|
PyTorch Frontend Sub Task Failing Test
|
| | |
|---|---|
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/6298493420"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/6298493420"><img src=https://img.shields.io/badge/-success-success></a>
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/6298493420"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/6298493420"><img src=https://img.shields.io/badge/-success-success></a>
|paddle|<a href="https://github.com/unifyai/ivy/actions/runs/6298493420"><img src=https://img.shields.io/badge/-success-success></a>
|
1.0
|
Fix tensor.test_torch_tensor_addmm - | | |
|---|---|
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/6298493420"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/6298493420"><img src=https://img.shields.io/badge/-success-success></a>
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/6298493420"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/6298493420"><img src=https://img.shields.io/badge/-success-success></a>
|paddle|<a href="https://github.com/unifyai/ivy/actions/runs/6298493420"><img src=https://img.shields.io/badge/-success-success></a>
|
non_process
|
fix tensor test torch tensor addmm numpy a href src jax a href src tensorflow a href src torch a href src paddle a href src
| 0
|
5,681
| 8,558,759,390
|
IssuesEvent
|
2018-11-08 19:12:43
|
easy-software-ufal/annotations_repos
|
https://api.github.com/repos/easy-software-ufal/annotations_repos
|
opened
|
json-api-dotnet/JsonApiDotNetCore PATCH request with nullable attributes does not work
|
C# no operator wrong processing
|
Issue: `https://github.com/json-api-dotnet/JsonApiDotNetCore/issues/95`
PR: `https://github.com/json-api-dotnet/JsonApiDotNetCore/pull/94`
Comment: massive pull request.
|
1.0
|
json-api-dotnet/JsonApiDotNetCore PATCH request with nullable attributes does not work - Issue: `https://github.com/json-api-dotnet/JsonApiDotNetCore/issues/95`
PR: `https://github.com/json-api-dotnet/JsonApiDotNetCore/pull/94`
Comment: massive pull request.
|
process
|
json api dotnet jsonapidotnetcore patch request with nullable attributes does not work issue pr comment massive pull request
| 1
|
50,676
| 6,424,821,072
|
IssuesEvent
|
2017-08-09 14:17:30
|
xogroup/union
|
https://api.github.com/repos/xogroup/union
|
opened
|
Fancy Field Dropdown & Suggestion list dropdown design
|
design
|
Context
===
We noticed that two new dropdown components were recently added to the union library -- [Fancy Field dropdown](http://docs.union.theknot.com/pattern-library/core-components/fancy-fields#dropdown-component) and [<SuggestionList /> dropdown](http://docs.union.theknot.com/pattern-library/core-components/suggestion-list). Both dropdowns are used in same on-boarding experience, however they look different.
We want to confirm that the different looks are intended for the dropdowns.
Fancy Field Dropdown:
===
<img width="301" alt="screen shot 2017-08-09 at 10 10 14 am" src="https://user-images.githubusercontent.com/15005719/29126233-b9ea8b6a-7ceb-11e7-86e8-e3794173e40e.png">
Suggestion List Dropdown:
===
<img width="384" alt="screen shot 2017-08-09 at 10 10 50 am" src="https://user-images.githubusercontent.com/15005719/29126238-be86f76c-7ceb-11e7-8fc0-355cf0b92061.png">
|
1.0
|
Fancy Field Dropdown & Suggestion list dropdown design - Context
===
We noticed that two new dropdown components were recently added to the union library -- [Fancy Field dropdown](http://docs.union.theknot.com/pattern-library/core-components/fancy-fields#dropdown-component) and [<SuggestionList /> dropdown](http://docs.union.theknot.com/pattern-library/core-components/suggestion-list). Both dropdowns are used in same on-boarding experience, however they look different.
We want to confirm that the different looks are intended for the dropdowns.
Fancy Field Dropdown:
===
<img width="301" alt="screen shot 2017-08-09 at 10 10 14 am" src="https://user-images.githubusercontent.com/15005719/29126233-b9ea8b6a-7ceb-11e7-86e8-e3794173e40e.png">
Suggestion List Dropdown:
===
<img width="384" alt="screen shot 2017-08-09 at 10 10 50 am" src="https://user-images.githubusercontent.com/15005719/29126238-be86f76c-7ceb-11e7-8fc0-355cf0b92061.png">
|
non_process
|
fancy field dropdown suggestion list dropdown design context we noticed that two new dropdown components were recently added to the union library and both dropdowns are used in same on boarding experience however they look different we want to confirm that the different looks are intended for the dropdowns fancy field dropdown img width alt screen shot at am src suggestion list dropdown img width alt screen shot at am src
| 0
|
18,573
| 24,556,334,544
|
IssuesEvent
|
2022-10-12 16:09:31
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
Android > App is crashing in the below scenario
|
Bug P1 Android Process: Fixed Process: Tested QA Process: Tested dev
|
**Steps:**
1. Install the app
2. Sign in and complete the passcode process
3. Minimize / Lock the app once after study list screen is loaded
4. Open the app and Click on 'Sign in again' in passcode screen
5. Click on 'Ok' button
6. Click on back button in 'Sign in' screen
7. Click on 'Get started' button and Verify
**AR:** App is crashing in the above scenario
**ER:** App should not crash and study list screen should be available for the participant
https://user-images.githubusercontent.com/86007179/169323816-98d096eb-5588-404a-a9e2-03054bbc0b16.mp4
|
3.0
|
Android > App is crashing in the below scenario - **Steps:**
1. Install the app
2. Sign in and complete the passcode process
3. Minimize / Lock the app once after study list screen is loaded
4. Open the app and Click on 'Sign in again' in passcode screen
5. Click on 'Ok' button
6. Click on back button in 'Sign in' screen
7. Click on 'Get started' button and Verify
**AR:** App is crashing in the above scenario
**ER:** App should not crash and study list screen should be available for the participant
https://user-images.githubusercontent.com/86007179/169323816-98d096eb-5588-404a-a9e2-03054bbc0b16.mp4
|
process
|
android app is crashing in the below scenario steps install the app sign in and complete the passcode process minimize lock the app once after study list screen is loaded open the app and click on sign in again in passcode screen click on ok button click on back button in sign in screen click on get started button and verify ar app is crashing in the above scenario er app should not crash and study list screen should be available for the participant
| 1
|
7,441
| 10,554,692,390
|
IssuesEvent
|
2019-10-03 20:03:44
|
pelias/pelias
|
https://api.github.com/repos/pelias/pelias
|
closed
|
equinox san francisco fails to find the POI
|
Q1-2017 acceptance test input parsing processed
|
When searching for `equinox san francisco` the gym is expected to be found, instead we are returning `France Equinoxiale`, even though a focus point was specified in SF. :grimacing:

Correct behavior is to prefer venues when a focus point is provided.
A few sample queries:
[with a focus point in SF](http://pelias.github.io/compare/#/v1/search%3Fpoint.lon=-122.431272&point.lat=37.778008&text=equinox%20san%20francisco):
```
1) France Équinoxiale, Kourou, France
2) Salle Équinoxe, La Tour-du-Pin, France
3) San Francisco (El Tecolote), Mexico
4) Ex-Hacienda San Francisco Cuexcomatepec, Mexico
5) San Francisco el Naranjo, Mexico
6) El Paraíso (San Francisco), Mexico
7) San Francisco El Alto, Guatemala
8) San Francisco el Rincón, Mexico
9) San Francisco El Alto, Guatemala
10) Ejido San Francisco, Mexico
```
[with a focus point in SF and comma](http://pelias.github.io/compare/#/v1/search%3Fpoint.lon=-122.431272&point.lat=37.778008&text=equinox,%20san%20francisco):
```
1) Equinox, London, England, United Kingdom
2) Equinox, Castelmassimo, Italy
3) Equinox, New Haven, CT, USA
4) Equinox, Zürich, Switzerland
5) Equinox, Washington, District of Columbia, USA
6) Equinox, Madison, WI, USA
7) Equinox, Bristol, England, United Kingdom
8) Equinox, Manhattan, New York, NY, USA
9) Equinox, Oostvoorne, Netherlands
10) Equinox, Paris, France
```
[focus point in SF and just `equinox`](http://pelias.github.io/compare/#/v1/search%3Fpoint.lon=-122.431272&point.lat=37.778008&text=equinox):
```
1) Equinox, London, England, United Kingdom
2) Equinox, Castelmassimo, Italy
3) Equinox, New Haven, CT, USA
4) Equinox, Zürich, Switzerland
5) Equinox, Washington, District of Columbia, USA
6) Equinox, Madison, WI, USA
7) Equinox, Bristol, England, United Kingdom
8) Equinox, Manhattan, New York, NY, USA
9) Equinox, Oostvoorne, Netherlands
10) Equinox, Paris, France
```
|
1.0
|
equinox san francisco fails to find the POI - When searching for `equinox san francisco` the gym is expected to be found, instead we are returning `France Equinoxiale`, even though a focus point was specified in SF. :grimacing:

Correct behavior is to prefer venues when a focus point is provided.
A few sample queries:
[with a focus point in SF](http://pelias.github.io/compare/#/v1/search%3Fpoint.lon=-122.431272&point.lat=37.778008&text=equinox%20san%20francisco):
```
1) France Équinoxiale, Kourou, France
2) Salle Équinoxe, La Tour-du-Pin, France
3) San Francisco (El Tecolote), Mexico
4) Ex-Hacienda San Francisco Cuexcomatepec, Mexico
5) San Francisco el Naranjo, Mexico
6) El Paraíso (San Francisco), Mexico
7) San Francisco El Alto, Guatemala
8) San Francisco el Rincón, Mexico
9) San Francisco El Alto, Guatemala
10) Ejido San Francisco, Mexico
```
[with a focus point in SF and comma](http://pelias.github.io/compare/#/v1/search%3Fpoint.lon=-122.431272&point.lat=37.778008&text=equinox,%20san%20francisco):
```
1) Equinox, London, England, United Kingdom
2) Equinox, Castelmassimo, Italy
3) Equinox, New Haven, CT, USA
4) Equinox, Zürich, Switzerland
5) Equinox, Washington, District of Columbia, USA
6) Equinox, Madison, WI, USA
7) Equinox, Bristol, England, United Kingdom
8) Equinox, Manhattan, New York, NY, USA
9) Equinox, Oostvoorne, Netherlands
10) Equinox, Paris, France
```
[focus point in SF and just `equinox`](http://pelias.github.io/compare/#/v1/search%3Fpoint.lon=-122.431272&point.lat=37.778008&text=equinox):
```
1) Equinox, London, England, United Kingdom
2) Equinox, Castelmassimo, Italy
3) Equinox, New Haven, CT, USA
4) Equinox, Zürich, Switzerland
5) Equinox, Washington, District of Columbia, USA
6) Equinox, Madison, WI, USA
7) Equinox, Bristol, England, United Kingdom
8) Equinox, Manhattan, New York, NY, USA
9) Equinox, Oostvoorne, Netherlands
10) Equinox, Paris, France
```
|
process
|
equinox san francisco fails to find the poi when searching for equinox san francisco the gym is expected to be found instead we are returning france equinoxiale even though a focus point was specified in sf grimacing correct behavior is to prefer venues when a focus point is provided a few sample queries france équinoxiale kourou france salle équinoxe la tour du pin france san francisco el tecolote mexico ex hacienda san francisco cuexcomatepec mexico san francisco el naranjo mexico el paraíso san francisco mexico san francisco el alto guatemala san francisco el rincón mexico san francisco el alto guatemala ejido san francisco mexico equinox london england united kingdom equinox castelmassimo italy equinox new haven ct usa equinox zürich switzerland equinox washington district of columbia usa equinox madison wi usa equinox bristol england united kingdom equinox manhattan new york ny usa equinox oostvoorne netherlands equinox paris france equinox london england united kingdom equinox castelmassimo italy equinox new haven ct usa equinox zürich switzerland equinox washington district of columbia usa equinox madison wi usa equinox bristol england united kingdom equinox manhattan new york ny usa equinox oostvoorne netherlands equinox paris france
| 1
|
265,238
| 28,262,369,310
|
IssuesEvent
|
2023-04-07 01:15:02
|
hshivhare67/platform_device_renesas_kernel_v4.19.72
|
https://api.github.com/repos/hshivhare67/platform_device_renesas_kernel_v4.19.72
|
closed
|
CVE-2021-28660 (High) detected in linuxlinux-4.19.279 - autoclosed
|
Mend: dependency security vulnerability
|
## CVE-2021-28660 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.279</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/hshivhare67/platform_device_renesas_kernel_v4.19.72/commit/3f00c931cf51848ec37b8817097db058fcc2f3f7">3f00c931cf51848ec37b8817097db058fcc2f3f7</a></p>
<p>Found in base branch: <b>main</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/staging/rtl8188eu/os_dep/ioctl_linux.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/staging/rtl8188eu/os_dep/ioctl_linux.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
rtw_wx_set_scan in drivers/staging/rtl8188eu/os_dep/ioctl_linux.c in the Linux kernel through 5.11.6 allows writing beyond the end of the ->ssid[] array. NOTE: from the perspective of kernel.org releases, CVE IDs are not normally used for drivers/staging/* (unfinished work); however, system integrators may have situations in which a drivers/staging issue is relevant to their own customer base.
<p>Publish Date: 2021-03-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-28660>CVE-2021-28660</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Adjacent
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2021-28660">https://www.linuxkernelcves.com/cves/CVE-2021-28660</a></p>
<p>Release Date: 2021-03-17</p>
<p>Fix Resolution: v4.4.262,v4.9.262,v4.14.226,v4.19.181,v5.4.106,v5.10.24,v5.11.7,v5.12-rc3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-28660 (High) detected in linuxlinux-4.19.279 - autoclosed - ## CVE-2021-28660 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.279</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/hshivhare67/platform_device_renesas_kernel_v4.19.72/commit/3f00c931cf51848ec37b8817097db058fcc2f3f7">3f00c931cf51848ec37b8817097db058fcc2f3f7</a></p>
<p>Found in base branch: <b>main</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/staging/rtl8188eu/os_dep/ioctl_linux.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/staging/rtl8188eu/os_dep/ioctl_linux.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
rtw_wx_set_scan in drivers/staging/rtl8188eu/os_dep/ioctl_linux.c in the Linux kernel through 5.11.6 allows writing beyond the end of the ->ssid[] array. NOTE: from the perspective of kernel.org releases, CVE IDs are not normally used for drivers/staging/* (unfinished work); however, system integrators may have situations in which a drivers/staging issue is relevant to their own customer base.
<p>Publish Date: 2021-03-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-28660>CVE-2021-28660</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Adjacent
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2021-28660">https://www.linuxkernelcves.com/cves/CVE-2021-28660</a></p>
<p>Release Date: 2021-03-17</p>
<p>Fix Resolution: v4.4.262,v4.9.262,v4.14.226,v4.19.181,v5.4.106,v5.10.24,v5.11.7,v5.12-rc3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in linuxlinux autoclosed cve high severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch main vulnerable source files drivers staging os dep ioctl linux c drivers staging os dep ioctl linux c vulnerability details rtw wx set scan in drivers staging os dep ioctl linux c in the linux kernel through allows writing beyond the end of the ssid array note from the perspective of kernel org releases cve ids are not normally used for drivers staging unfinished work however system integrators may have situations in which a drivers staging issue is relevant to their own customer base publish date url a href cvss score details base score metrics exploitability metrics attack vector adjacent attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
21,644
| 30,078,995,747
|
IssuesEvent
|
2023-06-29 00:21:04
|
ethereum/EIPs
|
https://api.github.com/repos/ethereum/EIPs
|
closed
|
Automatically merge all new EIP drafts
|
w-stale enhancement r-ci r-process e-consensus
|
### Proposed Change
Currently, the process for creating an EIP is not very simple and it is pretty time-consuming just to get an EIP to draft. I imagine a large part of the reason that a lot of EIPs end up stagnant is that the effort even to get an EIP into draft is a lot - more than is probably needed when the criteria for drafts, per EIP-1, is them being "properly formatted." We have tooling that lets users know when their EIPs aren't properly formatted, and we might soon have tooling to automatically *fix* the most common errors.
Currently, when an EIP is initially proposed is when things like "is this EIP ideal for its current purpose" are discussed and block the progress. Having an actual EIP that is being built and on which people can submit PRs that propose changes is, IMHO, better than the status quo, and would actually allow the community to take *some* of the burden off of the EIP editors.
|
1.0
|
Automatically merge all new EIP drafts - ### Proposed Change
Currently, the process for creating an EIP is not very simple and it is pretty time-consuming just to get an EIP to draft. I imagine a large part of the reason that a lot of EIPs end up stagnant is that the effort even to get an EIP into draft is a lot - more than is probably needed when the criteria for drafts, per EIP-1, is them being "properly formatted." We have tooling that lets users know when their EIPs aren't properly formatted, and we might soon have tooling to automatically *fix* the most common errors.
Currently, when an EIP is initially proposed is when things like "is this EIP ideal for its current purpose" are discussed and block the progress. Having an actual EIP that is being built and on which people can submit PRs that propose changes is, IMHO, better than the status quo, and would actually allow the community to take *some* of the burden off of the EIP editors.
|
process
|
automatically merge all new eip drafts proposed change currently the process for creating an eip is not very simple and it is pretty time consuming just to get an eip to draft i imagine a large part of the reason that a lot of eips end up stagnant is that the effort even to get an eip into draft is a lot more than is probably needed when the criteria for drafts per eip is them being properly formatted we have tooling that lets users know when their eips aren t properly formatted and we might soon have tooling to automatically fix the most common errors currently when an eip is initially proposed is when things like is this eip ideal for its current purpose are discussed and block the progress having an actual eip that is being built and on which people can submit prs that propose changes is imho better than the status quo and would actually allow the community to take some of the burden off of the eip editors
| 1
|
8,304
| 11,463,344,313
|
IssuesEvent
|
2020-02-07 15:50:12
|
prisma/prisma2
|
https://api.github.com/repos/prisma/prisma2
|
opened
|
[Introspection] Better `introspect` success message
|
kind/improvement process/candidate topic: dx
|
A successful `introspect` run currently looks like this:
```
λ npx prisma2@alpha introspect
Introspecting …
Done with introspection in 1.42s
Wrote schema.prisma
```
We should probably try to make this a bit nicer.
1. The text itself can be a lot more descriptive ("datamodel was added to your `schema.prisma`") and for example offer next steps ("Now use `prisma2 generate` to create a Prisma Client for your database").
2. We could also output some statistics about the datamodel that was added: x models, y relations, z indexes (Actually only the "model" part might be reasonably simple, but maybe this is a good place to go a bit overbord and have fun)
|
1.0
|
[Introspection] Better `introspect` success message - A successful `introspect` run currently looks like this:
```
λ npx prisma2@alpha introspect
Introspecting …
Done with introspection in 1.42s
Wrote schema.prisma
```
We should probably try to make this a bit nicer.
1. The text itself can be a lot more descriptive ("datamodel was added to your `schema.prisma`") and for example offer next steps ("Now use `prisma2 generate` to create a Prisma Client for your database").
2. We could also output some statistics about the datamodel that was added: x models, y relations, z indexes (Actually only the "model" part might be reasonably simple, but maybe this is a good place to go a bit overbord and have fun)
|
process
|
better introspect success message a successful introspect run currently looks like this λ npx alpha introspect introspecting … done with introspection in wrote schema prisma we should probably try to make this a bit nicer the text itself can be a lot more descriptive datamodel was added to your schema prisma and for example offer next steps now use generate to create a prisma client for your database we could also output some statistics about the datamodel that was added x models y relations z indexes actually only the model part might be reasonably simple but maybe this is a good place to go a bit overbord and have fun
| 1
|
6,186
| 9,102,320,633
|
IssuesEvent
|
2019-02-20 13:30:08
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
opened
|
Consistent environment variables
|
meta process
|
AFAIK we have no rule when to check for environment variables.
I see three possibilities to do so:
1. On startup
1. On first use
1. On each use
So far we mainly use `3` and some times `1`. There is probably also usage of `2` but I could not find any by briefly searching through the code base.
Do we want to consolidate the behavior to always follow the same rules (that might not be possible in all cases but probably in most)?
To give examples for all three behaviors:
```js
// 1.
const env = process.env.FOOBAR;
function abc() {
if (env) {
doThings();
}
}
// 2.
let env = null;
function abc() {
if (env === null) {
env = process.env.FOOBAR;
}
if (env) {
doThings();
}
}
// 3.
function abc() {
if (process.env.FOOBAR) {
doThings();
}
}
```
|
1.0
|
Consistent environment variables - AFAIK we have no rule when to check for environment variables.
I see three possibilities to do so:
1. On startup
1. On first use
1. On each use
So far we mainly use `3` and some times `1`. There is probably also usage of `2` but I could not find any by briefly searching through the code base.
Do we want to consolidate the behavior to always follow the same rules (that might not be possible in all cases but probably in most)?
To give examples for all three behaviors:
```js
// 1.
const env = process.env.FOOBAR;
function abc() {
if (env) {
doThings();
}
}
// 2.
let env = null;
function abc() {
if (env === null) {
env = process.env.FOOBAR;
}
if (env) {
doThings();
}
}
// 3.
function abc() {
if (process.env.FOOBAR) {
doThings();
}
}
```
|
process
|
consistent environment variables afaik we have no rule when to check for environment variables i see three possibilities to do so on startup on first use on each use so far we mainly use and some times there is probably also usage of but i could not find any by briefly searching through the code base do we want to consolidate the behavior to always follow the same rules that might not be possible in all cases but probably in most to give examples for all three behaviors js const env process env foobar function abc if env dothings let env null function abc if env null env process env foobar if env dothings function abc if process env foobar dothings
| 1
|
4,721
| 7,552,881,100
|
IssuesEvent
|
2018-04-19 02:55:45
|
UnbFeelings/unb-feelings-docs
|
https://api.github.com/repos/UnbFeelings/unb-feelings-docs
|
closed
|
[Não Conformidade] Lista informal de requisitos
|
Processo Qualidade invalid
|
@UnbFeelings/devel
A auditoria da [Lista Informal de Requisitos](https://github.com/UnbFeelings/unb-feelings-docs/wiki/Processo#2211-elicitar-requisitos) foi feita com base nos [critérios](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Crit%C3%A9rios-de-Avalia%C3%A7%C3%A3o-e-T%C3%A9cnicas-de-Auditoria#lista-informal-de-requisitos) pré-estabelecidos para poder verificar se a atividade Elicitar Requisitos aconteceu. O resultado da auditoria pode ser acessado através da seguinte página: [Lista Informal de Requisitos](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Auditoria-Lista-informal-de-requisitos---Ciclo-1).
_Observação: Não é possível delimitar prazo para a correção, já que a fase de requisitos do ciclo já foi realizada._
### Descrição
O Artefato, Lista Informal de Requisitos, planejado no processo não foi criado ou documentado pela equipe de desenvolvimento.
#### Recomendações
Procurar trazer o cliente para dentro do processo de desenvolvimento o máximo possível, já que essa fase inicial de requisitos foi afetada pela falta desse artefato previsto no processo.
### Detalhes
**Autor:** Allan Nobre
**Tipo:** Subprocesso de Requisitos, atividade Elicitar Requisitos
**Prazo:** Seria de 1 dia para a correção, pela Matriz GUT, mas não cabe correção para esse caso
|
1.0
|
[Não Conformidade] Lista informal de requisitos - @UnbFeelings/devel
A auditoria da [Lista Informal de Requisitos](https://github.com/UnbFeelings/unb-feelings-docs/wiki/Processo#2211-elicitar-requisitos) foi feita com base nos [critérios](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Crit%C3%A9rios-de-Avalia%C3%A7%C3%A3o-e-T%C3%A9cnicas-de-Auditoria#lista-informal-de-requisitos) pré-estabelecidos para poder verificar se a atividade Elicitar Requisitos aconteceu. O resultado da auditoria pode ser acessado através da seguinte página: [Lista Informal de Requisitos](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Auditoria-Lista-informal-de-requisitos---Ciclo-1).
_Observação: Não é possível delimitar prazo para a correção, já que a fase de requisitos do ciclo já foi realizada._
### Descrição
O Artefato, Lista Informal de Requisitos, planejado no processo não foi criado ou documentado pela equipe de desenvolvimento.
#### Recomendações
Procurar trazer o cliente para dentro do processo de desenvolvimento o máximo possível, já que essa fase inicial de requisitos foi afetada pela falta desse artefato previsto no processo.
### Detalhes
**Autor:** Allan Nobre
**Tipo:** Subprocesso de Requisitos, atividade Elicitar Requisitos
**Prazo:** Seria de 1 dia para a correção, pela Matriz GUT, mas não cabe correção para esse caso
|
process
|
lista informal de requisitos unbfeelings devel a auditoria da foi feita com base nos pré estabelecidos para poder verificar se a atividade elicitar requisitos aconteceu o resultado da auditoria pode ser acessado através da seguinte página observação não é possível delimitar prazo para a correção já que a fase de requisitos do ciclo já foi realizada descrição o artefato lista informal de requisitos planejado no processo não foi criado ou documentado pela equipe de desenvolvimento recomendações procurar trazer o cliente para dentro do processo de desenvolvimento o máximo possível já que essa fase inicial de requisitos foi afetada pela falta desse artefato previsto no processo detalhes autor allan nobre tipo subprocesso de requisitos atividade elicitar requisitos prazo seria de dia para a correção pela matriz gut mas não cabe correção para esse caso
| 1
|
58,198
| 3,088,012,848
|
IssuesEvent
|
2015-08-25 14:44:31
|
brharp/hjckrrh
|
https://api.github.com/repos/brharp/hjckrrh
|
closed
|
G0 - File attachment types don't follow spec
|
feature: general (G) priority: normal type: bug
|
eg. Events is only: txt pdf
News is: txt pdf doc docx.
Basic Page is: txt pdf doc docx rtf png gif jpg jpeg.
|
1.0
|
G0 - File attachment types don't follow spec - eg. Events is only: txt pdf
News is: txt pdf doc docx.
Basic Page is: txt pdf doc docx rtf png gif jpg jpeg.
|
non_process
|
file attachment types don t follow spec eg events is only txt pdf news is txt pdf doc docx basic page is txt pdf doc docx rtf png gif jpg jpeg
| 0
|
341,989
| 24,724,295,074
|
IssuesEvent
|
2022-10-20 13:03:55
|
ham-radio-software/lzhuf
|
https://api.github.com/repos/ham-radio-software/lzhuf
|
closed
|
Create Debian packages for x86_64
|
documentation enhancement fixed
|
Need to use the Github Action to host a dockerfile to do a chroot Debian packages for x86_64.
The following Debian packages are needed for x86_64:
- Anti-X 21 (Bullseye)
- Ubuntu 18.04
- Ubuntu 20.04
- Ubuntu 22.04
|
1.0
|
Create Debian packages for x86_64 - Need to use the Github Action to host a dockerfile to do a chroot Debian packages for x86_64.
The following Debian packages are needed for x86_64:
- Anti-X 21 (Bullseye)
- Ubuntu 18.04
- Ubuntu 20.04
- Ubuntu 22.04
|
non_process
|
create debian packages for need to use the github action to host a dockerfile to do a chroot debian packages for the following debian packages are needed for anti x bullseye ubuntu ubuntu ubuntu
| 0
|
19,073
| 25,102,363,054
|
IssuesEvent
|
2022-11-08 14:27:46
|
tdwg/dwc
|
https://api.github.com/repos/tdwg/dwc
|
closed
|
New term - subfamily
|
Term - add Class - Taxon normative Process - complete
|
Was https://code.google.com/p/darwincore/issues/detail?id=146
## New Term
Submitter: David Remsen
Justification: For many insect groups, and perhaps for other taxa, there are a number of important intermediate taxon ranks between Family and Genus. While the higher taxon groupings (Kingdom-Order) also can be subcategorized with sub- and super- intermediates, the use of subfamily is intended to record a finer grade classification than is now possible with the existing terms.
Proponents: iDigBio, World Flora Online, Catalogue of Life, Canadensys
Definition: The full scientific name of the subfamily in which the taxon is classified.
Comment:
Examples: `Periptyctinae`, `Orchidoideae`, `Sphindociinae`
Refines: None
Replaces: None
ABCD 2.06: /DataSets/DataSet/Units/Unit/Identifications/Identification/Result/TaxonIdentified/HigherTaxa/HigherTaxon/HigherTaxonName with /DataSets/DataSet/Units/Unit/Identifications/Identification/Result/TaxonIdentified/HigherTaxa/HigherTaxon/HigherTaxonRank == "subfamilia"
Oct 3, 2013 comment #5 gtuco.btuco
I would like to promote the adoption of the concept mentioned in this issue. To do so, I will need a stronger proposal demonstrating the need to share this information - that is, that independent groups, organizations, projects have the same need and can reach a consensus proposal about how the term should be used. It might be a good idea to circulate the proposal on tdwg-content and see if a community can be built around and support the addition.
|
1.0
|
New term - subfamily - Was https://code.google.com/p/darwincore/issues/detail?id=146
## New Term
Submitter: David Remsen
Justification: For many insect groups, and perhaps for other taxa, there are a number of important intermediate taxon ranks between Family and Genus. While the higher taxon groupings (Kingdom-Order) also can be subcategorized with sub- and super- intermediates, the use of subfamily is intended to record a finer grade classification than is now possible with the existing terms.
Proponents: iDigBio, World Flora Online, Catalogue of Life, Canadensys
Definition: The full scientific name of the subfamily in which the taxon is classified.
Comment:
Examples: `Periptyctinae`, `Orchidoideae`, `Sphindociinae`
Refines: None
Replaces: None
ABCD 2.06: /DataSets/DataSet/Units/Unit/Identifications/Identification/Result/TaxonIdentified/HigherTaxa/HigherTaxon/HigherTaxonName with /DataSets/DataSet/Units/Unit/Identifications/Identification/Result/TaxonIdentified/HigherTaxa/HigherTaxon/HigherTaxonRank == "subfamilia"
Oct 3, 2013 comment #5 gtuco.btuco
I would like to promote the adoption of the concept mentioned in this issue. To do so, I will need a stronger proposal demonstrating the need to share this information - that is, that independent groups, organizations, projects have the same need and can reach a consensus proposal about how the term should be used. It might be a good idea to circulate the proposal on tdwg-content and see if a community can be built around and support the addition.
|
process
|
new term subfamily was new term submitter david remsen justification for many insect groups and perhaps for other taxa there are a number of important intermediate taxon ranks between family and genus while the higher taxon groupings kingdom order also can be subcategorized with sub and super intermediates the use of subfamily is intended to record a finer grade classification than is now possible with the existing terms proponents idigbio world flora online catalogue of life canadensys definition the full scientific name of the subfamily in which the taxon is classified comment examples periptyctinae orchidoideae sphindociinae refines none replaces none abcd datasets dataset units unit identifications identification result taxonidentified highertaxa highertaxon highertaxonname with datasets dataset units unit identifications identification result taxonidentified highertaxa highertaxon highertaxonrank subfamilia oct comment gtuco btuco i would like to promote the adoption of the concept mentioned in this issue to do so i will need a stronger proposal demonstrating the need to share this information that is that independent groups organizations projects have the same need and can reach a consensus proposal about how the term should be used it might be a good idea to circulate the proposal on tdwg content and see if a community can be built around and support the addition
| 1
|
19,183
| 13,203,156,868
|
IssuesEvent
|
2020-08-14 13:41:32
|
brave/brave-browser
|
https://api.github.com/repos/brave/brave-browser
|
closed
|
Missing a way to check latest stable version / Standardize GitHub releases
|
OS/Android OS/Desktop infrastructure packaging
|
Hi, I maintain a Linux packaging script for Brave. As part of that work I need to be aware when a new stable version comes out.
Unfortunately for Linux you do not have a link under https://laptop-updates.brave.com/latest/xxxx like you do for Windows and macOS to download the latest binary (currently it seems the fallback route handler for that server redirects to manual Linux installation instructions).
Right now I'm resorting to parsing your GitHub releases every half an hour or so in an attempt to find a release marked as stable, but unfortunately the GitHub release seem to be done by hand and are not consistent (1.12.112 was first marked as stable, then prerelease like an hour or so later, then stable again; stable release titles have used different words in the past; etc).
Is there any reliable way I can check via automation what the latest release of Brave for Linux is? And if not, can we have one?
|
1.0
|
Missing a way to check latest stable version / Standardize GitHub releases - Hi, I maintain a Linux packaging script for Brave. As part of that work I need to be aware when a new stable version comes out.
Unfortunately for Linux you do not have a link under https://laptop-updates.brave.com/latest/xxxx like you do for Windows and macOS to download the latest binary (currently it seems the fallback route handler for that server redirects to manual Linux installation instructions).
Right now I'm resorting to parsing your GitHub releases every half an hour or so in an attempt to find a release marked as stable, but unfortunately the GitHub release seem to be done by hand and are not consistent (1.12.112 was first marked as stable, then prerelease like an hour or so later, then stable again; stable release titles have used different words in the past; etc).
Is there any reliable way I can check via automation what the latest release of Brave for Linux is? And if not, can we have one?
|
non_process
|
missing a way to check latest stable version standardize github releases hi i maintain a linux packaging script for brave as part of that work i need to be aware when a new stable version comes out unfortunately for linux you do not have a link under like you do for windows and macos to download the latest binary currently it seems the fallback route handler for that server redirects to manual linux installation instructions right now i m resorting to parsing your github releases every half an hour or so in an attempt to find a release marked as stable but unfortunately the github release seem to be done by hand and are not consistent was first marked as stable then prerelease like an hour or so later then stable again stable release titles have used different words in the past etc is there any reliable way i can check via automation what the latest release of brave for linux is and if not can we have one
| 0
|
6,266
| 9,220,672,273
|
IssuesEvent
|
2019-03-11 18:02:49
|
P0cL4bs/WiFi-Pumpkin
|
https://api.github.com/repos/P0cL4bs/WiFi-Pumpkin
|
closed
|
DNSMasq DHCP error
|
enhancement in process priority solved
|
When I select DNSMasq DHCP Server and start, I get an error message
Traceback (most recent call last):
File "/usr/share/WiFi-Pumpkin/core/main.py", line 816, in start_access_point
self.dhcpcontrol.Start()
File "/usr/share/WiFi-Pumpkin/core/controllers/dhcpcontroller.py", line 14, in Start
self.Active.Start()
File "/usr/share/WiFi-Pumpkin/core/servers/dhcp/dhcp.py", line 45, in Start
self.Initialize()
File "/usr/share/WiFi-Pumpkin/core/servers/dhcp/DNSMasq.py", line 21, in Initialize
with open(leases, 'wb') as leaconf:
TypeError: an integer is required
I do not know how to fix that. I would use the Python DHCP which works perfectly, but I need the DHCP to be a specific range and I cannot change these settings for the Python DHCP. It is always the network 10.0.0.0
Help would be appreciated
|
1.0
|
DNSMasq DHCP error - When I select DNSMasq DHCP Server and start, I get an error message
Traceback (most recent call last):
File "/usr/share/WiFi-Pumpkin/core/main.py", line 816, in start_access_point
self.dhcpcontrol.Start()
File "/usr/share/WiFi-Pumpkin/core/controllers/dhcpcontroller.py", line 14, in Start
self.Active.Start()
File "/usr/share/WiFi-Pumpkin/core/servers/dhcp/dhcp.py", line 45, in Start
self.Initialize()
File "/usr/share/WiFi-Pumpkin/core/servers/dhcp/DNSMasq.py", line 21, in Initialize
with open(leases, 'wb') as leaconf:
TypeError: an integer is required
I do not know how to fix that. I would use the Python DHCP which works perfectly, but I need the DHCP to be a specific range and I cannot change these settings for the Python DHCP. It is always the network 10.0.0.0
Help would be appreciated
|
process
|
dnsmasq dhcp error when i select dnsmasq dhcp server and start i get an error message traceback most recent call last file usr share wifi pumpkin core main py line in start access point self dhcpcontrol start file usr share wifi pumpkin core controllers dhcpcontroller py line in start self active start file usr share wifi pumpkin core servers dhcp dhcp py line in start self initialize file usr share wifi pumpkin core servers dhcp dnsmasq py line in initialize with open leases wb as leaconf typeerror an integer is required i do not know how to fix that i would use the python dhcp which works perfectly but i need the dhcp to be a specific range and i cannot change these settings for the python dhcp it is always the network help would be appreciated
| 1
|
18,590
| 24,568,966,847
|
IssuesEvent
|
2022-10-13 07:02:37
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Android] Getting an error message when clicked on 'App level notifications' in the mobile app
|
Bug P1 Android Process: Fixed Process: Tested QA Process: Tested dev
|
**Steps:**
1. Sign in / Sign up in the mobile app
2. Go to SB, send app level notifications
3. Now, Open the mobile app
4. Click on 'Notification' icon
5. Click on that particular notification and Verify
**AR:** Getting an error message stating as 'This study is not available' , when clicked on 'App level notifications' in the mobile app
**ER:** An error message should not get displayed when clicked on 'App level notifications' in the mobile app

|
3.0
|
[Android] Getting an error message when clicked on 'App level notifications' in the mobile app - **Steps:**
1. Sign in / Sign up in the mobile app
2. Go to SB, send app level notifications
3. Now, Open the mobile app
4. Click on 'Notification' icon
5. Click on that particular notification and Verify
**AR:** Getting an error message stating as 'This study is not available' , when clicked on 'App level notifications' in the mobile app
**ER:** An error message should not get displayed when clicked on 'App level notifications' in the mobile app

|
process
|
getting an error message when clicked on app level notifications in the mobile app steps sign in sign up in the mobile app go to sb send app level notifications now open the mobile app click on notification icon click on that particular notification and verify ar getting an error message stating as this study is not available when clicked on app level notifications in the mobile app er an error message should not get displayed when clicked on app level notifications in the mobile app
| 1
|
18,227
| 24,292,517,334
|
IssuesEvent
|
2022-09-29 07:27:58
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
Building a CMake project using Bazel
|
type: support / not a bug (process) untriaged team-OSS
|
I am trying to build a CMake project using Bazel. The folder structure looks like this:
BazelCmake
1. WORKSPACE.bazel
2. Source
3. Build
BazelCmake is the parent folder and Source subfolder contains the source files. I created the Build folder where i will be putting the build files. The Source folder looks like this:
Source
1. BUILD.bazel
2. CMakeLists.txt
3. tutorial.cpp
The contents of the workspace file are:
```
workspace(name = "rules_foreign_cc_usage_example")
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
# Rule repository, note that it's recommended to use a pinned commit to a released version of the rules
http_archive(
name = "rules_foreign_cc",
sha256 = "c2cdcf55ffaf49366725639e45dedd449b8c3fe22b54e31625eb80ce3a240f1e",
strip_prefix = "rules_foreign_cc-0.1.0",
url = "https://github.com/bazelbuild/rules_foreign_cc/archive/0.1.0.zip",
)
load("@rules_foreign_cc//foreign_cc:repositories.bzl", "rules_foreign_cc_dependencies")
# This sets up some common toolchains for building targets. For more details, please see
# https://github.com/bazelbuild/rules_foreign_cc/tree/main/docs#rules_foreign_cc_dependencies
rules_foreign_cc_dependencies()
_ALL_CONTENT = """\
filegroup(
name = "all_srcs",
srcs = glob(["**"]),
visibility = ["//visibility:public"],
)
"""
# pcre source code repository
http_archive(
name = "pcre",
build_file_content = _ALL_CONTENT,
strip_prefix = "pcre-8.43",
urls = [
"https://mirror.bazel.build/ftp.pcre.org/pub/pcre/pcre-8.43.tar.gz",
"https://ftp.pcre.org/pub/pcre/pcre-8.43.tar.gz",
],
sha256 = "0b8e7465dc5e98c757cc3650a20a7843ee4c3edf50aaf60bb33fd879690d2c73",
)
```
This is the same as directed in the [official documentation](https://bazelbuild.github.io/rules_foreign_cc/main/cmake.html).
Apart from that, the Build.bazel file inside the Source folder looks like this:
```
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
cmake(
name = "pcre",
cache_entries = {
"CMAKE_C_FLAGS": "-fPIC",
},
lib_source = "@pcre//:all_srcs",
out_static_libs = ["libpcre.a"], )
```
This is also the same as given in the [official documentation](https://bazelbuild.github.io/rules_foreign_cc/main/cmake.html).
The contents of tutorial.cpp are:
```
#include <cmath>
#include <cstdlib>
#include <iostream>
#include <string>
using namespace std;
int main()
{
cout << "Hello "<<endl;
return 0;
}
```
The contents of CMakeLists.txt are:
```
cmake_minimum_required(VERSION 3.23.2)
project(Tutorial VERSION 1.0)
set(CMAKE_CXX_STANDARD 11)
set(CMAKE_CXX_STANDARD_REQUIRED True)
add_executable(Tutorial tutorial.cxx)
target_include_directories(Tutorial PUBLIC
"${PROJECT_BINARY_DIR}"
)
```
Now, when I open command prompt, enter the Build directory and type:
` bazel build //:pcre`
from there, I get the error:
```
Starting local Bazel server and connecting to it...
ERROR: error loading package '': Label '@rules_foreign_cc//foreign_cc:repositories.bzl' is invalid because 'foreign_cc' is not a package; perhaps you meant to put the colon here: '@rules_foreign_cc//:foreign_cc/repositories.bzl'?
INFO: Elapsed time: 6.982s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
```
I am very much new to Bazel and want to build CMake projects using Bazel. How do I resolve the error and build CMake projects using bazel?
Thanks.
|
1.0
|
Building a CMake project using Bazel - I am trying to build a CMake project using Bazel. The folder structure looks like this:
BazelCmake
1. WORKSPACE.bazel
2. Source
3. Build
BazelCmake is the parent folder and Source subfolder contains the source files. I created the Build folder where i will be putting the build files. The Source folder looks like this:
Source
1. BUILD.bazel
2. CMakeLists.txt
3. tutorial.cpp
The contents of the workspace file are:
```
workspace(name = "rules_foreign_cc_usage_example")
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
# Rule repository, note that it's recommended to use a pinned commit to a released version of the rules
http_archive(
name = "rules_foreign_cc",
sha256 = "c2cdcf55ffaf49366725639e45dedd449b8c3fe22b54e31625eb80ce3a240f1e",
strip_prefix = "rules_foreign_cc-0.1.0",
url = "https://github.com/bazelbuild/rules_foreign_cc/archive/0.1.0.zip",
)
load("@rules_foreign_cc//foreign_cc:repositories.bzl", "rules_foreign_cc_dependencies")
# This sets up some common toolchains for building targets. For more details, please see
# https://github.com/bazelbuild/rules_foreign_cc/tree/main/docs#rules_foreign_cc_dependencies
rules_foreign_cc_dependencies()
_ALL_CONTENT = """\
filegroup(
name = "all_srcs",
srcs = glob(["**"]),
visibility = ["//visibility:public"],
)
"""
# pcre source code repository
http_archive(
name = "pcre",
build_file_content = _ALL_CONTENT,
strip_prefix = "pcre-8.43",
urls = [
"https://mirror.bazel.build/ftp.pcre.org/pub/pcre/pcre-8.43.tar.gz",
"https://ftp.pcre.org/pub/pcre/pcre-8.43.tar.gz",
],
sha256 = "0b8e7465dc5e98c757cc3650a20a7843ee4c3edf50aaf60bb33fd879690d2c73",
)
```
This is the same as directed in the [official documentation](https://bazelbuild.github.io/rules_foreign_cc/main/cmake.html).
Apart from that, the Build.bazel file inside the Source folder looks like this:
```
load("@rules_foreign_cc//foreign_cc:defs.bzl", "cmake")
cmake(
name = "pcre",
cache_entries = {
"CMAKE_C_FLAGS": "-fPIC",
},
lib_source = "@pcre//:all_srcs",
out_static_libs = ["libpcre.a"], )
```
This is also the same as given in the [official documentation](https://bazelbuild.github.io/rules_foreign_cc/main/cmake.html).
The contents of tutorial.cpp are:
```
#include <cmath>
#include <cstdlib>
#include <iostream>
#include <string>
using namespace std;
int main()
{
cout << "Hello "<<endl;
return 0;
}
```
The contents of CMakeLists.txt are:
```
cmake_minimum_required(VERSION 3.23.2)
project(Tutorial VERSION 1.0)
set(CMAKE_CXX_STANDARD 11)
set(CMAKE_CXX_STANDARD_REQUIRED True)
add_executable(Tutorial tutorial.cxx)
target_include_directories(Tutorial PUBLIC
"${PROJECT_BINARY_DIR}"
)
```
Now, when I open command prompt, enter the Build directory and type:
` bazel build //:pcre`
from there, I get the error:
```
Starting local Bazel server and connecting to it...
ERROR: error loading package '': Label '@rules_foreign_cc//foreign_cc:repositories.bzl' is invalid because 'foreign_cc' is not a package; perhaps you meant to put the colon here: '@rules_foreign_cc//:foreign_cc/repositories.bzl'?
INFO: Elapsed time: 6.982s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
```
I am very much new to Bazel and want to build CMake projects using Bazel. How do I resolve the error and build CMake projects using bazel?
Thanks.
|
process
|
building a cmake project using bazel i am trying to build a cmake project using bazel the folder structure looks like this bazelcmake workspace bazel source build bazelcmake is the parent folder and source subfolder contains the source files i created the build folder where i will be putting the build files the source folder looks like this source build bazel cmakelists txt tutorial cpp the contents of the workspace file are workspace name rules foreign cc usage example load bazel tools tools build defs repo http bzl http archive rule repository note that it s recommended to use a pinned commit to a released version of the rules http archive name rules foreign cc strip prefix rules foreign cc url load rules foreign cc foreign cc repositories bzl rules foreign cc dependencies this sets up some common toolchains for building targets for more details please see rules foreign cc dependencies all content filegroup name all srcs srcs glob visibility pcre source code repository http archive name pcre build file content all content strip prefix pcre urls this is the same as directed in the apart from that the build bazel file inside the source folder looks like this load rules foreign cc foreign cc defs bzl cmake cmake name pcre cache entries cmake c flags fpic lib source pcre all srcs out static libs this is also the same as given in the the contents of tutorial cpp are include include include include using namespace std int main cout hello endl return the contents of cmakelists txt are cmake minimum required version project tutorial version set cmake cxx standard set cmake cxx standard required true add executable tutorial tutorial cxx target include directories tutorial public project binary dir now when i open command prompt enter the build directory and type bazel build pcre from there i get the error starting local bazel server and connecting to it error error loading package label rules foreign cc foreign cc repositories bzl is invalid because foreign cc is not a package perhaps you meant to put the colon here rules foreign cc foreign cc repositories bzl info elapsed time info processes failed build did not complete successfully packages loaded i am very much new to bazel and want to build cmake projects using bazel how do i resolve the error and build cmake projects using bazel thanks
| 1
|
147,836
| 23,281,646,195
|
IssuesEvent
|
2022-08-05 12:41:59
|
OpenRefine/OpenRefine
|
https://api.github.com/repos/OpenRefine/OpenRefine
|
opened
|
Extension point for cell rendering
|
enhancement UI to be reviewed extension design discussions
|
As part of the Wikimedia Commons integration project, there is an interest (@trnstlntk @lozanaross ) in customizing the cell rendering to include thumbnails of the media files being uploaded (or edited): https://github.com/OpenRefine/CommonsExtension/issues/34.
There is currently no official way for an extension to customize how the cells of a particular column are rendered. Therefore, implementing this would first require introducing the corresponding extension point in OpenRefine itself.
Introducing extension points is a subtle thing to do, as we need to care about the following aspects:
* The introduction of the extension points commits us to maintaining such an interface in a stable way in the future, to some extent. For instance, if we wanted to rewrite the grid rendering in a different framework, this would likely imply changing this extension point, hence breaking extensions which rely on it;
* The use case that motivates the introduction of the extension point should be generalized sufficiently for the extension point to be useful to other use cases: we need to make a good effort at thinking about in which other use cases one might want to customize the way cells are rendered.
### Proposed solution
We first need to enable extensions to store column-specific metadata which controls how the column is rendered. Introduce for this a JSON field in the column metadata object, where extensions can store arbitrary JSON.
Then, add a new extension point which lets extensions register cell rendering callbacks. A cell rendering callback is a javascript function, which takes as arguments:
* a cell object (as a JSON-deserialized object, which came from the backend)
* the column metadata for the column in which it is being rendered
The callback is expected to return:
* either a DOM element, which then becomes the rendered cell value (on top of which is added the "edit" button, still managed by the core software)
* or `null`, meaning that the callback is not attempting to override the rendering of this cell and delegates its rendering to other callbacks or the core software
When cell rendering callbacks are registered, they are also attributed a priority (an integer) by the registrant. When rendering a cell, cell rendering callbacks are executed in order of decreasing priority. The first callback that returns a DOM element determines how the cell is rendered. If all callbacks return `null`, then the core software is used to render the cell.
### Alternatives considered
Do not build an extension point for cell rendering, and instead build this feature in the core software, with the understanding that rendering file paths of images as thumbnails is a feature that would be useful to a wide enough community.
## Feedback welcome
@elebitzero @tfmorris Can you think of other ways to support this?
|
1.0
|
Extension point for cell rendering - As part of the Wikimedia Commons integration project, there is an interest (@trnstlntk @lozanaross ) in customizing the cell rendering to include thumbnails of the media files being uploaded (or edited): https://github.com/OpenRefine/CommonsExtension/issues/34.
There is currently no official way for an extension to customize how the cells of a particular column are rendered. Therefore, implementing this would first require introducing the corresponding extension point in OpenRefine itself.
Introducing extension points is a subtle thing to do, as we need to care about the following aspects:
* The introduction of the extension points commits us to maintaining such an interface in a stable way in the future, to some extent. For instance, if we wanted to rewrite the grid rendering in a different framework, this would likely imply changing this extension point, hence breaking extensions which rely on it;
* The use case that motivates the introduction of the extension point should be generalized sufficiently for the extension point to be useful to other use cases: we need to make a good effort at thinking about in which other use cases one might want to customize the way cells are rendered.
### Proposed solution
We first need to enable extensions to store column-specific metadata which controls how the column is rendered. Introduce for this a JSON field in the column metadata object, where extensions can store arbitrary JSON.
Then, add a new extension point which lets extensions register cell rendering callbacks. A cell rendering callback is a javascript function, which takes as arguments:
* a cell object (as a JSON-deserialized object, which came from the backend)
* the column metadata for the column in which it is being rendered
The callback is expected to return:
* either a DOM element, which then becomes the rendered cell value (on top of which is added the "edit" button, still managed by the core software)
* or `null`, meaning that the callback is not attempting to override the rendering of this cell and delegates its rendering to other callbacks or the core software
When cell rendering callbacks are registered, they are also attributed a priority (an integer) by the registrant. When rendering a cell, cell rendering callbacks are executed in order of decreasing priority. The first callback that returns a DOM element determines how the cell is rendered. If all callbacks return `null`, then the core software is used to render the cell.
### Alternatives considered
Do not build an extension point for cell rendering, and instead build this feature in the core software, with the understanding that rendering file paths of images as thumbnails is a feature that would be useful to a wide enough community.
## Feedback welcome
@elebitzero @tfmorris Can you think of other ways to support this?
|
non_process
|
extension point for cell rendering as part of the wikimedia commons integration project there is an interest trnstlntk lozanaross in customizing the cell rendering to include thumbnails of the media files being uploaded or edited there is currently no official way for an extension to customize how the cells of a particular column are rendered therefore implementing this would first require introducing the corresponding extension point in openrefine itself introducing extension points is a subtle thing to do as we need to care about the following aspects the introduction of the extension points commits us to maintaining such an interface in a stable way in the future to some extent for instance if we wanted to rewrite the grid rendering in a different framework this would likely imply changing this extension point hence breaking extensions which rely on it the use case that motivates the introduction of the extension point should be generalized sufficiently for the extension point to be useful to other use cases we need to make a good effort at thinking about in which other use cases one might want to customize the way cells are rendered proposed solution we first need to enable extensions to store column specific metadata which controls how the column is rendered introduce for this a json field in the column metadata object where extensions can store arbitrary json then add a new extension point which lets extensions register cell rendering callbacks a cell rendering callback is a javascript function which takes as arguments a cell object as a json deserialized object which came from the backend the column metadata for the column in which it is being rendered the callback is expected to return either a dom element which then becomes the rendered cell value on top of which is added the edit button still managed by the core software or null meaning that the callback is not attempting to override the rendering of this cell and delegates its rendering to other callbacks or the core software when cell rendering callbacks are registered they are also attributed a priority an integer by the registrant when rendering a cell cell rendering callbacks are executed in order of decreasing priority the first callback that returns a dom element determines how the cell is rendered if all callbacks return null then the core software is used to render the cell alternatives considered do not build an extension point for cell rendering and instead build this feature in the core software with the understanding that rendering file paths of images as thumbnails is a feature that would be useful to a wide enough community feedback welcome elebitzero tfmorris can you think of other ways to support this
| 0
|
18,551
| 24,555,419,982
|
IssuesEvent
|
2022-10-12 15:30:29
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[iOS] Study activities > App crashes clicking on 'Done' in below scenario for branching source questionnaire
|
Bug P0 iOS Process: Fixed Process: Tested dev
|
**Steps:**
1. Configure a source questionnaire from SB
2. Add some question steps for the above questionnaire
3. Apply branching to those questions
4. Now configure a date response type question and enable 'Use response as anchor date'
5. Configure branching in such a way that atleast 1 option chosen by participant skips the above date response type.
6. Configure a target questioonaire and Publish
7. Open the iOS mobile app
8. Open the source questionnaire
9. Submit options so that date response type is skipped
10. Click on 'Done' and observe app crashes
**Actual:** App is crashing
**Expected:** App should not crash
Issue not observed if date response type is not skipped
Issue not observed if 'Use response as anchor date' is disabled for date response type
Study details to test:
1. Study Name and Study ID: iosupdate1
2. App ID: GCPMOB001
3. Activity name: source with branching
In 1st option select A or B
In 2nd option select 'No' and click on 'Done'
**Refer video:**
https://user-images.githubusercontent.com/60386291/185616902-fd06beff-d7a7-4561-ba52-8129eb0dd40b.MOV
|
2.0
|
[iOS] Study activities > App crashes clicking on 'Done' in below scenario for branching source questionnaire - **Steps:**
1. Configure a source questionnaire from SB
2. Add some question steps for the above questionnaire
3. Apply branching to those questions
4. Now configure a date response type question and enable 'Use response as anchor date'
5. Configure branching in such a way that atleast 1 option chosen by participant skips the above date response type.
6. Configure a target questioonaire and Publish
7. Open the iOS mobile app
8. Open the source questionnaire
9. Submit options so that date response type is skipped
10. Click on 'Done' and observe app crashes
**Actual:** App is crashing
**Expected:** App should not crash
Issue not observed if date response type is not skipped
Issue not observed if 'Use response as anchor date' is disabled for date response type
Study details to test:
1. Study Name and Study ID: iosupdate1
2. App ID: GCPMOB001
3. Activity name: source with branching
In 1st option select A or B
In 2nd option select 'No' and click on 'Done'
**Refer video:**
https://user-images.githubusercontent.com/60386291/185616902-fd06beff-d7a7-4561-ba52-8129eb0dd40b.MOV
|
process
|
study activities app crashes clicking on done in below scenario for branching source questionnaire steps configure a source questionnaire from sb add some question steps for the above questionnaire apply branching to those questions now configure a date response type question and enable use response as anchor date configure branching in such a way that atleast option chosen by participant skips the above date response type configure a target questioonaire and publish open the ios mobile app open the source questionnaire submit options so that date response type is skipped click on done and observe app crashes actual app is crashing expected app should not crash issue not observed if date response type is not skipped issue not observed if use response as anchor date is disabled for date response type study details to test study name and study id app id activity name source with branching in option select a or b in option select no and click on done refer video
| 1
|
338,587
| 24,591,618,191
|
IssuesEvent
|
2022-10-14 03:09:33
|
Final-healthree/healthree-backend
|
https://api.github.com/repos/Final-healthree/healthree-backend
|
closed
|
Issue : 우분투에 배포 시 `editly` 실행 환경 설정 건 (`Headless-gi`과 `Xvfb`)
|
documentation
|
## 9월 20일
> [Bug : ffmpeg-fluent 영상 병합(concatenate) 시 에러 발생 #124](https://github.com/Final-healthree/healthree-backend/issues/124)
`editly`란 새로운 라이브러리를 사용함으로써 영상 병합이 로컬에서는 잘 적용이 되었지만 Ubuntu 배포 환경에서는 또 다시 작동이 되지 않았다.
**`Error, gl returned null, this probably means that some dependencies are not installed. See README`**
<br>
`gl`이 확실히 무엇인지는 모르지만, 일단 에러 메시지를 읽으면 `editly`는 `gl`에 의존하고 있는데 이것이 설치되지 않아서 발생한 에러라고 한다.
> `gl`
>
> > [npm gl 공식문서](https://www.npmjs.com/package/gl)
> >
> > - Creates a **WebGL** context **without a window**
> > - lets you create a WebGL context in Node.js **without making a window or loading a full browser environment.**
추후 npm 공식 페이지에서 `gl`을 검색했을 때 `Headless-gl`이란 표현이 있었는데, **headless란 보통 GUI로 동작하는 웹 브라우저를 GUI 없이 오직 프로그램만으로 사용하는 것**을 의미한다고 한다. 개발자들이 **Command Line**으로만 다루고 확인할 수 있는 것이라고 생각하면 될 것 같다.
리눅스 기반인 우분투에서 배포를 하고 있기 때문에 발생한 에러라고 생각하여 우리는 로컬로 돌아와 `gl`을 다시 설치하여 배포를 진행했다.
```javascript
// package.json
"dependencies" : {
...
"gl": "^5.0.3",
}
```
|
1.0
|
Issue : 우분투에 배포 시 `editly` 실행 환경 설정 건 (`Headless-gi`과 `Xvfb`) - ## 9월 20일
> [Bug : ffmpeg-fluent 영상 병합(concatenate) 시 에러 발생 #124](https://github.com/Final-healthree/healthree-backend/issues/124)
`editly`란 새로운 라이브러리를 사용함으로써 영상 병합이 로컬에서는 잘 적용이 되었지만 Ubuntu 배포 환경에서는 또 다시 작동이 되지 않았다.
**`Error, gl returned null, this probably means that some dependencies are not installed. See README`**
<br>
`gl`이 확실히 무엇인지는 모르지만, 일단 에러 메시지를 읽으면 `editly`는 `gl`에 의존하고 있는데 이것이 설치되지 않아서 발생한 에러라고 한다.
> `gl`
>
> > [npm gl 공식문서](https://www.npmjs.com/package/gl)
> >
> > - Creates a **WebGL** context **without a window**
> > - lets you create a WebGL context in Node.js **without making a window or loading a full browser environment.**
추후 npm 공식 페이지에서 `gl`을 검색했을 때 `Headless-gl`이란 표현이 있었는데, **headless란 보통 GUI로 동작하는 웹 브라우저를 GUI 없이 오직 프로그램만으로 사용하는 것**을 의미한다고 한다. 개발자들이 **Command Line**으로만 다루고 확인할 수 있는 것이라고 생각하면 될 것 같다.
리눅스 기반인 우분투에서 배포를 하고 있기 때문에 발생한 에러라고 생각하여 우리는 로컬로 돌아와 `gl`을 다시 설치하여 배포를 진행했다.
```javascript
// package.json
"dependencies" : {
...
"gl": "^5.0.3",
}
```
|
non_process
|
issue 우분투에 배포 시 editly 실행 환경 설정 건 headless gi 과 xvfb editly 란 새로운 라이브러리를 사용함으로써 영상 병합이 로컬에서는 잘 적용이 되었지만 ubuntu 배포 환경에서는 또 다시 작동이 되지 않았다 error gl returned null this probably means that some dependencies are not installed see readme gl 이 확실히 무엇인지는 모르지만 일단 에러 메시지를 읽으면 editly 는 gl 에 의존하고 있는데 이것이 설치되지 않아서 발생한 에러라고 한다 gl creates a webgl context without a window lets you create a webgl context in node js without making a window or loading a full browser environment 추후 npm 공식 페이지에서 gl 을 검색했을 때 headless gl 이란 표현이 있었는데 headless란 보통 gui로 동작하는 웹 브라우저를 gui 없이 오직 프로그램만으로 사용하는 것 을 의미한다고 한다 개발자들이 command line 으로만 다루고 확인할 수 있는 것이라고 생각하면 될 것 같다 리눅스 기반인 우분투에서 배포를 하고 있기 때문에 발생한 에러라고 생각하여 우리는 로컬로 돌아와 gl 을 다시 설치하여 배포를 진행했다 javascript package json dependencies gl
| 0
|
19,421
| 25,568,601,716
|
IssuesEvent
|
2022-11-30 15:59:01
|
argosp/trialdash
|
https://api.github.com/repos/argosp/trialdash
|
closed
|
Resolutions conflicts
|
question in process
|
@HadasS888
There is a difference in the resolution between figma demand to the current project resolution, as you can see here #253
Figma demand height - 1127px
Current stage height - 637px
It is not clear to me if you want me to fix it and work by Figma demands or stick to the current stage as it is.
If it does relevant to fix it by Figma demands, there is two options I came up with, please refer or offer another solution.
option 1: make components smaller - This can lead to difficulty reading due to low resolution
option 2: expand the height of elements in current stage - This will lead to expanded window view that can make handling harder ( because of scroll between elemetns )
|
1.0
|
Resolutions conflicts - @HadasS888
There is a difference in the resolution between figma demand to the current project resolution, as you can see here #253
Figma demand height - 1127px
Current stage height - 637px
It is not clear to me if you want me to fix it and work by Figma demands or stick to the current stage as it is.
If it does relevant to fix it by Figma demands, there is two options I came up with, please refer or offer another solution.
option 1: make components smaller - This can lead to difficulty reading due to low resolution
option 2: expand the height of elements in current stage - This will lead to expanded window view that can make handling harder ( because of scroll between elemetns )
|
process
|
resolutions conflicts there is a difference in the resolution between figma demand to the current project resolution as you can see here figma demand height current stage height it is not clear to me if you want me to fix it and work by figma demands or stick to the current stage as it is if it does relevant to fix it by figma demands there is two options i came up with please refer or offer another solution option make components smaller this can lead to difficulty reading due to low resolution option expand the height of elements in current stage this will lead to expanded window view that can make handling harder because of scroll between elemetns
| 1
|
13,543
| 16,085,812,214
|
IssuesEvent
|
2021-04-26 11:03:10
|
aiidateam/aiida-core
|
https://api.github.com/repos/aiidateam/aiida-core
|
closed
|
Add automated namespace support to `inputs` and `outputs` node neighbor manager
|
priority/nice-to-have topic/orm topic/processes type/accepted feature
|
The `inputs` and `outputs` node neighbor managers support addressing process node inputs or outputs through attribute dereferencing by the link labels, e.g.:
```
node.outputs.some_output
```
Both the inputs and outputs can have arbitrarily nested namespaces, for example:
```
{
'relax': {
'structure': StructureData()
}
}
```
However, since the links are flat, the resulting link label is `relax__structure`, where the namespace is indicated by the double underscore. To retrieve this output one should use `node.outputs.relax__structure`. However, it would be nice if the neighbor manager understands the double underscore and allows it to be retrieved through the syntax `node.outputs.relax.structure`.
|
1.0
|
Add automated namespace support to `inputs` and `outputs` node neighbor manager - The `inputs` and `outputs` node neighbor managers support addressing process node inputs or outputs through attribute dereferencing by the link labels, e.g.:
```
node.outputs.some_output
```
Both the inputs and outputs can have arbitrarily nested namespaces, for example:
```
{
'relax': {
'structure': StructureData()
}
}
```
However, since the links are flat, the resulting link label is `relax__structure`, where the namespace is indicated by the double underscore. To retrieve this output one should use `node.outputs.relax__structure`. However, it would be nice if the neighbor manager understands the double underscore and allows it to be retrieved through the syntax `node.outputs.relax.structure`.
|
process
|
add automated namespace support to inputs and outputs node neighbor manager the inputs and outputs node neighbor managers support addressing process node inputs or outputs through attribute dereferencing by the link labels e g node outputs some output both the inputs and outputs can have arbitrarily nested namespaces for example relax structure structuredata however since the links are flat the resulting link label is relax structure where the namespace is indicated by the double underscore to retrieve this output one should use node outputs relax structure however it would be nice if the neighbor manager understands the double underscore and allows it to be retrieved through the syntax node outputs relax structure
| 1
|
383,562
| 11,357,923,702
|
IssuesEvent
|
2020-01-25 10:05:46
|
woocommerce/woocommerce-gateway-stripe
|
https://api.github.com/repos/woocommerce/woocommerce-gateway-stripe
|
opened
|
Handle Stripe.com account changes
|
Priority: Low [Type] Bug
|
**Describe the bug**
Under certain circumstances, store owners may need to change the Stripe.com account.
This will cause the error shown in #1092 but has a large scope of how to fix that.
**To Reproduce**
I chatted to a store owner on WooCommerce.com live chat, this was their scenario:
> We recently went through a legal change that required us to create a new Stripe account. We were using the Stripe/Woo plugin fine previously. After inputting the new Stripe keys for our new account, we tried placing a test order in test mode using my WordPress Admin user, but we received the following error
>
> No such customer: cus_xxxxxxxxx
>
> Later we tried to place another test order in incognito mode and it worked, but we wanted to check with you about this, and see if you could take a look and give us directions. Reading some support forums such as the one below it looks like this issue could potentially affect all existing customers.
Also https://github.com/woocommerce/woocommerce-gateway-stripe/issues/1092#issuecomment-571888062 by @richardsplayground
> We tried changing the Stripe public/secret keys in the plugin, and immediately started getting this error for all customers that had previously made a transaction with our old Stripe account.
**Expected behavior**
Store owners can change their Stripe.com if required using the Settings screen, our extension handles the code aspect
**Additional context**
p6q7sZ-6uE-p2#comment-19674
cc @compchris per p2 post and @richardsplayground due to your comment in #1092
@allendav I'm placing Low due to the description of that label - this only affects a small number of users but is a show stopper for those customers so you may want to reprioritize.
|
1.0
|
Handle Stripe.com account changes - **Describe the bug**
Under certain circumstances, store owners may need to change the Stripe.com account.
This will cause the error shown in #1092 but has a large scope of how to fix that.
**To Reproduce**
I chatted to a store owner on WooCommerce.com live chat, this was their scenario:
> We recently went through a legal change that required us to create a new Stripe account. We were using the Stripe/Woo plugin fine previously. After inputting the new Stripe keys for our new account, we tried placing a test order in test mode using my WordPress Admin user, but we received the following error
>
> No such customer: cus_xxxxxxxxx
>
> Later we tried to place another test order in incognito mode and it worked, but we wanted to check with you about this, and see if you could take a look and give us directions. Reading some support forums such as the one below it looks like this issue could potentially affect all existing customers.
Also https://github.com/woocommerce/woocommerce-gateway-stripe/issues/1092#issuecomment-571888062 by @richardsplayground
> We tried changing the Stripe public/secret keys in the plugin, and immediately started getting this error for all customers that had previously made a transaction with our old Stripe account.
**Expected behavior**
Store owners can change their Stripe.com if required using the Settings screen, our extension handles the code aspect
**Additional context**
p6q7sZ-6uE-p2#comment-19674
cc @compchris per p2 post and @richardsplayground due to your comment in #1092
@allendav I'm placing Low due to the description of that label - this only affects a small number of users but is a show stopper for those customers so you may want to reprioritize.
|
non_process
|
handle stripe com account changes describe the bug under certain circumstances store owners may need to change the stripe com account this will cause the error shown in but has a large scope of how to fix that to reproduce i chatted to a store owner on woocommerce com live chat this was their scenario we recently went through a legal change that required us to create a new stripe account we were using the stripe woo plugin fine previously after inputting the new stripe keys for our new account we tried placing a test order in test mode using my wordpress admin user but we received the following error no such customer cus xxxxxxxxx later we tried to place another test order in incognito mode and it worked but we wanted to check with you about this and see if you could take a look and give us directions reading some support forums such as the one below it looks like this issue could potentially affect all existing customers also by richardsplayground we tried changing the stripe public secret keys in the plugin and immediately started getting this error for all customers that had previously made a transaction with our old stripe account expected behavior store owners can change their stripe com if required using the settings screen our extension handles the code aspect additional context comment cc compchris per post and richardsplayground due to your comment in allendav i m placing low due to the description of that label this only affects a small number of users but is a show stopper for those customers so you may want to reprioritize
| 0
|
601,053
| 18,365,118,773
|
IssuesEvent
|
2021-10-09 23:10:54
|
helgoboss/realearn
|
https://api.github.com/repos/helgoboss/realearn
|
closed
|
Add support for LCDs
|
enhancement high priority
|
Current state:
- It's possible to program LCDs in EEL language by using the ["MIDI script" source](https://github.com/helgoboss/realearn/blob/master/doc/user-guide.adoc#script-source).
- Needs programming skills.
- You can't access any other data than the current numerical target value ... no track name, no nothing.
- It's possible *and convenient* to control the 7-segment display on MC (Mackie Control) devices in order to display the current (numerical) target value as percentage by using the the ["Mackie Control" controller preset](https://github.com/helgoboss/realearn/blob/master/doc/user-guide.adoc#91-mackie-control).
Solution approach:
- Displaying target properties which are *not* the current numerical target value could be achieved by letting targets (but also ReaLearn globally) expose key-value pairs and making them accessible in the already existing "MIDI script" source.
- "MIDI LCD" source which supports various ways of programming LCDs (builtin) and accessing the key-value pairs
- "OSC" source should support string type in same way
In general, usage should be straightforward:
- Controller mapping should expose one line on the LCD via LCD source as a named virtual control element "ch1/lcd/line1", not yet mapping any info, just defining the LCD type etc.
- Mapping selected track's name to that line (or any other globally reachable info):
- Source: Virtual control element "ch1/lcd/line1"
- Target: "Global/Project: General info (feedback only, for displays)"
- Display expression: "{selected_track_position}. {selected_track_name}"
- Mapping formatted target value to that line:
- Target: e.g. "Send: Volume"
- Display expression: "{value_in_db} dB"
- Mapping send name to that line:
- Target: "Send: Volume"
- Display expression: "{target_track_name}"
Related: #380
Losely related: #187
|
1.0
|
Add support for LCDs - Current state:
- It's possible to program LCDs in EEL language by using the ["MIDI script" source](https://github.com/helgoboss/realearn/blob/master/doc/user-guide.adoc#script-source).
- Needs programming skills.
- You can't access any other data than the current numerical target value ... no track name, no nothing.
- It's possible *and convenient* to control the 7-segment display on MC (Mackie Control) devices in order to display the current (numerical) target value as percentage by using the the ["Mackie Control" controller preset](https://github.com/helgoboss/realearn/blob/master/doc/user-guide.adoc#91-mackie-control).
Solution approach:
- Displaying target properties which are *not* the current numerical target value could be achieved by letting targets (but also ReaLearn globally) expose key-value pairs and making them accessible in the already existing "MIDI script" source.
- "MIDI LCD" source which supports various ways of programming LCDs (builtin) and accessing the key-value pairs
- "OSC" source should support string type in same way
In general, usage should be straightforward:
- Controller mapping should expose one line on the LCD via LCD source as a named virtual control element "ch1/lcd/line1", not yet mapping any info, just defining the LCD type etc.
- Mapping selected track's name to that line (or any other globally reachable info):
- Source: Virtual control element "ch1/lcd/line1"
- Target: "Global/Project: General info (feedback only, for displays)"
- Display expression: "{selected_track_position}. {selected_track_name}"
- Mapping formatted target value to that line:
- Target: e.g. "Send: Volume"
- Display expression: "{value_in_db} dB"
- Mapping send name to that line:
- Target: "Send: Volume"
- Display expression: "{target_track_name}"
Related: #380
Losely related: #187
|
non_process
|
add support for lcds current state it s possible to program lcds in eel language by using the needs programming skills you can t access any other data than the current numerical target value no track name no nothing it s possible and convenient to control the segment display on mc mackie control devices in order to display the current numerical target value as percentage by using the the solution approach displaying target properties which are not the current numerical target value could be achieved by letting targets but also realearn globally expose key value pairs and making them accessible in the already existing midi script source midi lcd source which supports various ways of programming lcds builtin and accessing the key value pairs osc source should support string type in same way in general usage should be straightforward controller mapping should expose one line on the lcd via lcd source as a named virtual control element lcd not yet mapping any info just defining the lcd type etc mapping selected track s name to that line or any other globally reachable info source virtual control element lcd target global project general info feedback only for displays display expression selected track position selected track name mapping formatted target value to that line target e g send volume display expression value in db db mapping send name to that line target send volume display expression target track name related losely related
| 0
|
38,022
| 5,163,779,116
|
IssuesEvent
|
2017-01-17 08:21:31
|
thisisodense/tio-web
|
https://api.github.com/repos/thisisodense/tio-web
|
closed
|
Fejlmeddelelse i backend
|
backend bug test okay
|
Denne fejlmeddelelse har jeg nu flere gange fået, når jeg er inde for at oprette/redigere i en location. Fx den her: http://www.thisisodense.dk/umbraco/#/content/content/edit/2402
Skærmbillede her: (fuld fejlmeddelelse nedenunder)

Der skete en fejl på severen
An error occured
No argument found for the current action with the name: id
EXCEPTION DETAILS
System.InvalidOperationException: No argument found for the current action with the name: id
STACKTRACE
at Umbraco.Web.WebApi.Filters.EnsureUserPermissionForContentAttribute.OnActionExecuting(HttpActionContext actionContext)
at System.Web.Http.Filters.ActionFilterAttribute.OnActionExecutingAsync(HttpActionContext actionContext, CancellationToken cancellationToken)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Controllers.ActionFilterResult.<ExecuteAsync>d__2.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.AuthorizationFilterAttribute.<ExecuteAuthorizationFilterAsyncCore>d__2.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.AuthorizationFilterAttribute.<ExecuteAuthorizationFilterAsyncCore>d__2.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.AuthorizationFilterAttribute.<ExecuteAuthorizationFilterAsyncCore>d__2.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Dispatcher.HttpControllerDispatcher.<SendAsync>d__1.MoveNext()
|
1.0
|
Fejlmeddelelse i backend - Denne fejlmeddelelse har jeg nu flere gange fået, når jeg er inde for at oprette/redigere i en location. Fx den her: http://www.thisisodense.dk/umbraco/#/content/content/edit/2402
Skærmbillede her: (fuld fejlmeddelelse nedenunder)

Der skete en fejl på severen
An error occured
No argument found for the current action with the name: id
EXCEPTION DETAILS
System.InvalidOperationException: No argument found for the current action with the name: id
STACKTRACE
at Umbraco.Web.WebApi.Filters.EnsureUserPermissionForContentAttribute.OnActionExecuting(HttpActionContext actionContext)
at System.Web.Http.Filters.ActionFilterAttribute.OnActionExecutingAsync(HttpActionContext actionContext, CancellationToken cancellationToken)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<ExecuteActionFilterAsyncCore>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Controllers.ActionFilterResult.<ExecuteAsync>d__2.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.AuthorizationFilterAttribute.<ExecuteAuthorizationFilterAsyncCore>d__2.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.AuthorizationFilterAttribute.<ExecuteAuthorizationFilterAsyncCore>d__2.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.AuthorizationFilterAttribute.<ExecuteAuthorizationFilterAsyncCore>d__2.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Dispatcher.HttpControllerDispatcher.<SendAsync>d__1.MoveNext()
|
non_process
|
fejlmeddelelse i backend denne fejlmeddelelse har jeg nu flere gange fået når jeg er inde for at oprette redigere i en location fx den her skærmbillede her fuld fejlmeddelelse nedenunder der skete en fejl på severen an error occured no argument found for the current action with the name id exception details system invalidoperationexception no argument found for the current action with the name id stacktrace at umbraco web webapi filters ensureuserpermissionforcontentattribute onactionexecuting httpactioncontext actioncontext at system web http filters actionfilterattribute onactionexecutingasync httpactioncontext actioncontext cancellationtoken cancellationtoken end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http filters actionfilterattribute d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http controllers actionfilterresult d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http filters authorizationfilterattribute d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http filters authorizationfilterattribute d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http filters authorizationfilterattribute d movenext end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system web http dispatcher httpcontrollerdispatcher d movenext
| 0
|
530,534
| 15,434,025,178
|
IssuesEvent
|
2021-03-07 00:53:27
|
zephyrproject-rtos/zephyr
|
https://api.github.com/repos/zephyrproject-rtos/zephyr
|
opened
|
[Coverity CID :219646] Untrusted value as argument in subsys/net/lib/coap/coap.c
|
Coverity bug priority: low
|
Static code scan issues found in file:
https://github.com/zephyrproject-rtos/zephyr/tree/bd97359a5338b2542d19011b6d6aa1d8d1b9cc3f/subsys/net/lib/coap/coap.c
Category: Insecure data handling
Function: `coap_reply_init`
Component: Networking
CID: [219646](https://scan9.coverity.com/reports.htm#v29726/p12996/mergedDefectId=219646)
Details:
https://github.com/zephyrproject-rtos/zephyr/blob/bd97359a5338b2542d19011b6d6aa1d8d1b9cc3f/subsys/net/lib/coap/coap.c#L1333
Please fix or provide comments in coverity using the link:
https://scan9.coverity.com/reports.htm#v32951/p12996.
Note: This issue was created automatically. Priority was set based on classification
of the file affected and the impact field in coverity. Assignees were set using the CODEOWNERS file.
|
1.0
|
[Coverity CID :219646] Untrusted value as argument in subsys/net/lib/coap/coap.c -
Static code scan issues found in file:
https://github.com/zephyrproject-rtos/zephyr/tree/bd97359a5338b2542d19011b6d6aa1d8d1b9cc3f/subsys/net/lib/coap/coap.c
Category: Insecure data handling
Function: `coap_reply_init`
Component: Networking
CID: [219646](https://scan9.coverity.com/reports.htm#v29726/p12996/mergedDefectId=219646)
Details:
https://github.com/zephyrproject-rtos/zephyr/blob/bd97359a5338b2542d19011b6d6aa1d8d1b9cc3f/subsys/net/lib/coap/coap.c#L1333
Please fix or provide comments in coverity using the link:
https://scan9.coverity.com/reports.htm#v32951/p12996.
Note: This issue was created automatically. Priority was set based on classification
of the file affected and the impact field in coverity. Assignees were set using the CODEOWNERS file.
|
non_process
|
untrusted value as argument in subsys net lib coap coap c static code scan issues found in file category insecure data handling function coap reply init component networking cid details please fix or provide comments in coverity using the link note this issue was created automatically priority was set based on classification of the file affected and the impact field in coverity assignees were set using the codeowners file
| 0
|
11,036
| 13,850,599,401
|
IssuesEvent
|
2020-10-15 01:40:34
|
unicode-org/icu4x
|
https://api.github.com/repos/unicode-org/icu4x
|
closed
|
Fill in MIT copyright notices before first release
|
C-process T-task
|
In the `LICENSE` file, we still have "[Copyright notices to be filled in as crates are imported]" that hasn't been filled in. For Mozilla's part, our notice is "Copyright Mozilla Foundation". I'll let Googlers figure out how what should go in the copyright notice slot on Google's part.
|
1.0
|
Fill in MIT copyright notices before first release - In the `LICENSE` file, we still have "[Copyright notices to be filled in as crates are imported]" that hasn't been filled in. For Mozilla's part, our notice is "Copyright Mozilla Foundation". I'll let Googlers figure out how what should go in the copyright notice slot on Google's part.
|
process
|
fill in mit copyright notices before first release in the license file we still have that hasn t been filled in for mozilla s part our notice is copyright mozilla foundation i ll let googlers figure out how what should go in the copyright notice slot on google s part
| 1
|
16,170
| 20,611,203,910
|
IssuesEvent
|
2022-03-07 08:49:27
|
pycaret/pycaret
|
https://api.github.com/repos/pycaret/pycaret
|
closed
|
Backward fill / Forward fill and interpolate with time for missing date values
|
enhancement time_series preprocessing
|
Hi pycaret team...
I figure out that, In pycaret modules there is no option for making Imbalanced data into balanced one in classification. as we know there are many oversampling and under sampling techniques are there to make it balance, but in majority of the cases we use SMOTE module. so that we can train the module equally for both "Yes and No" classes. which avoids inaccurate prediction on new data. parallelly, where we can increase precision and recall scores too.
I noticed that, while setup the module it can automatically fill the continues and categorical missing features by using mean for numerical and mode for categorical. but there is no option for "date type" feature. for date type we use forward fill and backward fill or we use mean value between that missing features.
Example: 5,4,nan,6(date type)
In this case we fill this missing value with 4 or 6. or we fill it by using average between 4 and 6,which is 5.
please add this option in next release. where we use it in most of the time series forecasting projects.
|
1.0
|
Backward fill / Forward fill and interpolate with time for missing date values - Hi pycaret team...
I figure out that, In pycaret modules there is no option for making Imbalanced data into balanced one in classification. as we know there are many oversampling and under sampling techniques are there to make it balance, but in majority of the cases we use SMOTE module. so that we can train the module equally for both "Yes and No" classes. which avoids inaccurate prediction on new data. parallelly, where we can increase precision and recall scores too.
I noticed that, while setup the module it can automatically fill the continues and categorical missing features by using mean for numerical and mode for categorical. but there is no option for "date type" feature. for date type we use forward fill and backward fill or we use mean value between that missing features.
Example: 5,4,nan,6(date type)
In this case we fill this missing value with 4 or 6. or we fill it by using average between 4 and 6,which is 5.
please add this option in next release. where we use it in most of the time series forecasting projects.
|
process
|
backward fill forward fill and interpolate with time for missing date values hi pycaret team i figure out that in pycaret modules there is no option for making imbalanced data into balanced one in classification as we know there are many oversampling and under sampling techniques are there to make it balance but in majority of the cases we use smote module so that we can train the module equally for both yes and no classes which avoids inaccurate prediction on new data parallelly where we can increase precision and recall scores too i noticed that while setup the module it can automatically fill the continues and categorical missing features by using mean for numerical and mode for categorical but there is no option for date type feature for date type we use forward fill and backward fill or we use mean value between that missing features example nan date type in this case we fill this missing value with or or we fill it by using average between and which is please add this option in next release where we use it in most of the time series forecasting projects
| 1
|
9,015
| 12,125,118,359
|
IssuesEvent
|
2020-04-22 15:05:53
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
GDAL rasterize - add support for "burn in" to existing rasters, 'add' parameter
|
Feature Request Processing
|
Author Name: **Rijk Zuurmond** (Rijk Zuurmond)
Original Redmine Issue: [19470](https://issues.qgis.org/issues/19470)
Redmine category:processing/gdal
Assignee: Giovanni Manghi
---
missing -add function in GDAL_rasterize
---
- [after-gdal-rasterize.jpg](https://issues.qgis.org/attachments/download/13012/after-gdal-rasterize.jpg) (Rijk Zuurmond)
- [polylines.jpg](https://issues.qgis.org/attachments/download/13013/polylines.jpg) (Rijk Zuurmond)
- [qgis2.18_menu_GDAL-rasterize.jpg](https://issues.qgis.org/attachments/download/13014/qgis2.18_menu_GDAL-rasterize.jpg) (Rijk Zuurmond)
|
1.0
|
GDAL rasterize - add support for "burn in" to existing rasters, 'add' parameter - Author Name: **Rijk Zuurmond** (Rijk Zuurmond)
Original Redmine Issue: [19470](https://issues.qgis.org/issues/19470)
Redmine category:processing/gdal
Assignee: Giovanni Manghi
---
missing -add function in GDAL_rasterize
---
- [after-gdal-rasterize.jpg](https://issues.qgis.org/attachments/download/13012/after-gdal-rasterize.jpg) (Rijk Zuurmond)
- [polylines.jpg](https://issues.qgis.org/attachments/download/13013/polylines.jpg) (Rijk Zuurmond)
- [qgis2.18_menu_GDAL-rasterize.jpg](https://issues.qgis.org/attachments/download/13014/qgis2.18_menu_GDAL-rasterize.jpg) (Rijk Zuurmond)
|
process
|
gdal rasterize add support for burn in to existing rasters add parameter author name rijk zuurmond rijk zuurmond original redmine issue redmine category processing gdal assignee giovanni manghi missing add function in gdal rasterize rijk zuurmond rijk zuurmond rijk zuurmond
| 1
|
725,167
| 24,953,301,866
|
IssuesEvent
|
2022-11-01 09:28:30
|
CS3219-AY2223S1/cs3219-project-ay2223s1-g28
|
https://api.github.com/repos/CS3219-AY2223S1/cs3219-project-ay2223s1-g28
|
closed
|
Backend (Collaboration Service): Add database to store code in editor
|
type.Enhancement priority.Medium
|
So that the code can be loaded if users refreshed their page.
|
1.0
|
Backend (Collaboration Service): Add database to store code in editor - So that the code can be loaded if users refreshed their page.
|
non_process
|
backend collaboration service add database to store code in editor so that the code can be loaded if users refreshed their page
| 0
|
13,664
| 16,386,470,898
|
IssuesEvent
|
2021-05-17 11:07:28
|
apache/arrow-rs
|
https://api.github.com/repos/apache/arrow-rs
|
closed
|
Update Arrow release process to include Rust and DataFusion commits, contributors, changes in release notes
|
development-process
|
Bringing content from https://issues.apache.org/jira/browse/ARROW-12701, so we have visibility in arrow-rs
@ianmcook suggests:
> For the 5.0.0 release, we should change the code in {{dev/releasepost-03-website.sh}} to include commits, contributors, and changes to the official {{apache/arrow-rs}} and {{apache/arrow-datafusion}} repos. This is import to ensure that the contributions to Rust, DataFusion, and Ballista are recognized in our release notes and blog posts going forward.
Which seems like a very good idea. @andygrove / @jorgecarleitao / @Dandandan do we have any thoughts / plans regarding release notes at this time?
|
1.0
|
Update Arrow release process to include Rust and DataFusion commits, contributors, changes in release notes - Bringing content from https://issues.apache.org/jira/browse/ARROW-12701, so we have visibility in arrow-rs
@ianmcook suggests:
> For the 5.0.0 release, we should change the code in {{dev/releasepost-03-website.sh}} to include commits, contributors, and changes to the official {{apache/arrow-rs}} and {{apache/arrow-datafusion}} repos. This is import to ensure that the contributions to Rust, DataFusion, and Ballista are recognized in our release notes and blog posts going forward.
Which seems like a very good idea. @andygrove / @jorgecarleitao / @Dandandan do we have any thoughts / plans regarding release notes at this time?
|
process
|
update arrow release process to include rust and datafusion commits contributors changes in release notes bringing content from so we have visibility in arrow rs ianmcook suggests for the release we should change the code in dev releasepost website sh to include commits contributors and changes to the official apache arrow rs and apache arrow datafusion repos this is import to ensure that the contributions to rust datafusion and ballista are recognized in our release notes and blog posts going forward which seems like a very good idea andygrove jorgecarleitao dandandan do we have any thoughts plans regarding release notes at this time
| 1
|
205,813
| 7,106,274,029
|
IssuesEvent
|
2018-01-16 16:08:42
|
wevote/WebApp
|
https://api.github.com/repos/wevote/WebApp
|
opened
|
"Share Ballot on Facebook" prior to sign in (visual glitch)
|
Priority: 1
|
### Please describe the issue (What happens? What do you expect?)
If you click the "Share Ballot on Facebook" button prior to signing in:

...you get a floating Facebook box without the modal behind it.

The Facebook box also doesn't seem to fully load.
### Steps to reproduce the problem (1, 2, 3...), including links
1. Make sure you are not signed in
2. Go to your Ballot
3. Click on the Email icon in the secondary navigation
4. Click on the "Share Ballot on Facebook" button
|
1.0
|
"Share Ballot on Facebook" prior to sign in (visual glitch) - ### Please describe the issue (What happens? What do you expect?)
If you click the "Share Ballot on Facebook" button prior to signing in:

...you get a floating Facebook box without the modal behind it.

The Facebook box also doesn't seem to fully load.
### Steps to reproduce the problem (1, 2, 3...), including links
1. Make sure you are not signed in
2. Go to your Ballot
3. Click on the Email icon in the secondary navigation
4. Click on the "Share Ballot on Facebook" button
|
non_process
|
share ballot on facebook prior to sign in visual glitch please describe the issue what happens what do you expect if you click the share ballot on facebook button prior to signing in you get a floating facebook box without the modal behind it the facebook box also doesn t seem to fully load steps to reproduce the problem including links make sure you are not signed in go to your ballot click on the email icon in the secondary navigation click on the share ballot on facebook button
| 0
|
20,043
| 26,529,430,477
|
IssuesEvent
|
2023-01-19 11:17:25
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
opened
|
Introspection of SQL Server views
|
process/candidate topic: schema topic: introspection topic: re-introspection tech/engines/introspection engine topic: sql server kind/subtask
|
We should fit this to the describer:
```sql
select schema_name(v.schema_id) as schema_name,
object_name(c.object_id) as view_name,
c.column_id,
c.name as column_name,
type_name(user_type_id) as data_type,
c.max_length,
c.precision
from sys.columns c
join sys.views v
on v.object_id = c.object_id
order by schema_name,
view_name,
column_id;
```
Relations should be kept in re-introspection. Those are not backed by a foreign key.
Part of: https://github.com/prisma/prisma/issues/17412
|
1.0
|
Introspection of SQL Server views - We should fit this to the describer:
```sql
select schema_name(v.schema_id) as schema_name,
object_name(c.object_id) as view_name,
c.column_id,
c.name as column_name,
type_name(user_type_id) as data_type,
c.max_length,
c.precision
from sys.columns c
join sys.views v
on v.object_id = c.object_id
order by schema_name,
view_name,
column_id;
```
Relations should be kept in re-introspection. Those are not backed by a foreign key.
Part of: https://github.com/prisma/prisma/issues/17412
|
process
|
introspection of sql server views we should fit this to the describer sql select schema name v schema id as schema name object name c object id as view name c column id c name as column name type name user type id as data type c max length c precision from sys columns c join sys views v on v object id c object id order by schema name view name column id relations should be kept in re introspection those are not backed by a foreign key part of
| 1
|
9
| 2,496,234,738
|
IssuesEvent
|
2015-01-06 18:01:17
|
vivo-isf/vivo-isf-ontology
|
https://api.github.com/repos/vivo-isf/vivo-isf-ontology
|
closed
|
vocal learning
|
biological_process imported
|
_From [vasil...@ohsu.edu](https://code.google.com/u/108803237899917466626/) on November 12, 2012 15:36:55_
from GO:
GO:0042297
Parent: multicellular organismal process
\<a href="http://purl.obolibrary.org/obo/GO_0032501" rel="nofollow">http://purl.obolibrary.org/obo/GO_0032501</a>
for record:
\<a href="http://ohsu.eagle-i.net/i/0000012f-0dd3-6911-4384-aa6580000000" rel="nofollow">http://ohsu.eagle-i.net/i/0000012f-0dd3-6911-4384-aa6580000000</a>
_Original issue: http://code.google.com/p/eagle-i/issues/detail?id=159_
|
1.0
|
vocal learning - _From [vasil...@ohsu.edu](https://code.google.com/u/108803237899917466626/) on November 12, 2012 15:36:55_
from GO:
GO:0042297
Parent: multicellular organismal process
\<a href="http://purl.obolibrary.org/obo/GO_0032501" rel="nofollow">http://purl.obolibrary.org/obo/GO_0032501</a>
for record:
\<a href="http://ohsu.eagle-i.net/i/0000012f-0dd3-6911-4384-aa6580000000" rel="nofollow">http://ohsu.eagle-i.net/i/0000012f-0dd3-6911-4384-aa6580000000</a>
_Original issue: http://code.google.com/p/eagle-i/issues/detail?id=159_
|
process
|
vocal learning from on november from go go parent multicellular organismal process for record original issue
| 1
|
59,424
| 3,110,041,582
|
IssuesEvent
|
2015-09-02 02:51:48
|
noxsicarius/Epoch-Admin-Tools
|
https://api.github.com/repos/noxsicarius/Epoch-Admin-Tools
|
closed
|
Rewrite every file
|
enhancement Priority:High
|
Most of version 1.9.3 will be this issue alone.
I plan to rewrite EVERY file for efficiency and minimal size.
I did not create most of the files in this and the ones I did were relatively early on when I was still learning the syntax for this game. Because of this the code is inefficient in many places and I have worked to fix this over time. I have decided that I must take time to specifically view every file here and rewrite it to be more efficient so there is less drain on the server and much less code. Some places make calls to the same code when a function would be much more useful.
|
1.0
|
Rewrite every file - Most of version 1.9.3 will be this issue alone.
I plan to rewrite EVERY file for efficiency and minimal size.
I did not create most of the files in this and the ones I did were relatively early on when I was still learning the syntax for this game. Because of this the code is inefficient in many places and I have worked to fix this over time. I have decided that I must take time to specifically view every file here and rewrite it to be more efficient so there is less drain on the server and much less code. Some places make calls to the same code when a function would be much more useful.
|
non_process
|
rewrite every file most of version will be this issue alone i plan to rewrite every file for efficiency and minimal size i did not create most of the files in this and the ones i did were relatively early on when i was still learning the syntax for this game because of this the code is inefficient in many places and i have worked to fix this over time i have decided that i must take time to specifically view every file here and rewrite it to be more efficient so there is less drain on the server and much less code some places make calls to the same code when a function would be much more useful
| 0
|
84,192
| 7,894,086,264
|
IssuesEvent
|
2018-06-28 20:15:42
|
apache/incubator-mxnet
|
https://api.github.com/repos/apache/incubator-mxnet
|
closed
|
Seg fault test_autograd.test_unary_func @ Python2: MKLML-CPU
|
Flaky Test
|
Command: ``tests/ci_build/ci_build.sh cpu_mklml --dockerbinary docker PYTHONPATH=./python/ nosetests-2.7 --with-timer --verbose tests/python/unittest``
Commit hash: 1dc4aea2eee5032b79a07e5bd6a4a4325d5708cb
Log:
```
[ut-python2-mklml-cpu] Running shell script
+ tests/ci_build/ci_build.sh cpu_mklml --dockerbinary docker PYTHONPATH=./python/ nosetests-2.7 --with-timer --verbose tests/python/unittest
Using custom Docker Engine: docker
WORKSPACE: /var/lib/jenkins_slave/workspace/ut-python2-mklml-cpu
CI_DOCKER_EXTRA_PARAMS:
COMMAND: PYTHONPATH=./python/ nosetests-2.7 --with-timer --verbose tests/python/unittest
CONTAINER_TYPE: cpu_mklml
BUILD_TAG: jenkins-mxnet_incubator-master-134
NODE_NAME: mxnet-linux-cpu12
DOCKER CONTAINER NAME: mx-ci.cpu_mklml
PRE_COMMAND: tests/ci_build/with_the_same_user
Building container (mx-ci.cpu_mklml)...
Sending build context to Docker daemon 77.31kB
Step 1/14 : FROM ubuntu:16.04
---> 20c44cd7596f
Step 2/14 : COPY install/ubuntu_install_core.sh /install/
---> Using cache
---> 8e51d37de291
Step 3/14 : RUN /install/ubuntu_install_core.sh
---> Using cache
---> f2f835c6a97c
Step 4/14 : COPY install/ubuntu_install_python.sh /install/
---> Using cache
---> e8e12454c7cb
Step 5/14 : RUN /install/ubuntu_install_python.sh
---> Using cache
---> 22335add2426
Step 6/14 : COPY install/ubuntu_install_scala.sh /install/
---> Using cache
---> c6579fff2515
Step 7/14 : RUN /install/ubuntu_install_scala.sh
---> Using cache
---> bbe88ee5bbb4
Step 8/14 : COPY install/ubuntu_install_r.sh /install/
---> Using cache
---> 243c5f0281ae
Step 9/14 : RUN /install/ubuntu_install_r.sh
---> Using cache
---> 503828fe0ff7
Step 10/14 : COPY install/ubuntu_install_perl.sh /install/
---> Using cache
---> 0a44448e67bf
Step 11/14 : RUN /install/ubuntu_install_perl.sh
---> Using cache
---> fb494037e7b9
Step 12/14 : RUN wget --no-check-certificate -O /tmp/mklml.tgz https://github.com/01org/mkl-dnn/releases/download/v0.11/mklml_lnx_2018.0.1.20171007.tgz
---> Using cache
---> 1c1abb1bee57
Step 13/14 : RUN tar -zxvf /tmp/mklml.tgz && cp -rf mklml_*/* /usr/local/ && rm -rf mklml_*
---> Using cache
---> 4916dc8d0255
Step 14/14 : ENV LD_LIBRARY_PATH ${LD_LIBRARY_PATH}:/usr/local/lib
---> Using cache
---> e87b6046dc08
Successfully built e87b6046dc08
Successfully tagged mx-ci.cpu_mklml:latest
Running 'PYTHONPATH=./python/ nosetests-2.7 --with-timer --verbose tests/python/unittest' inside mx-ci.cpu_mklml...
Adding group `ubuntu' (GID 1000) ...
Done.
test_attr.test_attr_basic ... ok (0.0037s)
test_attr.test_operator ... ok (0.0009s)
test_attr.test_list_attr ... ok (0.0002s)
test_attr.test_attr_dict ... ok (0.0002s)
test_autograd.test_unary_func ... tests/ci_build/with_the_same_user: line 30: 5695 Segmentation fault (core dumped) sudo -u "#${CI_BUILD_UID}" --preserve-env "LD_LIBRARY_PATH=${LD_LIBRARY_PATH}" "HOME=${CI_BUILD_HOME}" ${COMMAND[@]}
script returned exit code 139
```
|
1.0
|
Seg fault test_autograd.test_unary_func @ Python2: MKLML-CPU - Command: ``tests/ci_build/ci_build.sh cpu_mklml --dockerbinary docker PYTHONPATH=./python/ nosetests-2.7 --with-timer --verbose tests/python/unittest``
Commit hash: 1dc4aea2eee5032b79a07e5bd6a4a4325d5708cb
Log:
```
[ut-python2-mklml-cpu] Running shell script
+ tests/ci_build/ci_build.sh cpu_mklml --dockerbinary docker PYTHONPATH=./python/ nosetests-2.7 --with-timer --verbose tests/python/unittest
Using custom Docker Engine: docker
WORKSPACE: /var/lib/jenkins_slave/workspace/ut-python2-mklml-cpu
CI_DOCKER_EXTRA_PARAMS:
COMMAND: PYTHONPATH=./python/ nosetests-2.7 --with-timer --verbose tests/python/unittest
CONTAINER_TYPE: cpu_mklml
BUILD_TAG: jenkins-mxnet_incubator-master-134
NODE_NAME: mxnet-linux-cpu12
DOCKER CONTAINER NAME: mx-ci.cpu_mklml
PRE_COMMAND: tests/ci_build/with_the_same_user
Building container (mx-ci.cpu_mklml)...
Sending build context to Docker daemon 77.31kB
Step 1/14 : FROM ubuntu:16.04
---> 20c44cd7596f
Step 2/14 : COPY install/ubuntu_install_core.sh /install/
---> Using cache
---> 8e51d37de291
Step 3/14 : RUN /install/ubuntu_install_core.sh
---> Using cache
---> f2f835c6a97c
Step 4/14 : COPY install/ubuntu_install_python.sh /install/
---> Using cache
---> e8e12454c7cb
Step 5/14 : RUN /install/ubuntu_install_python.sh
---> Using cache
---> 22335add2426
Step 6/14 : COPY install/ubuntu_install_scala.sh /install/
---> Using cache
---> c6579fff2515
Step 7/14 : RUN /install/ubuntu_install_scala.sh
---> Using cache
---> bbe88ee5bbb4
Step 8/14 : COPY install/ubuntu_install_r.sh /install/
---> Using cache
---> 243c5f0281ae
Step 9/14 : RUN /install/ubuntu_install_r.sh
---> Using cache
---> 503828fe0ff7
Step 10/14 : COPY install/ubuntu_install_perl.sh /install/
---> Using cache
---> 0a44448e67bf
Step 11/14 : RUN /install/ubuntu_install_perl.sh
---> Using cache
---> fb494037e7b9
Step 12/14 : RUN wget --no-check-certificate -O /tmp/mklml.tgz https://github.com/01org/mkl-dnn/releases/download/v0.11/mklml_lnx_2018.0.1.20171007.tgz
---> Using cache
---> 1c1abb1bee57
Step 13/14 : RUN tar -zxvf /tmp/mklml.tgz && cp -rf mklml_*/* /usr/local/ && rm -rf mklml_*
---> Using cache
---> 4916dc8d0255
Step 14/14 : ENV LD_LIBRARY_PATH ${LD_LIBRARY_PATH}:/usr/local/lib
---> Using cache
---> e87b6046dc08
Successfully built e87b6046dc08
Successfully tagged mx-ci.cpu_mklml:latest
Running 'PYTHONPATH=./python/ nosetests-2.7 --with-timer --verbose tests/python/unittest' inside mx-ci.cpu_mklml...
Adding group `ubuntu' (GID 1000) ...
Done.
test_attr.test_attr_basic ... ok (0.0037s)
test_attr.test_operator ... ok (0.0009s)
test_attr.test_list_attr ... ok (0.0002s)
test_attr.test_attr_dict ... ok (0.0002s)
test_autograd.test_unary_func ... tests/ci_build/with_the_same_user: line 30: 5695 Segmentation fault (core dumped) sudo -u "#${CI_BUILD_UID}" --preserve-env "LD_LIBRARY_PATH=${LD_LIBRARY_PATH}" "HOME=${CI_BUILD_HOME}" ${COMMAND[@]}
script returned exit code 139
```
|
non_process
|
seg fault test autograd test unary func mklml cpu command tests ci build ci build sh cpu mklml dockerbinary docker pythonpath python nosetests with timer verbose tests python unittest commit hash log running shell script tests ci build ci build sh cpu mklml dockerbinary docker pythonpath python nosetests with timer verbose tests python unittest using custom docker engine docker workspace var lib jenkins slave workspace ut mklml cpu ci docker extra params command pythonpath python nosetests with timer verbose tests python unittest container type cpu mklml build tag jenkins mxnet incubator master node name mxnet linux docker container name mx ci cpu mklml pre command tests ci build with the same user building container mx ci cpu mklml sending build context to docker daemon step from ubuntu step copy install ubuntu install core sh install using cache step run install ubuntu install core sh using cache step copy install ubuntu install python sh install using cache step run install ubuntu install python sh using cache step copy install ubuntu install scala sh install using cache step run install ubuntu install scala sh using cache step copy install ubuntu install r sh install using cache step run install ubuntu install r sh using cache step copy install ubuntu install perl sh install using cache step run install ubuntu install perl sh using cache step run wget no check certificate o tmp mklml tgz using cache step run tar zxvf tmp mklml tgz cp rf mklml usr local rm rf mklml using cache step env ld library path ld library path usr local lib using cache successfully built successfully tagged mx ci cpu mklml latest running pythonpath python nosetests with timer verbose tests python unittest inside mx ci cpu mklml adding group ubuntu gid done test attr test attr basic ok test attr test operator ok test attr test list attr ok test attr test attr dict ok test autograd test unary func tests ci build with the same user line segmentation fault core dumped sudo u ci build uid preserve env ld library path ld library path home ci build home command script returned exit code
| 0
|
16,309
| 20,962,106,380
|
IssuesEvent
|
2022-03-27 23:26:07
|
RobertCraigie/prisma-client-py
|
https://api.github.com/repos/RobertCraigie/prisma-client-py
|
closed
|
Improve import error message when the client is imported before generation
|
kind/improvement process/candidate level/intermediate priority/high topic: dx
|
## Problem
<!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] -->
Currently it may not be immediately clear to users that they have to run `prisma generate` when they encounter this error:
```
>>> from prisma import Prisma
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: cannot import name 'Prisma' from 'prisma'
```
## Suggested solution
<!-- A clear and concise description of what you want to happen. -->
Python supports module level `__getattr__`, we could probably make use of this to provide an improved error message.
|
1.0
|
Improve import error message when the client is imported before generation - ## Problem
<!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] -->
Currently it may not be immediately clear to users that they have to run `prisma generate` when they encounter this error:
```
>>> from prisma import Prisma
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: cannot import name 'Prisma' from 'prisma'
```
## Suggested solution
<!-- A clear and concise description of what you want to happen. -->
Python supports module level `__getattr__`, we could probably make use of this to provide an improved error message.
|
process
|
improve import error message when the client is imported before generation problem currently it may not be immediately clear to users that they have to run prisma generate when they encounter this error from prisma import prisma traceback most recent call last file line in importerror cannot import name prisma from prisma suggested solution python supports module level getattr we could probably make use of this to provide an improved error message
| 1
|
3,643
| 6,677,431,936
|
IssuesEvent
|
2017-10-05 10:25:35
|
our-city-app/oca-backend
|
https://api.github.com/repos/our-city-app/oca-backend
|
closed
|
Appointment
|
process_duplicate type_feature
|
@ this time we have no usefull appointment system.
It would be great to generate a good appointment-system.
This could be a feature they have to pay for?
But for now nobody uses the feature because not good enough.
It is crucial for certain categories by ex nail/hair/doctors/....
|
1.0
|
Appointment - @ this time we have no usefull appointment system.
It would be great to generate a good appointment-system.
This could be a feature they have to pay for?
But for now nobody uses the feature because not good enough.
It is crucial for certain categories by ex nail/hair/doctors/....
|
process
|
appointment this time we have no usefull appointment system it would be great to generate a good appointment system this could be a feature they have to pay for but for now nobody uses the feature because not good enough it is crucial for certain categories by ex nail hair doctors
| 1
|
18,868
| 24,798,699,772
|
IssuesEvent
|
2022-10-24 19:38:19
|
OpenDataScotland/the_od_bods
|
https://api.github.com/repos/OpenDataScotland/the_od_bods
|
closed
|
Enhance info for file links
|
data processing back end
|
On this page: https://opendata.scot/datasets/Public%20Health%20Scotland-239/ the user is presented with abut 75 CSV links with no context.
On the page on the data hosts site https://www.opendata.nhs.scot/dataset/prescriptions-in-the-community the metadata of the file (month / year) is presented.
Can we pull that through in some way to make the ODS page more usable?
|
1.0
|
Enhance info for file links - On this page: https://opendata.scot/datasets/Public%20Health%20Scotland-239/ the user is presented with abut 75 CSV links with no context.
On the page on the data hosts site https://www.opendata.nhs.scot/dataset/prescriptions-in-the-community the metadata of the file (month / year) is presented.
Can we pull that through in some way to make the ODS page more usable?
|
process
|
enhance info for file links on this page the user is presented with abut csv links with no context on the page on the data hosts site the metadata of the file month year is presented can we pull that through in some way to make the ods page more usable
| 1
|
22,627
| 31,856,534,239
|
IssuesEvent
|
2023-09-15 07:53:07
|
AvaloniaUI/Avalonia
|
https://api.github.com/repos/AvaloniaUI/Avalonia
|
reopened
|
System.InvalidOperationException: Default font family name can't be null or empty
|
bug os-linux area-textprocessing
|
Version: 0.10-preview2
I upgraded project IPOCS.JMRI.CONTROL, located at https://github.com/GMJS/ipocs.jmri, to preview2 (not committed and pushed yet) and made a build for `linux-arm` using `dotnet publish IPOCS.JMRI.CONTROL -c Release -r linux-arm`. Upon trying to run this on a raspberry pi I'm getting the exception:
```
Unhandled exception. System.InvalidOperationException: Default font family name can't be null or empty.
at Avalonia.Media.FontManager..ctor(IFontManagerImpl platformImpl)
at Avalonia.Media.FontManager.get_Current()
at Avalonia.Rendering.RendererBase..ctor(Boolean useManualFpsCounting)
at Avalonia.Rendering.DeferredRenderer..ctor(IRenderRoot root, IRenderLoop renderLoop, ISceneBuilder sceneBuilder, IDispatcher dispatcher, IDeferredRendererLock rendererLock)
at Avalonia.X11.X11Window.CreateRenderer(IRenderRoot root)
at Avalonia.Controls.TopLevel..ctor(ITopLevelImpl impl, IAvaloniaDependencyResolver dependencyResolver)
at Avalonia.Controls.WindowBase..ctor(IWindowBaseImpl impl, IAvaloniaDependencyResolver dependencyResolver)
at Avalonia.Controls.WindowBase..ctor(IWindowBaseImpl impl)
at Avalonia.Controls.Window..ctor(IWindowImpl impl)
at Avalonia.Controls.Window..ctor()
at IPOCS.JMRI.CONTROL.Views.MainWindow..ctor() in <path>\ipocs.jmri\IPOCS.JMRI.CONTROL\Views\MainWindow.axaml.cs:line 9
at IPOCS.JMRI.CONTROL.App.OnFrameworkInitializationCompleted() in <path>t\ipocs.jmri\IPOCS.JMRI.CONTROL\App.axaml.cs:line 20
at Avalonia.Controls.AppBuilderBase`1.Setup()
at Avalonia.Controls.AppBuilderBase`1.SetupWithLifetime(IApplicationLifetime lifetime)
at Avalonia.ClassicDesktopStyleApplicationLifetimeExtensions.StartWithClassicDesktopLifetime[T](T builder, String[] args, ShutdownMode shutdownMode)
at IPOCS.JMRI.CONTROL.Program.Main(String[] args) in <path>\ipocs.jmri\IPOCS.JMRI.CONTROL\Program.cs:line 13
```
|
1.0
|
System.InvalidOperationException: Default font family name can't be null or empty - Version: 0.10-preview2
I upgraded project IPOCS.JMRI.CONTROL, located at https://github.com/GMJS/ipocs.jmri, to preview2 (not committed and pushed yet) and made a build for `linux-arm` using `dotnet publish IPOCS.JMRI.CONTROL -c Release -r linux-arm`. Upon trying to run this on a raspberry pi I'm getting the exception:
```
Unhandled exception. System.InvalidOperationException: Default font family name can't be null or empty.
at Avalonia.Media.FontManager..ctor(IFontManagerImpl platformImpl)
at Avalonia.Media.FontManager.get_Current()
at Avalonia.Rendering.RendererBase..ctor(Boolean useManualFpsCounting)
at Avalonia.Rendering.DeferredRenderer..ctor(IRenderRoot root, IRenderLoop renderLoop, ISceneBuilder sceneBuilder, IDispatcher dispatcher, IDeferredRendererLock rendererLock)
at Avalonia.X11.X11Window.CreateRenderer(IRenderRoot root)
at Avalonia.Controls.TopLevel..ctor(ITopLevelImpl impl, IAvaloniaDependencyResolver dependencyResolver)
at Avalonia.Controls.WindowBase..ctor(IWindowBaseImpl impl, IAvaloniaDependencyResolver dependencyResolver)
at Avalonia.Controls.WindowBase..ctor(IWindowBaseImpl impl)
at Avalonia.Controls.Window..ctor(IWindowImpl impl)
at Avalonia.Controls.Window..ctor()
at IPOCS.JMRI.CONTROL.Views.MainWindow..ctor() in <path>\ipocs.jmri\IPOCS.JMRI.CONTROL\Views\MainWindow.axaml.cs:line 9
at IPOCS.JMRI.CONTROL.App.OnFrameworkInitializationCompleted() in <path>t\ipocs.jmri\IPOCS.JMRI.CONTROL\App.axaml.cs:line 20
at Avalonia.Controls.AppBuilderBase`1.Setup()
at Avalonia.Controls.AppBuilderBase`1.SetupWithLifetime(IApplicationLifetime lifetime)
at Avalonia.ClassicDesktopStyleApplicationLifetimeExtensions.StartWithClassicDesktopLifetime[T](T builder, String[] args, ShutdownMode shutdownMode)
at IPOCS.JMRI.CONTROL.Program.Main(String[] args) in <path>\ipocs.jmri\IPOCS.JMRI.CONTROL\Program.cs:line 13
```
|
process
|
system invalidoperationexception default font family name can t be null or empty version i upgraded project ipocs jmri control located at to not committed and pushed yet and made a build for linux arm using dotnet publish ipocs jmri control c release r linux arm upon trying to run this on a raspberry pi i m getting the exception unhandled exception system invalidoperationexception default font family name can t be null or empty at avalonia media fontmanager ctor ifontmanagerimpl platformimpl at avalonia media fontmanager get current at avalonia rendering rendererbase ctor boolean usemanualfpscounting at avalonia rendering deferredrenderer ctor irenderroot root irenderloop renderloop iscenebuilder scenebuilder idispatcher dispatcher ideferredrendererlock rendererlock at avalonia createrenderer irenderroot root at avalonia controls toplevel ctor itoplevelimpl impl iavaloniadependencyresolver dependencyresolver at avalonia controls windowbase ctor iwindowbaseimpl impl iavaloniadependencyresolver dependencyresolver at avalonia controls windowbase ctor iwindowbaseimpl impl at avalonia controls window ctor iwindowimpl impl at avalonia controls window ctor at ipocs jmri control views mainwindow ctor in ipocs jmri ipocs jmri control views mainwindow axaml cs line at ipocs jmri control app onframeworkinitializationcompleted in t ipocs jmri ipocs jmri control app axaml cs line at avalonia controls appbuilderbase setup at avalonia controls appbuilderbase setupwithlifetime iapplicationlifetime lifetime at avalonia classicdesktopstyleapplicationlifetimeextensions startwithclassicdesktoplifetime t builder string args shutdownmode shutdownmode at ipocs jmri control program main string args in ipocs jmri ipocs jmri control program cs line
| 1
|
164,533
| 20,364,573,084
|
IssuesEvent
|
2022-02-21 03:03:52
|
RG4421/cbp-theme
|
https://api.github.com/repos/RG4421/cbp-theme
|
closed
|
CVE-2020-24025 (Medium) detected in node-sass-4.14.1.tgz, node-sass-4.11.0.tgz - autoclosed
|
security vulnerability
|
## CVE-2020-24025 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.14.1.tgz</b>, <b>node-sass-4.11.0.tgz</b></p></summary>
<p>
<details><summary><b>node-sass-4.14.1.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz</a></p>
<p>Path to dependency file: /ds-css/package.json</p>
<p>Path to vulnerable library: /ds-css/node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-sass-4.14.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>node-sass-4.11.0.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.11.0.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.11.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-sass-4.11.0.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Certificate validation in node-sass 2.0.0 to 4.14.1 is disabled when requesting binaries even if the user is not specifying an alternative download path.
<p>Publish Date: 2021-01-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24025>CVE-2020-24025</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2020-24025">https://nvd.nist.gov/vuln/detail/CVE-2020-24025</a></p>
<p>Release Date: 2021-01-11</p>
<p>Fix Resolution: 5.0.0</p>
</p>
</details>
<p></p>
|
True
|
CVE-2020-24025 (Medium) detected in node-sass-4.14.1.tgz, node-sass-4.11.0.tgz - autoclosed - ## CVE-2020-24025 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.14.1.tgz</b>, <b>node-sass-4.11.0.tgz</b></p></summary>
<p>
<details><summary><b>node-sass-4.14.1.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz</a></p>
<p>Path to dependency file: /ds-css/package.json</p>
<p>Path to vulnerable library: /ds-css/node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-sass-4.14.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>node-sass-4.11.0.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.11.0.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.11.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- :x: **node-sass-4.11.0.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Certificate validation in node-sass 2.0.0 to 4.14.1 is disabled when requesting binaries even if the user is not specifying an alternative download path.
<p>Publish Date: 2021-01-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-24025>CVE-2020-24025</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2020-24025">https://nvd.nist.gov/vuln/detail/CVE-2020-24025</a></p>
<p>Release Date: 2021-01-11</p>
<p>Fix Resolution: 5.0.0</p>
</p>
</details>
<p></p>
|
non_process
|
cve medium detected in node sass tgz node sass tgz autoclosed cve medium severity vulnerability vulnerable libraries node sass tgz node sass tgz node sass tgz wrapper around libsass library home page a href path to dependency file ds css package json path to vulnerable library ds css node modules node sass package json dependency hierarchy x node sass tgz vulnerable library node sass tgz wrapper around libsass library home page a href path to dependency file package json path to vulnerable library node modules node sass package json dependency hierarchy x node sass tgz vulnerable library vulnerability details certificate validation in node sass to is disabled when requesting binaries even if the user is not specifying an alternative download path publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
| 0
|
20,714
| 27,410,232,792
|
IssuesEvent
|
2023-03-01 10:01:45
|
Deltares/Ribasim.jl
|
https://api.github.com/repos/Deltares/Ribasim.jl
|
opened
|
consider Chezy or Manning-based Level connector
|
physical process
|
the current LinearLevelConnector may be hard to configure/calibrate
a Manning or Chezy based connector using a wetted area (of the upstream storage) and a roughness factor may be easier given available data sources
|
1.0
|
consider Chezy or Manning-based Level connector - the current LinearLevelConnector may be hard to configure/calibrate
a Manning or Chezy based connector using a wetted area (of the upstream storage) and a roughness factor may be easier given available data sources
|
process
|
consider chezy or manning based level connector the current linearlevelconnector may be hard to configure calibrate a manning or chezy based connector using a wetted area of the upstream storage and a roughness factor may be easier given available data sources
| 1
|
15,789
| 19,981,749,304
|
IssuesEvent
|
2022-01-30 01:46:13
|
DevExpress/testcafe-hammerhead
|
https://api.github.com/repos/DevExpress/testcafe-hammerhead
|
closed
|
iframe is recognized as visible, but not when switching
|
TYPE: bug AREA: client FREQUENCY: level 1 SYSTEM: iframe processing STATE: Stale
|
### What is your Test Scenario?
I want to write an E2E for login with [Fortmatic](https://fortmatic.com/). Fortmatic uses an `<iframe />` to inject its UI.
### What is the Current behavior?
TestCafe recognizes the `<iframe />` with `.exists`, `.ok()` and `.visible`, but not when `switchToIframe` is called.
### What is the Expected behavior?
TestCafe should always recognize the `<iframe />`.
### What is your web application and your TestCafe test code?
<details>
<summary>1. Use the `useFortmatic` hook in a React application</summary>
```js
import Fortmatic from 'fortmatic';
import { useState, useEffect, useRef } from 'react';
import Web3 from 'web3';
const usePromise = () => {
const ref = [];
const container = useRef(ref);
ref[0] = new Promise((resolve, reject) => {
ref[1] = resolve;
ref[2] = reject;
});
return container.current;
};
const useFortmatic = apiKey => {
const [accounts, setAccounts] = useState([]);
const [web3Ready, setWeb3Ready] = usePromise();
const [web3IsInitialized, setWeb3IsInitialized] = useState(false);
const signIn = async () => {
const { web3 } = await web3Ready;
// Get the current user account addresses.
// Auth if needed.
await web3.eth.getAccounts().then(setAccounts);
};
const signOut = async () => {
const { fm } = await web3Ready;
await fm.user.logout();
setAccounts([]);
};
// Fire only once after components mounts.
useEffect(() => {
const initializeWeb3 = async () => {
const fm = new Fortmatic(apiKey);
const web3 = new Web3(fm.getProvider());
// This needs to run before we update state
// if we want to enable users to stay signed
// in between page reloads.
(await fm.user.isLoggedIn()) && signIn();
setWeb3Ready({ fm, web3 });
setWeb3IsInitialized(true);
};
initializeWeb3();
// eslint-disable-next-line react-hooks/exhaustive-deps
}, []);
return {
accounts,
signOut,
signIn,
web3IsInitialized,
web3Ready,
};
};
export default useFortmatic;
```
</details>
<details>
<summary>2. Use the hook and its `signIn` method</summary>
```js
function MyPage() {
const { signIn, web3IsInitialized } = useFortmatic('<your-api-key>');
return (
<button
className="sign-in-button"
disabled={!web3IsInitialized}
onClick={signIn}
>
Sign In
</button>;
}
```
</details>
3. In your test navigate to the page with the hook and click the `signIn` button.
4. Try to switch to the `iframe`.
```js
await t.switchToIframe('.fortmatic-iframe');
```
Your website URL (or attach your complete example):
<details>
<summary>Your complete test code (or attach your test files):</summary>
<!-- Paste your test code here: -->
```js
test.only('should sign in with fortmatic', async t => {
const signInButton = await Selector('.sign-in-button');
await t.click(signInButton);
const fortmaticUI = await Selector('.fortmatic-iframe');
await t.expect(fortmaticUI).ok(); // passes
await fortmaticUI.exists; // passes
await fortmaticUI.visible; // passes
await t.switchToIframe(fortmaticUI); // fails!
await emailInput.visible;
});
```
</details>
<details>
<summary>Your complete test report:</summary>
<!-- Paste your complete result test report here (even if it is huge): -->
```
npm run test-functional
> @ test-functional /Users/janhesters/dev/my-project
> testcafe chrome src/**/*functional-test.js --app 'npm run dev' --app-init-delay 2000
Running tests in:
- Chrome 78.0.3904 / Mac OS X 10.15.1
sign in page
✖ should sign in with Fortmatic
1) The element that matches the specified selector is not visible.
Browser: Chrome 78.0.3904 / Mac OS X 10.15.1
41 | const signInButton = await
Selector('.sign-in-and-up-button');
42 | await t.click(signInButton);
43 | const fortmaticUI = await Selector('.fortmatic-iframe');
44 | await t.expect(fortmaticUI).ok();
45 | await fortmaticUI.visible;
> 46 | await t.switchToIframe(fortmaticUI);
47 |
48 | await emailInput.visible;
49 |
50 | // await t
51 | // .expect(signInButton.hasAttribute('disabled'))
at <anonymous>
(/Users/janhesters/dev/my-projecty/src/features/user-authentication/sign-in-component-functional-test.js:46:11)
at step
(/Users/janhesters/dev/my-project/node_modules/babel-runtime/helpers/asyncToGenerator.js:17:30)
at <anonymous>
(/Users/janhesters/dev/my-project/node_modules/babel-runtime/helpers/asyncToGenerator.js:28:13)
1/1 failed (18s)
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! @ test-functional: `testcafe chrome src/**/*functional-test.js --app 'npm run dev' --app-init-delay 2000`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the @ test-functional script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! /Users/janhesters/.npm/_logs/2019-12-09T09_12_07_619Z-debug.log
```
</details>
### Steps to Reproduce:
Build the test application as described above.
### Your Environment details:
* testcafe version: 6.12
* node.js version: 12.13.0
* command-line arguments: `testcafe chrome src/**/*functional-test.js --app 'npm run dev' --app-init-delay 2000`
* browser name and version: Chrome 78.0.3904
|
1.0
|
iframe is recognized as visible, but not when switching - ### What is your Test Scenario?
I want to write an E2E for login with [Fortmatic](https://fortmatic.com/). Fortmatic uses an `<iframe />` to inject its UI.
### What is the Current behavior?
TestCafe recognizes the `<iframe />` with `.exists`, `.ok()` and `.visible`, but not when `switchToIframe` is called.
### What is the Expected behavior?
TestCafe should always recognize the `<iframe />`.
### What is your web application and your TestCafe test code?
<details>
<summary>1. Use the `useFortmatic` hook in a React application</summary>
```js
import Fortmatic from 'fortmatic';
import { useState, useEffect, useRef } from 'react';
import Web3 from 'web3';
const usePromise = () => {
const ref = [];
const container = useRef(ref);
ref[0] = new Promise((resolve, reject) => {
ref[1] = resolve;
ref[2] = reject;
});
return container.current;
};
const useFortmatic = apiKey => {
const [accounts, setAccounts] = useState([]);
const [web3Ready, setWeb3Ready] = usePromise();
const [web3IsInitialized, setWeb3IsInitialized] = useState(false);
const signIn = async () => {
const { web3 } = await web3Ready;
// Get the current user account addresses.
// Auth if needed.
await web3.eth.getAccounts().then(setAccounts);
};
const signOut = async () => {
const { fm } = await web3Ready;
await fm.user.logout();
setAccounts([]);
};
// Fire only once after components mounts.
useEffect(() => {
const initializeWeb3 = async () => {
const fm = new Fortmatic(apiKey);
const web3 = new Web3(fm.getProvider());
// This needs to run before we update state
// if we want to enable users to stay signed
// in between page reloads.
(await fm.user.isLoggedIn()) && signIn();
setWeb3Ready({ fm, web3 });
setWeb3IsInitialized(true);
};
initializeWeb3();
// eslint-disable-next-line react-hooks/exhaustive-deps
}, []);
return {
accounts,
signOut,
signIn,
web3IsInitialized,
web3Ready,
};
};
export default useFortmatic;
```
</details>
<details>
<summary>2. Use the hook and its `signIn` method</summary>
```js
function MyPage() {
const { signIn, web3IsInitialized } = useFortmatic('<your-api-key>');
return (
<button
className="sign-in-button"
disabled={!web3IsInitialized}
onClick={signIn}
>
Sign In
</button>;
}
```
</details>
3. In your test navigate to the page with the hook and click the `signIn` button.
4. Try to switch to the `iframe`.
```js
await t.switchToIframe('.fortmatic-iframe');
```
Your website URL (or attach your complete example):
<details>
<summary>Your complete test code (or attach your test files):</summary>
<!-- Paste your test code here: -->
```js
test.only('should sign in with fortmatic', async t => {
const signInButton = await Selector('.sign-in-button');
await t.click(signInButton);
const fortmaticUI = await Selector('.fortmatic-iframe');
await t.expect(fortmaticUI).ok(); // passes
await fortmaticUI.exists; // passes
await fortmaticUI.visible; // passes
await t.switchToIframe(fortmaticUI); // fails!
await emailInput.visible;
});
```
</details>
<details>
<summary>Your complete test report:</summary>
<!-- Paste your complete result test report here (even if it is huge): -->
```
npm run test-functional
> @ test-functional /Users/janhesters/dev/my-project
> testcafe chrome src/**/*functional-test.js --app 'npm run dev' --app-init-delay 2000
Running tests in:
- Chrome 78.0.3904 / Mac OS X 10.15.1
sign in page
✖ should sign in with Fortmatic
1) The element that matches the specified selector is not visible.
Browser: Chrome 78.0.3904 / Mac OS X 10.15.1
41 | const signInButton = await
Selector('.sign-in-and-up-button');
42 | await t.click(signInButton);
43 | const fortmaticUI = await Selector('.fortmatic-iframe');
44 | await t.expect(fortmaticUI).ok();
45 | await fortmaticUI.visible;
> 46 | await t.switchToIframe(fortmaticUI);
47 |
48 | await emailInput.visible;
49 |
50 | // await t
51 | // .expect(signInButton.hasAttribute('disabled'))
at <anonymous>
(/Users/janhesters/dev/my-projecty/src/features/user-authentication/sign-in-component-functional-test.js:46:11)
at step
(/Users/janhesters/dev/my-project/node_modules/babel-runtime/helpers/asyncToGenerator.js:17:30)
at <anonymous>
(/Users/janhesters/dev/my-project/node_modules/babel-runtime/helpers/asyncToGenerator.js:28:13)
1/1 failed (18s)
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! @ test-functional: `testcafe chrome src/**/*functional-test.js --app 'npm run dev' --app-init-delay 2000`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the @ test-functional script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! /Users/janhesters/.npm/_logs/2019-12-09T09_12_07_619Z-debug.log
```
</details>
### Steps to Reproduce:
Build the test application as described above.
### Your Environment details:
* testcafe version: 6.12
* node.js version: 12.13.0
* command-line arguments: `testcafe chrome src/**/*functional-test.js --app 'npm run dev' --app-init-delay 2000`
* browser name and version: Chrome 78.0.3904
|
process
|
iframe is recognized as visible but not when switching what is your test scenario i want to write an for login with fortmatic uses an to inject its ui what is the current behavior testcafe recognizes the with exists ok and visible but not when switchtoiframe is called what is the expected behavior testcafe should always recognize the what is your web application and your testcafe test code use the usefortmatic hook in a react application js import fortmatic from fortmatic import usestate useeffect useref from react import from const usepromise const ref const container useref ref ref new promise resolve reject ref resolve ref reject return container current const usefortmatic apikey const usestate const usepromise const usestate false const signin async const await get the current user account addresses auth if needed await eth getaccounts then setaccounts const signout async const fm await await fm user logout setaccounts fire only once after components mounts useeffect const async const fm new fortmatic apikey const new fm getprovider this needs to run before we update state if we want to enable users to stay signed in between page reloads await fm user isloggedin signin fm true eslint disable next line react hooks exhaustive deps return accounts signout signin export default usefortmatic use the hook and its signin method js function mypage const signin usefortmatic return button classname sign in button disabled onclick signin sign in in your test navigate to the page with the hook and click the signin button try to switch to the iframe js await t switchtoiframe fortmatic iframe your website url or attach your complete example your complete test code or attach your test files js test only should sign in with fortmatic async t const signinbutton await selector sign in button await t click signinbutton const fortmaticui await selector fortmatic iframe await t expect fortmaticui ok passes await fortmaticui exists passes await fortmaticui visible passes await t switchtoiframe fortmaticui fails await emailinput visible your complete test report npm run test functional test functional users janhesters dev my project testcafe chrome src functional test js app npm run dev app init delay running tests in chrome mac os x sign in page ✖ should sign in with fortmatic the element that matches the specified selector is not visible browser chrome mac os x const signinbutton await selector sign in and up button await t click signinbutton const fortmaticui await selector fortmatic iframe await t expect fortmaticui ok await fortmaticui visible await t switchtoiframe fortmaticui await emailinput visible await t expect signinbutton hasattribute disabled at users janhesters dev my projecty src features user authentication sign in component functional test js at step users janhesters dev my project node modules babel runtime helpers asynctogenerator js at users janhesters dev my project node modules babel runtime helpers asynctogenerator js failed npm err code elifecycle npm err errno npm err test functional testcafe chrome src functional test js app npm run dev app init delay npm err exit status npm err npm err failed at the test functional script npm err this is probably not a problem with npm there is likely additional logging output above npm err a complete log of this run can be found in npm err users janhesters npm logs debug log steps to reproduce build the test application as described above your environment details testcafe version node js version command line arguments testcafe chrome src functional test js app npm run dev app init delay browser name and version chrome
| 1
|
380,100
| 26,401,877,020
|
IssuesEvent
|
2023-01-13 02:24:04
|
tastekim/GLB-to-USDZ-convert-API
|
https://api.github.com/repos/tastekim/GLB-to-USDZ-convert-API
|
opened
|
[Ref] Reference model
|
documentation
|
# 목표
GLB 파일을 USDZ 파일로 변환해서 Cloud Storage 에 상품 단위로 함께 저장하는 API
```
// 예시
Product : {
GLB: 'testfile.glb',
USDZ: 'testfile.usdz',
}
```
### 구현 스택
* Koa.js - Typescript
* Firebase - Cloud Storage
## Reference
[GLB TO USDZ](https://github.com/JesungKoo/glb2usdz/blob/master/app.js) - Express.js, threejs, ejs, shellscript
|
1.0
|
[Ref] Reference model - # 목표
GLB 파일을 USDZ 파일로 변환해서 Cloud Storage 에 상품 단위로 함께 저장하는 API
```
// 예시
Product : {
GLB: 'testfile.glb',
USDZ: 'testfile.usdz',
}
```
### 구현 스택
* Koa.js - Typescript
* Firebase - Cloud Storage
## Reference
[GLB TO USDZ](https://github.com/JesungKoo/glb2usdz/blob/master/app.js) - Express.js, threejs, ejs, shellscript
|
non_process
|
reference model 목표 glb 파일을 usdz 파일로 변환해서 cloud storage 에 상품 단위로 함께 저장하는 api 예시 product glb testfile glb usdz testfile usdz 구현 스택 koa js typescript firebase cloud storage reference express js threejs ejs shellscript
| 0
|
18,187
| 12,828,124,557
|
IssuesEvent
|
2020-07-06 19:53:52
|
dotnet/runtime
|
https://api.github.com/repos/dotnet/runtime
|
closed
|
Git clone from azure dev ops can take >30 minutes
|
area-Infrastructure-coreclr
|
This issue is not specific to coreclr, roslyn has been noticing it as well.
|
1.0
|
Git clone from azure dev ops can take >30 minutes - This issue is not specific to coreclr, roslyn has been noticing it as well.
|
non_process
|
git clone from azure dev ops can take minutes this issue is not specific to coreclr roslyn has been noticing it as well
| 0
|
72,491
| 15,238,064,853
|
IssuesEvent
|
2021-02-19 01:01:40
|
idonthaveafifaaddiction/yugabyte-db
|
https://api.github.com/repos/idonthaveafifaaddiction/yugabyte-db
|
opened
|
CVE-2021-23339 (Medium) detected in akka-http-core_2.11-10.0.15.jar
|
security vulnerability
|
## CVE-2021-23339 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>akka-http-core_2.11-10.0.15.jar</b></p></summary>
<p>akka-http-core</p>
<p>Library home page: <a href="https://akka.io">https://akka.io</a></p>
<p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.typesafe.akka/akka-http-core_2.11/jars/akka-http-core_2.11-10.0.15.jar</p>
<p>
Dependency Hierarchy:
- play-akka-http-server_2.11-2.6.25.jar (Root Library)
- :x: **akka-http-core_2.11-10.0.15.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects all versions of package com.typesafe.akka:akka-http-core. It allows multiple Transfer-Encoding headers.
<p>Publish Date: 2021-02-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23339>CVE-2021-23339</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.typesafe.akka","packageName":"akka-http-core_2.11","packageVersion":"10.0.15","packageFilePaths":[],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-akka-http-server_2.11:2.6.25;com.typesafe.akka:akka-http-core_2.11:10.0.15","isMinimumFixVersionAvailable":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-23339","vulnerabilityDetails":"This affects all versions of package com.typesafe.akka:akka-http-core. It allows multiple Transfer-Encoding headers.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23339","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2021-23339 (Medium) detected in akka-http-core_2.11-10.0.15.jar - ## CVE-2021-23339 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>akka-http-core_2.11-10.0.15.jar</b></p></summary>
<p>akka-http-core</p>
<p>Library home page: <a href="https://akka.io">https://akka.io</a></p>
<p>Path to vulnerable library: /home/wss-scanner/.ivy2/cache/com.typesafe.akka/akka-http-core_2.11/jars/akka-http-core_2.11-10.0.15.jar</p>
<p>
Dependency Hierarchy:
- play-akka-http-server_2.11-2.6.25.jar (Root Library)
- :x: **akka-http-core_2.11-10.0.15.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects all versions of package com.typesafe.akka:akka-http-core. It allows multiple Transfer-Encoding headers.
<p>Publish Date: 2021-02-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23339>CVE-2021-23339</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.typesafe.akka","packageName":"akka-http-core_2.11","packageVersion":"10.0.15","packageFilePaths":[],"isTransitiveDependency":true,"dependencyTree":"com.typesafe.play:play-akka-http-server_2.11:2.6.25;com.typesafe.akka:akka-http-core_2.11:10.0.15","isMinimumFixVersionAvailable":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-23339","vulnerabilityDetails":"This affects all versions of package com.typesafe.akka:akka-http-core. It allows multiple Transfer-Encoding headers.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23339","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in akka http core jar cve medium severity vulnerability vulnerable library akka http core jar akka http core library home page a href path to vulnerable library home wss scanner cache com typesafe akka akka http core jars akka http core jar dependency hierarchy play akka http server jar root library x akka http core jar vulnerable library found in base branch master vulnerability details this affects all versions of package com typesafe akka akka http core it allows multiple transfer encoding headers publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree com typesafe play play akka http server com typesafe akka akka http core isminimumfixversionavailable false basebranches vulnerabilityidentifier cve vulnerabilitydetails this affects all versions of package com typesafe akka akka http core it allows multiple transfer encoding headers vulnerabilityurl
| 0
|
326,345
| 9,955,340,414
|
IssuesEvent
|
2019-07-05 10:45:14
|
mozilla/addons-frontend
|
https://api.github.com/repos/mozilla/addons-frontend
|
closed
|
Notice for add-on dependencies isn't shown
|
priority: p3 project: amo
|
### Describe the problem and steps to reproduce it:
Go to the page of an extension that has listed another extension as a dependency. E.g. [[STG plugin] Load custom group](https://addons.mozilla.org/en-US/firefox/addon/stg-plugin-load-custom-group/)
### What happened?
There was no dependency notice.
<img width="779" alt="screenshot 2018-02-08 00 22 48" src="https://user-images.githubusercontent.com/90871/35948782-3bbcd332-0c66-11e8-9abf-460bdff53cf8.png">
### What did you expect to happen?
Expected to see something like the following:

|
1.0
|
Notice for add-on dependencies isn't shown - ### Describe the problem and steps to reproduce it:
Go to the page of an extension that has listed another extension as a dependency. E.g. [[STG plugin] Load custom group](https://addons.mozilla.org/en-US/firefox/addon/stg-plugin-load-custom-group/)
### What happened?
There was no dependency notice.
<img width="779" alt="screenshot 2018-02-08 00 22 48" src="https://user-images.githubusercontent.com/90871/35948782-3bbcd332-0c66-11e8-9abf-460bdff53cf8.png">
### What did you expect to happen?
Expected to see something like the following:

|
non_process
|
notice for add on dependencies isn t shown describe the problem and steps to reproduce it go to the page of an extension that has listed another extension as a dependency e g load custom group what happened there was no dependency notice img width alt screenshot src what did you expect to happen expected to see something like the following
| 0
|
4,597
| 7,438,771,835
|
IssuesEvent
|
2018-03-27 02:25:57
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
closed
|
Test failure: System.Diagnostics.Tests.ProcessTests/TestProcessStartTime
|
area-System.Diagnostics.Process test bug
|
Opened on behalf of @danmosemsft
The test `System.Diagnostics.Tests.ProcessTests/TestProcessStartTime` has failed.
Assert.InRange() Failure\r
Range: (3/26/18 12:54:34 PM - 3/26/18 12:54:36 PM)\r
Actual: 3/26/18 12:54:34 PM
Stack Trace:
at System.Diagnostics.Tests.ProcessTests.TestProcessStartTime() in /root/corefx-1520178/src/System.Diagnostics.Process/tests/ProcessTests.cs:line 716
Build : Master - 20180326.03 (Core Tests)
Failing configurations:
- RedHat.73.Amd64-x64
- Release
This used to be 3 seconds, Could you please bump it up again. It looks like we were <500msec too low.
```c#
[Fact]
public void TestProcessStartTime()
{
TimeSpan allowedWindow = TimeSpan.FromSeconds(1);
```
|
1.0
|
Test failure: System.Diagnostics.Tests.ProcessTests/TestProcessStartTime - Opened on behalf of @danmosemsft
The test `System.Diagnostics.Tests.ProcessTests/TestProcessStartTime` has failed.
Assert.InRange() Failure\r
Range: (3/26/18 12:54:34 PM - 3/26/18 12:54:36 PM)\r
Actual: 3/26/18 12:54:34 PM
Stack Trace:
at System.Diagnostics.Tests.ProcessTests.TestProcessStartTime() in /root/corefx-1520178/src/System.Diagnostics.Process/tests/ProcessTests.cs:line 716
Build : Master - 20180326.03 (Core Tests)
Failing configurations:
- RedHat.73.Amd64-x64
- Release
This used to be 3 seconds, Could you please bump it up again. It looks like we were <500msec too low.
```c#
[Fact]
public void TestProcessStartTime()
{
TimeSpan allowedWindow = TimeSpan.FromSeconds(1);
```
|
process
|
test failure system diagnostics tests processtests testprocessstarttime opened on behalf of danmosemsft the test system diagnostics tests processtests testprocessstarttime has failed assert inrange failure r range pm pm r actual pm stack trace at system diagnostics tests processtests testprocessstarttime in root corefx src system diagnostics process tests processtests cs line build master core tests failing configurations redhat release this used to be seconds could you please bump it up again it looks like we were too low c public void testprocessstarttime timespan allowedwindow timespan fromseconds
| 1
|
294,943
| 22,171,434,050
|
IssuesEvent
|
2022-06-06 01:26:28
|
ToolJet/ToolJet
|
https://api.github.com/repos/ToolJet/ToolJet
|
closed
|
[docs]: Broken link to Setup under Getting Started on Introduction page of docs
|
documentation
|
### Summary
### Summary
The link to setup ToolJet locally using docker on the Introduction page of the docs is broken. Clicking on the link attempts to navigate to https://docs.tooljet.com/docs/setup/architecture which yields page not found.

I propose to simply amend the link in the intro.md file under ToolJet/docs/docs to https://docs.tooljet.com/docs/setup/docker-local/ instead of https://docs.tooljet.com/docs/setup/architecture
### Issue Type
Documentation bug
### The entire URL of the documentation with the issue
https://docs.tooljet.com/docs/#getting-started
### Steps to reproduce the issue
1. Go to https://docs.tooljet.com/docs/#getting-started
2. Click on the link labelled Setup
3. Expected: Docker Setup Page
4. Actual: Page Not Found
### Additional Information
The proposed change will make the documentation more fluid and will reduce confusion for those consulting the docs.
### If the issue is confirmed, would you be willing to submit a pull request?
Yes
### Code of Conduct
- [X] I agree to follow the ToolJet Code of Conduct
|
1.0
|
[docs]: Broken link to Setup under Getting Started on Introduction page of docs - ### Summary
### Summary
The link to setup ToolJet locally using docker on the Introduction page of the docs is broken. Clicking on the link attempts to navigate to https://docs.tooljet.com/docs/setup/architecture which yields page not found.

I propose to simply amend the link in the intro.md file under ToolJet/docs/docs to https://docs.tooljet.com/docs/setup/docker-local/ instead of https://docs.tooljet.com/docs/setup/architecture
### Issue Type
Documentation bug
### The entire URL of the documentation with the issue
https://docs.tooljet.com/docs/#getting-started
### Steps to reproduce the issue
1. Go to https://docs.tooljet.com/docs/#getting-started
2. Click on the link labelled Setup
3. Expected: Docker Setup Page
4. Actual: Page Not Found
### Additional Information
The proposed change will make the documentation more fluid and will reduce confusion for those consulting the docs.
### If the issue is confirmed, would you be willing to submit a pull request?
Yes
### Code of Conduct
- [X] I agree to follow the ToolJet Code of Conduct
|
non_process
|
broken link to setup under getting started on introduction page of docs summary summary the link to setup tooljet locally using docker on the introduction page of the docs is broken clicking on the link attempts to navigate to which yields page not found i propose to simply amend the link in the intro md file under tooljet docs docs to instead of issue type documentation bug the entire url of the documentation with the issue steps to reproduce the issue go to click on the link labelled setup expected docker setup page actual page not found additional information the proposed change will make the documentation more fluid and will reduce confusion for those consulting the docs if the issue is confirmed would you be willing to submit a pull request yes code of conduct i agree to follow the tooljet code of conduct
| 0
|
768,195
| 26,957,756,314
|
IssuesEvent
|
2023-02-08 15:59:40
|
valantic/vue-template
|
https://api.github.com/repos/valantic/vue-template
|
closed
|
Vue3: modal component
|
enhancement high priority vue-3 components
|
We need a new modal component for Vue3. Maybe we can use the component that was created for ADM.
|
1.0
|
Vue3: modal component - We need a new modal component for Vue3. Maybe we can use the component that was created for ADM.
|
non_process
|
modal component we need a new modal component for maybe we can use the component that was created for adm
| 0
|
85,783
| 10,681,943,803
|
IssuesEvent
|
2019-10-22 03:04:56
|
pax-app/Wiki
|
https://api.github.com/repos/pax-app/Wiki
|
closed
|
Padrões de Projeto - Chat
|
Design Patterns Sprint 08 improvement wontfix
|
# Descrição da Issue
Aplicação de padrões de projeto no serviço de chat.
## Tasks:
- [ ] Estudar os patterns
- [ ] Separar quais são aplicáveis para o serviço
- [ ] Implementá-los
# Critérios de Aceitação
* Após a aplicação em código, a técnica e o trecho de código devem ser colocados na Wiki
<!---Não esquecer de fazer as devidas referências no menu lateral direito--->
|
1.0
|
Padrões de Projeto - Chat - # Descrição da Issue
Aplicação de padrões de projeto no serviço de chat.
## Tasks:
- [ ] Estudar os patterns
- [ ] Separar quais são aplicáveis para o serviço
- [ ] Implementá-los
# Critérios de Aceitação
* Após a aplicação em código, a técnica e o trecho de código devem ser colocados na Wiki
<!---Não esquecer de fazer as devidas referências no menu lateral direito--->
|
non_process
|
padrões de projeto chat descrição da issue aplicação de padrões de projeto no serviço de chat tasks estudar os patterns separar quais são aplicáveis para o serviço implementá los critérios de aceitação após a aplicação em código a técnica e o trecho de código devem ser colocados na wiki
| 0
|
327,302
| 28,052,068,713
|
IssuesEvent
|
2023-03-29 06:45:55
|
ALTA-LapakUMKM-Group-2/LapakUMKM-APITesting
|
https://api.github.com/repos/ALTA-LapakUMKM-Group-2/LapakUMKM-APITesting
|
closed
|
[User-A025] POST Update Photo Profile With Invalid Parameter
|
Manual Api Testing
|
**Given** Post update photo profile with invalid parameter
**When** Send post update data
**Then** API should be 404 Not Found
**And** Validate post create new data resource json schema
|
1.0
|
[User-A025] POST Update Photo Profile With Invalid Parameter - **Given** Post update photo profile with invalid parameter
**When** Send post update data
**Then** API should be 404 Not Found
**And** Validate post create new data resource json schema
|
non_process
|
post update photo profile with invalid parameter given post update photo profile with invalid parameter when send post update data then api should be not found and validate post create new data resource json schema
| 0
|
18,177
| 24,225,219,925
|
IssuesEvent
|
2022-09-26 13:58:03
|
akrherz/iem
|
https://api.github.com/repos/akrherz/iem
|
closed
|
Backfill PIREP archive
|
Data Processing
|
An archive of processed PIREPs is available on the [IEM website](https://mesonet.agron.iastate.edu/request/gis/pireps.php), but it only goes back to Jan 2015. I believe I have all (maybe not Canada) of the raw text archived in my AFOS database, so we should attempt to backfill the IEM archive.
|
1.0
|
Backfill PIREP archive - An archive of processed PIREPs is available on the [IEM website](https://mesonet.agron.iastate.edu/request/gis/pireps.php), but it only goes back to Jan 2015. I believe I have all (maybe not Canada) of the raw text archived in my AFOS database, so we should attempt to backfill the IEM archive.
|
process
|
backfill pirep archive an archive of processed pireps is available on the but it only goes back to jan i believe i have all maybe not canada of the raw text archived in my afos database so we should attempt to backfill the iem archive
| 1
|
22,729
| 32,046,625,202
|
IssuesEvent
|
2023-09-23 04:09:15
|
h4sh5/npm-auto-scanner
|
https://api.github.com/repos/h4sh5/npm-auto-scanner
|
opened
|
@salesforce/plugin-telemetry 2.3.4 has 2 guarddog issues
|
npm-install-script npm-silent-process-execution
|
```{"npm-install-script":[{"code":" \"prepare\": \"sf-install\",","location":"package/package.json:91","message":"The package.json has a script automatically running when the package is installed"}],"npm-silent-process-execution":[{"code":" (0, child_process_1.spawn)(nodePath, [processPath, Telemetry.cacheDir, this.getTelemetryFilePath()], {\n detached: true,\n stdio: 'ignore',\n }).unref();","location":"package/lib/telemetry.js:245","message":"This package is silently executing another executable"}]}```
|
1.0
|
@salesforce/plugin-telemetry 2.3.4 has 2 guarddog issues - ```{"npm-install-script":[{"code":" \"prepare\": \"sf-install\",","location":"package/package.json:91","message":"The package.json has a script automatically running when the package is installed"}],"npm-silent-process-execution":[{"code":" (0, child_process_1.spawn)(nodePath, [processPath, Telemetry.cacheDir, this.getTelemetryFilePath()], {\n detached: true,\n stdio: 'ignore',\n }).unref();","location":"package/lib/telemetry.js:245","message":"This package is silently executing another executable"}]}```
|
process
|
salesforce plugin telemetry has guarddog issues npm install script npm silent process execution n detached true n stdio ignore n unref location package lib telemetry js message this package is silently executing another executable
| 1
|
8,293
| 11,458,278,402
|
IssuesEvent
|
2020-02-07 02:46:06
|
googleapis/google-cloud-python
|
https://api.github.com/repos/googleapis/google-cloud-python
|
closed
|
Testing: 'get_target_packages_kokoro.py' script recieving bogus output.
|
flaky testing type: process
|
From [this Kokoro job today](https://source.cloud.google.com/results/invocations/ef22b02c-8631-4f6d-bd99-33de0d84984c/targets/cloud-devrel%2Fclient-libraries%2Fgoogle-cloud-python%2Fpresubmit%2Fwebsecurityscanner/log):
```python
Traceback (most recent call last):
File "test_utils/scripts/get_target_packages_kokoro.py", line 98, in <module>
main()
File "test_utils/scripts/get_target_packages_kokoro.py", line 85, in main
changed_files = list(get_changed_files_from_pr(environment.pr))
File "test_utils/scripts/get_target_packages_kokoro.py", line 60, in get_changed_files_from_pr
yield info['filename']
TypeError: string indices must be integers
```
|
1.0
|
Testing: 'get_target_packages_kokoro.py' script recieving bogus output. - From [this Kokoro job today](https://source.cloud.google.com/results/invocations/ef22b02c-8631-4f6d-bd99-33de0d84984c/targets/cloud-devrel%2Fclient-libraries%2Fgoogle-cloud-python%2Fpresubmit%2Fwebsecurityscanner/log):
```python
Traceback (most recent call last):
File "test_utils/scripts/get_target_packages_kokoro.py", line 98, in <module>
main()
File "test_utils/scripts/get_target_packages_kokoro.py", line 85, in main
changed_files = list(get_changed_files_from_pr(environment.pr))
File "test_utils/scripts/get_target_packages_kokoro.py", line 60, in get_changed_files_from_pr
yield info['filename']
TypeError: string indices must be integers
```
|
process
|
testing get target packages kokoro py script recieving bogus output from python traceback most recent call last file test utils scripts get target packages kokoro py line in main file test utils scripts get target packages kokoro py line in main changed files list get changed files from pr environment pr file test utils scripts get target packages kokoro py line in get changed files from pr yield info typeerror string indices must be integers
| 1
|
1,154
| 2,532,536,087
|
IssuesEvent
|
2015-01-23 16:43:42
|
autotest/autotest-docker
|
https://api.github.com/repos/autotest/autotest-docker
|
opened
|
New test: Build from a Dockerfile specified to -f parameter
|
New Feature Docker Test On Milestone
|
These new features can all be included:
* Dockerfile to use for a given `docker build` can be specified with the `-f` flag
* Dockerfile and .dockerignore files can be themselves excluded as part of the .dockerignore file, thus preventing modifications to these files invalidating ADD or COPY instructions cache
* Dockerfile `FROM scratch` instruction should not pull 'scratch' image.
|
1.0
|
New test: Build from a Dockerfile specified to -f parameter - These new features can all be included:
* Dockerfile to use for a given `docker build` can be specified with the `-f` flag
* Dockerfile and .dockerignore files can be themselves excluded as part of the .dockerignore file, thus preventing modifications to these files invalidating ADD or COPY instructions cache
* Dockerfile `FROM scratch` instruction should not pull 'scratch' image.
|
non_process
|
new test build from a dockerfile specified to f parameter these new features can all be included dockerfile to use for a given docker build can be specified with the f flag dockerfile and dockerignore files can be themselves excluded as part of the dockerignore file thus preventing modifications to these files invalidating add or copy instructions cache dockerfile from scratch instruction should not pull scratch image
| 0
|
18,722
| 24,611,436,268
|
IssuesEvent
|
2022-10-14 22:03:49
|
GoogleCloudPlatform/cloud-ops-sandbox
|
https://api.github.com/repos/GoogleCloudPlatform/cloud-ops-sandbox
|
closed
|
Create end-to-end workflow to validate Terraform configuration
|
priority: p1 type: process
|
### Create an end-to-end (e2e) workflow configuration that will provision a o11y artifactors for a dummy application.
The workflow should be triggered in each pull request to release branches (`main` and `release/[0-9]+.[0-9]+`). It should be possible to trigger it manually on demand, so it can be used in development branches.
A dummy application configuration files should be stored under `tests/test-app` folder and include
* `tests/test-app/alerts.json`
* `tests/test-app/dashboards.json`
* `tests/test-app/services.json`
* `tests/test-app/uptime_checks.json`
The e2e testing should validate that the current Terraform configuration can provision the test application's artifacts without an error. The application's configuration files should cover all supported customizations of the o11y artifacts including dashboards, charts, log-based metrics, service SLOs, alerts and uptime checks.
|
1.0
|
Create end-to-end workflow to validate Terraform configuration - ### Create an end-to-end (e2e) workflow configuration that will provision a o11y artifactors for a dummy application.
The workflow should be triggered in each pull request to release branches (`main` and `release/[0-9]+.[0-9]+`). It should be possible to trigger it manually on demand, so it can be used in development branches.
A dummy application configuration files should be stored under `tests/test-app` folder and include
* `tests/test-app/alerts.json`
* `tests/test-app/dashboards.json`
* `tests/test-app/services.json`
* `tests/test-app/uptime_checks.json`
The e2e testing should validate that the current Terraform configuration can provision the test application's artifacts without an error. The application's configuration files should cover all supported customizations of the o11y artifacts including dashboards, charts, log-based metrics, service SLOs, alerts and uptime checks.
|
process
|
create end to end workflow to validate terraform configuration create an end to end workflow configuration that will provision a artifactors for a dummy application the workflow should be triggered in each pull request to release branches main and release it should be possible to trigger it manually on demand so it can be used in development branches a dummy application configuration files should be stored under tests test app folder and include tests test app alerts json tests test app dashboards json tests test app services json tests test app uptime checks json the testing should validate that the current terraform configuration can provision the test application s artifacts without an error the application s configuration files should cover all supported customizations of the artifacts including dashboards charts log based metrics service slos alerts and uptime checks
| 1
|
7,901
| 11,089,087,605
|
IssuesEvent
|
2019-12-14 16:01:01
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
Check for missing @href for undefined keys
|
enhancement preprocess/keyref stale
|
We sometimes link to images by key. There's an error of type `info` that checks, whether the key is undefined:
```
DOTJ047I Unable to find key definition for key reference "%1"
in root scope. The href attribute may be used as fallback if it exists
```
The problem is here, if the key is undefined and no `href` is set, the image is missing. That's worse than just an `info`.
I'd suggest to refactor this as follows:
* Remove **[DOTJ047I](www.dita-ot.org/dev/user-guide/DITA-messages.html#msgs__DOTJ047I)** from the DITA-OT, keep it in the docs for reference
* Split the error in the following 2 new errors:
```
[INFO] Unable to find key definition for key reference "%1"
in root scope. A href attribute is found and will be used as fallback
[ERROR] Unable to find key definition for key reference "%1"
in root scope. No href attribute is found as fallback
```
- - - -
**[keyref] [DOTJ046E](www.dita-ot.org/dev/user-guide/DITA-messages.html#msgs__DOTJ046E)** is of type `[ERROR]`. For us, a broken key reference without a fallback `@href` is as bad as a broken content key reference. So I'd recommend to use `[ERROR]` as the error level for the suggested error type.
```
[echo] [keyref] [DOTJ046E][ERROR] Conkeyref="TOPIC/ID" can not be resolved
because it does not contain a key or the key is not defined. The build will use
the conref attribute for fallback, if one exists.
```
|
1.0
|
Check for missing @href for undefined keys - We sometimes link to images by key. There's an error of type `info` that checks, whether the key is undefined:
```
DOTJ047I Unable to find key definition for key reference "%1"
in root scope. The href attribute may be used as fallback if it exists
```
The problem is here, if the key is undefined and no `href` is set, the image is missing. That's worse than just an `info`.
I'd suggest to refactor this as follows:
* Remove **[DOTJ047I](www.dita-ot.org/dev/user-guide/DITA-messages.html#msgs__DOTJ047I)** from the DITA-OT, keep it in the docs for reference
* Split the error in the following 2 new errors:
```
[INFO] Unable to find key definition for key reference "%1"
in root scope. A href attribute is found and will be used as fallback
[ERROR] Unable to find key definition for key reference "%1"
in root scope. No href attribute is found as fallback
```
- - - -
**[keyref] [DOTJ046E](www.dita-ot.org/dev/user-guide/DITA-messages.html#msgs__DOTJ046E)** is of type `[ERROR]`. For us, a broken key reference without a fallback `@href` is as bad as a broken content key reference. So I'd recommend to use `[ERROR]` as the error level for the suggested error type.
```
[echo] [keyref] [DOTJ046E][ERROR] Conkeyref="TOPIC/ID" can not be resolved
because it does not contain a key or the key is not defined. The build will use
the conref attribute for fallback, if one exists.
```
|
process
|
check for missing href for undefined keys we sometimes link to images by key there s an error of type info that checks whether the key is undefined unable to find key definition for key reference in root scope the href attribute may be used as fallback if it exists the problem is here if the key is undefined and no href is set the image is missing that s worse than just an info i d suggest to refactor this as follows remove from the dita ot keep it in the docs for reference split the error in the following new errors unable to find key definition for key reference in root scope a href attribute is found and will be used as fallback unable to find key definition for key reference in root scope no href attribute is found as fallback is of type for us a broken key reference without a fallback href is as bad as a broken content key reference so i d recommend to use as the error level for the suggested error type conkeyref topic id can not be resolved because it does not contain a key or the key is not defined the build will use the conref attribute for fallback if one exists
| 1
|
7,382
| 10,515,284,915
|
IssuesEvent
|
2019-09-28 08:30:05
|
sysown/proxysql
|
https://api.github.com/repos/sysown/proxysql
|
closed
|
Extend stats_mysql_query_digest with more statistics
|
ADMIN CONNECTION POOL PROTOCOL QUERY PROCESSOR STATISTICS
|
## Why
stats_mysql_query_digest is a great source of information on what it is running inside ProxySQL . More statistics are welcome
## What
- [x] add columns to identify backend: hostgroup_id , address, port
- [ ] add columns to identify the amount of data sent to backend: min_data_sent , max_data_sent , sum_data_sent
- [ ] add columns to identify the amount of data received from backend: min_data_recv , max_data_recv , sum_data_recv
|
1.0
|
Extend stats_mysql_query_digest with more statistics - ## Why
stats_mysql_query_digest is a great source of information on what it is running inside ProxySQL . More statistics are welcome
## What
- [x] add columns to identify backend: hostgroup_id , address, port
- [ ] add columns to identify the amount of data sent to backend: min_data_sent , max_data_sent , sum_data_sent
- [ ] add columns to identify the amount of data received from backend: min_data_recv , max_data_recv , sum_data_recv
|
process
|
extend stats mysql query digest with more statistics why stats mysql query digest is a great source of information on what it is running inside proxysql more statistics are welcome what add columns to identify backend hostgroup id address port add columns to identify the amount of data sent to backend min data sent max data sent sum data sent add columns to identify the amount of data received from backend min data recv max data recv sum data recv
| 1
|
18,798
| 24,699,771,739
|
IssuesEvent
|
2022-10-19 14:32:29
|
python/cpython
|
https://api.github.com/repos/python/cpython
|
closed
|
Multiprocessing returns Array, RawArray of wrong length python 3.10.4
|
type-bug expert-multiprocessing
|
# Bug report
When initializing Array or RawArray of type int, wrong length of the arrays is returned
Reproducible example:
```
from multiprocessing import RawArray, RawValue, Array, Value
a = Array('d',12)
X = np.frombuffer(a.get_obj())
print(X) # [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
a = Array('i',12)
X = np.frombuffer(a.get_obj())
print(X) # [0. 0. 0. 0. 0. 0.]
a = RawArray('d',12)
X = np.frombuffer(a)
print(X) # [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
a = RawArray('i',12)
X = np.frombuffer(a)
print(X) # [0. 0. 0. 0. 0. 0.]
```
# Your environment
working with vscode
conda-forge python 3.10.4
using ubuntu uname-a: 5.11.0-46-generic #51~20.04.1-Ubuntu SMP Fri Jan 7 06:51:40 UTC 2022
- CPython versions tested on: conda-forge python 3.10.4
- Operating system and architecture: using ubuntu uname-a: 5.11.0-46-generic 51~20.04.1-Ubuntu
|
1.0
|
Multiprocessing returns Array, RawArray of wrong length python 3.10.4 - # Bug report
When initializing Array or RawArray of type int, wrong length of the arrays is returned
Reproducible example:
```
from multiprocessing import RawArray, RawValue, Array, Value
a = Array('d',12)
X = np.frombuffer(a.get_obj())
print(X) # [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
a = Array('i',12)
X = np.frombuffer(a.get_obj())
print(X) # [0. 0. 0. 0. 0. 0.]
a = RawArray('d',12)
X = np.frombuffer(a)
print(X) # [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
a = RawArray('i',12)
X = np.frombuffer(a)
print(X) # [0. 0. 0. 0. 0. 0.]
```
# Your environment
working with vscode
conda-forge python 3.10.4
using ubuntu uname-a: 5.11.0-46-generic #51~20.04.1-Ubuntu SMP Fri Jan 7 06:51:40 UTC 2022
- CPython versions tested on: conda-forge python 3.10.4
- Operating system and architecture: using ubuntu uname-a: 5.11.0-46-generic 51~20.04.1-Ubuntu
|
process
|
multiprocessing returns array rawarray of wrong length python bug report when initializing array or rawarray of type int wrong length of the arrays is returned reproducible example from multiprocessing import rawarray rawvalue array value a array d x np frombuffer a get obj print x a array i x np frombuffer a get obj print x a rawarray d x np frombuffer a print x a rawarray i x np frombuffer a print x your environment working with vscode conda forge python using ubuntu uname a generic ubuntu smp fri jan utc cpython versions tested on conda forge python operating system and architecture using ubuntu uname a generic ubuntu
| 1
|
10,312
| 13,156,704,121
|
IssuesEvent
|
2020-08-10 11:19:54
|
MarkBind/markbind
|
https://api.github.com/repos/MarkBind/markbind
|
closed
|
Use Prettier to Format Code
|
a-Process p.Low
|
Right now, we are manually formatting code to comply with the linting rules. Having to do this can be somewhat annoying to developers, and can also result it some inconsistencies across the codebase (e.g. where/when to break to a new line).
Instead, we can use a code formatter such as [Prettier](https://prettier.io) to automatically format the code for us. This would avoid the overhead of formatting our own code and makes the codebase look more consistent.
|
1.0
|
Use Prettier to Format Code - Right now, we are manually formatting code to comply with the linting rules. Having to do this can be somewhat annoying to developers, and can also result it some inconsistencies across the codebase (e.g. where/when to break to a new line).
Instead, we can use a code formatter such as [Prettier](https://prettier.io) to automatically format the code for us. This would avoid the overhead of formatting our own code and makes the codebase look more consistent.
|
process
|
use prettier to format code right now we are manually formatting code to comply with the linting rules having to do this can be somewhat annoying to developers and can also result it some inconsistencies across the codebase e g where when to break to a new line instead we can use a code formatter such as to automatically format the code for us this would avoid the overhead of formatting our own code and makes the codebase look more consistent
| 1
|
326,676
| 24,097,210,240
|
IssuesEvent
|
2022-09-19 19:54:57
|
flyteorg/flyte
|
https://api.github.com/repos/flyteorg/flyte
|
opened
|
[Docs] Flyte Tensorflow cookbook example doesn't work on Macbooks with Silicon Chip
|
documentation untriaged
|
### Description
Trying to run the Tensorflow example from the Flyte cookbook results in an error for users with the Mac silicon chip.
Tutorial - [link](https://github.com/flyteorg/flytesnacks/tree/master/cookbook/integrations/kubernetes/kftensorflow)
Tensorflow can be installed on a user's local machine via `pip install tensorflow-macos`, but using base image for the tutorial (`tensorflow/tensorflow:latest-gpu` or `tensorflow/tensorflow:latest`) causes the following error when a workflow execution is attempted:
```
The TensorFlow library was compiled to use AVX instructions, but these aren't available on your machine.
qemu: uncaught target signal 6 (Aborted) - core dumped
```
### Are you sure this issue hasn't been raised already?
- [X] Yes
### Have you read the Code of Conduct?
- [X] Yes
|
1.0
|
[Docs] Flyte Tensorflow cookbook example doesn't work on Macbooks with Silicon Chip - ### Description
Trying to run the Tensorflow example from the Flyte cookbook results in an error for users with the Mac silicon chip.
Tutorial - [link](https://github.com/flyteorg/flytesnacks/tree/master/cookbook/integrations/kubernetes/kftensorflow)
Tensorflow can be installed on a user's local machine via `pip install tensorflow-macos`, but using base image for the tutorial (`tensorflow/tensorflow:latest-gpu` or `tensorflow/tensorflow:latest`) causes the following error when a workflow execution is attempted:
```
The TensorFlow library was compiled to use AVX instructions, but these aren't available on your machine.
qemu: uncaught target signal 6 (Aborted) - core dumped
```
### Are you sure this issue hasn't been raised already?
- [X] Yes
### Have you read the Code of Conduct?
- [X] Yes
|
non_process
|
flyte tensorflow cookbook example doesn t work on macbooks with silicon chip description trying to run the tensorflow example from the flyte cookbook results in an error for users with the mac silicon chip tutorial tensorflow can be installed on a user s local machine via pip install tensorflow macos but using base image for the tutorial tensorflow tensorflow latest gpu or tensorflow tensorflow latest causes the following error when a workflow execution is attempted the tensorflow library was compiled to use avx instructions but these aren t available on your machine qemu uncaught target signal aborted core dumped are you sure this issue hasn t been raised already yes have you read the code of conduct yes
| 0
|
11,928
| 14,704,355,272
|
IssuesEvent
|
2021-01-04 16:22:34
|
dialogos-project/dialogos
|
https://api.github.com/repos/dialogos-project/dialogos
|
opened
|
Java 14 does not include Pack200 decompression any more.
|
installer release-process
|
The Installer doesn't work on Java 14 anymore because of https://openjdk.java.net/jeps/367 .
We need to check that the change in https://openjdk.java.net/jeps/367:39d5085 fixes this.
|
1.0
|
Java 14 does not include Pack200 decompression any more. - The Installer doesn't work on Java 14 anymore because of https://openjdk.java.net/jeps/367 .
We need to check that the change in https://openjdk.java.net/jeps/367:39d5085 fixes this.
|
process
|
java does not include decompression any more the installer doesn t work on java anymore because of we need to check that the change in fixes this
| 1
|
19,696
| 26,047,573,817
|
IssuesEvent
|
2022-12-22 15:37:49
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Clarification: Does "Deploy all in sequence" option look at commit sequence or build completion sequence?
|
doc-enhancement devops/prod Pri2 devops-cicd-process/tech
|
> Deploy all in sequence: Use this option if you want to deploy all the releases ***sequentially*** into the same shared physical resources.
Could you please clarify what does ***sequentially*** mean in this sentence? Is it a sequentiality of commits, or the sequence in which builds were completed?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: d322215c-8025-4f21-0700-7dfa7dc5c46e
* Version Independent ID: 141fcdbb-8394-525b-bb29-eff9a693a9c4
* Content: [Stages in Azure Pipelines - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&tabs=classic)
* Content Source: [docs/pipelines/process/stages.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/stages.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Clarification: Does "Deploy all in sequence" option look at commit sequence or build completion sequence? - > Deploy all in sequence: Use this option if you want to deploy all the releases ***sequentially*** into the same shared physical resources.
Could you please clarify what does ***sequentially*** mean in this sentence? Is it a sequentiality of commits, or the sequence in which builds were completed?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: d322215c-8025-4f21-0700-7dfa7dc5c46e
* Version Independent ID: 141fcdbb-8394-525b-bb29-eff9a693a9c4
* Content: [Stages in Azure Pipelines - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/stages?view=azure-devops&tabs=classic)
* Content Source: [docs/pipelines/process/stages.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/stages.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
clarification does deploy all in sequence option look at commit sequence or build completion sequence deploy all in sequence use this option if you want to deploy all the releases sequentially into the same shared physical resources could you please clarify what does sequentially mean in this sentence is it a sequentiality of commits or the sequence in which builds were completed document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
161,650
| 12,555,857,818
|
IssuesEvent
|
2020-06-07 07:42:25
|
snext1220/stext
|
https://api.github.com/repos/snext1220/stext
|
closed
|
【Playground Flow】グリッド表でのid値検証
|
Testing enhancement
|
https://github.com/snext1220/snext-management/issues/18#issuecomment-612367219 を受けてのIsueです。
> ・敵のIDはm+数字で書かないと機能しないのですが、数字だけで書いても入力、シーン指定できてしまうので、あとで混乱しました。
アイテム、フラグ、敵、実績のid値を入力値にチェックし、適切なプレフィックスがついていないものに対してエラーを表示するようにしています。GitHubのみ反映済みです。
すべてのチェックは(主にメンテナンス的な意味で)難しいかもしれませんが、「この辺は間違えやすい」などあれば、本Issueにてぜひお教えください。
|
1.0
|
【Playground Flow】グリッド表でのid値検証 - https://github.com/snext1220/snext-management/issues/18#issuecomment-612367219 を受けてのIsueです。
> ・敵のIDはm+数字で書かないと機能しないのですが、数字だけで書いても入力、シーン指定できてしまうので、あとで混乱しました。
アイテム、フラグ、敵、実績のid値を入力値にチェックし、適切なプレフィックスがついていないものに対してエラーを表示するようにしています。GitHubのみ反映済みです。
すべてのチェックは(主にメンテナンス的な意味で)難しいかもしれませんが、「この辺は間違えやすい」などあれば、本Issueにてぜひお教えください。
|
non_process
|
【playground flow】グリッド表でのid値検証 を受けてのisueです。 ・敵のidはm 数字で書かないと機能しないのですが、数字だけで書いても入力、シーン指定できてしまうので、あとで混乱しました。 アイテム、フラグ、敵、実績のid値を入力値にチェックし、適切なプレフィックスがついていないものに対してエラーを表示するようにしています。githubのみ反映済みです。 すべてのチェックは(主にメンテナンス的な意味で)難しいかもしれませんが、「この辺は間違えやすい」などあれば、本issueにてぜひお教えください。
| 0
|
360,613
| 10,694,908,691
|
IssuesEvent
|
2019-10-23 11:56:24
|
fack2/ptfb-griffinot
|
https://api.github.com/repos/fack2/ptfb-griffinot
|
closed
|
Set either height or width for pictures
|
T25m priority-3
|
Currently this picture has both height and width set, which distorts it a little. I think you should select only width, and then add some margin if you would like to take more space. 😺

=>

|
1.0
|
Set either height or width for pictures - Currently this picture has both height and width set, which distorts it a little. I think you should select only width, and then add some margin if you would like to take more space. 😺

=>

|
non_process
|
set either height or width for pictures currently this picture has both height and width set which distorts it a little i think you should select only width and then add some margin if you would like to take more space 😺
| 0
|
305,954
| 23,139,019,389
|
IssuesEvent
|
2022-07-28 16:36:09
|
iterative/terraform-provider-iterative
|
https://api.github.com/repos/iterative/terraform-provider-iterative
|
opened
|
TPI behaves differently in the CI
|
documentation p1-important discussion
|
As per https://github.com/iterative/terraform-provider-iterative/pull/339#discussion_r855222543:
- TPI workflows are restarted iff run in CI, but not if run locally (counter-intuitive)
- This behaviour is not documented
- The [use case](https://github.com/iterative/terraform-provider-iterative/issues/389#issuecomment-1047102568), i.e. no need to have an explicit for-loop wrapping the `refresh && iterative_task.example.status["succeeded"]`
|
1.0
|
TPI behaves differently in the CI - As per https://github.com/iterative/terraform-provider-iterative/pull/339#discussion_r855222543:
- TPI workflows are restarted iff run in CI, but not if run locally (counter-intuitive)
- This behaviour is not documented
- The [use case](https://github.com/iterative/terraform-provider-iterative/issues/389#issuecomment-1047102568), i.e. no need to have an explicit for-loop wrapping the `refresh && iterative_task.example.status["succeeded"]`
|
non_process
|
tpi behaves differently in the ci as per tpi workflows are restarted iff run in ci but not if run locally counter intuitive this behaviour is not documented the i e no need to have an explicit for loop wrapping the refresh iterative task example status
| 0
|
16,022
| 20,188,229,179
|
IssuesEvent
|
2022-02-11 01:19:53
|
savitamittalmsft/WAS-SEC-TEST
|
https://api.github.com/repos/savitamittalmsft/WAS-SEC-TEST
|
opened
|
Synchronize on-premises directory with Azure AD
|
WARP-Import WAF FEB 2021 Security Performance and Scalability Capacity Management Processes Security & Compliance Authentication and authorization
|
<a href="https://docs.microsoft.com/azure/architecture/framework/security/design-identity-authentication#centralize-all-identity-systems">Synchronize on-premises directory with Azure AD</a>
<p><b>Why Consider This?</b></p>
Consistency of identities across cloud and on-premises will reduce human error and resulting security risk. Teams managing resources in both environments need a consistent authoritative source to achieve security assurances. For monitoring, if identity can be determined without an intermediate mapping process, security efficiency is improved.
<p><b>Context</b></p>
<p><span>Synchronization is all about providing users an identity in the cloud based on their on-premises identity. Whether or not they will use synchronized account for authentication or federated authentication, the users will still need to have an identity in the cloud. This identity will need to be maintained and updated periodically. The updates can take many forms, from title changes to password changes.</span></p><p><span>Start by evaluating the organization's on-premises identity solution and user requirements. This evaluation is important, as it defines the technical requirements for how user identities will be created and maintained in the cloud. For the majority of organizations, Active Directory is established on-premises and will be the on-premises directory from which users will be synchronized, but this is not always the case.</span></p>
<p><b>Suggested Actions</b></p>
<p><span>Consider using Azure AD Connect to synchronize Azure AD with your existing on-premises directory. For migration projects, have a requirement to complete this task before an Azure migration and development projects begin.</span></p>
<p><b>Learn More</b></p>
<p><a href="https://docs.microsoft.com/en-us/azure/architecture/framework/security/design-identity#synchronize-the-hybrid-identity-systems" target="_blank"><span>Synchronize the hybrid identity systems</span></a><span /></p>
|
1.0
|
Synchronize on-premises directory with Azure AD - <a href="https://docs.microsoft.com/azure/architecture/framework/security/design-identity-authentication#centralize-all-identity-systems">Synchronize on-premises directory with Azure AD</a>
<p><b>Why Consider This?</b></p>
Consistency of identities across cloud and on-premises will reduce human error and resulting security risk. Teams managing resources in both environments need a consistent authoritative source to achieve security assurances. For monitoring, if identity can be determined without an intermediate mapping process, security efficiency is improved.
<p><b>Context</b></p>
<p><span>Synchronization is all about providing users an identity in the cloud based on their on-premises identity. Whether or not they will use synchronized account for authentication or federated authentication, the users will still need to have an identity in the cloud. This identity will need to be maintained and updated periodically. The updates can take many forms, from title changes to password changes.</span></p><p><span>Start by evaluating the organization's on-premises identity solution and user requirements. This evaluation is important, as it defines the technical requirements for how user identities will be created and maintained in the cloud. For the majority of organizations, Active Directory is established on-premises and will be the on-premises directory from which users will be synchronized, but this is not always the case.</span></p>
<p><b>Suggested Actions</b></p>
<p><span>Consider using Azure AD Connect to synchronize Azure AD with your existing on-premises directory. For migration projects, have a requirement to complete this task before an Azure migration and development projects begin.</span></p>
<p><b>Learn More</b></p>
<p><a href="https://docs.microsoft.com/en-us/azure/architecture/framework/security/design-identity#synchronize-the-hybrid-identity-systems" target="_blank"><span>Synchronize the hybrid identity systems</span></a><span /></p>
|
process
|
synchronize on premises directory with azure ad why consider this consistency of identities across cloud and on premises will reduce human error and resulting security risk teams managing resources in both environments need a consistent authoritative source to achieve security assurances for monitoring if identity can be determined without an intermediate mapping process security efficiency is improved context synchronization is all about providing users an identity in the cloud based on their on premises identity whether or not they will use synchronized account for authentication or federated authentication the users will still need to have an identity in the cloud this identity will need to be maintained and updated periodically the updates can take many forms from title changes to password changes start by evaluating the organization s on premises identity solution and user requirements this evaluation is important as it defines the technical requirements for how user identities will be created and maintained in the cloud for the majority of organizations active directory is established on premises and will be the on premises directory from which users will be synchronized but this is not always the case suggested actions consider using azure ad connect to synchronize azure ad with your existing on premises directory for migration projects have a requirement to complete this task before an azure migration and development projects begin learn more synchronize the hybrid identity systems
| 1
|
43,734
| 11,812,559,689
|
IssuesEvent
|
2020-03-19 20:24:54
|
department-of-veterans-affairs/va.gov-team
|
https://api.github.com/repos/department-of-veterans-affairs/va.gov-team
|
closed
|
[SCREENREADER] Calendar Widget - Fieldset needs to be a sibling to the button trigger for screen reader usability
|
508-defect-2 508/Accessibility frontend vaos
|
## Description
<!-- This is a detailed description of the issue. It should include a restatement of the title, and provide more background information. -->
The staging versions of our request-date and select-date pickers have the `<fieldset>` of radios / checkboxes appended to the end of row. This is creating a situation where JAWS and VoiceOver are ignoring the inputs unless users are navigating by using the `TAB` key. Screen reader users who navigate by virtual cursor are missing the inputs, and will not be able to use the calendar effectively.
## Point of Contact
<!-- If this issue is being opened by a VFS team member, please add a point of contact. Usually this is the same person who enters the issue ticket.
-->
**VFS Point of Contact:** _Trevor_
## Acceptance Criteria
<!-- As a keyboard user, I want to open the Level of Coverage widget by pressing Spacebar or pressing Enter. These keypress actions should not interfere with the mouse click event also opening the widget. -->
**As a screen reader user:**
* I want to have clear instructions for use read out. This could be accomplished by adding an SR-only paragraph block to the table's `aria-describedby` H2.
* I want to be able to expand a cell by clicking or pressing the `<button>`, and navigate to the form inputs immediately by:
* Pressing `Ctrl + Opt + RIGHT_ARROW` or `TAB` in VoiceOver
* Pressing `DOWN_ARROW` or `f` or `TAB` in NVDA
* Pressing `f` or `TAB` in JAWS
## Environment
* Windows 10, MacOS Mojave
* IE11, Chrome, Firefox, Safari
* JAWS, NVDA, VoiceOver
## Possible Fixes (optional)
The form elements have to be moved inside table cells. I've mocked a very simple version of this on Codepen: https://codepen.io/tpierce_402/pen/LYYxLxN . This mockup has been tested with the big three screen readers on desktop, and works well for all.
~The codepen will be updated to include `aria-controls` attributes ASAP. This has been tested with the big three screen readers for efficacy.~
## WCAG or Vendor Guidance (optional)
* [Keyboard: Understanding SC 2.1.1](https://www.w3.org/TR/UNDERSTANDING-WCAG20/keyboard-operation-keyboard-operable.html)
* [Info and Relationships: Understanding SC 1.3.1](https://www.w3.org/TR/UNDERSTANDING-WCAG20/content-structure-separation-programmatic.html)
* [Name, Role, Value: Understanding SC 4.1.2](https://www.w3.org/TR/UNDERSTANDING-WCAG20/ensure-compat-rsv.html)
## Screenshots or Trace Logs
<!-- Drop any screenshots or error logs that might be useful for debugging -->

---

|
1.0
|
[SCREENREADER] Calendar Widget - Fieldset needs to be a sibling to the button trigger for screen reader usability - ## Description
<!-- This is a detailed description of the issue. It should include a restatement of the title, and provide more background information. -->
The staging versions of our request-date and select-date pickers have the `<fieldset>` of radios / checkboxes appended to the end of row. This is creating a situation where JAWS and VoiceOver are ignoring the inputs unless users are navigating by using the `TAB` key. Screen reader users who navigate by virtual cursor are missing the inputs, and will not be able to use the calendar effectively.
## Point of Contact
<!-- If this issue is being opened by a VFS team member, please add a point of contact. Usually this is the same person who enters the issue ticket.
-->
**VFS Point of Contact:** _Trevor_
## Acceptance Criteria
<!-- As a keyboard user, I want to open the Level of Coverage widget by pressing Spacebar or pressing Enter. These keypress actions should not interfere with the mouse click event also opening the widget. -->
**As a screen reader user:**
* I want to have clear instructions for use read out. This could be accomplished by adding an SR-only paragraph block to the table's `aria-describedby` H2.
* I want to be able to expand a cell by clicking or pressing the `<button>`, and navigate to the form inputs immediately by:
* Pressing `Ctrl + Opt + RIGHT_ARROW` or `TAB` in VoiceOver
* Pressing `DOWN_ARROW` or `f` or `TAB` in NVDA
* Pressing `f` or `TAB` in JAWS
## Environment
* Windows 10, MacOS Mojave
* IE11, Chrome, Firefox, Safari
* JAWS, NVDA, VoiceOver
## Possible Fixes (optional)
The form elements have to be moved inside table cells. I've mocked a very simple version of this on Codepen: https://codepen.io/tpierce_402/pen/LYYxLxN . This mockup has been tested with the big three screen readers on desktop, and works well for all.
~The codepen will be updated to include `aria-controls` attributes ASAP. This has been tested with the big three screen readers for efficacy.~
## WCAG or Vendor Guidance (optional)
* [Keyboard: Understanding SC 2.1.1](https://www.w3.org/TR/UNDERSTANDING-WCAG20/keyboard-operation-keyboard-operable.html)
* [Info and Relationships: Understanding SC 1.3.1](https://www.w3.org/TR/UNDERSTANDING-WCAG20/content-structure-separation-programmatic.html)
* [Name, Role, Value: Understanding SC 4.1.2](https://www.w3.org/TR/UNDERSTANDING-WCAG20/ensure-compat-rsv.html)
## Screenshots or Trace Logs
<!-- Drop any screenshots or error logs that might be useful for debugging -->

---

|
non_process
|
calendar widget fieldset needs to be a sibling to the button trigger for screen reader usability description the staging versions of our request date and select date pickers have the of radios checkboxes appended to the end of row this is creating a situation where jaws and voiceover are ignoring the inputs unless users are navigating by using the tab key screen reader users who navigate by virtual cursor are missing the inputs and will not be able to use the calendar effectively point of contact if this issue is being opened by a vfs team member please add a point of contact usually this is the same person who enters the issue ticket vfs point of contact trevor acceptance criteria as a screen reader user i want to have clear instructions for use read out this could be accomplished by adding an sr only paragraph block to the table s aria describedby i want to be able to expand a cell by clicking or pressing the and navigate to the form inputs immediately by pressing ctrl opt right arrow or tab in voiceover pressing down arrow or f or tab in nvda pressing f or tab in jaws environment windows macos mojave chrome firefox safari jaws nvda voiceover possible fixes optional the form elements have to be moved inside table cells i ve mocked a very simple version of this on codepen this mockup has been tested with the big three screen readers on desktop and works well for all the codepen will be updated to include aria controls attributes asap this has been tested with the big three screen readers for efficacy wcag or vendor guidance optional screenshots or trace logs
| 0
|
215,613
| 7,295,859,134
|
IssuesEvent
|
2018-02-26 08:48:22
|
spring-projects/spring-boot
|
https://api.github.com/repos/spring-projects/spring-boot
|
closed
|
KafkaHealthIndicator and older brokers
|
priority: normal type: enhancement
|
Currently, when the broker doesn't support obtaining configuration (e.g. 0.10.2.0), the health indicator reports `Down`.
Consider skipping the `replicationFactor` check if the broker cannot provide that information.
See https://github.com/spring-projects/spring-boot/pull/11515#issuecomment-368044582
|
1.0
|
KafkaHealthIndicator and older brokers - Currently, when the broker doesn't support obtaining configuration (e.g. 0.10.2.0), the health indicator reports `Down`.
Consider skipping the `replicationFactor` check if the broker cannot provide that information.
See https://github.com/spring-projects/spring-boot/pull/11515#issuecomment-368044582
|
non_process
|
kafkahealthindicator and older brokers currently when the broker doesn t support obtaining configuration e g the health indicator reports down consider skipping the replicationfactor check if the broker cannot provide that information see
| 0
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.