**Dataset preview schema**

| Column | Dtype | Stats |
|---|---|---|
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 5 – 112 |
| repo_url | string | length 34 – 141 |
| action | string | 3 classes |
| title | string | length 1 – 757 |
| labels | string | length 4 – 664 |
| body | string | length 3 – 261k |
| index | string | 10 classes |
| text_combine | string | length 96 – 261k |
| label | string | 2 classes |
| text | string | length 96 – 232k |
| binary_label | int64 | 0 – 1 |
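The schema lists both a string `label` column (values seen below: `defect`, `non_defect`) and an integer `binary_label`. A minimal sketch, assuming `binary_label` simply encodes `label == "defect"` (which is consistent with every row shown in this preview), of checking that invariant with pandas on a few rows mimicking the records below:

```python
import pandas as pd

# A few rows mirroring the preview (bodies omitted for brevity).
df = pd.DataFrame({
    "repo": ["SeleniumHQ/selenium", "tuna/issues", "RPTools/maptool"],
    "label": ["defect", "non_defect", "non_defect"],
    "binary_label": [1, 0, 0],
})

# Derive the binary target from the string label and verify agreement.
derived = (df["label"] == "defect").astype(int)
assert (derived == df["binary_label"]).all()

# Class balance of the binary target.
print(df["binary_label"].value_counts().to_dict())
```

If the assertion holds on the full dump, the `label` column is redundant with `binary_label` for training purposes.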

---
**Row 81,275** · id 30,779,279,798 · IssuesEvent · 2023-07-31 08:54:11
- repo: SeleniumHQ/selenium (https://api.github.com/repos/SeleniumHQ/selenium)
- action: opened
- title: [🐛 Bug]: Not able to download files when using --headless=new
- labels: I-defect needs-triaging

### What happened?
After introducing --headless=new, I can no longer download files on Linux/Mac architecture.
I did test this with --headless=old, and the download works as expected.
I am not sure if this is because Chrome options are not taking in the given arguments, such as:

```java
Map<String, Object> chromePreferences = Maps.of("profile.default_content_settings.popups", ZERO_INT,
        "plugins.plugins_disabled", new String[]{
                "Adobe Flash Player",
                "Chrome PDF Viewer"
        },
        "download.prompt.for.download", false,
        "plugins.always_open_pdf_externally", true,
        "download.open_pdf_in_system_reader", false,
        "profile.default_content_setting_values.automatic_downloads", 0,
        "download.default_directory", downloadLocation);
```
### How can we reproduce the issue?
```shell
You will need to run a test which will download a file using headless with --headless=new
This must be tested on Linux/Mac architecture.
```
### Relevant log output
```shell
N/A
```
### Operating System
Mac and Linux
### Selenium version
Selenium 4.10.0
### What are the browser(s) and version(s) where you see this issue?
Chrome 115.0.5790.114
### What are the browser driver(s) and version(s) where you see this issue?
Chrome 115.0.5790.114
### Are you using Selenium Grid?
N/A
- index: 1.0
- label: defect
- binary_label: 1

---
**Row 46,494** · id 13,055,920,878 · IssuesEvent · 2020-07-30 03:07:31
- repo: icecube-trac/tix2 (https://api.github.com/repos/icecube-trac/tix2)
- action: opened
- title: ipdf - docs are out of date and incomplete (Trac #1298)
- labels: Incomplete Migration Migrated from Trac combo simulation defect

Migrated from https://code.icecube.wisc.edu/ticket/1298
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"description": "rst docs are light, and barely gloss over IPDF usage. links are dead.\n\ndoxygen docs are incomplete (several \"Comming soon\"'s)\ndoxygen docs are out of date (refer to the old Makefile build system, \"plan to use the icetray unit system\")",
"reporter": "nega",
"cc": "",
"resolution": "fixed",
"_ts": "1550067117911749",
"component": "combo simulation",
"summary": "ipdf - docs are out of date and incomplete",
"priority": "blocker",
"keywords": "documentation",
"time": "2015-08-28T20:21:09",
"milestone": "",
"owner": "kjmeagher",
"type": "defect"
}
```
- index: 1.0
- label: defect
- binary_label: 1

---
**Row 40,891** · id 10,213,355,272 · IssuesEvent · 2019-08-14 21:56:07
- repo: Azure/batch-shipyard (https://api.github.com/repos/Azure/batch-shipyard)
- action: opened
- title: Network direct RDMA VM provisioning fails
- labels: defect rdma

`3.8.0` has a regression preventing network direct RDMA VM instances (A8/A9/NC24rX/H16r/H16mr) from provisioning correctly.
Temporary mitigation until the hotfix is to perform the following (after installing/re-installing `3.8.0`):
```shell
# cd to the directory where you've cloned the repository
git checkout develop
git pull
```
- index: 1.0
- label: defect
- binary_label: 1

---
**Row 46,695** · id 19,412,184,296 · IssuesEvent · 2021-12-20 10:49:23
- repo: tuna/issues (https://api.github.com/repos/tuna/issues)
- action: closed
- title: Wget download of CentOS ISO returns 403 Forbidden
- labels: Service Issue

### Prerequisites
- [X] I am sure that this problem has NEVER been discussed in [other issues](https://github.com/tuna/issues/issues).
### What happened
Downloading the CentOS ISO image with the wget command-line tool returns 403 Forbidden, while downloading it in a browser works fine.
Debian and Ubuntu downloads work, but CentOS fails (both CentOS 7 and 8).
```
$ wget -d 'https://mirrors.tuna.tsinghua.edu.cn/centos/8.5.2111/isos/x86_64/CentOS-8.5.2111-x86_64-dvd1.iso'
DEBUG output created by Wget 1.21 on linux-gnu.
Reading HSTS entries from /home/zby/.wget-hsts
URI encoding = “UTF-8”
Converted file name 'CentOS-8.5.2111-x86_64-dvd1.iso' (UTF-8) -> 'CentOS-8.5.2111-x86_64-dvd1.iso' (UTF-8)
--2021-12-19 16:34:53-- https://mirrors.tuna.tsinghua.edu.cn/centos/8.5.2111/isos/x86_64/CentOS-8.5.2111-x86_64-dvd1.iso
Certificates loaded: 129
正在解析主机 mirrors.tuna.tsinghua.edu.cn (mirrors.tuna.tsinghua.edu.cn)... 101.6.15.130, 2402:f000:1:400::2
Caching mirrors.tuna.tsinghua.edu.cn => 101.6.15.130 2402:f000:1:400::2
正在连接 mirrors.tuna.tsinghua.edu.cn (mirrors.tuna.tsinghua.edu.cn)|101.6.15.130|:443... 已连接。
Created socket 3.
Releasing 0x00005576f88d00a0 (new refcount 1).
---request begin---
GET /centos/8.5.2111/isos/x86_64/CentOS-8.5.2111-x86_64-dvd1.iso HTTP/1.1
User-Agent: Wget/1.21
Accept: */*
Accept-Encoding: identity
Host: mirrors.tuna.tsinghua.edu.cn
Connection: Keep-Alive
---request end---
已发出 HTTP 请求,正在等待回应...
---response begin---
HTTP/1.1 403 Forbidden
Server: nginx/1.18.0
Date: Sun, 19 Dec 2021 08:34:54 GMT
Content-Type: text/html
Transfer-Encoding: chunked
Connection: keep-alive
Strict-Transport-Security: max-age=31536000
X-TUNA-MIRROR-ID: nanomirrors
---response end---
403 Forbidden
Registered socket 3 for persistent reuse.
Parsed Strict-Transport-Security max-age = 31536000, includeSubDomains = false
Updated HSTS host: mirrors.tuna.tsinghua.edu.cn:443 (max-age: 31536000, includeSubdomains: false)
Skipping 512 bytes of body: [<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>清华大学 TUNA 镜像站</title>
<link href="//cdn.jsdelivr.Skipping 512 bytes of body: [net/npm/bootstrap@3.3.7/dist/css/bootstrap.min.css" rel="stylesheet">
<style> body { padding-top: 40px; } </style>
</head>
<body>
<div class="container">
<div class="well">
<p lang="zh-cn">尊敬的访问者,</p>
<p lang="en">Dear visitor, </p>
<p lang="fr">Chère cliente, </p>
<p lang="ja">訪問者へ 、</p>
<p> </p>
<p lang="zh-cn">您好!</p>
<p lang="en">Hello! </p>
<p lang="fr">Bonjour! </p>
<p lang="ja">こんにちは !</p>
<p> </p>
Skipping 512 bytes of body: [ <p lang="zh-cn">我们检测到您正在使用移动设备下载镜像站上较大的二进制文件,为防止消耗过多流量,我们建议您改用计算机下载。</p>
<p lang="en">Your attempt to download a large binary file from a mobile device has been detected. To avoid excessive consumption of data transfer, you are recommended to download it from a computer. </p>
<p lang="fr">Une tentative de téléchargement d'un fichier binaire de grande taille depuis un appareil mobile a été Skipping 512 bytes of body: [détectée. Afin d'éviter une consommation excessive de données mobile, nous vous conseillons de recommencer cette action à partir d'un ordinateur. </p>
<p lang="ja">容量の大きいバイナリファイルが今モバイルデバイスに転送しようとしております、大量のトラフィックが発生するため、コンピュータからのダウンロードをお勧めします。</p>
<p> </p>
<p lang="zh-cn">如有疑问,请联系 <a href="mailto:support@tuna.tsinghua.edu.cSkipping 512 bytes of body: [n">support@tuna.tsinghua.edu.cn</a> 。</p>
<p lang="en">If there are any possible questions, please contact <a href="mailto:support@tuna.tsinghua.edu.cn">support@tuna.tsinghua.edu.cn</a>. </p>
<p lang="fr">Pour plus d'informations, veuillez contacter <a href="mailto:support@tuna.tsinghua.edu.cn">support@tuna.tsinghua.edu.cn</a>. </p>
<p lang="ja">なおご質問などありましたら、<a href="mailto:support@tuna.tsinghua.edu.cn">support@tuna.tsinghua.edu.cn</a>までご連絡してください。Skipping 127 bytes of body: [</p>
<p> </p>
<p><code>00000070 e0c0e5fba7856c37ba7f0d56b6850407</code></p>
</div>
</div>
</body>
</html>
] done.
2021-12-19 16:34:54 错误 403:Forbidden。
Saving HSTS entries to /home/zby/.wget-hsts
```
### What you expected to happen
wget should be able to download the file normally.
### How to reproduce it
`wget -d 'https://mirrors.tuna.tsinghua.edu.cn/centos/8.5.2111/isos/x86_64/CentOS-8.5.2111-x86_64-dvd1.iso'`
### OS version
Debian 11
### Browser version (if applicable)
Wget 1.21
### Other environments
_No response_
### Anything else we need to know
_No response_
- index: 1.0
- label: non_defect
- binary_label: 0

---
**Row 70,989** · id 23,395,281,644 · IssuesEvent · 2022-08-11 22:26:26
- repo: vector-im/element-web (https://api.github.com/repos/vector-im/element-web)
- action: closed
- title: Same suggested person appears many times
- labels: T-Defect S-Minor O-Occasional A-New-Search-Experience

### Steps to reproduce
1. Have a DM with a user
2. Have another room with the same user
3. Open search
4. Click people
5. Search for the user id, e.g. `@user:example.com`
6. Clear the name (but not person search)
7. Search for the user id again, e.g. `@user:example.com`
### Outcome
#### What did you expect?
User suggested once
#### What happened instead?
User suggested twice (or as many times you searched for the user)

### Operating system
Ubuntu 22.04 LTS
### Browser information
Firefox 102.0.1
### URL for webapp
https://develop.element.io/
### Application version
Version von Element: ecda0a10734a-react-742b21ca46d0-js-32bb4b1fc464 Version von Olm: 3.2.8
### Homeserver
matrix.org
### Will you send logs?
No
- index: 1.0
- label: defect
- binary_label: 1

---
**Row 71,173** · id 23,481,388,653 · IssuesEvent · 2022-08-17 10:53:29
- repo: vector-im/element-ios (https://api.github.com/repos/vector-im/element-ios)
- action: closed
- title: Missing "people" in all chats on iOS
- labels: T-Defect Z-AppLayout

### Steps to reproduce
1. Enable filters
2. Go to all chats
### Outcome
#### What did you expect?
To see DMs in that list
#### What happened instead?
All DMs are missing in action
### Your phone model
_No response_
### Operating system version
_No response_
### Application version
_No response_
### Homeserver
_No response_
### Will you send logs?
No
- index: 1.0
- label: defect
- binary_label: 1

---
**Row 46,276** · id 13,055,884,567 · IssuesEvent · 2020-07-30 03:01:15
- repo: icecube-trac/tix2 (https://api.github.com/repos/icecube-trac/tix2)
- action: opened
- title: [documentation] redirect Trac wiki pages elsewhere (Trac #915)
- labels: Incomplete Migration Migrated from Trac defect infrastructure

Migrated from https://code.icecube.wisc.edu/ticket/915
```json
{
"status": "closed",
"changetime": "2015-04-08T16:37:00",
"description": "Trac automatically creates links to wiki pages for CamelCase that it detects. If our documentation was in the Trac wiki, this would be awesome. \n\nSet up an easy way to redirect CamelCase to the correct docs.",
"reporter": "nega",
"cc": "",
"resolution": "fixed",
"_ts": "1428511020599097",
"component": "infrastructure",
"summary": "[documentation] redirect Trac wiki pages elsewhere",
"priority": "normal",
"keywords": "",
"time": "2015-04-08T16:32:15",
"milestone": "",
"owner": "",
"type": "defect"
}
```
- index: 1.0
- label: defect
- binary_label: 1

---
**Row 32,759** · id 6,918,652,884 · IssuesEvent · 2017-11-29 13:01:16
- repo: ontop/ontop (https://api.github.com/repos/ontop/ontop)
- action: closed
- title: Parsing error with space in the mapping assertion
- labels: status: check if still relevant topic: mapping processing topic: protégé type: defect

Remove unnecessary space from the source part of the mapping assertion to avoid parsing problems.
Example of error message:
> MappingId = 'mapping--656471061'
> Line 54: Invalid input: (sourceQuery = null)
- index: 1.0
- label: defect
- binary_label: 1

---
**Row 41,693** · id 10,567,712,928 · IssuesEvent · 2019-10-06 07:09:18
- repo: markoteittinen/custom-maps (https://api.github.com/repos/markoteittinen/custom-maps)
- action: closed
- title: Tracking
- labels: Priority-Medium Type-Defect auto-migrated

```
What steps will reproduce the problem?
1. Any map
What is the expected output? What do you see instead?
1. It would be great if the app could record and draw a track
2. A page could show the statistics of the track (length, time, average speed,
moving time, moving speed, etc)
What version of the product are you using? On what operating system?
1.1.14
Android 2.3.3 HTC Desire
Please provide any additional information below.
```
Original issue reported on code.google.com by `mor...@s32.dk` on 8 Feb 2012 at 9:32
- index: 1.0
- label: defect
- binary_label: 1

---
**Row 63,414** · id 15,598,031,563 · IssuesEvent · 2021-03-18 17:36:11
- repo: RPTools/maptool (https://api.github.com/repos/RPTools/maptool)
- action: closed
- title: Update TwelveMonkeys ImageIO plugins to 3.6.4
- labels: build-configuration tested

Update build.gradle to use current version of plugins.
See https://github.com/haraldk/TwelveMonkeys/releases for various bug fixes/enhancements.
- index: 1.0
- label: non_defect
- binary_label: 0

---
**Row 47,572** · id 13,056,256,899 · IssuesEvent · 2020-07-30 04:08:44
- repo: icecube-trac/tix2 (https://api.github.com/repos/icecube-trac/tix2)
- action: closed
- title: dataio - dataio::test_filestager.py needs to test against a better website (Trac #777)
- labels: Migrated from Trac combo core defect

not one that goes down like a prom dress.
Migrated from https://code.icecube.wisc.edu/ticket/777
```json
{
"status": "closed",
"changetime": "2014-10-09T18:22:41",
"description": "not one that goes down like a prom dress.",
"reporter": "nega",
"cc": "",
"resolution": "worksforme",
"_ts": "1412878961075630",
"component": "combo core",
"summary": "dataio - dataio::test_filestager.py needs to test against a better website",
"priority": "normal",
"keywords": "",
"time": "2014-10-09T16:04:47",
"milestone": "",
"owner": "david.schultz",
"type": "defect"
}
```
- index: 1.0
- label: defect
- binary_label: 1

---
**Row 27,942** · id 5,140,721,921 · IssuesEvent · 2017-01-12 06:52:01
- repo: netty/netty (https://api.github.com/repos/netty/netty)
- action: closed
- title: HttpProxyHandler incorrectly formats IPv6 host and port in CONNECT request
- labels: defect

### Expected behavior
The HttpProxyHandler is expected to be capable of issuing a valid CONNECT request for a tunneled connection to an IPv6 host. In this case we are passing an IPv6 address (eg fd00:c0de:42::c:293a:5736) rather than a host name.
### Actual behavior
The HttpProxyHandler does not properly concatenate the IPv6 address and port. The resulting error after we fail to connect will show you the problem:
```
io.netty.handler.proxy.ProxyConnectException: http, none, /fd00:c0de:42:0:5:562d:d54:1:3128 => /fd00:c0de:42:0:50:5694:2fda:1:4287, status: 503 Service Unavailable
```
In both cases you can see the IPv6 address is formatted without brackets: `fd00:c0de:42:0:50:5694:2fda:1:4287` should be `[fd00:c0de:42:0:50:5694:2fda:1]:4287`. This is just an exception message so it doesn't prove it's formatting incorrectly. However, if you look at the request on the wire you can see it is certainly wrong:
```
CONNECT fd00:c0de:42:0:50:5694:2fda:1:4287 HTTP/1.1\r\n
host: fd00:c0de:42:0:50:5694:2fda:1:4287\r\n
\r\n
```
Here is the problem method from HttpProxyHandler:
```
@Override
protected Object newInitialMessage(ChannelHandlerContext ctx) throws Exception {
InetSocketAddress raddr = destinationAddress();
String rhost;
if (raddr.isUnresolved()) {
rhost = raddr.getHostString();
} else {
rhost = raddr.getAddress().getHostAddress();
}
final String host = rhost + ':' + raddr.getPort();
FullHttpRequest req = new DefaultFullHttpRequest(
HttpVersion.HTTP_1_1, HttpMethod.CONNECT,
host,
Unpooled.EMPTY_BUFFER, false);
req.headers().set(HttpHeaderNames.HOST, host);
if (authorization != null) {
req.headers().set(HttpHeaderNames.PROXY_AUTHORIZATION, authorization);
}
return req;
}
```
Specifically: ` final String host = rhost + ':' + raddr.getPort();`
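A minimal standalone sketch of the bracketing rule the report calls for (not Netty's actual patch; `HostHeader` is a hypothetical helper): an unbracketed IPv6 literal must be wrapped in `[...]` before the port is appended, while hostnames and IPv4 addresses are joined as before.

```java
// Sketch only: bracket an IPv6 literal before joining host and port.
public class HostHeader {
    public static String hostHeader(String rhost, int port) {
        // An unbracketed IPv6 literal contains ':'; hostnames and IPv4 addresses do not.
        boolean ipv6Literal = rhost.indexOf(':') >= 0 && !rhost.startsWith("[");
        return (ipv6Literal ? "[" + rhost + "]" : rhost) + ":" + port;
    }
}
```

With this rule, the address from the report would render as `[fd00:c0de:42:0:50:5694:2fda:1]:4287` in both the request line and the `host` header.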
### Steps to reproduce
* Setup an HTTP Proxy with IPv6.
* Setup a target server with an IPv6 address.
* Attempt to establish a connection to the target server through the proxy, giving the HttpProxyHandler an IPv6 IP address for the target.
** We can successfully connect to the proxy server with IPv6. The problem seems to be specific to the target of the tunneled connection using an IPv6 address.
### Minimal yet complete reproducer code (or URL to code)
* I am not able to create a reproducer or patch with my company at this time. I think the issue is relatively straightforward though.
### Netty version
4.1.4.Final
### JVM version (e.g. `java -version`)
Java 8 update 92
### OS version (e.g. `uname -a`)
Attempted on Ubuntu Linux 14.04 and Mac OSX. I don't think the OS matters in this case.
|
1.0
|
HttpProxyHandler incorrectly formats IPv6 host and port in CONNECT request - ### Expected behavior
The HttpProxyHandler is expected to be capable of issuing a valid CONNECT request for a tunneled connection to an IPv6 host. In this case we are passing an IPv6 address (eg fd00:c0de:42::c:293a:5736) rather than a host name.
### Actual behavior
The HttpProxyHandler does not properly concatenate the IPv6 address and port. The resulting error after we fail to connect will show you the problem:
```
io.netty.handler.proxy.ProxyConnectException: http, none, /fd00:c0de:42:0:5:562d:d54:1:3128 => /fd00:c0de:42:0:50:5694:2fda:1:4287, status: 503 Service Unavailable
```
In both cases you can see the IPv6 address is formatted without brackets: `fd00:c0de:42:0:50:5694:2fda:1:4287` should be `[fd00:c0de:42:0:50:5694:2fda:1]:4287`. This is just an exception message so it doesn't prove it's formatting incorrectly. However, if you look at the request on the wire you can see it is certainly wrong:
```
CONNECT fd00:c0de:42:0:50:5694:2fda:1:4287 HTTP/1.1\r\n
host: fd00:c0de:42:0:50:5694:2fda:1:4287\r\n
\r\n
```
Here is the problem method from HttpProxyHandler:
```
@Override
protected Object newInitialMessage(ChannelHandlerContext ctx) throws Exception {
InetSocketAddress raddr = destinationAddress();
String rhost;
if (raddr.isUnresolved()) {
rhost = raddr.getHostString();
} else {
rhost = raddr.getAddress().getHostAddress();
}
final String host = rhost + ':' + raddr.getPort();
FullHttpRequest req = new DefaultFullHttpRequest(
HttpVersion.HTTP_1_1, HttpMethod.CONNECT,
host,
Unpooled.EMPTY_BUFFER, false);
req.headers().set(HttpHeaderNames.HOST, host);
if (authorization != null) {
req.headers().set(HttpHeaderNames.PROXY_AUTHORIZATION, authorization);
}
return req;
}
```
Specifically: ` final String host = rhost + ':' + raddr.getPort();`
### Steps to reproduce
* Setup an HTTP Proxy with IPv6.
* Setup a target server with an IPv6 address.
* Attempt to establish a connection to the target server through the proxy, giving the HttpProxyHandler an IPv6 IP address for the target.
** We can successfully connect to the proxy server with IPv6. The problem seems to be specific to the target of the tunneled connection using an IPv6 address.
### Minimal yet complete reproducer code (or URL to code)
* I am not able to create a reproducer or patch with my company at this time. I think the issue is relatively straightforward though.
### Netty version
4.1.4.Final
### JVM version (e.g. `java -version`)
Java 8 update 92
### OS version (e.g. `uname -a`)
Attempted on Ubuntu Linux 14.04 and Mac OSX. I don't think the OS matters in this case.
|
defect
|
httpproxyhandler incorrectly formats host and port in connect request expected behavior the httpproxyhandler is expected to be capable of issuing a valid connect request for a tunneled connection to an host in this case we are passing an address eg c rather than a host name actual behavior the httpproxyhandler does not properly concatenate the address and port the resulting error after we fail to connect will show you the problem io netty handler proxy proxyconnectexception http none status service unavailable in both cases you can see the address is formatted without brackets should be this is just an exception message so it doesn t prove it s formatting incorrectly however if you look at the request on the wire you can see it is certainly wrong connect http r n host r n r n here is the problem method from httpproxyhandler override protected object newinitialmessage channelhandlercontext ctx throws exception inetsocketaddress raddr destinationaddress string rhost if raddr isunresolved rhost raddr gethoststring else rhost raddr getaddress gethostaddress final string host rhost raddr getport fullhttprequest req new defaultfullhttprequest httpversion http httpmethod connect host unpooled empty buffer false req headers set httpheadernames host host if authorization null req headers set httpheadernames proxy authorization authorization return req specifically final string host rhost raddr getport steps to reproduce setup an http proxy with setup a target server with an address attempt to establish a connection to the target server through the proxy giving the httpproxyhandler an ip address for the target we can successfully connect to the proxy server with the problem seems to be specific to the target of the tunneled connection using an address minimal yet complete reproducer code or url to code i am not able to create a reproducer or patch with my company at this time i think the issue is relatively straightforward though netty version final jvm version e g java 
version java update os version e g uname a attempted on ubuntu linux and mac osx i don t think the os matters in this case
| 1
|
221,765
| 7,395,953,151
|
IssuesEvent
|
2018-03-18 05:37:04
|
langbakk/HSS
|
https://api.github.com/repos/langbakk/HSS
|
closed
|
Bug: the counter for bike doesn't stop at 150
|
bug priority 3
|
For some reason, it seems the bike-status doesn't stop at 150
|
1.0
|
Bug: the counter for bike doesn't stop at 150 - For some reason, it seems the bike-status doesn't stop at 150
|
non_defect
|
bug the counter for bike doesn t stop at for some reason it seems the bike status doesn t stop at
| 0
|
265,453
| 20,099,524,108
|
IssuesEvent
|
2022-02-07 01:04:13
|
jacksund/simmate
|
https://api.github.com/repos/jacksund/simmate
|
opened
|
Make archive after each ErrorHandler correction
|
documentation
|
It would make sense to archive the working directory after an error is found and correction applied. Currently, the files are just replaced.
|
1.0
|
Make archive after each ErrorHandler correction - It would make sense to archive the working directory after an error is found and correction applied. Currently, the files are just replaced.
|
non_defect
|
make archive after each errorhandler correction it would make sense to archive the working directory after an error is found and correction applied currently the files are just replaced
| 0
|
91,913
| 10,731,553,759
|
IssuesEvent
|
2019-10-28 19:49:20
|
scikit-learn/scikit-learn
|
https://api.github.com/repos/scikit-learn/scikit-learn
|
opened
|
Missing from what's new v0.22
|
Blocker Documentation
|
I've tried to reconcile the commits in 0.21.3..master with the PR and Issue IDs in the change log. The following appear to be missing from `doc/whats_new/v0.22.rst` and entries should be added for them:
reconciled and found
* [ ] ENH faster polynomial features for dense matrices (#13290)
* [ ] ENH Adds roc weighted scorers to SCORERS (#14417)
* [ ] ColumnTransformer input feature name and count validation (#14544)
* [ ] FEA Adds Plotting API to Partial Dependence (#14646)
* [ ] Fix mixin inheritance order, allow overwriting tags (#14884) (not sure if this needs an entry)
* [ ] EHN update lobpcg from scipy master (#14971)
* [ ] Error for cosine affinity when zero vectors present (#7943)
|
1.0
|
Missing from what's new v0.22 - I've tried to reconcile the commits in 0.21.3..master with the PR and Issue IDs in the change log. The following appear to be missing from `doc/whats_new/v0.22.rst` and entries should be added for them:
reconciled and found
* [ ] ENH faster polynomial features for dense matrices (#13290)
* [ ] ENH Adds roc weighted scorers to SCORERS (#14417)
* [ ] ColumnTransformer input feature name and count validation (#14544)
* [ ] FEA Adds Plotting API to Partial Dependence (#14646)
* [ ] Fix mixin inheritance order, allow overwriting tags (#14884) (not sure if this needs an entry)
* [ ] EHN update lobpcg from scipy master (#14971)
* [ ] Error for cosine affinity when zero vectors present (#7943)
|
non_defect
|
missing from what s new i ve tried to reconcile the commits in master with the pr and issue ids in the change log the following appear to be missing from doc whats new rst and entries should be added for them reconciled and found enh faster polynomial features for dense matrices enh adds roc weighted scorers to scorers columntransformer input feature name and count validation fea adds plotting api to partial dependence fix mixin inheritance order allow overwriting tags not sure if this needs an entry ehn update lobpcg from scipy master error for cosine affinity when zero vectors present
| 0
|
50,096
| 13,187,323,895
|
IssuesEvent
|
2020-08-13 03:03:02
|
icecube-trac/tix3
|
https://api.github.com/repos/icecube-trac/tix3
|
closed
|
error message hidden when linking to full disk? (Trac #113)
|
Migrated from Trac cmake defect
|
<details>
<summary>_Migrated from https://code.icecube.wisc.edu/ticket/113
, reported by troy and owned by troy_</summary>
<p>
```json
{
"status": "closed",
"changetime": "2014-11-23T03:37:56",
"description": "",
"reporter": "troy",
"cc": "",
"resolution": "invalid",
"_ts": "1416713876900096",
"component": "cmake",
"summary": "error message hidden when linking to full disk?",
"priority": "major",
"keywords": "",
"time": "2008-08-20T21:17:40",
"milestone": "",
"owner": "troy",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
error message hidden when linking to full disk? (Trac #113) -
<details>
<summary>_Migrated from https://code.icecube.wisc.edu/ticket/113
, reported by troy and owned by troy_</summary>
<p>
```json
{
"status": "closed",
"changetime": "2014-11-23T03:37:56",
"description": "",
"reporter": "troy",
"cc": "",
"resolution": "invalid",
"_ts": "1416713876900096",
"component": "cmake",
"summary": "error message hidden when linking to full disk?",
"priority": "major",
"keywords": "",
"time": "2008-08-20T21:17:40",
"milestone": "",
"owner": "troy",
"type": "defect"
}
```
</p>
</details>
|
defect
|
error message hidden when linking to full disk trac migrated from reported by troy and owned by troy json status closed changetime description reporter troy cc resolution invalid ts component cmake summary error message hidden when linking to full disk priority major keywords time milestone owner troy type defect
| 1
|
37,015
| 8,201,990,764
|
IssuesEvent
|
2018-09-02 02:08:07
|
scipy/scipy
|
https://api.github.com/repos/scipy/scipy
|
closed
|
stats.statlib.spearman.f doesn't return prho (Trac #1271)
|
Migrated from Trac defect prio-normal scipy.stats
|
_Original ticket http://projects.scipy.org/scipy/ticket/1271 on 2010-09-07 by @josef-pkt, assigned to unknown._
scipy\stats\statlib\spearman.f doesn't return prho
currently not used, but it might give better small sample probabilities for stats.spearmanr
statlib.prho currently returns only a status code
|
1.0
|
stats.statlib.spearman.f doesn't return prho (Trac #1271) - _Original ticket http://projects.scipy.org/scipy/ticket/1271 on 2010-09-07 by @josef-pkt, assigned to unknown._
scipy\stats\statlib\spearman.f doesn't return prho
currently not used, but it might give better small sample probabilities for stats.spearmanr
statlib.prho currently returns only a status code
|
defect
|
stats statlib spearman f doesn t return prho trac original ticket on by josef pkt assigned to unknown scipy stats statlib spearman f doesn t return prho currently not used but it might give better small sample probabilities for stats spearmanr statlib prho currently returns only a status code
| 1
|
27,036
| 27,580,156,226
|
IssuesEvent
|
2023-03-08 15:41:44
|
redpanda-data/documentation
|
https://api.github.com/repos/redpanda-data/documentation
|
closed
|
engine: Provide more context in internal search results
|
website engine usability improvement P3
|
### Describe the Issue
When you enter a query into our search box, the results don't differentiate between Cloud and Platform. It would be useful to configure Algolia to display the product to which the search result applies.
The highlighted results are for Cloud, but the label is 'Documentation':

### Updates to existing documentation
<!--
Provide the URL of the page(s) to which the updates apply.
Which topic(s) should be updated?
What is the requested fix? Describe what is wrong in the existing doc and include screenshots if possible. Then provide the correct information.
Is this request to document an existing Redpanda feature that is not currently documented?
-->
### New feature or content gap requests
### If new feature, in which release is it included or expected?
### If the requested update is from customer feedback or a Community slack conversation, provide a link:
### Additional notes
<!--
Documentation Issues must be distinct and not overlap. If filing an Issue for a feature that spans platform and cloud, you must file two separate Issues. Each Issue will drive the new content in GitHub for the respective platform or cloud doc.
Include:
- Relevant GitHub issues and pull requests
- Dependencies on other features or components
- Specific Redpanda contributors to notify
-->
|
True
|
engine: Provide more context in internal search results - ### Describe the Issue
When you enter a query into our search box, the results don't differentiate between Cloud and Platform. It would be useful to configure Algolia to display the product to which the search result applies.
The highlighted results are for Cloud, but the label is 'Documentation':

### Updates to existing documentation
<!--
Provide the URL of the page(s) to which the updates apply.
Which topic(s) should be updated?
What is the requested fix? Describe what is wrong in the existing doc and include screenshots if possible. Then provide the correct information.
Is this request to document an existing Redpanda feature that is not currently documented?
-->
### New feature or content gap requests
### If new feature, in which release is it included or expected?
### If the requested update is from customer feedback or a Community slack conversation, provide a link:
### Additional notes
<!--
Documentation Issues must be distinct and not overlap. If filing an Issue for a feature that spans platform and cloud, you must file two separate Issues. Each Issue will drive the new content in GitHub for the respective platform or cloud doc.
Include:
- Relevant GitHub issues and pull requests
- Dependencies on other features or components
- Specific Redpanda contributors to notify
-->
|
non_defect
|
engine provide more context in internal search results describe the issue when you enter a query into our search box the results don t differentiate between cloud and platform it would be useful to configure algolia to display the product to which the search result applies the highlighted results are for cloud but the label is documentation updates to existing documentation provide the url of the page s to which the updates apply which topic s should be updated what is the requested fix describe what is wrong in the existing doc and include screenshots if possible then provide the correct information is this request to document an existing redpanda feature that is not currently documented new feature or content gap requests if new feature in which release is it included or expected if the requested update is from customer feedback or a community slack conversation provide a link additional notes documentation issues must be distinct and not overlap if filing an issue for a feature that spans platform and cloud you must file two separate issues each issue will drive the new content in github for the respective platform or cloud doc include relevant github issues and pull requests dependencies on other features or components specific redpanda contributors to notify
| 0
|
6,030
| 2,610,219,678
|
IssuesEvent
|
2015-02-26 19:09:49
|
chrsmith/somefinders
|
https://api.github.com/repos/chrsmith/somefinders
|
opened
|
x3daudio1 7 dll для windows 7 cкачать
|
auto-migrated Priority-Medium Type-Defect
|
```
'''Гавриил Щукин'''
Good day, I just can't find x3daudio1 7 dll for windows 7 to download anywhere. I saw it somewhere before.
'''воин Стрелков'''
Here is a good site where you can download it
http://bit.ly/17vGvqb
'''Бернард Назаров'''
Thanks, that seems to be it, but it asks me to enter a phone number
'''Арсен Ершов'''
Nope, everything is fine for me, nothing was charged
'''Вахтанг Крюков'''
No, it does not affect the balance
File information: x3daudio1 7 dll for windows 7
download
Uploaded: This month
Times downloaded: 695
Rating: 1091
Average download speed: 420
Similar files: 30
```
-----
Original issue reported on code.google.com by `kondense...@gmail.com` on 17 Dec 2013 at 4:57
|
1.0
|
x3daudio1 7 dll для windows 7 cкачать - ```
'''Гавриил Щукин'''
Good day, I just can't find x3daudio1 7 dll for windows 7 to download anywhere. I saw it somewhere before.
'''воин Стрелков'''
Here is a good site where you can download it
http://bit.ly/17vGvqb
'''Бернард Назаров'''
Thanks, that seems to be it, but it asks me to enter a phone number
'''Арсен Ершов'''
Nope, everything is fine for me, nothing was charged
'''Вахтанг Крюков'''
No, it does not affect the balance
File information: x3daudio1 7 dll for windows 7
download
Uploaded: This month
Times downloaded: 695
Rating: 1091
Average download speed: 420
Similar files: 30
```
-----
Original issue reported on code.google.com by `kondense...@gmail.com` on 17 Dec 2013 at 4:57
|
defect
|
dll для windows cкачать гавриил щукин день добрый никак не могу найти dll для windows cкачать где то видел уже воин стрелков вот хороший сайт где можно скачать бернард назаров спасибо вроде то но просит телефон вводить арсен ершов неа все ок у меня ничего не списало вахтанг крюков не это не влияет на баланс информация о файле dll для windows cкачать загружен в этом месяце скачан раз рейтинг средняя скорость скачивания похожих файлов original issue reported on code google com by kondense gmail com on dec at
| 1
|
23,697
| 6,475,326,752
|
IssuesEvent
|
2017-08-17 20:08:24
|
Microsoft/TypeScript
|
https://api.github.com/repos/Microsoft/TypeScript
|
closed
|
Intellisense not working at all
|
Needs More Info VS Code Tracked
|
_From @joshfttb on October 23, 2016 18:25_
I'm really trying to give vscode a go, but after two days I'm unable to make intellisense work.
I've tried configuring my project according to these instrustions:
https://code.visualstudio.com/Docs/languages/javascript
https://code.visualstudio.com/Docs/runtimes/nodejs
I've completely uninstalled and re-installed both vscode and typings. I've tried removing the files argument from my jsconfig, as well as re-creating it completely. I've tried installing typings locally. I've also tried installing and running typescript from my machine instead of vscode's built-in typescript on the off chance that would help. No luck.
If anyone could provide assistance I would greatly appreciate it.
I'm running on OSX (or is is MacOS) Sierra with brew installed.
<img width="427" alt="screen shot 2016-10-23 at 14 06 04" src="https://cloud.githubusercontent.com/assets/2595454/19628461/22bade28-992c-11e6-82c5-602240873b2d.png">
<img width="1440" alt="screen shot 2016-10-23 at 12 41 46" src="https://cloud.githubusercontent.com/assets/2595454/19628462/31d9e67e-992c-11e6-89f1-1f0277fa8c2b.png">
<img width="954" alt="screen shot 2016-10-23 at 14 24 51" src="https://cloud.githubusercontent.com/assets/2595454/19628476/82696e8e-992c-11e6-8441-60f8632afad3.png">
<img width="958" alt="screen shot 2016-10-23 at 14 24 58" src="https://cloud.githubusercontent.com/assets/2595454/19628477/8269fd18-992c-11e6-929f-56ab8a5eea8f.png">
_Copied from original issue: Microsoft/vscode#14236_
|
1.0
|
Intellisense not working at all - _From @joshfttb on October 23, 2016 18:25_
I'm really trying to give vscode a go, but after two days I'm unable to make intellisense work.
I've tried configuring my project according to these instrustions:
https://code.visualstudio.com/Docs/languages/javascript
https://code.visualstudio.com/Docs/runtimes/nodejs
I've completely uninstalled and re-installed both vscode and typings. I've tried removing the files argument from my jsconfig, as well as re-creating it completely. I've tried installing typings locally. I've also tried installing and running typescript from my machine instead of vscode's built-in typescript on the off chance that would help. No luck.
If anyone could provide assistance I would greatly appreciate it.
I'm running on OSX (or is is MacOS) Sierra with brew installed.
<img width="427" alt="screen shot 2016-10-23 at 14 06 04" src="https://cloud.githubusercontent.com/assets/2595454/19628461/22bade28-992c-11e6-82c5-602240873b2d.png">
<img width="1440" alt="screen shot 2016-10-23 at 12 41 46" src="https://cloud.githubusercontent.com/assets/2595454/19628462/31d9e67e-992c-11e6-89f1-1f0277fa8c2b.png">
<img width="954" alt="screen shot 2016-10-23 at 14 24 51" src="https://cloud.githubusercontent.com/assets/2595454/19628476/82696e8e-992c-11e6-8441-60f8632afad3.png">
<img width="958" alt="screen shot 2016-10-23 at 14 24 58" src="https://cloud.githubusercontent.com/assets/2595454/19628477/8269fd18-992c-11e6-929f-56ab8a5eea8f.png">
_Copied from original issue: Microsoft/vscode#14236_
|
non_defect
|
intellisense not working at all from joshfttb on october i m really trying to give vscode a go but after two days i m unable to make intellisense work i ve tried configuring my project according to these instrustions i ve completely uninstalled and re installed both vscode and typings i ve tried removing the files argument from my jsconfig as well as re creating it completely i ve tried installing typings locally i ve also tried installing and running typescript from my machine instead of vscode s built in typescript on the off chance that would help no luck if anyone could provide assistance i would greatly appreciate it i m running on osx or is is macos sierra with brew installed img width alt screen shot at src img width alt screen shot at src img width alt screen shot at src img width alt screen shot at src copied from original issue microsoft vscode
| 0
|
16,549
| 2,915,242,548
|
IssuesEvent
|
2015-06-23 11:16:06
|
mintty/mintty
|
https://api.github.com/repos/mintty/mintty
|
closed
|
adb shell
|
Type-Defect
|
```
open a conection whit adb shell for android terminal.
the line command is very well but the keys up and down don't print the last
typed command.
```
Original issue reported on code.google.com by `leone...@gmail.com` on 7 Mar 2013 at 6:32
|
1.0
|
adb shell - ```
open a conection whit adb shell for android terminal.
the line command is very well but the keys up and down don't print the last
typed command.
```
Original issue reported on code.google.com by `leone...@gmail.com` on 7 Mar 2013 at 6:32
|
defect
|
adb shell open a conection whit adb shell for android terminal the line command is very well but the keys up and down don t print the last typed command original issue reported on code google com by leone gmail com on mar at
| 1
|
26,568
| 20,253,765,215
|
IssuesEvent
|
2022-02-14 20:40:57
|
google/site-kit-wp
|
https://api.github.com/repos/google/site-kit-wp
|
closed
|
Update actions to run on PRs for feature branches
|
P0 QA: Eng Type: Infrastructure
|
## Feature Description
We normally use feature flags to handle long-running development rather than using a separate branch until it is ready to go. In some cases, it makes sense to use a separate, shorter-lived branch when working on a few changes that will go in all together without a feature flag.
To support this, we should update our actions to run for PRs against these feature branches as they are currently limited to PRs against `main` or `develop`.
---------------
_Do not alter or remove anything below. The following sections will be managed by moderators only._
## Acceptance criteria
* All GitHub actions that run for PRs should also run for PRs against `feature/*` branches
## Implementation Brief
Update the `on` > `pull_request` > `branches` of all Github workflows to also include `'feature/**'`.
````yml
on:
pull_request:
branches:
- develop
- main
- 'feature/**'
````
The workflows are located in `.github/workflows` and all of them needs to be updated accordingly.
### Test Coverage
* <!-- One or more bullet points for how to implement automated tests to verify the feature works. -->
## QA Brief
* QA:ENG:
* Create a test feature branch off of `develop`. ie. `feature/test-branch` and push it.
* Create another branch and do a commit and push it.
* Make sure changed files have some js/css and php files.
* Create a PR from the second branch and target the feature branch.
* Make sure that Actions are running.
* Delete the PR and test branches afterwards.
## Changelog entry
* N/A
|
1.0
|
Update actions to run on PRs for feature branches - ## Feature Description
We normally use feature flags to handle long-running development rather than using a separate branch until it is ready to go. In some cases, it makes sense to use a separate, shorter-lived branch when working on a few changes that will go in all together without a feature flag.
To support this, we should update our actions to run for PRs against these feature branches as they are currently limited to PRs against `main` or `develop`.
---------------
_Do not alter or remove anything below. The following sections will be managed by moderators only._
## Acceptance criteria
* All GitHub actions that run for PRs should also run for PRs against `feature/*` branches
## Implementation Brief
Update the `on` > `pull_request` > `branches` of all Github workflows to also include `'feature/**'`.
````yml
on:
pull_request:
branches:
- develop
- main
- 'feature/**'
````
The workflows are located in `.github/workflows` and all of them needs to be updated accordingly.
### Test Coverage
* <!-- One or more bullet points for how to implement automated tests to verify the feature works. -->
## QA Brief
* QA:ENG:
* Create a test feature branch off of `develop`. ie. `feature/test-branch` and push it.
* Create another branch and do a commit and push it.
* Make sure changed files have some js/css and php files.
* Create a PR from the second branch and target the feature branch.
* Make sure that Actions are running.
* Delete the PR and test branches afterwards.
## Changelog entry
* N/A
|
non_defect
|
update actions to run on prs for feature branches feature description we normally use feature flags to handle long running development rather than using a separate branch until it is ready to go in some cases it makes sense to use a separate shorter lived branch when working on a few changes that will go in all together without a feature flag to support this we should update our actions to run for prs against these feature branches as they are currently limited to prs against main or develop do not alter or remove anything below the following sections will be managed by moderators only acceptance criteria all github actions that run for prs should also run for prs against feature branches implementation brief update the on pull request branches of all github workflows to also include feature yml on pull request branches develop main feature the workflows are located in github workflows and all of them needs to be updated accordingly test coverage qa brief qa eng create a test feature branch off of develop ie feature test branch and push it create another branch and do a commit and push it make sure changed files have some js css and php files create a pr from the second branch and target the feature branch make sure that actions are running delete the pr and test branches afterwards changelog entry n a
| 0
|
65,316
| 14,711,065,444
|
IssuesEvent
|
2021-01-05 06:42:18
|
skeshari12/chef-server
|
https://api.github.com/repos/skeshari12/chef-server
|
opened
|
CVE-2015-8559 (High) detected in chef-12.22.5.gem
|
security vulnerability
|
## CVE-2015-8559 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>chef-12.22.5.gem</b></p></summary>
<p>A systems integration framework, built to bring the benefits of configuration management to your entire infrastructure.</p>
<p>Library home page: <a href="https://rubygems.org/gems/chef-12.22.5.gem">https://rubygems.org/gems/chef-12.22.5.gem</a></p>
<p>
Dependency Hierarchy:
- :x: **chef-12.22.5.gem** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/skeshari12/chef-server/commit/6942614fdaa22f54c101995c996e72ea6f6c9553">6942614fdaa22f54c101995c996e72ea6f6c9553</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The knife bootstrap command in chef leaks the validator.pem private RSA key to /var/log/messages.
<p>Publish Date: 2017-09-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-8559>CVE-2015-8559</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-8559">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-8559</a></p>
<p>Release Date: 2017-09-21</p>
<p>Fix Resolution: v15.4.22</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2015-8559 (High) detected in chef-12.22.5.gem - ## CVE-2015-8559 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>chef-12.22.5.gem</b></p></summary>
<p>A systems integration framework, built to bring the benefits of configuration management to your entire infrastructure.</p>
<p>Library home page: <a href="https://rubygems.org/gems/chef-12.22.5.gem">https://rubygems.org/gems/chef-12.22.5.gem</a></p>
<p>
Dependency Hierarchy:
- :x: **chef-12.22.5.gem** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/skeshari12/chef-server/commit/6942614fdaa22f54c101995c996e72ea6f6c9553">6942614fdaa22f54c101995c996e72ea6f6c9553</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The knife bootstrap command in chef leaks the validator.pem private RSA key to /var/log/messages.
<p>Publish Date: 2017-09-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-8559>CVE-2015-8559</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-8559">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-8559</a></p>
<p>Release Date: 2017-09-21</p>
<p>Fix Resolution: v15.4.22</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_defect
|
cve high detected in chef gem cve high severity vulnerability vulnerable library chef gem a systems integration framework built to bring the benefits of configuration management to your entire infrastructure library home page a href dependency hierarchy x chef gem vulnerable library found in head commit a href found in base branch master vulnerability details the knife bootstrap command in chef leaks the validator pem private rsa key to var log messages publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
38,217
| 8,701,619,739
|
IssuesEvent
|
2018-12-05 12:07:30
|
primefaces/primefaces
|
https://api.github.com/repos/primefaces/primefaces
|
closed
|
Tree: Drag&Drop of top-level nodes not working (JavaScript error)
|
defect
|
## 1) Environment
- PrimeFaces version: Current Showcase (`Running PrimeFaces-6.2.9 on Mojarra-2.3.2`)
- Affected browsers: Tested on Chrome, probably all browsers
## 2) Expected behavior
Moving a top-level node of the tree via drag&drop to another top-level position should update the tree and the node should have been moved.
## 3) Actual behavior
Nothing happens after performing the drag&drop action, tree remains as before. An error is logged in the JavaScript console of the browser.
## 4) Steps to reproduce
Go to the showcase: https://www.primefaces.org/showcase/ui/data/tree/dragdrop.xhtml
On the left tree, try dropping `Node0` on the drop point after `Node1`. The node is not moved and the console displays a JavaScript error.
## 5) Reason
The issue seems to be caused by commit https://github.com/primefaces/primefaces/commit/874fd7d952b8d604e2bc09980a16d36746c91a2d#diff-3cc01523b05bd1fb5ffa60e642384c14
File: `/src/main/resources/META-INF/resources/primefaces/tree/tree.js`
On line `1004-1005`, it says
```javascript
dropNode = dropPoint.closest('li.ui-treenode-parent'),
dropNodeKey = $this.getRowKey(dropNode),
```
As a top-level node does not have a closest `li.ui-treenode-parent`, `dropNodeKey` ends up being `undefined`.
Then an error is thrown when attempting to access a property on `dropNodeKey` on line `1032`:
```javascript
if(!transfer && dropNodeKey.indexOf(dragNodeKey) === 0) {
return;
}
```
For reference, the error message is:
```
components.js.xhtml?ln=primefaces&v=6.2.9:20419 Uncaught TypeError: Cannot read property 'indexOf' of undefined
at HTMLLIElement.drop (components.js.xhtml?ln=primefaces&v=6.2.9:20419)
at $.(anonymous function).(anonymous function)._trigger (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:697:13)
at $.(anonymous function).(anonymous function)._drop (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:3367:9)
at $.(anonymous function).(anonymous function)._drop (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:144:25)
at $.(anonymous function).(anonymous function).<anonymous> (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:3512:26)
at Function.each (jquery.js.xhtml?ln=primefaces&v=6.2.9:2)
at Object.drop (jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:3505)
at $.(anonymous function).(anonymous function)._mouseStop (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:2201:29)
at $.(anonymous function).(anonymous function)._mouseStop (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:144:25)
at $.(anonymous function).(anonymous function)._mouseUp (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:1826:9)
```
|
1.0
|
Tree: Drag&Drop of top-level nodes not working (JavaScript error) - ## 1) Environment
- PrimeFaces version: Current Showcase (`Running PrimeFaces-6.2.9 on Mojarra-2.3.2`)
- Affected browsers: Tested on Chrome, probably all browsers
## 2) Expected behavior
Moving a top-level node of the tree via drag&drop to another top-level position should update the tree and the node should have been moved.
## 3) Actual behavior
Nothing happens after performing the drag&drop action, tree remains as before. An error is logged in the JavaScript console of the browser.
## 4) Steps to reproduce
Go to the showcase: https://www.primefaces.org/showcase/ui/data/tree/dragdrop.xhtml
On the left tree, try dropping `Node0` on the drop point after `Node1`. The node is not moved and the console displays a JavaScript error.
## 5) Reason
The issue seems to be caused by commit https://github.com/primefaces/primefaces/commit/874fd7d952b8d604e2bc09980a16d36746c91a2d#diff-3cc01523b05bd1fb5ffa60e642384c14
File: `/src/main/resources/META-INF/resources/primefaces/tree/tree.js`
On line `1004-1005`, it says
```javascript
dropNode = dropPoint.closest('li.ui-treenode-parent'),
dropNodeKey = $this.getRowKey(dropNode),
```
As a top-level node does not have a closest `li.ui-treenode-parent`, `dropNodeKey` ends up being `undefined`.
Then an error is thrown when attempting to access a property on `dropNodeKey` on line `1032`:
```javascript
if(!transfer && dropNodeKey.indexOf(dragNodeKey) === 0) {
return;
}
```
For reference, the error message is:
```
components.js.xhtml?ln=primefaces&v=6.2.9:20419 Uncaught TypeError: Cannot read property 'indexOf' of undefined
at HTMLLIElement.drop (components.js.xhtml?ln=primefaces&v=6.2.9:20419)
at $.(anonymous function).(anonymous function)._trigger (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:697:13)
at $.(anonymous function).(anonymous function)._drop (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:3367:9)
at $.(anonymous function).(anonymous function)._drop (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:144:25)
at $.(anonymous function).(anonymous function).<anonymous> (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:3512:26)
at Function.each (jquery.js.xhtml?ln=primefaces&v=6.2.9:2)
at Object.drop (jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:3505)
at $.(anonymous function).(anonymous function)._mouseStop (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:2201:29)
at $.(anonymous function).(anonymous function)._mouseStop (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:144:25)
at $.(anonymous function).(anonymous function)._mouseUp (https://www.primefaces.org/showcase/javax.faces.resource/jquery/jquery-plugins.js.xhtml?ln=primefaces&v=6.2.9:1826:9)
```
|
defect
|
tree drag drop of top level nodes not working javascript error environment primefaces version current showcase running primefaces on mojarra affected browsers tested on chrome probably all browsers expected behavior moving a top level node of the tree via drag drop to another top level position should update the tree and the node should have been moved actual behavior nothing happens after performing the drag drop action tree remains as before an error is logged in the javascript console of the browser steps to reproduce go to the showcase on the left tree try dropping on the drop point after the node is not moved and the console displays a javascript error reason the issue seems to be caused by commit file src main resources meta inf resources primefaces tree tree js on line it says javascript dropnode droppoint closest li ui treenode parent dropnodekey this getrowkey dropnode as a top level node does not have a closest li ui treenode parent dropnodekey ends up being undefined then an error is thrown when attempting to access a property on dropnodekey on line javascript if transfer dropnodekey indexof dragnodekey return for reference the error message is components js xhtml ln primefaces v uncaught typeerror cannot read property indexof of undefined at htmllielement drop components js xhtml ln primefaces v at anonymous function anonymous function trigger at anonymous function anonymous function drop at anonymous function anonymous function drop at anonymous function anonymous function at function each jquery js xhtml ln primefaces v at object drop jquery plugins js xhtml ln primefaces v at anonymous function anonymous function mousestop at anonymous function anonymous function mousestop at anonymous function anonymous function mouseup
| 1
|
376,865
| 26,220,848,093
|
IssuesEvent
|
2023-01-04 14:44:11
|
NFDI4Chem/VibrationalSpectroscopyOntology
|
https://api.github.com/repos/NFDI4Chem/VibrationalSpectroscopyOntology
|
closed
|
[Docs] Why are there duplicate classes?
|
documentation
|
Some fields seem to have the same meaning:
- "excitation wavelength" = "excitation wavelength setting";
- "groove density" = "groove density setting" = "diffraction grating"
I suggest choosing only one and using one coherent labelling (with or without "... setting")
|
1.0
|
[Docs] Why are there duplicate classes? - Some fields seem to have the same meaning:
- "excitation wavelength" = "excitation wavelength setting";
- "groove density" = "groove density setting" = "diffraction grating"
I suggest choosing only one and using one coherent labelling (with or without "... setting")
|
non_defect
|
why are there duplicate classes some fields seem to have the same meaning excitation wavelength excitation wavelength setting groove density groove density setting diffraction grating i suggest to choose only one and choose one coherent labelling with or without setting
| 0
|
290,960
| 25,110,512,374
|
IssuesEvent
|
2022-11-08 20:06:27
|
LtxProgrammer/Changed-Minecraft-Mod
|
https://api.github.com/repos/LtxProgrammer/Changed-Minecraft-Mod
|
closed
|
Transfur noise fails to play when keepCon is off.
|
bug needs testing
|
Players and entities don't make noise. Only plays when using syringe.
|
1.0
|
Transfur noise fails to play when keepCon is off. - Players and entities don't make noise. Only plays when using syringe.
|
non_defect
|
transfur noise fails to play when keepcon is off players and entities don t make noise only plays when using syringe
| 0
|
19,386
| 3,198,404,591
|
IssuesEvent
|
2015-10-01 12:03:03
|
dmaidaniuk/content-management-faces
|
https://api.github.com/repos/dmaidaniuk/content-management-faces
|
closed
|
Buildfile doesn't work - build.xml
|
auto-migrated Priority-Medium Type-Defect
|
What steps will reproduce the problem?
1. try to run ant build in eclipse or on command line
What is the expected output? What do you see instead?
expected output should be: BUILD SUCCESSFUL
instead i see: dozens of exceptions due to missing dependencies
What version of the product are you using? On what operating system?
checked out sources from /trunk
Please provide any additional information below.
Original issue reported on code.google.com by `anne.mil...@profitbricks.com` on 3 Nov 2011 at 1:07
|
1.0
|
Buildfile doesn't work - build.xml - What steps will reproduce the problem?
1. try to run ant build in eclipse or on command line
What is the expected output? What do you see instead?
expected output should be: BUILD SUCCESSFUL
instead i see: dozens of exceptions due to missing dependencies
What version of the product are you using? On what operating system?
checked out sources from /trunk
Please provide any additional information below.
Original issue reported on code.google.com by `anne.mil...@profitbricks.com` on 3 Nov 2011 at 1:07
|
defect
|
buildfile doesn t work build xml what steps will reproduce the problem try to run ant build in eclipse or on command line what is the expected output what do you see instead expected output should be build successful instead i see dozens of exceptions due to missing dependencies what version of the product are you using on what operating system checked out sources from trunk please provide any additional information below original issue reported on code google com by anne mil profitbricks com on nov at
| 1
|
79,444
| 28,263,010,791
|
IssuesEvent
|
2023-04-07 02:17:58
|
openzfs/zfs
|
https://api.github.com/repos/openzfs/zfs
|
opened
|
Fedora 38 zfs-testing repo is not available
|
Type: Defect
|
<!-- Please fill out the following template, which will help other contributors address your issue. -->
<!--
Thank you for reporting an issue.
*IMPORTANT* - Please check our issue tracker before opening a new issue.
Additional valuable information can be found in the OpenZFS documentation
and mailing list archives.
Please fill in as much of the template as possible.
-->
### System information
<!-- add version after "|" character -->
Type | Version/Name
--- | ---
Distribution Name |Fedora
Distribution Version |38
Kernel Version |6.2.2-301
Architecture |x86_64
OpenZFS Version |2.1.9
<!--
Command to find OpenZFS version:
zfs version
Commands to find kernel version:
uname -r # Linux
freebsd-version -r # FreeBSD
-->
### Describe the problem you're observing
zfs-testing repo for Fedora 38 is missing
### Describe how to reproduce the problem
```
dnf install https://zfsonlinux.org/fedora/zfs-release.fc38.noarch.rpm
gpg --import --import-options show-only /etc/pki/rpm-gpg/RPM-GPG-KEY-zfsonlinux
dnf config-manager --enable zfs-testing
dnf update
```
### Include any warning/errors/backtraces from the system logs
```
Errors during downloading metadata for repository 'zfs-testing':
- Status code: 403 for http://download.zfsonlinux.org/fedora-testing/38/x86_64/repodata/repomd.xml (IP: 52.92.212.170)
Error: Failed to download metadata for repo 'zfs-testing': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried
```
|
1.0
|
Fedora 38 zfs-testing repo is not available - <!-- Please fill out the following template, which will help other contributors address your issue. -->
<!--
Thank you for reporting an issue.
*IMPORTANT* - Please check our issue tracker before opening a new issue.
Additional valuable information can be found in the OpenZFS documentation
and mailing list archives.
Please fill in as much of the template as possible.
-->
### System information
<!-- add version after "|" character -->
Type | Version/Name
--- | ---
Distribution Name |Fedora
Distribution Version |38
Kernel Version |6.2.2-301
Architecture |x86_64
OpenZFS Version |2.1.9
<!--
Command to find OpenZFS version:
zfs version
Commands to find kernel version:
uname -r # Linux
freebsd-version -r # FreeBSD
-->
### Describe the problem you're observing
zfs-testing repo for Fedora 38 is missing
### Describe how to reproduce the problem
```
dnf install https://zfsonlinux.org/fedora/zfs-release.fc38.noarch.rpm
gpg --import --import-options show-only /etc/pki/rpm-gpg/RPM-GPG-KEY-zfsonlinux
dnf config-manager --enable zfs-testing
dnf update
```
### Include any warning/errors/backtraces from the system logs
```
Errors during downloading metadata for repository 'zfs-testing':
- Status code: 403 for http://download.zfsonlinux.org/fedora-testing/38/x86_64/repodata/repomd.xml (IP: 52.92.212.170)
Error: Failed to download metadata for repo 'zfs-testing': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried
```
|
defect
|
fedora zfs testing repo is not available thank you for reporting an issue important please check our issue tracker before opening a new issue additional valuable information can be found in the openzfs documentation and mailing list archives please fill in as much of the template as possible system information type version name distribution name fedora distribution version kernel version architecture openzfs version command to find openzfs version zfs version commands to find kernel version uname r linux freebsd version r freebsd describe the problem you re observing zfs testing repo for fedora is missing describe how to reproduce the problem dnf install gpg import import options show only etc pki rpm gpg rpm gpg key zfsonlinux dnf config manager enable zfs testing dnf update include any warning errors backtraces from the system logs errors during downloading metadata for repository zfs testing status code for ip error failed to download metadata for repo zfs testing cannot download repomd xml cannot download repodata repomd xml all mirrors were tried
| 1
|
24,657
| 4,057,939,087
|
IssuesEvent
|
2016-05-25 00:40:36
|
jccastillo0007/eFacturaT
|
https://api.github.com/repos/jccastillo0007/eFacturaT
|
opened
|
in connector or timbrador (stamping service), do not send modal messages
|
defect
|
since they prevent the stamping process from continuing, and these are 'blind' processes.
when there is no internet connection, can retries be attempted?
that is what the client requested...
|
1.0
|
in connector or timbrador (stamping service), do not send modal messages - since they prevent the stamping process from continuing, and these are 'blind' processes.
when there is no internet connection, can retries be attempted?
that is what the client requested...
|
defect
|
en conector o timbrador no enviar mensajes modales ya que impiden la continuidad del timbrado y son procesos ciegos cuando no existe internet se pueden hacer reintentos así lo pidió el cliente
| 1
|
23,720
| 3,851,866,005
|
IssuesEvent
|
2016-04-06 05:28:07
|
GPF/imame4all
|
https://api.github.com/repos/GPF/imame4all
|
closed
|
iCade Mobile controller remapping issue
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
1. Access TAB menu by pressing two buttons.
2. Select specific input on iCade Mobile attempting to reassign
3. Press the desired button
What is the expected output? What do you see instead?
I expected to see the button I pressed reassigned to the selected input.
Instead I got the message ENTER or ENTER J1 JoystickButton J2 JoystickButton J3
JoystickButton J4 JoystickButton (referred to in ISSUE 19).
What version of the product are you using? On what operating system?
imame4all 1.10-1 and iOS 5.0.1 on iphone 4S
Please provide any additional information below.
Each button on the iCade Mobile is recognized by mame as two buttons. I listed
them below. All I really need to do is assign NONE to many of the inputs that
are already being used. I know I need to press ESC to do this but there is no
way for me to press ESC with the iCade Mobile. I also tried a bluetooth
keyboard to press ESC but that didn't work. Thanks.
ICa mame
E1 og
E2 lv
9 im
0 kp
8 jn
7 uf
6 hr
5 yt
4 right
3 left
2 down
1 up
```
Original issue reported on code.google.com by `marto...@gmail.com` on 26 Jun 2012 at 8:10
|
1.0
|
iCade Mobile controller remapping issue - ```
What steps will reproduce the problem?
1. Access TAB menu by pressing two buttons.
2. Select specific input on iCade Mobile attempting to reassign
3. Press the desired button
What is the expected output? What do you see instead?
I expected to see the button I pressed reassigned to the selected input.
Instead I got the message ENTER or ENTER J1 JoystickButton J2 JoystickButton J3
JoystickButton J4 JoystickButton (referred to in ISSUE 19).
What version of the product are you using? On what operating system?
imame4all 1.10-1 and iOS 5.0.1 on iphone 4S
Please provide any additional information below.
Each button on the iCade Mobile is recognized by mame as two buttons. I listed
them below. All I really need to do is assign NONE to many of the inputs that
are already being used. I know I need to press ESC to do this but there is no
way for me to press ESC with the iCade Mobile. I also tried a bluetooth
keyboard to press ESC but that didn't work. Thanks.
ICa mame
E1 og
E2 lv
9 im
0 kp
8 jn
7 uf
6 hr
5 yt
4 right
3 left
2 down
1 up
```
Original issue reported on code.google.com by `marto...@gmail.com` on 26 Jun 2012 at 8:10
|
defect
|
icade mobile controller remapping issue what steps will reproduce the problem access tab menu by pressing two buttons select specific input on icade mobile attempting to reassign press the desired button what is the expected output what do you see instead i expected to see the button i pressed reassigned to the selected input instead i got the message enter or enter joystickbutton joystickbutton joystickbutton joystickbutton referred to in issue what version of the product are you using on what operating system and ios on iphone please provide any additional information below each button on the icade mobile is recognized by mame as two buttons i listed them below all i really need to do is assign none to many of the inputs that are already being used i know i need to press esc to do this but there is no way for me to press esc with the icade mobile i also tried a bluetooth keyboard to press esc but that didn t work thanks ica mame og lv im kp jn uf hr yt right left down up original issue reported on code google com by marto gmail com on jun at
| 1
|
33,845
| 7,268,381,074
|
IssuesEvent
|
2018-02-20 09:54:36
|
hazelcast/hazelcast
|
https://api.github.com/repos/hazelcast/hazelcast
|
opened
|
JCache 1.1 TCK - org.jsr107.tck.event.CacheListenerTest passes, but assertion errors are in the log
|
Interface: ICache Team: Core Type: Defect
|
The JCache 1.1 TCK passes for Hazelcast, but the log file contains assertion errors. It seems we experience 2 issues here:
* the TCK handles the result wrongly (the test should fail for Hazelcast);
* and we have a JCache incompatibility in the CacheListener
https://hazelcast-l337.ci.cloudbees.com/view/JCache/job/JCache-1.1-TCK-branchparam-client-OS/4/console
```
11:59:16 java.lang.AssertionError: Old value should be available for CacheEntryEvent{eventType = REMOVED, isOldValueAvailable = false, value = Lucky, oldValue = null}
11:59:16 at org.junit.Assert.fail(Assert.java:88)
11:59:16 at org.junit.Assert.assertTrue(Assert.java:41)
11:59:16 at org.jsr107.tck.testutil.CacheTestSupport$MyCacheEntryListener.assertOldValueForExpiredRemovedListener(CacheTestSupport.java:204)
11:59:16 at org.jsr107.tck.testutil.CacheTestSupport$MyCacheEntryListener.onRemoved(CacheTestSupport.java:178)
11:59:16 at org.jsr107.tck.event.CacheEntryListenerServer.runHandlers(CacheEntryListenerServer.java:151)
11:59:16 at org.jsr107.tck.event.CacheEntryListenerServer.access$000(CacheEntryListenerServer.java:38)
11:59:16 at org.jsr107.tck.event.CacheEntryListenerServer$CacheEntryEventOperationHandler.onProcess(CacheEntryListenerServer.java:120)
11:59:16 at org.jsr107.tck.support.Server$ClientConnection.run(Server.java:310)
```
|
1.0
|
JCache 1.1 TCK - org.jsr107.tck.event.CacheListenerTest passes, but assertion errors are in the log - The JCache 1.1 TCK passes for Hazelcast, but the log file contains assertion errors. It seems we experience 2 issues here:
* the TCK handles the result wrongly (the test should fail for Hazelcast);
* and we have a JCache incompatibility in the CacheListener
https://hazelcast-l337.ci.cloudbees.com/view/JCache/job/JCache-1.1-TCK-branchparam-client-OS/4/console
```
11:59:16 java.lang.AssertionError: Old value should be available for CacheEntryEvent{eventType = REMOVED, isOldValueAvailable = false, value = Lucky, oldValue = null}
11:59:16 at org.junit.Assert.fail(Assert.java:88)
11:59:16 at org.junit.Assert.assertTrue(Assert.java:41)
11:59:16 at org.jsr107.tck.testutil.CacheTestSupport$MyCacheEntryListener.assertOldValueForExpiredRemovedListener(CacheTestSupport.java:204)
11:59:16 at org.jsr107.tck.testutil.CacheTestSupport$MyCacheEntryListener.onRemoved(CacheTestSupport.java:178)
11:59:16 at org.jsr107.tck.event.CacheEntryListenerServer.runHandlers(CacheEntryListenerServer.java:151)
11:59:16 at org.jsr107.tck.event.CacheEntryListenerServer.access$000(CacheEntryListenerServer.java:38)
11:59:16 at org.jsr107.tck.event.CacheEntryListenerServer$CacheEntryEventOperationHandler.onProcess(CacheEntryListenerServer.java:120)
11:59:16 at org.jsr107.tck.support.Server$ClientConnection.run(Server.java:310)
```
|
defect
|
jcache tck org tck event cachelistenertest passes but assertion errors are in the log the jcache tck passes for hazelcast but the log file contains assertion errors it seems we experience issues here the tck handles the result wronly the test should fail for hazelcast and we have an jcache incompatibility in the cachelistener java lang assertionerror old value should be available for cacheentryevent eventtype removed isoldvalueavailable false value lucky oldvalue null at org junit assert fail assert java at org junit assert asserttrue assert java at org tck testutil cachetestsupport mycacheentrylistener assertoldvalueforexpiredremovedlistener cachetestsupport java at org tck testutil cachetestsupport mycacheentrylistener onremoved cachetestsupport java at org tck event cacheentrylistenerserver runhandlers cacheentrylistenerserver java at org tck event cacheentrylistenerserver access cacheentrylistenerserver java at org tck event cacheentrylistenerserver cacheentryeventoperationhandler onprocess cacheentrylistenerserver java at org tck support server clientconnection run server java
| 1
|
78,157
| 27,347,780,396
|
IssuesEvent
|
2023-02-27 07:03:46
|
FalsehoodMC/Fabrication
|
https://api.github.com/repos/FalsehoodMC/Fabrication
|
closed
|
Toggle Sprint fails injection on forgery
|
k: Defect n: Forge
|
FabInjector failed to find injection point for net/minecraft/client/player/LocalPlayer;m_8107_()V Lnet/minecraft/client/KeyMapping;m_90857_()Z
located in com.unascribed.fabrication.mixin.b_utility.toggle_sprint.MixinClientPlayerEntity
nvm it wasnt your mod but still this part^
|
1.0
|
Toggle Sprint fails injection on forgery - FabInjector failed to find injection point for net/minecraft/client/player/LocalPlayer;m_8107_()V Lnet/minecraft/client/KeyMapping;m_90857_()Z
located in com.unascribed.fabrication.mixin.b_utility.toggle_sprint.MixinClientPlayerEntity
nvm it wasnt your mod but still this part^
|
defect
|
toggle sprint fails injection on forgery fabinjector failed to find injection point for net minecraft client player localplayer m v lnet minecraft client keymapping m z located in com unascribed fabrication mixin b utility toggle sprint mixinclientplayerentity nvm it wasnt your mod but still this part
| 1
|
7,097
| 2,610,326,856
|
IssuesEvent
|
2015-02-26 19:45:21
|
chrsmith/republic-at-war
|
https://api.github.com/repos/chrsmith/republic-at-war
|
closed
|
Typo
|
auto-migrated Priority-Low Type-Defect
|
```
B1 droids need a space after "numbers." in their description
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 9 Jun 2011 at 12:06
* Merged into: #575
|
1.0
|
Typo - ```
B1 droids need a space after "numbers." in their description
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 9 Jun 2011 at 12:06
* Merged into: #575
|
defect
|
typo droids need a space after numbers in their description original issue reported on code google com by gmail com on jun at merged into
| 1
|
318,118
| 9,681,994,424
|
IssuesEvent
|
2019-05-23 08:06:26
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
reopened
|
scandinavianphoto.se - very slow browsing
|
browser-firefox-mobile priority-normal severity-important type-bad-performance
|
**URL**: www.scandinavianphoto.se
**Browser/Version**: Firefox Mobile 66.0
**Operating System**: Android
**What seems to be the trouble?(Required)**
- [ ] Desktop site instead of mobile site
- [x] Mobile site is not usable
- [ ] Video doesn't play
- [ ] Layout is messed up
- [ ] Text is not visible
- [ ] Something else (Add details below)
**Steps to Reproduce**
1. Navigate to: (*site url*)
2. Try navigating and you will notice the extremely slow response
*__Expected Behavior:__*
Should not be this slow
*__Actual Behavior:__*
**Screenshot**

|
1.0
|
scandinavianphoto.se - very slow browsing - **URL**: www.scandinavianphoto.se
**Browser/Version**: Firefox Mobile 66.0
**Operating System**: Android
**What seems to be the trouble?(Required)**
- [ ] Desktop site instead of mobile site
- [x] Mobile site is not usable
- [ ] Video doesn't play
- [ ] Layout is messed up
- [ ] Text is not visible
- [ ] Something else (Add details below)
**Steps to Reproduce**
1. Navigate to: (*site url*)
2. Try navigating and you will notice the extremely slow response
*__Expected Behavior:__*
Should not be this slow
*__Actual Behavior:__*
**Screenshot**

|
non_defect
|
scandinavianphoto se very slow browsing url browser version firefox mobile operating system android what seems to be the trouble required desktop site instead of mobile site mobile site is not usable video doesn t play layout is messed up text is not visible something else add details below steps to reproduce navigate to site url try navigation and you will notice the extreme slow response expected behavior should not be this slow actual behavior screenshot screenshot descriptions
| 0
|
85,967
| 15,755,311,302
|
IssuesEvent
|
2021-03-31 01:33:11
|
ysmanohar/Arrowlytics
|
https://api.github.com/repos/ysmanohar/Arrowlytics
|
opened
|
WS-2018-0625 (High) detected in xmlbuilder-2.6.5.tgz, xmlbuilder-8.2.2.tgz
|
security vulnerability
|
## WS-2018-0625 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>xmlbuilder-2.6.5.tgz</b>, <b>xmlbuilder-8.2.2.tgz</b></p></summary>
<p>
<details><summary><b>xmlbuilder-2.6.5.tgz</b></p></summary>
<p>An XML builder for node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/xmlbuilder/-/xmlbuilder-2.6.5.tgz">https://registry.npmjs.org/xmlbuilder/-/xmlbuilder-2.6.5.tgz</a></p>
<p>Path to dependency file: /Arrowlytics/bower_components/async/package.json</p>
<p>Path to vulnerable library: Arrowlytics/bower_components/async/node_modules/xmlbuilder/package.json</p>
<p>
Dependency Hierarchy:
- jscs-1.13.1.tgz (Root Library)
- :x: **xmlbuilder-2.6.5.tgz** (Vulnerable Library)
</details>
<details><summary><b>xmlbuilder-8.2.2.tgz</b></p></summary>
<p>An XML builder for node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/xmlbuilder/-/xmlbuilder-8.2.2.tgz">https://registry.npmjs.org/xmlbuilder/-/xmlbuilder-8.2.2.tgz</a></p>
<p>Path to dependency file: /Arrowlytics/bower_components/test-fixture/package.json</p>
<p>Path to vulnerable library: Arrowlytics/bower_components/shadycss/node_modules/xmlbuilder/package.json,Arrowlytics/bower_components/shadycss/node_modules/xmlbuilder/package.json,Arrowlytics/bower_components/shadycss/node_modules/xmlbuilder/package.json,Arrowlytics/bower_components/shadycss/node_modules/xmlbuilder/package.json,Arrowlytics/bower_components/shadycss/node_modules/xmlbuilder/package.json,Arrowlytics/bower_components/shadycss/node_modules/xmlbuilder/package.json,Arrowlytics/bower_components/shadycss/node_modules/xmlbuilder/package.json</p>
<p>
Dependency Hierarchy:
- wct-local-2.1.3.tgz (Root Library)
- launchpad-0.7.2.tgz
- plist-2.1.0.tgz
- :x: **xmlbuilder-8.2.2.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package xmlbuilder-js before 9.0.5 is vulnerable to denial of service due to a regular expression issue.
<p>Publish Date: 2018-02-08
<p>URL: <a href=https://github.com/oozcitak/xmlbuilder-js/commit/bbf929a8a54f0d012bdc44cbe622fdeda2509230>WS-2018-0625</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/oozcitak/xmlbuilder-js/commit/bbf929a8a54f0d012bdc44cbe622fdeda2509230">https://github.com/oozcitak/xmlbuilder-js/commit/bbf929a8a54f0d012bdc44cbe622fdeda2509230</a></p>
<p>Release Date: 2020-03-23</p>
<p>Fix Resolution: 9.0.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
|
non_defect
|
ws high detected in xmlbuilder tgz xmlbuilder tgz ws high severity vulnerability vulnerable libraries xmlbuilder tgz xmlbuilder tgz xmlbuilder tgz an xml builder for node js library home page a href path to dependency file arrowlytics bower components async package json path to vulnerable library arrowlytics bower components async node modules xmlbuilder package json dependency hierarchy jscs tgz root library x xmlbuilder tgz vulnerable library xmlbuilder tgz an xml builder for node js library home page a href path to dependency file arrowlytics bower components test fixture package json path to vulnerable library arrowlytics bower components shadycss node modules xmlbuilder package json arrowlytics bower components shadycss node modules xmlbuilder package json arrowlytics bower components shadycss node modules xmlbuilder package json arrowlytics bower components shadycss node modules xmlbuilder package json arrowlytics bower components shadycss node modules xmlbuilder package json arrowlytics bower components shadycss node modules xmlbuilder package json arrowlytics bower components shadycss node modules xmlbuilder package json dependency hierarchy wct local tgz root library launchpad tgz plist tgz x xmlbuilder tgz vulnerable library vulnerability details the package xmlbuilder js before is vulnerable to denial of service due to a regular expression issue publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
236,414
| 26,010,602,198
|
IssuesEvent
|
2022-12-21 01:03:51
|
brogers588/spring-boot
|
https://api.github.com/repos/brogers588/spring-boot
|
closed
|
CVE-2022-31692 (High) detected in spring-security-web-5.5.0.jar - autoclosed
|
security vulnerability
|
## CVE-2022-31692 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-security-web-5.5.0.jar</b></p></summary>
<p>Spring Security</p>
<p>Library home page: <a href="https://spring.io/projects/spring-security">https://spring.io/projects/spring-security</a></p>
<p>Path to dependency file: /spring-boot-tests/spring-boot-smoke-tests/spring-boot-smoke-test-actuator-log4j2/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/le/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/le/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/f
iles-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/le/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/le/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f8
3311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/le/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/le/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/le/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modu
les-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework.security/spring-security-web/5.5.0/98e946722f83311978c4ee627ddf722218746377/spring-security-web-5.5.0.jar</p>
<p>
Dependency Hierarchy:
- :x: **spring-security-web-5.5.0.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/brogers588/spring-boot/commits/12b99a3ee31b333f29415387505dfb45f75ced5f">12b99a3ee31b333f29415387505dfb45f75ced5f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Spring Security, versions 5.7 prior to 5.7.5 and 5.6 prior to 5.6.9 could be susceptible to authorization rules bypass via forward or include dispatcher types. Specifically, an application is vulnerable when all of the following are true: The application expects that Spring Security applies security to forward and include dispatcher types. The application uses the AuthorizationFilter either manually or via the authorizeHttpRequests() method. The application configures the FilterChainProxy to apply to forward and/or include requests (e.g. spring.security.filter.dispatcher-types = request, error, async, forward, include). The application may forward or include the request to a higher privilege-secured endpoint.The application configures Spring Security to apply to every dispatcher type via authorizeHttpRequests().shouldFilterAllDispatcherTypes(true)
<p>Publish Date: 2022-10-31
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-31692>CVE-2022-31692</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://tanzu.vmware.com/security/cve-2022-31692">https://tanzu.vmware.com/security/cve-2022-31692</a></p>
<p>Release Date: 2022-10-31</p>
<p>Fix Resolution: 5.6.9</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
|
non_defect
|
cve high detected in spring security web jar autoclosed cve high severity vulnerability vulnerable library spring security web jar spring security library home page a href path to dependency file spring boot tests spring boot smoke tests spring boot smoke test actuator build gradle path to vulnerable library home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar le caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar le caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar le caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring 
security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar le caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar le caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar le caches modules files org springframework security spring security web spring security web jar le caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org 
springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar home wss scanner gradle caches modules files org springframework security spring security web spring security web jar dependency hierarchy x spring security web jar vulnerable library found in head commit a href found in base branch main vulnerability details spring security versions prior to and prior to could be susceptible to authorization rules bypass via forward or include dispatcher types specifically an application is vulnerable when all of the following are true the application expects that spring security applies security to forward and include dispatcher types the application uses the authorizationfilter either manually or via the authorizehttprequests method the application configures the filterchainproxy to apply to forward and or include requests e g spring security filter dispatcher types request error async forward include the application may forward or include the request to a higher privilege secured endpoint the application configures spring security to apply to every dispatcher type via authorizehttprequests shouldfilteralldispatchertypes true publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
12,111
| 8,599,960,293
|
IssuesEvent
|
2018-11-16 05:07:53
|
istio/istio
|
https://api.github.com/repos/istio/istio
|
closed
|
Tracking: make local RBAC available in Istio 1.0
|
area/security area/security/aaa
|
This is a tracking bug for making local RBAC available in Istio 1.0.
* Pilot:
- [x] Add authz plugin to construct RBAC policies (#5484)
- [x] Add `ON_WITH_INCLUSION` and `ON_WITH_EXCLUSION` mode (#6456)
- [x] Add destination attributes support in `ServiceRole` (#6554)
- [x] Add authentication attributes support in `ServiceRoleBinding` (#6722)
* Upstream Envoy:
- [x] Add metadata support in RBAC policy (envoyproxy/envoy#3638)
- [x] Add more logging for help debugging (envoyproxy/envoy#3744)
* Istio Proxy:
- [x] Pass authentication attributes to RBAC filter via metadata (istio/proxy#1825)
- [x] Update Istio Proxy to latest Envoy with Metadata support (istio/proxy#1832)
* Test:
- [x] Test the user flow to make sure everything is working as expected
- [x] Check the basic `ON` and `OFF` mode is working as expected
- [x] Check the advanced `ON_WITH_INCLUDE` and `ON_WITH_EXCLUSION` is working as expected
- [x] Check the Istio attributes is working as expected
- [x] Add e2e tests for local RBAC (#7193, istio/proxy#1907)
* Document:
- [x] Add a troubleshooting for local RBAC (istio/istio.github.io#1999)
- [x] Add new sample yaml files to use attributes in `ServiceRole` and `ServiceRoleBinding` (#6749)
- [x] Update existing documents: concept and task (istio/istio.github.io#1768, istio/api#580)
* Deprecate:
- [x] Remove mixer based RBAC adapter, samples and e2e tests
cc @liminw @diemtvu @quanjielin @qiwzhang @lizan
|
True
|
Tracking: make local RBAC available in Istio 1.0 - This is a tracking bug for making local RBAC available in Istio 1.0.
* Pilot:
- [x] Add authz plugin to construct RBAC policies (#5484)
- [x] Add `ON_WITH_INCLUSION` and `ON_WITH_EXCLUSION` mode (#6456)
- [x] Add destination attributes support in `ServiceRole` (#6554)
- [x] Add authentication attributes support in `ServiceRoleBinding` (#6722)
* Upstream Envoy:
- [x] Add metadata support in RBAC policy (envoyproxy/envoy#3638)
- [x] Add more logging for help debugging (envoyproxy/envoy#3744)
* Istio Proxy:
- [x] Pass authentication attributes to RBAC filter via metadata (istio/proxy#1825)
- [x] Update Istio Proxy to latest Envoy with Metadata support (istio/proxy#1832)
* Test:
- [x] Test the user flow to make sure everything is working as expected
- [x] Check the basic `ON` and `OFF` mode is working as expected
- [x] Check the advanced `ON_WITH_INCLUDE` and `ON_WITH_EXCLUSION` is working as expected
- [x] Check the Istio attributes is working as expected
- [x] Add e2e tests for local RBAC (#7193, istio/proxy#1907)
* Document:
- [x] Add a troubleshooting for local RBAC (istio/istio.github.io#1999)
- [x] Add new sample yaml files to use attributes in `ServiceRole` and `ServiceRoleBinding` (#6749)
- [x] Update existing documents: concept and task (istio/istio.github.io#1768, istio/api#580)
* Deprecate:
- [x] Remove mixer based RBAC adapter, samples and e2e tests
cc @liminw @diemtvu @quanjielin @qiwzhang @lizan
|
non_defect
|
tracking make local rbac available in istio this is a tracking bug for making local rbac available in istio pilot add authz plugin to construct rbac policies add on with inclusion and on with exclusion mode add destination attributes support in servicerole add authentication attributes support in servicerolebinding upstream envoy add metadata support in rbac policy envoyproxy envoy add more logging for help debugging envoyproxy envoy istio proxy pass authentication attributes to rbac filter via metadata istio proxy update istio proxy to latest envoy with metadata support istio proxy test test the user flow to make sure everything is working as expected check the basic on and off mode is working as expected check the advanced on with include and on with exclusion is working as expected check the istio attributes is working as expected add tests for local rbac istio proxy document add a troubleshooting for local rbac istio istio github io add new sample yaml files to use attributes in servicerole and servicerolebinding update existing documents concept and task istio istio github io istio api deprecate remove mixer based rbac adapter samples and tests cc liminw diemtvu quanjielin qiwzhang lizan
| 0
|
49,451
| 13,186,732,664
|
IssuesEvent
|
2020-08-13 01:08:22
|
icecube-trac/tix3
|
https://api.github.com/repos/icecube-trac/tix3
|
opened
|
Steamshovel linking to CVMFS Qt libs not system ones (Trac #1419)
|
Incomplete Migration Migrated from Trac cmake defect
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1419">https://code.icecube.wisc.edu/ticket/1419</a>, reported by blaufuss and owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2016-03-18T21:14:10",
"description": "Building offline-software trunk (r139168) using cvmfs.\n\nI noted these scary warnings on cmake:\n{{{\nCMake Warning at cmake/project.cmake:459 (add_executable):\n Cannot generate a safe runtime search path for target steamshovel because\n files in some directories may conflict with libraries in implicit\n directories:\n\u200b\n runtime library [libQtOpenGL.so.4] in /usr/lib/x86_64-linux-gnu may be hidden by files in:\n /cvmfs/icecube.opensciencegrid.org/standard/Ubuntu_14_x86_64/lib\n runtime library [libQtGui.so.4] in /usr/lib/x86_64-linux-gnu may be hidden by files in:\n /cvmfs/icecube.opensciencegrid.org/standard/Ubuntu_14_x86_64/lib\n runtime library [libQtCore.so.4] in /usr/lib/x86_64-linux-gnu may be hidden by files in:\n /cvmfs/icecube.opensciencegrid.org/standard/Ubuntu_14_x86_64/lib\n\u200b\n Some of these libraries may not be found correctly.\nCall Stack (most recent call first):\n steamshovel/CMakeLists.txt:120 (i3_executable)\n}}}\nThings built without issue, but indeed the warnings came true:\n{{{\nblaufuss@lubuntu[build]% ldd ./bin/steamshovel |grep Qt\n\tlibQtOpenGL.so.4 => /cvmfs/icecube.opensciencegrid.org/standard/Ubuntu_14_x86_64/lib/libQtOpenGL.so.4 (0x00007fb5b6174000)\n\tlibQtGui.so.4 => /cvmfs/icecube.opensciencegrid.org/standard/Ubuntu_14_x86_64/lib/libQtGui.so.4 (0x00007fb5b54c7000)\n\tlibQtCore.so.4 => /cvmfs/icecube.opensciencegrid.org/standard/Ubuntu_14_x86_64/lib/libQtCore.so.4 (0x00007fb5b4fd6000)\n}}}\n\nSteamshovel does seem to work fine.....at least with some preliminary tests...",
"reporter": "blaufuss",
"cc": "david.schultz, hdembinski, nega",
"resolution": "wontfix",
"_ts": "1458335650323600",
"component": "cmake",
"summary": "Steamshovel linking to CVMFS Qt libs not system ones",
"priority": "normal",
"keywords": "cvmfs qt",
"time": "2015-11-04T16:17:11",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
Steamshovel linking to CVMFS Qt libs not system ones (Trac #1419) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1419">https://code.icecube.wisc.edu/ticket/1419</a>, reported by blaufuss and owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2016-03-18T21:14:10",
"description": "Building offline-software trunk (r139168) using cvmfs.\n\nI noted these scary warnings on cmake:\n{{{\nCMake Warning at cmake/project.cmake:459 (add_executable):\n Cannot generate a safe runtime search path for target steamshovel because\n files in some directories may conflict with libraries in implicit\n directories:\n\u200b\n runtime library [libQtOpenGL.so.4] in /usr/lib/x86_64-linux-gnu may be hidden by files in:\n /cvmfs/icecube.opensciencegrid.org/standard/Ubuntu_14_x86_64/lib\n runtime library [libQtGui.so.4] in /usr/lib/x86_64-linux-gnu may be hidden by files in:\n /cvmfs/icecube.opensciencegrid.org/standard/Ubuntu_14_x86_64/lib\n runtime library [libQtCore.so.4] in /usr/lib/x86_64-linux-gnu may be hidden by files in:\n /cvmfs/icecube.opensciencegrid.org/standard/Ubuntu_14_x86_64/lib\n\u200b\n Some of these libraries may not be found correctly.\nCall Stack (most recent call first):\n steamshovel/CMakeLists.txt:120 (i3_executable)\n}}}\nThings built without issue, but indeed the warnings came true:\n{{{\nblaufuss@lubuntu[build]% ldd ./bin/steamshovel |grep Qt\n\tlibQtOpenGL.so.4 => /cvmfs/icecube.opensciencegrid.org/standard/Ubuntu_14_x86_64/lib/libQtOpenGL.so.4 (0x00007fb5b6174000)\n\tlibQtGui.so.4 => /cvmfs/icecube.opensciencegrid.org/standard/Ubuntu_14_x86_64/lib/libQtGui.so.4 (0x00007fb5b54c7000)\n\tlibQtCore.so.4 => /cvmfs/icecube.opensciencegrid.org/standard/Ubuntu_14_x86_64/lib/libQtCore.so.4 (0x00007fb5b4fd6000)\n}}}\n\nSteamshovel does seem to work fine.....at least with some preliminary tests...",
"reporter": "blaufuss",
"cc": "david.schultz, hdembinski, nega",
"resolution": "wontfix",
"_ts": "1458335650323600",
"component": "cmake",
"summary": "Steamshovel linking to CVMFS Qt libs not system ones",
"priority": "normal",
"keywords": "cvmfs qt",
"time": "2015-11-04T16:17:11",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
|
defect
|
steamshovel linking to cvmfs qt libs not system ones trac migrated from json status closed changetime description building offline software trunk using cvmfs n ni noted these scary warnings on cmake n ncmake warning at cmake project cmake add executable n cannot generate a safe runtime search path for target steamshovel because n files in some directories may conflict with libraries in implicit n directories n n runtime library in usr lib linux gnu may be hidden by files in n cvmfs icecube opensciencegrid org standard ubuntu lib n runtime library in usr lib linux gnu may be hidden by files in n cvmfs icecube opensciencegrid org standard ubuntu lib n runtime library in usr lib linux gnu may be hidden by files in n cvmfs icecube opensciencegrid org standard ubuntu lib n n some of these libraries may not be found correctly ncall stack most recent call first n steamshovel cmakelists txt executable n nthings built without issue but indeed the warnings came true n nblaufuss lubuntu ldd bin steamshovel grep qt n tlibqtopengl so cvmfs icecube opensciencegrid org standard ubuntu lib libqtopengl so n tlibqtgui so cvmfs icecube opensciencegrid org standard ubuntu lib libqtgui so n tlibqtcore so cvmfs icecube opensciencegrid org standard ubuntu lib libqtcore so n n nsteamshovel does seem to work fine at least with some preliminary tests reporter blaufuss cc david schultz hdembinski nega resolution wontfix ts component cmake summary steamshovel linking to cvmfs qt libs not system ones priority normal keywords cvmfs qt time milestone owner nega type defect
| 1
|
103,058
| 4,164,330,985
|
IssuesEvent
|
2016-06-18 18:25:37
|
RioBus/ionic
|
https://api.github.com/repos/RioBus/ionic
|
closed
|
Notify the user when there were no buses in the query
|
easy enhancement priority
|
Show a notification toast when the query returns nothing.
|
1.0
|
Notify the user when there were no buses in the query - Show a notification toast when the query returns nothing.
|
non_defect
|
notify the user when there were no buses in the query show a notification toast when the query returns nothing
| 0
|
81,288
| 30,783,463,764
|
IssuesEvent
|
2023-07-31 11:41:58
|
vector-im/element-x-android
|
https://api.github.com/repos/vector-im/element-x-android
|
opened
|
Room appears twice in timeline
|
T-Defect
|
### Steps to reproduce
1. Look at timeline, see room twice
### Outcome
#### What did you expect?
Room appears once in timeline
#### What happened instead?
Room appears twice in timeline
### Your phone model
Pixel 6a
### Operating system version
Graphene OS
### Application version and app store
Nightly
### Homeserver
matrix.org
### Will you send logs?
Yes
### Are you willing to provide a PR?
No
|
1.0
|
Room appears twice in timeline - ### Steps to reproduce
1. Look at timeline, see room twice
### Outcome
#### What did you expect?
Room appears once in timeline
#### What happened instead?
Room appears twice in timeline
### Your phone model
Pixel 6a
### Operating system version
Graphene OS
### Application version and app store
Nightly
### Homeserver
matrix.org
### Will you send logs?
Yes
### Are you willing to provide a PR?
No
|
defect
|
room appears twice in timeline steps to reproduce look at timeline see room twice outcome what did you expect room appears once in timeline what happened instead room appears twice in timeline your phone model pixel operating system version graphene os application version and app store nightly homeserver matrix org will you send logs yes are you willing to provide a pr no
| 1
|
570,204
| 17,021,044,764
|
IssuesEvent
|
2021-07-02 19:07:43
|
kubernetes/minikube
|
https://api.github.com/repos/kubernetes/minikube
|
closed
|
preload job skips generating preload for v1.20.8
|
co/preload kind/regression priority/important-soon
|
while trying to bump k8s version I noticed it was slower on v1.20.8
and then I did
```
$mk start --kubernetes-version=v1.20.8 --download-only
😄 minikube v1.22.0-beta.0 on Darwin 11.4
✨ Using the docker driver based on existing profile
👍 Starting control plane node minikube in cluster minikube
🚜 Pulling base image ...
✅ Download complete!
```
and verified there was no preload downloaded
```
$ ls ~/.minikube/cache/preloaded-tarball/
preloaded-images-k8s-v11-v1.20.7-docker-overlay2-amd64.tar.lz4
preloaded-images-k8s-v11-v1.20.7-docker-overlay2-amd64.tar.lz4.checksum
```
on [last jenkins job](https://a646c87040050000000000000000001.proxy.googleprod.com/job/Preload%20Generation/519/console), jenkins "Thinks" the preload exists !
```
13:36:39 A preloaded tarball for k8s version v1.20.8 - runtime "docker" already exists, skipping generation.
13:36:39 I0701 17:36:39.203376 2919028 preload.go:111] Checking if preload exists for k8s version v1.20.8 and runtime containerd
13:36:39 A preloaded tarball for k8s version v1.20.8 - runtime "containerd" already exists, skipping generation.
```
but I checked manually in the bucket GCS bucket and there is no preload
|
1.0
|
preload job skips generating preload for v1.20.8 - while trying to bump k8s version I noticed it was slower on v1.20.8
and then I did
```
$mk start --kubernetes-version=v1.20.8 --download-only
😄 minikube v1.22.0-beta.0 on Darwin 11.4
✨ Using the docker driver based on existing profile
👍 Starting control plane node minikube in cluster minikube
🚜 Pulling base image ...
✅ Download complete!
```
and verified there was no preload downloaded
```
$ ls ~/.minikube/cache/preloaded-tarball/
preloaded-images-k8s-v11-v1.20.7-docker-overlay2-amd64.tar.lz4
preloaded-images-k8s-v11-v1.20.7-docker-overlay2-amd64.tar.lz4.checksum
```
on [last jenkins job](https://a646c87040050000000000000000001.proxy.googleprod.com/job/Preload%20Generation/519/console), jenkins "Thinks" the preload exists !
```
13:36:39 A preloaded tarball for k8s version v1.20.8 - runtime "docker" already exists, skipping generation.
13:36:39 I0701 17:36:39.203376 2919028 preload.go:111] Checking if preload exists for k8s version v1.20.8 and runtime containerd
13:36:39 A preloaded tarball for k8s version v1.20.8 - runtime "containerd" already exists, skipping generation.
```
but I checked manually in the bucket GCS bucket and there is no preload
|
non_defect
|
preload job skips generating preload for while trying to bump version i noticed it was slower on and then i did mk start kubernetes version download only 😄 minikube beta on darwin ✨ using the docker driver based on existing profile 👍 starting control plane node minikube in cluster minikube 🚜 pulling base image ✅ download complete and verified there was no preload downloaded ls minikube cache preloaded tarball preloaded images docker tar preloaded images docker tar checksum on jenkins thinks the preload exists a preloaded tarball for version runtime docker already exists skipping generation preload go checking if preload exists for version and runtime containerd a preloaded tarball for version runtime containerd already exists skipping generation but i checked manually in the bucket gcs bucket and there is no preload
| 0
|
312,619
| 26,873,404,019
|
IssuesEvent
|
2023-02-04 18:55:30
|
MPMG-DCC-UFMG/F01
|
https://api.github.com/repos/MPMG-DCC-UFMG/F01
|
closed
|
Teste de generalizacao para a tag Informações Institucionais - Leis Municipais - Raposos
|
generalization test development template - Betha (26) tag - Informações Institucionais subtag - Leis Municipais
|
DoD: Perform the generalization test of the validator for the Informações Institucionais - Leis Municipais tag for the Municipality of Raposos.
|
1.0
|
Teste de generalizacao para a tag Informações Institucionais - Leis Municipais - Raposos - DoD: Perform the generalization test of the validator for the Informações Institucionais - Leis Municipais tag for the Municipality of Raposos.
|
non_defect
|
teste de generalizacao para a tag informações institucionais leis municipais raposos dod realizar o teste de generalização do validador da tag informações institucionais leis municipais para o município de raposos
| 0
|
89,990
| 15,856,044,272
|
IssuesEvent
|
2021-04-08 01:23:03
|
Rossb0b/Swapi
|
https://api.github.com/repos/Rossb0b/Swapi
|
opened
|
CVE-2019-16769 (Medium) detected in serialize-javascript-1.5.0.tgz
|
security vulnerability
|
## CVE-2019-16769 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>serialize-javascript-1.5.0.tgz</b></p></summary>
<p>Serialize JavaScript to a superset of JSON that includes regular expressions and functions.</p>
<p>Library home page: <a href="https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.5.0.tgz">https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.5.0.tgz</a></p>
<p>Path to dependency file: /Swapi/package.json</p>
<p>Path to vulnerable library: Swapi/node_modules/serialize-javascript/package.json</p>
<p>
Dependency Hierarchy:
- build-angular-0.10.6.tgz (Root Library)
- copy-webpack-plugin-4.5.4.tgz
- :x: **serialize-javascript-1.5.0.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The serialize-javascript npm package before version 2.1.1 is vulnerable to Cross-site Scripting (XSS). It does not properly mitigate against unsafe characters in serialized regular expressions. This vulnerability is not affected on Node.js environment since Node.js's implementation of RegExp.prototype.toString() backslash-escapes all forward slashes in regular expressions. If serialized data of regular expression objects are used in an environment other than Node.js, it is affected by this vulnerability.
<p>Publish Date: 2019-12-05
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-16769>CVE-2019-16769</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16769">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16769</a></p>
<p>Release Date: 2019-12-05</p>
<p>Fix Resolution: v2.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-16769 (Medium) detected in serialize-javascript-1.5.0.tgz - ## CVE-2019-16769 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>serialize-javascript-1.5.0.tgz</b></p></summary>
<p>Serialize JavaScript to a superset of JSON that includes regular expressions and functions.</p>
<p>Library home page: <a href="https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.5.0.tgz">https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.5.0.tgz</a></p>
<p>Path to dependency file: /Swapi/package.json</p>
<p>Path to vulnerable library: Swapi/node_modules/serialize-javascript/package.json</p>
<p>
Dependency Hierarchy:
- build-angular-0.10.6.tgz (Root Library)
- copy-webpack-plugin-4.5.4.tgz
- :x: **serialize-javascript-1.5.0.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The serialize-javascript npm package before version 2.1.1 is vulnerable to Cross-site Scripting (XSS). It does not properly mitigate against unsafe characters in serialized regular expressions. This vulnerability is not affected on Node.js environment since Node.js's implementation of RegExp.prototype.toString() backslash-escapes all forward slashes in regular expressions. If serialized data of regular expression objects are used in an environment other than Node.js, it is affected by this vulnerability.
<p>Publish Date: 2019-12-05
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-16769>CVE-2019-16769</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16769">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16769</a></p>
<p>Release Date: 2019-12-05</p>
<p>Fix Resolution: v2.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_defect
|
cve medium detected in serialize javascript tgz cve medium severity vulnerability vulnerable library serialize javascript tgz serialize javascript to a superset of json that includes regular expressions and functions library home page a href path to dependency file swapi package json path to vulnerable library swapi node modules serialize javascript package json dependency hierarchy build angular tgz root library copy webpack plugin tgz x serialize javascript tgz vulnerable library vulnerability details the serialize javascript npm package before version is vulnerable to cross site scripting xss it does not properly mitigate against unsafe characters in serialized regular expressions this vulnerability is not affected on node js environment since node js s implementation of regexp prototype tostring backslash escapes all forward slashes in regular expressions if serialized data of regular expression objects are used in an environment other than node js it is affected by this vulnerability publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
52,411
| 13,224,719,496
|
IssuesEvent
|
2020-08-17 19:42:27
|
icecube-trac/tix4
|
https://api.github.com/repos/icecube-trac/tix4
|
opened
|
Wrong datatype in steering file for 20208 -> unable to weight via from_simprod (Trac #2178)
|
Incomplete Migration Migrated from Trac analysis defect
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/2178">https://code.icecube.wisc.edu/projects/icecube/ticket/2178</a>, reported by flauber</summary>
<p>
```json
{
"status": "closed",
"changetime": "2018-08-27T15:15:55",
"_ts": "1535382955013988",
"description": "Hi,\n\nanother problem with the steering file of 20208\n(http://simprod.icecube.wisc.edu/cgi-bin/simulation/cgi/cfg?dataset=20208)\n\nCORSIKA::eprimarymax\tstring\t1e6\nshould be either int or float, not string.\nIt being a string leads to this error in simprod:\n\n File \"/data/user/flauber/software/py2-v3.0.1/combo/stable/build/lib/icecube/weighting /weighting.py\", line 834, in from_simprod\n emax=steering['CORSIKA::eprimarymax']*I3Units.GeV,\n TypeError: can't multiply sequence by non-int of type 'float'\n\n",
"reporter": "flauber",
"cc": "",
"resolution": "fixed",
"time": "2018-07-30T13:16:33",
"component": "analysis",
"summary": "Wrong datatype in steering file for 20208 -> unable to weight via from_simprod",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
Wrong datatype in steering file for 20208 -> unable to weight via from_simprod (Trac #2178) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/2178">https://code.icecube.wisc.edu/projects/icecube/ticket/2178</a>, reported by flauber</summary>
<p>
```json
{
"status": "closed",
"changetime": "2018-08-27T15:15:55",
"_ts": "1535382955013988",
"description": "Hi,\n\nanother problem with the steering file of 20208\n(http://simprod.icecube.wisc.edu/cgi-bin/simulation/cgi/cfg?dataset=20208)\n\nCORSIKA::eprimarymax\tstring\t1e6\nshould be either int or float, not string.\nIt being a string leads to this error in simprod:\n\n File \"/data/user/flauber/software/py2-v3.0.1/combo/stable/build/lib/icecube/weighting /weighting.py\", line 834, in from_simprod\n emax=steering['CORSIKA::eprimarymax']*I3Units.GeV,\n TypeError: can't multiply sequence by non-int of type 'float'\n\n",
"reporter": "flauber",
"cc": "",
"resolution": "fixed",
"time": "2018-07-30T13:16:33",
"component": "analysis",
"summary": "Wrong datatype in steering file for 20208 -> unable to weight via from_simprod",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
|
defect
|
wrong datatype in steering file for unable to weight via from simprod trac migrated from json status closed changetime ts description hi n nanother problem with the steering file of n be either int or float not string nit being a string leads to this error in simprod n n file data user flauber software combo stable build lib icecube weighting weighting py line in from simprod n emax steering gev n typeerror can t multiply sequence by non int of type float n n reporter flauber cc resolution fixed time component analysis summary wrong datatype in steering file for unable to weight via from simprod priority normal keywords milestone owner type defect
| 1
|
64,683
| 18,794,892,033
|
IssuesEvent
|
2021-11-08 21:02:17
|
scipy/scipy
|
https://api.github.com/repos/scipy/scipy
|
closed
|
SciPy on Python 3.10 Windows does not include msvcp140.dll
|
defect Official binaries
|
When downloading the wheels from https://pypi.org/project/scipy/#files, `scipy-1.7.2-cp310-cp310-win_amd64.whl` does not include `.libs/msvcp140.dll`, while `scipy-1.7.2-cp39-cp39-win_amd64.whl` does include `msvcp140.dll`.
When looking at the build logs for [Windows Python 3.9](https://ci.appveyor.com/project/scipy/scipy-wheels/build/job/bjie6pkv2xm926op), we see the line that shows `msvcp140.dll` is copied into the wheel
```
[00:13:20] copying build\lib.win-amd64-3.9\scipy\.libs\msvcp140.dll -> build\bdist.win-amd64\wheel\.\scipy\.libs
```
In the build logs for [Windows Python 3.10](https://ci.appveyor.com/project/scipy/scipy-wheels/build/job/i4pv1vlslv8f1mu1) the line is missing. This is causing the import of `scipy.sparse` to fail.
Related to https://github.com/scipy/scipy/issues/7969
|
1.0
|
SciPy on Python 3.10 Windows does not include msvcp140.dll - When downloading the wheels from https://pypi.org/project/scipy/#files, `scipy-1.7.2-cp310-cp310-win_amd64.whl` does not include `.libs/msvcp140.dll`, while `scipy-1.7.2-cp39-cp39-win_amd64.whl` does include `msvcp140.dll`.
When looking at the build logs for [Windows Python 3.9](https://ci.appveyor.com/project/scipy/scipy-wheels/build/job/bjie6pkv2xm926op), we see the line that shows `msvcp140.dll` is copied into the wheel
```
[00:13:20] copying build\lib.win-amd64-3.9\scipy\.libs\msvcp140.dll -> build\bdist.win-amd64\wheel\.\scipy\.libs
```
In the build logs for [Windows Python 3.10](https://ci.appveyor.com/project/scipy/scipy-wheels/build/job/i4pv1vlslv8f1mu1) the line is missing. This is causing the import of `scipy.sparse` to fail.
Related to https://github.com/scipy/scipy/issues/7969
|
defect
|
scipy on python windows does not include dll when downloading the wheels from scipy win whl does not include libs dll while scipy win whl does include dll when looking at the build logs for we see the line that shows dll is copied into the wheel copying build lib win scipy libs dll build bdist win wheel scipy libs in the build logs for the line is missing this is causing the import of scipy sparse to fail related to
| 1
|
78,663
| 27,700,158,561
|
IssuesEvent
|
2023-03-14 07:17:08
|
vector-im/element-web
|
https://api.github.com/repos/vector-im/element-web
|
closed
|
Adding a phone number fails
|
T-Defect
|
### Steps to reproduce
1. Add a US phone number
2. Adding the phone number will fail

### Outcome
#### What did you expect?
I should be able to add a phone number to my account.
#### What happened instead?
I cannot add a phone number to my account.
### Operating system
Fedora Linux 37
### Application version
Element version: 1.11.24 Olm version: 3.2.12
### How did you install the app?
https://flathub.org/apps/details/im.riot.Riot
### Homeserver
matrix.org
### Will you send logs?
Yes
|
1.0
|
Adding a phone number fails - ### Steps to reproduce
1. Add a US phone number
2. Adding the phone number will fail

### Outcome
#### What did you expect?
I should be able to add a phone number to my account.
#### What happened instead?
I cannot add a phone number to my account.
### Operating system
Fedora Linux 37
### Application version
Element version: 1.11.24 Olm version: 3.2.12
### How did you install the app?
https://flathub.org/apps/details/im.riot.Riot
### Homeserver
matrix.org
### Will you send logs?
Yes
|
defect
|
adding a phone number fails steps to reproduce add a us phone number adding the phone number will fail outcome what did you expect i should be able to add a phone number to my account what happened instead i cannot add a phone number to my account operating system fedora linux application version element version olm version how did you install the app homeserver matrix org will you send logs yes
| 1
|
40,729
| 5,257,617,785
|
IssuesEvent
|
2017-02-02 21:02:23
|
blockstack/designs
|
https://api.github.com/repos/blockstack/designs
|
closed
|
inverse color of ovals, rectangles and lines atlas illustration
|
design production
|
Create inverse version of the Blockstack Atlas Illustration.
|
1.0
|
inverse color of ovals, rectangles and lines atlas illustration - Create inverse version of the Blockstack Atlas Illustration.
|
non_defect
|
inverse color of ovals rectangles and lines atlas illustration create inverse version of the blockstack atlas illustration
| 0
|
224,859
| 17,778,699,433
|
IssuesEvent
|
2021-08-30 23:26:28
|
microsoft/vscode
|
https://api.github.com/repos/microsoft/vscode
|
closed
|
Flaky test: vscode - untitled automatic language detection
|
upstream integration-test-failure web
|
```
1 failing
1) vscode - untitled automatic language detection
test automatic language detection works:
Error: asPromise TIMEOUT reached
at Timeout._onTimeout (extensions/vscode-api-tests/src/utils.ts:130:11)
at listOnTimeout (internal/timers.js:554:17)
```
https://dev.azure.com/monacotools/Monaco/_build/results?buildId=131883&view=logs&j=3792f238-f35e-5f82-0dbc-272432d9a0fb&t=ff86458e-371d-5daa-d7a2-208b9a858585&l=378
|
1.0
|
Flaky test: vscode - untitled automatic language detection - ```
1 failing
1) vscode - untitled automatic language detection
test automatic language detection works:
Error: asPromise TIMEOUT reached
at Timeout._onTimeout (extensions/vscode-api-tests/src/utils.ts:130:11)
at listOnTimeout (internal/timers.js:554:17)
```
https://dev.azure.com/monacotools/Monaco/_build/results?buildId=131883&view=logs&j=3792f238-f35e-5f82-0dbc-272432d9a0fb&t=ff86458e-371d-5daa-d7a2-208b9a858585&l=378
|
non_defect
|
flaky test vscode untitled automatic language detection failing vscode untitled automatic language detection test automatic language detection works error aspromise timeout reached at timeout ontimeout extensions vscode api tests src utils ts at listontimeout internal timers js
| 0
|
132,728
| 18,268,859,764
|
IssuesEvent
|
2021-10-04 11:43:16
|
artsking/linux-3.0.35
|
https://api.github.com/repos/artsking/linux-3.0.35
|
opened
|
CVE-2014-4655 (Medium) detected in linux-stable-rtv3.8.6
|
security vulnerability
|
## CVE-2014-4655 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv3.8.6</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/artsking/linux-3.0.35/commit/5992fa81c6ac1b4e9db13f5408d914525c5b7875">5992fa81c6ac1b4e9db13f5408d914525c5b7875</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The snd_ctl_elem_add function in sound/core/control.c in the ALSA control implementation in the Linux kernel before 3.15.2 does not properly maintain the user_ctl_count value, which allows local users to cause a denial of service (integer overflow and limit bypass) by leveraging /dev/snd/controlCX access for a large number of SNDRV_CTL_IOCTL_ELEM_REPLACE ioctl calls.
<p>Publish Date: 2014-07-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2014-4655>CVE-2014-4655</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2014-4655">https://nvd.nist.gov/vuln/detail/CVE-2014-4655</a></p>
<p>Release Date: 2014-07-03</p>
<p>Fix Resolution: 3.15.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2014-4655 (Medium) detected in linux-stable-rtv3.8.6 - ## CVE-2014-4655 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv3.8.6</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/artsking/linux-3.0.35/commit/5992fa81c6ac1b4e9db13f5408d914525c5b7875">5992fa81c6ac1b4e9db13f5408d914525c5b7875</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The snd_ctl_elem_add function in sound/core/control.c in the ALSA control implementation in the Linux kernel before 3.15.2 does not properly maintain the user_ctl_count value, which allows local users to cause a denial of service (integer overflow and limit bypass) by leveraging /dev/snd/controlCX access for a large number of SNDRV_CTL_IOCTL_ELEM_REPLACE ioctl calls.
<p>Publish Date: 2014-07-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2014-4655>CVE-2014-4655</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2014-4655">https://nvd.nist.gov/vuln/detail/CVE-2014-4655</a></p>
<p>Release Date: 2014-07-03</p>
<p>Fix Resolution: 3.15.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_defect
|
cve medium detected in linux stable cve medium severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in head commit a href found in base branch master vulnerable source files vulnerability details the snd ctl elem add function in sound core control c in the alsa control implementation in the linux kernel before does not properly maintain the user ctl count value which allows local users to cause a denial of service integer overflow and limit bypass by leveraging dev snd controlcx access for a large number of sndrv ctl ioctl elem replace ioctl calls publish date url a href cvss score details base score metrics exploitability metrics attack vector n a attack complexity n a privileges required n a user interaction n a scope n a impact metrics confidentiality impact n a integrity impact n a availability impact n a for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
15,578
| 3,476,159,835
|
IssuesEvent
|
2015-12-26 15:06:59
|
websharks/comment-mail
|
https://api.github.com/repos/websharks/comment-mail
|
opened
|
Queued (Pending) Notifications are never sent
|
bug needs research needs testing
|
I was testing Comment Mail Lite v151224 after importing StCR subscriptions when I tried posting two new replies to existing comments with subscriptions (i.e., I posted new replies to comments that had StCR subscriptions, which were imported into Comment Mail). The new Comment Mail subscriptions were created fine, and were connected to the appropriate imported StCR subscription, however the notifications for those new replied were never sent.
Looking at the Mail Queue, I noticed the two notifications were sitting there, unsent:

Notice the time on them... it had been ~45 minutes since the notifications were created and they still have not been sent. Here's what I tried to get the email queue to be processed:
- Tried manually running the `_cron_comment_mail_queue_processor` event
- Tried disabling the two email-related plugins that I had installed (**WP Mail SMTP** and **Email Log**) and then manually running the `_cron_comment_mail_queue_processor` event.
- Tried disabling/enabling Comment Mail
- Tried disabling Mail Queue Processing in **Comment Mail → Config. Options → Enable/Disable** and then enabling it again.
None of these things helped. The queued notifications remained queued and were not sent out.
|
1.0
|
Queued (Pending) Notifications are never sent - I was testing Comment Mail Lite v151224 after importing StCR subscriptions when I tried posting two new replies to existing comments with subscriptions (i.e., I posted new replies to comments that had StCR subscriptions, which were imported into Comment Mail). The new Comment Mail subscriptions were created fine, and were connected to the appropriate imported StCR subscription, however the notifications for those new replied were never sent.
Looking at the Mail Queue, I noticed the two notifications were sitting there, unsent:

Notice the time on them... it had been ~45 minutes since the notifications were created and they still have not been sent. Here's what I tried to get the email queue to be processed:
- Tried manually running the `_cron_comment_mail_queue_processor` event
- Tried disabling the two email-related plugins that I had installed (**WP Mail SMTP** and **Email Log**) and then manually running the `_cron_comment_mail_queue_processor` event.
- Tried disabling/enabling Comment Mail
- Tried disabling Mail Queue Processing in **Comment Mail → Config. Options → Enable/Disable** and then enabling it again.
None of these things helped. The queued notifications remained queued and were not sent out.
|
non_defect
|
queued pending notifications are never sent i was testing comment mail lite after importing stcr subscriptions when i tried posting two new replies to existing comments with subscriptions i e i posted new replies to comments that had stcr subscriptions which were imported into comment mail the new comment mail subscriptions were created fine and were connected to the appropriate imported stcr subscription however the notifications for those new replied were never sent looking at the mail queue i noticed the two notifications were sitting there unsent notice the time on them it had been minutes since the notifications were created and they still have not been sent here s what i tried to get the email queue to be processed tried manually running the cron comment mail queue processor event tried disabling the two email related plugins that i had installed wp mail smtp and email log and then manually running the cron comment mail queue processor event tried disabling enabling comment mail tried disabling mail queue processing in comment mail → config options → enable disable and then enabling it again none of these things helped the queued notifications remained queued and were not sent out
| 0
|
52,048
| 13,211,373,266
|
IssuesEvent
|
2020-08-15 22:40:19
|
icecube-trac/tix4
|
https://api.github.com/repos/icecube-trac/tix4
|
opened
|
[recclasses] Non-offline dependencies (Trac #1567)
|
Incomplete Migration Migrated from Trac combo reconstruction defect
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1567">https://code.icecube.wisc.edu/projects/icecube/ticket/1567</a>, reported by olivasand owned by hdembinski</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"_ts": "1550067117911749",
"description": "Currently recclasses depends on portia (r142454) and ophelia(r142462). This is breakage that needs to be fixed. One of the main reasons for the creation of the project was to provide a project with minimal dependencies that people could include into any meta-project. This project therefore can't have any dependencies outside of offline-software.\n\nCan we remove those dependencies and get whatever fixes into portia and ophelia that are needed so that their tests pass on all platforms?\n\n",
"reporter": "olivas",
"cc": "",
"resolution": "fixed",
"time": "2016-02-25T16:34:10",
"component": "combo reconstruction",
"summary": "[recclasses] Non-offline dependencies",
"priority": "blocker",
"keywords": "",
"milestone": "",
"owner": "hdembinski",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
[recclasses] Non-offline dependencies (Trac #1567) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1567">https://code.icecube.wisc.edu/projects/icecube/ticket/1567</a>, reported by olivasand owned by hdembinski</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"_ts": "1550067117911749",
"description": "Currently recclasses depends on portia (r142454) and ophelia(r142462). This is breakage that needs to be fixed. One of the main reasons for the creation of the project was to provide a project with minimal dependencies that people could include into any meta-project. This project therefore can't have any dependencies outside of offline-software.\n\nCan we remove those dependencies and get whatever fixes into portia and ophelia that are needed so that their tests pass on all platforms?\n\n",
"reporter": "olivas",
"cc": "",
"resolution": "fixed",
"time": "2016-02-25T16:34:10",
"component": "combo reconstruction",
"summary": "[recclasses] Non-offline dependencies",
"priority": "blocker",
"keywords": "",
"milestone": "",
"owner": "hdembinski",
"type": "defect"
}
```
</p>
</details>
|
defect
|
non offline dependencies trac migrated from json status closed changetime ts description currently recclasses depends on portia and ophelia this is breakage that needs to be fixed one of the main reasons for the creation of the project was to provide a project with minimal dependencies that people could include into any meta project this project therefore can t have any dependencies outside of offline software n ncan we remove those dependencies and get whatever fixes into portia and ophelia that are needed so that their tests pass on all platforms n n reporter olivas cc resolution fixed time component combo reconstruction summary non offline dependencies priority blocker keywords milestone owner hdembinski type defect
| 1
|
33,372
| 9,106,305,539
|
IssuesEvent
|
2019-02-20 23:22:31
|
dotnet/coreclr
|
https://api.github.com/repos/dotnet/coreclr
|
closed
|
ARM legs are failing in CI in release/2.0.0
|
area-Build blocking-clean-ci blocking-official-build
|
It looks like these have never passed?
```
15:54:41 BUILDTEST: Commencing build of native test components for arm/Release
15:54:41 BUILDTEST: Using environment: "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\Tools\\..\..\VC\Auxiliary\Build\vcvarsall.bat" x86_arm
15:54:41 **********************************************************************
15:54:41 ** Visual Studio 2017 Developer Command Prompt v15.0.26730.16
15:54:41 ** Copyright (c) 2017 Microsoft Corporation
15:54:41 **********************************************************************
15:54:42 The input line is too long.
15:54:42 :export_x86
15:54:42 was unexpected at this time.
15:54:42
```
https://github.com/dotnet/coreclr/pull/15686
I don't see a line like `:export_x86` in the vcvars related batch files.
@RussKeldorph who sponsors the ARM legs?
|
2.0
|
ARM legs are failing in CI in release/2.0.0 - It looks like these have never passed?
```
15:54:41 BUILDTEST: Commencing build of native test components for arm/Release
15:54:41 BUILDTEST: Using environment: "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\Tools\\..\..\VC\Auxiliary\Build\vcvarsall.bat" x86_arm
15:54:41 **********************************************************************
15:54:41 ** Visual Studio 2017 Developer Command Prompt v15.0.26730.16
15:54:41 ** Copyright (c) 2017 Microsoft Corporation
15:54:41 **********************************************************************
15:54:42 The input line is too long.
15:54:42 :export_x86
15:54:42 was unexpected at this time.
15:54:42
```
https://github.com/dotnet/coreclr/pull/15686
I don't see a line like `:export_x86` in the vcvars related batch files.
@RussKeldorph who sponsors the ARM legs?
|
non_defect
|
arm legs are failing in ci in release it looks like these have never passed buildtest commencing build of native test components for arm release buildtest using environment c program files microsoft visual studio enterprise tools vc auxiliary build vcvarsall bat arm visual studio developer command prompt copyright c microsoft corporation the input line is too long export was unexpected at this time i don t see a line like export in the vcvars related batch files russkeldorph who sponsors the arm legs
| 0
|
35,572
| 7,781,135,492
|
IssuesEvent
|
2018-06-05 22:35:10
|
google/sanitizers
|
https://api.github.com/repos/google/sanitizers
|
closed
|
remove the limit of the number of ever existed threads in asan
|
Priority-Medium ProjectAddressSanitizer Status-Accepted Type-Defect
|
Originally reported on Google Code with ID 273
```
Today asan will die after 4M threads are created, because we keep some
small amount of metadata for every thread that ever lived.
We should eventually remove this limitation.
```
Reported by `konstantin.s.serebryany` on 2014-03-11 08:51:27
|
1.0
|
remove the limit of the number of ever existed threads in asan - Originally reported on Google Code with ID 273
```
Today asan will die after 4M threads are created, because we keep some
small amount of metadata for every thread that ever lived.
We should eventually remove this limitation.
```
Reported by `konstantin.s.serebryany` on 2014-03-11 08:51:27
|
defect
|
remove the limit of the number of ever existed threads in asan originally reported on google code with id today asan will die after threads are created because we keep some small amount of metadata for every thread that ever lived we should eventually remove this limitation reported by konstantin s serebryany on
| 1
|
55,520
| 14,531,078,663
|
IssuesEvent
|
2020-12-14 20:13:57
|
BOINC/boinc
|
https://api.github.com/repos/BOINC/boinc
|
closed
|
Wrong cast when reporting non-allowed rpc access
|
C: Client - Daemon E: 1 day P: Major T: Defect
|
For some time, reports from new installations of Linux show non-allowed access from ip addresses starting with 2. 2.x.x.x is typically a French address.
reference issue #3246. I could not find an issue like I put together here but I might have missed it.
In actuality, the 2 is the identifier AF_INET due to wrong cast.
In program gui_rpc_server.cpp function "show_connect_error" the following lines of code:
```
#ifdef _WIN32
sockaddr_in* sin = (sockaddr_in*)&s;
safe_strcpy(buf, inet_ntoa(sin->sin_addr));
#else
inet_ntop(s.ss_family, &s, buf, 256);
#endif
```
The reference s.ss_family causes the "2" to be picked up. The actual ip address starts at s.ss_data which is not defined AFAICT
I ran a test from 192.168.1.241 and attempted to access the client. The remote_host.cfg file has to be empty as that causes the problem somehow.
=====================bug================
18-Oct-2019 14:03:49 [---] Setting up GUI RPC socket
18-Oct-2019 14:03:49 [---] Checking presence of 0 project files
18-Oct-2019 14:03:49 [---] This computer is not attached to any projects
18-Oct-2019 14:03:49 Initialization completed
18-Oct-2019 14:04:14 [---] GUI RPC request from non-allowed address 2.0.233.131
(note that the above can change but the 2.0 will not as that is the value AF_INET)
If you dump out the hex values of the buffer "buf" one gets the following
18-Oct-2019 15:36:37 [---]
debug2 02 00 e9 83 c0 a8 01 f1 00 00 00 00 00 00 00 00
----------------^^ this is 233
------------------^^ this is 131
the 2.0.233.131 came from the buffer "buf"
however, the computer that triggered the access was 192.168.1.241 and that value is also in that buffer but not at the location the coder expected
debug2 02 00 e9 83 c0 a8 01 f1 00 00 00 00 00 00 00 00
----------------------^^ this is 192
--------------------------^^ this is 168
----------------------------^^ this is the 1
-------------------------------^^ this is the 241
A "fix" is to use the same code as the win32. I took a guess and changed
```
#ifdef _WIN32
sockaddr_in* sin = (sockaddr_in*)&s;
safe_strcpy(buf, inet_ntoa(sin->sin_addr));
#else
inet_ntop(s.ss_family, &s, buf, 256);
#endif
```
to
```
#ifdef _WIN32
sockaddr_in* sin = (sockaddr_in*)&s;
safe_strcpy(buf, inet_ntoa(sin->sin_addr));
#else
sockaddr_in* sin = (sockaddr_in*)&s;
safe_strcpy(buf, inet_ntoa(sin->sin_addr));
#endif
```
the above ran correctly
File tempfix.png shows the correct ip address and the hex dump of the buffer
file tempfix1.png shows the code that dumped out the buffer


[EDIT] Not necessary for remote_access.cfg to be actually empty or null for the bug to show up.
|
1.0
|
Wrong cast when reporting non-allowed rpc access - For some time, reports from new installations of Linux show non-allowed access from ip addresses starting with 2. 2.x.x.x is typically a French address.
reference issue #3246. I could not find an issue like I put together here but I might have missed it.
In actuality, the 2 is the identifier AF_INET due to wrong cast.
In program gui_rpc_server.cpp function "show_connect_error" the following lines of code:
```
#ifdef _WIN32
sockaddr_in* sin = (sockaddr_in*)&s;
safe_strcpy(buf, inet_ntoa(sin->sin_addr));
#else
inet_ntop(s.ss_family, &s, buf, 256);
#endif
```
The reference s.ss_family causes the "2" to be picked up. The actual ip address starts at s.ss_data which is not defined AFAICT
I ran a test from 192.168.1.241 and attempted to access the client. The remote_host.cfg file has to be empty as that causes the problem somehow.
=====================bug================
18-Oct-2019 14:03:49 [---] Setting up GUI RPC socket
18-Oct-2019 14:03:49 [---] Checking presence of 0 project files
18-Oct-2019 14:03:49 [---] This computer is not attached to any projects
18-Oct-2019 14:03:49 Initialization completed
18-Oct-2019 14:04:14 [---] GUI RPC request from non-allowed address 2.0.233.131
(note that the above can change but the 2.0 will not as that is the value AF_INET)
If you dump out the hex values of the buffer "buf" one gets the following
18-Oct-2019 15:36:37 [---]
debug2 02 00 e9 83 c0 a8 01 f1 00 00 00 00 00 00 00 00
----------------^^ this is 233
------------------^^ this is 131
the 2.0.233.131 came from the buffer "buf"
however, the computer that triggered the access was 192.168.1.241 and that value is also in that buffer but not at the location the coder expected
debug2 02 00 e9 83 c0 a8 01 f1 00 00 00 00 00 00 00 00
----------------------^^ this is 192
--------------------------^^ this is 168
----------------------------^^ this is the 1
-------------------------------^^ this is the 241
A "fix" is to use the same code as the win32. I took a guess and changed
```
#ifdef _WIN32
sockaddr_in* sin = (sockaddr_in*)&s;
safe_strcpy(buf, inet_ntoa(sin->sin_addr));
#else
inet_ntop(s.ss_family, &s, buf, 256);
#endif
```
to
```
#ifdef _WIN32
sockaddr_in* sin = (sockaddr_in*)&s;
safe_strcpy(buf, inet_ntoa(sin->sin_addr));
#else
sockaddr_in* sin = (sockaddr_in*)&s;
safe_strcpy(buf, inet_ntoa(sin->sin_addr));
#endif
```
the above ran correctly
File tempfix.png shows the correct ip address and the hex dump of the buffer
file tempfix1.png shows the code that dumped out the buffer


[EDIT] Not necessary for remote_access.cfg to be actually empty or null for the bug to show up.
|
defect
|
wrong cast when reporting non allowed rpc access for some time reports from new installations of linux show non allowed access from ip addresses starting with x x x is typically a french address reference issue i could not find an issue like i put together here but i might have missed it in actuality the is the identifier af inet due to wrong cast in program gui rpc server cpp function show connect error the following lines of code ifdef sockaddr in sin sockaddr in s safe strcpy buf inet ntoa sin sin addr else inet ntop s ss family s buf endif the reference s ss family causes the to be picked up the actual ip address starts at s ss data which is not defined afaict i ran a test from and attempted to access the client the remote host cfg file has to be empty as that causes the problem somehow bug oct setting up gui rpc socket oct checking presence of project files oct this computer is not attached to any projects oct initialization completed oct gui rpc request from non allowed address note that the above can change but the will not as that is the value af inet if you dump out the hex values of the buffer buf one gets the following oct this is this is the came from the buffer buf however the computer that triggered the access was and that value is also in that buffer but not at the location the coder expected this is this is this is the this is the a fix is to use the same code as the i took a guess and changed ifdef sockaddr in sin sockaddr in s safe strcpy buf inet ntoa sin sin addr else inet ntop s ss family s buf endif to ifdef sockaddr in sin sockaddr in s safe strcpy buf inet ntoa sin sin addr else sockaddr in sin sockaddr in s safe strcpy buf inet ntoa sin sin addr endif the above ran correctly file tempfix png shows the correct ip address and the hex dump of the buffer file png shows the code that dumped out the buffer not necessary for remote access cfg to be actually empty or null for the bug to show up
| 1
|
32,415
| 6,777,127,827
|
IssuesEvent
|
2017-10-27 20:43:48
|
Stibbons/dopplerr
|
https://api.github.com/repos/Stibbons/dopplerr
|
closed
|
Docker image not working
|
Type: Defect
|
Docker image is not working at all. Process is running in the container but nothing is listening.
PID USER TIME COMMAND
1 root 0:00 s6-svscan -t0 /var/run/s6/services
32 root 0:00 s6-supervise s6-fdholderd
194 root 0:00 s6-supervise dopplerr
762 root 0:00 /bin/bash
1941 abc 0:00 {dopplerr} /usr/bin/python3.6 /usr/bin/dopplerr --no-colo
1946 root 0:00 ps aux
Even when using the --net=host option all is quiet on the network front.
lsof -i
1 /bin/s6-svscan /dev/null
1 /bin/s6-svscan pipe:[22504200]
1 /bin/s6-svscan pipe:[22504201]
1 /bin/s6-svscan anon_inode:[signalfd]
1 /bin/s6-svscan /var/run/s6/services/.s6-svscan/control
1 /bin/s6-svscan /var/run/s6/services/.s6-svscan/control
32 /bin/s6-supervise /dev/null
32 /bin/s6-supervise pipe:[22504200]
32 /bin/s6-supervise pipe:[22504201]
32 /bin/s6-supervise anon_inode:[signalfd]
32 /bin/s6-supervise /var/run/s6/services/s6-fdholderd/supervise/control
32 /bin/s6-supervise /var/run/s6/services/s6-fdholderd/supervise/control
194 /bin/s6-supervise /dev/null
194 /bin/s6-supervise pipe:[22504200]
194 /bin/s6-supervise pipe:[22504201]
194 /bin/s6-supervise anon_inode:[signalfd]
194 /bin/s6-supervise /var/run/s6/services/dopplerr/supervise/control
194 /bin/s6-supervise /var/run/s6/services/dopplerr/supervise/control
762 /bin/bash /dev/pts/5
762 /bin/bash /dev/pts/5
762 /bin/bash /dev/pts/5
762 /bin/bash /dev/pts/5
Furthermore, no logs to be found on the running docker image, -e SUBDLSRC_VERBOSE=1 is set. Even when a config volume is mounted, all is empty.
|
1.0
|
Docker image not working - Docker image is not working at all. Process is running in the container but nothing is listening.
PID USER TIME COMMAND
1 root 0:00 s6-svscan -t0 /var/run/s6/services
32 root 0:00 s6-supervise s6-fdholderd
194 root 0:00 s6-supervise dopplerr
762 root 0:00 /bin/bash
1941 abc 0:00 {dopplerr} /usr/bin/python3.6 /usr/bin/dopplerr --no-colo
1946 root 0:00 ps aux
Even when using the --net=host option all is quiet on the network front.
lsof -i
1 /bin/s6-svscan /dev/null
1 /bin/s6-svscan pipe:[22504200]
1 /bin/s6-svscan pipe:[22504201]
1 /bin/s6-svscan anon_inode:[signalfd]
1 /bin/s6-svscan /var/run/s6/services/.s6-svscan/control
1 /bin/s6-svscan /var/run/s6/services/.s6-svscan/control
32 /bin/s6-supervise /dev/null
32 /bin/s6-supervise pipe:[22504200]
32 /bin/s6-supervise pipe:[22504201]
32 /bin/s6-supervise anon_inode:[signalfd]
32 /bin/s6-supervise /var/run/s6/services/s6-fdholderd/supervise/control
32 /bin/s6-supervise /var/run/s6/services/s6-fdholderd/supervise/control
194 /bin/s6-supervise /dev/null
194 /bin/s6-supervise pipe:[22504200]
194 /bin/s6-supervise pipe:[22504201]
194 /bin/s6-supervise anon_inode:[signalfd]
194 /bin/s6-supervise /var/run/s6/services/dopplerr/supervise/control
194 /bin/s6-supervise /var/run/s6/services/dopplerr/supervise/control
762 /bin/bash /dev/pts/5
762 /bin/bash /dev/pts/5
762 /bin/bash /dev/pts/5
762 /bin/bash /dev/pts/5
Furthermore, no logs to be found on the running docker image, -e SUBDLSRC_VERBOSE=1 is set. Even when a config volume is mounted, all is empty.
|
defect
|
docker image not working docker image is not working at all process is running in the container but nothing is listening pid user time command root svscan var run services root supervise fdholderd root supervise dopplerr root bin bash abc dopplerr usr bin usr bin dopplerr no colo root ps aux even when using the net host option all is quiet on the network front lsof i bin svscan dev null bin svscan pipe bin svscan pipe bin svscan anon inode bin svscan var run services svscan control bin svscan var run services svscan control bin supervise dev null bin supervise pipe bin supervise pipe bin supervise anon inode bin supervise var run services fdholderd supervise control bin supervise var run services fdholderd supervise control bin supervise dev null bin supervise pipe bin supervise pipe bin supervise anon inode bin supervise var run services dopplerr supervise control bin supervise var run services dopplerr supervise control bin bash dev pts bin bash dev pts bin bash dev pts bin bash dev pts furthermore no logs to be found on the running docker image e subdlsrc verbose is set even when a config volume is mounted all is empty
| 1
|
31,357
| 14,940,691,919
|
IssuesEvent
|
2021-01-25 18:38:51
|
hibernate/hibernate-reactive
|
https://api.github.com/repos/hibernate/hibernate-reactive
|
closed
|
Performance: pool() access from SessionFactoryImpl(s) to not invoke the ServiceRegistry
|
performance
|
Both `pool()` implementations in the Stage & Mutiny versions of the ReactiveSessionFactoryImpl should have a faster access the the pool instance.
Accessing Services from the `ServiceRegistry` is quite flexible, but should only be done during initialization, or to handle occasional / non-hot needs.
|
True
|
Performance: pool() access from SessionFactoryImpl(s) to not invoke the ServiceRegistry - Both `pool()` implementations in the Stage & Mutiny versions of the ReactiveSessionFactoryImpl should have faster access to the pool instance.
Accessing services from the `ServiceRegistry` is quite flexible, but should only be done during initialization, or to handle occasional / non-hot needs.
|
non_defect
|
performance pool access from sessionfactoryimpl s to not invoke the serviceregistry both pool implementations in the stage mutiny versions of the reactivesessionfactoryimpl should have faster access to the pool instance accessing services from the serviceregistry is quite flexible but should only be done during initialization or to handle occasional non hot needs
| 0
|
56,527
| 8,081,423,873
|
IssuesEvent
|
2018-08-08 03:25:05
|
servo/servo
|
https://api.github.com/repos/servo/servo
|
closed
|
Doc build is broken with latest rust update
|
A-documentation A-infrastructure
|
```
error: `[cfg]` cannot be resolved, ignoring it...
--> /home/servo/.cargo/registry/src/github.com-1ecc6299db9ec823/cfg-if-0.1.2/src/lib.rs:1:28
|
1 | //! A macro for defining #[cfg] if-else statements.
| ^^^ cannot be resolved, ignoring
|
note: lint level defined here
--> /home/servo/.cargo/registry/src/github.com-1ecc6299db9ec823/net2-0.2.29/src/lib.rs:42:23
|
42| #![deny(missing_docs, warnings)]
| ^^^^^^^^
= note: #[deny(intra_doc_link_resolution_failure)] implied by #[deny(warnings)]
= help: to escape `[` and `]` characters, just add '\' before them like `\[` or `\]`
error: `[cfg]` cannot be resolved, ignoring it...
--> /home/servo/.cargo/registry/src/github.com-1ecc6299db9ec823/cfg-if-0.1.2/src/lib.rs:7:59
|
7 | //! This allows you to conveniently provide a long list #[cfg]'d blocks of code
| ^^^ cannot be resolved, ignoring
|
= help: to escape `[` and `]` characters, just add '\' before them like `\[` or `\]`
error: Could not document `net2`.
```
|
1.0
|
Doc build is broken with latest rust update - ```
error: `[cfg]` cannot be resolved, ignoring it...
--> /home/servo/.cargo/registry/src/github.com-1ecc6299db9ec823/cfg-if-0.1.2/src/lib.rs:1:28
|
1 | //! A macro for defining #[cfg] if-else statements.
| ^^^ cannot be resolved, ignoring
|
note: lint level defined here
--> /home/servo/.cargo/registry/src/github.com-1ecc6299db9ec823/net2-0.2.29/src/lib.rs:42:23
|
42| #![deny(missing_docs, warnings)]
| ^^^^^^^^
= note: #[deny(intra_doc_link_resolution_failure)] implied by #[deny(warnings)]
= help: to escape `[` and `]` characters, just add '\' before them like `\[` or `\]`
error: `[cfg]` cannot be resolved, ignoring it...
--> /home/servo/.cargo/registry/src/github.com-1ecc6299db9ec823/cfg-if-0.1.2/src/lib.rs:7:59
|
7 | //! This allows you to conveniently provide a long list #[cfg]'d blocks of code
| ^^^ cannot be resolved, ignoring
|
= help: to escape `[` and `]` characters, just add '\' before them like `\[` or `\]`
error: Could not document `net2`.
```
|
non_defect
|
doc build is broken with latest rust update error cannot be resolved ignoring it home servo cargo registry src github com cfg if src lib rs a macro for defining if else statements cannot be resolved ignoring note lint level defined here home servo cargo registry src github com src lib rs note implied by help to escape characters just add before them like error cannot be resolved ignoring it home servo cargo registry src github com cfg if src lib rs this allows you to conveniently provide a long list d blocks of code cannot be resolved ignoring help to escape characters just add before them like error could not document
| 0
|
2,292
| 2,603,992,449
|
IssuesEvent
|
2015-02-24 19:06:58
|
chrsmith/nishazi6
|
https://api.github.com/repos/chrsmith/nishazi6
|
opened
|
沈阳疱疹哪个医院最好
|
auto-migrated Priority-Medium Type-Defect
|
```
沈阳疱疹哪个医院最好〓沈陽軍區政治部醫院性病〓TEL:024-3
1023308〓成立于1946年,68年專注于性傳播疾病的研究和治療。�
��于沈陽市沈河區二緯路32號。是一所與新中國同建立共輝煌�
��歷史悠久、設備精良、技術權威、專家云集,是預防、保健
、醫療、科研康復為一體的綜合性醫院。是國家首批公立甲��
�部隊醫院、全國首批醫療規范定點單位,是第四軍醫大學、�
��南大學等知名高等院校的教學醫院。曾被中國人民解放軍空
軍后勤部衛生部評為衛生工作先進單位,先后兩次榮立集體��
�等功。
```
-----
Original issue reported on code.google.com by `q964105...@gmail.com` on 4 Jun 2014 at 8:38
|
1.0
|
沈阳疱疹哪个医院最好 - ```
沈阳疱疹哪个医院最好〓沈陽軍區政治部醫院性病〓TEL:024-3
1023308〓成立于1946年,68年專注于性傳播疾病的研究和治療。�
��于沈陽市沈河區二緯路32號。是一所與新中國同建立共輝煌�
��歷史悠久、設備精良、技術權威、專家云集,是預防、保健
、醫療、科研康復為一體的綜合性醫院。是國家首批公立甲��
�部隊醫院、全國首批醫療規范定點單位,是第四軍醫大學、�
��南大學等知名高等院校的教學醫院。曾被中國人民解放軍空
軍后勤部衛生部評為衛生工作先進單位,先后兩次榮立集體��
�等功。
```
-----
Original issue reported on code.google.com by `q964105...@gmail.com` on 4 Jun 2014 at 8:38
|
defect
|
沈阳疱疹哪个医院最好 沈阳疱疹哪个医院最好〓沈陽軍區政治部醫院性病〓tel: 〓 , 。� �� 。是一所與新中國同建立共輝煌� ��歷史悠久、設備精良、技術權威、專家云集,是預防、保健 、醫療、科研康復為一體的綜合性醫院。是國家首批公立甲�� �部隊醫院、全國首批醫療規范定點單位,是第四軍醫大學、� ��南大學等知名高等院校的教學醫院。曾被中國人民解放軍空 軍后勤部衛生部評為衛生工作先進單位,先后兩次榮立集體�� �等功。 original issue reported on code google com by gmail com on jun at
| 1
|
109,226
| 23,740,176,539
|
IssuesEvent
|
2022-08-31 11:44:51
|
sast-automation-dev/hackme-20
|
https://api.github.com/repos/sast-automation-dev/hackme-20
|
opened
|
Code Security Report: 15 high severity findings, 20 total findings
|
code security findings
|
# Code Security Report
**Latest Scan:** 2022-08-31 11:43am
**Total Findings:** 20
**Tested Project Files:** 44
**Detected Programming Languages:** 2
<!-- SAST-MANUAL-SCAN-START -->
- [ ] Check this box to manually trigger a scan
<!-- SAST-MANUAL-SCAN-END -->
## Language: JavaScript / Node.js
| Severity | CWE | Vulnerability Type | Count |
|-|-|-|-|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-89](https://cwe.mitre.org/data/definitions/89.html)|SQL Injection|1|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-94](https://cwe.mitre.org/data/definitions/94.html)|Code Injection|2|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-338](https://cwe.mitre.org/data/definitions/338.html)|Weak Pseudo-Random|2|
### Details
> The below list presents the 3 high vulnerability findings that need your attention. To view information on these findings, navigate to the [Mend SAST Application](https://dev.whitesourcesoftware.com/sast/#/scans/0101899f-a51b-4eec-a2f2-5b85392ef52a/details).
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>SQL Injection (CWE-89) : 1</summary>
#### Findings
<details>
<summary>javascripts/prototype.js:5068</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/prototype.js#L5063-L5068
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/prototype.js#L5057
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/prototype.js#L5068
</details>
</details>
</details>
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>Code Injection (CWE-94) : 2</summary>
#### Findings
<details>
<summary>javascripts/prototype.js:1631</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/prototype.js#L1626-L1631
</details>
<details>
<summary>javascripts/controls.js:786</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/controls.js#L781-L786
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/controls.js#L783
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/controls.js#L786
</details>
</details>
</details>
## Language: Ruby
| Severity | CWE | Vulnerability Type | Count |
|-|-|-|-|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-79](https://cwe.mitre.org/data/definitions/79.html)|Cross-Site Scripting|12|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-321](https://cwe.mitre.org/data/definitions/321.html)|Secret Key In Source|1|
|<img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low|[CWE-113](https://cwe.mitre.org/data/definitions/113.html)|HTTP Response Splitting|2|
### Details
> The below list presents the 12 high vulnerability findings that need your attention. To view information on these findings, navigate to the [Mend SAST Application](https://dev.whitesourcesoftware.com/sast/#/scans/0101899f-a51b-4eec-a2f2-5b85392ef52a/details).
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>Cross-Site Scripting (CWE-79) : 12</summary>
#### Findings
<details>
<summary>posts/index.html.erb:7</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L2-L7
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L32
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L7
</details>
</details>
<details>
<summary>posts/index.html.erb:7</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L2-L7
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L42
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L7
</details>
</details>
<details>
<summary>posts/show.html.erb:4</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/show.html.erb#L-1-L4
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L7
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/show.html.erb#L4
</details>
</details>
<details>
<summary>posts/index.html.erb:5</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L0-L5
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L7
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L5
</details>
</details>
<details>
<summary>posts/index.html.erb:5</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L0-L5
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L15
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L5
</details>
</details>
<details>
<summary>posts/index.html.erb:5</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L0-L5
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L19
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L5
</details>
</details>
<details>
<summary>posts/index.html.erb:5</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L0-L5
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L32
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L5
</details>
</details>
<details>
<summary>posts/show.html.erb:2</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/show.html.erb#L-3-L2
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L7
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/show.html.erb#L2
</details>
</details>
<details>
<summary>posts/index.html.erb:5</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L0-L5
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L42
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L5
</details>
</details>
<details>
<summary>posts/index.html.erb:7</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L2-L7
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L19
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L7
</details>
</details>
<details>
<summary>posts/index.html.erb:7</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L2-L7
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L15
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L7
</details>
</details>
<details>
<summary>posts/index.html.erb:7</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L2-L7
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L7
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L7
</details>
</details>
</details>
|
1.0
|
Code Security Report: 15 high severity findings, 20 total findings - # Code Security Report
**Latest Scan:** 2022-08-31 11:43am
**Total Findings:** 20
**Tested Project Files:** 44
**Detected Programming Languages:** 2
<!-- SAST-MANUAL-SCAN-START -->
- [ ] Check this box to manually trigger a scan
<!-- SAST-MANUAL-SCAN-END -->
## Language: JavaScript / Node.js
| Severity | CWE | Vulnerability Type | Count |
|-|-|-|-|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-89](https://cwe.mitre.org/data/definitions/89.html)|SQL Injection|1|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-94](https://cwe.mitre.org/data/definitions/94.html)|Code Injection|2|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-338](https://cwe.mitre.org/data/definitions/338.html)|Weak Pseudo-Random|2|
### Details
> The below list presents the 3 high vulnerability findings that need your attention. To view information on these findings, navigate to the [Mend SAST Application](https://dev.whitesourcesoftware.com/sast/#/scans/0101899f-a51b-4eec-a2f2-5b85392ef52a/details).
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>SQL Injection (CWE-89) : 1</summary>
#### Findings
<details>
<summary>javascripts/prototype.js:5068</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/prototype.js#L5063-L5068
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/prototype.js#L5057
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/prototype.js#L5068
</details>
</details>
</details>
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>Code Injection (CWE-94) : 2</summary>
#### Findings
<details>
<summary>javascripts/prototype.js:1631</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/prototype.js#L1626-L1631
</details>
<details>
<summary>javascripts/controls.js:786</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/controls.js#L781-L786
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/controls.js#L783
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/public/javascripts/controls.js#L786
</details>
</details>
</details>
## Language: Ruby
| Severity | CWE | Vulnerability Type | Count |
|-|-|-|-|
|<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-79](https://cwe.mitre.org/data/definitions/79.html)|Cross-Site Scripting|12|
|<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-321](https://cwe.mitre.org/data/definitions/321.html)|Secret Key In Source|1|
|<img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low|[CWE-113](https://cwe.mitre.org/data/definitions/113.html)|HTTP Response Splitting|2|
### Details
> The below list presents the 12 high vulnerability findings that need your attention. To view information on these findings, navigate to the [Mend SAST Application](https://dev.whitesourcesoftware.com/sast/#/scans/0101899f-a51b-4eec-a2f2-5b85392ef52a/details).
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>Cross-Site Scripting (CWE-79) : 12</summary>
#### Findings
<details>
<summary>posts/index.html.erb:7</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L2-L7
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L32
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L7
</details>
</details>
<details>
<summary>posts/index.html.erb:7</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L2-L7
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L42
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L7
</details>
</details>
<details>
<summary>posts/show.html.erb:4</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/show.html.erb#L-1-L4
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L7
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/show.html.erb#L4
</details>
</details>
<details>
<summary>posts/index.html.erb:5</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L0-L5
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L7
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L5
</details>
</details>
<details>
<summary>posts/index.html.erb:5</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L0-L5
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L15
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L5
</details>
</details>
<details>
<summary>posts/index.html.erb:5</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L0-L5
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L19
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L5
</details>
</details>
<details>
<summary>posts/index.html.erb:5</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L0-L5
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L32
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L5
</details>
</details>
<details>
<summary>posts/show.html.erb:2</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/show.html.erb#L-3-L2
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L7
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/show.html.erb#L2
</details>
</details>
<details>
<summary>posts/index.html.erb:5</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L0-L5
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L42
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L5
</details>
</details>
<details>
<summary>posts/index.html.erb:7</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L2-L7
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L19
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L7
</details>
</details>
<details>
<summary>posts/index.html.erb:7</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L2-L7
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L15
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L7
</details>
</details>
<details>
<summary>posts/index.html.erb:7</summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L2-L7
<details>
<summary> Trace </summary>
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/controllers/posts_controller.rb#L7
https://github.com/sast-automation-dev/hackme-20/blob/70f0018a31c21d24f16ccdf7be683e170d9821db/hackme-20/app/views/posts/index.html.erb#L7
</details>
</details>
</details>
|
non_defect
|
code security report high severity findings total findings code security report latest scan total findings tested project files detected programming languages check this box to manually trigger a scan language javascript node js severity cwe vulnerability type count high injection high injection medium pseudo random details the below list presents the high vulnerability findings that need your attention to view information on these findings navigate to the sql injection cwe findings javascripts prototype js trace code injection cwe findings javascripts prototype js javascripts controls js trace language ruby severity cwe vulnerability type count high scripting medium key in source low response splitting details the below list presents the high vulnerability findings that need your attention to view information on these findings navigate to the cross site scripting cwe findings posts index html erb trace posts index html erb trace posts show html erb trace posts index html erb trace posts index html erb trace posts index html erb trace posts index html erb trace posts show html erb trace posts index html erb trace posts index html erb trace posts index html erb trace posts index html erb trace
| 0
|
295,412
| 22,215,492,059
|
IssuesEvent
|
2022-06-08 00:52:52
|
Luizgsmkw/helpr
|
https://api.github.com/repos/Luizgsmkw/helpr
|
closed
|
Criar RestCrontoller para listagem de um cliente em cliente.service.
|
documentation enhancement
|
- [x] Create a Rest controller;
- [x] Create the client service;
- [x] Create methods for the RestController endpoint;
- [x] Wire up the client repositories;
- [x] Test the endpoint in Postman.
|
1.0
|
Create a RestController for listing a client in cliente.service. - - [x] Create a Rest controller;
- [x] Create the client service;
- [x] Create methods for the RestController endpoint;
- [x] Wire up the client repositories;
- [x] Test the endpoint in Postman.
|
non_defect
|
criar restcrontoller para listagem de um cliente em cliente service criar um controlador rest criar o serviço de cliente criar métodos para o endpoint de restcontroller vincular os repositórios de cliente testar endpoint no postman
| 0
|
11,435
| 2,651,545,003
|
IssuesEvent
|
2015-03-16 12:21:53
|
micheles/papers
|
https://api.github.com/repos/micheles/papers
|
closed
|
kwargs dict is not filled with kwargs
|
auto-migrated Priority-Medium Type-Defect
|
```
Hi Michele. Thanks for your great job!
Wrote a small decorator(django by default caches everything in 'default' cache):
def filesystem_cache(key_prefix, cache_time=None):
"""
Caches function based on key_prefix and function args/kwargs.
Stores function result in filesystem cache for a certain cache_time.
"""
if cache_time:
FILESYSTEM_CACHE_TIME = cache_time
else:
FILESYSTEM_CACHE_TIME = settings.CACHES['filesystem'].get('TIMEOUT')
@decorator
def wrapfunc(func, *args, **kwargs):
ignore_cache = kwargs.pop('_ignore_cache', False)
full_args = list()
full_args.extend(args)
for k, v in kwargs.items():
full_args.append('%s:%s' % (str(k), str(v)))
md5_args = md5_constructor(u':'.join([urlquote(var) for var in full_args]))
cache_key = 'template.cache.%s.%s' % (key_prefix, md5_args.hexdigest())
filesystem_cache = get_cache(FILESYSTEM_CACHE_NAME)
cached_value = filesystem_cache.get(cache_key)
if cached_value and not ignore_cache:
# if cached value exists - return it
return cached_value
result = func(*args, **kwargs)
filesystem_cache.set(cache_key, result, FILESYSTEM_CACHE_TIME)
return result
return wrapfunc
Everything works ok except that all args and kwargs for function fall into args
section :
In [2]: @filesystem_cache('bla')
...: def hello(a,b,c, d=10,e=12):
...: print a
...: print b
...: print c
...: print d
...: return e
In [3]: hello(5,6,7, d=11,e=18)
> /home/dev/imax/mws/src/imax_main/utils.py(29)wrapfunc()
-> md5_args = md5_constructor(u':'.join([urlquote(var) for var in full_args]))
(Pdb) l
24 full_args = list()
25 full_args.extend(args)
26 for k, v in kwargs.items():
27 full_args.append('%s:%s' % (str(k), str(v)))
28 import pdb; pdb.set_trace()
29 -> md5_args = md5_constructor(u':'.join([urlquote(var) for var in full_args]))
30 cache_key = 'template.cache.%s.%s' % (key_prefix, md5_args.hexdigest())
31 filesystem_cache = get_cache(FILESYSTEM_CACHE_NAME)
32 cached_value = filesystem_cache.get(cache_key)
33 if cached_value and not ignore_cache:
34 # if cached value exists - return it
(Pdb) args
func = <function hello at 0xa63f454>
args = (5, 6, 7, 11, 18)
kwargs = {}
(Pdb) kwargs
{}
Why d and e kwargs are in args array ? Thanks !
```
Original issue reported on code.google.com by `alecs....@gmail.com` on 13 Feb 2012 at 10:20
|
1.0
|
kwargs dict is not filled with kwargs - ```
Hi Michele. Thanks for your great job!
Wrote a small decorator(django by default caches everything in 'default' cache):
def filesystem_cache(key_prefix, cache_time=None):
"""
Caches function based on key_prefix and function args/kwargs.
Stores function result in filesystem cache for a certain cache_time.
"""
if cache_time:
FILESYSTEM_CACHE_TIME = cache_time
else:
FILESYSTEM_CACHE_TIME = settings.CACHES['filesystem'].get('TIMEOUT')
@decorator
def wrapfunc(func, *args, **kwargs):
ignore_cache = kwargs.pop('_ignore_cache', False)
full_args = list()
full_args.extend(args)
for k, v in kwargs.items():
full_args.append('%s:%s' % (str(k), str(v)))
md5_args = md5_constructor(u':'.join([urlquote(var) for var in full_args]))
cache_key = 'template.cache.%s.%s' % (key_prefix, md5_args.hexdigest())
filesystem_cache = get_cache(FILESYSTEM_CACHE_NAME)
cached_value = filesystem_cache.get(cache_key)
if cached_value and not ignore_cache:
# if cached value exists - return it
return cached_value
result = func(*args, **kwargs)
filesystem_cache.set(cache_key, result, FILESYSTEM_CACHE_TIME)
return result
return wrapfunc
Everything works ok except that all args and kwargs for function fall into args
section :
In [2]: @filesystem_cache('bla')
...: def hello(a,b,c, d=10,e=12):
...: print a
...: print b
...: print c
...: print d
...: return e
In [3]: hello(5,6,7, d=11,e=18)
> /home/dev/imax/mws/src/imax_main/utils.py(29)wrapfunc()
-> md5_args = md5_constructor(u':'.join([urlquote(var) for var in full_args]))
(Pdb) l
24 full_args = list()
25 full_args.extend(args)
26 for k, v in kwargs.items():
27 full_args.append('%s:%s' % (str(k), str(v)))
28 import pdb; pdb.set_trace()
29 -> md5_args = md5_constructor(u':'.join([urlquote(var) for var in full_args]))
30 cache_key = 'template.cache.%s.%s' % (key_prefix, md5_args.hexdigest())
31 filesystem_cache = get_cache(FILESYSTEM_CACHE_NAME)
32 cached_value = filesystem_cache.get(cache_key)
33 if cached_value and not ignore_cache:
34 # if cached value exists - return it
(Pdb) args
func = <function hello at 0xa63f454>
args = (5, 6, 7, 11, 18)
kwargs = {}
(Pdb) kwargs
{}
Why are the d and e kwargs in the args array? Thanks!
```
Original issue reported on code.google.com by `alecs....@gmail.com` on 13 Feb 2012 at 10:20
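For reference, this appears to be the documented behaviour of the `decorator` module rather than a bug: the generated wrapper has the same signature as the wrapped function, so any argument that can be bound to a named parameter arrives positionally in `*args`, and `**kwargs` stays empty unless the function itself declares `**kwargs`. A minimal standard-library sketch of the same binding (no `decorator` import needed):

```python
import inspect

def hello(a, b, c, d=10, e=12):
    return e

# Bind the call the same way a signature-preserving wrapper would:
# every argument that matches a named parameter becomes positional.
bound = inspect.signature(hello).bind(5, 6, 7, d=11, e=18)
bound.apply_defaults()

# All five values end up attached to named parameters, in order.
print(tuple(bound.arguments.values()))  # -> (5, 6, 7, 11, 18)
```

So hashing `full_args` still covers d and e; they have simply been folded into the positional tuple.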
|
defect
|
kwargs dict is not filled with kwargs hi michele thanks for your great job wrote a small decorator django by default caches everything in default cache def filesystem cache key prefix cache time none caches function based on key prefix and function args kwargs stores function result in filesystem cache for a certain cache time if cache time filesystem cache time cache time else filesystem cache time settings caches get timeout decorator def wrapfunc func args kwargs ignore cache kwargs pop ignore cache false full args list full args extend args for k v in kwargs items full args append s s str k str v args constructor u join cache key template cache s s key prefix args hexdigest filesystem cache get cache filesystem cache name cached value filesystem cache get cache key if cached value and not ignore cache if cached value exists return it return cached value result func args kwargs filesystem cache set cache key result filesystem cache time return result return wrapfunc everything works ok except that all args and kwargs for function fall into args section in filesystem cache bla def hello a b c d e print a print b print c print d return e in hello d e home dev imax mws src imax main utils py wrapfunc args constructor u join pdb l full args list full args extend args for k v in kwargs items full args append s s str k str v import pdb pdb set trace args constructor u join cache key template cache s s key prefix args hexdigest filesystem cache get cache filesystem cache name cached value filesystem cache get cache key if cached value and not ignore cache if cached value exists return it pdb args func args kwargs pdb kwargs why d and e kwargs are in args array thanks original issue reported on code google com by alecs gmail com on feb at
| 1
|
29,129
| 5,542,560,954
|
IssuesEvent
|
2017-03-22 15:15:13
|
bridgedotnet/Bridge
|
https://api.github.com/repos/bridgedotnet/Bridge
|
closed
|
try...finally in async
|
defect in progress
|
Return keyword doesn't work correctly inside async try/catch block
### Steps To Reproduce
http://deck.net/2a511799409c10edd4cb311a322de94c
```c#
public class Program
{
public static async void Main()
{
try
{
var errors = await TestAsync();
if (errors.Length != 0)
return;
Console.WriteLine("Should not be printed");
}
finally
{
Console.WriteLine("Something");
}
}
public static async Task<string[]> TestAsync()
{
var errors = await ValidateAsync();
if (errors.Count != 0)
Console.WriteLine("Showing errors");
return errors.ToArray();
}
public static async Task<List<string>> ValidateAsync()
{
var result = new List<string>();
result.Add("xxx");
result.Add("yyy");
return result;
}
}
```
### Expected Result
```js
Showing errors
Something
```
### Actual Result
```js
Showing errors
Should not be printed
Something
```
### See Also
http://forums.bridge.net/forum/bridge-net-pro/bugs/3649-open-2462-try-finally-in-async
#2481
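For comparison, the expected control flow (an early `return` inside `try` skips the rest of the block but still runs `finally`) can be checked outside the transpiler; here is the same shape in plain Python:

```python
log = []

def main():
    try:
        errors = ["xxx", "yyy"]
        if len(errors) != 0:
            return          # must exit here; the next line is unreachable
        log.append("Should not be printed")
    finally:
        log.append("Something")  # finally always runs, even on early return

main()
print(log)  # -> ['Something']
```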
|
1.0
|
try...finally in async - Return keyword doesn't work correctly inside async try/catch block
### Steps To Reproduce
http://deck.net/2a511799409c10edd4cb311a322de94c
```c#
public class Program
{
public static async void Main()
{
try
{
var errors = await TestAsync();
if (errors.Length != 0)
return;
Console.WriteLine("Should not be printed");
}
finally
{
Console.WriteLine("Something");
}
}
public static async Task<string[]> TestAsync()
{
var errors = await ValidateAsync();
if (errors.Count != 0)
Console.WriteLine("Showing errors");
return errors.ToArray();
}
public static async Task<List<string>> ValidateAsync()
{
var result = new List<string>();
result.Add("xxx");
result.Add("yyy");
return result;
}
}
```
### Expected Result
```js
Showing errors
Something
```
### Actual Result
```js
Showing errors
Should not be printed
Something
```
### See Also
http://forums.bridge.net/forum/bridge-net-pro/bugs/3649-open-2462-try-finally-in-async
#2481
|
defect
|
try finally in async return keyword doesn t work correctly inside async try catch block steps to reproduce c public class program public static async void main try var errors await testasync if errors length return console writeline should not be printed finally console writeline something public static async task testasync var errors await validateasync if errors count console writeline showing errors return errors toarray public static async task validateasync var result new list result add xxx result add yyy return result expected result js showing errors something actual result js showing errors should not be printed something see also
| 1
|
28,535
| 5,286,901,480
|
IssuesEvent
|
2017-02-08 10:39:53
|
AsyncHttpClient/async-http-client
|
https://api.github.com/repos/AsyncHttpClient/async-http-client
|
closed
|
Early termination of reactive streams subscriber should close connection
|
Defect
|
If a reactive streams subscriber cancels mid response, currently, AHC will drain the response, and return the connection to the queue once done. This is not what happens when `AsyncHandler.onBodyPartReceived` returns `ABORT`, for example, when that happens, the connection is closed (I think). The problem we're seeing is that if you have an infinite stream (eg, a Twitter streamed search response), it's impossible to ever terminate it. Reactive streams response handling should terminate the connection upon a reactive streams subscriber cancelling.
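AHC itself is Java, but the desired behaviour — subscriber cancellation tearing down the connection instead of draining an endless stream — can be sketched with a Python generator, where closing the generator stands in for the subscriber cancelling (the `Connection` class here is illustrative, not AHC's API):

```python
class Connection:
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

def stream(conn):
    # Simulates an infinite streamed response; when the consumer stops,
    # the finally clause must close the connection rather than drain it.
    try:
        n = 0
        while True:
            yield n
            n += 1
    finally:
        conn.close()

conn = Connection()
chunks = stream(conn)
for chunk in chunks:
    if chunk >= 2:   # subscriber cancels mid-stream
        break
chunks.close()       # cancellation propagates into the generator's finally
print(conn.closed)   # -> True
```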
|
1.0
|
Early termination of reactive streams subscriber should close connection - If a reactive streams subscriber cancels mid response, currently, AHC will drain the response, and return the connection to the queue once done. This is not what happens when `AsyncHandler.onBodyPartReceived` returns `ABORT`, for example, when that happens, the connection is closed (I think). The problem we're seeing is that if you have an infinite stream (eg, a Twitter streamed search response), it's impossible to ever terminate it. Reactive streams response handling should terminate the connection upon a reactive streams subscriber cancelling.
|
defect
|
early termination of reactive streams subscriber should close connection if a reactive streams subscriber cancels mid response currently ahc will drain the response and return the connection to the queue once done this is not what happens when asynchandler onbodypartreceived returns abort for example when that happens the connection is closed i think the problem we re seeing is that if you have an infinite stream eg a twitter streamed search response it s impossible to ever terminate it reactive streams response handling should terminate the connection upon a reactive streams subscriber cancelling
| 1
|
4,517
| 2,610,111,850
|
IssuesEvent
|
2015-02-26 18:34:40
|
chrsmith/scribefire-chrome
|
https://api.github.com/repos/chrsmith/scribefire-chrome
|
closed
|
server error requested method get wp.getusersblogs does not exist
|
auto-migrated Priority-Medium Type-Defect
|
```
What's the problem?
This summary line is the error I get when I try to add my WP blog. FWIW,
it's a WP blog hosted at my domain. It worked fine with Windows Live
Writer.
What version of ScribeFire for Chrome are you running?
0.1.1.0
```
-----
Original issue reported on code.google.com by `barbaric...@gmail.com` on 24 Apr 2010 at 3:02
* Merged into: #34
|
1.0
|
server error requested method get wp.getusersblogs does not exist - ```
What's the problem?
This summary line is the error I get when I try to add my WP blog. FWIW,
it's a WP blog hosted at my domain. It worked fine with Windows Live
Writer.
What version of ScribeFire for Chrome are you running?
0.1.1.0
```
-----
Original issue reported on code.google.com by `barbaric...@gmail.com` on 24 Apr 2010 at 3:02
* Merged into: #34
|
defect
|
server error requested method get wp getusersblogs does not exist what s the problem this summary line is the error i get when i try to add my wp blog fwiw it s a wp blog hosted at my domain it worked fine with windows live writer what version of scribefire for chrome are you running original issue reported on code google com by barbaric gmail com on apr at merged into
| 1
|
781,488
| 27,439,516,318
|
IssuesEvent
|
2023-03-02 10:01:03
|
pmgbergen/porepy
|
https://api.github.com/repos/pmgbergen/porepy
|
opened
|
Specific volumes for interfaces
|
enhancement priority - high
|
Interfaces need a notion of specific volumes to allow integration. The specific volume accounts for the collapsed dimension(s), so that it matches the dimension of the boundary of its higher-dimensional neighbor.
|
1.0
|
Specific volumes for interfaces - Interfaces need a notion of specific volumes to allow integration. The specific volume accounts for the collapsed dimension(s), so that it matches the dimension of the boundary of its higher-dimensional neighbor.
|
non_defect
|
specific volumes for interfaces interfaces need a notion of specific volumes to allow integration the specific volume accounts for the collapsed dimension s so that it matches the dimension of the boundary of its higher dimensional neighbor
| 0
|
121,047
| 15,834,508,867
|
IssuesEvent
|
2021-04-06 16:52:24
|
WordPress/gutenberg
|
https://api.github.com/repos/WordPress/gutenberg
|
closed
|
Button Update Doesn't Make Centering and Moving of Block Obvious/Intuitive
|
Needs Design Feedback [Block] Buttons
|
Hi All,
I love Gutenberg! I create content in it every day for upwards of 30 unique sites.
The button update:
LOVE that I can do side-by-side
HATE that it's not UX friendly in terms of innate functionality being obvious and present (moving up and down/centering) when you click the plus button. See this [short video](https://trainingvideoslaura.s3.amazonaws.com/zoom_0.mp4) of me replicating multiple frustrated clients who asked for help this week and ME (been using WordPress since 2006) who took three days to figure out what was going on!
https://trainingvideoslaura.s3.amazonaws.com/zoom_0.mp4
|
1.0
|
Button Update Doesn't Make Centering and Moving of Block Obvious/Intuitive - Hi All,
I love Gutenberg! I create content in it every day for upwards of 30 unique sites.
The button update:
LOVE that I can do side-by-side
HATE that it's not UX friendly in terms of innate functionality being obvious and present (moving up and down/centering) when you click the plus button. See this [short video](https://trainingvideoslaura.s3.amazonaws.com/zoom_0.mp4) of me replicating multiple frustrated clients who asked for help this week and ME (been using WordPress since 2006) who took three days to figure out what was going on!
https://trainingvideoslaura.s3.amazonaws.com/zoom_0.mp4
|
non_defect
|
button update doesn t make centering and moving of block obvious intuitive hi all i love gutenberg i create content in it every day for upwards of unique sites the button update love that i can do side by side hate that it s not ux friendly in terms of innate functionality being obvious and present moving up and down centering when you click the plus button see of me replicating multiple frustrated clients who asked for help this week and me been using wordpress since who took three days to figure out what was going on
| 0
|
67,707
| 21,076,189,076
|
IssuesEvent
|
2022-04-02 07:02:27
|
klubcoin/lcn-mobile
|
https://api.github.com/repos/klubcoin/lcn-mobile
|
closed
|
[Account Maintenance][SEED Phrase][Biometrics] Fix must accept correct password for Revealing Secret Recovery Phrase after unlocking the application via biometrics.
|
Defect Must Have Critical Account Maintenance Services
|
### **Description:**
Must accept correct password for Backing up again the Secret Recovery Phrase after unlocking the application via biometrics.
**Build Environment:** Staging Environment
**Affects Version:** 1.0.0.22
**Device Platform:** Android
**Device OS:** 11
**Test Device:** OnePlus 7T Pro
### **Pre-condition:**
1. Successfully installed the app
2. Successfully logged in.
3. User has an existing Wallet account
4. User is currently at Security & Privacy Screen
### **Steps to Reproduce:**
1. Set auto-lock time to 30 seconds
2. Wait for 30 seconds until the app is locked
3. Unlock via biometrics
4. Tap Reveal SEED Phrase
5. Enter correct password
### **Expected Result:**
Must accept accurate password and proceed to View SEED Phrase Information
### **Actual Result:**
Displaying couldn't unlock account.
### **Attachment/s:**
https://drive.google.com/file/d/1HFQwv02el_qQiS5gmrUjzPa-HBgoiViC/view
|
1.0
|
[Account Maintenance][SEED Phrase][Biometrics] Fix must accept correct password for Revealing Secret Recovery Phrase after unlocking the application via biometrics. - ### **Description:**
Must accept correct password for Backing up again the Secret Recovery Phrase after unlocking the application via biometrics.
**Build Environment:** Staging Environment
**Affects Version:** 1.0.0.22
**Device Platform:** Android
**Device OS:** 11
**Test Device:** OnePlus 7T Pro
### **Pre-condition:**
1. Successfully installed the app
2. Successfully logged in.
3. User has an existing Wallet account
4. User is currently at Security & Privacy Screen
### **Steps to Reproduce:**
1. Set auto-lock time to 30 seconds
2. Wait for 30 seconds until the app is locked
3. Unlock via biometrics
4. Tap Reveal SEED Phrase
5. Enter correct password
### **Expected Result:**
Must accept accurate password and proceed to View SEED Phrase Information
### **Actual Result:**
Displaying couldn't unlock account.
### **Attachment/s:**
https://drive.google.com/file/d/1HFQwv02el_qQiS5gmrUjzPa-HBgoiViC/view
|
defect
|
fix must accept correct password for revealing secret recovery phrase after unlocking the application via biometrics description must accept correct password for backing up again the secret recovery phrase after unlocking the application via biometrics build environment staging environment affects version device platform android device os test device oneplus pro pre condition successfully installed the app successfully logged in user has an existing wallet account user is currently at security privacy screen steps to reproduce set auto lock time to seconds wait for seconds until the app is locked unlock via biometrics tap reveal seed phrase enter correct password expected result must accept accurate password and proceed to view seed phrase information actual result displaying couldn t unlock account attachment s
| 1
|
55,984
| 14,882,315,936
|
IssuesEvent
|
2021-01-20 11:43:29
|
martinrotter/rssguard
|
https://api.github.com/repos/martinrotter/rssguard
|
closed
|
[BUG]: Application starting from version 3.8.2 does not start because of custom compiled QtWebEngine
|
Component-Web-Browser Status-Partially-Fixed Type-Defect Type-Deployment
|
#### Brief description of the issue.
Application starting from version 3.8.2 does not start
#### How to reproduce the bug?
1. Download the application and run the installation over the already installed
#### What was the expected result?
Application installed and worked
#### What actually happened?
Application installed but did not work (does not start)
#### Other information
Tested on version:
rssguard-3.8.4-7ab95a75-win64.exe
rssguard-3.8.3-03684f5-win64.exe
rssguard-3.8.2-12104c7-win64.exe
Latest working version:
3.8.0 (Windows/x86_64)
13c8ecf
10.11.2020 08:09
Qt: 5.14.2 (5.14.2)
Localisation: Russian
Windows 10 64bit
No log file is created, the command `rssguard.exe --log 'log.txt'` is not executed
|
1.0
|
[BUG]: Application starting from version 3.8.2 does not start because of custom compiled QtWebEngine - #### Brief description of the issue.
Application starting from version 3.8.2 does not start
#### How to reproduce the bug?
1. Download the application and run the installation over the already installed
#### What was the expected result?
Application installed and worked
#### What actually happened?
Application installed but did not work (does not start)
#### Other information
Tested on version:
rssguard-3.8.4-7ab95a75-win64.exe
rssguard-3.8.3-03684f5-win64.exe
rssguard-3.8.2-12104c7-win64.exe
Latest working version:
3.8.0 (Windows/x86_64)
13c8ecf
10.11.2020 08:09
Qt: 5.14.2 (5.14.2)
Localisation: Russian
Windows 10 64bit
No log file is created, the command `rssguard.exe --log 'log.txt'` is not executed
|
defect
|
application starting from version does not start because of custom compiled qtwebengine brief description of the issue application starting from version does not start how to reproduce the bug download the application and run the installation over the already installed what was the expected result application installed and worked what actually happened application installed and not worked does not start other information tested on version rssguard exe rssguard exe rssguard exe latest working version windows qt localisation russian windows no log file is created the command rssguard exe log log txt is not executed
| 1
|
197,335
| 22,591,809,236
|
IssuesEvent
|
2022-06-28 20:42:57
|
billmcchesney1/jazz
|
https://api.github.com/repos/billmcchesney1/jazz
|
opened
|
CVE-2022-0691 (High) detected in url-parse-1.4.7.tgz
|
security vulnerability
|
## CVE-2022-0691 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>url-parse-1.4.7.tgz</b></p></summary>
<p>Small footprint URL parser that works seamlessly across Node.js and browser environments</p>
<p>Library home page: <a href="https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz">https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz</a></p>
<p>Path to dependency file: /templates/react-website-template/app/package.json</p>
<p>Path to vulnerable library: /templates/react-website-template/app/node_modules/url-parse/package.json,/core/jazz_ui/src/app/primary-components/daterange-picker/ng2-datepicker/node_modules/url-parse/package.json,/core/jazz_ui/node_modules/url-parse/package.json,/templates/angular-website-template/app/node_modules/url-parse/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-2.1.8.tgz (Root Library)
- react-dev-utils-8.0.0.tgz
- sockjs-client-1.3.0.tgz
- :x: **url-parse-1.4.7.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/billmcchesney1/jazz/commit/712665b267203375ee4b15e1f8d1ebe08abc1547">712665b267203375ee4b15e1f8d1ebe08abc1547</a></p>
<p>Found in base branch: <b>develop</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Authorization Bypass Through User-Controlled Key in NPM url-parse prior to 1.5.9.
<p>Publish Date: 2022-02-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0691>CVE-2022-0691</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0691">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0691</a></p>
<p>Release Date: 2022-02-21</p>
<p>Fix Resolution (url-parse): 1.5.9</p>
<p>Direct dependency fix Resolution (react-scripts): 3.0.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
|
True
|
CVE-2022-0691 (High) detected in url-parse-1.4.7.tgz - ## CVE-2022-0691 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>url-parse-1.4.7.tgz</b></p></summary>
<p>Small footprint URL parser that works seamlessly across Node.js and browser environments</p>
<p>Library home page: <a href="https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz">https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz</a></p>
<p>Path to dependency file: /templates/react-website-template/app/package.json</p>
<p>Path to vulnerable library: /templates/react-website-template/app/node_modules/url-parse/package.json,/core/jazz_ui/src/app/primary-components/daterange-picker/ng2-datepicker/node_modules/url-parse/package.json,/core/jazz_ui/node_modules/url-parse/package.json,/templates/angular-website-template/app/node_modules/url-parse/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-2.1.8.tgz (Root Library)
- react-dev-utils-8.0.0.tgz
- sockjs-client-1.3.0.tgz
- :x: **url-parse-1.4.7.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/billmcchesney1/jazz/commit/712665b267203375ee4b15e1f8d1ebe08abc1547">712665b267203375ee4b15e1f8d1ebe08abc1547</a></p>
<p>Found in base branch: <b>develop</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Authorization Bypass Through User-Controlled Key in NPM url-parse prior to 1.5.9.
<p>Publish Date: 2022-02-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0691>CVE-2022-0691</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0691">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0691</a></p>
<p>Release Date: 2022-02-21</p>
<p>Fix Resolution (url-parse): 1.5.9</p>
<p>Direct dependency fix Resolution (react-scripts): 3.0.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
|
non_defect
|
cve high detected in url parse tgz cve high severity vulnerability vulnerable library url parse tgz small footprint url parser that works seamlessly across node js and browser environments library home page a href path to dependency file templates react website template app package json path to vulnerable library templates react website template app node modules url parse package json core jazz ui src app primary components daterange picker datepicker node modules url parse package json core jazz ui node modules url parse package json templates angular website template app node modules url parse package json dependency hierarchy react scripts tgz root library react dev utils tgz sockjs client tgz x url parse tgz vulnerable library found in head commit a href found in base branch develop vulnerability details authorization bypass through user controlled key in npm url parse prior to publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution url parse direct dependency fix resolution react scripts check this box to open an automated fix pr
| 0
|
49,585
| 13,187,236,621
|
IssuesEvent
|
2020-08-13 02:46:48
|
icecube-trac/tix3
|
https://api.github.com/repos/icecube-trac/tix3
|
opened
|
[steamshovel] ipdf I3DOMLikelihoodArtist causes shovelart to segfault on exit (Trac #1711)
|
Incomplete Migration Migrated from Trac combo reconstruction defect
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1711">https://code.icecube.wisc.edu/ticket/1711</a>, reported by kjmeagher and owned by hdembinski</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:14:55",
"description": "Simply loading `icecube.ipdf.artists` causes `shovelart.so` to segfault on exit. It is not clear why this is happening. This code was disabled in r146010/IceCube, but the issue still needs to be fixed. \n\nThis segfault was causing the doc build to segfault and fail even though the docs build correctly. See the complete stacktrace at http://builds.icecube.wisc.edu/builders/docs/builds/59/steps/compile_2/logs/stdio",
"reporter": "kjmeagher",
"cc": "nega",
"resolution": "fixed",
"_ts": "1550067295757382",
"component": "combo reconstruction",
"summary": "[steamshovel] ipdf I3DOMLikelihoodArtist causes shovelart to segfault on exit",
"priority": "normal",
"keywords": "",
"time": "2016-05-18T09:12:15",
"milestone": "",
"owner": "hdembinski",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
[steamshovel] ipdf I3DOMLikelihoodArtist causes shovelart to segfault on exit (Trac #1711) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1711">https://code.icecube.wisc.edu/ticket/1711</a>, reported by kjmeagher and owned by hdembinski</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:14:55",
"description": "Simply loading `icecube.ipdf.artists` causes `shovelart.so` to segfault on exit. It is not clear why this is happening. This code was disabled in r146010/IceCube, but the issue still needs to be fixed. \n\nThis segfault was causing the doc build to segfault and fail even though the docs build correctly. See the complete stacktrace at http://builds.icecube.wisc.edu/builders/docs/builds/59/steps/compile_2/logs/stdio",
"reporter": "kjmeagher",
"cc": "nega",
"resolution": "fixed",
"_ts": "1550067295757382",
"component": "combo reconstruction",
"summary": "[steamshovel] ipdf I3DOMLikelihoodArtist causes shovelart to segfault on exit",
"priority": "normal",
"keywords": "",
"time": "2016-05-18T09:12:15",
"milestone": "",
"owner": "hdembinski",
"type": "defect"
}
```
</p>
</details>
|
defect
|
ipdf causes shovelart to segfault on exit trac migrated from json status closed changetime description simply loading icecube ipdf artists causes shovelart so to segfault on exit it is not clear why this is happening this code was disabled in icecube but the issue still needs to be fixed n nthis segfault was causing the doc build to segfault and fail even though the docs build correctly see the complete stacktrace at reporter kjmeagher cc nega resolution fixed ts component combo reconstruction summary ipdf causes shovelart to segfault on exit priority normal keywords time milestone owner hdembinski type defect
| 1
|
224,906
| 17,781,655,199
|
IssuesEvent
|
2021-08-31 05:47:46
|
aodn/nrmn-application
|
https://api.github.com/repos/aodn/nrmn-application
|
closed
|
Data summary in validation feedback not accurate
|
issue ready to test
|
Software version: 0.0.318
Environment: systest
From user:
> Not displaying correct feedback for a loaded sheet.
> i.e. loaded an ATRC sheet with 44 surveys from 11 sites all of which are complete, but the feedback says 5 surveys from 11 sites and 5 surveys incomplete
## Tasks
- [ ] Improve summary accuracy
- [ ] DOD
|
1.0
|
Data summary in validation feedback not accurate - Software version: 0.0.318
Environment: systest
From user:
> Not displaying correct feedback for a loaded sheet.
> i.e. loaded an ATRC sheet with 44 surveys from 11 sites all of which are complete, but the feedback says 5 surveys from 11 sites and 5 surveys incomplete
## Tasks
- [ ] Improve summary accuracy
- [ ] DOD
|
non_defect
|
data summary in validation feedback not accurate software version environment systest from user not displaying correct feedback for a loaded sheet i e loaded an atrc sheet with surveys from sites all of which are complete but the feedback says surveys from sites and surveys incomplete tasks improve summary accuracy dod
| 0
|
266,115
| 8,363,109,182
|
IssuesEvent
|
2018-10-03 18:46:10
|
letelete/Sleep-Cycle-Alarm
|
https://api.github.com/repos/letelete/Sleep-Cycle-Alarm
|
closed
|
Settings improvement
|
difficulty: easy improvement priority: medium user experience user interface
|
Settings layout needs to be changed.
Settings improvement in few steps:
1. Fix left margin
2. Split settings to sections exactly the same as Google Pay has it.

3. Decrease the margin between close button (X) with settings label and the top of the app screen
4. Done.
|
1.0
|
Settings improvement - Settings layout needs to be changed.
Settings improvement in few steps:
1. Fix left margin
2. Split settings to sections exactly the same as Google Pay has it.

3. Decrease the margin between close button (X) with settings label and the top of the app screen
4. Done.
|
non_defect
|
settings improvement settings layout needs to be changed settings improvement in few steps fix left margin split settings to sections exactly the same as google pay has it decrease the margin between close button x with settings label and the top of the app screen done
| 0
|
64,708
| 18,841,427,870
|
IssuesEvent
|
2021-11-11 10:01:09
|
FreeRADIUS/freeradius-server
|
https://api.github.com/repos/FreeRADIUS/freeradius-server
|
closed
|
[defect]: Accounting problem
|
defect
|
### What type of defect/bug is this?
Unexpected behaviour (obvious or verified by project member)
### How can the issue be reproduced?
I'm using a VPN to connect the NAS to the RADIUS server. Sometimes the VPN disconnects and Accounting gets no response. After the VPN reconnects, a new record is added to radacct from a NAS IP address that is not listed in the nas table; the effect is doubled login time.
Why does this happen?
And how can it be solved?
### Log output from the FreeRADIUS daemon
```shell
FreeRADIUS Version 3.0.20
Copyright (C) 1999-2019 The FreeRADIUS server project and contributors
There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A
PARTICULAR PURPOSE
You may redistribute copies of FreeRADIUS under the terms of the
GNU General Public License
For more information about these matters, see the file named COPYRIGHT
Starting - reading configuration files ...
including dictionary file /usr/share/freeradius/dictionary
including dictionary file /usr/share/freeradius/dictionary.dhcp
including dictionary file /usr/share/freeradius/dictionary.vqp
including dictionary file /etc/freeradius/3.0/dictionary
including configuration file /etc/freeradius/3.0/radiusd.conf
including configuration file /etc/freeradius/3.0/proxy.conf
including configuration file /etc/freeradius/3.0/clients.conf
including files in directory /etc/freeradius/3.0/mods-enabled/
including configuration file /etc/freeradius/3.0/mods-enabled/chap
including configuration file /etc/freeradius/3.0/mods-enabled/ntlm_auth
including configuration file /etc/freeradius/3.0/mods-enabled/realm
including configuration file /etc/freeradius/3.0/mods-enabled/replicate
including configuration file /etc/freeradius/3.0/mods-enabled/preprocess
including configuration file /etc/freeradius/3.0/mods-enabled/mschap
including configuration file /etc/freeradius/3.0/mods-enabled/echo
including configuration file /etc/freeradius/3.0/mods-enabled/soh
including configuration file /etc/freeradius/3.0/mods-enabled/expiration
including configuration file /etc/freeradius/3.0/mods-enabled/utf8
including configuration file /etc/freeradius/3.0/mods-enabled/digest
including configuration file /etc/freeradius/3.0/mods-enabled/sradutmp
including configuration file /etc/freeradius/3.0/mods-enabled/logintime
including configuration file /etc/freeradius/3.0/mods-enabled/detail.log
including configuration file /etc/freeradius/3.0/mods-enabled/eap
including configuration file /etc/freeradius/3.0/mods-enabled/dynamic_clients
including configuration file /etc/freeradius/3.0/mods-enabled/passwd
including configuration file /etc/freeradius/3.0/mods-enabled/linelog
including configuration file /etc/freeradius/3.0/mods-enabled/cache_eap
including configuration file /etc/freeradius/3.0/mods-enabled/radutmp
including configuration file /etc/freeradius/3.0/mods-enabled/pap
including configuration file /etc/freeradius/3.0/mods-enabled/files
including configuration file /etc/freeradius/3.0/mods-enabled/exec
including configuration file /etc/freeradius/3.0/mods-enabled/always
including configuration file /etc/freeradius/3.0/mods-enabled/detail
including configuration file /etc/freeradius/3.0/mods-enabled/unpack
including configuration file /etc/freeradius/3.0/mods-enabled/attr_filter
including configuration file /etc/freeradius/3.0/mods-enabled/unix
including configuration file /etc/freeradius/3.0/mods-enabled/expr
including configuration file /etc/freeradius/3.0/mods-enabled/sql
including configuration file /etc/freeradius/3.0/mods-config/sql/main/mysql/queries.conf
including configuration file /etc/freeradius/3.0/mods-available/sqlcounter
including configuration file /etc/freeradius/3.0/mods-config/sql/counter/mysql/dailycounter.conf
including configuration file /etc/freeradius/3.0/mods-config/sql/counter/mysql/monthlycounter.conf
including configuration file /etc/freeradius/3.0/mods-config/sql/counter/mysql/noresetcounter.conf
including configuration file /etc/freeradius/3.0/mods-config/sql/counter/mysql/expire_on_login.conf
including configuration file /etc/freeradius/3.0/mods-config/sql/counter/mysql/accessperiod.conf
including files in directory /etc/freeradius/3.0/policy.d/
including configuration file /etc/freeradius/3.0/policy.d/dhcp
including configuration file /etc/freeradius/3.0/policy.d/cui
including configuration file /etc/freeradius/3.0/policy.d/moonshot-targeted-ids
including configuration file /etc/freeradius/3.0/policy.d/accounting
including configuration file /etc/freeradius/3.0/policy.d/abfab-tr
including configuration file /etc/freeradius/3.0/policy.d/rfc7542
including configuration file /etc/freeradius/3.0/policy.d/canonicalization
including configuration file /etc/freeradius/3.0/policy.d/eap
including configuration file /etc/freeradius/3.0/policy.d/operator-name
including configuration file /etc/freeradius/3.0/policy.d/control
including configuration file /etc/freeradius/3.0/policy.d/filter
including configuration file /etc/freeradius/3.0/policy.d/debug
including files in directory /etc/freeradius/3.0/sites-enabled/
including configuration file /etc/freeradius/3.0/sites-enabled/inner-tunnel
including configuration file /etc/freeradius/3.0/sites-enabled/default
including configuration file /etc/freeradius/3.0/sites-enabled/control-socket
main {
security {
user = "freerad"
group = "freerad"
allow_core_dumps = no
}
name = "freeradius"
prefix = "/usr"
localstatedir = "/var"
logdir = "/var/log/freeradius"
run_dir = "/var/run/freeradius"
}
main {
name = "freeradius"
prefix = "/usr"
localstatedir = "/var"
sbindir = "/usr/sbin"
logdir = "/var/log/freeradius"
run_dir = "/var/run/freeradius"
libdir = "/usr/lib/freeradius"
radacctdir = "/var/log/freeradius/radacct"
hostname_lookups = no
max_request_time = 30
cleanup_delay = 5
max_requests = 16384
pidfile = "/var/run/freeradius/freeradius.pid"
checkrad = "/usr/sbin/checkrad"
debug_level = 0
proxy_requests = yes
log {
stripped_names = no
auth = no
auth_badpass = no
auth_goodpass = no
colourise = yes
msg_denied = "You are already logged in - access denied"
}
resources {
}
security {
max_attributes = 200
reject_delay = 1.000000
status_server = yes
}
}
radiusd: #### Loading Realms and Home Servers ####
proxy server {
retry_delay = 5
retry_count = 3
default_fallback = no
dead_time = 120
wake_all_if_all_dead = no
}
home_server localhost {
ipaddr = 127.0.0.1
port = 1812
type = "auth"
secret = <<< secret >>>
response_window = 20.000000
response_timeouts = 1
max_outstanding = 65536
zombie_period = 40
status_check = "status-server"
ping_interval = 30
check_interval = 30
check_timeout = 4
num_answers_to_alive = 3
revive_interval = 120
limit {
max_connections = 16
max_requests = 0
lifetime = 0
idle_timeout = 0
}
coa {
irt = 2
mrt = 16
mrc = 5
mrd = 30
}
}
home_server_pool my_auth_failover {
type = fail-over
home_server = localhost
}
realm example.com {
auth_pool = my_auth_failover
}
realm LOCAL {
}
radiusd: #### Loading Clients ####
client localhost {
ipaddr = 127.0.0.1
require_message_authenticator = no
secret = <<< secret >>>
nas_type = "other"
proto = "*"
limit {
max_connections = 16
lifetime = 0
idle_timeout = 30
}
}
client localhost_ipv6 {
ipv6addr = ::1
require_message_authenticator = no
secret = <<< secret >>>
limit {
max_connections = 16
lifetime = 0
idle_timeout = 30
}
}
Debug state unknown (cap_sys_ptrace capability not set)
systemd watchdog is disabled
# Creating Auth-Type = mschap
# Creating Auth-Type = eap
# Creating Auth-Type = PAP
# Creating Auth-Type = CHAP
# Creating Auth-Type = MS-CHAP
# Creating Auth-Type = digest
radiusd: #### Instantiating modules ####
modules {
# Loaded module rlm_chap
# Loading module "chap" from file /etc/freeradius/3.0/mods-enabled/chap
# Loaded module rlm_exec
# Loading module "ntlm_auth" from file /etc/freeradius/3.0/mods-enabled/ntlm_auth
exec ntlm_auth {
wait = yes
program = "/path/to/ntlm_auth --request-nt-key --domain=MYDOMAIN --username=%{mschap:User-Name} --password=%{User-Password}"
shell_escape = yes
}
# Loaded module rlm_realm
# Loading module "IPASS" from file /etc/freeradius/3.0/mods-enabled/realm
realm IPASS {
format = "prefix"
delimiter = "/"
ignore_default = no
ignore_null = no
}
# Loading module "suffix" from file /etc/freeradius/3.0/mods-enabled/realm
realm suffix {
format = "suffix"
delimiter = "@"
ignore_default = no
ignore_null = no
}
# Loading module "realmpercent" from file /etc/freeradius/3.0/mods-enabled/realm
realm realmpercent {
format = "suffix"
delimiter = "%"
ignore_default = no
ignore_null = no
}
# Loading module "ntdomain" from file /etc/freeradius/3.0/mods-enabled/realm
realm ntdomain {
format = "prefix"
delimiter = "\\"
ignore_default = no
ignore_null = no
}
# Loaded module rlm_replicate
# Loading module "replicate" from file /etc/freeradius/3.0/mods-enabled/replicate
# Loaded module rlm_preprocess
# Loading module "preprocess" from file /etc/freeradius/3.0/mods-enabled/preprocess
preprocess {
huntgroups = "/etc/freeradius/3.0/mods-config/preprocess/huntgroups"
hints = "/etc/freeradius/3.0/mods-config/preprocess/hints"
with_ascend_hack = no
ascend_channels_per_line = 23
with_ntdomain_hack = no
with_specialix_jetstream_hack = no
with_cisco_vsa_hack = no
with_alvarion_vsa_hack = no
}
# Loaded module rlm_mschap
# Loading module "mschap" from file /etc/freeradius/3.0/mods-enabled/mschap
mschap {
use_mppe = yes
require_encryption = no
require_strong = no
with_ntdomain_hack = yes
passchange {
}
allow_retry = yes
winbind_retry_with_normalised_username = no
}
# Loading module "echo" from file /etc/freeradius/3.0/mods-enabled/echo
exec echo {
wait = yes
program = "/bin/echo %{User-Name}"
input_pairs = "request"
output_pairs = "reply"
shell_escape = yes
}
# Loaded module rlm_soh
# Loading module "soh" from file /etc/freeradius/3.0/mods-enabled/soh
soh {
dhcp = yes
}
# Loaded module rlm_expiration
# Loading module "expiration" from file /etc/freeradius/3.0/mods-enabled/expiration
# Loaded module rlm_utf8
# Loading module "utf8" from file /etc/freeradius/3.0/mods-enabled/utf8
# Loaded module rlm_digest
# Loading module "digest" from file /etc/freeradius/3.0/mods-enabled/digest
# Loaded module rlm_radutmp
# Loading module "sradutmp" from file /etc/freeradius/3.0/mods-enabled/sradutmp
radutmp sradutmp {
filename = "/var/log/freeradius/sradutmp"
username = "%{User-Name}"
case_sensitive = yes
check_with_nas = yes
permissions = 420
caller_id = no
}
# Loaded module rlm_logintime
# Loading module "logintime" from file /etc/freeradius/3.0/mods-enabled/logintime
logintime {
minimum_timeout = 60
}
# Loaded module rlm_detail
# Loading module "auth_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
detail auth_log {
filename = "/var/log/freeradius/radacct/%{%{Packet-Src-IP-Address}:-%{Packet-Src-IPv6-Address}}/auth-detail-%Y%m%d"
header = "%t"
permissions = 384
locking = no
escape_filenames = no
log_packet_header = no
}
# Loading module "reply_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
detail reply_log {
filename = "/var/log/freeradius/radacct/%{%{Packet-Src-IP-Address}:-%{Packet-Src-IPv6-Address}}/reply-detail-%Y%m%d"
header = "%t"
permissions = 384
locking = no
escape_filenames = no
log_packet_header = no
}
# Loading module "pre_proxy_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
detail pre_proxy_log {
filename = "/var/log/freeradius/radacct/%{%{Packet-Src-IP-Address}:-%{Packet-Src-IPv6-Address}}/pre-proxy-detail-%Y%m%d"
header = "%t"
permissions = 384
locking = no
escape_filenames = no
log_packet_header = no
}
# Loading module "post_proxy_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
detail post_proxy_log {
filename = "/var/log/freeradius/radacct/%{%{Packet-Src-IP-Address}:-%{Packet-Src-IPv6-Address}}/post-proxy-detail-%Y%m%d"
header = "%t"
permissions = 384
locking = no
escape_filenames = no
log_packet_header = no
}
# Loaded module rlm_eap
# Loading module "eap" from file /etc/freeradius/3.0/mods-enabled/eap
eap {
default_eap_type = "md5"
timer_expire = 60
ignore_unknown_eap_types = no
cisco_accounting_username_bug = no
max_sessions = 16384
}
# Loaded module rlm_dynamic_clients
# Loading module "dynamic_clients" from file /etc/freeradius/3.0/mods-enabled/dynamic_clients
# Loaded module rlm_passwd
# Loading module "etc_passwd" from file /etc/freeradius/3.0/mods-enabled/passwd
passwd etc_passwd {
filename = "/etc/passwd"
format = "*User-Name:Crypt-Password:"
delimiter = ":"
ignore_nislike = no
ignore_empty = yes
allow_multiple_keys = no
hash_size = 100
}
# Loaded module rlm_linelog
# Loading module "linelog" from file /etc/freeradius/3.0/mods-enabled/linelog
linelog {
filename = "/var/log/freeradius/linelog"
escape_filenames = no
syslog_severity = "info"
permissions = 384
format = "This is a log message for %{User-Name}"
reference = "messages.%{%{reply:Packet-Type}:-default}"
}
# Loading module "log_accounting" from file /etc/freeradius/3.0/mods-enabled/linelog
linelog log_accounting {
filename = "/var/log/freeradius/linelog-accounting"
escape_filenames = no
syslog_severity = "info"
permissions = 384
format = ""
reference = "Accounting-Request.%{%{Acct-Status-Type}:-unknown}"
}
# Loaded module rlm_cache
# Loading module "cache_eap" from file /etc/freeradius/3.0/mods-enabled/cache_eap
cache cache_eap {
driver = "rlm_cache_rbtree"
key = "%{%{control:State}:-%{%{reply:State}:-%{State}}}"
ttl = 15
max_entries = 0
epoch = 0
add_stats = no
}
# Loading module "radutmp" from file /etc/freeradius/3.0/mods-enabled/radutmp
radutmp {
filename = "/var/log/freeradius/sradutmp"
username = "%{User-Name}"
case_sensitive = yes
check_with_nas = yes
permissions = 384
caller_id = yes
}
# Loaded module rlm_pap
# Loading module "pap" from file /etc/freeradius/3.0/mods-enabled/pap
pap {
normalise = yes
}
# Loaded module rlm_files
# Loading module "files" from file /etc/freeradius/3.0/mods-enabled/files
files {
filename = "/etc/freeradius/3.0/mods-config/files/authorize"
acctusersfile = "/etc/freeradius/3.0/mods-config/files/accounting"
preproxy_usersfile = "/etc/freeradius/3.0/mods-config/files/pre-proxy"
}
# Loading module "exec" from file /etc/freeradius/3.0/mods-enabled/exec
exec {
wait = no
input_pairs = "request"
shell_escape = yes
timeout = 10
}
# Loaded module rlm_always
# Loading module "reject" from file /etc/freeradius/3.0/mods-enabled/always
always reject {
rcode = "reject"
simulcount = 0
mpp = no
}
# Loading module "fail" from file /etc/freeradius/3.0/mods-enabled/always
always fail {
rcode = "fail"
simulcount = 0
mpp = no
}
# Loading module "ok" from file /etc/freeradius/3.0/mods-enabled/always
always ok {
rcode = "ok"
simulcount = 0
mpp = no
}
# Loading module "handled" from file /etc/freeradius/3.0/mods-enabled/always
always handled {
rcode = "handled"
simulcount = 0
mpp = no
}
# Loading module "invalid" from file /etc/freeradius/3.0/mods-enabled/always
always invalid {
rcode = "invalid"
simulcount = 0
mpp = no
}
# Loading module "userlock" from file /etc/freeradius/3.0/mods-enabled/always
always userlock {
rcode = "userlock"
simulcount = 0
mpp = no
}
# Loading module "notfound" from file /etc/freeradius/3.0/mods-enabled/always
always notfound {
rcode = "notfound"
simulcount = 0
mpp = no
}
# Loading module "noop" from file /etc/freeradius/3.0/mods-enabled/always
always noop {
rcode = "noop"
simulcount = 0
mpp = no
}
# Loading module "updated" from file /etc/freeradius/3.0/mods-enabled/always
always updated {
rcode = "updated"
simulcount = 0
mpp = no
}
# Loading module "detail" from file /etc/freeradius/3.0/mods-enabled/detail
detail {
filename = "/var/log/freeradius/radacct/%{%{Packet-Src-IP-Address}:-%{Packet-Src-IPv6-Address}}/detail-%Y%m%d"
header = "%t"
permissions = 384
locking = no
escape_filenames = no
log_packet_header = no
}
# Loaded module rlm_unpack
# Loading module "unpack" from file /etc/freeradius/3.0/mods-enabled/unpack
# Loaded module rlm_attr_filter
# Loading module "attr_filter.post-proxy" from file /etc/freeradius/3.0/mods-enabled/attr_filter
attr_filter attr_filter.post-proxy {
filename = "/etc/freeradius/3.0/mods-config/attr_filter/post-proxy"
key = "%{Realm}"
relaxed = no
}
# Loading module "attr_filter.pre-proxy" from file /etc/freeradius/3.0/mods-enabled/attr_filter
attr_filter attr_filter.pre-proxy {
filename = "/etc/freeradius/3.0/mods-config/attr_filter/pre-proxy"
key = "%{Realm}"
relaxed = no
}
# Loading module "attr_filter.access_reject" from file /etc/freeradius/3.0/mods-enabled/attr_filter
attr_filter attr_filter.access_reject {
filename = "/etc/freeradius/3.0/mods-config/attr_filter/access_reject"
key = "%{User-Name}"
relaxed = no
}
# Loading module "attr_filter.access_challenge" from file /etc/freeradius/3.0/mods-enabled/attr_filter
attr_filter attr_filter.access_challenge {
filename = "/etc/freeradius/3.0/mods-config/attr_filter/access_challenge"
key = "%{User-Name}"
relaxed = no
}
# Loading module "attr_filter.accounting_response" from file /etc/freeradius/3.0/mods-enabled/attr_filter
attr_filter attr_filter.accounting_response {
filename = "/etc/freeradius/3.0/mods-config/attr_filter/accounting_response"
key = "%{User-Name}"
relaxed = no
}
# Loaded module rlm_unix
# Loading module "unix" from file /etc/freeradius/3.0/mods-enabled/unix
unix {
radwtmp = "/var/log/freeradius/radwtmp"
}
Creating attribute Unix-Group
# Loaded module rlm_expr
# Loading module "expr" from file /etc/freeradius/3.0/mods-enabled/expr
expr {
safe_characters = "@abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789.-_: /äéöüàâæçèéêëîïôœùûüaÿÄÉÖÜßÀÂÆÇÈÉÊËÎÏÔŒÙÛÜŸ"
}
# Loaded module rlm_sql
# Loading module "sql" from file /etc/freeradius/3.0/mods-enabled/sql
sql {
driver = "rlm_sql_mysql"
server = "127.0.0.1"
port = 3306
login = "radius"
password = <<< secret >>>
radius_db = "radius"
read_groups = yes
read_profiles = yes
read_clients = yes
delete_stale_sessions = yes
sql_user_name = "%{User-Name}"
default_user_profile = ""
client_query = "SELECT id, nasname, shortname, type, secret, server FROM nas"
authorize_check_query = "SELECT a.id, a.username, b.attribute, b.value, b.op FROM voucher a CROSS JOIN( SELECT 'Cleartext-Password' AS attribute, ':=' AS op, password AS 'value', id FROM voucher UNION SELECT 'Max-All-Session' AS attribute, ':=' AS op, session AS 'value', id FROM voucher UNION SELECT 'Access-Period' AS attribute, ':=' AS op, access_periode AS 'value', id FROM voucher UNION SELECT 'Simultaneous-Use' AS attribute, ':=' AS op, '1' AS 'value', id FROM voucher ) b WHERE a.id = b.id AND a.username = '%{SQL-User-Name}' ORDER BY a.id ASC, b.attribute"
authorize_reply_query = "SELECT a.id, a.username, b.attribute, b.value, b.op FROM voucher a CROSS JOIN( SELECT 'WISPr-Bandwidth-Max-Down' AS attribute, ':=' AS op, download AS 'value', id FROM voucher UNION SELECT 'WISPr-Bandwidth-Max-Up' AS attribute, ':=' AS op, upload AS 'value', id FROM voucher ) b WHERE a.id = b.id AND a.username = '%{SQL-User-Name}' ORDER BY a.id ASC, b.attribute"
authorize_group_check_query = "SELECT id, groupname, attribute, Value, op FROM radgroupcheck WHERE groupname = '%{SQL-Group}' ORDER BY id"
authorize_group_reply_query = "SELECT id, groupname, attribute, value, op FROM radgroupreply WHERE groupname = '%{SQL-Group}' ORDER BY id"
group_membership_query = "SELECT groupname FROM radusergroup WHERE username = '%{SQL-User-Name}' ORDER BY priority"
simul_count_query = "SELECT COUNT(*) FROM radacct WHERE username = '%{SQL-User-Name}' AND acctstoptime IS NULL"
simul_verify_query = "SELECT radacctid, acctsessionid, username, nasipaddress, nasportid, framedipaddress, callingstationid, framedprotocol FROM radacct WHERE username = '%{SQL-User-Name}' AND acctstoptime IS NULL"
safe_characters = "@abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789.-_: /"
auto_escape = no
accounting {
reference = "%{tolower:type.%{Acct-Status-Type}.query}"
type {
accounting-on {
query = "UPDATE radacct SET acctstoptime = FROM_UNIXTIME(%{integer:Event-Timestamp}), acctsessiontime = '%{integer:Event-Timestamp}' - UNIX_TIMESTAMP(acctstarttime), acctterminatecause = '%{%{Acct-Terminate-Cause}:-NAS-Reboot}' WHERE acctstoptime IS NULL AND nasipaddress = '%{NAS-IP-Address}' AND acctstarttime <= FROM_UNIXTIME(%{integer:Event-Timestamp})"
}
accounting-off {
query = "UPDATE radacct SET acctstoptime = FROM_UNIXTIME(%{integer:Event-Timestamp}), acctsessiontime = '%{integer:Event-Timestamp}' - UNIX_TIMESTAMP(acctstarttime), acctterminatecause = '%{%{Acct-Terminate-Cause}:-NAS-Reboot}' WHERE acctstoptime IS NULL AND nasipaddress = '%{NAS-IP-Address}' AND acctstarttime <= FROM_UNIXTIME(%{integer:Event-Timestamp})"
}
start {
query = "INSERT INTO radacct (acctsessionid, acctuniqueid, username, realm, nasipaddress, nasportid, nasporttype, acctstarttime, acctupdatetime, acctstoptime, acctsessiontime, acctauthentic, connectinfo_start, connectinfo_stop, acctinputoctets, acctoutputoctets, calledstationid, callingstationid, acctterminatecause, servicetype, framedprotocol, framedipaddress) VALUES ('%{Acct-Session-Id}', '%{Acct-Unique-Session-Id}', '%{SQL-User-Name}', '%{Realm}', '%{NAS-IP-Address}', '%{%{NAS-Port-ID}:-%{NAS-Port}}', '%{NAS-Port-Type}', FROM_UNIXTIME(%{integer:Event-Timestamp}), FROM_UNIXTIME(%{integer:Event-Timestamp}), NULL, '0', '%{Acct-Authentic}', '%{Connect-Info}', '', '0', '0', '%{Called-Station-Id}', '%{Calling-Station-Id}', '', '%{Service-Type}', '%{Framed-Protocol}', '%{Framed-IP-Address}')"
}
interim-update {
query = "UPDATE radacct SET acctupdatetime = (@acctupdatetime_old:=acctupdatetime), acctupdatetime = FROM_UNIXTIME(%{integer:Event-Timestamp}), acctinterval = %{integer:Event-Timestamp} - UNIX_TIMESTAMP(@acctupdatetime_old), framedipaddress = '%{Framed-IP-Address}', acctsessiontime = %{%{Acct-Session-Time}:-NULL}, acctinputoctets = '%{%{Acct-Input-Gigawords}:-0}' << 32 | '%{%{Acct-Input-Octets}:-0}', acctoutputoctets = '%{%{Acct-Output-Gigawords}:-0}' << 32 | '%{%{Acct-Output-Octets}:-0}' WHERE AcctUniqueId = '%{Acct-Unique-Session-Id}'"
}
stop {
query = "UPDATE radacct SET acctstoptime = FROM_UNIXTIME(%{integer:Event-Timestamp}), acctsessiontime = %{%{Acct-Session-Time}:-NULL}, acctinputoctets = '%{%{Acct-Input-Gigawords}:-0}' << 32 | '%{%{Acct-Input-Octets}:-0}', acctoutputoctets = '%{%{Acct-Output-Gigawords}:-0}' << 32 | '%{%{Acct-Output-Octets}:-0}', acctterminatecause = '%{Acct-Terminate-Cause}', connectinfo_stop = '%{Connect-Info}' WHERE AcctUniqueId = '%{Acct-Unique-Session-Id}'"
}
}
}
post-auth {
reference = ".query"
query = "INSERT INTO radpostauth (username, pass, reply, authdate) VALUES ( '%{SQL-User-Name}', '%{%{User-Password}:-%{Chap-Password}}', '%{reply:Packet-Type}', '%S')"
}
}
rlm_sql (sql): Driver rlm_sql_mysql (module rlm_sql_mysql) loaded and linked
Creating attribute SQL-Group
# Loaded module rlm_sqlcounter
# Loading module "dailycounter" from file /etc/freeradius/3.0/mods-available/sqlcounter
sqlcounter dailycounter {
sql_module_instance = "sql"
key = "User-Name"
query = "SELECT SUM(acctsessiontime - GREATEST((%%b - UNIX_TIMESTAMP(acctstarttime)), 0)) FROM radacct WHERE username = '%{User-Name}' AND UNIX_TIMESTAMP(acctstarttime) + acctsessiontime > '%%b'"
reset = "daily"
counter_name = "Daily-Session-Time"
check_name = "Max-Daily-Session"
reply_name = "Session-Timeout"
}
# Loading module "monthlycounter" from file /etc/freeradius/3.0/mods-available/sqlcounter
sqlcounter monthlycounter {
sql_module_instance = "sql"
key = "User-Name"
query = "SELECT SUM(acctsessiontime - GREATEST((%%b - UNIX_TIMESTAMP(acctstarttime)), 0)) FROM radacct WHERE username='%{User-Name}' AND UNIX_TIMESTAMP(acctstarttime) + acctsessiontime > '%%b'"
reset = "monthly"
counter_name = "Monthly-Session-Time"
check_name = "Max-Monthly-Session"
reply_name = "Session-Timeout"
}
# Loading module "noresetcounter" from file /etc/freeradius/3.0/mods-available/sqlcounter
sqlcounter noresetcounter {
sql_module_instance = "sql"
key = "User-Name"
query = "SELECT IFNULL(SUM(AcctSessionTime),0) FROM radacct WHERE UserName='%{User-Name}'"
reset = "never"
counter_name = "Max-All-Session-Time"
check_name = "Max-All-Session"
reply_name = "Session-Timeout"
}
# Loading module "expire_on_login" from file /etc/freeradius/3.0/mods-available/sqlcounter
sqlcounter expire_on_login {
sql_module_instance = "sql"
key = "User-Name"
query = "SELECT IFNULL( MAX(TIME_TO_SEC(TIMEDIFF(NOW(), acctstarttime))),0) FROM radacct WHERE UserName='%{User-Name}' ORDER BY acctstarttime LIMIT 1;"
reset = "never"
counter_name = "Expire-After-Initial-Login"
check_name = "Expire-After"
reply_name = "Session-Timeout"
}
# Loading module "accessperiod" from file /etc/freeradius/3.0/mods-available/sqlcounter
sqlcounter accessperiod {
sql_module_instance = "sql"
key = "User-Name"
query = "SELECT IF(COUNT(radacctid>=1),(UNIX_TIMESTAMP() - IFNULL(UNIX_TIMESTAMP(AcctStartTime),0)),0) FROM radacct WHERE UserName = '%{User-Name}' AND AcctSessionTime >= 1 ORDER BY AcctStartTime LIMIT 1"
reset = "never"
counter_name = "Max-Access-Period-Never"
check_name = "Access-Period"
reply_name = "Session-Timeout"
}
instantiate {
}
# Instantiating module "IPASS" from file /etc/freeradius/3.0/mods-enabled/realm
# Instantiating module "suffix" from file /etc/freeradius/3.0/mods-enabled/realm
# Instantiating module "realmpercent" from file /etc/freeradius/3.0/mods-enabled/realm
# Instantiating module "ntdomain" from file /etc/freeradius/3.0/mods-enabled/realm
# Instantiating module "preprocess" from file /etc/freeradius/3.0/mods-enabled/preprocess
reading pairlist file /etc/freeradius/3.0/mods-config/preprocess/huntgroups
reading pairlist file /etc/freeradius/3.0/mods-config/preprocess/hints
# Instantiating module "mschap" from file /etc/freeradius/3.0/mods-enabled/mschap
rlm_mschap (mschap): using internal authentication
# Instantiating module "expiration" from file /etc/freeradius/3.0/mods-enabled/expiration
# Instantiating module "logintime" from file /etc/freeradius/3.0/mods-enabled/logintime
# Instantiating module "auth_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
rlm_detail (auth_log): 'User-Password' suppressed, will not appear in detail output
# Instantiating module "reply_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
# Instantiating module "pre_proxy_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
# Instantiating module "post_proxy_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
# Instantiating module "eap" from file /etc/freeradius/3.0/mods-enabled/eap
# Linked to sub-module rlm_eap_md5
# Linked to sub-module rlm_eap_leap
# Linked to sub-module rlm_eap_gtc
gtc {
challenge = "Password: "
auth_type = "PAP"
}
# Linked to sub-module rlm_eap_tls
tls {
tls = "tls-common"
}
tls-config tls-common {
verify_depth = 0
ca_path = "/etc/freeradius/3.0/certs"
pem_file_type = yes
private_key_file = "/etc/ssl/private/ssl-cert-snakeoil.key"
certificate_file = "/etc/ssl/certs/ssl-cert-snakeoil.pem"
ca_file = "/etc/ssl/certs/ca-certificates.crt"
private_key_password = <<< secret >>>
dh_file = "/etc/freeradius/3.0/certs/dh"
fragment_size = 1024
include_length = yes
auto_chain = yes
check_crl = no
check_all_crl = no
cipher_list = "DEFAULT"
ecdh_curve = "prime256v1"
tls_max_version = ""
tls_min_version = "1.0"
cache {
enable = yes
lifetime = 24
max_entries = 255
}
verify {
skip_if_ocsp_ok = no
}
ocsp {
enable = no
override_cert_url = yes
url = "http://127.0.0.1/ocsp/"
use_nonce = yes
timeout = 0
softfail = no
}
}
The configuration allows TLS 1.0 and/or TLS 1.1. We STRONGLY recommned using only TLS 1.2 for security
Please set: min_tls_version = "1.2"
# Linked to sub-module rlm_eap_ttls
ttls {
tls = "tls-common"
default_eap_type = "md5"
copy_request_to_tunnel = no
use_tunneled_reply = no
virtual_server = "inner-tunnel"
include_length = yes
require_client_cert = no
}
tls: Using cached TLS configuration from previous invocation
# Linked to sub-module rlm_eap_peap
peap {
tls = "tls-common"
default_eap_type = "mschapv2"
copy_request_to_tunnel = no
use_tunneled_reply = no
proxy_tunneled_request_as_eap = yes
virtual_server = "inner-tunnel"
soh = no
require_client_cert = no
}
tls: Using cached TLS configuration from previous invocation
# Linked to sub-module rlm_eap_mschapv2
mschapv2 {
with_ntdomain_hack = no
send_error = no
}
# Instantiating module "etc_passwd" from file /etc/freeradius/3.0/mods-enabled/passwd
rlm_passwd: nfields: 3 keyfield 0(User-Name) listable: no
# Instantiating module "linelog" from file /etc/freeradius/3.0/mods-enabled/linelog
# Instantiating module "log_accounting" from file /etc/freeradius/3.0/mods-enabled/linelog
# Instantiating module "cache_eap" from file /etc/freeradius/3.0/mods-enabled/cache_eap
rlm_cache (cache_eap): Driver rlm_cache_rbtree (module rlm_cache_rbtree) loaded and linked
# Instantiating module "pap" from file /etc/freeradius/3.0/mods-enabled/pap
# Instantiating module "files" from file /etc/freeradius/3.0/mods-enabled/files
reading pairlist file /etc/freeradius/3.0/mods-config/files/authorize
reading pairlist file /etc/freeradius/3.0/mods-config/files/accounting
reading pairlist file /etc/freeradius/3.0/mods-config/files/pre-proxy
# Instantiating module "reject" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "fail" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "ok" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "handled" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "invalid" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "userlock" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "notfound" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "noop" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "updated" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "detail" from file /etc/freeradius/3.0/mods-enabled/detail
# Instantiating module "attr_filter.post-proxy" from file /etc/freeradius/3.0/mods-enabled/attr_filter
reading pairlist file /etc/freeradius/3.0/mods-config/attr_filter/post-proxy
# Instantiating module "attr_filter.pre-proxy" from file /etc/freeradius/3.0/mods-enabled/attr_filter
reading pairlist file /etc/freeradius/3.0/mods-config/attr_filter/pre-proxy
# Instantiating module "attr_filter.access_reject" from file /etc/freeradius/3.0/mods-enabled/attr_filter
reading pairlist file /etc/freeradius/3.0/mods-config/attr_filter/access_reject
[/etc/freeradius/3.0/mods-config/attr_filter/access_reject]:11 Check item "FreeRADIUS-Response-Delay" found in filter list for realm "DEFAULT".
[/etc/freeradius/3.0/mods-config/attr_filter/access_reject]:11 Check item "FreeRADIUS-Response-Delay-USec" found in filter list for realm "DEFAULT".
# Instantiating module "attr_filter.access_challenge" from file /etc/freeradius/3.0/mods-enabled/attr_filter
reading pairlist file /etc/freeradius/3.0/mods-config/attr_filter/access_challenge
# Instantiating module "attr_filter.accounting_response" from file /etc/freeradius/3.0/mods-enabled/attr_filter
reading pairlist file /etc/freeradius/3.0/mods-config/attr_filter/accounting_response
# Instantiating module "sql" from file /etc/freeradius/3.0/mods-enabled/sql
rlm_sql_mysql: libmysql version: 8.0.26
mysql {
tls {
tls_required = no
check_cert = no
check_cert_cn = no
}
warnings = "auto"
}
rlm_sql (sql): Attempting to connect to database "radius"
rlm_sql (sql): Initialising connection pool
pool {
start = 5
min = 3
max = 32
spare = 10
uses = 0
lifetime = 0
cleanup_interval = 30
idle_timeout = 60
retry_delay = 30
spread = no
}
rlm_sql (sql): Opening additional connection (0), 1 of 32 pending slots used
rlm_sql_mysql: Starting connect to MySQL server
rlm_sql_mysql: Connected to database 'radius' on 127.0.0.1 via TCP/IP, server version 8.0.26, protocol version 10
rlm_sql (sql): Opening additional connection (1), 1 of 31 pending slots used
rlm_sql_mysql: Starting connect to MySQL server
rlm_sql_mysql: Connected to database 'radius' on 127.0.0.1 via TCP/IP, server version 8.0.26, protocol version 10
rlm_sql (sql): Opening additional connection (2), 1 of 30 pending slots used
rlm_sql_mysql: Starting connect to MySQL server
rlm_sql_mysql: Connected to database 'radius' on 127.0.0.1 via TCP/IP, server version 8.0.26, protocol version 10
rlm_sql (sql): Opening additional connection (3), 1 of 29 pending slots used
rlm_sql_mysql: Starting connect to MySQL server
rlm_sql_mysql: Connected to database 'radius' on 127.0.0.1 via TCP/IP, server version 8.0.26, protocol version 10
rlm_sql (sql): Opening additional connection (4), 1 of 28 pending slots used
rlm_sql_mysql: Starting connect to MySQL server
rlm_sql_mysql: Connected to database 'radius' on 127.0.0.1 via TCP/IP, server version 8.0.26, protocol version 10
rlm_sql (sql): Processing generate_sql_clients
rlm_sql (sql) in generate_sql_clients: query is SELECT id, nasname, shortname, type, secret, server FROM nas
rlm_sql (sql): Reserved connection (0)
rlm_sql (sql): Executing select query: SELECT id, nasname, shortname, type, secret, server FROM nas
rlm_sql (sql): Adding client 172.16.3.87 (user-272-10) to global clients list
rlm_sql (172.16.3.87): Client "user-272-10" (sql) added
rlm_sql (sql): Released connection (0)
Need 5 more connections to reach 10 spares
rlm_sql (sql): Opening additional connection (5), 1 of 27 pending slots used
rlm_sql_mysql: Starting connect to MySQL server
rlm_sql_mysql: Connected to database 'radius' on 127.0.0.1 via TCP/IP, server version 8.0.26, protocol version 10
# Instantiating module "dailycounter" from file /etc/freeradius/3.0/mods-available/sqlcounter
rlm_sqlcounter: Current Time: 1636603029 [2021-11-11 03:57:09], Prev reset 1636588800 [2021-11-11 00:00:00]
# Instantiating module "monthlycounter" from file /etc/freeradius/3.0/mods-available/sqlcounter
rlm_sqlcounter: Current Time: 1636603029 [2021-11-11 03:57:09], Prev reset 1635724800 [2021-11-01 00:00:00]
# Instantiating module "noresetcounter" from file /etc/freeradius/3.0/mods-available/sqlcounter
rlm_sqlcounter: Current Time: 1636603029 [2021-11-11 03:57:09], Prev reset 0 [2021-11-11 03:00:00]
# Instantiating module "expire_on_login" from file /etc/freeradius/3.0/mods-available/sqlcounter
rlm_sqlcounter: Current Time: 1636603029 [2021-11-11 03:57:09], Prev reset 0 [2021-11-11 03:00:00]
# Instantiating module "accessperiod" from file /etc/freeradius/3.0/mods-available/sqlcounter
rlm_sqlcounter: Current Time: 1636603029 [2021-11-11 03:57:09], Prev reset 0 [2021-11-11 03:00:00]
} # modules
radiusd: #### Loading Virtual Servers ####
server { # from file /etc/freeradius/3.0/radiusd.conf
} # server
server inner-tunnel { # from file /etc/freeradius/3.0/sites-enabled/inner-tunnel
# Loading authenticate {...}
# Loading authorize {...}
Ignoring "ldap" (see raddb/mods-available/README.rst)
# Loading session {...}
# Loading post-proxy {...}
# Loading post-auth {...}
} # server inner-tunnel
server default { # from file /etc/freeradius/3.0/sites-enabled/default
# Loading authenticate {...}
# Loading authorize {...}
# Loading preacct {...}
# Loading accounting {...}
# Loading session {...}
# Loading post-proxy {...}
# Loading post-auth {...}
} # server default
radiusd: #### Opening IP addresses and Ports ####
listen {
type = "control"
listen {
socket = "/var/run/freeradius/freeradius.sock"
peercred = yes
}
}
listen {
type = "auth"
ipaddr = 127.0.0.1
port = 18160
}
listen {
type = "auth"
ipaddr = *
port = 1816
limit {
max_connections = 16
lifetime = 0
idle_timeout = 30
}
}
listen {
type = "acct"
ipaddr = *
port = 1817
limit {
max_connections = 16
lifetime = 0
idle_timeout = 30
}
}
listen {
type = "auth"
ipv6addr = ::
port = 1816
limit {
max_connections = 16
lifetime = 0
idle_timeout = 30
}
}
listen {
type = "acct"
ipv6addr = ::
port = 1817
limit {
max_connections = 16
lifetime = 0
idle_timeout = 30
}
}
Listening on command file /var/run/freeradius/freeradius.sock
Listening on auth address 127.0.0.1 port 18160 bound to server inner-tunnel
Listening on auth address * port 1816 bound to server default
Listening on acct address * port 1817 bound to server default
Listening on auth address :: port 1816 bound to server default
Listening on acct address :: port 1817 bound to server default
Listening on proxy address * port 35774
Listening on proxy address :: port 42819
Ready to process requests
```
### Relevant log output from client utilities
_No response_
### Backtrace from LLDB or GDB
_No response_
[defect]: Accounting problem

### What type of defect/bug is this?
Unexpected behaviour (obvious or verified by project member)
### How can the issue be reproduced?
I am using a VPN to connect the NAS to the RADIUS server. Sometimes the VPN disconnects and Accounting gets no response. After the VPN reconnects, a new record is added to radacct with a nasipaddress that is not listed in the nas table. The effect is that login time is counted twice.
Why does this happen, and how can it be solved?
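For context, here is a minimal, self-contained sketch of the failure mode. It uses SQLite instead of MySQL, and the IP addresses and timestamps are made up, but the UPDATE mirrors the accounting-on query shown in the debug output below: that query closes stale rows only where `nasipaddress` matches the NAS's current address, so when the NAS comes back on a different VPN address the old open session is never closed and its time is counted again.

```python
import sqlite3

# Simplified radacct table (column names taken from the debug output).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE radacct (radacctid INTEGER PRIMARY KEY, username TEXT,"
    " nasipaddress TEXT, acctstarttime INTEGER, acctstoptime INTEGER)"
)

# Open session recorded before the VPN dropped (no Stop packet arrived),
# seen from the old tunnel address:
conn.execute(
    "INSERT INTO radacct (username, nasipaddress, acctstarttime)"
    " VALUES ('user1', '172.16.3.87', 1000)"
)

# After reconnecting, the NAS reports from a new (hypothetical) VPN
# address, so the accounting-on style UPDATE matches nothing:
new_nas_ip = "10.8.0.2"
conn.execute(
    "UPDATE radacct SET acctstoptime = ?"
    " WHERE acctstoptime IS NULL AND nasipaddress = ?",
    (2000, new_nas_ip),
)

stale_open = conn.execute(
    "SELECT COUNT(*) FROM radacct WHERE acctstoptime IS NULL"
).fetchone()[0]
print(stale_open)  # 1 -> the old session stays open and keeps accruing time
```

This suggests the fix lies in keeping the NAS's RADIUS source address stable across VPN reconnects (or registering every possible tunnel address in the nas table), so the stale-session queries can match and close the old row.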
### Log output from the FreeRADIUS daemon
```shell
FreeRADIUS Version 3.0.20
Copyright (C) 1999-2019 The FreeRADIUS server project and contributors
There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A
PARTICULAR PURPOSE
You may redistribute copies of FreeRADIUS under the terms of the
GNU General Public License
For more information about these matters, see the file named COPYRIGHT
Starting - reading configuration files ...
including dictionary file /usr/share/freeradius/dictionary
including dictionary file /usr/share/freeradius/dictionary.dhcp
including dictionary file /usr/share/freeradius/dictionary.vqp
including dictionary file /etc/freeradius/3.0/dictionary
including configuration file /etc/freeradius/3.0/radiusd.conf
including configuration file /etc/freeradius/3.0/proxy.conf
including configuration file /etc/freeradius/3.0/clients.conf
including files in directory /etc/freeradius/3.0/mods-enabled/
including configuration file /etc/freeradius/3.0/mods-enabled/chap
including configuration file /etc/freeradius/3.0/mods-enabled/ntlm_auth
including configuration file /etc/freeradius/3.0/mods-enabled/realm
including configuration file /etc/freeradius/3.0/mods-enabled/replicate
including configuration file /etc/freeradius/3.0/mods-enabled/preprocess
including configuration file /etc/freeradius/3.0/mods-enabled/mschap
including configuration file /etc/freeradius/3.0/mods-enabled/echo
including configuration file /etc/freeradius/3.0/mods-enabled/soh
including configuration file /etc/freeradius/3.0/mods-enabled/expiration
including configuration file /etc/freeradius/3.0/mods-enabled/utf8
including configuration file /etc/freeradius/3.0/mods-enabled/digest
including configuration file /etc/freeradius/3.0/mods-enabled/sradutmp
including configuration file /etc/freeradius/3.0/mods-enabled/logintime
including configuration file /etc/freeradius/3.0/mods-enabled/detail.log
including configuration file /etc/freeradius/3.0/mods-enabled/eap
including configuration file /etc/freeradius/3.0/mods-enabled/dynamic_clients
including configuration file /etc/freeradius/3.0/mods-enabled/passwd
including configuration file /etc/freeradius/3.0/mods-enabled/linelog
including configuration file /etc/freeradius/3.0/mods-enabled/cache_eap
including configuration file /etc/freeradius/3.0/mods-enabled/radutmp
including configuration file /etc/freeradius/3.0/mods-enabled/pap
including configuration file /etc/freeradius/3.0/mods-enabled/files
including configuration file /etc/freeradius/3.0/mods-enabled/exec
including configuration file /etc/freeradius/3.0/mods-enabled/always
including configuration file /etc/freeradius/3.0/mods-enabled/detail
including configuration file /etc/freeradius/3.0/mods-enabled/unpack
including configuration file /etc/freeradius/3.0/mods-enabled/attr_filter
including configuration file /etc/freeradius/3.0/mods-enabled/unix
including configuration file /etc/freeradius/3.0/mods-enabled/expr
including configuration file /etc/freeradius/3.0/mods-enabled/sql
including configuration file /etc/freeradius/3.0/mods-config/sql/main/mysql/queries.conf
including configuration file /etc/freeradius/3.0/mods-available/sqlcounter
including configuration file /etc/freeradius/3.0/mods-config/sql/counter/mysql/dailycounter.conf
including configuration file /etc/freeradius/3.0/mods-config/sql/counter/mysql/monthlycounter.conf
including configuration file /etc/freeradius/3.0/mods-config/sql/counter/mysql/noresetcounter.conf
including configuration file /etc/freeradius/3.0/mods-config/sql/counter/mysql/expire_on_login.conf
including configuration file /etc/freeradius/3.0/mods-config/sql/counter/mysql/accessperiod.conf
including files in directory /etc/freeradius/3.0/policy.d/
including configuration file /etc/freeradius/3.0/policy.d/dhcp
including configuration file /etc/freeradius/3.0/policy.d/cui
including configuration file /etc/freeradius/3.0/policy.d/moonshot-targeted-ids
including configuration file /etc/freeradius/3.0/policy.d/accounting
including configuration file /etc/freeradius/3.0/policy.d/abfab-tr
including configuration file /etc/freeradius/3.0/policy.d/rfc7542
including configuration file /etc/freeradius/3.0/policy.d/canonicalization
including configuration file /etc/freeradius/3.0/policy.d/eap
including configuration file /etc/freeradius/3.0/policy.d/operator-name
including configuration file /etc/freeradius/3.0/policy.d/control
including configuration file /etc/freeradius/3.0/policy.d/filter
including configuration file /etc/freeradius/3.0/policy.d/debug
including files in directory /etc/freeradius/3.0/sites-enabled/
including configuration file /etc/freeradius/3.0/sites-enabled/inner-tunnel
including configuration file /etc/freeradius/3.0/sites-enabled/default
including configuration file /etc/freeradius/3.0/sites-enabled/control-socket
main {
security {
user = "freerad"
group = "freerad"
allow_core_dumps = no
}
name = "freeradius"
prefix = "/usr"
localstatedir = "/var"
logdir = "/var/log/freeradius"
run_dir = "/var/run/freeradius"
}
main {
name = "freeradius"
prefix = "/usr"
localstatedir = "/var"
sbindir = "/usr/sbin"
logdir = "/var/log/freeradius"
run_dir = "/var/run/freeradius"
libdir = "/usr/lib/freeradius"
radacctdir = "/var/log/freeradius/radacct"
hostname_lookups = no
max_request_time = 30
cleanup_delay = 5
max_requests = 16384
pidfile = "/var/run/freeradius/freeradius.pid"
checkrad = "/usr/sbin/checkrad"
debug_level = 0
proxy_requests = yes
log {
stripped_names = no
auth = no
auth_badpass = no
auth_goodpass = no
colourise = yes
msg_denied = "You are already logged in - access denied"
}
resources {
}
security {
max_attributes = 200
reject_delay = 1.000000
status_server = yes
}
}
radiusd: #### Loading Realms and Home Servers ####
proxy server {
retry_delay = 5
retry_count = 3
default_fallback = no
dead_time = 120
wake_all_if_all_dead = no
}
home_server localhost {
ipaddr = 127.0.0.1
port = 1812
type = "auth"
secret = <<< secret >>>
response_window = 20.000000
response_timeouts = 1
max_outstanding = 65536
zombie_period = 40
status_check = "status-server"
ping_interval = 30
check_interval = 30
check_timeout = 4
num_answers_to_alive = 3
revive_interval = 120
limit {
max_connections = 16
max_requests = 0
lifetime = 0
idle_timeout = 0
}
coa {
irt = 2
mrt = 16
mrc = 5
mrd = 30
}
}
home_server_pool my_auth_failover {
type = fail-over
home_server = localhost
}
realm example.com {
auth_pool = my_auth_failover
}
realm LOCAL {
}
radiusd: #### Loading Clients ####
client localhost {
ipaddr = 127.0.0.1
require_message_authenticator = no
secret = <<< secret >>>
nas_type = "other"
proto = "*"
limit {
max_connections = 16
lifetime = 0
idle_timeout = 30
}
}
client localhost_ipv6 {
ipv6addr = ::1
require_message_authenticator = no
secret = <<< secret >>>
limit {
max_connections = 16
lifetime = 0
idle_timeout = 30
}
}
Debug state unknown (cap_sys_ptrace capability not set)
systemd watchdog is disabled
# Creating Auth-Type = mschap
# Creating Auth-Type = eap
# Creating Auth-Type = PAP
# Creating Auth-Type = CHAP
# Creating Auth-Type = MS-CHAP
# Creating Auth-Type = digest
radiusd: #### Instantiating modules ####
modules {
# Loaded module rlm_chap
# Loading module "chap" from file /etc/freeradius/3.0/mods-enabled/chap
# Loaded module rlm_exec
# Loading module "ntlm_auth" from file /etc/freeradius/3.0/mods-enabled/ntlm_auth
exec ntlm_auth {
wait = yes
program = "/path/to/ntlm_auth --request-nt-key --domain=MYDOMAIN --username=%{mschap:User-Name} --password=%{User-Password}"
shell_escape = yes
}
# Loaded module rlm_realm
# Loading module "IPASS" from file /etc/freeradius/3.0/mods-enabled/realm
realm IPASS {
format = "prefix"
delimiter = "/"
ignore_default = no
ignore_null = no
}
# Loading module "suffix" from file /etc/freeradius/3.0/mods-enabled/realm
realm suffix {
format = "suffix"
delimiter = "@"
ignore_default = no
ignore_null = no
}
# Loading module "realmpercent" from file /etc/freeradius/3.0/mods-enabled/realm
realm realmpercent {
format = "suffix"
delimiter = "%"
ignore_default = no
ignore_null = no
}
# Loading module "ntdomain" from file /etc/freeradius/3.0/mods-enabled/realm
realm ntdomain {
format = "prefix"
delimiter = "\\"
ignore_default = no
ignore_null = no
}
# Loaded module rlm_replicate
# Loading module "replicate" from file /etc/freeradius/3.0/mods-enabled/replicate
# Loaded module rlm_preprocess
# Loading module "preprocess" from file /etc/freeradius/3.0/mods-enabled/preprocess
preprocess {
huntgroups = "/etc/freeradius/3.0/mods-config/preprocess/huntgroups"
hints = "/etc/freeradius/3.0/mods-config/preprocess/hints"
with_ascend_hack = no
ascend_channels_per_line = 23
with_ntdomain_hack = no
with_specialix_jetstream_hack = no
with_cisco_vsa_hack = no
with_alvarion_vsa_hack = no
}
# Loaded module rlm_mschap
# Loading module "mschap" from file /etc/freeradius/3.0/mods-enabled/mschap
mschap {
use_mppe = yes
require_encryption = no
require_strong = no
with_ntdomain_hack = yes
passchange {
}
allow_retry = yes
winbind_retry_with_normalised_username = no
}
# Loading module "echo" from file /etc/freeradius/3.0/mods-enabled/echo
exec echo {
wait = yes
program = "/bin/echo %{User-Name}"
input_pairs = "request"
output_pairs = "reply"
shell_escape = yes
}
# Loaded module rlm_soh
# Loading module "soh" from file /etc/freeradius/3.0/mods-enabled/soh
soh {
dhcp = yes
}
# Loaded module rlm_expiration
# Loading module "expiration" from file /etc/freeradius/3.0/mods-enabled/expiration
# Loaded module rlm_utf8
# Loading module "utf8" from file /etc/freeradius/3.0/mods-enabled/utf8
# Loaded module rlm_digest
# Loading module "digest" from file /etc/freeradius/3.0/mods-enabled/digest
# Loaded module rlm_radutmp
# Loading module "sradutmp" from file /etc/freeradius/3.0/mods-enabled/sradutmp
radutmp sradutmp {
filename = "/var/log/freeradius/sradutmp"
username = "%{User-Name}"
case_sensitive = yes
check_with_nas = yes
permissions = 420
caller_id = no
}
# Loaded module rlm_logintime
# Loading module "logintime" from file /etc/freeradius/3.0/mods-enabled/logintime
logintime {
minimum_timeout = 60
}
# Loaded module rlm_detail
# Loading module "auth_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
detail auth_log {
filename = "/var/log/freeradius/radacct/%{%{Packet-Src-IP-Address}:-%{Packet-Src-IPv6-Address}}/auth-detail-%Y%m%d"
header = "%t"
permissions = 384
locking = no
escape_filenames = no
log_packet_header = no
}
# Loading module "reply_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
detail reply_log {
filename = "/var/log/freeradius/radacct/%{%{Packet-Src-IP-Address}:-%{Packet-Src-IPv6-Address}}/reply-detail-%Y%m%d"
header = "%t"
permissions = 384
locking = no
escape_filenames = no
log_packet_header = no
}
# Loading module "pre_proxy_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
detail pre_proxy_log {
filename = "/var/log/freeradius/radacct/%{%{Packet-Src-IP-Address}:-%{Packet-Src-IPv6-Address}}/pre-proxy-detail-%Y%m%d"
header = "%t"
permissions = 384
locking = no
escape_filenames = no
log_packet_header = no
}
# Loading module "post_proxy_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
detail post_proxy_log {
filename = "/var/log/freeradius/radacct/%{%{Packet-Src-IP-Address}:-%{Packet-Src-IPv6-Address}}/post-proxy-detail-%Y%m%d"
header = "%t"
permissions = 384
locking = no
escape_filenames = no
log_packet_header = no
}
# Loaded module rlm_eap
# Loading module "eap" from file /etc/freeradius/3.0/mods-enabled/eap
eap {
default_eap_type = "md5"
timer_expire = 60
ignore_unknown_eap_types = no
cisco_accounting_username_bug = no
max_sessions = 16384
}
# Loaded module rlm_dynamic_clients
# Loading module "dynamic_clients" from file /etc/freeradius/3.0/mods-enabled/dynamic_clients
# Loaded module rlm_passwd
# Loading module "etc_passwd" from file /etc/freeradius/3.0/mods-enabled/passwd
passwd etc_passwd {
filename = "/etc/passwd"
format = "*User-Name:Crypt-Password:"
delimiter = ":"
ignore_nislike = no
ignore_empty = yes
allow_multiple_keys = no
hash_size = 100
}
# Loaded module rlm_linelog
# Loading module "linelog" from file /etc/freeradius/3.0/mods-enabled/linelog
linelog {
filename = "/var/log/freeradius/linelog"
escape_filenames = no
syslog_severity = "info"
permissions = 384
format = "This is a log message for %{User-Name}"
reference = "messages.%{%{reply:Packet-Type}:-default}"
}
# Loading module "log_accounting" from file /etc/freeradius/3.0/mods-enabled/linelog
linelog log_accounting {
filename = "/var/log/freeradius/linelog-accounting"
escape_filenames = no
syslog_severity = "info"
permissions = 384
format = ""
reference = "Accounting-Request.%{%{Acct-Status-Type}:-unknown}"
}
# Loaded module rlm_cache
# Loading module "cache_eap" from file /etc/freeradius/3.0/mods-enabled/cache_eap
cache cache_eap {
driver = "rlm_cache_rbtree"
key = "%{%{control:State}:-%{%{reply:State}:-%{State}}}"
ttl = 15
max_entries = 0
epoch = 0
add_stats = no
}
# Loading module "radutmp" from file /etc/freeradius/3.0/mods-enabled/radutmp
radutmp {
filename = "/var/log/freeradius/sradutmp"
username = "%{User-Name}"
case_sensitive = yes
check_with_nas = yes
permissions = 384
caller_id = yes
}
# Loaded module rlm_pap
# Loading module "pap" from file /etc/freeradius/3.0/mods-enabled/pap
pap {
normalise = yes
}
# Loaded module rlm_files
# Loading module "files" from file /etc/freeradius/3.0/mods-enabled/files
files {
filename = "/etc/freeradius/3.0/mods-config/files/authorize"
acctusersfile = "/etc/freeradius/3.0/mods-config/files/accounting"
preproxy_usersfile = "/etc/freeradius/3.0/mods-config/files/pre-proxy"
}
# Loading module "exec" from file /etc/freeradius/3.0/mods-enabled/exec
exec {
wait = no
input_pairs = "request"
shell_escape = yes
timeout = 10
}
# Loaded module rlm_always
# Loading module "reject" from file /etc/freeradius/3.0/mods-enabled/always
always reject {
rcode = "reject"
simulcount = 0
mpp = no
}
# Loading module "fail" from file /etc/freeradius/3.0/mods-enabled/always
always fail {
rcode = "fail"
simulcount = 0
mpp = no
}
# Loading module "ok" from file /etc/freeradius/3.0/mods-enabled/always
always ok {
rcode = "ok"
simulcount = 0
mpp = no
}
# Loading module "handled" from file /etc/freeradius/3.0/mods-enabled/always
always handled {
rcode = "handled"
simulcount = 0
mpp = no
}
# Loading module "invalid" from file /etc/freeradius/3.0/mods-enabled/always
always invalid {
rcode = "invalid"
simulcount = 0
mpp = no
}
# Loading module "userlock" from file /etc/freeradius/3.0/mods-enabled/always
always userlock {
rcode = "userlock"
simulcount = 0
mpp = no
}
# Loading module "notfound" from file /etc/freeradius/3.0/mods-enabled/always
always notfound {
rcode = "notfound"
simulcount = 0
mpp = no
}
# Loading module "noop" from file /etc/freeradius/3.0/mods-enabled/always
always noop {
rcode = "noop"
simulcount = 0
mpp = no
}
# Loading module "updated" from file /etc/freeradius/3.0/mods-enabled/always
always updated {
rcode = "updated"
simulcount = 0
mpp = no
}
# Loading module "detail" from file /etc/freeradius/3.0/mods-enabled/detail
detail {
filename = "/var/log/freeradius/radacct/%{%{Packet-Src-IP-Address}:-%{Packet-Src-IPv6-Address}}/detail-%Y%m%d"
header = "%t"
permissions = 384
locking = no
escape_filenames = no
log_packet_header = no
}
# Loaded module rlm_unpack
# Loading module "unpack" from file /etc/freeradius/3.0/mods-enabled/unpack
# Loaded module rlm_attr_filter
# Loading module "attr_filter.post-proxy" from file /etc/freeradius/3.0/mods-enabled/attr_filter
attr_filter attr_filter.post-proxy {
filename = "/etc/freeradius/3.0/mods-config/attr_filter/post-proxy"
key = "%{Realm}"
relaxed = no
}
# Loading module "attr_filter.pre-proxy" from file /etc/freeradius/3.0/mods-enabled/attr_filter
attr_filter attr_filter.pre-proxy {
filename = "/etc/freeradius/3.0/mods-config/attr_filter/pre-proxy"
key = "%{Realm}"
relaxed = no
}
# Loading module "attr_filter.access_reject" from file /etc/freeradius/3.0/mods-enabled/attr_filter
attr_filter attr_filter.access_reject {
filename = "/etc/freeradius/3.0/mods-config/attr_filter/access_reject"
key = "%{User-Name}"
relaxed = no
}
# Loading module "attr_filter.access_challenge" from file /etc/freeradius/3.0/mods-enabled/attr_filter
attr_filter attr_filter.access_challenge {
filename = "/etc/freeradius/3.0/mods-config/attr_filter/access_challenge"
key = "%{User-Name}"
relaxed = no
}
# Loading module "attr_filter.accounting_response" from file /etc/freeradius/3.0/mods-enabled/attr_filter
attr_filter attr_filter.accounting_response {
filename = "/etc/freeradius/3.0/mods-config/attr_filter/accounting_response"
key = "%{User-Name}"
relaxed = no
}
# Loaded module rlm_unix
# Loading module "unix" from file /etc/freeradius/3.0/mods-enabled/unix
unix {
radwtmp = "/var/log/freeradius/radwtmp"
}
Creating attribute Unix-Group
# Loaded module rlm_expr
# Loading module "expr" from file /etc/freeradius/3.0/mods-enabled/expr
expr {
safe_characters = "@abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789.-_: /äéöüàâæçèéêëîïôœùûüaÿÄÉÖÜßÀÂÆÇÈÉÊËÎÏÔŒÙÛÜŸ"
}
# Loaded module rlm_sql
# Loading module "sql" from file /etc/freeradius/3.0/mods-enabled/sql
sql {
driver = "rlm_sql_mysql"
server = "127.0.0.1"
port = 3306
login = "radius"
password = <<< secret >>>
radius_db = "radius"
read_groups = yes
read_profiles = yes
read_clients = yes
delete_stale_sessions = yes
sql_user_name = "%{User-Name}"
default_user_profile = ""
client_query = "SELECT id, nasname, shortname, type, secret, server FROM nas"
authorize_check_query = "SELECT a.id, a.username, b.attribute, b.value, b.op FROM voucher a CROSS JOIN( SELECT 'Cleartext-Password' AS attribute, ':=' AS op, password AS 'value', id FROM voucher UNION SELECT 'Max-All-Session' AS attribute, ':=' AS op, session AS 'value', id FROM voucher UNION SELECT 'Access-Period' AS attribute, ':=' AS op, access_periode AS 'value', id FROM voucher UNION SELECT 'Simultaneous-Use' AS attribute, ':=' AS op, '1' AS 'value', id FROM voucher ) b WHERE a.id = b.id AND a.username = '%{SQL-User-Name}' ORDER BY a.id ASC, b.attribute"
authorize_reply_query = "SELECT a.id, a.username, b.attribute, b.value, b.op FROM voucher a CROSS JOIN( SELECT 'WISPr-Bandwidth-Max-Down' AS attribute, ':=' AS op, download AS 'value', id FROM voucher UNION SELECT 'WISPr-Bandwidth-Max-Up' AS attribute, ':=' AS op, upload AS 'value', id FROM voucher ) b WHERE a.id = b.id AND a.username = '%{SQL-User-Name}' ORDER BY a.id ASC, b.attribute"
authorize_group_check_query = "SELECT id, groupname, attribute, Value, op FROM radgroupcheck WHERE groupname = '%{SQL-Group}' ORDER BY id"
authorize_group_reply_query = "SELECT id, groupname, attribute, value, op FROM radgroupreply WHERE groupname = '%{SQL-Group}' ORDER BY id"
group_membership_query = "SELECT groupname FROM radusergroup WHERE username = '%{SQL-User-Name}' ORDER BY priority"
simul_count_query = "SELECT COUNT(*) FROM radacct WHERE username = '%{SQL-User-Name}' AND acctstoptime IS NULL"
simul_verify_query = "SELECT radacctid, acctsessionid, username, nasipaddress, nasportid, framedipaddress, callingstationid, framedprotocol FROM radacct WHERE username = '%{SQL-User-Name}' AND acctstoptime IS NULL"
safe_characters = "@abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789.-_: /"
auto_escape = no
accounting {
reference = "%{tolower:type.%{Acct-Status-Type}.query}"
type {
accounting-on {
query = "UPDATE radacct SET acctstoptime = FROM_UNIXTIME(%{integer:Event-Timestamp}), acctsessiontime = '%{integer:Event-Timestamp}' - UNIX_TIMESTAMP(acctstarttime), acctterminatecause = '%{%{Acct-Terminate-Cause}:-NAS-Reboot}' WHERE acctstoptime IS NULL AND nasipaddress = '%{NAS-IP-Address}' AND acctstarttime <= FROM_UNIXTIME(%{integer:Event-Timestamp})"
}
accounting-off {
query = "UPDATE radacct SET acctstoptime = FROM_UNIXTIME(%{integer:Event-Timestamp}), acctsessiontime = '%{integer:Event-Timestamp}' - UNIX_TIMESTAMP(acctstarttime), acctterminatecause = '%{%{Acct-Terminate-Cause}:-NAS-Reboot}' WHERE acctstoptime IS NULL AND nasipaddress = '%{NAS-IP-Address}' AND acctstarttime <= FROM_UNIXTIME(%{integer:Event-Timestamp})"
}
start {
query = "INSERT INTO radacct (acctsessionid, acctuniqueid, username, realm, nasipaddress, nasportid, nasporttype, acctstarttime, acctupdatetime, acctstoptime, acctsessiontime, acctauthentic, connectinfo_start, connectinfo_stop, acctinputoctets, acctoutputoctets, calledstationid, callingstationid, acctterminatecause, servicetype, framedprotocol, framedipaddress) VALUES ('%{Acct-Session-Id}', '%{Acct-Unique-Session-Id}', '%{SQL-User-Name}', '%{Realm}', '%{NAS-IP-Address}', '%{%{NAS-Port-ID}:-%{NAS-Port}}', '%{NAS-Port-Type}', FROM_UNIXTIME(%{integer:Event-Timestamp}), FROM_UNIXTIME(%{integer:Event-Timestamp}), NULL, '0', '%{Acct-Authentic}', '%{Connect-Info}', '', '0', '0', '%{Called-Station-Id}', '%{Calling-Station-Id}', '', '%{Service-Type}', '%{Framed-Protocol}', '%{Framed-IP-Address}')"
}
interim-update {
query = "UPDATE radacct SET acctupdatetime = (@acctupdatetime_old:=acctupdatetime), acctupdatetime = FROM_UNIXTIME(%{integer:Event-Timestamp}), acctinterval = %{integer:Event-Timestamp} - UNIX_TIMESTAMP(@acctupdatetime_old), framedipaddress = '%{Framed-IP-Address}', acctsessiontime = %{%{Acct-Session-Time}:-NULL}, acctinputoctets = '%{%{Acct-Input-Gigawords}:-0}' << 32 | '%{%{Acct-Input-Octets}:-0}', acctoutputoctets = '%{%{Acct-Output-Gigawords}:-0}' << 32 | '%{%{Acct-Output-Octets}:-0}' WHERE AcctUniqueId = '%{Acct-Unique-Session-Id}'"
}
stop {
query = "UPDATE radacct SET acctstoptime = FROM_UNIXTIME(%{integer:Event-Timestamp}), acctsessiontime = %{%{Acct-Session-Time}:-NULL}, acctinputoctets = '%{%{Acct-Input-Gigawords}:-0}' << 32 | '%{%{Acct-Input-Octets}:-0}', acctoutputoctets = '%{%{Acct-Output-Gigawords}:-0}' << 32 | '%{%{Acct-Output-Octets}:-0}', acctterminatecause = '%{Acct-Terminate-Cause}', connectinfo_stop = '%{Connect-Info}' WHERE AcctUniqueId = '%{Acct-Unique-Session-Id}'"
}
}
}
post-auth {
reference = ".query"
query = "INSERT INTO radpostauth (username, pass, reply, authdate) VALUES ( '%{SQL-User-Name}', '%{%{User-Password}:-%{Chap-Password}}', '%{reply:Packet-Type}', '%S')"
}
}
rlm_sql (sql): Driver rlm_sql_mysql (module rlm_sql_mysql) loaded and linked
Creating attribute SQL-Group
# Loaded module rlm_sqlcounter
# Loading module "dailycounter" from file /etc/freeradius/3.0/mods-available/sqlcounter
sqlcounter dailycounter {
sql_module_instance = "sql"
key = "User-Name"
query = "SELECT SUM(acctsessiontime - GREATEST((%%b - UNIX_TIMESTAMP(acctstarttime)), 0)) FROM radacct WHERE username = '%{User-Name}' AND UNIX_TIMESTAMP(acctstarttime) + acctsessiontime > '%%b'"
reset = "daily"
counter_name = "Daily-Session-Time"
check_name = "Max-Daily-Session"
reply_name = "Session-Timeout"
}
# Loading module "monthlycounter" from file /etc/freeradius/3.0/mods-available/sqlcounter
sqlcounter monthlycounter {
sql_module_instance = "sql"
key = "User-Name"
query = "SELECT SUM(acctsessiontime - GREATEST((%%b - UNIX_TIMESTAMP(acctstarttime)), 0)) FROM radacct WHERE username='%{User-Name}' AND UNIX_TIMESTAMP(acctstarttime) + acctsessiontime > '%%b'"
reset = "monthly"
counter_name = "Monthly-Session-Time"
check_name = "Max-Monthly-Session"
reply_name = "Session-Timeout"
}
# Loading module "noresetcounter" from file /etc/freeradius/3.0/mods-available/sqlcounter
sqlcounter noresetcounter {
sql_module_instance = "sql"
key = "User-Name"
query = "SELECT IFNULL(SUM(AcctSessionTime),0) FROM radacct WHERE UserName='%{User-Name}'"
reset = "never"
counter_name = "Max-All-Session-Time"
check_name = "Max-All-Session"
reply_name = "Session-Timeout"
}
# Loading module "expire_on_login" from file /etc/freeradius/3.0/mods-available/sqlcounter
sqlcounter expire_on_login {
sql_module_instance = "sql"
key = "User-Name"
query = "SELECT IFNULL( MAX(TIME_TO_SEC(TIMEDIFF(NOW(), acctstarttime))),0) FROM radacct WHERE UserName='%{User-Name}' ORDER BY acctstarttime LIMIT 1;"
reset = "never"
counter_name = "Expire-After-Initial-Login"
check_name = "Expire-After"
reply_name = "Session-Timeout"
}
# Loading module "accessperiod" from file /etc/freeradius/3.0/mods-available/sqlcounter
sqlcounter accessperiod {
sql_module_instance = "sql"
key = "User-Name"
query = "SELECT IF(COUNT(radacctid>=1),(UNIX_TIMESTAMP() - IFNULL(UNIX_TIMESTAMP(AcctStartTime),0)),0) FROM radacct WHERE UserName = '%{User-Name}' AND AcctSessionTime >= 1 ORDER BY AcctStartTime LIMIT 1"
reset = "never"
counter_name = "Max-Access-Period-Never"
check_name = "Access-Period"
reply_name = "Session-Timeout"
}
instantiate {
}
# Instantiating module "IPASS" from file /etc/freeradius/3.0/mods-enabled/realm
# Instantiating module "suffix" from file /etc/freeradius/3.0/mods-enabled/realm
# Instantiating module "realmpercent" from file /etc/freeradius/3.0/mods-enabled/realm
# Instantiating module "ntdomain" from file /etc/freeradius/3.0/mods-enabled/realm
# Instantiating module "preprocess" from file /etc/freeradius/3.0/mods-enabled/preprocess
reading pairlist file /etc/freeradius/3.0/mods-config/preprocess/huntgroups
reading pairlist file /etc/freeradius/3.0/mods-config/preprocess/hints
# Instantiating module "mschap" from file /etc/freeradius/3.0/mods-enabled/mschap
rlm_mschap (mschap): using internal authentication
# Instantiating module "expiration" from file /etc/freeradius/3.0/mods-enabled/expiration
# Instantiating module "logintime" from file /etc/freeradius/3.0/mods-enabled/logintime
# Instantiating module "auth_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
rlm_detail (auth_log): 'User-Password' suppressed, will not appear in detail output
# Instantiating module "reply_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
# Instantiating module "pre_proxy_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
# Instantiating module "post_proxy_log" from file /etc/freeradius/3.0/mods-enabled/detail.log
# Instantiating module "eap" from file /etc/freeradius/3.0/mods-enabled/eap
# Linked to sub-module rlm_eap_md5
# Linked to sub-module rlm_eap_leap
# Linked to sub-module rlm_eap_gtc
gtc {
challenge = "Password: "
auth_type = "PAP"
}
# Linked to sub-module rlm_eap_tls
tls {
tls = "tls-common"
}
tls-config tls-common {
verify_depth = 0
ca_path = "/etc/freeradius/3.0/certs"
pem_file_type = yes
private_key_file = "/etc/ssl/private/ssl-cert-snakeoil.key"
certificate_file = "/etc/ssl/certs/ssl-cert-snakeoil.pem"
ca_file = "/etc/ssl/certs/ca-certificates.crt"
private_key_password = <<< secret >>>
dh_file = "/etc/freeradius/3.0/certs/dh"
fragment_size = 1024
include_length = yes
auto_chain = yes
check_crl = no
check_all_crl = no
cipher_list = "DEFAULT"
ecdh_curve = "prime256v1"
tls_max_version = ""
tls_min_version = "1.0"
cache {
enable = yes
lifetime = 24
max_entries = 255
}
verify {
skip_if_ocsp_ok = no
}
ocsp {
enable = no
override_cert_url = yes
url = "http://127.0.0.1/ocsp/"
use_nonce = yes
timeout = 0
softfail = no
}
}
The configuration allows TLS 1.0 and/or TLS 1.1. We STRONGLY recommned using only TLS 1.2 for security
Please set: min_tls_version = "1.2"
# Linked to sub-module rlm_eap_ttls
ttls {
tls = "tls-common"
default_eap_type = "md5"
copy_request_to_tunnel = no
use_tunneled_reply = no
virtual_server = "inner-tunnel"
include_length = yes
require_client_cert = no
}
tls: Using cached TLS configuration from previous invocation
# Linked to sub-module rlm_eap_peap
peap {
tls = "tls-common"
default_eap_type = "mschapv2"
copy_request_to_tunnel = no
use_tunneled_reply = no
proxy_tunneled_request_as_eap = yes
virtual_server = "inner-tunnel"
soh = no
require_client_cert = no
}
tls: Using cached TLS configuration from previous invocation
# Linked to sub-module rlm_eap_mschapv2
mschapv2 {
with_ntdomain_hack = no
send_error = no
}
# Instantiating module "etc_passwd" from file /etc/freeradius/3.0/mods-enabled/passwd
rlm_passwd: nfields: 3 keyfield 0(User-Name) listable: no
# Instantiating module "linelog" from file /etc/freeradius/3.0/mods-enabled/linelog
# Instantiating module "log_accounting" from file /etc/freeradius/3.0/mods-enabled/linelog
# Instantiating module "cache_eap" from file /etc/freeradius/3.0/mods-enabled/cache_eap
rlm_cache (cache_eap): Driver rlm_cache_rbtree (module rlm_cache_rbtree) loaded and linked
# Instantiating module "pap" from file /etc/freeradius/3.0/mods-enabled/pap
# Instantiating module "files" from file /etc/freeradius/3.0/mods-enabled/files
reading pairlist file /etc/freeradius/3.0/mods-config/files/authorize
reading pairlist file /etc/freeradius/3.0/mods-config/files/accounting
reading pairlist file /etc/freeradius/3.0/mods-config/files/pre-proxy
# Instantiating module "reject" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "fail" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "ok" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "handled" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "invalid" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "userlock" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "notfound" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "noop" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "updated" from file /etc/freeradius/3.0/mods-enabled/always
# Instantiating module "detail" from file /etc/freeradius/3.0/mods-enabled/detail
# Instantiating module "attr_filter.post-proxy" from file /etc/freeradius/3.0/mods-enabled/attr_filter
reading pairlist file /etc/freeradius/3.0/mods-config/attr_filter/post-proxy
# Instantiating module "attr_filter.pre-proxy" from file /etc/freeradius/3.0/mods-enabled/attr_filter
reading pairlist file /etc/freeradius/3.0/mods-config/attr_filter/pre-proxy
# Instantiating module "attr_filter.access_reject" from file /etc/freeradius/3.0/mods-enabled/attr_filter
reading pairlist file /etc/freeradius/3.0/mods-config/attr_filter/access_reject
[/etc/freeradius/3.0/mods-config/attr_filter/access_reject]:11 Check item "FreeRADIUS-Response-Delay" found in filter list for realm "DEFAULT".
[/etc/freeradius/3.0/mods-config/attr_filter/access_reject]:11 Check item "FreeRADIUS-Response-Delay-USec" found in filter list for realm "DEFAULT".
# Instantiating module "attr_filter.access_challenge" from file /etc/freeradius/3.0/mods-enabled/attr_filter
reading pairlist file /etc/freeradius/3.0/mods-config/attr_filter/access_challenge
# Instantiating module "attr_filter.accounting_response" from file /etc/freeradius/3.0/mods-enabled/attr_filter
reading pairlist file /etc/freeradius/3.0/mods-config/attr_filter/accounting_response
# Instantiating module "sql" from file /etc/freeradius/3.0/mods-enabled/sql
rlm_sql_mysql: libmysql version: 8.0.26
mysql {
tls {
tls_required = no
check_cert = no
check_cert_cn = no
}
warnings = "auto"
}
rlm_sql (sql): Attempting to connect to database "radius"
rlm_sql (sql): Initialising connection pool
pool {
start = 5
min = 3
max = 32
spare = 10
uses = 0
lifetime = 0
cleanup_interval = 30
idle_timeout = 60
retry_delay = 30
spread = no
}
rlm_sql (sql): Opening additional connection (0), 1 of 32 pending slots used
rlm_sql_mysql: Starting connect to MySQL server
rlm_sql_mysql: Connected to database 'radius' on 127.0.0.1 via TCP/IP, server version 8.0.26, protocol version 10
rlm_sql (sql): Opening additional connection (1), 1 of 31 pending slots used
rlm_sql_mysql: Starting connect to MySQL server
rlm_sql_mysql: Connected to database 'radius' on 127.0.0.1 via TCP/IP, server version 8.0.26, protocol version 10
rlm_sql (sql): Opening additional connection (2), 1 of 30 pending slots used
rlm_sql_mysql: Starting connect to MySQL server
rlm_sql_mysql: Connected to database 'radius' on 127.0.0.1 via TCP/IP, server version 8.0.26, protocol version 10
rlm_sql (sql): Opening additional connection (3), 1 of 29 pending slots used
rlm_sql_mysql: Starting connect to MySQL server
rlm_sql_mysql: Connected to database 'radius' on 127.0.0.1 via TCP/IP, server version 8.0.26, protocol version 10
rlm_sql (sql): Opening additional connection (4), 1 of 28 pending slots used
rlm_sql_mysql: Starting connect to MySQL server
rlm_sql_mysql: Connected to database 'radius' on 127.0.0.1 via TCP/IP, server version 8.0.26, protocol version 10
rlm_sql (sql): Processing generate_sql_clients
rlm_sql (sql) in generate_sql_clients: query is SELECT id, nasname, shortname, type, secret, server FROM nas
rlm_sql (sql): Reserved connection (0)
rlm_sql (sql): Executing select query: SELECT id, nasname, shortname, type, secret, server FROM nas
rlm_sql (sql): Adding client 172.16.3.87 (user-272-10) to global clients list
rlm_sql (172.16.3.87): Client "user-272-10" (sql) added
rlm_sql (sql): Released connection (0)
Need 5 more connections to reach 10 spares
rlm_sql (sql): Opening additional connection (5), 1 of 27 pending slots used
rlm_sql_mysql: Starting connect to MySQL server
rlm_sql_mysql: Connected to database 'radius' on 127.0.0.1 via TCP/IP, server version 8.0.26, protocol version 10
# Instantiating module "dailycounter" from file /etc/freeradius/3.0/mods-available/sqlcounter
rlm_sqlcounter: Current Time: 1636603029 [2021-11-11 03:57:09], Prev reset 1636588800 [2021-11-11 00:00:00]
# Instantiating module "monthlycounter" from file /etc/freeradius/3.0/mods-available/sqlcounter
rlm_sqlcounter: Current Time: 1636603029 [2021-11-11 03:57:09], Prev reset 1635724800 [2021-11-01 00:00:00]
# Instantiating module "noresetcounter" from file /etc/freeradius/3.0/mods-available/sqlcounter
rlm_sqlcounter: Current Time: 1636603029 [2021-11-11 03:57:09], Prev reset 0 [2021-11-11 03:00:00]
# Instantiating module "expire_on_login" from file /etc/freeradius/3.0/mods-available/sqlcounter
rlm_sqlcounter: Current Time: 1636603029 [2021-11-11 03:57:09], Prev reset 0 [2021-11-11 03:00:00]
# Instantiating module "accessperiod" from file /etc/freeradius/3.0/mods-available/sqlcounter
rlm_sqlcounter: Current Time: 1636603029 [2021-11-11 03:57:09], Prev reset 0 [2021-11-11 03:00:00]
} # modules
radiusd: #### Loading Virtual Servers ####
server { # from file /etc/freeradius/3.0/radiusd.conf
} # server
server inner-tunnel { # from file /etc/freeradius/3.0/sites-enabled/inner-tunnel
# Loading authenticate {...}
# Loading authorize {...}
Ignoring "ldap" (see raddb/mods-available/README.rst)
# Loading session {...}
# Loading post-proxy {...}
# Loading post-auth {...}
} # server inner-tunnel
server default { # from file /etc/freeradius/3.0/sites-enabled/default
# Loading authenticate {...}
# Loading authorize {...}
# Loading preacct {...}
# Loading accounting {...}
# Loading session {...}
# Loading post-proxy {...}
# Loading post-auth {...}
} # server default
radiusd: #### Opening IP addresses and Ports ####
listen {
type = "control"
listen {
socket = "/var/run/freeradius/freeradius.sock"
peercred = yes
}
}
listen {
type = "auth"
ipaddr = 127.0.0.1
port = 18160
}
listen {
type = "auth"
ipaddr = *
port = 1816
limit {
max_connections = 16
lifetime = 0
idle_timeout = 30
}
}
listen {
type = "acct"
ipaddr = *
port = 1817
limit {
max_connections = 16
lifetime = 0
idle_timeout = 30
}
}
listen {
type = "auth"
ipv6addr = ::
port = 1816
limit {
max_connections = 16
lifetime = 0
idle_timeout = 30
}
}
listen {
type = "acct"
ipv6addr = ::
port = 1817
limit {
max_connections = 16
lifetime = 0
idle_timeout = 30
}
}
Listening on command file /var/run/freeradius/freeradius.sock
Listening on auth address 127.0.0.1 port 18160 bound to server inner-tunnel
Listening on auth address * port 1816 bound to server default
Listening on acct address * port 1817 bound to server default
Listening on auth address :: port 1816 bound to server default
Listening on acct address :: port 1817 bound to server default
Listening on proxy address * port 35774
Listening on proxy address :: port 42819
Ready to process requests
```
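For context on the `rlm_sqlcounter` instantiation lines in the log (e.g. `Current Time: 1636603029 [2021-11-11 03:57:09], Prev reset 1636588800 [2021-11-11 00:00:00]`): each counter derives its previous reset boundary from the current time and its configured `reset` interval. A minimal Python sketch of that calculation, assuming the server clock shown in the log is UTC (the epoch values in the output line up exactly with UTC midnights) — this is an illustration of the arithmetic, not FreeRADIUS's actual implementation:

```python
from datetime import datetime, timezone

def prev_daily_reset(ts: int) -> int:
    # reset = daily: previous boundary is midnight (UTC) of the day containing ts
    dt = datetime.fromtimestamp(ts, tz=timezone.utc)
    midnight = dt.replace(hour=0, minute=0, second=0, microsecond=0)
    return int(midnight.timestamp())

def prev_monthly_reset(ts: int) -> int:
    # reset = monthly: previous boundary is 00:00 (UTC) on the 1st of the month
    dt = datetime.fromtimestamp(ts, tz=timezone.utc)
    first = dt.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
    return int(first.timestamp())

# Values taken from the daemon log above:
print(prev_daily_reset(1636603029))    # 1636588800 -> 2021-11-11 00:00:00
print(prev_monthly_reset(1636603029))  # 1635724800 -> 2021-11-01 00:00:00
```

Counters with `reset = never` (such as `noresetcounter` in the log) report `Prev reset 0`, i.e. they accumulate over all of `radacct` without a boundary.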
### Relevant log output from client utilities
_No response_
### Backtrace from LLDB or GDB
_No response_
|
defect
|
sql clients rlm sql sql in generate sql clients query is select id nasname shortname type secret server from nas rlm sql sql reserved connection rlm sql sql executing select query select id nasname shortname type secret server from nas rlm sql sql adding client user to global clients list rlm sql client user sql added rlm sql sql released connection need more connections to reach spares rlm sql sql opening additional connection of pending slots used rlm sql mysql starting connect to mysql server rlm sql mysql connected to database radius on via tcp ip server version protocol version instantiating module dailycounter from file etc freeradius mods available sqlcounter rlm sqlcounter current time prev reset instantiating module monthlycounter from file etc freeradius mods available sqlcounter rlm sqlcounter current time prev reset instantiating module noresetcounter from file etc freeradius mods available sqlcounter rlm sqlcounter current time prev reset instantiating module expire on login from file etc freeradius mods available sqlcounter rlm sqlcounter current time prev reset instantiating module accessperiod from file etc freeradius mods available sqlcounter rlm sqlcounter current time prev reset modules radiusd loading virtual servers server from file etc freeradius radiusd conf server server inner tunnel from file etc freeradius sites enabled inner tunnel loading authenticate loading authorize ignoring ldap see raddb mods available readme rst loading session loading post proxy loading post auth server inner tunnel server default from file etc freeradius sites enabled default loading authenticate loading authorize loading preacct loading accounting loading session loading post proxy loading post auth server default radiusd opening ip addresses and ports listen type control listen socket var run freeradius freeradius sock peercred yes listen type auth ipaddr port listen type auth ipaddr port limit max connections lifetime idle timeout listen type acct ipaddr port 
limit max connections lifetime idle timeout listen type auth port limit max connections lifetime idle timeout listen type acct port limit max connections lifetime idle timeout listening on command file var run freeradius freeradius sock listening on auth address port bound to server inner tunnel listening on auth address port bound to server default listening on acct address port bound to server default listening on auth address port bound to server default listening on acct address port bound to server default listening on proxy address port listening on proxy address port ready to process requests relevant log output from client utilities no response backtrace from lldb or gdb no response
| 1
|
55,008
| 14,118,433,427
|
IssuesEvent
|
2020-11-08 13:39:40
|
oleg-shilo/cs-script
|
https://api.github.com/repos/oleg-shilo/cs-script
|
closed
|
Compatibility with old script syntax
|
defect done
|
Compile file content:
```C#
//css_reference "System.dll";
//css_reference "System.Windows.Forms.dll";
import System;
import System.Windows.Forms;
public class Script
{
public static function Main(args)
{
Console.WriteLine("Hello World! (JScript)");
MessageBox.Show("Hello World! (JScript)");
}
}
```
Gets exception at file "csparser.cs":
```C#
public ImportInfo(string statement, string parentScript)
{
string statementToParse = statement.Replace("($this.name)", Path.GetFileNameWithoutExtension(parentScript));
statementToParse = statementToParse.Replace("\t", "").Trim();
string[] parts = CSharpParser.SplitByDelimiter(statementToParse, DirectiveDelimiters);
this.file =
this.rawStatement = parts[0];
this.parentScript = parentScript;
---> here: if (!Path.IsPathRooted(this.file) && ResolveRelativeFromParentScriptLocation)
…
```
Solution - remove double quotes from a string statement:
`statementToParse = statementToParse.Replace("\t", "").Trim().Trim('"');`
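The proposed fix can be sketched in Python (the original is C#; the helper name here is illustrative): strip tabs and whitespace first, then any surrounding double quotes left by a directive such as `//css_reference "System.dll";`.

```python
def clean_directive_path(statement):
    # Remove tabs, trim whitespace, then strip surrounding double
    # quotes so a quoted assembly name resolves as a plain path.
    return statement.replace("\t", "").strip().strip('"')

print(clean_directive_path('\t"System.Windows.Forms.dll"'))  # System.Windows.Forms.dll
```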
|
1.0
|
Compatibility with old script syntax - Compile file content:
```C#
//css_reference "System.dll";
//css_reference "System.Windows.Forms.dll";
import System;
import System.Windows.Forms;
public class Script
{
public static function Main(args)
{
Console.WriteLine("Hello World! (JScript)");
MessageBox.Show("Hello World! (JScript)");
}
}
```
Gets exception at file "csparser.cs":
```C#
public ImportInfo(string statement, string parentScript)
{
string statementToParse = statement.Replace("($this.name)", Path.GetFileNameWithoutExtension(parentScript));
statementToParse = statementToParse.Replace("\t", "").Trim();
string[] parts = CSharpParser.SplitByDelimiter(statementToParse, DirectiveDelimiters);
this.file =
this.rawStatement = parts[0];
this.parentScript = parentScript;
---> here: if (!Path.IsPathRooted(this.file) && ResolveRelativeFromParentScriptLocation)
…
```
Solution - remove double quotes from a string statement:
`statementToParse = statementToParse.Replace("\t", "").Trim().Trim('"');`
|
defect
|
compatibility with old script syntax compile file content c css reference system dll css reference system windows forms dll import system import system windows forms public class script public static function main args console writeline hello world jscript messagebox show hello world jscript gets exception at file csparser cs c public importinfo string statement string parentscript string statementtoparse statement replace this name path getfilenamewithoutextension parentscript statementtoparse statementtoparse replace t trim string parts csharpparser splitbydelimiter statementtoparse directivedelimiters this file this rawstatement parts this parentscript parentscript here if path ispathrooted this file resolverelativefromparentscriptlocation … solution remove double quotes from a string statement statementtoparse statementtoparse replace t trim trim
| 1
|
73,351
| 24,579,311,447
|
IssuesEvent
|
2022-10-13 14:32:11
|
DependencyTrack/dependency-track
|
https://api.github.com/repos/DependencyTrack/dependency-track
|
closed
|
Policy Violation Display Bug
|
defect
|
### Current Behavior:
When viewing the Policy Violation of a project, it is not displaying the State or Policy Name in the table. When you delete the policy and recreate it, it then shows. However it disappears as soon as you change any view options on the table, such as sorting the items in the table or changing the amount of items to display.
### Steps to Reproduce:
View the Policy Violation section of a project.
### Expected Behavior:
Be able to see the State and Policy Name fields.
### Environment:
- Dependency-Track Version: 4.6.0
- Distribution: Docker
- BOM Format & Version: N/A
- Database Server: PostgreSQL
- Browser: Brave and Firefox
### Additional Details:
Here is a screenshot showing the issue:

|
1.0
|
Policy Violation Display Bug - ### Current Behavior:
When viewing the Policy Violation of a project, it is not displaying the State or Policy Name in the table. When you delete the policy and recreate it, it then shows. However it disappears as soon as you change any view options on the table, such as sorting the items in the table or changing the amount of items to display.
### Steps to Reproduce:
View the Policy Violation section of a project.
### Expected Behavior:
Be able to see the State and Policy Name fields.
### Environment:
- Dependency-Track Version: 4.6.0
- Distribution: Docker
- BOM Format & Version: N/A
- Database Server: PostgreSQL
- Browser: Brave and Firefox
### Additional Details:
Here is a screenshot showing the issue:

|
defect
|
policy violation display bug current behavior when viewing the policy violation of a project it is not displaying the state or policy name in the table when you delete the policy and recreate it it then shows however it disappears as soon as you change any view options on the table such as sorting the items in the table or changing the amount of items to display steps to reproduce view the policy violation section of a project expected behavior be able to see the state and policy name fields environment dependency track version distribution docker bom format version n a database server postgresql browser brave and firefox additional details here is a screenshot showing the issue
| 1
|
58,570
| 16,603,404,395
|
IssuesEvent
|
2021-06-01 23:06:17
|
mozilla/jetstream
|
https://api.github.com/repos/mozilla/jetstream
|
opened
|
Statistics errors not written to BigQuery
|
Defect
|
Whenever an error occurs while computing statistics, it's logged to the console but not written to BigQuery. E.g.
`Error while computing statistic <bound method Statistic.name of <class 'jetstream.statistics.BootstrapMean'>> for metric days_of_use: 'threshold_quantile' should be close to 1` does show up in the [Argo logs](http://127.0.0.1:8080/workflows/argo/jetstream-n8pbf?tab=workflow&nodeId=jetstream-n8pbf-1478496465&sidePanel=logs%3Ajetstream-n8pbf-1478496465%3Amain) but it's not in BigQuery
|
1.0
|
Statistics errors not written to BigQuery - Whenever an error occurs while computing statistics, it's logged to the console but not written to BigQuery. E.g.
`Error while computing statistic <bound method Statistic.name of <class 'jetstream.statistics.BootstrapMean'>> for metric days_of_use: 'threshold_quantile' should be close to 1` does show up in the [Argo logs](http://127.0.0.1:8080/workflows/argo/jetstream-n8pbf?tab=workflow&nodeId=jetstream-n8pbf-1478496465&sidePanel=logs%3Ajetstream-n8pbf-1478496465%3Amain) but it's not in BigQuery
|
defect
|
statistics errors not written to bigquery whenever an error occurs while computing statistics it s logged to the console but not written to bigquery e g error while computing statistic for metric days of use threshold quantile should be close to does show up in the but it s not in bigquery
| 1
|
43,863
| 11,870,560,902
|
IssuesEvent
|
2020-03-26 13:01:37
|
ehimetakahashilab/research_papers
|
https://api.github.com/repos/ehimetakahashilab/research_papers
|
closed
|
LFSR-Based Test Generation for Reduced Fail Data Volume
|
Defect diagnosis fail data volume test data compression
|
## 0. Paper
[LFSR-Based Test Generation for Reduced Fail Data Volume](https://ehimetakahashilab.slack.com/files/U2CTQRAMS/FV4MKA74M/2020ieee-tcad_lfsr_based.pdf)
Irith Pomeranz and Srikanth Venkataraman
## 1. What is it?
In fault diagnosis, some faulty units produce a large amount of fail data, which makes diagnosis runs long.
This motivates ending the fail-data collection process before a faulty unit's entire fail data has been stored.
Rather than LFSR-based test-data compression itself, this work aims at modifying the test set so that the amount of fail data generated is reduced.
### ref: fail data
def: the list of tests and outputs for which a faulty unit produces faulty output values.
In general, very few faults have an entry count (total amount of fail data) at or near the maximum.
Fail-data reduction therefore has to focus on a small subset of faults.
## 2. What is novel compared with prior work?
Fail-data reduction is achieved by tuning the LFSR used for fault diagnosis to each circuit.
The procedure of the fail-data reduction method is laid out clearly.
## 3. What is the key of the technique?
The fault with the largest number of entries is added to F_{targ}.
This is controlled through ntarg, the size of F_{targ}.
The procedure is carried out as follows.

The test set is modified by modifying the seeds of the LFSR in use.
## 4. How was it validated?
The method was applied to benchmark circuits.
For each circuit, results are collected for up to three sets of seeds:
1. the initial set of seeds
2. without increasing the maximum number of LFSR bits, the set of seeds giving the smallest number of fail-data entries for the modified test set obtained by the procedure of Fig. 1
3. when the maximum number of LFSR bits may be increased, the set of seeds giving the smallest number of fail-data entries for the modified test set obtained by the procedure of Fig. 1
### Experimental results
targ denotes ntarg, tests the number of tests, and f.c. the single stuck-at fault coverage.
The LFSR columns give the maximum and average numbers of LFSR bits.
In the fail data entries columns, max is the largest number of fail-data entries over the faults in fault set F, and
ratio is the ratio of max after the modification to max before it.
targ is the average number of fail-data entries considering only the faults in F_{targ}.
all is the number of fail-data entries considering every fault in F.
ntime is the normalized runtime: the runtime of the proposed method divided by
the runtime of fault simulation of the initial test set T with fault dropping.
ntime is therefore scaled independently of circuit size.

A decrease of targ in the fail data entries columns is the indicator of fail-data volume reduction.
The average number of LFSR bits (ave) determines the level of test-data compression per test.
Even when an LFSR with a larger number of bits (max) is allowed, ave grows only mildly, and
the proposed method keeps the level of test-data compression close to its initial value.
## 5. Any discussion?
- on ntime
- on the level of test-data compression
## 6. What should be read next?
|
1.0
|
LFSR-Based Test Generation for Reduced Fail Data Volume - ## 0. Paper
[LFSR-Based Test Generation for Reduced Fail Data Volume](https://ehimetakahashilab.slack.com/files/U2CTQRAMS/FV4MKA74M/2020ieee-tcad_lfsr_based.pdf)
Irith Pomeranz and Srikanth Venkataraman
## 1. What is it?
In fault diagnosis, some faulty units produce a large amount of fail data, which makes diagnosis runs long.
This motivates ending the fail-data collection process before a faulty unit's entire fail data has been stored.
Rather than LFSR-based test-data compression itself, this work aims at modifying the test set so that the amount of fail data generated is reduced.
### ref: fail data
def: the list of tests and outputs for which a faulty unit produces faulty output values.
In general, very few faults have an entry count (total amount of fail data) at or near the maximum.
Fail-data reduction therefore has to focus on a small subset of faults.
## 2. What is novel compared with prior work?
Fail-data reduction is achieved by tuning the LFSR used for fault diagnosis to each circuit.
The procedure of the fail-data reduction method is laid out clearly.
## 3. What is the key of the technique?
The fault with the largest number of entries is added to F_{targ}.
This is controlled through ntarg, the size of F_{targ}.
The procedure is carried out as follows.

The test set is modified by modifying the seeds of the LFSR in use.
## 4. How was it validated?
The method was applied to benchmark circuits.
For each circuit, results are collected for up to three sets of seeds:
1. the initial set of seeds
2. without increasing the maximum number of LFSR bits, the set of seeds giving the smallest number of fail-data entries for the modified test set obtained by the procedure of Fig. 1
3. when the maximum number of LFSR bits may be increased, the set of seeds giving the smallest number of fail-data entries for the modified test set obtained by the procedure of Fig. 1
### Experimental results
targ denotes ntarg, tests the number of tests, and f.c. the single stuck-at fault coverage.
The LFSR columns give the maximum and average numbers of LFSR bits.
In the fail data entries columns, max is the largest number of fail-data entries over the faults in fault set F, and
ratio is the ratio of max after the modification to max before it.
targ is the average number of fail-data entries considering only the faults in F_{targ}.
all is the number of fail-data entries considering every fault in F.
ntime is the normalized runtime: the runtime of the proposed method divided by
the runtime of fault simulation of the initial test set T with fault dropping.
ntime is therefore scaled independently of circuit size.

A decrease of targ in the fail data entries columns is the indicator of fail-data volume reduction.
The average number of LFSR bits (ave) determines the level of test-data compression per test.
Even when an LFSR with a larger number of bits (max) is allowed, ave grows only mildly, and
the proposed method keeps the level of test-data compression close to its initial value.
## 5. Any discussion?
- on ntime
- on the level of test-data compression
## 6. What should be read next?
|
defect
|
lfsr based test generation for reduced fail data volume 論文 irith pomeranz and srikanth venkataraman どんなもの? 故障診断において一部のfaulty unitでは大量のfail dataが発生し,故障診断の実行が長くなってしまう. そこで,faulty unitのfail data全体を保存する前に,fail data収集プロセスを終了したいという背景がある. 今回はlfsrを用いたテストデータ圧縮本体ではなく,テストセットを修正して,生成されるfail data量を削減することを目的とする. ref fail data def 故障したユニットが故障した出力値を生成するテストと出力のリスト 一般的に,エントリ数 fail dataの総数 が最大値,または最大値に近い故障はごく少ない. つまり,fail data削減には少数の故障のサブセットに焦点を当てる必要がある. 先行研究と比べてどこがすごい? 故障診断に使用するlfsrを各回路に対して調整することで,fail dataの削減を達成. fail data削減手法の手順がわかりやすく示されている. 技術や手法のキモはどこ? エントリ数が最も多い故障をf targ に追加する. これをf targ のサイズntargで管理する. 手順は次の通りに行う. テストセットの修正は使用するlfsrのシードを修正することで達成される. どうやって有効だと検証した? ベンチマーク回路に対して,手法を適用した. 各回路について, . シードの初期セット lfsrの最大ビット数を増加させずに, dataエントリ数が最も小さいシードのセット lfsrの最大ビット数を増加させてもよい場合, dataエントリ数が最も小さいシードの集合 実験結果 targはntargを示し,testsはテスト数,f c は単一のstuck at fault coverageを示す. また,lfsr列は,lfsr bitの最大数と平均数を示す. そして,fail data entries列は,maxが故障集合f内の故障のfail dataエントリの最大数を示し, ratioは修正前のmaxと修正後のmaxの比率を示す. targはf targ 内の故障のみを考慮したfail dataエントリ数の平均を示す. allはfすべての故障を考慮したfail dataエントリ数を示す. ntimeは,初期テストセットtの故障を削除した故障シミュレーションの実行時間で, 提案手法の実行時間を割った正規化されたランタイムを示す. また,ntimeは回路のサイズによらず,スケーリングされている. fail data entries列のtargが小さくなることがfail data量削減の指標となる. lfsr列の平均bit数 ave はテストあたりのテストデータ圧縮のレベルを決定する. より大きなbit数 max のlfsrが許可されている場合でも,aveは緩やかにしか増加せず, 提案手法はテストデータ圧縮のレベルを初期値に近い値に維持する. 議論はある? ntimeについて テストデータ圧縮のレベルについて 次に読むべき論文は?
| 1
|
30,516
| 6,148,905,237
|
IssuesEvent
|
2017-06-27 18:52:13
|
cakephp/cakephp
|
https://api.github.com/repos/cakephp/cakephp
|
closed
|
Creating fixtures fails if mysql reports current_timestamp() with parenthesis
|
Defect
|
This is a (multiple allowed):
* [x] bug
* [ ] enhancement
* [ ] feature-discussion (RFC)
* CakePHP Version: 2.9.9.
* Platform and Target: MacOS, mysqld Ver 10.2.6-MariaDB for osx10.12 on x86_64 (Homebrew).
### What you did
I have a table that has a field called updated_at that was created as follows: ``` `updated_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP```
However my db version reports it back as ``` `updated_at` timestamp NOT NULL DEFAULT current_timestamp() ON UPDATE current_timestamp()```.
The bug occurs when trying to run tests that create fixtures for the affected table.
### What happened
An error is displayed and tests won't run.
```
2017-06-27 11:51:16 Error: Fixture creation for "ad_campaigns" failed "SQLSTATE[42000]: Syntax error or access violation: 1067 Invalid default value for 'updated_at'"
PHP Warning: Fixture creation for "ad_campaigns" failed "SQLSTATE[42000]: Syntax error or access violation: 1067 Invalid default value for 'updated_at'" in /Users/kurre/mcn/nativeflow/Vendor/cakephp/cakephp/lib/Cake/TestSuite/Fixture/CakeTestFixture.php on line 244
```
Looking at the input to DboSource->_execute() I see the column being described as ``` `updated_at` timestamp DEFAULT 'current_timestamp()' NOT NULL``` which is obviously not correct.
### What you expected to happen
I expect tests to run and no error to be raised.
I am opening a PR with a fix.
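The shape of the fix can be sketched in Python (a hypothetical helper; the actual patch is in CakePHP's PHP schema reflection): treat MariaDB's `current_timestamp()` spelling as the keyword rather than re-emitting it as a quoted string literal.

```python
import re

def normalize_default(default_value):
    # MariaDB 10.2+ reports the column default as "current_timestamp()".
    # Map any spelling of it back to the bare keyword so schema
    # reflection does not emit an invalid quoted default.
    if re.fullmatch(r"current_timestamp(\(\))?", default_value, re.IGNORECASE):
        return "CURRENT_TIMESTAMP"
    return default_value

print(normalize_default("current_timestamp()"))  # CURRENT_TIMESTAMP
```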
|
1.0
|
Creating fixtures fails if mysql reports current_timestamp() with parenthesis - This is a (multiple allowed):
* [x] bug
* [ ] enhancement
* [ ] feature-discussion (RFC)
* CakePHP Version: 2.9.9.
* Platform and Target: MacOS, mysqld Ver 10.2.6-MariaDB for osx10.12 on x86_64 (Homebrew).
### What you did
I have a table that has a field called updated_at that was created as follows: ``` `updated_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP```
However my db version reports it back as ``` `updated_at` timestamp NOT NULL DEFAULT current_timestamp() ON UPDATE current_timestamp()```.
The bug occurs when trying to run tests that create fixtures for the affected table.
### What happened
An error is displayed and tests won't run.
```
2017-06-27 11:51:16 Error: Fixture creation for "ad_campaigns" failed "SQLSTATE[42000]: Syntax error or access violation: 1067 Invalid default value for 'updated_at'"
PHP Warning: Fixture creation for "ad_campaigns" failed "SQLSTATE[42000]: Syntax error or access violation: 1067 Invalid default value for 'updated_at'" in /Users/kurre/mcn/nativeflow/Vendor/cakephp/cakephp/lib/Cake/TestSuite/Fixture/CakeTestFixture.php on line 244
```
Looking at the input to DboSource->_execute() I see the column being described as ``` `updated_at` timestamp DEFAULT 'current_timestamp()' NOT NULL``` which is obviously not correct.
### What you expected to happen
I expect tests to run and no error to be raised.
I am opening a PR with a fix.
|
defect
|
creating fixtures fails if mysql reports current timestamp with parenthesis this is a multiple allowed bug enhancement feature discussion rfc cakephp version platform and target macos mysqld ver mariadb for on homebrew what you did i have a table that has a field called updated at that was created as follows updated at timestamp not null default current timestamp on update current timestamp however my db version reports it back as updated at timestamp not null default current timestamp on update current timestamp the bug occurs when trying to run tests that create fixtures for the affected table what happened an error is displayed and tests won t run error fixture creation for ad campaigns failed sqlstate syntax error or access violation invalid default value for updated at php warning fixture creation for ad campaigns failed sqlstate syntax error or access violation invalid default value for updated at in users kurre mcn nativeflow vendor cakephp cakephp lib cake testsuite fixture caketestfixture php on line looking at the input to dbosource execute i see the column being described as updated at timestamp default current timestamp not null which is obviously not correct what you expected to happen i expect tests to run and no error to be raised i am opening a pr with a fix
| 1
|
159,484
| 24,998,953,497
|
IssuesEvent
|
2022-11-03 05:18:35
|
cloudforet-io/mirinae
|
https://api.github.com/repos/cloudforet-io/mirinae
|
closed
|
[Font-weight] Add new font-weight
|
Design Update Priority: Medium
|
**Problem**
Need new font weight (between regular and blod)
**Solution**
Add medium(500) font weight
**Figma / Jira Link**
[Typography](https://www.figma.com/file/DVwz3WQLLIxWXj4d4W8MQI/Foundation?node-id=17863%3A182708&viewport=6928%2C1468%2C0.44)
|
1.0
|
[Font-weight] Add new font-weight - **Problem**
Need new font weight (between regular and bold)
**Solution**
Add medium(500) font weight
**Figma / Jira Link**
[Typography](https://www.figma.com/file/DVwz3WQLLIxWXj4d4W8MQI/Foundation?node-id=17863%3A182708&viewport=6928%2C1468%2C0.44)
|
non_defect
|
add new font weight problem need new font weight between regular and bold solution add medium font weight figma jira link
| 0
|
31,660
| 6,582,700,675
|
IssuesEvent
|
2017-09-13 00:21:42
|
extnet/Ext.NET
|
https://api.github.com/repos/extnet/Ext.NET
|
closed
|
Chart legend infinite loop if chart narrower than series name entry
|
4.x defect override-removal-pending review review-after-extjs-upgrade sencha
|
Found: 4.4.0
Ext.NET forum thread: [Problem in chart with legend text is long](https://forums.ext.net/showthread.php?62104)
Sencha thread: [[6.5.1 classic] Layout loop on charts using bottom/top legend on too narrow canvas](https://www.sencha.com/forum/showthread.php?407205)
If a chart legend entry (a series' description), docked top or bottom, makes the legend entry (including bullet and border space/padding) exceeds the chart surface area, [Ext.chart.legend.SpriteLegend.performLayout()](http://docs.sencha.com/extjs/6.5.1/classic/src/SpriteLegend.js.html#line326) will be caught in an infinite loop when trying to accommodate the bottom/top docked legend boundary boxes.
|
1.0
|
Chart legend infinite loop if chart narrower than series name entry - Found: 4.4.0
Ext.NET forum thread: [Problem in chart with legend text is long](https://forums.ext.net/showthread.php?62104)
Sencha thread: [[6.5.1 classic] Layout loop on charts using bottom/top legend on too narrow canvas](https://www.sencha.com/forum/showthread.php?407205)
If a chart legend entry (a series' description), docked top or bottom, makes the legend entry (including bullet and border space/padding) exceeds the chart surface area, [Ext.chart.legend.SpriteLegend.performLayout()](http://docs.sencha.com/extjs/6.5.1/classic/src/SpriteLegend.js.html#line326) will be caught in an infinite loop when trying to accommodate the bottom/top docked legend boundary boxes.
|
defect
|
chart legend infinite loop if chart narrower than series name entry found ext net forum thread sencha thread layout loop on charts using bottom top legend on too narrow canvas if a chart legend entry a series description docked top or bottom makes the legend entry including bullet and border space padding exceeds the chart surface area will be caught in an infinite loop when trying to accommodate the bottom top docked legend boundary boxes
| 1
|
102,279
| 21,941,261,604
|
IssuesEvent
|
2022-05-23 18:22:23
|
apollographql/apollo-ios
|
https://api.github.com/repos/apollographql/apollo-ios
|
closed
|
Codegen parameter order nondeterministic
|
enhancement codegen
|
I've been extremely frustrated with the fact that the order of codegen'd initializer parameters for objects, queries and mutations appears to be nondeterministic. Every time I update my schema, and refetch the schema from my server using the CLI, then re-codegen from that schema, a bunch of parameters for my requests change order.
The auto-fixits often only reorder the parameter names and not the values, making them dangerous. This means that every time I update the schema, when I codegen I have to go fix how I call these initializers in many places. At least I'm able to shield some of my code from this liability by providing a consistent API which abstracts away Apollo using domain models, but it's still painful.
I've gone so far as writing a script to alphabetize the fetched schema (which helps reduce noise in commits), but the generated code still seems to be nondeterministic and the parameters do not seem to follow any discernible ordering.
Example:
In my alphabetized schema, you can see that the fields of EditableUserFields are in the correct order:
```
{
"description": "",
"enumValues": null,
"fields": null,
"inputFields": [
{
"defaultValue": null,
"description": "",
"name": "biography",
"type": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
},
{
"defaultValue": null,
"description": "",
"name": "emailAddress",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
}
},
{
"defaultValue": null,
"description": "",
"name": "fullName",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
}
},
{
"defaultValue": null,
"description": "",
"name": "profilePictureURL",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
}
},
{
"defaultValue": null,
"description": "",
"name": "username",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
}
},
{
"defaultValue": null,
"description": "",
"name": "website",
"type": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
}
],
"interfaces": null,
"kind": "INPUT_OBJECT",
"name": "EditableUserFields",
"possibleTypes": null
},
```
However in the generated code, I see this order:
EditableUserFields(biography:,profilePictureUrl:,fullName:,username:, emailAddress:, website:)
Is there a way to address this? Am I holding it wrong? Or do we actually need to fix it on the Apollo end? I'm surprised no one I could find was complaining about this as it seems like it would be a common bugbear for consumers of Apollo, making me think that there's something odd about my setup which is causing this noise.
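The alphabetizing script mentioned above could look like the following sketch (the helper name is illustrative; field names follow the introspection JSON shown). Sorting each type's `inputFields` by name removes one source of diff noise, though as noted it does not by itself make the generated parameter order stable.

```python
def alphabetize_input_fields(schema):
    # Sort inputFields in place for every type that has them, so the
    # fetched schema serializes deterministically between fetches.
    for type_def in schema:
        if type_def.get("inputFields"):
            type_def["inputFields"].sort(key=lambda f: f["name"])
    return schema

schema = [{"name": "EditableUserFields",
           "inputFields": [{"name": "website"}, {"name": "biography"}]}]
print([f["name"] for f in alphabetize_input_fields(schema)[0]["inputFields"]])
# ['biography', 'website']
```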
|
1.0
|
Codegen parameter order nondeterministic - I've been extremely frustrated with the fact that the order of codegen'd initializer parameters for objects, queries and mutations appears to be nondeterministic. Every time I update my schema, and refetch the schema from my server using the CLI, then re-codegen from that schema, a bunch of parameters for my requests change order.
The auto-fixits often only reorder the parameter names and not the values, making them dangerous. This means that every time I update the schema, when I codegen I have to go fix how I call these initializers in many places. At least I'm able to shield some of my code from this liability by providing a consistent API which abstracts away Apollo using domain models, but it's still painful.
I've gone so far as writing a script to alphabetize the fetched schema (which helps reduce noise in commits), but the generated code still seems to be nondeterministic and the parameters do not seem to follow any discernible ordering.
Example:
In my alphabetized schema, you can see that the fields of EditableUserFields are in the correct order:
```
{
"description": "",
"enumValues": null,
"fields": null,
"inputFields": [
{
"defaultValue": null,
"description": "",
"name": "biography",
"type": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
},
{
"defaultValue": null,
"description": "",
"name": "emailAddress",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
}
},
{
"defaultValue": null,
"description": "",
"name": "fullName",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
}
},
{
"defaultValue": null,
"description": "",
"name": "profilePictureURL",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
}
},
{
"defaultValue": null,
"description": "",
"name": "username",
"type": {
"kind": "NON_NULL",
"name": null,
"ofType": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
}
},
{
"defaultValue": null,
"description": "",
"name": "website",
"type": {
"kind": "SCALAR",
"name": "String",
"ofType": null
}
}
],
"interfaces": null,
"kind": "INPUT_OBJECT",
"name": "EditableUserFields",
"possibleTypes": null
},
```
However in the generated code, I see this order:
EditableUserFields(biography:,profilePictureUrl:,fullName:,username:, emailAddress:, website:)
Is there a way to address this? Am I holding it wrong? Or do we actually need to fix it on the Apollo end? I'm surprised no one I could find was complaining about this as it seems like it would be a common bugbear for consumers of Apollo, making me think that there's something odd about my setup which is causing this noise.
|
non_defect
|
codegen parameter order nondeterministic i ve been extremely frustrated with the fact that the order of codegen d initializer parameters for objects queries and mutations appears to be nondeterministic every time i update my schema and refetch the schema from my server using the cli then re codegen from that schema a bunch of parameters for my requests change order the auto fixits often only reorder the parameter names and not the values making them dangerous this means that every time i update the schema when i codegen i have to go fix how i call these initializers in many places at least i m able to shield some of my code from this liability by providing a consistent api which abstracts away apollo using domain models but it s still painful i ve gone so far as writing a script to alphabetize the fetched schema which helps reduce noise in commits but the generated code still seems to be nondeterministic and the parameters do not seem to follow any discernible ordering example in my alphabetized schema you can see that the fields of editableuserfields are in the correct order description enumvalues null fields null inputfields defaultvalue null description name biography type kind scalar name string oftype null defaultvalue null description name emailaddress type kind non null name null oftype kind scalar name string oftype null defaultvalue null description name fullname type kind non null name null oftype kind scalar name string oftype null defaultvalue null description name profilepictureurl type kind non null name null oftype kind scalar name string oftype null defaultvalue null description name username type kind non null name null oftype kind scalar name string oftype null defaultvalue null description name website type kind scalar name string oftype null interfaces null kind input object name editableuserfields possibletypes null however in the generated code i see this order editableuserfields biography profilepictureurl fullname username emailaddress 
website is there a way to address this am i holding it wrong or do we actually need to fix it on the apollo end i m surprised no one i could find was complaining about this as it seems like it would be a common bugbear for consumers of apollo making me think that there s something odd about my setup which is causing this noise
| 0
|
637,088
| 20,620,064,356
|
IssuesEvent
|
2022-03-07 16:35:08
|
googleapis/python-bigquery-pandas
|
https://api.github.com/repos/googleapis/python-bigquery-pandas
|
closed
|
to_gbq() fails with columns having bools and NaNs
|
type: bug priority: p1 api: bigquery
|
Hi,
I've got an issue running `to_gbq()` with a DataFrame that has a column containing bools and NaNs. It can be reproduced with the example below.
From what I understand, the dtype for this column is `object`, thus `pandas-gbq` detects this column as `string` and `pyarrow` is not happy because it contains `bool` entries. I'm not sure what the best way to fix this is; would you have any idea? I'd be happy to write a patch if you have some pointers on how to fix it.
#### Environment details
- OS type and version: Linux
- Python version: 3.9.9
- pip version: 20.3.4
- `pandas-gbq` version: 0.17.1
#### Steps to reproduce
Can be reproduced using the following snippet.
#### Code example
```python
#!/usr/bin/python3
import pandas_gbq
import pandas as pd
import numpy as np
dtf = pd.DataFrame({'col': [np.NaN, False, True]})
pandas_gbq.to_gbq(dtf, 'dataset.table')
```
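A possible workaround, sketched under the assumption that the caller controls the DataFrame (this is not a confirmed upstream fix): cast the mixed column to pandas' nullable `boolean` dtype so it is no longer `object` before calling `to_gbq()`:

```python
import numpy as np
import pandas as pd

# Mixing bools and NaN in one column yields dtype 'object',
# which pandas-gbq maps to STRING and pyarrow then rejects.
dtf = pd.DataFrame({'col': [np.nan, False, True]})
assert dtf['col'].dtype == object

# Casting to the nullable 'boolean' dtype keeps the missing value
# as <NA> and lets the column be treated as a boolean column.
dtf['col'] = dtf['col'].astype('boolean')
assert str(dtf['col'].dtype) == 'boolean'
```

With the column typed this way, schema detection no longer sees an `object` column, which sidesteps the `ArrowTypeError` shown in the stack trace.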
#### Stack trace
```
Traceback (most recent call last):
File "test.py", line 16, in <module>
pandas_gbq.to_gbq(dtf, 'dataset.table')
File "venv/lib/python3.9/site-packages/pandas_gbq/gbq.py", line 1148, in to_gbq
connector.load_data(
File "venv/lib/python3.9/site-packages/pandas_gbq/gbq.py", line 565, in load_data
chunks = load.load_chunks(
File "venv/lib/python3.9/site-packages/pandas_gbq/load.py", line 237, in load_chunks
load_parquet(
File "venv/lib/python3.9/site-packages/pandas_gbq/load.py", line 129, in load_parquet
client.load_table_from_dataframe(
File "venv/lib/python3.9/site-packages/google/cloud/bigquery/client.py", line 2649, in load_table_from_dataframe
_pandas_helpers.dataframe_to_parquet(
File "venv/lib/python3.9/site-packages/google/cloud/bigquery/_pandas_helpers.py", line 589, in dataframe_to_parquet
arrow_table = dataframe_to_arrow(dataframe, bq_schema)
File "venv/lib/python3.9/site-packages/google/cloud/bigquery/_pandas_helpers.py", line 532, in dataframe_to_arrow
bq_to_arrow_array(get_column_or_index(dataframe, bq_field.name), bq_field)
File "venv/lib/python3.9/site-packages/google/cloud/bigquery/_pandas_helpers.py", line 293, in bq_to_arrow_array
return pyarrow.Array.from_pandas(series, type=arrow_type)
File "pyarrow/array.pxi", line 913, in pyarrow.lib.Array.from_pandas
File "pyarrow/array.pxi", line 311, in pyarrow.lib.array
File "pyarrow/array.pxi", line 83, in pyarrow.lib._ndarray_to_array
File "pyarrow/error.pxi", line 122, in pyarrow.lib.check_status
pyarrow.lib.ArrowTypeError: Expected bytes, got a 'bool' object
```
Thanks!
Vincent
|
1.0
|
to_gbq() fails with columns having bools and NaNs - Hi,
I've got an issue running `to_gbq()` with a DataFrame that has a column containing bools and NaNs. It can be reproduced with the example below.
From what I understand, the dtype for this column is `object`, thus `pandas-gbq` detects this column as `string` and `pyarrow` is not happy because it contains `bool` entries. I'm not sure what's the best way to fix this, would you have any idea ? I'd be happy to write a patch if you have some pointers on how to fix.
#### Environment details
- OS type and version: Linux
- Python version: 3.9.9
- pip version: 20.3.4
- `pandas-gbq` version: 0.17.1
#### Steps to reproduce
Can be reproduced using the following snippet.
#### Code example
```python
#!/usr/bin/python3
import pandas_gbq
import pandas as pd
import numpy as np
dtf = pd.DataFrame({'col': [np.NaN, False, True]})
pandas_gbq.to_gbq(dtf, 'dataset.table')
```
#### Stack trace
```
Traceback (most recent call last):
File "test.py", line 16, in <module>
pandas_gbq.to_gbq(dtf, 'dataset.table')
File "venv/lib/python3.9/site-packages/pandas_gbq/gbq.py", line 1148, in to_gbq
connector.load_data(
File "venv/lib/python3.9/site-packages/pandas_gbq/gbq.py", line 565, in load_data
chunks = load.load_chunks(
File "venv/lib/python3.9/site-packages/pandas_gbq/load.py", line 237, in load_chunks
load_parquet(
File "venv/lib/python3.9/site-packages/pandas_gbq/load.py", line 129, in load_parquet
client.load_table_from_dataframe(
File "venv/lib/python3.9/site-packages/google/cloud/bigquery/client.py", line 2649, in load_table_from_dataframe
_pandas_helpers.dataframe_to_parquet(
File "venv/lib/python3.9/site-packages/google/cloud/bigquery/_pandas_helpers.py", line 589, in dataframe_to_parquet
arrow_table = dataframe_to_arrow(dataframe, bq_schema)
File "venv/lib/python3.9/site-packages/google/cloud/bigquery/_pandas_helpers.py", line 532, in dataframe_to_arrow
bq_to_arrow_array(get_column_or_index(dataframe, bq_field.name), bq_field)
File "venv/lib/python3.9/site-packages/google/cloud/bigquery/_pandas_helpers.py", line 293, in bq_to_arrow_array
return pyarrow.Array.from_pandas(series, type=arrow_type)
File "pyarrow/array.pxi", line 913, in pyarrow.lib.Array.from_pandas
File "pyarrow/array.pxi", line 311, in pyarrow.lib.array
File "pyarrow/array.pxi", line 83, in pyarrow.lib._ndarray_to_array
File "pyarrow/error.pxi", line 122, in pyarrow.lib.check_status
pyarrow.lib.ArrowTypeError: Expected bytes, got a 'bool' object
```
Thanks!
Vincent
|
non_defect
|
to gbq fails with columns having bools and nans hi i ve got an issue running to gbq with a dataframe that has a column containing bools and nans it can be reproduced with the example below from what i understand the dtype for this column is object thus pandas gbq detects this column as string and pyarrow is not happy because it contains bool entries i m not sure what s the best way to fix this would you have any idea i d be happy to write a patch if you have some pointers on how to fix environment details os type and version linux python version pip version pandas gbq version steps to reproduce can be reproduced using the following snippet code example python usr bin import pandas gbq import pandas as pd import numpy as np dtf pd dataframe col pandas gbq to gbq dtf dataset table stack trace traceback most recent call last file test py line in pandas gbq to gbq dtf dataset table file venv lib site packages pandas gbq gbq py line in to gbq connector load data file venv lib site packages pandas gbq gbq py line in load data chunks load load chunks file venv lib site packages pandas gbq load py line in load chunks load parquet file venv lib site packages pandas gbq load py line in load parquet client load table from dataframe file venv lib site packages google cloud bigquery client py line in load table from dataframe pandas helpers dataframe to parquet file venv lib site packages google cloud bigquery pandas helpers py line in dataframe to parquet arrow table dataframe to arrow dataframe bq schema file venv lib site packages google cloud bigquery pandas helpers py line in dataframe to arrow bq to arrow array get column or index dataframe bq field name bq field file venv lib site packages google cloud bigquery pandas helpers py line in bq to arrow array return pyarrow array from pandas series type arrow type file pyarrow array pxi line in pyarrow lib array from pandas file pyarrow array pxi line in pyarrow lib array file pyarrow array pxi line in pyarrow lib ndarray to 
array file pyarrow error pxi line in pyarrow lib check status pyarrow lib arrowtypeerror expected bytes got a bool object thanks vincent
| 0
|
581,371
| 17,292,289,183
|
IssuesEvent
|
2021-07-25 02:12:46
|
crcn/paperclip
|
https://api.github.com/repos/crcn/paperclip
|
closed
|
Birdseye view not always displaying CSS
|
area: playground bug effort: 2 estimate: 1 day priority: 4
|
<img width="1792" alt="Screen Shot 2021-06-06 at 10 13 03 AM" src="https://user-images.githubusercontent.com/757408/120929731-d3cdfa00-c6af-11eb-9411-76a30a4c552f.png">
|
1.0
|
Birdseye view not always displaying CSS - <img width="1792" alt="Screen Shot 2021-06-06 at 10 13 03 AM" src="https://user-images.githubusercontent.com/757408/120929731-d3cdfa00-c6af-11eb-9411-76a30a4c552f.png">
|
non_defect
|
birdseye view not always displaying css img width alt screen shot at am src
| 0
|
302,513
| 9,260,946,014
|
IssuesEvent
|
2019-03-18 07:46:58
|
k8smeetup/website-tasks
|
https://api.github.com/repos/k8smeetup/website-tasks
|
closed
|
/blog/_posts/2015-03-00-Welcome-To-Kubernetes-Blog.md
|
doc/accessory finished lang/zh priority/P3 version/1.12
|
Path:`/blog/_posts/2015-03-00-Welcome-To-Kubernetes-Blog.md`
[Source code](https://github.com/kubernetes/website/tree/release-1.12/content/en//blog/_posts/2015-03-00-Welcome-To-Kubernetes-Blog.md)
|
1.0
|
/blog/_posts/2015-03-00-Welcome-To-Kubernetes-Blog.md - Path:`/blog/_posts/2015-03-00-Welcome-To-Kubernetes-Blog.md`
[Source code](https://github.com/kubernetes/website/tree/release-1.12/content/en//blog/_posts/2015-03-00-Welcome-To-Kubernetes-Blog.md)
|
non_defect
|
blog posts welcome to kubernetes blog md path: blog posts welcome to kubernetes blog md
| 0
|
367,124
| 10,840,755,697
|
IssuesEvent
|
2019-11-12 09:03:26
|
StrangeLoopGames/EcoIssues
|
https://api.github.com/repos/StrangeLoopGames/EcoIssues
|
closed
|
[8.3] Auth from store reset with Restart
|
Fixed Medium Priority
|
Every time the Server has a Restart, the Auth is reset for everyone on the Store
We have the same Bug
https://github.com/StrangeLoopGames/EcoIssues/issues/9583
|
1.0
|
[8.3] Auth from store reset with Restart - Every time the Server has a Restart, the Auth is reset for everyone on the Store
We have the same Bug
https://github.com/StrangeLoopGames/EcoIssues/issues/9583
|
non_defect
|
auth from store reset with restart every time the server has a resart the auth are for everyone on the store we have the same bug
| 0
|
54,387
| 13,638,022,164
|
IssuesEvent
|
2020-09-25 08:45:30
|
hazelcast/hazelcast-jet
|
https://api.github.com/repos/hazelcast/hazelcast-jet
|
closed
|
Job can get stuck due to failed IMap operation
|
core defect
|
Due to the nature of IMDG, any `IMap` (or `DistributedObject` object) operation can fail if the connection between members is disrupted. The connection can be transparently reestablished so there is no topology change, but operations executed during that time or their responses can be lost. An `OperationTimeoutException` is thrown.
We manipulate IMaps a lot during job startup and cleanup. We don't defend against exceptions there, the job usually gets stuck in a broken state.
|
1.0
|
Job can get stuck due to failed IMap operation - Due to the nature of IMDG, any `IMap` (or `DistributedObject` object) operation can fail if the connection between members is disrupted. The connection can be transparently reestablished so there is no topology change, but operations executed during that time or their responses can be lost. An `OperationTimeoutException` is thrown.
We manipulate IMaps a lot during job startup and cleanup. We don't defend against exceptions there, the job usually gets stuck in a broken state.
|
defect
|
job can get stuck due to failed imap operation due to the nature of imdg any imap or distributedobject object operation can fail if the connection between members is disrupted the connection can be transparently reestablished so there is no topology change but operations executed during that time or their responses can be lost an operationtimeoutexception is thrown we manipulate imaps a lot during job startup and cleanup we don t defend against exceptions there the job usually gets stuck in a broken state
| 1
|
75,202
| 25,586,320,691
|
IssuesEvent
|
2022-12-01 09:36:25
|
vector-im/element-call
|
https://api.github.com/repos/vector-im/element-call
|
opened
|
[ERROR] Could not load src/index.html: ENOENT: no such file or directory
|
T-Defect
|
### Steps to reproduce
I am trying to build "element-call" according to the build instructions (given here: https://github.com/vector-im/element-call#host-it-yourself).
The build process fails with the below message:
```
yarn build
yarn run v1.22.19
$ vite build
vite v2.9.14 building for production...
✓ 0 modules transformed.
mv: no such file or directory: /element-call/dist/src/*.html
[vite:load-fallback] Could not load src/index.html: ENOENT: no such file or directory, open '\element-call\src\index.html'
error during build:
Error: Could not load src/index.html: ENOENT: no such file or directory, open '\element-call\src\index.html'
error Command failed with exit code 1.
```
### Outcome
Error
### Operating system
_No response_
### Browser information
_No response_
### URL for webapp
_No response_
### Will you send logs?
No
|
1.0
|
[ERROR] Could not load src/index.html: ENOENT: no such file or directory - ### Steps to reproduce
I am trying to build "element-call" according to the build instructions (given here: https://github.com/vector-im/element-call#host-it-yourself).
The build process fails with the below message:
```
yarn build
yarn run v1.22.19
$ vite build
vite v2.9.14 building for production...
✓ 0 modules transformed.
mv: no such file or directory: /element-call/dist/src/*.html
[vite:load-fallback] Could not load src/index.html: ENOENT: no such file or directory, open '\element-call\src\index.html'
error during build:
Error: Could not load src/index.html: ENOENT: no such file or directory, open '\element-call\src\index.html'
error Command failed with exit code 1.
```
### Outcome
Error
### Operating system
_No response_
### Browser information
_No response_
### URL for webapp
_No response_
### Will you send logs?
No
|
defect
|
could not load src index html enoent no such file or directory steps to reproduce i am trying to build element call according to the build instructions given here the build process fails with the below message yarn build yarn run vite build vite building for production ✓ modules transformed mv no such file or directory element call dist src html could not load src index html enoent no such file or directory open element call src index html error during build error could not load src index html enoent no such file or directory open element call src index html error command failed with exit code outcome error operating system no response browser information no response url for webapp no response will you send logs no
| 1
|
151,985
| 19,671,496,812
|
IssuesEvent
|
2022-01-11 07:53:06
|
ChoeMinji/xStream_1_4_17
|
https://api.github.com/repos/ChoeMinji/xStream_1_4_17
|
opened
|
CVE-2021-39139 (High) detected in xstream-1.4.17.jar
|
security vulnerability
|
## CVE-2021-39139 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xstream-1.4.17.jar</b></p></summary>
<p></p>
<p>Library home page: <a href="http://x-stream.github.io">http://x-stream.github.io</a></p>
<p>Path to dependency file: /xstream-benchmark/pom.xml</p>
<p>Path to vulnerable library: /sitory/com/thoughtworks/xstream/xstream/1.4.17/xstream-1.4.17.jar,/sitory/com/thoughtworks/xstream/xstream/1.4.17/xstream-1.4.17.jar,/sitory/com/thoughtworks/xstream/xstream/1.4.17/xstream-1.4.17.jar</p>
<p>
Dependency Hierarchy:
- :x: **xstream-1.4.17.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ChoeMinji/xStream_1_4_17/commit/91b0fd5ebba59bc3d610a838562e863b18b7bb67">91b0fd5ebba59bc3d610a838562e863b18b7bb67</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
XStream is a simple library to serialize objects to XML and back again. In affected versions this vulnerability may allow a remote attacker to load and execute arbitrary code from a remote host only by manipulating the processed input stream. A user is only affected if using the version out of the box with JDK 1.7u21 or below. However, this scenario can be adjusted easily to an external Xalan that works regardless of the version of the Java runtime. No user is affected, who followed the recommendation to setup XStream's security framework with a whitelist limited to the minimal required types. XStream 1.4.18 uses no longer a blacklist by default, since it cannot be secured for general purpose.
<p>Publish Date: 2021-08-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-39139>CVE-2021-39139</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/x-stream/xstream/security/advisories/GHSA-64xx-cq4q-mf44">https://github.com/x-stream/xstream/security/advisories/GHSA-64xx-cq4q-mf44</a></p>
<p>Release Date: 2021-08-23</p>
<p>Fix Resolution: com.thoughtworks.xstream:xstream:1.4.18</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-39139 (High) detected in xstream-1.4.17.jar - ## CVE-2021-39139 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xstream-1.4.17.jar</b></p></summary>
<p></p>
<p>Library home page: <a href="http://x-stream.github.io">http://x-stream.github.io</a></p>
<p>Path to dependency file: /xstream-benchmark/pom.xml</p>
<p>Path to vulnerable library: /sitory/com/thoughtworks/xstream/xstream/1.4.17/xstream-1.4.17.jar,/sitory/com/thoughtworks/xstream/xstream/1.4.17/xstream-1.4.17.jar,/sitory/com/thoughtworks/xstream/xstream/1.4.17/xstream-1.4.17.jar</p>
<p>
Dependency Hierarchy:
- :x: **xstream-1.4.17.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ChoeMinji/xStream_1_4_17/commit/91b0fd5ebba59bc3d610a838562e863b18b7bb67">91b0fd5ebba59bc3d610a838562e863b18b7bb67</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
XStream is a simple library to serialize objects to XML and back again. In affected versions this vulnerability may allow a remote attacker to load and execute arbitrary code from a remote host only by manipulating the processed input stream. A user is only affected if using the version out of the box with JDK 1.7u21 or below. However, this scenario can be adjusted easily to an external Xalan that works regardless of the version of the Java runtime. No user is affected, who followed the recommendation to setup XStream's security framework with a whitelist limited to the minimal required types. XStream 1.4.18 uses no longer a blacklist by default, since it cannot be secured for general purpose.
<p>Publish Date: 2021-08-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-39139>CVE-2021-39139</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/x-stream/xstream/security/advisories/GHSA-64xx-cq4q-mf44">https://github.com/x-stream/xstream/security/advisories/GHSA-64xx-cq4q-mf44</a></p>
<p>Release Date: 2021-08-23</p>
<p>Fix Resolution: com.thoughtworks.xstream:xstream:1.4.18</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_defect
|
cve high detected in xstream jar cve high severity vulnerability vulnerable library xstream jar library home page a href path to dependency file xstream benchmark pom xml path to vulnerable library sitory com thoughtworks xstream xstream xstream jar sitory com thoughtworks xstream xstream xstream jar sitory com thoughtworks xstream xstream xstream jar dependency hierarchy x xstream jar vulnerable library found in head commit a href found in base branch master vulnerability details xstream is a simple library to serialize objects to xml and back again in affected versions this vulnerability may allow a remote attacker to load and execute arbitrary code from a remote host only by manipulating the processed input stream a user is only affected if using the version out of the box with jdk or below however this scenario can be adjusted easily to an external xalan that works regardless of the version of the java runtime no user is affected who followed the recommendation to setup xstream s security framework with a whitelist limited to the minimal required types xstream uses no longer a blacklist by default since it cannot be secured for general purpose publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com thoughtworks xstream xstream step up your open source security game with whitesource
| 0
|
3,957
| 2,610,084,761
|
IssuesEvent
|
2015-02-26 18:25:49
|
chrsmith/dsdsdaadf
|
https://api.github.com/repos/chrsmith/dsdsdaadf
|
opened
|
深圳彩光怎么样治疗痘痘
|
auto-migrated Priority-Medium Type-Defect
|
```
深圳彩光怎么样治疗痘痘【深圳韩方科颜全国热线400-869-1818��
�24小时QQ4008691818】深圳韩方科颜专业祛痘连锁机构,机构以��
�国秘方——韩方科颜这一国妆准字号治疗型权威,祛痘佳品�
��韩方科颜专业祛痘连锁机构,采用韩国秘方配合专业“不反
弹”健康祛痘技术并结合先进“先进豪华彩光”仪,开创国��
�专业治疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸�
��的痘痘。
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 7:00
|
1.0
|
深圳彩光怎么样治疗痘痘 - ```
深圳彩光怎么样治疗痘痘【深圳韩方科颜全国热线400-869-1818��
�24小时QQ4008691818】深圳韩方科颜专业祛痘连锁机构,机构以��
�国秘方——韩方科颜这一国妆准字号治疗型权威,祛痘佳品�
��韩方科颜专业祛痘连锁机构,采用韩国秘方配合专业“不反
弹”健康祛痘技术并结合先进“先进豪华彩光”仪,开创国��
�专业治疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸�
��的痘痘。
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 7:00
|
defect
|
深圳彩光怎么样治疗痘痘 深圳彩光怎么样治疗痘痘【 �� � 】深圳韩方科颜专业祛痘连锁机构,机构以�� �国秘方——韩方科颜这一国妆准字号治疗型权威,祛痘佳品� ��韩方科颜专业祛痘连锁机构,采用韩国秘方配合专业“不反 弹”健康祛痘技术并结合先进“先进豪华彩光”仪,开创国�� �专业治疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸� ��的痘痘。 original issue reported on code google com by szft com on may at
| 1
|
58,461
| 16,544,047,120
|
IssuesEvent
|
2021-05-27 20:54:55
|
networkx/networkx
|
https://api.github.com/repos/networkx/networkx
|
closed
|
MultiGraph initialization with dict-of-dict-of-dict-of-dict doesn't work properly
|
Defect
|
### Current Behavior
The initialization of a multigraph with dictionaries treats the third dictionary, which its keys suppose to be the edges IDs, as the attribute of the edge.
### Expected Behavior
Initializing a multigraph should not accept dictionaries, or it should use reserved keys for edges' IDs.
### Steps to Reproduce
```
g = networkx.MultiGraph({'a': {'b': {0: {'w': 200}, 1: {'w': 201}}}})
print(g.edges)
h = networkx.MultiGraph()
h.add_edges_from([('a', 'b', {"w": 200}), ('a', 'b', {"w": 201})])
print(h.edges)
output
[('a', 'b', 0)]
[('a', 'b', 0), ('a', 'b', 1)]
```
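For comparison, a minimal sketch (assuming NetworkX 2.x) of building the intended parallel edges explicitly with keys via `add_edges_from`, which avoids the ambiguous nested-dict initializer form:

```python
import networkx as nx

# 4-tuples (u, v, key, data) give each parallel edge an explicit key,
# so both edges between 'a' and 'b' are preserved.
h = nx.MultiGraph()
h.add_edges_from([('a', 'b', 0, {'w': 200}), ('a', 'b', 1, {'w': 201})])
assert set(h.edges) == {('a', 'b', 0), ('a', 'b', 1)}
assert h.edges['a', 'b', 1]['w'] == 201
```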
### Environment
Python version: 3.9.5
NetworkX version: 2.5.1
|
1.0
|
MultiGraph initialization with dict-of-dict-of-dict-of-dict doesn't work properly - ### Current Behavior
The initialization of a multigraph with dictionaries treats the third dictionary, which its keys suppose to be the edges IDs, as the attribute of the edge.
### Expected Behavior
Initializing a multigraph should not accept dictionaries, or it should use reserved keys for edges' IDs.
### Steps to Reproduce
```
g = networkx.MultiGraph({'a': {'b': {0: {'w': 200}, 1: {'w': 201}}}})
print(g.edges)
h = networkx.MultiGraph()
h.add_edges_from([('a', 'b', {"w": 200}), ('a', 'b', {"w": 201})])
print(h.edges)
output
[('a', 'b', 0)]
[('a', 'b', 0), ('a', 'b', 1)]
```
### Environment
Python version: 3.9.5
NetworkX version: 2.5.1
|
defect
|
multigraph initialization with dict of dict of dict of dict doesn t work properly current behavior the initialization of a multigraph with dictionaries treats the third dictionary which its keys suppose to be the edges ids as the attribute of the edge expected behavior initializing a multigraph should not accept dictionaries or it should use reserved keys for edges ids steps to reproduce g networkx multigraph a b w w print g edges h networkx multigraph h add edges from print h edges output environment python version networkx version
| 1
|
2,834
| 2,607,961,972
|
IssuesEvent
|
2015-02-26 00:40:37
|
chrsmithdemos/leveldb
|
https://api.github.com/repos/chrsmithdemos/leveldb
|
closed
|
afdfsdfsdfsdf
|
auto-migrated Priority-Medium Type-Defect
|
```
sdfsdfsfd
```
-----
Original issue reported on code.google.com by `wpx...@gmail.com` on 13 May 2011 at 10:13
|
1.0
|
afdfsdfsdfsdf - ```
sdfsdfsfd
```
-----
Original issue reported on code.google.com by `wpx...@gmail.com` on 13 May 2011 at 10:13
|
defect
|
afdfsdfsdfsdf sdfsdfsfd original issue reported on code google com by wpx gmail com on may at
| 1
|
80,263
| 30,201,874,321
|
IssuesEvent
|
2023-07-05 06:40:03
|
dotCMS/core
|
https://api.github.com/repos/dotCMS/core
|
opened
|
Inactivity Session Issue (JSESSIONID and JWT)
|
Type : Defect Triage
|
### Parent Issue
_No response_
### Problem Statement
We are undergoing a security assessment and need to shorten the login session on inactivity.
How do we shorten the session and automatically log out on inactivity?
### Steps to Reproduce
At first, I tried to change the session-timeout on web.xml to 1 min.
Then we login to dotCMS, wait for 1 min and it didn't logout.
The session doesn't invalidate after 1 min (Probably due to api/ws/v1/system/events constantly refreshing the session)
So we temporary disable requesting the events.
After 1 minute, we still able to login the dotCMS if we stay inactive (Not closing browser).
Looks like the jwt token is refreshing to make it stay login?
I delete the access_token from browser and wait 1 min, it successful logout.
### Acceptance Criteria
Auto logout when inactivity for certain time of period.
### dotCMS Version
22.03.6
### Proposed Objective
Security & Privacy
### Proposed Priority
Priority 3 - Average
### External Links... Slack Conversations, Support Tickets, Figma Designs, etc.
_No response_
### Assumptions & Initiation Needs
_No response_
### Quality Assurance Notes & Workarounds
_No response_
### Sub-Tasks & Estimates
_No response_
|
1.0
|
Inactivity Session Issue (JSESSIONID and JWT) - ### Parent Issue
_No response_
### Problem Statement
We are having security assessment and need to shorten the login session when inactivity.
How do we shorten the session and auto logout when inactivity?
### Steps to Reproduce
At first, I tried to change the session-timeout on web.xml to 1 min.
Then we login to dotCMS, wait for 1 min and it didn't logout.
The session doesn't invalidate after 1 min (Probably due to api/ws/v1/system/events constantly refreshing the session)
So we temporary disable requesting the events.
After 1 minute, we still able to login the dotCMS if we stay inactive (Not closing browser).
Looks like the jwt token is refreshing to make it stay login?
I delete the access_token from browser and wait 1 min, it successful logout.
### Acceptance Criteria
Auto logout when inactivity for certain time of period.
### dotCMS Version
22.03.6
### Proposed Objective
Security & Privacy
### Proposed Priority
Priority 3 - Average
### External Links... Slack Conversations, Support Tickets, Figma Designs, etc.
_No response_
### Assumptions & Initiation Needs
_No response_
### Quality Assurance Notes & Workarounds
_No response_
### Sub-Tasks & Estimates
_No response_
|
defect
|
inactivity session issue jsessionid and jwt parent issue no response problem statement we are having security assessment and need to shorten the login session when inactivity how do we shorten the session and auto logout when inactivity steps to reproduce at first i tried to change the session timeout on web xml to min then we login to dotcms wait for min and it didn t logout the session doesn t invalidate after min probably due to api ws system events constantly refreshing the session so we temporary disable requesting the events after minute we still able to login the dotcms if we stay inactive not closing browser looks like the jwt token is refreshing to make it stay login i delete the access token from browser and wait min it successful logout acceptance criteria auto logout when inactivity for certain time of period dotcms version proposed objective security privacy proposed priority priority average external links slack conversations support tickets figma designs etc no response assumptions initiation needs no response quality assurance notes workarounds no response sub tasks estimates no response
| 1
|
212,535
| 7,238,253,522
|
IssuesEvent
|
2018-02-13 14:03:21
|
CareSet/CareSetReportEngine
|
https://api.github.com/repos/CareSet/CareSetReportEngine
|
reopened
|
How do I prevent SQL injection? give me a shortcut to "quote"
|
priority 2
|
I am writing raw SQL for my reports, and I am accepting user input.
I need to have a consistent way to prevent SQL injections. Whatever that way is needs to be clearly documented in the template code and how-to.
|
1.0
|
How do I prevent SQL injection? give me a shortcut to "quote" - I am writing raw SQL for my reports, and I am accepting user input.
I need to have a consistent way to prevent SQL injections. Whatever that way is needs to be clearly documented in the template code and how-to.
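As a generic illustration of the principle (using Python's stdlib `sqlite3` rather than this project's own API, which is not shown here): bind user input as a query parameter instead of concatenating it into the SQL string:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE t (name TEXT)')

# The '?' placeholder sends the value separately from the SQL text,
# so a hostile payload is stored as plain data, never executed.
payload = "Robert'); DROP TABLE t;--"
conn.execute('INSERT INTO t VALUES (?)', (payload,))
rows = conn.execute('SELECT name FROM t').fetchall()
assert rows == [(payload,)]
```

Whatever quoting shortcut the report engine ends up documenting should wrap this same parameter-binding mechanism, so the template docs only need to point at one helper.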
|
non_defect
|
how do i prevent sql injection give me a shortcut to quote i am writing raw sql for my reports and i am accepting user input i need to have a consistent way to prevent sql injections whatever that way is needs to be clearly documented in the template code and how to
| 0
|
20,048
| 3,293,393,113
|
IssuesEvent
|
2015-10-30 18:43:23
|
mehlon/acme-sac
|
https://api.github.com/repos/mehlon/acme-sac
|
closed
|
9cpu/Feedkey broken
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
1. Start Feedkey in ACME-sac
2. 9cpu -h some.cpu.server
3. look at console
What is the expected output? What do you see instead?
I expect to be cpu'd into the target system, instead I see the following on the
console:
-> proto=p9any role=client <-
proto=p9any
start p9any
findkey '!password? user? proto=p9sk1 dom=vmware'
[Bufio] Broken: "Bus error"
What version of the product are you using? On what operating system?
built from hg code version: changeset: 226:ffec88ce5fdb
Please provide any additional information below.
I'm trying to cpu to a vmware image on the same machine, but this has worked
under normal
Inferno using wm/feedkey, so I'm not sure what's going south.
```
Original issue reported on code.google.com by `eri...@gmail.com` on 22 May 2009 at 6:50
|
1.0
|
9cpu/Feedkey broken - ```
What steps will reproduce the problem?
1. Start Feedkey in ACME-sac
2. 9cpu -h some.cpu.server
3. look at console
What is the expected output? What do you see instead?
I expect to be cpu'd into the target system, instead I see the following on the
console:
-> proto=p9any role=client <-
proto=p9any
start p9any
findkey '!password? user? proto=p9sk1 dom=vmware'
[Bufio] Broken: "Bus error"
What version of the product are you using? On what operating system?
built from hg code version: changeset: 226:ffec88ce5fdb
Please provide any additional information below.
I'm trying to cpu to a vmware image on the same machine, but this has worked
under normal
Inferno using wm/feedkey, so I'm not sure what's going south.
```
Original issue reported on code.google.com by `eri...@gmail.com` on 22 May 2009 at 6:50
|
defect
|
feedkey broken what steps will reproduce the problem start feedkey in acme sac h some cpu server look at console what is the expected output what do you see instead i expect to be cpu d into the target system instead i see the following on the console proto role client proto start findkey password user proto dom vmware broken bus error what version of the product are you using on what operating system built from hg code version changeset please provide any additional information below i m trying to cpu to a vmware image on the same machine but this has worked under normal inferno using wm feedkey so i m not sure what s going south original issue reported on code google com by eri gmail com on may at
| 1
|
372,501
| 11,016,159,986
|
IssuesEvent
|
2019-12-05 04:16:24
|
Veil-Project/veil
|
https://api.github.com/repos/Veil-Project/veil
|
opened
|
Veil granularity and wall-clock time adjustments for 60 second block times from Bitcoin's 600.
|
Issue Type: Change Request Priority: 2 - Normal
|
<!-- Describe the issue -->
The issue is that this, among many others, our timing is correct for Bitcoin in many instances, and not for Veil. Veil is every 60 seconds per block per Bitcoin's 600 seconds per block. Times need to be adjusted.
A specific example is that in CTxIn, `nSequence` lock-time granularity is incorrectly set for Bitcoin's block-time to seconds.
Another example is that we are set for only holding the past few hours or blocks, not 2 days by default for pruned nodes.
This serves to set as to investigate and find many of these block related time issues and then fix them in a future pull request. The below list will be updated as more are found.
TO UPDATE:
- `transaction.h` - SEQUENCE_LOCKTIME_GRANULARITY
|
1.0
|
Veil granularity and wall-clock time adjustments for 60 second block times from Bitcoin's 600. - <!-- Describe the issue -->
The issue is that this, among many others, our timing is correct for Bitcoin in many instances, and not for Veil. Veil is every 60 seconds per block per Bitcoin's 600 seconds per block. Times need to be adjusted.
A specific example is that in CTxIn, `nSequence` lock-time granularity is incorrectly set for Bitcoin's block-time to seconds.
Another example is that we are set for only holding the past few hours or blocks, not 2 days by default for pruned nodes.
This serves to set as to investigate and find many of these block related time issues and then fix them in a future pull request. The below list will be updated as more are found.
TO UPDATE:
- `transaction.h` - SEQUENCE_LOCKTIME_GRANULARITY
|
non_defect
|
veil granularity and wall clock time adjustments for second block times from bitcoin s the issue is that this among many others our timing is correct for bitcoin in many instances and not for veil veil is every seconds per block per bitcoin s seconds per block times need to be adjusted a specific example is that in ctxin nsequence lock time granularity is incorrectly set for bitcoin s block time to seconds another example is that we are set for only holding the past few hours or blocks not days by default for pruned nodes this serves to set as to investigate and find many of these block related time issues and then fix them in a future pull request the below list will be updated as more are found to update transaction h sequence locktime granularity
| 0
|
813,667
| 30,466,279,260
|
IssuesEvent
|
2023-07-17 10:35:11
|
nestjs/nest
|
https://api.github.com/repos/nestjs/nest
|
closed
|
Lazy loaded module can't access global module's providers
|
type: bug :sob: scope: core effort1: hours priority: medium (3)
|
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Current behavior
In the documentation it says:
> Also, "lazy-loaded" modules share the same modules graph as those eagerly loaded on the application bootstrap as well as any other lazy modules registered later in your app.
But when trying to import a provider that is exported from a globally registered module (which eager modules are able to receive injected into their providers), lazy modules produce an error saying that the provider is not found.
### Minimum reproduction code
https://stackblitz.com/edit/nestjs-typescript-starter-muxzyg?file=src/app.module.ts
### Steps to reproduce
_No response_
### Expected behavior
I expected the service of the lazy module to be able to import a service from a global module.
### Package
- [ ] I don't know. Or some 3rd-party package
- [X] <code>@nestjs/common</code>
- [X] <code>@nestjs/core</code>
- [ ] <code>@nestjs/microservices</code>
- [ ] <code>@nestjs/platform-express</code>
- [ ] <code>@nestjs/platform-fastify</code>
- [ ] <code>@nestjs/platform-socket.io</code>
- [ ] <code>@nestjs/platform-ws</code>
- [ ] <code>@nestjs/testing</code>
- [ ] <code>@nestjs/websockets</code>
- [ ] Other (see below)
### Other package
_No response_
### NestJS version
_No response_
### Packages versions
```json
{
"@nestjs/common": "^8.1.1",
"@nestjs/core": "^8.1.1",
}
```
### Node.js version
_No response_
### In which operating systems have you tested?
- [X] macOS
- [ ] Windows
- [X] Linux
### Other
_No response_
|
1.0
|
Lazy loaded module can't access global module's providers - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Current behavior
In the documentation it says:
> Also, "lazy-loaded" modules share the same modules graph as those eagerly loaded on the application bootstrap as well as any other lazy modules registered later in your app.
But when trying to import a provider that is exported from a globally registered module (which eager modules are able to receive injected into their providers), lazy modules produce an error saying that the provider is not found.
### Minimum reproduction code
https://stackblitz.com/edit/nestjs-typescript-starter-muxzyg?file=src/app.module.ts
### Steps to reproduce
_No response_
### Expected behavior
I expected the service of the lazy module to be able to import a service from a global module.
### Package
- [ ] I don't know. Or some 3rd-party package
- [X] <code>@nestjs/common</code>
- [X] <code>@nestjs/core</code>
- [ ] <code>@nestjs/microservices</code>
- [ ] <code>@nestjs/platform-express</code>
- [ ] <code>@nestjs/platform-fastify</code>
- [ ] <code>@nestjs/platform-socket.io</code>
- [ ] <code>@nestjs/platform-ws</code>
- [ ] <code>@nestjs/testing</code>
- [ ] <code>@nestjs/websockets</code>
- [ ] Other (see below)
### Other package
_No response_
### NestJS version
_No response_
### Packages versions
```json
{
"@nestjs/common": "^8.1.1",
"@nestjs/core": "^8.1.1",
}
```
### Node.js version
_No response_
### In which operating systems have you tested?
- [X] macOS
- [ ] Windows
- [X] Linux
### Other
_No response_
|
non_defect
|
lazy loaded module can t access global module s providers is there an existing issue for this i have searched the existing issues current behavior in the documentation it says also lazy loaded modules share the same modules graph as those eagerly loaded on the application bootstrap as well as any other lazy modules registered later in your app but when trying to import a provider that is exported from a globally registered module which eager modules are able to receive injected into their providers lazy modules produce an error saying that the provider is not found minimum reproduction code steps to reproduce no response expected behavior i expected the service of the lazy module to be able to import a service from a global module package i don t know or some party package nestjs common nestjs core nestjs microservices nestjs platform express nestjs platform fastify nestjs platform socket io nestjs platform ws nestjs testing nestjs websockets other see below other package no response nestjs version no response packages versions json nestjs common nestjs core node js version no response in which operating systems have you tested macos windows linux other no response
| 0
|
15,413
| 2,852,119,855
|
IssuesEvent
|
2015-06-01 11:43:56
|
hazelcast/hazelcast
|
https://api.github.com/repos/hazelcast/hazelcast
|
closed
|
[TEST-FAILURE] JCacheListenerTest.testSyncListener_shouldNotHang_whenCacheDestroyed
|
Team: Core Type: Defect
|
```
java.lang.AssertionError: Cache operations should not hang when sync listener is present!, failed to complete within 120 seconds , count left: 2
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.assertTrue(Assert.java:41)
at com.hazelcast.test.HazelcastTestSupport.assertOpenEventually(HazelcastTestSupport.java:197)
at com.hazelcast.test.HazelcastTestSupport.assertOpenEventually(HazelcastTestSupport.java:183)
at com.hazelcast.cache.JCacheListenerTest.testSyncListener_shouldNotHang_AfterAction(JCacheListenerTest.java:311)
at com.hazelcast.cache.JCacheListenerTest.testSyncListener_shouldNotHang_whenCacheDestroyed(JCacheListenerTest.java:270)
```
https://hazelcast-l337.ci.cloudbees.com/job/Hazelcast-3.x-OpenJDK8/com.hazelcast$hazelcast/415/testReport/junit/com.hazelcast.cache/JCacheListenerTest/testSyncListener_shouldNotHang_whenCacheDestroyed/
|
1.0
|
[TEST-FAILURE] JCacheListenerTest.testSyncListener_shouldNotHang_whenCacheDestroyed - ```
java.lang.AssertionError: Cache operations should not hang when sync listener is present!, failed to complete within 120 seconds , count left: 2
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.assertTrue(Assert.java:41)
at com.hazelcast.test.HazelcastTestSupport.assertOpenEventually(HazelcastTestSupport.java:197)
at com.hazelcast.test.HazelcastTestSupport.assertOpenEventually(HazelcastTestSupport.java:183)
at com.hazelcast.cache.JCacheListenerTest.testSyncListener_shouldNotHang_AfterAction(JCacheListenerTest.java:311)
at com.hazelcast.cache.JCacheListenerTest.testSyncListener_shouldNotHang_whenCacheDestroyed(JCacheListenerTest.java:270)
```
https://hazelcast-l337.ci.cloudbees.com/job/Hazelcast-3.x-OpenJDK8/com.hazelcast$hazelcast/415/testReport/junit/com.hazelcast.cache/JCacheListenerTest/testSyncListener_shouldNotHang_whenCacheDestroyed/
|
defect
|
jcachelistenertest testsynclistener shouldnothang whencachedestroyed java lang assertionerror cache operations should not hang when sync listener is present failed to complete within seconds count left at org junit assert fail assert java at org junit assert asserttrue assert java at com hazelcast test hazelcasttestsupport assertopeneventually hazelcasttestsupport java at com hazelcast test hazelcasttestsupport assertopeneventually hazelcasttestsupport java at com hazelcast cache jcachelistenertest testsynclistener shouldnothang afteraction jcachelistenertest java at com hazelcast cache jcachelistenertest testsynclistener shouldnothang whencachedestroyed jcachelistenertest java
| 1
|
432,426
| 30,283,403,533
|
IssuesEvent
|
2023-07-08 11:07:35
|
EreliaCorp/Sparkle
|
https://api.github.com/repos/EreliaCorp/Sparkle
|
closed
|
Missing Doxygen Documentation
|
Documentation
|
There is a set of documentation missing:
includes/math/spk_perlin.hpp:9: Compound spk::Perlin is not documented.
includes/application/modules/spk_profiler_module.hpp:7: Compound spk::ProfilerModule is not documented.
includes/graphics/windows/spk_window.hpp:1: member 'spk::Singleton<Window>' of class 'Window' cannot be found
includes/application/modules/spk_profiler_module.hpp:20: Member update() (function) of class spk::ProfilerModule is not documented.
includes/application/modules/spk_profiler_module.hpp:21: Member increaseRenderIPS() (function) of class spk::ProfilerModule is not documented.
includes/application/modules/spk_profiler_module.hpp:22: Member increaseUpdateIPS() (function) of class spk::ProfilerModule is not documented.
includes/debug/spk_profiler.hpp:84: Member increseCounter(const std::wstring &p_key) (function) of class spk::Profiler is not documented.
includes/debug/spk_profiler.hpp:85: Member setCounter(const std::wstring &p_key, const size_t &p_value) (function) of class spk::Profiler is not documented.
includes/debug/spk_profiler.hpp:86: Member resetCounter(const std::wstring &p_key) (function) of class spk::Profiler is not documented.
includes/debug/spk_profiler.hpp:87: Member counter(const std::wstring &p_key) const (function) of class spk::Profiler is not documented.
includes/debug/spk_profiler.hpp:24: Member RENDER_IPS_COUNTER (variable) of class spk::Profiler is not documented.
includes/debug/spk_profiler.hpp:25: Member UPDATE_IPS_COUNTER (variable) of class spk::Profiler is not documented.
includes/design_pattern/spk_context_manager.hpp:72: Member swapRequested() (function) of class spk::ContextManager::ReadOnlyAccessor is not documented.
includes/math/spk_perlin.hpp:31: Member Perlin(unsigned long p_seed=12500) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:33: Member seed() const (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:35: Member configureSeed(unsigned long p_seed) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:36: Member configureFrequency(float p_frequency) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:37: Member configurePersistance(float p_persistance) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:38: Member configureLacunarity(float p_lacunarity) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:39: Member configureOctave(size_t p_octaveValue) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:40: Member configureRange(float p_min, float p_max) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:42: Member sample(float p_x, float p_y=0, float p_z=0) (function) of class spk::Perlin is not documented.
includes/system/spk_chronometer.hpp:15: Member UNINITIALIZED (variable) of class spk::Chronometer is not documented.
|
1.0
|
Missing Doxygen Documentation - There is a set of documentation missing:
includes/math/spk_perlin.hpp:9: Compound spk::Perlin is not documented.
includes/application/modules/spk_profiler_module.hpp:7: Compound spk::ProfilerModule is not documented.
includes/graphics/windows/spk_window.hpp:1: member 'spk::Singleton<Window>' of class 'Window' cannot be found
includes/application/modules/spk_profiler_module.hpp:20: Member update() (function) of class spk::ProfilerModule is not documented.
includes/application/modules/spk_profiler_module.hpp:21: Member increaseRenderIPS() (function) of class spk::ProfilerModule is not documented.
includes/application/modules/spk_profiler_module.hpp:22: Member increaseUpdateIPS() (function) of class spk::ProfilerModule is not documented.
includes/debug/spk_profiler.hpp:84: Member increseCounter(const std::wstring &p_key) (function) of class spk::Profiler is not documented.
includes/debug/spk_profiler.hpp:85: Member setCounter(const std::wstring &p_key, const size_t &p_value) (function) of class spk::Profiler is not documented.
includes/debug/spk_profiler.hpp:86: Member resetCounter(const std::wstring &p_key) (function) of class spk::Profiler is not documented.
includes/debug/spk_profiler.hpp:87: Member counter(const std::wstring &p_key) const (function) of class spk::Profiler is not documented.
includes/debug/spk_profiler.hpp:24: Member RENDER_IPS_COUNTER (variable) of class spk::Profiler is not documented.
includes/debug/spk_profiler.hpp:25: Member UPDATE_IPS_COUNTER (variable) of class spk::Profiler is not documented.
includes/design_pattern/spk_context_manager.hpp:72: Member swapRequested() (function) of class spk::ContextManager::ReadOnlyAccessor is not documented.
includes/math/spk_perlin.hpp:31: Member Perlin(unsigned long p_seed=12500) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:33: Member seed() const (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:35: Member configureSeed(unsigned long p_seed) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:36: Member configureFrequency(float p_frequency) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:37: Member configurePersistance(float p_persistance) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:38: Member configureLacunarity(float p_lacunarity) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:39: Member configureOctave(size_t p_octaveValue) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:40: Member configureRange(float p_min, float p_max) (function) of class spk::Perlin is not documented.
includes/math/spk_perlin.hpp:42: Member sample(float p_x, float p_y=0, float p_z=0) (function) of class spk::Perlin is not documented.
includes/system/spk_chronometer.hpp:15: Member UNINITIALIZED (variable) of class spk::Chronometer is not documented.
|
non_defect
|
missing doxygen documentation there is a set of documentation missing includes math spk perlin hpp compound spk perlin is not documented includes application modules spk profiler module hpp compound spk profilermodule is not documented includes graphics windows spk window hpp member spk singleton of class window cannot be found includes application modules spk profiler module hpp member update function of class spk profilermodule is not documented includes application modules spk profiler module hpp member increaserenderips function of class spk profilermodule is not documented includes application modules spk profiler module hpp member increaseupdateips function of class spk profilermodule is not documented includes debug spk profiler hpp member incresecounter const std wstring p key function of class spk profiler is not documented includes debug spk profiler hpp member setcounter const std wstring p key const size t p value function of class spk profiler is not documented includes debug spk profiler hpp member resetcounter const std wstring p key function of class spk profiler is not documented includes debug spk profiler hpp member counter const std wstring p key const function of class spk profiler is not documented includes debug spk profiler hpp member render ips counter variable of class spk profiler is not documented includes debug spk profiler hpp member update ips counter variable of class spk profiler is not documented includes design pattern spk context manager hpp member swaprequested function of class spk contextmanager readonlyaccessor is not documented includes math spk perlin hpp member perlin unsigned long p seed function of class spk perlin is not documented includes math spk perlin hpp member seed const function of class spk perlin is not documented includes math spk perlin hpp member configureseed unsigned long p seed function of class spk perlin is not documented includes math spk perlin hpp member configurefrequency float p frequency function of class spk perlin is not documented includes math spk perlin hpp member configurepersistance float p persistance function of class spk perlin is not documented includes math spk perlin hpp member configurelacunarity float p lacunarity function of class spk perlin is not documented includes math spk perlin hpp member configureoctave size t p octavevalue function of class spk perlin is not documented includes math spk perlin hpp member configurerange float p min float p max function of class spk perlin is not documented includes math spk perlin hpp member sample float p x float p y float p z function of class spk perlin is not documented includes system spk chronometer hpp member uninitialized variable of class spk chronometer is not documented
| 0
|
191,214
| 6,826,948,383
|
IssuesEvent
|
2017-11-08 15:38:22
|
brave/browser-android-tabs
|
https://api.github.com/repos/brave/browser-android-tabs
|
opened
|
Google Assistant doesn't always work with Brave
|
bug priority/P3 QA/steps-specified
|
**Did you search for similar issues before submitting this one?**
Yes
**Description:**
Some devices allow for Google Assistant to display its UI when Brave is active, others do not.
**Device Details:**
- Install Type(ARM, x86): tested with ARM
- Device(Phone, Tablet, Phablet): tested with Phones
- Android Version: 6.0 and above
**Brave Version:**
1.0.37
**Steps to reproduce:**
1. Be sure you are using a device which has Google Assistant
2. Open an app that is not Brave (slack, instagram, etc)
3. Say 'OK Google' (if voice activation is enabled) or long press on home button/icon.
4. Google Assistant UI is displayed.
5. Exit this app and open Brave.
6. Say 'OK Google' (if voice activation is enabled) or long press on home button/icon.
**Actual Behavior**
On some devices Google Assistant UI is displayed, on some it is not. Generally though, after performing step 6 while Brave is active, you can still ask voice questions of Google Assistant.
**Expected Behavior**
Google Assistant should open its UI as it does for other apps.
**Is this an issue with Beta build?**
n/a
**Is this an issue in the currently released version?**
yes
**Can this issue be consistently reproduced?**
On a device where it doesn't work, yes.
**Extra QA steps:**
1.
2.
3.
**Website problems only:**
- did you check with Brave Shields down?
- did you check in Chrome for same behavior?
**Screenshot if needed:**
This is the setting to enable voice activation:

**Any related issues:**
|
1.0
|
Google Assistant doesn't always work with Brave - **Did you search for similar issues before submitting this one?**
Yes
**Description:**
Some devices allow for Google Assistant to display its UI when Brave is active, others do not.
**Device Details:**
- Install Type(ARM, x86): tested with ARM
- Device(Phone, Tablet, Phablet): tested with Phones
- Android Version: 6.0 and above
**Brave Version:**
1.0.37
**Steps to reproduce:**
1. Be sure you are using a device which has Google Assistant
2. Open an app that is not Brave (slack, instagram, etc)
3. Say 'OK Google' (if voice activation is enabled) or long press on home button/icon.
4. Google Assistant UI is displayed.
5. Exit this app and open Brave.
6. Say 'OK Google' (if voice activation is enabled) or long press on home button/icon.
**Actual Behavior**
On some devices Google Assistant UI is displayed, on some it is not. Generally though, after performing step 6 while Brave is active, you can still ask voice questions of Google Assistant.
**Expected Behavior**
Google Assistant should open its UI as it does for other apps.
**Is this an issue with Beta build?**
n/a
**Is this an issue in the currently released version?**
yes
**Can this issue be consistently reproduced?**
On a device where it doesn't work, yes.
**Extra QA steps:**
1.
2.
3.
**Website problems only:**
- did you check with Brave Shields down?
- did you check in Chrome for same behavior?
**Screenshot if needed:**
This is the setting to enable voice activation:

**Any related issues:**
|
non_defect
|
google assistant doesn t always work with brave did you search for similar issues before submitting this one yes description some devices allow for google assistant to display its ui when brave is active others do not device details install type arm tested with arm device phone tablet phablet tested with phones android version and above brave version steps to reproduce be sure you are using a device which has google assistant open an app that is not brave slack instagram etc say ok google if voice activation is enabled or long press on home button icon google assistant ui is displayed exit this app and open brave say ok google if voice activation is enabled or long press on home button icon actual behavior on some devices google assistant ui is displayed on some it is not generally though after performing step while brave is active you can still ask voice questions of google assistant expected behavior google assistant should open its ui as it does for other apps is this an issue with beta build n a is this an issue in the currently released version yes can this issue be consistently reproduced on a device where it doesn t work yes extra qa steps website problems only did you check with brave shields down did you check in chrome for same behavior screenshot if needed this is the setting to enable voice activation any related issues
| 0
|
12,991
| 2,732,850,450
|
IssuesEvent
|
2015-04-17 09:44:36
|
tiku01/oryx-editor
|
https://api.github.com/repos/tiku01/oryx-editor
|
closed
|
Link to a subprocess
|
auto-migrated Priority-Medium Type-Defect
|
```
I need some help in creating a expanded subprocess in Activiti Modeler 5.6.
After dropping the collapsed subprocess on the task, there is a collapsed
subproces sign +.
There is another process diagram which should be displayed as a subprocess when
I click on the + sign, how do I do this? I have been through the Guide and it
is still not clear to me. When I click on the + sign it opens another windows
with the same diagram, but without the + sign.
Thanks.
```
Original issue reported on code.google.com by `mascar...@gmail.com` on 20 Aug 2011 at 6:38
|
1.0
|
Link to a subprocess - ```
I need some help in creating a expanded subprocess in Activiti Modeler 5.6.
After dropping the collapsed subprocess on the task, there is a collapsed
subproces sign +.
There is another process diagram which should be displayed as a subprocess when
I click on the + sign, how do I do this? I have been through the Guide and it
is still not clear to me. When I click on the + sign it opens another windows
with the same diagram, but without the + sign.
Thanks.
```
Original issue reported on code.google.com by `mascar...@gmail.com` on 20 Aug 2011 at 6:38
|
defect
|
link to a subprocess i need some help in creating a expanded subprocess in activiti modeler after dropping the collapsed subprocess on the task there is a collapsed subproces sign there is another process diagram which should be displayed as a subprocess when i click on the sign how do i do this i have been through the guide and it is still not clear to me when i click on the sign it opens another windows with the same diagram but without the sign thanks original issue reported on code google com by mascar gmail com on aug at
| 1
|
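Each row's lowercased `text` field looks like a normalized copy of the issue title and body: lowercased, split on punctuation, with digit-containing tokens dropped (e.g. "9cpu" vanishes entirely, while "cpu" from "some.cpu.server" survives as its own token). A rough sketch of such a normalization — an assumption based on the visible pattern in the rows above, not the dataset's actual preprocessing pipeline, which may differ:

```python
import re


def normalize_issue_text(title: str, body: str) -> str:
    # Lowercase the combined title + body, split into alphanumeric tokens
    # on any run of other characters, then drop tokens containing digits.
    # This reproduces the pattern visible in the "text" column above.
    combined = f"{title} {body}".lower()
    tokens = re.split(r"[^a-z0-9]+", combined)
    kept = [t for t in tokens if t and not re.search(r"\d", t)]
    return " ".join(kept)
```

For example, the acme-sac row's title and reproduction step normalize consistently with the dump: `normalize_issue_text("9cpu/Feedkey broken", "9cpu -h some.cpu.server")` yields `"feedkey broken h some cpu server"`.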