| Column | Type | Range / distinct values |
|---|---|---|
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 5 – 112 |
| repo_url | string | length 34 – 141 |
| action | string | 3 classes |
| title | string | length 1 – 757 |
| labels | string | length 4 – 664 |
| body | string | length 3 – 261k |
| index | string | 10 classes |
| text_combine | string | length 96 – 261k |
| label | string | 2 classes |
| text | string | length 96 – 232k |
| binary_label | int64 | 0 – 1 |
56,344
| 15,025,866,026
|
IssuesEvent
|
2021-02-01 21:44:36
|
unascribed/Fabrication
|
https://api.github.com/repos/unascribed/Fabrication
|
closed
|
Billboard drops causes a crash with a compass
|
k: Defect n: Fabric
|
Just give yourself a compass or drop one and it will crash when billboard drops is enabled.
https://gist.github.com/Snowiez/5394f948a3ade0120e5c7729178bc263
|
1.0
|
Billboard drops causes a crash with a compass - Just give yourself a compass or drop one and it will crash when billboard drops is enabled.
https://gist.github.com/Snowiez/5394f948a3ade0120e5c7729178bc263
|
defect
|
billboard drops causes a crash with a compass just give yourself a compass or drop one and it will crash when billboard drops is enabled
| 1
|
391,347
| 26,887,608,690
|
IssuesEvent
|
2023-02-06 05:34:37
|
Seakimhour/pro-translate
|
https://api.github.com/repos/Seakimhour/pro-translate
|
closed
|
Data Storage Method
|
documentation
|
Each website has its own "local storage" and cannot read data stored by other sites, so local storage cannot be used.
Chrome provides the chrome.storage API for saving, retrieving, and tracking changes to user data.
This project uses the [WebExtension Polyfill](https://github.com/mozilla/webextension-polyfill), which lets the extension support more browsers.
#### User settings data
```js
settings: {
targetLanguage: { code: "en", country: "English" },
secondTargetLanguage: { code: "ja", country: "Japanese" },
autoSwitch: true,
targetFormat: "camel",
autoSetFormat: true,
showIcon: true,
cases: ["snake", "param", "camel", "pascal", "path", "constant", "dot"]
}
```
#### Screenshot of the settings page.
![Options-page](https://user-images.githubusercontent.com/47253399/216872568-9b4ddcee-fa48-4e3c-aa4b-300a04fcd56a.png)
#### The data used by this extension is stored as JS files.
[format-cases.js](https://github.com/Seakimhour/pro-translate/blob/master/src/assets/format-cases.js)
[languages.js](https://github.com/Seakimhour/pro-translate/blob/master/src/assets/languages.js)
They are loaded with `import` when needed:
```js
import { formatCases } from "../../assets/format-cases.js";
```
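The `cases` list in the settings object suggests a table mapping each case name to a conversion. The real format-cases.js is not shown here, so the following is only a hypothetical sketch of what such a table might look like; every name in it is an assumption, not the project's actual API.

```javascript
// Hypothetical sketch of a case-conversion table like the one format-cases.js
// might export; names and behavior are assumptions, not the real implementation.
const formatCases = {
  camel:    (words) => words.map((w, i) => i === 0 ? w : w[0].toUpperCase() + w.slice(1)).join(""),
  pascal:   (words) => words.map((w) => w[0].toUpperCase() + w.slice(1)).join(""),
  snake:    (words) => words.join("_"),
  param:    (words) => words.join("-"),
  path:     (words) => words.join("/"),
  dot:      (words) => words.join("."),
  constant: (words) => words.map((w) => w.toUpperCase()).join("_"),
};

// Apply the user's preferred targetFormat from the settings object above.
function applyFormat(text, format) {
  const words = text.toLowerCase().split(/\s+/);
  return (formatCases[format] || formatCases.camel)(words);
}

console.log(applyFormat("data storage method", "camel")); // dataStorageMethod
console.log(applyFormat("data storage method", "snake")); // data_storage_method
```

Keeping the converters in a plain JS module, as the project does, lets both the options page and the content script import the same table without a round-trip through chrome.storage.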
|
1.0
|
Data Storage Method - Each website has its own "local storage" and cannot read data stored by other sites, so local storage cannot be used.
Chrome provides the chrome.storage API for saving, retrieving, and tracking changes to user data.
This project uses the [WebExtension Polyfill](https://github.com/mozilla/webextension-polyfill), which lets the extension support more browsers.
#### User settings data
```js
settings: {
targetLanguage: { code: "en", country: "English" },
secondTargetLanguage: { code: "ja", country: "Japanese" },
autoSwitch: true,
targetFormat: "camel",
autoSetFormat: true,
showIcon: true,
cases: ["snake", "param", "camel", "pascal", "path", "constant", "dot"]
}
```
#### Screenshot of the settings page.
![Options-page](https://user-images.githubusercontent.com/47253399/216872568-9b4ddcee-fa48-4e3c-aa4b-300a04fcd56a.png)
#### The data used by this extension is stored as JS files.
[format-cases.js](https://github.com/Seakimhour/pro-translate/blob/master/src/assets/format-cases.js)
[languages.js](https://github.com/Seakimhour/pro-translate/blob/master/src/assets/languages.js)
They are loaded with `import` when needed:
```js
import { formatCases } from "../../assets/format-cases.js";
```
|
non_defect
|
data storage method each website has its own local storage and cannot read data stored by other sites so local storage cannot be used chrome provides the chrome storage api for saving retrieving and tracking changes to user data this project uses which lets the extension support more browsers user settings data js settings targetlanguage code en country english secondtargetlanguage code ja country japanese autoswitch true targetformat camel autosetformat true showicon true cases screenshot of the settings page the data used by this extension is stored as js files they are loaded with import when needed js import formatcases from assets format cases js
| 0
|
30,227
| 6,046,547,780
|
IssuesEvent
|
2017-06-12 12:25:51
|
autovpn4openwrt/autovpn-for-openwrt
|
https://api.github.com/repos/autovpn4openwrt/autovpn-for-openwrt
|
closed
|
I also found an issue: with many connections and heavy traffic, dnsmasq responds slowly
|
auto-migrated Priority-Medium Type-Defect
|
```
Intel D525, dual-core 1.8 GHz, with dual-core OpenWrt processing enabled. When the
number of connections reaches about 10,000, dnsmasq becomes slow, taking several
seconds to answer DNS requests. The dnsmasq website describes it as a small DNS
server; is there a better-performing DNS solution that could replace dnsmasq?
```
Original issue reported on code.google.com by `jiekec...@gmail.com` on 20 Nov 2014 at 3:14
|
1.0
|
I also found an issue: with many connections and heavy traffic, dnsmasq responds slowly - ```
Intel D525, dual-core 1.8 GHz, with dual-core OpenWrt processing enabled. When the
number of connections reaches about 10,000, dnsmasq becomes slow, taking several
seconds to answer DNS requests. The dnsmasq website describes it as a small DNS
server; is there a better-performing DNS solution that could replace dnsmasq?
```
Original issue reported on code.google.com by `jiekec...@gmail.com` on 20 Nov 2014 at 3:14
|
defect
|
i also found an issue with many connections and heavy traffic dnsmasq responds slowly intel with dual core openwrt processing enabled when the number of connections reaches about dnsmasq becomes slow taking several seconds to answer dns requests the dnsmasq website describes it as a small dns server is there a better performing dns solution that could replace dnsmasq original issue reported on code google com by jiekec gmail com on nov at
| 1
|
57,291
| 15,729,588,745
|
IssuesEvent
|
2021-03-29 15:00:46
|
danmar/testissues
|
https://api.github.com/repos/danmar/testissues
|
opened
|
Incorrect variable id, when delete is used. (Trac #269)
|
Incomplete Migration Migrated from Trac Other aggro80 defect
|
Migrated from https://trac.cppcheck.net/ticket/269
```json
{
"status": "closed",
"changetime": "2009-04-29T19:46:55",
"description": "{{{\nvoid f()\n{\n int *a;\n delete a;\n}\n}}}\n\nVariable id should be 1, not 2. \n{{{\n##file 0\n1: void f ( )\n2: {\n3: int * a@1 ;\n4: delete a@2 ;\n5: }\n}}}\n",
"reporter": "aggro80",
"cc": "",
"resolution": "fixed",
"_ts": "1241034415000000",
"component": "Other",
"summary": "Incorrect variable id, when delete is used.",
"priority": "",
"keywords": "",
"time": "2009-04-29T19:17:36",
"milestone": "1.32",
"owner": "aggro80",
"type": "defect"
}
```
|
1.0
|
Incorrect variable id, when delete is used. (Trac #269) - Migrated from https://trac.cppcheck.net/ticket/269
```json
{
"status": "closed",
"changetime": "2009-04-29T19:46:55",
"description": "{{{\nvoid f()\n{\n int *a;\n delete a;\n}\n}}}\n\nVariable id should be 1, not 2. \n{{{\n##file 0\n1: void f ( )\n2: {\n3: int * a@1 ;\n4: delete a@2 ;\n5: }\n}}}\n",
"reporter": "aggro80",
"cc": "",
"resolution": "fixed",
"_ts": "1241034415000000",
"component": "Other",
"summary": "Incorrect variable id, when delete is used.",
"priority": "",
"keywords": "",
"time": "2009-04-29T19:17:36",
"milestone": "1.32",
"owner": "aggro80",
"type": "defect"
}
```
|
defect
|
incorrect variable id when delete is used trac migrated from json status closed changetime description nvoid f n n int a n delete a n n n nvariable id should be not n n file void f int a delete a n n reporter cc resolution fixed ts component other summary incorrect variable id when delete is used priority keywords time milestone owner type defect
| 1
|
243,461
| 20,388,708,895
|
IssuesEvent
|
2022-02-22 09:48:01
|
ZcashFoundation/zebra
|
https://api.github.com/repos/ZcashFoundation/zebra
|
closed
|
Coverage changes are inaccurate, because the job doesn't run on `main`
|
C-bug A-devops P-Medium :zap: C-testing
|
## Motivation
> By the way, why does this PR change test coverage when it only edits comments? The bot says there will be 854 new hits.
## Suggested Solution
Our `main` branch coverage seems to be out of date, maybe we need to start running our coverage job on `main` again.
_Originally posted by @teor2345 in https://github.com/ZcashFoundation/zebra/issues/3521#issuecomment-1039479864_
|
1.0
|
Coverage changes are inaccurate, because the job doesn't run on `main` - ## Motivation
> By the way, why does this PR change test coverage when it only edits comments? The bot says there will be 854 new hits.
## Suggested Solution
Our `main` branch coverage seems to be out of date, maybe we need to start running our coverage job on `main` again.
_Originally posted by @teor2345 in https://github.com/ZcashFoundation/zebra/issues/3521#issuecomment-1039479864_
|
non_defect
|
coverage changes are inaccurate because the job doesn t run on main motivation by the way why does this pr change test coverage when it only edits comments the bot says there will be new hits suggested solution our main branch coverage seems to be out of date maybe we need to start running our coverage job on main again originally posted by in
| 0
|
24,242
| 5,040,053,789
|
IssuesEvent
|
2016-12-19 02:33:00
|
coreos/bugs
|
https://api.github.com/repos/coreos/bugs
|
closed
|
document or change journald rate limit defaults
|
area/usability component/systemd kind/documentation team/os
|
### Desired Feature ###
Lots of people complain about journald locking up on them. As a solution, perhaps we should set an even lower limit on the number of logs per second per service in journald.conf.
Docs: https://www.freedesktop.org/software/systemd/man/journald.conf.html#RateLimitIntervalSec=
I believe this is what other distros do.
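The referenced knobs live in the `[Journal]` section of journald.conf. A minimal sketch of what a lowered default might look like; the values below are illustrative assumptions, not the distro's actual settings:

```ini
# /etc/systemd/journald.conf: sketch of tighter per-service rate limits.
# Values are illustrative only. Messages beyond RateLimitBurst within
# RateLimitIntervalSec are dropped for that service until the interval ends.
[Journal]
RateLimitIntervalSec=30s
RateLimitBurst=1000
```

After editing, the change takes effect with `systemctl restart systemd-journald`.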
|
1.0
|
document or change journald rate limit defaults - ### Desired Feature ###
Lots of people complain about journald locking up on them. As a solution, perhaps we should set an even lower limit on the number of logs per second per service in journald.conf.
Docs: https://www.freedesktop.org/software/systemd/man/journald.conf.html#RateLimitIntervalSec=
I believe this is what other distros do.
|
non_defect
|
document or change journald rate limit defaults desired feature lots of people complain about journald locking up on them as a solution perhaps we set an even lower limit on number of logs per second per service in journald conf docs i believe this is what other distros do
| 0
|
343,837
| 30,695,175,852
|
IssuesEvent
|
2023-07-26 18:00:41
|
microsoft/AzureStorageExplorer
|
https://api.github.com/repos/microsoft/AzureStorageExplorer
|
closed
|
'View Options' panel doesn't disappear after switching to 'Snapshots/Versions' view
|
🧪 testing :gear: blobs :beetle: regression
|
**Storage Explorer Version**: 1.31.0-dev
**Build Number**: 20230726.3
**Branch**: main
**Platform/OS**: Windows 10/Linux Ubuntu 20.04/MacOS Ventura 13.4.1 (Apple M1 Pro)
**Architecture**: x64/x64/arm64
**How Found**: From running test cases
**Regression From**: Previous release (1.30.2)
## Steps to Reproduce ##
1. Expand one storage account -> Blob Containers.
2. Create a blob container -> Right click the blob container -> Click 'Open in React'.
3. Upload a blob -> Open 'View Options' panel.
4. Right click the blob -> Click 'Manage History -> Manage Versions'.
5. Check whether 'View Options' panel disappears.
## Expected Experience ##
'View Options' panel disappears.
## Actual Experience ##
'View Options' panel doesn't disappear.

|
1.0
|
'View Options' panel doesn't disappear after switching to 'Snapshots/Versions' view - **Storage Explorer Version**: 1.31.0-dev
**Build Number**: 20230726.3
**Branch**: main
**Platform/OS**: Windows 10/Linux Ubuntu 20.04/MacOS Ventura 13.4.1 (Apple M1 Pro)
**Architecture**: x64/x64/arm64
**How Found**: From running test cases
**Regression From**: Previous release (1.30.2)
## Steps to Reproduce ##
1. Expand one storage account -> Blob Containers.
2. Create a blob container -> Right click the blob container -> Click 'Open in React'.
3. Upload a blob -> Open 'View Options' panel.
4. Right click the blob -> Click 'Manage History -> Manage Versions'.
5. Check whether 'View Options' panel disappears.
## Expected Experience ##
'View Options' panel disappears.
## Actual Experience ##
'View Options' panel doesn't disappear.

|
non_defect
|
view options panel doesn t disappear after switching to snapshots versions view storage explorer version dev build number branch main platform os windows linux ubuntu macos ventura apple pro architecture how found from running test cases regression from previous release steps to reproduce expand one storage account blob containers create a blob container right click the blob container click open in react upload a blob open view options panel right click the blob click manage history manage versions check whether view options panel disappears expected experience view options panel disappears actual experience view options panel doesn t disappear
| 0
|
10,547
| 2,622,171,862
|
IssuesEvent
|
2015-03-04 00:14:51
|
byzhang/rapidjson
|
https://api.github.com/repos/byzhang/rapidjson
|
closed
|
Memory access error due to 'memcmp'
|
auto-migrated Priority-Medium Type-Defect
|
```
I tried to use rapidjson on the large json, and valgrind/memcheck finds errors,
see below. The offending line is this:
if (name[member->name.data_.s.length] == '\0' &&
memcmp(member->name.data_.s.str, name, member->name.data_.s.length *
sizeof(Ch)) == 0)
This happens during map value lookup. 'memcmp' can't be used in this place,
because some keys can be longer than the supplied value, and it is illegal to
read a string past its terminating zero character.
In fact, this bug can cause segmentation fault if the end of the string
supplied by the caller would happen to align with the end of the memory segment.
---error log---
==81117== Invalid read of size 1
==81117== at 0x110A543: memcmp (mc_replace_strmem.c:1001)
==81117== by 0x49F01C: rapidjson::GenericValue<rapidjson::UTF8<char>,
rapidjson::MemoryPoolAllocator<rapidjson::CrtAllocator> >::FindMember(char
const*) (document.h:271)
==81117== by 0x49EECC: rapidjson::GenericValue<rapidjson::UTF8<char>,
rapidjson::MemoryPoolAllocator<rapidjson::CrtAllocator> >::operator[](char
const*) (document.h:239)
==81117== by 0x49EE9C: rapidjson::GenericValue<rapidjson::UTF8<char>,
rapidjson::MemoryPoolAllocator<rapidjson::CrtAllocator> >::operator[](char
const*) const (document.h:247)
```
Original issue reported on code.google.com by `yuriv...@gmail.com` on 28 Apr 2014 at 11:42
* Merged into: #108
|
1.0
|
Memory access error due to 'memcmp' - ```
I tried to use rapidjson on the large json, and valgrind/memcheck finds errors,
see below. The offending line is this:
if (name[member->name.data_.s.length] == '\0' &&
memcmp(member->name.data_.s.str, name, member->name.data_.s.length *
sizeof(Ch)) == 0)
This happens during map value lookup. 'memcmp' can't be used in this place,
because some keys can be longer than the supplied value, and it is illegal to
read a string past its terminating zero character.
In fact, this bug can cause segmentation fault if the end of the string
supplied by the caller would happen to align with the end of the memory segment.
---error log---
==81117== Invalid read of size 1
==81117== at 0x110A543: memcmp (mc_replace_strmem.c:1001)
==81117== by 0x49F01C: rapidjson::GenericValue<rapidjson::UTF8<char>,
rapidjson::MemoryPoolAllocator<rapidjson::CrtAllocator> >::FindMember(char
const*) (document.h:271)
==81117== by 0x49EECC: rapidjson::GenericValue<rapidjson::UTF8<char>,
rapidjson::MemoryPoolAllocator<rapidjson::CrtAllocator> >::operator[](char
const*) (document.h:239)
==81117== by 0x49EE9C: rapidjson::GenericValue<rapidjson::UTF8<char>,
rapidjson::MemoryPoolAllocator<rapidjson::CrtAllocator> >::operator[](char
const*) const (document.h:247)
```
Original issue reported on code.google.com by `yuriv...@gmail.com` on 28 Apr 2014 at 11:42
* Merged into: #108
|
defect
|
memory access error due to memcmp i tried to use rapidjson on the large json and valgrind memcheck finds errors see below the offending line is this if name memcmp member name data s str name member name data s length sizeof ch this happens during map value lookup memcmp can t be used in this place because some keys can be longer than the supplied value and it is illegal to read a string past its terminating zero character in fact this bug can cause segmentation fault if the end of the string supplied by the caller would happen to align with the end of the memory segment error log invalid read of size at memcmp mc replace strmem c by rapidjson genericvalue rapidjson memorypoolallocator findmember char const document h by rapidjson genericvalue rapidjson memorypoolallocator operator char const document h by rapidjson genericvalue rapidjson memorypoolallocator operator char const const document h original issue reported on code google com by yuriv gmail com on apr at merged into
| 1
|
536,861
| 15,715,930,945
|
IssuesEvent
|
2021-03-28 04:16:52
|
AY2021S2-CS2103-T16-3/tp
|
https://api.github.com/repos/AY2021S2-CS2103-T16-3/tp
|
opened
|
Ui display
|
priority.Low
|
currently if a lot of bookings are added the size of the residence row keeps increasing. might be better to make it fixed size with a scroll pane?
|
1.0
|
Ui display - currently if a lot of bookings are added the size of the residence row keeps increasing. might be better to make it fixed size with a scroll pane?
|
non_defect
|
ui display currently if a lot of bookings are added the size of the residence row keeps increasing might be better to make it fixed size with a scroll pane
| 0
|
368,103
| 25,776,666,639
|
IssuesEvent
|
2022-12-09 12:35:37
|
bounswe/bounswe2022group9
|
https://api.github.com/repos/bounswe/bounswe2022group9
|
opened
|
[Documentation] Create Executive part for backend
|
Documentation Backend
|
For the milestone 2, executive part should be created for the backend.
|
1.0
|
[Documentation] Create Executive part for backend - For the milestone 2, executive part should be created for the backend.
|
non_defect
|
create executive part for backend for the milestone executive part should be created for the backend
| 0
|
18,392
| 3,054,473,318
|
IssuesEvent
|
2015-08-13 02:58:57
|
eczarny/spectacle
|
https://api.github.com/repos/eczarny/spectacle
|
closed
|
Cancel "edit hot key" process
|
defect ★★
|
I can't abort the "edit hot key" process, once entered. It doesn't help to close the preferences panel.
--
Cheers and congratulations on creating the best program for this purpose :)
|
1.0
|
Cancel "edit hot key" process - I can't abort the "edit hot key" process, once entered. It doesn't help to close the preferences panel.
--
Cheers and congratulations on creating the best program for this purpose :)
|
defect
|
cancel edit hot key process i can t abort the edit hot key process once entered it doesn t help to close the preferences panel cheers and congratulations on creating the best program for this purpose
| 1
|
47,169
| 5,867,701,568
|
IssuesEvent
|
2017-05-14 04:08:00
|
nix-rust/nix
|
https://api.github.com/repos/nix-rust/nix
|
opened
|
Continuous testing for FreeBSD
|
A-testing O-freebsd
|
I'm working on a buildbot cluster that could support continuous testing on FreeBSD for several projects. I've got a prototype running, and you can see a PR in action at https://github.com/asomers/mio-aio/pull/2 . The hardest question is how to secure it. Since anybody can open a PR, that means that anybody can run arbitrary code on the buildslaves. The potential damage is limited; each project gets its own worker, each worker runs in its own jail, and there's a timeout on each build. I could use the firewall to prevent workers from sending email and stuff, but I can't completely isolate workers from the internet without breaking a lot of builds. There are a few options to improve the security situation.
1. Don't automatically build a PR until a maintainer posts a specific comment. This is what open-zfs does. It completely eliminates unreviewed code from running on the workers. However, it's inconvenient for people who are accustomed to Travis building stuff without needing to be asked.
2. Destroy and reclone the worker for each build. This is what Travis does. The worker still runs untrusted code, but not for long. The untrusted code can't modify the filesystem in any persistent way.
3. A hybrid of the previous two. Do a build when a maintainer gives the magic comment, but also do builds automatically whenever a PR is posted by a known-good contributor. The list of contributors would probably have to be maintained by hand, but it could be much larger than the list of maintainers.
Does anybody have any better ideas?
|
1.0
|
Continuous testing for FreeBSD - I'm working on a buildbot cluster that could support continuous testing on FreeBSD for several projects. I've got a prototype running, and you can see a PR in action at https://github.com/asomers/mio-aio/pull/2 . The hardest question is how to secure it. Since anybody can open a PR, that means that anybody can run arbitrary code on the buildslaves. The potential damage is limited; each project gets its own worker, each worker runs in its own jail, and there's a timeout on each build. I could use the firewall to prevent workers from sending email and stuff, but I can't completely isolate workers from the internet without breaking a lot of builds. There are a few options to improve the security situation.
1. Don't automatically build a PR until a maintainer posts a specific comment. This is what open-zfs does. It completely eliminates unreviewed code from running on the workers. However, it's inconvenient for people who are accustomed to Travis building stuff without needing to be asked.
2. Destroy and reclone the worker for each build. This is what Travis does. The worker still runs untrusted code, but not for long. The untrusted code can't modify the filesystem in any persistent way.
3. A hybrid of the previous two. Do a build when a maintainer gives the magic comment, but also do builds automatically whenever a PR is posted by a known-good contributor. The list of contributors would probably have to be maintained by hand, but it could be much larger than the list of maintainers.
Does anybody have any better ideas?
|
non_defect
|
continuous testing for freebsd i m working on a buildbot cluster that could support continuous testing on freebsd for several projects i ve got a prototype running and you can see a pr in action at the hardest question is how to secure it since anybody can open a pr that means that anybody can run arbitrary code on the buildslaves the potential damage is limited each project gets its own worker each worker runs in its own jail and there s a timeout on each build i could use the firewall to prevent workers from sending email and stuff but i can t completely isolate workers from the internet without breaking a lot of builds there are a few options to improve the security situation don t automatically build a pr until a maintainer posts a specific comment this is what open zfs does it completely eliminates unreviewed code from running on the workers however it s inconvenient for people who are accustomed to travis building stuff without needing to be asked destroy and reclone the worker for each build this is what travis does the worker still runs untrusted code but not for long the untrusted code can t modify the filesystem in any persistent way a hybrid of the previous two do a build when a maintainer give the magic comment but also do builds automatically whenever a pr is posted by a known good contributor the list of contributors would probably have to be maintained by hand but it could be much larger than the list of maintainers does anybody have any better ideas
| 0
|
44,241
| 12,066,854,656
|
IssuesEvent
|
2020-04-16 12:29:34
|
primefaces/primefaces
|
https://api.github.com/repos/primefaces/primefaces
|
closed
|
DataTable rowsPerPageTemplate-ShowAll+ filtering throws NumberFormatException
|
defect
|
## 1) Environment
- PrimeFaces version: PrimeFaces 8.0
- Application server + version: Wildfly 16
- Affected browsers: all
## 2) Expected behavior
When I select "ShowAll" in DataTable's rows per page drop-down list, all the rows should be filtered.
## 3) Actual behavior
When the "ShowAll" option is selected, an NumberFormatException is thrown during filtering
java.lang.NumberFormatException: For input string: "*"
at java.base/java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.base/java.lang.Integer.parseInt(Integer.java:638)
at java.base/java.lang.Integer.parseInt(Integer.java:770)
at deployment.test.war//org.primefaces.component.datatable.feature.FilterFeature.encode(FilterFeature.java:106)
at deployment.test.war//org.primefaces.component.datatable.DataTableRenderer.encodeEnd(DataTableRenderer.java:88)
at javax.faces.api@2.3.9.SP01//javax.faces.component.UIComponentBase.encodeEnd(UIComponentBase.java:595)
## 4) Steps to reproduce
Set the "records per page" to "all" and make some filtering
## 5) Sample XHTML
<h:form>
<p:dataTable value="#{dataTableController.rows}" var="row" paginator="true"
rowsPerPageTemplate="5,10,{ShowAll|'All'}" rows="5">
<p:column filterBy="#{row}" filterMatchMode="contains">
<h:outputText value="#{row}"/>
</p:column>
</p:dataTable>
</h:form>
## 6) Sample bean
@Named
@ApplicationScoped
public class DataTableController implements Serializable {
private static final int ROW_COUNT = 15;
private static final List<String> ROWS = new ArrayList<>(ROW_COUNT);
static {
for (int i = 0; i < ROW_COUNT; i++) {
ROWS.add("Row " + i);
}
}
public List<String> getRows() {
return ROWS;
}
}
|
1.0
|
DataTable rowsPerPageTemplate-ShowAll+ filtering throws NumberFormatException -
## 1) Environment
- PrimeFaces version: PrimeFaces 8.0
- Application server + version: Wildfly 16
- Affected browsers: all
## 2) Expected behavior
When I select "ShowAll" in DataTable's rows per page drop-down list, all the rows should be filtered.
## 3) Actual behavior
When the "ShowAll" option is selected, an NumberFormatException is thrown during filtering
java.lang.NumberFormatException: For input string: "*"
at java.base/java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.base/java.lang.Integer.parseInt(Integer.java:638)
at java.base/java.lang.Integer.parseInt(Integer.java:770)
at deployment.test.war//org.primefaces.component.datatable.feature.FilterFeature.encode(FilterFeature.java:106)
at deployment.test.war//org.primefaces.component.datatable.DataTableRenderer.encodeEnd(DataTableRenderer.java:88)
at javax.faces.api@2.3.9.SP01//javax.faces.component.UIComponentBase.encodeEnd(UIComponentBase.java:595)
## 4) Steps to reproduce
Set the "records per page" to "all" and make some filtering
## 5) Sample XHTML
<h:form>
<p:dataTable value="#{dataTableController.rows}" var="row" paginator="true"
rowsPerPageTemplate="5,10,{ShowAll|'All'}" rows="5">
<p:column filterBy="#{row}" filterMatchMode="contains">
<h:outputText value="#{row}"/>
</p:column>
</p:dataTable>
</h:form>
## 6) Sample bean
@Named
@ApplicationScoped
public class DataTableController implements Serializable {
private static final int ROW_COUNT = 15;
private static final List<String> ROWS = new ArrayList<>(ROW_COUNT);
static {
for (int i = 0; i < ROW_COUNT; i++) {
ROWS.add("Row " + i);
}
}
public List<String> getRows() {
return ROWS;
}
}
|
defect
|
datatable rowsperpagetemplate showall filtering throws numberformatexception environment primefaces version primesfaces application server version wildfly affected browsers all expected behavior when i select showall in datatable s rows per page drop down list all the rows should be filtered actual behavior when the showall option is selected an numberformatexception is thrown during filtering java lang numberformatexception for input string at java base java lang numberformatexception forinputstring numberformatexception java at java base java lang integer parseint integer java at java base java lang integer parseint integer java at deployment test war org primefaces component datatable feature filterfeature encode filterfeature java at deployment test war org primefaces component datatable datatablerenderer encodeend datatablerenderer java at javax faces api javax faces component uicomponentbase encodeend uicomponentbase java steps to reproduce set the records per page to all and make some filtering sample xhtml p datatable value datatablecontroller rows var row paginator true rowsperpagetemplate showall all rows sample bean named applicationscoped public class datatablecontroller implements serializable private static final int row count private static final list rows new arraylist row count static for int i i row count i rows add row i public list getrows return rows
| 1
|
4,632
| 2,610,135,411
|
IssuesEvent
|
2015-02-26 18:42:41
|
chrsmith/hedgewars
|
https://api.github.com/repos/chrsmith/hedgewars
|
closed
|
The Rope shot into a crate remains in the air after collecting
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
1. Shooting a rope into a crate
2. Collecting it
What is the expected output? What do you see instead?
The rope tears/disappears, and the hog falls
What version of the product are you using? On what operating system?
0.9.13, Win7
Please provide any additional information below.
http://www.youtube.com/watch?v=3-fhUBxKMW8
```
-----
Original issue reported on code.google.com by `joship...@gmail.com` on 15 Sep 2010 at 10:12
* Merged into: #40
|
1.0
|
The Rope shot into a crate remains in the air after collecting - ```
What steps will reproduce the problem?
1. Shooting a rope into a crate
2. Collecting it
What is the expected output? What do you see instead?
The rope tears/disappears, and the hog falls
What version of the product are you using? On what operating system?
0.9.13, Win7
Please provide any additional information below.
http://www.youtube.com/watch?v=3-fhUBxKMW8
```
-----
Original issue reported on code.google.com by `joship...@gmail.com` on 15 Sep 2010 at 10:12
* Merged into: #40
|
defect
|
the rope shot into a crate remains in the air after collecting what steps will reproduce the problem shooting a rope into a crate collecting it what is the expected output what do you see instead the rope tears disappears and the hog falls what version of the product are you using on what operating system please provide any additional information below original issue reported on code google com by joship gmail com on sep at merged into
| 1
|
63,693
| 12,368,303,475
|
IssuesEvent
|
2020-05-18 13:38:35
|
sourcegraph/sourcegraph
|
https://api.github.com/repos/sourcegraph/sourcegraph
|
closed
|
Make code insights user-resizable and reorderable
|
code insights stretch-goal webapp
|
The user should have freedom to reorder and resize code insights to create a dashboard.
This should be persistable to user, org or global settings.
This also allows us to give fixed default width and height to views, which makes the UI stable (in combination with #10375).
First idea is to use CSS grid with native CSS `resize`, then a `ResizeObserver` to persist it. Could snap to grids with the CSS grid `span` keyword.
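The persistence half of that idea needs a DOM and a `ResizeObserver`, but the snapping half is pure arithmetic and can be sketched on its own. Everything below is hypothetical; the issue only outlines the approach, so the names, column width, and track count are assumptions.

```javascript
// Sketch of snapping a freely resized insight panel to CSS grid spans.
// GRID_COLUMN_PX is an assumed column width (including gap), not a real value
// from the Sourcegraph codebase.
const GRID_COLUMN_PX = 220;

function snapToSpan(widthPx, columnPx = GRID_COLUMN_PX, maxSpan = 12) {
  // Round the observed pixel width to the nearest whole number of columns,
  // clamped to at least 1 and at most the grid's track count.
  const span = Math.round(widthPx / columnPx);
  return Math.min(maxSpan, Math.max(1, span));
}

// A ResizeObserver callback could then persist the result to user settings
// and render it back as:  grid-column: span ${snapToSpan(width)};
console.log(snapToSpan(500)); // 2
console.log(snapToSpan(80));  // 1
```

Persisting the span count rather than raw pixels keeps the stored setting meaningful across viewport sizes, which fits the goal of saving layout to user, org, or global settings.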
|
1.0
|
Make code insights user-resizable and reorderable - The user should have freedom to reorder and resize code insights to create a dashboard.
This should be persistable to user, org or global settings.
This also allows us to give fixed default width and height to views, which makes the UI stable (in combination with #10375).
First idea is to use CSS grid with native CSS `resize`, then a `ResizeObserver` to persist it. Could snap to grids with the CSS grid `span` keyword.
|
non_defect
|
make code insights user resizable and reorderable the user should have freedom to reorder and resize code insights to create a dashboard this should be persistable to user org or global settings this also allows us to give fixed default width and height to views which makes the ui stable in combination with first idea is to use css grid with native css resize then a resizeobserver to persist it could snap to grids with the css grid span keyword
| 0
|
49,872
| 13,466,604,668
|
IssuesEvent
|
2020-09-09 23:21:23
|
wrbejar/JavaVulnerableLab
|
https://api.github.com/repos/wrbejar/JavaVulnerableLab
|
opened
|
CVE-2018-1000632 (High) detected in dom4j-1.6.1.jar
|
security vulnerability
|
## CVE-2018-1000632 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>dom4j-1.6.1.jar</b></p></summary>
<p>dom4j: the flexible XML framework for Java</p>
<p>Library home page: <a href="http://dom4j.org">http://dom4j.org</a></p>
<p>Path to vulnerable library: _depth_0/JavaVulnerableLab/target/JavaVulnerableLab/META-INF/maven/org.cysecurity/JavaVulnerableLab/target/JavaVulnerableLab/WEB-INF/lib/dom4j-1.6.1.jar,/JavaVulnerableLab/target/JavaVulnerableLab/WEB-INF/lib/dom4j-1.6.1.jar,_depth_0/JavaVulnerableLab/bin/target/JavaVulnerableLab/WEB-INF/lib/dom4j-1.6.1.jar,/home/wss-scanner/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar,/home/wss-scanner/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar,_depth_0/JavaVulnerableLab/target/JavaVulnerableLab/WEB-INF/lib/dom4j-1.6.1.jar,/home/wss-scanner/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar,_depth_0/JavaVulnerableLab/bin/target/JavaVulnerableLab/META-INF/maven/org.cysecurity/JavaVulnerableLab/target/JavaVulnerableLab/WEB-INF/lib/dom4j-1.6.1.jar,/JavaVulnerableLab/bin/target/JavaVulnerableLab/WEB-INF/lib/dom4j-1.6.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **dom4j-1.6.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/wrbejar/JavaVulnerableLab/commit/29032cb446233dde79d67459af426f67b9224d28">29032cb446233dde79d67459af426f67b9224d28</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
dom4j version prior to version 2.1.1 contains a CWE-91: XML Injection vulnerability in Class: Element. Methods: addElement, addAttribute that can result in an attacker tampering with XML documents through XML injection. This attack appear to be exploitable via an attacker specifying attributes or elements in the XML document. This vulnerability appears to have been fixed in 2.1.1 or later.
<p>Publish Date: 2018-08-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000632>CVE-2018-1000632</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632</a></p>
<p>Release Date: 2018-08-20</p>
<p>Fix Resolution: org.dom4j:dom4j:2.0.3</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"dom4j","packageName":"dom4j","packageVersion":"1.6.1","isTransitiveDependency":false,"dependencyTree":"dom4j:dom4j:1.6.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.dom4j:dom4j:2.0.3"}],"vulnerabilityIdentifier":"CVE-2018-1000632","vulnerabilityDetails":"dom4j version prior to version 2.1.1 contains a CWE-91: XML Injection vulnerability in Class: Element. Methods: addElement, addAttribute that can result in an attacker tampering with XML documents through XML injection. This attack appear to be exploitable via an attacker specifying attributes or elements in the XML document. This vulnerability appears to have been fixed in 2.1.1 or later.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000632","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
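The CWE-91 issue described above is easiest to see outside dom4j. The sketch below is illustrative only (it is not dom4j's API): it shows how unescaped input placed inside an XML attribute lets an attacker inject elements, and how escaping the attribute value prevents the breakout.

```javascript
// Illustrative only -- this is NOT dom4j; it just demonstrates the
// XML-injection shape the CVE describes. A minimal escaper for the three
// characters that matter inside a double-quoted XML attribute value.
function escapeXmlAttr(value) {
  return value
    .replace(/&/g, "&amp;")   // must run first so later entities survive
    .replace(/</g, "&lt;")
    .replace(/"/g, "&quot;");
}

const payload = '" /><admin>true</admin><x y="'; // attacker-controlled input
const unsafe = `<user name="${payload}" />`;      // breaks out of the attribute
const safe = `<user name="${escapeXmlAttr(payload)}" />`; // stays a plain value
```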
|
True
|
CVE-2018-1000632 (High) detected in dom4j-1.6.1.jar - ## CVE-2018-1000632 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>dom4j-1.6.1.jar</b></p></summary>
<p>dom4j: the flexible XML framework for Java</p>
<p>Library home page: <a href="http://dom4j.org">http://dom4j.org</a></p>
<p>Path to vulnerable library: _depth_0/JavaVulnerableLab/target/JavaVulnerableLab/META-INF/maven/org.cysecurity/JavaVulnerableLab/target/JavaVulnerableLab/WEB-INF/lib/dom4j-1.6.1.jar,/JavaVulnerableLab/target/JavaVulnerableLab/WEB-INF/lib/dom4j-1.6.1.jar,_depth_0/JavaVulnerableLab/bin/target/JavaVulnerableLab/WEB-INF/lib/dom4j-1.6.1.jar,/home/wss-scanner/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar,/home/wss-scanner/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar,_depth_0/JavaVulnerableLab/target/JavaVulnerableLab/WEB-INF/lib/dom4j-1.6.1.jar,/home/wss-scanner/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar,_depth_0/JavaVulnerableLab/bin/target/JavaVulnerableLab/META-INF/maven/org.cysecurity/JavaVulnerableLab/target/JavaVulnerableLab/WEB-INF/lib/dom4j-1.6.1.jar,/JavaVulnerableLab/bin/target/JavaVulnerableLab/WEB-INF/lib/dom4j-1.6.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **dom4j-1.6.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/wrbejar/JavaVulnerableLab/commit/29032cb446233dde79d67459af426f67b9224d28">29032cb446233dde79d67459af426f67b9224d28</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
dom4j version prior to version 2.1.1 contains a CWE-91: XML Injection vulnerability in Class: Element. Methods: addElement, addAttribute that can result in an attacker tampering with XML documents through XML injection. This attack appear to be exploitable via an attacker specifying attributes or elements in the XML document. This vulnerability appears to have been fixed in 2.1.1 or later.
<p>Publish Date: 2018-08-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000632>CVE-2018-1000632</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632</a></p>
<p>Release Date: 2018-08-20</p>
<p>Fix Resolution: org.dom4j:dom4j:2.0.3</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"dom4j","packageName":"dom4j","packageVersion":"1.6.1","isTransitiveDependency":false,"dependencyTree":"dom4j:dom4j:1.6.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.dom4j:dom4j:2.0.3"}],"vulnerabilityIdentifier":"CVE-2018-1000632","vulnerabilityDetails":"dom4j version prior to version 2.1.1 contains a CWE-91: XML Injection vulnerability in Class: Element. Methods: addElement, addAttribute that can result in an attacker tampering with XML documents through XML injection. This attack appear to be exploitable via an attacker specifying attributes or elements in the XML document. This vulnerability appears to have been fixed in 2.1.1 or later.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1000632","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_defect
|
cve high detected in jar cve high severity vulnerability vulnerable library jar the flexible xml framework for java library home page a href path to vulnerable library depth javavulnerablelab target javavulnerablelab meta inf maven org cysecurity javavulnerablelab target javavulnerablelab web inf lib jar javavulnerablelab target javavulnerablelab web inf lib jar depth javavulnerablelab bin target javavulnerablelab web inf lib jar home wss scanner repository jar home wss scanner repository jar depth javavulnerablelab target javavulnerablelab web inf lib jar home wss scanner repository jar depth javavulnerablelab bin target javavulnerablelab meta inf maven org cysecurity javavulnerablelab target javavulnerablelab web inf lib jar javavulnerablelab bin target javavulnerablelab web inf lib jar dependency hierarchy x jar vulnerable library found in head commit a href vulnerability details version prior to version contains a cwe xml injection vulnerability in class element methods addelement addattribute that can result in an attacker tampering with xml documents through xml injection this attack appear to be exploitable via an attacker specifying attributes or elements in the xml document this vulnerability appears to have been fixed in or later publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails version prior to version contains a cwe xml injection vulnerability in class element methods addelement addattribute that can result 
in an attacker tampering with xml documents through xml injection this attack appear to be exploitable via an attacker specifying attributes or elements in the xml document this vulnerability appears to have been fixed in or later vulnerabilityurl
| 0
|
2,234
| 3,736,653,006
|
IssuesEvent
|
2016-03-08 16:36:56
|
pgharts/trusty-clipped-extension
|
https://api.github.com/repos/pgharts/trusty-clipped-extension
|
opened
|
Brakeman: Possible SQL injection
|
security
|
Possible SQL injection
where(["asset_content_type IN (#{mimes.map{'?'}.join(',')})", *mimes])
Found in app/models/asset.rb by brakeman
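Brakeman flags this line because of the string interpolation, although only generated `?` placeholder markers are interpolated; the mime values themselves are passed separately as bind parameters. A language-neutral sketch of that pattern (the function name is hypothetical):

```javascript
// Sketch of the placeholder-building pattern from the flagged Ruby code:
// only generated "?" markers are interpolated into the SQL fragment, while
// the actual values travel separately as bind parameters.
function inClause(column, values) {
  const placeholders = values.map(() => "?").join(",");
  return [`${column} IN (${placeholders})`, values];
}

const [sql, binds] = inClause("asset_content_type", ["image/png", "image/gif"]);
```

Because no user-supplied value ever reaches the SQL text, this shape is safe even though a static analyzer may still report the interpolation.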
|
True
|
Brakeman: Possible SQL injection - Possible SQL injection
where(["asset_content_type IN (#{mimes.map{'?'}.join(',')})", *mimes])
Found in app/models/asset.rb by brakeman
|
non_defect
|
brakeman possible sql injection possible sql injection where found in app models asset rb by brakeman
| 0
|
138,583
| 5,344,628,530
|
IssuesEvent
|
2017-02-17 15:01:22
|
pytorch/pytorch
|
https://api.github.com/repos/pytorch/pytorch
|
closed
|
Feature Request: torch 'module' object has no attribute '__version__'
|
enhancement high priority
|
could you please add `__version__` to the `torch` module! all the cool modules have it! ;)
|
1.0
|
Feature Request: torch 'module' object has no attribute '__version__' - could you please add `__version__` to the `torch` module! all the cool modules have it! ;)
|
non_defect
|
feature request torch module object has no attribute version could you please add version to the torch module all the cool modules have it
| 0
|
249,305
| 26,910,069,546
|
IssuesEvent
|
2023-02-06 22:39:35
|
OpenLiberty/open-liberty
|
https://api.github.com/repos/OpenLiberty/open-liberty
|
opened
|
403 status code with securityContext.authenticate(request, response, AuthenticationParameters.withParams().newAuthentication(true));
|
bug team:Security SSO jakartaEE10
|
securityContext.authenticate(request, response, AuthenticationParameters.withParams().newAuthentication(true));
Is trying to reuse the context instead of forcing a new login - I'm also getting a status code of 403.
[Uploading securityContext.auth.zip…]()
|
True
|
403 status code with securityContext.authenticate(request, response, AuthenticationParameters.withParams().newAuthentication(true)); - securityContext.authenticate(request, response, AuthenticationParameters.withParams().newAuthentication(true));
Is trying to reuse the context instead of forcing a new login - I'm also getting a status code of 403.
[Uploading securityContext.auth.zip…]()
|
non_defect
|
status code with securitycontext authenticate request response authenticationparameters withparams newauthentication true securitycontext authenticate request response authenticationparameters withparams newauthentication true is trying to reuse the context instead of forcing a new login i m also getting a status code of
| 0
|
67,268
| 20,961,597,600
|
IssuesEvent
|
2022-03-27 21:46:36
|
abedmaatalla/imsdroid
|
https://api.github.com/repos/abedmaatalla/imsdroid
|
closed
|
it isn't possible to start the sip stack without registering to a sip server
|
Priority-Medium Type-Defect auto-migrated
|
```
There is no way to receive/do a p2p call without registering to a sip server
(the stack isn't initialized before this action?)
```
Original issue reported on code.google.com by `sylar1...@gmail.com` on 11 Jan 2011 at 8:52
|
1.0
|
it isn't possible to start the sip stack without registering to a sip server - ```
There is no way to receive/do a p2p call without registering to a sip server
(the stack isn't initialized before this action?)
```
Original issue reported on code.google.com by `sylar1...@gmail.com` on 11 Jan 2011 at 8:52
|
defect
|
it isn t possible to start the sip stack without registering to a sip server there is no way to receive do a call without registering to a sip server the stack isn t initialized before this action original issue reported on code google com by gmail com on jan at
| 1
|
18,012
| 3,016,271,497
|
IssuesEvent
|
2015-07-30 00:54:31
|
googlei18n/noto-fonts
|
https://api.github.com/repos/googlei18n/noto-fonts
|
closed
|
Sans and Serif: Modifier arrowheads are oversized and vertically misplaced
|
Script-LatinGreekCyrillic Status-Unreproducible Type-Defect
|
Moved from googlei18n/noto-alpha#156. Filed by @roozbehp
The modifier arrowheads (U+02C2..02C5), used in phonetics to note place of articulation, are oversized and vertically too low in NotoSans and NotoSerif. They are almost indistinguishable from greater-than and less-than sign.
They should be at a superscript like position, and smaller.
Compare, for example, with SIL fonts Andika, Charis, Doulos, Gentium.
|
1.0
|
Sans and Serif: Modifier arrowheads are oversized and vertically misplaced - Moved from googlei18n/noto-alpha#156. Filed by @roozbehp
The modifier arrowheads (U+02C2..02C5), used in phonetics to note place of articulation, are oversized and vertically too low in NotoSans and NotoSerif. They are almost indistinguishable from greater-than and less-than sign.
They should be at a superscript like position, and smaller.
Compare, for example, with SIL fonts Andika, Charis, Doulos, Gentium.
|
defect
|
sans and serif modifier arrowheads are oversized and vertically misplaced moved from noto alpha filed by roozbehp the modifier arrowheads u used in phonetics to note place of articulation are oversized and vertically too low in notosans and notoserif they are almost indistinguishable from greater than and less than sign they should be at a superscript like position and smaller compare for example with sil fonts andika charis doulos gentium
| 1
|
269,149
| 28,960,015,420
|
IssuesEvent
|
2023-05-10 01:08:24
|
dpteam/RK3188_TABLET
|
https://api.github.com/repos/dpteam/RK3188_TABLET
|
reopened
|
CVE-2019-19768 (High) detected in linuxv3.0
|
Mend: dependency security vulnerability
|
## CVE-2019-19768 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv3.0</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/verygreen/linux.git>https://github.com/verygreen/linux.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/dpteam/RK3188_TABLET/commit/0c501f5a0fd72c7b2ac82904235363bd44fd8f9e">0c501f5a0fd72c7b2ac82904235363bd44fd8f9e</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/kernel/trace/blktrace.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
In the Linux kernel 5.4.0-rc2, there is a use-after-free (read) in the __blk_add_trace function in kernel/trace/blktrace.c (which is used to fill out a blk_io_trace structure and place it in a per-cpu sub-buffer).
<p>Publish Date: 2019-12-12
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-19768>CVE-2019-19768</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2019-19768">https://nvd.nist.gov/vuln/detail/CVE-2019-19768</a></p>
<p>Release Date: 2020-06-10</p>
<p>Fix Resolution: kernel-doc - 3.10.0-514.76.1,3.10.0-957.54.1,4.18.0-147.13.2,3.10.0-1127.8.2,3.10.0-327.88.1,4.18.0-80.18.1,4.18.0-193,3.10.0-1062.26.1,3.10.0-693.67.1;kernel-rt-core - 4.18.0-193.rt13.51;kernel-rt-debug-debuginfo - 4.18.0-193.rt13.51;kernel-abi-whitelists - 3.10.0-327.88.1,3.10.0-1062.26.1,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-514.76.1,3.10.0-1127.8.2,3.10.0-957.54.1,4.18.0-193,3.10.0-693.67.1;kernel-zfcpdump-modules - 4.18.0-193,4.18.0-147.13.2;kernel-rt-trace-devel - 3.10.0-1127.8.2.rt56.1103;kernel-debug-modules-extra - 4.18.0-147.13.2,4.18.0-193,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-193,4.18.0-193,4.18.0-147.13.2;kernel-rt-debug-kvm - 4.18.0-193.rt13.51,3.10.0-1127.8.2.rt56.1103;kernel-bootwrapper - 3.10.0-1062.26.1,3.10.0-1127.8.2,3.10.0-693.67.1,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-1127.8.2,3.10.0-514.76.1,3.10.0-957.54.1;kernel-rt-debuginfo - 4.18.0-193.rt13.51;kernel-rt-debug-modules - 4.18.0-193.rt13.51;kernel-zfcpdump-devel - 4.18.0-193,4.18.0-147.13.2;perf - 3.10.0-514.76.1,3.10.0-957.54.1,3.10.0-1062.26.1,4.18.0-147.13.2,3.10.0-957.54.1,4.18.0-80.18.1,4.18.0-193,4.18.0-193,3.10.0-327.88.1,4.18.0-147.13.2,3.10.0-1062.26.1,3.10.0-1127.8.2,4.18.0-193,4.18.0-80.18.1,4.18.0-193,3.10.0-1127.8.2,4.18.0-147.13.2,3.10.0-1062.26.1,3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-514.76.1,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-957.54.1;kernel-zfcpdump-modules-extra - 4.18.0-193,4.18.0-147.13.2;kernel-debuginfo - 3.10.0-514.76.1,4.18.0-80.18.1,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-1127.8.2,3.10.0-1127.8.2,3.10.0-693.67.1,4.18.0-193,3.10.0-957.54.1,4.18.0-147.13.2,3.10.0-327.88.1,3.10.0-1062.26.1;kernel-debug-devel - 
3.10.0-514.76.1,4.18.0-147.13.2,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-957.54.1,4.18.0-193,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-1127.8.2,4.18.0-147.13.2,3.10.0-327.88.1,4.18.0-193,4.18.0-80.18.1,3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-1062.26.1,4.18.0-193,3.10.0-957.54.1,3.10.0-1127.8.2,3.10.0-514.76.1,4.18.0-147.13.2,4.18.0-193,3.10.0-1062.26.1,4.18.0-80.18.1;bpftool - 3.10.0-1127.8.2,3.10.0-1062.26.1,4.18.0-147.13.2,4.18.0-193,3.10.0-1062.26.1,4.18.0-147.13.2,3.10.0-1062.26.1,3.10.0-1127.8.2,4.18.0-193,4.18.0-147.13.2,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-193,3.10.0-957.54.1,4.18.0-80.18.1,4.18.0-193,3.10.0-1127.8.2;kernel-rt-debug-core - 4.18.0-193.rt13.51;kernel-tools-libs - 3.10.0-1062.26.1,3.10.0-1062.26.1,3.10.0-327.88.1,3.10.0-1127.8.2,4.18.0-193,3.10.0-693.67.1,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-1127.8.2,3.10.0-957.54.1,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2,3.10.0-957.54.1,3.10.0-1062.26.1,4.18.0-193,3.10.0-514.76.1,3.10.0-957.54.1,3.10.0-514.76.1,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-1127.8.2;perf-debuginfo - 3.10.0-957.54.1,3.10.0-957.54.1,3.10.0-1127.8.2,3.10.0-514.76.1,3.10.0-1062.26.1,3.10.0-1062.26.1,4.18.0-193,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-327.88.1;kernel-cross-headers - 4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-193,4.18.0-193,4.18.0-193,4.18.0-147.13.2;kernel-debug-debuginfo - 3.10.0-1127.8.2,3.10.0-1062.26.1,3.10.0-693.67.1,4.18.0-193,3.10.0-514.76.1,3.10.0-327.88.1,3.10.0-957.54.1,3.10.0-1062.26.1,3.10.0-957.54.1,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-1127.8.2;kernel-debug - 3.10.0-514.76.1,3.10.0-327.88.1,4.18.0-193,3.10.0-1127.8.2,3.10.0-693.67.1,3.10.0-957.54.1,4.18.0-193,4.18.0-193,3.10.0-1062.26.1,3.10.0-1062.26.1,4.18.0-80.18.1,3.10.0-957.54.1,4.18.0-147.13.2,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-1127.8.2,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-1127.8.2,3.10.0-514.76.1,3.10.0-1062.26.1,4.18.0-193,3.10.0-957.54.1,4.18.0-147.13.2;kernel-devel 
- 4.18.0-193,3.10.0-957.54.1,4.18.0-147.13.2,3.10.0-1127.8.2,3.10.0-514.76.1,3.10.0-1127.8.2,3.10.0-957.54.1,4.18.0-147.13.2,3.10.0-514.76.1,4.18.0-193,4.18.0-80.18.1,3.10.0-1062.26.1,4.18.0-193,3.10.0-957.54.1,4.18.0-80.18.1,3.10.0-1062.26.1,4.18.0-147.13.2,3.10.0-327.88.1,3.10.0-1062.26.1,4.18.0-193,3.10.0-693.67.1,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-1127.8.2;kernel - 3.10.0-1062.26.1,3.10.0-1062.26.1,3.10.0-693.67.1,3.10.0-327.88.1,3.10.0-327.88.1,4.18.0-147.13.2,4.18.0-147.13.2,3.10.0-957.54.1,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-514.76.1,3.10.0-1127.8.2,3.10.0-693.67.1,4.18.0-193,4.18.0-193,3.10.0-1127.8.2,4.18.0-147.13.2,3.10.0-1062.26.1,4.18.0-80.18.1,3.10.0-957.54.1,3.10.0-957.54.1,3.10.0-514.76.1,4.18.0-147.13.2,4.18.0-193,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-1127.8.2,4.18.0-193,3.10.0-514.76.1,3.10.0-693.67.1,4.18.0-193,3.10.0-1127.8.2;bpftool-debuginfo - 4.18.0-193,4.18.0-147.13.2,3.10.0-1127.8.2,3.10.0-1062.26.1,4.18.0-80.18.1;kpatch-patch-3_10_0-1062_12_1 - 1-2,1-2;kernel-zfcpdump-core - 4.18.0-147.13.2,4.18.0-193;kernel-debug-core - 4.18.0-80.18.1,4.18.0-193,4.18.0-147.13.2,4.18.0-147.13.2,4.18.0-193,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-193;kernel-modules-extra - 4.18.0-147.13.2,4.18.0-193,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-193,4.18.0-147.13.2;kernel-rt-debug-devel - 3.10.0-1127.8.2.rt56.1103,4.18.0-193.rt13.51;python-perf - 3.10.0-514.76.1,3.10.0-1127.8.2,3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-957.54.1,3.10.0-327.88.1,3.10.0-1062.26.1,3.10.0-693.67.1,3.10.0-514.76.1,3.10.0-1127.8.2,3.10.0-1062.26.1,3.10.0-957.54.1;kernel-core - 4.18.0-147.13.2,4.18.0-193,4.18.0-147.13.2,4.18.0-193,4.18.0-193,4.18.0-147.13.2,4.18.0-80.18.1,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2;kernel-rt-debug - 3.10.0-1127.8.2.rt56.1103,4.18.0-193.rt13.51;kernel-rt-devel - 
3.10.0-1127.8.2.rt56.1103,4.18.0-193.rt13.51;kernel-debuginfo-common-ppc64 - 3.10.0-957.54.1,3.10.0-1127.8.2,3.10.0-1062.26.1;python3-perf - 4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-193,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-193,4.18.0-147.13.2,4.18.0-147.13.2;kernel-tools - 3.10.0-514.76.1,3.10.0-957.54.1,3.10.0-957.54.1,4.18.0-193,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-1127.8.2,3.10.0-514.76.1,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,3.10.0-1062.26.1,3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-1062.26.1,3.10.0-327.88.1,3.10.0-1062.26.1,4.18.0-193,3.10.0-957.54.1,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-1127.8.2;kernel-debug-modules - 4.18.0-147.13.2,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-193,4.18.0-193,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2;kernel-rt-trace-kvm - 3.10.0-1127.8.2.rt56.1103;kernel-rt-debuginfo-common-x86_64 - 4.18.0-193.rt13.51;kernel-tools-libs-devel - 3.10.0-514.76.1,3.10.0-327.88.1,3.10.0-693.67.1,3.10.0-1062.26.1,3.10.0-1062.26.1,3.10.0-1127.8.2,3.10.0-957.54.1,3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-1127.8.2,3.10.0-514.76.1,3.10.0-957.54.1,3.10.0-1062.26.1,3.10.0-957.54.1;kernel-modules - 4.18.0-147.13.2,4.18.0-80.18.1,4.18.0-193,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-193,4.18.0-147.13.2,4.18.0-147.13.2,4.18.0-193;kernel-tools-debuginfo - 3.10.0-1062.26.1,4.18.0-193,3.10.0-1127.8.2,4.18.0-80.18.1,3.10.0-327.88.1,4.18.0-147.13.2,3.10.0-1127.8.2,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-957.54.1,3.10.0-514.76.1,3.10.0-693.67.1;kernel-rt-modules - 4.18.0-193.rt13.51;kernel-rt-doc - 3.10.0-1127.8.2.rt56.1103;kernel-rt-kvm - 4.18.0-193.rt13.51,3.10.0-1127.8.2.rt56.1103;python-perf-debuginfo - 3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-957.54.1,3.10.0-1127.8.2,3.10.0-327.88.1,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-514.76.1,3.10.0-1062.26.1;kernel-headers - 
3.10.0-1062.26.1,4.18.0-147.13.2,3.10.0-957.54.1,3.10.0-514.76.1,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2,3.10.0-327.88.1,3.10.0-1127.8.2,4.18.0-147.13.2,4.18.0-193,3.10.0-1062.26.1,3.10.0-693.67.1,4.18.0-193,3.10.0-1127.8.2,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-957.54.1,3.10.0-514.76.1,3.10.0-1062.26.1,4.18.0-80.18.1,3.10.0-957.54.1,4.18.0-193,3.10.0-1127.8.2;kernel-rt-trace - 3.10.0-1127.8.2.rt56.1103;kernel-debuginfo-common-x86_64 - 3.10.0-1127.8.2,3.10.0-693.67.1,3.10.0-327.88.1,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-1062.26.1,3.10.0-514.76.1,4.18.0-193,3.10.0-957.54.1;kernel-rt - 3.10.0-1127.8.2.rt56.1103,4.18.0-193.rt13.51,3.10.0-1127.8.2.rt56.1103,4.18.0-193.rt13.51;kernel-zfcpdump - 4.18.0-147.13.2,4.18.0-193;kernel-rt-debug-modules-extra - 4.18.0-193.rt13.51;python3-perf-debuginfo - 4.18.0-147.13.2,4.18.0-80.18.1,4.18.0-193;kernel-rt-modules-extra - 4.18.0-193.rt13.51</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-19768 (High) detected in linuxv3.0 - ## CVE-2019-19768 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv3.0</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/verygreen/linux.git>https://github.com/verygreen/linux.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/dpteam/RK3188_TABLET/commit/0c501f5a0fd72c7b2ac82904235363bd44fd8f9e">0c501f5a0fd72c7b2ac82904235363bd44fd8f9e</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/kernel/trace/blktrace.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
In the Linux kernel 5.4.0-rc2, there is a use-after-free (read) in the __blk_add_trace function in kernel/trace/blktrace.c (which is used to fill out a blk_io_trace structure and place it in a per-cpu sub-buffer).
<p>Publish Date: 2019-12-12
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-19768>CVE-2019-19768</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2019-19768">https://nvd.nist.gov/vuln/detail/CVE-2019-19768</a></p>
<p>Release Date: 2020-06-10</p>
<p>Fix Resolution: kernel-doc - 3.10.0-514.76.1,3.10.0-957.54.1,4.18.0-147.13.2,3.10.0-1127.8.2,3.10.0-327.88.1,4.18.0-80.18.1,4.18.0-193,3.10.0-1062.26.1,3.10.0-693.67.1;kernel-rt-core - 4.18.0-193.rt13.51;kernel-rt-debug-debuginfo - 4.18.0-193.rt13.51;kernel-abi-whitelists - 3.10.0-327.88.1,3.10.0-1062.26.1,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-514.76.1,3.10.0-1127.8.2,3.10.0-957.54.1,4.18.0-193,3.10.0-693.67.1;kernel-zfcpdump-modules - 4.18.0-193,4.18.0-147.13.2;kernel-rt-trace-devel - 3.10.0-1127.8.2.rt56.1103;kernel-debug-modules-extra - 4.18.0-147.13.2,4.18.0-193,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-193,4.18.0-193,4.18.0-147.13.2;kernel-rt-debug-kvm - 4.18.0-193.rt13.51,3.10.0-1127.8.2.rt56.1103;kernel-bootwrapper - 3.10.0-1062.26.1,3.10.0-1127.8.2,3.10.0-693.67.1,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-1127.8.2,3.10.0-514.76.1,3.10.0-957.54.1;kernel-rt-debuginfo - 4.18.0-193.rt13.51;kernel-rt-debug-modules - 4.18.0-193.rt13.51;kernel-zfcpdump-devel - 4.18.0-193,4.18.0-147.13.2;perf - 3.10.0-514.76.1,3.10.0-957.54.1,3.10.0-1062.26.1,4.18.0-147.13.2,3.10.0-957.54.1,4.18.0-80.18.1,4.18.0-193,4.18.0-193,3.10.0-327.88.1,4.18.0-147.13.2,3.10.0-1062.26.1,3.10.0-1127.8.2,4.18.0-193,4.18.0-80.18.1,4.18.0-193,3.10.0-1127.8.2,4.18.0-147.13.2,3.10.0-1062.26.1,3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-514.76.1,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-957.54.1;kernel-zfcpdump-modules-extra - 4.18.0-193,4.18.0-147.13.2;kernel-debuginfo - 3.10.0-514.76.1,4.18.0-80.18.1,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-1127.8.2,3.10.0-1127.8.2,3.10.0-693.67.1,4.18.0-193,3.10.0-957.54.1,4.18.0-147.13.2,3.10.0-327.88.1,3.10.0-1062.26.1;kernel-debug-devel - 
3.10.0-514.76.1,4.18.0-147.13.2,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-957.54.1,4.18.0-193,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-1127.8.2,4.18.0-147.13.2,3.10.0-327.88.1,4.18.0-193,4.18.0-80.18.1,3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-1062.26.1,4.18.0-193,3.10.0-957.54.1,3.10.0-1127.8.2,3.10.0-514.76.1,4.18.0-147.13.2,4.18.0-193,3.10.0-1062.26.1,4.18.0-80.18.1;bpftool - 3.10.0-1127.8.2,3.10.0-1062.26.1,4.18.0-147.13.2,4.18.0-193,3.10.0-1062.26.1,4.18.0-147.13.2,3.10.0-1062.26.1,3.10.0-1127.8.2,4.18.0-193,4.18.0-147.13.2,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-193,3.10.0-957.54.1,4.18.0-80.18.1,4.18.0-193,3.10.0-1127.8.2;kernel-rt-debug-core - 4.18.0-193.rt13.51;kernel-tools-libs - 3.10.0-1062.26.1,3.10.0-1062.26.1,3.10.0-327.88.1,3.10.0-1127.8.2,4.18.0-193,3.10.0-693.67.1,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-1127.8.2,3.10.0-957.54.1,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2,3.10.0-957.54.1,3.10.0-1062.26.1,4.18.0-193,3.10.0-514.76.1,3.10.0-957.54.1,3.10.0-514.76.1,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-1127.8.2;perf-debuginfo - 3.10.0-957.54.1,3.10.0-957.54.1,3.10.0-1127.8.2,3.10.0-514.76.1,3.10.0-1062.26.1,3.10.0-1062.26.1,4.18.0-193,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-327.88.1;kernel-cross-headers - 4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-193,4.18.0-193,4.18.0-193,4.18.0-147.13.2;kernel-debug-debuginfo - 3.10.0-1127.8.2,3.10.0-1062.26.1,3.10.0-693.67.1,4.18.0-193,3.10.0-514.76.1,3.10.0-327.88.1,3.10.0-957.54.1,3.10.0-1062.26.1,3.10.0-957.54.1,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-1127.8.2;kernel-debug - 3.10.0-514.76.1,3.10.0-327.88.1,4.18.0-193,3.10.0-1127.8.2,3.10.0-693.67.1,3.10.0-957.54.1,4.18.0-193,4.18.0-193,3.10.0-1062.26.1,3.10.0-1062.26.1,4.18.0-80.18.1,3.10.0-957.54.1,4.18.0-147.13.2,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-1127.8.2,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-1127.8.2,3.10.0-514.76.1,3.10.0-1062.26.1,4.18.0-193,3.10.0-957.54.1,4.18.0-147.13.2;kernel-devel 
- 4.18.0-193,3.10.0-957.54.1,4.18.0-147.13.2,3.10.0-1127.8.2,3.10.0-514.76.1,3.10.0-1127.8.2,3.10.0-957.54.1,4.18.0-147.13.2,3.10.0-514.76.1,4.18.0-193,4.18.0-80.18.1,3.10.0-1062.26.1,4.18.0-193,3.10.0-957.54.1,4.18.0-80.18.1,3.10.0-1062.26.1,4.18.0-147.13.2,3.10.0-327.88.1,3.10.0-1062.26.1,4.18.0-193,3.10.0-693.67.1,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-1127.8.2;kernel - 3.10.0-1062.26.1,3.10.0-1062.26.1,3.10.0-693.67.1,3.10.0-327.88.1,3.10.0-327.88.1,4.18.0-147.13.2,4.18.0-147.13.2,3.10.0-957.54.1,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-514.76.1,3.10.0-1127.8.2,3.10.0-693.67.1,4.18.0-193,4.18.0-193,3.10.0-1127.8.2,4.18.0-147.13.2,3.10.0-1062.26.1,4.18.0-80.18.1,3.10.0-957.54.1,3.10.0-957.54.1,3.10.0-514.76.1,4.18.0-147.13.2,4.18.0-193,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-1127.8.2,4.18.0-193,3.10.0-514.76.1,3.10.0-693.67.1,4.18.0-193,3.10.0-1127.8.2;bpftool-debuginfo - 4.18.0-193,4.18.0-147.13.2,3.10.0-1127.8.2,3.10.0-1062.26.1,4.18.0-80.18.1;kpatch-patch-3_10_0-1062_12_1 - 1-2,1-2;kernel-zfcpdump-core - 4.18.0-147.13.2,4.18.0-193;kernel-debug-core - 4.18.0-80.18.1,4.18.0-193,4.18.0-147.13.2,4.18.0-147.13.2,4.18.0-193,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-193;kernel-modules-extra - 4.18.0-147.13.2,4.18.0-193,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-193,4.18.0-147.13.2;kernel-rt-debug-devel - 3.10.0-1127.8.2.rt56.1103,4.18.0-193.rt13.51;python-perf - 3.10.0-514.76.1,3.10.0-1127.8.2,3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-957.54.1,3.10.0-327.88.1,3.10.0-1062.26.1,3.10.0-693.67.1,3.10.0-514.76.1,3.10.0-1127.8.2,3.10.0-1062.26.1,3.10.0-957.54.1;kernel-core - 4.18.0-147.13.2,4.18.0-193,4.18.0-147.13.2,4.18.0-193,4.18.0-193,4.18.0-147.13.2,4.18.0-80.18.1,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2;kernel-rt-debug - 3.10.0-1127.8.2.rt56.1103,4.18.0-193.rt13.51;kernel-rt-devel - 
3.10.0-1127.8.2.rt56.1103,4.18.0-193.rt13.51;kernel-debuginfo-common-ppc64 - 3.10.0-957.54.1,3.10.0-1127.8.2,3.10.0-1062.26.1;python3-perf - 4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2,4.18.0-193,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-193,4.18.0-147.13.2,4.18.0-147.13.2;kernel-tools - 3.10.0-514.76.1,3.10.0-957.54.1,3.10.0-957.54.1,4.18.0-193,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-1127.8.2,3.10.0-514.76.1,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,3.10.0-1062.26.1,3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-1062.26.1,3.10.0-327.88.1,3.10.0-1062.26.1,4.18.0-193,3.10.0-957.54.1,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-1127.8.2;kernel-debug-modules - 4.18.0-147.13.2,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-193,4.18.0-193,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2;kernel-rt-trace-kvm - 3.10.0-1127.8.2.rt56.1103;kernel-rt-debuginfo-common-x86_64 - 4.18.0-193.rt13.51;kernel-tools-libs-devel - 3.10.0-514.76.1,3.10.0-327.88.1,3.10.0-693.67.1,3.10.0-1062.26.1,3.10.0-1062.26.1,3.10.0-1127.8.2,3.10.0-957.54.1,3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-1127.8.2,3.10.0-514.76.1,3.10.0-957.54.1,3.10.0-1062.26.1,3.10.0-957.54.1;kernel-modules - 4.18.0-147.13.2,4.18.0-80.18.1,4.18.0-193,4.18.0-147.13.2,4.18.0-193,4.18.0-80.18.1,4.18.0-193,4.18.0-147.13.2,4.18.0-147.13.2,4.18.0-193;kernel-tools-debuginfo - 3.10.0-1062.26.1,4.18.0-193,3.10.0-1127.8.2,4.18.0-80.18.1,3.10.0-327.88.1,4.18.0-147.13.2,3.10.0-1127.8.2,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-957.54.1,3.10.0-514.76.1,3.10.0-693.67.1;kernel-rt-modules - 4.18.0-193.rt13.51;kernel-rt-doc - 3.10.0-1127.8.2.rt56.1103;kernel-rt-kvm - 4.18.0-193.rt13.51,3.10.0-1127.8.2.rt56.1103;python-perf-debuginfo - 3.10.0-693.67.1,3.10.0-1127.8.2,3.10.0-957.54.1,3.10.0-1127.8.2,3.10.0-327.88.1,3.10.0-1062.26.1,3.10.0-957.54.1,3.10.0-514.76.1,3.10.0-1062.26.1;kernel-headers - 
3.10.0-1062.26.1,4.18.0-147.13.2,3.10.0-957.54.1,3.10.0-514.76.1,4.18.0-193,4.18.0-80.18.1,4.18.0-147.13.2,3.10.0-327.88.1,3.10.0-1127.8.2,4.18.0-147.13.2,4.18.0-193,3.10.0-1062.26.1,3.10.0-693.67.1,4.18.0-193,3.10.0-1127.8.2,3.10.0-693.67.1,4.18.0-147.13.2,3.10.0-957.54.1,3.10.0-514.76.1,3.10.0-1062.26.1,4.18.0-80.18.1,3.10.0-957.54.1,4.18.0-193,3.10.0-1127.8.2;kernel-rt-trace - 3.10.0-1127.8.2.rt56.1103;kernel-debuginfo-common-x86_64 - 3.10.0-1127.8.2,3.10.0-693.67.1,3.10.0-327.88.1,4.18.0-147.13.2,4.18.0-80.18.1,3.10.0-1062.26.1,3.10.0-514.76.1,4.18.0-193,3.10.0-957.54.1;kernel-rt - 3.10.0-1127.8.2.rt56.1103,4.18.0-193.rt13.51,3.10.0-1127.8.2.rt56.1103,4.18.0-193.rt13.51;kernel-zfcpdump - 4.18.0-147.13.2,4.18.0-193;kernel-rt-debug-modules-extra - 4.18.0-193.rt13.51;python3-perf-debuginfo - 4.18.0-147.13.2,4.18.0-80.18.1,4.18.0-193;kernel-rt-modules-extra - 4.18.0-193.rt13.51</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_defect
|
cve high detected in cve high severity vulnerability vulnerable library linux kernel source tree library home page a href found in head commit a href found in base branch master vulnerable source files kernel trace blktrace c vulnerability details in the linux kernel there is a use after free read in the blk add trace function in kernel trace blktrace c which is used to fill out a blk io trace structure and place it in a per cpu sub buffer publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution kernel doc kernel rt core kernel rt debug debuginfo kernel abi whitelists kernel zfcpdump modules kernel rt trace devel kernel debug modules extra kernel rt debug kvm kernel bootwrapper kernel rt debuginfo kernel rt debug modules kernel zfcpdump devel perf kernel zfcpdump modules extra kernel debuginfo kernel debug devel bpftool kernel rt debug core kernel tools libs perf debuginfo kernel cross headers kernel debug debuginfo kernel debug kernel devel kernel bpftool debuginfo kpatch patch kernel zfcpdump core kernel debug core kernel modules extra kernel rt debug devel python perf kernel core kernel rt debug kernel rt devel kernel debuginfo common perf kernel tools kernel debug modules kernel rt trace kvm kernel rt debuginfo common kernel tools libs devel kernel modules kernel tools debuginfo kernel rt modules kernel rt doc kernel rt kvm python perf debuginfo kernel headers kernel rt trace kernel debuginfo common kernel rt kernel zfcpdump kernel rt debug modules extra perf debuginfo kernel rt modules extra step up your open source security game with mend
| 0
|
47,789
| 13,066,230,044
|
IssuesEvent
|
2020-07-30 21:15:39
|
icecube-trac/tix2
|
https://api.github.com/repos/icecube-trac/tix2
|
closed
|
Improved line fit resources directory cleanup (Trac #1181)
|
Migrated from Trac combo reconstruction defect
|
There are no links to improved linefit documentation in the meta-project documentation. there is a decent index.rst, but it is in resources/ not resources/docs/ it needs the name the maintainer and have links to the CHANGELOG and the doxygen documentation.
In addition the example script example.py needs to be updated, it tries to load something called libNFE. and moved to resources/examples directory
Migrated from https://code.icecube.wisc.edu/ticket/1181
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"description": "There are no links to improved linefit documentation in the meta-project documentation. there is a decent index.rst, but it is in resources/ not resources/docs/ it needs the name the maintainer and have links to the CHANGELOG and the doxygen documentation.\n\nIn addition the example script example.py needs to be updated, it tries to load something called libNFE. and moved to resources/examples directory",
"reporter": "kjmeagher",
"cc": "",
"resolution": "wontfix",
"_ts": "1550067117911749",
"component": "combo reconstruction",
"summary": "Improved line fit resources directory cleanup",
"priority": "blocker",
"keywords": "",
"time": "2015-08-19T11:40:01",
"milestone": "",
"owner": "gmaggi",
"type": "defect"
}
```
|
1.0
|
Improved line fit resources directory cleanup (Trac #1181) - There are no links to improved linefit documentation in the meta-project documentation. there is a decent index.rst, but it is in resources/ not resources/docs/ it needs the name the maintainer and have links to the CHANGELOG and the doxygen documentation.
In addition the example script example.py needs to be updated, it tries to load something called libNFE. and moved to resources/examples directory
Migrated from https://code.icecube.wisc.edu/ticket/1181
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"description": "There are no links to improved linefit documentation in the meta-project documentation. there is a decent index.rst, but it is in resources/ not resources/docs/ it needs the name the maintainer and have links to the CHANGELOG and the doxygen documentation.\n\nIn addition the example script example.py needs to be updated, it tries to load something called libNFE. and moved to resources/examples directory",
"reporter": "kjmeagher",
"cc": "",
"resolution": "wontfix",
"_ts": "1550067117911749",
"component": "combo reconstruction",
"summary": "Improved line fit resources directory cleanup",
"priority": "blocker",
"keywords": "",
"time": "2015-08-19T11:40:01",
"milestone": "",
"owner": "gmaggi",
"type": "defect"
}
```
|
defect
|
improved line fit resources directory cleanup trac there are no links to improved linefit documentation in the meta project documentation there is a decent index rst but it is in resources not resources docs it needs the name the maintainer and have links to the changelog and the doxygen documentation in addition the example script example py needs to be updated it tries to load something called libnfe and moved to resources examples directory migrated from json status closed changetime description there are no links to improved linefit documentation in the meta project documentation there is a decent index rst but it is in resources not resources docs it needs the name the maintainer and have links to the changelog and the doxygen documentation n nin addition the example script example py needs to be updated it tries to load something called libnfe and moved to resources examples directory reporter kjmeagher cc resolution wontfix ts component combo reconstruction summary improved line fit resources directory cleanup priority blocker keywords time milestone owner gmaggi type defect
| 1
|
126,819
| 12,300,114,095
|
IssuesEvent
|
2020-05-11 13:30:39
|
vigge93/PA1450-Development-task
|
https://api.github.com/repos/vigge93/PA1450-Development-task
|
closed
|
Create frontend specification
|
Frontend documentation
|
A frontend specification is needed to allow for multiple people working on the frontend at different times to still understand the structure of the program
|
1.0
|
Create frontend specification - A frontend specification is needed to allow for multiple people working on the frontend at different times to still understand the structure of the program
|
non_defect
|
create frontend specification a frontend specification is needed to allow for multiple people working on the frontend at different times to still understand the structure of the program
| 0
|
20,402
| 3,352,902,842
|
IssuesEvent
|
2015-11-18 01:21:08
|
dart-lang/sdk
|
https://api.github.com/repos/dart-lang/sdk
|
closed
|
execution.mapUri doesn't on Windows for dart:* uris
|
analyzer-server area-analyzer Priority-Medium Type-Defect
|
From Alex: Breakpoints in SDK files do not work for me because execution_mapUri(1, "C:\dart\dart-1.14-fake\lib\core\print.dart", null) returns file:///C:/dart/dart-1.14-fake/lib/core/print.dart instead of dart:core/print.dart. Server bug?
Source:https://github.com/JetBrains/intellij-plugins/pull/298#issuecomment-155445301
|
1.0
|
execution.mapUri doesn't on Windows for dart:* uris - From Alex: Breakpoints in SDK files do not work for me because execution_mapUri(1, "C:\dart\dart-1.14-fake\lib\core\print.dart", null) returns file:///C:/dart/dart-1.14-fake/lib/core/print.dart instead of dart:core/print.dart. Server bug?
Source:https://github.com/JetBrains/intellij-plugins/pull/298#issuecomment-155445301
|
defect
|
execution mapuri doesn t on windows for dart uris from alex breakpoints in sdk files do not work for me because execution mapuri c dart dart fake lib core print dart null returns file c dart dart fake lib core print dart instead of dart core print dart server bug source
| 1
|
58,385
| 16,519,059,154
|
IssuesEvent
|
2021-05-26 12:49:00
|
scipy/scipy
|
https://api.github.com/repos/scipy/scipy
|
closed
|
Minimize with trust-constr method leads to TypeError if option verbose is set to 2 or 3
|
defect scipy.optimize
|
When I changed the minimize method to "trust-constr" I got the TypeError. TypeError does not appear if verbose is set to 0 or 1.
#### Reproducing code example:
```python
cons = ({'type':'ineq','fun': lambda params: params[31]},\
{'type':'ineq','fun': lambda params: params[-1]})
opts = {'maxiter': 1000, 'verbose': 3, 'factorization_method': 'SVDFactorization'}
result = scipy.optimize.minimize(func_value,params,args=(func_args), method='trust-constr', \
options=opts,constraints=cons)
```
#### Error message:
```python
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_minimize.py", line 628, in minimize return _minimize_trustregion_constr(fun, x0, args, jac, hess, hessp,
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\minimize_trustregion_constr.py", line 509, in _minimize_trustregion_constr
_, result = tr_interior_point(
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\tr_interior_point.py", line 321, in tr_interior_point
z, state = equality_constrained_sqp(
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\equality_constrained_sqp.py", line 93, in equality_constrained_sqp
while not stop_criteria(state, x, last_iteration_failed,
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\tr_interior_point.py", line 251, in stop_criteria
if self.global_stop_criteria(state, x,
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\minimize_trustregion_constr.py", line 449, in stop_criteria
BasicReport.print_iteration(state.nit,
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\report.py", line 28, in print_iteration
print(fmt.format(*args))
TypeError: unsupported format string passed to numpy.ndarray.__format__
```
#### Scipy/Numpy/Python version information:
```
1.5.0 1.19.1 sys.version_info(major=3, minor=8, micro=3, releaselevel='final', serial=0)
```
|
1.0
|
Minimize with trust-constr method leads to TypeError if option verbose is set to 2 or 3 - When I changed the minimize method to "trust-constr" I got the TypeError. TypeError does not appear if verbose is set to 0 or 1.
#### Reproducing code example:
```python
cons = ({'type':'ineq','fun': lambda params: params[31]},\
{'type':'ineq','fun': lambda params: params[-1]})
opts = {'maxiter': 1000, 'verbose': 3, 'factorization_method': 'SVDFactorization'}
result = scipy.optimize.minimize(func_value,params,args=(func_args), method='trust-constr', \
options=opts,constraints=cons)
```
#### Error message:
```python
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_minimize.py", line 628, in minimize return _minimize_trustregion_constr(fun, x0, args, jac, hess, hessp,
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\minimize_trustregion_constr.py", line 509, in _minimize_trustregion_constr
_, result = tr_interior_point(
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\tr_interior_point.py", line 321, in tr_interior_point
z, state = equality_constrained_sqp(
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\equality_constrained_sqp.py", line 93, in equality_constrained_sqp
while not stop_criteria(state, x, last_iteration_failed,
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\tr_interior_point.py", line 251, in stop_criteria
if self.global_stop_criteria(state, x,
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\minimize_trustregion_constr.py", line 449, in stop_criteria
BasicReport.print_iteration(state.nit,
File "C:\ProgramData\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\report.py", line 28, in print_iteration
print(fmt.format(*args))
TypeError: unsupported format string passed to numpy.ndarray.__format__
```
#### Scipy/Numpy/Python version information:
```
1.5.0 1.19.1 sys.version_info(major=3, minor=8, micro=3, releaselevel='final', serial=0)
```
|
defect
|
minimize with trust constr method leads to typeerror if option verbose is set to or when i changed the minimize method to trust constr i got the typeerror typeerror does not appear if verbose is set to or reproducing code example python cons type ineq fun lambda params params type ineq fun lambda params params opts maxiter verbose factorization method svdfactorization result scipy optimize minimize func value params args func args method trust constr options opts constraints cons error message python file c programdata lib site packages scipy optimize minimize py line in minimize return minimize trustregion constr fun args jac hess hessp file c programdata lib site packages scipy optimize trustregion constr minimize trustregion constr py line in minimize trustregion constr result tr interior point file c programdata lib site packages scipy optimize trustregion constr tr interior point py line in tr interior point z state equality constrained sqp file c programdata lib site packages scipy optimize trustregion constr equality constrained sqp py line in equality constrained sqp while not stop criteria state x last iteration failed file c programdata lib site packages scipy optimize trustregion constr tr interior point py line in stop criteria if self global stop criteria state x file c programdata lib site packages scipy optimize trustregion constr minimize trustregion constr py line in stop criteria basicreport print iteration state nit file c programdata lib site packages scipy optimize trustregion constr report py line in print iteration print fmt format args typeerror unsupported format string passed to numpy ndarray format scipy numpy python version information sys version info major minor micro releaselevel final serial
| 1
|
74,882
| 25,382,075,834
|
IssuesEvent
|
2022-11-21 18:24:42
|
networkx/networkx
|
https://api.github.com/repos/networkx/networkx
|
closed
|
Potential bug in connected_double_edge_swap
|
Defect
|
The following test case failed during a `pytest-randomly` workflow:
https://github.com/networkx/networkx/blob/895963729231fe02153afe92ecc946a400247f1d/networkx/algorithms/tests/test_swap.py#L37-L39
I was able to reproduce locally on `main` by running interactively. When running without a `seed`, I occasionally get `1` instead of `0`. This is also reproducible when seeded:
```python
>>> G = nx.path_graph(4)
>>> nx.connected_double_edge_swap(G, seed=140)
1
```
I tested with both 2.8.3 and 2.7 and was able to reproduce with those versions as well, so I don't think this is something recently introduced (though I haven't looked at blame or bisected).
|
1.0
|
Potential bug in connected_double_edge_swap - The following test case failed during a `pytest-randomly` workflow:
https://github.com/networkx/networkx/blob/895963729231fe02153afe92ecc946a400247f1d/networkx/algorithms/tests/test_swap.py#L37-L39
I was able to reproduce locally on `main` by running interactively. When running without a `seed`, I occasionally get `1` instead of `0`. This is also reproducible when seeded:
```python
>>> G = nx.path_graph(4)
>>> nx.connected_double_edge_swap(G, seed=140)
1
```
I tested with both 2.8.3 and 2.7 and was able to reproduce with those versions as well, so I don't think this is something recently introduced (though I haven't looked at blame or bisected).
|
defect
|
potential bug in connected double edge swap the following test case failed during a pytest randomly workflow i was able to reproduce locally on main by running interactively when running without a seed i occasionally get instead of this is also reproducible when seeded python g nx path graph nx connected double edge swap g seed i tested with both and and was able to reproduce with those versions as well so i don t think this is something recently introduced though i haven t looked at blame or bisected
| 1
|
288,382
| 24,902,009,152
|
IssuesEvent
|
2022-10-28 22:11:13
|
Moonshine-IDE/Moonshine-IDE
|
https://api.github.com/repos/Moonshine-IDE/Moonshine-IDE
|
closed
|
TypeError in application menus
|
bug test ready haxe
|
```
: Error #1009
: TypeError: Error #1009
: at actionScripts.plugin.recentlyOpened::RecentlyOpenedPlugin/openRecentItem()
: at actionScripts.plugin.recentlyOpened::RecentlyOpenedPlugin/onOpenRecentProject()
: at flash.events::EventDispatcher/dispatchEvent()
: at components.views.splashscreen::SplashScreen/handleRecentClick()
: at flash.events::EventDispatcher/dispatchEvent()
: at mx.core::UIComponent/dispatchEvent()
: at components.renderers::RecentProjectRenderer/handleClick()
```
This is related to Haxe conversion. As opposed to ActionScript, Haxe is strictly typed, so if an object that a specific menuitem is based on is not of the required/expected type, these errors occur. I have to recheck/redo relevant parts in the code.
|
1.0
|
TypeError in application menus - ```
: Error #1009
: TypeError: Error #1009
: at actionScripts.plugin.recentlyOpened::RecentlyOpenedPlugin/openRecentItem()
: at actionScripts.plugin.recentlyOpened::RecentlyOpenedPlugin/onOpenRecentProject()
: at flash.events::EventDispatcher/dispatchEvent()
: at components.views.splashscreen::SplashScreen/handleRecentClick()
: at flash.events::EventDispatcher/dispatchEvent()
: at mx.core::UIComponent/dispatchEvent()
: at components.renderers::RecentProjectRenderer/handleClick()
```
This is related to Haxe conversion. As opposed to ActionScript, Haxe is strictly typed, so if an object that a specific menuitem is based on is not of the required/expected type, these errors occur. I have to recheck/redo relevant parts in the code.
|
non_defect
|
typeerror in application menus error typeerror error at actionscripts plugin recentlyopened recentlyopenedplugin openrecentitem at actionscripts plugin recentlyopened recentlyopenedplugin onopenrecentproject at flash events eventdispatcher dispatchevent at components views splashscreen splashscreen handlerecentclick at flash events eventdispatcher dispatchevent at mx core uicomponent dispatchevent at components renderers recentprojectrenderer handleclick this is related to haxe conversion as opposed to actionscript haxe is strictly typed so if an object that a specific menuitem is based on is not of the required expected type these errors occur i have to recheck redo relevant parts in the code
| 0
|
50,302
| 13,187,432,642
|
IssuesEvent
|
2020-08-13 03:23:47
|
icecube-trac/tix3
|
https://api.github.com/repos/icecube-trac/tix3
|
closed
|
steamshovel.qrc and symlinks (Trac #489)
|
Migrated from Trac combo core defect
|
From Martin Wolf:
I just encountered an unusual compilation error when compiling streamshovel in offline-software trunk when using a symlink for the steamshovel directory in the meta-project's source direcory:
make[2]: *** No rule to make target `/home/mwolf/software/meta-projects/offline-software/trunk/src/steamshovel/resources/../../phys-services/resources/doms.txt', needed by `steamshovel/qrc_steamshovel.cxx'. Stop.
make[1]: *** [steamshovel/CMakeFiles/steamshovel.dir/all] Error 2
The problem is the relative path specification in steamshovel/resources/steamshovel.qrc line 9:
<file alias="doms.txt">../../phys-services/resources/doms.txt</file>
It would be great if an absolute path (generated by cmake) could be used instead.
Using symlinks for i3projects in meta-project's source directory is quite handy, when working with different versions.
<details>
<summary>_Migrated from https://code.icecube.wisc.edu/ticket/489
, reported by david.schultz and owned by hdembinski_</summary>
<p>
```json
{
"status": "closed",
"changetime": "2015-07-22T00:49:45",
"description": "From Martin Wolf:\n\nI just encountered an unusual compilation error when compiling streamshovel in offline-software trunk when using a symlink for the steamshovel directory in the meta-project's source direcory:\n\nmake[2]: *** No rule to make target `/home/mwolf/software/meta-projects/offline-software/trunk/src/steamshovel/resources/../../phys-services/resources/doms.txt', needed by `steamshovel/qrc_steamshovel.cxx'. Stop.\nmake[1]: *** [steamshovel/CMakeFiles/steamshovel.dir/all] Error 2\n\nThe problem is the relative path specification in steamshovel/resources/steamshovel.qrc line 9:\n\n<file alias=\"doms.txt\">../../phys-services/resources/doms.txt</file>\n\nIt would be great if an absolute path (generated by cmake) could be used instead.\n\nUsing symlinks for i3projects in meta-project's source directory is quite handy, when working with different versions. ",
"reporter": "david.schultz",
"cc": "david.schultz@icecube.wisc.edu",
"resolution": "fixed",
"_ts": "1437526185451006",
"component": "combo core",
"summary": "steamshovel.qrc and symlinks",
"priority": "normal",
"keywords": "",
"time": "2014-03-21T18:54:41",
"milestone": "",
"owner": "hdembinski",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
steamshovel.qrc and symlinks (Trac #489) - From Martin Wolf:
I just encountered an unusual compilation error when compiling streamshovel in offline-software trunk when using a symlink for the steamshovel directory in the meta-project's source direcory:
make[2]: *** No rule to make target `/home/mwolf/software/meta-projects/offline-software/trunk/src/steamshovel/resources/../../phys-services/resources/doms.txt', needed by `steamshovel/qrc_steamshovel.cxx'. Stop.
make[1]: *** [steamshovel/CMakeFiles/steamshovel.dir/all] Error 2
The problem is the relative path specification in steamshovel/resources/steamshovel.qrc line 9:
<file alias="doms.txt">../../phys-services/resources/doms.txt</file>
It would be great if an absolute path (generated by cmake) could be used instead.
Using symlinks for i3projects in meta-project's source directory is quite handy, when working with different versions.
<details>
<summary>_Migrated from https://code.icecube.wisc.edu/ticket/489
, reported by david.schultz and owned by hdembinski_</summary>
<p>
```json
{
"status": "closed",
"changetime": "2015-07-22T00:49:45",
"description": "From Martin Wolf:\n\nI just encountered an unusual compilation error when compiling streamshovel in offline-software trunk when using a symlink for the steamshovel directory in the meta-project's source direcory:\n\nmake[2]: *** No rule to make target `/home/mwolf/software/meta-projects/offline-software/trunk/src/steamshovel/resources/../../phys-services/resources/doms.txt', needed by `steamshovel/qrc_steamshovel.cxx'. Stop.\nmake[1]: *** [steamshovel/CMakeFiles/steamshovel.dir/all] Error 2\n\nThe problem is the relative path specification in steamshovel/resources/steamshovel.qrc line 9:\n\n<file alias=\"doms.txt\">../../phys-services/resources/doms.txt</file>\n\nIt would be great if an absolute path (generated by cmake) could be used instead.\n\nUsing symlinks for i3projects in meta-project's source directory is quite handy, when working with different versions. ",
"reporter": "david.schultz",
"cc": "david.schultz@icecube.wisc.edu",
"resolution": "fixed",
"_ts": "1437526185451006",
"component": "combo core",
"summary": "steamshovel.qrc and symlinks",
"priority": "normal",
"keywords": "",
"time": "2014-03-21T18:54:41",
"milestone": "",
"owner": "hdembinski",
"type": "defect"
}
```
</p>
</details>
|
defect
|
steamshovel qrc and symlinks trac from martin wolf i just encountered an unusual compilation error when compiling streamshovel in offline software trunk when using a symlink for the steamshovel directory in the meta project s source direcory make no rule to make target home mwolf software meta projects offline software trunk src steamshovel resources phys services resources doms txt needed by steamshovel qrc steamshovel cxx stop make error the problem is the relative path specification in steamshovel resources steamshovel qrc line phys services resources doms txt it would be great if an absolute path generated by cmake could be used instead using symlinks for in meta project s source directory is quite handy when working with different versions migrated from reported by david schultz and owned by hdembinski json status closed changetime description from martin wolf n ni just encountered an unusual compilation error when compiling streamshovel in offline software trunk when using a symlink for the steamshovel directory in the meta project s source direcory n nmake no rule to make target home mwolf software meta projects offline software trunk src steamshovel resources phys services resources doms txt needed by steamshovel qrc steamshovel cxx stop nmake error n nthe problem is the relative path specification in steamshovel resources steamshovel qrc line n n phys services resources doms txt n nit would be great if an absolute path generated by cmake could be used instead n nusing symlinks for in meta project s source directory is quite handy when working with different versions reporter david schultz cc david schultz icecube wisc edu resolution fixed ts component combo core summary steamshovel qrc and symlinks priority normal keywords time milestone owner hdembinski type defect
| 1
|
117,928
| 25,216,717,437
|
IssuesEvent
|
2022-11-14 09:42:30
|
Regalis11/Barotrauma
|
https://api.github.com/repos/Regalis11/Barotrauma
|
closed
|
[Factions] Cultist Robes overlaps Diving Suit Helmet
|
Bug Code Design
|
### Disclaimers
- [X] I have searched the issue tracker to check if the issue has already been reported.
- [ ] My issue happened while using mods.
### What happened?
Whilst wearing the Cultist Robes and a Combat Diving Suit, the hood overlaps with the helmet. This is represented below.

### Reproduction steps
1. Wear Cultist Robes
2. Wear Combat Diving Suit
3. Observe hood overlapping the helmet
### Bug prevalence
Happens every time I play
### Version
Faction test branch
### -
_No response_
### Which operating system did you encounter this bug on?
Windows
### Relevant error messages and crash reports
_No response_
|
1.0
|
[Factions] Cultist Robes overlaps Diving Suit Helmet - ### Disclaimers
- [X] I have searched the issue tracker to check if the issue has already been reported.
- [ ] My issue happened while using mods.
### What happened?
Whilst wearing the Cultist Robes and a Combat Diving Suit, the hood overlaps with the helmet. This is represented below.

### Reproduction steps
1. Wear Cultist Robes
2. Wear Combat Diving Suit
3. Observe hood overlapping the helmet
### Bug prevalence
Happens every time I play
### Version
Faction test branch
### -
_No response_
### Which operating system did you encounter this bug on?
Windows
### Relevant error messages and crash reports
_No response_
|
non_defect
|
cultist robes overlaps diving suit helmet disclaimers i have searched the issue tracker to check if the issue has already been reported my issue happened while using mods what happened whilst wearing the cultist robes and a combat diving suit the hood overlaps with the helmet this is represented below reproduction steps wear cultist robes wear combat diving suit observe hood overlapping the helmet bug prevalence happens every time i play version faction test branch no response which operating system did you encounter this bug on windows relevant error messages and crash reports no response
| 0
|
606,814
| 18,768,803,239
|
IssuesEvent
|
2021-11-06 12:43:33
|
logseq/logseq
|
https://api.github.com/repos/logseq/logseq
|
closed
|
[0.4.5] all links of the pages which have ':' in title are broken after update
|
bug priority-A
|
**Describe the bug**
Page link which have ':' in the title, are not properly linked, and redirected to run "Open Apps with.." in Windows.
**To Reproduce**
Steps to reproduce the behavior:
1. Create a page, and change title to insert colon
2. link the page from another page.
3. click the link
**Expected behavior**
Linked page should be shown
**Screenshots**

**Desktop (please complete the following information):**
- OS: Windows 10
- Browser: chrome
- Version: 0.4.5
**Additional context**
It worked properly until 0.4.4, but after updating to 0.4.5, this bug appeared.
|
1.0
|
[0.4.5] all links of the pages which have ':' in title are broken after update - **Describe the bug**
Page link which have ':' in the title, are not properly linked, and redirected to run "Open Apps with.." in Windows.
**To Reproduce**
Steps to reproduce the behavior:
1. Create a page, and change title to insert colon
2. link the page from another page.
3. click the link
**Expected behavior**
Linked page should be shown
**Screenshots**

**Desktop (please complete the following information):**
- OS: Windows 10
- Browser: chrome
- Version: 0.4.5
**Additional context**
It worked properly until 0.4.4, but after updating to 0.4.5, this bug appeared.
|
non_defect
|
all links of the pages which have in title are broken after update describe the bug page link which have in the title are not properly linked and redirected to run open apps with in windows to reproduce steps to reproduce the behavior create a page and change title to insert colon link the page from another page click the link expected behavior linked page should be shown screenshots desktop please complete the following information os windows browser chrome version additional context it worked properly until but after updating to this bug appeared
| 0
|
57,975
| 16,237,997,033
|
IssuesEvent
|
2021-05-07 05:00:08
|
navilg/easyawscli
|
https://api.github.com/repos/navilg/easyawscli
|
opened
|
SSL validation failed for https://sts.amazonaws.com/ [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1123) : SSLError Login Failed
|
Defect
|
This issue is intermittent
|
1.0
|
SSL validation failed for https://sts.amazonaws.com/ [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1123) : SSLError Login Failed - This issue is intermittent
|
defect
|
ssl validation failed for certificate verify failed self signed certificate in certificate chain ssl c sslerror login failed this issue is intermittent
| 1
|
49,361
| 13,186,638,392
|
IssuesEvent
|
2020-08-13 00:49:58
|
icecube-trac/tix3
|
https://api.github.com/repos/icecube-trac/tix3
|
opened
|
CoincSuite go through the FIXME's (Trac #1233)
|
Incomplete Migration Migrated from Trac combo reconstruction defect
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1233">https://code.icecube.wisc.edu/ticket/1233</a>, reported by jtatar and owned by </em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:14:55",
"description": "There are several FIXME's. Please make the intended changes so the FIXME's would go away:\n\nprivate/test/CoincSuiteHelpersTest.cxx\nprivate/CoincSuite/lib/PartialCOG.cxx\nprivate/CoincSuite/Modules/DecisionMaker.cxx\nprivate/CoincSuite/Modules/AfterpulseDiscard.cxx\npython/coincsuite.py",
"reporter": "jtatar",
"cc": "",
"resolution": "wontfix",
"_ts": "1550067295757382",
"component": "combo reconstruction",
"summary": "CoincSuite go through the FIXME's",
"priority": "blocker",
"keywords": "",
"time": "2015-08-19T22:39:20",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
CoincSuite go through the FIXME's (Trac #1233) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1233">https://code.icecube.wisc.edu/ticket/1233</a>, reported by jtatar and owned by </em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:14:55",
"description": "There are several FIXME's. Please make the intended changes so the FIXME's would go away:\n\nprivate/test/CoincSuiteHelpersTest.cxx\nprivate/CoincSuite/lib/PartialCOG.cxx\nprivate/CoincSuite/Modules/DecisionMaker.cxx\nprivate/CoincSuite/Modules/AfterpulseDiscard.cxx\npython/coincsuite.py",
"reporter": "jtatar",
"cc": "",
"resolution": "wontfix",
"_ts": "1550067295757382",
"component": "combo reconstruction",
"summary": "CoincSuite go through the FIXME's",
"priority": "blocker",
"keywords": "",
"time": "2015-08-19T22:39:20",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
|
defect
|
coincsuite go through the fixme s trac migrated from json status closed changetime description there are several fixme s please make the intended changes so the fixme s would go away n nprivate test coincsuitehelperstest cxx nprivate coincsuite lib partialcog cxx nprivate coincsuite modules decisionmaker cxx nprivate coincsuite modules afterpulsediscard cxx npython coincsuite py reporter jtatar cc resolution wontfix ts component combo reconstruction summary coincsuite go through the fixme s priority blocker keywords time milestone owner type defect
| 1
|
41,168
| 10,321,569,436
|
IssuesEvent
|
2019-08-31 03:21:13
|
ascott18/TellMeWhen
|
https://api.github.com/repos/ascott18/TellMeWhen
|
opened
|
[Bug] Classic LUA error
|
defect
|
**What version of TellMeWhen are you using? **
<!-- Found in-game at the top of TMW's configuration window. "The latest" is not a version. -->
8.6.8 Classic
**What steps will reproduce the problem?**
1. Create icon with check cast bar for interrupt able
<!-- Add more steps if needed -->
**Additional Info**
<!-- Please add any additional information you think will be useful in reproducing and/or solving the issue. -->
`
Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:240: Attempt to register unknown event "UNIT_SPELLCAST_NOT_INTERRUPTIBLE"
Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:240: Attempt to register unknown event "UNIT_SPELLCAST_NOT_INTERRUPTIBLE"
Interface\AddOns\TellMeWhen\TellMeWhen.lua:2781: in function <Interface\AddOns\TellMeWhen\TellMeWhen.lua:2738>
Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:251: Attempt to unregister unknown event "UNIT_SPELLCAST_NOT_INTERRUPTIBLE"
[C]: in function `UnregisterEvent'
Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:251: in function `UnregisterAllEvents'
Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:863: in function `DisableIcon'
Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:957: in function <Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:907>
(tail call): ?
[C]: ?
[string "safecall Dispatcher[1]"]:9: in function <[string "safecall Dispatcher[1]"]:5>
(tail call): ?
...nterface\AddOns\TellMeWhen\Components\Core\Group.lua:497: in function <...nterface\AddOns\TellMeWhen\Components\Core\Group.lua:425>
(tail call): ?
[C]: ?
[string "safecall Dispatcher[1]"]:9: in function <[string "safecall Dispatcher[1]"]:5>
(tail call): ?
Interface\AddOns\TellMeWhen\TellMeWhen.lua:2689: in function `UpdateNormally'
Interface\AddOns\TellMeWhen\TellMeWhen.lua:2830: in function `Update'
Interface\AddOns\TheAction Classic\Action.lua:7847: in function <Interface\AddOns\TheAction Classic\Action.lua:7843>
Interface\AddOns\TheAction Classic\Action.lua:7852: in function <Interface\AddOns\TheAction Classic\Action.lua:7852>
(tail call): ?
[C]: ?
[string "safecall Dispatcher[1]"]:9: in function <[string "safecall Dispatcher[1]"]:5>
(tail call): ?
Interface\AddOns\TellMeWhen\TellMeWhen.lua:838: in function `Fire'
Interface\AddOns\TellMeWhen\TellMeWhen.lua:2755: in function <Interface\AddOns\TellMeWhen\TellMeWhen.lua:2738>
`
|
1.0
|
[Bug] Classic LUA error - **What version of TellMeWhen are you using? **
<!-- Found in-game at the top of TMW's configuration window. "The latest" is not a version. -->
8.6.8 Classic
**What steps will reproduce the problem?**
1. Create icon with check cast bar for interrupt able
<!-- Add more steps if needed -->
**Additional Info**
<!-- Please add any additional information you think will be useful in reproducing and/or solving the issue. -->
`
Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:240: Attempt to register unknown event "UNIT_SPELLCAST_NOT_INTERRUPTIBLE"
Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:240: Attempt to register unknown event "UNIT_SPELLCAST_NOT_INTERRUPTIBLE"
Interface\AddOns\TellMeWhen\TellMeWhen.lua:2781: in function <Interface\AddOns\TellMeWhen\TellMeWhen.lua:2738>
Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:251: Attempt to unregister unknown event "UNIT_SPELLCAST_NOT_INTERRUPTIBLE"
[C]: in function `UnregisterEvent'
Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:251: in function `UnregisterAllEvents'
Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:863: in function `DisableIcon'
Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:957: in function <Interface\AddOns\TellMeWhen\Components\Core\Icon.lua:907>
(tail call): ?
[C]: ?
[string "safecall Dispatcher[1]"]:9: in function <[string "safecall Dispatcher[1]"]:5>
(tail call): ?
...nterface\AddOns\TellMeWhen\Components\Core\Group.lua:497: in function <...nterface\AddOns\TellMeWhen\Components\Core\Group.lua:425>
(tail call): ?
[C]: ?
[string "safecall Dispatcher[1]"]:9: in function <[string "safecall Dispatcher[1]"]:5>
(tail call): ?
Interface\AddOns\TellMeWhen\TellMeWhen.lua:2689: in function `UpdateNormally'
Interface\AddOns\TellMeWhen\TellMeWhen.lua:2830: in function `Update'
Interface\AddOns\TheAction Classic\Action.lua:7847: in function <Interface\AddOns\TheAction Classic\Action.lua:7843>
Interface\AddOns\TheAction Classic\Action.lua:7852: in function <Interface\AddOns\TheAction Classic\Action.lua:7852>
(tail call): ?
[C]: ?
[string "safecall Dispatcher[1]"]:9: in function <[string "safecall Dispatcher[1]"]:5>
(tail call): ?
Interface\AddOns\TellMeWhen\TellMeWhen.lua:838: in function `Fire'
Interface\AddOns\TellMeWhen\TellMeWhen.lua:2755: in function <Interface\AddOns\TellMeWhen\TellMeWhen.lua:2738>
`
|
defect
|
classic lua error what version of tellmewhen are you using classic what steps will reproduce the problem create icon with check cast bar for interrupt able additional info interface addons tellmewhen components core icon lua attempt to register unknown event unit spellcast not interruptible interface addons tellmewhen components core icon lua attempt to register unknown event unit spellcast not interruptible interface addons tellmewhen tellmewhen lua in function interface addons tellmewhen components core icon lua attempt to unregister unknown event unit spellcast not interruptible in function unregisterevent interface addons tellmewhen components core icon lua in function unregisterallevents interface addons tellmewhen components core icon lua in function disableicon interface addons tellmewhen components core icon lua in function tail call in function tail call nterface addons tellmewhen components core group lua in function tail call in function tail call interface addons tellmewhen tellmewhen lua in function updatenormally interface addons tellmewhen tellmewhen lua in function update interface addons theaction classic action lua in function interface addons theaction classic action lua in function tail call in function tail call interface addons tellmewhen tellmewhen lua in function fire interface addons tellmewhen tellmewhen lua in function
| 1
|
46,544
| 13,055,930,503
|
IssuesEvent
|
2020-07-30 03:09:12
|
icecube-trac/tix2
|
https://api.github.com/repos/icecube-trac/tix2
|
opened
|
[steamshovel] bubbles-artist does not refresh on missing key (Trac #1402)
|
Incomplete Migration Migrated from Trac combo core defect
|
Migrated from https://code.icecube.wisc.edu/ticket/1402
```json
{
"status": "closed",
"changetime": "2016-03-18T21:14:08",
"description": "When cycling through frames and a key is visualized with an artist (hit-bubbles), which is not present in a following frame, the visual information of the previous frame remains active (bubbles show, which there is actually no info for in the frame).\n\nforcing the window to re-draw for example by cycling desktops fixes the visualized information. so it seems this is a refresh issue.\n\nproposed resolution: call the 'refresh' every time a frame is changed; I guess currently this is called by the artist themselves, which try to update only if their key to operate on is in fact present in the frame.",
"reporter": "mzoll",
"cc": "",
"resolution": "fixed",
"_ts": "1458335648518728",
"component": "combo core",
"summary": "[steamshovel] bubbles-artist does not refresh on missing key",
"priority": "normal",
"keywords": "steamshovel refresh",
"time": "2015-10-16T19:22:06",
"milestone": "",
"owner": "hdembinski",
"type": "defect"
}
```
|
1.0
|
[steamshovel] bubbles-artist does not refresh on missing key (Trac #1402) - Migrated from https://code.icecube.wisc.edu/ticket/1402
```json
{
"status": "closed",
"changetime": "2016-03-18T21:14:08",
"description": "When cycling through frames and a key is visualized with an artist (hit-bubbles), which is not present in a following frame, the visual information of the previous frame remains active (bubbles show, which there is actually no info for in the frame).\n\nforcing the window to re-draw for example by cycling desktops fixes the visualized information. so it seems this is a refresh issue.\n\nproposed resolution: call the 'refresh' every time a frame is changed; I guess currently this is called by the artist themselves, which try to update only if their key to operate on is in fact present in the frame.",
"reporter": "mzoll",
"cc": "",
"resolution": "fixed",
"_ts": "1458335648518728",
"component": "combo core",
"summary": "[steamshovel] bubbles-artist does not refresh on missing key",
"priority": "normal",
"keywords": "steamshovel refresh",
"time": "2015-10-16T19:22:06",
"milestone": "",
"owner": "hdembinski",
"type": "defect"
}
```
|
defect
|
bubbles artist does not refresh on missing key trac migrated from json status closed changetime description when cycling through frames and a key is visualized with an artist hit bubbles which is not present in a following frame the visual information of the previous frame remains active bubbles show which there is actually no info for in the frame n nforcing the window to re draw for example by cycling desktops fixes the visualized information so it seems this is a refresh issue n nproposed resolution call the refresh every time a frame is changed i guess currently this is called by the artist themselves which try to update only if their key to operate on is in fact present in the frame reporter mzoll cc resolution fixed ts component combo core summary bubbles artist does not refresh on missing key priority normal keywords steamshovel refresh time milestone owner hdembinski type defect
| 1
|
23,255
| 10,863,563,830
|
IssuesEvent
|
2019-11-14 15:20:28
|
status-im/nim-beacon-chain
|
https://api.github.com/repos/status-im/nim-beacon-chain
|
opened
|
[Security] Harden against seemingly valid BLS signature
|
security
|
The following snippet will be parsed as a valid BLS signature but will be initialized as the infinity point:
```Nim
import blscurve
let sigbytes = @[byte 217, 149, 255, 97, 73, 133, 236, 43, 248, 34, 30, 10, 15, 45, 82, 72, 243, 179, 53, 17, 27, 17, 248, 180, 7, 92, 200, 153, 11, 3, 111, 137, 124, 171, 29, 218, 191, 246, 148, 57, 160, 50, 232, 129, 81, 90, 72, 161, 110, 138, 243, 116, 0, 88, 125, 180, 67, 153, 194, 181, 117, 152, 166, 147, 13, 77, 15, 91, 33, 50, 140, 199, 150, 10, 15, 10, 209, 165, 38, 57, 56, 114, 175, 29, 49, 11, 11, 126, 55, 189, 170, 46, 218, 240, 189, 144]
var sig: Signature
let success = init(sig, sigbytes)
echo success
echo sig
```
see also: https://github.com/status-im/nim-blscurve/issues/29
|
True
|
[Security] Harden against seemingly valid BLS signature - The following snippet will be parsed as a valid BLS signature but will be initialized as the infinity point:
```Nim
import blscurve
let sigbytes = @[byte 217, 149, 255, 97, 73, 133, 236, 43, 248, 34, 30, 10, 15, 45, 82, 72, 243, 179, 53, 17, 27, 17, 248, 180, 7, 92, 200, 153, 11, 3, 111, 137, 124, 171, 29, 218, 191, 246, 148, 57, 160, 50, 232, 129, 81, 90, 72, 161, 110, 138, 243, 116, 0, 88, 125, 180, 67, 153, 194, 181, 117, 152, 166, 147, 13, 77, 15, 91, 33, 50, 140, 199, 150, 10, 15, 10, 209, 165, 38, 57, 56, 114, 175, 29, 49, 11, 11, 126, 55, 189, 170, 46, 218, 240, 189, 144]
var sig: Signature
let success = init(sig, sigbytes)
echo success
echo sig
```
see also: https://github.com/status-im/nim-blscurve/issues/29
|
non_defect
|
harden against seemingly valid bls signature the following snippet will be parsed as a valid bls signature but will be initialized as the infinity point nim import blscurve let sigbytes var sig signature let success init sig sigbytes echo success echo sig see also
| 0
|
20,366
| 3,348,769,648
|
IssuesEvent
|
2015-11-17 04:41:01
|
dart-lang/sdk
|
https://api.github.com/repos/dart-lang/sdk
|
closed
|
Code highlight stops working in DartEditor
|
area-analyzer NeedsInfo Priority-Medium resolution-assumed-stale Type-Defect
|
*This issue was originally filed by regardingSc...@gmail.com*
_____
**What steps will reproduce the problem?**
1. Code for ~ 30 minutes in several project
2. Change files or comment and then uncomment code blocks / rows
**3.**
**What is the expected output? What do you see instead?**
DartEditor should be able to highlight correctly the uncommented code. Instead it remains all black.
**What version of the product are you using?**
1.9.1
**On what operating system?**
Linux
**What browser (if applicable)?**
**Please provide any additional information below.**
happens to both method bodies as well as static fields (for example the word static is not highlighted).
Re-analyzing does not help, close and open again the file helps.
|
1.0
|
Code highlight stops working in DartEditor - *This issue was originally filed by regardingSc...@gmail.com*
_____
**What steps will reproduce the problem?**
1. Code for ~ 30 minutes in several project
2. Change files or comment and then uncomment code blocks / rows
**3.**
**What is the expected output? What do you see instead?**
DartEditor should be able to highlight correctly the uncommented code. Instead it remains all black.
**What version of the product are you using?**
1.9.1
**On what operating system?**
Linux
**What browser (if applicable)?**
**Please provide any additional information below.**
happens to both method bodies as well as static fields (for example the word static is not highlighted).
Re-analyzing does not help, close and open again the file helps.
|
defect
|
code highlight stops working in darteditor this issue was originally filed by regardingsc gmail com what steps will reproduce the problem code for minutes in several project change files or comment and then uncomment code blocks rows what is the expected output what do you see instead darteditor should be able to highlight correctly the uncommented code instead it remains all black what version of the product are you using on what operating system linux what browser if applicable please provide any additional information below happens to both method bodies as well as static fields for example the word static is not highlighted re analyzing does not help close and open again the file helps
| 1
|
37,107
| 8,244,387,515
|
IssuesEvent
|
2018-09-11 06:01:54
|
CenturyLinkCloud/mdw
|
https://api.github.com/repos/CenturyLinkCloud/mdw
|
closed
|
Task instance failing when rule based prioritization strategy is specified
|
defect
|
getting below error because the package name of RulesBasedRoutingStrategy is different
.11:56:10.738 p29329940.10002 a13.10020] Activity started - Aprilia Fallout Task
[(s)20180905.11:56:10.937 ~1467] com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
com.centurylink.mdw.activity.ActivityException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.workflow.activity.task.ManualTaskActivity.createTaskInstance(ManualTaskActivity.java:76)
at com.centurylink.mdw.workflow.activity.task.CustomManualTaskActivity.execute(CustomManualTaskActivity.java:56)
at com.centurylink.mdw.workflow.activity.DefaultActivityImpl.execute(DefaultActivityImpl.java:41)
at com.centurylink.mdw.services.process.BaseActivity.execute(BaseActivity.java:221)
at com.centurylink.mdw.services.process.ProcessEngineDriver.executeActivity(ProcessEngineDriver.java:381)
at com.centurylink.mdw.services.process.ProcessEngineDriver.processEvent(ProcessEngineDriver.java:612)
at com.centurylink.mdw.services.process.ProcessEngineDriver.processEvents(ProcessEngineDriver.java:578)
at com.centurylink.mdw.services.process.InternalEventDriver.run(InternalEventDriver.java:37)
at com.centurylink.mdw.container.plugin.CommonThreadPool$Work.run(CommonThreadPool.java:263)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at com.centurylink.mdw.container.plugin.CommonThreadPool$ManagedThread.run(CommonThreadPool.java:232)
Caused by: com.centurylink.mdw.common.service.ServiceException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.services.task.TaskWorkflowHelper.determineWorkgroups(TaskWorkflowHelper.java:784)
at com.centurylink.mdw.services.task.TaskWorkflowHelper.createTaskInstance(TaskWorkflowHelper.java:203)
at com.centurylink.mdw.services.task.TaskWorkflowHelper.createTaskInstance(TaskWorkflowHelper.java:127)
at com.centurylink.mdw.services.task.TaskServicesImpl.createTask(TaskServicesImpl.java:87)
at com.centurylink.mdw.workflow.activity.task.ManualTaskActivity.createTaskInstance(ManualTaskActivity.java:89)
at com.centurylink.mdw.workflow.activity.task.ManualTaskActivity.createTaskInstance(ManualTaskActivity.java:72)
... 12 more
Caused by: com.centurylink.mdw.common.StrategyException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.services.task.factory.TaskInstanceStrategyFactory.getStrategyInstance(TaskInstanceStrategyFactory.java:166)
at com.centurylink.mdw.services.task.factory.TaskInstanceStrategyFactory.getRoutingStrategy(TaskInstanceStrategyFactory.java:74)
at com.centurylink.mdw.services.task.TaskWorkflowHelper.determineWorkgroups(TaskWorkflowHelper.java:776)
... 17 more
Caused by: java.lang.ClassNotFoundException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1291)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1119)
at com.centurylink.mdw.services.task.factory.TaskInstanceStrategyFactory.getStrategyInstance(TaskInstanceStrategyFactory.java:161)
... 19 more
[(s)20180905.11:56:10.938 ~1467] Failed to execute activity - com.centurylink.mdw.activity.ActivityException
com.centurylink.mdw.activity.ActivityException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.workflow.activity.task.CustomManualTaskActivity.execute(CustomManualTaskActivity.java:75)
at com.centurylink.mdw.workflow.activity.DefaultActivityImpl.execute(DefaultActivityImpl.java:41)
at com.centurylink.mdw.services.process.BaseActivity.execute(BaseActivity.java:221)
at com.centurylink.mdw.services.process.ProcessEngineDriver.executeActivity(ProcessEngineDriver.java:381)
at com.centurylink.mdw.services.process.ProcessEngineDriver.processEvent(ProcessEngineDriver.java:612)
at com.centurylink.mdw.services.process.ProcessEngineDriver.processEvents(ProcessEngineDriver.java:578)
at com.centurylink.mdw.services.process.InternalEventDriver.run(InternalEventDriver.java:37)
at com.centurylink.mdw.container.plugin.CommonThreadPool$Work.run(CommonThreadPool.java:263)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at com.centurylink.mdw.container.plugin.CommonThreadPool$ManagedThread.run(CommonThreadPool.java:232)
Caused by: com.centurylink.mdw.activity.ActivityException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.workflow.activity.task.ManualTaskActivity.createTaskInstance(ManualTaskActivity.java:76)
at com.centurylink.mdw.workflow.activity.task.CustomManualTaskActivity.execute(CustomManualTaskActivity.java:56)
... 11 more
Caused by: com.centurylink.mdw.common.service.ServiceException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.services.task.TaskWorkflowHelper.determineWorkgroups(TaskWorkflowHelper.java:784)
at com.centurylink.mdw.services.task.TaskWorkflowHelper.createTaskInstance(TaskWorkflowHelper.java:203)
at com.centurylink.mdw.services.task.TaskWorkflowHelper.createTaskInstance(TaskWorkflowHelper.java:127)
at com.centurylink.mdw.services.task.TaskServicesImpl.createTask(TaskServicesImpl.java:87)
at com.centurylink.mdw.workflow.activity.task.ManualTaskActivity.createTaskInstance(ManualTaskActivity.java:89)
at com.centurylink.mdw.workflow.activity.task.ManualTaskActivity.createTaskInstance(ManualTaskActivity.java:72)
... 12 more
Caused by: com.centurylink.mdw.common.StrategyException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.services.task.factory.TaskInstanceStrategyFactory.getStrategyInstance(TaskInstanceStrategyFactory.java:166)
at com.centurylink.mdw.services.task.factory.TaskInstanceStrategyFactory.getRoutingStrategy(TaskInstanceStrategyFactory.java:74)
at com.centurylink.mdw.services.task.TaskWorkflowHelper.determineWorkgroups(TaskWorkflowHelper.java:776)
... 17 more
Caused by: java.lang.ClassNotFoundException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1291)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1119)
at com.centurylink.mdw.services.task.factory.TaskInstanceStrategyFactory.getStrategyInstance(TaskInstanceStrategyFactory.java:161)
... 19 more
[(i)20180905.11:56:10.941 p29329940.10002 a13.10020] Activity failed - com.centurylink.mdw.activity.ActivityException
[(i)20180905.11:56:11.116 p29329940.10002 a13.10020] Inherited Event - type=4, compcode=null
[(i)20180905.11:56:11.182 ~1467] Transition has not been defined for event of type 4
|
1.0
|
Task instance failing when rule based prioritization strategy is specified - getting below error because the package name of RulesBasedRoutingStrategy is different
.11:56:10.738 p29329940.10002 a13.10020] Activity started - Aprilia Fallout Task
[(s)20180905.11:56:10.937 ~1467] com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
com.centurylink.mdw.activity.ActivityException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.workflow.activity.task.ManualTaskActivity.createTaskInstance(ManualTaskActivity.java:76)
at com.centurylink.mdw.workflow.activity.task.CustomManualTaskActivity.execute(CustomManualTaskActivity.java:56)
at com.centurylink.mdw.workflow.activity.DefaultActivityImpl.execute(DefaultActivityImpl.java:41)
at com.centurylink.mdw.services.process.BaseActivity.execute(BaseActivity.java:221)
at com.centurylink.mdw.services.process.ProcessEngineDriver.executeActivity(ProcessEngineDriver.java:381)
at com.centurylink.mdw.services.process.ProcessEngineDriver.processEvent(ProcessEngineDriver.java:612)
at com.centurylink.mdw.services.process.ProcessEngineDriver.processEvents(ProcessEngineDriver.java:578)
at com.centurylink.mdw.services.process.InternalEventDriver.run(InternalEventDriver.java:37)
at com.centurylink.mdw.container.plugin.CommonThreadPool$Work.run(CommonThreadPool.java:263)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at com.centurylink.mdw.container.plugin.CommonThreadPool$ManagedThread.run(CommonThreadPool.java:232)
Caused by: com.centurylink.mdw.common.service.ServiceException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.services.task.TaskWorkflowHelper.determineWorkgroups(TaskWorkflowHelper.java:784)
at com.centurylink.mdw.services.task.TaskWorkflowHelper.createTaskInstance(TaskWorkflowHelper.java:203)
at com.centurylink.mdw.services.task.TaskWorkflowHelper.createTaskInstance(TaskWorkflowHelper.java:127)
at com.centurylink.mdw.services.task.TaskServicesImpl.createTask(TaskServicesImpl.java:87)
at com.centurylink.mdw.workflow.activity.task.ManualTaskActivity.createTaskInstance(ManualTaskActivity.java:89)
at com.centurylink.mdw.workflow.activity.task.ManualTaskActivity.createTaskInstance(ManualTaskActivity.java:72)
... 12 more
Caused by: com.centurylink.mdw.common.StrategyException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.services.task.factory.TaskInstanceStrategyFactory.getStrategyInstance(TaskInstanceStrategyFactory.java:166)
at com.centurylink.mdw.services.task.factory.TaskInstanceStrategyFactory.getRoutingStrategy(TaskInstanceStrategyFactory.java:74)
at com.centurylink.mdw.services.task.TaskWorkflowHelper.determineWorkgroups(TaskWorkflowHelper.java:776)
... 17 more
Caused by: java.lang.ClassNotFoundException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1291)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1119)
at com.centurylink.mdw.services.task.factory.TaskInstanceStrategyFactory.getStrategyInstance(TaskInstanceStrategyFactory.java:161)
... 19 more
[(s)20180905.11:56:10.938 ~1467] Failed to execute activity - com.centurylink.mdw.activity.ActivityException
com.centurylink.mdw.activity.ActivityException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.workflow.activity.task.CustomManualTaskActivity.execute(CustomManualTaskActivity.java:75)
at com.centurylink.mdw.workflow.activity.DefaultActivityImpl.execute(DefaultActivityImpl.java:41)
at com.centurylink.mdw.services.process.BaseActivity.execute(BaseActivity.java:221)
at com.centurylink.mdw.services.process.ProcessEngineDriver.executeActivity(ProcessEngineDriver.java:381)
at com.centurylink.mdw.services.process.ProcessEngineDriver.processEvent(ProcessEngineDriver.java:612)
at com.centurylink.mdw.services.process.ProcessEngineDriver.processEvents(ProcessEngineDriver.java:578)
at com.centurylink.mdw.services.process.InternalEventDriver.run(InternalEventDriver.java:37)
at com.centurylink.mdw.container.plugin.CommonThreadPool$Work.run(CommonThreadPool.java:263)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at com.centurylink.mdw.container.plugin.CommonThreadPool$ManagedThread.run(CommonThreadPool.java:232)
Caused by: com.centurylink.mdw.activity.ActivityException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.workflow.activity.task.ManualTaskActivity.createTaskInstance(ManualTaskActivity.java:76)
at com.centurylink.mdw.workflow.activity.task.CustomManualTaskActivity.execute(CustomManualTaskActivity.java:56)
... 11 more
Caused by: com.centurylink.mdw.common.service.ServiceException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.services.task.TaskWorkflowHelper.determineWorkgroups(TaskWorkflowHelper.java:784)
at com.centurylink.mdw.services.task.TaskWorkflowHelper.createTaskInstance(TaskWorkflowHelper.java:203)
at com.centurylink.mdw.services.task.TaskWorkflowHelper.createTaskInstance(TaskWorkflowHelper.java:127)
at com.centurylink.mdw.services.task.TaskServicesImpl.createTask(TaskServicesImpl.java:87)
at com.centurylink.mdw.workflow.activity.task.ManualTaskActivity.createTaskInstance(ManualTaskActivity.java:89)
at com.centurylink.mdw.workflow.activity.task.ManualTaskActivity.createTaskInstance(ManualTaskActivity.java:72)
... 12 more
Caused by: com.centurylink.mdw.common.StrategyException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at com.centurylink.mdw.services.task.factory.TaskInstanceStrategyFactory.getStrategyInstance(TaskInstanceStrategyFactory.java:166)
at com.centurylink.mdw.services.task.factory.TaskInstanceStrategyFactory.getRoutingStrategy(TaskInstanceStrategyFactory.java:74)
at com.centurylink.mdw.services.task.TaskWorkflowHelper.determineWorkgroups(TaskWorkflowHelper.java:776)
... 17 more
Caused by: java.lang.ClassNotFoundException: com.centurylink.mdw.workflow.task.strategy.RulesBasedRoutingStrategy
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1291)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1119)
at com.centurylink.mdw.services.task.factory.TaskInstanceStrategyFactory.getStrategyInstance(TaskInstanceStrategyFactory.java:161)
... 19 more
[(i)20180905.11:56:10.941 p29329940.10002 a13.10020] Activity failed - com.centurylink.mdw.activity.ActivityException
[(i)20180905.11:56:11.116 p29329940.10002 a13.10020] Inherited Event - type=4, compcode=null
[(i)20180905.11:56:11.182 ~1467] Transition has not been defined for event of type 4
|
defect
|
task instance failing when rule based prioritization strategy is speciified getting below error because the package name of rulesbasedroutingstrategy is different activity started aprilia fallout task com centurylink mdw workflow task strategy rulesbasedroutingstrategy com centurylink mdw activity activityexception com centurylink mdw workflow task strategy rulesbasedroutingstrategy at com centurylink mdw workflow activity task manualtaskactivity createtaskinstance manualtaskactivity java at com centurylink mdw workflow activity task custommanualtaskactivity execute custommanualtaskactivity java at com centurylink mdw workflow activity defaultactivityimpl execute defaultactivityimpl java at com centurylink mdw services process baseactivity execute baseactivity java at com centurylink mdw services process processenginedriver executeactivity processenginedriver java at com centurylink mdw services process processenginedriver processevent processenginedriver java at com centurylink mdw services process processenginedriver processevents processenginedriver java at com centurylink mdw services process internaleventdriver run internaleventdriver java at com centurylink mdw container plugin commonthreadpool work run commonthreadpool java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java at com centurylink mdw container plugin commonthreadpool managedthread run commonthreadpool java caused by com centurylink mdw common service serviceexception com centurylink mdw workflow task strategy rulesbasedroutingstrategy at com centurylink mdw services task taskworkflowhelper determineworkgroups taskworkflowhelper java at com centurylink mdw services task taskworkflowhelper createtaskinstance taskworkflowhelper java at com centurylink mdw services task taskworkflowhelper createtaskinstance taskworkflowhelper java at com centurylink mdw services 
task taskservicesimpl createtask taskservicesimpl java at com centurylink mdw workflow activity task manualtaskactivity createtaskinstance manualtaskactivity java at com centurylink mdw workflow activity task manualtaskactivity createtaskinstance manualtaskactivity java more caused by com centurylink mdw common strategyexception com centurylink mdw workflow task strategy rulesbasedroutingstrategy at com centurylink mdw services task factory taskinstancestrategyfactory getstrategyinstance taskinstancestrategyfactory java at com centurylink mdw services task factory taskinstancestrategyfactory getroutingstrategy taskinstancestrategyfactory java at com centurylink mdw services task taskworkflowhelper determineworkgroups taskworkflowhelper java more caused by java lang classnotfoundexception com centurylink mdw workflow task strategy rulesbasedroutingstrategy at org apache catalina loader webappclassloaderbase loadclass webappclassloaderbase java at org apache catalina loader webappclassloaderbase loadclass webappclassloaderbase java at com centurylink mdw services task factory taskinstancestrategyfactory getstrategyinstance taskinstancestrategyfactory java more failed to execute activity com centurylink mdw activity activityexception com centurylink mdw activity activityexception com centurylink mdw workflow task strategy rulesbasedroutingstrategy at com centurylink mdw workflow activity task custommanualtaskactivity execute custommanualtaskactivity java at com centurylink mdw workflow activity defaultactivityimpl execute defaultactivityimpl java at com centurylink mdw services process baseactivity execute baseactivity java at com centurylink mdw services process processenginedriver executeactivity processenginedriver java at com centurylink mdw services process processenginedriver processevent processenginedriver java at com centurylink mdw services process processenginedriver processevents processenginedriver java at com centurylink mdw services process 
internaleventdriver run internaleventdriver java at com centurylink mdw container plugin commonthreadpool work run commonthreadpool java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java at com centurylink mdw container plugin commonthreadpool managedthread run commonthreadpool java caused by com centurylink mdw activity activityexception com centurylink mdw workflow task strategy rulesbasedroutingstrategy at com centurylink mdw workflow activity task manualtaskactivity createtaskinstance manualtaskactivity java at com centurylink mdw workflow activity task custommanualtaskactivity execute custommanualtaskactivity java more caused by com centurylink mdw common service serviceexception com centurylink mdw workflow task strategy rulesbasedroutingstrategy at com centurylink mdw services task taskworkflowhelper determineworkgroups taskworkflowhelper java at com centurylink mdw services task taskworkflowhelper createtaskinstance taskworkflowhelper java at com centurylink mdw services task taskworkflowhelper createtaskinstance taskworkflowhelper java at com centurylink mdw services task taskservicesimpl createtask taskservicesimpl java at com centurylink mdw workflow activity task manualtaskactivity createtaskinstance manualtaskactivity java at com centurylink mdw workflow activity task manualtaskactivity createtaskinstance manualtaskactivity java more caused by com centurylink mdw common strategyexception com centurylink mdw workflow task strategy rulesbasedroutingstrategy at com centurylink mdw services task factory taskinstancestrategyfactory getstrategyinstance taskinstancestrategyfactory java at com centurylink mdw services task factory taskinstancestrategyfactory getroutingstrategy taskinstancestrategyfactory java at com centurylink mdw services task taskworkflowhelper determineworkgroups taskworkflowhelper java more caused by 
java lang classnotfoundexception com centurylink mdw workflow task strategy rulesbasedroutingstrategy at org apache catalina loader webappclassloaderbase loadclass webappclassloaderbase java at org apache catalina loader webappclassloaderbase loadclass webappclassloaderbase java at com centurylink mdw services task factory taskinstancestrategyfactory getstrategyinstance taskinstancestrategyfactory java more activity failed com centurylink mdw activity activityexception inherited event type compcode null transition has not been defined for event of type
| 1
|
10,419
| 2,622,151,466
|
IssuesEvent
|
2015-03-04 00:06:20
|
byzhang/lh-vim
|
https://api.github.com/repos/byzhang/lh-vim
|
closed
|
lhBrackets requires lh-dev
|
auto-migrated lhBrackets Priority-Medium Type-Defect
|
```
after/ftplugin/c/c_brackets.vim from lhBrackets/lh-map-tools uses
lh#dev#option#get and thus generates some errors when opening/creating c files
while brackets-addon-info.txt is missing the dependency on lh-dev
```
Original issue reported on code.google.com by `tiziano....@gmail.com` on 28 Jun 2012 at 7:22
|
1.0
|
lhBrackets requires lh-dev - ```
after/ftplugin/c/c_brackets.vim from lhBrackets/lh-map-tools uses
lh#dev#option#get and thus generates some errors when opening/creating c files
while brackets-addon-info.txt is missing the dependency on lh-dev
```
Original issue reported on code.google.com by `tiziano....@gmail.com` on 28 Jun 2012 at 7:22
|
defect
|
lhbrackets requires lh dev after ftplugin c c brackets vim from lhbrackets lh map tools uses lh dev option get and thus generates some errors when opening creating c files while brackets addon info txt is missing the dependency on lh dev original issue reported on code google com by tiziano gmail com on jun at
| 1
|
157,277
| 19,957,081,196
|
IssuesEvent
|
2022-01-28 01:22:21
|
panasalap/linux-4.1.15
|
https://api.github.com/repos/panasalap/linux-4.1.15
|
opened
|
CVE-2017-8797 (High) detected in linux-yocto-4.1v4.1.17
|
security vulnerability
|
## CVE-2017-8797 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yocto-4.1v4.1.17</b></p></summary>
<p>
<p>[no description]</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto-4.1>https://git.yoctoproject.org/git/linux-yocto-4.1</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The NFSv4 server in the Linux kernel before 4.11.3 does not properly validate the layout type when processing the NFSv4 pNFS GETDEVICEINFO or LAYOUTGET operand in a UDP packet from a remote attacker. This type value is uninitialized upon encountering certain error conditions. This value is used as an array index for dereferencing, which leads to an OOPS and eventually a DoS of knfsd and a soft-lockup of the whole system.
<p>Publish Date: 2017-07-02
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-8797>CVE-2017-8797</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2017-8797">https://www.linuxkernelcves.com/cves/CVE-2017-8797</a></p>
<p>Release Date: 2017-07-02</p>
<p>Fix Resolution: v4.12-rc1,v4.11.3,v4.9.30</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2017-8797 (High) detected in linux-yocto-4.1v4.1.17 - ## CVE-2017-8797 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yocto-4.1v4.1.17</b></p></summary>
<p>
<p>[no description]</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto-4.1>https://git.yoctoproject.org/git/linux-yocto-4.1</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The NFSv4 server in the Linux kernel before 4.11.3 does not properly validate the layout type when processing the NFSv4 pNFS GETDEVICEINFO or LAYOUTGET operand in a UDP packet from a remote attacker. This type value is uninitialized upon encountering certain error conditions. This value is used as an array index for dereferencing, which leads to an OOPS and eventually a DoS of knfsd and a soft-lockup of the whole system.
<p>Publish Date: 2017-07-02
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-8797>CVE-2017-8797</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2017-8797">https://www.linuxkernelcves.com/cves/CVE-2017-8797</a></p>
<p>Release Date: 2017-07-02</p>
<p>Fix Resolution: v4.12-rc1,v4.11.3,v4.9.30</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_defect
|
cve high detected in linux yocto cve high severity vulnerability vulnerable library linux yocto library home page a href found in base branch master vulnerable source files vulnerability details the server in the linux kernel before does not properly validate the layout type when processing the pnfs getdeviceinfo or layoutget operand in a udp packet from a remote attacker this type value is uninitialized upon encountering certain error conditions this value is used as an array index for dereferencing which leads to an oops and eventually a dos of knfsd and a soft lockup of the whole system publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
61,278
| 17,023,655,394
|
IssuesEvent
|
2021-07-03 03:07:58
|
tomhughes/trac-tickets
|
https://api.github.com/repos/tomhughes/trac-tickets
|
closed
|
Navigating away from PL2 gives confusing message
|
Component: website Priority: minor Resolution: fixed Type: defect
|
**[Submitted to the original trac issue database at 6.45pm, Tuesday, 30th November 2010]**
When navigating away from Potlatch 2 with unsaved changes, the website produces the following confusing message:
To save in Potlatch, you should deselect the current way or point, if editing in live mode, or click save if you have a save button.
Potlatch 2 does not have live mode.
|
1.0
|
Navigating away from PL2 gives confusing message - **[Submitted to the original trac issue database at 6.45pm, Tuesday, 30th November 2010]**
When navigating away from Potlatch 2 with unsaved changes, the website produces the following confusing message:
To save in Potlatch, you should deselect the current way or point, if editing in live mode, or click save if you have a save button.
Potlatch 2 does not have live mode.
|
defect
|
navigating away from gives confusing message when navigating away from potlatch with unsaved changes the website produces the following confusing message to save in potlatch you should deselect the current way or point if editing in live mode or click save if you have a save button potlatch does not have live mode
| 1
|
70,125
| 22,956,985,779
|
IssuesEvent
|
2022-07-19 12:28:51
|
hazelcast/hazelcast
|
https://api.github.com/repos/hazelcast/hazelcast
|
closed
|
hazelcast map cas remove return false,but the map contain the removed obj
|
Type: Defect Team: Core Source: Community Module: IMap
|
We use Hazelcast 3.12.7 and create a map with OBJECT in-memory format. Our cluster has two nodes. We initialize the data while both nodes are up, then shut one node down and try to remove an entry from the map. The removal fails, yet the object is still in the map, and get() returns an object that is equal to the one we tried to remove.
```java
public void init() {
Config config = this.hazelcast.getConfig();
MapConfig sharedMapConfig = config.getMapConfig(HZ_SHARED_MAP);
sharedMapConfig.setName(HZ_SHARED_MAP).setInMemoryFormat(InMemoryFormat.OBJECT);
this.hazelcast.getConfig().addMapConfig(sharedMapConfig);
this.sharedMap = this.hazelcast.getMap(HZ_SHARED_MAP);
}
public boolean removeTopic(String topic, String clientId, String version) {
Subscription target = Subscription.newSub(topic, clientId);
String key = this.sharedKeyProvider.apply(topic, clientId);
boolean r = this.sharedMap.remove(key, target);
if (!r) {
Object source = this.sharedMap.get(key);
logger.warn("remove failure,result:{},target:{},source:{},equal:{}", r, target, source, target.equals(source));
}
return r;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
Subscription that = (Subscription) o;
return this.topic.equals(that.topic) && this.clientId.equals(that.clientId);
}
@Override
public int hashCode() {
return Objects.hash(this.topic, this.clientId);
}
```
remove() returns false, but the map still contains the source object; the logger prints that target equals source.
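For reference, the conditional-remove contract the code above relies on can be demonstrated with a minimal, self-contained sketch. This uses java.util.concurrent.ConcurrentHashMap rather than Hazelcast's IMap, and the Subscription class here is a simplified stand-in for the one in the issue; it only illustrates that remove(key, value) is expected to succeed when the stored value equals() the supplied one.

```java
import java.util.Objects;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Sketch of the remove(key, value) contract: the entry is removed only if
// the currently mapped value is equal to the supplied value.
public class CasRemoveDemo {

    // Simplified stand-in for the issue's Subscription type:
    // equality is based on topic + clientId only.
    static final class Subscription {
        final String topic;
        final String clientId;

        Subscription(String topic, String clientId) {
            this.topic = topic;
            this.clientId = clientId;
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Subscription)) return false;
            Subscription that = (Subscription) o;
            return topic.equals(that.topic) && clientId.equals(that.clientId);
        }

        @Override
        public int hashCode() {
            return Objects.hash(topic, clientId);
        }
    }

    public static void main(String[] args) {
        ConcurrentMap<String, Subscription> map = new ConcurrentHashMap<>();
        map.put("t/c1", new Subscription("t", "c1"));

        // A fresh-but-equal instance is expected to satisfy the condition,
        // which is the behavior the issue reports as broken in its setup.
        boolean removed = map.remove("t/c1", new Subscription("t", "c1"));
        System.out.println(removed);                 // true
        System.out.println(map.containsKey("t/c1")); // false
    }
}
```

In the local ConcurrentHashMap case the equal-but-distinct instance removes the entry; the issue reports that the equivalent Hazelcast call fails after a node leaves the cluster, even though get() returns an equal object.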
|
1.0
|
hazelcast map cas remove return false,but the map contain the removed obj - We use Hazelcast 3.12.7 and create a map with OBJECT in-memory format. Our cluster has two nodes. We initialize the data while both nodes are up, then shut one node down and try to remove an entry from the map. The removal fails, yet the object is still in the map, and get() returns an object that is equal to the one we tried to remove.
```java
public void init() {
Config config = this.hazelcast.getConfig();
MapConfig sharedMapConfig = config.getMapConfig(HZ_SHARED_MAP);
sharedMapConfig.setName(HZ_SHARED_MAP).setInMemoryFormat(InMemoryFormat.OBJECT);
this.hazelcast.getConfig().addMapConfig(sharedMapConfig);
this.sharedMap = this.hazelcast.getMap(HZ_SHARED_MAP);
}
public boolean removeTopic(String topic, String clientId, String version) {
Subscription target = Subscription.newSub(topic, clientId);
String key = this.sharedKeyProvider.apply(topic, clientId);
boolean r = this.sharedMap.remove(key, target);
if (!r) {
Object source = this.sharedMap.get(key);
logger.warn("remove failure,result:{},target:{},source:{},equal:{}", r, target, source, target.equals(source));
}
return r;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
Subscription that = (Subscription) o;
return this.topic.equals(that.topic) && this.clientId.equals(that.clientId);
}
@Override
public int hashCode() {
return Objects.hash(this.topic, this.clientId);
}
```
remove() returns false, but the map still contains the source object; the logger prints that target equals source.
|
defect
|
hazelcast map cas remove return false but the map contain the removed obj we used hazelcast version create map with object memory format there are two nodes in our cluster initialize the data when there are two nodes and then close one node to remove the map s data removal failed but in fact there is this object in the map get can also return the equal same object java public void init config config this hazelcast getconfig mapconfig sharedmapconfig config getmapconfig hz shared map sharedmapconfig setname hz shared map setinmemoryformat inmemoryformat object this hazelcast getconfig addmapconfig sharedmapconfig this sharedmap this hazelcast getmap hz shared map public boolean removetopic string topic string clientid string version subscription target subscription newsub topic clientid string key this sharedkeyprovider apply topic clientid boolean r this sharedmap remove key target if r object source this sharedmap get key logger warn remove failure result target source equal r target source target equals source return r override public boolean equals object o if this o return true if o null getclass o getclass return false subscription that subscription o return this topic equals that topic this clientid equals that clientid override public int hashcode return objects hash this topic this clientid remove return false but map contain the source object logger print target equal source
| 1
|
12,677
| 4,513,659,075
|
IssuesEvent
|
2016-09-04 12:15:55
|
owncloud/gallery
|
https://api.github.com/repos/owncloud/gallery
|
closed
|
GDrive-like grid view
|
coder wanted enhancement sponsor needed
|
It would be great to have a grid view similar to Google Drive's, with nicely cropped images that all share the same size.
I think it's an elegant and efficient way to browse image thumbnails.
*Gallery+*
<img width="341" alt="screen shot 2016-01-07 at 12 28 50" src="https://cloud.githubusercontent.com/assets/805144/12169644/4ed1950a-b53a-11e5-805d-df245cab9fd9.png">
*Google Drive*
<img width="1167" alt="screen shot 2016-01-07 at 12 27 45" src="https://cloud.githubusercontent.com/assets/805144/12169715/d1431d9c-b53a-11e5-92d5-0d1ade39b6f6.png">
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/29686884-gdrive-like-grid-view?utm_campaign=plugin&utm_content=tracker%2F9328526&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F9328526&utm_medium=issues&utm_source=github).
</bountysource-plugin>
|
1.0
|
GDrive-like grid view - It would be great to have a grid view similar to Google Drive's, with nicely cropped images that all share the same size.
I think it's an elegant and efficient way to browse image thumbnails.
*Gallery+*
<img width="341" alt="screen shot 2016-01-07 at 12 28 50" src="https://cloud.githubusercontent.com/assets/805144/12169644/4ed1950a-b53a-11e5-805d-df245cab9fd9.png">
*Google Drive*
<img width="1167" alt="screen shot 2016-01-07 at 12 27 45" src="https://cloud.githubusercontent.com/assets/805144/12169715/d1431d9c-b53a-11e5-92d5-0d1ade39b6f6.png">
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/29686884-gdrive-like-grid-view?utm_campaign=plugin&utm_content=tracker%2F9328526&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F9328526&utm_medium=issues&utm_source=github).
</bountysource-plugin>
|
non_defect
|
gdrive like grid view for me would be great having the possibility to have a grid view similar to the google drive one with nice cropped image that share the same size i think it s an elegant and efficient way to look at the image thumbnails gallery img width alt screen shot at src google drive img width alt screen shot at src want to back this issue we accept bounties via
| 0
|
207,216
| 23,434,916,465
|
IssuesEvent
|
2022-08-15 08:44:31
|
Gal-Doron/Baragon-36
|
https://api.github.com/repos/Gal-Doron/Baragon-36
|
opened
|
jetty-server-9.4.18.v20190429.jar: 5 vulnerabilities (highest severity is: 7.5)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p></summary>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonData/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-36/commit/3335ef04e9449f11036516e41533318fc21bd8a3">3335ef04e9449f11036516e41533318fc21bd8a3</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2021-28165](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28165) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | jetty-io-9.4.18.v20190429.jar | Transitive | 9.4.39.v20210325 | ✅ |
| [CVE-2021-28169](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28169) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | jetty-server-9.4.18.v20190429.jar | Direct | 9.4.41.v20210516 | ✅ |
| [CVE-2020-27218](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-27218) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.8 | jetty-server-9.4.18.v20190429.jar | Direct | 9.4.35.v20201120 | ✅ |
| [CVE-2021-34428](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-34428) | <img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low | 3.5 | jetty-server-9.4.18.v20190429.jar | Direct | 9.4.41.v20210516 | ✅ |
| [CVE-2022-2047](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-2047) | <img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low | 2.7 | jetty-server-9.4.18.v20190429.jar | Direct | 9.4.47.v20220610 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-28165</summary>
### Vulnerable Library - <b>jetty-io-9.4.18.v20190429.jar</b></p>
<p>The Eclipse Jetty Project</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonService/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-io/9.4.18.v20190429/jetty-io-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-io/9.4.18.v20190429/jetty-io-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-io/9.4.18.v20190429/jetty-io-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- jetty-server-9.4.18.v20190429.jar (Root Library)
- :x: **jetty-io-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-36/commit/3335ef04e9449f11036516e41533318fc21bd8a3">3335ef04e9449f11036516e41533318fc21bd8a3</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Eclipse Jetty 7.2.2 to 9.4.38, 10.0.0.alpha0 to 10.0.1, and 11.0.0.alpha0 to 11.0.1, CPU usage can reach 100% upon receiving a large invalid TLS frame.
<p>Publish Date: 2021-04-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28165>CVE-2021-28165</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-26vr-8j45-3r4w">https://github.com/eclipse/jetty.project/security/advisories/GHSA-26vr-8j45-3r4w</a></p>
<p>Release Date: 2021-04-01</p>
<p>Fix Resolution (org.eclipse.jetty:jetty-io): 9.4.39.v20210325</p>
<p>Direct dependency fix Resolution (org.eclipse.jetty:jetty-server): 9.4.39.v20210325</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-28169</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonData/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-36/commit/3335ef04e9449f11036516e41533318fc21bd8a3">3335ef04e9449f11036516e41533318fc21bd8a3</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
For Eclipse Jetty versions <= 9.4.40, <= 10.0.2, <= 11.0.2, it is possible for requests to the ConcatServlet with a doubly encoded path to access protected resources within the WEB-INF directory. For example a request to `/concat?/%2557EB-INF/web.xml` can retrieve the web.xml file. This can reveal sensitive information regarding the implementation of a web application.
<p>Publish Date: 2021-06-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28169>CVE-2021-28169</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-gwcr-j4wh-j3cq">https://github.com/eclipse/jetty.project/security/advisories/GHSA-gwcr-j4wh-j3cq</a></p>
<p>Release Date: 2021-06-09</p>
<p>Fix Resolution: 9.4.41.v20210516</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-27218</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonData/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-36/commit/3335ef04e9449f11036516e41533318fc21bd8a3">3335ef04e9449f11036516e41533318fc21bd8a3</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Eclipse Jetty version 9.4.0.RC0 to 9.4.34.v20201102, 10.0.0.alpha0 to 10.0.0.beta2, and 11.0.0.alpha0 to 11.0.0.beta2, if GZIP request body inflation is enabled and requests from different clients are multiplexed onto a single connection, and if an attacker can send a request with a body that is received entirely but not consumed by the application, then a subsequent request on the same connection will see that body prepended to its body. The attacker will not see any data but may inject data into the body of the subsequent request.
<p>Publish Date: 2020-11-28
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-27218>CVE-2020-27218</a></p>
</p>
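The mechanism above can be sketched with a plain byte stream. This is an illustrative model only (an `io.BytesIO` standing in for a reused connection, not Jetty's internals): if the server starts parsing the next request without draining the unconsumed body, the leftover body bytes are prepended to that request.

```python
# Sketch (illustrative only, not Jetty's actual implementation) of why an
# unconsumed request body on a reused connection smuggles bytes into the
# next request, and why draining the body before reuse fixes it.
import io

# One connection carrying request 1's 5-byte body followed by request 2.
wire = io.BytesIO(b"EVIL " + b"GET /victim HTTP/1.1\r\n\r\n")
BODY_LENGTH = 5   # declared Content-Length of request 1 (never read by app)

# Buggy server: parses the next request without draining the body,
# so request 1's body is prepended to request 2.
smuggled = wire.read()

# Fixed server: discard the unconsumed body bytes first.
wire.seek(0)
wire.read(BODY_LENGTH)           # drain leftover body
clean = wire.read()              # now starts at the real next request

print(smuggled)  # b'EVIL GET /victim HTTP/1.1\r\n\r\n'
print(clean)     # b'GET /victim HTTP/1.1\r\n\r\n'
```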
<p></p>
### CVSS 3 Score Details (<b>4.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-86wm-rrjm-8wh8">https://github.com/eclipse/jetty.project/security/advisories/GHSA-86wm-rrjm-8wh8</a></p>
<p>Release Date: 2020-11-28</p>
<p>Fix Resolution: 9.4.35.v20201120</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> CVE-2021-34428</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonData/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-36/commit/3335ef04e9449f11036516e41533318fc21bd8a3">3335ef04e9449f11036516e41533318fc21bd8a3</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
For Eclipse Jetty versions <= 9.4.40, <= 10.0.2, <= 11.0.2, if an exception is thrown from the SessionListener#sessionDestroyed() method, then the session ID is not invalidated in the session ID manager. On deployments with clustered sessions and multiple contexts this can result in a session not being invalidated. This can result in an application used on a shared computer being left logged in.
<p>Publish Date: 2021-06-22
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-34428>CVE-2021-34428</a></p>
</p>
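The fix pattern for this class of bug is to make invalidation exception-safe. The sketch below uses a hypothetical session manager (not Jetty's API) to show the idea: run the destroy-listeners inside `try`, and remove the session ID in `finally` so it is invalidated even when a listener throws.

```python
# Minimal sketch (hypothetical session manager, not Jetty's API) of
# exception-safe session invalidation: the ID is removed even when a
# destroy-listener raises, which is the failure mode this CVE describes.
class SessionManager:
    def __init__(self):
        self.active_ids = set()
        self.listeners = []

    def create(self, session_id):
        self.active_ids.add(session_id)

    def destroy(self, session_id):
        try:
            for listener in self.listeners:
                listener(session_id)             # may raise, as in the CVE
        finally:
            self.active_ids.discard(session_id)  # always invalidate the ID

def throwing_listener(session_id):
    raise RuntimeError("listener failed")

mgr = SessionManager()
mgr.create("abc")
mgr.listeners.append(throwing_listener)
try:
    mgr.destroy("abc")
except RuntimeError:
    pass
print("abc" in mgr.active_ids)  # False: session invalidated despite the error
```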
<p></p>
### CVSS 3 Score Details (<b>3.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Physical
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-m6cp-vxjx-65j6">https://github.com/eclipse/jetty.project/security/advisories/GHSA-m6cp-vxjx-65j6</a></p>
<p>Release Date: 2021-06-22</p>
<p>Fix Resolution: 9.4.41.v20210516</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> CVE-2022-2047</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonData/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-36/commit/3335ef04e9449f11036516e41533318fc21bd8a3">3335ef04e9449f11036516e41533318fc21bd8a3</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Eclipse Jetty versions 9.4.0 through 9.4.46, 10.0.0 through 10.0.9, and 11.0.0 through 11.0.9, when parsing the authority segment of an http scheme URI, the Jetty HttpURI class improperly detects an invalid input as a hostname. This can lead to failures in a Proxy scenario.
<p>Publish Date: 2022-07-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-2047>CVE-2022-2047</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>2.7</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-cj7v-27pg-wf7q">https://github.com/eclipse/jetty.project/security/advisories/GHSA-cj7v-27pg-wf7q</a></p>
<p>Release Date: 2022-07-07</p>
<p>Fix Resolution: 9.4.47.v20220610</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
True
|
jetty-server-9.4.18.v20190429.jar: 5 vulnerabilities (highest severity is: 7.5) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p></summary>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonData/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-36/commit/3335ef04e9449f11036516e41533318fc21bd8a3">3335ef04e9449f11036516e41533318fc21bd8a3</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2021-28165](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28165) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | jetty-io-9.4.18.v20190429.jar | Transitive | 9.4.39.v20210325 | ✅ |
| [CVE-2021-28169](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28169) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | jetty-server-9.4.18.v20190429.jar | Direct | 9.4.41.v20210516 | ✅ |
| [CVE-2020-27218](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-27218) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.8 | jetty-server-9.4.18.v20190429.jar | Direct | 9.4.35.v20201120 | ✅ |
| [CVE-2021-34428](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-34428) | <img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low | 3.5 | jetty-server-9.4.18.v20190429.jar | Direct | 9.4.41.v20210516 | ✅ |
| [CVE-2022-2047](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-2047) | <img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low | 2.7 | jetty-server-9.4.18.v20190429.jar | Direct | 9.4.47.v20220610 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-28165</summary>
### Vulnerable Library - <b>jetty-io-9.4.18.v20190429.jar</b></p>
<p>The Eclipse Jetty Project</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonService/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-io/9.4.18.v20190429/jetty-io-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-io/9.4.18.v20190429/jetty-io-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-io/9.4.18.v20190429/jetty-io-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- jetty-server-9.4.18.v20190429.jar (Root Library)
- :x: **jetty-io-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-36/commit/3335ef04e9449f11036516e41533318fc21bd8a3">3335ef04e9449f11036516e41533318fc21bd8a3</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Eclipse Jetty 7.2.2 to 9.4.38, 10.0.0.alpha0 to 10.0.1, and 11.0.0.alpha0 to 11.0.1, CPU usage can reach 100% upon receiving a large invalid TLS frame.
<p>Publish Date: 2021-04-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28165>CVE-2021-28165</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-26vr-8j45-3r4w">https://github.com/eclipse/jetty.project/security/advisories/GHSA-26vr-8j45-3r4w</a></p>
<p>Release Date: 2021-04-01</p>
<p>Fix Resolution (org.eclipse.jetty:jetty-io): 9.4.39.v20210325</p>
<p>Direct dependency fix Resolution (org.eclipse.jetty:jetty-server): 9.4.39.v20210325</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-28169</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonData/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-36/commit/3335ef04e9449f11036516e41533318fc21bd8a3">3335ef04e9449f11036516e41533318fc21bd8a3</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
For Eclipse Jetty versions <= 9.4.40, <= 10.0.2, <= 11.0.2, it is possible for requests to the ConcatServlet with a doubly encoded path to access protected resources within the WEB-INF directory. For example a request to `/concat?/%2557EB-INF/web.xml` can retrieve the web.xml file. This can reveal sensitive information regarding the implementation of a web application.
<p>Publish Date: 2021-06-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28169>CVE-2021-28169</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-gwcr-j4wh-j3cq">https://github.com/eclipse/jetty.project/security/advisories/GHSA-gwcr-j4wh-j3cq</a></p>
<p>Release Date: 2021-06-09</p>
<p>Fix Resolution: 9.4.41.v20210516</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-27218</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonData/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-36/commit/3335ef04e9449f11036516e41533318fc21bd8a3">3335ef04e9449f11036516e41533318fc21bd8a3</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Eclipse Jetty version 9.4.0.RC0 to 9.4.34.v20201102, 10.0.0.alpha0 to 10.0.0.beta2, and 11.0.0.alpha0 to 11.0.0.beta2, if GZIP request body inflation is enabled and requests from different clients are multiplexed onto a single connection, and if an attacker can send a request with a body that is received entirely but not consumed by the application, then a subsequent request on the same connection will see that body prepended to its body. The attacker will not see any data but may inject data into the body of the subsequent request.
<p>Publish Date: 2020-11-28
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-27218>CVE-2020-27218</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>4.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-86wm-rrjm-8wh8">https://github.com/eclipse/jetty.project/security/advisories/GHSA-86wm-rrjm-8wh8</a></p>
<p>Release Date: 2020-11-28</p>
<p>Fix Resolution: 9.4.35.v20201120</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> CVE-2021-34428</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonData/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-36/commit/3335ef04e9449f11036516e41533318fc21bd8a3">3335ef04e9449f11036516e41533318fc21bd8a3</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
For Eclipse Jetty versions <= 9.4.40, <= 10.0.2, <= 11.0.2, if an exception is thrown from the SessionListener#sessionDestroyed() method, then the session ID is not invalidated in the session ID manager. On deployments with clustered sessions and multiple contexts this can result in a session not being invalidated. This can result in an application used on a shared computer being left logged in.
<p>Publish Date: 2021-06-22
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-34428>CVE-2021-34428</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>3.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Physical
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-m6cp-vxjx-65j6">https://github.com/eclipse/jetty.project/security/advisories/GHSA-m6cp-vxjx-65j6</a></p>
<p>Release Date: 2021-06-22</p>
<p>Fix Resolution: 9.4.41.v20210516</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> CVE-2022-2047</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonData/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-36/commit/3335ef04e9449f11036516e41533318fc21bd8a3">3335ef04e9449f11036516e41533318fc21bd8a3</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Eclipse Jetty versions 9.4.0 through 9.4.46, 10.0.0 through 10.0.9, and 11.0.0 through 11.0.9, when parsing the authority segment of an http scheme URI, the Jetty HttpURI class improperly detects an invalid input as a hostname. This can lead to failures in a Proxy scenario.
<p>Publish Date: 2022-07-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-2047>CVE-2022-2047</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>2.7</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-cj7v-27pg-wf7q">https://github.com/eclipse/jetty.project/security/advisories/GHSA-cj7v-27pg-wf7q</a></p>
<p>Release Date: 2022-07-07</p>
<p>Fix Resolution: 9.4.47.v20220610</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
non_defect
|
jetty server jar vulnerabilities highest severity is vulnerable library jetty server jar the core jetty server artifact library home page a href path to dependency file baragondata pom xml path to vulnerable library home wss scanner repository org eclipse jetty jetty server jetty server jar repository org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available high jetty io jar transitive medium jetty server jar direct medium jetty server jar direct low jetty server jar direct low jetty server jar direct details cve vulnerable library jetty io jar the eclipse jetty project library home page a href path to dependency file baragonservice pom xml path to vulnerable library home wss scanner repository org eclipse jetty jetty io jetty io jar home wss scanner repository org eclipse jetty jetty io jetty io jar home wss scanner repository org eclipse jetty jetty io jetty io jar dependency hierarchy jetty server jar root library x jetty io jar vulnerable library found in head commit a href found in base branch master vulnerability details in eclipse jetty to to and to cpu usage can reach upon receiving a large invalid tls frame publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org eclipse jetty jetty io direct dependency fix resolution org eclipse jetty jetty server rescue worker helmet automatic remediation is available for this issue cve vulnerable library jetty server jar the core jetty server artifact library home page a href path to dependency file 
baragondata pom xml path to vulnerable library home wss scanner repository org eclipse jetty jetty server jetty server jar repository org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar dependency hierarchy x jetty server jar vulnerable library found in head commit a href found in base branch master vulnerability details for eclipse jetty versions it is possible for requests to the concatservlet with a doubly encoded path to access protected resources within the web inf directory for example a request to concat inf web xml can retrieve the web xml file this can reveal sensitive information regarding the implementation of a web application publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue cve vulnerable library jetty server jar the core jetty server artifact library home page a href path to dependency file baragondata pom xml path to vulnerable library home wss scanner repository org eclipse jetty jetty server jetty server jar repository org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar dependency hierarchy x jetty server jar vulnerable library found in head commit a href found in base branch master vulnerability details in eclipse jetty version to to and to if gzip request body inflation is enabled and requests from different clients are multiplexed onto a single connection and if an attacker can send a request with a body that is received entirely but not consumed by the application then a subsequent 
request on the same connection will see that body prepended to its body the attacker will not see any data but may inject data into the body of the subsequent request publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue cve vulnerable library jetty server jar the core jetty server artifact library home page a href path to dependency file baragondata pom xml path to vulnerable library home wss scanner repository org eclipse jetty jetty server jetty server jar repository org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar dependency hierarchy x jetty server jar vulnerable library found in head commit a href found in base branch master vulnerability details for eclipse jetty versions if an exception is thrown from the sessionlistener sessiondestroyed method then the session id is not invalidated in the session id manager on deployments with clustered sessions and multiple contexts this can result in a session not being invalidated this can result in an application used on a shared computer being left logged in publish date url a href cvss score details base score metrics exploitability metrics attack vector physical attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue cve 
vulnerable library jetty server jar the core jetty server artifact library home page a href path to dependency file baragondata pom xml path to vulnerable library home wss scanner repository org eclipse jetty jetty server jetty server jar repository org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar dependency hierarchy x jetty server jar vulnerable library found in head commit a href found in base branch master vulnerability details in eclipse jetty versions thru and thru and thru versions the parsing of the authority segment of an http scheme uri the jetty httpuri class improperly detects an invalid input as a hostname this can lead to failures in a proxy scenario publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required high user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue
| 0
|
18,916
| 3,098,700,070
|
IssuesEvent
|
2015-08-28 12:54:33
|
simulationcraft/simc
|
https://api.github.com/repos/simulationcraft/simc
|
closed
|
Can sim others, but not main character
|
Status-New Type-Defect
|
Originally reported on Google Code with ID 2604
```
IF RUNNING FROM THE GUI, PLEASE ATTACH (or Cut-N-Paste contents) OF
simc_gui.simc FILE. THIS WILL ENABLE TO US TO REPRODUCE THE PROBLEM.
What steps will reproduce the problem?
1. Importing character from armory
2. Running simulation
3. "Simulationcraft is has stopped working..."
What is the expected output? What do you see instead?
Full standard simulation
What version of the product are you using? On what operating system?
64 bit from 7s file on Windows 8 64 bit.
Please provide any additional information below.
```
Reported by `erydell45` on 2015-07-29 22:55:40
|
1.0
|
Can sim others, but not main character - Originally reported on Google Code with ID 2604
```
IF RUNNING FROM THE GUI, PLEASE ATTACH (or Cut-N-Paste contents) OF
simc_gui.simc FILE. THIS WILL ENABLE TO US TO REPRODUCE THE PROBLEM.
What steps will reproduce the problem?
1. Importing character from armory
2. Running simulation
3. "Simulationcraft is has stopped working..."
What is the expected output? What do you see instead?
Full standard simulation
What version of the product are you using? On what operating system?
64 bit from 7s file on Windows 8 64 bit.
Please provide any additional information below.
```
Reported by `erydell45` on 2015-07-29 22:55:40
|
defect
|
can sim others but not main character originally reported on google code with id if running from the gui please attach or cut n paste contents of simc gui simc file this will enable to us to reproduce the problem what steps will reproduce the problem importing character from armory running simulation simulationcraft is has stopped working what is the expected output what do you see instead full standard simulation what version of the product are you using on what operating system bit from file on windows bit please provide any additional information below reported by on
| 1
|
44,274
| 12,101,433,255
|
IssuesEvent
|
2020-04-20 15:12:28
|
codesmithtools/Templates
|
https://api.github.com/repos/codesmithtools/Templates
|
closed
|
Add identification to be a default route parameter
|
Framework-PLINQOEF Type-Defect auto-migrated stale
|
```
routes.MapHttpRoute(
name: "DefaultApi",
routeTemplate: "api/{controller}/{identification}",
defaults: new { identification = RouteParameter.Optional }
);
```
Original issue reported on code.google.com by `bniemyjski` on 6 May 2014 at 6:55
|
1.0
|
Add identification to be a default route parameter - ```
routes.MapHttpRoute(
name: "DefaultApi",
routeTemplate: "api/{controller}/{identification}",
defaults: new { identification = RouteParameter.Optional }
);
```
Original issue reported on code.google.com by `bniemyjski` on 6 May 2014 at 6:55
|
defect
|
add identification to be a default route parameter routes maphttproute name defaultapi routetemplate api controller identification defaults new identification routeparameter optional original issue reported on code google com by bniemyjski on may at
| 1
|
46,505
| 13,055,923,236
|
IssuesEvent
|
2020-07-30 03:07:56
|
icecube-trac/tix2
|
https://api.github.com/repos/icecube-trac/tix2
|
opened
|
[simulation] release notes not up to date (Trac #1325)
|
Incomplete Migration Migrated from Trac combo simulation defect
|
Migrated from https://code.icecube.wisc.edu/ticket/1325
```json
{
"status": "closed",
"changetime": "2016-03-18T21:14:03",
"description": "The simulation meta-project trunk release notes only goes up to V04-01-07. I've seen release notes for versions past that, so we're out of sync. Fix that.",
"reporter": "david.schultz",
"cc": "",
"resolution": "fixed",
"_ts": "1458335643235016",
"component": "combo simulation",
"summary": "[simulation] release notes not up to date",
"priority": "blocker",
"keywords": "",
"time": "2015-09-01T22:48:13",
"milestone": "",
"owner": "olivas",
"type": "defect"
}
```
|
1.0
|
[simulation] release notes not up to date (Trac #1325) - Migrated from https://code.icecube.wisc.edu/ticket/1325
```json
{
"status": "closed",
"changetime": "2016-03-18T21:14:03",
"description": "The simulation meta-project trunk release notes only goes up to V04-01-07. I've seen release notes for versions past that, so we're out of sync. Fix that.",
"reporter": "david.schultz",
"cc": "",
"resolution": "fixed",
"_ts": "1458335643235016",
"component": "combo simulation",
"summary": "[simulation] release notes not up to date",
"priority": "blocker",
"keywords": "",
"time": "2015-09-01T22:48:13",
"milestone": "",
"owner": "olivas",
"type": "defect"
}
```
|
defect
|
release notes not up to date trac migrated from json status closed changetime description the simulation meta project trunk release notes only goes up to i ve seen release notes for versions past that so we re out of sync fix that reporter david schultz cc resolution fixed ts component combo simulation summary release notes not up to date priority blocker keywords time milestone owner olivas type defect
| 1
|
703,669
| 24,169,947,455
|
IssuesEvent
|
2022-09-22 18:15:06
|
cloudflare/cloudflared
|
https://api.github.com/repos/cloudflare/cloudflared
|
closed
|
🐛High CPU utilization with latest version 2022.9.0 on ARM64 - Raspberry Pi 4B
|
Type: Bug Priority: Normal
|
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Cloudflare Zero Trust
2. Latest ARM64 Container
3. High CPU utilization than previous version
If it's an issue with Cloudflare Tunnel:
4. Tunnel ID : 58d7dfc2-9105-4414-90f3-7979f05e55d4
5. cloudflared config: Cloudflare Zero Trust
**Expected behavior**
Similar CPU utilization as previous version
**Environment and versions**
- OS: Linux RaspberryPi 5.15.61-v8+
- Architecture: ARM
- Version: 2022.9.0

|
1.0
|
🐛High CPU utilization with latest version 2022.9.0 on ARM64 - Raspberry Pi 4B - **Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Cloudflare Zero Trust
2. Latest ARM64 Container
3. High CPU utilization than previous version
If it's an issue with Cloudflare Tunnel:
4. Tunnel ID : 58d7dfc2-9105-4414-90f3-7979f05e55d4
5. cloudflared config: Cloudflare Zero Trust
**Expected behavior**
Similar CPU utilization as previous version
**Environment and versions**
- OS: Linux RaspberryPi 5.15.61-v8+
- Architecture: ARM
- Version: 2022.9.0

|
non_defect
|
🐛high cpu utilization with latest version on raspberry pi describe the bug a clear and concise description of what the bug is to reproduce steps to reproduce the behavior cloudflare zero trust latest container high cpu utilization than previous version if it s an issue with cloudflare tunnel tunnel id cloudflared config cloudflare zero trust expected behavior similar cpu utilization as previous version environment and versions os linux raspberrypi architecture arm version
| 0
|
34,022
| 7,327,200,007
|
IssuesEvent
|
2018-03-04 06:53:07
|
scipy/scipy
|
https://api.github.com/repos/scipy/scipy
|
closed
|
hyp0f1 and struveh/struvel test failures
|
defect scipy.special
|
With the 0.18.0rc2 32-bit Linux wheel from PyPI:
```
======================================================================
FAIL: test_mpmath.TestSystematic.test_hyp0f1_complex
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/rgommers/.local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/decorators.py", line 147, in skipper_func
return f(*args, **kwargs)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/tests/test_mpmath.py", line 1198, in test_hyp0f1_complex
[Arg(-25, 25), ComplexArg(complex(-120, -120), complex(120, 120))])
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 225, in assert_mpmath_equal
d.check()
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 212, in check
reraise(*sys.exc_info())
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 208, in check
param_filter=self.param_filter)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_testutils.py", line 125, in assert_func_equal
fdata.check()
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_testutils.py", line 344, in check
assert_(False, "\n".join(msg))
File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/utils.py", line 75, in assert_
raise AssertionError(smsg)
AssertionError:
Max |adiff|: 4.98453e+25
Max |rdiff|: 1.16226e+14
Bad results (2 out of 5491) for the following points (in output 0):
(-1e-30+0j) (1e-30+0j) => (-5.811323644522354e-17+0j) != (-5e-31+0j) (rdiff 116226472890446.06)
(1e-30+0j) (-1e-30+0j) => (-5.811323644522253e-17+0j) != (5e-31+0j) (rdiff 116226472890446.06)
======================================================================
FAIL: test_mpmath.TestSystematic.test_struveh
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/rgommers/.local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/decorators.py", line 147, in skipper_func
return f(*args, **kwargs)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/tests/test_mpmath.py", line 1609, in test_struveh
rtol=5e-10)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 225, in assert_mpmath_equal
d.check()
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 212, in check
reraise(*sys.exc_info())
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 208, in check
param_filter=self.param_filter)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_testutils.py", line 125, in assert_func_equal
fdata.check()
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_testutils.py", line 344, in check
assert_(False, "\n".join(msg))
File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/utils.py", line 75, in assert_
raise AssertionError(smsg)
AssertionError:
Max |adiff|: 6.00153e+281
Max |rdiff|: 1.04376e-06
Bad results (17 out of 2610) for the following points (in output 0):
-10.0 27.825594022071243 => 0.08538849163456759 != 0.0853885807600122 (rdiff 1.0437630396775024e-06)
-1.584893192461111e-07 16.68100537200059 => 0.0030327868849725037 != 0.0030327868586599417 (rdiff 8.676034012477932e-09)
-2.511886431509572e-13 16.68100537200059 => 0.003032859340241818 != 0.003032859357762292 (rdiff 5.7768831831575855e-09)
-3.9810717055349853e-19 16.68100537200059 => 0.0030328593642341573 != 0.0030328593578771955 (rdiff 2.0960292192322536e-09)
-6.309573444801943e-25 16.68100537200059 => 0.003032859386511313 != 0.0030328593578771955 (rdiff 9.44129425983736e-09)
-1e-30 16.68100537200059 => 0.003032859382870125 != 0.0030328593578771955 (rdiff 8.240714999191768e-09)
0.0 16.68100537200059 => 0.003032859337594946 != 0.0030328593578771955 (rdiff 6.687500807317889e-09)
1e-30 16.68100537200059 => 0.0030328593420661386 != 0.0030328593578771955 (rdiff 5.21325093752338e-09)
6.309573444801943e-25 16.68100537200059 => 0.0030328593496474253 != 0.0030328593578771955 (rdiff 2.713535048232232e-09)
3.9810717055349853e-19 16.68100537200059 => 0.0030328593802469183 != 0.003032859357877196 (rdiff 7.375786264466402e-09)
2.511886431509572e-13 16.68100537200059 => 0.0030328593792583133 != 0.0030328593579920992 (rdiff 7.011935450985688e-09)
1.584893192461111e-07 16.68100537200059 => 0.0030329318165232045 != 0.0030329318571076067 (rdiff 1.3381244310934225e-08)
0.1 16.68100537200059 => 0.051432929176759075 != 0.051432929221421814 (rdiff 8.683685619545708e-10)
0.10000000000000009 16.68100537200059 => 0.05143292918322988 != 0.051432929221421855 (rdiff 7.425588613477252e-10)
77.4263682681127 77.4263682681127 => 124408928.42117062 != 124408929.01600471 (rdiff 4.781281327505454e-09)
215.44346900318823 129.1549665014884 => 2.449041958118309e-22 != 2.4490419531540947e-22 (rdiff 2.0270025182055675e-09)
599.4842503189409 215.44346900318823 => 6.090078970541256e-190 != 6.09007895879149e-190 (rdiff 1.9293290039418937e-09)
======================================================================
FAIL: test_mpmath.TestSystematic.test_struvel
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/rgommers/.local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/decorators.py", line 147, in skipper_func
return f(*args, **kwargs)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/tests/test_mpmath.py", line 1627, in test_struvel
ignore_inf_sign=True)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 225, in assert_mpmath_equal
d.check()
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 212, in check
reraise(*sys.exc_info())
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 208, in check
param_filter=self.param_filter)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_testutils.py", line 125, in assert_func_equal
fdata.check()
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_testutils.py", line 344, in check
assert_(False, "\n".join(msg))
File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/utils.py", line 75, in assert_
raise AssertionError(smsg)
AssertionError:
Max |adiff|: 6.00153e+281
Max |rdiff|: 1.34067e-08
Bad results (2 out of 2612) for the following points (in output 0):
-599.4842503189409 215.44346900318823 => 7.496028765368905e+182 != 7.496028788983548e+182 (rdiff 3.150287113158528e-09)
-215.44346900318823 129.1549665014884 => 1.8527173939362396e+16 != 1.852717369097493e+16 (rdiff 1.340665727773665e-08)
```
The `hyp0f1` is not specific to this wheel I think, but the `struve` ones I've never seen before.
|
1.0
|
hyp0f1 and struveh/struvel test failures - With the 0.18.0rc2 32-bit Linux wheel from PyPI:
```
======================================================================
FAIL: test_mpmath.TestSystematic.test_hyp0f1_complex
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/rgommers/.local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/decorators.py", line 147, in skipper_func
return f(*args, **kwargs)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/tests/test_mpmath.py", line 1198, in test_hyp0f1_complex
[Arg(-25, 25), ComplexArg(complex(-120, -120), complex(120, 120))])
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 225, in assert_mpmath_equal
d.check()
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 212, in check
reraise(*sys.exc_info())
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 208, in check
param_filter=self.param_filter)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_testutils.py", line 125, in assert_func_equal
fdata.check()
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_testutils.py", line 344, in check
assert_(False, "\n".join(msg))
File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/utils.py", line 75, in assert_
raise AssertionError(smsg)
AssertionError:
Max |adiff|: 4.98453e+25
Max |rdiff|: 1.16226e+14
Bad results (2 out of 5491) for the following points (in output 0):
(-1e-30+0j) (1e-30+0j) => (-5.811323644522354e-17+0j) != (-5e-31+0j) (rdiff 116226472890446.06)
(1e-30+0j) (-1e-30+0j) => (-5.811323644522253e-17+0j) != (5e-31+0j) (rdiff 116226472890446.06)
======================================================================
FAIL: test_mpmath.TestSystematic.test_struveh
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/rgommers/.local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/decorators.py", line 147, in skipper_func
return f(*args, **kwargs)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/tests/test_mpmath.py", line 1609, in test_struveh
rtol=5e-10)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 225, in assert_mpmath_equal
d.check()
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 212, in check
reraise(*sys.exc_info())
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 208, in check
param_filter=self.param_filter)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_testutils.py", line 125, in assert_func_equal
fdata.check()
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_testutils.py", line 344, in check
assert_(False, "\n".join(msg))
File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/utils.py", line 75, in assert_
raise AssertionError(smsg)
AssertionError:
Max |adiff|: 6.00153e+281
Max |rdiff|: 1.04376e-06
Bad results (17 out of 2610) for the following points (in output 0):
-10.0 27.825594022071243 => 0.08538849163456759 != 0.0853885807600122 (rdiff 1.0437630396775024e-06)
-1.584893192461111e-07 16.68100537200059 => 0.0030327868849725037 != 0.0030327868586599417 (rdiff 8.676034012477932e-09)
-2.511886431509572e-13 16.68100537200059 => 0.003032859340241818 != 0.003032859357762292 (rdiff 5.7768831831575855e-09)
-3.9810717055349853e-19 16.68100537200059 => 0.0030328593642341573 != 0.0030328593578771955 (rdiff 2.0960292192322536e-09)
-6.309573444801943e-25 16.68100537200059 => 0.003032859386511313 != 0.0030328593578771955 (rdiff 9.44129425983736e-09)
-1e-30 16.68100537200059 => 0.003032859382870125 != 0.0030328593578771955 (rdiff 8.240714999191768e-09)
0.0 16.68100537200059 => 0.003032859337594946 != 0.0030328593578771955 (rdiff 6.687500807317889e-09)
1e-30 16.68100537200059 => 0.0030328593420661386 != 0.0030328593578771955 (rdiff 5.21325093752338e-09)
6.309573444801943e-25 16.68100537200059 => 0.0030328593496474253 != 0.0030328593578771955 (rdiff 2.713535048232232e-09)
3.9810717055349853e-19 16.68100537200059 => 0.0030328593802469183 != 0.003032859357877196 (rdiff 7.375786264466402e-09)
2.511886431509572e-13 16.68100537200059 => 0.0030328593792583133 != 0.0030328593579920992 (rdiff 7.011935450985688e-09)
1.584893192461111e-07 16.68100537200059 => 0.0030329318165232045 != 0.0030329318571076067 (rdiff 1.3381244310934225e-08)
0.1 16.68100537200059 => 0.051432929176759075 != 0.051432929221421814 (rdiff 8.683685619545708e-10)
0.10000000000000009 16.68100537200059 => 0.05143292918322988 != 0.051432929221421855 (rdiff 7.425588613477252e-10)
77.4263682681127 77.4263682681127 => 124408928.42117062 != 124408929.01600471 (rdiff 4.781281327505454e-09)
215.44346900318823 129.1549665014884 => 2.449041958118309e-22 != 2.4490419531540947e-22 (rdiff 2.0270025182055675e-09)
599.4842503189409 215.44346900318823 => 6.090078970541256e-190 != 6.09007895879149e-190 (rdiff 1.9293290039418937e-09)
======================================================================
FAIL: test_mpmath.TestSystematic.test_struvel
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/rgommers/.local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/decorators.py", line 147, in skipper_func
return f(*args, **kwargs)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/tests/test_mpmath.py", line 1627, in test_struvel
ignore_inf_sign=True)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 225, in assert_mpmath_equal
d.check()
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 212, in check
reraise(*sys.exc_info())
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_mptestutils.py", line 208, in check
param_filter=self.param_filter)
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_testutils.py", line 125, in assert_func_equal
fdata.check()
File "/home/rgommers/.local/lib/python2.7/site-packages/scipy/special/_testutils.py", line 344, in check
assert_(False, "\n".join(msg))
File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/utils.py", line 75, in assert_
raise AssertionError(smsg)
AssertionError:
Max |adiff|: 6.00153e+281
Max |rdiff|: 1.34067e-08
Bad results (2 out of 2612) for the following points (in output 0):
-599.4842503189409 215.44346900318823 => 7.496028765368905e+182 != 7.496028788983548e+182 (rdiff 3.150287113158528e-09)
-215.44346900318823 129.1549665014884 => 1.8527173939362396e+16 != 1.852717369097493e+16 (rdiff 1.340665727773665e-08)
```
The `hyp0f1` is not specific to this wheel I think, but the `struve` ones I've never seen before.
|
defect
|
and struveh struvel test failures with the bit linux wheel from pypi fail test mpmath testsystematic test complex traceback most recent call last file home rgommers local lib site packages nose case py line in runtest self test self arg file home rgommers local lib site packages numpy testing decorators py line in skipper func return f args kwargs file home rgommers local lib site packages scipy special tests test mpmath py line in test complex file home rgommers local lib site packages scipy special mptestutils py line in assert mpmath equal d check file home rgommers local lib site packages scipy special mptestutils py line in check reraise sys exc info file home rgommers local lib site packages scipy special mptestutils py line in check param filter self param filter file home rgommers local lib site packages scipy special testutils py line in assert func equal fdata check file home rgommers local lib site packages scipy special testutils py line in check assert false n join msg file home rgommers local lib site packages numpy testing utils py line in assert raise assertionerror smsg assertionerror max adiff max rdiff bad results out of for the following points in output rdiff rdiff fail test mpmath testsystematic test struveh traceback most recent call last file home rgommers local lib site packages nose case py line in runtest self test self arg file home rgommers local lib site packages numpy testing decorators py line in skipper func return f args kwargs file home rgommers local lib site packages scipy special tests test mpmath py line in test struveh rtol file home rgommers local lib site packages scipy special mptestutils py line in assert mpmath equal d check file home rgommers local lib site packages scipy special mptestutils py line in check reraise sys exc info file home rgommers local lib site packages scipy special mptestutils py line in check param filter self param filter file home rgommers local lib site packages scipy special testutils py line in assert func equal fdata check file home rgommers local lib site packages scipy special testutils py line in check assert false n join msg file home rgommers local lib site packages numpy testing utils py line in assert raise assertionerror smsg assertionerror max adiff max rdiff bad results out of for the following points in output rdiff rdiff rdiff rdiff rdiff rdiff rdiff rdiff rdiff rdiff rdiff rdiff rdiff rdiff rdiff rdiff rdiff fail test mpmath testsystematic test struvel traceback most recent call last file home rgommers local lib site packages nose case py line in runtest self test self arg file home rgommers local lib site packages numpy testing decorators py line in skipper func return f args kwargs file home rgommers local lib site packages scipy special tests test mpmath py line in test struvel ignore inf sign true file home rgommers local lib site packages scipy special mptestutils py line in assert mpmath equal d check file home rgommers local lib site packages scipy special mptestutils py line in check reraise sys exc info file home rgommers local lib site packages scipy special mptestutils py line in check param filter self param filter file home rgommers local lib site packages scipy special testutils py line in assert func equal fdata check file home rgommers local lib site packages scipy special testutils py line in check assert false n join msg file home rgommers local lib site packages numpy testing utils py line in assert raise assertionerror smsg assertionerror max adiff max rdiff bad results out of for the following points in output rdiff rdiff the is not specific to this wheel i think but the struve ones i ve never seen before
| 1
|
166,391
| 6,303,927,209
|
IssuesEvent
|
2017-07-21 14:48:38
|
craftercms/craftercms
|
https://api.github.com/repos/craftercms/craftercms
|
closed
|
[search] Log levels in Search are too high
|
enhancement Priority: Low
|
Please go through the logs in Search and tone down to more realistic/meaningful levels.
For example: [ERROR] 2017-04-07 18:43:48,392 [ajp-nio-8009-exec-7] [search.SearchServiceImpl] | Creating search index for site:123
Many logs with regard to insertion/deletion of elements are at INFO when perhaps they should be Debug.
Ping me to discuss details or adjust.
|
1.0
|
[search] Log levels in Search are too high - Please go through the logs in Search and tone down to more realistic/meaningful levels.
For example: [ERROR] 2017-04-07 18:43:48,392 [ajp-nio-8009-exec-7] [search.SearchServiceImpl] | Creating search index for site:123
Many logs with regard to insertion/deletion of elements are at INFO when perhaps they should be Debug.
Ping me to discuss details or adjust.
|
non_defect
|
log levels in search are too high please go through the logs in search and tone down to more realistic meaningful levels for example creating search index for site many logs with regard to insertion deletion of elements are at info when perhaps they should be debug ping me to discuss details or adjust
| 0
|
497,767
| 14,384,084,276
|
IssuesEvent
|
2020-12-02 09:59:37
|
wso2/carbon-apimgt
|
https://api.github.com/repos/wso2/carbon-apimgt
|
opened
|
Error Maven install
|
Docs/Has Impact Priority/Normal Type/Docs
|
### Description:
command: mvn clean install
but Its not working, its display me bellow error:
[ERROR] Failed to execute goal on project org.wso2.carbon.apimgt.impl: Could not resolve dependencies for project org.wso2.carbon.apimgt:org.wso2.carbon.apimgt.impl:bundle:6.8.112-SNAPSHOT: Failed to collect dependencies at com.sun.java:tools:jar:11.0.1: Failed to read artifact descriptor for com.sun.java:tools:jar:11.0.1: Could not transfer artifact com.sun.java:tools:pom:11.0.1 from/to central (https://repo.maven.apache.org/maven2): Transfer failed for https://repo.maven.apache.org/maven2/com/sun/java/tools/11.0.1/tools-11.0.1.pom: Remote host terminated the handshake: SSL peer shut down incorrectly
### Steps to reproduce:
I follow same your steps to get source code carbon-apimgt project but not working, its fail in step 3 which is Maven clean install

|
1.0
|
Error Maven install - ### Description:
command: mvn clean install
but Its not working, its display me bellow error:
[ERROR] Failed to execute goal on project org.wso2.carbon.apimgt.impl: Could not resolve dependencies for project org.wso2.carbon.apimgt:org.wso2.carbon.apimgt.impl:bundle:6.8.112-SNAPSHOT: Failed to collect dependencies at com.sun.java:tools:jar:11.0.1: Failed to read artifact descriptor for com.sun.java:tools:jar:11.0.1: Could not transfer artifact com.sun.java:tools:pom:11.0.1 from/to central (https://repo.maven.apache.org/maven2): Transfer failed for https://repo.maven.apache.org/maven2/com/sun/java/tools/11.0.1/tools-11.0.1.pom: Remote host terminated the handshake: SSL peer shut down incorrectly
### Steps to reproduce:
I follow same your steps to get source code carbon-apimgt project but not working, its fail in step 3 which is Maven clean install

|
non_defect
|
error maven install description command mvn clean install but its not working its display me bellow error failed to execute goal on project org carbon apimgt impl could not resolve dependencies for project org carbon apimgt org carbon apimgt impl bundle snapshot failed to collect dependencies at com sun java tools jar failed to read artifact descriptor for com sun java tools jar could not transfer artifact com sun java tools pom from to central transfer failed for remote host terminated the handshake ssl peer shut down incorrectly steps to reproduce i follow same your steps to get source code carbon apimgt project but not working its fail in step which is maven clean install
| 0
|
42,827
| 22,956,033,986
|
IssuesEvent
|
2022-07-19 11:39:43
|
umbraco/Umbraco-CMS
|
https://api.github.com/repos/umbraco/Umbraco-CMS
|
closed
|
Save space / memory by storing True/False values in a bit null column in the cmsPropertyData table
|
community/up-for-grabs category/performance type/feature status/stale
|
A brief description of your feature request goes here.
Feature request:
Save space / memory by storing True/False values in a bit nullcolumn in the cmsPropertyData table. Currently true/false values are stored in an int column which wastes ~30bits per record. I've saved ~40mb of memory by converting the int's to bool in c# during load into nucache.
|
True
|
Save space / memory by storing True/False values in a bit null column in the cmsPropertyData table - A brief description of your feature request goes here.
Feature request:
Save space / memory by storing True/False values in a bit nullcolumn in the cmsPropertyData table. Currently true/false values are stored in an int column which wastes ~30bits per record. I've saved ~40mb of memory by converting the int's to bool in c# during load into nucache.
|
non_defect
|
save space memory by storing true false values in a bit null column in the cmspropertydata table a brief description of your feature request goes here feature request save space memory by storing true false values in a bit nullcolumn in the cmspropertydata table currently true false values are stored in an int column which wastes per record i ve saved of memory by converting the int s to bool in c during load into nucache
| 0
|
12,628
| 2,712,177,008
|
IssuesEvent
|
2015-04-09 12:08:57
|
xgenvn/android-vnc-server
|
https://api.github.com/repos/xgenvn/android-vnc-server
|
closed
|
Compile error with android 2.3.4
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
1. cd android-vnc-server
2. mm
target Executable: androidvncserver (out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LINKED/androidvncserver)
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/rfbserver.o: in function
rfbSendFileTransferChunk:android-vnc-server/LibVNCServer-0.9.7/libvncserver/rfbserver.c:1364: error: undefined reference to 'compress'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/rfbserver.o: in function
rfbProcessFileTransfer:android-vnc-server/LibVNCServer-0.9.7/libvncserver/rfbserver.c:1626: error: undefined reference to 'uncompress'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/rfbserver.o: in function
rfbClientConnectionGone:android-vnc-server/LibVNCServer-0.9.7/libvncserver/rfbserver.c:525: error: undefined reference to 'deflateEnd'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/rfbserver.o: in function
rfbClientConnectionGone:android-vnc-server/LibVNCServer-0.9.7/libvncserver/rfbserver.c:531: error: undefined reference to 'deflateEnd'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/zlib.o: in function
rfbSendOneRectEncodingZlib:android-vnc-server/LibVNCServer-0.9.7/libvncserver/zlib.c:160: error: undefined reference to 'deflateInit2_'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/zlib.o: in function
rfbSendOneRectEncodingZlib:android-vnc-server/LibVNCServer-0.9.7/libvncserver/zlib.c:175: error: undefined reference to 'deflate'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/zrleoutstream.o: in function
zrleOutStreamOverrun:android-vnc-server/LibVNCServer-0.9.7/libvncserver/zrleoutstream.c:179: error: undefined reference to 'deflate'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/zrleoutstream.o: in function
zrleOutStreamFlush:android-vnc-server/LibVNCServer-0.9.7/libvncserver/zrleoutstream.c:132: error: undefined reference to 'deflate'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/zrleoutstream.o: in function
zrleOutStreamFree:android-vnc-server/LibVNCServer-0.9.7/libvncserver/zrleoutstream.c:99: error: undefined reference to 'deflateEnd'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/zrleoutstream.o: in function
zrleOutStreamNew:android-vnc-server/LibVNCServer-0.9.7/libvncserver/zrleoutstream.c:88: error: undefined reference to 'deflateInit_'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/tight.o: in function
CompressData:android-vnc-server/LibVNCServer-0.9.7/libvncserver/tight.c:920: error: undefined reference to 'deflateInit2_'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/tight.o: in function
CompressData:android-vnc-server/LibVNCServer-0.9.7/libvncserver/tight.c:937: error: undefined reference to 'deflateParams'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/tight.o: in function
CompressData:android-vnc-server/LibVNCServer-0.9.7/libvncserver/tight.c:944: error: undefined reference to 'deflate'
collect2: ld returned 1 exit status
make: *** [out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LINKED/androidvncserver] Error 1
make: Leaving directory `/home/tranzdawebbsp/workspace/rowboat_8148'
```
Original issue reported on code.google.com by `cww0...@gmail.com` on 9 May 2012 at 12:01
|
1.0
|
Compile error with android 2.3.4 - ```
What steps will reproduce the problem?
1. cd android-vnc-server
2. mm
target Executable: androidvncserver (out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LINKED/androidvncserver)
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/rfbserver.o: in function
rfbSendFileTransferChunk:android-vnc-server/LibVNCServer-0.9.7/libvncserver/rfbserver.c:1364: error: undefined reference to 'compress'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/rfbserver.o: in function
rfbProcessFileTransfer:android-vnc-server/LibVNCServer-0.9.7/libvncserver/rfbserver.c:1626: error: undefined reference to 'uncompress'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/rfbserver.o: in function
rfbClientConnectionGone:android-vnc-server/LibVNCServer-0.9.7/libvncserver/rfbserver.c:525: error: undefined reference to 'deflateEnd'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/rfbserver.o: in function
rfbClientConnectionGone:android-vnc-server/LibVNCServer-0.9.7/libvncserver/rfbserver.c:531: error: undefined reference to 'deflateEnd'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/zlib.o: in function
rfbSendOneRectEncodingZlib:android-vnc-server/LibVNCServer-0.9.7/libvncserver/zlib.c:160: error: undefined reference to 'deflateInit2_'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/zlib.o: in function
rfbSendOneRectEncodingZlib:android-vnc-server/LibVNCServer-0.9.7/libvncserver/zlib.c:175: error: undefined reference to 'deflate'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/zrleoutstream.o: in function
zrleOutStreamOverrun:android-vnc-server/LibVNCServer-0.9.7/libvncserver/zrleoutstream.c:179: error: undefined reference to 'deflate'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/zrleoutstream.o: in function
zrleOutStreamFlush:android-vnc-server/LibVNCServer-0.9.7/libvncserver/zrleoutstream.c:132: error: undefined reference to 'deflate'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/zrleoutstream.o: in function
zrleOutStreamFree:android-vnc-server/LibVNCServer-0.9.7/libvncserver/zrleoutstream.c:99: error: undefined reference to 'deflateEnd'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/zrleoutstream.o: in function
zrleOutStreamNew:android-vnc-server/LibVNCServer-0.9.7/libvncserver/zrleoutstream.c:88: error: undefined reference to 'deflateInit_'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/tight.o: in function
CompressData:android-vnc-server/LibVNCServer-0.9.7/libvncserver/tight.c:920: error: undefined reference to 'deflateInit2_'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/tight.o: in function
CompressData:android-vnc-server/LibVNCServer-0.9.7/libvncserver/tight.c:937: error: undefined reference to 'deflateParams'
prebuilt/linux-x86/toolchain/arm-eabi-4.4.3/bin/../lib/gcc/arm-eabi/4.4.3/../../../../arm-eabi/bin/ld:
out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LibVNCServer-0.9.7/libvncserver/tight.o: in function
CompressData:android-vnc-server/LibVNCServer-0.9.7/libvncserver/tight.c:944: error: undefined reference to 'deflate'
collect2: ld returned 1 exit status
make: *** [out/target/product/ti814xevm/obj/EXECUTABLES/androidvncserver_intermediates/LINKED/androidvncserver] Error 1
make: Leaving directory `/home/tranzdawebbsp/workspace/rowboat_8148'
```
Original issue reported on code.google.com by `cww0...@gmail.com` on 9 May 2012 at 12:01
|
defect
|
compile error with android what steps will reproduce the problem cd android vnc server mm target executable androidvncserver out target product obj executables androidvncserver intermediates lin ked androidvncserver prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver intermediates libv ncserver libvncserver rfbserver o in function rfbsendfiletransferchunk android vnc server libvncserver libvncserver rfbs erver c error undefined reference to compress prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver intermediates libv ncserver libvncserver rfbserver o in function rfbprocessfiletransfer android vnc server libvncserver libvncserver rfbser ver c error undefined reference to uncompress prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver intermediates libv ncserver libvncserver rfbserver o in function rfbclientconnectiongone android vnc server libvncserver libvncserver rfbse rver c error undefined reference to deflateend prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver intermediates libv ncserver libvncserver rfbserver o in function rfbclientconnectiongone android vnc server libvncserver libvncserver rfbse rver c error undefined reference to deflateend prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver intermediates libv ncserver libvncserver zlib o in function rfbsendonerectencodingzlib android vnc server libvncserver libvncserver zl ib c error undefined reference to prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver intermediates libv ncserver libvncserver zlib o in function rfbsendonerectencodingzlib android vnc server libvncserver 
libvncserver zl ib c error undefined reference to deflate prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver intermediates libv ncserver libvncserver zrleoutstream o in function zrleoutstreamoverrun android vnc server libvncserver libvncserver zrleouts tream c error undefined reference to deflate prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver intermediates libv ncserver libvncserver zrleoutstream o in function zrleoutstreamflush android vnc server libvncserver libvncserver zrleoutstr eam c error undefined reference to deflate prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver intermediates libv ncserver libvncserver zrleoutstream o in function zrleoutstreamfree android vnc server libvncserver libvncserver zrleoutstre am c error undefined reference to deflateend prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver intermediates libv ncserver libvncserver zrleoutstream o in function zrleoutstreamnew android vnc server libvncserver libvncserver zrleoutstrea m c error undefined reference to deflateinit prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver intermediates libv ncserver libvncserver tight o in function compressdata android vnc server libvncserver libvncserver tight c error undefined reference to prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver intermediates libv ncserver libvncserver tight o in function compressdata android vnc server libvncserver libvncserver tight c error undefined reference to deflateparams prebuilt linux toolchain arm eabi bin lib gcc arm eabi arm eabi bin ld out target product obj executables androidvncserver 
intermediates libv ncserver libvncserver tight o in function compressdata android vnc server libvncserver libvncserver tight c error undefined reference to deflate ld returned exit status make out target product obj executables androidvncserver intermediates lin ked androidvncserver error make leaving directory home tranzdawebbsp workspace rowboat original issue reported on code google com by gmail com on may at
| 1
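
Every unresolved symbol in the log above (`compress`, `uncompress`, `deflate`, `deflateEnd`, `deflateInit_`, `deflateInit2_`, `deflateParams`) belongs to zlib, so the linker is simply missing `libz`. The report does not include a fix, so the fragment below is an assumption about how the module's `Android.mk` would be amended:

```makefile
# Hypothetical fragment for android-vnc-server/Android.mk (the original
# module definition is not shown in the report). Adding libz to the link
# line lets ld resolve the compress/deflate family of symbols.
LOCAL_SHARED_LIBRARIES += libz
```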
|
321,558
| 9,805,367,449
|
IssuesEvent
|
2019-06-12 08:50:10
|
gilde-der-nacht/website
|
https://api.github.com/repos/gilde-der-nacht/website
|
closed
|
feedback rollenspieltage.ch v1.0
|
bug top priority
|
- [x] after sending form, you are redirected to https://gildedernacht.ch/contact/ but correct would be http://rollenspieltage.ch/contact/
- [x] footer on mobile doesn’t work: Facebook logo is overflowing, social logos hover effect renders underline
- [x] manual line-break in headline doesn’t look good on mobile: remove manual line-break
- [x] table (opening hours, startpage) looks weird on mobile
- [x] all relative links point to gildedernacht.ch/…
|
1.0
|
feedback rollenspieltage.ch v1.0 - - [x] after sending form, you are redirected to https://gildedernacht.ch/contact/ but correct would be http://rollenspieltage.ch/contact/
- [x] footer on mobile doesn’t work: Facebook logo is overflowing, social logos hover effect renders underline
- [x] manual line-break in headline doesn’t look good on mobile: remove manual line-break
- [x] table (opening hours, startpage) looks weird on mobile
- [x] all relative links point to gildedernacht.ch/…
|
non_defect
|
feedback rollenspieltage ch after sending form you are redirectet to but correct would be footer on mobile doesn’t work facebook logo is overflowing social logos hover effect renders underline manual line break in headline doesn’t look good on mobile remove manual line break table opening hours startpage looks weird on mobile all relative links point to gildedernacht ch …
| 0
|
73,675
| 14,113,433,379
|
IssuesEvent
|
2020-11-07 11:14:05
|
pygame/pygame
|
https://api.github.com/repos/pygame/pygame
|
closed
|
Should music.get_busy() detect pause or get_paused() be added?
|
C code Difficulty: Easy docs enhancement mixer.music
|
There is no way currently to detect if mixer.music is actually playing. get_busy() returns true when music is paused.
Will using SDL_MixPausedMusic() to detect paused music in mixer.music.get_busy() adversely effect legacy code? I don't think it will.
Should get_paused() be added? I think the above is better.
Note that SDL_MixerPausedMusic will return false if the music is stopped, a combination of the to calls is necessary to determine if music is actually playing.
I could add this functionality if it seems useful. It's not very important because users can code their own states, but it would be convenient. Also, I think the way that get_busy currently works is silly, quirky behaviour from SDL that could have been improved for pygame
**To Do**
- [ ] Make `get_busy()` track whether music is paused on pygame side and return False if music is paused.
- [ ] Document this change from pygame 1.
**Related Docs**: https://www.pygame.org/docs/ref/music.html#pygame.mixer.music.get_busy
|
1.0
|
Should music.get_busy() detect pause or get_paused() be added? - There is no way currently to detect if mixer.music is actually playing. get_busy() returns true when music is paused.
Will using SDL_MixPausedMusic() to detect paused music in mixer.music.get_busy() adversely effect legacy code? I don't think it will.
Should get_paused() be added? I think the above is better.
Note that SDL_MixerPausedMusic will return false if the music is stopped, a combination of the to calls is necessary to determine if music is actually playing.
I could add this functionality if it seems useful. It's not very important because users can code their own states, but it would be convenient. Also, I think the way that get_busy currently works is silly, quirky behaviour from SDL that could have been improved for pygame
**To Do**
- [ ] Make `get_busy()` track whether music is paused on pygame side and return False if music is paused.
- [ ] Document this change from pygame 1.
**Related Docs**: https://www.pygame.org/docs/ref/music.html#pygame.mixer.music.get_busy
|
non_defect
|
should music get busy detect pause or get paused be added there is no way currently to detect if mixer music is actually playing get busy returns true when music is paused will using sdl mixpausedmusic to detect paused music in mixer music get busy adversely effect legacy code i don t think it will should get paused be added i think the above is better note that sdl mixerpausedmusic will return false if the music is stopped a combination of the to calls is necessary to determine if music is actually playing i could add this functionality if it seems useful it s not very important because users can code their own states but it would be convenient also i think the way that get busy currently works is silly quirky behaviour from sdl that could have been improved for pygame to do make get busy track whether music is paused on pygame side and return false if music is paused document this change from pygame related docs
| 0
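
The to-do in the pygame row above asks `get_busy()` to track paused state on the pygame side. The idea can be sketched without pygame itself; the class and method names below are illustrative, not pygame's actual API:

```python
class MusicState:
    """Sketch of wrapper-side pause tracking, as the issue proposes:
    get_busy() should be True only while music is started and not paused."""

    def __init__(self):
        self._playing = False
        self._paused = False

    def play(self):
        # Starting (or restarting) playback clears any paused flag.
        self._playing = True
        self._paused = False

    def pause(self):
        # Pausing only makes sense while something is playing.
        if self._playing:
            self._paused = True

    def unpause(self):
        self._paused = False

    def stop(self):
        # Stopping resets both flags, matching the SDL note in the issue
        # that "paused" and "stopped" must be combined to know the real state.
        self._playing = False
        self._paused = False

    def get_busy(self):
        return self._playing and not self._paused
```

A real fix in pygame would keep such a flag in the `mixer.music` module and consult it inside `get_busy()`, alongside the existing SDL query.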
|
56,081
| 14,924,121,779
|
IssuesEvent
|
2021-01-23 22:08:51
|
colour-science/colour
|
https://api.github.com/repos/colour-science/colour
|
closed
|
DaVinci Wide Gamut - Primaries from White Paper
|
API Defect Minor
|
Hi,
The DaVinci Wide Gamut Primaries are wrong in your implementation.
Here the correct one from the Official White Paper.
red | 0.8000 | 0.3130
green | 0.1618 | 0.9877
blue | 0.0790 | -0.1155
white | 0.3127 | 0.3290
Hope it's help.
Thanks,
Fred Savoir
https://documents.blackmagicdesign.com/InformationNotes/DaVinci_Resolve_17_Wide_Gamut_Intermediate.pdf?_v=1607414410000
|
1.0
|
DaVinci Wide Gamut - Primaries from White Paper - Hi,
The DaVinci Wide Gamut Primaries are wrong in your implementation.
Here the correct one from the Official White Paper.
red | 0.8000 | 0.3130
green | 0.1618 | 0.9877
blue | 0.0790 | -0.1155
white | 0.3127 | 0.3290
Hope it's help.
Thanks,
Fred Savoir
https://documents.blackmagicdesign.com/InformationNotes/DaVinci_Resolve_17_Wide_Gamut_Intermediate.pdf?_v=1607414410000
|
defect
|
davinci wide gamut primaries from white paper hi the davinci wide gamut primaries are wrong in your implementation here the correct one from the official white paper red green blue white hope it s help thanks fred savoir
| 1
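
The corrected DaVinci Wide Gamut values from the report can be written down directly. The dictionary layout and helper below are illustrative only and are not the colour-science API; the numbers are exactly those quoted from the Blackmagic white paper:

```python
# Chromaticity coordinates (x, y) as quoted in the report.
DAVINCI_WIDE_GAMUT = {
    "red":   (0.8000,  0.3130),
    "green": (0.1618,  0.9877),
    "blue":  (0.0790, -0.1155),  # note the negative y for blue
    "white": (0.3127,  0.3290),
}

def is_d65(white, tol=1e-4):
    """Check a whitepoint against the CIE D65 chromaticity (0.3127, 0.3290)."""
    return abs(white[0] - 0.3127) < tol and abs(white[1] - 0.3290) < tol
```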
|
724,217
| 24,921,059,150
|
IssuesEvent
|
2022-10-30 23:53:46
|
blocklet/create-blocklet
|
https://api.github.com/repos/blocklet/create-blocklet
|
closed
|
Better create blocklet defaults (xmark)
|
priority/important-soon
|
- A layout that already looks good and is immediately usable
- Content that is itself explanatory, guiding users on how to modify and configure it
- Content that carries documentation links and simple customization pointers, helping users customize and modify it right away
|
1.0
|
Better create blocklet defaults (xmark) -
- A layout that already looks good and is immediately usable
- Content that is itself explanatory, guiding users on how to modify and configure it
- Content that carries documentation links and simple customization pointers, helping users customize and modify it right away
|
non_defect
|
更好的create blocklet 默认 xmark 一个已经看起来很漂亮并立刻可用的layout 内容本身是说明性的,引导用户如何修改和配置 内容本身带有文档的连接和简单的定制化指引,帮助用户立刻定制和修改
| 0
|
716,642
| 24,643,206,879
|
IssuesEvent
|
2022-10-17 13:11:32
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.aliexpress.com - see bug description
|
status-needsinfo browser-firefox priority-important engine-gecko
|
<!-- @browser: Firefox 105.0.3 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/106.0.0.0 Safari/537.36 -->
<!-- @reported_with: unknown -->
**URL**: https://www.aliexpress.com
**Browser / Version**: Firefox 105.0.3
**Operating System**: Windows 11
**Tested Another Browser**: Yes Chrome
**Problem type**: Something else
**Description**: Drop-down menu does not appear when clicking
**Steps to Reproduce**:
I tried to click on the drop-down menu that allows you to choose the language, country and currency. There is no problem if I don't log in to the site, but when I log in, nothing appears when I click. This does not happen in Chrome
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/10/3b31bd2f-e55d-44bc-b494-d23135f229b9.jpg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.aliexpress.com - see bug description - <!-- @browser: Firefox 105.0.3 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/106.0.0.0 Safari/537.36 -->
<!-- @reported_with: unknown -->
**URL**: https://www.aliexpress.com
**Browser / Version**: Firefox 105.0.3
**Operating System**: Windows 11
**Tested Another Browser**: Yes Chrome
**Problem type**: Something else
**Description**: Drop-down menu does not appear when clicking
**Steps to Reproduce**:
I tried to click on the drop-down menu that allows you to choose the language, country and currency. There is no problem if I don't log in to the site, but when I log in, nothing appears when I click. This does not happen in Chrome
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/10/3b31bd2f-e55d-44bc-b494-d23135f229b9.jpg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_defect
|
see bug description url browser version firefox operating system windows tested another browser yes chrome problem type something else description drop down menu does not appear when clicking steps to reproduce i tried to click on the drop down menu that allows you to choose the language country and currency there is no problem if i don t log in to the site but when i log in nothing appears when i click this does not happen in chrome view the screenshot img alt screenshot src browser configuration none from with ❤️
| 0
|
10,984
| 2,622,856,593
|
IssuesEvent
|
2015-03-04 08:08:05
|
max99x/dict-lookup-chrome-ext
|
https://api.github.com/repos/max99x/dict-lookup-chrome-ext
|
closed
|
Incorrect frame size due to box-sizing override on some sites
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
1.Just use it as usually
What is the expected output? What do you see instead?
- The frame should have rounded corners below, but it doesn't.
What version of the product are you using? On what operating system?
- Latest Chrome on Windows
Please provide any additional information below.
- Please see attached screen shot
```
Original issue reported on code.google.com by `haider....@gmail.com` on 21 Jan 2015 at 4:01
Attachments:
* [Dictionary Lookup Bug.jpg](https://storage.googleapis.com/google-code-attachments/dict-lookup-chrome-ext/issue-33/comment-0/Dictionary Lookup Bug.jpg)
|
1.0
|
Incorrect frame size due to box-sizing override on some sites - ```
What steps will reproduce the problem?
1.Just use it as usually
What is the expected output? What do you see instead?
- The frame should have rounded corners below, but it doesn't.
What version of the product are you using? On what operating system?
- Latest Chrome on Windows
Please provide any additional information below.
- Please see attached screen shot
```
Original issue reported on code.google.com by `haider....@gmail.com` on 21 Jan 2015 at 4:01
Attachments:
* [Dictionary Lookup Bug.jpg](https://storage.googleapis.com/google-code-attachments/dict-lookup-chrome-ext/issue-33/comment-0/Dictionary Lookup Bug.jpg)
|
defect
|
incorrect frame size due to box sizing override on some sites what steps will reproduce the problem just use it as usually what is the expected output what do you see instead the frame should have rounded corners below but it doesnt what version of the product are you using on what operating system latest chrome on windows please provide any additional information below please see attached screen shot original issue reported on code google com by haider gmail com on jan at attachments lookup bug jpg
| 1
|
29,760
| 5,873,105,735
|
IssuesEvent
|
2017-05-15 13:18:46
|
primefaces/primefaces
|
https://api.github.com/repos/primefaces/primefaces
|
closed
|
Resizing issue with Dialog Positioning
|
defect invalid
|
- Using PF 4.0.10
- At the bottom of a very long page add a button that open a dialog.
- Click button. Dialog will open.
- Resize dialog, dialog will disappear.
In fact, the dialog does not disappear, it change the coordinates with strange "top" value (like -2224 for example).
I can even reproduce it in Showcase by changing a component height before the button that opens the dialog. You have to scroll to reach the button to reproduce the problem.
|
1.0
|
Resizing issue with Dialog Positioning - - Using PF 4.0.10
- At the bottom of a very long page add a button that open a dialog.
- Click button. Dialog will open.
- Resize dialog, dialog will disappear.
In fact, the dialog does not disappear, it change the coordinates with strange "top" value (like -2224 for example).
I can even reproduce it in Showcase by changing a component height before the button that opens the dialog. You have to scroll to reach the button to reproduce the problem.
|
defect
|
resizing issue with dialog positioning using pf at the bottom of a very long page add a button that open a dialog click button dialog will open resize dialog dialog will disappear in fact the dialog does not disappear it change the coordinates with strange top value like for example i can even reproduce it in showcase by changing a component height before button that open dialog you have to scoll to reach the button to reproduce the problem
| 1
|
72,993
| 24,396,535,349
|
IssuesEvent
|
2022-10-04 19:49:25
|
vector-im/element-web
|
https://api.github.com/repos/vector-im/element-web
|
closed
|
sender_claimed_keys not always included in E2E room key export even though required
|
T-Defect S-Tolerable A-E2EE A-E2EE-Key-Backup Z-Spec-Compliance O-Uncommon
|
The spec says it is required https://spec.matrix.org/unstable/client-server-api/#key-export-format
and it breaks tools like https://github.com/russelldavies/matrix-archive which expect it to exist
https://matrix.to/#/!JiiOHXrIUCtcOJsZCa:matrix.org/$2fsSk0WCGPOUzMOMBJZgyUuA_ZgiQoZHf9o1MbnAudQ?via=matrix.org&via=privacytools.io&via=converser.eu
<img width="240" src="https://user-images.githubusercontent.com/5855073/123015107-4667f680-d38d-11eb-8512-2a3ce20603d0.png">
|
1.0
|
sender_claimed_keys not always included in E2E room key export even though required - The spec says it is required https://spec.matrix.org/unstable/client-server-api/#key-export-format
and it breaks tools like https://github.com/russelldavies/matrix-archive which expect it to exist
https://matrix.to/#/!JiiOHXrIUCtcOJsZCa:matrix.org/$2fsSk0WCGPOUzMOMBJZgyUuA_ZgiQoZHf9o1MbnAudQ?via=matrix.org&via=privacytools.io&via=converser.eu
<img width="240" src="https://user-images.githubusercontent.com/5855073/123015107-4667f680-d38d-11eb-8512-2a3ce20603d0.png">
|
defect
|
sender claimed keys not always included in room key export even though required the spec says it is required and it breaks tools like which expect it to exist img width src
| 1
|
26,672
| 27,064,622,198
|
IssuesEvent
|
2023-02-13 22:53:10
|
tailscale/tailscale
|
https://api.github.com/repos/tailscale/tailscale
|
closed
|
TS_AUTHKEY doesnt work in a sidecar with containerboot on tailscale/tailscale:latest
|
OS-kubernetes L1 Very few P2 Aggravating T5 Usability bug
|
### What is the issue?
building a k8s pod with tailscale as a sidecar failed to auth.
all the docs and the scripts refer to TS_AUTHKEY.
including the docs in tailscale/docs/k8
if you rename TS_AUTHKEY to TS_AUTH_KEY it does work.
### Steps to reproduce
you can add command: ["/bin/sh", "-c", "env"] to your k8 pod
to see the env vars are set, yet if you let the image run containerboot as normal it fails to find the auth key and asks you to login manually.
IF you change the env var definition in your k8s pod to
TS_AUTH_KEY then it works like a charm.
### Are there any recent changes that introduced the issue?
_No response_
### OS
_No response_
### OS version
ubuntu on WSL, docker desktop k8s enabled. Windows 10.
### Tailscale version
latest. as of 2/2/2023
### Other software
using a tailscale auth key generated as
- multi use,
- ephemeral
- tagged
### Bug report
_No response_
|
True
|
TS_AUTHKEY doesnt work in a sidecar with containerboot on tailscale/tailscale:latest - ### What is the issue?
building a k8s pod with tailscale as a sidecar failed to auth.
all the docs and the scripts refer to TS_AUTHKEY.
including the docs in tailscale/docs/k8
if you rename TS_AUTHKEY to TS_AUTH_KEY it does work.
### Steps to reproduce
you can add command: ["/bin/sh", "-c", "env"] to your k8 pod
to see the env vars are set, yet if you let the image run containerboot as normal it fails to find the auth key and asks you to login manually.
IF you change the env var definition in your k8s pod to
TS_AUTH_KEY then it works like a charm.
### Are there any recent changes that introduced the issue?
_No response_
### OS
_No response_
### OS version
ubuntu on WSL, docker desktop k8s enabled. Windows 10.
### Tailscale version
latest. as of 2/2/2023
### Other software
using a tailscale auth key generated as
- multi use,
- ephemeral
- tagged
### Bug report
_No response_
|
non_defect
|
ts authkey doesnt work in a sidecar with containerboot on tailscale tailscale latest what is the issue building a pod with tailscale as a sidecar failed to auth all the docs and the scripts refer to ts authkey including the docs in tailscale docs if you rename ts authkey to ts auth key it does work steps to reproduce you can add command to your pod to see the env vars are set yet if you let the image run containerboot as normal it fails to find the auth key and asks you to login manually if you change the env var definition in your pod to ts auth key then it works like a charm are there any recent changes that introduced the issue no response os no response os version ubuntu on wsl docker desktop enabled windows tailscale version latest as of other software using a tailscale auth key generated as multi use ephemeral tagged bug report no response
| 0
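
The Tailscale row above boils down to an environment-variable spelling mismatch. A minimal sidecar spec matching the workaround described in the report looks roughly like the fragment below; the secret name and key are placeholders, not values from the report:

```yaml
# Hypothetical pod fragment: at the time of the report the image honoured
# TS_AUTH_KEY even though the docs and scripts said TS_AUTHKEY.
containers:
  - name: ts-sidecar
    image: tailscale/tailscale:latest
    env:
      - name: TS_AUTH_KEY   # the spelling the container actually read
        valueFrom:
          secretKeyRef:
            name: tailscale-auth
            key: authkey
```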
|
51,170
| 13,201,990,439
|
IssuesEvent
|
2020-08-14 11:19:30
|
STEllAR-GROUP/hpx
|
https://api.github.com/repos/STEllAR-GROUP/hpx
|
closed
|
Undefined reference to main build error when HPX_WITH_DYNAMIC_HPX_MAIN=OFF
|
category: init type: defect
|
## Expected Behavior
I had been using HPX 1.3.0 for my [Blazemark](https://bitbucket.org/blaze-lib/blaze/wiki/Blazemark)(benchmark suite for Blaze) experiments but when I updated HPX I wasn't able to build a benchmark in Blazemark anymore.
I had always built HPX with `-DHPX_WITH_DYNAMIC_HPX_MAIN=OFF` for these runs. If I don't specify this flag and link the benchmark with hpx_wrap I am able to build it with the updated HPX, but I get an error when hpx is built with `-DHPX_WITH_DYNAMIC_HPX_MAIN=OFF` .
## Actual Behavior
I get this error message when I try to build `dmatdmatadd` benchmark from Blazemark.
```
/usr/lib/gcc/x86_64-redhat-linux/4.8.5/../../../../lib64/crt1.o: In function `_start':
(.text+0x20): undefined reference to `main'
clang-6.0: error: linker command failed with exit code 1 (use -v to see invocation)
```
## Steps to Reproduce the Problem
... Please be as specific as possible while describing how to reproduce your problem.
1. ```git clone git@bitbucket.org:blaze-lib/blaze.git```
2. ```cd blazemark```
3. change [Configfile](https://bitbucket.org/blaze-lib/blaze/src/master/blazemark/Configfile) as follows:
```
# Compiler selection
# This variable specifies the compiler used for the compilation of all benchmarks.
CXX=clang++
# Special compiler flags
# This variable specifies the compiler flags used for the compilation of all benchmarks.
CXXFLAGS="-O3 -std=c++17 -stdlib=libc++ -DNDEBUG -march=native -fpermissive -DBLAZE_USE_HPX_THREADS"
# Special include directives
# This variable can be used to specify special/additional include-related compiler directives.
INCLUDE_DIRECTIVES="-isystem /home/sshirzad/lib/hpx/hpx_release_clang/include/"
# Special library directives
# This variable can be used to specify special/additional library-related compiler directives.
LIBRARY_DIRECTIVES="-L/home/sshirzad/lib/hpx/hpx_release_clang/lib64 -lhpx -rdynamic /home/sshirzad/lib/hpx/hpx_release_clang/lib64/libhpx_init.a -ldl -lrt -L/opt/apps/clang6/boost/1.68.0-clang6/release/lib -lboost_system -lboost_program_options -pthread"
```
4. ```./configure Configfile```
5. ```make dmatdmatadd```
## Specifications
- HPX Version: master
- Platform (compiler, OS): clang 6.0.1, linux (marvin node on Rostam)
|
1.0
|
Undefined reference to main build error when HPX_WITH_DYNAMIC_HPX_MAIN=OFF - ## Expected Behavior
I had been using HPX 1.3.0 for my [Blazemark](https://bitbucket.org/blaze-lib/blaze/wiki/Blazemark)(benchmark suite for Blaze) experiments but when I updated HPX I wasn't able to build a benchmark in Blazemark anymore.
I had always built HPX with `-DHPX_WITH_DYNAMIC_HPX_MAIN=OFF` for these runs. If I don't specify this flag and link the benchmark with hpx_wrap I am able to build it with the updated HPX, but I get an error when hpx is built with `-DHPX_WITH_DYNAMIC_HPX_MAIN=OFF` .
## Actual Behavior
I get this error message when I try to build `dmatdmatadd` benchmark from Blazemark.
```
/usr/lib/gcc/x86_64-redhat-linux/4.8.5/../../../../lib64/crt1.o: In function `_start':
(.text+0x20): undefined reference to `main'
clang-6.0: error: linker command failed with exit code 1 (use -v to see invocation)
```
## Steps to Reproduce the Problem
... Please be as specific as possible while describing how to reproduce your problem.
1. ```git clone git@bitbucket.org:blaze-lib/blaze.git```
2. ```cd blazemark```
3. change [Configfile](https://bitbucket.org/blaze-lib/blaze/src/master/blazemark/Configfile) as follows:
```
# Compiler selection
# This variable specifies the compiler used for the compilation of all benchmarks.
CXX=clang++
# Special compiler flags
# This variable specifies the compiler flags used for the compilation of all benchmarks.
CXXFLAGS="-O3 -std=c++17 -stdlib=libc++ -DNDEBUG -march=native -fpermissive -DBLAZE_USE_HPX_THREADS"
# Special include directives
# This variable can be used to specify special/additional include-related compiler directives.
INCLUDE_DIRECTIVES="-isystem /home/sshirzad/lib/hpx/hpx_release_clang/include/"
# Special library directives
# This variable can be used to specify special/additional library-related compiler directives.
LIBRARY_DIRECTIVES="-L/home/sshirzad/lib/hpx/hpx_release_clang/lib64 -lhpx -rdynamic /home/sshirzad/lib/hpx/hpx_release_clang/lib64/libhpx_init.a -ldl -lrt -L/opt/apps/clang6/boost/1.68.0-clang6/release/lib -lboost_system -lboost_program_options -pthread"
```
4. ```./configure Configfile```
5. ```make dmatdmatadd```
## Specifications
- HPX Version: master
- Platform (compiler, OS): clang 6.0.1, linux (marvin node on Rostam)
|
defect
|
undefined reference to main build error when hpx with dynamic hpx main off expected behavior i had been using hpx for my suite for blaze experiments but when i updated hpx i wasn t able to build a benchmark in blazemark anymore i had always built hpx with dhpx with dynamic hpx main off for these runs if i don t specify this flag and link the benchmark with hpx wrap i am able to build it with the updated hpx but i get an error when hpx is built with dhpx with dynamic hpx main off actual behavior i get this error message when i try to build dmatdmatadd benchmark from blazemark usr lib gcc redhat linux o in function start text undefined reference to main clang error linker command failed with exit code use v to see invocation steps to reproduce the problem please be as specific as possible while describing how to reproduce your problem git clone git bitbucket org blaze lib blaze git cd blazemark change as follows compiler selection this variable specifies the compiler used for the compilation of all benchmarks cxx clang special compiler flags this variable specifies the compiler flags used for the compilation of all benchmarks cxxflags std c stdlib libc dndebug march native fpermissive dblaze use hpx threads special include directives this variable can be used to specify special additional include related compiler directives include directives isystem home sshirzad lib hpx hpx release clang include special library directives this variable can be used to specify special additional library related compiler directives library directives l home sshirzad lib hpx hpx release clang lhpx rdynamic home sshirzad lib hpx hpx release clang libhpx init a ldl lrt l opt apps boost release lib lboost system lboost program options pthread configure configfile make dmatdmatadd specifications hpx version master platform compiler os clang linux marvin node on rostam
| 1
|
57,262
| 15,728,350,083
|
IssuesEvent
|
2021-03-29 13:43:43
|
danmar/testissues
|
https://api.github.com/repos/danmar/testissues
|
opened
|
(style) the scope of variable < > can be limited (Trac #203)
|
False positive Incomplete Migration Migrated from Trac defect hyd_danmar
|
Migrated from https://trac.cppcheck.net/ticket/203
```json
{
"status": "closed",
"changetime": "2009-03-24T19:31:22",
"description": "checking the code\n\n{{{\nunsigned short foo()\n{\n test_client CClient;\n\n try\n {\n if (CClient.Open())\n {\n return 0;\n }\n }\n catch (...)\n {\n return 2;\n }\n\n try\n {\n CClient.Close();\n }\n catch (...)\n {\n return 2;\n }\n\n return 1;\n}\n\nreturns:\n\n[test.cpp:5]: (style) The scope of the variable CClient can be limited\n\nBut this is not true.\n\n}}}\n",
"reporter": "ettlmartin",
"cc": "",
"resolution": "fixed",
"_ts": "1237923082000000",
"component": "False positive",
"summary": "(style) the scope of variable < > can be limited",
"priority": "",
"keywords": "",
"time": "2009-03-22T21:30:33",
"milestone": "1.31",
"owner": "hyd_danmar",
"type": "defect"
}
```
|
1.0
|
(style) the scope of variable < > can be limited (Trac #203) - Migrated from https://trac.cppcheck.net/ticket/203
```json
{
"status": "closed",
"changetime": "2009-03-24T19:31:22",
"description": "checking the code\n\n{{{\nunsigned short foo()\n{\n test_client CClient;\n\n try\n {\n if (CClient.Open())\n {\n return 0;\n }\n }\n catch (...)\n {\n return 2;\n }\n\n try\n {\n CClient.Close();\n }\n catch (...)\n {\n return 2;\n }\n\n return 1;\n}\n\nreturns:\n\n[test.cpp:5]: (style) The scope of the variable CClient can be limited\n\nBut this is not true.\n\n}}}\n",
"reporter": "ettlmartin",
"cc": "",
"resolution": "fixed",
"_ts": "1237923082000000",
"component": "False positive",
"summary": "(style) the scope of variable < > can be limited",
"priority": "",
"keywords": "",
"time": "2009-03-22T21:30:33",
"milestone": "1.31",
"owner": "hyd_danmar",
"type": "defect"
}
```
|
defect
|
style the scope of variable can be limited trac migrated from json status closed changetime description checking the code n n nunsigned short foo n n test client cclient n n try n n if cclient open n n return n n n catch n n return n n n try n n cclient close n n catch n n return n n n return n n nreturns n n style the scope of the variable cclient can be limited n nbut this is not true n n n reporter ettlmartin cc resolution fixed ts component false positive summary style the scope of variable can be limited priority keywords time milestone owner hyd danmar type defect
| 1
|
70,453
| 23,173,329,109
|
IssuesEvent
|
2022-07-31 03:14:07
|
colour-science/colour
|
https://api.github.com/repos/colour-science/colour
|
closed
|
[BUG]: sd_to_XYZ with k=683 is 100x larger than expected
|
Defect
|
### Description
The constant 683 is often used to convert from a SPD in measurements of radiometric watts to a candela referenced value.
In the equation for computing tristimulus values, if the dλ is 1nm, then the provided example code should result in a value close to Y = 683. However, sd_to_XYZ currently returns 68300.
Furthermore, for those of us who work in absolute colorimetry (FTW 😁) the exact value of the candela constant is calculated according to the procedure below. It would be nice to include some code for this constant, but I'll settle for just making the 683 assumption correct.
> The candela [...] is defined by taking the fixed numerical value of the luminous efficacy of monochromatic radiation of frequency 540 × 10¹² Hz, K_cd, to be 683 when expressed in the unit lm W⁻¹, which is equal to cd sr W⁻¹, or cd sr kg⁻¹ m⁻² s³, where the kilogram, metre and second are defined in terms of [h](https://en.wikipedia.org/wiki/Planck_constant), [c](https://en.wikipedia.org/wiki/Speed_of_light_in_vacuum) and [ΔνCs](https://en.wikipedia.org/wiki/Unperturbed_ground_state_hyperfine_transition_frequency_of_the_caesium_133_atom).
### Code for Reproduction
```python
import colour
from colour import SpectralDistribution, SpectralShape, sd_multi_leds, sd_to_XYZ
from colour.plotting import plot_single_sd
import numpy as np
shape = SpectralShape(380,780,1)
values = np.zeros(401)
values[780-555]
spd = SpectralDistribution(np.zeros((401)), domain=shape)
v = spd.values
v[555-380] = 1 #SPD is 1W at 555nm, 0 everywhere else.
spd.values = v
xyz = sd_to_XYZ(spd, k=683, method="integration")
print(xyz)
```
### Exception Message
_No response_
### Environment Information
_No response_
|
1.0
|
[BUG]: sd_to_XYZ with k=683 is 100x larger than expected - ### Description
The constant 683 is often used to convert from a SPD in measurements of radiometric watts to a candela referenced value.
In the equation for computing tristimulus values, if the dλ is 1nm, then the provided example code should result in a value close to Y = 683. However, sd_to_XYZ currently returns 68300.
Furthermore, for those of us who work in absolute colorimetry (FTW 😁) the exact value of the candela constant is calculated according to the procedure below. It would be nice to include some code for this constant, but I'll settle for just making the 683 assumption correct.
> The candela [...] is defined by taking the fixed numerical value of the luminous efficacy of monochromatic radiation of frequency 540 × 10¹² Hz, K_cd, to be 683 when expressed in the unit lm W⁻¹, which is equal to cd sr W⁻¹, or cd sr kg⁻¹ m⁻² s³, where the kilogram, metre and second are defined in terms of [h](https://en.wikipedia.org/wiki/Planck_constant), [c](https://en.wikipedia.org/wiki/Speed_of_light_in_vacuum) and [ΔνCs](https://en.wikipedia.org/wiki/Unperturbed_ground_state_hyperfine_transition_frequency_of_the_caesium_133_atom).
### Code for Reproduction
```python
import colour
from colour import SpectralDistribution, SpectralShape, sd_multi_leds, sd_to_XYZ
from colour.plotting import plot_single_sd
import numpy as np
shape = SpectralShape(380,780,1)
values = np.zeros(401)
values[780-555]
spd = SpectralDistribution(np.zeros((401)), domain=shape)
v = spd.values
v[555-380] = 1 #SPD is 1W at 555nm, 0 everywhere else.
spd.values = v
xyz = sd_to_XYZ(spd, k=683, method="integration")
print(xyz)
```
### Exception Message
_No response_
### Environment Information
_No response_
|
defect
|
sd to xyz with k is larger than expected description the constant is often used to convert from a spd in measurements of radiometric watts to a candela referenced value in the equation for computing tristimulus values if the dλ is then the provided example code should result in a value close to y however sd to xyz currently returns furthermore for those of us who work in absolute colorimetry ftw 😁 the exact value of the candela constant is calculated according to the procedure below it would be nice to include some code for this constant but i ll settle for just making the assumption correct the candela is defined by taking the fixed numerical value of the luminous efficacy of monochromatic radiation of frequency × hz kcd to be when expressed in the unit lm w– which is equal to cd sr w– or cd sr kg– m– where the kilogram metre and second are defined in terms of and code for reproduction python import colour from colour import spectraldistribution spectralshape sd multi leds sd to xyz from colour plotting import plot single sd import numpy as np shape spectralshape values np zeros values spd spectraldistribution np zeros domain shape v spd values v spd is at everywhere else spd values v xyz sd to xyz spd k method integration print xyz exception message no response environment information no response
| 1
|
315,815
| 23,598,947,924
|
IssuesEvent
|
2022-08-23 22:32:23
|
MIT-CAVE/cave_app
|
https://api.github.com/repos/MIT-CAVE/cave_app
|
closed
|
General Documentation Updates
|
documentation
|
This should focus on the new `format` functions as well as the App Bar
|
1.0
|
General Documentation Updates - This should focus on the new `format` functions as well as the App Bar
|
non_defect
|
general documentation updates this should focus on the new format functions as well as the app bar
| 0
|
29,886
| 5,947,946,670
|
IssuesEvent
|
2017-05-26 09:50:28
|
primefaces/primeng
|
https://api.github.com/repos/primefaces/primeng
|
closed
|
Resizable columns can break Scrollable DataTable
|
defect
|
In cases where scrollWidth is bigger than total column width, table is misaligned like;
```xml
<p-dataTable [value]="cars" scrollable="true" scrollHeight="200px" resizableColumns="true">
<p-header>Vertical</p-header>
<p-column field="vin" header="Vin" [style]="{width: '25%'}"></p-column>
<p-column field="year" header="Year" [style]="{width: '25%'}"></p-column>
<p-column field="brand" header="Brand" [style]="{width: '50%'}"></p-column>
</p-dataTable>
```
|
1.0
|
Resizable columns can break Scrollable DataTable - In cases where scrollWidth is bigger than total column width, table is misaligned like;
```xml
<p-dataTable [value]="cars" scrollable="true" scrollHeight="200px" resizableColumns="true">
<p-header>Vertical</p-header>
<p-column field="vin" header="Vin" [style]="{width: '25%'}"></p-column>
<p-column field="year" header="Year" [style]="{width: '25%'}"></p-column>
<p-column field="brand" header="Brand" [style]="{width: '50%'}"></p-column>
</p-dataTable>
```
|
defect
|
resizable columns can break scrollable datatable in cases where scrollwidth is bigger than total column width table is misaligned like xml vertical
| 1
|
643,433
| 20,957,367,364
|
IssuesEvent
|
2022-03-27 09:27:59
|
cl8n/project-loved-web
|
https://api.github.com/repos/cl8n/project-loved-web
|
opened
|
Nginx doesn't shut down immediately in development build
|
priority:1
|
again :upside_down_face: probably indicative of some other more important problem for live
|
1.0
|
Nginx doesn't shut down immediately in development build - again :upside_down_face: probably indicative of some other more important problem for live
|
non_defect
|
nginx doesn t shut down immediately in development build again upside down face probably indicative of some other more important problem for live
| 0
|
37,283
| 8,328,732,021
|
IssuesEvent
|
2018-09-27 02:27:42
|
netTiers/netTiers
|
https://api.github.com/repos/netTiers/netTiers
|
closed
|
Generating Sub instead of Function
|
Priority-Medium Type-Defect auto-migrated
|
```
I have added a stored procedure which executes a select and Returns a dataset.
After using NetTiers CodeSmith Generator, a project is created. I searched the
project for my new stored procedure. I find the new stored procedure is coded
as a subroutine. It should be a function. Given some input the stored
procedure returns a dataset. The nettiers code calling the stored procedure
views it as not returning any output (sub) and not a function.
Is there something wrong with the stored procedure? Other?
I am using CodeSmith Generator 7.0.1.15136
NetTiers 2.3
```
Original issue reported on code.google.com by `noidfac...@gmail.com` on 11 Aug 2014 at 4:13
|
1.0
|
Generating Sub instead of Function - ```
I have added a stored procedure which executes a select and Returns a dataset.
After using NetTiers CodeSmith Generator, a project is created. I searched the
project for my new stored procedure. I find the new stored procedure is coded
as a subroutine. It should be a function. Given some input the stored
procedure returns a dataset. The nettiers code calling the stored procedure
views it as not returning any output (sub) and not a function.
Is there something wrong with the stored procedure? Other?
I am using CodeSmith Generator 7.0.1.15136
NetTiers 2.3
```
Original issue reported on code.google.com by `noidfac...@gmail.com` on 11 Aug 2014 at 4:13
|
defect
|
generating sub instead of function i have added a stored procedure which executes a select and returns a dataset after using nettiers codesmith generator a project is created i searched the project for my new stored procedure i find the new stored procedure is coded as a subroutine it should be a function given some input the stored procedure returns a dataset the nettiers code calling the stored procedure views it as not returning any output sub and not a function is there something wrong with the stored procedure other i am using codesmith generator nettiers original issue reported on code google com by noidfac gmail com on aug at
| 1
|
456,180
| 13,146,378,839
|
IssuesEvent
|
2020-08-08 09:29:42
|
wso2/product-apim
|
https://api.github.com/repos/wso2/product-apim
|
closed
|
Tokens cannot be generated with shared scopes
|
Migration Priority/Normal Type/Bug
|
### Description:
In 2.0 to 3.2 migrated setup tokens cannot be generated from shared scopes.
### Steps to reproduce:
1. Migrate to 3.2 from 2.0
2. Create and Attach a shared scope to an API
3. Subscribe to the API and generate an access token for the shared scope
4. The generated token contains only default scopes
### Affected Product Version:
APIM 2.0 to 3.2 Migration
### Environment details (with versions):
- OS:
- Client:
- Env (Docker/K8s):
---
### Optional Fields
#### Related Issues:
<!-- Any related issues from this/other repositories-->
#### Suggested Labels:
<!--Only to be used by non-members-->
#### Suggested Assignees:
<!--Only to be used by non-members-->
|
1.0
|
Tokens cannot be generated with shared scopes - ### Description:
In 2.0 to 3.2 migrated setup tokens cannot be generated from shared scopes.
### Steps to reproduce:
1. Migrate to 3.2 from 2.0
2. Create and Attach a shared scope to an API
3. Subscribe to the API and generate an access token for the shared scope
4. The generated token contains only default scopes
### Affected Product Version:
APIM 2.0 to 3.2 Migration
### Environment details (with versions):
- OS:
- Client:
- Env (Docker/K8s):
---
### Optional Fields
#### Related Issues:
<!-- Any related issues from this/other repositories-->
#### Suggested Labels:
<!--Only to be used by non-members-->
#### Suggested Assignees:
<!--Only to be used by non-members-->
|
non_defect
|
tokens cannot be generated with shared scopes description in to migrated setup tokens cannot be generated from shared scopes steps to reproduce migrate to from create and attach a shared scope to an api subscribe to the api and generate an access token for the shared scope the generated token contains only default scopes affected product version apim to migration environment details with versions os client env docker optional fields related issues suggested labels suggested assignees
| 0
|
70,945
| 23,384,712,038
|
IssuesEvent
|
2022-08-11 12:50:47
|
department-of-veterans-affairs/va.gov-cms
|
https://api.github.com/repos/department-of-veterans-affairs/va.gov-cms
|
opened
|
Meta Information table in CMS should be marked as presentation
|
Needs refining ⭐️ Sitewide CMS 508/Accessibility 508-defect-2
|
## Description
On the View screens in the CMS, the meta information is displayed in a table. It seems that the intention of this is for layout purposes in which case the table should be marked with `role="presentation"` to indicate to screen readers that the semantic markup of the table should be ignored.
## Screenshot

## Accessibility Standard
WCAG version 2.0 A, [Criterion 1.3.1](https://www.w3.org/WAI/WCAG21/Understanding/info-and-relationships.html)
## Acceptance Criteria
- [ ] Confirm the intent of the table was for layout purposes
- [ ] Technical review
- [ ] Change management consulted
- [ ] Implementation ticket created
### CMS Team
Please check the team(s) that will do this work.
- [ ] `Program`
- [ ] `Platform CMS Team`
- [ ] `Sitewide Crew`
- [ ] `⭐️ Sitewide CMS`
- [ ] `⭐️ Public Websites`
- [ ] `⭐️ Facilities`
- [ ] `⭐️ User support`
|
1.0
|
Meta Information table in CMS should be marked as presentation - ## Description
On the View screens in the CMS, the meta information is displayed in a table. It seems that the intention of this is for layout purposes in which case the table should be marked with `role="presentation"` to indicate to screen readers that the semantic markup of the table should be ignored.
## Screenshot

## Accessibility Standard
WCAG version 2.0 A, [Criterion 1.3.1](https://www.w3.org/WAI/WCAG21/Understanding/info-and-relationships.html)
## Acceptance Criteria
- [ ] Confirm the intent of the table was for layout purposes
- [ ] Technical review
- [ ] Change management consulted
- [ ] Implementation ticket created
### CMS Team
Please check the team(s) that will do this work.
- [ ] `Program`
- [ ] `Platform CMS Team`
- [ ] `Sitewide Crew`
- [ ] `⭐️ Sitewide CMS`
- [ ] `⭐️ Public Websites`
- [ ] `⭐️ Facilities`
- [ ] `⭐️ User support`
|
defect
|
meta information table in cms should be marked as presentation description on the view screens in the cms the meta information is displayed in a table it seems that the intention of this is for layout purposes in which case the table should be marked with role presentation to indicate to screen readers that the semantic markup of the table should be ignored screenshot accessibility standard wcag version a acceptance criteria confirm the intent of the table was for layout purposes technical review change management consulted implementation ticket created cms team please check the team s that will do this work program platform cms team sitewide crew ⭐️ sitewide cms ⭐️ public websites ⭐️ facilities ⭐️ user support
| 1
|
79,656
| 28,496,030,654
|
IssuesEvent
|
2023-04-18 14:17:43
|
vector-im/element-desktop
|
https://api.github.com/repos/vector-im/element-desktop
|
opened
|
Cannot run Element as root
|
T-Defect
|
### Steps to reproduce
Launch as root Element with `--no-sandbox` because of electron:
>Running as root without `--no-sandbox` is not supported.
### Outcome
#### What did you expect?
To be able to run Element as root.
#### What happened instead?
Element doesn't start, I can see a lot of errors with:
`FATAL:electron_main_delegate.cc(252)] Running as root without --no-sandbox is not supported. See https://crbug.com/638180.`
### Operating system
Debian 11
### Application version
Version 1.9.4
### How did you install the app?
https://flathub.org/apps/details/im.riot.Riot
### Homeserver
matrix.org
### Will you send logs?
I can't now eventually yes.
|
1.0
|
Cannot run Element as root - ### Steps to reproduce
Launch as root Element with `--no-sandbox` because of electron:
>Running as root without `--no-sandbox` is not supported.
### Outcome
#### What did you expect?
To be able to run Element as root.
#### What happened instead?
Element doesn't start, I can see a lot of errors with:
`FATAL:electron_main_delegate.cc(252)] Running as root without --no-sandbox is not supported. See https://crbug.com/638180.`
### Operating system
Debian 11
### Application version
Version 1.9.4
### How did you install the app?
https://flathub.org/apps/details/im.riot.Riot
### Homeserver
matrix.org
### Will you send logs?
I can't now eventually yes.
|
defect
|
cannot run element as root steps to reproduce launch as root element with no sandbox because of electron running as root without no sandbox is not supported outcome what did you expect to be able to run element as root what happened instead element doesn t start i can see a lot of errors with fatal electron main delegate cc running as root without no sandbox is not supported see operating system debian application version version how did you install the app homeserver matrix org will you send logs i can t now eventually yes
| 1
|
12,987
| 2,732,684,681
|
IssuesEvent
|
2015-04-17 08:32:25
|
troessner/reek
|
https://api.github.com/repos/troessner/reek
|
closed
|
Feature Envy reported even when it is reporting Utility Function
|
defect
|
For example:
class Level1Generator
def classes(row, col, value)
(row == 1 || col == 1) ? %w[ header ] : [ ]
end
end
Gets both smells. Surely it's not informative to report Feature Envy in this case?
|
1.0
|
Feature Envy reported even when it is reporting Utility Function - For example:
class Level1Generator
def classes(row, col, value)
(row == 1 || col == 1) ? %w[ header ] : [ ]
end
end
Gets both smells. Surely it's not informative to report Feature Envy in this case?
|
defect
|
feature envy reported even when it is reporting utility function for example class def classes row col value row col w end end gets both smells surely it s not informative to report feature envy in this case
| 1
|
169,330
| 26,781,775,859
|
IssuesEvent
|
2023-01-31 21:52:26
|
google/docsy
|
https://api.github.com/repos/google/docsy
|
opened
|
Drop / inline prepend() SCSS function
|
design/style cleanup/refactoring e0-minutes breaking change
|
Consider dropping `prepend()`, in favor of using `join()` directly: https://github.com/google/docsy/blob/c8530e3c891109a416c4ee1746608c33ae636d5e/assets/scss/support/_functions.scss#L3-L5
In doing so we can eliminate https://github.com/google/docsy/blob/main/assets/scss/support/_functions.scss.
AFAICT, there is only one instance of the call to `prepend()` in Docsy SCSS.
The other function that was in this file (`color-diff()`) was removed by:
- #1384
|
1.0
|
Drop / inline prepend() SCSS function - Consider dropping `prepend()`, in favor of using `join()` directly: https://github.com/google/docsy/blob/c8530e3c891109a416c4ee1746608c33ae636d5e/assets/scss/support/_functions.scss#L3-L5
In doing so we can eliminate https://github.com/google/docsy/blob/main/assets/scss/support/_functions.scss.
AFAICT, there is only one instance of the call to `prepend()` in Docsy SCSS.
The other function that was in this file (`color-diff()`) was removed by:
- #1384
|
non_defect
|
drop inline prepend scss function consider dropping prepend in favor of using join directly in doing so we can eliminate afaict there is only one instance of the call to prepend in docsy scss the other function that was in this file color diff was removed by
| 0
|
26,178
| 4,593,663,528
|
IssuesEvent
|
2016-09-21 02:19:30
|
afisher1/GridLAB-D
|
https://api.github.com/repos/afisher1/GridLAB-D
|
closed
|
#126 Error in convert or complex object,
|
defect
|
During revision 145, a bug was introduced. Voltage is output in polar coordinates to the xml file, however, after this revision, its output in rectangular coordinates while still thinking it's in polar. An example output should be +2481+119d. Now it outputs -1224+2158d (this is 2481+119d in polar).
,
|
1.0
|
#126 Error in convert or complex object,
- During revision 145, a bug was introduced. Voltage is output in polar coordinates to the xml file, however, after this revision, its output in rectangular coordinates while still thinking it's in polar. An example output should be +2481+119d. Now it outputs -1224+2158d (this is 2481+119d in polar).
,
|
defect
|
error in convert or complex object during revision a bug was introduced voltage is output in polar coordinates to the xml file however after this revision its output in rectangular coordinates while still thinking it s in polar an example output should be now it outputs this is in polar
| 1
|
793,883
| 28,014,478,265
|
IssuesEvent
|
2023-03-27 21:17:53
|
ledd-23/crowdyy
|
https://api.github.com/repos/ledd-23/crowdyy
|
closed
|
[CRD-18] Research a scheduling algorithm
|
high priority spike
|
**What:** A scheduling algorithm mockup with the user's schedule and historical crowd density data as inputs and a new and improved schedule as output.
**Why:** Literally the point of the project.
**AC:** A viable algorithm and future planning in the form of new issues.
|
1.0
|
[CRD-18] Research a scheduling algorithm - **What:** A scheduling algorithm mockup with the user's schedule and historical crowd density data as inputs and a new and improved schedule as output.
**Why:** Literally the point of the project.
**AC:** A viable algorithm and future planning in the form of new issues.
|
non_defect
|
research a scheduling algorithm what a scheduling algorithm mockup with the user s schedule and historical crowd density data as inputs and a new and improved schedule as output why literally the point of the project ac a viable algorithm and future planning in the form of new issues
| 0
|
3,895
| 2,931,629,648
|
IssuesEvent
|
2015-06-29 13:39:47
|
studio107/Mindy_Orm
|
https://api.github.com/repos/studio107/Mindy_Orm
|
closed
|
imageField error when force => true
|
Bug Code review
|
When `$model->toArray()` is called with force => true enabled, all fields are iterated and every value of the `ImageField` field is output; the bug is primarily that with `force`, `ImageField` takes the current image rather than the original and modifies it. The fix does not work when resizing upward.
Solution: bring the sizeUrl method to the following form:
```php
public function sizeUrl($prefix)
{
// TODO refactoring
if ($this->getStorage() instanceof MimiBoxStorage) {
$size = explode('x', $prefix);
if (count($size) > 1) {
list($width, $height) = $size;
} else {
$width = array_pop($size);
$height = 0;
}
$path = $this->sizeStoragePath(null, $this->value);
$path .= "?width=" . $width . '&height=' . $height;
if ($this->force) {
$path .= '&force=true';
}
} else {
$path = $this->sizeStoragePath($prefix, $this->value);
if ($this->force || !is_file($this->getStorage()->path($path))) {
$absPath = $this->getStorage()->path($this->getValue());
if ($absPath) {
$image = $this->getImagine()->open($absPath);
$this->processSource($image, true);
}
}
}
return $this->getStorage()->url($path);
}
```
The second bug is that when iterating over, for example, 6 different previews (sizes) of 1 field, we get 6 * 6 iterations and load the original image into memory (once the problem above is fixed). That is 36 iterations to process the same images.
|
1.0
|
imageField error when force => true - When `$model->toArray()` is called with force => true enabled, all fields are iterated and every value of the `ImageField` field is output; the bug is primarily that with `force`, `ImageField` takes the current image rather than the original and modifies it. The fix does not work when resizing upward.
Solution: bring the sizeUrl method to the following form:
```php
public function sizeUrl($prefix)
{
// TODO refactoring
if ($this->getStorage() instanceof MimiBoxStorage) {
$size = explode('x', $prefix);
if (count($size) > 1) {
list($width, $height) = $size;
} else {
$width = array_pop($size);
$height = 0;
}
$path = $this->sizeStoragePath(null, $this->value);
$path .= "?width=" . $width . '&height=' . $height;
if ($this->force) {
$path .= '&force=true';
}
} else {
$path = $this->sizeStoragePath($prefix, $this->value);
if ($this->force || !is_file($this->getStorage()->path($path))) {
$absPath = $this->getStorage()->path($this->getValue());
if ($absPath) {
$image = $this->getImagine()->open($absPath);
$this->processSource($image, true);
}
}
}
return $this->getStorage()->url($path);
}
```
The second bug is that when iterating over, for example, 6 different previews (sizes) of 1 field, we get 6 * 6 iterations and the original image loaded into memory (once the problem above is fixed). That makes 36 iterations to process the same images.
|
non_defect
|
imagefield error with force true when using model toarray with force true enabled every field is iterated and every value of the imagefield field is output the bug is first of all that with force imagefield takes not the original image but the current one and modifies it the fix does not work when resizing to a larger size the fix is to bring the sizeurl method to the following form php public function sizeurl prefix todo refactoring if this getstorage instanceof mimiboxstorage size explode x prefix if count size list width height size else width array pop size height path this sizestoragepath null this value path width width height height if this force path force true else path this sizestoragepath prefix this value if this force is file this getstorage path path abspath this getstorage path this getvalue if abspath image this getimagine open abspath this processsource image true return this getstorage url path the second bug is that when iterating over for example different previews sizes of field we get iterations and the original image loaded into memory if the problem above is fixed that makes iterations to process the same images
| 0
|
220,869
| 24,586,703,813
|
IssuesEvent
|
2022-10-13 20:25:48
|
billmcchesney1/concord
|
https://api.github.com/repos/billmcchesney1/concord
|
opened
|
CVE-2022-37601 (Medium) detected in loader-utils-1.4.0.tgz, loader-utils-1.2.3.tgz
|
security vulnerability
|
## CVE-2022-37601 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>loader-utils-1.4.0.tgz</b>, <b>loader-utils-1.2.3.tgz</b></p></summary>
<p>
<details><summary><b>loader-utils-1.4.0.tgz</b></p></summary>
<p>utils for webpack loaders</p>
<p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.4.0.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.4.0.tgz</a></p>
<p>Path to dependency file: /console2/package.json</p>
<p>Path to vulnerable library: /console2/node_modules/loader-utils/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.4.3.tgz (Root Library)
- webpack-4.3.3.tgz
- :x: **loader-utils-1.4.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>loader-utils-1.2.3.tgz</b></p></summary>
<p>utils for webpack loaders</p>
<p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz</a></p>
<p>Path to dependency file: /console2/package.json</p>
<p>Path to vulnerable library: /console2/node_modules/adjust-sourcemap-loader/node_modules/loader-utils/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.4.3.tgz (Root Library)
- react-dev-utils-10.2.1.tgz
- :x: **loader-utils-1.2.3.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution vulnerability in function parseQuery in parseQuery.js in webpack loader-utils 2.0.0 via the name variable in parseQuery.js.
<p>Publish Date: 2022-10-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37601>CVE-2022-37601</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-10-12</p>
<p>Fix Resolution (loader-utils): 2.0.0</p>
<p>Direct dependency fix Resolution (react-scripts): 5.0.1</p><p>Fix Resolution (loader-utils): 2.0.0</p>
<p>Direct dependency fix Resolution (react-scripts): 5.0.1</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
|
True
|
CVE-2022-37601 (Medium) detected in loader-utils-1.4.0.tgz, loader-utils-1.2.3.tgz - ## CVE-2022-37601 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>loader-utils-1.4.0.tgz</b>, <b>loader-utils-1.2.3.tgz</b></p></summary>
<p>
<details><summary><b>loader-utils-1.4.0.tgz</b></p></summary>
<p>utils for webpack loaders</p>
<p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.4.0.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.4.0.tgz</a></p>
<p>Path to dependency file: /console2/package.json</p>
<p>Path to vulnerable library: /console2/node_modules/loader-utils/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.4.3.tgz (Root Library)
- webpack-4.3.3.tgz
- :x: **loader-utils-1.4.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>loader-utils-1.2.3.tgz</b></p></summary>
<p>utils for webpack loaders</p>
<p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz</a></p>
<p>Path to dependency file: /console2/package.json</p>
<p>Path to vulnerable library: /console2/node_modules/adjust-sourcemap-loader/node_modules/loader-utils/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.4.3.tgz (Root Library)
- react-dev-utils-10.2.1.tgz
- :x: **loader-utils-1.2.3.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution vulnerability in function parseQuery in parseQuery.js in webpack loader-utils 2.0.0 via the name variable in parseQuery.js.
<p>Publish Date: 2022-10-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37601>CVE-2022-37601</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-10-12</p>
<p>Fix Resolution (loader-utils): 2.0.0</p>
<p>Direct dependency fix Resolution (react-scripts): 5.0.1</p><p>Fix Resolution (loader-utils): 2.0.0</p>
<p>Direct dependency fix Resolution (react-scripts): 5.0.1</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
|
non_defect
|
cve medium detected in loader utils tgz loader utils tgz cve medium severity vulnerability vulnerable libraries loader utils tgz loader utils tgz loader utils tgz utils for webpack loaders library home page a href path to dependency file package json path to vulnerable library node modules loader utils package json dependency hierarchy react scripts tgz root library webpack tgz x loader utils tgz vulnerable library loader utils tgz utils for webpack loaders library home page a href path to dependency file package json path to vulnerable library node modules adjust sourcemap loader node modules loader utils package json dependency hierarchy react scripts tgz root library react dev utils tgz x loader utils tgz vulnerable library found in base branch master vulnerability details prototype pollution vulnerability in function parsequery in parsequery js in webpack loader utils via the name variable in parsequery js publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution loader utils direct dependency fix resolution react scripts fix resolution loader utils direct dependency fix resolution react scripts check this box to open an automated fix pr
| 0
|
25,415
| 18,717,893,983
|
IssuesEvent
|
2021-11-03 08:20:24
|
airyhq/airy
|
https://api.github.com/repos/airyhq/airy
|
closed
|
The CLI status subcommand doesn't work
|
infrastructure bug cli
|
## Describe the bug
On the latest version (currently 0.32.0) `airy status` generates an error
```
❌ could not read status: request was unsuccessful. Status code: 403
```
## To Reproduce
Steps to reproduce the behavior:
1. Get into a workspace
2. Run `airy status`
## Expected behavior
The status should be printed out
## Screenshots
N/A
## Environment
All
## Additional context
N/A
|
1.0
|
The CLI status subcommand doesn't work - ## Describe the bug
On the latest version (currently 0.32.0) `airy status` generates an error
```
❌ could not read status: request was unsuccessful. Status code: 403
```
## To Reproduce
Steps to reproduce the behavior:
1. Get into a workspace
2. Run `airy status`
## Expected behavior
The status should be printed out
## Screenshots
N/A
## Environment
All
## Additional context
N/A
|
non_defect
|
the cli status subcommand doesn t work describe the bug on the latest version currently airy status generates an error ❌ could not read status request was unsuccessful status code to reproduce steps to reproduce the behavior get into a workspace run airy status expected behavior the status should be printed out screenshots n a environment all additional context n a
| 0
|
94,441
| 8,490,592,789
|
IssuesEvent
|
2018-10-27 02:43:50
|
mono/mono
|
https://api.github.com/repos/mono/mono
|
opened
|
VBCSCompiler cannot create named pipe
|
epic: Roslyn Tests
|
If you attempt to use VBCSCompiler from the latest rev of roslyn using latest mono, it hangs attempting to create its shared named pipe over and over:
```RoslynCommandLineLogFile=~/vbcs-log.txt ~/.mono/bin/mono --debug roslyn/Binaries/Debug/Exes/VBCSCompiler/net46/VBCSCompiler.exe -pipename:1024```
```
--- PID=10197 TID=1 Ticks=5550264: Constructing pipe '1024'.
--- PID=10197 TID=1 Ticks=5550264: Exception 'The method or operation is not implemented.' occurred during 'Error creating client named pipe'. Stack trace:
at System.Security.Principal.WindowsIdentity.get_Owner () [0x00000] in /home/kate/Projects/mono/mcs/class/corlib/System.Security.Principal/WindowsIdentity.cs:255
at Microsoft.CodeAnalysis.CompilerServer.NamedPipeClientConnectionHost.ConstructPipe (System.String pipeName) [0x00016] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/NamedPipeClientConnection.cs:76
at Microsoft.CodeAnalysis.CompilerServer.NamedPipeClientConnectionHost.CreateListenTaskCore (System.Threading.CancellationToken cancellationToken) [0x0000f] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/NamedPipeClientConnection.cs:50
at Microsoft.CodeAnalysis.CompilerServer.NamedPipeClientConnectionHost.CreateListenTask (System.Threading.CancellationToken cancellationToken) [0x0002f] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/NamedPipeClientConnection.cs:34
at Microsoft.CodeAnalysis.CompilerServer.ServerDispatcher.HandleClientConnection (System.Threading.Tasks.Task`1[TResult] clientConnectionTask, System.Boolean allowCompilationRequests, System.Threading.CancellationToken cancellationToken) [0x00038] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/ServerDispatcher.cs:346
--- PID=10197 TID=1 Ticks=5550264: Constructing pipe '1024'.
--- PID=10197 TID=1 Ticks=5550264: Exception 'The method or operation is not implemented.' occurred during 'Error creating client named pipe'. Stack trace:
at System.Security.Principal.WindowsIdentity.get_Owner () [0x00000] in /home/kate/Projects/mono/mcs/class/corlib/System.Security.Principal/WindowsIdentity.cs:255
at Microsoft.CodeAnalysis.CompilerServer.NamedPipeClientConnectionHost.ConstructPipe (System.String pipeName) [0x00016] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/NamedPipeClientConnection.cs:76
at Microsoft.CodeAnalysis.CompilerServer.NamedPipeClientConnectionHost.CreateListenTaskCore (System.Threading.CancellationToken cancellationToken) [0x0000f] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/NamedPipeClientConnection.cs:50
at Microsoft.CodeAnalysis.CompilerServer.NamedPipeClientConnectionHost.CreateListenTask (System.Threading.CancellationToken cancellationToken) [0x0002f] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/NamedPipeClientConnection.cs:34
at Microsoft.CodeAnalysis.CompilerServer.ServerDispatcher.HandleClientConnection (System.Threading.Tasks.Task`1[TResult] clientConnectionTask, System.Boolean allowCompilationRequests, System.Threading.CancellationToken cancellationToken) [0x00038] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/ServerDispatcher.cs:346
```
This is likely why the automated tests fail as well.
|
1.0
|
VBCSCompiler cannot create named pipe - If you attempt to use VBCSCompiler from the latest rev of roslyn using latest mono, it hangs attempting to create its shared named pipe over and over:
```RoslynCommandLineLogFile=~/vbcs-log.txt ~/.mono/bin/mono --debug roslyn/Binaries/Debug/Exes/VBCSCompiler/net46/VBCSCompiler.exe -pipename:1024```
```
--- PID=10197 TID=1 Ticks=5550264: Constructing pipe '1024'.
--- PID=10197 TID=1 Ticks=5550264: Exception 'The method or operation is not implemented.' occurred during 'Error creating client named pipe'. Stack trace:
at System.Security.Principal.WindowsIdentity.get_Owner () [0x00000] in /home/kate/Projects/mono/mcs/class/corlib/System.Security.Principal/WindowsIdentity.cs:255
at Microsoft.CodeAnalysis.CompilerServer.NamedPipeClientConnectionHost.ConstructPipe (System.String pipeName) [0x00016] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/NamedPipeClientConnection.cs:76
at Microsoft.CodeAnalysis.CompilerServer.NamedPipeClientConnectionHost.CreateListenTaskCore (System.Threading.CancellationToken cancellationToken) [0x0000f] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/NamedPipeClientConnection.cs:50
at Microsoft.CodeAnalysis.CompilerServer.NamedPipeClientConnectionHost.CreateListenTask (System.Threading.CancellationToken cancellationToken) [0x0002f] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/NamedPipeClientConnection.cs:34
at Microsoft.CodeAnalysis.CompilerServer.ServerDispatcher.HandleClientConnection (System.Threading.Tasks.Task`1[TResult] clientConnectionTask, System.Boolean allowCompilationRequests, System.Threading.CancellationToken cancellationToken) [0x00038] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/ServerDispatcher.cs:346
--- PID=10197 TID=1 Ticks=5550264: Constructing pipe '1024'.
--- PID=10197 TID=1 Ticks=5550264: Exception 'The method or operation is not implemented.' occurred during 'Error creating client named pipe'. Stack trace:
at System.Security.Principal.WindowsIdentity.get_Owner () [0x00000] in /home/kate/Projects/mono/mcs/class/corlib/System.Security.Principal/WindowsIdentity.cs:255
at Microsoft.CodeAnalysis.CompilerServer.NamedPipeClientConnectionHost.ConstructPipe (System.String pipeName) [0x00016] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/NamedPipeClientConnection.cs:76
at Microsoft.CodeAnalysis.CompilerServer.NamedPipeClientConnectionHost.CreateListenTaskCore (System.Threading.CancellationToken cancellationToken) [0x0000f] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/NamedPipeClientConnection.cs:50
at Microsoft.CodeAnalysis.CompilerServer.NamedPipeClientConnectionHost.CreateListenTask (System.Threading.CancellationToken cancellationToken) [0x0002f] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/NamedPipeClientConnection.cs:34
at Microsoft.CodeAnalysis.CompilerServer.ServerDispatcher.HandleClientConnection (System.Threading.Tasks.Task`1[TResult] clientConnectionTask, System.Boolean allowCompilationRequests, System.Threading.CancellationToken cancellationToken) [0x00038] in /home/kate/Projects/roslyn/src/Compilers/Server/VBCSCompiler/ServerDispatcher.cs:346
```
This is likely why the automated tests fail as well.
|
non_defect
|
vbcscompiler cannot create named pipe if you attempt to use vbcscompiler from the latest rev of roslyn using latest mono it hangs attempting to create its shared named pipe over and over roslyncommandlinelogfile vbcs log txt mono bin mono debug roslyn binaries debug exes vbcscompiler vbcscompiler exe pipename pid tid ticks constructing pipe pid tid ticks exception the method or operation is not implemented occurred during error creating client named pipe stack trace at system security principal windowsidentity get owner in home kate projects mono mcs class corlib system security principal windowsidentity cs at microsoft codeanalysis compilerserver namedpipeclientconnectionhost constructpipe system string pipename in home kate projects roslyn src compilers server vbcscompiler namedpipeclientconnection cs at microsoft codeanalysis compilerserver namedpipeclientconnectionhost createlistentaskcore system threading cancellationtoken cancellationtoken in home kate projects roslyn src compilers server vbcscompiler namedpipeclientconnection cs at microsoft codeanalysis compilerserver namedpipeclientconnectionhost createlistentask system threading cancellationtoken cancellationtoken in home kate projects roslyn src compilers server vbcscompiler namedpipeclientconnection cs at microsoft codeanalysis compilerserver serverdispatcher handleclientconnection system threading tasks task clientconnectiontask system boolean allowcompilationrequests system threading cancellationtoken cancellationtoken in home kate projects roslyn src compilers server vbcscompiler serverdispatcher cs pid tid ticks constructing pipe pid tid ticks exception the method or operation is not implemented occurred during error creating client named pipe stack trace at system security principal windowsidentity get owner in home kate projects mono mcs class corlib system security principal windowsidentity cs at microsoft codeanalysis compilerserver namedpipeclientconnectionhost constructpipe system string 
pipename in home kate projects roslyn src compilers server vbcscompiler namedpipeclientconnection cs at microsoft codeanalysis compilerserver namedpipeclientconnectionhost createlistentaskcore system threading cancellationtoken cancellationtoken in home kate projects roslyn src compilers server vbcscompiler namedpipeclientconnection cs at microsoft codeanalysis compilerserver namedpipeclientconnectionhost createlistentask system threading cancellationtoken cancellationtoken in home kate projects roslyn src compilers server vbcscompiler namedpipeclientconnection cs at microsoft codeanalysis compilerserver serverdispatcher handleclientconnection system threading tasks task clientconnectiontask system boolean allowcompilationrequests system threading cancellationtoken cancellationtoken in home kate projects roslyn src compilers server vbcscompiler serverdispatcher cs this is likely why the automated tests fail as well
| 0
|
47,335
| 13,056,126,576
|
IssuesEvent
|
2020-07-30 03:44:18
|
icecube-trac/tix2
|
https://api.github.com/repos/icecube-trac/tix2
|
closed
|
Time Scroll Bar (Trac #365)
|
Migrated from Trac defect glshovel
|
From Boersma's wishlist :
The horizontal time slide bar is heavily handicapped, especially
for neutrino data sets (for which t=0 sometimes corresponds to the
neutrino generation at the North Pole). The easiest fix would be to
allow numbers larger than 6 digits for the start time of the time
window, this would enable manual sanitization of the time range, which
is currently practically impossible.
Intelligent choices for the time range would also be great. There are
various ideas for this:
* based on event header (DAQ)
* based on times of the launches in raw data
* based on times of pulses in (user chosen) pulseseriesmap
Migrated from https://code.icecube.wisc.edu/ticket/365
```json
{
"status": "closed",
"changetime": "2013-08-23T16:42:33",
"description": "From Boersma's wishlist :\n\nThe horizontal time slide bar is heavily handicapped, especially\nfor neutrino data sets (for which t=0 sometimes corresponds to the\nneutrino generation at the North Pole). The easiest fix would be to\nallow numbers larger than 6 digits for the start time of the time\nwindow, this would enable manual sanitization of the time range, which\nis currently practically impossible.\nIntelligent choices for the time range would also be great. There are\nvarious ideas for this:\n* based on event header (DAQ)\n* based on times of the launches in raw data\n* based on times of pulses in (user chosen) pulseseriesmap",
"reporter": "olivas",
"cc": "",
"resolution": "fixed",
"_ts": "1377276153000000",
"component": "glshovel",
"summary": "Time Scroll Bar",
"priority": "normal",
"keywords": "",
"time": "2012-02-29T06:46:56",
"milestone": "",
"owner": "olivas",
"type": "defect"
}
```
|
1.0
|
Time Scroll Bar (Trac #365) - From Boersma's wishlist :
The horizontal time slide bar is heavily handicapped, especially
for neutrino data sets (for which t=0 sometimes corresponds to the
neutrino generation at the North Pole). The easiest fix would be to
allow numbers larger than 6 digits for the start time of the time
window, this would enable manual sanitization of the time range, which
is currently practically impossible.
Intelligent choices for the time range would also be great. There are
various ideas for this:
* based on event header (DAQ)
* based on times of the launches in raw data
* based on times of pulses in (user chosen) pulseseriesmap
Migrated from https://code.icecube.wisc.edu/ticket/365
```json
{
"status": "closed",
"changetime": "2013-08-23T16:42:33",
"description": "From Boersma's wishlist :\n\nThe horizontal time slide bar is heavily handicapped, especially\nfor neutrino data sets (for which t=0 sometimes corresponds to the\nneutrino generation at the North Pole). The easiest fix would be to\nallow numbers larger than 6 digits for the start time of the time\nwindow, this would enable manual sanitization of the time range, which\nis currently practically impossible.\nIntelligent choices for the time range would also be great. There are\nvarious ideas for this:\n* based on event header (DAQ)\n* based on times of the launches in raw data\n* based on times of pulses in (user chosen) pulseseriesmap",
"reporter": "olivas",
"cc": "",
"resolution": "fixed",
"_ts": "1377276153000000",
"component": "glshovel",
"summary": "Time Scroll Bar",
"priority": "normal",
"keywords": "",
"time": "2012-02-29T06:46:56",
"milestone": "",
"owner": "olivas",
"type": "defect"
}
```
|
defect
|
time scroll bar trac from boersma s wishlist the horizontal time slide bar is heavily handicapped especially for neutrino data sets for which t sometimes corresponds to the neutrino generation at the north pole the easiest fix would be to allow numbers larger than digits for the start time of the time window this would enable manual sanitization of the time range which is currently practically impossible intelligent choices for the time range would also be great there are various ideas for this based on event header daq based on times of the launches in raw data based on times of pulses in user chosen pulseseriesmap migrated from json status closed changetime description from boersma s wishlist n nthe horizontal time slide bar is heavily handicapped especially nfor neutrino data sets for which t sometimes corresponds to the nneutrino generation at the north pole the easiest fix would be to nallow numbers larger than digits for the start time of the time nwindow this would enable manual sanitization of the time range which nis currently practically impossible nintelligent choices for the time range would also be great there are nvarious ideas for this n based on event header daq n based on times of the launches in raw data n based on times of pulses in user chosen pulseseriesmap reporter olivas cc resolution fixed ts component glshovel summary time scroll bar priority normal keywords time milestone owner olivas type defect
| 1
|
371,746
| 25,961,881,436
|
IssuesEvent
|
2022-12-19 00:40:54
|
NewPath-Consulting/ez-wildapricot-webdesigner
|
https://api.github.com/repos/NewPath-Consulting/ez-wildapricot-webdesigner
|
closed
|
add documentation on whitelisting any URLs that are referenced by EZ
|
documentation
|
There is a new feature in WA that requires whitelisting domains that are used to load external JS scripts. EZ will load external scripts from the code and as a result we need to add those URLs to the whitelist list and describe how to do that for installation purposes.
|
1.0
|
add documentation on whitelisting any URLs that are referenced by EZ - There is a new feature in WA that requires whitelisting domains that are used to load external JS scripts. EZ will load external scripts from the code and as a result we need to add those URLs to the whitelist list and describe how to do that for installation purposes.
|
non_defect
|
add documentation on whitelisting any urls that are referenced by ez there is a new feature in wa that requires whitelisting domains that are used to load external js scripts ez will load external scripts from the code and as a result we need to add those urls to the whitelist list and describe how to do that for installation purposes
| 0
|
14,006
| 2,789,844,281
|
IssuesEvent
|
2015-05-08 21:52:01
|
google/google-visualization-api-issues
|
https://api.github.com/repos/google/google-visualization-api-issues
|
closed
|
AnnotatedTimeLine doesn't display annotations when loaded from file system.
|
Priority-Medium Type-Defect
|
Original [issue 275](https://code.google.com/p/google-visualization-api-issues/issues/detail?id=275) created by orwant on 2010-05-07T13:12:26.000Z:
<b>What steps will reproduce the problem? Please provide a link to a</b>
<b>demonstration page if at all possible, or attach code.</b>
1. Look at: http://code.google.com/apis/ajax/playground/#annotated_time_line
2. Download the attached file.
3. Notice that the annotations in the graph ([A], [B]) do not show up.
<b>What component is this issue related to (PieChart, LineChart, DataTable,</b>
<b>Query, etc)?</b>
AnnotatedTimeLine
<b>Are you using the test environment (version 1.1)?</b>
<b>(If you are not sure, answer NO)</b>
NO.
<b>What operating system and browser are you using?</b>
Linux. I get the same problem in Firefox and Chrome.
<b>*********************************************************</b>
<b>For developers viewing this issue: please click the 'star' icon to be</b>
<b>notified of future changes, and to let us know how many of you are</b>
<b>interested in seeing it resolved.</b>
<b>*********************************************************</b>
|
1.0
|
AnnotatedTimeLine doesn't display annotations when loaded from file system. - Original [issue 275](https://code.google.com/p/google-visualization-api-issues/issues/detail?id=275) created by orwant on 2010-05-07T13:12:26.000Z:
<b>What steps will reproduce the problem? Please provide a link to a</b>
<b>demonstration page if at all possible, or attach code.</b>
1. Look at: http://code.google.com/apis/ajax/playground/#annotated_time_line
2. Download the attached file.
3. Notice that the annotations in the graph ([A], [B]) do not show up.
<b>What component is this issue related to (PieChart, LineChart, DataTable,</b>
<b>Query, etc)?</b>
AnnotatedTimeLine
<b>Are you using the test environment (version 1.1)?</b>
<b>(If you are not sure, answer NO)</b>
NO.
<b>What operating system and browser are you using?</b>
Linux. I get the same problem in Firefox and Chrome.
<b>*********************************************************</b>
<b>For developers viewing this issue: please click the 'star' icon to be</b>
<b>notified of future changes, and to let us know how many of you are</b>
<b>interested in seeing it resolved.</b>
<b>*********************************************************</b>
|
defect
|
annotatedtimeline doesn t display annotations when loaded from file system original created by orwant on what steps will reproduce the problem please provide a link to a demonstration page if at all possible or attach code look at download the attached file notice that the annotations in the graph do not show up what component is this issue related to piechart linechart datatable query etc annotatedtimeline are you using the test environment version if you are not sure answer no no what operating system and browser are you using linux i get the same problem in firefox and chrome for developers viewing this issue please click the star icon to be notified of future changes and to let us know how many of you are interested in seeing it resolved
| 1
|
13,568
| 2,769,950,657
|
IssuesEvent
|
2015-05-01 09:00:29
|
cultibox/cultibox
|
https://api.github.com/repos/cultibox/cultibox
|
closed
|
[cultipi] loss of control: plug x programme is empty
|
Component-cultipi Priority-Critical Type-Defect
|
At 1 a.m. lamp no. 3 did not turn on.
In the software, the synoptic indeed shows that nothing is running.
The culti seemed to be working normally.
The time shown by the small green "play" icon at the top right of the synoptic gave me the correct time.
I only noticed it after 1 a.m.; I simply ended up clicking the button to restart my culti from config > admin, and off it went again!
Here is what I found in the logs (I'm sending the rest by email)
00:00:00.20 serverPlugUpdate info plugv -0420-
20/04/2015 00:00:00.172 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.186 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.242 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.280 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.350 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.386 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.452 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.655 serverPlugUpdate info plugv filename : plu00
20/04/2015 00:00:03.643 serverPlugUpdate error Plug 1 programme is empty
20/04/2015 00:00:03.653 serverPlugUpdate error Plug 2 programme is empty
20/04/2015 00:00:03.659 serverPlugUpdate error Plug 3 programme is empty
20/04/2015 00:00:03.663 serverPlugUpdate error Plug 4 programme ...
|
1.0
|
[cultipi] perte de controle : plug x programme is empty - A 1h la lampe n°3 ne s'est pas allumé.
Sur le logiciel, le syno indique en effet que rien ne tourne.
La culti semblait fonctionner normalement.
L'heure indiqué par le petit logo "lecture" vert en haut à droite du syno m'indiquait l'heure correct.
Je m'en suis rendu compte après 1 h du matin, J'ai simpolement fini pa cliquer sur le bouton pour redémarrer ma culti depuis config> admin et c'est reparti!
Voila ce que j'ai trouvé dans les logs(j'nvoie le reste par mail)
00:00:00.20 serverPlugUpdate info plugv -0420-
20/04/2015 00:00:00.172 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.186 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.242 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.280 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.350 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.386 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.452 serverHisto error ::sql::addPlugState : unknow state
20/04/2015 00:00:00.655 serverPlugUpdate info plugv filename : plu00
20/04/2015 00:00:03.643 serverPlugUpdate error Plug 1 programme is empty
20/04/2015 00:00:03.653 serverPlugUpdate error Plug 2 programme is empty
20/04/2015 00:00:03.659 serverPlugUpdate error Plug 3 programme is empty
20/04/2015 00:00:03.663 serverPlugUpdate error Plug 4 programme ...
|
defect
|
perte de controle plug x programme is empty a la lampe n° ne s est pas allumé sur le logiciel le syno indique en effet que rien ne tourne la culti semblait fonctionner normalement l heure indiqué par le petit logo lecture vert en haut à droite du syno m indiquait l heure correct je m en suis rendu compte après h du matin j ai simpolement fini pa cliquer sur le bouton pour redémarrer ma culti depuis config admin et c est reparti voila ce que j ai trouvé dans les logs j nvoie le reste par mail serverplugupdate info plugv serverhisto error sql addplugstate unknow state serverhisto error sql addplugstate unknow state serverhisto error sql addplugstate unknow state serverhisto error sql addplugstate unknow state serverhisto error sql addplugstate unknow state serverhisto error sql addplugstate unknow state serverhisto error sql addplugstate unknow state serverplugupdate info plugv filename serverplugupdate error plug programme is empty serverplugupdate error plug programme is empty serverplugupdate error plug programme is empty serverplugupdate error plug programme
| 1
|
92,971
| 26,828,424,815
|
IssuesEvent
|
2023-02-02 14:27:57
|
WordPress/developer-blog-content
|
https://api.github.com/repos/WordPress/developer-blog-content
|
opened
|
Tutorial on block deprecation
|
flow: approved Tutorial Building Blocks
|
### Discussed in https://github.com/WordPress/developer-blog-content/discussions/67
<div type='discussions-op-text'>
<sup>Originally posted by **mburridge** January 26, 2023</sup>
If you make a change to the way that a block is rendered you will see the "invalid content" error message after the change.

This commonly occurs during development, and developers working with blocks are very aware of this and are used to dealing with it.
However, if you've already published a block and want to make changes to it in an update you won't want users to see the "invalid content" error message, which they will in each instance of the block across their site until recovery is done on each of them.
To circumvent this problem developers can "deprecate" the old version of the block.
A tutorial on how to deprecate a block, and when and why you should do it, would be a nice addition to the Developer Blog.
Reference:
[Deprecation](https://developer.wordpress.org/block-editor/reference-guides/block-api/block-deprecation/)
</div>
|
1.0
|
Tutorial on block deprecation - ### Discussed in https://github.com/WordPress/developer-blog-content/discussions/67
<div type='discussions-op-text'>
<sup>Originally posted by **mburridge** January 26, 2023</sup>
If you make a change to the way that a block is rendered you will see the "invalid content" error message after the change.

This commonly occurs during development, and developers working with blocks are very aware of this and are used to dealing with it.
However, if you've already published a block and want to make changes to it in an update you won't want users to see the "invalid content" error message, which they will in each instance of the block across their site until recovery is done on each of them.
To circumvent this problem developers can "deprecate" the old version of the block.
A tutorial on how to deprecate a block, and when and why you should do it, would be a nice addition to the Developer Blog.
Reference:
[Deprecation](https://developer.wordpress.org/block-editor/reference-guides/block-api/block-deprecation/)
</div>
|
non_defect
|
tutorial on block deprecation discussed in originally posted by mburridge january if you make a change to the way that a block is rendered you will see the invalid content error message after the change this commonly occurs during development and developers working with blocks are very aware of this and are used to dealing with it however if you ve already published a block and want to make changes to it in an update you won t want users to see the invalid content error message which they will in each instance of the block across their site until recovery is done on each of them to circumvent this problem developers can deprecate the old version of the block a tutorial on how to deprecate a block and when and why you should do it would be a nice addition to the developer blog reference
| 0
|
31,293
| 5,923,738,332
|
IssuesEvent
|
2017-05-23 08:44:04
|
piceaTech/ember-rapid-forms
|
https://api.github.com/repos/piceaTech/ember-rapid-forms
|
opened
|
Build docs automatically
|
Documentation enhancement
|
When a new release is made we should publish the current docs with an automatic tool. Examples are:
- http://usejsdoc.org/
- https://yui.github.io/yuidoc/
|
1.0
|
Build docs automatically - When a new release is made we should publish the current docs with an automatic tool. Examples are:
- http://usejsdoc.org/
- https://yui.github.io/yuidoc/
|
non_defect
|
build docs automatically when a new release is made we should publish the current docs with an automatic tool examples are
| 0
|
695,555
| 23,863,365,540
|
IssuesEvent
|
2022-09-07 08:58:30
|
threefoldfoundation/www_threefold_io
|
https://api.github.com/repos/threefoldfoundation/www_threefold_io
|
closed
|
threefold.io publish website using zola
|
priority_critical
|
- make zola on development branch
- publish to /html
- get github to use gitpages to show the website
- put link from home to gitpages
|
1.0
|
threefold.io publish website using zola - - make zola on development branch
- publish to /html
- get github to use gitpages to show the website
- put link from home to gitpages
|
non_defect
|
threefold io publish website using zola make zola on development branch publish to html get github to use gitpages to show the website put link from home to gitpages
| 0
|
21,193
| 3,471,090,336
|
IssuesEvent
|
2015-12-23 13:13:41
|
lpechacek/cpuset
|
https://api.github.com/repos/lpechacek/cpuset
|
closed
|
Does not work on Centos7
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
Running as root:
1. untar cset source
2. python setup.py bdist_rpm (fix all problems)
3. python setup.py install
4. Verify cset installed w/o problems run "cset" at command line (no args)
correctly get usage statement.
5. Try to list current shields via: cset set -l
What is the expected output? What do you see instead?
EXPECTED:
cset:
Name CPUs-X MEMs-X Tasks Subs Path
------------ ---------- - ------- - ----- ---- ----------
root 0-7 y 0 y 141 0 /
ACTUAL:
cset: **> [Errno 2] No such file or directory: '/cpusets//cpus'
What version of the product are you using? On what operating system?
cset: Cpuset (cset) 1.5.6
kernel: Linux 3.10.0-123.el7.x86_64
distro: CentOS Linux release 7.0.1406 (Core)
Please provide any additional information below.
Looks like it installs fine on Centos7 but does not work.
```
Original issue reported on code.google.com by `JonSto...@gmail.com` on 15 Dec 2014 at 6:38
|
1.0
|
Does not work on Centos7 - ```
What steps will reproduce the problem?
Running as root:
1. untar cset source
2. python setup.py bdist_rpm (fix all problems)
3. python setup.py install
4. Verify cset installed w/o problems run "cset" at command line (no args)
correctly get usage statement.
5. Try to list current shields via: cset set -l
What is the expected output? What do you see instead?
EXPECTED:
cset:
Name CPUs-X MEMs-X Tasks Subs Path
------------ ---------- - ------- - ----- ---- ----------
root 0-7 y 0 y 141 0 /
ACTUAL:
cset: **> [Errno 2] No such file or directory: '/cpusets//cpus'
What version of the product are you using? On what operating system?
cset: Cpuset (cset) 1.5.6
kernel: Linux 3.10.0-123.el7.x86_64
distro: CentOS Linux release 7.0.1406 (Core)
Please provide any additional information below.
Looks like it installs fine on Centos7 but does not work.
```
Original issue reported on code.google.com by `JonSto...@gmail.com` on 15 Dec 2014 at 6:38
|
defect
|
does not work on what steps will reproduce the problem running as root untar cset source python setup py bdist rpm fix all problems python setup py install verify cset installed w o problems run cset at command line no args correctly get usage statement try to list current shields via cset set l what is the expected output what do you see instead expected cset name cpus x mems x tasks subs path root y y actual cset no such file or directory cpusets cpus what version of the product are you using on what operating system cset cpuset cset kernel linux distro centos linux release core please provide any additional information below looks like it installs fine on but does not work original issue reported on code google com by jonsto gmail com on dec at
| 1
|
206,667
| 16,049,281,322
|
IssuesEvent
|
2021-04-22 17:01:38
|
Interacao-Humano-Computador/2020.2-DefensoriaSP
|
https://api.github.com/repos/Interacao-Humano-Computador/2020.2-DefensoriaSP
|
opened
|
Documento de Relato da Avaliação dos Storyboards
|
documentation
|
## Descrição
Criação do Documento onde será detalhada a avaliação realizada em relação à técnica de storyboard.
## Tarefas
<!-- Descreva as tarefas da issue -->
- [ ] Criar documento de Planejamento de Relato da Avaliação do Storyboard
- [ ] Detalhar respostas e resultados obtidos
- [ ] Revisar documento
|
1.0
|
Documento de Relato da Avaliação dos Storyboards - ## Descrição
Criação do Documento onde será detalhada a avaliação realizada em relação à técnica de storyboard.
## Tarefas
<!-- Descreva as tarefas da issue -->
- [ ] Criar documento de Planejamento de Relato da Avaliação do Storyboard
- [ ] Detalhar respostas e resultados obtidos
- [ ] Revisar documento
|
non_defect
|
documento de relato da avaliação dos storyboards descrição criação do documento onde será detalhada a avaliação realizada em relação à técnica de storyboard tarefas criar documento de planejamento de relato da avaliação do storyboard detalhar respostas e resultados obtidos revisar documento
| 0
|
44,884
| 12,420,512,257
|
IssuesEvent
|
2020-05-23 12:24:46
|
unixfreaxjp/nanorc
|
https://api.github.com/repos/unixfreaxjp/nanorc
|
closed
|
better PHP config
|
Priority-Medium Type-Defect auto-migrated
|
```
I found the php.nanorc's out there lacking, so here's mine. Still working
on it but it's fairly good for daily use, at least for me.
Thanks.
BEGIN--------------------
syntax "php" "\.php[2345s~]?$"
## HTML (assume everything is HTML until proven otherwise)
color white,blue "^.+$"
## PHP
color white,black start="<\?php" end="\?>"
## strings, part 1
#malformed
color magenta,yellow "('|")"
#override not malformed
color brightmagenta start="['"]" end="[^\\]['"]"
## functions
icolor white start="[a-z_0-9]+\(" end=")[\);,$]*"
## variables
icolor brightred "\$[a-z_0-9]+"
color brightred start="\$\{" end="\}"
## types
color green "([^\$]|^)\<(var|float|global|double|bool|char|int|enum|const)\>"
## constants (assume constants are ALL_CAPS)
color red "\<[A-Z_0-9]*\>"
## numbers
#decimal
color yellow "\<\-?[0-9\.]+\>"
#hex
icolor yellow,blue "\<0x[0-9a-f]+\>"
#octal
color yellow,green "\<0[0-7]+\>"
#boolean
icolor cyan "([^\$]|^)\<(true|false)\>"
## structure
#note, $class is a variable, class is not.
color brightyellow
"([^\$]|^)\<(class|new|private|public|function|for|foreach|if|while|do|else|else
if|case|default|switch)\>"
## control flow
color brightblue "([^\$]|^)\<(goto|continue|break|return)\>"
## operators
color green
"(\^|\&|\||=|==|===|&&|\|\||\!==?|>|<|\.=?|->|::|\+|\-|\*|\/|\!|\!=|\!==|%=|\*=|
\+=|\-=|\/=)"
## braces
color white "(\{|\(|\)|\})"
##sad wee end brackets etc
color white "^[[:blank:]]*([\)\}][,;]?[[:blank:]]*)*$"
## strings
color brightmagenta "'([^']*\\')*[^']*'"
color brightmagenta ""(\.|[^"])*""
#kind of a hack here since backrefs don't work with start and end apparently
color brightmagenta start="<<<(.*)" end=";$"
## control flow
color brightblue "([^\$]|^)\<(goto|continue|break|return)\>"
# vars in strings
icolor brightred "\$[a-z0-9_]*"
## comments
color blue "[^:]//.*"
color blue "^//.*"
color blue "(^|[^'"]+)#.*"
color blue start="/\*" end="\*/"
##HTML again
color white,blue start="\?>" end="<\?php"
color white,blue start="" end="<\?php"
color white,blue start="\?>" end=".$"
#color brightred,blue "&[^&]"
icolor cyan,blue "&[a-z0-9#]{2,8};"
## Trailing whitespace
color ,white "[[:blank:]]+$"
## php markings
color brightgreen "(<\?(php)?|\?>)"
```
Original issue reported on code.google.com by `rod.mcfa...@gmail.com` on 16 Oct 2009 at 5:16
|
1.0
|
better PHP config - ```
I found the php.nanorc's out there lacking, so here's mine. Still working
on it but it's fairly good for daily use, at least for me.
Thanks.
BEGIN--------------------
syntax "php" "\.php[2345s~]?$"
## HTML (assume everything is HTML until proven otherwise)
color white,blue "^.+$"
## PHP
color white,black start="<\?php" end="\?>"
## strings, part 1
#malformed
color magenta,yellow "('|")"
#override not malformed
color brightmagenta start="['"]" end="[^\\]['"]"
## functions
icolor white start="[a-z_0-9]+\(" end=")[\);,$]*"
## variables
icolor brightred "\$[a-z_0-9]+"
color brightred start="\$\{" end="\}"
## types
color green "([^\$]|^)\<(var|float|global|double|bool|char|int|enum|const)\>"
## constants (assume constants are ALL_CAPS)
color red "\<[A-Z_0-9]*\>"
## numbers
#decimal
color yellow "\<\-?[0-9\.]+\>"
#hex
icolor yellow,blue "\<0x[0-9a-f]+\>"
#octal
color yellow,green "\<0[0-7]+\>"
#boolean
icolor cyan "([^\$]|^)\<(true|false)\>"
## structure
#note, $class is a variable, class is not.
color brightyellow
"([^\$]|^)\<(class|new|private|public|function|for|foreach|if|while|do|else|else
if|case|default|switch)\>"
## control flow
color brightblue "([^\$]|^)\<(goto|continue|break|return)\>"
## operators
color green
"(\^|\&|\||=|==|===|&&|\|\||\!==?|>|<|\.=?|->|::|\+|\-|\*|\/|\!|\!=|\!==|%=|\*=|
\+=|\-=|\/=)"
## braces
color white "(\{|\(|\)|\})"
##sad wee end brackets etc
color white "^[[:blank:]]*([\)\}][,;]?[[:blank:]]*)*$"
## strings
color brightmagenta "'([^']*\\')*[^']*'"
color brightmagenta ""(\.|[^"])*""
#kind of a hack here since backrefs don't work with start and end apparently
color brightmagenta start="<<<(.*)" end=";$"
## control flow
color brightblue "([^\$]|^)\<(goto|continue|break|return)\>"
# vars in strings
icolor brightred "\$[a-z0-9_]*"
## comments
color blue "[^:]//.*"
color blue "^//.*"
color blue "(^|[^'"]+)#.*"
color blue start="/\*" end="\*/"
##HTML again
color white,blue start="\?>" end="<\?php"
color white,blue start="" end="<\?php"
color white,blue start="\?>" end=".$"
#color brightred,blue "&[^&]"
icolor cyan,blue "&[a-z0-9#]{2,8};"
## Trailing whitespace
color ,white "[[:blank:]]+$"
## php markings
color brightgreen "(<\?(php)?|\?>)"
```
Original issue reported on code.google.com by `rod.mcfa...@gmail.com` on 16 Oct 2009 at 5:16
|
defect
|
better php config i found the php nanorc s out there lacking so here s mine still working on it but it s fairly good for daily use at least for me thanks begin syntax php php html assume everything is html until proven otherwise color white blue php color white black start strings part malformed color magenta yellow override not malformed color brightmagenta start end functions icolor white start end variables icolor brightred color brightred start end types color green constants assume constants are all caps color red numbers decimal color yellow hex icolor yellow blue octal color yellow green boolean icolor cyan structure note class is a variable class is not color brightyellow class new private public function for foreach if while do else else if case default switch control flow color brightblue operators color green braces color white sad wee end brackets etc color white strings color brightmagenta color brightmagenta kind of a hack here since backrefs don t work with start and end apparently color brightmagenta start end control flow color brightblue vars in strings icolor brightred comments color blue color blue color blue color blue start end html again color white blue start end php color white blue start end php color white blue start end color brightred blue icolor cyan blue trailing whitespace color white php markings color brightgreen original issue reported on code google com by rod mcfa gmail com on oct at
| 1
|
40,862
| 10,195,669,121
|
IssuesEvent
|
2019-08-12 18:43:20
|
scipy/scipy
|
https://api.github.com/repos/scipy/scipy
|
opened
|
incomplete doc build instructions
|
Documentation defect
|
On https://scipy.github.io/devdocs/dev/contributor/rendering_documentation.html, a critical step is missing. The description assumes you've done something like `python setup.py develop` already. I.e. before `make html` or `make html-scipyorg`, make sure that `>>> import scipy` works and picks up the scipy build you are working on. This is confusing, so clearly note this in the docs.
Also, it may be useful to take over the recent change to the numpy doc Makefile that compares git hashes and refuses to build if they don't match. That ensures you pick up the right version.
|
1.0
|
incomplete doc build instructions - On https://scipy.github.io/devdocs/dev/contributor/rendering_documentation.html, a critical step is missing. The description assumes you've done something like `python setup.py develop` already. I.e. before `make html` or `make html-scipyorg`, make sure that `>>> import scipy` works and picks up the scipy build you are working on. This is confusing, so clearly note this in the docs.
Also, it may be useful to take over the recent change to the numpy doc Makefile that compares git hashes and refuses to build if they don't match. That ensures you pick up the right version.
|
defect
|
incomplete doc build instructions on a critical step is missing the description assumes you ve done something like python setup py develop already i e before make html or make html scipyorg make sure that import scipy works and picks up the scipy build you are working on this is confusing so clearly note this in the docs also it may be useful to take over the recent change to the numpy doc makefile that compares git hashes and refuses to build if they don t match that ensures you pick up the right version
| 1
|
22,043
| 3,590,837,020
|
IssuesEvent
|
2016-02-01 08:49:01
|
excilys/androidannotations
|
https://api.github.com/repos/excilys/androidannotations
|
closed
|
@Background: BackgroundExecutor delay is calculated wrongly. Task is never called.
|
Defect
|
There seems to be a bug in `BackgroundExecutor` class. The `remainingDelay` of the next task (same serial) is set wrongly.
**To reproduce:**
```java
@EActivity
public class MainActivity extends AppCompatActivity {
private static final String TAG = "APP";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
firstTaskNoDelay();
secondTaskWithDelay();
}
@Background(serial = "serial1")
void firstTaskNoDelay() {
Log.d(TAG, "Executing task 1");
try {
Thread.sleep(6000);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
@Background(serial = "serial1", delay = 1000)
void secondTaskWithDelay() {
Log.d(TAG, "Executing task 2"); // <-- this is not called (delayed by a huge time)
}
}
```
There are two tasks with same serial. First one with no delay, second one with delay 1000. The calculation of `next.remainingDelay` is incorrect which causes the second task to be delayed by a huge time.
**Buggy method:**
There is issue when casting long to int.
```java
private void postExecute() {
if (id == null && serial == null) {
/* nothing to do */
return;
}
CURRENT_SERIAL.set(null);
synchronized (BackgroundExecutor.class) {
/* execution complete */
TASKS.remove(this);
if (serial != null) {
Task next = take(serial);
if (next != null) {
if (next.remainingDelay != 0) {
/* the delay may not have elapsed yet */
// ============= BUG HERE =============
next.remainingDelay = Math.max(0, (int) (targetTimeMillis - System.currentTimeMillis()));
/*
* targetTimeMillis => 0
* System.currentTimeMillis() => 1453870701164l
* (int) (targetTimeMillis - System.currentTimeMillis()) => ((int) (0 - 1453870701164l)) => 2123212180
* next.remainingDelay => 2123212180 <-- WRONG
*/
}
/* a task having the same serial was queued, execute it */
BackgroundExecutor.execute(next);
}
}
}
}
```
**Proposed Fix:**
```java
next.remainingDelay = Math.max(0l, targetTimeMillis - System.currentTimeMillis());
```
**Update**:
Interestingly this bug will start to happen for all users from today (Tue Jan 26 2016 22:13:49 UTC time) onwards as the `System.currentTimeMillis()` value exceeds `1453846429696l`.
```
(int)(0 - 1453846429696l) = -2147483648 // <-- negative (so no bug)
(int)(0 - 1453846429697l) = 2147483647 // <-- positive (bug starts to happen)
```
Let me know if I have misunderstood anything.
Regards
-Amulya
|
1.0
|
@Background: BackgroundExecutor delay is calculated wrongly. Task is never called. - There seems to be a bug in `BackgroundExecutor` class. The `remainingDelay` of the next task (same serial) is set wrongly.
**To reproduce:**
```java
@EActivity
public class MainActivity extends AppCompatActivity {
private static final String TAG = "APP";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
firstTaskNoDelay();
secondTaskWithDelay();
}
@Background(serial = "serial1")
void firstTaskNoDelay() {
Log.d(TAG, "Executing task 1");
try {
Thread.sleep(6000);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
@Background(serial = "serial1", delay = 1000)
void secondTaskWithDelay() {
Log.d(TAG, "Executing task 2"); // <-- this is not called (delayed by a huge time)
}
}
```
There are two tasks with same serial. First one with no delay, second one with delay 1000. The calculation of `next.remainingDelay` is incorrect which causes the second task to be delayed by a huge time.
**Buggy method:**
There is issue when casting long to int.
```java
private void postExecute() {
if (id == null && serial == null) {
/* nothing to do */
return;
}
CURRENT_SERIAL.set(null);
synchronized (BackgroundExecutor.class) {
/* execution complete */
TASKS.remove(this);
if (serial != null) {
Task next = take(serial);
if (next != null) {
if (next.remainingDelay != 0) {
/* the delay may not have elapsed yet */
// ============= BUG HERE =============
next.remainingDelay = Math.max(0, (int) (targetTimeMillis - System.currentTimeMillis()));
/*
* targetTimeMillis => 0
* System.currentTimeMillis() => 1453870701164l
* (int) (targetTimeMillis - System.currentTimeMillis()) => ((int) (0 - 1453870701164l)) => 2123212180
* next.remainingDelay => 2123212180 <-- WRONG
*/
}
/* a task having the same serial was queued, execute it */
BackgroundExecutor.execute(next);
}
}
}
}
```
**Proposed Fix:**
```java
next.remainingDelay = Math.max(0l, targetTimeMillis - System.currentTimeMillis());
```
**Update**:
Interestingly this bug will start to happen for all users from today (Tue Jan 26 2016 22:13:49 UTC time) onwards as the `System.currentTimeMillis()` value exceeds `1453846429696l`.
```
(int)(0 - 1453846429696l) = -2147483648 // <-- negative (so no bug)
(int)(0 - 1453846429697l) = 2147483647 // <-- positive (bug starts to happen)
```
Let me know if I have misunderstood anything.
Regards
-Amulya
|
defect
|
background backgroundexecutor delay is calculated wrongly task is never called there seems to be a bug in backgroundexecutor class the remainingdelay of the next task same serial is set wrongly to reproduce java eactivity public class mainactivity extends appcompatactivity private static final string tag app override protected void oncreate bundle savedinstancestate super oncreate savedinstancestate setcontentview r layout activity main firsttasknodelay secondtaskwithdelay background serial void firsttasknodelay log d tag executing task try thread sleep catch interruptedexception e e printstacktrace background serial delay void secondtaskwithdelay log d tag executing task this is not called delayed by a huge time there are two tasks with same serial first one with no delay second one with delay the calculation of next remainingdelay is incorrect which causes the second task to be delayed by a huge time buggy method there is issue when casting long to int java private void postexecute if id null serial null nothing to do return current serial set null synchronized backgroundexecutor class execution complete tasks remove this if serial null task next take serial if next null if next remainingdelay the delay may not have elapsed yet bug here next remainingdelay math max int targettimemillis system currenttimemillis targettimemillis system currenttimemillis int targettimemillis system currenttimemillis int next remainingdelay wrong a task having the same serial was queued execute it backgroundexecutor execute next proposed fix java next remainingdelay math max targettimemillis system currenttimemillis update interestingly this bug will start to happen for all users from today tue jan utc time onwards as the system currenttimemillis value exceeds int negative so no bug int positive bug starts to happen let me know if i have misunderstood anything regards amulya
| 1
|
16,339
| 20,997,134,722
|
IssuesEvent
|
2022-03-29 14:21:20
|
sjmog/smartflix
|
https://api.github.com/repos/sjmog/smartflix
|
opened
|
Render shows to the homepage
|
Rails/File processing Rails/Haml
|
You have just set up a Rails application with a test-driven dummy view!
In this challenge, you will update the application so the root route renders the shows from the [provided CSV file](../training-data/netflix_titles.zip).
Here's how it should look by the end of this ticket:

## To complete this ticket, you will have to:
- [ ] Write a new acceptance test that asserts: when the user visits the homepage, the page content should include each show title in the [provided CSV file](../training-data/netflix_titles.csv).
- [ ] Configure your Rails app to use [Haml](https://haml.info/) for the views.
- [ ] Create a new controller to show all shows. Make sure you're following the [Rails naming conventions](https://guides.rubyonrails.org/action_controller_overview.html)!
- [ ] Create a new route so that users visiting the root of your application are directed to the index action of your new controller. Make sure you're following the [Rails routing conventions](https://guides.rubyonrails.org/routing.html)!
- [ ] Pass the acceptance test by displaying all shows from the [provided CSV file](../training-data/netflix_titles.zip) file.
## Tips
- There are a lot of shows in the [provided CSV file](../training-data/netflix_titles.zip)! You may need to limit the number you render to the view.
|
1.0
|
Render shows to the homepage - You have just set up a Rails application with a test-driven dummy view!
In this challenge, you will update the application so the root route renders the shows from the [provided CSV file](../training-data/netflix_titles.zip).
Here's how it should look by the end of this ticket:

## To complete this ticket, you will have to:
- [ ] Write a new acceptance test that asserts: when the user visits the homepage, the page content should include each show title in the [provided CSV file](../training-data/netflix_titles.csv).
- [ ] Configure your Rails app to use [Haml](https://haml.info/) for the views.
- [ ] Create a new controller to show all shows. Make sure you're following the [Rails naming conventions](https://guides.rubyonrails.org/action_controller_overview.html)!
- [ ] Create a new route so that users visiting the root of your application are directed to the index action of your new controller. Make sure you're following the [Rails routing conventions](https://guides.rubyonrails.org/routing.html)!
- [ ] Pass the acceptance test by displaying all shows from the [provided CSV file](../training-data/netflix_titles.zip) file.
## Tips
- There are a lot of shows in the [provided CSV file](../training-data/netflix_titles.zip)! You may need to limit the number you render to the view.
|
non_defect
|
render shows to the homepage you have just set up a rails application with a test driven dummy view in this challenge you will update the application so the root route renders the shows from the training data netflix titles zip here s how it should look by the end of this ticket images smartflix png to complete this ticket you will have to write a new acceptance test that asserts when the user visits the homepage the page content should include each show title in the training data netflix titles csv configure your rails app to use for the views create a new controller to show all shows make sure you re following the create a new route so that users visiting the root of your application are directed to the index action of your new controller make sure you re following the pass the acceptance test by displaying all shows from the training data netflix titles zip file tips there are a lot of shows in the training data netflix titles zip you may need to limit the number you render to the view
| 0
|
130,975
| 18,214,371,654
|
IssuesEvent
|
2021-09-30 01:03:34
|
RG4421/java-slack-sdk
|
https://api.github.com/repos/RG4421/java-slack-sdk
|
opened
|
CVE-2021-37137 (High) detected in multiple libraries
|
security vulnerability
|
## CVE-2021-37137 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>netty-codec-4.1.45.Final.jar</b>, <b>netty-codec-4.1.48.Final.jar</b>, <b>netty-codec-4.1.34.Final.jar</b></p></summary>
<p>
<details><summary><b>netty-codec-4.1.45.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Path to dependency file: java-slack-sdk/bolt-helidon/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-codec/4.1.45.Final/netty-codec-4.1.45.Final.jar</p>
<p>
Dependency Hierarchy:
- helidon-bundles-webserver-1.4.4.jar (Root Library)
- helidon-webserver-1.4.4.jar
- netty-codec-http-4.1.45.Final.jar
- :x: **netty-codec-4.1.45.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-4.1.48.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p>
<p>Path to dependency file: java-slack-sdk/bolt-micronaut/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-codec/4.1.48.Final/netty-codec-4.1.48.Final.jar</p>
<p>
Dependency Hierarchy:
- micronaut-http-server-netty-1.3.4.jar (Root Library)
- netty-codec-http-4.1.48.Final.jar
- :x: **netty-codec-4.1.48.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-4.1.34.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: java-slack-sdk/bolt-quarkus-examples/pom.xml</p>
<p>Path to vulnerable library: java-slack-sdk/bolt-quarkus-examples/target/lib/io.netty.netty-codec-4.1.34.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-codec/4.1.34.Final/netty-codec-4.1.34.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-4.1.34.Final.jar** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The Snappy frame decoder function doesn't restrict the chunk length which may lead to excessive memory usage. Beside this it also may buffer reserved skippable chunks until the whole chunk was received which may lead to excessive memory usage as well.
This vulnerability can be triggered by supplying malicious input that decompresses to a very big size (via a network stream or a file) or by sending a huge skippable chunk
<p>Publish Date: 2021-07-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37137>CVE-2021-37137</a></p>
</p>
</details>
<p></p>
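The excess-memory failure mode described above is the classic decompression-bomb pattern. As a rough illustration of the general mitigation — capping how far a chunk may expand before buffering it — here is a hypothetical sketch using Python's `zlib` rather than Snappy; it is not the netty fix itself, only the same idea:

```python
import zlib

def bounded_decompress(data: bytes, max_out: int) -> bytes:
    """Decompress untrusted input, refusing to expand beyond max_out bytes."""
    d = zlib.decompressobj()
    out = d.decompress(data, max_out)  # produce at most max_out bytes of output
    if d.unconsumed_tail:
        # Input would have decompressed past the cap -- reject it instead of
        # buffering an attacker-controlled amount of memory.
        raise ValueError("decompressed size exceeds limit")
    return out

# A small compressed payload that would expand to 1 MB if left uncapped.
bomb = zlib.compress(b"\x00" * (1 << 20))
```

A decoder that enforces such a per-chunk limit fails fast on malicious input instead of exhausting memory, which is the behavior the patched versions restore.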
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-9vjp-v76f-g363">https://github.com/advisories/GHSA-9vjp-v76f-g363</a></p>
<p>Release Date: 2021-07-21</p>
<p>Fix Resolution: io.netty:netty-codec:4.1.68.Final</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec","packageVersion":"4.1.45.Final","packageFilePaths":["/bolt-helidon/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"io.helidon.bundles:helidon-bundles-webserver:1.4.4;io.helidon.webserver:helidon-webserver:1.4.4;io.netty:netty-codec-http:4.1.45.Final;io.netty:netty-codec:4.1.45.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec:4.1.68.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec","packageVersion":"4.1.48.Final","packageFilePaths":["/bolt-micronaut/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"io.micronaut:micronaut-http-server-netty:1.3.4;io.netty:netty-codec-http:4.1.48.Final;io.netty:netty-codec:4.1.48.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec:4.1.68.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec","packageVersion":"4.1.34.Final","packageFilePaths":["/bolt-quarkus-examples/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-codec:4.1.34.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec:4.1.68.Final"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-37137","vulnerabilityDetails":"The Snappy frame decoder function doesn\u0027t restrict the chunk length which may lead to excessive memory usage. 
Beside this it also may buffer reserved skippable chunks until the whole chunk was received which may lead to excessive memory usage as well.\n\nThis vulnerability can be triggered by supplying malicious input that decompresses to a very big size (via a network stream or a file) or by sending a huge skippable chunk","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37137","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2021-37137 (High) detected in multiple libraries - ## CVE-2021-37137 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>netty-codec-4.1.45.Final.jar</b>, <b>netty-codec-4.1.48.Final.jar</b>, <b>netty-codec-4.1.34.Final.jar</b></p></summary>
<p>
<details><summary><b>netty-codec-4.1.45.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Path to dependency file: java-slack-sdk/bolt-helidon/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-codec/4.1.45.Final/netty-codec-4.1.45.Final.jar</p>
<p>
Dependency Hierarchy:
- helidon-bundles-webserver-1.4.4.jar (Root Library)
- helidon-webserver-1.4.4.jar
- netty-codec-http-4.1.45.Final.jar
- :x: **netty-codec-4.1.45.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-4.1.48.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p>
<p>Path to dependency file: java-slack-sdk/bolt-micronaut/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-codec/4.1.48.Final/netty-codec-4.1.48.Final.jar</p>
<p>
Dependency Hierarchy:
- micronaut-http-server-netty-1.3.4.jar (Root Library)
- netty-codec-http-4.1.48.Final.jar
- :x: **netty-codec-4.1.48.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-codec-4.1.34.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="http://netty.io/">http://netty.io/</a></p>
<p>Path to dependency file: java-slack-sdk/bolt-quarkus-examples/pom.xml</p>
<p>Path to vulnerable library: java-slack-sdk/bolt-quarkus-examples/target/lib/io.netty.netty-codec-4.1.34.Final.jar,/home/wss-scanner/.m2/repository/io/netty/netty-codec/4.1.34.Final/netty-codec-4.1.34.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-4.1.34.Final.jar** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The Snappy frame decoder function doesn't restrict the chunk length which may lead to excessive memory usage. Beside this it also may buffer reserved skippable chunks until the whole chunk was received which may lead to excessive memory usage as well.
This vulnerability can be triggered by supplying malicious input that decompresses to a very big size (via a network stream or a file) or by sending a huge skippable chunk
<p>Publish Date: 2021-07-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37137>CVE-2021-37137</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-9vjp-v76f-g363">https://github.com/advisories/GHSA-9vjp-v76f-g363</a></p>
<p>Release Date: 2021-07-21</p>
<p>Fix Resolution: io.netty:netty-codec:4.1.68.Final</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec","packageVersion":"4.1.45.Final","packageFilePaths":["/bolt-helidon/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"io.helidon.bundles:helidon-bundles-webserver:1.4.4;io.helidon.webserver:helidon-webserver:1.4.4;io.netty:netty-codec-http:4.1.45.Final;io.netty:netty-codec:4.1.45.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec:4.1.68.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec","packageVersion":"4.1.48.Final","packageFilePaths":["/bolt-micronaut/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"io.micronaut:micronaut-http-server-netty:1.3.4;io.netty:netty-codec-http:4.1.48.Final;io.netty:netty-codec:4.1.48.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec:4.1.68.Final"},{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec","packageVersion":"4.1.34.Final","packageFilePaths":["/bolt-quarkus-examples/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-codec:4.1.34.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec:4.1.68.Final"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-37137","vulnerabilityDetails":"The Snappy frame decoder function doesn\u0027t restrict the chunk length which may lead to excessive memory usage. 
Beside this it also may buffer reserved skippable chunks until the whole chunk was received which may lead to excessive memory usage as well.\n\nThis vulnerability can be triggered by supplying malicious input that decompresses to a very big size (via a network stream or a file) or by sending a huge skippable chunk","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37137","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_defect
|
cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries netty codec final jar netty codec final jar netty codec final jar netty codec final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients path to dependency file java slack sdk bolt helidon pom xml path to vulnerable library home wss scanner repository io netty netty codec final netty codec final jar dependency hierarchy helidon bundles webserver jar root library helidon webserver jar netty codec http final jar x netty codec final jar vulnerable library netty codec final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file java slack sdk bolt micronaut pom xml path to vulnerable library home wss scanner repository io netty netty codec final netty codec final jar dependency hierarchy micronaut http server netty jar root library netty codec http final jar x netty codec final jar vulnerable library netty codec final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file java slack sdk bolt quarkus examples pom xml path to vulnerable library java slack sdk bolt quarkus examples target lib io netty netty codec final jar home wss scanner repository io netty netty codec final netty codec final jar dependency hierarchy x netty codec final jar vulnerable library found in base branch master vulnerability details the snappy frame decoder function doesn t restrict the chunk length which may lead to excessive memory usage beside this it also may buffer reserved skippable chunks until the whole chunk was received which may lead to excessive memory usage as well this vulnerability can be 
triggered by supplying malicious input that decompresses to a very big size via a network stream or a file or by sending a huge skippable chunk publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty codec final isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree io helidon bundles helidon bundles webserver io helidon webserver helidon webserver io netty netty codec http final io netty netty codec final isminimumfixversionavailable true minimumfixversion io netty netty codec final packagetype java groupid io netty packagename netty codec packageversion final packagefilepaths istransitivedependency true dependencytree io micronaut micronaut http server netty io netty netty codec http final io netty netty codec final isminimumfixversionavailable true minimumfixversion io netty netty codec final packagetype java groupid io netty packagename netty codec packageversion final packagefilepaths istransitivedependency false dependencytree io netty netty codec final isminimumfixversionavailable true minimumfixversion io netty netty codec final basebranches vulnerabilityidentifier cve vulnerabilitydetails the snappy frame decoder function doesn restrict the chunk length which may lead to excessive memory usage beside this it also may buffer reserved skippable chunks until the whole chunk was received which may lead to excessive memory usage as well n nthis vulnerability can be triggered by supplying malicious input that decompresses to a very big size via a network stream or a file or by sending a huge skippable chunk vulnerabilityurl
| 0
|
74,887
| 25,386,873,707
|
IssuesEvent
|
2022-11-21 22:47:06
|
cakephp/cakephp
|
https://api.github.com/repos/cakephp/cakephp
|
closed
|
Invalid array access in MysqlSchemaDialect::listTablesSql()
|
defect
|
### Description
The error I snatched from the logs is this:
```
Argument 1 passed to Cake\Database\Driver\Mysql::quoteIdentifier() must be of the type string, null given, called in /var/www/html/squeeze/vendor/cakephp/database/Schema/MysqlSchemaDialect.php on line 45
```
The code where this occurs is:
```
public function listTablesSql(array $config): array
{
return ['SHOW FULL TABLES FROM ' . $this->_driver->quoteIdentifier($config['database']), []];
}
```
I managed to work around this by simply removing the `FROM <database>` from the generated SQL code. This seems to work and it gives me the info I'm after.
Notes:
* I'm using the cakephp/database package, not the full cake package.
* Its sibling `listTablesWithoutViewsSql()` is similarly affected.
* `describeForeignKeySql()` also uses `$config['database']`, not sure if that needs work, too.
* The according functions for other DBs seem to not retrieve this parameter from the array, so they seem to not be affected, but I haven't really verified that.
* I don't really know what is expected from the `$config` array. It could be that my app is messed up, so that this array just isn't populated correctly. I'd welcome feedback on that matter. Alternatively, some `assert()`s in the code would help, too.
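Until the root cause is settled, the assert-style guard suggested in the last note can be sketched language-agnostically. This is a hypothetical illustration in Python with a made-up `config` dict, not CakePHP's actual API:

```python
def list_tables_sql(config: dict) -> str:
    # Fail fast with an actionable message when config['database'] is missing,
    # instead of passing None down to an identifier-quoting helper.
    database = config.get("database")
    assert isinstance(database, str) and database, (
        "config['database'] must be a non-empty string"
    )
    return f"SHOW FULL TABLES FROM `{database}`"
```

With such a guard, a misconfigured connection fails at the call site with a clear message rather than a `TypeError` deep inside the driver.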
### CakePHP Version
4.4.5
### PHP Version
7.4.30
|
1.0
|
Invalid array access in MysqlSchemaDialect::listTablesSql() - ### Description
The error I snatched from the logs is this:
```
Argument 1 passed to Cake\Database\Driver\Mysql::quoteIdentifier() must be of the type string, null given, called in /var/www/html/squeeze/vendor/cakephp/database/Schema/MysqlSchemaDialect.php on line 45
```
The code where this occurs is:
```
public function listTablesSql(array $config): array
{
return ['SHOW FULL TABLES FROM ' . $this->_driver->quoteIdentifier($config['database']), []];
}
```
I managed to work around this by simply removing the `FROM <database>` from the generated SQL code. This seems to work and it gives me the info I'm after.
Notes:
* I'm using the cakephp/database package, not the full cake package.
* Its sibling `listTablesWithoutViewsSql()` is similarly affected.
* `describeForeignKeySql()` also uses `$config['database']`, not sure if that needs work, too.
* The according functions for other DBs seem to not retrieve this parameter from the array, so they seem to not be affected, but I haven't really verified that.
* I don't really know what is expected from the `$config` array. It could be that my app is messed up, so that this array just isn't populated correctly. I'd welcome feedback on that matter. Alternatively, some `assert()`s in the code would help, too.
### CakePHP Version
4.4.5
### PHP Version
7.4.30
|
defect
|
invalid array access in mysqlschemadialect listtablessql description the error i snatched from the logs is this argument passed to cake database driver mysql quoteidentifier must be of the type string null given called in var www html squeeze vendor cakephp database schema mysqlschemadialect php on line the code where this occurs is public function listtablessql array config array return i managed to work around this by simply removing the from from the generated sql code this seems to work and it gives me the info i m after notes i m using the cakephp database package not the full cake package its sibling listtableswithoutviewssql is similarly affected describeforeignkeysql also uses config not sure if that needs work too the according functions for other dbs seem to not retrieve this parameter from the array so they seem to not be affected but i haven t really verified that i don t really know what is expected from the config array it could be that my app is messed up so that this array just isn t populated correctly i d welcome feedback on that matter alternatively some assert s in the code would help too cakephp version php version
| 1
|
76,581
| 21,513,222,418
|
IssuesEvent
|
2022-04-28 07:29:15
|
PaddlePaddle/Paddle
|
https://api.github.com/repos/PaddlePaddle/Paddle
|
opened
|
Error building Paddle from source on aarch64
|
status/new-issue type/build
|
### Issue Description
While compiling Paddle inside the docker image from https://hub.docker.com/r/paddlepaddle/paddle/tags?page=1&name=aarch on an aarch64 server, the build fails. Part of the build log is shown below:
[ 95%] Built target final_dygraph_node
[ 95%] Built target final_dygraph_function
[ 95%] Built target parallel_ssa_graph_executor
[ 95%] Built target eager_reducer
[ 95%] Built target async_ssa_graph_executor
[ 95%] Built target parallel_executor
[ 95%] Built target executor_cache
[ 95%] Built target run_program_op
[ 95%] Built target paddle_inference_io
[ 95%] Built target eager_generator
[ 95%] Built target eager_op_function_generator
[ 96%] Built target op_function_generator
[ 96%] copy_if_different /home/Paddle/paddle/fluid/eager/api/generated/fluid_generated/nodes/nodes.tmp.cc to /home/Paddle/paddle/fluid/eager/api/generated/fluid_generated/nodes/nodes.cc
[ 96%] Built target eager_op_function_generator_cmd
[ 96%] Built target kernel_signature_generator
[ 96%] Built target op_function_generator_cmd
[ 96%] Built target analysis_helper
[ 96%] Built target ir_pass_manager
[ 96%] Built target ir_analysis_pass
[ 96%] Built target ir_params_sync_among_devices_pass
[ 96%] Built target ir_graph_build_pass
[ 96%] Built target analysis_passes
[ 96%] Built target analysis
[ 96%] Built target eager_codegen
[ 97%] Built target dygraph_node
[ 97%] Built target dygraph_function
[ 97%] Built target analysis_predictor
[ 97%] Built target performance_benchmark_utils
[ 97%] Built target op_function_common
[ 98%] Built target paddle_eager
[ 98%] Built target paddle_inference
[ 98%] Built target paddle_inference_c
[ 98%] Built target paddle_inference_c_shared
[ 99%] Built target paddle_inference_shared
[ 99%] Generating .check_symbol
copying /home/Paddle/build/third_party/threadpool/src/extern_threadpool/ThreadPool.h -> /home/Paddle/build/paddle_inference_install_dir/third_party/threadpool
copying /home/Paddle/build/CMakeCache.txt -> /home/Paddle/build/paddle_inference_install_dir
copying /home/Paddle/build/third_party/install/openblas/lib -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/openblas
copying /home/Paddle/build/third_party/install/openblas/include -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/openblas
copying /home/Paddle/build/third_party/install/gflags/include -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/gflags
copying /home/Paddle/build/third_party/install/gflags/lib/libgflags.a -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/gflags/lib
copying /home/Paddle/build/third_party/install/glog/include -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/glog
copying /home/Paddle/build/third_party/install/glog/lib/libglog.a -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/glog/lib
[ 99%] Linking CXX shared library libpaddle_pybind.so
copying /home/Paddle/build/third_party/install/utf8proc/include -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/utf8proc
copying /home/Paddle/build/third_party/install/utf8proc/lib/libutf8proc.a -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/utf8proc/lib
copying /home/Paddle/build/third_party/install/cryptopp/include -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/cryptopp
copying /home/Paddle/build/third_party/install/cryptopp/lib/libcryptopp.a -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/cryptopp/lib
copying /home/Paddle/build/third_party/install/xxhash/include -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/xxhash
copying /home/Paddle/build/third_party/install/xxhash/lib/libxxhash.a -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/xxhash/lib
copying /home/Paddle/build/third_party/install/protobuf/include -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/protobuf
copying /home/Paddle/build/third_party/install/protobuf/lib/libprotobuf.a -> /home/Paddle/build/paddle_inference_install_dir/third_party/install/protobuf/lib
copying /home/Paddle/paddle/fluid/inference/api/paddle_*.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include
copying /home/Paddle/build/paddle/fluid/inference/libpaddle_inference.* -> /home/Paddle/build/paddle_inference_install_dir/paddle/lib
[ 99%] Built target check_symbol
copying /home/Paddle/build/paddle/fluid/framework/framework.pb.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/internal
copying /home/Paddle/paddle/fluid/framework/io/crypto/cipher.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/crypto/
copying /home/Paddle/paddle/phi/api/ext/*.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/ext/
copying /home/Paddle/paddle/phi/api/include/*.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/include/
copying /home/Paddle/paddle/phi/api/all.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/
copying /home/Paddle/paddle/phi/common/*.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/
copying /home/Paddle/paddle/phi/core/macros.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/core/
copying /home/Paddle/paddle/phi/core/visit_type.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/core/
copying /home/Paddle/paddle/utils/any.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/utils/
copying /home/Paddle/paddle/utils/optional.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/utils/
copying /home/Paddle/paddle/utils/none.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/utils/
copying /home/Paddle/paddle/utils/flat_hash_map.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/utils/
copying /home/Paddle/paddle/extension.h -> /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/
Change phi header include path to adapt to inference api path
-- phi header path compat processing: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/ext_all.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/extension.h
-- phi header path compat processing: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/all.h
-- phi header path compat processing: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/ext
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/ext/dispatch.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/ext/exception.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/ext/op_meta_info.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/ext/tensor_compat.h
-- phi header path compat processing: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/include
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/include/api.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/include/context_pool.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/include/dll_decl.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/include/sparse_api.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/include/strings_api.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/api/include/tensor.h
-- phi header path compat processing: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/amp_type_traits.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/backend.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/bfloat16.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/complex.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/cpstring_impl.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/data_type.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/float16.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/int_array.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/layout.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/place.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/pstring.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/scalar.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/common/type_traits.h
-- phi header path compat processing: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/core
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/core/macros.h
-- phi header path compat processing complete: /home/Paddle/build/paddle_inference_install_dir/paddle/include/experimental/phi/core/visit_type.h
copying /home/Paddle/build/third_party/install/openblas/lib -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/openblas
copying /home/Paddle/build/third_party/install/openblas/include -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/openblas
copying /home/Paddle/build/third_party/install/gflags/include -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/gflags
copying /home/Paddle/build/third_party/install/gflags/lib/libgflags.a -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/gflags/lib
copying /home/Paddle/build/third_party/install/glog/include -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/glog
copying /home/Paddle/build/third_party/install/glog/lib/libglog.a -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/glog/lib
copying /home/Paddle/build/third_party/install/utf8proc/include -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/utf8proc
copying /home/Paddle/build/third_party/install/utf8proc/lib/libutf8proc.a -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/utf8proc/lib
copying /home/Paddle/build/third_party/install/cryptopp/include -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/cryptopp
copying /home/Paddle/build/third_party/install/cryptopp/lib/libcryptopp.a -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/cryptopp/lib
copying /home/Paddle/build/third_party/install/xxhash/include -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/xxhash
copying /home/Paddle/build/third_party/install/xxhash/lib/libxxhash.a -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/xxhash/lib
copying /home/Paddle/build/third_party/install/protobuf/include -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/protobuf
copying /home/Paddle/build/third_party/install/protobuf/lib/libprotobuf.a -> /home/Paddle/build/paddle_inference_c_install_dir/third_party/install/protobuf/lib
copying /home/Paddle/paddle/fluid/inference/capi_exp/pd_*.h -> /home/Paddle/build/paddle_inference_c_install_dir/paddle/include
copying /home/Paddle/build/paddle/fluid/inference/capi_exp/libpaddle_inference_c.* -> /home/Paddle/build/paddle_inference_c_install_dir/paddle/lib
copying /home/Paddle/build/third_party/eigen3/src/extern_eigen3/Eigen/Core -> /home/Paddle/build/paddle_install_dir/third_party/eigen3/Eigen
copying /home/Paddle/build/third_party/eigen3/src/extern_eigen3/Eigen/src -> /home/Paddle/build/paddle_install_dir/third_party/eigen3/Eigen
copying /home/Paddle/build/third_party/eigen3/src/extern_eigen3/unsupported/Eigen -> /home/Paddle/build/paddle_install_dir/third_party/eigen3/unsupported
copying /home/Paddle/build/third_party/boost/src/extern_boost/boost -> /home/Paddle/build/paddle_install_dir/third_party/boost
paddle/fluid/pybind/CMakeFiles/paddle_pybind.dir/build.make:2027: recipe for target 'paddle/fluid/pybind/libpaddle_pybind.so' failed
CMakeFiles/Makefile2:232003: recipe for target 'paddle/fluid/pybind/CMakeFiles/paddle_pybind.dir/all' failed
copying /home/Paddle/build/third_party/dlpack/src/extern_dlpack/include/dlpack -> /home/Paddle/build/paddle_install_dir/third_party/dlpack
copying /home/Paddle/build/third_party/install/zlib/include -> /home/Paddle/build/paddle_install_dir/third_party/install/zlib
copying /home/Paddle/build/third_party/install/zlib/lib/libz.a -> /home/Paddle/build/paddle_install_dir/third_party/install/zlib/lib
[ 99%] Built target inference_lib_dist
Makefile:148: recipe for target 'all' failed
**
### Version & Environment Information
Compiled in the docker image from https://hub.docker.com/r/paddlepaddle/paddle/tags?page=1&name=aarch
|
1.0
|
Building paddle from source on aarch64 fails - ### Issue Description
Compiled paddle in the docker image from https://hub.docker.com/r/paddlepaddle/paddle/tags?page=1&name=aarch ; the server architecture is aarch64.
|
non_defect
|
| 0
|
81,056
| 10,091,926,987
|
IssuesEvent
|
2019-07-26 15:21:35
|
elementary/appcenter
|
https://api.github.com/repos/elementary/appcenter
|
closed
|
Update size is no longer helpful with Flatpak
|
Needs Design
|
Showing the (potential maximum) size of to-be-installed packages still can make sense, but showing the size for updates is almost always wrong, and wrong by an order of magnitude or more due to the way Flatpak does diff-based updates.
I think we should either remove download size information for updates, or better expose that this is a maximum, worst-case-scenario estimate but that the true size is likely to be much, much, much lower. We discussed this at the Metered Data hackfest and as of then there was no way to get a more accurate download estimate, because the client and server negotiate the diff on demand afaik.
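The gap between the worst-case figure and the real delta download can be sketched with a toy model (hypothetical code, not AppCenter's or Flatpak's actual implementation): OSTree-style updates fetch only the objects missing locally, so the true download is bounded by the new objects, not the app's full size.

```python
# Toy model of why a worst-case update-size estimate overstates delta updates.
# Each object is a (content_id, size_in_bytes) pair; names are illustrative.

def worst_case_update_size(remote_objects):
    """Upper bound: assume every remote object must be downloaded in full."""
    return sum(size for _, size in remote_objects)

def delta_update_size(local_objects, remote_objects):
    """Delta-style bound: only objects not already present locally are fetched."""
    local_ids = {obj_id for obj_id, _ in local_objects}
    return sum(size for obj_id, size in remote_objects if obj_id not in local_ids)

# 45 MB already installed; the update adds a single 1 MB object.
local = [("runtime-blob", 40_000_000), ("app-blob", 5_000_000)]
remote = [("runtime-blob", 40_000_000), ("app-blob", 5_000_000), ("patch-blob", 1_000_000)]

print(worst_case_update_size(remote))        # 46000000 — what a naive size column shows
print(delta_update_size(local, remote))      # 1000000  — what is actually transferred
```

Even in this tiny example the naive estimate is off by a factor of 46, which matches the "order of magnitude or more" error described above.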
|
1.0
|
|
non_defect
|
update size is no longer helpful with flatpak showing the potential maximum size of to be installed packages still can make sense but showing the size for updates is almost always wrong and wrong by an order of magnitude or more due to the way flatpak does diff based updates i think we should either remove download size information for updates or better expose that this is a maximum worst case scenario estimate but that the true size is likely to be much much much lower we discussed this at the metered data hackfest and as of then there was no way to get a more accurate download estimate because the client and server negotiate the diff on demand afaik
| 0
|
324
| 2,525,529,840
|
IssuesEvent
|
2015-01-21 02:02:21
|
IcecaveStudios/duct
|
https://api.github.com/repos/IcecaveStudios/duct
|
closed
|
Object in nested array discards nested array.
|
defect semver:patch
|
# INPUT
```json
{
"images": [
{
"url": "http:\/\/cdn3.independent.ie\/incoming\/article30858802.ece\/d6893\/ALTERNATES\/w50square\/MOL.PNG",
"width": 50,
"height": 50
}
]
}
```
# EXPECTED
```
stdClass Object
(
[images] => Array
(
stdClass Object
(
[url] => http://cdn3.independent.ie/incoming/article30858802.ece/d6893/ALTERNATES/w50square/MOL.PNG
[width] => 50
[height] => 50
)
)
)
```
# ACTUAL
```
stdClass Object
(
[images] => stdClass Object
(
[url] => http://cdn3.independent.ie/incoming/article30858802.ece/d6893/ALTERNATES/w50square/MOL.PNG
[width] => 50
[height] => 50
)
)
```
|
1.0
|
Object in nested array discards nested array. - # INPUT
```json
{
"images": [
{
"url": "http:\/\/cdn3.independent.ie\/incoming\/article30858802.ece\/d6893\/ALTERNATES\/w50square\/MOL.PNG",
"width": 50,
"height": 50
}
]
}
```
# EXPECTED
```
stdClass Object
(
[images] => Array
(
stdClass Object
(
[url] => http://cdn3.independent.ie/incoming/article30858802.ece/d6893/ALTERNATES/w50square/MOL.PNG
[width] => 50
[height] => 50
)
)
)
```
# ACTUAL
```
stdClass Object
(
[images] => stdClass Object
(
[url] => http://cdn3.independent.ie/incoming/article30858802.ece/d6893/ALTERNATES/w50square/MOL.PNG
[width] => 50
[height] => 50
)
)
```
|
defect
|
object in nested array discards nested array input json images url http independent ie incoming ece alternates mol png width height expected stdclass object array stdclass object actual stdclass object stdclass object
| 1
|
227,982
| 17,405,956,299
|
IssuesEvent
|
2021-08-03 05:54:44
|
microsoft/pxt-arcade
|
https://api.github.com/repos/microsoft/pxt-arcade
|
closed
|
No response when clicking task # 2 and task # 4 in Chinese
|
cs-intro documentation localization
|
**Describe the bug**
No response when clicking **task # 2** and **task # 4** in **Chinese**.
**Steps to reproduce the behavior**
1.Navigate to https://arcade.makecode.com/courses/csintro1/intro/makecode-orientation
2.Change language to **Chinese**
3.Mouse slide to the bottom
4.Click **task # 2** and **task # 4**
**Expect behavior**
It jumps to **task # 2** when you click **task # 2**. And same with **task # 4**.

**Actual behavior**
No response when you click **task # 2** and **task # 4**.

**Additional context**
1.OS: Windows(rs6)
2.arcade version: 1.5.27
3.Microsoft MakeCode version: 7.1.4
|
1.0
|
No response when clicking task # 2 and task # 4 in Chinese - **Describe the bug**
No response when clicking **task # 2** and **task # 4** in **Chinese**.
**Steps to reproduce the behavior**
1.Navigate to https://arcade.makecode.com/courses/csintro1/intro/makecode-orientation
2.Change language to **Chinese**
3.Mouse slide to the bottom
4.Click **task # 2** and **task # 4**
**Expect behavior**
It jumps to **task # 2** when you click **task # 2**. And same with **task # 4**.

**Actual behavior**
No response when you click **task # 2** and **task # 4**.

**Additional context**
1.OS: Windows(rs6)
2.arcade version: 1.5.27
3.Microsoft MakeCode version: 7.1.4
|
non_defect
|
no response when clicking task and task in chinese describe the bug no response when clicking task and task in chinese steps to reproduce the behavior navigate to change language to chinese mouse slide to the bottom click task and task expect behavior it jumps to task when you click task and same with task actual behavior no response when you click task and task additional context os windows arcade version microsoft makecode version
| 0
|
14,598
| 2,829,610,100
|
IssuesEvent
|
2015-05-23 02:06:28
|
awesomebing1/fuzzdb
|
https://api.github.com/repos/awesomebing1/fuzzdb
|
closed
|
http://www.nureyev-medical.org/forum/virginia-tech-vs-duke-live-streaming-ncaaf-football-2014-online-tv-pc
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
1.
2.
3.
http://www.nureyev-medical.org/forum/virginia-tech-vs-duke-live-streaming-ncaaf-
football-2014-online-tv-pc
http://www.nureyev-medical.org/forum/virginia-tech-vs-duke-live-streaming-ncaaf-
football-2014-online-tv-pc
What is the expected output? What do you see instead?
What version of the product are you using? On what operating system?
Please provide any additional information below.
```
Original issue reported on code.google.com by `sabujhos...@gmail.com` on 15 Nov 2014 at 3:53
|
1.0
|
http://www.nureyev-medical.org/forum/virginia-tech-vs-duke-live-streaming-ncaaf-football-2014-online-tv-pc - ```
What steps will reproduce the problem?
1.
2.
3.
http://www.nureyev-medical.org/forum/virginia-tech-vs-duke-live-streaming-ncaaf-
football-2014-online-tv-pc
http://www.nureyev-medical.org/forum/virginia-tech-vs-duke-live-streaming-ncaaf-
football-2014-online-tv-pc
What is the expected output? What do you see instead?
What version of the product are you using? On what operating system?
Please provide any additional information below.
```
Original issue reported on code.google.com by `sabujhos...@gmail.com` on 15 Nov 2014 at 3:53
|
defect
|
what steps will reproduce the problem football online tv pc football online tv pc what is the expected output what do you see instead what version of the product are you using on what operating system please provide any additional information below original issue reported on code google com by sabujhos gmail com on nov at
| 1
|