| Unnamed: 0 (int64) | id (float64) | type (string) | created_at (string) | repo (string) | repo_url (string) | action (string) | title (string) | labels (string) | body (string) | index (string) | text_combine (string) | label (string) | text (string) | binary_label (int64, 0/1) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
5,234 | 2,573,224,133 | IssuesEvent | 2015-02-11 07:40:09 | HubTurbo/HubTurbo | https://api.github.com/repos/HubTurbo/HubTurbo | closed | Users can see the activity history of a member | feature-collaborators priority.low status.accepted type.story | * Parent(s): #73
<hr>
This should include comments, issue edits, (commits?) etc. | 1.0 | Users can see the activity history of a member - * Parent(s): #73
<hr>
This should include comments, issue edits, (commits?) etc. | non_defect | users can see the activity history of a member parent s this should include comments issue edits commits etc | 0 |
41,054 | 10,278,728,132 | IssuesEvent | 2019-08-25 16:45:20 | joshuaulrich/IBrokers | https://api.github.com/repos/joshuaulrich/IBrokers | closed | Cannot connect with tws | Priority-Medium Type-Defect auto-migrated | ```
What steps will reproduce the problem?
1. run tws. API opened.
2. tws<-twsConnect()
3.
What is the expected output? What do you see instead?
Error in structure(list(s, clientId = clientId, port = port, server.version
= SERVER_VERSION, :
object "SERVER_VERSION" not found
What version of the product are you using? On what operating system?
Windows Vista, R 2.7.2
Please provide any additional information below.
```
Original issue reported on code.google.com by `cinvestor@gmail.com` on 12 Nov 2008 at 5:48
| 1.0 | Cannot connect with tws - ```
What steps will reproduce the problem?
1. run tws. API opened.
2. tws<-twsConnect()
3.
What is the expected output? What do you see instead?
Error in structure(list(s, clientId = clientId, port = port, server.version
= SERVER_VERSION, :
object "SERVER_VERSION" not found
What version of the product are you using? On what operating system?
Windows Vista, R 2.7.2
Please provide any additional information below.
```
Original issue reported on code.google.com by `cinvestor@gmail.com` on 12 Nov 2008 at 5:48
| defect | connot connect with tws what steps will reproduce the problem run tws api opened tws twsconnect what is the expected output what do you see instead error in structure list s clientid clientid port port server version server version object server version not found what version of the product are you using on what operating system windows vista r please provide any additional information below original issue reported on code google com by cinvestor gmail com on nov at | 1 |
24,216 | 3,925,281,559 | IssuesEvent | 2016-04-22 18:21:35 | CMU-CREATE-Lab/create-lab-visual-programmer | https://api.github.com/repos/CMU-CREATE-Lab/create-lab-visual-programmer | closed | Manually entering servo or LED value sometimes has no effect | auto-migrated Priority-Medium Type-Defect | ```
Report from Tom...
I found a really weird bug in the latest visual programmer (pointed out to me
during a screenshare debug session with a customer). Try this:
* Turn on a servo
* Slide the servo to a position
* Now use the box and set that servo to 90 degrees by typing in 90 and hitting
enter
On mine, the slider does not update most of the time. It only happens at the
midway point (90), and it seems to affect the LEDs too (at 50).
Motors/vibration motors and tri-color LEDs don't seem affected.
```
Original issue reported on code.google.com by `garthabr...@gmail.com` on 28 Jul 2014 at 5:06 | 1.0 | Manually entering servo or LED value sometimes has no effect - ```
Report from Tom...
I found a really weird bug in the latest visual programmer (pointed out to me
during a screenshare debug session with a customer). Try this:
* Turn on a servo
* Slide the servo to a position
* Now use the box and set that servo to 90 degrees by typing in 90 and hitting
enter
On mine, the slider does not update most of the time. It only happens at the
midway point (90), and it seems to affect the LEDs too (at 50).
Motors/vibration motors and tri-color LEDs don't seem affected.
```
Original issue reported on code.google.com by `garthabr...@gmail.com` on 28 Jul 2014 at 5:06 | defect | manually entering servo or led value sometimes has no effect report from tom i found a really weird bug in the latest visual programmer pointed out to me during a screenshare debug session with a customer try this turn on a servo slide the servo to a position now use the box and set that servo to degrees by typing in and hitting enter on mine the slider does not update most of the time it only happens at the midway point and it seems to affect the leds too at motors vibration motors and tri color leds don t seem affected original issue reported on code google com by garthabr gmail com on jul at | 1 |
257,158 | 27,561,787,420 | IssuesEvent | 2023-03-07 22:46:22 | samqws-marketing/box_box-ui-elements | https://api.github.com/repos/samqws-marketing/box_box-ui-elements | closed | CVE-2022-29244 (High) detected in npm-6.13.1.tgz - autoclosed | security vulnerability | ## CVE-2022-29244 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>npm-6.13.1.tgz</b></p></summary>
<p>a package manager for JavaScript</p>
<p>Library home page: <a href="https://registry.npmjs.org/npm/-/npm-6.13.1.tgz">https://registry.npmjs.org/npm/-/npm-6.13.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/npm/package.json</p>
<p>
Dependency Hierarchy:
- semantic-release-16.0.2.tgz (Root Library)
- npm-6.0.0.tgz
- :x: **npm-6.13.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/box_box-ui-elements/commit/4fc776e2b95c8b497f6994cb2165365562ae1f82">4fc776e2b95c8b497f6994cb2165365562ae1f82</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
npm pack ignores root-level .gitignore and .npmignore file exclusion directives when run in a workspace or with a workspace flag (ie. `--workspaces`, `--workspace=<name>`). Anyone who has run `npm pack` or `npm publish` inside a workspace, as of v7.9.0 and v7.13.0 respectively, may be affected and have published files into the npm registry they did not intend to include. Users should upgrade to the latest, patched version of npm v8.11.0, run: npm i -g npm@latest . Node.js versions v16.15.1, v17.19.1, and v18.3.0 include the patched v8.11.0 version of npm.
<p>Publish Date: 2022-06-13
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-29244>CVE-2022-29244</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-hj9c-8jmm-8c52">https://github.com/advisories/GHSA-hj9c-8jmm-8c52</a></p>
<p>Release Date: 2022-06-13</p>
<p>Fix Resolution (npm): 6.14.18</p>
<p>Direct dependency fix Resolution (semantic-release): 17.0.1</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| True | CVE-2022-29244 (High) detected in npm-6.13.1.tgz - autoclosed - ## CVE-2022-29244 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>npm-6.13.1.tgz</b></p></summary>
<p>a package manager for JavaScript</p>
<p>Library home page: <a href="https://registry.npmjs.org/npm/-/npm-6.13.1.tgz">https://registry.npmjs.org/npm/-/npm-6.13.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/npm/package.json</p>
<p>
Dependency Hierarchy:
- semantic-release-16.0.2.tgz (Root Library)
- npm-6.0.0.tgz
- :x: **npm-6.13.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/box_box-ui-elements/commit/4fc776e2b95c8b497f6994cb2165365562ae1f82">4fc776e2b95c8b497f6994cb2165365562ae1f82</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
npm pack ignores root-level .gitignore and .npmignore file exclusion directives when run in a workspace or with a workspace flag (ie. `--workspaces`, `--workspace=<name>`). Anyone who has run `npm pack` or `npm publish` inside a workspace, as of v7.9.0 and v7.13.0 respectively, may be affected and have published files into the npm registry they did not intend to include. Users should upgrade to the latest, patched version of npm v8.11.0, run: npm i -g npm@latest . Node.js versions v16.15.1, v17.19.1, and v18.3.0 include the patched v8.11.0 version of npm.
<p>Publish Date: 2022-06-13
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-29244>CVE-2022-29244</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-hj9c-8jmm-8c52">https://github.com/advisories/GHSA-hj9c-8jmm-8c52</a></p>
<p>Release Date: 2022-06-13</p>
<p>Fix Resolution (npm): 6.14.18</p>
<p>Direct dependency fix Resolution (semantic-release): 17.0.1</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| non_defect | cve high detected in npm tgz autoclosed cve high severity vulnerability vulnerable library npm tgz a package manager for javascript library home page a href path to dependency file package json path to vulnerable library node modules npm package json dependency hierarchy semantic release tgz root library npm tgz x npm tgz vulnerable library found in head commit a href found in base branch master vulnerability details npm pack ignores root level gitignore and npmignore file exclusion directives when run in a workspace or with a workspace flag ie workspaces workspace anyone who has run npm pack or npm publish inside a workspace as of and respectively may be affected and have published files into the npm registry they did not intend to include users should upgrade to the latest patched version of npm run npm i g npm latest node js versions and include the patched version of npm publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution npm direct dependency fix resolution semantic release check this box to open an automated fix pr | 0 |
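Stripped of the boilerplate, the advisory in this row reduces to a range check: per its text, `npm pack` leaks workspace files from v7.9.0 (and `npm publish` from v7.13.0) until the patched v8.11.0, with 6.14.18 listed as the fix for the 6.x line pinned here. A minimal sketch of that check in Python; the version boundaries are copied from the advisory text, and the helper names are ours, not part of any npm tooling:

```python
def parse_version(version):
    """Turn a dotted version string like '8.11.0' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def pack_leak_affected(npm_version):
    """True if `npm pack` in a workspace may leak ignored files, per the
    CVE-2022-29244 advisory text (affected from 7.9.0, patched in 8.11.0)."""
    v = parse_version(npm_version)
    return parse_version("7.9.0") <= v < parse_version("8.11.0")

print(pack_leak_affected("7.8.1"))   # False: before the regression
print(pack_leak_affected("8.10.2"))  # True: inside the affected range
print(pack_leak_affected("8.11.0"))  # False: patched
```

Tuple comparison makes `"8.10.2" < "8.11.0"` come out right where plain string comparison would not; a production check would also need to handle pre-release suffixes, which this sketch ignores.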
70,111 | 22,952,113,016 | IssuesEvent | 2022-07-19 08:22:16 | vector-im/element-ios | https://api.github.com/repos/vector-im/element-ios | closed | Mobile FTUE: ReCaptcha form is laggy | T-Defect S-Minor Z-FTUE | The form shown as part of the new FTUE registration flow is slow to respond to taps. | 1.0 | Mobile FTUE: ReCaptcha form is laggy - The form shown as part of the new FTUE registration flow is slow to respond to taps. | defect | mobile ftue recaptcha form is laggy the form shown as part of the new ftue registration flow is slow to respond to taps | 1 |
81,304 | 30,790,025,154 | IssuesEvent | 2023-07-31 15:32:59 | cakephp/cakephp | https://api.github.com/repos/cakephp/cakephp | opened | Wrong relative date / failed unit test on specific dates | defect | ### Description
https://github.com/cakephp/cakephp/blob/e28e6125405bcf447d5c7120b3af8f0f4f47feff/tests/TestCase/I18n/DateTest.php#L505-L507
This test fails on "July, 31st", because `-5 months -2 days` results in "March, 1st" due to PHP's overflow behavior (here in `DateTimeImmutable`), as described here: https://www.php.net/manual/en/datetime.examples-arithmetic.php (Example 3).
Other tests will fail on other dates. The three tests above will fail every year on March, 31st, if the current year is not a leap year.
While I don't expect Chronos to cover those edge-cases and work around PHPs implementation, the tests should not fail, I think.
Should the tests still fail on those days or should they get adjusted so that all `+/- day`-statements use at least 3 days of difference?
### CakePHP Version
4.4.15 / 5.x
### PHP Version
8.2 | 1.0 | Wrong relative date / failed unit test on specific dates - ### Description
https://github.com/cakephp/cakephp/blob/e28e6125405bcf447d5c7120b3af8f0f4f47feff/tests/TestCase/I18n/DateTest.php#L505-L507
This test fails on "July, 31st", because `-5 months -2 days` results in "March, 1st" due to PHP's overflow behavior (here in `DateTimeImmutable`), as described here: https://www.php.net/manual/en/datetime.examples-arithmetic.php (Example 3).
Other tests will fail on other dates. The three tests above will fail every year on March, 31st, if the current year is not a leap year.
While I don't expect Chronos to cover those edge-cases and work around PHPs implementation, the tests should not fail, I think.
Should the tests still fail on those days or should they get adjusted so that all `+/- day`-statements use at least 3 days of difference?
### CakePHP Version
4.4.15 / 5.x
### PHP Version
8.2 | defect | wrong relative date failed unit test on specific dates description this tests fails on july because months days results in march due to php s overflow behavior here in datetimeimmutable as described here example other tests will fail in on other dates the three tests above will fail every year on march if the current is not a leap year while i don t expect chronos to cover those edge cases and work around phps implementation the tests should not fail i think should the tests still fail on those days or should they get adjusted so that all day statements use at least days of difference cakephp version x php version | 1 |
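The overflow described in this row can be reproduced outside PHP. Naive month arithmetic on a month-end date produces a day that does not exist, and carrying the excess forward is what turns `2023-07-31 -5 months -2 days` into March 1st rather than February 26th. Below is a sketch in Python (stdlib only); `add_months` is a hypothetical helper of ours, with `clamp=False` mimicking the overflow behaviour of PHP's `DateTimeImmutable`:

```python
import calendar
from datetime import date, timedelta

def add_months(d, months, clamp=True):
    """Shift a date by whole months. With clamp=True an invalid day is
    clamped to the target month's last day; with clamp=False the excess
    days overflow into the next month (PHP-style)."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    last_day = calendar.monthrange(year, month)[1]
    if d.day <= last_day:
        return date(year, month, d.day)
    if clamp:
        return date(year, month, last_day)
    # Overflow: the nonexistent 2023-02-31 becomes 2023-03-03, as in PHP.
    return date(year, month, last_day) + timedelta(days=d.day - last_day)

start = date(2023, 7, 31)
print(add_months(start, -5, clamp=False) - timedelta(days=2))  # 2023-03-01
print(add_months(start, -5, clamp=True) - timedelta(days=2))   # 2023-02-26
```

A test that wants a date "5 months and 2 days ago" regardless of the day it runs on has to either clamp like this or, as the reporter suggests, keep the day offsets large enough that the overflow never triggers.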
39,512 | 9,525,017,088 | IssuesEvent | 2019-04-28 08:51:52 | cakephp/bake | https://api.github.com/repos/cakephp/bake | closed | Bake model: Cache key `myapp_cake_model_default_"user"` contains invalid characters | Defect | ### Environment:
CakePHP 4.x
Bake 4.x
PHP 7.3.2
PostgreSQL 11.2
### What I did:
bin/cake bake model Entry
### Result (not as expected):
```
Exception: Cache key `myapp_cake_model_default_"user"` contains invalid characters. You cannot use /, \, <, >, ?, :, |, *, or " in cache keys. in [/usr/local/.../lfcakephp4/vendor/cakephp/cakephp/src/Cache/Engine/FileEngine.php, line 431]
2019-04-26 18:12:58 Error: [Cake\Cache\InvalidArgumentException] Cache key `myapp_cake_model_default_"user"` contains invalid characters. You cannot use /, \, <, >, ?, :, |, *, or " in cache keys. in /usr/local/.../lfcakephp4/vendor/cakephp/cakephp/src/Cache/Engine/FileEngine.php on line 431
Stack Trace:
#0 /usr/local/www/.../vendor/cakephp/cakephp/src/Cache/Engine/FileEngine.php(174): Cake\Cache\Engine\FileEngine->_key('myapp_cake_mode...')
#1 /usr/local/www/.../lfcakephp4/vendor/cakephp/cakephp/src/Cache/Cache.php(327): Cake\Cache\Engine\FileEngine->get('default_"user"')
#2 /usr/local/www/.../lfcakephp4/vendor/cakephp/cakephp/src/Database/Schema/CachedCollection.php(79): Cake\Cache\Cache::read('default_"user"', '_cake_model_')
#3 /usr/local/www/.../lfcakephp4/vendor/cakephp/cakephp/src/ORM/Table.php(487): Cake\Database\Schema\CachedCollection->describe('"user"')
#4 /usr/local/www/.../lfcakephp4/vendor/cakephp/bake/src/Command/ModelCommand.php(913): Cake\ORM\Table->getSchema()
#5 /usr/local/www/.../lfcakephp4/vendor/cakephp/bake/src/Command/ModelCommand.php(890): Bake\Command\ModelCommand->getCounterCache(Object(Cake\ORM\Table))
#6 /usr/local/www/.../lfcakephp4/vendor/cakephp/bake/src/Command/ModelCommand.php(131): Bake\Command\ModelCommand->getBehaviors(Object(Cake\ORM\Table))
#7 /usr/local/www/.../lfcakephp4/vendor/cakephp/bake/src/Command/ModelCommand.php(97): Bake\Command\ModelCommand->getTableContext(Object(Cake\ORM\Table), 'entry', 'Entry', Object(Cake\Console\Arguments), Object(Cake\Console\ConsoleIo))
#8 /usr/local/www/.../lfcakephp4/vendor/cakephp/bake/src/Command/ModelCommand.php(79): Bake\Command\ModelCommand->bake('Entry', Object(Cake\Console\Arguments), Object(Cake\Console\ConsoleIo))
#9 /usr/local/www/..../lfcakephp4/vendor/cakephp/cakephp/src/Console/Command.php(178): Bake\Command\ModelCommand->execute(Object(Cake\Console\Arguments), Object(Cake\Console\ConsoleIo))
#10 /usr/local/www/.../lfcakephp4/vendor/cakephp/cakephp/src/Console/CommandRunner.php(323): Cake\Console\Command->run(Array, Object(Cake\Console\ConsoleIo))
#11 /usr/local/www/.../lfcakephp4/vendor/cakephp/cakephp/src/Console/CommandRunner.php(161): Cake\Console\CommandRunner->runCommand(Object(Bake\Command\ModelCommand), Array, Object(Cake\Console\ConsoleIo))
#12 /usr/local/www/.../lfcakephp4/bin/cake.php(12): Cake\Console\CommandRunner->run(Array)
#13 {main}
```
| 1.0 | Bake model: Cache key `myapp_cake_model_default_"user"` contains invalid characters - ### Environment:
CakePHP 4.x
Bake 4.x
PHP 7.3.2
PostgreSQL 11.2
### What I did:
bin/cake bake model Entry
### Result (not as expected):
```
Exception: Cache key `myapp_cake_model_default_"user"` contains invalid characters. You cannot use /, \, <, >, ?, :, |, *, or " in cache keys. in [/usr/local/.../lfcakephp4/vendor/cakephp/cakephp/src/Cache/Engine/FileEngine.php, line 431]
2019-04-26 18:12:58 Error: [Cake\Cache\InvalidArgumentException] Cache key `myapp_cake_model_default_"user"` contains invalid characters. You cannot use /, \, <, >, ?, :, |, *, or " in cache keys. in /usr/local/.../lfcakephp4/vendor/cakephp/cakephp/src/Cache/Engine/FileEngine.php on line 431
Stack Trace:
#0 /usr/local/www/.../vendor/cakephp/cakephp/src/Cache/Engine/FileEngine.php(174): Cake\Cache\Engine\FileEngine->_key('myapp_cake_mode...')
#1 /usr/local/www/.../lfcakephp4/vendor/cakephp/cakephp/src/Cache/Cache.php(327): Cake\Cache\Engine\FileEngine->get('default_"user"')
#2 /usr/local/www/.../lfcakephp4/vendor/cakephp/cakephp/src/Database/Schema/CachedCollection.php(79): Cake\Cache\Cache::read('default_"user"', '_cake_model_')
#3 /usr/local/www/.../lfcakephp4/vendor/cakephp/cakephp/src/ORM/Table.php(487): Cake\Database\Schema\CachedCollection->describe('"user"')
#4 /usr/local/www/.../lfcakephp4/vendor/cakephp/bake/src/Command/ModelCommand.php(913): Cake\ORM\Table->getSchema()
#5 /usr/local/www/.../lfcakephp4/vendor/cakephp/bake/src/Command/ModelCommand.php(890): Bake\Command\ModelCommand->getCounterCache(Object(Cake\ORM\Table))
#6 /usr/local/www/.../lfcakephp4/vendor/cakephp/bake/src/Command/ModelCommand.php(131): Bake\Command\ModelCommand->getBehaviors(Object(Cake\ORM\Table))
#7 /usr/local/www/.../lfcakephp4/vendor/cakephp/bake/src/Command/ModelCommand.php(97): Bake\Command\ModelCommand->getTableContext(Object(Cake\ORM\Table), 'entry', 'Entry', Object(Cake\Console\Arguments), Object(Cake\Console\ConsoleIo))
#8 /usr/local/www/.../lfcakephp4/vendor/cakephp/bake/src/Command/ModelCommand.php(79): Bake\Command\ModelCommand->bake('Entry', Object(Cake\Console\Arguments), Object(Cake\Console\ConsoleIo))
#9 /usr/local/www/..../lfcakephp4/vendor/cakephp/cakephp/src/Console/Command.php(178): Bake\Command\ModelCommand->execute(Object(Cake\Console\Arguments), Object(Cake\Console\ConsoleIo))
#10 /usr/local/www/.../lfcakephp4/vendor/cakephp/cakephp/src/Console/CommandRunner.php(323): Cake\Console\Command->run(Array, Object(Cake\Console\ConsoleIo))
#11 /usr/local/www/.../lfcakephp4/vendor/cakephp/cakephp/src/Console/CommandRunner.php(161): Cake\Console\CommandRunner->runCommand(Object(Bake\Command\ModelCommand), Array, Object(Cake\Console\ConsoleIo))
#12 /usr/local/www/.../lfcakephp4/bin/cake.php(12): Cake\Console\CommandRunner->run(Array)
#13 {main}
```
| defect | bake model cache key myapp cake model default user contains invalid characters environment cakephp x bake x php postgresql what i did bin cake bake model entry result not as expected exception cache key myapp cake model default user contains invalid characters you cannot use or in cache keys in error cache key myapp cake model default user contains invalid characters you cannot use or in cache keys in usr local vendor cakephp cakephp src cache engine fileengine php on line stack trace usr local www vendor cakephp cakephp src cache engine fileengine php cake cache engine fileengine key myapp cake mode usr local www vendor cakephp cakephp src cache cache php cake cache engine fileengine get default user usr local www vendor cakephp cakephp src database schema cachedcollection php cake cache cache read default user cake model usr local www vendor cakephp cakephp src orm table php cake database schema cachedcollection describe user usr local www vendor cakephp bake src command modelcommand php cake orm table getschema usr local www vendor cakephp bake src command modelcommand php bake command modelcommand getcountercache object cake orm table usr local www vendor cakephp bake src command modelcommand php bake command modelcommand getbehaviors object cake orm table usr local www vendor cakephp bake src command modelcommand php bake command modelcommand gettablecontext object cake orm table entry entry object cake console arguments object cake console consoleio usr local www vendor cakephp bake src command modelcommand php bake command modelcommand bake entry object cake console arguments object cake console consoleio usr local www vendor cakephp cakephp src console command php bake command modelcommand execute object cake console arguments object cake console consoleio usr local www vendor cakephp cakephp src console commandrunner php cake console command run array object cake console consoleio usr local www vendor cakephp cakephp src console commandrunner 
php cake console commandrunner runcommand object bake command modelcommand array object cake console consoleio usr local www bin cake php cake console commandrunner run array main | 1 |
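The stack trace in this row shows a quoted Postgres identifier (`"user"`) leaking into a cache key, which the file-based cache engine then rejects. The character check itself is simple; here is a hedged sketch in Python of a FileEngine-style key validation (the character set is taken from the error message; the function name is ours, not CakePHP API):

```python
# Characters the error message lists as unsafe in file-based cache keys.
INVALID_CACHE_KEY_CHARS = set('/\\<>?:|*"')

def check_cache_key(key):
    """Raise if the key contains characters that are unsafe in file names,
    mirroring the check reported by CakePHP's FileEngine."""
    bad = INVALID_CACHE_KEY_CHARS.intersection(key)
    if bad:
        raise ValueError(
            f"Cache key `{key}` contains invalid characters: {sorted(bad)}"
        )
    return key

check_cache_key("myapp_cake_model_default_user")      # passes
# check_cache_key('myapp_cake_model_default_"user"')  # would raise ValueError
```

The validation is only the messenger here: the real fix has to happen on the caller's side, by normalizing the quoted identifier before it reaches the cache layer.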
26,598 | 4,770,549,298 | IssuesEvent | 2016-10-26 15:32:46 | buildo/prisma | https://api.github.com/repos/buildo/prisma | reopened | [macro] label brutta | defect macro | [Trello card](https://trello.com/c/5810c83f9f0312ae28bd84f5/)
## description
the label is currently ugly
## how to reproduce
- {optional: describe steps to reproduce defect}
## specs
{optional: describe a possible fix for this defect, if not obvious}
## misc
{optional: other useful info}
## sub-issues
- [ ] sub-issue!! #49
[@prismabot comment: don't remove]: # (Card: #5810c83f9f0312ae28bd84f5) | 1.0 | [macro] label brutta - [Trello card](https://trello.com/c/5810c83f9f0312ae28bd84f5/)
## description
the label is currently ugly
## how to reproduce
- {optional: describe steps to reproduce defect}
## specs
{optional: describe a possible fix for this defect, if not obvious}
## misc
{optional: other useful info}
## sub-issues
- [ ] sub-issue!! #49
[@prismabot comment: don't remove]: # (Card: #5810c83f9f0312ae28bd84f5) | defect | label brutta description la label al momento é brutta how to reproduce optional describe steps to reproduce defect specs optional describe a possible fix for this defect if not obvious misc optional other useful info sub issues sub issue card | 1 |
61,081 | 17,023,596,765 | IssuesEvent | 2021-07-03 02:50:26 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | /etc/init.d/renderd has inconsistent output for start/stop vs restart | Component: mapnik Priority: minor Resolution: fixed Type: defect | **[Submitted to the original trac issue database at 4.36pm, Sunday, 23rd May 2010]**
I'm running renderd/mod_tile (r21420) on Ubuntu Lucid, build using the debian dpkg-buildpackage.
If you run ```/etc/init.d/renderd start``` to start renderd, you get no output. The same happens with ```stop```.
However if you do ```/etc/init.d/renderd restart``` you get the familiar "Restarting SERVICE.... [done]" output.
This is internally inconsistent behaviour. The normal debian/ubuntu idea is that restart should be similar to doing a start then a stop. It is strange that you get no output for start or stop, but output for restart. To make it internally consistent, there should be either (a) no output from either or (b) similar output for start, stop and restart.
This is also inconsistent behaviour compared to other debian/ubuntu packages. It's common to have a "Start SERVICE... [done]" output message; e.g. apache2 does this. Ergo, I suggest making /etc/init.d/renderd output messages on start, stop and restart.
Output for all can be enabled with this patch (which just removes the 'VERBOSE' checks):
```
Index: debian/renderd.init
===================================================================
--- debian/renderd.init (revision 21420)
+++ debian/renderd.init (working copy)
@@ -95,19 +95,19 @@
case "$1" in
start)
- [ "$VERBOSE" != no ] && log_daemon_msg "Starting $DESC" "$NAME"
+ log_daemon_msg "Starting $DESC" "$NAME"
do_start
case "$?" in
- 0|1) [ "$VERBOSE" != no ] && log_end_msg 0 ;;
- 2) [ "$VERBOSE" != no ] && log_end_msg 1 ;;
+ 0|1) log_end_msg 0 ;;
+ 2) log_end_msg 1 ;;
esac
;;
stop)
- [ "$VERBOSE" != no ] && log_daemon_msg "Stopping $DESC" "$NAME"
+ log_daemon_msg "Stopping $DESC" "$NAME"
do_stop
case "$?" in
- 0|1) [ "$VERBOSE" != no ] && log_end_msg 0 ;;
- 2) [ "$VERBOSE" != no ] && log_end_msg 1 ;;
+ 0|1) log_end_msg 0 ;;
+ 2) log_end_msg 1 ;;
esac
;;
#reload|force-reload)
``` | 1.0 | /etc/init.d/renderd has inconsistent output for start/stop vs restart - **[Submitted to the original trac issue database at 4.36pm, Sunday, 23rd May 2010]**
I'm running renderd/mod_tile (r21420) on Ubuntu Lucid, build using the debian dpkg-buildpackage.
If you run ```/etc/init.d/renderd start``` to start renderd, you get no output. The same happens with ```stop```.
However if you do ```/etc/init.d/renderd restart``` you get the familiar "Restarting SERVICE.... [done]" output.
This is internally inconsistent behaviour. The normal debian/ubuntu idea is that restart should be similar to doing a start then a stop. It is strange that you get no output for start or stop, but output for restart. To make it internally consistent, there should be either (a) no output from either or (b) similar output for start, stop and restart.
This is also inconsistent behaviour compared to other debian/ubuntu packages. It's common to have a "Start SERVICE... [done]" output message; e.g. apache2 does this. Ergo, I suggest making /etc/init.d/renderd output messages on start, stop and restart.
Output for all can be enabled with this patch (which just removes the 'VERBOSE' checks):
```
Index: debian/renderd.init
===================================================================
--- debian/renderd.init (revision 21420)
+++ debian/renderd.init (working copy)
@@ -95,19 +95,19 @@
case "$1" in
start)
- [ "$VERBOSE" != no ] && log_daemon_msg "Starting $DESC" "$NAME"
+ log_daemon_msg "Starting $DESC" "$NAME"
do_start
case "$?" in
- 0|1) [ "$VERBOSE" != no ] && log_end_msg 0 ;;
- 2) [ "$VERBOSE" != no ] && log_end_msg 1 ;;
+ 0|1) log_end_msg 0 ;;
+ 2) log_end_msg 1 ;;
esac
;;
stop)
- [ "$VERBOSE" != no ] && log_daemon_msg "Stopping $DESC" "$NAME"
+ log_daemon_msg "Stopping $DESC" "$NAME"
do_stop
case "$?" in
- 0|1) [ "$VERBOSE" != no ] && log_end_msg 0 ;;
- 2) [ "$VERBOSE" != no ] && log_end_msg 1 ;;
+ 0|1) log_end_msg 0 ;;
+ 2) log_end_msg 1 ;;
esac
;;
#reload|force-reload)
``` | defect | etc init d renderd has inconsistant output for start stop vs restart i m running renderd mod tile on ubuntu lucid build using the debian dpkg buildpackage if you run etc init d renderd start to start renderd you get no output the same happens with stop however if you do etc init d renderd restart you get the familiar restarting service output this is internally inconsistant behaviour the normal debian ubuntu idea is that restart should be similar to doing a start then a stop it is strange that you get no output for start or stop but output for restart to make it internally consistant there should be either a no output from either or b similar output for start stop and restart this is also inconsistant behaviour compared to other debian ubuntu packages it s common to have a start service output messages e g does this ergo i suggest making etc init d renderd output messages on start stop and restart output for all can be enabled with this patch which just removes the verbose checks index debian renderd init debian renderd init revision debian renderd init working copy case in start log daemon msg starting desc name log daemon msg starting desc name do start case in log end msg log end msg log end msg log end msg esac stop log daemon msg stopping desc name log daemon msg stopping desc name do stop case in log end msg log end msg log end msg log end msg esac reload force reload | 1 |
53,029 | 13,260,072,553 | IssuesEvent | 2020-08-20 17:38:51 | jkoan/test-navit | https://api.github.com/repos/jkoan/test-navit | closed | call of navit with command line parameter different from valid navit.xml dumps core (Trac #30) | Incomplete Migration Migrated from Trac core defect/bug somebody | Migrated from http://trac.navit-project.org/ticket/30
```json
{
"status": "closed",
"changetime": "2007-11-27T09:30:31",
"_ts": "1196155831000000",
"description": "uwe@rock:~/navit$ navit g\n\n** ERROR **: Fehler beim Parsen von 'g': Failed to open file 'g': No such file or directory\n\naborting...\nAborted (core dumped)",
"reporter": "asys3@yahoo.com",
"cc": "",
"resolution": "fixed",
"time": "2007-11-25T20:38:04",
"component": "core",
"summary": "call of navit with command line parameter different from valid navit.xml dumps core",
"priority": "trivial",
"keywords": "",
"version": "0.0.3",
"milestone": "",
"owner": "somebody",
"type": "defect/bug",
"severity": ""
}
```
| 1.0 | call of navit with command line parameter different from valid navit.xml dumps core (Trac #30) - Migrated from http://trac.navit-project.org/ticket/30
```json
{
"status": "closed",
"changetime": "2007-11-27T09:30:31",
"_ts": "1196155831000000",
"description": "uwe@rock:~/navit$ navit g\n\n** ERROR **: Fehler beim Parsen von 'g': Failed to open file 'g': No such file or directory\n\naborting...\nAborted (core dumped)",
"reporter": "asys3@yahoo.com",
"cc": "",
"resolution": "fixed",
"time": "2007-11-25T20:38:04",
"component": "core",
"summary": "call of navit with command line parameter different from valid navit.xml dumps core",
"priority": "trivial",
"keywords": "",
"version": "0.0.3",
"milestone": "",
"owner": "somebody",
"type": "defect/bug",
"severity": ""
}
```
| defect | call of navit with command line parameter different from valid navit xml dumps core trac migrated from json status closed changetime ts description uwe rock navit navit g n n error fehler beim parsen von g failed to open file g no such file or directory n naborting naborted core dumped reporter yahoo com cc resolution fixed time component core summary call of navit with command line parameter different from valid navit xml dumps core priority trivial keywords version milestone owner somebody type defect bug severity | 1 |
25,840 | 4,470,125,182 | IssuesEvent | 2016-08-25 15:02:32 | contao/core | https://api.github.com/repos/contao/core | closed | User, member, article icons status behaves incorrect on eye toggeling in combination with auto activation | defect | Ich machs mal auf deutsch, damit mich überhaupt jemand versteht.
Online-Demo:
1. Ein **Mitglied** zum Bearbeiten öffnen
2. Die Autoaktivierung auf einen Zeitpunkt in der Zukunft stellen
3. Speichern und schließen
4. Das Auge ist jetzt grün und das User-Icon grau (richtig)
5. Auf das Auge klicken
Man sieht jetzt, dass die beiden Icons synchron grau und farbig werden sobald man das Auge klickt. Der richtige Zustand - User-Icon grau, Auge grün - lässt sich so nicht wieder herstellen, erst wenn das Auge grün ist und man die Seite neu lädt.
Ich habe es noch mit den **Benutzern**, den **Seiten** und **Artikeln** getestet. Bei den Benutzern und Artikeln ist es genauso. Bei den Seiten ist es korrekt, das Seiten-Icon wird hier nicht durch Klicks auf das Auge beeinflusst, wenn die Autoaktivierung in der Zukunft erfolgen soll.
Ich weiß nicht, wo das sonst noch eingebaut ist, da ich diese Autoaktivierung nie benutze und diese Option in meinen Backends deshalb auch gar nicht zur Verfügung steht. | 1.0 | User, member, article icons status behaves incorrect on eye toggeling in combination with auto activation - Ich machs mal auf deutsch, damit mich überhaupt jemand versteht.
Online-Demo:
1. Ein **Mitglied** zum Bearbeiten öffnen
2. Die Autoaktivierung auf einen Zeitpunkt in der Zukunft stellen
3. Speichern und schließen
4. Das Auge ist jetzt grün und das User-Icon grau (richtig)
5. Auf das Auge klicken
Man sieht jetzt, dass die beiden Icons synchron grau und farbig werden sobald man das Auge klickt. Der richtige Zustand - User-Icon grau, Auge grün - lässt sich so nicht wieder herstellen, erst wenn das Auge grün ist und man die Seite neu lädt.
Ich habe es noch mit den **Benutzern**, den **Seiten** und **Artikeln** getestet. Bei den Benutzern und Artikeln ist es genauso. Bei den Seiten ist es korrekt, das Seiten-Icon wird hier nicht durch Klicks auf das Auge beeinflusst, wenn die Autoaktivierung in der Zukunft erfolgen soll.
Ich weiß nicht, wo das sonst noch eingebaut ist, da ich diese Autoaktivierung nie benutze und diese Option in meinen Backends deshalb auch gar nicht zur Verfügung steht. | defect | user member article icons status behaves incorrect on eye toggeling in combination with auto activation ich machs mal auf deutsch damit mich überhaupt jemand versteht online demo ein mitglied zum bearbeiten öffnen die autoaktivierung auf einen zeitpunkt in der zukunft stellen speichern und schließen das auge ist jetzt grün und das user icon grau richtig auf das auge klicken man sieht jetzt dass die beiden icons synchron grau und farbig werden sobald man das auge klickt der richtige zustand user icon grau auge grün lässt sich so nicht wieder herstellen erst wenn das auge grün ist und man die seite neu lädt ich habe es noch mit den benutzern den seiten und artikeln getestet bei den benutzern und artikeln ist es genauso bei den seiten ist es korrekt das seiten icon wird hier nicht durch klicks auf das auge beeinflusst wenn die autoaktivierung in der zukunft erfolgen soll ich weiß nicht wo das sonst noch eingebaut ist da ich diese autoaktivierung nie benutze und diese option in meinen backends deshalb auch gar nicht zur verfügung steht | 1 |
58,624 | 16,656,086,618 | IssuesEvent | 2021-06-05 14:54:26 | meerk40t/meerk40t | https://api.github.com/repos/meerk40t/meerk40t | closed | Prerelease 0.7.0: Testing... Report any issues found here. | Context: UI/UX Priority: Critical Status: In Progress Type: Bug Type: Defect Type: Documentation Type: Enhancement Type: Maintenance Work: Chaotic [̲̅$̲̅(̲̅ιοο̲̅)̲̅$̲̅] help wanted | # PLEASE DO ALL FURTHER TESTING ON v0.7.0b30 RC-5
The latest beta release is v0.7.0beta30 Release Candidate 5.
Please do all further testing on this release. Please do not report any further bugs with previous beta releases.
https://github.com/meerk40t/meerk40t/releases/tag/0.7.0-beta30
There are still some other significant bugs and a lot of small tweaks still to come..
# CAUTION
**CAUTION:** Please note that (depending on your design) the path optimisation in combination with multiple passes of the same path can result in the laser beam burning the same path several times in quick succession without allowing time for the material to cool down between. This can have unintended consequences. **Please do not use this feature of the beta whilst your K40 is unattended.** | 1.0 | Prerelease 0.7.0: Testing... Report any issues found here. - # PLEASE DO ALL FURTHER TESTING ON v0.7.0b30 RC-5
The latest beta release is v0.7.0beta30 Release Candidate 5.
Please do all further testing on this release. Please do not report any further bugs with previous beta releases.
https://github.com/meerk40t/meerk40t/releases/tag/0.7.0-beta30
There are still some other significant bugs and a lot of small tweaks still to come..
# CAUTION
**CAUTION:** Please note that (depending on your design) the path optimisation in combination with multiple passes of the same path can result in the laser beam burning the same path several times in quick succession without allowing time for the material to cool down between. This can have unintended consequences. **Please do not use this feature of the beta whilst your K40 is unattended.** | defect | prerelease testing report any issues found here please do all further testing on rc the latest beta release is release candidate please do all further testing on this release please do not report any further bugs with previous beta releases there are still some other significant bugs and a lot of small tweaks still to come caution caution please note that depending on your design the path optimisation in combination with multiple passes of the same path can result in the laser beam burning the same path several times in quick succession without allowing time for the material to cool down between this can have unintended consequences please do not use this feature of the beta whilst your is unattended | 1 |
49,706 | 13,187,254,403 | IssuesEvent | 2020-08-13 02:50:09 | icecube-trac/tix3 | https://api.github.com/repos/icecube-trac/tix3 | opened | HS processing with long queue broken (Trac #1907) | Incomplete Migration Migrated from Trac defect other | <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1907">https://code.icecube.wisc.edu/ticket/1907</a>, reported by dheereman and owned by dheereman</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2016-11-06T20:51:10",
"description": "Some of your recent Condor jobs have used the following\ninvalid accounting group names(s):\nlong.hitspool\n\nAll accounting groups must match GROUPNAME.USERNAME pattern,\nwhere USERNAME is /your/ user name and GROUPNAME is one of:\n1_week 2_week admin backfill gpu quicktest sanctioned uwa\n (note that both user name and group name must be lower case)\n\nPlease see the link below for more information:\nhttps://wiki.icecube.wisc.edu/index.php/Condor/FAQ#Accounting_Groups\n\nGenerated automatically by /etc/cron.daily/invalid_groups.sh",
"reporter": "dheereman",
"cc": "gmoment",
"resolution": "invalid",
"_ts": "1478465470883821",
"component": "other",
"summary": "HS processing with long queue broken",
"priority": "normal",
"keywords": "hitspool processing",
"time": "2016-11-06T17:28:32",
"milestone": "",
"owner": "dheereman",
"type": "defect"
}
```
</p>
</details>
| 1.0 | HS processing with long queue broken (Trac #1907) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1907">https://code.icecube.wisc.edu/ticket/1907</a>, reported by dheereman and owned by dheereman</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2016-11-06T20:51:10",
"description": "Some of your recent Condor jobs have used the following\ninvalid accounting group names(s):\nlong.hitspool\n\nAll accounting groups must match GROUPNAME.USERNAME pattern,\nwhere USERNAME is /your/ user name and GROUPNAME is one of:\n1_week 2_week admin backfill gpu quicktest sanctioned uwa\n (note that both user name and group name must be lower case)\n\nPlease see the link below for more information:\nhttps://wiki.icecube.wisc.edu/index.php/Condor/FAQ#Accounting_Groups\n\nGenerated automatically by /etc/cron.daily/invalid_groups.sh",
"reporter": "dheereman",
"cc": "gmoment",
"resolution": "invalid",
"_ts": "1478465470883821",
"component": "other",
"summary": "HS processing with long queue broken",
"priority": "normal",
"keywords": "hitspool processing",
"time": "2016-11-06T17:28:32",
"milestone": "",
"owner": "dheereman",
"type": "defect"
}
```
</p>
</details>
| defect | hs processing with long queue broken trac migrated from json status closed changetime description some of your recent condor jobs have used the following ninvalid accounting group names s nlong hitspool n nall accounting groups must match groupname username pattern nwhere username is your user name and groupname is one of week week admin backfill gpu quicktest sanctioned uwa n note that both user name and group name must be lower case n nplease see the link below for more information n automatically by etc cron daily invalid groups sh reporter dheereman cc gmoment resolution invalid ts component other summary hs processing with long queue broken priority normal keywords hitspool processing time milestone owner dheereman type defect | 1 |
355,143 | 10,576,931,154 | IssuesEvent | 2019-10-07 18:57:58 | compodoc/compodoc | https://api.github.com/repos/compodoc/compodoc | closed | [BUG] Unable to generate doc for routing based on environment variable | Context : routing Priority: Medium Status: Accepted Time: ~3 hours Type: Bug wontfix | <!--
> Please follow the issue template below for bug reports and queries.
> For issue, start the label of the title with [BUG]
> For feature requests, start the label of the title with [FEATURE] and explain your use case and ideas clearly below, you can remove sections which are not relevant.
-->
##### **Overview of the issue**
<!-- explain the issue or feature request, if an error is being thrown a stack trace helps -->
So I have implemented routes based on environment variable and compodoc fails to generate doc for the same,
I have environment.ts decides if its "webadmin" or "userportal" based on that it will load the routes for the webadmin or userportal
Unhandled Rejection at: Promise {
<rejected> { SyntaxError: Syntax error at line 1 column 11 of the JSON5 data. Still to read: "[\"environment\".\"appl"
at error (/home/cyberoam/pritesh/git/picasso-develop/ui/node_modules/json5/lib/json5.js:56:25)
at Object.parse (/home/cyberoam/pritesh/git/picasso-develop/ui/node_modules/json5/lib/json5.js:511:13)
at loopModulesParser (/home/cyberoam/pritesh/git/picasso-develop/ui/node_modules/@compodoc/compodoc/dist/index-cli.js:810:48)
##### **Operating System, Node.js, npm, compodoc version(s)**
Operating System : Ubuntu 14.04 LTS,
Node.js : v8.6.0
compodoc version : 1.0.5
##### **Angular configuration, a `package.json` file in the root folder**
attached
[package.json.zip](https://github.com/compodoc/compodoc/files/1538249/package.json.zip)
##### **Compodoc installed globally or locally ?**
locally
##### **Motivation for or Use Case**
Unable to generate doc with current version
##### **Reproduce the error**
Install compodoc locally
run compodoc from ng cli using npm run compodoc
##### **Related issues**
No
##### **Suggest a Fix**
<!-- if you can't fix the bug yourself, perhaps you can point to what might be
causing the problem (line of code or commit) -->
| 1.0 | [BUG] Unable to generate doc for routing based on environment variable - <!--
> Please follow the issue template below for bug reports and queries.
> For issue, start the label of the title with [BUG]
> For feature requests, start the label of the title with [FEATURE] and explain your use case and ideas clearly below, you can remove sections which are not relevant.
-->
##### **Overview of the issue**
<!-- explain the issue or feature request, if an error is being thrown a stack trace helps -->
So I have implemented routes based on environment variable and compodoc fails to generate doc for the same,
I have environment.ts decides if its "webadmin" or "userportal" based on that it will load the routes for the webadmin or userportal
Unhandled Rejection at: Promise {
<rejected> { SyntaxError: Syntax error at line 1 column 11 of the JSON5 data. Still to read: "[\"environment\".\"appl"
at error (/home/cyberoam/pritesh/git/picasso-develop/ui/node_modules/json5/lib/json5.js:56:25)
at Object.parse (/home/cyberoam/pritesh/git/picasso-develop/ui/node_modules/json5/lib/json5.js:511:13)
at loopModulesParser (/home/cyberoam/pritesh/git/picasso-develop/ui/node_modules/@compodoc/compodoc/dist/index-cli.js:810:48)
##### **Operating System, Node.js, npm, compodoc version(s)**
Operating System : Ubuntu 14.04 LTS,
Node.js : v8.6.0
compodoc version : 1.0.5
##### **Angular configuration, a `package.json` file in the root folder**
attached
[package.json.zip](https://github.com/compodoc/compodoc/files/1538249/package.json.zip)
##### **Compodoc installed globally or locally ?**
locally
##### **Motivation for or Use Case**
Unable to generate doc with current version
##### **Reproduce the error**
Install compodoc locally
run compodoc from ng cli using npm run compodoc
##### **Related issues**
No
##### **Suggest a Fix**
<!-- if you can't fix the bug yourself, perhaps you can point to what might be
causing the problem (line of code or commit) -->
| non_defect | unable to generate doc for routing based on environment variable please follow the issue template below for bug reports and queries for issue start the label of the title with for feature requests start the label of the title with and explain your use case and ideas clearly below you can remove sections which are not relevant overview of the issue so i have implemented routes based on environment variable and compodoc fails to generate doc for the same i have environment ts decides if its webadmin or userportal based on that it will load the routes for the webadmin or userportal unhandled rejection at promise syntaxerror syntax error at line column of the data still to read environment appl at error home cyberoam pritesh git picasso develop ui node modules lib js at object parse home cyberoam pritesh git picasso develop ui node modules lib js at loopmodulesparser home cyberoam pritesh git picasso develop ui node modules compodoc compodoc dist index cli js operating system node js npm compodoc version s operating system ubuntu lts node js compodoc version angular configuration a package json file in the root folder attached compodoc installed globally or locally locally motivation for or use case unable to generate doc with current version reproduce the error install compodoc locally run compodoc from ng cli using npm run compodoc related issues no suggest a fix if you can t fix the bug yourself perhaps you can point to what might be causing the problem line of code or commit | 0 |
374,544 | 26,122,391,182 | IssuesEvent | 2022-12-28 14:10:30 | ervinteoh/flask-demo-login | https://api.github.com/repos/ervinteoh/flask-demo-login | opened | [DOC]: Wiki & README | documentation | **Describe the solution you'd like**
A simple to follow and clear readme to guide the user how to install and launch the application.
**Additional context**
Wiki should be more detailed written with GitHub Wiki page for the current repository.
**Tasks**
- [ ] README.md
- [ ] Wiki Structure
| 1.0 | [DOC]: Wiki & README - **Describe the solution you'd like**
A simple to follow and clear readme to guide the user how to install and launch the application.
**Additional context**
Wiki should be more detailed written with GitHub Wiki page for the current repository.
**Tasks**
- [ ] README.md
- [ ] Wiki Structure
| non_defect | wiki readme describe the solution you d like a simple to follow and clear readme to guide the user how to install and launch the application additional context wiki should be more detailed written with github wiki page for the current repository tasks readme md wiki structure | 0 |
40,012 | 9,794,497,950 | IssuesEvent | 2019-06-10 23:12:55 | scipy/scipy | https://api.github.com/repos/scipy/scipy | closed | sparse matrices indexing with scipy 1.3 | defect scipy.sparse | Indexing csr or csc sparce matrix return an array of one element with scipy 1.3, With previous version it returned a scalar. I believe it is an oversight since lil_matrix indexing still return a scalar.
```
import scipy
import numpy as np
print(type(scipy.sparse.csr_matrix(np.eye(3))[0,0]))
print(type(scipy.sparse.csc_matrix(np.eye(3))[0,0]))
print(type(scipy.sparse.lil_matrix(np.eye(3))[0,0]))
```
```
<class 'numpy.ndarray'>
<class 'numpy.ndarray'>
<class 'numpy.float64'>
```
### Scipy/Numpy/Python version information:
scipy 1.3.0
numpy 1.16.3
python 3.7.0 | 1.0 | sparse matrices indexing with scipy 1.3 - Indexing csr or csc sparce matrix return an array of one element with scipy 1.3, With previous version it returned a scalar. I believe it is an oversight since lil_matrix indexing still return a scalar.
```
import scipy
import numpy as np
print(type(scipy.sparse.csr_matrix(np.eye(3))[0,0]))
print(type(scipy.sparse.csc_matrix(np.eye(3))[0,0]))
print(type(scipy.sparse.lil_matrix(np.eye(3))[0,0]))
```
```
<class 'numpy.ndarray'>
<class 'numpy.ndarray'>
<class 'numpy.float64'>
```
### Scipy/Numpy/Python version information:
scipy 1.3.0
numpy 1.16.3
python 3.7.0 | defect | sparse matrices indexing with scipy indexing csr or csc sparce matrix return an array of one element with scipy with previous version it returned a scalar i believe it is an oversight since lil matrix indexing still return a scalar import scipy import numpy as np print type scipy sparse csr matrix np eye print type scipy sparse csc matrix np eye print type scipy sparse lil matrix np eye scipy numpy python version information scipy numpy python | 1 |
64,953 | 18,961,046,644 | IssuesEvent | 2021-11-19 04:57:13 | vector-im/element-web | https://api.github.com/repos/vector-im/element-web | closed | Cannot read property 'store' of undefined | T-Defect P1 Z-Rageshake A-Storage | ```
2019-07-15T12:51:11.324Z E Unable to load session Cannot read property 'store' of null
TypeError: Cannot read property 'store' of null
at Object.t.trackStores (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:63:263291)
at e.<anonymous> (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:9665)
at E (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:38:13738)
at Generator._invoke (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:38:13526)
at Generator.e.(anonymous function) [as next] (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:38:13917)
at Generator.c (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:84959)
at f._promiseFulfilled (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:42111)
at F._settlePromise (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:59201)
at F._settlePromise0 (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:59799)
at F._settlePromises (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:61151)
at p (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:13372)
at f (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:13311)
at l._drainQueues (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:14819)
at drainQueues (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:13053)
From previous event:
at F.B [as _captureStackTrace] (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:28146)
at new f (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:40986)
at vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:44075
at c (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:63:1089612)
at Object.<anonymous> (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:63:1094532)
at Object.<anonymous> (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:63:1094538)
at o (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:318)
at Object.<anonymous> (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:63:1030830)
at o (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:318)
at vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:1992
at vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:2004
```
`trackStores` reference is https://github.com/matrix-org/matrix-react-sdk/blob/fc636c6cb9ef1d154e9437c06938a03e401f666f/src/utils/StorageManager.js#L158 | 1.0 | Cannot read property 'store' of undefined - ```
2019-07-15T12:51:11.324Z E Unable to load session Cannot read property 'store' of null
TypeError: Cannot read property 'store' of null
at Object.t.trackStores (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:63:263291)
at e.<anonymous> (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:9665)
at E (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:38:13738)
at Generator._invoke (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:38:13526)
at Generator.e.(anonymous function) [as next] (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:38:13917)
at Generator.c (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:84959)
at f._promiseFulfilled (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:42111)
at F._settlePromise (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:59201)
at F._settlePromise0 (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:59799)
at F._settlePromises (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:61151)
at p (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:13372)
at f (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:13311)
at l._drainQueues (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:14819)
at drainQueues (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:13053)
From previous event:
at F.B [as _captureStackTrace] (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:28146)
at new f (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:40986)
at vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:44075
at c (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:63:1089612)
at Object.<anonymous> (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:63:1094532)
at Object.<anonymous> (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:63:1094538)
at o (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:318)
at Object.<anonymous> (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:63:1030830)
at o (vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:318)
at vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:1992
at vector://vector/webapp/bundles/e9b8313cd5591059cfdc/bundle.js:1:2004
```
`trackStores` reference is https://github.com/matrix-org/matrix-react-sdk/blob/fc636c6cb9ef1d154e9437c06938a03e401f666f/src/utils/StorageManager.js#L158 | defect | cannot read property store of undefined e unable to load session cannot read property store of null typeerror cannot read property store of null at object t trackstores vector vector webapp bundles bundle js at e vector vector webapp bundles bundle js at e vector vector webapp bundles bundle js at generator invoke vector vector webapp bundles bundle js at generator e anonymous function vector vector webapp bundles bundle js at generator c vector vector webapp bundles bundle js at f promisefulfilled vector vector webapp bundles bundle js at f settlepromise vector vector webapp bundles bundle js at f vector vector webapp bundles bundle js at f settlepromises vector vector webapp bundles bundle js at p vector vector webapp bundles bundle js at f vector vector webapp bundles bundle js at l drainqueues vector vector webapp bundles bundle js at drainqueues vector vector webapp bundles bundle js from previous event at f b vector vector webapp bundles bundle js at new f vector vector webapp bundles bundle js at vector vector webapp bundles bundle js at c vector vector webapp bundles bundle js at object vector vector webapp bundles bundle js at object vector vector webapp bundles bundle js at o vector vector webapp bundles bundle js at object vector vector webapp bundles bundle js at o vector vector webapp bundles bundle js at vector vector webapp bundles bundle js at vector vector webapp bundles bundle js trackstores reference is | 1 |
59,663 | 17,023,196,512 | IssuesEvent | 2021-07-03 00:48:47 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | Patch to fix further XSS holes, and nicer cleaning of tags where some are suitable | Component: website Priority: major Resolution: fixed Type: defect | **[Submitted to the original trac issue database at 4.25pm, Tuesday, 15th January 2008]**
Fixes two further XSS holes (in/outboxes) and uses the Rails 2 sanitize function to filter out nasty js based on a whitelist approach. (also handles javascript: links well) | 1.0 | Patch to fix further XSS holes, and nicer cleaning of tags where some are suitable - **[Submitted to the original trac issue database at 4.25pm, Tuesday, 15th January 2008]**
Fixes two further XSS holes (in/outboxes) and uses the Rails 2 sanitize function to filter out nasty js based on a whitelist approach. (also handles javascript: links well) | defect | patch to fix further xss holes and nicer cleaning of tags where some are suitable fixes two further xss holes in outboxes and uses the rails sanitize function to filter out nasty js based on a whitelist approach also handles javascript links well | 1 |
40,482 | 10,541,949,475 | IssuesEvent | 2019-10-02 12:06:20 | intermine/bluegenes | https://api.github.com/repos/intermine/bluegenes | closed | Testing: replace flymine with (bio?)testmine | Build-CI-Testing | right now, a lot of tests time out in Travis, probably due to the fact that we're testing on a live internet connection with real data. It would be a lot better to use biotestmine as a test server, since it can easily be scripted to run in tests locally, thereby avoiding network latency failures | 1.0 | Testing: replace flymine with (bio?)testmine - right now, a lot of tests time out in Travis, probably due to the fact that we're testing on a live internet connection with real data. It would be a lot better to use biotestmine as a test server, since it can easily be scripted to run in tests locally, thereby avoiding network latency failures | non_defect | testing replace flymine with bio testmine right now a lot of tests time out in travis probably due to the fact that we re testing on a live internet connection with real data it would be a lot better to use biotestmine as a test server since it can easily be scripted to run in tests locally thereby avoiding network latency failures | 0 |
52,932 | 13,225,240,540 | IssuesEvent | 2020-08-17 20:46:38 | icecube-trac/tix4 | https://api.github.com/repos/icecube-trac/tix4 | closed | ports dies if you don't specify --prefix (Trac #597) | Migrated from Trac defect tools/ports | you get:
troy@zinc:~/Icecube/DarwinPorts/t2$ /opt/local/bin/port sync
can't find package darwinports
while executing
"package require darwinports"
(file "/opt/local/bin/port" line 36)
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/597">https://code.icecube.wisc.edu/projects/icecube/ticket/597</a>, reported by troyand owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2011-07-29T20:08:27",
"_ts": "1311970107000000",
"description": "you get:\n\ntroy@zinc:~/Icecube/DarwinPorts/t2$ /opt/local/bin/port sync\ncan't find package darwinports\n while executing\n\"package require darwinports\"\n (file \"/opt/local/bin/port\" line 36)\n",
"reporter": "troy",
"cc": "",
"resolution": "wont or cant fix",
"time": "2010-02-22T01:19:13",
"component": "tools/ports",
"summary": "ports dies if you don't specify --prefix",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
| 1.0 | ports dies if you don't specify --prefix (Trac #597) - you get:
troy@zinc:~/Icecube/DarwinPorts/t2$ /opt/local/bin/port sync
can't find package darwinports
while executing
"package require darwinports"
(file "/opt/local/bin/port" line 36)
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/597">https://code.icecube.wisc.edu/projects/icecube/ticket/597</a>, reported by troyand owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2011-07-29T20:08:27",
"_ts": "1311970107000000",
"description": "you get:\n\ntroy@zinc:~/Icecube/DarwinPorts/t2$ /opt/local/bin/port sync\ncan't find package darwinports\n while executing\n\"package require darwinports\"\n (file \"/opt/local/bin/port\" line 36)\n",
"reporter": "troy",
"cc": "",
"resolution": "wont or cant fix",
"time": "2010-02-22T01:19:13",
"component": "tools/ports",
"summary": "ports dies if you don't specify --prefix",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
| defect | ports dies if you don t specify prefix trac you get troy zinc icecube darwinports opt local bin port sync can t find package darwinports while executing package require darwinports file opt local bin port line migrated from json status closed changetime ts description you get n ntroy zinc icecube darwinports opt local bin port sync ncan t find package darwinports n while executing n package require darwinports n file opt local bin port line n reporter troy cc resolution wont or cant fix time component tools ports summary ports dies if you don t specify prefix priority normal keywords milestone owner nega type defect | 1 |
159,756 | 13,771,425,639 | IssuesEvent | 2020-10-07 21:59:59 | eclipse/deeplearning4j | https://api.github.com/repos/eclipse/deeplearning4j | closed | We should define what dl4j is "for" (People think we are closer to opencv) | Documentation | There still seems to be a lot of confusion on what dl4j does. It would be good if we could try to explain
what you should expect to do with dl4j.
For example, we are lower level than stanford nlp or opencv with "canned" models.
However we are higher level than just a "math library". | 1.0 | We should define what dl4j is "for" (People think we are closer to opencv) - There still seems to be a lot of confusion on what dl4j does. It would be good if we could try to explain
what you should expect to do with dl4j.
For example, we are lower level than stanford nlp or opencv with "canned" models.
However we are higher level than just a "math library". | non_defect | we should define what is for people think we are closer to opencv there still seems to be a lot of confusion on what does it would be good if we could try to explain what you should expect to do with for example we are lower level than stanford nlp or opencv with canned models however we are higher level than just a math library | 0 |
209,698 | 16,054,249,013 | IssuesEvent | 2021-04-23 00:44:15 | kubernetes/kubernetes | https://api.github.com/repos/kubernetes/kubernetes | closed | e2e storage: re-enable health check sidecars | area/test kind/feature needs-triage sig/storage | #### What would you like to be added:
https://github.com/kubernetes/kubernetes/pull/100637 added newer hostpath deployments but removed the csi-external-health-monitor-agent because it seemed to cause overhead without being needed for the tests.
#### Why is this needed:
We should re-enable it and check whether that really causes tests to fail because of the extra load.
| 1.0 | e2e storage: re-enable health check sidecars - #### What would you like to be added:
https://github.com/kubernetes/kubernetes/pull/100637 added newer hostpath deployments but removed the csi-external-health-monitor-agent because it seemed to cause overhead without being needed for the tests.
#### Why is this needed:
We should re-enable it and check whether that really causes tests to fail because of the extra load.
| non_defect | storage re enable health check sidecars what would you like to be added added newer hostpath deployments but removed the csi external health monitor agent because it seemed to cause overhead without being needed for the tests why is this needed we should re enable it and check whether that really causes tests to fail because of the extra load | 0 |
627,434 | 19,904,875,086 | IssuesEvent | 2022-01-25 11:43:49 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | k5.macromill.com - site is not usable | priority-normal browser-fenix engine-gecko | <!-- @browser: Firefox Mobile 98.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 11; Mobile; rv:98.0) Gecko/98.0 Firefox/98.0 -->
<!-- @reported_with: android-components-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/98696 -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://k5.macromill.com/e/1193223_24/index.php
**Browser / Version**: Firefox Mobile 98.0
**Operating System**: Android 11
**Tested Another Browser**: Yes Chrome
**Problem type**: Site is not usable
**Description**: Buttons or links not working
**Steps to Reproduce**:
No matter how many times I tap, there is no response.
Android One S8, Android 11.
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/1/5c7c9a10-9733-42c2-81bf-b872849cdcf5.jpeg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20220124093541</li><li>channel: nightly</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2022/1/54d337b2-7b57-41a2-bcdf-fa7ae7390b7c)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | k5.macromill.com - site is not usable - <!-- @browser: Firefox Mobile 98.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 11; Mobile; rv:98.0) Gecko/98.0 Firefox/98.0 -->
<!-- @reported_with: android-components-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/98696 -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://k5.macromill.com/e/1193223_24/index.php
**Browser / Version**: Firefox Mobile 98.0
**Operating System**: Android 11
**Tested Another Browser**: Yes Chrome
**Problem type**: Site is not usable
**Description**: Buttons or links not working
**Steps to Reproduce**:
No matter how many times I tap, there is no response.
Android One S8, Android 11.
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/1/5c7c9a10-9733-42c2-81bf-b872849cdcf5.jpeg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20220124093541</li><li>channel: nightly</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2022/1/54d337b2-7b57-41a2-bcdf-fa7ae7390b7c)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | non_defect | macromill com site is not usable url browser version firefox mobile operating system android tested another browser yes chrome problem type site is not usable description buttons or links not working steps to reproduce どれだけタップしても反応しない。 android one view the screenshot img alt screenshot src browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel nightly hastouchscreen true mixed active content blocked false mixed passive content blocked false tracking content blocked false from with ❤️ | 0 |
18,120 | 3,024,309,810 | IssuesEvent | 2015-08-02 13:46:35 | aayush93/xbt | https://api.github.com/repos/aayush93/xbt | closed | GitHub | auto-migrated Priority-Medium Type-Defect | ```
Let's go to GitHub.
I created https://github.com/poiuty/xbt
I give access rights to this repository.
```
Original issue reported on code.google.com by `df3434er...@gmail.com` on 14 Jul 2015 at 7:03 | 1.0 | GitHub - ```
Let's go to GitHub.
I created https://github.com/poiuty/xbt
I give access rights to this repository.
```
Original issue reported on code.google.com by `df3434er...@gmail.com` on 14 Jul 2015 at 7:03 | defect | github let s go to github i create i give access rights to this repository original issue reported on code google com by gmail com on jul at | 1 |
62,427 | 17,023,921,273 | IssuesEvent | 2021-07-03 04:34:04 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | Interpolation doesn't work with nodes without address | Component: nominatim Priority: major Resolution: fixed Type: defect | **[Submitted to the original trac issue database at 8.41pm, Friday, 17th April 2015]**
"gorriti 3639" works, because there's a node there. However "gorriti 3638" doesn't.
I can't find any interpolated result that would work around here.
Not being able to handle interpolations renders nominatim useless. | 1.0 | Interpolation doesn't work with nodes without address - **[Submitted to the original trac issue database at 8.41pm, Friday, 17th April 2015]**
"gorriti 3639" works, because there's a node there. However "gorriti 3638" doesn't.
I can't find any interpolated result that would work around here.
Not being able to handle interpolations renders nominatim useless. | defect | interpolation doesn t work with nodes without address gorriti works because there s a node there however gorriti doesn t i can t find any interpolated result that would work around here not being able to handle interpolations renders nominatim useless | 1 |
987 | 2,594,406,971 | IssuesEvent | 2015-02-20 02:57:42 | BALL-Project/ball | https://api.github.com/repos/BALL-Project/ball | closed | BALLView: tooltip - text empty | C: VIEW P: minor R: invalid T: defect | **Reported by akdehof on 4 Dec 41589808 15:35 UTC**
As can be seen in the | 1.0 | BALLView: tooltip - text empty - **Reported by akdehof on 4 Dec 41589808 15:35 UTC**
As can be seen in the | defect | ballview tooltip text empty reported by akdehof on dec utc as can be seen in the | 1 |
17,227 | 2,984,629,043 | IssuesEvent | 2015-07-18 05:59:02 | Tarsnap/scrypt | https://api.github.com/repos/Tarsnap/scrypt | closed | *(uint64_t *)usermembuf triggers alignment error in clang. | auto-migrated Priority-Medium Type-Defect | ```
lib/util/memlimit.c:78:14: error: cast from 'uint8_t *' (aka 'unsigned char *')
to 'uint64_t *' (aka 'unsigned long long *') increases required alignment from
1 to 8
[-Werror,-Wcast-align]
```
Original issue reported on code.google.com by `lhunath@lyndir.com` on 2 May 2014 at 3:52 | 1.0 | *(uint64_t *)usermembuf triggers alignment error in clang. - ```
lib/util/memlimit.c:78:14: error: cast from 'uint8_t *' (aka 'unsigned char *')
to 'uint64_t *' (aka 'unsigned long long *') increases required alignment from
1 to 8
[-Werror,-Wcast-align]
```
Original issue reported on code.google.com by `lhunath@lyndir.com` on 2 May 2014 at 3:52 | defect | t usermembuf triggers alignment error in clang lib util memlimit c error cast from t aka unsigned char to t aka unsigned long long increases required alignment from to original issue reported on code google com by lhunath lyndir com on may at | 1 |
62,532 | 17,023,941,187 | IssuesEvent | 2021-07-03 04:41:04 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | P2's GPS traces (from server) display is affected by new API treatment of unordered traces | Component: potlatch2 Priority: major Resolution: fixed Type: defect | **[Submitted to the original trac issue database at 6.04pm, Thursday, 8th November 2018]**
If I remember correctly (cannot test, since I have no Flash player anymore) P2 displays all GPS traces as lines. Hence, it is likely affected, as JOSM is, by the API trackpoints api call code change which closed https://github.com/openstreetmap/openstreetmap-website/issues/2046 . | 1.0 | P2's GPS traces (from server) display is affected by new API treatment of unordered traces - **[Submitted to the original trac issue database at 6.04pm, Thursday, 8th November 2018]**
If I remember correctly (cannot test, since I have no Flash player anymore) P2 displays all GPS traces as lines. Hence, it is likely affected, as JOSM is, by the API trackpoints api call code change which closed https://github.com/openstreetmap/openstreetmap-website/issues/2046 . | defect | s gps traces from server display is affected by new api treatment of unordered traces if i remember correctly cannot test since i have no flash player anymore displays all gps traces as lines hence it is likely affected as josm is by the api trackpoints api call code change which closed | 1 |
225,686 | 24,881,128,938 | IssuesEvent | 2022-10-28 01:15:55 | michaeldotson/scaffolding-app | https://api.github.com/repos/michaeldotson/scaffolding-app | closed | WS-2022-0334 (Medium) detected in nokogiri-1.10.3.gem - autoclosed | security vulnerability | ## WS-2022-0334 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>nokogiri-1.10.3.gem</b></p></summary>
<p>Nokogiri (鋸) is an HTML, XML, SAX, and Reader parser. Among
Nokogiri's many features is the ability to search documents via XPath
or CSS3 selectors.</p>
<p>Library home page: <a href="https://rubygems.org/gems/nokogiri-1.10.3.gem">https://rubygems.org/gems/nokogiri-1.10.3.gem</a></p>
<p>Path to dependency file: /scaffolding-app/Gemfile.lock</p>
<p>Path to vulnerable library: /var/lib/gems/2.3.0/cache/nokogiri-1.10.3.gem</p>
<p>
Dependency Hierarchy:
- web-console-3.7.0.gem (Root Library)
- actionview-5.2.2.gem
- rails-dom-testing-2.0.3.gem
- :x: **nokogiri-1.10.3.gem** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
nokogiri up to and including 1.13.8 is affected by several vulnerabilities (CVE-2022-40303, CVE-2022-40304 and CVE-2022-2309) in the dependency bundled libxml2 library. Version 1.13.9 of nokogiri contains a patch where the dependency is upgraded with the patches as well.
<p>Publish Date: 2022-10-18
<p>URL: <a href=https://github.com/sparklemotion/nokogiri/commit/e8cfe13953c63099f879d8a25ca70a909e19fb96>WS-2022-0334</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-2qc6-mcvw-92cw">https://github.com/advisories/GHSA-2qc6-mcvw-92cw</a></p>
<p>Release Date: 2022-10-18</p>
<p>Fix Resolution: nokogiri - 1.13.9</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | WS-2022-0334 (Medium) detected in nokogiri-1.10.3.gem - autoclosed - ## WS-2022-0334 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>nokogiri-1.10.3.gem</b></p></summary>
<p>Nokogiri (鋸) is an HTML, XML, SAX, and Reader parser. Among
Nokogiri's many features is the ability to search documents via XPath
or CSS3 selectors.</p>
<p>Library home page: <a href="https://rubygems.org/gems/nokogiri-1.10.3.gem">https://rubygems.org/gems/nokogiri-1.10.3.gem</a></p>
<p>Path to dependency file: /scaffolding-app/Gemfile.lock</p>
<p>Path to vulnerable library: /var/lib/gems/2.3.0/cache/nokogiri-1.10.3.gem</p>
<p>
Dependency Hierarchy:
- web-console-3.7.0.gem (Root Library)
- actionview-5.2.2.gem
- rails-dom-testing-2.0.3.gem
- :x: **nokogiri-1.10.3.gem** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
nokogiri up to and including 1.13.8 is affected by several vulnerabilities (CVE-2022-40303, CVE-2022-40304 and CVE-2022-2309) in the dependency bundled libxml2 library. Version 1.13.9 of nokogiri contains a patch where the dependency is upgraded with the patches as well.
<p>Publish Date: 2022-10-18
<p>URL: <a href=https://github.com/sparklemotion/nokogiri/commit/e8cfe13953c63099f879d8a25ca70a909e19fb96>WS-2022-0334</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-2qc6-mcvw-92cw">https://github.com/advisories/GHSA-2qc6-mcvw-92cw</a></p>
<p>Release Date: 2022-10-18</p>
<p>Fix Resolution: nokogiri - 1.13.9</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_defect | ws medium detected in nokogiri gem autoclosed ws medium severity vulnerability vulnerable library nokogiri gem nokogiri 鋸 is an html xml sax and reader parser among nokogiri s many features is the ability to search documents via xpath or selectors library home page a href path to dependency file scaffolding app gemfile lock path to vulnerable library var lib gems cache nokogiri gem dependency hierarchy web console gem root library actionview gem rails dom testing gem x nokogiri gem vulnerable library vulnerability details nokogiri up to and including is affected by several vulnerabilities cve cve and cve in the dependency bundled library version of nokogiri contains a patch where the dependency is upgraded with the patches as well publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution nokogiri step up your open source security game with mend | 0 |
57,766 | 24,222,239,278 | IssuesEvent | 2022-09-26 11:51:52 | dotnet/fsharp | https://api.github.com/repos/dotnet/fsharp | closed | Go to definition on external dependency/metadata is broken in 17.4 | Regression Area-LangService-Navigation | > 
>
> BCL/external dependency metadata navigation. I'm pressing F12 on this to no avail (it used to work). Perhaps it's a different use case.
Yep, doesn't work for me for `Newtonsoft.Json`.
_Originally posted by @vzarytovskii in https://github.com/dotnet/fsharp/issues/13882#issuecomment-1247893240_ | 1.0 | Go to definition on external dependency/metadata is broken in 17.4 - > 
>
> BCL/external dependency metadata navigation. I'm pressing F12 on this to no avail (it used to work). Perhaps it's a different use case.
Yep, doesn't work for me for `Newtonsoft.Json`.
_Originally posted by @vzarytovskii in https://github.com/dotnet/fsharp/issues/13882#issuecomment-1247893240_ | non_defect | go to definition on external dependency metadata is broken in bcl external dependency metadata navigation i m pressing on this to no avail it used to work perhaps it s a different use case yep doesn t work for me for newtonsoft json originally posted by vzarytovskii in | 0 |
70,405 | 23,155,565,522 | IssuesEvent | 2022-07-29 12:42:12 | vector-im/element-web | https://api.github.com/repos/vector-im/element-web | closed | Messages in thread are out of order | T-Defect S-Major A-Timeline O-Uncommon A-Threads | ### Steps to reproduce
1. Start a thread
2. Put your computer to sleep
3. Receive some replies to that thread overnight
4. Take your computer out of sleep and watch Element sync
### Outcome
#### What did you expect?
The messages in the thread should be in chronological order, as they are on Element Android.
#### What happened instead?
The 3 messages at the top of this thread are actually supposed to follow the message at the bottom:

### Operating system
NixOS unstable
### Browser information
Firefox 101.0.1
### URL for webapp
develop.element.io
### Application version
Element version: 7a131fc50f09-react-18c21d77cd6c-js-58227307974a Olm version: 3.2.8
### Homeserver
Synapse 1.61.0
### Will you send logs?
Yes | 1.0 | Messages in thread are out of order - ### Steps to reproduce
1. Start a thread
2. Put your computer to sleep
3. Receive some replies to that thread overnight
4. Take your computer out of sleep and watch Element sync
### Outcome
#### What did you expect?
The messages in the thread should be in chronological order, as they are on Element Android.
#### What happened instead?
The 3 messages at the top of this thread are actually supposed to follow the message at the bottom:

### Operating system
NixOS unstable
### Browser information
Firefox 101.0.1
### URL for webapp
develop.element.io
### Application version
Element version: 7a131fc50f09-react-18c21d77cd6c-js-58227307974a Olm version: 3.2.8
### Homeserver
Synapse 1.61.0
### Will you send logs?
Yes | defect | messages in thread are out of order steps to reproduce start a thread put your computer to sleep receive some replies to that thread overnight take your computer out of sleep and watch element sync outcome what did you expect the messages in the thread should be in chronological order as they are on element android what happened instead the messages at the top of this thread are actually supposed to follow the message at the bottom operating system nixos unstable browser information firefox url for webapp develop element io application version element version react js olm version homeserver synapse will you send logs yes | 1 |
23,668 | 3,851,865,301 | IssuesEvent | 2016-04-06 05:27:47 | GPF/imame4all | https://api.github.com/repos/GPF/imame4all | closed | Controller Overlay in Portrait Full-Screen broken | auto-migrated Priority-Medium Type-Defect | ```
What steps will reproduce the problem?
1. Set to portrait full-screen
2. Start a game using portrait mode
3. the controller picture (red ball) will be scaled up to full screen
What is the expected output? What do you see instead?
Normal sized controller/joystick picture! Heavily upscaled picture of
controller.
What version of the product are you using? On what operating system?
Gridlee with iMame 1.3. Latest iOS release.
Please provide any additional information below.
```
Original issue reported on code.google.com by `presiden...@gmail.com` on 10 Feb 2013 at 2:22 | 1.0 | Controller Overlay in Portrait Full-Screen broken - ```
What steps will reproduce the problem?
1. Set to portrait full-screen
2. Start a game using portrait mode
3. the controller picture (red ball) will be scaled up to full screen
What is the expected output? What do you see instead?
Normal sized controller/joystick picture! Heavily upscaled picture of
controller.
What version of the product are you using? On what operating system?
Gridlee with iMame 1.3. Latest iOS release.
Please provide any additional information below.
```
Original issue reported on code.google.com by `presiden...@gmail.com` on 10 Feb 2013 at 2:22 | defect | controller overlay in portrait full screen broken what steps will reproduce the problem set to portrait full screen start a game using portrait mode the controller picture red ball will be scaled up to full screen what is the expected output what do you see instead normal sized controller joystick picture heavily upscaled picture of controller what version of the product are you using on what operating system gridlee wit imame latest ios release please provide any additional information below original issue reported on code google com by presiden gmail com on feb at | 1 |
66,044 | 19,907,822,449 | IssuesEvent | 2022-01-25 14:30:15 | cakephp/cakephp | https://api.github.com/repos/cakephp/cakephp | opened | [cakephp/http] Auth adapters can't be used | defect | ### Description
Using `cakephp/http` standalone, with any auth adapter, currently doesn't work without setting `App.namespace` during bootstrap.
```
PHP Fatal error: Uncaught TypeError: rtrim() expects parameter 1 to be string, null given in vendor/cakephp/core/App.php:63
```
We could either add a fallback in https://github.com/cakephp/cakephp/blob/4.x/src/Core/App.php#L62 to `App` or check for a null `$base` value before applying `rtrim`.
### CakePHP Version
4.3.4
### PHP Version
7.4 | 1.0 | [cakephp/http] Auth adapters can't be used - ### Description
Using `cakephp/http` standalone, with any auth adapter, currently doesn't work without setting `App.namespace` during bootstrap.
```
PHP Fatal error: Uncaught TypeError: rtrim() expects parameter 1 to be string, null given in vendor/cakephp/core/App.php:63
```
We could either add a fallback in https://github.com/cakephp/cakephp/blob/4.x/src/Core/App.php#L62 to `App` or check for a null `$base` value before applying `rtrim`.
### CakePHP Version
4.3.4
### PHP Version
7.4 | defect | auth adapters can t be used description using cakephp http standalone with any auth adapter currently doesn t work without setting app namespace during bootstrap php fatal error uncaught typeerror rtrim expects parameter to be string null given in vendor cakephp core app php we could either add a fallback in to app or check for a null base value before applying rtrim cakephp version php version | 1 |
17,597 | 10,731,731,017 | IssuesEvent | 2019-10-28 20:13:05 | dockstore/dockstore | https://api.github.com/repos/dockstore/dockstore | closed | Adding gitlab as a docker registry | enhancement web-service | as per title, we are using gitlab for one of our projects, it's nice that it has docker integration so we don't need to host dockers and code in separate places. It would be great if dockstore can support this.
┆Issue is synchronized with this [Jira Story](https://ucsc-cgl.atlassian.net/browse/DOCK-443)
┆Issue Type: Story
┆Issue Number: DOCK-443
| 1.0 | Adding gitlab as a docker registry - as per title, we are using gitlab for one of our projects, it's nice that it has docker integration so we don't need to host dockers and code in separate places. It would be great if dockstore can support this.
┆Issue is synchronized with this [Jira Story](https://ucsc-cgl.atlassian.net/browse/DOCK-443)
┆Issue Type: Story
┆Issue Number: DOCK-443
| non_defect | adding gitlab as a docker registry as per title we are using gitlab for one of our projects it s nice that it has docker integration so we don t need to host dockers and code in separate places it would be great if dockstore can support this ┆issue is synchronized with this ┆issue type story ┆issue number dock | 0 |
11,260 | 2,645,154,633 | IssuesEvent | 2015-03-12 20:58:29 | felipeleao/trabalho-pm-2 | https://api.github.com/repos/felipeleao/trabalho-pm-2 | closed | Teste de Fermat com problemas | auto-migrated Priority-Medium Type-Defect | ```
When entering some numbers and choosing the Fermat test option,
errors appear saying that certain prime numbers are not prime.
```
Original issue reported on code.google.com by `brmagadu...@gmail.com` on 16 Oct 2010 at 2:15 | 1.0 | Teste de Fermat com problemas - ```
When entering some numbers and choosing the Fermat test option,
errors appear saying that certain prime numbers are not prime.
```
Original issue reported on code.google.com by `brmagadu...@gmail.com` on 16 Oct 2010 at 2:15 | defect | teste de fermat com problemas ao tentar entrar com alguns números quando se escolhe a opção do teste de fermat aparecem erros dizendo que certos números primos não são primos original issue reported on code google com by brmagadu gmail com on oct at | 1 |
12,093 | 14,740,081,591 | IssuesEvent | 2021-01-07 08:29:04 | kdjstudios/SABillingGitlab | https://api.github.com/repos/kdjstudios/SABillingGitlab | closed | Atlanta - SA Billing - Late Fee Account List | anc-process anp-important ant-bug has attachment | In GitLab by @kdjstudios on Oct 3, 2018, 11:09
[Atlanta.xlsx](/uploads/efbbfb141893f5b2abbb8b52ea0c24e0/Atlanta.xlsx)
HD: http://www.servicedesk.answernet.com/profiles/ticket/2018-10-03-52748 | 1.0 | Atlanta - SA Billing - Late Fee Account List - In GitLab by @kdjstudios on Oct 3, 2018, 11:09
[Atlanta.xlsx](/uploads/efbbfb141893f5b2abbb8b52ea0c24e0/Atlanta.xlsx)
HD: http://www.servicedesk.answernet.com/profiles/ticket/2018-10-03-52748 | non_defect | atlanta sa billing late fee account list in gitlab by kdjstudios on oct uploads atlanta xlsx hd | 0 |
318,188 | 27,293,234,061 | IssuesEvent | 2023-02-23 18:08:08 | tarantool/tarantool-qa | https://api.github.com/repos/tarantool/tarantool-qa | opened | test: flaky replication-luatest/gh_8121_memtx_mvcc_replication_stream_test.lua | flaky test | Test https://github.com/tarantool/tarantool/blob/master/test/replication-luatest/gh_8121_memtx_mvcc_replication_stream_test.lua is flaky
* branch: master, 2.10
* OS and version: Linux, FreeBSD
* Architecture: any
* gc64: any
<img width="727" alt="image" src="https://user-images.githubusercontent.com/426722/220992652-20196b10-17d7-4aa3-b5d9-4b1de27ee97d.png"> | 1.0 | test: flaky replication-luatest/gh_8121_memtx_mvcc_replication_stream_test.lua - Test https://github.com/tarantool/tarantool/blob/master/test/replication-luatest/gh_8121_memtx_mvcc_replication_stream_test.lua is flaky
* branch: master, 2.10
* OS and version: Linux, FreeBSD
* Architecture: any
* gc64: any
<img width="727" alt="image" src="https://user-images.githubusercontent.com/426722/220992652-20196b10-17d7-4aa3-b5d9-4b1de27ee97d.png"> | non_defect | test flaky replication luatest gh memtx mvcc replication stream test lua test is flaky branch master os and version linux freebsd architecture any any img width alt image src | 0 |
237,000 | 19,589,793,673 | IssuesEvent | 2022-01-05 11:34:15 | elastic/kibana | https://api.github.com/repos/elastic/kibana | closed | Failing test: Chrome UI Functional Tests.test/functional/apps/discover/_discover_histogram·ts - discover app discover histogram should visualize monthly data with different day intervals | blocker Feature:Discover failed-test v8.0.0 skipped-test v7.11.0 Team:DataDiscovery | A test failed on a tracked branch
```
Error: retry.try timeout: TimeoutError: Waiting for element to be located By(css selector, [data-test-subj="superDatePickerAbsoluteDateInput"])
Wait timed out after 10019ms
at /dev/shm/workspace/kibana/node_modules/selenium-webdriver/lib/webdriver.js:842:17
at process._tickCallback (internal/process/next_tick.js:68:7)
at onFailure (test/common/services/retry/retry_for_success.ts:28:9)
at retryForSuccess (test/common/services/retry/retry_for_success.ts:68:13)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+7.x/9050/)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome UI Functional Tests.test/functional/apps/discover/_discover_histogram·ts","test.name":"discover app discover histogram should visualize monthly data with different day intervals","test.failCount":6}} --> | 2.0 | Failing test: Chrome UI Functional Tests.test/functional/apps/discover/_discover_histogram·ts - discover app discover histogram should visualize monthly data with different day intervals - A test failed on a tracked branch
```
Error: retry.try timeout: TimeoutError: Waiting for element to be located By(css selector, [data-test-subj="superDatePickerAbsoluteDateInput"])
Wait timed out after 10019ms
at /dev/shm/workspace/kibana/node_modules/selenium-webdriver/lib/webdriver.js:842:17
at process._tickCallback (internal/process/next_tick.js:68:7)
at onFailure (test/common/services/retry/retry_for_success.ts:28:9)
at retryForSuccess (test/common/services/retry/retry_for_success.ts:68:13)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+7.x/9050/)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome UI Functional Tests.test/functional/apps/discover/_discover_histogram·ts","test.name":"discover app discover histogram should visualize monthly data with different day intervals","test.failCount":6}} --> | non_defect | failing test chrome ui functional tests test functional apps discover discover histogram·ts discover app discover histogram should visualize monthly data with different day intervals a test failed on a tracked branch error retry try timeout timeouterror waiting for element to be located by css selector wait timed out after at dev shm workspace kibana node modules selenium webdriver lib webdriver js at process tickcallback internal process next tick js at onfailure test common services retry retry for success ts at retryforsuccess test common services retry retry for success ts first failure | 0 |
432,332 | 30,278,812,215 | IssuesEvent | 2023-07-07 22:59:32 | r-lib/lintr | https://api.github.com/repos/r-lib/lintr | closed | Document "symbols" naming style in object_name_linter | documentation | The [documentation](https://github.com/r-lib/lintr/blob/main/R/object_name_linter.R) for `object_name_linter` describes:
> ... The default naming styles are "snake_case" and "symbols".
After that, there is no explanation of what "symbols" means, and there is no example that uses `styles = "symbols"`. Could this documentation please be added?
Thanks | 1.0 | Document "symbols" naming style in object_name_linter - The [documentation](https://github.com/r-lib/lintr/blob/main/R/object_name_linter.R) for `object_name_linter` describes:
> ... The default naming styles are "snake_case" and "symbols".
After that, there is no explanation of what "symbols" means, and there is no example that uses `styles = "symbols"`. Could this documentation please be added?
Thanks | non_defect | document symbols naming style in object name linter the for object name linter describes the default naming styles are snake case and symbols after that there is no explanation of what symbols means and there is no example that uses styles symbols could this documentation please be added thanks | 0 |
300,363 | 9,209,967,819 | IssuesEvent | 2019-03-09 00:45:39 | codephil-columbia/typephil | https://api.github.com/repos/codephil-columbia/typephil | closed | Redesign "My account"(Chelsy) | Medium Priority Visual-Related | Need to redesign "my account" page since we currently do not ask for email or have classroom features. current design is toooooo bare.


| 1.0 | Redesign "My account"(Chelsy) - Need to redesign "my account" page since we currently do not ask for email or have classroom features. current design is toooooo bare.


| non_defect | redesign my account chelsy need to redesign my account page since we currently do not ask for email or have classroom features current design is toooooo bare | 0 |
263,484 | 19,912,685,911 | IssuesEvent | 2022-01-25 18:50:55 | NorESMhub/NorESM_LandSites_Platform | https://api.github.com/repos/NorESMhub/NorESM_LandSites_Platform | closed | Update wiki | documentation question | The links in our wiki are outdated. Should we remove them? Do we need the wiki at all now that we have the documentation page?
This is what is there: https://github.com/NorESMhub/NorESM_LandSites_Platform/wiki/Useful-materials-for-developing-NLP | 1.0 | Update wiki - The links in our wiki are outdated. Should we remove them? Do we need the wiki at all now that we have the documentation page?
This is what is there: https://github.com/NorESMhub/NorESM_LandSites_Platform/wiki/Useful-materials-for-developing-NLP | non_defect | update wiki the links in our wiki are outdated should we remove them do we need the wiki at all now that we have the documentation page this is what is there | 0 |
35,349 | 7,708,353,409 | IssuesEvent | 2018-05-22 04:29:27 | bridgedotnet/Bridge | https://api.github.com/repos/bridgedotnet/Bridge | closed | New line emmited when calling multiline script | defect in-progress | New line emmited when calling multiline script.
Im not sure if this usecase is expected to be used in that way :)
### Steps To Reproduce
https://deck.net/5fbd63b8bd314550e325edd544f78599
```csharp
public class Program
{
public static void Main()
{
string a = "foo";
var b = Script.Call<string>(@"
(
function(p){ return p + 'bar';}
)", a);
Console.WriteLine(b); // expected foobar
}
}
```
### Expected Result
```js
Bridge.assembly("Demo", function ($asm, globals) {
"use strict";
Bridge.define("Demo.Program", {
main: function Main () {
var a = "foo";
var b =
(
function(p){ return p + 'bar';}
)(a);
System.Console.WriteLine(b);
}
});
});
```
### Actual Result
```js
Bridge.assembly("Demo", function ($asm, globals) {
"use strict";
Bridge.define("Demo.Program", {
main: function Main () {
var a = "foo";
var b = \r\n (\r\n function(p){ return p + 'bar';}\r\n )(a);
System.Console.WriteLine(b);
}
});
});
```
| 1.0 | New line emmited when calling multiline script - New line emmited when calling multiline script.
Im not sure if this usecase is expected to be used in that way :)
### Steps To Reproduce
https://deck.net/5fbd63b8bd314550e325edd544f78599
```csharp
public class Program
{
public static void Main()
{
string a = "foo";
var b = Script.Call<string>(@"
(
function(p){ return p + 'bar';}
)", a);
Console.WriteLine(b); // expected foobar
}
}
```
### Expected Result
```js
Bridge.assembly("Demo", function ($asm, globals) {
"use strict";
Bridge.define("Demo.Program", {
main: function Main () {
var a = "foo";
var b =
(
function(p){ return p + 'bar';}
)(a);
System.Console.WriteLine(b);
}
});
});
```
### Actual Result
```js
Bridge.assembly("Demo", function ($asm, globals) {
"use strict";
Bridge.define("Demo.Program", {
main: function Main () {
var a = "foo";
var b = \r\n (\r\n function(p){ return p + 'bar';}\r\n )(a);
System.Console.WriteLine(b);
}
});
});
```
| defect | new line emmited when calling multiline script new line emmited when calling multiline script im not sure if this usecase is expected to be used in that way steps to reproduce csharp public class program public static void main string a foo var b script call function p return p bar a console writeline b expected foobar expected result js bridge assembly demo function asm globals use strict bridge define demo program main function main var a foo var b function p return p bar a system console writeline b actual result js bridge assembly demo function asm globals use strict bridge define demo program main function main var a foo var b r n r n function p return p bar r n a system console writeline b | 1 |
12,565 | 2,708,262,836 | IssuesEvent | 2015-04-08 07:36:38 | Simsys/qhexedit2 | https://api.github.com/repos/Simsys/qhexedit2 | closed | Error in example (function saveFile()) | auto-migrated Priority-Medium Type-Defect | ```
In Example saveFile() function:
file.open(QFile::WriteOnly | QFile::Text)
causes errors while saving (/n translated to /r/n);
correct way:
file.open(QFile::WriteOnly | QFile::Truncate)
```
Original issue reported on code.google.com by `gibol...@gmail.com` on 6 Jun 2013 at 4:28 | 1.0 | Error in example (function saveFile()) - ```
In Example saveFile() function:
file.open(QFile::WriteOnly | QFile::Text)
causes errors while saving (/n translated to /r/n);
correct way:
file.open(QFile::WriteOnly | QFile::Truncate)
```
Original issue reported on code.google.com by `gibol...@gmail.com` on 6 Jun 2013 at 4:28 | defect | error in example function savefile in example savefile function file open qfile writeonly qfile text causes errors while saving n translated to r n correct way file open qfile writeonly qfile truncate original issue reported on code google com by gibol gmail com on jun at | 1 |
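An aside on the qhexedit2 record above: the reported bug hinges on Qt's `QIODevice::Text` flag translating `\n` to `\r\n` on Windows, which corrupts a byte-exact save, and the reporter's fix is to open the file in binary mode with `Truncate`. The same pitfall can be reproduced in a few lines of Python. This is an analogue of the behaviour described, not the qhexedit2 code; `newline="\r\n"` is passed explicitly so the translation happens on every platform, not just Windows.

```python
import os
import tempfile

data = "line1\nline2\n"  # buffer that should reach disk byte-for-byte

fd, path = tempfile.mkstemp()
os.close(fd)

# Text mode with newline translation: '\n' is rewritten on the way out,
# the same effect QFile::Text has on Windows.
with open(path, "w", newline="\r\n") as f:
    f.write(data)
with open(path, "rb") as f:
    text_mode_bytes = f.read()

# Binary mode writes the buffer unchanged: the WriteOnly | Truncate fix.
with open(path, "wb") as f:
    f.write(data.encode("ascii"))
with open(path, "rb") as f:
    binary_mode_bytes = f.read()

os.remove(path)
print(text_mode_bytes)    # b'line1\r\nline2\r\n'  (silently altered)
print(binary_mode_bytes)  # b'line1\nline2\n'      (exact copy)
```

The design point is the same in both ecosystems: any editor that promises byte-exact saves must write in binary mode and avoid newline translation layers entirely.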
133,931 | 18,365,153,529 | IssuesEvent | 2021-10-09 23:21:15 | turkdevops/istanbul | https://api.github.com/repos/turkdevops/istanbul | opened | CVE-2020-7774 (High) detected in y18n-3.2.1.tgz | security vulnerability | ## CVE-2020-7774 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>y18n-3.2.1.tgz</b></p></summary>
<p>the bare-bones internationalization library used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/y18n/-/y18n-3.2.1.tgz">https://registry.npmjs.org/y18n/-/y18n-3.2.1.tgz</a></p>
<p>Path to dependency file: istanbul/package.json</p>
<p>Path to vulnerable library: istanbul/node_modules/nyc/node_modules/y18n/package.json</p>
<p>
Dependency Hierarchy:
- nodeunit-0.9.5.tgz (Root Library)
- tap-7.1.2.tgz
- nyc-7.1.0.tgz
- yargs-4.8.1.tgz
- :x: **y18n-3.2.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/turkdevops/istanbul/commit/1bc76b2a4f1c68652ba76458f4f973a52dbbfe9e">1bc76b2a4f1c68652ba76458f4f973a52dbbfe9e</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package y18n before 3.2.2, 4.0.1 and 5.0.5. PoC by po6ix: const y18n = require('y18n')(); y18n.setLocale('__proto__'); y18n.updateLocale({polluted: true}); console.log(polluted); // true
<p>Publish Date: 2020-11-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7774>CVE-2020-7774</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1654">https://www.npmjs.com/advisories/1654</a></p>
<p>Release Date: 2020-11-17</p>
<p>Fix Resolution: 3.2.2, 4.0.1, 5.0.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-7774 (High) detected in y18n-3.2.1.tgz - ## CVE-2020-7774 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>y18n-3.2.1.tgz</b></p></summary>
<p>the bare-bones internationalization library used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/y18n/-/y18n-3.2.1.tgz">https://registry.npmjs.org/y18n/-/y18n-3.2.1.tgz</a></p>
<p>Path to dependency file: istanbul/package.json</p>
<p>Path to vulnerable library: istanbul/node_modules/nyc/node_modules/y18n/package.json</p>
<p>
Dependency Hierarchy:
- nodeunit-0.9.5.tgz (Root Library)
- tap-7.1.2.tgz
- nyc-7.1.0.tgz
- yargs-4.8.1.tgz
- :x: **y18n-3.2.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/turkdevops/istanbul/commit/1bc76b2a4f1c68652ba76458f4f973a52dbbfe9e">1bc76b2a4f1c68652ba76458f4f973a52dbbfe9e</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package y18n before 3.2.2, 4.0.1 and 5.0.5. PoC by po6ix: const y18n = require('y18n')(); y18n.setLocale('__proto__'); y18n.updateLocale({polluted: true}); console.log(polluted); // true
<p>Publish Date: 2020-11-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7774>CVE-2020-7774</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1654">https://www.npmjs.com/advisories/1654</a></p>
<p>Release Date: 2020-11-17</p>
<p>Fix Resolution: 3.2.2, 4.0.1, 5.0.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_defect | cve high detected in tgz cve high severity vulnerability vulnerable library tgz the bare bones internationalization library used by yargs library home page a href path to dependency file istanbul package json path to vulnerable library istanbul node modules nyc node modules package json dependency hierarchy nodeunit tgz root library tap tgz nyc tgz yargs tgz x tgz vulnerable library found in head commit a href found in base branch master vulnerability details this affects the package before and poc by const require setlocale proto updatelocale polluted true console log polluted true publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
62,201 | 17,023,871,595 | IssuesEvent | 2021-07-03 04:17:28 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | It is now difficult to share a location | Component: website Priority: major Resolution: fixed Type: defect | **[Submitted to the original trac issue database at 2.45am, Saturday, 3rd August 2013]**
Previously, it was easy to "share a location".
That is, to send someone precise coordinates of a point.
Double click on that location, click permalink, copy URL and that's it.
Now it's become extremely difficult because double clicking moves the location only half way towards the center and the location has to be repeatedly clicked and mouse pointer readjusted until the map no longer moves.
Why that ????
Also, may I suggest a "Marked Link" in addition to "Long Link" and "Short Link"?
[[BR]]mlat/mlong are very useful for checking the accuracy.
[[BR]]Also, it's easier to position a pointing pointer than the five-fingered hand.
| 1.0 | It is now difficult to share a location - **[Submitted to the original trac issue database at 2.45am, Saturday, 3rd August 2013]**
Previously, it was easy to "share a location".
That is, to send someone precise coordinates of a point.
Double click on that location, click permalink, copy URL and that's it.
Now it's become extremely difficult because double clicking moves the location only half way towards the center and the location has to be repeatedly clicked and mouse pointer readjusted until the map no longer moves.
Why that ????
Also, may I suggest a "Marked Link" in addition to "Long Link" and "Short Link"?
[[BR]]mlat/mlong are very useful for checking the accuracy.
[[BR]]Also, it's easier to position a pointing pointer than the five-fingered hand.
| defect | it is now difficult to share a location previously it was easy to share a location that is to send someone precise coordinates of a point double click on that location click permalink copy url and that s it now it s become extremely difficult because double clicking moves the location only half way towards the center and the location has to be repeatedly clicked and mouse pointer readjusted until the map no longer moves why that also may i suggest a marked link in addition to long link and short link mlat mlong are very useful for checking the accuracy also it s easier to position a pointing pointer than the five fingered hand | 1 |
42,900 | 5,545,290,326 | IssuesEvent | 2017-03-22 21:12:49 | Microsoft/vscode | https://api.github.com/repos/Microsoft/vscode | closed | Pop-up Hints showing incorrect JSDoc description or no description for variables/constants referencing a function | as-designed javascript typescript | <!-- Do you have a question? Please ask it on http://stackoverflow.com/questions/tagged/vscode -->

```javascript
/**
* description of a
*/
function a() { }
/**
* description of curry
*/
function curry() { }
/**
* description of c
*/
var c = a;
const d = curry(a);
```
VSCode started with --disable-extensions
VSCode Version: 1.10.2
OS Version: Microsoft Windows [Version 10.0.14393]
| 1.0 | Pop-up Hints showing incorrect JSDoc description or no description for variables/constants referencing a function - <!-- Do you have a question? Please ask it on http://stackoverflow.com/questions/tagged/vscode -->

```javascript
/**
* description of a
*/
function a() { }
/**
* description of curry
*/
function curry() { }
/**
* description of c
*/
var c = a;
const d = curry(a);
```
VSCode started with --disable-extensions
VSCode Version: 1.10.2
OS Version: Microsoft Windows [Version 10.0.14393]
| non_defect | pop up hints showing incorrect jsdoc description or no description for variables constants referencing a function javascript description of a function a description of curry function curry description of c var c a const d curry a vscode started with disable extensions vscode version os version microsoft windows | 0 |
66,120 | 20,011,147,333 | IssuesEvent | 2022-02-01 06:40:04 | scipy/scipy | https://api.github.com/repos/scipy/scipy | opened | BUG: ndimage.zoom introduces a phase shift | defect | ### Describe your issue.
My understanding of `ndimage.zoom` is that it is _not_ supposed to "move" the contents of an image, only change the scale.
Using it, I found that it introduces a phase shift in the image as a side-effect. The reproducing example includes more detail.
### Reproducing Code Example
```python
Apologies for the not-so-MWE nature of this setup code. The routines just produce a grid which is centered on what an FFT thinks of as the spatial analog to the DC bin, such that the FFT is phaseless if the object is centered on that sample. A few other routines are just for optimization and not the point.
def make_xy_grid(shape, *, dx=0, diameter=0, grid=True):
if not isinstance(shape, tuple):
shape = (shape, shape)
if diameter != 0:
dx = diameter/max(shape)
y, x = (fftrange(s) * dx for s in shape)
if grid:
x, y = np.meshgrid(x, y)
return x, y
def fftrange(n):
"""FFT-aligned coordinate grid for n samples."""
return np.arange(-n//2, -n//2+n)
def gaussian(sigma, x, y, center=(0, 0)):
s = sigma
x, y = optimize_xy_separable(x, y)
x0, y0 = center
return np.exp(-4 * np.log(2) * ((x - x0) ** 2 + (y - y0) ** 2) / s ** 2)
def optimize_xy_separable(x, y):
if x.ndim == 2:
# assume same dimensionality of x and y
# second indexing converts y to a broadcasted column vector
x = x[0, :]
y = y[:, 0][:, np.newaxis]
return x, y
def fft2(a):
return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(a)))
Using them,
```python
dx = 0.1
x, y = make_xy_grid(660, dx=dx)
f = gaussian(1.25, x, y)
# play with this value...
fctr = 0.9
spline_order = 3
# note: prefilter is mostly irrelevant since the input is bandlimited
fzoomed = ndimage.zoom(ifn, fctr, order=spline_order)
# classic convention, FT maps f -> F
F = fft2(f)
F2 = fft2(fzoomed)
plt.imshow(np.angle(F), cmap='twilight_r')
plt.imshow(np.angle(F2), cmap='twilight_r')
```
(images in comments, since issue template does not allow them)
```
### Error message
```shell
N/A
```
### SciPy/NumPy/Python version information
1.6.2 1.21.2 sys.version_info(major=3, minor=8, micro=12, releaselevel='final', serial=0) | 1.0 | BUG: ndimage.zoom introduces a phase shift - ### Describe your issue.
My understanding of `ndimage.zoom` is that it is _not_ supposed to "move" the contents of an image, only change the scale.
Using it, I found that it introduces a phase shift in the image as a side-effect. The reproducing example includes more detail.
### Reproducing Code Example
```python
Apologies for the not-so-MWE nature of this setup code. The routines just produce a grid which is centered on what an FFT thinks of as the spatial analog to the DC bin, such that the FFT is phaseless if the object is centered on that sample. A few other routines are just for optimization and not the point.
def make_xy_grid(shape, *, dx=0, diameter=0, grid=True):
if not isinstance(shape, tuple):
shape = (shape, shape)
if diameter != 0:
dx = diameter/max(shape)
y, x = (fftrange(s) * dx for s in shape)
if grid:
x, y = np.meshgrid(x, y)
return x, y
def fftrange(n):
"""FFT-aligned coordinate grid for n samples."""
return np.arange(-n//2, -n//2+n)
def gaussian(sigma, x, y, center=(0, 0)):
s = sigma
x, y = optimize_xy_separable(x, y)
x0, y0 = center
return np.exp(-4 * np.log(2) * ((x - x0) ** 2 + (y - y0) ** 2) / s ** 2)
def optimize_xy_separable(x, y):
if x.ndim == 2:
# assume same dimensionality of x and y
# second indexing converts y to a broadcasted column vector
x = x[0, :]
y = y[:, 0][:, np.newaxis]
return x, y
def fft2(a):
return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(a)))
Using them,
```python
dx = 0.1
x, y = make_xy_grid(660, dx=dx)
f = gaussian(1.25, x, y)
# play with this value...
fctr = 0.9
spline_order = 3
# note: prefilter is mostly irrelevant since the input is bandlimited
fzoomed = ndimage.zoom(ifn, fctr, order=spline_order)
# classic convention, FT maps f -> F
F = fft2(f)
F2 = fft2(fzoomed)
plt.imshow(np.angle(F), cmap='twilight_r')
plt.imshow(np.angle(F2), cmap='twilight_r')
```
(images in comments, since issue template does not allow them)
```
### Error message
```shell
N/A
```
### SciPy/NumPy/Python version information
1.6.2 1.21.2 sys.version_info(major=3, minor=8, micro=12, releaselevel='final', serial=0) | defect | bug ndimage zoom introduces a phase shift describe your issue my understanding of ndimage zoom is that it is not supposed to move the contents of an image only change the scale using it i found that it introduces a phase shift in the image as a side effect the reproducing example includes more detail reproducing code example python apologies for the not so mwe nature of this setup code the routines just produce a grid which is centered on what an fft thinks of as the spatial analog to the dc bin such that the fft is phaseless if the object is centered on that sample a few other routines are just for optimization and not the point def make xy grid shape dx diameter grid true if not isinstance shape tuple shape shape shape if diameter dx diameter max shape y x fftrange s dx for s in shape if grid x y np meshgrid x y return x y def fftrange n fft aligned coordinate grid for n samples return np arange n n n def gaussian sigma x y center s sigma x y optimize xy separable x y center return np exp np log x y s def optimize xy separable x y if x ndim assume same dimensionality of x and y second indexing converts y to a broadcasted column vector x x y y return x y def a return np fft fftshift np fft np fft ifftshift a using them python dx x y make xy grid dx dx f gaussian x y play with this value fctr spline order note prefilter is mostly irrelevant since the input is bandlimited fzoomed ndimage zoom ifn fctr order spline order classic convention ft maps f f f f fzoomed plt imshow np angle f cmap twilight r plt imshow np angle cmap twilight r images in comments since issue template does not allow them error message shell n a scipy numpy python version information sys version info major minor micro releaselevel final serial | 1 |
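An aside on the scipy record above: its claim, that `ndimage.zoom` can move a signal relative to an FFT-aligned grid centre and so introduce a phase ramp in the spectrum, can be illustrated without scipy at all. Below is a pure-Python 1-D stand-in, an assumption-laden sketch rather than scipy's implementation, that resamples with the edge-aligned `(n_in - 1) / (n_out - 1)` convention `zoom` historically used when `grid_mode=False`. A pulse centred on the input grid's centre sample (`n // 2`) ends up with its centroid off the output grid's centre sample.

```python
def zoom_1d(signal, out_len):
    """Linear resample mapping output index i to input coordinate
    i * (n_in - 1) / (n_out - 1), i.e. edge-to-edge alignment."""
    n = len(signal)
    scale = (n - 1) / (out_len - 1)
    out = []
    for i in range(out_len):
        x = i * scale
        lo = min(int(x), n - 2)   # clamp so lo + 1 stays in range
        t = x - lo
        out.append((1 - t) * signal[lo] + t * signal[lo + 1])
    return out

pulse = [0.0, 0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0]  # centred on index 4 == n // 2
zoomed = zoom_1d(pulse, 4)

centroid = sum(i * v for i, v in enumerate(zoomed)) / sum(zoomed)
print(centroid)  # ~1.8, but the output grid's centre sample is index 2
```

Because edge-to-edge alignment maps grid edges onto grid edges rather than centre sample onto centre sample, the pulse's centroid lands at roughly 1.8 instead of 2: a sub-sample shift that a 2-D FFT would expose exactly as the phase tilt the reporter observed.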
356,709 | 25,176,246,213 | IssuesEvent | 2022-11-11 09:30:56 | AkkFiros/pe | https://api.github.com/repos/AkkFiros/pe | opened | Inaccuracy in UG: Error under 'Edit contact's modules' section | type.DocumentationBug severity.Low | Description:
There is a sentence under the 'Edit contact's modules' section that does not make sense and does not agree with the section.
Screenshot:
(error underlined in red)

Bug:
The section is talking about contact's modules, the sentence should refer to the contact and not the user.
Sentence should be 'You are able to specify the modules the contact is taking currently, has taken in the past, and is
planning to take in the future.'
<!--session: 1668153575756-d645a771-4f5e-40b0-b98a-96f06aa45e5b-->
<!--Version: Web v3.4.4--> | 1.0 | Inaccuracy in UG: Error under 'Edit contact's modules' section - Description:
There is a sentence under the 'Edit contact's modules' section that does not make sense and does not agree with the section.
Screenshot:
(error underlined in red)

Bug:
The section is talking about contact's modules, the sentence should refer to the contact and not the user.
Sentence should be 'You are able to specify the modules the contact is taking currently, has taken in the past, and is
planning to take in the future.'
<!--session: 1668153575756-d645a771-4f5e-40b0-b98a-96f06aa45e5b-->
<!--Version: Web v3.4.4--> | non_defect | inaccuracy in ug error under edit contact s modules section description there is a sentence under the edit contact s modules section that does not make sense and does not agree with the section screenshot error underlined in red bug the section is talking about contact s modules the sentence should refer to the contact and not the user sentence should be you are able to specify the modules the contact is taking currently has taken in the past and is planning to take in the future | 0 |
17,481 | 3,008,889,140 | IssuesEvent | 2015-07-28 00:10:47 | belangeo/cecilia5 | https://api.github.com/repos/belangeo/cecilia5 | closed | Status of Cybil? | auto-migrated Priority-Medium Type-Defect | ```
In the 2.x version of Cecilia, the Cybil language allowed you to do powerful
pre-processing scroe generation. But Cybil is not embeded with Cecilia 4 or 5.
Is there any expectation that Cybil will be included in future releases? Or
will it be available as a stand-alone precompiler?
I am interested in using this in an OS X environment, but I would imagine that
it would be useful on any platform.
Thanks in advance.
Stephen David Beck
School of Music
Louisiana State University
```
Original issue reported on code.google.com by `stephend...@gmail.com` on 1 Aug 2013 at 3:01 | 1.0 | Status of Cybil? - ```
In the 2.x version of Cecilia, the Cybil language allowed you to do powerful
pre-processing scroe generation. But Cybil is not embeded with Cecilia 4 or 5.
Is there any expectation that Cybil will be included in future releases? Or
will it be available as a stand-alone precompiler?
I am interested in using this in an OS X environment, but I would imagine that
it would be useful on any platform.
Thanks in advance.
Stephen David Beck
School of Music
Louisiana State University
```
Original issue reported on code.google.com by `stephend...@gmail.com` on 1 Aug 2013 at 3:01 | defect | status of cybil in the x version of cecilia the cybil language allowed you to do powerful pre processing scroe generation but cybil is not embeded with cecilia or is there any expectation that cybil will be included in future releases or will it be available as a stand alone precompiler i am interested in using this in an os x environment but i would imagine that it would be useful on any platform thanks in advance stephen david beck school of music louisiana state university original issue reported on code google com by stephend gmail com on aug at | 1 |
293,387 | 22,055,936,995 | IssuesEvent | 2022-05-30 12:53:59 | DarkWolfie-Youtube/Unity-Music-Player | https://api.github.com/repos/DarkWolfie-Youtube/Unity-Music-Player | closed | Feature: Adding Images for the music. | documentation enhancement help wanted | Currently adding the code to show the cover art for the music on discord rich presence. Will also add it to the UI afterwards. | 1.0 | Feature: Adding Images for the music. - Currently adding the code to show the cover art for the music on discord rich presence. Will also add it to the UI afterwards. | non_defect | feature adding images for the music currently adding the code to show the cover art for the music on discord rich presence will also add it to the ui afterwards | 0 |
172,750 | 21,054,809,564 | IssuesEvent | 2022-04-01 01:18:17 | ghuangsnl/spring-boot | https://api.github.com/repos/ghuangsnl/spring-boot | opened | CVE-2022-27772 (Medium) detected in spring-boot-2.2.4.RELEASE.jar | security vulnerability | ## CVE-2022-27772 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-boot-2.2.4.RELEASE.jar</b></p></summary>
<p>Spring Boot</p>
<p>Library home page: <a href="https://projects.spring.io/spring-boot/#/spring-boot-parent/spring-boot">https://projects.spring.io/spring-boot/#/spring-boot-parent/spring-boot</a></p>
<p>Path to dependency file: /ci/images/releasescripts/pom.xml</p>
<p>Path to vulnerable library: /2.2.4.RELEASE/spring-boot-2.2.4.RELEASE.jar</p>
<p>
Dependency Hierarchy:
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
** UNSUPPORTED WHEN ASSIGNED ** spring-boot versions prior to version v2.2.11.RELEASE was vulnerable to temporary directory hijacking. This vulnerability impacted the org.springframework.boot.web.server.AbstractConfigurableWebServerFactory.createTempDir method. NOTE: This vulnerability only affects products and/or versions that are no longer supported by the maintainer.
<p>Publish Date: 2022-03-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-27772>CVE-2022-27772</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/JLLeitschuh/security-research/security/advisories/GHSA-cm59-pr5q-cw85">https://github.com/JLLeitschuh/security-research/security/advisories/GHSA-cm59-pr5q-cw85</a></p>
<p>Release Date: 2022-03-30</p>
<p>Fix Resolution: org.springframework.boot:spring-boot:2.2.11.RELEASE</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-27772 (Medium) detected in spring-boot-2.2.4.RELEASE.jar - ## CVE-2022-27772 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-boot-2.2.4.RELEASE.jar</b></p></summary>
<p>Spring Boot</p>
<p>Library home page: <a href="https://projects.spring.io/spring-boot/#/spring-boot-parent/spring-boot">https://projects.spring.io/spring-boot/#/spring-boot-parent/spring-boot</a></p>
<p>Path to dependency file: /ci/images/releasescripts/pom.xml</p>
<p>Path to vulnerable library: /2.2.4.RELEASE/spring-boot-2.2.4.RELEASE.jar</p>
<p>
Dependency Hierarchy:
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
** UNSUPPORTED WHEN ASSIGNED ** spring-boot versions prior to version v2.2.11.RELEASE was vulnerable to temporary directory hijacking. This vulnerability impacted the org.springframework.boot.web.server.AbstractConfigurableWebServerFactory.createTempDir method. NOTE: This vulnerability only affects products and/or versions that are no longer supported by the maintainer.
<p>Publish Date: 2022-03-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-27772>CVE-2022-27772</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/JLLeitschuh/security-research/security/advisories/GHSA-cm59-pr5q-cw85">https://github.com/JLLeitschuh/security-research/security/advisories/GHSA-cm59-pr5q-cw85</a></p>
<p>Release Date: 2022-03-30</p>
<p>Fix Resolution: org.springframework.boot:spring-boot:2.2.11.RELEASE</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_defect | cve medium detected in spring boot release jar cve medium severity vulnerability vulnerable library spring boot release jar spring boot library home page a href path to dependency file ci images releasescripts pom xml path to vulnerable library release spring boot release jar dependency hierarchy vulnerability details unsupported when assigned spring boot versions prior to version release was vulnerable to temporary directory hijacking this vulnerability impacted the org springframework boot web server abstractconfigurablewebserverfactory createtempdir method note this vulnerability only affects products and or versions that are no longer supported by the maintainer publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org springframework boot spring boot release step up your open source security game with whitesource | 0 |
27,942 | 22,627,348,074 | IssuesEvent | 2022-06-30 11:54:32 | WordPress/performance | https://api.github.com/repos/WordPress/performance | opened | Prepare 1.3.0 release | [Type] Enhancement Infrastructure | This issue is to track preparation of the upcoming 1.3.0 release up until publishing, which is due **July 18, 2022**.
- [ ] Create release/1.3.0 branch closer to the release date
- [ ] Finalize scope and punt unfinished pull requests to following release
- [ ] Prepare the release (Monday, July 18, 2022) | 1.0 | Prepare 1.3.0 release - This issue is to track preparation of the upcoming 1.3.0 release up until publishing, which is due **July 18, 2022**.
- [ ] Create release/1.3.0 branch closer to the release date
- [ ] Finalize scope and punt unfinished pull requests to following release
- [ ] Prepare the release (Monday, July 18, 2022) | non_defect | prepare release this issue is to track preparation of the upcoming release up until publishing which is due july create release branch closer to the release date finalize scope and punt unfinished pull requests to following release prepare the release monday july | 0 |
638,608 | 20,732,036,309 | IssuesEvent | 2022-03-14 10:19:29 | wso2/product-apim | https://api.github.com/repos/wso2/product-apim | closed | Access token generation fails in cross tenant subscription | Type/Bug Priority/Normal Feature/CrossTenantSubscription APIM - 4.1.0 | ### Description:
<!-- Describe the issue -->
Access token generation gives page not found error in cross tenant subscription
<img width="1429" alt="Screenshot 2565-03-07 at 18 30 06" src="https://user-images.githubusercontent.com/36605514/157039279-8045e1e1-2845-4589-8cbb-f6264fd4ca96.png">
### Steps to reproduce:
1. Subscribe to an API in a different tenant
2. Try to generate an access token
### Affected Product Version:
<!-- Members can use Affected/*** labels -->
APIM-4.1.0-beta
### Environment details (with versions):
- OS: MacOS
- Client: Local | 1.0 | Access token generation fails in cross tenant subscription - ### Description:
<!-- Describe the issue -->
Access token generation gives page not found error in cross tenant subscription
<img width="1429" alt="Screenshot 2565-03-07 at 18 30 06" src="https://user-images.githubusercontent.com/36605514/157039279-8045e1e1-2845-4589-8cbb-f6264fd4ca96.png">
### Steps to reproduce:
1. Subscribe to an API in a different tenant
2. Try to generate an access token
### Affected Product Version:
<!-- Members can use Affected/*** labels -->
APIM-4.1.0-beta
### Environment details (with versions):
- OS: MacOS
- Client: Local | non_defect | access token generation fails in cross tenant subscription description access token generation gives page not found error in cross tenant subscription img width alt screenshot at src steps to reproduce subscribe to an api in a different tenant try to generate an access token affected product version apim beta environment details with versions os macos client local | 0 |
11,341 | 14,164,897,781 | IssuesEvent | 2020-11-12 06:09:22 | tikv/tikv | https://api.github.com/repos/tikv/tikv | closed | cases::test_coprocessor::test_parse_request_failed_2 not work | priority/low severity/Major sig/coprocessor type/bug | ## Bug Report
<!-- Thanks for your bug report! Don't worry if you can't fill out all the sections. -->
### What version of TiKV are you using?
master
### Steps to reproduce
- add `println!("list {:?}", fail::list());` at `fail_point!("rockskv_async_snapshot",`
- `cargo test --package tests --test failpoints -- cases::test_coprocessor::test_parse_request_failed_2 --exact --nocapture`
### What did you expect?
fail::list prints failpoint configuration
| 1.0 | cases::test_coprocessor::test_parse_request_failed_2 not work - ## Bug Report
<!-- Thanks for your bug report! Don't worry if you can't fill out all the sections. -->
### What version of TiKV are you using?
master
### Steps to reproduce
- add `println!("list {:?}", fail::list());` at `fail_point!("rockskv_async_snapshot",`
- `cargo test --package tests --test failpoints -- cases::test_coprocessor::test_parse_request_failed_2 --exact --nocapture`
### What did you expect?
fail::list prints failpoint configuration
| non_defect | cases test coprocessor test parse request failed not work bug report what version of tikv are you using master steps to reproduce add println list fail list at fail point rockskv async snapshot cargo test package tests test failpoints cases test coprocessor test parse request failed exact nocapture what did you expect fail list prints failpoint configuration | 0 |
683,900 | 23,398,910,308 | IssuesEvent | 2022-08-12 05:12:34 | ballerina-platform/ballerina-lang | https://api.github.com/repos/ballerina-platform/ballerina-lang | closed | Bad-Sad error when function is accessed using field access within the init method of same class | Type/Bug Priority/High Team/CompilerFE Team/jBallerina Points/3 SwanLakeDump Area/TypeChecker Lang/ClassDefinition Crash Lang/Expressions/FieldAccess | **Description:**
```ballerina
class A {
function foo(int i) returns int {
return i+1;
}
function (int i) returns int bar;
function init() {
self.bar = self.foo;
}
// This works
function baz() returns function (int) returns int {
self.bar = self.foo;
return self.bar;
}
}
public function main() {
A a = new A();
var x = a.baz();
panic error(x(1).toString());
}
```
[ballerina-internal.log](https://github.com/ballerina-platform/ballerina-lang/files/5931625/ballerina-internal.log)
**Steps to reproduce:**
**Affected Versions:**
Ballerina Swan Lake Alpha.
**OS, DB, other environment details and versions:**
**Related Issues (optional):**
<!-- Any related issues such as sub tasks, issues reported in other repositories (e.g component repositories), similar problems, etc. -->
**Suggested Labels (optional):**
<!-- Optional comma separated list of suggested labels. Non committers can’t assign labels to issues, so this will help issue creators who are not a committer to suggest possible labels-->
**Suggested Assignees (optional):**
<!--Optional comma separated list of suggested team members who should attend the issue. Non committers can’t assign issues to assignees, so this will help issue creators who are not a committer to suggest possible assignees-->
| 1.0 | Bad-Sad error when function is accessed using field access within the init method of same class - **Description:**
```ballerina
class A {
function foo(int i) returns int {
return i+1;
}
function (int i) returns int bar;
function init() {
self.bar = self.foo;
}
// This works
function baz() returns function (int) returns int {
self.bar = self.foo;
return self.bar;
}
}
public function main() {
A a = new A();
var x = a.baz();
panic error(x(1).toString());
}
```
[ballerina-internal.log](https://github.com/ballerina-platform/ballerina-lang/files/5931625/ballerina-internal.log)
**Steps to reproduce:**
**Affected Versions:**
Ballerina Swan Lake Alpha.
**OS, DB, other environment details and versions:**
**Related Issues (optional):**
<!-- Any related issues such as sub tasks, issues reported in other repositories (e.g component repositories), similar problems, etc. -->
**Suggested Labels (optional):**
<!-- Optional comma separated list of suggested labels. Non committers can’t assign labels to issues, so this will help issue creators who are not a committer to suggest possible labels-->
**Suggested Assignees (optional):**
<!--Optional comma separated list of suggested team members who should attend the issue. Non committers can’t assign issues to assignees, so this will help issue creators who are not a committer to suggest possible assignees-->
| non_defect | bad sad error when function is accessed using field access within the init method of same class description ballerina class a function foo int i returns int return i function int i returns int bar function init self bar self foo this works function baz returns function int returns int self bar self foo return self bar public function main a a new a var x a baz panic error x tostring steps to reproduce affected versions ballerina swan lake alpha os db other environment details and versions related issues optional suggested labels optional suggested assignees optional | 0 |
656,613 | 21,768,735,000 | IssuesEvent | 2022-05-13 06:46:34 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | web.whatsapp.com - see bug description | browser-firefox priority-critical engine-gecko | <!-- @browser: Firefox 100.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:100.0) Gecko/20100101 Firefox/100.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/104286 -->
**URL**: https://web.whatsapp.com/
**Browser / Version**: Firefox 100.0
**Operating System**: Windows 7
**Tested Another Browser**: Yes Chrome
**Problem type**: Something else
**Description**: A página não carrega. Já limpei os cookies e demais arquivos temporário. Só acontece isto no Firefox
**Steps to Reproduce**:
Nao consigo acessar o whatsapp web
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | web.whatsapp.com - see bug description - <!-- @browser: Firefox 100.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:100.0) Gecko/20100101 Firefox/100.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/104286 -->
**URL**: https://web.whatsapp.com/
**Browser / Version**: Firefox 100.0
**Operating System**: Windows 7
**Tested Another Browser**: Yes Chrome
**Problem type**: Something else
**Description**: A página não carrega. Já limpei os cookies e demais arquivos temporário. Só acontece isto no Firefox
**Steps to Reproduce**:
Nao consigo acessar o whatsapp web
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | non_defect | web whatsapp com see bug description url browser version firefox operating system windows tested another browser yes chrome problem type something else description a página não carrega já limpei os cookies e demais arquivos temporário só acontece isto no firefox steps to reproduce nao consigo acessar o whatsapp web browser configuration none from with ❤️ | 0 |
61,168 | 17,023,623,322 | IssuesEvent | 2021-07-03 02:58:50 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | primitive_xml_parsing/osm2pgsql core dump on options | Component: mapnik Priority: major Resolution: fixed Type: defect | **[Submitted to the original trac issue database at 12.39pm, Monday, 16th August 2010]**
Fixed with the patch below:
```diff
Index: osm2pgsql.c
===================================================================
--- osm2pgsql.c (revision 22625)
+++ osm2pgsql.c (working copy)
@@ -741,6 +741,7 @@
struct output_options options;
PGconn *sql_conn;
+ memset(&options, '\0', sizeof(options));
fprintf(stderr, "osm2pgsql SVN version %s\n\n", VERSION);
while (1) {
``` | 1.0 | primitive_xml_parsing/osm2pgsql core dump on options - **[Submitted to the original trac issue database at 12.39pm, Monday, 16th August 2010]**
Fixed with the patch below:
```diff
Index: osm2pgsql.c
===================================================================
--- osm2pgsql.c (revision 22625)
+++ osm2pgsql.c (working copy)
@@ -741,6 +741,7 @@
struct output_options options;
PGconn *sql_conn;
+ memset(&options, '\0', sizeof(options));
fprintf(stderr, "osm2pgsql SVN version %s\n\n", VERSION);
while (1) {
``` | defect | primitive xml parsing core dump on options fixed with the patch below diff index c c revision c working copy struct output options options pgconn sql conn memset options sizeof options fprintf stderr svn version s n n version while | 1 |
250,017 | 7,966,535,351 | IssuesEvent | 2018-07-14 23:46:19 | utra-robosoccer/soccer-embedded | https://api.github.com/repos/utra-robosoccer/soccer-embedded | closed | Code style: FreeRTOS functions use a mixture of tabs and spaces which is ugly | Priority: Low Type: Maintenance | This issue is due to legacy usage of tabs in functions instead of spaces, and some stuff that has to do with Cube's auto-generated HAL code. All indentations should consist of 4 spaces. | 1.0 | Code style: FreeRTOS functions use a mixture of tabs and spaces which is ugly - This issue is due to legacy usage of tabs in functions instead of spaces, and some stuff that has to do with Cube's auto-generated HAL code. All indentations should consist of 4 spaces. | non_defect | code style freertos functions use a mixture of tabs and spaces which is ugly this issue is due to legacy usage of tabs in functions instead of spaces and some stuff that has to do with cube s auto generated hal code all indentations should consist of spaces | 0 |
74,595 | 15,360,431,913 | IssuesEvent | 2021-03-01 16:55:39 | Reid-Turner/uppy | https://api.github.com/repos/Reid-Turner/uppy | opened | CVE-2017-16137 (Medium) detected in debug-2.2.0.tgz | security vulnerability | ## CVE-2017-16137 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>debug-2.2.0.tgz</b></p></summary>
<p>small debugging utility</p>
<p>Library home page: <a href="https://registry.npmjs.org/debug/-/debug-2.2.0.tgz">https://registry.npmjs.org/debug/-/debug-2.2.0.tgz</a></p>
<p>
Dependency Hierarchy:
- @uppy-example/uppy-with-companion-file:examples/uppy-with-companion.tgz (Root Library)
- upload-server-1.1.6.tgz
- express-directory-0.0.2.tgz
- finalhandler-0.3.6.tgz
- :x: **debug-2.2.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Reid-Turner/uppy/commit/b8e5cf171f5977293d839ff39cf30743353617dc">b8e5cf171f5977293d839ff39cf30743353617dc</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The debug module is vulnerable to regular expression denial of service when untrusted user input is passed into the o formatter. It takes around 50k characters to block for 2 seconds making this a low severity issue.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-16137>CVE-2017-16137</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-16137">https://nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-16137</a></p>
<p>Release Date: 2018-06-07</p>
<p>Fix Resolution: 2.6.9</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"debug","packageVersion":"2.2.0","packageFilePaths":[],"isTransitiveDependency":true,"dependencyTree":"@uppy-example/uppy-with-companion:file:examples/uppy-with-companion;upload-server:1.1.6;express-directory:0.0.2;finalhandler:0.3.6;debug:2.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.6.9"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2017-16137","vulnerabilityDetails":"The debug module is vulnerable to regular expression denial of service when untrusted user input is passed into the o formatter. It takes around 50k characters to block for 2 seconds making this a low severity issue.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-16137","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | CVE-2017-16137 (Medium) detected in debug-2.2.0.tgz - ## CVE-2017-16137 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>debug-2.2.0.tgz</b></p></summary>
<p>small debugging utility</p>
<p>Library home page: <a href="https://registry.npmjs.org/debug/-/debug-2.2.0.tgz">https://registry.npmjs.org/debug/-/debug-2.2.0.tgz</a></p>
<p>
Dependency Hierarchy:
- @uppy-example/uppy-with-companion-file:examples/uppy-with-companion.tgz (Root Library)
- upload-server-1.1.6.tgz
- express-directory-0.0.2.tgz
- finalhandler-0.3.6.tgz
- :x: **debug-2.2.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Reid-Turner/uppy/commit/b8e5cf171f5977293d839ff39cf30743353617dc">b8e5cf171f5977293d839ff39cf30743353617dc</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The debug module is vulnerable to regular expression denial of service when untrusted user input is passed into the o formatter. It takes around 50k characters to block for 2 seconds making this a low severity issue.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-16137>CVE-2017-16137</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-16137">https://nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-16137</a></p>
<p>Release Date: 2018-06-07</p>
<p>Fix Resolution: 2.6.9</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"debug","packageVersion":"2.2.0","packageFilePaths":[],"isTransitiveDependency":true,"dependencyTree":"@uppy-example/uppy-with-companion:file:examples/uppy-with-companion;upload-server:1.1.6;express-directory:0.0.2;finalhandler:0.3.6;debug:2.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.6.9"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2017-16137","vulnerabilityDetails":"The debug module is vulnerable to regular expression denial of service when untrusted user input is passed into the o formatter. It takes around 50k characters to block for 2 seconds making this a low severity issue.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-16137","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_defect | cve medium detected in debug tgz cve medium severity vulnerability vulnerable library debug tgz small debugging utility library home page a href dependency hierarchy uppy example uppy with companion file examples uppy with companion tgz root library upload server tgz express directory tgz finalhandler tgz x debug tgz vulnerable library found in head commit a href found in base branch master vulnerability details the debug module is vulnerable to regular expression denial of service when untrusted user input is passed into the o formatter it takes around characters to block for seconds making this a low severity issue publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information 
on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree uppy example uppy with companion file examples uppy with companion upload server express directory finalhandler debug isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier cve vulnerabilitydetails the debug module is vulnerable to regular expression denial of service when untrusted user input is passed into the o formatter it takes around characters to block for seconds making this a low severity issue vulnerabilityurl | 0 |
24,318 | 3,962,702,425 | IssuesEvent | 2016-05-02 17:48:10 | wpostma/notifier-jenkins | https://api.github.com/repos/wpostma/notifier-jenkins | closed | Unable to download application. Details file attached. | auto-migrated Priority-Medium Type-Defect | ```
What steps will reproduce the problem?
1. Click on the usage and installation link that starts the ClickOnce
installation.
```
Original issue reported on code.google.com by `pisar...@gmail.com` on 25 Jan 2013 at 6:01
Attachments:
* [CAHR4RQE.log](https://storage.googleapis.com/google-code-attachments/notifier-jenkins/issue-2/comment-0/CAHR4RQE.log)
| 1.0 | Unable to download application. Details file attached. - ```
What steps will reproduce the problem?
1. Click on the usage and installation link that starts the ClickOnce
installation.
```
Original issue reported on code.google.com by `pisar...@gmail.com` on 25 Jan 2013 at 6:01
Attachments:
* [CAHR4RQE.log](https://storage.googleapis.com/google-code-attachments/notifier-jenkins/issue-2/comment-0/CAHR4RQE.log)
| defect | unable to download application details file attached what steps will reproduce the problem click on the usage and installation link that starts the clickonce installation original issue reported on code google com by pisar gmail com on jan at attachments | 1 |
241,986 | 18,507,182,454 | IssuesEvent | 2021-10-19 20:08:51 | ErenoGit/Virtual-ATM-machine | https://api.github.com/repos/ErenoGit/Virtual-ATM-machine | opened | Add README.md | documentation hacktoberfest | The programme is written in the author's language, which makes it difficult to analyse. We have to change everything into English, the international language. | 1.0 | Add README.md - The programme is written in the author's language, which makes it difficult to analyse. We have to change everything into English, the international language. | non_defect | add readme md the programme is written in the author s language which makes it difficult to analyse we have to change everything into english the international language | 0 |
46,711 | 13,055,962,621 | IssuesEvent | 2020-07-30 03:14:55 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | opened | I3Particle documentation incomplete (Trac #1746) | Incomplete Migration Migrated from Trac combo core defect | Migrated from https://code.icecube.wisc.edu/ticket/1746
```json
{
"status": "closed",
"changetime": "2019-02-13T14:12:47",
"description": "\nPer Alex Olivas ... there is missing documentation on the cascade\n",
"reporter": "pmeade",
"cc": "olivas, gmaggi@vub.ac.be",
"resolution": "fixed",
"_ts": "1550067167842669",
"component": "combo core",
"summary": "I3Particle documentation incomplete",
"priority": "normal",
"keywords": "",
"time": "2016-06-14T19:36:30",
"milestone": "",
"owner": "gmaggi",
"type": "defect"
}
```
| 1.0 | I3Particle documentation incomplete (Trac #1746) - Migrated from https://code.icecube.wisc.edu/ticket/1746
```json
{
"status": "closed",
"changetime": "2019-02-13T14:12:47",
"description": "\nPer Alex Olivas ... there is missing documentation on the cascade\n",
"reporter": "pmeade",
"cc": "olivas, gmaggi@vub.ac.be",
"resolution": "fixed",
"_ts": "1550067167842669",
"component": "combo core",
"summary": "I3Particle documentation incomplete",
"priority": "normal",
"keywords": "",
"time": "2016-06-14T19:36:30",
"milestone": "",
"owner": "gmaggi",
"type": "defect"
}
```
| defect | documentation incomplete trac migrated from json status closed changetime description nper alex olivas there is missing documentation on the cascade n reporter pmeade cc olivas gmaggi vub ac be resolution fixed ts component combo core summary documentation incomplete priority normal keywords time milestone owner gmaggi type defect | 1 |
795,508 | 28,075,433,105 | IssuesEvent | 2023-03-29 22:57:47 | Yoooi0/MultiFunPlayer | https://api.github.com/repos/Yoooi0/MultiFunPlayer | opened | Allow disabling bindings | enhancement priority-medium | Toggle in UI.
Shortcuts that disable bindings? Select by capture/index/dropdown. | 1.0 | Allow disabling bindings - Toggle in UI.
Shortcuts that disable bindings? Select by capture/index/dropdown. | non_defect | allow disabling bindings toggle in ui shortcuts that disable bindings select by capture index dropdown | 0 |
131,735 | 28,014,427,019 | IssuesEvent | 2023-03-27 21:15:13 | llvm/llvm-project | https://api.github.com/repos/llvm/llvm-project | reopened | Clang-format confused trailing return type | clang-format invalid-code-generation | Problem found when upgrading from 16-RC2 to 16.0.0
````cpp
struct EXPORT_MACRO [[nodiscard]] C
{
using U = int;
auto f() noexcept -> void {}
auto g() -> void {}
};
````
gets reformatted to
````cpp
struct EXPORT_MACRO [[nodiscard]] C
{
using U = int;
auto f() noexcept -> void {}
auto g()->void {}
};
````
Note in function `g()` that `->` has no spaces around it while `f()` does have them.
[reproduction.zip](https://github.com/llvm/llvm-project/files/11068943/reproduction.zip) contains full reproduction for this | 1.0 | Clang-format confused trailing return type - Problem found when upgrading from 16-RC2 to 16.0.0
````cpp
struct EXPORT_MACRO [[nodiscard]] C
{
using U = int;
auto f() noexcept -> void {}
auto g() -> void {}
};
````
gets reformatted to
````cpp
struct EXPORT_MACRO [[nodiscard]] C
{
using U = int;
auto f() noexcept -> void {}
auto g()->void {}
};
````
Note in function `g()` that `->` has no spaces around it while `f()` does have them.
[reproduction.zip](https://github.com/llvm/llvm-project/files/11068943/reproduction.zip) contains full reproduction for this | non_defect | clang format confused trailing return type problem found when upgrading from to cpp struct export macro c using u int auto f noexcept void auto g void gets reformatted to cpp struct export macro c using u int auto f noexcept void auto g void note in function g that has no spaces around it while f does have them contains full reproduction for this | 0 |
21,489 | 3,512,770,943 | IssuesEvent | 2016-01-11 05:01:01 | Virtual-Labs/problem-solving-iiith | https://api.github.com/repos/Virtual-Labs/problem-solving-iiith | reopened | QA_String Operations_Back to experiment | Category :Usability Defect raised on: 24-11-2015 Developed by:IIIT Hyd Release Number Severity :S2 Status :Open Version Number :1.1 | Defect Description:
In the "String Operations" experiment,the back to experiments link is not present in the page instead the back to experiments link should be displayed on the screen inorder to view the list of experiments by the user .
Actual Result:
In the "String Operations" experiment,the back to experiments link is not present in the page.
Environment :
OS: Windows 7, Ubuntu-16.04,Centos-6
Browsers: Firefox-42.0,Chrome-47.0,chromium-45.0
Bandwidth : 100Mbps
Hardware Configuration:8GBRAM ,
Processor:i5
Test Step Link:
https://github.com/Virtual-Labs/problem-solving-iiith/blob/master/test-cases/integration_test-cases/String%20Operations/String%20Operations_29_Back%20to%20experiments_smk.org

| 1.0 | QA_String Operations_Back to experiment - Defect Description:
In the "String Operations" experiment,the back to experiments link is not present in the page instead the back to experiments link should be displayed on the screen inorder to view the list of experiments by the user .
Actual Result:
In the "String Operations" experiment,the back to experiments link is not present in the page.
Environment :
OS: Windows 7, Ubuntu-16.04,Centos-6
Browsers: Firefox-42.0,Chrome-47.0,chromium-45.0
Bandwidth : 100Mbps
Hardware Configuration:8GBRAM ,
Processor:i5
Test Step Link:
https://github.com/Virtual-Labs/problem-solving-iiith/blob/master/test-cases/integration_test-cases/String%20Operations/String%20Operations_29_Back%20to%20experiments_smk.org

| defect | qa string operations back to experiment defect description in the string operations experiment the back to experiments link is not present in the page instead the back to experiments link should be displayed on the screen inorder to view the list of experiments by the user actual result in the string operations experiment the back to experiments link is not present in the page environment os windows ubuntu centos browsers firefox chrome chromium bandwidth hardware configuration processor test step link | 1 |
55,738 | 14,663,359,784 | IssuesEvent | 2020-12-29 09:31:49 | primefaces/primeng | https://api.github.com/repos/primefaces/primeng | closed | Multiselect on chips display does not update the model when deleting them from the chip icon | defect | I'm submitting a
[X] bug report
**Current behavior**
On the chips display, when you delete a chip by clicking its X button, the model doesn't refresh.
**Expected behavior**
The model should be refreshed, just as it is when you deselect the item from the dropdown list.
**Minimal reproduction of the problem with instructions**
https://primeng-multiselect-demo.stackblitz.io/
| 1.0 | Multiselect on chips display does not update the model when deleting them from the chip icon - I'm submitting a
[X] bug report
**Current behavior**
On the chips display, when you delete a chip by clicking its X button, the model doesn't refresh.
**Expected behavior**
The model should be refreshed, just as it is when you deselect the item from the dropdown list.
**Minimal reproduction of the problem with instructions**
https://primeng-multiselect-demo.stackblitz.io/
| defect | multiselect on chips display does not update the model when deleting them from the chip icon i m submitting a bug report current behavior on chips display when you delete a chip by clicking the x button the model doesn t refresh expected behavior model should be refreshed like when you unclick from the dropdown list minimal reproduction of the problem with instructions | 1 |
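The behavior reported above, where one removal path updates the model and another does not, matches a common two-path UI bug. A framework-free sketch of the fixed behavior (illustrative Python, not PrimeNG's Angular code):

```python
class MultiSelectModel:
    """Minimal stand-in for a multiselect's value model (illustrative,
    not PrimeNG's implementation)."""

    def __init__(self):
        self.value = []
        self._listeners = []

    def on_change(self, fn):
        self._listeners.append(fn)

    def _emit(self):
        for fn in self._listeners:
            fn(list(self.value))

    def toggle(self, item):
        # Dropdown path: mutate, then notify. This path worked.
        if item in self.value:
            self.value.remove(item)
        else:
            self.value.append(item)
        self._emit()

    def remove_chip(self, item):
        # Chip "X" path: the reported bug is equivalent to mutating
        # here WITHOUT calling _emit(), leaving bound views stale.
        # The fix is to notify after every mutation path.
        self.value.remove(item)
        self._emit()


m = MultiSelectModel()
seen = []
m.on_change(seen.append)
m.toggle("a")
m.toggle("b")
m.remove_chip("a")
print(seen[-1])  # ['b'], the model reflects the chip removal
```

Routing every mutation through one notifying method is the usual way to keep the two paths from diverging again.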
64,308 | 18,419,616,206 | IssuesEvent | 2021-10-13 14:50:33 | vector-im/element-web | https://api.github.com/repos/vector-im/element-web | closed | Room List not loading | T-Defect S-Critical A-Room-List Z-Rageshake O-Occasional | ### Steps to reproduce
Unknown
### Outcome
#### What did you expect?
Room list to load promptly.
#### What happened instead?
We are seeing users report issues where the room list stays in its 'placeholder' state, even after the app has finished syncing. It can stay like this 20 minutes or so. Trying to filter with it in this state causes it to break in a different way, showing no rooms at all instead of the placeholder.
### Operating system
_No response_
### Browser information
_No response_
### URL for webapp
_No response_
### Homeserver
_No response_
### Will you send logs?
No | 1.0 | Room List not loading - ### Steps to reproduce
Unknown
### Outcome
#### What did you expect?
Room list to load promptly.
#### What happened instead?
We are seeing users report issues where the room list stays in its 'placeholder' state, even after the app has finished syncing. It can stay like this 20 minutes or so. Trying to filter with it in this state causes it to break in a different way, showing no rooms at all instead of the placeholder.
### Operating system
_No response_
### Browser information
_No response_
### URL for webapp
_No response_
### Homeserver
_No response_
### Will you send logs?
No | defect | room list not loading steps to reproduce unknown outcome what did you expect room list to load promptly what happened instead we are seeing users report issues where the room list stays in its placeholder state even after the app has finished syncing it can stay like this minutes or so trying to filter with it in this state causes it to break in a different way showing no rooms at all instead of the placeholder operating system no response browser information no response url for webapp no response homeserver no response will you send logs no | 1 |
4,092 | 2,610,087,272 | IssuesEvent | 2015-02-26 18:26:30 | chrsmith/dsdsdaadf | https://api.github.com/repos/chrsmith/dsdsdaadf | opened | 深圳痤疮粉刺的祛除方法 | auto-migrated Priority-Medium Type-Defect | ```
深圳痤疮粉刺的祛除方法【深圳韩方科颜全国热线400-869-1818��
�24小时QQ4008691818】深圳韩方科颜专业祛痘连锁机构,机构以��
�国秘方——韩方科颜这一国妆准字号治疗型权威,祛痘佳品�
��韩方科颜专业祛痘连锁机构,采用韩国秘方配合专业“不反
弹”健康祛痘技术并结合先进“先进豪华彩光”仪,开创国��
�专业治疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸�
��的痘痘。
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 7:19 | 1.0 | 深圳痤疮粉刺的祛除方法 - ```
深圳痤疮粉刺的祛除方法【深圳韩方科颜全国热线400-869-1818��
�24小时QQ4008691818】深圳韩方科颜专业祛痘连锁机构,机构以��
�国秘方——韩方科颜这一国妆准字号治疗型权威,祛痘佳品�
��韩方科颜专业祛痘连锁机构,采用韩国秘方配合专业“不反
弹”健康祛痘技术并结合先进“先进豪华彩光”仪,开创国��
�专业治疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸�
��的痘痘。
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 7:19 | defect | 深圳痤疮粉刺的祛除方法 深圳痤疮粉刺的祛除方法【 �� � 】深圳韩方科颜专业祛痘连锁机构,机构以�� �国秘方——韩方科颜这一国妆准字号治疗型权威,祛痘佳品� ��韩方科颜专业祛痘连锁机构,采用韩国秘方配合专业“不反 弹”健康祛痘技术并结合先进“先进豪华彩光”仪,开创国�� �专业治疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸� ��的痘痘。 original issue reported on code google com by szft com on may at | 1 |
56,973 | 15,553,983,517 | IssuesEvent | 2021-03-16 02:44:47 | NREL/EnergyPlus | https://api.github.com/repos/NREL/EnergyPlus | closed | VS DX cooling coil outlet air condition calculation problem when PLR is zero for CV system | Defect | Issue overview
--------------
A variable-speed DX cooling coil in a constant-volume system calculates the outlet air condition even when the compressor part-load ratio is zero. At low supply air flow rates, the calculated coil outlet temperature can be very low (e.g., -46C), which triggers a Psych error message. This is a defect.
### Details
Some additional details for this issue (if relevant):
- WIN10 (Operating system, version)
- EnergyPlus Issue #8497
### Checklist
Add to this list or remove from it as applicable. This is a simple templated set of guidelines.
- [ ] Defect file added (list location of defect file here)
- [ ] Ticket added to Pivotal for defect (development team task)
- [ ] Pull request created (the pull request will have additional tasks related to reviewing changes that fix this defect)
| 1.0 | VS DX cooling coil outlet air condition calculation problem when PLR is zero for CV system - Issue overview
--------------
A variable-speed DX cooling coil in a constant-volume system calculates the outlet air condition even when the compressor part-load ratio is zero. At low supply air flow rates, the calculated coil outlet temperature can be very low (e.g., -46C), which triggers a Psych error message. This is a defect.
### Details
Some additional details for this issue (if relevant):
- WIN10 (Operating system, version)
- EnergyPlus Issue #8497
### Checklist
Add to this list or remove from it as applicable. This is a simple templated set of guidelines.
- [ ] Defect file added (list location of defect file here)
- [ ] Ticket added to Pivotal for defect (development team task)
- [ ] Pull request created (the pull request will have additional tasks related to reviewing changes that fix this defect)
| defect | vs dx cooling coil outlet air condition calculation problem when plr is zero for cv system issue overview a variable speed dx cooling coil in a constant volume system calculates outlet air condition when the compressor part load ratio is zero at low supply air flow rates the calculated coil outlet temperature can be very low e g that triggers psych error message this is a defect details some additional details for this issue if relevant operating system version energyplus issue checklist add to this list or remove from it as applicable this is a simple templated set of guidelines defect file added list location of defect file here ticket added to pivotal for defect development team task pull request created the pull request will have additional tasks related to reviewing changes that fix this defect | 1 |
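The zero-PLR failure mode above can be illustrated with a toy coil model (a hypothetical simplification, not EnergyPlus source): when the compressor part-load ratio is zero, the outlet condition should simply equal the inlet condition rather than being blended toward the full-load outlet.

```python
def coil_outlet_temp_c(inlet_c: float, full_load_outlet_c: float,
                       plr: float) -> float:
    """Toy single-node coil model (not EnergyPlus code).

    Guarding the plr == 0 case avoids reporting a physically
    impossible outlet temperature (e.g. -46 C) when the compressor
    is off.
    """
    if plr <= 0.0:
        # Compressor off: air passes through unconditioned.
        return inlet_c
    # Linear blend toward the full-load outlet condition.
    return inlet_c + plr * (full_load_outlet_c - inlet_c)


print(coil_outlet_temp_c(25.0, -46.0, 0.0))  # 25.0, not -46.0
print(coil_outlet_temp_c(25.0, 15.0, 1.0))   # 15.0
```

The guard clause is the essence of the fix direction: no outlet-condition calculation should run off the compressor curves when the compressor is not running.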
45,832 | 13,055,754,061 | IssuesEvent | 2020-07-30 02:38:09 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | opened | trac attach file to ticket fails. (Trac #108) | Incomplete Migration Migrated from Trac defect infrastructure | Migrated from https://code.icecube.wisc.edu/ticket/108
```json
{
"status": "closed",
"changetime": "2007-12-03T15:32:37",
"description": "Trying to attach a .py script to a ticket from OS X on my MacBook.\n\nIt barfs.",
"reporter": "blaufuss",
"cc": "",
"resolution": "fixed",
"_ts": "1196695957000000",
"component": "infrastructure",
"summary": "trac attach file to ticket fails.",
"priority": "normal",
"keywords": "",
"time": "2007-08-31T18:54:13",
"milestone": "",
"owner": "cgils",
"type": "defect"
}
```
| 1.0 | trac attach file to ticket fails. (Trac #108) - Migrated from https://code.icecube.wisc.edu/ticket/108
```json
{
"status": "closed",
"changetime": "2007-12-03T15:32:37",
"description": "Trying to attach a .py script to a ticket from OS X on my MacBook.\n\nIt barfs.",
"reporter": "blaufuss",
"cc": "",
"resolution": "fixed",
"_ts": "1196695957000000",
"component": "infrastructure",
"summary": "trac attach file to ticket fails.",
"priority": "normal",
"keywords": "",
"time": "2007-08-31T18:54:13",
"milestone": "",
"owner": "cgils",
"type": "defect"
}
```
| defect | trac attach file to ticket fails trac migrated from json status closed changetime description trying to attach a py script to a ticket from os x on my macbook n nit barfs reporter blaufuss cc resolution fixed ts component infrastructure summary trac attach file to ticket fails priority normal keywords time milestone owner cgils type defect | 1 |
3,861 | 2,610,083,108 | IssuesEvent | 2015-02-26 18:25:23 | chrsmith/dsdsdaadf | https://api.github.com/repos/chrsmith/dsdsdaadf | opened | 青春痘怎么治深圳 | auto-migrated Priority-Medium Type-Defect | ```
青春痘怎么治深圳【深圳韩方科颜全国热线400-869-1818,24小时
QQ4008691818】深圳韩方科颜专业祛痘连锁机构,机构以韩国秘��
�——韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩方�
��颜专业祛痘连锁机构,采用韩国秘方配合专业“不反弹”健
康祛痘技术并结合先进“先进豪华彩光”仪,开创国内专业��
�疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸上的痘�
��。
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 6:46 | 1.0 | 青春痘怎么治深圳 - ```
青春痘怎么治深圳【深圳韩方科颜全国热线400-869-1818,24小时
QQ4008691818】深圳韩方科颜专业祛痘连锁机构,机构以韩国秘��
�——韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩方�
��颜专业祛痘连锁机构,采用韩国秘方配合专业“不反弹”健
康祛痘技术并结合先进“先进豪华彩光”仪,开创国内专业��
�疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸上的痘�
��。
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 6:46 | defect | 青春痘怎么治深圳 青春痘怎么治深圳【 , 】深圳韩方科颜专业祛痘连锁机构,机构以韩国秘�� �——韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩方� ��颜专业祛痘连锁机构,采用韩国秘方配合专业“不反弹”健 康祛痘技术并结合先进“先进豪华彩光”仪,开创国内专业�� �疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸上的痘� ��。 original issue reported on code google com by szft com on may at | 1 |
272,863 | 23,708,317,312 | IssuesEvent | 2022-08-30 04:59:12 | prettier/prettier | https://api.github.com/repos/prettier/prettier | opened | Add more tests for `graphql` parser | lang:graphql type:tests | During #13400, I found that we check `node.description` for `EnumTypeDefinition`, `UnionTypeDefinition`, and `ScalarTypeDefinition`, but we don't have any test with `description`. | 1.0 | Add more tests for `graphql` parser - During #13400, I found that we check `node.description` for `EnumTypeDefinition`, `UnionTypeDefinition`, and `ScalarTypeDefinition`, but we don't have any test with `description`. | non_defect | add more tests for graphql parser during i found that we check node description for enumtypedefinition uniontypedefinition and scalartypedefinition but we don t have any test with description | 0 |
56,998 | 15,588,771,258 | IssuesEvent | 2021-03-18 07:00:06 | scipy/scipy | https://api.github.com/repos/scipy/scipy | opened | BUG: stats: Spurious warning generated by arcsine.pdf at the ends of the support. | defect scipy.stats | Back in https://github.com/scipy/scipy/pull/7431, the `arcsine` distribution was updated so that the PDF returned `inf` at x=0 and x=1, to be consistent with the `beta(0.5, 0.5)` distribution. If we accept `inf` as the correct behavior at the endpoints, the function should not generate a warning at those points as it currently does:
```
In [2]: scipy.__version__
Out[2]: '1.7.0.dev0+f575333'
In [3]: from scipy.stats import beta, arcsine
In [4]: arcsine.pdf(0)
/Users/warren/mc37allpip/lib/python3.7/site-packages/scipy-1.7.0.dev0+f575333-py3.7-macosx-10.9-x86_64.egg/scipy/stats/_continuous_distns.py:505: RuntimeWarning: divide by zero encountered in true_divide
return 1.0/np.pi/np.sqrt(x*(1-x))
Out[4]: inf
In [5]: arcsine.pdf(1)
/Users/warren/mc37allpip/lib/python3.7/site-packages/scipy-1.7.0.dev0+f575333-py3.7-macosx-10.9-x86_64.egg/scipy/stats/_continuous_distns.py:505: RuntimeWarning: divide by zero encountered in true_divide
return 1.0/np.pi/np.sqrt(x*(1-x))
Out[5]: inf
```
The `beta` distribution does not generate a warning:
```
In [6]: beta.pdf(0, 0.5, 0.5)
Out[6]: inf
In [7]: beta.pdf(1, 0.5, 0.5)
Out[7]: inf
``` | 1.0 | BUG: stats: Spurious warning generated by arcsine.pdf at the ends of the support. - Back in https://github.com/scipy/scipy/pull/7431, the `arcsine` distribution was updated so that the PDF returned `inf` at x=0 and x=1, to be consistent with the `beta(0.5, 0.5)` distribution. If we accept `inf` as the correct behavior at the endpoints, the function should not generate a warning at those points as it currently does:
```
In [2]: scipy.__version__
Out[2]: '1.7.0.dev0+f575333'
In [3]: from scipy.stats import beta, arcsine
In [4]: arcsine.pdf(0)
/Users/warren/mc37allpip/lib/python3.7/site-packages/scipy-1.7.0.dev0+f575333-py3.7-macosx-10.9-x86_64.egg/scipy/stats/_continuous_distns.py:505: RuntimeWarning: divide by zero encountered in true_divide
return 1.0/np.pi/np.sqrt(x*(1-x))
Out[4]: inf
In [5]: arcsine.pdf(1)
/Users/warren/mc37allpip/lib/python3.7/site-packages/scipy-1.7.0.dev0+f575333-py3.7-macosx-10.9-x86_64.egg/scipy/stats/_continuous_distns.py:505: RuntimeWarning: divide by zero encountered in true_divide
return 1.0/np.pi/np.sqrt(x*(1-x))
Out[5]: inf
```
The `beta` distribution does not generate a warning:
```
In [6]: beta.pdf(0, 0.5, 0.5)
Out[6]: inf
In [7]: beta.pdf(1, 0.5, 0.5)
Out[7]: inf
``` | defect | bug stats spurious warning generated by arcsine pdf at the ends of the support back in the arcsine distribution was updated so that the pdf returned inf at x and x to be consistent with the beta distribution if we accept inf as the correct behavior at the endpoints the function should not generate a warning at those points as it currently does in scipy version out in from scipy stats import beta arcsine in arcsine pdf users warren lib site packages scipy macosx egg scipy stats continuous distns py runtimewarning divide by zero encountered in true divide return np pi np sqrt x x out inf in arcsine pdf users warren lib site packages scipy macosx egg scipy stats continuous distns py runtimewarning divide by zero encountered in true divide return np pi np sqrt x x out inf the beta distribution does not generate a warning in beta pdf out inf in beta pdf out inf | 1 |
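The transcript above shows the warning comes from the bare division in `_continuous_distns.py`. One possible remedy is to wrap the division in `np.errstate` so the endpoints still return `inf` but without the spurious warning; the sketch below is an illustration of that idea, not the actual SciPy patch:

```python
import numpy as np


def arcsine_pdf(x):
    """PDF formula from the report, with the divide-by-zero warning
    suppressed at the endpoints (hypothetical fix, not SciPy's)."""
    x = np.asarray(x, dtype=float)
    with np.errstate(divide="ignore"):
        return 1.0 / np.pi / np.sqrt(x * (1.0 - x))


print(arcsine_pdf(0.0))  # inf, with no RuntimeWarning emitted
print(arcsine_pdf(0.5))  # 0.636619... (2/pi)
```

This matches the `beta.pdf` behavior quoted above: `inf` at the endpoints, silently.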
32,084 | 13,756,257,963 | IssuesEvent | 2020-10-06 19:38:53 | cityofaustin/atd-data-tech | https://api.github.com/repos/cityofaustin/atd-data-tech | opened | Shared Mobility Data Request - Voting Trips | Product: Shared Mobility Operations Service: Dev Workgroup: PE |
Goal: Shared Mobility Services would like to analyze shared mobility trips taken during primary runoffs to plan for and support future trips to polling locations in November.
This analysis would help us understand if there is a need for shared mobility support in certain areas, e.g., increased deployments, rebalancing, provider promotions if necessary, etc.
Date range: Early voting was June 12-July 10, except July 3 and 4 because no voting took place for the July 4 holiday. July 14 was Election Day.
Early voting locations: https://countyclerk.traviscountytx.gov/images/pdfs/polling_locations/07.14.2020/PR20_and_SE20_Early_Voting_Sites_Flyer.pdf
Election Day (July 14) locations: https://countyclerk.traviscountytx.gov/images/pdfs/election_results/2020.07.14/07142020locreportingrun3_w.pdf
Fields we require:
Retention period: Until November 3, 2020 – Election Day.
Temporal precision: Hourly
| 1.0 | Shared Mobility Data Request - Voting Trips -
Goal: Shared Mobility Services would like to analyze shared mobility trips taken during primary runoffs to plan for and support future trips to polling locations in November.
This analysis would help us understand if there is a need for shared mobility support in certain areas, e.g., increased deployments, rebalancing, provider promotions if necessary, etc.
Date range: Early voting was June 12-July 10, except July 3 and 4 because no voting took place for the July 4 holiday. July 14 was Election Day.
Early voting locations: https://countyclerk.traviscountytx.gov/images/pdfs/polling_locations/07.14.2020/PR20_and_SE20_Early_Voting_Sites_Flyer.pdf
Election Day (July 14) locations: https://countyclerk.traviscountytx.gov/images/pdfs/election_results/2020.07.14/07142020locreportingrun3_w.pdf
Fields we require:
Retention period: Until November 3, 2020 – Election Day.
Temporal precision: Hourly
| non_defect | shared mobility data request voting trips goal shared mobility services would like to analyze shared mobility trips taken during primary runoffs to plan for and support future trips to polling locations in november this analysis would help us understand if there is a need for shared mobility support in certain areas e g increased deployments rebalancing provider promotions if necessary etc date range early voting was june july except july and because no voting took place for the july holiday july was election day early voting locations election day july locations fields we require retention period until november – election day temporal precision hourly | 0 |
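The date logic in the request above can be encoded directly. A minimal stdlib sketch (dates taken from the record; the function name is mine):

```python
from datetime import date, timedelta


def voting_days_2020():
    """Days to include in the trip analysis: early voting June 12
    through July 10, excluding July 3-4 (July 4 holiday), plus
    Election Day on July 14."""
    days = []
    d = date(2020, 6, 12)
    while d <= date(2020, 7, 10):
        if d not in (date(2020, 7, 3), date(2020, 7, 4)):
            days.append(d)
        d += timedelta(days=1)
    days.append(date(2020, 7, 14))  # Election Day
    return days


print(len(voting_days_2020()))  # 28
```

A trip-data query for this request would then filter shared-mobility trips to these 28 days at hourly precision.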
41,547 | 10,513,142,089 | IssuesEvent | 2019-09-27 19:46:10 | hazelcast/hazelcast | https://api.github.com/repos/hazelcast/hazelcast | closed | Network stats for REST communication is not accurate | Team: Management Center Type: Defect | Start a single member with this configuration:
```
Config config = new Config();
config.getManagementCenterConfig().setEnabled(true).setUrl("http://localhost:8083/hazelcast-mancenter");
config.getAdvancedNetworkConfig()
.setEnabled(true)
.setMemberEndpointConfig(new ServerSocketEndpointConfig().setPort(5701))
.setClientEndpointConfig(new ServerSocketEndpointConfig().setPort(5703))
.setRestEndpointConfig(new RestServerEndpointConfig()
.setPort(5702)
.enableAllGroups());
```
Send this request to the member:
```
curl -v http://localhost:5702/hazelcast/rest/cluster
```
Result is:
```
* Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 5702 (#0)
> GET /hazelcast/rest/cluster HTTP/1.1
> Host: localhost:5702
> User-Agent: curl/7.54.0
> Accept: */*
>
< HTTP/1.1 200 OK
< Content-Length: 144
<
Members {size:1, ver:1} [
Member [192.168.1.21]:5701 - 926c9885-3a55-4569-88d4-5fe1c72ebf18 this
]
ConnectionCount: 0
AllConnectionCount: 1
* Connection #0 to host localhost left intact
```
So, we should expect at least `144` bytes to be transferred as outbound traffic from the member. But, when we query for it on MC by executing this:
```
curl -v http://localhost:8083/hazelcast-mancenter/rest/clusters/dev/state > output.json
```
We get the following output:
```
"inboundNetworkStats": {
"bytesTransceived": {
"MEMCACHE": 0,
"REST": 100,
"WAN": 0,
"CLIENT": 0,
"MEMBER": 0
}
},
"outboundNetworkStats": {
"bytesTransceived": {
"MEMCACHE": 0,
"REST": 100,
"WAN": 0,
"CLIENT": 0,
"MEMBER": 0
}
}
```
And, when I continue to send the same request, both values keep increasing by `100` each time I send a new request. This might be an issue on Management Center side, but as the logic in MC side is much simpler, I suspect it's on IMDG side. | 1.0 | Network stats for REST communication is not accurate - Start a single member with this configuration:
```
Config config = new Config();
config.getManagementCenterConfig().setEnabled(true).setUrl("http://localhost:8083/hazelcast-mancenter");
config.getAdvancedNetworkConfig()
.setEnabled(true)
.setMemberEndpointConfig(new ServerSocketEndpointConfig().setPort(5701))
.setClientEndpointConfig(new ServerSocketEndpointConfig().setPort(5703))
.setRestEndpointConfig(new RestServerEndpointConfig()
.setPort(5702)
.enableAllGroups());
```
Send this request to the member:
```
curl -v http://localhost:5702/hazelcast/rest/cluster
```
Result is:
```
* Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 5702 (#0)
> GET /hazelcast/rest/cluster HTTP/1.1
> Host: localhost:5702
> User-Agent: curl/7.54.0
> Accept: */*
>
< HTTP/1.1 200 OK
< Content-Length: 144
<
Members {size:1, ver:1} [
Member [192.168.1.21]:5701 - 926c9885-3a55-4569-88d4-5fe1c72ebf18 this
]
ConnectionCount: 0
AllConnectionCount: 1
* Connection #0 to host localhost left intact
```
So, we should expect at least `144` bytes to be transferred as outbound traffic from the member. But, when we query for it on MC by executing this:
```
curl -v http://localhost:8083/hazelcast-mancenter/rest/clusters/dev/state > output.json
```
We get the following output:
```
"inboundNetworkStats": {
"bytesTransceived": {
"MEMCACHE": 0,
"REST": 100,
"WAN": 0,
"CLIENT": 0,
"MEMBER": 0
}
},
"outboundNetworkStats": {
"bytesTransceived": {
"MEMCACHE": 0,
"REST": 100,
"WAN": 0,
"CLIENT": 0,
"MEMBER": 0
}
}
```
And, when I continue to send the same request, both values keep increasing by `100` each time I send a new request. This might be an issue on Management Center side, but as the logic in MC side is much simpler, I suspect it's on IMDG side. | defect | network stats for rest communication is not accurate start a single member with this configuration config config new config config getmanagementcenterconfig setenabled true seturl config getadvancednetworkconfig setenabled true setmemberendpointconfig new serversocketendpointconfig setport setclientendpointconfig new serversocketendpointconfig setport setrestendpointconfig new restserverendpointconfig setport enableallgroups send this request to the member curl v result is trying tcp nodelay set connected to localhost port get hazelcast rest cluster http host localhost user agent curl accept http ok content length members size ver member this connectioncount allconnectioncount connection to host localhost left intact so we should expect at least bytes to be transferred as outbound traffic from the member but when we query for it on mc by executing this curl v output json we get the following output inboundnetworkstats bytestransceived memcache rest wan client member outboundnetworkstats bytestransceived memcache rest wan client member and when i continue to send the same request both values keep increasing by each time i send a new request this might be an issue on management center side but as the logic in mc side is much simpler i suspect it s on imdg side | 1 |
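One way to sanity-check a byte counter like this is to measure the raw bytes on the wire yourself, headers included, and compare them against what the stats endpoint reports. The stdlib sketch below (not Hazelcast code) shows that wire traffic for an HTTP response necessarily exceeds the `Content-Length` of the body alone, so a correct outbound counter could never report fewer bytes than the body size:

```python
import socket
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer


class Handler(BaseHTTPRequestHandler):
    BODY = b"Members {size:1, ver:1} [ ... ]\n"

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", str(len(self.BODY)))
        self.end_headers()
        self.wfile.write(self.BODY)

    def log_message(self, *args):  # keep the demo quiet
        pass


server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Speak raw HTTP over a socket so every byte gets counted.
with socket.create_connection(server.server_address) as s:
    s.sendall(b"GET /hazelcast/rest/cluster HTTP/1.1\r\n"
              b"Host: test\r\nConnection: close\r\n\r\n")
    wire = b""
    while True:
        chunk = s.recv(4096)
        if not chunk:
            break
        wire += chunk

server.shutdown()
print(len(Handler.BODY), "body bytes;", len(wire), "bytes on the wire")
```

Applied to the case above, a 144-byte body should yield well over 144 outbound bytes, so the reported `REST: 100` is clearly undercounting.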
11,215 | 7,109,153,898 | IssuesEvent | 2018-01-17 04:22:12 | bwsw/cloudstack-ui | https://api.github.com/repos/bwsw/cloudstack-ui | closed | Dialog does not close after creation of SSH key with not autogenerated public key | bug statistics bug: regression statistics bug: usability | **Steps to reproduce:**
- Go to SSH keys and try to create new one
- Input your own public key
**Expected result:** SSH key was created, dialog was closed
**Actual result:** SSH key was created, but dialog is still open
 | True | Dialog does not close after creation of SSH key with not autogenerated public key - **Steps to reproduce:**
- Go to SSH keys and try to create new one
- Input your own public key
**Expected result:** SSH key was created, dialog was closed
**Actual result:** SSH key was created, but dialog is still open
 | non_defect | dialog does not close after creation of ssh key with not autogenerated public key steps to reproduce go to ssh keys and try to create new one input your own public key expected result ssh key was created dialog was closed actual result ssh key was created but dialog is still open | 0 |
14,847 | 18,242,123,449 | IssuesEvent | 2021-10-01 14:04:10 | pycaret/pycaret | https://api.github.com/repos/pycaret/pycaret | closed | ValueError: dtype for the target variable should be int32 or int64 only | bug preprocessing | **Describe the bug**
<!--
-->
In regression it is usual for variables, including the target, to be floats.
In PyCaret version 2.3.4 I get the following error in regression when setting `remove_multicollinearity=True`; with `False` the problem does not occur.
ERROR: pycaret regression ValueError: dtype for the target variable should be int32 or int64 only
I understand that it should support float and integer values.
FILE: /usr/local/lib/python3.7/dist-packages/pycaret/internal/preprocess.py
rows 2515, 2516
class Fix_multicollinearity(BaseEstimator, TransformerMixin):
def fit(self, data, y=None):
...
if data[self.target_variable].dtype not in ["int32", "int64"]:
raise ValueError('dtype for the target variable should be int32 or int64 only')
...
**To Reproduce**
<!--
-->
from pycaret.datasets import get_data
dataset = get_data('diamond')
dataset['Price'] = dataset['Price'].astype('float32')
from pycaret.regression import *
exp_reg101 = setup(data = dataset, target = 'Price', session_id=123,
remove_multicollinearity = True, multicollinearity_threshold = 0.95)
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-9-f41871135e22> in <module>()
1 from pycaret.regression import *
2 exp_reg101 = setup(data = dataset, target = 'Price', session_id=123,
----> 3 remove_multicollinearity = True, multicollinearity_threshold = 0.95)
7 frames
/usr/local/lib/python3.7/dist-packages/pycaret/internal/preprocess.py in fit(self, data, y)
2514
2515 if data[self.target_variable].dtype not in ["int32", "int64"]:
-> 2516 raise ValueError('dtype for the target variable should be int32 or int64 only')
2517
2518 # global data1
ValueError: dtype for the target variable should be int32 or int64 only
```python
from pycaret.datasets import get_data
dataset = get_data('diamond')
dataset['Price'] = dataset['Price'].astype('float32')
from pycaret.regression import *
exp_reg101 = setup(data = dataset, target = 'Price', session_id=123,
remove_multicollinearity = True, multicollinearity_threshold = 0.95)
```
**Expected behavior**
<!--
A clear and concise description of what you expected to happen.
-->
**Additional context**
<!--
Add any other context about the problem here.
-->
I understand that it should support float and integer values.
**Versions**
<2.3.4>
<!--
Please run the following code snippet and paste the output here:
import pycaret
pycaret.__version__
-->
</details>
<!-- Thanks for contributing! -->
| 1.0 | ValueError: dtype for the target variable should be int32 or int64 only - **Describe the bug**
<!--
-->
In regression it is usual for variables, including the target, to be floats.
In PyCaret version 2.3.4 I get the following error in regression when setting `remove_multicollinearity=True`; with `False` the problem does not occur.
ERROR: pycaret regression ValueError: dtype for the target variable should be int32 or int64 only
I understand that it should support float and integer values.
FILE: /usr/local/lib/python3.7/dist-packages/pycaret/internal/preprocess.py
rows 2515, 2516
class Fix_multicollinearity(BaseEstimator, TransformerMixin):
def fit(self, data, y=None):
...
if data[self.target_variable].dtype not in ["int32", "int64"]:
raise ValueError('dtype for the target variable should be int32 or int64 only')
...
**To Reproduce**
<!--
-->
from pycaret.datasets import get_data
dataset = get_data('diamond')
dataset['Price'] = dataset['Price'].astype('float32')
from pycaret.regression import *
exp_reg101 = setup(data = dataset, target = 'Price', session_id=123,
remove_multicollinearity = True, multicollinearity_threshold = 0.95)
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-9-f41871135e22> in <module>()
1 from pycaret.regression import *
2 exp_reg101 = setup(data = dataset, target = 'Price', session_id=123,
----> 3 remove_multicollinearity = True, multicollinearity_threshold = 0.95)
7 frames
/usr/local/lib/python3.7/dist-packages/pycaret/internal/preprocess.py in fit(self, data, y)
2514
2515 if data[self.target_variable].dtype not in ["int32", "int64"]:
-> 2516 raise ValueError('dtype for the target variable should be int32 or int64 only')
2517
2518 # global data1
ValueError: dtype for the target variable should be int32 or int64 only
```python
from pycaret.datasets import get_data
dataset = get_data('diamond')
dataset['Price'] = dataset['Price'].astype('float32')
from pycaret.regression import *
exp_reg101 = setup(data = dataset, target = 'Price', session_id=123,
remove_multicollinearity = True, multicollinearity_threshold = 0.95)
```
**Expected behavior**
<!--
A clear and concise description of what you expected to happen.
-->
**Additional context**
<!--
Add any other context about the problem here.
-->
I understand that it should support float and integer values.
**Versions**
<2.3.4>
<!--
Please run the following code snippet and paste the output here:
import pycaret
pycaret.__version__
-->
</details>
<!-- Thanks for contributing! -->
| non_defect | valueerror dtype for the target variable should be or only describe the bug in regression it is usual to use independent variables in float in pycaret version i get the following error in regression when activating remove multicollinearity true in false it does not give the problem error pycaret regression valueerror dtype for the target variable should be or only i understand that it should support float and integer values file usr local lib dist packages pycaret internal preprocess py rows class fix multicollinearity baseestimator transformermixin def fit self data y none if data dtype not in raise valueerror dtype for the target variable should be or only to reproduce from pycaret datasets import get data dataset get data diamond dataset dataset astype from pycaret regression import exp setup data dataset target price session id remove multicollinearity true multicollinearity threshold valueerror traceback most recent call last in from pycaret regression import exp setup data dataset target price session id remove multicollinearity true multicollinearity threshold frames usr local lib dist packages pycaret internal preprocess py in fit self data y if data dtype not in raise valueerror dtype for the target variable should be or only global valueerror dtype for the target variable should be or only python from pycaret datasets import get data dataset get data diamond dataset dataset astype from pycaret regression import exp setup data dataset target price session id remove multicollinearity true multicollinearity threshold expected behavior a clear and concise description of what you expected to happen additional context add any other context about the problem here i understand that it should support float and integer values versions please run the following code snippet and paste the output here import pycaret pycaret version | 0 |
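The check quoted above rejects any target that is not `int32`/`int64`. A relaxed version that accepts all numeric dtypes (including `float32`) could look like the sketch below; this is an illustration of the idea, not the actual PyCaret patch:

```python
import numpy as np


def check_target_dtype(values) -> None:
    """Accept any numeric target dtype instead of only int32/int64.

    Illustrative stand-in for the PyCaret check; the real code tests
    a pandas Series, where pandas.api.types.is_numeric_dtype would be
    the natural equivalent.
    """
    arr = np.asarray(values)
    if not np.issubdtype(arr.dtype, np.number):
        raise ValueError(
            "dtype for the target variable should be numeric")


check_target_dtype(np.array([326.0, 1200.5], dtype="float32"))  # ok
check_target_dtype([1, 2, 3])                                   # ok
print("float and integer targets accepted")
```

With a check like this, casting `Price` to `float32` before `setup(..., remove_multicollinearity=True)` would no longer raise.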
39,159 | 9,222,662,694 | IssuesEvent | 2019-03-11 23:49:40 | supertuxkart/stk-code | https://api.github.com/repos/supertuxkart/stk-code | closed | LOD issues in local multiplayer | C:Graphics P3: normal T: defect | I had a quick look and the LOD issue can be easily reproduced in cocoa temple with local multiplayer. When you move one kart, palms are flickering for both players. It looks that getLevel() must be executed every time, because it depends on camera position, so it's different for different players. Or current/previous level would have to be remembered for every player. @samuncle I assume that you want to fix it yourself? | 1.0 | LOD issues in local multiplayer - I had a quick look and the LOD issue can be easily reproduced in cocoa temple with local multiplayer. When you move one kart, palms are flickering for both players. It looks that getLevel() must be executed every time, because it depends on camera position, so it's different for different players. Or current/previous level would have to be remembered for every player. @samuncle I assume that you want to fix it yourself? | defect | lod issues in local multiplayer i had a quick look and the lod issue can be easily reproduced in cocoa temple with local multiplayer when you move one kart palms are flickering for both players it looks that getlevel must be executed every time because it depends on camera position so it s different for different players or current previous level would have to be remembered for every player samuncle i assume that you want to fix it yourself | 1 |
126,407 | 4,994,918,011 | IssuesEvent | 2016-12-09 08:02:52 | pmem/issues | https://api.github.com/repos/pmem/issues | closed | windows/pmem: pmem_map_file closes file after creating it | Exposure: Critical OS: Windows Priority: 2 high Type: Bug | It causes the loss of the file handle, which is required to perform further operations such as msync
Found on: 1.1-583-g971769f
| 1.0 | windows/pmem: pmem_map_file closes file after creating it - It causes losing file handle which is required to perform further operations like msync etc
Found on: 1.1-583-g971769f
| non_defect | windows pmem pmem map file closes file after creating it it causes losing file handle which is required to perform further operations like msync etc found on | 0 |
577,949 | 17,139,651,985 | IssuesEvent | 2021-07-13 08:12:59 | status-im/StatusQ | https://api.github.com/repos/status-im/StatusQ | closed | `StatusBaseInput` needs focussed state | module: controls priority 2: required type: feature | The current `Input` component in Status Desktop comes with a proper focussed state, which adds a blue ring around the control:
<img width="246" alt="Screenshot 2021-07-08 at 11 35 48" src="https://user-images.githubusercontent.com/445106/124899687-a642dc00-dfe0-11eb-9165-3210389f7d21.png">
We should enable that in `StatusBaseInput` as well or at least make it configurable. | 1.0 | `StatusBaseInput` needs focussed state - The current `Input` component in Status Desktop comes with a proper focussed state, which adds a blue ring around the control:
<img width="246" alt="Screenshot 2021-07-08 at 11 35 48" src="https://user-images.githubusercontent.com/445106/124899687-a642dc00-dfe0-11eb-9165-3210389f7d21.png">
We should enable that in `StatusBaseInput` as well or at least make it configurable. | non_defect | statusbaseinput needs focussed state the current input component in status desktop comes with a proper focussed state which adds a blue ring around the control img width alt screenshot at src we should enable that in statusbaseinput as well or at least make it configurable | 0 |
4,653 | 7,495,777,941 | IssuesEvent | 2018-04-08 01:13:35 | gkiar/reading | https://api.github.com/repos/gkiar/reading | closed | Paper: testname | processing | URL: [http://testurl.io](http://testurl.io)
## This paper does...
testdo
## This paper does not...
test note
## Other comments?
it works!
| 1.0 | Paper: testname - URL: [http://testurl.io](http://testurl.io)
## This paper does...
testdo
## This paper does not...
test note
## Other comments?
it works!
| non_defect | paper testname url this paper does testdo this paper does not test note other comments it works | 0 |
91,369 | 10,719,543,317 | IssuesEvent | 2019-10-26 10:49:42 | elsa-workflows/elsa-core | https://api.github.com/repos/elsa-workflows/elsa-core | closed | Documentation: What is it, why is it here, how to get started | documentation | We need a section on getting started. | 1.0 | Documentation: What is it, why is it here, how to get started - We need a section on getting started. | non_defect | documentation what is it why is it here how to get started we need a section on getting started | 0 |
69,046 | 8,371,330,757 | IssuesEvent | 2018-10-05 00:00:27 | mozilla/foundation.mozilla.org | https://api.github.com/repos/mozilla/foundation.mozilla.org | closed | Buyers Guide: finalize home page design | design | This is sooo close! https://redpen.io/p/qz6bdb2811e058ffb1
Last few things to finalize:
- update main nav requirements in dev ticket - Buyers Guide: nav and footer #1818
- Nat to holiday-ify the hero background image #1870
- thumbs up/down icon
- share/donate menu on mobile
- look of filters inside the filter bubble | 1.0 | Buyers Guide: finalize home page design - This is sooo close! https://redpen.io/p/qz6bdb2811e058ffb1
Last few things to finalize:
- update main nav requirements in dev ticket - Buyers Guide: nav and footer #1818
- Nat to holiday-ify the hero background image #1870
- thumbs up/down icon
- share/donate menu on mobile
- look of filters inside the filter bubble | non_defect | buyers guide finalize home page design this is sooo close last few things to finalize update main nav requirements in dev ticket buyers guide nav and footer nat to holiday ify the hero background image thumbs up down icon share donate menu on mobile look of filters inside the filter bubble | 0 |
79,126 | 28,004,250,943 | IssuesEvent | 2023-03-27 14:23:34 | matrix-org/matrix-appservice-bridge | https://api.github.com/repos/matrix-org/matrix-appservice-bridge | opened | ProvisionerApi.OnError is not applied to new routes added by addRoute | S-Major T-Defect | Since it is configured in the constructor and therefore in the wrong part of the routing tree. | 1.0 | ProvisionerApi.OnError is not applied to new routes added by addRoute - Since it is configured in the constructor and therefore in the wrong part of the routing tree. | defect | provisionerapi onerror is not applied to new routes added by addroute since it is configured in the constructor and therefore in the wrong part of the routing tree | 1 |
51,605 | 13,207,534,262 | IssuesEvent | 2020-08-14 23:29:12 | icecube-trac/tix4 | https://api.github.com/repos/icecube-trac/tix4 | opened | port photonics_1.73 does not build on SL5 64 bit (Trac #686) | Incomplete Migration Migrated from Trac booking defect | <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/686">https://code.icecube.wisc.edu/projects/icecube/ticket/686</a>, reported by boersma</summary>
<p>
```json
{
"status": "closed",
"changetime": "2012-06-22T16:05:07",
"_ts": "1340381107000000",
"description": "Scientific Linux release 5.8 (Boron)\n\n{{{\n$ gcc --version\ngcc (GCC) 4.1.2 20080704 (Red Hat 4.1.2-52)\nCopyright (C) 2006 Free Software Foundation, Inc.\nThis is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.\n}}}\n\n{{{\n$ ./bin/port -vd install photonics_1.73\nDEBUG: Found port in file:///net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/sources/rsync.code.icecube.wisc.edu_icecube-tools-ports/science/photonics_1.73\nDEBUG: Changing to port directory: /net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/sources/rsync.code.icecube.wisc.edu_icecube-tools-ports/science/photonics_1.73\nDEBUG: Requested variant x86_64 is not provided by port photonics_1.73.\nDEBUG: Executing variant linux provides linux\nDEBUG: Executing com.apple.main (photonics_1.73)\nDEBUG: No TGZ archive: /net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/packages/linux/x86_64/photonics_1.73-1.73_0.x86_64.tgz\nDEBUG: Skipping unarchive (photonics_1.73) since no archive found\nDEBUG: Skipping completed com.apple.unarchive (photonics_1.73)\n---> Fetching photonics_1.73\nDEBUG: Executing com.apple.fetch (photonics_1.73)\n---> photonics-1.73.tar.bz2 doesn't seem to exist in /net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/distfiles/photonics_1.73\n---> Attempting to fetch photonics-1.73.tar.bz2 from http://kent.dl.sourceforge.net/photonics\nDEBUG: Assembled command: 'cd \"/net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/distfiles/photonics_1.73\" && curl -f -L -o photonics-1.73.tar.bz2.TMP http://kent.dl.sourceforge.net/photonics/photonics-1.73.tar.bz2'\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n100 485k 100 485k 0 0 299k 0 0:00:01 0:00:01 --:--:-- 10.4M\n---> Verifying checksum(s) for photonics_1.73\nDEBUG: Executing com.apple.checksum (photonics_1.73)\n---> Checksumming photonics-1.73.tar.bz2\nDEBUG: Correct 
(md5) checksum for photonics-1.73.tar.bz2\nDEBUG: setting option extract.cmd to /usr/bin/bzip2\n---> Extracting photonics_1.73\nDEBUG: Executing com.apple.extract (photonics_1.73)\n---> Extracting photonics-1.73.tar.bz2\nDEBUG: setting option extract.args to /net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/distfiles/photonics_1.73/photonics-1.73.tar.bz2\nDEBUG: Assembled command: 'cd \"/net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work\" && /usr/bin/bzip2 -dc /net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/distfiles/photonics_1.73/photonics-1.73.tar.bz2 | tar --no-same-owner -xf -'\nDEBUG: Executing com.apple.patch (photonics_1.73)\n---> Configuring photonics_1.73\nDEBUG: Executing com.apple.configure (photonics_1.73)\nDEBUG: Assembled command: 'cd \"/net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73\" && CFLAGS=-fPIC ./configure --prefix=/net/software_icecube/SL5-py26/I3_PORTS --enable-optimize'\nchecking for a BSD-compatible install... /usr/bin/install -c\nchecking whether build environment is sane... yes\nchecking for a thread-safe mkdir -p... /bin/mkdir -p\nchecking for gawk... gawk\nchecking whether make sets $(MAKE)... yes\nchecking for gawk... (cached) gawk\nchecking for gcc... gcc\nchecking for C compiler default output file name... a.out\nchecking whether the C compiler works... yes\nchecking whether we are cross compiling... no\nchecking for suffix of executables... \nchecking for suffix of object files... o\nchecking whether we are using the GNU C compiler... yes\nchecking whether gcc accepts -g... yes\nchecking for gcc option to accept ISO C89... none needed\nchecking for style of include used by make... 
GNU\nchecking dependency style of gcc... gcc3\nchecking how to run the C preprocessor... gcc -E\nchecking for g++... g++\nchecking whether we are using the GNU C++ compiler... yes\nchecking whether g++ accepts -g... yes\nchecking dependency style of g++... gcc3\nchecking how to run the C++ preprocessor... g++ -E\nchecking for a BSD-compatible install... /usr/bin/install -c\nchecking whether ln -s works... yes\nchecking whether make sets $(MAKE)... (cached) yes\nchecking build system type... x86_64-unknown-linux-gnu\nchecking host system type... x86_64-unknown-linux-gnu\nchecking for a sed that does not truncate output... /bin/sed\nchecking for grep that handles long lines and -e... /bin/grep\nchecking for egrep... /bin/grep -E\nchecking for fgrep... /bin/grep -F\nchecking for ld used by gcc... /usr/bin/ld\nchecking if the linker (/usr/bin/ld) is GNU ld... yes\nchecking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B\nchecking the name lister (/usr/bin/nm -B) interface... BSD nm\nchecking the maximum length of command line arguments... 98304\nchecking whether the shell understands some XSI constructs... yes\nchecking whether the shell understands \"+=\"... yes\nchecking for /usr/bin/ld option to reload object files... -r\nchecking how to recognize dependent libraries... pass_all\nchecking for ar... ar\nchecking for strip... strip\nchecking for ranlib... ranlib\nchecking command to parse /usr/bin/nm -B output from gcc object... ok\nchecking for ANSI C header files... yes\nchecking for sys/types.h... yes\nchecking for sys/stat.h... yes\nchecking for stdlib.h... yes\nchecking for string.h... yes\nchecking for memory.h... yes\nchecking for strings.h... yes\nchecking for inttypes.h... yes\nchecking for stdint.h... yes\nchecking for unistd.h... yes\nchecking for dlfcn.h... yes\nchecking whether we are using the GNU C++ compiler... (cached) yes\nchecking whether g++ accepts -g... (cached) yes\nchecking dependency style of g++... 
(cached) gcc3\nchecking how to run the C++ preprocessor... g++ -E\nchecking for objdir... .libs\nchecking if gcc supports -fno-rtti -fno-exceptions... no\nchecking for gcc option to produce PIC... -fPIC -DPIC\nchecking if gcc PIC flag -fPIC -DPIC works... yes\nchecking if gcc static flag -static works... yes\nchecking if gcc supports -c -o file.o... yes\nchecking if gcc supports -c -o file.o... (cached) yes\nchecking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes\nchecking whether -lc should be explicitly linked in... no\nchecking dynamic linker characteristics... GNU/Linux ld.so\nchecking how to hardcode library paths into programs... immediate\nchecking whether stripping libraries is possible... yes\nchecking if libtool supports shared libraries... yes\nchecking whether to build shared libraries... yes\nchecking whether to build static libraries... yes\nchecking for ld used by g++... /usr/bin/ld -m elf_x86_64\nchecking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes\nchecking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes\nchecking for g++ option to produce PIC... -fPIC -DPIC\nchecking if g++ PIC flag -fPIC -DPIC works... yes\nchecking if g++ static flag -static works... yes\nchecking if g++ supports -c -o file.o... yes\nchecking if g++ supports -c -o file.o... (cached) yes\nchecking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes\nchecking dynamic linker characteristics... GNU/Linux ld.so\nchecking how to hardcode library paths into programs... immediate\nchecking for ANSI C header files... (cached) yes\nchecking for stdbool.h that conforms to C99... yes\nchecking for _Bool... yes\nchecking limits.h usability... yes\nchecking limits.h presence... yes\nchecking for limits.h... yes\nchecking malloc.h usability... yes\nchecking malloc.h presence... yes\nchecking for malloc.h... yes\nchecking for an ANSI C-conforming const... 
yes\nchecking for size_t... yes\nchecking for int8_t... yes\nchecking for int16_t... yes\nchecking for int32_t... yes\nchecking for int64_t... yes\nchecking for off_t... yes\nchecking for stdlib.h... (cached) yes\nchecking for GNU libc compatible malloc... yes\nchecking for stdlib.h... (cached) yes\nchecking for unistd.h... (cached) yes\nchecking for getpagesize... yes\nchecking for working mmap... yes\nchecking for stdlib.h... (cached) yes\nchecking for GNU libc compatible realloc... yes\nchecking for working strtod... yes\nchecking for strstr... yes\nchecking for strtod... (cached) yes\nchecking for strtol... yes\nchecking for strerror... yes\nchecking for memset... yes\nchecking for floor... no\nchecking for library containing floor... -lm\nchecking for pow... yes\nchecking for sqrt... yes\nchecking whether to enable debug mode... yes\ndisabled cernlib dependent code... yes\nconfigure: creating ./config.status\nconfig.status: creating Makefile\nconfig.status: creating lib/Makefile\nconfig.status: creating src/Makefile\nconfig.status: creating ice/Makefile\nconfig.status: creating scripts/Makefile\nconfig.status: creating amasim/Makefile\nconfig.status: creating level2/Makefile\nconfig.status: creating config.h\nconfig.status: executing depfiles commands\nconfig.status: executing libtool commands\n-------------------------------------------------------------------\n Photonics: \"1.73: pyrosoma r4\"\n please refer to the 'INSTALL' file for further instructions.\n Hints:\n\n Building photonics......................'make'\n Compiling tool directory................'make tool'\n Performing post compile test............'make tests'\n All of the above........................'make everything'\n\n Clean objects and binaries..............'make clean'\n Clean tool directory....................'make toolclean'\n Remove traces of previous configure.....'make distclean'\n\n Also consider trying 'scripts/install_in_icetray.sh --help'\n\n\n---> Building photonics_1.73 with 
target all\nDEBUG: Executing com.apple.build (photonics_1.73)\nDEBUG: Assembled command: 'cd \"/net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73\" && CFLAGS=-fPIC make all'\nmake all-recursive\nmake[1]: Entering directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73'\nMaking all in lib\nmake[2]: Entering directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73/lib'\n/bin/sh ../libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I.. -O3 -funroll-loops -fmerge-all-constants -march=native -mtune=native -g -O2 -Wall -fno-inline -fPIC -MT boundary.lo -MD -MP -MF .deps/boundary.Tpo -c -o boundary.lo boundary.c\nlibtool: compile: gcc -DHAVE_CONFIG_H -I. -I.. 
-O3 -funroll-loops -fmerge-all-constants -march=native -mtune=native -g -O2 -Wall -fno-inline -fPIC -MT boundary.lo -MD -MP -MF .deps/boundary.Tpo -c boundary.c -fPIC -DPIC -o .libs/boundary.o\nboundary.c:1: error: bad value (native) for -march= switch\nboundary.c:1: error: bad value (native) for -mtune= switch\nmake[2]: *** [boundary.lo] Error 1\nmake[2]: Leaving directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73/lib'\nmake[1]: *** [all-recursive] Error 1\nmake[1]: Leaving directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73'\nmake: *** [all] Error 2\nError: Target com.apple.build returned: shell command \"cd \"/net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73\" && CFLAGS=-fPIC make all\" returned error 2\nCommand output: make all-recursive\nmake[1]: Entering directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73'\nMaking all in lib\nmake[2]: Entering directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73/lib'\n/bin/sh ../libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I.. 
-O3 -funroll-loops -fmerge-all-constants -march=native -mtune=native -g -O2 -Wall -fno-inline -fPIC -MT boundary.lo -MD -MP -MF .deps/boundary.Tpo -c -o boundary.lo boundary.c\nlibtool: compile: gcc -DHAVE_CONFIG_H -I. -I.. -O3 -funroll-loops -fmerge-all-constants -march=native -mtune=native -g -O2 -Wall -fno-inline -fPIC -MT boundary.lo -MD -MP -MF .deps/boundary.Tpo -c boundary.c -fPIC -DPIC -o .libs/boundary.o\nboundary.c:1: error: bad value (native) for -march= switch\nboundary.c:1: error: bad value (native) for -mtune= switch\nmake[2]: *** [boundary.lo] Error 1\nmake[2]: Leaving directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73/lib'\nmake[1]: *** [all-recursive] Error 1\nmake[1]: Leaving directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73'\nmake: *** [all] Error 2\n\nWarning: the following items did not execute (for photonics_1.73): com.apple.activate com.apple.build com.apple.destroot com.apple.archive com.apple.install\nicecubemgr@lx3b48:/net/software_icecube/SL5-py26/I3_PORTS\n}}}",
"reporter": "boersma",
"cc": "",
"resolution": "worksforme",
"time": "2012-06-22T15:13:34",
"component": "booking",
"summary": "port photonics_1.73 does not build on SL5 64 bit",
"priority": "normal",
"keywords": "photonics I3_PORTS gcc",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
| 1.0 | port photonics_1.73 does not build on SL5 64 bit (Trac #686) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/686">https://code.icecube.wisc.edu/projects/icecube/ticket/686</a>, reported by boersma</summary>
<p>
```json
{
"status": "closed",
"changetime": "2012-06-22T16:05:07",
"_ts": "1340381107000000",
"description": "Scientific Linux release 5.8 (Boron)\n\n{{{\n$ gcc --version\ngcc (GCC) 4.1.2 20080704 (Red Hat 4.1.2-52)\nCopyright (C) 2006 Free Software Foundation, Inc.\nThis is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.\n}}}\n\n{{{\n$ ./bin/port -vd install photonics_1.73\nDEBUG: Found port in file:///net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/sources/rsync.code.icecube.wisc.edu_icecube-tools-ports/science/photonics_1.73\nDEBUG: Changing to port directory: /net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/sources/rsync.code.icecube.wisc.edu_icecube-tools-ports/science/photonics_1.73\nDEBUG: Requested variant x86_64 is not provided by port photonics_1.73.\nDEBUG: Executing variant linux provides linux\nDEBUG: Executing com.apple.main (photonics_1.73)\nDEBUG: No TGZ archive: /net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/packages/linux/x86_64/photonics_1.73-1.73_0.x86_64.tgz\nDEBUG: Skipping unarchive (photonics_1.73) since no archive found\nDEBUG: Skipping completed com.apple.unarchive (photonics_1.73)\n---> Fetching photonics_1.73\nDEBUG: Executing com.apple.fetch (photonics_1.73)\n---> photonics-1.73.tar.bz2 doesn't seem to exist in /net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/distfiles/photonics_1.73\n---> Attempting to fetch photonics-1.73.tar.bz2 from http://kent.dl.sourceforge.net/photonics\nDEBUG: Assembled command: 'cd \"/net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/distfiles/photonics_1.73\" && curl -f -L -o photonics-1.73.tar.bz2.TMP http://kent.dl.sourceforge.net/photonics/photonics-1.73.tar.bz2'\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n100 485k 100 485k 0 0 299k 0 0:00:01 0:00:01 --:--:-- 10.4M\n---> Verifying checksum(s) for photonics_1.73\nDEBUG: Executing com.apple.checksum (photonics_1.73)\n---> Checksumming photonics-1.73.tar.bz2\nDEBUG: Correct 
(md5) checksum for photonics-1.73.tar.bz2\nDEBUG: setting option extract.cmd to /usr/bin/bzip2\n---> Extracting photonics_1.73\nDEBUG: Executing com.apple.extract (photonics_1.73)\n---> Extracting photonics-1.73.tar.bz2\nDEBUG: setting option extract.args to /net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/distfiles/photonics_1.73/photonics-1.73.tar.bz2\nDEBUG: Assembled command: 'cd \"/net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work\" && /usr/bin/bzip2 -dc /net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/distfiles/photonics_1.73/photonics-1.73.tar.bz2 | tar --no-same-owner -xf -'\nDEBUG: Executing com.apple.patch (photonics_1.73)\n---> Configuring photonics_1.73\nDEBUG: Executing com.apple.configure (photonics_1.73)\nDEBUG: Assembled command: 'cd \"/net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73\" && CFLAGS=-fPIC ./configure --prefix=/net/software_icecube/SL5-py26/I3_PORTS --enable-optimize'\nchecking for a BSD-compatible install... /usr/bin/install -c\nchecking whether build environment is sane... yes\nchecking for a thread-safe mkdir -p... /bin/mkdir -p\nchecking for gawk... gawk\nchecking whether make sets $(MAKE)... yes\nchecking for gawk... (cached) gawk\nchecking for gcc... gcc\nchecking for C compiler default output file name... a.out\nchecking whether the C compiler works... yes\nchecking whether we are cross compiling... no\nchecking for suffix of executables... \nchecking for suffix of object files... o\nchecking whether we are using the GNU C compiler... yes\nchecking whether gcc accepts -g... yes\nchecking for gcc option to accept ISO C89... none needed\nchecking for style of include used by make... 
GNU\nchecking dependency style of gcc... gcc3\nchecking how to run the C preprocessor... gcc -E\nchecking for g++... g++\nchecking whether we are using the GNU C++ compiler... yes\nchecking whether g++ accepts -g... yes\nchecking dependency style of g++... gcc3\nchecking how to run the C++ preprocessor... g++ -E\nchecking for a BSD-compatible install... /usr/bin/install -c\nchecking whether ln -s works... yes\nchecking whether make sets $(MAKE)... (cached) yes\nchecking build system type... x86_64-unknown-linux-gnu\nchecking host system type... x86_64-unknown-linux-gnu\nchecking for a sed that does not truncate output... /bin/sed\nchecking for grep that handles long lines and -e... /bin/grep\nchecking for egrep... /bin/grep -E\nchecking for fgrep... /bin/grep -F\nchecking for ld used by gcc... /usr/bin/ld\nchecking if the linker (/usr/bin/ld) is GNU ld... yes\nchecking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B\nchecking the name lister (/usr/bin/nm -B) interface... BSD nm\nchecking the maximum length of command line arguments... 98304\nchecking whether the shell understands some XSI constructs... yes\nchecking whether the shell understands \"+=\"... yes\nchecking for /usr/bin/ld option to reload object files... -r\nchecking how to recognize dependent libraries... pass_all\nchecking for ar... ar\nchecking for strip... strip\nchecking for ranlib... ranlib\nchecking command to parse /usr/bin/nm -B output from gcc object... ok\nchecking for ANSI C header files... yes\nchecking for sys/types.h... yes\nchecking for sys/stat.h... yes\nchecking for stdlib.h... yes\nchecking for string.h... yes\nchecking for memory.h... yes\nchecking for strings.h... yes\nchecking for inttypes.h... yes\nchecking for stdint.h... yes\nchecking for unistd.h... yes\nchecking for dlfcn.h... yes\nchecking whether we are using the GNU C++ compiler... (cached) yes\nchecking whether g++ accepts -g... (cached) yes\nchecking dependency style of g++... 
(cached) gcc3\nchecking how to run the C++ preprocessor... g++ -E\nchecking for objdir... .libs\nchecking if gcc supports -fno-rtti -fno-exceptions... no\nchecking for gcc option to produce PIC... -fPIC -DPIC\nchecking if gcc PIC flag -fPIC -DPIC works... yes\nchecking if gcc static flag -static works... yes\nchecking if gcc supports -c -o file.o... yes\nchecking if gcc supports -c -o file.o... (cached) yes\nchecking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes\nchecking whether -lc should be explicitly linked in... no\nchecking dynamic linker characteristics... GNU/Linux ld.so\nchecking how to hardcode library paths into programs... immediate\nchecking whether stripping libraries is possible... yes\nchecking if libtool supports shared libraries... yes\nchecking whether to build shared libraries... yes\nchecking whether to build static libraries... yes\nchecking for ld used by g++... /usr/bin/ld -m elf_x86_64\nchecking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes\nchecking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes\nchecking for g++ option to produce PIC... -fPIC -DPIC\nchecking if g++ PIC flag -fPIC -DPIC works... yes\nchecking if g++ static flag -static works... yes\nchecking if g++ supports -c -o file.o... yes\nchecking if g++ supports -c -o file.o... (cached) yes\nchecking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes\nchecking dynamic linker characteristics... GNU/Linux ld.so\nchecking how to hardcode library paths into programs... immediate\nchecking for ANSI C header files... (cached) yes\nchecking for stdbool.h that conforms to C99... yes\nchecking for _Bool... yes\nchecking limits.h usability... yes\nchecking limits.h presence... yes\nchecking for limits.h... yes\nchecking malloc.h usability... yes\nchecking malloc.h presence... yes\nchecking for malloc.h... yes\nchecking for an ANSI C-conforming const... 
yes\nchecking for size_t... yes\nchecking for int8_t... yes\nchecking for int16_t... yes\nchecking for int32_t... yes\nchecking for int64_t... yes\nchecking for off_t... yes\nchecking for stdlib.h... (cached) yes\nchecking for GNU libc compatible malloc... yes\nchecking for stdlib.h... (cached) yes\nchecking for unistd.h... (cached) yes\nchecking for getpagesize... yes\nchecking for working mmap... yes\nchecking for stdlib.h... (cached) yes\nchecking for GNU libc compatible realloc... yes\nchecking for working strtod... yes\nchecking for strstr... yes\nchecking for strtod... (cached) yes\nchecking for strtol... yes\nchecking for strerror... yes\nchecking for memset... yes\nchecking for floor... no\nchecking for library containing floor... -lm\nchecking for pow... yes\nchecking for sqrt... yes\nchecking whether to enable debug mode... yes\ndisabled cernlib dependent code... yes\nconfigure: creating ./config.status\nconfig.status: creating Makefile\nconfig.status: creating lib/Makefile\nconfig.status: creating src/Makefile\nconfig.status: creating ice/Makefile\nconfig.status: creating scripts/Makefile\nconfig.status: creating amasim/Makefile\nconfig.status: creating level2/Makefile\nconfig.status: creating config.h\nconfig.status: executing depfiles commands\nconfig.status: executing libtool commands\n-------------------------------------------------------------------\n Photonics: \"1.73: pyrosoma r4\"\n please refer to the 'INSTALL' file for further instructions.\n Hints:\n\n Building photonics......................'make'\n Compiling tool directory................'make tool'\n Performing post compile test............'make tests'\n All of the above........................'make everything'\n\n Clean objects and binaries..............'make clean'\n Clean tool directory....................'make toolclean'\n Remove traces of previous configure.....'make distclean'\n\n Also consider trying 'scripts/install_in_icetray.sh --help'\n\n\n---> Building photonics_1.73 with 
target all\nDEBUG: Executing com.apple.build (photonics_1.73)\nDEBUG: Assembled command: 'cd \"/net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73\" && CFLAGS=-fPIC make all'\nmake all-recursive\nmake[1]: Entering directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73'\nMaking all in lib\nmake[2]: Entering directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73/lib'\n/bin/sh ../libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I.. -O3 -funroll-loops -fmerge-all-constants -march=native -mtune=native -g -O2 -Wall -fno-inline -fPIC -MT boundary.lo -MD -MP -MF .deps/boundary.Tpo -c -o boundary.lo boundary.c\nlibtool: compile: gcc -DHAVE_CONFIG_H -I. -I.. 
-O3 -funroll-loops -fmerge-all-constants -march=native -mtune=native -g -O2 -Wall -fno-inline -fPIC -MT boundary.lo -MD -MP -MF .deps/boundary.Tpo -c boundary.c -fPIC -DPIC -o .libs/boundary.o\nboundary.c:1: error: bad value (native) for -march= switch\nboundary.c:1: error: bad value (native) for -mtune= switch\nmake[2]: *** [boundary.lo] Error 1\nmake[2]: Leaving directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73/lib'\nmake[1]: *** [all-recursive] Error 1\nmake[1]: Leaving directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73'\nmake: *** [all] Error 2\nError: Target com.apple.build returned: shell command \"cd \"/net/software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73\" && CFLAGS=-fPIC make all\" returned error 2\nCommand output: make all-recursive\nmake[1]: Entering directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73'\nMaking all in lib\nmake[2]: Entering directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73/lib'\n/bin/sh ../libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I.. 
-O3 -funroll-loops -fmerge-all-constants -march=native -mtune=native -g -O2 -Wall -fno-inline -fPIC -MT boundary.lo -MD -MP -MF .deps/boundary.Tpo -c -o boundary.lo boundary.c\nlibtool: compile: gcc -DHAVE_CONFIG_H -I. -I.. -O3 -funroll-loops -fmerge-all-constants -march=native -mtune=native -g -O2 -Wall -fno-inline -fPIC -MT boundary.lo -MD -MP -MF .deps/boundary.Tpo -c boundary.c -fPIC -DPIC -o .libs/boundary.o\nboundary.c:1: error: bad value (native) for -march= switch\nboundary.c:1: error: bad value (native) for -mtune= switch\nmake[2]: *** [boundary.lo] Error 1\nmake[2]: Leaving directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73/lib'\nmake[1]: *** [all-recursive] Error 1\nmake[1]: Leaving directory `/.automount/net_ro/net__software_icecube/SL5-py26/I3_PORTS/var/db/dports/build/file._net_software_icecube_SL5-py26_I3_PORTS_var_db_dports_sources_rsync.code.icecube.wisc.edu_icecube-tools-ports_science_photonics_1.73/work/photonics-1.73'\nmake: *** [all] Error 2\n\nWarning: the following items did not execute (for photonics_1.73): com.apple.activate com.apple.build com.apple.destroot com.apple.archive com.apple.install\nicecubemgr@lx3b48:/net/software_icecube/SL5-py26/I3_PORTS\n}}}",
"reporter": "boersma",
"cc": "",
"resolution": "worksforme",
"time": "2012-06-22T15:13:34",
"component": "booking",
"summary": "port photonics_1.73 does not build on SL5 64 bit",
"priority": "normal",
"keywords": "photonics I3_PORTS gcc",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
| defect | port photonics does not build on bit trac migrated from json status closed changetime ts description scientific linux release boron n n n gcc version ngcc gcc red hat ncopyright c free software foundation inc nthis is free software see the source for copying conditions there is no warranty not even for merchantability or fitness for a particular purpose n n n n bin port vd install photonics ndebug found port in file net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics ndebug changing to port directory net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics ndebug requested variant is not provided by port photonics ndebug executing variant linux provides linux ndebug executing com apple main photonics ndebug no tgz archive net software icecube ports var db dports packages linux photonics tgz ndebug skipping unarchive photonics since no archive found ndebug skipping completed com apple unarchive photonics n fetching photonics ndebug executing com apple fetch photonics n photonics tar doesn t seem to exist in net software icecube ports var db dports distfiles photonics n attempting to fetch photonics tar from assembled command cd net software icecube ports var db dports distfiles photonics curl f l o photonics tar tmp total received xferd average speed time time time current n dload upload total spent left speed n verifying checksum s for photonics ndebug executing com apple checksum photonics n checksumming photonics tar ndebug correct checksum for photonics tar ndebug setting option extract cmd to usr bin n extracting photonics ndebug executing com apple extract photonics n extracting photonics tar ndebug setting option extract args to net software icecube ports var db dports distfiles photonics photonics tar ndebug assembled command cd net software icecube ports var db dports build file net software icecube ports var db dports sources 
rsync code icecube wisc edu icecube tools ports science photonics work usr bin dc net software icecube ports var db dports distfiles photonics photonics tar tar no same owner xf ndebug executing com apple patch photonics n configuring photonics ndebug executing com apple configure photonics ndebug assembled command cd net software icecube ports var db dports build file net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics work photonics cflags fpic configure prefix net software icecube ports enable optimize nchecking for a bsd compatible install usr bin install c nchecking whether build environment is sane yes nchecking for a thread safe mkdir p bin mkdir p nchecking for gawk gawk nchecking whether make sets make yes nchecking for gawk cached gawk nchecking for gcc gcc nchecking for c compiler default output file name a out nchecking whether the c compiler works yes nchecking whether we are cross compiling no nchecking for suffix of executables nchecking for suffix of object files o nchecking whether we are using the gnu c compiler yes nchecking whether gcc accepts g yes nchecking for gcc option to accept iso none needed nchecking for style of include used by make gnu nchecking dependency style of gcc nchecking how to run the c preprocessor gcc e nchecking for g g nchecking whether we are using the gnu c compiler yes nchecking whether g accepts g yes nchecking dependency style of g nchecking how to run the c preprocessor g e nchecking for a bsd compatible install usr bin install c nchecking whether ln s works yes nchecking whether make sets make cached yes nchecking build system type unknown linux gnu nchecking host system type unknown linux gnu nchecking for a sed that does not truncate output bin sed nchecking for grep that handles long lines and e bin grep nchecking for egrep bin grep e nchecking for fgrep bin grep f nchecking for ld used by gcc usr bin ld nchecking if the linker usr bin ld is gnu ld 
yes nchecking for bsd or ms compatible name lister nm usr bin nm b nchecking the name lister usr bin nm b interface bsd nm nchecking the maximum length of command line arguments nchecking whether the shell understands some xsi constructs yes nchecking whether the shell understands yes nchecking for usr bin ld option to reload object files r nchecking how to recognize dependent libraries pass all nchecking for ar ar nchecking for strip strip nchecking for ranlib ranlib nchecking command to parse usr bin nm b output from gcc object ok nchecking for ansi c header files yes nchecking for sys types h yes nchecking for sys stat h yes nchecking for stdlib h yes nchecking for string h yes nchecking for memory h yes nchecking for strings h yes nchecking for inttypes h yes nchecking for stdint h yes nchecking for unistd h yes nchecking for dlfcn h yes nchecking whether we are using the gnu c compiler cached yes nchecking whether g accepts g cached yes nchecking dependency style of g cached nchecking how to run the c preprocessor g e nchecking for objdir libs nchecking if gcc supports fno rtti fno exceptions no nchecking for gcc option to produce pic fpic dpic nchecking if gcc pic flag fpic dpic works yes nchecking if gcc static flag static works yes nchecking if gcc supports c o file o yes nchecking if gcc supports c o file o cached yes nchecking whether the gcc linker usr bin ld m elf supports shared libraries yes nchecking whether lc should be explicitly linked in no nchecking dynamic linker characteristics gnu linux ld so nchecking how to hardcode library paths into programs immediate nchecking whether stripping libraries is possible yes nchecking if libtool supports shared libraries yes nchecking whether to build shared libraries yes nchecking whether to build static libraries yes nchecking for ld used by g usr bin ld m elf nchecking if the linker usr bin ld m elf is gnu ld yes nchecking whether the g linker usr bin ld m elf supports shared libraries yes nchecking for g 
option to produce pic fpic dpic nchecking if g pic flag fpic dpic works yes nchecking if g static flag static works yes nchecking if g supports c o file o yes nchecking if g supports c o file o cached yes nchecking whether the g linker usr bin ld m elf supports shared libraries yes nchecking dynamic linker characteristics gnu linux ld so nchecking how to hardcode library paths into programs immediate nchecking for ansi c header files cached yes nchecking for stdbool h that conforms to yes nchecking for bool yes nchecking limits h usability yes nchecking limits h presence yes nchecking for limits h yes nchecking malloc h usability yes nchecking malloc h presence yes nchecking for malloc h yes nchecking for an ansi c conforming const yes nchecking for size t yes nchecking for t yes nchecking for t yes nchecking for t yes nchecking for t yes nchecking for off t yes nchecking for stdlib h cached yes nchecking for gnu libc compatible malloc yes nchecking for stdlib h cached yes nchecking for unistd h cached yes nchecking for getpagesize yes nchecking for working mmap yes nchecking for stdlib h cached yes nchecking for gnu libc compatible realloc yes nchecking for working strtod yes nchecking for strstr yes nchecking for strtod cached yes nchecking for strtol yes nchecking for strerror yes nchecking for memset yes nchecking for floor no nchecking for library containing floor lm nchecking for pow yes nchecking for sqrt yes nchecking whether to enable debug mode yes ndisabled cernlib dependent code yes nconfigure creating config status nconfig status creating makefile nconfig status creating lib makefile nconfig status creating src makefile nconfig status creating ice makefile nconfig status creating scripts makefile nconfig status creating amasim makefile nconfig status creating makefile nconfig status creating config h nconfig status executing depfiles commands nconfig status executing libtool commands n n photonics pyrosoma n please refer to the install file for further 
instructions n hints n n building photonics make n compiling tool directory make tool n performing post compile test make tests n all of the above make everything n n clean objects and binaries make clean n clean tool directory make toolclean n remove traces of previous configure make distclean n n also consider trying scripts install in icetray sh help n n n building photonics with target all ndebug executing com apple build photonics ndebug assembled command cd net software icecube ports var db dports build file net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics work photonics cflags fpic make all nmake all recursive nmake entering directory automount net ro net software icecube ports var db dports build file net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics work photonics nmaking all in lib nmake entering directory automount net ro net software icecube ports var db dports build file net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics work photonics lib n bin sh libtool tag cc mode compile gcc dhave config h i i funroll loops fmerge all constants march native mtune native g wall fno inline fpic mt boundary lo md mp mf deps boundary tpo c o boundary lo boundary c nlibtool compile gcc dhave config h i i funroll loops fmerge all constants march native mtune native g wall fno inline fpic mt boundary lo md mp mf deps boundary tpo c boundary c fpic dpic o libs boundary o nboundary c error bad value native for march switch nboundary c error bad value native for mtune switch nmake error nmake leaving directory automount net ro net software icecube ports var db dports build file net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics work photonics lib nmake error nmake leaving directory automount net ro net software icecube 
ports var db dports build file net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics work photonics nmake error nerror target com apple build returned shell command cd net software icecube ports var db dports build file net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics work photonics cflags fpic make all returned error ncommand output make all recursive nmake entering directory automount net ro net software icecube ports var db dports build file net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics work photonics nmaking all in lib nmake entering directory automount net ro net software icecube ports var db dports build file net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics work photonics lib n bin sh libtool tag cc mode compile gcc dhave config h i i funroll loops fmerge all constants march native mtune native g wall fno inline fpic mt boundary lo md mp mf deps boundary tpo c o boundary lo boundary c nlibtool compile gcc dhave config h i i funroll loops fmerge all constants march native mtune native g wall fno inline fpic mt boundary lo md mp mf deps boundary tpo c boundary c fpic dpic o libs boundary o nboundary c error bad value native for march switch nboundary c error bad value native for mtune switch nmake error nmake leaving directory automount net ro net software icecube ports var db dports build file net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics work photonics lib nmake error nmake leaving directory automount net ro net software icecube ports var db dports build file net software icecube ports var db dports sources rsync code icecube wisc edu icecube tools ports science photonics work photonics nmake error n nwarning the following items did 
not execute for photonics com apple activate com apple build com apple destroot com apple archive com apple install nicecubemgr net software icecube ports n reporter boersma cc resolution worksforme time component booking summary port photonics does not build on bit priority normal keywords photonics ports gcc milestone owner type defect | 1 |
30,868 | 6,334,496,691 | IssuesEvent | 2017-07-26 16:47:18 | BALL-Project/ball | https://api.github.com/repos/BALL-Project/ball | closed | BALLView crashes in RT mode | C: VIEW P: major T: defect | **Reported by nicste on 23 Apr 41970140 22:04 UTC**
After full integration of RTfact code into the master branch (12/2011), BALLView crashes with an undefined symbol error at runtime, when either opening a project in RT mode or switching to RT mode. It seems to come from RTfact. It could be related to the Boost versions used (1.4.0 for BALL/BALLView, 1.4.3 for RTfact).
Detailed error message is:
BALLView: symbol lookup error: /home/AH/nicste/BALL.rtfact/BALL/build/lib/libVIEW.so.1.4: undefined symbol: _ZNKSt3tr14hashIN5boost13intrusive_ptrIN6RTfact6Remote8GeometryEEEEclES6_
| 1.0 | BALLView crashes in RT mode - **Reported by nicste on 23 Apr 41970140 22:04 UTC**
After full integration of RTfact code into the master branch (12/2011), BALLView crashes with an undefined symbol error at runtime, when either opening a project in RT mode or switching to RT mode. It seems to come from RTfact. It could be related to the Boost versions used (1.4.0 for BALL/BALLView, 1.4.3 for RTfact).
Detailed error message is:
BALLView: symbol lookup error: /home/AH/nicste/BALL.rtfact/BALL/build/lib/libVIEW.so.1.4: undefined symbol: _ZNKSt3tr14hashIN5boost13intrusive_ptrIN6RTfact6Remote8GeometryEEEEclES6_
| defect | ballview crashes in rt mode reported by nicste on apr utc after full integration of rtfact code into the master branch ballview crashes with an undefined symbol error at runtime when either opening a project in rt mode or switching to rt mode seems to come from rtfact coould be related to used boost versions for ball ballview for rtfact detailed error message is ballview symbol lookup error home ah nicste ball rtfact ball build lib libview so undefined symbol | 1 |
622,608 | 19,651,251,559 | IssuesEvent | 2022-01-10 07:27:10 | cloudnativedaysjp/reviewapp-operator | https://api.github.com/repos/cloudnativedaysjp/reviewapp-operator | closed | Make the messages ReviewAppController sends to the appRepo PR use singleflight | bug low priority | Messages are currently being sent in duplicate, so use singleflight or something similar to ensure only one is sent

| 1.0 | Make the messages ReviewAppController sends to the appRepo PR use singleflight - Messages are currently being sent in duplicate, so use singleflight or something similar to ensure only one is sent

 | non_defect | make the messages reviewappcontroller sends to the apprepo pr use singleflight messages are currently being sent in duplicate so use singleflight or something similar to ensure only one is sent | 0
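The deduplication idea in the record above (post at most one notification per review-app PR) can be sketched outside of Go's singleflight package. This is a hypothetical illustration, not the operator's actual code: the `OncePerKeyNotifier` class and the `send` callback are invented for the example.

```python
import threading

class OncePerKeyNotifier:
    """Send a given notification at most once per key.

    A minimal sketch of the "only one message per PR" idea from the
    issue above; the class and callback names are invented here and
    do not come from reviewapp-operator.
    """

    def __init__(self, send):
        self._send = send            # callback that actually posts the message
        self._sent_keys = set()      # keys we have already notified for
        self._lock = threading.Lock()

    def notify(self, key, message):
        # Atomically check-and-mark, so two concurrent reconcile loops
        # cannot both decide to send for the same key.
        with self._lock:
            if key in self._sent_keys:
                return False         # duplicate suppressed
            self._sent_keys.add(key)
        self._send(key, message)
        return True
```

A reconcile loop that fires twice for the same PR would then post the comment only once; the second `notify` call returns `False` without invoking the callback.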
29,783 | 8,407,083,829 | IssuesEvent | 2018-10-11 19:51:46 | gradle/gradle | https://api.github.com/repos/gradle/gradle | closed | Can we delete buildTagging.gradle? | a:chore from:member in:gradle-build | It looks like that this script is not used in our build: https://github.com/gradle/gradle/blob/6f51f19a37990d1260a81cbbf14f95bdb959a0bf/gradle/buildTagging.gradle#L23
It is probably used on our internal CI infrastructure. If that is the case we should move it out of the public repository. | 1.0 | Can we delete buildTagging.gradle? - It looks like that this script is not used in our build: https://github.com/gradle/gradle/blob/6f51f19a37990d1260a81cbbf14f95bdb959a0bf/gradle/buildTagging.gradle#L23
It is probably used on our internal CI infrastructure. If that is the case we should move it out of the public repository. | non_defect | can we delete buildtagging gradle it looks like that this script is not used in our build it is probably used on our internal ci infrastructure if that is the case we should move it out of the public repository | 0 |
42,778 | 11,267,987,505 | IssuesEvent | 2020-01-14 04:25:48 | scipy/scipy | https://api.github.com/repos/scipy/scipy | closed | BUG: directed_hausdorff distance on sets u and v when u is a subset of v | defect scipy.spatial | When `u` and `v` are two sets with `u` being a subset of `v` we should get:
`directed_hausdorff(u, v, seed=0) = (0, index_of_any_point_from_u, index_of_the_same_point_from_v)`
While the distance is returned correctly, the indices for the points witnessing this distance are not guaranteed to be correct. For example,
`directed_hausdorff([(0,0)], [(0,1), (0,0)], seed=0)` return correct answer of `(0.0, 0, 1)`
but `directed_hausdorff([(0,0)], [(0,1), (0,0)], seed=1)` returns incorrect `(0.0, 0, 0)`.
This is due to strict inequality, `cmin > cmax`, in line 66 of [_directed_hausdorff definition](https://github.com/scipy/scipy/blob/v1.4.1/scipy/spatial/_hausdorff.pyx), as this prevents reassignment of `j_ret` (and `i_ret`) from its default value of `0` (because `cmax=cmin=0` on all `i` iterations in the case of `u` being a subset of `v`).
To fix this, either the `cmin > cmax` could be changed to `cmin >= cmax` or it could be checked in the beginning if `u` is a subset of `v`. | 1.0 | BUG: directed_hausdorff distance on sets u and v when u is a subset of v - When `u` and `v` are two sets with `u` being a subset of `v` we should get:
`directed_hausdorff(u, v, seed=0) = (0, index_of_any_point_from_u, index_of_the_same_point_from_v)`
While the distance is returned correctly, the indices for the points witnessing this distance are not guaranteed to be correct. For example,
`directed_hausdorff([(0,0)], [(0,1), (0,0)], seed=0)` return correct answer of `(0.0, 0, 1)`
but `directed_hausdorff([(0,0)], [(0,1), (0,0)], seed=1)` returns incorrect `(0.0, 0, 0)`.
This is due to strict inequality, `cmin > cmax`, in line 66 of [_directed_hausdorff definition](https://github.com/scipy/scipy/blob/v1.4.1/scipy/spatial/_hausdorff.pyx), as this prevents reassignment of `j_ret` (and `i_ret`) from its default value of `0` (because `cmax=cmin=0` on all `i` iterations in the case of `u` being a subset of `v`).
To fix this, either the `cmin > cmax` could be changed to `cmin >= cmax` or it could be checked in the beginning if `u` is a subset of `v`. | defect | bug directed hausdorff distance on sets u and v when u is a subset of v when u and v are two sets with u being a subset of v we should get directed hausdorff u v seed index of any point from u index of the same point from v while the distance is returned correctly the indices for the points witnessing this distance are not guaranteed to be correct for example directed hausdorff seed return correct answer of but directed hausdorff seed returns incorrect this is due to strict inequality cmin cmax in line of as this prevents reassignment of j ret and i ret from its default value of because cmax cmin on all i iterations in the case of u being a subset of v to fix this either the cmin cmax could be changed to cmin cmax or it could be checked in the beginning if u is a subset of v | 1 |
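The witness-index behaviour described above can be reproduced with a brute-force reimplementation. This is a hypothetical sketch, not scipy's Cython code (which additionally uses an early-break optimisation); the `strict` flag mirrors the reported `cmin > cmax` update versus the proposed `cmin >= cmax`.

```python
import math

def directed_hausdorff_naive(u, v, strict=True):
    # Brute-force directed Hausdorff distance with witness indices.
    # strict=True mimics the reported `cmin > cmax` comparison;
    # strict=False uses `>=`, one of the fixes proposed in the issue.
    cmax, i_ret, j_ret = 0.0, 0, 0
    for i, p in enumerate(u):
        cmin, j_min = math.inf, 0
        for j, q in enumerate(v):
            d = math.dist(p, q)      # Euclidean distance (Python 3.8+)
            if d < cmin:
                cmin, j_min = d, j
        if (cmin > cmax) if strict else (cmin >= cmax):
            cmax, i_ret, j_ret = cmin, i, j_min
    return cmax, i_ret, j_ret
```

With `u = [(0, 0)]` a subset of `v = [(0, 1), (0, 0)]`, every `cmin` equals the initial `cmax` of 0, so the strict comparison never fires and the default indices `(0, 0)` leak out, while the non-strict variant reports the correct witness `(0, 1)`.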
9,312 | 8,584,883,822 | IssuesEvent | 2018-11-14 00:40:03 | MicrosoftDocs/azure-docs | https://api.github.com/repos/MicrosoftDocs/azure-docs | closed | WcfCommunicationClient is confusing / too close to WcfCommunicationClientFactory | cxp product-feedback service-fabric/svc triaged | In reviewing this code, I assumed that WcfCommunicationClient was a .NET Framework class that was provided and fulfilled using generics, an assumption that literally took me days to correct.
To ensure clarity that the class is part of the solution/implementation of WCF client communication, my suggestion would be to call it what it's for: CalculatorWCFClient, or WCFCalculatorClient… that or make this an extension method or builder directly from the client... the only item that is required in this boilerplate code is the service interface.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: efc5a255-70d0-c9c6-a290-914681b71479
* Version Independent ID: a1b395d1-6e8c-eca6-4b48-4739f3a24f72
* Content: [Reliable Services WCF communication stack](https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-reliable-services-communication-wcf)
* Content Source: [articles/service-fabric/service-fabric-reliable-services-communication-wcf.md](https://github.com/Microsoft/azure-docs/blob/master/articles/service-fabric/service-fabric-reliable-services-communication-wcf.md)
* Service: **service-fabric**
* GitHub Login: @BharatNarasimman
* Microsoft Alias: **bharatn** | 1.0 | WcfCommunicationClient is confusing / too close to WcfCommunicationClientFactory - In reviewing this code, I assumed that WcfCommunicationClient was a .NET Framework class that was provided and fulfilled using generics, an assumption that literally took me days to correct.
To ensure clarity that the class is part of the solution/implementation of WCF client communication, my suggestion would be to call it what it's for: CalculatorWCFClient, or WCFCalculatorClient… that or make this an extension method or builder directly from the client... the only item that is required in this boilerplate code is the service interface.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: efc5a255-70d0-c9c6-a290-914681b71479
* Version Independent ID: a1b395d1-6e8c-eca6-4b48-4739f3a24f72
* Content: [Reliable Services WCF communication stack](https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-reliable-services-communication-wcf)
* Content Source: [articles/service-fabric/service-fabric-reliable-services-communication-wcf.md](https://github.com/Microsoft/azure-docs/blob/master/articles/service-fabric/service-fabric-reliable-services-communication-wcf.md)
* Service: **service-fabric**
* GitHub Login: @BharatNarasimman
* Microsoft Alias: **bharatn** | non_defect | wcfcommunicationclient is confusing too close to wcfcommunicationclientfactory in reviewing of this code i assumed that wcfcommunicationclient was a net framework class that was provided and fulfilled using generics which literally took me days to correct that assumption to ensure clarity that the class is part of the solution implementation of wcf client communication my suggestion would be to call it what it s for calculatorwcfclient or wcfcalculatorclient… that or make this an extension method or builder directly from the client the only item that is required in this boilerplate code is the service interface document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service service fabric github login bharatnarasimman microsoft alias bharatn | 0 |
578,025 | 17,142,295,534 | IssuesEvent | 2021-07-13 10:59:20 | openMF/mifos-android-sdk-arch | https://api.github.com/repos/openMF/mifos-android-sdk-arch | closed | Adopt Kotlin Coroutines | :x: invalid feature priority | **Summary**
Currently, the SDK does not support Kotlin Coroutines. Implementation of Coroutines will simplify and optimize background network requests.
| 1.0 | Adopt Kotlin Coroutines - **Summary**
Currently, the SDK does not support Kotlin Coroutines. Implementation of Coroutines will simplify and optimize background network requests.
| non_defect | adopt kotlin coroutines summary currently the sdk does not support kotlin coroutines implementation of coroutines will simplify and optimize background network requests | 0 |
21,448 | 3,711,349,149 | IssuesEvent | 2016-03-02 09:57:58 | akvo/akvo-web | https://api.github.com/repos/akvo/akvo-web | closed | Events Calendar plugin - styles to be added through sass | 1 - Design 1 - Plug-in | Styles were directly added through css without being preprocessed using sass. | 1.0 | Events Calendar plugin - styles to be added through sass - Styles were directly added through css without being preprocessed using sass. | non_defect | events calendar plugin styles to be added through sass styles were directly added through css without being preprocessed using sass | 0 |
400,952 | 27,308,982,548 | IssuesEvent | 2023-02-24 10:35:24 | Avaiga/taipy-doc | https://api.github.com/repos/Avaiga/taipy-doc | closed | Documentation for set default config | Core 📄 Documentation 🟨 Priority: Medium | Add example on how to set default configuration for:
- Datanode
- Task
- Pipeline
- Scenario | 1.0 | Documentation for set default config - Add example on how to set default configuration for:
- Datanode
- Task
- Pipeline
- Scenario | non_defect | documentation for set default config add example on how to set default configuration for datanode task pipeline scenario | 0 |
402,662 | 27,379,752,578 | IssuesEvent | 2023-02-28 09:14:03 | input-output-hk/ouroboros-network | https://api.github.com/repos/input-output-hk/ouroboros-network | closed | Write a blog article about a consensus map | :handshake: consensus documentation | Re-organize the consensus docs, to improve their discoverability and announce this via a blogpost
The consensus map could be used for:
- Onboarding new collaborators.
- Identifying potential points of improvement.
- Hiring.
- Helping current collaborators.
One important message we want to get across is the fact that consensus:
- requires time to learn,
- it is mission-critical,
- it is highly rewarding to work on it, but it will take time till one can make a significant contribution.
## TODOs
* [x] #4149
* [x] #4057
* [x] #4217
* [x] #4061
* [x] #4198
* [x] #4247
* [x] #4323
* [x] #4245
* [x] #4324
* [x] Link to the Galois examples from the docs.
* [ ] ~Announce the new documentation somewhere (eg `cardano-updates` or engineering blog).~
## Tasks for later
* [ ] #4215
* [ ] #4153
| 1.0 | Write a blog article about a consensus map - Re-organize the consensus docs, to improve their discoverability and announce this via a blogpost
The consensus map could be used for:
- Onboarding new collaborators.
- Identifying potential points of improvement.
- Hiring.
- Helping current collaborators.
One important message we want to get across is the fact that consensus:
- requires time to learn,
- it is mission-critical,
- it is highly rewarding to work on it, but it will take time till one can make a significant contribution.
## TODOs
* [x] #4149
* [x] #4057
* [x] #4217
* [x] #4061
* [x] #4198
* [x] #4247
* [x] #4323
* [x] #4245
* [x] #4324
* [x] Link to the Galois examples from the docs.
* [ ] ~Announce the new documentation somewhere (eg `cardano-updates` or engineering blog).~
## Tasks for later
* [ ] #4215
* [ ] #4153
| non_defect | write a blog article about a consensus map re organize the consensus docs to improve their discoverability and announce this via a blogpost the consensus map could be used for onboarding new collaborators identifying potential points of improvement hiring helping current collaborators one important message we want to get across is the fact that consensus requires time to learn it is mission critical it is highly rewarding to work on it but it will take time till one can make a significant contribution todos link to the galois examples from the docs announce the new documentation somewhere eg cardano updates or engineering blog tasks for later | 0 |
72,507 | 24,153,267,768 | IssuesEvent | 2022-09-22 04:31:23 | openzfs/zfs | https://api.github.com/repos/openzfs/zfs | closed | Wording regarding an error importing zfs pool | Type: Defect Status: Stale Status: Triage Needed | Hi,
The following issue appears in ZFS 0.8.x regardless of the Linux distribution.
I have a RAID card which hosts a ZFS pool and the system boots from a SATA disk using an ext4 partition.
Today, after upgrading kernel and rebooting, zfs didn't import my pool, and only showed an error "cannot import metadata is corrupted"
After trying to check my system, it seems that the disks from the RAID card are not recognized by the OS, due to an issue with mpt3sas kernel module. After fixing the mpt3sas kernel module issue, the pool was imported successfully without any problem.
If I may suggest something: in the import section of ZFS, it should actually check whether the cache is available and, if not, show a message such as "Cache is not available, are the disks online?" instead of showing a message about corruption. | 1.0 | Wording regarding an error importing zfs pool - Hi,
The following issue appears in ZFS 0.8.x regardless of the Linux distribution.
I have a RAID card which hosts a ZFS pool and the system boots from a SATA disk using an ext4 partition.
Today, after upgrading kernel and rebooting, zfs didn't import my pool, and only showed an error "cannot import metadata is corrupted"
After trying to check my system, it seems that the disks from the RAID card are not recognized by the OS, due to an issue with mpt3sas kernel module. After fixing the mpt3sas kernel module issue, the pool was imported successfully without any problem.
If I may suggest something: in the import section of ZFS, it should actually check whether the cache is available and, if not, show a message such as "Cache is not available, are the disks online?" instead of showing a message about corruption. | defect | wording regarding an error importing zfs pool hi the following issue appears in zfs x regardless on any linux distribution i have a raid card which hosts a zfs pool and the system boots from a sata disk using an partition today after upgrading kernel and rebooting zfs didn t import my pool and only showed an error cannot import metadata is corrupted after trying to check my system it seems that the disks from the raid card are not recognized by the os due to an issue with kernel module after fixing the kernel module issue the pool was imported successfully without any problem if i may suggest something in the import section of zfs it should actually check if the cache if available and if not show a message that cache is not available are the disks online instead of showing a message about corruption | 1
21,004 | 6,973,098,551 | IssuesEvent | 2017-12-11 19:18:37 | meteor/meteor | https://api.github.com/repos/meteor/meteor | closed | meteor errors on multiple comments in one CSS declaration block | bug confirmed Impact:few Project:Isobuild:Minifiers pull-requests-encouraged Severity:has-workaround | **See reproduction [here](https://github.com/hwillson/meteor-issue-2334)**
----
When I bring in a new jQuery Mobile CSS theme from ThemeRoller, I need to remove CSS comments in order for meteor release 0.8.2 to be happy. Here's the error:
```
=> Started proxy.
=> Started MongoDB.
=> Errors prevented startup:
While building the application:
client/jqm.theme.min.css: property missing ':' near line 91:44
=> Your application has errors. Waiting for file change.
```
Here's the unmodified CSS:
``` CSS
.ui-corner-all {
-webkit-border-radius: .6em /*{global-radii-blocks}*/;
border-radius: .6em /*{global-radii-blocks}*/;
}
```
And the modification to make meteor happy:
``` CSS
.ui-corner-all {
-webkit-border-radius: .6em /*{global-radii-blocks}*/;
border-radius: .6em;
}
```
| 1.0 | meteor errors on multiple comments in one CSS declaration block - **See reproduction [here](https://github.com/hwillson/meteor-issue-2334)**
----
When I bring in a new jQuery Mobile CSS theme from ThemeRoller, I need to remove CSS comments in order for meteor release 0.8.2 to be happy. Here's the error:
```
=> Started proxy.
=> Started MongoDB.
=> Errors prevented startup:
While building the application:
client/jqm.theme.min.css: property missing ':' near line 91:44
=> Your application has errors. Waiting for file change.
```
Here's the unmodified CSS:
``` CSS
.ui-corner-all {
-webkit-border-radius: .6em /*{global-radii-blocks}*/;
border-radius: .6em /*{global-radii-blocks}*/;
}
```
And the modification to make meteor happy:
``` CSS
.ui-corner-all {
-webkit-border-radius: .6em /*{global-radii-blocks}*/;
border-radius: .6em;
}
```
| non_defect | meteor errors on multiple comments in one css declaration block see reproduction when i bring in a new jquery mobile css theme from themeroller i need to remove css comments in order for meteor release to be happy here s the error started proxy started mongodb errors prevented startup while building the application client jqm theme min css property missing near line your application has errors waiting for file change here s the unmodified css css ui corner all webkit border radius global radii blocks border radius global radii blocks and the modification to make meteor happy css ui corner all webkit border radius global radii blocks border radius | 0 |
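The workaround shown in the report above (hand-deleting ThemeRoller's inline `/*{...}*/` comments) can be automated as a preprocessing step. This is a hedged sketch, not part of Meteor's build pipeline, and the regex is deliberately naive — it would also strip comment-like text inside quoted strings:

```python
import re

def strip_css_comments(css: str) -> str:
    """Remove /* ... */ comments, matching the manual workaround in the
    report above. Naive: does not special-case comment-like sequences
    inside quoted CSS strings."""
    return re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)

print(strip_css_comments("border-radius: .6em /*{global-radii-blocks}*/;"))
# → border-radius: .6em ;
```

Running a theme file through such a filter before dropping it into `client/` would have sidestepped the minifier's parse error on comments inside declaration values.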
222,978 | 24,711,514,174 | IssuesEvent | 2022-10-20 01:27:12 | n-devs/freebitco.in-mobile | https://api.github.com/repos/n-devs/freebitco.in-mobile | opened | CVE-2022-37601 (High) detected in loader-utils-1.2.3.tgz | security vulnerability | ## CVE-2022-37601 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>loader-utils-1.2.3.tgz</b></p></summary>
<p>utils for webpack loaders</p>
<p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz</a></p>
<p>Path to dependency file: /freebitco.in-mobile/package.json</p>
<p>Path to vulnerable library: /node_modules/loader-utils/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- webpack-4.1.0.tgz
- :x: **loader-utils-1.2.3.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution vulnerability in function parseQuery in parseQuery.js in webpack loader-utils 2.0.0 via the name variable in parseQuery.js.
<p>Publish Date: 2022-10-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37601>CVE-2022-37601</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-10-12</p>
<p>Fix Resolution (loader-utils): 2.0.0</p>
<p>Direct dependency fix Resolution (react-scripts): 5.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-37601 (High) detected in loader-utils-1.2.3.tgz - ## CVE-2022-37601 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>loader-utils-1.2.3.tgz</b></p></summary>
<p>utils for webpack loaders</p>
<p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz</a></p>
<p>Path to dependency file: /freebitco.in-mobile/package.json</p>
<p>Path to vulnerable library: /node_modules/loader-utils/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.0.1.tgz (Root Library)
- webpack-4.1.0.tgz
- :x: **loader-utils-1.2.3.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution vulnerability in function parseQuery in parseQuery.js in webpack loader-utils 2.0.0 via the name variable in parseQuery.js.
<p>Publish Date: 2022-10-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37601>CVE-2022-37601</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-10-12</p>
<p>Fix Resolution (loader-utils): 2.0.0</p>
<p>Direct dependency fix Resolution (react-scripts): 5.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_defect | cve high detected in loader utils tgz cve high severity vulnerability vulnerable library loader utils tgz utils for webpack loaders library home page a href path to dependency file freebitco in mobile package json path to vulnerable library node modules loader utils package json dependency hierarchy react scripts tgz root library webpack tgz x loader utils tgz vulnerable library vulnerability details prototype pollution vulnerability in function parsequery in parsequery js in webpack loader utils via the name variable in parsequery js publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution loader utils direct dependency fix resolution react scripts step up your open source security game with mend | 0 |
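The CVE above concerns JavaScript prototype pollution: `parseQuery` copies attacker-controlled query-string keys (including `__proto__`) onto an object, which can mutate `Object.prototype`. Python has no prototype chain, so the sketch below is only an analogue of the underlying pattern — unfiltered keys from untrusted input mutating shared state — not the loader-utils code itself:

```python
def naive_merge(target: dict, source: dict) -> dict:
    """Recursively copy attacker-shaped keys into a shared object with no
    filtering -- the pattern class behind CVE-2022-37601. In JavaScript,
    a "__proto__" key reaching this point would pollute Object.prototype."""
    for key, value in source.items():
        if isinstance(value, dict):
            naive_merge(target.setdefault(key, {}), value)
        else:
            target[key] = value
    return target

SHARED_DEFAULTS = {"timeout": 30}
# Attacker-shaped input reaches a shared defaults object:
naive_merge(SHARED_DEFAULTS, {"timeout": 0, "isAdmin": True})
print(SHARED_DEFAULTS)  # → {'timeout': 0, 'isAdmin': True}
```

The fix in loader-utils 2.x is the usual one for this class: reject or specially handle dangerous key names before assignment.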
53,919 | 13,262,515,288 | IssuesEvent | 2020-08-20 21:57:43 | icecube-trac/tix4 | https://api.github.com/repos/icecube-trac/tix4 | closed | [PROPOSAL] just_use_readonly_path causes test failures (Trac #2335) | Migrated from Trac combo simulation defect | I am not sure what the purpose of this option is but it causes a lot of test failures on my machine. Setting it to false in PROPOSAL/resources/config_icesim.json causes those test to pass. I recommend we change that file or some other remedy.
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/2335">https://code.icecube.wisc.edu/projects/icecube/ticket/2335</a>, reported by kjmeagher and owned by jsoedingrekso</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-06-28T15:35:02",
"_ts": "1561736102787341",
"description": "I am not sure what the purpose of this option is but it causes a lot of test failures on my machine. Setting it to false in PROPOSAL/resources/config_icesim.json causes those test to pass. I recommend we change that file or some other remedy. ",
"reporter": "kjmeagher",
"cc": "",
"resolution": "invalid",
"time": "2019-06-26T21:18:01",
"component": "combo simulation",
"summary": "[PROPOSAL] just_use_readonly_path causes test failures",
"priority": "blocker",
"keywords": "",
"milestone": "Autumnal Equinox 2019",
"owner": "jsoedingrekso",
"type": "defect"
}
```
</p>
</details>
| 1.0 | [PROPOSAL] just_use_readonly_path causes test failures (Trac #2335) - I am not sure what the purpose of this option is but it causes a lot of test failures on my machine. Setting it to false in PROPOSAL/resources/config_icesim.json causes those test to pass. I recommend we change that file or some other remedy.
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/2335">https://code.icecube.wisc.edu/projects/icecube/ticket/2335</a>, reported by kjmeagher and owned by jsoedingrekso</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-06-28T15:35:02",
"_ts": "1561736102787341",
"description": "I am not sure what the purpose of this option is but it causes a lot of test failures on my machine. Setting it to false in PROPOSAL/resources/config_icesim.json causes those test to pass. I recommend we change that file or some other remedy. ",
"reporter": "kjmeagher",
"cc": "",
"resolution": "invalid",
"time": "2019-06-26T21:18:01",
"component": "combo simulation",
"summary": "[PROPOSAL] just_use_readonly_path causes test failures",
"priority": "blocker",
"keywords": "",
"milestone": "Autumnal Equinox 2019",
"owner": "jsoedingrekso",
"type": "defect"
}
```
</p>
</details>
| defect | just use readonly path causes test failures trac i am not sure what the purpose of this option is but it causes a lot of test failures on my machine setting it to false in proposal resources config icesim json causes those test to pass i recommend we change that file or some other remedy migrated from json status closed changetime ts description i am not sure what the purpose of this option is but it causes a lot of test failures on my machine setting it to false in proposal resources config icesim json causes those test to pass i recommend we change that file or some other remedy reporter kjmeagher cc resolution invalid time component combo simulation summary just use readonly path causes test failures priority blocker keywords milestone autumnal equinox owner jsoedingrekso type defect | 1 |
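The remedy described in the ticket — setting `just_use_readonly_path` to false in `PROPOSAL/resources/config_icesim.json` — can be scripted. The helper below is a hedged sketch that assumes the key sits at the top level of the JSON document, which may not match the real file's layout:

```python
import json

def disable_readonly_path(config_text: str) -> str:
    """Return the config JSON with just_use_readonly_path forced to false.
    Assumption: the key sits at the top level of the parsed document; the
    real config_icesim.json may nest it differently."""
    config = json.loads(config_text)
    if "just_use_readonly_path" in config:
        config["just_use_readonly_path"] = False
    return json.dumps(config)

print(disable_readonly_path('{"just_use_readonly_path": true}'))
# → {"just_use_readonly_path": false}
```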
14,519 | 2,814,459,643 | IssuesEvent | 2015-05-18 20:10:16 | geoffhumphrey/brewcompetitiononlineentry | https://api.github.com/repos/geoffhumphrey/brewcompetitiononlineentry | closed | Unknown column 'brewStyleVersion' in 'field list' | auto-migrated Priority-Critical Type-Defect Usability wontfix | ```
This error is showing up whenever you try to view the list of entries -- no
entries display, it says "no data in table".
On the main Admin Dashboard it shows the correct number of entries, it also
shows the correct number above the Entries list -- then you see the error.
When I go to My Entries and Info, I see this (despite having an entry) which
implies that I have no entries:
You have one (1) entry left before you reach the limit of one (1) entry per
participant in this competition.
I am using version 1.3.0.4, with the patches posted for updating how Archiving
works (issue# 421).
Here is the text from the Entries page showing where the error displays:
Please note: Judging numbers are automatically assigned ...<snip>... entering
them in the fields provided and clicking “Update Entries.“
Unknown column 'brewStyleVersion' in 'field list'
Showing 0 to 0 of 0 entries
```
Original issue reported on code.google.com by `br...@brewdrinkrepeat.com` on 19 Apr 2015 at 1:36 | 1.0 | Unknown column 'brewStyleVersion' in 'field list' - ```
This error is showing up whenever you try to view the list of entries -- no
entries display, it says "no data in table".
On the main Admin Dashboard it shows the correct number of entries, it also
shows the correct number above the Entries list -- then you see the error.
When I go to My Entries and Info, I see this (despite having an entry) which
implies that I have no entries:
You have one (1) entry left before you reach the limit of one (1) entry per
participant in this competition.
I am using version 1.3.0.4, with the patches posted for updating how Archiving
works (issue# 421).
Here is the text from the Entries page showing where the error displays:
Please note: Judging numbers are automatically assigned ...<snip>... entering
them in the fields provided and clicking “Update Entries.”
Unknown column 'brewStyleVersion' in 'field list'
Showing 0 to 0 of 0 entries
```
Original issue reported on code.google.com by `br...@brewdrinkrepeat.com` on 19 Apr 2015 at 1:36 | defect | unknown column brewstyleversion in field list this error is showing up whenever you try to view the list of entries no entries display it says no data in table on the main admin dashboard it shows the correct number of entries it also shows the correct number above the entries list then you see the error when i go to my entries and info i see this despite having an entry which implies that i have no entries you have one entry left before you reach the limit of one entry per participant in this competition i am using version with the patches posted for updating how archiving works issue here is the text from the entries page showing where the error displays please note judging numbers are automatically assigned entering them in the fields provided and clicking “update entries “ unknown column brewstyleversion in field list showing to of entries original issue reported on code google com by br brewdrinkrepeat com on apr at | 1 |
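The "Unknown column ... in 'field list'" error in the report above is MySQL's way of saying the deployed schema never gained a column the updated code now selects — typically fixed by a schema migration rather than a code change. The sketch below reproduces the same failure class with SQLite so it is self-contained; the table and column layout here are hypothetical, not BCOE&M's actual schema:

```python
import sqlite3

# Illustrative repro: a query selects a column the schema lacks, then an
# ALTER TABLE migration adds it. (SQLite phrases the error differently
# from MySQL, but the failure class is the same.)
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE brewing (id INTEGER, brewName TEXT)")
try:
    con.execute("SELECT brewStyleVersion FROM brewing")
except sqlite3.OperationalError as exc:
    print("before migration:", exc)  # → no such column: brewStyleVersion
con.execute("ALTER TABLE brewing ADD COLUMN brewStyleVersion TEXT")
rows = con.execute("SELECT brewStyleVersion FROM brewing").fetchall()
print("after migration:", rows)  # → []
```

This matches the symptom pattern in the report: counts computed elsewhere still work, while any query touching the missing column returns no rows and surfaces the error.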
4,786 | 2,610,156,083 | IssuesEvent | 2015-02-26 18:49:37 | chrsmith/republic-at-war | https://api.github.com/repos/chrsmith/republic-at-war | closed | Text | auto-migrated Priority-Medium Type-Defect | ```
Fix Mandator text
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 30 Jan 2011 at 2:30 | 1.0 | Text - ```
Fix Mandator text
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 30 Jan 2011 at 2:30 | defect | text fix mandator text original issue reported on code google com by gmail com on jan at | 1 |