| Unnamed: 0 (int64, 0 to 832k) | id (float64, 2.49B to 32.1B) | type (stringclasses, 1 value) | created_at (stringlengths, 19 to 19) | repo (stringlengths, 5 to 112) | repo_url (stringlengths, 34 to 141) | action (stringclasses, 3 values) | title (stringlengths, 1 to 1k) | labels (stringlengths, 4 to 1.38k) | body (stringlengths, 1 to 262k) | index (stringclasses, 16 values) | text_combine (stringlengths, 96 to 262k) | label (stringclasses, 2 values) | text (stringlengths, 96 to 252k) | binary_label (int64, 0 to 1) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
131,688 | 10,706,240,162 | IssuesEvent | 2019-10-24 15:03:30 | python-discord/bot | https://api.github.com/repos/python-discord/bot | opened | Writing unit tests for the bot | area: unit tests help wanted | ### This issue is work in progress
I'm currently in the process of adding individual issues for individual files. I'll update this issue as I create these issues.
---
The unit test branch coverage of our bot is currently 20% (30% including the test files). We would like to increase the test coverage of our bot to make one of our community's most essential tools more resilient to breakage. This *meta* issue contains general information for this project and keeps track of the progress.
**Note:** If you want to help with writing unit tests, please make sure to get yourself assigned to one of the open issues for writing tests for an individual file. This system prevents multiple people from working on the same tests.
## General information
We are using [`unittest`](https://docs.python.org/3.7/library/unittest.html) combined with [`coverage.py`](https://coverage.readthedocs.io/en/stable/) as the framework for our unit tests. To make sure you are using compatible versions of `unittest` and `coverage.py`, run your tests in the `pipenv` environment. Please note that our `bot` currently uses Python 3.7, which means that the new `unittest` features introduced in Python 3.8 are not yet available.
A general introduction to writing tests for our bot is available in the [`README.md`](https://github.com/python-discord/bot/blob/master/tests/README.md) in the `/tests` directory of this repository. This file also contains information on how to run the test suite with `pipenv`.
A few important points:
- The `/tests/bot` directory uses the same directory structure as `/bot`. Please follow this convention to simplify locating the right tests.
- Make sure your tests do not rely on third-party connections or code external to the unit you are testing, but use mocks instead. See [README.md#mocking](https://github.com/python-discord/bot/blob/master/tests/README.md#mocking).
- Please take a look at the existing test files to get an idea of the existing conventions in this repository.
- The test files themselves are considered to be part of the coverage. This ensures that if certain tests are not discovered due to a mistake, it will show up in the coverage report for the relevant test files.
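As a minimal sketch of the mocking point above (the function and client names here are hypothetical; the repository's actual conventions are described in the tests `README.md`), a test can replace an external collaborator with a `unittest.mock` object so no third-party connection is made:

```python
import unittest
from unittest.mock import MagicMock

# Hypothetical unit under test: formats a user's infraction count,
# fetched through an injected API client.
def format_infractions(api_client, user_id):
    count = api_client.get_infraction_count(user_id)
    return f"User {user_id} has {count} infraction(s)."

class FormatInfractionsTests(unittest.TestCase):
    def test_uses_mocked_api_client(self):
        # The API client is a mock, so the unit is tested in isolation.
        api_client = MagicMock()
        api_client.get_infraction_count.return_value = 3

        result = format_infractions(api_client, 42)

        self.assertEqual(result, "User 42 has 3 infraction(s).")
        api_client.get_infraction_count.assert_called_once_with(42)
```

The same pattern applies to the real cogs: inject the dependency, mock it in the test, and assert on both the return value and the calls made to the mock.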
## Progress and links to issues for individual files
This directory tree keeps track of the files that are fully tested. Some files do already have test files associated with them, but do not yet have full coverage; for these partially covered files, I've indicated the current branch coverage percentage in parentheses.
- [ ] bot/
- [ ] ├── cogs
- [ ] │ ├── moderation
- [ ] │ │ ├── \_\_init\_\_.py
- [ ] │ │ ├── infractions.py
- [ ] │ │ ├── management.py
- [ ] │ │ ├── modlog.py
- [ ] │ │ ├── superstarify.py
- [ ] │ │ └── utils.py
- [ ] │ ├── sync
- [ ] │ │ ├── \_\_init\_\_.py
- [ ] │ │ ├── cog.py (currently ~36% coverage)
- [ ] │ │ └── syncers.py (currently ~57% coverage)
- [ ] │ ├── watchchannels
- [ ] │ │ ├── \_\_init\_\_.py
- [ ] │ │ ├── bigbrother.py
- [ ] │ │ ├── talentpool.py
- [ ] │ │ └── watchchannel.py
- [x] │ ├── \_\_init\_\_.py
- [ ] │ ├── alias.py
- [ ] │ ├── antimalware.py
- [ ] │ ├── antispam.py (currently ~29% coverage)
- [ ] │ ├── bot.py
- [ ] │ ├── clean.py
- [ ] │ ├── defcon.py
- [ ] │ ├── doc.py
- [ ] │ ├── error\_handler.py
- [ ] │ ├── eval.py
- [ ] │ ├── extensions.py
- [ ] │ ├── filtering.py
- [ ] │ ├── free.py
- [ ] │ ├── help.py
- [ ] │ ├── information.py (currently ~51% coverage)
- [ ] │ ├── jams.py
- [ ] │ ├── logging.py
- [ ] │ ├── off\_topic\_names.py
- [ ] │ ├── reddit.py
- [ ] │ ├── reminders.py
- [x] │ ├── security.py
- [ ] │ ├── site.py
- [ ] │ ├── snekbox.py
- [ ] │ ├── tags.py
- [ ] │ ├── token\_remover.py (currently ~97% coverage)
- [ ] │ ├── utils.py
- [ ] │ ├── verification.py
- [ ] │ └── wolfram.py
- [ ] ├── patches
- [x] │ ├── \_\_init\_\_.py
- [ ] │ └── message\_edited\_at.py
- [x] ├── resources
- [x] │ └── stars.json
- [ ] ├── rules
- [x] │ ├── \_\_init\_\_.py
- [x] │ ├── attachments.py
- [ ] │ ├── burst.py
- [ ] │ ├── burst\_shared.py
- [ ] │ ├── chars.py
- [ ] │ ├── discord\_emojis.py
- [ ] │ ├── duplicates.py
- [ ] │ ├── links.py
- [ ] │ ├── mentions.py
- [ ] │ ├── newlines.py
- [ ] │ └── role\_mentions.py
- [ ] ├── utils
- [ ] │ ├── \_\_init\_\_.py
- [ ] │ ├── checks.py (currently ~73%)
- [ ] │ ├── messages.py
- [ ] │ ├── scheduling.py
- [ ] │ └── time.py
- [ ] ├── \_\_init\_\_.py
- [ ] ├── \_\_main\_\_.py
- [ ] ├── api.py (currently ~55%)
- [ ] ├── constants.py (currently ~94%)
- [ ] ├── converters.py (currently ~72%)
- [ ] ├── decorators.py
- [ ] ├── interpreter.py
- [ ] └── pagination.py (currently ~16%) | 1.0 | non_priority | 0 |
105,929 | 11,462,475,567 | IssuesEvent | 2020-02-07 14:13:40 | justinwilaby/spellchecker-wasm | https://api.github.com/repos/justinwilaby/spellchecker-wasm | closed | Browser example does not work | bug documentation | Hi,
I may be missing something, but in the initializeSpellchecker function, "spellchecker" is not defined.
Furthermore, when running your sample, I get this error (latest version of Chrome):
spellchecker.js:56 Uncaught TypeError: initializeSpellchecker.then is not a function
Thanks for any help
| 1.0 | non_priority | 0 |
442,843 | 12,751,558,027 | IssuesEvent | 2020-06-27 11:42:31 | sodafoundation/dashboard | https://api.github.com/repos/sodafoundation/dashboard | closed | Opening the Cloud Volume page in Firefox / Edge browser breaks the UI. | Critical Priority bug | **Issue/Feature Description:**
When the dashboard is accessed using the Firefox or Edge browser, the UI crashes with JavaScript errors on opening the Cloud Volume page.


**Why this issue to fixed / feature is needed(give scenarios or use cases):**
The UI is completely broken and unusable after this point.
**How to reproduce, in case of a bug:**
1. Open the dashboard using Firefox or Edge browser.
2. Login and navigate to Resource -> Volume -> Cloud Tab
3. Check the developer console. you will see above error.
4. The page will be broken and the Create At field in the table will be empty.
**Other Notes / Environment Information: (Please give the env information, log link or any useful information for this issue)**
| 1.0 | priority | 1 |
143,377 | 13,061,657,477 | IssuesEvent | 2020-07-30 14:10:46 | TheThingsNetwork/lorawan-stack | https://api.github.com/repos/TheThingsNetwork/lorawan-stack | closed | Document Gateway Server MQTT V3 messages | c/gateway server documentation prio/medium s/in progress | <!--
Thanks for submitting this documentation request. Please fill the template
below, otherwise we will not be able to process this request.
-->
#### Summary
<!-- Summarize the request in a few sentences: -->
Document Gateway Server MQTT V3 messages
#### Why do we need this ?
<!-- Please explain the motivation, for whom, etc. -->
For users to know how to interact via MQTT with Gateway Server, using V3-enabled forwarders.
#### What is already there? What do you see now?
<!--
Please add any relevant documentation articles and resources. Screenshots if necessary.
-->
The application-layer integration: https://thethingsstack.io/v3.8.4/integrations/mqtt/
#### What is missing? What do you want to see?
<!-- Please add documentation sources (forum links, outside sources...), mock-ups if applicable -->
Similar documentation for interacting with the Gateway Server via MQTT.
#### How do you propose to document this?
<!-- Please think about how this could be fixed. -->
Authentication, message format, topic structure and some example messages for uplink, downlink and status.
#### Can you do this yourself and submit a Pull Request?
<!-- You can also @mention experts if you need help with this. -->
Can review | 1.0 | non_priority | 0 |
200,193 | 15,092,285,686 | IssuesEvent | 2021-02-06 18:53:48 | PaigeVannelli/whats-cookin-starter-kit | https://api.github.com/repos/PaigeVannelli/whats-cookin-starter-kit | opened | Pantry Class Testing | class testing | - [ ] test constructor
- [ ] test if user has enough ingredients to cook a meal
- [ ] test if user doesn't have the amount of ingredients - return missing ingredients amount
- [ ] test if user doesn't have an ingredient - return missing ingredient amounts
- [ ] test if user's ingredients are removed from pantry
- [ ] sad path testing
- [ ] refactor
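These checklist items map naturally onto assertions. A rough sketch of the ingredient-checking logic they describe (written in Python purely for illustration; the real project is JavaScript, and the class and method names here are hypothetical):

```python
class Pantry:
    """Minimal pantry model: stock is a mapping {ingredient: amount}."""

    def __init__(self, stock):
        self.stock = dict(stock)

    def missing_for(self, recipe):
        """Return {ingredient: missing_amount} for a recipe's needs."""
        missing = {}
        for ingredient, needed in recipe.items():
            have = self.stock.get(ingredient, 0)
            if have < needed:
                missing[ingredient] = needed - have
        return missing

    def cook(self, recipe):
        """Remove a recipe's ingredients from stock; fail if any are missing."""
        if self.missing_for(recipe):
            raise ValueError("not enough ingredients")
        for ingredient, needed in recipe.items():
            self.stock[ingredient] -= needed
```

Each checklist item then becomes one test against this interface: the happy path (`missing_for` returns an empty dict), the two shortage paths (partial amount and absent ingredient), and the removal of cooked ingredients from the pantry.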
- [ ] linter | 1.0 | non_priority | 0 |
136,034 | 12,697,992,678 | IssuesEvent | 2020-06-22 12:46:00 | zimmerman-zimmerman/iati.cloud | https://api.github.com/repos/zimmerman-zimmerman/iati.cloud | opened | Validator slidedeck | DOCUMENTATION ENHANCEMENT | @luminhan will create a slidedeck so it is clear how the current flow of validation results to the datastore is handled and how the current setup lacks some synchronisation handling. Use https://app.lucidchart.com/documents/edit/b514aed0-c8c4-4cff-a24a-f3c62c3ddf09/0_0 as the base for those slides. | 1.0 | non_priority | 0 |
172,539 | 13,309,920,079 | IssuesEvent | 2020-08-26 05:25:08 | dunossauro/todo_list_flask_brython | https://api.github.com/repos/dunossauro/todo_list_flask_brython | opened | Criar BasePage para Page Objects | enhancement testing | Durante o desenvolvimento de #16 foi notado que as páginas compartilham o mesmo método de erros
`wait_error_message` em praticamente todos os PO com a espera do mesmo elemento.
Criar um base page para resolver esse problema de duplicação de código | 1.0 | Criar BasePage para Page Objects - Durante o desenvolvimento de #16 foi notado que as páginas compartilham o mesmo método de erros
`wait_error_message` em praticamente todos os PO com a espera do mesmo elemento.
Criar um base page para resolver esse problema de duplicação de código | non_priority | criar basepage para page objects durante o desenvolvimento de foi notado que as páginas compartilham o mesmo método de erros wait error message em praticamente todos os po com a espera do mesmo elemento criar um base page para resolver esse problema de duplicação de código | 0 |
532,126 | 15,530,043,925 | IssuesEvent | 2021-03-13 17:32:30 | dandyvalentine/lum-chan | https://api.github.com/repos/dandyvalentine/lum-chan | closed | MongoDB: Log Document | Priority: Critical Status: Completed Type: Maintenance | ## MongoDB: Log Document ##
The Log document of **_Lum-chan!_ (2.0.0)** will be available via the [MongoDB Cloud](https://www.mongodb.com/cloud) service. This will allow for remote monitoring of exceptions and operations without having to directly access the production server of this application.
### Schema ###
This is a sample of the _log.model.js_ source file.
```javascript
const mongoose = require('mongoose');

const log_schema = new mongoose.Schema({
meta: { type: String, required: true },
level: { type: String, required: true },
message: { type: String, required: true },
hostname: { type: String, required: true },
timestamp: { type: Date, required: true }
});
```
#### Meta ####
The **_meta_** field will describe relevant metadata relating to any particular document.
#### Level ####
The **_level_** field will describe the type of logging level of any particular document. The three expected types of logging levels are `info`, `warn`, and `error`.
#### Message ####
The **_message_** field will describe the information of any particular document.
#### Hostname ####
The **_hostname_** field will describe the machine of any particular document of which a logging event had occurred.
#### Timestamp ####
The **_timestamp_** field is necessary purely for record-keeping purposes. | 1.0 | priority | 1 |
33,895 | 7,762,404,263 | IssuesEvent | 2018-06-01 13:27:02 | surrsurus/edgequest | https://api.github.com/repos/surrsurus/edgequest | opened | Add screens to renderer | enhancement:code eq:core priority:low | The renderer could benefit from a `Screen` trait that has some `draw_all()` method implemented. This would allow the renderer to hold some `Box<Screen>` and allow the game to swap such screens in and out. This would allow for things such as UI screens, a main game screen, etc that could be modularly hot swapped on a button press. | 1.0 | non_priority | 0 |
178,320 | 6,607,121,572 | IssuesEvent | 2017-09-19 05:04:02 | apache/incubator-openwhisk-wskdeploy | https://api.github.com/repos/apache/incubator-openwhisk-wskdeploy | closed | Two "inputs" formats create different assets | bug priority: medium | According to the samples in the specification, we support two formats for "inputs" (a list of parameters):
a complex one:
```
actions:
func1:
function: actions/function.js
runtime: nodejs:6
inputs:
functionID:
type: string
description: the ID of function
visited:
type: string
      description: the visited city list
```
and a simple one:
```
package:
name: TestSequencesCreation
actions:
func1:
function: actions/function.js
runtime: nodejs:6
inputs:
functionID: string
visited: string
```
The complex one will create an action with the empty string `""` as the default value of its parameters. The simple one will create an action with `string` as the default value of its parameters. See the samples below.
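The difference is easy to see once the manifest is parsed: in the complex form each parameter maps to a dictionary of properties, while in the simple form it maps to a bare scalar. A small sketch (assuming the YAML has already been parsed into Python dicts; `default_value` is a hypothetical helper for illustration, not wskdeploy's actual code):

```python
def default_value(param_spec):
    """Derive a parameter's default from its parsed YAML spec.

    Complex form: a mapping like {"type": "string", "description": ...}
    with no explicit default -> use the empty string.
    Simple form: a bare scalar like "string" -> naively reusing the
    scalar as the default reproduces the bug described in this issue.
    """
    if isinstance(param_spec, dict):
        return param_spec.get("default", "")
    return param_spec  # bug: the type name leaks through as the value

# Parsed shapes of the two manifests above (for one parameter).
complex_inputs = {"functionID": {"type": "string", "description": "the ID of function"}}
simple_inputs = {"functionID": "string"}
```

Under this reading, the complex manifest yields `""` and the simple manifest yields `"string"`, which matches the two deployed actions shown below.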
Action created by the complex format:
```
{
"namespace": "guoyingc@cn.ibm.com_dev/TestSequencesCreation",
"name": "func1",
"version": "0.0.1",
"exec": {
"kind": "nodejs:6",
"code": "/**\n * Return a simple string to \n * confirm this function has been visited.\n *\n * @param visited the visited function list\n */\nfunction main(params) {\n functionID = params.functionID || 'X'\n if (params.visited == null) {\n params.visited = 'function'+functionID;\n } else {\n params.visited = params.visited + ', function'+functionID;\n }\n return {\"visited\":params.visited};\n}\n"
},
"annotations": [
{
"key": "exec",
"value": "nodejs:6"
}
],
"parameters": [
{
"key": "functionID",
"value": ""
},
{
"key": "visited",
"value": ""
}
],
"limits": {
"timeout": 60000,
"memory": 256,
"logs": 10
},
"publish": false
}
```
Action created by the simple format:
```
ok: got action TestSequencesCreation/func1
{
"namespace": "guoyingc@cn.ibm.com_dev/TestSequencesCreation",
"name": "func1",
"version": "0.0.1",
"exec": {
"kind": "nodejs:6",
"code": "/**\n * Return a simple string to \n * confirm this function has been visited.\n *\n * @param visited the visited function list\n */\nfunction main(params) {\n functionID = params.functionID || 'X'\n if (params.visited == null) {\n params.visited = 'function'+functionID;\n } else {\n params.visited = params.visited + ', function'+functionID;\n }\n return {\"visited\":params.visited};\n}\n"
},
"annotations": [
{
"key": "exec",
"value": "nodejs:6"
}
],
"parameters": [
{
"key": "functionID",
"value": "string"
},
{
"key": "visited",
"value": "string"
}
],
"limits": {
"timeout": 60000,
"memory": 256,
"logs": 10
},
"publish": false
}
```
I think it is a bug that the simple format creates an action with `string` as the default value of each parameter. We should fix it. | 1.0 | Two "inputs" formats create different assets - According to the samples in the specification, we support two formats for "inputs" (list of
parameters):
a complex one:
```
actions:
func1:
function: actions/function.js
runtime: nodejs:6
inputs:
functionID:
type: string
description: the ID of function
visited:
type: string
description: the visited city list
```
and a simple one:
```
package:
name: TestSequencesCreation
actions:
func1:
function: actions/function.js
runtime: nodejs:6
inputs:
functionID: string
visited: string
```
The complex one creates an action with the empty string `""` as the default value of each parameter. The simple one creates an action with the literal string `string` as the default value of each parameter. See the samples below.
action created by the complex format:
```
{
"namespace": "guoyingc@cn.ibm.com_dev/TestSequencesCreation",
"name": "func1",
"version": "0.0.1",
"exec": {
"kind": "nodejs:6",
"code": "/**\n * Return a simple string to \n * confirm this function has been visited.\n *\n * @param visited the visited function list\n */\nfunction main(params) {\n functionID = params.functionID || 'X'\n if (params.visited == null) {\n params.visited = 'function'+functionID;\n } else {\n params.visited = params.visited + ', function'+functionID;\n }\n return {\"visited\":params.visited};\n}\n"
},
"annotations": [
{
"key": "exec",
"value": "nodejs:6"
}
],
"parameters": [
{
"key": "functionID",
"value": ""
},
{
"key": "visited",
"value": ""
}
],
"limits": {
"timeout": 60000,
"memory": 256,
"logs": 10
},
"publish": false
}
```
action created by the simple format:
```
ok: got action TestSequencesCreation/func1
{
"namespace": "guoyingc@cn.ibm.com_dev/TestSequencesCreation",
"name": "func1",
"version": "0.0.1",
"exec": {
"kind": "nodejs:6",
"code": "/**\n * Return a simple string to \n * confirm this function has been visited.\n *\n * @param visited the visited function list\n */\nfunction main(params) {\n functionID = params.functionID || 'X'\n if (params.visited == null) {\n params.visited = 'function'+functionID;\n } else {\n params.visited = params.visited + ', function'+functionID;\n }\n return {\"visited\":params.visited};\n}\n"
},
"annotations": [
{
"key": "exec",
"value": "nodejs:6"
}
],
"parameters": [
{
"key": "functionID",
"value": "string"
},
{
"key": "visited",
"value": "string"
}
],
"limits": {
"timeout": 60000,
"memory": 256,
"logs": 10
},
"publish": false
}
```
I think it is a bug that the simple format creates an action with `string` to be the default values of parameters. We should fix it. | priority | two inputs formats create different assets according the samples in specification we support two formats for inputs list of parameter a complex one actions function actions function js runtime nodejs inputs functionid type string description the id of function visited type string description the visted city list and a simple one package name testsequencescreation actions function actions function js runtime nodejs inputs functionid string visited string the complex one will create an action with empty string to be the default values of parameters the simple one will create an action with string to be the default values of parameters see below samples action created by complex format namespace guoyingc cn ibm com dev testsequencescreation name version exec kind nodejs code n return a simple string to n confirm this function has been visited n n param visited the visited function list n nfunction main params n functionid params functionid x n if params visited null n params visited function functionid n else n params visited params visited function functionid n n return visited params visited n n annotations key exec value nodejs parameters key functionid value key visited value limits timeout memory logs publish false action created by the simple format ok got action testsequencescreation namespace guoyingc cn ibm com dev testsequencescreation name version exec kind nodejs code n return a simple string to n confirm this function has been visited n n param visited the visited function list n nfunction main params n functionid params functionid x n if params visited null n params visited function functionid n else n params visited params visited function functionid n n return visited params visited n n annotations key exec value nodejs parameters key functionid value string key visited value string limits timeout memory logs 
publish false i think it is a bug that the simple format creates an action with string to be the default values of parameters we should fix it | 1 |
580,466 | 17,259,078,288 | IssuesEvent | 2021-07-22 03:25:23 | Thorium-Sim/thorium | https://api.github.com/repos/Thorium-Sim/thorium | opened | Copy Timeline Items Across to Other Timelines | priority/medium type/feature | ### Requested By: Bracken
### Priority: Medium
### Version: 3.3.1
What are the odds of adding the ability to copy a timeline item across to another timeline?
| 1.0 | Copy Timeline Items Across to Other Timelines - ### Requested By: Bracken
### Priority: Medium
### Version: 3.3.1
What are the odds of adding the ability to copy a timeline item across to another timeline?
| priority | copy timeline items across to other timelines requested by bracken priority medium version what are the odds of adding the ability to copy a timeline item across to another timeline | 1 |
597,285 | 18,160,021,193 | IssuesEvent | 2021-09-27 08:33:19 | creativecommons/project_creativecommons.org | https://api.github.com/repos/creativecommons/project_creativecommons.org | closed | Profile (Gutenberg editor test) | enhancement good first issue help wanted 🟥 priority: critical 🕹 aspect: interface | We recently agreed to use Gutenberg editor as much as possible to build pages for creativecommons.org. Now, we need to test how well Gutenberg default widgets support each page design.
## Task
Try to realize the creativecommons.org [Profile page mockup](https://www.figma.com/file/K6kbDVsx4Zpluhd52yEdDB/Mockups?node-id=2205%3A5735) using only Gutenberg default widgets.
## Workflow
Discuss the overall task and basic steps below by adding comments to this issue. Keep a holistic view in mind based on the overall task defined in creativecommons/project_creativecommons.org#24
- [ ] identify page elements and corresponding Gutenberg editor blocks
- [ ] identify composite elements (e.g. a "staff profile" element that might be built with a combination of Gutenberg blocks)
- [ ] if applicable, attempt to build the page layout using generic Gutenberg editor blocks
- [ ] document the process including
- [ ] the specific block hierarchy used and
- [ ] any difficulties/shortcomings | 1.0 | Profile (Gutenberg editor test) - We recently agreed to use Gutenberg editor as much as possible to build pages for creativecommons.org. Now, we need to test how well Gutenberg default widgets support each page design.
## Task
Try to realize the creativecommons.org [Profile page mockup](https://www.figma.com/file/K6kbDVsx4Zpluhd52yEdDB/Mockups?node-id=2205%3A5735) using only Gutenberg default widgets.
## Workflow
Discuss the overall task and basic steps below by adding comments to this issue. Keep a holistic view in mind based on the overall task defined in creativecommons/project_creativecommons.org#24
- [ ] identify page elements and corresponding Gutenberg editor blocks
- [ ] identify composite elements (e.g. a "staff profile" element that might be built with a combination of Gutenberg blocks)
- [ ] if applicable, attempt to build the page layout using generic Gutenberg editor blocks
- [ ] document the process including
- [ ] the specific block hierarchy used and
- [ ] any difficulties/shortcomings | priority | profile gutenberg editor test we recently agreed to use gutenberg editor as much as possible to build pages for creativecommons org now we need to test how well gutenberg default widgets support each page design task try to realize the creativecommons org using only gutenberg default widgets workflow discuss the overall task and basic steps below by adding comments to this issue keep a holistic view in mind based on the overall task defined in creativecommons project creativecommons org identify page elements and corresponding gutenberg editor blocks identify composite elements e g a staff profile element that might be built with a combination of gutenberg blocks if applicable attempt to build the page layout using generic gutenberg editor blocks document the process including the specific block hierarchy used and any difficulties shortcomings | 1 |
822,909 | 30,913,817,696 | IssuesEvent | 2023-08-05 03:02:32 | space-wizards/space-station-14 | https://api.github.com/repos/space-wizards/space-station-14 | closed | Carp on uplink seems a bit too stronk | Priority: 2-Before Release Difficulty: 1-Easy Issue: Balance | e.g.
- Nukies spam it
- Make carp instant death trap
It's just a matter of whether people want to abuse it or not rn. | 1.0 | Carp on uplink seems a bit too stronk - e.g.
- Nukies spam it
- Make carp instant death trap
It's just a matter of whether people want to abuse it or not rn. | priority | carp on uplink seems a bit too stronk e g nukies spam it make carp instant death trap it s just a matter of whether people want to abuse it or not rn | 1 |
245,010 | 7,880,732,448 | IssuesEvent | 2018-06-26 16:46:30 | aowen87/FOO | https://api.github.com/repos/aowen87/FOO | closed | DBMetaData MeshForVar needs processed var names, but GetMesh, GetVar, etc get passed raw var names | Likelihood: 3 - Occasional OS: All Priority: Normal Severity: 2 - Minor Irritation Support Group: Any Target Version: 2.12.0 bug version: 2.10.0 | If you have a variable name with special chars in it (colons, brackets, etc)
You will have to use the filtered names to interact with the metadata -- even though the names requested by the pipeline appear to be the unfiltered names.
| 1.0 | DBMetaData MeshForVar needs processed var names, but GetMesh, GetVar, etc get passed raw var names - If you have a variable name with special chars in it (colons, brackets, etc)
You will have to use the filtered names to interact with the metadata -- even though the names requested by the pipeline appear to be the unfiltered names.
| priority | dbmetadata meshforvar needs processed var names but getmesh getvar etc get passed raw var names if you have a variable name with special chars in it colons brackets etc you will have to use the filtered names to interact with the metadata even though the names requested by the pipeline appear to be the unfiltered names | 1 |
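A minimal sketch of the mismatch (hypothetical filtering rule, not VisIt's actual code): the metadata ends up keyed by sanitized names, so a `MeshForVar`-style lookup has to try the filtered form of the raw name the pipeline hands it:

```python
import re

def filter_name(raw):
    """Sanitize a variable name the way a metadata layer might,
    replacing special characters (colons, brackets, slashes, spaces).
    The exact character set here is a hypothetical example."""
    return re.sub(r"[:\[\]<>/ ]", "_", raw)

def mesh_for_var(metadata, raw_name):
    """Return the mesh for a variable, accepting either the raw
    pipeline name or the filtered name used as the metadata key."""
    if raw_name in metadata:
        return metadata[raw_name]
    return metadata.get(filter_name(raw_name))

# Metadata is stored under the filtered name...
meta = {filter_name("pressure[0]:avg"): "mesh1"}
# ...but GetMesh/GetVar-style callers pass the raw name.
print(mesh_for_var(meta, "pressure[0]:avg"))  # -> mesh1
```

Accepting both spellings at the lookup boundary avoids forcing every caller to know about the filtering rule.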
284,228 | 8,736,725,576 | IssuesEvent | 2018-12-11 20:22:22 | aowen87/TicketTester | https://api.github.com/repos/aowen87/TicketTester | closed | Displaying curves in the pseudocolor plot as ribbons crashes the engine. | bug crash likelihood medium priority reviewed severity high wrong results | I was displaying streamlines as ribbons in the pseudocolor plot window and then engine crashed. We need this before supercomputing.
-----------------------REDMINE MIGRATION-----------------------
This ticket was migrated from Redmine. As such, not all
information was able to be captured in the transition. Below is
a complete record of the original redmine ticket.
Ticket number: 2700
Status: Resolved
Project: VisIt
Tracker: Bug
Priority: Immediate
Subject: Displaying curves in the pseudocolor plot as ribbons crashes the engine.
Assigned to: Eric Brugger
Category:
Target version: 2.12.0
Author: Eric Brugger
Start: 10/20/2016
Due date:
% Done: 100
Estimated time: 8.0
Created: 10/20/2016 02:35 pm
Updated: 10/25/2016 01:44 pm
Likelihood: 3 - Occasional
Severity: 4 - Crash / Wrong Results
Found in version: 2.10.0
Impact:
Expected Use:
OS: All
Support Group: Any
Description:
I was displaying streamlines as ribbons in the pseudocolor plot window and then engine crashed. We need this before supercomputing.
Comments:
I committed revisions 29476 and 29478 to the 2.11 RC and trunk with the following change: 1) I modified the polyline to ribbon filter to clean the polydata since the vtkRibbonFilter can't handle duplicate points. I also corrected several bugs in the integral curve filter's generation of normals for displaying the curves as ribbons. This resolves #2700. M avt/Filters/avtPolylineToRibbonFilter.C M avt/Filters/avtPolylineToTubeFilter.C M operators/IntegralCurve/avtIntegralCurveFilter.C
| 1.0 | Displaying curves in the pseudocolor plot as ribbons crashes the engine. - I was displaying streamlines as ribbons in the pseudocolor plot window and then engine crashed. We need this before supercomputing.
-----------------------REDMINE MIGRATION-----------------------
This ticket was migrated from Redmine. As such, not all
information was able to be captured in the transition. Below is
a complete record of the original redmine ticket.
Ticket number: 2700
Status: Resolved
Project: VisIt
Tracker: Bug
Priority: Immediate
Subject: Displaying curves in the pseudocolor plot as ribbons crashes the engine.
Assigned to: Eric Brugger
Category:
Target version: 2.12.0
Author: Eric Brugger
Start: 10/20/2016
Due date:
% Done: 100
Estimated time: 8.0
Created: 10/20/2016 02:35 pm
Updated: 10/25/2016 01:44 pm
Likelihood: 3 - Occasional
Severity: 4 - Crash / Wrong Results
Found in version: 2.10.0
Impact:
Expected Use:
OS: All
Support Group: Any
Description:
I was displaying streamlines as ribbons in the pseudocolor plot window and then engine crashed. We need this before supercomputing.
Comments:
I committed revisions 29476 and 29478 to the 2.11 RC and trunk with the following change: 1) I modified the polyline to ribbon filter to clean the polydata since the vtkRibbonFilter can't handle duplicate points. I also corrected several bugs in the integral curve filter's generation of normals for displaying the curves as ribbons. This resolves #2700. M avt/Filters/avtPolylineToRibbonFilter.C M avt/Filters/avtPolylineToTubeFilter.C M operators/IntegralCurve/avtIntegralCurveFilter.C
| priority | displaying curves in the pseudocolor plot as ribbons crashes the engine i was displaying streamlines as ribbons in the pseudocolor plot window and then engine crashed we need this before supercomputing redmine migration this ticket was migrated from redmine as such not all information was able to be captured in the transition below is a complete record of the original redmine ticket ticket number status resolved project visit tracker bug priority immediate subject displaying curves in the pseudocolor plot as ribbons crashes the engine assigned to eric brugger category target version author eric brugger start due date done estimated time created pm updated pm likelihood occasional severity crash wrong results found in version impact expected use os all support group any description i was displaying streamlines as ribbons in the pseudocolor plot window and then engine crashed we need this before supercomputing comments i committed revisions and to the rc and trunk withthe following change i modified the polyline to ribbon filter to clean the polydata since the vtkribbonfilter can t handle duplicate points i also corrected several bugs with the integral curver filter with the generation of normals for displaying the curves as ribbons this resolves m avt filters avtpolylinetoribbonfilter cm avt filters avtpolylinetotubefilter cm operators integralcurve avtintegralcurvefilter c | 1 |
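The fix described in the comment, cleaning the polydata before ribboning because duplicate points break normal generation, can be illustrated with a small sketch (plain Python, not VisIt's actual avtPolylineToRibbonFilter code): a zero-length segment between consecutive duplicate points has no direction, so the duplicates must be dropped first:

```python
def clean_polyline(points, tol=0.0):
    """Drop consecutive duplicate points from a polyline.
    A zero-length segment gives an undefined tangent (and hence an
    undefined ribbon normal); removing duplicates mirrors what a
    vtkCleanPolyData-style pass does upstream of a ribbon filter."""
    cleaned = [points[0]]
    for p in points[1:]:
        # Keep a point only if it differs from the previous kept one.
        if max(abs(a - b) for a, b in zip(p, cleaned[-1])) > tol:
            cleaned.append(p)
    return cleaned

path = [(0, 0, 0), (0, 0, 0), (1, 0, 0), (1, 0, 0), (2, 1, 0)]
print(clean_polyline(path))  # -> [(0, 0, 0), (1, 0, 0), (2, 1, 0)]
```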
637,666 | 20,674,939,513 | IssuesEvent | 2022-03-10 08:18:22 | wso2/product-apim | https://api.github.com/repos/wso2/product-apim | opened | Key Provisioning does not work | Type/Bug Priority/High APIM - 4.1.0 | ### Description:
Followed documentation [1]. Getting the below error when providing existing OAuth keys and clicking the Provide button.
<img width="1680" alt="Screenshot 2022-03-10 at 13 46 14" src="https://user-images.githubusercontent.com/8557410/157617934-905757c1-6a7e-4c54-a29b-6a1d0f80a7bb.png">
```[2022-03-10 13:44:52,249] ERROR - AbstractKeyManager Some thing went wrong while getting OAuth application for given consumer key 1aa3HVmTqOmtVFAuovGoVAa5XN8a
org.wso2.carbon.apimgt.impl.kmclient.KeyManagerClientException: Received status code: 401 Reason:
at org.wso2.carbon.apimgt.impl.kmclient.KMClientErrorDecoder.decode_aroundBody0(KMClientErrorDecoder.java:42) ~[org.wso2.carbon.apimgt.impl_9.20.12.jar:?]
at org.wso2.carbon.apimgt.impl.kmclient.KMClientErrorDecoder.decode(KMClientErrorDecoder.java:35) ~[org.wso2.carbon.apimgt.impl_9.20.12.jar:?]
at feign.AsyncResponseHandler.handleResponse(AsyncResponseHandler.java:96) ~[io.github.openfeign.feign-core_11.0.0.jar:?]
at feign.SynchronousMethodHandler.executeAndDecode(SynchronousMethodHandler.java:138) ~[io.github.openfeign.feign-core_11.0.0.jar:?]
at feign.SynchronousMethodHandler.invoke(SynchronousMethodHandler.java:89) ~[io.github.openfeign.feign-core_11.0.0.jar:?]
at feign.ReflectiveFeign$FeignInvocationHandler.invoke(ReflectiveFeign.java:100) ~[io.github.openfeign.feign-core_11.0.0.jar:?]
at com.sun.proxy.$Proxy475.getApplication(Unknown Source) ~[?:?]
at org.wso2.carbon.apimgt.impl.AMDefaultKeyManagerImpl.mapOAuthApplication_aroundBody20(AMDefaultKeyManagerImpl.java:581) ~[org.wso2.carbon.apimgt.impl_9.20.12.jar:?]
at org.wso2.carbon.apimgt.impl.AMDefaultKeyManagerImpl.mapOAuthApplication(AMDefaultKeyManagerImpl.java:561) ~[org.wso2.carbon.apimgt.impl_9.20.12.jar:?]
at org.wso2.carbon.apimgt.impl.APIConsumerImpl.mapExistingOAuthClient_aroundBody78(APIConsumerImpl.java:2517) ~[org.wso2.carbon.apimgt.impl_9.20.12.jar:?]
at org.wso2.carbon.apimgt.impl.APIConsumerImpl.mapExistingOAuthClient(APIConsumerImpl.java:2452) ~[org.wso2.carbon.apimgt.impl_9.20.12.jar:?]
at org.wso2.carbon.apimgt.rest.api.store.v1.impl.ApplicationsApiServiceImpl.applicationsApplicationIdMapKeysPost(ApplicationsApiServiceImpl.java:1101) ~[?:?]
at org.wso2.carbon.apimgt.rest.api.store.v1.ApplicationsApi.applicationsApplicationIdMapKeysPost(ApplicationsApi.java:281) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_301]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_301]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_301]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_301]
at org.apache.cxf.service.invoker.AbstractInvoker.performInvocation(AbstractInvoker.java:179) ~[?:?]
at org.apache.cxf.service.invoker.AbstractInvoker.invoke(AbstractInvoker.java:96) ~[?:?]
at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:201) ~[?:?]
at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:104) ~[?:?]
at org.apache.cxf.interceptor.ServiceInvokerInterceptor$1.run(ServiceInvokerInterceptor.java:59) ~[?:?]
at org.apache.cxf.interceptor.ServiceInvokerInterceptor.handleMessage(ServiceInvokerInterceptor.java:96) ~[?:?]
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307) ~[?:?]
at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121) ~[?:?]
at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:265) ~[?:?]
at org.apache.cxf.transport.servlet.ServletController.invokeDestination(ServletController.java:234) ~[?:?]
at org.apache.cxf.transport.servlet.ServletController.invoke(ServletController.java:208) ~[?:?]
at org.apache.cxf.transport.servlet.ServletController.invoke(ServletController.java:160) ~[?:?]
at org.apache.cxf.transport.servlet.CXFNonSpringServlet.invoke(CXFNonSpringServlet.java:225) ~[?:?]
at org.apache.cxf.transport.servlet.AbstractHTTPServlet.handleRequest(AbstractHTTPServlet.java:304) ~[?:?]
at org.apache.cxf.transport.servlet.AbstractHTTPServlet.doPost(AbstractHTTPServlet.java:217) ~[?:?]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:681) ~[tomcat-servlet-api_9.0.54.wso2v1.jar:?]
at org.apache.cxf.transport.servlet.AbstractHTTPServlet.service(AbstractHTTPServlet.java:279) ~[?:?]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:227) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:197) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:97) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:540) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:135) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.wso2.carbon.identity.context.rewrite.valve.TenantContextRewriteValve.invoke(TenantContextRewriteValve.java:107) ~[org.wso2.carbon.identity.context.rewrite.valve_1.4.52.jar:?]
at org.wso2.carbon.identity.authz.valve.AuthorizationValve.invoke(AuthorizationValve.java:110) ~[org.wso2.carbon.identity.authz.valve_1.4.52.jar:?]
at org.wso2.carbon.identity.auth.valve.AuthenticationValve.invoke(AuthenticationValve.java:102) ~[org.wso2.carbon.identity.auth.valve_1.4.52.jar:?]
at org.wso2.carbon.tomcat.ext.valves.CompositeValve.continueInvocation(CompositeValve.java:101) ~[org.wso2.carbon.tomcat.ext_4.6.3.beta.jar:?]
at org.wso2.carbon.tomcat.ext.valves.TomcatValveContainer.invokeValves(TomcatValveContainer.java:49) ~[org.wso2.carbon.tomcat.ext_4.6.3.beta.jar:?]
at org.wso2.carbon.tomcat.ext.valves.CompositeValve.invoke(CompositeValve.java:62) ~[org.wso2.carbon.tomcat.ext_4.6.3.beta.jar:?]
at org.wso2.carbon.tomcat.ext.valves.CarbonStuckThreadDetectionValve.invoke(CarbonStuckThreadDetectionValve.java:146) ~[org.wso2.carbon.tomcat.ext_4.6.3.beta.jar:?]
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:687) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.wso2.carbon.tomcat.ext.valves.CarbonContextCreatorValve.invoke(CarbonContextCreatorValve.java:58) ~[org.wso2.carbon.tomcat.ext_4.6.3.beta.jar:?]
at org.wso2.carbon.tomcat.ext.valves.RequestCorrelationIdValve.invoke(RequestCorrelationIdValve.java:126) ~[org.wso2.carbon.tomcat.ext_4.6.3.beta.jar:?]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:357) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:382) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:895) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1722) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1191) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) ~[tomcat_9.0.54.wso2v1.jar:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_301]
[2022-03-10 13:44:52,250] ERROR - GlobalThrowableMapper Some thing went wrong while getting OAuth application for given consumer key 1aa3HVmTqOmtVFAuovGoVAa5XN8a```
[1] https://apim.docs.wso2.com/en/latest/design/api-security/oauth2/provisioning-out-of-band-oauth-clients/#step-3-provision-the-out-of-band-oauth2-client | 1.0 | Key Provisioning does not work - ### Description:
Followed documentation [1]. Getting the below error when providing existing oauth keys and clicking the Provide button.
<img width="1680" alt="Screenshot 2022-03-10 at 13 46 14" src="https://user-images.githubusercontent.com/8557410/157617934-905757c1-6a7e-4c54-a29b-6a1d0f80a7bb.png">
```[2022-03-10 13:44:52,249] ERROR - AbstractKeyManager Some thing went wrong while getting OAuth application for given consumer key 1aa3HVmTqOmtVFAuovGoVAa5XN8a
org.wso2.carbon.apimgt.impl.kmclient.KeyManagerClientException: Received status code: 401 Reason:
at org.wso2.carbon.apimgt.impl.kmclient.KMClientErrorDecoder.decode_aroundBody0(KMClientErrorDecoder.java:42) ~[org.wso2.carbon.apimgt.impl_9.20.12.jar:?]
at org.wso2.carbon.apimgt.impl.kmclient.KMClientErrorDecoder.decode(KMClientErrorDecoder.java:35) ~[org.wso2.carbon.apimgt.impl_9.20.12.jar:?]
at feign.AsyncResponseHandler.handleResponse(AsyncResponseHandler.java:96) ~[io.github.openfeign.feign-core_11.0.0.jar:?]
at feign.SynchronousMethodHandler.executeAndDecode(SynchronousMethodHandler.java:138) ~[io.github.openfeign.feign-core_11.0.0.jar:?]
at feign.SynchronousMethodHandler.invoke(SynchronousMethodHandler.java:89) ~[io.github.openfeign.feign-core_11.0.0.jar:?]
at feign.ReflectiveFeign$FeignInvocationHandler.invoke(ReflectiveFeign.java:100) ~[io.github.openfeign.feign-core_11.0.0.jar:?]
at com.sun.proxy.$Proxy475.getApplication(Unknown Source) ~[?:?]
at org.wso2.carbon.apimgt.impl.AMDefaultKeyManagerImpl.mapOAuthApplication_aroundBody20(AMDefaultKeyManagerImpl.java:581) ~[org.wso2.carbon.apimgt.impl_9.20.12.jar:?]
at org.wso2.carbon.apimgt.impl.AMDefaultKeyManagerImpl.mapOAuthApplication(AMDefaultKeyManagerImpl.java:561) ~[org.wso2.carbon.apimgt.impl_9.20.12.jar:?]
at org.wso2.carbon.apimgt.impl.APIConsumerImpl.mapExistingOAuthClient_aroundBody78(APIConsumerImpl.java:2517) ~[org.wso2.carbon.apimgt.impl_9.20.12.jar:?]
at org.wso2.carbon.apimgt.impl.APIConsumerImpl.mapExistingOAuthClient(APIConsumerImpl.java:2452) ~[org.wso2.carbon.apimgt.impl_9.20.12.jar:?]
at org.wso2.carbon.apimgt.rest.api.store.v1.impl.ApplicationsApiServiceImpl.applicationsApplicationIdMapKeysPost(ApplicationsApiServiceImpl.java:1101) ~[?:?]
at org.wso2.carbon.apimgt.rest.api.store.v1.ApplicationsApi.applicationsApplicationIdMapKeysPost(ApplicationsApi.java:281) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_301]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_301]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_301]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_301]
at org.apache.cxf.service.invoker.AbstractInvoker.performInvocation(AbstractInvoker.java:179) ~[?:?]
at org.apache.cxf.service.invoker.AbstractInvoker.invoke(AbstractInvoker.java:96) ~[?:?]
at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:201) ~[?:?]
at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:104) ~[?:?]
at org.apache.cxf.interceptor.ServiceInvokerInterceptor$1.run(ServiceInvokerInterceptor.java:59) ~[?:?]
at org.apache.cxf.interceptor.ServiceInvokerInterceptor.handleMessage(ServiceInvokerInterceptor.java:96) ~[?:?]
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307) ~[?:?]
at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121) ~[?:?]
at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:265) ~[?:?]
at org.apache.cxf.transport.servlet.ServletController.invokeDestination(ServletController.java:234) ~[?:?]
at org.apache.cxf.transport.servlet.ServletController.invoke(ServletController.java:208) ~[?:?]
at org.apache.cxf.transport.servlet.ServletController.invoke(ServletController.java:160) ~[?:?]
at org.apache.cxf.transport.servlet.CXFNonSpringServlet.invoke(CXFNonSpringServlet.java:225) ~[?:?]
at org.apache.cxf.transport.servlet.AbstractHTTPServlet.handleRequest(AbstractHTTPServlet.java:304) ~[?:?]
at org.apache.cxf.transport.servlet.AbstractHTTPServlet.doPost(AbstractHTTPServlet.java:217) ~[?:?]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:681) ~[tomcat-servlet-api_9.0.54.wso2v1.jar:?]
at org.apache.cxf.transport.servlet.AbstractHTTPServlet.service(AbstractHTTPServlet.java:279) ~[?:?]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:227) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:197) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:97) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:540) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:135) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.wso2.carbon.identity.context.rewrite.valve.TenantContextRewriteValve.invoke(TenantContextRewriteValve.java:107) ~[org.wso2.carbon.identity.context.rewrite.valve_1.4.52.jar:?]
at org.wso2.carbon.identity.authz.valve.AuthorizationValve.invoke(AuthorizationValve.java:110) ~[org.wso2.carbon.identity.authz.valve_1.4.52.jar:?]
at org.wso2.carbon.identity.auth.valve.AuthenticationValve.invoke(AuthenticationValve.java:102) ~[org.wso2.carbon.identity.auth.valve_1.4.52.jar:?]
at org.wso2.carbon.tomcat.ext.valves.CompositeValve.continueInvocation(CompositeValve.java:101) ~[org.wso2.carbon.tomcat.ext_4.6.3.beta.jar:?]
at org.wso2.carbon.tomcat.ext.valves.TomcatValveContainer.invokeValves(TomcatValveContainer.java:49) ~[org.wso2.carbon.tomcat.ext_4.6.3.beta.jar:?]
at org.wso2.carbon.tomcat.ext.valves.CompositeValve.invoke(CompositeValve.java:62) ~[org.wso2.carbon.tomcat.ext_4.6.3.beta.jar:?]
at org.wso2.carbon.tomcat.ext.valves.CarbonStuckThreadDetectionValve.invoke(CarbonStuckThreadDetectionValve.java:146) ~[org.wso2.carbon.tomcat.ext_4.6.3.beta.jar:?]
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:687) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.wso2.carbon.tomcat.ext.valves.CarbonContextCreatorValve.invoke(CarbonContextCreatorValve.java:58) ~[org.wso2.carbon.tomcat.ext_4.6.3.beta.jar:?]
at org.wso2.carbon.tomcat.ext.valves.RequestCorrelationIdValve.invoke(RequestCorrelationIdValve.java:126) ~[org.wso2.carbon.tomcat.ext_4.6.3.beta.jar:?]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:357) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:382) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:895) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1722) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1191) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659) ~[tomcat_9.0.54.wso2v1.jar:?]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) ~[tomcat_9.0.54.wso2v1.jar:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_301]
[2022-03-10 13:44:52,250] ERROR - GlobalThrowableMapper Some thing went wrong while getting OAuth application for given consumer key 1aa3HVmTqOmtVFAuovGoVAa5XN8a```
[1] https://apim.docs.wso2.com/en/latest/design/api-security/oauth2/provisioning-out-of-band-oauth-clients/#step-3-provision-the-out-of-band-oauth2-client | priority | key provisioning does not work description followed documentation getting the below error when providing existing oauth keys and clicking the provide button img width alt screenshot at src error abstractkeymanager some thing went wrong while getting oauth application for given consumer key org carbon apimgt impl kmclient keymanagerclientexception received status code reason at org carbon apimgt impl kmclient kmclienterrordecoder decode kmclienterrordecoder java at org carbon apimgt impl kmclient kmclienterrordecoder decode kmclienterrordecoder java at feign asyncresponsehandler handleresponse asyncresponsehandler java at feign synchronousmethodhandler executeanddecode synchronousmethodhandler java at feign synchronousmethodhandler invoke synchronousmethodhandler java at feign reflectivefeign feigninvocationhandler invoke reflectivefeign java at com sun proxy getapplication unknown source at org carbon apimgt impl amdefaultkeymanagerimpl mapoauthapplication amdefaultkeymanagerimpl java at org carbon apimgt impl amdefaultkeymanagerimpl mapoauthapplication amdefaultkeymanagerimpl java at org carbon apimgt impl apiconsumerimpl mapexistingoauthclient apiconsumerimpl java at org carbon apimgt impl apiconsumerimpl mapexistingoauthclient apiconsumerimpl java at org carbon apimgt rest api store impl applicationsapiserviceimpl applicationsapplicationidmapkeyspost applicationsapiserviceimpl java at org carbon apimgt rest api store applicationsapi applicationsapplicationidmapkeyspost applicationsapi java at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org apache cxf service invoker abstractinvoker 
performinvocation abstractinvoker java at org apache cxf service invoker abstractinvoker invoke abstractinvoker java at org apache cxf jaxrs jaxrsinvoker invoke jaxrsinvoker java at org apache cxf jaxrs jaxrsinvoker invoke jaxrsinvoker java at org apache cxf interceptor serviceinvokerinterceptor run serviceinvokerinterceptor java at org apache cxf interceptor serviceinvokerinterceptor handlemessage serviceinvokerinterceptor java at org apache cxf phase phaseinterceptorchain dointercept phaseinterceptorchain java at org apache cxf transport chaininitiationobserver onmessage chaininitiationobserver java at org apache cxf transport http abstracthttpdestination invoke abstracthttpdestination java at org apache cxf transport servlet servletcontroller invokedestination servletcontroller java at org apache cxf transport servlet servletcontroller invoke servletcontroller java at org apache cxf transport servlet servletcontroller invoke servletcontroller java at org apache cxf transport servlet cxfnonspringservlet invoke cxfnonspringservlet java at org apache cxf transport servlet abstracthttpservlet handlerequest abstracthttpservlet java at org apache cxf transport servlet abstracthttpservlet dopost abstracthttpservlet java at javax servlet http httpservlet service httpservlet java at org apache cxf transport servlet abstracthttpservlet service abstracthttpservlet java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org apache tomcat websocket server wsfilter dofilter wsfilter java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org apache catalina core standardwrappervalve invoke standardwrappervalve java at org apache catalina core standardcontextvalve invoke standardcontextvalve java at org apache 
catalina authenticator authenticatorbase invoke authenticatorbase java at org apache catalina core standardhostvalve invoke standardhostvalve java at org apache catalina valves errorreportvalve invoke errorreportvalve java at org carbon identity context rewrite valve tenantcontextrewritevalve invoke tenantcontextrewritevalve java at org carbon identity authz valve authorizationvalve invoke authorizationvalve java at org carbon identity auth valve authenticationvalve invoke authenticationvalve java at org carbon tomcat ext valves compositevalve continueinvocation compositevalve java at org carbon tomcat ext valves tomcatvalvecontainer invokevalves tomcatvalvecontainer java at org carbon tomcat ext valves compositevalve invoke compositevalve java at org carbon tomcat ext valves carbonstuckthreaddetectionvalve invoke carbonstuckthreaddetectionvalve java at org apache catalina valves abstractaccesslogvalve invoke abstractaccesslogvalve java at org carbon tomcat ext valves carboncontextcreatorvalve invoke carboncontextcreatorvalve java at org carbon tomcat ext valves requestcorrelationidvalve invoke requestcorrelationidvalve java at org apache catalina core standardenginevalve invoke standardenginevalve java at org apache catalina connector coyoteadapter service coyoteadapter java at org apache coyote service java at org apache coyote abstractprocessorlight process abstractprocessorlight java at org apache coyote abstractprotocol connectionhandler process abstractprotocol java at org apache tomcat util net nioendpoint socketprocessor dorun nioendpoint java at org apache tomcat util net socketprocessorbase run socketprocessorbase java at org apache tomcat util threads threadpoolexecutor runworker threadpoolexecutor java at org apache tomcat util threads threadpoolexecutor worker run threadpoolexecutor java at org apache tomcat util threads taskthread wrappingrunnable run taskthread java at java lang thread run thread java error globalthrowablemapper some thing went wrong 
while getting oauth application for given consumer key | 1 |
148,961 | 5,704,079,507 | IssuesEvent | 2017-04-18 02:42:43 | Erigitic/TotalEconomy | https://api.github.com/repos/Erigitic/TotalEconomy | reopened | Virtual Account Support | feature request high priority | Recently issues have been coming in that would be fixed if there was support for virtual accounts. As well as that, virtual account support should allow for other plugins, such as those with towns that have their own balance, to work with Total Economy. So, with that said, this is definitely priority number one.
UPDATE 4/13/17: It looks as if virtual accounts are being properly created, but issues arise with commands that are not set up to work with virtual accounts. Most if not all of these issues would be solved by determining if the UUID is valid. | 1.0 | Virtual Account Support - Recently issues have been coming in that would be fixed if there was support for virtual accounts. As well as that, virtual account support should allow for other plugins, such as those with towns that have their own balance, to work with Total Economy. So, with that said, this is definitely priority number one.
UPDATE 4/13/17: It looks as if virtual accounts are being properly created, but issues arise with commands that are not set up to work with virtual accounts. Most if not all of these issues would be solved by determining if the UUID is valid. | priority | virtual account support recently issues have been coming in that would be fixed if there was support for virtual accounts as well as that virtual account support should allow for other plugins such as those with towns that have their own balance to work with total economy so with that said this is definitely priority number one update it looks as if virtual accounts are being properly created but issues arise with commands that are not set up to work with virtual accounts most if not all of these issues would be solved by determining if the uuid is valid | 1 |
164,732 | 12,812,888,063 | IssuesEvent | 2020-07-04 09:26:15 | aliasrobotics/RVD | https://api.github.com/repos/aliasrobotics/RVD | closed | RVD#2954: CWE-134 (format), If format strings can be influenced by an attacker, they can be exploi... @ s/boards/aerofc-v1/init.c:100 | CWE-134 bug components software flawfinder flawfinder_level_4 mitigated robot component: PX4 static analysis testing triage version: v1.8.0 | ```yaml
id: 2954
title: 'RVD#2954: CWE-134 (format), If format strings can be influenced by an attacker,
they can be exploi... @ s/boards/aerofc-v1/init.c:100'
type: bug
description: If format strings can be influenced by an attacker, they can be exploited
(CWE-134). Use a constant for the format specification. . Happening @ ...s/boards/aerofc-v1/init.c:100
cwe:
- CWE-134
cve: None
keywords:
- flawfinder
- flawfinder_level_4
- static analysis
- testing
- triage
- CWE-134
- bug
- 'version: v1.8.0'
- 'robot component: PX4'
- components software
system: ./Firmware/src/drivers/boards/aerofc-v1/init.c:100:21
vendor: null
severity:
rvss-score: 0
rvss-vector: ''
severity-description: ''
cvss-score: 0
cvss-vector: ''
links:
- https://github.com/aliasrobotics/RVD/issues/2954
flaw:
phase: testing
specificity: subject-specific
architectural-location: application-specific
application: N/A
subsystem: N/A
package: N/A
languages: None
date-detected: 2020-06-29 (15:56)
detected-by: Alias Robotics
detected-by-method: testing static
date-reported: 2020-06-29 (15:56)
reported-by: Alias Robotics
reported-by-relationship: automatic
issue: https://github.com/aliasrobotics/RVD/issues/2954
reproducibility: always
trace: '(context) # define message printf'
reproduction: See artifacts below (if available)
reproduction-image: gitlab.com/aliasrobotics/offensive/alurity/pipelines/active/pipeline_px4/-/jobs/615986299/artifacts/download
exploitation:
description: ''
exploitation-image: ''
exploitation-vector: ''
exploitation-recipe: ''
mitigation:
description: Use a constant for the format specification
pull-request: ''
date-mitigation: ''
``` | 1.0 | RVD#2954: CWE-134 (format), If format strings can be influenced by an attacker, they can be exploi... @ s/boards/aerofc-v1/init.c:100 - ```yaml
id: 2954
title: 'RVD#2954: CWE-134 (format), If format strings can be influenced by an attacker,
they can be exploi... @ s/boards/aerofc-v1/init.c:100'
type: bug
description: If format strings can be influenced by an attacker, they can be exploited
(CWE-134). Use a constant for the format specification. . Happening @ ...s/boards/aerofc-v1/init.c:100
cwe:
- CWE-134
cve: None
keywords:
- flawfinder
- flawfinder_level_4
- static analysis
- testing
- triage
- CWE-134
- bug
- 'version: v1.8.0'
- 'robot component: PX4'
- components software
system: ./Firmware/src/drivers/boards/aerofc-v1/init.c:100:21
vendor: null
severity:
rvss-score: 0
rvss-vector: ''
severity-description: ''
cvss-score: 0
cvss-vector: ''
links:
- https://github.com/aliasrobotics/RVD/issues/2954
flaw:
phase: testing
specificity: subject-specific
architectural-location: application-specific
application: N/A
subsystem: N/A
package: N/A
languages: None
date-detected: 2020-06-29 (15:56)
detected-by: Alias Robotics
detected-by-method: testing static
date-reported: 2020-06-29 (15:56)
reported-by: Alias Robotics
reported-by-relationship: automatic
issue: https://github.com/aliasrobotics/RVD/issues/2954
reproducibility: always
trace: '(context) # define message printf'
reproduction: See artifacts below (if available)
reproduction-image: gitlab.com/aliasrobotics/offensive/alurity/pipelines/active/pipeline_px4/-/jobs/615986299/artifacts/download
exploitation:
description: ''
exploitation-image: ''
exploitation-vector: ''
exploitation-recipe: ''
mitigation:
description: Use a constant for the format specification
pull-request: ''
date-mitigation: ''
``` | non_priority | rvd cwe format if format strings can be influenced by an attacker they can be exploi s boards aerofc init c yaml id title rvd cwe format if format strings can be influenced by an attacker they can be exploi s boards aerofc init c type bug description if format strings can be influenced by an attacker they can be exploited cwe use a constant for the format specification happening s boards aerofc init c cwe cwe cve none keywords flawfinder flawfinder level static analysis testing triage cwe bug version robot component components software system firmware src drivers boards aerofc init c vendor null severity rvss score rvss vector severity description cvss score cvss vector links flaw phase testing specificity subject specific architectural location application specific application n a subsystem n a package n a languages none date detected detected by alias robotics detected by method testing static date reported reported by alias robotics reported by relationship automatic issue reproducibility always trace context define message printf reproduction see artifacts below if available reproduction image gitlab com aliasrobotics offensive alurity pipelines active pipeline jobs artifacts download exploitation description exploitation image exploitation vector exploitation recipe mitigation description use a constant for the format specification pull request date mitigation | 0 |
327,262 | 9,968,950,041 | IssuesEvent | 2019-07-08 16:46:30 | beelabhmc/ant_tracker | https://api.github.com/repos/beelabhmc/ant_tracker | closed | Parallelize croprotate.py | enhancement low-priority | The croprotate.py file is a significant bottleneck in the code's execution speed. With the current video setup and running on purves, it takes almost an hour to crop 8 ROIs on one video.
It would greatly increase the pipeline speed if this code gets parallelized, to run several crop actions at once. It's more important to get the pipeline working, but this is a thing that can be worked on to improve the pipeline performance. | 1.0 | Parallelize croprotate.py - The croprotate.py file is a significant bottleneck in the code's execution speed. With the current video setup and running on purves, it takes almost an hour to crop 8 ROIs on one video.
It would greatly increase the pipeline speed if this code gets parallelized, to run several crop actions at once. It's more important to get the pipeline working, but this is a thing that can be worked on to improve the pipeline performance. | priority | parallelize croprotate py the croprotate py file is a significant bottleneck in the code s execution speed with the current video setup and running on purves it takes almost an hour to crop rois on one video it would greatly increase the pipeline speed if this code gets parallelized to run several crop actions at once it s more important to get the pipeline working but this is a thing that can be worked on to improve the pipeline performance | 1 |
184,353 | 14,289,010,911 | IssuesEvent | 2020-11-23 18:34:03 | github-vet/rangeclosure-findings | https://api.github.com/repos/github-vet/rangeclosure-findings | opened | posener/wstest: dialer_test.go; 27 LoC | small test |
Found a possible issue in [posener/wstest](https://www.github.com/posener/wstest) at [dialer_test.go](https://github.com/posener/wstest/blob/e79331f65216413fbfc72b452b2ee78884c97cc6/dialer_test.go#L82-L108)
The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements
which capture loop variables.
[Click here to see the code in its original context.](https://github.com/posener/wstest/blob/e79331f65216413fbfc72b452b2ee78884c97cc6/dialer_test.go#L82-L108)
<details>
<summary>Click here to show the 27 line(s) of Go which triggered the analyzer.</summary>
```go
for _, pair := range []struct{ src, dst *websocket.Conn }{{s.Conn, c}, {c, s.Conn}} {
go func() {
for i := 0; i < count; i++ {
err := pair.src.WriteJSON(i)
require.Nil(t, err)
}
}()
received := make([]bool, count)
for i := 0; i < count; i++ {
var j int
err := pair.dst.ReadJSON(&j)
require.Nil(t, err)
received[j] = true
}
var missing []int
for i := range received {
if !received[i] {
missing = append(missing, i)
}
}
assert.Equal(t, 0, len(missing), "%q -> %q: Did not received: %q", pair.src.LocalAddr(), pair.dst.LocalAddr(), missing)
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: e79331f65216413fbfc72b452b2ee78884c97cc6
| 1.0 | posener/wstest: dialer_test.go; 27 LoC -
Found a possible issue in [posener/wstest](https://www.github.com/posener/wstest) at [dialer_test.go](https://github.com/posener/wstest/blob/e79331f65216413fbfc72b452b2ee78884c97cc6/dialer_test.go#L82-L108)
The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements
which capture loop variables.
[Click here to see the code in its original context.](https://github.com/posener/wstest/blob/e79331f65216413fbfc72b452b2ee78884c97cc6/dialer_test.go#L82-L108)
<details>
<summary>Click here to show the 27 line(s) of Go which triggered the analyzer.</summary>
```go
for _, pair := range []struct{ src, dst *websocket.Conn }{{s.Conn, c}, {c, s.Conn}} {
go func() {
for i := 0; i < count; i++ {
err := pair.src.WriteJSON(i)
require.Nil(t, err)
}
}()
received := make([]bool, count)
for i := 0; i < count; i++ {
var j int
err := pair.dst.ReadJSON(&j)
require.Nil(t, err)
received[j] = true
}
var missing []int
for i := range received {
if !received[i] {
missing = append(missing, i)
}
}
assert.Equal(t, 0, len(missing), "%q -> %q: Did not received: %q", pair.src.LocalAddr(), pair.dst.LocalAddr(), missing)
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: e79331f65216413fbfc72b452b2ee78884c97cc6
| non_priority | posener wstest dialer test go loc found a possible issue in at the below snippet of go code triggered static analysis which searches for goroutines and or defer statements which capture loop variables click here to show the line s of go which triggered the analyzer go for pair range struct src dst websocket conn s conn c c s conn go func for i i count i err pair src writejson i require nil t err received make bool count for i i count i var j int err pair dst readjson j require nil t err received true var missing int for i range received if received missing append missing i assert equal t len missing q q did not received q pair src localaddr pair dst localaddr missing leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id | 0 |
784,450 | 27,571,514,759 | IssuesEvent | 2023-03-08 09:40:03 | AdrKacz/super-duper-guacamole | https://api.github.com/repos/AdrKacz/super-duper-guacamole | opened | 📈 Delete sam-group-production stack | priority:P3 operation | # What is your idea to improve operation?
`sam-group-production` is not used anymore (**to verify**), so delete it.
# How will your idea improve operation?
Reduce cost of maintenance.
# Why do we need your idea to improve operation? | 1.0 | 📈 Delete sam-group-production stack - # What is your idea to improve operation?
`sam-group-production` is not used anymore (**to verify**), so delete it.
# How will your idea improve operation?
Reduce cost of maintenance.
# Why do we need your idea to improve operation? | priority | 📈 delete sam group production stack what is your idea to improve operation sam group production is not used anymore to verify so delete it how will your idea improve operation reduce cost of maintenance why do we need your idea to improve operation | 1 |
118,629 | 15,342,905,984 | IssuesEvent | 2021-02-27 18:05:03 | plotn/coolreader | https://api.github.com/repos/plotn/coolreader | closed | Move the "Rendering options" and "DOM compatibility level" settings | design | A user [wrote](https://4pda.ru/forum/index.php?s=&showtopic=995536&view=findpost&p=104618639):
> In my opinion, "Rendering options" and "DOM compatibility level" should be moved into "Rare and experimental". An ordinary user has no chance of understanding what they are (even with the explanations), and their default values will suit the vast majority. As it is, they just sit there, intimidate, give no help, and are duplicated in "Fonts" and "CSS"
Then a bit more discussion:
> Good idea. But then it would be better to put them into "Fine font settings", additionally renaming that section to "... rendering and font" | 1.0 | Move the "Rendering options" and "DOM compatibility level" settings - A user [wrote](https://4pda.ru/forum/index.php?s=&showtopic=995536&view=findpost&p=104618639):
> In my opinion, "Rendering options" and "DOM compatibility level" should be moved into "Rare and experimental". An ordinary user has no chance of understanding what they are (even with the explanations), and their default values will suit the vast majority. As it is, they just sit there, intimidate, give no help, and are duplicated in "Fonts" and "CSS"
Then a bit more discussion:
> Good idea. But then it would be better to put them into "Fine font settings", additionally renaming that section to "... rendering and font" | non_priority | move the rendering options and dom compatibility level settings a user wrote in my opinion rendering options and dom compatibility level should be moved into rare and experimental an ordinary user has no chance of understanding what they are even with the explanations and their default values will suit the vast majority as it is they just sit there intimidate give no help and are duplicated in fonts and css then a bit more discussion good idea but then it would be better to put them into fine font settings additionally renaming that section to rendering and font | 0 |
126,926 | 5,007,702,836 | IssuesEvent | 2016-12-12 17:25:44 | RobotLocomotion/drake | https://api.github.com/repos/RobotLocomotion/drake | opened | lcmt_driving_command_t Should Be Updated to Include Reference Velocity Rather Than Reference Throttle / Brake | priority: medium type: cleanup | # Problem Definition
`drake/lcmtypes/lcmt_driving_command_t.lcm` over-specifies the system's desired state by including both `throttle` and `brake` reference values.
```
struct lcmt_driving_command_t {
// The timestamp in milliseconds.
int64_t timestamp;
double steering_angle;
double throttle;
double brake;
}
```
In reality, the `car_sim_lcm` system only provides two control inputs: (1) desired steering angle and (2) desired velocity. We should thus modify `lcmt_driving_command_t` to specify the desired velocity rather than the desired state of the `throttle` and `brake`. Once this is done, `car_sim_lcm` can be simplified since it no longer needs to combine `throttle` and `brake` commands into a single reference `velocity` for the controller to follow. | 1.0 | lcmt_driving_command_t Should Be Updated to Include Reference Velocity Rather Than Reference Throttle / Brake - # Problem Definition
`drake/lcmtypes/lcmt_driving_command_t.lcm` over-specifies the system's desired state by including both `throttle` and `brake` reference values.
```
struct lcmt_driving_command_t {
// The timestamp in milliseconds.
int64_t timestamp;
double steering_angle;
double throttle;
double brake;
}
```
In reality, the `car_sim_lcm` system only provides two control inputs: (1) desired steering angle and (2) desired velocity. We should thus modify `lcmt_driving_command_t` to specify the desired velocity rather than the desired state of the `throttle` and `brake`. Once this is done, `car_sim_lcm` can be simplified since it no longer needs to combine `throttle` and `brake` commands into a single reference `velocity` for the controller to follow. | priority | lcmt driving command t should be updated to include reference velocity rather than reference throttle brake problem definition drake lcmtypes lcmt driving command t lcm over specifies the system s desired state by including both throttle and brake reference values struct lcmt driving command t the timestamp in milliseconds t timestamp double steering angle double throttle double brake in reality the car sim lcm system only provides two control inputs desired steering angle and desired velocity we should thus modify lcmt driving command t to specify the desired velocity rather than the desired state of the throttle and brake once this is done car sim lcm can be simplified since it no longer needs to combine throttle and brake commands into a single reference velocity for the controller to follow | 1 |
82,291 | 7,836,560,329 | IssuesEvent | 2018-06-17 21:01:07 | FIDATA/gradle-semantic-release-plugin | https://api.github.com/repos/FIDATA/gradle-semantic-release-plugin | opened | Rewrite quoting arguments in tests using Gradle API | Test | `org.gradle.internal.Transformers.asSafeCommandLineArgument().transform(...)`
or
`org.gradle.internal.process.ArgWriter....` | 1.0 | Rewrite quoting arguments in tests using Gradle API - `org.gradle.internal.Transformers.asSafeCommandLineArgument().transform(...)`
or
`org.gradle.internal.process.ArgWriter....` | non_priority | rewrite quoting arguments in tests using gradle api org gradle internal transformers assafecommandlineargument transform or org gradle internal process argwriter | 0 |
4,572 | 5,194,842,007 | IssuesEvent | 2017-01-23 06:34:14 | msimerson/Mail-Toaster-6 | https://api.github.com/repos/msimerson/Mail-Toaster-6 | closed | integrate LetsEncrypt into the provisioning steps | enhancement security | ### present
As part of the base jail provisioning step, a self-signed TLS certificate is generated and left in /etc/ssl. When TLS is used (haproxy, haraka, dovecot), the files (key, certs, ca-cert) are installed into the necessary jails.
If the site has a real certificate, the sysadmin is expected to manually copy it into place before building the rest of the service jails.
### issues
- Let's Encrypt requires either DNS or HTTP tokens to be published to confirm domain ownership. The current build script uses HTTP.
- to validate via HTTP, the webmail and haproxy services jails must be available.
- the haproxy jail requires a TLS certificate
- The Let's Encrypt jail has scripts that deploy the updated TLS files into place (overwriting any existing certs) and automatically restart the services.
- If DNS isn't set up for the host, creating Let's Encrypt certs isn't going to work.
### future
The best way to introduce Let's Encrypt is... | True | integrate LetsEncrypt into the provisioning steps - ### present
As part of the base jail provisioning step, a self-signed TLS certificate is generated and left in /etc/ssl. When TLS is used (haproxy, haraka, dovecot), the files (key, certs, ca-cert) are installed into the necessary jails.
If the site has a real certificate, the sysadmin is expected to manually copy it into place before building the rest of the service jails.
### issues
- Let's Encrypt requires either DNS or HTTP tokens to be published to confirm domain ownership. The current build script uses HTTP.
- to validate via HTTP, the webmail and haproxy services jails must be available.
- the haproxy jail requires a TLS certificate
- The Let's Encrypt jail has scripts that deploy the updated TLS files into place (overwriting any existing certs) and automatically restart the services.
- If DNS isn't set up for the host, creating Let's Encrypt certs isn't going to work.
### future
The best way to introduce Let's Encrypt is... | non_priority | integrate letsencrypt into the provisioning steps present as part of the base jail provisioning step a self signed tls certificate is generated and left in etc ssl when tls is used haproxy haraka dovecot the files key certs ca cert are installed into the necessary jails if the site has a real certificate the sysadmin is expected to manually copy it into place before building the rest of the service jails issues let s encrypt requires either a dns or http tokens to be published to confirm domain ownership the current build script uses http to validate via http the webmail and haproxy services jails must be available the haproxy jail requires a tls certificate the let s encrypt jail has scripts that deploys the updated tls files into place overwriting any existing certs and automatically restarts the services if dns isn t set up for the host creating let s encrypt certs isn t going to work future the best way to introduce let s encrypt is | 0 |
96,760 | 16,164,652,834 | IssuesEvent | 2021-05-01 08:31:11 | AlexRogalskiy/github-action-issue-commenter | https://api.github.com/repos/AlexRogalskiy/github-action-issue-commenter | opened | CVE-2020-11023 (Medium) detected in jquery-1.8.1.min.js | security vulnerability | ## CVE-2020-11023 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.8.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js</a></p>
<p>Path to dependency file: github-action-issue-commenter/node_modules/redeyed/examples/browser/index.html</p>
<p>Path to vulnerable library: github-action-issue-commenter/node_modules/redeyed/examples/browser/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.8.1.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/github-action-issue-commenter/commit/9a16ececaefdcda6589f96993f0bc7b342f5fd59">9a16ececaefdcda6589f96993f0bc7b342f5fd59</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery versions greater than or equal to 1.0.3 and before 3.5.0, passing HTML containing <option> elements from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11023>CVE-2020-11023</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11023">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11023</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jquery - 3.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-11023 (Medium) detected in jquery-1.8.1.min.js - ## CVE-2020-11023 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.8.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js</a></p>
<p>Path to dependency file: github-action-issue-commenter/node_modules/redeyed/examples/browser/index.html</p>
<p>Path to vulnerable library: github-action-issue-commenter/node_modules/redeyed/examples/browser/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.8.1.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/github-action-issue-commenter/commit/9a16ececaefdcda6589f96993f0bc7b342f5fd59">9a16ececaefdcda6589f96993f0bc7b342f5fd59</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery versions greater than or equal to 1.0.3 and before 3.5.0, passing HTML containing <option> elements from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11023>CVE-2020-11023</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11023">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11023</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jquery - 3.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file github action issue commenter node modules redeyed examples browser index html path to vulnerable library github action issue commenter node modules redeyed examples browser index html dependency hierarchy x jquery min js vulnerable library found in head commit a href vulnerability details in jquery versions greater than or equal to and before passing html containing elements from untrusted sources even after sanitizing it to one of jquery s dom manipulation methods i e html append and others may execute untrusted code this problem is patched in jquery publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery step up your open source security game with whitesource | 0 |
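The CVSS 3 score of 6.1 reported in the row above follows mechanically from the listed base metrics. Below is a small sketch of the CVSS v3.0 base-score formula (weights and "round up" rule per the FIRST v3.0 specification; only the metric values needed here are tabulated):

```python
import math

# CVSS v3.0 weight tables (per the FIRST specification).
AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2}
AC = {"L": 0.77, "H": 0.44}
UI = {"N": 0.85, "R": 0.62}
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}

def pr_weight(pr, scope_changed):
    # Privileges Required weighs more when the scope changes.
    return {"N": 0.85,
            "L": 0.68 if scope_changed else 0.62,
            "H": 0.50 if scope_changed else 0.27}[pr]

def base_score(av, ac, pr, ui, scope, c, i, a):
    changed = scope == "C"
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    if changed:
        impact = 7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15
    else:
        impact = 6.42 * iss
    exploitability = 8.22 * AV[av] * AC[ac] * pr_weight(pr, changed) * UI[ui]
    if impact <= 0:
        return 0.0
    raw = min((1.08 if changed else 1.0) * (impact + exploitability), 10)
    return math.ceil(raw * 10) / 10   # the spec's "round up to one decimal"

# CVE-2020-11023 vector: AV:N/AC:L/PR:N/UI:R/S:C/C:L/I:L/A:N
print(base_score("N", "L", "N", "R", "C", "L", "L", "N"))  # → 6.1
```

Plugging in the exact metrics shown above (Network / Low / None / Required / Changed / Low / Low / None) reproduces the reported 6.1.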
313,177 | 23,460,543,926 | IssuesEvent | 2022-08-16 12:47:07 | corvvs/webserv | https://api.github.com/repos/corvvs/webserv | closed | HTTP errors and persistent connections | documentation design | ## Tasks
- [x] Requests that complete normally can be processed back to back without closing the connection
- [x] After a request that completes normally has been processed, the connection times out if no further data is sent
- [x] When two requests that complete normally are sent back to back, two responses come back in succession
- [ ] About recoverable errors
  - An error counts as a "recoverable error" when it matches one of the following:
    - An error raised before the request's header parsing completes, where the next request can still be processed as-is without any problem
    - An error raised during request matching
  - Errors that do not match the above are "unrecoverable errors", and are handled by throwing an exception.
- [x] If a request carries a recoverable error at routing time, skip routing and create a `FileReader` originator for the default error page.
- [ ] If the request-matching result carries a recoverable error, create a `FileReader` originator for the error page.
## Themes
- When an HTTP error exception is thrown partway through a round trip, how does the processing flow change?
- Given a request that triggers an HTTP error, under what conditions can the HTTP connection be kept alive?
| 1.0 | HTTP errors and persistent connections - ## Tasks
- [x] Requests that complete normally can be processed back to back without closing the connection
- [x] After a request that completes normally has been processed, the connection times out if no further data is sent
- [x] When two requests that complete normally are sent back to back, two responses come back in succession
- [ ] About recoverable errors
  - An error counts as a "recoverable error" when it matches one of the following:
    - An error raised before the request's header parsing completes, where the next request can still be processed as-is without any problem
    - An error raised during request matching
  - Errors that do not match the above are "unrecoverable errors", and are handled by throwing an exception.
- [x] If a request carries a recoverable error at routing time, skip routing and create a `FileReader` originator for the default error page.
- [ ] If the request-matching result carries a recoverable error, create a `FileReader` originator for the error page.
## Themes
- When an HTTP error exception is thrown partway through a round trip, how does the processing flow change?
- Given a request that triggers an HTTP error, under what conditions can the HTTP connection be kept alive?
| non_priority | http errors and persistent connections tasks requests that complete normally can be processed back to back without closing the connection after a request that completes normally has been processed the connection times out if no further data is sent when two requests that complete normally are sent back to back two responses come back in succession about recoverable errors an error counts as a recoverable error when it matches one of the following an error raised before the request s header parsing completes where the next request can still be processed as is without any problem an error raised during request matching errors that do not match the above are unrecoverable errors and are handled by throwing an exception if a request carries a recoverable error at routing time skip routing and create a filereader originator for the default error page if the request matching result carries a recoverable error create a filereader originator for the error page themes when an http error exception is thrown partway through a round trip how does the processing flow change given a request that triggers an http error under what conditions can the http connection be kept alive | 0 |
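The recoverable/unrecoverable split described in the issue above can be sketched as a tiny dispatch layer. This is a Python rendering of the control flow only (the actual webserv is C++, and all names here are invented): a recoverable error yields the default error page and keeps the connection open, while anything else also produces an error page but forces the connection closed.

```python
class RecoverableHTTPError(Exception):
    """Error raised before header parsing completes, or during request matching."""
    def __init__(self, status):
        super().__init__(status)
        self.status = status

def handle_round_trip(process):
    """Run one request; return (response, keep_alive)."""
    try:
        return process(), True
    except RecoverableHTTPError as e:
        # Recoverable: serve the default error page, keep the connection.
        return f"error-page:{e.status}", True
    except Exception:
        # Unrecoverable: respond 500 and close the connection.
        return "error-page:500", False

def ok():
    return "200 OK"

def bad_request():
    raise RecoverableHTTPError(400)

def desync():
    raise RuntimeError("body parser desynchronised")

print(handle_round_trip(ok))           # ('200 OK', True)
print(handle_round_trip(bad_request))  # ('error-page:400', True)
print(handle_round_trip(desync))       # ('error-page:500', False)
```

The keep-alive flag is the answer to the second theme: the connection survives exactly when the error was caught early enough that the parser is still in a known state.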
249,978 | 21,220,037,784 | IssuesEvent | 2022-04-11 11:02:34 | gameserverapp/Platform | https://api.github.com/repos/gameserverapp/Platform | closed | Chat log parses same chat twice | bug status: to be tested admin tools:chat console | **Describe the bug**
Occasionally chat is being indexed twice.
Seems to happen mostly with higher-latency servers (US).
**Screenshots**
Symptom:

**Additional context**
https://discord.com/channels/315049169071112193/685253991726317584/931616660635410482
| 1.0 | Chat log parses same chat twice - **Describe the bug**
Occasionally chat is being indexed twice.
Seems to happen mostly with higher-latency servers (US).
**Screenshots**
Symptom:

**Additional context**
https://discord.com/channels/315049169071112193/685253991726317584/931616660635410482
| non_priority | chat log parses same chat twice describe the bug occasionally chat is being indexed twice seems to happen mostly with higher latency servers us screenshots symptom additional context | 0 |
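A common guard against double-indexing like the bug above is an idempotency key per chat line, checked before insert. A minimal sketch (the field names are assumptions, not GameServerApp's actual schema; truly identical messages sent twice on purpose would additionally need a log offset in the key):

```python
import hashlib

class ChatIndex:
    """Chat store that drops lines it has already indexed."""
    def __init__(self):
        self.seen = set()
        self.rows = []

    def index(self, server_id, timestamp, player, message):
        # Idempotency key: re-parsing the same line hashes to the same key.
        key = hashlib.sha256(
            f"{server_id}|{timestamp}|{player}|{message}".encode()
        ).hexdigest()
        if key in self.seen:
            return False          # duplicate from a re-parse: skip
        self.seen.add(key)
        self.rows.append((server_id, timestamp, player, message))
        return True

idx = ChatIndex()
line = ("us-1", "2022-01-14T12:00:00Z", "alice", "hello")
print(idx.index(*line))  # True  -- first sighting is stored
print(idx.index(*line))  # False -- second parse of the same line is dropped
print(len(idx.rows))     # 1
```

This makes the indexer safe even when a high-latency server causes the same log window to be fetched and parsed twice.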
53,766 | 13,205,357,455 | IssuesEvent | 2020-08-14 17:46:50 | open-telemetry/opentelemetry-java-instrumentation | https://api.github.com/repos/open-telemetry/opentelemetry-java-instrumentation | closed | Use Gradle build cache | area:build contributor experience priority:p2 release:required-for-ga | It takes a lot of time to build instrumentation project. But the majority of that time is spent building and testing tons of individual instrumentations. As developers often make changes to a specific instrumentation, there is no sense in building all other, unrelated, instrumentations. [Gradle build cache](https://docs.gradle.org/current/userguide/build_cache.html) can help a lot here. One easy way to start using it is via [this plugin](https://github.com/myniva/gradle-s3-build-cache) | 1.0 | Use Gradle build cache - It takes a lot of time to build instrumentation project. But the majority of that time is spent building and testing tons of individual instrumentations. As developers often make changes to a specific instrumentation, there is no sense in building all other, unrelated, instrumentations. [Gradle build cache](https://docs.gradle.org/current/userguide/build_cache.html) can help a lot here. One easy way to start using it is via [this plugin](https://github.com/myniva/gradle-s3-build-cache) | non_priority | use gradle build cache it takes a lot of time to build instrumentation project but the majority of that time is spent building and testing tons of individual instrumentations as developers often make changes to a specific instrumentation there is no sense in building all other unrelated instrumentations can help a lot here one easy way to start using it is via | 0 |
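The Gradle build cache recommended in the issue above avoids that cost by keying each task's outputs on a hash of its inputs, so unchanged, unrelated instrumentations are never rebuilt. A language-agnostic sketch of the idea (not Gradle's actual implementation):

```python
import hashlib

class BuildCache:
    """Content-addressed task cache: unchanged inputs mean no re-execution."""
    def __init__(self):
        self.store = {}
        self.executions = 0

    def run_task(self, name, inputs, action):
        key = hashlib.sha256(repr((name, sorted(inputs.items()))).encode()).hexdigest()
        if key in self.store:
            return self.store[key]        # cache hit: skip the real work
        self.executions += 1              # cache miss: execute and remember
        result = action(inputs)
        self.store[key] = result
        return result

cache = BuildCache()
srcs = {"A.java": "v1"}
cache.run_task("compileJava", srcs, lambda i: "A.class")
cache.run_task("compileJava", srcs, lambda i: "A.class")   # hit
print(cache.executions)   # 1 -- the unchanged task ran only once
cache.run_task("compileJava", {"A.java": "v2"}, lambda i: "A.class")
print(cache.executions)   # 2 -- changed inputs invalidate the entry
```

A remote backend (such as the S3-based plugin the issue links) is the same idea with the `store` dict replaced by shared storage.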
816,214 | 30,593,848,940 | IssuesEvent | 2023-07-21 19:42:14 | 42atomys/stud42 | https://api.github.com/repos/42atomys/stud42 | closed | misc: phone features postponed for consistent API Integration | state/confirmed 💜 priority/critical 🟥 aspect/backend 💻 type/chore 🛠 domain/complicated 🟨 | ### Please express yourself
We are currently facing a technical challenge regarding the alignment of data between webhooks and the intra API. More specifically, there is a notable inconsistency in how the phone number is synchronized between these two interfaces.
In the interest of transparency and delivering a high-quality user experience, we have decided to take a proactive decision in this regard. We will temporarily remove the phone number from our data schema. This action means that all features related to the use of the phone number will be put on hold.
This is not a step back, but rather a cautious move to avoid potential complications. We want to prevent any misunderstanding or controversy by taking this measure. We will also delete all phone number data from our database to ensure the protection of our users' personal information.
In the future, instead of trying to synchronize this information between the webhooks and the intra API, we will request the phone number directly on our site. This will allow us to ensure that we have the most up-to-date and accurate information, while better controlling the process.
We appreciate your patience and understanding while we work on this transition. We are doing everything we can to minimize the impact on our users and we are committed to maintaining transparent communication throughout this process.
Best regards,
### Action Items
- [x] Remove the 'phone' field from the data schema
- [ ] Erase all phone data from the database and backups across all environments
- [x] Eliminate all code sections related to the phone number
- [x] Postpone all phone number-related features indefinitely
### Code of Conduct
- [X] I agree to follow this project's Code of Conduct | 1.0 | misc: phone features postponed for consistent API Integration - ### Please express yourself
We are currently facing a technical challenge regarding the alignment of data between webhooks and the intra API. More specifically, there is a notable inconsistency in how the phone number is synchronized between these two interfaces.
In the interest of transparency and delivering a high-quality user experience, we have decided to take a proactive decision in this regard. We will temporarily remove the phone number from our data schema. This action means that all features related to the use of the phone number will be put on hold.
This is not a step back, but rather a cautious move to avoid potential complications. We want to prevent any misunderstanding or controversy by taking this measure. We will also delete all phone number data from our database to ensure the protection of our users' personal information.
In the future, instead of trying to synchronize this information between the webhooks and the intra API, we will request the phone number directly on our site. This will allow us to ensure that we have the most up-to-date and accurate information, while better controlling the process.
We appreciate your patience and understanding while we work on this transition. We are doing everything we can to minimize the impact on our users and we are committed to maintaining transparent communication throughout this process.
Best regards,
### Action Items
- [x] Remove the 'phone' field from the data schema
- [ ] Erase all phone data from the database and backups across all environments
- [x] Eliminate all code sections related to the phone number
- [x] Postpone all phone number-related features indefinitely
### Code of Conduct
- [X] I agree to follow this project's Code of Conduct | priority | misc phone features postponed for consistent api integration please exprime yourself we are currently facing a technical challenge regarding the alignment of data between webhooks and the intra api more specifically there is a notable inconsistency in how the phone number is synchronized between these two interfaces in the interest of transparency and delivering a high quality user experience we have decided to take a proactive decision in this regard we will temporarily remove the phone number from our data schema this action means that all features related to the use of the phone number will be put on hold this is not a step back but rather a cautious move to avoid potential complications we want to prevent any misunderstanding or controversy by taking this measure we will also delete all phone number data from our database to ensure the protection of our users personal information in the future instead of trying to synchronize this information between the webhooks and the intra api we will request the phone number directly on our site this will allow us to ensure that we have the most up to date and accurate information while better controlling the process we appreciate your patience and understanding while we work on this transition we are doing everything we can to minimize the impact on our users and we are committed to maintaining transparent communication throughout this process best regards action items remove the phone field from the data schema erase all phone data from the database and backups across all environments eliminate all code sections related to the phone number postpone all phone number related features indefinitely code of conduct i agree to follow this project s code of conduct | 1 |
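The first two action items above amount to a destructive migration: drop `phone` from the schema and erase the stored values. A hedged sketch of such a scrub pass over plain dict records (not stud42's actual ent schema):

```python
def scrub_phone(records):
    """Drop the 'phone' key everywhere; report how many non-empty values were erased."""
    erased = 0
    for record in records:
        if "phone" in record:
            if record["phone"]:
                erased += 1
            del record["phone"]
    return erased

users = [
    {"login": "alice", "phone": "+33 6 00 00 00 00"},
    {"login": "bob", "phone": None},
    {"login": "carol"},
]
print(scrub_phone(users))                 # 1 -- one real value erased
print(any("phone" in u for u in users))   # False -- the field is gone everywhere
```

The same pass would have to run against every environment's database and backups, as the checklist notes.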
84,907 | 3,681,533,816 | IssuesEvent | 2016-02-24 03:55:19 | scottaj/mocha.el | https://api.github.com/repos/scottaj/mocha.el | closed | Debugging? | Feature Low Priority | Could there be a debug function that spawns the test command in a terminal instead of a compilation buffer and runs mocha with the --debug option? Combined with the debugger keyword in js this could be pretty nice. | 1.0 | Debugging? - Could there be a debug function that spawns the test command in a terminal instead of a compilation buffer and runs mocha with the --debug option? Combined with the debugger keyword in js this could be pretty nice. | priority | debugging could there be a debug function that spawns the test command in a terminal instead of a compilation buffer and runs mocha with the debug option combined with the debugger keyword in js this could be pretty nice | 1 |
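The mocha.el request above reduces to assembling the usual test invocation with one extra flag and launching it in a terminal instead of a compilation buffer. A sketch of the command construction (`--debug` is the flag the issue names; the reporter and spec path are placeholders):

```python
def mocha_command(spec, reporter="spec", debug=False):
    """Build the argv for a mocha run; --debug turns on the node debugger."""
    cmd = ["mocha", "--reporter", reporter]
    if debug:
        cmd.append("--debug")
    cmd.append(spec)
    return cmd

print(mocha_command("test/app_spec.js"))
# ['mocha', '--reporter', 'spec', 'test/app_spec.js']
print(mocha_command("test/app_spec.js", debug=True))
# ['mocha', '--reporter', 'spec', '--debug', 'test/app_spec.js']
```

With the process paused by the debugger, a `debugger` statement in the spec then drops execution into an interactive session, as the issue suggests.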
59,600 | 17,023,172,412 | IssuesEvent | 2021-07-03 00:42:03 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | Bookmarks that aren't centered on the marker | Component: mapnik Priority: minor Resolution: fixed Type: defect | **[Submitted to the original trac issue database at 8.58am, Monday, 16th July 2007]**
I have switched the link from my employer's web site to OSM now that the area looks respectable (actually much better than any other online map), and now it has a marker to denote our location:
http://www.openstreetmap.org/?mlon=-0.156508&mlat=51.382090&zoom=16
The link would be even more useful if I could shift the center of the image away from the marker to the South, so the train station is visible:
http://www.openstreetmap.org/?lat=51.381018608079245&lon=-0.15459826759503276&zoom=16
So links could pack in both a marker location and a center-of-view location. The perfect solution would of course show a route, but one step at a time :) | 1.0 | Bookmarks that aren't centered on the marker - **[Submitted to the original trac issue database at 8.58am, Monday, 16th July 2007]**
I have switched the link from my employer's web site to OSM now that the area looks respectable (actually much better than any other online map), and now it has a marker to denote our location:
http://www.openstreetmap.org/?mlon=-0.156508&mlat=51.382090&zoom=16
The link would be even more useful if I could shift the center of the image away from the marker to the South, so the train station is visible:
http://www.openstreetmap.org/?lat=51.381018608079245&lon=-0.15459826759503276&zoom=16
So links could pack in both a marker location and a center-of-view location. The perfect solution would of course show a route, but one step at a time :) | non_priority | bookmarks that aren t centered on the marker i have switched the link from my employer s web site to osm now that the area looks respectable actually much better than any other online map and now it has a marker to denote our location the link would be even more useful if i could shift the center of the image away from the marker to the south so the train station is visible so links could pack in both a marker location and a center of view location the perfect solution would of course show a route but one step at a time | 0 |
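The proposal above combines a marker (`mlat`/`mlon`) with an independent view centre (`lat`/`lon`) and `zoom` in one permalink. Building such a URL is plain query-string composition — a sketch using the parameter names taken from the two example links (whether the site honours the combined form is exactly what the ticket asks for):

```python
from urllib.parse import urlencode

def osm_permalink(marker, center=None, zoom=16):
    """Permalink with a marker; optionally re-centre the viewport elsewhere."""
    mlat, mlon = marker
    params = {"mlat": f"{mlat:.6f}", "mlon": f"{mlon:.6f}", "zoom": zoom}
    if center is not None:
        # Shift the view off the marker, e.g. south to keep the station visible.
        params["lat"], params["lon"] = (f"{c:.6f}" for c in center)
    return "http://www.openstreetmap.org/?" + urlencode(params)

url = osm_permalink(marker=(51.382090, -0.156508),
                    center=(51.381019, -0.154598))
print(url)
```

The resulting link carries both locations, so the marker stays on the office while the viewport is centred nearer the station.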
12,740 | 2,715,160,016 | IssuesEvent | 2015-04-10 10:55:35 | codenameone/CodenameOne | https://api.github.com/repos/codenameone/CodenameOne | closed | Spinner overlay applied twice on Android | Priority-Medium Type-Defect | Original [issue 711](https://code.google.com/p/codenameone/issues/detail?id=711) created by codenameone on 2013-05-16T02:32:35.000Z:
Previously you had fixed issue 629 where the spinners overlay was applied twice. I have been creating an app with includeNativeBool set to false, and the spinner looked fine, now when I set includeNativeBool to true, the spinner has the same issue again, seems to only occur on Android though, as my iPhone displays it correctly either way. | 1.0 | Spinner overlay applied twice on Android - Original [issue 711](https://code.google.com/p/codenameone/issues/detail?id=711) created by codenameone on 2013-05-16T02:32:35.000Z:
Previously you had fixed issue 629 where the spinners overlay was applied twice. I have been creating an app with includeNativeBool set to false, and the spinner looked fine, now when I set includeNativeBool to true, the spinner has the same issue again, seems to only occur on Android though, as my iPhone displays it correctly either way. | non_priority | spinner overlay applied twice on android original created by codenameone on previously you had fixed issue were the spinners overlay was applied twice i have been creating an app with includenativebool set to false and the spinner looked fine now when i set includenativebool to true the spinner has the same issue again seems to only occur on android though as my iphone displays it correctly either way | 0 |
690,270 | 23,652,938,984 | IssuesEvent | 2022-08-26 08:30:40 | kubernetes/ingress-nginx | https://api.github.com/repos/kubernetes/ingress-nginx | closed | ingress-nginx-admission does not work. Log shows "got secret, but it did not contain a 'ca' key" | kind/bug needs-triage needs-priority |

```bash
[root@node136 tmp.9n5adjQhNn]# kubectl logs -f -n ingress-nginx ingress-nginx-admission-create-855fz
W0826 04:01:30.263082 1 client_config.go:615] Neither --kubeconfig nor --master was specified. Using the inClusterConfig. This might not work.
{"level":"fatal","msg":"got secret, but it did not contain a 'ca' key","source":"k8s/k8s.go:237","time":"2022-08-26T04:01:30Z"}
``` | 1.0 | ingress-nginx-admission does not work. Log shows "got secret, but it did not contain a 'ca' key" -

```bash
[root@node136 tmp.9n5adjQhNn]# kubectl logs -f -n ingress-nginx ingress-nginx-admission-create-855fz
W0826 04:01:30.263082 1 client_config.go:615] Neither --kubeconfig nor --master was specified. Using the inClusterConfig. This might not work.
{"level":"fatal","msg":"got secret, but it did not contain a 'ca' key","source":"k8s/k8s.go:237","time":"2022-08-26T04:01:30Z"}
``` | priority | ingress nginx admission do not work log show got secret but it did not contain a ca key bash kubectl logs f n ingress nginx ingress nginx admission create client config go neither kubeconfig nor master was specified using the inclusterconfig this might not work level fatal msg got secret but it did not contain a ca key source go time | 1 |
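The fatal line above means the admission job found the secret but its data map lacked the expected `ca` entry. That shape check is easy to express up front (hypothetical helper — the real check lives in the controller's Go code at k8s/k8s.go, and the exact set of required keys used here is an assumption):

```python
def require_secret_keys(secret, required=("ca", "cert", "key")):
    """Return the secret's data map if every expected entry exists, else raise."""
    data = secret.get("data") or {}
    missing = [k for k in required if k not in data]
    if missing:
        raise KeyError(f"got secret, but it did not contain {missing}")
    return data

good = {"metadata": {"name": "ingress-nginx-admission"},
        "data": {"ca": "Zm9v", "cert": "Zm9v", "key": "Zm9v"}}
bad = {"metadata": {"name": "ingress-nginx-admission"},
       "data": {"cert": "Zm9v", "key": "Zm9v"}}

print(sorted(require_secret_keys(good)))   # ['ca', 'cert', 'key']
try:
    require_secret_keys(bad)
except KeyError as e:
    print(e)                               # mirrors the controller's fatal log
```

Inspecting the live secret with `kubectl get secret -n ingress-nginx ingress-nginx-admission -o yaml` shows whether the `ca` entry is actually present.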
205,063 | 15,964,473,829 | IssuesEvent | 2021-04-16 06:17:30 | Maurice2n97/pe | https://api.github.com/repos/Maurice2n97/pe | opened | User Guide Bullet list doesn't seem to be indented | severity.VeryLow type.DocumentationBug | 
It would be easier to search if the command types were indented by one more level, under each command type for patient, doctor, etc., instead of all of them lying at the same indentation level
| 1.0 | User Guide Bullet list doesn't seem to be indented - 
It would be easier to search if the command types were indented by one more level, under each command type for patient, doctor, etc., instead of all of them lying at the same indentation level
| non_priority | user guide bullet list doesnt seem to be indented it would be easier to search if the command types were indented by one more level under each command type for patient doctor etc instead of it being all lying on the same indentation level | 0 |
24,288 | 17,083,215,192 | IssuesEvent | 2021-07-08 08:31:36 | python-pillow/Pillow | https://api.github.com/repos/python-pillow/Pillow | closed | Old pypi homepage is SEO spam | Infrastructure | Taking a look at https://pypi.org/project/Pillow/2.2.1/ (a very old version I realize, but also the first google result for me), the homepage it points at is https://python-imaging.github.io/, which is some kind of computer-generated SEO garbage. Is there any way you could get this pointed at something more reasonable, such as https://python-pillow.org/ ?
Alex Clark said:
> Thanks! Any chance you could open a ticket for this? https://github.com/python-pillow/Pillow/issues I'll take a look in the meantime ... Thanks again. | 1.0 | Old pypi homepage is SEO spam - Taking a look at https://pypi.org/project/Pillow/2.2.1/ (a very old version I realize, but also the first google result for me), the homepage it points at is https://python-imaging.github.io/, which is some kind of computer-generated SEO garbage. Is there any way you could get this pointed at something more reasonable, such as https://python-pillow.org/ ?
Alex Clark said:
> Thanks! Any chance you could open a ticket for this? https://github.com/python-pillow/Pillow/issues I'll take a look in the meantime ... Thanks again. | non_priority | old pypi homepage is seo spam taking a look at a very old version i realize but also the first google result for me the homepage it points at is which is some kind of computer generated seo garbage is there any way you could get this pointed at something more reasonable such as alex clark said thanks any chance you could open a ticket for this i ll take a look in the meantime thanks again | 0 |
10,953 | 8,229,272,255 | IssuesEvent | 2018-09-07 08:52:02 | dotnet/corefx | https://api.github.com/repos/dotnet/corefx | closed | new X509Certificate2 throws exception while loading certificate in Docker container | area-System.Security | SDK: Microsoft.AspNetCore.App 2.1.1
The following code throws an exception when it runs in a docker container
```csharp
var certificatePath = Path.Combine(env.ContentRootPath, "TestCertificate.pfx");
Console.WriteLine($"Certificate file exists: {File.Exists(certificatePath)}");
var certificate = new X509Certificate2(certificatePath, "Password");
```
The following exception occurs:
Object was not found
at Internal.Cryptography.Pal.CertificatePal.FilterPFXStore(Byte[] rawData, SafePasswordHandle password, PfxCertStoreFlags pfxCertStoreFlags)
at Internal.Cryptography.Pal.CertificatePal.FromBlobOrFile(Byte[] rawData, String fileName, SafePasswordHandle password, X509KeyStorageFlags keyStorageFlags)
at System.Security.Cryptography.X509Certificates.X509Certificate..ctor(String fileName, String password, X509KeyStorageFlags keyStorageFlags)
at System.Security.Cryptography.X509Certificates.X509Certificate2..ctor(String fileName, String password)
The certificate file exists in the Docker container. Running the web application outside the Docker container throws no exception.
Example project can be found here: https://github.com/MarcelWouters/DockerLoadCertificate
| True | new X509Certificate2 throws exception while loading certificate in Docker container - SDK: Microsoft.AspNetCore.App 2.1.1
The following code throws an exception when it runs in a docker container
```csharp
var certificatePath = Path.Combine(env.ContentRootPath, "TestCertificate.pfx");
Console.WriteLine($"Certificate file exists: {File.Exists(certificatePath)}");
var certificate = new X509Certificate2(certificatePath, "Password");
```
The following exception occurs:
Object was not found
at Internal.Cryptography.Pal.CertificatePal.FilterPFXStore(Byte[] rawData, SafePasswordHandle password, PfxCertStoreFlags pfxCertStoreFlags)
at Internal.Cryptography.Pal.CertificatePal.FromBlobOrFile(Byte[] rawData, String fileName, SafePasswordHandle password, X509KeyStorageFlags keyStorageFlags)
at System.Security.Cryptography.X509Certificates.X509Certificate..ctor(String fileName, String password, X509KeyStorageFlags keyStorageFlags)
at System.Security.Cryptography.X509Certificates.X509Certificate2..ctor(String fileName, String password)
The certificate file exists in the Docker container. Running the web application outside the Docker container throws no exception.
Example project can be found here: https://github.com/MarcelWouters/DockerLoadCertificate
| non_priority | new throws exception while loading certificate in docker container sdk microsoft aspnetcore app the following code throws an exception when it runs in a docker container var certificatepath path combine env contentrootpath testcertificate pfx console writeline certificate file exists file exists certificatepath var certificate new certificatepath password the following exception occcurs object was not found at internal cryptography pal certificatepal filterpfxstore byte rawdata safepasswordhandle password pfxcertstoreflags pfxcertstoreflags at internal cryptography pal certificatepal frombloborfile byte rawdata string filename safepasswordhandle password keystorageflags at system security cryptography ctor string filename string password keystorageflags at system security cryptography ctor string filename string password the certificate file exists in the docker container running the web application outside the docker container throws no exception example project can be find here | 0 |
121,058 | 15,835,719,724 | IssuesEvent | 2021-04-06 18:22:38 | ParabolInc/parabol | https://api.github.com/repos/ParabolInc/parabol | closed | Update Upgrade Page to include Enterprise | design discussion enhancement icebox | ### 3790 Issue - Enhancement
Right now, the conversion prompt and the upgrade page only direct users to upgrade Pro. This forces them to go onto our website, find contact details, and write in seeking information on Enterprise if they don't feel that Pro is the right option for them. In order to increase enterprise conversion and speed the sales cycle, we should simplify this process for them by having a CTA on the upgrade page that allows them to enter their details and then flows through Hubspot so that the proper rep can respond quickly.
The flow should be:
- The "Upgrade to Pro" CTA from the user avatar should change to "Upgrade Options"
- Once the conversion prompt is changed to 'Upgrade Now' no further changes on the prompt messaging will be needed for this enhancement
- From https://action.parabol.co/me/organizations Upgrade page, users should see a CTA that says 'Upgrade to Enterprise' next to the CTA for 'Upgrade to Pro'
- The CTA for Enterprise should collect their info: name, title, phone number
- Once input, they should receive a message saying "Thanks, we'll be in touch to schedule a call with you soon!"
- Their info should flow through Hubspot so that the Org & User information is connected, and either a new Deal is made and assigned via the current workflows or, if it's an existing Deal, the Deal owner is notified
### Acceptance Criteria (optional)
- Users can request information about upgrading to Enterprise directly for the https://action.parabol.co/me/organizations Upgrade page
- Parabol sales is notified about the inbound request and can action on it quickly
| 1.0 | Update Upgrade Page to include Enterprise - ### 3790 Issue - Enhancement
Right now, the conversion prompt and the upgrade page only direct users to upgrade Pro. This forces them to go onto our website, find contact details, and write in seeking information on Enterprise if they don't feel that Pro is the right option for them. In order to increase enterprise conversion and speed the sales cycle, we should simplify this process for them by having a CTA on the upgrade page that allows them to enter their details and then flows through Hubspot so that the proper rep can respond quickly.
The flow should be:
- The "Upgrade to Pro" CTA from the user avatar should change to "Upgrade Options"
- Once the conversion prompt is changed to 'Upgrade Now' no further changes on the prompt messaging will be needed for this enhancement
- From https://action.parabol.co/me/organizations Upgrade page, users should see a CTA that says 'Upgrade to Enterprise' next to the CTA for 'Upgrade to Pro'
- The CTA for Enterprise should collect their info: name, title, phone number
- Once input, they should receive a message saying "Thanks, we'll be in touch to schedule a call with you soon!"
- Their info should flow through Hubspot so that the Org & User information is connected, and either a new Deal is made and assigned via the current workflows or, if it's an existing Deal, the Deal owner is notified
### Acceptance Criteria (optional)
- Users can request information about upgrading to Enterprise directly for the https://action.parabol.co/me/organizations Upgrade page
- Parabol sales is notified about the inbound request and can action on it quickly
| non_priority | update upgrade page to include enterprise issue enhancement right now the conversion prompt and the upgrade page only direct users to upgrade pro this forces them to go onto our website find contact details and write in seeking information on enterprise if they don t feel that pro is the right option for them in order to increase enterprise conversion and speed the sales cycle we should simplify this process for them by having a cta on the upgrade page that allows them to enter their details and then flows through hubspot so that the proper rep can respond quickly the flow should be the upgrade to pro cta from the user avatar should change to upgrade options once the conversion prompt is changed to upgrade now no further changes on the prompt messaging will be needed for this enhancement from upgrade page users should see a cta that says upgrade to enterprise next to the cta for upgrade to pro the cta for enterprise should collect their info name title phone number once input they should receive a message saying thanks we ll be in touch to schedule a call with you soon their info should flow through hubspot so that the org user information is connected and either a new deal is made and assigned via the current workflows or if it s an existing deal the deal owner is notified acceptance criteria optional users can request information about upgrading to enterprise directly for the upgrade page parabol sales is notified about the inbound request and can action on it quickly | 0 |
35,573 | 9,629,566,212 | IssuesEvent | 2019-05-15 09:53:16 | eclipse/kapua | https://api.github.com/repos/eclipse/kapua | closed | Travis tests random fails | bug build test | A lot of Travis builds are failing due to random errors in tests. Usually retrying the job some times fixes the builds, but this is not always the case. Also, the same branch may fail on a fork while building correctly on others.
**To Reproduce**
Steps to reproduce the behavior:
1. Launch a build in Travis
2. Check build results
**Expected behavior**
The build result should be consistent across the same branch in all forks
**Screenshots**
N/A
**Version of Kapua**
1.0.0-SNAPSHOT
**Type of deployment**
[ ] Local Vagrant deployment
[ ] Docker
[ ] Openshift (in its variants)
[x] Others - Travis
**Main component affected**
[ ] Console (in case of console please report info on which browser you encountered the problem)
[ ] REST API
[ ] Message Broker
[X] - Others
**Additional context**
Travis logs of failing builds are more than welcome. The more we collect, the better should be to identify the cause | 1.0 | Travis tests random fails - A lot of Travis builds are failing due to random errors in tests. Usually retrying the job some times fixes the builds, but this is not always the case. Also, the same branch may fail on a fork while building correctly on others.
**To Reproduce**
Steps to reproduce the behavior:
1. Launch a build in Travis
2. Check build results
**Expected behavior**
The build result should be consistent across the same branch in all forks
**Screenshots**
N/A
**Version of Kapua**
1.0.0-SNAPSHOT
**Type of deployment**
[ ] Local Vagrant deployment
[ ] Docker
[ ] Openshift (in its variants)
[x] Others - Travis
**Main component affected**
[ ] Console (in case of console please report info on which browser you encountered the problem)
[ ] REST API
[ ] Message Broker
[X] - Others
**Additional context**
Travis logs of failing builds are more than welcome. The more we collect, the better should be to identify the cause | non_priority | travis tests random fails a lot of travis builds are failing due to random errors in tests usually retrying the job some times fixes the builds but this is not always the case also the same branch may fail on a fork while building correctly on others to reproduce steps to reproduce the behavior launch a build in travis check build results expected behavior the build result should be consistent across the same branch in all forks screenshots n a version of kapua snapshot type of deployment local vagrant deployment docker openshift in its variants others travis main component affected console in case of console please report info on which browser you encountered the problem rest api message broker others additional context travis logs of failing builds are more than welcome the more we collect the better should be to identify the cause | 0 |
54,875 | 3,071,468,476 | IssuesEvent | 2015-08-19 12:19:59 | pavel-pimenov/flylinkdc-r5xx | https://api.github.com/repos/pavel-pimenov/flylinkdc-r5xx | closed | Write a complete description of the commands available to the user from the chat window. | Component-Docs enhancement imported Priority-High Usability | _From [a.rain...@gmail.com](https://code.google.com/u/117892482479228821242/) on October 20, 2010 15:11:55_
Some of the commands are already described in r5070; the rest is scattered in fragments across the help files. All of the descriptions need to be gathered together and added to the help.
_Original issue: http://code.google.com/p/flylinkdc/issues/detail?id=206_ | 1.0 | Write a complete description of the commands available to the user from the chat window. - _From [a.rain...@gmail.com](https://code.google.com/u/117892482479228821242/) on October 20, 2010 15:11:55_
Some of the commands are already described in r5070; the rest is scattered in fragments across the help files. All of the descriptions need to be gathered together and added to the help.
_Original issue: http://code.google.com/p/flylinkdc/issues/detail?id=206_ | priority | write a complete description of the commands available to the user from the chat window from on october some of the commands are already described in the rest is scattered in fragments across the help files all of the descriptions need to be gathered together and added to the help original issue | 1 |
389,584 | 11,504,214,243 | IssuesEvent | 2020-02-12 22:45:19 | openmsupply/mobile | https://api.github.com/repos/openmsupply/mobile | opened | Payment type not correctly sent to mSupply on SYNC | 4.0.0-rc7 API: sync Bug: development Docs: not needed Effort: small Module: dispensary Priority: high | ## Describe the bug
The Payment type is missing from the Cash payment that was done on a mobile store using `New Prescription` with a payment type selected. The prescription was Finalised and synced.
## To Reproduce
Steps to reproduce the behaviour:
On mSupply MOBILE:
1. Go to Dispensary
2. Click on any Patient, column dispense
3. Click on any Prescriber (or create a New Prescriber and select it)
4. Add item that has price associated (this needs to be set in mSupply server before initial sync) and some quantity to it
5. Go to the next step and select a 'Payment type' i.e.: Espèce
6. Click confirm
7. Click on the SYNC enabled
8. Click on Manual SYNC. Wait until it is finished
On mSupply DESKTOP:
1. Log to the same store as used in the mSupply mobile
2. Go to Customers > Cash receipts
3. Double-click on the line for the same customer that the Prescription was created for
4. Check that the Payment type is showing: `
## Expected behaviour
Show the same Payment Type that was selected on the Created Prescription on mSupply mobile
## Screenshots
<img width="768" alt="Screen Shot 2020-02-13 at 10 51 57 AM" src="https://user-images.githubusercontent.com/16461988/74383673-8b545280-4e54-11ea-9088-46605c5c1dad.png">
<img width="768" alt="Screen Shot 2020-02-13 at 10 51 55 AM" src="https://user-images.githubusercontent.com/16461988/74383611-665fdf80-4e54-11ea-9608-b7e696606a6f.png">
## Versions (please complete the following information)
- mSupply Version v4.09.00-RC7
- mSupply mobile Version: 4.0.0-RC7
- OS: macOS X
## Users affected
Ivory Coast
## Additional context | 1.0 | Payment type not correctly sent to mSupply on SYNC - ## Describe the bug
The Payment type is missing from the Cash payment that was done on a mobile store using `New Prescription` with a payment type selected. The prescription was Finalised and synced.
## To Reproduce
Steps to reproduce the behaviour:
On mSupply MOBILE:
1. Go to Dispensary
2. Click on any Patient, column dispense
3. Click on any Prescriber (or create a New Prescriber and select it)
4. Add item that has price associated (this needs to be set in mSupply server before initial sync) and some quantity to it
5. Go to the next step and select a 'Payment type' i.e.: Espèce
6. Click confirm
7. Click on the SYNC enabled
8. Click on Manual SYNC. Wait until it is finished
On mSupply DESKTOP:
1. Log to the same store as used in the mSupply mobile
2. Go to Customers > Cash receipts
3. Double-click on the line for the same customer that the Prescription was created for
4. Check that the Payment type is showing: `
## Expected behaviour
Show the same Payment Type that was selected on the Created Prescription on mSupply mobile
## Screenshots
<img width="768" alt="Screen Shot 2020-02-13 at 10 51 57 AM" src="https://user-images.githubusercontent.com/16461988/74383673-8b545280-4e54-11ea-9088-46605c5c1dad.png">
<img width="768" alt="Screen Shot 2020-02-13 at 10 51 55 AM" src="https://user-images.githubusercontent.com/16461988/74383611-665fdf80-4e54-11ea-9608-b7e696606a6f.png">
## Versions (please complete the following information)
- mSupply Version v4.09.00-RC7
- mSupply mobile Version: 4.0.0-RC7
- OS: macOS X
## Users affected
Ivory Coast
## Additional context | priority | payment type not correctly sent to msupply on sync describe the bug the payment type is missing from the cash payment that was done on a mobile store using new prescription with a payment type selected the prescription was finalised and synced to reproduce steps to reproduce the behaviour on msupply mobile go to dispensary click on any patient column dispense click on any prescriber or create a new prescriber and select it add item that has price associated this needs to be set in msupply server before injitial sync and some quantity to it go to the next step and select a payment type i e espèce click confirm click on the sync enabled click on manual sync wait until is finished on msupply desktop log to the same store as used in the msupply mobile go to customers cash receipts double click on the line for the same customer that the prescription was created for check that the payment type is showing expected behaviour show the same payment type that was selected on the created prescription on msupply mobile screenshots img width alt screen shot at am src img width alt screen shot at am src versions please complete the following information msupply version msupply mobile version os macos x users affected ivory coast additional context | 1 |
108,864 | 13,673,732,752 | IssuesEvent | 2020-09-29 10:14:20 | copilot-jp/project-sprint | https://api.github.com/repos/copilot-jp/project-sprint | closed | Want to add figures to the manual where appropriate | design | I think it depends on the target audience, but the whole thing is quite text-heavy, so I thought it might be nice to include even a few illustrations. Before even getting to the content, my first impression was just how densely packed the text is...
Delivery: Hirabayashi | 1.0 | Want to add figures to the manual where appropriate - I think it depends on the target audience, but the whole thing is quite text-heavy, so I thought it might be nice to include even a few illustrations. Before even getting to the content, my first impression was just how densely packed the text is...
Delivery: Hirabayashi | non_priority | want to add figures to the manual where appropriate i think it depends on the target audience but the whole thing is quite text heavy so i thought it might be nice to include even a few illustrations before even getting to the content my first impression was just how densely packed the text is delivery hirabayashi | 0 |
9,374 | 7,703,471,643 | IssuesEvent | 2018-05-21 08:33:45 | symfony/symfony-docs | https://api.github.com/repos/symfony/symfony-docs | closed | [Security] Typo in Security Main Page | Bug Security Status: Needs Review good first issue help wanted | Under the section
Always Check if the User is Logged In
the code:
`use Symfony\Component\Security\Core\User\UserInterface\UserInterface;`
should be
`use Symfony\Component\Security\Core\User\UserInterface;`
| True | [Security] Typo in Security Main Page - Under the section
Always Check if the User is Logged In
the code:
`use Symfony\Component\Security\Core\User\UserInterface\UserInterface;`
should be
`use Symfony\Component\Security\Core\User\UserInterface;`
| non_priority | typo in security main page under the section always check if the user is logged in the code use symfony component security core user userinterface userinterface should be use symfony component security core user userinterface | 0 |
551,492 | 16,174,154,693 | IssuesEvent | 2021-05-03 01:25:53 | azerothcore/azerothcore-wotlk | https://api.github.com/repos/azerothcore/azerothcore-wotlk | closed | GObject rotation broken [$100] | Bounty CORE Priority - Low | EXPECTED BLIZZLIKE BEHAVIOUR:
The GObjects should "rotate", change angles when defined in the DB table gameobject column rotation0, rotation1, rotation2, rotation3
Also, the command for this ingame is weird. It should offer .gobject turn [3 rotations] instead it is only offering the "...same as current character orientation."
STEPS TO REPRODUCE THE PROBLEM:
try any gobject ingame, or spawn a fresh one, and try to change its angle/s. The orientation is working, the rotation is not.
BRANCH(ES):
master
AC HASH/COMMIT:
commit 205e8eb
-> have not updated since
OPERATING SYSTEM:
Win 10 x64, latest update
MODULES:
no
OTHER CUSTOMIZATIONS:
none
I hope my explanation is clear. TC had this issue too; they ported that stuff from mangos. It was some time around y2015, so it is possible it was not included in sunwell. If you find my report invalid, please explain how, and whether this is even working.
<bountysource-plugin>
---
There is a **[$100 open bounty](https://www.bountysource.com/issues/64129314-gobject-rotation-broken?utm_campaign=plugin&utm_content=tracker%2F40032087&utm_medium=issues&utm_source=github)** on this issue. Add to the bounty at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F40032087&utm_medium=issues&utm_source=github).
</bountysource-plugin> | 1.0 | GObject rotation broken [$100] - EXPECTED BLIZZLIKE BEHAVIOUR:
The GObjects should "rotate", change angles when defined in the DB table gameobject column rotation0, rotation1, rotation2, rotation3
Also, the command for this ingame is weird. It should offer .gobject turn [3 rotations] instead it is only offering the "...same as current character orientation."
STEPS TO REPRODUCE THE PROBLEM:
try any gobject ingame, or spawn a fresh one, and try to change its angle/s. The orientation is working, the rotation is not.
BRANCH(ES):
master
AC HASH/COMMIT:
commit 205e8eb
-> have not updated since
OPERATING SYSTEM:
Win 10 x64, latest update
MODULES:
no
OTHER CUSTOMIZATIONS:
none
I hope my explanation is clear. TC had this issue too; they ported that stuff from mangos. It was some time around y2015, so it is possible it was not included in sunwell. If you find my report invalid, please explain how, and whether this is even working.
<bountysource-plugin>
---
There is a **[$100 open bounty](https://www.bountysource.com/issues/64129314-gobject-rotation-broken?utm_campaign=plugin&utm_content=tracker%2F40032087&utm_medium=issues&utm_source=github)** on this issue. Add to the bounty at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F40032087&utm_medium=issues&utm_source=github).
</bountysource-plugin> | priority | gobject rotation broken expected blizzlike behaviour the gobjects should rotate change angles when defined in the db table gameobject column also the command for this ingame is weird it should offer gobject turn instead it is only offering the same as current character orientation steps to reproduce the problem try any gobject ingame or spawn fresh one try to change angle s the orientation is working the rotation is not branch es master ac hash commit commit have not updated since operating system win latest update modules no other customizations none i hope my explanation is clear tc had this issue too they ported that stuff from mangos it was some time around so it is possible it was not included in sunwell if you find my report invalid please explain how and if even is this stuff working there is a on this issue add to the bounty at | 1 |
11,667 | 3,214,178,657 | IssuesEvent | 2015-10-06 23:40:47 | kubernetes/kubernetes | https://api.github.com/repos/kubernetes/kubernetes | closed | kubeproxy e2e test is flaky | area/test component/kube-proxy kind/flake priority/P0 team/cluster | ```
STEP: Hit Test with Fewer Endpoints
STEP: Hitting endpoints from host and container
STEP: dialing(udp) endpointPodIP:endpointUdpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do echo 'hostName' | nc -w 2 -u 10.245.3.24 8081; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) endpointPodIP:endpointHttpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do curl -s --connect-timeout 2 http://10.245.3.24:8080/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(udp) endpointPodIP:endpointUdpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=udp&host=10.245.3.24&port=8081&tries=1'
STEP: dialing(http) endpointPodIP:endpointHttpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=http&host=10.245.3.24&port=8080&tries=1'
STEP: dialing(udp) endpointPodIP:endpointUdpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do echo 'hostName' | nc -w 2 -u 10.245.1.23 8081; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) endpointPodIP:endpointHttpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do curl -s --connect-timeout 2 http://10.245.1.23:8080/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(udp) endpointPodIP:endpointUdpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=udp&host=10.245.1.23&port=8081&tries=1'
STEP: dialing(http) endpointPodIP:endpointHttpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=http&host=10.245.1.23&port=8080&tries=1'
STEP: dialing(udp) endpointPodIP:endpointUdpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do echo 'hostName' | nc -w 2 -u 10.245.4.25 8081; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) endpointPodIP:endpointHttpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do curl -s --connect-timeout 2 http://10.245.4.25:8080/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(udp) endpointPodIP:endpointUdpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=udp&host=10.245.4.25&port=8081&tries=1'
STEP: dialing(http) endpointPodIP:endpointHttpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=http&host=10.245.4.25&port=8080&tries=1'
STEP: dialing(udp) endpointPodIP:endpointUdpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do echo 'hostName' | nc -w 2 -u 10.245.2.25 8081; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) endpointPodIP:endpointHttpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do curl -s --connect-timeout 2 http://10.245.2.25:8080/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(udp) endpointPodIP:endpointUdpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=udp&host=10.245.2.25&port=8081&tries=1'
STEP: dialing(http) endpointPodIP:endpointHttpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=http&host=10.245.2.25&port=8080&tries=1'
STEP: dialing(udp) endpointPodIP:endpointUdpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do echo 'hostName' | nc -w 2 -u 10.245.5.17 8081; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) endpointPodIP:endpointHttpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do curl -s --connect-timeout 2 http://10.245.5.17:8080/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(udp) endpointPodIP:endpointUdpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=udp&host=10.245.5.17&port=8081&tries=1'
STEP: dialing(http) endpointPodIP:endpointHttpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=http&host=10.245.5.17&port=8080&tries=1'
STEP: Hitting clusterIP from host and container
STEP: dialing(udp) node1 --> clusterIP:clusterUdpPort
STEP: Dialing from node. command:for i in $(seq 1 35); do echo 'hostName' | nc -w 2 -u 10.0.225.2 90; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) node1 --> clusterIP:clusterHttpPort
STEP: Dialing from node. command:for i in $(seq 1 35); do curl -s --connect-timeout 2 http://10.0.225.2:80/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(udp) test container --> clusterIP:clusterUdpPort
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=udp&host=10.0.225.2&port=90&tries=35'
STEP: dialing(http) test container --> clusterIP:clusterHttpPort
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=http&host=10.0.225.2&port=80&tries=35'
STEP: dialing(udp) endpoint container --> clusterIP:clusterUdpPort
STEP: Dialing from container. Running command:curl -q 'http://10.245.3.24:8080/dial?request=hostName&protocol=udp&host=10.0.225.2&port=90&tries=35'
STEP: dialing(http) endpoint container --> clusterIP:clusterHttpPort
STEP: Dialing from container. Running command:curl -q 'http://10.245.3.24:8080/dial?request=hostName&protocol=http&host=10.0.225.2&port=80&tries=35'
STEP: Hitting nodePort from host and container
STEP: dialing(udp) node1 --> node1:nodeUdpPort
STEP: Dialing from node. command:for i in $(seq 1 35); do echo 'hostName' | nc -w 2 -u 162.222.176.86 32081; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) node1 --> node1:nodeHttpPort
STEP: Dialing from node. command:for i in $(seq 1 35); do curl -s --connect-timeout 2 http://162.222.176.86:32080/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
[AfterEach] KubeProxy
```
```
KubeProxy
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubeproxy.go:118
should test kube-proxy [It]
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubeproxy.go:117
Expected
<int>: 4
to be ==
<int>: 5
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubeproxy.go:265
```
More detail can be seen in #13632 e2e test | 1.0 | kubeproxy e2e test is flaky - ```
STEP: Hit Test with Fewer Endpoints
STEP: Hitting endpoints from host and container
STEP: dialing(udp) endpointPodIP:endpointUdpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do echo 'hostName' | nc -w 2 -u 10.245.3.24 8081; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) endpointPodIP:endpointHttpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do curl -s --connect-timeout 2 http://10.245.3.24:8080/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(udp) endpointPodIP:endpointUdpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=udp&host=10.245.3.24&port=8081&tries=1'
STEP: dialing(http) endpointPodIP:endpointHttpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=http&host=10.245.3.24&port=8080&tries=1'
STEP: dialing(udp) endpointPodIP:endpointUdpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do echo 'hostName' | nc -w 2 -u 10.245.1.23 8081; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) endpointPodIP:endpointHttpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do curl -s --connect-timeout 2 http://10.245.1.23:8080/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(udp) endpointPodIP:endpointUdpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=udp&host=10.245.1.23&port=8081&tries=1'
STEP: dialing(http) endpointPodIP:endpointHttpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=http&host=10.245.1.23&port=8080&tries=1'
STEP: dialing(udp) endpointPodIP:endpointUdpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do echo 'hostName' | nc -w 2 -u 10.245.4.25 8081; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) endpointPodIP:endpointHttpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do curl -s --connect-timeout 2 http://10.245.4.25:8080/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(udp) endpointPodIP:endpointUdpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=udp&host=10.245.4.25&port=8081&tries=1'
STEP: dialing(http) endpointPodIP:endpointHttpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=http&host=10.245.4.25&port=8080&tries=1'
STEP: dialing(udp) endpointPodIP:endpointUdpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do echo 'hostName' | nc -w 2 -u 10.245.2.25 8081; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) endpointPodIP:endpointHttpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do curl -s --connect-timeout 2 http://10.245.2.25:8080/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(udp) endpointPodIP:endpointUdpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=udp&host=10.245.2.25&port=8081&tries=1'
STEP: dialing(http) endpointPodIP:endpointHttpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=http&host=10.245.2.25&port=8080&tries=1'
STEP: dialing(udp) endpointPodIP:endpointUdpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do echo 'hostName' | nc -w 2 -u 10.245.5.17 8081; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) endpointPodIP:endpointHttpPort from node1
STEP: Dialing from node. command:for i in $(seq 1 1); do curl -s --connect-timeout 2 http://10.245.5.17:8080/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(udp) endpointPodIP:endpointUdpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=udp&host=10.245.5.17&port=8081&tries=1'
STEP: dialing(http) endpointPodIP:endpointHttpPort from test container
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=http&host=10.245.5.17&port=8080&tries=1'
STEP: Hitting clusterIP from host and container
STEP: dialing(udp) node1 --> clusterIP:clusterUdpPort
STEP: Dialing from node. command:for i in $(seq 1 35); do echo 'hostName' | nc -w 2 -u 10.0.225.2 90; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) node1 --> clusterIP:clusterHttpPort
STEP: Dialing from node. command:for i in $(seq 1 35); do curl -s --connect-timeout 2 http://10.0.225.2:80/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(udp) test container --> clusterIP:clusterUdpPort
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=udp&host=10.0.225.2&port=90&tries=35'
STEP: dialing(http) test container --> clusterIP:clusterHttpPort
STEP: Dialing from container. Running command:curl -q 'http://10.245.1.27:8080/dial?request=hostName&protocol=http&host=10.0.225.2&port=80&tries=35'
STEP: dialing(udp) endpoint container --> clusterIP:clusterUdpPort
STEP: Dialing from container. Running command:curl -q 'http://10.245.3.24:8080/dial?request=hostName&protocol=udp&host=10.0.225.2&port=90&tries=35'
STEP: dialing(http) endpoint container --> clusterIP:clusterHttpPort
STEP: Dialing from container. Running command:curl -q 'http://10.245.3.24:8080/dial?request=hostName&protocol=http&host=10.0.225.2&port=80&tries=35'
STEP: Hitting nodePort from host and container
STEP: dialing(udp) node1 --> node1:nodeUdpPort
STEP: Dialing from node. command:for i in $(seq 1 35); do echo 'hostName' | nc -w 2 -u 162.222.176.86 32081; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
STEP: dialing(http) node1 --> node1:nodeHttpPort
STEP: Dialing from node. command:for i in $(seq 1 35); do curl -s --connect-timeout 2 http://162.222.176.86:32080/hostName; echo; done | grep -v '^\s*$' |sort | uniq -c | wc -l
[AfterEach] KubeProxy
```
```
KubeProxy
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubeproxy.go:118
should test kube-proxy [It]
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubeproxy.go:117
Expected
<int>: 4
to be ==
<int>: 5
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubeproxy.go:265
```
More detail can be seen in #13632 e2e test | non_priority | kubeproxy test is flaky step hit test with fewer endpoints step hitting endpoints from host and container step dialing udp endpointpodip endpointudpport from step dialing from node command for i in seq do echo hostname nc w u echo done grep v s sort uniq c wc l step dialing http endpointpodip endpointhttpport from step dialing from node command for i in seq do curl s connect timeout echo done grep v s sort uniq c wc l step dialing udp endpointpodip endpointudpport from test container step dialing from container running command curl q step dialing http endpointpodip endpointhttpport from test container step dialing from container running command curl q step dialing udp endpointpodip endpointudpport from step dialing from node command for i in seq do echo hostname nc w u echo done grep v s sort uniq c wc l step dialing http endpointpodip endpointhttpport from step dialing from node command for i in seq do curl s connect timeout echo done grep v s sort uniq c wc l step dialing udp endpointpodip endpointudpport from test container step dialing from container running command curl q step dialing http endpointpodip endpointhttpport from test container step dialing from container running command curl q step dialing udp endpointpodip endpointudpport from step dialing from node command for i in seq do echo hostname nc w u echo done grep v s sort uniq c wc l step dialing http endpointpodip endpointhttpport from step dialing from node command for i in seq do curl s connect timeout echo done grep v s sort uniq c wc l step dialing udp endpointpodip endpointudpport from test container step dialing from container running command curl q step dialing http endpointpodip endpointhttpport from test container step dialing from container running command curl q step dialing udp endpointpodip endpointudpport from step dialing from node command for i in seq do echo hostname nc w u echo done grep v s sort uniq c wc l step dialing 
http endpointpodip endpointhttpport from step dialing from node command for i in seq do curl s connect timeout echo done grep v s sort uniq c wc l step dialing udp endpointpodip endpointudpport from test container step dialing from container running command curl q step dialing http endpointpodip endpointhttpport from test container step dialing from container running command curl q step dialing udp endpointpodip endpointudpport from step dialing from node command for i in seq do echo hostname nc w u echo done grep v s sort uniq c wc l step dialing http endpointpodip endpointhttpport from step dialing from node command for i in seq do curl s connect timeout echo done grep v s sort uniq c wc l step dialing udp endpointpodip endpointudpport from test container step dialing from container running command curl q step dialing http endpointpodip endpointhttpport from test container step dialing from container running command curl q step hitting clusterip from host and container step dialing udp clusterip clusterudpport step dialing from node command for i in seq do echo hostname nc w u echo done grep v s sort uniq c wc l step dialing http clusterip clusterhttpport step dialing from node command for i in seq do curl s connect timeout echo done grep v s sort uniq c wc l step dialing udp test container clusterip clusterudpport step dialing from container running command curl q step dialing http test container clusterip clusterhttpport step dialing from container running command curl q step dialing udp endpoint container clusterip clusterudpport step dialing from container running command curl q step dialing http endpoint container clusterip clusterhttpport step dialing from container running command curl q step hitting nodeport from host and container step dialing udp nodeudpport step dialing from node command for i in seq do echo hostname nc w u echo done grep v s sort uniq c wc l step dialing http nodehttpport step dialing from node command for i in seq do curl s connect 
timeout echo done grep v s sort uniq c wc l kubeproxy kubeproxy go src io kubernetes output dockerized go src io kubernetes test kubeproxy go should test kube proxy go src io kubernetes output dockerized go src io kubernetes test kubeproxy go expected to be go src io kubernetes output dockerized go src io kubernetes test kubeproxy go more detail can be seen in test | 0 |
549,892 | 16,101,598,667 | IssuesEvent | 2021-04-27 09:59:12 | input-output-hk/cardano-node | https://api.github.com/repos/input-output-hk/cardano-node | closed | [FR] - CLI - Add support for Bech32 Addresses input format | cli revision enhancement priority medium shelley mainnet | **Internal**
**Describe the feature you'd like**
We need to support addresses in Bech32 format so that we can interchange between the wallet and node.
| 1.0 | [FR] - CLI - Add support for Bech32 Addresses input format - **Internal**
**Describe the feature you'd like**
We need to support addresses in Bech32 format so that we can interchange between the wallet and node.
| priority | cli add support for addresses input format internal describe the feature you d like we need to support addresses in format so that we can interchange between the wallet and node | 1 |
6,372 | 9,421,405,684 | IssuesEvent | 2019-04-11 06:41:52 | plazi/arcadia-project | https://api.github.com/repos/plazi/arcadia-project | opened | selection of articles for BLR processing: IRMNG data | Article processing | Here is the list of articles from the list of genus names in taxonomy, provided by Tony Rees. can you please create a ranking of contribution of the journals, eg how many time is a journal present in this list? This is important to decide where the most new names are
[irmng-sources-extra-Jul2017.xlsx](https://github.com/plazi/arcadia-project/files/3067248/irmng-sources-extra-Jul2017.xlsx)
| 1.0 | selection of articles for BLR processing: IRMNG data - Here is the list of articles from the list of genus names in taxonomy, provided by Tony Rees. can you please create a ranking of contribution of the journals, eg how many time is a journal present in this list? This is important to decide where the most new names are
[irmng-sources-extra-Jul2017.xlsx](https://github.com/plazi/arcadia-project/files/3067248/irmng-sources-extra-Jul2017.xlsx)
| non_priority | selection of articles for blr processing irmng data here is the list of articles from the list of genus names in taxonomy provided by tony rees can you please create a ranking of contribution of the journals eg how many time is a journal present in this list this is important to decide where the most new names are | 0 |
141,192 | 5,431,570,426 | IssuesEvent | 2017-03-04 01:39:54 | Radarr/Radarr | https://api.github.com/repos/Radarr/Radarr | closed | Validate Folder on Net Import / Lists | bug priority:high | **Description:**
People aren't choosing a root folder when importing and are getting errors because of that. We need to validate they choose something. | 1.0 | Validate Folder on Net Import / Lists - **Description:**
People aren't choosing a root folder when importing and are getting errors because of that. We need to validate they choose something. | priority | validate folder on net import lists description people aren t choosing a root folder when importing and are getting errors because of that we need to validate they choose something | 1 |
42,468 | 11,054,095,092 | IssuesEvent | 2019-12-10 12:48:46 | primefaces/primefaces | https://api.github.com/repos/primefaces/primefaces | closed | Dialog: closeDynamic fails due to invalid use of data-widgetvar | defect | ## 1) Environment
* **PrimeFaces version:** 7.0.10
* **Does it work on the newest released PrimeFaces version?** No.
* **Does it work on the newest sources in GitHub?** It works in showcase.
* **Application server + version:** Apache TomEE 8.0.0 (Tomct 9.0.22, MyFaces 2.3.4)
* **Affected browsers:** all tested
## 2) Expected behavior
Opening and closing a dialog from backend bean by
```java
PrimeFaces.current().dialog().openDynamic("/myDialog", opt, par);
/* ... */
PrimeFaces.current().dialog().closeDynamic(res);
```
should open and close the dialog.
## 3) Actual behavior
Opening works correctly, but adds the old attribute `data-widgetvar` to the markup (shortened):
```html
<div id="myForm:dlgBtn_dlg" class="..." data-pfdlgcid="..." data-widgetvar="myForm_dlgBtn_dlgwidget" role="dialog">
<!-- ... -->
</div>
```
Closing throws this error in the browser console:
```
core.js.xhtml?ln=primefaces&v=7.0.10:1 Widget for var 'undefined' not available!
components.js.xhtml:1 Uncaught TypeError: Cannot read property 'cfg' of undefined
```
This line tries to read `data-widget` which is not populated:
https://github.com/primefaces/primefaces/blob/45bb466d5a52707fa0542a0dbf68fb5ddcc8abc3/src/main/resources/META-INF/resources/primefaces/core/core.dialog.js#L142
## 4) Steps to reproduce
Implement an application like this:
https://www.primefaces.org/showcase/ui/df/basic.xhtml
## 5) Sample XHTML
**page.xhtml**
```html
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns:f="http://xmlns.jcp.org/jsf/core"
xmlns:h="http://xmlns.jcp.org/jsf/html"
xmlns:p="http://primefaces.org/ui">
<f:view>
<h:body>
<h:form id="myForm">
<p:commandLink actionListener="#{myBean.openDialog}" id="openDlg" value="Open Dialog"/>
</h:form>
</h:body>
</f:view>
</html>
```
**myDialog.xhtml**
```html
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns:f="http://xmlns.jcp.org/jsf/core"
xmlns:h="http://xmlns.jcp.org/jsf/html"
xmlns:p="http://primefaces.org/ui">
<f:view>
<h:body>
<h:form id="myDialogForm">
<p:commandLink actionListener="#{myBean.closeDialog}" id="closeDlg" value="Close Dialog"/>
</h:form>
</h:body>
</f:view>
</html>
```
## 6) Sample bean
**MyBean.java** (minimal extract from real project, translated from Scala)
```java
import org.primefaces.PrimeFaces;
import javax.enterprise.context.SessionScoped;
import javax.inject.Named;
import java.io.Serializable;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
@Named("myBean")
@SessionScoped
public class MyBean implements Serializable {
public void openDialog() {
Map<String, Object> options = new HashMap<>();
options.put("modal", true);
options.put("width", 650);
options.put("height", 500);
Map<String, List<String>> param = new HashMap<>();
param.put("ids", Arrays.asList("123", "456"));
PrimeFaces.current().dialog().openDynamic("/myDialog.xhtml", options, param);
}
public void closeDialog() {
PrimeFaces.current().dialog().closeDynamic(null);
}
}
```
| 1.0 | Dialog: closeDynamic fails due to invalid use of data-widgetvar - ## 1) Environment
* **PrimeFaces version:** 7.0.10
* **Does it work on the newest released PrimeFaces version?** No.
* **Does it work on the newest sources in GitHub?** It works in showcase.
* **Application server + version:** Apache TomEE 8.0.0 (Tomct 9.0.22, MyFaces 2.3.4)
* **Affected browsers:** all tested
## 2) Expected behavior
Opening and closing a dialog from backend bean by
```java
PrimeFaces.current().dialog().openDynamic("/myDialog", opt, par);
/* ... */
PrimeFaces.current().dialog().closeDynamic(res);
```
should open and close the dialog.
## 3) Actual behavior
Opening works correctly, but adds the old attribute `data-widgetvar` to the markup (shortened):
```html
<div id="myForm:dlgBtn_dlg" class="..." data-pfdlgcid="..." data-widgetvar="myForm_dlgBtn_dlgwidget" role="dialog">
<!-- ... -->
</div>
```
Closing throws this error in the browser console:
```
core.js.xhtml?ln=primefaces&v=7.0.10:1 Widget for var 'undefined' not available!
components.js.xhtml:1 Uncaught TypeError: Cannot read property 'cfg' of undefined
```
This line tries to read `data-widget` which is not populated:
https://github.com/primefaces/primefaces/blob/45bb466d5a52707fa0542a0dbf68fb5ddcc8abc3/src/main/resources/META-INF/resources/primefaces/core/core.dialog.js#L142
## 4) Steps to reproduce
Implement an application like this:
https://www.primefaces.org/showcase/ui/df/basic.xhtml
## 5) Sample XHTML
**page.xhtml**
```html
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns:f="http://xmlns.jcp.org/jsf/core"
xmlns:h="http://xmlns.jcp.org/jsf/html"
xmlns:p="http://primefaces.org/ui">
<f:view>
<h:body>
<h:form id="myForm">
<p:commandLink actionListener="#{myBean.openDialog}" id="openDlg" value="Open Dialog"/>
</h:form>
</h:body>
</f:view>
</html>
```
**myDialog.xhtml**
```html
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns:f="http://xmlns.jcp.org/jsf/core"
xmlns:h="http://xmlns.jcp.org/jsf/html"
xmlns:p="http://primefaces.org/ui">
<f:view>
<h:body>
<h:form id="myDialogForm">
<p:commandLink actionListener="#{myBean.closeDialog}" id="closeDlg" value="Close Dialog"/>
</h:form>
</h:body>
</f:view>
</html>
```
## 6) Sample bean
**MyBean.java** (minimal extract from real project, translated from Scala)
```java
import org.primefaces.PrimeFaces;
import javax.enterprise.context.SessionScoped;
import javax.inject.Named;
import java.io.Serializable;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
@Named("myBean")
@SessionScoped
public class MyBean implements Serializable {
public void openDialog() {
Map<String, Object> options = new HashMap<>();
options.put("modal", true);
options.put("width", 650);
options.put("height", 500);
Map<String, List<String>> param = new HashMap<>();
param.put("ids", Arrays.asList("123", "456"));
PrimeFaces.current().dialog().openDynamic("/myDialog.xhtml", options, param);
}
public void closeDialog() {
PrimeFaces.current().dialog().closeDynamic(null);
}
}
```
| non_priority | dialog closedynamic fails due to invalid use of data widgetvar environment primefaces version does it work on the newest released primefaces version no does it work on the newest sources in github in works in showcase application server version apache tomee tomct myfaces affected browsers all tested expected behavior opening and closing a dialog from backend bean by java primefaces current dialog opendynamic mydialog opt par primefaces current dialog closedynamic res should open and close the dialog actual behavior opening works correctly but adds the old attribite data widgetvar to the markup shortened html closing throws this error in the browser console core js xhtml ln primefaces v widget for var undefined not available components js xhtml uncaught typeerror cannot read property cfg of undefined this line tries to read data widget which is not populated steps to reproduce implement an application like this sample xhtml page xhtml html doctype html public dtd xhtml transitional en html xmlns f xmlns h xmlns p mydialog xhtml html doctype html public dtd xhtml transitional en html xmlns f xmlns h xmlns p sample bean mybean java minimal extract from real project translated from scala java import org primefaces primefaces import javax enterprise context sessionscoped import javax inject named import java io serializable import java util arrays import java util hashmap import java util list import java util map named mybean sessionscoped public class mybean implements serializable public void opendialog map options new hashmap options put modal true options put width options put height map param new hashmap param put ids arrays aslist primefaces current dialog opendynamic mydialog xhtml options param public void closedialog primefaces current dialog closedynamic null | 0 |
203,136 | 15,351,076,036 | IssuesEvent | 2021-03-01 04:09:36 | tgstation/TerraGov-Marine-Corps | https://api.github.com/repos/tgstation/TerraGov-Marine-Corps | closed | Wraith bug | Bug In Game Exploit Test Merge Bug | <!-- Write **BELOW** The Headers and **ABOVE** The comments else it may not be viewable -->
## Testmerges:
#5240
<!-- If you're certain the issue is to be caused by a test merge, go on the TGMC discord (preferabily the #bot-abuse channel) and type '!tgs prs' (without the brackets), and then copy and paste the bot's output here. If no testmerges are active, feel free to remove this section. -->
## Reproduction:
When banish is used on a dead xeno, the death announcement is played again. When used on a bursted colonist, the colonist comes back to life, still bursted though
Also, please make this impossible Surr:

I'm not sure why hot hot was abusing it, but the spam was insane
<!-- Explain your issue in detail, including the steps to reproduce it. Issues without proper reproduction steps or explanation are open to being ignored/closed by maintainers.-->
<!-- **For Admins:** Oddities induced by var-edits and other admin tools are not necessarily bugs. Verify that your issues occur under regular circumstances before reporting them. -->
| 1.0 | Wraith bug - <!-- Write **BELOW** The Headers and **ABOVE** The comments else it may not be viewable -->
## Testmerges:
#5240
<!-- If you're certain the issue is to be caused by a test merge, go on the TGMC discord (preferabily the #bot-abuse channel) and type '!tgs prs' (without the brackets), and then copy and paste the bot's output here. If no testmerges are active, feel free to remove this section. -->
## Reproduction:
When banish is used on dead xeno, the dead annoncment is played again. When used on a bursted colonist, the colonist comes back to life, still bursted though
Also, please make this impossible Surr:

I'm not sure why hot hot was abusing it, but the spam was insane
<!-- Explain your issue in detail, including the steps to reproduce it. Issues without proper reproduction steps or explanation are open to being ignored/closed by maintainers.-->
<!-- **For Admins:** Oddities induced by var-edits and other admin tools are not necessarily bugs. Verify that your issues occur under regular circumstances before reporting them. -->
| non_priority | wraith bug testmerges reproduction when banish is used on dead xeno the dead announcement is played again when used on a bursted colonist the colonist comes back to life still bursted though also please make this impossible surr i m not sure why hot hot was abusing it but the spam was insane | 0 |
578,048 | 17,143,043,715 | IssuesEvent | 2021-07-13 11:49:52 | logisim-evolution/logisim-evolution | https://api.github.com/repos/logisim-evolution/logisim-evolution | opened | Ability to remove "Bold" attribute using font picker | bug low priority | > Thing is, even if I wanted to change the fonts one by one for each element, seems that I can't get rid of the bold aspect of them. Whatever font I chose, even if I change it to other font face and select plain, it always comes back as bold.
Taken from #444 | 1.0 | Ability to remove "Bold" attribute using font picker - > Thing is, even if I wanted to change the fonts one by one for each element, seems that I can't get rid of the bold aspect of them. Whatever font I chose, even if I change it to other font face and select plain, it always comes back as bold.
Taken from #444 | priority | ability to remove bold attribute using font picker thing is even if i wanted to change the fonts one by one for each element seems that i can t get rid of the bold aspect of them whatever font i chose even if i change it to other font face and select plain it always comes back as bold taken from | 1 |
442,744 | 12,749,591,866 | IssuesEvent | 2020-06-26 23:26:31 | NRCan/GSIP | https://api.github.com/repos/NRCan/GSIP | opened | User Story: Ontology Management | Priority: Could Have Requirement User: Administrator | As an administrator, I have a way to manage the ontology as a separate artifact to be used by the inference engine. Ideally, the configuration should be a reference to someplace on the web to pull an OWL file (so, all nodes share the same ontology), but I also want to add extra rules (as long as they don't override the agreed on ontology) for specific purposes | 1.0 | User Story: Ontology Management - As an administrator, I have a way to manage the ontology as a separate artifact to be used by the inference engine. Ideally, the configuration should be a reference to someplace on the web to pull an OWL file (so, all nodes share the same ontology), but I also want to add extra rules (as long as they don't override the agreed on ontology) for specific purposes | priority | user story ontology management as an administrator i have a way to manage the ontology as a separate artifact to be used by the inference engine ideally the configuration should be a reference to someplace on the web to pull an owl file so all nodes share the same ontology but i also want to add extra rules as long as they don t override the agreed on ontology for specific purposes | 1 |
581,845 | 17,333,617,464 | IssuesEvent | 2021-07-28 07:26:59 | buddyboss/buddyboss-platform | https://api.github.com/repos/buddyboss/buddyboss-platform | closed | Enhancement Data Storing of Users Avatar/Cover Photo and Group Avatar/Cover Photo | feature: enhancement priority: medium | **Is your feature request related to a problem? Please describe.**
At this moment the platform is storing all the data related to avatar/cover photos in `avatar/USERID/.jpg`, and the same for Groups as `group-avatars/GROUPID/.jpg`, and it is not storing these values in the Database.
Basically avatars do not get stored in the database at all, they pull from the file system based on user id matching folder id. This causes a problem for plugins that sync everything to AWS as they don't sync any of the avatars.
**Describe the solution you'd like**
We need to store values in DB and then we have to test the plugins that sync files to AWS S3 to make sure they work with avatars and cover photos.
This is the main plugin we need to test with:
https://deliciousbrains.com/wp-offload-media/ | 1.0 | Enhancement Data Storing of Users Avatar/Cover Photo and Group Avatar/Cover Photo - **Is your feature request related to a problem? Please describe.**
At this moment the platform is storing all the data related to avatar/cover photos in `avatar/USERID/.jpg`, and the same for Groups as `group-avatars/GROUPID/.jpg`, and it is not storing these values in the Database.
Basically avatars do not get stored in the database at all, they pull from the file system based on user id matching folder id. This causes a problem for plugins that sync everything to AWS as they don't sync any of the avatars.
**Describe the solution you'd like**
We need to store values in DB and then we have to test the plugins that sync files to AWS S3 to make sure they work with avatars and cover photos.
This is the main plugin we need to test with:
https://deliciousbrains.com/wp-offload-media/ | priority | enhancement data storing of users avatar cover photo and group avatar cover photo is your feature request related to a problem please describe at this moment platform is storing all the data related to avatar cover photos into the avatar userid jpg and same for the groups as group avatars groupid jpg and it s not storing these values in the database basically avatars do not get stored in the database at all they pull from the file system based on user id matching folder id this causes a problem for plugins that sync everything to aws as they don t sync any of the avatars describe the solution you d like we need to store values in db and then we have to test the plugins that sync files to aws to make sure they work with avatars and cover photos this is the main plugin we need to test with | 1 |
703,932 | 24,178,642,582 | IssuesEvent | 2022-09-23 06:32:08 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | www.netflix.com - video or audio doesn't play | browser-firefox priority-critical engine-gecko | <!-- @browser: Firefox 105.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 10; Mobile VR; rv:105.0) Gecko/105.0 Firefox/105.0 -->
<!-- @reported_with: android-components-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/111196 -->
**URL**: https://www.netflix.com/watch/81056739?trackId=14170287&tctx=2%2C0%2Cb5917c6f-75fc-4917-af89-b73d3c297e31-2494234%2CGPS_3B97A4C8CAE6AF8E9A1D362CD3166B-994911DC4F528C-E80C05FFC8_p_1663864421290%2CGPS_3B97A4C8CAE6AF8E9A1D362CD3166B_p_1663864421290%2C%2C%2C%2C
**Browser / Version**: Firefox 105.0
**Operating System**: Android 10
**Tested Another Browser**: Yes Other
**Problem type**: Video or audio doesn't play
**Description**: The video or audio does not play
**Steps to Reproduce**:
wolvic on quest 2 cannot play netflix
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/9/28075082-ffc9-4e14-9817-7ff8c9c97270.jpeg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20220907015523</li><li>channel: default</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2022/9/e7695368-b1fe-41e5-a8c6-86c0e400a8cc)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | www.netflix.com - video or audio doesn't play - <!-- @browser: Firefox 105.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 10; Mobile VR; rv:105.0) Gecko/105.0 Firefox/105.0 -->
<!-- @reported_with: android-components-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/111196 -->
**URL**: https://www.netflix.com/watch/81056739?trackId=14170287&tctx=2%2C0%2Cb5917c6f-75fc-4917-af89-b73d3c297e31-2494234%2CGPS_3B97A4C8CAE6AF8E9A1D362CD3166B-994911DC4F528C-E80C05FFC8_p_1663864421290%2CGPS_3B97A4C8CAE6AF8E9A1D362CD3166B_p_1663864421290%2C%2C%2C%2C
**Browser / Version**: Firefox 105.0
**Operating System**: Android 10
**Tested Another Browser**: Yes Other
**Problem type**: Video or audio doesn't play
**Description**: The video or audio does not play
**Steps to Reproduce**:
wolvic on quest 2 cannot play netflix
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/9/28075082-ffc9-4e14-9817-7ff8c9c97270.jpeg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20220907015523</li><li>channel: default</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2022/9/e7695368-b1fe-41e5-a8c6-86c0e400a8cc)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | priority | video or audio doesn t play url browser version firefox operating system android tested another browser yes other problem type video or audio doesn t play description the video or audio does not play steps to reproduce wolvic on quest cannot play netflix view the screenshot img alt screenshot src browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel default hastouchscreen true mixed active content blocked false mixed passive content blocked false tracking content blocked false from with ❤️ | 1 |
127,552 | 5,032,210,305 | IssuesEvent | 2016-12-16 10:25:16 | ubuntudesign/snapcraft.io | https://api.github.com/repos/ubuntudesign/snapcraft.io | closed | Change --force-dangerous to --dangerous | Priority: Low | ## Summary
--force-dangerous needs to be changed to --dangerous when the next version of snapd lands in xenial
| 1.0 | Change --force-dangerous to --dangerous - ## Summary
--force-dangerous needs to be changed to --dangerous when the next version of snapd lands in xenial
| priority | change force dangerous to dangerous summary force dangerous needs to be changed to dangerous when the next version of snapd lands in xenial | 1 |
9,048 | 12,551,490,050 | IssuesEvent | 2020-06-06 14:56:42 | teddywilson/defund12.org | https://api.github.com/repos/teddywilson/defund12.org | closed | Eureka - Letter to Mayor and Council Members | meets-issue-requirements | To: sseaman@ci.eureka.ca.gov, lcastellano@ci.eureka.ca.gov, hmessner@ci.eureka.ca.gov, kbergel@ci.eureka.ca.gov, aallison@ci.eureka.ca.gov, narroyo@ci.eureka.ca.gov, cityclerk@ci.eureka.ca.gov
Subject: Commit to reallocate for social equity
Message:
To whom it may concern,
I am a resident of Eureka's [X] Ward. I am writing to demand that the City Council adopt a budget strategy that prioritizes community well-being and redirects funding away from the police in the next budget evaluation period.
We have seen mounting evidence that police departments are ineffective institutions that marginalize minority communities and put citizens at risk of injury and death, yet the police budget accounts for 46% of our general fund (see this article). I ask that you redirect the majority of the $14.2M allotted for crime prevention toward community programs that provide citizens with basic human needs, like affordable healthcare and housing. We don’t need a militarized police force. We need to create a space in which more mental health service providers, social workers, victim/survivor advocates, religious leaders, neighbors, and friends - all of the people who really make up our community - can look out for one another. This is of course a long transition process, but real, actionable change starts with reallocating funding and investing in inclusive and diverse support strategies for our community.
As the City Council, the budget proposal is in your hands. It is your duty to represent your constituents. I am urging you to completely revise the budget for the 2020-2021 fiscal year. We can be a beacon for other cities to follow if only we have the courage to change.
Thank you for your time,
[NAME]
[ADDRESS]
[PHONE NUMBER]
[EMAIL ADDRESS] | 1.0 | Eureka - Letter to Mayor and Council Members - To: sseaman@ci.eureka.ca.gov, lcastellano@ci.eureka.ca.gov, hmessner@ci.eureka.ca.gov, kbergel@ci.eureka.ca.gov, aallison@ci.eureka.ca.gov, narroyo@ci.eureka.ca.gov, cityclerk@ci.eureka.ca.gov
Subject: Commit to reallocate for social equity
Message:
To whom it may concern,
I am a resident of Eureka's [X] Ward. I am writing to demand that the City Council adopt a budget strategy that prioritizes community well-being and redirects funding away from the police in the next budget evaluation period.
We have seen mounting evidence that police departments are ineffective institutions that marginalize minority communities and put citizens at risk of injury and death, yet the police budget accounts for 46% of our general fund (see this article). I ask that you redirect the majority of the $14.2M allotted for crime prevention toward community programs that provide citizens with basic human needs, like affordable healthcare and housing. We don’t need a militarized police force. We need to create a space in which more mental health service providers, social workers, victim/survivor advocates, religious leaders, neighbors, and friends - all of the people who really make up our community - can look out for one another. This is of course a long transition process, but real, actionable change starts with reallocating funding and investing in inclusive and diverse support strategies for our community.
As the City Council, the budget proposal is in your hands. It is your duty to represent your constituents. I am urging you to completely revise the budget for the 2020-2021 fiscal year. We can be a beacon for other cities to follow if only we have the courage to change.
Thank you for your time,
[NAME]
[ADDRESS]
[PHONE NUMBER]
[EMAIL ADDRESS] | non_priority | eureka letter to mayor and council members to sseaman ci eureka ca gov lcastellano ci eureka ca gov hmessner ci eureka ca gov kbergel ci eureka ca gov aallison ci eureka ca gov narroyo ci eureka ca gov cityclerk ci eureka ca gov subject commit to reallocate for social equity message to whom it may concern i am a resident of eureka s ward i am writing to demand that the city council adopt a budget strategy that prioritizes community well being and redirects funding away from the police in the next budget evaluation period we have seen mounting evidence that police departments are ineffective institutions that marginalize minority communities and put citizens at risk of injury and death yet the police budget accounts for of our general fund see this article i ask that you redirect the majority of the allotted for crime prevention toward community programs that provide citizens with basic human needs like affordable healthcare and housing we don’t need a militarized police force we need to create a space in which more mental health service providers social workers victim survivor advocates religious leaders neighbors and friends all of the people who really make up our community can look out for one another this is of course a long transition process but real actionable change starts with reallocating funding and investing in inclusive and diverse support strategies for our community as the city council the budget proposal is in your hands it is your duty to represent your constituents i am urging you to completely revise the budget for the fiscal year we can be a beacon for other cities to follow if only we have the courage to change thank you for your time | 0 |
323,167 | 9,850,836,659 | IssuesEvent | 2019-06-19 09:05:43 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | nextdoor.com - site is not usable | browser-firefox engine-gecko priority-normal | <!-- @browser: Firefox 62.0 -->
<!-- @ua_header: https://chase.com/Mozilla/5.0 (rv:62.0) Gecko/20100101 Firefox/62.0 -->
<!-- @reported_with: addon-reporter-firefox -->
**URL**: https://nextdoor.com/login/?session_error=true
**Browser / Version**: Firefox 62.0
**Operating System**: Unknown
**Tested Another Browser**: Yes
**Problem type**: Site is not usable
**Description**: can't log in
**Steps to Reproduce**:
when log in: it gives error message:There was an error establishing a session. Please try again.
[](https://webcompat.com/uploads/2019/6/2d14ec09-75d8-4115-9c97-aaf1789d2bc7.jpg)
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | nextdoor.com - site is not usable - <!-- @browser: Firefox 62.0 -->
<!-- @ua_header: https://chase.com/Mozilla/5.0 (rv:62.0) Gecko/20100101 Firefox/62.0 -->
<!-- @reported_with: addon-reporter-firefox -->
**URL**: https://nextdoor.com/login/?session_error=true
**Browser / Version**: Firefox 62.0
**Operating System**: Unknown
**Tested Another Browser**: Yes
**Problem type**: Site is not usable
**Description**: can't log in
**Steps to Reproduce**:
when log in: it gives error message:There was an error establishing a session. Please try again.
[](https://webcompat.com/uploads/2019/6/2d14ec09-75d8-4115-9c97-aaf1789d2bc7.jpg)
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | priority | nextdoor com site is not usable url browser version firefox operating system unknown tested another browser yes problem type site is not usable description can t log in steps to reproduce when log in it gives error message there was an error establishing a session please try again browser configuration none from with ❤️ | 1 |
18,732 | 5,697,599,996 | IssuesEvent | 2017-04-16 23:21:29 | Xinnx/rsps317 | https://api.github.com/repos/Xinnx/rsps317 | closed | missing google commons dependency | code cleanup | multiple packages in the build are missing packages from com.google.common.* | 1.0 | missing google commons dependency - multiple packages in the build are missing packages from com.google.common.* | non_priority | missing google commons dependency multiple packages in the build are missing packages from com google common | 0 |
218,980 | 24,424,803,958 | IssuesEvent | 2022-10-06 01:05:23 | jasonbrown17/jasonbrown17.github.io | https://api.github.com/repos/jasonbrown17/jasonbrown17.github.io | opened | WS-2022-0320 (Medium) detected in commonmarker-0.17.13.gem | security vulnerability | ## WS-2022-0320 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commonmarker-0.17.13.gem</b></p></summary>
<p>A fast, safe, extensible parser for CommonMark. This wraps the official libcmark library.</p>
<p>Library home page: <a href="https://rubygems.org/gems/commonmarker-0.17.13.gem">https://rubygems.org/gems/commonmarker-0.17.13.gem</a></p>
<p>Path to dependency file: /Gemfile.lock</p>
<p>Path to vulnerable library: /var/lib/gems/2.5.0/cache/commonmarker-0.17.13.gem</p>
<p>
Dependency Hierarchy:
- github-pages-204.gem (Root Library)
- jekyll-commonmark-ghpages-0.1.6.gem
- :x: **commonmarker-0.17.13.gem** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Unbounded resource exhaustion in cmark-gfm autolink extension may lead to denial of service
<p>Publish Date: 2022-09-21
<p>URL: <a href=https://github.com/gjtorikian/commonmarker/commit/a8f8d76fbc8c92ddb2e539a06bd93c5f8326705e>WS-2022-0320</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-4qw4-jpp4-8gvp">https://github.com/advisories/GHSA-4qw4-jpp4-8gvp</a></p>
<p>Release Date: 2022-09-21</p>
<p>Fix Resolution: commonmarker - 0.23.6</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | WS-2022-0320 (Medium) detected in commonmarker-0.17.13.gem - ## WS-2022-0320 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commonmarker-0.17.13.gem</b></p></summary>
<p>A fast, safe, extensible parser for CommonMark. This wraps the official libcmark library.</p>
<p>Library home page: <a href="https://rubygems.org/gems/commonmarker-0.17.13.gem">https://rubygems.org/gems/commonmarker-0.17.13.gem</a></p>
<p>Path to dependency file: /Gemfile.lock</p>
<p>Path to vulnerable library: /var/lib/gems/2.5.0/cache/commonmarker-0.17.13.gem</p>
<p>
Dependency Hierarchy:
- github-pages-204.gem (Root Library)
- jekyll-commonmark-ghpages-0.1.6.gem
- :x: **commonmarker-0.17.13.gem** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Unbounded resource exhaustion in cmark-gfm autolink extension may lead to denial of service
<p>Publish Date: 2022-09-21
<p>URL: <a href=https://github.com/gjtorikian/commonmarker/commit/a8f8d76fbc8c92ddb2e539a06bd93c5f8326705e>WS-2022-0320</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-4qw4-jpp4-8gvp">https://github.com/advisories/GHSA-4qw4-jpp4-8gvp</a></p>
<p>Release Date: 2022-09-21</p>
<p>Fix Resolution: commonmarker - 0.23.6</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | ws medium detected in commonmarker gem ws medium severity vulnerability vulnerable library commonmarker gem a fast safe extensible parser for commonmark this wraps the official libcmark library library home page a href path to dependency file gemfile lock path to vulnerable library var lib gems cache commonmarker gem dependency hierarchy github pages gem root library jekyll commonmark ghpages gem x commonmarker gem vulnerable library found in base branch master vulnerability details unbounded resource exhaustion in cmark gfm autolink extension may lead to denial of service publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution commonmarker step up your open source security game with mend | 0 |
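For reference, the 5.5 base score reported above follows from the listed metrics (AV:L / AC:L / PR:N / UI:R / S:U / C:N / I:N / A:H) under the published CVSS 3.0 base-score equations. The sketch below is illustrative only; the metric weights (0.55, 0.77, 0.85, 0.62, 0.56) come from the CVSS 3.0 specification, and the scope-changed branch is included just for completeness.

```python
import math

# CVSS 3.0 base score, sketched from the published equations.
def cvss3_base_score(av, ac, pr, ui, scope_changed, c, i, a):
    iss = 1 - (1 - c) * (1 - i) * (1 - a)  # Impact Sub-Score
    if scope_changed:
        impact = 7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15
    else:
        impact = 6.42 * iss
    exploitability = 8.22 * av * ac * pr * ui
    if impact <= 0:
        return 0.0
    raw = impact + exploitability
    if scope_changed:
        raw = 1.08 * raw
    return math.ceil(min(raw, 10) * 10) / 10  # CVSS "round up" to one decimal

# Metrics from this report: AV:L=0.55, AC:L=0.77, PR:N=0.85, UI:R=0.62,
# scope unchanged, C:N=0, I:N=0, A:H=0.56
score = cvss3_base_score(0.55, 0.77, 0.85, 0.62, False, 0.0, 0.0, 0.56)  # → 5.5
```

Plugging in the report's metrics gives impact 3.60 plus exploitability 1.83, rounded up to 5.5, matching the advisory.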
38,369 | 8,789,660,957 | IssuesEvent | 2018-12-21 05:09:41 | line/armeria | https://api.github.com/repos/line/armeria | closed | Armeria removes ContentLength header when performing PUT requests without content | defect | When performing a PUT http request without content and with Content-Length=0 with armeria, the Content-Length header gets removed [here](https://github.com/line/armeria/blob/master/core/src/main/java/com/linecorp/armeria/internal/Http1ObjectEncoder.java#L247). This was added in #226 which speaks of GET requests, but this logic is being applied to all request methods, including PUT, POST, etc., which are supposed to have content. | 1.0 | Armeria removes ContentLength header when performing PUT requests without content - When performing a PUT http request without content and with Content-Length=0 with armeria, the Content-Length header gets removed [here](https://github.com/line/armeria/blob/master/core/src/main/java/com/linecorp/armeria/internal/Http1ObjectEncoder.java#L247). This was added in #226 which speaks of GET requests, but this logic is being applied to all request methods, including PUT, POST, etc., which are supposed to have content. | non_priority | armeria removes contentlength header when performing put requests without content when performing a put http request without content and with content length with armeria the content length header gets removed this was added in which speaks of get requests but this logic is being applied to all request methods including put post etc which are supposed to have content | 0 |
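To make the reported Armeria behavior concrete, here is a minimal, hypothetical sketch (in Python, not Armeria's actual Java code) of the method-aware check the report argues `Http1ObjectEncoder` should apply: a zero Content-Length header may be dropped for body-less methods such as GET, but must be preserved for PUT/POST. The exact method set used here is an illustrative assumption.

```python
# Methods that conventionally carry no request body; an assumption for
# illustration, not Armeria's actual rule.
BODYLESS_METHODS = {"GET", "HEAD", "TRACE"}

def should_strip_content_length(method: str, content_length: int) -> bool:
    """Return True only if a zero Content-Length may safely be removed."""
    return content_length == 0 and method.upper() in BODYLESS_METHODS
```

Under this check, `PUT` and `POST` requests keep their `Content-Length: 0` header, which is the fix the issue asks for.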
97,476 | 11,014,466,574 | IssuesEvent | 2019-12-04 22:50:25 | vvvv/TeachingPatching | https://api.github.com/repos/vvvv/TeachingPatching | opened | Migrating gray book from legacy to current | Documentation | 
The link is here: https://jhnnslmk.gitbook.io/graybook/
**!!!** this is not intended to be the link to the new graybook.
***My reasoning: I personally think the new graybook looks much better than the legacy one, providing more readable visual structure as well. Also these little hint-sections just seem very handy.
Also, I wouldn't exactly say it's a good thing to release the documentation to your state-of-the-art software with a legacy-prefix ...***
And as it seems it can display what joreg referred to just perfectly.
I would offer to start migrating the gitbook to the new system to see if we hit any other obstacle.
Any objections? Updating it doesn't have to be on halt either. I'll even offer continuous migration while the old one is receiving updates still (fingers crossed) and you can make the switch whenever I/we reach 100% at some point. | 1.0 | Migrating gray book from legacy to current - 
The link is here: https://jhnnslmk.gitbook.io/graybook/
**!!!** this is not intended to be the link to the new graybook.
***My reasoning: I personally think the new graybook looks much better than the legacy one, providing more readable visual structure as well. Also these little hint-sections just seem very handy.
Also, I wouldn't exactly say it's a good thing to release the documentation to your state-of-the-art software with a legacy-prefix ...***
And as it seems it can display what joreg referred to just perfectly.
I would offer to start migrating the gitbook to the new system to see if we hit any other obstacle.
Any objections? Updating it doesn't have to be on halt either. I'll even offer continuous migration while the old one is receiving updates still (fingers crossed) and you can make the switch whenever I/we reach 100% at some point. | non_priority | migrating gray book from legacy to current the link is here this is not to be intended the link to the new graybook my reasoning i personally think the new graybook looks much better than the legacy one providing more readable visual structure as well also these little hint sections just seem very handy also i wouldn t exactly say it s a good thing to release the documentation to your state of the art software with a legacy prefix and as it seems it can display what joreg referred to just perfectly i would offer to start migrating the gitbook to the new system to see if we hit any other obstacle any objections updating it doesn t have to be on halt either i ll even offer continuous migration while the old one is receiving updates still fingers crossed and you can make the switch whenever i we reach at some point | 0 |
608,931 | 18,851,490,760 | IssuesEvent | 2021-11-11 21:32:18 | craftercms/craftercms | https://api.github.com/repos/craftercms/craftercms | closed | [studio-ui] improve sidebar explorer widget breadcrumb behavior | enhancement priority: low | # Feature Request
#### Is your feature request related to a problem? Please describe.
The explorer widget displays a bread crumb that includes the current folder. When the current folder is the first level this creates a visually redundant experience for the user (Cabinet is "Pages", Breadcrumb is "Home" and right under that is the page "Home")
This visual redundancy is compounded when the Cabinet carries the same name as the top level folder (Cabinet is "Components", Breadcrumb is "components" and right under that is the folder "components")
#### Describe the solution you'd like
Remove the current item from the breadcrumb. When the current item is the first item this will leave that area empty. Visually this is also unappealing -- perhaps we can use this space to add a small description for the Cabinet?
#### Describe alternatives you've considered
When the breadcrumb is empty simply remove the gap (move the first folder up into this area)
We don't like this idea because it collides with the filter icon and it makes the UI "jumpy"
#### Additional context
<img width="531" alt="Screen Shot 2021-07-22 at 9 52 32 AM" src="https://user-images.githubusercontent.com/169432/126650831-ff94b5fd-2c07-4ab5-a3cd-96c8c51fdf7d.png">
| 1.0 | [studio-ui] improve sidebar explorer widget breadcrumb behavior - # Feature Request
#### Is your feature request related to a problem? Please describe.
The explorer widget displays a bread crumb that includes the current folder. When the current folder is the first level this creates a visually redundant experience for the user (Cabinet is "Pages", Breadcrumb is "Home" and right under that is the page "Home")
This visual redundancy is compounded when the Cabinet carries the same name as the top level folder (Cabinet is "Components", Breadcrumb is "components" and right under that is the folder "components")
#### Describe the solution you'd like
Remove the current item from the breadcrumb. When the current item is the first item this will leave that area empty. Visually this is also unappealing -- perhaps we can use this space to add a small description for the Cabinet?
#### Describe alternatives you've considered
When the breadcrumb is empty simply remove the gap (move the first folder up into this area)
We don't like this idea because it collides with the filter icon and it makes the UI "jumpy"
#### Additional context
<img width="531" alt="Screen Shot 2021-07-22 at 9 52 32 AM" src="https://user-images.githubusercontent.com/169432/126650831-ff94b5fd-2c07-4ab5-a3cd-96c8c51fdf7d.png">
| priority | improve sidebar explorer widget breadcrumb behavior feature request is your feature request related to a problem please describe the explorer widget displays a bread crumb that includes the current folder when the current folder is the first level this creates a visually redundant experience for the user cabinet is pages breadcrumb is home and right under that is the page home this visual redundancy is compounded when the cabinet carries the same name as the top level folder cabinet is components breadcrumb is components and right under that is the folder components describe the solution you d like remove the current item from the breadcrumb when the current item is the first item this will leave that area empty visually this is also unappealing perhaps we can use this space to add a small description for the cabinet describe alternatives you ve considered when the breadcrumb is empty simply remove the gap move the first folder up into this area we don t like this idea because it collides with the filter icon and it makes the ui jumpy additional context img width alt screen shot at am src | 1 |
246,604 | 7,895,427,195 | IssuesEvent | 2018-06-29 03:12:34 | aowen87/BAR | https://api.github.com/repos/aowen87/BAR | closed | Add ability to specify the x axis label for curve plots. | Expected Use: 3 - Occasional Feature Impact: 3 - Medium OS: All Priority: High Support Group: Any version: 2.12.3 | Dinesh Shetty has time dependent curve data and it only labels it with x-axis. He would like it to say time. This seem like a reasonable request.
-----------------------REDMINE MIGRATION-----------------------
This ticket was migrated from Redmine. The following information
could not be accurately captured in the new ticket:
Original author: Eric Brugger
Original creation: 10/30/2017 04:45 pm
Original update: 11/21/2017 06:51 pm
Ticket number: 2957 | 1.0 | Add ability to specify the x axis label for curve plots. - Dinesh Shetty has time dependent curve data and it only labels it with x-axis. He would like it to say time. This seem like a reasonable request.
-----------------------REDMINE MIGRATION-----------------------
This ticket was migrated from Redmine. The following information
could not be accurately captured in the new ticket:
Original author: Eric Brugger
Original creation: 10/30/2017 04:45 pm
Original update: 11/21/2017 06:51 pm
Ticket number: 2957 | priority | add ability to specify the x axis label for curve plots dinesh shetty has time dependent curve data and it only labels it with x axis he would like it to say time this seem like a reasonable request redmine migration this ticket was migrated from redmine the following information could not be accurately captured in the new ticket original author eric brugger original creation pm original update pm ticket number | 1 |
225,792 | 7,495,070,627 | IssuesEvent | 2018-04-07 16:51:51 | openshift/origin | https://api.github.com/repos/openshift/origin | closed | External Kubernetes: Builder and registry attempt to hit the base Kubernetes API instead of the OpenShift API for OpenShift resources | component/imageregistry lifecycle/rotten priority/P2 | Running under Kubernetes as usual.
Here are the build errors, notice the error near the beginning about being unable to PUT builds:
```
I0317 23:14:19.031041 1 builder.go:57] Master version "v1.1.4", Builder version "v1.1.4"
I0317 23:14:19.031912 1 builder.go:145] Running build with cgroup limits: api.CGroupLimits{MemoryLimitBytes:9223372036854771712, CPUShares:2, CPUPeriod:100000, CPUQuota:-1, MemorySwap:9223372036854771712}
I0317 23:14:19.032187 1 source.go:197] Downloading "https://github.com/enokd/docker-node-hello.git" ...
I0317 23:14:19.772437 1 source.go:208] Cloning source from https://github.com/enokd/docker-node-hello.git
W0317 23:14:21.129408 1 common.go:89] An error occurred saving build revision: the server could not find the requested resource (put builds docker-node-hello-5)
Step 1 : FROM centos@sha256:ec1bf627545d77d05270b3bbd32a9acca713189c58bc118f21abd17ff2629e3f
Pulling from library/centos
Pulling fs layer
[downloading, extracting, etc]
Pull complete
Digest: sha256:ec1bf627545d77d05270b3bbd32a9acca713189c58bc118f21abd17ff2629e3f
Status: Downloaded newer image for centos@sha256:ec1bf627545d77d05270b3bbd32a9acca713189c58bc118f21abd17ff2629e3f
---> ed452988fb6e
Step 2 : ENV "BUILD_LOGLEVEL" "2"
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in e7e8448f3c37
---> ad6a59f257b6
Removing intermediate container e7e8448f3c37
Step 3 : RUN rpm -Uvh http://download.fedoraproject.org/pub/epel/6/i386/epel-release-6-8.noarch.rpm
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in a0fb9e022dba
warning: /var/tmp/rpm-tmp.Twu0i5: Header V3 RSA/SHA256 Signature, key ID 0608b895: NOKEY
Retrieving http://download.fedoraproject.org/pub/epel/6/i386/epel-release-6-8.noarch.rpm
Preparing... ##################################################
epel-release ##################################################
---> 86f9329cc8f9
Removing intermediate container a0fb9e022dba
Step 4 : RUN yum install -y -q npm
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in e56b2545570d
warning: rpmts_HdrFromFdno: Header V3 RSA/SHA256 Signature, key ID 0608b895: NOKEY
Importing GPG key 0x0608B895:
 Userid : EPEL (6) <epel@fedoraproject.org>
 Package: epel-release-6-8.noarch (installed)
 From : /etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-6
warning: rpmts_HdrFromFdno: Header V3 RSA/SHA1 Signature, key ID c105b9de: NOKEY
Importing GPG key 0xC105B9DE:
 Userid : CentOS-6 Key (CentOS 6 Official Signing Key) <centos-6-key@centos.org>
 Package: centos-release-6-7.el6.centos.12.3.x86_64 (@CentOS/6.7)
 From : /etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-6
Warning: RPMDB altered outside of yum.
 ---> bbcea1291029
Removing intermediate container e56b2545570d
Step 5 : ADD . /src
---> 14651e8812e1
Removing intermediate container 41cf3579c84a
Step 6 : RUN cd /src; npm install
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in 7d36d14eb47e
npm http GET https://registry.npmjs.org/express/3.2.4
npm http 200 https://registry.npmjs.org/express/3.2.4
npm http GET https://registry.npmjs.org/express/-/express-3.2.4.tgz
npm http 200 https://registry.npmjs.org/express/-/express-3.2.4.tgz
npm http GET https://registry.npmjs.org/connect/2.7.9
npm http GET https://registry.npmjs.org/commander/0.6.1
npm http GET https://registry.npmjs.org/range-parser/0.0.4
npm http GET https://registry.npmjs.org/mkdirp/0.3.4
npm http GET https://registry.npmjs.org/cookie/0.0.5
npm http GET https://registry.npmjs.org/buffer-crc32/0.2.1
npm http GET https://registry.npmjs.org/fresh/0.1.0
npm http GET https://registry.npmjs.org/methods/0.0.1
npm http GET https://registry.npmjs.org/send/0.1.0
npm http GET https://registry.npmjs.org/cookie-signature/1.0.1
npm http GET https://registry.npmjs.org/debug
npm http 200 https://registry.npmjs.org/commander/0.6.1
npm http GET https://registry.npmjs.org/commander/-/commander-0.6.1.tgz
npm http 200 https://registry.npmjs.org/range-parser/0.0.4
npm http 200 https://registry.npmjs.org/cookie/0.0.5
npm http GET https://registry.npmjs.org/range-parser/-/range-parser-0.0.4.tgz
npm http 200 https://registry.npmjs.org/mkdirp/0.3.4
npm http 200 https://registry.npmjs.org/buffer-crc32/0.2.1
npm http GET https://registry.npmjs.org/cookie/-/cookie-0.0.5.tgz
npm http 200 https://registry.npmjs.org/fresh/0.1.0
npm http 200 https://registry.npmjs.org/methods/0.0.1
npm http GET https://registry.npmjs.org/mkdirp/-/mkdirp-0.3.4.tgz
npm http GET https://registry.npmjs.org/buffer-crc32/-/buffer-crc32-0.2.1.tgz
npm http 200 https://registry.npmjs.org/cookie-signature/1.0.1
npm http 200 https://registry.npmjs.org/connect/2.7.9
npm http GET https://registry.npmjs.org/fresh/-/fresh-0.1.0.tgz
npm http GET https://registry.npmjs.org/methods/-/methods-0.0.1.tgz
npm http 200 https://registry.npmjs.org/debug
npm http GET https://registry.npmjs.org/cookie-signature/-/cookie-signature-1.0.1.tgz
npm http GET https://registry.npmjs.org/connect/-/connect-2.7.9.tgz
npm http GET https://registry.npmjs.org/debug/-/debug-2.2.0.tgz
npm http 200 https://registry.npmjs.org/range-parser/-/range-parser-0.0.4.tgz
npm http 200 https://registry.npmjs.org/commander/-/commander-0.6.1.tgz
npm http 200 https://registry.npmjs.org/cookie/-/cookie-0.0.5.tgz
npm http 200 https://registry.npmjs.org/fresh/-/fresh-0.1.0.tgz
npm http 200 https://registry.npmjs.org/methods/-/methods-0.0.1.tgz
npm http 200 https://registry.npmjs.org/buffer-crc32/-/buffer-crc32-0.2.1.tgz
npm http 200 https://registry.npmjs.org/mkdirp/-/mkdirp-0.3.4.tgz
npm http 200 https://registry.npmjs.org/cookie-signature/-/cookie-signature-1.0.1.tgz
npm http 200 https://registry.npmjs.org/debug/-/debug-2.2.0.tgz
npm http 200 https://registry.npmjs.org/send/0.1.0
npm http GET https://registry.npmjs.org/send/-/send-0.1.0.tgz
npm http 200 https://registry.npmjs.org/connect/-/connect-2.7.9.tgz
npm http 200 https://registry.npmjs.org/send/-/send-0.1.0.tgz
npm http GET https://registry.npmjs.org/mime/1.2.6
npm http GET https://registry.npmjs.org/ms/0.7.1
npm http 200 https://registry.npmjs.org/ms/0.7.1
npm http GET https://registry.npmjs.org/ms/-/ms-0.7.1.tgz
npm http 200 https://registry.npmjs.org/mime/1.2.6
npm http GET https://registry.npmjs.org/mime/-/mime-1.2.6.tgz
npm http GET https://registry.npmjs.org/qs/0.6.4
npm http GET https://registry.npmjs.org/formidable/1.0.13
npm http GET https://registry.npmjs.org/bytes/0.2.0
npm http 200 https://registry.npmjs.org/ms/-/ms-0.7.1.tgz
npm http GET https://registry.npmjs.org/pause/0.0.1
npm http 200 https://registry.npmjs.org/mime/-/mime-1.2.6.tgz
npm http 200 https://registry.npmjs.org/bytes/0.2.0
npm http 200 https://registry.npmjs.org/pause/0.0.1
npm http GET https://registry.npmjs.org/bytes/-/bytes-0.2.0.tgz
npm http GET https://registry.npmjs.org/pause/-/pause-0.0.1.tgz
npm http 200 https://registry.npmjs.org/formidable/1.0.13
npm http GET https://registry.npmjs.org/formidable/-/formidable-1.0.13.tgz
npm http 200 https://registry.npmjs.org/bytes/-/bytes-0.2.0.tgz
npm http 200 https://registry.npmjs.org/qs/0.6.4
npm http 200 https://registry.npmjs.org/pause/-/pause-0.0.1.tgz
npm http GET https://registry.npmjs.org/qs/-/qs-0.6.4.tgz
npm http 200 https://registry.npmjs.org/formidable/-/formidable-1.0.13.tgz
npm http 200 https://registry.npmjs.org/qs/-/qs-0.6.4.tgz
npm WARN engine formidable@1.0.13: wanted: {"node":"<0.9.0"} (current: {"node":"v0.10.42","npm":"1.3.6"})
express@3.2.4 node_modules/express
├── methods@0.0.1
├── fresh@0.1.0
├── range-parser@0.0.4
├── cookie-signature@1.0.1
├── buffer-crc32@0.2.1
├── cookie@0.0.5
├── commander@0.6.1
├── mkdirp@0.3.4
├── debug@2.2.0 (ms@0.7.1)
├── send@0.1.0 (mime@1.2.6)
└── connect@2.7.9 (pause@0.0.1, qs@0.6.4, bytes@0.2.0, formidable@1.0.13)
---> fc571f308bd8
Removing intermediate container 7d36d14eb47e
Step 7 : EXPOSE 8080
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in e6c12fe3b4a3
---> 560904365f82
Removing intermediate container e6c12fe3b4a3
Step 8 : CMD node /src/index.js
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in 62915084469e
---> f899be7443f4
Removing intermediate container 62915084469e
Step 9 : ENV "OPENSHIFT_BUILD_NAME" "docker-node-hello-5" "OPENSHIFT_BUILD_NAMESPACE" "test" "OPENSHIFT_BUILD_SOURCE" "https://github.com/enokd/docker-node-hello.git" "OPENSHIFT_BUILD_COMMIT" "bf26884a967968c5b38c7f5cc30b11ef359c64ff"
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in 22b0c65c3f4e
---> bcf9ba173d1f
Removing intermediate container 22b0c65c3f4e
Step 10 : LABEL "io.openshift.build.commit.author" "Djibril Koné \u003ckone.djibril@gmail.com\u003e" "io.openshift.build.commit.date" "Sun Nov 15 23:09:18 2015 +0100" "io.openshift.build.commit.id" "bf26884a967968c5b38c7f5cc30b11ef359c64ff" "io.openshift.build.commit.ref" "master" "io.openshift.build.commit.message" "Merge pull request #2 from fossilet/patch-3" "io.openshift.build.source-location" "https://github.com/enokd/docker-node-hello.git"
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in 66bc3b8ca08f
---> 9b16587e3af4
Removing intermediate container 66bc3b8ca08f
Successfully built 9b16587e3af4
I0317 23:15:47.261534 1 docker.go:118] Pushing image 10.0.67.144:5000/test/docker-node-hello:latest ...
F0317 23:15:47.333034 1 builder.go:204] Error: build error: Failed to push image: Received unexpected HTTP status: 500 Internal Server Error
```
While the error did not fail out the build, and registry errors resulted in the push failure, I think this indicates that the registry may be attempting to connect to the Kubernetes cluster host instead of the openshift master, despite OPENSHIFT_MASTER being set in the environment.
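A minimal, hypothetical sketch of the endpoint selection this report implies is going wrong: components handling OpenShift resources should prefer the configured OpenShift master (the OPENSHIFT_MASTER variable mentioned above) over the default in-cluster Kubernetes service host. Function and fallback names here are illustrative, not the project's actual code.

```python
def api_endpoint(env: dict) -> str:
    """Pick the API endpoint for OpenShift resources: prefer the OpenShift
    master if configured, fall back to the in-cluster Kubernetes host."""
    if env.get("OPENSHIFT_MASTER"):
        return env["OPENSHIFT_MASTER"]
    host = env.get("KUBERNETES_SERVICE_HOST", "kubernetes.default.svc")
    port = env.get("KUBERNETES_SERVICE_PORT", "443")
    return f"https://{host}:{port}"
```

If the builder or registry skips the first branch, PUTs of `builds` and image-stream lookups land on the plain Kubernetes API, which does not serve those resources, matching the "could not find the requested resource" and push failures above.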
| 1.0 | External Kubernetes: Builder and registry attempt to hit the base Kubernetes API instead of the OpenShift API for OpenShift resources - Running under Kubernetes as usual.
Here are the build errors, notice the error near the beginning about being unable to PUT builds:
```
I0317 23:14:19.031041 1 builder.go:57] Master version "v1.1.4", Builder version "v1.1.4"
I0317 23:14:19.031912 1 builder.go:145] Running build with cgroup limits: api.CGroupLimits{MemoryLimitBytes:9223372036854771712, CPUShares:2, CPUPeriod:100000, CPUQuota:-1, MemorySwap:9223372036854771712}
I0317 23:14:19.032187 1 source.go:197] Downloading "https://github.com/enokd/docker-node-hello.git" ...
I0317 23:14:19.772437 1 source.go:208] Cloning source from https://github.com/enokd/docker-node-hello.git
W0317 23:14:21.129408 1 common.go:89] An error occurred saving build revision: the server could not find the requested resource (put builds docker-node-hello-5)
Step 1 : FROM centos@sha256:ec1bf627545d77d05270b3bbd32a9acca713189c58bc118f21abd17ff2629e3f
Pulling from library/centos
Pulling fs layer
[downloading, extracting, etc]
Pull complete
Digest: sha256:ec1bf627545d77d05270b3bbd32a9acca713189c58bc118f21abd17ff2629e3f
Status: Downloaded newer image for centos@sha256:ec1bf627545d77d05270b3bbd32a9acca713189c58bc118f21abd17ff2629e3f
---> ed452988fb6e
Step 2 : ENV "BUILD_LOGLEVEL" "2"
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in e7e8448f3c37
---> ad6a59f257b6
Removing intermediate container e7e8448f3c37
Step 3 : RUN rpm -Uvh http://download.fedoraproject.org/pub/epel/6/i386/epel-release-6-8.noarch.rpm
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in a0fb9e022dba
warning: /var/tmp/rpm-tmp.Twu0i5: Header V3 RSA/SHA256 Signature, key ID 0608b895: NOKEY
Retrieving http://download.fedoraproject.org/pub/epel/6/i386/epel-release-6-8.noarch.rpm
Preparing... ##################################################
epel-release ##################################################
---> 86f9329cc8f9
Removing intermediate container a0fb9e022dba
Step 4 : RUN yum install -y -q npm
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in e56b2545570d
warning: rpmts_HdrFromFdno: Header V3 RSA/SHA256 Signature, key ID 0608b895: NOKEY
Importing GPG key 0x0608B895:
Userid : EPEL (6) <epel@fedoraproject.org>
Package: epel-release-6-8.noarch (installed)
From : /etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-6
warning: rpmts_HdrFromFdno: Header V3 RSA/SHA1 Signature, key ID c105b9de: NOKEY
Importing GPG key 0xC105B9DE:
Userid : CentOS-6 Key (CentOS 6 Official Signing Key) <centos-6-key@centos.org>
Package: centos-release-6-7.el6.centos.12.3.x86_64 (@CentOS/6.7)
From : /etc/pki/rpm-gpg/RPM-GPG-KEY-CentOS-6
Warning: RPMDB altered outside of yum.
 ---> bbcea1291029
Removing intermediate container e56b2545570d
Step 5 : ADD . /src
---> 14651e8812e1
Removing intermediate container 41cf3579c84a
Step 6 : RUN cd /src; npm install
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in 7d36d14eb47e
npm http GET https://registry.npmjs.org/express/3.2.4
npm http 200 https://registry.npmjs.org/express/3.2.4
npm http GET https://registry.npmjs.org/express/-/express-3.2.4.tgz
npm http 200 https://registry.npmjs.org/express/-/express-3.2.4.tgz
npm http GET https://registry.npmjs.org/connect/2.7.9
npm http GET https://registry.npmjs.org/commander/0.6.1
npm http GET https://registry.npmjs.org/range-parser/0.0.4
npm http GET https://registry.npmjs.org/mkdirp/0.3.4
npm http GET https://registry.npmjs.org/cookie/0.0.5
npm http GET https://registry.npmjs.org/buffer-crc32/0.2.1
npm http GET https://registry.npmjs.org/fresh/0.1.0
npm http GET https://registry.npmjs.org/methods/0.0.1
npm http GET https://registry.npmjs.org/send/0.1.0
npm http GET https://registry.npmjs.org/cookie-signature/1.0.1
npm http GET https://registry.npmjs.org/debug
npm http 200 https://registry.npmjs.org/commander/0.6.1
npm http GET https://registry.npmjs.org/commander/-/commander-0.6.1.tgz
npm http 200 https://registry.npmjs.org/range-parser/0.0.4
npm http 200 https://registry.npmjs.org/cookie/0.0.5
npm http GET https://registry.npmjs.org/range-parser/-/range-parser-0.0.4.tgz
npm http 200 https://registry.npmjs.org/mkdirp/0.3.4
npm http 200 https://registry.npmjs.org/buffer-crc32/0.2.1
npm http GET https://registry.npmjs.org/cookie/-/cookie-0.0.5.tgz
npm http 200 https://registry.npmjs.org/fresh/0.1.0
npm http 200 https://registry.npmjs.org/methods/0.0.1
npm http GET https://registry.npmjs.org/mkdirp/-/mkdirp-0.3.4.tgz
npm http GET https://registry.npmjs.org/buffer-crc32/-/buffer-crc32-0.2.1.tgz
npm http 200 https://registry.npmjs.org/cookie-signature/1.0.1
npm http 200 https://registry.npmjs.org/connect/2.7.9
npm http GET https://registry.npmjs.org/fresh/-/fresh-0.1.0.tgz
npm http GET https://registry.npmjs.org/methods/-/methods-0.0.1.tgz
npm http 200 https://registry.npmjs.org/debug
npm http GET https://registry.npmjs.org/cookie-signature/-/cookie-signature-1.0.1.tgz
npm http GET https://registry.npmjs.org/connect/-/connect-2.7.9.tgz
npm http GET https://registry.npmjs.org/debug/-/debug-2.2.0.tgz
npm http 200 https://registry.npmjs.org/range-parser/-/range-parser-0.0.4.tgz
npm http 200 https://registry.npmjs.org/commander/-/commander-0.6.1.tgz
npm http 200 https://registry.npmjs.org/cookie/-/cookie-0.0.5.tgz
npm http 200 https://registry.npmjs.org/fresh/-/fresh-0.1.0.tgz
npm http 200 https://registry.npmjs.org/methods/-/methods-0.0.1.tgz
npm http 200 https://registry.npmjs.org/buffer-crc32/-/buffer-crc32-0.2.1.tgz
npm http 200 https://registry.npmjs.org/mkdirp/-/mkdirp-0.3.4.tgz
npm http 200 https://registry.npmjs.org/cookie-signature/-/cookie-signature-1.0.1.tgz
npm http 200 https://registry.npmjs.org/debug/-/debug-2.2.0.tgz
npm http 200 https://registry.npmjs.org/send/0.1.0
npm http GET https://registry.npmjs.org/send/-/send-0.1.0.tgz
npm http 200 https://registry.npmjs.org/connect/-/connect-2.7.9.tgz
npm http 200 https://registry.npmjs.org/send/-/send-0.1.0.tgz
npm http GET https://registry.npmjs.org/mime/1.2.6
npm http GET https://registry.npmjs.org/ms/0.7.1
npm http 200 https://registry.npmjs.org/ms/0.7.1
npm http GET https://registry.npmjs.org/ms/-/ms-0.7.1.tgz
npm http 200 https://registry.npmjs.org/mime/1.2.6
npm http GET https://registry.npmjs.org/mime/-/mime-1.2.6.tgz
npm http GET https://registry.npmjs.org/qs/0.6.4
npm http GET https://registry.npmjs.org/formidable/1.0.13
npm http GET https://registry.npmjs.org/bytes/0.2.0
npm http 200 https://registry.npmjs.org/ms/-/ms-0.7.1.tgz
npm http GET https://registry.npmjs.org/pause/0.0.1
npm http 200 https://registry.npmjs.org/mime/-/mime-1.2.6.tgz
npm http 200 https://registry.npmjs.org/bytes/0.2.0
npm http 200 https://registry.npmjs.org/pause/0.0.1
npm http GET https://registry.npmjs.org/bytes/-/bytes-0.2.0.tgz
npm http GET https://registry.npmjs.org/pause/-/pause-0.0.1.tgz
npm http 200 https://registry.npmjs.org/formidable/1.0.13
npm http GET https://registry.npmjs.org/formidable/-/formidable-1.0.13.tgz
npm http 200 https://registry.npmjs.org/bytes/-/bytes-0.2.0.tgz
npm http 200 https://registry.npmjs.org/qs/0.6.4
npm http 200 https://registry.npmjs.org/pause/-/pause-0.0.1.tgz
npm http GET https://registry.npmjs.org/qs/-/qs-0.6.4.tgz
npm http 200 https://registry.npmjs.org/formidable/-/formidable-1.0.13.tgz
npm http 200 https://registry.npmjs.org/qs/-/qs-0.6.4.tgz
npm WARN engine formidable@1.0.13: wanted: {"node":"<0.9.0"} (current: {"node":"v0.10.42","npm":"1.3.6"})
express@3.2.4 node_modules/express
├── methods@0.0.1
├── fresh@0.1.0
├── range-parser@0.0.4
├── cookie-signature@1.0.1
├── buffer-crc32@0.2.1
├── cookie@0.0.5
├── commander@0.6.1
├── mkdirp@0.3.4
├── debug@2.2.0 (ms@0.7.1)
├── send@0.1.0 (mime@1.2.6)
└── connect@2.7.9 (pause@0.0.1, qs@0.6.4, bytes@0.2.0, formidable@1.0.13)
---> fc571f308bd8
Removing intermediate container 7d36d14eb47e
Step 7 : EXPOSE 8080
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in e6c12fe3b4a3
---> 560904365f82
Removing intermediate container e6c12fe3b4a3
Step 8 : CMD node /src/index.js
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in 62915084469e
---> f899be7443f4
Removing intermediate container 62915084469e
Step 9 : ENV "OPENSHIFT_BUILD_NAME" "docker-node-hello-5" "OPENSHIFT_BUILD_NAMESPACE" "test" "OPENSHIFT_BUILD_SOURCE" "https://github.com/enokd/docker-node-hello.git" "OPENSHIFT_BUILD_COMMIT" "bf26884a967968c5b38c7f5cc30b11ef359c64ff"
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in 22b0c65c3f4e
---> bcf9ba173d1f
Removing intermediate container 22b0c65c3f4e
Step 10 : LABEL "io.openshift.build.commit.author" "Djibril Koné \u003ckone.djibril@gmail.com\u003e" "io.openshift.build.commit.date" "Sun Nov 15 23:09:18 2015 +0100" "io.openshift.build.commit.id" "bf26884a967968c5b38c7f5cc30b11ef359c64ff" "io.openshift.build.commit.ref" "master" "io.openshift.build.commit.message" "Merge pull request #2 from fossilet/patch-3" "io.openshift.build.source-location" "https://github.com/enokd/docker-node-hello.git"
---> [Warning] Your kernel does not support swap limit capabilities, memory limited without swap.
---> Running in 66bc3b8ca08f
---> 9b16587e3af4
Removing intermediate container 66bc3b8ca08f
Successfully built 9b16587e3af4
I0317 23:15:47.261534 1 docker.go:118] Pushing image 10.0.67.144:5000/test/docker-node-hello:latest ...
F0317 23:15:47.333034 1 builder.go:204] Error: build error: Failed to push image: Received unexpected HTTP status: 500 Internal Server Error
```
While the error did not fail out the build, and registry errors resulted in the push failure, I think this indicates that the registry may be attempting to connect to the Kubernetes cluster host instead of the openshift master, despite OPENSHIFT_MASTER being set in the environment.
| priority | external kubernetes builder and registry attempt to hit the base kubernetes api instead of the openshift api for openshift resources running under kubernetes as usual here are the build errors notice the error near the beginning about being unable to put builds builder go master version builder version builder go running build with cgroup limits api cgrouplimits memorylimitbytes cpushares cpuperiod cpuquota memoryswap source go downloading source go cloning source from common go an error occurred saving build revision the server could not find the requested resource put builds docker node hello step from centos pulling from library centos pulling fs layer pull complete digest status downloaded newer image for centos step env build loglevel your kernel does not support swap limit capabilities memory limited without swap running in removing intermediate container step run rpm uvh your kernel does not support swap limit capabilities memory limited without swap running in var tmp rpm tmp header rsa signature key id nokey preparing epel release removing intermediate container step run yum install y q npm your kernel does not support swap limit capabilities memory limited without swap running in hdrfromfdno header rsa signature key id nokey gpg key userid epel package epel release noarch installed from etc pki rpm gpg rpm gpg key epel hdrfromfdno header rsa signature key id nokey gpg key userid centos key centos official signing key package centos release centos centos from etc pki rpm gpg rpm gpg key centos rpmdb altered outside of yum removing intermediate container step add src removing intermediate container step run cd src npm install your kernel does not support swap limit capabilities memory limited without swap running in formidable wanted node current node npm node modules express ├── methods ├── fresh ├── range parser ├── cookie signature ├── buffer ├── cookie ├── commander ├── mkdirp ├── debug ms ├── send mime └── connect pause qs bytes formidable 
removing intermediate container step expose your kernel does not support swap limit capabilities memory limited without swap running in removing intermediate container step cmd node src index js your kernel does not support swap limit capabilities memory limited without swap running in removing intermediate container step env openshift build name docker node hello openshift build namespace test openshift build source openshift build commit your kernel does not support swap limit capabilities memory limited without swap running in removing intermediate container step label io openshift build commit author djibril konã© djibril gmail com io openshift build commit date sun nov io openshift build commit id io openshift build commit ref master io openshift build commit message merge pull request from fossilet patch io openshift build source location your kernel does not support swap limit capabilities memory limited without swap running in removing intermediate container successfully built docker go pushing image test docker node hello latest builder go error build error failed to push image received unexpected http status internal server error while the error did not fail out the build and registry errors resulted in the push failure i think this indicates that the registry may be attempting to connect to the kubernetes cluster host instead of the openshift master despite openshift master being set in the environment | 1 |
386,116 | 11,432,126,036 | IssuesEvent | 2020-02-04 13:33:04 | luna/ide | https://api.github.com/repos/luna/ide | closed | Drawing collapsed nodes. | Category: BaseGL API Change: Non-Breaking Difficulty: Core Contributor Priority: Highest Type: Enhancement | ### Summary
This task is only about the visual part of the nodes. It does not include interactions.
### Specification
- Ability to display collapsed nodes with:
- icons
- ports
- arrows (with labels and colors)
- expressions
- selection
- Theme management
- ability to change the theme and re-draw the scene.
### Acceptance Criteria & Test Cases
- Demo scene.
| 1.0 | Drawing collapsed nodes. - ### Summary
This task is only about the visual part of the nodes. It does not include interactions.
### Specification
- Ability to display collapsed nodes with:
- icons
- ports
- arrows (with labels and colors)
- expressions
- selection
- Theme management
- ability to change the theme and re-draw the scene.
### Acceptance Criteria & Test Cases
- Demo scene.
| priority | drawing collapsed nodes summary this task is only about the visual part of the nodes it does not include interactions specification ability to display collapsed nodes with icons ports arrows with labels and colors expressions selection theme management ability to change the theme and re draw the scene acceptance criteria test cases demo scene | 1 |
174,508 | 13,493,113,354 | IssuesEvent | 2020-09-11 19:07:02 | rancher/rancher | https://api.github.com/repos/rancher/rancher | closed | Rancher Dashboard doesn't work when Rancher/cluster is configured with a proxy | [zube]: To Test | <!--
Please search for existing issues first, then read https://rancher.com/docs/rancher/v2.x/en/contributing/#bugs-issues-or-questions to see what we expect in an issue
For security issues, please email security@rancher.com instead of posting a public issue in GitHub. You may (but are not required to) use the GPG key located on Keybase.
-->
**What kind of request is this (question/bug/enhancement/feature request):**
bug
**Steps to reproduce (least amount of steps as possible):**
1) Rancher run with - `docker run -d --restart=unless-stopped -p 80:80 -p 443:443 -e HTTP_PROXY="http://10.199.12.174:3128" -e HTTPS_PROXY="http://10.199.12.174:3128" -e NO_PROXY="localhost,127.0.0.1,10.199.0.0/16,<rancher hostname>" -v /var/lib/rancher:/var/lib/rancher --name rancher rancher/rancher`
1) All downstream nodes configured with HTTP_PROXY and NO_PROXY in docker environment variables through /etc/systemd/system/docker.service.d/http-proxy.conf
```
[Service]
Environment="HTTP_PROXY=http://10.199.12.174:3128" NO_PROXY="localhost,127.0.0.1,10.199.0.0/16,<rancher hostname>"
```
3) Custom cluster created, nodes joined to cluster with docker run command
4) Once nodes stabilize and become active, the "Try Dashboard" link shows up.
**Result:**
Dashboard link redirects to /dashboard/fail-whale/
Rancher server logs `http: proxy error: proxyconnect tcp: failed to find Session for client <IP of proxy>`
**Other details that may be helpful:**
**Environment information**
- Rancher version (`rancher/rancher`/`rancher/server` image tag or shown bottom left in the UI):
v2.4.2
- Installation option (single install/HA):
rancher on single-node docker
custom downstream cluster on 3 nodes with all roles.
| 1.0 | Rancher Dashboard doesn't work when Rancher/cluster is configured with a proxy - <!--
Please search for existing issues first, then read https://rancher.com/docs/rancher/v2.x/en/contributing/#bugs-issues-or-questions to see what we expect in an issue
For security issues, please email security@rancher.com instead of posting a public issue in GitHub. You may (but are not required to) use the GPG key located on Keybase.
-->
**What kind of request is this (question/bug/enhancement/feature request):**
bug
**Steps to reproduce (least amount of steps as possible):**
1) Rancher run with - `docker run -d --restart=unless-stopped -p 80:80 -p 443:443 -e HTTP_PROXY="http://10.199.12.174:3128" -e HTTPS_PROXY="http://10.199.12.174:3128" -e NO_PROXY="localhost,127.0.0.1,10.199.0.0/16,<rancher hostname>" -v /var/lib/rancher:/var/lib/rancher --name rancher rancher/rancher`
1) All downstream nodes configured with HTTP_PROXY and NO_PROXY in docker environment variables through /etc/systemd/system/docker.service.d/http-proxy.conf
```
[Service]
Environment="HTTP_PROXY=http://10.199.12.174:3128" NO_PROXY="localhost,127.0.0.1,10.199.0.0/16,<rancher hostname>"
```
3) Custom cluster created, nodes joined to cluster with docker run command
4) Once nodes stabilize and become active, the "Try Dashboard" link shows up.
**Result:**
Dashboard link redirects to /dashboard/fail-whale/
Rancher server logs `http: proxy error: proxyconnect tcp: failed to find Session for client <IP of proxy>`
**Other details that may be helpful:**
**Environment information**
- Rancher version (`rancher/rancher`/`rancher/server` image tag or shown bottom left in the UI):
v2.4.2
- Installation option (single install/HA):
rancher on single-node docker
custom downstream cluster on 3 nodes with all roles.
| non_priority | rancher dashboard doesn t work when rancher cluster is configured with a proxy please search for existing issues first then read to see what we expect in an issue for security issues please email security rancher com instead of posting a public issue in github you may but are not required to use the gpg key located on keybase what kind of request is this question bug enhancement feature request bug steps to reproduce least amount of steps as possible rancher run with docker run d restart unless stopped p p e http proxy e https proxy e no proxy localhost v var lib rancher var lib rancher name rancher rancher rancher all downstream nodes configured with http proxy and no proxy in docker environment variables through etc systemd system docker service d http proxy conf environment http proxy no proxy localhost custom cluster created nodes joined to cluster with docker run command once nodes stabilize and become active the try dashboard link shows up result dashboard link redirects to dashboard fail whale rancher server logs http proxy error proxyconnect tcp failed to find session for client other details that may be helpful environment information rancher version rancher rancher rancher server image tag or shown bottom left in the ui installation option single install ha rancher on single node docker custom downstream cluster on nodes with all roles | 0 |
360,650 | 25,301,653,845 | IssuesEvent | 2022-11-17 11:09:10 | t3-oss/create-t3-app | https://api.github.com/repos/t3-oss/create-t3-app | opened | feat: in translated docs, show if the English version is newer | 📚 documentation 🌟 enhancement | ### Is your feature request related to a problem? Please describe.
With Arabic and Russian docs translations underway, we should start thinking about how to integrate the docs well.
One common issue with docs translations is that they become out of date as the English version gets more updates. Explicitly showing that could give people a warning that there might be newer information, and maybe also motivate more people to contribute to translations.
### Describe the solution you'd like to see
A component that shows up on translated docs if the English version of this page is newer than the translated version.
Something like “This translation was last updated on <date>. Since then there have been <number> revisions to the English version, with the newest one being on <date>. Click <here> to see the English version of this page. Or click <here> to help us update the translated version” (I’m sure there is a better way to write this)
Of course this component would ideally be translated into each of the languages we support in the docs.
### Describe alternate solutions
Not doing this
### Additional information
I don’t think it’s super urgent as all the incoming translations will be new for the time being, but it would be an awesome thing to have. | 1.0 | feat: in translated docs, show if the English version is newer - ### Is your feature request related to a problem? Please describe.
With Arabic and Russian docs translations underway, we should start thinking about how to integrate the docs well.
One common issue with docs translations is that they become out of date as the English version gets more updates. Explicitly showing that could give people a warning that there might be newer information, and maybe also motivate more people to contribute to translations.
### Describe the solution you'd like to see
A component that shows up on translated docs if the English version of this page is newer than the translated version.
Something like “This translation was last updated on <date>. Since then there have been <number> revisions to the English version, with the newest one being on <date>. Click <here> to see the English version of this page. Or click <here> to help us update the translated version” (I’m sure there is a better way to write this)
Of course this component would ideally be translated into each of the languages we support in the docs.
### Describe alternate solutions
Not doing this
### Additional information
I don’t think it’s super urgent as all the incoming translations will be new for the time being, but it would be an awesome thing to have. | non_priority | feat in translated docs show if the english version is newer is your feature request related to a problem please describe with arabic and russian docs translations underway we should start thinking about how to integrate the docs well one common issue with docs translations is that they become out of date as the english version gets more updates explicitly showing that could give people a warning that there might be newer information and maybe also motivate more people to contribute to translations describe the solution you d like to see a component that shows up on translated docs if the english version of this page is newer than the translated version something like “this translation was last updated on since then there have been revisions to the english version with the newest one being on click to see the english version of this page or click to help us update the translated version” i’m sure there is a better way to write this of course this component would ideally be translated into each of the languages we support in the docs describe alternate solutions not doing this additional information i don’t think it’s super urgent as all the incoming translations will be new for the time being but it would be an awesome thing to have | 0 |
137,963 | 11,171,682,516 | IssuesEvent | 2019-12-28 21:55:48 | ariya/phantomjs | https://api.github.com/repos/ariya/phantomjs | closed | Some tests failing on Windows | Need testing stale | A couple look like they might be forward-back slash related.
Master branch, Windows 10, Visual Studio 2013
```
ff..E..F......F.......f......F..f.............s................E.......F.....E....f...
basics\require: ERROR
ERROR: Test group skipped
ERROR: Error: Cannot find module 'require/require_spec.js'
basics\timeout: FAIL
FAIL: did not exit as expected
expected exit code -15 got 1
module\fs\fileattrs: FAIL
FAIL: size/date queries on existing files
assert_greater_than_equal: expected a number greater than or equal to 1454595802000 but got 1454574202559
phantomjs://code/fileattrs.js:44:30
module\webpage\cookies: FAIL
FAIL: page.cookies provides cookies only to appropriate requests
assert_no_property: property "cookie" found on object, expected to be absent
phantomjs://code/cookies.js:109:27
module\webpage\render: ERROR
ERROR: Test group skipped
ERROR: Error: Cannot find module './renders'
module\webpage\url-encoding: FAIL
FAIL: arguments to onResourceError and onResourceTimeout
assert_equals: expected "http://localhost:51638/url-encoding?/%89i%8Bv" but got "http://localhost:51638/url-encoding?/re"
phantomjs://code/url-encoding.js:110:22
module\webserver\requests: ERROR
ERROR: setup failed
ERROR: assert_is_false: expected false got true
127.174s elapsed
185 passed
4 failed
8 failed as expected
3 had errors
4 skipped
```
| 1.0 | Some tests failing on Windows - A couple look like they might be forward-back slash related.
Master branch, Windows 10, Visual Studio 2013
```
ff..E..F......F.......f......F..f.............s................E.......F.....E....f...
basics\require: ERROR
ERROR: Test group skipped
ERROR: Error: Cannot find module 'require/require_spec.js'
basics\timeout: FAIL
FAIL: did not exit as expected
expected exit code -15 got 1
module\fs\fileattrs: FAIL
FAIL: size/date queries on existing files
assert_greater_than_equal: expected a number greater than or equal to 1454595802000 but got 1454574202559
phantomjs://code/fileattrs.js:44:30
module\webpage\cookies: FAIL
FAIL: page.cookies provides cookies only to appropriate requests
assert_no_property: property "cookie" found on object, expected to be absent
phantomjs://code/cookies.js:109:27
module\webpage\render: ERROR
ERROR: Test group skipped
ERROR: Error: Cannot find module './renders'
module\webpage\url-encoding: FAIL
FAIL: arguments to onResourceError and onResourceTimeout
assert_equals: expected "http://localhost:51638/url-encoding?/%89i%8Bv" but got "http://localhost:51638/url-encoding?/re"
phantomjs://code/url-encoding.js:110:22
module\webserver\requests: ERROR
ERROR: setup failed
ERROR: assert_is_false: expected false got true
127.174s elapsed
185 passed
4 failed
8 failed as expected
3 had errors
4 skipped
```
| non_priority | some tests failing on windows a couple look like they might be forward back slash related master branch windows visual studio ff e f f f f f s e f e f basics require error error test group skipped error error cannot find module require require spec js basics timeout fail fail did not exit as expected expected exit code got module fs fileattrs fail fail size date queries on existing files assert greater than equal expected a number greater than or equal to but got phantomjs code fileattrs js module webpage cookies fail fail page cookies provides cookies only to appropriate requests assert no property property cookie found on object expected to be absent phantomjs code cookies js module webpage render error error test group skipped error error cannot find module renders module webpage url encoding fail fail arguments to onresourceerror and onresourcetimeout assert equals expected but got phantomjs code url encoding js module webserver requests error error setup failed error assert is false expected false got true elapsed passed failed failed as expected had errors skipped | 0 |
5,000 | 2,765,678,784 | IssuesEvent | 2015-04-29 21:50:40 | aspnet/HttpAbstractions | https://api.github.com/repos/aspnet/HttpAbstractions | opened | Consider merging the FeatureModel and Features packages | needs design | The FeatureModel package is used by Http (DefaultHttpContext), Servers (IFeatureCollection), and Hosting (new DefaultHttpContext(iFeatureCollection)).
The Features package is used by Http (DefaultHttpContext), Servers, and some middleware that access features not directly available on HttpContext. | 1.0 | Consider merging the FeatureModel and Features packages - The FeatureModel package is used by Http (DefaultHttpContext), Servers (IFeatureCollection), and Hosting (new DefaultHttpContext(iFeatureCollection)).
The Features package is used by Http (DefaultHttpContext), Servers, and some middleware that access features not directly available on HttpContext. | non_priority | consider merging the featuremodel and features packages the featuremodel package is used by http defaulthttpcontext servers ifeaturecollection and hosting new defaulthttpcontext ifeaturecollection the features package is used by http defaulthttpcontext servers and some middleware that access features not directly available on httpcontext | 0 |
123,580 | 4,865,316,833 | IssuesEvent | 2016-11-14 20:25:42 | phetsims/tasks | https://api.github.com/repos/phetsims/tasks | closed | Investigate Chrome web store | priority:5-deferred tasks:developer | @arnabp can you look into the Chrome web store again? What would it take to put an HTML5 sim on the store? All of our HTML5 sims?
Also, @aadish mentioned Appcache manifest - investigate bookmarking html5 sim: http://www.html5rocks.com/en/tutorials/appcache/beginner/
| 1.0 | Investigate Chrome web store - @arnabp can you look into the Chrome web store again? What would it take to put an HTML5 sim on the store? All of our HTML5 sims?
Also, @aadish mentioned Appcache manifest - investigate bookmarking html5 sim: http://www.html5rocks.com/en/tutorials/appcache/beginner/
| priority | investigate chrome web store arnabp can you look into the chrome web store again what would it take to put an sim on the store all of our sims also aadish mentioned appcache manifest investigate bookmarking sim | 1 |
37,458 | 8,405,326,123 | IssuesEvent | 2018-10-11 15:00:08 | primefaces/primeng | https://api.github.com/repos/primefaces/primeng | closed | p-chips disabled issue | defect | ```
[ x] bug report => Search github for a similar issue or PR before submitting
```
I am having a problem with disabling the add chip functionality.
After adding this to my code:
`<p-chips name="tags" [disabled]="true" [(ngModel)]="model.tags"></p-chips>`
The old chips are disabled (I can't remove them), however the input is still present and I can add new ones. Is there a way to disable this as well? Also is there a way to disable only a subset of the chips and not all of them?
Angular version: 6.x
PrimeNG version: 6.x
Browser:
ALL
Language: [all | TypeScript X.X | ES6/7 | ES5]
| 1.0 | p-chips disabled issue - ```
[ x] bug report => Search github for a similar issue or PR before submitting
```
I am having a problem with disabling the add chip functionality.
After adding this to my code:
`<p-chips name="tags" [disabled]="true" [(ngModel)]="model.tags"></p-chips>`
The old chips are disabled (I can't remove them), however the input is still present and I can add new ones. Is there a way to disable this as well? Also is there a way to disable only a subset of the chips and not all of them?
Angular version: 6.x
PrimeNG version: 6.x
Browser:
ALL
Language: [all | TypeScript X.X | ES6/7 | ES5]
| non_priority | p chips disabled issue bug report search github for a similar issue or pr before submitting i am having a problem with disabling the add chip functionality after adding this to my code the old chips are disabled i can t remove them however the input is still present and i can add new ones is there a way to disable this as well also is there a way to disable only a subset of the chips and not all of them angular version x primeng version x browser all language | 0 |
14,422 | 2,811,804,975 | IssuesEvent | 2015-05-18 01:36:13 | RenatoUtsch/nulldc | https://api.github.com/repos/RenatoUtsch/nulldc | closed | Enter one-line summary | auto-migrated Priority-Medium Restrict-AddIssueComment-Commit Type-Defect | ```
Please i need lst files for hokuto no ken converted game for nulldc naomi i
dont know how to create lst file please help
```
Original issue reported on code.google.com by `markocur...@gmail.com` on 25 Dec 2014 at 8:42 | 1.0 | Enter one-line summary - ```
Please i need lst files for hokuto no ken converted game for nulldc naomi i
dont know how to create lst file please help
```
Original issue reported on code.google.com by `markocur...@gmail.com` on 25 Dec 2014 at 8:42 | non_priority | enter one line summary please i need lst files for hokuto no ken converted game for nulldc naomi i dont know how to create lst file please help original issue reported on code google com by markocur gmail com on dec at | 0 |
233,470 | 7,698,064,911 | IssuesEvent | 2018-05-18 21:15:22 | HabitRPG/habitica | https://api.github.com/repos/HabitRPG/habitica | opened | Some promo images and text become unreadably jumbled on screen resize | help wanted priority: minor section: Login/Statics | [//]: # (Before logging this issue, please post to the Report a Bug guild from the Habitica website's Help menu. Most bugs can be handled quickly there. If a GitHub issue is needed, you will be advised of that by a moderator or staff member -- a player with a dark blue or purple name. It is recommended that you don't create a new issue unless advised to.)
[//]: # (Bugs in the mobile apps can also be reported there.)
[//]: # (If you have a feature request, use "Help > Request a Feature", not GitHub or the Report a Bug guild.)
[//]: # (For more guidelines see https://github.com/HabitRPG/habitica/issues/2760)
[//]: # (Fill out relevant information - UUID is found from the Habitia website at User Icon > Settings > API)
### General Info
* UUID: 06cda236-af88-4c53-8e0d-bc52d3762869
* Browser: Chrome
* OS: ??
### Description
[//]: # (Describe bug in detail here. Include screenshots if helpful.)

#### Console Errors
[//]: # (Include any JavaScript console errors here.)
| 1.0 | Some promo images and text become unreadably jumbled on screen resize - [//]: # (Before logging this issue, please post to the Report a Bug guild from the Habitica website's Help menu. Most bugs can be handled quickly there. If a GitHub issue is needed, you will be advised of that by a moderator or staff member -- a player with a dark blue or purple name. It is recommended that you don't create a new issue unless advised to.)
[//]: # (Bugs in the mobile apps can also be reported there.)
[//]: # (If you have a feature request, use "Help > Request a Feature", not GitHub or the Report a Bug guild.)
[//]: # (For more guidelines see https://github.com/HabitRPG/habitica/issues/2760)
[//]: # (Fill out relevant information - UUID is found from the Habitia website at User Icon > Settings > API)
### General Info
* UUID: 06cda236-af88-4c53-8e0d-bc52d3762869
* Browser: Chrome
* OS: ??
### Description
[//]: # (Describe bug in detail here. Include screenshots if helpful.)

#### Console Errors
[//]: # (Include any JavaScript console errors here.)
| priority | some promo images and text become unreadably jumbled on screen resize before logging this issue please post to the report a bug guild from the habitica website s help menu most bugs can be handled quickly there if a github issue is needed you will be advised of that by a moderator or staff member a player with a dark blue or purple name it is recommended that you don t create a new issue unless advised to bugs in the mobile apps can also be reported there if you have a feature request use help request a feature not github or the report a bug guild for more guidelines see fill out relevant information uuid is found from the habitia website at user icon settings api general info uuid browser chrome os description describe bug in detail here include screenshots if helpful console errors include any javascript console errors here | 1 |
72,859 | 3,392,131,487 | IssuesEvent | 2015-11-30 18:16:15 | thesgc/chembiohub_helpdesk | https://api.github.com/repos/thesgc/chembiohub_helpdesk | closed | Search page rework.
Search page needs some redevelopment:
add ability to select multiple projects
| app: ChemReg enhancement name: Karen priority: Medium status: New | Search page rework.
Search page needs some redevelopment:
add ability to select multiple projects
move structural search type to underneath chemdoodle widget (minimally)
OR hide ChemDoodle (or separate tab?) unless user selects a new 'Search by substructure' option which makes draw window visible (in pop-up?)
add 'added by' search option
add 'project type' search option
add 'download template' option (documented in another issue)
add grey banner (above results table) cf project list page - can be used to list projects included in search
add 'save search' option
| 1.0 | Search page rework.
Search page needs some redevelopment:
add ability to select multiple projects
- Search page rework.
Search page needs some redevelopment:
add ability to select multiple projects
move structural search type to underneath chemdoodle widget (minimally)
OR hide ChemDoodle (or separate tab?) unless user selects a new 'Search by substructure' option which makes draw window visible (in pop-up?)
add 'added by' search option
add 'project type' search option
add 'download template' option (documented in another issue)
add grey banner (above results table) cf project list page - can be used to list projects included in search
add 'save search' option
| priority | search page rework search page needs some redevelopment add ability to select multiple projects search page rework search page needs some redevelopment add ability to select multiple projects move structural search type to underneath chemdoodle widget minimally or hide chemdoodle or separate tab unless user selects a new search by substructure option which makes draw window visible in pop up add added by search option add project type search option add download template option documented in another issue add grey banner above results table cf project list page can be used to list projects included in search add save search option | 1 |
54,098 | 13,251,357,508 | IssuesEvent | 2020-08-20 01:55:14 | beaverbuilder/feature-requests | https://api.github.com/repos/beaverbuilder/feature-requests | opened | Add Publish button beside Done button | Beaver Builder | Add a Publish button beside the Done button so that people don't have to click twice to publish a page.
I could type control-P if my hand was on the keyboard, but my hand is always on the mouse when building pages and setting style options in modules.
It's an easy technical fix, has low risk of failure, doesn't require extensive testing, and would save customers millions of clicks. | 1.0 | Add Publish button beside Done button - Add a Publish button beside the Done button so that people don't have to click twice to publish a page.
I could type control-P if my hand was on the keyboard, but my hand is always on the mouse when building pages and setting style options in modules.
It's an easy technical fix, has low risk of failure, doesn't require extensive testing, and would save customers millions of clicks. | non_priority | add publish button beside done button add a publish button beside the done button so that people don t have to click twice to publish a page i could type control p if my hand was on the keyboard but my hand is always on the mouse when building pages and setting style options in modules it s an easy technical fix has low risk of failure doesn t require extensive testing and would save customers millions of clicks | 0 |
414,100 | 12,099,013,800 | IssuesEvent | 2020-04-20 11:22:39 | ppy/osu | https://api.github.com/repos/ppy/osu | closed | "Show Online" button in social browser | area:overlay-social low-priority type:UI type:UX | **Describe the new feature:**
In the Social Browser in osu!stable, it only shows online players.
I feel like this needs to be an option, if not planned already in osu!lazer's social browser, as i always message friends and its slightly annoying to have to search for that "online" tag.

Just a checkbox around here, i guess | 1.0 | "Show Online" button in social browser - **Describe the new feature:**
In the Social Browser in osu!stable, it only shows online players.
I feel like this needs to be an option, if not planned already in osu!lazer's social browser, as i always message friends and its slightly annoying to have to search for that "online" tag.

Just a checkbox around here, i guess | priority | show online button in social browser describe the new feature in the social browser in osu stable it only shows online players i feel like this needs to be an option if not planned already in osu lazer s social browser as i always message friends and its slightly annoying to have to search for that online tag just a checkbox around here i guess | 1 |
549,309 | 16,090,308,789 | IssuesEvent | 2021-04-26 15:54:25 | grafana/grafana | https://api.github.com/repos/grafana/grafana | closed | ElasticSearch: Permit "field" to be parsed to a variable | area/dashboard/templating area/datasource datasource/Elasticsearch effort/small help wanted onboarding priority/important-longterm type/feature-request | Grafana version: 4.3.0-pre1
Datasource: Elasticsearch
OS: CentOS 7
If is tried to use "field": "$variable" in query of Variables, the variable is not resolved and the query for the elasticsearch.
Tryied:
Create a variable named $test, type Custom with value "AAA,BBB"
Select $test ='AAA' in a dashboard.
Set another variable named named $test1 with query = {"find": "terms", "field": "$test"}
Result:
{"find": "terms", "field": "$test"}
Espected result:
{"find": "terms", "field": "AAA"} | 1.0 | ElasticSearch: Permit "field" to be parsed to a variable - Grafana version: 4.3.0-pre1
Datasource: Elasticsearch
OS: CentOS 7
If is tried to use "field": "$variable" in query of Variables, the variable is not resolved and the query for the elasticsearch.
Tryied:
Create a variable named $test, type Custom with value "AAA,BBB"
Select $test ='AAA' in a dashboard.
Set another variable named named $test1 with query = {"find": "terms", "field": "$test"}
Result:
{"find": "terms", "field": "$test"}
Espected result:
{"find": "terms", "field": "AAA"} | priority | elasticsearch permit field to be parsed to a variable grafana version datasource elasticsearch os centos if is tried to use field variable in query of variables the variable is not resolved and the query for the elasticsearch tryied create a variable named test type custom with value aaa bbb select test aaa in a dashboard set another variable named named with query find terms field test result find terms field test espected result find terms field aaa | 1 |
49,566 | 26,211,765,316 | IssuesEvent | 2023-01-04 07:19:44 | tensorflow/tensorflow | https://api.github.com/repos/tensorflow/tensorflow | closed | TF2.9 perf is slower | stat:awaiting response stalled comp:model type:performance TF 2.9 | <details><summary>Click to expand!</summary>
### Issue Type
Performance
### Source
source
### Tensorflow Version
tf2.9
### Custom Code
No
### OS Platform and Distribution
_No response_
### Mobile device
_No response_
### Python version
_No response_
### Bazel version
_No response_
### GCC/Compiler version
_No response_
### CUDA/cuDNN version
_No response_
### GPU model and memory
A100
### Current Behaviour?
```shell
We migrate from tf2.4 to tf2.9, and observed that the training speed of some models has ~20% decrease.
On tf2.4, it takes ~30mins after starting the job, before processing the 1st batch. Training speed increase and then become stable.
On tf2.9, it takes ~5mins after starting the job, before processing the 1st batch. Training speed does not increase.
Q1: Can we use tf2.4 to train the model and use tf2.9 for inferencing? Any potential issues?
Q2: How can we find the root cause of tf2.9 training slowness?
```
### Standalone code to reproduce the issue
```shell
Can't share the source code.
```
### Relevant log output
_No response_</details> | True | TF2.9 perf is slower - <details><summary>Click to expand!</summary>
### Issue Type
Performance
### Source
source
### Tensorflow Version
tf2.9
### Custom Code
No
### OS Platform and Distribution
_No response_
### Mobile device
_No response_
### Python version
_No response_
### Bazel version
_No response_
### GCC/Compiler version
_No response_
### CUDA/cuDNN version
_No response_
### GPU model and memory
A100
### Current Behaviour?
```shell
We migrate from tf2.4 to tf2.9, and observed that the training speed of some models has ~20% decrease.
On tf2.4, it takes ~30mins after starting the job, before processing the 1st batch. Training speed increase and then become stable.
On tf2.9, it takes ~5mins after starting the job, before processing the 1st batch. Training speed does not increase.
Q1: Can we use tf2.4 to train the model and use tf2.9 for inferencing? Any potential issues?
Q2: How can we find the root cause of tf2.9 training slowness?
```
### Standalone code to reproduce the issue
```shell
Can't share the source code.
```
### Relevant log output
_No response_</details> | non_priority | perf is slower click to expand issue type performance source source tensorflow version custom code no os platform and distribution no response mobile device no response python version no response bazel version no response gcc compiler version no response cuda cudnn version no response gpu model and memory current behaviour shell we migrate from to and observed that the training speed of some models has decrease on it takes after starting the job before processing the batch training speed increase and then become stable on it takes after starting the job before processing the batch training speed does not increase can we use to train the model and use for inferencing any potential issues how can we find the root cause of training slowness standalone code to reproduce the issue shell can t share the source code relevant log output no response | 0 |
476,573 | 13,746,960,300 | IssuesEvent | 2020-10-06 06:47:56 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | medium.com - site is not usable | browser-fenix engine-gecko priority-critical type-webrender-enabled | <!-- @browser: Firefox Mobile 83.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 8.1.0; Mobile; rv:83.0) Gecko/83.0 Firefox/83.0 -->
<!-- @reported_with: android-components-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/59378 -->
<!-- @extra_labels: browser-fenix, type-webrender-enabled -->
**URL**: https://medium.com/
**Browser / Version**: Firefox Mobile 83.0
**Operating System**: Android 8.1.0
**Tested Another Browser**: No
**Problem type**: Site is not usable
**Description**: Page not loading correctly
**Steps to Reproduce**:
Medium takes minutes to load.
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2020/10/02ca9ab7-4995-4bd6-b249-6ee98d0dab2e.jpeg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: true</li><li>gfx.webrender.blob-images: false</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20201001094020</li><li>channel: nightly</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2020/10/9247af23-0eec-4d60-a79d-b09612e385fd)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | medium.com - site is not usable - <!-- @browser: Firefox Mobile 83.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 8.1.0; Mobile; rv:83.0) Gecko/83.0 Firefox/83.0 -->
<!-- @reported_with: android-components-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/59378 -->
<!-- @extra_labels: browser-fenix, type-webrender-enabled -->
**URL**: https://medium.com/
**Browser / Version**: Firefox Mobile 83.0
**Operating System**: Android 8.1.0
**Tested Another Browser**: No
**Problem type**: Site is not usable
**Description**: Page not loading correctly
**Steps to Reproduce**:
Medium takes minutes to load.
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2020/10/02ca9ab7-4995-4bd6-b249-6ee98d0dab2e.jpeg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: true</li><li>gfx.webrender.blob-images: false</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20201001094020</li><li>channel: nightly</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2020/10/9247af23-0eec-4d60-a79d-b09612e385fd)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | priority | medium com site is not usable url browser version firefox mobile operating system android tested another browser no problem type site is not usable description page not loading correctly steps to reproduce medium takes minutes to load view the screenshot img alt screenshot src browser configuration gfx webrender all true gfx webrender blob images false gfx webrender enabled false image mem shared true buildid channel nightly hastouchscreen true mixed active content blocked false mixed passive content blocked false tracking content blocked false from with ❤️ | 1 |
412,651 | 12,053,899,866 | IssuesEvent | 2020-04-15 10:09:37 | oppia/oppia-android | https://api.github.com/repos/oppia/oppia-android | closed | Crash on ProfileRestPinActivity. | Priority: Important | There is crash on pressing back navigation button on ProfileResetPinActivity. | 1.0 | Crash on ProfileRestPinActivity. - There is crash on pressing back navigation button on ProfileResetPinActivity. | priority | crash on profilerestpinactivity there is crash on pressing back navigation button on profileresetpinactivity | 1 |
464,182 | 13,307,547,228 | IssuesEvent | 2020-08-25 22:26:19 | WebTools-NG/WebTools-NG | https://api.github.com/repos/WebTools-NG/WebTools-NG | closed | Need to make this git public | Priority-Critical Translations | If not public, we can't get an Open-Source access to translation site, so we need to open up soon | 1.0 | Need to make this git public - If not public, we can't get an Open-Source access to translation site, so we need to open up soon | priority | need to make this git public if not public we can t get an open source access to translation site so we need to open up soon | 1 |
237,869 | 19,681,425,007 | IssuesEvent | 2022-01-11 17:07:41 | lizrad/recursio | https://api.github.com/repos/lizrad/recursio | closed | Ghosts that shouldn't yet exist show up on the client sometimes | bug needs testing | Example scenario where it occured (but can't reliably be reproduced): Start with a hitscan timeline, then play the other hitscan timeline -> the wall ghost shows up
It seems like that ghost might have some movement data, at least we saw it do something, it didn't just stand there.
Because it's possible to glitch through it, we assume that it only exists on the client. | 1.0 | Ghosts that shouldn't yet exist show up on the client sometimes - Example scenario where it occured (but can't reliably be reproduced): Start with a hitscan timeline, then play the other hitscan timeline -> the wall ghost shows up
It seems like that ghost might have some movement data, at least we saw it do something, it didn't just stand there.
Because it's possible to glitch through it, we assume that it only exists on the client. | non_priority | ghosts that shouldn t yet exist show up on the client sometimes example scenario where it occured but can t reliably be reproduced start with a hitscan timeline then play the other hitscan timeline the wall ghost shows up it seems like that ghost might have some movement data at least we saw it do something it didn t just stand there because it s possible to glitch through it we assume that it only exists on the client | 0 |
450,340 | 13,002,569,798 | IssuesEvent | 2020-07-24 03:48:17 | googlefonts/noto-fonts | https://api.github.com/repos/googlefonts/noto-fonts | closed | add Ruble sign (U+20BD) to Phase II Noto Sans and Serif (LGC) | Android FoundIn-1.x GNTF Priority-High Script-LatinGreekCyrillic | ```
Please add the symbol of the ruble.
```
Original issue reported on code.google.com by `livan97....@gmail.com` on 4 Dec 2014 at 5:10
| 1.0 | add Ruble sign (U+20BD) to Phase II Noto Sans and Serif (LGC) - ```
Please add the symbol of the ruble.
```
Original issue reported on code.google.com by `livan97....@gmail.com` on 4 Dec 2014 at 5:10
| priority | add ruble sign u to phase ii noto sans and serif lgc please add the symbol of the ruble original issue reported on code google com by gmail com on dec at | 1 |
566,894 | 16,833,353,995 | IssuesEvent | 2021-06-18 08:41:29 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | beinternetawesome.withgoogle.com - desktop site instead of mobile site | browser-firefox-ios os-ios priority-normal | <!-- @browser: Firefox iOS 34.0 -->
<!-- @ua_header: Mozilla/5.0 (iPhone; CPU iPhone OS 14_4_2 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) FxiOS/34.0 Mobile/15E148 Safari/605.1.15 -->
<!-- @reported_with: mobile-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/77490 -->
<!-- @extra_labels: browser-firefox-ios -->
**URL**: https://beinternetawesome.withgoogle.com/en_us/interland
**Browser / Version**: Firefox iOS 34.0
**Operating System**: iOS 14.4.2
**Tested Another Browser**: Yes Safari
**Problem type**: Desktop site instead of mobile site
**Description**: Desktop site instead of mobile site
**Steps to Reproduce**:
Nothing happened at All so I’m upset blaming you and I’m not gonna be able to get you back again and you have useh
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | beinternetawesome.withgoogle.com - desktop site instead of mobile site - <!-- @browser: Firefox iOS 34.0 -->
<!-- @ua_header: Mozilla/5.0 (iPhone; CPU iPhone OS 14_4_2 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) FxiOS/34.0 Mobile/15E148 Safari/605.1.15 -->
<!-- @reported_with: mobile-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/77490 -->
<!-- @extra_labels: browser-firefox-ios -->
**URL**: https://beinternetawesome.withgoogle.com/en_us/interland
**Browser / Version**: Firefox iOS 34.0
**Operating System**: iOS 14.4.2
**Tested Another Browser**: Yes Safari
**Problem type**: Desktop site instead of mobile site
**Description**: Desktop site instead of mobile site
**Steps to Reproduce**:
Nothing happened at All so I’m upset blaming you and I’m not gonna be able to get you back again and you have useh
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | priority | beinternetawesome withgoogle com desktop site instead of mobile site url browser version firefox ios operating system ios tested another browser yes safari problem type desktop site instead of mobile site description desktop site instead of mobile site steps to reproduce nothing happened at all so i’m upset blaming you and i’m not gonna be able to get you back again and you have useh browser configuration none from with ❤️ | 1 |
2,073 | 4,789,435,499 | IssuesEvent | 2016-10-31 01:25:17 | zeit/hyper | https://api.github.com/repos/zeit/hyper | closed | CTRL-T not working | Platform: Linux Status: Help Wanted Type: Compatibility | ## On my Gnome 3 Ubuntu 16.10 install, CTRL-C is the only command that sort of works and it just makes a new prompt. CTRL-T doesn't make a new tab and none of the other key combos seem to work.
Hyper 0.8.2
Electron 1.4.0
linux x64 4.8.0-11-generic
| True | CTRL-T not working - ## On my Gnome 3 Ubuntu 16.10 install, CTRL-C is the only command that sort of works and it just makes a new prompt. CTRL-T doesn't make a new tab and none of the other key combos seem to work.
Hyper 0.8.2
Electron 1.4.0
linux x64 4.8.0-11-generic
| non_priority | ctrl t not working on my gnome ubuntu install ctrl c is the only command that sort of works and it just makes a new prompt ctrl t doesn t make a new tab and none of the other key combos seem to work hyper electron linux generic | 0 |
664,733 | 22,286,606,371 | IssuesEvent | 2022-06-11 18:30:28 | mabel-dev/opteryx | https://api.github.com/repos/mabel-dev/opteryx | closed | [BUG] Unable to evaluate valid filters | bug awaiting closure priority | replicated with sample data:
~~~sql
SELECT * FROM $astronauts
WHERE name LIKE '%o%'
AND `year` > 1900
AND (gender ILIKE '%ale%' AND group IN (1,2,3,4,5,6))
~~~ | 1.0 | [BUG] Unable to evaluate valid filters - replicated with sample data:
~~~sql
SELECT * FROM $astronauts
WHERE name LIKE '%o%'
AND `year` > 1900
AND (gender ILIKE '%ale%' AND group IN (1,2,3,4,5,6))
~~~ | priority | unable to evaluate valid filters replicated with sample data sql select from astronauts where name like o and year and gender ilike ale and group in | 1 |
223,015 | 17,533,567,190 | IssuesEvent | 2021-08-12 02:21:51 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | opened | roachtest: jepsen/monotonic/split failed | C-test-failure O-robot O-roachtest branch-master release-blocker | roachtest.jepsen/monotonic/split [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3286047&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3286047&tab=artifacts#/jepsen/monotonic/split) on master @ [90809c048d05f923a67ce9b89597b2779fc73e32](https://github.com/cockroachdb/cockroach/commits/90809c048d05f923a67ce9b89597b2779fc73e32):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/jepsen/monotonic/split/run_1
cluster.go:1887,jepsen.go:93,jepsen.go:147,jepsen.go:360,test_runner.go:777: output in run_212125.805272127_n1-6_sh: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-3286047-1628662905-71-n6cpu4:1-6 -- sh -c "sudo apt-get -y upgrade -o Dpkg::Options::='--force-confold' > logs/apt-upgrade.log 2>&1" returned: context canceled
(1) attached stack trace
-- stack trace:
| main.(*clusterImpl).RunE
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:1977
| main.(*clusterImpl).Run
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:1885
| github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests.initJepsen
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests/jepsen.go:93
| github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests.runJepsen
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests/jepsen.go:147
| github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests.RegisterJepsen.func1
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests/jepsen.go:360
| main.(*testRunner).runTest.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/test_runner.go:777
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1371
Wraps: (2) output in run_212125.805272127_n1-6_sh
Wraps: (3) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-3286047-1628662905-71-n6cpu4:1-6 -- sh -c "sudo apt-get -y upgrade -o Dpkg::Options::='--force-confold' > logs/apt-upgrade.log 2>&1" returned
| stderr:
|
| stdout:
| <... some data truncated by circular buffer; go to artifacts for details ...>
| ..............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
<... thousands of apt-get progress dots elided ...>
Wraps: (4) secondary error attachment
| signal: interrupt
| (1) signal: interrupt
| Error types: (1) *exec.ExitError
Wraps: (5) context canceled
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *cluster.WithCommandDetails (4) *secondary.withSecondaryError (5) *errors.errorString
```
<details><summary>Reproduce</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/roachtest)
See: [CI job to stress roachtests](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestStress)
<p>For the CI stress job, click the ellipsis (...) next to the Run button and fill in:
* Changes / Build branch: master
* Parameters / `env.TESTS`: `^jepsen/monotonic/split$`
* Parameters / `env.COUNT`: <number of runs>
</p>
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #68764 roachtest: jepsen/monotonic/split failed [C-test-failure O-roachtest O-robot branch-release-21.1 release-blocker]
</p>
</details>
/cc @cockroachdb/kv-triage
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*jepsen/monotonic/split.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| 2.0 | roachtest: jepsen/monotonic/split failed - roachtest.jepsen/monotonic/split [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3286047&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3286047&tab=artifacts#/jepsen/monotonic/split) on master @ [90809c048d05f923a67ce9b89597b2779fc73e32](https://github.com/cockroachdb/cockroach/commits/90809c048d05f923a67ce9b89597b2779fc73e32):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/jepsen/monotonic/split/run_1
cluster.go:1887,jepsen.go:93,jepsen.go:147,jepsen.go:360,test_runner.go:777: output in run_212125.805272127_n1-6_sh: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-3286047-1628662905-71-n6cpu4:1-6 -- sh -c "sudo apt-get -y upgrade -o Dpkg::Options::='--force-confold' > logs/apt-upgrade.log 2>&1" returned: context canceled
(1) attached stack trace
-- stack trace:
| main.(*clusterImpl).RunE
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:1977
| main.(*clusterImpl).Run
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:1885
| github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests.initJepsen
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests/jepsen.go:93
| github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests.runJepsen
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests/jepsen.go:147
| github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests.RegisterJepsen.func1
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests/jepsen.go:360
| main.(*testRunner).runTest.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/test_runner.go:777
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1371
Wraps: (2) output in run_212125.805272127_n1-6_sh
Wraps: (3) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-3286047-1628662905-71-n6cpu4:1-6 -- sh -c "sudo apt-get -y upgrade -o Dpkg::Options::='--force-confold' > logs/apt-upgrade.log 2>&1" returned
| stderr:
|
| stdout:
| <... some data truncated by circular buffer; go to artifacts for details ...>
| <... thousands of apt-get progress dots elided ...>
Wraps: (4) secondary error attachment
| signal: interrupt
| (1) signal: interrupt
| Error types: (1) *exec.ExitError
Wraps: (5) context canceled
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *cluster.WithCommandDetails (4) *secondary.withSecondaryError (5) *errors.errorString
```
<details><summary>Reproduce</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/roachtest)
See: [CI job to stress roachtests](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestStress)
<p>For the CI stress job, click the ellipsis (...) next to the Run button and fill in:
* Changes / Build branch: master
* Parameters / `env.TESTS`: `^jepsen/monotonic/split$`
* Parameters / `env.COUNT`: <number of runs>
</p>
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #68764 roachtest: jepsen/monotonic/split failed [C-test-failure O-roachtest O-robot branch-release-21.1 release-blocker]
</p>
</details>
/cc @cockroachdb/kv-triage
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*jepsen/monotonic/split.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| non_priority | roachtest jepsen monotonic split failed roachtest jepsen monotonic split with on master the test failed on branch master cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts jepsen monotonic split run cluster go jepsen go jepsen go jepsen go test runner go output in run sh home agent work go src github com cockroachdb cockroach bin roachprod run teamcity sh c sudo apt get y upgrade o dpkg options force confold logs apt upgrade log returned context canceled attached stack trace stack trace main clusterimpl rune home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go main clusterimpl run home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go github com cockroachdb cockroach pkg cmd roachtest tests initjepsen home agent work go src github com cockroachdb cockroach pkg cmd roachtest tests jepsen go github com cockroachdb cockroach pkg cmd roachtest tests runjepsen home agent work go src github com cockroachdb cockroach pkg cmd roachtest tests jepsen go github com cockroachdb cockroach pkg cmd roachtest tests registerjepsen home agent work go src github com cockroachdb cockroach pkg cmd roachtest tests jepsen go main testrunner runtest home agent work go src github com cockroachdb cockroach pkg cmd roachtest test runner go runtime goexit usr local go src runtime asm s wraps output in run sh wraps home agent work go src github com cockroachdb cockroach bin roachprod run teamcity sh c sudo apt get y upgrade o dpkg options force confold logs apt upgrade log returned stderr stdout wraps secondary error attachment signal interrupt signal interrupt error types exec exiterror wraps context canceled error types withstack withstack errutil withprefix cluster withcommanddetails secondary withsecondaryerror errors errorstring reproduce see see for the ci stress job click the ellipsis next to the run button and fill in changes build branch master parameters 
env tests jepsen monotonic split parameters env count lt number of runs gt same failure on other branches roachtest jepsen monotonic split failed cc cockroachdb kv triage | 0 |
264,519 | 8,316,072,690 | IssuesEvent | 2018-09-25 07:57:18 | metasfresh/metasfresh | https://api.github.com/repos/metasfresh/metasfresh | closed | Open Items List Reference Date wrong parm | branch:master priority:high type:bug | ### Is this a bug or feature request?
Bug
### What is the current behavior?
#### Which are the steps to reproduce?
### What is the expected or desired behavior?
Report should work and take the given date | 1.0 | Open Items List Reference Date wrong parm - ### Is this a bug or feature request?
Bug
### What is the current behavior?
#### Which are the steps to reproduce?
### What is the expected or desired behavior?
Report should work and take the given date | priority | open items list reference date wrong parm is this a bug or feature request bug what is the current behavior which are the steps to reproduce what is the expected or desired behavior report should work and take the given date | 1 |
180,225 | 21,625,603,631 | IssuesEvent | 2022-05-05 01:24:28 | JMD60260/fetchmeaband | https://api.github.com/repos/JMD60260/fetchmeaband | closed | WS-2015-0018 (Medium) detected in semver-1.0.14.tgz - autoclosed | security vulnerability | ## WS-2015-0018 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>semver-1.0.14.tgz</b></p></summary>
<p>The semantic version parser used by npm.</p>
<p>Library home page: <a href="https://registry.npmjs.org/semver/-/semver-1.0.14.tgz">https://registry.npmjs.org/semver/-/semver-1.0.14.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/fetchmeaband/public/vendor/owl.carousel/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/fetchmeaband/public/vendor/owl.carousel/node_modules/semver/package.json</p>
<p>
Dependency Hierarchy:
- grunt-blanket-qunit-0.2.0.tgz (Root Library)
- grunt-contrib-qunit-0.2.2.tgz
- grunt-lib-phantomjs-0.3.1.tgz
- :x: **semver-1.0.14.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/JMD60260/fetchmeaband/commit/430b5f2947d45ada69dc047ea870d3c988006344">430b5f2947d45ada69dc047ea870d3c988006344</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Semver is vulnerable to regular expression denial of service (ReDoS) when extremely long version strings are parsed.
<p>Publish Date: 2015-04-04
<p>URL: <a href=https://github.com/npm/node-semver/commit/c80180d8341a8ada0236815c29a2be59864afd70>WS-2015-0018</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nodesecurity.io/advisories/31">https://nodesecurity.io/advisories/31</a></p>
<p>Release Date: 2015-04-04</p>
<p>Fix Resolution: Update to a version 4.3.2 or greater</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | WS-2015-0018 (Medium) detected in semver-1.0.14.tgz - autoclosed - ## WS-2015-0018 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>semver-1.0.14.tgz</b></p></summary>
<p>The semantic version parser used by npm.</p>
<p>Library home page: <a href="https://registry.npmjs.org/semver/-/semver-1.0.14.tgz">https://registry.npmjs.org/semver/-/semver-1.0.14.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/fetchmeaband/public/vendor/owl.carousel/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/fetchmeaband/public/vendor/owl.carousel/node_modules/semver/package.json</p>
<p>
Dependency Hierarchy:
- grunt-blanket-qunit-0.2.0.tgz (Root Library)
- grunt-contrib-qunit-0.2.2.tgz
- grunt-lib-phantomjs-0.3.1.tgz
- :x: **semver-1.0.14.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/JMD60260/fetchmeaband/commit/430b5f2947d45ada69dc047ea870d3c988006344">430b5f2947d45ada69dc047ea870d3c988006344</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Semver is vulnerable to regular expression denial of service (ReDoS) when extremely long version strings are parsed.
<p>Publish Date: 2015-04-04
<p>URL: <a href=https://github.com/npm/node-semver/commit/c80180d8341a8ada0236815c29a2be59864afd70>WS-2015-0018</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nodesecurity.io/advisories/31">https://nodesecurity.io/advisories/31</a></p>
<p>Release Date: 2015-04-04</p>
<p>Fix Resolution: Update to a version 4.3.2 or greater</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | ws medium detected in semver tgz autoclosed ws medium severity vulnerability vulnerable library semver tgz the semantic version parser used by npm library home page a href path to dependency file tmp ws scm fetchmeaband public vendor owl carousel package json path to vulnerable library tmp ws scm fetchmeaband public vendor owl carousel node modules semver package json dependency hierarchy grunt blanket qunit tgz root library grunt contrib qunit tgz grunt lib phantomjs tgz x semver tgz vulnerable library found in head commit a href vulnerability details semver is vulnerable to regular expression denial of service redos when extremely long version strings are parsed publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution update to a version or greater step up your open source security game with whitesource | 0 |
420,241 | 12,235,158,167 | IssuesEvent | 2020-05-04 14:29:38 | canonical-web-and-design/snapcraft.io | https://api.github.com/repos/canonical-web-and-design/snapcraft.io | closed | Clicking on screenshot should open a lightbox image viewer. | Priority: Low | ### Expected behaviour
When users click on a screenshot on the snap preview page, they expect the image to open in a lightbox image viewer.
### Steps to reproduce the problem
Visit https://snapcraft.io/wonderwall and click on any screenshot; it makes the user jump away from the website, which is not a good practice (from an SEO perspective too)
### Specs
- _URL:_ https://snapcraft.io/wonderwall
- _Operating system:_ Any
- _Browser:_ Any
### Solution
Implement any lightweight lightbox plugin to open screenshots on the same page with image navigation and a close feature.
| 1.0 | Clicking on screenshot should open a lightbox image viewer. - ### Expected behaviour
When users click on a screenshot on the snap preview page, they expect the image to open in a lightbox image viewer.
### Steps to reproduce the problem
Visit https://snapcraft.io/wonderwall and click on any screenshot; it makes the user jump away from the website, which is not a good practice (from an SEO perspective too)
### Specs
- _URL:_ https://snapcraft.io/wonderwall
- _Operating system:_ Any
- _Browser:_ Any
### Solution
Implement any lightweight lightbox plugin to open screenshots on the same page with image navigation and a close feature.
| priority | clicking on screenshot should open a lightbox image viewer expected behaviour when users clicks on an screenshot on the snap preview page they are expecting to get the image open in a lightbox image viewer steps to reproduce the problem visit click on any screenshot page will make user jump away from the website which is not a good practice from seo perspective too specs url operating system any browser any solution implement any lightweight lightbox plugin to open screenshots on the same page with image navigation and close feature | 1 |
236,830 | 18,110,423,290 | IssuesEvent | 2021-09-23 02:36:15 | Taro-IT/docs | https://api.github.com/repos/Taro-IT/docs | opened | PL04-Plantilla de trazabilidad de requisito | documentation | ### Description
Define a template to view the status of all the requirements, as well as the resources for their implementation and the components generated.
### Type
_No response_ | 1.0 | PL04-Plantilla de trazabilidad de requisito - ### Description
Define a template to view the status of all the requirements, as well as the resources for their implementation and the components generated.
### Type
_No response_ | non_priority | plantilla de trazabilidad de requisito descripción definir una plantilla para poder ver el estatus de todos los requisitos así como los recursos para su implementación y los componentes generados tipo no response | 0 |
805,661 | 29,579,928,245 | IssuesEvent | 2023-06-07 04:26:49 | momentum-mod/website | https://api.github.com/repos/momentum-mod/website | closed | Mock/Presetup backend for easy frontend work | Priority: Low Size: Large Type: Dev/Internal For: Backend | To be able to run a docker container that has a "mock" backend that is filled with the mock data from the [faker script](https://github.com/momentum-mod/website/blob/staging/scripts/populate_db_with_fake_data.js) would be lovely for those looking to do only frontend development, but do not want to go through the entire setup process of the backend (Database, logging in, steamworks API, etc). | 1.0 | Mock/Presetup backend for easy frontend work - To be able to run a docker container that has a "mock" backend that is filled with the mock data from the [faker script](https://github.com/momentum-mod/website/blob/staging/scripts/populate_db_with_fake_data.js) would be lovely for those looking to do only frontend development, but do not want to go through the entire setup process of the backend (Database, logging in, steamworks API, etc). | priority | mock presetup backend for easy frontend work to be able to run a docker container that has a mock backend that is filled with the mock data from the would be lovely for those looking to do only frontend development but do not want to go through the entire setup process of the backend database logging in steamworks api etc | 1 |
39,961 | 2,861,997,285 | IssuesEvent | 2015-06-04 00:08:32 | dart-lang/sdk | https://api.github.com/repos/dart-lang/sdk | closed | dartanalyzer forces things like indents and spaces when reformatting a file. Should it use a preferences file for it? | Area-Formatter Priority-Unassigned Triaged Type-Enhancement | *This issue was originally filed by hurtado.ferna...@gmail.com*
_____
**What steps will clearly show the issue / need for enhancement?**
1. Use dart-tools package for atom
2. On package settings change space tab to something different of 2
3. Type some code
4. save the file (Ctrl + S)
**What is the current output?**
When formatting, dartanalyzer forces things like tab length to 2 and removes spaces on parentheses and curly braces.
**What would you like to see instead?**
Since Dart Editor is going to disappear, it'd be nice if dartanalyzer respected user preferences, maybe as flags or config files (maybe an analyzer.json?), when formatting code.
**What version of the product are you using? On what operating system?**
I'm using dart-sdk 1.10.0 on Windows 7.
**Please provide any additional information below.**
I opened a bug on the dart-tools project, which brings Dart tools to Atom users as an alternative to the Java IDEs (https://github.com/radicaled/dart-tools/issues/29).
Thanks in advance | 1.0 | dartanalyzer forces things like indents and spaces when reformatting a file. Should it use a preferences file for it? - *This issue was originally filed by hurtado.ferna...@gmail.com*
_____
**What steps will clearly show the issue / need for enhancement?**
1. Use dart-tools package for atom
2. On package settings change space tab to something different of 2
3. Type some code
4. save the file (Ctrl + S)
**What is the current output?**
When formatting, dartanalyzer forces things like tab length to 2 and removes spaces on parentheses and curly braces.
**What would you like to see instead?**
Since Dart Editor is going to disappear, it'd be nice if dartanalyzer respected user preferences, maybe as flags or config files (maybe an analyzer.json?), when formatting code.
**What version of the product are you using? On what operating system?**
I'm using dart-sdk 1.10.0 on Windows 7.
**Please provide any additional information below.**
I opened a bug on the dart-tools project, which brings Dart tools to Atom users as an alternative to the Java IDEs (https://github.com/radicaled/dart-tools/issues/29).
Thanks in advance | priority | dartanalyzer forces things like indents and spaces when reformatting afile should it use a preferences file for it this issue was originally filed by hurtado ferna gmail com what steps will clearly show the issue need for enhancement use dart tools package for atom on package settings change space tab to something different of type some code save the file ctrl s what is the current output when formatting dartanalyzer forces things like tab length to and removes spaces on parenthesis and curly braces what would you like to see instead since dart editor is going to dissapear it d be nice if dartanalyzer respect user preferences maybe as flags or config files maybe a analyzer json when formatting code what version of the product are you using on what operating system i m using dart sdk on windows please provide any additional information below i opened a bug on dart tools project which brings dart tools to atom users as an option to java ides thanks in advance | 1 |
763,795 | 26,774,913,527 | IssuesEvent | 2023-01-31 16:28:25 | gwt-plugins/gwt-eclipse-plugin | https://api.github.com/repos/gwt-plugins/gwt-eclipse-plugin | closed | Test on Eclipse Photon | enhancement High Priority | It's time to test the gwt-eclipse-plugin with Eclipse Photon because of the upcoming release in June. Currently M6 is available for download:
http://www.eclipse.org/downloads/packages/release/Photon/M6 | 1.0 | Test on Eclipse Photon - It's time to test the gwt-eclipse-plugin with Eclipse Photon because of the upcoming release in June. Currently M6 is available for download:
http://www.eclipse.org/downloads/packages/release/Photon/M6 | priority | test on eclipse photon it s time to test the gwt eclipse plugin with eclipse photon because of the upcoming release in june currently is available for download | 1 |
725,102 | 24,951,291,022 | IssuesEvent | 2022-11-01 07:32:33 | AY2223S1-CS2103-F14-2/tp | https://api.github.com/repos/AY2223S1-CS2103-F14-2/tp | closed | [PE-D][Tester D] Bug: Finding student with same display CAP as search term failed | type.Bug priority.High | Steps to reproduce:
1. Execute the command: `add n/John Doe p/98765432 e/johnd@example.com a/311, Clementi Ave 2, #02-25 c/3.999/4.00 g/male u/Nanyang Polytechnic gd/05-2024 m/Computer Science ji/173296 jt/Software Engineer Intern t/rejected t/KIV` (If there exists a user with the same identity fields, delete it)

2. `find c/3.999` finds the user but not `find c/4.00` (although display shows 4.00)

Suppose the employer has the records, and wants to filter by CAP. However, during the input process, he accidentally input the student's CAP as 3.999 instead of 3.99, which was accepted by the system. He does not notice his mistake. The following day, he sees that John has a CAP of 4.00 (perfect CAP for polytechnic student), and starts to take notice of John. Then, when he starts looking at those with CAP 4.00, he wonders why John was missing from the list.
Hence, this behaviour can be problematic.
<!--session: 1666950176477-4fa23b17-6a9c-4d06-af98-8f7f9be162d4-->
<!--Version: Web v3.4.4-->
-------------
Labels: `severity.Low` `type.FunctionalityBug`
original: cheeheng/ped#13 | 1.0 | [PE-D][Tester D] Bug: Finding student with same display CAP as search term failed - Steps to reproduce:
1. Execute the command: `add n/John Doe p/98765432 e/johnd@example.com a/311, Clementi Ave 2, #02-25 c/3.999/4.00 g/male u/Nanyang Polytechnic gd/05-2024 m/Computer Science ji/173296 jt/Software Engineer Intern t/rejected t/KIV` (If there exists a user with the same identity fields, delete it)

2. `find c/3.999` finds the user but not `find c/4.00` (although display shows 4.00)

Suppose the employer has the records, and wants to filter by CAP. However, during the input process, he accidentally input the student's CAP as 3.999 instead of 3.99, which was accepted by the system. He does not notice his mistake. The following day, he sees that John has a CAP of 4.00 (perfect CAP for polytechnic student), and starts to take notice of John. Then, when he starts looking at those with CAP 4.00, he wonders why John was missing from the list.
Hence, this behaviour can be problematic.
<!--session: 1666950176477-4fa23b17-6a9c-4d06-af98-8f7f9be162d4-->
<!--Version: Web v3.4.4-->
-------------
Labels: `severity.Low` `type.FunctionalityBug`
original: cheeheng/ped#13 | priority | bug finding student with same display cap as search term failed steps to reproduce execute the command add n john doe p e johnd example com a clementi ave c g male u nanyang polytechnic gd m computer science ji jt software engineer intern t rejected t kiv if there exist a user with the same identity fields delete it find c finds the user but not find c although display shows suppose the employer has the records and wants to filter by cap however during the input process he accidentally input the student s cap as instead of which was accepted by the system he does not notice his mistake the following day he sees that john has a cap of perfect cap for polytechnic student and starts to take notice of john then when he starts looking at those with cap he wonders why john was missing from the list hence this behaviour can be problematic labels severity low type functionalitybug original cheeheng ped | 1 |
522,556 | 15,161,929,770 | IssuesEvent | 2021-02-12 09:50:18 | magento/magento2 | https://api.github.com/repos/magento/magento2 | closed | customer save performance issue | Component: Customer Issue: Confirmed Issue: Format is valid Issue: Ready for Work Priority: P3 Progress: ready for dev Reported on 2.4.0 Reproduced on 2.4.x Severity: S3 Triage: Performance stale issue | <!---
Please review our guidelines before adding a new issue: https://github.com/magento/magento2/wiki/Issue-reporting-guidelines
Fields marked with (*) are required. Please don't remove the template.
-->
### Preconditions (*)
<!---
Provide the exact Magento version (example: 2.3.2) and any important information on the environment where bug is reproducible.
-->
1. latest Magento
2. customer contains 50 addresses
### Steps to reproduce (*)
<!---
Important: Provide a set of clear steps to reproduce this bug. We can not provide support without clear instructions on how to reproduce.
-->
1. try to save customer
2. observe that on customer save Magento checks each address and updates it
### Expected result (*)
<!--- Tell us what do you expect to happen. -->
1. an address should only be saved if anything related to that address has changed
### Actual result (*)
<!--- Tell us what happened instead. Include error messages and issues. -->
1. on every customer save, each address is loaded and saved
More info for this:-
https://github.com/magento/magento2/blob/2.4-develop/app/code/Magento/Customer/Model/ResourceModel/CustomerRepository.php#L263
| 1.0 | customer save performance issue - <!---
Please review our guidelines before adding a new issue: https://github.com/magento/magento2/wiki/Issue-reporting-guidelines
Fields marked with (*) are required. Please don't remove the template.
-->
### Preconditions (*)
<!---
Provide the exact Magento version (example: 2.3.2) and any important information on the environment where bug is reproducible.
-->
1. latest Magento
2. customer contains 50 addresses
### Steps to reproduce (*)
<!---
Important: Provide a set of clear steps to reproduce this bug. We can not provide support without clear instructions on how to reproduce.
-->
1. try to save customer
2. on save customer why Magento check each address and updating
### Expected result (*)
<!--- Tell us what do you expect to happen. -->
1. it should be saved address if anything change related to address
### Actual result (*)
<!--- Tell us what happened instead. Include error messages and issues. -->
1. on every customer save, it is calling each address and save it
More info for this:-
https://github.com/magento/magento2/blob/2.4-develop/app/code/Magento/Customer/Model/ResourceModel/CustomerRepository.php#L263
| priority | customer save performance issue please review our guidelines before adding a new issue fields marked with are required please don t remove the template preconditions provide the exact magento version example and any important information on the environment where bug is reproducible latest magento customer contains address steps to reproduce important provide a set of clear steps to reproduce this bug we can not provide support without clear instructions on how to reproduce try to save customer on save customer why magento check each address and updating expected result it should be saved address if anything change related to address actual result on every customer save it is calling each address and save it more info for this | 1 |
180,355 | 30,487,535,128 | IssuesEvent | 2023-07-18 04:26:21 | hypha-dao/hypha_wallet | https://api.github.com/repos/hypha-dao/hypha_wallet | closed | UI review | Design | Dark mode & Light mode
- [x] Scan QR banner
- [x] Memo in transaction history (and detail)
- [x] Transaction signing - action title
- [x] transaction expires in
- [x] Handle “while” sliding.
- [x] Bottom Sheets
- [x] Former Bottom sheet “transaction fail”
- [x] Import account
- [x] toggle::Active use Main app colour gradient (not green), and use the same component all over the app (also fix size pls if possible)
More info:
https://docs.google.com/document/d/1K9SPwWHTO4IMRp1JaN5K75FiAs4qs1Lq9Ec2DUjLhxA/edit | 1.0 | UI review - Dark mode & Light mode
- [x] Scan QR banner
- [x] Memo in transaction history (and detail)
- [x] Transaction signing - action title
- [x] transaction expires in
- [x] Handle “while” sliding.
- [x] Bottom Sheets
- [x] Former Bottom sheet “transaction fail”
- [x] Import account
- [x] toggle::Active use Main app colour gradient (not green), and use the same component all over the app (also fix size pls if possible)
More info:
https://docs.google.com/document/d/1K9SPwWHTO4IMRp1JaN5K75FiAs4qs1Lq9Ec2DUjLhxA/edit | non_priority | ui review dark mode light mode scan qr banner memo in transaction history and detail transaction signing action title transaction expires in handle “while” sliding bottom sheets former bottom sheet “transaction fail” import account toggle active use main app colour gradient not green and use the same component all over the app also fix size pls if possible more info | 0 |
59,208 | 14,538,854,926 | IssuesEvent | 2020-12-15 11:02:44 | curl/curl | https://api.github.com/repos/curl/curl | closed | Package curl for Windows in a signed installer | Windows build feature-request on-hold | Hi. PM on Windows Dev Platform here. Wasn't sure if this was a discussion or an issue, so posted here.
We're working on some "new & improved things" that would benefit from apps and tools like curl being signed & distributed via a signed installer package (MSI/MSIX), including:
* A simplified process for installing apps and tools on Windows
* Improve perf by minimizing number and frequency of scans by anti-malware tools like Defender
Have/would you consider:
1. Signing curl binaries with a cert chained to a CA?
2. Bundling curl outputs in a signed installer package?
3. Distributing curl installer packages for Windows?
| 1.0 | Package curl for Windows in a signed installer - Hi. PM on Windows Dev Platform here. Wasn't sure if this was a discussion or an issue, so posted here.
We're working on some "new & improved things" that would benefit from apps and tools like curl being signed & distributed via a signed installer package (MSI/MSIX), including:
* A simplified process for installing apps and tools on Windows
* Improve perf by minimizing number and frequency of scans by anti-malware tools like Defender
Have/would you consider:
1. Signing curl binaries with a cert chained to a CA?
2. Bundling curl outputs in a signed installer package?
3. Distributing curl installer packages for Windows?
| non_priority | package curl for windows in a signed installer hi pm on windows dev platform here wasn t sure if this was a discussion or an issue so posted here we re working on some new improved things that would benefit from apps and tools like curl being signed distributed via a signed installer package msi msix including a simplified process for installing apps and tools on windows improve perf by minimizing number and frequency of scans by anti malware tools like defender have would you consider signing curl binaries with a cert chained to a ca bundling curl outputs in a signed installer package distributing curl installer packages for windows | 0 |
789,023 | 27,776,248,889 | IssuesEvent | 2023-03-16 17:25:11 | googleapis/google-cloud-go | https://api.github.com/repos/googleapis/google-cloud-go | closed | bigquery/storage/managedwriter: TestFlowControllerNoStarve failed | type: bug api: bigquery priority: p1 flakybot: issue | Note: #6529 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.
----
commit: 2b9b77d3371feccff9b5900f37a21f6920bf8853
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/305a78f6-d6ff-4c2e-8870-15c31b7003b2), [Sponge](http://sponge2/305a78f6-d6ff-4c2e-8870-15c31b7003b2)
status: failed | 1.0 | bigquery/storage/managedwriter: TestFlowControllerNoStarve failed - Note: #6529 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.
----
commit: 2b9b77d3371feccff9b5900f37a21f6920bf8853
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/305a78f6-d6ff-4c2e-8870-15c31b7003b2), [Sponge](http://sponge2/305a78f6-d6ff-4c2e-8870-15c31b7003b2)
status: failed | priority | bigquery storage managedwriter testflowcontrollernostarve failed note was also for this test but it was closed more than days ago so i didn t mark it flaky commit buildurl status failed | 1 |
394,670 | 11,647,270,582 | IssuesEvent | 2020-03-01 14:18:06 | opencv/opencv | https://api.github.com/repos/opencv/opencv | closed | error when load onnx model | category: dnn effort: few weeks feature priority: normal | I am trying to load a onnx model which is converted from pytorch.
`import cv2`
`model = cv2.dnn.readNetFromONNX('tmp.onnx')`
cv2.error: OpenCV(4.1.2) /io/opencv/modules/dnn/src/dnn.cpp:525: error: (-2:Unspecified error) Can't create layer "166" of type "Cast" in function 'getLayerInstance'
here is my model
[https://www.dropbox.com/s/sg0n1kdpbbio848/tmp.onnx?dl=0](url)
anyone have idea how to fix this problem? Thanks a lot.
| 1.0 | error when load onnx model - I am trying to load a onnx model which is converted from pytorch.
`import cv2`
`model = cv2.dnn.readNetFromONNX('tmp.onnx')`
cv2.error: OpenCV(4.1.2) /io/opencv/modules/dnn/src/dnn.cpp:525: error: (-2:Unspecified error) Can't create layer "166" of type "Cast" in function 'getLayerInstance'
here is my model
[https://www.dropbox.com/s/sg0n1kdpbbio848/tmp.onnx?dl=0](url)
anyone have idea how to fix this problem? Thanks a lot.
| priority | error when load onnx model i am trying to load a onnx model which is converted from pytorch import model dnn readnetfromonnx tmp onnx error opencv io opencv modules dnn src dnn cpp error unspecified error can t create layer of type cast in function getlayerinstance here is my model url anyone have idea how to fix this problem thanks a lot | 1 |